Analytical aids in land management planning
David R. Betters
1978-01-01
Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...
A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.
Płotka-Wasylka, J
2018-05-01
A new means for assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was built on existing tools such as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol composed of five pentagrams is used to evaluate and quantify the environmental impact of each step of an analytical methodology, with a colour scale from green through yellow to red depicting low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples and in the determination of polycyclic aromatic hydrocarbons by EPA methods. The GAPI tool not only provides an immediately perceptible overview to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.
Tobiszewski, Marek; Orłowski, Aleksander
2015-03-27
The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets all the requirements of analytical chemists as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives for the MCDA analysis. Nine different criteria describing the alternatives were chosen from different groups: metrological, economic and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction and liquid-phase microextraction based procedures high, while liquid-liquid extraction, solid-phase extraction and stir bar sorptive extraction based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choice is in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools, NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. Conversely, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
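As a hedged illustration of the outranking logic behind PROMETHEE (not the study's 25-procedure, 9-criterion dataset), the sketch below computes net preference flows for a few invented procedures using the simple "usual" (step) preference function; all alternatives, criterion values, directions, and weights are assumptions.

```python
# Hypothetical PROMETHEE II-style sketch: rank analytical procedures by net
# preference flow. Decision matrix, weights, and criterion directions are illustrative.
import numpy as np

procedures = ["SPME-based", "LLE-based", "SPE-based"]
# columns: recovery (maximize), solvent use in mL (minimize), LOD quality score (maximize)
X = np.array([
    [0.92, 0.5, 8.0],
    [0.88, 30.0, 7.0],
    [0.95, 10.0, 9.0],
])
weights = np.array([0.4, 0.35, 0.25])       # e.g. elicited from expert questionnaires
maximize = np.array([True, False, True])    # preference direction per criterion

n = len(procedures)
pos_flow, neg_flow = np.zeros(n), np.zeros(n)
for a in range(n):
    for b in range(n):
        if a == b:
            continue
        diff = np.where(maximize, X[a] - X[b], X[b] - X[a])
        pi_ab = np.sum(weights * (diff > 0))   # aggregated preference of a over b
        pos_flow[a] += pi_ab / (n - 1)
        neg_flow[b] += pi_ab / (n - 1)

net_flow = pos_flow - neg_flow
for name, phi in sorted(zip(procedures, net_flow), key=lambda t: -t[1]):
    print(f"{name}: net flow = {phi:+.2f}")
```

A higher net flow indicates a more preferred (here, greener) procedure; real applications would use richer preference functions and the study's actual criteria.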
Laboratory Workhorse: The Analytical Balance.
ERIC Educational Resources Information Center
Clark, Douglas W.
1979-01-01
This report explains the importance of various analytical balances in the water or wastewater laboratory. Stressed is the proper procedure for utilizing the equipment as well as the mechanics involved in its operation. (CS)
Determining a carbohydrate profile for Hansenula polymorpha
NASA Technical Reports Server (NTRS)
Petersen, G. R.
1985-01-01
The determination of the levels of carbohydrates in the yeast Hansenula polymorpha required the development of new analytical procedures. Existing fractionation and analytical methods were adapted to deal with the problems involved with the lysis of whole cells. Using these new procedures, the complete carbohydrate profiles of H. polymorpha and selected mutant strains were determined and shown to correlate favourably with previously published results.
NASA Technical Reports Server (NTRS)
Snow, L. Dale
1996-01-01
Dextroamphetamine has potential as a pharmacologic agent for the alleviation of two common health effects associated with microgravity. As an adjuvant to Space Motion Sickness (SMS) medication, dextroamphetamine can enhance treatment efficacy by reducing undesirable Central Nervous System (CNS) side effects of SMS medications. Secondly, dextroamphetamine may be useful for the prevention of symptoms of post-mission orthostatic intolerance caused by cardiovascular deconditioning during spaceflight. There is interest in developing an intranasal delivery form of dextroamphetamine for use as a countermeasure in microgravity conditions. Development of this dosage form will require an analytical detection method with sensitivity in the low ng range (1 to 100 ng/mL). During the 1995 Summer Faculty Fellowship Program, two analytical methods were developed and evaluated for their suitability as quantitative procedures for dextroamphetamine in studies of product stability, bioavailability assessment, and pharmacokinetic evaluation. In developing some of the analytical methods, beta-phenylethylamine, a primary amine structurally similar to dextroamphetamine, was used. The first analytical procedure to be evaluated involved hexane extraction and subsequent fluorescamine labeling of beta-phenylethylamine. The second analytical procedure to be evaluated involved quantitation of dextroamphetamine by an Enzyme-Linked ImmunoSorbent Assay (ELISA).
Microbial ecology laboratory procedures manual NASA/MSFC
NASA Technical Reports Server (NTRS)
Huff, Timothy L.
1990-01-01
An essential part of the efficient operation of any microbiology laboratory involved in sample analysis is a standard procedures manual. The purpose of this manual is to provide concise and well defined instructions on routine technical procedures involving sample analysis and methods for monitoring and maintaining quality control within the laboratory. Of equal importance is the safe operation of the laboratory. This manual outlines detailed procedures to be followed in the microbial ecology laboratory to assure safety, analytical control, and validity of results.
ERIC Educational Resources Information Center
Cepriá, Gemma; Salvatella, Luis
2014-01-01
All pH calculations for simple acid-base systems used in introductory courses on general or analytical chemistry can be carried out by using a general procedure requiring the use of predominance diagrams. In particular, the pH is calculated as the sum of an independent term equaling the average pKa values of the acids involved in the…
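As a hedged worked example of this kind of average-pKa shortcut (a standard textbook case, not taken from the article itself), the pH of a solution of an amphiprotic species such as hydrogencarbonate is estimated from the two neighbouring pKa values of the carbonic acid system:

```latex
% Illustrative only: amphiprotic HCO3- in the carbonic acid system at 25 °C
\mathrm{pH} \approx \tfrac{1}{2}\left(\mathrm{p}K_{a1} + \mathrm{p}K_{a2}\right)
            \approx \tfrac{1}{2}\,(6.35 + 10.33) \approx 8.3
```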
Salinas, Maria; Lopez-Garrigos, Maite; Flores, Emilio; Leiva-Salinas, Carlos
2018-06-01
To study the urinalysis request, pre-analytical sample conditions, and analytical procedures. Laboratories were asked to provide the number of primary care urinalyses requested, and to fill out a questionnaire regarding pre-analytical conditions and analytical procedures. 110 laboratories participated in the study. 232.5 urinalyses/1,000 inhabitants were reported. 75.4% used the first morning urine. The sample reached the laboratory in less than 2 hours in 18.8%, between 2 - 4 hours in 78.3%, and between 4 - 6 hours in the remaining 2.9%. 92.5% combined the use of test strip and particle analysis, and only 7.5% used the strip exclusively. All participants except one performed automated particle analysis depending on strip results; in 16.2% the procedure was only manual. Urinalysis was highly requested. There was a lack of compliance with guidelines regarding time between micturition and analysis that usually involved the combination of strip followed by particle analysis.
A New Analytic Framework for Moderation Analysis --- Moving Beyond Analytic Interactions
Tang, Wan; Yu, Qin; Crits-Christoph, Paul; Tu, Xin M.
2009-01-01
Conceptually, a moderator is a variable that modifies the effect of a predictor on a response. Analytically, a common approach as used in most moderation analyses is to add analytic interactions involving the predictor and moderator in the form of cross-variable products and test the significance of such terms. The narrow scope of such a procedure is inconsistent with the broader conceptual definition of moderation, leading to confusion in interpretation of study findings. In this paper, we develop a new approach to the analytic procedure that is consistent with the concept of moderation. The proposed framework defines moderation as a process that modifies an existing relationship between the predictor and the outcome, rather than simply a test of a predictor by moderator interaction. The approach is illustrated with data from a real study. PMID:20161453
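For readers unfamiliar with the conventional cross-product approach that the paper argues is too narrow, the hedged sketch below fits a predictor-by-moderator interaction in an ordinary regression; variable names and data are simulated for illustration and are not from the study.

```python
# Hypothetical sketch of the standard interaction-term moderation test:
# fit outcome ~ predictor + moderator + predictor:moderator and inspect the
# significance of the cross-product term. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 200
df = pd.DataFrame({"predictor": rng.normal(size=n),
                   "moderator": rng.normal(size=n)})
df["outcome"] = (0.5 * df["predictor"] + 0.3 * df["moderator"]
                 + 0.4 * df["predictor"] * df["moderator"]
                 + rng.normal(size=n))

model = smf.ols("outcome ~ predictor * moderator", data=df).fit()
# The predictor:moderator coefficient is what conventional moderation analysis tests.
print(model.params["predictor:moderator"], model.pvalues["predictor:moderator"])
```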
Note: Model identification and analysis of bivalent analyte surface plasmon resonance data.
Tiwari, Purushottam Babu; Üren, Aykut; He, Jin; Darici, Yesim; Wang, Xuewen
2015-10-01
Surface plasmon resonance (SPR) is a widely used, affinity based, label-free biophysical technique to investigate biomolecular interactions. The extraction of rate constants requires accurate identification of the particular binding model. The bivalent analyte model involves coupled non-linear differential equations. No clear procedure to identify the bivalent analyte mechanism has been established. In this report, we propose a unique signature for the bivalent analyte model. This signature can be used to distinguish the bivalent analyte model from other biphasic models. The proposed method is demonstrated using experimentally measured SPR sensorgrams.
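To make the "coupled non-linear differential equations" concrete, here is a minimal numerical sketch of one common parameterization of the bivalent analyte model (association phase only); the rate constants, analyte concentration, surface capacity, and statistical factors are assumptions, and formulations in the literature differ in detail.

```python
# Hedged sketch of a bivalent-analyte binding model:
# A + L <-> AL (ka1, kd1), AL + L <-> AL2 (ka2, kd2), with the SPR response
# taken as the total bound analyte. Parameter values are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

def bivalent(t, y, C, ka1, kd1, ka2, kd2, Lmax):
    AL, AL2 = y                      # singly and doubly attached complexes (RU)
    L_free = Lmax - AL - 2.0 * AL2   # free ligand sites remaining on the surface
    dAL = ka1 * C * L_free - kd1 * AL - ka2 * AL * L_free + kd2 * AL2
    dAL2 = ka2 * AL * L_free - kd2 * AL2
    return [dAL, dAL2]

params = (20e-9, 1e5, 5e-3, 1e3, 2e-3, 100.0)  # C [M], ka1, kd1, ka2, kd2, Lmax (assumed)
sol = solve_ivp(bivalent, (0.0, 300.0), [0.0, 0.0], args=params, dense_output=True)
t = np.linspace(0.0, 300.0, 301)
AL, AL2 = sol.sol(t)
response = AL + AL2                  # simulated sensorgram (association phase)
print(response[-1])
```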
Discordance between net analyte signal theory and practical multivariate calibration.
Brown, Christopher D
2004-08-01
Lorber's concept of net analyte signal is reviewed in the context of classical and inverse least-squares approaches to multivariate calibration. It is shown that, in the presence of device measurement error, the classical and inverse calibration procedures have radically different theoretical prediction objectives, and the assertion that the popular inverse least-squares procedures (including partial least squares, principal components regression) approximate Lorber's net analyte signal vector in the limit is disproved. Exact theoretical expressions for the prediction error bias, variance, and mean-squared error are given under general measurement error conditions, which reinforce the very discrepant behavior between these two predictive approaches, and Lorber's net analyte signal theory. Implications for multivariate figures of merit and numerous recently proposed preprocessing treatments involving orthogonal projections are also discussed.
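The contrast between the two calibration philosophies can be made concrete with a small simulation; the hedged sketch below builds a classical estimator (spectra regressed on concentrations) and an inverse estimator (concentrations regressed on spectra) from the same synthetic two-analyte data. Everything here is illustrative and is not tied to Lorber's derivations or the paper's error analysis.

```python
# Hedged sketch: classical (CLS) vs. inverse (ILS) multivariate calibration on
# simulated two-analyte mixture spectra. All spectra and concentrations are synthetic.
import numpy as np

rng = np.random.default_rng(7)
n_train, n_chan = 40, 60
K = np.abs(rng.normal(size=(2, n_chan)))               # pure-component response profiles
C = rng.uniform(0.0, 1.0, size=(n_train, 2))           # training concentrations
X = C @ K + 0.02 * rng.normal(size=(n_train, n_chan))  # noisy training spectra

x_new = np.array([0.30, 0.70]) @ K                     # spectrum of an "unknown" mixture

# Classical: model X = C K, estimate K, then solve K^T c = x for the unknown
K_hat = np.linalg.lstsq(C, X, rcond=None)[0]
c_cls = np.linalg.lstsq(K_hat.T, x_new, rcond=None)[0]

# Inverse: regress concentrations directly on measured spectra, C = X B
B = np.linalg.lstsq(X, C, rcond=None)[0]
c_ils = x_new @ B

print("CLS estimate:", np.round(c_cls, 3), "ILS estimate:", np.round(c_ils, 3))
```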
Cegłowski, Michał; Kurczewska, Joanna; Smoluch, Marek; Reszke, Edward; Silberring, Jerzy; Schroeder, Grzegorz
2015-09-07
In this paper, a procedure for the preconcentration and transport of mixtures of acids, bases, and drug components to a mass spectrometer using magnetic scavengers is presented. Flowing atmospheric pressure afterglow mass spectrometry (FAPA-MS) was used as an analytical method for identification of the compounds by thermal desorption from the scavengers. The proposed procedure is fast and cheap, and does not involve time-consuming purification steps. The developed methodology can be applied for trapping harmful substances in minute quantities, to transport them to specialized, remotely located laboratories.
Automated dynamic analytical model improvement for damped structures
NASA Technical Reports Server (NTRS)
Fuh, J. S.; Berman, A.
1985-01-01
A method is described to improve a linear nonproportionally damped analytical model of a structure. The procedure finds the smallest changes in the analytical model such that the improved model matches the measured modal parameters. Features of the method are: (1) the ability to properly treat complex-valued modal parameters of a damped system; (2) applicability to realistically large structural models; and (3) computational efficiency, achieved without eigensolutions or inversion of a large matrix.
Comellas, L; Portillo, J L; Vaquero, M T
1993-12-24
A procedure for determining linear alkylbenzenesulphonates (LASs) in sewage sludge and amended soils has been developed. Extraction by sample treatment with 0.5 M potassium hydroxide in methanol and reflux was compared with a previously described extraction procedure in Soxhlet with methanol and solid sodium hydroxide in the sample. Repeatability results were similar, with savings in extraction time, solvents and evaporation time. A clean-up method involving a C18 cartridge has been developed. Analytes were quantified by a reversed-phase HPLC method with UV and fluorescence detectors. Recoveries obtained were higher than 84%. The resulting procedure was applied to soils amended with a high dose of sewage sludge (15%) and increasing quantities of added LASs. Degradation data for a 116-day period are presented.
High Throughput Determination of Ricinine, Abrine and Alpha ...
Analytical Method This document provides the standard operating procedure for determination of ricinine (RIC), abrine (ABR), and α-amanitin (AMAN) in drinking water by isotope dilution liquid chromatography tandem mass spectrometry (LC/MS/MS). This method is designed to support site-specific cleanup goals of environmental remediation activities following a homeland security incident involving one or a combination of these analytes.
Operator's manual on the visual-accumulation tube method for sedimentation analysis of sands
Colby, V.C.; Witzgman, F.W.
1958-01-01
The personnel who will be operating these units may have little or no previous knowledge of either the principles involved or the details of operating procedure. This manual is intended as an aid to them in setting up the apparatus, learning the analytical procedure, interpreting the results, and understanding the primary principles encountered.
Cooking Potatoes: Experimentation and Mathematical Modeling.
ERIC Educational Resources Information Center
Chen, Xiao Dong
2002-01-01
Describes a laboratory activity involving a mathematical model of cooking potatoes that can be solved analytically. Highlights the microstructure aspects of the experiment. Provides the key aspects of the results, detailed background readings, laboratory procedures and data analyses. (MM)
In-Situ Analysis System for Correlated Electron Heterostructures
2014-11-20
semiconductor materials and elemental metals. Specifically, films must be pristine and ideally remain intact during the analytical procedure [1]. In addition ... involves a rather complex engineering design described below. [Figure 1 caption fragment: (a) an empty Neocera sample holder rack ... the center of the analytical chamber.] A laser heater (fiber-coupled, high-power 808 nm diode laser JOLD-100-CPXF-2P, Jenoptik) is free of such limitations because
Sajnóg, Adam; Hanć, Anetta; Koczorowski, Ryszard; Barałkiewicz, Danuta
2017-12-01
A new procedure for the determination of elements derived from titanium implants, together with physiological elements, in soft tissues by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) is presented. The analytical procedure involved the preparation of in-house matrix-matched solid standards with analyte addition, based on the certified reference material (CRM) MODAS-4 Cormorant Tissue. Addition of gelatin, serving as a binding agent, essentially improved the physical properties of the standards. Performance of the analytical method was assessed and validated by calculating parameters such as precision, detection limits, trueness and recovery of analyte addition using an additional CRM, ERM-BB184 Bovine Muscle. Analyte addition was additionally confirmed by microwave digestion of the solid standards and analysis by solution nebulization ICP-MS. The detection limits range from 1.8 μg g(-1) to 450 μg g(-1) for Mn and Ca, respectively. The precision values range from 7.3% to 42% for Al and Zn, respectively. The estimated recoveries of analyte addition lie within the range 83%-153% for Mn and Cu, respectively. Oral mucosa samples taken from patients treated with titanium dental implants were examined using the developed analytical method. Standards and tissue samples were cryocut into 30 µm thin sections. LA-ICP-MS made it possible to obtain two-dimensional maps of the distribution of elements in the tested samples, which revealed a high content of Ti and Al derived from the implants. Photographs from an optical microscope displayed numerous micrometre-sized particles in the oral mucosa samples, which suggests that they are residues from the implantation procedure. Copyright © 2017 Elsevier B.V. All rights reserved.
Interactive Management and Updating of Spatial Data Bases
NASA Technical Reports Server (NTRS)
French, P.; Taylor, M.
1982-01-01
The decision making process, whether for power plant siting, load forecasting or energy resource planning, invariably involves a blend of analytical methods and judgement. Management decisions can be improved by the implementation of techniques which permit an increased comprehension of results from analytical models. Even where analytical procedures are not required, decisions can be aided by improving the methods used to examine spatially and temporally variant data. How the use of computer-aided planning (CAP) programs and the selection of a predominant data structure can improve the decision-making process is discussed.
De Neys, Wim
2006-06-01
Human reasoning has been shown to overly rely on intuitive, heuristic processing instead of a more demanding analytic inference process. Four experiments tested the central claim of current dual-process theories that analytic operations involve time-consuming executive processing whereas the heuristic system would operate automatically. Participants solved conjunction fallacy problems and indicative and deontic selection tasks. Experiment 1 established that making correct analytic inferences demanded more processing time than did making heuristic inferences. Experiment 2 showed that burdening the executive resources with an attention-demanding secondary task decreased correct, analytic responding and boosted the rate of conjunction fallacies and indicative matching card selections. Results were replicated in Experiments 3 and 4 with a different secondary-task procedure. Involvement of executive resources for the deontic selection task was less clear. Findings validate basic processing assumptions of the dual-process framework and complete the correlational research programme of K. E. Stanovich and R. F. West (2000).
Momen, Awad A; Zachariadis, George A; Anthemidis, Aristidis N; Stratis, John A
2007-01-15
Two digestion procedures have been tested on nut samples for application in the determination of essential (Cr, Cu, Fe, Mg, Mn, Zn) and non-essential (Al, Ba, Cd, Pb) elements by inductively coupled plasma-optical emission spectrometry (ICP-OES). These included wet digestions with HNO(3)/H(2)SO(4) and HNO(3)/H(2)SO(4)/H(2)O(2). The latter is recommended for better analyte recoveries (relative error <11%). Two calibration procedures (aqueous standards and standard addition) were studied, and standard addition proved preferable for all analytes. Experimental designs for seven factors (HNO(3), H(2)SO(4) and H(2)O(2) volumes, digestion time, pre-digestion time, temperature of the hot plate and sample weight) were used for optimization of the sample digestion procedures. For this purpose a Plackett-Burman fractional factorial design, which involves eight experiments, was adopted. The factors HNO(3) and H(2)O(2) volume, and the digestion time, were found to be the most important parameters. The instrumental conditions were also optimized (using a peanut matrix rather than aqueous standard solutions) considering radio-frequency (rf) incident power, nebulizer argon gas flow rate and sample uptake flow rate. The analytical performance, such as limits of detection (LOD <0.74 μg g(-1)), precision of the overall procedures (relative standard deviation between 2.0 and 8.2%) and accuracy (relative errors between 0.4 and 11%), was assessed statistically to evaluate the developed analytical procedures. The good agreement between measured and certified values for all analytes (relative error <11%) with respect to IAEA-331 (spinach leaves) and IAEA-359 (cabbage) indicates that the developed analytical method is well suited for further studies on the fate of major elements in nuts and possibly similar matrices.
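To illustrate the screening design mentioned (seven factors in eight runs), the hedged sketch below constructs a Plackett-Burman-type two-level design from a Hadamard matrix; the factor names follow the abstract, but the coded levels and any physical settings behind them are assumptions.

```python
# Hedged sketch: an 8-run, 7-factor two-level screening design of the
# Plackett-Burman type, generated from a Hadamard matrix. Factor names follow
# the abstract; actual low/high settings would come from the laboratory.
import numpy as np
from scipy.linalg import hadamard

H = hadamard(8)            # 8 x 8 matrix with +1/-1 entries
design = H[:, 1:]          # drop the constant column -> 8 runs x 7 coded factors

factors = ["HNO3_volume", "H2SO4_volume", "H2O2_volume", "digestion_time",
           "predigestion_time", "hotplate_temperature", "sample_weight"]
for i, run in enumerate(design, start=1):
    print(f"run {i}:", dict(zip(factors, run)))

# A main effect is then estimated as the mean response at +1 minus the mean at -1
# for each factor column, once the eight digestions have been carried out.
```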
Development of analytic intermodal freight networks for use within a GIS
DOT National Transportation Integrated Search
1997-05-01
The paper discusses the practical issues involved in constructing intermodal freight networks that can be used within GIS platforms to support inter-regional freight routing and subsequent (for example, commodity flow) analysis. The procedures descri...
Student Receivables Management: Opportunities for Improved Practices.
ERIC Educational Resources Information Center
Jacquin, Jules C.; Goyal, Anil K.
1995-01-01
The college or university's business office can help reduce problems with student receivables through procedural review of the tuition revenue process, application of analytical methods, and improved operating practices. Admissions, financial aid, and billing offices must all be involved. (MSE)
Moderate severity heart failure does not involve a downregulation of myocardial fatty acid oxidation
2004-10-01
malonyl-CoA-sensitive form of carnitine palmitoyltransferase is not localized exclusively in the outer membrane of rat liver mitochondria. J Biol ... for the isolation of fresh mitochondria, both subsarcolemmal and interfibrillar. Analytic methods. Detailed analytic methods have been previously cited ... populations of mitochondria, the subsarcolemmal and interfibrillar, were isolated from hearts of normal and HF dogs using the procedure of Palmer et al
Sensitive Electroanalysis Using Solid Electrodes.
ERIC Educational Resources Information Center
Wang, Joseph
1982-01-01
A hydrodynamic modulation voltammetry (HMV) experiment involving the use of simple hydrodynamic modulation procedures is described. Compatible with the time/equipment restrictions of most teaching laboratories (stopped-stirring and stopped-flow voltammetry), students perform both batch and flow analyses and are introduced to analytical flow systems and the…
Sampling and Analysis Plan - Guidance and Template v.4 - General Projects - 04/2014
This Sampling and Analysis Plan (SAP) guidance and template is intended to assist organizations in documenting the procedural and analytical requirements for one-time, or time-limited, projects involving the collection of water, soil, sediment, or other
Analytical procedure validation and the quality by design paradigm.
Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno
2015-01-01
Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims at showing that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it a control strategy can be set.
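One way to read "probabilistic statements" concretely: from validation-run errors, estimate the probability that a future routine measurement falls within stated acceptance limits. The hedged sketch below assumes normally distributed relative errors and invented validation data; it illustrates the idea rather than the authors' specific methodology.

```python
# Hedged sketch of a probabilistic fit-for-purpose statement: given relative
# errors (%) observed in validation runs, estimate P(|future error| <= lambda)
# using a location-scale t predictive distribution. Data and limits are invented.
import numpy as np
from scipy import stats

errors = np.array([-2.1, 0.5, 1.8, -0.9, 0.2, 1.1, -1.5, 0.7])  # % relative errors
lam = 5.0                                                        # acceptance limit (+/- 5 %)

n = len(errors)
mu, sd = errors.mean(), errors.std(ddof=1)
scale = sd * np.sqrt(1.0 + 1.0 / n)          # predictive scale for one future run
dof = n - 1

p_within = stats.t.cdf((lam - mu) / scale, dof) - stats.t.cdf((-lam - mu) / scale, dof)
print(f"Estimated probability that a future result is within +/-{lam}%: {p_within:.3f}")
```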
Combining Heterogeneous Correlation Matrices: Simulation Analysis of Fixed-Effects Methods
ERIC Educational Resources Information Center
Hafdahl, Adam R.
2008-01-01
Monte Carlo studies of several fixed-effects methods for combining and comparing correlation matrices have shown that two refinements improve estimation and inference substantially. With rare exception, however, these simulations have involved homogeneous data analyzed using conditional meta-analytic procedures. The present study builds on…
Diffusion of Super-Gaussian Profiles
ERIC Educational Resources Information Center
Rosenberg, C.-J.; Anderson, D.; Desaix, M.; Johannisson, P.; Lisak, M.
2007-01-01
The present analysis describes an analytically simple and systematic approximation procedure for modelling the free diffusive spreading of initially super-Gaussian profiles. The approach is based on a self-similar ansatz for the evolution of the diffusion profile, and the parameter functions involved in the modelling are determined by suitable…
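For context, here is a hedged reconstruction of the setup (symbols are assumed, not taken from the paper): free diffusion of an initially super-Gaussian profile, with a self-similar trial function whose parameters evolve in time.

```latex
% Illustrative setup only; notation is assumed, not the authors' own
\frac{\partial u}{\partial t} = D\,\frac{\partial^{2} u}{\partial x^{2}},
\qquad
u(x,0) = A_{0}\exp\!\left[-\left(\frac{x}{w_{0}}\right)^{2m}\right],
\qquad
u(x,t) \approx A(t)\exp\!\left[-\left(\frac{x}{w(t)}\right)^{2m(t)}\right]
```

Here m = 1 recovers the ordinary Gaussian, while large m gives a nearly flat-topped profile; the approximation then reduces the partial differential equation to evolution equations for the parameter functions A(t), w(t), and m(t).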
Busatto, Zenaís; da Silva, Agnaldo Fernando Baldo; de Freitas, Osvaldo; Paschoal, Jonas Augusto Rizzato
2017-04-01
This paper describes the development of analytical methods for the quantification of albendazole (ABZ) in fish feed and of ABZ and its main known metabolites (albendazole sulfoxide, albendazole sulfone and albendazole aminosulfone) in fish fillet employing LC-MS/MS. In order to assess the reliability of the analytical methods, evaluation was undertaken as recommended by the related guides proposed by the Brazilian Ministry of Agriculture for analytical method validation. The calibration curve for ABZ quantification in feed showed adequate linearity (r > 0.99), precision (CV < 1.03%) and trueness ranging from 99% to 101%. The method for ABZ residues in fish fillet, involving the QuEChERS technique for sample extraction, had adequate linearity (r > 0.99) for all analytes, precision (CV < 13%) and trueness around 100%, with CCα < 122 ng g(-1) and CCβ < 145 ng g(-1). In addition, aiming to avoid the risk of ABZ leaching from feed into the aquatic environment during fish medication via the oral route, a promising procedure for drug incorporation in the feed, involving coating feed pellets with ethyl cellulose polymer containing ABZ, was also evaluated. The medicated feed had good homogeneity (CV < 3%) and a lower release of ABZ (< 0.2%) from feed to water when the medicated feed stayed in the water for up to 15 min.
Marcinkowska, Monika; Komorowicz, Izabela; Barałkiewicz, Danuta
2016-05-12
An analytical procedure dedicated to the multielemental determination of toxic species, As(III), As(V), Cr(VI), Sb(III) and Sb(V), in drinking water samples using high performance liquid chromatography hyphenated to inductively coupled plasma mass spectrometry (HPLC/ICP-DRC-MS) was developed. Optimization of the detection and separation conditions was conducted. A dynamic reaction cell (DRC) with oxygen as the reaction gas was used in the experiments. The analytical signals obtained for species separation by anion-exchange chromatography were symmetrical. The mobile phase consisted of 3 mM EDTANa2 and 36 mM ammonium nitrate. Full separation of the species, in the forms H3AsO3, H2AsO4(-), SbO2(-), Sb(OH)6(-) and CrO4(2-), was achieved in 15 min using a gradient elution program. Detailed validation of the analytical procedure confirmed the reliability of the analytical measurements. The procedure was characterized by high precision, in the range from 1.7% to 2.4%. Detection limits (LD) were 0.067 μg L(-1), 0.068 μg L(-1), 0.098 μg L(-1), 0.083 μg L(-1) and 0.038 μg L(-1) for As(III), As(V), Cr(VI), Sb(III) and Sb(V), respectively. The recoveries obtained confirmed the lack of interference effects on the analytical signals, as their values were in the range of 91%-110%. The applicability of the proposed procedure was tested on drinking water samples with mineralization up to 650 mg L(-1). Copyright © 2016 Elsevier B.V. All rights reserved.
Parametric study of minimum converter loss in an energy-storage dc-to-dc converter
NASA Technical Reports Server (NTRS)
Wong, R. C.; Owen, H. A., Jr.; Wilson, T. G.
1982-01-01
Through a combination of analytical and numerical minimization procedures, a converter design that results in the minimum total converter loss (including core loss, winding loss, capacitor and energy-storage-reactor loss, and various losses in the semiconductor switches) is obtained. Because the initial phase involves analytical minimization, the computation time required by the subsequent phase of numerical minimization is considerably reduced in this combination approach. The effects of various loss parameters on the optimum values of the design variables are also examined.
Recommendations for accreditation of laboratories in molecular biology of hematologic malignancies.
Flandrin-Gresta, Pascale; Cornillet, Pascale; Hayette, Sandrine; Gachard, Nathalie; Tondeur, Sylvie; Mauté, Carole; Cayuela, Jean-Michel
2015-01-01
Over recent years, the development of molecular biology techniques has improved the diagnosis and follow-up of hematological diseases. Consequently, these techniques are widely used in the biological screening of these diseases; therefore, hemato-oncology molecular diagnostics laboratories must be actively involved in the accreditation process according to the ISO 15189 standard. The French group of molecular biologists (GBMHM) provides requirements for the implementation of quality assurance in medical molecular laboratories. This guideline states the recommendations for pre-analytical, analytical (method validation procedures, quality controls, reagents) and post-analytical conditions. In addition, we state a strategy for internal quality control management. These recommendations will be regularly updated.
Gas diffusion as a new fluidic unit operation for centrifugal microfluidic platforms.
Ymbern, Oriol; Sández, Natàlia; Calvo-López, Antonio; Puyol, Mar; Alonso-Chamarro, Julian
2014-03-07
A centrifugal microfluidic platform prototype with an integrated membrane for gas diffusion is presented for the first time. The centrifugal platform allows multiple and parallel analysis on a single disk and integrates at least ten independent microfluidic subunits, which allow both calibration and sample determination. It is constructed with a polymeric substrate material and it is designed to perform colorimetric determinations by the use of a simple miniaturized optical detection system. The determination of three different analytes, sulfur dioxide, nitrite and carbon dioxide, is carried out as a proof of concept of a versatile microfluidic system for the determination of analytes which involve a gas diffusion separation step during the analytical procedure.
Model transformations for state-space self-tuning control of multivariable stochastic systems
NASA Technical Reports Server (NTRS)
Shieh, Leang S.; Bao, Yuan L.; Coleman, Norman P.
1988-01-01
The design of self-tuning controllers for multivariable stochastic systems is considered analytically. A long-division technique for finding the similarity transformation matrix and transforming the estimated left MFD to the right MFD is developed; the derivation is given in detail, and the procedures involved are briefly characterized.
Quantifying the measurement uncertainty of results from environmental analytical methods.
Moser, J; Wegscheider, W; Sperka-Gottlieb, C
2001-07-01
The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.
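As a hedged illustration of the bottom-up combination step described in the Eurachem/CITAC Guide, the short sketch below combines a few relative standard uncertainties by root-sum-of-squares and reports an expanded uncertainty with coverage factor k = 2; the component names and values are invented, not taken from the paper.

```python
# Hedged sketch: combine relative standard uncertainties (root-sum-of-squares)
# and report an expanded uncertainty with k = 2. Components are illustrative;
# in practice a reproducibility term from validation data often dominates.
import math

result_mg_per_L = 12.4
relative_u = {
    "method reproducibility": 0.060,   # from validation / QC data
    "calibration standard":   0.010,
    "recovery correction":    0.030,
    "sample volume":          0.005,
}

u_combined_rel = math.sqrt(sum(u ** 2 for u in relative_u.values()))
U_expanded = 2.0 * u_combined_rel * result_mg_per_L   # coverage factor k = 2
print(f"Result: {result_mg_per_L} +/- {U_expanded:.2f} mg/L (k = 2)")
```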
NASA Technical Reports Server (NTRS)
Ehlers, F. E.; Sebastian, J. D.; Weatherill, W. H.
1979-01-01
Analytical and empirical studies of a finite difference method for the solution of the transonic flow about harmonically oscillating wings and airfoils are presented. The procedure is based on separating the velocity potential into steady and unsteady parts and linearizing the resulting unsteady equations for small disturbances. Since sinusoidal motion is assumed, the unsteady equation is independent of time. Three finite difference investigations are discussed including a new operator for mesh points with supersonic flow, the effects on relaxation solution convergence of adding a viscosity term to the original differential equation, and an alternate and relatively simple downstream boundary condition. A method is developed which uses a finite difference procedure over a limited inner region and an approximate analytical procedure for the remaining outer region. Two investigations concerned with three-dimensional flow are presented. The first is the development of an oblique coordinate system for swept and tapered wings. The second derives the additional terms required to make row relaxation solutions converge when mixed flow is present. A finite span flutter analysis procedure is described using the two-dimensional unsteady transonic program with a full three-dimensional steady velocity potential.
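A hedged sketch of the separation described (notation assumed, not taken verbatim from the report): the full velocity potential is split into a steady part and a small harmonic unsteady part, so that the time dependence can be factored out of the linearized unsteady equation.

```latex
% Illustrative notation only
\Phi(x,y,t) = \phi_{0}(x,y) + \varepsilon\,\phi_{1}(x,y)\,e^{i\omega t},
\qquad \lvert \varepsilon\,\phi_{1} \rvert \ll \lvert \phi_{0} \rvert
```

After linearizing in the small parameter, the common time factor cancels, and the equation for the unsteady potential contains the oscillation frequency only as a parameter, which is why the unsteady equation is independent of time.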
NASA Astrophysics Data System (ADS)
Wu, Peng; Zhang, Yunchang; Lv, Yi; Hou, Xiandeng
2006-12-01
A simple, low cost and highly sensitive method based on cloud point extraction (CPE) for separation/preconcentration and thermospray flame quartz furnace atomic absorption spectrometry was proposed for the determination of ultratrace cadmium in water and urine samples. The analytical procedure involved the formation of analyte-entrapped surfactant micelles by mixing the analyte solution with an ammonium pyrrolidinedithiocarbamate (APDC) solution and a Triton X-114 solution. When the temperature of the system was higher than the cloud point of Triton X-114, the complex of cadmium-PDC entered the surfactant-rich phase and thus separation of the analyte from the matrix was achieved. Under optimal chemical and instrumental conditions, the limit of detection was 0.04 μg/L for cadmium with a sample volume of 10 mL. The analytical results of cadmium in water and urine samples agreed well with those by ICP-MS.
Ontological Foundations for Tracking Data Quality through the Internet of Things.
Ceusters, Werner; Bona, Jonathan
2016-01-01
Amongst the positive outcomes expected from the Internet of Things for Health are longitudinal patient records that are more complete and less erroneous by complementing manual data entry with automatic data feeds from sensors. Unfortunately, devices are fallible too. Quality control procedures such as inspection, testing and maintenance can prevent devices from producing errors. The additional approach envisioned here is to establish constant data quality monitoring through analytics procedures on patient data that exploit not only the ontological principles ascribed to patients and their bodily features, but also those ascribed to observation and measurement processes in which devices and patients participate, including the (perhaps erroneous) representations that are generated. Using existing realism-based ontologies, we propose a set of categories that analytics procedures should be able to reason with and highlight the importance of unique identification of not only patients, caregivers and devices, but of everything involved in those measurements. This approach supports the thesis that the majority of what tends to be viewed as 'metadata' are actually data about first-order entities.
Molins, C; Hogendoorn, E A; Dijkman, E; Heusinkveld, H A; Baumann, R A
2000-02-11
The combination of microwave-assisted solvent extraction (MASE) and reversed-phase liquid chromatography (RPLC) with UV detection has been investigated for the efficient determination of phenylurea herbicides in soils involving the single-residue method (SRM) approach (linuron) and the multi-residue method (MRM) approach (monuron, monolinuron, isoproturon, metobromuron, diuron and linuron). Critical parameters of MASE, viz. extraction temperature, water content and extraction solvent, were varied in order to optimise recoveries of the analytes while simultaneously minimising co-extraction of soil interferences. The optimised extraction procedure was applied to different types of soil with an organic carbon content of 0.4-16.7%. Besides freshly spiked soil samples, method validation included the analysis of samples with aged residues. A comparative study of the applicability of RPLC-UV without and with the use of column switching for the processing of uncleaned extracts was carried out. For some of the tested analyte/matrix combinations the one-column approach (LC mode) is feasible. In comparison to LC, coupled-column LC (LC-LC mode) provides high selectivity in single-residue analysis (linuron) and, although less pronounced in multi-residue analysis (all six phenylurea herbicides), the clean-up performance of LC-LC improves both time of analysis and sample throughput. In the MRM approach the developed procedure involving MASE and LC-LC-UV provided acceptable recoveries (range, 80-120%) and RSDs (<12%) at levels of 10 microg/kg (n=9) and 50 microg/kg (n=7), respectively, for most analyte/matrix combinations. Recoveries from aged residue samples spiked at a level of 100 microg/kg (n=7) ranged, depending on the analyte/soil type combination, from 41 to 113%, with RSDs ranging from 1 to 35%. In the SRM approach the developed LC-LC procedure was applied for the determination of linuron in 28 sandy soil samples collected in a field study. Linuron could be determined in soil with a limit of quantitation of 10 microg/kg.
IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics
2016-01-01
Background: We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective: To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods: The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results: Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. Conclusions: IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise. PMID:27729304
IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.
Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita
2016-10-11
We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise.
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
A combined qualitative and quantitative procedure for the chemical analysis of urinary calculi
Hodgkinson, A.
1971-01-01
A better understanding of the physico-chemical principles underlying the formation of calculus has led to a need for more precise information on the chemical composition of stones. A combined qualitative and quantitative procedure for the chemical analysis of urinary calculi which is suitable for routine use is presented. The procedure involves five simple qualitative tests followed by the quantitative determination of calcium, magnesium, inorganic phosphate, and oxalate. These data are used to calculate the composition of the stone in terms of calcium oxalate, apatite, and magnesium ammonium phosphate. Analytical results and derived values for five representative types of calculi are presented. PMID:5551382
The purpose of this protocol is to provide guidelines for the analysis of hair samples for total mercury by cold vapor atomic fluorescence (CVAFS) spectrometry. This protocol describes the methodology and all other analytical aspects involved in the analysis. Keywords: hair; s...
ERIC Educational Resources Information Center
Lavoie, Jean-Michel; Chornet, Esteban; Pelletier, Andre
2008-01-01
This experiment targets undergraduate students in an analytical or organic instructional context. Using a simple extraction, this protocol allows students to quantify and qualify monoterpenes in essential oils from citrus fruit peels. The procedures involve cooling down the peels by immersing them into icy water. After a few minutes, the chilled…
ERIC Educational Resources Information Center
Mehra, M. C.; Rioux, J.
1982-01-01
Experimental procedures, typical observations, and results for the simultaneous analysis of Fe(III) and Cu(II) in a solution are discussed. The method is based on selective interaction between the two ions and potassium hexacyanoruthenate(II) in acid solution involving no preliminary sample preparations. (Author/JN)
ERIC Educational Resources Information Center
Boiani, James A.
1986-01-01
Describes an experiment which uses the Gran plot for analyzing free ions as well as those involved in an equilibrium. Discusses the benefits of using Gran plots in the study of acids, as well as other analytes in solutions. Presents background theory along with a description of the experimental procedures. (TW)
Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek
2016-05-01
This study presents an application of the Hasse diagram technique (HDT) as an assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. The two HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits.
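The core of the Hasse diagram technique is a component-wise dominance comparison between procedures; the hedged sketch below builds that partial order for a few invented procedures and indicator values (oriented so that higher is better). It is not the benzo[a]pyrene dataset of the study.

```python
# Hedged sketch of the HDT dominance step: procedure a dominates procedure b only
# if a is at least as good on every indicator and strictly better on at least one.
# Values are illustrative and oriented so that higher is better.
import numpy as np

procedures = ["procedure A", "procedure B", "procedure C", "procedure D"]
X = np.array([
    [3, 2, 5],   # e.g. greenness score, LOD score, recovery score
    [4, 2, 5],
    [1, 4, 2],
    [4, 4, 5],
])

def dominates(a, b):
    return bool(np.all(X[a] >= X[b]) and np.any(X[a] > X[b]))

pairs = [(procedures[a], procedures[b])
         for a in range(len(X)) for b in range(len(X)) if dominates(a, b)]
print("dominance relations (better, worse):", pairs)
# Pairs left incomparable (neither dominates) sit on different chains of the
# Hasse diagram; maximal elements are the candidate "best" procedures.
```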
Melucci, Dora; Locatelli, Marcello; Locatelli, Clinio
2013-12-01
An analytical procedure for the voltammetric determination of mercury(II), copper(II), lead(II), cadmium(II) and zinc(II) by square wave anodic stripping voltammetry (SWASV) in matrices involved in the food chain is proposed. In particular, tea leaves were analyzed as real samples. The digestion of each matrix was carried out using a concentrated HCl-HNO3-H2SO4 acidic attack mixture; 0.01 mol L(-1) EDTA-Na2 + 0.15 mol L(-1) NaCl + 0.5 mol L(-1) HCl was employed as the supporting electrolyte. The voltammetric measurements were carried out using a conventional three-electrode cell, employing, as working electrodes, a gold electrode (GE) and a stationary hanging mercury drop electrode (HMDE). The analytical procedure was verified on the standard reference materials Spinach Leaves NIST-SRM 1570a, Tomato Leaves NIST-SRM 1573a and Apple Leaves NIST-SRM 1515. For all the elements, the precision as repeatability, expressed as relative standard deviation (sr), was of the order of 3-5%, while the trueness, expressed as relative error (e), was of the order of 3-7%. Once set up on the standard reference materials, the analytical procedure was applied to commercial tea leaf samples. A critical comparison with spectroscopic measurements is also discussed.
Milk Bottom-Up Proteomics: Method Optimization
Vincent, Delphine; Ezernieks, Vilnis; Elkins, Aaron; Nguyen, Nga; Moate, Peter J.; Cocks, Benjamin G.; Rochfort, Simone
2016-01-01
Milk is a complex fluid whose proteome displays a diverse set of proteins of high abundance such as caseins and medium to low abundance whey proteins such as ß-lactoglobulin, lactoferrin, immunoglobulins, glycoproteins, peptide hormones, and enzymes. A sample preparation method that enables high reproducibility and throughput is key in reliably identifying proteins present or proteins responding to conditions such as a diet, health or genetics. Using skim milk samples from Jersey and Holstein-Friesian cows, we compared three extraction procedures which have not previously been applied to samples of cows' milk. Method A (urea) involved a simple dilution of the milk in a urea-based buffer, method B (TCA/acetone) involved a trichloroacetic acid (TCA)/acetone precipitation, and method C (methanol/chloroform) involved a tri-phasic partition method in chloroform/methanol solution. Protein assays, SDS-PAGE profiling, and trypsin digestion followed by nanoHPLC-electrospray ionization-tandem mass spectrometry (nLC-ESI-MS/MS) analyses were performed to assess their efficiency. Replicates were used at each analytical step (extraction, digestion, injection) to assess reproducibility. Mass spectrometry (MS) data are available via ProteomeXchange with identifier PXD002529. Overall 186 unique accessions, major and minor proteins, were identified with a combination of methods. Method C (methanol/chloroform) yielded the best resolved SDS-patterns and highest protein recovery rates, method A (urea) yielded the greatest number of accessions, and, of the three procedures, method B (TCA/acetone) was the least compatible of all with a wide range of downstream analytical procedures. Our results also highlighted breed differences between the proteins in milk of Jersey and Holstein-Friesian cows. PMID:26793233
Johnstone, Daniel M.; Riveros, Carlos; Heidari, Moones; Graham, Ross M.; Trinder, Debbie; Berretta, Regina; Olynyk, John K.; Scott, Rodney J.; Moscato, Pablo; Milward, Elizabeth A.
2013-01-01
While Illumina microarrays can be used successfully for detecting small gene expression changes due to their high degree of technical replicability, there is little information on how different normalization and differential expression analysis strategies affect outcomes. To evaluate this, we assessed concordance across gene lists generated by applying different combinations of normalization strategy and analytical approach to two Illumina datasets with modest expression changes. In addition to using traditional statistical approaches, we also tested an approach based on combinatorial optimization. We found that the choice of both normalization strategy and analytical approach considerably affected outcomes, in some cases leading to substantial differences in gene lists and subsequent pathway analysis results. Our findings suggest that important biological phenomena may be overlooked when there is a routine practice of using only one approach to investigate all microarray datasets. Analytical artefacts of this kind are likely to be especially relevant for datasets involving small fold changes, where inherent technical variation—if not adequately minimized by effective normalization—may overshadow true biological variation. This report provides some basic guidelines for optimizing outcomes when working with Illumina datasets involving small expression changes. PMID:27605185
A Review of Current Methods for Analysis of Mycotoxins in Herbal Medicines
Zhang, Lei; Dou, Xiao-Wen; Zhang, Cheng; Logrieco, Antonio F.; Yang, Mei-Hua
2018-01-01
The presence of mycotoxins in herbal medicines is an established problem throughout the entire world. The sensitive and accurate analysis of mycotoxin in complicated matrices (e.g., herbs) typically involves challenging sample pretreatment procedures and an efficient detection instrument. However, although numerous reviews have been published regarding the occurrence of mycotoxins in herbal medicines, few of them provided a detailed summary of related analytical methods for mycotoxin determination. This review focuses on analytical techniques including sampling, extraction, cleanup, and detection for mycotoxin determination in herbal medicines established within the past ten years. Dedicated sections of this article address the significant developments in sample preparation, and highlight the importance of this procedure in the analytical technology. This review also summarizes conventional chromatographic techniques for mycotoxin qualification or quantitation, as well as recent studies regarding the development and application of screening assays such as enzyme-linked immunosorbent assays, lateral flow immunoassays, aptamer-based lateral flow assays, and cytometric bead arrays. The present work provides a good insight regarding the advanced research that has been done and closes with an indication of future demand for the emerging technologies. PMID:29393905
Islas, Gabriela; Hernandez, Prisciliano
2017-01-01
To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. This technique is based on the dispersion of a solid sorbent in liquid samples for the extraction, isolation, and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. The applications of dispersive techniques in the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit minimization of additional steps such as precipitation, centrifugation, and filtration, which decreases the manipulation of the sample. In this review, we describe the main procedures used for synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027
Integrating Water Quality and River Rehabilitation Management - A Decision-Analytical Perspective
NASA Astrophysics Data System (ADS)
Reichert, P.; Langhans, S.; Lienert, J.; Schuwirth, N.
2009-04-01
Integrative river management involves difficult decisions about alternative measures to improve the ecological state of rivers. For this reason, it seems useful to apply knowledge from the decision sciences to support river management. We discuss how decision-analytical elements can be employed for designing an integrated river management procedure. An important aspect of this procedure is to clearly separate scientific predictions of the consequences of alternatives from the objectives to be achieved by river management. The key elements of the suggested procedure are (i) the quantitative elicitation of the objectives from different stakeholder groups, (ii) the compilation of the current scientific knowledge about the consequences of the effects resulting from suggested measures in the form of a probabilistic mathematical model, and (iii) the use of these predictions and valuations to prioritize alternatives, to uncover conflicting objectives, to support the design of better alternatives, and to improve the transparency of communication about the chosen management strategy. The development of this procedure led to insights regarding the steps necessary for rational decision-making in river management, to guidelines on the use of decision-analytical techniques for performing these steps, and to new insights about the application of decision-analytical techniques in general. In particular, the consideration of the spatial distribution of the effects of measures and the potential added value of connected rehabilitated river reaches leads to favoring measures that have a positive effect beyond a single river reach. As these effects only propagate within the river network, this results in a river basin oriented management concept as a consequence of a rational decision support procedure, rather than as an a priori management paradigm. There are also limitations to the support that can be expected from the decision-analytical perspective. It will not provide the societal values that are driving prioritization in river management; it will only support their elicitation and rational use. This is particularly important for the assessment of micro-pollutants because of severe limitations in scientific knowledge of their effects on river ecosystems. This makes the influence of pollution by micro-pollutants on the prioritization of measures strongly dependent on the weight of the precautionary principle relative to other societal objectives of river management.
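As an illustration of the decision-analytical building block referred to above (elicited objective weights applied to predicted consequences in order to prioritize alternatives), the sketch below uses a simple additive value model; the alternatives, attribute scores and weights are invented, and the paper's actual attributes and valuation model will differ.

```python
# Sketch: weighted additive value aggregation to prioritize rehabilitation
# alternatives. Alternatives, attribute scores (0-1, higher is better) and
# stakeholder weights are invented for illustration only.
alternatives = {
    # (ecological state, water quality, cost attractiveness)
    "widen reach A":       (0.7, 0.5, 0.4),
    "upgrade treatment":   (0.4, 0.9, 0.3),
    "connect reaches A+B": (0.8, 0.6, 0.5),
}
weights = (0.5, 0.3, 0.2)   # elicited from one hypothetical stakeholder group

def value(scores, weights):
    """Additive multi-attribute value: sum of weight * single-attribute value."""
    return sum(w * s for w, s in zip(weights, scores))

ranking = sorted(alternatives.items(),
                 key=lambda kv: value(kv[1], weights), reverse=True)
for name, scores in ranking:
    print(f"{name:22s} overall value = {value(scores, weights):.2f}")
```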
NASA Astrophysics Data System (ADS)
Mittal, R.; Rao, P.; Kaur, P.
2018-01-01
Elemental evaluations in scanty powdered material have been made using energy-dispersive X-ray fluorescence (EDXRF) measurements, for which formulations, along with a specific procedure for sample target preparation, have been developed. Fractional amount evaluation involves a sequence of steps: (i) collection of elemental characteristic X-ray counts in EDXRF spectra recorded with different weights of material, (ii) a check for linearity between X-ray counts and material weights, (iii) calculation of elemental fractions from the linear fit, and (iv) a second linear fit of the calculated fractions against sample weight, extrapolated to zero weight. Thus, the elemental fractions at zero weight are free from material self-absorption effects for incident and emitted photons. The analytical procedure, after verification with known synthetic samples of the macro-nutrients potassium and calcium, was applied to wheat plant and soil samples obtained from a pot experiment.
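The extrapolation in step (iv) can be illustrated with a short numerical sketch; the weights and apparent fractions below are synthetic, chosen only to show how a linear fit of fraction against sample weight, read off at zero weight, removes the self-absorption trend.

```python
# Sketch of step (iv): the apparent elemental fraction drifts with sample
# weight because of self-absorption; a linear fit extrapolated to zero
# weight estimates the absorption-free fraction. Numbers are synthetic.
import numpy as np

weights_mg = np.array([20.0, 40.0, 60.0, 80.0, 100.0])
apparent_fraction = np.array([2.46, 2.41, 2.35, 2.30, 2.24])  # % K, hypothetical

slope, intercept = np.polyfit(weights_mg, apparent_fraction, 1)
print(f"fraction extrapolated to zero weight: {intercept:.2f} %")
print(f"self-absorption trend: {slope:.4f} % per mg")
```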
NASA Technical Reports Server (NTRS)
Book, W. J.
1973-01-01
An investigation is reported involving a mathematical procedure using 4 x 4 transformation matrices for analyzing the vibrations of flexible manipulators. Previous studies with the procedure are summarized and the method is extended to include flexible joints as well as links, and to account for the effects of various power transmission schemes. A systematic study of the allocation of structural material and the placement of components such as motors and gearboxes was undertaken using the analytical tools developed. As one step in this direction the variables which relate the vibration parameters of the arm to the task and environment of the arm were isolated and nondimensionalized. The 4 x 4 transformation matrices were also used to develop analytical expressions for the terms of the complete 6 x 6 compliance matrix for the case of two flexible links joined by a rotating joint, flexible about its axis of rotation.
Prevalidation in pharmaceutical analysis. Part I. Fundamentals and critical discussion.
Grdinić, Vladimir; Vuković, Jadranka
2004-05-28
A complete prevalidation, as a basic prevalidation strategy for the quality control and standardization of analytical procedures, is introduced. A fast and simple prevalidation methodology based on mathematical/statistical evaluation of a reduced number of experiments (N ≤ 24) was elaborated, and guidelines as well as algorithms are given in detail. This strategy was developed for pharmaceutical applications and is dedicated to the preliminary evaluation of analytical methods for which a linear calibration model, which very often occurs in practice, is the most appropriate to fit the experimental data. The requirements presented in this paper should therefore help the analyst to design and perform the minimum number of prevalidation experiments needed to obtain all the information required to evaluate and demonstrate the reliability of the analytical procedure. The complete prevalidation process includes characterization of analytical groups, checking of two limiting groups, testing of data homogeneity, establishment of analytical functions, recognition of outliers, evaluation of limiting values and extraction of prevalidation parameters. Moreover, a system of diagnosis for each particular prevalidation step is suggested. As an illustrative example demonstrating the feasibility of the prevalidation methodology, a Vis-spectrophotometric procedure for the determination of tannins with Folin-Ciocalteu's phenol reagent was selected from among a great number of analytical procedures. The favourable metrological characteristics of this analytical procedure, taken as prevalidation figures of merit, confirm prevalidation as a valuable concept in the preliminary evaluation of the quality of analytical procedures.
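As a rough illustration of this kind of reduced-experiment evaluation of a linear calibration model, the sketch below fits a small calibration set (N = 12), screens standardized residuals as a simple outlier check and reports basic figures of merit; the data and the 2.5 cut-off are invented, and this is not the authors' prevalidation algorithm.

```python
# Sketch: prevalidation-style check of a linear calibration from a small
# experiment set (here N = 12): fit, standardized residuals as an outlier
# screen, and basic figures of merit. Illustrative data, not the authors'.
import numpy as np

conc = np.repeat([5.0, 10.0, 20.0, 40.0], 3)             # µg/mL, N = 12
signal = np.array([0.102, 0.098, 0.105,
                   0.201, 0.205, 0.198,
                   0.395, 0.402, 0.410,
                   0.801, 0.795, 0.790])                  # absorbance

slope, intercept = np.polyfit(conc, signal, 1)
fitted = slope * conc + intercept
resid = signal - fitted
s_yx = np.sqrt(np.sum(resid**2) / (len(conc) - 2))        # residual std. error

print(f"slope = {slope:.5f}, intercept = {intercept:.5f}, s(y/x) = {s_yx:.5f}")
outliers = np.where(np.abs(resid / s_yx) > 2.5)[0]        # crude screen
print("suspect points (|standardized residual| > 2.5):", outliers)
```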
Ultraviolet, Visible, and Fluorescence Spectroscopy
NASA Astrophysics Data System (ADS)
Penner, Michael H.
Spectroscopy in the ultraviolet-visible (UV-Vis) range is one of the most commonly encountered laboratory techniques in food analysis. Diverse examples, such as the quantification of macrocomponents (total carbohydrate by the phenol-sulfuric acid method), quantification of microcomponents (thiamin by the thiochrome fluorometric procedure), estimates of rancidity (lipid oxidation status by the thiobarbituric acid test), and surveillance testing (enzyme-linked immunoassays), are presented in this text. In each of these cases, the analytical signal on which the assay is based is either the emission or absorption of radiation in the UV-Vis range. This signal may be inherent in the analyte, such as the absorbance of radiation in the visible range by pigments, or a result of a chemical reaction involving the analyte, such as the colorimetric copper-based Lowry method for the analysis of soluble protein.
Mottier, Pascal; Frank, Nancy; Dubois, Mathieu; Tarres, Adrienne; Bessaire, Thomas; Romero, Roman; Delatour, Thierry
2014-01-01
Tris(nonylphenyl)phosphite (TNPP), an antioxidant used in polyethylene resins for food applications, is problematic since it is a source of the endocrine-disrupting chemicals 4-nonylphenols (4NP) upon migration into packaged foods. As a response to concerns surrounding the presence of 4NP-based compounds in packaging materials, some resin producers and additive suppliers have decided to eliminate TNPP from formulations. This paper describes an analytical procedure to verify the "TNPP-free" statement in multilayer laminates used for bag-in-box packaging. The method involves extraction of TNPP from laminates with organic solvents followed by detection/quantification by LC-MS/MS using the atmospheric pressure chemical ionisation (APCI) mode. A further acidic treatment of the latter extract allows the release of 4NP from potentially extracted TNPP. 4NP is then analysed by LC-MS/MS using electrospray ionisation (ESI) mode. This two-step analytical procedure not only ensures TNPP quantification in laminates, but also allows the flagging of other possible sources of 4NP in such packaging materials, typically as non-intentionally added substances (NIAS). The limits of quantification were 0.50 and 0.48 µg dm⁻² for TNPP and 4NP in laminates, respectively, with recoveries ranging between 87% and 114%. Usage of such analytical methodologies in quality control operations has pointed to a lack of traceability at the packaging supplier level and cross-contamination of extrusion equipment at the converter level, when TNPP-containing laminates are processed on the same machine beforehand.
Quantification of astaxanthin in shrimp waste hydrolysate by HPLC.
López-Cervantes, J; Sánchez-Machado, D I; Gutiérrez-Coronado, M A; Ríos-Vázquez, N J
2006-10-01
In the present study, a simple and rapid reversed-phase HPLC method for the determination of astaxanthin in shrimp waste hydrolysate has been developed and validated. The analytical procedure involves the direct extraction of astaxanthin from the lipid fraction with methanol. The analytical column, SS Exil ODS, was operated at 25 °C. The mobile phase consisted of a mixture of water:methanol:dichloromethane:acetonitrile (4.5:28:22:45.5 v/v/v/v) at a flow rate of 1.0 mL/min. Detection and identification were performed using a photodiode array detector (λ(detection) = 476 nm). The proposed HPLC method showed adequate linearity, repeatability and accuracy.
High-order moments of spin-orbit energy in a multielectron configuration
NASA Astrophysics Data System (ADS)
Na, Xieyu; Poirier, M.
2016-07-01
In order to analyze the energy-level distribution in complex ions such as those found in warm dense plasmas, this paper provides values for high-order moments of the spin-orbit energy in a multielectron configuration. Using second-quantization results and standard angular algebra or fully analytical expressions, explicit values are given for moments up to 10th order for the spin-orbit energy. Two analytical methods are proposed, using the uncoupled or coupled orbital and spin angular momenta. The case of multiple open subshells is considered with the help of cumulants. The proposed expressions for spin-orbit energy moments are compared to numerical computations from Cowan's code and agree with them. The convergence of the Gram-Charlier expansion involving these spin-orbit moments is analyzed. While a spectrum with infinitely thin components cannot be adequately represented by such an expansion, a suitable convolution procedure ensures the convergence of the Gram-Charlier series provided high-order terms are accounted for. A corrected analytical formula for the third-order moment involving both spin-orbit and electron-electron interactions turns out to be in fair agreement with Cowan's numerical computations.
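For reference, the Gram-Charlier A series whose convergence is discussed above has the standard textbook form below, written with generic cumulants κₙ; the spin-orbit moments computed in the paper determine these cumulants for the configurations considered.

```latex
% Gram-Charlier A series for a distribution with mean \mu, variance \sigma^2
% and higher cumulants \kappa_n (generic textbook form; the paper's
% spin-orbit moments fix these cumulants for each configuration).
\[
  f(E) \simeq \frac{1}{\sigma\sqrt{2\pi}}\, e^{-u^2/2}
  \left[ 1 + \frac{\kappa_3}{3!\,\sigma^3}\, He_3(u)
           + \frac{\kappa_4}{4!\,\sigma^4}\, He_4(u) + \cdots \right],
  \qquad u = \frac{E-\mu}{\sigma},
\]
where $He_n$ are the probabilists' Hermite polynomials, e.g.
$He_3(u) = u^3 - 3u$ and $He_4(u) = u^4 - 6u^2 + 3$.
```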
Methods for determination of inorganic substances in water and fluvial sediments
Fishman, Marvin J.; Friedman, Linda C.
1985-01-01
Chapter A1 of the laboratory manual contains methods used by the Geological Survey to analyze samples of water, suspended sediments, and bottom material for their content of inorganic constituents. Included are methods for determining the concentration of dissolved constituents in water, total recoverable and total of constituents in water-suspended sediment samples, and recoverable and total concentrations of constituents in samples of bottom material. Essential definitions are included in the introduction to the manual, along with a brief discussion of the use of significant figures in calculating and reporting analytical results. Quality control in the water-analysis laboratory is discussed, including accuracy and precision of analyses, the use of standard reference water samples, and the operation of an effective quality assurance program. Methods for sample preparation and pretreatment are also given. A brief discussion of the principles of the analytical techniques involved and their particular application to water and sediment analysis is presented. The analytical methods involving these techniques are arranged alphabetically according to constituent. For each method given, the general topics covered are application, principle of the method, interferences, apparatus and reagents required, a detailed description of the analytical procedure, reporting results, units and significant figures, and analytical precision data, when available. More than 125 methods are given for the determination of 70 different inorganic constituents and physical properties of water, suspended sediment, and bottom material.
Armbruster, David A; Overcash, David R; Reyes, Jaime
2014-01-01
The era of automation arrived with the introduction of the AutoAnalyzer using continuous flow analysis and the Robot Chemist that automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with a group of analysers devoted to performing routine clinical chemistry tests and another group dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA) through which all modules involved are physically linked by some kind of track system, moving samples through the process from beginning to end. A newer and very powerful analytical methodology is liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS). LC-MS/MS has been automated but a future automation challenge will be to incorporate LC-MS/MS into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to a laboratory information system (LIS) and/or hospital information system (HIS). This software includes control of the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology. It is a given that automation will continue to evolve in the clinical laboratory, limited only by the imagination and ingenuity of laboratory scientists. PMID:25336760
Segura, J; Stramesi, C; Redón, A; Ventura, M; Sanchez, C J; González, G; San, L; Montagna, M
1999-03-05
The work presents an analytical strategy to detect drugs of abuse in hair. It involves two sequential steps: a screening by a simple enzyme-linked immunosorbent assay (ELISA) methodology to detect opiates, cocaine and its metabolites, and benzodiazepines, followed by confirmation of opiates and cocaine metabolites in positive samples by gas chromatography coupled to mass spectrometry (GC-MS). In the same GC-MS run other drugs used for substitution therapy (e.g. methadone and its main metabolite) can also be detected. After a double washing of hair samples with dichloromethane, hair specimens were cut into small pieces and 10 mg samples were incubated in 2 ml of a methanol-trifluoroacetic acid (9:1) mixture overnight at 37 °C. Aliquots of the extract were then evaporated, reconstituted in buffer and analysed according to the ELISA procedure. Confirmation involved solid-phase extraction of another fraction of the extract kept at -20 °C, derivatization with heptafluorobutyric anhydride and hexafluoroisopropanol, and detection of cocaine, benzoylecgonine, ecgonine methyl ester, cocaethylene, morphine, codeine, 6-monoacetylmorphine, methadone and 2-ethylidene-1,5-dimethyl-3,3-diphenylpyrrolidine (methadone metabolite) by selected ion monitoring after gas chromatographic separation. During the development of the method it was verified that no more than 10% of cocaine, opiates and benzodiazepines were lost when dichloromethane was used to wash real samples. The results also confirmed the increased extraction power of TFA when it was added to methanol: the recovery of the analytes (cocaine and its metabolites and opiates) added to methanol-TFA alone was of the order of 90% except for benzoylecgonine (75%), and the recovery of the analytes added to a methanol-TFA extract of drug-free hair was about 90% for all analytes except benzoylecgonine and 6-MAM (around 70%). Regarding the stability of labile compounds, only small amounts of ecgonine methyl ester (2.3%) and morphine (7.2%) were produced, from cocaine and 6-MAM respectively, after the whole extraction procedure and two weeks of storage of methanol-TFA extracts at -20 °C. Satisfactory results were obtained when the procedures were applied to the analysis of external proficiency testing hair samples and actual specimens from drug addicts.
Laboratory Analytical Procedures | Bioenergy | NREL
analytical procedures (LAPs) to provide validated methods for biofuels and pyrolysis bio-oils research. Biomass Compositional Analysis: these lab procedures provide tested and accepted methods for performing
Identification and evaluation of software measures
NASA Technical Reports Server (NTRS)
Card, D. N.
1981-01-01
A large scale, systematic procedure for identifying and evaluating measures that meaningfully characterize one or more elements of software development is described. The background of this research, the nature of the data involved, and the steps of the analytic procedure are discussed. An example of the application of this procedure to data from real software development projects is presented. As the term is used here, a measure is a count or numerical rating of the occurrence of some property. Examples of measures include lines of code, number of computer runs, person hours expended, and degree of use of top down design methodology. Measures appeal to the researcher and the manager as a potential means of defining, explaining, and predicting software development qualities, especially productivity and reliability.
Purification of Carbon Nanotubes: Alternative Methods
NASA Technical Reports Server (NTRS)
Files, Bradley; Scott, Carl; Gorelik, Olga; Nikolaev, Pasha; Hulse, Lou; Arepalli, Sivaram
2000-01-01
The traditional carbon nanotube purification process involves nitric acid refluxing and cross-flow filtration using the surfactant Triton X. This is believed to result in damage to the nanotubes and surfactant residue on the nanotube surface. Alternative purification procedures involving solvent extraction, thermal zone refining and nitric acid refluxing are used in the current study. The effects of duration and type of solvent used to dissolve impurities, including fullerenes and PACs (polyaromatic compounds), are monitored by nuclear magnetic resonance, high performance liquid chromatography, and thermogravimetric analysis. Thermal zone refining yielded sample areas rich in nanotubes as seen by scanning electron microscopy. Refluxing in boiling nitric acid seems to improve the nanotube content. Different procedural steps are needed to purify samples produced by the laser process compared to the arc process. These alternative methods of nanotube purification will be presented along with results from supporting analytical techniques.
MacGrogan, Gaëtan; Mathieu, Marie-Christine; Poulet, Bruno; Penault-Llorca, Frédérique; Vincent-Salomon, Anne; Roger, Pascal; Treilleux, Isabelle; Valent, Alexander; Antoine, Martine; Becette, Véronique; Bor, Catherine; Brabencova, Eva; Charafe-Jauffret, Emmanuelle; Chenard, Marie-Pierre; Dauplat, Marie-Mélanie; Delrée, Paul; Devouassoux, Mojgan; Fiche, Maryse; Fondrevelle, Marie-Eve; Fridman, Viviana; Garbar, Christian; Genin, Pascal; Ghnassia, Jean-Pierre; Haudebourg, Juliette; Laberge-Le Couteulx, Sophie; Loussouarn, Delphine; Maran-Gonzalez, Aurélie; Marcy, Myriam; Michenet, Patrick; Sagan, Christine; Trassard, Martine; Verriele, Véronique; Arnould, Laurent; Lacroix-Triki, Magali
2014-10-01
Biomarker assessment of breast cancer tumor samples is part of the routine workflow of pathology laboratories. International guidelines have recently been updated, with special regard to the pre-analytical steps that are critical for the quality of immunohistochemical and in situ hybridization procedures, whatever the biomarker analyzed. Fixation and specimen handling protocols must be standardized, validated and carefully tracked. Cooperation and training of the personnel involved in the specimen workflow (e.g. radiologists, surgeons, nurses, technicians and pathologists) are of paramount importance. The GEFPICS' update of the recommendations herein details and comments on the different steps of the pre-analytical process. Application of these guidelines and participation in quality assurance programs are mandatory to ensure the correct evaluation of oncotheranostic biomarkers. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
Prediction of turning stability using receptance coupling
NASA Astrophysics Data System (ADS)
Jasiewicz, Marcin; Powałka, Bartosz
2018-01-01
This paper addresses the prediction of machining stability for the dynamic "lathe - workpiece" system, evaluated using the receptance coupling method. The dynamic properties of the lathe components (the spindle and the tailstock) are assumed to be constant and can be determined experimentally from the results of an impact test. The variable element of the "machine tool - holder - workpiece" system is therefore the machined part, which can readily be modelled analytically. The receptance coupling method enables a synthesis of the experimental (spindle, tailstock) and analytical (machined part) models, so impact testing of the entire system becomes unnecessary. The paper presents the methodology for synthesizing the analytical and experimental models, the evaluation of the stability lobes, and an experimental validation procedure involving both the determination of the dynamic properties of the system and cutting tests. Finally, the experimental verification results are presented and discussed.
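A minimal numerical sketch of receptance coupling at a single coordinate is given below: two single-degree-of-freedom receptances stand in for the experimentally identified machine side and the analytically modelled workpiece, and rigid coupling adds their dynamic stiffnesses. The parameters are invented, and the actual synthesis in the paper involves full multi-coordinate models.

```python
# Sketch: rigid receptance coupling at one coordinate. Two SDOF receptances
# stand in for an experimentally identified substructure (e.g. spindle side)
# and an analytically modelled one (e.g. the workpiece). Parameters invented.
import numpy as np

def sdof_receptance(freq_hz, k, m, zeta):
    """Drive-point receptance x/F of a single mass-spring-damper."""
    w = 2 * np.pi * freq_hz
    return 1.0 / (k - m * w**2 + 1j * 2 * zeta * np.sqrt(k * m) * w)

f = np.linspace(10, 2000, 2000)
H_machine = sdof_receptance(f, k=2.0e8, m=50.0, zeta=0.03)   # "measured"
H_work    = sdof_receptance(f, k=5.0e7, m=2.0,  zeta=0.01)   # "modelled"

# Rigid coupling at the contact coordinate: dynamic stiffnesses add,
# so the assembled receptance is the "parallel" combination.
H_assembled = 1.0 / (1.0 / H_machine + 1.0 / H_work)
print("peak assembled compliance:", np.abs(H_assembled).max(), "m/N")
```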
40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) Definitions. Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 87.82 Sampling and analytical procedures for measuring smoke exhaust...
This regulation prescribes Chemical Data Quality Management (CDQM) responsibilities and procedures for projects involving hazardous, toxic and/or radioactive waste (HTRW) materials. Its purpose is to assure that the analytical data meet project data quality objectives. This is the umbrella regulation that defines CDQM activities and integrates all of the other U.S. Army Corps of Engineers (USACE) guidance on environmental data quality management.
The Effect of Multispectral Image Fusion Enhancement on Human Efficiency
2017-03-20
human visual system by applying a technique commonly used in visual perception research: ideal observer analysis. Using this approach, we establish...applications, analytic techniques, and procedural methods used across studies. This paper uses ideal observer analysis to establish a framework that allows...augmented similarly to incorporate research involving more complex stimulus content. Additionally, the ideal observer can be adapted for a number of
An analytical procedure to assist decision-making in a government research organization
H. Dean Claxton; Giuseppe Rensi
1972-01-01
An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology are drawn from economics and mathematical programming. The major analytical aspects distinguishing research...
Olkowska, Ewa; Polkowska, Żaneta; Namieśnik, Jacek
2013-11-15
A new analytical procedure for the simultaneous determination of individual cationic surfactants (alkyl benzyl dimethyl ammonium chlorides) in surface water samples has been developed. We describe this methodology for the first time: it involves the application of solid phase extraction (SPE, for sample preparation) coupled with ion chromatography-conductivity detection (IC-CD, for the final determination). Mean recoveries of analytes between 79% and 93%, and overall method quantification limits in the range from 0.0018 to 0.038 μg/mL for surface water and CRM samples, were achieved. The methodology was applied to the determination of individual alkyl benzyl quaternary ammonium compounds in environmental samples (reservoir water) and enables their presence in such types of waters to be confirmed. In addition, it is a simpler, less time-consuming, less labour-intensive and significantly less expensive methodology than previously described approaches (liquid-liquid extraction coupled with liquid chromatography-mass spectrometry), and it avoids the use of toxic chloroform. Copyright © 2013 Elsevier B.V. All rights reserved.
Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed
2014-06-01
Sample preparation is an important analytical step regarding the isolation and concentration of desired components from complex matrices and greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of the quantification of the target analytes. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low or no solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis; a description of MEPS, namely its formats (on- and off-line), sorbents, experimental procedures and protocols; the factors that affect MEPS performance; and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent MEPS applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.
On-matrix derivatization extraction of chemical weapons convention relevant alcohols from soil.
Chinthakindi, Sridhar; Purohit, Ajay; Singh, Varoon; Dubey, D K; Pardasani, Deepak
2013-10-11
The present study deals with the on-matrix derivatization-extraction of aminoalcohols and thiodiglycols, which are important precursors and/or degradation products of VX analogues and the vesicant class of chemical warfare agents (CWAs). The method involves hexamethyldisilazane (HMDS)-mediated in situ silylation of the analytes on the soil. Subsequent extraction and gas chromatography-mass spectrometry analysis of the derivatized analytes offered better recoveries in comparison to the procedure recommended by the Organization for the Prohibition of Chemical Weapons (OPCW). Various experimental conditions such as extraction solvent, reagent and catalyst amount, and reaction time and temperature were optimized. The best recoveries of analytes, ranging from 45% to 103%, were obtained with DCM solvent containing 5% v/v HMDS and 0.01% w/v iodine as catalyst. The limits of detection (LOD) and limits of quantification (LOQ) for the selected analytes ranged from 8 to 277 and from 21 to 665 ng mL⁻¹, respectively, in selected ion monitoring mode. Copyright © 2013 Elsevier B.V. All rights reserved.
Final report on mid-polarity analytes in food matrix: mid-polarity pesticides in tea
NASA Astrophysics Data System (ADS)
Sin, Della W. M.; Li, Hongmei; Wong, S. K.; Lo, M. F.; Wong, Y. L.; Wong, Y. C.; Mok, C. S.
2015-01-01
At the Paris meeting in April 2011, the CCQM Working Group on Organic Analysis (OAWG) agreed on a suite of Track A studies meant to support the assessment of measurement capabilities needed for the delivery of measurement services within the scope of the OAWG Terms of Reference. One of the studies discussed and agreed upon for the suite of ten Track A studies that support the 5-year plan of the CCQM Core Competence assessment was CCQM-K95 'Mid-Polarity Analytes in Food Matrix: Mid-Polarity Pesticides in Tea'. This key comparison was co-organized by the Government Laboratory of Hong Kong Special Administrative Region (GL) and the National Institute of Metrology, China (NIM). To allow wider participation, a pilot study, CCQM-P136, was run in parallel. Participants' capabilities in measuring mid-polarity analytes in a food matrix were demonstrated through this key comparison. Most of the participating NMIs/DIs successfully measured beta-endosulfan and endosulfan sulphate in the sample; however, there is room for further improvement for some participants. This key comparison involved not only extraction, clean-up, analytical separation and selective detection of the analytes in a complex food matrix, but also the pre-treatment procedures applied to the material before the extraction process. The problem of incomplete extraction of the incurred analytes from the sample matrix may not be observed simply by using spike recovery. The relative standard deviations for the data included in the KCRV calculation in this key comparison were less than 7%, which was acceptable given the complexity of the matrix, the level of the analytes and the complexity of the analytical procedure. The main text of the final report appears in Appendix B of the BIPM key comparison database (kcdb.bipm.org); it has been peer-reviewed and approved for publication by CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
Martín-Sabroso, Cristina; Tavares-Fernandes, Daniel Filipe; Espada-García, Juan Ignacio; Torres-Suárez, Ana Isabel
2013-12-15
In this work, a protocol to validate analytical procedures for the quantification of drug substances formulated in polymeric systems, covering both drug entrapped in the polymeric matrix (assay:content test) and drug released from the systems (assay:dissolution test), is developed. This protocol is applied to the validation of two isocratic HPLC analytical procedures for the analysis of dexamethasone phosphate disodium microparticles for parenteral administration. The preparation of authentic samples and artificially "spiked" and "unspiked" samples is described. Specificity (the ability to quantify dexamethasone phosphate disodium in the presence of constituents of the dissolution medium and other microparticle constituents), linearity, accuracy and precision are evaluated, in the range from 10 to 50 μg mL⁻¹ for the assay:content test procedure and from 0.25 to 10 μg mL⁻¹ for the assay:dissolution test procedure. The robustness of the analytical method used to extract the drug from the microparticles is also assessed. The validation protocol developed allows us to conclude that both analytical methods are suitable for their intended purpose, but the lack of proportionality of the assay:dissolution analytical method should be taken into account. The validation protocol designed in this work could be applied to the validation of any analytical procedure for the quantification of drugs formulated in controlled-release polymeric microparticles. Copyright © 2013 Elsevier B.V. All rights reserved.
Cantarero, Samuel; Zafra-Gómez, Alberto; Ballesteros, Oscar; Navalón, Alberto; Vílchez, José L; Crovetto, Guillermo; Verge, Coral; de Ferrer, Juan A
2010-11-01
We have developed a new analytical procedure for determining insoluble Ca and Mg fatty acid salts (soaps) in agricultural soil and sewage sludge samples. The number of analytical methodologies that focus on the determination of insoluble soap salts in different environmental compartments is very limited. In this work, we propose a methodology that involves a sample clean-up step with petroleum ether to remove soluble salts and a conversion of the insoluble Ca and Mg salts into soluble potassium salts using the tripotassium ethylenediaminetetraacetate salt and potassium carbonate, followed by the extraction of analytes from the samples using microwave-assisted extraction with methanol. An improved esterification procedure using 2,4-dibromoacetophenone before the liquid chromatography with ultraviolet detection analysis has also been developed. The absence of a matrix effect was demonstrated with two fatty acid Ca salts that are not commercial and are never detected in natural samples (C₁₃:₀ and C₁₇:₀). It was therefore possible to evaluate the matrix effect because both standards have environmental behavior (adsorption and precipitation) similar to that of the commercial soaps (C₁₀:₀ to C₁₈:₀). We also studied the effect of the different variables on the clean-up, the conversion of Ca soap, and the extraction and derivatization procedures. The quantification limits found ranged from 0.4 to 0.8 mg/kg. The proposed method was satisfactorily applied in a study of soap behavior in agricultural soil and sewage sludge samples. © 2010 SETAC.
Analytical solutions for tomato peeling with combined heat flux and convective boundary conditions
NASA Astrophysics Data System (ADS)
Cuccurullo, G.; Giordano, L.; Metallo, A.
2017-11-01
Peeling of tomatoes by radiative heating is a valid alternative to steam or lye, which are expensive and polluting methods. Suitable energy densities are required in order to realize short-time operations, thus involving only a thin layer under the tomato surface. This paper aims to predict the temperature field in rotating tomatoes exposed to the source irradiation. Therefore, a 1D unsteady analytical model is presented, which involves a semi-infinite slab subjected to time-dependent heating while convective heat transfer takes place on the exposed surface. In order to account for the tomato rotation, the heat source is described as the positive half-wave of a sinusoidal function. The problem being linear, the solution is derived following the Laplace transform method. In addition, an easy-to-handle solution for the problem at hand is presented, which assumes a differentiable function approximating the source while neglecting convective cooling, the latter contribution turning out to be negligible in the context at hand. Satisfactory agreement between the two analytical solutions is found; therefore, an easy procedure for the proper design of the dry heating system can be set up, avoiding the use of numerical simulations.
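Under the description above, the model can be stated compactly as follows (notation ours): 1D unsteady conduction in a semi-infinite slab, a surface flux equal to the positive half-wave of a sinusoid to represent rotation past the source, and convective losses at the exposed face.

```latex
% 1D unsteady conduction in a semi-infinite slab, x >= 0, as described:
% rotating exposure modelled by the positive half-wave of a sinusoidal flux,
% with convective losses at the exposed surface (notation ours).
\[
  \frac{\partial T}{\partial t} = \alpha \frac{\partial^2 T}{\partial x^2},
  \qquad x > 0,\; t > 0, \qquad T(x,0) = T_\infty,
\]
\[
  -k \left.\frac{\partial T}{\partial x}\right|_{x=0}
  = q_0 \max\!\bigl(\sin \omega t,\, 0\bigr) - h\,\bigl[T(0,t) - T_\infty\bigr],
  \qquad T(x \to \infty, t) = T_\infty .
\]
```

Because the problem is linear, a Laplace transform (or, equivalently, Duhamel superposition) yields the closed-form temperature field the abstract refers to.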
Crystal structure optimisation using an auxiliary equation of state
NASA Astrophysics Data System (ADS)
Jackson, Adam J.; Skelton, Jonathan M.; Hendon, Christopher H.; Butler, Keith T.; Walsh, Aron
2015-11-01
Standard procedures for local crystal-structure optimisation involve numerous energy and force calculations. It is common to calculate an energy-volume curve, fitting an equation of state around the equilibrium cell volume. This is a computationally intensive process, in particular, for low-symmetry crystal structures where each isochoric optimisation involves energy minimisation over many degrees of freedom. Such procedures can be prohibitive for non-local exchange-correlation functionals or other "beyond" density functional theory electronic structure techniques, particularly where analytical gradients are not available. We present a simple approach for efficient optimisation of crystal structures based on a known equation of state. The equilibrium volume can be predicted from one single-point calculation and refined with successive calculations if required. The approach is validated for PbS, PbTe, ZnS, and ZnTe using nine density functionals and applied to the quaternary semiconductor Cu2ZnSnS4 and the magnetic metal-organic framework HKUST-1.
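The idea of predicting the equilibrium volume from a single calculation via an auxiliary equation of state can be sketched as follows: a third-order Birch-Murnaghan P(V) relation with assumed bulk-modulus parameters is inverted for the zero-pressure volume. The numbers are invented, and this shows only the principle, not the authors' exact workflow or parameterization.

```python
# Sketch: predict the equilibrium volume V0 from one single-point pressure
# using a third-order Birch-Murnaghan EOS with assumed bulk-modulus
# parameters (B0, B0'). Numbers are invented.
import numpy as np
from scipy.optimize import brentq

def birch_murnaghan_pressure(V, V0, B0, B0p):
    """Third-order Birch-Murnaghan P(V), in the units of B0."""
    eta = (V0 / V) ** (1.0 / 3.0)
    return 1.5 * B0 * (eta**7 - eta**5) * (1.0 + 0.75 * (B0p - 4.0) * (eta**2 - 1.0))

V_trial = 40.0        # Å^3, volume used in the single-point calculation
P_calc = 3.2          # GPa, pressure returned by that calculation (hypothetical)
B0, B0p = 60.0, 4.5   # GPa and dimensionless, assumed for this material

# Find V0 such that the EOS reproduces the computed pressure at V_trial.
V0_est = brentq(lambda V0: birch_murnaghan_pressure(V_trial, V0, B0, B0p) - P_calc,
                30.0, 60.0)
print(f"estimated equilibrium volume: {V0_est:.2f} Å^3")
```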
On a family of nonoscillatory equations y double prime = phi(x)y
NASA Technical Reports Server (NTRS)
Gingold, H.
1988-01-01
The oscillation or nonoscillation of a class of second-order linear differential equations is investigated analytically, with a focus on cases in which the functions phi(x) and y are complex-valued. Two linear transformations are introduced, and an asymptotic-decomposition procedure involving Schur triangularization is applied. The relationship of the present analysis to the nonoscillation criterion of Kneser (1896) and other more recent results is explored in two examples.
Modified procedure to determine acid-insoluble lignin in wood and pulp
DOE Office of Scientific and Technical Information (OSTI.GOV)
Effland, M.J.
1977-10-01
If wood is treated with strong acid, carbohydrates are hydrolyzed and solubilized. The insoluble residue is by definition lignin and can be measured gravimetrically. The standard method of analysis requires samples of 1 or 2 g of wood or pulp. In research at this laboratory these amounts of sample are often not available for analytical determinations. Thus we developed a modification of the standard procedure suitable for much smaller sample amounts. The modification is based on the procedure of Saeman. Wood samples require extraction prior to lignin analysis to remove acid-insoluble extractives that will be measured as lignin. Usually this involves only a standard extraction with ethanol-benzene. However, woods high in tannin must also be subjected to extraction with alcohol. Pulps seldom require extraction.
Markiewicz, B; Sajnóg, A; Lorenc, W; Hanć, A; Komorowicz, I; Suliburska, J; Kocyłowski, R; Barałkiewicz, D
2017-11-01
Amniotic fluid is an essential factor in the development of the embryo and fetus, because the water and solutes contained in it penetrate the fetal membranes hydrostatically and osmotically and are also swallowed by the fetus. The elemental composition of amniotic fluid influences the growth and health of the fetus; therefore, analysis of amniotic fluid is important because the results can indicate abnormal levels of minerals or toxic elements. Inductively coupled plasma mass spectrometry (ICP-MS) is often used for the determination of trace- and ultra-trace-level elements in a wide range of matrices, including biological samples, because of its unique analytical capabilities. In the case of trace- and ultra-trace-level analysis, detailed characterization of the analytical procedure as well as the properties of the analytical result are particularly important. The purpose of this study was to develop a new analytical procedure for the multielemental analysis of 18 elements (Al, As, Ba, Ca, Cd, Co, Cr, Cu, Mg, Mn, Ni, Pb, Sb, Se, Sr, U, V and Zn) in amniotic fluid samples using ICP-MS. A dynamic reaction cell (DRC) with two reaction gases, ammonia and oxygen, was used to eliminate spectral interferences. Detailed validation was conducted using three certified reference materials (CRMs) and real amniotic fluid samples collected from patients. Repeatability for all analyzed analytes was found to range from 0.70% to 8.0%, and intermediate precision results varied from 1.3% to 15%. Trueness expressed as recovery ranged from 80% to 125%. Traceability was assured through the analyses of CRMs. Uncertainty of the results was also evaluated using a single-laboratory validation approach. The expanded uncertainty (U) results obtained for the CRMs, expressed as a percentage of the analyte concentration, were found to be between 8.3% for V and 45% for Cd. The standard uncertainty of the precision was found to have a greater influence on the combined standard uncertainty than the trueness component. Copyright © 2017 Elsevier B.V. All rights reserved.
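A minimal sketch of the single-laboratory-validation style of uncertainty budgeting mentioned above follows, combining an intermediate-precision term with a bias term estimated against a CRM; the numbers and the k = 2 coverage factor are illustrative, and the authors' exact uncertainty model may differ.

```python
# Sketch: single-laboratory-validation style uncertainty budget, combining
# an intermediate-precision term with a bias term estimated from a CRM.
# Numbers are invented; the authors' exact model may differ.
import math

rsd_intermediate = 0.05        # relative intermediate precision (5 %)
bias_rel = 0.04                # relative bias found against the CRM (4 %)
u_cert_rel = 0.02              # relative standard uncertainty of the CRM value

# Bias contribution: observed bias combined with the CRM uncertainty.
u_bias_rel = math.sqrt(bias_rel**2 + u_cert_rel**2)

u_combined_rel = math.sqrt(rsd_intermediate**2 + u_bias_rel**2)
U_rel = 2.0 * u_combined_rel   # expanded uncertainty, coverage factor k = 2
print(f"relative expanded uncertainty U ≈ {100 * U_rel:.1f} %")
```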
NASA Astrophysics Data System (ADS)
Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen
2016-04-01
Microextraction techniques are a valuable tool in the analytical laboratory since they allow sensitive measurements of pollutants to be carried out with easily available instrumentation. There is a large number of such procedures, involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very low amounts of organic solvents (only a few microliters) or even none at all. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful both for inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology and gives examples for the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.
40 CFR 140.5 - Analytical procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 140.5 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) MARINE SANITATION DEVICE STANDARD § 140.5 Analytical procedures. In determining the composition and quality of effluent discharge from marine sanitation devices, the procedures contained in 40 CFR part 136...
Pueyo, M; Rauret, G; Bacon, J R; Gomez, A; Muntau, H; Quevauviller, P; López-Sánchez, J F
2001-02-01
There is an increasing requirement for assessment of the bioavailable metal fraction and the mobility of trace elements in soils upon disposal. One of the approaches is the use of leaching procedures, but the results obtained are operationally defined; therefore, their significance is highly dependent on the extraction protocol performed. So, for this type of study, there is a need for reference materials that allow the quality of measurements to be controlled. This paper describes the steps involved in the certification of an organic-rich soil reference material, BCR-700, for the EDTA- and acetic acid-extractable contents of some trace elements, following collaboratively tested and harmonised extraction procedures. Details are given for the preparation of the soil, homogeneity and stability testing, analytical procedures and the statistical selection of data to be included in the certification.
NASA Astrophysics Data System (ADS)
Stoykova, Boyka; Chochkova, Maya; Ivanova, Galya; Markova, Nadezhda; Enchev, Venelin; Tsvetkova, Iva; Najdenski, Hristo; Štícha, Martin; Milkova, Tsenka
2017-05-01
N-phenylpropenoyl amino acid amides have been brominated using two alternative sonochemically activated green chemistry procedures. The first synthetic procedure involved ultrasound-assisted bromination in an aqueous medium using an ionic liquid as catalyst, whereas the second used in situ formation of Br2 via oxidation of HBr by H2O2. For comparison, the conventional bromination procedure was also used. The newly brominated compounds were characterized by appropriate analytical techniques. A detailed NMR spectroscopic analysis and quantum chemical calculations using Density Functional Theory (DFT) methods were used to define the stereochemistry of the products. The results confirmed the physicochemical identity and similar yields of the products obtained by the three synthetic procedures employed, and reveal the co-existence of two diastereoisomeric forms of the newly synthesized products. The antibacterial and antifungal activities of the dibrominated amides were evaluated.
Assessment of passive drag in swimming by numerical simulation and analytical procedure.
Barbosa, Tiago M; Ramos, Rui; Silva, António J; Marinho, Daniel A
2018-03-01
The aim was to compare passive drag during underwater gliding obtained by numerical simulation and by an analytical procedure. An Olympic swimmer was scanned by computed tomography and modelled gliding at a 0.75-m depth in the streamlined position. Steady-state computational fluid dynamics (CFD) analyses were performed in Fluent. A set of analytical procedures was selected concurrently. Friction drag (Df), pressure drag (Dpr), total passive drag force (Df+pr) and the drag coefficient (CD) were computed between 1.3 and 2.5 m·s⁻¹ by both techniques. Df+pr ranged from 45.44 to 144.06 N with CFD and from 46.03 to 167.06 N with the analytical procedure (differences: from 1.28% to 13.77%). CD ranged between 0.698 and 0.622 by CFD and between 0.657 and 0.644 by the analytical procedures (differences: 0.40-6.30%). Linear regression models showed a very high association for Df+pr plotted in absolute values (R² = 0.98) and after log-log transformation (R² = 0.99). CD also showed a very high adjustment for both absolute (R² = 0.97) and log-log plots (R² = 0.97). The bias for Df+pr was 8.37 N, and 0.076 N after logarithmic transformation. Df represented between 15.97% and 18.82% of Df+pr by CFD, and between 14.66% and 16.21% by the analytical procedures. Therefore, despite the bias, analytical procedures offer a feasible way of gathering insight into one's hydrodynamic characteristics.
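The comparison itself reduces to computing a drag coefficient from each method's total drag and regressing one series on the other, in absolute and log-log form; the sketch below shows that calculation on two invented drag series (the frontal area is also assumed), not on the paper's data.

```python
# Sketch: drag coefficient from total drag and a log-log agreement check
# between two methods. The two drag series below are invented placeholders,
# not the paper's CFD or analytical values.
import numpy as np
from scipy import stats

rho, area = 998.0, 0.09          # water density (kg/m^3), frontal area (m^2, assumed)
v = np.linspace(1.3, 2.5, 7)     # gliding velocities, m/s
drag_a = 28.0 * v**2             # method A total drag, N (placeholder)
drag_b = 26.5 * v**2.05          # method B total drag, N (placeholder)

cd_a = drag_a / (0.5 * rho * area * v**2)
cd_b = drag_b / (0.5 * rho * area * v**2)

fit_abs = stats.linregress(drag_a, drag_b)
fit_log = stats.linregress(np.log(drag_a), np.log(drag_b))
print(f"C_D range, method A: {cd_a.min():.3f}-{cd_a.max():.3f}")
print(f"R^2 absolute: {fit_abs.rvalue**2:.3f}, R^2 log-log: {fit_log.rvalue**2:.3f}")
```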
Analytical sensor redundancy assessment
NASA Technical Reports Server (NTRS)
Mulcare, D. B.; Downing, L. E.; Smith, M. K.
1988-01-01
The rationale and mechanization of sensor fault tolerance based on analytical redundancy principles are described. The concept involves the use of software procedures, such as an observer algorithm, in place of additional hardware components. The observer synthesizes values of sensor states in lieu of their direct measurement. Such information can then be used, for example, to determine which of two disagreeing sensors is more correct, thus enhancing sensor fault survivability. Here a stability augmentation system is used as an example application, with the required modifications being made to a quadruplex digital flight control system. The impact on software structure and the resultant revalidation effort are illustrated as well. Also, the use of an observer algorithm for wind-gust filtering of the angle-of-attack sensor signal is presented.
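A minimal sketch of the analytical-redundancy idea follows: a model-based estimate of the sensed state is synthesized from the known input (a full observer would also blend other, independent measurements) and used to decide which of two disagreeing sensors is more plausible. The scalar plant, noise levels and failure bias are invented.

```python
# Sketch: analytical redundancy. A model-based estimate of the sensed state
# (propagated here from the known control input; a real observer would also
# be corrected by other, independent sensor channels) is used to decide
# which of two disagreeing sensors is more plausible. Everything is invented.
import numpy as np

a, b, c = 0.95, 0.10, 1.0          # scalar plant: x[k+1] = a*x + b*u,  y = c*x
rng = np.random.default_rng(1)
x, x_hat = 0.0, 0.0

for k in range(200):
    u = np.sin(0.05 * k)
    x = a * x + b * u                               # true (unmeasured) state
    x_hat = a * x_hat + b * u                       # synthesized state estimate

sensor_1 = c * x + rng.normal(0, 0.02)              # healthy channel
sensor_2 = c * x + rng.normal(0, 0.02) + 0.5        # biased (failed) channel
estimate = c * x_hat

picked = 1 if abs(sensor_1 - estimate) < abs(sensor_2 - estimate) else 2
print(f"synthesized value {estimate:.3f}; sensor 1 = {sensor_1:.3f}, "
      f"sensor 2 = {sensor_2:.3f}; trusting sensor {picked}")
```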
[Biosensor development in clinical analysis].
Boitieux, J L; Desmet, G; Thomas, D
1985-01-01
The use of immobilized enzymes, or of enzymes as markers, has been the subject of more than a thousand publications in industrial and biomedical applications during the last five years. Recently, some authors have published work on the immobilization of whole microorganisms for catalytic purposes, while others use enzymatic activity to label molecules involved in immunological analysis. Both industrial biotechnology and the medical analysis laboratory are interested in the evolution of these procedures involving immobilized enzyme activity. Enzyme immobilization lowers analysis costs because, in this case, the enzyme can be reused several times. We consider the two main situations encountered when immobilized enzymes are used for analytical purposes: the enzyme is used directly to catalyse the reaction, or it serves as an enzymatic marker. Both aspects are developed mainly for the elaboration of enzymatic and immunoenzymatic electrodes and for the realization of automatic computerized devices allowing continuous estimation of numerous biological blood parameters. Using two specific examples, glucose and antigen determination, the authors show the evolution of these technologies in the field of immobilized enzymes and sensors, together with the analysis of the signals given by these electrodes, which requires computerized treatment. This new technology opens up important possibilities in the analytical field. The automation of these devices, allowing real-time control, will probably ease the optimization of procedures currently used in the biomedical field.
14 CFR 34.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Exhaust Gaseous Emissions (Aircraft and Aircraft Gas Turbine Engines) § 34.64 Sampling and analytical procedures for measuring gaseous exhaust emissions. The...
14 CFR 34.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.
Code of Federal Regulations, 2011 CFR
2011-01-01
... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Exhaust Gaseous Emissions (Aircraft and Aircraft Gas Turbine Engines) § 34.64 Sampling and analytical procedures for measuring gaseous exhaust emissions. The...
The Da Vinci European BioBank: A Metabolomics-Driven Infrastructure
Carotenuto, Dario; Luchinat, Claudio; Marcon, Giordana; Rosato, Antonio; Turano, Paola
2015-01-01
We present here the organization of the recently-constituted da Vinci European BioBank (daVEB, https://www.davincieuropeanbiobank.org/it). The biobank was created as an infrastructure to support the activities of the Fiorgen Foundation (http://www.fiorgen.net/), a nonprofit organization that promotes research in the field of pharmacogenomics and personalized medicine. The way operating procedures concerning samples and data have been developed at daVEB largely stems from the strong metabolomics connotation of Fiorgen and from the involvement of the scientific collaborators of the foundation in international/European projects aimed to tackle the standardization of pre-analytical procedures and the promotion of data standards in metabolomics. PMID:25913579
Tracking Matrix Effects in the Analysis of DNA Adducts of Polycyclic Aromatic Hydrocarbons
Klaene, Joshua J.; Flarakos, Caroline; Glick, James; Barret, Jennifer T.; Zarbl, Helmut; Vouros, Paul
2015-01-01
LC-MS using electrospray ionization is currently the method of choice in bio-organic analysis, covering a wide range of applications in a broad spectrum of biological media. The technique is noted for its high sensitivity, but one major limitation which hinders achievement of its optimal sensitivity is signal suppression due to matrix interferences introduced by the presence of co-extracted compounds during the sample preparation procedure. The analysis of DNA adducts of common environmental carcinogens is particularly sensitive to such matrix effects, as sample preparation is a multistep process which involves "contamination" of the sample through the addition of enzymes and other reagents for digestion of the DNA in order to isolate the analyte(s). This problem is further exacerbated by the need to reach low levels of quantitation (LOQ at the ppb level) while also working with limited (2-5 μg) quantities of sample. We report here on the systematic investigation of ion signal suppression contributed by each individual step involved in the sample preparation associated with the analysis of DNA adducts of polycyclic aromatic hydrocarbons (PAH), using as model analyte dG-BaP, the deoxyguanosine adduct of benzo[a]pyrene (BaP). The individual matrix contribution of each one of these sources to the analyte signal was systematically addressed, as were any interactive effects. The information was used to develop a validated analytical protocol for the target biomarker at levels typically encountered in vivo, using as little as 2 μg of DNA, and applied to a dose-response study using a metabolically competent cell line. PMID:26607319
Irregular analytical errors in diagnostic testing - a novel concept.
Vogeser, Michael; Seger, Christoph
2018-02-23
In laboratory medicine, routine periodic analyses for internal and external quality control measurements interpreted by statistical methods are mandatory for batch clearance. Data analysis of these process-oriented measurements allows for insight into random analytical variation and systematic calibration bias over time. However, in such a setting, any individual sample is not under individual quality control; the quality control measurements act only at the batch level. Quantitative or qualitative results can be compromised for any analyte by the many effects and interferences associated with an individual diagnostic sample, and it is obvious that a quality-control-sample-based approach to quality assurance is not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term called the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample, an irregular analytical error is defined as an inaccuracy (that is, a deviation from the reference measurement procedure result) of a test result that is so high it cannot be explained by the measurement uncertainty of the utilized routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be coined irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or by an individual, single-sample-associated processing error in the analytical process. Currently, the availability of reference measurement procedures is still highly limited, but LC-isotope-dilution mass spectrometry methods are increasingly used for pre-market validation of routine diagnostic assays (these tests also involve substantial sets of clinical validation samples). Based on this definition/terminology, we list recognized causes of irregular analytical error as a risk catalog for clinical chemistry in this article. These issues include reproducible individual analytical errors (e.g. caused by anti-reagent antibodies) and non-reproducible, sporadic errors (e.g. errors due to an incorrect pipetting volume caused by air bubbles in a sample), both of which can lead to inaccurate results and risks for patients.
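Read as a decision rule, the definition above can be sketched as follows: a result is flagged as an irregular analytical error when its deviation from the reference-procedure value exceeds what the routine process uncertainty plus the known method bias can explain. The coverage factor and numbers are invented for illustration.

```python
# Sketch of the definition as a decision rule: flag an individual result as an
# irregular analytical error when its deviation from the reference-procedure
# value exceeds what routine process uncertainty plus method bias can explain.
# Coverage factor and numbers are invented for illustration.
def irregular_error(result, reference_value, u_process, bias, k=3.0):
    """True if |result - reference| cannot be explained by k*u_process + |bias|."""
    allowed = k * u_process + abs(bias)
    return abs(result - reference_value) > allowed

# Example: routine assay with a process SD of 3 units and a known bias of +2
# units at this concentration level.
ref = 100.0                      # reference measurement procedure result
print(irregular_error(104.0, ref, u_process=3.0, bias=2.0))   # False: explainable
print(irregular_error(125.0, ref, u_process=3.0, bias=2.0))   # True: irregular
```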
Mass Spectrometric Rapid Diagnosis of Infectious Diseases.
1980-01-01
the analytical procedures to warrant reporting anew the whole analytical procedure. A. Sample Collection and Storage Procedure Urine samples were...positives or false-negatives. Next we have carried out a longitudinal study on urine samples obtained from groups of volunteer subjects vaccinated with...sterilization and storage procedures. 2. Developed new, simpler sample preparation techniques including one to handle tissue culture media. 3. Improved on the
General method of solving the Schroedinger equation of atoms and molecules
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakatsuji, Hiroshi
2005-12-15
We propose a general method of solving the Schroedinger equation of atoms and molecules. We first construct the wave function having the exact structure, using the ICI (iterative configuration/complement interaction) method, and then optimize the variables involved by the variational principle. Based on the scaled Schroedinger equation and related principles, we can avoid the singularity problem of atoms and molecules and formulate a general method of calculating the exact wave functions in an analytical expansion form. We choose an initial function ψ0 and a scaling function g, and the ICI method then automatically generates the wave function that has the exact structure by using the Hamiltonian of the system, which contains all the information of the system. The free ICI method provides a flexible and variationally favorable procedure for constructing the exact wave function. We explain the computational procedure of the analytical ICI method routinely performed in our laboratory. Simple examples are given using the hydrogen atom for the nuclear singularity case, the Hooke's atom for the electron singularity case, and the helium atom for both cases.
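For orientation, the simplest ICI recursion can be written as below; this is a sketch of the general idea under the scaled-equation framework described, not the paper's full free-ICI expansion.

```latex
% Simplest ICI recursion (sketch): g is the scaling function that regularizes the
% Coulomb singularities and C_n is determined variationally at each iteration.
\psi_{n+1} \;=\; \bigl[\, 1 + C_n\, g\, (H - E_n) \,\bigr]\, \psi_n ,
\qquad
E_n \;=\; \frac{\langle \psi_n \,|\, H \,|\, \psi_n \rangle}{\langle \psi_n \,|\, \psi_n \rangle}.
```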
Toward improved understanding and control in analytical atomic spectrometry
NASA Astrophysics Data System (ADS)
Hieftje, Gary M.
1989-01-01
As with most papers which attempt to predict the future, this treatment will begin with a coverage of past events. It will be shown that progress in the field of analytical atomic spectrometry has occurred through a series of steps which involve the addition of new techniques and the occasional displacement of established ones. Because it is difficult or impossible to presage true breakthroughs, this manuscript will focus on how such existing methods can be modified or improved to greatest advantage. The thesis will be that rational improvement can be accomplished most effectively by understanding fundamentally the nature of an instrumental system, a measurement process, and a spectrometric technique. In turn, this enhanced understanding can lead to closer control, from which can spring improved performance. Areas where understanding is now lacking and where control is most greatly needed will be identified and a possible scheme for implementing control procedures will be outlined. As we draw toward the new millennium, these novel procedures seem particularly appealing; new high-speed computers, the availability of expert systems, and our enhanced understanding of atomic spectrometric events combine to make future prospects extremely bright.
14 CFR 34.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.
Code of Federal Regulations, 2012 CFR
2012-01-01
... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE... Turbine Engines) § 34.64 Sampling and analytical procedures for measuring gaseous exhaust emissions. The...
Maloney, T.J.; Ludtke, A.S.; Krizman, T.L.
1994-01-01
The U.S. Geological Survey operates a quality-assurance program based on the analyses of reference samples for the National Water Quality Laboratory in Arvada, Colorado, and the Quality of Water Service Unit in Ocala, Florida. Reference samples containing selected inorganic, nutrient, and low ionic-strength constituents are prepared and disguised as routine samples. The program goal is to determine precision and bias for as many analytical methods offered by the participating laboratories as possible. The samples typically are submitted at a rate of approximately 5 percent of the annual environmental sample load for each constituent. The samples are distributed to the laboratories throughout the year. Analytical data for these reference samples reflect the quality of environmental sample data produced by the laboratories because the samples are processed in the same manner for all steps from sample login through data release. The results are stored permanently in the National Water Data Storage and Retrieval System. During water year 1991, 86 analytical procedures were evaluated at the National Water Quality Laboratory and 37 analytical procedures were evaluated at the Quality of Water Service Unit. An overall evaluation of the inorganic (major ion and trace metal) constituent data for water year 1991 indicated analytical imprecision in the National Water Quality Laboratory for 5 of 67 analytical procedures: aluminum (whole-water recoverable, atomic emission spectrometric, direct-current plasma); calcium (atomic emission spectrometric, direct); fluoride (ion-exchange chromatographic); iron (whole-water recoverable, atomic absorption spectrometric, direct); and sulfate (ion-exchange chromatographic). The results for 11 of 67 analytical procedures had positive or negative bias during water year 1991. Analytical imprecision was indicated in the determination of two of the five National Water Quality Laboratory nutrient constituents: orthophosphate as phosphorus and phosphorus. A negative or positive bias condition was indicated in three of five nutrient constituents. There was acceptable precision and no indication of bias for the 14 low ionic-strength analytical procedures tested in the National Water Quality Laboratory program and for the 32 inorganic and 5 nutrient analytical procedures tested in the Quality of Water Service Unit during water year 1991.
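The precision and bias statistics for a single analytical procedure can be illustrated with a few blind reference-sample results; the sketch below uses hypothetical numbers and a simple mean/standard-deviation summary, not the Survey's full evaluation protocol.

```python
# Hedged sketch: estimating bias and precision of one analytical procedure from
# disguised reference-sample results (hypothetical data; not the USGS protocol).
import statistics

reference_mpv = 25.0                      # most probable value assigned to the reference sample
results = [24.1, 25.6, 24.8, 26.0, 23.9]  # blind reference-sample results reported by the laboratory

bias = statistics.mean(results) - reference_mpv
precision = statistics.stdev(results)     # sample standard deviation of reported results
print(f"bias = {bias:+.2f}, relative bias = {100 * bias / reference_mpv:+.1f}%")
print(f"precision (s) = {precision:.2f}")
```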
Hogendoorn, E A; Westhuis, K; Dijkman, E; Heusinkveld, H A; den Boer, A C; Evers, E A; Baumann, R A
1999-10-08
The coupled-column (LC-LC) configuration consisting of a 3 microm C18 column (50 x 4.6 mm I.D.) as the first column and a 5 microm C18 semi-permeable-surface (SPS) column (150 x 4.6 mm I.D.) as the second column appeared to be successful for the screening of acidic pesticides in surface water samples. In comparison to LC-LC employing two C18 columns, the combination of C18/SPS-C18 significantly decreased the baseline deviation caused by the hump of the co-extracted humic substances when using UV detection (217 nm). The developed LC-LC procedure allowed the simultaneous determination of the target analytes bentazone and bromoxynil in uncleaned extracts of surface water samples down to a level of 0.05 microg/l in less than 15 min. In combination with a simple solid-phase extraction step (200 ml of water on 500 mg of C18-bonded silica) the analytical procedure provides a high sample throughput. During a period of about five months more than 200 ditch-water samples originating from agricultural locations were analyzed with the developed procedure. Validation of the method was performed by randomly analyzing recoveries of water samples spiked at levels of 0.1 microg/l (n=10), 0.5 microg/l (n=7) and 2.5 microg/l (n=4). Weighted regression of the recovery data showed that the method provides overall recoveries of 95 and 100% for bentazone and bromoxynil, respectively, with corresponding intra-laboratory reproducibilities of 10 and 11%, respectively. Confirmation of the analytes in part of the sample extracts was carried out by GC-negative ion chemical ionization MS involving a derivatization step with bis(trifluoromethyl)benzyl bromide. No false negatives or positives were observed.
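A weighted recovery regression of this kind can be sketched as below, with hypothetical spike/found values and 1/x^2 weights standing in for whatever weighting the authors actually used; the slope estimates the overall recovery.

```python
# Hedged sketch of a weighted regression of recovery data (hypothetical values;
# the 1/x^2 weighting scheme is an assumption, not taken from the paper).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
spiked = np.array([0.1] * 10 + [0.5] * 7 + [2.5] * 4)                 # spike levels, microg/L
found = spiked * 0.97 + spiked * rng.normal(0.0, 0.03, spiked.size)   # ~3% proportional noise

X = sm.add_constant(spiked)
fit = sm.WLS(found, X, weights=1.0 / spiked**2).fit()
print(f"overall recovery = {100 * fit.params[1]:.0f}%")               # slope read as recovery
print(f"intercept = {fit.params[0]:.4f} microg/L")
```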
Code of Federal Regulations, 2013 CFR
2013-07-01
... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...
Code of Federal Regulations, 2014 CFR
2014-07-01
... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...
Code of Federal Regulations, 2011 CFR
2011-07-01
... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...
Genetics-based methods for detection of Salmonella spp. in foods.
Mozola, Mark A
2006-01-01
Genetic methods are now at the forefront of foodborne pathogen testing. The sensitivity, specificity, and inclusivity advantages offered by deoxyribonucleic acid (DNA) probe technology have driven an intense effort in methods development over the past 20 years. DNA probe-based methods for Salmonella spp. and other pathogens have progressed from time-consuming procedures involving the use of radioisotopes to simple, high throughput, automated assays. The analytical sensitivity of nucleic acid amplification technology has facilitated a reduction in analysis time by allowing enriched samples to be tested for previously undetectable quantities of analyte. This article will trace the evolution of the development of genetic methods for detection of Salmonella in foods, review the basic assay formats and their advantages and limitations, and discuss method performance characteristics and considerations for selection of methods.
NASA Astrophysics Data System (ADS)
Cacho, Juan-Ignacio; Campillo, Natalia; Viñas, Pilar; Hernandez-Cordoba, Manuel
2015-04-01
Nitrophenols (NPs) are widely distributed environmental contaminants that can be present in soils and sediments due to the degradation of some pesticides (parathion and fenitrothion) or to accidental spilling in ammunition plants or storage places. This communication reports a rapid and sensitive procedure for the determination of the most common NPs in soils using gas chromatography coupled to mass spectrometry (GC-MS) as the analytical technique. Ultrasound-assisted extraction (UAE) was employed to extract the NPs from the soil samples into an organic solvent. Next, the resulting UAE extracts were submitted to dispersive liquid-liquid microextraction (DLLME) to achieve an effective preconcentration. DLLME is an easy-to-carry-out, environmentally friendly separation technique involving minimal amounts of organic solvents. Since the volatility of NPs is low, the compounds were derivatized using a simple "in situ" acetylation procedure prior to the GC-MS measurement. The main parameters affecting the UAE stage, as well as the DLLME and derivatization steps, were investigated in search of maximum analytical signals. The optimized procedure provided extraction recoveries in the 72-86% range, with precision values (expressed as relative standard deviation, RSD) ≤ 12%, and detection limits ranging from 1.3 to 3.3 ng g(-1), depending on the compound. Twenty soil and sediment samples from military, industrial and agricultural areas were analyzed with the proposed procedure in order to check its applicability.
ERIC Educational Resources Information Center
Graudins, Maija M.; Rehfeldt, Ruth Anne; DeMattei, Ronda; Baker, Jonathan C.; Scaglia, Fiorella
2012-01-01
Performing oral care procedures with children with autism who exhibit noncompliance can be challenging for oral care professionals. Previous research has elucidated a number of effective behavior analytic procedures for increasing compliance, but some procedures are likely to be too time consuming and expensive for community-based oral care…
Derivation and application of a class of generalized boundary conditions
NASA Technical Reports Server (NTRS)
Senior, Thomas B. A.; Volakis, John L.
1989-01-01
Boundary conditions involving higher order derivatives are presented for simulating surfaces whose reflection coefficients are known analytically, numerically, or experimentally. Procedures for determining the coefficients of the derivatives are discussed, along with the effect of displacing the surface where the boundary conditions are applied. Provided the coefficients satisfy a duality relation, equivalent forms of the boundary conditions involving tangential field components are deduced, and these provide the natural extension to nonplanar surfaces. As an illustration, the simulation of metal-backed uniform and three-layer dielectric coatings is given. It is shown that fourth order conditions are capable of providing an accurate simulation for uniform coating at least a quarter of a wavelength in thickness.
Gentili, Stefano; Mortali, Claudia; Mastrobattista, Luisa; Berretta, Paolo; Zaami, Simona
2016-09-10
A procedure based on headspace solid-phase microextraction (HS-SPME) coupled with gas chromatography/mass spectrometry (GC/MS) has been developed for the determination of the most commonly used drugs of abuse in the sweat of drivers stopped during roadside controls. The DrugWipe 5A sweat screening device was used to collect sweat with a specific pad rubbed gently over the forehead skin surface. The procedure involved acid hydrolysis, HS-SPME extraction for all drugs of abuse except Δ(9)-tetrahydrocannabinol, which was extracted directly under alkaline-medium HS-SPME conditions, GC separation of the analytes on a capillary column, and MS detection by electron impact ionisation. The method was linear from the limit of quantification (LOQ) to 50 ng of drug per pad (r(2)≥0.99), with intra- and inter-assay precision and accuracy always less than 15% and an analytical recovery between 95.1% and 102.8%, depending on the considered analyte. Using the validated method, sweat samples from 60 apparently intoxicated drivers were found positive for one or more drugs of abuse, showing sweat-patch testing to be a viable, economical and simple alternative to conventional (blood and/or urine) and non-conventional (oral fluid) testing for drugs of abuse in drugged drivers. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Hess, R. A.
1976-01-01
Paramount to proper utilization of electronic displays is a method for determining pilot-centered display requirements. Display design should be viewed fundamentally as a guidance and control problem which has interactions with the designer's knowledge of human psychomotor activity. From this standpoint, reliable analytical models of human pilots as information processors and controllers can provide valuable insight into the display design process. A relatively straightforward, nearly algorithmic procedure for deriving model-based, pilot-centered display requirements was developed and is presented. The optimal or control theoretic pilot model serves as the backbone of the design methodology, which is specifically directed toward the synthesis of head-down, electronic, cockpit display formats. Some novel applications of the optimal pilot model are discussed. An analytical design example is offered which defines a format for the electronic display to be used in a UH-1H helicopter in a landing approach task involving longitudinal and lateral degrees of freedom.
Integrating laboratory robots with analytical instruments--must it really be so difficult?
Kramer, G W
1990-09-01
Creating a reliable system from discrete laboratory instruments is often a task fraught with difficulties. While many modern analytical instruments are marvels of detection and data handling, attempts to create automated analytical systems incorporating such instruments are often frustrated by their human-oriented control structures and their egocentricity. The laboratory robot, while fully susceptible to these problems, extends such compatibility issues to the physical dimensions involving sample interchange, manipulation, and event timing. The workcell concept was conceived to describe the procedure and equipment necessary to carry out a single task during sample preparation. This notion can be extended to organize all operations in an automated system. Each workcell, no matter how complex its local repertoire of functions, must be minimally capable of accepting information (commands, data), returning information on demand (status, results), and being started, stopped, and reset by a higher level device. Even the system controller should have a mode where it can be directed by instructions from a higher level.
Advances in the analysis of biological samples using ionic liquids.
Clark, Kevin D; Trujillo-Rodríguez, María J; Anderson, Jared L
2018-02-12
Ionic liquids are a class of solvents and materials that hold great promise in bioanalytical chemistry. Task-specific ionic liquids have recently been designed for the selective extraction, separation, and detection of proteins, peptides, nucleic acids, and other physiologically relevant analytes from complex biological samples. To facilitate rapid bioanalysis, ionic liquids have been integrated in miniaturized and automated procedures. Bioanalytical separations have also benefited from the modification of nonspecific magnetic materials with ionic liquids or the implementation of ionic liquids with inherent magnetic properties. Furthermore, the direct detection of the extracted molecules in the analytical instrument has been demonstrated with structurally tuned ionic liquids and magnetic ionic liquids, providing a significant advantage in the analysis of low-abundance analytes. This article gives an overview of these advances that involve the application of ionic liquids and derivatives in bioanalysis. Graphical abstract Ionic liquids, magnetic ionic liquids, and ionic liquid-based sorbents are increasing the speed, selectivity, and sensitivity in the analysis of biological samples.
Stege, Patricia W; Sombra, Lorena L; Messina, Germán A; Martinez, Luis D; Silva, María F
2009-05-01
Many aromatic compounds can be found in the environment as a result of anthropogenic activities and some of them are highly toxic. The need to determine low concentrations of pollutants requires analytical methods with high sensitivity, selectivity, and resolution for application to soil, sediment, water, and other environmental samples. Complex sample preparation involving analyte isolation and enrichment is generally necessary before the final analysis. The present paper outlines a novel, simple, low-cost, and environmentally friendly method for the simultaneous determination of p-nitrophenol (PNP), p-aminophenol (PAP), and hydroquinone (HQ) by micellar electrokinetic capillary chromatography after preconcentration by cloud point extraction. Enrichment factors of 180 to 200 were achieved. The limits of detection of the analytes for the preconcentration of 50-ml sample volume were 0.10 microg L(-1) for PNP, 0.20 microg L(-1) for PAP, and 0.16 microg L(-1) for HQ. The optimized procedure was applied to the determination of phenolic pollutants in natural waters from San Luis, Argentina.
Goudarzi, Nasser
2009-02-11
A simple, low-cost and highly sensitive method based on solvent microextraction (SME) for separation/preconcentration followed by flame atomic absorption spectrometry (FAAS) was proposed for the determination of ultratrace amounts of cadmium in meat and fish samples. The analytical procedure involved the formation of a hydrophobic complex by mixing the analyte solution with an ammonium pyrrolidinedithiocarbamate (APDC) solution. Under suitable conditions, the cadmium-APDC complex partitioned into the organic micro-phase and separation of the analyte from the matrix was thus achieved. Under optimal chemical and instrumental conditions, a detection limit (3 sigma) of 0.8 ng L(-1) and an enrichment factor of 93 were achieved. The relative standard deviation of the method was found to be 2.2% for Cd. The interference effects of some anions and cations were also investigated. The developed method has been applied to the determination of trace Cd in meat and fish samples.
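The two headline figures of merit quoted here, the 3-sigma detection limit and the enrichment factor, can be computed as in the sketch below; the blank signals and calibration slopes are hypothetical, chosen only to give numbers of the same order as the abstract.

```python
# Hedged sketch: 3-sigma detection limit and enrichment factor for a
# preconcentration/FAAS procedure (all input values are hypothetical).
import statistics

blank_signals = [0.0021, 0.0019, 0.0023, 0.0020, 0.0022,
                 0.0018, 0.0021, 0.0020, 0.0022, 0.0019]   # absorbance of replicate blanks
slope_after_preconcentration = 0.0025                      # absorbance per (ng/L) after SME
slope_direct_aspiration = 2.7e-5                           # absorbance per (ng/L) without SME

lod = 3 * statistics.stdev(blank_signals) / slope_after_preconcentration    # ng/L
enrichment_factor = slope_after_preconcentration / slope_direct_aspiration  # ratio of slopes
print(f"LOD = {lod:.2f} ng/L, enrichment factor = {enrichment_factor:.0f}")
```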
Applications of computer algebra to distributed parameter systems
NASA Technical Reports Server (NTRS)
Storch, Joel A.
1993-01-01
In the analysis of vibrations of continuous elastic systems, one often encounters complicated transcendental equations whose roots are directly related to the system's natural frequencies. Typically, these equations contain system parameters whose values must be specified before a numerical solution can be obtained. The present paper describes a method whereby the fundamental frequency can be obtained in analytical form to any desired degree of accuracy. The method is based upon truncation of rapidly converging series involving inverse powers of the system natural frequencies. A straightforward method for developing these series and summing them in closed form is presented. It is demonstrated how computer algebra can be exploited to perform the intricate analytical procedures which otherwise would render the technique difficult to apply in practice. We illustrate the method by developing two analytical approximations to the fundamental frequency of a vibrating cantilever carrying a rigid tip body. The results are compared with the numerical solution of the exact (transcendental) frequency equation over a range of system parameters.
Goveia, Danielle; Rosa, André Henrique; Bellin, Iramaia Corrêa; Lobo, Fabiana Aparecida; Fraceto, Leonardo Fernandes; Roveda, José Arnaldo Frutuoso; Romão, Luciane Pimenta Cruz; Dias Filho, Newton Luiz
2008-02-01
This work involved the development and application of a new analytical procedure for in-situ characterization of the lability of metal species in aquatic systems by using a system equipped with a diffusion membrane and cellulose organomodified with p-aminobenzoic acid groups (DM-Cell-PAB). To this end, the DM-Cell-PAB system was prepared by adding cellulose organomodified with p-aminobenzoic acid groups (Cell-PAB) to pre-purified cellulose bags. After the DM-Cell-PAB system was sealed, it was examined in the laboratory. The in-situ application involved immersing the DM-Cell-PAB system in two different rivers, enabling us to study the relative lability of metal species (Cu, Cd, Fe, Mn, and Ni) as a function of time and quantity of exchanger. The procedure is simple and opens up a new perspective for understanding environmental phenomena relating to the complexation, transport, stability, and lability of metal species in aquatic systems rich in organic matter.
Review of calcium methodologies.
Zak, B; Epstein, E; Baginski, E S
1975-01-01
A review of calcium methodologies for serum has been described. The analytical systems developed over the past century have been classified as to type beginning with gravimetry and extending to isotope dilution-mass spectrometry by covering all of the commonly used technics which have evolved during that period. Screening and referee procedures are discussed along with comparative sensitivities encountered between atomic absorption spectrophotometry and molecular absorption spectrophotometry. A procedure involving a simple direct reaction for serum calcium using cresolphthalein complexone is recommended in which high blanks are minimized by repressing the ionization of the color reagent on lowering the dielectric constant characteristics of the mixture with dimethylsulfoxide. Reaction characteristics, errors which can be encountered, normal ranges and an interpretative resume are included in its discussion.
Lombardi, Giovanni; Sansoni, Veronica; Banfi, Giuseppe
2017-08-01
In the last few years, a growing number of molecules have been associated with an endocrine function of the skeletal muscle. Circulating myokine levels, in turn, have been associated with several pathophysiological conditions, including cardiovascular ones. However, data from different studies are often not completely comparable or are even discordant. This is due, at least in part, to the whole set of circumstances related to the preparation of the patient prior to blood sampling, the blood sampling procedure, and sample processing and/or storage. This entire process constitutes the pre-analytical phase. The importance of the pre-analytical phase is often not considered; however, in routine diagnostics, 70% of errors occur in this phase. Moreover, errors made during the pre-analytical phase are carried over into the analytical phase and affect the final output. In research, for example, when samples are collected over a long time and by different laboratories, a standardized procedure for sample collection and a correct procedure for sample storage are acknowledged to be essential. In this review, we discuss the pre-analytical variables potentially affecting the measurement of myokines with cardiovascular functions.
Ludtke, Amy S.; Woodworth, Mark T.; Marsh, Philip S.
2000-01-01
The U.S. Geological Survey operates a quality-assurance program based on the analyses of reference samples for two laboratories: the National Water Quality Laboratory and the Quality of Water Service Unit. Reference samples that contain selected inorganic, nutrient, and low-level constituents are prepared and submitted to the laboratory as disguised routine samples. The program goal is to estimate precision and bias for as many analytical methods offered by the participating laboratories as possible. Blind reference samples typically are submitted at a rate of 2 to 5 percent of the annual environmental-sample load for each constituent. The samples are distributed to the laboratories throughout the year. The reference samples are subject to the identical laboratory handling, processing, and analytical procedures as those applied to environmental samples and, therefore, have been used as an independent source to verify bias and precision of laboratory analytical methods and ambient water-quality measurements. The results are stored permanently in the National Water Information System and the Blind Sample Project's data base. During water year 1998, 95 analytical procedures were evaluated at the National Water Quality Laboratory and 63 analytical procedures were evaluated at the Quality of Water Service Unit. An overall evaluation of the inorganic and low-level constituent data for water year 1998 indicated 77 of 78 analytical procedures at the National Water Quality Laboratory met the criteria for precision. Silver (dissolved, inductively coupled plasma-mass spectrometry) was determined to be imprecise. Five of 78 analytical procedures showed bias throughout the range of reference samples: chromium (dissolved, inductively coupled plasma-atomic emission spectrometry), dissolved solids (dissolved, gravimetric), lithium (dissolved, inductively coupled plasma-atomic emission spectrometry), silver (dissolved, inductively coupled plasma-mass spectrometry), and zinc (dissolved, inductively coupled plasma-mass spectrometry). At the National Water Quality Laboratory during water year 1998, lack of precision was indicated for 2 of 17 nutrient procedures: ammonia as nitrogen (dissolved, colorimetric) and orthophosphate as phosphorus (dissolved, colorimetric). Bias was indicated throughout the reference sample range for ammonia as nitrogen (dissolved, colorimetric, low level) and nitrate plus nitrite as nitrogen (dissolved, colorimetric, low level). All analytical procedures tested at the Quality of Water Service Unit during water year 1998 met the criteria for precision. One of the 63 analytical procedures indicated a bias throughout the range of reference samples: aluminum (whole-water recoverable, inductively coupled plasma-atomic emission spectrometry, trace).
Fluorescence In Situ Hybridization Probe Validation for Clinical Use.
Gu, Jun; Smith, Janice L; Dowling, Patricia K
2017-01-01
In this chapter, we provide a systematic overview of the published guidelines and validation procedures for fluorescence in situ hybridization (FISH) probes for clinical diagnostic use. FISH probes-which are classified as molecular probes or analyte-specific reagents (ASRs)-have been extensively used in vitro for both clinical diagnosis and research. Most commercially available FISH probes in the United States are strictly regulated by the U.S. Food and Drug Administration (FDA), the Centers for Disease Control and Prevention (CDC), the Centers for Medicare & Medicaid Services (CMS) through the Clinical Laboratory Improvement Amendments (CLIA), and the College of American Pathologists (CAP). Although home-brewed FISH probes-defined as probes made in-house or acquired from a source that does not supply them to other laboratories-are not regulated by these agencies, they too must undergo the same individual validation process prior to clinical use as their commercial counterparts. Validation of a FISH probe involves initial validation and ongoing verification of the test system. Initial validation includes assessment of a probe's technical specifications, establishment of its standard operating procedure (SOP), determination of its clinical sensitivity and specificity, development of its cutoff, baseline, and normal reference ranges, gathering of analytics, confirmation of its applicability to a specific research or clinical setting, testing of samples with or without the abnormalities that the probe is meant to detect, staff training, and report building. Ongoing verification of the test system involves testing additional normal and abnormal samples using the same method employed during the initial validation of the probe.
Raman spectroscopy for the analytical quality control of low-dose break-scored tablets.
Gómez, Diego A; Coello, Jordi; Maspoch, Santiago
2016-05-30
Quality control of solid dosage forms involves the analysis of end products according to well-defined criteria, including assessment of the uniformity of dosage units (UDU). However, in the case of break-scored tablets, given that tablet splitting is widespread as a means of adjusting doses, the uniform distribution of the active pharmaceutical ingredient (API) in all the possible fractions of the tablet must also be assessed. A general procedure to address both issues using Raman spectroscopy is presented. It is based on the acquisition of a collection of spectra in different regions of the tablet, which can later be selected to determine the amount of API in the potential fractions resulting from splitting. The procedure has been applied to two commercial products, Sintrom 1 and Sintrom 4, with API (acenocoumarol) mass proportions of 2% and 0.7%, respectively. Partial Least Squares (PLS) calibration models were constructed for the quantification of acenocoumarol in whole tablets using HPLC as the reference analytical method. Once validated, the calibration models were used to determine the API content in the different potential fragments of the scored Sintrom 4 tablets. Fragment mass measurements were also performed to estimate the range of masses of the halves and quarters that could result from tablet splitting. The results show that Raman spectroscopy can be an alternative analytical procedure for assessing the uniformity of content, both in whole tablets and in their potential fragments, and that Sintrom 4 tablets can be split perfectly into halves, although some caution has to be exercised when considering fragmentation into quarters. A practical alternative to the UDU test for the assessment of tablet fragments is proposed. Copyright © 2016 Elsevier B.V. All rights reserved.
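A PLS calibration of the kind described can be sketched with synthetic spectra as below; the spectra, component count and fragment spectrum are stand-ins, not the Sintrom data.

```python
# Hedged sketch of a PLS calibration relating Raman spectra to API content and its
# use on a tablet-fragment spectrum (synthetic data; not the Sintrom measurements).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
api_content = rng.uniform(0.5, 1.5, 40)                    # reference API amounts (e.g. from HPLC)
pure_band = np.sin(np.linspace(0, 6, 300)) ** 2            # fake pure-component Raman band
X = np.outer(api_content, pure_band) + rng.normal(0, 0.05, (40, 300))   # synthetic spectra

pls = PLSRegression(n_components=3)
cv_pred = cross_val_predict(pls, X, api_content, cv=5).ravel()
print(f"RMSECV = {np.sqrt(np.mean((cv_pred - api_content) ** 2)):.3f}")

pls.fit(X, api_content)
half_tablet_spectrum = X[:1] / 2                           # crude stand-in for a half-tablet spectrum
print("predicted API in fragment:", float(pls.predict(half_tablet_spectrum)[0, 0]))
```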
ERIC Educational Resources Information Center
Fisher, James E.; Sealey, Ronald W.
The study describes the analytical pragmatic structure of concepts and applies this structure to the legal concept of procedural due process. This structure consists of form, purpose, content, and function. The study conclusions indicate that the structure of the concept of procedural due process, or any legal concept, is not the same as the…
Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van
2018-04-01
In this work, an advanced analytical procedure was applied to calculate radioactivity in spiked water samples in a close-geometry gamma spectrometry setup. The procedure used the MCNP-CP code to calculate the coincidence summing correction factor (CSF). The CSF results were validated by a deterministic method using the ETNA code for both p-type HPGe detectors, and the two codes showed good agreement. Finally, the validity of the developed procedure was confirmed by a proficiency test in which the activities of various radionuclides were calculated. The radioactivity measurements with both detectors using the advanced analytical procedure received "Accepted" status in the proficiency test. Copyright © 2018 Elsevier Ltd. All rights reserved.
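One common convention for applying such a correction factor in efficiency-calibrated gamma spectrometry is shown below; whether the CSF enters as a multiplier or a divisor depends on how it is defined, so this is an assumed form rather than the paper's exact expression.

```latex
% Assumed convention: activity A of a radionuclide from a full-energy peak at energy E,
% with the coincidence summing correction factor (CSF) applied as a multiplier.
A \;=\; \frac{N_{\mathrm{net}}}{t \, \varepsilon(E) \, P_{\gamma}} \times \mathrm{CSF},
```

where N_net is the net peak area, t the counting live time, ε(E) the full-energy-peak efficiency, P_γ the gamma emission probability, and CSF the factor computed here with MCNP-CP and cross-checked with ETNA.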
This standard operating procedure describes the method used for preparing internal standard, surrogate recovery standard and calibration standard solutions for neutral analytes used for gas chromatography/mass spectrometry analysis.
This standard operating procedure describes the method used for the determination of target analytes in sample extracts and related quality assurance/quality control sample extracts generated in the CTEPP study.
An analytic survey of signing inventory procedures in Virginia.
DOT National Transportation Integrated Search
1972-01-01
An analytic survey was made of the highway signing and sign-maintenance inventory systems in each of the districts of the Virginia Department of Highways. Of particular concern in reviewing the procedures was the format of the inventory forms, the ap...
Hubert, Ph; Nguyen-Huu, J-J; Boulanger, B; Chapuzet, E; Chiap, P; Cohen, N; Compagnon, P-A; Dewé, W; Feinberg, M; Lallier, M; Laurentie, M; Mercier, N; Muzard, G; Nivet, C; Valat, L
2004-11-15
This paper is the first part of a summary report of a new commission of the Société Française des Sciences et Techniques Pharmaceutiques (SFSTP). The main objective of this commission was the harmonization of approaches for the validation of quantitative analytical procedures. Indeed, the principle of validating these procedures is today widely spread in all domains of activity where measurements are made. Nevertheless, the simple question of whether an analytical procedure is acceptable for a given application remains incompletely resolved in several cases, despite the various regulations relating to good practices (GLP, GMP, ...) and other documents of normative character (ISO, ICH, FDA, ...). There are many official documents describing the validation criteria to be tested, but they do not propose any experimental protocol and most often limit themselves to general concepts. For those reasons, two previous SFSTP commissions elaborated validation guides to concretely help the industrial scientists in charge of drug development to apply those regulatory recommendations. Although these first two guides contributed widely to the use and progress of analytical validation, they nevertheless present weaknesses regarding the conclusions of the statistical tests performed and the decisions to be made with respect to the acceptance limits defined by the intended use of an analytical procedure. The present paper proposes to revisit the bases of analytical validation in order to develop a harmonized approach, notably by distinguishing diagnosis rules from decision rules. The decision rule is based on the use of the accuracy profile and the notion of total error; it simplifies the validation of an analytical procedure while controlling the risk associated with its use. Thanks to this novel validation approach, it is possible to unambiguously demonstrate the fitness for purpose of a new method as stated in all regulatory documents.
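A simplified illustration of an accuracy-profile decision rule is sketched below; the SFSTP approach uses beta-expectation tolerance intervals, which are approximated here by a crude mean plus/minus k*s interval, and all numbers and the +/-15% acceptance limits are hypothetical.

```python
# Hedged sketch of an accuracy-profile style decision based on total error:
# at each concentration level, an interval expected to contain future results is
# compared with the acceptance limits chosen for the procedure's intended use.
# (Crude k*s interval instead of a beta-expectation tolerance interval; hypothetical data.)
import numpy as np

acceptance_limit = 15.0                      # +/- % acceptance limits for the intended use
relative_errors = {                          # relative errors (%) of back-calculated results
    "low":  [-6.1, -4.8, -7.0, -5.5, -6.4],
    "mid":  [-1.2,  0.8, -0.5,  1.1, -0.2],
    "high": [ 2.0,  3.1,  1.4,  2.6,  2.2],
}
k = 2.0                                      # crude coverage factor

for level, errors in relative_errors.items():
    e = np.asarray(errors)
    lower, upper = e.mean() - k * e.std(ddof=1), e.mean() + k * e.std(ddof=1)
    accepted = lower > -acceptance_limit and upper < acceptance_limit
    print(f"{level}: bias = {e.mean():+.1f}%, interval = [{lower:+.1f}, {upper:+.1f}]"
          f" -> {'accept' if accepted else 'reject'}")
```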
Containment of composite fan blades
NASA Technical Reports Server (NTRS)
Stotler, C. L.; Coppa, A. P.
1979-01-01
A lightweight containment was developed for turbofan engine fan blades. Subscale ballistic-type tests were first run on a number of concepts. The most promising configuration was selected and further evaluated by larger scale tests in a rotating test rig. Weight savings made possible by the use of this new containment system were determined and extrapolated to a CF6-size engine. An analytical technique was also developed to predict the motion of a released blade during the blade/casing interaction process. Initial checkout of this procedure was accomplished using several of the tests run during the program.
NASA Technical Reports Server (NTRS)
Young, J. W.; Schy, A. A.; Johnson, K. G.
1977-01-01
An analytical method has been developed for predicting critical control inputs for which nonlinear rotational coupling may cause sudden jumps in aircraft response. The analysis includes the effect of aerodynamics which are nonlinear in angle of attack. The method involves the simultaneous solution of two polynomials in roll rate, whose coefficients are functions of angle of attack and the control inputs. Results obtained using this procedure are compared with calculated time histories to verify the validity of the method for predicting jump-like instabilities.
Rare earth chalcogenide stoichiometry determination. [of thermoelectric properties
NASA Technical Reports Server (NTRS)
Lockwood, R. A.
1983-01-01
Rare earth chalcogenides, and particularly lanthanum sulfide, are currently being explored as candidate materials for thermoelectric applications. Since the electrical properties of LaS(x) are largely determined by its stoichiometry, a simple and accurate method has been developed for determining the value of x. The procedure involves dissolving a weighed sample in acid and measuring the amount of hydrogen evolved by the lanthanum that is in excess of the 1.500 ratio of S/La. The analytical error in the determination of x in LaS(x) is about 0.001.
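The stoichiometric relation implied by the abstract can be sketched as follows; this is an inference from charge balance (each lanthanum in excess of S/La = 1.5 evolving 3/2 mol of H2), not the paper's own derivation, and it applies only for x ≤ 1.5.

```latex
% Inferred relation (assumption, valid for x <= 1.5): hydrogen evolution per mole of La
% fixes x, with n_La obtained from the sample mass m and the x-dependent formula weight.
\frac{n_{\mathrm{H_2}}}{n_{\mathrm{La}}} \;=\; \frac{3}{2} - x
\quad\Longrightarrow\quad
x \;=\; \frac{3}{2} - \frac{n_{\mathrm{H_2}}}{n_{\mathrm{La}}},
\qquad
n_{\mathrm{La}} \;=\; \frac{m}{M_{\mathrm{La}} + x\, M_{\mathrm{S}}} .
```

Under the same assumptions, substituting the expression for n_La gives the closed form x = (3m/2 - n_H2 M_La) / (m + n_H2 M_S).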
A Study of Convergence of the PMARC Matrices Applicable to WICS Calculations
NASA Technical Reports Server (NTRS)
Ghosh, Amitabha
1997-01-01
This report discusses some analytical procedures to enhance the real time solutions of PMARC matrices applicable to the Wall Interference Correction Scheme (WICS) currently being implemented at the 12 foot Pressure Tunnel. WICS calculations involve solving large linear systems in a reasonably speedy manner necessitating exploring further improvement in solution time. This paper therefore presents some of the associated theory of the solution of linear systems. Then it discusses a geometrical interpretation of the residual correction schemes. Finally some results of the current investigation are presented.
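A generic residual-correction iteration for a linear system is sketched below; it illustrates the class of schemes discussed, not the PMARC/WICS implementation itself.

```python
# Hedged sketch of residual correction (iterative refinement) for A x = b:
# solve once, then repeatedly solve for a correction driven by the residual.
# Generic numpy illustration; not the PMARC/WICS code.
import numpy as np

def residual_correction(A, b, n_iter=5):
    x = np.linalg.solve(A, b)            # initial solution (in practice from a stored factorization)
    for _ in range(n_iter):
        r = b - A @ x                    # residual of the current iterate
        x = x + np.linalg.solve(A, r)    # corrected iterate
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 200)) + 200.0 * np.eye(200)   # well-conditioned test matrix
b = rng.standard_normal(200)
x = residual_correction(A, b)
print("final residual norm:", np.linalg.norm(b - A @ x))
```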
Managing heteroscedasticity in general linear models.
Rosopa, Patrick J; Schaffer, Meline M; Schroeder, Amber N
2013-09-01
Heteroscedasticity refers to a phenomenon where data violate a statistical assumption. This assumption is known as homoscedasticity. When the homoscedasticity assumption is violated, this can lead to increased Type I error rates or decreased statistical power. Because this can adversely affect substantive conclusions, the failure to detect and manage heteroscedasticity could have serious implications for theory, research, and practice. In addition, heteroscedasticity is not uncommon in the behavioral and social sciences. Thus, in the current article, we synthesize extant literature in applied psychology, econometrics, quantitative psychology, and statistics, and we offer recommendations for researchers and practitioners regarding available procedures for detecting heteroscedasticity and mitigating its effects. In addition to discussing the strengths and weaknesses of various procedures and comparing them in terms of existing simulation results, we describe a 3-step data-analytic process for detecting and managing heteroscedasticity: (a) fitting a model based on theory and saving residuals, (b) the analysis of residuals, and (c) statistical inferences (e.g., hypothesis tests and confidence intervals) involving parameter estimates. We also demonstrate this data-analytic process using an illustrative example. Overall, detecting violations of the homoscedasticity assumption and mitigating its biasing effects can strengthen the validity of inferences from behavioral and social science data.
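The three-step process can be sketched with statsmodels on synthetic data as below; the Breusch-Pagan test and HC3 standard errors are one reasonable choice among the procedures the article compares, not a prescription from it.

```python
# Hedged sketch of the three-step process: (a) fit a theory-based model and save
# residuals, (b) analyze the residuals for heteroscedasticity (Breusch-Pagan),
# (c) make inferences with heteroscedasticity-consistent (HC3) standard errors.
# Synthetic data; the test and estimator choices are illustrative assumptions.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(42)
x = rng.uniform(0, 10, 300)
y = 1.0 + 0.5 * x + rng.normal(0.0, 0.2 + 0.15 * x)   # error variance grows with x

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()                               # (a) residuals saved in ols.resid

lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(ols.resid, X)   # (b)
print(f"Breusch-Pagan p-value: {lm_pvalue:.4f}")

robust = ols.get_robustcov_results(cov_type="HC3")     # (c) robust inference
print(robust.summary().tables[1])
```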
The laboratory of the 1990s—Planning for total automation
Brunner, Linda A.
1992-01-01
The analytical laboratory of the 1990s must be able to meet and accommodate the rapid evolution of modern-day technology. One such area is laboratory automation. Total automation may be seen as the coupling of computerized sample tracking, electronic documentation and data reduction with automated sample handling, preparation and analysis, resulting in a complete analytical procedure with minimal human involvement. Requirements may vary from one laboratory or facility to another, so the automation has to be flexible enough to cover a wide range of applications, and yet fit into specific niches depending on individual needs. Total automation must be planned for, well in advance, if the endeavour is to be a success. Space, laboratory layout, proper equipment, and the availability and access to necessary utilities must be taken into account. Adequate training and experience of the personnel working with the technology must also be ensured. In addition, responsibilities of installation, programming maintenance and operation have to be addressed. Proper time management and the efficient implementation and use of total automation are also crucial to successful operations. This paper provides insights into laboratory organization and requirements, as well as discussing the management issues that must be faced when automating laboratory procedures. PMID:18924925
Katz, B.G.; Collins, J.J.
1998-01-01
A cooperative study between the Florida Department of Environmental Protection (FDEP) and the U.S. Geological Survey was conducted to assess the integrity of selected water-quality data collected at 150 sites in the FDEP Surface-Water Ambient Monitoring Program (SWAMP) in Florida. The assessment included determining the consistency of the water-quality data collected statewide, including commonality of monitoring procedures and analytes, screening of the gross validity of a chemical analysis, and quality assurance and quality control (QA/QC) procedures. Four tests were used to screen data at selected SWAMP sites to estimate the gross validity of selected chemical data: (1) the ratio of dissolved solids (in milligrams per liter) to specific conductance (in microsiemens per centimeter); (2) the ratio of total cations (in milliequivalents per liter) multiplied by 100 to specific conductance (in microsiemens per centimeter); (3) the ratio of total anions (in milliequivalents per liter) multiplied by 100 to specific conductance (in microsiemens per centimeter); and (4) the ionic charge-balance error. Although the results of the four screening tests indicate that the chemical data generally are quite reliable, the extremely small number of samples (less than 5 percent of the total number of samples) with sufficient chemical information to run the tests may not provide a representative indication of the analytical accuracy of all laboratories in the program. In addition to the four screening tests, unusually low or high values were flagged for field and laboratory pH (less than 4.0 and greater than 9.0) and specific conductance (less than 10 and greater than 10,000 microsiemens per centimeter). The numbers of flagged data were less than 1 percent of the 19,937 water samples with pH values and less than 0.6 percent of the 16,553 water samples with specific conductance values. Thirty-four agencies responded to a detailed questionnaire that was sent to more than 60 agencies involved in the collection and analysis of surface-water-quality data for SWAMP. The purpose of the survey was to evaluate quality assurance methods and consistency of methods statewide. Information was compiled and summarized on monitoring network design, data review and upload procedures, laboratory and field sampling methods, and data practices. Currently, most agencies that responded to the survey follow FDEP-approved QA/QC protocol for sampling and have quality assurance practices for recording and reporting data. Also, most agencies responded that calibration procedures were followed in the laboratory for analysis of data, but no responses were given about the specific procedures. Approximately 50 percent of the respondents indicated that laboratory analysis methods have changed over time. With so many laboratories involved in analyzing samples for SWAMP, it is difficult to compare water quality from one site to another due to different reporting conventions for chemical constituents and different analytical methods over time. Most agencies responded that calibration methods are followed in the field, but no specific details were provided. Grab samples are the most common method of collection. Other data screening procedures are necessary to further evaluate the validity of chemical data collected at SWAMP sites. High variability in the concentration of targeted constituents may signal analytical problems, but more likely changes in concentration are related to hydrologic conditions. 
This underscores the need for accurate measurements of discharge, lake stage, and tidal stage at the time of sampling so that changes in constituent concentrations can be properly evaluated and fluxes (loads) of nutrients or metals, for example, can be calculated and compared over time.
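The four gross-validity screens described above reduce to simple ratios; a sketch is given below with hypothetical inputs. The acceptance windows, which the abstract does not quote, are left to the analyst, and the last screen uses the standard percent charge-balance formula.

```python
# Hedged sketch of the four screening tests (units as in the text: mg/L, meq/L,
# microsiemens/cm). Acceptance ranges are not quoted in the abstract, so only the
# computed values are returned. Input values are hypothetical.
def screening_tests(dissolved_solids_mg_l, total_cations_meq_l,
                    total_anions_meq_l, specific_conductance_us_cm):
    ratio_ds = dissolved_solids_mg_l / specific_conductance_us_cm
    ratio_cations = 100.0 * total_cations_meq_l / specific_conductance_us_cm
    ratio_anions = 100.0 * total_anions_meq_l / specific_conductance_us_cm
    charge_balance_error = 100.0 * (total_cations_meq_l - total_anions_meq_l) \
                                 / (total_cations_meq_l + total_anions_meq_l)
    return ratio_ds, ratio_cations, ratio_anions, charge_balance_error

print(screening_tests(dissolved_solids_mg_l=260.0, total_cations_meq_l=4.2,
                      total_anions_meq_l=4.0, specific_conductance_us_cm=410.0))
```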
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2013 CFR
2013-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2012 CFR
2012-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2014 CFR
2014-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2011 CFR
2011-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2010 CFR
2010-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
The transfer of analytical procedures.
Ermer, J; Limberger, M; Lis, K; Wätzig, H
2013-11-01
Analytical method transfers are certainly among the most discussed topics in the GMP-regulated sector. However, they are surprisingly little regulated in detail. General information is provided by USP, WHO, and ISPE in particular. Most recently, the EU emphasized the importance of analytical transfer by including it in their draft of the revised GMP Guideline. In this article, an overview and comparison of these guidelines is provided. The key to success for method transfers is excellent communication between the sending and receiving units. In order to facilitate this communication, procedures, flow charts and checklists for responsibilities, success factors, transfer categories, the transfer plan and report, strategies in case of failed transfers, and tables with acceptance limits are provided here, together with a comprehensive glossary. Potential pitfalls are described so that they can be avoided. In order to assure an efficient and sustainable transfer of analytical procedures, a practically relevant and scientifically sound evaluation with corresponding acceptance criteria is crucial. Various strategies and statistical tools such as significance tests, absolute acceptance criteria, and equivalence tests are thoroughly described and compared in detail, with examples. Significance tests should be avoided: the success criterion is not statistical significance, but rather analytical relevance. Depending on a risk assessment of the analytical procedure in question, statistical equivalence tests are recommended, because they include both a practically relevant acceptance limit and a direct control of the statistical risks. However, for lower-risk procedures, a simple comparison of the transfer performance parameters to absolute limits is also regarded as sufficient. Copyright © 2013 Elsevier B.V. All rights reserved.
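A two-sample equivalence (TOST) comparison of the kind recommended for higher-risk transfers can be sketched as below; the data and the +/-2 percentage-point acceptance limits are hypothetical, not values from the article.

```python
# Hedged sketch of an equivalence test (TOST) on assay results from the sending and
# receiving units; the +/- 2 percentage-point limits and all data are hypothetical.
import numpy as np
from statsmodels.stats.weightstats import ttost_ind

sending = np.array([99.1, 100.2, 99.6, 100.4, 99.8, 100.1])     # assay results, % label claim
receiving = np.array([98.7, 99.5, 99.9, 100.3, 99.2, 99.6])

p_overall, lower_test, upper_test = ttost_ind(sending, receiving, low=-2.0, upp=2.0)
verdict = "equivalence demonstrated" if p_overall < 0.05 else "equivalence not demonstrated"
print(f"TOST p-value = {p_overall:.4f} -> {verdict}")
```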
Boiano, J M; Wallace, M E; Sieber, W K; Groff, J H; Wang, J; Ashley, K
2000-08-01
A field study was conducted with the goal of comparing the performance of three recently developed or modified sampling and analytical methods for the determination of airborne hexavalent chromium (Cr(VI)). The study was carried out in a hard chrome electroplating facility and in a jet engine manufacturing facility where airborne Cr(VI) was expected to be present. The analytical methods evaluated included two laboratory-based procedures (OSHA Method ID-215 and NIOSH Method 7605) and a field-portable method (NIOSH Method 7703). These three methods employ an identical sampling methodology: collection of Cr(VI)-containing aerosol on a polyvinyl chloride (PVC) filter housed in a sampling cassette, which is connected to a personal sampling pump calibrated at an appropriate flow rate. The analytical basis of all three methods involves extraction of the PVC filter in alkaline buffer solution, chemical isolation of the Cr(VI) ion, complexation of the Cr(VI) ion with 1,5-diphenylcarbazide, and spectrometric measurement of the violet chromium diphenylcarbazone complex at 540 nm. However, there are notable differences in the specific sample preparation procedures used in the three methods. To assess the comparability of the three measurement protocols, a total of 20 side-by-side air samples were collected, equally divided between a chromic acid electroplating operation and a spray paint operation where water-soluble forms of Cr(VI) were used. A range of Cr(VI) concentrations from 0.6 to 960 microg m(-3), with Cr(VI) mass loadings ranging from 0.4 to 32 microg, was measured at the two operations. The equivalence of the means of the log-transformed Cr(VI) concentrations obtained with the different analytical methods was assessed. Based on analysis of variance (ANOVA) results, no statistically significant differences were observed between mean values measured using the three methods. Small but statistically significant differences were observed between results obtained from performance evaluation samples for the NIOSH field method and the OSHA laboratory method.
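The statistical comparison described, a one-way ANOVA on log-transformed side-by-side concentrations, can be sketched as below with hypothetical values.

```python
# Hedged sketch of a one-way ANOVA on log-transformed side-by-side Cr(VI)
# concentrations from the three methods (values are hypothetical, not study data).
import numpy as np
from scipy import stats

osha_id215 = np.log([12.1, 45.0, 230.0, 3.1, 88.0])   # microg/m3
niosh_7605 = np.log([11.5, 47.2, 215.0, 2.9, 91.0])
niosh_7703 = np.log([12.8, 43.1, 240.0, 3.3, 85.0])

f_stat, p_value = stats.f_oneway(osha_id215, niosh_7605, niosh_7703)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")          # large p -> no detectable difference in means
```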
40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Sampling and analytical procedures for measuring smoke exhaust emissions. 87.82 Section 87.82 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...
40 CFR 87.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Sampling and analytical procedures for measuring gaseous exhaust emissions. 87.64 Section 87.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...
Applying Behavior Analytic Procedures to Effectively Teach Literacy Skills in the Classroom
ERIC Educational Resources Information Center
Joseph, Laurice M.; Alber-Morgan, Sheila; Neef, Nancy
2016-01-01
The purpose of this article is to discuss the application of behavior analytic procedures for advancing and evaluating methods for teaching literacy skills in the classroom. Particularly, applied behavior analysis has contributed substantially to examining the relationship between teacher behavior and student literacy performance. Teacher…
40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Sampling and analytical procedures for measuring smoke exhaust emissions. 87.82 Section 87.82 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...
40 CFR 87.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Sampling and analytical procedures for measuring gaseous exhaust emissions. 87.64 Section 87.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...
21 CFR 530.24 - Procedure for announcing analytical methods for drug residue quantification.
Code of Federal Regulations, 2011 CFR
2011-04-01
..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS EXTRALABEL DRUG USE IN ANIMALS Specific Provisions Relating to Extralabel Use of Animal and Human Drugs in Food-Producing Animals § 530.24 Procedure for announcing analytical methods for drug residue quantification. (a...
Jurowski, Kamil; Buszewski, Bogusław; Piekoszewski, Wojciech
2015-01-01
Nowadays, studies of the distribution of metallic elements in biological samples are among the most important issues. Many articles are dedicated to specific analytical atomic spectrometry techniques used for mapping/(bio)imaging of metallic elements in various kinds of biological samples. However, this literature lacks articles dedicated to reviewing calibration strategies and their problems, nomenclature, definitions, and the ways and methods used to obtain quantitative distribution maps. The aim of this article was to characterize analytical calibration in the (bio)imaging/mapping of metallic elements in biological samples, including (1) nomenclature; (2) definitions; and (3) selected examples of sophisticated calibration strategies, with analytical calibration procedures applied in the different analytical methods currently used to study an element's distribution in biological samples/materials, such as LA-ICP-MS, SIMS, EDS, XRF and others. The main emphasis was placed on the procedures and methodology of the analytical calibration strategy. An additional aim of this work is to systematize the nomenclature for the calibration terms: analytical calibration, analytical calibration method, analytical calibration procedure and analytical calibration strategy. The authors also want to popularize a division of calibration methods different from those hitherto used. This article is the first work in the literature that addresses and emphasizes the many different and complex aspects of analytical calibration problems in studies related to (bio)imaging/mapping of metallic elements in different kinds of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.
40 CFR 600.108-08 - Analytical gases.
Code of Federal Regulations, 2012 CFR
2012-07-01
Title 40, Fuel Economy and Greenhouse Gas Exhaust Emissions of Motor Vehicles; Fuel Economy and Carbon-Related Exhaust Emission Test Procedures. Section 600.108-08 specifies the analytical gases for all fuel economy testing…
40 CFR 600.108-08 - Analytical gases.
Code of Federal Regulations, 2013 CFR
2013-07-01
Title 40, Fuel Economy and Greenhouse Gas Exhaust Emissions of Motor Vehicles; Fuel Economy and Carbon-Related Exhaust Emission Test Procedures. Section 600.108-08 specifies the analytical gases for all fuel economy testing…
NASA Technical Reports Server (NTRS)
Baker, L. R.; Sulyma, P. R.; Tevepaugh, J. A.; Penny, M. M.
1976-01-01
Since exhaust plumes affect vehicle base environment (pressure and heat loads) and the orbiter vehicle aerodynamic control surface effectiveness, an intensive program involving detailed analytical and experimental investigations of the exhaust plume/vehicle interaction was undertaken as a pertinent part of the overall space shuttle development program. The program, called the Plume Technology program, has as its objective the determination of the criteria for simulating rocket engine (in particular, space shuttle propulsion system) plume-induced aerodynamic effects in a wind tunnel environment. The comprehensive experimental program was conducted using test facilities at NASA's Marshall Space Flight Center and Ames Research Center. A post-test examination of some of the experimental results obtained from NASA-MSFC's 14 x 14-inch trisonic wind tunnel is presented. A description is given of the test facility, simulant gas supply system, nozzle hardware, test procedure and test matrix. Analysis of exhaust plume flow fields and comparison of analytical and experimental exhaust plume data are presented.
Cardellicchio, N; Giandomenico, S; Decataldo, A; Di Leo, A
2001-03-01
A method for the determination of organotin compounds (monobutyl = MBT, dibutyl = DBT, and tributyltin = TBT) in marine sediments by headspace Solid Phase Microextraction (SPME) has been developed. The analytical procedure involved 1) extraction of TBT, DBT and MBT from sediments with HCl and methanol mixture, 2) in situ derivatization with sodium tetraethylborate and 3) headspace SPME extraction using a fiber coated with poly(dimethylsiloxane). The derivatized organotin compounds were desorbed into the splitless injector and simultaneously analyzed by gas chromatography - mass spectrometry. The analytical method was optimized with respect to derivatization reaction and extraction conditions. The detection limits obtained for MBT, DBT and TBT ranged from 730 to 969 pg/g as Sn dry weight. Linear calibration curves were obtained for all analytes in the range of 30-1000 ng/L as Sn. Analysis of a standard reference sediment (CRM 462) demonstrates the suitability of this method for the determination of butyltin compounds in marine sediments. The application to the determination of TBT, DBT and MBT in a coastal marine sediment is shown.
Clinical decision making: how surgeons do it.
Crebbin, Wendy; Beasley, Spencer W; Watters, David A K
2013-06-01
Clinical decision making is a core competency of surgical practice. It involves two distinct types of mental process best considered as the ends of a continuum, ranging from intuitive and subconscious to analytical and conscious. In practice, individual decisions are usually reached by a combination of each, according to the complexity of the situation and the experience/expertise of the surgeon. An expert moves effortlessly along this continuum, according to need, able to apply learned rules or algorithms to specific presentations, choosing these as a result of either pattern recognition or analytical thinking. The expert recognizes and responds quickly to any mismatch between what is observed and what was expected, coping with gaps in information and making decisions even where critical data may be uncertain or unknown. Even for experts, the cognitive processes involved are difficult to articulate as they tend to be very complex. However, if surgeons are to assist trainees in developing their decision-making skills, the processes need to be identified and defined, and the competency needs to be measurable. This paper examines the processes of clinical decision making in three contexts: making a decision about how to manage a patient; preparing for an operative procedure; and reviewing progress during an operative procedure. The models represented here are an exploration of the complexity of the processes, designed to assist surgeons understand how expert clinical decision making occurs and to highlight the challenge of teaching these skills to surgical trainees. © 2013 The Authors. ANZ Journal of Surgery © 2013 Royal Australasian College of Surgeons.
Internal quality control: planning and implementation strategies.
Westgard, James O
2003-11-01
The first essential in setting up internal quality control (IQC) of a test procedure in the clinical laboratory is to select the proper IQC procedure to implement, i.e. choosing the statistical criteria or control rules, and the number of control measurements, according to the quality required for the test and the observed performance of the method. Then the right IQC procedure must be properly implemented. This review focuses on strategies for planning and implementing IQC procedures in order to improve the quality of the IQC. A quantitative planning process is described that can be implemented with graphical tools such as power function or critical-error graphs and charts of operating specifications. Finally, a total QC strategy is formulated to minimize cost and maximize quality. A general strategy for IQC implementation is recommended that employs a three-stage design in which the first stage provides high error detection, the second stage low false rejection and the third stage prescribes the length of the analytical run, making use of an algorithm involving the average of normal patients' data.
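As a minimal illustration of the kind of statistical control rules discussed in the review (a sketch only, not Westgard's own planning tools), the snippet below applies two widely used rules, 1-3s and 2-2s, to a series of control measurements; the target mean, standard deviation, and the data themselves are hypothetical.

```python
# Minimal sketch: applying two common internal QC control rules (1_3s and 2_2s)
# to a series of control measurements. Target mean/SD and the data are hypothetical.

def qc_violations(values, mean, sd):
    """Return a list of (index, rule) tuples where a control rule is violated."""
    z = [(v - mean) / sd for v in values]          # z-scores of control results
    violations = []
    for i, zi in enumerate(z):
        if abs(zi) > 3:                            # 1_3s: one result beyond +/-3 SD
            violations.append((i, "1_3s"))
        if i > 0 and z[i - 1] > 2 and zi > 2:      # 2_2s: two consecutive results beyond +2 SD
            violations.append((i, "2_2s"))
        if i > 0 and z[i - 1] < -2 and zi < -2:    # ...or both beyond -2 SD on the same side
            violations.append((i, "2_2s"))
    return violations

if __name__ == "__main__":
    control_results = [5.1, 5.3, 4.9, 5.8, 5.9, 5.0, 3.2]   # hypothetical daily QC values
    print(qc_violations(control_results, mean=5.0, sd=0.25))
```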
Ultrasonic inspection of carbon fiber reinforced plastic by means of sample-recognition methods
NASA Technical Reports Server (NTRS)
Bilgram, R.
1985-01-01
In the case of carbon fiber reinforced plastic (CFRP), it has not yet been possible to detect nonlocal defects and aging-related material degradation with nondestructive inspection methods. One approach to overcoming these difficulties extends the ultrasonic inspection procedure with signal processing and sample-recognition methods. The basic concept is the realization that the ultrasonic signal contains information about the medium which is not utilized in conventional ultrasonic inspection. However, the analytical study of the physical processes involved is very complex. For this reason, an empirical approach is employed to exploit this previously unused information, using reference signals obtained from material specimens of different quality. The implementation of these concepts for the ultrasonic inspection of CFRP laminates is discussed.
A numerical approach to controller design for the ACES facility
NASA Technical Reports Server (NTRS)
Frazier, W. Garth; Irwin, R. Dennis
1993-01-01
In recent years the employment of active control techniques for improving the performance of systems involving highly flexible structures has become a topic of considerable research interest. Most of these systems are quite complicated, using multiple actuators and sensors, and possessing high order models. The majority of analytical controller synthesis procedures capable of handling multivariable systems in a systematic way require considerable insight into the underlying mathematical theory to achieve a successful design. This insight is needed in selecting the proper weighting matrices or weighting functions to cast what is naturally a multiple constraint satisfaction problem into an unconstrained optimization problem. Although designers possessing considerable experience with these techniques have a feel for the proper choice of weights, others may spend a significant amount of time attempting to find an acceptable solution. Another disadvantage of such procedures is that the resulting controller has an order greater than or equal to that of the model used for the design. Of course, the order of these controllers can often be reduced, but again this requires a good understanding of the theory involved.
42 CFR 493.959 - Immunohematology.
Code of Federal Regulations, 2014 CFR
2014-10-01
Specifies that the minimum number of challenges per testing event a program must provide for each analyte or test procedure is five (analytes include compatibility testing and antibody identification). For evaluation of a laboratory's analyte or test performance, the program must compare the laboratory's response for each analyte with the response that reflects agreement…
42 CFR 493.959 - Immunohematology.
Code of Federal Regulations, 2013 CFR
2013-10-01
Specifies that the minimum number of challenges per testing event a program must provide for each analyte or test procedure is five (analytes include compatibility testing and antibody identification). For evaluation of a laboratory's analyte or test performance, the program must compare the laboratory's response for each analyte with the response that reflects agreement…
42 CFR 493.959 - Immunohematology.
Code of Federal Regulations, 2012 CFR
2012-10-01
Specifies that the minimum number of challenges per testing event a program must provide for each analyte or test procedure is five (analytes include compatibility testing and antibody identification). For evaluation of a laboratory's analyte or test performance, the program must compare the laboratory's response for each analyte with the response that reflects agreement…
42 CFR 493.959 - Immunohematology.
Code of Federal Regulations, 2011 CFR
2011-10-01
Specifies that the minimum number of challenges per testing event a program must provide for each analyte or test procedure is five (analytes include compatibility testing and antibody identification). For evaluation of a laboratory's analyte or test performance, the program must compare the laboratory's response for each analyte with the response that reflects agreement…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-06
Analytic Methods and Sampling Procedures for the United States National Residue Program for Meat, Poultry… The notice concerns implementing several multi-residue methods for analyzing samples of meat, poultry, and egg products for animal… These modern, high-efficiency methods will conserve resources and provide useful and reliable results…
40 CFR 63.145 - Process wastewater provisions-test methods and procedures to determine compliance.
Code of Federal Regulations, 2011 CFR
2011-07-01
Title 40 (Protection of Environment), …Operations, and Wastewater. Section 63.145 sets out the process wastewater provisions (test methods and procedures to determine compliance), including use of an analytical method for wastewater which has that compound as a target analyte, and treatment using a series of…
40 CFR 63.145 - Process wastewater provisions-test methods and procedures to determine compliance.
Code of Federal Regulations, 2013 CFR
2013-07-01
Title 40 (Protection of Environment), …Operations, and Wastewater. Section 63.145 sets out the process wastewater provisions (test methods and procedures to determine compliance), including use of an analytical method for wastewater which has that compound as a target analyte, and treatment using a series of…
Code of Federal Regulations, 2012 CFR
2012-07-01
Environmental Protection Agency, Water Programs, Guidelines Establishing Test Procedures… The procedures apply to a wide variety of sample types, ranging from reagent (blank) water containing analyte to wastewater…, and define a limit as a multiple of the standard deviation of replicate instrumental measurements of the analyte in reagent water…
Methods for determination of inorganic substances in water and fluvial sediments
Fishman, Marvin J.; Friedman, Linda C.
1989-01-01
Chapter Al of the laboratory manual contains methods used by the U.S. Geological Survey to analyze samples of water, suspended sediments, and bottom material for their content of inorganic constituents. Included are methods for determining the concentration of dissolved constituents in water, the total recoverable and total of constituents in water-suspended sediment samples, and the recoverable and total concentrations of constituents in samples of bottom material. The introduction to the manual includes essential definitions and a brief discussion of the use of significant figures in calculating and reporting analytical results. Quality control in the water-analysis laboratory is discussed, including the accuracy and precision of analyses, the use of standard-reference water samples, and the operation of an effective quality-assurance program. Methods for sample preparation and pretreatment are given also. A brief discussion of the principles of the analytical techniques involved and their particular application to water and sediment analysis is presented. The analytical methods of these techniques are arranged alphabetically by constituent. For each method, the general topics covered are the application, the principle of the method, the interferences, the apparatus and reagents required, a detailed description of the analytical procedure, reporting results, units and significant figures, and analytical precision data, when available. More than 126 methods are given for the determination of 70 inorganic constituents and physical properties of water, suspended sediment, and bottom material.
Analytic tests and their relation to jet fuel thermal stability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heneghan, S.P.; Kauffman, R.E.
1995-05-01
The evaluation of jet fuel thermal stability (TS) by simple analytic procedures has long been a goal of fuels chemists. The reason is obvious: if the analytic chemist can determine which types of material cause the test to respond, the refiners will know which materials to remove to improve stability. Complicating this quest is the lack of an acceptable quantitative TS test with which to compare any analytic procedures. To circumvent this problem, we recently compiled the results of TS tests for 12 fuels using six separate test procedures. The results, covering a range of flow and temperature conditions, show that TS is not as dependent on test conditions as previously thought. Also, comparing the results from these tests with several analytic procedures shows that either a measure of the number of phenols or the total sulfur present in jet fuels is strongly indicative of the TS. The phenols have been measured using a cyclic voltammetry technique and the polar material by gas chromatography (atomic emission detection) following a solid phase extraction on silica gel. The polar material has been identified as mainly phenols (by mass spectrometry identification). Measures of the total acid number or peroxide concentration have little correlation with TS.
Response demands and the recruitment of heuristic strategies in syllogistic reasoning.
Reverberi, Carlo; Rusconi, Patrice; Paulesu, Eraldo; Cherubini, Paolo
2009-03-01
Two experiments investigated whether dealing with a homogeneous subset of syllogisms with time-constrained responses encouraged participants to develop and use heuristics for abstract (Experiment 1) and thematic (Experiment 2) syllogisms. An atmosphere-based heuristic accounted for most responses with both abstract and thematic syllogisms. With thematic syllogisms, a weaker effect of a belief heuristic was also observed, mainly where the correct response was inconsistent with the atmosphere of the premises. Analytic processes appear to have played little role in the time-constrained condition, whereas their involvement increased in a self-paced, unconstrained condition. From a dual-process perspective, the results further specify how task demands affect the recruitment of heuristic and analytic systems of reasoning. Because the syllogisms and experimental procedure were the same as those used in a previous neuroimaging study by Goel, Buchel, Frith, and Dolan (2000), the results also deepen our understanding of the cognitive processes investigated by that study.
Boyer, Chantal; Gaudin, Karen; Kauss, Tina; Gaubert, Alexandra; Boudis, Abdelhakim; Verschelden, Justine; Franc, Mickaël; Roussille, Julie; Boucher, Jacques; Olliaro, Piero; White, Nicholas J.; Millet, Pascal; Dubost, Jean-Pierre
2012-01-01
Near infrared spectroscopy (NIRS) methods were developed for the determination of the content of an antimalarial-antibiotic (artesunate and azithromycin) co-formulation in hard gelatin capsules (HGC). The NIRS approach consists of pre-processing treatment of spectra (raw spectra and first derivation of two spectral zones), a single principal component analysis model to ensure specificity, and then two partial least-squares regression models for the content determination of each active pharmaceutical ingredient. The NIRS methods were developed and validated with no reference method, since the manufacturing process of the HGC is essentially the mixing of excipients with the active pharmaceutical ingredients. The accuracy profiles showed β-expectation tolerance limits within the acceptance limits (±5%). The conventional analytical control approach by reversed-phase HPLC required two different methods, involving two different sample preparations and chromatographic conditions. NIRS offers advantages in terms of lower equipment and procedural costs, time savings, and environmental friendliness. PMID:22579599
Gough, H; Luke, G A; Beeley, J A; Geddes, D A
1996-02-01
The aim of this project was to develop an analytical procedure with the required level of sensitivity for the determination of glucose concentrations in small volumes of unstimulated fasting whole saliva. The technique involves high-performance ion-exchange chromatography at high pH and pulsed amperometric detection. It has a high level of reproducibility, a sensitivity as low as 0.1 μmol/l, and requires only 50-microliter samples (sensitivity = 0.002 pmol). Inhibition of glucose metabolism, by procedures such as collection into 0.1% (w/v) sodium fluoride, was shown to be essential if accurate results are to be obtained. Collection onto ice followed by storage at -20 degrees C was shown to be unsuitable and resulted in glucose loss by degradation. There were inter- and intraindividual variations in the glucose concentration in unstimulated mixed saliva (range: 0.02-0.4 mmol/l). The procedure can be used for the analysis of other salivary carbohydrates and for monitoring the clearance of dietary carbohydrates from the mouth.
Nevado, Juan José Berzas; Robledo, Virginia Rodríguez; Callado, Carolina Sánchez-Carnerero
2012-07-15
The enrichment of virgin olive oil (VOO) with natural antioxidants contained in various herbs (rosemary, thyme and oregano) was studied. Three different enrichment procedures were used for the solid-liquid extraction of antioxidants present in the herbs to VOO. One involved simply bringing the herbs into contact with the VOO for 190 days; another keeping the herb-VOO mixture under stirring at room temperature (25°C) for 11 days; and the third stirring at temperatures above room level (35-40°C). The efficiency of each procedure was assessed by using a reproducible, efficient, reliable analytical capillary zone electrophoresis (CZE) method to separate and determine selected phenolic compounds (rosmarinic and caffeic acid) in the oil. Prior to electrophoretic separation, the studied antioxidants were isolated from the VOO matrix by using an optimised preconcentration procedure based on solid phase extraction (SPE). The CZE method was optimised and validated. Copyright © 2012 Elsevier Ltd. All rights reserved.
An artificial system for selecting the optimal surgical team.
Saberi, Nahid; Mahvash, Mohsen; Zenati, Marco
2015-01-01
We introduce an intelligent system to optimize a team composition based on the team's historical outcomes and apply this system to compose a surgical team. The system relies on a record of the procedures performed in the past. The optimal team composition is the one with the lowest probability of an unfavorable outcome. We use probability theory and the inclusion-exclusion principle to model the probability of the team outcome for a given composition. A probability value is assigned to each person in the database, and the probability of a team composition is calculated from these values. The model allows the probability of all possible team compositions to be determined even if there is no recorded procedure for some of them. From an analytical perspective, assembling an optimal team is equivalent to minimizing the overlap of team members who have a recurring tendency to be involved with procedures with unfavorable results. A conceptual example shows the accuracy of the proposed system in obtaining the optimal team.
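As a minimal sketch of the idea described above (not the authors' exact model), the snippet below assumes each candidate carries an estimated probability of being involved in an unfavorable outcome and treats those events as independent, so the inclusion-exclusion expression for their union reduces to 1 - prod(1 - p_i); the names and probabilities are hypothetical.

```python
# Illustrative sketch: pick the team whose probability of at least one member being
# implicated in an unfavorable outcome is lowest. Under an independence assumption,
# inclusion-exclusion over the individual events collapses to 1 - prod(1 - p_i).
from itertools import combinations
from math import prod

surgeon_risk = {"A": 0.05, "B": 0.12, "C": 0.08, "D": 0.20, "E": 0.07}  # hypothetical

def team_unfavorable_probability(members):
    return 1.0 - prod(1.0 - surgeon_risk[m] for m in members)

def best_team(team_size):
    teams = combinations(surgeon_risk, team_size)
    return min(teams, key=team_unfavorable_probability)

if __name__ == "__main__":
    team = best_team(3)
    print(team, round(team_unfavorable_probability(team), 4))
```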
Advanced superposition methods for high speed turbopump vibration analysis
NASA Technical Reports Server (NTRS)
Nielson, C. E.; Campany, A. D.
1981-01-01
The small, high pressure Mark 48 liquid hydrogen turbopump was analyzed and dynamically tested to determine the cause of high speed vibration at an operating speed of 92,400 rpm. This approaches the design point operating speed of 95,000 rpm. The initial dynamic analysis in the design stage and subsequent further analysis of the rotor only dynamics failed to predict the vibration characteristics found during testing. An advanced procedure for dynamics analysis was used in this investigation. The procedure involves developing accurate dynamic models of the rotor assembly and casing assembly by finite element analysis. The dynamically instrumented assemblies are independently rap tested to verify the analytical models. The verified models are then combined by modal superposition techniques to develop a completed turbopump model where dynamic characteristics are determined. The results of the dynamic testing and analysis obtained are presented and methods of moving the high speed vibration characteristics to speeds above the operating range are recommended. Recommendations for use of these advanced dynamic analysis procedures during initial design phases are given.
Alimonti, Luca; Atalla, Noureddine; Berry, Alain; Sgard, Franck
2015-02-01
Practical vibroacoustic systems involve passive acoustic treatments consisting of highly dissipative media such as poroelastic materials. The numerical modeling of such systems at low to mid frequencies typically relies on substructuring methodologies based on finite element models. Namely, the master subsystems (i.e., structural and acoustic domains) are described by a finite set of uncoupled modes, whereas condensation procedures are typically preferred for the acoustic treatments. However, although accurate, such methodology is computationally expensive when real life applications are considered. A potential reduction of the computational burden could be obtained by approximating the effect of the acoustic treatment on the master subsystems without introducing physical degrees of freedom. To do that, the treatment has to be assumed homogeneous, flat, and of infinite lateral extent. Under these hypotheses, simple analytical tools like the transfer matrix method can be employed. In this paper, a hybrid finite element-transfer matrix methodology is proposed. The impact of the limiting assumptions inherent within the analytical framework are assessed for the case of plate-cavity systems involving flat and homogeneous acoustic treatments. The results prove that the hybrid model can capture the qualitative behavior of the vibroacoustic system while reducing the computational effort.
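To illustrate the transfer matrix ingredient mentioned above (and only that ingredient; the hybrid finite element-transfer matrix coupling of the paper is not reproduced), the sketch below computes the normal-incidence transmission loss of a single flat, homogeneous layer modeled as an equivalent fluid; the layer properties are assumed values.

```python
# Minimal transfer-matrix sketch: normal-incidence transmission loss of a single
# homogeneous fluid-equivalent layer between two semi-infinite air domains.
import numpy as np

def layer_matrix(freq, thickness, density, speed):
    """2x2 pressure/velocity transfer matrix of a fluid layer at normal incidence."""
    k = 2 * np.pi * freq / speed          # wavenumber in the layer
    Z = density * speed                   # characteristic impedance of the layer
    return np.array([[np.cos(k * thickness), 1j * Z * np.sin(k * thickness)],
                     [1j * np.sin(k * thickness) / Z, np.cos(k * thickness)]])

def transmission_loss(freq, thickness, density, speed, z0=1.21 * 343.0):
    T = layer_matrix(freq, thickness, density, speed)
    # Standard normal-incidence TL expression for a layer faced and backed by air.
    tau_inv = 0.5 * abs(T[0, 0] + T[0, 1] / z0 + z0 * T[1, 0] + T[1, 1])
    return 20 * np.log10(tau_inv)

if __name__ == "__main__":
    # Hypothetical treatment approximated as an equivalent fluid layer.
    for f in (125, 250, 500, 1000, 2000):
        print(f, round(transmission_loss(f, thickness=0.05, density=30.0, speed=200.0), 1))
```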
NASA/FAA general aviation crash dynamics program
NASA Technical Reports Server (NTRS)
Thomson, R. G.; Hayduk, R. J.; Carden, H. D.
1981-01-01
The program involves controlled full scale crash testing, nonlinear structural analyses to predict large deflection elastoplastic response, and load attenuating concepts for use in improved seat and subfloor structure. Both analytical and experimental methods are used to develop expertise in these areas. Analyses include simplified procedures for estimating energy dissipating capabilities and comprehensive computerized procedures for predicting airframe response. These analyses are developed to provide designers with methods for predicting accelerations, loads, and displacements on collapsing structure. Tests on typical full scale aircraft and on full and subscale structural components are performed to verify the analyses and to demonstrate load attenuating concepts. A special apparatus was built to test emergency locator transmitters when attached to representative aircraft structure. The apparatus is shown to provide a good simulation of the longitudinal crash pulse observed in full scale aircraft crash tests.
Simple and Efficient Numerical Evaluation of Near-Hypersingular Integrals
NASA Technical Reports Server (NTRS)
Fink, Patrick W.; Wilton, Donald R.; Khayat, Michael A.
2007-01-01
Recently, significant progress has been made in the handling of singular and nearly-singular potential integrals that commonly arise in the Boundary Element Method (BEM). To facilitate object-oriented programming and handling of higher order basis functions, cancellation techniques are favored over techniques involving singularity subtraction. However, gradients of the Newton-type potentials, which produce hypersingular kernels, are also frequently required in BEM formulations. As is the case with the potentials, treatment of the near-hypersingular integrals has proven more challenging than treating the limiting case in which the observation point approaches the surface. Historically, numerical evaluation of these near-hypersingularities has often involved a two-step procedure: a singularity subtraction to reduce the order of the singularity, followed by a boundary contour integral evaluation of the extracted part. Since this evaluation necessarily links the basis function, the Green's function, and the integration domain (element shape), the approach fits poorly with object-oriented programming concepts. Thus, there is a need for cancellation-type techniques for efficient numerical evaluation of the gradient of the potential. Progress in the development of efficient cancellation-type procedures for the gradient potentials was recently presented. To the extent possible, a change of variables is chosen such that the Jacobian of the transformation cancels the singularity. However, since the gradient kernel involves singularities of different orders, we also require that the transformation leave the remaining terms analytic. The terms "normal" and "tangential" are used herein with reference to the source element. Also, since computational formulations often involve the numerical evaluation of both potentials and their gradients, it is highly desirable that a single integration procedure efficiently handle both.
SRC-I demonstration plant analytical laboratory methods manual. Final technical report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klusaritz, M.L.; Tewari, K.C.; Tiedge, W.F.
1983-03-01
This manual is a compilation of analytical procedures required for operation of a Solvent-Refined Coal (SRC-I) demonstration or commercial plant. Each method reproduced in full includes a detailed procedure, a list of equipment and reagents, safety precautions, and, where possible, a precision statement. Procedures for the laboratory's environmental and industrial hygiene modules are not included. Required American Society for Testing and Materials (ASTM) methods are cited, and ICRC's suggested modifications to these methods for handling coal-derived products are provided.
The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for t...
ERIC Educational Resources Information Center
Wang, Tianyou
2009-01-01
Holland and colleagues derived a formula for analytical standard error of equating using the delta-method for the kernel equating method. Extending their derivation, this article derives an analytical standard error of equating procedure for the conventional percentile rank-based equipercentile equating with log-linear smoothing. This procedure is…
Jędrkiewicz, Renata; Orłowski, Aleksander; Namieśnik, Jacek; Tobiszewski, Marek
2016-01-15
In this study we perform a ranking of analytical procedures for 3-monochloropropane-1,2-diol determination in soy sauces by the PROMETHEE method. Multicriteria decision analysis was performed for three different scenarios (metrological, economic and environmental) by applying different weights to the decision-making criteria. All three scenarios indicate the capillary electrophoresis-based procedure as the most preferable. Apart from that, the details of the ranking results differ among the three scenarios. A second run of rankings was done for scenarios that include only metrological, economic or environmental criteria, neglecting the others. These results show that the green analytical chemistry-based selection correlates with the economic one, while there is no correlation with the metrological one. This implies that green analytical chemistry can be brought into laboratories without analytical performance costs, and that it is even supported by economic reasons. Copyright © 2015 Elsevier B.V. All rights reserved.
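For readers unfamiliar with PROMETHEE II, the sketch below shows the core net-flow calculation with the simple "usual" preference function and three hypothetical weight scenarios (metrological, economic, environmental); the alternatives, criteria and weights are illustrative and are not taken from the study.

```python
# Compact PROMETHEE II sketch with the "usual" preference function.
# Alternatives, criteria values and weight scenarios are hypothetical.
def promethee_ii(scores, weights, maximize):
    """scores[a][j]: value of alternative a on criterion j; returns ranked net flows."""
    names = list(scores)
    n = len(names)
    def pref(a, b):                                   # aggregated preference pi(a, b)
        total = 0.0
        for j, w in enumerate(weights):
            d = scores[a][j] - scores[b][j]
            if not maximize[j]:
                d = -d                                # criteria to be minimized
            total += w * (1.0 if d > 0 else 0.0)      # "usual" preference function
        return total
    phi = {}
    for a in names:
        plus = sum(pref(a, b) for b in names if b != a) / (n - 1)
        minus = sum(pref(b, a) for b in names if b != a) / (n - 1)
        phi[a] = plus - minus                         # net outranking flow
    return sorted(phi.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    # criteria: [recovery %, cost per sample, solvent volume] (all hypothetical)
    procedures = {"CE": [92, 10, 1], "GC-MS": [95, 25, 15], "LC-MS/MS": [97, 40, 8]}
    maximize = [True, False, False]
    for label, w in {"metrological": [0.6, 0.2, 0.2],
                     "economic": [0.2, 0.6, 0.2],
                     "environmental": [0.2, 0.2, 0.6]}.items():
        print(label, promethee_ii(procedures, w, maximize))
```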
High Throughput Determination of VX in Drinking Water by ...
Methods Report. This document provides the standard operating procedure for determination of the chemical warfare agent VX (O-Ethyl S-2-Diisopropylamino-Ethyl Methylphosphonothioate) in drinking water by isotope dilution liquid chromatography tandem mass spectrometry (LC/MS/MS). This method was adapted from one that was initially developed by the Centers for Disease Control and Prevention, in the National Center for Environmental Health, for the determination and quantitation of VX in aqueous matrices. This method is designed to support site-specific cleanup goals of environmental remediation activities following a homeland security incident involving this analyte.
Median of patient results as a tool for assessment of analytical stability.
Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György
2015-06-15
In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes run on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable source of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
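A minimal sketch of the approach, assuming the commonly used desirable bias specification derived from biological variation, 0.25 * sqrt(CVI^2 + CVG^2), and entirely hypothetical patient results:

```python
# Minimal sketch of monitoring analytical stability with monthly patient medians.
# The desirable allowable bias from biological variation is taken as
# B = 0.25 * sqrt(CVI^2 + CVG^2); the patient results and the CVs are hypothetical.
from statistics import median
from math import sqrt

def allowable_bias(cv_within, cv_between):
    return 0.25 * sqrt(cv_within**2 + cv_between**2)     # percent

def monthly_median_check(monthly_results, baseline_median, cv_within, cv_between):
    limit = allowable_bias(cv_within, cv_between)
    report = {}
    for month, results in monthly_results.items():
        m = median(results)
        deviation = 100.0 * (m - baseline_median) / baseline_median
        report[month] = (round(m, 2), round(deviation, 2), abs(deviation) > limit)
    return report

if __name__ == "__main__":
    data = {"Jan": [4.1, 4.3, 4.0, 4.2, 4.4],
            "Feb": [4.2, 4.1, 4.3, 4.5, 4.2],
            "Mar": [4.7, 4.8, 4.6, 4.9, 4.7]}              # hypothetical patient results
    print(monthly_median_check(data, baseline_median=4.2, cv_within=5.0, cv_between=7.0))
```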
Procedures For Microbial-Ecology Laboratory
NASA Technical Reports Server (NTRS)
Huff, Timothy L.
1993-01-01
The Microbial Ecology Laboratory Procedures Manual provides concise and well-defined instructions on routine technical procedures to be followed in the microbiology laboratory to ensure safety, analytical control, and validity of results.
NASA Astrophysics Data System (ADS)
Kolecki, J.
2015-12-01
The Bundlab software has been developed mainly for academic and research applications. This work can be treated as a report on the current state of development of the program, focusing especially on the analytical solutions. First, the overall characteristics of the software are described. The image orientation procedure is then presented, starting from relative orientation. The applied solution is based on the coplanarity equation parametrized with the essential matrix; the problem is reformulated so that it can be solved with methods of algebraic geometry, and the solution is then refined by least-squares optimization. The formation of the image block from the oriented models, as well as the absolute orientation procedure, was implemented using the Horn approach as the base algorithm. The second part of the paper is devoted to the tools and methods applied in the stereo digitization module, and the solutions that support the user and improve accuracy are described. A few exemplary applications and products are mentioned, and the paper closes with concepts for further development and improvement of existing functions.
Laboratory, Field, and Analytical Procedures for Using ...
Regardless of the remedial technology invoked to address contaminated sediments in the environment, there is a critical need to have tools for assessing the effectiveness of the remedy. In the past, these tools have included chemical and biomonitoring of the water column and sediments, toxicity testing and bioaccumulation studies performed on site sediments, and application of partitioning, transport and fate modeling. All of these tools served as lines of evidence for making informed environmental management decisions at contaminated sediment sites. In the last ten years, a new tool for assessing remedial effectiveness has gained a great deal of attention. Passive sampling offers a tool capable of measuring the freely dissolved concentration (Cfree) of legacy contaminants in water and sediments. In addition to assessing the effectiveness of the remedy, passive sampling can be applied for a variety of other contaminated sediments site purposes involved with performing the preliminary assessment and site inspection, conducting the remedial investigation and feasibility study, preparing the remedial design, and assessing the potential for contaminant bioaccumulation. While there is a distinct need for using passive sampling at contaminated sediments sites and several previous documents and research articles have discussed various aspects of passive sampling, there has not been definitive guidance on the laboratory, field and analytical procedures for using pas
Giese, Matthew W; Lewis, Mark A; Giese, Laura; Smith, Kevin M
2015-01-01
The requirements for an acceptable cannabis assay have changed dramatically over the years, resulting in a large number of laboratories using a diverse array of analytical methodologies that have not been properly validated. Due to the lack of sufficiently validated methods, we conducted a single-laboratory validation study for the determination of cannabinoids and terpenes in a variety of commonly occurring cultivars. The procedure involves high-throughput homogenization to prepare the sample extract, which is then profiled for cannabinoids and terpenes by HPLC-diode array detector and GC-flame ionization detector, respectively. Spike recovery studies for terpenes in the range of 0.03-1.5% were carried out with analytical standards, while recovery studies for Δ9-tetrahydrocannabinolic acid, cannabidiolic acid, Δ9-tetrahydrocannabivarinic acid, and cannabigerolic acid and their neutral counterparts in the range of 0.3-35% were carried out using cannabis extracts. In general, accuracy at all levels was within 5%, and RSDs were less than 3%. The interday and intraday repeatabilities of the procedure were evaluated with five different cultivars of varying chemotype, again resulting in acceptable RSDs. As an example of the application of this assay, it was used to illustrate the variability seen in cannabis coming from very advanced indoor cultivation operations.
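As a small worked illustration of the two figures of merit reported above, the sketch below computes percent spike recovery and relative standard deviation (RSD) for a set of hypothetical replicate results:

```python
# Minimal sketch of percent spike recovery and RSD; all values are hypothetical.
from statistics import mean, stdev

def percent_recovery(measured, spiked):
    return 100.0 * measured / spiked

def rsd(replicates):
    return 100.0 * stdev(replicates) / mean(replicates)

if __name__ == "__main__":
    # e.g., a terpene spiked at 0.50% (w/w) and measured in replicate extractions
    replicate_results = [0.49, 0.51, 0.50, 0.48, 0.52]
    print("mean recovery %:", round(percent_recovery(mean(replicate_results), 0.50), 1))
    print("RSD %:", round(rsd(replicate_results), 2))
```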
Vilaplana, Francisco; Martínez-Sanz, Marta; Ribes-Greus, Amparo; Karlsson, Sigbritt
2010-01-15
The emission of low molecular weight compounds from recycled high-impact polystyrene (HIPS) has been investigated using headspace solid-phase microextraction (HS-SPME) and gas chromatography-mass spectrometry (GC-MS). Four released target analytes (styrene, benzaldehyde, acetophenone, and 2-phenylpropanal) were selected for the optimisation of the HS-SPME sampling procedure, by analysing operating parameters such as type of SPME fibre (polarity and operating mechanism), particle size, extraction temperature and time. Twenty-six different compounds were identified as being released at different temperatures from recycled HIPS, including residues of polymerisation, oxidised derivatives of styrene, and additives. The type of SPME fibre employed in the sampling procedure affected the detection of emitted components. An adsorptive fibre such as carboxen/polydimethylsiloxane (CAR/PDMS) offered good selectivity for both non-polar and polar volatile compounds at lower temperatures; higher temperatures result in interferences from less-volatile released compounds. An absorptive fibre such as polydimethylsiloxane (PDMS) is suitable for the detection of less-volatile non-polar molecules at higher temperatures. The nature and relative amount of the emitted compounds increased with higher exposure temperature and smaller polymeric particle size. HS-SPME proves to be a suitable technique for screening the emission of semi-volatile organic compounds (SVOCs) from polymeric materials; reliable quantification of the content of target analytes in recycled HIPS is, however, difficult due to the complex mass-transfer processes involved, matrix effects, and the difficulties in equilibrating the analytical system. Copyright © 2009 Elsevier B.V. All rights reserved.
Innovations in coating technology.
Behzadi, Sharareh S; Toegel, Stefan; Viernstein, Helmut
2008-01-01
Despite representing one of the oldest pharmaceutical techniques, coating of dosage forms is still frequently used in pharmaceutical manufacturing. The aims of coating range from simply masking the taste or odour of drugs to the sophisticated controlling of site and rate of drug release. The high expectations for different coating technologies have required great efforts regarding the development of reproducible and controllable production processes. Basically, improvements in coating methods have focused on particle movement, spraying systems, and air and energy transport. Thereby, homogeneous distribution of coating material and increased drying efficiency should be accomplished in order to achieve high end product quality. Moreover, given the claim of the FDA to design the end product quality already during the manufacturing process (Quality by Design), the development of analytical methods for the analysis, management and control of coating processes has attracted special attention during recent years. The present review focuses on recent patents claiming improvements in pharmaceutical coating technology and intends to first familiarize the reader with the available procedures and to subsequently explain the application of different analytical tools. Aiming to structure this comprehensive field, coating technologies are primarily divided into pan and fluidized bed coating methods. Regarding pan coating procedures, pans rotating around inclined, horizontal and vertical axes are reviewed separately. On the other hand, fluidized bed technologies are subdivided into those involving fluidized and spouted beds. Then, continuous processing techniques and improvements in spraying systems are discussed in dedicated chapters. Finally, currently used analytical methods for the understanding and management of coating processes are reviewed in detail in the last section of the review.
SAM Companion Documents and Sample Collection Procedures provide information intended to complement the analytical methods listed in Selected Analytical Methods for Environmental Remediation and Recovery (SAM).
A Big Data Analytics Methodology Program in the Health Sector
ERIC Educational Resources Information Center
Lawler, James; Joseph, Anthony; Howell-Barber, H.
2016-01-01
The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2009-01-01
The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's Lab Master data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality control samples analyzed from July 2003 through June 2005. Results for the quality-control samples for 20 analytical procedures were evaluated for bias and precision. Control charts indicate that data for five of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, total monomeric aluminum, pH, silicon, and sodium. Seven of the analytical procedures were biased throughout the analysis period for the high-concentration sample, but were within control limits; these procedures were: dissolved organic carbon, chloride, nitrate (ion chromatograph), nitrite, silicon, sodium, and sulfate. The calcium and magnesium procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The total aluminum and specific conductance procedures were biased for the high-concentration and low-concentration samples, but were within control limits. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 17 of 18 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for dissolved organic carbon. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 18 of the 22 analytes. At least 85 percent of the samples met data-quality objectives for all analytes except total monomeric aluminum (82 percent of samples met objectives), total aluminum (77 percent of samples met objectives), chloride (80 percent of samples met objectives), fluoride (76 percent of samples met objectives), and nitrate (ion chromatograph) (79 percent of samples met objectives). The ammonium and total dissolved nitrogen did not meet the data-quality objectives. Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated good data quality over the time period, with ratings for each sample in the satisfactory, good, and excellent ranges or less than 10 percent error. The P-sample (low-ionic-strength constituents) analysis had one marginal and two unsatisfactory ratings for the chloride procedure. The T-sample (trace constituents)analysis had two unsatisfactory ratings and one high range percent error for the aluminum procedure. The N-sample (nutrient constituents) analysis had one marginal rating for the nitrate procedure. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 84 percent of the samples met data-quality objectives for 11 of the 14 analytes; the exceptions were ammonium, total aluminum, and acid-neutralizing capacity. 
The ammonium procedure did not meet data-quality objectives in all studies. Data-quality objectives were not met in 23 percent of samples analyzed for total aluminum and 45 percent of samples analyzed for acid-neutralizing capacity. Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 86 percent of the samples analyzed for calcium, chloride, fluoride, magnesium, pH, potassium, sodium, and sulfate. Data-quality objectives were not met by samples analyzed for fluoride.
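A minimal sketch of the triplicate-precision check used in these reports, with a hypothetical 10 percent coefficient-of-variation objective and made-up triplicate data:

```python
# Minimal sketch: coefficient of variation (CV) of triplicate samples compared with
# a data-quality objective. The 10 % objective and the data are hypothetical.
from statistics import mean, stdev

def coefficient_of_variation(triplicate):
    return 100.0 * stdev(triplicate) / mean(triplicate)

def percent_meeting_objective(triplicates, cv_objective=10.0):
    flags = [coefficient_of_variation(t) <= cv_objective for t in triplicates]
    return 100.0 * sum(flags) / len(flags)

if __name__ == "__main__":
    chloride_triplicates = [[0.52, 0.50, 0.51], [0.34, 0.40, 0.30], [1.20, 1.22, 1.19]]
    print("percent of samples meeting objective:",
          round(percent_meeting_objective(chloride_triplicates), 1))
```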
Surgical task analysis of simulated laparoscopic cholecystectomy with a navigation system.
Sugino, T; Kawahira, H; Nakamura, R
2014-09-01
Advanced surgical procedures, which have become complex and difficult, increase the burden on surgeons. Quantitative analysis of surgical procedures can improve training, reduce variability, and enable optimization of surgical procedures. To this end, a surgical task analysis system was developed that uses only surgical navigation information. The analysis comprised division of the surgical procedure, task progress analysis, and task efficiency analysis. First, the procedure was divided into five stages. Second, the operating time and progress rate were recorded to document task progress during specific stages, including the dissecting task. Third, the speed of the surgical instrument motion (mean velocity and acceleration), as well as the size and overlap ratio of the approximate ellipse of the location log data distribution, was computed to estimate the task efficiency during each stage. These analysis methods were evaluated through experimental validation with two groups of surgeons, i.e., skilled and "other" surgeons. The performance metrics and analytical parameters included incidents during the operation, the surgical environment, and the surgeons' skills or habits. Comparison of the groups revealed that skilled surgeons tended to perform the procedure in less time and within smaller regions; they also manipulated the surgical instruments more gently. The surgical task analysis developed here for quantitative assessment of surgical procedures and surgical performance may provide practical methods and metrics for objective evaluation of surgical expertise.
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2006-01-01
The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's LabMaster data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality-control samples analyzed from July 1999 through June 2001. Results for the quality-control samples for 18 analytical procedures were evaluated for bias and precision. Control charts indicate that data for eight of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, total monomeric aluminum, total aluminum, calcium, chloride and nitrate (ion chromatography and colormetric method) and sulfate. The total aluminum and dissolved organic carbon procedures were biased throughout the analysis period for the high-concentration sample, but were within control limits. The calcium and specific conductance procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The magnesium procedure was biased for the high-concentration and low concentration samples, but was within control limits. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 14 of 15 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for dissolved organic carbon. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 17 of the 18 analytes. At least 90 percent of the samples met data-quality objectives for all analytes except ammonium (81 percent of samples met objectives), chloride (75 percent of samples met objectives), and sodium (86 percent of samples met objectives). Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated good data quality over the time period, with most ratings for each sample in the good to excellent range. The P-sample (low-ionic-strength constituents) analysis had one satisfactory rating for the specific conductance procedure in one study. The T-sample (trace constituents) analysis had one satisfactory rating for the aluminum procedure in one study and one unsatisfactory rating for the sodium procedure in another. The remainder of the samples had good or excellent ratings for each study. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 89 percent of the samples met data-quality objectives for 10 of the 14 analytes; the exceptions were ammonium, total aluminum, dissolved organic carbon, and sodium. Results indicate a positive bias for the ammonium procedure in all studies. Data-quality objectives were not met in 50 percent of samples analyzed for total aluminum, 38 percent of samples analyzed for dissolved organic carbon, and 27 percent of samples analyzed for sodium. 
Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 91 percent of the samples analyzed for calcium, chloride, fluoride, magnesium, pH, potassium, and sulfate. Data-quality objectives were met by 75 percent of the samples analyzed for sodium and 58 percent of the samples analyzed for specific conductance.
Jędrkiewicz, Renata; Tsakovski, Stefan; Lavenu, Aurore; Namieśnik, Jacek; Tobiszewski, Marek
2018-02-01
A novel methodology for grouping and ranking with application of self-organizing maps and multicriteria decision analysis is presented. The dataset consists of 22 objects that are analytical procedures applied to furan determination in food samples. They are described by 10 variables referring to their analytical performance and environmental and economic aspects. Multivariate statistical analysis makes it possible to limit the amount of input data for the ranking analysis. The assessment results show that the most beneficial procedures are based on microextraction techniques with GC-MS final determination. It is shown how the information obtained from the two tools complements each other, and the applicability of combining grouping and ranking is also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
Multidisciplinary design optimization using multiobjective formulation techniques
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Pagaldipti, Narayanan S.
1995-01-01
This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. Accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis techniques are then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high speed aircraft. Results obtained show significant improvements in the aircraft aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
Dévier, Marie-Hélène; Le Menach, Karyn; Viglino, Liza; Di Gioia, Lodovico; Lachassagne, Patrick; Budzinski, Hélène
2013-01-15
The aim of this work was to investigate the potential presence of a broad range of organic compounds, such as hormones, alkylphenols, bisphenol A and phthalates, as well as pharmaceutical substances in two brands of bottled natural mineral waters (Evian and Volvic, Danone). The phthalates were determined by solid-phase microextraction coupled to gas chromatography-mass spectrometry (SPME-GC-MS) and the other compounds by liquid chromatography-tandem mass spectrometry (LC-MS/MS) or gas chromatography-mass spectrometry (GC-MS) after solid-phase extraction. The potential migration of alkylphenols, bisphenol A and phthalates from polyethylene terephthalate (PET) bottles was also investigated under standardized test conditions. Evian and Volvic natural mineral waters contain none of the around 120 targeted organic compounds. Traces of 3 pharmaceuticals (ketoprofen, salicylic acid, and caffeine), 3 alkylphenols (4-nonylphenol, 4-t-octylphenol, and 4-nonylphenol diethoxylate), and some phthalates including di(2-ethylhexyl)phthalate (DEHP) were detected in the samples, but they were also present in the procedural blanks at similar levels. The additional test procedures demonstrated that the few detected compounds originated from the background laboratory contamination. Analytical procedures have been designed both in the bottling factory and in the laboratory in order to investigate the sources of DEHP and to minimize to the maximum this unavoidable laboratory contamination. It was evidenced that no migration of the targeted compounds from bottles occurred under the test conditions. The results obtained in this study underline the complexity of reaching a reliable measure to qualify the contamination of a sample at ultra-trace level, in the field of very pure matrices. The analytical procedures involving glassware, equipment, hoods, and rooms specifically dedicated to trace analysis allowed us to reach reliable procedural limits of quantification at the ng/L level, by lowering the background laboratory contamination. Copyright © 2012 Elsevier B.V. All rights reserved.
Development of Multiobjective Optimization Techniques for Sonic Boom Minimization
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.
1996-01-01
A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier Stokes equations solver. Aerodynamic design sensitivities for high speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications namely, gas turbine blades and high speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulation such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions. The multidisciplinary design optimization procedures for high speed wing-body configurations simultaneously improve the aerodynamic, the sonic boom and the structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
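For reference, the Kreisselmeier-Steinhauser function mentioned above is commonly written as the following envelope of the individual objectives or constraints f_k(x); the notation and the draw-down parameter ρ are the standard textbook form, not taken from the report itself.

```latex
% Commonly used Kreisselmeier-Steinhauser envelope of objectives/constraints f_k(x);
% rho > 0 controls how closely the envelope tracks the maximum.
\[
  \mathrm{KS}\bigl(f_1,\dots,f_m\bigr)
  = f_{\max}(x) + \frac{1}{\rho}
    \ln\!\left[\sum_{k=1}^{m} \exp\bigl(\rho\,(f_k(x) - f_{\max}(x))\bigr)\right],
  \qquad f_{\max}(x) = \max_k f_k(x).
\]
```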
Digital forensics: an analytical crime scene procedure model (ACSPM).
Bulbul, Halil Ibrahim; Yavuzcan, H Guclu; Ozel, Mesut
2013-12-10
In order to ensure that digital evidence is collected, preserved, examined, or transferred in a manner safeguarding the accuracy and reliability of the evidence, law enforcement and digital forensic units must establish and maintain an effective quality assurance system. The very first part of this system is standard operating procedures (SOPs) and/or models that conform to chain-of-custody requirements and rely on the digital forensics "process-phase-procedure-task-subtask" sequence. An acceptable and thorough Digital Forensics (DF) process depends on the sequential DF phases, each phase depends on sequential DF procedures, and each procedure in turn depends on tasks and subtasks. Numerous DF process models that define DF phases exist in the literature, but no DF model that defines phase-based sequential procedures for the crime scene has been identified. The analytical crime scene procedure model (ACSPM) that we suggest in this paper is intended to fill this gap. The proposed analytical procedure model for digital investigations at a crime scene is developed and defined for crime scene practitioners, with the main focus on crime scene digital forensic procedures rather than on the whole digital investigation process and phases that end up in court. A review of the relevant literature and consultation with law enforcement agencies revealed only device-based charts specific to a particular device and/or more general approaches to digital evidence management from crime scene to court. After analyzing the needs of law enforcement organizations and recognizing the absence of a digital investigation procedure model for crime scene activities, we inspected the relevant literature in an analytical way. The outcome of this inspection is the suggested model explained here, which is intended to provide guidance for thorough and secure implementation of digital forensic procedures at a crime scene. In digital forensic investigations each case is unique and needs special examination; it is not possible to cover every aspect of crime scene digital forensics, but the proposed procedure model is intended as a general guideline for practitioners. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Guided-inquiry laboratory experiments to improve students' analytical thinking skills
NASA Astrophysics Data System (ADS)
Wahyuni, Tutik S.; Analita, Rizki N.
2017-12-01
This study aims to improve the experiment implementation quality and analytical thinking skills of undergraduate students through guided-inquiry laboratory experiments. The study was a classroom action research conducted in three cycles. It was carried out with 38 undergraduate students in the second semester of the Biology Education Department of the State Islamic Institute (SII) of Tulungagung, as part of a Chemistry for Biology course. The research instruments were lesson plans, learning observation sheets and the undergraduate students' experimental procedures. Research data were analyzed using a quantitative-descriptive method. The improvement in analytical thinking skills was measured using the normalized gain score and a paired t-test. The results showed that the guided-inquiry laboratory experiment model was able to improve both the experiment implementation quality and analytical thinking skills. The N-gain score for analytical thinking skills increased, although only by 0.03 (a low-increase category), as indicated by the experimental reports. Some undergraduate students had difficulties in detecting the relation of one part to another and to an overall structure. The findings suggest that giving feedback on procedural knowledge and experimental reports is important, and that revising the experimental procedure with some scaffolding questions is also needed.
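The normalized gain and paired t-test mentioned above are simple to compute; the Python sketch below shows the usual Hake-style class-average gain and a paired test, with hypothetical pre/post score arrays standing in for the study's data.

# Sketch of the analysis described: normalized gain and a paired t-test on
# pre/post analytical-thinking scores. Score arrays are hypothetical.
import numpy as np
from scipy import stats

pre  = np.array([55, 60, 48, 70, 62], dtype=float)   # pre-test scores (0-100)
post = np.array([57, 61, 50, 71, 63], dtype=float)   # post-test scores (0-100)

# Class-average normalized gain: g = (<post> - <pre>) / (100 - <pre>)
g = (post.mean() - pre.mean()) / (100.0 - pre.mean())

t_stat, p_value = stats.ttest_rel(post, pre)          # paired t-test
print(f"normalized gain g = {g:.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")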
Remane, Daniela; Wissenbach, Dirk K; Meyer, Markus R; Maurer, Hans H
2010-04-15
In clinical and forensic toxicology, multi-analyte procedures are very useful to quantify drugs and poisons of different classes in one run. For liquid chromatographic/tandem mass spectrometric (LC/MS/MS) multi-analyte procedures, often only a limited number of stable-isotope-labeled internal standards (SIL-ISs) are available. If an SIL-IS is used for quantification of other analytes, it must be ensured that the co-eluting native analyte does not influence its ionization. Therefore, the effect of ion suppression and enhancement of fourteen SIL-ISs caused by their native analogues has been studied. The native analyte concentration was shown to influence the extent of ion suppression and enhancement, with more suppression at higher analyte concentrations, especially when electrospray ionization (ESI) was used. Using atmospheric-pressure chemical ionization (APCI), methanolic solutions showed mainly enhancement effects, whereas, with one exception, no ion suppression or enhancement occurred when plasma extracts were used under these conditions. Such differences were not observed using ESI. With ESI, eleven SIL-ISs showed relevant suppression effects, but only one analyte showed suppression effects when APCI was used. The presented study showed that ion suppression and enhancement tests using matrix-based samples from different sources are essential for the selection of ISs, particularly if one IS is used for several analytes, to avoid incorrect quantification. In conclusion, only SIL-ISs for which no suppression or enhancement effects are observed should be selected. If not enough ISs are free of ionization interferences, a different ionization technique should be considered. 2010 John Wiley & Sons, Ltd.
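A common way to put a number on the suppression and enhancement effects discussed here is the peak-area ratio between a matrix-based sample and a neat solvent standard. The Python sketch below shows that generic calculation only; it is not the authors' protocol, and the peak areas are hypothetical.

# Hedged sketch: quantifying ion suppression/enhancement as a matrix effect (ME).
# ME% = 100 * (IS peak area in spiked matrix extract / IS peak area in neat solvent);
# values below 100 indicate suppression, above 100 enhancement. Numbers are illustrative.
def matrix_effect(area_matrix, area_solvent):
    return 100.0 * area_matrix / area_solvent

area_neat   = 1.25e6   # SIL-IS peak area in a methanolic standard (hypothetical)
area_plasma = 0.92e6   # SIL-IS peak area in a spiked plasma extract (hypothetical)
me = matrix_effect(area_plasma, area_neat)
print(f"matrix effect = {me:.0f}%  ->  {'suppression' if me < 100 else 'enhancement'}")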
A new method for constructing analytic elements for groundwater flow.
NASA Astrophysics Data System (ADS)
Strack, O. D.
2007-12-01
The analytic element method is based upon the superposition of analytic functions that are defined throughout the infinite domain, and can be used to meet a variety of boundary conditions. Analytic elements have been used successfully for a number of problems, mainly dealing with the Poisson equation (see, e.g., Theory and Applications of the Analytic Element Method, Reviews of Geophysics, 41,2/1005 2003 by O.D.L. Strack). The majority of these analytic elements consist of functions that exhibit jumps along lines or curves. Such linear analytic elements have been developed also for other partial differential equations, e.g., the modified Helmholtz equation and the heat equation, and were constructed by integrating elementary solutions, the point sink and the point doublet, along a line. This approach is limiting for two reasons. First, the elementary solutions must exist, and, second, the integration tends to limit the range of solutions that can be obtained. We present a procedure for generating analytic elements that requires merely the existence of a harmonic function with the desired properties; such functions exist in abundance. The procedure to be presented is used to generalize this harmonic function in such a way that the resulting expression satisfies the applicable differential equation. The approach will be applied, along with numerical examples, to the modified Helmholtz equation and to the heat equation, while it is noted that the method is in no way restricted to these equations. The procedure is carried out entirely in terms of complex variables, using Wirtinger calculus.
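To make the superposition idea concrete, the Python sketch below evaluates the classic Poisson-equation building blocks of the analytic element method: a uniform flow plus point sinks (wells), written as a complex discharge potential. It illustrates the superposition principle only, under assumed sign conventions and illustrative well data; it is not the new element-generation procedure described in the abstract.

# Minimal sketch of analytic element superposition: uniform flow plus wells,
# evaluated anywhere in the complex plane. Strengths and locations are illustrative.
import numpy as np

def omega(z, Qx=1.0, wells=((0.5 + 0.5j, 2.0),)):
    """Complex discharge potential: uniform flow plus superposed point sinks (wells)."""
    val = -Qx * z                                    # uniform flow in the +x direction
    for zw, Q in wells:                              # each well: location zw, discharge Q
        val = val + (Q / (2.0 * np.pi)) * np.log(z - zw)
    return val

z = np.array([1.0 + 0.2j, -0.3 + 1.1j])              # observation points
print(omega(z).real)                                  # discharge potential Phi = Re(Omega)

Because each term is analytic away from its singularity, any number of such elements can be summed and the result still satisfies the governing equation between the elements.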
Parametric study of minimum reactor mass in energy-storage dc-to-dc converters
NASA Technical Reports Server (NTRS)
Wong, R. C.; Owen, H. A., Jr.; Wilson, T. G.
1981-01-01
Closed-form analytical solutions for the design equations of a minimum-mass reactor for a two-winding voltage-or-current step-up converter are derived. A quantitative relationship between the three parameters - minimum total reactor mass, maximum output power, and switching frequency - is extracted from these analytical solutions. The validity of the closed-form solution is verified by a numerical minimization procedure. A computer-aided design procedure using commercially available toroidal cores and magnet wires is also used to examine how the results from practical designs follow the predictions of the analytical solutions.
X-Graphs: Language and Algorithms for Heterogeneous Graph Streams
2017-09-01
[Front-matter and table-of-contents residue from the report; recoverable headings: Methods, Assumptions, and Procedures; Software Abstractions for Graph Analytic Applications; High Performance Platforms for Graph Processing. The recoverable body text notes that data is stored in a distributed file system and that the work includes implementations of novel methods for network analysis, among them detection of overlapping communities, personalized PageRank, and node embeddings.]
Bioanalytical procedures for monitoring in utero drug exposure
Gray, Teresa
2009-01-01
Drug use by pregnant women has been extensively associated with adverse mental, physical, and psychological outcomes in their exposed children. This manuscript reviews bioanalytical methods for in utero drug exposure monitoring for common drugs of abuse in urine, hair, oral fluid, blood, sweat, meconium, amniotic fluid, umbilical cord tissue, nails, and vernix caseosa; neonatal matrices are particularly emphasized. Advantages and limitations of testing different maternal and neonatal biological specimens, including ease and invasiveness of collection, detection time frames, sensitivities, and specificities, are described, and specific references for available analytical methods are included. Future research involves identifying metabolites unique to fetal drug metabolism to improve detection rates of in utero drug exposure and determining relationships between the amount, frequency, and timing of drug exposure and drug concentrations in infant biological fluids and tissues. Accurate bioanalytical procedures are vital to defining the scope of and resolving this important public health problem. PMID:17370066
Progress in protein crystallography.
Dauter, Zbigniew; Wlodawer, Alexander
2016-01-01
Macromolecular crystallography has evolved enormously since the pioneering days, when structures were solved by "wizards" performing all complicated procedures almost by hand. Currently, crystal structures of large systems can often be solved very effectively by various powerful automatic programs in days, hours, or even minutes. Such progress is to a large extent coupled to advances in many other fields, such as genetic engineering, computer technology, and the availability of synchrotron beam lines, creating the highly interdisciplinary science of macromolecular crystallography. Due to this unprecedented success, crystallography is often treated as just another analytical method and is practiced by researchers who are interested in the structures of macromolecules but not highly competent in the procedures involved in structure determination. One should therefore keep in mind that contemporary, highly automatic systems can produce results almost without human intervention, but the resulting structures must be carefully checked and validated before their release into the public domain.
Modified application of HS-SPME for quality evaluation of essential oil plant materials.
Dawidowicz, Andrzej L; Szewczyk, Joanna; Dybowski, Michal P
2016-01-01
The main limitation in the standard application of headspace analysis employing solid-phase microextraction (HS-SPME) for the evaluation of plants as sources of essential oils (EOs) is that the quantitative relations of EO components differ from those obtained by direct analysis of the EO obtained by steam distillation (SD) from the same plant (EO/SD). The results presented in the paper for thyme, mint, sage, basil, savory, and marjoram prove that the quantitative relations of EO components established by the HS-SPME procedure and by direct analysis of EO/SD are similar when the plant material in the HS-SPME process is replaced by its suspension in an oil of the same physicochemical character as the SPME fiber coating. The observed differences in the thyme EO composition estimated by both procedures are insignificant (F(exp)
7 CFR 91.23 - Analytical methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
§ 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...
7 CFR 91.23 - Analytical methods.
Code of Federal Regulations, 2011 CFR
2011-01-01
§ 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...
7 CFR 91.23 - Analytical methods.
Code of Federal Regulations, 2013 CFR
2013-01-01
§ 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...
Code of Federal Regulations, 2012 CFR
2012-07-01
§ 1066.101 Overview. (a) This subpart addresses equipment related to emission testing, as well as test fuels and analytical gases. This section addresses emission sampling and analytical equipment, test fuels, and analytical gases. (b) The provisions of 40 CFR part 1065...
Code of Federal Regulations, 2013 CFR
2013-07-01
§ 1066.101 Overview. (a) This subpart addresses equipment related to emission testing, as well as test fuels and analytical gases. This section addresses emission sampling and analytical equipment, test fuels, and analytical gases. (b) The provisions of 40 CFR part 1065...
ERIC Educational Resources Information Center
Ember, Lois R.
1977-01-01
The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)
Casado, Natalia; Morante-Zarcero, Sonia; Pérez-Quintanilla, Damián; Sierra, Isabel
2016-08-12
A quick, sensitive and selective analytical reversed-phase multi-residue method using ultra-high performance liquid chromatography coupled to an ion-trap mass spectrometry detector (UHPLC-IT-MS/MS) operating in both positive and negative ion mode was developed for the simultaneous determination of 23 veterinary drug residues (β-blockers, β-agonists and non-steroidal anti-inflammatory drugs (NSAIDs)) in meat samples. The sample treatment involved a liquid-solid extraction followed by a solid-phase extraction (SPE) procedure. SBA-15 type mesoporous silica was synthesized and modified with octadecylsilane, and the resulting hybrid material (denoted as SBA-15-C18) was applied and evaluated as SPE sorbent in the purification of samples. The materials were comprehensively characterized, and they showed a high surface area, high pore volume and a homogeneous distribution of the pores. The chromatographic conditions and extraction procedure were optimized, and the method was validated according to Commission Decision 2002/657/EC. The method detection limits (MDLs) and the method quantification limits (MQLs) were determined for all the analytes in meat samples and found to range between 0.01-18.75μg/kg and 0.02-62.50μg/kg, respectively. Recoveries for 15 of the target analytes ranged from 71 to 98%. In addition, for comparative purposes SBA-15-C18 was evaluated against commercial C18 amorphous silica. Results revealed that SBA-15-C18 was clearly more successful in the multi-residue extraction of the 23 mentioned analytes, with higher recovery values. The method was successfully tested to analyze prepacked preparations of minced bovine meat. Traces of propranolol, ketoprofen and diclofenac were detected in some samples. Copyright © 2016 Elsevier B.V. All rights reserved.
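The recoveries and repeatability figures quoted above are the kind of validation statistics typically derived from spiked replicates. The Python sketch below shows that generic calculation under illustrative numbers; it is not the authors' 2002/657/EC workflow or data.

# Hedged sketch: recovery and repeatability (RSD) from spiked replicates.
# Spiking level and measured values are hypothetical.
import numpy as np

spiked_level = 50.0                                  # ug/kg added to blank meat
measured = np.array([44.1, 46.0, 43.2, 47.5, 45.3])  # ug/kg found in replicates

recovery_pct = 100.0 * measured.mean() / spiked_level
rsd_pct = 100.0 * measured.std(ddof=1) / measured.mean()
print(f"recovery = {recovery_pct:.1f}%, repeatability RSD = {rsd_pct:.1f}%")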
2012-01-01
Background Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, in practice several other saccharide materials can be encountered in paint samples, not only as major binders, but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples; however, the chromatographic profiles are often so different that it is impossible to compare them and reliably identify the paint binder. Results This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works-of-art. The research presented here evaluates the influence of the analytical procedure used, and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. Conclusions The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database presents data also from those materials that only contain a minor saccharide fraction. This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of identifying polysaccharide binders in paint samples. PMID:23050842
Lluveras-Tenorio, Anna; Mazurek, Joy; Restivo, Annalaura; Colombini, Maria Perla; Bonaduce, Ilaria
2012-10-10
Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, in practice several other saccharide materials can be encountered in paint samples, not only as major binders, but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples; however, the chromatographic profiles are often so different that it is impossible to compare them and reliably identify the paint binder. This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works-of-art. The research presented here evaluates the influence of the analytical procedure used, and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database presents data also from those materials that only contain a minor saccharide fraction. This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of identifying polysaccharide binders in paint samples.
7 CFR 90.2 - General terms defined.
Code of Federal Regulations, 2011 CFR
2011-01-01
... agency, or other agency, organization or person that defines in the general terms the basis on which the... analytical data using proficiency check sample or analyte recovery techniques. In addition, the certainty.... Quality control. The system of close examination of the critical details of an analytical procedure in...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, Michael G.
This quality assurance project plan describes the technical requirements and quality assurance activities of the environmental data collection/analysis operations to close Central Facilities Area Sewage Treatment Plant Lagoon 3 and the land application area. It describes the organization and persons involved, the data quality objectives, the analytical procedures, and the specific quality control measures to be employed. All quality assurance project plan activities are implemented to determine whether the results of the sampling and monitoring performed are of the right type, quantity, and quality to satisfy the requirements for closing Lagoon 3 and the land application area.
The Shock and Vibration Bulletin. Part 3. Analytical Methods, Dynamic Analysis, Vehicle Systems
1981-05-01
[OCR residue from scanned figures and equations: Fig. 3 (a quantity plotted against stringer spacing), Fig. 4 (variation along the panel edge), Equation (24) defining the coefficient bpp', and Table II (values of bpp' in Equation (24)) are not recoverable from the scan.] The assembling of equations for a gun dynamics problem is more involved. The basic procedures
Donor-impurity-related optical response and electron Raman scattering in GaAs cone-like quantum dots
NASA Astrophysics Data System (ADS)
Gil-Corrales, A.; Morales, A. L.; Restrepo, R. L.; Mora-Ramos, M. E.; Duque, C. A.
2017-02-01
The donor-impurity-related optical absorption, relative refractive index changes, and Raman scattering in GaAs cone-like quantum dots are theoretically investigated. Calculations are performed within the effective mass and parabolic band approximations, using the variational procedure to include the electron-impurity correlation effects. The study involves 1s-like, 2px-like, and 2pz-like states. The conical structure is chosen in such a way that the cone height is large enough in comparison with the base radius, thus allowing the use of a quasi-analytic solution for the uncorrelated Schrödinger-like electron states.
Numerical solution of potential flow about arbitrary 2-dimensional multiple bodies
NASA Technical Reports Server (NTRS)
Thompson, J. F.; Thames, F. C.
1982-01-01
A procedure for the finite-difference numerical solution of the lifting potential flow about any number of arbitrarily shaped bodies is given. The solution is based on a technique of automatic numerical generation of a curvilinear coordinate system having coordinate lines coincident with the contours of all bodies in the field, regardless of their shapes and number. The effects of all numerical parameters involved are analyzed and appropriate values are recommended. Comparisons with analytic solutions for single Karman-Trefftz airfoils and a circular cylinder pair show excellent agreement. The technique of application of the boundary-fitted coordinate systems to the numerical solution of partial differential equations is illustrated.
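The analytic benchmarks mentioned (single airfoils and a circular cylinder) have well-known closed forms; the Python sketch below evaluates the textbook single-cylinder case, where the surface speed is 2*U*sin(theta) and Cp = 1 - 4*sin(theta)^2. It is offered only as an example of the type of analytic reference used for such comparisons, not as the paper's cylinder-pair or Karman-Trefftz solutions.

# Incompressible potential flow past a single circular cylinder: analytic
# surface velocity and pressure coefficient (non-lifting case).
import numpy as np

U = 1.0
theta = np.linspace(0.0, np.pi, 7)      # points around the upper surface
v_surf = 2.0 * U * np.sin(theta)        # analytic surface velocity
cp = 1.0 - (v_surf / U) ** 2            # pressure coefficient, 1 - 4 sin^2(theta)
for t, c in zip(np.degrees(theta), cp):
    print(f"theta = {t:5.1f} deg  Cp = {c:6.3f}")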
Analytical and numerical analysis of frictional damage in quasi brittle materials
NASA Astrophysics Data System (ADS)
Zhu, Q. Z.; Zhao, L. Y.; Shao, J. F.
2016-07-01
Frictional sliding and crack growth are two main dissipation processes in quasi-brittle materials. The frictional sliding along closed cracks is the origin of macroscopic plastic deformation, while crack growth induces material damage. The main difficulty in modeling is accounting for the inherent coupling between these two processes. Various models and associated numerical algorithms have been proposed, but so far there are no analytical solutions, even for simple loading paths, with which to validate such algorithms. In this paper, we first present a micro-mechanical model taking into account the damage-friction coupling for a large class of quasi-brittle materials. The model is formulated by combining a linear homogenization procedure with the Mori-Tanaka scheme and the irreversible thermodynamics framework. As an original contribution, a series of analytical solutions of stress-strain relations are developed for various loading paths. Based on the micro-mechanical model, two numerical integration algorithms are exploited. The first one involves a coupled friction/damage correction scheme, which is consistent with the coupling nature of the constitutive model. The second one contains a friction/damage decoupling scheme with two consecutive steps: the friction correction followed by the damage correction. With the analytical solutions as reference results, the two algorithms are assessed through a series of numerical tests. It is found that the decoupling correction scheme is efficient to guarantee a systematic numerical convergence.
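The "friction correction" step in such integration algorithms has the familiar predictor-corrector (return-mapping) structure. The Python sketch below is a deliberately simplified one-dimensional analogue with a constant sliding threshold; it only illustrates that structure and is not the paper's coupled damage-friction micromechanics (the modulus and threshold values are arbitrary).

# Hedged, highly simplified 1D friction-correction (return-mapping) step:
# elastic trial stress, check a Coulomb-like threshold, correct by sliding.
def friction_return_map(eps, eps_s, E=10.0e3, tau_c=50.0):
    """One strain-driven step: returns (stress, updated sliding strain)."""
    tau_trial = E * (eps - eps_s)            # elastic predictor
    if abs(tau_trial) <= tau_c:              # stick: no correction needed
        return tau_trial, eps_s
    d_slide = (abs(tau_trial) - tau_c) / E   # frictional corrector (sliding increment)
    eps_s += d_slide * (1 if tau_trial > 0 else -1)
    return E * (eps - eps_s), eps_s

stress, eps_s = 0.0, 0.0
for eps in [0.002, 0.004, 0.006, 0.008]:     # imposed strain history
    stress, eps_s = friction_return_map(eps, eps_s)
    print(f"eps = {eps:.3f}  stress = {stress:.1f}  sliding strain = {eps_s:.4f}")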
Ma, Jian; Yang, Bo; Byrne, Robert H
2012-06-15
Determination of chromate at low concentration levels in drinking water is an important analytical objective for both human health and environmental science. Here we report the use of solid phase extraction (SPE) in combination with a custom-made portable light-emitting diode (LED) spectrophotometer to achieve detection of chromate in the field at nanomolar levels. The measurement chemistry is based on a highly selective reaction between 1,5-diphenylcarbazide (DPC) and chromate under acidic conditions. The Cr-DPC complex formed in the reaction can be extracted on a commercial C18 SPE cartridge. Concentrated Cr-DPC is subsequently eluted with methanol and detected by spectrophotometry. Optimization of analytical conditions involved investigation of reagent compositions and concentrations, eluent type, flow rate (sample loading), sample volume, and stability of the SPE cartridge. Under optimized conditions, detection limits are on the order of 3 nM. Only 50 mL of sample is required for an analysis, and total analysis time is around 10 min. The targeted analytical range of 0-500 nM can be easily extended by changing the sample volume. Compared to previous SPE-based spectrophotometric methods, this analytical procedure offers the benefits of improved sensitivity, reduced sample consumption, shorter analysis time, greater operational convenience, and lower cost. Copyright © 2012 Elsevier B.V. All rights reserved.
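The sensitivity gain from combining SPE preconcentration with spectrophotometric detection can be illustrated with a simple Beer-Lambert calculation. In the Python sketch below the molar absorptivity, path length, absorbance, and volumes are illustrative assumptions, not the paper's calibration; the point is only how the enrichment factor (sample volume over eluate volume) pushes the measurable level into the nanomolar range.

# Hedged sketch: Beer-Lambert estimate with an SPE enrichment factor.
epsilon = 4.0e4        # L mol^-1 cm^-1, assumed molar absorptivity of the Cr-DPC complex
path_cm = 1.0          # assumed optical path length of the LED photometer cell
A = 0.012              # measured absorbance (hypothetical)
v_sample_mL, v_eluate_mL = 50.0, 1.0

c_eluate = A / (epsilon * path_cm)                 # mol/L in the methanol eluate
c_sample = c_eluate / (v_sample_mL / v_eluate_mL)  # mol/L in the original water sample
print(f"chromate in sample ~ {c_sample * 1e9:.0f} nmol/L")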
Analytical electron microscopy in mineralogy; exsolved phases in pyroxenes
Nord, G.L.
1982-01-01
Analytical scanning transmission electron microscopy has been successfully used to characterize the structure and composition of lamellar exsolution products in pyroxenes. At operating voltages of 100 and 200 keV, microanalytical techniques of X-ray energy analysis, convergent-beam electron diffraction, and lattice imaging have been used to chemically and structurally characterize exsolution lamellae only a few unit cells wide. Quantitative X-ray energy analysis using ratios of peak intensities has been adopted for the U.S. Geological Survey AEM in order to study the compositions of exsolved phases and changes in compositional profiles as a function of time and temperature. The quantitative analysis procedure involves 1) removal of instrument-induced background, 2) reduction of contamination, and 3) measurement of correction factors obtained from a wide range of standard compositions. The peak-ratio technique requires that the specimen thickness at the point of analysis be thin enough to make absorption corrections unnecessary (i.e., to satisfy the "thin-foil criteria"). In pyroxenes, the calculated "maximum thicknesses" range from 130 to 1400 nm for the ratios Mg/Si, Fe/Si, and Ca/Si; these "maximum thicknesses" have been contoured in pyroxene composition space as a guide during analysis. Analytical spatial resolutions of 50-100 nm have been achieved in AEM at 200 keV from the composition-profile studies, and analytical reproducibility in AEM from homogeneous pyroxene standards is ± 1.5 mol% endmember. © 1982.
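The thin-foil peak-ratio approach described here is usually written as a Cliff-Lorimer-type relation, C_A/C_Si = k_A,Si * (I_A/I_Si), with the k-factors measured from standards. The Python sketch below shows that ratio arithmetic with hypothetical k-factors and intensities; the values are not the USGS AEM correction factors.

# Hedged sketch of peak-ratio (Cliff-Lorimer-type) quantification against Si,
# normalized over the analyzed elements only. All numbers are hypothetical.
k = {"Mg": 1.1, "Fe": 1.4, "Ca": 1.0}                 # assumed k-factors vs. Si
I = {"Mg": 3200.0, "Fe": 2100.0, "Ca": 2600.0, "Si": 9000.0}   # measured peak intensities

ratios = {el: k[el] * I[el] / I["Si"] for el in k}     # C_el / C_Si
total = 1.0 + sum(ratios.values())                     # include Si itself (C_Si/C_Si = 1)
comp = {el: r / total for el, r in ratios.items()}
comp["Si"] = 1.0 / total
print({el: round(100 * c, 1) for el, c in comp.items()})   # relative composition, %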
Gbylik-Sikorska, Malgorzata; Sniegocki, Tomasz; Posyniak, Andrzej
2015-05-15
The original analytical method for the simultaneous determination and confirmation of neonicotinoid insecticides (imidacloprid, clothianidin, acetamiprid, thiamethoxam, thiacloprid, nitenpyram, dinotefuran) and some of their metabolites (imidacloprid guanidine, imidacloprid olefin, imidacloprid urea, desnitro-imidacloprid hydrochloride, thiacloprid-amid and acetamiprid-N-desmethyl) in honey bee and honey was developed. Preparation of honey bee samples involves extraction with a mixture of acetonitrile and ethyl acetate followed by clean-up using Sep-Pak Alumina N Plus Long cartridges. Honey samples were dissolved in a 1% mixture of acetonitrile and ethyl acetate with the addition of TEA, and the extracts were then cleaned up with Strata X-CW cartridges. The identity of analytes was confirmed using liquid chromatography tandem mass spectrometry. All compounds were separated on a Luna C18 column with gradient elution. The whole procedure was validated according to the requirements of SANCO 12571/2013. The average recoveries of the analytes ranged from 85.3% to 112.0%, repeatabilities were in the range of 2.8-11.2%, within-laboratory reproducibility was in the range of 3.3-14.6%, and the limits of quantitation were in the range of 0.1-0.5μgkg(-1), depending on the analyte and matrix. The validated method was successfully applied for the determination of clothianidin, imidacloprid and imidacloprid urea in real incurred honey bee samples and clothianidin in honey. Copyright © 2015 Elsevier B.V. All rights reserved.
Removal of uranium from soil sample digests for ICP-OES analysis of trace metals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foust, R.D. Jr.; Bidabad, M.
1996-10-01
An analytical procedure has been developed to quantitatively remove uranium from soil sample digests, permitting ICP-OES analysis of trace metals. The procedure involves digesting a soil sample with standard procedures (EPA SW-846, Method 3050), and passing the sample digestate through a commercially available resin (U/TEVA·Spec, Eichrom Industries, Inc.) containing diarryl amylphosphonate as the stationary phase. Quantitative removal of uranium was achieved with soil samples containing up to 60% uranium, and percent recoveries averaged better than 85% for 9 of the 10 metals evaluated (Ag, As, Cd, Cr, Cu, Ni, Pb, Se and Tl). The U/TEVA·Spec column was regenerated by washing with 200 mL of a 0.01 M oxalic acid/0.02 M nitric acid solution, permitting re-use of the column. GFAAS analysis of a sample spiked with 56.5% uranium, after treatment of the digestate with a U/TEVA·Spec resin column, resulted in percent recoveries of 97% or better for all target metals.
A rapid method for estimation of Pu-isotopes in urine samples using high volume centrifuge.
Kumar, Ranjeet; Rao, D D; Dubla, Rupali; Yadav, J R
2017-07-01
The conventional radio-analytical technique used for estimation of Pu-isotopes in urine samples involves anion exchange/TEVA column separation followed by alpha spectrometry. This sequence of analysis takes nearly 3-4 days to complete. Excreta analysis results are often required urgently, particularly in repeat and incident/emergency situations. Therefore, there is a need to reduce the analysis time for the estimation of Pu-isotopes in bioassay samples. This paper gives the details of standardization of a rapid method for estimation of Pu-isotopes in urine samples using a multi-purpose centrifuge and TEVA resin followed by alpha spectrometry. The rapid method involves oxidation of urine samples, co-precipitation of plutonium along with calcium phosphate followed by sample preparation using a high volume centrifuge and separation of Pu using TEVA resin. The Pu fraction was electrodeposited and the activity estimated by alpha spectrometry using 236Pu tracer recovery. Ten routine urine samples of radiation workers were analyzed, and consistent radiochemical tracer recovery was obtained in the range 47-88%, with a mean and standard deviation of 64.4% and 11.3%, respectively. With this newly standardized technique, the whole analytical procedure is completed within 9 h (one working day). Copyright © 2017 Elsevier Ltd. All rights reserved.
Leipert, Jan; Treitz, Christian; Leippe, Matthias; Tholey, Andreas
2017-12-01
N-acyl homoserine lactones (AHL) are small signal molecules involved in the quorum sensing of many gram-negative bacteria, and play an important role in biofilm formation and pathogenesis. Present analytical methods for identification and quantification of AHL require time-consuming sample preparation steps and are hampered by the lack of appropriate standards. Aiming at a fast and straightforward method for AHL analytics, we investigated the applicability of matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). Suitable MALDI matrices, including crystalline and ionic liquid matrices, were tested and the fragmentation of different AHL in collision-induced dissociation MS/MS was studied, providing information about characteristic marker fragment ions. Employing small-scale synthesis protocols, we established a versatile and cost-efficient procedure for fast generation of isotope-labeled AHL standards, which can be used without extensive purification and yielded accurate standard curves. Quantitative analysis was possible in the low picomolar range, with lower limits of quantification ranging from 1 to 5 pmol for different AHL. The developed methodology was successfully applied in a quantitative MALDI MS analysis of low-volume culture supernatants of Pseudomonas aeruginosa.
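Quantification against an isotope-labeled internal standard of the kind described here typically reduces to a linear calibration of the analyte-to-standard signal ratio versus the spiked amount. The Python sketch below shows that generic fit-and-invert step with hypothetical numbers; it is not the authors' calibration data.

# Hedged sketch: standard curve of (analyte signal / isotope-labeled IS signal)
# vs. amount, then inversion for an unknown sample. Values are hypothetical.
import numpy as np

amount_pmol = np.array([1.0, 2.5, 5.0, 10.0, 25.0])       # calibration levels
ratio = np.array([0.11, 0.27, 0.52, 1.05, 2.58])          # analyte/IS peak ratios

slope, intercept = np.polyfit(amount_pmol, ratio, 1)       # linear standard curve
unknown_ratio = 0.80
unknown_pmol = (unknown_ratio - intercept) / slope
print(f"slope = {slope:.3f}/pmol, unknown ~ {unknown_pmol:.1f} pmol")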
NASA Astrophysics Data System (ADS)
Bontempi, E.; Benedetti, D.; Massardi, A.; Zacco, A.; Borgese, L.; Depero, L. E.
2008-07-01
Europe has a very rich and diversified cultural heritage of art works, including buildings, monuments and objects of all sizes, involving a great variety of materials. The continuous discovery of new art works raises the problem of their authentication. Advanced analytical techniques can be fundamental to understanding the way of life, the culture and the technical and intellectual know-how of the artists. Indeed, the authentication of an art work involves the identification of the materials used, their production techniques and the procedures used to realize the work. It is possible to know the origin and provenance of materials, including the location of the natural sources. Advanced analytical techniques also help one to understand degradation processes, corrosion, weathering, and preservation-conservation protocols. In this paper we present a painting attributed to Domenico Ghirlandaio. Ghirlandaio is a well-known fifteenth-century artist who contributed to the apprenticeship of Michelangelo Buonarroti. The study of the pigments used in this painting, which belongs to a private collection, has been supported mainly by means of laboratory two-dimensional X-ray microdiffraction (μXRD2). The possibility of obtaining information not only on the phases but also on the microstructure allows one to draw interesting conclusions and to obtain evidence of the painter's style and intention.
The Hispanic Stress Inventory--Adolescent Version: a culturally informed psychosocial assessment.
Cervantes, Richard C; Fisher, Dennis G; Córdova, David; Napper, Lucy E
2012-03-01
A 2-phase study was conducted to develop a culturally informed measure of psychosocial stress for adolescents: the Hispanic Stress Inventory--Adolescent Version (HSI-A). Phase 1 involved item development through the collection of open-ended focus group interview data (n = 170) from a heterogeneous sample of Hispanic youths residing in the southwest and northeast United States. In Phase 2, we examined the psychometric properties of the HSI-A (n = 1,651), which involved the use of factor analytic procedures to determine the underlying scale structure of the HSI-A for foreign-born and U.S.-born participants in an aggregated analytic approach. An 8-factor solution was established, with factors that include Family Economic Stress, Acculturation-Gap Stress, Culture and Educational Stress, Immigration-Related Stress, Discrimination Stress, Family Immigration Stress, Community and Gang-Related Stress, and Family and Drug-Related Stress. Concurrent, related validity estimates were calculated to determine relations between HSI-A and other measures of child psychopathology and behavioral and emotional disturbances. HSI-A total stress appraisal scores were significantly correlated with both the Children's Depression Inventory and the Youth Self Report (p < .001). Reliability estimates for the HSI-A were conducted, and they yielded high reliability coefficients for most factor subscales, with the HSI-A total stress appraisal score reliability alpha at .92.
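The reliability coefficient reported for the HSI-A (alpha = .92 for the total score) is Cronbach's alpha, which can be computed directly from an item-score matrix. The Python sketch below shows the standard formula with a hypothetical response matrix, not the study's data.

# Hedged sketch: Cronbach's alpha for a subscale.
# Rows = respondents, columns = items; the matrix is hypothetical.
import numpy as np

X = np.array([[3, 4, 2, 5],
              [2, 2, 1, 3],
              [4, 5, 4, 5],
              [1, 2, 2, 2],
              [3, 3, 2, 4]], dtype=float)

k = X.shape[1]
item_var  = X.var(axis=0, ddof=1).sum()     # sum of item variances
total_var = X.sum(axis=1).var(ddof=1)       # variance of total scores
alpha = (k / (k - 1.0)) * (1.0 - item_var / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")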
The Hispanic Stress Inventory-Adolescent Version: A Culturally Informed Psychosocial Assessment
Cervantes, Richard C.; Fisher, Dennis G.; Córdova, David; Napper, Lucy
2012-01-01
A 2-phase study was conducted to develop a culturally informed measure of psychosocial stress for adolescents, the Hispanic Stress Inventory-Adolescent Version (HSI-A). Phase 1 involved item development through the collection of open-ended focus group interview data (n=170) from a heterogeneous sample of Hispanic youth residing in the southwest and northeast United States. Phase 2 examined the psychometric properties of the HSI-A (n=1651), using factor analytic procedures to determine the underlying scale structure of the HSI-A for foreign-born and U.S.-born participants in an aggregated analytic approach. An eight-factor solution was established, with factors that include Family Economic Stress, Acculturation Gap Stress, Culture and Educational Stress, Immigration Related Stress, Discrimination Stress, Family Immigration Stress, Community and Gang Violence Stress and Family Drug Related Stress. Concurrent related validity estimates were calculated to determine relationships between the HSI-A and other measures of child psychopathology and behavioral and emotional disturbances. HSI-A Total Stress Appraisal scores were significantly correlated with both the CDI and the YSR (p<.001 for both). Reliability estimates for the HSI-A were conducted and yielded high reliability coefficients for almost all factor sub-scales, with HSI-A Total Stress Appraisal score reliability at alpha=.92. PMID:21942232
NASA Astrophysics Data System (ADS)
Leipert, Jan; Treitz, Christian; Leippe, Matthias; Tholey, Andreas
2017-12-01
N-acyl homoserine lactones (AHL) are small signal molecules involved in the quorum sensing of many gram-negative bacteria, and play an important role in biofilm formation and pathogenesis. Present analytical methods for identification and quantification of AHL require time-consuming sample preparation steps and are hampered by the lack of appropriate standards. Aiming at a fast and straightforward method for AHL analytics, we investigated the applicability of matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). Suitable MALDI matrices, including crystalline and ionic liquid matrices, were tested and the fragmentation of different AHL in collision-induced dissociation MS/MS was studied, providing information about characteristic marker fragment ions. Employing small-scale synthesis protocols, we established a versatile and cost-efficient procedure for fast generation of isotope-labeled AHL standards, which can be used without extensive purification and yielded accurate standard curves. Quantitative analysis was possible in the low picomolar range, with lower limits of quantification ranging from 1 to 5 pmol for different AHL. The developed methodology was successfully applied in a quantitative MALDI MS analysis of low-volume culture supernatants of Pseudomonas aeruginosa.
Trends in Analytical Scale Separations.
ERIC Educational Resources Information Center
Jorgenson, James W.
1984-01-01
Discusses recent developments in the instrumentation and practice of analytical scale operations. Emphasizes detection devices and procedures in gas chromatography, liquid chromatography, electrophoresis, supercritical fluid chromatography, and field-flow fractionation. (JN)
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method... (analytical method) provided that the chemistry of the method or the determinative technique is not changed... prevent efficient recovery of organic pollutants and prevent the method from meeting QC requirements, the...
A New Project-Based Lab for Undergraduate Environmental and Analytical Chemistry
ERIC Educational Resources Information Center
Adami, Gianpiero
2006-01-01
A new project-based lab was developed for third year undergraduate chemistry students based on real world applications. The experience suggests that the total analytical procedure (TAP) project offers a stimulating alternative for delivering science skills and developing a greater interest for analytical chemistry and environmental sciences and…
Bijttebier, Sebastiaan; D'Hondt, Els; Noten, Bart; Hermans, Nina; Apers, Sandra; Voorspoels, Stefan
2014-11-15
Alkaline saponification is often used to remove interfering chlorophylls and lipids during carotenoid analysis. However, saponification also hydrolyses esterified carotenoids and is known to induce artifacts. To avoid carotenoid artifact formation during saponification, Larsen and Christensen (2005) developed a gentler and simpler analytical clean-up procedure involving the use of a strongly basic resin (Ambersep 900 OH). They hypothesised a saponification mechanism based on their Liquid Chromatography-Photodiode Array (LC-PDA) data. In the present study, we show with LC-PDA and accurate-mass mass spectrometry that the main chlorophyll removal mechanism is not based on saponification, apolar adsorption or anion exchange, but most probably on an adsorption mechanism driven by H-bonds and dipole-dipole interactions. We showed experimentally that esterified carotenoids and glycerolipids were not removed, indicating a much more selective mechanism than initially hypothesised. This opens new research opportunities towards a much wider scope of applications (e.g. the refinement of oils rich in phytochemical content). Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Madsen, C. A.; Kragh-Poulsen, J.-C.; Thage, K. J.; Andreassen, M. J.
2017-12-01
The monopile foundation is the dominant solution for supporting wind turbines in offshore wind farms. It is normally grouted to the transition piece, which connects the foundation to the turbine. Currently, the bolted steel ring flange connection is being investigated as an alternative. The monopile-transition piece connection has specific problems, such as out-of-verticality and installation damage from driving the monopile into the seabed, and it is not fully known how to design for these. This paper presents the status of the ongoing development work and an estimate of what still needs to be covered in order to use the connection in practice. This involves the presentation of an analytical and a non-linear FE analysis procedure for the monopile-transition piece connection composed of two L-flanges connected with preloaded bolts. The connection is verified for ultimate and fatigue limit states based on an integrated load simulation carried out by the turbine manufacturer.
Ordoñez, Edgar Y; Rodil, Rosario; Quintana, José Benito; Cela, Rafael
2015-02-15
A new analytical procedure involving the use of water and a low percentage of ethanol combined with high-temperature liquid chromatography-tandem mass spectrometry has been developed for the determination of nine high-intensity sweeteners in a variety of drink samples. The method permitted the analysis in 23 min (including column reequilibration) while consuming only 0.85 mL of a green organic solvent (ethanol). This methodology provided limits of detection (after 50-fold dilution) in the 0.05-10 mg/L range, with recoveries (obtained from five different types of beverages) being in the 86-110% range and relative standard deviation values lower than 12%. Finally, the method was applied to 25 different samples purchased in Spain, where acesulfame and sucralose were the most frequently detected analytes (>50% of the samples) and cyclamate was found above the legal limit set by the European Union in one sample and at the regulatory boundary in three others. Copyright © 2014 Elsevier Ltd. All rights reserved.
Rapid determination of minoxidil in human plasma using ion-pair HPLC.
Zarghi, A; Shafaati, A; Foroutan, S M; Khoddam, A
2004-10-29
A rapid, simple and sensitive ion-pair high-performance liquid chromatography (HPLC) method has been developed for quantification of minoxidil in plasma. The assay enables the measurement of minoxidil for therapeutic drug monitoring with a minimum detectable limit of 0.5 ng ml(-1). The method involves a simple, one-step extraction procedure, and analytical recovery was complete. The separation was performed on an analytical 150 x 4.6 mm i.d. μBondapak C18 column. The wavelength was set at 281 nm. The mobile phase was a mixture of 0.01 M sodium dihydrogen phosphate buffer and acetonitrile (60:40, v/v) containing 2.5 mM sodium dodecyl sulphate adjusted to pH 3.5, at a flow rate of 1 ml/min. The column temperature was set at 50 degrees C. The calibration curve was linear over the concentration range 2-100 ng ml(-1). The coefficients of variation for inter-day and intra-day assays were found to be less than 8%.
Sell, Bartosz; Sniegocki, Tomasz; Zmudzki, Jan; Posyniak, Andrzej
2018-04-01
Reported here is a new analytical multiclass method based on the QuEChERS technique, which has proven to be effective in diagnosing fatal poisoning cases in animals. This method has been developed for the determination of analytes in liver samples comprising rodenticides, carbamate and organophosphorus pesticides, coccidiostats and mycotoxins. The procedure entails addition of acetonitrile and sodium acetate to 2 g of homogenized liver sample. The mixture was shaken intensively and centrifuged for phase separation, after which the organic phase was transferred into a tube containing sorbents (PSA and C18) and magnesium sulfate; the tube was then centrifuged, and the supernatant was filtered and analyzed by liquid chromatography tandem mass spectrometry. A validation of the procedure was performed. Repeatability coefficients of variation of <15% were achieved for most of the analyzed substances. Analytical conditions allowed for a successful separation of a variety of poisons, with typical screening detection limits at ≤10 μg/kg levels. The method was used to investigate more than 100 animal poisoning incidents and proved useful in animal forensic toxicology cases.
Analytical study of comet nucleus samples
NASA Technical Reports Server (NTRS)
Albee, A. L.
1989-01-01
Analytical procedures for studying and handling frozen (130 K) core samples of comet nuclei are discussed. These methods include neutron activation analysis, X-ray fluorescence analysis and high-resolution mass spectrometry.
The Capillary Flow Experiments Aboard the International Space Station: Increments 9-15
NASA Technical Reports Server (NTRS)
Jenson, Ryan M.; Weislogel, Mark M.; Tavan, Noel T.; Chen, Yongkang; Semerjian, Ben; Bunnell, Charles T.; Collicott, Steven H.; Klatte, Jorg; dreyer, Michael E.
2009-01-01
This report provides a summary of the experimental, analytical, and numerical results of the Capillary Flow Experiment (CFE) performed aboard the International Space Station (ISS). The experiments were conducted in space from Increment 9 through Increment 16, beginning in August 2004 and ending in December 2007. Both primary and extra science experiments were conducted during 19 operations performed by 7 astronauts, including M. Fincke, W. McArthur, J. Williams, S. Williams, M. Lopez-Alegria, C. Anderson, and P. Whitson. CFE consists of 6 approximately 1 to 2 kg handheld experiment units designed to investigate a selection of capillary phenomena of fundamental and applied importance, such as large length scale contact line dynamics (CFE-Contact Line), critical wetting in discontinuous structures (CFE-Vane Gap), and capillary flows and passive phase separations in complex containers (CFE-Interior Corner Flow). Highly quantitative video from the simply performed flight experiments provides data helpful in benchmarking numerical methods, confirming theoretical models, and guiding new model development. In an extensive executive summary, a brief history of the experiment is reviewed before introducing the science investigated. A selection of experimental results and comparisons with both analytic and numerical predictions is given. The subsequent chapters provide additional details of the experimental and analytical methods developed and employed. These include current presentations of the state of the data reduction, which we anticipate will continue throughout the year and culminate in several more publications. An extensive appendix is used to provide support material such as an experiment history, dissemination items to date (CFE publications, etc.), detailed design drawings, and crew procedures. Despite the simple nature of the experiments and procedures, many of the experimental results may be practically employed to enhance the design of spacecraft engineering systems involving capillary interface dynamics.
Analytical characterization of seventeen ring-substituted N,N-diallyltryptamines
Brandt, Simon D.; Kavanagh, Pierce V.; Dowling, Geraldine; Talbot, Brian; Westphal, Folker; Meyer, Markus R.; Maurer, Hans H.; Halberstadt, Adam L.
2017-01-01
Many N,N-dialkylated tryptamines show psychoactive properties in humans and the number of derivatives involved in multidisciplinary areas of research has grown over the last few decades. Whereas some derivatives form the basis of a range of medicinal products, others are predominantly encountered as recreational drugs, and in some cases, the areas of therapeutic and recreational use can overlap. In recent years, 5-methoxy-N,N-diallyltryptamine (5-MeO-DALT) has appeared as a new psychoactive substance (NPS) and ‘research chemical’, whereas 4-acetoxy-DALT and the ring-unsubstituted DALT have only been detected very recently. Strategies pursued in the authors’ laboratories included the preparation and biological evaluation of previously unreported N,N-diallyltryptamines (DALTs). This report describes the analytical characterization of seventeen DALTs. Fifteen DALTs were prepared by a microwave-accelerated Speeter and Anthony procedure, following protocols developed previously in the authors’ laboratories. In addition to DALT, the substances included in this study were 2-phenyl-, 4-acetoxy-, 4-hydroxy-, 4,5-ethylenedioxy-, 5-methyl-, 5-methoxy-, 5-methoxy-2-methyl-, 5-ethoxy-, 5-fluoro-, 5-fluoro-2-methyl-, 5-chloro-, 5-bromo-, 5,6-methylenedioxy-, 6-fluoro-, 7-methyl-, and 7-ethyl-DALT. The DALTs were characterized by nuclear magnetic resonance spectroscopy (NMR), gas chromatography (GC) quadrupole and ion trap (EI/CI) mass spectrometry (MS), low and high mass accuracy MS/MS, ultraviolet diode array detection and GC solid-state infrared analysis. A comprehensive collection of spectral data was obtained and is provided to research communities who face the challenge of encountering newly emerging substances for which analytical data are not available. These data are also relevant to researchers who might wish to explore the clinical and non-clinical uses of these substances. PMID:27100373
Berlioz-Barbier, Alexandra; Buleté, Audrey; Faburé, Juliette; Garric, Jeanne; Cren-Olivé, Cécile; Vulliet, Emmanuelle
2014-11-07
Aquatic ecosystems are continuously contaminated by agricultural and industrial sources. Although the consequences of this pollution are gradually becoming visible, their potential impacts on aquatic ecosystems are poorly known, particularly regarding the risk of bioaccumulation in different trophic levels. To establish a causality relationship between bioaccumulation and disease, experiments on biotic matrices must be performed. In this context, a multi-residue method for the analysis of 35 emerging pollutants in three benthic invertebrates (Potamopyrgus antipodarum, Gammarus fossarum, and Chironomus riparius) has been developed. Because the variation in response of each individual must be taken into account in ecotoxicological studies, the entire analytical chain was miniaturised, thereby reducing the required sample size to a minimum of one individual and scaling the method accordingly. A new extraction strategy based on a modified, optimised and miniaturised "QuEChERS" approach is reported. The procedure involves salting-out liquid-liquid extraction of approximately 10-20mg of matrix followed by nano-liquid chromatography-nanoelectrospray ionisation coupled with tandem mass spectrometry. The validated analytical procedure exhibited recoveries between 40 and 98% for all the target compounds and enabled the determination of pollutants on an individual scale at ng g(-1) concentrations. The method was subsequently applied to determine the levels of target analytes in several encaged organisms which were exposed upstream and downstream of an effluent discharge. The results highlighted a bioaccumulation of certain targeted emerging pollutants in three freshwater invertebrates, as well as inter-species differences. 18 out of 35 compounds were detected and eight were quantified. The highest concentrations were measured for ibuprofen in G. fossarum, reaching up to 105 ng g(-1). Copyright © 2014 Elsevier B.V. All rights reserved.
Subsurface Stress Fields in FCC Single Crystal Anisotropic Contacts
NASA Technical Reports Server (NTRS)
Arakere, Nagaraj K.; Knudsen, Erik; Swanson, Gregory R.; Duke, Gregory; Ham-Battista, Gilda
2004-01-01
Single crystal superalloy turbine blades used in high pressure turbomachinery are subject to conditions of high temperature, triaxial steady and alternating stresses, fretting stresses in the blade attachment and damper contact locations, and exposure to high-pressure hydrogen. The blades are also subjected to extreme variations in temperature during start-up and shutdown transients. The most prevalent high cycle fatigue (HCF) failure modes observed in these blades during operation include crystallographic crack initiation/propagation on octahedral planes, and non-crystallographic initiation with crystallographic growth. Numerous cases of crack initiation and crack propagation at the blade leading edge tip, blade attachment regions, and damper contact locations have been documented. Understanding crack initiation/propagation under mixed-mode loading conditions is critical for establishing a systematic procedure for evaluating HCF life of single crystal turbine blades. This paper presents analytical and numerical techniques for evaluating two and three dimensional subsurface stress fields in anisotropic contacts. The subsurface stress results are required for evaluating contact fatigue life at damper contacts and dovetail attachment regions in single crystal nickel-base superalloy turbine blades. An analytical procedure is presented for evaluating the subsurface stresses in the elastic half-space, based on the adaptation of a stress function method outlined by Lekhnitskii. Numerical results are presented for cylindrical and spherical anisotropic contacts, using finite element analysis (FEA). Effects of crystal orientation on stress response and fatigue life are examined. Obtaining accurate subsurface stress results for anisotropic single crystal contact problems requires extremely refined three-dimensional (3-D) finite element grids, especially in the edge of contact region. Obtaining resolved shear stresses (RSS) on the principal slip planes also involves considerable post-processing work. For these reasons it is very advantageous to develop analytical solution schemes for subsurface stresses, whenever possible.
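The resolved-shear-stress post-processing mentioned above amounts to projecting the local stress tensor onto the octahedral {111}<110> slip systems of the FCC lattice. The Python sketch below shows that projection for a hypothetical stress state expressed in the cube (crystal) axes; it does not reproduce the paper's contact stress fields.

# Hedged sketch: resolved shear stress tau = n . sigma . s on the 12 FCC
# octahedral slip systems, for a stress tensor in the crystal cube axes.
import numpy as np

sigma = np.array([[200.0,  50.0,   0.0],
                  [ 50.0, 100.0,  30.0],
                  [  0.0,  30.0, -80.0]])        # MPa, hypothetical stress state

normals = [(1, 1, 1), (-1, 1, 1), (1, -1, 1), (1, 1, -1)]          # {111} planes
dirs    = [(0, 1, -1), (1, 0, -1), (1, -1, 0),
           (0, 1, 1), (1, 0, 1), (1, 1, 0)]                        # <110> candidates

rss = []
for n in normals:
    nhat = np.array(n, float) / np.sqrt(3.0)
    for d in dirs:
        if np.dot(n, d) == 0:                    # slip direction must lie in the plane
            dhat = np.array(d, float) / np.sqrt(2.0)
            rss.append(abs(nhat @ sigma @ dhat))

print(f"{len(rss)} slip systems, max |RSS| = {max(rss):.1f} MPa")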
Hernandez, Purnima; Ikkanda, Zachary
2011-03-01
There are a limited number of studies addressing behavior management techniques and procedural modifications that dentists can use to treat people with an autism spectrum disorder (ASD). The authors conducted a search of the dental and behavioral analytic literature to identify management techniques that address problem behaviors exhibited by children with ASDs in dental and other health-related environments. Applied behavior analysis (ABA) is a science in which procedures are based on the principles of behavior through systematic experimentation. Clinicians have used ABA procedures successfully to modify socially significant behaviors of people with ASD. Basic behavior management techniques currently used in dentistry may not encourage people with cognitive and behavioral disabilities, such as ASD, to tolerate simple in-office dental procedures consistently. Instead, dental care providers often are required to use advanced behavior management techniques to complete simple in-office procedures such as prophylaxis, sealant placement and obtaining radiographs. ABA procedures can be integrated in the dental environment to manage problem behaviors often exhibited by children with an ASD. The authors found no evidence-based procedural modifications that address the behavioral characteristics and problematic behaviors of children with an ASD in a dental environment. Further research in this area should be conducted. Knowledge and in-depth understanding of behavioral principles is essential when a dentist is concerned with modifying behaviors. Using ABA procedures can help dentists manage problem behaviors effectively and systematically when performing routine dental treatment. Being knowledgeable about each patient's behavioral characteristics and the parents' level of involvement is important in the successful integration of the procedures and reduction of in-office time.
Statistically Qualified Neuro-Analytic system and Method for Process Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
1998-11-04
An apparatus and method for monitoring a process involves development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two steps: deterministic model adaptation and stochastic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model adaptation involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system.
Horowitz, Arthur J.
2013-01-01
Successful environmental/water quality-monitoring programs usually require a balance between analytical capabilities, the collection and preservation of representative samples, and available financial/personnel resources. Due to current economic conditions, monitoring programs are under increasing pressure to do more with less. Hence, a review of current sampling and analytical methodologies, and some of the underlying assumptions that form the bases for these programs seems appropriate, to see if they are achieving their intended objectives within acceptable error limits and/or measurement uncertainty, in a cost-effective manner. That evaluation appears to indicate that several common sampling/processing/analytical procedures (e.g., dip (point) samples/measurements, nitrogen determinations, total recoverable analytical procedures) are generating biased or nonrepresentative data, and that some of the underlying assumptions relative to current programs, such as calendar-based sampling and stationarity are no longer defensible. The extensive use of statistical models as well as surrogates (e.g., turbidity) also needs to be re-examined because the hydrologic interrelationships that support their use tend to be dynamic rather than static. As a result, a number of monitoring programs may need redesigning, some sampling and analytical procedures may need to be updated, and model/surrogate interrelationships may require recalibration.
Srinivas, Nuggehally R
2006-05-01
The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches including various techniques, detection systems, automation tools that are available for an effective separation, enhanced selectivity and sensitivity for quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of drug discovery research and development process. The key areas covered in this article with relevant case studies include: (a) simultaneous assay for parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in the pharmacokinetics, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations-simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in analysis of multiple compounds in select disease areas and/or in clinically important drug-drug interaction studies. A tabular representation of select examples of analysis is provided covering areas of separation conditions, validation aspects and applicable conclusion. A limited discussion is provided on relevant aspects of the need for developing bioanalytical procedures for speedy drug discovery and development. Additionally, some key elements such as internal standard selection, likely issues of mass detection, matrix effect, chiral aspects etc. are provided for consideration during method development.
Off-shell amplitudes as boundary integrals of analytically continued Wilson line slope
NASA Astrophysics Data System (ADS)
Kotko, P.; Serino, M.; Stasto, A. M.
2016-08-01
One of the methods to calculate tree-level multi-gluon scattering amplitudes is to use the Berends-Giele recursion relation involving off-shell currents or off-shell amplitudes, if working in the light cone gauge. As shown in recent works using the light-front perturbation theory, solutions to these recursions naturally collapse into gauge invariant and gauge-dependent components, at least for some helicity configurations. In this work, we show that such a structure is helicity independent and emerges from analytic properties of matrix elements of Wilson line operators, where the slope of the straight gauge path is shifted in a certain complex direction. This is similar to the procedure leading to the Britto-Cachazo-Feng-Witten (BCFW) recursion; however, we apply a complex shift to the Wilson line slope instead of the external momenta. While in the original BCFW procedure the boundary integrals over the complex shift vanish for certain deformations, here they are non-zero and are equal to the off-shell amplitudes. The main result can thus be summarized as follows: we derive a decomposition of a helicity-fixed off-shell current into a gauge-invariant component given by a matrix element of a straight Wilson line plus a remainder given by a sum of products of gauge invariant and gauge dependent quantities. We give several examples realizing this relation, including the five-point next-to-MHV helicity configuration.
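For orientation, the contour argument alluded to above can be summarized schematically; the notation below is generic and is not taken from the paper.

```latex
% Generic BCFW-type contour identity (schematic; notation not from the paper).
% A(z) denotes the amplitude-like object under a one-parameter complex shift z.
\[
  B_\infty \;\equiv\; \frac{1}{2\pi i}\oint_{|z|=R\to\infty}\frac{\mathrm{d}z}{z}\,A(z)
  \;=\; A(0) \;+\; \sum_{z_k\neq 0}\ \operatorname*{Res}_{z=z_k}\,\frac{A(z)}{z}.
\]
% In the usual on-shell BCFW setup the boundary term B_infinity vanishes for suitable
% momentum shifts, so A(0) is reconstructed from the residues alone. In the Wilson-line
% construction summarized above, the shift instead acts on the line's slope and the
% boundary contribution is non-vanishing and identified with the off-shell amplitude.
```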
Garrido-Delgado, Rocío; Arce, Lourdes; Valcárcel, Miguel
2012-01-01
The potential of a headspace device coupled to multi-capillary column-ion mobility spectrometry has been studied as a screening system to differentiate virgin olive oils ("lampante," "virgin," and "extra virgin" olive oil). The last two types are virgin olive oil samples of very similar characteristics, which were very difficult to distinguish with the existing analytical method. The procedure involves the direct introduction of the virgin olive oil sample into a vial, headspace generation, and automatic injection of the volatiles into a gas chromatograph-ion mobility spectrometer. The data obtained after the analysis, in duplicate, of 98 samples of three different categories of virgin olive oils were preprocessed and submitted to a detailed chemometric treatment to classify the virgin olive oil samples according to their sensory quality. The same virgin olive oil samples were also analyzed by an expert panel to establish their category and use these data as reference values to check the potential of this new screening system. This comparison confirms the potential of the results presented here. The model was able to classify 97% of virgin olive oil samples in their corresponding group. Finally, the chemometric method was validated, obtaining a prediction rate of 87%. These results provide promising perspectives for the use of ion mobility spectrometry to differentiate virgin olive oil samples according to their quality instead of using the classical analytical procedure.
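The abstract does not specify which chemometric treatment was applied; as a generic illustration of this kind of workflow (data matrix, dimensionality reduction, supervised classification, cross-validation), the sketch below uses a PCA + LDA pipeline on simulated data. All array sizes, class labels, and the injected signal are invented.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score, StratifiedKFold

rng = np.random.default_rng(1)

# Placeholder data: 196 spectra (98 samples in duplicate) x 300 drift-time features,
# with three quality classes (0 = lampante, 1 = virgin, 2 = extra virgin).
X = rng.normal(size=(196, 300))
y = rng.integers(0, 3, size=196)
X += y[:, None] * 0.15          # inject a weak class-dependent signal for the demo

# A common chemometric pipeline: scale, compress with PCA, classify with LDA.
model = make_pipeline(StandardScaler(), PCA(n_components=10),
                      LinearDiscriminantAnalysis())
scores = cross_val_score(model, X, y,
                         cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=1))
print(f"cross-validated classification rate: {scores.mean():.2%}")
```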
Calvi, Lorenzo; Pentimalli, Daniela; Panseri, Sara; Giupponi, Luca; Gelmini, Fabrizio; Beretta, Giangiacomo; Vitali, Davide; Bruno, Massimo; Zilio, Emanuela; Pavlovic, Radmila; Giorgi, Annamaria
2018-02-20
There are at least 554 identified compounds in C. sativa L., among them 113 phytocannabinoids and 120 terpenes. Differences in phytocomplex composition and in the pharmaceutical properties of different medical cannabis chemotypes have been attributed to strict interactions, defined as the 'entourage effect', between cannabinoids and terpenes as a result of synergic action. The chemical complexity of its bioactive constituents highlights the need for standardised and well-defined analytical approaches able to characterise the plant chemotype and the herbal drug quality, as well as to monitor the quality of pharmaceutical cannabis extracts and preparations. Hence, in the first part of this study, analytical procedures combining headspace solid-phase microextraction (HS-SPME) coupled to GC-MS with high-resolution mass spectrometry LC-HRMS (Orbitrap®) were set up, validated and applied for the in-depth profiling and fingerprinting of cannabinoids and terpenes in two authorised medical-grade varieties of Cannabis sativa L. inflorescences (Bedrocan® and Bediol®) and in the macerated oils obtained from them. To better understand the trend of all volatile compounds and cannabinoids during oil storage, a new procedure for cannabis macerated oil preparation without any thermal step was tested and compared with the existing conventional methods to assess the potentially detrimental effect of heating on overall product quality. Copyright © 2017 Elsevier B.V. All rights reserved.
Bahrani, Sonia; Ghaedi, Mehrorang; Khoshnood Mansoorkhani, Mohammad Javad; Ostovan, Abbas
2017-01-01
A selective and rapid method was developed for quantification of curcumin in human plasma and food samples using molecularly imprinted magnetic multiwalled carbon nanotubes (MMWCNTs), which were characterized by EDX and FESEM. The effects of sorbent mass, eluent volume, and sonication time on the response of the solid-phase microextraction procedure were optimized by central composite design (CCD) combined with response surface methodology (RSM) using Statistica. Preliminary experiments revealed that, among different solvents, methanol:dimethyl sulfoxide (4:1, v/v) led to efficient and quantitative elution of the analyte. A reversed-phase high performance liquid chromatographic technique with UV detection (HPLC-UV) was applied for detection of curcumin content. The assay procedure involves chromatographic separation on an analytical Nucleosil C18 column (250 × 4.6 mm I.D., 5 μm particle size) at ambient temperature, with acetonitrile-water (20:80, v/v) adjusted to pH 4.0 as the mobile phase at a flow rate of 1.0 mL min⁻¹, while the UV detector was set at 420 nm. Under optimized conditions, the method demonstrated a linear calibration curve with a good detection limit (0.028 ng mL⁻¹) and R² = 0.9983. The proposed method was successfully applied to biological fluid and food samples including ginger powder, curry powder, and turmeric powder. Copyright © 2016. Published by Elsevier B.V.
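The CCD/RSM optimisation step described above amounts to running a designed set of experiments and fitting a quadratic response surface; a minimal sketch is given below. The coded factor names, axial distance, and simulated recoveries are placeholders, not the paper's data.

```python
import numpy as np
from itertools import product

# Hypothetical coded factors for a 3-factor central composite design:
# x1 = sorbent mass, x2 = eluent volume, x3 = sonication time (coded units).
alpha = 1.682                                        # rotatable axial distance for k = 3
factorial = np.array(list(product([-1, 1], repeat=3)), float)
axial = np.array([[s * alpha if i == j else 0.0 for j in range(3)]
                  for i in range(3) for s in (-1, 1)])
center = np.zeros((6, 3))
X = np.vstack([factorial, axial, center])            # 20 CCD runs

# Placeholder responses (extraction recovery, %); in practice these are measured.
rng = np.random.default_rng(2)
def true_surface(x):
    return 80 + 5*x[:, 0] + 3*x[:, 1] - 2*x[:, 2] - 4*x[:, 0]**2 - 3*x[:, 1]**2
y = true_surface(X) + rng.normal(0, 1.0, len(X))

# Full quadratic model matrix: intercept, linear, interaction, and squared terms.
def design_matrix(X):
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(3)]
    cols += [X[:, i] * X[:, j] for i in range(3) for j in range(i + 1, 3)]
    cols += [X[:, i]**2 for i in range(3)]
    return np.column_stack(cols)

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
print("fitted quadratic coefficients:", np.round(beta, 2))
```

The fitted surface would then be maximised (analytically or by grid search) to locate the optimum factor settings.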
Nicoara, Simona C.; Turner, Nicholas W.; Minnikin, David E.; Lee, Oona Y.-C.; O'Sullivan, Denise M.; McNerney, Ruth; Mutetwa, Reggie; Corbett, Liz E.; Morgan, Geraint H.
2015-01-01
A proof of principle gas chromatography–mass spectrometry method is presented, in combination with clean-up assays, aiming to improve the analysis of methyl mycocerosate tuberculosis biomarkers from sputum. Methyl mycocerosates are generated from the transesterification of phthiocerol dimycocerosates (PDIMs), extracted in petroleum ether from sputum of tuberculosis suspect patients. When a high matrix background is present in the sputum extracts, the identification of the chromatographic peaks corresponding to the methyl derivatives of PDIMs analytes may be hindered by the closely eluting methyl ether of cholesterol, an abundant matrix constituent frequently present in sputum samples. The purification procedures involving solid phase extraction (SPE) based methods with both commercial Isolute-Florisil cartridges, and purpose designed molecularly imprinted polymeric materials (MIPs), resulted in cleaner chromatograms, while the mycocerosates are still present. The clean-up performed on solutions of PDIMs and cholesterol standards in petroleum ether shows that, depending on the solvent mix and on the type of SPE used, the recovery of PDIMs is between 64 and 70%, whilst most of the cholesterol is removed from the system. When applied to petroleum ether extracts from representative sputum samples, the clean-up procedures resulted in recoveries of 36–68% for PDIMs, allowing improved detection of the target analytes. PMID:25728371
Bustamante, Luis; Cárdenas, Diana; von Baer, Dietrich; Pastene, Edgar; Duran-Sandoval, Daniel; Vergara, Carola; Mardones, Claudia
2017-09-01
Miniaturized sample pretreatments for the analysis of phenolic metabolites in plasma, involving protein precipitation, enzymatic deconjugation, extraction procedures, and different derivatization reactions, were systematically evaluated. The analyses were conducted by gas chromatography with mass spectrometry for the evaluation of 40 diet-derived phenolic compounds. Enzyme purification was necessary for the phenolic deconjugation before extraction. Trimethylsilanization reagent and two different tetrabutylammonium salts for derivatization reactions were compared. The optimum reaction conditions were 50 μL of trimethylsilanization reagent at 90°C for 30 min, while tetrabutylammonium salts were associated with loss of sensitivity due to rapid activation of the inert gas chromatograph liner. Phenolic acid extraction from plasma was optimized. Optimal microextraction-by-packed-sorbent performance was achieved using an octadecylsilyl packed bed, and better recoveries were observed for less polar compounds such as methoxylated derivatives. Despite the low recovery for many analytes, repeatability using an automated extraction procedure in the gas chromatograph inlet was 2.5%. In contrast, liquid-liquid microextraction gave better recoveries (80-110%) for all analytes at the expense of repeatability (3.8-18.4%). The phenolic compounds in gerbil plasma samples, collected before and 4 h after the administration of a calafate extract, were analyzed with the optimized methodology. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Evaluation of management measures of software development. Volume 1: Analysis summary
NASA Technical Reports Server (NTRS)
Page, J.; Card, D.; Mcgarry, F.
1982-01-01
The conceptual model, the data classification scheme, and the analytic procedures are explained. The analytic results are summarized and specific software measures for collection and monitoring are recommended.
Clean Water Act Analytical Methods
EPA publishes laboratory analytical methods (test procedures) that are used by industries and municipalities to analyze the chemical, physical and biological components of wastewater and other environmental samples required by the Clean Water Act.
FDA Bacteriological Analytical Manual, Chapter 10, 2003: Listeria monocytogenes
FDA Bacteriological Analytical Manual, Chapter 10 describes procedures for analysis of food samples and may be adapted for assessment of solid, particulate, aerosol, liquid and water samples containing Listeria monocytogenes.
Bámaca-Colbert, Mayra Y; Gayles, Jochebed G
2010-11-01
The overall aim of the current study was to identify the methodological approach and corresponding analytic procedure that best elucidated the associations among Mexican-origin mother-daughter cultural orientation dissonance, family functioning, and adolescent adjustment. To do so, we employed, and compared, two methodological approaches (i.e., variable-centered and person-centered) via four analytic procedures (i.e., difference score, interactive, matched/mismatched grouping, and latent profiles). The sample consisted of 319 girls in the 7th or 10th grade and their mother or mother figure from a large Southwestern, metropolitan area in the US. Family factors were found to be important predictors of adolescent adjustment in all models. Although some findings were similar across all models, overall, findings suggested that the latent profile procedure best elucidated the associations among the variables examined in this study. In addition, associations were present across early and middle adolescents, with a few findings being only present for one group. Implications for using these analytic procedures in studying cultural and family processes are discussed.
Derivation and application of a class of generalized impedance boundary conditions, part 2
NASA Technical Reports Server (NTRS)
Volakis, J. L.; Senior, T. B. A.; Jin, J.-M.
1989-01-01
Boundary conditions involving higher order derivatives are presented by simulating surfaces whose reflection coefficients are known analytically, numerically, or experimentally. Procedures for determining the coefficients of the derivatives are discussed, along with the effect of displacing the surface where the boundary conditions are applied. Provided the coefficients satisfy a duality relation, equivalent forms of the boundary conditions involving tangential field components are deduced, and these provide the natural extension to non-planar surfaces. As an illustration, the simulation of metal-backed uniform and three-layer dielectric coatings is given. It is shown that fourth order conditions are capable of providing an accurate simulation for the uniform coating at least a quarter of a wavelength in thickness. Provided, though, some compromise in accuracy is acceptable, it is also shown that a third order condition may be sufficient for practical purposes when simulating uniform coatings.
Knowledge-based geographic information systems (KBGIS): New analytic and data management tools
Albert, T.M.
1988-01-01
In its simplest form, a geographic information system (GIS) may be viewed as a data base management system in which most of the data are spatially indexed, and upon which sets of procedures operate to answer queries about spatial entities represented in the data base. Utilization of artificial intelligence (AI) techniques can enhance greatly the capabilities of a GIS, particularly in handling very large, diverse data bases involved in the earth sciences. A KBGIS has been developed by the U.S. Geological Survey which incorporates AI techniques such as learning, expert systems, new data representation, and more. The system, which will be developed further and applied, is a prototype of the next generation of GISs, an intelligent GIS, as well as an example of a general-purpose intelligent data handling system. The paper provides a description of KBGIS and its application, as well as the AI techniques involved. © 1988 International Association for Mathematical Geology.
NASA Astrophysics Data System (ADS)
Hor, Yew Fong
2002-08-01
This thesis involves the design, fabrication and characterization of an integrated optical waveguide sensor. Prior to fabrication, design parameters of the waveguide need to be determined and optimized. The waveguide parameters such as waveguide dimension and the refractive index of the core and cladding are obtained from the single-mode cutoff frequency calculated using either analytical or numerical methods. In this thesis, details of analytical calculations to determine the cutoff frequency in terms of the waveguide parameters will be presented. The method discussed here is Marcatili's approximation. The purpose is to solve the scalar wave equation derived from Maxwell's equations because it describes the mode properties inside the waveguides. The Finite Element Method is used to simulate the electric and magnetic fields inside the waveguides and to determine the propagation characteristics in optical waveguides. This method is suited for problems involving complicated geometries and variable index of refraction. Fabrication of the Integrated Mach-Zehnder Interferometer sensor involves several important standard processes such as Chemical Vapor Deposition (CVD) for thin film fabrication, photolithography for mask transfer, and etching for ridge waveguide formation. The detailed fabrication procedures of the tested Mach-Zehnder Interferometer sensors are discussed. After completion of the sensor fabrication processes, the characterizations were carried out for the thin film of SiO2 and PSG, the waveguides and the Y-junction separately. The waveguides were analyzed to make sure that the sensors are working as expected. The experimental testing on the separated waveguide portions of the first batch Integrated Mach-Zehnder Interferometer (MZI) sensors is described. These testing procedures were also performed for the subsequently fabricated batches of the integrated MZI sensors until optimum performance was achieved. A new concept has been proposed for chemical sensing applications. The novelty of the approach is mainly based on utilizing the multi-wavelength or broadband source instead of a single-wavelength input to the integrated MZI. The shifting of output spectra resulting from the interference has shown the ability of the MZI to analyze the different concentrations of a chemical analyte. The sensitivity of the sensor is also determined from the plot of intensity versus concentration, which is around 0.013 (%ml)-1 and 0.007 (%ml)-1 for the white light source and the 1.5 μm broadband source, respectively, while the lowest detectable concentration of ethanol for the sensor detection is around 8% using an intensity variation method and 0.6% using a peak wavelength variation method.
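The intensity-versus-concentration relation used to extract the sensitivity can be sketched with the ideal two-beam Mach-Zehnder transfer function; the wavelength, interaction length, and index-change coefficient below are assumed values, not those of the fabricated device.

```python
import numpy as np

# Hypothetical Mach-Zehnder interferometer parameters (not the thesis values).
wavelength = 1.55e-6        # m, broadband-source centre wavelength
L = 5e-3                    # m, sensing-arm interaction length
dn_dc = 1e-5                # effective-index change per % analyte concentration (assumed)

def mzi_intensity(concentration_percent, I0=1.0):
    """Ideal two-beam interference: I = I0/2 * (1 + cos(delta_phi))."""
    delta_n = dn_dc * concentration_percent
    delta_phi = 2.0 * np.pi * delta_n * L / wavelength
    return 0.5 * I0 * (1.0 + np.cos(delta_phi))

c = np.linspace(0.0, 10.0, 6)                 # analyte concentration in %
I = mzi_intensity(c)
# Local sensitivity dI/dc estimated numerically from the intensity-vs-concentration curve.
sensitivity = np.gradient(I, c)
for ci, Ii, si in zip(c, I, sensitivity):
    print(f"c = {ci:4.1f} %   I = {Ii:.3f}   dI/dc = {si:+.4f} per %")
```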
NASA Technical Reports Server (NTRS)
Mason, P. W.; Harris, H. G.; Zalesak, J.; Bernstein, M.
1974-01-01
The methods and procedures used in the analysis and testing of the scale model are reported together with the correlation of the analytical and experimental results. The model, the NASTRAN finite element analysis, and results are discussed. Tests and analytical investigations are also reported.
This SOP describes the method used for preparing surrogate recovery standard and internal standard solutions for the analysis of polar target analytes. It also describes the method for preparing calibration standard solutions for polar analytes used for gas chromatography/mass sp...
Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew
2015-01-01
Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012
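As a toy illustration of the MapReduce shape of processing described above (not the paper's actual HBase/cloud implementation), the sketch below maps point observations to grid-cell keys and reduces partial aggregates into per-cell means; the records are invented.

```python
from collections import defaultdict
from functools import reduce

# Toy records: (latitude, longitude, temperature). Real inputs would be huge arrays
# stored in HBase and processed in parallel; this only illustrates the map/reduce shape.
records = [(35.2, -97.4, 288.1), (35.7, -97.9, 289.0),
           (36.1, -96.2, 287.5), (35.4, -97.1, 288.8)]

def map_phase(record):
    """Emit a (key, value) pair: key = 1-degree grid cell, value = (sum, count)."""
    lat, lon, temp = record
    return (int(lat), int(lon)), (temp, 1)

def reduce_phase(a, b):
    """Combine partial (sum, count) aggregates for the same key."""
    return (a[0] + b[0], a[1] + b[1])

# Shuffle step: group mapped pairs by key, then reduce each group to a mean.
groups = defaultdict(list)
for key, value in map(map_phase, records):
    groups[key].append(value)

cell_means = {}
for key, values in groups.items():
    total, count = reduce(reduce_phase, values)
    cell_means[key] = total / count
print(cell_means)
```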
Analytical Applications of Monte Carlo Techniques.
ERIC Educational Resources Information Center
Guell, Oscar A.; Holcombe, James A.
1990-01-01
Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)
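A minimal example of the Monte Carlo integration idea discussed in the article: estimate an integral with a known analytical value and watch the error shrink as the number of random samples grows. The integrand and sample sizes are arbitrary choices for illustration.

```python
import random
import math

# Monte Carlo estimate of the integral of sin(x) from 0 to pi (analytical value: 2).
def mc_integral(f, a, b, n, seed=0):
    rng = random.Random(seed)
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n

for n in (100, 10_000, 1_000_000):
    est = mc_integral(math.sin, 0.0, math.pi, n)
    print(f"n = {n:>9,d}   estimate = {est:.5f}   error = {abs(est - 2.0):.5f}")
```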
Koukouvinos, Georgios; Petrou, Panagiota; Misiakos, Konstantinos; Drygiannakis, Dimitris; Raptis, Ioannis; Stefanitsis, Gerasimos; Martini, Spyridoula; Nikita, Dimitra; Goustouridis, Dimitrios; Moser, Isabella; Jobst, Gerhard; Kakabakos, Sotirios
2016-10-15
A dual-analyte assay for the simultaneous determination of C-reactive protein (CRP) and D-dimer in human blood plasma based on a white light interference spectroscopy sensing platform is presented. Measurement is accomplished in real time by scanning the sensing surface, on which distinct antibody areas have been created, with a reflection probe used both for illumination of the surface and collection of the reflected interference spectrum. The composition of the transducer, the sensing surface chemical activation and biofunctionalization procedures were optimized with respect to signal magnitude and repeatability. The assay format involved direct detection of CRP whereas for D-dimer a two-site immunoassay employing a biotinylated reporter antibody and reaction with streptavidin was selected. The assays were sensitive with detection limits of 25 ng/mL for both analytes, precise with intra- and inter-assay CV values ranging from 3.6% to 7.7%, and from 4.8% to 9.5%, respectively, for both assays, and accurate with recovery values ranging from 88.5% to 108% for both analytes. Moreover, the values determined for the two analytes in 35 human plasma samples were in excellent agreement with those received for the same samples by standard diagnostic laboratory instrumentation employing commercial kits. The excellent agreement of the results supported the validity of the proposed system for clinical application for the detection of multiple analytes since it was demonstrated that up to seven antibody areas can be created on the sensing surface and successfully interrogated with the developed optical set-up. Copyright © 2015. Published by Elsevier B.V.
Han, Lijun; Matarrita, Jessie; Sapozhnikova, Yelena; Lehotay, Steven J
2016-06-03
This study demonstrates the application of a novel lipid removal product to the residue analysis of 65 pesticides and 52 environmental contaminants in kale, pork, salmon, and avocado by fast, low-pressure gas chromatography-tandem mass spectrometry (LPGC-MS/MS). Sample preparation involves QuEChERS extraction followed by use of EMR-Lipid ("enhanced matrix removal of lipids") and an additional salting-out step for cleanup. The optimal amount of EMR-Lipid was determined to be 500 mg for 2.5 mL extracts for most of the analytes. The co-extractive removal efficiency by the EMR-Lipid cleanup step was 83-98% for fatty samples and 79% for kale, including 76% removal of chlorophyll. Matrix effects were typically less than ±20%, in part because analyte protectants were used in the LPGC-MS/MS analysis. The recoveries of polycyclic aromatic hydrocarbons and diverse pesticides were mostly 70-120%, whereas recoveries of nonpolar polybrominated diphenyl ethers and polychlorinated biphenyls were mostly lower than 70% through the cleanup procedure. With the use of internal standards, method validation results showed that 76-85 of the 117 analytes achieved satisfactory results (recoveries of 70-120% and RSD ≤ 20%) in pork, avocado, and kale, while 53 analytes had satisfactory results in salmon. Detection limits were 5-10 ng/g for all but a few analytes. EMR-Lipid is a new sample preparation tool that serves as another useful option for cleanup in multiresidue analysis, particularly of fatty foods. Published by Elsevier B.V.
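Matrix-effect and recovery figures of the kind quoted above are commonly computed from spiked-sample peak areas along the lines sketched below; the areas and the blank-subtraction convention are illustrative assumptions, not values from the study.

```python
# Hypothetical peak areas for one analyte; none of these numbers come from the paper.
area_spiked_after = 9800.0       # matrix extract spiked after extraction/cleanup
area_blank_matrix = 150.0        # unspiked matrix extract (background)
area_solvent_std = 11000.0       # solvent standard at the same concentration
area_spiked_before = 8200.0      # sample spiked before extraction/cleanup

# Matrix effect: signal change caused by co-extractives relative to a solvent standard
# (negative values indicate suppression, positive values enhancement).
matrix_effect_pct = ((area_spiked_after - area_blank_matrix) / area_solvent_std - 1) * 100

# Recovery: fraction of the spike surviving extraction and cleanup, judged against the
# post-cleanup spike so that matrix effects cancel out.
recovery_pct = ((area_spiked_before - area_blank_matrix)
                / (area_spiked_after - area_blank_matrix) * 100)

print(f"matrix effect: {matrix_effect_pct:+.1f} %   recovery: {recovery_pct:.1f} %")
```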
Current projects in Pre-analytics: where to go?
Sapino, Anna; Annaratone, Laura; Marchiò, Caterina
2015-01-01
The current clinical practice of tissue handling and sample preparation is multifaceted and lacks strict standardisation: this scenario leads to significant variability in the quality of clinical samples. Poor tissue preservation has a detrimental effect, leading to morphological artefacts, hampering the reproducibility of immunocytochemical and molecular diagnostic results (protein expression, DNA gene mutations, RNA gene expression) and affecting the research outcomes with irreproducible gene expression and post-transcriptional data. Altogether, this limits the opportunity to share and pool national databases into European common databases. At the European level, standardization of pre-analytical steps is just at the beginning, and issues regarding bio-specimen collection and management are still debated. A joint (public-private) project on the standardization of tissue handling in pre-analytical procedures has recently been funded in Italy with the aim of proposing novel approaches to this neglected issue. In this chapter, we will show how investing in pre-analytics may impact both public health problems and practical innovation in solid tumour processing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lindberg, Michael J.
2010-09-28
Between October 14, 2009, and February 22, 2010, sediment samples were received from the 100-BC Decision Unit for geochemical studies. This is an analytical data report for sediments received from CHPRC at the 100 BC 5 OU. The analyses for this project were performed at the 325 building located in the 300 Area of the Hanford Site. The analyses were performed according to Pacific Northwest National Laboratory (PNNL) approved procedures and/or nationally recognized test procedures. The data sets include the sample identification numbers, analytical results, estimated quantification limits (EQL), and quality control data. The preparatory and analytical quality control requirements, calibration requirements, acceptance criteria, and failure actions are defined in the on-line QA plan 'Conducting Analytical Work in Support of Regulatory Programs' (CAW). This QA plan implements the Hanford Analytical Services Quality Assurance Requirements Documents (HASQARD) for PNNL.
Recent developments in computer vision-based analytical chemistry: A tutorial review.
Capitán-Vallvey, Luis Fermín; López-Ruiz, Nuria; Martínez-Olmos, Antonio; Erenas, Miguel M; Palma, Alberto J
2015-10-29
Chemical analysis based on colour changes recorded with imaging devices is gaining increasing interest. This is due to its several significant advantages, such as simplicity of use, and the fact that it is easily combinable with portable and widely distributed imaging devices, resulting in friendly analytical procedures in many areas that demand out-of-lab applications for in situ and real-time monitoring. This tutorial review covers computer vision-based analytical (CVAC) procedures and systems from 2005 to 2015, a period of time when 87.5% of the papers on this topic were published. The background regarding colour spaces and recent analytical system architectures of interest in analytical chemistry is presented in the form of a tutorial. Moreover, issues regarding images, such as the influence of illuminants, and the most relevant techniques for processing and analysing digital images are addressed. Some of the most relevant applications are then detailed, highlighting their main characteristics. Finally, our opinion about future perspectives is discussed. Copyright © 2015 Elsevier B.V. All rights reserved.
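A minimal sketch of the colour-based measurement principle reviewed here: extract a colour-channel statistic from an image of a sensing spot and calibrate it against concentration. The simulated patches, the choice of the green channel, and the linear model are assumptions for illustration; real work would load camera images and may use other colour spaces.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated 50x50 RGB image patches of a colorimetric sensing spot at several
# analyte concentrations (arbitrary units).
concentrations = np.array([0.0, 2.0, 4.0, 6.0, 8.0])

def fake_patch(c):
    green = 200.0 - 15.0 * c                                 # colour fades with analyte
    return np.clip(rng.normal([120.0, green, 90.0], 3.0, (50, 50, 3)), 0, 255)

# Analytical signal: mean green-channel value of each patch (one common choice
# among the many colour-space coordinates used in computer vision-based analysis).
signal = np.array([fake_patch(c)[:, :, 1].mean() for c in concentrations])

# Linear calibration of signal versus concentration.
slope, intercept = np.polyfit(concentrations, signal, 1)
print(f"calibration: signal = {slope:.2f} * c + {intercept:.2f}")
```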
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2009-01-01
The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's Lab Master data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality control samples analyzed from July 2001 through June 2003. Results for the quality-control samples for 19 analytical procedures were evaluated for bias and precision. Control charts indicate that data for six of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, chloride, magnesium, nitrate (ion chromatography), potassium, and sodium. The calcium procedure was biased throughout the analysis period for the high-concentration sample, but was within control limits. The total monomeric aluminum and fluoride procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The total aluminum, pH, specific conductance, and sulfate procedures were biased for the high-concentration and low-concentration samples, but were within control limits. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 16 of 18 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for the dissolved organic carbon or specific conductance procedures. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 18 of the 21 analytes. At least 90 percent of the samples met data-quality objectives for all procedures except total monomeric aluminum (83 percent of samples met objectives), total aluminum (76 percent of samples met objectives), ammonium (73 percent of samples met objectives), dissolved organic carbon (86 percent of samples met objectives), and nitrate (81 percent of samples met objectives). The data-quality objective was not met for the nitrite procedure. Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated satisfactory or above data quality over the time period, with most performance ratings for each sample in the good-to-excellent range. The N-sample (nutrient constituents) analysis had one unsatisfactory rating for the ammonium procedure in one study. The T-sample (trace constituents) analysis had one unsatisfactory rating for the magnesium procedure and one marginal rating for the potassium procedure in one study and one unsatisfactory rating for the sodium procedure in another. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 90 percent of the samples met data-quality objectives for 10 of the 14 analytes; the exceptions were acid-neutralizing capacity, ammonium, dissolved organic carbon, and sodium. 
Data-quality objectives were not met in 37 percent of samples analyzed for acid-neutralizing capacity, 28 percent of samples analyzed for dissolved organic carbon, and 30 percent of samples analyzed for sodium. Results indicate a positive bias for the ammonium procedure in one study and a negative bias in another. Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 90 percent of the samples analyzed for calcium, chloride, magnesium, pH, potassium, and sodium. Data-quality objectives were met by 78 percent of
Ferrari, Davide; Manca, Monica; Banfi, Giuseppe; Locatelli, Massimo
2018-01-01
Driving under the influence of alcohol and/or illicit drugs in Italy is regulated by articles 186 and 187 of the National Street Code. Epidemiological studies on drivers involved in road traffic crashes (RTC) provide useful information about the use/abuse of these substances in the general population. Comparison with case-control studies may reveal important information, such as the adequacy of the cut-off limits. Data from 1587 blood tests for alcohol and 1258 blood tests for illicit drugs on drivers involved in RTC around Milan between 2012 and 2016 were analyzed and compared with a published random survey (DRUID) from the European Community. Our data from RTC-involved drivers show that alcohol abuse is not age-related whereas illicit drugs are more common in young people. Cannabinoids are frequent among younger drivers (median age 27) whereas cocaine is more often detected in adults (median age 34). The odds ratios calculated by comparison with the DRUID survey show that a blood alcohol concentration below the legal limit does not represent a risk factor for having a car accident, whereas concentrations of cocaine and cannabinoids within the legal limits are associated with being involved in a car accident. Despite the authorities' efforts, the abuse of alcohol and illicit drugs is still common in young drivers. We suspect that the cut-off limits for cannabinoids and cocaine and/or the pre-analytical procedures for these substances are inadequate. We suggest a better standardization of the procedure by shortening the time interval between the request for investigation and blood collection and propose the adoption of more stringent cut-off limits. Copyright © 2017 Elsevier B.V. All rights reserved.
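Odds ratios of the kind mentioned above are typically computed from a 2x2 table of crash-involved drivers versus survey controls; the counts below are placeholders (not the study's data), and the Woolf log-based confidence interval is one common convention.

```python
import math

# Hypothetical 2x2 table:
#                      substance detected   not detected
# crash-involved drivers        a                 b
# survey controls               c                 d
a, b, c, d = 40, 1200, 15, 2500     # placeholder counts

odds_ratio = (a / b) / (c / d)

# Woolf (log) method for an approximate 95% confidence interval.
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```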
The role of light microscopy in aerospace analytical laboratories
NASA Technical Reports Server (NTRS)
Crutcher, E. R.
1977-01-01
Light microscopy has greatly reduced analytical flow time and added new dimensions to laboratory capability. Aerospace analytical laboratories are often confronted with problems involving contamination, wear, or material inhomogeneity. The detection of potential problems and the solution of those that develop necessitate the most sensitive and selective applications of sophisticated analytical techniques and instrumentation. This inevitably involves light microscopy. The microscope can characterize and often identify the cause of a problem in 5-15 minutes with confirmatory tests generally less than one hour. Light microscopy has and will make a very significant contribution to the analytical capabilities of aerospace laboratories.
Review of Pre-Analytical Errors in Oral Glucose Tolerance Testing in a Tertiary Care Hospital.
Nanda, Rachita; Patel, Suprava; Sahoo, Sibashish; Mohapatra, Eli
2018-03-13
The pre-pre-analytical and pre-analytical phases account for a major share of the errors in a laboratory. A very common procedure, the oral glucose tolerance test, was chosen to identify pre-pre-analytical errors. Quality indicators provide evidence of quality, support accountability and help in the decision making of laboratory personnel. The aim of this research is to evaluate the pre-analytical performance of the oral glucose tolerance test procedure. This observational study was conducted over a period of three months in the phlebotomy and accessioning unit of our laboratory, using a questionnaire that examined the pre-pre-analytical errors through a scoring system. The pre-analytical phase was analyzed for each sample collected as per seven quality indicators. About 25% of the population gave a wrong answer with regard to the question that tested the knowledge of patient preparation. The appropriateness-of-test-result indicator (QI-1) had the highest error rate. Although QI-5 for sample collection had a low error rate, it is a very important indicator as any wrongly collected sample can alter the test result. Evaluating the pre-analytical and pre-pre-analytical phase is essential and must be conducted routinely on a yearly basis to identify errors and take corrective action and to facilitate their gradual introduction into routine practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sadler, D.A.; Sun, F.; Littlejohn, D.
1995-12-31
ICP-OES is a useful technique for multi-element analysis of soils. However, as a number of elements are present in relatively high concentrations, matrix interferences can occur and examples have been widely reported. The availability of CCD detectors has increased the opportunities for rapid multi-element, multi-wavelength determination of elemental concentrations in soils and other environmental samples. As the composition of soils from industrial sites can vary considerably, especially when taken from different pit horizons, procedures are required to assess the extent of interferences and correct the effects on a simultaneous multi-element basis. In single-element analysis, plasma operating conditions can sometimes be varied to minimize or even remove multiplicative interferences. In simultaneous multi-element analysis, the scope for this approach may be limited, depending on the spectrochemical characteristics of the emitting analyte species. Matrix matching, by addition of major sample components to the analyte calibrant solutions, can be used to minimize inaccuracies. However, there are also limitations to this procedure when the sample composition varies significantly. Multiplicative interference effects can also be assessed by a "single standard addition" of each analyte to the sample solution, and the information obtained may be used to correct the analyte concentrations determined directly. Each of these approaches has been evaluated to ascertain the best procedure for multi-element analysis of industrial soils by ICP-OES with CCD detection at multiple wavelengths. Standard reference materials and field samples have been analyzed to illustrate the efficacy of each procedure.
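The "single standard addition" correction evaluated in this work can be sketched as follows, assuming a linear response and neglecting dilution by the spike; the intensities and spike level are invented for illustration.

```python
# Hypothetical intensities for one analyte line; equal-volume handling is assumed so
# that the spike concentration is not diluted (a simplification of real practice).
intensity_sample = 1250.0      # emission intensity of the sample solution
intensity_spiked = 2300.0      # same solution after a single standard addition
c_added = 5.0                  # concentration added by the spike, mg/L

# Single standard addition: the slope observed in the actual matrix is
# (I_spiked - I_sample) / c_added, so the analyte concentration follows as
c_sample = intensity_sample * c_added / (intensity_spiked - intensity_sample)
print(f"matrix-corrected analyte concentration: {c_sample:.2f} mg/L")

# Comparing c_sample with the value from an external (aqueous) calibration gives a
# direct estimate of the multiplicative interference for that sample.
```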
Rossum, Huub H van; Kemperman, Hans
2017-07-26
General application of a moving average (MA) as continuous analytical quality control (QC) for routine chemistry assays has failed due to the lack of a simple method that allows optimization of MAs. A new method was applied to optimize the MA for routine chemistry and was evaluated in daily practice as a continuous analytical QC instrument. MA procedures were optimized using an MA bias detection simulation procedure. Optimization was graphically supported by bias detection curves. Next, all optimal MA procedures that contributed to the quality assurance were run for 100 consecutive days and MA alarms generated during working hours were investigated. Optimized MA procedures were applied for 24 chemistry assays. During this evaluation, 303,871 MA values and 76 MA alarms were generated. Of all alarms, 54 (71%) were generated during office hours. Of these, 41 were further investigated and were caused by ion selective electrode (ISE) failure (1), calibration failure not detected by QC due to improper QC settings (1), possible bias (significant difference with the other analyzer) (10), non-human materials analyzed (2), extreme result(s) of a single patient (2), pre-analytical error (1), no cause identified (20), and no conclusion possible (4). MA was implemented in daily practice as a continuous QC instrument for 24 routine chemistry assays. In our setup, in which MA alarms required follow-up, a manageable number of MA alarms was generated, and these alarms proved valuable. For the management of MA alarms, several applications/requirements in the MA management software will simplify the use of MA procedures.
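A bias-detection simulation of the general kind used to optimize MA procedures can be sketched as below: permute historical patient results, inject a fixed bias, and record how many results elapse before the moving average exceeds its control limits. The result distribution, window length, bias size, and limits are placeholder choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated patient results for one assay (target mean 5.0, SD 0.5); a normal
# distribution is enough for the sketch even though real result distributions differ.
baseline = rng.normal(5.0, 0.5, 5000)

def moving_average_alarm(results, window, limit_lo, limit_hi):
    """Return the index of the first MA value outside the control limits, else None."""
    ma = np.convolve(results, np.ones(window) / window, mode="valid")
    out = np.where((ma < limit_lo) | (ma > limit_hi))[0]
    return int(out[0]) + window - 1 if out.size else None

# Bias-detection simulation: add a fixed bias and count results until the MA alarms,
# over many random re-orderings of the baseline data.
window, limit_lo, limit_hi = 50, 4.85, 5.15
delays = []
for _ in range(200):
    series = rng.permutation(baseline)[:1000] + 0.3       # +0.3 simulated bias
    idx = moving_average_alarm(series, window, limit_lo, limit_hi)
    if idx is not None:
        delays.append(idx)
print(f"median number of results to detection: {np.median(delays):.0f} "
      f"(detected in {len(delays)}/200 runs)")
```

Repeating this for a grid of candidate windows and limits yields the bias detection curves used to compare MA configurations.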
10 CFR 26.137 - Quality assurance and quality control.
Code of Federal Regulations, 2013 CFR
2013-01-01
... validation of analytical procedures. Quality assurance procedures must be designed, implemented, and reviewed... resolving any technical, methodological, or administrative errors in the licensee testing facility's testing...
10 CFR 26.137 - Quality assurance and quality control.
Code of Federal Regulations, 2010 CFR
2010-01-01
... validation of analytical procedures. Quality assurance procedures must be designed, implemented, and reviewed... resolving any technical, methodological, or administrative errors in the licensee testing facility's testing...
10 CFR 26.137 - Quality assurance and quality control.
Code of Federal Regulations, 2011 CFR
2011-01-01
... validation of analytical procedures. Quality assurance procedures must be designed, implemented, and reviewed... resolving any technical, methodological, or administrative errors in the licensee testing facility's testing...
10 CFR 26.137 - Quality assurance and quality control.
Code of Federal Regulations, 2012 CFR
2012-01-01
... validation of analytical procedures. Quality assurance procedures must be designed, implemented, and reviewed... resolving any technical, methodological, or administrative errors in the licensee testing facility's testing...
10 CFR 26.137 - Quality assurance and quality control.
Code of Federal Regulations, 2014 CFR
2014-01-01
... validation of analytical procedures. Quality assurance procedures must be designed, implemented, and reviewed... resolving any technical, methodological, or administrative errors in the licensee testing facility's testing...
Statistically qualified neuro-analytic failure detection method and system
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
2002-03-02
An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic modification of the adapted deterministic model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
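A minimal numerical sketch of the two-stage idea (not the patented SQNA procedure): a known analytic model is augmented with a small neural network trained on its residual, the remaining error is characterized statistically, and a sequential probability ratio test monitors new data for a change. The process model, network size, noise level, and thresholds are invented placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# --- Hypothetical process: response y to a control input u ---------------------
def analytic_model(u):
    """Known first-principles part of the process (placeholder form)."""
    return 2.0 * u + 1.0

def true_process(u):
    """'Real' process = analytic part + unknown nonlinearity + noise."""
    return analytic_model(u) + 0.5 * np.sin(3.0 * u) + rng.normal(0.0, 0.05, u.shape)

# --- Deterministic stage: train a small NN on the residual of the analytic model
u_train = rng.uniform(0.0, 2.0, 500)
residual = true_process(u_train) - analytic_model(u_train)
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
net.fit(u_train.reshape(-1, 1), residual)

def neuro_analytic(u):
    return analytic_model(u) + net.predict(np.asarray(u).reshape(-1, 1))

# --- Stochastic stage: estimate residual noise sigma on held-out data ----------
u_val = rng.uniform(0.0, 2.0, 200)
sigma = np.std(true_process(u_val) - neuro_analytic(u_val))

# --- On-line monitoring with a Wald SPRT on the model error --------------------
def sprt(errors, sigma, shift, alpha=0.01, beta=0.01):
    """Sequential probability ratio test: H0 mean error 0 vs H1 mean error `shift`."""
    A, B = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
    llr = 0.0
    for k, e in enumerate(errors, 1):
        llr += (shift / sigma**2) * (e - shift / 2.0)
        if llr >= A:
            return f"fault declared at sample {k}"
        if llr <= B:
            return f"no fault (accepted H0 at sample {k})"
    return "undecided"

u_new = rng.uniform(0.0, 2.0, 50)
drifted = true_process(u_new) + 0.2          # simulated process change (bias drift)
print(sprt(drifted - neuro_analytic(u_new), sigma, shift=3 * sigma))
```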
Design and analysis of composite structures with stress concentrations
NASA Technical Reports Server (NTRS)
Garbo, S. P.
1983-01-01
An overview of an analytic procedure which can be used to provide comprehensive stress and strength analysis of composite structures with stress concentrations is given. The methodology provides designer/analysts with a user-oriented procedure which, within acceptable engineering accuracy, accounts for the effects of a wide range of application design variables. The procedure permits the strength of arbitrary laminate constructions under general bearing/bypass load conditions to be predicted with only unnotched unidirectional strength and stiffness input data required. Included is a brief discussion of the relevancy of this analysis to the design of primary aircraft structure; an overview of the analytic procedure with theory/test correlations; and an example of the use and interaction of this strength analysis relative to the design of high-load transfer bolted composite joints.
ERIC Educational Resources Information Center
Anderson, Craig A.; Shibuya, Akiko; Ihori, Nobuko; Swing, Edward L.; Bushman, Brad J.; Sakamoto, Akira; Rothstein, Hannah R.; Saleem, Muniba
2010-01-01
Meta-analytic procedures were used to test the effects of violent video games on aggressive behavior, aggressive cognition, aggressive affect, physiological arousal, empathy/desensitization, and prosocial behavior. Unique features of this meta-analytic review include (a) more restrictive methodological quality inclusion criteria than in past…
40 CFR 63.786 - Test methods and procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... level of sample dilution must be factored in. (2) Repeatability. First, at the 0.1-5 percent analyte... percent analyte range the results would be suspect if duplicates vary by more than 5 percent relative and...) Reproducibility. First, at the 0.1-5 percent analyte range the results would be suspect if lab to lab variation...
40 CFR 63.786 - Test methods and procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... level of sample dilution must be factored in. (2) Repeatability. First, at the 0.1-5 percent analyte... percent analyte range the results would be suspect if duplicates vary by more than 5 percent relative and...) Reproducibility. First, at the 0.1-5 percent analyte range the results would be suspect if lab to lab variation...
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2009-01-01
The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's Lab Master data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality control samples analyzed from July 2005 through June 2007. Results for the quality-control samples for 19 analytical procedures were evaluated for bias and precision. Control charts indicate that data for eight of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: total aluminum, calcium, magnesium, nitrate (colorimetric method), potassium, silicon, sodium, and sulfate. Eight of the analytical procedures were biased throughout the analysis period for the high-concentration sample, but were within control limits; these procedures were: total aluminum, calcium, dissolved organic carbon, chloride, nitrate (ion chromatograph), potassium, silicon, and sulfate. The magnesium and pH procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The acid-neutralizing capacity, total monomeric aluminum, nitrite, and specific conductance procedures were biased for the high-concentration and low-concentration samples, but were within control limits. Results from the filter-blank and analytical-blank analyses indicated that the procedures for 16 of 17 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for dissolved organic carbon. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 18 of the 21 analytes. At least 93 percent of the samples met data-quality objectives for all analytes except acid-neutralizing capacity (85 percent of samples met objectives), total monomeric aluminum (83 percent of samples met objectives), total aluminum (85 percent of samples met objectives), and chloride (85 percent of samples met objectives). The ammonium and total dissolved nitrogen did not meet the data-quality objectives. Results of the USGS interlaboratory Standard Reference Sample (SRS) Project met the Troy Laboratory data-quality objectives for 87 percent of the samples analyzed. The P-sample (low-ionic-strength constituents) analysis had two outliers each in two studies. The T-sample (trace constituents) analysis and the N-sample (nutrient constituents) analysis had one outlier each in two studies. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 85 percent of the samples met data-quality objectives for 11 of the 14 analytes; the exceptions were acid-neutralizing capacity, total aluminum and ammonium. Data-quality objectives were not met in 41 percent of samples analyzed for acid-neutralizing capacity, 50 percent of samples analyzed for total aluminum, and 44 percent of samples analyzed for ammonium. 
Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 86 percent of the samples analyzed for calcium, magnesium, pH, potassium, and sodium. Data-quality objectives were met by 76 percent of the samples analyzed for chloride, 80 percent of the samples analyzed for specific conductance, and 77 percent of the samples analyzed for sulfate.
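The triplicate precision and control-chart evaluations described in these QA reports rest on simple statistics of the kind sketched below; the concentrations are invented, and the ±2s/±3s warning and control limits are a common convention rather than the laboratory's documented rules.

```python
import statistics

# Hypothetical triplicate results (mg/L) for one analyte; not the report's data.
triplicate = [2.41, 2.45, 2.39]
mean = statistics.mean(triplicate)
cv_percent = statistics.stdev(triplicate) / mean * 100
print(f"triplicate mean = {mean:.3f} mg/L, CV = {cv_percent:.1f} %")

# Shewhart-style limits from repeated quality-control sample results:
# warning limits at +/- 2 s and control limits at +/- 3 s around the QC mean.
qc_results = [2.50, 2.48, 2.52, 2.47, 2.55, 2.49, 2.51, 2.46]
qc_mean = statistics.mean(qc_results)
qc_sd = statistics.stdev(qc_results)
print(f"QC mean = {qc_mean:.3f}, "
      f"warning limits = {qc_mean - 2*qc_sd:.3f}..{qc_mean + 2*qc_sd:.3f}, "
      f"control limits = {qc_mean - 3*qc_sd:.3f}..{qc_mean + 3*qc_sd:.3f}")
```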
How to conduct External Quality Assessment Schemes for the pre-analytical phase?
Kristensen, Gunn B B; Aakre, Kristin Moberg; Kristoffersen, Ann Helen; Sandberg, Sverre
2014-01-01
In laboratory medicine, several studies have described the most frequent errors in the different phases of the total testing process, and a large proportion of these errors occur in the pre-analytical phase. Schemes for registration of errors and subsequent feedback to the participants have been conducted for decades concerning the analytical phase by External Quality Assessment (EQA) organizations operating in most countries. The aim of the paper is to present an overview of different types of EQA schemes for the pre-analytical phase, and to give examples of some existing schemes. So far, very few EQA organizations have focused on the pre-analytical phase, and most EQA organizations do not offer pre-analytical EQA schemes (EQAS). Pre-analytical EQAS are more difficult to perform and standardize, and accreditation bodies do not ask laboratories for results from such schemes. However, some ongoing EQA programs for the pre-analytical phase do exist, and some examples are given in this paper. The methods used can be divided into three different types: collecting information about pre-analytical laboratory procedures, circulating real samples to collect information about interferences that might affect the measurement procedure, or registering actual laboratory errors and relating these to quality indicators. These three types have different focuses and different challenges regarding implementation, and a combination of the three is probably necessary to be able to detect and monitor the wide range of errors occurring in the pre-analytical phase.
The pitfalls of hair analysis for toxicants in clinical practice: three case reports.
Frisch, Melissa; Schwartz, Brian S
2002-01-01
Hair analysis is used to assess exposure to heavy metals in patients presenting with nonspecific symptoms and is a commonly used procedure in patients referred to our clinic. We are frequently called on to evaluate patients who have health-related concerns as a result of hair analysis. Three patients first presented to outside physicians with nonspecific, multisystemic symptoms. A panel of analytes was measured in hair, and one or more values were interpreted as elevated. As a result of the hair analysis and other unconventional diagnostic tests, the patients presented to us believing they suffered from metal toxicity. In this paper we review the clinical efficacy of this procedure within the context of a patient population with somatic disorders and no clear risk factors for metal intoxication. We also review limitations of hair analysis in this setting; these limitations include patient factors such as low pretest probability of disease and test factors such as the lack of validation of analytic techniques, the inability to discern between exogenous contaminants and endogenous toxicants in hair, the variability of analytic procedures, low interlaboratory reliability, and the increased likelihood of false positive test results in the measurement of panels of analytes. PMID:11940463
Dwyer, Johanna T.; Picciano, Mary Frances; Betz, Joseph M.; Fisher, Kenneth D.; Saldanha, Leila G.; Yetley, Elizabeth A.; Coates, Paul M.; Radimer, Kathy; Bindewald, Bernadette; Sharpless, Katherine E.; Holden, Joanne; Andrews, Karen; Zhao, Cuiwei; Harnly, James; Wolf, Wayne R.; Perry, Charles R.
2013-01-01
Several activities of the Office of Dietary Supplements (ODS) at the National Institutes of Health involve enhancement of dietary supplement databases. These include an initiative with the US Department of Agriculture to develop an analytically substantiated dietary supplement ingredient database (DSID) and collaboration with the National Center for Health Statistics to enhance the dietary supplement label database in the National Health and Nutrition Examination Survey (NHANES). The many challenges that must be dealt with in developing an analytically supported DSID include categorizing product types in the database, identifying nutrients and other components of public health interest in these products, and prioritizing which will be entered in the database first. Additional tasks include developing methods and reference materials for quantifying the constituents, finding qualified laboratories to measure the constituents, developing appropriate sample handling procedures, and finally developing representative sampling plans. Developing the NHANES dietary supplement label database has other challenges, such as collecting information on dietary supplement use from NHANES respondents, constantly updating and refining the information obtained, developing default values that can be used if the respondent cannot supply the exact supplement or strength that was consumed, and developing a publicly available label database. Federal partners and the research community are assisting in making an analytically supported dietary supplement database a reality. PMID:25309034
Development of analytic intermodal freight networks for use within a GIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Southworth, F.; Xiong, D.; Middendorf, D.
1997-05-01
The paper discusses the practical issues involved in constructing intermodal freight networks that can be used within GIS platforms to support inter-regional freight routing and subsequent (for example, commodity flow) analysis. The procedures described can be used to create freight-routable and traffic flowable interstate and intermodal networks using some combination of highway, rail, water and air freight transportation. Keys to realistic freight routing are the identification of intermodal transfer locations and associated terminal functions, a proper handling of carrier-owned and operated sub-networks within each of the primary modes of transport, and the ability to model the types of carrier services being offered.
NASA Technical Reports Server (NTRS)
Knox, R. J.
1978-01-01
Embryonic kidney cells were studied as a follow-up to the MA-011 Electrophoresis Technology Experiment which was conducted during the Apollo Soyuz Test Project (ASTP). The postflight analysis of the performance of the ASTP zone electrophoresis experiment involving embryonic kidney cells is reported. The feasibility of producing standard particles for electrophoresis was also studied. This work was undertaken in response to a need for standardization of methods for producing, calibrating, and storing electrophoretic particle standards which could be employed in performance tests of various types of electrophoresis equipment. Promising procedures were tested for their suitability in the production of standard test particles from red blood cells.
NASA Astrophysics Data System (ADS)
Conte, Eric D.; Barry, Eugene F.; Rubinstein, Harry
1996-12-01
Certain individuals may be sensitive to specific compounds in consumer products. It is important to quantify these analytes in food products in order to monitor their intake. Caffeine is one such compound. Determination of caffeine in beverages by spectrophotometric procedures requires an extraction procedure, which can prove time-consuming. Although the corresponding determination by HPLC allows for a direct injection, capillary zone electrophoresis provides several advantages such as extremely low solvent consumption, smaller sample volume requirements, and improved sensitivity.
A Fuzzy-Based Decision Support Model for Selecting the Best Dialyser Flux in Haemodialysis.
Oztürk, Necla; Tozan, Hakan
2015-01-01
Decision making is an important procedure for every organization. The procedure is particularly challenging for complicated multi-criteria problems. Selection of dialyser flux is one of the decisions routinely made for haemodialysis treatment provided for chronic kidney failure patients. This study provides a decision support model for selecting the best dialyser flux between high-flux and low-flux dialyser alternatives. The preferences of decision makers were collected via a questionnaire. A total of 45 questionnaires filled by dialysis physicians and nephrologists were assessed. A hybrid fuzzy-based decision support software that enables the use of Analytic Hierarchy Process (AHP), Fuzzy Analytic Hierarchy Process (FAHP), Analytic Network Process (ANP), and Fuzzy Analytic Network Process (FANP) was used to evaluate the flux selection model. In conclusion, the results showed that a high-flux dialyser is the best option for haemodialysis treatment.
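The study's decision model rests on Analytic Hierarchy Process weighting. The sketch below illustrates only the classical (crisp) AHP step of deriving criterion weights from a pairwise comparison matrix via the principal eigenvector, using invented matrix values; it is not the fuzzy AHP/ANP software or questionnaire data used in the study.

```python
# Minimal sketch of the crisp AHP weighting step: derive priority weights from a
# pairwise comparison matrix via its principal eigenvector and check consistency.
# The 3x3 matrix (e.g. cost, dialysis adequacy, patient outcome) is invented.
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # index of the principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                    # normalized priority vector

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)        # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty's random index
print("weights:", np.round(weights, 3), "CR:", round(ci / ri, 3))
```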
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Suet Yi; Kleber, Markus; Takahashi, Lynelle K.
2013-04-01
Soil organic matter (OM) is important because its decay drives life processes in the biosphere. Analysis of organic compounds in geological systems is difficult because of their intimate association with mineral surfaces. To date there is no procedure capable of quantitatively separating organic from mineral phases without creating artifacts or mass loss. Therefore, analytical techniques that can (a) generate information about both organic and mineral phases simultaneously and (b) allow the examination of predetermined high-interest regions of the sample as opposed to conventional bulk analytical techniques are valuable. Laser Desorption Synchrotron Postionization (synchrotron-LDPI) mass spectrometry is introduced as a novel analytical tool to characterize the molecular properties of organic compounds in mineral-organic samples from terrestrial systems, and it is demonstrated that, when combined with Secondary Ion Mass Spectrometry (SIMS), it can provide complementary information on mineral composition. Mass spectrometry along a decomposition gradient in density fractions verifies the consistency of our results with bulk analytical techniques. We further demonstrate that by changing laser and photoionization energies, variations in molecular stability of organic compounds associated with mineral surfaces can be determined. The combination of synchrotron-LDPI and SIMS shows that the energetic conditions involved in desorption and ionization of organic matter may be a greater determinant of mass spectral signatures than the inherent molecular structure of the organic compounds investigated. The latter has implications for molecular models of natural organic matter that are based on mass spectrometric information.
Agut, C; Caron, A; Giordano, C; Hoffman, D; Ségalini, A
2011-09-10
In 2001, a multidisciplinary team made up of analytical scientists and statisticians at Sanofi-aventis published a methodology which has governed, from that time, the transfer of release monographs from R&D sites to Manufacturing sites. This article provides an overview of the recent adaptations brought to this original methodology, taking advantage of our experience and the new regulatory framework, and, in particular, the risk management perspective introduced by ICH Q9. Although some alternate strategies have been introduced in our practices, the comparative testing one, based on equivalence testing as the statistical approach, remains the standard for assays relying on very critical quality attributes. This is conducted with the aim of controlling the most important consumer's risks involved at two levels in analytical decisions in the frame of transfer studies: the risk, for the receiving laboratory, of taking poor release decisions with the analytical method, and the risk, for the sending laboratory, of accrediting such a receiving laboratory on account of its insufficient performance with the method. Among the enhancements to the comparative studies, the manuscript presents the process established within our company for a better integration of the transfer study into the method life-cycle, as well as proposals of generic acceptance criteria and designs for assay and related substances methods. While maintaining the rigor and selectivity of the original approach, these improvements tend towards an increased efficiency in the transfer operations. Copyright © 2011 Elsevier B.V. All rights reserved.
Relative frequencies of constrained events in stochastic processes: An analytical approach.
Rusconi, S; Akhmatskaya, E; Sokolovski, D; Ballard, N; de la Cal, J C
2015-10-01
The stochastic simulation algorithm (SSA) and the corresponding Monte Carlo (MC) method are among the most common approaches for studying stochastic processes. They rely on knowledge of interevent probability density functions (PDFs) and on information about dependencies between all possible events. In many real-life applications, analytical representations of a PDF are difficult to specify in advance. Knowing the shapes of PDFs, and using experimental data, different optimization schemes can be applied in order to evaluate probability density functions and, therefore, the properties of the studied system. Such methods, however, are computationally demanding, and often not feasible. We show that, in the case where experimentally accessed properties are directly related to the frequencies of events involved, it may be possible to replace the heavy Monte Carlo core of optimization schemes with an analytical solution. Such a replacement not only provides a more accurate estimation of the properties of the process, but also reduces the simulation time by a factor of the order of the sample size (at least ≈10^4). The proposed analytical approach is valid for any choice of PDF. The accuracy, computational efficiency, and advantages of the method over MC procedures are demonstrated in the exactly solvable case and in the evaluation of branching fractions in controlled radical polymerization (CRP) of acrylic monomers. This polymerization can be modeled by a constrained stochastic process. Constrained systems are quite common, and this makes the method useful for various applications.
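For context, the Monte Carlo baseline that the authors propose to replace can be illustrated by a simulation that tallies relative event frequencies. The sketch below assumes constant event rates invented for illustration; it does not implement the paper's analytical solution.

```python
# Minimal sketch of the Monte Carlo baseline discussed above: repeatedly select
# which event fires next with probability proportional to its rate and tally the
# relative frequencies. The two event rates are invented for illustration.
import random

def mc_event_frequencies(rates, n_events=100_000, seed=0):
    rng = random.Random(seed)
    counts = [0] * len(rates)
    total = sum(rates)
    for _ in range(n_events):
        r, acc = rng.random() * total, 0.0
        for i, k in enumerate(rates):
            acc += k
            if r <= acc:
                counts[i] += 1
                break
    return [c / n_events for c in counts]

print(mc_event_frequencies([0.7, 0.3]))  # expected to approach [0.7, 0.3]
```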
Mechanic, Leah; Mendez, Armando; Merrill, Lori; Rogers, John; Layton, Marnie; Todd, Deborah; Varanasi, Arti; O’Brien, Barbara; Meyer, William A.; Zhang, Ming; Schleicher, Rosemary L.; Moye, Jack
2014-01-01
BACKGROUND Preanalytical conditions encountered during collection, processing, and storage of biospecimens may influence laboratory results. The National Children’s Study (NCS) is a planned prospective cohort study of 100,000 families to examine the influence of a wide variety of exposures on child health. In developing biospecimen collection, processing, and storage procedures for the NCS, we identified several analytes of different biochemical categories for which it was unclear to what extent deviations from NCS procedures could influence measurement results. METHODS A pilot study was performed to examine effects of preanalytic sample handling conditions (delays in centrifugation, freezing delays, delays in separation from cells, additive delay, and tube type) on concentrations of eight different analytes. 2,825 measurements were made to assess 15 unique combinations of analyte and handling conditions in blood collected from 151 women of childbearing age (≥20 individuals per handling condition). RESULTS The majority of analytes were stable under the conditions evaluated. However, levels of plasma interleukin-6 and serum insulin were decreased in response to sample centrifugation delays of up to 5.5 hours post collection (P<0.0001). In addition, delays in freezing centrifuged plasma samples (comparing 24, 48 and 72 hours to immediate freezing) resulted in increased levels of adrenocorticotropic hormone (P=0.0014). CONCLUSIONS Determining stability of proposed analytes in response to preanalytical conditions and handling helps to ensure high-quality specimens for study now and in the future. The results inform development of procedures, plans for measurement of analytes, and interpretation of laboratory results. PMID:23924524
Simplified Computation for Nonparametric Windows Method of Probability Density Function Estimation.
Joshi, Niranjan; Kadir, Timor; Brady, Michael
2011-08-01
Recently, Kadir and Brady proposed a method for estimating probability density functions (PDFs) for digital signals which they call the Nonparametric (NP) Windows method. The method involves constructing a continuous space representation of the discrete space and sampled signal by using a suitable interpolation method. NP Windows requires only a small number of observed signal samples to estimate the PDF and is completely data driven. In this short paper, we first develop analytical formulae to obtain the NP Windows PDF estimates for 1D, 2D, and 3D signals, for different interpolation methods. We then show that the original procedure to calculate the PDF estimate can be significantly simplified and made computationally more efficient by a judicious choice of the frame of reference. We have also outlined specific algorithmic details of the procedures enabling quick implementation. Our reformulation of the original concept has directly demonstrated a close link between the NP Windows method and the Kernel Density Estimator.
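As a point of comparison for the link the authors draw to kernel density estimation, a minimal 1D Gaussian KDE sketch is shown below; it is the baseline estimator, not the NP Windows method itself, and the bandwidth and data are arbitrary.

```python
# Minimal sketch of the 1D kernel density estimator that the paper relates
# NP Windows to; this is the comparison baseline, not the NP Windows method.
import numpy as np

def gaussian_kde(samples, grid, bandwidth):
    samples = np.asarray(samples)[None, :]        # shape (1, n)
    grid = np.asarray(grid)[:, None]              # shape (m, 1)
    z = (grid - samples) / bandwidth
    kernel = np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)
    return kernel.mean(axis=1) / bandwidth        # PDF estimate on the grid

x = np.random.default_rng(0).normal(size=200)     # toy 1D signal samples
grid = np.linspace(-4, 4, 101)
pdf = gaussian_kde(x, grid, bandwidth=0.4)
print(float(np.trapz(pdf, grid)))                 # ≈ 1, sanity check
```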
Theory of invasion extinction dynamics in minimal food webs
NASA Astrophysics Data System (ADS)
Haerter, Jan O.; Mitarai, Namiko; Sneppen, Kim
2018-02-01
When food webs are exposed to species invasion, secondary extinction cascades may be set off. Although much work has gone into characterizing the structure of food webs, systematic predictions on their evolutionary dynamics are still scarce. Here we present a theoretical framework that predicts extinctions in terms of an alternating sequence of two basic processes: resource depletion by or competitive exclusion between consumers. We first propose a conceptual invasion extinction model (IEM) involving random fitness coefficients. We bolster this IEM by an analytical, recursive procedure for calculating idealized extinction cascades after any species addition and simulate the long-time evolution. Our procedure describes minimal food webs where each species interacts with only a single resource through the generalized Lotka-Volterra equations. For such food webs extinction cascades are determined uniquely and the system always relaxes to a stable steady state. The dynamics and scale-invariant species lifetimes resemble the behavior of the IEM, and correctly predict an upper limit for trophic levels as observed in the field.
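The minimal food webs are governed by generalized Lotka-Volterra dynamics. The sketch below integrates a small generalized Lotka-Volterra system of the form dx_i/dt = x_i (r_i + Σ_j A_ij x_j) with invented parameters; it illustrates the underlying equations only and does not reproduce the paper's invasion-extinction model.

```python
# Minimal sketch of generalized Lotka-Volterra dynamics: dx_i/dt = x_i*(r_i + (A x)_i).
# The 3-species parameters (one resource, two consumers) are invented for illustration.
import numpy as np
from scipy.integrate import solve_ivp

r = np.array([1.0, -0.4, -0.2])          # intrinsic growth/death rates
A = np.array([[-1.0, -0.5, -0.3],        # interaction matrix
              [ 0.5, -0.1,  0.0],
              [ 0.3,  0.0, -0.1]])

def glv(t, x):
    return x * (r + A @ x)

sol = solve_ivp(glv, (0.0, 200.0), [0.5, 0.1, 0.1], rtol=1e-8)
print(np.round(sol.y[:, -1], 3))         # near-steady-state abundances
```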
Recommendation for the review of biological reference intervals in medical laboratories.
Henny, Joseph; Vassault, Anne; Boursier, Guilaine; Vukasovic, Ines; Mesko Brguljan, Pika; Lohmander, Maria; Ghita, Irina; Andreu, Francisco A Bernabeu; Kroupis, Christos; Sprongl, Ludek; Thelen, Marc H M; Vanstapel, Florent J L A; Vodnik, Tatjana; Huisman, Willem; Vaubourdolle, Michel
2016-12-01
This document is based on the original recommendation of the Expert Panel on the Theory of Reference Values of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC); updated guidelines were recently published under the auspices of the IFCC and the Clinical and Laboratory Standards Institute (CLSI). This document summarizes proposals for recommendations on: (i) the terminology, which is often confusing, notably concerning the terms reference limits and decision limits; (ii) the method for the determination of reference limits according to the original procedure and the conditions that should be used; (iii) a simple procedure allowing medical laboratories to fulfill the requirements of regulations and standards. The updated document proposes to verify that published reference limits are applicable to the laboratory involved. Finally, the strengths and limits of the revised recommendations (especially the selection of the reference population, the maintenance of the analytical quality, the choice of the statistical method used…) will be briefly discussed.
42 CFR 493.801 - Condition: Enrollment and testing of samples.
Code of Federal Regulations, 2013 CFR
2013-10-01
... subpart. (2)(i) Designate the program(s) to be used for each specialty, subspecialty, and analyte or test... procedures, in accordance with § 493.1236(c)(1). (3) For each specialty, subspecialty and analyte or test...
42 CFR 493.801 - Condition: Enrollment and testing of samples.
Code of Federal Regulations, 2012 CFR
2012-10-01
... subpart. (2)(i) Designate the program(s) to be used for each specialty, subspecialty, and analyte or test... procedures, in accordance with § 493.1236(c)(1). (3) For each specialty, subspecialty and analyte or test...
42 CFR 493.801 - Condition: Enrollment and testing of samples.
Code of Federal Regulations, 2010 CFR
2010-10-01
... subpart. (2)(i) Designate the program(s) to be used for each specialty, subspecialty, and analyte or test... procedures, in accordance with § 493.1236(c)(1). (3) For each specialty, subspecialty and analyte or test...
42 CFR 493.801 - Condition: Enrollment and testing of samples.
Code of Federal Regulations, 2011 CFR
2011-10-01
... subpart. (2)(i) Designate the program(s) to be used for each specialty, subspecialty, and analyte or test... procedures, in accordance with § 493.1236(c)(1). (3) For each specialty, subspecialty and analyte or test...
42 CFR 493.801 - Condition: Enrollment and testing of samples.
Code of Federal Regulations, 2014 CFR
2014-10-01
... subpart. (2)(i) Designate the program(s) to be used for each specialty, subspecialty, and analyte or test... procedures, in accordance with § 493.1236(c)(1). (3) For each specialty, subspecialty and analyte or test...
75 FR 5722 - Procedures for Transportation Workplace Drug and Alcohol Testing Programs
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-04
... drugs in a DOT drug test. You must not test ``DOT specimens'' for any other drugs. (a) Marijuana... test analyte concentration analyte concentration Marijuana metabolites 50 ng/mL THCA \\1\\ 15 ng/mL...
Kumar, Keshav; Mishra, Ashok Kumar
2015-07-01
The fluorescence characteristics of 8-anilinonaphthalene-1-sulfonic acid (ANS) in ethanol-water mixtures, in combination with partial least squares (PLS) analysis, were used to propose a simple and sensitive analytical procedure for monitoring the adulteration of ethanol by water. The proposed analytical procedure was found to be capable of detecting even small adulteration levels of ethanol by water. The robustness of the procedure is evident from statistical parameters such as the square of the correlation coefficient (R²), the root mean square error of calibration (RMSEC) and the root mean square error of prediction (RMSEP), which were found to be well within the acceptable limits.
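The reported figures of merit (R², RMSEC, RMSEP) come from a PLS calibration. The sketch below shows such a calibration on synthetic spectra standing in for the ANS fluorescence data; the data, peak model, and number of latent variables are all invented.

```python
# Minimal sketch of a PLS calibration on synthetic "spectra" standing in for ANS
# fluorescence data. All values (water fractions, peak model, noise, 3 latent
# variables) are invented; only the R2/RMSEC/RMSEP bookkeeping mirrors the abstract.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(1)
water_pct = rng.uniform(0, 20, size=40)                      # % water in ethanol
wavelengths = np.linspace(400, 600, 120)
peak_center = 480 + 2.0 * water_pct[:, None]                 # toy solvatochromic shift
X = np.exp(-((wavelengths - peak_center) / 30.0) ** 2) + 0.01 * rng.normal(size=(40, 120))

train, test = np.arange(30), np.arange(30, 40)
pls = PLSRegression(n_components=3).fit(X[train], water_pct[train])
pred_cal = pls.predict(X[train]).ravel()
pred_val = pls.predict(X[test]).ravel()

print("R2   :", round(r2_score(water_pct[train], pred_cal), 3))
print("RMSEC:", round(float(np.sqrt(mean_squared_error(water_pct[train], pred_cal))), 3))
print("RMSEP:", round(float(np.sqrt(mean_squared_error(water_pct[test], pred_val))), 3))
```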
Matrix-enhanced secondary ion mass spectrometry: The Alchemist's solution?
NASA Astrophysics Data System (ADS)
Delcorte, Arnaud
2006-07-01
Because of the requirements of large molecule characterization and high-lateral resolution SIMS imaging, the possibility of improving molecular ion yields by the use of specific sample preparation procedures has recently generated a renewed interest in the static SIMS community. In comparison with polyatomic projectiles, however, signal enhancement by a matrix might appear to some as the alchemist's versus the scientist's solution to the current problems of organic SIMS. In this contribution, I would like to discuss critically the pros and cons of matrix-enhanced SIMS procedures, in the new framework that includes polyatomic ion bombardment. This discussion is based on a short review of the experimental and theoretical developments achieved in the last decade with respect to the three following approaches: (i) blending the analyte with a low-molecular weight organic matrix (MALDI-type preparation procedure); (ii) mixing alkali/noble metal salts with the analyte; (iii) evaporating a noble metal layer on the analyte sample surface (organic molecules, polymers).
Goldstein, S J; Hensley, C A; Armenta, C E; Peters, R J
1997-03-01
Recent developments in extraction chromatography have simplified the separation of americium from complex matrices in preparation for alpha-spectroscopy relative to traditional methods. Here we present results of procedures developed/adapted for water, air, and bioassay samples with less than 1 g of inorganic residue. Prior analytical methods required the use of a complex, multistage procedure for separation of americium from these matrices. The newer, simplified procedure requires only a single 2 mL extraction chromatographic separation for isolation of Am and lanthanides from other components of the sample. This method has been implemented on an extensive variety of "real" environmental and bioassay samples from the Los Alamos area, and consistently reliable and accurate results with appropriate detection limits have been obtained. The new method increases analytical throughput by a factor of approximately 2 and decreases environmental hazards from acid and mixed-waste generation relative to the prior technique. Analytical accuracy, reproducibility, and reliability are also significantly improved over the more complex and laborious method used previously.
Sample Collection Procedures and Strategies
Individuals responsible for collecting environmental and building material samples following a contamination incident can use these procedures to plan for and/or collect samples for analysis using the analytical methods listed in EPA's SAM.
Mirasoli, Mara; Guardigli, Massimo; Michelini, Elisa; Roda, Aldo
2014-01-01
Miniaturization of analytical procedures through microchips, lab-on-a-chip or micro total analysis systems is one of the most recent trends in chemical and biological analysis. These systems are designed to perform all the steps in an analytical procedure, with the advantages of low sample and reagent consumption, fast analysis, reduced costs, and the possibility of extra-laboratory application. A range of detection technologies have been employed in miniaturized analytical systems, but most applications have relied on fluorescence and electrochemical detection. Chemical luminescence (which includes chemiluminescence, bioluminescence, and electrogenerated chemiluminescence) represents an alternative detection principle that offers comparable (or better) analytical performance and easier implementation in miniaturized analytical devices. Nevertheless, chemical luminescence-based devices represent only a small fraction of the microfluidic devices reported in the literature, and until now no review has focused on them. Here we review the most relevant applications (since 2009) of miniaturized analytical devices based on chemical luminescence detection. After a brief overview of the main chemical luminescence systems and of the recent technological advancements regarding their implementation in miniaturized analytical devices, analytical applications are reviewed according to the nature of the device (microfluidic chips, microchip electrophoresis, lateral flow- and paper-based devices) and the type of application (micro-flow injection assays, enzyme assays, immunoassays, gene probe hybridization assays, cell assays, whole-cell biosensors). Copyright © 2013 Elsevier B.V. All rights reserved.
2013-01-01
Background Healthcare delivery is largely accomplished in and through conversations between people, and healthcare quality and effectiveness depend enormously upon the communication practices employed within these conversations. An important body of evidence about these practices has been generated by conversation analysis and related discourse analytic approaches, but there has been very little systematic reviewing of this evidence. Methods We developed an approach to reviewing evidence from conversation analytic and related discursive research through the following procedures: • reviewing existing systematic review methods and our own prior experience of applying these; • clarifying distinctive features of conversation analytic and related discursive work which must be taken into account when reviewing; • holding discussions within a review advisory team that included members with expertise in healthcare research, conversation analytic research, and systematic reviewing; • attempting and then refining procedures through conducting an actual review which examined evidence about how people talk about difficult future issues including illness progression and dying. Results We produced a step-by-step guide which we describe here in terms of eight stages, and which we illustrate from our ‘Review of Future Talk’. The guide incorporates both established procedures for systematic reviewing, and new techniques designed for working with conversation analytic evidence. Conclusions The guide is designed to inform systematic reviews of conversation analytic and related discursive evidence on specific domains and topics. Whilst we designed it for reviews that aim at informing healthcare practice and policy, it is flexible and could be used for reviews with other aims, for instance those aiming to underpin research programmes and projects. We advocate systematically reviewing conversation analytic and related discursive findings using this approach in order to translate them into a form that is credible and useful to healthcare practitioners, educators and policy-makers. PMID:23721181
Johnson, Bruce D; Golub, Andrew
2007-09-01
There are numerous analytic and methodological limitations to current measures of drug market activity. This paper explores the structure of markets and individual user behavior to provide an integrated understanding of behavioral and economic (and market) aspects of illegal drug use with an aim toward developing improved procedures for measurement. This involves understanding the social processes that structure illegal distribution networks and drug users' interactions with them. These networks are where and how social behaviors, prices, and markets for illegal drugs intersect. Our focus is upon getting an up-close measurement of these activities. Building better measures of consumption behaviors necessitates building better rapport with subjects than typically achieved with one-time surveys in order to overcome withholding and underreporting and to get a comprehensive understanding of the processes involved. This can be achieved through repeated interviews and observations of behaviors. This paper also describes analytic advances that could be adopted to direct this inquiry, including behavioral templates and insights into the economic valuation of labor inputs and cash expenditures for various illegal drugs. Additionally, the paper makes recommendations to funding organizations for developing the mechanisms that would support behavioral scientists to weigh specimens and to collect small samples for laboratory analysis, by providing protection from the potential for arrest. The primary focus is upon U.S. markets. The implications for other countries are discussed.
Behavior analytic approaches to problem behavior in intellectual disabilities.
Hagopian, Louis P; Gregory, Meagan K
2016-03-01
The purpose of the current review is to summarize recent behavior analytic research on problem behavior in individuals with intellectual disabilities. We have focused our review on studies published from 2013 to 2015, but also included earlier studies that were relevant. Behavior analytic research on problem behavior continues to focus on the use and refinement of functional behavioral assessment procedures and function-based interventions. During the review period, a number of studies reported on procedures aimed at making functional analysis procedures more time efficient. Behavioral interventions continue to evolve, and there were several larger scale clinical studies reporting on multiple individuals. There was increased attention on the part of behavioral researchers to develop statistical methods for analysis of within subject data and continued efforts to aggregate findings across studies through evaluative reviews and meta-analyses. Findings support continued utility of functional analysis for guiding individualized interventions and for classifying problem behavior. Modifications designed to make functional analysis more efficient relative to the standard method of functional analysis were reported; however, these require further validation. Larger scale studies on behavioral assessment and treatment procedures provided additional empirical support for effectiveness of these approaches and their sustainability outside controlled clinical settings.
Selecting Statistical Procedures for Quality Control Planning Based on Risk Management.
Yago, Martín; Alcover, Silvia
2016-07-01
According to the traditional approach to statistical QC planning, the performance of QC procedures is assessed in terms of its probability of rejecting an analytical run that contains critical size errors (PEDC). Recently, the maximum expected increase in the number of unacceptable patient results reported during the presence of an undetected out-of-control error condition [Max E(NUF)], has been proposed as an alternative QC performance measure because it is more related to the current introduction of risk management concepts for QC planning in the clinical laboratory. We used a statistical model to investigate the relationship between PEDC and Max E(NUF) for simple QC procedures widely used in clinical laboratories and to construct charts relating Max E(NUF) with the capability of the analytical process that allow for QC planning based on the risk of harm to a patient due to the report of erroneous results. A QC procedure shows nearly the same Max E(NUF) value when used for controlling analytical processes with the same capability, and there is a close relationship between PEDC and Max E(NUF) for simple QC procedures; therefore, the value of PEDC can be estimated from the value of Max E(NUF) and vice versa. QC procedures selected by their high PEDC value are also characterized by a low value for Max E(NUF). The PEDC value can be used for estimating the probability of patient harm, allowing for the selection of appropriate QC procedures in QC planning based on risk management. © 2016 American Association for Clinical Chemistry.
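The probability of detecting a critical-size error with a simple QC rule can be illustrated by simulation. The sketch below assumes Gaussian measurement error, a single 1_3s rule, and the common approximation that the critical systematic shift equals the sigma-metric minus 1.65; these assumptions are illustrative and are not the statistical model used in the paper.

```python
# Minimal sketch (by simulation): probability that a 1_3s rule with N control
# measurements per run detects a systematic shift of critical size. Assumes
# Gaussian error and the approximation delta_crit = sigma_metric - 1.65; both
# are illustrative simplifications, not the model used in the paper.
import numpy as np

def p_error_detection(sigma_metric, n_controls=1, limit=3.0, runs=200_000, seed=0):
    rng = np.random.default_rng(seed)
    delta_crit = sigma_metric - 1.65          # critical shift, in SD units
    z = rng.normal(loc=delta_crit, size=(runs, n_controls))
    rejected = np.any(np.abs(z) > limit, axis=1)
    return rejected.mean()

for sm in (4.0, 5.0, 6.0):
    print(f"sigma-metric {sm}: P(detect critical error) ≈ {p_error_detection(sm):.2f}")
```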
2016-01-01
The theory of inhomogeneous analytic materials is developed. These are materials where the coefficients entering the equations involve analytic functions. Three types of analytic materials are identified. The first two types involve an integer p. If p takes its maximum value, then we have a complete analytic material. Otherwise, it is an incomplete analytic material of rank p. For two-dimensional materials, further progress can be made in the identification of analytic materials by using the well-known fact that a 90° rotation applied to a divergence-free field in a simply connected domain yields a curl-free field, and this can then be expressed as the gradient of a potential. Other exact results for the fields in inhomogeneous media are reviewed. Also reviewed is the subject of metamaterials, as these materials provide a way of realizing desirable coefficients in the equations. PMID:27956882
Loading-unloading response of circular GLARE fiber-metal laminates under lateral indentation
NASA Astrophysics Data System (ADS)
Tsamasphyros, George J.; Bikakis, George S.
2015-01-01
GLARE is a Fiber-Metal laminated material used in aerospace structures which are frequently subjected to various impact damages. Hence, the response of GLARE plates subjected to lateral indentation is very important. In this paper, analytical expressions are derived and a non-linear finite element modeling procedure is proposed in order to predict the static load-indentation curves of circular GLARE plates during loading and unloading by a hemispherical indentor. We have recently published analytical formulas and a finite element procedure for the static indentation of circular GLARE plates, which are now used during the loading stage. Here, considering that the aluminum layers are in a state of membrane yield and employing energy balance during unloading, the unloading path is determined. Using this unloading path, an algebraic equation is derived for calculating the permanent dent depth of the GLARE plate after the indentor's withdrawal. Furthermore, our finite element procedure is modified in order to simulate the unloading stage as well. The derived formulas and the proposed finite element modeling procedure are applied successfully to GLARE 2-2/1-0.3 and to GLARE 3-3/2-0.4 circular plates. The analytical results are compared with corresponding FEM results and a good agreement is found. The analytically calculated permanent dent depth is within 6 % of the corresponding numerically calculated result for the GLARE 2 plate and within 7 % for the GLARE 3 plate. No other solution of this problem is known to the authors.
40 CFR 265.92 - Sampling and analysis.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) Analytical procedures; and (4) Chain of custody control. [Comment: See “Procedures Manual For Ground-water... characterizing the suitability of the ground water as a drinking water supply, as specified in appendix III. (2...
40 CFR 265.92 - Sampling and analysis.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) Analytical procedures; and (4) Chain of custody control. [Comment: See “Procedures Manual For Ground-water... characterizing the suitability of the ground water as a drinking water supply, as specified in appendix III. (2...
40 CFR 265.92 - Sampling and analysis.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) Analytical procedures; and (4) Chain of custody control. [Comment: See “Procedures Manual For Ground-water... characterizing the suitability of the ground water as a drinking water supply, as specified in appendix III. (2...
14 CFR 34.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.
Code of Federal Regulations, 2012 CFR
2012-01-01
..., DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 34.82...
Liquefaction Resistance Based on Shear Wave Velocity
DOT National Transportation Integrated Search
1999-01-01
This report reviews the current simplified procedures for evaluating the liquefaction resistance of granular soil deposits using small-strain shear wave velocity. These procedures were developed from analytical studies, laboratory studies, or very li...
Hemmings, Annette
2009-12-01
This paper explores ethical dilemmas in situated fieldwork ethics concerning ethnographic studies of adolescent students. While consequentialist and deontological ethics form the basis of the ethical stances shared by ethnographers and research ethics committees, the interpretation of those principles may diverge in school-based ethnography with adolescent students because of the particular role of the adult ethnographer vis-à-vis developmentally immature adolescents not held legally responsible for many of their actions. School ethnographers attempt to build trust with adolescent participants in order to learn about their hidden cultural worlds, which may involve activities that are very harmful to the youths involved. They face many difficult and sometimes unexpected choices, including whether to intervene and how to represent events and adolescents in published findings. Scenarios with examples drawn from research conducted in public high schools are used to illustrate and explicate dilemmas in formal research and latent insider/outsider roles and relations involving harmful adolescent behaviors, advocacy, and psychological trauma. Also examined are analytical procedures used to construct interpretations leading to representations of research participants in the resulting publication.
Recent developments in nickel electrode analysis
NASA Technical Reports Server (NTRS)
Whiteley, Richard V.; Daman, M. E.; Kaiser, E. Q.
1991-01-01
Three aspects of nickel electrode analysis for Nickel-Hydrogen and Nickel-Cadmium battery cell applications are addressed: (1) the determination of active material; (2) charged state nickel (as NiOOH + CoOOH); and (3) potassium ion content in the electrode. Four deloading procedures are compared for completeness of active material removal, and deloading conditions for efficient active material analyses are established. Two methods for charged state nickel analysis are compared: the current NASA procedure and a new procedure based on the oxidation of sodium oxalate by the charged material. Finally, a method for determining potassium content in an electrode sample by flame photometry is presented along with analytical results illustrating differences in potassium levels from vendor to vendor and the effects of stress testing on potassium content in the electrode. The relevance of these analytical procedures to electrode performance is reviewed.
Ferrario, J; Byrne, C; Dupuy, A E
1997-06-01
The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for their detection and quantitation. Most of these procedures are based on established sample preparation and analytical techniques employing high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS), which are used for the analyses of dioxin/furans at low parts-per-trillion (ppt) levels. A significant and widespread problem that arises when using these sample preparation procedures for the analysis of coplanar PCBs is the presence of background levels of these congeners. Industrial processes, urban incineration, leaking electrical transformers, hazardous waste accidents, and improper waste disposal practices have released appreciable quantities of PCBs into the environment. This contamination has resulted in the global distribution of these compounds via the atmosphere and their ubiquitous presence in ambient air. The background presence of these compounds in method blanks must be addressed when determining the exact concentrations of these and other congeners in environmental samples. In this study reliable procedures were developed to accurately define these background levels and assess their variability over the course of the study. The background subtraction procedures developed and employed increase the probability that the values reported accurately represent the concentrations found in the samples and were not biased due to this background contamination.
Communication Network Analysis Methods.
ERIC Educational Resources Information Center
Farace, Richard V.; Mabee, Timothy
This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…
Validation of the replica trick for simple models
NASA Astrophysics Data System (ADS)
Shinzato, Takashi
2018-04-01
We discuss the replica analytic continuation using several simple models in order to prove mathematically the validity of the replica analysis, which is used in a wide range of fields related to large-scale complex systems. While replica analysis consists of two analytical techniques—the replica trick (or replica analytic continuation) and the thermodynamical limit (and/or order parameter expansion)—we focus our study on replica analytic continuation, which is the mathematical basis of the replica trick. We apply replica analysis to solve a variety of analytical models, and examine the properties of replica analytic continuation. Based on the positive results for these models we propose that replica analytic continuation is a robust procedure in replica analysis.
SAMPLING AND ANALYSIS OF MERCURY IN CRUDE OIL
Sampling and analytical procedures used to determine total mercury content in crude oils were examined. Three analytical methods were compared with respect to accuracy, precision and detection limit. The combustion method and a commercial extraction method were found adequate to...
Code of Federal Regulations, 2014 CFR
2014-07-01
... PROCEDURES Equipment, Measurement Instruments, Fuel, and Analytical Gas Specifications § 1066.101 Overview. (a) This subpart addresses equipment related to emission testing, as well as test fuels and... specifications for fuels, engine fluids, and analytical gases; these specifications apply for testing under this...
Cacho, Juan Ignacio; Campillo, Natalia; Viñas, Pilar; Hernández-Córdoba, Manuel
2016-01-01
A new procedure based on direct insert microvial thermal desorption injection allows the direct analysis of ionic liquid extracts by gas chromatography and mass spectrometry (GC-MS). For this purpose, an in situ ionic liquid dispersive liquid-liquid microextraction (in situ IL DLLME) has been developed for the quantification of bisphenol A (BPA), bisphenol Z (BPZ) and bisphenol F (BPF). Different parameters affecting the extraction efficiency of the microextraction technique and the thermal desorption step were studied. The optimized procedure, determining the analytes as acetyl derivatives, provided detection limits of 26, 18 and 19 ng L⁻¹ for BPA, BPZ and BPF, respectively. The release of the three analytes from plastic containers was monitored using this newly developed analytical method. Analysis of the migration test solutions for 15 different plastic containers in daily use identified the presence of the analytes at concentrations ranging between 0.07 and 37 μg L⁻¹ in six of the samples studied, BPA being the most commonly found and at higher concentrations than the other analytes.
Aprea, Maria Cristina; Scapellato, Maria Luisa; Valsania, Maria Carmen; Perico, Andrea; Perbellini, Luigi; Ricossa, Maria Cristina; Pradella, Marco; Negri, Sara; Iavicoli, Ivo; Lovreglio, Piero; Salamon, Fabiola; Bettinelli, Maurizio; Apostoli, Pietro
2017-04-21
Biological reference values (RVs) explore the relationships between humans and their environment and habits. RVs are fundamental in the environmental field for assessing illnesses possibly associated with environmental pollution, and also in the occupational field, especially in the absence of established biological or environmental limits. The Italian Society for Reference Values (SIVR) determined to test criteria and procedures for the definition of RVs to be used in the environmental and occupational fields. The paper describes the SIVR methodology for defining RVs of xenobiotics and their metabolites. Aspects regarding the choice of population sample, the quality of analytical data, statistical analysis and control of variability factors are considered. The simultaneous interlaboratory circuits involved can be expected to increasingly improve the quality of the analytical data. Examples of RVs produced by SIVR are presented. In particular, levels of chromium, mercury, ethylenethiourea, 3,5,6-trichloro-2-pyridinol, 2,5-hexanedione, 1-hydroxypyrene and t,t-muconic acid measured in urine and expressed in micrograms/g creatinine (μg/g creat) or micrograms/L (μg/L) are reported. With the proposed procedure, SIVR intends to make its activities known to the scientific community in order to increase the number of laboratories involved in the definition of RVs for the Italian population. More research is needed to obtain further RVs in different biological matrices, such as hair, nails and exhaled breath. It is also necessary to update and improve the present reference values and broaden the portfolio of chemicals for which RVs are available. In the near future, SIVR intends to expand its scientific activity by using a multivariate approach for xenobiotics that may have a common origin, and to define RVs separately for children who may be exposed more than adults and be more vulnerable.
NASA Astrophysics Data System (ADS)
López-García, Ignacio; Marín-Hernández, Juan Jose; Perez-Sirvent, Carmen; Hernandez-Cordoba, Manuel
2017-04-01
The toxicity of arsenic and its wide distribution in nature need no emphasis nowadays, and the value of reliable analytical tools for arsenic determination at very low levels is clear. Leaving aside atomic fluorescence spectrometers specifically designed for this purpose, the task is currently carried out by using inductively coupled plasma mass spectrometry (ICP-MS), a powerful but expensive technique that is not available in all laboratories. However, as the recent literature clearly shows, a similar or even better analytical performance for the determination of several elements can be achieved by replacing the ICP-MS instrument with an AAS spectrometer (which is commonly present in any laboratory and involves low acquisition and maintenance costs) provided that a simple microextraction step is used to preconcentrate the sample. This communication reports the optimization and results obtained with a new analytical procedure based on this idea and focused on the determination of very low concentrations of arsenic in waters and extracts from soils and sediments. The procedure is based on a micro-solid phase extraction process for the separation and preconcentration of arsenic that uses magnetic particles covered with silver nanoparticles functionalized with the sodium salt of 2-mercaptoethane-sulphonate (MESNa). This composite is easily prepared in the laboratory. After the sample is treated with a low amount (only a few milligrams) of the magnetic material, the solid phase is separated by means of a magnetic field and then introduced into an electrothermal atomizer (ETAAS) for arsenic determination. The preconcentration factor is close to 200, with a detection limit below 0.1 µg L⁻¹ arsenic. Speciation of As(III) and As(V) can be achieved by means of two extractions carried out at different acidities. The results for total arsenic are verified using certified reference materials. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) and to the Spanish MINECO (Project CTQ2015-68049-R) for financial support.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitri, F. G., E-mail: F.G.Mitri@ieee.org
2015-11-14
Using the partial-wave series expansion method in cylindrical coordinates, a formal analytical solution for the acoustical scattering of a 2D cylindrical quasi-Gaussian beam with an arbitrary angle of incidence θi, focused on a rigid elliptical cylinder in a non-viscous fluid, is developed. The cylindrical focused beam expression is an exact solution of the Helmholtz equation. The scattering coefficients for the elliptical cylinder are determined by forcing the expression of the total (incident + scattered) field to satisfy the Neumann boundary condition for a rigid immovable surface, and performing the product of matrices involving an inversion procedure. Computations for the matrix elements require a single numerical integration procedure for each partial-wave mode. Numerical results are performed with particular emphasis on the focusing properties of the incident beam and its angle of incidence with respect to the major axis a of the ellipse as well as the aspect ratio a/b, where b is the minor axis (assuming a > b). The method is validated and verified against previous results obtained via the T-matrix for plane waves. The present analysis is the first to consider an acoustical beam on an elliptic cylinder of variable cross-section as opposed to plane waves of infinite extent. Other 2D non-spherical and Chebyshev surfaces are mentioned that may be examined through this analytical formalism assuming a small deformation parameter ε.
Nowak, Donald E; Aloe, Ariel M
2014-12-01
The problem of gambling addiction can be especially noteworthy among college and university students, many of whom have the resources, proximity, free time, and desire to become involved in the myriad options of gambling now available. Although limited attention has been paid specifically to college student gambling in the body of literature, there have been two published meta-analyses estimating the prevalence of probable pathological gambling among college students. The present study aims to be the third, presenting an up-to-date proportion of students exhibiting gambling pathology, and is the first to include international studies from outside the United States and Canada. The purpose of this study was to use the most up-to-date meta-analytical procedures to synthesize the rates of probable pathological gambling for college and university students worldwide. A thorough literature review and coding procedure resulted in 19 independent data estimates retrieved from 18 studies conducted between 2005 and 2013. To synthesize the studies, a random effects model for meta-analysis was applied. The estimated proportion of probable pathological gamblers among the over 13,000 college students surveyed was computed at 10.23%, considerably higher than in either of the two previously published meta-analyses, and more than double the rate reported in the first meta-analysis of this type published in 1999. Implications and recommendations for future practice in dealing with college students and gambling addiction are outlined and described for both administrators and mental health professionals.
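The pooling step named in the abstract, a random-effects meta-analysis of prevalence estimates, can be sketched as a DerSimonian-Laird model on the logit scale. The study counts below are invented and do not correspond to the 19 data estimates synthesized in the paper.

```python
# Minimal sketch of DerSimonian-Laird random-effects pooling of prevalence
# estimates on the logit scale. The (cases, n) pairs are invented.
import math

studies = [(45, 500), (80, 900), (30, 250)]          # hypothetical (gamblers, sample size)

logits, variances = [], []
for cases, n in studies:
    p = (cases + 0.5) / (n + 1.0)                    # continuity-corrected proportion
    logits.append(math.log(p / (1 - p)))
    variances.append(1.0 / (cases + 0.5) + 1.0 / (n - cases + 0.5))

w = [1 / v for v in variances]
fixed = sum(wi * yi for wi, yi in zip(w, logits)) / sum(w)
q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, logits))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(studies) - 1)) / c)        # between-study variance

w_re = [1 / (v + tau2) for v in variances]
pooled_logit = sum(wi * yi for wi, yi in zip(w_re, logits)) / sum(w_re)
pooled = 1 / (1 + math.exp(-pooled_logit))
print(f"pooled prevalence ≈ {100 * pooled:.1f}%")
```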
ERIC Educational Resources Information Center
Yogev, Sara; Brett, Jeanne
This paper offers a conceptual framework for the intersection of work and family roles based on the constructs of work involvement and family involvement. The theoretical and empirical literature on the intersection of work and family roles is reviewed from two analytical approaches. From the individual level of analysis, the literature reviewed…
Lima, Manoel J A; Fernandes, Ridvan N; Tanaka, Auro A; Reis, Boaventura F
2016-02-01
This paper describes a new technique for the determination of captopril in pharmaceutical formulations, implemented by employing multicommuted flow analysis. The analytical procedure was based on the reaction between hypochlorite and captopril. The remaining hypochlorite oxidized luminol, generating electromagnetic radiation that was detected using a homemade luminometer. To the best of our knowledge, this is the first time that this reaction has been exploited for the determination of captopril in pharmaceutical products, offering a clean analytical procedure with minimal reagent usage. The effectiveness of the proposed procedure was confirmed by analyzing a set of pharmaceutical formulations. Application of the paired t-test showed that there was no significant difference between the data sets at a 95% confidence level. The useful features of the new analytical procedure included a linear response for captopril concentrations in the range 20.0-150.0 µmol/L (r = 0.997), a limit of detection (3σ) of 2.0 µmol/L, a sample throughput of 164 determinations per hour, reagent consumption of 9 µg luminol and 42 µg hypochlorite per determination, and generation of 0.63 mL of waste. A relative standard deviation of 1% (n = 6) for a standard solution containing 80 µmol/L captopril was also obtained. Copyright © 2015 John Wiley & Sons, Ltd.
Standardless quantification by parameter optimization in electron probe microanalysis
NASA Astrophysics Data System (ADS)
Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.
2012-11-01
A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists of minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively.
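The core of the approach, minimizing the quadratic difference between a measured spectrum and a parameterized analytical description, can be illustrated with a generic least-squares fit. The Gaussian-plus-linear-background model, the line energies, and all numbers below are simplifying assumptions for illustration, not the spectral model implemented in POEMA.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical analytical description of a spectrum: a linear background plus
# Gaussian characteristic lines whose heights play the role of the optimized
# parameters in a POEMA-like standardless quantification (a sketch only).
def model(params, energy, line_energies):
    b0, b1 = params[:2]
    heights, width = params[2:-1], params[-1]
    spectrum = b0 + b1 * energy
    for h, e0 in zip(heights, line_energies):
        spectrum += h * np.exp(-0.5 * ((energy - e0) / width) ** 2)
    return spectrum

def fit_spectrum(energy, counts, line_energies):
    x0 = np.r_[counts.min(), 0.0, np.full(len(line_energies), counts.max()), 0.07]
    res = least_squares(lambda p: model(p, energy, line_energies) - counts, x0)
    return res.x  # optimized background, line heights, and peak width

# Toy usage with a synthetic two-line spectrum (Si and Fe K-alpha energies, keV)
energy = np.linspace(0.5, 10.0, 500)
true = model(np.array([50.0, -2.0, 3000.0, 1500.0, 0.07]), energy, [1.74, 6.40])
counts = np.random.poisson(np.clip(true, 0, None)).astype(float)
print(fit_spectrum(energy, counts, [1.74, 6.40]))
```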
PPM mixtures of formaldehyde in gas cylinders: Stability and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, K.C.; Miller, S.B.; Patterson, L.M.
1999-07-01
Scott Specialty Gases has been successful in producing stable calibration gases of formaldehyde at low concentration. Critical to this success has been the development of a treatment process for high-pressure aluminum cylinders. Formaldehyde cylinders having concentrations of 20 ppm and 4 ppm were found to show only a small decline in concentration over a period of approximately 12 months. Since no NIST-traceable formaldehyde standards (or Standard Reference Materials) are available, all Scott's formaldehyde cylinders were originally certified by the traditional impinger method. This method involves an extremely tedious purification procedure for 2,4-dinitrophenylhydrazine (2,4-DNPH). A modified version of the impinger method has been developed that does not require extensive reagent purification for formaldehyde analysis. Extremely low formaldehyde blanks have been obtained with the modified method. The HPLC conditions in the original method were used for chromatographic separations. The modified method results in a lower analytical uncertainty for the formaldehyde standard mixtures. Consequently, it is possible to discern small differences between analytical results that are important for stability studies.
Scaling Laws Applied to a Modal Formulation of the Aeroservoelastic Equations
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S.
2002-01-01
A method of scaling is described that easily converts the aeroelastic equations of motion of a full-sized aircraft into ones of a wind-tunnel model. To implement the method, a set of rules is provided for the conversion process involving matrix operations with scale factors. In addition, a technique for analytically incorporating a spring mounting system into the aeroelastic equations is also presented. As an example problem, a finite element model of a full-sized aircraft is introduced from the High Speed Research (HSR) program to exercise the scaling method. With a set of scale factor values, a brief outline is given of a procedure to generate the first-order aeroservoelastic analytical model representing the wind-tunnel model. To verify the scaling process as applied to the example problem, the root-locus patterns from the full-sized vehicle and the wind-tunnel model are compared to see if the root magnitudes scale with the frequency scale factor value. Selected time-history results are given from a numerical simulation of an active-controlled wind-tunnel model to demonstrate the utility of the scaling process.
Naing, Nyi Nyi; Li, Sam Fong Yau; Lee, Hian Kee
2015-12-24
A fast and low-cost sample preparation method based on graphene-based dispersive solid-phase extraction combined with gas chromatography-mass spectrometric (GC-MS) analysis was developed. The procedure involves an initial extraction with a water-immiscible organic solvent, followed by a rapid clean-up using amine-functionalized reduced graphene oxide as sorbent. Simple and fast one-step in situ derivatization using trimethylphenylammonium hydroxide was subsequently applied to acidic pharmaceuticals serving as model analytes (ibuprofen, gemfibrozil, naproxen, ketoprofen and diclofenac) before GC-MS analysis. Parameters affecting the derivatization and extraction efficiency, such as volume of derivatization agent, desorption solvent, pH and ionic strength, were investigated. Under the optimum conditions, the method demonstrated good limits of detection ranging from 1 to 16 ng L(-1), linearity (from 0.01 to 50 and 0.05 to 50 μg L(-1), depending on the analyte) and satisfactory repeatability of extraction (relative standard deviations below 13%, n=3). Copyright © 2015 Elsevier B.V. All rights reserved.
de Oliveira, Gabriel Barros; de Castro Gomes Vieira, Carolyne Menezes; Orlando, Ricardo Mathias; Faria, Adriana Ferreira
2017-10-15
This work involved the optimization and validation of a method, according to Directive 2002/657/EC and the Analytical Quality Assurance Manual of Ministério da Agricultura, Pecuária e Abastecimento, Brazil, for simultaneous extraction and determination of fumonisins B1 and B2 in maize. The extraction procedure was based on a matrix solid phase dispersion approach, the optimization of which employed a sequence of different factorial designs. A liquid chromatography-tandem mass spectrometry method was developed for determining these analytes using the selected reaction monitoring mode. The optimized method employed only 1 g of silica gel for dispersion and elution with 70% ammonium formate aqueous buffer (50 mmol L-1, pH 9), representing a simple, cheap and chemically friendly sample preparation method. Trueness (recoveries: 86-106%), precision (RSD ≤19%), decision limits, detection capabilities and measurement uncertainties were calculated for the validated method. The method scope was expanded to popcorn kernels, white maize kernels and yellow maize grits. Copyright © 2017 Elsevier Ltd. All rights reserved.
Chu, Khim Hoong
2017-11-09
Surface diffusion coefficients may be estimated by fitting solutions of a diffusion model to batch kinetic data. For non-linear systems, a numerical solution of the diffusion model's governing equations is generally required. We report here the application of the classic Langmuir kinetics model to extract surface diffusion coefficients from batch kinetic data. The use of the Langmuir kinetics model in lieu of the conventional surface diffusion model allows derivation of an analytical expression. The parameter estimation procedure requires determining the Langmuir rate coefficient from which the pertinent surface diffusion coefficient is calculated. Surface diffusion coefficients within the 10⁻⁹ to 10⁻⁶ cm²/s range obtained by fitting the Langmuir kinetics model to experimental kinetic data taken from the literature are found to be consistent with the corresponding values obtained from the traditional surface diffusion model. The virtue of this simplified parameter estimation method is that it reduces the computational complexity as the analytical expression involves only an algebraic equation in closed form which is easily evaluated by spreadsheet computation.
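A minimal sketch of the fitting step is given below, assuming the standard batch Langmuir kinetic equation with a liquid-phase mass balance; the closed-form relation that converts the fitted rate coefficient into a surface diffusion coefficient is specific to the paper and is not reproduced here.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import curve_fit

# Batch Langmuir kinetics: dq/dt = ka*C*(qm - q) - kd*q, with the liquid-phase
# mass balance C = C0 - (m/V)*q.  The Langmuir rate coefficient ka is fitted to
# uptake data; converting ka to a surface diffusion coefficient uses the paper's
# closed-form relation, which is NOT reproduced here.
QM, KD, C0, M_OVER_V = 150.0, 0.02, 100.0, 0.5   # assumed illustrative constants

def q_of_t(t, ka):
    rhs = lambda _t, q: ka * (C0 - M_OVER_V * q[0]) * (QM - q[0]) - KD * q[0]
    sol = solve_ivp(rhs, (0.0, t.max()), [0.0], t_eval=t, rtol=1e-8)
    return sol.y[0]

# Toy "experimental" uptake data (mg/g) at sampling times (min)
t_obs = np.array([1, 2, 5, 10, 20, 40, 60, 120], float)
q_obs = q_of_t(t_obs, 5e-4) * (1 + 0.03 * np.random.randn(t_obs.size))

ka_fit, _ = curve_fit(q_of_t, t_obs, q_obs, p0=[1e-3])
print("fitted Langmuir rate coefficient:", ka_fit[0])
```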
Santoro, Valentina; Aigotti, Riccardo; Gastaldi, Daniela; Romaniello, Francesco; Forte, Emanuele; Magni, Martina; Baiocchi, Claudio
2018-01-01
Interesterification is an industrial transformation process aiming to change the physico-chemical properties of vegetable oils by redistributing fatty acid positions within the original constituent triglycerides. In the confectionery industry, controlling the degree of formation of positional isomers is important in order to obtain fats with the desired properties. Silver ion HPLC (High Performance Liquid Chromatography) is the analytical technique usually adopted to separate triglycerides (TAGs) having different unsaturation degrees. However, separation of TAG positional isomers is a challenge when the number of double bonds is the same and the only difference is in their position within the triglyceride molecule. The TAG positional isomers involved in the present work have a structural specificity that requires a separation method tailored to the needs of the confectionery industry. The aim of this work was to obtain a chromatographic resolution allowing reliable qualitative and quantitative evaluation of TAG positional isomers within reasonably short retention times, while remaining robust with respect to repeatability and reproducibility. The resulting analytical procedure was applied both to confectionery raw materials and final products. PMID:29462917
Marchese, Stefano; Perret, Daniela; Gentili, Alessandra; D'Ascenzo, Guiseppe; Faberi, Angelo
2002-01-01
An evaluation was made of the feasibility of using reversed-phase liquid chromatography/tandem mass spectrometry with an electrospray interface (LC/ESI-MS/MS) to measure traces of phenoxyacid herbicides and their metabolites in surface and drinking water samples. The procedure involved passing 0.5 L of river and drinking water samples through a 0.5 g graphitized carbon black (GCB) extraction cartridge. Recovery was higher than 85% irrespective of the aqueous matrix in which the analytes were dissolved. A conventional 4.6-mm i.d. reversed-phase LC C-18 column operating with a mobile phase flow rate of 1 mL/min was used to chromatograph the analytes. A flow of 200 microL/min of the column effluent was diverted to the ESI source. The limits of detection (signal-to-noise ratio = 3) of the method for the pesticides considered in drinking and surface water samples are less than 0.1 ng/L for phenoxyacid herbicides, and about 5-10 ng/L for their metabolites (2,4-dichlorophenol and 4-chloro-2-methylphenol). Copyright 2001 John Wiley & Sons, Ltd.
Chen, Ching-Ho; Wu, Ray-Shyan; Liu, Wei-Lin; Su, Wen-Ray; Chang, Yu-Min
2009-01-01
Some countries, including Taiwan, have adopted strategic environmental assessment (SEA) to assess and modify proposed policies, plans, and programs (PPPs) in the planning phase for pursuing sustainable development. However, Taiwan's system contained only a few sketchy steps focused on policy assessment. This study aims to develop a methodology for SEA in Taiwan to enhance the effectiveness associated with PPPs. The proposed methodology comprises an SEA procedure involving PPP management and assessment in various phases, a sustainable assessment framework, and an SEA management system. The SEA procedure is devised based on theoretical considerations from systems thinking and the regulatory requirements in Taiwan. The positive and negative impacts on ecology, society, and economy are simultaneously considered in the planning (including policy generation and evaluation), implementation, and control phases of the procedure. This study used the analytic hierarchy process, Delphi technique, and systems analysis to develop a sustainable assessment framework. An SEA management system was built based on geographic information system software to process spatial, attribute, and satellite image data during the assessment procedure. The proposed methodology was applied in the SEA of the golf course installation policy in 2001 as a case study, which was the first SEA in Taiwan. Most of the 82 existing golf courses in 2001 were installed on slope lands and caused a serious ecological impact. Assessment results indicated that 15 future golf courses installed on marginal lands (including buffer zones, remedied lands, and wastelands) were acceptable because the comprehensive environmental (ecological, social, and economic) assessment value was better, given the environmental characteristics and management regulations of Taiwan. The SEA procedure in the planning phase for this policy was completed, but the implementation phase was not begun because the related legislative procedure could not be arranged owing to the resistance of a few senators. A self-review of the control phase was carried out in 2006 using this methodology. Installation permits for 12 courses on slope lands were terminated after 2001, and 27 future courses could then be installed on marginal lands. The assessment value of this policy using the data on ecological, social, and economic conditions from 2006 was higher than that using the data from 2001. The analytical results illustrate that the proposed methodology can be used to assist the relevant authorities with SEA effectively and efficiently.
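The abstract names the analytic hierarchy process (AHP) as the weighting step of the sustainable assessment framework. A minimal sketch of the standard principal-eigenvector AHP calculation is shown below, assuming an invented 3x3 pairwise comparison matrix for ecological, social, and economic criteria; these judgments are not the study's actual data.

```python
import numpy as np

# Principal-eigenvector AHP weighting: derive criterion weights (ecological,
# social, economic) from an expert pairwise-comparison matrix.  The matrix is
# a made-up example.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                   # normalized priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
cr = ci / 0.58                                 # random index for n = 3
print("weights:", w.round(3), "consistency ratio:", round(cr, 3))
```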
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2006-01-01
The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance/quality-control data for the time period addressed in this report were stored in the laboratory's SAS data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality-control samples analyzed from July 1997 through June 1999. Results for the quality-control samples for 18 analytical procedures were evaluated for bias and precision. Control charts indicate that data for eight of the analytical procedures were occasionally biased for either high-concentration and (or) low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, total monomeric aluminum, total aluminum, ammonium, calcium, chloride, specific conductance, and sulfate. The data from the potassium and sodium analytical procedures are insufficient for evaluation. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 11 of 13 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. Blank analysis results for chloride showed that 22 percent of blanks did not meet data-quality objectives and results for dissolved organic carbon showed that 31 percent of the blanks did not meet data-quality objectives. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 14 of the 18 analytes. At least 90 percent of the samples met data-quality objectives for all analytes except total aluminum (70 percent of samples met objectives) and potassium (83 percent of samples met objectives). Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated good data quality for most constituents over the time period. The P-sample (low-ionic-strength constituents) analysis had good ratings in two of these studies and a satisfactory rating in the third. The results of the T-sample (trace constituents) analysis indicated high data quality with good ratings in all three studies. The N-sample (nutrient constituents) studies had one each of excellent, good, and satisfactory ratings. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 80 percent of the samples met data-quality objectives for 9 of the 13 analytes; the exceptions were dissolved organic carbon, ammonium, chloride, and specific conductance. Data-quality objectives were not met for dissolved organic carbon in two NWRI studies, but all of the samples were within control limits for the last study. Data-quality objectives were not met in 41 percent of samples analyzed for ammonium, 25 percent of samples analyzed for chloride, and 30 percent of samples analyzed for specific conductance. Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 84 percent of the samples analyzed for calcium, chloride, magnesium, pH, and potassium. Data-quality objectives were met by 73 percent of those analyzed for sulfate. The data-quality objective was not met for sodium. The data are insufficient for evaluation of the specific conductance results.
New test techniques and analytical procedures for understanding the behavior of advanced propellers
NASA Technical Reports Server (NTRS)
Stefko, G. L.; Bober, L. J.; Neumann, H. E.
1983-01-01
Analytical procedures and experimental techniques were developed to improve the capability to design advanced high speed propellers. Some results from the propeller lifting line and lifting surface aerodynamic analysis codes are compared with propeller force data, probe data and laser velocimeter data. In general, the code comparisons with data indicate good qualitative agreement. A rotating propeller force balance demonstrated good accuracy and reduced test time by 50 percent. Results from three propeller flow visualization techniques are shown which illustrate some of the physical phenomena occurring on these propellers.
Low level vapor verification of monomethyl hydrazine
NASA Technical Reports Server (NTRS)
Mehta, Narinder
1990-01-01
The vapor scrubbing system and the coulometric test procedure for the low level vapor verification of monomethyl hydrazine (MMH) are evaluated. Experimental data on precision, efficiency of the scrubbing liquid, instrument response, detection and reliable quantitation limits, stability of the vapor scrubbed solution, and interference were obtained to assess the applicability of the method for the low ppb level detection of the analyte vapor in air. The results indicated that the analyte vapor scrubbing system and the coulometric test procedure can be utilized for the quantitative detection of low ppb level vapor of MMH in air.
Larson, S.J.; Capel, P.D.; VanderLoop, A.G.
1996-01-01
Laboratory and quality assurance procedures for the analysis of ground-water samples for herbicides at the Management Systems Evaluation Area near Princeton, Minnesota are described. The target herbicides include atrazine, de-ethylatrazine, de-isopropylatrazine, metribuzin, alachlor, 2,6-diethylaniline, and metolachlor. The analytical techniques used are solid-phase extraction, and analysis by gas chromatography with mass-selective detection. Descriptions of cleaning procedures, preparation of standard solutions, isolation of analytes from water, sample transfer methods, instrumental analysis, and data analysis are included.
Sulej, Anna Maria; Polkowska, Żaneta; Astel, Aleksander; Namieśnik, Jacek
2013-12-15
The purpose of this study is to propose and evaluate new procedures for the determination of fuel combustion products, anti-corrosive and de-icing compounds in runoff water samples collected from airports located in different regions and characterized by different levels of activity, expressed by the number of flights and the number of passengers (per year). The most difficult step in the analytical procedure used for the determination of PAHs, benzotriazoles and glycols is the sample preparation stage, owing to the diverse matrix composition and the possibility of interference from components with similar physicochemical properties. In this study, five different versions of sample preparation using extraction techniques such as LLE and SPE were tested. In all examined runoff water samples collected from the airports, the presence of PAH compounds and glycols was observed. In the majority of the samples, BT compounds were determined. Runoff water samples collected from the areas of Polish and British international airports as well as local airports had a similar qualitative composition, but the quantitative composition of the analytes was very diverse. The new, validated analytical methodologies ensure that the information necessary for assessing the negative impact of airport activities on the environment can be obtained. © 2013 Elsevier B.V. All rights reserved.
Nechaeva, Daria; Shishov, Andrey; Ermakov, Sergey; Bulatov, Andrey
2018-06-01
An easily performed, miniaturized, cheap, selective and sensitive procedure for the determination of H2S in fuel oil samples, based on headspace liquid-phase microextraction followed by cyclic voltammetric detection using a paper-based analytical device (PAD), was developed. A modified wax dipping method was applied to fabricate the PAD. The PAD included hydrophobic zones for the sample and the supporting electrolyte, connected by a hydrophilic channel. The sample and supporting-electrolyte zones were connected with nickel working, platinum auxiliary and Ag/AgCl reference electrodes. The analytical procedure included separation of H2S from the fuel oil sample by headspace liquid-phase microextraction into an alkaline solution. The resulting sulfide ion solution and the supporting electrolyte were then dropped onto their respective zones, followed by analyte detection at +0.45 V. Under the optimized conditions, the H2S concentration in the range from 2 to 20 mg kg-1 showed a good linear relation with the peak current. The limit of detection (3σ) was 0.6 mg kg-1. The procedure was successfully applied to the analysis of fuel oil samples. Copyright © 2018 Elsevier B.V. All rights reserved.
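For orientation, figures such as the linear range and the 3σ detection limit quoted above come from ordinary linear-calibration arithmetic of the kind sketched below; the calibration points and the blank standard deviation are invented, not the paper's data.

```python
import numpy as np

# Generic linear calibration and 3-sigma detection limit (illustrative numbers only).
conc = np.array([2.0, 5.0, 10.0, 15.0, 20.0])             # H2S standards, mg kg-1
peak_current = np.array([0.42, 1.01, 2.05, 3.02, 4.10])   # peak current, arbitrary units

slope, intercept = np.polyfit(conc, peak_current, 1)       # least-squares line
residual_sd = np.std(peak_current - (slope * conc + intercept), ddof=2)
sigma_blank = 0.04                                          # assumed blank SD (a.u.)
lod = 3 * sigma_blank / slope                               # 3-sigma detection limit

print(f"slope {slope:.4f} a.u. per mg kg-1, residual SD {residual_sd:.3f} a.u., "
      f"LOD {lod:.2f} mg kg-1")
```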
Are Higher Education Institutions Prepared for Learning Analytics?
ERIC Educational Resources Information Center
Ifenthaler, Dirk
2017-01-01
Higher education institutions and involved stakeholders can derive multiple benefits from learning analytics by using different data analytics strategies to produce summative, real-time, and predictive insights and recommendations. However, are institutions and academic as well as administrative staff prepared for learning analytics? A learning…
Determination of Total Carbohydrates in Algal Biomass: Laboratory Analytical Procedure (LAP)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Wychen, Stefanie; Laurens, Lieve M. L.
This procedure uses two-step sulfuric acid hydrolysis to hydrolyze the polymeric forms of carbohydrates in algal biomass into monomeric subunits. The monomers are then quantified by either HPLC or a suitable spectrophotometric method.
1990 National Water Quality Laboratory Services Catalog
Pritt, Jeffrey; Jones, Berwyn E.
1989-01-01
PREFACE This catalog provides information about analytical services available from the National Water Quality Laboratory (NWQL) to support programs of the Water Resources Division of the U.S. Geological Survey. To assist personnel in the selection of analytical services, the catalog lists cost, sample volume, applicable concentration range, detection level, precision of analysis, and preservation techniques for samples to be submitted for analysis. Prices for services reflect operational costs, the complexity of each analytical procedure, and the costs to ensure analytical quality control. The catalog consists of five parts. Part 1 is a glossary of terminology; Part 2 lists the bottles, containers, solutions, and other materials that are available through the NWQL; Part 3 describes the field processing of samples to be submitted for analysis; Part 4 describes analytical services that are available; and Part 5 contains indices of analytical methodology and Chemical Abstract Services (CAS) numbers. Nomenclature used in the catalog is consistent with WATSTORE and STORET. The user is provided with laboratory codes and schedules that consist of groupings of parameters which are measured together in the NWQL. In cases where more than one analytical range is offered for a single element or compound, different laboratory codes are given. Book 5 of the series 'Techniques of Water Resources Investigations of the U.S. Geological Survey' should be consulted for more information about the analytical procedures included in the tabulations. This catalog supersedes U.S. Geological Survey Open-File Report 86-232 '1986-87-88 National Water Quality Laboratory Services Catalog', October 1985.
Brooks, M.H.; Schroder, L.J.; Willoughby, T.C.
1987-01-01
The U.S. Geological Survey operated a blind audit sample program during 1984 to test the effects of the sample handling and shipping procedures used by the National Atmospheric Deposition Program and National Trends Network on the quality of wet deposition data produced by the combined networks. Blind audit samples, which were dilutions of standard reference water samples, were submitted by network site operators to the central analytical laboratory disguised as actual wet deposition samples. Results from the analyses of blind audit samples were used to calculate estimates of analyte bias associated with all network wet deposition samples analyzed in 1984 and to estimate analyte precision. Concentration differences between double-blind samples that were submitted to the central analytical laboratory and separate analyses of aliquots of those blind audit samples that had not undergone network sample handling and shipping were used to calculate analyte masses that apparently were added to each blind audit sample by routine network handling and shipping procedures. These calculated masses indicated statistically significant biases for magnesium, sodium, potassium, chloride, and sulfate. Median calculated masses were 41.4 micrograms (ug) for calcium, 14.9 ug for magnesium, 23.3 ug for sodium, 0.7 ug for potassium, 16.5 ug for chloride, and 55.3 ug for sulfate. Analyte precision was estimated using two different sets of replicate measures performed by the central analytical laboratory. Estimated standard deviations were similar to those previously reported. (Author's abstract)
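The bias calculation described above amounts to multiplying the shipped-minus-unshipped concentration difference by the sample volume. The sketch below shows that arithmetic; the sample volume and concentrations are assumed placeholders, not values from the report.

```python
# Apparent analyte mass added by handling/shipping: concentration difference
# between the shipped blind audit sample and its unshipped aliquot, times the
# sample volume.  Volume and concentrations below are assumptions.
SAMPLE_VOLUME_L = 0.25                    # assumed wet-deposition sample volume

def added_mass_ug(conc_shipped_mg_per_l, conc_aliquot_mg_per_l, volume_l=SAMPLE_VOLUME_L):
    """Return the apparent analyte addition in micrograms."""
    return (conc_shipped_mg_per_l - conc_aliquot_mg_per_l) * volume_l * 1000.0

# Example with invented sodium concentrations (mg/L)
print(round(added_mass_ug(0.152, 0.059), 1), "ug of sodium apparently added")
```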
MASTER ANALYTICAL SCHEME FOR ORGANIC COMPOUNDS IN WATER: PART 1. PROTOCOLS
A Master Analytical Scheme (MAS) has been developed for the analysis of volatile (gas chromatographable) organic compounds in water. In developing the MAS, it was necessary to evaluate and modify existing analysis procedures and develop new techniques to produce protocols that pr...
Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology.
Jesus, Mafalda; Martins, Ana P J; Gallardo, Eugenia; Silvestre, Samuel
2016-01-01
Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata, Smilax China, and Trigonella foenum graecum. This bioactive phytochemical not only is used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has also revealed high potential and interest in the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Among these, the HPLC technique coupled to different detectors is the most commonly described analytical procedure for this compound. However, other alternative methods have also been published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin and on the most relevant analytical techniques used to isolate, detect, and quantify this compound as well.
Quifer-Rada, Paola; Martínez-Huélamo, Miriam; Lamuela-Raventos, Rosa M
2017-07-19
Phenolic compounds are present in human fluids (plasma and urine) mainly as glucuronidated and sulfated metabolites. Up to now, due to the unavailability of standards, enzymatic hydrolysis has been the method of choice in analytical chemistry to quantify these phase II phenolic metabolites. Enzymatic hydrolysis procedures vary in enzyme concentration, pH and temperature; however, there is a lack of knowledge about the stability of polyphenols in their free form during the process. In this study, we evaluated the stability of 7 phenolic acids, 2 flavonoids and 3 prenylflavanoids in urine during enzymatic hydrolysis to assess the suitability of this analytical procedure, using three different concentrations of β-glucuronidase/sulfatase enzymes from Helix pomatia. The results indicate that enzymatic hydrolysis negatively affected the recovery of the precursor and free-form polyphenols present in the sample. Thus, enzymatic hydrolysis does not seem an ideal analytical strategy to quantify glucuronidated and sulfated polyphenol metabolites.
Zietze, Stefan; Müller, Rainer H; Brecht, René
2008-03-01
In order to set up a batch-to-batch consistency analytical scheme for N-glycosylation analysis, several sample preparation steps, including enzyme digestions and fluorophore labelling, and two HPLC methods were established. The whole method scheme was standardized, evaluated and validated according to the requirements for analytical testing in early clinical drug development, using a recombinantly produced reference glycoprotein (RGP). Standardization of the methods was achieved through clearly defined standard operating procedures. During evaluation of the methods, the major interest was the determination of oligosaccharide losses within the analytical scheme. Validation of the methods was performed with respect to specificity, linearity, repeatability, LOD and LOQ. Because reference N-glycan standards were not available, a statistical approach was chosen to derive accuracy from the linearity data. After the validation procedure was completed, defined limits for method variability could be calculated, and differences observed in consistency analysis could be separated into significant and incidental ones.
The Importance of Method Selection in Determining Product Integrity for Nutrition Research.
Mudge, Elizabeth M; Betz, Joseph M; Brown, Paula N
2016-03-01
The American Herbal Products Association estimates that there are as many as 3000 plant species in commerce. The FDA estimates that there are about 85,000 dietary supplement products in the marketplace. The pace of product innovation far exceeds that of analytical methods development and validation, with new ingredients, matrixes, and combinations resulting in an analytical community that has been unable to keep up. This has led to a lack of validated analytical methods for dietary supplements and to inappropriate method selection where methods do exist. Only after rigorous validation procedures to ensure that methods are fit for purpose should they be used in a routine setting to verify product authenticity and quality. By following systematic procedures and establishing performance requirements for analytical methods before method development and validation, methods can be developed that are both valid and fit for purpose. This review summarizes advances in method selection, development, and validation regarding herbal supplement analysis and provides several documented examples of inappropriate method selection and application. © 2016 American Society for Nutrition.
Risk analysis of analytical validations by probabilistic modification of FMEA.
Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J
2012-05-01
Risk analysis is a valuable addition to validation of an analytical chemistry process, enabling detection not only of technical risks but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequency and maintaining the categorical scoring of severity. In an example, the results of traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are re-interpreted by this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure mode(s) can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
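A minimal sketch of the probabilistic re-scoring idea follows: occurrence and detection are expressed as relative frequencies, severity stays categorical, and failure modes are ranked by the expected frequency of undetected occurrence weighted by severity. The failure modes and numbers are invented for illustration and are not taken from the NIR example.

```python
# Probabilistic FMEA sketch with invented failure modes and frequencies.
failure_modes = [
    # (name, p_occurrence per analysis, p_detection, severity 1-10)
    ("wrong sample placed in NIR holder", 0.002, 0.90, 8),
    ("spectral library out of date",      0.010, 0.50, 6),
    ("operator skips reference scan",     0.005, 0.95, 4),
]

def undetected_rate(p_occ, p_det):
    # expected frequency of an occurrence that goes undetected
    return p_occ * (1.0 - p_det)

ranked = sorted(failure_modes,
                key=lambda fm: undetected_rate(fm[1], fm[2]) * fm[3],
                reverse=True)
for name, p_occ, p_det, sev in ranked:
    print(f"{name:35s} undetected rate {undetected_rate(p_occ, p_det):.4f} severity {sev}")
```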
Medical immunology: two-way bridge connecting bench and bedside.
Rijkers, Ger T; Damoiseaux, Jan G M C; Hooijkaas, Herbert
2014-12-01
Medical immunology in The Netherlands is a laboratory specialism dealing with immunological analyses as well as pre- and post-analytical consultation to clinicians (clinical immunologists and other specialists) involved in patients with immune mediated diseases. The scope of medical immunology includes immunodeficiencies, autoimmune diseases, allergy, transfusion and transplantation immunology, and lymphoproliferative disorders plus the monitoring of these patients. The training, professional criteria, quality control of procedures and laboratories is well organized. As examples of the bridge function of medical immunology between laboratory (bench) and patient (bedside) the contribution of medical immunologists to diagnosis and treatment of primary immunodeficiency diseases (in particular: humoral immunodeficiencies) as well as autoantibodies (anti-citrullinated proteins in rheumatoid arthritis) are given. Copyright © 2014 Elsevier B.V. All rights reserved.
Accelerated testing of space mechanisms
NASA Technical Reports Server (NTRS)
Murray, S. Frank; Heshmat, Hooshang
1995-01-01
This report contains a review of various existing life prediction techniques used for a wide range of space mechanisms. Life prediction techniques utilized in other non-space fields such as turbine engine design are also reviewed for applicability to many space mechanism issues. The development of new concepts on how various tribological processes are involved in the life of the complex mechanisms used for space applications is examined. A 'roadmap' for the complete implementation of a tribological prediction approach for complex mechanical systems, including standard procedures for test planning, analytical models for life prediction, and experimental verification of the life prediction and accelerated testing techniques, is discussed. A plan is presented to demonstrate a method for predicting the life and/or performance of a selected space mechanism mechanical component.
Analysis of general-aviation accidents using ATC radar records
NASA Technical Reports Server (NTRS)
Wingrove, R. C.; Bach, R. E., Jr.
1982-01-01
It is pointed out that general aviation aircraft usually do not carry flight recorders, and in accident investigations the only available data may come from the Air Traffic Control (ATC) records. A description is presented of a technique for deriving time-histories of aircraft motions from ATC radar records. The employed procedure involves a smoothing of the raw radar data. The smoothed results, in combination with other available information (meteorological data and aircraft aerodynamic data) are used to derive the expanded set of motion time-histories. Applications of the considered analytical methods are related to different types of aircraft, such as light piston-props, executive jets, and commuter turboprops, as well as different accident situations, such as takeoff, climb-out, icing, and deep stall.
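As an illustration of the smoothing-and-differentiation step, the sketch below applies a Savitzky-Golay filter to noisy radar position returns to recover a ground-speed time history; the filter choice, the 4-second sweep interval, and the toy track are assumptions, not the procedure actually used in the study.

```python
import numpy as np
from scipy.signal import savgol_filter

DT = 4.0                                     # assumed radar sweep interval (s)

def track_velocities(x_m, y_m, dt=DT, window=9, poly=3):
    # Smooth and differentiate each position coordinate in one pass,
    # then combine into ground speed.
    vx = savgol_filter(x_m, window, poly, deriv=1, delta=dt)
    vy = savgol_filter(y_m, window, poly, deriv=1, delta=dt)
    return np.hypot(vx, vy)                  # ground speed (m/s)

# Toy track: roughly 60 m/s eastbound with a slow north-south weave and noise
rng = np.random.default_rng(2)
t = np.arange(0, 200, DT)
x = 60.0 * t + rng.normal(0, 30, t.size)
y = 300.0 * np.sin(0.005 * t) + rng.normal(0, 30, t.size)
print(track_velocities(x, y)[:5].round(1))
```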
Subwavelength structured surfaces and their applications
NASA Technical Reports Server (NTRS)
Raguin, Daniel H.; Morris, G. Michael
1993-01-01
The term subwavelength structured (SWS) surface describes any surface that contains a subwavelength-period grating or gratings. The grating may be of any type provided the period is sufficiently fine so that, unlike conventional gratings, no diffraction orders propagate other than the zeroth orders. Because of the fine periods involved, the fabrication of such surfaces for applications in the visible and infrared portions of the spectral regime has only recently been considered. With refinements in holographic procedures and the push of the semiconductor industry for submicron lithography, production of SWS surfaces is becoming increasingly viable. The topics covered include the following: analytic approaches to analyze SWS surfaces, 1D periodic stratification and effective medium theory, design of waveplates using form birefringence, and 2D binary antireflection structured surfaces.
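The zeroth-order effective-medium description mentioned above reduces, for a 1D lamellar grating, to the standard textbook limiting forms below, where f is the fill factor and n1, n2 are the refractive indices of the two media; the full analysis in the source goes beyond these expressions.

```latex
% Zeroth-order effective-medium indices for a 1D subwavelength lamellar grating:
n_{\mathrm{TE}}^{2} = f\,n_1^{2} + (1-f)\,n_2^{2}, \qquad
\frac{1}{n_{\mathrm{TM}}^{2}} = \frac{f}{n_1^{2}} + \frac{1-f}{n_2^{2}},
\qquad \Delta n = n_{\mathrm{TE}} - n_{\mathrm{TM}} .
% TE: electric field parallel to the grooves; TM: perpendicular.  The form
% birefringence \Delta n is what waveplate designs exploit.
```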
Investigations Into Tank Venting for Propellant Resupply
NASA Technical Reports Server (NTRS)
Hearn, H. C.; Harrison, Robert A. (Technical Monitor)
2002-01-01
Models and simulations have been developed and applied to the evaluation of propellant tank ullage venting, which is integral to one approach for propellant resupply. The analytical effort was instrumental in identifying issues associated with resupply objectives, and it was used to help develop an operational procedure to accomplish the desired propellant transfer for a particular storable bipropellant system. Work on the project was not completed, and several topics have been identified as requiring further study; these include the potential for liquid entrainment during low-g venting and thermal/freezing effects in the vent line and orifice. Verification of the feasibility of this propellant venting and resupply approach still requires additional analyses as well as testing to investigate the fluid and thermodynamic phenomena involved.
A Discounting Framework for Choice With Delayed and Probabilistic Rewards
Green, Leonard; Myerson, Joel
2005-01-01
When choosing between delayed or uncertain outcomes, individuals discount the value of such outcomes on the basis of the expected time to or the likelihood of their occurrence. In an integrative review of the expanding experimental literature on discounting, the authors show that although the same form of hyperbola-like function describes discounting of both delayed and probabilistic outcomes, a variety of recent findings are inconsistent with a single-process account. The authors also review studies that compare discounting in different populations and discuss the theoretical and practical implications of the findings. The present effort illustrates the value of studying choice involving both delayed and probabilistic outcomes within a general discounting framework that uses similar experimental procedures and a common analytical approach. PMID:15367080
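The hyperbola-like function referred to above is commonly written in the following form, where A is the nominal amount, D the delay, p the probability of receipt, k and h are discounting-rate parameters, and s is a sensitivity exponent; this generic form is quoted here only for orientation.

```latex
% Hyperbola-like ("hyperboloid") discounting of delayed and probabilistic rewards:
V_{\mathrm{delay}} = \frac{A}{(1 + kD)^{s}}, \qquad
V_{\mathrm{prob}}  = \frac{A}{(1 + h\,\Theta)^{s}}, \qquad
\Theta = \frac{1-p}{p},
% where \Theta is the odds against receiving the probabilistic outcome.
```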
Pre-analytical issues in the haemostasis laboratory: guidance for the clinical laboratories.
Magnette, A; Chatelain, M; Chatelain, B; Ten Cate, H; Mullier, F
2016-01-01
Ensuring quality has become a daily requirement in laboratories. In haemostasis, even more than in other disciplines of biology, quality is determined by a pre-analytical step that encompasses all procedures, starting with the formulation of the medical question, and includes patient preparation, sample collection, handling, transportation, processing, and storage until time of analysis. This step, based on a variety of manual activities, is the most vulnerable part of the total testing process and is a major component of the reliability and validity of results in haemostasis and constitutes the most important source of erroneous or un-interpretable results. Pre-analytical errors may occur throughout the testing process and arise from unsuitable, inappropriate or wrongly handled procedures. Problems may arise during the collection of blood specimens such as misidentification of the sample, use of inadequate devices or needles, incorrect order of draw, prolonged tourniquet placing, unsuccessful attempts to locate the vein, incorrect use of additive tubes, collection of unsuitable samples for quality or quantity, inappropriate mixing of a sample, etc. Some factors can alter the result of a sample constituent after collection during transportation, preparation and storage. Laboratory errors can often have serious adverse consequences. Lack of standardized procedures for sample collection accounts for most of the errors encountered within the total testing process. They can also have clinical consequences as well as a significant impact on patient care, especially those related to specialized tests as these are often considered as "diagnostic". Controlling pre-analytical variables is critical since this has a direct influence on the quality of results and on their clinical reliability. The accurate standardization of the pre-analytical phase is of pivotal importance for achieving reliable results of coagulation tests and should reduce the side effects of the influence factors. This review is a summary of the most important recommendations regarding the importance of pre-analytical factors for coagulation testing and should be a tool to increase awareness about the importance of pre-analytical factors for coagulation testing.
Panos, Joseph A.; Hoffman, Joshua T.; Wordeman, Samuel C.; Hewett, Timothy E.
2016-01-01
Background: Correction of neuromuscular impairments after anterior cruciate ligament injury is vital to successful return to sport. Frontal plane knee control during landing is a common measure of lower-extremity neuromuscular control, and asymmetries in neuromuscular control of the knee can predispose injured athletes to additional injury and associated morbidities. Therefore, this study investigated the effects of anterior cruciate ligament injury on knee biomechanics during landing. Methods: Two-dimensional frontal plane video of single leg drop, cross over drop, and drop vertical jump dynamic movement trials was analyzed for twenty injured and reconstructed athletes. The position of the knee joint center was tracked in ImageJ software for 500 milliseconds after landing to calculate medio-lateral knee motion velocities and determine normal fluency, the number of times per second knee velocity changed direction. The inverse of this calculation, analytical fluency, was used to associate larger numerical values with fluent movement. Findings: Analytical fluency was decreased in involved limbs for single leg drop trials (P=0.0018). Importantly, analytical fluency for single leg drop differed compared to cross over drop trials for involved (P<0.001), but not uninvolved limbs (P=0.5029). For involved limbs, analytical fluency values exhibited a stepwise trend in relative magnitudes. Interpretation: Decreased analytical fluency in involved limbs is consistent with previous studies. Fluency asymmetries observed during single leg drop tasks may be indicative of aberrant landing strategies in the involved limb. Analytical fluency differences in unilateral tasks for injured limbs may represent neuromuscular impairment as a result of injury. PMID:26895446
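A minimal sketch of the fluency computation described above: differentiate the tracked medio-lateral knee-joint-centre position over the 500 ms window, count velocity direction changes per second (normal fluency), and take the inverse (analytical fluency). The 120 Hz frame rate and the toy trace are assumptions, not the study's video settings or data.

```python
import numpy as np

FPS = 120                                   # assumed camera frame rate (Hz)

def fluency(x_mm, fps=FPS, window_s=0.5):
    n = int(round(window_s * fps))
    v = np.diff(x_mm[:n + 1]) * fps         # medio-lateral velocity (mm/s)
    sign_changes = np.sum(np.diff(np.sign(v)) != 0)
    normal_fluency = sign_changes / window_s       # direction changes per second
    analytical_fluency = np.inf if sign_changes == 0 else 1.0 / normal_fluency
    return normal_fluency, analytical_fluency

# Toy trace: a damped medio-lateral oscillation sampled at 120 Hz
t = np.arange(0, 0.6, 1 / FPS)
x = 5 * np.exp(-3 * t) * np.sin(2 * np.pi * 4 * t)
print(fluency(x))
```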
Kouri, T T; Gant, V A; Fogazzi, G B; Hofmann, W; Hallander, H O; Guder, W G
2000-07-01
Improved standardized performance is needed because urinalysis continues to be one of the most frequently requested laboratory tests. Since 1997, the European Confederation of Laboratory Medicine (ECLM) has been supporting an interdisciplinary project aiming to produce European urinalysis guidelines. More than seventy clinical chemists, microbiologists and ward-based clinicians, as well as representatives of manufacturers are taking part. These guidelines aim to improve the quality and consistency of chemical urinalysis, particle counting and bacterial culture by suggesting optimal investigative processes that could be applied in Europe. The approach is based on medical needs for urinalysis. The importance of the pre-analytical stage for total quality is stressed by detailed illustrative advice for specimen collection. Attention is also given to emerging automated technology. For cost containment reasons, both optimum (ideal) procedures and minimum analytical approaches are suggested. Since urinalysis mostly lacks genuine reference methods (primary reference measurement procedures; Level 4), a novel classification of the methods is proposed: comparison measurement procedures (Level 3), quantitative routine procedures (Level 2), and ordinal scale examinations (Level 1). Stepwise strategies are suggested to save costs, applying different rules for general and specific patient populations. New analytical quality specifications have been created. After a consultation period, the final written text will be published in full as a separate document.
Lamminpää, A; Riihimäki, V
1992-11-01
Pesticide-related incidents are uncommon in Finland. They comprised 0.11% of all hospitalizations due to poisoning in 1987-88. A search of the nationwide Hospital Discharge Register revealed 78 pesticide-related incidents in a 5-year period. Some 30 different agents were involved, the most frequent being organophosphates and MCPA. Only 36 cases (46%) were judged to be unequivocal or probable pesticide poisonings; 26 (33%) were probably other illnesses, given no or minimal exposure; of the children admitted for follow-up, nine (12%) had potentially marked exposure but developed no poisoning owing to vigorous early treatment that limited absorption; and seven (9%) cases remained undetermined. According to our analysis, the management of patients with (suspected) pesticide poisoning at hospitals could be further improved if the following procedures were emphasised: decontamination of the skin when appropriate, systematic early estimation of the likely dose involved, analytical verification of pesticide absorption whenever feasible, and consistent collaboration with a toxicological advisory service.
Schwaighofer, Andreas; Kuligowski, Julia; Quintás, Guillermo; Mayer, Helmut K; Lendl, Bernhard
2018-06-30
Analysis of proteins in bovine milk is usually tackled by time-consuming analytical approaches involving wet-chemical, multi-step sample clean-up procedures. The use of external cavity-quantum cascade laser (EC-QCL) based IR spectroscopy was evaluated as an alternative screening tool for direct and simultaneous quantification of individual proteins (i.e. casein and β-lactoglobulin) and total protein content in commercial bovine milk samples. Mid-IR spectra of protein standard mixtures were used for building partial least squares (PLS) regression models. A sample set comprising different milk types (pasteurized; differently processed extended shelf life, ESL; ultra-high temperature, UHT) was analysed and results were compared to reference methods. Concentration values of the QCL-IR spectroscopy approach obtained within several minutes are in good agreement with reference methods involving multiple sample preparation steps. The potential application as a fast screening method for estimating the heat load applied to liquid milk is demonstrated. Copyright © 2018 Elsevier Ltd. All rights reserved.
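The chemometric step, PLS regression from mid-IR spectra onto protein concentrations, might look like the sketch below using scikit-learn; the synthetic spectra, component count, and concentration ranges are placeholders rather than the EC-QCL data set or the models built in the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Synthetic stand-in for spectra of protein standard mixtures: two fake
# component spectra mixed according to casein / beta-lactoglobulin levels.
rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 40, 300
concentrations = rng.uniform([20.0, 1.0], [35.0, 5.0], size=(n_samples, 2))   # g/L
pure_spectra = rng.random((2, n_wavenumbers))
spectra = concentrations @ pure_spectra + 0.01 * rng.standard_normal((n_samples, n_wavenumbers))

pls = PLSRegression(n_components=4)
predicted = cross_val_predict(pls, spectra, concentrations, cv=5)
rmsecv = np.sqrt(np.mean((predicted - concentrations) ** 2, axis=0))
print("RMSECV per protein (g/L):", rmsecv.round(3))
```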
Stegmann, Benedikt; Dörfelt, Anett; Haen, Ekkehard
2016-02-01
For psychostimulants, a marked individual variability in the dose-response relationship and large differences in plasma concentrations after similar doses are known. Therefore, optimizing the efficacy of these drugs is at present the most promising way to exploit their full pharmacological potential. Moreover, it seems important to examine oral fluid as a less invasive biological matrix for its benefit in therapeutic drug monitoring for patients with hyperkinetic disorder. A high-performance liquid chromatography method for quantification of methylphenidate (MPH), dexamphetamine (DXA), and atomoxetine in serum and oral fluid has been developed and validated. The analytical procedure involves liquid-liquid extraction, derivatization with 4-(4,5-diphenyl-1H-imidazol-2-yl)benzoyl chloride as a label, and chromatographic separation on a Phenomenex Gemini-NX C18 analytical column using gradient elution with water-acetonitrile. The derivatized analytes were detected at 330 nm (excitation wavelength) and 440 nm (emission wavelength). To examine the oral fluid/serum ratios, oral fluid samples were collected simultaneously with blood samples from patients with hyperkinetic disorder. The method allows quantification of all analytes in serum and oral fluid within 16 minutes under the same or similar conditions. Oral fluid/serum ratios for MPH and DXA were highly variable and showed an accumulation of these drugs in oral fluid. The developed method covers the determination of MPH, DXA, and atomoxetine concentrations in serum and oral fluid after the intake of therapeutic doses. Oral fluid samples are useful for the qualitative detection of MPH and DXA.
Sandhu, Sundeep Kaur; Kellett, Stephen; Hardy, Gillian
2017-11-01
"Exits" in cognitive analytic therapy (CAT) are methods that change unhelpful patterns or roles during the final "revision" phase of the therapy. How exits are conceived and achieved is currently poorly understood. This study focussed on the revision stage to explore and define how change is accomplished in CAT. Qualitative content analysis studied transcripts of sessions 6 and 7 of a protocol delivered 8-session CAT treatment for depression. Eight participants met the study inclusion criteria, and therefore, 16 sessions were analysed. The exit model developed contained 3 distinct (but interacting) phases: (a) developing an observing self via therapist input or client self-reflection, (b) breaking out of old patterns by creating new roles and procedures, and (c) utilisation of a range of methods to support and maintain change. Levels of interrater reliability for the exit categories that formed the model were good. The revision stage of CAT emerged as a complex and dynamic process involving 3 interacting stages. Further research is recommended to understand how exits relate to durability of change and whether change processes differ according to presenting problem. Exit work in cognitive analytic therapy is a dynamic process that requires progression through stages of insight, active change, and consolidation. Development of an "observing self" is an important foundation stone for change, and cognitive analytic therapists need to work within the client's zone of proximal development. A number of aspects appear important in facilitating change, such as attending to the process and feelings generated by change talk. Copyright © 2017 John Wiley & Sons, Ltd.
Van Nimmen, Nadine F J; Veulemans, Hendrik A F
2004-05-07
A highly sensitive gas chromatographic-mass spectrometric (GC-MS) analytical method for the determination of the opioid narcotics fentanyl, alfentanil, and sufentanil in industrial hygiene personal air samples and surface contamination wipes was developed and comprehensively validated. Sample preparation involved a single-step extraction of the samples with methanol, fortified with a fixed amount of the penta-deuterated analogues of the opioid narcotics as internal standard. The GC-MS analytical procedure using selected ion monitoring (SIM) was shown to be highly selective. Linearity was shown for levels of extracted wipe and air samples corresponding to at least 0.1-2 times their surface contamination limit (SCL) and, accordingly, to 0.1-2 times their time-weighted average occupational exposure limit (OEL-TWA) based on a full-shift 960 L air sample. Extraction recoveries were determined for spiked air samples and surface wipes and were found to be quantitative for both sampling media in the entire range studied. The air sampling method's limit of detection (LOD) was determined to be 0.4 ng per sample for fentanyl and sufentanil and 1.6 ng per sample for alfentanil, corresponding to less than 1% of their individual OEL for a full-shift air sample (960 L). The limit of quantification (LOQ) was found to be 1.4, 1.2, and 5.0 ng per filter for fentanyl, sufentanil, and alfentanil, respectively. The wipe sampling method had LODs of 4 ng per wipe for fentanyl and sufentanil and 16 ng per wipe for alfentanil, and LOQs of, respectively, 14, 12, and 50 ng per wipe. The analytical intra-assay precision of the air sampling and wipe sampling methods, defined as the coefficient of variation on the analytical result of six replicate spiked media, was below 10 and 5%, respectively, for all opioids at all spike levels. Accuracy expressed as relative error was determined to be below 10%, except for alfentanil at the lowest spike level (-13.1%). The stability of the opioids during simulated air sampling was investigated. For fentanyl and sufentanil a quantitative recovery was observed at all spike levels, while for alfentanil recoveries ranged from 60.3 to 85.4%. When spiked air samples were stored at ambient temperature and at -15 degrees C, quantitative recovery was found for fentanyl and sufentanil after 7 and 14 days. For alfentanil a slight loss seemed to occur upon storage during 7 days, becoming more pronounced after 14 days. Ambient storage of spiked wipes seemed to lead to significant losses of all opioids studied, yielding recoveries of 37.7-88.3%. Upon storage of similar wipes at -15 degrees C a significantly higher recovery was found, ranging from 77.3 to 88.3%. The developed analytical and sampling procedures have been recently applied in an explorative field study, of which the results of surface contamination wipe sampling are presented in this paper. To our knowledge, this is the first study addressing the development and validation of analytical procedures for the assessment of external occupational exposure to potent opioid narcotics.
Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2012-10-24
This plan incorporates U.S. Department of Energy (DOE) Office of Legacy Management (LM) standard operating procedures (SOPs) into environmental monitoring activities and will be implemented at all sites managed by LM. This document provides detailed procedures for the field sampling teams so that samples are collected in a consistent and technically defensible manner. Site-specific plans (e.g., long-term surveillance and maintenance plans, environmental monitoring plans) document background information and establish the basis for sampling and monitoring activities. Information will be included in site-specific tabbed sections to this plan, which identify sample locations, sample frequencies, types of samples, field measurements, and associated analytes for each site. Additionally, within each tabbed section, program directives will be included, when developed, to establish additional site-specific requirements to modify or clarify requirements in this plan as they apply to the corresponding site. A flowchart detailing project tasks required to accomplish routine sampling is displayed in Figure 1. LM environmental procedures are contained in the Environmental Procedures Catalog (LMS/PRO/S04325), which incorporates American Society for Testing and Materials (ASTM), DOE, and U.S. Environmental Protection Agency (EPA) guidance. Specific procedures used for groundwater and surface water monitoring are included in Appendix A. If other environmental media are monitored, SOPs used for air, soil/sediment, and biota monitoring can be found in the site-specific tabbed sections in Appendix D or in site-specific documents. The procedures in the Environmental Procedures Catalog are intended as general guidance and require additional detail from planning documents in order to be complete; the following sections fulfill that function and specify additional procedural requirements to form SOPs. Routine revision of this Sampling and Analysis Plan will be conducted annually at the beginning of each fiscal year when attachments in Appendix D, including program directives and sampling location/analytical tables, will be reviewed by project personnel and updated. The sampling location/analytical tables in Appendix D, however, may have interim updates according to project direction that are not reflected in this plan. Deviations from location/analytical tables in Appendix D prior to sampling will be documented in project correspondence (e.g., startup letters). If significant changes to other aspects of this plan are required before the annual update, then the plan will be revised as needed.
Dai, James Y.; Hughes, James P.
2012-01-01
The meta-analytic approach to evaluating surrogate end points assesses the predictiveness of treatment effect on the surrogate toward treatment effect on the clinical end point based on multiple clinical trials. Definition and estimation of the correlation of treatment effects were developed in linear mixed models and later extended to binary or failure time outcomes on a case-by-case basis. In a general regression setting that covers nonnormal outcomes, we discuss in this paper several metrics that are useful in the meta-analytic evaluation of surrogacy. We propose a unified 3-step procedure to assess these metrics in settings with binary end points, time-to-event outcomes, or repeated measures. First, the joint distribution of estimated treatment effects is ascertained by an estimating equation approach; second, the restricted maximum likelihood method is used to estimate the means and the variance components of the random treatment effects; finally, confidence intervals are constructed by a parametric bootstrap procedure. The proposed method is evaluated by simulations and applications to 2 clinical trials. PMID:22394448
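The 3-step procedure described above ends with a parametric bootstrap for confidence intervals. The sketch below illustrates only that bootstrap idea on a hypothetical set of per-trial treatment-effect estimates, using a crude method-of-moments fit in place of the estimating-equation and REML steps and ignoring estimation-error covariances; it is a schematic aid, not the authors' procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-trial estimated treatment effects: column 0 on the surrogate,
# column 1 on the clinical end point (step 1 of the procedure would supply these
# together with their estimation-error covariances, which this sketch ignores).
effects = np.array([[0.2, 0.3], [0.5, 0.6], [0.1, 0.2], [0.4, 0.5],
                    [0.3, 0.35], [0.6, 0.7], [0.25, 0.3], [0.45, 0.55]])

mean = effects.mean(axis=0)
cov = np.cov(effects.T)                               # crude stand-in for the REML variance components
trial_r2 = cov[0, 1] ** 2 / (cov[0, 0] * cov[1, 1])   # trial-level surrogacy metric

# Parametric bootstrap: resample treatment effects from the fitted bivariate normal
boot = []
for _ in range(2000):
    sim = rng.multivariate_normal(mean, cov, size=len(effects))
    c = np.cov(sim.T)
    boot.append(c[0, 1] ** 2 / (c[0, 0] * c[1, 1]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"trial-level R^2 = {trial_r2:.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")
```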
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ndong, Mamadou; Lauvergnat, David; Nauts, André
2013-11-28
We present new techniques for an automatic computation of the kinetic energy operator in analytical form. These techniques are based on the use of the polyspherical approach and are extended to take into account Cartesian coordinates as well. An automatic procedure is developed where analytical expressions are obtained by symbolic calculations. This procedure is a full generalization of the one presented in Ndong et al., [J. Chem. Phys. 136, 034107 (2012)]. The correctness of the new implementation is analyzed by comparison with results obtained from the TNUM program. We give several illustrations that could be useful for users of the code. In particular, we discuss some cyclic compounds which are important in photochemistry. Among others, we show that choosing a well-adapted parameterization and decomposition into subsystems can allow one to avoid singularities in the kinetic energy operator. We also discuss a relation between polyspherical and Z-matrix coordinates: this comparison could be helpful for building an interface between the new code and a quantum chemistry package.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldstein, S.J.; Hensley, C.A.; Armenta, C.E.
1997-03-01
Recent developments in extraction chromatography have simplified the separation of americium from complex matrices in preparation for α-spectroscopy relative to traditional methods. Here we present results of procedures developed/adapted for water, air, and bioassay samples with less than 1 g of inorganic residue. Prior analytical methods required the use of a complex, multistage procedure for separation of americium from these matrices. The newer, simplified procedure requires only a single 2 mL extraction chromatographic separation for isolation of Am and lanthanides from other components of the sample. This method has been implemented on an extensive variety of 'real' environmental and bioassay samples from the Los Alamos area, and consistently reliable and accurate results with appropriate detection limits have been obtained. The new method increases analytical throughput by a factor of ≈2 and decreases environmental hazards from acid and mixed-waste generation relative to the prior technique. Analytical accuracy, reproducibility, and reliability are also significantly improved over the more complex and laborious method used previously. 24 refs., 2 figs., 2 tabs.
Use of evidence in a categorization task: analytic and holistic processing modes.
Greco, Alberto; Moretti, Stefania
2017-11-01
Category learning performance can be influenced by many contextual factors, but the effects of these factors are not the same for all learners. The present study suggests that these differences can be due to the different ways evidence is used, according to two main basic modalities of processing information, analytically or holistically. In order to test the impact of the information provided, an inductive rule-based task was designed, in which feature salience and comparison informativeness between examples of two categories were manipulated during the learning phases, by introducing and progressively reducing some perceptual biases. To gather data on processing modalities, we devised the Active Feature Composition task, a production task that does not require classifying new items but reproducing them by combining features. At the end, an explicit rating task was performed, which entailed assessing the accuracy of a set of possible categorization rules. A combined analysis of the data collected with these two different tests enabled profiling participants in regard to the kind of processing modality, the structure of representations and the quality of categorial judgments. Results showed that despite the fact that the information provided was the same for all participants, those who adopted analytic processing better exploited evidence and performed more accurately, whereas with holistic processing categorization is perfectly possible but inaccurate. Finally, the cognitive implications of the proposed procedure, with regard to involved processes and representations, are discussed.
Using Analytic Hierarchy Process in Textbook Evaluation
ERIC Educational Resources Information Center
Kato, Shigeo
2014-01-01
This study demonstrates the application of the analytic hierarchy process (AHP) in English language teaching materials evaluation, focusing in particular on its potential for systematically integrating different components of evaluation criteria in a variety of teaching contexts. AHP is a measurement procedure wherein pairwise comparisons are made…
Analytical solution for the advection-dispersion transport equation in layered media
USDA-ARS?s Scientific Manuscript database
The advection-dispersion transport equation with first-order decay was solved analytically for multi-layered media using the classic integral transform technique (CITT). The solution procedure used an associated non-self-adjoint advection-diffusion eigenvalue problem that had the same form and coef...
MASTER ANALYTICAL SCHEME FOR ORGANIC COMPOUNDS IN WATER. PART 2. APPENDICES TO PROTOCOLS
A Master Analytical Scheme (MAS) has been developed for the analysis of volatile (gas chromatographable) organic compounds in water. In developing the MAS, it was necessary to evaluate and modify existing analysis procedures and develop new techniques to produce protocols that pr...
This research program was initiated with the objective of developing, codifying and testing a group of chemical analytical methods for measuring toxic compounds in the exhaust of distillate-fueled engines (i.e., diesel, gas turbine, Stirling, or Rankine cycle powerplants). It is a ...
Computer program for calculating the flow field of supersonic ejector nozzles
NASA Technical Reports Server (NTRS)
Anderson, B. H.
1974-01-01
An analytical procedure for computing the performance of supersonic ejector nozzles is presented. This procedure includes real sonic line effects and an interaction analysis for the mixing process between the two streams. The procedure is programmed in FORTRAN 4 and has operated successfully on IBM 7094, IBM 360, CDC 6600, and Univac 1108.
ERIC Educational Resources Information Center
Dunn, William N.; And Others
This volume presents in one collection a systematic inventory of research and analytic procedures appropriate for generating information on knowledge production, diffusion, and utilization, gathered by the University of Pittsburgh Program for the Study of Knowledge Use. The main concern is with those procedures that focus on the utilization of…
This protocol describes the procedures for weighing, handling, and archiving aerosol filters and for managing the associated analytical and quality assurance data. Filter samples were weighed for aerosol mass at RTI laboratory, with only the automated field sampling data transfer...
Iqbal, Sahar; Mustansar, Tazeen
2017-03-01
Sigma is a metric that quantifies the performance of a process as a rate of defects per million opportunities. In clinical laboratories, sigma metric analysis is used to assess the performance of the laboratory process system. The sigma metric is also used as a quality management strategy for a laboratory process, improving quality by addressing errors after their identification. The aim of this study is to evaluate the errors in quality control of the analytical phase of the laboratory system by sigma metric. For this purpose, sigma metric analysis was done for analytes using the internal and external quality control as quality indicators. Results of sigma metric analysis were used to identify the gaps and the need for modification in the strategy of the laboratory quality control procedure. The sigma metric was calculated for the quality control program of ten clinical chemistry analytes, including glucose, chloride, cholesterol, triglyceride, HDL, albumin, direct bilirubin, total bilirubin, protein and creatinine, at two control levels. To calculate the sigma metric, imprecision and bias were calculated from internal and external quality control data, respectively. The minimum acceptable performance was considered as 3 sigma. Westgard sigma rules were applied to customize the quality control procedure. The sigma level was found acceptable (≥3) for glucose (L2), cholesterol, triglyceride, HDL, direct bilirubin and creatinine at both levels of control. For the rest of the analytes the sigma metric was found <3. The lowest value for sigma was found for chloride (1.1) at L2. The highest value of sigma was found for creatinine (10.1) at L3. HDL was found with the highest sigma values at both control levels (8.8 and 8.0 at L2 and L3, respectively). We conclude that analytes with a sigma value <3 require strict monitoring and modification of the quality control procedure. In this study, the application of sigma rules provided a practical solution for an improved and focused design of the QC procedure.
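The abstract computes sigma from imprecision (internal QC) and bias (external QC); the conventional formula also requires a total allowable error (TEa) specification, which the abstract does not state. A minimal sketch of that widely used formula, with hypothetical TEa, bias, and CV values:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Westgard-style sigma metric on the percentage scale.

    tea_pct  -- total allowable error for the analyte (quality requirement, %) [assumed input]
    bias_pct -- bias estimated from external quality control / peer group (%)
    cv_pct   -- imprecision (CV) estimated from internal quality control (%)
    """
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical example: TEa 10%, bias 2%, CV 1.5%  ->  sigma ~ 5.3
print(round(sigma_metric(10.0, 2.0, 1.5), 1))
```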
González-Fuenzalida, R. A.; Moliner-Martínez, Y.; Prima-Garcia, Helena; Ribera, Antonio; Campins-Falcó, P.; Zaragozá, Ramon J.
2014-01-01
The use of magnetic nanomaterials for analytical applications has increased in recent years. In particular, magnetic nanomaterials have shown great potential as an adsorbent phase in several extraction procedures due to significant advantages over conventional methods. In the present work, the influence of magnetic forces on the extraction efficiency of triazines using superparamagnetic silica nanoparticles (NPs) in magnetic in-tube solid-phase microextraction (Magnetic-IT-SPME) coupled to CapLC has been evaluated. Atrazine, terbutylazine and simazine have been selected as target analytes. The superparamagnetic silica nanomaterial (SiO2-Fe3O4) deposited onto the surface of a capillary column gave rise to a magnetic extraction phase for IT-SPME that provided an enhancement of the extraction efficiency for triazines. This improvement is based on two phenomena, the superparamagnetic behavior of Fe3O4 NPs and the diamagnetic repulsions that take place in a microfluidic device such as a capillary column. A systematic study of analyte adsorption and desorption was conducted as a function of the magnetic field and its relationship with the triazines' magnetic susceptibility. The positive influence of magnetism on the extraction procedure was demonstrated. The analytical characteristics of the optimized procedure were established and the method was applied to the determination of the target analytes in water samples with satisfactory results. When coupling Magnetic-IT-SPME with CapLC, improved adsorption efficiencies (60%–63%) were achieved compared with conventional adsorption materials (0.8%–3%). PMID:28344221
Neutron radiative capture methods for surface elemental analysis
Trombka, J.I.; Senftle, F.; Schmadebeck, R.
1970-01-01
Both an accelerator and a 252Cf neutron source have been used to induce characteristic gamma radiation from extended soil samples. To demonstrate the method, measurements of the neutron-induced radiative capture and activation gamma rays have been made with both Ge(Li) and NaI(Tl) detectors. Because of the possible application to space flight geochemical analysis, it is believed that NaI(Tl) detectors must be used. Analytical procedures have been developed to obtain both qualitative and semiquantitative results from an interpretation of the measured NaI(Tl) pulse-height spectrum. Experimental results and the analytic procedure are presented. © 1970.
Estimating and testing mediation and moderation in within-subject designs.
Judd, C M; Kenny, D A; McClelland, G H
2001-06-01
Analyses designed to detect mediation and moderation of treatment effects are increasingly prevalent in research in psychology. The mediation question concerns the processes that produce a treatment effect. The moderation question concerns factors that affect the magnitude of that effect. Although analytic procedures have been reasonably well worked out in the case in which the treatment varies between participants, no systematic procedures for examining mediation and moderation have been developed in the case in which the treatment varies within participants. The authors present an analytic approach to these issues using ordinary least squares estimation.
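As a rough illustration of the kind of ordinary-least-squares analysis the authors describe, the sketch below runs one difference-score regression often associated with this framework (outcome difference regressed on the mediator difference and the centred mediator sum) on simulated data. The exact model specification and the accompanying tests in the paper are not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 60
# Hypothetical within-subject data: mediator and outcome under treatment conditions 1 and 2
m1, m2 = rng.normal(0, 1, n), rng.normal(0.8, 1, n)
y1 = 0.5 * m1 + rng.normal(0, 1, n)
y2 = 0.5 * m2 + 0.3 + rng.normal(0, 1, n)

y_d = y2 - y1                        # treatment effect on the outcome, per participant
m_d = m2 - m1                        # treatment effect on the mediator, per participant
m_s = (m1 + m2) - (m1 + m2).mean()   # centred sum of the mediator

X = sm.add_constant(np.column_stack([m_d, m_s]))
fit = sm.OLS(y_d, X).fit()
print(fit.params)   # intercept ~ residual (direct) effect; m_d slope ~ mediation path
```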
Analytical studies of the Space Shuttle orbiter nose-gear tire
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Tanner, John A.; Peters, Jeanne M.; Robinson, Martha P.
1991-01-01
A computational procedure is presented for evaluating the analytic sensitivity derivatives of the tire response with respect to material and geometrical properties of the tire. The tire is modeled by using a two-dimensional laminated anisotropic shell theory with the effects of variation in material and geometric parameters included. The computational procedure is applied to the case of the Space Shuttle orbiter nose-gear tire subjected to uniform inflation pressure. Numerical results are presented which show the sensitivity of the different tire response quantities to variations in the material characteristics of both the cord and rubber.
Computer-aided diagnostic strategy selection.
Greenes, R A
1986-03-01
Determination of the optimal diagnostic work-up strategy for the patient is becoming a major concern for the practicing physician. Overlap of the indications for various diagnostic procedures, differences in their invasiveness or risk, and high costs have made physicians aware of the need to consider the choice of procedure carefully, as well as its relation to management actions available. In this article, the author discusses research approaches that aim toward development of formal decision analytic methods to allow the physician to determine optimal strategy; clinical algorithms or rules as guides to physician decisions; improved measures for characterizing the performance of diagnostic tests; educational tools for increasing the familiarity of physicians with the concepts underlying these measures and analytic procedures; and computer-based aids for facilitating the employment of these resources in actual clinical practice.
Study of a heat rejection system using capillary pumping
NASA Technical Reports Server (NTRS)
Neal, L. G.; Wanous, D. J.; Clausen, O. W.
1971-01-01
Results of an analytical study investigating the application of capillary pumping to the heat rejection loop of an advanced Rankine cycle power conversion system are presented. The feasibility of the concept of capillary pumping as an alternative to electromagnetic pumping is analytically demonstrated. Capillary pumping is shown to offer potential weight and electrical power savings and reliability through the use of redundant systems. A screen wick pump design with arterial feed lines was analytically developed. Advantages of this design are high thermodynamic and hydrodynamic efficiency, which provide a lightweight, easily packaged system. Operational problems were identified which must be solved for successful application of capillary pumping. The most important are the development of start-up and shutdown procedures, of a means of keeping noncondensables out of the system, and of earth-bound testing procedures.
Fast analytical model of MZI micro-opto-mechanical pressure sensor
NASA Astrophysics Data System (ADS)
Rochus, V.; Jansen, R.; Goyvaerts, J.; Neutens, P.; O’Callaghan, J.; Rottenberg, X.
2018-06-01
This paper presents a fast analytical procedure in order to design a micro-opto-mechanical pressure sensor (MOMPS) taking into account the mechanical nonlinearity and the optical losses. A realistic model of the photonic MZI is proposed, strongly coupled to a nonlinear mechanical model of the membrane. Based on the membrane dimensions, the residual stress, the position of the waveguide, the optical wavelength and the phase variation due to the opto-mechanical coupling, we derive an analytical model which allows us to predict the response of the total system. The effect of the nonlinearity and the losses on the total performance are carefully studied and measurements on fabricated devices are used to validate the model. Finally, a design procedure is proposed in order to realize fast design of this new type of pressure sensor.
Analytical procedures for water-soluble vitamins in foods and dietary supplements: a review.
Blake, Christopher J
2007-09-01
Water-soluble vitamins include the B-group vitamins and vitamin C. In order to correctly monitor water-soluble vitamin content in fortified foods for compliance monitoring, as well as to establish accurate data banks, an accurate and precise analytical method is a prerequisite. For many years microbiological assays have been used for analysis of B vitamins. However, they are no longer considered to be the gold standard in vitamin analysis, as many studies have revealed their deficiencies. This review describes the current status of analytical methods, including microbiological assays and spectrophotometric, biosensor and chromatographic techniques. In particular, it describes the current status of the official methods and highlights some new developments in chromatographic procedures and detection methods. An overview is given of multivitamin extractions and analyses for foods and supplements.
Analytic modeling of aerosol size distributions
NASA Technical Reports Server (NTRS)
Deepack, A.; Box, G. P.
1979-01-01
Mathematical functions commonly used for representing aerosol size distributions are studied parametrically. Methods for obtaining best fit estimates of the parameters are described. A catalog of graphical plots depicting the parametric behavior of the functions is presented along with procedures for obtaining analytical representations of size distribution data by visual matching of the data with one of the plots. Examples of fitting the same data with equal accuracy by more than one analytic model are also given.
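As an illustration of fitting one of the commonly used analytic forms, the sketch below fits a single lognormal mode to hypothetical dN/dlnD data by least squares; the paper's catalog covers other functions and a visual-matching procedure not shown here.

```python
import numpy as np
from scipy.optimize import curve_fit

def lognormal_dist(d, n_total, d_g, sigma_g):
    """dN/dlnD for a single lognormal mode (a common aerosol size-distribution model)."""
    return (n_total / (np.sqrt(2 * np.pi) * np.log(sigma_g))
            * np.exp(-(np.log(d / d_g) ** 2) / (2 * np.log(sigma_g) ** 2)))

# Hypothetical binned size-distribution data (diameters in micrometres)
d = np.array([0.05, 0.1, 0.2, 0.4, 0.8, 1.6])
dndlnd = np.array([120., 480., 950., 700., 210., 30.])

popt, _ = curve_fit(lognormal_dist, d, dndlnd, p0=[1000., 0.2, 2.0])
print("N_total, D_g, sigma_g =", popt)
```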
The National Shipbuilding Research Program. Environmental Studies and Testing (Phase V)
2000-11-20
development of an analytical procedure for toxic organic compounds, including TBT (tributyltin), whose turnaround time would be on the order of minutes... Cost of the Subtask was $20,000. Subtask #33 - Turnaround Analytical Method for TBT: This Subtask performed a preliminary investigation leading to the "Quick TBT Analytical Method" that will yield reliable results in 15 minutes, a veritable breakthrough in sampling technology. The Subtask was managed by...
The mandate for a proper preservation in histopathological tissues.
Comănescu, Maria; Arsene, D; Ardeleanu, Carmen; Bussolati, G
2012-01-01
A sequence of technically reproducible procedures is mandatory to guarantee a proper preservation of tissues and to build up the basis for sound diagnoses. However, while the goal of these procedures was, until recently, to assure only structural (histological and cytological) preservation, an appropriate preservation of antigenic properties and of nucleic acid integrity is now additionally requested, in order to permit pathologists to provide the biological information necessary for the adoption of personalized therapies. The present review analyses the sequence of technical steps open to critical variations. Passages such as dehydration, paraffin embedding, sectioning and staining are relatively well standardized and allow adoption of dedicated (automatic) apparatuses, while other pre-analytical steps, i.e. the time and modalities of transfer of surgical specimens from the surgical theatre to the pathology laboratory (the so-called "ischemia time") and the type and length of fixation, are not standardized and are a potential cause of discrepancies in diagnostic results. Our group is involved in European-funded projects tackling these problems with the concrete objective of implementing a model of effective tumor investigation by high-performance genetic and molecular methodologies. Work on the problem of the discrepant quality of histopathological and cytological preparations involved five European countries and exploited the potential of "virtual slide technology". Concrete issues, techniques and pitfalls, as well as proposed guidelines for processing the tissues, are shown in this presentation.
21 CFR 314.94 - Content and format of an abbreviated application.
Code of Federal Regulations, 2014 CFR
2014-04-01
... bioequivalence study contained in the abbreviated new drug application, a description of the analytical and... exclusivity under section 505(j)(5)(F) of the act. (9) Chemistry, manufacturing, and controls. (i) The... the act and one copy of the analytical procedures and descriptive information needed by FDA's...
21 CFR 314.94 - Content and format of an abbreviated application.
Code of Federal Regulations, 2013 CFR
2013-04-01
... bioequivalence study contained in the abbreviated new drug application, a description of the analytical and... exclusivity under section 505(j)(5)(F) of the act. (9) Chemistry, manufacturing, and controls. (i) The... the act and one copy of the analytical procedures and descriptive information needed by FDA's...
40 CFR 86.1207-96 - Sampling and analytical systems; evaporative emissions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) Evaporative Emission Test Procedures for New Gasoline-Fueled, Natural Gas-Fueled, Liquefied Petroleum Gas-Fueled and Methanol-Fueled Heavy-Duty Vehicles § 86.1207-96 Sampling and analytical systems..., the enclosure shall be gas tight in accordance with § 86.1217-96. Interior surfaces must be...
40 CFR 86.1207-96 - Sampling and analytical systems; evaporative emissions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... (CONTINUED) Evaporative Emission Test Procedures for New Gasoline-Fueled, Natural Gas-Fueled, Liquefied Petroleum Gas-Fueled and Methanol-Fueled Heavy-Duty Vehicles § 86.1207-96 Sampling and analytical systems..., the enclosure shall be gas tight in accordance with § 86.1217-96. Interior surfaces must be...
40 CFR 86.214-94 - Analytical gases.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission Regulations for 1994 and Later Model Year Gasoline-Fueled New Light-Duty Vehicles, New Light-Duty Trucks and New Medium-Duty Passenger Vehicles; Cold Temperature Test Procedures § 86.214-94 Analytical gases. The provisions of § 86...
PFOA and PFOS: Analytics | Science Inventory | US EPA
This presentation describes the drivers for development of Method 537, the extraction and analytical procedure, performance data, holding time data as well as detection limits. The purpose of this presentation is to provide an overview of EPA drinking water Method 537 to the U.S. EPA Drinking Water Workshop participants.
Data collection and analysis software development for rotor dynamics testing in spin laboratory
NASA Astrophysics Data System (ADS)
Abdul-Aziz, Ali; Arble, Daniel; Woike, Mark
2017-04-01
Gas turbine engine components undergo high rotational loading and other complex environmental conditions. Such an operating environment can lead these components to develop damage and cracks that can cause catastrophic failure during flight. Traditional crack detection and health monitoring methodologies currently in use rely on periodic routine maintenance and nondestructive inspections that often involve engine and component disassembly. These methods also do not offer adequate information about the faults, especially if the faults are subsurface or not clearly evident. At NASA Glenn Research Center, the rotor dynamics laboratory is presently involved in developing newer techniques that are highly dependent on sensor technology to enable health monitoring and prediction of damage and cracks in rotor disks. These approaches are noninvasive and relatively economical. Spin tests are performed using a subscale test article mimicking a turbine rotor disk undergoing rotational load. Non-contact instruments such as capacitive and microwave sensors are used to measure the blade tip gap displacement and blade vibration characteristics in an attempt to develop a physics-based model to assess/predict the faults in the rotor disk. Data collection is a major component of this experimental-analytical procedure, and as a result an upgrade to an older version of the LabVIEW-based data acquisition software has been implemented to support efficient test runs and analysis of the results. Outcomes obtained from the test data and related experimental and analytical rotor dynamics modeling, including key features of the updated software, are presented and discussed.
Analytic H I-to-H2 Photodissociation Transition Profiles
NASA Astrophysics Data System (ADS)
Bialy, Shmuel; Sternberg, Amiel
2016-05-01
We present a simple analytic procedure for generating atomic (H I) to molecular (H2) density profiles for optically thick hydrogen gas clouds illuminated by far-ultraviolet radiation fields. Our procedure is based on the analytic theory for the structure of one-dimensional H I/H2 photon-dominated regions, presented by Sternberg et al. Depth-dependent atomic and molecular density fractions may be computed for arbitrary gas density, far-ultraviolet field intensity, and the metallicity-dependent H2 formation rate coefficient and dust absorption cross section in the Lyman-Werner photodissociation band. We use our procedure to generate a set of H I-to-H2 transition profiles for a wide range of conditions, from the weak- to strong-field limits, and from super-solar down to low metallicities. We show that if presented as functions of dust optical depth, the H I and H2 density profiles depend primarily on the Sternberg "αG parameter" (dimensionless) that determines the dust optical depth associated with the total photodissociated H I column. We derive a universal analytic formula for the H I-to-H2 transition points as a function of just αG. Our formula will be useful for interpreting emission-line observations of H I/H2 interfaces, for estimating star formation thresholds, and for sub-grid components in hydrodynamics simulations.
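A minimal numerical sketch of the kind of formula involved, assuming the commonly quoted Sternberg et al. expression N_HI = ln(αG/2 + 1)/σ_g and an assumed value for the Lyman-Werner dust absorption cross section; consult the paper for the exact expressions and normalizations.

```python
import numpy as np

def hi_column(alpha_G, sigma_g=1.9e-21):
    """Total photodissociated H I column (cm^-2) for a one-sided slab.

    Assumes the commonly quoted form N_HI = ln(alpha_G/2 + 1) / sigma_g, where alpha_G
    is the dimensionless field/density/metallicity parameter and sigma_g is an assumed
    dust absorption cross section per H nucleon in the Lyman-Werner band.
    """
    return np.log(alpha_G / 2.0 + 1.0) / sigma_g

for aG in (0.1, 1.0, 10.0, 100.0):
    print(f"alpha_G = {aG:6.1f}  ->  N_HI ~ {hi_column(aG):.2e} cm^-2")
```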
An analytical SMASH procedure (ASP) for sensitivity-encoded MRI.
Lee, R F; Westgate, C R; Weiss, R G; Bottomley, P A
2000-05-01
The simultaneous acquisition of spatial harmonics (SMASH) method of imaging with detector arrays can reduce the number of phase-encoding steps, and MRI scan time several-fold. The original approach utilized numerical gradient-descent fitting with the coil sensitivity profiles to create a set of composite spatial harmonics to replace the phase-encoding steps. Here, an analytical approach for generating the harmonics is presented. A transform is derived to project the harmonics onto a set of sensitivity profiles. A sequence of Fourier, Hilbert, and inverse Fourier transform is then applied to analytically eliminate spatially dependent phase errors from the different coils while fully preserving the spatial-encoding. By combining the transform and phase correction, the original numerical image reconstruction method can be replaced by an analytical SMASH procedure (ASP). The approach also allows simulation of SMASH imaging, revealing a criterion for the ratio of the detector sensitivity profile width to the detector spacing that produces optimal harmonic generation. When detector geometry is suboptimal, a group of quasi-harmonics arises, which can be corrected and restored to pure harmonics. The simulation also reveals high-order harmonic modulation effects, and a demodulation procedure is presented that enables application of ASP to a large number of detectors. The method is demonstrated on a phantom and humans using a standard 4-channel phased-array MRI system. Copyright 2000 Wiley-Liss, Inc.
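For orientation, the toy sketch below illustrates the underlying SMASH idea that ASP replaces: weighting coil sensitivity profiles so their sum approximates a spatial harmonic. It uses simulated Gaussian sensitivities and a least-squares fit rather than the paper's analytical transform; all numbers are hypothetical.

```python
import numpy as np

# Combine coil sensitivity profiles with complex weights so the composite approximates
# a spatial harmonic exp(i*m*dk*y).  Simulated Gaussian sensitivities, hypothetical sizes.
n_coils, n_y = 4, 128
y = np.linspace(-0.5, 0.5, n_y)
centers = np.linspace(-0.375, 0.375, n_coils)
C = np.exp(-((y[None, :] - centers[:, None]) ** 2) / (2 * 0.12 ** 2))  # coil sensitivities

dk = 2 * np.pi          # one field-of-view harmonic
m = 1                   # harmonic order to synthesize
target = np.exp(1j * m * dk * y)

# Least-squares weights: find w such that w @ C ~ target
w, *_ = np.linalg.lstsq(C.T.astype(complex), target, rcond=None)
composite = w @ C
print("max |composite - target| =", np.abs(composite - target).max())
```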
Pezo, Davinson; Navascués, Beatriz; Salafranca, Jesús; Nerín, Cristina
2012-10-01
Ethyl lauroyl arginate (LAE) is a cationic surface-active (tensoactive) compound, soluble in water, with a wide activity spectrum against moulds and bacteria. LAE has been incorporated as an antimicrobial agent into packaging materials for food contact, and these materials are required to comply with the specific migration criteria. In this paper, an analytical procedure has been developed and optimized for the analysis of LAE in food simulants after the migration tests. It consists of the formation of an ion pair between LAE and the inorganic complex Co(SCN)(4)(2-) in aqueous solution, followed by a liquid-liquid extraction into a suitable organic solvent and further UV-Vis absorbance measurement. In order to evaluate possible interferences, the ion pair has also been analyzed by high performance liquid chromatography with UV-Vis detection. Both procedures provided similar analytical characteristics, with linear ranges from 1.10 to 25.00 mg kg(-1), linearity higher than 0.9886, limits of detection and quantification of 0.33 and 1.10 mg kg(-1), respectively, accuracy better than 1% as relative error and precision better than 3.6% expressed as RSD. Optimization of the analytical techniques, thermal and chemical stability of LAE, as well as migration kinetics of LAE from experimental active packaging are reported and discussed. Copyright © 2012 Elsevier B.V. All rights reserved.
Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China.
Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li'an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling
2016-03-01
A nationwide multicenter study was conducted in China to explore sources of variation of reference values and establish reference intervals for 28 common biochemical analytes, as a part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area in China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partition of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box-Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. Based on the ANOVA, significant sex-related and age-related differences were observed in 12 and 12 analytes, respectively. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in the results of 9 analytes for men and 6 for women. Reference intervals of 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established by use of the latest methodology. Reference intervals of the 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas the others can be used as assay system-specific reference intervals in China.
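A minimal sketch of the parametric reference-interval arithmetic described (power transform toward normality, central 95% limits, back-transform), on simulated data; the study's modified Box-Cox formula, LAVE-based secondary exclusion, and partitioning steps are not reproduced here, and the standard SciPy Box-Cox is used as a stand-in.

```python
import numpy as np
from scipy import stats, special

rng = np.random.default_rng(42)
values = rng.lognormal(mean=1.0, sigma=0.25, size=500)   # hypothetical healthy-reference results

# Transform toward normality, take central 95% limits, and back-transform
transformed, lmbda = stats.boxcox(values)
mu, sd = transformed.mean(), transformed.std(ddof=1)
lower, upper = special.inv_boxcox(np.array([mu - 1.96 * sd, mu + 1.96 * sd]), lmbda)
print(f"reference interval ~ ({lower:.2f}, {upper:.2f})")
```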
Janus, Tomasz; Jasionowicz, Ewa; Potocka-Banaś, Barbara; Borowiak, Krzysztof
Routine toxicological analysis is mostly focused on the identification of inorganic and organic, chemically different compounds, generally of low mass, usually not greater than 500-600 Da. Peptide compounds with atomic mass higher than 900 Da are a specific analytical group. Several dozen of them are highly toxic substances well known in toxicological practice, for example mushroom toxins and animal venoms. In the paper the authors present the example of alpha-amanitin to explain the analytical problems and different original solutions in identifying peptides in urine samples with the use of a universal LC-MS/MS procedure. The analyzed material was urine samples collected from patients with potential mushroom intoxication, routinely diagnosed for amanitin determination. Ultrafiltration with centrifuge filter tubes (mass cutoff limit 3 kDa) was used. The filtrate was directly injected onto the chromatographic column and analyzed with a mass detector (MS/MS). The separation of peptides, as organic, amphoteric compounds, from biological material with the use of the SPE technique is well known but requires dedicated, specific columns. The presented paper proved that with the fast and simple ultrafiltration technique amanitin can be effectively isolated from urine, and the procedure offers satisfactory sensitivity of detection and eliminates the influence of the biological matrix on analytical results. Another problem which had to be solved was the non-characteristic fragmentation of peptides in the MS/MS procedure, which produces non-selective chromatograms. It is possible to use higher collision energies in the analytical procedure, which results in more characteristic mass spectra, although it offers lower sensitivity. The ultrafiltration technique as a procedure of sample preparation is effective for the isolation of amanitin from the biological matrix. The monitoring of a selected mass corresponding to a transition with the loss of a water molecule offers satisfactory sensitivity of determination.
Multi-ion detection by one-shot optical sensors using a colour digital photographic camera.
Lapresta-Fernández, Alejandro; Capitán-Vallvey, Luis Fermín
2011-10-07
The feasibility and performance of a procedure to evaluate previously developed one-shot optical sensors as single and selective analyte sensors for potassium, magnesium and hardness are presented. The procedure uses a conventional colour digital photographic camera as the detection system for simultaneous multianalyte detection. A 6.0 megapixel camera was used, and the procedure describes how it is possible to quantify potassium, magnesium and hardness simultaneously from the images captured, using multianalyte one-shot sensors based on ionophore-chromoionophore chemistry, employing the colour information computed from a defined region of interest on the sensing membrane. One of the colour channels in the red, green, blue (RGB) colour space is used to build the analytical parameter, the effective degree of protonation (1-α(eff)), in good agreement with the theoretical model. The linearization of the sigmoidal response function improves the limit of detection (LOD) and extends the analytical range in all cases studied. The LODs improved from 5.4 × 10(-6) to 2.7 × 10(-7) M for potassium, from 1.4 × 10(-4) to 2.0 × 10(-6) M for magnesium and from 1.7 to 2.0 × 10(-2) mg L(-1) of CaCO(3) for hardness. The method's precision was determined in terms of the relative standard deviation (RSD%), which ranged from 2.4 to 7.6 for potassium, from 6.8 to 7.8 for magnesium and from 4.3 to 7.8 for hardness. The procedure was applied to the simultaneous determination of potassium, magnesium and hardness using multianalyte one-shot sensors in different types of waters and beverages in order to cover the entire application range, statistically validating the results against atomic absorption spectrometry as the reference procedure. Accordingly, this paper is an attempt to demonstrate the possibility of using a conventional digital camera as an analytical device to measure this type of one-shot sensor based on ionophore-chromoionophore chemistry instead of using conventional lab instrumentation.
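As a rough illustration of the image-based readout described above (mean of one RGB channel over a region of interest, converted to concentration via a calibration), the sketch below uses hypothetical calibration points and a placeholder sample image; the chromoionophore model that converts the channel response to (1-α(eff)) and the sigmoidal/linearized fit are not reproduced.

```python
import numpy as np

def roi_channel_mean(image, rows, cols, channel=0):
    """Mean of one RGB channel over a rectangular region of interest of an (H, W, 3) image."""
    r0, r1 = rows
    c0, c1 = cols
    return image[r0:r1, c0:c1, channel].mean()

# Hypothetical calibration: normalized channel responses of standards vs. known concentrations (M)
std_conc = np.array([1e-6, 1e-5, 1e-4, 1e-3])
std_response = np.array([0.82, 0.61, 0.38, 0.17])

# Placeholder "sample image" (uniform synthetic data standing in for a captured photo)
sample_image = np.full((480, 640, 3), 0.45)
sample_response = roi_channel_mean(sample_image, (200, 280), (280, 360), channel=1)

# Simple monotone interpolation on the calibration (a crude stand-in for the model fit)
conc = np.interp(sample_response, std_response[::-1], std_conc[::-1])
print(f"estimated concentration ~ {conc:.2e} M")
```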
δ15N measurement of organic and inorganic substances by EA-IRMS: a speciation-dependent procedure.
Gentile, Natacha; Rossi, Michel J; Delémont, Olivier; Siegwolf, Rolf T W
2013-01-01
Little attention has been paid so far to the influence of the chemical nature of the substance when measuring δ(15)N by elemental analysis (EA)-isotope ratio mass spectrometry (IRMS). Although the bulk nitrogen isotope analysis of organic material is not to be questioned, literature from different disciplines using IRMS provides hints that the quantitative conversion of nitrate into nitrogen presents difficulties. We observed abnormal series of δ(15)N values of laboratory standards and nitrates. These unexpected results were shown to be related to the tailing of the nitrogen peak of nitrate-containing compounds. A series of experiments were set up to investigate the cause of this phenomenon, using ammonium nitrate (NH(4)NO(3)) and potassium nitrate (KNO(3)) samples, two organic laboratory standards as well as the international secondary reference materials IAEA-N1, IAEA-N2-two ammonium sulphates [(NH(4))(2)SO(4)]-and IAEA-NO-3, a potassium nitrate. In experiment 1, we used graphite and vanadium pentoxide (V(2)O(5)) as additives to observe if they could enhance the decomposition (combustion) of nitrates. In experiment 2, we tested another elemental analyser configuration including an additional section of reduced copper in order to see whether or not the tailing could originate from an incomplete reduction process. Finally, we modified several parameters of the method and observed their influence on the peak shape, δ(15)N value and nitrogen content in weight percent of nitrogen of the target substances. We found the best results using mere thermal decomposition in helium, under exclusion of any oxygen. We show that the analytical procedure used for organic samples should not be used for nitrates because of their different chemical nature. We present the best performance given one set of sample introduction parameters for the analysis of nitrates, as well as for the ammonium sulphate IAEA-N1 and IAEA-N2 reference materials. We discuss these results considering the thermochemistry of the substances and the analytical technique itself. The results emphasise the difference in chemical nature of inorganic and organic samples, which necessarily involves distinct thermochemistry when analysed by EA-IRMS. Therefore, they should not be processed using the same analytical procedure. This clearly impacts on the way international secondary reference materials should be used for the calibration of organic laboratory standards.
Analytical Derivation: An Epistemic Game for Solving Mathematically Based Physics Problems
ERIC Educational Resources Information Center
Bajracharya, Rabindra R.; Thompson, John R.
2016-01-01
Problem solving, which often involves multiple steps, is an integral part of physics learning and teaching. Using the perspective of the epistemic game, we documented a specific game that is commonly pursued by students while solving mathematically based physics problems: the "analytical derivation" game. This game involves deriving an…
Multi-element RIMS Analysis of Genesis Solar Wind Collectors
NASA Astrophysics Data System (ADS)
Veryovkin, I. V.; Tripa, C. E.; Zinovev, A. V.; King, B. V.; Pellin, M. J.; Burnett, D. S.
2009-12-01
The samples of Solar Wind (SW) delivered by the NASA Genesis mission present significant challenges for surface analytical techniques, in part due to severe terrestrial contamination of the samples on reentry, in part due to the ultra-shallow and diffused ion implants in the SW collector materials. We are performing measurements of metallic elements in the Genesis collectors using Resonance Ionization Mass Spectrometry (RIMS), an ultra-sensitive analytical method capable of detecting SW in samples with lateral dimensions of only a few mm and at concentrations from above one ppm to below one ppt. Since our last report at the 2008 AGU Fall Meeting, we have (a) developed and tested new resonance ionization schemes permitting simultaneous measurements of up to three elements (Ca, Cr, and Mg), and (b) improved the reproducibility and accuracy of our RIMS analyses for SW-like samples (i.e. shallow ion implants) by developing and implementing an optimized set of new analytical protocols. This is important since the quality of scientific results from the Genesis mission critically depends on the accuracy of analytical techniques. In this work, we report on simultaneous RIMS measurements of Ca and Cr performed on two silicon SW collector samples (#60179 and #60476). First, we have conducted test experiments with 3×10^13 at/cm^2 52Cr and 44Ca implants in silicon to evaluate the accuracy of our quantitative analyses. Implant fluences were measured by RIMS to be 2.73×10^13 and 2.71×10^13 at/cm^2 for 52Cr and 44Ca, respectively, which corresponds to an accuracy of ≈10%. Using the same implanted wafer as a reference, we conducted RIMS analyses of the Genesis samples: 3 spots on #60179 and 4 spots on #60476. The elemental SW fluences expected for Cr and Ca are 2.95×10^10 and 1.33×10^11 at/cm^2, respectively. Our measurements of 52Cr yielded 3.0±0.6×10^11 at/cm^2 and 5.1±4.1×10^10 at/cm^2 for #60179 and #60476, respectively. For 40Ca, SW fluences of 1.39±0.70×10^11 at/cm^2 in #60179 and 3.6±2.5×10^13 at/cm^2 in #60476 were measured. Thus, only one element in each sample showed reasonable agreement with the expected values, Ca in #60179 and Cr in #60476. However, the cleaning procedures applied to these samples were different: #60179 was only Megasonicated in ultra-pure water, while #60476 was subjected to longer Megasonication and an RCA cleaning procedure involving multiple rinsing steps with acid solutions. It is apparent that the surface contamination and cleaning procedures influenced the results of our measurements. We will present these experimental results and discuss procedures - including improved sample cleaning, dual-beam high resolution sputter depth profiling from front and back sides of the sample, and modeling of near-surface impurity transport - aimed at improving the accuracy of determination of elemental abundances by ion sputtering based analytical methods. This work is supported by NASA through a grant NNH08AH761 and by UChicago Argonne, LLC, under contract No. DE-AC02-06CH11357.
42 CFR 493.1256 - Standard: Control procedures.
Code of Federal Regulations, 2010 CFR
2010-10-01
... for having control procedures that monitor the accuracy and precision of the complete analytic process..., include two control materials, including one that is capable of detecting errors in the extraction process... control materials having previously determined statistical parameters. (e) For reagent, media, and supply...
NASA Astrophysics Data System (ADS)
Majidzadeh, K.; Ilves, G. J.
1981-08-01
A ready reference to design procedures for asphaltic concrete overlay of flexible pavements based on elastic layer theory is provided. The design procedures and the analytical techniques presented were formulated to predict the structural fatigue response of asphaltic concrete overlays for various design conditions, including geometrical and material properties, loading conditions and environmental variables.
Fort Dix Remedial Investigation/Feasibility Study for MAG-1 Area
1994-01-01
Excerpts: ... by PID headspace results or odor), samples should be diluted to bring the target compound concentrations within the instrument calibration range ... Contents fragments: Conductivity Testing; 2.9 Analytical Procedures for Field Screening Samples; 2.9.1 Volatile Organic Compounds; Analysis of Volatile Organic Compounds by Field Gas Chromatography - Standard Operating Procedure; Appendix B, RDX Explosives Field Test Kit Procedures.
Determination of Total Solids and Ash in Algal Biomass: Laboratory Analytical Procedure (LAP)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Wychen, Stefanie; Laurens, Lieve M. L.
2016-01-13
This procedure describes the methods used to determine the amount of moisture or total solids present in a freeze-dried algal biomass sample, as well as the ash content. A traditional convection oven drying procedure is covered for total solids content, and a dry oxidation method at 575 °C is covered for ash content.
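A minimal sketch of the gravimetric arithmetic behind such a procedure, with hypothetical masses; the LAP itself specifies the crucible tares, drying, and ashing bookkeeping in full.

```python
def total_solids_pct(mass_dry_g, mass_as_received_g):
    """Percent total solids after convection-oven drying (sketch of the gravimetric ratio)."""
    return 100.0 * mass_dry_g / mass_as_received_g

def ash_pct_dry_basis(mass_ash_g, mass_dry_g):
    """Percent ash after dry oxidation at 575 C, expressed on a dry-weight basis."""
    return 100.0 * mass_ash_g / mass_dry_g

# Hypothetical weighings (grams): 0.500 g sample, 0.470 g after drying, 0.024 g ash
print(total_solids_pct(0.470, 0.500))   # ~94.0 % total solids
print(ash_pct_dry_basis(0.024, 0.470))  # ~5.1 % ash (dry basis)
```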
Fais, Ana Paula; Franco, Rodolfo Scarpino Barboza; da Silva, Agnaldo Fernando Baldo; de Freitas, Osvaldo; Paschoal, Jonas Augusto Rizzato
2017-04-01
This paper describes the method development for sulfadimethoxine (SDM) and ormetoprim (OMP) quantitation in fish feed and fish fillet employing LC-MS/MS. In order to assess the reliability of the analytical method, validation was undertaken as recommended by the guidelines proposed by the Brazilian Ministry of Agriculture. The calibration curve for the quantification of both drugs in feed showed adequate linearity (r > 0.99), precision (CV < 12%) and trueness ranging from 97% to 100%. The method for the determination of SDM and OMP residues in fish fillet involved a simple sample preparation procedure and had adequate linearity (r > 0.99), precision (CV < 16%) and trueness around 100%, with CCα < 100.2 ng g(-1) and CCβ < 100.4 ng g(-1). With the goal of avoiding the risk of drug leaching from feed into the aquatic environment during fish medication via the oral route, different procedures for drug incorporation into feed were evaluated. Coating feed pellets with ethyl cellulose polymer containing the drug showed promising results. In this case, the medicated feed released less than 6% of the drugs to water when it stayed in the water for up to 15 min.
NASA Astrophysics Data System (ADS)
Hérisson, Benjamin; Challamel, Noël; Picandet, Vincent; Perrot, Arnaud
2016-09-01
The static behavior of the Fermi-Pasta-Ulam (FPU) axial chain under distributed loading is examined. The FPU system examined in the paper is a nonlinear elastic lattice with linear and quadratic spring interaction. A dimensionless parameter controls the possible loss of convexity of the associated quadratic and cubic energy. Exact analytical solutions based on Hurwitz zeta functions are developed in the presence of linear static loading. It is shown that this nonlinear lattice possesses scale effects and possible localization properties in the absence of energy convexity. A continuous approach is then developed to capture the main phenomena observed regarding the discrete axial problem. The associated continuum is built from a continualization procedure that is mainly based on the asymptotic expansion of the difference operators involved in the lattice problem. This associated continuum is an enriched gradient-based or nonlocal axial medium. A Taylor-based and a rational differential method are both considered in the continualization procedures to approximate the FPU lattice response. The Padé approximant used in the continualization procedure fits the response of the discrete system efficiently, even in the vicinity of the limit load when the non-convex FPU energy is examined. It is concluded that the FPU lattice system behaves as a nonlocal axial system under dynamic as well as static loading.
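As a numerical counterpart to the discrete problem described (the paper's solutions are exact, based on Hurwitz zeta functions), the toy sketch below solves the static equilibrium of a fixed-free chain of linear-plus-quadratic springs under uniform nodal loading, with hypothetical parameter values.

```python
import numpy as np
from scipy.optimize import fsolve

# Toy static FPU chain: N nonlinear springs (tension T = k*e + k2*e**2) fixed at one end,
# with a uniform axial load applied at each node.  All parameter values are hypothetical.
N, k, k2, p = 20, 1.0, -0.3, 0.01
P = np.full(N, p)                             # nodal loads

def residuals(u):
    e = np.diff(np.concatenate(([0.0], u)))   # spring elongations (node 0 fixed)
    T = k * e + k2 * e ** 2                   # linear + quadratic spring tension
    r = np.empty(N)
    r[:-1] = T[1:] - T[:-1] + P[:-1]          # equilibrium of interior nodes
    r[-1] = P[-1] - T[-1]                     # equilibrium of the free-end node
    return r

u = fsolve(residuals, np.linspace(0.0, 0.2, N))
print("end displacement:", u[-1])
```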
WIPP waste characterization program sampling and analysis guidance manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-01-01
The Waste Isolation Pilot Plant (WIPP) Waste Characterization Program Sampling and Analysis Guidance Manual (Guidance Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Quality Assurance Program Plan (QAPP) for the WIPP Experimental-Waste Characterization Program (the Program). This Guidance Manual includes all of the sampling and testing methodologies accepted by the WIPP Project Office (DOE/WPO) for use in implementing the Program requirements specified in the QAPP. This includes methods for characterizing representative samples of transuranic (TRU) wastes at DOE generator sites with respect to the gas generation controlling variables defined in the WIPP bin-scale and alcove test plans, as well as waste container headspace gas sampling and analytical procedures to support waste characterization requirements under the WIPP test program and the Resource Conservation and Recovery Act (RCRA). The procedures in this Guidance Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site specific procedures. The use of these procedures is intended to provide the necessary sensitivity, specificity, precision, and comparability of analyses and test results. The solutions to achieving specific program objectives will depend upon facility constraints, compliance with DOE Orders and DOE facilities' operating contractor requirements, and the knowledge and experience of the TRU waste handlers and analysts. With some analytical methods, such as gas chromatography/mass spectrometry, the Guidance Manual procedures may be used directly. With other methods, such as nondestructive/destructive characterization, the Guidance Manual provides guidance rather than a step-by-step procedure.
Quantitative HPLC Analysis of an Analgesic/Caffeine Formulation: Determination of Caffeine
NASA Astrophysics Data System (ADS)
Ferguson, Glenda K.
1998-04-01
A modern high performance liquid chromatography (HPLC) laboratory experiment which entails the separation of acetaminophen, aspirin, and caffeine and the quantitative assay of caffeine in commercial mixtures of these compounds has been developed. Our HPLC protocol resolves these compounds in only three minutes with a straightforward chromatographic apparatus which consists of a C-18 column, an isocratic mobile phase, UV detection at 254 nm, and an integrator; an expensive, sophisticated system is not required. The separation is both repeatable and rapid. Moreover, the experiment can be completed in a single three-hour period. The experiment is appropriate for any chemistry student who has completed a minimum of one year of general chemistry and is ideal for an analytical or instrumental analysis course. The experiment detailed herein involves the determination of caffeine in Goody's Extra Strength Headache Powders, a commercially available medication which contains acetaminophen, aspirin, and caffeine as active ingredients. However, the separation scheme is not limited to this brand of medication nor is it limited to caffeine as the analyte. With only minor procedural modifications, students can simultaneously quantitate all of these compounds in a commercial mixture. In our procedure, students prepare a series of four caffeine standard solutions as well as a solution from a pharmaceutical analgesic/caffeine mixture, chromatographically analyze each solution in quadruplicate, and plot relative average caffeine standard peak area versus concentration. From the mathematical relationship that results, the concentration of caffeine in the commercial formulation is obtained. Finally, the absolute standard deviation of the mean concentration is calculated.
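A minimal sketch of the calibration-curve arithmetic the students perform (relative peak area versus concentration, then back-calculation of the unknown), with hypothetical numbers; it is an illustration of the data treatment, not the published protocol.

```python
import numpy as np

# Hypothetical calibration data for the caffeine determination: standard concentrations
# (mg/mL) versus average relative peak areas from quadruplicate injections.
conc = np.array([0.05, 0.10, 0.20, 0.40])
area = np.array([1210., 2440., 4890., 9760.])

slope, intercept = np.polyfit(conc, area, 1)      # least-squares calibration line
sample_area = 3650.                               # placeholder average area for the unknown
sample_conc = (sample_area - intercept) / slope
print(f"caffeine in sample solution ~ {sample_conc:.3f} mg/mL")
```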
Lambropoulou, Dimitra A; Konstantinou, Ioannis K; Albanis, Triantafyllos A
2006-07-28
In the present study a combined analytical method involving ultrasonic extraction (USE), sulfuric acid clean-up and headspace solid-phase microextraction (HS-SPME) was developed for the determination of chlorinated pesticides (CPs) in bird livers. Extraction of CPs from 1 g of liver was performed by ultrasonication for 30 min using 20 mL of a solvent mixture (n-hexane:acetone, 4:1, v/v). The extract was subsequently subjected to a clean-up step for lipid removal. A comparative study on several clean-up procedures prior to the HS-SPME enrichment step was performed in order to achieve maximum recovery and optimal clean-up efficiency, which would provide suitable limits of detection in the gas chromatographic analysis. For this purpose, destructive (sulfuric acid or sodium hydroxide treatment) and non-destructive (alumina column) clean-up procedures have been assayed. The treatment of the extract with 40% (v/v) H2SO4 prior to the HS-SPME process showed the best performance, since lower detection limits and higher extraction efficiencies were obtained. The method detection limit ranged from 0.5 to 1.0 ng/g wet weight and peak areas were proportional to analyte concentrations (r² > 0.990) in the range of 5-500 ng/g wet weight. The method was found to be reproducible (R.S.D. < 10%) and effective under the operational conditions proposed and was applied successfully to the analysis of CPs in liver tissues of various bird species from Greece.
[What can medicine expect from health economics?].
Bismarck, E; Schmitz-Dräger, B J; Schöffski, O
2012-04-01
Medicine has changed dramatically in the past ten decades thanks to the introduction of innovative diagnostic and therapeutic procedures. However, besides the unmistakable advances achieved in medicine, the costs of all health care systems have risen dramatically. In contrast to the escalation in expenditures, only moderate gains in revenues have been achieved. This situation requires that future financial resources be judiciously expended. The field of health economics has set as its goal the analysis of medical measures in terms of costs and benefits in order to provide information on these parameters to those involved in the public health sector. The emerging problems are diverse and extend from the assessment of effects and side effects to difficulties in standardizing analytical procedures and comparing results between different health care systems. In the context of this manuscript an attempt has been made to illustrate the methodological approaches of health economics based on current issues in the diagnosis and treatment of prostate cancer. This contribution intends to motivate stakeholders to view health economics as a tool to promote improvements in medical care and not as a means of regulating and rationing medical measures.
Metal speciation of environmental samples using SPE and SFC-AED analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, S.C.; Burford, M.D.; Robson, M.
1995-12-31
Due to growing public concern over heavy metals in the environment, soil, water and air particulate samples are now routinely screened for their metal content. Conventional metal analysis typically involves acid digestion extraction and results in the generation of large volumes of aqueous and organic solvent waste. This harsh extraction process is usually used to obtain the total metal content of the sample, the extract being analysed by atomic emission or absorption spectroscopy techniques. A more selective method of metal extraction has been investigated which uses a supercritical fluid modified with a complexing agent. The relatively mild extraction method enables both organometallic and inorganic metal species to be recovered intact. The various components from the supercritical fluid extract can be chromatographically separated using supercritical fluid chromatography (SFC) and positive identification of the metals achieved using atomic emission detection (AED). The aim of the study is to develop an analytical extraction procedure which enables rapid, sensitive and quantitative analysis of metals in environmental samples, using just one extraction (e.g., SFE) and one analysis (e.g., SFC-AED) procedure.
NASA Technical Reports Server (NTRS)
Manson, S. S.; Halford, G. R.
1980-01-01
Simple procedures are presented for treating cumulative fatigue damage under complex loading history using either the damage curve concept or the double linear damage rule. A single equation is provided for use with the damage curve approach; each loading event providing a fraction of damage until failure is presumed to occur when the damage sum becomes unity. For the double linear damage rule, analytical expressions are provided for determining the two phases of life. The procedure involves two steps, each similar to the conventional application of the commonly used linear damage rule. When the sum of cycle ratios based on phase 1 lives reaches unity, phase 1 is presumed complete, and further loadings are summed as cycle ratios on phase 2 lives. When the phase 2 sum reaches unity, failure is presumed to occur. No other physical properties or material constants than those normally used in a conventional linear damage rule analysis are required for application of either of the two cumulative damage methods described. Illustrations and comparisons of both methods are discussed.
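As a rough illustration of the two-step bookkeeping described above, the sketch below sums cycle ratios against assumed phase 1 and phase 2 lives for a sequence of loading blocks. The block definitions and the way total life is split into the two phases are hypothetical; the paper supplies analytical expressions for that split, which are not reproduced here.

```python
# Minimal sketch of the double linear damage rule bookkeeping described above.
# Each loading block applies n cycles at a level whose phase 1 and phase 2
# lives (N1, N2) are assumed known; the split into N1 and N2 is hypothetical.

loading_blocks = [
    # (applied cycles, phase 1 life, phase 2 life) -- hypothetical values
    (2000, 10000, 5000),
    (1500,  4000, 3000),
    (3000,  8000, 6000),
]

phase = 1
damage = 0.0
for n, N1, N2 in loading_blocks:
    if phase == 1:
        damage += n / N1
        if damage >= 1.0:
            # Phase 1 exhausted mid-block: convert the leftover cycles to a phase 2 ratio
            leftover_cycles = (damage - 1.0) * N1
            phase, damage = 2, leftover_cycles / N2
    else:
        damage += n / N2
    if phase == 2 and damage >= 1.0:
        print("Failure predicted within this block")
        break

print(f"phase = {phase}, accumulated cycle-ratio sum = {damage:.3f}")
```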
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia-Montero, Luis G., E-mail: luisgonzaga.garcia@upm.e; Lopez, Elena, E-mail: elopez@caminos.upm.e; Monzon, Andres, E-mail: amonzon@caminos.upm.e
Most Strategic Environmental Assessment (SEA) research has been concerned with SEA as a procedure, and there have been relatively few developments and tests of analytical methodologies. The first stage of the SEA is the 'screening', which is the process whereby a decision is taken on whether or not SEA is required for a particular programme or plan. The effectiveness of screening and SEA procedures will depend on how well the assessment fits into the planning from the early stages of the decision-making process. However, it is difficult to prepare the environmental screening for an infrastructure plan involving a whole country. To be useful, such methodologies must be fast and simple. We have developed two screening tools which would make it possible to estimate promptly the overall impact an infrastructure plan might have on biodiversity and global warming for a whole country, in order to generate planning alternatives, and to determine whether or not SEA is required for a particular infrastructure plan.
Petty, J.D.; Orazio, C.E.; Huckins, J.N.; Gale, R.W.; Lebo, J.A.; Meadows, J.C.; Echols, K.R.; Cranor, W.L.
2000-01-01
Semipermeable membrane devices (SPMDs) are used with increasing frequency, and throughout the world as samplers of organic contaminants. The devices can be used to detect a variety of lipophilic chemicals in water, sediment/soil, and air. SPMDs are designed to sample nonpolar, hydrophobic chemicals. The maximum concentration factor achievable for a particular chemical is proportional to its octanol–water partition coefficient. Techniques used for cleanup of SPMD extracts for targeted analytes and for general screening by full-scan mass spectrometry do not differ greatly from techniques used for extracts of other matrices. However, SPMD extracts contain potential interferences that are specific to the membrane–lipid matrix. Procedures have been developed or modified to alleviate these potential interferences. The SPMD approach has been demonstrated to be applicable to sequestering and analyzing a wide array of environmental contaminants including organochlorine pesticides, polychlorinated biphenyls, polycyclic aromatic hydrocarbons, polychlorinated dioxins and dibenzofurans, selected organophosphate pesticides and pyrethroid insecticides, and other nonpolar organic chemicals. We present herein an overview of effective procedural steps for analyzing exposed SPMDs for trace to ultra-trace levels of contaminants sequestered from environmental matrices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, W.W.; Sullivan, H.H.
Electroless nickel-plate characteristics are substantially influenced by percent phosphorus concentrations. Available ASTM analytical methods are designed for phosphorus concentrations of less than one percent, compared to the 4.0 to 20.0% concentrations common in electroless nickel plate. A variety of analytical adaptations are applied throughout the industry, resulting in poor data continuity. This paper presents a statistical comparison of five analytical methods and recommends accurate and precise procedures for use in percent phosphorus determinations in electroless nickel plate. 2 figures, 1 table.
Life cycle management of analytical methods.
Parr, Maria Kristina; Schmidt, Alexander H
2018-01-05
In modern process management, the life cycle concept gains more and more importance. It focusses on the total costs of the process from investment to operation and finally retirement. An increasing interest in this concept has also existed for analytical procedures in recent years. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification and method transfer) and finally retirement of the method. It appears that regulatory bodies have also increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, discuss the introduction of new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into the USP. Furthermore, a growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design based method development results in increased method robustness. Thereby a decreased effort is needed for method performance verification and post-approval changes, as well as a minimized risk of method-related out-of-specification results. This strongly contributes to reduced costs of the method during its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.
A spin column-free approach to sodium hydroxide-based glycan permethylation.
Hu, Yueming; Borges, Chad R
2017-07-24
Glycan permethylation was introduced as a tool to facilitate the study of glycans in 1903. Since that time, permethylation procedures have been continually modified to improve permethylation efficiency and qualitative applicability. Typically, however, either laborious preparation steps or cumbersome and uneconomical spin columns have been needed to obtain decent permethylation yields on small glycan samples. Here we describe a spin column-free (SCF) glycan permethylation procedure that is applicable to both O- and N-linked glycans and can be employed upstream to intact glycan analysis by MALDI-MS, ESI-MS, or glycan linkage analysis by GC-MS. The SCF procedure involves neutralization of NaOH beads by acidified phosphate buffer, which eliminates the risk of glycan oxidative degradation and avoids the use of spin columns. Optimization of the new permethylation procedure provided high permethylation efficiency for both hexose (>98%) and HexNAc (>99%) residues, yields which were comparable to (or better than) those of some widely-used spin column-based procedures. A light vs. heavy labelling approach was employed to compare intact glycan yields from a popular spin-column based approach to the SCF approach. Recovery of intact N-glycans was significantly better with the SCF procedure (p < 0.05), but overall yield of O-glycans was similar or slightly diminished (p < 0.05 for tetrasaccharides or smaller). When the SCF procedure was employed upstream to hydrolysis, reduction and acetylation for glycan linkage analysis of pooled glycans from unfractionated blood plasma, analytical reproducibility was on par with that from previous spin column-based "glycan node" analysis results. When applied to blood plasma samples from stage III-IV breast cancer patients (n = 20) and age-matched controls (n = 20), the SCF procedure facilitated identification of three glycan nodes with significantly different distributions between the cases and controls (ROC c-statistics > 0.75; p < 0.01). In summary, the SCF permethylation procedure expedites and economizes both intact glycan analysis and linkage analysis of glycans from whole biospecimens.
Light aircraft crash safety program
NASA Technical Reports Server (NTRS)
Thomson, R. G.; Hayduk, R. J.
1974-01-01
NASA is embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.
Analytical Ultrasonics in Materials Research and Testing
NASA Technical Reports Server (NTRS)
Vary, A.
1986-01-01
Research results in analytical ultrasonics for characterizing structural materials from metals and ceramics to composites are presented. General topics covered by the conference included: status and advances in analytical ultrasonics for characterizing material microstructures and mechanical properties; status and prospects for ultrasonic measurements of microdamage, degradation, and underlying morphological factors; status and problems in precision measurements of frequency-dependent velocity and attenuation for materials analysis; procedures and requirements for automated, digital signal acquisition, processing, analysis, and interpretation; incentives for analytical ultrasonics in materials research and materials processing, testing, and inspection; and examples of progress in ultrasonics for interrelating microstructure, mechanical properties, and dynamic response.
Zayed, M A; El-Habeeb, Abeer A
2009-06-01
The reactions between the drug buspirone (busp) in its base form and the amphoteric reagent iodine (n-donor and/or σ-acceptor), and with tetracyanoethylene (TCNE) as a π-acceptor reagent, have been studied spectrophotometrically at different reactant concentrations, time intervals, temperatures, and with different solvents and wavelengths, with the aim of selecting the conditions that give the most suitable molar extinction coefficients. This study aims chiefly to throw light on the nature of these reactions and to select the most proper conditions for spectrophotometric application of these reagents to determine this biologically active drug used in treating different diseases. The reaction mechanism involves the formation of busp-I₂ outer and inner sphere complexes. The separated busp-I₂ solid product obtained was investigated using elemental analyses, FT-IR, thermal analyses (TA) and electron ionization mass spectrometry (EI-MS) and was found to be biologically active. The reaction mechanism of busp-TCNE involves the formation of a charge transfer (CT) complex. The analytical parameters of the proposed spectrophotometric procedures were calculated. These procedures were applied in the analysis of busp in its formulations as a drug used to treat psychiatric illnesses. The values of the Sandell sensitivity, standard deviation (SD), relative standard deviation (RSD) and recovery percentage show the high sensitivity of these procedures. This study also presents a promising new busp-I₂ drug derivative that can be used more efficiently for the same purposes as its parent. It gives a clear idea about the possible metabolites and metabolic pathways of busp and its derivative that may occur in vivo. Copyright 2009 John Wiley & Sons, Ltd.
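The molar extinction coefficients mentioned above follow from the Beer-Lambert relation A = εlc. The sketch below shows that arithmetic for a hypothetical charge-transfer band; the absorbance, path length and concentration are illustrative and are not values from the study.

```python
# Beer-Lambert estimate of a molar extinction coefficient, A = epsilon * l * c.
# All numbers are illustrative; they are not taken from the busp-I2 / busp-TCNE study.

A = 0.652          # measured absorbance at the CT band maximum (hypothetical)
path_cm = 1.0      # cuvette path length in cm
conc_M = 2.5e-5    # complex concentration in mol/L (hypothetical)

epsilon = A / (path_cm * conc_M)   # L mol^-1 cm^-1
print(f"epsilon = {epsilon:.3e} L mol^-1 cm^-1")
```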
40 CFR 63.7521 - What fuel analyses and procedures must I use?
Code of Federal Regulations, 2010 CFR
2010-07-01
..., at a point prior to mixing with other dissimilar fuel types. (iv) For each fuel type, the analytical methods, with the expected minimum detection levels, to be used for the measurement of selected total metals, chlorine, or mercury. (v) If you request to use an alternative analytical method other than those...
40 CFR 91.414 - Raw gaseous exhaust sampling and analytical system description.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Raw gaseous exhaust sampling and... Gaseous Exhaust Test Procedures § 91.414 Raw gaseous exhaust sampling and analytical system description... the component systems. (g) The following requirements must be incorporated in each system used for raw...
40 CFR 91.414 - Raw gaseous exhaust sampling and analytical system description.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Raw gaseous exhaust sampling and... Gaseous Exhaust Test Procedures § 91.414 Raw gaseous exhaust sampling and analytical system description... the component systems. (g) The following requirements must be incorporated in each system used for raw...
40 CFR 89.412 - Raw gaseous exhaust sampling and analytical system description.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Raw gaseous exhaust sampling and...-IGNITION ENGINES Exhaust Emission Test Procedures § 89.412 Raw gaseous exhaust sampling and analytical... must be incorporated in each system used for raw testing under this subpart. (1) [Reserved] (2) The...
40 CFR 89.412 - Raw gaseous exhaust sampling and analytical system description.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Raw gaseous exhaust sampling and...-IGNITION ENGINES Exhaust Emission Test Procedures § 89.412 Raw gaseous exhaust sampling and analytical... must be incorporated in each system used for raw testing under this subpart. (1) [Reserved] (2) The...
Predictors of Bullying and Victimization in Childhood and Adolescence: A Meta-Analytic Investigation
ERIC Educational Resources Information Center
Cook, Clayton R.; Williams, Kirk R.; Guerra, Nancy G.; Kim, Tia E.; Sadek, Shelly
2010-01-01
Research on the predictors of 3 bully status groups (bullies, victims, and bully victims) for school-age children and adolescents was synthesized using meta-analytic procedures. The primary purpose was to determine the relative strength of individual and contextual predictors to identify targets for prevention and intervention. Age and how…
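As a minimal illustration of the pooling step that such meta-analytic syntheses rely on, the sketch below combines hypothetical study effect sizes with inverse-variance weights under a fixed-effect model. This is a generic textbook computation, not the specific procedure used in the study above.

```python
import math

# Hypothetical study effect sizes (e.g., standardized mean differences) and their variances
effects = [0.42, 0.55, 0.31, 0.60]
variances = [0.020, 0.035, 0.015, 0.050]

weights = [1.0 / v for v in variances]          # inverse-variance weights
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))

print(f"pooled effect = {pooled:.3f} (95% CI {pooled - 1.96*se_pooled:.3f} "
      f"to {pooled + 1.96*se_pooled:.3f})")
```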
Mega-Analysis of School Psychology Blueprint for Training and Practice Domains
ERIC Educational Resources Information Center
Burns, Matthew K.; Kanive, Rebecca; Zaslofsky, Anne F.; Parker, David C.
2013-01-01
Meta-analytic research is an effective method for synthesizing existing research and for informing practice and policy. Hattie (2009) suggested that meta-analytic procedures could be employed to existing meta-analyses to create a mega-analysis. The current mega-analysis examined a sample of 47 meta-analyses according to the "School…
Smooth Pursuit in Schizophrenia: A Meta-Analytic Review of Research since 1993
ERIC Educational Resources Information Center
O'Driscoll, Gillian A.; Callahan, Brandy L.
2008-01-01
Abnormal smooth pursuit eye-tracking is one of the most replicated deficits in the psychophysiological literature in schizophrenia [Levy, D. L., Holzman, P. S., Matthysse, S., & Mendell, N. R. (1993). "Eye tracking dysfunction and schizophrenia: A critical perspective." "Schizophrenia Bulletin, 19", 461-505]. We used meta-analytic procedures to…
Classifying Correlation Matrices into Relatively Homogeneous Subgroups: A Cluster Analytic Approach
ERIC Educational Resources Information Center
Cheung, Mike W.-L.; Chan, Wai
2005-01-01
Researchers are becoming interested in combining meta-analytic techniques and structural equation modeling to test theoretical models from a pool of studies. Most existing procedures are based on the assumption that all correlation matrices are homogeneous. Few studies have addressed what the next step should be when studies being analyzed are…
Fitting Meta-Analytic Structural Equation Models with Complex Datasets
ERIC Educational Resources Information Center
Wilson, Sandra Jo; Polanin, Joshua R.; Lipsey, Mark W.
2016-01-01
A modification of the first stage of the standard procedure for two-stage meta-analytic structural equation modeling for use with large complex datasets is presented. This modification addresses two common problems that arise in such meta-analyses: (a) primary studies that provide multiple measures of the same construct and (b) the correlation…
ERIC Educational Resources Information Center
Gao, Ruomei
2015-01-01
In a typical chemistry instrumentation laboratory, students learn analytical techniques through a well-developed procedure. Such an approach, however, does not engage students in a creative endeavor. To foster the intrinsic motivation of students' desire to learn, improve their confidence in self-directed learning activities and enhance their…
ERIC Educational Resources Information Center
Zhang, Zhidong
2016-01-01
This study explored an alternative assessment procedure to examine learning trajectories of matrix multiplication. It took rule-based analytical and cognitive task analysis methods specifically to break down operation rules for a given matrix multiplication. Based on the analysis results, a hierarchical Bayesian network, an assessment model,…
21 CFR 320.29 - Analytical methods for an in vivo bioavailability or bioequivalence study.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 5 2010-04-01 2010-04-01 false Analytical methods for an in vivo bioavailability..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS FOR HUMAN USE BIOAVAILABILITY AND BIOEQUIVALENCE REQUIREMENTS Procedures for Determining the Bioavailability or Bioequivalence of Drug Products § 320.29...
Techniques for Forecasting Air Passenger Traffic
NASA Technical Reports Server (NTRS)
Taneja, N.
1972-01-01
The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.
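A minimal example of the time-series flavour of forecasting mentioned above is a log-linear trend extrapolation, which assumes roughly constant percentage growth. The traffic figures below are invented and serve only to show the mechanics.

```python
import numpy as np

# Hypothetical annual passenger counts (millions); a log-linear trend assumes
# roughly constant percentage growth, a common simple time-series model.
years = np.array([1965, 1966, 1967, 1968, 1969, 1970])
pax = np.array([88.0, 97.0, 108.0, 118.0, 131.0, 144.0])

slope, intercept = np.polyfit(years, np.log(pax), 1)

for y in (1971, 1972, 1973):
    forecast = np.exp(intercept + slope * y)
    print(f"{y}: {forecast:.1f} million passengers (trend extrapolation)")
```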
Code of Federal Regulations, 2013 CFR
2013-01-01
... robust analytical methods. The Department seeks to use qualitative and quantitative analytical methods... uncertainties will be carried forward in subsequent analyses. The use of quantitative models will be... manufacturers and other interested parties. The use of quantitative models will be supplemented by qualitative...
Code of Federal Regulations, 2012 CFR
2012-01-01
... robust analytical methods. The Department seeks to use qualitative and quantitative analytical methods... uncertainties will be carried forward in subsequent analyses. The use of quantitative models will be... manufacturers and other interested parties. The use of quantitative models will be supplemented by qualitative...
Code of Federal Regulations, 2014 CFR
2014-01-01
... robust analytical methods. The Department seeks to use qualitative and quantitative analytical methods... uncertainties will be carried forward in subsequent analyses. The use of quantitative models will be... manufacturers and other interested parties. The use of quantitative models will be supplemented by qualitative...
ERIC Educational Resources Information Center
Hodge, David R.; Jackson, Kelly F.; Vaughn, Michael G.
2012-01-01
This study assessed the effectiveness of culturally sensitive interventions (CSIs) ("N" = 10) designed to address substance use among minority youths. Study methods consisted of systematic search procedures, quality of study ratings, and meta-analytic techniques to gauge effects and evaluate publication bias. The results, across all measures and…
A procedure is presented that uses a vacuum distillation/gas chromatography/mass spectrometry system for the analysis of volatile organic compounds in problematic matrices. The procedure compensates for matrix effects and provides both analytical results and confidence intervals from...
Physical and Chemical Properties of the Copper-Alanine System: An Advanced Laboratory Project
ERIC Educational Resources Information Center
Farrell, John J.
1977-01-01
An integrated physical-analytical-inorganic chemistry laboratory procedure for use with undergraduate biology majors is described. The procedure requires five to six laboratory periods and includes acid-base standardizations, potentiometric determinations, computer usage, spectrophotometric determinations of crystal-field splitting…
Capacity improvement analytical tools and benchmark development for terminal operations
DOT National Transportation Integrated Search
2009-10-01
With U.S. air traffic predicted to triple over the : next fifteen years, new technologies and procedures are : being considered to cope with this growth. As such, it : may be of use to quickly and easily evaluate any new : technologies or procedures ...
Analytical control test plan and microbiological methods for the water recovery test
NASA Technical Reports Server (NTRS)
Traweek, M. S. (Editor); Tatara, J. D. (Editor)
1994-01-01
Qualitative and quantitative laboratory results are important to the decision-making process. In some cases, they may represent the only basis for deciding between two or more given options or processes. Therefore, it is essential that the handling of laboratory samples and the analytical operations employed are performed with a deliberate level of conscientious effort. Reporting erroneous results can lead to faulty interpretations and result in misinformed decisions. This document provides the analytical control specifications which will govern future test procedures related to all Water Recovery Test (WRT) Phase 3 activities to be conducted at the National Aeronautics and Space Administration/Marshall Space Flight Center (NASA/MSFC). This document addresses the process which will be used to verify analytical data generated throughout the test period; identifies the responsibilities of key personnel and participating laboratories and the chains of communication to be followed; and ensures that approved methodology and procedures are used during WRT activities. This document does not outline specifics, but provides a minimum guideline by which sampling protocols, analysis methodologies, test site operations, and laboratory operations should be developed.
Vorberg, Ellen; Fleischer, Heidi; Junginger, Steffen; Liu, Hui; Stoll, Norbert; Thurow, Kerstin
2016-10-01
Life science areas require specific sample pretreatment to increase the concentration of the analytes and/or to convert the analytes into an appropriate form for the detection and separation systems. Various workstations are commercially available, allowing for automated biological sample pretreatment. Nevertheless, due to the temperature, pressure, and volume conditions required in typical element- and structure-specific measurements, these automated platforms are not suitable for such analytical processes. Thus, the purpose of the presented investigation was the design, realization, and evaluation of an automated system ensuring high-precision sample preparation for a variety of analytical measurements. The developed system has to enable system adaptation and high performance flexibility. Furthermore, the system has to be capable of dealing with the wide range of required vessels simultaneously, allowing for less costly and time-consuming process steps. The system's functionality has been confirmed in various validation sequences. Using element-specific measurements, the automated system was up to 25% more precise than the manual procedure, and as precise as the manual procedure using structure-specific measurements. © 2015 Society for Laboratory Automation and Screening.
Piérard, Gérald E; Courtois, Justine; Ritacco, Caroline; Humbert, Philippe; Fanian, Ferial; Piérard-Franchimont, Claudine
2015-01-01
Background: In cosmetic science, noninvasive sampling of the upper part of the stratum corneum is conveniently performed using strippings with adhesive-coated discs (SACD) and cyanoacrylate skin surface strippings (CSSSs). Methods: Under controlled conditions, it is possible to scrutinize SACD and CSSS with objectivity using appropriate methods of analytical morphology. These procedures apply to a series of clinical conditions including xerosis grading, comedometry, corneodynamics, corneomelametry, corneosurfametry, corneoxenometry, and dandruff assessment. Results: With any of the analytical evaluations, SACD and CSSS provide specific salient information that is useful in the field of cosmetology. In particular, both methods appear valuable and complementary in assessing the human skin compatibility of personal skincare products. Conclusion: A set of quantitative analytical methods applicable to the minimally invasive and low-cost SACD and CSSS procedures allow for a sound assessment of cosmetic effects on the stratum corneum. Under regular conditions, both methods are painless and do not induce adverse events. Globally, CSSS appears more precise and informative than the regular SACD stripping.
Multianalyte imaging in one-shot format sensors for natural waters.
Lapresta-Fernández, A; Huertas, Rafael; Melgosa, Manuel; Capitán-Vallvey, L F
2009-03-23
A one-shot multisensor based on ionophore-chromoionophore chemistry for optical monitoring of potassium, magnesium and hardness in water is presented. The analytical procedure uses a black and white non-cooled CCD camera for image acquisition of the one-shot multisensor after reaction, followed by data treatment for quantitation using the grey-value pixel average from a defined region of interest in each sensing area to build the analytical parameter 1-alpha. Under optimised experimental conditions, the procedure shows a large linear range, up to 6 orders of magnitude using the linearised model, and good detection limits: 9.92 × 10⁻⁵ mM, 1.86 × 10⁻³ mM and 1.30 × 10⁻² mg L⁻¹ of CaCO₃ for potassium, magnesium and hardness, respectively. This analysis system exhibits good precision in terms of relative standard deviation (RSD%): from 2.3 to 3.8 for potassium, from 5.0 to 6.8 for magnesium and from 5.4 to 5.9 for hardness. The trueness of this multisensor procedure was demonstrated by comparing it with results obtained with a DAD spectrophotometer used as a reference. Finally, it was satisfactorily applied to the analysis of these analytes in miscellaneous samples, such as water and beverage samples from different origins, validating the results against atomic absorption spectrometry (AAS) as the reference procedure.
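A rough sketch of how a grey-value parameter of this kind could be built from a CCD image is given below. The region of interest, the reference grey values for the fully protonated and deprotonated chromoionophore, and the interpolation used to estimate the degree of protonation are all assumptions for illustration, not the authors' exact calibration.

```python
import numpy as np

# Hypothetical 8-bit greyscale image of one sensing area (random here, for illustration only)
rng = np.random.default_rng(0)
image = rng.integers(60, 200, size=(200, 200)).astype(float)

# Region of interest covering the centre of the sensing spot (assumed coordinates)
roi = image[80:120, 80:120]
grey_mean = roi.mean()

# Assumed reference grey values for the fully protonated and fully deprotonated
# forms of the chromoionophore, taken from calibration measurements
grey_prot, grey_deprot = 50.0, 220.0

# Degree of protonation alpha estimated by linear interpolation between the
# two reference states; 1 - alpha is then used as the analytical parameter
alpha = (grey_deprot - grey_mean) / (grey_deprot - grey_prot)
print(f"mean grey value = {grey_mean:.1f}, 1 - alpha = {1.0 - alpha:.3f}")
```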
Lei, W Q; El Haddad, J; Motto-Ros, V; Gilon-Delepine, N; Stankova, A; Ma, Q L; Bai, X S; Zheng, L J; Zeng, H P; Yu, J
2011-07-01
Mineral elements contained in commercially available milk powders, including seven infant formulae and one adult milk, were analyzed with inductively coupled plasma atomic emission spectrometry (ICP-AES) and laser-induced breakdown spectroscopy (LIBS). The purpose of this work was, through a direct comparison of the analytical results, to provide an assessment of the performance of LIBS, and especially of the procedure of calibration-free LIBS (CF-LIBS), in dealing with organic compounds such as milk powders. In our experiments, the matrix effect was clearly observed to affect the analytical results each time laser ablation was employed for sampling. Such an effect was, in addition, directly observed by determining the physical parameters of the plasmas induced on the different samples. The CF-LIBS procedure was implemented to deduce the concentrations of Mg and K with Ca as the internal reference element. Quantitative analytical results with CF-LIBS were validated against ICP-AES measurements and the nominal concentrations specified for the commercial milks. The good results obtained with the CF-LIBS procedure demonstrate its capacity to take into account the difference in physical parameters of the plasma in the calculation of the concentrations of mineral elements, which allows a significant reduction of the matrix effect related to laser ablation. We finally discuss the way to optimize the implementation of the CF-LIBS procedure for the analysis of mineral elements in organic materials.
NASA Astrophysics Data System (ADS)
Syta, Olga; Rozum, Karol; Choińska, Marta; Zielińska, Dobrochna; Żukowska, Grażyna Zofia; Kijowska, Agnieszka; Wagner, Barbara
2014-11-01
An analytical procedure for the comprehensive chemical characterization of samples from medieval Nubian wall-paintings by means of portable X-ray fluorescence (pXRF), laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS) and Raman spectroscopy (RS) was proposed in this work. The procedure was used for elemental and molecular investigations of samples from archeological excavations in Nubia (modern southern Egypt and northern Sudan). Numerous remains of churches with painted decorations dated back to the 7th-14th century were excavated in the region of the medieval kingdoms of Nubia, but many aspects of this art and its technology are still unknown. Samples from the selected archeological sites (Faras, Old Dongola and Banganarti) were analyzed in the form of transfers (n = 26), small fragments collected during the excavations (n = 35) and cross sections (n = 15). pXRF was used to collect data on elemental composition, LA-ICPMS allowed mapping of selected elements, while RS was used to obtain molecular information about the samples. The preliminary results indicated the usefulness of the proposed analytical procedure for distinguishing the substances from both the surface and sub-surface domains of the wall-paintings. The possibility to identify raw materials from the wall-paintings will be used in further systematic, archeometric studies devoted to the detailed comparison of various historic Nubian centers.
Correlation of ground tests and analyses of a dynamically scaled Space Station model configuration
NASA Technical Reports Server (NTRS)
Javeed, Mehzad; Edighoffer, Harold H.; Mcgowan, Paul E.
1993-01-01
Verification of analytical models through correlation with ground test results of a complex space truss structure is demonstrated. A multi-component, dynamically scaled space station model configuration is the focus structure for this work. Previously established test/analysis correlation procedures are used to develop improved component analytical models. Integrated system analytical models, consisting of updated component analytical models, are compared with modal test results to establish the accuracy of system-level dynamic predictions. Design sensitivity model updating methods are shown to be effective for providing improved component analytical models. Also, the effects of component model accuracy and interface modeling fidelity on the accuracy of integrated model predictions are examined.
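One widely used metric for this kind of test/analysis correlation is the modal assurance criterion (MAC); the abstract does not state which correlation metric was used, so the sketch below is only a generic illustration with made-up mode shapes.

```python
import numpy as np

def mac(phi_test, phi_fem):
    """Modal assurance criterion between two mode-shape vectors (0 = no correlation, 1 = identical)."""
    num = abs(phi_test @ phi_fem) ** 2
    return num / ((phi_test @ phi_test) * (phi_fem @ phi_fem))

# Hypothetical measured and analytical mode shapes at five sensor locations
phi_test = np.array([0.10, 0.35, 0.62, 0.88, 1.00])
phi_fem  = np.array([0.12, 0.33, 0.60, 0.90, 0.98])

print(f"MAC = {mac(phi_test, phi_fem):.3f}")
```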
NASTRAN analysis of the 1/8-scale space shuttle dynamic model
NASA Technical Reports Server (NTRS)
Bernstein, M.; Mason, P. W.; Zalesak, J.; Gregory, D. J.; Levy, A.
1973-01-01
The space shuttle configuration has more complex structural dynamic characteristics than previous launch vehicles, primarily because of the high modal density at low frequencies and the high degree of coupling between the lateral and longitudinal motions. An accurate analytical representation of these characteristics is a primary means for treating structural dynamics problems during the design phase of the shuttle program. The 1/8-scale model program was developed to explore the adequacy of available analytical modeling technology and to provide the means for investigating problems which are more readily treated experimentally. The basic objectives of the 1/8-scale model program are: (1) to provide early verification of analytical modeling procedures on a shuttle-like structure, (2) to demonstrate important vehicle dynamic characteristics of a typical shuttle design, (3) to disclose any previously unanticipated structural dynamic characteristics, and (4) to provide for development and demonstration of cost-effective prototype testing procedures.
METHOD 544. DETERMINATION OF MICROCYSTINS AND ...
Method 544 is an accurate and precise analytical method to determine six microcystins (including MC-LR) and nodularin in drinking water using solid phase extraction and liquid chromatography tandem mass spectrometry (SPE-LC/MS/MS). The advantage of this SPE-LC/MS/MS method is its sensitivity and its ability to speciate the microcystins. This method development task establishes sample preservation techniques, sample concentration and analytical procedures, aqueous and extract holding time criteria, and quality control procedures. Draft Method 544 has undergone multi-laboratory verification to ensure that other laboratories can implement the method and achieve the quality control measures specified in the method. It is anticipated that Method 544 may be used in UCMR 4 to collect nationwide occurrence data for selected microcystins in drinking water. The purpose of this research project is to develop an accurate and precise analytical method to concentrate and determine selected MCs and nodularin in drinking water.
Montgomery, L D; Montgomery, R W; Guisado, R
1995-05-01
This investigation demonstrates the feasibility of mental workload assessment by rheoencephalographic (REG) and multichannel electroencephalographic (EEG) monitoring. During the performance of this research, unique testing, analytical and display procedures were developed for REG and EEG monitoring that extend the current state of the art and provide valuable tools for the study of cerebral circulatory and neural activity during cognition. REG records are analyzed to provide indices of the right and left hemisphere hemodynamic changes that take place during each test sequence. The EEG data are modeled using regression techniques and mathematically transformed to provide energy-density distributions of the scalp electrostatic field. These procedures permit concurrent REG/EEG cognitive testing not possible with current techniques. The introduction of a system for recording and analysis of cognitive REG/EEG test sequences facilitates the study of learning and memory disorders, dementia and other encephalopathies.
Some subclasses of multivalent functions involving a certain linear operator
NASA Astrophysics Data System (ADS)
Srivastava, H. M.; Patel, J.
2005-10-01
The authors investigate various inclusion and other properties of several subclasses of the class of normalized p-valent analytic functions in the open unit disk, which are defined here by means of a certain linear operator. Problems involving generalized neighborhoods of analytic functions in the class are investigated. Finally, some applications of fractional calculus operators are considered.
Analysis of the interaction of a weak normal shock wave with a turbulent boundary layer
NASA Technical Reports Server (NTRS)
Melnik, R. E.; Grossman, B.
1974-01-01
The method of matched asymptotic expansions is used to analyze the interaction of a normal shock wave with an unseparated turbulent boundary layer on a flat surface at transonic speeds. The theory leads to a three-layer description of the interaction in the double limit of Reynolds number approaching infinity and Mach number approaching unity. The interaction involves an outer, inviscid rotational layer, a constant shear-stress wall layer, and a blending region between them. The pressure distribution is obtained from a numerical solution of the outer-layer equations by a mixed-flow relaxation procedure. An analytic solution for the skin friction is determined from the inner-layer equations. The significance of the mathematical model is discussed with reference to existing experimental data.
General aviation crash safety program at Langley Research Center
NASA Technical Reports Server (NTRS)
Thomson, R. G.
1976-01-01
The purpose of the crash safety program is to support development of the technology to define and demonstrate new structural concepts for improved crash safety and occupant survivability in general aviation aircraft. The program involves three basic areas of research: full-scale crash simulation testing, nonlinear structural analyses necessary to predict failure modes and collapse mechanisms of the vehicle, and evaluation of energy absorption concepts for specific component design. Both analytical and experimental methods are being used to develop expertise in these areas. Analyses include both simplified procedures for estimating energy absorption capabilities and more complex computer programs for analysis of general airframe response. Full-scale tests of typical structures as well as tests on structural components are being used to verify the analyses and to demonstrate improved design concepts.
Rutty, Guy N; Barber, Jade; Amoroso, Jasmin; Morgan, Bruno; Graham, Eleanor A M
2013-12-01
Post-mortem computed tomography angiography (PMCTA) involves the injection of contrast agents. This could both have a dilution effect on biological fluid samples and affect subsequent post-contrast analytical laboratory processes. We undertook a small sample study of 10 targeted and 10 whole body PMCTA cases to consider whether or not these two methods of PMCTA could affect post-PMCTA cadaver blood based DNA identification. We used standard methodology to examine DNA from blood samples obtained before and after the PMCTA procedure. We illustrate that neither of these PMCTA methods had an effect on the alleles called following short tandem repeat based DNA profiling, and therefore on the ability to undertake post-PMCTA blood based DNA identification.
Molecularly imprinted polymers as selective adsorbents for ambient plasma mass spectrometry.
Cegłowski, Michał; Smoluch, Marek; Reszke, Edward; Silberring, Jerzy; Schroeder, Grzegorz
2017-05-01
The application of molecularly imprinted polymers (MIPs) as molecular scavengers for ambient plasma ionization mass spectrometry has been reported for the first time. MIPs were synthesized using methacrylic acid as functional monomer; nicotine, propyphenazone, or methylparaben as templates; ethylene glycol dimethacrylate as a cross-linker; and 2,2'-azobisisobutyronitrile as polymerization initiator. To perform ambient plasma ionization experiments, a setup consisting of a heated crucible, a flowing atmospheric-pressure afterglow (FAPA) plasma ion source, and a quadrupole ion trap mass spectrometer was used. The heated crucible with programmable temperature allows for desorption of the analytes from the MIP structure, which results in their direct introduction into the ion stream. Limits of detection, linearity of the proposed analytical procedure, and selectivities have been determined for three analytes: nicotine, propyphenazone, and methylparaben. The analytes were chosen from various classes of organic compounds to show the feasibility of the analytical procedure. The limits of detection (LODs) were 10 nM, 10 μM, and 0.5 μM for nicotine, propyphenazone, and methylparaben, respectively. In comparison with measurements performed on the non-imprinted polymers, the LODs were improved by at least one order of magnitude due to preconcentration of the sample and reduction of the background noise that contributes to signal suppression. The described procedure has shown linearity over a broad range of concentrations. The overall time of a single analysis is short, requiring ca. 5 min. The developed technique was applied for the determination of nicotine, propyphenazone, and methylparaben in spiked real-life samples, with recoveries of 94.6-98.4%. The proposed method is rapid, sensitive, and accurate, providing a new option for the detection of small organic compounds in various samples.
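The abstract reports limits of detection but not how they were computed. One common convention estimates the LOD as three times the standard deviation of blank responses divided by the calibration slope; the sketch below shows that generic computation with invented numbers.

```python
import numpy as np

# Generic 3-sigma limit-of-detection estimate (a common convention; the paper's
# own procedure is not specified in the abstract). All numbers are invented.

conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0])            # standard concentrations, uM
signal = np.array([120., 510., 905., 1710., 4230.])   # mean MS signal at each level

slope, intercept = np.polyfit(conc, signal, 1)

blank_replicates = np.array([118., 126., 115., 122., 119., 121.])
s_blank = blank_replicates.std(ddof=1)

lod = 3.0 * s_blank / slope
print(f"LOD ~= {lod:.3f} uM")
```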
A singularity free analytical solution of artificial satellite motion with drag
NASA Technical Reports Server (NTRS)
Scheifele, G.; Mueller, A. C.; Starke, S. E.
1977-01-01
The connection between the existing Delaunay-Similar and Poincare-Similar satellite theories in the true anomaly version is outlined for the J₂ perturbation and the new drag approach. An overall description of the concept of the approach is given, while the necessary expansions and the procedure to arrive at the computer program for the canonical forces are delineated. The procedure for the analytical integration of these developed equations is described. In addition, some numerical results are given. The computer program for the algebraic multiplication of the Fourier series, which creates the FORTRAN coding in an automatic manner, is described and documented.
NASA Technical Reports Server (NTRS)
Romanofsky, Robert R.
1989-01-01
In this report, a thorough analytical procedure is developed for evaluating the frequency-dependent loss characteristics and effective permittivity of microstrip lines. The technique is based on the measured reflection coefficient of microstrip resonator pairs. Experimental data, including quality factor Q, effective relative permittivity, and fringing for 50-Ω lines on gallium arsenide (GaAs) from 26.5 to 40.0 GHz, are presented. The effects of an imperfect open circuit, coupling losses, and loading of the resonant frequency are considered. A cosine-tapered ridge-guide test fixture is described. It was found to be well suited to the device characterization.
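For a half-wavelength microstrip resonator, the effective relative permittivity can be estimated from the resonant frequency and physical length, and the loaded Q from the 3 dB bandwidth. The sketch below uses hypothetical dimensions and measurements (fringing neglected), not values from the report.

```python
# Half-wavelength microstrip resonator relations (fringing neglected for simplicity).
# All numbers are hypothetical and only illustrate the arithmetic.

c = 2.998e8            # speed of light, m/s
length_m = 1.5e-3      # physical resonator length (hypothetical), m
f_res = 33.0e9         # measured resonant frequency (hypothetical), Hz
f_3db_bw = 0.22e9      # 3 dB bandwidth of the resonance (hypothetical), Hz

eps_eff = (c / (2.0 * length_m * f_res)) ** 2   # effective relative permittivity
q_loaded = f_res / f_3db_bw                     # loaded quality factor

print(f"eps_eff ~= {eps_eff:.2f}, loaded Q ~= {q_loaded:.0f}")
```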
Sajnóg, Adam; Hanć, Anetta; Barałkiewicz, Danuta
2018-05-15
Analysis of clinical specimens by imaging techniques allows determination of the content and distribution of trace elements on the surface of the examined sample. In order to obtain reliable results, the developed procedure should be based not only on a properly prepared sample and a properly performed calibration. It is also necessary to carry out all phases of the procedure in accordance with the principles of chemical metrology, whose main pillars are the use of validated analytical methods, establishing the traceability of the measurement results, and the estimation of the uncertainty. This review paper discusses aspects related to sampling, preparation and analysis of clinical samples by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), with emphasis on metrological aspects, i.e. selected validation parameters of the analytical method, the traceability of the measurement result and the uncertainty of the result. This work promotes the introduction of metrology principles into chemical measurement, with emphasis on LA-ICP-MS, a comparative method that requires a studious approach to the development of the analytical procedure in order to acquire reliable quantitative results. Copyright © 2018 Elsevier B.V. All rights reserved.
Bernini, Patrizia; Bertini, Ivano; Luchinat, Claudio; Nincheri, Paola; Staderini, Samuele; Turano, Paola
2011-04-01
¹H NMR metabolic profiling of urine, serum and plasma has been used to monitor the impact of the pre-analytical steps on the sample quality and stability in order to propose standard operating procedures (SOPs) for deposition in biobanks. We analyzed the quality of serum and plasma samples as a function of the elapsed time (t = 0-4 h) between blood collection and processing and of the time from processing to freezing (up to 24 h). The stability of the urine metabolic profile over time (up to 24 h) at various storage temperatures was monitored as a function of the different pre-analytical treatments like pre-storage centrifugation, filtration, and addition of the bacteriostatic preservative sodium azide. Appreciable changes in the profiles, reflecting changes in the concentration of a number of metabolites, were detected and discussed in terms of chemical and enzymatic reactions for both blood and urine samples. Appropriate procedures for blood derivatives collection and urine preservation/storage that allow maintaining as much as possible the original metabolic profile of the fresh samples emerge, and are proposed as SOPs for biobanking.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, J; Koester, C
The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method for analysis of aldicarb, bromadiolone, carbofuran, oxamyl, and methomyl in water by high performance liquid chromatography tandem mass spectrometry (HPLC-MS/MS), titled Method EPA MS666. This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to validate and verify the analytical procedures described in MS666 for analysis of carbamate pesticides in aqueous samples. The gathered data from this validation study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of Method EPA MS666 can be determined.
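Validation studies of this kind typically summarize fortified (spiked) sample results as percent recovery and relative standard deviation, which then feed the quality control acceptance criteria. The sketch below shows that bookkeeping with invented replicate results; the actual acceptance limits for Method EPA MS666 are not reproduced here.

```python
import statistics

# Hypothetical replicate results for a sample fortified at 10.0 ug/L
spike_level = 10.0
measured = [9.6, 10.3, 9.9, 10.1, 9.7, 10.4]   # ug/L, invented numbers

mean_val = statistics.mean(measured)
rsd_pct = 100.0 * statistics.stdev(measured) / mean_val
recovery_pct = 100.0 * mean_val / spike_level

print(f"mean recovery = {recovery_pct:.1f} %, RSD = {rsd_pct:.1f} %")
```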
Analysis of Ethanolamines: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS888
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, J; Vu, A; Koester, C
The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method titled 'Analysis of Diethanolamine, Triethanolamine, n-Methyldiethanolamine, and n-Ethyldiethanolamine in Water by Single Reaction Monitoring Liquid Chromatography/Tandem Mass Spectrometry (LC/MS/MS): EPA Method MS888'. This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to validate and verify the analytical procedures described in 'EPA Method MS888' for analysis of the listed ethanolamines in aqueous samples. The gathered data from this validation study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of 'EPA Method MS888' can be determined.
Analysis of Thiodiglycol: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS777
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, J; Koester, C
The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method for the analysis of thiodiglycol, the breakdown product of the sulfur mustard HD, in water by high performance liquid chromatography tandem mass spectrometry (HPLC-MS/MS), titled Method EPA MS777 (hereafter referred to as EPA CRL SOP MS777). This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to verify the analytical procedures described in MS777 for analysis of thiodiglycol in aqueous samples. The gathered data from this study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of Method EPA MS777 can be determined.
Veillet, Sébastien; Tomao, Valérie; Ruiz, Karine; Chemat, Farid
2010-07-26
In the past 10 years, trends in analytical chemistry have turned toward green chemistry, which endeavours to develop new techniques that reduce the influence of chemicals on the environment. The challenge of green analytical chemistry is to develop techniques that meet the request for information output while reducing the environmental impact of the analyses. For this purpose petroleum-based solvents have to be avoided. Therefore, increasing interest has been given to new green solvents such as limonene and their potential as alternative solvents in analytical chemistry. In this work limonene was used instead of toluene in the Dean-Stark procedure. Moisture determination on a wide range of food matrices was performed using either toluene or limonene. Both solvents gave similar water percentages in food materials, i.e. 89.3 ± 0.5 and 89.5 ± 0.7 for carrot, 68.0 ± 0.7 and 68.6 ± 1.9 for garlic, and 64.1 ± 0.5 and 64.0 ± 0.3 for minced meat with toluene and limonene, respectively. Consequently, limonene could be used as a good alternative solvent in the Dean-Stark procedure. Copyright 2010 Elsevier B.V. All rights reserved.
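The moisture percentage reported from a Dean-Stark determination reduces to the mass of collected water over the sample mass. The sketch below shows that calculation with hypothetical readings; the sample mass and trap volume are not taken from the paper.

```python
# Dean-Stark moisture calculation with hypothetical readings.
sample_mass_g = 25.00        # mass of food sample taken (hypothetical)
water_volume_ml = 22.35      # water collected in the graduated trap (hypothetical)
water_density = 0.997        # g/mL near room temperature

moisture_pct = 100.0 * (water_volume_ml * water_density) / sample_mass_g
print(f"moisture = {moisture_pct:.1f} %")
```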
Role of microextraction sampling procedures in forensic toxicology.
Barroso, Mário; Moreno, Ivo; da Fonseca, Beatriz; Queiroz, João António; Gallardo, Eugenia
2012-07-01
The last two decades have provided analysts with more sensitive technology, enabling scientists from all analytical fields to see what they were not able to see just a few years ago. This increased sensitivity has allowed drug detection at very low concentrations and testing in unconventional samples (e.g., hair, oral fluid and sweat), which contain low analyte concentrations, and has also led to a reduction in sample size. Along with this reduction, and in response to the use of excessive amounts of potentially toxic organic solvents (with the associated environmental pollution and costs of proper disposal), there has been a growing tendency to use miniaturized sampling techniques. These procedures reduce organic solvent consumption to a minimum while providing a rapid, simple and cost-effective approach. In addition, they allow at least some degree of automation, which enhances sample throughput. Miniaturized sample preparation techniques may be roughly categorized into solid-phase and liquid-phase microextraction, depending on the nature of the analyte. This paper reviews recently published literature on the use of microextraction sampling procedures, with a special focus on the field of forensic toxicology.
Current role of liquid chromatography-mass spectrometry in clinical and forensic toxicology.
Maurer, Hans H
2007-08-01
This paper reviews multi-analyte single-stage and tandem liquid chromatography-mass spectrometry (LC-MS) procedures published after 2004 that use different mass analyzers (quadrupole, ion trap, time-of-flight) for screening, identification, and/or quantification of drugs, poisons, and/or their metabolites in blood, plasma, serum, or urine. Basic information about the biosample assayed, work-up, LC column, mobile phase, ionization type, mass spectral detection mode, and validation data of each procedure is summarized in tables. The following analytes are covered: drugs of abuse, analgesics, opioids, sedative-hypnotics, benzodiazepines, antidepressants including selective-serotonin reuptake inhibitors (SSRIs), herbal phenalkylamines (ephedrines), oral antidiabetics, antiarrhythmics and other cardiovascular drugs, antiretroviral drugs, toxic alkaloids, quaternary ammonium drugs and herbicides, and dialkylphosphate pesticides. The pros and cons of the reviewed procedures are critically discussed, particularly the need for studies on matrix effects, selectivity, and analyte stability, and the use of stable-isotope labeled internal standards instead of unlabeled therapeutic drugs. In conclusion, LC-MS will probably become a gold standard for detection of very low concentrations, particularly in alternative matrices, and for quantification in clinical and forensic toxicology. However, some drawbacks still need to be addressed and finally overcome.
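The matrix-effect checks the review calls for are often expressed as a simple peak-area ratio between a post-extraction spike and a neat standard. The sketch below shows that ratio with invented peak areas; it is a generic illustration, not a calculation taken from any of the reviewed procedures.

# Hypothetical sketch of a post-extraction spike matrix-effect calculation
# (ME% = 100 * area in spiked matrix extract / area in neat standard); areas are invented.
area_neat_standard = 1.00e6   # analyte peak area in a pure-solvent standard
area_matrix_spike  = 0.82e6   # same concentration spiked into a blank matrix extract

me_percent = 100 * area_matrix_spike / area_neat_standard
suppression = 100 - me_percent

print(f"matrix effect = {me_percent:.0f}% (ion suppression about {suppression:.0f}%)")
# A stable-isotope labeled internal standard co-eluting with the analyte is expected
# to track this suppression, which is why the review recommends its use.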
Thermoelectric DC conductivities in hyperscaling violating Lifshitz theories
NASA Astrophysics Data System (ADS)
Cremonini, Sera; Cvetič, Mirjam; Papadimitriou, Ioannis
2018-04-01
We analytically compute the thermoelectric conductivities at zero frequency (DC) in the holographic dual of a four-dimensional Einstein-Maxwell-Axion-Dilaton theory that admits a class of asymptotically hyperscaling violating Lifshitz backgrounds with a dynamical exponent z and hyperscaling violating parameter θ. We show that the heat current in the dual Lifshitz theory involves the energy flux, which is an irrelevant operator for z > 1. The linearized fluctuations relevant for computing the thermoelectric conductivities turn on a source for this irrelevant operator, leading to several novel and non-trivial aspects in the holographic renormalization procedure and the identification of the physical observables in the dual theory. Moreover, imposing Dirichlet or Neumann boundary conditions on the spatial components of one of the two Maxwell fields present leads to different thermoelectric conductivities. Dirichlet boundary conditions reproduce the thermoelectric DC conductivities obtained from the near horizon analysis of Donos and Gauntlett, while Neumann boundary conditions result in a new set of DC conductivities. We make preliminary analytical estimates for the temperature behavior of the thermoelectric matrix in appropriate regions of parameter space. In particular, at large temperatures we find that the only case which could lead to a linear resistivity ρ ∼ T corresponds to z = 4/3.
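For orientation, the quantities computed in this work are the coefficients of the standard DC linear-response relation between the charge current J and heat current Q and the sources E and ∇T. The display below is the generic textbook definition (conventions for factors of T vary between references), not the model-specific holographic expressions derived in the paper:

\[
\begin{pmatrix} J \\ Q \end{pmatrix}
=
\begin{pmatrix} \sigma & \alpha T \\ \bar{\alpha} T & \bar{\kappa} T \end{pmatrix}
\begin{pmatrix} E \\ -\nabla T / T \end{pmatrix},
\qquad
\kappa = \bar{\kappa} - \frac{\alpha \bar{\alpha} T}{\sigma},
\qquad
\rho = \sigma^{-1},
\]

where κ is the thermal conductivity at zero electric current and ρ is the DC electrical resistivity whose large-temperature behaviour (ρ ∼ T at z = 4/3) is quoted above.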