Analytical procedure validation and the quality by design paradigm.
Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno
2015-01-01
Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While the development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims at showing that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it, a control strategy can be set.
A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.
Płotka-Wasylka, J
2018-05-01
A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created by drawing on such tools as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol composed of five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, coloured from green through yellow to red to depict low, medium and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.
Martín-Sabroso, Cristina; Tavares-Fernandes, Daniel Filipe; Espada-García, Juan Ignacio; Torres-Suárez, Ana Isabel
2013-12-15
In this work, a protocol is developed to validate analytical procedures for the quantification of drug substances formulated in polymeric systems that comprise both drug entrapped in the polymeric matrix (assay:content test) and drug released from the systems (assay:dissolution test). This protocol is applied to the validation of two isocratic HPLC analytical procedures for the analysis of dexamethasone phosphate disodium microparticles for parenteral administration. Preparation of authentic samples and artificially "spiked" and "unspiked" samples is described. Specificity (ability to quantify dexamethasone phosphate disodium in the presence of constituents of the dissolution medium and other microparticle constituents), linearity, accuracy and precision are evaluated, in the range from 10 to 50 μg mL(-1) in the assay:content test procedure and from 0.25 to 10 μg mL(-1) in the assay:dissolution test procedure. The robustness of the analytical method to extract drug from microparticles is also assessed. The validation protocol developed allows us to conclude that both analytical methods are suitable for their intended purpose, but the lack of proportionality of the assay:dissolution analytical method should be taken into account. The validation protocol designed in this work could be applied to the validation of any analytical procedure for the quantification of drugs formulated in controlled release polymeric microparticles. Copyright © 2013 Elsevier B.V. All rights reserved.
Mass Spectrometric Rapid Diagnosis of Infectious Diseases.
1980-01-01
the analytical procedures to warrant reporting anew the whole analytical procedure. A. Sample Collection and Storage Procedure Urine samples were...positives or false-negatives. Next we have carried out a longitudinal study on urine samples obtained from groups of volunteer subjects vaccinated with...sterilization and storage procedures. 2. Developed new, simpler sample preparation techniques including one to handle tissue culture media. 3. Improved on the
Srinivas, Nuggehally R
2006-05-01
The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches, including various techniques, detection systems, and automation tools, that are available for effective separation, enhanced selectivity and sensitivity in the quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of the drug discovery research and development process. The key areas covered in this article with relevant case studies include: (a) simultaneous assay for parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in the pharmacokinetics, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations - simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in analysis of multiple compounds in select disease areas and/or in clinically important drug-drug interaction studies. A tabular representation of select examples of analysis is provided covering areas of separation conditions, validation aspects and applicable conclusions. A limited discussion is provided on relevant aspects of the need for developing bioanalytical procedures for speedy drug discovery and development. Additionally, some key elements such as internal standard selection, likely issues of mass detection, matrix effect, chiral aspects etc. are provided for consideration during method development.
NASA Technical Reports Server (NTRS)
Snow, L. Dale
1996-01-01
Dextroamphetamine has potential as a pharmacologic agent for the alleviation of two common health effects associated with microgravity. As an adjuvant to Space Motion Sickness (SMS) medication, dextroamphetamine can enhance treatment efficacy by reducing undesirable Central Nervous System (CNS) side effects of SMS medications. Secondly, dextroamphetamine may be useful for the prevention of symptoms of post-mission orthostatic intolerance caused by cardiovascular deconditioning during spaceflight. There is interest in developing an intranasal delivery form of dextroamphetamine for use as a countermeasure in microgravity conditions. Development of this dosage form will require an analytical detection method with sensitivity in the low ng range (1 to 100 ng/mL). During the 1995 Summer Faculty Fellowship Program, two analytical methods were developed and evaluated for their suitability as quantitative procedures for dextroamphetamine in studies of product stability, bioavailability assessment, and pharmacokinetic evaluation. In developing some of the analytical methods, beta-phenylethylamine, a primary amine structurally similar to dextroamphetamine, was used. The first analytical procedure to be evaluated involved hexane extraction and subsequent fluorescamine labeling of beta-phenylethylamine. The second analytical procedure to be evaluated involved quantitation of dextroamphetamine by an Enzyme-Linked ImmunoSorbent Assay (ELISA).
A New Project-Based Lab for Undergraduate Environmental and Analytical Chemistry
ERIC Educational Resources Information Center
Adami, Gianpiero
2006-01-01
A new project-based lab was developed for third year undergraduate chemistry students based on real world applications. The experience suggests that the total analytical procedure (TAP) project offers a stimulating alternative for delivering science skills and developing a greater interest in analytical chemistry and environmental sciences and…
Multidisciplinary design optimization using multiobjective formulation techniques
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Pagaldipti, Narayanan S.
1995-01-01
This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. Accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high speed aircraft. Results obtained show significant improvements in the aircraft aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
Development of Multiobjective Optimization Techniques for Sonic Boom Minimization
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.
1996-01-01
A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes equations solver. Aerodynamic design sensitivities for high speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions. The multidisciplinary design optimization procedures for high speed wing-body configurations simultaneously improve the aerodynamic, the sonic boom and the structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier-Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load-carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
MASTER ANALYTICAL SCHEME FOR ORGANIC COMPOUNDS IN WATER: PART 1. PROTOCOLS
A Master Analytical Scheme (MAS) has been developed for the analysis of volatile (gas chromatographable) organic compounds in water. In developing the MAS, it was necessary to evaluate and modify existing analysis procedures and develop new techniques to produce protocols that pr...
The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for t...
Comellas, L; Portillo, J L; Vaquero, M T
1993-12-24
A procedure for determining linear alkylbenzenesulphonates (LASs) in sewage sludge and amended soils has been developed. Extraction by sample treatment with 0.5 M potassium hydroxide in methanol and reflux was compared with a previously described extraction procedure in Soxhlet with methanol and solid sodium hydroxide in the sample. Repeatability results were similar with savings in extraction time, solvents and evaporation time. A clean-up method involving a C18 cartridge has been developed. Analytes were quantified by a reversed-phase HPLC method with UV and fluorescence detectors. Recoveries obtained were higher than 84%. The standing procedure was applied to high doses of sewage sludge-amended soils (15%) with increasing quantities of added LASs. Degradation data for a 116-day period are presented.
Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van
2018-04-01
In this work, an advanced analytical procedure was applied to calculate radioactivity in spiked water samples in close-geometry gamma spectroscopy. It included the MCNP-CP code in order to calculate the coincidence summing correction factor (CSF). The CSF results were validated against a deterministic method using the ETNA code for both p-type HPGe detectors, and good agreement was found between the two codes. Finally, the validity of the developed procedure was confirmed by a proficiency test to calculate the activities of various radionuclides. The radioactivity measurements with both detectors using the advanced analytical procedure received "Accepted" statuses in the proficiency test. Copyright © 2018 Elsevier Ltd. All rights reserved.
MASTER ANALYTICAL SCHEME FOR ORGANIC COMPOUNDS IN WATER. PART 2. APPENDICES TO PROTOCOLS
A Master Analytical Scheme (MAS) has been developed for the analysis of volatile (gas chromatographable) organic compounds in water. In developing the MAS, it was necessary to evaluate and modify existing analysis procedures and develop new techniques to produce protocols that pr...
Hubert, Ph; Nguyen-Huu, J-J; Boulanger, B; Chapuzet, E; Chiap, P; Cohen, N; Compagnon, P-A; Dewé, W; Feinberg, M; Lallier, M; Laurentie, M; Mercier, N; Muzard, G; Nivet, C; Valat, L
2004-11-15
This paper is the first part of a summary report of a new commission of the Société Française des Sciences et Techniques Pharmaceutiques (SFSTP). The main objective of this commission was the harmonization of approaches for the validation of quantitative analytical procedures. Indeed, the principle of validating these procedures is today widespread in all domains of activity where measurements are made. Nevertheless, the simple question of the acceptability or not of an analytical procedure for a given application remains incompletely resolved in several cases, despite the various regulations relating to good practices (GLP, GMP, ...) and other documents of normative character (ISO, ICH, FDA, ...). There are many official documents describing the criteria of validation to be tested, but they do not propose any experimental protocol and most often limit themselves to general concepts. For those reasons, two previous SFSTP commissions elaborated validation guides to concretely help the industrial scientists in charge of drug development to apply those regulatory recommendations. Although these two first guides contributed widely to the use and progress of analytical validation, they nevertheless present weaknesses regarding the conclusions of the performed statistical tests and the decisions to be made with respect to the acceptance limits defined by the use of an analytical procedure. The present paper proposes to revisit the very bases of analytical validation in order to develop a harmonized approach, notably by distinguishing the diagnosis rules from the decision rules. The decision rule is based on the use of the accuracy profile, uses the notion of total error, and simplifies the approach to validating an analytical procedure while controlling the risk associated with its usage. Thanks to this novel validation approach, it is possible to unambiguously demonstrate the fitness for purpose of a new method as stated in all regulatory documents.
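To make the accuracy-profile idea concrete, the following is a minimal numerical sketch: for each concentration level, the relative bias and an approximate beta-expectation tolerance interval on the total (relative) error are compared against acceptance limits. The data, the ±15% limits, and the simplified t-based interval (no separation of within- and between-run variance) are illustrative assumptions, not the SFSTP commission's exact computation.

```python
# Simplified accuracy-profile sketch (illustrative data and limits).
# For each concentration level: relative bias, RSD, and an approximate 95%
# tolerance interval on the relative error, compared against +/-15% limits.
import numpy as np
from scipy import stats

acceptance = 15.0  # +/- % acceptance limit (assumed for illustration)

# hypothetical back-calculated concentrations, 3 runs x 3 replicates per level
data = {
    10.0: [9.6, 10.2, 9.9, 10.4, 9.8, 10.1, 9.7, 10.3, 10.0],
    30.0: [29.1, 30.8, 30.2, 29.7, 30.5, 29.9, 30.1, 29.4, 30.6],
    50.0: [48.7, 50.9, 49.8, 50.4, 49.2, 50.6, 49.5, 50.1, 49.9],
}

for level, values in data.items():
    x = np.asarray(values)
    rel_err = 100.0 * (x - level) / level          # relative error, %
    bias = rel_err.mean()
    s = rel_err.std(ddof=1)
    k = stats.t.ppf(0.975, df=len(x) - 1)          # simplified coverage factor
    lo, hi = bias - k * s, bias + k * s
    ok = (lo > -acceptance) and (hi < acceptance)
    print(f"{level:5.1f}  bias={bias:+5.2f}%  RSD={s:4.2f}%  "
          f"tolerance=[{lo:+5.2f}%, {hi:+5.2f}%]  within +/-15%: {ok}")
```

If the tolerance interval stays inside the acceptance limits at every level of the working range, the procedure is declared fit for purpose at that risk level; this is the decision-rule logic the accuracy profile formalizes.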
Trends in Analytical Scale Separations.
ERIC Educational Resources Information Center
Jorgenson, James W.
1984-01-01
Discusses recent developments in the instrumentation and practice of analytical scale operations. Emphasizes detection devices and procedures in gas chromatography, liquid chromatography, electrophoresis, supercritical fluid chromatography, and field-flow fractionation. (JN)
Determining a carbohydrate profile for Hansenula polymorpha
NASA Technical Reports Server (NTRS)
Petersen, G. R.
1985-01-01
The determination of the levels of carbohydrates in the yeast Hansenula polymorpha required the development of new analytical procedures. Existing fractionation and analytical methods were adapted to deal with the problems involved with the lysis of whole cells. Using these new procedures, the complete carbohydrate profiles of H. polymorpha and selected mutant strains were determined and shown to correlate favourably with previously published results.
ERIC Educational Resources Information Center
Ember, Lois R.
1977-01-01
The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)
The Importance of Method Selection in Determining Product Integrity for Nutrition Research1234
Mudge, Elizabeth M; Brown, Paula N
2016-01-01
The American Herbal Products Association estimates that there are as many as 3000 plant species in commerce. The FDA estimates that there are about 85,000 dietary supplement products in the marketplace. The pace of product innovation far exceeds that of analytical methods development and validation, with new ingredients, matrixes, and combinations resulting in an analytical community that has been unable to keep up. This has led to a lack of validated analytical methods for dietary supplements and to inappropriate method selection where methods do exist. Only after rigorous validation procedures to ensure that methods are fit for purpose should they be used in a routine setting to verify product authenticity and quality. By following systematic procedures and establishing performance requirements for analytical methods before method development and validation, methods can be developed that are both valid and fit for purpose. This review summarizes advances in method selection, development, and validation regarding herbal supplement analysis and provides several documented examples of inappropriate method selection and application. PMID:26980823
The Importance of Method Selection in Determining Product Integrity for Nutrition Research.
Mudge, Elizabeth M; Betz, Joseph M; Brown, Paula N
2016-03-01
The American Herbal Products Association estimates that there are as many as 3000 plant species in commerce. The FDA estimates that there are about 85,000 dietary supplement products in the marketplace. The pace of product innovation far exceeds that of analytical methods development and validation, with new ingredients, matrixes, and combinations resulting in an analytical community that has been unable to keep up. This has led to a lack of validated analytical methods for dietary supplements and to inappropriate method selection where methods do exist. Only after rigorous validation procedures to ensure that methods are fit for purpose should they be used in a routine setting to verify product authenticity and quality. By following systematic procedures and establishing performance requirements for analytical methods before method development and validation, methods can be developed that are both valid and fit for purpose. This review summarizes advances in method selection, development, and validation regarding herbal supplement analysis and provides several documented examples of inappropriate method selection and application. © 2016 American Society for Nutrition.
Evaluation of management measures of software development. Volume 1: Analysis summary
NASA Technical Reports Server (NTRS)
Page, J.; Card, D.; Mcgarry, F.
1982-01-01
The conceptual model, the data classification scheme, and the analytic procedures are explained. The analytic results are summarized and specific software measures for collection and monitoring are recommended.
Momen, Awad A; Zachariadis, George A; Anthemidis, Aristidis N; Stratis, John A
2007-01-15
Two digestion procedures have been tested on nut samples for application in the determination of essential (Cr, Cu, Fe, Mg, Mn, Zn) and non-essential (Al, Ba, Cd, Pb) elements by inductively coupled plasma-optical emission spectrometry (ICP-OES). These included wet digestions with HNO(3)/H(2)SO(4) and HNO(3)/H(2)SO(4)/H(2)O(2); the latter is recommended for better analyte recoveries (relative error <11%). Two calibration procedures (aqueous standard and standard addition) were studied, and standard addition proved preferable for all analytes. Experimental designs for seven factors (HNO(3), H(2)SO(4) and H(2)O(2) volumes, digestion time, pre-digestion time, temperature of the hot plate and sample weight) were used for optimization of the sample digestion procedures. For this purpose a Plackett-Burman fractional factorial design, which involves eight experiments, was adopted. The HNO(3) and H(2)O(2) volumes and the digestion time were found to be the most important parameters. The instrumental conditions were also optimized (using a peanut matrix rather than aqueous standard solutions) considering radio-frequency (rf) incident power, nebulizer argon gas flow rate and sample uptake flow rate. The analytical performance, such as limits of detection (LOD < 0.74 μg g(-1)), precision of the overall procedures (relative standard deviation between 2.0 and 8.2%) and accuracy (relative errors between 0.4 and 11%), was assessed statistically to evaluate the developed analytical procedures. The good agreement between measured and certified values for all analytes (relative error <11%) with respect to IAEA-331 (spinach leaves) and IAEA-359 (cabbage) indicates that the developed analytical method is well suited for further studies on the fate of major elements in nuts and possibly similar matrices.
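As a sketch of the screening design mentioned above, the following builds an 8-run Plackett-Burman matrix for the seven digestion factors. The cyclic generator row used here (+ + + - + - -) is one commonly tabulated choice; the factor names are taken from the abstract, but the assignment of columns and the low/high settings are illustrative assumptions.

```python
# Sketch: 8-run Plackett-Burman screening design for the 7 digestion factors.
# One common cyclic generator is used; actual factor levels must come from
# the method under study.
import numpy as np

factors = ["HNO3 vol", "H2SO4 vol", "H2O2 vol", "digest time",
           "pre-digest time", "hot-plate temp", "sample weight"]

gen = np.array([1, 1, 1, -1, 1, -1, -1])        # cyclic generator, N = 8
rows = [np.roll(gen, i) for i in range(7)]       # 7 cyclic shifts of the generator
rows.append(-np.ones(7, dtype=int))              # final all-minus run
design = np.array(rows, dtype=int)               # 8 runs x 7 factors, orthogonal

print("run  " + "  ".join(f"{name[:12]:>12}" for name in factors))
for i, run in enumerate(design, start=1):
    print(f"{i:3d}  " + "  ".join(f"{'+' if v > 0 else '-':>12}" for v in run))

# The main effect of factor j is then estimated as
#   effect_j = mean(response at '+') - mean(response at '-'),
# which is how the dominant factors (acid/peroxide volumes, digestion time)
# would be flagged in a screening study of this kind.
```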
Sell, Bartosz; Sniegocki, Tomasz; Zmudzki, Jan; Posyniak, Andrzej
2018-04-01
Reported here is a new analytical multiclass method based on the QuEChERS technique, which has proven effective in diagnosing fatal poisoning cases in animals. This method has been developed for the determination of analytes in liver samples comprising rodenticides, carbamate and organophosphorus pesticides, coccidiostats and mycotoxins. The procedure entails the addition of acetonitrile and sodium acetate to 2 g of homogenized liver sample. The mixture was shaken intensively and centrifuged for phase separation, after which the organic phase was transferred into a tube containing sorbents (PSA and C18) and magnesium sulfate; it was then centrifuged, and the supernatant was filtered and analyzed by liquid chromatography tandem mass spectrometry. A validation of the procedure was performed. Repeatability variation coefficients <15% were achieved for most of the analyzed substances. The analytical conditions allowed successful separation of a variety of poisons, with a typical screening detection limit at ≤10 μg/kg levels. The method was used to investigate more than 100 animal poisoning incidents and proved useful in animal forensic toxicology cases.
Sajnóg, Adam; Hanć, Anetta; Koczorowski, Ryszard; Barałkiewicz, Danuta
2017-12-01
A new procedure for the determination of elements derived from titanium implants and of physiological elements in soft tissues by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) is presented. The analytical procedure involved the preparation of in-house matrix-matched solid standards with analyte addition, based on the certified reference material (CRM) MODAS-4 Cormorant Tissue. The addition of gelatin, serving as a binding agent, essentially improved the physical properties of the standards. Performance of the analytical method was assessed and validated by calculating parameters such as precision, detection limits, trueness and recovery of analyte addition using an additional CRM, ERM-BB184 Bovine Muscle. Analyte addition was additionally confirmed by microwave digestion of the solid standards and analysis by solution nebulization ICP-MS. The detection limits range from 1.8 μg g(-1) to 450 μg g(-1) for Mn and Ca, respectively. The precision values range from 7.3% to 42% for Al and Zn, respectively. The estimated recoveries of analyte addition lie within 83%-153% for Mn and Cu, respectively. Oral mucosa samples taken from patients treated with titanium dental implants were examined using the developed analytical method. Standards and tissue samples were cryocut into 30 µm thin sections. LA-ICP-MS yielded two-dimensional maps of the distribution of elements in the tested samples, which revealed a high content of Ti and Al derived from the implants. Photographs from an optical microscope displayed numerous µm-sized particles in the oral mucosa samples, which suggests that they are residues from the implantation procedure. Copyright © 2017 Elsevier B.V. All rights reserved.
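The validation figures quoted above (recovery of analyte addition, precision as %RSD, detection limit) follow simple arithmetic, sketched below with invented placeholder numbers; none of the values are from the study, and the 3-sigma detection-limit convention is an assumption for illustration.

```python
# Sketch of common validation figures of merit: recovery of an analyte
# addition, precision as %RSD, and a 3-sigma detection limit.
# All numbers below are invented placeholders, not data from the study.
import statistics

native = 12.4                        # analyte found in unspiked standard (ug/g)
added = 10.0                         # analyte addition (ug/g)
spiked = [21.9, 22.6, 21.3, 22.1]    # replicate results for the spiked standard

found_addition = statistics.mean(spiked) - native
recovery_pct = 100.0 * found_addition / added
rsd_pct = 100.0 * statistics.stdev(spiked) / statistics.mean(spiked)

blank_signal_sd = 35.0               # counts, SD of replicate blanks (assumed)
sensitivity = 120.0                  # counts per ug/g from calibration (assumed)
lod = 3 * blank_signal_sd / sensitivity

print(f"recovery of addition: {recovery_pct:.0f}%")
print(f"precision (RSD):      {rsd_pct:.1f}%")
print(f"detection limit:      {lod:.2f} ug/g")
```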
Study of a heat rejection system using capillary pumping
NASA Technical Reports Server (NTRS)
Neal, L. G.; Wanous, D. J.; Clausen, O. W.
1971-01-01
Results of an analytical study investigating the application of capillary pumping to the heat rejection loop of an advanced Rankine cycle power conversion system are presented. The feasibility of the concept of capillary pumping as an alternative to electromagnetic pumping is analytically demonstrated. Capillary pumping is shown to provide a potential for weight and electrical power savings and for reliability through the use of redundant systems. A screen wick pump design with arterial feed lines was analytically developed. Advantages of this design are high thermodynamic and hydrodynamic efficiency, which provide a lightweight, easily packaged system. Operational problems were identified which must be solved for successful application of capillary pumping. The most important are the development of start-up and shutdown procedures, of a means of keeping noncondensibles out of the system, and of earth-bound testing procedures.
Liquefaction Resistance Based on Shear Wave Velocity
DOT National Transportation Integrated Search
1999-01-01
This report reviews the current simplified procedures for evaluating the liquefaction resistance of granular soil deposits using small-strain shear wave velocity. These procedures were developed from analytical studies, laboratory studies, or very li...
Ferrario, J; Byrne, C; Dupuy, A E
1997-06-01
The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for their detection and quantitation. Most of these procedures are based on established sample preparation and analytical techniques employing high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS), which are used for the analyses of dioxin/furans at low parts-per-trillion (ppt) levels. A significant and widespread problem that arises when using these sample preparation procedures for the analysis of coplanar PCBs is the presence of background levels of these congeners. Industrial processes, urban incineration, leaking electrical transformers, hazardous waste accidents, and improper waste disposal practices have released appreciable quantities of PCBs into the environment. This contamination has resulted in the global distribution of these compounds via the atmosphere and their ubiquitous presence in ambient air. The background presence of these compounds in method blanks must be addressed when determining the exact concentrations of these and other congeners in environmental samples. In this study reliable procedures were developed to accurately define these background levels and assess their variability over the course of the study. The background subtraction procedures developed and employed increase the probability that the values reported accurately represent the concentrations found in the samples and were not biased due to this background contamination.
NASA Technical Reports Server (NTRS)
Ferrario, J.; Byrne, C.; Dupuy, A. E. Jr
1997-01-01
The addition of the "dioxin-like" polychlorinated biphenyl (PCB) congeners to the assessment of risk associated with the 2,3,7,8-chlorine substituted dioxins and furans has dramatically increased the number of laboratories worldwide that are developing analytical procedures for their detection and quantitation. Most of these procedures are based on established sample preparation and analytical techniques employing high resolution gas chromatography/high resolution mass spectrometry (HRGC/HRMS), which are used for the analyses of dioxin/furans at low parts-per-trillion (ppt) levels. A significant and widespread problem that arises when using these sample preparation procedures for the analysis of coplanar PCBs is the presence of background levels of these congeners. Industrial processes, urban incineration, leaking electrical transformers, hazardous waste accidents, and improper waste disposal practices have released appreciable quantities of PCBs into the environment. This contamination has resulted in the global distribution of these compounds via the atmosphere and their ubiquitous presence in ambient air. The background presence of these compounds in method blanks must be addressed when determining the exact concentrations of these and other congeners in environmental samples. In this study reliable procedures were developed to accurately define these background levels and assess their variability over the course of the study. The background subtraction procedures developed and employed increase the probability that the values reported accurately represent the concentrations found in the samples and were not biased due to this background contamination.
Goldstein, S J; Hensley, C A; Armenta, C E; Peters, R J
1997-03-01
Recent developments in extraction chromatography have simplified the separation of americium from complex matrices in preparation for alpha-spectroscopy relative to traditional methods. Here we present results of procedures developed/adapted for water, air, and bioassay samples with less than 1 g of inorganic residue. Prior analytical methods required the use of a complex, multistage procedure for separation of americium from these matrices. The newer, simplified procedure requires only a single 2 mL extraction chromatographic separation for isolation of Am and lanthanides from other components of the sample. This method has been implemented on an extensive variety of "real" environmental and bioassay samples from the Los Alamos area, and consistently reliable and accurate results with appropriate detection limits have been obtained. The new method increases analytical throughput by a factor of approximately 2 and decreases environmental hazards from acid and mixed-waste generation relative to the prior technique. Analytical accuracy, reproducibility, and reliability are also significantly improved over the more complex and laborious method used previously.
A New Analytic Framework for Moderation Analysis --- Moving Beyond Analytic Interactions
Tang, Wan; Yu, Qin; Crits-Christoph, Paul; Tu, Xin M.
2009-01-01
Conceptually, a moderator is a variable that modifies the effect of a predictor on a response. Analytically, a common approach as used in most moderation analyses is to add analytic interactions involving the predictor and moderator in the form of cross-variable products and test the significance of such terms. The narrow scope of such a procedure is inconsistent with the broader conceptual definition of moderation, leading to confusion in interpretation of study findings. In this paper, we develop a new approach to the analytic procedure that is consistent with the concept of moderation. The proposed framework defines moderation as a process that modifies an existing relationship between the predictor and the outcome, rather than simply a test of a predictor by moderator interaction. The approach is illustrated with data from a real study. PMID:20161453
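For reference, the conventional product-term test that this abstract describes (and argues is too narrow) amounts to an ordinary regression with a cross-variable product, as in the minimal sketch below. The data are simulated, and this illustrates only the standard interaction approach, not the broader moderation framework the authors propose.

```python
# Conventional product-term moderation test: regress the outcome on the
# predictor, the moderator, and their product, then inspect the interaction
# coefficient.  Simulated data; NOT the authors' proposed framework.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)                      # predictor
m = rng.normal(size=n)                      # moderator
y = 1.0 + 0.5 * x + 0.3 * m + 0.4 * x * m + rng.normal(scale=1.0, size=n)

X = np.column_stack([np.ones(n), x, m, x * m])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
cov = sigma2 * np.linalg.inv(X.T @ X)       # OLS coefficient covariance
se = np.sqrt(np.diag(cov))

for name, b, s in zip(["intercept", "x", "m", "x:m"], beta, se):
    print(f"{name:9s}  b = {b:+.3f}   t = {b / s:+.2f}")
```

A significant t-statistic on the `x:m` term is what a typical moderation analysis reports; the paper's point is that moderation, conceptually, is broader than this single product-term test.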
Cacho, Juan Ignacio; Campillo, Natalia; Viñas, Pilar; Hernández-Córdoba, Manuel
2016-01-01
A new procedure based on direct insert microvial thermal desorption injection allows the direct analysis of ionic liquid extracts by gas chromatography and mass spectrometry (GC-MS). For this purpose, an in situ ionic liquid dispersive liquid-liquid microextraction (in situ IL DLLME) has been developed for the quantification of bisphenol A (BPA), bisphenol Z (BPZ) and bisphenol F (BPF). Different parameters affecting the extraction efficiency of the microextraction technique and the thermal desorption step were studied. The optimized procedure, determining the analytes as acetyl derivatives, provided detection limits of 26, 18 and 19 ng L(-1) for BPA, BPZ and BPF, respectively. The release of the three analytes from plastic containers was monitored using this newly developed analytical method. Analysis of the migration test solutions for 15 different plastic containers in daily use identified the presence of the analytes at concentrations ranging between 0.07 and 37 μg L(-1) in six of the samples studied, BPA being the most commonly found and at higher concentrations than the other analytes.
Light aircraft crash safety program
NASA Technical Reports Server (NTRS)
Thomson, R. G.; Hayduk, R. J.
1974-01-01
NASA is embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
This research program was initiated with the objective of developing, codifying and testing a group of chemical analytical methods for measuring toxic compounds in the exhaust of distillate-fueled engines (i.e. diesel, gas turbine, Stirling, or Rankin cycle powerplants). It is a ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldstein, S.J.; Hensley, C.A.; Armenta, C.E.
1997-03-01
Recent developments in extraction chromatography have simplified the separation of americium from complex matrices in preparation for α-spectroscopy relative to traditional methods. Here we present results of procedures developed/adapted for water, air, and bioassay samples with less than 1 g of inorganic residue. Prior analytical methods required the use of a complex, multistage procedure for separation of americium from these matrices. The newer, simplified procedure requires only a single 2 mL extraction chromatographic separation for isolation of Am and lanthanides from other components of the sample. This method has been implemented on an extensive variety of 'real' environmental and bioassay samples from the Los Alamos area, and consistently reliable and accurate results with appropriate detection limits have been obtained. The new method increases analytical throughput by a factor of approximately 2 and decreases environmental hazards from acid and mixed-waste generation relative to the prior technique. Analytical accuracy, reproducibility, and reliability are also significantly improved over the more complex and laborious method used previously. 24 refs., 2 figs., 2 tabs.
Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek
2016-05-01
This study presents an application of the Hasse diagram technique (HDT) as the assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first one with metrological variables and the second one with "green" variables as input data. Both HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure concerning green analytical chemistry principles and analytical performance merits.
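The partial-order relation behind the Hasse diagram technique can be sketched in a few lines: one procedure is placed above another only if it is at least as good on every criterion and strictly better on at least one; otherwise the pair remains incomparable. The procedures, criteria, and scores below are invented (the study itself used 11 greenness/performance variables), and all criteria here are oriented so that lower is better.

```python
# Minimal sketch of the dominance relation used to build a Hasse diagram.
# Procedures, criteria and scores are invented; criteria are oriented so
# that LOWER values are always better.
procedures = {
    #               solvent_mL  waste_g  LOD_ng_g  time_min
    "GC-MS A":      (12.0,      15.0,    0.5,       90),
    "HPLC-FLD B":   (25.0,      30.0,    0.8,      120),
    "micro-ext C":  ( 1.5,       2.0,    1.2,       45),
}

def dominates(a, b):
    """True if a is no worse than b on every criterion and better on one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

names = list(procedures)
for p in names:
    for q in names:
        if p != q and dominates(procedures[p], procedures[q]):
            print(f"{p} dominates {q}")
# Pairs with no dominance either way stay incomparable -- those procedures
# sit on the same level of the Hasse diagram.
```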
Mechanic, Leah; Mendez, Armando; Merrill, Lori; Rogers, John; Layton, Marnie; Todd, Deborah; Varanasi, Arti; O’Brien, Barbara; Meyer, William A.; Zhang, Ming; Schleicher, Rosemary L.; Moye, Jack
2014-01-01
BACKGROUND: Preanalytical conditions encountered during collection, processing, and storage of biospecimens may influence laboratory results. The National Children’s Study (NCS) is a planned prospective cohort study of 100,000 families to examine the influence of a wide variety of exposures on child health. In developing biospecimen collection, processing, and storage procedures for the NCS, we identified several analytes of different biochemical categories for which it was unclear to what extent deviations from NCS procedures could influence measurement results. METHODS: A pilot study was performed to examine effects of preanalytic sample handling conditions (delays in centrifugation, freezing delays, delays in separation from cells, additive delay, and tube type) on concentrations of eight different analytes. 2,825 measurements were made to assess 15 unique combinations of analyte and handling conditions in blood collected from 151 women of childbearing age (≥20 individuals per handling condition). RESULTS: The majority of analytes were stable under the conditions evaluated. However, levels of plasma interleukin-6 and serum insulin were decreased in response to sample centrifugation delays of up to 5.5 hours post collection (P<0.0001). In addition, delays in freezing centrifuged plasma samples (comparing 24, 48 and 72 hours to immediate freezing) resulted in increased levels of adrenocorticotropic hormone (P=0.0014). CONCLUSIONS: Determining stability of proposed analytes in response to preanalytical conditions and handling helps to ensure high-quality specimens for study now and in the future. The results inform development of procedures, plans for measurement of analytes, and interpretation of laboratory results. PMID:23924524
ERIC Educational Resources Information Center
Golden, Mark
This report briefly describes the procedures for assessing children's psychological development and the data analytic framework used in the New York City Infant Day Care Study. This study is a 5-year, longitudinal investigation in which infants in group and family day care programs and infants reared at home are compared. Children in the study are…
PFOA and PFOS: Analytics | Science Inventory | US EPA
This presentation describes the drivers for the development of Method 537, the extraction and analytical procedure, performance data, and holding time data, as well as detection limits. The purpose of this presentation is to provide an overview of EPA drinking water Method 537 to the U.S. EPA Drinking Water Workshop participants.
NASTRAN analysis of the 1/8-scale space shuttle dynamic model
NASA Technical Reports Server (NTRS)
Bernstein, M.; Mason, P. W.; Zalesak, J.; Gregory, D. J.; Levy, A.
1973-01-01
The space shuttle configuration has more complex structural dynamic characteristics than previous launch vehicles, primarily because of the high modal density at low frequencies and the high degree of coupling between the lateral and longitudinal motions. An accurate analytical representation of these characteristics is a primary means for treating structural dynamics problems during the design phase of the shuttle program. The 1/8-scale model program was developed to explore the adequacy of available analytical modeling technology and to provide the means for investigating problems which are more readily treated experimentally. The basic objectives of the 1/8-scale model program are: (1) to provide early verification of analytical modeling procedures on a shuttle-like structure, (2) to demonstrate important vehicle dynamic characteristics of a typical shuttle design, (3) to disclose any previously unanticipated structural dynamic characteristics, and (4) to provide for development and demonstration of cost-effective prototype testing procedures.
METHOD 544. DETERMINATION OF MICROCYSTINS AND ...
Method 544 is an accurate and precise analytical method to determine six microcystins (including MC-LR) and nodularin in drinking water using solid phase extraction and liquid chromatography tandem mass spectrometry (SPE-LC/MS/MS). The advantage of this SPE-LC/MS/MS method is its sensitivity and ability to speciate the microcystins. This method development task establishes sample preservation techniques, sample concentration and analytical procedures, aqueous and extract holding time criteria, and quality control procedures. Draft Method 544 has undergone a multi-laboratory verification to ensure other laboratories can implement the method and achieve the quality control measures specified in the method. It is anticipated that Method 544 may be used in UCMR 4 to collect nationwide occurrence data for selected microcystins in drinking water. The purpose of this research project is to develop an accurate and precise analytical method to concentrate and determine selected MCs and nodularin in drinking water.
Nonequilibrium chemistry boundary layer integral matrix procedure
NASA Technical Reports Server (NTRS)
Tong, H.; Buckingham, A. C.; Morse, H. L.
1973-01-01
The development of an analytic procedure for the calculation of nonequilibrium boundary layer flows over surfaces of arbitrary catalycities is described. An existing equilibrium boundary layer integral matrix code was extended to include nonequilibrium chemistry while retaining all of the general boundary condition features built into the original code. For particular application to the pitch-plane of shuttle type vehicles, an approximate procedure was developed to estimate the nonequilibrium and nonisentropic state at the edge of the boundary layer.
Cegłowski, Michał; Kurczewska, Joanna; Smoluch, Marek; Reszke, Edward; Silberring, Jerzy; Schroeder, Grzegorz
2015-09-07
In this paper, a procedure for the preconcentration and transport of mixtures of acids, bases, and drug components to a mass spectrometer using magnetic scavengers is presented. Flowing atmospheric pressure afterglow mass spectrometry (FAPA-MS) was used as an analytical method for identification of the compounds by thermal desorption from the scavengers. The proposed procedure is fast and cheap, and does not involve time-consuming purification steps. The developed methodology can be applied for trapping harmful substances in minute quantities, to transport them to specialized, remotely located laboratories.
New test techniques and analytical procedures for understanding the behavior of advanced propellers
NASA Technical Reports Server (NTRS)
Stefko, G. L.; Bober, L. J.; Neumann, H. E.
1983-01-01
Analytical procedures and experimental techniques were developed to improve the capability to design advanced high speed propellers. Some results from the propeller lifting line and lifting surface aerodynamic analysis codes are compared with propeller force data, probe data and laser velocimeter data. In general, the code comparisons with data indicate good qualitative agreement. A rotating propeller force balance demonstrated good accuracy and reduced test time by 50 percent. Results from three propeller flow visualization techniques are shown which illustrate some of the physical phenomena occurring on these propellers.
Capacity improvement analytical tools and benchmark development for terminal operations
DOT National Transportation Integrated Search
2009-10-01
With U.S. air traffic predicted to triple over the : next fifteen years, new technologies and procedures are : being considered to cope with this growth. As such, it : may be of use to quickly and easily evaluate any new : technologies or procedures ...
Computer simulation of gear tooth manufacturing processes
NASA Technical Reports Server (NTRS)
Mavriplis, Dimitri; Huston, Ronald L.
1990-01-01
The use of computer graphics to simulate gear tooth manufacturing procedures is discussed. An analytical basis for the simulation is established for spur gears. The simulation itself, however, is developed not only for spur gears, but for straight bevel gears as well. The applications of the developed procedure extend from the development of finite element models of heretofore intractable geometrical forms, to exploring the fabrication of nonstandard tooth forms.
ERIC Educational Resources Information Center
Gao, Ruomei
2015-01-01
In a typical chemistry instrumentation laboratory, students learn analytical techniques through a well-developed procedure. Such an approach, however, does not engage students in a creative endeavor. To foster the intrinsic motivation of students' desire to learn, improve their confidence in self-directed learning activities and enhance their…
FINANCIAL ANALYSIS OF CURRENT OPERATIONS OF COLLEGES AND UNIVERSITIES.
ERIC Educational Resources Information Center
SWANSON, JOHN E.; AND OTHERS
TECHNIQUES FOR DEVELOPING FINANCIAL AND RELATED COST-EFFECTIVENESS DATA FOR PUBLIC AND PRIVATELY SUPPORTED AMERICAN COLLEGES AND UNIVERSITIES WERE STUDIED TO FORMULATE PRINCIPLES, PROCEDURES, AND STANDARDS FOR THE ACCUMULATION AND ANALYSES OF CURRENT OPERATING COSTS. AFTER SEPARATE ANALYSES OF INSTITUTIONAL PROCEDURES AND REPORTS, ANALYTIC UNITS…
The National Shipbuilding Research Program. Environmental Studies and Testing (Phase V)
2000-11-20
development of an analytical procedure for toxic organic compounds, including TBT (tributyltin), whose turnaround time would be in the order of minutes...Cost of the Subtask was $20,000. Subtask #33 - Turnaround Analytical Method for TBT This Subtask performed a preliminary investigation leading to the..."Quick TBT Analytical Method" that will yield reliable results in 15 minutes, a veritable breakthrough in sampling technology. The Subtask was managed by
Recent developments in computer vision-based analytical chemistry: A tutorial review.
Capitán-Vallvey, Luis Fermín; López-Ruiz, Nuria; Martínez-Olmos, Antonio; Erenas, Miguel M; Palma, Alberto J
2015-10-29
Chemical analysis based on colour changes recorded with imaging devices is gaining increasing interest. This is due to several significant advantages, such as simplicity of use and the fact that it is easily combinable with portable and widely distributed imaging devices, resulting in friendly analytical procedures in many areas that demand out-of-lab applications for in situ and real-time monitoring. This tutorial review covers computer vision-based analytical chemistry (CVAC) procedures and systems from 2005 to 2015, a period of time when 87.5% of the papers on this topic were published. The background regarding colour spaces and recent analytical system architectures of interest in analytical chemistry is presented in the form of a tutorial. Moreover, issues regarding images, such as the influence of illuminants, and the most relevant techniques for processing and analysing digital images are addressed. Some of the most relevant applications are then detailed, highlighting their main characteristics. Finally, our opinion about future perspectives is discussed. Copyright © 2015 Elsevier B.V. All rights reserved.
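The basic computer-vision analytical step described in such reviews can be sketched generically: average the RGB values of the sensing region and convert them to an alternative colour space so that a single coordinate (e.g. hue) serves as the analytical signal. The pixel values below are invented, and the choice of hue as the signal is only one common option; only the Python standard library is used.

```python
# Generic CVAC sketch: average the RGB pixels of the region of interest and
# convert to HSV so that hue (or another channel) can serve as the analytical
# signal for a calibration curve.  Pixel values are invented; in practice they
# come from a camera, scanner, or phone image of the sensing element.
import colorsys

roi_pixels = [            # (R, G, B) values 0-255 for the region of interest
    (201, 154, 66),
    (198, 150, 70),
    (205, 158, 63),
    (199, 152, 68),
]

n = len(roi_pixels)
r = sum(p[0] for p in roi_pixels) / (255.0 * n)
g = sum(p[1] for p in roi_pixels) / (255.0 * n)
b = sum(p[2] for p in roi_pixels) / (255.0 * n)

h, s, v = colorsys.rgb_to_hsv(r, g, b)
print(f"mean RGB = ({r:.3f}, {g:.3f}, {b:.3f})")
print(f"hue = {h * 360:.1f} deg, saturation = {s:.2f}, value = {v:.2f}")
# The hue (or a channel ratio such as G/R) is then regressed against standard
# concentrations to build the calibration model.
```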
Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew
2015-01-01
Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, and data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012
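As a conceptual illustration of the map/reduce pattern such a framework parallelizes, the sketch below computes per-grid-cell means on a single machine. The observation records, the 1-degree grid-cell key, and the temperature variable are invented stand-ins; the actual system distributes this work over HBase and Hadoop MapReduce across cluster nodes.

```python
# Conceptual single-machine sketch of the map/reduce pattern: map each record
# to a (grid-cell, value) pair, then reduce by key to a per-cell mean.
# Records are invented; a real deployment shards this across a cluster.
from collections import defaultdict

records = [                     # (lat, lon, temperature) observations
    (34.2, -118.1, 21.5),
    (34.7, -118.9, 22.1),
    (40.1, -74.3, 15.4),
    (40.6, -74.8, 14.9),
]

def map_phase(rec):
    lat, lon, temp = rec
    cell = (int(lat), int(lon))          # 1-degree grid cell as the key
    return cell, temp

def reduce_phase(pairs):
    sums, counts = defaultdict(float), defaultdict(int)
    for cell, temp in pairs:
        sums[cell] += temp
        counts[cell] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

cell_means = reduce_phase(map(map_phase, records))
for cell, mean in sorted(cell_means.items()):
    print(f"cell {cell}: mean = {mean:.2f} degC")
```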
A new method for constructing analytic elements for groundwater flow.
NASA Astrophysics Data System (ADS)
Strack, O. D.
2007-12-01
The analytic element method is based upon the superposition of analytic functions that are defined throughout the infinite domain and can be used to meet a variety of boundary conditions. Analytic elements have been used successfully for a number of problems, mainly dealing with the Poisson equation (see, e.g., Theory and Applications of the Analytic Element Method, Reviews of Geophysics, 41(2), 1005, 2003, by O.D.L. Strack). The majority of these analytic elements consist of functions that exhibit jumps along lines or curves. Such linear analytic elements have also been developed for other partial differential equations, e.g., the modified Helmholtz equation and the heat equation, and were constructed by integrating elementary solutions, the point sink and the point doublet, along a line. This approach is limiting for two reasons. First, the existence of the elementary solutions is required, and, second, the integration tends to limit the range of solutions that can be obtained. We present a procedure for generating analytic elements that requires merely the existence of a harmonic function with the desired properties; such functions exist in abundance. The procedure to be presented is used to generalize this harmonic function in such a way that the resulting expression satisfies the applicable differential equation. The approach will be applied, along with numerical examples, to the modified Helmholtz equation and to the heat equation, while it is noted that the method is in no way restricted to these equations. The procedure is carried out entirely in terms of complex variables, using Wirtinger calculus.
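For readers unfamiliar with the notation, the following is standard background in the spirit of the abstract, not the paper's specific construction: the Wirtinger operators, the complex form of the Laplacian, and the superposition that underlies the analytic element method.

```latex
% Standard background (not the paper's specific elements): Wirtinger operators
% and the Laplacian in complex form,
\[
  \frac{\partial}{\partial z} = \frac12\Bigl(\frac{\partial}{\partial x}
      - i\,\frac{\partial}{\partial y}\Bigr), \qquad
  \frac{\partial}{\partial \bar z} = \frac12\Bigl(\frac{\partial}{\partial x}
      + i\,\frac{\partial}{\partial y}\Bigr), \qquad
  \nabla^2 = 4\,\frac{\partial^2}{\partial z\,\partial \bar z},
\]
% so a harmonic function $F$ satisfies $\partial^2 F/\partial z\,\partial\bar z = 0$.
% An analytic-element solution is a superposition of elements with free
% strengths $c_j$,
\[
  \Phi = \sum_j c_j\,\Omega_j(z,\bar z),
\]
% where each $\Omega_j$ satisfies the governing equation (for the modified
% Helmholtz case, $\nabla^2 \Phi = \Phi/\lambda^2$ with $\lambda$ a
% characteristic length) and the $c_j$ are chosen to meet the boundary
% conditions.
```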
Prevalidation in pharmaceutical analysis. Part I. Fundamentals and critical discussion.
Grdinić, Vladimir; Vuković, Jadranka
2004-05-28
A complete prevalidation, as a basic prevalidation strategy for quality control and standardization of analytical procedures, was introduced. A fast and simple prevalidation methodology based on mathematical/statistical evaluation of a reduced number of experiments (N < or = 24) was elaborated, and guidelines as well as algorithms were given in detail. This strategy has been produced for pharmaceutical applications and is dedicated to the preliminary evaluation of analytical methods for which a linear calibration model, which very often occurs in practice, could be the most appropriate to fit experimental data. The requirements presented in this paper should therefore help the analyst to design and perform the minimum number of prevalidation experiments needed to obtain all the information required to evaluate and demonstrate the reliability of the analytical procedure. The complete prevalidation process included characterization of analytical groups, checking of two limiting groups, testing of data homogeneity, establishment of analytical functions, recognition of outliers, evaluation of limiting values, and extraction of prevalidation parameters. Moreover, a system of diagnosis for each particular prevalidation step was suggested. As an illustrative example demonstrating the feasibility of the prevalidation methodology, a Vis-spectrophotometric procedure for the determination of tannins with Folin-Ciocalteu's phenol reagent was selected from a great number of analytical procedures. The favourable metrological characteristics of this analytical procedure, obtained as prevalidation figures of merit, confirmed prevalidation as a valuable concept in the preliminary evaluation of the quality of analytical procedures.
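As a rough illustration of the kind of reduced-experiment evaluation described here, the sketch below fits a linear calibration, flags outliers from the residuals, and derives LOD/LOQ. The 3σ/10σ rules and all numbers are common textbook choices, not the specific diagnostics of the cited prevalidation protocol.

```python
# Minimal sketch of a reduced-experiment evaluation of a linear calibration:
# least-squares fit, residual-based outlier flag, and LOD/LOQ from the residual
# standard deviation (textbook 3*sigma / 10*sigma rules, invented data).
import numpy as np

conc = np.array([2, 4, 6, 8, 10, 12], dtype=float)      # hypothetical standards
resp = np.array([0.11, 0.21, 0.30, 0.41, 0.52, 0.60])   # hypothetical responses

slope, intercept = np.polyfit(conc, resp, 1)
residuals = resp - (slope * conc + intercept)
s_res = residuals.std(ddof=2)                            # residual standard deviation

outliers = np.abs(residuals) > 3 * s_res                 # crude outlier flag
lod = 3 * s_res / slope                                  # limit of detection
loq = 10 * s_res / slope                                 # limit of quantification

print(f"slope={slope:.4f}, intercept={intercept:.4f}")
print(f"LOD={lod:.2f}, LOQ={loq:.2f}, outliers at points {np.where(outliers)[0]}")
```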
PARTNERING TO IMPROVE HUMAN EXPOSURE METHODS
Methods development research is an application-driven scientific area that addresses programmatic needs. The goals are to reduce measurement uncertainties, address data gaps, and improve existing analytical procedures for estimating human exposures. Partnerships have been develop...
Sajnóg, Adam; Hanć, Anetta; Barałkiewicz, Danuta
2018-05-15
Analysis of clinical specimens by imaging techniques makes it possible to determine the content and distribution of trace elements on the surface of the examined sample. In order to obtain reliable results, the developed procedure should be based not only on properly prepared samples and properly performed calibration. It is also necessary to carry out all phases of the procedure in accordance with the principles of chemical metrology, whose main pillars are the use of validated analytical methods, establishing the traceability of the measurement results, and the estimation of uncertainty. This review paper discusses aspects related to sampling, preparation and analysis of clinical samples by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) with emphasis on metrological aspects, i.e. selected validation parameters of the analytical method, the traceability of the measurement result, and the uncertainty of the result. This work promotes the introduction of metrology principles for chemical measurement, with emphasis on LA-ICP-MS, which is a comparative method that requires a studious approach to the development of the analytical procedure in order to acquire reliable quantitative results. Copyright © 2018 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, J; Koester, C
The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method for analysis of aldicarb, bromadiolone, carbofuran, oxamyl, and methomyl in water by high performance liquid chromatography tandem mass spectrometry (HPLC-MS/MS), titled Method EPA MS666. This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to validate and verify the analytical procedures described in MS666 for analysis of carbamate pesticides in aqueous samples. The gathered data from this validation study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of Method EPA MS666 can be determined.
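One common way to turn replicate spiked-sample recoveries into quality-control acceptance criteria (the report's stated goal 2) is a mean-recovery window of plus or minus three standard deviations. The sketch below illustrates that calculation with invented recovery values, not data from the EPA/LLNL study.

```python
# Derive QC acceptance criteria from replicate spiked-sample recoveries
# (mean recovery +/- 3 standard deviations). Recovery values are invented
# for illustration and are not data from the EPA/LLNL validation study.
import statistics

recoveries = [98.2, 101.5, 95.7, 103.1, 99.4, 97.8, 100.9, 96.5]  # percent recovery

mean_rec = statistics.mean(recoveries)
sd_rec = statistics.stdev(recoveries)

lower, upper = mean_rec - 3 * sd_rec, mean_rec + 3 * sd_rec
print(f"mean recovery = {mean_rec:.1f}%  (RSD = {100*sd_rec/mean_rec:.1f}%)")
print(f"acceptance window = {lower:.1f}% to {upper:.1f}%")
```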
Analysis of Ethanolamines: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS888
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, J; Vu, A; Koester, C
The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method titled 'Analysis of Diethanolamine, Triethanolamine, n-Methyldiethanolamine, and n-Ethyldiethanolamine in Water by Single Reaction Monitoring Liquid Chromatography/Tandem Mass Spectrometry (LC/MS/MS): EPA Method MS888'. This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to validate and verify the analytical procedures described in 'EPA Method MS888' for analysis of the listed ethanolamines in aqueous samples. The gathered data from this validation study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of 'EPA Method MS888' can be determined.
Analysis of Thiodiglycol: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS777
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, J; Koester, C
The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method for the analysis of thiodiglycol, the breakdown product of the sulfur mustard HD, in water by high performance liquid chromatography tandem mass spectrometry (HPLC-MS/MS), titled Method EPA MS777 (hereafter referred to as EPA CRL SOP MS777). This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to verify the analytical procedures described in MS777 for analysis of thiodiglycol in aqueous samples. The gathered data from this study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of Method EPA MS777 can be determined.
Veillet, Sébastien; Tomao, Valérie; Ruiz, Karine; Chemat, Farid
2010-07-26
In the past 10 years, trends in analytical chemistry have turned toward green chemistry, which endeavours to develop new techniques that reduce the influence of chemicals on the environment. The challenge of green analytical chemistry is to develop techniques that meet the demand for information output while reducing the environmental impact of the analyses. For this purpose, petroleum-based solvents have to be avoided. Therefore, increasing interest has been given to new green solvents such as limonene and to their potential as alternative solvents in analytical chemistry. In this work, limonene was used instead of toluene in the Dean-Stark procedure. Moisture determination on a wide range of food matrices was performed using either toluene or limonene. Both solvents gave similar water percentages in food materials, i.e. 89.3+/-0.5 and 89.5+/-0.7 for carrot, 68.0+/-0.7 and 68.6+/-1.9 for garlic, and 64.1+/-0.5 and 64.0+/-0.3 for minced meat with toluene and limonene, respectively. Consequently, limonene could be used as a good alternative solvent in the Dean-Stark procedure. Copyright 2010 Elsevier B.V. All rights reserved.
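The solvent equivalence suggested by the quoted means can be checked with a simple paired comparison. The sketch below uses only the three matrix means reported in the abstract and ignores the stated standard deviations, so it is purely illustrative.

```python
# Quick paired comparison of the per-matrix moisture means quoted in the
# abstract (toluene vs limonene). Only the three means are used and the
# reported standard deviations are ignored, so this is illustrative only.
from scipy.stats import ttest_rel

toluene  = [89.3, 68.0, 64.1]   # carrot, garlic, minced meat (% water)
limonene = [89.5, 68.6, 64.0]

t_stat, p_value = ttest_rel(toluene, limonene)
print(f"t = {t_stat:.2f}, p = {p_value:.2f}")   # large p -> no evidence of a solvent effect
```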
Recent developments in nickel electrode analysis
NASA Technical Reports Server (NTRS)
Whiteley, Richard V.; Daman, M. E.; Kaiser, E. Q.
1991-01-01
Three aspects of nickel electrode analysis for Nickel-Hydrogen and Nickel-Cadmium battery cell applications are addressed: (1) the determination of active material; (2) charged state nickel (as NiOOH + CoOOH); and (3) potassium ion content in the electrode. Four deloading procedures are compared for completeness of active material removal, and deloading conditions for efficient active material analyses are established. Two methods for charged state nickel analysis are compared: the current NASA procedure and a new procedure based on the oxidation of sodium oxalate by the charged material. Finally, a method for determining potassium content in an electrode sample by flame photometry is presented along with analytical results illustrating differences in potassium levels from vendor to vendor and the effects of stress testing on potassium content in the electrode. The relevance of these analytical procedures to electrode performance is reviewed.
Life cycle management of analytical methods.
Parr, Maria Kristina; Schmidt, Alexander H
2018-01-05
In modern process management, the life cycle concept is gaining more and more importance. It focusses on the total costs of the process, from investment to operation and finally retirement. In recent years, increasing interest in this concept has also emerged for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification and method transfer) and finally retirement of the method. Regulatory bodies, too, appear to have increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, are discussing the introduction of new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉 "The Analytical Procedure Lifecycle" for integration into the USP. Furthermore, a growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design based method development results in increased method robustness. This decreases the effort needed for method performance verification and post-approval changes, and minimizes the risk of method-related out-of-specification results, which strongly contributes to reduced costs of the method during its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.
Neutron radiative capture methods for surface elemental analysis
Trombka, J.I.; Senftle, F.; Schmadebeck, R.
1970-01-01
Both an accelerator and a 252Cf neutron source have been used to induce characteristic gamma radiation from extended soil samples. To demonstrate the method, measurements of the neutron-induced radiative capture and activation gamma rays have been made with both Ge(Li) and NaI(Tl) detectors. Because of the possible application to space flight geochemical analysis, it is believed that NaI(Tl) detectors must be used. Analytical procedures have been developed to obtain both qualitative and semiquantitative results from an interpretation of the measured NaI(Tl) pulse-height spectrum. Experimental results and the analytic procedure are presented. © 1970.
Estimating and testing mediation and moderation in within-subject designs.
Judd, C M; Kenny, D A; McClelland, G H
2001-06-01
Analyses designed to detect mediation and moderation of treatment effects are increasingly prevalent in research in psychology. The mediation question concerns the processes that produce a treatment effect. The moderation question concerns factors that affect the magnitude of that effect. Although analytic procedures have been reasonably well worked out in the case in which the treatment varies between participants, no systematic procedures for examining mediation and moderation have been developed in the case in which the treatment varies within participants. The authors present an analytic approach to these issues using ordinary least squares estimation.
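A common formulation of this approach expresses both the outcome and the mediator as within-subject difference scores and fits an ordinary least squares regression of the outcome difference on the mediator difference and the centred mediator sum. The sketch below illustrates that formulation on simulated data; it is not the authors' worked example.

```python
# A common formulation of the difference-score regression associated with this
# approach: regress the within-subject outcome difference on the mediator
# difference and the centred mediator sum. Data are simulated for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 60
m1 = rng.normal(0, 1, n)             # mediator under condition 1
m2 = m1 + 0.8 + rng.normal(0, 1, n)  # mediator under condition 2 (treatment raises it)
y1 = 0.5 * m1 + rng.normal(0, 1, n)
y2 = 0.5 * m2 + 0.3 + rng.normal(0, 1, n)

y_diff = y2 - y1
m_diff = m2 - m1
m_sum_c = (m1 + m2) - (m1 + m2).mean()   # centred sum of the mediator

X = sm.add_constant(np.column_stack([m_diff, m_sum_c]))
fit = sm.OLS(y_diff, X).fit()
print(fit.params)
# const   -> part of the treatment effect not transmitted through the mediator
# x1      -> mediator-difference coefficient (evidence of mediation)
# x2      -> centred-sum coefficient (moderation by mediator level)
```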
Matrix-enhanced secondary ion mass spectrometry: The Alchemist's solution?
NASA Astrophysics Data System (ADS)
Delcorte, Arnaud
2006-07-01
Because of the requirements of large molecule characterization and high-lateral resolution SIMS imaging, the possibility of improving molecular ion yields by the use of specific sample preparation procedures has recently generated a renewed interest in the static SIMS community. In comparison with polyatomic projectiles, however, signal enhancement by a matrix might appear to some as the alchemist's versus the scientist's solution to the current problems of organic SIMS. In this contribution, I would like to discuss critically the pros and cons of matrix-enhanced SIMS procedures, in the new framework that includes polyatomic ion bombardment. This discussion is based on a short review of the experimental and theoretical developments achieved in the last decade with respect to the three following approaches: (i) blending the analyte with a low-molecular weight organic matrix (MALDI-type preparation procedure); (ii) mixing alkali/noble metal salts with the analyte; (iii) evaporating a noble metal layer on the analyte sample surface (organic molecules, polymers).
Armbruster, David A; Overcash, David R; Reyes, Jaime
2014-01-01
The era of automation arrived with the introduction of the AutoAnalyzer using continuous flow analysis and the Robot Chemist that automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with a group of analysers devoted to performing routine clinical chemistry tests and another group dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA) through which all modules involved are physically linked by some kind of track system, moving samples through the process from beginning to end. A newer and very powerful analytical methodology is liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS). LC-MS/MS has been automated but a future automation challenge will be to incorporate LC-MS/MS into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to a laboratory information system (LIS) and/or hospital information system (HIS). This software includes control of the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology. It is a given that automation will continue to evolve in the clinical laboratory, limited only by the imagination and ingenuity of laboratory scientists. PMID:25336760
Design and Analysis of a Preconcentrator for the ChemLab
DOE Office of Scientific and Technical Information (OSTI.GOV)
WONG,CHUNGNIN C.; FLEMMING,JEB H.; MANGINELL,RONALD P.
2000-07-17
Preconcentration is a critical analytical procedure when designing a microsystem for trace chemical detection, because it can purify a sample mixture and boost the small analyte concentration to a much higher level allowing a better analysis. This paper describes the development of a micro-fabricated planar preconcentrator for the µChemLab™ at Sandia. To guide the design, an analytical model to predict the analyte transport, adsorption and resorption process in the preconcentrator has been developed. Experiments have also been conducted to analyze the adsorption and resorption process and to validate the model. This combined effort of modeling, simulation, and testing has led us to build a reliable, efficient preconcentrator with good performance.
NASA Technical Reports Server (NTRS)
Ehlers, F. E.; Sebastian, J. D.; Weatherill, W. H.
1979-01-01
Analytical and empirical studies of a finite difference method for the solution of the transonic flow about harmonically oscillating wings and airfoils are presented. The procedure is based on separating the velocity potential into steady and unsteady parts and linearizing the resulting unsteady equations for small disturbances. Since sinusoidal motion is assumed, the unsteady equation is independent of time. Three finite difference investigations are discussed including a new operator for mesh points with supersonic flow, the effects on relaxation solution convergence of adding a viscosity term to the original differential equation, and an alternate and relatively simple downstream boundary condition. A method is developed which uses a finite difference procedure over a limited inner region and an approximate analytical procedure for the remaining outer region. Two investigations concerned with three-dimensional flow are presented. The first is the development of an oblique coordinate system for swept and tapered wings. The second derives the additional terms required to make row relaxation solutions converge when mixed flow is present. A finite span flutter analysis procedure is described using the two-dimensional unsteady transonic program with a full three-dimensional steady velocity potential.
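In generic notation (not the report's own symbols), the separation described above can be recorded as the standard small-disturbance ansatz sketched below.

```latex
% Generic small-disturbance separation (illustrative notation, not the
% report's own symbols): the velocity potential is split into a steady part
% and a small harmonic unsteady part,
\Phi(\mathbf{x},t) \;=\; \phi_0(\mathbf{x})
  \;+\; \varepsilon\,\operatorname{Re}\!\bigl[\phi_1(\mathbf{x})\,e^{i\omega t}\bigr],
  \qquad \varepsilon \ll 1 .
% Substituting into the transonic potential equation and keeping first-order
% terms in \varepsilon gives a linear equation for \phi_1 whose coefficients
% depend on \phi_0; because the motion is sinusoidal, the factor e^{i\omega t}
% cancels and the unsteady equation is independent of time.
```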
Digital forensics: an analytical crime scene procedure model (ACSPM).
Bulbul, Halil Ibrahim; Yavuzcan, H Guclu; Ozel, Mesut
2013-12-10
In order to ensure that digital evidence is collected, preserved, examined, or transferred in a manner safeguarding the accuracy and reliability of the evidence, law enforcement and digital forensic units must establish and maintain an effective quality assurance system. The very first part of this system is standard operating procedures (SOPs) and/or models conforming to chain-of-custody requirements, which rely on the digital forensics "process-phase-procedure-task-subtask" sequence. An acceptable and thorough digital forensics (DF) process depends on the sequential DF phases, each phase depends on sequential DF procedures, and each procedure in turn depends on tasks and subtasks. There are numerous DF process models that define DF phases in the literature, but no DF model that defines the phase-based sequential procedures for the crime scene has been identified. The analytical crime scene procedure model (ACSPM) that we suggest in this paper is intended to fill this gap. The proposed analytical procedure model for digital investigations at a crime scene is developed and defined for crime scene practitioners, with the main focus on crime scene digital forensic procedures rather than on the whole digital investigation process and phases that end up in a court. When reviewing the relevant literature and consulting with law enforcement agencies, only device-based charts specific to a particular device and/or more general approaches to digital evidence management models from crime scene to court were found. After analyzing the needs of law enforcement organizations and realizing the absence of a crime scene digital investigation procedure model for crime scene activities, we decided to inspect the relevant literature in an analytical way. The outcome of this inspection is the model suggested here, which is intended to provide guidance for thorough and secure implementation of digital forensic procedures at a crime scene. In digital forensic investigations each case is unique and needs special examination; it is not possible to cover every aspect of crime scene digital forensics, but the proposed procedure model is intended to be a general guideline for practitioners. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Laboratory Analytical Procedures | Bioenergy | NREL
analytical procedures (LAPs) to provide validated methods for biofuels and pyrolysis bio-oils research. Biomass Compositional Analysis: these lab procedures provide tested and accepted methods for performing
Green Chemistry Metrics with Special Reference to Green Analytical Chemistry.
Tobiszewski, Marek; Marć, Mariusz; Gałuszka, Agnieszka; Namieśnik, Jacek
2015-06-12
The concept of green chemistry is widely recognized in chemical laboratories. To properly measure the environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluation of the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale, are described. Both the well-established and the recently developed green analytical chemistry metrics, including NEMI labeling and the analytical Eco-Scale, are presented. Additionally, this paper focuses on the possible use of multivariate statistics in evaluating the environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed.
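The analytical Eco-Scale mentioned above scores a procedure as 100 minus accumulated penalty points. The sketch below illustrates the arithmetic with invented penalty values; the >75 "excellent" and >50 "acceptable" thresholds are the commonly quoted ones, but real assessments should use the original Eco-Scale tables.

```python
# Minimal sketch of the analytical Eco-Scale idea: score = 100 minus the sum
# of penalty points for reagents, energy, occupational hazard and waste.
# Penalty values are illustrative, not taken from the original tables.
penalties = {
    "reagents (amount x hazard)": 8,
    "instrument energy use": 2,
    "occupational hazard": 0,
    "waste (amount, no treatment)": 5,
}

score = 100 - sum(penalties.values())
rating = "excellent" if score > 75 else "acceptable" if score > 50 else "inadequate"
print(f"Eco-Scale score = {score} ({rating} green analysis)")
```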
2012-01-01
Background: Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, actually several other saccharide materials can be encountered in paint samples, not only as major binders, but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples, however the chromatographic profiles are often extremely different and it is impossible to compare them and reliably identify the paint binder. Results: This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works-of-art. The research presented here evaluates the influence of the analytical procedure used, and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. Conclusions: The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from hundreds of centuries old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database presents data also from those materials that only contain a minor saccharide fraction. This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of identifying polysaccharide binders in paint samples. PMID:23050842
Lluveras-Tenorio, Anna; Mazurek, Joy; Restivo, Annalaura; Colombini, Maria Perla; Bonaduce, Ilaria
2012-10-10
Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, actually several other saccharide materials can be encountered in paint samples, not only as major binders, but also as additives. In the literature, there are a variety of analytical procedures that utilize GC-MS to characterize saccharide materials in paint samples, however the chromatographic profiles are often extremely different and it is impossible to compare them and reliably identify the paint binder. This paper presents a comparison between two different analytical procedures based on GC-MS for the analysis of saccharide materials in works-of-art. The research presented here evaluates the influence of the analytical procedure used, and how it impacts the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from hundreds of centuries old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database presents data also from those materials that only contain a minor saccharide fraction. This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of identifying polysaccharide binders in paint samples.
STANDARD OPERATING PROCEDURE FOR QUALITY ASSURANCE IN ANALYTICAL CHEMISTRY METHODS DEVELOPMENT
The Environmental Protection Agency's (EPA) Office of Research and Development (ORD) is engaged in the development, demonstration, and validation of new or newly adapted methods of analysis for environmentally related samples. Recognizing that a "one size fits all" approach to qu...
NASA Astrophysics Data System (ADS)
Vassileva, Emilia; Wysocka, Irena
2016-12-01
Anthropogenic Pb in the oceans, derived from high-temperature industrial processes, fuel combustion and incineration, can have an isotopic signature distinct from naturally occurring Pb, supplied by rock weathering. To identify the different pollution sources accurately and to quantify their relative contributions, Pb isotope ratios are widely used. Due to the high salt content (approximately 3.5% of total dissolved solids) and very low levels of Pb (typically from 1 to 100 ng L-1) in seawater, the determination of Pb isotope ratios requires preliminary matrix separation and analyte preconcentration. An analytical protocol for the measurement of Pb isotope ratios in seawater combining the seaFAST sample pre-treatment system and Sector Field Inductively Coupled Plasma Mass Spectrometry (SF ICP-MS) was developed. The application of the seaFAST system was advantageous because of its completely closed working cycle and the small volumes of chemicals introduced in the pre-treatment step, resulting in very low detection limits and procedural blanks. The preconcentration/matrix separation step was also of crucial importance for minimizing the isobaric and matrix interferences coming from the seawater. In order to differentiate between anthropogenic and natural Pb sources, particular attention was paid to the determination of the 204Pb isotope because of its implication in some geological interpretations. The validation of the analytical procedure was carried out according to the recommendations of the ISO/IEC 17025 standard. The method was validated by processing the common Pb isotope reference material NIST SRM 981. All major sources of uncertainty were identified and propagated together following the ISO/GUM guidelines. The estimation of the total uncertainty associated with each measurement result was a fundamental tool for sorting the main sources of possible biases. The developed analytical procedure was applied to coastal and open seawater samples collected in different regions of the world and showed that the procedure is applicable for the measurement of Pb isotope ratios in seawater with a combined uncertainty adequate to discuss the origin of Pb pollution in the ocean.
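A hedged sketch of the kind of correction and uncertainty propagation involved is shown below: a measured ratio is corrected against SRM 981 and the relative standard uncertainties are combined in quadrature. All numbers, including the certified value, are illustrative placeholders; a real budget would follow the certificate and the full ISO/GUM treatment used by the authors.

```python
# Sketch of a mass-bias correction against NIST SRM 981 and a simple
# quadrature (ISO/GUM-style) uncertainty propagation. All values, including
# the "certified" ratio, are placeholders; take real values from the
# certificate and build a full uncertainty budget in practice.
import math

R_meas_sample = 1.170      # measured 206Pb/207Pb in the sample (illustrative)
u_meas_sample = 0.004      # its standard uncertainty

R_meas_srm = 1.088         # measured 206Pb/207Pb in SRM 981 (illustrative)
u_meas_srm = 0.003
R_cert_srm = 1.0933        # certified value (placeholder; use the certificate)
u_cert_srm = 0.0004

k = R_cert_srm / R_meas_srm            # correction factor from the SRM
R_corr = R_meas_sample * k             # mass-bias-corrected sample ratio

# relative uncertainties combined in quadrature (uncorrelated inputs assumed)
u_rel = math.sqrt((u_meas_sample / R_meas_sample) ** 2 +
                  (u_meas_srm / R_meas_srm) ** 2 +
                  (u_cert_srm / R_cert_srm) ** 2)
print(f"corrected ratio = {R_corr:.4f} +/- {R_corr * u_rel:.4f} (k=1)")
```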
Developing automated analytical methods for scientific environments using LabVIEW.
Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard
2010-01-15
The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program have been demonstrated by using it for the control of both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.
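The script-command idea transfers readily to other environments; the sketch below re-creates it in Python (not LabVIEW) with a tiny interpreter that dispatches text commands to hypothetical instrument drivers, so one control loop could serve different analytical setups.

```python
# Sketch of a script-driven controller: a tiny interpreter maps text commands
# onto instrument actions. Command names and driver functions are hypothetical.
def set_valve(position):         print(f"valve -> {position}")
def run_pump(ml_per_min, sec):   print(f"pump {ml_per_min} mL/min for {sec} s")
def acquire_spectrum(scans):     print(f"acquire {scans} scans")

COMMANDS = {
    "VALVE":   lambda args: set_valve(args[0]),
    "PUMP":    lambda args: run_pump(float(args[0]), float(args[1])),
    "ACQUIRE": lambda args: acquire_spectrum(int(args[0])),
}

script = """
VALVE load
PUMP 0.5 30
VALVE inject
ACQUIRE 16
"""

for line in script.strip().splitlines():
    name, *args = line.split()
    COMMANDS[name](args)          # dispatch each script command to its driver
```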
Behavior analytic approaches to problem behavior in intellectual disabilities.
Hagopian, Louis P; Gregory, Meagan K
2016-03-01
The purpose of the current review is to summarize recent behavior analytic research on problem behavior in individuals with intellectual disabilities. We have focused our review on studies published from 2013 to 2015, but also included earlier studies that were relevant. Behavior analytic research on problem behavior continues to focus on the use and refinement of functional behavioral assessment procedures and function-based interventions. During the review period, a number of studies reported on procedures aimed at making functional analysis procedures more time efficient. Behavioral interventions continue to evolve, and there were several larger scale clinical studies reporting on multiple individuals. There was increased attention on the part of behavioral researchers to develop statistical methods for analysis of within subject data and continued efforts to aggregate findings across studies through evaluative reviews and meta-analyses. Findings support continued utility of functional analysis for guiding individualized interventions and for classifying problem behavior. Modifications designed to make functional analysis more efficient relative to the standard method of functional analysis were reported; however, these require further validation. Larger scale studies on behavioral assessment and treatment procedures provided additional empirical support for effectiveness of these approaches and their sustainability outside controlled clinical settings.
Computer-aided diagnostic strategy selection.
Greenes, R A
1986-03-01
Determination of the optimal diagnostic work-up strategy for the patient is becoming a major concern for the practicing physician. Overlap of the indications for various diagnostic procedures, differences in their invasiveness or risk, and high costs have made physicians aware of the need to consider the choice of procedure carefully, as well as its relation to management actions available. In this article, the author discusses research approaches that aim toward development of formal decision analytic methods to allow the physician to determine optimal strategy; clinical algorithms or rules as guides to physician decisions; improved measures for characterizing the performance of diagnostic tests; educational tools for increasing the familiarity of physicians with the concepts underlying these measures and analytic procedures; and computer-based aids for facilitating the employment of these resources in actual clinical practice.
40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) Definitions. Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 87.82 Sampling and analytical procedures for measuring smoke exhaust...
An analytical procedure to assist decision-making in a government research organization
H. Dean Claxton; Giuseppe Rensi
1972-01-01
An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology is drawn from economics and mathe-matical programing. The major analytical aspects distinguishing research...
Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology.
Jesus, Mafalda; Martins, Ana P J; Gallardo, Eugenia; Silvestre, Samuel
2016-01-01
Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata, Smilax china, and Trigonella foenum-graecum. This bioactive phytochemical is not only used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has also revealed high potential and interest in the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Among these, the HPLC technique coupled to different detectors is the most commonly described analytical procedure for this compound. However, other alternative methods have also been published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin and on the most relevant analytical techniques used to isolate, detect, and quantify this compound as well.
Zietze, Stefan; Müller, Rainer H; Brecht, René
2008-03-01
In order to set up a batch-to-batch consistency analytical scheme for N-glycosylation analysis, several sample preparation steps, including enzyme digestions and fluorophore labelling, and two HPLC methods were established. The whole method scheme was standardized, evaluated and validated according to the requirements for analytical testing in early clinical drug development, using a recombinantly produced reference glycoprotein (RGP). The standardization of the methods was ensured by clearly defined standard operating procedures. During evaluation of the methods, the major interest was in determining the loss of oligosaccharides within the analytical scheme. Validation of the methods was performed with respect to specificity, linearity, repeatability, LOD and LOQ. Because reference N-glycan standards were not available, a statistical approach was chosen to derive accuracy from the linearity data. After finishing the validation procedure, defined limits for method variability could be calculated, and differences observed in consistency analysis could be separated into significant and incidental ones.
Analytical procedures for water-soluble vitamins in foods and dietary supplements: a review.
Blake, Christopher J
2007-09-01
Water-soluble vitamins include the B-group vitamins and vitamin C. In order to correctly monitor water-soluble vitamin content in fortified foods for compliance monitoring as well as to establish accurate data banks, an accurate and precise analytical method is a prerequisite. For many years microbiological assays have been used for analysis of B vitamins. However they are no longer considered to be the gold standard in vitamins analysis as many studies have shown up their deficiencies. This review describes the current status of analytical methods, including microbiological assays and spectrophotometric, biosensor and chromatographic techniques. In particular it describes the current status of the official methods and highlights some new developments in chromatographic procedures and detection methods. An overview is made of multivitamin extractions and analyses for foods and supplements.
A singularity free analytical solution of artificial satellite motion with drag
NASA Technical Reports Server (NTRS)
Scheifele, G.; Mueller, A. C.; Starke, S. E.
1977-01-01
The connection between the existing Delaunay-Similar and Poincare-Similar satellite theories in the true anomaly version is outlined for the J(2) perturbation and the new drag approach. An overall description of the concept of the approach is given, while the necessary expansions and the procedure for arriving at the computer program for the canonical forces are delineated. The procedure for the analytical integration of the developed equations is described. In addition, some numerical results are given. The computer program for the algebraic multiplication of the Fourier series, which creates the FORTRAN coding in an automatic manner, is described and documented.
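The core operation of such a series-multiplication program can be sketched briefly. The Python example below multiplies two truncated cosine series by collecting product-to-sum terms; it illustrates the operation only and is not the cited FORTRAN generator.

```python
# Algebraic multiplication of two truncated cosine series using the
# product-to-sum identity; coefficients are collected by harmonic index.
from collections import defaultdict

def multiply_cos_series(a, b):
    """a, b: dicts {harmonic n: coefficient} of series sum_n c_n cos(n*x)."""
    out = defaultdict(float)
    for n, cn in a.items():
        for m, cm in b.items():
            # cos(nx)*cos(mx) = 1/2 [cos((n-m)x) + cos((n+m)x)]
            out[abs(n - m)] += 0.5 * cn * cm
            out[n + m] += 0.5 * cn * cm
    return dict(out)

A = {0: 1.0, 1: 0.5}          # 1 + 0.5 cos(x)
B = {1: 2.0, 2: -0.3}         # 2 cos(x) - 0.3 cos(2x)
print(multiply_cos_series(A, B))   # {1: 1.925, 2: 0.2, 0: 0.5, 3: -0.075}
```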
NASA Technical Reports Server (NTRS)
Romanofsky, Robert R.
1989-01-01
In this report, a thorough analytical procedure is developed for evaluating the frequency-dependent loss characteristics and effective permittivity of microstrip lines. The technique is based on the measured reflection coefficient of microstrip resonator pairs. Experimental data, including quality factor Q, effective relative permittivity, and fringing for 50-ohm lines on gallium arsenide (GaAs) from 26.5 to 40.0 GHz, are presented. The effects of an imperfect open circuit, coupling losses, and loading of the resonant frequency are considered. A cosine-tapered ridge-guide test fixture is described. It was found to be well suited to the device characterization.
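The standard textbook relations behind such resonator measurements are sketched below; they are generic forms, not necessarily the report's exact working equations.

```latex
% For a half-wavelength microstrip resonator of length L with end-fringing
% correction \Delta L, resonating at f_r in its n-th mode (generic forms):
\varepsilon_{\mathrm{eff}} \;=\; \left(\frac{n\,c}{2\,(L + 2\,\Delta L)\,f_r}\right)^{2},
\qquad
Q_L \;=\; \frac{f_r}{\Delta f_{3\,\mathrm{dB}}} .
% Q_L is the loaded quality factor taken from the 3 dB bandwidth of the
% measured reflection-coefficient resonance; extracting the unloaded Q and the
% line loss additionally requires correcting for coupling and the imperfect
% open circuit, which is what the resonator-pair measurement addresses.
```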
Analysis of Phosphonic Acids: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS999
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, J; Vu, A; Koester, C
The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method titled Analysis of Diisopropyl Methylphosphonate, Ethyl Hydrogen Dimethylamidophosphate, Isopropyl Methylphosphonic Acid, Methylphosphonic Acid, and Pinacolyl Methylphosphonic Acid in Water by Multiple Reaction Monitoring Liquid Chromatography/Tandem Mass Spectrometry: EPA Version MS999. This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to validate and verify the analytical procedures described in EPA Method MS999 for analysis of the listed phosphonic acids and surrogates in aqueous samples. The gathered data from this validation study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of EPA Method MS999 can be determined.
Correlation of ground tests and analyses of a dynamically scaled Space Station model configuration
NASA Technical Reports Server (NTRS)
Javeed, Mehzad; Edighoffer, Harold H.; Mcgowan, Paul E.
1993-01-01
Verification of analytical models through correlation with ground test results of a complex space truss structure is demonstrated. A multi-component, dynamically scaled space station model configuration is the focus structure for this work. Previously established test/analysis correlation procedures are used to develop improved component analytical models. Integrated system analytical models, consisting of updated component analytical models, are compared with modal test results to establish the accuracy of system-level dynamic predictions. Design sensitivity model updating methods are shown to be effective for providing improved component analytical models. Also, the effects of component model accuracy and interface modeling fidelity on the accuracy of integrated model predictions is examined.
Dietary fibre: challenges in production and use of food composition data.
Westenbrink, Susanne; Brunt, Kommer; van der Kamp, Jan-Willem
2013-10-01
Dietary fibre is a heterogeneous group of components for which several definitions and analytical methods were developed over the past decades, causing confusion among users and producers of dietary fibre data in food composition databases. An overview is given of current definitions and analytical methods. Some of the issues related to maintaining dietary fibre values in food composition databases are discussed. Newly developed AOAC methods (2009.01 or modifications) yield higher dietary fibre values, due to the inclusion of low molecular weight dietary fibre and resistant starch. For food composition databases procedures need to be developed to combine 'classic' and 'new' dietary fibre values since re-analysing all foods on short notice is impossible due to financial restrictions. Standardised value documentation procedures are important to evaluate dietary fibre values from several sources before exchanging and using the data, e.g. for dietary intake research. Copyright © 2012 Elsevier Ltd. All rights reserved.
Immobilization/remobilization and the regulation of muscle mass
NASA Technical Reports Server (NTRS)
Almon, R. R.
1983-01-01
The relationship between animal body weight and the wet and dry weights of the soleus and EDL muscles was derived. Procedures were examined for tissue homogenization, fractionation, protein determination and DNA determination. A sequence of procedures and buffers were developed to carry out all analyses on one small muscle. This would yield a considerable increase in analytical strength associated with paired statistics. The proposed casting procedure which was to be used for immobilization was reexamined.
NASA Technical Reports Server (NTRS)
Smithers, G. A.
1992-01-01
The microbial ecology facility in the Analytical and Physical Chemistry Branch at Marshall Space Flight Center is tasked with anticipating potential microbial problems (and opportunities to exploit microorganisms) which may occur in partially closed systems such as space station/vehicle habitats and in the water reclamation systems therein, with particular emphasis on the degradation of materials. Within this context, procedures for microbial biofilm research are being developed. Reported here is the development of static system procedures to study aquatic biofilms and their responses to disinfection and invading species. Preliminary investigations have been completed. As procedures are refined, it will be possible to focus more closely on the elucidation of biofilm phenomena.
Progress and development of analytical methods for gibberellins.
Pan, Chaozhi; Tan, Swee Ngin; Yong, Jean Wan Hong; Ge, Liya
2017-01-01
Gibberellins, as a group of phytohormones, exhibit a wide variety of bio-functions within plant growth and development, which have been used to increase crop yields. Many analytical procedures, therefore, have been developed for the determination of the types and levels of endogenous and exogenous gibberellins. As plant tissues contain gibberellins in trace amounts (usually at the level of nanogram per gram fresh weight or even lower), the sample pre-treatment steps (extraction, pre-concentration, and purification) for gibberellins are reviewed in details. The primary focus of this comprehensive review is on the various analytical methods designed to meet the requirements for gibberellins analyses in complex matrices with particular emphasis on high-throughput analytical methods, such as gas chromatography, liquid chromatography, and capillary electrophoresis, mostly combined with mass spectrometry. The advantages and drawbacks of the each described analytical method are discussed. The overall aim of this review is to provide a comprehensive and critical view on the different analytical methods nowadays employed to analyze gibberellins in complex sample matrices and their foreseeable trends. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Candioti, Luciana Vera; De Zan, María M; Cámara, María S; Goicoechea, Héctor C
2014-06-01
A review about the application of response surface methodology (RSM) when several responses have to be simultaneously optimized in the field of analytical methods development is presented. Several critical issues like response transformation, multiple response optimization and modeling with least squares and artificial neural networks are discussed. Most recent analytical applications are presented in the context of analytical methods development, especially in multiple response optimization procedures using the desirability function. Copyright © 2014 Elsevier B.V. All rights reserved.
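The desirability function mentioned above can be illustrated with a minimal sketch: each response is mapped to a value between 0 and 1 and the overall desirability is their geometric mean. The targets and response values in the example are invented.

```python
# Minimal sketch of Derringer-type desirability for multiple-response
# optimization: map each response to [0, 1] and combine by geometric mean.
import numpy as np

def d_maximize(y, low, high, s=1.0):
    """Desirability for a response to be maximized (0 below low, 1 above high)."""
    return float(np.clip((y - low) / (high - low), 0, 1) ** s)

def d_minimize(y, low, high, s=1.0):
    """Desirability for a response to be minimized (1 below low, 0 above high)."""
    return float(np.clip((high - y) / (high - low), 0, 1) ** s)

# e.g. resolution should be high, run time should be short (hypothetical values)
d1 = d_maximize(y=1.8, low=1.0, high=2.0)     # chromatographic resolution
d2 = d_minimize(y=12.0, low=8.0, high=20.0)   # run time in minutes

D = (d1 * d2) ** 0.5                          # geometric mean = overall desirability
print(f"d1={d1:.2f}, d2={d2:.2f}, overall D={D:.2f}")
```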
1989-06-23
[Contents fragment: 3.2 Comparison between MACH and POLAR; 3.3 Flow Chart for VSTS Algorithm.] The most recent changes are: (a) development of the VSTS (velocity space topology search) algorithm for calculating particle densities; (b) extension ... with simple analytic models. The largest modification of the MACH code was the implementation of the VSTS procedure, which constituted a complete
Dai, James Y.; Hughes, James P.
2012-01-01
The meta-analytic approach to evaluating surrogate end points assesses the predictiveness of treatment effect on the surrogate toward treatment effect on the clinical end point based on multiple clinical trials. Definition and estimation of the correlation of treatment effects were developed in linear mixed models and later extended to binary or failure time outcomes on a case-by-case basis. In a general regression setting that covers nonnormal outcomes, we discuss in this paper several metrics that are useful in the meta-analytic evaluation of surrogacy. We propose a unified 3-step procedure to assess these metrics in settings with binary end points, time-to-event outcomes, or repeated measures. First, the joint distribution of estimated treatment effects is ascertained by an estimating equation approach; second, the restricted maximum likelihood method is used to estimate the means and the variance components of the random treatment effects; finally, confidence intervals are constructed by a parametric bootstrap procedure. The proposed method is evaluated by simulations and applications to 2 clinical trials. PMID:22394448
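A much-simplified sketch of the trial-level part of this idea is shown below: the estimated treatment effects on the clinical endpoint are regressed on those on the surrogate across trials, and the R² is reported with a crude parametric bootstrap interval. It omits the estimation-error adjustment and the REML step of the full three-step procedure, and all numbers are simulated.

```python
# Simplified trial-level surrogacy sketch: regress per-trial clinical-endpoint
# effects (beta) on surrogate effects (alpha), report R^2, and attach a crude
# parametric bootstrap interval. Simulated data; not the full 3-step method.
import numpy as np

rng = np.random.default_rng(1)
n_trials = 12
alpha = rng.normal(0.5, 0.3, n_trials)                 # effects on surrogate
beta = 0.9 * alpha + rng.normal(0, 0.1, n_trials)      # effects on clinical endpoint

def r2(a, b):
    s, c = np.polyfit(a, b, 1)
    resid = b - (s * a + c)
    return 1 - resid.var() / b.var()

slope, intercept = np.polyfit(alpha, beta, 1)
fitted = slope * alpha + intercept
sigma = (beta - fitted).std(ddof=2)

r2_hat = r2(alpha, beta)
boot = [r2(alpha, fitted + rng.normal(0, sigma, n_trials)) for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"trial-level R^2 = {r2_hat:.2f} (approx. 95% CI {lo:.2f}-{hi:.2f})")
```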
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ndong, Mamadou; Lauvergnat, David; Nauts, André
2013-11-28
We present new techniques for an automatic computation of the kinetic energy operator in analytical form. These techniques are based on the use of the polyspherical approach and are extended to take into account Cartesian coordinates as well. An automatic procedure is developed where analytical expressions are obtained by symbolic calculations. This procedure is a full generalization of the one presented in Ndong et al., [J. Chem. Phys. 136, 034107 (2012)]. The correctness of the new implementation is analyzed by comparison with results obtained from the TNUM program. We give several illustrations that could be useful for users of the code. In particular, we discuss some cyclic compounds which are important in photochemistry. Among others, we show that choosing a well-adapted parameterization and decomposition into subsystems can allow one to avoid singularities in the kinetic energy operator. We also discuss a relation between polyspherical and Z-matrix coordinates: this comparison could be helpful for building an interface between the new code and a quantum chemistry package.
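As a toy illustration of what such symbolic derivations involve (and not of the polyspherical code itself), the SymPy sketch below computes the metric tensor of a Cartesian-to-polar coordinate change; the contravariant metric obtained this way is the ingredient that enters a curvilinear kinetic energy operator.

```python
# Toy symbolic calculation: metric tensor of a Cartesian -> polar coordinate
# change with SymPy. The contravariant metric G enters a curvilinear kinetic
# energy operator; this is only an illustration, not the polyspherical code.
import sympy as sp

r, th = sp.symbols('r theta', positive=True)
x = r * sp.cos(th)                 # Cartesian coordinates in the new variables
y = r * sp.sin(th)

q = sp.Matrix([r, th])
X = sp.Matrix([x, y])
J = X.jacobian(q)                  # Jacobian of the transformation
g = sp.simplify(J.T * J)           # covariant metric tensor
G = sp.simplify(g.inv())           # contravariant metric

print(g)   # Matrix([[1, 0], [0, r**2]])
print(G)   # Matrix([[1, 0], [0, r**(-2)]])
```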
Nechaeva, Daria; Shishov, Andrey; Ermakov, Sergey; Bulatov, Andrey
2018-06-01
An easily performed, miniaturized, cheap, selective and sensitive procedure for the determination of H2S in fuel oil samples, based on headspace liquid-phase microextraction followed by cyclic voltammetry detection using a paper-based analytical device (PAD), was developed. A modified wax dipping method was applied to fabricate the PAD. The PAD included hydrophobic zones for the sample and the supporting electrolyte connected by a hydrophilic channel. The zones of sample and supporting electrolyte were connected with nickel working, platinum auxiliary and Ag/AgCl reference electrodes. The analytical procedure included separation of H2S from the fuel oil sample by headspace liquid-phase microextraction into alkaline solution. Then, the obtained sulfide ion solution and the supporting electrolyte were dropped onto the zones, followed by analyte detection at +0.45 V. Under the optimized conditions, the H2S concentration in the range from 2 to 20 mg kg-1 had a good linear relation with the peak current. The limit of detection (3σ) was 0.6 mg kg-1. The procedure was successfully applied to the analysis of fuel oil samples. Copyright © 2018 Elsevier B.V. All rights reserved.
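The calibration treatment implied here, a linear peak-current fit over 2-20 mg kg-1 and a 3σ detection limit, can be sketched as follows; the current values and blank noise are invented, so the computed LOD is only illustrative.

```python
# Sketch of the calibration treatment implied by the abstract: linear fit of
# peak current vs H2S concentration and a 3-sigma LOD. Invented current and
# blank-noise values, so the resulting LOD is illustrative only.
import numpy as np

conc = np.array([2, 5, 10, 15, 20], dtype=float)      # mg/kg (range from the abstract)
peak = np.array([0.42, 1.02, 2.05, 3.01, 4.10])       # µA, hypothetical peak currents
sd_blank = 0.04                                        # µA, hypothetical blank noise

slope, intercept = np.polyfit(conc, peak, 1)
r = np.corrcoef(conc, peak)[0, 1]
lod = 3 * sd_blank / slope                             # 3-sigma limit of detection

print(f"peak = {slope:.3f}*c + {intercept:.3f}  (r = {r:.4f})")
print(f"LOD = {lod:.2f} mg/kg (the abstract reports 0.6 mg/kg)")
```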
Montgomery, L D; Montgomery, R W; Guisado, R
1995-05-01
This investigation demonstrates the feasibility of mental workload assessment by rheoencephalographic (REG) and multichannel electroencephalographic (EEG) monitoring. During the performance of this research, unique testing, analytical and display procedures were developed for REG and EEG monitoring that extend the current state of the art and provide valuable tools for the study of cerebral circulatory and neural activity during cognition. REG records are analyzed to provide indices of the right and left hemisphere hemodynamic changes that take place during each test sequence. The EEG data are modeled using regression techniques and mathematically transformed to provide energy-density distributions of the scalp electrostatic field. These procedures permit concurrent REG/EEG cognitive testing not possible with current techniques. The introduction of a system for recording and analysis of cognitive REG/EEG test sequences facilitates the study of learning and memory disorders, dementia and other encephalopathies.
Tobiszewski, Marek; Orłowski, Aleksander
2015-03-27
The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets all the requirements of analytical chemists acting as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives to the MCDA analysis. Nine different criteria describing the alternatives were chosen from different groups: metrological, economic and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction- and liquid-phase microextraction-based procedures high, while liquid-liquid extraction-, solid-phase extraction- and stir bar sorptive extraction-based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choice is in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools: NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. In contrast, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
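For readers unfamiliar with PROMETHEE, the sketch below shows the net-flow (PROMETHEE II-style) ranking idea on invented data; the scores, weights and the "usual" preference function are assumptions for illustration, not the study's actual inputs:

# Minimal PROMETHEE II-style net-flow ranking: rows are candidate analytical
# procedures, columns are criteria already oriented so that larger is better.
import numpy as np

scores = np.array([   # 4 hypothetical procedures x 3 hypothetical criteria
    [0.9, 0.4, 0.7],  # e.g. SPME-based
    [0.6, 0.8, 0.5],  # e.g. LPME-based
    [0.3, 0.9, 0.2],  # e.g. LLE-based
    [0.5, 0.5, 0.6],  # e.g. SPE-based
])
weights = np.array([0.5, 0.3, 0.2])   # assumed expert weights, summing to 1

n = scores.shape[0]
pref = np.zeros((n, n))
for a in range(n):
    for b in range(n):
        if a != b:
            # "usual" preference function: 1 if a beats b on a criterion, else 0
            pref[a, b] = np.dot(weights, (scores[a] > scores[b]).astype(float))

phi_plus = pref.sum(axis=1) / (n - 1)    # positive outranking flow
phi_minus = pref.sum(axis=0) / (n - 1)   # negative outranking flow
net_flow = phi_plus - phi_minus
print("ranking (best first):", np.argsort(-net_flow))

Different preference functions (linear, Gaussian) and weight scenarios simply change how pref is filled; the flow computation stays the same.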
Lima, Manoel J A; Fernandes, Ridvan N; Tanaka, Auro A; Reis, Boaventura F
2016-02-01
This paper describes a new technique for the determination of captopril in pharmaceutical formulations, implemented by employing multicommuted flow analysis. The analytical procedure was based on the reaction between hypochlorite and captopril. The remaining hypochlorite oxidized luminol, generating light that was detected using a homemade luminometer. To the best of our knowledge, this is the first time that this reaction has been exploited for the determination of captopril in pharmaceutical products, offering a clean analytical procedure with minimal reagent usage. The effectiveness of the proposed procedure was confirmed by analyzing a set of pharmaceutical formulations. Application of the paired t-test showed that there was no significant difference between the data sets at a 95% confidence level. The useful features of the new analytical procedure included a linear response for captopril concentrations in the range 20.0-150.0 µmol/L (r = 0.997), a limit of detection (3σ) of 2.0 µmol/L, a sample throughput of 164 determinations per hour, reagent consumption of 9 µg luminol and 42 µg hypochlorite per determination and generation of 0.63 mL of waste. A relative standard deviation of 1% (n = 6) for a standard solution containing 80 µmol/L captopril was also obtained. Copyright © 2015 John Wiley & Sons, Ltd.
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2006-01-01
The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's LabMaster data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality-control samples analyzed from July 1999 through June 2001. Results for the quality-control samples for 18 analytical procedures were evaluated for bias and precision. Control charts indicate that data for eight of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, total monomeric aluminum, total aluminum, calcium, chloride and nitrate (ion chromatography and colorimetric method) and sulfate. The total aluminum and dissolved organic carbon procedures were biased throughout the analysis period for the high-concentration sample, but were within control limits. The calcium and specific conductance procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The magnesium procedure was biased for the high-concentration and low-concentration samples, but was within control limits. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 14 of 15 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for dissolved organic carbon. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 17 of the 18 analytes. At least 90 percent of the samples met data-quality objectives for all analytes except ammonium (81 percent of samples met objectives), chloride (75 percent of samples met objectives), and sodium (86 percent of samples met objectives). Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated good data quality over the time period, with most ratings for each sample in the good to excellent range. The P-sample (low-ionic-strength constituents) analysis had one satisfactory rating for the specific conductance procedure in one study. The T-sample (trace constituents) analysis had one satisfactory rating for the aluminum procedure in one study and one unsatisfactory rating for the sodium procedure in another. The remainder of the samples had good or excellent ratings for each study. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 89 percent of the samples met data-quality objectives for 10 of the 14 analytes; the exceptions were ammonium, total aluminum, dissolved organic carbon, and sodium. Results indicate a positive bias for the ammonium procedure in all studies. Data-quality objectives were not met in 50 percent of samples analyzed for total aluminum, 38 percent of samples analyzed for dissolved organic carbon, and 27 percent of samples analyzed for sodium.
Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 91 percent of the samples analyzed for calcium, chloride, fluoride, magnesium, pH, potassium, and sulfate. Data-quality objectives were met by 75 percent of the samples analyzed for sodium and 58 percent of the samples analyzed for specific conductance.
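A small sketch of the triplicate-precision check described in the report above, using hypothetical concentrations and an assumed uniform 10% coefficient-of-variation objective (the report's actual objectives differ by analyte):

# Compute the coefficient of variation for each triplicate set and flag sets
# that exceed an assumed data-quality objective.
import numpy as np

triplicates = {                       # hypothetical concentrations, mg/L
    "chloride": [0.52, 0.55, 0.61],
    "sodium":   [0.80, 0.81, 0.79],
}
dqo_cv = 0.10                         # assumed 10% CV objective

for analyte, values in triplicates.items():
    v = np.asarray(values, dtype=float)
    cv = v.std(ddof=1) / v.mean()
    status = "meets" if cv <= dqo_cv else "fails"
    print(f"{analyte}: CV = {cv:.1%} ({status} the assumed DQO)")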
Validation of urban freeway models.
DOT National Transportation Integrated Search
2015-01-01
This report describes the methodology, data, conclusions, and enhanced models regarding the validation of two sets of models developed in the Strategic Highway Research Program 2 (SHRP 2) Reliability Project L03, Analytical Procedures for Determining...
Computer modeling of lung cancer diagnosis-to-treatment process
Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U.; Yu, Xinhua; Faris, Nick
2015-01-01
We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the necessary data and procedures to develop a DES model for the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their application in healthcare are introduced, and the approach to derive a lung cancer diagnosis process model is presented. Similarly, the procedure to derive closed formulas evaluating the diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed. PMID:26380181
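As a sketch of the Markov chain ingredient mentioned above, the toy model below propagates a cohort through a diagnosis-to-treatment pathway; the states and transition probabilities are invented, not taken from the study:

# Toy discrete-time Markov chain for a diagnosis-to-treatment pathway.
import numpy as np

states = ["referred", "staging", "surgery", "non-surgical", "lost"]
P = np.array([
    [0.0, 0.8, 0.0, 0.0, 0.2],   # referred  -> staging / lost to follow-up
    [0.0, 0.0, 0.5, 0.4, 0.1],   # staging   -> surgery / non-surgical / lost
    [0.0, 0.0, 1.0, 0.0, 0.0],   # surgery (absorbing)
    [0.0, 0.0, 0.0, 1.0, 0.0],   # non-surgical (absorbing)
    [0.0, 0.0, 0.0, 0.0, 1.0],   # lost (absorbing)
])

dist = np.array([1.0, 0.0, 0.0, 0.0, 0.0])   # everyone starts as "referred"
for _ in range(10):                           # propagate ten steps
    dist = dist @ P
print(dict(zip(states, dist.round(3))))       # distribution over states after ten steps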
Cantarero, Samuel; Zafra-Gómez, Alberto; Ballesteros, Oscar; Navalón, Alberto; Vílchez, José L; Crovetto, Guillermo; Verge, Coral; de Ferrer, Juan A
2010-11-01
We have developed a new analytical procedure for determining insoluble Ca and Mg fatty acid salts (soaps) in agricultural soil and sewage sludge samples. The number of analytical methodologies that focus on the determination of insoluble soap salts in different environmental compartments is very limited. In this work, we propose a methodology that involves a sample clean-up step with petroleum ether to remove soluble salts and a conversion of Ca and Mg insoluble salts into soluble potassium salts using tripotassium ethylenediaminetetraacetate salt and potassium carbonate, followed by the extraction of analytes from the samples using microwave-assisted extraction with methanol. An improved esterification procedure using 2,4-dibromoacetophenone before the liquid chromatography with ultraviolet detection analysis also has been developed. The absence of matrix effect was demonstrated with two fatty acid Ca salts that are not commercial and are never detected in natural samples (C₁₃:₀ and C₁₇:₀). Therefore, it was possible to evaluate the matrix effect because both standards have similar environmental behavior (adsorption and precipitation) to commercial soaps (C₁₀:₀ to C₁₈:₀). We also studied the effect of the different variables on the clean-up, the conversion of Ca soap, and the extraction and derivatization procedures. The quantification limits found ranged from 0.4 to 0.8 mg/kg. The proposed method was satisfactorily applied for the development of a study on soap behavior in agricultural soil and sewage sludge samples. © 2010 SETAC.
40 CFR 140.5 - Analytical procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 140.5 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) MARINE SANITATION DEVICE STANDARD § 140.5 Analytical procedures. In determining the composition and quality of effluent discharge from marine sanitation devices, the procedures contained in 40 CFR part 136...
A Review of Current Methods for Analysis of Mycotoxins in Herbal Medicines
Zhang, Lei; Dou, Xiao-Wen; Zhang, Cheng; Logrieco, Antonio F.; Yang, Mei-Hua
2018-01-01
The presence of mycotoxins in herbal medicines is an established problem throughout the entire world. The sensitive and accurate analysis of mycotoxins in complicated matrices (e.g., herbs) typically involves challenging sample pretreatment procedures and an efficient detection instrument. However, although numerous reviews have been published regarding the occurrence of mycotoxins in herbal medicines, few of them provided a detailed summary of related analytical methods for mycotoxin determination. This review focuses on analytical techniques, including sampling, extraction, cleanup, and detection, for mycotoxin determination in herbal medicines established within the past ten years. Dedicated sections of this article address the significant developments in sample preparation and highlight the importance of this procedure in the analytical technology. This review also summarizes conventional chromatographic techniques for mycotoxin qualification or quantitation, as well as recent studies regarding the development and application of screening assays such as enzyme-linked immunosorbent assays, lateral flow immunoassays, aptamer-based lateral flow assays, and cytometric bead arrays. The present work provides good insight into the advanced research that has been done and closes with an indication of future demand for the emerging technologies. PMID:29393905
Fukushima, Romualdo S; Hatfield, Ronald D
2004-06-16
Present analytical methods to quantify lignin in herbaceous plants are not totally satisfactory. A spectrophotometric method, acetyl bromide soluble lignin (ABSL), has been employed to determine lignin concentration in a range of plant materials. In this work, lignin extracted with acidic dioxane was used to develop standard curves and to calculate the derived linear regression equation (slope equals absorptivity value or extinction coefficient) for determining the lignin concentration of respective cell wall samples. This procedure yielded lignin values that were different from those obtained with Klason lignin, acid detergent acid insoluble lignin, or permanganate lignin procedures. Correlations with in vitro dry matter or cell wall digestibility of samples were highest with data from the spectrophotometric technique. The ABSL method employing as standard lignin extracted with acidic dioxane has the potential to be employed as an analytical method to determine lignin concentration in a range of forage materials. It may be useful in developing a quick and easy method to predict in vitro digestibility on the basis of the total lignin content of a sample.
Baranowska, Irena; Buszewski, Bogusław; Namieśnik, Jacek; Konieczka, Piotr; Magiera, Sylwia; Polkowska-Motrenko, Halina; Kościelniak, Paweł; Gadzała-Kopciuch, Renata; Woźniakiewicz, Aneta; Samczyński, Zbigniew; Kochańska, Kinga; Rutkowska, Małgorzata
2017-02-01
Regular use of a reference material and participation in a proficiency testing program can improve the reliability of analytical data. This paper presents the preparation of candidate reference materials for the drugs metoprolol, propranolol, carbamazepine, naproxen, and acenocoumarol in freshwater bottom sediment and cod and herring tissues. These reference materials are not available commercially. Drugs (between 7 ng/g and 32 ng/g) were added to the samples, and the spiked samples were freeze-dried, pulverized, sieved, homogenized, bottled, and sterilized by γ-irradiation to prepare the candidate materials. Procedures for extraction and liquid chromatography coupled with tandem mass spectrometry were developed to determine the drugs of interest in the studied material. Each target drug was quantified using two analytical procedures, and the results obtained from these two procedures were in good agreement with each other. Stability and homogeneity assessments were performed, and the relative uncertainties due to instability (for an expiration date of 12 months) and inhomogeneity were 10-25% and 4.0-6.8%, respectively. These procedures will be useful in the future production of reference materials. Copyright © 2016 Elsevier Ltd. All rights reserved.
Marcinkowska, Monika; Komorowicz, Izabela; Barałkiewicz, Danuta
2016-05-12
An analytical procedure dedicated to the multielemental determination of the toxic species As(III), As(V), Cr(VI), Sb(III) and Sb(V) in drinking water samples using high performance liquid chromatography hyphenated to inductively coupled plasma mass spectrometry (HPLC/ICP-DRC-MS) was developed. Optimization of the detection and separation conditions was conducted. A dynamic reaction cell (DRC) with oxygen as the reaction gas was involved in the experiments. The analytical signals obtained for the separated species, studied by anion-exchange chromatography, were symmetrical. The applied mobile phase consisted of 3 mM EDTANa2 and 36 mM ammonium nitrate. Full separation of the species in the forms H3AsO3, H2AsO4(-), SbO2(-), Sb(OH)6(-) and CrO4(2-) was achieved in 15 min with use of a gradient elution program. Detailed validation of the analytical procedure proved the reliability of the analytical measurements. The procedure was characterized by high precision, in the range from 1.7% to 2.4%. Detection limits (LD) were 0.067 μg L(-1), 0.068 μg L(-1), 0.098 μg L(-1), 0.083 μg L(-1) and 0.038 μg L(-1) for As(III), As(V), Cr(VI), Sb(III) and Sb(V), respectively. The obtained recoveries, in the range of 91%-110%, confirmed the absence of interference effects on the analytical signals. The applicability of the proposed procedure was tested on drinking water samples characterized by mineralization up to 650 mg L(-1). Copyright © 2016 Elsevier B.V. All rights reserved.
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2009-01-01
The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's Lab Master data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality-control samples analyzed from July 2003 through June 2005. Results for the quality-control samples for 20 analytical procedures were evaluated for bias and precision. Control charts indicate that data for five of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, total monomeric aluminum, pH, silicon, and sodium. Seven of the analytical procedures were biased throughout the analysis period for the high-concentration sample, but were within control limits; these procedures were: dissolved organic carbon, chloride, nitrate (ion chromatograph), nitrite, silicon, sodium, and sulfate. The calcium and magnesium procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The total aluminum and specific conductance procedures were biased for the high-concentration and low-concentration samples, but were within control limits. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 17 of 18 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for dissolved organic carbon. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 18 of the 22 analytes. At least 85 percent of the samples met data-quality objectives for all analytes except total monomeric aluminum (82 percent of samples met objectives), total aluminum (77 percent of samples met objectives), chloride (80 percent of samples met objectives), fluoride (76 percent of samples met objectives), and nitrate (ion chromatograph) (79 percent of samples met objectives). The ammonium and total dissolved nitrogen procedures did not meet the data-quality objectives. Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated good data quality over the time period, with ratings for each sample in the satisfactory, good, and excellent ranges or less than 10 percent error. The P-sample (low-ionic-strength constituents) analysis had one marginal and two unsatisfactory ratings for the chloride procedure. The T-sample (trace constituents) analysis had two unsatisfactory ratings and one high range percent error for the aluminum procedure. The N-sample (nutrient constituents) analysis had one marginal rating for the nitrate procedure. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 84 percent of the samples met data-quality objectives for 11 of the 14 analytes; the exceptions were ammonium, total aluminum, and acid-neutralizing capacity.
The ammonium procedure did not meet data-quality objectives in all studies. Data-quality objectives were not met in 23 percent of samples analyzed for total aluminum and 45 percent of samples analyzed for acid-neutralizing capacity. Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 86 percent of the samples analyzed for calcium, chloride, fluoride, magnesium, pH, potassium, sodium, and sulfate. Data-quality objectives were not met by samples analyzed for fluoride.
The Case for Adopting Server-side Analytics
NASA Astrophysics Data System (ADS)
Tino, C.; Holmes, C. P.; Feigelson, E.; Hurlburt, N. E.
2017-12-01
The standard method for accessing Earth and space science data relies on a scheme developed decades ago: data residing in one or many data stores must be parsed out and shipped via internet lines or physical transport to the researcher, who in turn locally stores the data for analysis. The analysis tasks are varied and include visualization, parameterization, and comparison with or assimilation into physics models. In many cases this process is inefficient and unwieldy as the data sets become larger and demands on the analysis tasks become more sophisticated and complex. For about a decade, several groups have explored a new paradigm as an alternative to this model. The names applied to the paradigm include "data analytics", "climate analytics", and "server-side analytics". The general concept is that in close network proximity to the data store there will be a tailored processing capability appropriate to the type and use of the data served. The user of the server-side analytics will operate on the data with numerical procedures. The procedures can be accessed via canned code, a scripting processor, or an analysis package such as Matlab, IDL or R. Results of the analytics processes will then be relayed via the internet to the user. In practice, these results will be at a much lower volume, easier for the user to transport and store locally, and easier to interoperate with data sets from other remote data stores. The user can also iterate on the processing call to tailor the results as needed. A major component of server-side analytics could be to provide sets of tailored results to end users in order to eliminate the repetitive preconditioning that is often required with these data sets and that drives much of the throughput challenge. NASA's Big Data Task Force studied this issue. This paper will present the results of this study, including examples of SSAs that are being developed and demonstrated, and suggestions for architectures that might be developed for future applications.
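A minimal sketch of the server-side analytics pattern described above, using only the Python standard library; the endpoint, query parameters and data are invented for illustration and are not any agency's actual API:

# The data stay on the server; the client requests a reduction (here, a mean
# over an index range) and only the small JSON result crosses the network.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

DATA = [float(i % 17) for i in range(1_000_000)]   # stands in for a large archive

class AnalyticsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        q = parse_qs(urlparse(self.path).query)
        start = int(q.get("start", ["0"])[0])
        stop = int(q.get("stop", [str(len(DATA))])[0])
        subset = DATA[start:stop]
        result = {"n": len(subset), "mean": sum(subset) / max(len(subset), 1)}
        body = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # e.g. GET http://localhost:8000/?start=0&stop=500000 returns a small JSON summary
    HTTPServer(("localhost", 8000), AnalyticsHandler).serve_forever()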
Development of a GC-MS-SPME Method for the Determination of Amines in Meteorites
NASA Astrophysics Data System (ADS)
Hilts, R. W.; Skelhorne, A. W.; Simkus, D.; Herd, C. D. K.
2016-08-01
A GC-MS-SPME analytical method for the direct determination of amines in aqueous solution has been developed. The key step in the procedure is the conversion of the amines into their non-volatile ammonium salts by protonation with HCl.
Vibro-acoustics for Space Station applications
NASA Technical Reports Server (NTRS)
Vaicaitis, R.; Bofilios, D. A.
1986-01-01
An analytical procedure has been developed to study noise generation in a double-wall and a single-wall cylindrical shell due to mechanical point loads. The objective of this study is to develop theoretical procedures for parametric evaluation of noise generation and noise transmission for the habitability modules of the proposed Space Station. The solutions of the governing acoustic-structural equations are obtained utilizing modal decomposition. The numerical results include modal frequencies, deflection response spectral densities and interior noise sound pressure levels.
DOT National Transportation Integrated Search
2012-11-30
The objective of this project was to develop technical relationships between reliability improvement strategies and reliability performance metrics. This project defined reliability, explained the importance of travel time distributions for measuring...
Assessment of passive drag in swimming by numerical simulation and analytical procedure.
Barbosa, Tiago M; Ramos, Rui; Silva, António J; Marinho, Daniel A
2018-03-01
The aim was to compare the passive drag of gliding underwater obtained by numerical simulation and by an analytical procedure. An Olympic swimmer was scanned by computer tomography and modelled gliding at a 0.75-m depth in the streamlined position. Steady-state computational fluid dynamics (CFD) analyses were performed in Fluent. A set of analytical procedures was selected concurrently. Friction drag (Df), pressure drag (Dpr), total passive drag force (Df+pr) and drag coefficient (CD) were computed between 1.3 and 2.5 m·s-1 by both techniques. Df+pr ranged from 45.44 to 144.06 N with CFD and from 46.03 to 167.06 N with the analytical procedure (differences: 1.28% to 13.77%). CD ranged between 0.698 and 0.622 by CFD and between 0.657 and 0.644 by the analytical procedures (differences: 0.40-6.30%). Linear regression models showed a very high association for Df+pr plotted in absolute values (R2 = 0.98) and after log-log transformation (R2 = 0.99). CD also showed a very high adjustment for both absolute (R2 = 0.97) and log-log plots (R2 = 0.97). The bias for Df+pr was 8.37 N and 0.076 N after logarithmic transformation. Df represented between 15.97% and 18.82% of Df+pr by CFD, and between 14.66% and 16.21% by the analytical procedures. Therefore, despite the bias, analytical procedures offer a feasible way of gaining insight into a swimmer's hydrodynamic characteristics.
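For orientation, a hedged sketch of the kind of textbook decomposition involved, using a generic flat-plate friction line and assumed swimmer dimensions; these formulas and values are illustrative and are not necessarily the authors' analytical procedure:

# Split total passive drag into friction and pressure components.
import math

rho, nu = 998.0, 1.0e-6        # water density (kg m-3), kinematic viscosity (m2 s-1)
L, S, A = 1.9, 1.8, 0.07       # assumed body length (m), wetted area (m2), frontal area (m2)
CD = 0.65                      # assumed total drag coefficient (order of the values above)

for v in (1.3, 1.9, 2.5):      # m s-1
    Re = v * L / nu
    Cf = 0.075 / (math.log10(Re) - 2) ** 2       # ITTC-57 friction line
    Df = 0.5 * rho * Cf * S * v ** 2             # friction drag over the wetted surface
    Dtot = 0.5 * rho * CD * A * v ** 2           # total passive drag
    Dpr = Dtot - Df                              # pressure (form) drag as the remainder
    print(f"v={v}: Df={Df:.1f} N, Dpr={Dpr:.1f} N, Df share={Df/Dtot:.1%}")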
2013-01-01
Background: Healthcare delivery is largely accomplished in and through conversations between people, and healthcare quality and effectiveness depend enormously upon the communication practices employed within these conversations. An important body of evidence about these practices has been generated by conversation analysis and related discourse analytic approaches, but there has been very little systematic reviewing of this evidence.
Methods: We developed an approach to reviewing evidence from conversation analytic and related discursive research through the following procedures:
• reviewing existing systematic review methods and our own prior experience of applying these
• clarifying distinctive features of conversation analytic and related discursive work which must be taken into account when reviewing
• holding discussions within a review advisory team that included members with expertise in healthcare research, conversation analytic research, and systematic reviewing
• attempting and then refining procedures through conducting an actual review which examined evidence about how people talk about difficult future issues including illness progression and dying
Results: We produced a step-by-step guide which we describe here in terms of eight stages, and which we illustrate from our ‘Review of Future Talk'. The guide incorporates both established procedures for systematic reviewing and new techniques designed for working with conversation analytic evidence.
Conclusions: The guide is designed to inform systematic reviews of conversation analytic and related discursive evidence on specific domains and topics. Whilst we designed it for reviews that aim at informing healthcare practice and policy, it is flexible and could be used for reviews with other aims, for instance those aiming to underpin research programmes and projects. We advocate systematically reviewing conversation analytic and related discursive findings using this approach in order to translate them into a form that is credible and useful to healthcare practitioners, educators and policy-makers. PMID:23721181
Analytical control test plan and microbiological methods for the water recovery test
NASA Technical Reports Server (NTRS)
Traweek, M. S. (Editor); Tatara, J. D. (Editor)
1994-01-01
Qualitative and quantitative laboratory results are important to the decision-making process. In some cases, they may represent the only basis for deciding between two or more given options or processes. Therefore, it is essential that the handling of laboratory samples and the analytical operations employed are performed with a deliberate level of conscientious effort. Reporting erroneous results can lead to faulty interpretations and result in misinformed decisions. This document provides analytical control specifications which will govern future test procedures related to all Water Recovery Test (WRT) Phase 3 activities to be conducted at the National Aeronautics and Space Administration/Marshall Space Flight Center (NASA/MSFC). This document addresses the process which will be used to verify analytical data generated throughout the test period, identifies the responsibilities of key personnel and participating laboratories and the chains of communication to be followed, and ensures that approved methodology and procedures are used during WRT activities. This document does not outline specifics, but provides a minimum guideline by which sampling protocols, analysis methodologies, test site operations, and laboratory operations should be developed.
Vorberg, Ellen; Fleischer, Heidi; Junginger, Steffen; Liu, Hui; Stoll, Norbert; Thurow, Kerstin
2016-10-01
Life science areas require specific sample pretreatment to increase the concentration of the analytes and/or to convert the analytes into an appropriate form for the detection and separation systems. Various workstations are commercially available, allowing for automated biological sample pretreatment. Nevertheless, due to the required temperature, pressure, and volume conditions in typical element and structure-specific measurements, automated platforms are not suitable for analytical processes. Thus, the purpose of the presented investigation was the design, realization, and evaluation of an automated system ensuring high-precision sample preparation for a variety of analytical measurements. The developed system has to enable system adaptation and high performance flexibility. Furthermore, the system has to be capable of dealing with the wide range of required vessels simultaneously, allowing for less costly and time-consuming process steps. The system's functionality was confirmed in various validation sequences. Using element-specific measurements, the automated system was up to 25% more precise compared to the manual procedure and as precise as the manual procedure using structure-specific measurements. © 2015 Society for Laboratory Automation and Screening.
14 CFR 34.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Exhaust Gaseous Emissions (Aircraft and Aircraft Gas Turbine Engines) § 34.64 Sampling and analytical procedures for measuring gaseous exhaust emissions. The...
14 CFR 34.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.
Code of Federal Regulations, 2011 CFR
2011-01-01
... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Exhaust Gaseous Emissions (Aircraft and Aircraft Gas Turbine Engines) § 34.64 Sampling and analytical procedures for measuring gaseous exhaust emissions. The...
Miller, Eleanor I; Murray, Gordon J; Rollins, Douglas E; Tiffany, Stephen T; Wilkins, Diana G
2011-07-01
The aim of this exploratory study was to develop and validate a liquid chromatography-tandem mass spectrometry (LC-MS-MS) method for the quantification of nicotine, eight nicotine metabolites, and two minor tobacco alkaloids in fortified analyte-free hair and subsequently apply this method to hair samples collected from active smokers. An additional aim of the study was to include an evaluation of different wash procedures for the effective removal of environmentally deposited nicotine from tobacco smoke. An apparatus was designed for the purpose of exposing analyte-free hair to environmental tobacco smoke in order to deposit nicotine onto the hair surface. A shampoo/water wash procedure was identified as the most effective means of removing nicotine. This wash procedure was utilized for a comparison of washed and unwashed heavy smoker hair samples. Analytes and corresponding deuterated internal standards were extracted using a cation-exchange solid-phase cartridge. LC-MS-MS was carried out using an Acquity™ UPLC® system (Waters) and a Quattro Premier XE™ triple quadrupole MS (Waters) operated in electrospray positive ionization mode, with multiple reaction monitoring data acquisition. The developed method was applied to hair samples from heavy smokers (n = 3) and low-level smokers (n = 3), collected through IRB-approved protocols. Nicotine, cotinine, and nornicotine were quantified in both the washed and unwashed hair samples collected from three heavy smokers, whereas 3-hydroxycotinine was quantified in only one unwashed sample and nicotine-1'-oxide in the washed and unwashed hair samples from two heavy smokers. In contrast, nicotine-1'-oxide was quantified in one of the three low-level smoker samples; nicotine was quantified in the other two low-level smoker samples. No other analytes were detected in the hair of the three low-level smokers.
Improvement of analytical dynamic models using modal test data
NASA Technical Reports Server (NTRS)
Berman, A.; Wei, F. S.; Rao, K. V.
1980-01-01
A method developed to determine minimum changes in analytical mass and stiffness matrices to make them consistent with a set of measured normal modes and natural frequencies is presented. The corrected model will be an improved base for studies of physical changes, boundary condition changes, and for prediction of forced responses. The method features efficient procedures not requiring solutions of the eigenvalue problem, and the ability to have more degrees of freedom than the test data. In addition, modal displacements are obtained for all analytical degrees of freedom, and the frequency dependence of the coordinate transformations is properly treated.
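One well-known correction in this family is Berman's minimum-change mass update, which adjusts the analytical mass matrix just enough that the measured modes become mass-orthonormal. The sketch below applies that update to invented toy matrices and "measured" modes; it illustrates the idea only and is not the full method of the paper:

# Minimum-change mass-matrix correction (Berman-style) on toy data.
import numpy as np

M = np.diag([2.0, 1.5, 1.0])                 # analytical mass matrix (toy)
Phi = np.array([[0.55, 0.30],                # two "measured" modes at 3 DOFs
                [0.40, -0.45],
                [0.35, 0.60]])

ma = Phi.T @ M @ Phi                          # would be the identity if model and test agreed
I = np.eye(ma.shape[0])
ma_inv = np.linalg.inv(ma)
dM = M @ Phi @ ma_inv @ (I - ma) @ ma_inv @ Phi.T @ M
M_updated = M + dM

print(np.round(Phi.T @ M_updated @ Phi, 6))  # now (numerically) the identity

Note that the update is computed directly from M and the measured modes, with no eigenvalue solution required, which is one of the efficiency features the abstract highlights.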
Modeling regional freight flow assignment through intermodal terminals
DOT National Transportation Integrated Search
2005-03-01
An analytical model is developed to assign regional freight across a multimodal highway and railway network using geographic information systems. As part of the regional planning process, the model is an iterative procedure that assigns multimodal fr...
NASA Technical Reports Server (NTRS)
Book, W. J.
1973-01-01
An investigation is reported involving a mathematical procedure using 4 x 4 transformation matrices for analyzing the vibrations of flexible manipulators. Previous studies with the procedure are summarized and the method is extended to include flexible joints as well as links, and to account for the effects of various power transmission schemes. A systematic study of the allocation of structural material and the placement of components such as motors and gearboxes was undertaken using the analytical tools developed. As one step in this direction the variables which relate the vibration parameters of the arm to the task and environment of the arm were isolated and nondimensionalized. The 4 x 4 transformation matrices were also used to develop analytical expressions for the terms of the complete 6 x 6 compliance matrix for the case of two flexible links joined by a rotating joint, flexible about its axis of rotation.
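For readers unfamiliar with the formalism, the sketch below composes 4 x 4 homogeneous transformation matrices for a rigid two-link chain; the paper's extension to flexible joints and links is not reproduced here, and the angles and lengths are arbitrary example values:

# Compose 4 x 4 homogeneous transforms: rotate about z, translate along the link, repeat.
import numpy as np

def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])

def trans_x(a):
    T = np.eye(4)
    T[0, 3] = a
    return T

theta1, theta2 = np.radians(30.0), np.radians(45.0)   # joint angles (example)
L1, L2 = 0.8, 0.5                                     # link lengths in metres (example)
T = rot_z(theta1) @ trans_x(L1) @ rot_z(theta2) @ trans_x(L2)
print("end-effector position:", np.round(T[:3, 3], 4))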
Analytical aids in land management planning
David R. Betters
1978-01-01
Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...
NASA Technical Reports Server (NTRS)
Wilkenfeld, J. M.; Harlacher, B. L.; Mathews, D.
1982-01-01
A combined experimental and analytical program to develop system electrical test procedures for the qualification of spacecraft against damage produced by space-electron-induced discharges (EID) occurring on spacecraft dielectric outer surfaces is described. A review and critical evaluation of possible approaches to qualify spacecraft against space electron-induced discharges is presented. A variety of possible schemes to simulate EID electromagnetic effects produced in spacecraft was studied. These techniques form the principal element of a provisional, recommended set of test procedures for the EID qualification of spacecraft. Significant gaps in our knowledge about EID which impact the final specification of an electrical test to qualify spacecraft against EID are also identified.
Pezo, Davinson; Navascués, Beatriz; Salafranca, Jesús; Nerín, Cristina
2012-10-01
Ethyl lauroyl arginate (LAE) is a cationic tensoactive compound, soluble in water, with a wide activity spectrum against moulds and bacteria. LAE has been incorporated as an antimicrobial agent into packaging materials for food contact, and these materials are required to comply with the specific migration criteria. In this paper, an analytical procedure has been developed and optimized for the analysis of LAE in food simulants after the migration tests. It consists of the formation of an ionic pair between LAE and the inorganic complex Co(SCN)4(2-) in aqueous solution, followed by liquid-liquid extraction into a suitable organic solvent and further UV-Vis absorbance measurement. In order to evaluate possible interferences, the ionic pair has also been analyzed by high performance liquid chromatography with UV-Vis detection. Both procedures provided similar analytical characteristics, with linear ranges from 1.10 to 25.00 mg kg(-1), linearity higher than 0.9886, limits of detection and quantification of 0.33 and 1.10 mg kg(-1), respectively, accuracy better than 1% as relative error and precision better than 3.6% expressed as RSD. Optimization of the analytical techniques, thermal and chemical stability of LAE, as well as migration kinetics of LAE from experimental active packaging are reported and discussed. Copyright © 2012 Elsevier B.V. All rights reserved.
A Case Study of a Mixed Methods Study Engaged in Integrated Data Analysis
ERIC Educational Resources Information Center
Schiazza, Daniela Marie
2013-01-01
The nascent field of mixed methods research has yet to develop a cohesive framework of guidelines and procedures for mixed methods data analysis (Greene, 2008). To support the field's development of analytical frameworks, this case study reflects on the development and implementation of a mixed methods study engaged in integrated data analysis.…
Microorganisms in inorganic chemical analysis.
Godlewska-Zyłkiewicz, Beata
2006-01-01
There are innumerable strains of microbes (bacteria, yeast and fungi) that degrade or transform chemicals and compounds into simpler, safer or less toxic substances. These bioprocesses have been used for centuries in the treatment of municipal wastes, in wine, cheese and bread making, and in bioleaching and metal recovery processes. Recent literature shows that microorganisms can be also used as effective sorbents for solid phase extraction procedures. This review reveals that fundamental nonanalytical studies on the parameters and conditions of biosorption processes and on metal-biomass interactions often result in efficient analytical procedures and biotechnological applications. Some selected examples illustrate the latest developments in the biosorption of metals by microbial biomass, which have opened the door to the application of microorganisms to analyte preconcentration, matrix separation and speciation analysis.
14 CFR 34.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.
Code of Federal Regulations, 2012 CFR
2012-01-01
... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE... Turbine Engines) § 34.64 Sampling and analytical procedures for measuring gaseous exhaust emissions. The...
Identification and evaluation of software measures
NASA Technical Reports Server (NTRS)
Card, D. N.
1981-01-01
A large scale, systematic procedure for identifying and evaluating measures that meaningfully characterize one or more elements of software development is described. The background of this research, the nature of the data involved, and the steps of the analytic procedure are discussed. An example of the application of this procedure to data from real software development projects is presented. As the term is used here, a measure is a count or numerical rating of the occurrence of some property. Examples of measures include lines of code, number of computer runs, person hours expended, and degree of use of top down design methodology. Measures appeal to the researcher and the manager as a potential means of defining, explaining, and predicting software development qualities, especially productivity and reliability.
Development of analytic intermodal freight networks for use within a GIS
DOT National Transportation Integrated Search
1997-05-01
The paper discusses the practical issues involved in constructing intermodal freight networks that can be used within GIS platforms to support inter-regional freight routing and subsequent (for example, commodity flow) analysis. The procedures descri...
Investigating causes and determine repair needs to mitigate falling concrete from bridge decks.
DOT National Transportation Integrated Search
2012-09-01
This study developed a procedure to identify concrete bridge decks that are exhibiting the characteristics associated with falling concrete. Field exploratory work on reinforced concrete bridge decks was supported by analytical and laboratory inv...
Concept Development for Future Domains: A New Method of Knowledge Elicitation
2005-06-01
Procedure: U.S. Army Research Institute for the Behavioral and Social Sciences (ARI) examined methods to generate, refine, test, and validate new...generate, elaborate, refine, describe, test, and validate new Future Force concepts relating to doctrine, tactics, techniques, procedures, unit and team...System (Harvey, 1993), and the Job Element Method (Primoff & Eyde, 1988). Figure 1 provides a more comprehensive list of task analytic methods. Please see
Markiewicz, B; Sajnóg, A; Lorenc, W; Hanć, A; Komorowicz, I; Suliburska, J; Kocyłowski, R; Barałkiewicz, D
2017-11-01
Amniotic fluid is a substantial factor in the development of an embryo and fetus, because water and solutes contained in it penetrate the fetal membranes in a hydrostatic and osmotic way as well as being swallowed by the fetus. The elemental composition of amniotic fluid influences the growth and health of the fetus; therefore, analysis of amniotic fluid is important because the results would indicate abnormal levels of minerals or toxic elements. Inductively coupled plasma mass spectrometry (ICP-MS) is often used for determination of trace and ultra-trace level elements in a wide range of matrices, including biological samples, because of its unique analytical capabilities. In the case of trace and ultra-trace level analysis, detailed characteristics of the analytical procedure as well as properties of the analytical result are particularly important. The purpose of this study was to develop a new analytical procedure for multielemental analysis of 18 elements (Al, As, Ba, Ca, Cd, Co, Cr, Cu, Mg, Mn, Ni, Pb, Sb, Se, Sr, U, V and Zn) in amniotic fluid samples using ICP-MS. A dynamic reaction cell (DRC) with two reaction gases, ammonia and oxygen, was involved in the experiment to eliminate spectral interferences. Detailed validation was conducted using 3 certified reference materials (CRMs) and real amniotic fluid samples collected from patients. Repeatability for all analyzed analytes was found to range from 0.70% to 8.0%, and intermediate precision results varied from 1.3% to 15%. Trueness expressed as recovery ranged from 80% to 125%. Traceability was assured through the analyses of CRMs. Uncertainty of the results was also evaluated using a single-laboratory validation approach. The obtained expanded uncertainty (U) results for CRMs, expressed as a percentage of the concentration of an analyte, were found to be between 8.3% for V and 45% for Cd. The standard uncertainty of the precision was found to have a greater influence on the combined standard uncertainty than the trueness factor. Copyright © 2017 Elsevier B.V. All rights reserved.
Maloney, T.J.; Ludtke, A.S.; Krizman, T.L.
1994-01-01
The U.S. Geological Survey operates a quality-assurance program based on the analyses of reference samples for the National Water Quality Laboratory in Arvada, Colorado, and the Quality of Water Service Unit in Ocala, Florida. Reference samples containing selected inorganic, nutrient, and low ionic-strength constituents are prepared and disguised as routine samples. The program goal is to determine precision and bias for as many analytical methods offered by the participating laboratories as possible. The samples typically are submitted at a rate of approximately 5 percent of the annual environmental sample load for each constituent. The samples are distributed to the laboratories throughout the year. Analytical data for these reference samples reflect the quality of environmental sample data produced by the laboratories because the samples are processed in the same manner for all steps from sample login through data release. The results are stored permanently in the National Water Data Storage and Retrieval System. During water year 1991, 86 analytical procedures were evaluated at the National Water Quality Laboratory and 37 analytical procedures were evaluated at the Quality of Water Service Unit. An overall evaluation of the inorganic (major ion and trace metal) constituent data for water year 1991 indicated analytical imprecision in the National Water Quality Laboratory for 5 of 67 analytical procedures: aluminum (whole-water recoverable, atomic emission spectrometric, direct-current plasma); calcium (atomic emission spectrometric, direct); fluoride (ion-exchange chromatographic); iron (whole-water recoverable, atomic absorption spectrometric, direct); and sulfate (ion-exchange chromatographic). The results for 11 of 67 analytical procedures had positive or negative bias during water year 1991. Analytical imprecision was indicated in the determination of two of the five National Water Quality Laboratory nutrient constituents: orthophosphate as phosphorus and phosphorus. A negative or positive bias condition was indicated in three of five nutrient constituents. There was acceptable precision and no indication of bias for the 14 low ionic-strength analytical procedures tested in the National Water Quality Laboratory program and for the 32 inorganic and 5 nutrient analytical procedures tested in the Quality of Water Service Unit during water year 1991.
Acoustic emission from a growing crack
NASA Technical Reports Server (NTRS)
Jacobs, Laurence J.
1989-01-01
An analytical method is being developed to determine the signature of an acoustic emission waveform from a growing crack and the results of this analysis are compared to experimentally obtained values. Within the assumptions of linear elastic fracture mechanics, a two dimensional model is developed to examine a semi-infinite crack that, after propagating with a constant velocity, suddenly stops. The analytical model employs an integral equation method for the analysis of problems of dynamic fracture mechanics. The experimental procedure uses an interferometric apparatus that makes very localized absolute measurements with very high fidelity and without acoustically loading the specimen.
Handheld magnetic sensor for measurement of tension
NASA Astrophysics Data System (ADS)
Singal, K.; Rajamani, R.
2012-04-01
This letter develops an analytical formulation for measurement of tension in a string using a handheld sensor. By gently pushing the sensor against the string, the tension in the string can be obtained. An experimental sensor prototype is constructed to verify the analytical formulation. The centimeter-sized prototype utilizes three moving pistons and magnetic field based measurements of their positions. Experimental data show that the sensor can accurately measure tension on a bench top rig. The developed sensor could be useful in a variety of orthopedic surgical procedures, including knee replacement, hip replacement, ligament repair, shoulder stabilization, and tendon repair.
Ultrasensitive biomolecular assays with amplifying nanowire FET biosensors
NASA Astrophysics Data System (ADS)
Chui, Chi On; Shin, Kyeong-Sik; Mao, Yufei
2013-09-01
In this paper, we review our recent development and validation of the ultrasensitive electronic biomolecular assays enabled by our novel amplifying nanowire field-effect transistor (nwFET) biosensors. Our semiconductor nwFET biosensor platform technology performs extreme proximity signal amplification in the electrical domain that requires neither labeling nor enzymes nor optics. We have designed and fabricated the biomolecular assay prototypes and developed the corresponding analytical procedures. We have also confirmed their analytical performance in quantitating key protein biomarker in human serum, demonstrating an ultralow limit of detection and concurrently high output current level for the first time.
Acrylamide analysis in food by liquid chromatographic and gas chromatographic methods.
Elbashir, Abdalla A; Omar, Mei M Ali; Ibrahim, Wan Aini Wan; Schmitz, Oliver J; Aboul-Enein, Hassan Y
2014-01-01
Acrylamide (AA) is a compound classified as carcinogenic to humans by the International Agency for Research on Cancer. It was first discovered to be present in certain heated processed food by the Swedish National Food Administration (SNFA) and University of Stockholm in early 2002. The major pathway for AA formation in food is the Maillard reaction between reducing sugar and the amino acid asparagine at high temperature. Since the discovery of AA's presence in food, many analytical methods have been developed for determination of AA contents in different food matrices. Also, several studies have been conducted to develop extraction procedures for AA from difficult food matrices. AA is a small, highly polar molecule, which makes its extraction and analysis challenging. Many articles and reviews have been published dealing with AA in food. The aim of the review is to discuss AA formation in food, the factors affecting AA formation and removal, AA exposure assessment, AA extraction and cleanup from food samples, and analytical methods used in AA determination, such as high-performance liquid chromatography (HPLC) and gas chromatography (GC). Special attention is given to sample extraction and cleanup procedures and analytical techniques used for AA determination.
Ozay, Guner; Seyhan, Ferda; Yilmaz, Aysun; Whitaker, Thomas B; Slate, Andrew B; Giesbrecht, Francis
2006-01-01
The variability associated with the aflatoxin test procedure used to estimate aflatoxin levels in bulk shipments of hazelnuts was investigated. Sixteen 10 kg samples of shelled hazelnuts were taken from each of 20 lots that were suspected of aflatoxin contamination. The total variance associated with testing shelled hazelnuts was estimated and partitioned into sampling, sample preparation, and analytical variance components. Each variance component increased as aflatoxin concentration (either B1 or total) increased. With the use of regression analysis, mathematical expressions were developed to model the relationship between aflatoxin concentration and the total, sampling, sample preparation, and analytical variances. The expressions for these relationships were used to estimate the variance for any sample size, subsample size, and number of analyses for a specific aflatoxin concentration. The sampling, sample preparation, and analytical variances associated with estimating aflatoxin in a hazelnut lot at a total aflatoxin level of 10 ng/g and using a 10 kg sample, a 50 g subsample, dry comminution with a Robot Coupe mill, and a high-performance liquid chromatographic analytical method are 174.40, 0.74, and 0.27, respectively. The sampling, sample preparation, and analytical steps of the aflatoxin test procedure accounted for 99.4, 0.4, and 0.2% of the total variability, respectively.
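The percentage contributions quoted above follow directly from the reported variance components; a two-line check:

# Combine the reported variance components and reproduce each step's share of the total.
sampling, prep, analytical = 174.40, 0.74, 0.27
total = sampling + prep + analytical

for name, var in [("sampling", sampling), ("preparation", prep), ("analytical", analytical)]:
    print(f"{name}: variance {var}, share {100 * var / total:.1f}% of total")
# sampling ~99.4%, preparation ~0.4%, analytical ~0.2%, matching the text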
Surface Analysis of Nerve Agent Degradation Products by ...
This sampling and analytical procedure was developed and applied by a single laboratory to investigate nerve agent degradation products, which may persist at a contaminated site, via surface wiping followed by analytical characterization. The performance data presented demonstrate the fitness-for-purpose regarding surface analysis in that single laboratory. Surfaces (laminate, glass, galvanized steel, vinyl tile, painted drywall and treated wood) were wiped with cotton gauze wipes, sonicated, extracted with distilled water, and filtered. Samples were analyzed with direct injection electrospray ionization liquid chromatography tandem mass spectrometry (ESI-LC/MS/MS) without derivatization. Detection limit data were generated for all analytes of interest on a laminate surface. Accuracy and precision data were generated from each surface fortified with these analytes.
Code of Federal Regulations, 2013 CFR
2013-07-01
... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...
Code of Federal Regulations, 2014 CFR
2014-07-01
... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...
Code of Federal Regulations, 2011 CFR
2011-07-01
... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...
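A sketch of the method detection limit calculation generally associated with the 40 CFR part 136, appendix B style procedure excerpted above (MDL = one-sided 99% Student's t times the standard deviation of at least seven replicate spiked analyses); the replicate results below are invented and scipy is assumed available:

# Compute an MDL from replicate low-level spikes in a given matrix.
import numpy as np
from scipy import stats

replicates = np.array([0.48, 0.52, 0.45, 0.50, 0.55, 0.47, 0.51])  # ug/L, n = 7 (example)
s = replicates.std(ddof=1)
t99 = stats.t.ppf(0.99, df=len(replicates) - 1)   # one-sided 99th percentile
mdl = t99 * s
print(f"s = {s:.4f}, t = {t99:.3f}, MDL = {mdl:.3f} ug/L")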
Hanaoka, Shigeyuki; Nomura, Koji; Kudo, Shinichi
2005-09-02
Knowledge of the exact nature of the constituents of abandoned chemical weapons (ACW) is a prerequisite for their orderly destruction. Here we report the development of analytical procedures to identify diphenylchloroarsine (DA/Clark I), diphenylcyanoarsine (DC/Clark II) and related substances employed in one of the munitions known as "Red canister". Both DA and DC are relatively unstable under conventional analytical procedures without thiol derivatization. Unfortunately, however, thiol derivatization affords the same volatile organo-arsenic derivative from several different diphenylarsenic compounds, making it impossible to identify and quantify the original compounds. Further, diminishing the analytical interference caused by the celluloid powder used as a stacking material in the weapons is also essential for accurate analysis. In this study, extraction and instrumental conditions have been evaluated and an optimal protocol was determined. The analysis of Red canister samples following this protocol showed that most of the DA and DC associated with pumice had degraded to bis(diphenylarsine)oxide (BDPAO), while those associated with celluloid were dominantly degraded to diphenylarsinic acid (DPAA).
Validation of urban freeway models. [supporting datasets
DOT National Transportation Integrated Search
2015-01-01
The goal of the SHRP 2 Project L33 Validation of Urban Freeway Models was to assess and enhance the predictive travel time reliability models developed in the SHRP 2 Project L03, Analytic Procedures for Determining the Impacts of Reliability Mitigati...
Tuck, Melissa K; Chan, Daniel W; Chia, David; Godwin, Andrew K; Grizzle, William E; Krueger, Karl E; Rom, William; Sanda, Martin; Sorbara, Lynn; Stass, Sanford; Wang, Wendy; Brenner, Dean E
2009-01-01
Specimen collection is an integral component of clinical research. Specimens from subjects with various stages of cancers or other conditions, as well as those without disease, are critical tools in the hunt for biomarkers, predictors, or tests that will detect serious diseases earlier or more readily than currently possible. Analytic methodologies evolve quickly. Access to high-quality specimens, collected and handled in standardized ways that minimize potential bias or confounding factors, is key to the "bench to bedside" aim of translational research. It is essential that standard operating procedures, "the how" of creating the repositories, be defined prospectively when designing clinical trials. Small differences in the processing or handling of a specimen can have dramatic effects on analytical reliability and reproducibility, especially when multiplex methods are used. A representative working group, the Standard Operating Procedures Internal Working Group (SOPIWG), comprised of members from across the Early Detection Research Network (EDRN), was formed to develop standard operating procedures (SOPs) for various types of specimens collected and managed for our biomarker discovery and validation work. This report presents our consensus on SOPs for the collection, processing, handling, and storage of serum and plasma for biomarker discovery and validation.
ERIC Educational Resources Information Center
Graudins, Maija M.; Rehfeldt, Ruth Anne; DeMattei, Ronda; Baker, Jonathan C.; Scaglia, Fiorella
2012-01-01
Performing oral care procedures with children with autism who exhibit noncompliance can be challenging for oral care professionals. Previous research has elucidated a number of effective behavior analytic procedures for increasing compliance, but some procedures are likely to be too time consuming and expensive for community-based oral care…
Integrating Water Quality and River Rehabilitation Management - A Decision-Analytical Perspective
NASA Astrophysics Data System (ADS)
Reichert, P.; Langhans, S.; Lienert, J.; Schuwirth, N.
2009-04-01
Integrative river management involves difficult decisions about alternative measures to improve the ecological state of rivers. For this reason, it seems useful to apply knowledge from the decision sciences to support river management. We discuss how decision-analytical elements can be employed for designing an integrated river management procedure. An important aspect of this procedure is to clearly separate scientific predictions of the consequences of alternatives from objectives to be achieved by river management. The key elements of the suggested procedure are (i) the quantitative elicitation of the objectives from different stakeholder groups, (ii) the compilation of the current scientific knowledge about the consequences of the effects resulting from suggested measures in the form of a probabilistic mathematical model, and (iii) the use of these predictions and valuations to prioritize alternatives, to uncover conflicting objectives, to support the design of better alternatives, and to improve the transparency of communication about the chosen management strategy. The development of this procedure led to insights regarding necessary steps to be taken for rational decision-making in river management, to guidelines about the use of decision-analytical techniques for performing these steps, but also to new insights about the application of decision-analytical techniques in general. In particular, the consideration of the spatial distribution of the effects of measures and the potential added value of connected rehabilitated river reaches leads to favoring measures that have a positive effect beyond a single river reach. As these effects only propagate within the river network, this results in a river basin oriented management concept as a consequence of a rational decision support procedure, rather than as an a priori management paradigm. There are also limitations to the support that can be expected from the decision-analytical perspective. It will not provide the societal values that are driving prioritization in river management; it will only support their elicitation and rational use. This is particularly important for the assessment of micro-pollutants because of severe limitations in scientific knowledge of their effects on river ecosystems. This makes the influence of pollution by micro-pollutants on prioritization of measures strongly dependent on the weight of the precautionary principle relative to other societal objectives of river management.
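To make the prioritization step more tangible, the sketch below shows a weighted-additive value model of the kind commonly used in decision analysis to rank alternatives against elicited objectives; the objectives, weights, and scores are hypothetical and do not reproduce the authors' actual model.

```python
# Illustrative sketch of a weighted-additive value model for prioritizing
# river-management alternatives; objectives, weights, and scores are
# hypothetical stand-ins for elicited stakeholder preferences.
weights = {"ecological_state": 0.5, "water_quality": 0.3, "cost": 0.2}

# Single-objective value scores on a 0-1 scale (1 = best) per alternative.
alternatives = {
    "rehabilitate_reach_A": {"ecological_state": 0.8, "water_quality": 0.4, "cost": 0.3},
    "upgrade_treatment_plant": {"ecological_state": 0.3, "water_quality": 0.9, "cost": 0.5},
    "do_nothing": {"ecological_state": 0.1, "water_quality": 0.2, "cost": 1.0},
}

def overall_value(scores):
    # Weighted additive aggregation of the single-objective values.
    return sum(weights[obj] * scores[obj] for obj in weights)

ranking = sorted(alternatives, key=lambda a: overall_value(alternatives[a]), reverse=True)
for alt in ranking:
    print(f"{alt}: {overall_value(alternatives[alt]):.2f}")
```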
Wang, Lu; Qu, Haibin
2016-03-01
A method combining solid phase extraction, high performance liquid chromatography, and ultraviolet/evaporative light scattering detection (SPE-HPLC-UV/ELSD) was developed according to Quality by Design (QbD) principles and used to assay nine bioactive compounds within a botanical drug, Shenqi Fuzheng Injection. Risk assessment and a Plackett-Burman design were utilized to evaluate the impact of 11 factors on the resolutions and signal-to-noise of chromatographic peaks. Multiple regression and Pareto ranking analysis indicated that the sorbent mass, sample volume, flow rate, column temperature, evaporator temperature, and gas flow rate were statistically significant (p < 0.05) in this procedure. Furthermore, a Box-Behnken design combined with response surface analysis was employed to study the relationships between the quality of SPE-HPLC-UV/ELSD analysis and four significant factors, i.e., flow rate, column temperature, evaporator temperature, and gas flow rate. An analytical design space for SPE-HPLC-UV/ELSD was then constructed from Monte Carlo-calculated probabilities. In the presented approach, the operating parameters of sample preparation, chromatographic separation, and compound detection were investigated simultaneously. Eight elements of method validation, i.e., system-suitability tests, method robustness/ruggedness, sensitivity, precision, repeatability, linearity, accuracy, and stability, were accomplished at a selected working point. These results revealed that the QbD principles were suitable for the development of analytical procedures for samples in complex matrices. Meanwhile, the analytical quality and method robustness were validated by the analytical design space. The presented strategy provides a tutorial on the development of a robust QbD-compliant quantitative method for samples in complex matrices.
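The following sketch illustrates, under stated assumptions, how a Monte Carlo probability of meeting a chromatographic acceptance criterion could be estimated at a candidate operating point from a fitted response-surface model with uncertain coefficients; the model form, coefficient values, and resolution threshold are hypothetical, not those of the published design space.

```python
import numpy as np

rng = np.random.default_rng(0)

def predicted_resolution(flow, temp, coeffs):
    # Hypothetical quadratic response-surface model for a critical peak resolution.
    b0, b1, b2, b11, b22 = coeffs
    return b0 + b1 * flow + b2 * temp + b11 * flow**2 + b22 * temp**2

# Assumed uncertainty on the fitted coefficients (hypothetical means and sds).
coeff_mean = np.array([1.8, 0.4, 0.02, -0.3, -0.001])
coeff_sd = np.array([0.1, 0.05, 0.005, 0.05, 0.0005])

def prob_criterion_met(flow, temp, n_draws=10_000, rs_min=1.5):
    draws = rng.normal(coeff_mean, coeff_sd, size=(n_draws, coeff_mean.size))
    rs = np.array([predicted_resolution(flow, temp, c) for c in draws])
    return (rs >= rs_min).mean()  # Monte Carlo probability of meeting the criterion

print(prob_criterion_met(flow=1.0, temp=30.0))
```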
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2009-01-01
The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's Lab Master data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality control samples analyzed from July 2001 through June 2003. Results for the quality-control samples for 19 analytical procedures were evaluated for bias and precision. Control charts indicate that data for six of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, chloride, magnesium, nitrate (ion chromatography), potassium, and sodium. The calcium procedure was biased throughout the analysis period for the high-concentration sample, but was within control limits. The total monomeric aluminum and fluoride procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The total aluminum, pH, specific conductance, and sulfate procedures were biased for the high-concentration and low-concentration samples, but were within control limits. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 16 of 18 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for the dissolved organic carbon or specific conductance procedures. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 18 of the 21 analytes. At least 90 percent of the samples met data-quality objectives for all procedures except total monomeric aluminum (83 percent of samples met objectives), total aluminum (76 percent of samples met objectives), ammonium (73 percent of samples met objectives), dissolved organic carbon (86 percent of samples met objectives), and nitrate (81 percent of samples met objectives). The data-quality objective was not met for the nitrite procedure. Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated satisfactory or above data quality over the time period, with most performance ratings for each sample in the good-to-excellent range. The N-sample (nutrient constituents) analysis had one unsatisfactory rating for the ammonium procedure in one study. The T-sample (trace constituents) analysis had one unsatisfactory rating for the magnesium procedure and one marginal rating for the potassium procedure in one study and one unsatisfactory rating for the sodium procedure in another. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 90 percent of the samples met data-quality objectives for 10 of the 14 analytes; the exceptions were acid-neutralizing capacity, ammonium, dissolved organic carbon, and sodium. 
Data-quality objectives were not met in 37 percent of samples analyzed for acid-neutralizing capacity, 28 percent of samples analyzed for dissolved organic carbon, and 30 percent of samples analyzed for sodium. Results indicate a positive bias for the ammonium procedure in one study and a negative bias in another. Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 90 percent of the samples analyzed for calcium, chloride, magnesium, pH, potassium, and sodium. Data-quality objectives were met by 78 percent of
Lombardi, Giovanni; Sansoni, Veronica; Banfi, Giuseppe
2017-08-01
In the last few years, a growing number of molecules have been associated with an endocrine function of the skeletal muscle. Circulating myokine levels, in turn, have been associated with several pathophysiological conditions, including cardiovascular ones. However, data from different studies are often not completely comparable or even discordant. This may be due, at least in part, to the whole set of conditions related to the preparation of the patient prior to blood sampling, the blood sampling procedure, and sample processing and/or storage. This entire process constitutes the pre-analytical phase. The importance of the pre-analytical phase is often overlooked, yet in routine diagnostics 70% of errors occur in this phase. Moreover, errors made during the pre-analytical phase are carried over into the analytical phase and affect the final output. In research, for example, when samples are collected over a long time and by different laboratories, standardized procedures for sample collection and correct procedures for sample storage are essential. In this review, we discuss the pre-analytical variables potentially affecting the measurement of myokines with cardiovascular functions.
Salinas, Maria; Lopez-Garrigos, Maite; Flores, Emilio; Leiva-Salinas, Carlos
2018-06-01
To study the urinalysis request, pre-analytical sample conditions, and analytical procedures. Laboratories were asked to provide the number of primary care urinalyses requested, and to fill out a questionnaire regarding pre-analytical conditions and analytical procedures. 110 laboratories participated in the study. 232.5 urinalyses/1,000 inhabitants were reported. 75.4% used the first morning urine. The sample reached the laboratory in less than 2 hours in 18.8%, between 2 - 4 hours in 78.3%, and between 4 - 6 hours in the remaining 2.9%. 92.5% combined the use of test strip and particle analysis, and only 7.5% used the strip exclusively. All participants except one performed automated particle analysis depending on strip results; in 16.2% the procedure was only manual. Urinalysis was highly requested. There was a lack of compliance with guidelines regarding time between micturition and analysis that usually involved the combination of strip followed by particle analysis.
NASA Technical Reports Server (NTRS)
Giles, G. L.; Rogers, J. L., Jr.
1982-01-01
The implementation includes a generalized method for specifying element cross-sectional dimensions as design variables that can be used in analytically calculating derivatives of output quantities from static stress, vibration, and buckling analyses for both membrane and bending elements. Limited sample results for static displacements and stresses are presented to indicate the advantages of analytically calculating response derivatives compared to finite difference methods. Continuing developments to implement these procedures into an enhanced version of the system are also discussed.
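A toy comparison of an analytically calculated derivative with a forward finite-difference estimate is sketched below; the response function is hypothetical and serves only to illustrate why analytic sensitivities avoid step-size error.

```python
# Toy comparison of an analytic derivative with a forward finite-difference
# estimate; the response function is hypothetical and only illustrates the
# trade-off between analytic and finite-difference sensitivities.
def stress(t):
    # Hypothetical response: bending stress of a member versus wall thickness t.
    return 250.0 / t**2

def stress_analytic_derivative(t):
    return -500.0 / t**3  # exact d(stress)/dt

def stress_fd_derivative(t, h=1e-3):
    return (stress(t + h) - stress(t)) / h  # forward finite difference

t = 0.5
print("analytic:", stress_analytic_derivative(t))
print("finite difference:", stress_fd_derivative(t))
```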
Indoor Exposure Product Testing Protocols Version 2
EPA’s Office of Pollution Prevention and Toxics (OPPT) has developed a set of ten indoor exposure testing protocols intended to provide information on the purpose of the testing, general description of the sampling and analytical procedures, and references for tests that will be ...
Recent developments in urinalysis of metabolites of new psychoactive substances using LC-MS.
Peters, Frank T
2014-08-01
In the last decade, an ever-increasing number of new psychoactive substances (NPSs) have appeared on the recreational drug market. To account for this development, analytical toxicologists have to continuously adapt their methods to encompass the latest NPSs. Urine is the preferred biological matrix for screening analysis in different areas of analytical toxicology. However, the development of urinalysis procedures for NPSs is complicated by the fact that generally little or no information on urinary excretion patterns of such drugs exists when they first appear on the market. Metabolism studies are therefore a prerequisite in the development of urinalysis methods for NPSs. In this article, the literature on the urinalysis of NPS metabolites will be reviewed, focusing on articles published after 2008.
Ludtke, Amy S.; Woodworth, Mark T.; Marsh, Philip S.
2000-01-01
The U.S. Geological Survey operates a quality-assurance program based on the analyses of reference samples for two laboratories: the National Water Quality Laboratory and the Quality of Water Service Unit. Reference samples that contain selected inorganic, nutrient, and low-level constituents are prepared and submitted to the laboratory as disguised routine samples. The program goal is to estimate precision and bias for as many analytical methods offered by the participating laboratories as possible. Blind reference samples typically are submitted at a rate of 2 to 5 percent of the annual environmental-sample load for each constituent. The samples are distributed to the laboratories throughout the year. The reference samples are subject to the identical laboratory handling, processing, and analytical procedures as those applied to environmental samples and, therefore, have been used as an independent source to verify bias and precision of laboratory analytical methods and ambient water-quality measurements. The results are stored permanently in the National Water Information System and the Blind Sample Project's data base. During water year 1998, 95 analytical procedures were evaluated at the National Water Quality Laboratory and 63 analytical procedures were evaluated at the Quality of Water Service Unit. An overall evaluation of the inorganic and low-level constituent data for water year 1998 indicated 77 of 78 analytical procedures at the National Water Quality Laboratory met the criteria for precision. Silver (dissolved, inductively coupled plasma-mass spectrometry) was determined to be imprecise. Five of 78 analytical procedures showed bias throughout the range of reference samples: chromium (dissolved, inductively coupled plasma-atomic emission spectrometry), dissolved solids (dissolved, gravimetric), lithium (dissolved, inductively coupled plasma-atomic emission spectrometry), silver (dissolved, inductively coupled plasma-mass spectrometry), and zinc (dissolved, inductively coupled plasma-mass spectrometry). At the National Water Quality Laboratory during water year 1998, lack of precision was indicated for 2 of 17 nutrient procedures: ammonia as nitrogen (dissolved, colorimetric) and orthophosphate as phosphorus (dissolved, colorimetric). Bias was indicated throughout the reference sample range for ammonia as nitrogen (dissolved, colorimetric, low level) and nitrate plus nitrite as nitrogen (dissolved, colorimetric, low level). All analytical procedures tested at the Quality of Water Service Unit during water year 1998 met the criteria for precision. One of the 63 analytical procedures indicated a bias throughout the range of reference samples: aluminum (whole-water recoverable, inductively coupled plasma-atomic emission spectrometry, trace).
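The sketch below illustrates, with hypothetical numbers, how bias and precision might be flagged from blind reference-sample results against a most probable value and a control limit; it is not the Blind Sample Project's actual evaluation code.

```python
# Hypothetical blind reference-sample results (mg/L) for one constituent,
# its most probable value (MPV), and an assumed control limit.
from statistics import mean, stdev

results = [2.11, 2.08, 2.15, 2.05, 2.20, 2.12]
mpv = 2.00
control_limit = 0.25  # maximum acceptable deviation from the MPV (assumed)

bias = mean(results) - mpv                     # positive value = high bias
precision = stdev(results)                     # spread of the blind results
within_limits = all(abs(r - mpv) <= control_limit for r in results)

print(f"bias: {bias:+.3f} mg/L, precision (s): {precision:.3f} mg/L, "
      f"all results within control limits: {within_limits}")
```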
Durante, Caterina; Baschieri, Carlo; Bertacchini, Lucia; Bertelli, Davide; Cocchi, Marina; Marchetti, Andrea; Manzini, Daniela; Papotti, Giulia; Sighinolfi, Simona
2015-04-15
Geographical origin and authenticity of food are topics of interest for both consumers and producers. Among the different indicators used for traceability studies, the ⁸⁷Sr/⁸⁶Sr isotopic ratio has provided excellent results. In this study, two analytical approaches for wine sample pre-treatment, microwave and low-temperature mineralisation, were investigated to develop an accurate and precise analytical method for ⁸⁷Sr/⁸⁶Sr determination. The two procedures led to comparable results (paired t-test, with t
Multi-ion detection by one-shot optical sensors using a colour digital photographic camera.
Lapresta-Fernández, Alejandro; Capitán-Vallvey, Luis Fermín
2011-10-07
The feasibility and performance of a procedure to evaluate previously developed one-shot optical sensors as single and selective analyte sensors for potassium, magnesium and hardness are presented. The procedure uses a conventional colour digital photographic camera as the detection system for simultaneous multianalyte detection. A 6.0 megapixel camera was used, and the procedure describes how it is possible to quantify potassium, magnesium and hardness simultaneously from the images captured, using multianalyte one-shot sensors based on ionophore-chromoionophore chemistry and employing the colour information computed from a defined region of interest on the sensing membrane. One of the colour channels in the red, green, blue (RGB) colour space is used to build the analytical parameter, the effective degree of protonation (1-α_eff), in good agreement with the theoretical model. Linearization of the sigmoidal response function improves the limit of detection (LOD) and extends the analytical range in all cases studied: the LOD improved from 5.4 × 10⁻⁶ to 2.7 × 10⁻⁷ M for potassium, from 1.4 × 10⁻⁴ to 2.0 × 10⁻⁶ M for magnesium, and from 1.7 to 2.0 × 10⁻² mg L⁻¹ of CaCO₃ for hardness. The method's precision, expressed as the relative standard deviation (RSD%), ranged from 2.4 to 7.6 for potassium, from 6.8 to 7.8 for magnesium, and from 4.3 to 7.8 for hardness. The procedure was applied to the simultaneous determination of potassium, magnesium and hardness using multianalyte one-shot sensors in different types of waters and beverages in order to cover the entire application range, statistically validating the results against atomic absorption spectrometry as the reference procedure. Accordingly, this paper demonstrates the possibility of using a conventional digital camera as an analytical device to measure this type of one-shot sensor based on ionophore-chromoionophore chemistry instead of conventional lab instrumentation.
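A minimal sketch of the image-evaluation idea is given below, assuming the analytical parameter is obtained by averaging one RGB channel over a region of interest and normalizing it between fully protonated and fully deprotonated membrane references; the normalization formula, image, and reference values are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

def channel_mean(image_rgb, roi):
    # Mean of the chosen colour channel (0=R, 1=G, 2=B) over a rectangular ROI.
    r0, r1, c0, c1, channel = roi
    return float(image_rgb[r0:r1, c0:c1, channel].mean())

def effective_protonation(sample, fully_protonated, fully_deprotonated):
    # Assumed normalization of the channel signal between the two membrane
    # reference states, giving (1 - alpha_eff) on a 0-1 scale.
    return (sample - fully_deprotonated) / (fully_protonated - fully_deprotonated)

# Hypothetical 8-bit image and ROI (rows 100-150, cols 200-250, green channel).
image = np.random.default_rng(1).integers(0, 256, size=(480, 640, 3), dtype=np.uint8)
roi = (100, 150, 200, 250, 1)
signal = channel_mean(image, roi)
print(effective_protonation(signal, fully_protonated=200.0, fully_deprotonated=60.0))
```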
Characterization of structural connections using free and forced response test data
NASA Technical Reports Server (NTRS)
Lawrence, Charles; Huckelbridge, Arthur A.
1989-01-01
The accurate prediction of system dynamic response has often been limited by deficiencies in existing capabilities to characterize connections adequately. Connections between structural components are often mechanically complex and difficult to model accurately by analytical means. Improved analytical models for connections are needed to improve system dynamic predictions. A procedure for identifying physical connection properties from free and forced response test data is developed and then verified using a system having both a linear and a nonlinear connection. Connection properties are computed in terms of physical parameters so that the physical characteristics of the connections can be better understood, in addition to providing improved input for the system model. The identification procedure is applicable to multi-degree-of-freedom systems and does not require that the test data be measured directly at the connection locations.
A Methodology for Conducting Integrative Mixed Methods Research and Data Analyses
Castro, Felipe González; Kellison, Joshua G.; Boyd, Stephen J.; Kopak, Albert
2011-01-01
Mixed methods research has gained visibility within the last few years, although limitations persist regarding the scientific caliber of certain mixed methods research designs and methods. The need exists for rigorous mixed methods designs that integrate various data analytic procedures for a seamless transfer of evidence across qualitative and quantitative modalities. Such designs can offer the strength of confirmatory results drawn from quantitative multivariate analyses, along with “deep structure” explanatory descriptions as drawn from qualitative analyses. This article presents evidence generated from over a decade of pilot research in developing an integrative mixed methods methodology. It presents a conceptual framework and methodological and data analytic procedures for conducting mixed methods research studies, and it also presents illustrative examples from the authors' ongoing integrative mixed methods research studies. PMID:22167325
ERIC Educational Resources Information Center
Fisher, James E.; Sealey, Ronald W.
The study describes the analytical pragmatic structure of concepts and applies this structure to the legal concept of procedural due process. This structure consists of form, purpose, content, and function. The study conclusions indicate that the structure of the concept of procedural due process, or any legal concept, is not the same as the…
Silvestre, Daniel Menezes; Nomura, Cassiana Seimi
2013-07-03
The development of methods for the direct determination of Al, Cd, and Pb in rice by SS-GF AAS is presented. Optimization of the heating program, together with the use of an adequate chemical modifier containing Pd + Mg, allowed direct analysis against aqueous calibrations. The LOD values obtained were 114.0, 3.0, and 16.0 μg kg⁻¹ for Al, Cd, and Pb, respectively. Important parameters associated with solid sampling analysis were investigated, such as minimum and maximum sample mass and analyte segregation. Seventeen rice samples available in São Paulo City were analyzed, and all of them presented analyte mass fractions below the maximum allowed by legislation. The influence of rice washing and of the cooking procedure was also investigated. The washing procedure diminished the Al and Pb total mass fractions, indicating exogenous grain contamination. The cooking procedure diminished the Cd total mass fraction. Cooking rice in an aluminum container did not cause a significant increase in the Al mass fraction in the rice, indicating no translocation of this element from container to food. In general, coarse rice presented higher levels of Al than polished or parboiled rice.
Performance optimization of helicopter rotor blades
NASA Technical Reports Server (NTRS)
Walsh, Joanne L.
1991-01-01
As part of a center-wide activity at NASA Langley Research Center to develop multidisciplinary design procedures by accounting for discipline interactions, a performance design optimization procedure is developed. The procedure optimizes the aerodynamic performance of rotor blades by selecting the point of taper initiation, root chord, taper ratio, and maximum twist which minimize hover horsepower while not degrading forward flight performance. The procedure uses HOVT (a strip theory momentum analysis) to compute the horsepower required for hover and the comprehensive helicopter analysis program CAMRAD to compute the horsepower required for forward flight and maneuver. The optimization algorithm consists of the general purpose optimization program CONMIN and approximate analyses. Sensitivity analyses consisting of derivatives of the objective function and constraints are carried out by forward finite differences. The procedure is applied to a test problem which is an analytical model of a wind tunnel model of a utility rotor blade.
NASA Technical Reports Server (NTRS)
Martin, Carl J., Jr.
1996-01-01
This report describes a structural optimization procedure developed for use with the Engineering Analysis Language (EAL) finite element analysis system. The procedure is written primarily in the EAL command language. Three external processors written in FORTRAN generate equivalent stiffnesses and evaluate stress and local buckling constraints for the sections. Several built-up structural sections were coded into the design procedures. These structural sections were selected for use in aircraft design but are suitable for other applications. Sensitivity calculations use the semi-analytic method, and an extensive effort has been made to increase the execution speed and reduce the storage requirements. An approximate sensitivity update method is also included, which can significantly reduce computational time. The optimization is performed by an implementation of the MINOS V5.4 linear programming routine in a sequential linear programming procedure.
Cognitive Task Analysis of En Route Air Traffic Control: Model Extension and Validation.
ERIC Educational Resources Information Center
Redding, Richard E.; And Others
Phase II of a project extended data collection and analytic procedures to develop a model of expertise and skill development for en route air traffic control (ATC). New data were collected by recording the Dynamic Simulator (DYSIM) performance of five experts with a work overload problem. Expert controllers were interviewed in depth for mental…
ERIC Educational Resources Information Center
Trivette, Carol M.; Dunst, Carl J.; Hamby, Deborah W.
2010-01-01
The extent to which the influences of family-systems intervention practices could be traced to variations in parent-child interactions and child development was investigated by meta-analytic structural equation modeling (MASEM). MASEM is a procedure for producing a weighted pooled correlation matrix and fitting a structural equation model to the…
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2009-01-01
The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's Lab Master data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality control samples analyzed from July 2005 through June 2007. Results for the quality-control samples for 19 analytical procedures were evaluated for bias and precision. Control charts indicate that data for eight of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: total aluminum, calcium, magnesium, nitrate (colorimetric method), potassium, silicon, sodium, and sulfate. Eight of the analytical procedures were biased throughout the analysis period for the high-concentration sample, but were within control limits; these procedures were: total aluminum, calcium, dissolved organic carbon, chloride, nitrate (ion chromatograph), potassium, silicon, and sulfate. The magnesium and pH procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The acid-neutralizing capacity, total monomeric aluminum, nitrite, and specific conductance procedures were biased for the high-concentration and low-concentration samples, but were within control limits. Results from the filter-blank and analytical-blank analyses indicated that the procedures for 16 of 17 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for dissolved organic carbon. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 18 of the 21 analytes. At least 93 percent of the samples met data-quality objectives for all analytes except acid-neutralizing capacity (85 percent of samples met objectives), total monomeric aluminum (83 percent of samples met objectives), total aluminum (85 percent of samples met objectives), and chloride (85 percent of samples met objectives). The ammonium and total dissolved nitrogen did not meet the data-quality objectives. Results of the USGS interlaboratory Standard Reference Sample (SRS) Project met the Troy Laboratory data-quality objectives for 87 percent of the samples analyzed. The P-sample (low-ionic-strength constituents) analysis had two outliers each in two studies. The T-sample (trace constituents) analysis and the N-sample (nutrient constituents) analysis had one outlier each in two studies. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 85 percent of the samples met data-quality objectives for 11 of the 14 analytes; the exceptions were acid-neutralizing capacity, total aluminum and ammonium. Data-quality objectives were not met in 41 percent of samples analyzed for acid-neutralizing capacity, 50 percent of samples analyzed for total aluminum, and 44 percent of samples analyzed for ammonium. 
Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 86 percent of the samples analyzed for calcium, magnesium, pH, potassium, and sodium. Data-quality objectives were met by 76 percent of the samples analyzed for chloride, 80 percent of the samples analyzed for specific conductance, and 77 percent of the samples analyzed for sulfate.
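As an illustration of the triplicate-precision check used throughout these quality-assurance reports, the sketch below computes a coefficient of variation and compares it with an assumed data-quality objective; the analyte values and the 10 percent objective are hypothetical.

```python
from statistics import mean, stdev

# Hypothetical triplicate results (mg/L) for one analyte and an assumed
# data-quality objective expressed as a maximum coefficient of variation.
triplicate = [4.02, 4.10, 3.95]
cv_objective_percent = 10.0

cv_percent = 100.0 * stdev(triplicate) / mean(triplicate)
meets_objective = cv_percent <= cv_objective_percent
print(f"CV = {cv_percent:.1f}% -> meets objective: {meets_objective}")
```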
Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2012-10-24
This plan incorporates U.S. Department of Energy (DOE) Office of Legacy Management (LM) standard operating procedures (SOPs) into environmental monitoring activities and will be implemented at all sites managed by LM. This document provides detailed procedures for the field sampling teams so that samples are collected in a consistent and technically defensible manner. Site-specific plans (e.g., long-term surveillance and maintenance plans, environmental monitoring plans) document background information and establish the basis for sampling and monitoring activities. Information will be included in site-specific tabbed sections to this plan, which identify sample locations, sample frequencies, types of samples, field measurements, and associated analytes for each site. Additionally, within each tabbed section, program directives will be included, when developed, to establish additional site-specific requirements to modify or clarify requirements in this plan as they apply to the corresponding site. A flowchart detailing project tasks required to accomplish routine sampling is displayed in Figure 1. LM environmental procedures are contained in the Environmental Procedures Catalog (LMS/PRO/S04325), which incorporates American Society for Testing and Materials (ASTM), DOE, and U.S. Environmental Protection Agency (EPA) guidance. Specific procedures used for groundwater and surface water monitoring are included in Appendix A. If other environmental media are monitored, SOPs used for air, soil/sediment, and biota monitoring can be found in the site-specific tabbed sections in Appendix D or in site-specific documents. The procedures in the Environmental Procedures Catalog are intended as general guidance and require additional detail from planning documents in order to be complete; the following sections fulfill that function and specify additional procedural requirements to form SOPs. Routine revision of this Sampling and Analysis Plan will be conducted annually at the beginning of each fiscal year when attachments in Appendix D, including program directives and sampling location/analytical tables, will be reviewed by project personnel and updated. The sampling location/analytical tables in Appendix D, however, may have interim updates according to project direction that are not reflected in this plan. Deviations from location/analytical tables in Appendix D prior to sampling will be documented in project correspondence (e.g., startup letters). If significant changes to other aspects of this plan are required before the annual update, then the plan will be revised as needed.
Molecularly imprinted polymers as selective adsorbents for ambient plasma mass spectrometry.
Cegłowski, Michał; Smoluch, Marek; Reszke, Edward; Silberring, Jerzy; Schroeder, Grzegorz
2017-05-01
The application of molecularly imprinted polymers (MIPs) as molecular scavengers for ambient plasma ionization mass spectrometry has been reported for the first time. MIPs were synthesized using methacrylic acid as functional monomer; nicotine, propyphenazone, or methylparaben as templates; ethylene glycol dimethacrylate as a cross-linker; and 2,2'-azobisisobutyronitrile as polymerization initiator. To perform ambient plasma ionization experiments, a setup consisting of a heated crucible, a flowing atmospheric-pressure afterglow (FAPA) plasma ion source, and a quadrupole ion trap mass spectrometer was used. The heated crucible with programmable temperature allows desorption of the analytes from the MIP structure, which results in their direct introduction into the ion stream. Limits of detection, linearity of the proposed analytical procedure, and selectivities were determined for three analytes: nicotine, propyphenazone, and methylparaben. The analytes were chosen from various classes of organic compounds to show the feasibility of the analytical procedure. The limits of detection (LODs) were 10 nM, 10 μM, and 0.5 μM for nicotine, propyphenazone, and methylparaben, respectively. In comparison with measurements performed on the non-imprinted polymers, the LODs were improved by at least one order of magnitude, owing to preconcentration of the sample and reduction of the background noise that contributes to signal suppression. The described procedure showed linearity over a broad range of concentrations. The overall time of a single analysis is short, requiring ca. 5 min. The developed technique was applied to the determination of nicotine, propyphenazone, and methylparaben in spiked real-life samples, with recoveries of 94.6-98.4%. The proposed method is rapid, sensitive, and accurate, providing a new option for the detection of small organic compounds in various samples. Graphical abstract: the experimental setup used for analysis.
Guan, Fuyu; Robinson, Mary A
2017-09-08
The ability to analyze biological samples for multitudinous exogenous peptides with a single analytical method is desired for doping control in horse racing. The key to achieving this goal is the capability of extracting all target peptides from the sample matrix. In the present study, the theory of mixed-mode solid-phase extraction (SPE) of peptides from plasma is described, and a generic mixed-mode SPE procedure has been developed for recovering multitudinous exogenous peptides, with remarkable sequence diversity, from equine plasma and urine in a single procedure. Both the theory and the developed SPE procedure have led to the development of a novel analytical method for comprehensive detection of multitudinous bioactive peptides in equine plasma and urine using liquid chromatography coupled to high resolution mass spectrometry (LC-HRMS). Thirty-nine bioactive peptides were extracted with a strong anion-exchange mixed-mode SPE sorbent, separated on a reversed-phase C18 column, and detected by HRMS and data-dependent tandem mass spectrometry. The limit of detection (LOD) was 10-50 pg mL⁻¹ in plasma for most of the peptides and 100 pg mL⁻¹ for the remaining ones. For urine, the LOD was 20-400 pg mL⁻¹ for most of the peptides and 1-4 ng mL⁻¹ for the others. In vitro degradation of the peptides in equine plasma and urine was examined at ambient temperature; the peptides, except those with a D-amino acid at position 2, were unstable not only in plasma but also in urine. The developed method was successful in the analysis of plasma and urine samples from horses administered dermorphin. Additionally, dermorphin metabolites were identified in the absence of reference standards. The developed SPE procedure and LC-HRMS method can theoretically detect virtually all peptides present at a sufficient concentration in a sample. New peptides can be readily included in the method without method re-development. The developed method also generates data that can be retrospectively analyzed for peptides unknown at the time of sample analysis. It is, to the authors' knowledge, the first generic analytical method for comprehensive detection of multitudinous exogenous peptides in biological samples. Copyright © 2017 Elsevier B.V. All rights reserved.
Kuczynska, Paulina; Jemiola-Rzeminska, Malgorzata
2017-01-01
Two diatom-specific carotenoids are engaged in the diadinoxanthin cycle, an important mechanism which protects these organisms against photoinhibition caused by absorption of excessive light energy. A high-performance and economical four-step procedure for the isolation and purification of diadinoxanthin and diatoxanthin from the marine diatom Phaeodactylum tricornutum has been developed. It is based on the use of commonly available materials and does not require advanced technology. Extraction of pigments, saponification, separation by partition, and open column chromatography, which comprise the complete experimental procedure, can be performed within 2 days. This method allows HPLC-grade diadinoxanthin and diatoxanthin of 99 % purity or more to be obtained, and the efficiency was estimated to be 63 % for diadinoxanthin and 73 % for diatoxanthin. Carefully selected diatom culture conditions as well as analytical ones ensure highly reproducible performance. The protocol can be used to isolate and purify the diadinoxanthin-cycle pigments on both analytical and preparative scales.
Lupu, Stelian; Lete, Cecilia; Balaure, Paul Cătălin; Caval, Dan Ion; Mihailciuc, Constantin; Lakard, Boris; Hihn, Jean-Yves; del Campo, Francisco Javier
2013-01-01
Bio-composite coatings consisting of poly(3,4-ethylenedioxythiophene) (PEDOT) and tyrosinase (Ty) were successfully electrodeposited on conventional-size gold (Au) disk electrodes and microelectrode arrays using sinusoidal voltages. Electrochemical polymerization of the corresponding monomer was carried out in the presence of various Ty amounts in aqueous buffered solutions. The bio-composite coatings prepared using sinusoidal voltages and potentiostatic electrodeposition methods were compared in terms of morphology, electrochemical properties, and biocatalytic activity towards various analytes. The amperometric biosensors were tested in dopamine (DA) and catechol (CT) electroanalysis in aqueous buffered solutions. The analytical performance of the developed biosensors was investigated in terms of linear response range, detection limit, sensitivity, and repeatability. A semi-quantitative multi-analyte procedure for the simultaneous determination of DA and CT was developed. The amperometric biosensor prepared using sinusoidal voltages showed much better analytical performance. The Au disk biosensor obtained with a 50 mV alternating voltage amplitude displayed a linear response for DA concentrations ranging from 10 to 300 μM, with a detection limit of 4.18 μM. PMID:23698270
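To illustrate how figures of merit such as sensitivity and detection limit can be derived from an amperometric calibration, the sketch below fits a straight line by least squares and applies the common 3.3*s/slope convention; the concentrations, currents, and convention chosen are assumptions, not necessarily the authors' exact procedure.

```python
import numpy as np

# Hypothetical amperometric calibration data for dopamine.
conc_uM = np.array([10, 50, 100, 150, 200, 250, 300], dtype=float)
current_nA = np.array([1.9, 9.7, 19.5, 29.8, 40.1, 49.6, 60.2])

slope, intercept = np.polyfit(conc_uM, current_nA, 1)   # sensitivity = slope
residuals = current_nA - (slope * conc_uM + intercept)
s_residual = residuals.std(ddof=2)                      # standard error of the fit

lod_uM = 3.3 * s_residual / slope  # common 3.3*s/slope convention (assumed here)
print(f"sensitivity: {slope:.3f} nA/uM, LOD ~ {lod_uM:.2f} uM")
```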
Kuhlenbeck, Debbie L; Eichold, Thomas H; Hoke, Steven H; Baker, Timothy R; Mensen, Robert; Wehmeyer, Kenneth R
2005-01-01
An on-line liquid chromatography/tandem mass spectrometry (LC-MS/MS) procedure, using the Prospekt-2 system, was developed and used for the determination of the levels of the active ingredients of cough/cold medications in human plasma matrix. The experimental configuration allows direct plasma injection by performing on-line solid phase extraction (SPE) on small cartridge columns prior to elution of the analyte(s) onto the analytical column and subsequent MS/MS detection. The quantitative analysis of three analytes with differing polarities, dextromethorphan (DEX), dextrorphan (DET) and guaifenesin (GG), in human plasma presented a significant challenge. Using stable-isotope-labeled internal standards for each analyte, the Prospekt-2 on-line methodology was evaluated for sensitivity, suppression, accuracy, precision, linearity, analyst time, analysis time, cost, carryover and ease of use. The lower limit of quantitation for the on-line SPE procedure for DEX, DET and GG was 0.05, 0.05 and 5.0 ng mL(-1), respectively, using a 0.1 mL sample volume. The linear range for DEX and DET was 0.05-50 ng mL(-1) and was 5-5,000 ng mL(-1) for GG. Accuracy and precision data for five different levels of QC samples were collected over three separate days. Accuracy ranged from 90% to 112% for all three analytes, while the precision, as measured by the %RSD, ranged from 1.5% to 16.0%.
Contamination in food from packaging material.
Lau, O W; Wong, S K
2000-06-16
Packaging has become an indispensable element in the food manufacturing process, and different types of additives, such as antioxidants, stabilizers, lubricants, anti-static and anti-blocking agents, have been developed to improve the performance of polymeric packaging materials. Recently, packaging has been found to represent a source of contamination itself through the migration of substances from the packaging into food. Various analytical methods have been developed to analyze these migrants in foodstuffs, and migration evaluation procedures based on theoretical prediction of migration from plastic food-contact materials have also been introduced recently. In this paper, the regulatory control, analytical methodology, factors affecting migration, and migration evaluation are reviewed.
Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica
2016-06-01
To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of the dissipation and bioconversion. Methods with low recovery yields could give a false idea of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its main three metabolites, and chlorpyrifos in increasingly complex matrices where the bioconvertor basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat and soil. The methodology for the analysis of the first evaluated matrix has already been reported. The methodologies developed for the other two systems are presented in this work. The targeted analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Base) medium with acetonitrile using shaker-assisted extraction. The salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25 mg kg⁻¹, yielding recoveries between 72% and 109% and RSDs <11% in all cases. The application of this methodology proved that A. biennis is able to dissipate 94% of endosulfan and 87% of chlorpyrifos after 90 days. Having assessed that A. biennis growing over bran can metabolize the studied pesticides, the next step faced was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat and 25% soil together with fungal mycelium. Of the different procedures assayed, only ultrasound-assisted extraction with ethyl acetate allowed recoveries between 80% and 110% with RSDs <18%. Linearity, recovery, precision, matrix effect and LODs/LOQs of each method were studied for all the analytes: endosulfan isomers (α & β) and its metabolites (endosulfan sulfate, ether and diol), as well as chlorpyrifos. In the first laboratory evaluation of these biobeds, endosulfan was bioconverted up to 87% and chlorpyrifos more than 79% after 27 days. Copyright © 2016 Elsevier B.V. All rights reserved.
This standard operating procedure describes the method used for preparing internal standard, surrogate recovery standard and calibration standard solutions for neutral analytes used for gas chromatography/mass spectrometry analysis.
This standard operating procedure describes the method used for the determination of target analytes in sample extracts and related quality assurance/quality control sample extracts generated in the CTEPP study.
An analytic survey of signing inventory procedures in Virginia.
DOT National Transportation Integrated Search
1972-01-01
An analytic survey was made of the highway signing and sign-maintenance inventory systems in each of the districts of the Virginia Department of Highways. Of particular concern in reviewing the procedures was the format of the inventory forms, the ap...
The Standardized Letter of Recommendation: Implications for Selection. Research Report. ETS RR-07-38
ERIC Educational Resources Information Center
Liu, Ou Lydia; Minsky, Jennifer; Ling, Guangming; Kyllonen, Patrick
2007-01-01
In an effort to standardize academic application procedures, the Standardized Letter of Recommendation (SLR) was developed to capture important cognitive and noncognitive qualities of graduate school candidates. The SLR consists of seven scales ("knowledge," "analytical skills," "communication skills,"…
Atomic Absorption, Atomic Fluorescence, and Flame Emission Spectrometry.
ERIC Educational Resources Information Center
Horlick, Gary
1984-01-01
This review is presented in six sections. Sections focus on literature related to: (1) developments in instrumentation, measurement techniques, and procedures; (2) performance studies of flames and electrothermal atomizers; (3) applications of atomic absorption spectrometry; (4) analytical comparisons; (5) atomic fluorescence spectrometry; and (6)…
A systematic scanning electron microscope analytical technique has been developed to examine granular activated carbon used as a medium for biomass attachment in liquid waste treatment. The procedure allows for the objective monitoring, comparing, and troubleshooting of combined ...
Automated Predictive Big Data Analytics Using Ontology Based Semantics.
Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A
2015-10-01
Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.
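As a toy stand-in for the idea of inference-driven model selection, the sketch below matches declared dataset characteristics against technique requirements held in a simple dictionary; it is only an illustration of the concept, not the Analytics Ontology or the SCALATION framework.

```python
# Toy stand-in for semantics-assisted model selection: declared dataset
# characteristics are matched against technique requirements. This only
# illustrates the idea, not the Analytics Ontology or SCALATION itself.
techniques = {
    "multiple_linear_regression": {"response": "continuous", "min_rows_per_feature": 10},
    "logistic_regression": {"response": "binary", "min_rows_per_feature": 10},
    "ridge_regression": {"response": "continuous", "min_rows_per_feature": 2},
}

def suggest(dataset):
    suggestions = []
    for name, req in techniques.items():
        enough_rows = dataset["rows"] >= req["min_rows_per_feature"] * dataset["features"]
        if dataset["response"] == req["response"] and enough_rows:
            suggestions.append(name)
    return suggestions

print(suggest({"response": "continuous", "rows": 500, "features": 60}))
```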
System identification of analytical models of damped structures
NASA Technical Reports Server (NTRS)
Fuh, J.-S.; Chen, S.-Y.; Berman, A.
1984-01-01
A procedure is presented for identifying linear, nonproportionally damped systems. The system damping is assumed to be representable by a real symmetric matrix. Analytical mass, stiffness, and damping matrices which constitute an approximate representation of the system are assumed to be available. Also given are an incomplete set of measured natural frequencies, damping ratios, and complex mode shapes of the structure, normally obtained from test data. A method is developed to find the smallest changes in the analytical model such that the improved model exactly predicts the measured modal parameters. The present method uses the orthogonality relationship to improve the mass and damping matrices and the dynamic equation to find the improved stiffness matrix.
Recommendations for accreditation of laboratories in molecular biology of hematologic malignancies.
Flandrin-Gresta, Pascale; Cornillet, Pascale; Hayette, Sandrine; Gachard, Nathalie; Tondeur, Sylvie; Mauté, Carole; Cayuela, Jean-Michel
2015-01-01
Over recent years, the development of molecular biology techniques has improved the diagnosis and follow-up of hematological diseases. Consequently, these techniques are widely used in the biological screening of these diseases; therefore, hemato-oncology molecular diagnostics laboratories must be actively involved in the accreditation process according to the ISO 15189 standard. The French group of molecular biologists (GBMHM) provides requirements for the implementation of quality assurance in medical molecular laboratories. This guideline states the recommendations for the pre-analytical, analytical (method validation procedures, quality controls, reagents), and post-analytical conditions. In addition, we state a strategy for internal quality control management. These recommendations will be regularly updated.
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2006-01-01
The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance/quality-control data for the time period addressed in this report were stored in the laboratory's SAS data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality- control samples analyzed from July 1997 through June 1999. Results for the quality-control samples for 18 analytical procedures were evaluated for bias and precision. Control charts indicate that data for eight of the analytical procedures were occasionally biased for either high-concentration and (or) low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, total monomeric aluminum, total aluminum, ammonium, calcium, chloride, specific conductance, and sulfate. The data from the potassium and sodium analytical procedures are insufficient for evaluation. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 11 of 13 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. Blank analysis results for chloride showed that 22 percent of blanks did not meet data-quality objectives and results for dissolved organic carbon showed that 31 percent of the blanks did not meet data-quality objectives. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 14 of the 18 analytes. At least 90 percent of the samples met data-quality objectives for all analytes except total aluminum (70 percent of samples met objectives) and potassium (83 percent of samples met objectives). Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated good data quality for most constituents over the time period. The P-sample (low-ionic-strength constituents) analysis had good ratings in two of these studies and a satisfactory rating in the third. The results of the T-sample (trace constituents) analysis indicated high data quality with good ratings in all three studies. The N-sample (nutrient constituents) studies had one each of excellent, good, and satisfactory ratings. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 80 percent of the samples met data-quality objectives for 9 of the 13 analytes; the exceptions were dissolved organic carbon, ammonium, chloride, and specific conductance. Data-quality objectives were not met for dissolved organic carbon in two NWRI studies, but all of the samples were within control limits for the last study. Data-quality objectives were not met in 41 percent of samples analyzed for ammonium, 25 percent of samples analyzed for chloride, and 30 percent of samples analyzed for specific conductance. Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 84 percent of the samples analyzed for calcium, chloride, magnesium, pH, and potassium. Data-quality objectives were met by 73 percent of those analyzed for sulfate. 
The data-quality objective was not met for sodium. The data are insufficient for evaluation of the specific conductance results.
Clinical and diagnostic utility of saliva as a non-invasive diagnostic fluid: a systematic review
Nunes, Lazaro Alessandro Soares; Mussavira, Sayeeda
2015-01-01
This systematic review presents the latest trends in salivary research and its applications in health and disease. Among the large number of analytes present in saliva, many are affected by diverse physiological and pathological conditions. Further, the non-invasive, easy, and cost-effective collection methods prompt interest in evaluating its diagnostic or prognostic utility. Data accumulated over the past two decades point to the possible utility of saliva for monitoring overall health, diagnosing and treating various oral or systemic disorders, and monitoring drugs. Advances in saliva-based systems biology have also contributed to the identification of several biomarkers and the development of diverse salivary diagnostic kits and other sensitive analytical techniques. However, its utilization should be carefully evaluated in relation to the standardization of pre-analytical and analytical variables, such as collection and storage methods, analyte circadian variation, sample recovery, prevention of sample contamination, and analytical procedures. In spite of these challenges, knowledge gained from this biological matrix continues to grow. PMID:26110030
DOT National Transportation Integrated Search
2010-01-01
The initial objective of this research was to develop procedures and standards for applying GPC as an analytical tool to define the percentage amounts of polymer modifiers in polymer modified asphalt cements soluble in eluting GPC solvents. Quantific...
ERIC Educational Resources Information Center
Bailey, Leonard
1978-01-01
The experiment described was developed for the third-year course in inorganic and analytical pharmaceutical chemistry to provide students with "hands-on" experience with high pressure liquid chromatography. Assay procedures are given along with experimental parameters and student results. (LBH)
A Structure for Pedagogical Art Criticism.
ERIC Educational Resources Information Center
Anderson, Tom
1988-01-01
Develops method for incorporating the intuitive and affective with intellectual and analytic components for understanding works of art. States that the premises for such a systematization include both Arnheim's claim that two basic interdependent procedures of intelligent cognition are intuition and intellect (1986); and Harry Broudy's (1972)…
USDA-ARS?s Scientific Manuscript database
An analytical and statistical method has been developed to measure the ultrasound-enhanced bioscouring performance of milligram quantities of endo- and exo-polygalacturonase enzymes obtained from Rhizopus oryzae fungi. UV-Vis spectrophotometric data and a general linear mixed models procedure indic...
NASA Astrophysics Data System (ADS)
Krachler, Michael; Mohl, Carola; Emons, Hendrik; Shotyk, William
2002-08-01
A simple, robust and reliable analytical procedure for the determination of 15 elements, namely Ca, V, Cr, Mn, Co, Ni, Cu, Zn, Rb, Ag, Cd, Ba, Tl, Th and U, in peat and plant materials by inductively coupled plasma-quadrupole mass spectrometry (ICP-QMS) was developed. Powdered sample aliquots of approximately 220 mg were dissolved with various acid mixtures in a microwave-heated high-pressure autoclave capable of digesting 40 samples simultaneously. The selection of appropriate amounts of digestion acids (nitric acid, hydrofluoric acid or tetrafluoroboric acid) was crucial to obtain accurate results. The optimized acid mixture for digestion of plant and peat samples consisted of 3 ml HNO3 and 0.1 ml HBF4. An ultrasonic nebulizer with an additional membrane desolvation unit was found beneficial for the determination of Co, Ni, Ag, Tl, Th and U, allowing a dry sample aerosol to be aspirated into the ICP-QMS. A pneumatic cross-flow nebulizer served as the sample introduction device for the other elements. Internal standardization was achieved with 103Rh for all elements, except for Th, whose ICP-QMS signals were corrected by 103Rh and 185Re. Quality control was ascertained by analysis of the certified plant reference material GBW 07602 Bush Branches and Leaves. In almost all cases, HNO3 alone could not fully liberate the analytes of interest from the peat or plant matrix, probably because of the silicates present. After adding small amounts (0.05-0.1 ml) of either HF or HBF4 to the digestion mixture, concentrations quantified by ICP-QMS generally increased significantly, in the case of Rb by up to 80%. Further increasing the volumes of HF or HBF4, in turn, resulted in a loss of recoveries for almost all elements, in some cases amounting to approximately 60%. The successful analytical procedures were applied to the analysis of two bulk peat materials. In general, good agreement between the found concentrations and results from an inter-laboratory trial or from instrumental neutron activation data was obtained, underpinning the suitability of the developed analytical approach.
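As an illustration of the internal-standardization step described above (normalizing analyte signals to 103Rh before quantification), the sketch below shows a single-point version with invented count rates, standard concentration, aliquot mass, and dilution volume; it is not the authors' calculation.

```python
# Hypothetical ICP-QMS count rates (counts per second); all values are illustrative only.
rh103_sample   = 48500.0   # internal standard signal in the sample digest
rh103_standard = 50000.0   # internal standard signal in the calibration standard
analyte_sample   = 12300.0 # e.g. Ni signal in the sample digest
analyte_standard = 10000.0 # same isotope in a standard of known concentration
standard_conc = 5.0        # µg/L of the analyte in the calibration standard

# Classical single-point internal standardization: normalize each analyte signal
# to the internal standard, then convert the signal ratio to a concentration.
ratio_sample   = analyte_sample / rh103_sample
ratio_standard = analyte_standard / rh103_standard
conc_in_digest = standard_conc * ratio_sample / ratio_standard
print(f"analyte concentration in digest ≈ {conc_in_digest:.2f} µg/L")

# Back-calculation to the solid: ~220 mg aliquot made up to an assumed 10 mL final volume.
aliquot_mass_g, final_volume_L = 0.220, 0.010
print(f"≈ {conc_in_digest * final_volume_L / aliquot_mass_g:.2f} µg/g in the peat/plant sample")
```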
The development, design and test of a 66 W/kg (30-W/lb) roll-up solar array
NASA Technical Reports Server (NTRS)
Hasbach, W. A.; Ross, R. G., Jr.
1972-01-01
A program to develop a 250 square foot roll-up solar array with a power-to-weight ratio exceeding 30 watts per pound is described. The system design and fabrication of a full scale engineering development model are discussed. The system and development test program results are presented. Special test equipment and test procedures are included, together with comparisons of experimental and analytical results.
Development tests for the 2.5 megawatt Mod-2 wind turbine generator
NASA Technical Reports Server (NTRS)
Andrews, J. S.; Baskin, J. M.
1982-01-01
The 2.5 megawatt MOD-2 wind turbine generator test program is discussed. The development of the 2.5 megawatt MOD-2 wind turbine generator included an extensive program of testing which encompassed verification of analytical procedures, component development, and integrated system verification. The test program was intended to assure achievement of the thirty-year design operational life of the wind turbine system as well as to minimize costly design modifications which would otherwise have been required during on-site system testing. Computer codes were modified, the fatigue life of the structure and dynamic components was verified, mechanical and electrical components and subsystems were functionally checked and modified where necessary to meet system specifications, and measured dynamic responses of coupled systems confirmed analytical predictions.
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2013 CFR
2013-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2012 CFR
2012-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2014 CFR
2014-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2011 CFR
2011-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2010 CFR
2010-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
The transfer of analytical procedures.
Ermer, J; Limberger, M; Lis, K; Wätzig, H
2013-11-01
Analytical method transfers are certainly among the most discussed topics in the GMP regulated sector. However, they are surprisingly little regulated in detail. General information is provided by USP, WHO, and ISPE in particular. Most recently, the EU emphasized the importance of analytical transfer by including it in their draft of the revised GMP Guideline. In this article, an overview and comparison of these guidelines is provided. The key to success for method transfers is excellent communication between the sending and receiving units. In order to facilitate this communication, procedures, flow charts and checklists for responsibilities, success factors, transfer categories, the transfer plan and report, strategies in case of failed transfers, and tables with acceptance limits are provided here, together with a comprehensive glossary. Potential pitfalls are described such that they can be avoided. In order to assure an efficient and sustainable transfer of analytical procedures, a practically relevant and scientifically sound evaluation with corresponding acceptance criteria is crucial. Various strategies and statistical tools such as significance tests, absolute acceptance criteria, and equivalence tests are thoroughly described and compared in detail with examples. Significance tests should be avoided. The success criterion is not statistical significance, but rather analytical relevance. Depending on a risk assessment of the analytical procedure in question, statistical equivalence tests are recommended, because they include both a practically relevant acceptance limit and a direct control of the statistical risks. However, for lower-risk procedures, a simple comparison of the transfer performance parameters to absolute limits is also regarded as sufficient. Copyright © 2013 Elsevier B.V. All rights reserved.
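A minimal sketch of the equivalence-testing strategy recommended above, expressed in its two one-sided tests (TOST) / 90 % confidence-interval form, is given below; the assay values and the ±2 % acceptance limit are assumptions for illustration, not values from the article.

```python
import numpy as np
from scipy import stats

def tost_equivalence(x, y, limit, alpha=0.05):
    """Equivalence of two laboratory means via TOST, expressed through the
    (1 - 2*alpha) confidence interval of the mean difference (pooled-variance t)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n1, n2 = len(x), len(y)
    diff = x.mean() - y.mean()
    sp2 = ((n1 - 1) * x.var(ddof=1) + (n2 - 1) * y.var(ddof=1)) / (n1 + n2 - 2)
    se = np.sqrt(sp2 * (1 / n1 + 1 / n2))
    t_crit = stats.t.ppf(1 - alpha, n1 + n2 - 2)
    ci = (diff - t_crit * se, diff + t_crit * se)
    # Equivalent if the whole interval lies inside the acceptance limits.
    return diff, ci, (ci[0] > -limit) and (ci[1] < limit)

# Hypothetical assay results (% label claim) from sending and receiving laboratories,
# with an assumed acceptance limit of ±2 % for the mean difference.
sending   = [99.8, 100.2, 99.5, 100.1, 99.9, 100.4]
receiving = [100.6, 100.9, 100.3, 101.1, 100.5, 100.8]
diff, ci, ok = tost_equivalence(sending, receiving, limit=2.0)
print(f"mean difference = {diff:.2f} %, 90 % CI = ({ci[0]:.2f}, {ci[1]:.2f}), equivalent: {ok}")
```

Note how the conclusion rests on the acceptance limit rather than on a p-value for a zero difference, which is the point the authors make against plain significance testing.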
Islas, Gabriela; Hernandez, Prisciliano
2017-01-01
To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. This technique is based on the dispersion of a solid sorbent in liquid samples for the extraction, isolation and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. The applications of dispersive techniques in the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit minimization of additional steps such as precipitation, centrifugation, and filtration, which decreases manipulation of the sample. In this review, we describe the main procedures used for the synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027
Engine isolation for structural-borne interior noise reduction in a general aviation aircraft
NASA Technical Reports Server (NTRS)
Unruh, J. F.; Scheidt, D. C.
1981-01-01
Engine vibration isolation for structural-borne interior noise reduction is investigated. A laboratory-based test procedure to simulate engine-induced structure-borne noise transmission, the testing of a range of candidate isolators for relative performance data, and the development of an analytical model of the transmission phenomena for isolator design evaluation are addressed. The isolator relative performance test data show that the elastomeric isolators do not appear to operate as single-degree-of-freedom systems with respect to noise isolation. Noise isolation beyond 150 Hz levels off and begins to decrease somewhat above 600 Hz. Coupled analytical and empirical models were used to study the structure-borne noise transmission phenomena. Correlation of predicted results with measured data shows that (1) the modeling procedures are reasonably accurate for isolator design evaluation, and (2) the frequency-dependent properties of the isolators must be included in the model if reasonably accurate noise prediction beyond 150 Hz is desired. The experimental and analytical studies were carried out in the frequency range from 10 Hz to 1000 Hz.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kassemi, S.A.
1988-04-01
High Rayleigh number convection in a rectangular cavity with insulated horizontal surfaces and differentially heated vertical walls was analyzed for an arbitrary aspect ratio smaller than or equal to unity. Unlike previous analytical studies, a systematic method of solution based on linearization technique and analytical iteration procedure was developed to obtain approximate closed-form solutions for a wide range of aspect ratios. The predicted velocity and temperature fields are shown to be in excellent agreement with available experimental and numerical data.
NASA Technical Reports Server (NTRS)
Kassemi, Siavash A.
1988-01-01
High Rayleigh number convection in a rectangular cavity with insulated horizontal surfaces and differentially heated vertical walls was analyzed for an arbitrary aspect ratio smaller than or equal to unity. Unlike previous analytical studies, a systematic method of solution based on linearization technique and analytical iteration procedure was developed to obtain approximate closed-form solutions for a wide range of aspect ratios. The predicted velocity and temperature fields are shown to be in excellent agreement with available experimental and numerical data.
A behavior analytic analogue of learning to use synonyms, syntax, and parts of speech.
Chase, Philip N; Ellenwood, David W; Madden, Gregory
2008-01-01
Matching-to-sample and sequence training procedures were used to develop responding to stimulus classes that were considered analogous to 3 aspects of verbal behavior: identifying synonyms and parts of speech, and using syntax. Matching-to-sample procedures were used to train 12 paired associates from among 24 stimuli. These pairs were analogous to synonyms. Then, sequence characteristics were trained to 6 of the stimuli. The result was the formation of 3 classes of 4 stimuli, with the classes controlling a sequence response analogous to a simple ordering syntax: first, second, and third. Matching-to-sample procedures were then used to add 4 stimuli to each class. These stimuli, without explicit sequence training, also began to control the same sequence responding as the other members of their class. Thus, three 8-member functionally equivalent sequence classes were formed. These classes were considered to be analogous to parts of speech. Further testing revealed three 8-member equivalence classes and 512 different sequences of first, second, and third. The study indicated that behavior analytic procedures may be used to produce some generative aspects of verbal behavior related to simple syntax and semantics.
Hogendoorn, E A; Westhuis, K; Dijkman, E; Heusinkveld, H A; den Boer, A C; Evers, E A; Baumann, R A
1999-10-08
The coupled-column (LC-LC) configuration consisting of a 3 microm C18 column (50 x 4.6 mm I.D.) as the first column and a 5 microm C18 semi-permeable-surface (SPS) column (150 x 4.6 mm I.D.) as the second column appeared to be successful for the screening of acidic pesticides in surface water samples. In comparison to LC-LC employing two C18 columns, the combination of C18/SPS-C18 significantly decreased the baseline deviation caused by the hump of the co-extracted humic substances when using UV detection (217 nm). The developed LC-LC procedure allowed the simultaneous determination of the target analytes bentazone and bromoxynil in uncleaned extracts of surface water samples to a level of 0.05 microg/l in less than 15 min. In combination with a simple solid-phase extraction step (200 ml of water on 500 mg of C18-bonded silica), the analytical procedure provides a high sample throughput. During a period of about five months, more than 200 ditch-water samples originating from agricultural locations were analyzed with the developed procedure. Validation of the method was performed by randomly analyzing recoveries of water samples spiked at levels of 0.1 microg/l (n=10), 0.5 microg/l (n=7) and 2.5 microg/l (n=4). Weighted regression of the recovery data showed that the method provides overall recoveries of 95 and 100% for bentazone and bromoxynil, respectively, with corresponding intra-laboratory reproducibilities of 10 and 11%, respectively. Confirmation of the analytes in part of the sample extracts was carried out with GC-negative ion chemical ionization MS involving a derivatization step with bis(trifluoromethyl)benzyl bromide. No false negatives or positives were observed.
Molins, C; Hogendoorn, E A; Dijkman, E; Heusinkveld, H A; Baumann, R A
2000-02-11
The combination of microwave-assisted solvent extraction (MASE) and reversed-phase liquid chromatography (RPLC) with UV detection has been investigated for the efficient determination of phenylurea herbicides in soils involving the single-residue method (SRM) approach (linuron) and the multi-residue method (MRM) approach (monuron, monolinuron, isoproturon, metobromuron, diuron and linuron). Critical parameters of MASE, viz., extraction temperature, water content and extraction solvent, were varied in order to optimise recoveries of the analytes while simultaneously minimising co-extraction of soil interferences. The optimised extraction procedure was applied to different types of soil with an organic carbon content of 0.4-16.7%. Besides freshly spiked soil samples, method validation included the analysis of samples with aged residues. A comparative study of the applicability of RPLC-UV without and with the use of column switching for the processing of uncleaned extracts was carried out. For some of the tested analyte/matrix combinations the one-column approach (LC mode) is feasible. In comparison to LC, coupled-column LC (LC-LC mode) provides high selectivity in single-residue analysis (linuron) and, although less pronounced in multi-residue analysis (all six phenylurea herbicides), the clean-up performance of LC-LC improves both time of analysis and sample throughput. In the MRM approach the developed procedure involving MASE and LC-LC-UV provided acceptable recoveries (range, 80-120%) and RSDs (<12%) at levels of 10 microg/kg (n=9) and 50 microg/kg (n=7), respectively, for most analyte/matrix combinations. Recoveries from aged-residue samples spiked at a level of 100 microg/kg (n=7) ranged, depending on the analyte/soil type combination, from 41 to 113%, with RSDs ranging from 1 to 35%. In the SRM approach the developed LC-LC procedure was applied to the determination of linuron in 28 sandy soil samples collected in a field study. Linuron could be determined in soil with a limit of quantitation of 10 microg/kg.
An Analytical Singularity-Free Solution to the J2 Perturbation Problem
NASA Technical Reports Server (NTRS)
Bond, V. R.
1979-01-01
The development of a singularity-free solution of the J2 problem in satellite theory is presented. The procedure resembles that of Lyddane, who rederived Brouwer's satellite theory using Poincare elements. A comparable procedure is used in this report, in which the satellite theory of Scheifele, who used elements similar to the Delaunay elements but in the extended phase space, is rederived using Poincare elements also in the extended phase space. Only the short-period effects due to J2 are included.
Mass balancing of hollow fan blades
NASA Technical Reports Server (NTRS)
Kielb, R. E.
1986-01-01
A typical section model is used to analytically investigate the effect of mass balancing as applied to hollow, supersonic fan blades. A procedure to determine the best configuration of an internal balancing mass to provide flutter alleviation is developed. This procedure is applied to a typical supersonic shroudless fan blade which is unstable in both the solid configuration and when it is hollow with no balancing mass. The addition of an optimized balancing mass is shown to stabilize the blade at the design condition.
40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Sampling and analytical procedures for measuring smoke exhaust emissions. 87.82 Section 87.82 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...
40 CFR 87.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Sampling and analytical procedures for measuring gaseous exhaust emissions. 87.64 Section 87.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...
Applying Behavior Analytic Procedures to Effectively Teach Literacy Skills in the Classroom
ERIC Educational Resources Information Center
Joseph, Laurice M.; Alber-Morgan, Sheila; Neef, Nancy
2016-01-01
The purpose of this article is to discuss the application of behavior analytic procedures for advancing and evaluating methods for teaching literacy skills in the classroom. Particularly, applied behavior analysis has contributed substantially to examining the relationship between teacher behavior and student literacy performance. Teacher…
40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Sampling and analytical procedures for measuring smoke exhaust emissions. 87.82 Section 87.82 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...
40 CFR 87.64 - Sampling and analytical procedures for measuring gaseous exhaust emissions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Sampling and analytical procedures for measuring gaseous exhaust emissions. 87.64 Section 87.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...
21 CFR 530.24 - Procedure for announcing analytical methods for drug residue quantification.
Code of Federal Regulations, 2011 CFR
2011-04-01
..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS EXTRALABEL DRUG USE IN ANIMALS Specific Provisions Relating to Extralabel Use of Animal and Human Drugs in Food-Producing Animals § 530.24 Procedure for announcing analytical methods for drug residue quantification. (a...
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2001-01-01
A laboratory for analysis of low-ionic-strength water has been developed at the U.S. Geological Survey (USGS) office in Troy, N.Y., to analyze samples collected by USGS projects in the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures developed to ensure proper sample collection, processing, and analysis. The quality-assurance/quality-control data are stored in the laboratory's SAS data-management system, which provides efficient review, compilation, and plotting of quality-assurance/quality-control data. This report presents and discusses samples analyzed from July 1993 through June 1995. Quality-control results for 18 analytical procedures were evaluated for bias and precision. Control charts show that data from seven of the analytical procedures were biased throughout the analysis period for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, dissolved inorganic carbon, dissolved organic carbon (soil expulsions), chloride, magnesium, nitrate (colorimetric method), and pH. Three of the analytical procedures were occasionally biased but were within control limits; they were: calcium (high for high-concentration samples for May 1995), dissolved organic carbon (high for high-concentration samples from January through September 1994), and fluoride (high in samples for April and June 1994). No quality-control sample has been developed for the organic monomeric aluminum procedure. Results from the filter-blank and analytical-blank analyses indicate that all analytical procedures in which blanks were run were within control limits, although values for a few blanks were outside the control limits. Blanks were not analyzed for acid-neutralizing capacity, dissolved inorganic carbon, fluoride, nitrate (colorimetric method), or pH. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in 14 of the 18 procedures. Data-quality objectives were met by more than 90 percent of the samples analyzed in all procedures except total monomeric aluminum (85 percent of samples met objectives), total aluminum (70 percent of samples met objectives), and dissolved organic carbon (85 percent of samples met objectives). Triplicate samples were not analyzed for ammonium, fluoride, dissolved inorganic carbon, or nitrate (colorimetric method). Results of the USGS interlaboratory Standard Reference Sample Program indicated high data quality with a median result of 3.6 of a possible 4.0. Environment Canada's LRTAP interlaboratory study results indicated that more than 85 percent of the samples met data-quality objectives in 6 of the 12 analyses; exceptions were calcium, dissolved organic carbon, chloride, pH, potassium, and sodium. Data-quality objectives were not met for calcium samples in one LRTAP study, but 94 percent of samples analyzed were within control limits for the remaining studies. Data-quality objectives were not met by 35 percent of samples analyzed for dissolved organic carbon, but 94 percent of sample values were within 20 percent of the most probable value. Data-quality objectives were not met for 30 percent of samples analyzed for chloride, but 90 percent of sample values were within 20 percent of the most probable value.
Measurements of samples with a pH above 6.0 were biased high in 54 percent of the samples, although 85 percent of the samples met data-quality objectives for pH measurements below 6.0. Data-quality objectives for potassium and sodium were not met in one study (only 33 percent of the samples analyzed met the objectives), although 85 percent of the sample values were within control limits for the other studies. Measured sodium values were above the upper control limit in all studies. Results from blind reference-sample analyses indicated that data
Source-term development for a contaminant plume for use by multimedia risk assessment models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whelan, Gene; McDonald, John P.; Taira, Randal Y.
1999-12-01
Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools that are utilized in human risk and endangerment assessments for use at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world, Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment using this class of analytical tool.
1990-08-01
evidence for a surprising degree of long-term skill retention. We formulated a theoretical framework, focusing on the importance of procedural reinstatement...considerable forgetting over even relatively short retention intervals. We have been able to place these studies in the same general theoretical framework developed
E. L. Schaffer
Analytical procedures to predict the fire endurance of structural wood members have been developed worldwide. This research is reviewed for its capability to predict the results of tests in North America and for the considerations necessary to apply the information here. Critical research needs suggested include: (1) Investigation of load levels used in reported tests,...
The NASTRAN theoretical manual (level 16.0)
NASA Technical Reports Server (NTRS)
1976-01-01
The manual is a commentary on the NASTRAN computer program, introducing the program to all interested persons. The manual's most important function is to present the developments of the analytical and numerical procedures that underlie the program. This manual is one of the four manuals which document the NASTRAN computer program.
DOT National Transportation Integrated Search
2005-05-01
This report synthesized the research findings of Phase I of the Statewide Traffic Safety Study of Louisiana, sponsored by the Louisiana Department of Transportation and Development. The objective of Phase I was to provide a comprehensive review of th...
Meta-Analysis of Academic Interventions Derived from Neuropsychological Data
ERIC Educational Resources Information Center
Burns, Matthew K.; Petersen-Brown, Shawna; Haegele, Katherine; Rodriguez, Megan; Schmitt, Braden; Cooper, Maureen; Clayton, Kate; Hutcheson, Shannon; Conner, Cynthia; Hosp, John; VanDerHeyden, Amanda M.
2016-01-01
Several scholars have recommended using data from neuropsychological tests to develop interventions for reading and mathematics. The current study examined the effects of using neuropsychological data within the intervention process with meta-analytic procedures. A total of 1,126 articles were found from an electronic search and compared to…
A rapid, safe and efficient procedure was developed to synthesize perfluorinated chloroformates in the small scale generally required to perform analytical derivatizations. This new family of derivatizing agents allows straightforward derivatization of highly polar compounds, co...
DOT National Transportation Integrated Search
2010-01-01
Current AASHTO provisions for the conventional load rating of flat slab bridges rely on the equivalent strip method of analysis for determining live load effects, which is generally regarded as overly conservative by many professional engineers. A...
ERIC Educational Resources Information Center
Ruscio, John; Ruscio, Ayelet Meron; Meron, Mati
2007-01-01
Meehl's taxometric method was developed to distinguish categorical and continuous constructs. However, taxometric output can be difficult to interpret because expected results for realistic data conditions and differing procedural implementations have not been derived analytically or studied through rigorous simulations. By applying bootstrap…
DEVELOPMENTS AT U.S. EPA IN ADDRESSING UNCERTAINTY IN RISK ASSESSMENT
An emerging trend in risk assessment is to be more explicit about uncertainties, both during the analytical procedures and in communicating results. In February 1992, then-Deputy EPA Administrator Henry Habicht set out Agency goals in a memorandum stating that the Agency will "p...
Jurowski, Kamil; Buszewski, Bogusław; Piekoszewski, Wojciech
2015-01-01
Studies of the distribution of metallic elements in biological samples are currently among the most important issues in the field. Many articles are dedicated to specific analytical atomic spectrometry techniques used for mapping/(bio)imaging of metallic elements in various kinds of biological samples. However, this literature lacks articles reviewing calibration strategies and their problems, nomenclature, definitions, and the ways and methods used to obtain quantitative distribution maps. The aim of this article was to characterize analytical calibration in the (bio)imaging/mapping of metallic elements in biological samples, including (1) nomenclature, (2) definitions, and (3) selected, sophisticated examples of calibration strategies with analytical calibration procedures applied in the different analytical methods currently used to study an element's distribution in biological samples/materials, such as LA ICP-MS, SIMS, EDS, XRF and others. The main emphasis was placed on the procedures and methodology of the analytical calibration strategy. Additionally, the aim of this work is to systematize the nomenclature for the calibration terms: analytical calibration, analytical calibration method, analytical calibration procedure and analytical calibration strategy. The authors also want to popularize a division of calibration methods different from those hitherto used. This article is the first work in the literature that refers to and emphasizes the many different and complex aspects of analytical calibration problems in studies related to (bio)imaging/mapping of metallic elements in different kinds of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Pinho, Ludmila A. G.; Sá-Barreto, Lívia C. L.; Infante, Carlos M. C.; Cunha-Filho, Marcílio S. S.
2016-04-01
The aim of this work was the development of an analytical procedure using spectrophotometry for the simultaneous determination of benznidazole (BNZ) and itraconazole (ITZ) in a medicine used for the treatment of Chagas disease. To achieve this goal, the analysis of mixtures was performed by applying the Lambert-Beer law to the absorbances of BNZ and ITZ at the wavelengths 259 and 321 nm, respectively. Diverse tests were carried out for the development and validation of the method, which proved to be selective, robust, linear, and precise. The low limits of detection and quantification demonstrate its sensitivity to quantify small amounts of the analytes, enabling its application for various analytical purposes, such as dissolution tests and routine assays. In short, the quantification of BNZ and ITZ by analysis of mixtures proved to be an efficient and cost-effective alternative for the determination of these drugs in a pharmaceutical dosage form.
Pinho, Ludmila A G; Sá-Barreto, Lívia C L; Infante, Carlos M C; Cunha-Filho, Marcílio S S
2016-04-15
The aim of this work was the development of an analytical procedure using spectrophotometry for the simultaneous determination of benznidazole (BNZ) and itraconazole (ITZ) in a medicine used for the treatment of Chagas disease. To achieve this goal, the analysis of mixtures was performed by applying the Lambert-Beer law to the absorbances of BNZ and ITZ at the wavelengths 259 and 321 nm, respectively. Diverse tests were carried out for the development and validation of the method, which proved to be selective, robust, linear, and precise. The low limits of detection and quantification demonstrate its sensitivity to quantify small amounts of the analytes, enabling its application for various analytical purposes, such as dissolution tests and routine assays. In short, the quantification of BNZ and ITZ by analysis of mixtures proved to be an efficient and cost-effective alternative for the determination of these drugs in a pharmaceutical dosage form. Copyright © 2016. Published by Elsevier B.V.
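The two-wavelength mixture analysis described in the two records above amounts to solving a small linear system derived from the Lambert-Beer law. The sketch below shows that step with hypothetical absorptivities and absorbances; in practice the coefficients would come from the validated calibration curves, not from these made-up numbers.

```python
import numpy as np

# Hypothetical specific absorptivities (L mg^-1 cm^-1) of BNZ and ITZ at the two
# working wavelengths; real values would be obtained from calibration standards.
E = np.array([[0.045, 0.012],   # 259 nm: [BNZ, ITZ]
              [0.004, 0.038]])  # 321 nm: [BNZ, ITZ]
A = np.array([0.512, 0.401])    # measured absorbances of the mixture (1 cm cell), illustrative

# Lambert-Beer law for a binary mixture: A = E @ c, solved for the concentration vector c.
c = np.linalg.solve(E, A)
print(f"BNZ ≈ {c[0]:.1f} mg/L, ITZ ≈ {c[1]:.1f} mg/L")
```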
Development of coring procedures applied to Si, CdTe, and CIGS solar panels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moutinho, H. R.; Johnston, S.; To, B.
Most of the research on the performance and degradation of photovoltaic modules is based on macroscale measurements of device parameters such as efficiency, fill factor, open-circuit voltage, and short-circuit current. Our goal is to develop the capabilities to allow us to study the degradation of these parameters in the micro- and nanometer scale and to relate our results to performance parameters. To achieve this objective, the first step is to be able to access small samples from specific areas of the solar panels without changing the properties of the material. In this paper, we describe two coring procedures that we developed and applied to Si, CIGS, and CdTe solar panels. In the first procedure, we cored full samples, whereas in the second we performed a partial coring that keeps the tempered glass intact. The cored samples were analyzed by different analytical techniques before and after coring, at the same locations, and no damage during the coring procedure was observed.
Development of coring procedures applied to Si, CdTe, and CIGS solar panels
Moutinho, H. R.; Johnston, S.; To, B.; ...
2018-01-04
Most of the research on the performance and degradation of photovoltaic modules is based on macroscale measurements of device parameters such as efficiency, fill factor, open-circuit voltage, and short-circuit current. Our goal is to develop the capabilities to allow us to study the degradation of these parameters in the micro- and nanometer scale and to relate our results to performance parameters. To achieve this objective, the first step is to be able to access small samples from specific areas of the solar panels without changing the properties of the material. In this paper, we describe two coring procedures that we developed and applied to Si, CIGS, and CdTe solar panels. In the first procedure, we cored full samples, whereas in the second we performed a partial coring that keeps the tempered glass intact. The cored samples were analyzed by different analytical techniques before and after coring, at the same locations, and no damage during the coring procedure was observed.
Ground temperature measurement by PRT-5 for maps experiment
NASA Technical Reports Server (NTRS)
Gupta, S. K.; Tiwari, S. N.
1978-01-01
A simple algorithm and computer program were developed for determining the actual surface temperature from the effective brightness temperature as measured remotely by a radiation thermometer called PRT-5. This procedure allows the computation of atmospheric correction to the effective brightness temperature without performing detailed radiative transfer calculations. Model radiative transfer calculations were performed to compute atmospheric corrections for several values of the surface and atmospheric parameters individually and in combination. Polynomial regressions were performed between the magnitudes or deviations of these parameters and the corresponding computed corrections to establish simple analytical relations between them. Analytical relations were also developed to represent combined correction for simultaneous variation of parameters in terms of their individual corrections.
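As a sketch of the regression step described above, the fragment below fits a low-order polynomial between an assumed atmospheric-parameter deviation and the corresponding computed brightness-temperature correction; the parameter choice, polynomial order, and all numerical values are illustrative assumptions, not data from the report.

```python
import numpy as np

# Hypothetical results of radiative-transfer model runs: deviation of one atmospheric
# parameter (e.g. precipitable water, cm) versus the computed correction (K).
param_deviation = np.array([-1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0])
correction_K    = np.array([-0.82, -0.40, 0.00, 0.43, 0.88, 1.37, 1.90])

# Fit a low-order polynomial so the correction can later be evaluated analytically
# without rerunning the detailed radiative-transfer calculation.
poly = np.poly1d(np.polyfit(param_deviation, correction_K, deg=2))

measured_brightness_T = 294.6   # K, effective brightness temperature from the radiometer
deviation_today = 1.2           # cm, departure from the reference atmosphere (assumed)
surface_T = measured_brightness_T + poly(deviation_today)
print(f"estimated surface temperature ≈ {surface_T:.2f} K")
```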
NASA Technical Reports Server (NTRS)
Eberle, W. R.
1981-01-01
A computer program to calculate the wake downwind of a wind turbine was developed. Turbine wake characteristics are useful for determining optimum arrays for wind turbine farms. The analytical model is based on the characteristics of a turbulent coflowing jet with modification for the effects of atmospheric turbulence. The program calculates overall wake characteristics, wind profiles, and power recovery for a wind turbine directly in the wake of another turbine, as functions of distance downwind of the turbine. The calculation procedure is described in detail, and sample results are presented to illustrate the general behavior of the wake and the effects of principal input parameters.
Development of flexible rotor balancing criteria
NASA Technical Reports Server (NTRS)
Walter, W. W.; Rieger, N. F.
1979-01-01
Several studies in which analytical procedures were used to obtain balancing criteria for flexible rotors are described. General response data for a uniform rotor in damped flexible supports were first obtained for plain cylindrical bearings, tilting pad bearings, axial groove bearings, and partial arc bearings. These data formed the basis for the flexible rotor balance criteria presented. A procedure by which a practical rotor in bearings could be reduced to an equivalent uniform rotor was developed and tested. It was found that the equivalent rotor response always exceeded the practical rotor response by more than sixty percent for the cases tested. The equivalent rotor procedure was then tested against six practical rotor configurations for which data were available. It was found that the equivalent rotor method offered a procedure by which balance criteria could be selected for practical flexible rotors, using the charts given for the uniform rotor.
da Costa, Wiviane Kássia Oliveira Correia; da Silva, Caroline Santos; Figueiredo, José Fernando Dagnone; Nóbrega, Joaquim Araujo; Paim, Ana Paula Silveira
2018-06-05
A fast and simple dilute-and-shoot procedure for determination of Al, As, Ba, Cd, Cu, Fe, Mg, Mn, Ni, Pb, Sc, Ti, V, Zn and Zr in deodorants by inductively coupled plasma optical emission spectrometry (ICP OES) was developed. Sample preparation was carried out by diluting 1 mL of deodorant sample in 1% (v v-1) HNO3. The accuracy of the analytical procedure was evaluated using addition and recovery experiments, and recoveries ranged from 80 to 119%. The limits of detection varied from 0.001 to 0.76 mg kg-1. Nine deodorant samples of different brands were analyzed. The maximum concentrations found (mg kg-1) were: Fe (1.0), Mn (0.1), Ti (1.02), V (0.33), Zn (255.2) and Zr (0.5); for Al and Mg, determined concentrations varied from 0.01 to 7.0% and from 0.005 to 1.44 mg kg-1, respectively, showing wide variation depending on the sample type. The developed procedure was adequate for determining these analytes in routine analysis, presenting high sample throughput and demonstrating the feasibility of direct analysis after a simple dilution step. Copyright © 2018 Elsevier B.V. All rights reserved.
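The addition-and-recovery check mentioned above reduces to a one-line calculation; the sketch below uses invented Zn concentrations for a single sample, and only the 80-119 % range mirrors the abstract.

```python
def spike_recovery(c_spiked, c_unspiked, c_added):
    """Percent recovery from an addition/recovery (spike) experiment."""
    return 100.0 * (c_spiked - c_unspiked) / c_added

# Hypothetical Zn results for one deodorant sample (mg kg-1); values are illustrative only.
unspiked, added, spiked = 250.0, 50.0, 297.5
rec = spike_recovery(spiked, unspiked, added)
status = "within" if 80 <= rec <= 119 else "outside"
print(f"Zn recovery = {rec:.0f} % ({status} the reported 80-119 % range)")
```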
40 CFR 600.108-08 - Analytical gases.
Code of Federal Regulations, 2012 CFR
2012-07-01
... ECONOMY AND GREENHOUSE GAS EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy and Carbon-Related Exhaust Emission Test Procedures § 600.108-08 Analytical gases. The analytical gases for all fuel economy testing...
40 CFR 600.108-08 - Analytical gases.
Code of Federal Regulations, 2013 CFR
2013-07-01
... ECONOMY AND GREENHOUSE GAS EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy and Carbon-Related Exhaust Emission Test Procedures § 600.108-08 Analytical gases. The analytical gases for all fuel economy testing...
An analytical method for free vibration analysis of functionally graded beams with edge cracks
NASA Astrophysics Data System (ADS)
Wei, Dong; Liu, Yinghua; Xiang, Zhihai
2012-03-01
In this paper, an analytical method is proposed for solving the free vibration of cracked functionally graded material (FGM) beams with axial loading, rotary inertia and shear deformation. The governing differential equations of motion for an FGM beam are established and the corresponding solutions are found first. The discontinuity of rotation caused by the cracks is simulated by means of the rotational spring model. Based on the transfer matrix method, the recurrence formula is then developed to obtain the eigenvalue equations of free vibration of FGM beams. The main advantage of the proposed method is that the eigenvalue equation for vibrating beams with an arbitrary number of cracks can be conveniently determined from a third-order determinant. Owing to the decrease in determinant order compared with previous methods, the developed method is simpler and more convenient for analytically solving the free vibration problem of cracked FGM beams. Moreover, free vibration analyses of Euler-Bernoulli and Timoshenko beams with any number of cracks can be conducted using the unified procedure based on the developed method. These advantages of the proposed procedure become more remarkable as the number of cracks increases. A comprehensive analysis is conducted to investigate the influences of the location and total number of cracks, material properties, axial load, inertia and end supports on the natural frequencies and vibration mode shapes of FGM beams. The present work may be useful for the design and control of damaged structures.
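The rotational-spring idealization mentioned above is commonly written as a slope discontinuity proportional to the local bending moment while deflection, moment, and shear stay continuous. The compact statement below is a standard form of that model under the assumption of an open edge crack; the stiffness symbol K_T is introduced here for illustration, and the exact expression used by the authors may differ.

```latex
% Rotational-spring model of an open edge crack at x = x_c:
\begin{aligned}
  w(x_c^{+}) &= w(x_c^{-}), &\qquad M(x_c^{+}) &= M(x_c^{-}), \\
  \theta(x_c^{+}) - \theta(x_c^{-}) &= \frac{M(x_c)}{K_T}, &\qquad V(x_c^{+}) &= V(x_c^{-}),
\end{aligned}
```

where w is the transverse deflection, theta the rotation, M the bending moment, V the shear force, and K_T the equivalent rotational spring stiffness associated with the crack depth.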
42 CFR 493.959 - Immunohematology.
Code of Federal Regulations, 2014 CFR
2014-10-01
... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...
42 CFR 493.959 - Immunohematology.
Code of Federal Regulations, 2013 CFR
2013-10-01
... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...
42 CFR 493.959 - Immunohematology.
Code of Federal Regulations, 2012 CFR
2012-10-01
... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...
42 CFR 493.959 - Immunohematology.
Code of Federal Regulations, 2011 CFR
2011-10-01
... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-06
... Analytic Methods and Sampling Procedures for the United States National Residue Program for Meat, Poultry... implementing several multi-residue methods for analyzing samples of meat, poultry, and egg products for animal.... These modern, high-efficiency methods will conserve resources and provide useful and reliable results...
40 CFR 63.145 - Process wastewater provisions-test methods and procedures to determine compliance.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 9 2011-07-01 2011-07-01 false Process wastewater provisions-test... Operations, and Wastewater § 63.145 Process wastewater provisions—test methods and procedures to determine... analytical method for wastewater which has that compound as a target analyte. (7) Treatment using a series of...
40 CFR 63.145 - Process wastewater provisions-test methods and procedures to determine compliance.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 10 2013-07-01 2013-07-01 false Process wastewater provisions-test... Operations, and Wastewater § 63.145 Process wastewater provisions—test methods and procedures to determine... analytical method for wastewater which has that compound as a target analyte. (7) Treatment using a series of...
Code of Federal Regulations, 2012 CFR
2012-07-01
... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES... to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater... times the standard deviation of replicate instrumental measurements of the analyte in reagent water. (c...
Electroencephalographic monitoring of complex mental tasks
NASA Technical Reports Server (NTRS)
Guisado, Raul; Montgomery, Richard; Montgomery, Leslie; Hickey, Chris
1992-01-01
Outlined here is the development of neurophysiological procedures to monitor operators during the performance of cognitive tasks. Our approach included the use of electroencephalographic (EEG) and rheoencephalographic (REG) techniques to determine changes in cortical function associated with cognition in the operator's state. A two-channel tetrapolar REG, a single-channel forearm impedance plethysmograph, a Lead I electrocardiogram (ECG) and a 21-channel EEG were used to measure subject responses to various visual-motor cognitive tasks. Testing, analytical, and display procedures for EEG and REG monitoring were developed that extend the state of the art and provide a valuable tool for the study of cerebral circulatory and neural activity during cognition.
Tran, Ngoc Han; Hu, Jiangyong; Ong, Say Leong
2013-09-15
A high-throughput method for the simultaneous determination of 24 pharmaceuticals and personal care products (PPCPs), endocrine disrupting chemicals (EDCs) and artificial sweeteners (ASs) was developed. The method was based on a single-step solid phase extraction (SPE) coupled with high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) and isotope dilution. In this study, a single-step SPE procedure was optimized for simultaneous extraction of all target analytes. Good recoveries (≥ 70%) were observed for all target analytes when extraction was performed using Chromabond(®) HR-X (500 mg, 6 mL) cartridges under acidic conditions (pH 2). HPLC-MS/MS parameters were optimized for the simultaneous analysis of the 24 PPCPs, EDCs and ASs in a single injection. Quantification was performed using 13 isotopically labeled internal standards (ILIS), which allows efficient correction for analyte losses during the SPE procedure, matrix effects during HPLC-MS/MS, and fluctuations in MS/MS signal intensity due to the instrument. The method quantification limit (MQL) for most of the target analytes was below 10 ng/L in all water samples. The method was successfully applied to the simultaneous determination of PPCPs, EDCs and ASs in raw wastewater, surface water and groundwater samples collected in a local catchment area in Singapore. In conclusion, the developed method provided a valuable tool for investigating the occurrence, behavior, transport, and fate of PPCPs, EDCs and ASs in the aquatic environment. Copyright © 2013 Elsevier B.V. All rights reserved.
Analytic tests and their relation to jet fuel thermal stability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heneghan, S.P.; Kauffman, R.E.
1995-05-01
The evaluation of jet fuel thermal stability (TS) by simple analytic procedures has long been a goal of fuels chemists. The reason is obvious: if the analytic chemist can determine which types of material cause his test to respond, the refiners will know which materials to remove to improve stability. Complicating this quest is the lack of an acceptable quantitative TS test with which to compare any analytic procedures. To circumvent this problem, we recently compiled the results of TS tests for 12 fuels using six separate test procedures. The results, covering a range of flow and temperature conditions, show that TS is not as dependent on test conditions as previously thought. Also, comparing the results from these tests with several analytic procedures shows that either a measure of the number of phenols or the total sulfur present in jet fuels is strongly indicative of the TS. The phenols have been measured using a cyclic voltammetry technique and the polar material by gas chromatography (atomic emission detection) following a solid phase extraction on silica gel. The polar material has been identified as mainly phenols (by mass spectrometry identification). Measures of the total acid number or peroxide concentration have little correlation with TS.
Störmer, Elke; Bauer, Steffen; Kirchheiner, Julia; Brockmöller, Jürgen; Roots, Ivar
2003-01-05
A new HPLC method for the simultaneous determination of celecoxib, carboxycelecoxib and hydroxycelecoxib in human plasma samples has been developed. Following a solid-phase extraction procedure, the samples were separated by gradient reversed-phase HPLC (C(18)) and quantified using UV detection at 254 nm. The method was linear over the concentration range 10-500 ng/ml. The intra-assay variability for the three analytes ranged from 4.0 to 12.6% and the inter-assay variability from 4.9 to 14.2%. The achieved limits of quantitation (LOQ) of 10 ng/ml for each analyte allowed the determination of the pharmacokinetic parameters of the analytes after administration of 100 mg celecoxib.
Enzymatic analysis of α-ketoglutaramate—A biomarker for hyperammonemia
Halámková, Lenka; Mailloux, Shay; Halámek, Jan; Cooper, Arthur J.L.; Katz, Evgeny
2012-01-01
Two enzymatic assays were developed for the analysis of α-ketoglutaramate (KGM)—an important biomarker of hepatic encephalopathy and other hyperammonemic diseases. In both procedures, KGM is first converted to α-ketoglutarate (KTG) via a reaction catalyzed by ω-amidase (AMD). In the first procedure, KTG generated in the AMD reaction initiates a biocatalytic cascade in which the concerted action of alanine transaminase and lactate dehydrogenase results in the oxidation of NADH. In the second procedure, KTG generated from KGM is reductively aminated, with the concomitant oxidation of NADH, in a reaction catalyzed by L-glutamic dehydrogenase. In both assays, the decrease in optical absorbance (λ=340 nm) corresponding to NADH oxidation is used to quantify concentrations of KGM. The two analytical procedures were applied to 50% (v/v) human serum diluted with aqueous solutions containing the assay components and spiked with concentrations of KGM estimated to be present in normal human plasma and in plasma from hyperammonemic patients. Since KTG is the product of AMD-catalyzed hydrolysis of KGM, in a separate study, this compound was used as a surrogate for KGM. Statistical analyses of samples mimicking the concentration of KGM assumed to be present in normal and pathological concentration ranges were performed. Both enzymatic assays for KGM were confirmed to discriminate between the predicted normal and pathophysiological concentrations of the analyte. The present study is the first step toward the development of a clinically useful probe for KGM analysis in biological fluids. PMID:23141304
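To illustrate how the absorbance decrease at 340 nm translates into an analyte concentration, the sketch below applies the Beer-Lambert law with the commonly tabulated NADH molar absorption coefficient of 6220 M^-1 cm^-1; the path length, dilution handling, stoichiometry, and example reading are assumptions for demonstration and are not taken from the paper.

```python
# Convert the measured absorbance decrease at 340 nm into a KGM concentration.
# Assumes a 1 cm path length, 1:1 NADH:KGM stoichiometry in the coupled assay, and
# the widely used NADH molar absorption coefficient of 6220 M^-1 cm^-1 at 340 nm.
EPSILON_NADH = 6220.0  # M^-1 cm^-1
PATH_CM = 1.0

def kgm_concentration_uM(delta_A, dilution_factor=2.0):
    """KGM concentration (µM) in the original serum from the absorbance decrease.
    dilution_factor = 2 reflects the 50 % (v/v) serum dilution used in the assay."""
    conc_in_cuvette_M = delta_A / (EPSILON_NADH * PATH_CM)
    return conc_in_cuvette_M * dilution_factor * 1e6

print(f"ΔA = 0.031  ->  KGM ≈ {kgm_concentration_uM(0.031):.1f} µM")
```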
NASA Technical Reports Server (NTRS)
Newman, Brett; Yu, Si-bok; Rhew, Ray D. (Technical Monitor)
2003-01-01
Modern experimental and test activities demand innovative and adaptable procedures to maximize data content and quality while working within severely constrained budgetary and facility resource environments. This report describes development of a high-accuracy angular measurement capability for NASA Langley Research Center hypersonic wind tunnel facilities to overcome these deficiencies. Specifically, utilization of micro-electro-mechanical sensors, including accelerometers and gyros, coupled with software-driven data acquisition hardware and integrated within a prototype measurement system, is considered. Development methodology addresses basic design requirements formulated from wind tunnel facility constraints and current operating procedures, as well as engineering and scientific test objectives. A description of the analytical framework governing relationships between time-dependent multi-axis acceleration and angular rate sensor data and the desired three-dimensional Eulerian angular state of the test model is given. Calibration procedures for identifying and estimating critical parameters in the sensor hardware are also addressed.
In situ impulse test: an experimental and analytical evaluation of data interpretation procedures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1975-08-01
Special experimental field testing and analytical studies were undertaken at Fort Lawton in Seattle, Washington, to study "close-in" wave propagation and evaluate data interpretation procedures for a new in situ impulse test. This test was developed to determine the shear wave velocity and dynamic modulus of soils underlying potential nuclear power plant sites. The test is different from conventional geophysical testing in that the velocity variation with strain is determined for each test. In general, strains between 10^-1 and 10^-3 percent are achieved. The experimental field work consisted of performing special tests in a large test sand fill to obtain detailed "close-in" data. Six recording transducers were placed at various points on the energy source, while approximately 37 different transducers were installed within the soil fill, all within 7 feet of the energy source. Velocity measurements were then taken simultaneously under controlled test conditions to study shear wave propagation phenomenology and help evaluate data interpretation procedures. Typical test data are presented along with detailed descriptions of the results.
Citak, Demirhan; Tuzen, Mustafa; Soylak, Mustafa
2010-01-15
A speciation procedure based on the coprecipitation of manganese(II) with zirconium(IV) hydroxide has been developed for the investigation of levels of manganese species. The determination of manganese levels was performed by flame atomic absorption spectrometry (FAAS). Total manganese was determined after the reduction of Mn(VII) to Mn(II) by ascorbic acid. The analytical parameters, including pH, amount of zirconium(IV) and sample volume, were investigated for the quantitative recoveries of manganese(II). The effects of matrix ions were also examined. The recoveries for manganese(II) were in the range of 95-98%. The preconcentration factor was calculated as 50. The detection limit for the analyte ions based on 3 sigma (n=21) was 0.75 microg L(-1) for Mn(II). The relative standard deviation was found to be lower than 7%. The validation of the presented procedure was performed by the analysis of certified reference materials having different matrices, NIST SRM 1515 (Apple Leaves) and NIST SRM 1568a (Rice Flour). The procedure was successfully applied to natural waters and food samples.
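For readers unfamiliar with the figures of merit quoted above, the short sketch below shows how a 3-sigma detection limit (n = 21 blank readings) and a preconcentration factor are typically computed; the blank values and volumes are invented for illustration and are not the authors' data.

import numpy as np

rng = np.random.default_rng(1)
# 21 hypothetical blank readings (ug/L equivalents), not the paper's data.
blanks = rng.normal(loc=0.10, scale=0.25, size=21)

lod_3sigma = 3 * blanks.std(ddof=1)      # 3-sigma detection limit, ug/L
preconcentration_factor = 250.0 / 5.0    # assumed 250 mL sample -> 5 mL final volume

print(f"LOD = {lod_3sigma:.2f} ug/L, "
      f"preconcentration factor = {preconcentration_factor:.0f}")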
RLV Turbine Performance Optimization
NASA Technical Reports Server (NTRS)
Griffin, Lisa W.; Dorney, Daniel J.
2001-01-01
A task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. Final results of the preliminary design and the results of the two-dimensional (2D) detailed design of the first-stage vane of a supersonic turbine suitable for a reusable launch vehicle (RLV) are presented. Analytical techniques for obtaining the results are also discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kahn, M.; Kleinberg, J.
1977-09-01
The preparation of isotopes of the element is described, along with selected procedures for its determination in, or separation from, various media and for the separation of iodine species from each other. Each part of the introductory section is referenced separately from the remainder of the monograph. For the preparative and analytical sections there is an extensive, indexed bibliography which was developed from the indexes of Volumes 19 to 30 inclusive (1965-1974) of Nuclear Science Abstracts (NSA). From these indexes the NSA abstracts of possibly pertinent references were selected for examination, and a choice was made of those references which were to be included in the bibliography. The bibliography has both primary and secondary references. Although the monograph does not cover hot atom chemistry, the kinetics of exchange reactions, decay schemes, or physiological applications, papers in these areas were examined as possible sources of useful preparative and analytical procedures. (JRD)
Cardiac data mining (CDM); organization and predictive analytics on biomedical (cardiac) data
NASA Astrophysics Data System (ADS)
Bilal, M. Musa; Hussain, Masood; Basharat, Iqra; Fatima, Mamuna
2013-10-01
Data mining and data analytics have been of immense importance to many different fields as we witness the evolution of data science over recent years. Biostatistics and Medical Informatics have proved to be the foundation of many modern biological theories and analysis techniques. These are the fields that apply data mining practices along with statistical models to discover hidden trends from data comprising biological experiments or procedures on different entities. The objective of this research study is to develop a system for the efficient extraction, transformation and loading of such data from cardiologic procedure reports provided by the Armed Forces Institute of Cardiology. It also aims to devise a model for the predictive analysis and classification of this data into important classes required by cardiologists around the world. This includes predicting patient impressions and other important features.
NASA Astrophysics Data System (ADS)
Mittal, R.; Rao, P.; Kaur, P.
2018-01-01
Elemental evaluations in small quantities of powdered material have been made using energy dispersive X-ray fluorescence (EDXRF) measurements, for which formulations, along with a specific procedure for sample target preparation, have been developed. Fractional amount evaluation involves a sequence of steps: (i) collection of elemental characteristic X-ray counts in EDXRF spectra recorded with different weights of material, (ii) a check for linearity between X-ray counts and material weights, (iii) calculation of elemental fractions from the linear fit, and (iv) linear fitting of the calculated fractions against sample weight and extrapolation to zero weight. Thus, the elemental fractions at zero weight are free from material self-absorption effects for incident and emitted photons. The analytical procedure, after its verification with known synthetic samples of the macro-nutrients potassium and calcium, was used for wheat plant/soil samples obtained from a pot experiment.
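A minimal numerical sketch of the zero-weight extrapolation idea is given below, with invented counts, weights and sensitivity factor (none of these values come from the study): counts are fitted against sample weight, apparent elemental fractions are computed at each weight, and a second linear fit is extrapolated to zero weight, where self-absorption effects vanish.

import numpy as np

# Hypothetical EDXRF data, for illustration only.
weights_mg = np.array([20.0, 40.0, 60.0, 80.0, 100.0])        # sample weights
counts = np.array([1450.0, 2810.0, 4090.0, 5300.0, 6450.0])   # characteristic X-ray counts

# (ii) linearity check of counts vs. weight
slope, intercept = np.polyfit(weights_mg, counts, 1)

# (iii) apparent elemental fraction at each weight (sensitivity factor assumed)
SENSITIVITY = 750.0                                  # counts per mg per unit fraction (assumed)
apparent_fraction = counts / (SENSITIVITY * weights_mg)

# (iv) fit the fractions vs. weight and extrapolate to zero weight
p1, p0 = np.polyfit(weights_mg, apparent_fraction, 1)
print(f"fraction free of self-absorption (zero-weight intercept): {p0:.4f}")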
Analytical simulation of weld effects in creep range
NASA Technical Reports Server (NTRS)
Dhalla, A. K.
1985-01-01
The inelastic analysis procedure used to investigate the effect of welding on the creep rupture strength of a typical Liquid Metal Fast Breeder Reactor (LMFBR) nozzle is discussed. The current study is part of an overall experimental and analytical investigation to verify the inelastic analysis procedure now being used to design LMFBR structural components operating at elevated temperatures. Two important weld effects included in the numerical analysis are: (1) the residual stress introduced in the fabrication process; and (2) the time-independent and the time-dependent material property variations. Finite element inelastic analysis was performed on a CRAY-1S computer using the ABAQUS program with the constitutive equations developed for the design of LMFBR structural components. The predicted peak weld residual stresses relax by as much as 40% during elevated temperature operation, and their effect on creep-rupture cracking of the nozzle is considered of secondary importance.
Probabilistic seismic vulnerability and risk assessment of stone masonry structures
NASA Astrophysics Data System (ADS)
Abo El Ezz, Ahmad
Earthquakes represent major natural hazards that regularly impact the built environment in seismic prone areas worldwide and cause considerable social and economic losses. The high losses incurred following past destructive earthquakes promoted the need for assessment of the seismic vulnerability and risk of existing buildings. Many historic buildings in the old urban centers of Eastern Canada, such as Old Quebec City, are built of stone masonry and represent immeasurable architectural and cultural heritage. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings, with systematic treatment of uncertainties throughout the modelling process, is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement-based procedure is used to develop damage state fragility functions in terms of spectral displacement response based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty in capacity and demand on fragility functions. In the second part, a robust analytical procedure for the development of seismic hazard compatible fragility and vulnerability functions is proposed. The results are given by sets of seismic hazard compatible vulnerability functions in terms of a structure-independent intensity measure (e.g. spectral acceleration) that can be used for seismic risk analysis. The procedure is very efficient for conducting rapid vulnerability assessment of stone masonry buildings. With modification of input structural parameters, it can be adapted and applied to any other building class. A sensitivity analysis of the seismic vulnerability modelling is conducted to quantify the uncertainties associated with each of the input parameters. The proposed methodology was validated for a scenario-based seismic risk assessment of existing buildings in Old Quebec City. The procedure for hazard compatible vulnerability modelling was used to develop seismic fragility functions in terms of spectral acceleration representative of the inventoried buildings. A total of 1220 buildings were considered. The assessment was performed for a scenario event of magnitude 6.2 at a distance of 15 km with a probability of exceedance of 2% in 50 years. The study showed that most of the expected damage is concentrated in the old brick and stone masonry buildings.
Gao, Yan; Sun, Ying; Jiang, Chunzhu; Yu, Xi; Wang, Yuanpeng; Zhang, Hanqi; Song, Daqian
2013-01-01
An analytical method was developed for the extraction and determination of pyrethroid pesticide residues in tobacco. The modified QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe) method was applied for preparing samples. In this study, methyl cyanide (MeCN)-saturated aqueous salt solution was used as the two-phase extraction solvent for the first time, and a vortex shaker was used for the simultaneous shaking and concentration of the analytes. The effects of experimental parameters on extraction and clean-up efficiency were investigated and optimized. The analytes were determined by gas chromatography-mass spectrometry with selected ion monitoring (GC-MS-SIM). The obtained recoveries of the analytes at three different fortification levels were 76.85-114.1% and relative standard deviations (RSDs) were lower than 15.7%. The limits of quantification (LOQs) were from 1.28 to 26.6 μg kg(-1). This method was also applied to the analysis of actual commercial tobacco products and the analytical results were satisfactory.
Semantic False Memories in the Form of Derived Relational Intrusions Following Training
ERIC Educational Resources Information Center
Guinther, Paul M.; Dougher, Michael J.
2010-01-01
Contemporary behavior analytic research is making headway in characterizing memory phenomena that typically have been characterized by cognitive models, and the current study extends this development by producing "false memories" in the form of functional equivalence responding. A match-to-sample training procedure was administered in order to…
Chris A. Childers; Douglas D. Piirto
1989-01-01
Fire management has always meant fire suppression to the managers of the chaparral-covered southern California National Forests. Today, Forest Service fire management programs must be cost effective, while wilderness fire management objectives are aimed at recreating natural fire regimes. A cost-effectiveness analysis has been developed to compare fire management...
Model transformations for state-space self-tuning control of multivariable stochastic systems
NASA Technical Reports Server (NTRS)
Shieh, Leang S.; Bao, Yuan L.; Coleman, Norman P.
1988-01-01
The design of self-tuning controllers for multivariable stochastic systems is considered analytically. A long-division technique for finding the similarity transformation matrix and transforming the estimated left matrix fraction description (MFD) to the right MFD is developed; the derivation is given in detail, and the procedures involved are briefly characterized.
Geng, Zongyu; Yang, Feng; Chen, Xi; Wu, Nianqiang
2016-01-01
It remains a challenge to accurately calibrate a sensor subject to environmental drift. The calibration task for such a sensor is to quantify the relationship between the sensor’s response and its exposure condition, which is specified by not only the analyte concentration but also the environmental factors such as temperature and humidity. This work developed a Gaussian Process (GP)-based procedure for the efficient calibration of sensors in drifting environments. Adopted as the calibration model, GP is not only able to capture the possibly nonlinear relationship between the sensor responses and the various exposure-condition factors, but also able to provide valid statistical inference for uncertainty quantification of the target estimates (e.g., the estimated analyte concentration of an unknown environment). Built on GP’s inference ability, an experimental design method was developed to achieve efficient sampling of calibration data in a batch sequential manner. The resulting calibration procedure, which integrates the GP-based modeling and experimental design, was applied on a simulated chemiresistor sensor to demonstrate its effectiveness and its efficiency over the traditional method. PMID:26924894
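A minimal sketch of a GP calibration model of the kind described above is shown below, using scikit-learn on synthetic data; the exposure-condition variables (concentration, temperature, relative humidity), the kernel choice and the noise level are assumptions for illustration rather than the authors' implementation.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Illustrative sketch (synthetic data, not the paper's): a GP that maps the
# exposure condition (analyte concentration, temperature, humidity) to the
# sensor response and returns a predictive standard deviation for uncertainty
# quantification.

rng = np.random.default_rng(0)
X = rng.uniform([0.0, 15.0, 20.0], [10.0, 35.0, 80.0], size=(60, 3))  # conc, T, RH
y = 2.0 * X[:, 0] + 0.05 * X[:, 1] - 0.02 * X[:, 2] + rng.normal(0, 0.1, 60)

kernel = 1.0 * RBF(length_scale=[2.0, 5.0, 10.0]) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

mean, std = gp.predict(np.array([[5.0, 25.0, 50.0]]), return_std=True)
print(f"predicted response {mean[0]:.2f} +/- {std[0]:.2f}")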
SRC-I demonstration plant analytical laboratory methods manual. Final technical report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klusaritz, M.L.; Tewari, K.C.; Tiedge, W.F.
1983-03-01
This manual is a compilation of analytical procedures required for operation of a Solvent-Refined Coal (SRC-I) demonstration or commercial plant. Each method reproduced in full includes a detailed procedure, a list of equipment and reagents, safety precautions, and, where possible, a precision statement. Procedures for the laboratory's environmental and industrial hygiene modules are not included. Required American Society for Testing and Materials (ASTM) methods are cited, and ICRC's suggested modifications to these methods for handling coal-derived products are provided.
ERIC Educational Resources Information Center
Wang, Tianyou
2009-01-01
Holland and colleagues derived a formula for analytical standard error of equating using the delta-method for the kernel equating method. Extending their derivation, this article derives an analytical standard error of equating procedure for the conventional percentile rank-based equipercentile equating with log-linear smoothing. This procedure is…
Jędrkiewicz, Renata; Orłowski, Aleksander; Namieśnik, Jacek; Tobiszewski, Marek
2016-01-15
In this study we perform ranking of analytical procedures for 3-monochloropropane-1,2-diol determination in soy sauces by the PROMETHEE method. Multicriteria decision analysis was performed for three different scenarios - metrological, economic and environmental - by applying different weights to the decision-making criteria. All three scenarios indicate the capillary electrophoresis-based procedure as the most preferable. Apart from that, the details of the ranking results differ among the three scenarios. A second run of rankings was done for scenarios that each include only the metrological, economic or environmental criteria, neglecting the others. These results show that the green analytical chemistry-based selection correlates with the economic one, while there is no correlation with the metrological one. This implies that green analytical chemistry can be brought into laboratories without analytical performance costs, and that it is even supported by economic reasons. Copyright © 2015 Elsevier B.V. All rights reserved.
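The sketch below illustrates how a PROMETHEE II net-flow ranking with scenario-dependent weights can be computed; the scores, weights and the usual (0/1) preference function are hypothetical and only stand in for the kind of multicriteria analysis described above.

import numpy as np

# Minimal PROMETHEE II sketch (hypothetical scores, not the paper's data):
# rows are candidate analytical procedures, columns are criteria already
# oriented so that larger is better; a usual (0/1) preference function is used.

scores = np.array([
    [0.9, 0.4, 0.7],   # e.g. CE-based procedure
    [0.6, 0.8, 0.5],   # e.g. GC-MS-based procedure
    [0.3, 0.6, 0.9],   # e.g. LC-MS/MS-based procedure
])
weights = np.array([0.5, 0.3, 0.2])   # one "scenario" of criterion weights

n = scores.shape[0]
pi = np.zeros((n, n))                 # aggregated preference indices
for a in range(n):
    for b in range(n):
        if a != b:
            pref = (scores[a] > scores[b]).astype(float)   # usual criterion
            pi[a, b] = np.dot(weights, pref)

phi_plus = pi.sum(axis=1) / (n - 1)   # positive outranking flow
phi_minus = pi.sum(axis=0) / (n - 1)  # negative outranking flow
phi_net = phi_plus - phi_minus        # PROMETHEE II net flow: higher is better
print(np.argsort(-phi_net))           # ranking of procedures, best first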
The Shock and Vibration Digest. Volume 15, Number 3
1983-03-01
High Temperature Gas-Cooled Reactor Core with Block-type Fuel (2nd Report: An Analytical Method of Two-dimensional Vibration of Interacting Columns) ... Computer-aided techniques, Design techniques ... A suite of computer programs has been developed which allows advanced fatigue analysis procedures to be ... values with those developed by bearing analysis computer programs were used to formulate an understanding of the mechanisms that induce ball skidding
On analytic modeling of lunar perturbations of artificial satellites of the earth
NASA Astrophysics Data System (ADS)
Lane, M. T.
1989-06-01
Two different procedures for analytically modeling the effects of the moon's direct gravitational force on artificial earth satellites are discussed from theoretical and numerical viewpoints. One is developed using classical series expansions of inclination and eccentricity for both the satellite and the moon, and the other employs the method of averaging. Both solutions are seen to have advantages, but it is shown that while the former is more accurate in special situations, the latter is quicker and more practical for the general orbit determination problem where observed data are used to correct the orbit in near real time.
NASA Technical Reports Server (NTRS)
Elishakoff, Isaac; Lin, Y. K.; Zhu, Li-Ping; Fang, Jian-Jie; Cai, G. Q.
1994-01-01
This report supplements a previous report of the same title submitted in June, 1992. It summarizes additional analytical techniques which have been developed for predicting the response of linear and nonlinear structures to noise excitations generated by large propulsion power plants. The report is divided into nine chapters. The first two deal with incomplete knowledge of boundary conditions of engineering structures. The incomplete knowledge is characterized by a convex set, and its diagnosis is formulated as a multi-hypothesis discrete decision-making algorithm with attendant criteria of adaptive termination.
MS-based analytical methodologies to characterize genetically modified crops.
García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro
2011-01-01
The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations that have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need for new and more powerful analytical methods to face the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMO analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview of genetically modified crop development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops are critically discussed, including "-omics" approaches and target-based approaches. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered to be essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart. Copyright © 2010 Wiley Periodicals, Inc.
Dwyer, Johanna T.; Picciano, Mary Frances; Betz, Joseph M.; Fisher, Kenneth D.; Saldanha, Leila G.; Yetley, Elizabeth A.; Coates, Paul M.; Radimer, Kathy; Bindewald, Bernadette; Sharpless, Katherine E.; Holden, Joanne; Andrews, Karen; Zhao, Cuiwei; Harnly, James; Wolf, Wayne R.; Perry, Charles R.
2013-01-01
Several activities of the Office of Dietary Supplements (ODS) at the National Institutes of Health involve enhancement of dietary supplement databases. These include an initiative with US Department of Agriculture to develop an analytically substantiated dietary supplement ingredient database (DSID) and collaboration with the National Center for Health Statistics to enhance the dietary supplement label database in the National Health and Nutrition Examination Survey (NHANES). The many challenges that must be dealt with in developing an analytically supported DSID include categorizing product types in the database, identifying nutrients, and other components of public health interest in these products and prioritizing which will be entered in the database first. Additional tasks include developing methods and reference materials for quantifying the constituents, finding qualified laboratories to measure the constituents, developing appropriate sample handling procedures, and finally developing representative sampling plans. Developing the NHANES dietary supplement label database has other challenges such as collecting information on dietary supplement use from NHANES respondents, constant updating and refining of information obtained, developing default values that can be used if the respondent cannot supply the exact supplement or strength that was consumed, and developing a publicly available label database. Federal partners and the research community are assisting in making an analytically supported dietary supplement database a reality. PMID:25309034
Median of patient results as a tool for assessment of analytical stability.
Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György
2015-06-15
In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
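A minimal sketch of the patient-median monitoring idea is given below on synthetic data: monthly medians are compared with a baseline median and flagged when the relative deviation exceeds an allowable-bias specification. The analyte, values and bias limit are illustrative assumptions, not the paper's data.

import numpy as np
import pandas as pd

# Synthetic daily patient results for one analyte, with a drift simulated
# from May onwards; monthly medians are then checked against a bias limit.
rng = np.random.default_rng(3)
dates = pd.date_range("2024-01-01", periods=180, freq="D")
results = pd.Series(rng.normal(140.0, 4.0, len(dates)), index=dates)
results.loc["2024-05":] += 2.5                     # simulated analytical drift

target = results.loc["2024-01":"2024-03"].median() # baseline median
allowable_bias_pct = 0.9                           # assumed desirable bias spec, %

monthly = results.groupby(results.index.to_period("M")).median()
bias_pct = 100 * (monthly - target) / target
print(pd.DataFrame({"median": monthly.round(1),
                    "bias_%": bias_pct.round(2),
                    "flag": bias_pct.abs() > allowable_bias_pct}))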
Recommendations for clinical biomarker specimen preservation and stability assessments.
Dakappagari, Naveen; Zhang, Hui; Stephen, Laurie; Amaravadi, Lakshmi; Khan, Masood U
2017-04-01
With the wide use of biomarkers to enable critical drug-development decisions, there is growing concern in the scientific community about the need for a 'standardized process' for ensuring biomarker specimen stability and hence, a strong desire to share best practices on preserving the integrity of biomarker specimens in clinical trials and on the design of studies to evaluate analyte stability. By leveraging representative industry experience, we have attempted to provide an overview of critical aspects of biomarker specimen stability commonly encountered during clinical development, including: planning of clinical sample collection procedures, clinical site training, selection of sample preservation buffers, shipping logistics, fit-for-purpose stability assessments in the analytical laboratory and presentation of case studies covering widely utilized biomarker specimen types.
Analysis of high-aspect-ratio jet-flap wings of arbitrary geometry
NASA Technical Reports Server (NTRS)
Lissaman, P. B. S.
1973-01-01
An analytical technique to compute the performance of an arbitrary jet-flapped wing is developed. The solution technique is based on the method of Maskell and Spence in which the well-known lifting-line approach is coupled with an auxiliary equation providing the extra function needed in jet-flap theory. The present method is generalized to handle straight, uncambered wings of arbitrary planform, twist, and blowing (including unsymmetrical cases). An analytical procedure is developed for continuous variations in the above geometric data with special functions to exactly treat discontinuities in any of the geometric and blowing data. A rational theory for the effect of finite wing thickness is introduced as well as simplified concepts of effective aspect ratio for rapid estimation of performance.
Determination of Ivermectin in Medicated Feeds by Liquid Chromatography with Fluorescence Detection
2013-01-01
A labour- and time-effective analytical procedure for the determination of ivermectin in medicated feed at the recommended level of 2.0 mg kg−1 has been developed and validated. The analyte was extracted from ground feed samples with acetonitrile and derivatised with N-methylimidazole and trifluoroacetic anhydride. The fluorescent derivatives were analysed by a liquid chromatography method using a C8 column. Isocratic conditions using acetonitrile, methanol, water, and tetrahydrofuran were applied. Fluorescence detection was performed at 365 nm (excitation) and 475 nm (emission) wavelengths. The total analysis time was 10 min. The validation results of the method (within-laboratory reproducibility 4.0% CV, mean recovery 100.1%) confirm the appropriate precision and accuracy of the developed method. PMID:24453835
Advanced Noise Abatement Procedures for a Supersonic Business Jet
NASA Technical Reports Server (NTRS)
Berton, Jeffrey J.; Jones, Scott M.; Seidel, Jonathan A.; Huff, Dennis L.
2017-01-01
Supersonic civil aircraft present a unique noise certification challenge. High specific thrust required for supersonic cruise results in high engine exhaust velocity and high levels of jet noise during takeoff. Aerodynamics of thin, low-aspect-ratio wings equipped with relatively simple flap systems deepen the challenge. Advanced noise abatement procedures have been proposed for supersonic aircraft. These procedures promise to reduce airport noise, but they may require departures from normal reference procedures defined in noise regulations. The subject of this report is a takeoff performance and noise assessment of a notional supersonic business jet. Analytical models of an airframe and a supersonic engine derived from a contemporary subsonic turbofan core are developed. These models are used to predict takeoff trajectories and noise. Results indicate advanced noise abatement takeoff procedures are helpful in reducing noise along lateral sidelines.
Procedures For Microbial-Ecology Laboratory
NASA Technical Reports Server (NTRS)
Huff, Timothy L.
1993-01-01
The Microbial Ecology Laboratory Procedures Manual provides concise and well-defined instructions on routine technical procedures to be followed in the microbiological laboratory to ensure safety, analytical control, and validity of results.
Rice, Michael; Gladstone, William; Weir, Michael
2004-01-01
We discuss how relational databases constitute an ideal framework for representing and analyzing large-scale genomic data sets in biology. As a case study, we describe a Drosophila splice-site database that we recently developed at Wesleyan University for use in research and teaching. The database stores data about splice sites computed by a custom algorithm using Drosophila cDNA transcripts and genomic DNA and supports a set of procedures for analyzing splice-site sequence space. A generic Web interface permits the execution of the procedures with a variety of parameter settings and also supports custom structured query language queries. Moreover, new analytical procedures can be added by updating special metatables in the database without altering the Web interface. The database provides a powerful setting for students to develop informatic thinking skills. PMID:15592597
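The sketch below illustrates, in SQLite, the metatable idea described above: analytical procedures are stored as rows of parameterised SQL so that new analyses can be added without changing the interface code. All table names, columns and example records are invented for illustration and do not reproduce the Wesleyan database schema.

import sqlite3

# Hypothetical sketch of a procedure "metatable": the interface layer lists and
# runs whatever parameterised SQL is registered here, so adding an analysis is
# just an INSERT, with no change to the interface code.

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE splice_site (
    id INTEGER PRIMARY KEY,
    gene TEXT, site_type TEXT, score REAL, sequence TEXT
);
CREATE TABLE analysis_procedure (
    name TEXT PRIMARY KEY,
    description TEXT,
    sql_template TEXT
);
""")
con.execute("INSERT INTO analysis_procedure VALUES (?, ?, ?)",
            ("high_scoring_donors",
             "Donor sites with score above a user-supplied threshold",
             "SELECT gene, score FROM splice_site "
             "WHERE site_type = 'donor' AND score > :threshold "
             "ORDER BY score DESC"))
con.executemany("INSERT INTO splice_site VALUES (NULL, ?, ?, ?, ?)",
                [("dpp", "donor", 0.97, "CAG|GTAAGT"),
                 ("dpp", "acceptor", 0.88, "TTTTCAG|GT"),
                 ("wg", "donor", 0.64, "AAG|GTATGA")])

template, = con.execute("SELECT sql_template FROM analysis_procedure "
                        "WHERE name = 'high_scoring_donors'").fetchone()
print(con.execute(template, {"threshold": 0.9}).fetchall())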
NASA Technical Reports Server (NTRS)
Kvaternik, R. G.
1975-01-01
Two computational procedures for analyzing complex structural systems for their natural modes and frequencies of vibration are presented. Both procedures are based on a substructures methodology and both employ the finite-element stiffness method to model the constituent substructures. The first procedure is a direct method based on solving the eigenvalue problem associated with a finite-element representation of the complete structure. The second procedure is a component-mode synthesis scheme in which the vibration modes of the complete structure are synthesized from modes of substructures into which the structure is divided. The analytical basis of the methods contains a combination of features which enhance the generality of the procedures. The computational procedures exhibit a unique utilitarian character with respect to the versatility, computational convenience, and ease of computer implementation. The computational procedures were implemented in two special-purpose computer programs. The results of the application of these programs to several structural configurations are shown and comparisons are made with experiment.
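As a minimal illustration of the direct method mentioned above, the sketch below solves the generalized eigenvalue problem K x = w^2 M x for a small lumped-parameter stand-in for a finite-element stiffness and mass model; the three-degree-of-freedom chain and its parameter values are assumptions for demonstration only.

import numpy as np
from scipy.linalg import eigh

# Minimal sketch of the "direct method": solve K x = w^2 M x for a 3-DOF chain
# of unit masses and springs standing in for the stiffness and mass matrices
# of a complete finite-element model (all values assumed).

k = 1.0e4   # spring stiffness, N/m
m = 1.0     # lumped mass, kg

K = k * np.array([[ 2.0, -1.0,  0.0],
                  [-1.0,  2.0, -1.0],
                  [ 0.0, -1.0,  1.0]])
M = m * np.eye(3)

eigvals, modes = eigh(K, M)                 # ascending w^2 and mass-orthonormal modes
freqs_hz = np.sqrt(eigvals) / (2 * np.pi)   # natural frequencies in Hz
print(np.round(freqs_hz, 2))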
SAM Companion Documents and Sample Collection Procedures provide information intended to complement the analytical methods listed in Selected Analytical Methods for Environmental Remediation and Recovery (SAM).
A Big Data Analytics Methodology Program in the Health Sector
ERIC Educational Resources Information Center
Lawler, James; Joseph, Anthony; Howell-Barber, H.
2016-01-01
The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…
Prioritizing pesticide compounds for analytical methods development
Norman, Julia E.; Kuivila, Kathryn; Nowell, Lisa H.
2012-01-01
The U.S. Geological Survey (USGS) has a periodic need to re-evaluate pesticide compounds in terms of priorities for inclusion in monitoring and studies and, thus, must also assess the current analytical capabilities for pesticide detection. To meet this need, a strategy has been developed to prioritize pesticides and degradates for analytical methods development. Screening procedures were developed to separately prioritize pesticide compounds in water and sediment. The procedures evaluate pesticide compounds in existing USGS analytical methods for water and sediment and compounds for which recent agricultural-use information was available. Measured occurrence (detection frequency and concentrations) in water and sediment, predicted concentrations in water and predicted likelihood of occurrence in sediment, potential toxicity to aquatic life or humans, and priorities of other agencies or organizations, regulatory or otherwise, were considered. Several existing strategies for prioritizing chemicals for various purposes were reviewed, including those that identify and prioritize persistent, bioaccumulative, and toxic compounds, and those that determine candidates for future regulation of drinking-water contaminants. The systematic procedures developed and used in this study rely on concepts common to many previously established strategies. The evaluation of pesticide compounds resulted in the classification of compounds into three groups: Tier 1 for high priority compounds, Tier 2 for moderate priority compounds, and Tier 3 for low priority compounds. For water, a total of 247 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods for monitoring and studies. Of these, about three-quarters are included in some USGS analytical method; however, many of these compounds are included on research methods that are expensive and for which there are few data on environmental samples. The remaining quarter of Tier 1 compounds are high priority as new analytes. The objective for analytical methods development is to design an integrated analytical strategy that includes as many of the Tier 1 pesticide compounds as possible in a relatively few, cost-effective methods. More than 60 percent of the Tier 1 compounds are high priority because they are anticipated to be present at concentrations approaching levels that could be of concern to human health or aquatic life in surface water or groundwater. An additional 17 percent of Tier 1 compounds were frequently detected in monitoring studies, but either were not measured at levels potentially relevant to humans or aquatic organisms, or do not have benchmarks available with which to compare concentrations. The remaining 21 percent are pesticide degradates that were included because their parent pesticides were in Tier 1. Tier 1 pesticide compounds for water span all major pesticide use groups and a diverse range of chemical classes, with herbicides and their degradates composing half of compounds. Many of the high priority pesticide compounds also are in several national regulatory programs for water, including those that are regulated in drinking water by the U.S. Environmental Protection Agency under the Safe Drinking Water Act and those that are on the latest Contaminant Candidate List. For sediment, a total of 175 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods available for monitoring and studies. 
More than 60 percent of these compounds are included in some USGS analytical method; however, some are spread across several research methods that are expensive to perform, and monitoring data are not extensive for many compounds. The remaining Tier 1 compounds for sediment are high priority as new analytes. The objective for analytical methods development for sediment is to enhance an existing analytical method that currently includes nearly half of the pesticide compounds in Tier 1 by adding as many additional Tier 1 compounds as are analytically compatible. About 35 percent of the Tier 1 compounds for sediment are high priority on the basis of measured occurrence. A total of 74 compounds, or 42 percent, are high priority on the basis of predicted likelihood of occurrence according to physical-chemical properties, and either have potential toxicity to aquatic life, high pesticide usage, or both. The remaining 22 percent of Tier 1 pesticide compounds were either degradates of Tier 1 parent compounds or included for other reasons. As with water, the Tier 1 pesticide compounds for sediment are distributed across the major pesticide-use groups; insecticides and their degradates are the largest fraction, making up 45 percent of Tier 1. In contrast to water, organochlorines, at 17 percent, are the largest chemical class for Tier 1 in sediment, which is to be expected because there is continued widespread detection in sediments of persistent organochlorine pesticides and their degradates at concentrations high enough for potential effects on aquatic life. Compared to water, there are fewer available benchmarks with which to compare contaminant concentrations in sediment, but a total of 19 Tier 1 compounds have at least one sediment benchmark or screening value for aquatic organisms. Of the 175 compounds in Tier 1, 77 percent have high aquatic-life toxicity, as defined for this process. This evaluation of pesticides and degradates resulted in two lists of compounds that are priorities for USGS analytical methods development, one for water and one for sediment. These lists will be used as the basis for redesigning and enhancing USGS analytical capabilities for pesticides in order to capture as many high-priority pesticide compounds as possible using an economically feasible approach.
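A schematic sketch of tier-style screening logic is shown below; the decision rules, field names and thresholds are hypothetical simplifications for illustration and are not the USGS criteria.

# Hypothetical sketch of tier screening; thresholds and fields are invented.

def assign_tier(compound: dict) -> int:
    """Classify a pesticide compound into Tier 1 (high), 2 (moderate) or 3 (low)."""
    near_benchmark = compound.get("max_conc_over_benchmark", 0.0) >= 0.1
    frequently_detected = compound.get("detection_freq_pct", 0.0) >= 10.0
    parent_in_tier1 = compound.get("parent_tier") == 1
    predicted_occurrence = compound.get("predicted_occurrence", False)
    high_use = compound.get("use_kg_per_yr", 0.0) >= 1.0e5

    if near_benchmark or frequently_detected or parent_in_tier1:
        return 1
    if predicted_occurrence or high_use:
        return 2
    return 3

print(assign_tier({"max_conc_over_benchmark": 0.3}))   # 1
print(assign_tier({"predicted_occurrence": True}))     # 2
print(assign_tier({}))                                 # 3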
NASA Technical Reports Server (NTRS)
Bodley, C. S.; Devers, A. D.; Park, A. C.
1975-01-01
Analytical procedures and digital computer code are presented for the dynamic analysis of a flexible spacecraft with rotating components. Topics considered include: (1) nonlinear response in the time domain, and (2) linear response in the frequency domain. The spacecraft is assumed to consist of an assembly of connected rigid or flexible subassemblies. The total system is not restricted to a topological connection arrangement and may be acting under the influence of passive or active control systems and external environments. The analytics and associated digital code provide the user with the capability to establish spacecraft system nonlinear total response for specified initial conditions, linear perturbation response about a calculated or specified nominal motion, general frequency response and graphical display, and spacecraft system stability analysis.
Automated Deployment of Advanced Controls and Analytics in Buildings
NASA Astrophysics Data System (ADS)
Pritoni, Marco
Buildings use 40% of primary energy in the US. Recent studies show that developing energy analytics and enhancing control strategies can significantly improve their energy performance. However, the deployment of advanced control software applications has been mostly limited to academic studies. Larger-scale implementations are prevented by the significant engineering time and customization required, due to significant differences among buildings. This study demonstrates how physics-inspired data-driven models can be used to develop portable analytics and control applications for buildings. Specifically, I demonstrate application of these models in all phases of the deployment of advanced controls and analytics in buildings: in the first phase, "Site Preparation and Interface with Legacy Systems" I used models to discover or map relationships among building components, automatically gathering metadata (information about data points) necessary to run the applications. During the second phase: "Application Deployment and Commissioning", models automatically learn system parameters, used for advanced controls and analytics. In the third phase: "Continuous Monitoring and Verification" I utilized models to automatically measure the energy performance of a building that has implemented advanced control strategies. In the conclusions, I discuss future challenges and suggest potential strategies for these innovative control systems to be widely deployed in the market. This dissertation provides useful new tools in terms of procedures, algorithms, and models to facilitate the automation of deployment of advanced controls and analytics and accelerate their wide adoption in buildings.
Formulation of blade-flutter spectral analyses in stationary reference frame
NASA Technical Reports Server (NTRS)
Kurkov, A. P.
1984-01-01
Analytic representations are developed for the discrete blade deflection and the continuous tip static pressure fields in a stationary reference frame. The sampling rates considered are equal to the rotational frequency, equal to the blade-passing frequency, and, for the pressure, equal to a multiple of the blade-passing frequency. For the last two rates, the expressions for determining the nodal diameters from the spectra are included. A procedure is presented for transforming the complete unsteady pressure field into a rotating frame of reference. The determination of the true flutter frequency by using two sensors is described. To illustrate their use, the developed procedures are used to interpret selected experimental results.
NASA Technical Reports Server (NTRS)
Hayes, J. D.
1972-01-01
The feasibility of monitoring volatile contaminants in a large space simulation chamber using techniques of internal reflection spectroscopy was demonstrated analytically and experimentally. The infrared spectral region was selected as the operational spectral range in order to provide unique identification of the contaminants along with sufficient sensitivity to detect trace contaminant concentrations. It was determined theoretically that a monolayer of the contaminants could be detected and identified using optimized experimental procedures. This ability was verified experimentally. Procedures were developed to correct the attenuated total reflectance spectra for thick sample distortion. However, by using two different element designs the need for such correction can be avoided.
Source-term development for a contaminant plume for use by multimedia risk assessment models
NASA Astrophysics Data System (ADS)
Whelan, Gene; McDonald, John P.; Taira, Randal Y.; Gnanapragasam, Emmanuel K.; Yu, Charley; Lew, Christine S.; Mills, William B.
2000-02-01
Multimedia modelers from the US Environmental Protection Agency (EPA) and US Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: MEPAS, MMSOILS, PRESTO, and RESRAD. These models represent typical analytically based tools that are used in human-risk and endangerment assessments at installations containing radioactive and hazardous contaminants. The objective is to demonstrate an approach for developing an adequate source term by simplifying an existing, real-world, 90Sr plume at DOE's Hanford installation in Richland, WA, for use in a multimedia benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. Source characteristics and a release mechanism are developed and described; also described is a typical process and procedure that an analyst would follow in developing a source term for using this class of analytical tool in a preliminary assessment.
Sokoliess, Torsten; Köller, Gerhard
2005-06-01
A chiral capillary electrophoresis system allowing the determination of the enantiomeric purity of an investigational new drug was developed using a generic method development approach for basic analytes. The method was optimized in terms of type and concentration of both cyclodextrin (CD) and electrolyte, buffer pH, temperature, voltage, and rinsing procedure. Optimal chiral separation of the analyte was obtained using an electrolyte with 2.5% carboxymethyl-beta-CD in 25 mM NaH2PO4 (pH 4.0). Interchanging the inlet and outlet vials after each run improved the method's precision. To assure the method's suitability for the control of enantiomeric impurities in pharmaceutical quality control, its specificity, linearity, precision, accuracy, and robustness were validated according to the requirements of the International Conference on Harmonization. The usefulness of our generic method development approach for the validation of robustness was demonstrated.
Jędrkiewicz, Renata; Tsakovski, Stefan; Lavenu, Aurore; Namieśnik, Jacek; Tobiszewski, Marek
2018-02-01
A novel methodology for grouping and ranking with application of self-organizing maps and multicriteria decision analysis is presented. The dataset consists of 22 objects that are analytical procedures applied to furan determination in food samples. They are described by 10 variables referring to their analytical performance and to environmental and economic aspects. Multivariate statistical analysis makes it possible to limit the amount of input data for the ranking analysis. The assessment results show that the most beneficial procedures are based on microextraction techniques with GC-MS final determination. It is shown how the information obtained from the two tools is complementary. The applicability of the combination of grouping and ranking is also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
William D. Smith; Barbara L. Conkling
2004-01-01
This report focuses on the Forest Health Monitoring Program's development and use of analytical procedures for monitoring changes in forest health and for expressing the corresponding statistical confidences. The program's assessments of long-term status, changes, and trends in forest ecosystem health use the Santiago Declaration: "Criteria and Indicators for the...
ERIC Educational Resources Information Center
Obradovic, Jelena; Yousafzai, Aisha K.; Finch, Jenna E.; Rasheed, Muneera A.
2016-01-01
This study contributes to the understanding of how early parenting interventions implemented in low- and middle-income countries during the first 2 years of children's lives are sustained longitudinally to promote cognitive skills in preschoolers. We employed path analytic procedures to examine 2 family processes--the quality of home stimulation…
USDA-ARS?s Scientific Manuscript database
A constraint to growth of the commercial humic products industry has been the lack of a widely accepted procedure for determining humic acid and fulvic acid concentrations of the products, which has raised regulatory issues. On behalf of the U.S.-based Humic Products Trade Association, we developed ...
ERIC Educational Resources Information Center
Han, Jisu; Neuharth-Pritchett, Stacey
2015-01-01
This study examined interactions between preschool children and parents during shared book reading by analyzing parental self-report data. Using confirmatory factor analytic procedures and structural equation modeling, this study developed a scale measuring meaning-related and print-related reading interactions and examined their associations with…
A Factor Analytic Study of a Scale Designed to Measure Death Anxiety.
ERIC Educational Resources Information Center
Thorson, James A.; Perkins, Mark
A death anxiety scale developed in 1973 by Nehrke was administered to 655 adult subjects. Their responses were differentiated according to age, sex, race, and level of education. Data were also analyzed using the varimax rotated factor matrix procedure to determine significant factors that the scale was, in fact, measuring. Loadings on four…
QSPR studies on the photoinduced-fluorescence behaviour of pharmaceuticals and pesticides.
López-Malo, D; Bueso-Bordils, J I; Duart, M J; Alemán-López, P A; Martín-Algarra, R V; Antón-Fos, G M; Lahuerta-Zamora, L; Martínez-Calatayud, J
2017-07-01
Fluorimetric analysis is still a growing line of research in the determination of a wide range of organic compounds, including pharmaceuticals and pesticides, which makes necessary the development of new strategies aimed at improving the performance of fluorescence determinations as well as the sensitivity and, especially, the selectivity of newly developed analytical methods. This paper presents applications of a useful and growing tool for fostering and improving research in the analytical field. Experimental screening, molecular connectivity and discriminant analysis are applied to organic compounds to predict their fluorescent behaviour after photodegradation by UV irradiation in a continuous-flow manifold (multicommutation flow assembly). The screening was based on online fluorimetric measurement and comprised pre-selected compounds with different molecular structures (pharmaceuticals and some pesticides with known 'native' fluorescent behaviour) to study their changes in fluorescent behaviour after UV irradiation. The theoretical predictions agree with the results of the experimental screening and could be used to develop selective analytical methods, as well as helping to reduce the need for expensive, time-consuming and trial-and-error screening procedures.
Design Criteria for Low Profile Flange Calculations
NASA Technical Reports Server (NTRS)
Leimbach, K. R.
1973-01-01
An analytical method and a design procedure to develop flanged separable pipe connectors are discussed. A previously established algorithm is the basis for calculating low profile flanges. The characteristics and advantages of the low profile flange are analyzed. The use of aluminum, titanium, and plastics for flange materials is described. Mathematical models are developed to show the mechanical properties of various flange configurations. A computer program for determining the structural stability of the flanges is described.
Pellegrino Vidal, Rocío B; Allegrini, Franco; Olivieri, Alejandro C
2018-03-20
Multivariate curve resolution-alternating least-squares (MCR-ALS) is the model of choice when dealing with some non-trilinear arrays, specifically when the data are of chromatographic origin. To drive the iterative procedure towards chemically interpretable solutions, the use of constraints becomes essential. In this work, both simulated and experimental data have been analyzed by MCR-ALS, applying chemically reasonable constraints and investigating the relationship between selectivity, analytical sensitivity (γ) and the root mean square error of prediction (RMSEP). As the selectivity in the instrumental modes decreased, the estimated values of γ did not fully represent the predictive capabilities of the model, as judged from the obtained RMSEP values. Since the available sensitivity expressions have been developed by error propagation theory for unconstrained systems, there is a need to develop new expressions or analytical indicators. These should consider not only the specific profiles retrieved by MCR-ALS, but also the constraints under which those profiles have been obtained. Copyright © 2017 Elsevier B.V. All rights reserved.
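For concreteness, the sketch below computes the two figures of merit discussed above, the analytical sensitivity γ (taken here as the ratio of a sensitivity parameter to the instrumental noise standard deviation) and the RMSEP over a validation set, using invented numbers rather than the study's data.

import numpy as np

# Illustrative sketch (synthetic numbers): RMSEP over a validation set and a
# generic analytical sensitivity gamma = SEN / s(noise); 1/gamma is often read
# as the smallest concentration difference the method can distinguish.

y_ref  = np.array([1.0, 2.0, 3.0, 4.0, 5.0])       # reference concentrations
y_pred = np.array([1.1, 1.9, 3.2, 3.8, 5.1])       # model predictions (hypothetical)

rmsep = np.sqrt(np.mean((y_pred - y_ref) ** 2))

SEN = 0.85          # hypothetical sensitivity (signal per unit concentration)
s_noise = 0.02      # hypothetical instrumental noise standard deviation
gamma = SEN / s_noise

print(f"RMSEP = {rmsep:.3f}, gamma = {gamma:.0f}, 1/gamma = {1 / gamma:.4f}")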
Applications of computer algebra to distributed parameter systems
NASA Technical Reports Server (NTRS)
Storch, Joel A.
1993-01-01
In the analysis of vibrations of continuous elastic systems, one often encounters complicated transcendental equations with roots directly related to the system's natural frequencies. Typically, these equations contain system parameters whose values must be specified before a numerical solution can be obtained. The present paper presents a method whereby the fundamental frequency can be obtained in analytical form to any desired degree of accuracy. The method is based upon truncation of rapidly converging series involving inverse powers of the system natural frequencies. A straightforward method for developing these series and summing them in closed form is presented. It is demonstrated how computer algebra can be exploited to perform the intricate analytical procedures which would otherwise render the technique difficult to apply in practice. The method is illustrated by developing two analytical approximations to the fundamental frequency of a vibrating cantilever carrying a rigid tip body. The results are compared to the numerical solution of the exact (transcendental) frequency equation over a range of system parameters.
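As a small related illustration (not the paper's series method), the sketch below uses SymPy to solve the classical transcendental frequency equation of a plain cantilever, cos(x)cosh(x) + 1 = 0, and converts the first root into a fundamental frequency for an assumed steel strip; all section properties are invented.

import math
import sympy as sp

# Minimal sketch for a plain cantilever (no tip body): roots x_i = beta_i * L of
# cos(x)*cosh(x) + 1 = 0 give natural frequencies
# w_i = x_i**2 * sqrt(E*I / (rho*A*L**4)). All numbers below are assumed.

x = sp.symbols("x", positive=True)
freq_eq = sp.cos(x) * sp.cosh(x) + 1

x1 = float(sp.nsolve(freq_eq, x, 1.8))                   # fundamental root, ~1.8751
E, I, rho, A, L = 2.1e11, 8.3e-9, 7850.0, 1.0e-4, 1.0    # assumed steel strip, SI units
w1 = x1 ** 2 * math.sqrt(E * I / (rho * A * L ** 4))     # rad/s
print(round(x1, 4), round(w1 / (2 * math.pi), 2))        # beta1*L and f1 in Hz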
Experimental performance and acoustic investigation of modern, counterrotating blade concepts
NASA Technical Reports Server (NTRS)
Hoff, G. E.
1990-01-01
The aerodynamic, acoustic, and aeromechanical performance of counterrotating blade concepts were evaluated both theoretically and experimentally. Analytical methods development and design are addressed. Utilizing the analytical methods which evolved during the conduct of this work, aerodynamic and aeroacoustic predictions were developed, which were compared to NASA and GE wind tunnel test results. The detailed mechanical design and fabrication of five different composite shell/titanium spar counterrotating blade set configurations are presented. Design philosophy, analyses methods, and material geometry are addressed, as well as the influence of aerodynamics, aeromechanics, and aeroacoustics on the design procedures. Blade fabrication and quality control procedures are detailed; bench testing procedures and results of blade integrity verification are presented; and instrumentation associated with the bench testing also is identified. Additional hardware to support specialized testing is described, as are operating blade instrumentation and the associated stress limits. The five counterrotating blade concepts were scaled to a tip diameter of 2 feet, so they could be incorporated into MPS (model propulsion simulators). Aerodynamic and aeroacoustic performance testing was conducted in the NASA Lewis 8 x 6 supersonic and 9 x 15 V/STOL (vertical or short takeoff and landing) wind tunnels and in the GE freejet anechoic test chamber (Cell 41) to generate an experimental data base for these counterrotating blade designs. Test facility and MPS vehicle matrices are provided, and test procedures are presented. Effects on performance of rotor-to-rotor spacing, angle-of-attack, pylon proximity, blade number, reduced-diameter aft blades, and mismatched rotor speeds are addressed. Counterrotating blade and specialized aeromechanical hub stability test results are also furnished.
Sverko, Ed
2006-01-01
Analytical methods for the analysis of polychlorinated biphenyls (PCBs) and organochlorine pesticides (OCPs) are widely available and are the result of a vast amount of environmental analytical method development and research on persistent organic pollutants (POPs) over the past 30–40 years. This review summarizes procedures and examines new approaches for extraction, isolation, identification and quantification of individual congeners/isomers of the PCBs and OCPs. Critical to the successful application of this methodology is the collection, preparation, and storage of samples, as well as specific quality control and reporting criteria, and therefore these are also discussed. With the signing of the Stockholm Convention on POPs and the development of global monitoring programs, there is an increased need for laboratories in developing countries to determine PCBs and OCPs. Thus, while this review attempts to summarize the current best practices for analysis of PCBs and OCPs, a major focus is the need for low-cost methods that can be easily implemented in developing countries. A "performance based" process is described whereby individual laboratories can adapt the methods best suited to their situations. Access to modern capillary gas chromatography (GC) equipment with either electron capture or low-resolution mass spectrometry (MS) detection to separate and quantify OCPs/PCBs is essential. However, screening of samples, especially in areas of known use of OCPs or PCBs, could be accomplished with bioanalytical methods such as specific commercially available enzyme-linked immunosorbent assays, and thus this topic is also reviewed. New analytical techniques such as two-dimensional GC (2D-GC) and "fast GC" using GC–ECD may be well-suited for broader use in routine PCB/OCP analysis in the near future given their relatively low costs and ability to provide high-resolution separations of PCBs/OCPs. Procedures with low environmental impact (SPME, microscale, low solvent use, etc.) are increasingly being used and may be particularly suited to developing countries. Electronic supplementary material is available in the online version of this article at http://dx.doi.org/10.1007/s00216-006-0765-y and is accessible for authorized users. PMID:17047943
Guided-inquiry laboratory experiments to improve students' analytical thinking skills
NASA Astrophysics Data System (ADS)
Wahyuni, Tutik S.; Analita, Rizki N.
2017-12-01
This study aims to improve the quality of experiment implementation and the analytical thinking skills of undergraduate students through guided-inquiry laboratory experiments. This study was a classroom action research conducted in three cycles. The study was carried out with 38 undergraduate students in the second semester of the Biology Education Department of the State Islamic Institute (SII) of Tulungagung, as part of a Chemistry for Biology course. The research instruments were lesson plans, learning observation sheets and the undergraduate students' experimental procedure. Research data were analyzed using a quantitative-descriptive method. The increase in analytical thinking skills was measured using the normalized gain score and a statistical paired t-test. The results showed that the guided-inquiry laboratory experiment model was able to improve both the quality of experiment implementation and the analytical thinking skills. The N-gain score for analytical thinking skills increased, although only by 0.03 (low-gain category), as indicated by the experimental reports. Some undergraduate students had difficulty detecting the relation of one part to another and to the overall structure. The findings suggested that giving feedback on procedural knowledge and experimental reports was important, and that the experimental procedure needed to be revised and supplemented with scaffolding questions.
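As a rough illustration of the statistics mentioned in the abstract above (a minimal sketch with hypothetical scores; the study's actual rubric and data are not reproduced here), the normalized gain and paired t-test could be computed as follows:

```python
# Minimal sketch: normalized gain (N-gain) and a paired t-test for
# hypothetical pre-/post-test analytical-thinking scores on a 0-100 scale.
import numpy as np
from scipy import stats

pre = np.array([55, 60, 48, 70, 65], dtype=float)   # hypothetical pre-test scores (%)
post = np.array([58, 66, 50, 74, 71], dtype=float)  # hypothetical post-test scores (%)

# Normalized gain: mean actual gain divided by the maximum possible gain.
n_gain = (post.mean() - pre.mean()) / (100.0 - pre.mean())

# Paired t-test on the individual pre/post differences.
t_stat, p_value = stats.ttest_rel(post, pre)

print(f"N-gain = {n_gain:.2f}  (values below 0.3 are conventionally 'low')")
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
```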
Remane, Daniela; Wissenbach, Dirk K; Meyer, Markus R; Maurer, Hans H
2010-04-15
In clinical and forensic toxicology, multi-analyte procedures are very useful to quantify drugs and poisons of different classes in one run. For liquid chromatographic/tandem mass spectrometric (LC/MS/MS) multi-analyte procedures, often only a limited number of stable-isotope-labeled internal standards (SIL-ISs) are available. If an SIL-IS is used for quantification of other analytes, it must be excluded that the co-eluting native analyte influences its ionization. Therefore, the effect of ion suppression and enhancement of fourteen SIL-ISs caused by their native analogues has been studied. It could be shown that the native analyte concentration influenced the extent of ion suppression and enhancement effects leading to more suppression with increasing analyte concentration especially when electrospray ionization (ESI) was used. Using atmospheric-pressure chemical ionization (APCI), methanolic solution showed mainly enhancement effects, whereas no ion suppression and enhancement effect, with one exception, occurred when plasma extracts were used under these conditions. Such differences were not observed using ESI. With ESI, eleven SIL-ISs showed relevant suppression effects, but only one analyte showed suppression effects when APCI was used. The presented study showed that ion suppression and enhancement tests using matrix-based samples of different sources are essential for the selection of ISs, particularly if used for several analytes to avoid incorrect quantification. In conclusion, only SIL-ISs should be selected for which no suppression and enhancement effects can be observed. If not enough ISs are free of ionization interferences, a different ionization technique should be considered. 2010 John Wiley & Sons, Ltd.
Parametric study of minimum reactor mass in energy-storage dc-to-dc converters
NASA Technical Reports Server (NTRS)
Wong, R. C.; Owen, H. A., Jr.; Wilson, T. G.
1981-01-01
Closed-form analytical solutions for the design equations of a minimum-mass reactor for a two-winding voltage-or-current step-up converter are derived. A quantitative relationship between the three parameters - minimum total reactor mass, maximum output power, and switching frequency - is extracted from these analytical solutions. The validity of the closed-form solution is verified by a numerical minimization procedure. A computer-aided design procedure using commercially available toroidal cores and magnet wires is also used to examine how the results from practical designs follow the predictions of the analytical solutions.
X-Graphs: Language and Algorithms for Heterogeneous Graph Streams
2017-09-01
The report's front matter lists an introduction followed by a chapter on methods, assumptions, and procedures covering software abstractions for graph analytic applications and high-performance platforms for graph processing, with data stored in a distributed file system. The methods include implementations of novel techniques for network analysis: several methods for detection of overlapping communities, personalized PageRank, and node embeddings.
Gao, Le; Li, Jian; Wu, Yandan; Yu, Miaohao; Chen, Tian; Shi, Zhixiong; Zhou, Xianqing; Sun, Zhiwei
2016-11-01
Two simple and efficient pretreatment procedures have been developed for the simultaneous extraction and cleanup of six novel brominated flame retardants (NBFRs) and eight common polybrominated diphenyl ethers (PBDEs) in human serum. The first sample pretreatment procedure was a quick, easy, cheap, effective, rugged, and safe (QuEChERS)-based approach. An acetone/hexane mixture was employed to isolate the lipid and analytes from the serum with a combination of MgSO4 and NaCl, followed by a dispersive solid-phase extraction (d-SPE) step using C18 particles as a sorbent. The second sample pretreatment procedure was based on solid-phase extraction. The sample extraction and cleanup were conducted directly on an Oasis HLB SPE column using 5% aqueous isopropanol, concentrated sulfuric acid, and 10% aqueous methanol, followed by elution with dichloromethane. The NBFRs and PBDEs were then detected using gas chromatography-negative chemical ionization mass spectrometry (GC-NCI MS). The methods were assessed for repeatability, accuracy, selectivity, limits of detection (LODs), and linearity. The results of spike recovery experiments in fetal bovine serum showed that average recoveries ranged from 77.9% to 128.8% with relative standard deviations (RSDs) from 0.73% to 12.37% for most of the analytes. The LODs for the analytes in fetal bovine serum ranged from 0.3 to 50.8 pg/mL except for decabromodiphenyl ethane. The proposed method was successfully applied to the determination of the 14 brominated flame retardants in human serum. The two pretreatment procedures described here are simple, accurate, and precise, and are suitable for the routine analysis of human serum. Graphical Abstract Workflow of a QuEChERS-based approach (top) and an SPE-based approach (bottom) for the detection of PBDEs and NBFRs in serum.
NASA Technical Reports Server (NTRS)
Burrows, Leroy T.
1993-01-01
During the 1960s, over 30 full-scale aircraft crash tests were conducted by the Flight Safety Foundation under contract to the Aviation Applied Technology Directorate (AATD) of the U.S. Army Aviation Systems Command (AVSCOM). The purpose of these tests was to conduct crash injury investigations that would provide a basis for the formulation of sound crash resistance design criteria for light fixed-wing and rotary-wing aircraft. This resulted in the Crash Survival Design Criteria Designer's Guide, which was first published in 1967 and has been revised numerous times, most recently in 1989. Full-scale aircraft crash testing is an expensive way to investigate structural deformations of occupied spaces and to determine the decelerative loadings experienced by occupants in a crash. This gave initial impetus to the U.S. Army to develop analytical methods to predict the dynamic response of aircraft structures in a crash. It was believed that such analytical tools could be very useful in the preliminary design stage of a new helicopter system that is required to demonstrate a level of crash resistance, and that they would be more cost effective than full-scale crash tests or numerous component design support tests. From an economic point of view, it is more efficient to optimize for the incorporation of crash resistance features early in the design stage. However, during preliminary design it is doubtful that sufficient design details, which influence the exact plastic deformation shape of structural elements, will be available. The availability of simple procedures to predict energy absorption and load-deformation characteristics will allow the designer to initiate valuable cost, weight, and geometry tradeoff studies. The development of these procedures will require some testing of typical specimens. This testing should, as a minimum, verify the validity of proposed procedures for providing pertinent nonlinear load-deformation data. It was hoped that, through the use of these analytical models, the designer could optimize aircraft design for crash resistance from both a weight and cost increment standpoint, thus enhancing the acceptance of the design criteria for crash resistance.
Mazzarino, Monica; Cesarei, Lorenzo; de la Torre, Xavier; Fiacco, Ilaria; Robach, Paul; Botrè, Francesco
2016-01-05
This work presents an analytical method for the simultaneous analysis in human urine of 38 pharmacologically active compounds (19 benzodiazepine-like substances, 7 selective serotonin reuptake inhibitors, 4 azole antifungal drugs, 5 inhibitors of the phosphodiesterases type 4 and 3 inhibitors of the phosphodiesterase type 5) by liquid chromatography coupled with tandem mass spectrometry. The above substance classes include both the most common "non-banned" drugs used by athletes (based on the information reported on the "doping control form") and those drugs that are suspected to be performance enhancing and/or to act as masking agents in particular conditions. The chromatographic separation was performed on a reverse-phase octadecyl column using acetonitrile and ultra-purified water, both with 0.1% formic acid, as mobile phases. The detection was carried out using a triple quadrupole mass spectrometric analyser, positive electrospray as the ionization source and selected reaction monitoring as the acquisition mode. Sample pre-treatment consisted of an enzymatic hydrolysis followed by a liquid-liquid extraction under neutral conditions using tert-butyl methyl ether. The analytical procedure, once developed, was validated in terms of sensitivity (lower limits of detection in the range of 1-50 ng mL(-1)), specificity (no interferences were detected at the retention times of the analytes under investigation), recovery (≥60% with a satisfactory repeatability, CV% lower than 10), matrix effect (lower than 30%) and reproducibility of retention times (CV% lower than 0.1) and of relative abundances (CV% lower than 15). The performance and applicability of the method were evaluated by analyzing real samples containing benzodiazepines (alprazolam, diazepam, zolpidem or zopiclone) or inhibitors of the phosphodiesterases type 5 (sildenafil or vardenafil), and samples obtained by incubating two of the phosphodiesterase type 4 inhibitors studied (cilomilast or roflumilast) with pooled human liver microsomes. All the parent compounds, together with their main phase I metabolites, were clearly detected using the analytical procedures developed here. Copyright © 2015 Elsevier B.V. All rights reserved.
Microchip integrating magnetic nanoparticles for allergy diagnosis.
Teste, Bruno; Malloggi, Florent; Siaugue, Jean-Michel; Varenne, Anne; Kanoufi, Frederic; Descroix, Stéphanie
2011-12-21
We report on the development of a simple and easy-to-use microchip dedicated to allergy diagnosis. This microchip combines the advantages of homogeneous immunoassays, i.e. species diffusion, and heterogeneous immunoassays, i.e. easy separation and preconcentration steps. In vitro allergy diagnosis is based on specific immunoglobulin E (IgE) quantitation; accordingly, we have developed and integrated magnetic core-shell nanoparticles (MCSNPs) as an IgE capture nanoplatform in a microdevice, taking benefit from both their magnetic and colloidal properties. Integrating such an immunosupport allows the target analyte (IgE) to be captured in the colloidal phase, thus increasing the analyte capture kinetics since both immunological partners diffuse during the immune reaction. This colloidal approach improves the analyte capture kinetics 1000-fold compared to conventional methods. Moreover, based on the MCSNPs' magnetic properties and on the magnetic chamber we have previously developed, the MCSNPs, and therefore the target, can be confined and preconcentrated within the microdevice prior to the detection step. The MCSNP preconcentration factor achieved was about 35,000, which allows high sensitivity to be reached, thus avoiding catalytic amplification during the detection step. The developed microchip offers many advantages: the analytical procedure was fully integrated on-chip, analyses were performed in a short assay time (20 min), and the sample and reagent consumption was reduced to a few microlitres (5 μL), while a low limit of detection can be achieved (about 1 ng mL(-1)).
Mantovani, Cínthia de Carvalho; Lima, Marcela Bittar; Oliveira, Carolina Dizioli Rodrigues de; Menck, Rafael de Almeida; Diniz, Edna Maria de Albuquerque; Yonamine, Mauricio
2014-04-15
A method using accelerated solvent extraction (ASE) for the isolation of cocaine/crack biomarkers in meconium samples, followed by solid-phase extraction (SPE) and simultaneous quantification by gas chromatography-mass spectrometry (GC-MS), was developed and validated. Initially, meconium samples were submitted to an ASE procedure, which was followed by SPE with Bond Elut Certify I cartridges. The analytes were derivatized with PFP/PFPA and analyzed by GC-MS. The limits of detection (LOD) were between 11 and 17 ng/g for all analytes. The limits of quantification (LOQ) were 30 ng/g for anhydroecgonine methyl ester, and 20 ng/g for cocaine, benzoylecgonine, ecgonine methyl ester and cocaethylene. Linearity ranged from the LOQ to 1500 ng/g for all analytes, with coefficients of determination greater than 0.991, except for m-hydroxybenzoylecgonine, which was only qualitatively detected. Precision and accuracy were evaluated at three concentration levels. For all analytes, inter-assay precision ranged from 3.2 to 18.1%, and intra-assay precision did not exceed 12.7%. The accuracy results were between 84.5 and 114.2%, and the average recovery ranged from 17 to 84%. The method was applied to 342 meconium samples randomly collected at the University Hospital-University of São Paulo (HU-USP), Brazil. Cocaine biomarkers were detected in 19 samples, which represents an exposure prevalence of 5.6%. Significantly lower birth weight, length and head circumference were found for the exposed newborns compared with the non-exposed group. This is the first report in which ASE was used as a sample preparation technique to extract cocaine biomarkers from a complex biological matrix such as meconium. The advantages of the developed method are the lower demand for organic solvents and the reduced sample handling, which allow a faster and accurate procedure, appropriate to confirm fetal exposure to cocaine/crack. Copyright © 2014 Elsevier B.V. All rights reserved.
Caprock Breach: A Threat to Secure Geologic Sequestration
NASA Astrophysics Data System (ADS)
Selvadurai, A. P.; Dong, W.
2013-12-01
The integrity of caprock in providing a reliable barrier is crucial to several environmental geosciences endeavours related to geologic sequestration of CO2, deep geologic disposal of hazardous wastes and contaminants. The integrity of geologic barriers can be compromised by several factors. The re-activation of dormant fractures and development of new fractures in the caprock during the injection process are regarded as effects that can pose a threat to storage security. Other poromechanical influences of pore structure collapse due to chemically induced erosion of the porous fabric resulting in worm-hole type features can also contribute to compromising storage security. The assessment of the rate of steady or transient seepage through defects in the caprock can allow geoscientists to make prudent evaluations of the effectiveness of a sequestration strategy. While complicated computational simulations can be used to calculate leakage through defects, it is useful to explore alternative analytical results that could be used in providing preliminary estimates of leakage rates through defects in the caprock in a storage setting. The relevance of such developments is underscored by the fact that the permeability characteristics of the storage formation, the fracture and the surficial rocks overlying the caprock can rarely be quantified with certainty. This paper presents the problem of a crack in a caprock that connects to a storage formation and an overburden rock or surficial soil formation. The geologic media are maintained at constant far-field flow potentials and leakage takes place at either steady or transient conditions. The paper develops an analytical result that can be used to estimate the steady seepage through the crack. The analytical result can also be used to estimate the leakage through hydraulically non-intersecting cracks and leakage from caprock-well casing interfaces. The analytical result is used to estimate the accuracy of a computational procedure based on a finite element procedure.
7 CFR 91.23 - Analytical methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
§ 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...
7 CFR 91.23 - Analytical methods.
Code of Federal Regulations, 2011 CFR
2011-01-01
§ 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...
7 CFR 91.23 - Analytical methods.
Code of Federal Regulations, 2013 CFR
2013-01-01
§ 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...
Code of Federal Regulations, 2012 CFR
2012-07-01
§ 1066.101 Overview. (a) This subpart addresses equipment related to emission testing, as well as test fuels and analytical gases. This section addresses emission sampling and analytical equipment, test fuels, and analytical gases. (b) The provisions of 40 CFR part 1065...
Code of Federal Regulations, 2013 CFR
2013-07-01
§ 1066.101 Overview. (a) This subpart addresses equipment related to emission testing, as well as test fuels and analytical gases. This section addresses emission sampling and analytical equipment, test fuels, and analytical gases. (b) The provisions of 40 CFR part 1065...
Analysis of the Effects of Surface Pitting and Wear on the Vibrations of a Gear Transmission System
NASA Technical Reports Server (NTRS)
Choy, F. K.; Polyshchuk, V.; Zakrajsek, J. J.; Handschuh, R. F.; Townsend, D. P.
1994-01-01
A comprehensive procedure to simulate and analyze the vibrations in a gear transmission system with surface pitting, 'wear' and partial tooth fracture of the gear teeth is presented. An analytical model was developed where the effects of surface pitting and wear of the gear tooth were simulated by phase and magnitude changes in the gear mesh stiffness. Changes in the gear mesh stiffness were incorporated into each gear-shaft model during the global dynamic simulation of the system. The overall dynamics of the system were evaluated by solving for the transient dynamics of each shaft system simultaneously with the vibration of the gearbox structure. In order to reduce the number of degrees-of-freedom in the system, a modal synthesis procedure was used in the global transient dynamic analysis of the overall transmission system. An FFT procedure was used to transform the averaged time signal into the frequency domain for signature analysis. In addition, the Wigner-Ville distribution was also introduced to examine the gear vibration in the joint time frequency domain for vibration pattern recognition. Experimental results obtained from a gear fatigue test rig at NASA Lewis Research Center were used to evaluate the analytical model.
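As a small illustration of the signature-analysis step described above (a minimal sketch, not the NASA analysis code; the sampling rate, mesh frequency and sideband values are assumed), an averaged time signal can be transformed to the frequency domain with an FFT as follows:

```python
# Minimal sketch: FFT-based spectrum of a simulated, time-averaged
# gear-vibration signal with a hypothetical mesh frequency and one sideband.
import numpy as np

fs = 10_000.0                          # sampling rate [Hz] (assumed)
t = np.arange(0, 1.0, 1.0 / fs)        # 1 s record
f_mesh = 500.0                         # hypothetical gear-mesh frequency [Hz]
signal = (np.sin(2 * np.pi * f_mesh * t)
          + 0.2 * np.sin(2 * np.pi * (f_mesh + 25.0) * t)  # sideband from a tooth fault
          + 0.05 * np.random.randn(t.size))                # measurement noise

spectrum = np.abs(np.fft.rfft(signal * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
print(f"dominant spectral line near {freqs[np.argmax(spectrum)]:.0f} Hz")
```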
NASA Astrophysics Data System (ADS)
Kolecki, J.
2015-12-01
The Bundlab software has been developed mainly for academic and research applications. This work can be treated as a kind of report describing the current state of development of this computer program, focusing especially on the analytical solutions. Firstly, the overall characteristics of the software are provided. Then the image orientation procedure, starting from the relative orientation, is described. The applied solution is based on the coplanarity equation parametrized with the essential matrix. The problem is reformulated so that it can be solved using methods of algebraic geometry. The solution is followed by an optimization based on the least-squares criterion. The formation of the image block from the oriented models, as well as the absolute orientation procedure, was implemented using the Horn approach as the base algorithm. The second part of the paper is devoted to the tools and methods applied in the stereo digitization module. The solutions that support the user and improve the accuracy are given. Within the paper a few exemplary applications and products are mentioned. The work finishes with concepts for the development and improvement of existing functions.
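For reference, the coplanarity (epipolar) condition in its standard essential-matrix form is recalled below; the notation is generic and assumed here, not taken from the Bundlab documentation.

```latex
% Coplanarity condition for normalized image coordinates x_1, x_2 of the
% same object point in the left and right images:
\mathbf{x}_2^{\top}\,\mathbf{E}\,\mathbf{x}_1 = 0,
\qquad
\mathbf{E} = [\mathbf{t}]_{\times}\,\mathbf{R},
\qquad
[\mathbf{t}]_{\times} =
\begin{pmatrix}
 0 & -t_z & t_y \\
 t_z & 0 & -t_x \\
 -t_y & t_x & 0
\end{pmatrix}.
```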
Shih, Tsung-Ting; Hsieh, Cheng-Chuan; Luo, Yu-Ting; Su, Yi-An; Chen, Ping-Hung; Chuang, Yu-Chen; Sun, Yuh-Chang
2016-04-15
Herein, a hyphenated system combining a high-throughput solid-phase extraction (htSPE) microchip with inductively coupled plasma-mass spectrometry (ICP-MS) for rapid determination of trace heavy metals was developed. Rather than performing multiple analyses in parallel for the enhancement of analytical throughput, we improved the processing speed for individual samples by increasing the operation flow rate during SPE procedures. To this end, an innovative device combining a micromixer and a multi-channeled extraction unit was designed. Furthermore, a programmable valve manifold was used to interface the developed microchip and ICP-MS instrumentation in order to fully automate the system, leading to a dramatic reduction in operation time and human error. Under the optimized operation conditions for the established system, detection limits of 1.64-42.54 ng L(-1) for the analyte ions were achieved. Validation procedures demonstrated that the developed method could be satisfactorily applied to the determination of trace heavy metals in natural water. Each analysis could be readily accomplished within just 186 s using the established system. This represents, to the best of our knowledge, an unprecedented speed for the analysis of trace heavy metal ions. Copyright © 2016 Elsevier B.V. All rights reserved.
Quantification of astaxanthin in shrimp waste hydrolysate by HPLC.
López-Cervantes, J; Sánchez-Machado, D I; Gutiérrez-Coronado, M A; Ríos-Vázquez, N J
2006-10-01
In the present study, a simple and rapid reversed-phase HPLC method for the determination of astaxanthin in shrimp waste hydrolysate has been developed and validated. The analytical procedure involves the direct extraction of astaxanthin from the lipid fraction with methanol. The analytical column, SS Exil ODS, was operated at 25 °C. The mobile phase consisted of a mixture of water:methanol:dichloromethane:acetonitrile (4.5:28:22:45.5 v/v/v/v) at a flow rate of 1.0 mL/min. Detection and identification were performed using a photodiode array detector (λdetection = 476 nm). The proposed HPLC method showed adequate linearity, repeatability and accuracy.
NASA Astrophysics Data System (ADS)
Muna, E. D. M.; Pereira, R. P.
2016-07-01
The determination of the volatile organic solvents dichloromethane (DCM), methyl isobutyl ketone (MIBK), tetrahydrofuran (THF) and toluene (TOL) is applied in the toxicological monitoring of employees in various industrial activities. Gas chromatography with flame ionization detection and a headspace injection system was applied. The analytical procedure developed allows the simultaneous determination of the above-mentioned solvents, and the accuracy of the method was tested following the INMETRO guidelines (DOQ-CGRE 008 Rev. 04, July/2011).
NASA Technical Reports Server (NTRS)
Giles, G. L.; Rogers, J. L., Jr.
1982-01-01
The methodology used to implement structural sensitivity calculations into a major, general-purpose finite-element analysis system (SPAR) is described. This implementation includes a generalized method for specifying element cross-sectional dimensions as design variables that can be used in analytically calculating derivatives of output quantities from static stress, vibration, and buckling analyses for both membrane and bending elements. Limited sample results for static displacements and stresses are presented to indicate the advantages of analytically calculating response derivatives compared to finite difference methods. Continuing developments to implement these procedures into an enhanced version of SPAR are also discussed.
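To illustrate why analytic sensitivities are attractive relative to finite differences (a toy sketch only, not the SPAR implementation; the axial-bar properties are assumed), consider the derivative of a static displacement with respect to a cross-sectional design variable:

```python
# Toy illustration: sensitivity of the tip displacement u = F*L/(E*A) of an
# axial bar with respect to its cross-sectional area A, computed analytically
# and by central finite differences.
F, L, E = 1.0e4, 2.0, 7.0e10          # load [N], length [m], modulus [Pa] (assumed)

def displacement(A):
    return F * L / (E * A)

A0 = 1.0e-4                            # design variable: area [m^2]
exact = -F * L / (E * A0 ** 2)         # analytic derivative du/dA
h = 1.0e-6 * A0                        # finite-difference step (choice affects accuracy)
fd = (displacement(A0 + h) - displacement(A0 - h)) / (2.0 * h)

print(f"analytic du/dA           : {exact:.6e}")
print(f"central finite difference: {fd:.6e}")
```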
Weber, Berthold; Hartmann, Beate; Stöckigt, Detlef; Schreiber, Klaus; Roloff, Michael; Bertram, Heinz-Jürgen; Schmidt, Claus O
2006-01-25
Liquid chromatography/mass spectrometry and liquid chromatography/nuclear magnetic resonance techniques with ultraviolet/diode array detection were used as complementary analytical tools for the reliable identification of polymethoxylated flavones in residues from molecular distillation of cold-pressed peel oils of Citrus sinensis. After development of a liquid chromatographic separation procedure, the presence of several polymethoxy flavones such as sinensetin, nobiletin, tangeretin, quercetogetin, heptamethoxyflavone, and other derivatives was unambiguously confirmed. In addition, proceranone, an acetylated tetranortriterpenoid with limonoid structure, was identified for the first time in citrus.
Delay-tunable gap-soliton-based slow-light system
NASA Astrophysics Data System (ADS)
Mok, Joe T.; de Sterke, C. Martijn; Eggleton, Benjamin J.
2006-12-01
We numerically and analytically evaluate the delay of solitons propagating slowly, and without broadening, in an apodized Bragg grating. Simulations indicate that a 100 mm Bragg grating with Δn = 10^-3 can delay sub-nanosecond pulses by nearly 20 pulse widths without any change in the output pulse width. Delay tunability is achieved by simultaneously adjusting the launch power and detuning. A simple analytic model is developed to describe the monotonic dependence of delay on Δn and compared with simulations. As the intensity may be greatly enhanced due to a reduced velocity, a procedure for improving the delay while avoiding material damage is outlined.
Harik-Khan, R; Moats, W A
1995-01-01
A procedure for identifying and quantitating violative beta-lactams in milk is described. This procedure integrates beta-lactam residue detection kits with the multiresidue automated liquid chromatographic (LC) cleanup method developed in our laboratory. Spiked milk was deproteinized, extracted, and subjected to reversed-phase LC using a gradient program that concentrated the beta-lactams. Amoxicillin, ampicillin, cephapirin, ceftiofur, cloxacillin, and penicillin G were, thus, separated into 5 fractions that were subsequently tested for activity by using 4 kits. beta-lactams in the positive fractions were quantitated by analytical LC methods developed in our laboratory. The LC cleanup method separated beta-lactam antibiotics from each other and from interferences in the matrix and also concentrated the antibiotics, thus increasing the sensitivity of the kits to the beta-lactam antibiotics. The procedure facilitated the task of identifying and measuring the beta-lactam antibiotics that may be present in milk samples.
Serpe, F P; Russo, R; Ambrosio, L; Esposito, M; Severino, L
2013-06-01
European Commission Regulation 882/2004/EC requires that official control laboratories for foodstuffs in the member states are certified according to UNI EN ISO/IEC 17025:2005 (general requirement for the competence of calibration and testing laboratories). This mandatory requirement has resulted in a continuous adaptation and development of analytical procedures. The aim of this study was to develop a method for semiquantitative screening of polychlorinated biphenyls in fish for human consumption. According to the Commission Decision 657/2002/CE, the detection capability, the precision, the selectivity-specificity, and applicability-ruggedness-stability were determined to validate the method. Moreover, trueness was verified. This procedure resulted in rapid execution, which allowed immediate and effective intervention by the local health authorities to protect the health of consumers. Finally, the procedure has been recognized by the Italian accrediting body, ACCREDIA.
Precession and circularization of elliptical space-tether motion
NASA Technical Reports Server (NTRS)
Chapel, Jim D.; Grosserode, Patrick
1993-01-01
In this paper, we present a simplified analytic model for predicting motion of long space tethers. The perturbation model developed here addresses skip rope motion, where each end of the tether is held in place and the middle of the tether swings with a motion similar to that of a child's skip rope. If the motion of the tether midpoint is elliptical rather than circular, precession of the ellipse complicates the procedures required to damp this motion. The simplified analytic model developed in this paper parametrically predicts the precession of elliptical skip rope motion. Furthermore, the model shows that elliptic skip rope motion will circularize when damping is present in the longitudinal direction. Compared with high-fidelity simulation results, this simplified model provides excellent predictions of these phenomena.
Genetics-based methods for detection of Salmonella spp. in foods.
Mozola, Mark A
2006-01-01
Genetic methods are now at the forefront of foodborne pathogen testing. The sensitivity, specificity, and inclusivity advantages offered by deoxyribonucleic acid (DNA) probe technology have driven an intense effort in methods development over the past 20 years. DNA probe-based methods for Salmonella spp. and other pathogens have progressed from time-consuming procedures involving the use of radioisotopes to simple, high throughput, automated assays. The analytical sensitivity of nucleic acid amplification technology has facilitated a reduction in analysis time by allowing enriched samples to be tested for previously undetectable quantities of analyte. This article will trace the evolution of the development of genetic methods for detection of Salmonella in foods, review the basic assay formats and their advantages and limitations, and discuss method performance characteristics and considerations for selection of methods.
Li, Zhigang; Ji, Cheng; Wang, Lishu
2018-07-01
Although analytical models have been used to quickly predict head response under impact conditions, existing models generally treat the head as a regular shell with uniform thickness, which cannot account for the actual head geometry, with cranial thickness and curvature varying across locations. The objective of this study is to develop and validate an analytical model incorporating actual cranial thickness and curvature for children aged 0-1 years old (YO) and to investigate their effects on child head dynamic responses at different head locations. To develop the new analytical model, the child head was simplified into an irregular fluid-filled shell with non-uniform thickness, and the cranial thickness and curvature at different locations were automatically obtained from CT scans using a procedure developed in this study. The implicit equation for the maximum impact force was derived as a function of the elastic modulus, thickness and radius of curvature of the cranium. The proposed analytical model is compared with cadaver test data of children aged 0-1 years old and is shown to be accurate in predicting head injury metrics. According to this model, obvious differences in injury metrics were observed among subjects of the same age but with different cranial thickness and curvature, and the injury metrics at the forehead location are significantly higher than those at other locations owing to the larger thickness there. The proposed model shows good biofidelity and can be used to quickly predict the dynamic response at any head location for children younger than 1 YO. Copyright © 2018 Elsevier B.V. All rights reserved.
Lab-on-chip systems for integrated bioanalyses
Madaboosi, Narayanan; Soares, Ruben R.G.; Fernandes, João Tiago S.; Novo, Pedro; Moulas, Geraud; Chu, Virginia
2016-01-01
Biomolecular detection systems based on microfluidics are often called lab-on-chip systems. To fully benefit from the miniaturization resulting from microfluidics, one aims to develop ‘from sample-to-answer’ analytical systems, in which the input is a raw or minimally processed biological, food/feed or environmental sample and the output is a quantitative or qualitative assessment of one or more analytes of interest. In general, such systems will require the integration of several steps or operations to perform their function. This review will discuss these stages of operation, including fluidic handling, which assures that the desired fluid arrives at a specific location at the right time and under the appropriate flow conditions; molecular recognition, which allows the capture of specific analytes at precise locations on the chip; transduction of the molecular recognition event into a measurable signal; sample preparation upstream from analyte capture; and signal amplification procedures to increase sensitivity. Seamless integration of the different stages is required to achieve a point-of-care/point-of-use lab-on-chip device that allows analyte detection at the relevant sensitivity ranges, with a competitive analysis time and cost. PMID:27365042
7 CFR 90.2 - General terms defined.
Code of Federal Regulations, 2011 CFR
2011-01-01
... agency, or other agency, organization or person that defines in the general terms the basis on which the... analytical data using proficiency check sample or analyte recovery techniques. In addition, the certainty.... Quality control. The system of close examination of the critical details of an analytical procedure in...
ERIC Educational Resources Information Center
Cepriá, Gemma; Salvatella, Luis
2014-01-01
All pH calculations for simple acid-base systems used in introductory courses on general or analytical chemistry can be carried out by using a general procedure requiring the use of predominance diagrams. In particular, the pH is calculated as the sum of an independent term equaling the average pK[subscript a] values of the acids involved in the…
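One familiar illustration of the average-pKa idea (a textbook shortcut, not the full predominance-diagram procedure of the cited article): for a solution of an amphiprotic species such as hydrogen carbonate,

```latex
\mathrm{pH} \;\approx\; \tfrac{1}{2}\bigl(\mathrm{p}K_{a1} + \mathrm{p}K_{a2}\bigr)
          \;=\; \tfrac{1}{2}\,(6.35 + 10.33) \;\approx\; 8.3
```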
PET imaging for receptor occupancy: meditations on calculation and simplification.
Zhang, Yumin; Fox, Gerard B
2012-03-01
This invited mini-review briefly summarizes procedures and challenges of measuring receptor occupancy with positron emission tomography. Instead of describing the detailed analytic procedures of in vivo ligand-receptor imaging, the authors provide a pragmatic approach, along with personal perspectives, for conducting positron emission tomography imaging for receptor occupancy, and systematically elucidate the mathematics of receptor occupancy calculations in practical ways that can be understood with elementary algebra. The authors also share insights regarding positron emission tomography imaging for receptor occupancy to facilitate applications for the development of drugs targeting receptors in the central nervous system.
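A commonly used form of the occupancy calculation summarized above, written here with the non-displaceable binding potential (BP_ND) as the outcome measure (the numerical values are illustrative only):

```latex
\mathrm{RO} \;=\; \frac{BP_{ND}^{\text{baseline}} - BP_{ND}^{\text{post-dose}}}
                       {BP_{ND}^{\text{baseline}}} \times 100\%,
\qquad \text{e.g.}\quad \frac{1.8 - 0.6}{1.8}\times 100\% \;\approx\; 67\%.
```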
Caron, Alexandre; Chazard, Emmanuel; Muller, Joris; Perichon, Renaud; Ferret, Laurie; Koutkias, Vassilis; Beuscart, Régis; Beuscart, Jean-Baptiste; Ficheur, Grégoire
2017-03-01
The significant risk of adverse events following medical procedures supports a clinical epidemiological approach based on the analysis of collections of electronic medical records. Data analytical tools might help clinical epidemiologists develop more appropriate case-crossover designs for monitoring patient safety. Our objective was to develop and assess the methodological quality of an interactive tool for use by clinical epidemiologists to systematically design case-crossover analyses of large electronic medical records databases. We developed IT-CARES, an analytical tool implementing the case-crossover design, to explore the association between exposures and outcomes. The exposures and outcomes are defined by clinical epidemiologists via lists of codes entered through a user interface screen. We tested IT-CARES on data from the French national inpatient stay database, which documents diagnoses and medical procedures for 170 million inpatient stays between 2007 and 2013. We compared the results of our analysis with reference data from the literature on thromboembolic risk after delivery and bleeding risk after total hip replacement. IT-CARES provides a user interface with three columns: (i) the outcome criteria in the left-hand column, (ii) the exposure criteria in the right-hand column, and (iii) the estimated risk (odds ratios, presented in both graphical and tabular formats) in the middle column. The estimated odds ratios were consistent with the reference literature data. IT-CARES may enhance patient safety by facilitating clinical epidemiological studies of adverse events following medical procedures. The tool's usability must be evaluated and improved in further research. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.
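For orientation, a minimal sketch of the kind of case-crossover estimate such a tool reports (not the IT-CARES implementation; the discordant-pair counts are hypothetical): with one case window and one matched control window per patient, the conditional odds ratio reduces to the ratio of discordant exposure counts.

```python
# Minimal sketch: 1:1 matched case-crossover odds ratio from discordant
# exposure patterns, with a Wald confidence interval on the log scale.
import math

exposed_case_only = 42      # exposed in the case window only (hypothetical count)
exposed_control_only = 15   # exposed in the control window only (hypothetical count)

odds_ratio = exposed_case_only / exposed_control_only
se_log_or = math.sqrt(1.0 / exposed_case_only + 1.0 / exposed_control_only)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```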
ERIC Educational Resources Information Center
Homem, Vera; Alves, Arminda; Santos, Lu´cia
2014-01-01
A laboratory application with a strong component in analytical chemistry was designed for undergraduate students, in order to introduce a current problem in the environmental science field, the water contamination by antibiotics. Therefore, a simple and rapid method based on direct injection and high performance liquid chromatography-tandem mass…
ERIC Educational Resources Information Center
Ramos, Luiz Antonio; Prieto, Katia Roberta; Carvalheiro, Eder Tadeu Gomes; Carvalheiro, Carla Cristina Schmitt
2005-01-01
The application of crude flower extracts to the principles of analytical chemistry automation is described, with a flow injection analysis (FIA) procedure developed to determine hypochlorite in household bleaching products. FIA comprises a group of techniques based on injection of a liquid sample into a moving, nonsegmented carrier stream of a…
ERIC Educational Resources Information Center
Moskovkin, Vladimir M.; Bocharova, Emilia A.; Balashova, Oksana V.
2014-01-01
Purpose: The purpose of this paper is to introduce and develop the methodology of journal benchmarking. Design/Methodology/Approach: The journal benchmarking method is understood to be an analytic procedure of continuous monitoring and comparing of the advance of specific journal(s) against that of competing journals in the same subject area,…
Method development is ongoing in the Region 10 Laboratory. The conditions of the separation technique are complete. The MDLs have been determined to be between 3 and 6 ppt in marine sediment for co-planar PCB congeners #77, #81, #126 and #169. The procedure has been used fo...
ERIC Educational Resources Information Center
Nugent, William Robert; Moore, Matthew; Story, Erin
2015-01-01
The standardized mean difference (SMD) is perhaps the most important meta-analytic effect size. It is typically used to represent the difference between treatment and control population means in treatment efficacy research. It is also used to represent differences between populations with different characteristics, such as persons who are…
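As a brief illustration of the effect size discussed above (a minimal sketch with made-up data; Cohen's d with a pooled standard deviation is one common estimator of the SMD):

```python
# Minimal sketch: standardized mean difference (Cohen's d) between a
# treatment and a control group, using the pooled standard deviation.
import numpy as np

treatment = np.array([23.0, 25.5, 28.1, 22.4, 26.7, 24.9])  # hypothetical scores
control   = np.array([20.1, 21.6, 19.8, 22.0, 20.9, 21.4])  # hypothetical scores

n1, n2 = treatment.size, control.size
pooled_sd = np.sqrt(((n1 - 1) * treatment.var(ddof=1) +
                     (n2 - 1) * control.var(ddof=1)) / (n1 + n2 - 2))
smd = (treatment.mean() - control.mean()) / pooled_sd
print(f"SMD (Cohen's d) = {smd:.2f}")
```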
Tian, Weijun; Bai, Jie; Sun, Huimei; Zhao, Yangguo
2013-01-30
Sustainability assessments of coastal beach exploitation are difficult because the identification of appropriate monitoring methodologies and evaluation procedures is still ongoing. In particular, the most suitable procedure for the application of sustainability assessment to coastal beaches remains uncertain. This paper presents a complete sustainability assessment process for coastal beach exploitation based on the analytic hierarchy process (AHP). We developed an assessment framework consisting of 14 indicators derived from the three dimensions of suitability, economic and social value, and ecosystem. We chose a wind power project on a coastal beach of Yancheng as a case study. The results indicated that the wind power farms on the coastal beach were not completely in keeping with sustainable development theory. The construction of the wind power farms had some negative impacts. Therefore, in the design stage, wind turbines should be designed and planned carefully to minimize these negative impacts. In addition, the case study demonstrated that the AHP was capable of addressing the complexities associated with the sustainability of coastal beaches. Copyright © 2012 Elsevier Ltd. All rights reserved.
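For readers unfamiliar with the AHP weighting step used in the framework above, a minimal sketch follows (the 3 x 3 pairwise-comparison matrix over the three dimensions is hypothetical, not the paper's data): priority weights are taken from the principal eigenvector and checked with Saaty's consistency ratio.

```python
# Minimal sketch: AHP priority weights from a pairwise-comparison matrix via
# the principal eigenvector, plus the consistency ratio (CR).
import numpy as np

A = np.array([[1.0, 3.0, 2.0],     # e.g. suitability vs. economic/social vs. ecosystem
              [1/3, 1.0, 0.5],
              [0.5, 2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                    # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # normalized priority weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)           # consistency index
cr = ci / 0.58                                 # Saaty's random index RI = 0.58 for n = 3
print("weights:", np.round(weights, 3), "  CR =", round(cr, 3))
```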
Solid sorbent air sampling and analytical procedure for methyl-, dimethyl-, ethyl-, and diethylamine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elskamp, C.J.; Schultz, G.R.
1986-01-01
A sampling and analytical procedure for methyl-, dimethyl-, ethyl-, and diethylamine was developed in order to avoid problems typically encountered in the sampling and analysis of low molecular weight aliphatic amines. Samples are collected with adsorbent tubes containing Amberlite XAD-7 resin coated with the derivatizing reagent, NBD chloride (7-chloro-4-nitrobenzo-2-oxa-1,3-diazole). Analysis is performed by high performance liquid chromatography with the use of a fluorescence and/or UV/visible detector. All four amines can be monitored simultaneously, and neither collection nor storage is affected by humidity. Samples are stable at room temperature for at least two weeks. The methodology has been tested for each of the four amines at sample loadings equivalent to air concentration ranges of 0.5 to 30 ppm for a sample volume of 10 liters. The method shows promise for determining other airborne primary and secondary low molecular weight aliphatic amines.
Togola, Anne; Coureau, Charlotte; Guezennec, Anne-Gwenaëlle; Touzé, Solène
2015-05-01
The presence of acrylamide in natural systems is of concern from both environmental and health points of view. We developed an accurate and robust analytical procedure (offline solid-phase extraction combined with UPLC/MS/MS) with a limit of quantification (20 ng L(-1)) compatible with toxicity threshold values. The solid-phase extraction (SPE) step, optimized with respect to the nature of the extraction phases, sampling volumes and elution solvent, was validated according to ISO Standard ISO/IEC 17025 on groundwater, surface water, and industrial process water samples. Acrylamide is highly polar, which induces a high variability during the SPE step, therefore requiring the use of 13C-labeled acrylamide as an internal standard to guarantee the accuracy and robustness of the method (uncertainty about 25% (k = 2) at the limit of quantification level). The specificity of the method and the stability of acrylamide were studied for these environmental media, and it was shown that the method is suitable for measuring acrylamide in environmental studies.
LaBudde, Robert A; Harnly, James M
2012-01-01
A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
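As a small illustration of the POI statistic described above (a minimal sketch; the replicate counts are hypothetical and the collaborative study designs themselves are not reproduced):

```python
# Minimal sketch: probability of identification (POI) as the proportion of
# replicates returning "Identified", with a Wilson score confidence interval.
import math

identified, replicates = 11, 12
p = identified / replicates                    # observed POI
z = 1.96                                       # approx. 95% confidence

denom = 1.0 + z**2 / replicates
center = (p + z**2 / (2 * replicates)) / denom
half = z * math.sqrt(p * (1 - p) / replicates + z**2 / (4 * replicates**2)) / denom
print(f"POI = {p:.2f}, 95% CI = ({center - half:.2f}, {center + half:.2f})")
```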
Dynamics and Control of Flexible Space Vehicles
NASA Technical Reports Server (NTRS)
Likins, P. W.
1970-01-01
The purpose of this report is twofold: (1) to survey the established analytic procedures for the simulation of controlled flexible space vehicles, and (2) to develop in detail methods that employ a combination of discrete and distributed ("modal") coordinates, i.e., the hybrid-coordinate methods. Analytic procedures are described in three categories: (1) discrete-coordinate methods, (2) hybrid-coordinate methods, and (3) vehicle normal-coordinate methods. Each of these approaches is described and analyzed for its advantages and disadvantages, and each is found to have an area of applicability. The hybrid-coordinate method combines the efficiency of the vehicle normal-coordinate method with the versatility of the discrete-coordinate method, and appears to have the widest range of practical application. The results in this report have practical utility in two areas: (1) complex digital computer simulation of flexible space vehicles of arbitrary configuration subject to realistic control laws, and (2) preliminary control system design based on transfer functions for linearized models of dynamics and control laws.
Olkowska, Ewa; Polkowska, Żaneta; Namieśnik, Jacek
2013-11-15
A new analytical procedure for the simultaneous determination of individual cationic surfactants (alkyl benzyl dimethyl ammonium chlorides) in surface water samples has been developed. We describe this methodology for the first time: it involves the application of solid phase extraction (SPE, for sample preparation) coupled with ion chromatography-conductivity detection (IC-CD, for the final determination). Mean recoveries of analytes between 79% and 93%, and overall method quantification limits in the range from 0.0018 to 0.038 μg/mL, were achieved for surface water and CRM samples. The methodology was applied to the determination of individual alkyl benzyl quaternary ammonium compounds in environmental samples (reservoir water) and enables their presence in such types of waters to be confirmed. In addition, it is a simpler, less time-consuming and less labour-intensive methodology that avoids the use of toxic chloroform and is significantly less expensive than previously described approaches (liquid-liquid extraction coupled with liquid chromatography-mass spectrometry). Copyright © 2013 Elsevier B.V. All rights reserved.
Séby, F; Castetbon, A; Ortega, R; Guimon, C; Niveau, F; Barrois-Oudin, N; Garraud, H; Donard, O F X
2008-05-01
The European directive 2000/53/EC limits the use of Cr(VI) in vehicle manufacturing. Although a maximum of 2 g of Cr(VI) was authorised per vehicle for corrosion prevention coatings of key components, since July 2007 its use has been prohibited except for some particular applications. Therefore, the objective of this work was to develop direct analytical procedures for Cr(VI) determination in the different steel coatings used for screws. Instead of working directly with screws, the optimisation of the procedures was carried out with homogeneously coated metallic plates to improve data comparability. Extraction of Cr(VI) from the metallic parts was performed by sonication. Two extraction solutions were tested: a direct water extraction solution used in standard protocols and an ammonium/ammonia buffer solution at pH 8.9. The extracts were further analysed for Cr speciation by high-performance liquid chromatography (HPLC) inductively coupled plasma (ICP) atomic emission spectrometry or HPLC ICP mass spectrometry, depending on the concentration level. When possible, the coatings were also directly analysed by solid speciation techniques (X-ray photoelectron spectroscopy, XPS, and X-ray absorption near-edge structure, XANES) for validation of the results. Very good agreement between the different analytical approaches was obtained for the coating sample made up of a heated paint containing Zn, Al and Cr when using the extraction buffer solution at pH 8.9. After a repeated four-step extraction procedure on the same test portion, and taking into account the depth of the surface layer reached, good agreement with the XPS and XANES results was obtained. In contrast, for the coatings composed of an alkaline Zn layer on which Cr(VI) and Cr(III) are deposited, only the extraction procedure using water allowed the detection of Cr(VI). To elucidate the Cr(VI) reduction during extraction at pH 8.9, the reactivity of Cr(VI) towards different Zn species generally present in the coatings (metallic Zn and zinc oxide) was studied. The results showed that metallic Zn rapidly reduces Cr(VI), whereas this reaction is less evident in the presence of zinc oxide. Water was therefore retained for coatings containing metallic Zn.
Comparison of three multiplex cytokine analysis systems: Luminex, SearchLight and FAST Quant.
Lash, Gendie E; Scaife, Paula J; Innes, Barbara A; Otun, Harry A; Robson, Steven C; Searle, Roger F; Bulmer, Judith N
2006-02-20
Multiplex cytokine analysis technologies have become readily available in the last five years. Two main formats exist: multiplex sandwich ELISA and bead based assays. While these have each been compared to individual ELISAs, there has been no direct comparison between the two formats. We report here the comparison of two multiplex sandwich ELISA procedures (FAST Quant and SearchLight) and a bead based assay (UpState Luminex). All three kits differed from each other for different analytes and there was no clear pattern of one system giving systematically different results than another for any analyte studied. We suggest that each system has merits and several factors including range of analytes available, prospect of development of new analytes, dynamic range of the assay, sensitivity of the assay, cost of equipment, cost of consumables, ease of use and ease of data analysis need to be considered when choosing a system for use. We also suggest that results obtained from different systems cannot be combined.
Nine-analyte detection using an array-based biosensor
NASA Technical Reports Server (NTRS)
Taitt, Chris Rowe; Anderson, George P.; Lingerfelt, Brian M.; Feldstein, s. Mark. J.; Ligler, Frances S.
2002-01-01
A fluorescence-based multianalyte immunosensor has been developed for simultaneous analysis of multiple samples. While the standard 6 x 6 format of the array sensor has been used to analyze six samples for six different analytes, this same format has the potential to allow a single sample to be tested for 36 different agents. The method described herein demonstrates proof of principle that the number of analytes detectable using a single array can be increased simply by using complementary mixtures of capture and tracer antibodies. Mixtures were optimized to allow detection of closely related analytes without significant cross-reactivity. Following this facile modification of patterning and assay procedures, the following nine targets could be detected in a single 3 x 3 array: Staphylococcal enterotoxin B, ricin, cholera toxin, Bacillus anthracis Sterne, Bacillus globigii, Francisella tularensis LVS, Yersinia pestis F1 antigen, MS2 coliphage, and Salmonella typhimurium. This work maximizes the efficiency and utility of the described array technology, increasing only reagent usage and cost; production and fabrication costs are not affected.
A behavior-analytic critique of Bandura's self-efficacy theory
Biglan, Anthony
1987-01-01
A behavior-analytic critique of self-efficacy theory is presented. Self-efficacy theory asserts that efficacy expectations determine approach behavior and physiological arousal of phobics as well as numerous other clinically important behaviors. Evidence which is purported to support this assertion is reviewed. The evidence consists of correlations between self-efficacy ratings and other behaviors. Such response-response relationships do not unequivocally establish that one response causes another. A behavior-analytic alternative to self-efficacy theory explains these relationships in terms of environmental events. Correlations between self-efficacy rating behavior and other behavior may be due to the contingencies of reinforcement that establish a correspondence between such verbal predictions and the behavior to which they refer. Such a behavior-analytic account does not deny any of the empirical relationships presented in support of self-efficacy theory, but it points to environmental variables that could account for those relationships and that could be manipulated in the interest of developing more effective treatment procedures. PMID:22477956
Discordance between net analyte signal theory and practical multivariate calibration.
Brown, Christopher D
2004-08-01
Lorber's concept of net analyte signal is reviewed in the context of classical and inverse least-squares approaches to multivariate calibration. It is shown that, in the presence of device measurement error, the classical and inverse calibration procedures have radically different theoretical prediction objectives, and the assertion that the popular inverse least-squares procedures (including partial least squares, principal components regression) approximate Lorber's net analyte signal vector in the limit is disproved. Exact theoretical expressions for the prediction error bias, variance, and mean-squared error are given under general measurement error conditions, which reinforce the very discrepant behavior between these two predictive approaches, and Lorber's net analyte signal theory. Implications for multivariate figures of merit and numerous recently proposed preprocessing treatments involving orthogonal projections are also discussed.
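For context, Lorber's net analyte signal is the part of the analyte's pure-component spectrum orthogonal to the space spanned by the interferent spectra; a minimal numerical sketch with synthetic Gaussian bands (not data from the paper) is given below.

```python
# Minimal sketch: net analyte signal (NAS) of analyte k as the orthogonal
# projection of its pure spectrum s_k away from the interferent subspace.
import numpy as np

wavelengths = np.linspace(0.0, 1.0, 50)
band = lambda center, width: np.exp(-((wavelengths - center) / width) ** 2)

s_k = band(0.45, 0.08)                                 # pure analyte spectrum (synthetic)
S_other = np.column_stack([band(0.30, 0.10),           # interferent 1 (synthetic)
                           band(0.65, 0.12)])          # interferent 2 (synthetic)

P_orth = np.eye(wavelengths.size) - S_other @ np.linalg.pinv(S_other)
nas = P_orth @ s_k                                     # net analyte signal vector
print(f"||s_k|| = {np.linalg.norm(s_k):.3f}, ||NAS|| = {np.linalg.norm(nas):.3f}")
```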
NASA Astrophysics Data System (ADS)
Hsieh, Y.-H. Peggy
Immunochemistry is a relatively new science that has developed rapidly in the last few decades. One of the most useful analytical developments associated with this new science is immunoassay. Originally immunoassays were developed in medical settings to facilitate the study of immunology, particularly the antibody-antigen interaction. Immunoassays now are finding widespread applications outside the clinical field because they are appropriate for a wide range of analytes ranging from proteins to small organic molecules. In the food analysis area, immunoassays are widely used for chemical residue analysis, identification of bacteria and viruses, and detection of proteins in food and agricultural products. Protein detection is important for determination of allergens and meat species content, seafood species identification, and detection of genetically modified plant tissues. While immunoassays of all formats are too numerous to cover completely in this chapter, there are several procedures that have become standard for food analysis because of their specificity, sensitivity, and simplicity.
Boyer, Chantal; Gaudin, Karen; Kauss, Tina; Gaubert, Alexandra; Boudis, Abdelhakim; Verschelden, Justine; Franc, Mickaël; Roussille, Julie; Boucher, Jacques; Olliaro, Piero; White, Nicholas J.; Millet, Pascal; Dubost, Jean-Pierre
2012-01-01
Near infrared spectroscopy (NIRS) methods were developed for the determination of the analytical content of an antimalarial-antibiotic (artesunate and azithromycin) co-formulation in hard gelatin capsules (HGC). The NIRS approach consists of pre-processing treatment of the spectra (raw spectra and first derivatives of two spectral zones), a single principal component analysis model to ensure specificity, and two partial least-squares regression models for the content determination of each active pharmaceutical ingredient. The NIRS methods were developed and validated without a reference method, since the manufacturing process of the HGC is essentially the mixing of excipients with the active pharmaceutical ingredients. The accuracy profiles showed β-expectation tolerance limits within the acceptance limits (±5%). The analytical control approach performed by reversed-phase HPLC required two different methods, involving two different sample preparations and chromatographic conditions. NIRS offers advantages in terms of lower equipment and procedural costs, time saving, and environmental friendliness. PMID:22579599
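The chemometric chain described (one principal component model for specificity, then one PLS model per active ingredient) can be mocked up in a few lines. The spectra and reference values below are random stand-ins, not the study's data, and the component counts are arbitrary:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
X = rng.random((40, 700))            # pre-processed NIR spectra (one row per capsule)
y_art = 95 + 10 * rng.random(40)     # artesunate content, % of target
y_azi = 95 + 10 * rng.random(40)     # azithromycin content, % of target

pca = PCA(n_components=3).fit(X)     # specificity check: does a new spectrum
                                     # fit inside the calibration space?
pls_art = PLSRegression(n_components=5).fit(X, y_art)
pls_azi = PLSRegression(n_components=5).fit(X, y_azi)

x_new = rng.random((1, 700))
print(pls_art.predict(x_new), pls_azi.predict(x_new))
```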
A design procedure for a tension-wire stiffened truss-column
NASA Technical Reports Server (NTRS)
Greene, W. H.
1980-01-01
A deployable, tension wire stiffened, truss column configuration was considered for space structure applications. An analytical procedure, developed for design of the truss column and exercised in numerical studies, was based on equivalent beam stiffness coefficients in the classical analysis for an initially imperfect beam column. Failure constraints were formulated to be used in a combined weight/strength and nonlinear mathematical programming automated design procedure to determine the minimum mass column for a particular combination of design load and length. Numerical studies gave the mass characteristics of the truss column for broad ranges of load and length. Comparisons of the truss column with a baseline tubular column used a special structural efficiency parameter for this class of columns.
Busatto, Zenaís; da Silva, Agnaldo Fernando Baldo; de Freitas, Osvaldo; Paschoal, Jonas Augusto Rizzato
2017-04-01
This paper describes the development of analytical methods for the quantification of albendazole (ABZ) in fish feed and of ABZ and its main known metabolites (albendazole sulfoxide, albendazole sulfone and albendazole aminosulfone) in fish fillet employing LC-MS/MS. In order to assess the reliability of the analytical methods, evaluation was undertaken as recommended by the related guides proposed by the Brazilian Ministry of Agriculture for analytical method validation. The calibration curve for ABZ quantification in feed showed adequate linearity (r > 0.99), precision (CV < 1.03%) and trueness ranging from 99% to 101%. The method for ABZ residues in fish fillet, involving the QuEChERS technique for sample extraction, had adequate linearity (r > 0.99) for all analytes, precision (CV < 13%) and trueness around 100%, with CCα < 122 ng g(-1) and CCβ < 145 ng g(-1). In addition, aiming to avoid the risk of ABZ leaching from feed into the aquatic environment during fish medication via the oral route, a promising procedure for drug incorporation in the feed, involving coating feed pellets with ethyl cellulose polymer containing ABZ, was also evaluated. The medicated feed had good homogeneity (CV < 3%) and a lower release of ABZ (< 0.2%) from feed to water when the medicated feed stayed in the water for up to 15 min.
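The reported figures of merit (linearity r, trueness, CV) follow from straightforward calculations on spiked samples. A small sketch with invented numbers, included only to make the definitions concrete:

```python
import numpy as np

# Hypothetical spiked calibration for ABZ (concentration units arbitrary)
nominal  = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
measured = np.array([51.0,  99.0, 203.0, 396.0, 805.0])

slope, intercept = np.polyfit(nominal, measured, 1)
r = np.corrcoef(nominal, measured)[0, 1]      # linearity: want r > 0.99
trueness = 100 * measured / nominal           # recovery at each level, %

# Precision (CV, %) from replicate measurements at a single level
replicates = np.array([101.2, 100.4, 99.8, 100.9, 100.1])
cv = 100 * replicates.std(ddof=1) / replicates.mean()

print(f"r = {r:.4f}, trueness (%) = {np.round(trueness, 1)}, CV = {cv:.2f}%")
```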
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method... (analytical method) provided that the chemistry of the method or the determinative technique is not changed... prevent efficient recovery of organic pollutants and prevent the method from meeting QC requirements, the...
Determination of 232Th in urine by ICP-MS for individual monitoring purposes.
Baglan, N; Cossonnet, C; Ritt, J
2001-07-01
Thorium occurs naturally in various ores used for industrial purposes and has numerous applications. This paper sets out to investigate urine analysis as a suitable monitoring approach for workers potentially exposed to thorium. Due to its biokinetic behavior and its low solubility, urinary concentrations are generally very low, therefore requiring high-sensitivity analytical methods. An analytical procedure has been developed for detecting 232Th concentrations of below 1 mBq L(-1) quickly and easily. Due to the long half-life (1.41 x 10(10) y) of 232Th, the potential of a procedure based on urine sample dilution and ICP-MS (inductively coupled plasma-mass spectrometry) measurement was investigated first. Two dilution factors were chosen: 100, which is more suitable for long-term measurement trials, and 20, which increases sensitivity. It has been shown that a 100-fold dilution can be used to measure concentrations of below 1 mBq L(-1), whereas a 20-fold one can be used to reach concentrations of below 0.06 mBq L(-1). Then, based on the limitations of the dilution-only procedure, the suitable field of application for the different procedures (100-fold and 20-fold dilution, and also chemical purification followed by ICP-MS measurement) was determined in relation to the monitoring objectives.
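To see why ICP-MS is attractive here, the target activity concentration can be converted to a mass concentration using only the half-life and Avogadro's number (a back-of-the-envelope check, not a figure from the paper):

```python
import numpy as np

T_HALF_S = 1.41e10 * 3.156e7          # 232Th half-life converted to seconds
LAMBDA = np.log(2.0) / T_HALF_S       # decay constant, s^-1
N_PER_GRAM = 6.022e23 / 232.0         # atoms per gram of 232Th

specific_activity = LAMBDA * N_PER_GRAM        # roughly 4e3 Bq per gram
mass_per_mBq = 1e-3 / specific_activity        # grams corresponding to 1 mBq

print(f"specific activity ~ {specific_activity:.0f} Bq/g")
print(f"1 mBq/L of 232Th ~ {mass_per_mBq * 1e6:.2f} ug/L")   # ~0.25 ug/L
```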
Chen, Ching-Ho; Wu, Ray-Shyan; Liu, Wei-Lin; Su, Wen-Ray; Chang, Yu-Min
2009-01-01
Some countries, including Taiwan, have adopted strategic environmental assessment (SEA) to assess and modify proposed policies, plans, and programs (PPPs) in the planning phase for pursuing sustainable development. However, Taiwan's system contained only some sketchy steps focusing on policy assessment. This study aims to develop a methodology for SEA in Taiwan to enhance the effectiveness associated with PPPs. The proposed methodology comprises an SEA procedure involving PPP management and assessment in various phases, a sustainable assessment framework, and an SEA management system. The SEA procedure is devised based on theoretical considerations from systems thinking and the regulatory requirements in Taiwan. The positive and negative impacts on ecology, society, and economy are simultaneously considered in the planning (including policy generation and evaluation), implementation, and control phases of the procedure. This study used the analytic hierarchy process, Delphi technique, and systems analysis to develop a sustainable assessment framework. An SEA management system was built based on geographic information system software to process spatial, attribute, and satellite image data during the assessment procedure. The proposed methodology was applied in the SEA of golf course installation policy in 2001 as a case study, which was the first SEA in Taiwan. Most of the 82 existing golf courses in 2001 were installed on slope lands and caused a serious ecological impact. Assessment results indicated that 15 future golf courses installed on marginal lands (including buffer zones, remedied lands, and wastelands) were acceptable because the comprehensive environmental (ecological, social, and economic) assessment value was better, based on the environmental characteristics and management regulations of Taiwan. The SEA procedure in the planning phase for this policy was completed, but the implementation phase of this policy was not begun because the related legislation procedure could not be arranged due to a few senators' resistance. A self-review of the control phase was carried out in 2006 using this methodology. Installation permits for 12 courses on slope lands were terminated after 2001 and then 27 future courses could be installed on marginal lands. The assessment value of this policy using the data on ecological, social, and economic conditions from 2006 was higher than that using the data from 2001. The analytical results illustrate that the proposed methodology can be used to effectively and efficiently assist the related authorities in conducting SEA.
Analytical study of comet nucleus samples
NASA Technical Reports Server (NTRS)
Albee, A. L.
1989-01-01
Analytical procedures for studying and handling frozen (130 K) core samples of comet nuclei are discussed. These methods include neutron activation analysis, X-ray fluorescence analysis, and high-resolution mass spectrometry.
Van Nimmen, Nadine F J; Veulemans, Hendrik A F
2004-05-07
A highly sensitive gas chromatographic-mass spectrometric (GC-MS) analytical method for the determination of the opioid narcotics fentanyl, alfentanil, and sufentanil in industrial hygiene personal air samples and surface contamination wipes was developed and comprehensively validated. Sample preparation involved a single-step extraction of the samples with methanol, fortified with a fixed amount of the penta-deuterated analogues of the opioid narcotics as internal standard. The GC-MS analytical procedure using selected ion monitoring (SIM) was shown to be highly selective. Linearity was shown for levels of extracted wipe and air samples corresponding to at least 0.1-2 times their surface contamination limit (SCL) and accordingly to 0.1-2 times their time weighted average occupational exposure limit (OEL-TWA) based on a full-shift 960 L air sample. Extraction recoveries were determined for spiked air samples and surface wipes and were found to be quantitative for both sampling media in the entire range studied. The air sampling method's limit of detection (LOD) was determined to be 0.4 ng per sample for fentanyl and sufentanil and 1.6 ng per sample for alfentanil, corresponding to less than 1% of their individual OEL for a full-shift air sample (960 L). The limit of quantification (LOQ) was found to be 1.4, 1.2, and 5.0 ng per filter for fentanyl, sufentanil, and alfentanil, respectively. The wipe sampling method had LODs of 4 ng per wipe for fentanyl and sufentanil and 16 ng per wipe for alfentanil, and LOQs of, respectively, 14, 12, and 50 ng per wipe. The analytical intra-assay precision of the air sampling and wipe sampling methods, defined as the coefficient of variation on the analytical result of six replicate spiked media, was below 10 and 5%, respectively, for all opioids at all spike levels. Accuracy expressed as relative error was determined to be below 10%, except for alfentanil at the lowest spike level (-13.1%). The stability of the opioids during simulated air sampling was investigated. For fentanyl and sufentanil a quantitative recovery was observed at all spike levels, while for alfentanil recoveries ranged from 60.3 to 85.4%. When spiked air samples were stored at ambient temperature and at -15 degrees C, quantitative recovery was found for fentanyl and sufentanil after 7 and 14 days. For alfentanil a slight loss seemed to occur upon storage for 7 days, becoming more pronounced after 14 days. Ambient storage of spiked wipes seemed to lead to significant losses of all opioids studied, yielding recoveries of 37.7-88.3%. Upon storage of similar wipes at -15 degrees C a significantly higher recovery was found, ranging from 77.3 to 88.3%. The developed analytical and sampling procedures have recently been applied in an explorative field study, of which the results of surface contamination wipe sampling are presented in this paper. To our knowledge, this is the first study addressing the development and validation of analytical procedures for the assessment of external occupational exposure to potent opioid narcotics.
Horowitz, Arthur J.
2013-01-01
Successful environmental/water quality-monitoring programs usually require a balance between analytical capabilities, the collection and preservation of representative samples, and available financial/personnel resources. Due to current economic conditions, monitoring programs are under increasing pressure to do more with less. Hence, a review of current sampling and analytical methodologies, and some of the underlying assumptions that form the bases for these programs seems appropriate, to see if they are achieving their intended objectives within acceptable error limits and/or measurement uncertainty, in a cost-effective manner. That evaluation appears to indicate that several common sampling/processing/analytical procedures (e.g., dip (point) samples/measurements, nitrogen determinations, total recoverable analytical procedures) are generating biased or nonrepresentative data, and that some of the underlying assumptions relative to current programs, such as calendar-based sampling and stationarity are no longer defensible. The extensive use of statistical models as well as surrogates (e.g., turbidity) also needs to be re-examined because the hydrologic interrelationships that support their use tend to be dynamic rather than static. As a result, a number of monitoring programs may need redesigning, some sampling and analytical procedures may need to be updated, and model/surrogate interrelationships may require recalibration.
Analytical and experimental study of vibrations in a gear transmission
NASA Technical Reports Server (NTRS)
Choy, F. K.; Ruan, Y. F.; Zakrajsek, J. J.; Oswald, Fred B.; Coy, J. J.
1991-01-01
An analytical simulation of the dynamics of a gear transmission system is presented and compared to experimental results from a gear noise test rig at the NASA Lewis Research Center. The analytical procedure developed couples the dynamic behaviors of the rotor-bearing-gear system with the response of the gearbox structure. The modal synthesis method is used in solving the overall dynamics of the system. Locally, each rotor-gear stage is modeled as an individual rotor-bearing system using the matrix transfer technique. The dynamics of each individual rotor are coupled with other rotor stages through the nonlinear gear mesh forces and with the gearbox structure through the bearing support systems. The modal characteristics of the gearbox structure are evaluated using the finite element procedure. A variable time-stepping integration routine is used to calculate the overall time transient behavior of the system in modal coordinates. The global dynamic behavior of the system is expressed in a generalized coordinate system. Transient and steady state vibrations of the gearbox system are presented in the time and frequency domains. The vibration characteristics of a simple single-mesh gear noise test rig are modeled. The numerical simulations are compared to experimental data measured under typical operating conditions. Comparisons of system natural frequencies, peak vibration amplitudes, and gear mesh frequencies show generally good agreement.
Magnetic solid-phase extraction using carbon nanotubes as sorbents: a review.
Herrero-Latorre, C; Barciela-García, J; García-Martín, S; Peña-Crecente, R M; Otárola-Jiménez, J
2015-09-10
Magnetic solid-phase extraction (M-SPE) is a procedure based on the use of magnetic sorbents for the separation and preconcentration of different organic and inorganic analytes from large sample volumes. The magnetic sorbent is added to the sample solution and the target analyte is adsorbed onto the surface of the magnetic sorbent particles (M-SPs). Analyte-M-SPs are separated from the sample solution by applying an external magnetic field and, after elution with the appropriate solvent, the recovered analyte is analyzed. This approach has several advantages over traditional solid phase extraction as it avoids time-consuming and tedious on-column SPE procedures and it provides a rapid and simple analyte separation that avoids the need for centrifugation or filtration steps. As a consequence, in the past few years a great deal of research has been focused on M-SPE, including the development of new sorbents and novel automation strategies. In recent years, the use of magnetic carbon nanotubes (M-CNTs) as a sorption substrate in M-SPE has become an active area of research. These materials have exceptional mechanical, electrical, optical and magnetic properties and they also have an extremely large surface area and varied possibilities for functionalization. This review covers the synthesis of M-CNTs and the different approaches for the use of these compounds in M-SPE. The performance, general characteristics and applications of M-SPE based on magnetic carbon nanotubes for organic and inorganic analysis have been evaluated on the basis of more than 110 references. Finally, some important challenges with respect to the use of magnetic carbon nanotubes in M-SPE are discussed. Copyright © 2015 Elsevier B.V. All rights reserved.
Clean Water Act Analytical Methods
EPA publishes laboratory analytical methods (test procedures) that are used by industries and municipalities to analyze the chemical, physical and biological components of wastewater and other environmental samples required by the Clean Water Act.
Laboratory Workhorse: The Analytical Balance.
ERIC Educational Resources Information Center
Clark, Douglas W.
1979-01-01
This report explains the importance of various analytical balances in the water or wastewater laboratory. Stressed is the proper procedure for utilizing the equipment as well as the mechanics involved in its operation. (CS)
FDA Bacteriological Analytical Manual, Chapter 10, 2003: Listeria monocytogenes
FDA Bacteriological Analytical Manual, Chapter 10 describes procedures for analysis of food samples and may be adapted for assessment of solid, particulate, aerosol, liquid and water samples containing Listeria monocytogenes.
Tankiewicz, Maciej; Biziuk, Marek
2018-02-01
A simple and efficient dispersive liquid-liquid microextraction (DLLME) technique was developed using a mixture of two solvents: 40 μL of tetrachloroethylene (extraction solvent) and 1.0 mL of methanol (disperser solvent), which was rapidly injected with a syringe into 10 mL of water sample. Some important parameters affecting the extraction efficiency, such as the type and volume of solvents, water sample volume, extraction time, temperature, pH adjustment and salt addition effect, were investigated. Simultaneous determination of 34 commonly used pesticides was performed by gas chromatography coupled with mass spectrometry (GC-MS). The procedure has been validated in order to obtain the highest efficiency at the lowest concentration levels of analytes to fulfill the requirements of regulations on maximum residue limits. Under the optimum conditions, the linearity range was 0.0096-100 μg L(-1). The limits of detection (LODs) of the developed DLLME-GC-MS methodology for all investigated pesticides were in the range of 0.0032 (endrin) to 0.0174 (diazinon) μg L(-1), and the limits of quantification (LOQs) ranged from 0.0096 to 0.052 μg L(-1). At a low concentration of 1 μg L(-1) for each pesticide, recoveries ranged between 84% (tebufenpyrad) and 108% (deltamethrin) with relative standard deviations (RSDs) (n = 7) from 1.1% (metconazole) to 11% (parathion-methyl). This methodology was successfully applied to check the contamination of environmental samples. The procedure has proved to be selective, sensitive and precise for the simultaneous determination of various pesticides. The optimized analytical method is very simple and rapid (less than 5 min). Graphical abstract: the analytical procedure for testing water samples consists of dispersive liquid-liquid microextraction (DLLME) and gas chromatography coupled with mass spectrometry (GC-MS).
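For readers unfamiliar with how such LOD/LOQ figures are usually obtained, here is a minimal sketch of the common calibration-based estimate (3.3σ/S and 10σ/S, with σ the residual standard deviation and S the slope). The data are invented, and the study may have used a different convention:

```python
import numpy as np

# Hypothetical low-level calibration for one pesticide (ug/L vs. peak area)
conc = np.array([0.01, 0.05, 0.1, 0.5, 1.0, 5.0])
area = np.array([120., 610., 1190., 6050., 11900., 60300.])

slope, intercept = np.polyfit(conc, area, 1)
sigma = (area - (slope * conc + intercept)).std(ddof=2)  # residual std. dev.

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD ~ {lod:.4f} ug/L, LOQ ~ {loq:.4f} ug/L")
```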
Bámaca-Colbert, Mayra Y; Gayles, Jochebed G
2010-11-01
The overall aim of the current study was to identify the methodological approach and corresponding analytic procedure that best elucidated the associations among Mexican-origin mother-daughter cultural orientation dissonance, family functioning, and adolescent adjustment. To do so, we employed, and compared, two methodological approaches (i.e., variable-centered and person-centered) via four analytic procedures (i.e., difference score, interactive, matched/mismatched grouping, and latent profiles). The sample consisted of 319 girls in the 7th or 10th grade and their mother or mother figure from a large Southwestern, metropolitan area in the US. Family factors were found to be important predictors of adolescent adjustment in all models. Although some findings were similar across all models, overall, findings suggested that the latent profile procedure best elucidated the associations among the variables examined in this study. In addition, associations were present across early and middle adolescents, with a few findings being only present for one group. Implications for using these analytic procedures in studying cultural and family processes are discussed.
NASA Technical Reports Server (NTRS)
Mason, P. W.; Harris, H. G.; Zalesak, J.; Bernstein, M.
1974-01-01
The methods and procedures used in the analysis and testing of the scale model are reported together with the correlation of the analytical and experimental results. The model, the NASTRAN finite element analysis, and results are discussed. Tests and analytical investigations are also reported.
This SOP describes the method used for preparing surrogate recovery standard and internal standard solutions for the analysis of polar target analytes. It also describes the method for preparing calibration standard solutions for polar analytes used for gas chromatography/mass sp...
Analytical Applications of Monte Carlo Techniques.
ERIC Educational Resources Information Center
Guell, Oscar A.; Holcombe, James A.
1990-01-01
Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)
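Of the applications listed, plain Monte Carlo integration is the easiest to demonstrate. A self-contained sketch (not taken from the article) that estimates a one-dimensional integral together with its standard error:

```python
import numpy as np

def mc_integrate(f, a, b, n=100_000, seed=0):
    """Plain Monte Carlo estimate of the integral of f over [a, b]."""
    x = np.random.default_rng(seed).uniform(a, b, n)
    fx = f(x)
    estimate = (b - a) * fx.mean()
    stderr = (b - a) * fx.std(ddof=1) / np.sqrt(n)
    return estimate, stderr

est, err = mc_integrate(np.sin, 0.0, np.pi)
print(f"integral of sin on [0, pi] ~ {est:.4f} +/- {err:.4f} (exact: 2)")
```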
Extension of rezoned Eulerian-Lagrangian method to astrophysical plasma applications
NASA Technical Reports Server (NTRS)
Song, M. T.; Wu, S. T.; Dryer, Murray
1993-01-01
The rezoned Eulerian-Lagrangian procedure developed by Brackbill and Pracht (1973), which is limited to simple configurations of the magnetic fields, is modified in order to make it applicable to astrophysical plasma. For this purpose, two specific methods are introduced, which make it possible to determine the initial field topology for which no analytical expressions are available. Numerical examples illustrating these methods are presented.
NASA Astrophysics Data System (ADS)
da Silva Fernandes, S.; das Chagas Carvalho, F.; Bateli Romão, J. V.
2018-04-01
A numerical-analytical procedure based on infinitesimal canonical transformations is developed for computing optimal time-fixed low-thrust limited power transfers (no rendezvous) between coplanar orbits with small eccentricities in an inverse-square force field. The optimization problem is formulated as a Mayer problem with a set of non-singular orbital elements as state variables. Second order terms in eccentricity are considered in the development of the maximum Hamiltonian describing the optimal trajectories. The two-point boundary value problem of going from an initial orbit to a final orbit is solved by means of a two-stage Newton-Raphson algorithm which uses an infinitesimal canonical transformation. Numerical results are presented for some transfers between circular orbits with moderate radius ratio, including a preliminary analysis of Earth-Mars and Earth-Venus missions.
Schellenberg, François; Wielders, Jos; Anton, Raymond; Bianchi, Vincenza; Deenmamode, Jean; Weykamp, Cas; Whitfield, John; Jeppsson, Jan-Olof; Helander, Anders
2017-02-01
Carbohydrate-deficient transferrin (CDT) is used as a biomarker of sustained high alcohol consumption. The currently available measurement procedures for CDT are based on various analytical techniques (HPLC, capillary electrophoresis, nephelometry), some differing in the definition of the analyte and using different reference intervals and cut-off values. The Working Group on Standardization of CDT (WG-CDT), initiated by the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC), has validated an HPLC candidate reference measurement procedure (cRMP) for CDT (% disialotransferrin to total transferrin based on peak areas), demonstrating that it is suitable as a reference measurement procedure (RMP) for CDT. A detailed description of the cRMP and its calibration is presented. Practical aspects on how to treat genetic variant and so-called di-tri bridge samples are described. Results for the method performance characteristics, as demanded by ISO 15189 and ISO 15193, are given, as well as the reference interval and measurement uncertainty and how to deal with these in routine use. The correlation of the cRMP with commercial CDT procedures and the performance of the cRMP in a network of laboratories are also presented. The performance of the CDT cRMP in combination with previously developed commutable calibrators allows for standardization of the currently available commercial measurement procedures for CDT. The cRMP has recently been approved by the IFCC and will from now on be known as the IFCC-RMP for CDT, while CDT results standardized according to this RMP should be indicated as CDT(IFCC). Copyright © 2016 Elsevier B.V. All rights reserved.
Current projects in Pre-analytics: where to go?
Sapino, Anna; Annaratone, Laura; Marchiò, Caterina
2015-01-01
The current clinical practice of tissue handling and sample preparation is multifaceted and lacks strict standardisation: this scenario leads to significant variability in the quality of clinical samples. Poor tissue preservation has a detrimental effect, leading to morphological artefacts, hampering the reproducibility of immunocytochemical and molecular diagnostic results (protein expression, DNA gene mutations, RNA gene expression) and affecting research outcomes with irreproducible gene expression and post-transcriptional data. Altogether, this limits the opportunity to share and pool national databases into European common databases. At the European level, standardisation of pre-analytical steps is just at the beginning and issues regarding bio-specimen collection and management are still debated. A joint (public-private) project on the standardisation of tissue handling in pre-analytical procedures has recently been funded in Italy with the aim of proposing novel approaches to the neglected issue of pre-analytical procedures. In this chapter, we show how investing in pre-analytics may impact both public health problems and practical innovation in solid tumour processing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lindberg, Michael J.
2010-09-28
Between October 14, 2009 and February 22, 2010, sediment samples were received from the 100-BC Decision Unit for geochemical studies. This is an analytical data report for sediments received from CHPRC at the 100-BC-5 OU. The analyses for this project were performed at the 325 building located in the 300 Area of the Hanford Site. The analyses were performed according to Pacific Northwest National Laboratory (PNNL) approved procedures and/or nationally recognized test procedures. The data sets include the sample identification numbers, analytical results, estimated quantification limits (EQL), and quality control data. The preparatory and analytical quality control requirements, calibration requirements, acceptance criteria, and failure actions are defined in the on-line QA plan 'Conducting Analytical Work in Support of Regulatory Programs' (CAW). This QA plan implements the Hanford Analytical Services Quality Assurance Requirements Documents (HASQARD) for PNNL.
Portable microwave assisted extraction: An original concept for green analytical chemistry.
Perino, Sandrine; Petitcolas, Emmanuel; de la Guardia, Miguel; Chemat, Farid
2013-11-08
This paper describes a portable microwave-assisted extraction apparatus (PMAE) for the extraction of bioactive compounds, especially essential oils and aromas, directly in a crop or in a forest. The developed procedure, based on the concept of green analytical chemistry, is appropriate for obtaining direct in-field information about the level of essential oils in natural samples and for illustrating green chemistry lessons and research. The efficiency of this experiment was validated for the extraction of essential oil of rosemary directly in a crop and allows quantitative information to be obtained on the essential oil content, similar to that obtained by conventional methods in the laboratory. Copyright © 2013 Elsevier B.V. All rights reserved.
Testing and Analytical Modeling for Purging Process of a Cryogenic Line
NASA Technical Reports Server (NTRS)
Hedayat, A.; Mazurkivich, P. V.; Nelson, M. A.; Majumdar, A. K.
2015-01-01
To gain confidence in developing analytical models of the purging process for the cryogenic main propulsion systems of an upper stage, two test series were conducted. The test article, a 3.35 m long inclined line with a diameter of 20 cm, was filled with liquid or gaseous hydrogen and then purged with gaseous helium (GHe). A total of 10 tests were conducted. The influences of GHe flow rates and initial temperatures were evaluated. The Generalized Fluid System Simulation Program (GFSSP), an in-house general-purpose fluid system analyzer computer program, was utilized to model and simulate selected tests. The test procedures, modeling descriptions, and the results are presented in the following sections.
NASA Astrophysics Data System (ADS)
Wang, I. T.
A general method for determining the effective transport wind speed, ū, in the Gaussian plume equation is discussed. Physical arguments are given for using the generalized ū instead of the often adopted release-level wind speed with the plume diffusion equation. Simple analytical expressions for ū applicable to low-level point releases and a wide range of atmospheric conditions are developed. A non-linear plume kinematic equation is derived using these expressions. Crosswind-integrated SF6 concentration data from the 1983 PNL tracer experiment are used to evaluate the proposed analytical procedures along with the usual approach of using the release-level wind speed. Results of the evaluation are briefly discussed.
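For reference, the standard Gaussian plume formula in which the effective transport wind speed ū enters (a textbook form, not quoted from the paper; Q is the source strength, H the effective release height, and σy, σz the dispersion parameters):

```latex
C(x,y,z) = \frac{Q}{2\pi\,\overline{u}\,\sigma_y(x)\,\sigma_z(x)}
\exp\!\left(-\frac{y^{2}}{2\sigma_y^{2}}\right)
\left[\exp\!\left(-\frac{(z-H)^{2}}{2\sigma_z^{2}}\right)
+\exp\!\left(-\frac{(z+H)^{2}}{2\sigma_z^{2}}\right)\right]
```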
Williamson, K.S.; Petty, J.D.; Huckins, J.N.; Lebo, J.A.; Kaiser, E.M.
2002-01-01
High performance liquid chromatography coupled with programmable fluorescence detection was employed for the determination of 15 priority pollutant polycyclic aromatic hydrocarbons (PPPAHs) in water, sediment, and semipermeable membrane devices (SPMDs). Chromatographic separation using this analytical method facilitates selectivity, sensitivity (ppt levels), and can serve as a non-destructive technique for subsequent analysis by other chromatographic and spectroscopic techniques. Extraction and sample cleanup procedures were also developed for water, sediment, and SPMDs using various chromatographic and wet chemical methods. The focus of this publication is to examine the enrichment techniques and the analytical methodologies used in the isolation, characterization, and quantitation of 15 PPPAHs in different sample matrices.
Rim, Jung H.; Armenta, Claudine E.; Gonzales, Edward R.; ...
2015-09-12
This paper describes a new analyte extraction medium called polymer ligand film (PLF) that was developed to rapidly extract radionuclides. PLF is a polymer medium with ligands incorporated in its matrix that selectively and quickly extracts analytes. The main focus of the new technique is to shorten and simplify the procedure for chemically isolating radionuclides for determination through alpha spectroscopy. The PLF system was effective for plutonium and uranium extraction. The PLF was capable of co-extracting or selectively extracting plutonium over uranium depending on the PLF composition. As a result, the PLF and electrodeposited samples had similar alpha spectra resolutions.
NASA Technical Reports Server (NTRS)
Brinson, R. F.
1985-01-01
A method for lifetime or durability predictions for laminated fiber reinforced plastics is given. The procedure is similar to, but not the same as, the well known time-temperature superposition principle for polymers. The method is better described as an analytical adaptation of time-stress-superposition methods. The analytical constitutive modeling is based upon a nonlinear viscoelastic constitutive model developed by Schapery. Time dependent failure models are discussed and are related to the constitutive models. Finally, results of an incremental lamination analysis using the constitutive and failure models are compared to experimental results. Favorable agreement between theory and experiment is shown using data from creep tests of about two months' duration.
NASA Astrophysics Data System (ADS)
Kumavat, Hemraj Ramdas
2016-09-01
The compressive stress-strain behavior and mechanical properties of clay brick masonry and its constituents, clay bricks and mortar, have been studied in several laboratory tests. Using linear regression analysis, an analytical model has been proposed for obtaining the stress-strain curves of masonry that can be used in analysis and design procedures. The model requires only the compressive strengths of bricks and mortar as input data, which can be easily obtained experimentally. The analytical model was developed from the experimental results for Young's modulus and compressive strength. Simple relationships have been identified for obtaining the modulus of elasticity of bricks, mortar, and masonry from their corresponding compressive strengths. It was observed that the proposed analytical model gives a reasonably good prediction of the stress-strain curves when compared with the experimental curves.
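A sketch of the kind of regression the abstract refers to, fitting the simple proportional relation E = k·f_c between modulus and compressive strength. The numbers are invented for illustration and k is not the study's reported coefficient:

```python
import numpy as np

# Hypothetical masonry prism data: compressive strength (MPa) vs. modulus (MPa)
f_c = np.array([3.1, 4.2, 5.0, 6.3, 7.8, 9.1])
E   = np.array([930., 1310., 1520., 1950., 2400., 2780.])

# Least-squares fit of E = k * f_c (regression through the origin)
k = np.linalg.lstsq(f_c.reshape(-1, 1), E, rcond=None)[0][0]
print(f"E ~ {k:.0f} * f_c")
```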
Martin, Rafaela; Schürenkamp, Jennifer; Gasse, Angela; Pfeiffer, Heidi; Köhler, Helga
2013-05-01
A validated method for the simultaneous determination of psilocin, bufotenine, lysergic acid diethylamide and its metabolites in serum, plasma and urine using liquid chromatography-electrospray ionization/tandem mass spectrometry was developed. During the solid-phase extraction procedure with polymeric mixed-mode cation exchange columns, the unstable analytes were protected by ascorbic acid, drying with nitrogen and exclusion of light. The limits of detection and quantitation for all analytes were low. Recovery was ≥86 % for all analytes and no significant matrix effects were observed. Interday and intraday imprecisions at different concentrations ranged from 1.1 to 8.2 % relative standard deviation, bias was within ±5.3 %. Processed samples were stable in the autosampler for at least 2 days. Furthermore, freeze/thaw and long-term stability were investigated. The method was successfully applied to authentic serum and urine samples.
Derivation of an analytic expression for the error associated with the noise reduction rating
NASA Astrophysics Data System (ADS)
Murphy, William J.
2005-04-01
Hearing protection devices are assessed using the Real Ear Attenuation at Threshold (REAT) measurement procedure for the purpose of estimating the amount of noise reduction provided when worn by a subject. The rating number provided on the protector label is a function of the mean and standard deviation of the REAT results achieved by the test subjects. If a group of subjects has a large variance, then it follows that the certainty of the rating should be correspondingly lower. No estimate of the error of a protector's rating is given by existing standards or regulations. Propagation of errors was applied to the Noise Reduction Rating to develop an analytic expression for the hearing protector rating error term. Comparison of the analytic expression for the error to the standard deviation estimated from Monte Carlo simulation of subject attenuations yielded a linear relationship across several protector types and assumptions for the variance of the attenuations.
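The derivation rests on the standard first-order propagation-of-errors formula; in generic form, for a rating f that depends on measured quantities x_i with uncertainties σ_{x_i} (this is the textbook expression, not the paper's final result for the NRR):

```latex
\sigma_{f}^{2} \;\approx\; \sum_{i}\left(\frac{\partial f}{\partial x_i}\right)^{2}\sigma_{x_i}^{2}
\;+\; 2\sum_{i<j}\frac{\partial f}{\partial x_i}\,\frac{\partial f}{\partial x_j}\,\operatorname{cov}(x_i, x_j)
```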
The analytical solution for drug delivery system with nonhomogeneous moving boundary condition
NASA Astrophysics Data System (ADS)
Saudi, Muhamad Hakimi; Mahali, Shalela Mohd; Harun, Fatimah Noor
2017-08-01
This paper discusses the development and the analytical solution of a mathematical model of a drug release system based on a swelling delivery device. The mathematical model is represented by a one-dimensional advection-diffusion equation with a nonhomogeneous moving boundary condition. The solution procedure consists of three major steps. Firstly, the application of the steady-state solution method, which is used to transform the nonhomogeneous moving boundary condition into a homogeneous boundary condition. Secondly, the application of the Landau transformation technique, which removes the advection term in the system of equations and transforms the moving boundary condition into a fixed boundary condition. Thirdly, the use of the separation of variables method to find the analytical solution of the resulting initial boundary value problem. The results show that the swelling rate of the delivery device and the drug release rate are influenced by the value of the growth factor r.
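A generic statement of the model class described, with hypothetical symbols (c the drug concentration, v the advection velocity, D the diffusion coefficient, s(t) the moving boundary of the swelling device, and g(t) the prescribed nonhomogeneous boundary value); the Landau transformation ξ = x/s(t) maps the moving domain onto a fixed one:

```latex
\frac{\partial c}{\partial t} + v\,\frac{\partial c}{\partial x}
  = D\,\frac{\partial^{2} c}{\partial x^{2}}, \qquad 0 < x < s(t),
\qquad c\bigl(s(t),t\bigr) = g(t), \qquad \xi = \frac{x}{s(t)}.
```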
Study of a vibrating plate: comparison between experimental (ESPI) and analytical results
NASA Astrophysics Data System (ADS)
Romero, G.; Alvarez, L.; Alanís, E.; Nallim, L.; Grossi, R.
2003-07-01
Real-time electronic speckle pattern interferometry (ESPI) was used for tuning and visualization of natural frequencies of a trapezoidal plate. The plate was excited to resonant vibration by a sinusoidal acoustical source, which provided a continuous range of audio frequencies. Fringe patterns produced during the time-average recording of the vibrating plate—corresponding to several resonant frequencies—were registered. From these interferograms, calculations of vibrational amplitudes by means of zero-order Bessel functions were performed in some particular cases. The system was also studied analytically. The analytical approach developed is based on the Rayleigh-Ritz method and on the use of non-orthogonal right triangular co-ordinates. The deflection of the plate is approximated by a set of beam characteristic orthogonal polynomials generated by using the Gram-Schmidt procedure. A high degree of correlation between computational analysis and experimental results was observed.
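The amplitude calculation mentioned follows from the standard time-average fringe function, whose brightness varies as J0²(4πA/λ) with the out-of-plane amplitude A. A short sketch with an assumed He-Ne wavelength (the actual source and geometry factors in the study may differ):

```python
import numpy as np
from scipy.special import j0, jn_zeros

lam = 632.8e-9                      # assumed He-Ne laser wavelength, m
A = np.linspace(0.0, 2e-6, 500)     # out-of-plane vibration amplitude, m

fringe = j0(4 * np.pi * A / lam) ** 2   # time-average fringe brightness

# Amplitudes at the first few dark fringes (zeros of the J0 function)
dark_amplitudes = jn_zeros(0, 5) * lam / (4 * np.pi)
print(np.round(dark_amplitudes * 1e9, 1), "nm")
```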
A Proposed Research Program for Hospital-Medical Care
Feldman, Paul
1967-01-01
This proposal for a federal government program of health services research, written in spring of 1966, played a key role in development of the National Center for Health Services Research and Development, announced by the President early this year. The paper points to the lack of economic incentives for development of cost-saving innovations for hospitals compared to incentives to develop innovations improving the quality of care. It indicates the analytic procedure which, if followed, would lead to an efficient program of research, and points out several aspects of the analysis that are critical requirements for its successful application. PMID:4964151
Review of Pre-Analytical Errors in Oral Glucose Tolerance Testing in a Tertiary Care Hospital.
Nanda, Rachita; Patel, Suprava; Sahoo, Sibashish; Mohapatra, Eli
2018-03-13
The pre-pre-analytical and pre-analytical phases account for a major share of the errors in a laboratory. The process considered here is a very common procedure, the oral glucose tolerance test, used to identify pre-pre-analytical errors. Quality indicators provide evidence of quality, support accountability and help in the decision making of laboratory personnel. The aim of this research is to evaluate the pre-analytical performance of the oral glucose tolerance test procedure. An observational study was conducted over a period of three months in the phlebotomy and accessioning unit of our laboratory, using a questionnaire that examined the pre-pre-analytical errors through a scoring system. The pre-analytical phase was analyzed for each sample collected as per seven quality indicators. About 25% of the population gave a wrong answer to the question that tested knowledge of patient preparation. The appropriateness of the test result (QI-1) had the most errors. Although QI-5, for sample collection, had a low error rate, it is a very important indicator, as any wrongly collected sample can alter the test result. Evaluating the pre-analytical and pre-pre-analytical phases is essential and must be conducted routinely, on a yearly basis, to identify errors and take corrective action, and to facilitate their gradual introduction into routine practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sadler, D.A.; Sun, F.; Littlejohn, D.
1995-12-31
ICP-OES is a useful technique for multi-element analysis of soils. However, as a number of elements are present in relatively high concentrations, matrix interferences can occur, and examples have been widely reported. The availability of CCD detectors has increased the opportunities for rapid multi-element, multi-wavelength determination of elemental concentrations in soils and other environmental samples. As the composition of soils from industrial sites can vary considerably, especially when taken from different pit horizons, procedures are required to assess the extent of interferences and correct their effects on a simultaneous multi-element basis. In single-element analysis, plasma operating conditions can sometimes be varied to minimize or even remove multiplicative interferences. In simultaneous multi-element analysis, the scope for this approach may be limited, depending on the spectrochemical characteristics of the emitting analyte species. Matrix matching, by addition of major sample components to the analyte calibrant solutions, can be used to minimize inaccuracies. However, there are also limitations to this procedure when the sample composition varies significantly. Multiplicative interference effects can also be assessed by a 'single standard addition' of each analyte to the sample solution, and the information obtained may be used to correct the analyte concentrations determined directly. Each of these approaches has been evaluated to ascertain the best procedure for multi-element analysis of industrial soils by ICP-OES with CCD detection at multiple wavelengths. Standard reference materials and field samples have been analyzed to illustrate the efficacy of each procedure.
Rossum, Huub H van; Kemperman, Hans
2017-07-26
General application of a moving average (MA) as continuous analytical quality control (QC) for routine chemistry assays has failed due to the lack of a simple method that allows optimization of MAs. A new method was applied to optimize MAs for routine chemistry and was evaluated in daily practice as a continuous analytical QC instrument. MA procedures were optimized using an MA bias detection simulation procedure. Optimization was graphically supported by bias detection curves. Next, all optimal MA procedures that contributed to the quality assurance were run for 100 consecutive days, and MA alarms generated during working hours were investigated. Optimized MA procedures were applied for 24 chemistry assays. During this evaluation, 303,871 MA values and 76 MA alarms were generated. Of all alarms, 54 (71%) were generated during office hours. Of these, 41 were further investigated and were caused by ion-selective electrode (ISE) failure (1), calibration failure not detected by QC due to improper QC settings (1), possible bias (significant difference with the other analyzer) (10), non-human materials analyzed (2), extreme result(s) of a single patient (2), pre-analytical error (1), no cause identified (20), and no conclusion possible (4). MA was implemented in daily practice as a continuous QC instrument for 24 routine chemistry assays. In our setup, in which MA alarms required follow-up, a manageable number of MA alarms was generated, and these alarms proved valuable. For the management of MA alarms, several applications/requirements in the MA management software will simplify the use of MA procedures.
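The bias-detection simulation idea can be illustrated with a toy version: inject a bias into a stream of simulated patient results and record how many results it takes for a simple moving average to cross its control limits. The analyte, limits and window below are hypothetical, and the authors' actual procedure and software are more elaborate:

```python
import numpy as np

def detection_delay(bias, mean=140.0, sd=3.0, window=50,
                    limits=(138.0, 142.0), n=5000, rng=None):
    """Return how many results are needed before the moving average of a
    biased result stream first leaves its control limits (None if never)."""
    rng = rng or np.random.default_rng()
    results = rng.normal(mean + bias, sd, n)
    for i in range(window, n):
        ma = results[i - window:i].mean()
        if not (limits[0] <= ma <= limits[1]):
            return i
    return None

# Rough bias-detection curve: median delay versus size of the injected bias
rng = np.random.default_rng(2)
for bias in (0.5, 1.0, 2.0, 4.0):
    delays = [detection_delay(bias, rng=rng) for _ in range(20)]
    delays = [d for d in delays if d is not None]
    print(bias, int(np.median(delays)) if delays else "not detected")
```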
Irrgeher, Johanna; Prohaska, Thomas
2016-01-01
Analytical ecogeochemistry is an evolving scientific field dedicated to the development of analytical methods and tools and their application to ecological questions. Traditional stable isotopic systems have been widely explored and have undergone continuous development during the last century. The variations of the isotopic composition of light elements (H, O, N, C, and S) have provided the foundation of stable isotope analysis followed by the analysis of traditional geochemical isotope tracers (e.g., Pb, Sr, Nd, Hf). Questions in a considerable diversity of scientific fields have been addressed, many of which can be assigned to the field of ecogeochemistry. Over the past 15 years, other stable isotopes (e.g., Li, Zn, Cu, Cl) have emerged gradually as novel tools for the investigation of scientific topics that arise in ecosystem research and have enabled novel discoveries and explorations. These systems are often referred to as non-traditional isotopes. The small isotopic differences of interest that are increasingly being addressed for a growing number of isotopic systems represent a challenge to the analytical scientist and push the limits of today's instruments constantly. This underlines the importance of a metrologically sound concept of analytical protocols and procedures and a solid foundation of data processing strategies and uncertainty considerations before these small isotopic variations can be interpreted in the context of applied ecosystem research. This review focuses on the development of isotope research in ecogeochemistry, the requirements for successful detection of small isotopic shifts, and highlights the most recent and innovative applications in the field.
10 CFR 26.137 - Quality assurance and quality control.
Code of Federal Regulations, 2013 CFR
2013-01-01
... validation of analytical procedures. Quality assurance procedures must be designed, implemented, and reviewed... resolving any technical, methodological, or administrative errors in the licensee testing facility's testing...
10 CFR 26.137 - Quality assurance and quality control.
Code of Federal Regulations, 2010 CFR
2010-01-01
... validation of analytical procedures. Quality assurance procedures must be designed, implemented, and reviewed... resolving any technical, methodological, or administrative errors in the licensee testing facility's testing...
10 CFR 26.137 - Quality assurance and quality control.
Code of Federal Regulations, 2011 CFR
2011-01-01
... validation of analytical procedures. Quality assurance procedures must be designed, implemented, and reviewed... resolving any technical, methodological, or administrative errors in the licensee testing facility's testing...
10 CFR 26.137 - Quality assurance and quality control.
Code of Federal Regulations, 2012 CFR
2012-01-01
... validation of analytical procedures. Quality assurance procedures must be designed, implemented, and reviewed... resolving any technical, methodological, or administrative errors in the licensee testing facility's testing...
10 CFR 26.137 - Quality assurance and quality control.
Code of Federal Regulations, 2014 CFR
2014-01-01
... validation of analytical procedures. Quality assurance procedures must be designed, implemented, and reviewed... resolving any technical, methodological, or administrative errors in the licensee testing facility's testing...
Innovations in coating technology.
Behzadi, Sharareh S; Toegel, Stefan; Viernstein, Helmut
2008-01-01
Despite representing one of the oldest pharmaceutical techniques, coating of dosage forms is still frequently used in pharmaceutical manufacturing. The aims of coating range from simply masking the taste or odour of drugs to the sophisticated controlling of site and rate of drug release. The high expectations for different coating technologies have required great efforts regarding the development of reproducible and controllable production processes. Basically, improvements in coating methods have focused on particle movement, spraying systems, and air and energy transport. Thereby, homogeneous distribution of coating material and increased drying efficiency should be accomplished in order to achieve high end product quality. Moreover, given the claim of the FDA to design the end product quality already during the manufacturing process (Quality by Design), the development of analytical methods for the analysis, management and control of coating processes has attracted special attention during recent years. The present review focuses on recent patents claiming improvements in pharmaceutical coating technology and intends to first familiarize the reader with the available procedures and to subsequently explain the application of different analytical tools. Aiming to structure this comprehensive field, coating technologies are primarily divided into pan and fluidized bed coating methods. Regarding pan coating procedures, pans rotating around inclined, horizontal and vertical axes are reviewed separately. On the other hand, fluidized bed technologies are subdivided into those involving fluidized and spouted beds. Then, continuous processing techniques and improvements in spraying systems are discussed in dedicated chapters. Finally, currently used analytical methods for the understanding and management of coating processes are reviewed in detail in the last section of the review.
Sandhu, Sundeep Kaur; Kellett, Stephen; Hardy, Gillian
2017-11-01
"Exits" in cognitive analytic therapy (CAT) are methods that change unhelpful patterns or roles during the final "revision" phase of the therapy. How exits are conceived and achieved is currently poorly understood. This study focussed on the revision stage to explore and define how change is accomplished in CAT. Qualitative content analysis studied transcripts of sessions 6 and 7 of a protocol delivered 8-session CAT treatment for depression. Eight participants met the study inclusion criteria, and therefore, 16 sessions were analysed. The exit model developed contained 3 distinct (but interacting) phases: (a) developing an observing self via therapist input or client self-reflection, (b) breaking out of old patterns by creating new roles and procedures, and (c) utilisation of a range of methods to support and maintain change. Levels of interrater reliability for the exit categories that formed the model were good. The revision stage of CAT emerged as a complex and dynamic process involving 3 interacting stages. Further research is recommended to understand how exits relate to durability of change and whether change processes differ according to presenting problem. Exit work in cognitive analytic therapy is a dynamic process that requires progression through stages of insight, active change, and consolidation. Development of an "observing self" is an important foundation stone for change, and cognitive analytic therapists need to work within the client's zone of proximal development. A number of aspects appear important in facilitating change, such as attending to the process and feelings generated by change talk. Copyright © 2017 John Wiley & Sons, Ltd.
New correction procedures for the fast field program which extend its range
NASA Technical Reports Server (NTRS)
West, M.; Sack, R. A.
1990-01-01
A fast field program (FFP) algorithm was developed, based on the method of Lee et al., for the prediction of sound pressure level from low-frequency, high-intensity sources. In order to permit accurate predictions at distances greater than 2 km, new correction procedures have had to be included in the algorithm. Certain functions, whose Hankel transforms can be determined analytically, are subtracted from the depth-dependent Green's function. The distance response is then obtained as the sum of these transforms and the fast Fourier transform (FFT) of the residual k-dependent function. One procedure, which permits the elimination of most complex exponentials, has allowed significant changes in the structure of the FFP algorithm, resulting in a substantial reduction in computation time.
Karge, Lukas; Gilles, Ralph
2017-01-01
An improved data-reduction procedure is proposed and demonstrated for small-angle neutron scattering (SANS) measurements. Its main feature is the correction of geometry- and wavelength-dependent intensity variations on the detector in a separate step from the different pixel sensitivities: the geometric and wavelength effects can be corrected analytically, while pixel sensitivities have to be calibrated to a reference measurement. The geometric effects are treated for position-sensitive 3He proportional counter tubes, where they are anisotropic owing to the cylindrical geometry of the gas tubes. For the calibration of pixel sensitivities, a procedure is developed that is valid for isotropic and anisotropic signals. The proposed procedure can save a significant amount of beamtime which has hitherto been used for calibration measurements. PMID:29021734
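The separation proposed (analytic geometry correction first, empirical pixel-sensitivity calibration second) can be sketched for the simplest flat-detector case. Note that the cos³θ solid-angle factor below is the isotropic textbook correction, not the anisotropic correction derived in the paper for cylindrical ³He tubes, and all array shapes are hypothetical:

```python
import numpy as np

def correct_image(counts, pixel_x, pixel_y, L2, flat_field):
    """Apply an analytic solid-angle correction, then divide by an
    empirically calibrated pixel-sensitivity (flat-field) map."""
    X, Y = np.meshgrid(pixel_x, pixel_y)
    cos_theta = L2 / np.sqrt(L2**2 + X**2 + Y**2)
    solid_angle = cos_theta**3        # flat-detector solid-angle factor
    return counts / (solid_angle * flat_field)

# Hypothetical 128 x 128 detector with 8 mm pixels, 5 m from the sample
n = 128
px = (np.arange(n) - n / 2) * 0.008
rng = np.random.default_rng(3)
counts = rng.poisson(100, (n, n)).astype(float)
flat = rng.normal(1.0, 0.02, (n, n))       # from a reference measurement
corrected = correct_image(counts, px, px, 5.0, flat)
```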
Modified procedure to determine acid-insoluble lignin in wood and pulp
DOE Office of Scientific and Technical Information (OSTI.GOV)
Effland, M.J.
1977-10-01
If wood is treated with strong acid, carbohydrates are hydrolyzed and solubilized. The insoluble residue is by definition lignin and can be measured gravimetrically. The standard method of analysis requires samples of 1 or 2 g of wood or pulp. In research at this laboratory these amounts of sample are often not available for analytical determinations. Thus we developed a modification of the standard procedure suitable for much smaller sample amounts. The modification is based on the procedure of Saeman. Wood samples require extraction prior to lignin analysis to remove acid-insoluble extractives that will be measured as lignin. Usually this involves only a standard extraction with ethanol-benzene. However, woods high in tannin must also be subjected to extraction with alcohol. Pulps seldom require extraction.
NASA Astrophysics Data System (ADS)
Khellat, M. R.; Mirjalili, A.
2017-03-01
We first consider the idea of renormalization group-induced estimates, in the context of optimization procedures, for the Brodsky-Lepage-Mackenzie approach to generating higher-order contributions to QCD perturbative series. Secondly, we develop the deviation pattern approach (DPA), in which, through a series of comparisons between lower-order RG-induced estimates and the corresponding analytical calculations, one can modify higher-order RG-induced estimates. Finally, using the normal estimation procedure and the DPA, we obtain estimates of the α_s^4 corrections for the Bjorken sum rule of polarized deep-inelastic scattering and for the non-singlet contribution to the Adler function.
Bozzolino, Cristina; Leporati, Marta; Gani, Federica; Ferrero, Cinzia; Vincenti, Marco
2018-02-20
A fast analytical method for the simultaneous detection of 24 β2-agonists in human urine was developed and validated. The method covers the therapeutic drugs most commonly administered, but also potentially abused β2-agonists. The procedure is based on enzymatic deconjugation with β-glucuronidase followed by SPE clean-up using mixed-phase cartridges with both ion-exchange and lipophilic properties. Instrumental analysis conducted by UHPLC-MS/MS allowed high peak resolution and rapid chromatographic separation, with reduced time and costs. The method was fully validated according to ISO 17025:2005 principles. The following parameters were determined for each analyte: specificity, selectivity, linearity, limit of detection, limit of quantification, precision, accuracy, matrix effect, recovery and carry-over. The method was tested on real samples obtained from patients subjected to clinical treatment under chronic or acute therapy with either formoterol, indacaterol, salbutamol, or salmeterol. The drugs were administered using pressurized metered-dose inhalers. All β2-agonists administered to the patients were detected in the real samples. The method proved adequate to accurately measure the concentrations of these analytes in the real samples. The observed analytical data are discussed with reference to the administered dose and the duration of the therapy. Copyright © 2017 Elsevier B.V. All rights reserved.
Long, Stephen E; Catron, Brittany L; Boggs, Ashley Sp; Tai, Susan Sc; Wise, Stephen A
2016-09-01
The use of urinary iodine as an indicator of iodine status relies in part on the accuracy of the analytical measurement of iodine in urine. Likewise, the use of dietary iodine intake as an indicator of iodine status relies in part on the accuracy of the analytical measurement of iodine in dietary sources, including foods and dietary supplements. Similarly, the use of specific serum biomarkers of thyroid function to screen for both iodine deficiency and iodine excess relies in part on the accuracy of the analytical measurement of those biomarkers. The National Institute of Standards and Technology has been working with the NIH Office of Dietary Supplements for several years to develop higher-order reference measurement procedures and Standard Reference Materials to support the validation of new routine analytical methods for iodine in foods and dietary supplements, for urinary iodine, and for several serum biomarkers of thyroid function including thyroid-stimulating hormone, thyroglobulin, total and free thyroxine, and total and free triiodothyronine. These materials and methods have the potential to improve the assessment of iodine status and thyroid function in observational studies and clinical trials, thereby promoting public health efforts related to iodine nutrition. © 2016 American Society for Nutrition.
NASA Technical Reports Server (NTRS)
Sawdy, D. T.; Beckemeyer, R. J.; Patterson, J. D.
1976-01-01
Results are presented from detailed analytical studies made to define methods for obtaining improved multisegment lining performance by taking advantage of relative placement of each lining segment. Properly phased liner segments reflect and spatially redistribute the incident acoustic energy and thus provide additional attenuation. A mathematical model was developed for rectangular ducts with uniform mean flow. Segmented acoustic fields were represented by duct eigenfunction expansions, and mode-matching was used to ensure continuity of the total field. Parametric studies were performed to identify attenuation mechanisms and define preliminary liner configurations. An optimization procedure was used to determine optimum liner impedance values for a given total lining length, Mach number, and incident modal distribution. Optimal segmented liners are presented and it is shown that, provided the sound source is well-defined and flow environment is known, conventional infinite duct optimum attenuation rates can be improved. To confirm these results, an experimental program was conducted in a laboratory test facility. The measured data are presented in the form of analytical-experimental correlations. Excellent agreement between theory and experiment verifies and substantiates the analytical prediction techniques. The results indicate that phased liners may be of immediate benefit in the development of improved aircraft exhaust duct noise suppressors.
Langlois, Marie-Hélène; Vekris, Antonios; Bousses, Christine; Mordelet, Elodie; Buhannic, Nathalie; Séguard, Céline; Couraud, Pierre-Olivier; Weksler, Babette B; Petry, Klaus G; Gaudin, Karen
2015-04-15
A Reversed Phase-High Performance Liquid Chromatography/Diode Array Detection method was developed and validated for paracetamol quantification in cell culture fluid from an in vitro Blood Brain Barrier model. The chromatographic method and sample preparation were developed using only aqueous solvents. The column was an XTerra RP18 150 × 4.6 mm, 3.5 μm with a guard column XTerra RP18 20 × 4.6 mm, 3.5 μm at 35 °C; the mobile phase was composed of 20 mM formate buffer at pH 4 (100% aqueous) at a flow rate of 1 mL/min. Detection was at 242 nm and the injection volume was 10 μL. Validation was performed using the accuracy profile approach. The analytical procedure was validated with acceptance limits of ±10% over a concentration range from 1 to 58 mg L(-1). The procedure was then used routinely to determine paracetamol concentrations in the in vitro blood-brain barrier model. Application of the Unither paracetamol formulation in the Blood Brain Barrier model allowed the determination and comparison of the transcellular passage of paracetamol at 37 °C and 4 °C, which excludes paracellular or non-specific leakage. Copyright © 2015 Elsevier B.V. All rights reserved.
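A minimal sketch of an accuracy-profile style check at a single concentration level, assuming a simplified one-series beta-expectation tolerance interval and the ±10% acceptance limits quoted above (the full approach pools several validation series; the data and function names are illustrative):

import numpy as np
from scipy.stats import t

def accuracy_profile_level(measured, nominal, beta=0.95, acceptance=0.10):
    # Relative error of each replicate at this concentration level.
    rel_err = np.asarray(measured, dtype=float) / nominal - 1.0
    n = rel_err.size
    m, s = rel_err.mean(), rel_err.std(ddof=1)
    # Simplified beta-expectation tolerance interval for a single series.
    k = t.ppf((1.0 + beta) / 2.0, n - 1) * np.sqrt(1.0 + 1.0 / n)
    lo, hi = m - k * s, m + k * s
    return m, (lo, hi), (lo >= -acceptance and hi <= acceptance)

print(accuracy_profile_level([9.8, 10.1, 10.3, 9.9, 10.0], nominal=10.0))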
Design and analysis of composite structures with stress concentrations
NASA Technical Reports Server (NTRS)
Garbo, S. P.
1983-01-01
An overview of an analytic procedure which can be used to provide comprehensive stress and strength analysis of composite structures with stress concentrations is given. The methodology provides designer/analysts with a user-oriented procedure which, within acceptable engineering accuracy, accounts for the effects of a wide range of application design variables. The procedure permits the strength of arbitrary laminate constructions under general bearing/bypass load conditions to be predicted with only unnotched unidirectional strength and stiffness input data required. Included is a brief discussion of the relevancy of this analysis to the design of primary aircraft structure; an overview of the analytic procedure with theory/test correlations; and an example of the use and interaction of this strength analysis relative to the design of high-load transfer bolted composite joints.
ERIC Educational Resources Information Center
Anderson, Craig A.; Shibuya, Akiko; Ihori, Nobuko; Swing, Edward L.; Bushman, Brad J.; Sakamoto, Akira; Rothstein, Hannah R.; Saleem, Muniba
2010-01-01
Meta-analytic procedures were used to test the effects of violent video games on aggressive behavior, aggressive cognition, aggressive affect, physiological arousal, empathy/desensitization, and prosocial behavior. Unique features of this meta-analytic review include (a) more restrictive methodological quality inclusion criteria than in past…
40 CFR 63.786 - Test methods and procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... level of sample dilution must be factored in. (2) Repeatability. First, at the 0.1-5 percent analyte... percent analyte range the results would be suspect if duplicates vary by more than 5 percent relative and...) Reproducibility. First, at the 0.1-5 percent analyte range the results would be suspect if lab to lab variation...
40 CFR 63.786 - Test methods and procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... level of sample dilution must be factored in. (2) Repeatability. First, at the 0.1-5 percent analyte... percent analyte range the results would be suspect if duplicates vary by more than 5 percent relative and...) Reproducibility. First, at the 0.1-5 percent analyte range the results would be suspect if lab to lab variation...
How to conduct External Quality Assessment Schemes for the pre-analytical phase?
Kristensen, Gunn B B; Aakre, Kristin Moberg; Kristoffersen, Ann Helen; Sandberg, Sverre
2014-01-01
In laboratory medicine, several studies have described the most frequent errors in the different phases of the total testing process, and a large proportion of these errors occur in the pre-analytical phase. Schemes for registration of errors and subsequent feedback to the participants have been conducted for decades concerning the analytical phase by External Quality Assessment (EQA) organizations operating in most countries. The aim of the paper is to present an overview of different types of EQA schemes for the pre-analytical phase, and give examples of some existing schemes. So far, very few EQA organizations have focused on the pre-analytical phase, and most EQA organizations do not offer pre-analytical EQA schemes (EQAS). It is more difficult to perform and standardize pre-analytical EQAS, and accreditation bodies also do not ask the laboratories for results from such schemes. However, some ongoing EQA programs for the pre-analytical phase do exist, and some examples are given in this paper. The methods used can be divided into three different types: collecting information about pre-analytical laboratory procedures, circulating real samples to collect information about interferences that might affect the measurement procedure, or registering actual laboratory errors and relating these to quality indicators. These three types have a different focus and different challenges regarding implementation, and a combination of the three is probably necessary to be able to detect and monitor the wide range of errors occurring in the pre-analytical phase.
Importance of implementing an analytical quality control system in a core laboratory.
Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T
2015-01-01
The aim of the clinical laboratory is to provide useful information for screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of extra-analytical and analytical process, based on set criteria. To do this, it develops and implements a system of internal quality control, designed to detect errors, and compare its data with other laboratories, through external quality control. In this way it has a tool to detect the fulfillment of the objectives set, and in case of errors, allowing corrective actions to be made, and ensure the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodical assessment intervals (6 months) to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operation procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically depicted as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. This should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators, systematic, random and total error at regular intervals, in order to ensure that they are meeting pre-determined specifications, and if not, apply the appropriate corrective actions. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
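A minimal sketch of the kind of summary statistics mentioned above (mean, SD, CV, systematic and total error) for one internal QC level; the TE = |bias| + 1.65·CV convention and the example values are assumptions, not the article's data:

import numpy as np

def qc_summary(values, target):
    # Summarize an internal QC level: imprecision (CV), systematic error (bias),
    # and total error using the common TE = |bias| + 1.65*CV convention.
    values = np.asarray(values, dtype=float)
    mean = values.mean()
    sd = values.std(ddof=1)
    cv = 100.0 * sd / mean
    bias = 100.0 * (mean - target) / target
    total_error = abs(bias) + 1.65 * cv
    return {"mean": mean, "sd": sd, "cv_%": cv, "bias_%": bias, "total_error_%": total_error}

print(qc_summary([4.9, 5.1, 5.0, 5.2, 4.8], target=5.0))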
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barber, F.H.; Borek, T.T.; Christopher, J.Z.
1997-12-01
Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).
Estimating Aquifer Properties Using Sinusoidal Pumping Tests
NASA Astrophysics Data System (ADS)
Rasmussen, T. C.; Haborak, K. G.; Young, M. H.
2001-12-01
We develop the theoretical and applied framework for using sinusoidal pumping tests to estimate aquifer properties for confined, leaky, and partially penetrating conditions. The framework 1) derives analytical solutions for three boundary conditions suitable for many practical applications, 2) validates the analytical solutions against a finite element model, 3) establishes a protocol for conducting sinusoidal pumping tests, and 4) estimates aquifer hydraulic parameters based on the analytical solutions. The analytical solutions to sinusoidal stimuli in radial coordinates are derived for boundary value problems that are analogous to the Theis (1935) confined aquifer solution, the Hantush and Jacob (1955) leaky aquifer solution, and the Hantush (1964) partially penetrated confined aquifer solution. The analytical solutions compare favorably to a finite-element solution of a simulated flow domain, except in the region immediately adjacent to the pumping well where the implicit assumption of zero borehole radius is violated. The procedure is demonstrated in one unconfined and two confined aquifer units near the General Separations Area at the Savannah River Site, a federal nuclear facility located in South Carolina. Aquifer hydraulic parameters estimated using this framework provide independent confirmation of parameters obtained from conventional aquifer tests. The sinusoidal approach also resulted in the elimination of investigation-derived wastes.
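A minimal sketch of one step in such an analysis: extracting the amplitude attenuation and phase lag of an observation-well response at the known pumping frequency by linear least squares. Matching these quantities against the analytical solutions to estimate transmissivity and storativity is not shown; the data, period, and noise level below are synthetic.

import numpy as np

def amplitude_phase(t, h, omega):
    # Fit h(t) ~ a*cos(omega*t) + b*sin(omega*t) + c and return amplitude and phase lag.
    A = np.column_stack([np.cos(omega * t), np.sin(omega * t), np.ones_like(t)])
    (a, b, _), *_ = np.linalg.lstsq(A, h, rcond=None)
    return np.hypot(a, b), np.arctan2(b, a)

t = np.linspace(0.0, 3600.0, 721)                 # 1 hour of data, 5 s sampling
omega = 2.0 * np.pi / 600.0                       # 10-minute pumping period (hypothetical)
h = 0.05 * np.cos(omega * t - 0.8) + 0.001 * np.random.default_rng(0).standard_normal(t.size)
print(amplitude_phase(t, h, omega))               # ~ (0.05 m amplitude, 0.8 rad phase lag)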
The pitfalls of hair analysis for toxicants in clinical practice: three case reports.
Frisch, Melissa; Schwartz, Brian S
2002-01-01
Hair analysis is used to assess exposure to heavy metals in patients presenting with nonspecific symptoms and is a commonly used procedure in patients referred to our clinic. We are frequently called on to evaluate patients who have health-related concerns as a result of hair analysis. Three patients first presented to outside physicians with nonspecific, multisystemic symptoms. A panel of analytes was measured in hair, and one or more values were interpreted as elevated. As a result of the hair analysis and other unconventional diagnostic tests, the patients presented to us believing they suffered from metal toxicity. In this paper we review the clinical efficacy of this procedure within the context of a patient population with somatic disorders and no clear risk factors for metal intoxication. We also review limitations of hair analysis in this setting; these limitations include patient factors such as low pretest probability of disease and test factors such as the lack of validation of analytic techniques, the inability to discern between exogenous contaminants and endogenous toxicants in hair, the variability of analytic procedures, low interlaboratory reliability, and the increased likelihood of false positive test results in the measurement of panels of analytes. PMID:11940463
NASA Astrophysics Data System (ADS)
Conte, Eric D.; Barry, Eugene F.; Rubinstein, Harry
1996-12-01
Certain individuals may be sensitive to specific compounds in consumer products. It is important to quantify these analytes in food products in order to monitor their intake. Caffeine is one such compound. Determination of caffeine in beverages by spectrophotometric procedures requires an extraction step, which can prove time-consuming. Although the corresponding determination by HPLC allows for direct injection, capillary zone electrophoresis provides several advantages such as extremely low solvent consumption, smaller sample volume requirements, and improved sensitivity.
A Fuzzy-Based Decision Support Model for Selecting the Best Dialyser Flux in Haemodialysis.
Oztürk, Necla; Tozan, Hakan
2015-01-01
Decision making is an important procedure for every organization. The procedure is particularly challenging for complicated multi-criteria problems. Selection of dialyser flux is one of the decisions routinely made in haemodialysis treatment provided for chronic kidney failure patients. This study provides a decision support model for selecting the best dialyser flux between high-flux and low-flux dialyser alternatives. The preferences of decision makers were collected via a questionnaire. A total of 45 questionnaires filled in by dialysis physicians and nephrologists were assessed. A hybrid fuzzy-based decision support software that enables the use of Analytic Hierarchy Process (AHP), Fuzzy Analytic Hierarchy Process (FAHP), Analytic Network Process (ANP), and Fuzzy Analytic Network Process (FANP) was used to evaluate the flux selection model. In conclusion, the results showed that a high-flux dialyser is the best option for haemodialysis treatment.
Experimental evaluation of tool run-out in micro milling
NASA Astrophysics Data System (ADS)
Attanasio, Aldo; Ceretti, Elisabetta
2018-05-01
This paper deals with the micro milling cutting process, focusing on tool run-out measurement. In fact, among the effects of the scale reduction from macro to micro (i.e., size effects), tool run-out plays an important role. This research is aimed at developing an easy and reliable method to measure tool run-out in micro milling based on experimental tests and an analytical model. From an Industry 4.0 perspective, this measuring strategy can be integrated into an adaptive system for controlling cutting forces, with the objective of improving production quality and process stability while reducing tool wear and machining costs. The proposed procedure estimates the tool run-out parameters from the tool diameter, the channel width, and the phase angle between the cutting edges. The cutting edge phase measurement is based on force signal analysis. The developed procedure has been tested on data from micro milling experiments performed on a Ti6Al4V sample. The results showed that the developed procedure can be successfully used for tool run-out estimation.
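As a purely geometric illustration of why the channel width enters such an estimate (this is not the paper's model, which also uses the cutting-edge phase angle extracted from the force signal): for a two-flute tool, static radial run-out enlarges the milled slot beyond the nominal diameter.

def runout_first_order(channel_width_um, tool_diameter_um):
    # With run-out r0, the flute on the offset side sweeps a radius d/2 + r0, so the slot
    # width is roughly d + 2*r0; hence a first-order estimate r0 ~ (w - d) / 2.
    return max(0.0, (channel_width_um - tool_diameter_um) / 2.0)

print(runout_first_order(channel_width_um=812.0, tool_diameter_um=800.0))  # ~6 µm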
Dynamic variational asymptotic procedure for laminated composite shells
NASA Astrophysics Data System (ADS)
Lee, Chang-Yong
Unlike published shell theories, the main two parts of this thesis are devoted to the asymptotic construction of a refined theory for composite laminated shells valid over a wide range of frequencies and wavelengths. The resulting theory is applicable to shells each layer of which is made of materials with monoclinic symmetry. It enables one to analyze shell dynamic responses within both long-wavelength, low- and high-frequency vibration regimes. It also leads to energy functionals that are both positive definite and sufficiently simple for all wavelengths. This whole procedure was first performed analytically. From the insight gained from the procedure, a finite element version of the analysis was then developed, and a corresponding computer program, DVAPAS, was written. DVAPAS can obtain the generalized 2-D constitutive law and accurately recover the 3-D results for stress and strain in composite shells. Some independent work will be needed to develop the corresponding 2-D surface analysis associated with the present theory and to continue towards full verification and validation of the present process by comparison with available published works.
Surgical task analysis of simulated laparoscopic cholecystectomy with a navigation system.
Sugino, T; Kawahira, H; Nakamura, R
2014-09-01
Advanced surgical procedures, which have become complex and difficult, increase the burden of surgeons. Quantitative analysis of surgical procedures can improve training, reduce variability, and enable optimization of surgical procedures. To this end, a surgical task analysis system was developed that uses only surgical navigation information. Division of the surgical procedure, task progress analysis, and task efficiency analysis were done. First, the procedure was divided into five stages. Second, the operating time and progress rate were recorded to document task progress during specific stages, including the dissecting task. Third, the speed of the surgical instrument motion (mean velocity and acceleration), as well as the size and overlap ratio of the approximate ellipse of the location log data distribution, was computed to estimate the task efficiency during each stage. These analysis methods were evaluated based on experimental validation with two groups of surgeons, i.e., skilled and "other" surgeons. The performance metrics and analytical parameters included incidents during the operation, the surgical environment, and the surgeon's skills or habits. Comparison of groups revealed that skilled surgeons tended to perform the procedure in less time and involved smaller regions; they also manipulated the surgical instruments more gently. Surgical task analysis developed for quantitative assessment of surgical procedures and surgical performance may provide practical methods and metrics for objective evaluation of surgical expertise.
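A minimal sketch of the motion metrics described above (mean velocity and acceleration of the instrument, plus an approximate ellipse of the position-log distribution), assuming the navigation system provides time-stamped tip coordinates; the 95% covariance ellipse is one plausible definition, not necessarily the authors' exact one, and the trajectory below is synthetic.

import numpy as np

def motion_metrics(t, xyz):
    # Velocity and acceleration by numerical differentiation of the navigation log.
    v = np.gradient(xyz, t, axis=0)
    a = np.gradient(v, t, axis=0)
    mean_speed = np.linalg.norm(v, axis=1).mean()
    mean_accel = np.linalg.norm(a, axis=1).mean()
    # Approximate ellipse of the location distribution: 95% covariance ellipse in the
    # x-y plane (chi-square quantile for 2 degrees of freedom ~ 5.991).
    eigvals = np.linalg.eigvalsh(np.cov(xyz[:, :2].T))
    ellipse_area = np.pi * 5.991 * np.sqrt(eigvals.prod())
    return mean_speed, mean_accel, ellipse_area

t = np.linspace(0.0, 10.0, 501)
xyz = np.column_stack([np.cos(t), np.sin(t), 0.1 * t])   # synthetic instrument trajectory
print(motion_metrics(t, xyz))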
NASA Technical Reports Server (NTRS)
Wilkenfeld, J. M.; Judge, R. J. R.; Harlacher, B. L.
1982-01-01
A combined experimental and analytical program to develop system electrical test procedures for the qualification of spacecraft against damage produced by space-electron-induced discharges (EID) occurring on spacecraft dielectric outer surfaces is described. The data on the response of a simple satellite model, called CAN, to electron-induced discharges is presented. The experimental results were compared to predicted behavior and to the response of the CAN to electrical injection techniques simulating blowoff and arc discharges. Also included is a review of significant results from other ground tests and the P78-2 program to form a data base from which is specified those test procedures which optimally simulate the response of spacecraft to EID. The electrical and electron spraying test data were evaluated to provide a first-cut determination of the best methods for performance of electrical excitation qualification tests from the point of view of simulation fidelity.
Advanced superposition methods for high speed turbopump vibration analysis
NASA Technical Reports Server (NTRS)
Nielson, C. E.; Campany, A. D.
1981-01-01
The small, high pressure Mark 48 liquid hydrogen turbopump was analyzed and dynamically tested to determine the cause of high speed vibration at an operating speed of 92,400 rpm. This approaches the design point operating speed of 95,000 rpm. The initial dynamic analysis in the design stage and subsequent further analysis of the rotor only dynamics failed to predict the vibration characteristics found during testing. An advanced procedure for dynamics analysis was used in this investigation. The procedure involves developing accurate dynamic models of the rotor assembly and casing assembly by finite element analysis. The dynamically instrumented assemblies are independently rap tested to verify the analytical models. The verified models are then combined by modal superposition techniques to develop a completed turbopump model where dynamic characteristics are determined. The results of the dynamic testing and analysis obtained are presented and methods of moving the high speed vibration characteristics to speeds above the operating range are recommended. Recommendations for use of these advanced dynamic analysis procedures during initial design phases are given.
Fundamental procedures of geographic information analysis
NASA Technical Reports Server (NTRS)
Berry, J. K.; Tomlin, C. D.
1981-01-01
Analytical procedures common to most computer-oriented geographic information systems are composed of fundamental map processing operations. A conceptual framework for such procedures is developed and basic operations common to a broad range of applications are described. Among the major classes of primitive operations identified are those associated with: reclassifying map categories as a function of the initial classification, the shape, the position, or the size of the spatial configuration associated with each category; overlaying maps on a point-by-point, a category-wide, or a map-wide basis; measuring distance; establishing visual or optimal path connectivity; and characterizing cartographic neighborhoods based on the thematic or spatial attributes of the data values within each neighborhood. By organizing such operations in a coherent manner, the basis for a generalized cartographic modeling structure can be developed which accommodates a variety of needs in a common, flexible and intuitive manner. The use of each is limited only by the general thematic and spatial nature of the data to which it is applied.
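A minimal sketch of two of the primitive operation classes named above, reclassification and point-by-point overlay, on toy raster grids (the legend and the risk rule are invented for illustration):

import numpy as np

landuse = np.array([[1, 1, 2],
                    [2, 3, 3],
                    [1, 2, 3]])            # assumed legend: 1=forest, 2=crop, 3=urban
slope_steep = np.array([[0, 1, 1],
                        [0, 0, 1],
                        [1, 1, 0]], bool)  # True where slope exceeds some threshold

# Reclassify map categories as a function of the initial classification.
vegetated = np.isin(landuse, [1, 2])

# Overlay maps on a point-by-point basis: flag steep, non-forested cells.
at_risk = slope_steep & (landuse != 1)
print(at_risk.astype(int))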
The NASTRAN theoretical manual
NASA Technical Reports Server (NTRS)
1981-01-01
Designed to accommodate additions and modifications, this commentary on NASTRAN describes the problem solving capabilities of the program in a narrative fashion and presents developments of the analytical and numerical procedures that underlie the program. Seventeen major sections and numerous subsections cover: the organizational aspects of the program, utility matrix routines, static structural analysis, heat transfer, dynamic structural analysis, computer graphics, special structural modeling techniques, error analysis, interaction between structures and fluids, and aeroelastic analysis.
Training the next generation analyst using red cell analytics
NASA Astrophysics Data System (ADS)
Graham, Meghan N.; Graham, Jacob L.
2016-05-01
We have seen significant change in the study and practice of human reasoning in recent years from both a theoretical and methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-gen analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University have been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA is not always realized by the solution but by the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the area of security and risk and intelligence training.
Note: Model identification and analysis of bivalent analyte surface plasmon resonance data.
Tiwari, Purushottam Babu; Üren, Aykut; He, Jin; Darici, Yesim; Wang, Xuewen
2015-10-01
Surface plasmon resonance (SPR) is a widely used, affinity based, label-free biophysical technique to investigate biomolecular interactions. The extraction of rate constants requires accurate identification of the particular binding model. The bivalent analyte model involves coupled non-linear differential equations. No clear procedure to identify the bivalent analyte mechanism has been established. In this report, we propose a unique signature for the bivalent analyte model. This signature can be used to distinguish the bivalent analyte model from other biphasic models. The proposed method is demonstrated using experimentally measured SPR sensorgrams.
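A minimal sketch of the coupled nonlinear rate equations usually written for the bivalent analyte model (one common parameterization; conventions for statistical factors and response units differ between software packages, and the rate constants below are invented):

import numpy as np
from scipy.integrate import solve_ivp

def bivalent(t, y, C, ka1, kd1, ka2, kd2, Bmax):
    # A + B  <-> AB  (ka1, kd1): analyte binds a first ligand site.
    # AB + B <-> AB2 (ka2, kd2): the bound analyte engages a second site.
    AB, AB2 = y
    B = Bmax - AB - 2.0 * AB2                  # free ligand sites
    dAB = ka1 * C * B - kd1 * AB - ka2 * AB * B + kd2 * AB2
    dAB2 = ka2 * AB * B - kd2 * AB2
    return [dAB, dAB2]

sol = solve_ivp(bivalent, (0.0, 300.0), [0.0, 0.0],
                args=(10e-9, 1e5, 1e-3, 1e3, 1e-3, 1.0),
                t_eval=np.linspace(0.0, 300.0, 301))
response = sol.y[0] + sol.y[1]                 # total bound analyte (arbitrary units)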
Annual banned-substance review: analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Walpurgis, Katja; Geyer, Hans; Schänzer, Wilhelm
2016-01-01
The aim of improving anti-doping efforts is predicated on several different pillars, including, amongst others, optimized analytical methods. These commonly result from exploiting most recent developments in analytical instrumentation as well as research data on elite athletes' physiology in general, and pharmacology, metabolism, elimination, and downstream effects of prohibited substances and methods of doping, in particular. The need for frequent and adequate adaptations of sports drug testing procedures has been incessant, largely due to the uninterrupted emergence of new chemical entities but also due to the apparent use of established or even obsolete drugs for reasons other than therapeutic means, such as assumed beneficial effects on endurance, strength, and regeneration capacities. Continuing the series of annual banned-substance reviews, literature concerning human sports drug testing published between October 2014 and September 2015 is summarized and reviewed in reference to the content of the 2015 Prohibited List as issued by the World Anti-Doping Agency (WADA), with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2016 John Wiley & Sons, Ltd.
42 CFR 493.801 - Condition: Enrollment and testing of samples.
Code of Federal Regulations, 2013 CFR
2013-10-01
... subpart. (2)(i) Designate the program(s) to be used for each specialty, subspecialty, and analyte or test... procedures, in accordance with § 493.1236(c)(1). (3) For each specialty, subspecialty and analyte or test...
42 CFR 493.801 - Condition: Enrollment and testing of samples.
Code of Federal Regulations, 2012 CFR
2012-10-01
... subpart. (2)(i) Designate the program(s) to be used for each specialty, subspecialty, and analyte or test... procedures, in accordance with § 493.1236(c)(1). (3) For each specialty, subspecialty and analyte or test...
42 CFR 493.801 - Condition: Enrollment and testing of samples.
Code of Federal Regulations, 2010 CFR
2010-10-01
... subpart. (2)(i) Designate the program(s) to be used for each specialty, subspecialty, and analyte or test... procedures, in accordance with § 493.1236(c)(1). (3) For each specialty, subspecialty and analyte or test...
42 CFR 493.801 - Condition: Enrollment and testing of samples.
Code of Federal Regulations, 2011 CFR
2011-10-01
... subpart. (2)(i) Designate the program(s) to be used for each specialty, subspecialty, and analyte or test... procedures, in accordance with § 493.1236(c)(1). (3) For each specialty, subspecialty and analyte or test...
42 CFR 493.801 - Condition: Enrollment and testing of samples.
Code of Federal Regulations, 2014 CFR
2014-10-01
... subpart. (2)(i) Designate the program(s) to be used for each specialty, subspecialty, and analyte or test... procedures, in accordance with § 493.1236(c)(1). (3) For each specialty, subspecialty and analyte or test...
75 FR 5722 - Procedures for Transportation Workplace Drug and Alcohol Testing Programs
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-04
... drugs in a DOT drug test. You must not test ``DOT specimens'' for any other drugs. (a) Marijuana... test analyte concentration analyte concentration Marijuana metabolites 50 ng/mL THCA \\1\\ 15 ng/mL...
Kumar, Keshav; Mishra, Ashok Kumar
2015-07-01
The fluorescence characteristics of 8-anilinonaphthalene-1-sulfonic acid (ANS) in ethanol-water mixtures, in combination with partial least squares (PLS) analysis, were used to propose a simple and sensitive analytical procedure for monitoring the adulteration of ethanol with water. The proposed analytical procedure was found to be capable of detecting even small levels of adulteration of ethanol by water. The robustness of the procedure is evident from statistical parameters such as the square of the correlation coefficient (R(2)), the root mean square error of calibration (RMSEC) and the root mean square error of prediction (RMSEP), which were found to be well within acceptable limits.
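A minimal sketch of the chemometric step, assuming the emission spectra are arranged row-wise in X and the known water fraction in y; the synthetic spectra below only mimic an intensity decrease and red shift with added water and are not the paper's data.

import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
water = np.linspace(0.0, 20.0, 30)                         # % water (assumed range)
wl = np.linspace(450.0, 550.0, 100)                        # emission wavelengths, nm
X = ((1.0 - 0.02 * water[:, None])
     * np.exp(-(wl - (470.0 + 1.5 * water[:, None]))**2 / (2.0 * 15.0**2))
     + 0.01 * rng.standard_normal((water.size, wl.size)))

pls = PLSRegression(n_components=3).fit(X, water)
pred = pls.predict(X).ravel()
rmsec = np.sqrt(np.mean((water - pred)**2))                # root mean square error of calibration
print(round(rmsec, 3))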
Sample Collection Procedures and Strategies
Individuals responsible for collecting environmental and building material samples following a contamination incident can use these procedures to plan for and/or collect samples for analysis using the analytical methods listed in EPA's SAM.
Mirasoli, Mara; Guardigli, Massimo; Michelini, Elisa; Roda, Aldo
2014-01-01
Miniaturization of analytical procedures through microchips, lab-on-a-chip or micro total analysis systems is one of the most recent trends in chemical and biological analysis. These systems are designed to perform all the steps in an analytical procedure, with the advantages of low sample and reagent consumption, fast analysis, reduced costs, and the possibility of extra-laboratory application. A range of detection technologies have been employed in miniaturized analytical systems, but most applications have relied on fluorescence and electrochemical detection. Chemical luminescence (which includes chemiluminescence, bioluminescence, and electrogenerated chemiluminescence) represents an alternative detection principle that offers comparable (or better) analytical performance and easier implementation in miniaturized analytical devices. Nevertheless, chemical luminescence-based devices represent only a small fraction of the microfluidic devices reported in the literature, and until now no review has focused on them. Here we review the most relevant applications (since 2009) of miniaturized analytical devices based on chemical luminescence detection. After a brief overview of the main chemical luminescence systems and of the recent technological advancements regarding their implementation in miniaturized analytical devices, analytical applications are reviewed according to the nature of the device (microfluidic chips, microchip electrophoresis, lateral flow- and paper-based devices) and the type of application (micro-flow injection assays, enzyme assays, immunoassays, gene probe hybridization assays, cell assays, whole-cell biosensors). Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Ghaffari, F.; Chaturvedi, S. K.
1984-01-01
An analytical design procedure for leading edge extensions (LEE) was developed for thick delta wings. This LEE device is designed to be mounted to a wing along the pseudo-stagnation stream surface associated with the attached flow design lift coefficient of greater than zero. The intended purpose of this device is to improve the aerodynamic performance of high subsonic and low supersonic aircraft at incidences above that of attached flow design lift coefficient, by using a vortex system emanating along the leading edges of the device. The low pressure associated with these vortices would act on the LEE upper surface and the forward facing area at the wing leading edges, providing an additional lift and effective leading edge thrust recovery. The first application of this technique was to a thick, round edged, twisted and cambered wing of approximately triangular planform having a sweep of 58 deg and aspect ratio of 2.30. The panel aerodynamics and vortex lattice method with suction analogy computer codes were employed to determine the pseudo-stagnation stream surface and an optimized LEE planform shape.
Salvo, Andrea; La Torre, Giovanna Loredana; Di Stefano, Vita; Capocchiano, Valentina; Mangano, Valentina; Saija, Emanuele; Pellizzeri, Vito; Casale, Katia Erminia; Dugo, Giacomo
2017-04-15
A fast reversed-phase UPLC method was developed for squalene determination in Sicilian pistachio samples that are entered in the European register of products with P.D.O. status. In the present study, the SPE procedure was optimized for squalene extraction prior to UPLC/PDA analysis. The precision of the full analytical procedure was satisfactory, and the mean recoveries were 92.8±0.3% and 96.6±0.1% for the 25 and 50 mg L(-1) addition levels, respectively. The selected chromatographic conditions allowed a very fast squalene determination; in fact, it was well separated in ∼0.54 min with good resolution. Squalene was detected in all the pistachio samples analyzed, and the levels ranged from 55.45 to 226.34 mg kg(-1). Comparing our results with those of other studies, it emerges that squalene contents in P.D.O. Sicilian pistachio samples were generally higher than those measured for other samples of different geographic origins. Copyright © 2016 Elsevier Ltd. All rights reserved.
Tuzen, Mustafa; Karaman, Isa; Citak, Demirhan; Soylak, Mustafa
2009-07-01
In the presented work, a method has been developed for mercury(II) and methyl mercury speciation on a Staphylococcus aureus-loaded Dowex Optipore V-493 micro-column, using cold vapour atomic absorption spectrometry. Selective and sequential elution, with 0.1 mol L(-1) HCl for methyl mercury and 2 mol L(-1) HCl for mercury(II), was performed over the pH range 2-6. Optimal analytical conditions, including pH, amount of biosorbent, and sample volume, were investigated. The detection limits of the analytes were 2.5 ng L(-1) for Hg(II) and 1.7 ng L(-1) for methyl mercury. The capacity of the biosorbent for mercury(II) and methyl mercury was 6.5 and 5.4 mg g(-1), respectively. The validation of the presented procedure was performed by the analysis of a standard reference material. The established speciation procedure was successfully applied to the speciation of mercury(II) and methyl mercury in natural water and microwave-digested fish samples.
Jordan, Gregor; Onami, Ichio; Heinrich, Julia; Staack, Roland F
2017-11-01
Assessment of active drug exposure of biologics may be crucial for drug development. Typically, ligand-binding assay methods are used to provide free/active drug concentrations. To what extent hybrid LC-MS/MS procedures enable correct 'active' drug quantification is currently under consideration. Experimental & results: The relevance of appropriate extraction conditions was evaluated by a hybrid target capture immuno-affinity LC-MS/MS method using total and free/active quality controls (QCs). The rapid extraction (10 min) provided correct results, whereas overnight incubation resulted in significant overestimation of the free/active drug (monoclonal antibody) concentration. Conventional total QCs were inappropriate for determining optimal method conditions, in contrast to free/active QCs. The 'free/active analyte QC concept' enables development of appropriate extraction conditions for correct active drug quantification by hybrid LC-MS/MS.
Selecting Statistical Procedures for Quality Control Planning Based on Risk Management.
Yago, Martín; Alcover, Silvia
2016-07-01
According to the traditional approach to statistical QC planning, the performance of QC procedures is assessed in terms of its probability of rejecting an analytical run that contains critical size errors (PEDC). Recently, the maximum expected increase in the number of unacceptable patient results reported during the presence of an undetected out-of-control error condition [Max E(NUF)], has been proposed as an alternative QC performance measure because it is more related to the current introduction of risk management concepts for QC planning in the clinical laboratory. We used a statistical model to investigate the relationship between PEDC and Max E(NUF) for simple QC procedures widely used in clinical laboratories and to construct charts relating Max E(NUF) with the capability of the analytical process that allow for QC planning based on the risk of harm to a patient due to the report of erroneous results. A QC procedure shows nearly the same Max E(NUF) value when used for controlling analytical processes with the same capability, and there is a close relationship between PEDC and Max E(NUF) for simple QC procedures; therefore, the value of PEDC can be estimated from the value of Max E(NUF) and vice versa. QC procedures selected by their high PEDC value are also characterized by a low value for Max E(NUF). The PEDC value can be used for estimating the probability of patient harm, allowing for the selection of appropriate QC procedures in QC planning based on risk management. © 2016 American Association for Clinical Chemistry.
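A minimal sketch of the classical power calculation behind PEDC for a simple control rule (the Max E(NUF) computation itself is not reproduced; the rule, the number of controls N, and the quality specifications below are hypothetical):

from scipy.stats import norm

def p_rejection(SE, k=2.5, N=1):
    # Probability that a run with a systematic shift of SE analytical SDs is rejected by a
    # simple 1_k_s rule applied to N control measurements per run.
    p_within = norm.cdf(k - SE) - norm.cdf(-k - SE)
    return 1.0 - p_within**N

def critical_systematic_error(TEa, bias, cv):
    # Shift (in SD multiples) at which 5% of patient results exceed the allowable total
    # error TEa; the usual sigma-metric style expression (all inputs in percent).
    return (TEa - abs(bias)) / cv - 1.65

SEc = critical_systematic_error(TEa=10.0, bias=1.0, cv=2.0)   # hypothetical specifications
print(SEc, p_rejection(SEc, k=2.5, N=2))                      # PEDC of a 1_2.5s, N = 2 design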
Gentili, Stefano; Mortali, Claudia; Mastrobattista, Luisa; Berretta, Paolo; Zaami, Simona
2016-09-10
A procedure based on headspace solid-phase microextraction (HS-SPME) coupled with gas chromatography/mass spectrometry (GC/MS) has been developed for the determination of the most commonly used drugs of abuse in the sweat of drivers stopped during roadside controls. The DrugWipe 5A sweat screening device was used to collect sweat with a specific pad rubbed gently over the forehead skin surface. The procedure involved acid hydrolysis; HS-SPME extraction for all drugs of abuse except Δ(9)-tetrahydrocannabinol, which was directly extracted under alkaline-medium HS-SPME conditions; GC separation of the analytes on a capillary column; and MS detection by electron impact ionisation. The method was linear from the limit of quantification (LOQ) to 50 ng drug per pad (r(2)≥0.99), with intra- and inter-assay precision and accuracy always less than 15% and an analytical recovery between 95.1% and 102.8%, depending on the analyte considered. Using the validated method, sweat samples from 60 apparently intoxicated drivers were found positive for one or more drugs of abuse, showing sweat patch testing to be a viable, economical and simple alternative to conventional (blood and/or urine) and non-conventional (oral fluid) testing for drugs of abuse in drugged drivers. Copyright © 2016 Elsevier B.V. All rights reserved.
Loading-unloading response of circular GLARE fiber-metal laminates under lateral indentation
NASA Astrophysics Data System (ADS)
Tsamasphyros, George J.; Bikakis, George S.
2015-01-01
GLARE is a Fiber-Metal laminated material used in aerospace structures which are frequently subjected to various impact damages. Hence, the response of GLARE plates subjected to lateral indentation is very important. In this paper, analytical expressions are derived and a non-linear finite element modeling procedure is proposed in order to predict the static load-indentation curves of circular GLARE plates during loading and unloading by a hemispherical indentor. We have recently published analytical formulas and a finite element procedure for the static indentation of circular GLARE plates which are now used during the loading stage. Here, considering that aluminum layers are in a state of membrane yield and employing energy balance during unloading, the unloading path is determined. Using this unloading path, an algebraic equation is derived for calculating the permanent dent depth of the GLARE plate after the indentor's withdrawal. Furthermore, our finite element procedure is modified in order to simulate the unloading stage as well. The derived formulas and the proposed finite element modeling procedure are applied successfully to GLARE 2-2/1-0.3 and to GLARE 3-3/2-0.4 circular plates. The analytical results are compared with corresponding FEM results and a good agreement is found. The analytically calculated permanent dent depth is within 6 % for the GLARE 2 plate, and within 7 % for the GLARE 3 plate, of the corresponding numerically calculated result. No other solution of this problem is known to the authors.
40 CFR 265.92 - Sampling and analysis.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) Analytical procedures; and (4) Chain of custody control. [Comment: See “Procedures Manual For Ground-water... characterizing the suitability of the ground water as a drinking water supply, as specified in appendix III. (2...
40 CFR 265.92 - Sampling and analysis.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) Analytical procedures; and (4) Chain of custody control. [Comment: See “Procedures Manual For Ground-water... characterizing the suitability of the ground water as a drinking water supply, as specified in appendix III. (2...
40 CFR 265.92 - Sampling and analysis.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) Analytical procedures; and (4) Chain of custody control. [Comment: See “Procedures Manual For Ground-water... characterizing the suitability of the ground water as a drinking water supply, as specified in appendix III. (2...
14 CFR 34.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.
Code of Federal Regulations, 2012 CFR
2012-01-01
..., DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 34.82...
NASA Astrophysics Data System (ADS)
Fukushima, Toshio
2018-02-01
In order to accelerate the spherical harmonic synthesis and/or analysis of an arbitrary function on the unit sphere, we developed a pair of procedures to transform between a truncated spherical harmonic expansion and the corresponding two-dimensional Fourier series. First, we obtained an analytic expression for the sine/cosine series coefficients of the 4π fully normalized associated Legendre function in terms of the rectangle values of the Wigner d function. Then, we elaborated the existing method to transform the coefficients of the surface spherical harmonic expansion to those of the double Fourier series so as to be applicable to arbitrarily high degree and order. Next, we created a new method to transform a given double Fourier series inversely to the corresponding surface spherical harmonic expansion. The key of the new method is a couple of new recurrence formulas to compute the inverse transformation coefficients: a decreasing-order, fixed-degree, and fixed-wavenumber three-term formula for general terms, and an increasing-degree-and-order and fixed-wavenumber two-term formula for diagonal terms. Meanwhile, the two seed values are analytically prepared. Both the forward and inverse transformation procedures are confirmed to be sufficiently accurate and applicable to extremely high degree/order/wavenumber, as high as 2^30 ≈ 10^9. The developed procedures will be useful not only in the synthesis and analysis of spherical harmonic expansions of arbitrarily high degree and order, but also in the evaluation of the derivatives and integrals of the spherical harmonic expansion.
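For orientation, the general structure of the sine/cosine series used in the first step can be sketched as follows (structure only; the coefficients, which the paper expresses through rectangle values of the Wigner d function, are not reproduced here):

\bar{P}_{nm}(\cos\theta) = \sum_{k} A^{nm}_{k}\,\cos k\theta \quad (m\ \text{even}), \qquad
\bar{P}_{nm}(\cos\theta) = \sum_{k} B^{nm}_{k}\,\sin k\theta \quad (m\ \text{odd}),

where the sums run over wavenumbers k \le n of the same parity as n.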
Li, Jian; Chen, Tian; Wang, Yuwei; Shi, Zhixiong; Zhou, Xianqing; Sun, Zhiwei; Wang, Dejun; Wu, Yongning
2017-02-01
Two simplified sample preparation procedures for simultaneous extraction and clean-up of tetrabromobisphenol A, α-, β-, and γ-hexabromocyclododecane and polybrominated diphenyl ethers in human serum were developed and validated. The first procedure was based on solid-phase extraction. Sample extraction, purification, and lipid removal were carried out directly on an Oasis HLB cartridge. The second procedure was a quick, easy, cheap, effective, rugged, and safe (QuEChERS)-based approach using octadecyl-modified silica particles as a sorbent. After sample extraction and cleanup, tetrabromobisphenol A/hexabromocyclododecane was separated from polybrominated diphenyl ethers by using a Si-based cartridge. Tetrabromobisphenol A and hexabromocyclododecane were then detected by high-performance liquid chromatography coupled to tandem mass spectrometry, while polybrominated diphenyl ethers were detected by gas chromatography coupled to tandem mass spectrometry. The results of the spike recovery test using fetal bovine serum showed that the average recoveries of the analytes ranged from 87.3 to 115.3% with relative standard deviations equal to or lower than 13.4%. Limits of detection of the analytes were in the range of 0.4-19 pg/mL, except for decabromodiphenyl ether. The developed method was successfully applied to routine analysis of human serum samples from occupational workers and the general population. Extremely high serum polybrominated diphenyl ether levels, up to 3.32 × 10^4 ng/g lipid weight, were found in occupational workers. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Surface slope metrology of highly curved x-ray optics with an interferometric microscope
NASA Astrophysics Data System (ADS)
Gevorkyan, Gevork S.; Centers, Gary; Polonska, Kateryna S.; Nikitin, Sergey M.; Lacey, Ian; Yashchuk, Valeriy V.
2017-09-01
The development of deterministic polishing techniques has given rise to vendors that manufacture high-quality three-dimensional x-ray optics. Surface metrology on these optics remains a difficult task. For the fabrication, vendors usually use unique surface metrology tools, generally developed on site, that are not available in the optical metrology labs at x-ray facilities. At the Advanced Light Source X-Ray Optics Laboratory, we have developed a rather straightforward interferometric-microscopy-based procedure capable of sub-microradian characterization of the sagittal slope variation of x-ray optics for two-dimensional focusing and collimation (such as ellipsoids, paraboloids, etc.). In the paper, we provide the mathematical foundation of the procedure and describe the related instrument calibration. We also present an analytical expression describing the ideal surface shape in the sagittal direction of a spheroid specified by the conjugate parameters of the optic's beamline application. The expression is useful when analyzing data obtained with such optics. The high efficiency of the developed measurement and data analysis procedures is demonstrated by results of measurements on a number of x-ray optics with sagittal radii of curvature between 56 mm and 480 mm. We also discuss potential areas of further improvement.
Communication Network Analysis Methods.
ERIC Educational Resources Information Center
Farace, Richard V.; Mabee, Timothy
This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…
Liu, Hsu-Chuan; Den, Walter; Chan, Shu-Fei; Kin, Kuan Tzu
2008-04-25
The present study aimed to develop a procedure, modified from the conventional solid-phase extraction (SPE) method, for the analysis of trace concentrations of phthalate esters in industrial ultrapure water (UPW). The proposed procedure allows a UPW sample to be drawn through a sampling tube containing a hydrophobic sorbent (Tenax TA) to concentrate the aqueous phthalate esters. The solid trap was then demoisturized by two-stage gas drying before being subjected to thermal desorption and analysis by gas chromatography-mass spectrometry. This process eliminates the solvent extraction step required by the conventional SPE method and permits automation of the analytical procedure for high-volume analyses. Several important parameters, including desorption temperature and duration, packing quantity and demoisturizing procedure, were optimized in this study based on the analytical sensitivity for a standard mixture containing five different phthalate esters. The method detection limits for the five phthalate esters were between 36 ng l(-1) and 95 ng l(-1), with recovery rates between 15% and 101%. Dioctyl phthalate (DOP) was not recovered adequately because the compound was both poorly adsorbed onto and poorly desorbed from Tenax TA sorbents. Furthermore, analyses of material leaching from poly(vinyl chloride) (PVC) tubes, as well as of the actual water samples, showed that di-n-butyl phthalate (DBP) and di(2-ethylhexyl) phthalate (DEHP) were the common contaminants detected in PVC-contaminated UPW and the actual UPW, as well as in tap water. The reduction of DEHP in the production processes of actual UPW was clearly observed; however, a DEHP concentration of 0.20 μg l(-1) was still quantified at the point of use, suggesting that contamination with phthalate esters could present a barrier to future cleanliness requirements for UPW. The work demonstrated that the proposed modified SPE procedure provides an effective method for rapid analysis and contamination identification in UPW production lines.
Auditing of chromatographic data.
Mabie, J T
1998-01-01
During a data audit, it is important to ensure that there is clear documentation and an audit trail. The Quality Assurance Unit should review all areas, including the laboratory, during the conduct of the sample analyses. The analytical methodology that is developed should be documented prior to sample analyses. This is an important document for the auditor, as it is the instrumental piece used by the laboratory personnel to maintain integrity throughout the process. It is expected that this document will give insight into the sample analysis, run controls, run sequencing, instrument parameters, and acceptance criteria for the samples. The sample analysis and all supporting documentation should be audited in conjunction with this written analytical method and any supporting Standard Operating Procedures to ensure the quality and integrity of the data.
An analytic technique for statistically modeling random atomic clock errors in estimation
NASA Technical Reports Server (NTRS)
Fell, P. J.
1981-01-01
Minimum variance estimation requires that the statistics of random observation errors be modeled properly. If measurements are derived through the use of atomic frequency standards, then one source of error affecting the observable is random fluctuation in frequency. This is the case, for example, with range and integrated Doppler measurements from satellites of the Global Positioning System and with baseline determination for geodynamic applications. An analytic method is presented which approximates the statistics of this random process. The procedure starts with a model of the Allan variance for a particular oscillator and develops the statistics of range and integrated Doppler measurements. A series of five first-order Markov processes is used to approximate the power spectral density obtained from the Allan variance.
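A minimal sketch of the modeling idea, assuming the standard discrete first-order Gauss-Markov recursion; the correlation times and amplitudes below are placeholders, not a fit to any particular Allan-variance model.

import numpy as np

def gauss_markov(n, dt, sigma, tau, rng):
    # First-order Gauss-Markov process: x[k] = phi*x[k-1] + q*w[k], with stationary
    # variance sigma^2 and Lorentzian PSD ~ sigma^2*tau / (1 + (2*pi*f*tau)^2).
    phi = np.exp(-dt / tau)
    q = sigma * np.sqrt(1.0 - phi**2)
    x = np.zeros(n)
    for k in range(1, n):
        x[k] = phi * x[k - 1] + q * rng.standard_normal()
    return x

rng = np.random.default_rng(1)
dt, n = 1.0, 100_000
taus = [10.0, 1e2, 1e3, 1e4, 1e5]               # placeholder correlation times, s
sigmas = [1e-12, 2e-12, 4e-12, 8e-12, 1.6e-11]  # placeholder strengths
clock_error = sum(gauss_markov(n, dt, s, t, rng) for s, t in zip(sigmas, taus))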
Biosensors for the determination of environmental inhibitors of enzymes
NASA Astrophysics Data System (ADS)
Evtugyn, Gennadii A.; Budnikov, Herman C.; Nikolskaya, Elena B.
1999-12-01
Characteristic features of functioning and practical application of enzyme-based biosensors for the determination of environmental pollutants as enzyme inhibitors are considered with special emphasis on the influence of the methods used for the measurement of the rates of enzymic reactions, of enzyme immobilisation procedure and of the composition of the reaction medium on the analytical characteristics of inhibitor assays. The published data on the development of biosensors for detecting pesticides and heavy metals are surveyed. Special attention is given to the use of cholinesterase-based biosensors in environmental and analytical monitoring. The approaches to the estimation of kinetic parameters of inhibition are reviewed and the factors determining the selectivity and sensitivity of inhibitor assays in environmental objects are analysed. The bibliography includes 195 references.
Bassarab, P; Williams, D; Dean, J R; Ludkin, E; Perry, J J
2011-02-04
A method for the simultaneous determination of two biocidal quaternary ammonium compounds, didecyldimethylammonium chloride (didecyldimethyl quat) and dodecylbenzyldimethylammonium chloride (benzyl quat), in seawater by solid-phase extraction (SPE) followed by liquid chromatography-mass spectrometry (LC-MS) was developed. The optimised procedure utilised off-line extraction of the analytes from seawater using polymeric (Strata-X) SPE cartridges. Recoveries ranged from 80 to 105%, with detection limits at the low parts-per-trillion (ng/l) level for both analytes. To demonstrate sensitivity, environmental concentrations were measured at three different locations along the North East coast of England, with measured values in the range 120-270 ng/l. Copyright © 2010 Elsevier B.V. All rights reserved.
Muthu, Pravin; Lutz, Stefan
2016-04-05
Fast, simple and cost-effective methods for detecting and quantifying pharmaceutical agents in patients are highly sought after to replace equipment and labor-intensive analytical procedures. The development of new diagnostic technology including portable detection devices also enables point-of-care by non-specialists in resource-limited environments. We have focused on the detection and dose monitoring of nucleoside analogues used in viral and cancer therapies. Using deoxyribonucleoside kinases (dNKs) as biosensors, our chemometric model compares observed time-resolved kinetics of unknown analytes to known substrate interactions across multiple enzymes. The resulting dataset can simultaneously identify and quantify multiple nucleosides and nucleoside analogues in complex sample mixtures. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
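A toy sketch of the underlying idea, matching an observed multi-enzyme response against a library of known substrate parameters, is given below; it uses simple Michaelis-Menten initial rates rather than the authors' time-resolved chemometric model, and all names and kinetic constants are hypothetical.

```python
def mm_rate(conc, vmax, km):
    """Michaelis-Menten initial rate for a given substrate concentration."""
    return vmax * conc / (km + conc)

# Library of known (vmax, km) pairs for each candidate nucleoside across two enzymes
# (all names and kinetic constants are hypothetical placeholders).
library = {
    "nucleoside_A": [(1.0, 0.2), (0.4, 1.5)],
    "nucleoside_B": [(0.6, 0.9), (1.1, 0.3)],
}

def identify(observed_rates, conc):
    """Return the library entry whose multi-enzyme response best matches the observation."""
    def sse(params):
        return sum((mm_rate(conc, v, k) - r) ** 2
                   for (v, k), r in zip(params, observed_rates))
    return min(library, key=lambda name: sse(library[name]))

print(identify(observed_rates=[0.55, 0.75], conc=2.0))  # -> "nucleoside_B"
```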
Validation of the replica trick for simple models
NASA Astrophysics Data System (ADS)
Shinzato, Takashi
2018-04-01
We discuss the replica analytic continuation using several simple models in order to prove mathematically the validity of the replica analysis, which is used in a wide range of fields related to large-scale complex systems. While replica analysis consists of two analytical techniques—the replica trick (or replica analytic continuation) and the thermodynamical limit (and/or order parameter expansion)—we focus our study on replica analytic continuation, which is the mathematical basis of the replica trick. We apply replica analysis to solve a variety of analytical models, and examine the properties of replica analytic continuation. Based on the positive results for these models we propose that replica analytic continuation is a robust procedure in replica analysis.
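The identity at the heart of the replica trick, which the paper examines model by model, is the standard analytic continuation

$$\langle \ln Z \rangle \;=\; \lim_{n\to 0}\frac{\langle Z^{\,n}\rangle - 1}{n}\;=\;\lim_{n\to 0}\frac{\partial}{\partial n}\ln\langle Z^{\,n}\rangle ,$$

where the average of Z^n is evaluated for integer numbers of replicas n and then continued analytically to real n near zero; the specific models and order-parameter expansions treated in the paper are not reproduced here.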
SAMPLING AND ANALYSIS OF MERCURY IN CRUDE OIL
Sampling and analytical procedures used to determine total mercury content in crude oils were examined. Three analytical methods were compared with respect to accuracy, precision and detection limit. The combustion method and a commercial extraction method were found adequate to...
Code of Federal Regulations, 2014 CFR
2014-07-01
... PROCEDURES Equipment, Measurement Instruments, Fuel, and Analytical Gas Specifications § 1066.101 Overview. (a) This subpart addresses equipment related to emission testing, as well as test fuels and... specifications for fuels, engine fluids, and analytical gases; these specifications apply for testing under this...
Containment of composite fan blades
NASA Technical Reports Server (NTRS)
Stotler, C. L.; Coppa, A. P.
1979-01-01
A lightweight containment was developed for turbofan engine fan blades. Subscale ballistic-type tests were first run on a number of concepts. The most promising configuration was selected and further evaluated by larger scale tests in a rotating test rig. Weight savings made possible by the use of this new containment system were determined and extrapolated to a CF6-size engine. An analytical technique was also developed to predict the motion of released blades during the blade/casing interaction process. Initial checkout of this procedure was accomplished using several of the tests run during the program.
Foreman, William T.; Gray, James L.; ReVello, Rhiannon C.; Lindley, Chris E.; Losche, Scott A.; Barber, Larry B.
2012-01-01
A new analytical method has been developed and implemented at the U.S. Geological Survey National Water Quality Laboratory that determines a suite of 20 steroid hormones and related compounds in filtered water (using laboratory schedule 2434) and in unfiltered water (using laboratory schedule 4434). This report documents the procedures and initial performance data for the method and provides guidance on application of the method and considerations of data quality in relation to data interpretation. The analytical method determines 6 natural and 3 synthetic estrogen compounds, 6 natural androgens, 1 natural and 1 synthetic progestin compound, and 2 sterols: cholesterol and 3-beta-coprostanol. These two sterols have limited biological activity but typically are abundant in wastewater effluents and serve as useful tracers. Bisphenol A, an industrial chemical used primarily to produce polycarbonate plastic and epoxy resins and that has been shown to have estrogenic activity, also is determined by the method. A technique referred to as isotope-dilution quantification is used to improve quantitative accuracy by accounting for sample-specific procedural losses in the determined analyte concentration. Briefly, deuterium- or carbon-13-labeled isotope-dilution standards (IDSs), all of which are direct or chemically similar isotopic analogs of the method analytes, are added to all environmental and quality-control and quality-assurance samples before extraction. Method analytes and IDS compounds are isolated from filtered or unfiltered water by solid-phase extraction onto an octadecylsilyl disk, overlain with a graded glass-fiber filter to facilitate extraction of unfiltered sample matrices. The disks are eluted with methanol, and the extract is evaporated to dryness, reconstituted in solvent, passed through a Florisil solid-phase extraction column to remove polar organic interferences, and again evaporated to dryness in a reaction vial. The method compounds are reacted with activated N-methyl-N-trimethylsilyl trifluoroacetamide at 65 degrees Celsius for 1 hour to form trimethylsilyl or trimethylsilyl-enol ether derivatives that are more amenable to gas chromatographic separation than the underivatized compounds. Analysis is carried out by gas chromatography with tandem mass spectrometry using calibration standards that are derivatized concurrently with the sample extracts. Analyte concentrations are quantified relative to specific IDS compounds in the sample, which directly compensate for procedural losses (incomplete recovery) in the determined and reported analyte concentrations. Thus, reported analyte concentrations (or analyte recoveries for spiked samples) are corrected based on recovery of the corresponding IDS compound during the quantification process. Recovery for each IDS compound is reported for each sample and represents an absolute recovery in a manner comparable to surrogate recoveries for other organic methods used by the National Water Quality Laboratory. Thus, IDS recoveries provide a useful tool for evaluating sample-specific analytical performance from an absolute mass recovery standpoint. IDS absolute recovery will differ and typically be lower than the corresponding analyte’s method recovery in spiked samples. However, additional correction of reported analyte concentrations is unnecessary and inappropriate because the analyte concentration (or recovery) already is compensated for by the isotope-dilution quantification procedure.
Method analytes were spiked at 10 and 100 nanograms per liter (ng/L) for most analytes (10 times greater spike levels were used for bisphenol A and 100 times greater spike levels were used for 3-beta-coprostanol and cholesterol) into the following validation-sample matrices: reagent water, wastewater-affected surface water, a secondary-treated wastewater effluent, and a primary (no biological treatment) wastewater effluent. Overall method recovery for all analytes in these matrices averaged 100 percent, with overall relative standard deviation of 28 percent. Mean recoveries of the 20 individual analytes for spiked reagent-water samples prepared along with field samples and analyzed in 2009–2010 ranged from 84–104 percent, with relative standard deviations of 6–36 percent. Concentrations for two analytes, equilin and progesterone, are reported as estimated because these analytes had excessive bias or variability, or both. Additional database coding is applied to other reported analyte data as needed, based on sample-specific IDS recovery performance. Detection levels were derived statistically by fortifying reagent water at six different levels (0.1 to 4 ng/L) and range from about 0.4 to 4 ng/L for 16 analytes. Interim reporting levels applied to analytes in this report range from 0.8 to 8 ng/L. Bisphenol A and the sterols (cholesterol and 3-beta-coprostanol) were consistently detected in laboratory and field blanks. The minimum reporting levels were set at 100 ng/L for bisphenol A and at 200 ng/L for the two sterols to prevent any bias associated with the presence of these compounds in the blanks. A minimum reporting level of 2 ng/L was set for 11-ketotestosterone to minimize false positive risk from an interfering siloxane compound emanating as chromatographic-column bleed, from vial septum material, or from other sources at no more than 1 ng/L.
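As a schematic illustration of the isotope-dilution quantification logic described above (not the laboratory's actual algorithm), the sketch below computes an analyte concentration from area ratios relative to its IDS, so that procedural losses shared by analyte and IDS cancel; all peak areas, spike masses, and volumes are hypothetical.

```python
def relative_response_factor(area_analyte_cal, area_ids_cal, conc_analyte_cal, conc_ids_cal):
    """RRF from a calibration standard derivatized alongside the sample extracts."""
    return (area_analyte_cal / area_ids_cal) / (conc_analyte_cal / conc_ids_cal)

def isotope_dilution_conc(area_analyte, area_ids, mass_ids_ng, rrf, sample_volume_l):
    """Analyte concentration in ng/L; losses cancel because analyte and IDS are lost together."""
    mass_analyte_ng = (area_analyte / area_ids) * mass_ids_ng / rrf
    return mass_analyte_ng / sample_volume_l

# Hypothetical numbers: 50 ng of a labeled IDS spiked into a 1-L water sample
rrf = relative_response_factor(1.20e5, 1.00e5, 10.0, 10.0)   # calibration areas and concentrations
print(isotope_dilution_conc(3.1e4, 9.0e4, 50.0, rrf, 1.0), "ng/L")
```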
Ermacora, Alessia; Hrnčiřík, Karel
2014-01-01
Substantial progress has been recently made in the development and optimisation of analytical methods for the quantification of 2-MCPD, 3-MCPD and glycidyl esters in oils and fats, and there are a few methods currently available that allow a reliable quantification of these contaminants in bulk oils and fats. On the other hand, no standard method for the analysis of foodstuffs has yet been established. The aim of this study was the development and validation of a new method for the simultaneous quantification of 2-MCPD, 3-MCPD and glycidyl esters in oil-based food products. The developed protocol includes a first step of liquid-liquid extraction and purification of the lipophilic substances of the sample, followed by the application of a previously developed procedure based on acid transesterification, for the indirect quantification of these contaminants in oils and fats. The method validation was carried out on food products (fat-based spreads, creams, margarine, mayonnaise) manufactured in-house, in order to control the manufacturing process and account for any food matrix-analyte interactions (the sample spiking was carried out on the single components used for the formulations rather than the final products). The method showed good accuracy (the recoveries ranged from 97% to 106% for bound 3-MCPD and 2-MCPD and from 88% to 115% for bound glycidol) and sensitivity (the LOD was 0.04 and 0.05 mg kg(-1) for bound MCPD and glycidol, respectively). Repeatability and reproducibility were satisfactory (RSD below 2% and 5%, respectively) for all analytes. The levels of salts and surface-active compounds in the formulation were found to have no impact on the accuracy and the other parameters of the method.
NASA Astrophysics Data System (ADS)
Haddout, Y.; Essaghir, E.; Oubarra, A.; Lahjomri, J.
2017-12-01
Thermally developing laminar slip flow through a micropipe and a parallel plate microchannel, with axial heat conduction and uniform wall heat flux, is studied analytically by using a powerful method of self-adjoint formalism. This method results from a decomposition of the elliptic energy equation into a system of two first-order partial differential equations. The advantage of this method over other methods resides in the fact that the decomposition procedure leads to a self-adjoint problem although the initial problem is apparently not a self-adjoint one. The solution is an extension of prior studies and considers a first-order slip model for the boundary conditions at the fluid-wall interface. The analytical expressions for the developing temperature and local Nusselt number in the thermal entrance region are obtained in the general case. Therefore, the solution obtained could be extended easily to any hydrodynamically developed flow and arbitrary heat flux distribution. The analytical results obtained are compared for selected simplified cases with available numerical calculations, and they agree. The results show that the heat transfer characteristics of flow in the thermal entrance region are strongly influenced by the axial heat conduction and rarefaction effects, which are respectively characterized by the Péclet and Knudsen numbers.
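For orientation, a common dimensional form of the problem being solved (written here for the micropipe, and not reproducing the paper's nondimensionalization or its self-adjoint decomposition) couples the energy equation retaining axial conduction with first-order slip and temperature-jump conditions at the wall:

$$u(r)\,\frac{\partial T}{\partial z}=\alpha\left[\frac{1}{r}\frac{\partial}{\partial r}\!\left(r\,\frac{\partial T}{\partial r}\right)+\frac{\partial^{2}T}{\partial z^{2}}\right],\qquad
u_{s}=-\frac{2-\sigma_{v}}{\sigma_{v}}\,\lambda\left.\frac{\partial u}{\partial r}\right|_{r=R},\qquad
T_{s}-T_{w}=-\frac{2-\sigma_{T}}{\sigma_{T}}\,\frac{2\gamma}{\gamma+1}\,\frac{\lambda}{\Pr}\left.\frac{\partial T}{\partial r}\right|_{r=R},$$

with rarefaction measured by the Knudsen number Kn = λ/D and the importance of axial conduction by the Péclet number Pe = u_m D/α.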
NASA Technical Reports Server (NTRS)
Dillard, D. A.; Morris, D. H.; Brinson, H. F.
1981-01-01
An incremental numerical procedure based on lamination theory is developed to predict creep and creep rupture of general laminates. Existing unidirectional creep compliance and delayed failure data is used to develop analytical models for lamina response. The compliance model is based on a procedure proposed by Findley which incorporates the power law for creep into a nonlinear constitutive relationship. The matrix octahedral shear stress is assumed to control the stress interaction effect. A modified superposition principle is used to account for the varying stress level effect on the creep strain. The lamina failure model is based on a modification of the Tsai-Hill theory which includes the time dependent creep rupture strength. A linear cumulative damage law is used to monitor the remaining lifetime in each ply.
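The ingredients named in this abstract have well-known generic forms, sketched here only for orientation (the report's exact stress-interaction and superposition expressions are not reproduced): a Findley-type power law for lamina creep, a Tsai-Hill criterion with time-dependent strengths for delayed failure, and a linear damage sum for remaining life,

$$\varepsilon(t)=\varepsilon_{0}(\sigma)+m(\sigma)\,t^{\,n},\qquad
\left(\frac{\sigma_{1}}{X}\right)^{2}-\frac{\sigma_{1}\sigma_{2}}{X^{2}}+\left(\frac{\sigma_{2}}{Y(t)}\right)^{2}+\left(\frac{\tau_{12}}{S(t)}\right)^{2}=1,\qquad
\sum_{i}\frac{\Delta t_{i}}{t_{r,i}}=1 .$$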
NASA/FAA general aviation crash dynamics program
NASA Technical Reports Server (NTRS)
Thomson, R. G.; Hayduk, R. J.; Carden, H. D.
1981-01-01
The program involves controlled full scale crash testing, nonlinear structural analyses to predict large deflection elastoplastic response, and load attenuating concepts for use in improved seat and subfloor structure. Both analytical and experimental methods are used to develop expertise in these areas. Analyses include simplified procedures for estimating energy dissipating capabilities and comprehensive computerized procedures for predicting airframe response. These analyses are developed to provide designers with methods for predicting accelerations, loads, and displacements on collapsing structure. Tests on typical full scale aircraft and on full and subscale structural components are performed to verify the analyses and to demonstrate load attenuating concepts. A special apparatus was built to test emergency locator transmitters when attached to representative aircraft structure. The apparatus is shown to provide a good simulation of the longitudinal crash pulse observed in full scale aircraft crash tests.
Skin-stiffener interface stresses in composite stiffened panels
NASA Technical Reports Server (NTRS)
Wang, J. T. S.; Biggers, S. B.
1984-01-01
A model and solution method for determining the normal and shear stresses in the interface between the skin and the stiffener attached flange were developed. An efficient, analytical solution procedure was developed and incorporated in a sizing code for stiffened panels. The analysis procedure described provides a means to study the effects of material and geometric design parameters on the interface stresses. These stresses include the normal stress, and the shear stresses in both the longitudinal and the transverse directions. The tendency toward skin/stiffener separation may therefore be minimized by choosing appropriate values for the design variables. The most important design variables include the relative bending stiffnesses of the skin and stiffener attached flange, the bending stiffness of the stiffener web, and the flange width. The longitudinal compressive loads in the flange and skin have significant effects on the interface stresses.
Solving a Mock Arsenic-Poisoning Case Using Atomic Spectroscopy
NASA Astrophysics Data System (ADS)
Tarr, Matthew A.
2001-01-01
A new upper-level undergraduate atomic spectroscopy laboratory procedure has been developed that presents a realistic problem to students and asks them to assist in solving it. Students are given arsenic-laced soda samples from a mock crime scene. From these samples, they are to gather evidence to help prosecute a murder suspect. The samples are analyzed by inductively coupled plasma atomic emission spectroscopy or by atomic absorbance spectroscopy to determine the content of specific metal impurities. By statistical comparison of the samples' composition, the students determine if the soda samples can be linked to arsenic found in the suspect's home. As much as possible, the procedures and interpretations are developed by the students. Particular emphasis is placed on evaluating the limitations and capabilities of the analytical method with respect to the demands of the problem.
The Capillary Flow Experiments Aboard the International Space Station: Increments 9-15
NASA Technical Reports Server (NTRS)
Jenson, Ryan M.; Weislogel, Mark M.; Tavan, Noel T.; Chen, Yongkang; Semerjian, Ben; Bunnell, Charles T.; Collicott, Steven H.; Klatte, Jorg; Dreyer, Michael E.
2009-01-01
This report provides a summary of the experimental, analytical, and numerical results of the Capillary Flow Experiment (CFE) performed aboard the International Space Station (ISS). The experiments were conducted in space from Increment 9 through Increment 16, beginning August 2004 and ending December 2007. Both primary and extra science experiments were conducted during 19 operations performed by 7 astronauts including: M. Fincke, W. McArthur, J. Williams, S. Williams, M. Lopez-Alegria, C. Anderson, and P. Whitson. CFE consists of 6 approximately 1 to 2 kg handheld experiment units designed to investigate a selection of capillary phenomena of fundamental and applied importance, such as large length scale contact line dynamics (CFE-Contact Line), critical wetting in discontinuous structures (CFE-Vane Gap), and capillary flows and passive phase separations in complex containers (CFE-Interior Corner Flow). Highly quantitative video from the simply performed flight experiments provides data helpful in benchmarking numerical methods, confirming theoretical models, and guiding new model development. In an extensive executive summary, a brief history of the experiment is reviewed before introducing the science investigated. A selection of experimental results and comparisons with both analytic and numerical predictions is given. The subsequent chapters provide additional details of the experimental and analytical methods developed and employed. These include current presentations of the state of the data reduction which we anticipate will continue throughout the year and culminate in several more publications. An extensive appendix is used to provide support material such as an experiment history, dissemination items to date (CFE publication, etc.), detailed design drawings, and crew procedures. Despite the simple nature of the experiments and procedures, many of the experimental results may be practically employed to enhance the design of spacecraft engineering systems involving capillary interface dynamics.
Tarigh, Ghazale Daneshvar; Shemirani, Farzaneh
2014-06-01
A simple and rapid method for the simultaneous in situ derivatization, preconcentration and extraction of thiamine (vitamin B1) as a model analyte was developed by a novel quantitative method, namely ultrasound-assisted dispersive magnetic solid phase extraction spectrofluorimetry (USA-DMSPE-FL) from different real samples. This method consists of sample preparation, in situ derivatization, exhaustive extraction and clean up by a single process. High extraction efficiency and in situ derivatization in a short period of time are the main advantages of this procedure. For this purpose, the reusable magnetic multi-wall carbon nanotube (MMWCNT) nanocomposite was used as an adsorbent for preconcentration and determination of thiamine. Thiamine was, simultaneously, in situ derivatized as thiochrome by potassium hexacyanoferrate (III) and adsorbed on MMWCNT in an ultrasonic water bath. The MMWCNTs were then collected using an external magnetic field. Subsequently, the extracted thiochrome was washed from the surface of the adsorbent and determined by spectrofluorimetry. The developed method, which has been analytically characterized under its optimal operating conditions, allows the detection of the analyte in the samples with method detection limits of 0.37 µg L(-1). The repeatability of the method, expressed as the relative standard deviation (RSD, n=6), varies between 2.0% and 4.8% in different real samples, while the enhancement factor is 197. The proposed procedure has been applied for the determination of thiamine in biological (serum and urine), pharmaceutical (multivitamin tablet and B complex syrup) and foodstuff samples (cereal, wheat flour, banana and honey) with good recoveries in the range from 90% to 105%. Copyright © 2014 Elsevier B.V. All rights reserved.
Evaluation of sampling plans to detect Cry9C protein in corn flour and meal.
Whitaker, Thomas B; Trucksess, Mary W; Giesbrecht, Francis G; Slate, Andrew B; Thomas, Francis S
2004-01-01
StarLink is a genetically modified corn that produces an insecticidal protein, Cry9C. Studies were conducted to determine the variability and Cry9C distribution among sample test results when Cry9C protein was estimated in a bulk lot of corn flour and meal. Emphasis was placed on measuring sampling and analytical variances associated with each step of the test procedure used to measure Cry9C in corn flour and meal. Two commercially available enzyme-linked immunosorbent assay kits were used: one for the determination of Cry9C protein concentration and the other for % StarLink seed. The sampling and analytical variances associated with each step of the Cry9C test procedures were determined for flour and meal. Variances were found to be functions of Cry9C concentration, and regression equations were developed to describe the relationships. Because of the larger particle size, sampling variability associated with cornmeal was about double that for corn flour. For cornmeal, the sampling variance accounted for 92.6% of the total testing variability. The observed sampling and analytical distributions were compared with the Normal distribution. In almost all comparisons, the null hypothesis that the Cry9C protein values were sampled from a Normal distribution could not be rejected at 95% confidence limits. The Normal distribution and the variance estimates were used to evaluate the performance of several Cry9C protein sampling plans for corn flour and meal. Operating characteristic curves were developed and used to demonstrate the effect of increasing sample size on reducing false positives (seller's risk) and false negatives (buyer's risk).
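The operating-characteristic calculation described here can be illustrated with a minimal sketch: assuming test results are normally distributed about the true concentration, the acceptance probability follows from the combined sampling and analytical variances. The variances and acceptance limit below are hypothetical placeholders, not the paper's regression estimates.

```python
import math

def accept_probability(true_conc, limit, var_sampling, var_analytical, n_samples=1):
    """P(lot accepted) when the mean of n sample test results is compared with a limit,
    assuming test results are normally distributed about the true concentration."""
    var_total = var_sampling / n_samples + var_analytical
    z = (limit - true_conc) / math.sqrt(var_total)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical variances (concentration units squared) showing how increasing the number
# of samples steepens the operating-characteristic curve and reduces both risks.
for n in (1, 2, 4):
    oc = [accept_probability(c, limit=1.0, var_sampling=0.36, var_analytical=0.03,
                             n_samples=n) for c in (0.5, 1.0, 1.5)]
    print(n, [round(p, 3) for p in oc])
```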
1983-05-01
DESIGN PROCEDURE. M. S. Hundal, University of Vermont, Burlington, VT. Machinery Dynamics: ANALYTICAL AND EXPERIMENTAL INVESTIGATION OF ROTATING BLADE... A design methodology to accurately predict rotor vibratory loads has recently been initiated for detail design and bench testing of coupled rotor/airframe vibrations... Concentrating on the basic disciplines of aerodynamics and structural dynamics, a coupled rotor/airframe vibration analysis has been developed.
Elly E. Holcombe; Duane G. Moore; Richard L. Fredriksen
1986-01-01
A modification of the macro-Kjeldahl method that provides increased sensitivity was developed for determining very low levels of nitrogen in forest streams and in rain-water. The method is suitable as a routine laboratory procedure. Analytical range of the method is 0.02 to 1.5 mg/L with high recovery and excellent precision and accuracy. The range can be increased to...
Saffron Samples of Different Origin: An NMR Study of Microwave-Assisted Extracts
Sobolev, Anatoly P.; Carradori, Simone; Capitani, Donatella; Vista, Silvia; Trella, Agata; Marini, Federico; Mannina, Luisa
2014-01-01
An NMR analytical protocol is proposed to characterize saffron samples of different geographical origin (Greece, Spain, Hungary, Turkey and Italy). A microwave-assisted extraction procedure was developed to obtain a comparable recovery of metabolites with respect to the ISO specifications, reducing the solvent volume and the extraction time needed. Metabolite profiles of geographically different saffron extracts were compared showing significant differences in the content of some metabolites. PMID:28234327
Low level vapor verification of monomethyl hydrazine
NASA Technical Reports Server (NTRS)
Mehta, Narinder
1990-01-01
The vapor scrubbing system and the coulometric test procedure for the low level vapor verification of monomethyl hydrazine (MMH) are evaluated. Experimental data on precision, efficiency of the scrubbing liquid, instrument response, detection and reliable quantitation limits, stability of the vapor scrubbed solution, and interference were obtained to assess the applicability of the method for the low ppb level detection of the analyte vapor in air. The results indicated that the analyte vapor scrubbing system and the coulometric test procedure can be utilized for the quantitative detection of low ppb level vapor of MMH in air.
Larson, S.J.; Capel, P.D.; VanderLoop, A.G.
1996-01-01
Laboratory and quality assurance procedures for the analysis of ground-water samples for herbicides at the Management Systems Evaluation Area near Princeton, Minnesota are described. The target herbicides include atrazine, de-ethylatrazine, de-isopropylatrazine, metribuzin, alachlor, 2,6-diethylaniline, and metolachlor. The analytical techniques used are solid-phase extraction, and analysis by gas chromatography with mass-selective detection. Descriptions of cleaning procedures, preparation of standard solutions, isolation of analytes from water, sample transfer methods, instrumental analysis, and data analysis are included.
NASA Technical Reports Server (NTRS)
Payne, M. H.
1973-01-01
The bounds for the normalized associated Legendre functions P_nm were studied to provide a rational basis for the truncation of the geopotential series in spherical harmonics in various orbital analyses. The conjecture is made that the largest maximum of the normalized associated Legendre function lies in an interval defined in terms of the greatest integer function. A procedure is developed for verifying this conjecture. An on-line algebraic manipulator, IAM, is used to implement the procedure, and the verification is carried out for all n equal to or less than 2m, for m = 1 through 6. A rigorous proof of the conjecture is not available.
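A numerical counterpart to this conjecture check (illustrative only; the paper's verification was algebraic, via the IAM manipulator) can be sketched by evaluating fully normalized associated Legendre functions on a grid and locating their largest maximum; the degree and order below are arbitrary examples.

```python
import numpy as np
from math import factorial
from scipy.special import lpmv

def normalized_legendre(n, m, x):
    """Fully normalized associated Legendre function (geodesy convention)."""
    k = 1.0 if m == 0 else 2.0
    norm = np.sqrt(k * (2 * n + 1) * factorial(n - m) / factorial(n + m))
    return norm * lpmv(m, n, x)  # lpmv(order, degree, x); includes the Condon-Shortley phase

# Locate the largest maximum of |Pbar_nm| on [-1, 1] by a fine grid search
n, m = 12, 6  # arbitrary example degree and order
x = np.linspace(-1.0, 1.0, 20001)
vals = np.abs(normalized_legendre(n, m, x))
i = int(np.argmax(vals))
print(f"|Pbar_{n},{m}| peaks near x = {x[i]:.4f} with value {vals[i]:.4f}")
```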
NASA Technical Reports Server (NTRS)
Arya, L. M. (Principal Investigator)
1980-01-01
Predictive procedures for developing soil hydrologic properties (i.e., relationships of soil water pressure and hydraulic conductivity to soil water content) are presented. Three models of the soil water pressure-water content relationship and one model of the hydraulic conductivity-water content relationship are discussed. Input requirements for the models are indicated, and computational procedures are outlined. Computed hydrologic properties for Keith silt loam, a soil type near Colby, Kansas, on which the 1978 Agricultural Soil Moisture Experiment was conducted, are presented. A comparison of computed results with experimental data in the dry range shows that analytical models utilizing a few basic hydrophysical parameters can produce satisfactory data for large-scale applications.
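For reference, widely used empirical forms of the two relationships discussed (shown here only as representative examples; they are not necessarily the specific pressure or conductivity models evaluated in the report) are the Campbell/Brooks-Corey type expressions

$$\psi(\theta)=\psi_{e}\left(\frac{\theta}{\theta_{s}}\right)^{-b},\qquad
K(\theta)=K_{s}\left(\frac{\theta}{\theta_{s}}\right)^{2b+3},$$

where ψ_e is the air-entry pressure, θ_s the saturated water content, K_s the saturated hydraulic conductivity, and b a texture-dependent parameter.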
The Da Vinci European BioBank: A Metabolomics-Driven Infrastructure
Carotenuto, Dario; Luchinat, Claudio; Marcon, Giordana; Rosato, Antonio; Turano, Paola
2015-01-01
We present here the organization of the recently-constituted da Vinci European BioBank (daVEB, https://www.davincieuropeanbiobank.org/it). The biobank was created as an infrastructure to support the activities of the Fiorgen Foundation (http://www.fiorgen.net/), a nonprofit organization that promotes research in the field of pharmacogenomics and personalized medicine. The way operating procedures concerning samples and data have been developed at daVEB largely stems from the strong metabolomics connotation of Fiorgen and from the involvement of the scientific collaborators of the foundation in international/European projects aimed to tackle the standardization of pre-analytical procedures and the promotion of data standards in metabolomics. PMID:25913579
Sulej, Anna Maria; Polkowska, Żaneta; Astel, Aleksander; Namieśnik, Jacek
2013-12-15
The purpose of this study is to propose and evaluate new procedures for determination of fuel combustion products, anti-corrosive and de-icing compounds in runoff water samples collected from the airports located in different regions and characterized by different levels of activity expressed by the number of flights and the number of passengers (per year). The most difficult step in the analytical procedure used for the determination of PAHs, benzotriazoles and glycols is the sample preparation stage, due to the diverse matrix composition and the possibility of interference associated with the presence of components with similar physicochemical properties. In this study, five different versions of sample preparation using extraction techniques, such as LLE and SPE, were tested. In all examined runoff water samples collected from the airports, the presence of PAH compounds and glycols was observed. In the majority of the samples, BT compounds were determined. Runoff water samples collected from the areas of Polish and British international airports as well as local airports had similar qualitative composition, but the quantitative composition of the analytes was very diverse. New and validated analytical methodologies ensure that the necessary information for assessing the negative impact of airport activities on the environment can be obtained. © 2013 Elsevier B.V. All rights reserved.
IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.
Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita
2016-10-11
We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise.
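Since these classification statistics are not part of the IBMWA output, a user can still derive them outside the tool from a confusion matrix obtained on a labeled test set; the short sketch below, with hypothetical counts, shows the standard calculations.

```python
def binary_classification_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and odds ratio from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    odds_ratio = (tp * tn) / (fp * fn) if fp and fn else float("inf")
    return sensitivity, specificity, odds_ratio

# Hypothetical counts for a predictive model scored against a held-out labeled test set
print(binary_classification_metrics(tp=80, fp=15, fn=20, tn=85))
```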
Seamless Digital Environment – Data Analytics Use Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oxstrand, Johanna
Multiple research efforts in the U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program study the need for and design of an underlying architecture to support the increased amount and use of data in the nuclear power plant. More specifically, the LWRS research efforts Digital Architecture for an Automated Plant, Automated Work Packages, Computer-Based Procedures for Field Workers, and Online Monitoring have all identified the need for a digital architecture and, more importantly, the need for a Seamless Digital Environment (SDE). An SDE provides a means to access multiple applications, gather the data points needed, conduct the analysis requested, and present the result to the user with minimal or no effort by the user. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting, the nuclear utilities identified the need for research focused on data analytics. The effort was to develop and evaluate use cases for data mining and analytics for employing information from plant sensors and databases for use in developing improved business analytics. The goal of the study is to research potential approaches to building an analytics solution for equipment reliability, on a small scale, focusing on either a single piece of equipment or a single system. The analytics solution will likely consist of a data integration layer, a predictive and machine learning layer, and a user interface layer that will display the output of the analysis in a straightforward, easy to consume manner. This report describes the use case study initiated by NITSL and conducted in a collaboration between Idaho National Laboratory, Arizona Public Service – Palo Verde Nuclear Generating Station, and NextAxiom Inc.
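The three-layer solution outlined above can be pictured with a minimal structural sketch (hypothetical names and stand-in functions only; this is not the laboratories' or vendor's implementation):

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class AnalyticsPipeline:
    """Minimal three-layer structure: integrate -> predict -> present."""
    integrate: Callable[[List[str]], Dict[str, List[float]]]      # pulls sensor/database points
    predict: Callable[[Dict[str, List[float]]], Dict[str, float]]  # equipment-reliability scores
    present: Callable[[Dict[str, float]], str]                     # user-facing summary

    def run(self, sources: List[str]) -> str:
        return self.present(self.predict(self.integrate(sources)))

# Hypothetical stand-ins for each layer (a real system would query plant historians, etc.)
pipeline = AnalyticsPipeline(
    integrate=lambda srcs: {s: [0.1, 0.2, 0.4] for s in srcs},
    predict=lambda data: {k: sum(v) / len(v) for k, v in data.items()},
    present=lambda scores: "; ".join(f"{k}: risk={v:.2f}" for k, v in scores.items()),
)
print(pipeline.run(["feedwater_pump_A"]))
```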
Yang, Yunjia; Yu, Jianlong; Yin, Jie; Shao, Bing; Zhang, Jing
2014-11-19
This study aimed to develop a selective analytical method for the simultaneous determination of seven bisphenol analogues in beverage and canned food samples by using a new molecularly imprinted polymer (MIP) as a sorbent for solid-phase extraction (SPE). Liquid chromatography coupled to triple-quadruple tandem mass spectrometry (LC-MS/MS) was used to identify and quantify the target analytes. The MIP-SPE method exhibited a higher level of selectivity and purification than the traditional SPE method. The developed procedures were further validated in terms of accuracy, precision, and sensitivity. The obtained recoveries varied from 50% to 103% at three fortification levels and yielded a relative standard deviation (RSD, %) of less than 15% for all of the analytes. The limits of quantification (LOQ) for the seven analytes varied from 0.002 to 0.15 ng/mL for beverage samples and from 0.03 to 1.5 ng/g for canned food samples. This method was used to analyze real samples that were collected from a supermarket in Beijing. Overall, the results revealed that bisphenol A and bisphenol F were the most frequently detected bisphenols in the beverage and canned food samples and that their concentrations were closely associated with the type of packaging material. This study provides an alternative method of traditional SPE extraction for screening bisphenol analogues in food matrices.
Pasin, Daniel; Cawley, Adam; Bidny, Sergei; Fu, Shanlin
2017-10-01
The proliferation of new psychoactive substances (NPS) in recent years has resulted in the development of numerous analytical methods for the detection and identification of known and unknown NPS derivatives. High-resolution mass spectrometry (HRMS) has been identified as the method of choice for broad screening of NPS in a wide range of analytical contexts because of its ability to measure accurate masses using data-independent acquisition (DIA) techniques. Additionally, it has shown promise for non-targeted screening strategies that have been developed in order to detect and identify novel analogues without the need for certified reference materials (CRMs) or comprehensive mass spectral libraries. This paper reviews the applications of HRMS for the analysis of NPS in forensic drug chemistry and analytical toxicology. It provides an overview of the sample preparation procedures in addition to data acquisition, instrumental analysis, and data processing techniques. Furthermore, it gives an overview of the current state of non-targeted screening strategies with discussion on future directions and perspectives of this technique. Graphical Abstract: Missing the bullseye - a graphical representation of non-targeted screening. Image courtesy of Christian Alonzo.
Daşbaşı, Teslima; Saçmacı, Şerife; Çankaya, Nevin; Soykan, Cengiz
2016-11-15
In this study, a simple and rapid solid phase extraction/preconcentration procedure was developed for determination of Cd(II), Co(II), Cr(III), Cu(II), Fe(III), Mn(II), Pb(II), and Zn(II) trace metals by flame atomic absorption spectrometry (FAAS). A new chelating resin, poly(N-cyclohexylacrylamide-co-divinylbenzene-co-2-acrylamido-2-methyl-1-propanesulfonic acid) (NCA-co-DVB-co-AMPS) (hereafter CDAP) was synthesized and characterized. The influences of the analytical parameters such as pH of the sample solution, type and concentration of eluent, flow rates of the sample and eluent, volume of the sample and eluent, amount of chelating resin, and interference of ions were examined. The limits of detection (LOD) of the analytes were found (3s) to be in the range of 0.65-1.90μgL(-1). A preconcentration factor (PF) of 200 and a relative standard deviation (RSD) of ⩽2% were achieved (n=11). The developed method was applied for determination of analytes in some dairy samples and certified reference materials. Copyright © 2016 Elsevier Ltd. All rights reserved.
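For orientation, the two figures of merit quoted here are conventionally computed as in the sketch below (a 3s detection-limit criterion and a volume-ratio preconcentration factor); the blank signals, calibration slope, and volumes are hypothetical, not values from the study.

```python
import statistics

def detection_limit(blank_signals, slope):
    """LOD as 3 * standard deviation of blank signals divided by the calibration slope."""
    return 3.0 * statistics.stdev(blank_signals) / slope

def preconcentration_factor(sample_volume_ml, eluent_volume_ml):
    """Volume-based preconcentration factor of an SPE procedure."""
    return sample_volume_ml / eluent_volume_ml

# Hypothetical blank absorbances and calibration slope (absorbance per ug/L)
blanks = [0.0012, 0.0015, 0.0011, 0.0014, 0.0013, 0.0016, 0.0012]
print(detection_limit(blanks, slope=0.0009), "ug/L")
print(preconcentration_factor(1000.0, 5.0))
```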
Wang, Yan-Hong; Avonto, Cristina; Avula, Bharathi; Wang, Mei; Rua, Diego; Khan, Ikhlas A
2015-01-01
An HPLC-UV method was developed for the quantitative analysis of nine skin whitening agents in a single injection. These compounds are α-arbutin, β-arbutin, kojic acid, nicotinamide, resorcinol, ascorbic acid, hydroquinone, 4-methoxyphenol, and 4-ethoxyphenol. The separation was achieved on a reversed-phase C18 column within 30 min. The mobile phase was composed of water and methanol, both containing 0.1% acetic acid (v/v). The stability of the analytes was evaluated at different pH values between 2.3 and 7.6, and the extraction procedure was validated for different types of skin whitening product matrixes, which included two creams, a soap bar, and a capsule. The best solvent system for sample preparation was 20 mM NaH2PO4 containing 10% methanol at pH 2.3. The analytical method was validated for accuracy, precision, LOD, and LOQ. The developed HPLC-UV method was applied for the quantitation of the nine analytes in 59 skin whitening products including creams, lotions, sera, foams, gels, mask sheets, soap bars, tablets, and capsules.
Meyer, Golo M J; Weber, Armin A; Maurer, Hans H
2014-05-01
Diagnosis and prognosis of poisonings should be confirmed by comprehensive screening and reliable quantification of xenobiotics, for example by gas chromatography-mass spectrometry (GC-MS) or liquid chromatography-mass spectrometry (LC-MS). The turnaround time should be short enough to have an impact on clinical decisions. In emergency toxicology, quantification using full-scan acquisition is preferable because this allows screening and quantification of expected and unexpected drugs in one run. Therefore, a multi-analyte full-scan GC-MS approach was developed and validated with liquid-liquid extraction and one-point calibration for quantification of 40 drugs relevant to emergency toxicology. Validation showed that 36 drugs could be determined quickly, accurately, and reliably in the range of upper therapeutic to toxic concentrations. Daily one-point calibration with calibrators stored for up to four weeks reduced workload and turn-around time to less than 1 h. In summary, the multi-analyte approach with simple liquid-liquid extraction, GC-MS identification, and quantification over fast one-point calibration could successfully be applied to proficiency tests and real case samples. Copyright © 2013 John Wiley & Sons, Ltd.
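The daily one-point calibration described here amounts, in its simplest internal-standard form, to scaling the calibrator concentration by the ratio of area ratios; a minimal sketch with hypothetical peak areas (not the authors' exact quantification formula) is:

```python
def one_point_concentration(area_analyte, area_istd, area_cal_analyte, area_cal_istd, conc_cal):
    """Concentration from a daily one-point calibrator using internal-standard area ratios."""
    ratio_sample = area_analyte / area_istd
    ratio_cal = area_cal_analyte / area_cal_istd
    return conc_cal * ratio_sample / ratio_cal

# Hypothetical full-scan GC-MS peak areas; one-point calibrator prepared at 1.0 mg/L
print(one_point_concentration(4.2e5, 2.0e5, 3.0e5, 2.1e5, 1.0), "mg/L")
```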