40 CFR 98.7 - What standardized methods are incorporated by reference into this part?
Code of Federal Regulations, 2012 CFR
2012-07-01
....astm.org. (1) ASTM C25-06 Standard Test Method for Chemical Analysis of Limestone, Quicklime, and....194(c), and § 98.334(b). (2) ASTM C114-09 Standard Test Methods for Chemical Analysis of Hydraulic... approved for § 98.6. (4) ASTM D240-02 (Reapproved 2007) Standard Test Method for Heat of Combustion of...
40 CFR 98.7 - What standardized methods are incorporated by reference into this part?
Code of Federal Regulations, 2013 CFR
2013-07-01
....astm.org. (1) ASTM C25-06 Standard Test Method for Chemical Analysis of Limestone, Quicklime, and....194(c), and § 98.334(b). (2) ASTM C114-09 Standard Test Methods for Chemical Analysis of Hydraulic... approved for § 98.6. (4) ASTM D240-02 (Reapproved 2007) Standard Test Method for Heat of Combustion of...
Annual Book of ASTM Standards, Part 23: Water; Atmospheric Analysis.
ERIC Educational Resources Information Center
American Society for Testing and Materials, Philadelphia, PA.
Standards for water and atmospheric analysis are compiled in this segment, Part 23, of the American Society for Testing and Materials (ASTM) annual book of standards. It contains all current formally approved ASTM standard and tentative test methods, definitions, recommended practices, proposed methods, classifications, and specifications. One…
The estimation of the measurement results with using statistical methods
NASA Astrophysics Data System (ADS)
Velychko, O.; Gordiyenko, T.
2015-02-01
A number of international standards and guides describe various statistical methods that are applied for the management, control, and improvement of processes and for the analysis of technical measurement results. An analysis of these international standards and guides on statistical methods for the estimation of measurement results, with recommendations for their application in laboratories, is presented. To support this analysis, cause-and-effect Ishikawa diagrams concerning the application of statistical methods to the estimation of measurement results were constructed.
NASA Astrophysics Data System (ADS)
Wang, J.; Shi, M.; Zheng, P.; Xue, Sh.; Peng, R.
2018-03-01
Laser-induced breakdown spectroscopy has been applied for the quantitative analysis of Ca, Mg, and K in the roots of Angelica pubescens Maxim. f. biserrata Shan et Yuan used in traditional Chinese medicine. The Ca II 317.993 nm, Mg I 517.268 nm, and K I 769.896 nm spectral lines were chosen to set up calibration models for the analysis using the external standard and artificial neural network methods. The linear correlation coefficients of the predicted concentrations versus the standard concentrations of six samples determined by the artificial neural network method are 0.9896, 0.9945, and 0.9911 for Ca, Mg, and K, respectively, which are better than those for the external standard method. The artificial neural network method also performs better than the external standard method with respect to the average and maximum relative errors, average relative standard deviations, and most maximum relative standard deviations of the predicted concentrations of Ca, Mg, and K in the six samples. Overall, the artificial neural network method outperforms the external standard method for the quantitative analysis of Ca, Mg, and K in the roots of Angelica pubescens.
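As a rough sketch of the two calibration strategies compared above, the code below fits an external-standard line and a small neural network to the same synthetic intensity data. All numbers are invented for illustration, and the use of scikit-learn's MLPRegressor is an assumption, not a detail from the study.

import numpy as np
from sklearn.neural_network import MLPRegressor

np.random.seed(0)
# Synthetic calibration set: line intensity vs. known concentration (wt%).
conc = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0])
intensity = 120.0 * conc + 35.0 + np.random.normal(0, 8, conc.size)

# External standard method: linear calibration, inverted for the unknown.
slope, intercept = np.polyfit(conc, intensity, 1)
unknown_intensity = 510.0
c_external = (unknown_intensity - intercept) / slope

# ANN method: learn the intensity -> concentration mapping directly.
ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
ann.fit(intensity.reshape(-1, 1), conc)
c_ann = ann.predict([[unknown_intensity]])[0]
print(c_external, c_ann)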
Rathi, Monika; Ahrenkiel, S P; Carapella, J J; Wanlass, M W
2013-02-01
Given an unknown multicomponent alloy, and a set of standard compounds or alloys of known composition, can one improve upon popular standards-based methods for energy dispersive X-ray (EDX) spectrometry to quantify the elemental composition of the unknown specimen? A method is presented here for determining elemental composition of alloys using transmission electron microscopy-based EDX with appropriate standards. The method begins with a discrete set of related reference standards of known composition, applies multivariate statistical analysis to those spectra, and evaluates the compositions with a linear matrix algebra method to relate the spectra to elemental composition. By using associated standards, only limited assumptions about the physical origins of the EDX spectra are needed. Spectral absorption corrections can be performed by providing an estimate of the foil thickness of one or more reference standards. The technique was applied to III-V multicomponent alloy thin films: composition and foil thickness were determined for various III-V alloys. The results were then validated by comparing with X-ray diffraction and photoluminescence analysis, demonstrating accuracy of approximately 1% in atomic fraction.
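The "linear matrix algebra method" relating spectra to composition can be sketched as an ordinary least-squares fit from reference-standard spectra (or their multivariate-analysis scores) to known compositions; the unknown's spectrum is then pushed through the fitted map. The matrices below are invented placeholders, not data from the paper.

import numpy as np

# Rows of S: multivariate scores of four reference standards' EDX spectra.
# Rows of C: their known atomic fractions (three elements).
S = np.array([[0.82, 0.11], [0.40, 0.52], [0.15, 0.78], [0.60, 0.33]])
C = np.array([[0.50, 0.00, 0.50], [0.25, 0.25, 0.50],
              [0.00, 0.50, 0.50], [0.35, 0.15, 0.50]])

# Linear map B such that S @ B approximates C.
B, *_ = np.linalg.lstsq(S, C, rcond=None)

s_unknown = np.array([0.55, 0.38])   # scores of the unknown's spectrum
composition = s_unknown @ B
composition /= composition.sum()     # renormalize atomic fractions
print(composition)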
Zietze, Stefan; Müller, Rainer H; Brecht, René
2008-03-01
In order to set up a batch-to-batch-consistency analytical scheme for N-glycosylation analysis, several sample preparation steps, including enzyme digestions and fluorophore labelling, and two HPLC methods were established. The whole method scheme was standardized, evaluated, and validated according to the requirements on analytical testing in early clinical drug development, using a recombinantly produced reference glycoprotein (RGP). Standardization of the methods was achieved through clearly defined standard operating procedures. During evaluation of the methods, the main interest was in determining the loss of oligosaccharides within the analytical scheme. Validation of the methods was performed with respect to specificity, linearity, repeatability, LOD, and LOQ. Because reference N-glycan standards were not available, a statistical approach was chosen to derive accuracy from the linearity data. After finishing the validation procedure, defined limits for method variability could be calculated, and differences observed in consistency analysis could be separated into significant and incidental ones.
Nonparametric Estimation of Standard Errors in Covariance Analysis Using the Infinitesimal Jackknife
ERIC Educational Resources Information Center
Jennrich, Robert I.
2008-01-01
The infinitesimal jackknife provides a simple general method for estimating standard errors in covariance structure analysis. Beyond its simplicity and generality what makes the infinitesimal jackknife method attractive is that essentially no assumptions are required to produce consistent standard error estimates, not even the requirement that the…
Development of Gold Standard Ion-Selective Electrode-Based Methods for Fluoride Analysis
Martínez-Mier, E.A.; Cury, J.A.; Heilman, J.R.; Katz, B.P.; Levy, S.M.; Li, Y.; Maguire, A.; Margineda, J.; O’Mullane, D.; Phantumvanit, P.; Soto-Rojas, A.E.; Stookey, G.K.; Villa, A.; Wefel, J.S.; Whelton, H.; Whitford, G.M.; Zero, D.T.; Zhang, W.; Zohouri, V.
2011-01-01
Background/Aims: Currently available techniques for fluoride analysis are not standardized. Therefore, this study was designed to develop standardized methods for analyzing fluoride in biological and nonbiological samples used for dental research. Methods: A group of nine laboratories analyzed a set of standardized samples for fluoride concentration using their own methods. The group then reviewed existing analytical techniques for fluoride analysis, identified inconsistencies in the use of these techniques, and conducted testing to resolve differences. Based on the results of the testing undertaken to define the best approaches for the analysis, the group developed recommendations for direct and microdiffusion methods using the fluoride ion-selective electrode. Results: Initial results demonstrated that there was no consensus regarding the choice of analytical techniques for different types of samples. Although the results of the fluoride analyses were similar among some laboratories for several types of samples, greater differences were observed for saliva, food, and beverage samples. In spite of these initial differences, precise and true values of fluoride concentration, as well as smaller differences between laboratories, were obtained once the standardized methodologies were used. Intraclass correlation coefficients ranged from 0.90 to 0.93 for the analysis of a certified reference material using the standardized methodologies. Conclusion: The results of this study demonstrate that the development and use of standardized protocols for fluoride analysis significantly decreased differences among laboratories and resulted in more precise and true values. PMID:21160184
Optimal Multicomponent Analysis Using the Generalized Standard Addition Method.
ERIC Educational Resources Information Center
Raymond, Margaret; And Others
1983-01-01
Describes an experiment on the simultaneous determination of chromium and magnesium by spectrophotometry, modified to include the Generalized Standard Addition Method computer program, a multivariate calibration method that provides optimal multicomponent analysis in the presence of interference and matrix effects. Provides instructions for…
ERIC Educational Resources Information Center
Ji, Chang; Boisvert, Susanne M.; Arida, Ann-Marie C.; Day, Shannon E.
2008-01-01
An internal standard method applicable to undergraduate instrumental analysis or environmental chemistry laboratory has been designed and tested to determine the Henry's law constants for a series of alkyl nitriles. In this method, a mixture of the analytes and an internal standard is prepared and used to make a standard solution (organic solvent)…
REPRESENTATIVE SAMPLING AND ANALYSIS OF HETEROGENEOUS SOILS
Standard sampling and analysis methods for hazardous substances in contaminated soils currently are available and routinely employed. Standard methods inherently assume a homogeneous soil matrix and contaminant distribution; therefore only small sample quantities typically are p...
External Standards or Standard Addition? Selecting and Validating a Method of Standardization
NASA Astrophysics Data System (ADS)
Harvey, David T.
2002-05-01
A common feature of many problem-based laboratories in analytical chemistry is a lengthy independent project involving the analysis of "real-world" samples. Students research the literature, adapting and developing a method suitable for their analyte, sample matrix, and problem scenario. Because these projects encompass the complete analytical process, students must consider issues such as obtaining a representative sample, selecting a method of analysis, developing a suitable standardization, validating results, and implementing appropriate quality assessment/quality control practices. Most textbooks and monographs suitable for an undergraduate course in analytical chemistry, however, provide only limited coverage of these important topics. The need for short laboratory experiments emphasizing important facets of method development, such as selecting a method of standardization, is evident. The experiment reported here, which is suitable for an introductory course in analytical chemistry, illustrates the importance of matrix effects when selecting a method of standardization. Students also learn how a spike recovery is used to validate an analytical method and gain practical experience with the difference between performing an external standardization and a standard addition.
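As a short numerical illustration of standard addition (all readings invented): equal aliquots of sample are spiked with increasing amounts of standard, a line is fitted to signal versus added concentration, and the unknown concentration is the magnitude of the x-intercept.

import numpy as np

added = np.array([0.0, 1.0, 2.0, 3.0])            # added standard, ppm
signal = np.array([0.212, 0.413, 0.615, 0.814])   # instrument response

slope, intercept = np.polyfit(added, signal, 1)
c_unknown = intercept / slope                     # |x-intercept|
print(f"analyte concentration ~ {c_unknown:.2f} ppm")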
This SOP describes the method used for preparing surrogate recovery standard and internal standard solutions for the analysis of polar target analytes. It also describes the method for preparing calibration standard solutions for polar analytes used for gas chromatography/mass sp...
40 CFR 86.1 - Reference materials.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) ASTM D1945-91, Standard Test Method for Analysis of Natural Gas by Gas Chromatography, IBR approved for §§ 86.113-94, 86.513-94, 86.1213-94, 86.1313-94. (iii) ASTM D2163-91, Standard Test Method for Analysis... §§ 86.113-94, 86.1213-94, 86.1313-94. (iv) ASTM D2986-95a, Reapproved 1999, Standard Practice for...
40 CFR 86.1 - Reference materials.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) ASTM D1945-91, Standard Test Method for Analysis of Natural Gas by Gas Chromatography, IBR approved for §§ 86.113-94, 86.513-94, 86.1213-94, 86.1313-94. (iii) ASTM D2163-91, Standard Test Method for Analysis... §§ 86.113-94, 86.1213-94, 86.1313-94. (iv) ASTM D2986-95a, Reapproved 1999, Standard Practice for...
Anderson, Carl A; McRae, Allan F; Visscher, Peter M
2006-07-01
Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. Because it is built on linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.
Samuel V. Glass; Stanley D. Gatland II; Kohta Ueno; Christopher J. Schumacher
2017-01-01
ASHRAE Standard 160, Criteria for Moisture-Control Design Analysis in Buildings, was published in 2009. The standard sets criteria for moisture design loads, hygrothermal analysis methods, and satisfactory moisture performance of the building envelope. One of the evaluation criteria specifies conditions necessary to avoid mold growth. The current standard requires that...
High-pressure liquid chromatography analysis of antibiotic susceptibility disks.
Hagel, R B; Waysek, E H; Cort, W M
1979-01-01
The analysis of antibiotic susceptibility disks by high-pressure liquid chromatography (HPLC) was investigated. Methods are presented for the potency determination of mecillinam, ampicillin, carbenicillin, and cephalothin alone and in various combinations. Good agreement between HPLC and microbiological data is observed for potency determinations, with recoveries of greater than 95%. Relative standard deviations below 2% are recorded for each HPLC method. HPLC methods offer improved accuracy and greater precision when compared to the standard microbiological methods of analysis for susceptibility disks. PMID:507793
Kwon, Deukwoo; Reis, Isildinha M
2015-08-12
When conducting a meta-analysis of a continuous outcome, estimated means and standard deviations from the selected studies are required in order to obtain an overall estimate of the mean effect and its confidence interval. If these quantities are not directly reported in the publications, they must be estimated from other reported summary statistics, such as the median, the minimum, the maximum, and quartiles. We propose a simulation-based estimation approach using the Approximate Bayesian Computation (ABC) technique for estimating mean and standard deviation based on various sets of summary statistics found in published studies. We conduct a simulation study to compare the proposed ABC method with the existing methods of Hozo et al. (2005), Bland (2015), and Wan et al. (2014). In the estimation of the standard deviation, our ABC method performs better than the other methods when data are generated from skewed or heavy-tailed distributions, and the corresponding average relative error (ARE) approaches zero as sample size increases. In data generated from the normal distribution, our ABC performs well; however, the Wan et al. method is best for estimating standard deviation under the normal distribution. In the estimation of the mean, our ABC method is best regardless of the assumed distribution. ABC is a flexible method for estimating the study-specific mean and standard deviation for meta-analysis, especially with underlying skewed or heavy-tailed distributions. The ABC method can be applied using other reported summary statistics such as the posterior mean and 95% credible interval when Bayesian analysis has been employed.
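A rejection-ABC sketch of the idea described above, under the simplifying assumption of a normal generating model (the reported summaries are invented): draw candidate (mean, SD) pairs from broad priors, simulate a sample of the study's size, and keep candidates whose simulated median, minimum, and maximum fall close to the reported ones.

import numpy as np

rng = np.random.default_rng(0)
n, med, lo, hi = 50, 12.0, 4.0, 20.0      # reported n, median, min, max

accepted = []
for _ in range(200_000):
    mu = rng.uniform(lo, hi)              # broad priors
    sigma = rng.uniform(0.01, hi - lo)
    x = rng.normal(mu, sigma, n)
    sims = np.array([np.median(x), x.min(), x.max()])
    if np.all(np.abs(sims - [med, lo, hi]) < [1.0, 1.5, 1.5]):  # tolerances
        accepted.append((mu, sigma))

accepted = np.array(accepted)
print(accepted.mean(axis=0))              # ABC estimates of mean and SD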
The Infinitesimal Jackknife with Exploratory Factor Analysis
ERIC Educational Resources Information Center
Zhang, Guangjian; Preacher, Kristopher J.; Jennrich, Robert I.
2012-01-01
The infinitesimal jackknife, a nonparametric method for estimating standard errors, has been used to obtain standard error estimates in covariance structure analysis. In this article, we adapt it for obtaining standard errors for rotated factor loadings and factor correlations in exploratory factor analysis with sample correlation matrices. Both…
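To fix ideas, a toy numerical illustration of the infinitesimal jackknife (not the factor-analysis application itself): for a statistic with empirical influence values u_i, the IJ variance is sum(u_i^2)/n^2. For the sample mean, u_i = x_i - mean(x), which essentially recovers the classical standard error.

import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(10, 2, size=200)
n = x.size

u = x - x.mean()                    # empirical influence values for the mean
se_ij = np.sqrt(np.sum(u**2)) / n   # infinitesimal jackknife standard error
se_classic = x.std(ddof=1) / np.sqrt(n)
print(se_ij, se_classic)            # nearly identical for moderate n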
Liu, Shu-Yu; Hu, Chang-Qin
2007-10-17
This study introduces a general quantitative nuclear magnetic resonance (qNMR) method for the calibration of reference standards of macrolide antibiotics. Several qNMR experimental conditions were optimized, including the relaxation delay, an important parameter for quantification. Three kinds of macrolide antibiotics were used to validate the accuracy of the qNMR method by comparison with results obtained by high performance liquid chromatography (HPLC). The purities of five common reference standards of macrolide antibiotics were measured by the 1H qNMR method and the mass balance method, respectively, and the analysis results of the two methods were compared. qNMR is quick and simple to use and provides a new, reliable method for purity analysis of reference standards in new drug research and development.
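The internal-standard purity calculation underlying qNMR assays of this kind can be written out directly; the formula below is the standard 1H-qNMR relation, while all numeric inputs are invented placeholders (loosely modeled on a macrolide of molar mass 733.93 g/mol assayed against potassium hydrogen phthalate), not values from the study.

def qnmr_purity(I_a, I_s, N_a, N_s, M_a, M_s, m_a, m_s, P_s):
    """Purity by 1H-qNMR with an internal standard.
    I: signal integrals, N: protons per quantified signal,
    M: molar masses (g/mol), m: weighed masses (mg), P_s: IS purity."""
    return (I_a / I_s) * (N_s / N_a) * (M_a / M_s) * (m_s / m_a) * P_s

print(qnmr_purity(I_a=0.0700, I_s=1.00, N_a=1, N_s=4,
                  M_a=733.93, M_s=204.22, m_a=10.1, m_s=9.8,
                  P_s=0.9999))   # ~0.976, i.e., 97.6% purity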
Selection of reference standard during method development using the analytical hierarchy process.
Sun, Wan-yang; Tong, Ling; Li, Dong-xiang; Huang, Jing-yi; Zhou, Shui-ping; Sun, Henry; Bi, Kai-shun
2015-03-25
Reference standards are critical for ensuring reliable and accurate method performance. One important issue is how to select the ideal one from the alternatives. Unlike the optimization of parameters, the criteria for a reference standard are often not directly measurable. The aim of this paper is to recommend a quantitative approach for the selection of reference standards during method development based on the analytical hierarchy process (AHP) as a decision-making tool. Six alternative single reference standards were assessed in the quantitative analysis of six phenolic acids from Salvia miltiorrhiza and its preparations by ultra-performance liquid chromatography. The AHP model simultaneously considered six criteria related to reference standard characteristics and method performance: ease of sourcing, abundance in samples, chemical stability, accuracy, precision, and robustness. The priority of each alternative was calculated using the standard AHP analysis method. The results showed that protocatechuic aldehyde is the ideal reference standard, with rosmarinic acid, at about 79.8% of that priority, as the second choice. The determination results verified the evaluation ability of this model. The AHP allowed comprehensive consideration of the benefits and risks of the alternatives and proved an effective and practical tool for the selection of reference standards during method development.
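The AHP priority computation used above reduces to the principal eigenvector of a pairwise comparison matrix. The 3x3 Saaty-style matrix below is an invented example with three candidate standards compared on a single criterion; a full application would combine such weights across all six criteria.

import numpy as np

A = np.array([[1.0, 3.0, 5.0],     # A[i, j]: preference of i over j
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                       # priority weights

n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)   # consistency index
CR = CI / 0.58                         # random index for n = 3
print(w, CR)                           # acceptable if CR < 0.1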
Research on the calibration methods of the luminance parameter of radiation luminance meters
NASA Astrophysics Data System (ADS)
Cheng, Weihai; Huang, Biyong; Lin, Fangsheng; Li, Tiecheng; Yin, Dejin; Lai, Lei
2017-10-01
This paper introduces the standard diffuse-reflection white plate method and the integrating-sphere standard luminance source method for calibrating the luminance parameter of radiation luminance meters, and compares the calibration results of the two methods through analysis of their principles and experimental verification. After the same radiation luminance meter was calibrated with both methods, the data obtained verify that the results of both methods are reliable. The results show that the displayed value using the standard white plate method has smaller errors and better reproducibility, whereas the standard luminance source method is more convenient and suitable for on-site calibration. Moreover, the standard luminance source method has a wider range and can test the linear performance of the instruments.
Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
Using automated and standardized computer tools to calculate the pertinent test result values has several advantages, such as: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form, 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards, 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results, 4. and providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program, 2. ASTM F2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program, 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
40 CFR 98.7 - What standardized methods are incorporated by reference into this part?
Code of Federal Regulations, 2014 CFR
2014-07-01
...-B2959, (800) 262-1373, http://www.astm.org. (1) ASTM C25-06 Standard Test Method for Chemical Analysis...), § 98.174(b), § 98.184(b), § 98.194(c), and § 98.334(b). (2) ASTM C114-09 Standard Test Methods for... Dry Cleaning Solvent), IBR approved for § 98.6. (4) ASTM D240-02 (Reapproved 2007) Standard Test...
Trofimov, Vyacheslav A.; Varentsova, Svetlana A.
2016-01-01
The low efficiency of the standard THz TDS method for the detection and identification of substances, which is based on comparing the spectrum of the signal under investigation with a standard signal spectrum, is demonstrated using physical experiments conducted under real conditions with a thick paper bag, as well as with Si-based semiconductors under laboratory conditions. In fact, standard THz spectroscopy leads to false detection of hazardous substances in neutral samples that do not contain them. This disadvantage of the THz TDS method can be overcome by using time-dependent THz pulse spectrum analysis. For a qualitative assessment of the presence of a standard substance's spectral features in the signal under analysis, one may use time-dependent integral correlation criteria. PMID:27070617
Establishment of analysis method for methane detection by gas chromatography
NASA Astrophysics Data System (ADS)
Liu, Xinyuan; Yang, Jie; Ye, Tianyi; Han, Zeyu
2018-02-01
The study focused on establishing an analysis method for methane determination by gas chromatography. Methane was detected with a hydrogen flame ionization detector, and the quantitative relationship was described by the working curve y = 2041.2x + 2187, with a correlation coefficient of 0.9979. A relative standard deviation of 2.60-6.33% and a recovery rate of 96.36%∼105.89% were obtained in parallel determinations of standard gas. The method is not well suited to biogas content analysis, because the methane content of biogas exceeds the measurement range of the method.
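Using the working curve reported above, converting a measured peak area back to methane concentration is a one-line inversion; the range check mirrors the authors' caveat about biogas-level methane. The peak area and range limit below are invented.

def methane_conc(peak_area, upper_limit):
    # Invert the reported working curve y = 2041.2*x + 2187.
    x = (peak_area - 2187) / 2041.2
    if x > upper_limit:
        raise ValueError("outside calibrated range; dilute the sample")
    return x

print(methane_conc(peak_area=45000, upper_limit=50.0))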
NASA Astrophysics Data System (ADS)
Asadpour-Zeynali, Karim; Bastami, Mohammad
2010-02-01
In this work a new modification of the standard addition method, called the "net analyte signal standard addition method (NASSAM)", is presented for simultaneous spectrofluorimetric and spectrophotometric analysis. The proposed method combines the advantages of the standard addition method with those of the net analyte signal concept, and can be applied for the determination of an analyte in the presence of known interferents. Unlike the H-point standard addition method, the accuracy of its predictions does not depend on the shape of the analyte and interferent spectra. The method was successfully applied to the simultaneous spectrofluorimetric and spectrophotometric determination of pyridoxine (PY) and melatonin (MT) in synthetic mixtures and in a pharmaceutical formulation.
This standard operating procedure describes the method used for preparing internal standard, surrogate recovery standard and calibration standard solutions for neutral analytes used for gas chromatography/mass spectrometry analysis.
Field, Christopher R.; Lubrano, Adam; Woytowitz, Morgan; Giordano, Braden C.; Rose-Pehrsson, Susan L.
2014-01-01
The direct liquid deposition of solution standards onto sorbent-filled thermal desorption tubes is used for the quantitative analysis of trace explosive vapor samples. The direct liquid deposition method yields a higher fidelity between the analysis of vapor samples and the analysis of solution standards than using separate injection methods for vapors and solutions, i.e., samples collected on vapor collection tubes and standards prepared in solution vials. Additionally, the method can account for instrumentation losses, which makes it ideal for minimizing variability and quantitative trace chemical detection. Gas chromatography with an electron capture detector is an instrumentation configuration sensitive to nitro-energetics, such as TNT and RDX, due to their relatively high electron affinity. However, vapor quantitation of these compounds is difficult without viable vapor standards. Thus, we eliminate the requirement for vapor standards by combining the sensitivity of the instrumentation with a direct liquid deposition protocol to analyze trace explosive vapor samples. PMID:25145416
Airside HVAC BESTEST: HVAC Air-Distribution System Model Test Cases for ASHRAE Standard 140
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judkoff, Ronald; Neymark, Joel; Kennedy, Mike D.
This paper summarizes recent work to develop new airside HVAC equipment model analytical verification test cases for ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. The analytical verification test method allows comparison of simulation results from a wide variety of building energy simulation programs with quasi-analytical solutions, further described below. Standard 140 is widely cited for evaluating software for use with performance-path energy efficiency analysis, in conjunction with well-known energy-efficiency standards including ASHRAE Standard 90.1, the International Energy Conservation Code, and other international standards. Airside HVAC equipment is a common area of modelling not previously explicitly tested by Standard 140. Integration of the completed test suite into Standard 140 is in progress.
Wu, Yan; He, Yi; He, Wenyi; Zhang, Yumei; Lu, Jing; Dai, Zhong; Ma, Shuangcheng; Lin, Ruichao
2014-03-01
Quantitative nuclear magnetic resonance spectroscopy (qNMR) has developed into an important tool in drug analysis, biomacromolecule detection, and metabolism studies. Compared with the mass balance method, qNMR bears some advantages in the calibration of reference standards (RS): it determines the absolute amount of a sample, and another chemical compound or its certified reference material (CRM) can be used as an internal standard (IS) to obtain the purity of the sample. Protoberberine alkaloids have many biological activities and have been used as reference standards for the control of many herbal drugs. In the present study, qNMR methods were developed for the calibration of berberine hydrochloride, palmatine hydrochloride, tetrahydropalmatine, and phellodendrine hydrochloride with potassium hydrogen phthalate as IS. Method validation was carried out according to the guidelines for method validation of the Chinese Pharmacopoeia. The results of qNMR were compared with those of the mass balance method, and the differences between the results of the two methods were acceptable based on the analysis of estimated measurement uncertainties. Therefore, qNMR is an effective and reliable analysis method for the calibration of RS and can serve as a good complement to the mass balance method.
MASW on the standard seismic prospective scale using full spread recording
NASA Astrophysics Data System (ADS)
Białas, Sebastian; Majdański, Mariusz; Trzeciak, Maciej; Gałczyński, Edward; Maksym, Andrzej
2015-04-01
The Multichannel Analysis of Surface Waves (MASW) is a seismic survey method that uses the dispersion curve of surface waves to describe the stiffness of the near surface. It is used mainly at the geotechnical engineering scale, with total spread lengths of 5-450 m and spread offsets of 1-100 m; a hammer serves as the seismic source in such surveys. The standard MASW procedure is: data acquisition, dispersion analysis, and inversion of the extracted dispersion curve to obtain the closest theoretical curve. The final result includes shear-wave velocity (Vs) values at different depths along the surveyed lines. The main goal of this work is to extend this engineering method to a larger scale, with a standard prospecting spread length of 20 km, using 4.5 Hz vertical-component geophones. Standard vibroseis and explosive sources are used, and data were recorded on the full spread for every shot. The seismic data used for this analysis were acquired during the Braniewo 2014 project in northern Poland. The results achieved with the standard MASW procedure show that the method can be used at a much larger scale as well; the different methodology of this analysis only requires a much stronger seismic source.
Probability of Failure Analysis Standards and Guidelines for Expendable Launch Vehicles
NASA Astrophysics Data System (ADS)
Wilde, Paul D.; Morse, Elisabeth L.; Rosati, Paul; Cather, Corey
2013-09-01
Recognizing the central importance of probability of failure estimates to ensuring public safety for launches, the Federal Aviation Administration (FAA), Office of Commercial Space Transportation (AST), the National Aeronautics and Space Administration (NASA), and U.S. Air Force (USAF), through the Common Standards Working Group (CSWG), developed a guide for conducting valid probability of failure (POF) analyses for expendable launch vehicles (ELV), with an emphasis on POF analysis for new ELVs. A probability of failure analysis for an ELV produces estimates of the likelihood of occurrence of potentially hazardous events, which are critical inputs to launch risk analysis of debris, toxic, or explosive hazards. This guide is intended to document a framework for POF analyses commonly accepted in the US, and should be useful to anyone who performs or evaluates launch risk analyses for new ELVs. The CSWG guidelines provide performance standards and definitions of key terms, and are being revised to address allocation to flight times and vehicle response modes. The POF performance standard allows a launch operator to employ alternative, potentially innovative methodologies so long as the results satisfy the performance standard. Current POF analysis practice at US ranges includes multiple methodologies described in the guidelines as accepted methods, but not necessarily the only methods available to demonstrate compliance with the performance standard. The guidelines include illustrative examples for each POF analysis method, which are intended to illustrate an acceptable level of fidelity for ELV POF analyses used to ensure public safety. The focus is on providing guiding principles rather than "recipe lists." Independent reviews of these guidelines were performed to assess their logic, completeness, accuracy, self-consistency, consistency with risk analysis practices, use of available information, and ease of applicability. The independent reviews confirmed the general validity of the performance standard approach and suggested potential updates to improve the accuracy of each of the example methods, especially to address reliability growth.
Acid Rain Analysis by Standard Addition Titration.
ERIC Educational Resources Information Center
Ophardt, Charles E.
1985-01-01
The standard addition titration is a precise and rapid method for the determination of the acidity in rain or snow samples. The method requires use of a standard buret, a pH meter, and Gran's plot to determine the equivalence point. Experimental procedures used and typical results obtained are presented. (JN)
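A sketch of the Gran-plot evaluation mentioned above for a strong-acid sample titrated with standard base (all readings invented): before the equivalence point the Gran function G = (V0 + Vb) * 10**(-pH) is linear in Vb, and its x-intercept estimates the equivalence volume.

import numpy as np

V0 = 50.0                               # sample volume, mL
Vb = np.array([1.0, 2.0, 3.0, 4.0])     # standard base added, mL
pH = np.array([3.11, 3.24, 3.42, 3.73]) # meter readings

G = (V0 + Vb) * 10.0**(-pH)             # Gran function
slope, intercept = np.polyfit(Vb, G, 1)
Ve = -intercept / slope                 # equivalence volume, mL

c_base = 0.010                          # base concentration, mol/L
acidity = Ve * c_base / V0              # strong-acid concentration, mol/L
print(Ve, acidity)                      # ~5 mL, ~1e-3 mol/L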
Peterson, Leif E
2002-01-01
CLUSFAVOR (CLUSter and Factor Analysis with Varimax Orthogonal Rotation) 5.0 is a Windows-based computer program for hierarchical cluster and principal-component analysis of microarray-based transcriptional profiles. CLUSFAVOR 5.0 standardizes input data; sorts data according to gene-specific coefficient of variation, standard deviation, average and total expression, and Shannon entropy; performs hierarchical cluster analysis using nearest-neighbor, unweighted pair-group method using arithmetic averages (UPGMA), or furthest-neighbor joining methods, and Euclidean, correlation, or jack-knife distances; and performs principal-component analysis. PMID:12184816
Accelerated Monte Carlo Simulation for Safety Analysis of the Advanced Airspace Concept
NASA Technical Reports Server (NTRS)
Thipphavong, David
2010-01-01
Safe separation of aircraft is a primary objective of any air traffic control system. An accelerated Monte Carlo approach was developed to assess the level of safety provided by a proposed next-generation air traffic control system. It combines features of fault tree and standard Monte Carlo methods. It runs more than one order of magnitude faster than the standard Monte Carlo method while providing risk estimates that only differ by about 10%. It also preserves component-level model fidelity that is difficult to maintain using the standard fault tree method. This balance of speed and fidelity allows sensitivity analysis to be completed in days instead of weeks or months with the standard Monte Carlo method. Results indicate that risk estimates are sensitive to transponder, pilot visual avoidance, and conflict detection failure probabilities.
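A schematic of the fault-tree/Monte Carlo hybrid described above, with an entirely invented toy model: enumerate failure modes and their fault-tree probabilities, spend Monte Carlo samples only on each mode's conditional outcome, and recombine by total probability instead of brute-force simulation of rare events.

import numpy as np

rng = np.random.default_rng(0)

# Invented failure modes with per-encounter fault-tree probabilities.
modes = {"transponder": 1e-4, "pilot_visual": 5e-3, "conflict_detect": 2e-4}

def p_collision_given(mode, n=10_000):
    # Toy conditional model: miss distance in arbitrary units.
    loc = 2.0 if mode == "pilot_visual" else 1.0
    miss = rng.normal(loc=loc, scale=1.0, size=n)
    return np.mean(np.abs(miss) < 0.1)

# Total probability over modes; each conditional estimate is cheap.
p_total = sum(p * p_collision_given(m) for m, p in modes.items())
print(p_total)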
Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L
2012-10-01
Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
Miura, Tsutomu; Chiba, Koichi; Kuroiwa, Takayoshi; Narukawa, Tomohiro; Hioki, Akiharu; Matsue, Hideaki
2010-09-15
Neutron activation analysis (NAA) coupled with an internal standard method was applied for the determination of As in certified reference material (CRM) arsenobetaine (AB) standard solutions to verify their certified values. Gold was used as an internal standard to compensate for differences in neutron exposure within an irradiation capsule and to improve sample-to-sample repeatability. Application of the internal standard method also significantly improved the linearity of the calibration curve up to 1 microg of As. The analytical reliability of the proposed method was evaluated by k(0)-standardization NAA. The analytical results for As in the AB standard solutions BCR-626 and NMIJ CRM 7901-a were (499+/-55) mg kg(-1) (k=2) and (10.16+/-0.15) mg kg(-1) (k=2), respectively. These values were found to be 15-20% higher than the certified values. The between-bottle variation of BCR-626 was much larger than the expanded uncertainty of the certified value, whereas that of NMIJ CRM 7901-a was almost negligible.
Preparation and application of in-fibre internal standardization solid-phase microextraction.
Zhao, Wennan; Ouyang, Gangfeng; Pawliszyn, Janusz
2007-03-01
The in-fibre standardization method is a novel approach that has been developed for field sampling/sample preparation, in which an internal standard is pre-loaded onto a solid-phase microextraction (SPME) fibre for calibration of the extraction of target analytes in field samples. The same method can also be used for in-vial sample analysis. In this study, different techniques to load the standard to a non-porous SPME fibre were investigated. It was found that the appropriateness of the technique depends on the physical properties of the standards that are used for the analysis. Headspace extraction of the standard dissolved in pumping oil works well for volatile compounds. Conversely, headspace extraction of the pure standard is an effective approach for semi-volatile compounds. For compounds with low volatility, a syringe-fibre transfer method and direct extraction of the standard dissolved in a solvent exhibited a good reproducibility (<5% RSD). The main advantage of the approaches investigated in this study is that the standard generation vials can be reused for hundreds of analyses without exhibiting significant loss. Moreover, most of the standard loading processes studied can be performed automatically, which is efficient and precise. Finally, the standard loading technique and in-fibre standardization method were applied to a complex matrix (milk) and the results illustrated that the matrix effect can be effectively compensated for with this approach.
Yu, Chen; Zhang, Qian; Xu, Peng-Yao; Bai, Yin; Shen, Wen-Bin; Di, Bin; Su, Meng-Xiang
2018-01-01
Quantitative nuclear magnetic resonance (qNMR) is a well-established technique in quantitative analysis. We present a validated 1H-qNMR method for the assay of octreotide acetate, a cyclic octapeptide. Deuterium oxide was used to remove the undesired exchangeable peaks, referred to as proton exchange, in order to isolate the quantitative signals in the crowded spectrum of the peptide and ensure precise quantitative analysis. Gemcitabine hydrochloride was chosen as a suitable internal standard. Experimental conditions, including the relaxation delay time, number of scans, and pulse angle, were optimized first. Method validation was then carried out in terms of selectivity, stability, linearity, precision, and robustness. The assay result was compared with that obtained by high performance liquid chromatography, the method provided by the Chinese Pharmacopoeia. The statistical F test, Student's t test, and a nonparametric test at the 95% confidence level indicate that there was no significant difference between these two methods. qNMR is a simple and accurate quantitative tool with no need for specific corresponding reference standards, and it has potential for the quantitative analysis of other peptide drugs and for the standardization of the corresponding reference standards.
Li, Yongtao; Whitaker, Joshua S; McCarty, Christina L
2012-07-06
A large volume direct aqueous injection method was developed for the analysis of iodinated haloacetic acids in drinking water by using reversed-phase liquid chromatography/electrospray ionization/tandem mass spectrometry in the negative ion mode. Both the external and internal standard calibration methods were studied for the analysis of monoiodoacetic acid, chloroiodoacetic acid, bromoiodoacetic acid, and diiodoacetic acid in drinking water. The use of a divert valve technique for the mobile phase solvent delay, along with isotopically labeled analogs used as internal standards, effectively reduced and compensated for the ionization suppression typically caused by coexisting common inorganic anions. Under the optimized method conditions, the mean absolute and relative recoveries resulting from the replicate fortified deionized water and chlorinated drinking water analyses were 83-107% with a relative standard deviation of 0.7-11.7% and 84-111% with a relative standard deviation of 0.8-12.1%, respectively. The method detection limits resulting from the external and internal standard calibrations, based on seven fortified deionized water replicates, were 0.7-2.3 ng/L and 0.5-1.9 ng/L, respectively.
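The two calibration modes compared above differ only in what is regressed; in the sketch below (all responses invented), external standardization fits analyte response against concentration, while internal standardization fits the analyte/labeled-IS response ratio, which cancels ionization suppression affecting both species equally.

import numpy as np

conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])       # standards, ng/L
resp_analyte = np.array([102, 201, 515, 1010, 2040])
resp_is = np.array([980, 1005, 990, 1020, 1000])      # constant IS amount

ext = np.polyfit(conc, resp_analyte, 1)               # external standard
ints = np.polyfit(conc, resp_analyte / resp_is, 1)    # internal standard

# Sample truly at 50 ng/L, but with ~15% matrix suppression of both signals.
sample_resp, sample_is = 858.0, 850.0
c_ext = (sample_resp - ext[1]) / ext[0]               # biased low (~42)
c_int = (sample_resp / sample_is - ints[1]) / ints[0] # ~50, suppression cancels
print(c_ext, c_int)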
A thioacidolysis method tailored for higher‐throughput quantitative analysis of lignin monomers
Foster, Cliff; Happs, Renee M.; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F.
2016-01-01
Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment, uses a non-chlorinated organic solvent, and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. The method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses. PMID:27534715
NASA Astrophysics Data System (ADS)
Basye, Austin T.
A matrix element method analysis of the Standard Model Higgs boson, produced in association with two top quarks decaying to the lepton-plus-jets channel, is presented. Based on 20.3 fb⁻¹ of √s = 8 TeV data, produced at the Large Hadron Collider and collected by the ATLAS detector, this analysis utilizes multiple advanced techniques to search for ttH signatures with a 125 GeV Higgs boson decaying to two b-quarks. After categorizing selected events based on their jet and b-tag multiplicities, signal-rich regions are analyzed using the matrix element method. Resulting variables are then propagated to two parallel multivariate analyses utilizing Neural Networks and Boosted Decision Trees respectively. As no significant excess is found, an observed (expected) limit of 3.4 (2.2) times the Standard Model cross-section is determined at 95% confidence, using the CLs method, for the Neural Network analysis. For the Boosted Decision Tree analysis, an observed (expected) limit of 5.2 (2.7) times the Standard Model cross-section is determined at 95% confidence, using the CLs method. Corresponding unconstrained fits of the Higgs boson signal strength to the observed data yield measured ratios of the signal cross-section to the Standard Model prediction of μ = 1.2 +/- 1.3 (total) +/- 0.7 (stat.) for the Neural Network analysis and μ = 2.9 +/- 1.4 (total) +/- 0.8 (stat.) for the Boosted Decision Tree analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The first part covers standards for gaseous fuels. The second part covers standards on coal and coke, including the classification of coals, determination of major elements in coal ash and trace elements in coal, metallurgical properties of coal and coke, methods of analysis of coal and coke, petrographic analysis of coal and coke, physical characteristics of coal, quality assurance, and sampling.
Robust Mediation Analysis Based on Median Regression
Yuan, Ying; MacKinnon, David P.
2014-01-01
Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925
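A minimal sketch of mediation via median regression in the spirit of the approach above, using quantile regression at q = 0.5 from statsmodels on simulated heavy-tailed data; the product-of-coefficients estimator a*b is used here, with inference in practice typically done by bootstrap.

import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)                        # treatment
m = 0.5 * x + rng.standard_t(df=3, size=n)    # mediator, heavy-tailed errors
y = 0.4 * m + 0.2 * x + rng.standard_t(df=3, size=n)

# a-path: median regression of M on X.
a = QuantReg(m, sm.add_constant(x)).fit(q=0.5).params[1]
# b-path: median regression of Y on M and X.
b = QuantReg(y, sm.add_constant(np.column_stack([m, x]))).fit(q=0.5).params[1]

print("indirect (mediated) effect a*b ~", a * b)   # around 0.5 * 0.4 = 0.2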
Cantrill, Richard C
2008-01-01
Methods of analysis for products of modern biotechnology are required for national and international trade in seeds, grain and food in order to meet the labeling or import/export requirements of different nations and trading blocs. Although many methods were developed by the originators of transgenic events, governments, universities, and testing laboratories, trade is less complicated if there exists a set of international consensus-derived analytical standards. In any analytical situation, multiple methods may exist for testing for the same analyte. These methods may be supported by regional preferences and regulatory requirements. However, tests need to be sensitive enough to determine low levels of these traits in commodity grain for regulatory purposes and also to indicate purity of seeds containing these traits. The International Organization for Standardization (ISO) and its European counterpart have worked to produce a suite of standards through open, balanced and consensus-driven processes. Presently, these standards are approaching the time for their first review. In fact, ISO 21572, the "protein standard," has already been circulated for systematic review. In order to expedite the review and revision of the nucleic acid standards an ISO Technical Specification (ISO/TS 21098) was drafted to set the criteria for the inclusion of precision data from collaborative studies into the annexes of these standards.
Using normalization 3D model for automatic clinical brain quantitative analysis and evaluation
NASA Astrophysics Data System (ADS)
Lin, Hong-Dun; Yao, Wei-Jen; Hwang, Wen-Ju; Chung, Being-Tau; Lin, Kang-Ping
2003-05-01
Functional medical imaging, such as PET or SPECT, can reveal physiological functions of the brain and has been widely used for many years in diagnosing brain disorders through clinical quantitative analysis. In routine procedures, physicians manually select desired ROIs from structural MR images and then obtain physiological information from the corresponding functional PET or SPECT images. The accuracy of quantitative analysis thus relies on that of the subjectively selected ROIs. Therefore, standardizing the analysis procedure is fundamental and important in improving the analysis outcome. In this paper, we propose and evaluate a normalization procedure with a standard 3D brain model to achieve precise quantitative analysis. In the normalization process, the mutual information registration technique was applied to realign functional medical images to standard structural medical images. Then the standard 3D brain model, which shows well-defined brain regions, was used in place of the manual ROIs in objective clinical analysis. To validate the performance, twenty cases of I-123 IBZM SPECT images were used in practical clinical evaluation. The results show that the quantitative analysis outcomes obtained with this automated method agree with the clinical diagnostic evaluation scores, with less than 3% error on average. In summary, the method obtains precise VOI information automatically from a well-defined standard 3D brain model, sparing ROIs from being manually drawn slice by slice on structural medical images as in the traditional procedure. The method therefore not only provides precise analysis results, but also improves the processing rate for large volumes of medical images in clinical use.
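The mutual information criterion used for the registration step above can be computed from a joint intensity histogram; in the sketch below, random arrays stand in for the structural and functional images, and registration would maximize this score over transform parameters.

import numpy as np

def mutual_information(img1, img2, bins=32):
    # MI from the joint intensity histogram of two equally sized images.
    joint, _, _ = np.histogram2d(img1.ravel(), img2.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                       # avoid log(0)
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))

rng = np.random.default_rng(0)
mr = rng.normal(size=(128, 128))                      # stand-in MR slice
pet = 0.7 * mr + 0.3 * rng.normal(size=(128, 128))    # partially dependent
print(mutual_information(mr, pet))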
Santiago, E C; Bello, F B B
2003-06-01
The Association of Official Analytical Chemists (AOAC) Standard Method 972.23 (dry ashing and flame atomic absorption spectrophotometry (FAAS)), applied to the analysis of lead in tuna, was validated in three selected local laboratories to determine the acceptability of the method to both the Codex Alimentarius Commission (Codex) and the European Union (EU) Commission for monitoring lead in canned tuna. Initial validation showed that the standard AOAC method as performed in the three participating laboratories cannot satisfy the Codex/EU proposed criteria for the method detection limit for monitoring lead in fish at the present regulation level of 0.5 mg kg(-1). Modification of the standard method by chelation/concentration of the digest solution before FAAS analysis showed that the modified method has the potential to meet Codex/EU criteria on sensitivity, accuracy and precision at the specified regulation level.
Koyama, Kazuo; Miyazaki, Kinuko; Abe, Kousuke; Egawa, Yoshitsugu; Fukazawa, Toru; Kitta, Tadashi; Miyashita, Takashi; Nezu, Toru; Nohara, Hidenori; Sano, Takashi; Takahashi, Yukinari; Taniguchi, Hideji; Yada, Hiroshi; Yamazaki, Kumiko; Watanabe, Yomi
2017-06-01
An indirect enzymatic analysis method for the quantification of fatty acid esters of 2-/3-monochloro-1,2-propanediol (2/3-MCPD) and glycidol was developed, using the deuterated internal standard of each free-form component. Because 2-MCPD-d5 is difficult to obtain, a statistical method for calibration and quantification was developed in which it is substituted by 3-MCPD-d5, the internal standard used for the calculation of 3-MCPD. Using data from a previous collaborative study, the current method for the determination of 2-MCPD content using 2-MCPD-d5 was compared to three alternative new methods using 3-MCPD-d5. Regression analysis showed that the alternative methods were unbiased compared to the current method. The relative standard deviation (RSDR) among the testing laboratories was ≤ 15% and the Horwitz ratio was ≤ 1.0, a satisfactory value.
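The Horwitz ratio cited as the acceptance criterion can be computed directly from the reproducibility RSD and the analyte level. A minimal sketch, assuming the common form of the Horwitz equation with the concentration expressed as a dimensionless mass fraction:

```python
import math

def horwitz_ratio(rsd_r_percent, mass_fraction):
    """HorRat = observed reproducibility RSD divided by the Horwitz
    predicted RSD, PRSD_R(%) = 2^(1 - 0.5*log10(C)), where C is the
    analyte mass fraction (e.g. 1 mg/kg -> 1e-6)."""
    prsd_r = 2 ** (1 - 0.5 * math.log10(mass_fraction))
    return rsd_r_percent / prsd_r

# At 1 mg/kg (C = 1e-6), PRSD_R = 2^4 = 16%, so an observed RSD_R of 15%
# gives HorRat ≈ 0.94, consistent with the "≤ 1.0 is satisfactory" criterion.
```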
NASA Astrophysics Data System (ADS)
Tian, Lunfu; Wang, Lili; Gao, Wei; Weng, Xiaodong; Liu, Jianhui; Zou, Deshuang; Dai, Yichun; Huang, Shuke
2018-03-01
For the quantitative analysis of the principal elements in lead-antimony-tin alloys, the direct X-ray fluorescence (XRF) method using solid metal disks introduces considerable errors due to microstructure inhomogeneity. To solve this problem, an aqueous solution XRF method is proposed for determining major amounts of Sb, Sn and Pb in lead-based bearing alloys. The alloy samples were dissolved in a mixture of nitric acid and tartaric acid to eliminate the effects of the microstructure of these alloys on the XRF analysis. Rh Compton scattering was used as the internal standard for Sb and Sn, and Bi was added as the internal standard for Pb, to correct for matrix effects and instrumental and operational variations. High-purity lead, antimony and tin were used to prepare synthetic standards. Using these standards, calibration curves were constructed for the three elements after optimizing the spectrometer parameters. The method has been successfully applied to the analysis of lead-based bearing alloys and is more rapid than the classical titration methods normally used. The determination results are consistent with certified values or those obtained by titration.
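A hedged sketch of the internal-standard calibration described: each analyte line intensity is ratioed to the internal-standard signal (Rh Compton scatter for Sb and Sn, added Bi for Pb) and the ratio is fit linearly against the synthetic standards. All numbers below are illustrative placeholders, not values from the paper.

```python
import numpy as np

# Synthetic standards: assumed concentrations (wt%) and intensities.
conc_sb = np.array([2.0, 5.0, 10.0, 15.0])       # Sb in synthetic standards
i_sb    = np.array([1.1, 2.8, 5.6, 8.3])         # Sb line intensity
i_comp  = np.array([0.98, 1.01, 0.99, 1.02])     # Rh Compton intensity

ratio = i_sb / i_comp                             # internal-standard ratio
slope, intercept = np.polyfit(conc_sb, ratio, 1)  # linear calibration curve

def predict_sb(i_line, i_compton):
    """Sample concentration from its ratio via the inverted calibration."""
    return ((i_line / i_compton) - intercept) / slope
```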
TOWARDS A STANDARD METHOD FOR THE MEASUREMENT OF ORGANIC CARBON IN SEDIMENTS
The precisions achieved by two different methods for analysis of organic carbon in soils and sediments were determined and compared. The first method is a rapid dichromate oxidation technique (Walkley-Black) that has long been a standard in soil chemistry. The second is an automa...
Shuttle user analysis (study 2.2). Volume 4: Standardized subsystem modules analysis
NASA Technical Reports Server (NTRS)
1974-01-01
The capability to analyze payloads constructed of standardized modules was provided for the planning of future mission models. An inventory of standardized module designs previously obtained was used as a starting point. Some of the conclusions and recommendations are: (1) the two growth factor synthesis methods provide logical configurations for satellite type selection; (2) the recommended method is the one that determines the growth factor as a function of the baseline subsystem weight, since it provides a larger growth factor for small subsystem weights and results in a greater overkill due to standardization; (3) the method that is not recommended is the one that depends upon a subsystem similarity selection, since care must be used in the subsystem similarity selection; (4) it is recommended that the application of standardized subsystem factors be limited to satellites with baseline dry weights between about 700 and 6,500 lbs; and (5) the standardized satellite design approach applies to satellites maintainable in orbit or retrieved for ground maintenance.
Development of a Methodology for Assessing Aircrew Workloads.
1981-11-01
Contents (from the table of contents): Workload Feasibility Study; Subjects; Equipment; Data Analysis. Index terms: analysis; simulation; standard time systems; switching synthetic time systems; task activities; task interference; time study; tracking; workload; work sampling. Methods reviewed included standard data systems, information content analysis, work sampling and job evaluation. Conventional methods were found to be deficient in accounting
Faassen, Elisabeth J; Antoniou, Maria G; Beekman-Lukassen, Wendy; Blahova, Lucie; Chernova, Ekaterina; Christophoridis, Christophoros; Combes, Audrey; Edwards, Christine; Fastner, Jutta; Harmsen, Joop; Hiskia, Anastasia; Ilag, Leopold L; Kaloudis, Triantafyllos; Lopicic, Srdjan; Lürling, Miquel; Mazur-Marzec, Hanna; Meriluoto, Jussi; Porojan, Cristina; Viner-Mozzini, Yehudit; Zguna, Nadezda
2016-02-29
Exposure to β-N-methylamino-l-alanine (BMAA) might be linked to the incidence of amyotrophic lateral sclerosis, Alzheimer's disease and Parkinson's disease. Analytical chemistry plays a crucial role in determining human BMAA exposure and the associated health risk, but the performance of various analytical methods currently employed is rarely compared. A CYANOCOST initiated workshop was organized aimed at training scientists in BMAA analysis, creating mutual understanding and paving the way towards interlaboratory comparison exercises. During this workshop, we tested different methods (extraction followed by derivatization and liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) analysis, or directly followed by LC-MS/MS analysis) for trueness and intermediate precision. We adapted three workup methods for the underivatized analysis of animal, brain and cyanobacterial samples. Based on recovery of the internal standard D₃BMAA, the underivatized methods were accurate (mean recovery 80%) and precise (mean relative standard deviation 10%), except for the cyanobacterium Leptolyngbya. However, total BMAA concentrations in the positive controls (cycad seeds) showed higher variation (relative standard deviation 21%-32%), implying that D₃BMAA was not a good indicator for the release of BMAA from bound forms. Significant losses occurred during workup for the derivatized method, resulting in low recovery (<10%). Most BMAA was found in a trichloroacetic acid soluble, bound form and we recommend including this fraction during analysis.
Chen, Jing; Wang, Shu-Mei; Meng, Jiang; Sun, Fei; Liang, Sheng-Wang
2013-05-01
To establish a new method for quality evaluation and validate its feasibility by simultaneous quantitative assay of five alkaloids in Sophora flavescens. The new quality evaluation method, quantitative analysis of multi-components by single marker (QAMS), was established and validated with S. flavescens. Five main alkaloids, oxymatrine, sophocarpine, matrine, oxysophocarpine and sophoridine, were selected as analytes to evaluate the quality of the rhizome of S. flavescens, and the relative correction factors showed good repeatability. Their contents in 21 batches of samples, collected from different areas, were determined by both the external standard method and QAMS. The method was evaluated by comparing the quantitative results of the external standard method and QAMS. No significant differences were found in the quantitative results for the five alkaloids in the 21 batches of S. flavescens determined by the two methods. It is feasible and suitable to evaluate the quality of the rhizome of S. flavescens by QAMS.
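QAMS replaces per-analyte calibration with relative correction factors determined once from reference standards, so that routine runs need only the single marker's calibration. A minimal single-point sketch; the function names and single-point form are assumptions:

```python
def relative_correction_factor(a_marker, c_marker, a_k, c_k):
    """f_k = (A_s/C_s) / (A_k/C_k), from reference standards of the
    marker s (e.g. oxymatrine) and analyte k (e.g. matrine)."""
    return (a_marker / c_marker) / (a_k / c_k)

def qams_concentration(a_sample_k, f_k, marker_response_factor):
    """Analyte k in a sample using only the marker's calibration:
    C_k = A_k * f_k / (A_s/C_s)."""
    return a_sample_k * f_k / marker_response_factor
```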
NASA Astrophysics Data System (ADS)
Muji Susantoro, Tri; Wikantika, Ketut; Saepuloh, Asep; Handoyo Harsolumakso, Agus
2018-05-01
Selection of vegetation indices in plant mapping is needed to provide the best information on plant conditions. The methods used in this research are standard deviation analysis and linear regression. This research aimed to determine the vegetation indices best suited for mapping sugarcane conditions around oil and gas fields. The data used in this study are Landsat 8 OLI/TIRS. The standard deviation analysis of 23 vegetation indices with 27 samples yielded the six indices with the highest standard deviations, namely GRVI, SR, NLI, SIPI, GEMI and LAI, with standard deviation values of 0.47, 0.43, 0.30, 0.17, 0.16 and 0.13, respectively. Regression correlation analysis of the 23 vegetation indices with 280 samples yielded six indices, namely NDVI, ENDVI, GDVI, VARI, LAI and SIPI, selected on the basis of regression correlations with R2 values of no less than 0.8. The combined analysis of standard deviation and regression correlation produced five vegetation indices: NDVI, ENDVI, GDVI, LAI and SIPI. The results of both analyses show that the two methods need to be combined to produce a good analysis of sugarcane conditions. This was verified through field surveys and showed good results for the prediction of microseepages.
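Two of the indices named above can be computed directly from Landsat 8 OLI reflectance bands. A minimal sketch, assuming the common normalized-difference definitions (GRVI as the green-red normalized difference) and the OLI band assignments (band 3 green, band 4 red, band 5 NIR):

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red)."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / (nir + red + 1e-12)   # guard against divide-by-zero

def grvi(green, red):
    """GRVI = (Green - Red) / (Green + Red)."""
    green, red = green.astype(float), red.astype(float)
    return (green - red) / (green + red + 1e-12)
```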
Ito, Shinya; Tsukada, Katsuo
2002-01-11
An evaluation of the feasibility of liquid chromatography-mass spectrometry (LC-MS) with atmospheric pressure ionization was made for the quantitation of four diarrhetic shellfish poisoning toxins, okadaic acid, dinophysistoxin-1, pectenotoxin-6 and yessotoxin, in scallops. When LC-MS was applied to the analysis of scallop extracts, large signal suppressions were observed due to substances coeluting from the column. To compensate for these matrix signal suppressions, the standard addition method was applied: first the sample is analyzed, and then the same sample, with calibration standards added, is analyzed. Although this method requires two LC-MS runs per analysis, it was found to correct quantitative errors effectively.
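The two-run standard addition described extrapolates the unspiked concentration from the response-versus-added-concentration line; with more than two additions the same x-intercept logic applies. A minimal sketch (variable names assumed):

```python
import numpy as np

def standard_addition(added_conc, responses):
    """Fit response vs. added concentration; the unspiked concentration
    is the magnitude of the x-intercept, i.e. intercept / slope."""
    slope, intercept = np.polyfit(np.asarray(added_conc, float),
                                  np.asarray(responses, float), 1)
    return intercept / slope

# Two-run form as in the abstract: points (0, s0) and (c_add, s1)
# give c_sample = s0 * c_add / (s1 - s0).
```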
Zhu, H B; Su, C J; Tang, H F; Ruan, Z; Liu, D H; Wang, H; Qian, Y L
2017-10-20
Objective: To establish a method for the rapid determination of 47 volatile organic compounds in workplace air using a portable gas chromatography-mass spectrometer (GC-MS). Methods: Mixed standard gases at different concentration levels were prepared by the static gas distribution method with high-purity nitrogen as the dilution gas. The samples were injected into the GC-MS by a hand-held probe. Retention time and characteristic ions were used for qualitative analysis, and the internal standard method was used for quantitation. Results: The 47 poisonous substances were well separated and determined. The linear range of the method was 0.2-16.0 mg/m(3), and the relative standard deviation of 45 volatile organic compounds was 3.8%-15.8%. The average recovery was 79.3%-119.0%. Conclusion: The method is simple, accurate and sensitive, has a good separation effect and a short analysis period, can be used for the qualitative and quantitative analysis of volatile organic compounds in the workplace, and also supports the rapid identification and detection of occupational hazards.
De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos
2014-06-01
Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method on the basis of Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR, using dilutions of an integration standard and samples of 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need for a standard dilution curve. Implementation of confidence interval (CI) estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
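The Poisson treatment of the replicate data rests on the fact that, for a mean of λ target copies per reaction, the expected fraction of negative reactions is e^(-λ). A minimal sketch of the resulting curve-free quantification (variable names assumed):

```python
import math

def copies_per_reaction(n_negative, n_total):
    """Mean target copies per reaction from the fraction of negative
    replicates: lambda = -ln(n_negative / n_total)."""
    if n_negative == 0 or n_negative == n_total:
        raise ValueError("all-positive or all-negative runs are off-scale")
    return -math.log(n_negative / n_total)

# e.g. 12 negatives out of 42 Alu-HIV PCR replicates gives
# lambda = -ln(12/42) ≈ 1.25 copies per reaction, with no dilution curve.
```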
Rhodes, Kirsty M; Turner, Rebecca M; White, Ian R; Jackson, Dan; Spiegelhalter, David J; Higgins, Julian P T
2016-12-20
Many meta-analyses combine results from only a small number of studies, a situation in which the between-study variance is imprecisely estimated when standard methods are applied. Bayesian meta-analysis allows incorporation of external evidence on heterogeneity, providing the potential for more robust inference on the effect size of interest. We present a method for performing Bayesian meta-analysis using data augmentation, in which we represent an informative conjugate prior for between-study variance by pseudo data and use meta-regression for estimation. To assist in this, we derive predictive inverse-gamma distributions for the between-study variance expected in future meta-analyses. These may serve as priors for heterogeneity in new meta-analyses. In a simulation study, we compare approximate Bayesian methods using meta-regression and pseudo data against fully Bayesian approaches based on importance sampling techniques and Markov chain Monte Carlo (MCMC). We compare the frequentist properties of these Bayesian methods with those of the commonly used frequentist DerSimonian and Laird procedure. The method is implemented in standard statistical software and provides a less complex alternative to standard MCMC approaches. An importance sampling approach produces almost identical results to standard MCMC approaches, and results obtained through meta-regression and pseudo data are very similar. On average, data augmentation provides closer results to MCMC, if implemented using restricted maximum likelihood estimation rather than DerSimonian and Laird or maximum likelihood estimation. The methods are applied to real datasets, and an extension to network meta-analysis is described. The proposed method facilitates Bayesian meta-analysis in a way that is accessible to applied researchers. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
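For reference, the DerSimonian and Laird procedure used here as the frequentist comparator can be sketched in a few lines (an illustration of the standard moment estimator, not of the authors' data augmentation method):

```python
import numpy as np

def dersimonian_laird(y, v):
    """Random-effects pooled estimate from study effects y and
    within-study variances v, via the DL moment estimator of tau^2."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    w = 1.0 / v                                  # fixed-effect weights
    mu_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu_fe) ** 2)             # Cochran's Q statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)      # moment estimate, floored at 0
    w_re = 1.0 / (v + tau2)                      # random-effects weights
    mu_re = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))             # standard error of pooled effect
    return mu_re, se, tau2
```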
Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H
2015-12-01
An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit - the 'BAK-76' - contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.
Conceptual designs for in situ analysis of Mars soil
NASA Technical Reports Server (NTRS)
Mckay, C. P.; Zent, A. P.; Hartman, H.
1991-01-01
A goal of this research is to develop conceptual designs for instrumentation to perform in situ measurements of the Martian soil in order to determine the existence and nature of any reactive chemicals. Our approach involves assessment and critical review of the Viking biology results which indicated the presence of a soil oxidant, an investigation of the possible application of standard soil science techniques to the analysis of Martian soil, and a preliminary consideration of non-standard methods that may be necessary for use in the highly oxidizing Martian soil. Based on our preliminary analysis, we have developed strawman concepts for standard soil analysis on Mars, including pH, suitable for use on a Mars rover mission. In addition, we have devised a method for the determination of the possible strong oxidants on Mars.
Comparison of histomorphometrical data obtained with two different image analysis methods.
Ballerini, Lucia; Franke-Stenport, Victoria; Borgefors, Gunilla; Johansson, Carina B
2007-08-01
A common way to determine tissue acceptance of biomaterials is to perform histomorphometrical analysis on histologically stained sections from retrieved samples with surrounding tissue, using various methods. The time- and money-consuming methods and techniques used are often "in house" standards. We address light microscopic investigations of bone tissue reactions on un-decalcified cut and ground sections of threaded implants. In order to screen sections and generate results faster, the aim of this pilot project was to compare results generated with the in-house standard visual image analysis tool (i.e., quantifications and judgements done by the naked eye) with a custom-made automatic image analysis program. The histomorphometrical bone area measurements revealed no significant differences between the methods, but the results for bony contacts varied significantly. The raw results were in relative agreement, i.e., the values from the two methods were proportional to each other: low bony contact values in the visual method corresponded to low values with the automatic method. With similar-resolution images and further improvements of the automatic method, this difference should become insignificant. A great advantage of the new automatic image analysis method is that it saves time: analysis time can be significantly reduced.
For the regulatory process, EPA is required to develop a regulatory impact analysis (RIA). This August 2010 RIA includes an economic impact analysis (EIA) and a small entity impacts analysis, and documents the RIA methods and results for the 2010 rules.
An improved method for bivariate meta-analysis when within-study correlations are unknown.
Hong, Chuan; D Riley, Richard; Chen, Yong
2018-03-01
Multivariate meta-analysis, which jointly analyzes multiple and possibly correlated outcomes in a single analysis, has become increasingly popular in recent years. An attractive feature of multivariate meta-analysis is its ability to account for the dependence between multiple estimates from the same study. However, standard inference procedures for multivariate meta-analysis require knowledge of the within-study correlations, which are usually unavailable. This limits standard inference approaches in practice. Riley et al. proposed a working model and an overall synthesis correlation parameter to account for the marginal correlation between outcomes, where the only data needed are those required for a separate univariate random-effects meta-analysis. As within-study correlations are not required, the Riley method is applicable to a wide variety of evidence synthesis situations. However, the standard variance estimator of the Riley method is not entirely correct under many important settings. As a consequence, the coverage of a function of pooled estimates may not reach the nominal level even when the number of studies in the multivariate meta-analysis is large. In this paper, we improve the Riley method by proposing a robust variance estimator, which is asymptotically correct even when the model is misspecified (i.e., when the likelihood function is incorrect). Simulation studies of a bivariate meta-analysis, in a variety of settings, show that a function of pooled estimates has improved performance when using the proposed robust variance estimator. In terms of the individual pooled estimates themselves, the standard variance estimator and the robust variance estimator give similar results to the original method, with appropriate coverage. The proposed robust variance estimator performs well when the number of studies is relatively large. Therefore, we recommend the use of the robust method for meta-analyses with a relatively large number of studies (e.g., m ≥ 50). When the sample size is relatively small, we recommend the use of the robust method under the working independence assumption. We illustrate the proposed method through 2 meta-analyses. Copyright © 2017 John Wiley & Sons, Ltd.
Van De Steene, Jet C; Lambert, Willy E
2008-05-01
When developing an LC-MS/MS method, matrix effects are a major issue. The effect of co-eluting compounds arising from the matrix can result in signal enhancement or suppression. During method development, much attention should be paid to diminishing matrix effects as much as possible. The present work evaluates matrix effects from aqueous environmental samples in the simultaneous analysis of a group of 9 specific pharmaceuticals with HPLC-ESI/MS/MS and UPLC-ESI/MS/MS: flubendazole, propiconazole, pipamperone, cinnarizine, ketoconazole, miconazole, rabeprazole, itraconazole and domperidone. When HPLC-MS/MS is used, matrix effects are substantial and cannot be compensated for with analogue internal standards. For different surface water samples, different matrix effects are found, so for accurate quantification the standard addition approach is necessary. Due to the better resolution and narrower peaks in UPLC, analytes co-elute less with interferences during ionisation, so matrix effects could be lower or even eliminated. If matrix effects are eliminated with this technique, the standard addition method for quantification can be omitted and the overall method simplified. Results show that matrix effects are almost eliminated if internal standards (structural analogues) are used. Instead of the time-consuming and labour-intensive standard addition method, with UPLC internal standardization can be used for quantification, and the overall method is substantially simplified.
Improved score statistics for meta-analysis in single-variant and gene-level association studies.
Yang, Jingjing; Chen, Sai; Abecasis, Gonçalo
2018-06-01
Meta-analysis is now an essential tool for genetic association studies, allowing them to combine large studies and greatly accelerating the pace of genetic discovery. Although the standard meta-analysis methods perform equivalently to the more cumbersome joint analysis under ideal settings, they result in substantial power loss under unbalanced settings with various case-control ratios. Here, we investigate the power loss caused by the standard meta-analysis methods for unbalanced studies, and further propose novel meta-analysis methods performing equivalently to the joint analysis under both balanced and unbalanced settings. We derive improved meta-score-statistics that can accurately approximate the joint-score-statistics with combined individual-level data, for both linear and logistic regression models, with and without covariates. In addition, we propose a novel approach to adjust for population stratification by correcting for known population structures through minor allele frequencies. In the simulated gene-level association studies under unbalanced settings, our method recovered up to 85% of the power loss caused by the standard methods. We further showed the power gain of our methods in gene-level tests with 26 unbalanced studies of age-related macular degeneration. In addition, we took the meta-analysis of three unbalanced studies of type 2 diabetes as an example to discuss the challenges of meta-analyzing multi-ethnic samples. In summary, our improved meta-score-statistics with corrections for population stratification can be used to construct both single-variant and gene-level association studies, providing a useful framework for ensuring well-powered, convenient, cross-study analyses. © 2018 WILEY PERIODICALS, INC.
Torey, Angeline; Sasidharan, Sreenivasan; Yeng, Chen; Latha, Lachimanan Yoga
2010-05-10
Quality control standardization of the various medicinal plants used in traditional medicine is becoming more important today in view of the commercialization of formulations based on these plants. An attempt at standardization of Cassia spectabilis leaf has been carried out with respect to authenticity, assay and chemical constituent analysis. The authentication involved many parameters, including gross morphology, microscopy of the leaves and functional group analysis by Fourier Transform Infrared (FTIR) spectroscopy. The assay part of the standardization involved determination of the minimum inhibitory concentration (MIC) of the extract, which could help assess the chemical effects and establish curative values. The MIC of the C. spectabilis leaf extracts was investigated using the broth dilution method. The extracts showed an MIC value of 6.25 mg/mL, independent of the extraction time. The chemical constituent aspect of the standardization involved quantification of the main chemical components of C. spectabilis. The GCMS method used for quantification of 2,4-(1H,3H)-pyrimidinedione in the extract was rapid, accurate, precise, linear (R(2) = 0.8685), rugged and robust, and hence suitable for quantification of this component in C. spectabilis. The standardization of C. spectabilis is needed to facilitate the marketing of medicinal plants, with a view to promoting the export of valuable Malaysian traditional medicinal plants such as C. spectabilis.
A refined method for multivariate meta-analysis and meta-regression.
Jackson, Daniel; Riley, Richard D
2014-02-20
Making inferences about the average treatment effect using the random effects model for meta-analysis is problematic in the common situation where there is a small number of studies. This is because estimates of the between-study variance are not precise enough to accurately apply the conventional methods for testing and deriving a confidence interval for the average effect. We have found that a refined method for univariate meta-analysis, which applies a scaling factor to the estimated effects' standard error, provides more accurate inference. We explain how to extend this method to the multivariate scenario and show that our proposal for refined multivariate meta-analysis and meta-regression can provide more accurate inferences than the more conventional approach. We explain how our proposed approach can be implemented using standard output from multivariate meta-analysis software packages and apply our methodology to two real examples. Copyright © 2013 John Wiley & Sons, Ltd.
Neutron activation analysis of certified samples by the absolute method
NASA Astrophysics Data System (ADS)
Kadem, F.; Belouadah, N.; Idiri, Z.
2015-07-01
The nuclear reaction analysis technique is mainly based on the relative method or the use of activation cross sections. In order to validate nuclear data for cross sections evaluated from systematic studies, we used the neutron activation analysis (NAA) technique to determine the concentrations of the various constituents of certified samples of animal blood, milk and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently measuring its activity. The fundamental activation equation connects several physical parameters, including the cross section, which is essential for the quantitative determination of the different elements composing the sample without resorting to a standard sample. Called the absolute method, this approach allows measurements as accurate as the relative method. The results obtained by the absolute method showed that the values are as precise as those of the relative method, which requires a standard sample for each element to be quantified.
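The fundamental activation equation referred to above is commonly written, in a simplified form that omits the gamma emission probability and detector efficiency factors a full analysis includes, as

```latex
A = N \,\sigma\, \varphi \left(1 - e^{-\lambda t_{\mathrm{irr}}}\right) e^{-\lambda t_{\mathrm{d}}}
```

where A is the induced activity at counting, N the number of target nuclei, σ the activation cross section, φ the neutron flux, λ the decay constant of the product nuclide, t_irr the irradiation time, and t_d the decay time before measurement. Solving for N from the measured activity yields the element concentration without recourse to a standard sample, which is the essence of the absolute method.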
Establishment of gold-quartz standard GQS-1
Millard, Hugh T.; Marinenko, John; McLane, John E.
1969-01-01
A homogeneous gold-quartz standard, GQS-1, was prepared from a heterogeneous gold-bearing quartz by chemical treatment. The concentration of gold in GQS-1 was determined by both instrumental neutron activation analysis and radioisotope dilution analysis to be 2.61 ± 0.10 parts per million. Analysis of 10 samples of the standard by both instrumental neutron activation analysis and radioisotope dilution analysis failed to reveal heterogeneity within the standard. The precision of the analytical methods, expressed as standard error, was approximately 0.1 part per million. The analytical data were also used to estimate the average size of gold particles. The chemical treatment apparently reduced the average diameter of the gold particles by at least an order of magnitude and increased the concentration of gold grains by a factor of at least 4,000.
Garbarino, John R.
1999-01-01
The inductively coupled plasma-mass spectrometric (ICP-MS) methods have been expanded to include the determination of dissolved arsenic, boron, lithium, selenium, strontium, thallium, and vanadium in filtered, acidified natural water. Method detection limits for these elements are now 10 to 200 times lower than by former U.S. Geological Survey (USGS) methods, thus providing lower variability at ambient concentrations. The bias and variability of the method was determined by using results from spike recoveries, standard reference materials, and validation samples. Spike recoveries at 5 to 10 times the method detection limit and 75 micrograms per liter in reagent-water, surface-water, and groundwater matrices averaged 93 percent for seven replicates, although selected elemental recoveries in a ground-water matrix with an extremely high iron sulfate concentration were negatively biased by 30 percent. Results for standard reference materials were within 1 standard deviation of the most probable value. Statistical analysis of the results from about 60 filtered, acidified natural-water samples indicated that there was no significant difference between ICP-MS and former USGS official methods of analysis.
Silbernagel, Karen M; Jechorek, Robert P; Kaufer, Amanda L; Johnson, Ronald L; Aleo, V; Brown, B; Buen, M; Buresh, J; Carson, M; Franklin, J; Ham, P; Humes, L; Husby, G; Hutchins, J; Jechorek, R; Jenkins, J; Kaufer, A; Kexel, N; Kora, L; Lam, L; Lau, D; Leighton, S; Loftis, M; Luc, S; Martin, J; Nacar, I; Nogle, J; Park, J; Schultz, A; Seymore, D; Smith, C; Smith, J; Thou, P; Ulmer, M; Voss, R; Weaver, V
2005-01-01
A multilaboratory study was conducted to compare the VIDAS LIS immunoassay with the standard cultural methods for the detection of Listeria in foods, using an enrichment modification of AOAC Official Method 999.06. The modified enrichment protocol was implemented to harmonize the VIDAS LIS assay with the VIDAS LMO2 assay. Five food types--brie cheese, vanilla ice cream, frozen green beans, frozen raw tilapia fish, and cooked roast beef--at 3 inoculation levels, were analyzed by each method. A total of 15 laboratories representing government and industry participated. In this study, 1206 test portions were tested, of which 1170 were used in the statistical analysis. There were 433 positives by the VIDAS LIS assay and 396 positives by the standard culture methods. A Chi-square analysis of each of the 5 food types, at the 3 inoculation levels tested, was performed. The resulting average Chi-square value, 0.42, indicated that, overall, there are no statistical differences between the VIDAS LIS assay and the standard methods at the 5% level of significance.
Berridge, Georgina; Chalk, Rod; D’Avanzo, Nazzareno; Dong, Liang; Doyle, Declan; Kim, Jung-In; Xia, Xiaobing; Burgess-Brown, Nicola; deRiso, Antonio; Carpenter, Elisabeth Paula; Gileadi, Opher
2011-01-01
We have developed a method for intact mass analysis of detergent-solubilized and purified integral membrane proteins using liquid chromatography–mass spectrometry (LC–MS) with methanol as the organic mobile phase. Membrane proteins and detergents are separated chromatographically during the isocratic stage of the gradient profile from a 150-mm C3 reversed-phase column. The mass accuracy is comparable to standard methods employed for soluble proteins; the sensitivity is 10-fold lower, requiring 0.2–5 μg of protein. The method is also compatible with our standard LC–MS method used for intact mass analysis of soluble proteins and may therefore be applied on a multiuser instrument or in a high-throughput environment. PMID:21093405
Grate, Jay W; Gonzalez, Jhanis J; O'Hara, Matthew J; Kellogg, Cynthia M; Morrison, Samuel S; Koppenaal, David W; Chan, George C-Y; Mao, Xianglei; Zorba, Vassilia; Russo, Richard E
2017-09-08
Solid sampling and analysis methods, such as laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), are challenged by matrix effects and calibration difficulties. Matrix-matched standards for external calibration are seldom available and it is difficult to distribute spikes evenly into a solid matrix as internal standards. While isotopic ratios of the same element can be measured to high precision, matrix-dependent effects in the sampling and analysis process frustrate accurate quantification and elemental ratio determinations. Here we introduce a potentially general solid matrix transformation approach entailing chemical reactions in molten ammonium bifluoride (ABF) salt that enables the introduction of spikes as tracers or internal standards. Proof of principle experiments show that the decomposition of uranium ore in sealed PFA fluoropolymer vials at 230 °C yields, after cooling, new solids suitable for direct solid sampling by LA. When spikes are included in the molten salt reaction, subsequent LA-ICP-MS sampling at several spots indicates that the spikes are evenly distributed, and that U-235 tracer dramatically improves reproducibility in U-238 analysis. Precisions improved from 17% relative standard deviation for U-238 signals to 0.1% for the ratio of sample U-238 to spiked U-235, a factor of over two orders of magnitude. These results introduce the concept of solid matrix transformation (SMT) using ABF, and provide proof of principle for a new method of incorporating internal standards into a solid for LA-ICP-MS. This new approach, SMT-LA-ICP-MS, provides opportunities to improve calibration and quantification in solids-based analysis. Looking forward, tracer addition to transformed solids opens up LA-based methods to analytical methodologies such as standard addition, isotope dilution, preparation of matrix-matched solid standards, external calibration, and monitoring instrument drift against external calibration standards.
Adamski, Mateusz G; Gumann, Patryk; Baird, Alison E
2014-01-01
Over the past decade, rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reaction (qPCR) has become the gold standard for quantifying gene expression. Microfluidic next-generation, high-throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing the sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard qPCR and high-throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and, with the use of a universal reference sample (commercial complementary DNA (cDNA)), permits the normalization of results between different batches and between different instruments, regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency-corrected analysis. When compared to other commonly used algorithms, the input quantity method proved to be valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes, where cell counts are readily available.
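A hedged sketch of the kind of computation the abstract describes: efficiency-corrected signals anchored to a universal reference cDNA of known copy number and divided by the input cell count. The function and parameter names are assumptions, not the authors' published algorithm:

```python
def expression_per_cell(cq, efficiency, n_cells,
                        cq_ref, efficiency_ref, ref_copies):
    """Efficiency-corrected, input-adjusted transcript estimate: the
    signal (1+E)^-Cq is scaled against a universal reference cDNA of
    known copy number, then divided by the number of input cells."""
    sample_signal = (1.0 + efficiency) ** (-cq)
    ref_signal = (1.0 + efficiency_ref) ** (-cq_ref)
    absolute_copies = ref_copies * sample_signal / ref_signal
    return absolute_copies / n_cells
```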
NASA Technical Reports Server (NTRS)
Otterson, D. A.; Seng, G. T.
1984-01-01
A new high-performance liquid chromatographic (HPLC) method for group-type analysis of middistillate fuels is described. It uses a refractive index detector and standards that are prepared by reacting a portion of the fuel sample with sulfuric acid. A complete analysis of a middistillate fuel for saturates and aromatics (including the preparation of the standard) requires about 15 min if standards for several fuels are prepared simultaneously. From model fuel studies, the method was found to be accurate to within 0.4 vol% saturates or aromatics, and provides a precision of ± 0.4 vol%. Olefin determinations require an additional 15 min of analysis time. However, this determination is needed only for those fuels displaying a significant olefin response at 200 nm (obtained routinely during the saturates/aromatics analysis procedure). The olefin determination uses the responses of the olefins and the corresponding saturates, as well as the average value of their refractive index sensitivity ratios (1.1). Studies indicated that, although the relative error in the olefin results could reach 10 percent using this average sensitivity ratio, it was 5 percent for the fuels used in this study. Olefin concentrations as low as 0.1 vol% have been determined using this method.
Berlinger, Balazs; Harper, Martin
2018-02-01
There is interest in the bioaccessible metal components of aerosols, but this has been minimally studied because standardized sampling and analytical methods have not yet been developed. An interlaboratory study (ILS) has been carried out to evaluate a method for determining the water-soluble component of realistic welding fume (WF) air samples. Replicate samples were generated in the laboratory and distributed to participating laboratories to be analyzed according to a standardized procedure. Within-laboratory precision of replicate sample analysis (repeatability) was very good. Reproducibility between laboratories was not as good, but within limits of acceptability for the analysis of typical aerosol samples. These results can be used to support the development of a standardized test method.
Freeman, Karoline; Tsertsvadze, Alexander; Taylor-Phillips, Sian; McCarthy, Noel; Mistry, Hema; Manuel, Rohini; Mason, James
2017-01-01
Multiplex gastrointestinal pathogen panel (GPP) tests simultaneously identify bacterial, viral and parasitic pathogens from the stool samples of patients with suspected infectious gastroenteritis presenting in hospital or the community. We undertook a systematic review to compare the accuracy of GPP tests with standard microbiology techniques. Searches in Medline, Embase, Web of Science and the Cochrane library were undertaken from inception to January 2016. Eligible studies compared GPP tests with standard microbiology techniques in patients with suspected gastroenteritis. Quality assessment of included studies used tailored QUADAS-2. In the absence of a reference standard we analysed test performance taking GPP tests and standard microbiology techniques in turn as the benchmark test, using random effects meta-analysis of proportions. No study provided an adequate reference standard with which to compare the test accuracy of GPP and conventional tests. Ten studies informed a meta-analysis of positive and negative agreement. Positive agreement across all pathogens was 0.93 (95% CI 0.90 to 0.96) when conventional methods were the benchmark and 0.68 (95% CI: 0.58 to 0.77) when GPP provided the benchmark. Negative agreement was high in both instances due to the high proportion of negative cases. GPP testing produced a greater number of pathogen-positive findings than conventional testing. It is unclear whether these additional 'positives' are clinically important. GPP testing has the potential to simplify testing and accelerate reporting when compared to conventional microbiology methods. However the impact of GPP testing upon the management, treatment and outcome of patients is poorly understood and further studies are needed to evaluate the health economic impact of GPP testing compared with standard methods. The review protocol is registered with PROSPERO as CRD42016033320.
Treating Depression during Pregnancy and the Postpartum: A Preliminary Meta-Analysis
ERIC Educational Resources Information Center
Bledsoe, Sarah E.; Grote, Nancy K.
2006-01-01
Objectives: This meta-analysis evaluates treatment effects for nonpsychotic major depression during pregnancy and postpartum comparing interventions by type and timing. Methods: Studies for decreasing depressive severity during pregnancy and postpartum applying treatment trials and standardized measures were included. Standardized mean differences…
Atmospheric Transformation of Volatile Organic Compounds
2008-03-01
Study Analysis: Reactant mixtures and standards from product identification experiments were sampled by exposing a 100% polydimethylsiloxane solid... later using the DNPH derivatization method described above and confirmed against a commercial standard. HPLC analysis of the DNPH cartridges also... reaction mixture for a combined total photolysis time of approximately 50 seconds. 2.3. Kinetic Study Analysis: Samples from kinetic studies were
Method for matching customer and manufacturer positions for metal product parameters standardization
NASA Astrophysics Data System (ADS)
Polyakova, Marina; Rubin, Gennadij; Danilova, Yulija
2018-04-01
Decision making is the main stage in regulating the relations between customer and manufacturer when designing the requirements of norms in standards. It is necessary to match the positions of the negotiating sides in order to reach consensus. To take into consideration the differences between customer and manufacturer estimations of the object under standardization, it is necessary to use special methods of analysis. It is proposed to establish relationships between product properties and functions using functional-target analysis. The special feature of this type of functional analysis is the joint consideration of the research object's functions and properties. The possibility of establishing links between functions and properties is shown using the example of a hexagonal head screw. This approach allows obtaining a quantitative assessment of the closeness of the customer's and manufacturer's positions during decision making in the establishment of standard norms.
ERIC Educational Resources Information Center
Anderson, Carl B.; Metzger, Scott Alan
2011-01-01
This study is a mixed-methods text analysis of African American representation within K-12 U.S. History content standards treating the revolutionary era, the early U.S. republic, the Civil War era, and Reconstruction. The states included in the analysis are Michigan, New Jersey, South Carolina, and Virginia. The analysis finds that the reviewed…
Matteson, Brent S; Hanson, Susan K; Miller, Jeffrey L; Oldham, Warren J
2015-04-01
An optimized method was developed to analyze environmental soil and sediment samples for (237)Np, (239)Pu, and (240)Pu by ICP-MS using a (242)Pu isotope dilution standard. The high yield, short time frame required for analysis, and the commercial availability of the (242)Pu tracer are significant advantages of the method. Control experiments designed to assess method uncertainty, including variation in inter-element fractionation that occurs during the purification protocol, suggest that the overall precision for measurements of (237)Np is typically on the order of ± 5%. Measurements of the (237)Np concentration in a Peruvian Soil blank (NIST SRM 4355) spiked with a known concentration of (237)Np tracer confirmed the accuracy of the method, agreeing well with the expected value. The method has been used to determine neptunium and plutonium concentrations in several environmental matrix standard reference materials available from NIST: SRM 4357 (Radioactivity Standard), SRM 1646a (Estuarine Sediment) and SRM 2702 (Inorganics in Marine Sediment). Copyright © 2015 Elsevier Ltd. All rights reserved.
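The (242)Pu isotope dilution quantification reduces to scaling the measured analyte-to-tracer ratio by the known amount of tracer added. A simplified sketch, assuming mass-bias-corrected atom ratios and negligible analyte isotopes in the spike:

```python
def isotope_dilution_moles(atom_ratio, spike_moles):
    """Moles of analyte isotope (e.g. 239Pu) = measured analyte/spike
    atom ratio x moles of 242Pu tracer added."""
    return atom_ratio * spike_moles

def concentration_pg_per_g(analyte_moles, molar_mass, sample_mass_g):
    """Mass concentration for reporting, in pg per g of sample."""
    return analyte_moles * molar_mass * 1e12 / sample_mass_g
```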
Current federal regulations require monitoring for fecal coliforms or Salmonella in biosolids destined for land application. Methods used for analysis of fecal coliforms and Salmonella were reviewed and a standard protocol was developed. The protocols were then evaluated by testi...
Novel methods of imaging and analysis for the thermoregulatory sweat test.
Carroll, Michael Sean; Reed, David W; Kuntz, Nancy L; Weese-Mayer, Debra Ellyn
2018-06-07
The thermoregulatory sweat test (TST) can be central to the identification and management of disorders affecting sudomotor function and small sensory and autonomic nerve fibers, but the cumbersome nature of the standard testing protocol has prevented its widespread adoption. A high resolution, quantitative, clean and simple assay of sweating could significantly improve identification and management of these disorders. Images from 89 clinical TSTs were analyzed retrospectively using two novel techniques. First, using the standard indicator powder, skin surface sweat distributions were determined algorithmically for each patient. Second, a fundamentally novel method using thermal imaging of forced evaporative cooling was evaluated through comparison with the standard technique. Correlation and receiver operating characteristic analyses were used to determine the degree of match between these methods, and the potential limits of thermal imaging were examined through cumulative analysis of all studied patients. Algorithmic encoding of sweating and non-sweating regions produces a more objective analysis for clinical decision making. Additionally, results from the forced cooling method correspond well with those from indicator powder imaging, with a correlation across spatial regions of -0.78 (CI: -0.84 to -0.71). The method works similarly across body regions, and frame-by-frame analysis suggests the ability to identify sweating regions within about 1 second of imaging. While algorithmic encoding can enhance the standard sweat testing protocol, thermal imaging with forced evaporative cooling can dramatically improve the TST by making it less time-consuming and more patient-friendly than the current approach.
National Standard Catalog (Selected Pages).
1987-09-11
[Excerpt from a garbled table of Chinese national standards; recoverable entries (Standard ID, Standard Name, approval/implementation dates):] GB 223.19-82, Chemical analysis of steel, iron and alloy: sodium thiosulphate iodine content method for measuring copper; GB 3253.7-82, Chemical analysis of stibium: using burning iodine content to determine sulphur (82.6.21 / 83.3.1); GB 3254.2-82, Chemical analysis of stibium: using iodine content to determine stibium oxide (82.6.21 / 83.3.1).
Wake vortex separation standards: analysis methods
DOT National Transportation Integrated Search
1997-01-01
Wake vortex separation standards are used to prevent hazardous wake vortex encounters. A "safe" separation model can be used to assess the safety of proposed changes in the standards. A safe separation model can be derived from an encounter hazard mo...
CrowdMapping: A Crowdsourcing-Based Terminology Mapping Method for Medical Data Standardization.
Mao, Huajian; Chi, Chenyang; Huang, Boyu; Meng, Haibin; Yu, Jinghui; Zhao, Dongsheng
2017-01-01
Standardized terminology is the prerequisite of data exchange in the analysis of clinical processes. However, data from different electronic health record systems are based on idiosyncratic terminology systems, especially when the data come from different hospitals and healthcare organizations. Terminology standardization is necessary for medical data analysis. We propose a crowdsourcing-based terminology mapping method, CrowdMapping, to standardize the terminology in medical data. CrowdMapping uses a confidential model to determine how terminologies are mapped to a standard system, like ICD-10. The model uses mappings from different health care organizations and evaluates the diversity of the mappings to determine a more sophisticated mapping rule. Further, the CrowdMapping model enables users to rate the mapping result and interact with the model evaluation. CrowdMapping is a work-in-progress system; we present initial results of mapping terminologies.
Tsuchiyama, Tomoyuki; Katsuhara, Miki; Nakajima, Masahiro
2017-11-17
In the multi-residue analysis of pesticides using GC-MS, the quantitative results are adversely affected by a phenomenon known as the matrix effect. Although the use of matrix-matched standards is considered to be one of the most practical solutions to this problem, complete removal of the matrix effect is difficult in complex food matrices owing to their inconsistency. As a result, residual matrix effects can introduce analytical errors. To compensate for residual matrix effects, we have developed a novel method that employs multiple isotopically labeled internal standards (ILIS). The matrix effects of ILIS and pesticides were evaluated in spiked matrix extracts of various agricultural commodities, and the obtained data were subjected to simple statistical analysis. Based on the similarities between the patterns of variation in the analytical response, a total of 32 isotopically labeled compounds were assigned to 338 pesticides as internal standards. It was found that by utilizing multiple ILIS, residual matrix effects could be effectively compensated. The developed method exhibited superior quantitative performance compared with the common single-internal-standard method. The proposed method is more feasible for regulatory purposes than that using only predetermined correction factors and is considered to be promising for practical applications. Copyright © 2017 Elsevier B.V. All rights reserved.
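One plausible reading of the similarity-based assignment described is that each pesticide is paired with the labeled standard whose matrix-effect profile across the tested commodities correlates best with its own. The data layout below (dicts mapping compound names to a vector of matrix effects per commodity) is an assumption:

```python
import numpy as np

def assign_internal_standards(pesticide_me, ilis_me):
    """For each pesticide, pick the isotopically labeled internal
    standard with the most similar matrix-effect pattern (highest
    Pearson correlation across matrices)."""
    assignments = {}
    for name, profile in pesticide_me.items():
        best = max(ilis_me,
                   key=lambda s: np.corrcoef(profile, ilis_me[s])[0, 1])
        assignments[name] = best
    return assignments
```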
Mao, De-Hua; Hu, Guang-Wei; Liu, Hui-Jie; Li, Zheng-Zui; Li, Zhi-Long; Tan, Zi-Fang
2014-02-01
The annual emergy and currency value of the main ecological services of returning cropland to lake in the Dongting Lake region from 1999 to 2010 were calculated based on emergy analysis. A calculation method for the ecological compensation standard was established by calculating the annual total emergy of the increment in ecological service functions since the starting year of returning cropland to lake, and the annual ecological compensation standard and compensation area were analyzed for 1999 to 2010. The results indicated that the ecological compensation standard from 1999 to 2010 was 40.31-86.48 yuan x m(-2), with a mean of 57.33 yuan x m(-2). The ecological compensation standard presented a year-by-year increasing trend due to the effect of eco-recovery of returning cropland to lake. The ecological compensation standard in the research area presented a swift and steady growth trend after 2005, mainly due to the intensive economic development of Hunan Province, suggesting that the value of natural ecological resources would increase along with the development of society and the economy. Applying emergy analysis to research on the ecological compensation standard could reveal the dynamics of the annual ecological compensation standard, solve the problem of linking matter flow, energy flow and economic flow, and overcome the subjectivity and arbitrariness of environmental economic methods. The empirical research on the ecological compensation standard in the Dongting Lake region showed that the emergy analysis was feasible and advanced.
Monitoring for airborne allergens
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burge, H.A.
1992-07-01
Monitoring for allergens can provide some information on the kinds and levels of exposure experienced by local patient populations, provided volumetric methods are used for sample collection and the analysis is accurate and consistent. Such data can also be used to develop standards for the specific environment and to begin to develop predictive models. Comparing outdoor allergen aerosols between different monitoring sites requires identical collection and analysis methods and some kind of rational standard, whether arbitrary or based on recognized health effects. 32 references.
Louwagie, Mathilde; Kieffer-Jaquinod, Sylvie; Dupierris, Véronique; Couté, Yohann; Bruley, Christophe; Garin, Jérôme; Dupuis, Alain; Jaquinod, Michel; Brun, Virginie
2012-07-06
Accurate quantification of pure peptides and proteins is essential for biotechnology, clinical chemistry, proteomics, and systems biology. The reference method to quantify peptides and proteins is amino acid analysis (AAA). This consists of an acidic hydrolysis followed by chromatographic separation and spectrophotometric detection of amino acids. Although widely used, this method displays some limitations, in particular the need for large amounts of starting material. Driven by the need to quantify isotope-dilution standards used for absolute quantitative proteomics, particularly stable isotope-labeled (SIL) peptides and PSAQ proteins, we developed a new AAA assay (AAA-MS). This method requires neither derivatization nor chromatographic separation of amino acids. It is based on rapid microwave-assisted acidic hydrolysis followed by high-resolution mass spectrometry analysis of amino acids. Quantification is performed by comparing MS signals from labeled amino acids (SIL peptide- and PSAQ-derived) with those of unlabeled amino acids originating from co-hydrolyzed NIST standard reference materials. For both SIL peptides and PSAQ standards, AAA-MS quantification results were consistent with classical AAA measurements. Compared to AAA assay, AAA-MS was much faster and was 100-fold more sensitive for peptide and protein quantification. Finally, thanks to the development of a labeled protein standard, we also extended AAA-MS analysis to the quantification of unlabeled proteins.
Chromý, Vratislav; Vinklárková, Bára; Šprongl, Luděk; Bittová, Miroslava
2015-01-01
We found previously that albumin-calibrated total protein in certified reference materials causes unacceptable positive bias in the analysis of human sera. The simplest way to cure this defect is the use of human-based serum/plasma standards calibrated by the Kjeldahl method. Such standards, commutable with serum samples, will compensate for bias caused by lipids and bilirubin in most human sera. To find a suitable primary reference procedure for total protein in reference materials, we reviewed Kjeldahl methods adopted by laboratory medicine. We found two methods recommended for total protein in human samples: an indirect analysis based on total Kjeldahl nitrogen corrected for its nonprotein nitrogen, and a direct analysis made on isolated protein precipitates. The methods found will be assessed in a subsequent article.
Ma, Weina; Yang, Liu; Lv, Yanni; Fu, Jia; Zhang, Yanmin; He, Langchong
2017-06-23
The equilibrium dissociation constant (KD) of drug-membrane receptor affinity is the basic parameter reflecting the strength of interaction. The cell membrane chromatography (CMC) method is an effective technique for studying the characteristics of drug-membrane receptor affinity. In this study, a CMC relative standard method for determining the KD of drug-membrane receptor affinity was established and used to analyze the relative KD values of drugs binding to membrane receptors (the epidermal growth factor receptor and the angiotensin II receptor). The KD values obtained by the CMC relative standard method correlated strongly with those obtained by the frontal analysis method, and also correlated with the pharmacological activity of the drugs evaluated. The CMC relative standard method is a convenient and effective way to evaluate drug-membrane receptor affinity.
The Structure of Mixed Method Studies in Educational Research: A Content Analysis
ERIC Educational Resources Information Center
Bryant, Lauren H.
2011-01-01
Educational researchers are beginning to use mixed methods designs to answer complex research questions. This content analysis investigates the structure and use of mixed methods in educational research in order to work toward a more standardized presentation. I used a concurrent mixed methods approach to analyze 30 studies from three prominent…
Zhi, Ruicong; Zhao, Lei; Xie, Nan; Wang, Houyin; Shi, Bolin; Shi, Jingye
2016-01-13
A framework for establishing a standard reference scale for texture is proposed, based on multivariate statistical analysis of instrumental measurements and sensory evaluation. Multivariate statistical analysis is conducted to rapidly select typical reference samples with the characteristics of universality, representativeness, stability, substitutability, and traceability. The reasonableness of the framework is verified by establishing a standard reference scale for the texture attribute hardness using well-known Chinese foods. More than 100 food products in 16 categories were tested by instrumental measurement (TPA test), and the results were analyzed with clustering analysis, principal component analysis, relative standard deviation, and analysis of variance. As a result, nine kinds of foods were selected to construct the hardness standard reference scale. The regression between the estimated sensory value and the instrumentally measured value is significant (R(2) = 0.9765), fitting well with Stevens's theory. The research provides a reliable theoretical basis and practical guide for establishing quantitative standard reference scales for food texture characteristics.
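A minimal sketch of the Stevens-type log-log fit invoked above; the instrumental and sensory values below are invented for illustration and are not the study's data:

```python
import numpy as np

# Hypothetical instrumental hardness values (N) and panel sensory scores
instrumental = np.array([2.1, 5.3, 9.8, 18.0, 33.5, 61.0, 110.0, 200.0, 365.0])
sensory = np.array([1.0, 1.8, 2.9, 4.1, 5.2, 6.4, 7.5, 8.6, 9.7])

# Stevens's power law S = k * I**n is linear in log-log space:
# log S = log k + n * log I
slope, intercept = np.polyfit(np.log(instrumental), np.log(sensory), 1)
predicted = np.exp(intercept) * instrumental**slope

# Coefficient of determination between estimated and measured sensory values
ss_res = np.sum((sensory - predicted) ** 2)
ss_tot = np.sum((sensory - np.mean(sensory)) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"exponent n = {slope:.3f}, R^2 = {r2:.4f}")
```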
A Comparison of Approaches for Setting Proficiency Standards.
ERIC Educational Resources Information Center
Koffler, Stephen L.
This research compared the cut-off scores estimated from an empirical procedure (Contrasting group method) to those determined from a more theoretical process (Nedelsky method). A methodological and statistical framework was also provided for analysis of the data to obtain the most appropriate standard using the empirical procedure. Data were…
Qualitative Analysis on Stage: Making the Research Process More Public.
ERIC Educational Resources Information Center
Anfara, Vincent A., Jr.; Brown, Kathleen M.
The increased use of qualitative research methods has spurred interest in developing formal standards for assessing its validity. These standards, however, fall short if they do not include public disclosure of methods as a criterion. The researcher must be accountable in documenting the actions associated with establishing internal validity…
A Comparison of the Effectiveness of Two Design Methodologies in a Secondary School Setting.
ERIC Educational Resources Information Center
Cannizzaro, Brenton; Boughton, Doug
1998-01-01
Examines the effectiveness of the analysis-synthesis and generator-conjecture-analysis models of design education. Concludes that the generator-conjecture-analysis design method produced student design products of a slightly higher standard than the analysis-synthesis design method. Discusses the findings in more detail and considers implications.…
HPLC analysis and standardization of Brahmi vati – An Ayurvedic poly-herbal formulation
Mishra, Amrita; Mishra, Arun K.; Tiwari, Om Prakash; Jha, Shivesh
2013-01-01
Objectives: The aim of the present study was to standardize Brahmi vati (BV) by simultaneous quantitative estimation of Bacoside A3 and Piperine using an HPLC-UV method. BV is an important Ayurvedic poly-herbal formulation, containing thirty-eight ingredients including Bacopa monnieri L. and Piper longum L., used to treat epilepsy and mental disorders. Materials and methods: An HPLC-UV method was developed for the standardization of BV through simultaneous quantitative estimation of Bacoside A3 and Piperine, the major constituents of B. monnieri L. and P. longum L., respectively. The developed method was validated for linearity, precision, accuracy, and robustness. Results: The HPLC analysis showed significantly higher amounts of Bacoside A3 and Piperine in the in-house sample of BV compared with three different marketed samples. The variation in the amounts of Bacoside A3 and Piperine across samples indicates non-uniform quality, which will lead to differences in therapeutic effect. Conclusion: The outcome of the present investigation underlines the importance of standardization of Ayurvedic formulations. The developed method may be further used to standardize other samples of BV or other formulations containing Bacoside A3 and Piperine. PMID:24396246
Metrological activity determination of 133Ba by sum-peak absolute method
NASA Astrophysics Data System (ADS)
da Silva, R. L.; de Almeida, M. C. M.; Delgado, J. U.; Poledna, R.; Santos, A.; de Veras, E. V.; Rangel, J.; Trindade, O. L.
2016-07-01
The National Laboratory for Metrology of Ionizing Radiation provides gamma-emitting radionuclide sources standardized in activity with reduced uncertainties. Relative methods require standards to determine sample activity, whereas absolute methods, such as the sum-peak method, do not: the activity is obtained directly, with good accuracy and low uncertainties. 133Ba is used in research laboratories and in the calibration of detectors for analysis in different fields. Classical absolute methods cannot calibrate 133Ba because of its complex decay scheme. Here, the sum-peak method, using gamma spectrometry with a germanium detector, was used to standardize 133Ba samples. Activity results with uncertainties below 1% were obtained.
Elemental Analysis in Biological Matrices Using ICP-MS.
Hansen, Matthew N; Clogston, Jeffrey D
2018-01-01
The increasing exploration of metallic nanoparticles for use as cancer therapeutic agents necessitates a sensitive technique to track the clearance and distribution of the material once introduced into a living system. Inductively coupled plasma mass spectrometry (ICP-MS) provides a sensitive and selective tool for tracking the distribution of metal components from these nanotherapeutics. This chapter presents a standardized method for processing biological matrices, ensuring complete homogenization of tissues, and outlines the preparation of appropriate standards and controls. The method described herein utilized gold nanoparticle-treated samples; however, the method can easily be applied to the analysis of other metals.
Testing for intracycle determinism in pseudoperiodic time series.
Coelho, Mara C S; Mendes, Eduardo M A M; Aguirre, Luis A
2008-06-01
A determinism test is proposed based on the well-known surrogate data method. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (i.e., short-term) determinism in pseudoperiodic time series, for which standard methods of surrogate analysis do not apply. The approach is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be used. The determinism test is applied to simulated and experimental pseudoperiodic time series, and the results show the applicability of the proposed test.
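The second step relies on standard surrogate analysis; a common variant is the phase-randomized surrogate, sketched below under the assumption that the series has already been detrended. This illustrates the general technique, not the authors' specific test:

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Return a surrogate series with the same power spectrum as x
    but randomized Fourier phases (a standard surrogate-data construction)."""
    n = len(x)
    spectrum = np.fft.rfft(x)
    # Randomize phases of all bins except DC (and Nyquist, if present)
    phases = rng.uniform(0, 2 * np.pi, len(spectrum))
    phases[0] = 0.0
    if n % 2 == 0:
        phases[-1] = 0.0
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=n)

rng = np.random.default_rng(0)
t = np.linspace(0, 20 * np.pi, 2000)
detrended = np.sin(t) + 0.1 * rng.standard_normal(t.size)  # toy preprocessed series
surrogate = phase_randomized_surrogate(detrended, rng)
```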
1982-10-28
form a non-soluble complex. After filtering and burning, the impure molybdenum trioxide is weighed. Ammonia water is used to dissolve the molybdenum...niobium and tantalum should use the methyl alcohol distillation - curcumin absorption luminosity method for determination. II. The Methyl Alcohol...Distillation - Curcumin Absorption Luminosity Method. 1. Summary of Method: In a phosphorus sulfate medium, boron and methyl alcohol produce methyl borate
2010-01-01
Background: Cluster analysis, and in particular hierarchical clustering, is widely used to extract information from gene expression data. The aim is to discover new classes, or sub-classes, of either individuals or genes. Performing a cluster analysis commonly involves decisions on how to handle missing values, standardize the data, and select genes. In addition, pre-processing, involving various types of filtration and normalization procedures, can affect the ability to discover biologically relevant classes. Here we consider cluster analysis in a broad sense and perform a comprehensive evaluation that covers several aspects of cluster analyses, including normalization. Results: We evaluated 2780 cluster analysis methods on seven publicly available 2-channel microarray data sets with common reference designs. Each cluster analysis method differed in data normalization (5 normalizations were considered), missing value imputation (2), standardization of data (2), gene selection (19), or clustering method (11). The cluster analyses are evaluated using known classes, such as cancer types, and the adjusted Rand index. The performance of the different analyses varies between the data sets, and it is difficult to give general recommendations. However, normalization, gene selection, and clustering method are all variables that have a significant impact on performance. In particular, gene selection is important, and it is generally necessary to include a relatively large number of genes in order to get good performance. Selecting genes with high standard deviation or using principal component analysis are shown to be the preferred gene selection methods. Hierarchical clustering using Ward's method, k-means clustering, and Mclust are the clustering methods considered in this paper that achieve the highest adjusted Rand index. Normalization can have a significant positive impact on the ability to cluster individuals, and there are indications that background correction is preferable, in particular if the gene selection is successful. However, this is an area that needs to be studied further in order to draw any general conclusions. Conclusions: The choice of cluster analysis, and in particular gene selection, has a large impact on the ability to cluster individuals correctly based on expression profiles. Normalization has a positive effect, but the relative performance of different normalizations is an area that needs more research. In summary, although clustering, gene selection, and normalization are considered standard methods in bioinformatics, our comprehensive analysis shows that selecting the right methods, and the right combinations of methods, is far from trivial, and that much is still unexplored in what is considered to be the most basic analysis of genomic data. PMID:20937082
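A minimal sketch of the evaluation loop described above (cluster, then score against known classes with the adjusted Rand index), assuming toy data and one particular combination of gene selection and clustering method:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(1)
# Toy "expression" matrix: 20 samples x 50 genes, two known classes
class_a = rng.normal(0.0, 1.0, size=(10, 50))
class_b = rng.normal(1.5, 1.0, size=(10, 50))
data = np.vstack([class_a, class_b])
known = np.array([0] * 10 + [1] * 10)

# Gene selection by standard deviation: keep the 25 most variable genes
selected = data[:, np.argsort(data.std(axis=0))[-25:]]

# Hierarchical clustering with Ward's method, cut into two clusters
labels = fcluster(linkage(selected, method="ward"), t=2, criterion="maxclust")
print("adjusted Rand index:", adjusted_rand_score(known, labels))
```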
Hurst, William J; Stanley, Bruce; Glinski, Jan A; Davey, Matthew; Payne, Mark J; Stuart, David A
2009-10-15
This report describes the characterization of a series of commercially available procyanidin standards, ranging from dimers (DP = 2) to decamers (DP = 10), for the determination of procyanidins in cocoa and chocolate. Using a combination of HPLC with fluorescence detection and MALDI-TOF mass spectrometry, the purity of each standard was determined, and these data were used to determine relative response factors. These response factors were compared with response factors obtained from published methods. Data comparing the procyanidin analysis of a commercially available US dark chocolate calculated using each of the calibration methods indicate divergent results and demonstrate that previous methods may significantly underreport the procyanidins in cocoa-containing products. These results have far-reaching implications because the previous calibration methods have been used to develop data for a variety of scientific reports, including food databases and clinical studies.
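A minimal sketch of how a relative response factor corrects quantification; the peak areas, concentrations, and the choice of the dimer as reference are hypothetical:

```python
# Response factor: peak area per unit concentration; a relative response
# factor (RRF) expresses each oligomer's factor relative to a reference
# (here the dimer). All areas and concentrations are invented.
def response_factor(peak_area, concentration_mg_ml):
    return peak_area / concentration_mg_ml

dimer_rf = response_factor(peak_area=15200.0, concentration_mg_ml=0.10)
trimer_rf = response_factor(peak_area=13100.0, concentration_mg_ml=0.10)
rrf_trimer = trimer_rf / dimer_rf

# Quantify trimer in an unknown using the dimer calibration plus its RRF
unknown_trimer_conc = 11800.0 / (dimer_rf * rrf_trimer)
print(f"RRF(trimer) = {rrf_trimer:.3f}, trimer = {unknown_trimer_conc:.3f} mg/mL")
```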
Wan, Xiang; Wang, Wenqian; Liu, Jiming; Tong, Tiejun
2014-12-19
In systematic reviews and meta-analysis, researchers often pool the results of the sample mean and standard deviation from a set of similar clinical trials. A number of the trials, however, report their results using the median, the minimum and maximum values, and/or the first and third quartiles. Hence, in order to combine results, one may have to estimate the sample mean and standard deviation for such trials. In this paper, we propose to improve the existing literature in several directions. First, we show that the sample standard deviation estimation in Hozo et al.'s method (BMC Med Res Methodol 5:13, 2005) has some serious limitations and is always less satisfactory in practice. Inspired by this, we propose a new estimation method that incorporates the sample size. Second, we systematically study the sample mean and standard deviation estimation problem under several other interesting settings where the interquartile range is also available for the trials. We demonstrate the performance of the proposed methods through simulation studies for the three frequently encountered scenarios. For the first two scenarios, our method greatly improves existing methods and provides a nearly unbiased estimate of the true sample standard deviation for normal data and a slightly biased estimate for skewed data. For the third scenario, our method still performs very well for both normal and skewed data. Furthermore, we compare the estimators of the sample mean and standard deviation under all three scenarios and present some suggestions on which scenario is preferred in real-world applications. In this paper, we discuss different approximation methods for estimating the sample mean and standard deviation and propose some new estimation methods to improve the existing literature. We conclude our work with a summary table (an Excel spreadsheet including all formulas) that serves as comprehensive guidance for performing meta-analysis in different situations.
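For the min/median/max scenario, the closed-form estimators commonly cited from this line of work can be sketched as follows; the constants follow the widely quoted form and should be verified against the original paper before use:

```python
from scipy.stats import norm

def estimate_mean_sd(minimum, median, maximum, n):
    """Estimate sample mean and SD from the min, median, max, and sample
    size, following the commonly cited closed-form estimator for this
    scenario (mean from Hozo et al.; SD incorporating the sample size)."""
    mean = (minimum + 2.0 * median + maximum) / 4.0
    xi = 2.0 * norm.ppf((n - 0.375) / (n + 0.25))  # expected range in SD units
    sd = (maximum - minimum) / xi
    return mean, sd

mean, sd = estimate_mean_sd(minimum=10.0, median=18.0, maximum=30.0, n=50)
print(f"mean ~ {mean:.2f}, SD ~ {sd:.2f}")
```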
Greenhouse Gas Analysis by GC/MS
NASA Astrophysics Data System (ADS)
Bock, E. M.; Easton, Z. M.; Macek, P.
2015-12-01
Current methods for analyzing greenhouse gases rely on dedicated, complex, multiple-column, multiple-detector gas chromatographs. A novel method was developed in partnership with Shimadzu for simultaneous quantification of carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) in environmental gas samples. Gas bulbs were used to make custom standard mixtures by injecting small volumes of pure analyte into a nitrogen-filled bulb. The resulting calibration curves were validated using a certified gas standard. The use of GC/MS systems for this analysis has the potential to move the analysis of greenhouse gases from expensive, custom GC systems to standard single-quadrupole GC/MS systems that are available in most laboratories and have a wide variety of applications beyond greenhouse gas analysis. Additionally, mass spectrometry can confirm the identity of target analytes and will assist in identifying unknown peaks should they be present in the chromatogram.
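A minimal sketch of building and applying such a calibration curve, with hypothetical peak areas and mixing ratios (the values are not the study's data):

```python
import numpy as np

# Hypothetical N2O calibration points: mixing ratio (ppm) vs. peak area
conc_ppm = np.array([0.1, 0.5, 1.0, 2.0, 5.0])
peak_area = np.array([2050.0, 10600.0, 20900.0, 42200.0, 104500.0])

# Linear calibration: area = slope * concentration + intercept
slope, intercept = np.polyfit(conc_ppm, peak_area, 1)

# Quantify an unknown sample and check the quality of the fit
unknown_area = 33150.0
unknown_ppm = (unknown_area - intercept) / slope
residuals = peak_area - (slope * conc_ppm + intercept)
print(f"unknown ~ {unknown_ppm:.2f} ppm; RMS residual = {residuals.std():.1f}")
```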
Templeton, David W.; Sluiter, Justin B.; Sluiter, Amie; ...
2016-10-18
In an effort to find economical, carbon-neutral transportation fuels, biomass feedstock compositional analysis methods are used to monitor, compare, and improve biofuel conversion processes. These methods are empirical, and the analytical variability seen in the feedstock compositional data propagates into variability in the conversion yields, component balances, mass balances, and ultimately the minimum ethanol selling price (MESP). We report the average composition and standard deviations of 119 individually extracted National Institute of Standards and Technology (NIST) bagasse [Reference Material (RM) 8491] run by seven analysts over 7 years. Two additional datasets, using bulk-extracted bagasse (containing 58 and 291 replicates each), were examined to separate out the effects of batch, analyst, sugar recovery standard calculation method, and extractions from the total analytical variability seen in the individually extracted dataset. We believe this is the world's largest NIST bagasse compositional analysis dataset, and it provides unique insight into the long-term analytical variability. Understanding the long-term variability of the feedstock analysis will help determine the minimum difference that can be detected in yield, mass balance, and efficiency calculations. The long-term data show consistent bagasse component values through time and by different analysts. This suggests that the standard compositional analysis methods were performed consistently and that the bagasse RM itself remained unchanged during this time period. The long-term variability seen here is generally higher than short-term variabilities. It is worth noting that the effect of short-term or long-term feedstock compositional variability on MESP is small, about $0.03 per gallon. The long-term analysis variabilities reported here are plausible minimum values for these methods, though not necessarily average or expected variabilities. We must emphasize the importance of the training and good analytical procedures needed to generate these data. When combined with a robust QA/QC oversight protocol, these empirical methods can be relied upon to generate high-quality data over a long period of time.
Evaluating the Risks of Clinical Research: Direct Comparative Analysis
Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S.; Wendler, David
2014-01-01
Objectives: Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed "risks of daily life" standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. Methods: This study employed a conceptual and normative analysis, and use of an illustrative example. Results: Different risks are composed of the same basic elements: type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the "risks of daily life" standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Conclusions: Direct comparative analysis is a systematic method for applying the "risks of daily life" standard for minimal risk to research procedures that pose the same types of risk as daily life activities. It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about research risks. PMID:25210944
Bosse, Hans Martin; Nickel, Martin; Huwendiek, Sören; Schultz, Jobst Hendrik; Nikendei, Christoph
2015-10-24
The few studies directly comparing the methodological approaches of peer role play (RP) and standardized patients (SP) for the delivery of communication skills all suggest that both methods are effective. In this study we calculated the costs of both methods (given comparable outcomes) and are the first to generate a differential cost-effectiveness analysis of the two. Medical students in their pre-final year were randomly assigned to one of two groups receiving communication training in pediatrics, either with RP (N = 34) or with 19 individually trained SP (N = 35). In an OSCE with standardized patients using the Calgary-Cambridge Referenced Observation Guide, both groups achieved comparably high scores (results published). In this study, the corresponding costs were assessed as the man-hours resulting from the hours of work of SP and tutors, and a cost-effectiveness analysis was performed. The analysis revealed a major advantage for RP over SP (112 vs. 172 man-hours; cost-effectiveness ratio 0.74 vs. 0.45) at comparable performance levels after training with both methods. While both peer role play and training with standardized patients have their value in medical curricula, RP has a major advantage in terms of cost-effectiveness. This could be taken into account in future curricular decisions.
Yousuf, Naveed; Violato, Claudio; Zuberi, Rukhsana W
2015-01-01
CONSTRUCT: Authentic standard-setting methods will demonstrate high convergent validity evidence of their outcomes, that is, cutoff scores and pass/fail decisions, with most other methods when compared with each other. The objective structured clinical examination (OSCE) was established for valid, reliable, and objective assessment of clinical skills in health professions education. Various standard-setting methods have been proposed to identify objective, reliable, and valid cutoff scores on OSCEs, but these methods may identify different cutoff scores for the same examinations. Identification of valid and reliable cutoff scores for OSCEs therefore remains an important issue and a challenge. Thirty OSCE stations administered at least twice in the years 2010-2012 to 393 medical students in Years 2 and 3 at Aga Khan University are included, and the psychometric properties of the scores are determined. Cutoff scores and pass/fail decisions of the Wijnen, Cohen, Mean-1.5SD, Mean-1SD, Angoff, borderline group, and borderline regression (BL-R) methods are compared with each other and with three variants of cluster analysis, using repeated measures analysis of variance and Cohen's kappa. The mean psychometric indices on the 30 OSCE stations are: reliability coefficient = 0.76 (SD = 0.12); standard error of measurement = 5.66 (SD = 1.38); coefficient of determination = 0.47 (SD = 0.19); and intergrade discrimination = 7.19 (SD = 1.89). The BL-R and Wijnen methods show the highest convergent validity evidence on the defined criteria, whereas the Angoff and Mean-1.5SD methods demonstrate the least. The three cluster variants showed substantial convergent validity with the borderline methods. Although the Wijnen method showed a high level of convergent validity, it lacks the theoretical strength to be used for competency-based assessments. The BL-R method is found to show the highest convergent validity evidence for OSCEs among the standard-setting methods used in the present study. We also found that cluster analysis using the mean method can be used for quality assurance of borderline methods. These findings should be further confirmed by studies in other settings.
A refined method for multivariate meta-analysis and meta-regression
Jackson, Daniel; Riley, Richard D
2014-01-01
Making inferences about the average treatment effect using the random effects model for meta-analysis is problematic in the common situation where there is a small number of studies, because estimates of the between-study variance are not precise enough to accurately apply the conventional methods for testing and deriving a confidence interval for the average effect. We have found that a refined method for univariate meta-analysis, which applies a scaling factor to the estimated effects' standard error, provides more accurate inference. We explain how to extend this method to the multivariate scenario and show that our proposal for refined multivariate meta-analysis and meta-regression can provide more accurate inferences than the more conventional approach. We explain how our proposed approach can be implemented using standard output from multivariate meta-analysis software packages and apply our methodology to two real examples. PMID:23996351
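In the univariate case, a scaling factor of this kind resembles the Hartung-Knapp-type adjustment; the sketch below illustrates that idea under that assumption and is not necessarily the authors' exact estimator:

```python
import numpy as np

def scaled_random_effects(y, v, tau2):
    """Univariate random-effects pooled estimate with a Hartung-Knapp-type
    scaling factor applied to the standard error (illustrative sketch)."""
    w = 1.0 / (v + tau2)                     # inverse-variance weights
    mu = np.sum(w * y) / np.sum(w)           # pooled average effect
    se_conventional = np.sqrt(1.0 / np.sum(w))
    k = len(y)
    # Scaling factor: weighted squared deviations about the pooled mean
    q = np.sum(w * (y - mu) ** 2) / (k - 1)
    se_scaled = np.sqrt(q) * se_conventional
    return mu, se_conventional, se_scaled

y = np.array([0.30, 0.15, 0.45, 0.10])    # hypothetical study effects
v = np.array([0.02, 0.03, 0.025, 0.04])   # within-study variances
print(scaled_random_effects(y, v, tau2=0.01))
```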
40 CFR 98.324 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Procedures and Inspection Tracking System Handbook Number: PH-08-V-1, January 1, 2008 (incorporated by... paragraphs (d)(1) through (d)(2) of this section. (1) ASTM D1945-03, Standard Test Method for Analysis of... Reformed Gas by Gas Chromatography; ASTM D4891-89 (Reapproved 2006), Standard Test Method for Heating Value...
40 CFR 98.324 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Procedures and Inspection Tracking System Handbook Number: PH-08-V-1, January 1, 2008 (incorporated by... paragraphs (d)(1) through (d)(2) of this section. (1) ASTM D1945-03, Standard Test Method for Analysis of... Reformed Gas by Gas Chromatography; ASTM D4891-89 (Reapproved 2006), Standard Test Method for Heating Value...
40 CFR 98.324 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Procedures and Inspection Tracking System Handbook Number: PH-08-V-1, January 1, 2008 (incorporated by... ASTM D1945-03, Standard Test Method for Analysis of Natural Gas by Gas Chromatography; ASTM D1946-90... (Reapproved 2006), Standard Test Method for Heating Value of Gases in Natural Gas Range by Stoichiometric...
Negeri, Zelalem F; Shaikh, Mateen; Beyene, Joseph
2018-05-11
Diagnostic or screening tests are widely used in medical fields to classify patients according to their disease status. Several statistical models for meta-analysis of diagnostic test accuracy studies have been developed to synthesize the sensitivity and specificity of a diagnostic test of interest. Because test sensitivity and specificity are correlated, modeling the two measures using a bivariate model is recommended. In this paper, we extend the current standard bivariate linear mixed model (LMM) by proposing two variance-stabilizing transformations: the arcsine square root and the Freeman-Tukey double arcsine transformation. We compared the performance of the proposed methods with the standard method through simulations using several performance measures. The simulation results showed that our proposed methods performed better than the standard LMM in terms of bias, root mean square error, and coverage probability in most scenarios, even when data were generated assuming the standard LMM. We also illustrate the methods using two real data sets.
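The two transformations are standard and easy to state; a minimal sketch, applied to hypothetical counts from a single study:

```python
import numpy as np

def arcsine_sqrt(events, n):
    """Arcsine square root transformation of a proportion."""
    return np.arcsin(np.sqrt(events / n))

def freeman_tukey(events, n):
    """Freeman-Tukey double arcsine transformation of a proportion."""
    return 0.5 * (np.arcsin(np.sqrt(events / (n + 1.0)))
                  + np.arcsin(np.sqrt((events + 1.0) / (n + 1.0))))

# Hypothetical 2x2 counts from one diagnostic accuracy study
tp, fn = 45, 5    # diseased: true positives, false negatives
tn, fp = 80, 20   # healthy: true negatives, false positives

sens_t = freeman_tukey(tp, tp + fn)
spec_t = freeman_tukey(tn, tn + fp)
print(f"transformed sensitivity {sens_t:.3f}, specificity {spec_t:.3f}")
```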
Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.
2013-01-01
The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted 1H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. PMID:23870106
Vajda, E G; Skedros, J G; Bloebaum, R D
1998-10-01
Backscattered electron (BSE) imaging has proven to be a useful method for analyzing the mineral distribution in microscopic regions of bone. However, an accepted method of standardization has not been developed, limiting the utility of BSE imaging for truly quantitative analysis. Previous work has suggested that BSE images can be standardized by energy-dispersive x-ray spectrometry (EDX). Unfortunately, EDX-standardized BSE images tend to underestimate the mineral content of bone when compared with traditional ash measurements. The goal of this study is to investigate the nature of the deficit between EDX-standardized BSE images and ash measurements. A series of analytical standards, ashed bone specimens, and unembedded bone specimens were investigated to determine the source of the deficit previously reported. The primary source of error was found to be inaccurate ZAF corrections to account for the organic phase of the bone matrix. Conductive coatings, methylmethacrylate embedding media, and minor elemental constituents in bone mineral introduced negligible errors. It is suggested that the errors would remain constant and an empirical correction could be used to account for the deficit. However, extensive preliminary testing of the analysis equipment is essential.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The first part covers standards for gaseous fuels. The second part covers standards on coal and coke, including the classification of coals, determination of major elements in coal ash and trace elements in coal, metallurgical properties of coal and coke, methods of analysis of coal and coke, petrographic analysis of coal and coke, physical characteristics of coal, and quality assurance and sampling.
A Standards-Based Content Analysis of Selected Biological Science Websites
ERIC Educational Resources Information Center
Stewart, Joy E.
2010-01-01
The purpose of this study was to analyze the biology content, instructional strategies, and assessment methods of 100 biological science websites that were appropriate for Grade 12 educational purposes. For the analysis of each website, an instrument, developed from the National Science Education Standards (NSES) for Grade 12 Life Science coupled…
An Effective Method for Substance Detection Using the Broad Spectrum THz Signal: A “Terahertz Nose”
Trofimov, Vyacheslav A.; Varentsova, Svetlana A.
2015-01-01
We propose an effective method for the detection and identification of dangerous substances using a broadband THz pulse. Such a pulse excites, for example, many vibrational or rotational energy levels of molecules simultaneously. By analyzing the time-dependent spectrum of the THz pulse transmitted through or reflected from a substance, we follow the dynamics of the average response spectrum. Comparing the absorption and emission spectrum dynamics of a substance under analysis with the corresponding data for a standard substance, one can detect and identify the substance under real conditions, taking into account the influence of packing material, water vapor, and the substance surface. For quality assessment of the detection of the standard substance in the signal under analysis, we propose time-dependent integral correlation criteria. Limitations of the commonly used detection and identification methods, based on a comparison between the absorption frequencies of a substance under analysis and a standard substance, are demonstrated using a physical experiment with paper napkins. PMID:26020281
Coplen, T.B.; Wildman, J.D.; Chen, J.
1991-01-01
Improved precision in the H2-H2O equilibration method for δD analysis has been achieved in an automated system. Reduction of the 1-σ standard deviation of a single mass-spectrometer analysis to 1.3‰ is achieved by (1) bonding catalyst to glass rods and assigning use to specific equilibration chambers to monitor catalyst performance, (2) improving the apparatus design, and (3) reducing the H3+ contribution of the mass-spectrometer ion source. For replicate analyses of a water sample, the standard deviation improved to 0.8‰. H2S-bearing samples and samples as small as 0.1 mL can be analyzed routinely with this method.
ANALYSIS OF ALDEHYDES AND KETONES IN THE GAS PHASE
The development and testing of a 2,4-dinitrophenylhydrazine-acetonitrile (DNPH-ACN) method for the analysis of aldehydes and ketones in ambient air are described. A discussion of interferences, preparation of calibration standards, analytical testing, fluorescence methods and car...
Li, Xiongwei; Wang, Zhe; Fu, Yangting; Li, Zheng; Liu, Jianmin; Ni, Weidou
2014-01-01
Measurement of coal carbon content using laser-induced breakdown spectroscopy (LIBS) is limited by its low precision and accuracy. A modified spectrum standardization method was proposed to achieve both reproducible and accurate results for the quantitative analysis of carbon content in coal using LIBS. The proposed method used the molecular emissions of diatomic carbon (C2) and cyanide (CN) to compensate for the diminution of atomic carbon emissions in high volatile content coal samples caused by matrix effect. The compensated carbon line intensities were further converted into an assumed standard state with standard plasma temperature, electron number density, and total number density of carbon, under which the carbon line intensity is proportional to its concentration in the coal samples. To obtain better compensation for fluctuations of total carbon number density, the segmental spectral area was used and an iterative algorithm was applied that is different from our previous spectrum standardization calculations. The modified spectrum standardization model was applied to the measurement of carbon content in 24 bituminous coal samples. The results demonstrate that the proposed method has superior performance over the generally applied normalization methods. The average relative standard deviation was 3.21%, the coefficient of determination was 0.90, the root mean square error of prediction was 2.24%, and the average maximum relative error for the modified model was 12.18%, showing an overall improvement over the corresponding values for the normalization with segmental spectrum area, 6.00%, 0.75, 3.77%, and 15.40%, respectively.
Sert, Şenol
2013-07-01
A comparison of methods for the determination (without sample pre-concentration) of uranium in ore by inductively coupled plasma optical emission spectrometry (ICP-OES) has been performed. The experiments were conducted using three procedures, matrix matching, plasma optimization, and internal standardization, for three emission lines of uranium. Three wavelengths of Sm were tested as internal standards for the internal standardization method. Robust conditions were evaluated using the applied radiofrequency power, nebulizer argon gas flow rate, and sample uptake flow rate, by considering the intensity ratio of the Mg(II) 280.270 nm and Mg(I) 285.213 nm lines. Analytical characterization of the method was assessed by limit of detection and relative standard deviation values. The certified reference soil sample IAEA S-8 was analyzed, and uranium determination at 367.007 nm with internal standardization using Sm at 359.260 nm was shown to improve accuracy compared with the other methods. The developed method was used for the analysis of real uranium ore samples.
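A minimal sketch of the internal standardization principle (analyte-to-internal-standard intensity ratios cancel instrumental drift), with hypothetical intensities and concentrations rather than the paper's values:

```python
import numpy as np

# Hypothetical calibration: U concentration (mg/L) vs. the intensity ratio
# of a U emission line to the Sm internal-standard line in each standard
u_conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
i_u = np.array([820.0, 1650.0, 3240.0, 8150.0, 16300.0])
i_sm = np.array([5000.0, 5050.0, 4980.0, 5020.0, 5010.0])  # ~constant spike

ratio = i_u / i_sm
slope, intercept = np.polyfit(u_conc, ratio, 1)

# An unknown measured while the plasma drifted: both raw intensities drop
# by ~10%, but their ratio, and hence the result, is largely unaffected
u_unknown = (1460.0 / 4500.0 - intercept) / slope
print(f"U in unknown ~ {u_unknown:.2f} mg/L")
```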
[Optimization of cluster analysis based on drug resistance profiles of MRSA isolates].
Tani, Hiroya; Kishi, Takahiko; Gotoh, Minehiro; Yamagishi, Yuka; Mikamo, Hiroshige
2015-12-01
We examined 402 methicillin-resistant Staphylococcus aureus (MRSA) strains isolated from clinical specimens in our hospital between November 19, 2010 and December 27, 2011 to evaluate the similarity between cluster analysis of drug susceptibility tests and pulsed-field gel electrophoresis (PFGE). The 402 strains tested were classified into 27 PFGE patterns (151 subtypes). Cluster analyses of drug susceptibility tests, with the cut-off distance chosen to yield a similar classification capability, showed favorable results. When minimum inhibitory concentration (MIC) values were used directly (the MIC method), the level of agreement with PFGE was 74.2% when 15 drugs were tested; the unweighted pair group method with arithmetic mean (UPGMA) was effective when the cut-off distance was 16. Using the SIR method, in which susceptible (S), intermediate (I), and resistant (R) were coded as 0, 2, and 3, respectively, according to the Clinical and Laboratory Standards Institute (CLSI) criteria, the level of agreement with PFGE was 75.9% when the number of drugs tested was 17, the clustering method was UPGMA, and the cut-off distance was 3.6. In addition, to assess reproducibility, 10 strains were randomly sampled from the overall test and subjected to cluster analysis; this was repeated 100 times under the same conditions. The results indicated good reproducibility, with the level of agreement with PFGE showing a mean of 82.0%, standard deviation of 12.1%, and mode of 90.0% for the MIC method, and a mean of 80.0%, standard deviation of 13.4%, and mode of 90.0% for the SIR method. In summary, cluster analysis of drug susceptibility tests is useful for the epidemiological analysis of MRSA.
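A minimal sketch of UPGMA clustering of SIR-coded susceptibility profiles, using hypothetical isolate profiles; UPGMA corresponds to 'average' linkage in SciPy:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# SIR coding per the abstract: S = 0, I = 2, R = 3; one column per drug.
# The five isolate profiles below are hypothetical.
profiles = np.array([
    [0, 0, 3, 0, 2, 0, 3],
    [0, 0, 3, 0, 2, 0, 3],
    [3, 3, 3, 0, 2, 0, 3],
    [0, 0, 0, 0, 0, 0, 0],
    [3, 3, 3, 3, 3, 2, 3],
])

# UPGMA = 'average' linkage; Euclidean distance between coded profiles
tree = linkage(profiles, method="average", metric="euclidean")

# Cut the dendrogram at a fixed distance (cf. the cut-off of 3.6 reported)
clusters = fcluster(tree, t=3.6, criterion="distance")
print("cluster assignments:", clusters)
```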
Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly
2015-12-18
This paper presents an important new approach to improving the timeliness of Total Petroleum Hydrocarbon (TPH) analysis in soil by Gas Chromatography - Flame Ionization Detector (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada. However, inter-laboratory application of this method for the analysis of TPH in soil has often shown considerable variability in the results. This could be due, in part, to the different gas chromatography (GC) conditions and other steps involved in the method, as well as the soil properties. In addition, there are differences in the interpretation of the GC results, which affects the determination of the effectiveness of remediation at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate the analytical method for a faster quantitative analysis of TPH in (contaminated) soil. A fractional factorial design (fFD) was used to screen six factors to identify those most significantly impacting the analysis: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min), and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compare favourably with the experimental results, with a percentage difference of 2.78% for the analysis time. This research successfully reduced the method's standard analytical time from 20 to 8 min with all the carbon fractions eluting, and the method was successfully applied to fast TPH analysis of Bunker C oil contaminated soil. A reduced analytical time offers many benefits, including improved laboratory reporting times and overall improved clean-up efficiency.
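A minimal sketch of constructing a two-level fractional factorial screen for six factors; the generator choice below is illustrative and not necessarily the design used in the paper:

```python
from itertools import product

# 2^(6-2) fractional factorial screen for six factors in coded units (-1/+1).
# Factor names match the abstract; the generators E = ABC and F = BCD are a
# common resolution-IV choice, assumed here for illustration.
factors = ["inj_volume", "inj_temp", "oven_ramp", "det_temp", "flow", "solvent"]

runs = []
for a, b, c, d in product([-1, 1], repeat=4):
    e = a * b * c  # generator E = ABC
    f = b * c * d  # generator F = BCD
    runs.append((a, b, c, d, e, f))

print(f"{len(runs)} runs instead of {2**6} for a full factorial")
for run in runs[:4]:
    print(dict(zip(factors, run)))
```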
A Comparison of Imputation Methods for Bayesian Factor Analysis Models
ERIC Educational Resources Information Center
Merkle, Edgar C.
2011-01-01
Imputation methods are popular for the handling of missing data in psychology. The methods generally consist of predicting missing data based on observed data, yielding a complete data set that is amiable to standard statistical analyses. In the context of Bayesian factor analysis, this article compares imputation under an unrestricted…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, Jaromy; Sun Zaijing; Wells, Doug
2009-03-10
Photon activation analysis detected elements in two NIST standards that did not have reported concentration values. A method is currently being developed to infer these concentrations by using scaling parameters and the appropriate known quantities within the NIST standard itself. Scaling parameters include threshold, peak, and endpoint energies; photo-nuclear cross sections for specific isotopes; the Bremsstrahlung spectrum; target thickness; and photon flux. Photo-nuclear cross sections and energies of the unknown elements must also be known. With these quantities, the same integral was performed for both the known and unknown elements, resulting in an inference of the concentration of the unreported element based on the reported value. Since Rb and Mn were reported in the standards, and because they had well-identified peaks, they were used as the standards of inference to determine concentrations of the unreported elements As, I, Nb, Y, and Zr. This method was tested by choosing other known elements within the standards and inferring a value based on the stated procedure. The reported value of Mn in the first NIST standard was 403±15 ppm, and the reported value of Ca in the second NIST standard was 87000 ppm (no reported uncertainty). The inferred concentrations were 370±23 ppm and 80200±8700 ppm, respectively.
Preliminary analysis techniques for ring and stringer stiffened cylindrical shells
NASA Technical Reports Server (NTRS)
Graham, J.
1993-01-01
This report outlines methods of analysis for the buckling of thin-walled circumferentially and longitudinally stiffened cylindrical shells. Methods of analysis for the various failure modes are presented in one cohesive package. Where applicable, more than one method of analysis for a failure mode is presented along with standard practices. The results of this report are primarily intended for use in launch vehicle design in the elastic range. A Microsoft Excel worksheet with accompanying macros has been developed to automate the analysis procedures.
Computing tools for implementing standards for single-case designs.
Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E
2015-11-01
In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse: Procedures and Standards Handbook (the WWC standards). These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were tested to be accurate. When missing data were present, most tools either gave an error message or conducted the analysis based on the available data; only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards.
Evaluating the risks of clinical research: direct comparative analysis.
Rid, Annette; Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S; Wendler, David
2014-09-01
Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed "risks of daily life" standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. This study employed a conceptual and normative analysis, and use of an illustrative example. Different risks are composed of the same basic elements: Type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the "risks of daily life" standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Direct comparative analysis is a systematic method for applying the "risks of daily life" standard for minimal risk to research procedures that pose the same types of risk as daily life activities. It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about research risks.
40 CFR 60.2120 - Affirmative defense for violation of emission standards during malfunction.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Standards of Performance for Commercial and Industrial Solid Waste Incineration Units Emission Limitations... malfunction event at issue. The analysis shall also specify, using best monitoring methods and engineering...
PPM mixtures of formaldehyde in gas cylinders: Stability and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, K.C.; Miller, S.B.; Patterson, L.M.
1999-07-01
Scott Specialty Gases has been successful in producing stable calibration gases of formaldehyde at low concentration. Critical to this success has been the development of a treatment process for high-pressure aluminum cylinders. Formaldehyde cylinders having concentrations of 20 ppm and 4 ppm were found to show only a small decline in concentration over a period of approximately 12 months. Since no NIST-traceable formaldehyde standards (or Standard Reference Material) are available, all of Scott's formaldehyde cylinders were originally certified by the traditional impinger method. This method involves an extremely tedious purification procedure for 2,4-dinitrophenylhydrazine (2,4-DNPH). A modified version of the impinger method has been developed that does not require extensive reagent purification for formaldehyde analysis. Extremely low formaldehyde blanks have been obtained with the modified method. The HPLC conditions of the original method were used for the chromatographic separations. The modified method results in a lower analytical uncertainty for the formaldehyde standard mixtures. Consequently, it is possible to discern small differences between analytical results that are important for stability studies.
Time-variant random interval natural frequency analysis of structures
NASA Astrophysics Data System (ADS)
Wu, Binhua; Wu, Di; Gao, Wei; Song, Chongmin
2018-02-01
This paper presents a new robust method, the unified interval Chebyshev-based random perturbation method, to tackle the hybrid random interval structural natural frequency problem. In the proposed approach, the random perturbation method is implemented to furnish the statistical features (i.e., mean and standard deviation), and the Chebyshev surrogate model strategy is incorporated to formulate the statistical information of natural frequency with regard to the interval inputs. The comprehensive analysis framework combines the strengths of both methods in a way that dramatically reduces computational cost. The presented method is thus capable of investigating the day-by-day time-variant natural frequency of structures accurately and efficiently under the intrinsic creep effect of concrete, with probabilistic and interval uncertain variables. The extreme bounds of the mean and standard deviation of natural frequency are captured through the optimization strategy embedded within the analysis procedure. Three numerical examples, progressively more complex in both structure type and uncertainty variables, are presented to demonstrate the applicability, accuracy, and efficiency of the proposed method.
40 CFR 98.324 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Tracking System Handbook Number: PH-08-V-1, January 1, 2008 (incorporated by reference, see § 98.7). You... paragraphs (d)(1) through (d)(2) of this section. (1) ASTM D1945-03, Standard Test Method for Analysis of... Reformed Gas by Gas Chromatography; ASTM D4891-89 (Reapproved 2006), Standard Test Method for Heating Value...
USDA-ARS?s Scientific Manuscript database
Consumption of resistant starch (RS) may lead to reduced glycemia, improved satiety, and beneficial changes in gut microbiota due to its unique digestive and absorptive properties. We developed a standardized protocol for preparation of potatoes in order to assess their RS content and modified a com...
Wort free amino nitrogen analysis adapted to a microplate format
USDA-ARS?s Scientific Manuscript database
The standard method for determining wort free amino nitrogen content calls for the use of test tubes and glass marbles, as well as boiling and 20°C water baths. In this paper we describe how the standard method can be updated and streamlined by replacing water baths, test tubes and marbles with a th...
A Critical Analysis of the Body of Work Method for Setting Cut-Scores
ERIC Educational Resources Information Center
Radwan, Nizam; Rogers, W. Todd
2006-01-01
The recent increase in the use of constructed-response items in educational assessment and the dissatisfaction with the nature of the decision that the judges must make using traditional standard-setting methods created a need to develop new and effective standard-setting procedures for tests that include both multiple-choice and…
ERIC Educational Resources Information Center
Swars, Susan Lee; Chestnutt, Cliff
2016-01-01
This mixed methods study explored elementary teachers' (n = 73) experiences with and perspectives on the recently implemented Common Core State Standards for Mathematics (CCSS-Mathematics) at a high-needs, urban school. Analysis of the survey, questionnaire, and interview data reveals the findings cluster around: familiarity with and preparation…
ERIC Educational Resources Information Center
Lee, Jaekyung
2010-01-01
This study examines potential consequences of the discrepancies between national and state performance standards for school funding in Kentucky and Maine. Applying the successful schools observation method and cost function analysis method to integrated data-sets that match schools' eighth-grade mathematics test performance measures to district…
USDA-ARS?s Scientific Manuscript database
An ‘extract-filter-shoot’ method for analysis of vitamin D2, ergocalciferol, in a dry powdered dietary supplement capsule containing rice flour excipient and in National Institute of Standards and Technology (NIST) standard reference material (SRM) 3280 is reported. Quantification of vitamin D2 was...
Analysis of standard reference materials by absolute INAA
NASA Astrophysics Data System (ADS)
Heft, R. E.; Koszykowski, R. F.
1981-07-01
Three standard reference materials (flyash, soil, and AISI 4340 steel) were analyzed by a method of absolute instrumental neutron activation analysis. Two different light-water pool-type reactors were used and produced equivalent analytical results, even though the epithermal-to-thermal flux ratio in one reactor was higher than that in the other by a factor of two.
The method for extracting and preparing a dermal or surface wipe sample for analysis of acidic persistent organic pollutants is summarized in this standard operating procedure. It covers the extraction and concentration of samples that are to be analyzed by gas chromatography/mas...
Internal Standards: A Source of Analytical Bias For Volatile Organic Analyte Determinations
The use of internal standards in the determination of volatile organic compounds as described in SW-846 Method 8260C introduces a potential for bias in results once the internal standards (ISTDs) are added to a sample for analysis. The bias is relative to the dissimilarity betw...
Wilson, S.A.; Ridley, W.I.; Koenig, A.E.
2002-01-01
The requirements of standard materials for LA-ICP-MS analysis have been difficult to meet for the determination of trace elements in sulfides. We describe a method for the production of synthetic sulfides by precipitation from solution. The method is detailed by the production of approximately 200 g of a material, PS-1, with a suite of chalcophilic trace elements in an Fe-Zn-Cu-S matrix. Preliminary composition data, together with an evaluation of the homogeneity for individual elements, suggest that this type of material meets the requirements for a sulfide calibration standard that allows for quantitative analysis. Contamination of the standard with Na suggests that H2S gas may prove a better sulfur source for future experiments. We recommend that calibration data be collected in whatever mode is closest to that employed for the analysis of the unknown material, because of variable fractionation effects as a function of analytical mode. For instance, if individual spot analyses are attempted on an unknown sample, then a raster of several individual spot analyses, not a continuous scan, should be collected and averaged for the standard. Hg and Au are exceptions to the above, and calibration data for them should always be collected in a scanning mode. Au is more heterogeneously distributed than other trace metals, and large-area scans are required to provide an average value for calibration purposes. We emphasize that the values given in Table 1 are preliminary values. Further chemical characterization of this standard, through a round-robin analysis program, will allow the USGS to provide both certified and recommended values for individual elements. The USGS has developed PS-1 as a potential new LA-ICP-MS standard for use by the analytical community, and requests for this material should be addressed to S. Wilson. However, it is stressed that an important aspect of the method described here is the flexibility for individual investigators to produce sulfides with a wide range of trace metals in variable matrices. For example, PS-1 is not well suited to the analysis of galena, and it would be relatively straightforward for other standards to be developed with Pb present in the matrix as a major constituent. These standards can be made easily and cheaply in a standard wet chemistry laboratory using equipment and chemicals that are readily available.
Dong, Shuya; He, Jiao; Hou, Huiping; Shuai, Yaping; Wang, Qi; Yang, Wenling; Sun, Zheng; Li, Qing; Bi, Kaishun; Liu, Ran
2017-12-01
A novel, improved, and comprehensive method for quality evaluation and discrimination of Herba Leonuri has been developed and validated based on normal- and reversed-phase chromatographic methods. To identify Herba Leonuri, normal- and reversed-phase high-performance thin-layer chromatography fingerprints were obtained by comparing the colors and Rf values of the bands, and reversed-phase high-performance liquid chromatography fingerprints were obtained by using an Agilent Poroshell 120 SB-C18 within 28 min. By similarity analysis and hierarchical clustering analysis, we show that there are similar chromatographic patterns in Herba Leonuri samples, but significant differences in counterfeits and variants. To quantify the bio-active components of Herba Leonuri, reversed-phase high-performance liquid chromatography was performed to analyze syringate, leonurine, quercetin-3-O-robiniaglycoside, hyperoside, rutin, isoquercitrin, wogonin, and genkwanin simultaneously by the single-standard-to-determine-multi-components method, with rutin as the internal standard. Meanwhile, normal-phase high-performance liquid chromatography was performed by using an Agilent ZORBAX HILIC Plus within 6 min to determine trigonelline and stachydrine, using trigonelline as the internal standard. Innovatively, among these compounds, the bio-active components quercetin-3-O-robiniaglycoside and trigonelline were determined in Herba Leonuri for the first time. In general, the method integrating multi-chromatographic analyses offers an efficient way for the standardization and identification of Herba Leonuri. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Sastre Toraño, J; van Hattum, S H
2001-10-01
A new method is presented for the quantitative analysis of compounds in pharmaceutical preparations using Fourier transform (FT) mid-infrared (MIR) spectroscopy with an attenuated total reflection (ATR) module. Reduction of the quantity of overlapping absorption bands, by interaction of the compound of interest with an appropriate solvent, and the employment of an internal standard (IS) make MIR suitable for quantitative analysis. Vigabatrin, as the active compound in vigabatrin 100-mg capsules, was used as a model compound for the development of the method. Vigabatrin was extracted from the capsule content with water after addition of a sodium thiosulfate IS solution. The extract was concentrated by volume reduction and applied to the FTMIR-ATR module. Concentrations of unknown samples were calculated from the ratio of the vigabatrin band area (1321-1610 cm(-1)) to the IS band area (883-1215 cm(-1)) using a calibration standard. The ratio of the area of the vigabatrin peak to that of the IS was linear with concentration in the range of interest (90-110 mg, in twofold; n=2). The accuracy of the method in this range was 99.7-100.5% (n=5) with a variability of 0.4-1.3% (n=5). A comparison of the presented method with an HPLC assay showed similar results; the analysis of five vigabatrin 100-mg capsules resulted in a mean concentration of 102 mg with a variation of 2% with both methods.
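The quantitation step described here reduces to a single-point internal-standard ratio. A minimal sketch under that reading (Python; the band areas and the calibration ratio are invented):

    # Single-point internal-standard calibration: the analyte/IS band-area
    # ratio of the unknown is compared with the ratio measured for a
    # calibration standard of known content. All numbers are invented.
    def quantify(area_analyte, area_is, ratio_cal, amount_cal):
        """amount = (sample ratio / calibration ratio) * calibration amount"""
        return (area_analyte / area_is) / ratio_cal * amount_cal

    ratio_cal = 1.25   # vigabatrin/IS band-area ratio of a 100-mg standard
    print(quantify(12.8, 10.0, ratio_cal, 100.0), "mg per capsule")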
Luce, T. C.; Petty, C. C.; Meyer, W. H.; ...
2016-11-02
An approximate method to correct motional Stark effect (MSE) spectroscopy for the effects of intrinsic plasma electric fields has been developed. The motivation for using an approximate method is to incorporate electric field effects for between-pulse or real-time analysis of the current density or safety factor profile. The toroidal velocity term in the momentum balance equation is normally the dominant contribution to the electric field orthogonal to the flux surface over most of the plasma. When this approximation is valid, the correction to the MSE data can be included in a form like that used when electric field effects are neglected. This allows measurements of the toroidal velocity to be integrated into the interpretation of the MSE polarization angles without changing how the data are treated in existing codes. In some cases, such as the DIII-D system, the correction is especially simple, due to the details of the neutral beam and MSE viewing geometry. The correction method is compared, using DIII-D data in a variety of plasma conditions, to analysis that assumes no radial electric field is present and to analysis that uses the standard correction method, which involves significant human intervention for profile fitting. The comparison shows that the new correction method is close to the standard one, and in all cases appears to offer a better result than use of the uncorrected data. Lastly, the method has been integrated into the standard DIII-D equilibrium reconstruction code in use for analysis between plasma pulses and is sufficiently fast that it will be implemented in real-time equilibrium analysis for control applications.
Standardization of Spore Inactivation Method for PMA-PhyloChip Analysis
NASA Technical Reports Server (NTRS)
Schrader, Michael
2011-01-01
In compliance with the Committee on Space Research (COSPAR) planetary protection policy, the National Aeronautics and Space Administration (NASA) monitors the total microbial burden of spacecraft as a means of minimizing the inadvertent transfer of viable contaminant microorganisms to extraterrestrial environments (forward contamination). NASA standard assay-based counts are used as a proxy for relative surface cleanliness, to estimate overall microbial burden, and to assess whether forward planetary protection risk criteria are met for a given mission; these criteria vary with the planetary body to be explored and with whether the mission involves life detection. Despite efforts to reduce the presence of microorganisms on spacecraft prior to launch, microbes have been isolated from spacecraft and associated surfaces within the extreme conditions of clean room facilities using state-of-the-art molecular technologies. Development of a more sensitive method that better enumerates all viable microorganisms on spacecraft and associated surfaces could support future life detection missions. Current culture-based (NASA standard spore assay) and nucleic-acid-based polymerase chain reaction (PCR) methods have significant shortcomings in this type of analysis. The overall goal of this project is to evaluate and validate a new molecular method based on the use of the deoxyribonucleic acid (DNA) intercalating agent propidium monoazide (PMA), in combination with a DNA microarray (PhyloChip) which has been shown to identify very low levels of organisms on spacecraft-associated surfaces. PMA can only penetrate the membrane of dead cells. Once inside, it intercalates into the DNA and, upon photolysis with visible light, produces stable DNA monoadducts, rendering the DNA unavailable for further PCR analysis. The specific aim of this study is to standardize the spore inactivation method for PMA-PhyloChip analysis. We used spores of Bacillus subtilis 168 (a standard laboratory isolate) as the test organism.
Wang, Chunyan; Zhang, Wenyan; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying
2012-05-01
The analysis of amino acids by electrospray-ionization tandem mass spectrometry, with butyl esterification and an isotopically labeled internal standard, is routine in newborn screening laboratories worldwide. In the present study, we established a direct analysis method of higher accuracy that uses a non-deuterated internal standard. The automatic sampler and the pump of an LC apparatus were used to inject the sample and mobile phase into the MS, but no LC column was needed. The dried blood spot (DBS) material was prepared at low, medium and high concentration levels; the running time was 1 min. In parallel to the new procedure, we applied the established method to analyze nine amino acids on DBS of healthy newborns and phenylketonuria newborns. The newly proposed method, combining a product-ion confirmation scan with multiple reaction monitoring, resulted in very accurate identification of each amino acid. Our innovative protocol had high sensitivity and specificity in the analysis of cases of suspected metabolic diseases.
NASA Astrophysics Data System (ADS)
Tang, Xiaoxing; Qian, Yuan; Guo, Yanchuan; Wei, Nannan; Li, Yulan; Yao, Jian; Wang, Guanghua; Ma, Jifei; Liu, Wei
2017-12-01
An improved method is presented for analyzing atmospheric pollutant metals (Be, Mn, Fe, Co, Ni, Cu, Zn, Se, Sr, Cd, and Pb) by laser ablation inductively coupled plasma mass spectrometry. In this method, solid standards are prepared by depositing droplets of aqueous standard solutions on the surface of a membrane filter of the same type as used for collecting atmospheric pollutant metals. Laser parameters were optimized, and the ablation behavior of the filter discs was studied. Radial line scans across the filter disc proved a representative ablation strategy that avoids errors from inhomogeneous filter standards and from the marginal effect of the filter disc. Pt, as the internal standard, greatly improved the correlation coefficient of the calibration curve. The developed method provides low detection limits, from 0.01 ng m⁻³ for Be and Co to 1.92 ng m⁻³ for Fe. It was successfully applied to the determination of atmospheric pollutant metals collected in Lhasa, China. The analytical results showed good agreement with those obtained by conventional liquid analysis. In contrast to the conventional acid digestion procedure, the novel method not only greatly reduces sample preparation and shortens the analysis time but also provides a possible means of studying the spatial distribution of atmospheric filter samples.
Comparison of scoring approaches for the NEI VFQ-25 in low vision.
Dougherty, Bradley E; Bullimore, Mark A
2010-08-01
The aim of this study was to evaluate different approaches to scoring the National Eye Institute Visual Functioning Questionnaire-25 (NEI VFQ-25) in patients with low vision, including scoring by the standard method, by Rasch analysis, and by use of an algorithm created by Massof to approximate the Rasch person measure. Subscale validity and use of a 7-item short-form instrument proposed by Ryan et al. were also investigated. NEI VFQ-25 data from 50 patients with low vision were analyzed using the standard method of summing Likert-type scores and calculating an overall average, Rasch analysis using Winsteps software, and the Massof algorithm in Excel. Correlations between scores were calculated. Rasch person separation reliability and other indicators were calculated to determine the validity of the subscales and of the 7-item instrument. Scores calculated using all three methods were highly correlated, but evidence of floor and ceiling effects was found with the standard scoring method. None of the subscales investigated proved valid. The 7-item instrument showed acceptable person separation reliability and good targeting and item performance. Although standard scores and Rasch scores are highly correlated, Rasch analysis has the advantages of eliminating floor and ceiling effects and producing interval-scaled data. The Massof algorithm for approximation of the Rasch person measure performed well in this group of low-vision patients. The validity of the subscales of the VFQ-25 should be reconsidered.
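For context, the "standard method" of scoring referred to here is conventionally a rescaling of each Likert item to a 0-100 scale followed by averaging. A minimal sketch with invented responses (the recoding direction is an assumption, not taken from this paper):

    import numpy as np

    def vfq25_standard_score(responses, n_levels):
        """Rescale each Likert response to 0-100 (higher = better function)
        and average, i.e. the 'standard method' referred to above."""
        r = np.asarray(responses, dtype=float)
        k = np.asarray(n_levels, dtype=float)
        rescaled = 100.0 * (k - r) / (k - 1.0)   # response 1 -> 100, worst -> 0
        return np.nanmean(rescaled)              # missing items are skipped

    # Three invented items: two 5-level items and one 6-level item.
    print(vfq25_standard_score([2, 1, 4], [5, 5, 6]))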
A simple method for plasma total vitamin C analysis suitable for routine clinical laboratory use.
Robitaille, Line; Hoffer, L John
2016-04-21
In-hospital hypovitaminosis C is highly prevalent but almost completely unrecognized. Medical awareness of this potentially important disorder is hindered by the inability of most hospital laboratories to determine plasma vitamin C concentrations. The availability of a simple, reliable method for analyzing plasma vitamin C could increase opportunities for routine plasma vitamin C analysis in clinical medicine. Plasma vitamin C can be analyzed by high performance liquid chromatography (HPLC) with electrochemical (EC) or ultraviolet (UV) light detection. We modified existing UV-HPLC methods for plasma total vitamin C analysis (the sum of ascorbic and dehydroascorbic acid) to develop a simple, constant-low-pH sample reduction procedure followed by isocratic reverse-phase HPLC separation using a purely aqueous low-pH non-buffered mobile phase. Although EC-HPLC is widely recommended over UV-HPLC for plasma total vitamin C analysis, the two methods have never been directly compared. We formally compared the simplified UV-HPLC method with EC-HPLC in 80 consecutive clinical samples. The simplified UV-HPLC method was less expensive, easier to set up, required fewer reagents and no pH adjustments, and demonstrated greater sample stability than many existing methods for plasma vitamin C analysis. When compared with the gold-standard EC-HPLC method in 80 consecutive clinical samples exhibiting a wide range of plasma vitamin C concentrations, it performed equivalently. The easy set up, simplicity and sensitivity of the plasma vitamin C analysis method described here could make it practical in a normally equipped hospital laboratory. Unlike any prior UV-HPLC method for plasma total vitamin C analysis, it was rigorously compared with the gold-standard EC-HPLC method and performed equivalently. Adoption of this method could increase the availability of plasma vitamin C analysis in clinical medicine.
Methods proposed to achieve air quality standards for mobile sources and technology surveillance.
Piver, W T
1975-01-01
The methods proposed to meet the 1975 standards of the Clean Air Act for mobile sources are alternative antiknocks, exhaust emission control devices, and alternative engine designs. Technology surveillance analysis applied to this situation is an attempt to anticipate potential public and environmental health problems from these methods before they happen. Components of this analysis are exhaust emission characterization, environmental transport and transformation, levels of public and environmental exposure, and the influence of economics on the selection of alternative methods. The purpose of this presentation is to show trends resulting from the interaction of these different components. These trends cannot be interpreted as explicit predictions of what will actually happen. Such an analysis is necessary so that public and environmental health officials have the opportunity to act on potential problems before they become manifest. PMID:50944
An approach for quantitative image quality analysis for CT
NASA Astrophysics Data System (ADS)
Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe
2016-03-01
An objective and standardized approach to assessing the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates PCA components with sparse loadings, used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
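A minimal sketch of a PCA plus Hotelling T2 fault-detection statistic of the general kind described (this is the standard textbook construction, not the authors' modified SPCA; the data are synthetic):

    import numpy as np

    def hotelling_t2(X, n_comp):
        """Hotelling T^2 of each row of X computed on the leading
        n_comp principal components (a standard PCA fault statistic)."""
        Xc = X - X.mean(axis=0)
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        scores = Xc @ Vt[:n_comp].T                  # PCA scores
        lam = (s[:n_comp] ** 2) / (len(X) - 1)       # component variances
        return np.sum(scores ** 2 / lam, axis=1)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 88))      # 40 scans x 88 image-quality metrics
    X[-1] += 3.0                       # one deliberately out-of-family scan
    t2 = hotelling_t2(X, n_comp=5)
    print(t2.argmax())                 # flags the perturbed scan (row 39)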
Analysis of low levels of rare earths by radiochemical neutron activation analysis
Wandless, G.A.; Morgan, J.W.
1985-01-01
A procedure for the radiochemical neutron-activation analysis of the rare earth elements (REE) involves the separation of the REE as a group by rapid ion-exchange methods and determination of yields by reactivation or by energy-dispersive X-ray fluorescence (EDXRF) spectrometry. The U.S. Geological Survey (USGS) standard rocks BCR-1 and AGV-1 were analyzed to determine the precision and accuracy of the method. We found that the precision was ±5-10% on the basis of replicate analysis and that, in general, the accuracy was within ±5% of accepted values for most REE. Data for USGS standard rocks BIR-1 (Icelandic basalt) and DNC-1 (North Carolina diabase) are also presented. © 1985 Akadémiai Kiadó.
Garbarino, John R.; Taylor, Howard E.
1987-01-01
Inductively coupled plasma mass spectrometry is employed in the determination of Ni, Cu, Sr, Cd, Ba, Ti, and Pb in nonsaline natural water samples by stable isotope dilution analysis. Hydrologic samples were directly analyzed without any unusual pretreatment. Interference effects related to overlapping isobars, formation of metal oxide and multiply charged ions, and matrix composition were identified, and suitable methods of correction were evaluated. A comparability study showed that single-element isotope dilution analysis was only marginally better than sequential multielement isotope dilution analysis. Accuracy and precision of the single-element method were determined on the basis of results obtained for standard reference materials. The instrumental technique was shown to be ideally suited for programs associated with certification of standard reference materials.
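For context, the isotope-dilution working equation (given here in its standard textbook form, not quoted from this paper) follows from the measured isotope-amount ratio R_m of isotopes a and b in the sample-spike blend:

    R_m = \frac{N_x x_a + N_s s_a}{N_x x_b + N_s s_b}
    \quad\Longrightarrow\quad
    N_x = N_s \,\frac{s_a - R_m s_b}{R_m x_b - x_a}

where N_x and N_s are the amounts of analyte and spike, and x and s are their respective isotopic abundances of isotopes a and b. The analyte amount thus follows from the spike amount and one measured ratio, which is what makes the technique well suited to certification work.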
ERIC Educational Resources Information Center
Sforza, Dario; Tienken, Christopher H.; Kim, Eunyoung
2016-01-01
The creators and supporters of the Common Core State Standards claim that the Standards require greater emphasis on higher-order thinking than previous state standards in mathematics and English language arts. We used a qualitative case study design with content analysis methods to test the claim. We compared the levels of thinking required by the…
Using Robust Standard Errors to Combine Multiple Regression Estimates with Meta-Analysis
ERIC Educational Resources Information Center
Williams, Ryan T.
2012-01-01
Combining multiple regression estimates with meta-analysis has continued to be a difficult task. A variety of methods have been proposed and used to combine multiple regression slope estimates with meta-analysis; however, most of these methods have serious methodological and practical limitations. The purpose of this study was to explore the use…
Living systematic reviews: 3. Statistical methods for updating meta-analyses.
Simmonds, Mark; Salanti, Georgia; McKenzie, Joanne; Elliott, Julian
2017-11-01
A living systematic review (LSR) should keep the review current as new research evidence emerges. Any meta-analyses included in the review will also need updating as new material is identified. If the aim of the review is solely to present the best current evidence, standard meta-analysis may be sufficient, provided reviewers are aware that results may change at later updates. If the review is used in a decision-making context, more caution may be needed. When using standard meta-analysis methods, the chance of incorrectly concluding that any updated meta-analysis is statistically significant when there is no effect (the type I error) increases rapidly as more updates are performed. Inaccurate estimation of any heterogeneity across studies may also lead to inappropriate conclusions. This paper considers four methods to avoid some of these statistical problems when updating meta-analyses: two methods, the law of the iterated logarithm and the Shuster method, control primarily for inflation of type I error; two others, trial sequential analysis and sequential meta-analysis, control for both type I and type II errors (failing to detect a genuine effect) and take account of heterogeneity. This paper compares the methods and considers how they could be applied to LSRs. Copyright © 2017 Elsevier Inc. All rights reserved.
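The type I error inflation from naive repeated testing is easy to reproduce by simulation. A sketch (Python, synthetic data, fixed-effect inverse-variance pooling) that illustrates the problem the four methods above are designed to control, not the methods themselves:

    import numpy as np

    rng = np.random.default_rng(1)
    n_reviews, n_updates, n_per_study = 5000, 10, 50
    false_pos = 0

    for _ in range(n_reviews):
        effects, ses = [], []
        for _ in range(n_updates):                  # one new trial per update
            x = rng.normal(0.0, 1.0, n_per_study)   # true effect is zero
            effects.append(x.mean())
            ses.append(x.std(ddof=1) / np.sqrt(n_per_study))
            w = 1.0 / np.square(ses)                # fixed-effect weights
            pooled = np.dot(w, effects) / w.sum()
            z = pooled * np.sqrt(w.sum())
            if abs(z) > 1.96:                       # naive test at every update
                false_pos += 1
                break

    print("type I error over 10 updates: %.3f (nominal 0.05 per look)"
          % (false_pos / n_reviews))

Run as written, the fraction of reviews that are "significant" at least once across the ten looks comes out well above the nominal 5%, which is exactly the inflation the sequential methods are meant to curb.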
Evaluation of a High Throughput Starch Analysis Optimised for Wood
Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco
2014-01-01
Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples, which have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), and suitable for high-throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes. PMID:24523863
NASA Technical Reports Server (NTRS)
1984-01-01
Standardized methods are established for screening of JAN B microcircuits and JANTXV semiconductor components for space mission or other critical applications when JAN S devices are not available. General specifications are provided which outline the DPA (destructive physical analysis), environmental, electrical, and data requirements for screening of various component technologies. This standard was developed for Air Force Space Division, and is available for use by other DOD agencies, NASA, and space systems contractors for establishing common screening methods for electronic components.
Nonclinical dose formulation analysis method validation and sample analysis.
Whitmire, Monica Lee; Bryan, Peter; Henry, Teresa R; Holbrook, John; Lehmann, Paul; Mollitor, Thomas; Ohorodnik, Susan; Reed, David; Wietgrefe, Holly D
2010-12-01
Nonclinical dose formulation analysis methods are used to confirm test article concentration and homogeneity in formulations and to determine formulation stability in support of regulated nonclinical studies. There is currently no regulatory guidance for nonclinical dose formulation analysis method validation or sample analysis. Regulatory guidance for the validation of analytical procedures has been developed for drug product/formulation testing; however, verification of formulation concentrations falls under the framework of GLP regulations (not GMP). The only current related regulatory guidance is the bioanalytical guidance for method validation. The fundamental parameters that overlap between bioanalysis and formulation analysis validations include recovery, accuracy, precision, specificity, selectivity, carryover, sensitivity, and stability. Divergence between bioanalytical and drug product validations typically centers on the acceptance criteria used. Because dose formulation samples are not true "unknowns", the concept of quality control samples covering the entire range of the standard curve, serving as the indication of confidence in the data generated from the "unknown" study samples, may not always be necessary. Also, the standard bioanalytical acceptance criteria may not be directly applicable, especially when the determined concentration does not match the target concentration. This paper attempts to reconcile the different practices being performed in the community and to provide recommendations of best practices and proposed acceptance criteria for nonclinical dose formulation method validation and sample analysis.
Toward the Standardization of Biochar Analysis: The COST Action TD1107 Interlaboratory Comparison.
Bachmann, Hans Jörg; Bucheli, Thomas D; Dieguez-Alonso, Alba; Fabbri, Daniele; Knicker, Heike; Schmidt, Hans-Peter; Ulbricht, Axel; Becker, Roland; Buscaroli, Alessandro; Buerge, Diane; Cross, Andrew; Dickinson, Dane; Enders, Akio; Esteves, Valdemar I; Evangelou, Michael W H; Fellet, Guido; Friedrich, Kevin; Gasco Guerrero, Gabriel; Glaser, Bruno; Hanke, Ulrich M; Hanley, Kelly; Hilber, Isabel; Kalderis, Dimitrios; Leifeld, Jens; Masek, Ondrej; Mumme, Jan; Carmona, Marina Paneque; Calvelo Pereira, Roberto; Rees, Frederic; Rombolà, Alessandro G; de la Rosa, José Maria; Sakrabani, Ruben; Sohi, Saran; Soja, Gerhard; Valagussa, Massimo; Verheijen, Frank; Zehetner, Franz
2016-01-20
Biochar produced by pyrolysis of organic residues is increasingly used for soil amendment and many other applications. However, analytical methods for its physical and chemical characterization are yet far from being specifically adapted, optimized, and standardized. Therefore, COST Action TD1107 conducted an interlaboratory comparison in which 22 laboratories from 12 countries analyzed three different types of biochar for 38 physical-chemical parameters (macro- and microelements, heavy metals, polycyclic aromatic hydrocarbons, pH, electrical conductivity, and specific surface area) with their preferential methods. The data were evaluated in detail using professional interlaboratory testing software. Whereas intralaboratory repeatability was generally good or at least acceptable, interlaboratory reproducibility was mostly not (20% < mean reproducibility standard deviation < 460%). This paper contributes to better comparability of biochar data published already and provides recommendations to improve and harmonize specific methods for biochar analysis in the future.
Reshadat, S; Saedi, S; Zangeneh, A; Ghasemi, S R; Gilan, N R; Karbasi, A; Bavandpoor, E
2015-09-08
Geographic information systems (GIS) analysis has not been widely used in underdeveloped countries to ensure that vulnerable populations have access to primary health-care services. This study applied GIS methods to analyse the spatial accessibility to urban primary-care centres of the population of Kermanshah city, Islamic Republic of Iran, by age and sex groups. In a descriptive-analytical study over 3 time periods, network analysis, mean centre and standard distance methods were applied using ArcGIS 9.3. The analysis was based on a standard radius of 750 m distance from health centres, a walking speed of 1 m/s and a desired access time to health centres of 12.5 min. The proportion of the population with inadequate geographical access to health centres rose from 47.3% in 1997 to 58.4% in 2012. The mean centre and standard distance mapping showed that the spatial distribution of health centres in Kermanshah needed to be adjusted to changes in population distribution.
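The three access parameters are mutually consistent: 750 m at a walking speed of 1 m/s takes 750 s, i.e. 12.5 min. A trivial sketch of the resulting coverage test (Python; the network distances are invented):

    # 750 m at 1 m/s is 750 s = 12.5 min, the desired access time above.
    SPEED_M_S, MAX_MIN = 1.0, 12.5

    def minutes_to_centre(network_distance_m):
        return network_distance_m / SPEED_M_S / 60.0

    distances = [320, 740, 980, 1500]   # network metres to nearest centre
    covered = [d for d in distances if minutes_to_centre(d) <= MAX_MIN]
    print("adequate access: %d of %d" % (len(covered), len(distances)))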
Characterization of Triaxial Braided Composite Material Properties for Impact Simulation
NASA Technical Reports Server (NTRS)
Roberts, Gary D.; Goldberg, Robert K.; Biniendak, Wieslaw K.; Arnold, William A.; Littell, Justin D.; Kohlman, Lee W.
2009-01-01
The reliability of impact simulations for aircraft components made with triaxial braided carbon fiber composites is currently limited by inadequate material property data and a lack of validated material models for analysis. Improvements to standard quasi-static test methods are needed to account for the large unit cell size and localized damage within the unit cell. The deformation and damage of a triaxial braided composite material were examined using standard quasi-static in-plane tension, compression, and shear tests. Some modifications to standard test specimen geometries are suggested, and methods for measuring the local strain at the onset of failure within the braid unit cell are presented. Deformation and damage at higher strain rates are examined using ballistic impact tests on 61- by 61-cm by 3.2-mm (24- by 24- by 0.125-in.) composite panels. Digital image correlation techniques were used to examine full-field deformation and damage during both quasi-static and impact tests. An impact analysis method is presented that utilizes both local and global deformation and failure information from the quasi-static tests as input for impact simulations. Improvements needed in test and analysis methods for better predictive capability are examined.
Slavin, Margaret; Yu, Liangli Lucy
2012-12-15
A saponification/extraction procedure and a high performance liquid chromatography (HPLC) analysis method were developed and validated for simultaneous analysis of phytosterols, tocopherols and lutein (a carotenoid) in soybeans. Separation was achieved on a phenyl column with a ternary, isocratic solvent system of acetonitrile, methanol and water (48:22.5:29.5, v/v/v). Evaporative light scattering detection (ELSD) was used to quantify β-sitosterol, stigmasterol, campesterol, and α-, δ- and γ-tocopherols, while lutein was quantified with visible light absorption at 450 nm. Peak identification was verified by retention times and spikes with external standards. Standard curves were constructed (R² > 0.99) to allow for sample quantification. Recovery of the saponification and extraction was demonstrated via analysis of spiked samples. Also, the accuracy of results for four soybeans using the described saponification and HPLC analytical method was validated against existing methods. This method offers a more efficient alternative to individual methods for quantifying lutein, tocopherols and sterols in soybeans. Copyright © 2012 Elsevier Ltd. All rights reserved.
Non-standard analysis and embedded software
NASA Technical Reports Server (NTRS)
Platek, Richard
1995-01-01
One model for computing in the future is ubiquitous, embedded computational devices analogous to embedded electrical motors. Many of these computers will control physical objects and processes. Such hidden computerized environments introduce new safety and correctness concerns whose treatment goes beyond present Formal Methods. In particular, one has to begin to speak about Real Space software in analogy with Real Time software. By this we mean computerized systems that have to meet requirements expressed in the real geometry of space. How to translate such requirements into ordinary software specifications and how to carry out proofs is a major challenge. In this talk we propose a research program based on the use of non-standard analysis. Much detail remains to be worked out. The purpose of the talk is to inform the Formal Methods community that Non-Standard Analysis provides a possible avenue of attack which we believe will be fruitful.
Dettmer, Katja; Stevens, Axel P; Fagerer, Stephan R; Kaspar, Hannelore; Oefner, Peter J
2012-01-01
Two mass spectrometry-based methods for the quantitative analysis of free amino acids are described. The first method uses propyl chloroformate/propanol derivatization and gas chromatography-quadrupole mass spectrometry (GC-qMS) analysis in single-ion monitoring mode. Derivatization is carried out directly in aqueous samples, thereby allowing automation of the entire procedure, including addition of reagents, extraction, and injection into the GC-MS. The method delivers the quantification of 26 amino acids. The isobaric tagging for relative and absolute quantification (iTRAQ) method employs the labeling of amino acids with isobaric iTRAQ tags. The tags contain two different cleavable reporter ions, one for the sample and one for the standard, which are detected by fragmentation in a tandem mass spectrometer. Reversed-phase liquid chromatography of the labeled amino acids is performed prior to mass spectrometric analysis to separate isobaric amino acids. The commercial iTRAQ kit allows for the analysis of 42 physiological amino acids with a respective isotope-labeled standard for each of these 42 amino acids.
Analysis of street drugs in seized material without primary reference standards.
Laks, Suvi; Pelander, Anna; Vuori, Erkki; Ali-Tolppa, Elisa; Sippola, Erkki; Ojanperä, Ilkka
2004-12-15
A novel approach was used to analyze street drugs in seized material without primary reference standards. Identification was performed by liquid chromatography/time-of-flight mass spectrometry (LC/TOFMS), essentially based on accurate mass determination using a target library of 735 exact monoisotopic masses. Quantification was carried out by liquid chromatography/chemiluminescence nitrogen detection (LC/CLND) with a single secondary standard (caffeine), utilizing the detector's equimolar response to nitrogen. Sample preparation comprised dilution, first with methanol and further with the LC mobile phase. Altogether 21 seized drug samples were analyzed blind by the present method, and results were compared to accredited reference methods utilizing identification by gas chromatography/mass spectrometry and quantification by gas chromatography or liquid chromatography. The 31 drug findings by LC/TOFMS comprised 19 different drugs-of-abuse, byproducts, and adulterants, including amphetamine and tryptamine designer drugs, with one unresolved pair of compounds having an identical mass. By the reference methods, 27 findings could be confirmed, and among the four unconfirmed findings, only 1 apparent false positive was found. In the quantitative analysis of 11 amphetamine, heroin, and cocaine findings, mean relative difference between the results of LC/CLND and the reference methods was 11% (range 4.2-21%), without any observable bias. Mean relative standard deviation for three parallel LC/CLND results was 6%. Results suggest that the present combination of LC/TOFMS and LC/CLND offers a simple solution for the analysis of scheduled and designer drugs in seized material, independent of the availability of primary reference standards.
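A minimal sketch of the equimolar-nitrogen quantification step, assuming CLND response is proportional to moles of nitrogen and caffeine (C8H10N4O2, MW 194.19, four N atoms) is the sole secondary standard; the peak areas are invented:

    # CLND response is (to first order) proportional to moles of nitrogen,
    # so one caffeine standard can quantify any N-containing analyte.
    CAF_MW, CAF_N = 194.19, 4            # caffeine C8H10N4O2

    def conc_mg_per_ml(area, area_caf, caf_mg_per_ml, mw, n_atoms):
        k = area_caf / (caf_mg_per_ml / CAF_MW * CAF_N)   # area per mmol N/mL
        mmol_n_per_ml = area / k
        return mmol_n_per_ml / n_atoms * mw

    # Invented peak areas; amphetamine is C9H13N (MW 135.21, one N atom).
    print(conc_mg_per_ml(area=5.2e4, area_caf=9.8e4,
                         caf_mg_per_ml=0.10, mw=135.21, n_atoms=1))

The design appeal reported above is exactly this: once the nitrogen response is calibrated with caffeine, no primary reference standard of the seized drug itself is needed.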
Wang, Chao-Qun; Jia, Xiu-Hong; Zhu, Shu; Komatsu, Katsuko; Wang, Xuan; Cai, Shao-Qing
2015-03-01
A new quantitative analysis of multi-components by single marker (QAMS) method for 11 saponins (ginsenosides Rg1, Rb1, Rg2, Rh1, Rf, Re and Rd; notoginsenosides R1, R4, Fa and K) in notoginseng was established, in which 6 of these saponins were individually used as internal referring substances to investigate the influences of chemical structure, concentrations of quantitative components, and purities of the standard substances on the accuracy of the QAMS method. The results showed that the concentration of the analyte in the sample solution was the major influencing parameter, whereas the other parameters had minimal influence on the accuracy of the QAMS method. A new method for calculating the relative correction factors by linear regression was established (the linear regression method), which was demonstrated to decrease the differences between the QAMS method and the external standard method from 1.20% ± 0.02% - 23.29% ± 3.23% to 0.10% ± 0.09% - 8.84% ± 2.85% in comparison with the previous method. The differences between the external standard method and the QAMS method using relative correction factors calculated by the linear regression method were below 5% in the quantitative determination of Rg1, Re, R1, Rd and Fa in 24 notoginseng samples and of Rb1 in 21 notoginseng samples, and were mostly below 10% in the quantitative determination of Rf, Rg2, R4 and N-K in all 24 notoginseng samples (the differences for these 4 constituents were bigger because their contents were lower). The results indicated that the contents assayed by the new QAMS method could be considered as accurate as those assayed by the external standard method. In addition, a method for determining the applicable concentration ranges of the quantitative components assayed by the QAMS method was established for the first time, which could ensure its high accuracy and could be applied to QAMS methods of other TCMs. The present study demonstrated the practicability of the QAMS method for the quantitative analysis of multi-components and the quality control of TCMs and TCM prescriptions. Copyright © 2014 Elsevier B.V. All rights reserved.
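A minimal sketch of the linear-regression route to a relative correction factor of the kind described above (Python; the calibration data are invented): each component's calibration line is fitted, the factor is the ratio of slopes, and the analyte is then quantified from its peak area using only the internal referring substance's calibration.

    import numpy as np

    # Calibration solutions (invented): concentrations (ug/mL) and peak
    # areas for the internal referring substance and for a second analyte.
    c = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
    a_ref = 12.1 * c + np.array([0.3, -0.5, 0.2, 0.1, -0.2])   # slope ~12.1
    a_x   = 18.0 * c + np.array([-0.4, 0.2, 0.3, -0.1, 0.2])   # slope ~18.0

    slope_ref = np.polyfit(c, a_ref, 1)[0]
    slope_x   = np.polyfit(c, a_x, 1)[0]
    F = slope_x / slope_ref          # relative correction factor by regression

    area_x_sample = 950.0            # analyte peak area in a sample (invented)
    conc_x = area_x_sample / (F * slope_ref)   # equals area / slope_x
    print(F, conc_x)

Once F is fixed, only the referring substance's calibration is needed at analysis time, which is the practical point of the QAMS approach.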
Milton, Martin J T; Wang, Jian
2003-01-01
A new isotope dilution mass spectrometry (IDMS) method for high-accuracy quantitative analysis of gases has been developed and validated by the analysis of standard mixtures of carbon dioxide in nitrogen. The method does not require certified isotopic reference materials and does not require direct measurements of the highly enriched spike. The relative uncertainty of the method is shown to be 0.2%. Reproduced with the permission of Her Majesty's Stationery Office. Copyright Crown copyright 2003.
Field methods and data processing techniques associated with mapped inventory plots
William A. Bechtold; Stanley J. Zarnoch
1999-01-01
The U.S. Forest Inventory and Analysis (FIA) and Forest Health Monitoring (FHM) programs utilize a fixed-area mapped-plot design as the national standard for extensive forest inventories. The mapped-plot design is explained, as well as the rationale for its selection as the national standard. Ratio-of-means estimators are presented as a method to process data from...
Face recognition using slow feature analysis and contourlet transform
NASA Astrophysics Data System (ADS)
Wang, Yuehao; Peng, Lingling; Zhe, Fuchuan
2018-04-01
In this paper we propose a novel face recognition approach based on slow feature analysis (SFA) in the contourlet transform domain. The method first uses the contourlet transform to decompose the face image into low-frequency and high-frequency parts, and then takes advantage of slow feature analysis for facial feature extraction. We name the new method, combining slow feature analysis and the contourlet transform, CT-SFA. Experimental results on international standard face databases demonstrate that the new face recognition method is effective and competitive.
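A minimal sketch of the linear slow feature analysis step (the contourlet decomposition is omitted; the data are synthetic): whiten the data, then take the directions along which the time derivative has the smallest variance.

    import numpy as np

    def linear_sfa(X, n_out):
        """Minimal linear SFA: whiten X (time x features), then pick the
        directions minimizing the variance of the time derivative."""
        Xc = X - X.mean(axis=0)
        cov = np.cov(Xc, rowvar=False)
        d, E = np.linalg.eigh(cov)
        keep = d > 1e-10
        W = E[:, keep] / np.sqrt(d[keep])       # whitening matrix
        Z = Xc @ W
        dZ = np.diff(Z, axis=0)                 # discrete time derivative
        dd, P = np.linalg.eigh(np.cov(dZ, rowvar=False))  # ascending
        return Z @ P[:, :n_out]                 # slowest n_out features

    t = np.linspace(0, 8 * np.pi, 500)
    rng = np.random.default_rng(0)
    X = np.column_stack([np.sin(t), rng.normal(size=500)])  # slow + fast
    S = linear_sfa(X, 1)
    print(round(abs(np.corrcoef(S[:, 0], np.sin(t))[0, 1]), 3))  # near 1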
Gao, Wen; Wang, Rui; Li, Dan; Liu, Ke; Chen, Jun; Li, Hui-Jun; Xu, Xiaojun; Li, Ping; Yang, Hua
2016-01-05
The flowers of Lonicera japonica Thunb. have been extensively used to treat many diseases. As demand for L. japonica has increased, some related Lonicera plants are often confused or misused with it. Caffeoylquinic acids have generally been regarded as the chemical markers in the quality control of L. japonica, but they are found in all Lonicera species. Thus, a simple and reliable method for the evaluation of flowers from different Lonicera species needs to be established. In this work, a method based on a single standard to determine multi-components (SSDMC) combined with principal component analysis (PCA) for the control and discrimination of Lonicera species flowers has been developed. Six components, including three caffeoylquinic acids and three iridoid glycosides, were assayed simultaneously using chlorogenic acid as the reference standard. The credibility and feasibility of the SSDMC method were carefully validated, and the results demonstrated no remarkable differences compared with the external standard method. Finally, a total of fifty-one batches covering five Lonicera species were analyzed, and PCA was successfully applied to distinguish the Lonicera species. This strategy simplifies the quality control of multi-component herbal medicines and is effectively adapted to improving the quality control of herbs belonging to closely related species. Copyright © 2015 Elsevier B.V. All rights reserved.
40 CFR 63.762 - Affirmative defense for violations of emission standards during malfunction.
Code of Federal Regulations, 2013 CFR
2013-07-01
... an affirmative defense to a claim for civil penalties for violations of such standards that are... malfunction event at issue. The analysis shall also specify, using best monitoring methods and engineering...
40 CFR 63.1272 - Affirmative defense for violations of emission standards during malfunction.
Code of Federal Regulations, 2014 CFR
2014-07-01
... an affirmative defense to a claim for civil penalties for violations of such standards that are... malfunction event at issue. The analysis shall also specify, using best monitoring methods and engineering...
40 CFR 63.762 - Affirmative defense for violations of emission standards during malfunction.
Code of Federal Regulations, 2014 CFR
2014-07-01
... an affirmative defense to a claim for civil penalties for violations of such standards that are... malfunction event at issue. The analysis shall also specify, using best monitoring methods and engineering...
40 CFR 63.1272 - Affirmative defense for violations of emission standards during malfunction.
Code of Federal Regulations, 2013 CFR
2013-07-01
... an affirmative defense to a claim for civil penalties for violations of such standards that are... malfunction event at issue. The analysis shall also specify, using best monitoring methods and engineering...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makoto Kashiwagi; Garamszeghy, Mike; Lantes, Bertrand
Disposal of low- and intermediate-level activated waste generated at nuclear power plants is being planned or carried out in many countries. The radioactivity concentrations and/or total quantities of long-lived, difficult-to-measure nuclides (DTM nuclides), such as C-14, Ni-63, Nb-94 and α-emitting nuclides, are often restricted by the safety case for a final repository as determined by each country's safety regulations, and these concentrations or amounts are required to be known and declared. With respect to waste contaminated by contact with process water, the Scaling Factor method (SF method), which is empirically based on sampling and analysis data, has been applied as an important method for determining concentrations of DTM nuclides. This method was standardized by the International Organization for Standardization (ISO) and published in 2007 as ISO 21238, 'Scaling factor method to determine the radioactivity of low and intermediate-level radioactive waste packages generated at nuclear power plants' [1]. However, for activated metal waste with comparatively high concentrations of radioactivity, such as may be found in reactor control rods and internal structures, direct sampling and radiochemical analysis methods to evaluate the DTM nuclides are limited by access to the material and potentially high personnel radiation exposure. In this case, theoretical calculation methods, in combination with empirical methods based on remote radiation surveys, need to be used to best advantage for determining the disposal inventory of DTM nuclides while minimizing exposure to radiation workers. Pursuant to this objective, a standard for the theoretical evaluation of the radioactivity concentration of DTM nuclides in activated waste is in process through ISO TC85/SC5 (ISO Technical Committee 85: Nuclear energy, nuclear technologies, and radiological protection; Subcommittee 5: Nuclear fuel cycle). The project team for this ISO standard was formed in 2011 and is composed of experts from 11 countries. The project team has been conducting technical discussions on theoretical methods for determining concentrations of radioactivity, and has developed the draft International Standard ISO 16966, 'Theoretical activation calculation method to evaluate the radioactivity of activated waste generated at nuclear reactors' [2]. This paper describes the international standardization process developed by the ISO project team, and outlines two theoretical activity evaluation methods: the point method and the range method. (authors)
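A minimal sketch of the empirical SF method in the ISO 21238 spirit (Python; the activities are invented, and the geometric mean, commonly used because activity ratios tend to be log-normally distributed, is an assumption here rather than a detail taken from this paper):

    import numpy as np

    # Paired radiochemical results (invented): Co-60 (easy-to-measure key
    # nuclide) and Ni-63 (difficult-to-measure), both in Bq/g.
    co60 = np.array([1.2e3, 4.5e2, 8.8e3, 2.1e3, 6.0e2])
    ni63 = np.array([3.1e2, 1.3e2, 2.0e3, 6.6e2, 1.4e2])

    sf = np.exp(np.mean(np.log(ni63 / co60)))  # geometric-mean scaling factor

    # Declared Ni-63 activity for a package whose Co-60 was gamma-scanned:
    a_co60_package = 5.0e3                     # Bq/g, from gamma measurement
    print("SF = %.3f, Ni-63 ~ %.1f Bq/g" % (sf, sf * a_co60_package))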
Mezouari, S; Liu, W Yun; Pace, G; Hartman, T G
2015-01-01
The objective of this study was to develop an improved analytical method for the determination of 3-chloro-1,2-propanediol (3-MCPD) and 1,3-dichloropropanol (1,3-DCP) in paper-type food packaging. The established method includes aqueous extraction, matrix spiking of a deuterated surrogate internal standard (3-MCPD-d₅), clean-up using Extrelut solid-phase extraction, derivatisation using a silylation reagent, and GC-MS analysis of the chloropropanols as their corresponding trimethylsilyl ethers. The new method is applicable to food-grade packaging samples using European Commission standard aqueous extraction and aqueous food simulant migration tests. In this improved method, the derivatisation procedure was optimised; the cost and time of the analysis were reduced by using 10 times less sample, solvents and reagents than in previously described methods. Overall the validation data demonstrate that the method is precise and reliable. The limit of detection (LOD) in the aqueous extract was 0.010 mg kg⁻¹ (w/w) for both 3-MCPD and 1,3-DCP. Analytical precision had a relative standard deviation (RSD) of 3.36% for 3-MCPD and 7.65% for 1,3-DCP. The new method was satisfactorily applied to the analysis of over 100 commercial paperboard packaging samples. The data are being used to guide the product development of a next generation of wet-strength resins with reduced chloropropanol content, and also for risk assessments to calculate the virtual safe dose (VSD).
Microprobe Analysis of Pu-Ga Standards
Wall, Angélique D.; Romero, Joseph P.; Schwartz, Daniel
2017-08-04
In order to obtain quantitative analysis using an electron scanning microprobe it is essential to have a standard of known composition. Most elemental and multi-elemental standards can be easily obtained from suppliers such as Elemental Scientific or other standards organizations that are NIST (National Institute of Standards and Technology) traceable. It is, however, more challenging to find standards for plutonium. Past work performed in our group has typically involved using the plutonium sample to be analysed as its own standard, as long as all other known components of the sample have standards to be compared to [1,2,3]. This method works well enough, but this experiment was performed in order to develop a more reliable standard for plutonium, using five samples of known chemistry of a plutonium-gallium mix that could then be used as the main plutonium and gallium standards for future experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carter, Kimberly E; Gerdes, Kirk
2013-07-01
A new and complete GC–ICP-MS method is described for direct analysis of trace metals in a gas-phase process stream. The proposed method is derived from standard analytical procedures developed for ICP-MS, which are regularly exercised in standard ICP-MS laboratories. In order to implement the method, a series of empirical factors were generated to calibrate detector response with respect to a known concentration of an internal standard analyte. Calibrated responses are ultimately used to determine the concentration of metal analytes in a gas stream using a semi-quantitative algorithm. The method was verified using a traditional gas injection from a GC sampling valve and a standard gas mixture containing either a 1 ppm Xe + Kr mix with helium balance or 100 ppm Xe with helium balance. Data collected for Xe and Kr gas analytes revealed that agreement of 6–20% with the actual concentration can be expected for various experimental conditions. To demonstrate the method using a relevant "unknown" gas mixture, experiments were performed for continuous 4 and 7 hour periods using a Hg-containing sample gas that was co-introduced into the GC sample loop with the xenon gas standard. System performance and detector response to the dilute concentration of the internal standard were pre-determined, which allowed semi-quantitative evaluation of the analyte. The calculated analyte concentrations varied during the course of the 4 hour experiment, particularly during the first hour of the analysis, where the actual Hg concentration was under-predicted by up to 72%. The calculated concentration improved to within 30–60% for data collected after the first hour of the experiment. Similar results were seen during the 7 hour test, with the deviation from the actual concentration being 11–81% during the first hour and decreasing for the remaining period. The method detection limit (MDL) was determined for mercury by injecting the sample gas into the system following a period of equilibration. The MDL for Hg was calculated as 6.8 μg·m⁻³. This work describes the first complete GC–ICP-MS method to directly analyze gas-phase samples, and detailed sample calculations and comparisons to conventional ICP-MS methods are provided.
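The quoted MDL is consistent with the usual replicate-based convention (MDL equals the one-sided 99% Student-t value times the standard deviation of low-level replicates); a sketch of that convention, not necessarily the authors' exact procedure (the replicate values are invented):

    import numpy as np
    from scipy import stats

    # Replicate low-level Hg results in ug/m3 (invented values).
    reps = np.array([2.3, 2.6, 2.1, 2.8, 2.4, 2.5, 2.2])
    t99 = stats.t.ppf(0.99, df=len(reps) - 1)   # one-sided 99% Student-t
    mdl = t99 * reps.std(ddof=1)
    print("MDL ~ %.2f ug/m3" % mdl)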
Bedner, Mary; Schantz, Michele M; Sander, Lane C; Sharpless, Katherine E
2008-05-23
Liquid chromatographic (LC) methods using atmospheric pressure chemical ionization/mass spectrometric (APCI-MS) detection were developed for the separation and analysis of the phytosterols campesterol, cycloartenol, lupenone, lupeol, beta-sitosterol, and stigmasterol. Brassicasterol and cholesterol were also included for investigation as internal standards. The methods were used to identify and quantify the phytosterols in each of two Serenoa repens (saw palmetto) Standard Reference Materials (SRMs) developed by the National Institute of Standards and Technology (NIST). Values obtained by LC-MS were compared to those obtained using the more traditional approach of gas chromatography with flame ionization detection. This is the first reported use of LC-MS to determine phytosterols in saw palmetto dietary supplement materials.
Evaluating Public Libraries Using Standard Scores: The Library Quotient.
ERIC Educational Resources Information Center
O'Connor, Daniel O.
1982-01-01
Describes a method for assessing the performance of public libraries using a standardized scoring system and provides an analysis of public library data from New Jersey as an example. Library standards and the derivation of measurement ratios are also discussed. A 33-item bibliography and three data tables are included. (JL)
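A plausible reading of a standardized library score of this kind, sketched with invented statistics (the IQ-style rescaling to mean 100 and standard deviation 15 is an assumption for illustration, not necessarily O'Connor's exact formula):

    import numpy as np

    def quotient(values):
        """z-scores rescaled to an IQ-like scale (mean 100, SD 15)."""
        v = np.asarray(values, dtype=float)
        z = (v - v.mean()) / v.std(ddof=1)
        return 100 + 15 * z

    circ_per_capita = [4.2, 6.8, 3.1, 9.5, 5.0]   # invented library statistics
    print(np.round(quotient(circ_per_capita), 1))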
An Empirical Comparison of Variable Standardization Methods in Cluster Analysis.
ERIC Educational Resources Information Center
Schaffer, Catherine M.; Green, Paul E.
1996-01-01
The common marketing research practice of standardizing the columns of a persons-by-variables data matrix prior to clustering the entities corresponding to the rows was evaluated with 10 large-scale data sets. Results indicate that the column standardization practice may be problematic for some kinds of data that marketing researchers used for…
Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
As fracture mechanics material testing evolves, the governing test standards continue to be refined to better reflect the latest understanding of the physics of the fracture processes involved. The traditional format of ASTM fracture testing standards, utilizing equations expressed directly in the text of the standard to assess the experimental result, is self-limiting in the complexity that can be reasonably captured. The use of automated analysis techniques to draw upon a rich, detailed solution database for assessing fracture mechanics tests provides a foundation for a new approach to testing standards that enables routine users to obtain highly reliable assessments of tests involving complex, non-linear fracture behavior. Herein, the case for automating the analysis of tests of surface cracks in tension in the elastic-plastic regime is utilized as an example of how such a database can be generated and implemented for use in the ASTM standards framework. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
Optimization Based Efficiencies in First Order Reliability Analysis
NASA Technical Reports Server (NTRS)
Peck, Jeffrey A.; Mahadevan, Sankaran
2003-01-01
This paper develops a method for updating the gradient vector of the limit state function in reliability analysis using Broyden's rank one updating technique. In problems that use commercial code as a black box, the gradient calculations are usually done using a finite difference approach, which becomes very expensive for large system models. The proposed method replaces the finite difference gradient calculations in a standard first order reliability method (FORM) with Broyden's Quasi-Newton technique. The resulting algorithm of Broyden updates within a FORM framework (BFORM) is used to run several example problems, and the results compared to standard FORM results. It is found that BFORM typically requires fewer functional evaluations that FORM to converge to the same answer.
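For orientation, a minimal sketch of the rank-one update at the heart of BFORM, assuming a scalar limit state function g(x); the function name and surrounding FORM bookkeeping are illustrative, not the authors' code:

```python
import numpy as np

def broyden_gradient_update(grad, x_old, x_new, g_old, g_new):
    """Rank-one (Broyden) update of the limit-state gradient estimate.

    Satisfies the secant condition grad_new @ (x_new - x_old) = g_new - g_old,
    so each iteration needs only one new evaluation of g instead of a full
    finite-difference stencil.
    """
    dx = x_new - x_old
    denom = dx @ dx
    if denom == 0.0:
        return grad  # no step taken; keep the previous gradient
    return grad + ((g_new - g_old - grad @ dx) / denom) * dx
```

Inside a FORM loop this replaces each finite-difference gradient after one initial full evaluation, which is where the reported savings in function calls for black-box system models come from.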
Detrended fluctuation analysis as a regression framework: Estimating dependence at different scales
NASA Astrophysics Data System (ADS)
Kristoufek, Ladislav
2015-02-01
We propose a framework combining detrended fluctuation analysis with standard regression methodology. The method is built on detrended variances and covariances and it is designed to estimate regression parameters at different scales and under potential nonstationarity and power-law correlations. The former feature allows for distinguishing between effects for a pair of variables from different temporal perspectives. The latter ones make the method a significant improvement over the standard least squares estimation. Theoretical claims are supported by Monte Carlo simulations. The method is then applied on selected examples from physics, finance, environmental science, and epidemiology. For most of the studied cases, the relationship between variables of interest varies strongly across scales.
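A sketch of the scale-specific regression estimator this describes, under the assumption that the slope at scale s is the ratio of the detrended covariance to the detrended variance (Kristoufek-style); box handling is simplified to non-overlapping windows with linear detrending:

```python
import numpy as np

def detrended_cov(x, y, s):
    """Average detrended covariance F2_xy(s) over non-overlapping boxes."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    n = (len(X) // s) * s
    t = np.arange(s)
    f2 = []
    for i in range(0, n, s):
        rx = X[i:i + s] - np.polyval(np.polyfit(t, X[i:i + s], 1), t)
        ry = Y[i:i + s] - np.polyval(np.polyfit(t, Y[i:i + s], 1), t)
        f2.append(np.mean(rx * ry))
    return np.mean(f2)

def dfa_beta(x, y, s):
    """Scale-specific regression slope of y on x at box size s."""
    return detrended_cov(x, y, s) / detrended_cov(x, x, s)
```

Evaluating dfa_beta over a range of s values gives the dependence structure at different temporal scales that the abstract refers to.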
Bogren, Sara; Fornara, Andrea; Ludwig, Frank; del Puerto Morales, Maria; Steinhoff, Uwe; Fougt Hansen, Mikkel; Kazakova, Olga; Johansson, Christer
2015-01-01
This study presents the classification of different magnetic single- and multi-core particle systems using their measured dynamic magnetic properties together with their nanocrystal and particle sizes. The dynamic magnetic properties are measured with AC (dynamical) susceptometry and magnetorelaxometry, and the size parameters are determined from electron microscopy and dynamic light scattering. Using these methods, we also show that nanocrystal size and particle morphology determine the dynamic magnetic properties for both single- and multi-core particles. The presented results are obtained from the four-year EU NMP FP7 project, NanoMag, which is focused on standardization of analysis methods for magnetic nanoparticles. PMID:26343639
Garbarino, John R.; Struzeski, Tedmund M.
1998-01-01
Inductively coupled plasma-optical emission spectrometry (ICP-OES) and inductively coupled plasma-mass spectrometry (ICP-MS) can be used to determine 26 elements in whole-water digests. Both methods have distinct advantages and disadvantages: ICP-OES is capable of analyzing samples with higher elemental concentrations without dilution, whereas ICP-MS is more sensitive and capable of determining much lower elemental concentrations. Both techniques gave accurate results for spike recoveries, digested standard reference-water samples, and whole-water digests. Average spike recoveries in whole-water digests were 100 plus/minus 10 percent, although recoveries for digests with high dissolved-solids concentrations were lower for selected elements by ICP-MS. Results for standard reference-water samples were generally within 1 standard deviation of the most probable values. Statistical analysis of the results from 43 whole-water digests indicated that there was no significant difference among ICP-OES, ICP-MS, and former official methods of analysis for 24 of the 26 elements evaluated.
Standard-less analysis of Zircaloy clad samples by an instrumental neutron activation method
NASA Astrophysics Data System (ADS)
Acharya, R.; Nair, A. G. C.; Reddy, A. V. R.; Goswami, A.
2004-03-01
A non-destructive method for the analysis of Zircaloy samples of irregular shape and size has been developed using the recently standardized k0-based internal mono-standard instrumental neutron activation analysis (INAA). The samples of Zircaloy-2 and -4 tubes, used as fuel cladding in Indian boiling water reactors (BWR) and pressurized heavy water reactors (PHWR), respectively, have been analyzed. Samples weighing in the range of a few tens of grams were irradiated in the thermal column of the Apsara reactor to minimize neutron flux perturbations and high radiation dose. The method utilizes the in situ relative detection efficiency, obtained using the γ-rays of selected activation products in the sample, to overcome γ-ray self-attenuation. Since the major and minor constituents (Zr, Sn, Fe, Cr and/or Ni) in these samples were amenable to NAA, the absolute concentrations of all the elements were determined using mass balance instead of the concentration of the internal mono standard. Concentrations were also determined in a smaller Zircaloy-4 sample irradiated in the core position of the reactor to validate the present methodology. The results were compared with literature specifications and were found to be satisfactory. Values of sensitivities and detection limits have been evaluated for the elements analyzed.
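The mass-balance step lends itself to a one-line normalization: if every major and minor constituent is measured relative to some element, absolute values follow from forcing the constituents to sum to 100 wt%. A sketch of that interpretation, with hypothetical relative amounts (the numbers are illustrative, not the paper's data):

```python
def mass_balance_concentrations(relative_conc):
    """Scale relative concentrations so the constituents sum to 100 wt%."""
    total = sum(relative_conc.values())
    return {el: 100.0 * c / total for el, c in relative_conc.items()}

# Illustrative relative amounts for a Zircaloy-like sample:
print(mass_balance_concentrations({"Zr": 97.1, "Sn": 1.4, "Fe": 0.21, "Cr": 0.10}))
```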
Standardizing Flow Cytometry Immunophenotyping Analysis from the Human ImmunoPhenotyping Consortium
Finak, Greg; Langweiler, Marc; Jaimes, Maria; Malek, Mehrnoush; Taghiyar, Jafar; Korin, Yael; Raddassi, Khadir; Devine, Lesley; Obermoser, Gerlinde; Pekalski, Marcin L.; Pontikos, Nikolas; Diaz, Alain; Heck, Susanne; Villanova, Federica; Terrazzini, Nadia; Kern, Florian; Qian, Yu; Stanton, Rick; Wang, Kui; Brandes, Aaron; Ramey, John; Aghaeepour, Nima; Mosmann, Tim; Scheuermann, Richard H.; Reed, Elaine; Palucka, Karolina; Pascual, Virginia; Blomberg, Bonnie B.; Nestle, Frank; Nussenblatt, Robert B.; Brinkman, Ryan Remy; Gottardo, Raphael; Maecker, Holden; McCoy, J Philip
2016-01-01
Standardization of immunophenotyping requires careful attention to reagents, sample handling, instrument setup, and data analysis, and is essential for successful cross-study and cross-center comparison of data. Experts developed five standardized, eight-color panels for identification of major immune cell subsets in peripheral blood. These were produced as pre-configured, lyophilized reagents in 96-well plates. We present the results of a coordinated analysis of samples across nine laboratories using these panels with standardized operating procedures (SOPs). Manual gating was performed by each site and by a central site. Automated gating algorithms were developed and tested by the FlowCAP consortium. Centralized manual gating can reduce cross-center variability, and we sought to determine whether automated methods could streamline and standardize the analysis. Within-site variability was low in all experiments, but cross-site variability was lower when central analysis was performed in comparison with site-specific analysis. It was also lower for clearly defined cell subsets than those based on dim markers and for rare populations. Automated gating was able to match the performance of central manual analysis for all tested panels, exhibiting little to no bias and comparable variability. Standardized staining, data collection, and automated gating can increase power, reduce variability, and streamline analysis for immunophenotyping. PMID:26861911
METHOD FOR THE DETERMINATION OF PERCHLORATE ANION IN PLANT AND SOLID MATRICES BY ION CHROMATOGRAPHY
A standardized method for the analysis of perchlorate in plants was developed, based on dry weight, and applied to the analysis of plant organs, foodstuffs, and plant products. The procedure greatly reduced the ionic interferences in water extracts of plant materials. Ion chro...
Cueto Díaz, Sergio; Ruiz Encinar, Jorge; García Alonso, J Ignacio
2014-09-24
We present a novel method for the purity assessment of peptide standards which is applicable to any water-soluble peptide. The method is based on the online (13)C isotope dilution approach, in which the peptide is separated from its related impurities by liquid chromatography (LC) and the eluent is mixed post-column with a continuous flow of (13)C-enriched sodium bicarbonate. An online oxidation step using sodium persulfate in acidic media at 99°C provides quantitative oxidation to (12)CO2 and (13)CO2, respectively, which are extracted into a gaseous phase with the help of a gas-permeable membrane. The measurement of the isotope ratio 44/45 in the mass spectrometer allows the construction of the mass flow chromatogram. As the only species finally measured in the mass spectrometer is CO2, the peptide content in the standard can be quantified, on the basis of its carbon content, using a generic primary standard such as potassium hydrogen phthalate. The approach was validated by the analysis of a reference material (NIST 8327) and applied to the quantification of two commercial synthetic peptide standards. In the latter case, the results were compared with those obtained using alternative methods, such as amino acid analysis and ICP-MS. The results proved the value of the method for the fast, accurate, and precise mass purity assignment of synthetic peptide standards. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Sitko, Rafał
2008-11-01
Knowledge of X-ray tube spectral distribution is necessary in theoretical methods of matrix correction, i.e. in both fundamental parameter (FP) methods and theoretical influence coefficient algorithms. Thus, the influence of X-ray tube distribution on the accuracy of the analysis of thin films and bulk samples is presented. The calculations are performed using experimental X-ray tube spectra taken from the literature and theoretical X-ray tube spectra evaluated by three different algorithms proposed by Pella et al. (X-Ray Spectrom. 14 (1985) 125-135), Ebel (X-Ray Spectrom. 28 (1999) 255-266), and Finkelshtein and Pavlova (X-Ray Spectrom. 28 (1999) 27-32). In this study, Fe-Cr-Ni system is selected as an example and the calculations are performed for X-ray tubes commonly applied in X-ray fluorescence analysis (XRF), i.e., Cr, Mo, Rh and W. The influence of X-ray tube spectra on FP analysis is evaluated when quantification is performed using various types of calibration samples. FP analysis of bulk samples is performed using pure-element bulk standards and multielement bulk standards similar to the analyzed material, whereas for FP analysis of thin films, the bulk and thin pure-element standards are used. For the evaluation of the influence of X-ray tube spectra on XRF analysis performed by theoretical influence coefficient methods, two algorithms for bulk samples are selected, i.e. Claisse-Quintin (Can. Spectrosc. 12 (1967) 129-134) and COLA algorithms (G.R. Lachance, Paper Presented at the International Conference on Industrial Inorganic Elemental Analysis, Metz, France, June 3, 1981) and two algorithms (constant and linear coefficients) for thin films recently proposed by Sitko (X-Ray Spectrom. 37 (2008) 265-272).
ERIC Educational Resources Information Center
Abrams, Neal M.
2012-01-01
A cloud network system is combined with standard computing applications and a course management system to provide a robust method for sharing data among students. This system provides a unique method to improve data analysis by easily increasing the amount of sampled data available for analysis. The data can be shared within one course as well as…
Wang, Guiqin; Wu, Yangsiqian; Lin, Yangting
2016-02-28
Nearly 99% of the total content of extraterrestrial metals is composed of Fe and Ni, but with greatly variable trace element contents. The accuracy obtained in the inductively coupled plasma mass spectrometry (ICP-MS) analysis of solutions of these samples can be significantly influenced by matrix contents, polyatomic ion interference, and the concentrations of external standard solutions. An ICP-MS instrument (X Series 2) was used to determine 30 standard solutions with different concentrations of trace elements, and different matrix contents. Based on these measurements, the matrix effects were determined. Three iron meteorites were dissolved separately in aqua regia and HNO3. Deviations due to variation of matrix contents in the external standard solutions were evaluated and the analysis results of the two digestion methods for iron meteorites were assessed. Our results show obvious deviations due to unmatched matrix contents in the external standard solutions. Furthermore, discrepancy in the measurement of some elements was found between the sample solutions prepared with aqua regia and HNO3, due to loss of chloride during sample preparation and/or incomplete digestion of highly siderophile elements in iron meteorites. An accurate ICP-MS analysis method for extraterrestrial metal samples has been established using external standard solutions with matched matrix contents and digesting the samples with HNO3 and aqua regia. Using the data from this work, the Mundrabilla iron meteorite previously classified as IAB-ung is reclassified as IAB-MG. Copyright © 2016 John Wiley & Sons, Ltd.
Extracting insights from the shape of complex data using topology
Lum, P. Y.; Singh, G.; Lehman, A.; Ishkanov, T.; Vejdemo-Johansson, M.; Alagappan, M.; Carlsson, J.; Carlsson, G.
2013-01-01
This paper applies topological methods to study complex high dimensional data sets by extracting shapes (patterns) and obtaining insights about them. Our method combines the best features of existing standard methodologies such as principal component and cluster analyses to provide a geometric representation of complex data sets. Through this hybrid method, we often find subgroups in data sets that traditional methodologies fail to find. Our method also permits the analysis of individual data sets as well as the analysis of relationships between related data sets. We illustrate the use of our method by applying it to three very different kinds of data, namely gene expression from breast tumors, voting data from the United States House of Representatives and player performance data from the NBA, in each case finding stratifications of the data which are more refined than those produced by standard methods. PMID:23393618
40 CFR 60.74a - Affirmative defense for violations of emission standards during malfunction.
Code of Federal Regulations, 2014 CFR
2014-07-01
... affirmative defense to a claim for civil penalties for violations of such standards that are caused by... analysis shall also specify, using best monitoring methods and engineering judgment, the amount of any...
40 CFR 60.74a - Affirmative defense for violations of emission standards during malfunction.
Code of Federal Regulations, 2013 CFR
2013-07-01
... affirmative defense to a claim for civil penalties for violations of such standards that are caused by... analysis shall also specify, using best monitoring methods and engineering judgment, the amount of any...
40 CFR 60.286a - Affirmative defense for violations of emission standards during malfunction.
Code of Federal Regulations, 2014 CFR
2014-07-01
..., you may assert an affirmative defense to a claim for civil penalties for violations of such standards... malfunction event at issue. The analysis must also specify, using best monitoring methods and engineering...
40 CFR 63.1344 - Affirmative defense for violation of emission standards during malfunction.
Code of Federal Regulations, 2013 CFR
2013-07-01
... defense to a claim for civil penalties for violations of such standards that are caused by malfunction, as... malfunction event at issue. The analysis shall also specify, using best monitoring methods and engineering...
40 CFR 63.1344 - Affirmative defense for violation of emission standards during malfunction.
Code of Federal Regulations, 2014 CFR
2014-07-01
... defense to a claim for civil penalties for violations of such standards that are caused by malfunction, as... malfunction event at issue. The analysis shall also specify, using best monitoring methods and engineering...
Burkhardt, Mark R.; Cinotto, Pete J.; Frahm, Galen W.; Woodworth, Mark T.; Pritt, Jeffrey W.
1995-01-01
A method for the determination of methylene blue active substances in whole-water samples by liquid-liquid extraction and spectrophotometric detection is described. Sulfate- and sulfonate-based surfactants are reacted with methylene blue to form a blue-colored complex. The complex is extracted into chloroform, back-washed with an acidified phosphate-based buffer solution, and measured against external standards with a probe spectrophotometer. The method detection limit for routine analysis is 0.02 milligram per liter. The precision is plus/minus 10 percent relative standard deviation. The positive bias from nitrate and chloride observed in U.S. Geological Survey method O-3111-83 for methylene blue active substances is minimized by adding a back-washing step.
Xia, Ben-Li; Cong, Ji-Xin; Li, Xia; Wang, Xuan-Jun
2011-06-01
The rocket kerosene quality properties, such as density, distillation range, viscosity, and iodine value, were successfully measured based on near-infrared spectra (NIRS) and chemometrics. In the present paper, more than 70 rocket kerosene samples were measured by near-infrared spectroscopy, and calibration models were built using the partial least squares method within the appropriate wavelength range. The correlation coefficients (R²) for the rocket kerosene quality properties ranged from 0.862 to 0.999. Ten unknown samples were predicted with the models, and the results showed that the prediction accuracy of the near-infrared method meets standard analysis requirements. The new method is well suited to replacing the traditional standard method for rapid determination of rocket kerosene properties.
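A minimal sketch of the modeling step, assuming the spectra and laboratory reference values are available as arrays (the file names are hypothetical) and using scikit-learn's PLS implementation in place of whatever chemometrics package the authors used:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

X = np.load("kerosene_nir_spectra.npy")  # (n_samples, n_wavelengths), hypothetical
y = np.load("kerosene_density.npy")      # lab-measured property, hypothetical

pls = PLSRegression(n_components=8)      # choose the component count by cross-validation
y_cv = cross_val_predict(pls, X, y, cv=10)
r2 = 1 - np.sum((y - y_cv.ravel()) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"cross-validated R^2 = {r2:.3f}")
```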
Pasta, D J; Taylor, J L; Henning, J M
1999-01-01
Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
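A compact sketch of the approach described above: bootstrap the patient-level inputs and push each resample through the decision model, so the output distribution reflects sampling uncertainty without assuming theoretical parameter distributions. The model here is reduced to a bare incremental cost-effectiveness ratio, and all numbers are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented patient-level data for two strategies (cost, cure indicator).
cost_a, eff_a = rng.normal(900.0, 200.0, 150), rng.binomial(1, 0.85, 150)
cost_b, eff_b = rng.normal(600.0, 150.0, 150), rng.binomial(1, 0.70, 150)

def icer(ca, ea, cb, eb):
    """Incremental cost-effectiveness ratio of strategy A over B."""
    return (ca.mean() - cb.mean()) / (ea.mean() - eb.mean())

boot = []
for _ in range(5000):
    ia = rng.integers(0, cost_a.size, cost_a.size)  # resample arm A with replacement
    ib = rng.integers(0, cost_b.size, cost_b.size)  # resample arm B with replacement
    boot.append(icer(cost_a[ia], eff_a[ia], cost_b[ib], eff_b[ib]))

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"bootstrap 95% interval for the ICER: ({lo:.0f}, {hi:.0f})")
```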
Bootstrap Methods: A Very Leisurely Look.
ERIC Educational Resources Information Center
Hinkle, Dennis E.; Winstead, Wayland H.
The Bootstrap method, a computer-intensive statistical method of estimation, is illustrated using a simple and efficient Statistical Analysis System (SAS) routine. The utility of the method for generating unknown parameters, including standard errors for simple statistics, regression coefficients, discriminant function coefficients, and factor…
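The core of the routine the authors describe fits in a few lines of any language; a sketch in Python rather than SAS (the names are illustrative):

```python
import numpy as np

def bootstrap_se(data, stat=np.mean, n_boot=2000, seed=1):
    """Bootstrap standard error of an arbitrary statistic."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data)
    reps = [stat(data[rng.integers(0, data.size, data.size)])
            for _ in range(n_boot)]
    return np.std(reps, ddof=1)

# e.g. bootstrap_se(x, np.median) for the standard error of a median
```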
Weir, Christopher J; Butcher, Isabella; Assi, Valentina; Lewis, Stephanie C; Murray, Gordon D; Langhorne, Peter; Brady, Marian C
2018-03-07
Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses.
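Two of the simplest devices mentioned above can be written down directly. Both are rough approximations (the range rule assumes roughly normal data, and the quartile formula is one of several published variants), so treat them as illustrative rather than as the review's recommended estimators:

```python
def sd_from_range(minimum, maximum):
    """Rule-of-thumb SD from the range; assumes approximately normal data."""
    return (maximum - minimum) / 4.0

def mean_from_quartiles(q1, median, q3):
    """Approximate mean from the median and quartiles (one common variant)."""
    return (q1 + median + q3) / 3.0
```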
Evaluating the Performance of the IEEE Standard 1366 Method for Identifying Major Event Days
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eto, Joseph H.; LaCommare, Kristina Hamachi; Sohn, Michael D.
IEEE Standard 1366 offers a method for segmenting reliability performance data to isolate the effects of major events from the underlying year-to-year trends in reliability. Recent analysis by the IEEE Distribution Reliability Working Group (DRWG) has found that the reliability performance of some utilities differs from the expectations that helped guide the development of the Standard 1366 method. This paper proposes quantitative metrics to evaluate the performance of the Standard 1366 method in identifying major events and in reducing year-to-year variability in utility reliability. The metrics are applied to a large sample of utility-reported reliability data to assess performance of the method with alternative specifications that have been considered by the DRWG. We find that none of the alternatives performs uniformly 'better' than the current Standard 1366 method. That is, none of the modifications uniformly lowers the year-to-year variability in the System Average Interruption Duration Index without major events. Instead, for any given alternative, while it may lower the value of this metric for some utilities, it also increases it for other utilities (sometimes dramatically). Thus, we illustrate some of the trade-offs that must be considered in using the Standard 1366 method and highlight the usefulness of the metrics we have proposed in conducting these evaluations.
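For orientation, a sketch of the Standard 1366 major event day classification as it is commonly described (the "two-and-a-half beta" approach): take the natural log of daily SAIDI over the assessment period, typically five years, and flag days exceeding exp(alpha + 2.5*beta). Parameter handling is simplified here:

```python
import numpy as np

def major_event_days(daily_saidi, k=2.5):
    """Flag major event days using the Standard 1366-style log-normal threshold."""
    daily_saidi = np.asarray(daily_saidi, float)
    logs = np.log(daily_saidi[daily_saidi > 0.0])   # zero-SAIDI days excluded
    t_med = np.exp(logs.mean() + k * logs.std(ddof=1))
    return daily_saidi > t_med, t_med
```

The alternative specifications the paper evaluates amount to varying choices such as the multiplier k and the treatment of the underlying data, which is why a quantitative performance metric is needed to compare them.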
Colour measurements of pigmented rice grain using flatbed scanning and image analysis
NASA Astrophysics Data System (ADS)
Kaisaat, Khotchakorn; Keawdonree, Nuttapong; Chomkokard, Sakchai; Jinuntuya, Noparit; Pattanasiri, Busara
2017-09-01
Recently, the National Bureau of Agricultural Commodity and Food Standards (ACFS) has drafted a manual of Thai colour rice standards. However, the manual contains no quantitative description of rice colour or of its measurement method. These drawbacks might lead to misunderstandings among people who use the manual. In this work, we propose an inexpensive method, using flatbed scanning together with image analysis, to quantitatively measure rice colour and colour uniformity. To demonstrate its general applicability for colour differentiation of rice, we applied it to different kinds of pigmented rice, including Riceberry rice with and without uniform colour and Chinese black rice.
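A sketch of the kind of measurement pipeline this implies, assuming scikit-image is available and the scan is saved as an RGB file; the file name is hypothetical, and a real workflow would first mask out the background so only grain pixels are measured:

```python
import numpy as np
from skimage import io, color

rgb = io.imread("rice_scan.png")[..., :3] / 255.0  # hypothetical scan file
lab = color.rgb2lab(rgb)                           # scanner RGB treated as sRGB

pixels = lab.reshape(-1, 3)
mean_lab = pixels.mean(axis=0)   # quantitative rice colour (L*, a*, b*)
std_lab = pixels.std(axis=0)     # per-channel spread as a uniformity measure
print("mean L*a*b*:", mean_lab, " colour uniformity (std):", std_lab)
```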
Semi-automated potentiometric titration method for uranium characterization.
Cristiano, B F G; Delgado, J U; da Silva, J W S; de Barros, P D; de Araújo, R M S; Lopes, R T
2012-07-01
The manual version of the potentiometric titration method has been used for certification and characterization of uranium compounds. In order to reduce the analysis time and the influence of the analyst, a semi-automatic version of the method was developed in the Brazilian Nuclear Energy Commission. The method was applied with traceability assured by using a potassium dichromate primary standard. The combined standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization. Copyright © 2011 Elsevier Ltd. All rights reserved.
Complete 3D kinematics of upper extremity functional tasks.
van Andel, Carolien J; Wolterbeek, Nienke; Doorenbosch, Caroline A M; Veeger, DirkJan H E J; Harlaar, Jaap
2008-01-01
Upper extremity (UX) movement analysis by means of 3D kinematics has the potential to become an important clinical evaluation method. However, no standardized protocol for clinical application that includes the whole upper limb has yet been developed. Standardization problems include the lack of a single representative function, the wide range of motion of the joints, and the complexity of the anatomical structures. A useful protocol would focus on the functional status of the arm and particularly the orientation of the hand. The aim of this work was to develop a standardized measurement method for unconstrained movement analysis of the UX that includes hand orientation, for a set of functional tasks for the UX, and to obtain normative values. Ten healthy subjects performed four representative activities of daily living (ADL). In addition, six standard active range of motion (ROM) tasks were executed. Joint angles of the wrist, elbow, shoulder, and scapula were analyzed throughout each ADL task, and minimum/maximum angles were determined from the ROM tasks. Characteristic trajectories were found for the ADL tasks, standard deviations were generally small, and ROM results were consistent with the literature. The results of this study could form the normative basis for the development of a 'UX analysis report' equivalent to the 'gait analysis report' and would allow for future comparisons with pediatric and/or pathologic movement patterns.
1993-06-18
... the rule rather than the exception. In the Standardized Aquatic Microcosm and the Mixed Flask Culture (MFC) microcosms, multivariate analysis and clustering methods were applied to experiments using the two microcosm protocols. We use nonmetric clustering, a multivariate pattern recognition technique developed by Matthews and Hearne (1991).
Buzzi, Marina; Guarino, Anna; Gatto, Claudio; Manara, Sabrina; Dainese, Luca; Polvani, Gianluca; Tóthová, Jana D'Amato
2014-01-01
We investigated the presence of antibiotics in cryopreserved cardiovascular tissues and cryopreservation media, after tissue decontamination with antibiotic cocktails, and the impact of antibiotic residues on standard tissue bank microbiological analyses. Sixteen cardiovascular tissues were decontaminated with bank-prepared cocktails and cryopreserved by two different tissue banks according to their standard operating procedures. Before and after decontamination, samples underwent microbiological analysis by standard tissue bank methods. Cryopreserved samples were tested again with and without the removal of antibiotic residues using a RESEP tube, after thawing. Presence of antibiotics in tissue homogenates and processing liquids was determined by a modified agar diffusion test. All cryopreserved tissue homogenates and cryopreservation media induced important inhibition zones on both Staphylococcus aureus- and Pseudomonas aeruginosa-seeded plates, immediately after thawing and at the end of the sterility test. The RESEP tube treatment markedly reduced or totally eliminated the antimicrobial activity of tested tissues and media. Based on standard tissue bank analysis, 50% of tissues were found positive for bacteria and/or fungi, before decontamination and 2 out of 16 tested samples (13%) still contained microorganisms after decontamination. After thawing, none of the 16 cryopreserved samples resulted positive with direct inoculum method. When the same samples were tested after removal of antibiotic residues, 8 out of 16 (50%) were contaminated. Antibiotic residues present in tissue allografts and processing liquids after decontamination may mask microbial contamination during microbiological analysis performed with standard tissue bank methods, thus resulting in false negatives.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Jordana R.; Gill, Gary A.; Kuo, Li-Jung
2016-04-20
Trace element determinations in seawater by inductively coupled plasma mass spectrometry are analytically challenging due to the typically very low concentrations of the trace elements and the potential interference of the salt matrix. In this study, we compared seven different analytical approaches for uranium analysis by inductively coupled plasma mass spectrometry (ICP-MS), using Sequim Bay seawater samples and three seawater certified reference materials (SLEW-3, CASS-5, and NASS-6). The methods evaluated include: direct analysis, Fe/Pd reductive precipitation, standard addition calibration, online automated dilution using an external calibration with and without matrix matching, and online automated pre-concentration. The method that produced the most accurate results was the method of standard addition calibration, recovering uranium from a Sequim Bay seawater sample at 101 ± 1.2%. The online pre-concentration method and the automated dilution with matrix-matched calibration method also performed well. The two least effective methods were direct analysis and Fe/Pd reductive precipitation using sodium borohydride.
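Since standard addition calibration performed best here, a minimal sketch of how that technique recovers the sample concentration; the spike levels and signals below are made up for illustration:

```python
import numpy as np

def standard_addition(added_conc, signal):
    """Fit signal vs. added analyte; the magnitude of the x-intercept,
    i.e. intercept/slope, is the analyte concentration in the unspiked sample."""
    slope, intercept = np.polyfit(added_conc, signal, 1)
    return intercept / slope

spikes = np.array([0.0, 1.0, 2.0, 4.0])    # added U, ug/L (illustrative)
counts = np.array([5.1, 9.9, 15.2, 25.0])  # instrument response (illustrative)
print(f"sample concentration ~ {standard_addition(spikes, counts):.2f} ug/L")
```

Because the calibration is performed in the sample itself, the salt matrix affects standards and analyte identically, which is why this approach sidesteps the matrix-matching problem described above.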
NASA Astrophysics Data System (ADS)
Nielsen, S. Suzanne
Investigations in food science and technology, whether by the food industry, governmental agencies, or universities, often require determination of food composition and characteristics. Trends and demands of consumers, the food industry, and national and international regulations challenge food scientists as they work to monitor food composition and to ensure the quality and safety of the food supply. All food products require analysis as part of a quality management program throughout the development process (including raw ingredients), through production, and after a product is in the market. In addition, analysis is done of problem samples and competitor products. The characteristics of foods (i.e., chemical composition, physical properties, sensory properties) are used to answer specific questions for regulatory purposes and typical quality control. The nature of the sample and the specific reason for the analysis commonly dictate the choice of analytical methods. Speed, precision, accuracy, and ruggedness often are key factors in this choice. Validation of the method for the specific food matrix being analyzed is necessary to ensure usefulness of the method. Making an appropriate choice of the analytical technique for a specific application requires a good knowledge of the various techniques (Fig. 1.1). For example, your choice of method to determine the salt content of potato chips would be different if it is for nutrition labeling than for quality control. The success of any analytical method relies on the proper selection and preparation of the food sample, carefully performing the analysis, and doing the appropriate calculations and interpretation of the data. Methods of analysis developed and endorsed by several nonprofit scientific organizations allow for standardized comparisons of results between different laboratories and for evaluation of less standard procedures. Such official methods are critical in the analysis of foods, to ensure that they meet the legal requirements established by governmental agencies. Government regulations and international standards most relevant to the analysis of foods are mentioned here but covered in more detail in Chap. 2, and nutrition labeling regulations in the USA are covered in Chap. 3. Internet addresses for many of the organizations and government agencies discussed are given at the end of this chapter.
Reviving common standards in point-count surveys for broad inference across studies
Matsuoka, Steven M.; Mahon, C. Lisa; Handel, Colleen M.; Solymos, Peter; Bayne, Erin M.; Fontaine, Patricia C.; Ralph, C.J.
2014-01-01
We revisit the common standards recommended by Ralph et al. (1993, 1995a) for conducting point-count surveys to assess the relative abundance of landbirds breeding in North America. The standards originated from discussions among ornithologists in 1991 and were developed so that point-count survey data could be broadly compared and jointly analyzed by national data centers with the goals of monitoring populations and managing habitat. Twenty years later, we revisit these standards because (1) they have not been universally followed and (2) new methods allow estimation of absolute abundance from point counts, but these methods generally require data beyond the original standards to account for imperfect detection. Lack of standardization and the complications it introduces for analysis become apparent from aggregated data. For example, only 3% of 196,000 point counts conducted during the period 1992-2011 across Alaska and Canada followed the standards recommended for the count period and count radius. Ten-minute, unlimited-count-radius surveys increased the number of birds detected by >300% over 3-minute, 50-m-radius surveys. This effect size, which could be eliminated by standardized sampling, was ≥10 times the published effect sizes of observers, time of day, and date of the surveys. We suggest that the recommendations by Ralph et al. (1995a) continue to form the common standards when conducting point counts. This protocol is inexpensive and easy to follow but still allows the surveys to be adjusted for detection probabilities. Investigators might optionally collect additional information so that they can analyze their data with more flexible forms of removal and time-of-detection models, distance sampling, multiple-observer methods, repeated counts, or combinations of these methods. Maintaining the common standards as a base protocol, even as these study-specific modifications are added, will maximize the value of point-count data, allowing compilation and analysis by regional and national data centers.
A frequency standard via spectrum analysis and direct digital synthesis
NASA Astrophysics Data System (ADS)
Li, Dawei; Shi, Daiting; Hu, Ermeng; Wang, Yigen; Tian, Lu; Zhao, Jianye; Wang, Zhong
2014-11-01
We demonstrated a frequency standard based on a detuned coherent population beating phenomenon. In this phenomenon, the beat frequency between the radio frequency used for laser modulation and the hyperfine splitting can be obtained by digital signal processing technology. After analyzing the spectrum of the beat frequency, the fluctuation information is obtained and applied to compensate for the frequency shift, generating the standard frequency by the digital synthesis method. Frequency instability of 2.6 × 10⁻¹² at 1000 s is observed in our preliminary experiment. By eliminating the phase-locking loop, the method will enable us to achieve a fully digital frequency standard with remarkable stability.
Status and analysis of test standard for on-board charger
NASA Astrophysics Data System (ADS)
Hou, Shuai; Liu, Haiming; Jiang, Li; Chen, Xichen; Ma, Junjie; Zhao, Bing; Wu, Zaiyuan
2018-05-01
This paper analyzes the test standards for on-board chargers (OBC). In the process of testing, we found several problems in test methods and functional status, such as failure to follow the latest test standards, loose estimation, and uncertainty and inconsistency in rectification. Finally, we put forward some viewpoints of our own on these problems.
Standardized Analytical Methods for Environmental Restoration Following Homeland Security Events
USDA-ARS?s Scientific Manuscript database
Methodology was formulated for use in the event of a terrorist attack using a variety of chemical, radioactive, biological, and toxic agents. Standardized analysis procedures were determined for use should these events occur. This publication is annually updated....
Cristiano, Bárbara F G; Delgado, José Ubiratan; da Silva, José Wanderley S; de Barros, Pedro D; de Araújo, Radier M S; Dias, Fábio C; Lopes, Ricardo T
2012-09-01
The potentiometric titration method was used for characterization of uranium compounds to be applied in intercomparison programs. The method is applied with traceability assured using a potassium dichromate primary standard. A semi-automatic version was developed to reduce the analysis time and the operator variation. The standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization and compatible with those obtained by manual techniques. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Moeeni, Hamid; Bonakdari, Hossein; Fatemi, Seyed Ehsan
2017-04-01
Because time series stationarization plays a key role in stochastic modeling results, three methods for eliminating the periodic effect on time series stationarity are analyzed in this study: seasonal differencing, seasonal standardization, and spectral analysis. First, six time series, including four streamflow series and two water temperature series, are stationarized. The stochastic term for these series is subsequently modeled with ARIMA. For the analysis, 9228 models are introduced. It is observed that seasonal standardization and spectral analysis eliminate the periodic term completely, while seasonal differencing maintains seasonal correlation structures. The obtained results indicate that all three methods present acceptable performance overall. However, model accuracy in monthly streamflow prediction is higher with seasonal differencing than with the other two methods. Another advantage of seasonal differencing over the other methods is that the monthly streamflow is never estimated as negative. Standardization is the best method for predicting monthly water temperature, although it is quite similar to seasonal differencing, while spectral analysis performed the weakest in all cases. It is concluded that for each monthly seasonal series, seasonal differencing is the best stationarization method in terms of periodic-effect elimination. Moreover, monthly water temperature is predicted with more accuracy than monthly streamflow. The ratios of the average stochastic term to the amplitude of the periodic term obtained for monthly streamflow and monthly water temperature were 0.19 and 0.30, 0.21 and 0.13, and 0.07 and 0.04, respectively.
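The two best-performing stationarization methods are simple enough to state as code; a sketch for monthly series (period 12), with the spectral approach omitted:

```python
import numpy as np

def seasonal_difference(x, period=12):
    """Seasonal differencing: x_t - x_{t-period}."""
    x = np.asarray(x, float)
    return x[period:] - x[:-period]

def seasonal_standardize(x, period=12):
    """Remove each calendar month's mean and scale by its standard deviation."""
    x = np.asarray(x, float)
    out = np.empty_like(x)
    for m in range(period):
        block = x[m::period]
        out[m::period] = (block - block.mean()) / block.std(ddof=1)
    return out
```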
Wang, Wenguang; Ma, Xiaoli; Guo, Xiaoyu; Zhao, Mingbo; Tu, Pengfei; Jiang, Yong
2015-09-18
To relieve the bottleneck that the shortage of reference standards creates for comprehensive quality control of traditional Chinese medicines (TCMs), a series of strategies was proposed, including one single reference standard to determine multi-compounds (SSDMC), quantitative analysis by standardized reference extract (QASRE), and quantitative nuclear magnetic resonance spectroscopy (qNMR); Mahoniae Caulis was selected as an example to develop and validate these methods for the simultaneous determination of four alkaloids: columbamine, jatrorrhizine, palmatine, and berberine. Comprehensive comparisons among these methods and with the conventional external standard method (ESM) were carried out. The relative expanded uncertainty of measurement was used for the first time to compare their credibility. The results showed that all three newly developed methods can accurately accomplish the quantification using only one purified reference standard, but each has its own advantages and disadvantages as well as a specific application scope, which are also discussed in detail in this paper. Copyright © 2015 Elsevier B.V. All rights reserved.
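A sketch of the SSDMC idea in its generic form: each analyte is quantified against the one available standard through a pre-established relative response factor f_i = (A_i/C_i)/(A_s/C_s). The peak areas and factor values below are hypothetical, not the paper's data:

```python
def ssdmc_quantify(peak_areas, area_std, conc_std, rel_factors):
    """Quantify analytes against a single reference standard:
    C_i = A_i * C_s / (f_i * A_s)."""
    return {name: area * conc_std / (rel_factors[name] * area_std)
            for name, area in peak_areas.items()}

# Hypothetical HPLC peak areas, with berberine as the single standard:
print(ssdmc_quantify({"columbamine": 1.2e5, "jatrorrhizine": 2.3e5},
                     area_std=5.0e5, conc_std=40.0,
                     rel_factors={"columbamine": 0.92, "jatrorrhizine": 1.05}))
```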
Anneken, David; Striebich, Richard; DeWitt, Matthew J; Klingshirn, Christopher; Corporan, Edwin
2015-03-01
Aircraft turbine engines are a significant source of particulate matter (PM) and gaseous emissions in the vicinity of airports and military installations. Hazardous air pollutants (HAPs) (e.g., formaldehyde, benzene, naphthalene and other compounds) associated with aircraft emissions are an environmental concern both in flight and at ground level. Therefore, effective sampling, identification, and accurate measurement of these trace species are important to assess their environmental impact. This effort evaluates two established ambient air sampling and analysis methods, U.S. Environmental Protection Agency (EPA) Method TO-11A and National Institute for Occupational Safety and Health (NIOSH) Method 1501, for potential use to quantify HAPs from aircraft turbine engines. The techniques were used to perform analysis of the exhaust from a T63 turboshaft engine, and were examined using certified gas standards transferred through the heated sampling systems used for engine exhaust gaseous emissions measurements. Test results show that the EPA Method TO-11A (for aldehydes) and NIOSH Method 1501 (for semivolatile hydrocarbons) were effective techniques for the sampling and analysis of most HAPs of interest. Both methods showed reasonable extraction efficiencies of HAP species from the sorbent tubes, with the exception of acrolein, styrene, and phenol, which were not well quantified. Formaldehyde measurements using dinitrophenylhydrazine (DNPH) tubes (EPA method TO-11A) were accurate for gas-phase standards, and compared favorably to measurements using gas-phase Fourier-transform infrared (FTIR) spectroscopy. In general, these two standard methodologies proved to be suitable techniques for field measurement of turbine engine HAPs within a reasonable (5-10 minutes) sampling period. Details of the tests, the analysis methods, calibration procedures, and results from the gas standards and T63 engine tested using a conventional JP-8 jet fuel are provided. HAPs from aviation-related sources are important because of their adverse health and environmental impacts in and around airports and flight lines. Simpler, more convenient techniques to measure the important HAPs, especially aldehydes and volatile organic HAPs, are needed to provide information about their occurrence and assist in the development of engines that emit fewer harmful emissions.
Ohno, Yoshiharu; Koyama, Hisanobu; Yoshikawa, Takeshi; Kishida, Yuji; Seki, Shinichiro; Takenaka, Daisuke; Yui, Masao; Miyazaki, Mitsue; Sugimura, Kazuro
2017-08-01
Purpose: To compare the capability of pulmonary thin-section magnetic resonance (MR) imaging with ultrashort echo time (UTE) with that of standard- and reduced-dose thin-section computed tomography (CT) in nodule detection and evaluation of nodule type. Materials and Methods: The institutional review board approved this study, and written informed consent was obtained from each patient. Standard- and reduced-dose chest CT (60 and 250 mA) and MR imaging with UTE were used to examine 52 patients; 29 were men (mean age, 66.4 years ± 7.3 [standard deviation]; age range, 48-79 years) and 23 were women (mean age, 64.8 years ± 10.1; age range, 42-83 years). Probability of nodule presence was assessed for all methods with a five-point visual scoring system. All nodules were then classified as missed, ground-glass, part-solid, or solid nodules. To compare the nodule detection capability of the three methods, consensus performance was rated by using jackknife free-response receiver operating characteristic analysis, and κ analysis was used to compare intermethod agreement for nodule type classification. Results: There was no significant difference (F = 0.70, P = .59) in figure of merit between methods (standard-dose CT, 0.86; reduced-dose CT, 0.84; MR imaging with UTE, 0.86). There was no significant difference in sensitivity between methods (standard-dose CT vs reduced-dose CT, P = .50; standard-dose CT vs MR imaging with UTE, P = .50; reduced-dose CT vs MR imaging with UTE, P > .99). Intermethod agreement was excellent (standard-dose CT vs reduced-dose CT, κ = 0.98, P < .001; standard-dose CT vs MR imaging with UTE, κ = 0.98, P < .001; reduced-dose CT vs MR imaging with UTE, κ = 0.99, P < .001). Conclusion: Pulmonary thin-section MR imaging with UTE was useful in nodule detection and evaluation of nodule type, and it is considered at least as efficacious as standard- or reduced-dose thin-section CT. © RSNA, 2017 Online supplemental material is available for this article.
Nonlinear Analysis of Surface EMG Time Series
NASA Astrophysics Data System (ADS)
Zurcher, Ulrich; Kaufman, Miron; Sung, Paul
2004-04-01
Applications of nonlinear analysis of surface electromyography time series of patients with and without low back pain are presented. Limitations of the standard methods based on the power spectrum are discussed.
16 CFR 309.10 - Alternative vehicle fuel rating.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Analysis of Natural Gas by Gas Chromatography.” For the purposes of this section, fuel ratings for the... methods set forth in ASTM D 1946-90, “Standard Practice for Analysis of Reformed Gas by Gas Chromatography... the principal component of compressed natural gas are to be determined in accordance with test methods...
16 CFR 309.10 - Alternative vehicle fuel rating.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Analysis of Natural Gas by Gas Chromatography.” For the purposes of this section, fuel ratings for the... methods set forth in ASTM D 1946-90, “Standard Practice for Analysis of Reformed Gas by Gas Chromatography... the principal component of compressed natural gas are to be determined in accordance with test methods...
A Multicomponent UV Analysis of ["alpha"]- and ["beta"]-Acids in Hops
ERIC Educational Resources Information Center
Egts, Haley; Durben, Dan J.; Dixson, John A.; Zehfus, Micheal H.
2012-01-01
A method is presented for the determination of ["alpha"]- and ["beta"]-acids (humulones and lupulones) in a hops sample using a multicomponent UV spectroscopic analysis of a methanolic hop extract. When compared with standard methods, this lab can be considered "greener" because it uses smaller volumes of safer solvents (methanol instead of…
A SAS Interface for Bayesian Analysis with WinBUGS
ERIC Educational Resources Information Center
Zhang, Zhiyong; McArdle, John J.; Wang, Lijuan; Hamagami, Fumiaki
2008-01-01
Bayesian methods are becoming very popular despite some practical difficulties in implementation. To assist in the practical application of Bayesian methods, we show how to implement Bayesian analysis with WinBUGS as part of a standard set of SAS routines. This implementation procedure is first illustrated by fitting a multiple regression model…
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
In this report, the scope of the tests, the method of analysis, the results, and the conclusions are discussed. The first test indicated that the requirements generated by the Standard procedures and formulae appear to yield reasonable results, although some of the cost data provided as defaults in the Standard should be reevaluated. The second test provided experience that was useful in modifying the points compliance format, but did not uncover any procedural issues that would lead to unreasonable results. These conclusions are based on analysis using the Automated Residential Energy Standard (ARES) computer program, developed to simplify the process of standards generation.
Soil analysis based on samples withdrawn from different volumes: correlation versus calibration
Lucian Wielopolski; Kurt Johnsen; Yuen Zhang
2010-01-01
Soil, particularly in forests, is replete with spatial variation with respect to soil C. The present standard chemical method for soil analysis by dry combustion (DC) is destructive, and comprehensive sampling is labor intensive and time consuming. These, among other factors, are contributing to the development of new methods for soil analysis. These include a near...
[Modified Delphi method in the constitution of school sanitation standard].
Yin, Xunqiang; Liang, Ying; Tan, Hongzhuan; Gong, Wenjie; Deng, Jing; Luo, Jiayou; Di, Xiaokang; Wu, Yue
2012-11-01
To constitute a school sanitation standard using the modified Delphi method, and to explore the feasibility and advantages of the Delphi method in the constitution of school sanitation standards. Two rounds of expert consultation were adopted in this study. The data were analyzed with SPSS 15.0 to screen indices for the school sanitation standard. Thirty-two experts completed the two rounds of consultation. The average length of expert service was (24.69 ± 8.53) years. The authority coefficient was 0.729 ± 0.172. The expert positive coefficient was 94.12% (32/34) in the first round and 100% (32/32) in the second round. The coordination coefficients of importance, feasibility, and rationality in the second round were 0.493 (P<0.05), 0.527 (P<0.01), and 0.535 (P<0.01), respectively, suggesting unanimous expert opinions. According to the second round of consultation, 38 indices were included in the framework. Theoretical analysis, literature review, and investigation are the methods generally used in health standard constitution at present. The Delphi method is a rapid, effective, and feasible method in this field.
Hancewicz, Thomas M; Xiao, Chunhong; Zhang, Shuliang; Misra, Manoj
2013-12-01
In vivo confocal Raman spectroscopy has become the measurement technique of choice for the skin health and skin care communities as a way of measuring functional chemistry aspects of skin that are key indicators for the care and treatment of various skin conditions. Chief among these measurements are stratum corneum water content, a critical health indicator for severe skin conditions related to dryness, and natural moisturizing factor components that are associated with skin protection and barrier health. In addition, in vivo Raman spectroscopy has proven to be a rapid and effective method for quantifying component penetration in skin for topically applied skin care formulations. The benefit of such a capability is that noninvasive analytical chemistry can be performed in vivo in a clinical setting, significantly simplifying studies aimed at evaluating product performance. This presumes, however, that the data and analysis methods used are compatible and appropriate for the intended purpose. The standard analysis method used by most researchers for in vivo Raman data is ordinary least squares (OLS) regression. The focus of the work described in this paper is the applicability of OLS for in vivo Raman analysis, with particular attention given to its use on non-ideal data that often violate the assumptions required for proper application of OLS. We then describe a newly developed in vivo Raman spectroscopic analysis methodology called multivariate curve resolution-augmented ordinary least squares (MCR-OLS), a relatively simple route to addressing many of the issues with OLS. The method is compared with the standard OLS method using the same in vivo Raman data set and using both qualitative and quantitative comparisons based on model fit error, adherence to known data constraints, and performance against calibration samples. A clear improvement is shown in each comparison for MCR-OLS over standard OLS, thus supporting the premise that the MCR-OLS method is better suited for general-purpose multicomponent analysis of in vivo Raman spectral data. This suggests that the methodology is more readily adaptable to a wide range of component systems and is thus more generally applicable than standard OLS.
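For context, the standard OLS step the paper starts from amounts to a linear unmixing of the measured spectrum against pure-component reference spectra; a bare sketch is below (MCR-OLS adds constraints and data-derived components on top of this, which are not reproduced here):

```python
import numpy as np

def ols_component_fit(K, m):
    """Least-squares component loadings c solving m ~ K @ c.

    K: (n_wavenumbers, n_components) pure-component reference spectra
    m: (n_wavenumbers,) measured in vivo spectrum at one depth
    """
    c, residuals, rank, _ = np.linalg.lstsq(K, m, rcond=None)
    return c
```

When the references in K do not span everything actually present in skin, the fitted loadings absorb the mismatch, which is the kind of non-ideal behavior the MCR augmentation is meant to address.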
7 CFR 160.17 - Laboratory analysis.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 3 2011-01-01 2011-01-01 false Laboratory analysis. 160.17 Section 160.17 Agriculture... STANDARDS FOR NAVAL STORES Methods of Analysis, Inspection, Sampling and Grading § 160.17 Laboratory analysis. The analysis and laboratory testing of naval stores shall be conducted, so far as is practicable...
7 CFR 160.17 - Laboratory analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 3 2010-01-01 2010-01-01 false Laboratory analysis. 160.17 Section 160.17 Agriculture... STANDARDS FOR NAVAL STORES Methods of Analysis, Inspection, Sampling and Grading § 160.17 Laboratory analysis. The analysis and laboratory testing of naval stores shall be conducted, so far as is practicable...
Cabinetmaker. Occupational Analysis Series.
ERIC Educational Resources Information Center
Chinien, Chris; Boutin, France
This document contains the analysis of the occupation of cabinetmaker, or joiner, that is accepted by the Canadian Council of Directors as the national standard for the occupation. The front matter preceding the analysis includes exploration of the development of the analysis, structure of the analysis, validation method, scope of the cabinetmaker…
A catalog of automated analysis methods for enterprise models.
Florez, Hector; Sánchez, Mario; Villalobos, Jorge
2016-01-01
Enterprise models are created for documenting and communicating the structure and state of the business and information technology elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and due to the size and complexity of the models, the process can be complicated, making omissions and miscalculations very likely. This situation has fostered research into automated analysis methods to support analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels, so some analysis methods might not be applicable to all enterprise models. This paper presents a compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.
Urinary Amino Acid Analysis: A Comparison of iTRAQ®-LC-MS/MS, GC-MS, and Amino Acid Analyzer
Kaspar, Hannelore; Dettmer, Katja; Chan, Queenie; Daniels, Scott; Nimkar, Subodh; Daviglus, Martha L.; Stamler, Jeremiah; Elliott, Paul; Oefner, Peter J.
2009-01-01
Urinary amino acid analysis is typically done by cation-exchange chromatography followed by post-column derivatization with ninhydrin and UV detection. This method lacks throughput and specificity. Two recently introduced stable isotope ratio mass spectrometric methods promise to overcome those shortcomings. Using two blinded sets of urine replicates and a certified amino acid standard, we compared the precision and accuracy of gas chromatography/mass spectrometry (GC-MS) and liquid chromatography-tandem mass spectrometry (LC-MS/MS) of propyl chloroformate and iTRAQ® derivatized amino acids, respectively, to conventional amino acid analysis. The GC-MS method builds on the direct derivatization of amino acids in diluted urine with propyl chloroformate, GC separation and mass spectrometric quantification of derivatives using stable isotope labeled standards. The LC-MS/MS method requires prior urinary protein precipitation followed by labeling of urinary and standard amino acids with iTRAQ® tags containing different cleavable reporter ions distinguishable by MS/MS fragmentation. Means and standard deviations of percent technical error (%TE) computed for 20 amino acids determined by amino acid analyzer, GC-MS, and iTRAQ®-LC-MS/MS analyses of 33 duplicate and triplicate urine specimens were 7.27±5.22, 21.18±10.94, and 18.34±14.67, respectively. Corresponding values for 13 amino acids determined in a second batch of 144 urine specimens measured in duplicate or triplicate were 8.39±5.35, 6.23±3.84, and 35.37±29.42. Both GC-MS and iTRAQ®-LC-MS/MS are suited for high-throughput amino acid analysis, with the former offering at present higher reproducibility and completely automated sample pretreatment, while the latter covers more amino acids and related amines. PMID:19481989
Urinary amino acid analysis: a comparison of iTRAQ-LC-MS/MS, GC-MS, and amino acid analyzer.
Kaspar, Hannelore; Dettmer, Katja; Chan, Queenie; Daniels, Scott; Nimkar, Subodh; Daviglus, Martha L; Stamler, Jeremiah; Elliott, Paul; Oefner, Peter J
2009-07-01
Urinary amino acid analysis is typically done by cation-exchange chromatography followed by post-column derivatization with ninhydrin and UV detection. This method lacks throughput and specificity. Two recently introduced stable isotope ratio mass spectrometric methods promise to overcome those shortcomings. Using two blinded sets of urine replicates and a certified amino acid standard, we compared the precision and accuracy of gas chromatography/mass spectrometry (GC-MS) and liquid chromatography-tandem mass spectrometry (LC-MS/MS) of propyl chloroformate and iTRAQ derivatized amino acids, respectively, to conventional amino acid analysis. The GC-MS method builds on the direct derivatization of amino acids in diluted urine with propyl chloroformate, GC separation and mass spectrometric quantification of derivatives using stable isotope labeled standards. The LC-MS/MS method requires prior urinary protein precipitation followed by labeling of urinary and standard amino acids with iTRAQ tags containing different cleavable reporter ions distinguishable by MS/MS fragmentation. Means and standard deviations of percent technical error (%TE) computed for 20 amino acids determined by amino acid analyzer, GC-MS, and iTRAQ-LC-MS/MS analyses of 33 duplicate and triplicate urine specimens were 7.27 ± 5.22, 21.18 ± 10.94, and 18.34 ± 14.67, respectively. Corresponding values for 13 amino acids determined in a second batch of 144 urine specimens measured in duplicate or triplicate were 8.39 ± 5.35, 6.23 ± 3.84, and 35.37 ± 29.42. Both GC-MS and iTRAQ-LC-MS/MS are suited for high-throughput amino acid analysis, with the former offering at present higher reproducibility and completely automated sample pretreatment, while the latter covers more amino acids and related amines.
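The abstract does not spell out its %TE formula; one common formulation of the technical error of measurement from paired duplicates (the Dahlberg formula, assumed here for illustration) is TEM = sqrt(Σd²/2n), reported relative to the grand mean:

```python
import numpy as np

def percent_technical_error(dup1, dup2):
    """Percent technical error from paired duplicate measurements.

    Uses the classical Dahlberg formula TEM = sqrt(sum(d^2) / (2n)),
    expressed as a percentage of the grand mean; the paper may define
    %TE differently, so treat this as an illustrative assumption.
    """
    d = np.asarray(dup1) - np.asarray(dup2)
    tem = np.sqrt((d ** 2).sum() / (2 * len(d)))
    grand_mean = (np.asarray(dup1) + np.asarray(dup2)).mean() / 2
    return 100 * tem / grand_mean

# Hypothetical duplicate glycine concentrations (umol/L) in 6 urine specimens
a = [112.0, 95.5, 130.2, 88.7, 104.9, 121.3]
b = [109.4, 97.1, 127.8, 91.0, 103.2, 124.0]
print(f"%TE = {percent_technical_error(a, b):.2f}")
```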
Identifying inaccuracy of MS Project using system analysis
NASA Astrophysics Data System (ADS)
Fachrurrazi; Husin, Saiful; Malahayati, Nurul; Irzaidi
2018-05-01
The problem encountered in project owners' financial accounting reports is the difference between total project costs computed by MS Project and by the Indonesian Standard (SNI, the Cost Estimating Standard Book of Indonesia). It is one of the MS Project problems concerning cost accuracy, so cost data cannot be used in an integrated way for all project components. This study focuses on finding the causes of the inaccuracy of MS Project. The operational aims of this study are: (i) identifying cost analysis procedures for both the current method (SNI) and MS Project; (ii) identifying cost bias in each element of the cost analysis procedure; and (iii) analysing the cost differences (cost bias) in each element to identify the cause of MS Project's inaccuracy relative to SNI. The method in this study compares the system analyses of MS Project and SNI. The results are: (i) the Work of Resources field in MS Project is limited to two decimal digits, which leads to inaccuracy; the Work of Resources (referred to as effort) in MS Project corresponds to the product of the quantity of an activity and its resource requirement in SNI; (ii) MS Project and SNI differ in costing method: SNI uses quantity-based costing (QBC), whereas MS Project uses time-based costing (TBC). Based on this research, we recommend that contractors who use SNI apply a correction index to the Work of Resources in MS Project so that it can be integrated with the project owner's financial accounting system. Further research will address improving MS Project as an integrated tool for all project participants.
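The reported two-decimal limitation can be reproduced with a toy quantity-based versus rounded-effort costing calculation (all figures invented, not taken from the study):

```python
# Quantity-based costing (SNI style): cost = quantity * coefficient * unit rate
quantity = 137.5       # m2 of wall, hypothetical
coefficient = 0.3267   # worker-days per m2, hypothetical SNI coefficient
rate = 120_000         # currency units per worker-day

exact_effort = quantity * coefficient           # 44.92125 worker-days
rounded_effort = round(exact_effort, 2)         # MS Project keeps 2 decimals

cost_sni = exact_effort * rate
cost_msp = rounded_effort * rate
print(f"SNI cost:        {cost_sni:,.2f}")
print(f"MS Project cost: {cost_msp:,.2f}")
print(f"bias per item:   {cost_msp - cost_sni:,.2f}")  # accumulates over many items
```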
Standard methods for chemical analysis of steel, cast iron, open-hearth iron, and wrought iron
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1973-01-01
Methods are described for determining manganese, phosphorus, sulfur, selenium, copper, nickel, chromium, vanadium, tungsten, titanium, lead, boron, molybdenum (alpha-benzoin oxime method), zirconium (cupferron-phosphate method), niobium and tantalum (hydrolysis with perchloric and sulfurous acids; gravimetric, titrimetric, and photometric methods), and beryllium (oxide method). (DHM)
Influence analysis in quantitative trait loci detection.
Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko
2014-07-01
This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods: the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown to be useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics. © 2014 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
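The diagnostics described above are, at heart, case-deletion perturbations of the LOD score; a generic jackknife-style sketch (a toy single-marker LOD, not the authors' closed-form influence functions):

```python
import numpy as np

def lod_score(pheno, geno):
    """Toy single-marker LOD based on the R^2 of phenotype on genotype."""
    r = np.corrcoef(pheno, geno)[0, 1]
    n = len(pheno)
    return -n / 2 * np.log10(1 - r ** 2)

def case_deletion_influence(pheno, geno):
    """Change in LOD when each individual is removed (leave-one-out)."""
    full = lod_score(pheno, geno)
    return np.array([full - lod_score(np.delete(pheno, i), np.delete(geno, i))
                     for i in range(len(pheno))])

rng = np.random.default_rng(2)
geno = rng.integers(0, 3, 200).astype(float)       # F2 genotypes coded 0/1/2
pheno = 0.4 * geno + rng.standard_normal(200)      # simulated phenotype
infl = case_deletion_influence(pheno, geno)
print("most influential individuals:", np.argsort(-np.abs(infl))[:5])
```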
Hondrogiannis, Ellen M; Ehrlinger, Erin; Poplaski, Alyssa; Lisle, Meredith
2013-11-27
A total of 11 elements found in 25 vanilla samples from Uganda, Madagascar, Indonesia, and Papua New Guinea were measured by laser ablation-inductively coupled plasma-time-of-flight-mass spectrometry (LA-ICP-TOF-MS) for the purpose of collecting data that could be used to discriminate among the origins. Pellets were prepared from the samples, and elemental concentrations were obtained on the basis of external calibration curves created using five National Institute of Standards and Technology (NIST) standards and one Chinese standard with (13)C internal standardization. These curves were validated using NIST 1573a (tomato leaves) as a check standard. Discriminant analysis was used to successfully classify the vanilla samples by their origin. Our method illustrates the feasibility of using LA-ICP-TOF-MS with an external calibration curve for high-throughput screening analysis of spices.
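The discriminant-analysis step can be sketched with scikit-learn; the element panel below is synthetic stand-in data, not the measured concentrations:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
origins = ["Uganda", "Madagascar", "Indonesia", "Papua New Guinea"]

# Synthetic 11-element concentration profiles, one cluster per origin
X = np.vstack([rng.normal(loc=10 + 3 * i, scale=1.0, size=(25, 11))
               for i in range(len(origins))])
y = np.repeat(origins, 25)

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)   # cross-validated origin classification
print(f"cross-validated accuracy: {scores.mean():.2f}")
```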
Sampling methods for microbiological analysis of red meat and poultry carcasses.
Capita, Rosa; Prieto, Miguel; Alonso-Calleja, Carlos
2004-06-01
Microbiological analysis of carcasses at slaughterhouses is required in the European Union for evaluating the hygienic performance of carcass production processes as required for effective hazard analysis critical control point implementation. The European Union microbial performance standards refer exclusively to the excision method, even though swabbing using the wet/dry technique is also permitted when correlation between both destructive and nondestructive methods can be established. For practical and economic reasons, the swab technique is the most extensively used carcass surface-sampling method. The main characteristics, advantages, and limitations of the common excision and swabbing methods are described here.
Exploiting salient semantic analysis for information retrieval
NASA Astrophysics Data System (ADS)
Luo, Jing; Meng, Bo; Quan, Changqin; Tu, Xinhui
2016-11-01
Recently, many Wikipedia-based methods have been proposed to improve the performance of different natural language processing (NLP) tasks, such as semantic relatedness computation, text classification and information retrieval. Among these methods, salient semantic analysis (SSA) has been proven to be an effective way to generate conceptual representations for words or documents. However, its feasibility and effectiveness in information retrieval are largely unknown. In this paper, we study how to use SSA efficiently to improve information retrieval performance, and propose an SSA-based retrieval method under the language model framework. First, the SSA model is adopted to build conceptual representations for documents and queries. Then, these conceptual representations and the bag-of-words (BOW) representations are used in combination to estimate the language models of queries and documents. Experimental results on several standard text retrieval conference (TREC) collections show that the proposed models consistently outperform the existing Wikipedia-based retrieval methods.
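Combining the two representations under the language-model framework typically means interpolating two query-likelihood estimates; a hedged sketch (the paper's exact mixture and smoothing are assumptions here):

```python
from collections import Counter

def lm_score(query_terms, doc_terms, collection, mu=2000):
    """Dirichlet-smoothed query likelihood over one representation."""
    doc, coll = Counter(doc_terms), Counter(collection)
    coll_len = sum(coll.values())
    score = 1.0
    for t in query_terms:
        score *= (doc[t] + mu * coll[t] / coll_len) / (len(doc_terms) + mu)
    return score

def combined_score(q_words, d_words, q_concepts, d_concepts, coll_w, coll_c, lam=0.6):
    """Linear interpolation of word-level and concept-level language models."""
    return (lam * lm_score(q_words, d_words, coll_w)
            + (1 - lam) * lm_score(q_concepts, d_concepts, coll_c))

# Toy usage: surface words plus SSA-style concept labels (invented)
coll_w = "the cat sat on the mat the dog ran".split()
coll_c = ["animal", "furniture", "animal", "motion"]
print(combined_score(["cat"], ["cat", "sat"], ["animal"], ["animal"], coll_w, coll_c))
```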
Johnson, R.G.; Wandless, G.A.
1984-01-01
A new method is described for determining carrier yield in the radiochemical neutron activation analysis of rare-earth elements in silicate rocks by group separation. The method involves the determination of the rare-earth elements present in the carrier by means of energy-dispersive X-ray fluorescence analysis, eliminating the need to re-irradiate samples in a nuclear reactor after the gamma-ray analysis is complete. Results from the analysis of USGS standards AGV-1 and BCR-1 compare favorably with those obtained using the conventional method. © 1984 Akadémiai Kiadó.
Precise determination of N-acetylcysteine in pharmaceuticals by microchip electrophoresis.
Rudašová, Marína; Masár, Marián
2016-01-01
A novel microchip electrophoresis method for the rapid and high-precision determination of N-acetylcysteine, a pharmaceutically active ingredient, in mucolytics has been developed. Isotachophoresis separations were carried out at pH 6.0 on a microchip with conductivity detection. The methods of external calibration and internal standard were used to evaluate the results. The internal standard method effectively eliminated variations in various working parameters, mainly run-to-run fluctuations of the injected volume. The repeatability and accuracy of N-acetylcysteine determination in all mucolytic preparations tested (Solmucol 90 and 200, and ACC Long 600) were more than satisfactory, with relative standard deviation and relative error values <0.7 and <1.9%, respectively. A recovery range of 99-101% for N-acetylcysteine in the analyzed pharmaceuticals further qualifies the proposed method for accurate analysis. This work, in general, indicates the analytical potential of microchip isotachophoresis for the quantitative analysis of simplified samples, such as pharmaceuticals, that contain the analyte(s) at relatively high concentrations. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
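The internal standard evaluation normalizes each analyte signal by the co-injected standard before calibration, so injection-volume drift cancels in the ratio; a generic sketch (all numbers invented):

```python
import numpy as np

# Calibration: N-acetylcysteine standards with a fixed internal-standard amount
conc = np.array([50, 100, 200, 400, 800])          # ug/mL, hypothetical levels
area_analyte = np.array([1.1e4, 2.3e4, 4.4e4, 9.1e4, 1.79e5])
area_istd = np.array([5.0e4, 5.2e4, 4.9e4, 5.1e4, 5.0e4])  # varies with injection

ratio = area_analyte / area_istd                   # injection fluctuations cancel
slope, intercept = np.polyfit(conc, ratio, 1)

# Unknown sample measured against the same internal standard
sample_ratio = 6.5e4 / 5.05e4
print(f"estimated conc: {(sample_ratio - intercept) / slope:.1f} ug/mL")
```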
Growth rate measurement in free jet experiments
NASA Astrophysics Data System (ADS)
Charpentier, Jean-Baptiste; Renoult, Marie-Charlotte; Crumeyrolle, Olivier; Mutabazi, Innocent
2017-07-01
An experimental method was developed to measure the growth rate of the capillary instability for free liquid jets. The method uses a standard shadowgraph imaging technique to visualize a jet, produced by extruding a liquid through a circular orifice, and a statistical analysis of the entire jet. The analysis relies on the computation of the standard deviation of a set of jet profiles obtained under the same experimental conditions. The principle and robustness of the method are illustrated with a set of emulated jet profiles. The method is also applied to free falling jet experiments conducted for various Weber numbers and two low-viscosity solutions: a Newtonian and a viscoelastic one. Growth rate measurements are found to be in good agreement with linear stability theory in the Rayleigh regime, as expected from previous studies. In addition, the standard deviation curve is used to obtain an indirect measurement of the initial perturbation amplitude and to identify beads-on-a-string structures on the jet. This last result demonstrates the capability of the present technique to explore the dynamics of viscoelastic liquid jets in future work.
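In the linear regime the profile standard deviation grows exponentially, so the growth rate falls out of a log-linear fit; a minimal sketch with synthetic data (all values invented):

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0.0, 5e-3, 50)                 # time (s) along the falling jet
omega_true = 800.0                             # capillary growth rate (1/s)

# Std of repeated jet profiles at each station: eps0 * exp(omega * t) + noise
sigma = 1e-6 * np.exp(omega_true * t) * (1 + 0.05 * rng.standard_normal(t.size))

# Fit log(sigma) = log(eps0) + omega * t over the exponential-growth window
omega_fit, log_eps0 = np.polyfit(t, np.log(sigma), 1)
print(f"growth rate ~ {omega_fit:.0f} 1/s, "
      f"initial amplitude ~ {np.exp(log_eps0):.2e} m")
```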
NASA Technical Reports Server (NTRS)
Gurgiolo, Chris; Vinas, Adolfo F.
2009-01-01
This paper presents a spherical harmonic analysis of the plasma velocity distribution function using high-angular, energy, and time resolution Cluster data obtained from the PEACE spectrometer instrument to demonstrate how this analysis models the particle distribution function and its moments and anisotropies. The results show that spherical harmonic analysis produced a robust physical representation model of the velocity distribution function, resolving the main features of the measured distributions. From the spherical harmonic analysis, a minimum set of nine spectral coefficients was obtained from which the moment (up to the heat flux), anisotropy, and asymmetry calculations of the velocity distribution function were obtained. The spherical harmonic method provides a potentially effective "compression" technique that can be easily carried out onboard a spacecraft to determine the moments and anisotropies of the particle velocity distribution function for any species. These calculations were implemented using three different approaches, namely, the standard traditional integration, the spherical harmonic (SPH) spectral coefficients integration, and the singular value decomposition (SVD) on the spherical harmonic methods. A comparison among the various methods shows that both SPH and SVD approaches provide remarkable agreement with the standard moment integration method.
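The SPH spectral coefficients are projections of the measured distribution onto spherical harmonics; a minimal quadrature sketch, assuming scipy.special.sph_harm (renamed sph_harm_y in newer SciPy) and an invented angular distribution:

```python
import numpy as np
from scipy.special import sph_harm

# Angular grid: theta = azimuth [0, 2pi), phi = polar [0, pi] (scipy convention)
theta, phi = np.meshgrid(np.linspace(0, 2 * np.pi, 64, endpoint=False),
                         np.linspace(0, np.pi, 33), indexing="ij")
dtheta, dphi = 2 * np.pi / 64, np.pi / 32

# Toy "velocity distribution" over the sphere: isotropic part plus anisotropy
f = 1.0 + 0.5 * np.cos(phi) ** 2

def coeff(l, m):
    """a_lm = integral of f * conj(Y_lm) over solid angle (approximate quadrature)."""
    Y = sph_harm(m, l, theta, phi)
    return np.sum(f * np.conj(Y) * np.sin(phi) * dtheta * dphi)

for l in range(3):
    print(l, [f"{abs(coeff(l, m)):.3f}" for m in range(-l, l + 1)])
```

In the paper's scheme, a small set of such coefficients (nine in their case) already carries the moments and anisotropies, which is what makes onboard "compression" attractive.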
The role of explicit and implicit standards in visual speed discrimination.
Norman, J Farley; Pattison, Kristina F; Norman, Hideko F; Craft, Amy E; Wiesemann, Elizabeth Y; Taylor, M Jett
2008-01-01
Five experiments were designed to investigate visual speed discrimination. Variations of the method of constant stimuli were used to obtain speed discrimination thresholds in experiments 1, 2, 4, and 5, while the method of single stimuli was used in experiment 3. The observers' thresholds were significantly influenced by the choice of psychophysical method and by changes in the standard speed. The observers' judgments were unaffected, however, by changes in the magnitude of random variations in stimulus duration, reinforcing the conclusions of Lappin et al (1975 Journal of Experimental Psychology: Human Perception and Performance 1, 383-394). When an implicit standard was used, the observers produced relatively low discrimination thresholds (7.0% of the standard speed), verifying the results of McKee (1981 Vision Research 21, 491-500). When an explicit standard was used in a 2AFC variant of the method of constant stimuli, however, the observers' discrimination thresholds increased by 74% (to 12.2%), resembling the high thresholds obtained by Mandriota et al (1962 Science 138, 437-438). A subsequent signal-detection analysis revealed that the observers' actual sensitivities to differences in speed were in fact equivalent for both psychophysical methods. The formation of an implicit standard in the method of single stimuli allows human observers to make judgments of speed that are as precise as those obtained when explicit standards are available.
Accuracy of Blood Loss Measurement during Cesarean Delivery.
Doctorvaladan, Sahar V; Jelks, Andrea T; Hsieh, Eric W; Thurer, Robert L; Zakowski, Mark I; Lagrew, David C
2017-04-01
Objective This study aims to compare the accuracy of visual, quantitative gravimetric, and colorimetric methods used to determine blood loss during cesarean delivery procedures employing a hemoglobin extraction assay as the reference standard. Study Design In 50 patients having cesarean deliveries blood loss determined by assays of hemoglobin content on surgical sponges and in suction canisters was compared with obstetricians' visual estimates, a quantitative gravimetric method, and the blood loss determined by a novel colorimetric system. Agreement between the reference assay and other measures was evaluated by the Bland-Altman method. Results Compared with the blood loss measured by the reference assay (470 ± 296 mL), the colorimetric system (572 ± 334 mL) was more accurate than either visual estimation (928 ± 261 mL) or gravimetric measurement (822 ± 489 mL). The correlation between the assay method and the colorimetric system was more predictive (standardized coefficient = 0.951, adjusted R2 = 0.902) than either visual estimation (standardized coefficient = 0.700, adjusted R2 = 0.479) or the gravimetric determination (standardized coefficient = 0.564, adjusted R2 = 0.304). Conclusion During cesarean delivery, measuring blood loss using colorimetric image analysis is superior to visual estimation and a gravimetric method. Implementation of colorimetric analysis may enhance the ability of management protocols to improve clinical outcomes.
Accuracy of Blood Loss Measurement during Cesarean Delivery
Doctorvaladan, Sahar V.; Jelks, Andrea T.; Hsieh, Eric W.; Thurer, Robert L.; Zakowski, Mark I.; Lagrew, David C.
2017-01-01
Objective This study aims to compare the accuracy of visual, quantitative gravimetric, and colorimetric methods used to determine blood loss during cesarean delivery procedures employing a hemoglobin extraction assay as the reference standard. Study Design In 50 patients having cesarean deliveries blood loss determined by assays of hemoglobin content on surgical sponges and in suction canisters was compared with obstetricians' visual estimates, a quantitative gravimetric method, and the blood loss determined by a novel colorimetric system. Agreement between the reference assay and other measures was evaluated by the Bland–Altman method. Results Compared with the blood loss measured by the reference assay (470 ± 296 mL), the colorimetric system (572 ± 334 mL) was more accurate than either visual estimation (928 ± 261 mL) or gravimetric measurement (822 ± 489 mL). The correlation between the assay method and the colorimetric system was more predictive (standardized coefficient = 0.951, adjusted R2 = 0.902) than either visual estimation (standardized coefficient = 0.700, adjusted R2 = 0.479) or the gravimetric determination (standardized coefficient = 0.564, adjusted R2 = 0.304). Conclusion During cesarean delivery, measuring blood loss using colorimetric image analysis is superior to visual estimation and a gravimetric method. Implementation of colorimetric analysis may enhance the ability of management protocols to improve clinical outcomes. PMID:28497007
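Agreement by the Bland-Altman method reduces to the mean difference (bias) and its 1.96-SD limits of agreement; a minimal sketch with invented blood-loss pairs:

```python
import numpy as np

# Hypothetical paired blood-loss values (mL): reference assay vs colorimetric
reference = np.array([310, 450, 520, 700, 390, 610, 480, 830])
colorimetric = np.array([350, 470, 560, 760, 430, 650, 540, 900])

diff = colorimetric - reference
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)      # half-width of the limits of agreement

print(f"bias: {bias:.1f} mL")
print(f"95% limits of agreement: {bias - loa:.1f} to {bias + loa:.1f} mL")
```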
Dahlquist, Robert T.; Reyner, Karina; Robinson, Richard D.; Farzad, Ali; Laureano-Phillips, Jessica; Garrett, John S.; Young, Joseph M.; Zenarosa, Nestor R.; Wang, Hao
2018-01-01
Background Emergency department (ED) shift handoffs are potential sources of delay in care. We aimed to determine the impact that using a standardized reporting tool and process may have on throughput metrics for patients undergoing a transition of care at shift change. Methods We performed a prospective, pre- and post-intervention quality improvement study from September 1 to November 30, 2015. A handoff procedure intervention, including a mandatory workshop and personnel training on a standard reporting system template, was implemented. The primary endpoint was patient length of stay (LOS). Differences in patient LOS across the various handoff communication methods were assessed pre- and post-intervention. Communication methods were entered into a multivariable logistic regression model independently as risk factors for patient LOS. Results The final analysis included 1,006 patients, with 327 comprising the pre-intervention and 679 comprising the post-intervention populations. Bedside rounding occurred 45% of the time without a standard reporting system in the pre-intervention period and increased to 85% of the time with the use of a standard reporting system in the post-intervention period (P < 0.001). Provider time (provider-initiated care to patient care completed) averaged 297 min in the pre-intervention period and decreased to 265 min in the post-intervention period (P < 0.001). After adjusting for other communication methods, the use of a standard reporting system during handoff was associated with shortened ED LOS (OR = 0.60, 95% CI 0.40 - 0.90, P < 0.05). Conclusions Standard reporting system use during emergency physician handoffs at shift change improves ED throughput efficiency and is associated with shorter ED LOS. PMID:29581808
ERIC Educational Resources Information Center
Oladele, Babatunde
2017-01-01
The aim of the current study is to analyse the 2014 Post-UTME scores of candidates at the University of Ibadan towards the establishment of cut-off scores using two methods of standard setting. Prospective candidates who seek admission to higher institutions are often denied admission through the Post-UTME exercise. There is no single recommended…
Rocha, José Celso; Passalia, Felipe José; Matos, Felipe Delestro; Takahashi, Maria Beatriz; Ciniciato, Diego de Souza; Maserati, Marc Peter; Alves, Mayra Fernanda; de Almeida, Tamie Guibu; Cardoso, Bruna Lopes; Basso, Andrea Cristina; Nogueira, Marcelo Fábio Gouveia
2017-08-09
Morphological analysis is the standard method of assessing embryo quality; however, its inherent subjectivity tends to generate discrepancies among evaluators. Using genetic algorithms and artificial neural networks (ANNs), we developed a new method for embryo analysis that is more robust and reliable than standard methods. Bovine blastocysts produced in vitro were classified as grade 1 (excellent or good), 2 (fair), or 3 (poor) by three experienced embryologists according to the International Embryo Technology Society (IETS) standard. The images (n = 482) were subjected to automatic feature extraction, and the results were used as input for a supervised learning process. Part of the dataset (15%) was reserved for a blind test after fitting, on which the system achieved an accuracy of 76.4%. Interestingly, when the same embryologists re-evaluated a sub-sample (10%) of the dataset, there was only 54.0% agreement with the standard (mode of the grades). However, when the ANN assessed this sub-sample, there was 87.5% agreement with the modal values obtained by the evaluators. The presented methodology is covered by National Institute of Industrial Property (INPI) and World Intellectual Property Organization (WIPO) patents and is currently undergoing a commercial evaluation of its feasibility.
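A hedged sketch of the supervised step: image-derived features feeding a small neural network classifier (a scikit-learn stand-in for the paper's genetic-algorithm-tuned ANN; features and labels below are random placeholders):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(5)

# Synthetic stand-in for automatically extracted blastocyst image features
X = rng.standard_normal((482, 36))
y = rng.integers(1, 4, 482)        # IETS-style grades 1, 2, 3 (random here)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.15, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(24,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print(f"hold-out accuracy: {clf.score(X_te, y_te):.3f}")  # ~chance on random labels
```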
HPLC fingerprint analysis combined with chemometrics for pattern recognition of ginger.
Feng, Xu; Kong, Weijun; Wei, Jianhe; Ou-Yang, Zhen; Yang, Meihua
2014-03-01
Ginger, the fresh rhizome of Zingiber officinale Rosc. (Zingiberaceae), is used worldwide; however, there has long been no internationally approved standard for its quality control. The aim was to establish an effective combined method and pattern recognition technique for the quality control of ginger. A simple, accurate and reliable method based on high-performance liquid chromatography with photodiode array (HPLC-PDA) detection was developed to establish the chemical fingerprints of 10 batches of ginger from different markets in China. The method was validated in terms of precision, reproducibility and stability, with relative standard deviations all less than 1.57%. On the basis of this method, the fingerprints of the 10 batches of ginger samples were obtained, showing 16 common peaks. Using similarity evaluation software, the similarities between each sample fingerprint and the simulative mean chromatogram were in the range of 0.998-1.000. The chemometric techniques of similarity analysis, hierarchical clustering analysis and principal component analysis were then applied to classify the ginger samples. Consistent results showed that the ginger samples could be successfully classified into two groups. This study revealed that the HPLC-PDA method is simple, sensitive and reliable for fingerprint analysis and, moreover, for pattern recognition and quality control of ginger.
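The similarity evaluation between each fingerprint and the simulative mean chromatogram is essentially a correlation or cosine measure on aligned peak vectors, followed by unsupervised grouping; a minimal sketch with invented peak areas:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)

# Rows: 10 ginger batches; columns: areas of the 16 common fingerprint peaks
peaks = np.abs(rng.normal(loc=100, scale=10, size=(10, 16)))
mean_chrom = peaks.mean(axis=0)                    # simulative mean chromatogram

def cosine_similarity(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

sims = [cosine_similarity(row, mean_chrom) for row in peaks]
print("similarities:", np.round(sims, 3))

scores = PCA(n_components=2).fit_transform(peaks)  # score plot for grouping batches
print("PC1/PC2 scores of batch 1:", scores[0].round(2))
```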
Prediction models for clustered data: comparison of a random intercept and standard regression model
2013-01-01
Background When study data are clustered, standard regression analysis is considered inappropriate and analytical techniques for clustered data need to be used. For prediction research in which interest in predictor effects is at the patient level, random effect regression models are probably preferred over standard regression analysis. It is well known that random effect parameter estimates and standard logistic regression parameter estimates differ. Here, we compared random effect and standard logistic regression models for their ability to provide accurate predictions. Methods Using an empirical study on 1642 surgical patients at risk of postoperative nausea and vomiting, who were treated by one of 19 anesthesiologists (clusters), we developed prognostic models with either standard or random intercept logistic regression. External validity of these models was assessed in new patients from other anesthesiologists. We supported our results with simulation studies using intra-class correlation coefficients (ICC) of 5%, 15%, or 30%. Standard performance measures and measures adapted to the clustered data structure were estimated. Results The model developed with random effect analysis showed better discrimination than the standard approach if the cluster effects were used for risk prediction (standard c-index of 0.69 versus 0.66). In the external validation set, both models showed similar discrimination (standard c-index 0.68 versus 0.67). The simulation study confirmed these results. For datasets with a high ICC (≥15%), model calibration was adequate in external subjects only if the performance measure assumed the same data structure as the model development method: standard calibration measures showed good calibration for the model developed with standard regression, while calibration measures adapted to the clustered data structure showed good calibration for the prediction model with random intercept. Conclusion Models with a random intercept discriminate better than the standard model only if the cluster effect is used for predictions. The prediction model with random intercept had good calibration within clusters. PMID:23414436
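The discrimination gain from using cluster effects can be illustrated by simulation: fit a standard logistic model, then add the (here simulated, known) anesthesiologist intercept to the linear predictor before computing the c-index. This is a toy simulation, not the authors' estimation procedure:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n, n_clusters = 1642, 19
cluster = rng.integers(0, n_clusters, n)
u = rng.normal(0, 1.0, n_clusters)          # random intercepts (ICC-driving)
X = rng.standard_normal((n, 3))             # patient-level predictors

logit = -0.5 + X @ np.array([0.8, -0.6, 0.4]) + u[cluster]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(X, y)      # standard (cluster-blind) model
lp = model.decision_function(X)
print(f"c-index, standard model:       {roc_auc_score(y, lp):.3f}")
print(f"c-index, with cluster effects: {roc_auc_score(y, lp + u[cluster]):.3f}")
```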
Error analysis on squareness of multi-sensor integrated CMM for the multistep registration method
NASA Astrophysics Data System (ADS)
Zhao, Yan; Wang, Yiwen; Ye, Xiuling; Wang, Zhong; Fu, Luhua
2018-01-01
The multistep registration (MSR) method in [1] registers two different classes of sensors deployed on the z-arm of a CMM (coordinate measuring machine): a video camera and a tactile probe sensor. In general, it is difficult to obtain a very precise registration result with a single common standard; instead, this method measures two different standards fixed on a steel plate at a constant distance from each other. Although many factors have been considered, such as the measuring ability of the sensors, the uncertainty of the machine, and the number of data pairs, there has been no exact analysis of the squareness between the x-axis and the y-axis in the xy plane. For this reason, an error analysis on the squareness of the multi-sensor integrated CMM for the multistep registration method is made here to examine the validity of the MSR method. Synthetic experiments on the xy-plane squareness for the simplified MSR with an inclination rotation are simulated, leading to a regular result. Experiments were carried out with the multi-standard device designed in [1], together with inspections of the xy plane using a laser interferometer. The final results conform to the simulations, and the squareness errors of the MSR method are similar to the interferometer results. In other words, the MSR method can also be used to verify the squareness of a CMM.
Ojanperä, Suvi; Rasanen, Ilpo; Sistonen, Johanna; Pelander, Anna; Vuori, Erkki; Ojanperä, Ilkka
2007-08-01
Lack of availability of reference standards for drug metabolites, newly released drugs, and illicit drugs hinders the analysis of these substances in biologic samples. To counter this problem, an approach is presented here for quantitative drug analysis in plasma without primary reference standards by liquid chromatography-chemiluminescence nitrogen detection (LC-CLND). To demonstrate the feasibility of the method, metabolic ratios of the opioid drug tramadol were determined in the setting of a pharmacogenetic study. Four volunteers were given a single 100-mg oral dose of tramadol, and a blood sample was collected from each subject 1 hour later. Tramadol, O-desmethyltramadol, and nortramadol were determined in plasma by LC-CLND without reference standards and by a gas chromatography-mass spectrometry reference method. In contrast to previous CLND studies lacking an extraction step, a liquid-liquid extraction system was created for 5-mL plasma samples using n-butyl chloride-isopropyl alcohol (98 + 2) at pH 10. Extraction recovery estimation was based on model compounds chosen according to their similar physicochemical characteristics (retention time, pKa, logD). Instrument calibration was performed with a single secondary standard (caffeine) using the equimolar response of the detector to nitrogen. The mean differences between the results of the LC-CLND and gas chromatography-mass spectrometry methods for tramadol, O-desmethyltramadol, and nortramadol were 8%, 32%, and 19%, respectively. The sensitivity of LC-CLND was sufficient for therapeutic concentrations of tramadol and metabolites. A good correlation was obtained between genotype, expressed by the number of functional genes, and the plasma metabolite ratios. This experiment suggests that a recovery-corrected LC-CLND analysis produces sufficiently accurate results to be useful in a clinical context, particularly in instances in which reference standards are not readily accessible.
Protein quantification using a cleavable reporter peptide.
Duriez, Elodie; Trevisiol, Stephane; Domon, Bruno
2015-02-06
Peptide and protein quantification based on isotope dilution and mass spectrometry analysis is widely employed for the measurement of biomarkers and in systems biology applications. The accuracy and reliability of such quantitative assays depend on the quality of the stable-isotope labeled standards. Although quantification using stable-isotope labeled peptides is precise, the accuracy of the results can be severely biased by the purity of the internal standards, their stability and formulation, and the determination of their concentration. Here we describe a rapid and cost-efficient method to recalibrate stable-isotope labeled peptides in a single LC-MS analysis. The method is based on the equimolar release of a protein reference peptide (used as a surrogate for the protein of interest) and a universal reporter peptide during trypsinization of a concatenated polypeptide standard. The quality and accuracy of data generated with such concatenated polypeptide standards are highlighted by the quantification of two clinically important proteins in urine samples and compared with results obtained with conventional stable-isotope labeled reference peptides. Furthermore, the application of the UCRP standards in complex samples is described.
Granja, Rodrigo H M M; Salerno, Alessandro G; de Lima, Andreia C; Montalvo, Cynthia; Reche, Karine V G; Giannotti, Fabio M; Wanschel, Amarylis C B A
2014-01-01
Boldenone, an androgenic steroid, is forbidden for use in meat production in most countries worldwide. Residues of this drug in food present a potential risk to consumers. A sensitive LC/MS/MS method for analysis of 17β-boldenone using boldenone-d3 as an internal standard was developed. An enzymatic hydrolysis and extraction using ethyl acetate, methanol, and hexane were performed in the sample preparation. Parameters such as decision limit (CCα), detection capability (CCβ), precision, recovery, and ruggedness were evaluated according to the Brazilian Regulation 24/2009 (equivalent to European Union Decision 2002/657/EC) and International Organization for Standardization/International Electrotechnical Commission 17025:2005. CCα and CCβ were determined to be 0.17 and 0.29 μg/kg, respectively. Average recoveries from bovine liver samples fortified with 1, 1.5, and 2 μg/kg were around 100%. A complete statistical analysis was performed on the results obtained, including an estimation of the method uncertainty. The method is considered robust after being subjected to day-to-day analytical variations and has been used as a standard method in Brazil to report boldenone levels in bovine liver.
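One common formulation of the Decision 2002/657/EC limits for banned substances derives CCα from blank variability at α = 1% and CCβ at β = 5%; the laboratory's exact calibration-based procedure may differ, so treat this as an illustrative assumption (blank values invented):

```python
import numpy as np

# Hypothetical measured concentrations (ug/kg) in 20 blank bovine liver samples
blanks = np.array([0.02, 0.05, 0.00, 0.04, 0.06, 0.03, 0.01, 0.05, 0.02, 0.04,
                   0.03, 0.06, 0.02, 0.05, 0.04, 0.01, 0.03, 0.05, 0.02, 0.04])

s = blanks.std(ddof=1)
cc_alpha = blanks.mean() + 2.33 * s   # decision limit (alpha = 1%)
cc_beta = cc_alpha + 1.64 * s         # detection capability (beta = 5%)
print(f"CCalpha = {cc_alpha:.3f} ug/kg, CCbeta = {cc_beta:.3f} ug/kg")
```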
In Search of a Time Efficient Approach to Crack and Delamination Growth Predictions in Composites
NASA Technical Reports Server (NTRS)
Krueger, Ronald; Carvalho, Nelson
2016-01-01
Analysis benchmarking was used to assess the accuracy and time efficiency of algorithms suitable for automated delamination growth analysis. First, the Floating Node Method (FNM) was introduced and its combination with a simple exponential growth law (Paris Law) and the Virtual Crack Closure Technique (VCCT) was discussed. Implementation of the method into a user element (UEL) in Abaqus/Standard® was also presented. For the assessment of growth prediction capabilities, an existing benchmark case based on the Double Cantilever Beam (DCB) specimen was briefly summarized. Additionally, the development of new benchmark cases based on the Mixed-Mode Bending (MMB) specimen to assess the growth prediction capabilities under mixed-mode I/II conditions was discussed in detail. A comparison was presented, in which the benchmark cases were used to assess the existing low-cycle fatigue analysis tool in Abaqus/Standard® in comparison to the FNM-VCCT fatigue growth analysis implementation. The low-cycle fatigue analysis tool in Abaqus/Standard® was able to yield results that were in good agreement with the DCB benchmark example. Results for the MMB benchmark cases, however, only captured the trend correctly. The user element (FNM-VCCT) always yielded results that were in excellent agreement with all benchmark cases, at a fraction of the analysis time. The ability to assess the implementation of two methods in one finite element code illustrated the value of establishing benchmark solutions.
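The growth-law side of the FNM-VCCT pairing is a Paris-type relation between the energy release rate (from VCCT) and delamination growth per cycle; a hedged cycle-integration sketch with invented constants and a toy surrogate for the VCCT output:

```python
# Paris-law parameters (hypothetical): da/dN = C * (Gmax / Gc) ** m
C, m = 1e-3, 6.0     # mm/cycle and exponent, invented
Gc = 0.17            # critical energy release rate, kJ/m^2, invented

def g_max(a):
    """Toy surrogate for the VCCT energy release rate at crack length a (mm)."""
    return 0.02 + 0.002 * a

a, n, da_target = 25.0, 0, 0.5   # initial length (mm), cycle count, growth step
while a < 50.0 and n < 10_000_000:
    rate = C * (g_max(a) / Gc) ** m          # mm per cycle at the current length
    cycles = max(int(da_target / rate), 1)   # advance in fixed growth increments
    a += rate * cycles
    n += cycles

print(f"delamination length {a:.1f} mm after {n} cycles")
```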
Ansermot, Nicolas; Rudaz, Serge; Brawand-Amey, Marlyse; Fleury-Souverain, Sandrine; Veuthey, Jean-Luc; Eap, Chin B
2009-08-01
Matrix effects, which represent an important issue in liquid chromatography coupled to mass spectrometry or tandem mass spectrometry detection, should be closely assessed during method development. In the case of quantitative analysis, the use of a stable isotope-labelled internal standard with physico-chemical properties and ionization behaviour similar to the analyte is recommended. In this paper, an example of the choice of a co-eluting deuterated internal standard to compensate for short-term and long-term matrix effects in the chiral quantification of (R,S)-methadone in plasma is reported. The method was fully validated over a concentration range of 5-800 ng/mL for each methadone enantiomer with satisfactory relative bias (-1.0 to 1.0%), repeatability (0.9-4.9%) and intermediate precision (1.4-12.0%). From the results obtained during validation, a control chart process covering 52 series of routine analysis was established using both the intermediate precision standard deviation and FDA acceptance criteria. The results of routine quality control samples were generally included within the ±15% variability around the target value, and mainly within the two-standard-deviation interval, illustrating the long-term stability of the method. The intermediate precision variability estimated in method validation was found to be consistent with the routine use of the method. During this period, 257 trough-concentration and 54 peak-concentration plasma samples from patients undergoing (R,S)-methadone treatment were successfully analysed for routine therapeutic drug monitoring.
ERIC Educational Resources Information Center
Dondlinger, Mary Jo; McLeod, Julie; Vasinda, Sheri
2016-01-01
This article explores links between student experiences with technology-rich mathematics instruction and the ISTE Standards for Students. Research methods applied constructivist grounded theory to analyze data from student interviews against the ISTE Standards for Students to identify which elements of the design of this learning environment…
NASA Astrophysics Data System (ADS)
Zhang, Zhu; Li, Hongbin; Tang, Dengping; Hu, Chen; Jiao, Yang
2017-10-01
Metering performance is the key parameter of an electronic voltage transformer (EVT), and it requires high accuracy. The conventional off-line calibration method using a standard voltage transformer is not suitable for key equipment in a smart substation, which needs on-line monitoring. In this article, we propose a method for monitoring the metering performance of an EVT on-line based on cyber-physics correlation analysis. Exploiting the electrical and physical properties of a substation operating in three-phase symmetry, the principal component analysis method is used to separate metering deviations caused by primary-side fluctuations from those caused by EVT anomalies. Characteristic statistics of the measured data during operation are extracted, and the metering performance of the EVT is evaluated by analyzing changes in these statistics. The experimental results show that the method accurately monitors the metering deviation of a Class 0.2 EVT. The method demonstrates accurate on-line evaluation of the metering performance of an EVT without a standard voltage transformer.
Delay correlation analysis and representation for vital complaint VHDL models
Rich, Marvin J.; Misra, Ashutosh
2004-11-09
A method and system unbind a rise/fall tuple of a VHDL generic variable and create rise time and fall time generics of each generic variable that are independent of each other. Then, according to a predetermined correlation policy, the method and system collect delay values in a VHDL standard delay file, sort the delay values, remove duplicate delay values, group the delay values into correlation sets, and output an analysis file. The correlation policy may include collecting all generic variables in a VHDL standard delay file, selecting each generic variable, and performing reductions on the set of delay values associated with each selected generic variable.
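The correlation policy described above (collect, sort, deduplicate, group into correlation sets) maps naturally onto a small script; here is a hedged sketch over a toy list of (generic, delay) pairs rather than a real SDF parser:

```python
from collections import defaultdict

# Toy stand-in for delay values collected from a VHDL standard delay file (SDF)
collected = [("tpd_clk_q_rise", 0.42), ("tpd_clk_q_fall", 0.42),
             ("tpd_a_y_rise", 0.31), ("tpd_a_y_fall", 0.35),
             ("tpd_b_y_rise", 0.31), ("tpd_clk_q_rise", 0.42)]

# Sort and remove duplicate (generic, value) pairs
unique = sorted(set(collected), key=lambda gv: (gv[1], gv[0]))

# Correlation policy: generics sharing the same delay value form one set
correlation_sets = defaultdict(list)
for generic, value in unique:
    correlation_sets[value].append(generic)

for value, generics in correlation_sets.items():
    print(f"{value:.2f} ns: {generics}")
```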
Pérez, Teresa; Makrestsov, Nikita; Garatt, John; Torlakovic, Emina; Gilks, C Blake; Mallett, Susan
The Canadian Immunohistochemistry Quality Control program monitors clinical laboratory performance for estrogen receptor and progesterone receptor tests used in breast cancer treatment management in Canada. Current methods assess sensitivity and specificity at each time point, compared with a reference standard. We investigate alternative performance analysis methods to enhance the quality assessment. We used 3 methods of analysis: meta-analysis of sensitivity and specificity of each laboratory across all time points; sensitivity and specificity at each time point for each laboratory; and fitting models for repeated measurements to examine differences between laboratories adjusted by test and time point. Results show 88 laboratories participated in quality control at up to 13 time points using typically 37 to 54 histology samples. In meta-analysis across all time points no laboratories have sensitivity or specificity below 80%. Current methods, presenting sensitivity and specificity separately for each run, result in wide 95% confidence intervals, typically spanning 15% to 30%. Models of a single diagnostic outcome demonstrated that 82% to 100% of laboratories had no difference to reference standard for estrogen receptor and 75% to 100% for progesterone receptor, with the exception of 1 progesterone receptor run. Laboratories with significant differences to reference standard identified with Generalized Estimating Equation modeling also have reduced performance by meta-analysis across all time points. The Canadian Immunohistochemistry Quality Control program has a good design, and with this modeling approach has sufficient precision to measure performance at each time point and allow laboratories with a significantly lower performance to be targeted for advice.
The Second International Standard for Penicillin*
Humphrey, J. H.; Mussett, M. V.; Perry, W. L. M.
1953-01-01
In 1950 the Department of Biological Standards, National Institute for Medical Research, London, was authorized by the WHO Expert Committee on Biological Standardization to prepare the Second International Standard for Penicillin. A single batch of specially recrystallized sodium penicillin G was obtained and 11 laboratories in seven different countries were requested to take part in its collaborative assay. 112 assays were carried out, of which 101 were done by cup-plate methods using either Staphylococcus aureus or Bacillus subtilis. The results were subjected to standard methods of analysis, on the basis of which the authors define the Second International Standard for Penicillin as containing 1,670 International Units (IU) per mg, with limits of error (P = 0.05) of 1,666-1,674 IU/mg. The International Unit is therefore redefined as the activity contained in 0.0005988 mg of the Second International Standard for Penicillin. PMID:13082387
El Yazbi, Fawzy A; Hassan, Ekram M; Khamis, Essam F; Ragab, Marwa A A; Hamdy, Mohamed M A
2017-11-15
Ketorolac tromethamine (KTC) with phenylephrine hydrochloride (PHE) as a binary mixture (mixture 1), and their ternary mixture with chlorpheniramine maleate (CPM) (mixture 2), were analyzed using a validated HPLC-DAD method. The developed method was suitable for in vitro analysis as well as quantitative analysis of the targeted mixtures in rabbit aqueous humor. The analysis in dosage form (eye drops) was stability-indicating: the drugs were separated from possible degradation products arising from different stress conditions (in vitro analysis). For analysis in aqueous humor, guaifenesin (GUF) was used as the internal standard and the method was validated according to FDA regulations for analysis in biological fluids. An Agilent 5 HC-C18(2) 150×4.6mm column was used as the stationary phase with a gradient eluting solvent of 20mM phosphate buffer pH 4.6 containing 0.2% triethylamine, and acetonitrile. The drugs were resolved with retention times of 2.41, 5.26, 7.92 and 9.64min for PHE, GUF, KTC and CPM, respectively. The method was sensitive and selective enough to analyze the three drugs simultaneously in the presence of possible forced degradation products and dosage form excipients (in vitro analysis), and also, with the internal standard, in the presence of aqueous humor interferences (analysis in biological fluid), at a single wavelength (261nm). No extraction procedure was required for analysis in aqueous humor. The simplicity of the method emphasizes its capability to analyze the drugs in vivo (in rabbit aqueous humor) and in vitro (in pharmaceutical formulations). Copyright © 2017 Elsevier B.V. All rights reserved.
Sapozhnikova, Yelena; Lehotay, Steven J
2013-01-03
A multi-class, multi-residue method for the analysis of 13 novel flame retardants, 18 representative pesticides, 14 polychlorinated biphenyl (PCB) congeners, 16 polycyclic aromatic hydrocarbons (PAHs), and 7 polybrominated diphenyl ether (PBDE) congeners in catfish muscle was developed and evaluated using fast low pressure gas chromatography triple quadrupole tandem mass spectrometry (LP-GC/MS-MS). The method was based on a QuEChERS (quick, easy, cheap, effective, rugged, safe) extraction with acetonitrile and dispersive solid-phase extraction (d-SPE) clean-up with zirconium-based sorbent prior to LP-GC/MS-MS analysis. The developed method was evaluated at 4 spiking levels and further validated by analysis of NIST Standard Reference Materials (SRMs) 1974B and 1947. Sample preparation for a batch of 10 homogenized samples took about 1h/analyst, and LP-GC/MS-MS analysis provided fast separation of multiple analytes within 9min achieving high throughput. With the use of isotopically labeled internal standards, recoveries of all but one analyte were between 70 and 120% with relative standard deviations less than 20% (n=5). The measured values for both SRMs agreed with certified/reference values (72-119% accuracy) for the majority of analytes. The detection limits were 0.1-0.5ng g(-1) for PCBs, 0.5-10ng g(-1) for PBDEs, 0.5-5ng g(-1) for select pesticides and PAHs and 1-10ng g(-1) for flame retardants. The developed method was successfully applied for analysis of catfish samples from the market. Published by Elsevier B.V.
Qualitative Research in Career Development: Content Analysis from 1990 to 2009
ERIC Educational Resources Information Center
Stead, Graham B.; Perry, Justin C.; Munka, Linda M.; Bonnett, Heather R.; Shiban, Abbey P.; Care, Esther
2012-01-01
A content analysis of 11 journals that published career, vocational, and work-related articles from 1990 to 2009 was conducted. Of 3,279 articles analyzed, 55.9% used quantitative methods and 35.5% were theoretical/conceptual articles. Only 6.3% used qualitative research methods. Among the qualitative empirical studies, standards of academic rigor…
Zhu, Yuying; Wang, Jianmin; Wang, Cunfang
2018-05-01
Using fresh goat milk as the raw material, after filtration, centrifugation, hollow-fiber ultrafiltration, formula allocation, value assignment and preparation processing, a set of 10 mixed goat milk standard substances was prepared on a one-factor-at-a-time basis using a uniform design method, and their accuracy, uniformity and stability were evaluated by paired t-tests and the F-test of one-way analysis of variance. The results showed that the three milk component contents of these standard substances were mutually independent, and that preparation using the quasi-level design method without emulsifier was the best scheme. Compared with values from a rapid milk analyzer calibrated with cow milk standards, calibration with the mixed goat milk standards was more applicable to rapid detection of goat milk composition; detected values were more accurate and showed smaller deviations. One-way analysis of variance showed good uniformity and stability of the mixed standard substances; they could be stored for 15 days at 4°C. The within-unit and between-unit uniformity and stability met the requirements for the preparation of national standard substances. © 2018 Japanese Society of Animal Science.
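Between-unit uniformity testing of a reference material is typically a one-way ANOVA across units; a minimal scipy sketch with invented fat-content replicates:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(8)

# Hypothetical fat content (%) measured in triplicate in 10 units of one standard
units = [rng.normal(3.50, 0.02, size=3) for _ in range(10)]

f_stat, p_value = f_oneway(*units)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
print("uniform between units" if p_value > 0.05 else "non-uniform")
```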
Lozano, Valeria A; Ibañez, Gabriela A; Olivieri, Alejandro C
2009-10-05
In the presence of analyte-background interactions and a significant background signal, both second-order multivariate calibration and standard addition are required for successful analyte quantitation achieving the second-order advantage. This report discusses a modified second-order standard addition method, in which the test data matrix is subtracted from the standard addition matrices, and quantitation proceeds via the classical external calibration procedure. It is shown that this novel data processing method allows one to apply not only parallel factor analysis (PARAFAC) and multivariate curve resolution-alternating least-squares (MCR-ALS), but also the recently introduced and more flexible partial least-squares (PLS) models coupled to residual bilinearization (RBL). In particular, the multidimensional variant N-PLS/RBL is shown to produce the best analytical results. The comparison is carried out with the aid of a set of simulated data, as well as two experimental data sets: one aimed at the determination of salicylate in human serum in the presence of naproxen as an additional interferent, and the second one devoted to the analysis of danofloxacin in human serum in the presence of salicylate.
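After the test matrix is subtracted, quantitation reverts to external calibration; classical univariate standard addition itself extrapolates the x-intercept, as in this generic sketch (all numbers invented):

```python
import numpy as np

# Standard additions of salicylate (added conc, ug/mL) and measured signals
added = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
signal = np.array([0.120, 0.171, 0.222, 0.324, 0.528])   # linear, invented

slope, intercept = np.polyfit(added, signal, 1)
conc_in_sample = intercept / slope     # magnitude of the x-intercept
print(f"analyte in test sample: {conc_in_sample:.2f} ug/mL")
```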
Fortini, Martina; Migliorini, Marzia; Cherubini, Chiara; Cecchi, Lorenzo; Calamai, Luca
2017-04-01
The commercial value of virgin olive oils (VOOs) strongly depends on their classification, which is also based on the aroma of the oils, usually evaluated by a panel test. A reliable analytical method is still needed to evaluate the volatile organic compounds (VOCs) and support the standard panel test method. To date, HS-SPME sampling coupled to GC-MS is generally accepted for the analysis of VOCs in VOOs. However, VOO is a challenging matrix due to the simultaneous presence of: i) compounds at ppm and ppb concentrations; ii) molecules belonging to different chemical classes; and iii) analytes with a wide range of molecular mass. Therefore, HS-SPME-GC-MS quantitation based upon the external standard method, or upon only a single internal standard (ISTD) for data normalization, may be troublesome. In this work, multiple internal standard normalization is proposed to overcome these problems and improve the quantitation of VOO VOCs. As many as 11 ISTDs were used for the quantitation of 71 VOCs. For each VOC the most suitable ISTD was selected, and good linearity over a wide calibration range was obtained. Except for E-2-hexenal, without an ISTD or with an unsuitable ISTD the linear range of calibration was narrower than that obtained with a suitable ISTD, confirming the usefulness of multiple internal standard normalization for the correct quantitation of the VOC profile in VOOs. The method was validated for 71 VOCs and then applied to a series of lampante virgin olive oils and extra virgin olive oils. In light of our results, we propose this analytical approach for routine quantitative analyses and to support sensorial analysis in the evaluation of positive and negative VOO attributes. Copyright © 2017 Elsevier B.V. All rights reserved.
Forcing scheme analysis for the axisymmetric lattice Boltzmann method under incompressible limit.
Zhang, Liangqi; Yang, Shiliang; Zeng, Zhong; Chen, Jie; Yin, Linmao; Chew, Jia Wei
2017-04-01
Because the standard lattice Boltzmann (LB) method is proposed for Cartesian Navier-Stokes (NS) equations, additional source terms are necessary in the axisymmetric LB method for representing the axisymmetric effects. Therefore, the accuracy and applicability of the axisymmetric LB models depend on the forcing schemes adopted for discretization of the source terms. In this study, three forcing schemes, namely, the trapezium rule based scheme, the direct forcing scheme, and the semi-implicit centered scheme, are analyzed theoretically by investigating their derived macroscopic equations in the diffusive scale. Particularly, the finite difference interpretation of the standard LB method is extended to the LB equations with source terms, and then the accuracy of different forcing schemes is evaluated for the axisymmetric LB method. Theoretical analysis indicates that the discrete lattice effects arising from the direct forcing scheme are part of the truncation error terms and thus would not affect the overall accuracy of the standard LB method with general force term (i.e., only the source terms in the momentum equation are considered), but lead to incorrect macroscopic equations for the axisymmetric LB models. On the other hand, the trapezium rule based scheme and the semi-implicit centered scheme both have the advantage of avoiding the discrete lattice effects and recovering the correct macroscopic equations. Numerical tests applied for validating the theoretical analysis show that both the numerical stability and the accuracy of the axisymmetric LB simulations are affected by the direct forcing scheme, which indicate that forcing schemes free of the discrete lattice effects are necessary for the axisymmetric LB method.
Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards
Smith, Justin D.
2013-01-01
This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have precluded widespread implementation and acceptance of the SCED as a viable complementary methodology to the predominant group design. This article includes a description of the research design, measurement, and analysis domains distinctive to the SCED; a discussion of the results within the framework of contemporary standards and guidelines in the field; and a presentation of updated benchmarks for key characteristics (e.g., baseline sampling, method of analysis), and overall, it provides researchers and reviewers with a resource for conducting and evaluating SCED research. The results of the systematic review of 409 studies suggest that recently published SCED research is largely in accordance with contemporary criteria for experimental quality. Analytic method emerged as an area of discord. Comparison of the findings of this review with historical estimates of the use of statistical analysis indicates an upward trend, but visual analysis remains the most common analytic method and also garners the most support amongst those entities providing SCED standards. Although consensus exists along key dimensions of single-case research design and researchers appear to be practicing within these parameters, there remains a need for further evaluation of assessment and sampling techniques and data analytic methods. PMID:22845874
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-06-01
The bibliography contains citations concerning standards and standard tests for water quality in drinking water sources, reservoirs, and distribution systems. Standards from domestic and international sources are presented. Glossaries and vocabularies that concern water quality analysis, testing, and evaluation are included. Standard test methods for individual elements, selected chemicals, sensory properties, radioactivity, and other chemical and physical properties are described. Discussions for proposed standards on new pollutant materials are briefly considered. (Contains a minimum of 203 citations and includes a subject term index and title list.)
Betty Petersen Memorial Library - NCWCP Publications - NWS
[Publication listing residue; recoverable entries: Polger, P., "Comparative Analysis of a New Integration Method With Certain Standard Methods"; Gerrity, J. (1978), "Elemental-Filter Design Considerations"; a truncated entry by Shuman, F. G.]
Use of focused ultrasonication in activity-based profiling of deubiquitinating enzymes in tissue.
Nanduri, Bindu; Shack, Leslie A; Rai, Aswathy N; Epperson, William B; Baumgartner, Wes; Schmidt, Ty B; Edelmann, Mariola J
2016-12-15
To develop a reproducible tissue lysis method that retains enzyme function for activity-based protein profiling, we compared four different methods of obtaining protein extracts from bovine lung tissue: focused ultrasonication, standard sonication, the mortar & pestle method, and homogenization combined with standard sonication. Focused ultrasonication and the mortar & pestle method were sufficiently effective for activity-based profiling of deubiquitinases in tissue, and focused ultrasonication also had the fastest processing time. We then used the focused ultrasonicator for subsequent activity-based proteomic analysis of deubiquitinases, to test the compatibility of this method with sample preparation for activity-based chemical proteomics. Copyright © 2016 Elsevier Inc. All rights reserved.
Marijnissen, A C A; Vincken, K L; Vos, P A J M; Saris, D B F; Viergever, M A; Bijlsma, J W J; Bartels, L W; Lafeber, F P J G
2008-02-01
Radiography is still the gold standard for imaging features of osteoarthritis (OA), such as joint space narrowing, subchondral sclerosis, and osteophyte formation. Objective assessment, however, remains difficult. The goal of the present study was to evaluate a novel digital method to analyse standard knee radiographs. Standardized radiographs of 20 healthy and 55 OA knees were taken in general practice according to the semi-flexed method of Buckland-Wright. Joint space width (JSW), osteophyte area, subchondral bone density, joint angle, and tibial eminence height were measured as continuous variables using newly developed Knee Images Digital Analysis (KIDA) software on a standard PC. Two observers evaluated the radiographs twice, each on two different occasions. The observers were blinded to the source of the radiographs and to their previous measurements. Statistical analysis to compare measurements within and between observers was performed according to Bland and Altman. Correlations between KIDA data and Kellgren & Lawrence (K&L) grade were calculated, and data from healthy knees were compared with those from OA knees. Intra- and inter-observer variations for measurement of JSW, subchondral bone density, osteophytes, tibial eminence, and joint angle were small. Significant correlations were found between KIDA parameters and K&L grade. Furthermore, significant differences were found between healthy and OA knees. In addition to JSW measurement, objective evaluation of osteophyte formation and subchondral bone density is possible on standard radiographs. The measured differences between OA and healthy individuals suggest that KIDA allows detection of changes over time, although sensitivity to change has to be demonstrated in long-term follow-up studies.
Valkenborg, Dirk; Baggerman, Geert; Vanaerschot, Manu; Witters, Erwin; Dujardin, Jean-Claude; Burzykowski, Tomasz; Berg, Maya
2013-01-01
Combining liquid chromatography-mass spectrometry (LC-MS)-based metabolomics experiments that were collected over a long period of time remains problematic due to systematic variability between LC-MS measurements. Until now, most normalization methods for LC-MS data have been model-driven, based on internal standards or intermediate quality control runs, where an external model is extrapolated to the dataset of interest. In the first part of this article, we evaluate several existing data-driven normalization approaches on LC-MS metabolomics experiments, which do not require the use of internal standards. According to variability measures, each normalization method performs relatively well, showing that the use of any normalization method will greatly improve data analysis originating from multiple experimental runs. In the second part, we apply cyclic loess normalization to a Leishmania sample. This normalization method allows the removal of systematic variability between two measurement blocks over time and maintains the differential metabolites. In conclusion, normalization allows for pooling datasets from different measurement blocks over time and increases the statistical power of the analysis, hence paving the way to increase the scale of LC-MS metabolomics experiments. From our investigation, we recommend data-driven normalization methods over model-driven normalization methods if only a few internal standards were used. Moreover, data-driven normalization methods are the best option for normalizing datasets from untargeted LC-MS experiments. PMID:23808607
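A minimal sketch of the data-driven idea for the two-run case follows, assuming log-transformed peak intensities aligned on the same features. The loess fit of the difference M against the average A estimates the intensity-dependent systematic bias, which is then split between the runs; a full cyclic loess iterates this over all pairs of runs. All data here are synthetic.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(1)
a = rng.normal(20, 3, 500)                       # "true" log-abundances
x1 = a + rng.normal(0, 0.2, 500)                 # run 1
x2 = a + 0.5 + 0.02*a + rng.normal(0, 0.2, 500)  # run 2 with systematic drift

M, A = x1 - x2, 0.5*(x1 + x2)
fit = lowess(M, A, frac=0.4, return_sorted=False)  # loess estimate of the bias
x1n, x2n = x1 - fit/2, x2 + fit/2                # remove half the bias from each run

print("mean |M| before: %.3f after: %.3f"
      % (np.abs(M).mean(), np.abs(x1n - x2n).mean()))
```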
Tanuja, Penmatsa; Venugopal, Namburi; Sashidhar, Rao Beedu
2007-01-01
A simple thin-layer chromatography-digital image-based analytical method has been developed for the quantitation of the botanical pesticide azadirachtin. The method was validated by analyzing azadirachtin in spiked food matrixes and processed commercial pesticide formulations, using acidified vanillin reagent as a postchromatographic derivatizing agent. The separated azadirachtin was clearly identified as a green spot. The Rf value was found to be 0.55, which was similar to that of a reference standard. A standard calibration plot was established using a reference standard, based on linear regression analysis [r2 = 0.996; y = 371.43 + (634.82)x]. The sensitivity of the method was found to be 0.875 microg azadirachtin. Spiking studies conducted at the 1 ppm (microg/g) level in various agricultural matrixes, such as brinjal, tomato, coffee, and cotton seeds, revealed recoveries of azadirachtin in the range of 67-92%. The azadirachtin content of commercial neem formulations analyzed by the method was in the range of 190-1825 ppm (microg/mL). Further, the present method was compared with an immunoanalytical method, an enzyme-linked immunosorbent assay, developed earlier in our laboratory. Statistical comparison of the 2 methods, using Fisher's F-test, indicated no significant difference in variance, suggesting that both methods are comparable.
Measurement of edge residual stresses in glass by the phase-shifting method
NASA Astrophysics Data System (ADS)
Ajovalasit, A.; Petrucci, G.; Scafidi, M.
2011-05-01
Control and measurement of residual stress in glass is of great importance in the industrial field. Since glass is a birefringent material, the residual stress analysis is based mainly on the photoelastic method. This paper considers two methods of automated analysis of membrane residual stress in glass sheets, based on the phase-shifting concept in monochromatic light. In particular these methods are the automated versions of goniometric compensation methods of Tardy and Sénarmont. The proposed methods can effectively replace manual methods of compensation (goniometric compensation of Tardy and Sénarmont, Babinet and Babinet-Soleil compensators) provided by current standards on the analysis of residual stresses in glasses.
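The numerical core shared by such automated compensation methods is phase demodulation from a set of phase-shifted intensity images. The sketch below shows the generic four-step algorithm on synthetic data; it only illustrates the arithmetic, not the specific optical arrangements of the Tardy and Sénarmont procedures automated in the paper.

```python
import numpy as np

def four_step_phase(I0, I90, I180, I270):
    """Wrapped phase from intensities acquired at shifts 0, pi/2, pi, 3*pi/2."""
    return np.arctan2(I270 - I90, I0 - I180)

# synthetic test: I_k = bias + cos(phase + shift_k)
phase = np.linspace(-3, 3, 7)                    # stand-in for the retardation map
I = [2 + np.cos(phase + s) for s in (0, np.pi/2, np.pi, 3*np.pi/2)]
print(np.allclose(four_step_phase(*I), phase))   # True (phase recovered exactly)
```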
NASA Astrophysics Data System (ADS)
Pomata, Donatella; Di Filippo, Patrizia; Riccardi, Carmela; Buiarelli, Francesca; Gallo, Valentina
2014-02-01
The organic component of airborne particulate matter originates from both natural and anthropogenic sources whose contributions can be identified through the analysis of chemical markers. The validation of analytical methods for the analysis of compounds used as chemical markers is of great importance, especially if they must be determined in rather complex matrices. Currently, standard reference materials (SRM) with certified values for all those analytes are not available. In this paper, we report a method for the simultaneous determination of levoglucosan and xylitol as tracers for biomass burning emissions, and arabitol, mannitol and ergosterol as biomarkers for airborne fungi, in SRM 1649a by GC/MS. Their quantitative analysis in SRM 1649a was carried out using both internal standard calibration curves and the standard addition method. A matrix effect was observed for all analytes, minor for levoglucosan and major for polyols and ergosterol. The results for levoglucosan, around 160 μg g-1, agreed with those reported by other authors, while no comparison was possible for xylitol (120 μg g-1), arabitol (15 μg g-1), mannitol (18 μg g-1), and ergosterol (0.5 μg g-1). The analytical method used for SRM 1649a was also applied to PM10 samples collected in Rome during four seasonal sampling campaigns. The ratios between annual analyte concentrations in the PM10 samples and in SRM 1649a were of the same order of magnitude, although the particulate matter samples were collected at two different sites and in different periods.
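The standard addition method used above corrects for exactly this kind of matrix effect. A minimal sketch of the underlying arithmetic: the analyte content of the aliquot is the magnitude of the x-intercept of the signal-versus-added-amount line. The values below are illustrative, not the paper's data.

```python
import numpy as np

added  = np.array([0.0, 10.0, 20.0, 40.0])   # spiked analyte (ng)
signal = np.array([1.52, 2.31, 3.05, 4.61])  # e.g., GC/MS peak response

slope, intercept = np.polyfit(added, signal, 1)
# x-intercept is -intercept/slope, so the analyte amount is intercept/slope
print("estimated analyte in aliquot: %.1f ng" % (intercept/slope))
```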
Comparison of macro-gravimetric and micro-colorimetric lipid determination methods.
Inouye, Laura S; Lotufo, Guiherme R
2006-10-15
In order to validate a method for lipid analysis of small tissue samples, the standard macro-gravimetric method of Bligh-Dyer (1959) [E.G. Bligh, W.J. Dyer, Can. J. Biochem. Physiol. 37 (1959) 911] and a modification of the micro-colorimetric assay developed by Van Handel (1985) [E. Van Handel, J. Am. Mosq. Control Assoc. 1 (1985) 302] were compared. No significant differences were observed for wet tissues of two species of fish. However, limited analysis of wet tissue of the amphipod Leptocheirus plumulosus indicated that the Bligh-Dyer gravimetric method generated higher lipid values, most likely due to the inclusion of non-lipid materials. Additionally, significant differences between the methods were observed with dry tissues, with the micro-colorimetric method consistently reporting calculated lipid values greater than those reported by the gravimetric method. This was most likely due to poor extraction of dry tissue in the standard Bligh-Dyer method, as no significant differences were found when analyzing a single composite extract. The data presented support the conclusion that the micro-colorimetric method described in this paper is accurate, rapid, and minimizes time and solvent use.
Science and Technology Highlights | NREL
[Web-page listing residue; recoverable items: NREL's efforts to standardize techniques for bio-oil analysis inform enhanced modeling capability and affordable methods to increase energy efficiency (December 2012); novel surface modification methods meet the performance demands of advanced lithium-ion batteries.]
A Comparison of Methods for Detecting Differential Distractor Functioning
ERIC Educational Resources Information Center
Koon, Sharon
2010-01-01
This study examined the effectiveness of the odds-ratio method (Penfield, 2008) and the multinomial logistic regression method (Kato, Moen, & Thurlow, 2009) for measuring differential distractor functioning (DDF) effects in comparison to the standardized distractor analysis approach (Schmitt & Bleistein, 1987). Students classified as participating…
[Research progress on mechanical performance evaluation of artificial intervertebral disc].
Li, Rui; Wang, Song; Liao, Zhenhua; Liu, Weiqiang
2018-03-01
The mechanical properties of an artificial intervertebral disc (AID) are related to the long-term reliability of the prosthesis. Three testing approaches, based on different tools, are involved in the mechanical performance evaluation of AIDs: testing with a mechanical simulator, in vitro specimen testing, and finite element analysis. In this study, the testing standards, testing equipment, and materials of AIDs are first introduced. Then the present status of AID static mechanical property tests (static axial compression, static axial compression-shear), dynamic mechanical property tests (dynamic axial compression, dynamic axial compression-shear), creep and stress relaxation tests, device push-out tests, core push-out tests, subsidence tests, etc. is reviewed. The experimental techniques of the in vitro specimen testing method and the testing results for available artificial discs are summarized, as are the experimental methods and research status of finite element analysis. Finally, research trends in AID mechanical performance evaluation are forecast: the simulator, load, dynamic cycle, motion mode, specimen, and test standard will be important research fields in the future.
Eckner, Karl F.
1998-01-01
A total of 338 water samples, 261 drinking water samples and 77 bathing water samples, obtained for routine testing were analyzed in duplicate by Swedish standard methods using multiple-tube fermentation or membrane filtration and by the Colilert and/or Enterolert methods. Water samples came from a wide variety of sources in southern Sweden (Skåne). The Colilert method was found to be more sensitive than Swedish standard methods for detecting coliform bacteria and of equal sensitivity for detecting Escherichia coli when all drinking water samples were grouped together. Based on these results, Swedac, the Swedish laboratory accreditation body, approved for the first time in Sweden use of the Colilert method at this laboratory for the analysis of all water sources not falling under public water regulations (A-krav). The coliform detection study of bathing water yielded anomalous results due to confirmation difficulties. E. coli detection in bathing water was similar by both the Colilert and Swedish standard methods as was fecal streptococcus and enterococcus detection by both the Enterolert and Swedish standard methods. PMID:9687478
NASA Technical Reports Server (NTRS)
Stretchberry, D. M.; Hein, G. F.
1972-01-01
The general concepts of costing, budgeting, and benefit-cost ratio and cost-effectiveness analysis are discussed. The three common methods of costing are presented. Budgeting distributions are discussed. The use of discounting procedures is outlined. Benefit-cost ratio and cost-effectiveness analyses are defined, and their current application to NASA planning is pointed out. Specific practices and techniques are discussed, and actual costing and budgeting procedures are outlined. The recommended method of calculating benefit-cost ratios is described. A standardized method of cost-effectiveness analysis and long-range planning is also discussed.
Multi-element analysis of emeralds and associated rocks by k0 neutron activation analysis
Acharya; Mondal; Burte; Nair; Reddy; Reddy; Reddy; Manohar
2000-12-01
Multi-element analysis was carried out in natural emeralds, their associated rocks and one sample of beryl obtained from Rajasthan, India. The concentrations of 21 elements were assayed by Instrumental Neutron Activation Analysis using the k0 method (k0 INAA method) and high-resolution gamma ray spectrometry. The data reveal the segregation of some elements from associated (trapped and host) rocks to the mineral beryl forming the gemstones. A reference rock standard of the US Geological Survey (USGS BCR-1) was also analysed as a control of the method.
NASA Astrophysics Data System (ADS)
Smallwood, Jeremy; Swenson, David E.
2011-06-01
Evaluation of the electrostatic performance of footwear and flooring in combination is necessary in applications such as electrostatic discharge (ESD) control in electronics manufacture, evaluation of equipment for avoidance of factory process electrostatic ignition risks, and avoidance of electrostatic shocks to personnel in working environments. Typical standards use a walking test in which the voltage produced on a subject is evaluated by identifying and measuring the magnitude of the 5 highest "peaks" and "valleys" of the recorded voltage waveform. This method does not lend itself to effective analysis of the risk that the voltage will exceed a hazard threshold. This paper shows the advantages of voltage probability analysis and recommends that the method be adopted in future standards.
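A minimal sketch of the probability-analysis idea follows: rather than the five highest peaks and valleys, the whole recorded body-voltage waveform is used to estimate how often a hazard threshold is exceeded. The waveform and the 500 V threshold are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 60, 6000)                       # 60 s walking test at 100 Hz
v = 300*np.sin(2*np.pi*1.8*t) * rng.lognormal(0, 0.3, t.size)

threshold = 500.0
p_exceed = np.mean(np.abs(v) > threshold)          # empirical exceedance probability
print("fraction of samples beyond +/-%g V: %.3f" % (threshold, p_exceed))

# complementary view: empirical percentiles of |v| characterize the whole
# distribution rather than only its extremes
print("95th / 99th percentile of |v|: %.0f / %.0f V"
      % tuple(np.percentile(np.abs(v), [95, 99])))
```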
[A study of a biochemical method for urine testing based on color difference estimation].
Wang, Chunhong; Zhou, Yue; Zhao, Hongxia; Zhou, Fengkun
2008-02-01
The biochemical analysis of urine is an important inspection and diagnosis method in hospitals. Conventional urine analysis comprises colorimetric visual assessment and automated detection; the visual technique has largely been superseded, and automated detection is the method adopted in hospitals. However, the urine biochemical analyzers on the market cost around twenty thousand RMB yuan, which puts them out of reach of ordinary families. A computer vision system, by contrast, is not subject to the physiological and psychological variability of a human observer, so its assessment standard is objective and stable. Therefore, based on color theory, we established a computer vision system that performs the collection, management, display, and assessment of the color difference between the standard threshold color and the color of the urine test paper after reaction with the urine sample, so that the severity of an illness can be judged accurately. In this paper, we introduce this new urine-test biochemical analysis method, which can be popularized for home use. Experimental results show that the method is easy to use and cost-effective, and that it enables monitoring over the whole course of an illness, giving it potential for extensive application.
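The color-difference step can be sketched as nearest-reference classification of the test pad's mean color. Real systems typically convert to CIELAB and use a Delta-E metric; plain RGB distance is used here only to keep the example short, and the reference colors and levels are hypothetical.

```python
import numpy as np

reference = {                                     # level -> nominal pad RGB
    "negative": (250, 245, 160),
    "+":        (205, 220, 120),
    "++":       (130, 190, 120),
    "+++":      ( 60, 150, 130),
}

def grade(pad_rgb):
    """Return the reference level whose color is closest to the pad color."""
    pad = np.array(pad_rgb, float)
    dists = {lvl: np.linalg.norm(pad - np.array(rgb, float))
             for lvl, rgb in reference.items()}
    return min(dists, key=dists.get)

print(grade((140, 195, 118)))                     # -> "++"
```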
Cao, Ying; Rajan, Suja S; Wei, Peng
2016-12-01
A Mendelian randomization (MR) analysis is performed to analyze the causal effect of an exposure variable on a disease outcome in observational studies, by using genetic variants that affect the disease outcome only through the exposure variable. This method has recently gained popularity among epidemiologists given the success of genetic association studies. Many exposure variables of interest in epidemiological studies are time varying, for example, body mass index (BMI). Although longitudinal data have been collected in many cohort studies, current MR studies only use one measurement of a time-varying exposure variable, which cannot adequately capture the long-term time-varying information. We propose using the functional principal component analysis method to recover the underlying individual trajectory of the time-varying exposure from the sparsely and irregularly observed longitudinal data, and then conduct MR analysis using the recovered curves. We further propose two MR analysis methods. The first assumes a cumulative effect of the time-varying exposure variable on the disease risk, while the second assumes a time-varying genetic effect and employs functional regression models. We focus on statistical testing for a causal effect. Our simulation studies mimicking the real data show that the proposed functional data analysis based methods incorporating longitudinal data have substantial power gains compared to standard MR analysis using only one measurement. We used the Framingham Heart Study data to demonstrate the promising performance of the new methods as well as inconsistent results produced by the standard MR analysis that relies on a single measurement of the exposure at some arbitrary time point. © 2016 WILEY PERIODICALS, INC.
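For contrast with the functional data analysis approach described above, the baseline single-measurement MR estimate can be sketched as a Wald ratio of two regression slopes. The simulation below is illustrative: G is a genetic score, X a single exposure measurement, Y the outcome, and the true causal effect is 0.5; the naive regression of Y on X is confounded upward.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000
G = rng.binomial(2, 0.3, n)                 # instrument (e.g., allele count)
U = rng.normal(0, 1, n)                     # unmeasured confounder
X = 0.4*G + U + rng.normal(0, 1, n)         # exposure
Y = 0.5*X + U + rng.normal(0, 1, n)         # outcome

beta_gx = np.polyfit(G, X, 1)[0]            # G -> X association
beta_gy = np.polyfit(G, Y, 1)[0]            # G -> Y association
print("Wald ratio causal estimate: %.3f" % (beta_gy/beta_gx))   # ~0.5
print("naive OLS estimate:         %.3f" % np.polyfit(X, Y, 1)[0])  # biased ~1
```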
Dahlquist, Robert T; Reyner, Karina; Robinson, Richard D; Farzad, Ali; Laureano-Phillips, Jessica; Garrett, John S; Young, Joseph M; Zenarosa, Nestor R; Wang, Hao
2018-05-01
Emergency department (ED) shift handoffs are potential sources of delay in care. We aimed to determine the impact that using a standardized reporting tool and process may have on throughput metrics for patients undergoing a transition of care at shift change. We performed a prospective, pre- and post-intervention quality improvement study from September 1 to November 30, 2015. A handoff procedure intervention, including a mandatory workshop and personnel training on a standard reporting system template, was implemented. The primary endpoint was patient length of stay (LOS). A comparative analysis of differences between patient LOS and various handoff communication methods was performed pre- and post-intervention. Communication methods were entered into a multivariable logistic regression model independently as risk factors for patient LOS. The final analysis included 1,006 patients, with 327 comprising the pre-intervention and 679 comprising the post-intervention populations. Bedside rounding occurred 45% of the time without a standard reporting system during the pre-intervention period and increased to 85% of the time with the use of a standard reporting system in the post-intervention period (P < 0.001). Provider time (provider-initiated care to patient care completed) averaged 297 min in the pre-intervention period and decreased to 265 min in the post-intervention period (P < 0.001). After adjusting for other communication methods, the use of a standard reporting system during handoff was associated with shortened ED LOS (OR = 0.60, 95% CI 0.40 - 0.90, P < 0.05). Standard reporting system use during emergency physician handoffs at shift change improves ED throughput efficiency and is associated with shorter ED LOS.
Study Methods to Standardize Thermography NDE
NASA Technical Reports Server (NTRS)
Walker, James L.; Workman, Gary L.
1998-01-01
The purpose of this work is to develop thermographic inspection methods and standards for use in evaluating structural composites and aerospace hardware. Qualification techniques and calibration methods are investigated to standardize the thermographic method for use in the field. Along with the inspections of test standards and structural hardware, support hardware is designed and fabricated to aid in the thermographic process. Also, a standard operating procedure is developed for performing inspections with the Bales Thermal Image Processor (TIP). Inspections are performed on a broad range of structural composites. These materials include various graphite/epoxies, graphite/cyanide-ester, graphite/silicon-carbide, graphite phenolic and Kevlar/epoxy. Also metal honeycomb (titanium and aluminum faceplates over an aluminum honeycomb core) structures are investigated. Various structural shapes are investigated and the thickness of the structures varies from as few as 3 plies to as many as 80 plies. Special emphasis is placed on characterizing defects in attachment holes and bondlines, in addition to those resulting from impact damage and the inclusion of foreign matter. Image processing through statistical analysis and digital filtering is investigated to enhance the quality of and quantify the NDE thermal images when necessary.
Device and methods for "gold standard" registration of clinical 3D and 2D cerebral angiograms
NASA Astrophysics Data System (ADS)
Madan, Hennadii; Likar, Boštjan; Pernuš, Franjo; Špiclin, Žiga
2015-03-01
Translation of any novel or existing 3D-2D image registration method into clinical image-guidance systems is limited by the lack of objective validation on clinical image datasets. The main reason is that, besides the calibration of the 2D imaging system, a reference or "gold standard" registration is very difficult to obtain on clinical image datasets. In the context of cerebral endovascular image-guided interventions (EIGIs), we present a calibration device in the form of a headband with integrated fiducial markers and, secondly, propose an automated pipeline comprising 3D and 2D image processing, analysis and annotation steps, the result of which is a retrospective calibration of the 2D imaging system and an optimal, i.e., "gold standard" registration of 3D and 2D images. The device and methods were used to create the "gold standard" on 15 datasets of 3D and 2D cerebral angiograms, where each dataset was acquired on a patient undergoing EIGI for either aneurysm coiling or embolization of an arteriovenous malformation. The use of the device integrated seamlessly into the clinical workflow of EIGI, while the automated pipeline eliminated all manual input and interactive image processing, analysis or annotation. In this way, the time to obtain the "gold standard" was reduced from 30 minutes to less than one minute, and the "gold standard" of 3D-2D registration was obtained on all 15 datasets of cerebral angiograms with sub-0.1 mm accuracy.
NASA Astrophysics Data System (ADS)
Medvedeva, Maria F.; Doubrovski, Valery A.
2017-03-01
The resolution of the acousto-optical method for blood typing was estimated experimentally by means of two types of reagents: monoclonal antibodies and standard hemagglutinating sera. The peculiarity of this work is the application of digital photo-image processing by pixel analysis, previously proposed by the authors. The influence of the concentrations of the reagents and of the blood sample under test, as well as of the duration of the ultrasonic action on the biological object, upon the resolution of the acousto-optical method was investigated. The optimal experimental conditions for maximizing the resolution of the acousto-optical method were found, creating the prerequisites for reliable blood typing. The present paper is a further step in the development of the acousto-optical method for determining human blood groups.
Fu, Hongbo; Wang, Huadong; Jia, Junwei; Ni, Zhibo; Dong, Fengzhong
2018-01-01
Due to the self-absorption of major elements, the scarcity of observable spectral lines of trace elements, and the need for relative efficiency correction of the experimental system, accurate quantitative analysis with calibration-free laser-induced breakdown spectroscopy (CF-LIBS) is in fact not easy. To overcome these difficulties, the standard reference line (SRL) method combined with one-point calibration (OPC) is used to analyze six elements in three stainless-steel and five heat-resistant steel samples. The Stark broadening and the Saha-Boltzmann plot of Fe are used to calculate the electron density and the plasma temperature, respectively. In the present work, we tested the original SRL method, the SRL with OPC method, and the intercept with OPC method. The final calculation results show that the latter two methods can effectively improve the overall accuracy of quantitative analysis and the detection limits of trace elements.
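The plasma-temperature step can be sketched with a Boltzmann-plot calculation: the temperature follows from the slope of ln(Iλ/(gA)) against upper-level energy for lines of one species. The line data below are placeholders, not the paper's Fe lines.

```python
import numpy as np

k_B = 8.617e-5                                  # Boltzmann constant (eV/K)
E_up = np.array([3.2, 3.9, 4.6, 5.4])           # upper-level energies (eV)
gA   = np.array([2.2e8, 5.1e8, 1.4e8, 3.3e8])   # g_k * A_ki (s^-1)
lam  = np.array([404.6, 438.3, 371.9, 344.1])   # wavelengths (nm)
T_true = 9000.0
I = gA/lam * np.exp(-E_up/(k_B*T_true))         # synthetic line intensities

y = np.log(I*lam/gA)                            # Boltzmann-plot ordinate
slope = np.polyfit(E_up, y, 1)[0]               # slope = -1/(k_B * T)
print("recovered T = %.0f K" % (-1.0/(k_B*slope)))
```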
Synthesis and evaluation of L-cystathionine as a standard for amino acid analysis.
Amino, Yusuke; Suzuki, Yumiko
2017-01-01
L-Cystathionine is a key nonprotein amino acid related to metabolic conditions. The quantitative determination of L-cystathionine in physiological fluids by amino acid analysis is important for clinical diagnosis; however, certified reference material for L-cystathionine with satisfactory purity, content, and quantity has been unavailable until recently. Consequently, a practical and simple method for the preparation of L-cystathionine was examined, which involves thioalkylation of N-tert-butoxycarbonyl-L-cysteine tert-butyl ester, derived from L-cystine, with (2S)-2-(tert-butoxycarbonyl)amino-4-iodobutanoic acid tert-butyl ester, derived from L-aspartic acid, to obtain L-cystathionine with protecting groups, followed by single-step deprotection under mild conditions. This method produces L-cystathionine in high purity (99.4%) and having sufficient percentage content according to amino acid analysis, which could be used as a standard for the amino acid analysis of physiological fluids.
Coggins, L.G.; Pine, William E.; Walters, C.J.; Martell, S.J.D.
2006-01-01
We present a new model to estimate capture probabilities, survival, abundance, and recruitment using traditional Jolly-Seber capture-recapture methods within a standard fisheries virtual population analysis framework. This approach compares the numbers of marked and unmarked fish at age captured in each year of sampling with predictions based on estimated vulnerabilities and abundance in a likelihood function. Recruitment to the earliest age at which fish can be tagged is estimated by using a virtual population analysis method to back-calculate the expected numbers of unmarked fish at risk of capture. By using information from both marked and unmarked animals in a standard fisheries age structure framework, this approach is well suited to the sparse data situations common in long-term capture-recapture programs with variable sampling effort. © Copyright by the American Fisheries Society 2006.
This compendium includes descriptions of methods for analyzing metals, pesticides and volatile organic compounds (VOCs) in water. The individual methods covered are these: (1) Method 200.8: determination of trace elements in waters and wastes by inductively coupled plasma-mass s...
Atmospheric pollution measurement by optical cross correlation methods - A concept
NASA Technical Reports Server (NTRS)
Fisher, M. J.; Krause, F. R.
1971-01-01
Method combines standard spectroscopy with statistical cross correlation analysis of two narrow light beams for remote sensing to detect foreign matter of given particulate size and consistency. Method is applicable in studies of generation and motion of clouds, nuclear debris, ozone, and radiation belts.
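The statistical core of the concept can be sketched as time-delay estimation: the transit time of a common fluctuation between the two beams is the lag that maximizes their cross-correlation. The signals below are synthetic; in the application they would be the detected intensities of the two narrow light beams.

```python
import numpy as np

rng = np.random.default_rng(3)
fs, true_lag = 1000, 37                        # sampling rate (Hz), delay (samples)
s = rng.normal(0, 1, 5000)                     # common fluctuation
b = s[true_lag:] + 0.3*rng.normal(0, 1, 5000 - true_lag)   # upstream beam
a = s[:-true_lag] + 0.3*rng.normal(0, 1, 5000 - true_lag)  # downstream beam (delayed)

xc = np.correlate(a - a.mean(), b - b.mean(), mode="full")
lag = np.argmax(xc) - (len(b) - 1)             # delay of a relative to b
print("estimated delay: %d samples (%.1f ms)" % (lag, lag*1000/fs))
```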
Peedikayil, Musthafa Chalikandy; AlSohaibani, Fahad Ibrahim; Alkhenizan, Abdullah Hamad
2014-01-01
Background First-line levofloxacin-based treatments eradicate Helicobacter pylori with varying success. We examined the efficacy and safety of first-line levofloxacin-based treatment in comparison to standard first-line therapy for H pylori eradication. Materials and Methods We searched literature databases from Medline, EMBASE, and the Cochrane Register of Randomized Controlled Trials through March 2013 for randomized controlled trials comparing first-line levofloxacin and standard therapy. We included randomized controlled trials conducted only on treatment-naïve H pylori-infected adults. A systematic review was conducted. Meta-analysis was performed with Review Manager 5.2. The treatment effect was determined by relative risk with a random or fixed model by the Mantel-Haenszel method. Results Seven trials were identified, with 888 patients receiving 7 days of first-line levofloxacin and 894 treated with standard therapy (amoxicillin, clarithromycin and proton pump inhibitor) for 7 days. The overall crude eradication rate in the levofloxacin group was 79.05% versus 81.4% in the standard group (risk ratio 0.97; 95% CI; 0.93, 1.02). The overall dropout was 46 (5.2%) in the levofloxacin group and 52 (5.8%) for standard therapy. Dizziness was more common in the group that received levofloxacin-based treatment, and taste disturbance was more common in the group that received standard therapy. Meta-analysis showed that overall adverse events were similar between the two groups, with a relative risk of 1.06 (95% CI 0.72, 1.57). Conclusion Helicobacter pylori eradication with 7 days of levofloxacin-based first-line therapy was safe and comparable to 7 days of standard first-line therapy. PMID:24465624
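The fixed-effect pooling named above can be sketched with the Mantel-Haenszel risk-ratio formula applied to 2x2 tables. The trial counts below are made up for illustration; each row is (events_lev, n_lev, events_std, n_std).

```python
import numpy as np

trials = np.array([
    [ 85, 100,  88, 102],
    [ 70,  95,  75,  96],
    [110, 140, 115, 142],
    [ 60,  80,  66,  83],
], float)

a, n1, c_, n2 = trials.T
N = n1 + n2
# Mantel-Haenszel pooled risk ratio: sum(a*n2/N) / sum(c*n1/N)
rr_mh = np.sum(a*n2/N) / np.sum(c_*n1/N)
print("Mantel-Haenszel pooled RR: %.3f" % rr_mh)
```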
Levitt, Heidi M; Bamberg, Michael; Creswell, John W; Frost, David M; Josselson, Ruthellen; Suárez-Orozco, Carola
2018-01-01
The American Psychological Association Publications and Communications Board Working Group on Journal Article Reporting Standards for Qualitative Research (JARS-Qual Working Group) was charged with examining the state of journal article reporting standards as they applied to qualitative research and with generating recommendations for standards that would be appropriate for a wide range of methods within the discipline of psychology. These standards describe what should be included in a research report to enable and facilitate the review process. This publication marks a historical moment-the first inclusion of qualitative research in APA Style, which is the basis of both the Publication Manual of the American Psychological Association (APA, 2010) and APA Style CENTRAL, an online program to support APA Style. In addition to the general JARS-Qual guidelines, the Working Group has developed standards for both qualitative meta-analysis and mixed methods research. The reporting standards were developed for psychological qualitative research but may hold utility for a broad range of social sciences. They honor a range of qualitative traditions, methods, and reporting styles. The Working Group was composed of a group of researchers with backgrounds in varying methods, research topics, and approaches to inquiry. In this article, they present these standards and their rationale, and they detail the ways that the standards differ from the quantitative research reporting standards. They describe how the standards can be used by authors in the process of writing qualitative research for submission as well as by reviewers and editors in the process of reviewing research. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Analysis of titanium content in titanium tetrachloride solution
NASA Astrophysics Data System (ADS)
Bi, Xiaoguo; Dong, Yingnan; Li, Shanshan; Guan, Duojiao; Wang, Jianyu; Tang, Meiling
2018-03-01
Strontium titanate, barium titanate, and lead titanate are new functional ceramic materials with good prospects, and titanium tetrachloride is commonly used in the production of such products, which exhibit excellent electrochemical performance owing to their ferroelectric temperature-coefficient effect. In this article, three methods are used to calibrate samples of titanium tetrachloride solution: back titration, replacement titration, and gravimetric analysis. The results show that the back titration method has many advantages, for example, relatively simple operation, easy judgment of the titration end point, and better accuracy and precision of analytical results, with a relative standard deviation within 0.2%. It is therefore the method of choice for conventional analysis in mass production.
40 CFR 406.11 - Specialized definitions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... STANDARDS GRAIN MILLS POINT SOURCE CATEGORY Corn Wet Milling Subcategory § 406.11 Specialized definitions... and methods of analysis set forth in 40 CFR part 401 shall apply to this subpart. (b) The term corn shall mean the shelled corn delivered to a plant before processing. (c) The term standard bushel shall...
He, Yufei; Li, Qing; Bi, Kaishun
2015-04-01
To control the quality of Rhizoma Chuanxiong, a method based on high-performance liquid chromatography coupled with diode array detection was developed, for the first time, for the quantitative analysis of six active ingredients using a single standard to determine multiple components, together with chemical fingerprint analysis. The separation was performed on an Agilent Zorbax SB-C18 column by gradient elution with methanol and an aqueous phase (containing 0.5% glacial acetic acid) at a flow rate of 1.0 mL/min. The UV wavelength was set at 274 nm. The assay was fully validated with respect to precision, repeatability, and accuracy. All calibration curves showed good linearity (R2 > 0.9994) within the test ranges. The limits of detection and quantification were lower than 0.01 and 0.03 μg/mL, respectively. The relative standard deviations for repeatability and intermediate precision of the six analytes were less than 1.6 and 2.5%, respectively, and the overall recovery was 96.1-103.1%. In addition, fingerprint chromatography using hierarchical clustering analysis and similarity analysis was performed to differentiate and classify the samples. The method described here provides a more comprehensive and reasonable scientific assessment of the quality of Rhizoma Chuanxiong. The strategy is feasible, credible, and easily and effectively adapted for the quality control of Rhizoma Chuanxiong. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A framework for the meta-analysis of Bland-Altman studies based on a limits of agreement approach.
Tipton, Elizabeth; Shuster, Jonathan
2017-10-15
Bland-Altman method comparison studies are common in the medical sciences and are used to compare a new measure to a gold-standard (often costlier or more invasive) measure. The distribution of these differences is summarized by two statistics, the 'bias' and standard deviation, and these measures are combined to provide estimates of the limits of agreement (LoA). When these LoA are within the bounds of clinically insignificant differences, the new non-invasive measure is preferred. Very often, multiple Bland-Altman studies have been conducted comparing the same two measures, and random-effects meta-analysis provides a means to pool these estimates. We provide a framework for the meta-analysis of Bland-Altman studies, including methods for estimating the LoA and measures of uncertainty (i.e., confidence intervals). Importantly, these LoA are likely to be wider than those typically reported in Bland-Altman meta-analyses. Frequently, Bland-Altman studies report results based on repeated measures designs but do not properly adjust for this design in the analysis. Meta-analyses of Bland-Altman studies frequently exclude these studies for this reason. We provide a meta-analytic approach that allows inclusion of estimates from these studies. This includes adjustments to the estimate of the standard deviation and a method for pooling the estimates based upon robust variance estimation. An example is included based on a previously published meta-analysis. Copyright © 2017 John Wiley & Sons, Ltd.
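The single-study building block pooled by this framework can be sketched as follows: bias, standard deviation, the limits of agreement, and approximate confidence intervals for each limit using the Bland & Altman (1986) variance approximation. The differences are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
d = rng.normal(0.3, 1.1, 40)                   # new method minus gold standard
n = d.size

bias, sd = d.mean(), d.std(ddof=1)
loa = bias + np.array([-1.96, 1.96])*sd        # limits of agreement

# approximate variance of each limit (Bland & Altman 1986):
# var(LoA) ~ s^2 * (1/n + 1.96^2 / (2*(n-1)))
se_loa = sd*np.sqrt(1.0/n + 1.96**2/(2*(n - 1)))
t = stats.t.ppf(0.975, n - 1)
print("bias = %.2f, LoA = [%.2f, %.2f]" % (bias, *loa))
print("95%% CI lower LoA: [%.2f, %.2f]" % (loa[0] - t*se_loa, loa[0] + t*se_loa))
print("95%% CI upper LoA: [%.2f, %.2f]" % (loa[1] - t*se_loa, loa[1] + t*se_loa))
```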
This compendium includes method summaries provided by the Centers for Disease Control and Prevention/National Center for Environmental Health (CDC/NCEH) for collection and shipping of blood and urine samples for analysis of metals and volatile organic compounds (VOCs). It provide...
This compendium includes method summaries provided by the Centers for Disease Control and Prevention/National Center for Environmental Health (CDC/NCEH) for the collection and shipping of blood and urine samples for analysis of metals and volatile organic compounds (VOCs). It pro...
This compendium contains seven SOPs developed by Food and Drug Administration (FDA) laboratories for methods of analyzing trace metals in dietary samples collected using Total Diet study procedures. The SOPs include the following: (1) Quality Control for Analysis of NHEXAS Food o...
Endo, Yasushi
2018-01-01
Edible fats and oils are among the basic components of the human diet, along with carbohydrates and proteins, and they are a source of high energy and of essential fatty acids such as linoleic and linolenic acids. Edible fats and oils are used for pan- and deep-frying, and in salad dressings, mayonnaise and processed foods such as chocolates and creams. The physical and chemical properties of edible fats and oils can affect the quality of oil-based foods and hence must be evaluated in detail. The physical characteristics of edible fats and oils include color, specific gravity, refractive index, melting point, congeal point, smoke point, flash point, fire point, and viscosity, while the chemical characteristics include acid value, saponification value, iodine value, fatty acid composition, trans isomers, triacylglycerol composition, unsaponifiable matters (sterols, tocopherols) and minor components (phospholipids, chlorophyll pigments, glycidyl fatty acid esters). Peroxide value, p-anisidine value, carbonyl value, polar compounds and polymerized triacylglycerols are indexes of the deterioration of edible fats and oils. This review describes the analytical methods used to evaluate the quality of edible fats and oils, especially the Standard Methods for Analysis of Fats, Oils and Related Materials edited by the Japan Oil Chemists' Society (the JOCS standard methods), and advanced methods.
Brown, Richard J C; Beccaceci, Sonya; Butterfield, David M; Quincey, Paul G; Harris, Peter M; Maggos, Thomas; Panteliadis, Pavlos; John, Astrid; Jedynska, Aleksandra; Kuhlbusch, Thomas A J; Putaud, Jean-Philippe; Karanasiou, Angeliki
2017-10-18
The European Committee for Standardisation (CEN) Technical Committee 264 'Air Quality' has recently produced a standard method for the measurement of organic carbon and elemental carbon in PM2.5 within its working group 35, in response to the requirements of European Directive 2008/50/EC. It is expected that this method will be used in future by all Member States making measurements of the carbonaceous content of PM2.5. This paper details the results of a laboratory and field measurement campaign and the statistical analysis performed to validate the standard method, assess its uncertainty and define its working range, to provide clarity and confidence in the underpinning science for future users of the method. The statistical analysis showed that the expanded combined uncertainty for transmittance protocol measurements of OC, EC and TC is expected to be below 25%, at the 95% level of confidence, above filter loadings of 2 μg cm-2. The estimated detection limit of the method for total carbon was 2 μg cm-2. As a result of the laboratory and field measurement campaign, the EUSAAR2 transmittance measurement protocol was chosen as the basis of the standard method EN 16909:2017.
Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.
Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R
2012-08-01
Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time (AFT) model with log-normal, log-logistic and Weibull distributions, were used to detect differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening with increasing missingness in the proportions. The testing procedures discussed in this article can all be performed using readily available software such as R. The R code is provided as supplemental material. ctekwe@stat.tamu.edu.
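The censored-likelihood idea can be sketched with a generic Tobit-type log-normal model: intensities below a detection limit contribute through the normal CDF rather than being imputed. This is only an illustration of the principle, not the authors' R implementation, and all values are simulated.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(5)
true_mu, sigma, lod = 10.0, 1.0, np.exp(9.2)    # detection limit on raw scale
raw = rng.lognormal(true_mu, sigma, 200)
censored = raw < lod                             # peaks lost below the LOD
y = np.log(np.maximum(raw, lod))                 # censored values sit at the LOD

def negloglik(theta):
    mu, log_s = theta
    s = np.exp(log_s)
    ll = np.where(censored,
                  stats.norm.logcdf((np.log(lod) - mu)/s),  # P(intensity < LOD)
                  stats.norm.logpdf(y, mu, s))              # observed densities
    return -ll.sum()

res = optimize.minimize(negloglik, x0=[y.mean(), 0.0], method="Nelder-Mead")
print("censoring fraction: %.2f" % censored.mean())
print("MLE mu, sigma: %.2f, %.2f (true %.1f, %.1f)"
      % (res.x[0], np.exp(res.x[1]), true_mu, sigma))
```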
Lien, Stina K; Kvitvang, Hans Fredrik Nyvold; Bruheim, Per
2012-07-20
GC-MS analysis of silylated metabolites is a sensitive method that covers important metabolite groups such as sugars, amino acids and non-amino organic acids, and it has become one of the most important analytical methods for exploring the metabolome. Absolute quantitative GC-MS analysis of silylated metabolites poses a challenge as different metabolites have different derivatization kinetics and as their silyl-derivates have varying stability. This report describes the development of a targeted GC-MS/MS method for quantification of metabolites. Internal standards for each individual metabolite were obtained by derivatization of a mixture of standards with deuterated N-methyl-N-trimethylsilyltrifluoroacetamide (d9-MSTFA), and spiking this solution into MSTFA derivatized samples prior to GC-MS/MS analysis. The derivatization and spiking protocol needed optimization to ensure that the behaviour of labelled compound responses in the spiked sample correctly reflected the behaviour of unlabelled compound responses. Using labelled and unlabelled MSTFA in this way enabled normalization of metabolite responses by the response of their deuterated counterpart (i.e. individual correction). Such individual correction of metabolite responses reproducibly resulted in significantly higher precision than traditional data correction strategies when tested on samples both with and without serum and urine matrices. The developed method is thus a valuable contribution to the field of absolute quantitative metabolomics. Copyright © 2012 Elsevier B.V. All rights reserved.
Comparison of a novel fixation device with standard suturing methods for spinal cord stimulators.
Bowman, Richard G; Caraway, David; Bentley, Ishmael
2013-01-01
Spinal cord stimulation is a well-established treatment for chronic neuropathic pain of the trunk or limbs. Currently, the standard method of fixation is to affix the leads of the neuromodulation device to soft tissue, fascia or ligament by manually tying general suture. A novel semiautomated device is proposed that may be advantageous compared with the current standard. Comparison testing in an excised caprine spine and a simulated benchtop model was performed. Three tests were performed: 1) perpendicular pull from the fascia of the caprine spine; 2) axial pull from the fascia of the caprine spine; and 3) axial pull from Mylar film. Six samples of each configuration were tested for each scenario. Standard 2-0 Ethibond was compared with a novel semiautomated device (Anulex fiXate). Upon completion of testing, statistical analysis was performed for each scenario. For perpendicular pull in the caprine spine, the failure load for standard suture was 8.95 lbs with a standard deviation of 1.39, whereas for fiXate the load was 15.93 lbs with a standard deviation of 2.09. For axial pull in the caprine spine, the failure load for standard suture was 6.79 lbs with a standard deviation of 1.55, whereas for fiXate the load was 12.31 lbs with a standard deviation of 4.26. For axial pull in Mylar film, the failure load for standard suture was 10.87 lbs with a standard deviation of 1.56, whereas for fiXate the load was 19.54 lbs with a standard deviation of 2.24. These data suggest that the novel semiautomated device offers a method of fixation that may be used in lieu of standard suturing methods for securing neuromodulation devices, and may in fact provide more secure fixation than standard suturing. © 2012 International Neuromodulation Society.
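Group comparisons like these can be reconstructed from the reported summary statistics alone. The sketch below applies Welch's t-test to the perpendicular-pull scenario (mean, SD, n = 6 per group); this is one plausible way to compare the groups, since the abstract does not state which test its statistical analysis used.

```python
import numpy as np
from scipy import stats

m1, s1, n1 = 8.95, 1.39, 6        # standard suture
m2, s2, n2 = 15.93, 2.09, 6       # fiXate

se = np.sqrt(s1**2/n1 + s2**2/n2)                 # Welch standard error
t = (m2 - m1)/se
# Welch-Satterthwaite degrees of freedom
df = (s1**2/n1 + s2**2/n2)**2 / ((s1**2/n1)**2/(n1-1) + (s2**2/n2)**2/(n2-1))
p = 2*stats.t.sf(abs(t), df)
print("t = %.2f, df = %.1f, p = %.4f" % (t, df, p))
```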
Integrated Data Collection Analysis (IDCA) Program - Statistical Analysis of RDX Standard Data Sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.
2015-10-30
The Integrated Data Collection Analysis (IDCA) program is conducting a Proficiency Test for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard. The material was tested as a well-characterized standard several times during the proficiency study to assess differences among participants and the range of results that may arise for well-behaved explosive materials. The analyses show that there are detectable differences among the results from IDCA participants. While these differences are statistically significant, most of them can be disregarded for comparison purposes when assessing potential variability as laboratories attempt to measure identical samples using methods assumed to be nominally the same. The results presented in this report include the average sensitivity results for the IDCA participants and the ranges of values obtained. The ranges represent variation about the mean values of the tests of between 26% and 42%. The magnitude of this variation is attributed to differences in operator, method, and environment, as well as the use of different instruments that are also of varying age. The results appear to be a good representation of the broader safety testing community based on the range of methods, instruments, and environments included in the IDCA Proficiency Test.
System of Systems Analytic Workbench - 2017
2017-08-31
[Report extraction residue; recoverable information: the System of Systems Analytic Workbench (SoS-AWB) includes tools for Systems Operational Dependency Analysis (SODA) and Systems Developmental Dependency Analysis (SDDA), with development of standard dependencies using combinations of low-medium-high parameters, applied with key collaborators. Report No. SERC-2017-TR-111.]
A pseudo differential Gm-C complex filter with frequency tuning for IEEE 802.15.4 applications
NASA Astrophysics Data System (ADS)
Xin, Cheng; Lungui, Zhong; Haigang, Yang; Fei, Liu; Tongqiang, Gao
2011-07-01
This paper presents a CMOS Gm-C complex filter for a low-IF receiver for the IEEE 802.15.4 standard. A pseudo differential OTA with reconfigurable common mode feedback and common mode feed-forward is proposed, as well as a frequency tuning method based on a relaxation oscillator. A detailed analysis of the non-ideality of the OTA and of the frequency tuning method is elaborated. The analysis and measurement results show that the center frequency of the complex filter can be tuned accurately. The chip was fabricated in a standard 0.35 μm CMOS process with a single 3.3 V power supply. The filter consumes 2.1 mA of current, has a measured in-band group delay ripple of less than 0.16 μs, and an IRR larger than 28 dB at 2 MHz apart, which meets the requirements of the IEEE 802.15.4 standard.
Validation of Physics Standardized Test Items
NASA Astrophysics Data System (ADS)
Marshall, Jill
2008-10-01
The Texas Physics Assessment Team (TPAT) examined the Texas Assessment of Knowledge and Skills (TAKS) to determine whether it is a valid indicator of physics preparation for future course work and employment, and of the knowledge and skills needed to act as an informed citizen in a technological society. We categorized science items from the 2003 and 2004 10th and 11th grade TAKS by content area(s) covered, knowledge and skills required to select the correct answer, and overall quality. We also analyzed a 5000-student sample of item-level results from the 2004 11th grade exam using standard statistical methods employed by test developers (factor analysis and Item Response Theory). Triangulation of our results revealed strengths and weaknesses of the different methods of analysis. The TAKS was found to be only weakly indicative of physics preparation, and we make recommendations for increasing the validity of standardized physics testing.
NASA Astrophysics Data System (ADS)
Ctvrtnickova, T.; Mateo, M. P.; Yañez, A.; Nicolas, G.
2011-04-01
This work presents results of Laser-Induced Breakdown Spectroscopy (LIBS) and Thermo-Mechanical Analysis (TMA) of coals and coal blends used in coal-fired power plants all over Spain. Several coal specimens, their blends, and the corresponding laboratory ash were analyzed by these techniques, and the results were compared to standard laboratory methods. The slagging indices, which predict the tendency of coal ash to deposit on the boiler walls, were determined by means of standard chemical analysis, LIBS, and TMA. The optimal coal for blending with the problematic national lignite was suggested in order to diminish slagging problems. The techniques were evaluated on the basis of the precision, acquisition time, and the extent and quality of the information they can provide. Finally, the applicability of LIBS and TMA to the successful calculation of slagging indices is discussed, and their substitution for time-consuming and instrumentally demanding standard methods is considered.
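One commonly used slagging indicator that such ash analyses feed into is the base-to-acid ratio of the ash oxides, optionally weighted by the dry-basis sulphur content (Rs = B/A × S). The oxide percentages below are illustrative, and the paper's own indices may differ in definition and thresholds.

```python
# Base-to-acid slagging indicator from ash oxide composition (wt %)
ash = {"SiO2": 45.0, "Al2O3": 22.0, "TiO2": 1.0,      # acidic oxides
       "Fe2O3": 9.0, "CaO": 12.0, "MgO": 2.5,
       "Na2O": 0.8, "K2O": 1.7}                       # basic oxides
sulphur = 1.2                                         # dry-basis S (wt %)

base = sum(ash[k] for k in ("Fe2O3", "CaO", "MgO", "Na2O", "K2O"))
acid = sum(ash[k] for k in ("SiO2", "Al2O3", "TiO2"))
b_over_a = base/acid
rs = b_over_a*sulphur                                 # sulphur-weighted index
print("B/A = %.2f, Rs = %.2f" % (b_over_a, rs))
```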
NASA Technical Reports Server (NTRS)
Marley, Mike
2008-01-01
The focus of this paper is the thermal balance testing of the Operationally Responsive Space Standard Bus battery. The Standard Bus thermal design required that the battery be isolated from the bus itself, so the battery has its own thermal control, including heaters and a radiator surface. Since the battery was not ready for testing during the overall bus thermal balance test, a separate test was conducted to verify the thermal design of the battery. This paper discusses in detail the test setup, test procedure, and results from this test. Additionally, this paper considers the methods used to determine the heat dissipation of the battery during charge and discharge. The heat dissipation of lithium-ion batteries is relatively poorly characterized and hard to quantify; the methods used during the test and the post-test analysis to estimate the heat dissipation of the battery are discussed.
Li, Dan; Jiang, Jia; Han, Dandan; Yu, Xinyu; Wang, Kun; Zang, Shuang; Lu, Dayong; Yu, Aimin; Zhang, Ziwei
2016-04-05
A new method is proposed for measuring the antioxidant capacity by electron spin resonance spectroscopy based on the loss of electron spin resonance signal after Cu(2+) is reduced to Cu(+) with antioxidant. Cu(+) was removed by precipitation in the presence of SCN(-). The remaining Cu(2+) was coordinated with diethyldithiocarbamate, extracted into n-butanol and determined by electron spin resonance spectrometry. Eight standards widely used in antioxidant capacity determination, including Trolox, ascorbic acid, ferulic acid, rutin, caffeic acid, quercetin, chlorogenic acid, and gallic acid were investigated. The standard curves for determining the eight standards were plotted, and results showed that the linear regression correlation coefficients were all high enough (r > 0.99). Trolox equivalent antioxidant capacity values for the antioxidant standards were calculated, and a good correlation (r > 0.94) between the values obtained by the present method and cupric reducing antioxidant capacity method was observed. The present method was applied to the analysis of real fruit samples and the evaluation of the antioxidant capacity of these fruits.
Limitation of the Cavitron technique by conifer pit aspiration.
Beikircher, B; Ameglio, T; Cochard, H; Mayr, S
2010-07-01
The Cavitron technique facilitates time and material savings for vulnerability analysis. The use of rotors with small diameters leads to high water pressure gradients (ΔP) across samples, which may cause pit aspiration in conifers. In this study, the effect of pit aspiration on Cavitron measurements was analysed and a modified 'conifer method' was tested which avoids critical (i.e., pit aspiration inducing) ΔP. Four conifer species were used (Juniperus communis, Picea abies, Pinus sylvestris, and Larix decidua) for vulnerability analysis based on the standard Cavitron technique and the conifer method. In addition, ΔP thresholds for pit aspiration were determined and water extraction curves were constructed. Vulnerability curves obtained with the standard method generally showed a less negative P for the induction of embolism than curves of the conifer method. Differences were species-specific, with the smallest effects in Juniperus. Larix showed the most pronounced shifts in P50 (pressure at 50% loss of conductivity) between the standard (-1.5 MPa) and the conifer (-3.5 MPa) methods. Pit aspiration occurred at the lowest ΔP in Larix and at the highest in Juniperus. Accordingly, at a spinning velocity inducing P50, ΔP caused only a 4% loss of conductivity induced by pit aspiration in Juniperus, but about 60% in Larix. Water extraction curves were similar to vulnerability curves, indicating that spinning itself did not affect pits. Conifer pit aspiration can have major influences on Cavitron measurements and lead to an overestimation of vulnerability thresholds when a small rotor is used. Thus, the conifer method presented here enables correct vulnerability analysis by avoiding artificial conductivity losses.
A convenient method for X-ray analysis in TEM that measures mass thickness and composition
NASA Astrophysics Data System (ADS)
Statham, P.; Sagar, J.; Holland, J.; Pinard, P.; Lozano-Perez, S.
2018-01-01
We consider a new approach for quantitative analysis in transmission electron microscopy (TEM) that offers the same convenience as single-standard quantitative analysis in scanning electron microscopy (SEM). Instead of a bulk standard, a thin film with known mass thickness is used as a reference. The procedure involves recording an X-ray spectrum from the reference film for each session of acquisitions on real specimens. There is no need to measure the beam current; the current only needs to be stable for the duration of the session. A new reference standard with a large (1 mm × 1 mm) area of uniform thickness of 100 nm silicon nitride is used to reveal regions of X-ray detector occlusion that would give misleading results for any X-ray method that measures thickness. Unlike previous methods, the new X-ray method does not require an accurate beam current monitor but delivers equivalent accuracy in mass thickness measurement. Quantitative compositional results are also automatically corrected for specimen self-absorption. The new method is tested using a wedge specimen of Inconel 600 that is used to calibrate the high-angle annular dark field (HAADF) signal to provide a thickness reference, and results are compared with electron energy-loss spectrometry (EELS) measurements. For the new X-ray method, element composition results are consistent with the expected composition for the alloy, and the mass thickness measurement is shown to provide an accurate alternative to EELS for thickness determination in TEM without the uncertainty associated with mean free path estimates.
2011-01-01
Background: Clinical researchers have often preferred to use a fixed effects model for the primary interpretation of a meta-analysis. Heterogeneity is usually assessed via the well-known Q and I2 statistics, along with the random effects estimate they imply. In recent years, alternative methods for quantifying heterogeneity have been proposed that are based on a 'generalised' Q statistic. Methods: We review 18 IPD meta-analyses of RCTs into treatments for cancer, in order to quantify the amount of heterogeneity present and also to discuss practical methods for explaining heterogeneity. Results: Differing results were obtained when the standard Q and I2 statistics were used to test for the presence of heterogeneity. The two meta-analyses with the largest amount of heterogeneity were investigated further, and on inspection the straightforward application of a random effects model was not deemed appropriate. Compared to the standard Q statistic, the generalised Q statistic provided a more accurate platform for estimating the amount of heterogeneity in the 18 meta-analyses. Conclusions: Explaining heterogeneity via the pre-specification of trial subgroups, graphical diagnostic tools and sensitivity analyses produced a more desirable outcome than an automatic application of the random effects model. Generalised Q statistic methods for quantifying and adjusting for heterogeneity should be incorporated as standard into statistical software. Software is provided to help achieve this aim. PMID:21473747
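For reference, the standard Q and I2 statistics mentioned above follow directly from inverse-variance weights; a minimal sketch with invented trial-level effect estimates (not data from the 18 reviewed meta-analyses):

```python
# Cochran's Q and I^2 for a fixed-effects meta-analysis; the per-trial
# effects and standard errors below are illustrative only.
import numpy as np

theta = np.array([0.12, 0.30, -0.05, 0.22, 0.18])  # trial effect estimates
se    = np.array([0.10, 0.15, 0.12, 0.08, 0.20])   # their standard errors

w = 1.0 / se**2                           # inverse-variance weights
theta_fe = np.sum(w * theta) / np.sum(w)  # fixed-effects pooled estimate
Q = np.sum(w * (theta - theta_fe)**2)     # Cochran's Q
df = theta.size - 1
I2 = max(0.0, (Q - df) / Q) * 100.0       # I^2 as a percentage

print(f"pooled = {theta_fe:.3f}, Q = {Q:.2f} on {df} df, I^2 = {I2:.1f}%")
```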
Liang, Xue; Ji, Hai-yan; Wang, Peng-xin; Rao, Zhen-hong; Shen, Bing-hui
2010-01-01
The multiplicative scatter correction (MSC) preprocessing method was used to effectively remove noise introduced into the original spectra by environmental physical factors. The principal components of the near-infrared spectra were then calculated by nonlinear iterative partial least squares (NIPALS) before building the back-propagation artificial neural network (BP-ANN) model, with the number of principal components determined by cross validation. The calculated principal components were used as inputs to the artificial neural network model, which related the chlorophyll content of winter wheat to the reflectance spectrum and was used to predict chlorophyll content. The correlation coefficient (r) of the calibration set was 0.9604, while the standard deviation (SD) and relative standard deviation (RSD) were 0.187 and 5.18%, respectively. The correlation coefficient (r) of the prediction set was 0.9600, with an SD of 0.145 and an RSD of 4.21%. These results show that the MSC-ANN algorithm can effectively remove noise produced by environmental physical factors and establish an accurate model for predicting the chlorophyll content of living leaves, offering a fast alternative to the classical method for the analysis of agricultural products.
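A minimal sketch of the MSC step, assuming the usual formulation in which each spectrum is regressed on the mean spectrum and corrected as (x - a)/b; the spectra below are random placeholders for real NIR reflectance data:

```python
# Multiplicative scatter correction: remove per-spectrum offset and scale.
import numpy as np

def msc(spectra):
    """Correct each row of `spectra` against the mean spectrum."""
    ref = spectra.mean(axis=0)
    corrected = np.empty_like(spectra)
    for i, x in enumerate(spectra):
        b, a = np.polyfit(ref, x, deg=1)   # fit x ~ a + b * ref
        corrected[i] = (x - a) / b         # undo additive and multiplicative effects
    return corrected

rng = np.random.default_rng(0)
raw = rng.random((10, 256))                # 10 spectra, 256 wavelengths
print(msc(raw).shape)                      # (10, 256)
```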
New robust bilinear least squares method for the analysis of spectral-pH matrix data.
Goicoechea, Héctor C; Olivieri, Alejandro C
2005-07-01
A new second-order multivariate method has been developed for the analysis of spectral-pH matrix data, based on a bilinear least-squares (BLLS) model that achieves the second-order advantage and handles multiple calibration standards. A simulated Monte Carlo study of synthetic absorbance-pH data allowed comparison of the newly proposed BLLS methodology with constrained parallel factor analysis (PARAFAC) and with the combined multivariate curve resolution-alternating least-squares (MCR-ALS) technique under different conditions of sample-to-sample pH mismatch and analyte-background ratio. The results indicate an improved prediction ability for the new method. Experimental data generated by measuring absorption spectra of several calibration standards of ascorbic acid and samples of orange juice were subjected to second-order calibration analysis with PARAFAC, MCR-ALS, and the new BLLS method. The results indicate that the latter method provides the best analytical results with regard to analyte recovery in samples of complex composition requiring strict adherence to the second-order advantage. Linear dependencies appear when multivariate data are produced by using pH or reaction time as one of the data dimensions, posing a challenge to classical multivariate calibration models; the presently discussed algorithm is useful for such systems.
Turewicz, Michael; Kohl, Michael; Ahrens, Maike; Mayer, Gerhard; Uszkoreit, Julian; Naboulsi, Wael; Bracht, Thilo; Megger, Dominik A; Sitek, Barbara; Marcus, Katrin; Eisenacher, Martin
2017-11-10
The analysis of high-throughput mass spectrometry-based proteomics data must address the specific challenges of this technology. To this end, the comprehensive proteomics workflow offered by the de.NBI service center BioInfra.Prot provides indispensable components for the computational and statistical analysis of this kind of data. These components include tools and methods for spectrum identification and protein inference, protein quantification, expression analysis, and data standardization and publication. All of the workflow methods that address these tasks are state of the art, and, as shown in previous publications, each is adequate for its specific task and gives competitive results. The methods included in the workflow are continuously reviewed, updated and improved to adapt to new scientific developments. All of these components and methods are available as stand-alone BioInfra.Prot services or as a complete workflow. Since BioInfra.Prot provides multiple fast communication channels for access to all components of the workflow (e.g., via the BioInfra.Prot ticket system: bioinfraprot@rub.de), users can easily benefit from this service and get support from experts. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Cline, James P; Mendenhall, Marcus H; Black, David; Windover, Donald; Henins, Albert
2015-01-01
The laboratory X-ray powder diffractometer is one of the primary analytical tools in materials science. It is applicable to nearly any crystalline material, and with advanced data analysis methods, it can provide a wealth of information concerning sample character. Data from these machines, however, are beset by a complex aberration function that can be addressed through calibration with the use of NIST Standard Reference Materials (SRMs). Laboratory diffractometers can be set up in a range of optical geometries; considered herein are those of Bragg-Brentano divergent beam configuration using both incident and diffracted beam monochromators. We review the origin of the various aberrations affecting instruments of this geometry and the methods developed at NIST to align these machines in a first principles context. Data analysis methods are considered as being in two distinct categories: those that use empirical methods to parameterize the nature of the data for subsequent analysis, and those that use model functions to link the observation directly to a specific aspect of the experiment. We consider a multifaceted approach to instrument calibration using both the empirical and model based data analysis methods. The particular benefits of the fundamental parameters approach are reviewed.
An efficient Bayesian data-worth analysis using a multilevel Monte Carlo method
NASA Astrophysics Data System (ADS)
Lu, Dan; Ricciuto, Daniel; Evans, Katherine
2018-03-01
Improving the understanding of subsurface systems and thus reducing prediction uncertainty requires the collection of data. As the collection of subsurface data is costly, it is important that the data collection scheme is cost-effective. Designing a cost-effective data collection scheme, i.e., data-worth analysis, requires quantifying model parameter, prediction, and both current and potential data uncertainties. Assessing these uncertainties in large-scale stochastic subsurface hydrological model simulations using standard Monte Carlo (MC) sampling or surrogate modeling is extremely computationally intensive, sometimes even infeasible. In this work, we propose an efficient Bayesian data-worth analysis using a multilevel Monte Carlo (MLMC) method. Compared to standard MC, which requires a significantly large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, MLMC can substantially reduce computational costs using multifidelity approximations. Since the Bayesian data-worth analysis involves a great deal of expectation estimation, the cost savings from MLMC in the assessment can be substantial. While the proposed MLMC-based data-worth analysis is broadly applicable, we use it for a highly heterogeneous two-phase subsurface flow simulation to select the optimal candidate data set, namely the one giving the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data and are consistent with the standard MC estimation, but compared to standard MC, MLMC greatly reduces the computational costs.
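The telescoping estimator at the heart of MLMC can be sketched in a few lines. The toy solver below is a stand-in, not the paper's flow simulator; in a real application the fine and coarse solves at each level must share random inputs so that the corrections have small variance, and the toy only illustrates the structure:

```python
# Multilevel Monte Carlo: E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}], with many
# cheap coarse-level samples and few expensive fine-level samples.
import numpy as np

rng = np.random.default_rng(1)

def model(theta, level):
    # Toy "solver": discretization error shrinks geometrically with level.
    return np.sin(theta) + rng.normal(0.0, 2.0 ** (-level))

def mlmc_estimate(n_per_level=(4000, 1000, 250, 60)):
    total = 0.0
    for level, n in enumerate(n_per_level):
        thetas = rng.normal(size=n)                    # input parameter samples
        if level == 0:
            corr = [model(t, 0) for t in thetas]       # coarse baseline term
        else:
            # Correction between two adjacent fidelity levels.
            corr = [model(t, level) - model(t, level - 1) for t in thetas]
        total += np.mean(corr)
    return total

print(mlmc_estimate())   # approximates E[sin(theta)] = 0 for theta ~ N(0, 1)
```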
Pazesh, Samaneh; Lazorova, Lucia; Berggren, Jonas; Alderborn, Göran; Gråsjö, Johan
2016-09-10
The main purpose of the study was to evaluate various pre-processing and quantification approaches for Raman spectra to quantify low levels of amorphous content in milled lactose powder. To improve the quantification, several spectral pre-processing methods were used to adjust for background effects. The effects of spectral noise on the variation of the determined amorphous content were also investigated theoretically by propagation-of-error analysis and compared to the experimentally obtained values. Additionally, the applicability of a calibration method based on crystalline or amorphous domains for estimating amorphous content in milled lactose powder was discussed. Two straight-baseline pre-processing methods gave the best, and almost equal, performance. Among the subsequent quantification methods, PCA performed best, although classical least squares (CLS) analysis gave comparable results, while peak parameter analysis proved inferior. The standard deviations of the experimentally determined percentage amorphous content were 0.94% and 0.25% for pure crystalline and pure amorphous samples, respectively, very close to the standard deviations from propagated spectral noise. The reasonable conformity between the spectra of milled samples and synthesized spectra indicated that physical mixtures with crystalline or amorphous domains are representative for estimating the apparent amorphous content in milled lactose. Copyright © 2016 The Author(s). Published by Elsevier B.V. All rights reserved.
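As an illustration of the classical least squares step named above, the sketch fits a measured spectrum as a linear combination of pure crystalline and amorphous reference spectra and reads the amorphous fraction off the fitted coefficients; all spectra here are synthetic placeholders:

```python
# Classical least squares (CLS) quantification of a two-component mixture.
import numpy as np

rng = np.random.default_rng(4)
wav = np.linspace(0, 1, 300)
s_cryst = np.exp(-((wav - 0.40) / 0.02) ** 2)   # sharp crystalline band
s_amorp = np.exp(-((wav - 0.45) / 0.10) ** 2)   # broad amorphous band
S = np.column_stack([s_cryst, s_amorp])         # reference spectra matrix

# "Measured" spectrum: 5% amorphous plus noise.
mix = 0.95 * s_cryst + 0.05 * s_amorp + rng.normal(0, 0.005, wav.size)

coef, *_ = np.linalg.lstsq(S, mix, rcond=None)  # least-squares contributions
amorphous_pct = 100 * coef[1] / coef.sum()
print(f"estimated amorphous content = {amorphous_pct:.1f}%")
```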
Simple and accurate quantification of BTEX in ambient air by SPME and GC-MS.
Baimatova, Nassiba; Kenessov, Bulat; Koziel, Jacek A; Carlsen, Lars; Bektassov, Marat; Demyanenko, Olga P
2016-07-01
Benzene, toluene, ethylbenzene and xylenes (BTEX) comprise one of the most ubiquitous and hazardous groups of ambient air pollutants of concern. Application of standard analytical methods for quantification of BTEX is limited by the complexity of sampling and sample preparation equipment, and by budget requirements. Methods based on SPME represent a simpler alternative, but still require complex calibration procedures. The objective of this research was to develop a simpler, low-budget, and accurate method for quantification of BTEX in ambient air based on SPME and GC-MS. Standard 20-mL headspace vials were used for field air sampling and calibration. To avoid the challenges of obtaining and working with 'zero' air, slope factors of external standard calibration were determined using standard addition and inherently polluted lab air. For the polydimethylsiloxane (PDMS) fiber, differences between the slope factors of calibration plots obtained using lab and outdoor air were below 14%. The PDMS fiber provided higher precision during calibration, while the use of a Carboxen/PDMS fiber resulted in lower detection limits for benzene and toluene. To provide sufficient accuracy, the use of 20-mL vials requires triplicate sampling and analysis. The method was successfully applied to the analysis of 108 ambient air samples from Almaty, Kazakhstan. Average concentrations of benzene, toluene, ethylbenzene and o-xylene were 53, 57, 11 and 14 µg m(-3), respectively. The developed method can be modified for further quantification of a wider range of volatile organic compounds in air. In addition, the new method is amenable to automation. Copyright © 2016 Elsevier B.V. All rights reserved.
Viera, Mariela S; Rizzetti, Tiele M; de Souza, Maiara P; Martins, Manoel L; Prestes, Osmar D; Adaime, Martha B; Zanella, Renato
2017-12-01
In this study, a QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe) method, optimized by a 2^3 full factorial design, was developed for the determination of 72 pesticides in plant parts of carrot, corn, melon, rice, soy, silage, tobacco, cassava, lettuce and wheat by ultra-high-performance liquid chromatography tandem mass spectrometry (UHPLC-MS/MS). Considering the complexity of these matrices and the need for matrix-based calibration, a new calibration approach based on single-level standard addition in the sample (SLSAS) was proposed in this work and compared with matrix-matched calibration (MMC), procedural standard calibration (PSC) and diluted standard addition calibration (DSAC). All approaches presented satisfactory validation parameters, with recoveries from 70 to 120% and relative standard deviations ≤ 20%. SLSAS was the most practical of the evaluated approaches and proved to be an effective way of calibration. Method limits of detection were between 4.8 and 48 μg kg(-1) and limits of quantification were from 16 to 160 μg kg(-1). Application of the method to different kinds of plants found residues of 20 pesticides, which were quantified with z-score values ≤ 2 in comparison with other calibration approaches. The proposed QuEChERS method, combined with UHPLC-MS/MS analysis and an easy and effective calibration procedure, presented satisfactory results for pesticide residue determination in different crop plants and is a good alternative for routine analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
de Babos, Diego Victor; Bechlin, Marcos André; Barros, Ariane Isis; Ferreira, Edilene Cristina; Gomes Neto, José Anchieta; de Oliveira, Silvana Ruella
2016-05-15
A new method is proposed for the simultaneous determination of Mo and Ni in plant materials by high-resolution continuum source graphite furnace atomic absorption spectrometry (HR-CS GFAAS), employing direct solid sample analysis (DSS) and internal standardization (IS). Cobalt was used as internal standard to minimize matrix effects during Ni determinations, enabling the use of aqueous standards for calibration. Correlation coefficients for the calibration curves were typically better than 0.9937. The performance of the method was checked by analysis of six plant certified reference materials, and the results for Mo and Ni were in agreement with the certified values (95% confidence level, t-test). Analysis was made of different types of plant materials used as renewable sources of energy, including sugarcane leaves, banana tree fiber, soybean straw, coffee pods, orange bagasse, peanut hulls, and sugarcane bagasse. The concentrations found for Mo and Ni ranged from 0.08 to 0.63 ng mg(-1) and from 0.41 to 6.92 ng mg(-1), respectively. Precision (RSD) varied from 2.1% to 11% for Mo and from 3.7% to 10% for Ni. Limits of quantification of 0.055 and 0.074 ng were obtained for Mo and Ni, respectively. Copyright © 2016 Elsevier B.V. All rights reserved.
Quality Control Method for a Micro-Nano-Channel Microfabricated Device
NASA Technical Reports Server (NTRS)
Grattoni, Alessandro; Ferrari, Mauro; Li, Xuewu
2012-01-01
A variety of silicon-fabricated devices is used in medical applications such as drug and cell delivery, and DNA and protein separation and analysis. When a fluidic device inlet is connected to a compressed gas reservoir, and the outlet is at a lower pressure, a gas flow occurs through the membrane toward the outside. The method relies on the measurement of the gas pressure over the elapsed time inside the upstream and downstream environments. By knowing the volume of the upstream reservoir, the gas flow rate through the membrane over the pressure drop can be calculated. This quality control method consists of measuring the gas flow through a device and comparing the results with a standard curve, which can be obtained by testing standard devices. Standard devices can be selected through a variety of techniques, both destructive and nondestructive, such as SEM, AFM, and standard particle filtration.
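A back-of-the-envelope version of that flow computation, assuming ideal-gas behavior and an isothermal upstream reservoir of known volume (all numbers invented):

```python
# Molar gas flow through the membrane from the upstream pressure decay:
# n_dot = -V * (dP/dt) / (R * T), by the ideal-gas law at constant V and T.
import numpy as np

R = 8.314          # gas constant, J mol^-1 K^-1
T = 295.0          # temperature, K
V_up = 50e-6       # upstream reservoir volume, m^3 (50 mL, assumed)

t = np.array([0.0, 60.0, 120.0, 180.0])            # elapsed time, s
P_up = np.array([2.00e5, 1.96e5, 1.92e5, 1.88e5])  # measured upstream pressure, Pa

dPdt = np.polyfit(t, P_up, 1)[0]   # Pa/s, slope of the pressure decay
n_dot = -V_up * dPdt / (R * T)     # mol/s leaving the reservoir
print(f"gas flow = {n_dot:.3e} mol/s")
```

In the quality-control scheme described above, this measured flow would then be compared against the standard curve obtained from known-good devices.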
A novel spinal kinematic analysis using X-ray imaging and vicon motion analysis: a case study.
Noh, Dong K; Lee, Nam G; You, Joshua H
2014-01-01
This study highlights a novel spinal kinematic analysis method and the feasibility of X-ray imaging measurements to accurately assess thoracic spine motion. The advanced X-ray Nash-Moe method and analysis were used to compute the segmental range of motion in thoracic vertebra pedicles in vivo. This Nash-Moe X-ray imaging method was compared with a standardized method using the Vicon 3-dimensional motion capture system. Linear regression analysis showed an excellent and significant correlation between the two methods (R2 = 0.99, p < 0.05), suggesting that the analysis of spinal segmental range of motion using X-ray imaging measurements was accurate and comparable to the conventional 3-dimensional motion analysis system. Clinically, this novel finding is compelling evidence demonstrating that measurements with X-ray imaging are useful to accurately decipher pathological spinal alignment and movement impairments in idiopathic scoliosis (IS).
Roberts, D J; Spellman, R A; Sanok, K; Chen, H; Chan, M; Yurt, P; Thakur, A K; DeVito, G L; Murli, H; Stankowski, L F
2012-05-01
A flow cytometric procedure for determining mitotic index (MI) as part of the metaphase chromosome aberrations assay, developed and utilized routinely at Pfizer as part of their standard assay design, has been adopted successfully by Covance laboratories. This method, using antibodies against phosphorylated histone tails (H3PS10) and nucleic acid stain, has been evaluated by the two independent test sites and compared to manual scoring. Primary human lymphocytes were treated with cyclophosphamide, mitomycin C, benzo(a)pyrene, and etoposide at concentrations inducing dose-dependent cytotoxicity. Deming regression analysis indicates that the results generated via flow cytometry (FCM) were more consistent between sites than those generated via microscopy. Further analysis using the Bland-Altman modification of the Tukey mean difference method supports this finding, as the standard deviations (SDs) of differences in MI generated by FCM were less than half of those generated manually. Decreases in scoring variability owing to the objective nature of FCM, and the greater number of cells analyzed, make FCM a superior method for MI determination. In addition, the FCM method has proven to be transferable and easily integrated into standard genetic toxicology laboratory operations. Copyright © 2012 Wiley Periodicals, Inc.
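The Bland-Altman (Tukey mean-difference) calculation used in that comparison reduces to a few lines; the paired mitotic-index values below are invented:

```python
# Bland-Altman agreement between two measurement methods.
import numpy as np

mi_fcm   = np.array([4.1, 3.8, 2.9, 5.0, 4.4, 3.2])  # % mitotic cells, FCM
mi_micro = np.array([4.4, 3.5, 3.1, 5.3, 4.0, 3.6])  # % mitotic cells, manual

diff = mi_fcm - mi_micro
bias = diff.mean()                   # mean difference between methods
sd   = diff.std(ddof=1)              # SD of differences (the key comparison metric)
loa  = (bias - 1.96 * sd, bias + 1.96 * sd)   # 95% limits of agreement

print(f"bias = {bias:.2f}, limits of agreement = {loa[0]:.2f} .. {loa[1]:.2f}")
```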
Density-cluster NMA: A new protein decomposition technique for coarse-grained normal mode analysis.
Demerdash, Omar N A; Mitchell, Julie C
2012-07-01
Normal mode analysis has emerged as a useful technique for investigating protein motions on long time scales. This is largely due to the advent of coarse-graining techniques, particularly Hooke's Law-based potentials and the rotational-translational blocking (RTB) method for reducing the size of the force-constant matrix, the Hessian. Here we present a new method for domain decomposition for use in RTB that is based on hierarchical clustering of atomic density gradients, which we call Density-Cluster RTB (DCRTB). The method reduces the number of degrees of freedom by 85-90% compared with the standard blocking approaches. We compared the normal modes from DCRTB against standard RTB using 1-4 residues in sequence in a single block, with good agreement between the two methods. We also show that Density-Cluster RTB and standard RTB perform well in capturing the experimentally determined direction of conformational change. Significantly, we report superior correlation of DCRTB with B-factors compared with 1-4 residue per block RTB. Finally, we show significant reduction in computational cost for Density-Cluster RTB that is nearly 100-fold for many examples. Copyright © 2012 Wiley Periodicals, Inc.
Gao, Hongying; Deng, Shibing; Obach, R Scott
2015-12-01
An unbiased scanning methodology using ultra high-performance liquid chromatography coupled with high-resolution mass spectrometry was used to bank data and plasma samples for comparing the data generated at different dates. This method was applied to bank the data generated earlier in animal samples and then to compare the exposure to metabolites in animal versus human for safety assessment. With neither authentic standards nor prior knowledge of the identities and structures of metabolites, full scans for precursor ions and all ion fragments (AIF) were employed with a generic gradient LC method to analyze plasma samples at positive and negative polarity, respectively. In a total of 22 tested drugs and metabolites, 21 analytes were detected using this unbiased scanning method except that naproxen was not detected due to low sensitivity at negative polarity and interference at positive polarity; and 4'- or 5-hydroxy diclofenac was not separated by a generic UPLC method. Statistical analysis of the peak area ratios of the analytes versus the internal standard in five repetitive analyses over approximately 1 year demonstrated that the analysis variation was significantly different from sample instability. The confidence limits for comparing the exposure using peak area ratio of metabolites in animal plasma versus human plasma measured over approximately 1 year apart were comparable to the analysis undertaken side by side on the same days. These statistical analysis results showed it was feasible to compare data generated at different dates with neither authentic standards nor prior knowledge of the analytes.
Airborne environmental endotoxin: a cross-validation of sampling and analysis techniques.
Walters, M; Milton, D; Larsson, L; Ford, T
1994-01-01
A standard method for measurement of airborne environmental endotoxin was developed and field tested in a fiberglass insulation-manufacturing facility. This method involved sampling with a capillary-pore membrane filter, extraction in buffer using a sonication bath, and analysis by the kinetic-Limulus assay with resistant-parallel-line estimation (KLARE). Cross-validation of the extraction and assay method was performed by comparison with methanolysis of samples followed by 3-hydroxy fatty acid (3-OHFA) analysis by gas chromatography-mass spectrometry. Direct methanolysis of filter samples and methanolysis of buffer extracts of the filters yielded similar 3-OHFA content (P = 0.72); the average difference was 2.1%. Analysis of buffer extracts for endotoxin content by the KLARE method and by gas chromatography-mass spectrometry for 3-OHFA content produced similar results (P = 0.23); the average difference was 0.88%. The source of endotoxin was gram-negative bacteria growing in recycled washwater used to clean the insulation-manufacturing equipment. The endotoxin and bacteria become airborne during spray cleaning operations. The types of 3-OHFAs in the bacteria cultured from the washwater, present in the washwater, and present in the air were similar. Virtually all of the bacteria cultured from air and water were gram-negative, composed mostly of two species, Deleya aesta and Acinetobacter johnsonii. Airborne countable bacteria correlated well with endotoxin (r2 = 0.64). Replicate sampling showed that results with the standard sampling, extraction, and Limulus assay by the KLARE method were highly reproducible (95% confidence interval for endotoxin measurement ±0.28 log10). These results demonstrate the accuracy, precision, and sensitivity of the standard procedure proposed for airborne environmental endotoxin. PMID:8161191
NASA Astrophysics Data System (ADS)
Lu, D.; Ricciuto, D. M.; Evans, K. J.
2017-12-01
Data-worth analysis plays an essential role in improving the understanding of the subsurface system, in developing and refining subsurface models, and in supporting rational water resources management. However, data-worth analysis is computationally expensive, as it requires quantifying parameter uncertainty, prediction uncertainty, and both current and potential data uncertainties. Assessment of these uncertainties in large-scale stochastic subsurface simulations using standard Monte Carlo (MC) sampling or advanced surrogate modeling is extremely computationally intensive, sometimes even infeasible. In this work, we propose an efficient Bayesian analysis of data-worth using a multilevel Monte Carlo (MLMC) method. Compared to standard MC, which requires a significantly large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, MLMC can substantially reduce the computational cost with the use of multifidelity approximations. As the data-worth analysis involves a great deal of expectation estimation, the cost savings from MLMC in the assessment can be substantial. While the proposed MLMC-based data-worth analysis is broadly applicable, we apply it to a highly heterogeneous oil reservoir simulation to select the optimal candidate data set, i.e., the one giving the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data and are consistent with the estimates obtained from standard MC. Compared to standard MC, however, MLMC greatly reduces the computational cost of the uncertainty reduction estimation, saving up to 600 days of computation when a single processor is used.
Well-characterized and standardized methods are the foundation upon which monitoring of regulated and unregulated contaminants in drinking water are based. To obtain reliable, high quality data for trace analysis of contaminants, these methods must be rugged, selective and sensit...
Evaluation of digestion methods for analysis of trace metals in mammalian tissues and NIST 1577c.
Binder, Grace A; Metcalf, Rainer; Atlas, Zachary; Daniel, Kenyon G
2018-02-15
Digestion techniques for ICP analysis have been poorly studied for biological samples. This report describes an optimized method for analysis of trace metals that can be used across a variety of sample types. Digestion methods were tested and optimized with the analysis of trace metals in cancerous as compared to normal tissue as the end goal. Anthropological, forensic, oncological and environmental research groups can employ this method reasonably cheaply and safely whilst still being able to compare between laboratories. We examined combined HNO3 and H2O2 digestion at 170 °C for human, porcine and bovine samples whether they are frozen, fresh or lyophilized powder. Little discrepancy is found between microwave digestion and PFA Teflon pressure vessels. The elements of interest (Cu, Zn, Fe and Ni) yielded consistently higher and more accurate values on standard reference material than samples heated to 75 °C or samples that utilized HNO3 alone. Use of H2SO4 does not improve homogeneity of the sample and lowers precision during ICP analysis. High temperature digestions (>165 °C) using a combination of HNO3 and H2O2 as outlined are proposed as a standard technique for all mammalian tissues, specifically human tissues, and yield greater than 300% higher values than samples digested at 75 °C regardless of the acid or acid combinations used. The proposed standardized technique is designed to accurately quantify potential discrepancies in metal loads between cancerous and healthy tissues and applies to numerous tissue studies requiring quick, effective and safe digestions. Copyright © 2017 Elsevier Inc. All rights reserved.
A fast and efficient method for device level layout analysis
NASA Astrophysics Data System (ADS)
Dong, YaoQi; Zou, Elaine; Pang, Jenny; Huang, Lucas; Yang, Legender; Zhang, Chunlei; Du, Chunshan; Hu, Xinyi; Wan, Qijian
2017-03-01
There is an increasing demand for device-level layout analysis, especially as technology advances. The analysis studies standard cells by extracting and classifying critical dimension parameters. There are several parameters to extract, such as channel width, channel length, gate-to-active distance, and active-to-adjacent-active distance; for 14 nm technology, additional parameters are also of interest. On the one hand, these parameters are very important for studying standard cell structures and for SPICE model development, with the goal of improving standard cell manufacturing yield and optimizing circuit performance; on the other hand, full-chip device statistics can provide useful information for diagnosing yield issues. Device analysis is essential for standard cell customization and enhancement and for manufacturability failure diagnosis. Traditional parasitic parameter extraction tools such as Calibre xRC are powerful but not sufficient for this device-level layout analysis application, because engineers want to review, classify, and filter the data more easily. This paper presents a fast and efficient method based on Calibre equation-based DRC (eqDRC). Equation-based DRC extends traditional DRC technology with a flexible programmable modeling engine that allows the user to define grouped multi-dimensional feature measurements using mathematical expressions. This paper demonstrates how such an engine and its programming language can be used to implement critical device parameter extraction. The extracted device parameters are stored in a DFM database that can be processed by Calibre YieldServer, data-processing software that lets engineers query, manipulate, modify, and create data in a DFM database. These parameters, known as properties in the eqDRC language, can be annotated back to the layout for easy review. Calibre DesignRev can create an HTML-formatted report of the results displayed in Calibre RVE, which makes it easy to share results among groups. This method has been proven in use by the SMIC PDE and SPICE teams.
Setting the standards for signal transduction research.
Saez-Rodriguez, Julio; Alexopoulos, Leonidas G; Stolovitzky, Gustavo
2011-02-15
Major advances in high-throughput technology platforms, coupled with increasingly sophisticated computational methods for systematic data analysis, have provided scientists with tools to better understand the complexity of signaling networks. In this era of massive and diverse data collection, standardization efforts that streamline data gathering, analysis, storage, and sharing are becoming a necessity. Here, we give an overview of current technologies to study signal transduction. We argue that along with the opportunities the new technologies open, their heterogeneous nature poses critical challenges for data handling that are further increased when data are to be integrated into mathematical models. Efficient standardization through markup languages and data annotation is a sine qua non condition for a systems-level analysis of signaling processes. It remains to be seen to what extent, and how quickly, the emerging standardization efforts will be embraced by the signaling community.
Translating Radiometric Requirements for Satellite Sensors to Match International Standards.
Pearlman, Aaron; Datla, Raju; Kacker, Raghu; Cao, Changyong
2014-01-01
International scientific standards organizations created standards on evaluating uncertainty in the early 1990s. Although scientists from many fields use these standards, they are not consistently implemented in the remote sensing community, where traditional error analysis framework persists. For a satellite instrument under development, this can create confusion in showing whether requirements are met. We aim to create a methodology for translating requirements from the error analysis framework to the modern uncertainty approach using the product level requirements of the Advanced Baseline Imager (ABI) that will fly on the Geostationary Operational Environmental Satellite R-Series (GOES-R). In this paper we prescribe a method to combine several measurement performance requirements, written using a traditional error analysis framework, into a single specification using the propagation of uncertainties formula. By using this approach, scientists can communicate requirements in a consistent uncertainty framework leading to uniform interpretation throughout the development and operation of any satellite instrument.
Translating Radiometric Requirements for Satellite Sensors to Match International Standards
Pearlman, Aaron; Datla, Raju; Kacker, Raghu; Cao, Changyong
2014-01-01
International scientific standards organizations created standards on evaluating uncertainty in the early 1990s. Although scientists from many fields use these standards, they are not consistently implemented in the remote sensing community, where traditional error analysis framework persists. For a satellite instrument under development, this can create confusion in showing whether requirements are met. We aim to create a methodology for translating requirements from the error analysis framework to the modern uncertainty approach using the product level requirements of the Advanced Baseline Imager (ABI) that will fly on the Geostationary Operational Environmental Satellite R-Series (GOES-R). In this paper we prescribe a method to combine several measurement performance requirements, written using a traditional error analysis framework, into a single specification using the propagation of uncertainties formula. By using this approach, scientists can communicate requirements in a consistent uncertainty framework leading to uniform interpretation throughout the development and operation of any satellite instrument. PMID:26601032
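For context, the propagation-of-uncertainties formula referenced here is, in GUM notation, u_c²(y) = Σ (∂f/∂x_i)² u²(x_i) for y = f(x_1, ..., x_n). A numeric sketch with made-up requirement components, assuming a simple additive model so that all sensitivity coefficients equal 1:

```python
# Combining several 1-sigma requirement components into one combined
# standard uncertainty by root-sum-of-squares; values are illustrative only.
import numpy as np

u = np.array([0.3, 0.5, 0.2, 0.4])   # e.g. noise, calibration, drift, stray light (%)
c = np.ones_like(u)                  # sensitivities df/dx_i for y = sum(x_i)

u_combined = np.sqrt(np.sum((c * u) ** 2))
print(f"combined standard uncertainty = {u_combined:.2f} %")   # ~0.73 %
```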
Spike shape analysis of electromyography for parkinsonian tremor evaluation.
Marusiak, Jarosław; Andrzejewska, Renata; Świercz, Dominika; Kisiel-Sajewicz, Katarzyna; Jaskólska, Anna; Jaskólski, Artur
2015-12-01
Standard electromyography (EMG) parameters have limited utility for evaluation of Parkinson disease (PD) tremor. Spike shape analysis (SSA) EMG parameters are more sensitive than standard EMG parameters for studying motor control mechanisms in healthy subjects. SSA of EMG has not been used to assess parkinsonian tremor. This study assessed the utility of SSA and standard time and frequency analysis for electromyographic evaluation of PD-related resting tremor. We analyzed 1-s periods of EMG recordings to detect nontremor and tremor signals in relaxed biceps brachii muscle of seven mild to moderate PD patients. SSA revealed higher mean spike amplitude, duration, and slope and lower mean spike frequency in tremor signals than in nontremor signals. Standard EMG parameters (root mean square, median, and mean frequency) did not show differences between the tremor and nontremor signals. SSA of EMG data is a sensitive method for parkinsonian tremor evaluation. © 2015 Wiley Periodicals, Inc.
ERIC Educational Resources Information Center
Snelling, Anastasia M.; Yezek, Jennifer
2012-01-01
Background: The study investigated how nutrient standards affected the number of kilocalories and grams of fat and saturated fat in competitive foods offered and sold in 3 high schools. Methods: The study is a quasi-experimental design with 3 schools serving as the units of assignment and analysis. The effect of the nutrient standards was measured…
NASA Astrophysics Data System (ADS)
Cao, Qian; Wan, Xiaoxia; Li, Junfeng; Liu, Qiang; Liang, Jingxing; Li, Chan
2016-10-01
This paper proposed two weight functions based on principal component analysis (PCA) to preserve more colorimetric information in the spectral data compression process. One weight function consisted of the CIE XYZ color-matching functions, representing the characteristics of the human visual system, while the other combined the CIE XYZ color-matching functions with the relative spectral power distribution of the CIE standard illuminant D65. The improvements obtained from the two proposed methods were tested by compressing and reconstructing the reflectance spectra of 1600 glossy Munsell color chips and 1950 Natural Color System color chips, as well as six multispectral images. Performance was evaluated by the mean color differences under the CIE 1931 standard colorimetric observer and the CIE standard illuminants D65 and A. The mean root mean square errors between the original and reconstructed spectra were also calculated. The experimental results show that the two proposed methods significantly outperform standard PCA and two other weighted PCA variants in colorimetric reconstruction accuracy, with only very slight degradation in spectral reconstruction accuracy. In addition, the weight function including the CIE standard illuminant D65 improved the colorimetric reconstruction accuracy compared to the weight function without it.
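A minimal sketch of this kind of weighted PCA compression, with random arrays standing in for the reflectance sets and the CIE tables (the real weight would be built from the color-matching functions, optionally multiplied by the D65 power distribution):

```python
# Weighted PCA: weight reflectances before computing the basis, then undo
# the weighting on reconstruction. All data here are placeholders.
import numpy as np

rng = np.random.default_rng(2)
R = rng.random((1600, 36))        # reflectance set, 36 bands (e.g. 380-730 nm)
w = rng.random(36) + 0.1          # stand-in for a CMF-based weight function

Rw = R * w                        # apply the colorimetric weight
mean = Rw.mean(axis=0)
U, S, Vt = np.linalg.svd(Rw - mean, full_matrices=False)
basis = Vt[:3]                    # keep 3 principal components

coeff = (Rw - mean) @ basis.T             # compressed representation
R_rec = ((coeff @ basis) + mean) / w      # reconstruct and undo the weighting
print(np.sqrt(np.mean((R - R_rec) ** 2)))  # spectral RMSE
```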
Development of a Thiolysis HPLC Method for the Analysis of Procyanidins in Cranberry Products.
Gao, Chi; Cunningham, David G; Liu, Haiyan; Khoo, Christina; Gu, Liwei
2018-03-07
The objective of this study was to develop a thiolysis HPLC method to quantify total procyanidins, the ratio of A-type linkages, and A-type procyanidin equivalents in cranberry products. Cysteamine was utilized as a low-odor substitute for toluene-α-thiol in the thiolysis depolymerization. A reaction temperature of 70 °C and a reaction time of 20 min, in 0.3 M HCl, were determined to be the optimum depolymerization conditions. Thiolytic products of cranberry procyanidins were separated by RP-HPLC and identified using high-resolution mass spectrometry. Standard curves of good linearity were obtained for thiolyzed procyanidin dimer A2 and B2 external standards. The detection and quantification limits, recovery, and precision of the method were validated. The new method was applied to quantify total procyanidins, average degree of polymerization, ratio of A-type linkages, and A-type procyanidin equivalents in cranberry products. Results showed that the method was suitable for quantitative and qualitative analysis of procyanidins in cranberry products.
Multiplex cDNA quantification method that facilitates the standardization of gene expression data
Gotoh, Osamu; Murakami, Yasufumi; Suyama, Akira
2011-01-01
Microarray-based gene expression measurement is one of the major methods for transcriptome analysis. However, current microarray data are substantially affected by microarray platforms and RNA references because the microarray method provides only the relative amounts of gene expression levels. Therefore, valid comparisons of microarray data require standardized platforms, internal and/or external controls, and complicated normalizations. These requirements impose limitations on the extensive comparison of gene expression data. Here, we report an effective approach to removing these unfavorable limitations by measuring the absolute amounts of gene expression levels on common DNA microarrays. We have developed a multiplex cDNA quantification method called GEP-DEAN (Gene expression profiling by DCN-encoding-based analysis). The method was validated using chemically synthesized DNA strands of known quantities and cDNA samples prepared from mouse liver, demonstrating that the absolute amounts of cDNA strands were successfully measured with a sensitivity of 18 zmol in a highly multiplexed manner in 7 h. PMID:21415008
Keller, Nicole S; Stefánsson, Andri; Sigfússon, Bergur
2014-10-01
A method for the analysis of arsenic species in aqueous sulfide samples is presented. The method uses an ion chromatography system connected to a hydride-generation atomic fluorescence spectrometer (IC-HG-AFS). With this method, inorganic As(III) and As(V) species in water samples can be analyzed, including arsenite (HnAs(III)O3(n-3)), thioarsenite (HnAs(III)S3(n-3)), arsenate (HnAs(V)O4(n-3)), monothioarsenate (HnAs(V)SO3(n-3)), dithioarsenate (HnAs(V)S2O2(n-3)), trithioarsenate (HnAs(V)S3O(n-3)) and tetrathioarsenate (HnAs(V)S4(n-3)). Peak identification and retention times were determined based on standard analysis of the various arsenic compounds. The analytical detection limit was ~1-3 µg L(-1) (LOD), depending on the quality of the baseline. This low detection limit also makes the method applicable for discriminating between waters that meet the drinking water standard of max. 10 µg L(-1) As and waters that do not. The new method was successfully applied for on-site determination of arsenic species in natural sulfidic waters, in which seven species were unambiguously identified. Copyright © 2014 Elsevier B.V. All rights reserved.
Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses
ERIC Educational Resources Information Center
Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo
2018-01-01
Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…
Collier, J W; Shah, R B; Bryant, A R; Habib, M J; Khan, M A; Faustino, P J
2011-02-20
A rapid, selective, and sensitive gradient HPLC method was developed for the analysis of dissolution samples of levothyroxine sodium tablets. Current USP methodology for levothyroxine (L-T(4)) was not adequate to resolve co-elutants from a variety of levothyroxine drug product formulations. The USP method for analyzing dissolution samples of the drug product has shown significant intra- and inter-day variability. The sources of method variability include chromatographic interferences introduced by the dissolution media and the formulation excipients. In the present work, chromatographic separation of levothyroxine was achieved on an Agilent 1100 Series HPLC with a Waters Nova-pak column (250 mm × 3.9 mm) using a 0.01 M phosphate buffer (pH 3.0)-methanol (55:45, v/v) in a gradient elution mobile phase at a flow rate of 1.0 mL/min and detection UV wavelength of 225 nm. The injection volume was 800 μL and the column temperature was maintained at 28°C. The method was validated according to USP Category I requirements. The validation characteristics included accuracy, precision, specificity, linearity, and analytical range. The standard curve was found to have a linear relationship (r(2)>0.99) over the analytical range of 0.08-0.8 μg/mL. Accuracy ranged from 90 to 110% for low quality control (QC) standards and 95 to 105% for medium and high QC standards. Precision was <2% at all QC levels. The method was found to be accurate, precise, selective, and linear for L-T(4) over the analytical range. The HPLC method was successfully applied to the analysis of dissolution samples of marketed levothyroxine sodium tablets. Published by Elsevier B.V.
Collier, J.W.; Shah, R.B.; Bryant, A.R.; Habib, M.J.; Khan, M.A.; Faustino, P.J.
2011-01-01
A rapid, selective, and sensitive gradient HPLC method was developed for the analysis of dissolution samples of levothyroxine sodium tablets. Current USP methodology for levothyroxine (l-T4) was not adequate to resolve co-elutants from a variety of levothyroxine drug product formulations. The USP method for analyzing dissolution samples of the drug product has shown significant intra- and inter-day variability. The sources of method variability include chromatographic interferences introduced by the dissolution media and the formulation excipients. In the present work, chromatographic separation of levothyroxine was achieved on an Agilent 1100 Series HPLC with a Waters Nova-pak column (250mm × 3.9mm) using a 0.01 M phosphate buffer (pH 3.0)–methanol (55:45, v/v) in a gradient elution mobile phase at a flow rate of 1.0 mL/min and detection UV wavelength of 225 nm. The injection volume was 800 µL and the column temperature was maintained at 28 °C. The method was validated according to USP Category I requirements. The validation characteristics included accuracy, precision, specificity, linearity, and analytical range. The standard curve was found to have a linear relationship (r2 > 0.99) over the analytical range of 0.08–0.8 µg/mL. Accuracy ranged from 90 to 110% for low quality control (QC) standards and 95 to 105% for medium and high QC standards. Precision was <2% at all QC levels. The method was found to be accurate, precise, selective, and linear for l-T4 over the analytical range. The HPLC method was successfully applied to the analysis of dissolution samples of marketed levothyroxine sodium tablets. PMID:20947276
Yoakum, A M; Stewart, P L; Sterrett, J E
1975-01-01
An emission spectrochemical method is described for the determination of trace quantities of platinum, lead, and manganese in biological tissues. Total energy burns in an argon-oxygen atmosphere are employed. Sample preparation, conditions of analysis, and preparation of standards are discussed. The precision of the method is consistently better than +/- 15%, and comparative analyses indicate comparable accuracies. Data obtained for experimental rat tissues and for selected autopsy tissues are presented. PMID:1157798
Lin, Kai; Collins, Jeremy D; Lloyd-Jones, Donald M; Jolly, Marie-Pierre; Li, Debiao; Markl, Michael; Carr, James C
2016-03-01
To assess the performance of automated quantification of left ventricular function and mass based on heart deformation analysis (HDA) in asymptomatic older adults. This study complied with Health Insurance Portability and Accountability Act regulations. Following approval by the institutional review board, 160 asymptomatic older participants were recruited for cardiac magnetic resonance imaging, including two-dimensional cine images covering the entire left ventricle in short-axis view. Data analysis included the calculation of left ventricular ejection fraction (LVEF), left ventricular mass (LVM), and cardiac output (CO) using HDA and standard global cardiac function analysis (delineation of end-systolic and end-diastolic left ventricular epi- and endocardial borders). The agreement between methods was evaluated using the intraclass correlation coefficient (ICC) and coefficient of variation (CoV). HDA had a shorter processing time than the standard method (1.5 ± 0.3 min/case vs. 5.8 ± 1.4 min/case, P < 0.001). There was good agreement for LVEF (ICC = 0.552, CoV = 10.5%), CO (ICC = 0.773, CoV = 13.5%), and LVM (ICC = 0.859, CoV = 14.5%) acquired with the standard method and HDA. There was a systematic bias toward lower LVEF (62.8% ± 8.3% vs. 69.3% ± 6.7%, P < 0.001) and CO (4.4 ± 1.0 L/min vs. 4.8 ± 1.3 L/min, P < 0.001) by HDA compared to the standard technique. Conversely, HDA overestimated LVM (114.8 ± 30.1 g vs. 100.2 ± 29.0 g, P < 0.001) as compared to the reference method. HDA has the potential to measure LVEF, CO, and LVM without the need for user interaction, based on standard two-dimensional cardiac cine images. Copyright © 2015 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
High-coverage quantitative proteomics using amine-specific isotopic labeling.
Melanson, Jeremy E; Avery, Steven L; Pinto, Devanand M
2006-08-01
Peptide dimethylation with isotopically coded formaldehydes was evaluated as a potential alternative to techniques such as the iTRAQ method for comparative proteomics. The isotopic labeling strategy and custom-designed protein quantitation software were tested using protein standards and then applied to measure proteins levels associated with Alzheimer's disease (AD). The method provided high accuracy (10% error), precision (14% RSD) and coverage (70%) when applied to the analysis of a standard solution of BSA by LC-MS/MS. The technique was then applied to measure protein abundance levels in brain tissue afflicted with AD relative to normal brain tissue. 2-D LC-MS analysis identified 548 unique proteins (p<0.05). Of these, 349 were quantified with two or more peptides that met the statistical criteria used in this study. Several classes of proteins exhibited significant changes in abundance. For example, elevated levels of antioxidant proteins and decreased levels of mitochondrial electron transport proteins were observed. The results demonstrate the utility of the labeling method for high-throughput quantitative analysis.
OSPAR standard method and software for statistical analysis of beach litter data.
Schulz, Marcus; van Loon, Willem; Fleet, David M; Baggelaar, Paul; van der Meulen, Eit
2017-09-15
The aim of this study is to develop standard statistical methods and software for the analysis of beach litter data. The optimal ensemble of statistical methods comprises the Mann-Kendall trend test, the Theil-Sen slope estimation, the Wilcoxon step trend test and basic descriptive statistics. The application of Litter Analyst, a tailor-made software for analysing the results of beach litter surveys, to OSPAR beach litter data from seven beaches bordering on the south-eastern North Sea, revealed 23 significant trends in the abundances of beach litter types for the period 2009-2014. Litter Analyst revealed a large variation in the abundance of litter types between beaches. To reduce the effects of spatial variation, trend analysis of beach litter data can most effectively be performed at the beach or national level. Spatial aggregation of beach litter data within a region is possible, but resulted in a considerable reduction in the number of significant trends. Copyright © 2017 Elsevier Ltd. All rights reserved.
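A sketch of the two trend statistics named in the ensemble, using scipy (Kendall's tau against time as the basis of the Mann-Kendall trend test, and theilslopes for the slope); the yearly litter counts are invented:

```python
# Mann-Kendall trend test and Theil-Sen slope for a beach litter time series.
import numpy as np
from scipy import stats

years  = np.arange(2009, 2015)
counts = np.array([182, 160, 175, 140, 131, 118])   # items per survey (invented)

tau, p_mk = stats.kendalltau(years, counts)         # monotonic-trend test
slope, intercept, lo, hi = stats.theilslopes(counts, years)  # robust slope + CI

print(f"tau = {tau:.2f} (p = {p_mk:.3f}); "
      f"Theil-Sen slope = {slope:.1f} items/yr [{lo:.1f}, {hi:.1f}]")
```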
Viidanoja, Jyrki
2015-09-15
A new, sensitive and selective liquid chromatography-electrospray ionization-tandem mass spectrometric (LC-ESI-MS/MS) method was developed for the analysis of phospholipids (PLs) in bio-oils and fats. The analysis employs hydrophilic interaction liquid chromatography with scheduled multiple reaction monitoring (HILIC-sMRM) on a ZIC-cHILIC column. Eight PL-class-selective internal standards (homologs) were used for the semi-quantification of 14 PL classes for the first time. More than 400 scheduled MRMs were used for the measurement of PLs, with a run time of 34 min. The method's performance was evaluated for vegetable oil, animal fat and algae oil. The averaged within-run precision and between-run precision were ≤10% for all of the PL classes that had a direct homolog as an internal standard. The method accuracy was generally within 80-120% for the tested PL analytes in all three sample matrices. Copyright © 2015 Elsevier B.V. All rights reserved.
Analysis of clonazepam in a tablet dosage form using smallbore HPLC.
Spell, J C; Stewart, J T
1998-11-01
A stability-indicating, reversed-phase high-performance liquid chromatographic method utilizing a smallbore HPLC column has been developed for the determination of clonazepam in a commercial tablet dosage form. The use of a smallbore column results in substantial solvent savings, as well as greater mass sensitivity, especially in the identification of degradation peaks in a chromatogram. The method involves ultraviolet detection at 254 nm and utilized a 150 x 3.0 mm i.d. column packed with 3 µm octadecylsilane particles, with a mobile phase of water-methanol-acetonitrile (40:30:30, v/v/v) at a flow rate of 400 µL min(-1) at ambient temperature, with and without the use of 1,2-dichlorobenzene as the internal standard. The current USP method for the analysis of clonazepam, using a 300 x 3.9 mm i.d. conventional octadecylsilane column, was used as a comparison to the smallbore method. The retention times for clonazepam and the internal standard on the 3.0 mm i.d. column were 4.0 and 12.5 min, respectively. The intra- and interday RSDs on the 3.0 mm i.d. column were < 0.55% (n = 4) using the internal standard, and < 0.19% (n = 4) without the internal standard at the lower limit of the standard curve, 50 µg ml(-1); the limit of detection was 24 ng ml(-1). The assay using the 3.0 mm i.d. column was shown to be suitable for measuring clonazepam in a tablet dosage form.
Sauer, Vernon B.
2002-01-01
Surface-water computation methods and procedures are described in this report to provide standards from which a completely automated electronic processing system can be developed. To the greatest extent possible, the traditional U.S. Geological Survey (USGS) methodology and standards for streamflow data collection and analysis have been incorporated into these standards. Although USGS methodology and standards are the basis for this report, the report is applicable to other organizations doing similar work. The proposed electronic processing system allows field measurement data, including data stored on automatic field recording devices and data recorded by the field hydrographer (a person who collects streamflow and other surface-water data) in electronic field notebooks, to be input easily and automatically. A user of the electronic processing system can easily monitor the incoming data and verify and edit the data, if necessary. Input of the computational procedures, rating curves, shift requirements, and other special methods is an interactive process between the user and the electronic processing system, with much of the processing being automatic. Special computation procedures are provided for complex stations such as velocity-index, slope, and control-structure stations, and for unsteady-flow models such as the Branch-Network Dynamic Flow Model (BRANCH). Navigation paths are designed to lead the user through the computational steps for each type of gaging station (stage-only, stage-discharge, velocity-index, slope, rate-of-change-in-stage, reservoir, tide, structure, and hydraulic model stations). The proposed electronic processing system emphasizes the use of interactive graphics to provide good visual tools for unit values editing, rating curve and shift analysis, hydrograph comparisons, data-estimation procedures, data review, and other needs. Documentation, review, finalization, and publication of records are provided for with the electronic processing system, as well as archiving, quality assurance, and quality control.
Quality assurance and management in microelectronics companies: ISO 9000 versus Six Sigma
NASA Astrophysics Data System (ADS)
Lupan, Razvan; Kobi, Abdessamad; Robledo, Christian; Bacivarov, Ioan; Bacivarov, Angelica
2009-01-01
A strategy for the implementation of the Six Sigma method as an improvement solution for the ISO 9000:2000 Quality Standard is proposed. Our approach is focused on integrating the DMAIC cycle of the Six Sigma method with the PDCA process approach, highly recommended by the standard ISO 9000:2000. The Six Sigma steps applied to each part of the PDCA cycle are presented in detail, together with some tools and training examples. Based on this analysis, the authors conclude that applying the Six Sigma philosophy to the Quality Standard implementation process is the best way to achieve optimal results in quality progress and therefore in customer satisfaction.
Time-domain representation of frequency-dependent foundation impedance functions
Safak, E.
2006-01-01
Foundation impedance functions provide a simple means to account for soil-structure interaction (SSI) when studying seismic response of structures. Impedance functions represent the dynamic stiffness of the soil media surrounding the foundation. The fact that impedance functions are frequency dependent makes it difficult to incorporate SSI in standard time-history analysis software. This paper introduces a simple method to convert frequency-dependent impedance functions into time-domain filters. The method is based on the least-squares approximation of impedance functions by ratios of two complex polynomials. Such ratios are equivalent, in the time-domain, to discrete-time recursive filters, which are simple finite-difference equations giving the relationship between foundation forces and displacements. These filters can easily be incorporated into standard time-history analysis programs. Three examples are presented to show the applications of the method.
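For concreteness, a minimal numerical sketch of the conversion the abstract describes, under stated assumptions: the impedance samples K, the polynomial orders, and the function name fit_rational_filter are all hypothetical, and the linearized least-squares fit (Levy's method) is one simple way to approximate a frequency response by a ratio of two polynomials in z; the paper's own fitting procedure is not reproduced here. The fitted ratio is then applied as a discrete-time recursive filter mapping foundation displacement to force.

```python
import numpy as np
from scipy.signal import lfilter

def fit_rational_filter(freqs_hz, K, dt, nb=2, na=2):
    """Fit K(w) ~ B(z)/A(z), z = exp(i*w*dt), by linearized least squares
    (minimize ||B(z_k) - K_k * A(z_k)|| with a0 fixed to 1).
    Note: stability of the fitted filter is not guaranteed and should be checked."""
    w = 2 * np.pi * freqs_hz * dt                               # rad/sample
    z = np.exp(-1j * np.outer(w, np.arange(max(nb, na) + 1)))   # z^-n terms
    # unknowns: b0..b_nb, then a1..a_na
    M = np.hstack([z[:, :nb + 1], -K[:, None] * z[:, 1:na + 1]])
    A_ls = np.vstack([M.real, M.imag])        # stack real/imag parts
    rhs = np.concatenate([K.real, K.imag])
    coef, *_ = np.linalg.lstsq(A_ls, rhs, rcond=None)
    b = coef[:nb + 1]
    a = np.concatenate([[1.0], coef[nb + 1:]])
    return b, a

# synthetic frequency-dependent impedance (illustrative only)
dt = 0.01
f = np.linspace(0.1, 20, 200)
K = 1e6 * (1 + 0.3j * f / 20)                 # stiffness with growing damping
b, a = fit_rational_filter(f, K, dt)

# the recursive filter maps a displacement record to a foundation force record
u = np.sin(2 * np.pi * 2.0 * np.arange(0, 5, dt))
force = lfilter(b, a, u)
```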
Tenan, Matthew S; Tweedell, Andrew J; Haynes, Courtney A
2017-01-01
The timing of muscle activity is a commonly applied analytic method to understand how the nervous system controls movement. This study systematically evaluates six classes of standard and statistical algorithms to determine muscle onset in both experimental surface electromyography (EMG) and simulated EMG with a known onset time. Eighteen participants had EMG collected from the biceps brachii and vastus lateralis while performing a biceps curl or knee extension, respectively. Three established methods and three statistical methods for EMG onset were evaluated. Linear envelope, Teager-Kaiser energy operator + linear envelope, and sample entropy were the established methods evaluated, while general time series mean/variance, sequential and batch processing of parametric and nonparametric tools, and Bayesian changepoint analysis were the statistical techniques used. Visual EMG onset (experimental data) and objective EMG onset (simulated data) were compared with algorithmic EMG onset via root mean square error and linear regression models for stepwise elimination of inferior algorithms. The top algorithms for both data types were analyzed for their mean agreement with the gold standard onset and evaluation of 95% confidence intervals. The top algorithms were all Bayesian changepoint analysis iterations where the parameter of the prior (p0) was zero. The best-performing Bayesian algorithms used p0 = 0 and a posterior probability threshold for onset determination of 60-90%. While existing algorithms performed reasonably, the Bayesian changepoint analysis methodology provides greater reliability and accuracy when determining the singular onset of EMG activity in a time series. Further research is needed to determine if this class of algorithms performs equally well when the time series has multiple bursts of muscle activity.
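Since the winning class here is Bayesian changepoint analysis, a deliberately simplified single-changepoint sketch may help fix ideas: it places a flat prior over candidate changepoints in a synthetic EMG-like trace and scores each with a profile Gaussian log-likelihood. This is an illustration of the general technique, not the specific algorithm or p0 parameterization evaluated in the study.

```python
import numpy as np

def changepoint_posterior(x):
    """Pseudo-posterior over a single mean/variance changepoint in x,
    using profile Gaussian log-likelihoods and a flat prior."""
    n = len(x)
    logL = np.full(n, -np.inf)
    for k in range(5, n - 5):                 # keep a few samples per segment
        s1, s2 = x[:k], x[k:]
        # maximized Gaussian log-likelihood of the two segments (up to constants)
        logL[k] = -0.5 * (k * np.log(s1.var() + 1e-12)
                          + (n - k) * np.log(s2.var() + 1e-12))
    post = np.exp(logL - logL.max())
    return post / post.sum()

rng = np.random.default_rng(0)
t_on = 600
emg = np.concatenate([rng.normal(0, 0.05, t_on),          # baseline noise
                      rng.normal(0, 0.40, 1000 - t_on)])  # active burst
post = changepoint_posterior(emg)
onset = int(np.argmax(post))       # MAP onset sample; should land near 600
print(onset, post[onset])
```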
Li, Na; Li, Xiu-Ying; Zou, Zhe-Xiang; Lin, Li-Rong; Li, Yao-Qun
2011-07-07
In the present work, a baseline-correction method based on peak-to-derivative baseline measurement was proposed for the elimination of complex matrix interference that was mainly caused by unknown components and/or background in the analysis of derivative spectra. This novel method was applicable particularly when the matrix interfering components showed a broad spectral band, which was common in practical analysis. The derivative baseline was established by connecting two crossing points of the spectral curves obtained with a standard addition method (SAM). The applicability and reliability of the proposed method were demonstrated through both theoretical simulation and practical application. Firstly, Gaussian bands were used to simulate 'interfering' and 'analyte' bands to investigate the effect of different parameters of the interfering band on the derivative baseline. This simulation analysis verified that the accuracy of the proposed method was remarkably better than that of other conventional methods such as peak-to-zero, tangent, and peak-to-peak measurements. Then the proposed baseline-correction method was applied to the determination of benzo(a)pyrene (BaP) in vegetable oil samples by second-derivative synchronous fluorescence spectroscopy. Satisfactory results were obtained by using this new method to analyze a certified reference material (coconut oil, BCR(®)-458), with a relative error of -3.2% from the certified BaP concentration. Potentially, the proposed method can be applied to various types of derivative spectra in different fields such as UV-visible absorption spectroscopy, fluorescence spectroscopy, and infrared spectroscopy.
Huang, Shi; MacKinnon, David P.; Perrino, Tatiana; Gallo, Carlos; Cruden, Gracelyn; Brown, C Hendricks
2016-01-01
Mediation analysis often requires larger sample sizes than main effect analysis to achieve the same statistical power. Combining results across similar trials may be the only practical option for increasing statistical power for mediation analysis in some situations. In this paper, we propose a method to estimate: 1) marginal means for mediation path a, the relation of the independent variable to the mediator; 2) marginal means for path b, the relation of the mediator to the outcome, across multiple trials; and 3) the between-trial level variance-covariance matrix based on a bivariate normal distribution. We present the statistical theory and an R computer program to combine regression coefficients from multiple trials to estimate a combined mediated effect and confidence interval under a random effects model. The values of coefficients a and b from each trial, along with their standard errors, are the input for the method. This marginal-likelihood-based approach with Monte Carlo confidence intervals provides more accurate inference than the standard meta-analytic approach. We discuss computational issues, apply the method to two real-data examples, and make recommendations for the use of the method in different settings. PMID:28239330
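A hedged sketch of the Monte Carlo confidence-interval idea for the combined mediated effect a*b, assuming pooled path estimates and their standard errors are already in hand (all numbers are hypothetical; the authors' own implementation is an R program and is not reproduced here):

```python
import numpy as np

def mc_mediation_ci(a_hat, se_a, b_hat, se_b, cov_ab=0.0,
                    n_draws=100_000, level=0.95, seed=1):
    """Monte Carlo confidence interval for the mediated effect a*b,
    given pooled path coefficients and their standard errors."""
    rng = np.random.default_rng(seed)
    cov = [[se_a**2, cov_ab], [cov_ab, se_b**2]]
    draws = rng.multivariate_normal([a_hat, b_hat], cov, size=n_draws)
    ab = draws[:, 0] * draws[:, 1]            # product of sampled paths
    lo, hi = np.percentile(ab, [100 * (1 - level) / 2,
                                100 * (1 + level) / 2])
    return a_hat * b_hat, (lo, hi)

# hypothetical pooled paths from several trials
est, ci = mc_mediation_ci(a_hat=0.30, se_a=0.08, b_hat=0.25, se_b=0.10)
print(est, ci)
```

The percentile interval reflects the skewed sampling distribution of a product of coefficients, which a normal-theory interval around a*b would miss.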
NASA Astrophysics Data System (ADS)
Sitnikov, Dmitri G.; Monnin, Cian S.; Vuckovic, Dajana
2016-12-01
The comparison of extraction methods for global metabolomics is usually executed in biofluids only and focuses on metabolite coverage and method repeatability. This limits our detailed understanding of extraction parameters such as recovery and matrix effects and prevents side-by-side comparison of different sample preparation strategies. To address this gap in knowledge, seven solvent-based and solid-phase extraction methods were systematically evaluated using standard analytes spiked into both buffer and human plasma. We compared recovery, coverage, repeatability, matrix effects, selectivity and orthogonality of all methods tested for the non-lipid metabolome in combination with reversed-phase and mixed-mode liquid chromatography mass spectrometry analysis (LC-MS). Our results confirmed wide selectivity and excellent precision of solvent precipitations, but revealed their high susceptibility to matrix effects. The use of all seven methods showed high overlap and redundancy, which resulted in metabolite coverage increases of 34-80% depending on the LC-MS method employed as compared to the best single extraction protocol (methanol/ethanol precipitation), despite a 7x increase in MS analysis time and sample consumption. The most orthogonal methods to methanol-based precipitation were ion-exchange solid-phase extraction and liquid-liquid extraction using methyl tert-butyl ether. Our results help facilitate rational design and selection of sample preparation methods and internal standards for global metabolomics.
A rapid method for estimation of Pu-isotopes in urine samples using high volume centrifuge.
Kumar, Ranjeet; Rao, D D; Dubla, Rupali; Yadav, J R
2017-07-01
The conventional radio-analytical technique used for estimation of Pu-isotopes in urine samples involves anion exchange/TEVA column separation followed by alpha spectrometry. This sequence of analysis takes nearly 3-4 days to complete. Excreta analysis results are often required urgently, particularly under repeat and incidental/emergency situations. Therefore, there is a need to reduce the analysis time for the estimation of Pu-isotopes in bioassay samples. This paper gives the details of standardization of a rapid method for estimation of Pu-isotopes in urine samples using a multi-purpose centrifuge and TEVA resin followed by alpha spectrometry. The rapid method involves oxidation of urine samples and co-precipitation of plutonium along with calcium phosphate, followed by sample preparation using a high volume centrifuge and separation of Pu using TEVA resin. The Pu fraction was electrodeposited and activity estimated by alpha spectrometry using 236 Pu tracer recovery. Ten routine urine samples of radiation workers were analyzed, and consistent radiochemical tracer recovery was obtained in the range 47-88%, with a mean and standard deviation of 64.4% and 11.3%, respectively. With this newly standardized technique, the whole analytical procedure is completed within 9 h (one working day). Copyright © 2017 Elsevier Ltd. All rights reserved.
Aziz, Mina S R; Dessouki, Omar; Samiezadeh, Saeid; Bougherara, Habiba; Schemitsch, Emil H; Zdero, Radovan
2017-08-01
Acetabular fractures potentially account for up to half of all pelvic fractures, while pelvic fractures potentially account for over one-tenth of all human bone fractures. This is the first biomechanical study to assess acetabular fracture fixation using plates versus cables in the presence of a total hip arthroplasty, as done for the elderly. In Phase 1, finite element (FE) models compared a standard plate method versus 3 cable methods for repairing an acetabular fracture (type: anterior column plus posterior hemi-transverse) subjected to a physiological-type compressive load of 2207 N, representing 3 x body weight for a 75 kg person during walking. FE stress maps were compared to choose the most mechanically stable cable method, i.e. lowest peak bone stress. In Phase 2, mechanical tests were then done in artificial hemipelvises to compare the standard plate method versus the optimal cable method selected from Phase 1. FE analysis results showed peak bone stresses of 255 MPa (Plate method), 205 MPa (Mears cable method), 250 MPa (Kang cable method), and 181 MPa (Mouhsine cable method). Mechanical tests then showed that the Plate method versus the Mouhsine cable method selected from Phase 1 had higher stiffness (662 versus 385 N/mm, p=0.001), strength (3210 versus 2060 N, p=0.009), and failure energy (8.8 versus 6.2 J, p=0.002), whilst they were statistically equivalent for interfragmentary sliding (p≥0.179) and interfragmentary gapping (p≥0.08). The Plate method had superior mechanical properties, but the Mouhsine cable method may be a reasonable alternative if osteoporosis prevents good screw thread interdigitation during plating. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
Unice, Kenneth M; Kreider, Marisa L; Panko, Julie M
2012-11-08
Pyrolysis (pyr)-GC/MS analysis of characteristic thermal decomposition fragments has previously been used for qualitative fingerprinting of organic sources in environmental samples. A quantitative pyr-GC/MS method based on characteristic tire polymer pyrolysis products was developed for tread particle quantification in environmental matrices including soil, sediment, and air. The feasibility of quantitative pyr-GC/MS analysis of tread was confirmed in a method evaluation study using artificial soil spiked with known amounts of cryogenically generated tread. Tread concentration determined by blinded analyses was highly correlated (r2 ≥ 0.88) with the known tread spike concentration. Two critical refinements to the initial pyrolysis protocol were identified, including the use of an internal standard and quantification by the dimeric markers vinylcyclohexene and dipentene, which have good specificity for rubber polymer with no other appreciable environmental sources. A novel use of deuterated internal standards of similar polymeric structure was developed to correct the variable analyte recovery caused by sample size, matrix effects, and ion source variability. The resultant quantitative pyr-GC/MS protocol is reliable and transferable between laboratories.
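A small illustration of internal-standard quantitation of the general kind described, assuming a linear response; the function names, peak areas, and spike masses below are hypothetical and not taken from the paper:

```python
import numpy as np

def calibrate_rf(marker_areas, is_areas, spike_mass_ug):
    """Mean response factor RF = (A_marker / A_IS) / spiked mass,
    from calibration soils spiked with known tread masses."""
    ratios = np.asarray(marker_areas) / np.asarray(is_areas)
    return (ratios / np.asarray(spike_mass_ug)).mean()

def quantify(marker_area, is_area, rf, sample_mass_g):
    """Marker mass from the IS-normalized area ratio, per gram of matrix."""
    mass_ug = (marker_area / is_area) / rf
    return mass_ug / sample_mass_g        # ug marker per g soil

# hypothetical calibration spikes (areas and masses are illustrative)
rf = calibrate_rf([1.2e5, 2.3e5, 4.8e5], [9.0e4, 9.4e4, 9.1e4], [10, 20, 40])
print(quantify(3.1e5, 8.8e4, rf, sample_mass_g=0.5))
```

Normalizing every peak area to a deuterated internal standard of similar structure is what cancels the recovery and ion-source variability the abstract mentions, since both analyte and standard are affected nearly equally.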
Grootswagers, Tijl; Wardle, Susan G; Carlson, Thomas A
2017-04-01
Multivariate pattern analysis (MVPA) or brain decoding methods have become standard practice in analyzing fMRI data. Although decoding methods have been extensively applied in brain-computer interfaces, these methods have only recently been applied to time series neuroimaging data such as MEG and EEG to address experimental questions in cognitive neuroscience. In a tutorial style review, we describe a broad set of options to inform future time series decoding studies from a cognitive neuroscience perspective. Using example MEG data, we illustrate the effects that different options in the decoding analysis pipeline can have on experimental results where the aim is to "decode" different perceptual stimuli or cognitive states over time from dynamic brain activation patterns. We show that decisions made at both preprocessing (e.g., dimensionality reduction, subsampling, trial averaging) and decoding (e.g., classifier selection, cross-validation design) stages of the analysis can significantly affect the results. In addition to standard decoding, we describe extensions to MVPA for time-varying neuroimaging data including representational similarity analysis, temporal generalization, and the interpretation of classifier weight maps. Finally, we outline important caveats in the design and interpretation of time series decoding experiments.
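A minimal per-timepoint decoding pipeline in the spirit of this tutorial, assuming synthetic trials x channels x time data; the classifier choice (regularized logistic regression) and 5-fold stratified cross-validation are just one combination from the option space the review discusses:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 120, 30, 50
X = rng.normal(size=(n_trials, n_channels, n_times))
y = rng.integers(0, 2, n_trials)
X[y == 1, :, 25:] += 0.4      # inject a decodable signal after "stimulus onset"

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# decode the class label separately at every time point
acc = np.array([cross_val_score(clf, X[:, :, t], y, cv=cv).mean()
                for t in range(n_times)])
print(acc.round(2))           # accuracy should rise above chance after t = 25
```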
Irei, Satoshi
2016-01-01
Molecular marker analysis of environmental samples often requires time-consuming preseparation steps. Here, analysis of low-volatility nonpolar molecular markers (5-6 ring polycyclic aromatic hydrocarbons or PAHs, hopanoids, and n-alkanes) without the preseparation procedure is presented. Artificial sample extracts were analyzed directly by gas chromatography-mass spectrometry (GC-MS). After every sample injection, a standard mixture was also analyzed to correct for variation in instrumental sensitivity caused by the unfavorable matrix contained in the extract. The method was further validated for the PAHs using the NIST standard reference materials (SRMs) and then applied to airborne particulate matter samples. Tests with the SRMs showed that overall our methodology was validated within an uncertainty of ~30%. The measurement results for airborne particulate matter (PM) filter samples showed a strong correlation between the PAHs, implying contributions from the same emission source. Analysis of size-segregated PM filter samples showed that the markers were concentrated in PM smaller than 0.4 μm aerodynamic diameter. The observations were consistent with our expectation of their possible sources. Thus, the method was found to be useful for molecular marker studies. PMID:27127511
2011-01-01
Background Verbal autopsy methods are critically important for evaluating the leading causes of death in populations without adequate vital registration systems. With a myriad of analytical and data collection approaches, it is essential to create a high quality validation dataset from different populations to evaluate comparative method performance and make recommendations for future verbal autopsy implementation. This study was undertaken to compile a set of strictly defined gold standard deaths for which verbal autopsies were collected to validate the accuracy of different methods of verbal autopsy cause of death assignment. Methods Data collection was implemented in six sites in four countries: Andhra Pradesh, India; Bohol, Philippines; Dar es Salaam, Tanzania; Mexico City, Mexico; Pemba Island, Tanzania; and Uttar Pradesh, India. The Population Health Metrics Research Consortium (PHMRC) developed stringent diagnostic criteria including laboratory, pathology, and medical imaging findings to identify gold standard deaths in health facilities as well as an enhanced verbal autopsy instrument based on World Health Organization (WHO) standards. A cause list was constructed based on the WHO Global Burden of Disease estimates of the leading causes of death, potential to identify unique signs and symptoms, and the likely existence of sufficient medical technology to ascertain gold standard cases. Blinded verbal autopsies were collected on all gold standard deaths. Results Over 12,000 verbal autopsies on deaths with gold standard diagnoses were collected (7,836 adults, 2,075 children, 1,629 neonates, and 1,002 stillbirths). Difficulties in finding sufficient cases to meet gold standard criteria as well as problems with misclassification for certain causes meant that the target list of causes for analysis was reduced to 34 for adults, 21 for children, and 10 for neonates, excluding stillbirths. To ensure strict independence for the validation of methods and assessment of comparative performance, 500 test-train datasets were created from the universe of cases, covering a range of cause-specific compositions. Conclusions This unique, robust validation dataset will allow scholars to evaluate the performance of different verbal autopsy analytic methods as well as instrument design. This dataset can be used to inform the implementation of verbal autopsies to more reliably ascertain cause of death in national health information systems. PMID:21816095
NASA Astrophysics Data System (ADS)
Li, Lu; Xu, Chong-Yu; Engeland, Kolbjørn
2013-04-01
With respect to model calibration, parameter estimation, and analysis of uncertainty sources, various regression and probabilistic approaches are used in hydrological modeling. A family of Bayesian methods, which incorporates different sources of information into a single analysis through Bayes' theorem, is widely used for uncertainty assessment. However, none of these approaches treats the impact of high flows in hydrological modeling well. This study proposes a Bayesian modularization uncertainty assessment approach in which the highest streamflow observations are treated as suspect information that should not influence the inference of the main bulk of the model parameters. This study includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization method and standard Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions were used in combination with the standard Bayesian method: the AR(1) plus Normal model independent of time (Model 1), the AR(1) plus Normal model dependent on time (Model 2), and the AR(1) plus Multi-normal model (Model 3). The results reveal that the Bayesian modularization method provides the most accurate streamflow estimates, as measured by the Nash-Sutcliffe efficiency, and the best uncertainty estimates for low, medium and entire flows compared to standard Bayesian methods. The study thus provides a new approach for reducing the impact of high flows on the discharge uncertainty assessment of hydrological models via Bayesian methods.
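To make the MH machinery concrete, here is a toy random-walk Metropolis-Hastings sketch with an AR(1)-plus-Normal error likelihood (cf. Model 1), assuming a simple linear model in place of WASMOD; the priors, step sizes, and data are illustrative only and do not reproduce the study's setup:

```python
import numpy as np

def ar1_normal_loglik(res, rho, sigma):
    """Conditional log-likelihood of residuals under an AR(1)+Normal error model."""
    innov = res[1:] - rho * res[:-1]
    n = len(innov)
    return -0.5 * n * np.log(2 * np.pi * sigma**2) - 0.5 * np.sum(innov**2) / sigma**2

def metropolis(y, x, n_iter=5000, step=0.05, seed=0):
    """Random-walk MH over (slope, rho, log sigma) of y = slope*x + AR(1) error."""
    rng = np.random.default_rng(seed)
    theta = np.array([1.0, 0.2, 0.0])             # slope, rho, log-sigma
    def logpost(th):
        slope, rho, logsig = th
        if not -0.99 < rho < 0.99:                # flat prior on a bounded box
            return -np.inf
        return ar1_normal_loglik(y - slope * x, rho, np.exp(logsig))
    lp, chain = logpost(theta), []
    for _ in range(n_iter):
        prop = theta + rng.normal(0, step, 3)     # symmetric random-walk proposal
        lp_prop = logpost(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # accept/reject step
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    return np.array(chain)

x = np.linspace(0, 10, 200)
y = 2.0 * x + np.random.default_rng(1).normal(0, 1, 200)
chain = metropolis(y, x)
print(chain[2500:].mean(axis=0))   # posterior means after burn-in
```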
Tracking the hyoid bone in videofluoroscopic swallowing studies
NASA Astrophysics Data System (ADS)
Kellen, Patrick M.; Becker, Darci; Reinhardt, Joseph M.; van Daele, Douglas
2008-03-01
Difficulty swallowing, or dysphagia, has become a growing problem. Swallowing complications can lead to malnutrition, dehydration, respiratory infection, and even death. The current gold standard for analyzing and diagnosing dysphagia is the videofluoroscopic barium swallow study. In these studies, a fluoroscope is used to image the patient ingesting barium solutions of different volumes and viscosities. The hyoid bone anchors many key muscles involved in swallowing and plays a key role in the process. Abnormal hyoid bone motion during a swallow can indicate swallowing dysfunction. Currently in clinical settings, hyoid bone motion is assessed qualitatively, which can be subject to intra-rater and inter-rater bias. This paper presents a semi-automatic method for tracking the hyoid bone that makes quantitative analysis feasible. The user defines a template of the hyoid on one frame, and this template is tracked across subsequent frames. The matching phase is optimized by predicting the position of the template based on kinematics. An expert speech pathologist marked the position of the hyoid on each frame of ten studies to serve as the gold standard. Results from performing Bland-Altman analysis at a 95% confidence interval showed a bias of 0.0+/-0.08 pixels in x and -0.08+/-0.09 pixels in y between the manually-defined gold standard and the proposed method. The average Pearson's correlation between the gold standard and the proposed method was 0.987 in x and 0.980 in y. This paper also presents a method for automatically establishing a patient-centric coordinate system for the interpretation of hyoid motion. This coordinate system corrects for upper body patient motion during the study and identifies superior-inferior and anterior-posterior motion components. These tools make the use of quantitative hyoid motion analysis feasible in clinical and research settings.
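A rough sketch of the core tracking idea: normalized cross-correlation template matching restricted to a search window around a kinematically predicted position. The function names, the constant-velocity predictor, and the synthetic frame are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between two equally sized arrays."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p**2).sum() * (t**2).sum()) + 1e-12
    return float((p * t).sum() / denom)

def track(frame, template, predicted_xy, search=15):
    """Best template match within a window around the predicted position."""
    h, w = template.shape
    px, py = predicted_xy
    best, best_xy = -2.0, predicted_xy
    for y in range(max(0, py - search), min(frame.shape[0] - h, py + search)):
        for x in range(max(0, px - search), min(frame.shape[1] - w, px + search)):
            score = ncc(frame[y:y+h, x:x+w], template)
            if score > best:
                best, best_xy = score, (x, y)
    return best_xy, best

def predict(prev_xy, prev2_xy):
    """Constant-velocity prediction from the two previous frame positions."""
    return (2*prev_xy[0] - prev2_xy[0], 2*prev_xy[1] - prev2_xy[1])

rng = np.random.default_rng(0)
frame = rng.normal(size=(100, 100))
tmpl = frame[40:52, 60:72].copy()                 # template cut at (x=60, y=40)
print(track(frame, tmpl, predicted_xy=(58, 38)))  # recovers (60, 40)
```

Restricting the search to the predicted neighborhood is what keeps the matching fast and robust against similar-looking structures elsewhere in the fluoroscopic frame.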
Francy, D.S.; Hart, T.L.; Virosteck, C.M.
1996-01-01
Bacterial injury, survival, and regrowth were investigated by use of replicate flow-through incubation chambers placed in the Cuyahoga River or Lake Erie in the greater Cleveland metropolitan area during seven 4-day field studies. The chambers contained wastewater or combined-sewer-overflow (CSO) effluents treated three ways: unchlorinated, chlorinated, and dechlorinated. At set time intervals, the chamber contents were analyzed for concentrations of injured and healthy fecal coliforms by use of standard selective and enhanced-recovery membrane-filtration methods. Mean percent injuries and survivals were calculated from the fecal-coliform concentration data for each field study. The results of analysis of variance (ANOVA) indicated that treatment affected mean percent injury and survival, whereas site did not. In the warm-weather Lake Erie field study, but not in the warm-weather Cuyahoga River studies, the results of ANOVA indicated that dechlorination enhanced the repair of injuries and regrowth of chlorine-injured fecal coliforms on culture media over chlorination alone. The results of ANOVA on the percent injury from CSO effluent field studies indicated that dechlorination reduced the ability of organisms to recover and regrow on culture media over chlorination alone. However, because of atypical patterns of concentration increases and decreases in some CSO effluent samples, more work needs to be done before the effect of dechlorination and chlorination on reducing fecal-coliform concentrations in CSO effluents can be confirmed. The results of ANOVA on percent survivals found statistically significant differences among the three treatment methods for all but one study. Dechlorination was found to be less effective than chlorination alone in reducing the survival of fecal coliforms in wastewater effluent, but not in CSO effluent. If the concentration of fecal coliforms determined by use of the enhanced-recovery method can be predicted accurately from the concentration found by use of the standard method, then increased monitoring and expense to detect chlorine-injured organisms would be unnecessary. The results of linear regression analysis, however, indicated that the relation between enhanced-recovery and standard-method concentrations was best represented when the data were grouped by treatment. The model generated from linear regression of the unchlorinated data set provided an accurate estimate of enhanced-recovery concentrations from standard-method concentrations, whereas the models generated from the chlorinated and dechlorinated data sets did not. In addition, evaluation of fecal-coliform concentrations found in field studies in terms of Ohio recreational water-quality standards showed that concentrations obtained by standard and enhanced-recovery methods were not comparable. Sample treatment and analysis methods were found to affect the percentage of samples meeting and exceeding Ohio's bathing-water, primary-contact, and secondary-contact standards. Therefore, determining the health risk of swimming in receiving waters was often difficult without information on enhanced-recovery method concentrations, and was especially difficult in waters receiving high proportions of chlorinated or dechlorinated effluents.
Probabilistic analysis of the eight-hour-averaged CO impacts of highways.
DOT National Transportation Integrated Search
1980-01-01
This report describes a method for estimating the probability that a highway facility will violate the eight hour National Ambient Air Quality Standard (NAAQS) for carbon monoxide (CO). The method is predicated on the assumption that overlapping eigh...
Paques, Joseph-Jean; Gauthier, François; Perez, Alejandro
2007-01-01
To assess and plan future risk-analysis research projects, 275 documents describing methods and tools for assessing the risks associated with industrial machines, or with other sectors such as the military, nuclear, and aeronautics industries, were collected. These documents were in the form of published books or papers, standards, technical guides, and company procedures collected throughout industry. From the collected documents, 112 were selected for analysis; 108 methods applied, or potentially applicable, for assessing the risks associated with industrial machines were analyzed and classified. This paper presents the main quantitative results of the analysis of these methods and tools.
Gmyr, Valery; Bonner, Caroline; Lukowiak, Bruno; Pawlowski, Valerie; Dellaleau, Nathalie; Belaich, Sandrine; Aluka, Isanga; Moermann, Ericka; Thevenet, Julien; Ezzouaoui, Rimed; Queniat, Gurvan; Pattou, Francois; Kerr-Conte, Julie
2015-01-01
Reliable assessment of islet viability, mass, and purity must be met prior to transplanting an islet preparation into patients with type 1 diabetes. The standard method for quantifying human islet preparations is by direct microscopic analysis of dithizone-stained islet samples, but this technique may be susceptible to inter-/intraobserver variability, which may induce false positive/negative islet counts. Here we describe a simple, reliable, automated digital image analysis (ADIA) technique for accurately quantifying islets into total islet number, islet equivalent number (IEQ), and islet purity before islet transplantation. Islets were isolated and purified from n = 42 human pancreata according to the automated method of Ricordi et al. For each preparation, three islet samples were stained with dithizone and expressed as IEQ number. Islets were analyzed manually by microscopy or automatically quantified using Nikon's inverted Eclipse Ti microscope with built-in NIS-Elements Advanced Research (AR) software. The ADIA method significantly enhanced the number of islet preparations eligible for engraftment compared to the standard manual method (p < 0.001). Comparisons of individual methods showed good correlations between mean values of IEQ number (r(2) = 0.91) and total islet number (r(2) = 0.88); the correlation increased to r(2) = 0.93 when islet surface area was estimated comparatively with IEQ number. The ADIA method showed very high intraobserver reproducibility compared to the standard manual method (p < 0.001). However, islet purity was routinely estimated as significantly higher with the manual method versus the ADIA method (p < 0.001). The ADIA method also detected small islets between 10 and 50 µm in size. Automated digital image analysis utilizing the Nikon Instruments software is an unbiased, simple, and reliable teaching tool to comprehensively assess the individual size of each islet cell preparation prior to transplantation. Implementation of this technology to improve engraftment may help to advance the therapeutic efficacy and accessibility of islet transplantation across centers.
This compendium includes method summaries provided by the Centers for Disease Control and Prevention/National Center for Environmental Health (CDC/NCEH) for the collection and shipping of blood and urine samples for analysis of metals and volatile organic compounds (VOCs). The pr...
ERIC Educational Resources Information Center
Eckes, Thomas
2017-01-01
This paper presents an approach to standard setting that combines the prototype group method (PGM; Eckes, 2012) with a receiver operating characteristic (ROC) analysis. The combined PGM-ROC approach is applied to setting cut scores on a placement test of English as a foreign language (EFL). To implement the PGM, experts first named learners whom…
Transient loads analysis for space flight applications
NASA Technical Reports Server (NTRS)
Thampi, S. K.; Vidyasagar, N. S.; Ganesan, N.
1992-01-01
A significant part of the flight readiness verification process involves transient analysis of the coupled Shuttle-payload system to determine the low frequency transient loads. This paper describes a methodology for transient loads analysis and its implementation for the Spacelab Life Sciences Mission. The analysis is carried out using two major software tools - NASTRAN and an external FORTRAN code called EZTRAN. This approach is adopted to overcome some of the limitations of NASTRAN's standard transient analysis capabilities. The method uses Data Recovery Matrices (DRM) to improve computational efficiency. The mode acceleration method is fully implemented in the DRM formulation to recover accurate displacements, stresses, and forces. The advantages of the method are demonstrated through a numerical example.
Meta-analysis of two studies in the presence of heterogeneity with applications in rare diseases.
Friede, Tim; Röver, Christian; Wandel, Simon; Neuenschwander, Beat
2017-07-01
Random-effects meta-analyses are used to combine evidence of treatment effects from multiple studies. Since treatment effects may vary across trials due to differences in study characteristics, heterogeneity in treatment effects between studies must be accounted for to achieve valid inference. The standard model for random-effects meta-analysis assumes approximately normal effect estimates and a normal random-effects model. However, standard methods based on this model ignore the uncertainty in estimating the between-trial heterogeneity. In the special setting of only two studies and in the presence of heterogeneity, we investigate here alternatives such as the Hartung-Knapp-Sidik-Jonkman method (HKSJ), the modified Knapp-Hartung method (mKH, a variation of the HKSJ method) and Bayesian random-effects meta-analyses with priors covering plausible heterogeneity values; R code to reproduce the examples is presented in an appendix. The properties of these methods are assessed by applying them to five examples from various rare diseases and by a simulation study. Whereas the standard method based on normal quantiles has poor coverage, the HKSJ and mKH generally lead to very long, and therefore inconclusive, confidence intervals. The Bayesian intervals on the whole show satisfying properties and offer a reasonable compromise between these two extremes. © 2016 The Authors. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
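For concreteness, a sketch of a DerSimonian-Laird random-effects fit with the HKSJ variance correction for k = 2 studies; the effect estimates below are hypothetical, and the Bayesian alternative discussed in the abstract is not shown:

```python
import numpy as np
from scipy import stats

def random_effects_hksj(y, se):
    """DerSimonian-Laird tau^2 estimate with the Hartung-Knapp-Sidik-Jonkman
    variance correction; returns the pooled effect and a 95% CI (t quantile)."""
    y, v = np.asarray(y, float), np.asarray(se, float)**2
    w = 1 / v
    mu_fe = np.sum(w * y) / np.sum(w)                 # fixed-effect mean
    Q = np.sum(w * (y - mu_fe)**2)                    # heterogeneity statistic
    k = len(y)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)                # DL between-study variance
    w_re = 1 / (v + tau2)
    mu = np.sum(w_re * y) / np.sum(w_re)
    # HKSJ variance of the pooled effect, with t_{k-1} quantile
    var_hksj = np.sum(w_re * (y - mu)**2) / ((k - 1) * np.sum(w_re))
    half = stats.t.ppf(0.975, k - 1) * np.sqrt(var_hksj)
    return mu, (mu - half, mu + half)

# two hypothetical studies (log hazard ratios and their standard errors)
print(random_effects_hksj([-0.35, -0.10], [0.12, 0.18]))
```

With k = 2 the t quantile has a single degree of freedom (about 12.7), which is exactly why the abstract reports very long HKSJ intervals in this setting.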
40 CFR 92.5 - Reference materials.
Code of Federal Regulations, 2011 CFR
2011-07-01
... § 92.113 ASTM D 1945-91, Standard Test Method for Analysis of Natural Gas by Gas Chromatography § 92... Supercritical Fluid Chromatography § 92.113 ASTM E 29-93a, Standard Practice for Using Significant Digits in....119 SAE Recommended Practice J244, Measurement of Intake Air or Exhaust Gas Flow of Diesel Engines...
A Tool for Estimating Variability in Wood Preservative Treatment Retention
Patricia K. Lebow; Adam M. Taylor; Timothy M. Young
2015-01-01
Composite sampling is standard practice for evaluation of preservative retention levels in preservative-treated wood. Current protocols provide an average retention value but no estimate of uncertainty. Here we describe a statistical method for calculating uncertainty estimates using the standard sampling regime with minimal additional chemical analysis. This tool can...
This standard operating procedure (SOP) describes a new, rapid, and relatively inexpensive way to remove a precise area of paint from the substrate of building structures in preparation for quantitative analysis. This method has been applied successfully in the laboratory, as we...
ERIC Educational Resources Information Center
Porter, Susan G.; Koch, Steven P.; Henderson, Andrew
2010-01-01
Background: There is a lack of consistent, comprehensible data collection and analysis methods for evaluating teacher preparation program's coverage of required standards for accreditation. Of particular concern is the adequate coverage of standards and competencies that address the teaching of English learners and teachers of students from…
A Standardized Mean Difference Effect Size for Single Case Designs
ERIC Educational Resources Information Center
Hedges, Larry V.; Pustejovsky, James E.; Shadish, William R.
2012-01-01
Single case designs are a set of research methods for evaluating treatment effects by assigning different treatments to the same individual and measuring outcomes over time and are used across fields such as behavior analysis, clinical psychology, special education, and medicine. Emerging standards for single case designs have focused attention on…
40 CFR 63.705 - Performance test methods and procedures to determine initial compliance.
Code of Federal Regulations, 2014 CFR
2014-07-01
... per gram-mole. Pi = Barometric pressure at the time of sample analysis, millimeters mercury absolute. 760 = Reference or standard pressure, millimeters mercury absolute. 293 = Reference or standard...: ER15DE94.005 (i) The value of RSi is zero unless the owner or operator submits the following information to...
40 CFR 63.705 - Performance test methods and procedures to determine initial compliance.
Code of Federal Regulations, 2012 CFR
2012-07-01
... per gram-mole. Pi = Barometric pressure at the time of sample analysis, millimeters mercury absolute. 760 = Reference or standard pressure, millimeters mercury absolute. 293 = Reference or standard...: ER15DE94.005 (i) The value of RSi is zero unless the owner or operator submits the following information to...
40 CFR 63.705 - Performance test methods and procedures to determine initial compliance.
Code of Federal Regulations, 2010 CFR
2010-07-01
... per gram-mole. Pi = Barometric pressure at the time of sample analysis, millimeters mercury absolute. 760 = Reference or standard pressure, millimeters mercury absolute. 293 = Reference or standard...: ER15DE94.005 (i) The value of RSi is zero unless the owner or operator submits the following information to...
40 CFR 63.705 - Performance test methods and procedures to determine initial compliance.
Code of Federal Regulations, 2013 CFR
2013-07-01
... per gram-mole. Pi = Barometric pressure at the time of sample analysis, millimeters mercury absolute. 760 = Reference or standard pressure, millimeters mercury absolute. 293 = Reference or standard...: ER15DE94.005 (i) The value of RSi is zero unless the owner or operator submits the following information to...
40 CFR 63.705 - Performance test methods and procedures to determine initial compliance.
Code of Federal Regulations, 2011 CFR
2011-07-01
... per gram-mole. Pi = Barometric pressure at the time of sample analysis, millimeters mercury absolute. 760 = Reference or standard pressure, millimeters mercury absolute. 293 = Reference or standard...: ER15DE94.005 (i) The value of RSi is zero unless the owner or operator submits the following information to...
Feminist Policy Analysis: Expanding Traditional Social Work Methods
ERIC Educational Resources Information Center
Kanenberg, Heather
2013-01-01
In an effort to move the methodology of policy analysis beyond the traditional and artificial position of being objective and value-free, this article is a call to those working and teaching in social work to consider a feminist policy analysis lens. A review of standard policy analysis models is presented alongside feminist models. Such a…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; David I. Gertman; Jeffrey C. Joe
2005-09-01
An ongoing issue within human-computer interaction (HCI) is the need for simplified or “discount” methods. The current economic slowdown has necessitated innovative methods that are results driven and cost effective. The myriad methods of design and usability are currently being cost-justified, and new techniques are actively being explored that meet current budgets and needs. Recent efforts in human reliability analysis (HRA) are highlighted by the ten-year development of the Standardized Plant Analysis Risk HRA (SPAR-H) method. The SPAR-H method has been used primarily for determining human-centered risk at nuclear power plants. The SPAR-H method, however, shares task analysis underpinnings with HCI. Despite this methodological overlap, there is currently no HRA approach deployed in heuristic usability evaluation. This paper presents an extension of the existing SPAR-H method to be used as part of heuristic usability evaluation in HCI.
NASA Technical Reports Server (NTRS)
Parrish, R. S.; Carter, M. C.
1974-01-01
This analysis utilizes computer simulation and statistical estimation. Realizations of stationary Gaussian stochastic processes with selected autocorrelation functions were computer simulated. Analysis of the simulated data revealed that the mean and the variance of a process were functionally dependent upon the autocorrelation parameter and crossing level. Using predicted values for the mean and standard deviation, the distribution parameters were estimated by the method of moments. Thus, given the autocorrelation parameter, crossing level, mean, and standard deviation of a process, the probability of exceeding the crossing level for a particular length of time was calculated.
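A small Monte Carlo analogue of the simulation described, assuming a unit-variance stationary AR(1) process stands in for the autocorrelated Gaussian process; it estimates the chance that an exceedance episode persists for at least a given duration (the parameters are illustrative, not from the report):

```python
import numpy as np

def exceedance_prob(rho, level, duration, n=50_000, seed=0):
    """Fraction of exceedance episodes of a stationary AR(1) Gaussian
    process that stay above `level` for at least `duration` samples."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    e = rng.normal(0, np.sqrt(1 - rho**2), n)   # keeps unit marginal variance
    for i in range(1, n):
        x[i] = rho * x[i-1] + e[i]
    above = x > level
    run, hits, total_runs = 0, 0, 0
    for a in above:                             # count runs of exceedances
        if a:
            run += 1
        else:
            if run > 0:
                total_runs += 1
                hits += run >= duration
            run = 0
    return hits / max(total_runs, 1)

print(exceedance_prob(rho=0.9, level=1.0, duration=10))
```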
Appelbaum, Mark; Cooper, Harris; Kline, Rex B; Mayo-Wilson, Evan; Nezu, Arthur M; Rao, Stephen M
2018-01-01
Following a review of extant reporting standards for scientific publication, and reviewing 10 years of experience since publication of the first set of reporting standards by the American Psychological Association (APA; APA Publications and Communications Board Working Group on Journal Article Reporting Standards, 2008), the APA Working Group on Quantitative Research Reporting Standards recommended some modifications to the original standards. Examples of modifications include division of hypotheses, analyses, and conclusions into 3 groupings (primary, secondary, and exploratory) and some changes to the section on meta-analysis. Several new modules are included that report standards for observational studies, clinical trials, longitudinal studies, replication studies, and N-of-1 studies. In addition, standards for analytic methods with unique characteristics and output (structural equation modeling and Bayesian analysis) are included. These proposals were accepted by the Publications and Communications Board of APA and supersede the standards included in the 6th edition of the Publication Manual of the American Psychological Association (APA, 2010). (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Hughes, Sarah A; Huang, Rongfu; Mahaffey, Ashley; Chelme-Ayala, Pamela; Klamerth, Nikolaus; Meshref, Mohamed N A; Ibrahim, Mohamed D; Brown, Christine; Peru, Kerry M; Headley, John V; Gamal El-Din, Mohamed
2017-11-01
There are several established methods for the determination of naphthenic acids (NAs) in waters associated with oil sands mining operations. Due to their highly complex nature, the measured concentration and composition of NAs vary depending on the method used. This study compared different common sample preparation techniques, analytical instrument methods, and analytical standards to measure NAs in groundwater and process water samples collected from an active oil sands operation. In general, the high- and ultrahigh-resolution methods, namely ultrahigh performance liquid chromatography time-of-flight mass spectrometry (UPLC-TOF-MS) and Orbitrap mass spectrometry (Orbitrap-MS), were within an order of magnitude of the Fourier transform infrared spectroscopy (FTIR) methods. The gas chromatography mass spectrometry (GC-MS) methods consistently had the highest NA concentrations and greatest standard error. Total NA concentration was not statistically different between solid phase extraction and liquid-liquid extraction sample preparations. Calibration standards influenced quantitation results. This work provided a comprehensive understanding of the inherent differences in the various techniques available to measure NAs and hence the potential differences in measured amounts of NAs in samples. Results from this study will contribute to the analytical method standardization for NA analysis in oil sands-related water samples. Copyright © 2017 Elsevier Ltd. All rights reserved.
Cenciani de Souza, Camila Prado; Aparecida de Abreu, Cleide; Coscione, Aline Renée; Alberto de Andrade, Cristiano; Teixeira, Luiz Antonio Junqueira; Consolini, Flavia
2018-01-01
Rapid, accurate, and low-cost alternative analytical methods for micronutrient quantification in fertilizers are fundamental in QC. The purpose of this study was to evaluate whether zinc (Zn) and copper (Cu) content in mineral fertilizers and industrial by-products determined by the alternative methods USEPA 3051a, 10% HCl, and 10% H2SO4 are statistically equivalent to the standard method, consisting of hot-plate digestion using concentrated HCl. The Zn and Cu sources commercially marketed in Brazil consisted of oxide, carbonate, and sulfate fertilizers and of by-products such as galvanizing ash, galvanizing sludge, brass ash, and brass or scrap slag. The contents of the sources ranged from 15 to 82% for Zn and from 10 to 45% for Cu; these ranges refer to the variation of the elements found in the different sources evaluated with the concentrated HCl method, as shown in Table 1. A protocol based on the following criteria was used for the statistical assessment of the methods: the F-test modified by Graybill, the t-test for the mean error, and linear correlation coefficient analysis. In terms of equivalence, 10% HCl extraction was equivalent to the standard method for Zn, and the results of the USEPA 3051a and 10% HCl methods indicated that these methods were equivalent for Cu. Therefore, these methods can be considered viable alternatives to the standard method for determination of Cu and Zn in mineral fertilizers and industrial by-products, pending future research for their complete validation.
Guo, How-Ran
2011-10-20
Despite its limitations, ecological study design is widely applied in epidemiology. In most cases, adjustment for age is necessary, but different methods may lead to different conclusions. To compare three methods of age adjustment, a study on the associations between arsenic in drinking water and the incidence of bladder cancer in 243 townships in Taiwan was used as an example. A total of 3068 cases of bladder cancer, including 2276 men and 792 women, were identified during a ten-year study period in the study townships. Three methods were applied to analyze the same data set for the ten-year study period. The first (Direct Method) applied direct standardization to obtain the standardized incidence rate and then used it as the dependent variable in the regression analysis. The second (Indirect Method) applied indirect standardization to obtain the standardized incidence ratio and used it as the dependent variable instead. The third (Variable Method) used the proportions of residents in different age groups as part of the independent variables in the multiple regression models. All three methods showed a statistically significant positive association between arsenic exposure above 0.64 mg/L and incidence of bladder cancer in men and women, but different results were observed for the other exposure categories. In addition, the risk estimates obtained by different methods for the same exposure category all differed. Using an empirical example, the current study confirmed the argument made previously by other researchers that, whereas the three methods of age adjustment may lead to different conclusions, only the third approach can obtain unbiased estimates of the risks. The third method can also generate estimates of the risk associated with each age group, whereas the other two cannot evaluate the effects of age directly.
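A compact illustration of the first two age-adjustment approaches, assuming hypothetical age-specific case counts, person-years, standard-population weights, and reference rates for a single township:

```python
import numpy as np

# hypothetical age-specific data for one township (three age groups)
cases    = np.array([2, 5, 12])                  # observed cases by age group
persons  = np.array([40_000, 25_000, 10_000])    # person-years by age group
std_pop  = np.array([0.40, 0.35, 0.25])          # standard population weights
ref_rate = np.array([4e-5, 2e-4, 1e-3])          # reference rates by age group

# Direct Method: weight local age-specific rates by the standard population
direct_rate = np.sum((cases / persons) * std_pop)

# Indirect Method: observed cases over cases expected at reference rates (SIR)
expected = np.sum(ref_rate * persons)
sir = cases.sum() / expected

print(direct_rate, sir)
```

The Variable Method has no closed-form summary per township; it simply enters the age-group proportions as covariates alongside exposure in the multiple regression, which is why it can estimate age effects directly.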
Targeted methods for quantitative analysis of protein glycosylation
Goldman, Radoslav; Sanda, Miloslav
2018-01-01
Quantification of proteins by LC-MS/MS-MRM has become a standard method with broad projected clinical applicability. MRM quantification of protein modifications is, however, far less utilized, especially in the case of glycoproteins. This review summarizes current methods for quantitative analysis of protein glycosylation with a focus on MRM methods. We describe advantages of this quantitative approach, analytical parameters that need to be optimized to achieve reliable measurements, and point out the limitations. Differences between major classes of N- and O-glycopeptides are described and class-specific glycopeptide assays are demonstrated. PMID:25522218
Analysis of drugs in human tissues by supercritical fluid extraction/immunoassay
NASA Astrophysics Data System (ADS)
Furton, Kenneth G.; Sabucedo, Alberta; Rein, Joseph; Hearn, W. L.
1997-02-01
A rapid, readily automated method has been developed for the quantitative analysis of phenobarbital from human liver tissues based on supercritical carbon dioxide extraction followed by fluorescence enzyme immunoassay. The method developed significantly reduces sample handling and utilizes the entire liver homogenate. The current method yields comparable recoveries and precision and does not require the use of an internal standard, although traditional GC/MS confirmation can still be performed on sample extracts. Additionally, the proposed method uses non-toxic, inexpensive carbon dioxide, thus eliminating the use of halogenated organic solvents.
2017-01-01
Chemical standardization, along with morphological and DNA analysis, ensures the authenticity and advances the integrity evaluation of botanical preparations. Achievement of a more comprehensive, metabolomic standardization requires simultaneous quantitation of multiple marker compounds. Employing quantitative 1H NMR (qHNMR), this study determined the total isoflavone content (TIfCo; 34.5–36.5% w/w) via multimarker standardization and assessed the stability of a 10-year-old isoflavone-enriched red clover extract (RCE). Eleven markers (nine isoflavones, two flavonols) were targeted simultaneously, and outcomes were compared with LC-based standardization. Two advanced quantitative measures in qHNMR were applied to derive quantities from complex and/or overlapping resonances: a quantum mechanical (QM) method (QM-qHNMR) that employs 1H iterative full spin analysis, and a non-QM method that uses linear peak fitting algorithms (PF-qHNMR). A 10 min UHPLC-UV method provided auxiliary orthogonal quantitation. This is the first systematic evaluation of QM and non-QM deconvolution as qHNMR quantitation measures. It demonstrates that QM-qHNMR can account successfully for the complexity of 1H NMR spectra of individual analytes, and shows how QM-qHNMR can be built for mixtures such as botanical extracts. The contents of the main bioactive markers were in good agreement with earlier HPLC-UV results, demonstrating the chemical stability of the RCE. QM-qHNMR advances chemical standardization by its inherent QM accuracy and the use of universal calibrants, avoiding the impractical need for identical reference materials. PMID:28067513
Meta‐analysis of test accuracy studies using imputation for partial reporting of multiple thresholds
Deeks, J.J.; Martin, E.C.; Riley, R.D.
2017-01-01
Introduction For tests reporting continuous results, primary studies usually provide test performance at multiple but often different thresholds. This creates missing data when performing a meta‐analysis at each threshold. A standard meta‐analysis (no imputation [NI]) ignores such missing data. A single imputation (SI) approach was recently proposed to recover missing threshold results. Here, we propose a new method that performs multiple imputation of the missing threshold results using discrete combinations (MIDC). Methods The new MIDC method imputes missing threshold results by randomly selecting from the set of all possible discrete combinations which lie between the results for 2 known bounding thresholds. Imputed and observed results are then synthesised at each threshold. This is repeated multiple times, and the multiple pooled results at each threshold are combined using Rubin's rules to give final estimates. We compared the NI, SI, and MIDC approaches via simulation. Results Both imputation methods outperform the NI method in simulations. There was generally little difference in the SI and MIDC methods, but the latter was noticeably better in terms of estimating the between‐study variances and generally gave better coverage, due to slightly larger standard errors of pooled estimates. Given selective reporting of thresholds, the imputation methods also reduced bias in the summary receiver operating characteristic curve. Simulations demonstrate the imputation methods rely on an equal threshold spacing assumption. A real example is presented. Conclusions The SI and, in particular, MIDC methods can be used to examine the impact of missing threshold results in meta‐analysis of test accuracy studies. PMID:29052347
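The final pooling step of the MIDC approach relies on Rubin's rules; a minimal sketch of that step, assuming m imputed estimates and their within-imputation variances at one threshold (the numbers are hypothetical):

```python
import numpy as np

def rubin_pool(estimates, variances):
    """Combine m imputed-data estimates with Rubin's rules: total variance
    T = W + (1 + 1/m) * B, where W is the within- and B the
    between-imputation variance."""
    q = np.asarray(estimates, float)
    u = np.asarray(variances, float)
    m = len(q)
    qbar = q.mean()
    W = u.mean()              # within-imputation variance
    B = q.var(ddof=1)         # between-imputation variance
    T = W + (1 + 1/m) * B
    return qbar, np.sqrt(T)   # pooled estimate and its standard error

# hypothetical pooled sensitivities at one threshold from 5 imputations
est, se = rubin_pool([0.82, 0.79, 0.84, 0.80, 0.83],
                     [0.0021, 0.0024, 0.0019, 0.0023, 0.0020])
print(est, se)
```

The (1 + 1/m) inflation of the between-imputation variance is what produces the slightly larger standard errors, and hence the better coverage, noted in the results.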
Percy, Andrew J; Yang, Juncong; Chambers, Andrew G; Mohammed, Yassene; Miliotis, Tasso; Borchers, Christoph H
2016-01-01
Quantitative mass spectrometry (MS)-based approaches are emerging as a core technology for addressing health-related queries in systems biology and in the biomedical and clinical fields. In several 'omics disciplines (proteomics included), an approach centered on selected or multiple reaction monitoring (SRM or MRM)-MS with stable isotope-labeled standards (SIS), at the protein or peptide level, has emerged as the most precise technique for quantifying and screening putative analytes in biological samples. To enable the widespread use of MRM-based protein quantitation for disease biomarker assessment studies and its ultimate acceptance for clinical analysis, the technique must be standardized to facilitate precise and accurate protein quantitation. To that end, we have developed a number of kits for assessing method/platform performance, as well as for screening proposed candidate protein biomarkers in various human biofluids. Collectively, these kits utilize a bottom-up LC-MS methodology with SIS peptides as internal standards and quantify proteins using regression analysis of standard curves. This chapter details the methodology used to quantify 192 plasma proteins of high-to-moderate abundance (covering a six-order-of-magnitude range from 31 mg/mL for albumin to 18 ng/mL for peroxiredoxin-2), and a 21-protein subset thereof. We also describe the application of this method to patient samples for biomarker discovery and verification studies. Additionally, we introduce our recently developed Qualis-SIS software, which is used to expedite the analysis and assessment of protein quantitation data in control and patient samples.
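A bare-bones sketch of quantitation by regression of a standard curve with SIS-normalized responses, as the kits employ; the concentrations and area ratios below are synthetic, and this is not the Qualis-SIS implementation:

```python
import numpy as np

# hypothetical calibration: SIS-normalized response vs. known concentration
conc  = np.array([1, 5, 10, 50, 100.0])             # ng/mL standards
ratio = np.array([0.021, 0.10, 0.198, 1.02, 1.99])  # analyte/SIS peak-area ratio

slope, intercept = np.polyfit(conc, ratio, 1)       # linear standard curve

def quantify(sample_ratio):
    """Inverse prediction of concentration from an observed area ratio."""
    return (sample_ratio - intercept) / slope

print(quantify(0.55))   # ~27 ng/mL for these synthetic data
```

Because the analyte response is expressed relative to a co-eluting SIS peptide, run-to-run losses in sample preparation and ionization largely cancel before the regression is ever applied.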
Xiao, Yongling; Abrahamowicz, Michal
2010-03-30
We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data, with latent cluster-level random effects, which are ignored in the conventional Cox's model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs and type I error rates, and acceptable coverage rates, regardless of the true random effects distribution, and avoid the serious variance under-estimation of conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of clustered event times.
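A generic cluster-bootstrap sketch of the first method's resampling logic, resampling whole clusters with replacement; to keep the example self-contained, an OLS slope stands in for the Cox coefficient, and all data and names are illustrative:

```python
import numpy as np

def cluster_bootstrap_se(clusters, fit_fn, n_boot=1000, seed=0):
    """SE of an estimator by resampling whole clusters with replacement.
    clusters: list of per-cluster data arrays; fit_fn: stacked data -> scalar."""
    rng = np.random.default_rng(seed)
    k = len(clusters)
    boot_stats = []
    for _ in range(n_boot):
        idx = rng.integers(0, k, k)            # draw k clusters with replacement
        sample = np.concatenate([clusters[i] for i in idx])
        boot_stats.append(fit_fn(sample))
    return np.std(boot_stats, ddof=1)

# toy clustered data: slope of y on x with cluster-level random intercepts
rng = np.random.default_rng(1)
clusters = []
for _ in range(20):
    u = rng.normal(0, 1)                       # latent cluster random effect
    x = rng.normal(size=30)
    y = 0.5 * x + u + rng.normal(size=30)
    clusters.append(np.column_stack([x, y]))

slope = lambda d: np.polyfit(d[:, 0], d[:, 1], 1)[0]
print(cluster_bootstrap_se(clusters, slope))
```

Resampling at the cluster level preserves the within-cluster correlation inside each bootstrap replicate, which is exactly what the conventional independent-observations variance formula throws away.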
Guo, Henan; Yang, Xuedong; Liu, Jun; Zheng, Wenfeng
2012-07-01
Flavonoid reference standards were targeted-prepared from Scutellariae Radix under the guidance of high performance liquid chromatography-mass spectrometry (HPLC-MS) analysis. With HPLC-MS analysis of Scutellariae Radix, 19 flavonoid components were identified by analyzing and comparing their retention times, ultraviolet spectra, and mass spectrometry data with the literature. The separation and purification protocols of all targeted flavonoid reference standards were optimally designed according to the results of HPLC-MS analysis and related literature. The ethanol extract of Scutellariae Radix was suspended in water and extracted with petroleum ether, ethyl acetate, and n-butanol successively. The ethyl acetate extract and n-butanol extract were separately subjected to primary separation by low pressure reversed-phase preparative chromatography. The fractions containing targeted compounds were then further purified by low pressure reversed- and normal-phase preparative chromatography. Finally, baicalin and wogonoside reference standards were obtained from the n-butanol extract; baicalein, wogonin, and oroxylin A reference standards were obtained from the ethyl acetate extract. The structures of the 5 reference standards were identified by mass spectrometry (MS) and 1H nuclear magnetic resonance (1H NMR) spectroscopy. The HPLC analytical results showed that the purities of the 5 reference standards were all above 98%. It is demonstrated that the rapid targeted-preparation method under the guidance of HPLC-MS analysis is applicable for the isolation and preparation of chemical components in traditional Chinese medicines.
Waveform shape analysis: extraction of physiologically relevant information from Doppler recordings.
Ramsay, M M; Broughton Pipkin, F; Rubin, P C; Skidmore, R
1994-05-01
1. Doppler recordings were made from the brachial artery of healthy female subjects during a series of manoeuvres which altered the pressure-flow characteristics of the vessel. 2. Changes were induced in the peripheral circulation of the forearm by the application of heat or ice-packs. A sphygmomanometer cuff was used to create graded occlusion of the vessel above and below the point of measurement. Recordings were also made whilst the subjects performed a standardized Valsalva manoeuvre. 3. The Doppler recordings were analysed both with the standard waveform indices (systolic/diastolic ratio, pulsatility index and resistance index) and by the method of Laplace transform analysis. 4. The waveform parameters obtained by Laplace transform analysis distinguished the different changes in flow conditions; they thus had direct physiological relevance, unlike the standard waveform indices.
A modular approach for automated sample preparation and chemical analysis
NASA Technical Reports Server (NTRS)
Clark, Michael L.; Turner, Terry D.; Klingler, Kerry M.; Pacetti, Randolph
1994-01-01
Changes in international relations, especially within the past several years, have dramatically affected the programmatic thrusts of the U.S. Department of Energy (DOE). The DOE now is addressing the environmental cleanup required as a result of 50 years of nuclear arms research and production. One major obstacle in the remediation of these areas is the chemical determination of potentially contaminated material using currently acceptable practices. Process bottlenecks and exposure to hazardous conditions pose problems for the DOE. One proposed solution is the application of modular automated chemistry using Standard Laboratory Modules (SLM) to perform Standard Analysis Methods (SAM). The Contaminant Analysis Automation (CAA) Program has developed standards and prototype equipment that will accelerate the development of modular chemistry technology and is transferring this technology to private industry.
Analysis of Biomass Sugars Using a Novel HPLC Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agblevor, F. A.; Hames, B. R.; Schell, D.
The precise quantitative analysis of biomass sugars is a very important step in the conversion of biomass feedstocks to fuels and chemicals. However, the most accurate method of biomass sugar analysis is based on gas chromatography analysis of derivatized sugars, either as alditol acetates or as trimethylsilyl derivatives. The derivatization method is time consuming, but the alternative high-performance liquid chromatography (HPLC) method cannot resolve most sugars found in biomass hydrolysates. We have demonstrated for the first time that by careful manipulation of the HPLC mobile phase, biomass monomeric sugars (arabinose, xylose, fructose, glucose, mannose, and galactose) can be analyzed quantitatively with excellent baseline resolution of all the sugars. This method was demonstrated for standard sugars and for pretreated corn stover liquid and solid fractions. Our method can also be used to analyze dimeric sugars (cellobiose and sucrose).
Environmental analysis of higher brominated diphenyl ethers and decabromodiphenyl ethane.
Kierkegaard, Amelie; Sellström, Ulla; McLachlan, Michael S
2009-01-16
Methods for environmental analysis of higher brominated diphenyl ethers (PBDEs), in particular decabromodiphenyl ether (BDE209), and the recently discovered environmental contaminant decabromodiphenyl ethane (deBDethane) are reviewed. The extensive literature on analysis of BDE209 has identified several critical issues, including contamination of the sample, degradation of the analyte during sample preparation and GC analysis, and the selection of appropriate detection methods and surrogate standards. The limited experience with the analysis of deBDethane suggests that there are many commonalities with BDE209. The experience garnered from the analysis of BDE209 over the last 15 years will greatly facilitate progress in the analysis of deBDethane.
Determination of micro amounts of iron, aluminum, and alkaline earth metals in silicon carbide
NASA Technical Reports Server (NTRS)
Hirata, H.; Arai, M.
1978-01-01
A colorimetric method for the analysis of micro components in silicon carbide used as the raw material for varistors is described. The microcomponents analyzed included iron soluble in hydrochloric acid, total iron, aluminum, calcium, and magnesium. Samples were analyzed by the method, and the results for iron and aluminum agreed well with the N.B.S. standard values and with values obtained independently by another company. The method can therefore be applied to the analysis of actual samples.
ERIC Educational Resources Information Center
Patalino, Marianne
Problems in current course evaluation methods are discussed and an alternative method is described for the construction, analysis, and interpretation of a test to evaluate instructional programs. The method presented represents a different approach to the traditional overreliance on standardized achievement tests and the total scores they provide.…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-21
... Methods approved for Total Coliforms (Standard Methods, 22nd Edition): Lactose Fermentation (Standard Total Coliform Fermentation Technique), SM 9221 A, B, C and 9221 B.1, B.2; Enzyme Substrate (Colilert®), SM 9223 B ...
7 CFR 58.930 - Official test methods.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., GENERAL SPECIFICATIONS FOR APPROVED PLANTS AND STANDARDS FOR GRADES OF DAIRY PRODUCTS 1 General Specifications for Dairy Plants Approved for USDA Inspection and Grading Service 1 Operations and Operating Procedures § 58.930 Official test methods. (a) Chemical. Chemical analysis, except where otherwise prescribed...
7 CFR 58.930 - Official test methods.
Code of Federal Regulations, 2012 CFR
2012-01-01
..., GENERAL SPECIFICATIONS FOR APPROVED PLANTS AND STANDARDS FOR GRADES OF DAIRY PRODUCTS 1 General Specifications for Dairy Plants Approved for USDA Inspection and Grading Service 1 Operations and Operating Procedures § 58.930 Official test methods. (a) Chemical. Chemical analysis, except where otherwise prescribed...
7 CFR 58.930 - Official test methods.
Code of Federal Regulations, 2013 CFR
2013-01-01
..., GENERAL SPECIFICATIONS FOR APPROVED PLANTS AND STANDARDS FOR GRADES OF DAIRY PRODUCTS 1 General Specifications for Dairy Plants Approved for USDA Inspection and Grading Service 1 Operations and Operating Procedures § 58.930 Official test methods. (a) Chemical. Chemical analysis, except where otherwise prescribed...
7 CFR 58.930 - Official test methods.
Code of Federal Regulations, 2011 CFR
2011-01-01
..., GENERAL SPECIFICATIONS FOR APPROVED PLANTS AND STANDARDS FOR GRADES OF DAIRY PRODUCTS 1 General Specifications for Dairy Plants Approved for USDA Inspection and Grading Service 1 Operations and Operating Procedures § 58.930 Official test methods. (a) Chemical. Chemical analysis, except where otherwise prescribed...
7 CFR 58.930 - Official test methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
..., GENERAL SPECIFICATIONS FOR APPROVED PLANTS AND STANDARDS FOR GRADES OF DAIRY PRODUCTS 1 General Specifications for Dairy Plants Approved for USDA Inspection and Grading Service 1 Operations and Operating Procedures § 58.930 Official test methods. (a) Chemical. Chemical analysis, except where otherwise prescribed...
Resnick, Cory M; Kim, Somi; Yorlets, Rachel R; Calabrese, Carly E; Peacock, Zachary S; Kaban, Leonard B
2018-03-22
There is no universally accepted method for determining the ideal sagittal position of the maxilla in orthognathic surgery. In "Element II" of "The Six Elements of Orofacial Harmony," Andrews used the forehead to define the goal maxillary position. The purpose of this study was to compare how well this analysis correlated with postoperative findings in patients who underwent bimaxillary orthognathic surgery planned using other guidelines. The authors hypothesized that the Andrews analysis would more consistently reflect clinical outcomes than standard angular and linear measurements. This is a retrospective cohort study of patients who had bimaxillary orthognathic surgery and achieved an acceptable esthetic outcome. Patients with no maxillary sagittal movement, obstructive sleep apnea, cleft or craniofacial diagnoses, or who were non-Caucasian were excluded. Treatment plans were developed using photographs, radiographs, and standard cephalometric measurements. The Andrews analysis, measuring the distance from the maxillary incisor to the goal anterior limit line, and standard measurements were applied to end-treatment records. The Andrews analysis was statistically compared with standard methods. There were 493 patients who had orthognathic surgery from 2007 through 2014, and 60 (62% women; mean age, 22.1 ± 6.8 yr) met the criteria for inclusion in this study. The mean Andrews distances were -4.8 ± 2.9 mm for women and -8.6 ± 4.6 mm for men preoperatively and -0.6 ± 2.1 mm for women and -1.9 ± 3.4 mm for men postoperatively. For women, the Andrews analysis was closer to the goal value (0 mm) postoperatively than any standard measurement (P < .001). For men, the linear distance from the A point to a vertical line tangent to the nasion from the McNamara analysis performed best (P < .001), followed by the Andrews analysis. The Andrews analysis correlated well with the final esthetic sagittal maxillary position in the present sample, particularly for women, and could be a useful tool for orthognathic surgical planning. Copyright © 2018 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
2015-10-01
capability to meet the task to the standard under the condition, nothing more or less, else the funding is wasted. Also, that funding for the... bins to segregate gaps qualitatively before the gap value model determined preference among gaps within the bins. Computation of a gap's... for communication, interpretation, or processing by humans or by automatic means (as it pertains to modeling and simulation). Delphi Method: a...
Computer-aided analysis with Image J for quantitatively assessing psoriatic lesion area.
Sun, Z; Wang, Y; Ji, S; Wang, K; Zhao, Y
2015-11-01
Body surface area is important in determining the severity of psoriasis. However, an objective, reliable, and practical method for this purpose is still needed. We performed a computer image analysis (CIA) of psoriatic area using the Image J freeware to determine whether this method could be used for objective evaluation of psoriatic area. Fifteen psoriasis patients were randomized to be treated with adalimumab or placebo in a clinical trial. At each visit, the psoriasis area of each body site was estimated by two physicians (E-method), and standard photographs were taken. The psoriasis area in the pictures was assessed with CIA using semi-automatic threshold selection (T-method) or manual selection (M-method, gold standard). The results of the three methods were analyzed, with reliability and affecting factors evaluated. Both the T- and E-methods correlated strongly with the M-method, with the T-method showing a slightly stronger correlation. Both the T- and E-methods had good consistency between evaluators. All three methods were able to detect the change in psoriatic area after treatment, although the E-method tended to overestimate it. CIA with the Image J freeware is reliable and practicable for quantitatively assessing the lesional area of psoriasis. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Disseminating the unit of mass from multiple primary realisations
NASA Astrophysics Data System (ADS)
Nielsen, Lars
2016-12-01
When a new definition of the kilogram is adopted in 2018, as expected, the unit of mass will be realised by the watt balance method, the x-ray crystal density method, or perhaps other primary methods still to be developed. So far, the standard uncertainties associated with the available primary methods are at least one order of magnitude larger than the standard uncertainty associated with mass comparisons using mass comparators, so differences in primary realisations of the kilogram are easily detected, whereas many National Metrology Institutes would have to increase their calibration and measurement capabilities (CMCs) if they were traceable to a single primary realisation. This paper presents a scheme for obtaining traceability to multiple primary realisations of the kilogram using a small group of stainless steel 1 kg weights, which are allowed to change their masses over time in a way known to be realistic, and which are calibrated and stored in air. An analysis of the scheme shows that if the relative standard uncertainties of future primary realisations are equal to the relative standard uncertainties of the present methods used to measure the Planck constant, the unit of mass can be disseminated with a standard uncertainty less than 0.015 mg, which matches the smallest CMCs currently claimed for the calibration of 1 kg weights.
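A minimal sketch of combining several primary realisations into one disseminated value, assuming independent realisations and inverse-variance weighting; the paper's actual scheme also models mass drift of the steel weights over time, which is omitted here, and all numbers are invented.

```python
import numpy as np

# Hypothetical primary realisations of 1 kg (deviations from nominal, in mg)
# with their standard uncertainties, e.g. two watt balances and an XRCD value.
m = np.array([0.010, -0.025, 0.005])   # mg
u = np.array([0.020, 0.030, 0.025])    # mg (standard uncertainties)

w = 1.0 / u**2                          # inverse-variance weights
m_hat = np.sum(w * m) / np.sum(w)       # weighted mean (consensus value)
u_hat = np.sqrt(1.0 / np.sum(w))        # standard uncertainty of the mean
print(f"consensus value: {m_hat:.4f} mg, u = {u_hat:.4f} mg")
```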
Integrating Security into the Curriculum
1998-12-01
predicate calculus, discrete math, and finite-state machine theory. In addition to applying standard mathematical foundations to constructing hardware and... models, specifications, and the use of formal methods for verification and covert channel analysis. The means for analysis is based on discrete math, information
Exhaled human breath analysis has become a standard technique for assessing exposure to exogenous volatile organic compounds (VOCs) such as trihalomethanes from water chlorination; aromatics, hydrocarbons, and oxygenates from fuels usage; and various chlorinated solvents from i...
Prigge, R.; Micke, H.; Krüger, J.
1963-01-01
As part of a collaborative assay of the proposed Fifth International Standard for Gas-Gangrene Antitoxin (Perfringens), five ampoules of the proposed replacement material were assayed in the authors' laboratory against the then current Fourth International Standard. Both in vitro and in vivo methods were used. This paper presents the results and their statistical analysis. The two methods yielded different results which were not likely to have been due to chance, but exact statistical comparison is not possible. It is thought, however, that the differences may be due, at least in part, to differences in the relative proportions of zeta-antitoxin and alpha-antitoxin in the Fourth and Fifth International Standards and the consequent different reactions with the test toxin that was used for titration. PMID:14107746
A Standard Procedure for Conducting Cognitive Task Analysis.
ERIC Educational Resources Information Center
Redding, Richard E.
Traditional methods for task analysis have been largely based on the Instructional Systems Development (ISD) model, which is widely used throughout industry and the military. The first part of this document gives an overview of cognitive task analysis, which is conducted within the first phase of ISD. The following steps of cognitive task analysis…
ERLN Technical Support for Labs
The Environmental Response Laboratory Network provides policies and guidance on lab and data requirements, Standardized Analytical Methods, and technical support for water and radiological sampling and analysis
Farmer, William H.; Archfield, Stacey A.; Over, Thomas M.; Hay, Lauren E.; LaFontaine, Jacob H.; Kiang, Julie E.
2015-01-01
Effective and responsible management of water resources relies on a thorough understanding of the quantity and quality of available water. Streamgages cannot be installed at every location where streamflow information is needed. As part of its National Water Census, the U.S. Geological Survey is planning to provide streamflow predictions for ungaged locations. In order to predict streamflow at a useful spatial and temporal resolution throughout the Nation, efficient methods need to be selected. This report examines several methods used for streamflow prediction in ungaged basins to determine the best methods for regional and national implementation. A pilot area in the southeastern United States was selected to apply 19 different streamflow prediction methods and evaluate each method by a wide set of performance metrics. Through these comparisons, two methods emerged as the most generally accurate streamflow prediction methods: the nearest-neighbor implementations of nonlinear spatial interpolation using flow duration curves (NN-QPPQ) and standardizing logarithms of streamflow by monthly means and standard deviations (NN-SMS12L). It was nearly impossible to distinguish between these two methods in terms of performance. Furthermore, neither of these methods requires significantly more parameterization in order to be applied: NN-SMS12L requires 24 regional regressions—12 for monthly means and 12 for monthly standard deviations. NN-QPPQ, in the application described in this study, required 27 regressions of particular quantiles along the flow duration curve. Despite this finding, the results suggest that an optimal streamflow prediction method depends on the intended application. Some methods are stronger overall, while some methods may be better at predicting particular statistics. The methods of analysis presented here reflect a possible framework for continued analysis and comprehensive multiple comparisons of methods of prediction in ungaged basins (PUB). Additional metrics of comparison can easily be incorporated into this type of analysis. By considering such a multifaceted approach, the top-performing models can easily be identified and considered for further research. The top-performing models can then provide a basis for future applications and explorations by scientists, engineers, managers, and practitioners to suit their own needs.
Chen, Jian-Wu; Zhou, Chang-Fu; Lin, Zhi-Xiong
2015-09-15
Although age is thought to correlate with the prognosis of glioma patients, the most appropriate age-group classification standard for evaluating prognosis has not been fully studied. This study aimed to investigate the influence of age-group classification standards on the prognosis of patients with high-grade hemispheric glioma (HGG). This retrospective study of 125 HGG patients used three different age-group classification standards (≤ 50 vs > 50 years old; ≤ 60 vs > 60 years old; ≤ 45, 45-65, and ≥ 65 years old) to evaluate the impact of age on prognosis. The primary end-point was overall survival (OS). The Kaplan-Meier method was applied for univariate analysis and the Cox proportional hazards model for multivariate analysis. Univariate analysis showed a significant correlation between OS and all three age-group classification standards, as well as between OS and pathological grade, gender, location of glioma, and regular chemotherapy and radiotherapy treatment. Multivariate analysis showed that the only independent predictors of OS were the ≤ 50 vs > 50 years old classification standard, pathological grade, and regular chemotherapy. In summary, the most appropriate age-group classification standard as an independent prognostic factor was ≤ 50 vs > 50 years old. Pathological grade and chemotherapy were also independent predictors of OS in post-operative HGG patients. Copyright © 2015. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Wang, Tao; Zhou, Guoqing; Wang, Jianzhou; Zhou, Lei
2018-03-01
The artificial ground freezing (AGF) method is widely used in civil and mining engineering, and the thermal regime of frozen soil around the freezing pipe affects the safety of design and construction. The thermal parameters can be truly random due to heterogeneity of the soil properties, which leads to randomness in the thermal regime of frozen soil around the freezing pipe. The purpose of this paper is to study the one-dimensional (1D) random thermal regime problem on the basis of a stochastic analysis model and the Monte Carlo (MC) method. Treating the uncertain thermal parameters of frozen soil as random variables, stochastic processes, and random fields in turn, the corresponding stochastic thermal regimes of frozen soil around a single freezing pipe are obtained and analyzed. By taking the variability of each stochastic parameter into account individually, the influence of each stochastic thermal parameter on the stochastic thermal regime is investigated. The results show that the mean temperatures of frozen soil around the single freezing pipe are the same for the three representations of parameter uncertainty, while the standard deviations differ. The distribution of the standard deviation differs greatly across radial coordinate locations, with the larger standard deviations concentrated in the phase change area. The results computed with the random variable and stochastic process representations differ greatly from the measured data, whereas those computed with the random field representation agree well with the measured data. Each uncertain thermal parameter has a different effect on the standard deviation of the frozen soil temperature around the single freezing pipe. These results can provide a theoretical basis for the design and construction of AGF.
Sinigalliano, Christopher D.; Ervin, Jared S.; Van De Werfhorst, Laurie C.; Badgley, Brian D.; Ballesté, Elisenda; Bartkowiak, Jakob; Boehm, Alexandria B.; Byappanahalli, Muruleedhara N.; Goodwin, Kelly D.; Gourmelon, Michèle; Griffith, John; Holden, Patricia A.; Jay, Jenny; Layton, Blythe; Lee, Cheonghoon; Lee, Jiyoung; Meijer, Wim G.; Noble, Rachel; Raith, Meredith; Ryu, Hodon; Sadowsky, Michael J.; Schriewer, Alexander; Wang, Dan; Wanless, David; Whitman, Richard; Wuertz, Stefan; Santo Domingo, Jorge W.
2013-01-01
Here we report results from a multi-laboratory (n = 11) evaluation of four different PCR methods targeting the 16S rRNA gene of Catellicoccus marimammalium originally developed to detect gull fecal contamination in coastal environments. The methods included a conventional end-point PCR method, a SYBR® Green qPCR method, and two TaqMan® qPCR methods. Different techniques for data normalization and analysis were tested. Data analysis methods had a pronounced impact on assay sensitivity and specificity calculations. Across-laboratory standardization of metrics including the lower limit of quantification (LLOQ), target detected but not quantifiable (DNQ), and target not detected (ND) significantly improved results compared to results submitted by individual laboratories prior to definition standardization. The unit of measure used for data normalization also had a pronounced effect on measured assay performance. Data normalization to DNA mass improved quantitative method performance as compared to enterococcus normalization. The MST methods tested here were originally designed for gulls but were found in this study to also detect feces from other birds, particularly feces composited from pigeons. Sequencing efforts showed that some pigeon feces from California contained sequences similar to C. marimammalium found in gull feces. These data suggest that the prevalence, geographic scope, and ecology of C. marimammalium in host birds other than gulls require further investigation. This study represents an important first step in the multi-laboratory assessment of these methods and highlights the need to broaden and standardize additional evaluations, including environmentally relevant target concentrations in ambient waters from diverse geographic regions.
NASA Astrophysics Data System (ADS)
Li, Li-Na; Ma, Chang-Ming; Chang, Ming; Zhang, Ren-Cheng
2017-12-01
A novel method based on SIMPLe-to-use Interactive Self-modeling Mixture Analysis (SIMPLISMA) and Kernel Partial Least Squares (KPLS), named SIMPLISMA-KPLS, is proposed in this paper for the simultaneous selection of outlier samples and informative samples. It is a fast algorithm for model standardization (also called model transfer) in near-infrared (NIR) spectroscopy. NIR data from a corn data set, used to analyze protein content, are employed to evaluate the proposed method. Piecewise direct standardization (PDS) is employed for model transfer, and SIMPLISMA-PDS-KPLS and KS-PDS-KPLS are compared in terms of the prediction accuracy of protein content and the calculation speed of each algorithm. The conclusions are that SIMPLISMA-KPLS can serve as an alternative sample selection method for model transfer. Although its accuracy is similar to that of Kennard-Stone (KS), it differs from KS in that it employs concentration information during selection. This ensures that analyte information is involved in the analysis and that the spectra (X) of the selected samples are related to the concentrations (y). It can also eliminate outlier samples simultaneously through validation of the calibration. Running-time statistics show that the sample selection process is faster with KPLS. The speed of the SIMPLISMA-KPLS algorithm is beneficial for improving the speed of online measurement using NIR spectroscopy.
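For reference, Kennard-Stone, the comparison method named above, can be sketched in a few lines. This is a generic max-min implementation on hypothetical spectra, not the authors' SIMPLISMA-KPLS code; note that plain KS uses only the spectra, with no concentration information.

```python
import numpy as np

def kennard_stone(X, k):
    """Select k samples by the Kennard-Stone max-min distance rule."""
    # Pairwise Euclidean distances between all spectra.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Start from the pair of samples farthest apart.
    i, j = np.unravel_index(np.argmax(d), d.shape)
    chosen = [i, j]
    while len(chosen) < k:
        rest = [p for p in range(len(X)) if p not in chosen]
        # Add the sample whose nearest chosen neighbour is farthest away.
        nxt = max(rest, key=lambda p: d[p, chosen].min())
        chosen.append(nxt)
    return chosen

X = np.random.rand(40, 200)   # 40 hypothetical NIR spectra, 200 wavelengths
print(kennard_stone(X, 10))
```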
[Analysis of phthalates in plastic food-packaging bags by thin layer chromatography].
Chen, Hui; Wang, Yuan; Zhu, Ruohua
2006-01-01
A method for the simultaneous determination of four phthalates, namely dimethyl phthalate (DMP), diethyl phthalate (DEP), di-n-butyl phthalate (DBP), and di(2-ethylhexyl) phthalate (DEHP), in plastic food-packaging bags by thin layer chromatography (TLC) was developed. The plastic food-packaging bags were extracted with ethanol by ultrasonication, and the mixture was then filtered through a 0.45 μm membrane. A mixture of ethyl acetate-anhydrous ether-isooctane (1:4:15, v/v) was used as the developing agent on the TLC silica gel plate. The filtered liquid was spotted on a TLC plate pretreated with acetone and detected at a scanning wavelength of 275 nm with a reference wavelength of 340 nm. Qualitative analysis of the phthalates was performed using the R(f) values of the chromatogram. Quantitative analysis was performed with the external standard method. Good linearities were obtained for DMP, DEP, DBP, and DEHP. The detection limits were 2.1 ng for DMP, 2.4 ng for DEP, 3.4 ng for DBP, and 4.0 ng for DEHP. The relative standard deviations (RSDs) for the four phthalates were 2.8% - 3.5%. The recoveries of the four phthalate standards in a real sample were 78.58% - 111.04%. The method has the advantages of high precision, high sensitivity, small sample size, and simple pretreatment. The method was used to detect the four phthalates in food-packaging bags. The contents in real samples were close to the results obtained by gas chromatography.
Jitaru, Petru; Adams, Freddy C
2004-11-05
This paper reports the development of an analytical approach for speciation analysis of mercury at ultra-trace levels on the basis of solid-phase microextraction and multicapillary gas chromatography hyphenated to inductively coupled plasma time-of-flight mass spectrometry. Headspace solid-phase microextraction with a carboxen/polydimethylsiloxane fiber is used for extraction/preconcentration of mercury species after derivatization with sodium tetraethylborate and subsequent volatilization. Isothermal separation of methylmercury (MeHg), inorganic mercury (Hg2+), and propylmercury (PrHg), used as internal standard, is achieved within a chromatographic run below 45 s without the introduction of spectral skew. Method detection limits (3 × standard deviation criterion) calculated for 10 successive injections of the analytical reagent blank are 0.027 pg g(-1) (as metal) for MeHg and 0.27 pg g(-1) for Hg2+. The repeatability (R.S.D., %) is 3.3% for MeHg and 3.8% for Hg2+ for 10 successive injections of a 10 pg standard mixture. The method accuracy for MeHg and total mercury is validated through the analysis of marine and estuarine sediment reference materials. A comparison of the sediment data with those obtained by a purge-and-trap injection (PTI) method is also presented. The analytical procedure is illustrated with some results for the ultra-trace level analysis of ice from Antarctica, for which the accuracy is assessed by spike recovery experiments.
Chow, Clara K.; Corsi, Daniel J.; Lock, Karen; Madhavan, Manisha; Mackie, Pam; Li, Wei; Yi, Sun; Wang, Yang; Swaminathan, Sumathi; Lopez-Jaramillo, Patricio; Gomez-Arbelaez, Diego; Avezum, Álvaro; Lear, Scott A.; Dagenais, Gilles; Teo, Koon; McKee, Martin; Yusuf, Salim
2014-01-01
Background Previous research has shown that environments with features that encourage walking are associated with increased physical activity. Existing methods to assess the built environment using geographical information systems (GIS) data, direct audit or large surveys of the residents face constraints, such as data availability and comparability, when used to study communities in countries in diverse parts of the world. The aim of this study was to develop a method to evaluate features of the built environment of communities using a standard set of photos. In this report we describe the method of photo collection, photo analysis instrument development and inter-rater reliability of the instrument. Methods/Principal Findings A minimum of 5 photos were taken per community in 86 communities in 5 countries according to a standard set of instructions from a designated central point of each community by researchers at each site. A standard pro forma derived from reviewing existing instruments to assess the built environment was developed and used to score the characteristics of each community. Photo sets from each community were assessed independently by three observers in the central research office according to the pro forma and the inter-rater reliability was compared by intra-class correlation (ICC). Overall 87% (53 of 60) items had an ICC of ≥0.70, 7% (4 of 60) had an ICC between 0.60 and 0.70 and 5% (3 of 60) items had an ICC ≤0.50. Conclusions/Significance Analysis of photos using a standardized protocol as described in this study offers a means to obtain reliable and reproducible information on the built environment in communities in very diverse locations around the world. The collection of the photographic data required minimal training and the analysis demonstrated high reliability for the majority of items of interest. PMID:25369366
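The inter-rater agreement statistic reported above can be illustrated with a self-contained computation. The sketch below implements ICC(2,1) (two-way random effects, absolute agreement, single rater) on invented scores; the paper does not state which ICC form was used, so this is one plausible choice.

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    scores is an (n_subjects, n_raters) array."""
    n, k = scores.shape
    grand = scores.mean()
    row_m = scores.mean(axis=1)     # per-community means
    col_m = scores.mean(axis=0)     # per-observer means
    ms_r = k * np.sum((row_m - grand) ** 2) / (n - 1)   # between-rows MS
    ms_c = n * np.sum((col_m - grand) ** 2) / (k - 1)   # between-columns MS
    sse = np.sum((scores - row_m[:, None] - col_m[None, :] + grand) ** 2)
    ms_e = sse / ((n - 1) * (k - 1))                    # residual MS
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Three hypothetical observers scoring one item across 8 communities.
scores = np.array([[3, 3, 2], [4, 4, 4], [2, 1, 2], [5, 5, 4],
                   [1, 1, 1], [3, 2, 3], [4, 4, 5], [2, 2, 2]], float)
print(round(icc_2_1(scores), 3))
```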
ERIC Educational Resources Information Center
Fish, Laurel J.; Halcoussis, Dennis; Phillips, G. Michael
2017-01-01
The Monte Carlo method and related multiple imputation methods are traditionally used in math, physics and science to estimate and analyze data and are now becoming standard tools in analyzing business and financial problems. However, few sources explain the application of the Monte Carlo method for individuals and business professionals who are…
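A minimal sketch of the kind of business-flavored Monte Carlo estimate such a course might teach; the distributions and figures below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# Toy question: monthly profit = revenue - cost, both uncertain.
revenue = rng.normal(120_000, 15_000, n)   # assumed distributions
cost = rng.normal(100_000, 10_000, n)
profit = revenue - cost

print("P(loss) ~", (profit < 0).mean())
print("mean profit ~", profit.mean(),
      "+/-", 1.96 * profit.std() / np.sqrt(n))   # MC standard error band
```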
A Preliminary Rubric Design to Evaluate Mixed Methods Research
ERIC Educational Resources Information Center
Burrows, Timothy J.
2013-01-01
With the increase in frequency of the use of mixed methods, both in research publications and in externally funded grants there are increasing calls for a set of standards to assess the quality of mixed methods research. The purpose of this mixed methods study was to conduct a multi-phase analysis to create a preliminary rubric to evaluate mixed…
Data-optimized source modeling with the Backwards Liouville Test–Kinetic method
Woodroffe, J. R.; Brito, T. V.; Jordanova, V. K.; ...
2017-09-14
In the standard practice of neutron multiplicity counting, the first three sampled factorial moments of the event-triggered neutron count distribution are used to quantify the three main neutron source terms: the spontaneous fissile material effective mass, the relative (α,n) production, and the induced fission source responsible for multiplication. Our study compares three methods to quantify the statistical uncertainty of the estimated mass: the bootstrap method, propagation of variance through moments, and the statistical analysis of cycle data method. Each of the three methods was implemented on a set of four different NMC measurements, held at the JRC laboratory in Ispra, Italy, sampling four different Pu samples in a standard Plutonium Scrap Multiplicity Counter (PSMC) well counter.
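The sampled factorial moments and their bootstrap uncertainty can be sketched directly. The point-model equations that convert moments to effective mass are omitted here, and the cycle data are synthetic stand-ins, so this illustrates only the bootstrap step named above.

```python
import numpy as np

def factorial_moments(counts):
    """First three sample factorial moments of a neutron count distribution."""
    n = np.asarray(counts, float)
    return (n.mean(),
            (n * (n - 1)).mean(),
            (n * (n - 1) * (n - 2)).mean())

def bootstrap_moment_sd(counts, n_boot=1000, seed=1):
    """Bootstrap SD of the factorial moments over resampled cycles."""
    rng = np.random.default_rng(seed)
    boots = [factorial_moments(rng.choice(counts, size=len(counts),
                                          replace=True))
             for _ in range(n_boot)]
    return np.std(boots, axis=0, ddof=1)

cycles = np.random.default_rng(0).poisson(3.2, size=2000)  # stand-in cycle data
print(factorial_moments(cycles))
print(bootstrap_moment_sd(cycles))
```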
Finite element techniques in computational time series analysis of turbulent flows
NASA Astrophysics Data System (ADS)
Horenko, I.
2009-04-01
In recent years there has been a considerable increase of interest in the mathematical modeling and analysis of complex systems that undergo transitions between several phases or regimes. Such systems can be found, e.g., in weather forecasting (transitions between weather conditions), climate research (ice ages and warm ages), computational drug design (conformational transitions) and in econometrics (e.g., transitions between different phases of the market). In all cases, the accumulation of sufficiently detailed time series has led to the formation of huge databases, containing enormous but still undiscovered treasures of information. However, the extraction of essential dynamics and identification of the phases is usually hindered by the multidimensional nature of the signal, i.e., the information is "hidden" in the time series. The standard filtering approaches (e.g., wavelet-based spectral methods) have in general unfeasible numerical complexity in high dimensions; other standard methods (e.g., Kalman filter, MVAR, ARCH/GARCH, etc.) impose strong assumptions about the type of the underlying dynamics. An approach based on optimization of a specially constructed regularized functional (describing the quality of data description in terms of a certain number of specified models) will be introduced. Based on this approach, several new adaptive mathematical methods for simultaneous EOF/SSA-like data-based dimension reduction and identification of hidden phases in high-dimensional time series will be presented. The methods exploit the topological structure of the analysed data and do not impose severe assumptions on the underlying dynamics. Special emphasis will be placed on the mathematical assumptions and numerical cost of the constructed methods. The application of the presented methods will first be demonstrated on a toy example, and the results will be compared with those obtained by standard approaches. The importance of accounting for the mathematical assumptions used in the analysis will be highlighted in this example. Finally, applications to the analysis of meteorological and climate data will be presented.
Chen, Yibin; Chen, Jiaxi; Chen, Xuan; Wang, Min; Wang, Wei
2015-01-01
A new method of uniform sampling is evaluated in this paper. Items and indexes were adopted to evaluate the rationality of the uniform sampling. The evaluation items included convenience of operation, uniformity of sampling site distribution, and accuracy and precision of the measured results. The evaluation indexes included operational complexity, occupation rate of sampling sites in a row and column, relative accuracy of pill weight, and relative deviation of pill weight. They were obtained from three kinds of drugs of different shape and size by four kinds of sampling methods. Gray correlation analysis was adopted to make a comprehensive evaluation against the standard method. The experimental results showed that the convenience of the uniform sampling method was 1 (100%), the odds ratio of the occupation rate in a row and column was infinity, the relative accuracy was 99.50-99.89%, the reproducibility RSD was 0.45-0.89%, and the weighted gray incidence degree exceeded that of the standard method. Hence, the uniform sampling method is easy to operate, and the selected samples are distributed uniformly. The experimental results demonstrate that the uniform sampling method has good accuracy and reproducibility and can be put into use in drug analysis.
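Gray (grey) relational analysis, the comprehensive-evaluation step mentioned above, reduces to a few array operations. The sketch below uses the common distinguishing coefficient rho = 0.5 and invented, pre-normalized index values; it is a generic implementation, not the authors' computation.

```python
import numpy as np

def gray_relational_grades(X, ref, rho=0.5):
    """Gray relational grade of each alternative (row of X) against ref.
    Columns are indicator values, assumed already normalized to [0, 1]."""
    delta = np.abs(X - ref)                     # deviation sequences
    dmin, dmax = delta.min(), delta.max()
    coeff = (dmin + rho * dmax) / (delta + rho * dmax)
    return coeff.mean(axis=1)                   # equal-weight average

# Four hypothetical sampling methods scored on three normalized indexes,
# compared with the standard method as the reference sequence.
X = np.array([[0.9, 0.8, 1.0],
              [0.7, 0.9, 0.8],
              [1.0, 0.7, 0.9],
              [0.6, 0.6, 0.7]])
ref = np.ones(3)
print(gray_relational_grades(X, ref))   # higher grade = closer to standard
```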
A survey of tools for the analysis of quantitative PCR (qPCR) data.
Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas
2014-09-01
Real-time quantitative polymerase-chain-reaction (qPCR) is a standard technique in most laboratories used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows, 5 web-based, 9 R-based and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview about quantification strategies, and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adapt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
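As one example of the quantification strategies such tools implement, the widely used 2^-ΔΔCq method reduces to a two-line computation. The Cq values below are hypothetical, and the formula assumes roughly 100% amplification efficiency for both target and reference genes.

```python
# Mean Cq values of technical replicates (hypothetical example).
cq_target_treated, cq_ref_treated = 24.1, 18.0
cq_target_control, cq_ref_control = 26.0, 18.2

# Normalize to the reference gene, then to the control condition.
ddcq = ((cq_target_treated - cq_ref_treated)
        - (cq_target_control - cq_ref_control))
fold_change = 2.0 ** (-ddcq)   # assumes ~100% amplification efficiency
print(f"fold change: {fold_change:.2f}")
```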
Fancher, Chris M.; Han, Zhen; Levin, Igor; Page, Katharine; Reich, Brian J.; Smith, Ralph C.; Wilson, Alyson G.; Jones, Jacob L.
2016-01-01
A Bayesian inference method for refining crystallographic structures is presented. The distribution of model parameters is stochastically sampled using Markov chain Monte Carlo. Posterior probability distributions are constructed for all model parameters to properly quantify uncertainty by appropriately modeling the heteroskedasticity and correlation of the error structure. The proposed method is demonstrated by analyzing a National Institute of Standards and Technology silicon standard reference material. The results obtained by Bayesian inference are compared with those determined by Rietveld refinement. Posterior probability distributions of model parameters provide both estimates and uncertainties. The new method better estimates the true uncertainties in the model as compared to the Rietveld method. PMID:27550221
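As an illustration of the general approach (not the authors' refinement code), the toy random-walk Metropolis sampler below builds posterior distributions for the parameters of a synthetic line model, including the noise scale, so every parameter comes with both an estimate and an uncertainty.

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 50)
y = 1.5 * x + 0.3 + rng.normal(0, 0.1, x.size)   # synthetic "pattern"

def log_post(theta):
    # Gaussian log-likelihood with unknown noise scale, flat priors.
    a, b, log_s = theta
    s = np.exp(log_s)
    resid = y - (a * x + b)
    return -0.5 * np.sum(resid**2) / s**2 - y.size * np.log(s)

theta = np.array([1.0, 0.0, np.log(0.2)])
samples, lp = [], log_post(theta)
for _ in range(20000):                            # random-walk Metropolis
    prop = theta + rng.normal(0, 0.02, 3)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:       # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.array(samples[5000:])                # discard burn-in
print(samples.mean(axis=0))                       # posterior means
print(samples.std(axis=0))                        # posterior uncertainties
```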
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woodroffe, J. R.; Brito, T. V.; Jordanova, V. K.
In the standard practice of neutron multiplicity counting, the first three sampled factorial moments of the event-triggered neutron count distribution are used to quantify the three main neutron source terms: the spontaneous fissile material effective mass, the relative (α,n) production, and the induced fission source responsible for multiplication. Our study compares three methods to quantify the statistical uncertainty of the estimated mass: the bootstrap method, propagation of variance through moments, and the statistical analysis of cycle data method. Each of the three methods was implemented on a set of four different NMC measurements, held at the JRC laboratory in Ispra, Italy, sampling four different Pu samples in a standard Plutonium Scrap Multiplicity Counter (PSMC) well counter.
The purpose of this SOP is to describe the collection, storage, and shipment of tap and drinking water samples for analysis by EPA method 524.2 (revision 4.0). This SOP provides a brief description of the sample containers, collection, preservation, storage, shipping, and custod...
2005-04-01
Bray-Curtis distance measure with an Unweighted Pair Group Method with Arithmetic Averages (UPGMA) linkage method to perform a cluster analysis of the... Comparison of reef condition indicators clustering by UPGMA analysis...
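UPGMA on Bray-Curtis distances is directly available in SciPy, where UPGMA corresponds to 'average' linkage. A minimal sketch on an invented site-by-indicator matrix:

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical site-by-indicator matrix for reef condition indicators.
sites = np.abs(np.random.default_rng(3).normal(1.0, 0.4, size=(12, 6)))

d = pdist(sites, metric="braycurtis")          # Bray-Curtis distances
Z = linkage(d, method="average")               # UPGMA = average linkage
print(fcluster(Z, t=3, criterion="maxclust"))  # assign sites to 3 clusters
```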
The International Standard for Aureomycin
Humphrey, J. H.; Lightbown, J. W.; Mussett, M. V.; Perry, W. L. M.
1953-01-01
In 1950, the Department of Biological Standards, National Institute for Medical Research, London, was authorized by the WHO Expert Committee on Biological Standardization to proceed with the establishment of an International Standard for Aureomycin. A 100-g batch of aureomycin was obtained and was compared with the Standard Preparation of Aureomycin of the United States Food and Drug Administration (FDA) in a collaborative assay in which six laboratories in five countries participated. In all, 30 assays were carried out; 26 of these were done by biological methods, using Sarcina lutea, Bacillus pumilus, Staphylococcus aureus, or Bacillus cereus, and the remaining four by physicochemical methods. The results were subjected to standard methods of analysis, and the overall weighted mean potency (calculated from the biological assays only) was 1.0139, with limits of error of 99.5% to 100.5%. Since the International Standard is 1.39% more potent than the FDA Standard Preparation, it is probable that the latter contains a small amount of inert material; it is also possible that the International Standard itself is not 100% pure. For most practical purposes, however, both preparations may be regarded as substantially pure, and it is considered that to alter the present practice of quoting aureomycin dosage in metric units of weight would be inadvisable. Nevertheless, since the International Standard may not be a pure substance, a unit notation—for use where required in bioassays—is desirable, and the International Unit of Aureomycin has therefore been defined as the activity contained in one microgram of the International Standard. PMID:13141137
Sudell, Maria; Kolamunnage-Dona, Ruwanthi; Tudur-Smith, Catrin
2016-12-05
Joint models for longitudinal and time-to-event data are commonly used to simultaneously analyse correlated data in single study cases. Synthesis of evidence from multiple studies using meta-analysis is a natural next step but its feasibility depends heavily on the standard of reporting of joint models in the medical literature. During this review we aim to assess the current standard of reporting of joint models applied in the literature, and to determine whether current reporting standards would allow or hinder future aggregate data meta-analyses of model results. We undertook a literature review of non-methodological studies that involved joint modelling of longitudinal and time-to-event medical data. Study characteristics were extracted and an assessment of whether separate meta-analyses for longitudinal, time-to-event and association parameters were possible was made. The 65 studies identified used a wide range of joint modelling methods in a selection of software. Identified studies concerned a variety of disease areas. The majority of studies reported adequate information to conduct a meta-analysis (67.7% for longitudinal parameter aggregate data meta-analysis, 69.2% for time-to-event parameter aggregate data meta-analysis, 76.9% for association parameter aggregate data meta-analysis). In some cases model structure was difficult to ascertain from the published reports. Whilst extraction of sufficient information to permit meta-analyses was possible in a majority of cases, the standard of reporting of joint models should be maintained and improved. Recommendations for future practice include clear statement of model structure, of values of estimated parameters, of software used and of statistical methods applied.
Dikow, Nicola; Nygren, Anders Oh; Schouten, Jan P; Hartmann, Carolin; Krämer, Nikola; Janssen, Bart; Zschocke, Johannes
2007-06-01
Standard methods used for genomic methylation analysis allow the detection of complete absence of either methylated or non-methylated alleles but are usually unable to detect changes in the proportion of methylated and unmethylated alleles. We compare two methods for quantitative methylation analysis, using the chromosome 15q11-q13 imprinted region as model. Absence of the non-methylated paternal allele in this region leads to Prader-Willi syndrome (PWS) whilst absence of the methylated maternal allele results in Angelman syndrome (AS). A proportion of AS is caused by mosaic imprinting defects which may be missed with standard methods and require quantitative analysis for their detection. Sequence-based quantitative methylation analysis (SeQMA) involves quantitative comparison of peaks generated through sequencing reactions after bisulfite treatment. It is simple, cost-effective and can be easily established for a large number of genes. However, our results support previous suggestions that methods based on bisulfite treatment may be problematic for exact quantification of methylation status. Methylation-specific multiplex ligation-dependent probe amplification (MS-MLPA) avoids bisulfite treatment. It detects changes in both CpG methylation as well as copy number of up to 40 chromosomal sequences in one simple reaction. Once established in a laboratory setting, the method is more accurate, reliable and less time consuming.
Guild, Georgia E.; Stangoulis, James C. R.
2016-01-01
Within the HarvestPlus program there are many collaborators currently using X-Ray Fluorescence (XRF) spectroscopy to measure Fe and Zn in their target crops. In India, five HarvestPlus wheat collaborators have laboratories that conduct this analysis and their throughput has increased significantly. The benefits of using XRF are its ease of use, minimal sample preparation and high throughput analysis. The lack of commercially available calibration standards has led to a need for alternative calibration arrangements for many of the instruments. Consequently, the majority of instruments have either been installed with an electronic transfer of an original grain calibration set developed by a preferred lab, or a locally supplied calibration. Unfortunately, neither of these methods has been entirely successful. The electronic transfer is unable to account for small variations between the instruments, whereas the use of a locally provided calibration set is heavily reliant on the accuracy of the reference analysis method, which is particularly difficult to achieve when analyzing low levels of micronutrient. Consequently, we have developed a calibration method that uses non-matrix matched glass disks. Here we present the validation of this method and show this calibration approach can improve the reproducibility and accuracy of whole grain wheat analysis on 5 different XRF instruments across the HarvestPlus breeding program. PMID:27375644
Ergodic theory and visualization. II. Fourier mesochronic plots visualize (quasi)periodic sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levnajić, Zoran; Department of Mechanical Engineering, University of California Santa Barbara, Santa Barbara, California 93106; Mezić, Igor
We present an application and analysis of a visualization method for measure-preserving dynamical systems introduced by I. Mezić and A. Banaszuk [Physica D 197, 101 (2004)], based on frequency analysis and Koopman operator theory. This extends our earlier work on visualization of ergodic partition [Z. Levnajić and I. Mezić, Chaos 20, 033114 (2010)]. Our method employs the concept of Fourier time average [I. Mezić and A. Banaszuk, Physica D 197, 101 (2004)], and is realized as a computational algorithm for visualization of periodic and quasi-periodic sets in the phase space. The complement of the periodic phase space partition contains the chaotic zone, and we show how to identify it. The range of the method's applicability is illustrated using the well-known Chirikov standard map, while its potential in illuminating higher-dimensional dynamics is presented by studying the Froeschlé map and the Extended Standard Map.
Jagust, William J.; Landau, Susan M.; Koeppe, Robert A.; Reiman, Eric M.; Chen, Kewei; Mathis, Chester A.; Price, Julie C.; Foster, Norman L.; Wang, Angela Y.
2015-01-01
INTRODUCTION This paper reviews the work done in the ADNI PET core over the past 5 years, largely concerning techniques, methods, and results related to amyloid imaging in ADNI. METHODS The PET Core has utilized [18F]florbetapir routinely on ADNI participants, with over 1600 scans available for download. Four different laboratories are involved in data analysis, and have examined factors such as longitudinal florbetapir analysis, use of FDG-PET in clinical trials, and relationships between different biomarkers and cognition. RESULTS Converging evidence from the PET Core has indicated that cross-sectional and longitudinal florbetapir analyses require different reference regions. Studies have also examined the relationship between florbetapir data obtained immediately after injection, which reflects perfusion, and FDG-PET results. Finally, standardization has included the translation of florbetapir PET data to a centiloid scale. CONCLUSION The PET Core has demonstrated a variety of methods for standardization of biomarkers such as florbetapir PET in a multicenter setting. PMID:26194311
Ergodic theory and visualization. II. Fourier mesochronic plots visualize (quasi)periodic sets.
Levnajić, Zoran; Mezić, Igor
2015-05-01
We present an application and analysis of a visualization method for measure-preserving dynamical systems introduced by I. Mezić and A. Banaszuk [Physica D 197, 101 (2004)], based on frequency analysis and Koopman operator theory. This extends our earlier work on visualization of ergodic partition [Z. Levnajić and I. Mezić, Chaos 20, 033114 (2010)]. Our method employs the concept of Fourier time average [I. Mezić and A. Banaszuk, Physica D 197, 101 (2004)], and is realized as a computational algorithm for visualization of periodic and quasi-periodic sets in the phase space. The complement of the periodic phase space partition contains the chaotic zone, and we show how to identify it. The range of the method's applicability is illustrated using the well-known Chirikov standard map, while its potential in illuminating higher-dimensional dynamics is presented by studying the Froeschlé map and the Extended Standard Map.
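A minimal sketch of the Fourier time average on the Chirikov standard map, under the assumptions that f(x, p) = cos x is the observable and omega is a trial frequency (both my choices, not the paper's). Large |average| flags (quasi)periodic sets resonant with omega, while the average decays toward zero in the chaotic zone.

```python
import numpy as np

K, N, omega = 0.9, 2000, 0.25    # map parameter, iterates, trial frequency

# Grid of initial conditions on the torus [0, 2*pi)^2.
x, p = np.meshgrid(np.linspace(0, 2 * np.pi, 60, endpoint=False),
                   np.linspace(0, 2 * np.pi, 60, endpoint=False))
acc = np.zeros_like(x, dtype=complex)

for k in range(N):
    acc += np.cos(x) * np.exp(-2j * np.pi * omega * k)  # harmonic average
    p = (p + K * np.sin(x)) % (2 * np.pi)               # Chirikov standard map
    x = (x + p) % (2 * np.pi)

mesochronic = np.abs(acc) / N      # one pixel per initial condition
print(mesochronic.min(), mesochronic.max())
```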
NASA Astrophysics Data System (ADS)
Lusianti, E.; Wibowo, R.; Hudiyono, S.
2018-01-01
Azelaic acid is a substance with anti-acne and skin-lightening effects that is often added to cosmetics. In acne treatment, azelaic acid is generally used at a concentration of 20% in cream formulations and 15% in gels. Use at concentrations below 10% is not recommended because it is not effective, while use above 10% is categorized as a medical treatment. In Indonesia, the Head of the National Agency of Drug and Food Control (BPOM) has issued Regulation No. 18 of 2015 on the Technical Requirements of Cosmetics Ingredients, Annex V of which states that azelaic acid is banned in cosmetics. However, until this research began, the BPOM did not have a valid method to identify it in cosmetics; consequently, surveillance of this ingredient in products is hard to do. In this research, the fatty acid standard analysis method of AOAC International was modified and validated for use in the laboratory. The method of analysis involves dissolving the cream preparation in methanol and heating, then adding BF3-methanol catalyst, followed by extraction and analysis using GC-MS. The validation shows that the calibration curve is linear, with a correlation coefficient of 0.9997. The method is fairly sensitive, with a 0.02% detection limit; fairly precise, with relative standard deviations (RSD) between 0.626% and 0.961%; and fairly accurate, with a mean recovery of 99.85% (range 98.27-100.72%). In sum, the results demonstrate that the method can be used as a routine analysis method for laboratory testing.
STANDARD OPERATING PROCEDURE FOR QUALITY ASSURANCE IN ANALYTICAL CHEMISTRY METHODS DEVELOPMENT
The Environmental Protection Agency's (EPA) Office of Research and Development (ORD) is engaged in the development, demonstration, and validation of new or newly adapted methods of analysis for environmentally related samples. Recognizing that a "one size fits all" approach to qu...
[The role of meta-analysis in assessing the treatment of advanced non-small cell lung cancer].
Pérol, M; Pérol, D
2004-02-01
Meta-analysis is a statistical method allowing an evaluation of the direction and quantitative importance of a treatment effect observed in randomized trials that have tested the treatment but have not provided a definitive conclusion. In the present review, we discuss the methodology and the contribution of meta-analyses to the treatment of advanced-stage or metastatic non-small-cell lung cancer. In this area of oncology, meta-analyses have provided decisive information demonstrating the impact of chemotherapy on patient survival. They have also helped define a cisplatin-based two-drug regimen as the gold standard treatment for patients with a satisfactory general status. Recently, the meta-analysis method was used to measure the influence of gemcitabine in combination with platinum salts and demonstrated a small but significant benefit in survival, confirming that gemcitabine in combination with cisplatin remains a gold standard treatment.
A Generally Robust Approach for Testing Hypotheses and Setting Confidence Intervals for Effect Sizes
ERIC Educational Resources Information Center
Keselman, H. J.; Algina, James; Lix, Lisa M.; Wilcox, Rand R.; Deering, Kathleen N.
2008-01-01
Standard least squares analysis of variance methods suffer from poor power under arbitrarily small departures from normality and fail to control the probability of a Type I error when standard assumptions are violated. This article describes a framework for robust estimation and testing that uses trimmed means with an approximate degrees of…
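One common way this framework is operationalized is Yuen's test on trimmed means with winsorized variances; the sketch below assumes the usual 20% trimming and is not necessarily the article's exact procedure.

```python
import numpy as np
from scipy import stats

def yuen_test(a, b, trim=0.2):
    """Yuen's two-sample test on trimmed means (robust alternative to t)."""
    def winsorized_var(x):
        g = int(np.floor(trim * len(x)))
        xs = np.sort(x)
        xs[:g], xs[len(x) - g:] = xs[g], xs[len(x) - g - 1]  # winsorize tails
        return np.var(xs, ddof=1)
    na, nb = len(a), len(b)
    ha = na - 2 * int(np.floor(trim * na))   # effective sizes after trimming
    hb = nb - 2 * int(np.floor(trim * nb))
    da = (na - 1) * winsorized_var(a) / (ha * (ha - 1))
    db = (nb - 1) * winsorized_var(b) / (hb * (hb - 1))
    t = (stats.trim_mean(a, trim) - stats.trim_mean(b, trim)) / np.sqrt(da + db)
    df = (da + db) ** 2 / (da**2 / (ha - 1) + db**2 / (hb - 1))
    return t, 2 * stats.t.sf(abs(t), df)     # statistic and two-sided p

rng = np.random.default_rng(4)
print(yuen_test(rng.standard_t(3, 30) + 0.5, rng.standard_t(3, 30)))
```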
Z-Score Demystified: A Critical Analysis of the Sri Lankan University Admission Policy
ERIC Educational Resources Information Center
Warnapala, Yajni; Silva, Karishma
2011-01-01
In the year 2001, the University Grants Commission of Sri Lanka successfully appealed to change the method of determining the cut-off scores for university admissions from raw scores to standardized z-scores. This standardization allegedly eliminated the discrepancy caused by the assumption of equal difficulty levels across all subjects. This…
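The standardization itself is a one-liner per subject: each raw mark is re-expressed relative to its own subject's mean and standard deviation, so cut-offs compare like with like across papers of unequal difficulty. A toy illustration with invented marks:

```python
import numpy as np

# Hypothetical raw marks in two subjects with different difficulty levels.
physics = np.array([35.0, 48.0, 52.0, 61.0, 70.0])
art     = np.array([62.0, 71.0, 75.0, 80.0, 88.0])

def z(scores):
    # Standardize against that subject's own cohort mean and SD, so a mark
    # is judged relative to how everyone fared on the same paper.
    return (scores - scores.mean()) / scores.std(ddof=1)

# A 61 in the harder physics paper outranks a 75 in art on the z-scale.
print(z(physics))
print(z(art))
```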
Jules Verne's "Around the World in Eighty Days": Helping Teach the National Geography Standards
ERIC Educational Resources Information Center
Donaldson, Daniel P.; Kuhlke, Olaf
2009-01-01
Consistent with developments in American education pedagogy, geography educators have made great strides exploring a wide range of high- and low-tech methods for teaching and learning geographic concepts. This article draws on a qualitative analysis of essays in which college students discuss tenets of the National Geography Standards in the…
Wang, Li-Li; Zhang, Yun-Bin; Sun, Xiao-Ya; Chen, Sui-Qing
2016-05-08
To establish a quantitative analysis of multi-components by a single marker (QAMS) method for quality evaluation and validate its feasibility by the simultaneous quantitative assay of four main components in Linderae Reflexae Radix. Four main components, pinostrobin, pinosylvin, pinocembrin, and 3,5-dihydroxy-2-(1-p-mentheneyl)-trans-stilbene, were selected as analytes for quality evaluation by RP-HPLC coupled with a UV detector. The method was evaluated by comparing the quantitative results of the external standard method and QAMS on different HPLC systems. The results showed no significant differences between the contents of the four components of Linderae Reflexae Radix determined by the external standard method and by QAMS (RSD <3%). The contents of the four analytes (pinosylvin, pinocembrin, pinostrobin, and Reflexanbene I) in Linderae Reflexae Radix were determined from the single marker pinosylvin. The fingerprints were chromatograms determined on Shimadzu LC-20AT and Waters e2695 HPLC systems equipped with three different columns.
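A QAMS computation can be sketched from its definition: a relative correction factor, determined once from a mixed standard, links each analyte's detector response to the single marker's, so routine runs need only the marker standard. All peak areas and concentrations below are invented, not taken from the paper.

```python
import numpy as np

# One-time calibration with a mixed standard: peak areas and known
# concentrations (mg/mL) for the marker (pinosylvin) and the other analytes.
marker_area, marker_conc = 1250.0, 0.050
other_areas = np.array([980.0, 1430.0, 760.0])   # hypothetical analytes
other_concs = np.array([0.040, 0.055, 0.030])

r_marker = marker_area / marker_conc             # marker response factor
f = r_marker / (other_areas / other_concs)       # relative correction factors

# Routine run: only the marker standard is injected; every analyte is then
# quantified from its own peak area via its stored correction factor.
sample_areas = np.array([500.0, 800.0, 300.0])
sample_concs = sample_areas * f / r_marker
print(f.round(3), sample_concs.round(4))
```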
Sochor, Jiri; Ryvolova, Marketa; Krystofova, Olga; Salas, Petr; Hubalek, Jaromir; Adam, Vojtech; Trnkova, Libuse; Havel, Ladislav; Beklova, Miroslava; Zehnalek, Josef; Provaznik, Ivo; Kizek, Rene
2010-11-29
The aim of this study was to describe the behaviour, kinetics, time courses, and limitations of six different fully automated spectrometric methods: DPPH, TEAC, FRAP, DMPD, Free Radicals, and Blue CrO5. Absorption curves were measured and absorbance maxima were found. All methods were calibrated using the standard compounds Trolox® and/or gallic acid. Calibration curves were determined (relative standard deviation was within the range of 1.5 to 2.5%). The obtained characteristics were compared and discussed. Moreover, the data obtained were used to optimize and automate all of the mentioned protocols. The automatic analyzer allowed us to analyse a larger set of samples simultaneously, decrease the measurement time, eliminate errors, and provide data of higher quality than manual analysis. The total time of analysis for one sample was decreased to 10 min for all six methods; by contrast, the total time of manual spectrometric determination was approximately 120 min. The data obtained showed good correlations between the studied methods (R = 0.97-0.99).
Abdalla, Amir A.; Smith, Robert E.
2013-01-01
Mercury has been determined in Ayurvedic dietary supplements (Trifala, Trifala Guggulu, Turmeric, Mahasudarshan, Yograj, Shatawari, Hingwastika, Shatavari, and Shilajit) by inductively coupled plasma-mass spectrometry (ICP-MS) and by direct mercury analysis using the Hydra-C direct mercury analyzer (Teledyne Leeman Labs, Hudson, NH, USA). Similar results were obtained from the two methods, but the direct mercury analysis method was much faster and safer and required no microwave digestion (unlike ICP-MS). Levels of mercury ranged from 0.002 to 56 μg/g in samples of dietary supplements. Standard reference materials Ephedra 3240 and tomato leaves from the National Institute of Standards and Technology (NIST) and dogfish liver (DOLT-3) from the National Research Council Canada were analyzed using the Hydra-C method. Average mercury recoveries were 102% (RSD 0.0018%), 100% (RSD 0.0009%), and 101% (RSD 0.0729%), respectively. The limit of quantitation of the Hydra-C method was 0.5 ng. PMID:23710181
McLain, B.J.
1993-01-01
Graphite furnace atomic absorption spectrophotometry is a sensitive, precise, and accurate method for the determination of chromium in natural water samples. The detection limit for this analytical method is 0.4 microg/L with a working linear limit of 25.0 microg/L. The precision at the detection limit ranges from 20 to 57 percent relative standard deviation (RSD), improving to 4.6 percent RSD for concentrations above 3 microg/L. The accuracy of this method was determined for a variety of reference standards that were representative of the analytical range; the results were within the established standard deviations. Samples spiked with known concentrations of chromium gave recoveries ranging from 84 to 122 percent. In addition, a comparison of data between graphite furnace atomic absorption spectrophotometry and direct-current plasma atomic emission spectrometry showed suitable agreement between the two methods, with an average deviation of +/- 2.0 microg/L throughout the analytical range.
Verification of spectrophotometric method for nitrate analysis in water samples
NASA Astrophysics Data System (ADS)
Kurniawati, Puji; Gusrianti, Reny; Dwisiwi, Bledug Bernanti; Purbaningtias, Tri Esti; Wiyantoko, Bayu
2017-12-01
The aim of this research was to verify a spectrophotometric method for analyzing nitrate in water samples using the APHA 2012 Section 4500 NO3-B method. The verification parameters were linearity, method detection limit, limit of quantitation, level of linearity, accuracy, and precision. Linearity was assessed using 0 to 50 mg/L nitrate standard solutions, and the correlation coefficient of the standard calibration linear regression was 0.9981. The method detection limit (MDL) was 0.1294 mg/L and the limit of quantitation (LOQ) was 0.4117 mg/L. The level of linearity (LOL) was 50 mg/L, and nitrate concentrations from 10 to 50 mg/L were linear at a 99% level of confidence. The accuracy, determined from the recovery value, was 109.1907%. The precision, evaluated as the percent relative standard deviation (%RSD) of repeatability, was 1.0886%. The tested performance criteria showed that the method was verified under the laboratory conditions.
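Figures such as the MDL and LOQ above are commonly derived from replicate low-level spikes via Student's t; the sketch below shows one conventional recipe (the replicate values are invented, and the exact APHA procedure may differ in detail):

```python
import numpy as np
from scipy import stats

# Hypothetical replicate measurements of a low-level nitrate spike (mg/L).
replicates = np.array([0.48, 0.52, 0.45, 0.50, 0.47, 0.53, 0.49])

s = replicates.std(ddof=1)            # standard deviation of the replicates
n = len(replicates)
t99 = stats.t.ppf(0.99, df=n - 1)     # one-sided 99% Student's t value

mdl = t99 * s                         # method detection limit (t * s recipe)
loq = 10 * s                          # a common limit-of-quantitation convention
print(f"MDL = {mdl:.4f} mg/L, LOQ = {loq:.4f} mg/L")
```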
NASA Astrophysics Data System (ADS)
Jin, Yang; Ciwei, Gao; Jing, Zhang; Min, Sun; Jie, Yu
2017-05-01
The selection and evaluation of priority domains in Global Energy Internet standard development will help to overcome the limits of national investment, so that priority can be given to standardizing the technical areas of highest urgency and feasibility. Therefore, in this paper, a Delphi survey process based on technology foresight is put forward, an evaluation index system for priority domains is established, and the index calculation method is determined. Statistical methods are then used to evaluate the alternative domains. Finally, the top four priority domains are determined as follows: Interconnected Network Planning and Simulation Analysis, Interconnected Network Safety Control and Protection, Intelligent Power Transmission and Transformation, and Internet of Things.
Automatic Error Analysis Using Intervals
ERIC Educational Resources Information Center
Rothwell, E. J.; Cloud, M. J.
2012-01-01
A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
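The contrast drawn in this abstract is easy to reproduce in miniature; the sketch below propagates measurement intervals through a simple formula with a toy interval class (INTLAB itself is a MATLAB toolbox, so this Python stand-in and all values are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float
    def __add__(self, o): return Interval(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, o): return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        p = [self.lo*o.lo, self.lo*o.hi, self.hi*o.lo, self.hi*o.hi]
        return Interval(min(p), max(p))
    def __truediv__(self, o):
        # Assumes the divisor interval excludes zero.
        return self * Interval(1.0/o.hi, 1.0/o.lo)

# Measured values with absolute uncertainties (invented numbers).
R = Interval(99.0, 101.0)    # resistance: 100 +/- 1 ohm
V = Interval(11.8, 12.2)     # voltage:    12  +/- 0.2 V

# Power P = V*V/R propagated through interval arithmetic yields a
# rigorous enclosure, with no derivative bookkeeping required.
P = V * V / R
print(f"P in [{P.lo:.3f}, {P.hi:.3f}] W")
```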
Attia, Khalid A M; Nassar, Mohammed W I; El-Zeiny, Mohamed B; Serag, Ahmed
2016-05-15
Three different spectrophotometric methods were applied for the quantitative analysis of flucloxacillin and amoxicillin in their binary mixture, namely, ratio subtraction, absorbance subtraction and amplitude modulation. A comparative study was done listing the advantages and the disadvantages of each method. All the methods were validated according to the ICH guidelines and the obtained accuracy, precision and repeatability were found to be within the acceptable limits. The selectivity of the proposed methods was tested using laboratory prepared mixtures and assessed by applying the standard addition technique. So, they can be used for the routine analysis of flucloxacillin and amoxicillin in their binary mixtures.
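Of the three methods named, ratio subtraction lends itself to a compact illustration; the sketch below applies the principle to synthetic Gaussian "spectra" (invented shapes and concentrations, not real flucloxacillin/amoxicillin data):

```python
import numpy as np

# Ratio-subtraction principle on synthetic spectra.
wl = np.linspace(200, 320, 601)
band = lambda c, w: np.exp(-0.5 * ((wl - c) / w) ** 2)

x_unit = band(230, 12)                   # unit spectrum of component X
y_unit = band(260, 25)                   # unit spectrum of Y; extends beyond X
mixture = 0.6 * x_unit + 0.5 * y_unit    # "unknown" binary mixture
divisor = 0.8 * y_unit                   # standard spectrum of Y

ratio = mixture / divisor                # = (0.6/0.8)*x/y + 0.5/0.8 (constant)

# In the plateau region where X no longer absorbs, the ratio is constant:
plateau = ratio[wl > 300].mean()               # ~ 0.5/0.8 = 0.625
recovered_x = (ratio - plateau) * divisor      # isolates 0.6 * x_unit
print(f"plateau = {plateau:.4f}, "
      f"recovered X peak = {recovered_x.max():.4f} (true value 0.6)")
```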
Preiksaitis, J.; Tong, Y.; Pang, X.; Sun, Y.; Tang, L.; Cook, L.; Pounds, S.; Fryer, J.; Caliendo, A. M.
2015-01-01
Quantitative detection of cytomegalovirus (CMV) DNA has become a standard part of care for many groups of immunocompromised patients; recent development of the first WHO international standard for human CMV DNA has raised hopes of reducing interlaboratory variability of results. Commutability of reference material has been shown to be necessary if such material is to reduce variability among laboratories. Here we evaluated the commutability of the WHO standard using 10 different real-time quantitative CMV PCR assays run by eight different laboratories. Test panels, including aliquots of 50 patient samples (40 positive samples and 10 negative samples) and lyophilized CMV standard, were run, with each testing center using its own quantitative calibrators, reagents, and nucleic acid extraction methods. Commutability was assessed both on a pairwise basis and over the entire group of assays, using linear regression and correspondence analyses. Commutability of the WHO material differed among the tests that were evaluated, and these differences appeared to vary depending on the method of statistical analysis used and the cohort of assays included in the analysis. Depending on the methodology used, the WHO material showed poor or absent commutability with up to 50% of assays. Determination of commutability may require a multifaceted approach; the lack of commutability seen when using the WHO standard with several of the assays here suggests that further work is needed to bring us toward true consensus. PMID:26269622
Fast, Exact Bootstrap Principal Component Analysis for p > 1 million
Fisher, Aaron; Caffo, Brian; Schwartz, Brian; Zipunnikov, Vadim
2015-01-01
Many have suggested a bootstrap procedure for estimating the sampling variability of principal component analysis (PCA) results. However, when the number of measurements per subject (p) is much larger than the number of subjects (n), calculating and storing the leading principal components from each bootstrap sample can be computationally infeasible. To address this, we outline methods for fast, exact calculation of bootstrap principal components, eigenvalues, and scores. Our methods leverage the fact that all bootstrap samples occupy the same n-dimensional subspace as the original sample. As a result, all bootstrap principal components are limited to the same n-dimensional subspace and can be efficiently represented by their low dimensional coordinates in that subspace. Several uncertainty metrics can be computed solely based on the bootstrap distribution of these low dimensional coordinates, without calculating or storing the p-dimensional bootstrap components. Fast bootstrap PCA is applied to a dataset of sleep electroencephalogram recordings (p = 900, n = 392), and to a dataset of brain magnetic resonance images (MRIs) (p ≈ 3 million, n = 352). For the MRI dataset, our method allows for standard errors for the first 3 principal components based on 1000 bootstrap samples to be calculated on a standard laptop in 47 minutes, as opposed to approximately 4 days with standard methods. PMID:27616801
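The key identity, that every bootstrap sample lies in the original n-dimensional column space, can be demonstrated in a few lines; this is a hedged reimplementation of the idea on synthetic data, not the authors' released code:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, B, k = 5000, 100, 200, 3   # measurements, subjects, bootstrap draws, PCs

# Synthetic data, columns = subjects; center each measurement across subjects.
X = rng.normal(size=(p, n))
X -= X.mean(axis=1, keepdims=True)

# One thin SVD of the big matrix: X = V @ diag(s) @ Ut, with V of shape p x n.
V, s, Ut = np.linalg.svd(X, full_matrices=False)
A = np.diag(s) @ Ut               # n x n coordinates of X in the basis V

low_dim_pcs = np.empty((B, n, k)) # bootstrap PCs, stored in n dimensions
for b in range(B):
    idx = rng.integers(0, n, size=n)        # resample subjects with replacement
    Ab = A[:, idx]
    Ab -= Ab.mean(axis=1, keepdims=True)    # re-center the bootstrap sample
    Ub, _, _ = np.linalg.svd(Ab, full_matrices=False)  # small n x n SVD
    low_dim_pcs[b] = Ub[:, :k]    # full PCs would be V @ Ub (never formed)

# Exact bootstrap standard errors for the first full-dimensional PC:
u1 = low_dim_pcs[:, :, 0]                   # B x n coordinates of PC1
sign = np.sign(u1 @ u1.mean(axis=0))        # fix the sign indeterminacy
u1 = u1 * sign[:, None]
cov_u1 = np.cov(u1, rowvar=False)           # n x n bootstrap covariance
se_pc1 = np.sqrt(np.einsum('pi,ij,pj->p', V, cov_u1, V))
print(se_pc1.shape)               # (p,) obtained without any p x B storage
```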
Li, Fuqin; Shi, Lihong; Wang, Fei; Sun, Caiyuan; Kang, Di; Zhang, Yuping; Chen, Lingzhu; Hu, Deyu
2017-06-08
A QuEChERS liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed for the determination of pyraclostrobin, thiophanate-methyl, and its metabolite carbendazim in soil and citrus. The samples were extracted with methanol or acetonitrile, purified by primary secondary amine (PSA), separated by LC, and detected in multiple reaction monitoring (MRM) mode via positive electrospray ionization. The analytes were quantified against matrix-matched standard solutions using the external standard method. The limits of quantification (LOQs) of pyraclostrobin, thiophanate-methyl, and carbendazim in the different matrices were 5.8-7.0 μg/kg, 9.3-14.1 μg/kg, and 2.1-2.6 μg/kg, respectively. For all samples, the spiked recoveries ranged from 75.48% to 109.18%, and the relative standard deviations (RSDs) were 0.60%-5.11% (n = 5). The method is quick, easy, effective, sensitive, and accurate. The matrix-matched calibration solutions efficiently compensate for matrix effects of pyraclostrobin, thiophanate-methyl, and carbendazim in LC-MS/MS analysis. The established method can be applied to residue analysis of real samples of soil, citrus peel, citrus pulp, and citrus fruit.
Zhang, Yi-Bei; DA, Juan; Zhang, Jing-Xian; Li, Shang-Rong; Chen, Xin; Long, Hua-Li; Wang, Qiu-Rong; Cai, Lu-Ying; Yao, Shuai; Hou, Jin-Jun; Wu, Wan-Ying; Guo, De-An
2017-04-01
Aconiti Lateralis Radix Praeparata (Fuzi) is a traditional Chinese medicine commonly used in the clinic for its potency in restoring yang and rescuing patients from collapse. Aconitum alkaloids, mainly including monoester-diterpenoid aconitines (MDAs) and diester-diterpenoid aconitines (DDAs), are considered to be both its bioactive and its toxic constituents. In the present study, a feasible, economical, and accurate HPLC method for the simultaneous determination of six alkaloid markers using the Single Standard for Determination of Multi-Components (SSDMC) approach was developed and fully validated. Benzoylmesaconine was used as the unique reference standard. The method was proven to be accurate (recoveries between 97.5% and 101.8%, RSD < 3%), precise (RSD 0.63%-2.05%), and linear (R > 0.9999) over the concentration ranges, and was subsequently applied to the quantitative evaluation of 62 batches of samples, of which 45 batches were from good manufacturing practice (GMP) facilities and 17 batches from the drug market. The contents were then analyzed by principal component analysis (PCA) and a homogeneity test. The present study provides valuable information for improving the quality standard of Aconiti Lateralis Radix Praeparata. The developed method also has potential in the analysis of other Aconitum species, such as Aconitum carmichaelii (prepared parent root) and Aconitum kusnezoffii (prepared root).
Tweedell, Andrew J.; Haynes, Courtney A.
2017-01-01
The timing of muscle activity is a commonly applied analytic method to understand how the nervous system controls movement. This study systematically evaluates six classes of standard and statistical algorithms to determine muscle onset in both experimental surface electromyography (EMG) and simulated EMG with a known onset time. Eighteen participants had EMG collected from the biceps brachii and vastus lateralis while performing a biceps curl or knee extension, respectively. Three established methods and three statistical methods for EMG onset were evaluated. Linear envelope, Teager-Kaiser energy operator + linear envelope and sample entropy were the established methods evaluated while general time series mean/variance, sequential and batch processing of parametric and nonparametric tools, and Bayesian changepoint analysis were the statistical techniques used. Visual EMG onset (experimental data) and objective EMG onset (simulated data) were compared with algorithmic EMG onset via root mean square error and linear regression models for stepwise elimination of inferior algorithms. The top algorithms for both data types were analyzed for their mean agreement with the gold standard onset and evaluation of 95% confidence intervals. The top algorithms were all Bayesian changepoint analysis iterations where the parameter of the prior (p0) was zero. The best performing Bayesian algorithms were p0 = 0 and a posterior probability for onset determination at 60–90%. While existing algorithms performed reasonably, the Bayesian changepoint analysis methodology provides greater reliability and accuracy when determining the singular onset of EMG activity in a time series. Further research is needed to determine if this class of algorithms perform equally well when the time series has multiple bursts of muscle activity. PMID:28489897
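The flavor of single-changepoint Bayesian onset detection can be sketched as follows, using a Gaussian mean-shift model on a log-transformed simulated envelope with a flat prior over onset location; this is a simplified stand-in for the specific algorithms evaluated in the study, not a reimplementation of them:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated rectified EMG envelope: baseline noise, then a burst at t = 300.
n, onset_true = 600, 300
signal = np.abs(rng.normal(0.05, 0.02, n))
signal[onset_true:] += np.abs(rng.normal(0.5, 0.15, n - onset_true))
y = np.log(signal)                     # log transform stabilizes the variance

# Single Gaussian mean-shift changepoint: flat prior over location,
# profile likelihood over the two segment means, common plug-in variance.
taus = np.arange(2, n - 2)
loglik = np.empty(len(taus))
for i, t in enumerate(taus):
    rss = ((y[:t] - y[:t].mean())**2).sum() + ((y[t:] - y[t:].mean())**2).sum()
    loglik[i] = -0.5 * n * np.log(rss / n)

post = np.exp(loglik - loglik.max())
post /= post.sum()                     # posterior over the changepoint location

map_onset = taus[post.argmax()]
# Credible-mass criterion: earliest point where cumulative posterior > 60%.
cred_onset = taus[np.searchsorted(np.cumsum(post), 0.60)]
print(f"true={onset_true}, MAP={map_onset}, 60%-mass={cred_onset}")
```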
Chen, Xinyuan; Dai, Jianrong
2018-05-01
Magnetic resonance imaging (MRI) simulation differs from diagnostic MRI in purpose, technical requirements, and implementation. We propose a semiautomatic method for image acceptance and commissioning of the scanner, the radiofrequency (RF) coils, and the pulse sequences of an MRI simulator. The ACR MRI accreditation large phantom was used for image quality analysis with seven parameters. Standard ACR sequences with a split head coil were adopted to examine the scanner's basic performance. The performance of the simulation RF coils was measured and compared, using the standard sequence, against different clinical diagnostic coils. We used simulation sequences with simulation coils to test image quality and the advanced performance of the scanner. Code and procedures were developed for semiautomatic image quality analysis. When using standard ACR sequences with a split head coil, image quality passed all ACR recommended criteria. The image intensity uniformity with a simulation RF coil decreased by about 34% compared with the eight-channel diagnostic head coil, while the other six image quality parameters were acceptable. These image quality parameters could be improved to more than 85% by built-in intensity calibration methods. In the simulation sequence tests, the contrast resolution was sensitive to the FOV and matrix settings. The geometric distortion of simulation sequences such as T1-weighted and T2-weighted images was well controlled at the isocenter and 10 cm off-center, within a range of ±1% (2 mm). We developed a semiautomatic image quality analysis method for quantitative evaluation of images and commissioning of an MRI simulator. Baseline performance of the simulation RF coils and pulse sequences has been established for routine QA.
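One representative ACR parameter, percent integral uniformity (PIU), illustrates the kind of computation such semiautomatic analysis performs; the sketch below applies the standard PIU formula to a synthetic phantom image (the ROI handling and percentile windowing are simplifying assumptions, not the ACR procedure verbatim):

```python
import numpy as np

def percent_integral_uniformity(image, roi_mask):
    """PIU = 100 * (1 - (max - min) / (max + min)) over a phantom ROI.

    In practice the max/min come from small averaging windows inside the
    ROI to suppress noise; percentiles are used here as a simple stand-in.
    """
    vals = image[roi_mask]
    hi, lo = np.percentile(vals, 99.5), np.percentile(vals, 0.5)
    return 100.0 * (1.0 - (hi - lo) / (hi + lo))

# Synthetic uniform phantom slice with mild radial shading and noise.
yy, xx = np.mgrid[-128:128, -128:128]
r = np.hypot(xx, yy)
image = (1000.0 * (1 - 0.1 * (r / 128) ** 2)
         + np.random.default_rng(2).normal(0, 5, r.shape))
roi = r < 100                          # circular ROI inside the phantom

print(f"PIU = {percent_integral_uniformity(image, roi):.1f}%")
```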