Sample records for reference method results

  1. Valid analytical performance specifications for combined analytical bias and imprecision for the use of common reference intervals.

    PubMed

    Hyltoft Petersen, Per; Lund, Flemming; Fraser, Callum G; Sandberg, Sverre; Sölétormos, György

    2018-01-01

    Background Many clinical decisions are based on comparison of patient results with reference intervals. Therefore, an estimation of the analytical performance specifications for the quality that would be required to allow sharing common reference intervals is needed. The International Federation of Clinical Chemistry (IFCC) recommended a minimum of 120 reference individuals to establish reference intervals. This number implies a certain level of quality, which could then be used for defining analytical performance specifications as the maximum combination of analytical bias and imprecision required for sharing common reference intervals, the aim of this investigation. Methods Two methods were investigated for defining the maximum combination of analytical bias and imprecision that would give the same quality of common reference intervals as the IFCC recommendation. Method 1 is based on a formula for the combination of analytical bias and imprecision and Method 2 is based on the Microsoft Excel formula NORMINV including the fractional probability of reference individuals outside each limit and the Gaussian variables of mean and standard deviation. The combinations of normalized bias and imprecision are illustrated for both methods. The formulae are identical for Gaussian and log-Gaussian distributions. Results Method 2 gives the correct results with a constant percentage of 4.4% for all combinations of bias and imprecision. Conclusion The Microsoft Excel formula NORMINV is useful for the estimation of analytical performance specifications for both Gaussian and log-Gaussian distributions of reference intervals.
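
    A minimal Python sketch of the kind of inverse/cumulative-normal calculation that Method 2 performs with Excel's NORMINV (scipy.stats.norm.ppf is the equivalent of NORMINV with mean 0 and SD 1). The normalization of bias and imprecision to the reference-population SD and the fixed 2.5%/97.5% limits are assumptions made for illustration; the paper's exact criterion (a constant 4.4% outside the limits) is not reproduced.

      # Illustrative sketch only, not the authors' formula.
      from scipy.stats import norm

      def fraction_outside(bias, imprecision_ratio):
          """Fraction of results falling outside the original 2.5th/97.5th percentile
          reference limits when the assay adds a bias and extra imprecision, both
          expressed in multiples of the reference-population SD."""
          lower = norm.ppf(0.025)          # Excel: NORMINV(0.025, 0, 1)
          upper = norm.ppf(0.975)          # Excel: NORMINV(0.975, 0, 1)
          sd_total = (1.0 + imprecision_ratio ** 2) ** 0.5
          below = norm.cdf(lower, loc=bias, scale=sd_total)
          above = 1.0 - norm.cdf(upper, loc=bias, scale=sd_total)
          return below + above

      # Fraction outside the common limits for a few bias/imprecision combinations:
      for b in (0.0, 0.125, 0.25):
          for s in (0.0, 0.25, 0.5):
              print(f"bias={b:5.3f}  imprecision={s:4.2f}  outside={fraction_outside(b, s):.3%}")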

  2. Fast lossless compression via cascading Bloom filters

    PubMed Central

    2014-01-01

    Background Data from large Next Generation Sequencing (NGS) experiments present challenges both in terms of costs associated with storage and in time required for file transfer. It is sometimes possible to store only a summary relevant to particular applications, but generally it is desirable to keep all information needed to revisit experimental results in the future. Thus, the need for efficient lossless compression methods for NGS reads arises. It has been shown that NGS-specific compression schemes can improve results over generic compression methods, such as the Lempel-Ziv algorithm, Burrows-Wheeler transform, or Arithmetic Coding. When a reference genome is available, effective compression can be achieved by first aligning the reads to the reference genome, and then encoding each read using the alignment position combined with the differences in the read relative to the reference. These reference-based methods have been shown to compress better than reference-free schemes, but the alignment step they require demands several hours of CPU time on a typical dataset, whereas reference-free methods can usually compress in minutes. Results We present a new approach that achieves highly efficient compression by using a reference genome, but completely circumvents the need for alignment, affording a great reduction in the time needed to compress. In contrast to reference-based methods that first align reads to the genome, we hash all reads into Bloom filters to encode, and decode by querying the same Bloom filters using read-length subsequences of the reference genome. Further compression is achieved by using a cascade of such filters. Conclusions Our method, called BARCODE, runs an order of magnitude faster than reference-based methods, while compressing an order of magnitude better than reference-free methods, over a broad range of sequencing coverage. In high coverage (50-100 fold), compared to the best tested compressors, BARCODE saves 80-90% of the running time while only increasing space slightly. PMID:25252952
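
    To make the encode/decode idea concrete, here is a minimal, self-contained Bloom-filter sketch in Python: reads are hashed into a bit array, and decoding scans read-length windows of the reference and queries the filter. It is a toy illustration (one filter, tiny parameters, no handling of false positives or of reads absent from the reference), not the BARCODE implementation.

      import hashlib

      class Bloom:
          def __init__(self, m_bits=1 << 16, k=4):
              self.m, self.k, self.bits = m_bits, k, bytearray(m_bits // 8)

          def _positions(self, item):
              for i in range(self.k):
                  h = hashlib.sha256(f"{i}:{item}".encode()).digest()
                  yield int.from_bytes(h[:8], "big") % self.m

          def add(self, item):
              for p in self._positions(item):
                  self.bits[p // 8] |= 1 << (p % 8)

          def __contains__(self, item):
              return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(item))

      # "Encode": hash every read into the filter (only the filter is stored/transmitted).
      reference = "ACGTACGTTTGACCAGTACGGTACCGTTAGCA"
      read_len = 8
      reads = {reference[i:i + read_len] for i in (0, 5, 12, 20)}
      bf = Bloom()
      for r in reads:
          bf.add(r)

      # "Decode": slide read-length windows over the reference and query the filter.
      recovered = {reference[i:i + read_len]
                   for i in range(len(reference) - read_len + 1)
                   if reference[i:i + read_len] in bf}
      print(sorted(recovered))   # contains the encoded reads (plus possible false positives)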

  3. External quality assurance programs as a tool for verifying standardization of measurement procedures: Pilot collaboration in Europe.

    PubMed

    Perich, C; Ricós, C; Alvarez, V; Biosca, C; Boned, B; Cava, F; Doménech, M V; Fernández-Calle, P; Fernández-Fernández, P; García-Lario, J V; Minchinela, J; Simón, M; Jansen, R

    2014-05-15

    Current external quality assurance schemes have been classified into six categories, according to their ability to verify the degree of standardization of the participating measurement procedures. SKML (Netherlands) is a Category 1 EQA scheme (commutable EQA materials with values assigned by reference methods), whereas SEQC (Spain) is a Category 5 scheme (replicate analyses of non-commutable materials with no values assigned by reference methods). The results obtained by a group of Spanish laboratories participating in a pilot study organized by SKML are examined, with the aim of pointing out the improvements over our current scheme that a Category 1 program could provide. Imprecision and bias are calculated for each analyte and laboratory, and compared with quality specifications derived from biological variation. Of the 26 analytes studied, 9 had results comparable with those from reference methods, and 10 analytes did not have comparable results. The remaining 7 analytes measured did not have available reference method values, and in these cases, comparison with the peer group showed comparable results. The reasons for disagreement in the second group can be summarized as: use of non-standard methods (IFCC without exogenous pyridoxal phosphate for AST and ALT, Jaffé kinetic at low-normal creatinine concentrations and with eGFR); non-commutability of the reference material used to assign values to the routine calibrator (calcium, magnesium and sodium); use of reference materials without established commutability instead of reference methods for AST and GGT, and lack of a systematic effort by manufacturers to harmonize results. Results obtained in this work demonstrate the important role of external quality assurance programs using commutable materials with values assigned by reference methods to correctly monitor the standardization of laboratory tests with consequent minimization of risk to patients. Copyright © 2013 Elsevier B.V. All rights reserved.
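
    A small Python sketch of the comparison described above: bias and imprecision for one analyte and laboratory are computed from replicate results against a reference-method target and checked against the usual biological-variation-derived specifications (allowable imprecision <= 0.5 CVI, allowable bias <= 0.25 sqrt(CVI^2 + CVG^2)). The replicate values and CVI/CVG figures are placeholders, not data from the study.

      import statistics as st

      def evaluate(replicates, reference_value, cv_i, cv_g):
          """Compare a laboratory's bias and CV with biological-variation specifications."""
          mean = st.mean(replicates)
          cv_pct = 100 * st.stdev(replicates) / mean
          bias_pct = 100 * (mean - reference_value) / reference_value
          spec_cv = 0.5 * cv_i
          spec_bias = 0.25 * (cv_i ** 2 + cv_g ** 2) ** 0.5
          return {"CV%": round(cv_pct, 2), "bias%": round(bias_pct, 2),
                  "CV ok": cv_pct <= spec_cv, "bias ok": abs(bias_pct) <= spec_bias}

      # Hypothetical EQA replicates for one analyte and one laboratory:
      print(evaluate([2.41, 2.39, 2.44, 2.40], reference_value=2.35, cv_i=1.9, cv_g=5.5))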

  4. Different equation-of-motion coupled cluster methods with different reference functions: The formyl radical

    NASA Astrophysics Data System (ADS)

    Kuś, Tomasz; Bartlett, Rodney J.

    2008-09-01

    The doublet and quartet excited states of the formyl radical have been studied by the equation-of-motion (EOM) coupled cluster (CC) method. The Sz spin-conserving singles and doubles (EOM-EE-CCSD) and singles, doubles, and triples (EOM-EE-CCSDT) approaches, as well as the spin-flipped singles and doubles (EOM-SF-CCSD) method have been applied, subject to unrestricted Hartree-Fock (HF), restricted open-shell HF, and quasirestricted HF references. The structural parameters, vertical and adiabatic excitation energies, and harmonic vibrational frequencies have been calculated. The issue of the reference function choice for the spin-flipped (SF) method and its impact on the results has been discussed using the experimental data and theoretical results available. The results show that if the appropriate reference function is chosen so that target states differ from the reference by only single excitations, then EOM-EE-CCSD and EOM-SF-CCSD methods give a very good description of the excited states. For the states that have a non-negligible contribution of the doubly excited configurations one is able to use the SF method with such a reference function, that in most cases the performance of the EOM-SF-CCSD method is better than that of the EOM-EE-CCSD approach.

  5. Measurement of susceptibility artifacts with histogram-based reference value on magnetic resonance images according to standard ASTM F2119.

    PubMed

    Heinrich, Andreas; Teichgräber, Ulf K; Güttler, Felix V

    2015-12-01

    The standard ASTM F2119 describes a test method for measuring the size of a susceptibility artifact based on the example of a passive implant. A pixel in an image is considered to be a part of an image artifact if the intensity is changed by at least 30% in the presence of a test object, compared to a reference image in which the test object is absent (reference value). The aim of this paper is to simplify and accelerate the test method using a histogram-based reference value. Four test objects were scanned parallel and perpendicular to the main magnetic field, and the largest susceptibility artifacts were measured using two methods of reference value determination (reference image-based and histogram-based reference value). The results between both methods were compared using the Mann-Whitney U-test. The difference between both reference values was 42.35 ± 23.66. The difference of artifact size was 0.64 ± 0.69 mm. The artifact sizes of both methods did not show significant differences; the p-value of the Mann-Whitney U-test was between 0.710 and 0.521. A standard-conform method for a rapid, objective, and reproducible evaluation of susceptibility artifacts could be implemented. The result of the histogram-based method does not significantly differ from the ASTM-conform method.
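
    A NumPy sketch of the two reference-value strategies compared above: a pixel counts as artifact when its intensity changes by at least 30% relative to a reference value taken either pixel-wise from a reference image (test object absent) or as a single histogram-derived value. The synthetic images and the use of the median as the histogram-based value are assumptions for illustration, not the authors' exact implementation.

      import numpy as np

      rng = np.random.default_rng(0)
      reference_image = rng.normal(100, 5, (128, 128))     # scan without the test object
      artifact_image = reference_image.copy()              # scan with the test object
      artifact_image[40:60, 40:60] *= 0.3                  # simulated susceptibility signal void

      def artifact_mask(img, ref, threshold=0.30):
          """Pixels whose intensity changes by at least 30% relative to the reference,
          where ref is either a full reference image (pixel-wise) or a single value."""
          return np.abs(img - ref) >= threshold * np.abs(ref)

      mask_image_based = artifact_mask(artifact_image, reference_image)           # reference-image-based
      mask_hist_based = artifact_mask(artifact_image, np.median(artifact_image))  # histogram-based value
      print("image-based artifact pixels:    ", int(mask_image_based.sum()))
      print("histogram-based artifact pixels:", int(mask_hist_based.sum()))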

  6. Reference interval computation: which method (not) to choose?

    PubMed

    Pavlov, Igor Y; Wilson, Andrew R; Delgado, Julio C

    2012-07-11

    When different methods are applied to reference interval (RI) calculation the results can sometimes be substantially different, especially for small reference groups. If there are no reliable RI data available, there is no way to confirm which method generates results closest to the true RI. We randomly drew samples from a public database for 33 markers. For each sample, RIs were calculated by bootstrapping, parametric, and Box-Cox transformed parametric methods. Results were compared to the values of the population RI. For approximately half of the 33 markers, results of all 3 methods were within 3% of the true reference value. For the other markers, parametric results were either unavailable or deviated considerably from the true values. The transformed parametric method was more accurate than bootstrapping for a sample size of 60, very close to bootstrapping for a sample size of 120, but in some cases unavailable. We recommend against using untransformed parametric calculations to determine RIs. The transformed parametric method utilizing the Box-Cox transformation would be the preferable way to calculate RIs, provided the transformed data satisfy a normality test. If not, bootstrapping is always available, and is almost as accurate and precise as the transformed parametric method. Copyright © 2012 Elsevier B.V. All rights reserved.
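
    A short Python sketch of the two calculation routes compared above, on synthetic skewed data: the nonparametric 2.5th/97.5th percentiles versus the Box-Cox transformed parametric limits (mean +/- 1.96 SD on the transformed scale, back-transformed). The lognormal test data and the normality check are illustrative assumptions.

      import numpy as np
      from scipy import stats, special

      rng = np.random.default_rng(1)
      values = rng.lognormal(mean=3.0, sigma=0.4, size=120)    # skewed synthetic "reference sample"

      # Nonparametric RI: 2.5th and 97.5th percentiles of the raw values
      nonparam = np.percentile(values, [2.5, 97.5])

      # Box-Cox transformed parametric RI: transform, mean +/- 1.96 SD, back-transform
      transformed, lam = stats.boxcox(values)
      limits_t = np.mean(transformed) + np.array([-1.96, 1.96]) * np.std(transformed, ddof=1)
      boxcox_param = special.inv_boxcox(limits_t, lam)

      print("nonparametric RI:      ", np.round(nonparam, 1))
      print("Box-Cox parametric RI: ", np.round(boxcox_param, 1))
      print("normality of transformed data, Shapiro p =", round(stats.shapiro(transformed)[1], 3))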

  7. Estimation of reference intervals from small samples: an example using canine plasma creatinine.

    PubMed

    Geffré, A; Braun, J P; Trumel, C; Concordet, D

    2009-12-01

    According to international recommendations, reference intervals should be determined from at least 120 reference individuals, which often are impossible to achieve in veterinary clinical pathology, especially for wild animals. When only a small number of reference subjects is available, the possible bias cannot be known and the normality of the distribution cannot be evaluated. A comparison of reference intervals estimated by different methods could be helpful. The purpose of this study was to compare reference limits determined from a large set of canine plasma creatinine reference values, and large subsets of this data, with estimates obtained from small samples selected randomly. Twenty sets each of 120 and 27 samples were randomly selected from a set of 1439 plasma creatinine results obtained from healthy dogs in another study. Reference intervals for the whole sample and for the large samples were determined by a nonparametric method. The estimated reference limits for the small samples were minimum and maximum, mean +/- 2 SD of native and Box-Cox-transformed values, 2.5th and 97.5th percentiles by a robust method on native and Box-Cox-transformed values, and estimates from diagrams of cumulative distribution functions. The whole sample had a heavily skewed distribution, which approached Gaussian after Box-Cox transformation. The reference limits estimated from small samples were highly variable. The closest estimates to the 1439-result reference interval for 27-result subsamples were obtained by both parametric and robust methods after Box-Cox transformation but were grossly erroneous in some cases. For small samples, it is recommended that all values be reported graphically in a dot plot or histogram and that estimates of the reference limits be compared using different methods.

  8. Selection of reference standard during method development using the analytical hierarchy process.

    PubMed

    Sun, Wan-yang; Tong, Ling; Li, Dong-xiang; Huang, Jing-yi; Zhou, Shui-ping; Sun, Henry; Bi, Kai-shun

    2015-03-25

    A reference standard is critical for ensuring reliable and accurate method performance. One important issue is how to select the ideal one from the alternatives. Unlike the optimization of parameters, the criteria for a reference standard are often not directly measurable. The aim of this paper is to recommend a quantitative approach for the selection of the reference standard during method development, based on the analytical hierarchy process (AHP) as a decision-making tool. Six alternative single reference standards were assessed for the quantitative analysis of six phenolic acids from Salvia miltiorrhiza and its preparations by ultra-performance liquid chromatography. The AHP model simultaneously considered six criteria related to reference standard characteristics and method performance: feasibility of acquisition, abundance in samples, chemical stability, accuracy, precision and robustness. The priority of each alternative was calculated using the standard AHP analysis method. The results showed that protocatechuic aldehyde is the ideal reference standard, with rosmarinic acid, at about 79.8% of its priority, as the second choice. The determination results verified the evaluation ability of this model. The AHP allowed us to consider the benefits and risks of the alternatives comprehensively. It is an effective and practical tool for the optimization of reference standards during method development. Copyright © 2015 Elsevier B.V. All rights reserved.
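
    A minimal AHP sketch showing how priorities like the 79.8% figure quoted above can be computed: criterion weights and per-criterion alternative scores are the normalized principal eigenvectors of reciprocal pairwise-comparison matrices, and the overall priority is their weighted combination. All matrix entries below are invented for illustration; they are not the paper's judgments.

      import numpy as np

      def priorities(pairwise):
          """Normalized principal eigenvector of a reciprocal pairwise-comparison matrix."""
          vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
          v = np.real(vecs[:, np.argmax(np.real(vals))])
          return v / v.sum()

      # Pairwise comparison of three criteria (e.g. stability vs. accuracy vs. precision)
      criteria = priorities([[1, 3, 5],
                             [1 / 3, 1, 2],
                             [1 / 5, 1 / 2, 1]])

      # Scores of two candidate reference standards under each criterion (one matrix per criterion)
      alt_by_criterion = np.column_stack([
          priorities([[1, 4], [1 / 4, 1]]),   # stability
          priorities([[1, 2], [1 / 2, 1]]),   # accuracy
          priorities([[1, 1], [1, 1]]),       # precision
      ])

      overall = alt_by_criterion @ criteria
      print("overall priorities:", np.round(overall, 3))   # higher = preferred reference standard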

  9. A comparative uncertainty study of the calibration of macrolide antibiotic reference standards using quantitative nuclear magnetic resonance and mass balance methods.

    PubMed

    Liu, Shu-Yu; Hu, Chang-Qin

    2007-10-17

    This study introduces a general quantitative nuclear magnetic resonance (qNMR) method for the calibration of reference standards of macrolide antibiotics. Several qNMR experimental conditions were optimized, including the delay, an important parameter for quantification. Three macrolide antibiotics were used to validate the accuracy of the qNMR method by comparison with results obtained by high performance liquid chromatography (HPLC). The purities of five common reference standards of macrolide antibiotics were measured by the 1H qNMR method and by the mass balance method, and the results of the two methods were compared. qNMR is quick and simple to use, and in new drug research and development it provides a reliable method for purity analysis of reference standards.
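
    For context, this is the standard single-signal qNMR purity relation commonly used in such calibrations, with purity obtained from signal integrals, proton counts, molar masses, weighed masses and the internal-standard purity. The relation is quoted from general qNMR practice rather than extracted from the paper, and the example numbers are hypothetical.

      def qnmr_purity(I_x, I_std, N_x, N_std, M_x, M_std, m_x, m_std, P_std):
          """Purity of the analyte from integrals (I), proton counts (N),
          molar masses (M), weighed masses (m) and the internal-standard purity (P_std)."""
          return (I_x / I_std) * (N_std / N_x) * (M_x / M_std) * (m_std / m_x) * P_std

      # Hypothetical macrolide analyte against an internal standard:
      print(round(qnmr_purity(I_x=1.02, I_std=1.00, N_x=1, N_std=2,
                              M_x=733.9, M_std=194.2, m_x=20.1, m_std=10.3, P_std=0.999), 4))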

  10. [Selection of reference genes of Siraitia grosvenorii by real-time PCR].

    PubMed

    Tu, Dong-ping; Mo, Chang-ming; Ma, Xiao-jun; Zhao, Huan; Tang, Qi; Huang, Jie; Pan, Li-mei; Wei, Rong-chang

    2015-01-01

    Siraitia grosvenorii is a traditional Chinese medicine that is also used as an edible food. In this study, six candidate reference genes were evaluated by real-time quantitative PCR, and their expression stability across different samples was analyzed with geNorm, NormFinder, BestKeeper, the delta-Ct method and RefFinder; reference genes for S. grosvenorii were thereby selected for the first time. The results showed that 18S rRNA was the most stably expressed gene in all samples and was the best reference gene for expression analysis. The study provides guidance for gene expression analysis by qRT-PCR, supplying suitable reference genes to ensure reliable results in studies of differentially expressed genes in biosynthetic and other biological pathways of S. grosvenorii.

  11. Resampling methods in Microsoft Excel® for estimating reference intervals

    PubMed Central

    Theodorsson, Elvar

    2015-01-01

    Computer-intensive resampling/bootstrap methods are feasible when calculating reference intervals from non-Gaussian or small reference samples. Microsoft Excel® in version 2010 or later includes natural functions, which lend themselves well to this purpose including recommended interpolation procedures for estimating 2.5 and 97.5 percentiles.
The purpose of this paper is to introduce the reader to resampling estimation techniques in general and in using Microsoft Excel® 2010 for the purpose of estimating reference intervals in particular.
Parametric methods are preferable to resampling methods when the distribution of observations in the reference samples is Gaussian or can be transformed to that distribution, even when the number of reference samples is less than 120. Resampling methods are appropriate when the distribution of data from the reference samples is non-Gaussian and the number of reference individuals and corresponding samples is of the order of 40. At least 500-1000 random samples with replacement should be taken from the results of measurement of the reference samples. PMID:26527366
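
    A Python sketch of the resampling procedure described above (the paper itself does this with Microsoft Excel worksheet functions): draw many samples with replacement from the reference results, take the 2.5th/97.5th percentiles of each resample, and summarize them. The synthetic data, the resample count and the 90% CI summary are illustrative choices.

      import numpy as np

      rng = np.random.default_rng(42)
      reference_results = rng.lognormal(3.0, 0.35, size=40)   # ~40 reference samples, as discussed above

      B = 1000                                                # at least 500-1000 resamples
      boot = rng.choice(reference_results, size=(B, reference_results.size), replace=True)
      limits = np.percentile(boot, [2.5, 97.5], axis=1)       # a pair of limits per resample

      lower, upper = limits.mean(axis=1)                      # bootstrap estimates of the limits
      lower_ci = np.percentile(limits[0], [5, 95])            # 90% CI of the lower limit
      print(round(lower, 1), round(upper, 1), np.round(lower_ci, 1))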

  12. Resampling methods in Microsoft Excel® for estimating reference intervals.

    PubMed

    Theodorsson, Elvar

    2015-01-01

    Computer-intensive resampling/bootstrap methods are feasible when calculating reference intervals from non-Gaussian or small reference samples. Microsoft Excel® in version 2010 or later includes natural functions, which lend themselves well to this purpose including recommended interpolation procedures for estimating 2.5 and 97.5 percentiles. 
The purpose of this paper is to introduce the reader to resampling estimation techniques in general and in using Microsoft Excel® 2010 for the purpose of estimating reference intervals in particular.
Parametric methods are preferable to resampling methods when the distribution of observations in the reference samples is Gaussian or can be transformed to that distribution, even when the number of reference samples is less than 120. Resampling methods are appropriate when the distribution of data from the reference samples is non-Gaussian and the number of reference individuals and corresponding samples is of the order of 40. At least 500-1000 random samples with replacement should be taken from the results of measurement of the reference samples.

  13. Testing the causal theory of reference.

    PubMed

    Domaneschi, Filippo; Vignolo, Massimiliano; Di Paola, Simona

    2017-04-01

    Theories of reference are a crucial research topic in analytic philosophy. Since the publication of Kripke's Naming and Necessity, most philosophers have endorsed the causal/historical theory of reference. The goal of this paper is twofold: (i) to discuss a method for testing experimentally the causal theory of reference for proper names by investigating linguistic usage and (ii) to present the results from two experiments conducted with that method. Data collected in our experiments confirm the causal theory of reference for people proper names and for geographical proper names. A secondary but interesting result is that the semantic domain affects reference assignment: while with people proper names speakers tend to assign the semantic reference, with geographical proper names they are prompted to assign the speaker's reference. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. A reference estimator based on composite sensor pattern noise for source device identification

    NASA Astrophysics Data System (ADS)

    Li, Ruizhe; Li, Chang-Tsun; Guan, Yu

    2014-02-01

    It has been shown that Sensor Pattern Noise (SPN) can serve as an imaging-device fingerprint for source camera identification. Reference SPN estimation is a very important procedure within this application. Most previous works built the reference SPN by averaging the SPNs extracted from 50 images of blue sky. However, this approach can be problematic. Firstly, in practice we may face the problem of source camera identification in the absence of the imaging cameras and their reference SPNs, which means that only natural images with scene details, rather than blue-sky images, are available for reference SPN estimation. This is challenging because the reference SPN can be severely contaminated by image content. Secondly, the number of available reference images is sometimes too small for existing methods to estimate a reliable reference SPN; existing methods give little consideration to the number of available reference images, as they were designed for datasets with abundant images. To deal with these problems, a novel reference estimator is proposed in this work. Experimental results show that the proposed method achieves better performance than methods based on the averaged reference SPN, especially when few reference images are used.
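
    A toy NumPy/SciPy sketch of the general SPN workflow this record builds on: extract a noise residual from each image with a denoising filter, average the residuals into a reference SPN, and match a query residual by normalized correlation. The Gaussian filter and the synthetic images are simplifications; the paper's proposed estimator for few, content-rich reference images is not reproduced here.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def noise_residual(img, sigma=1.5):
          """High-frequency residual left after subtracting a denoised (smoothed) image."""
          return img - gaussian_filter(img, sigma)

      def ncc(a, b):
          """Normalized correlation between two residuals."""
          a, b = a - a.mean(), b - b.mean()
          return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

      rng = np.random.default_rng(0)
      spn_true = rng.normal(0, 1, (64, 64))                      # the camera's "fingerprint"
      images = [rng.normal(128, 10, (64, 64)) + spn_true for _ in range(10)]

      reference_spn = np.mean([noise_residual(im) for im in images], axis=0)

      query_same = rng.normal(128, 10, (64, 64)) + spn_true      # image from the same camera
      query_other = rng.normal(128, 10, (64, 64))                # image from another camera
      print(ncc(noise_residual(query_same), reference_spn),
            ncc(noise_residual(query_other), reference_spn))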

  15. [The water content reference material of water saturated octanol].

    PubMed

    Wang, Haifeng; Ma, Kang; Zhang, Wei; Li, Zhanyuan

    2011-03-01

    The national standards for biofuels specify technical requirements and analytical methods. A water content certified reference material based on water-saturated octanol was developed to satisfy the needs of instrument calibration and method validation, and to assure the accuracy and consistency of water content measurements of biofuels. Three analytical methods based on different principles were employed to certify the water content of the reference material: Karl Fischer coulometric titration, Karl Fischer volumetric titration and quantitative nuclear magnetic resonance. Consistency between coulometric and volumetric titration was achieved through improvement of the methods, and the accuracy of the certified result was improved by introducing quantitative nuclear magnetic resonance as a new method. The certified value of the reference material is 4.76%, with an expanded uncertainty of 0.09%.

  16. Performance of three reflectance calibration methods for airborne hyperspectral spectrometer data.

    PubMed

    Miura, Tomoaki; Huete, Alfredo R

    2009-01-01

    In this study, the performances and accuracies of three methods for converting airborne hyperspectral spectrometer data to reflectance factors were characterized and compared. The "reflectance mode (RM)" method, which calibrates a spectrometer against a white reference panel prior to mounting on an aircraft, resulted in spectral reflectance retrievals that were biased and distorted. The magnitudes of these bias errors and distortions varied significantly, depending on time of day and length of the flight campaign. The "linear-interpolation (LI)" method, which converts airborne spectrometer data by taking a ratio of linearly-interpolated reference values from the preflight and post-flight reference panel readings, resulted in precise, but inaccurate reflectance retrievals. These reflectance spectra were not distorted, but were subject to bias errors of varying magnitudes dependent on the flight duration length. The "continuous panel (CP)" method uses a multi-band radiometer to obtain continuous measurements over a reference panel throughout the flight campaign, in order to adjust the magnitudes of the linearly-interpolated reference values from the preflight and post-flight reference panel readings. The CP method produced the most accurate and reliable airborne hyperspectral reflectance retrievals of the three methods evaluated. The performance of the CP method in retrieving accurate reflectance factors was consistent throughout the day and for various flight durations. Based on the dataset analyzed in this study, the uncertainty of the CP method has been estimated to be 0.0025 ± 0.0005 reflectance units for the wavelength regions not affected by atmospheric absorptions. The RM method can produce reasonable results only for a very short flight (e.g., < 15 minutes) conducted around local solar noon. The flight duration should be kept shorter than 30 minutes for the LI method to produce results with reasonable accuracies. An important advantage of the CP method is that it can be used for long-duration flight campaigns (e.g., 1-2 hours). Although this study focused on reflectance calibration of airborne spectrometer data, the methods evaluated and the results obtained are directly applicable to ground spectrometer measurements.
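
    A small Python sketch of the linear-interpolation (LI) handling of the reference panel described above: panel radiance is interpolated in time between the preflight and post-flight readings, and target reflectance is the panel reflectance times the ratio of target to interpolated panel radiance. All numbers are invented; the CP method's continuous rescaling is only indicated in a comment.

      import numpy as np

      panel_reflectance = 0.99                        # calibrated reference panel
      t_pre, t_post = 0.0, 60.0                       # minutes since preflight panel reading
      panel_pre, panel_post = 250.0, 310.0            # panel readings (arbitrary radiance units)

      t_obs = np.array([10.0, 25.0, 45.0])            # airborne observation times
      target_radiance = np.array([55.0, 61.0, 70.0])

      panel_interp = np.interp(t_obs, [t_pre, t_post], [panel_pre, panel_post])
      reflectance_LI = panel_reflectance * target_radiance / panel_interp
      print(np.round(reflectance_LI, 4))

      # The CP method additionally rescales panel_interp using continuous multi-band
      # radiometer measurements over the panel, correcting drift that the LI method
      # can only approximate linearly.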

  17. Intra prediction using face continuity in 360-degree video coding

    NASA Astrophysics Data System (ADS)

    Hanhart, Philippe; He, Yuwen; Ye, Yan

    2017-09-01

    This paper presents a new reference sample derivation method for intra prediction in 360-degree video coding. Unlike the conventional reference sample derivation method for 2D video coding, which uses the samples located directly above and on the left of the current block, the proposed method considers the spherical nature of 360-degree video when deriving reference samples located outside the current face to which the block belongs, and derives reference samples that are geometric neighbors on the sphere. The proposed reference sample derivation method was implemented in the Joint Exploration Model 3.0 (JEM-3.0) for the cubemap projection format. Simulation results for the all intra configuration show that, when compared with the conventional reference sample derivation method, the proposed method gives, on average, luma BD-rate reduction of 0.3% in terms of the weighted spherical PSNR (WS-PSNR) and spherical PSNR (SPSNR) metrics.

  18. Validation of a modification to Performance-Tested Method 070601: Reveal Listeria Test for detection of Listeria spp. in selected foods and selected environmental samples.

    PubMed

    Alles, Susan; Peng, Linda X; Mozola, Mark A

    2009-01-01

    A modification to Performance-Tested Method (PTM) 070601, Reveal Listeria Test (Reveal), is described. The modified method uses a new media formulation, LESS enrichment broth, in single-step enrichment protocols for both foods and environmental sponge and swab samples. Food samples are enriched for 27-30 h at 30 degrees C and environmental samples for 24-48 h at 30 degrees C. Implementation of these abbreviated enrichment procedures allows test results to be obtained on a next-day basis. In testing of 14 food types in internal comparative studies with inoculated samples, there was a statistically significant difference in performance between the Reveal and reference culture [U.S. Food and Drug Administration's Bacteriological Analytical Manual (FDA/BAM) or U.S. Department of Agriculture-Food Safety and Inspection Service (USDA-FSIS)] methods for only a single food in one trial (pasteurized crab meat) at the 27 h enrichment time point, with more positive results obtained with the FDA/BAM reference method. No foods showed statistically significant differences in method performance at the 30 h time point. Independent laboratory testing of 3 foods again produced a statistically significant difference in results for crab meat at the 27 h time point; otherwise results of the Reveal and reference methods were statistically equivalent. Overall, considering both internal and independent laboratory trials, sensitivity of the Reveal method relative to the reference culture procedures in testing of foods was 85.9% at 27 h and 97.1% at 30 h. Results from 5 environmental surfaces inoculated with various strains of Listeria spp. showed that the Reveal method was more productive than the reference USDA-FSIS culture procedure for 3 surfaces (stainless steel, plastic, and cast iron), whereas results were statistically equivalent to the reference method for the other 2 surfaces (ceramic tile and sealed concrete). An independent laboratory trial with ceramic tile inoculated with L. monocytogenes confirmed the effectiveness of the Reveal method at the 24 h time point. Overall, sensitivity of the Reveal method at 24 h relative to that of the USDA-FSIS method was 153%. The Reveal method exhibited extremely high specificity, with only a single false-positive result in all trials combined for overall specificity of 99.5%.
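
    For clarity, this is how the relative sensitivity and specificity percentages quoted above are computed; the counts in the example are hypothetical, chosen only to give numbers of the same order as those reported.

      def relative_sensitivity(test_positives, reference_positives):
          # e.g. 153% means the candidate method detected 1.53x as many positive
          # samples as the reference culture procedure on the same sample set
          return 100.0 * test_positives / reference_positives

      def specificity(true_negatives, false_positives):
          return 100.0 * true_negatives / (true_negatives + false_positives)

      print(round(relative_sensitivity(52, 34), 1))   # hypothetical environmental-surface counts
      print(round(specificity(199, 1), 1))            # one false positive over all trials -> 99.5%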

  19. Fast lossless compression via cascading Bloom filters.

    PubMed

    Rozov, Roye; Shamir, Ron; Halperin, Eran

    2014-01-01

    Data from large Next Generation Sequencing (NGS) experiments present challenges both in terms of costs associated with storage and in time required for file transfer. It is sometimes possible to store only a summary relevant to particular applications, but generally it is desirable to keep all information needed to revisit experimental results in the future. Thus, the need for efficient lossless compression methods for NGS reads arises. It has been shown that NGS-specific compression schemes can improve results over generic compression methods, such as the Lempel-Ziv algorithm, Burrows-Wheeler transform, or Arithmetic Coding. When a reference genome is available, effective compression can be achieved by first aligning the reads to the reference genome, and then encoding each read using the alignment position combined with the differences in the read relative to the reference. These reference-based methods have been shown to compress better than reference-free schemes, but the alignment step they require demands several hours of CPU time on a typical dataset, whereas reference-free methods can usually compress in minutes. We present a new approach that achieves highly efficient compression by using a reference genome, but completely circumvents the need for alignment, affording a great reduction in the time needed to compress. In contrast to reference-based methods that first align reads to the genome, we hash all reads into Bloom filters to encode, and decode by querying the same Bloom filters using read-length subsequences of the reference genome. Further compression is achieved by using a cascade of such filters. Our method, called BARCODE, runs an order of magnitude faster than reference-based methods, while compressing an order of magnitude better than reference-free methods, over a broad range of sequencing coverage. In high coverage (50-100 fold), compared to the best tested compressors, BARCODE saves 80-90% of the running time while only increasing space slightly.

  20. Thermal radiation transfer calculations in combustion fields using the SLW model coupled with a modified reference approach

    NASA Astrophysics Data System (ADS)

    Darbandi, Masoud; Abrar, Bagher

    2018-01-01

    The spectral-line weighted-sum-of-gray-gases (SLW) model is considered a modern global model that can be used to predict thermal radiation heat transfer within combustion fields. Past SLW model users have mostly employed the reference approach to calculate the local values of the gray gases' absorption coefficient. This classical reference approach assumes that the absorption spectra of gases at different thermodynamic conditions are scalable with the absorption spectrum of the gas at a reference thermodynamic state in the domain. However, this assumption is not reasonable in combustion fields where the gas temperature differs greatly from the reference temperature. Consequently, the results of the SLW model combined with the classical reference approach, hereafter called the classical SLW method, are highly sensitive to the reference temperature magnitude in non-isothermal combustion fields. To lessen this sensitivity, the current work combines the SLW model with a modified reference approach, which is one particular form among the eight possible reference approaches reported recently by Solovjov, et al. [DOI: 10.1016/j.jqsrt.2017.01.034, 2017]. The combination is called the "modified SLW method". This work shows that the modified reference approach can provide more accurate total emissivity calculations than the classical reference approach when coupled with the SLW method. This would be particularly helpful for more accurate calculation of radiation transfer in highly non-isothermal combustion fields. To demonstrate this, we use both the classical and modified SLW methods to calculate the radiation transfer in such fields. It is shown that the modified SLW method can almost eliminate the sensitivity of the results to the chosen reference temperature in treating highly non-isothermal combustion fields.
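
    To make the quantity being compared ("total emissivity") concrete, here is a generic weighted-sum-of-gray-gases evaluation in Python. The gray-gas weights and absorption coefficients are placeholders rather than SLW model data, and the classical/modified reference-approach correction itself is not reproduced.

      import numpy as np

      def wsgg_emissivity(a, kappa, pL):
          """Total emissivity = sum_i a_i * (1 - exp(-kappa_i * p * L))."""
          a, kappa = np.asarray(a), np.asarray(kappa)
          return float(np.sum(a * (1.0 - np.exp(-kappa * pL))))

      a = [0.30, 0.25, 0.15]          # gray-gas weights (placeholder values)
      kappa = [0.05, 0.5, 5.0]        # gray-gas absorption coefficients, 1/(atm*m)
      print(round(wsgg_emissivity(a, kappa, pL=1.2), 4))   # path: partial pressure * length, atm*m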

  1. Indirect methods for reference interval determination - review and recommendations.

    PubMed

    Jones, Graham R D; Haeckel, Rainer; Loh, Tze Ping; Sikaris, Ken; Streichert, Thomas; Katayev, Alex; Barth, Julian H; Ozarda, Yesim

    2018-04-19

    Reference intervals are a vital part of the information supplied by clinical laboratories to support interpretation of numerical pathology results such as are produced in clinical chemistry and hematology laboratories. The traditional method for establishing reference intervals, known as the direct approach, is based on collecting samples from members of a preselected reference population, making the measurements and then determining the intervals. An alternative approach is to perform analysis of results generated as part of routine pathology testing and using appropriate statistical techniques to determine reference intervals. This is known as the indirect approach. This paper from a working group of the International Federation of Clinical Chemistry (IFCC) Committee on Reference Intervals and Decision Limits (C-RIDL) aims to summarize current thinking on indirect approaches to reference intervals. The indirect approach has some major potential advantages compared with direct methods. The processes are faster, cheaper and do not involve patient inconvenience, discomfort or the risks associated with generating new patient health information. Indirect methods also use the same preanalytical and analytical techniques used for patient management and can provide very large numbers for assessment. Limitations to the indirect methods include possible effects of diseased subpopulations on the derived interval. The IFCC C-RIDL aims to encourage the use of indirect methods to establish and verify reference intervals, to promote publication of such intervals with clear explanation of the process used and also to support the development of improved statistical techniques for these studies.

  2. [Study on the experimental application of floating-reference method to noninvasive blood glucose sensing].

    PubMed

    Yu, Hui; Qi, Dan; Li, Heng-da; Xu, Ke-xin; Yuan, Wei-jie

    2012-03-01

    Weak signals, a low instrument signal-to-noise ratio, continuous variation of the human physiological environment and interference from other blood components make it difficult to extract blood glucose information from near-infrared spectra in noninvasive blood glucose measurement. The floating-reference method analyses the effect of glucose concentration variation on the absorption and scattering coefficients, and acquires spectra at a reference point, where the light intensity variations caused by absorption and scattering cancel out, and at a measurement point, where they are largest. By using the spectrum from the reference point as a reference, the floating-reference method can reduce interference from variations in the physiological environment and the experimental circumstances. In the present paper, the effectiveness of the floating-reference method in improving prediction precision and stability was assessed through application experiments. A comparison was made between models whose data were processed with and without the floating-reference method. The results showed that the root mean square error of prediction (RMSEP) decreased by up to 34.7%. The floating-reference method could reduce the influences of changes in sample state, instrument noise and drift, and effectively improve the models' prediction precision and stability.

  3. Standardization in laboratory medicine: Adoption of common reference intervals to the Croatian population.

    PubMed

    Flegar-Meštrić, Zlata; Perkov, Sonja; Radeljak, Andrea

    2016-03-26

    Considering the fact that the results of laboratory tests provide useful information about the state of health of patients, determination of reference value is considered an intrinsic part in the development of laboratory medicine. There are still huge differences in the analytical methods used as well as in the associated reference intervals which could consequently significantly affect the proper assessment of patient health. In a constant effort to increase the quality of patients' care, there are numerous international initiatives for standardization and/or harmonization of laboratory diagnostics in order to achieve maximum comparability of laboratory test results and improve patient safety. Through the standardization and harmonization processes of analytical methods the ability to create unique reference intervals is achieved. Such reference intervals could be applied globally in all laboratories using methods traceable to the same reference measuring system and analysing the biological samples from the populations with similar socio-demographic and ethnic characteristics. In this review we outlined the results of the harmonization processes in Croatia in the field of population based reference intervals for clinically relevant blood and serum constituents which are in accordance with ongoing activity for worldwide standardization and harmonization based on traceability in laboratory medicine.

  4. [Establishing biological reference intervals of alanine transaminase for clinical laboratory stored database].

    PubMed

    Guo, Wei; Song, Binbin; Shen, Junfei; Wu, Jiong; Zhang, Chunyan; Wang, Beili; Pan, Baishen

    2015-08-25

    To establish an indirect reference interval based on alanine aminotransferase test results stored in a laboratory information system. All alanine aminotransferase results for outpatients and physical examinations stored in the laboratory information system of Zhongshan Hospital during 2014 were included. The original data were transformed using a Box-Cox transformation to obtain an approximately normal distribution. Outliers were identified and omitted using the Chauvenet and Tukey methods. The indirect reference intervals were obtained by applying the nonparametric and Hoffmann methods in parallel. The reference change value was used to determine the statistical significance of the observed differences between the calculated and published reference intervals. The indirect reference intervals for alanine aminotransferase were 12 to 41 U/L (male, outpatient), 12 to 48 U/L (male, physical examination), 9 to 32 U/L (female, outpatient), and 8 to 35 U/L (female, physical examination). The absolute differences compared with the direct results were all smaller than the reference change value of alanine aminotransferase. The Box-Cox transformation combined with the Hoffmann and Tukey methods is a simple and reliable technique that should be promoted and used by clinical laboratories.
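
    A Python sketch of the core indirect workflow described above: Box-Cox transform the stored results, remove outliers with Tukey fences, then take nonparametric limits and back-transform. The mixed synthetic data stand in for a laboratory-information-system extract; the Hoffmann graphical step and the Chauvenet criterion are not reproduced.

      import numpy as np
      from scipy import stats, special

      rng = np.random.default_rng(7)
      # mostly "healthy" results plus a small pathological tail, as in routine LIS data
      results = np.concatenate([rng.lognormal(3.0, 0.3, 1900), rng.lognormal(4.2, 0.3, 100)])

      transformed, lam = stats.boxcox(results)

      q1, q3 = np.percentile(transformed, [25, 75])
      iqr = q3 - q1
      keep = (transformed >= q1 - 1.5 * iqr) & (transformed <= q3 + 1.5 * iqr)   # Tukey fences

      ri = special.inv_boxcox(np.percentile(transformed[keep], [2.5, 97.5]), lam)
      print("indirect reference interval:", np.round(ri, 1))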

  5. Standardization in laboratory medicine: Adoption of common reference intervals to the Croatian population

    PubMed Central

    Flegar-Meštrić, Zlata; Perkov, Sonja; Radeljak, Andrea

    2016-01-01

    Considering the fact that the results of laboratory tests provide useful information about the state of health of patients, determination of reference value is considered an intrinsic part in the development of laboratory medicine. There are still huge differences in the analytical methods used as well as in the associated reference intervals which could consequently significantly affect the proper assessment of patient health. In a constant effort to increase the quality of patients’ care, there are numerous international initiatives for standardization and/or harmonization of laboratory diagnostics in order to achieve maximum comparability of laboratory test results and improve patient safety. Through the standardization and harmonization processes of analytical methods the ability to create unique reference intervals is achieved. Such reference intervals could be applied globally in all laboratories using methods traceable to the same reference measuring system and analysing the biological samples from the populations with similar socio-demographic and ethnic characteristics. In this review we outlined the results of the harmonization processes in Croatia in the field of population based reference intervals for clinically relevant blood and serum constituents which are in accordance with ongoing activity for worldwide standardization and harmonization based on traceability in laboratory medicine. PMID:27019800

  6. Innovative application of the moisture analyzer for determination of dry mass content of processed cheese

    NASA Astrophysics Data System (ADS)

    Kowalska, Małgorzata; Janas, Sławomir; Woźniak, Magdalena

    2018-04-01

    The aim of this work was to present an alternative method for determining the total dry mass content of processed cheese. The authors claim that the presented method can be used in industrial quality control laboratories for routine testing and for quick in-process control. For the tests, both the reference method for determining dry mass in processed cheese and the moisture analyzer method were used. The tests were carried out on three different kinds of processed cheese. In accordance with the reference method, the sample was placed on a layer of silica sand and dried at a temperature of 102 °C for about 4 h. The moisture analyzer test required method validation with regard to the drying temperature range and the mass of the analyzed sample. An optimum drying temperature of 110 °C was determined experimentally. For the Hochland cream processed cheese sample, the total dry mass content was 38.92% with the reference method and 38.74% with the moisture analyzer method. The average analysis time with the moisture analyzer method was 9 min. For the sample of processed cheese with tomatoes, the reference method result was 40.37% and the alternative method result was 40.67%. For the sample of cream processed cheese with garlic, the reference method gave 36.88% and the alternative method 37.02%. The average time of those determinations was 16 min. The results confirmed that the use of a moisture analyzer is effective: consistent dry mass values were obtained with both methods. According to the authors, the much shorter measurement time of the moisture analyzer method is a key criterion when selecting methods for in-process and final quality control.

  7. A self-reference PRF-shift MR thermometry method utilizing the phase gradient

    NASA Astrophysics Data System (ADS)

    Langley, Jason; Potter, William; Phipps, Corey; Huang, Feng; Zhao, Qun

    2011-12-01

    In magnetic resonance (MR) imaging, the most widely used and accurate method for measuring temperature is based on the shift in proton resonance frequency (PRF). However, inter-scan motion and bulk magnetic field shifts can lead to inaccurate temperature measurements in the PRF-shift MR thermometry method. The self-reference PRF-shift MR thermometry method was introduced to overcome such problems by deriving a reference image from the heated or treated image, and approximates the reference phase map with low-order polynomial functions. In this note, a new approach is presented to calculate the baseline phase map in self-reference PRF-shift MR thermometry. The proposed method utilizes the phase gradient to remove the phase unwrapping step inherent to other self-reference PRF-shift MR thermometry methods. The performance of the proposed method was evaluated using numerical simulations with temperature distributions following a two-dimensional Gaussian function as well as phantom and in vivo experimental data sets. The results from both the numerical simulations and experimental data show that the proposed method is a promising technique for measuring temperature.
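
    For reference, the standard PRF-shift relation that underlies all of these methods converts a phase difference into a temperature change as delta_T = delta_phi / (2*pi*gamma*alpha*B0*TE); the self-reference and phase-gradient refinements in the paper only change how the baseline phase is obtained. The parameter values below are typical literature values, not the paper's.

      import math

      GAMMA = 42.576e6        # 1H gyromagnetic ratio, Hz/T
      ALPHA = -0.01e-6        # PRF thermal coefficient, per degree C (typical value)

      def delta_temperature(delta_phi_rad, B0_tesla, TE_seconds):
          """Temperature change corresponding to a measured phase difference."""
          return delta_phi_rad / (2 * math.pi * GAMMA * ALPHA * B0_tesla * TE_seconds)

      # Example: a -0.24 rad phase change at 3 T with TE = 10 ms -> roughly +3 degrees C
      print(round(delta_temperature(-0.24, 3.0, 0.010), 2), "degrees C")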

  8. Reference Value Advisor: a new freeware set of macroinstructions to calculate reference intervals with Microsoft Excel.

    PubMed

    Geffré, Anne; Concordet, Didier; Braun, Jean-Pierre; Trumel, Catherine

    2011-03-01

    International recommendations for determination of reference intervals have been recently updated, especially for small reference sample groups, and use of the robust method and Box-Cox transformation is now recommended. Unfortunately, these methods are not included in most software programs used for data analysis by clinical laboratories. We have created a set of macroinstructions, named Reference Value Advisor, for use in Microsoft Excel to calculate reference limits applying different methods. For any series of data, Reference Value Advisor calculates reference limits (with 90% confidence intervals [CI]) using a nonparametric method when n≥40 and by parametric and robust methods from native and Box-Cox transformed values; tests normality of distributions using the Anderson-Darling test and outliers using Tukey and Dixon-Reed tests; displays the distribution of values in dot plots and histograms and constructs Q-Q plots for visual inspection of normality; and provides minimal guidelines in the form of comments based on international recommendations. The critical steps in determination of reference intervals are correct selection of as many reference individuals as possible and analysis of specimens in controlled preanalytical and analytical conditions. Computing tools cannot compensate for flaws in selection and size of the reference sample group and handling and analysis of samples. However, if those steps are performed properly, Reference Value Advisor, available as freeware at http://www.biostat.envt.fr/spip/spip.php?article63, permits rapid assessment and comparison of results calculated using different methods, including currently unavailable methods. This allows for selection of the most appropriate method, especially as the program provides the CI of limits. It should be useful in veterinary clinical pathology when only small reference sample groups are available. ©2011 American Society for Veterinary Clinical Pathology.

  9. Calibration of BCR-ABL1 mRNA quantification methods using genetic reference materials is a valid strategy to report results on the international scale.

    PubMed

    Mauté, Carole; Nibourel, Olivier; Réa, Delphine; Coiteux, Valérie; Grardel, Nathalie; Preudhomme, Claude; Cayuela, Jean-Michel

    2014-09-01

    Until recently, diagnostic laboratories that wanted to report on the international scale had limited options: they had to align their BCR-ABL1 quantification methods through a sample exchange with a reference laboratory to derive a conversion factor. However, commercial methods calibrated on the World Health Organization genetic reference panel are now available. We report results from a study designed to assess the comparability of the two alignment strategies. Sixty follow-up samples from chronic myeloid leukemia patients were included. Two commercial methods calibrated on the genetic reference panel were compared to two conversion factor methods routinely used at Saint-Louis Hospital, Paris, and at Lille University Hospital. Results were matched against concordance criteria (i.e., obtaining at least two of the three following landmarks: 50, 75 and 90% of the patient samples within a 2-fold, 3-fold and 5-fold range, respectively). Out of the 60 samples, more than 32 were available for comparison. Compared to the conversion factor method, the two commercial methods were within a 2-fold, 3-fold and 5-fold range for 53 and 59%, 89 and 88%, 100 and 97%, respectively of the samples analyzed at Saint-Louis. At Lille, results were 45 and 85%, 76 and 97%, 100 and 100%, respectively. Agreements between methods were observed in the four comparisons performed. Our data show that the two commercial methods selected are concordant with the conversion factor methods. This study brings the proof of principle that alignment on the international scale using the genetic reference panel is compatible with the patient sample exchange procedure. We believe that these results are particularly important for diagnostic laboratories wishing to adopt commercial methods. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
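
    A small Python sketch of the two ingredients of the comparison described above: converting local %BCR-ABL1 ratios to the international scale with a conversion factor, and checking the 2-/3-/5-fold concordance criteria between two methods. The ratios and the conversion factor are invented examples.

      def to_international_scale(bcr_abl1_ratio_pct, conversion_factor):
          return bcr_abl1_ratio_pct * conversion_factor

      def concordance(results_a, results_b):
          """Fraction of paired samples whose results agree within 2-, 3- and 5-fold."""
          folds = [max(a, b) / min(a, b) for a, b in zip(results_a, results_b)]
          n = len(folds)
          return {f: sum(x <= f for x in folds) / n for f in (2, 3, 5)}

      lab_results = [0.12, 0.034, 1.8, 0.0051, 0.47]        # %BCR-ABL1, local scale
      is_results = [to_international_scale(r, 0.8) for r in lab_results]
      commercial = [0.11, 0.021, 1.2, 0.0090, 0.30]         # same samples, kit calibrated on the WHO panel

      print(concordance(is_results, commercial))
      # study criteria: >=50% within 2-fold, >=75% within 3-fold, >=90% within 5-fold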

  10. The Kjeldahl method as a primary reference procedure for total protein in certified reference materials used in clinical chemistry. II. Selection of direct Kjeldahl analysis and its preliminary performance parameters.

    PubMed

    Vinklárková, Bára; Chromý, Vratislav; Šprongl, Luděk; Bittová, Miroslava; Rikanová, Milena; Ohnútková, Ivana; Žaludová, Lenka

    2015-01-01

    To select a Kjeldahl procedure suitable for the determination of total protein in reference materials used in laboratory medicine, we reviewed in our previous article Kjeldahl methods adopted by clinical chemistry and found an indirect two-step analysis by total Kjeldahl nitrogen corrected for its nonprotein nitrogen and a direct analysis made on isolated protein precipitates. In this article, we compare both procedures on various reference materials. An indirect Kjeldahl method gave falsely lower results than a direct analysis. Preliminary performance parameters qualify the direct Kjeldahl analysis as a suitable primary reference procedure for the certification of total protein in reference laboratories.

  11. What is the best reference RNA? And other questions regarding the design and analysis of two-color microarray experiments.

    PubMed

    Kerr, Kathleen F; Serikawa, Kyle A; Wei, Caimiao; Peters, Mette A; Bumgarner, Roger E

    2007-01-01

    The reference design is a practical and popular choice for microarray studies using two-color platforms. In the reference design, the reference RNA uses half of all array resources, leading investigators to ask: What is the best reference RNA? We propose a novel method for evaluating reference RNAs and present the results of an experiment that was specially designed to evaluate three common choices of reference RNA. We found no compelling evidence in favor of any particular reference. In particular, a commercial reference showed no advantage in our data. Our experimental design also enabled a new way to test the effectiveness of pre-processing methods for two-color arrays. Our results favor using intensity normalization and foregoing background subtraction. Finally, we evaluate the sensitivity and specificity of data quality filters, and we propose a new filter that can be applied to any experimental design and does not rely on replicate hybridizations.

  12. Study of the viscosity of hydrocarbon mixtures under pressure and temperature: A critical model of the corresponding states to double reference in the modeling domain.

    NASA Astrophysics Data System (ADS)

    Ettahir, Aziz; Boned, Christian; Lagourette, Bernard; Kettani, Kamal; Amarrayi, Khaoula; Garoumi, Mohammed

    2017-10-01

    The predictive viscosimetric model studied is that of K. A. Petersen [1]. The central idea of this method is to characterize the viscosity of a fluid from two reference fluids through a reduced pressure; it is a corresponding-states method with two references. This study shows that the method depends on the choice of references, examined here for the C10/C6H6 and C1/C10 reference pairs. The results were investigated for four different weight ratios. They show that introducing an adjustment coefficient does not bring a significant improvement over the results obtained without an adjustment factor, which appears to be the best choice. Regarding the influence of the choice of references, both pairs generally appear suitable, and the particular choice does not appear to be decisive. For mixtures containing at least one aromatic, the results are correct, in particular when our ratios, with and without adjustment, are compared with those of K. A. Petersen [1]. The experimental viscosity results show good agreement with the calculated values. The relative improvement can be attributed to the introduction of a second reference fluid (C10) into the authors' single-reference (C1) corresponding-states model.

  13. PRELIMINARY RESULTS OF EPA'S PERFORMANCE EVALUATION OF FEDERAL REFERENCE METHODS AND FEDERAL EQUIVALENT METHODS FOR COARSE PARTICULATE MATTER

    EPA Science Inventory

    The main objective of this study is to evaluate the performance of sampling methods for potential use as a Federal Reference Method (FRM) capable of providing an estimate of coarse particle (PMc: particulate matter with an aerodynamic diameter between 2.5 µm and 10 µm) ...

  14. Estimating patient-specific and anatomically correct reference model for craniomaxillofacial deformity via sparse representation

    PubMed Central

    Wang, Li; Ren, Yi; Gao, Yaozong; Tang, Zhen; Chen, Ken-Chung; Li, Jianfu; Shen, Steve G. F.; Yan, Jin; Lee, Philip K. M.; Chow, Ben; Xia, James J.; Shen, Dinggang

    2015-01-01

    Purpose: A significant number of patients suffer from craniomaxillofacial (CMF) deformity and require CMF surgery in the United States. The success of CMF surgery depends on not only the surgical techniques but also an accurate surgical planning. However, surgical planning for CMF surgery is challenging due to the absence of a patient-specific reference model. Currently, the outcome of the surgery is often subjective and highly dependent on surgeon’s experience. In this paper, the authors present an automatic method to estimate an anatomically correct reference shape of jaws for orthognathic surgery, a common type of CMF surgery. Methods: To estimate a patient-specific jaw reference model, the authors use a data-driven method based on sparse shape composition. Given a dictionary of normal subjects, the authors first use the sparse representation to represent the midface of a patient by the midfaces of the normal subjects in the dictionary. Then, the derived sparse coefficients are used to reconstruct a patient-specific reference jaw shape. Results: The authors have validated the proposed method on both synthetic and real patient data. Experimental results show that the authors’ method can effectively reconstruct the normal shape of jaw for patients. Conclusions: The authors have presented a novel method to automatically estimate a patient-specific reference model for the patient suffering from CMF deformity. PMID:26429255
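
    A toy sketch of the sparse-shape-composition idea, assuming scikit-learn is available: the patient's midface is represented as a sparse combination of normal midfaces, and the same coefficients applied to the corresponding normal jaws give a patient-specific reference jaw. A Lasso solver stands in for the paper's sparse optimization, and random vectors stand in for real shape data.

      import numpy as np
      from sklearn.linear_model import Lasso

      rng = np.random.default_rng(3)
      n_subjects, midface_dim, jaw_dim = 30, 200, 150
      midface_dict = rng.normal(size=(midface_dim, n_subjects))   # columns = normal midface shapes
      jaw_dict = rng.normal(size=(jaw_dim, n_subjects))           # corresponding normal jaw shapes

      # A "patient" midface built from a few normal subjects (the deformity is in the jaw,
      # so the midface is assumed to be representable by the normal dictionary)
      true_coef = np.zeros(n_subjects)
      true_coef[[2, 7, 19]] = [0.6, 0.3, 0.1]
      patient_midface = midface_dict @ true_coef + rng.normal(0, 0.05, midface_dim)

      coef = Lasso(alpha=0.01, fit_intercept=False).fit(midface_dict, patient_midface).coef_
      reference_jaw = jaw_dict @ coef                              # patient-specific reference jaw estimate
      print("non-zero coefficients:", np.flatnonzero(np.round(coef, 3)).size)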

  15. Pathology Report for Intraperitoneal Sodium Dichromate Exposure in Rats

    DTIC Science & Technology

    2015-08-12

    [Abstract not available. The record snippet contains only extraction fragments of the report's table of contents (References, Methods, Results, Discussion), an acknowledgment of Dr. Michael Madejczyk (USARMY MEDCOM USACEHR), and partial reference citations.]

  16. Standardization of glycohemoglobin results and reference values in whole blood studied in 103 laboratories using 20 methods.

    PubMed

    Weykamp, C W; Penders, T J; Miedema, K; Muskiet, F A; van der Slik, W

    1995-01-01

    We investigated the effect of calibration with lyophilized calibrators on whole-blood glycohemoglobin (glyHb) results. One hundred three laboratories, using 20 different methods, determined glyHb in two lyophilized calibrators and two whole-blood samples. For whole-blood samples with low (5%) and high (9%) glyHb percentages, respectively, calibration decreased overall interlaboratory variation (CV) from 16% to 9% and from 11% to 6% and decreased intermethod variation from 14% to 6% and from 12% to 5%. Forty-seven laboratories, using 14 different methods, determined mean glyHb percentages in self-selected groups of 10 nondiabetic volunteers each. With calibration their overall mean (2SD) was 5.0% (0.5%), very close to the 5.0% (0.3%) derived from the reference method used in the Diabetes Control and Complications Trial. In both experiments the Abbott IMx and Vision showed deviating results. We conclude that, irrespective of the analytical method used, calibration enables standardization of glyHb results, reference values, and interpretation criteria.

  17. Calibration procedure of Hukseflux SR25 to Establish the Diffuse Reference for the Outdoor Broadband Radiometer Calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reda, Ibrahim M.; Andreas, Afshin M.

    2017-08-01

    Accurate pyranometer calibrations, traceable to internationally recognized standards, are critical for solar irradiance measurements. One calibration method is the component summation method, where the pyranometers are calibrated outdoors under clear sky conditions, and the reference global solar irradiance is calculated as the sum of two reference components, the diffuse horizontal and subtended beam solar irradiances. The beam component is measured with pyrheliometers traceable to the World Radiometric Reference, while there is no internationally recognized reference for the diffuse component. In the absence of such a reference, we present a method to consistently calibrate pyranometers for measuring the diffuse component. The method is based on using a modified shade/unshade method and a pyranometer with less than 0.5 W/m2 thermal offset. The calibration result shows that the responsivity of the Hukseflux SR25 pyranometer equals 10.98 uV/(W/m2) with +/-0.86 percent uncertainty.
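    A minimal sketch of the component-summation arithmetic described above: the reference global irradiance is the diffuse component plus the beam component projected onto the horizontal, and the responsivity is the pyranometer output voltage divided by that reference. The values are hypothetical; this is not the laboratory's calibration code.

      import numpy as np

      # Hypothetical single calibration data point.
      beam_normal = 900.0          # W/m^2, from a reference pyrheliometer
      diffuse_horizontal = 80.0    # W/m^2, from the reference diffuse pyranometer
      zenith_deg = 30.0            # solar zenith angle
      v_output_uV = 10500.0        # pyranometer thermopile output in microvolts

      # Component summation: global reference = diffuse + beam * cos(zenith).
      g_reference = diffuse_horizontal + beam_normal * np.cos(np.radians(zenith_deg))

      responsivity = v_output_uV / g_reference     # microvolts per (W/m^2)
      print(f"responsivity = {responsivity:.2f} uV/(W/m^2)")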

  18. Determination of glycated hemoglobin in patients with advanced liver disease

    PubMed Central

    Lahousen, Theresa; Hegenbarth, Karin; Ille, Rottraut; Lipp, Rainer W.; Krause, Robert; Little, Randie R.; Schnedl, Wolfgang J.

    2004-01-01

    AIM: To evaluate glycated hemoglobin (HbA1c) determination methods and to determine fructosamine in patients with chronic hepatitis, compensated cirrhosis, and chronic hepatitis treated with ribavirin. METHODS: HbA1c values were determined in 15 patients with compensated liver cirrhosis and in 20 patients with chronic hepatitis using ion-exchange high performance liquid chromatography and immunoassay methods. Fructosamine was determined using nitroblue tetrazolium. RESULTS: Forty percent of patients with liver cirrhosis had HbA1c results below the non-diabetic reference range by at least one HbA1c method, while fructosamine results were either within the reference range or elevated. Twenty percent of patients with chronic hepatitis (hepatic fibrosis) had HbA1c results below the non-diabetic reference range by at least one HbA1c method. In patients with chronic hepatitis treated with ribavirin, 50% of HbA1c results were below the non-diabetic reference range by at least one of the HbA1c methods. CONCLUSION: HbA1c results in patients with advanced liver disease should be interpreted only in the context of all liver function parameters and a red blood count including reticulocytes. HbA1c and fructosamine measurements should be used with caution when evaluating long-term glucose control in patients with hepatic cirrhosis or in patients with chronic hepatitis under ribavirin treatment. PMID:15259084

  19. Cross-reference identification within a PDF document

    NASA Astrophysics Data System (ADS)

    Li, Sida; Gao, Liangcai; Tang, Zhi; Yu, Yinyan

    2015-01-01

    Cross-references, such as footnotes, endnotes, figure/table captions, and references, are a common and useful type of page element that further explains corresponding entities in the target document. In this paper, we focus on cross-reference identification in a PDF document and present a robust method as a case study of identifying footnotes and figure references. The proposed method first extracts footnotes and figure captions, and then matches them with their corresponding references within a document. A number of novel features within a PDF document, i.e., page layout, font information, and lexical and linguistic features of cross-references, are utilized for the task. Clustering is adopted to handle features that are stable within one document but vary across different kinds of documents, so that the identification process adapts to document types. In addition, the method leverages results from the matching process to provide feedback to the identification process and further improve accuracy. Preliminary experiments on real document sets show that the proposed method is promising for identifying cross-references in PDF documents.

  20. Decentralized model reference adaptive control of large flexible structures

    NASA Technical Reports Server (NTRS)

    Lee, Fu-Ming; Fong, I-Kong; Lin, Yu-Hwan

    1988-01-01

    A decentralized model reference adaptive control (DMRAC) method is developed for large flexible structures (LFS). The development follows that of a centralized model reference adaptive control for LFS, which has been shown to be feasible. The proposed method is illustrated using a simply supported beam with collocated actuators and sensors. Results show that the DMRAC can achieve either output regulation or output tracking with adequate convergence, provided the reference model inputs and their time derivatives are integrable, bounded, and approach zero as t approaches infinity.
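    The record above gives no equations, so the following is a generic first-order model reference adaptive control sketch using the MIT rule with a single feedforward gain, not the authors' decentralized DMRAC formulation for flexible structures. It shows the basic idea: adapt a controller gain so the plant output tracks a reference model.

      import numpy as np

      # First-order plant: dy/dt = -a*y + b*u, with b treated as unknown.
      # Reference model:   dym/dt = -am*ym + am*r  (unity DC gain).
      a, b = 1.0, 2.0
      am = 3.0
      gamma = 0.5            # adaptation gain (MIT rule)
      dt, T = 0.001, 20.0

      y = ym = theta = 0.0
      for k in range(int(T / dt)):
          t = k * dt
          r = 1.0 if (t % 10.0) < 5.0 else -1.0    # square-wave reference input
          u = theta * r                            # simplified feedforward-only control
          e = y - ym                               # tracking error
          # MIT rule, with the sensitivity derivative approximated by r:
          theta += dt * (-gamma * e * r)
          y += dt * (-a * y + b * u)
          ym += dt * (-am * ym + am * r)

      print(f"final adaptive gain theta = {theta:.3f}, tracking error = {y - ym:.4f}")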

  1. [The validation of the effect of correcting spectral background changes based on floating reference method by simulation].

    PubMed

    Wang, Zhu-lou; Zhang, Wan-jie; Li, Chen-xi; Chen, Wen-liang; Xu, Ke-xin

    2015-02-01

    There are several challenges in near-infrared non-invasive blood glucose measurement, such as the low signal-to-noise ratio of the instrument, unstable measurement conditions, and unpredictable, irregular changes in the measured object. It is therefore difficult to extract information on blood glucose concentration accurately from the complicated signals. A reference measurement is usually considered for eliminating the effect of background changes, but there is no reference substance that changes synchronously with the analyte. After many years of research, our group has proposed the floating reference method, which succeeds in eliminating the spectral effects induced by instrument drift and by variations in the measured object's background. Our studies indicate, however, that the reference point changes with measurement location and wavelength, so the effectiveness of the floating reference method should be verified comprehensively. In this paper, for simplicity, Monte Carlo simulations of Intralipid solutions at concentrations of 5% and 10% are performed to verify the ability of the floating reference method to eliminate the consequences of light source drift, which is introduced by varying the number of incident photons. The effectiveness of the floating reference method, with the corresponding reference points at different wavelengths, in eliminating the variations caused by light source drift is estimated. A comparison of the prediction abilities of calibration models with and without the method shows that the RMSEPs are decreased by about 98.57% (5% Intralipid) and 99.36% (10% Intralipid). The results indicate that the floating reference method is clearly effective in eliminating background changes.

  2. Collaborative derivation of reference intervals for major clinical laboratory tests in Japan.

    PubMed

    Ichihara, Kiyoshi; Yomamoto, Yoshikazu; Hotta, Taeko; Hosogaya, Shigemi; Miyachi, Hayato; Itoh, Yoshihisa; Ishibashi, Midori; Kang, Dongchon

    2016-05-01

    Three multicentre studies of reference intervals were conducted recently in Japan. The Committee on Common Reference Intervals of the Japan Society of Clinical Chemistry sought to establish common reference intervals for 40 laboratory tests which were measured in common in the three studies and regarded as well harmonized in Japan. The study protocols were comparable, with recruitment mostly from hospital workers with body mass index ≤28 and taking no medications. Age and sex distributions were made equal to obtain a final data size of 6345 individuals. Between-subgroup differences were expressed as the SD ratio (between-subgroup SD divided by the SD representing the reference interval). Between-study differences were all within acceptable levels, and thus the three datasets were merged. Adopting an SD ratio ≥0.50 as a guide, sex-specific reference intervals were necessary for 12 assays. Age-specific reference intervals for females, partitioned at age 45, were required for five analytes. The reference intervals derived by the parametric method showed appreciable narrowing of the ranges when the latent abnormal values exclusion method was applied to 10 items closely associated with disorders prevalent among healthy individuals. Sex- and age-related profiles of reference values, derived from individuals with no abnormal results in major tests, showed distinctive patterns specific to each analyte. Common reference intervals for nationwide use were developed for 40 major tests, based on three multicentre studies and advanced statistical methods. Sex- and age-related profiles of reference values are of great relevance not only for interpreting test results, but also for applying the clinical decision limits specified in various clinical guidelines. © The Author(s) 2015.
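    A minimal sketch of the SD-ratio criterion mentioned above for deciding whether subgroup-specific intervals are needed: the between-subgroup SD is divided by the SD representing the reference interval, and partitioning is considered when the ratio is ≥0.50. The data are hypothetical and the between-subgroup SD is a simplified stand-in for the nested-ANOVA estimate used in the study.

      import numpy as np

      # Hypothetical male/female reference values for one analyte.
      rng = np.random.default_rng(1)
      males = rng.normal(loc=52.0, scale=6.0, size=300)
      females = rng.normal(loc=44.0, scale=6.0, size=300)

      pooled = np.concatenate([males, females])
      sd_ri = np.std(pooled, ddof=1)            # SD representing the reference interval

      # Between-subgroup SD, from the deviation of subgroup means about the grand mean.
      grand_mean = pooled.mean()
      sd_between = np.sqrt(np.mean([(males.mean() - grand_mean) ** 2,
                                    (females.mean() - grand_mean) ** 2]))

      sd_ratio = sd_between / sd_ri
      print(f"SD ratio = {sd_ratio:.2f} -> partition by sex: {sd_ratio >= 0.50}")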

  3. Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China.

    PubMed

    Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li'an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling

    2016-03-01

    A nationwide multicenter study was conducted in China to explore sources of variation of reference values and establish reference intervals for 28 common biochemical analytes, as a part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area in China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partition of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box-Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By ANOVA, significant sex-related and age-related differences were observed in 12 and 12 analytes, respectively. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in results of 9 analytes for men and 6 for women. Reference intervals of 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established by use of the latest methodology. Reference intervals of 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas the others can be used as assay system-specific reference intervals in China.

  4. Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China

    PubMed Central

    Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li’an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling

    2016-01-01

    Abstract A nationwide multicenter study was conducted in China to explore sources of variation of reference values and establish reference intervals for 28 common biochemical analytes, as a part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area in China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partition of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box–Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By ANOVA, significant sex-related and age-related differences were observed in 12 and 12 analytes, respectively. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in results of 9 analytes for men and 6 for women. Reference intervals of 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals of 28 common chemistry analytes applicable to the Chinese Han population were established by use of the latest methodology. Reference intervals of 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas the others can be used as assay system-specific reference intervals in China. PMID:26945390

  5. Passive field reflectance measurements

    NASA Astrophysics Data System (ADS)

    Weber, Christian; Schinca, Daniel C.; Tocho, Jorge O.; Videla, Fabian

    2008-10-01

    The results of reflectance measurements performed with a three-band passive radiometer with independent channels for solar irradiance reference are presented. Comparative operation between the traditional method, which uses downward-looking field and reference white panel measurements, and the new approach, involving duplicated downward- and upward-looking spectral channels (each upward-looking channel with its own diffuser), is analyzed. The results indicate that the latter method agrees very well with the standard method and is more suitable for passive sensors under rapidly changing atmospheric conditions (such as clouds, dust, mist, smog and other scatterers), since a more reliable synchronous recording of reference and incident light is achieved. In addition, having separate channels for the reference and the signal allows a better balancing of amplifier gains for each spectral channel. We show results for the normalized difference vegetation index (NDVI) from field experiments conducted during 2004-2007 on weed detection in soybean stubble and fertilizer level assessment in wheat. The method may be used to refine sensor-based nitrogen fertilizer rate recommendations and to determine suitable zones for herbicide applications.
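    A minimal example of the NDVI computation referred to above, using the standard definition NDVI = (NIR - Red)/(NIR + Red) on hypothetical reflectance values (the record does not specify the radiometer's exact bands or processing chain).

      import numpy as np

      # Hypothetical red and near-infrared reflectances for a few field plots.
      red = np.array([0.08, 0.12, 0.30])
      nir = np.array([0.45, 0.40, 0.35])

      ndvi = (nir - red) / (nir + red)
      print(np.round(ndvi, 3))   # dense canopy gives high NDVI, stubble/bare soil low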

  6. [Standard sample preparation method for quick determination of trace elements in plastic].

    PubMed

    Yao, Wen-Qing; Zong, Rui-Long; Zhu, Yong-Fa

    2011-08-01

    A reference sample containing heavy metals at known concentrations in electronic information product plastic was prepared by the masterbatch method; its repeatability and precision were determined, and reference sample preparation procedures were established. X-ray fluorescence spectroscopy (XRF) was used to determine the repeatability and uncertainty in the analysis of heavy metals and bromine in the sample. Working curves and measurement methods for the reference sample were established. The results showed that the method exhibited a very good linear relationship in the 200-2000 mg x kg(-1) concentration range for Hg, Pb, Cr and Br, and in the 20-200 mg x kg(-1) range for Cd, and the repeatability over six replicate analyses was good. In tests of the circuit boards ICB288G and ICB288 from the Mitsubishi Heavy Industry Company, results agreed with the recommended values.

  7. Comparison between the triglycerides standardization of routine methods used in Japan and the chromotropic acid reference measurement procedure used by the CDC Lipid Standardization Programme.

    PubMed

    Nakamura, Masakazu; Iso, Hiroyasu; Kitamura, Akihiko; Imano, Hironori; Noda, Hiroyuki; Kiyama, Masahiko; Sato, Shinichi; Yamagishi, Kazumasa; Nishimura, Kunihiro; Nakai, Michikazu; Vesper, Hubert W; Teramoto, Tamio; Miyamoto, Yoshihiro

    2016-11-01

    Background The US Centers for Disease Control and Prevention ensured adequate performance of the routine triglycerides methods used in Japan by using a chromotropic acid reference measurement procedure from the Centers for Disease Control and Prevention lipid standardization programme as a reference point. We examined standardized data to clarify the performance of routine triglycerides methods. Methods The two routine triglycerides methods were the fluorometric method of Kessler and Lederer and the enzymatic method. The methods were standardized using 495 Centers for Disease Control and Prevention reference pools with 98 different concentrations ranging between 0.37 and 5.15 mmol/L in 141 survey runs. The triglycerides criteria for laboratories performing triglycerides analyses were used: accuracy, as bias ≤5% from the Centers for Disease Control and Prevention reference value, and precision, as measured by CV, ≤5%. Results The correlation of the bias of both methods to the Centers for Disease Control and Prevention reference method was: y (%bias) = 0.516 × (Centers for Disease Control and Prevention reference value) − 1.292 (n = 495, R2 = 0.018). Triglycerides bias at the medical decision points of 1.13, 1.69 and 2.26 mmol/L was −0.71%, −0.42% and −0.13%, respectively. For the combined precision, the equation y (CV) = −0.398 × (triglycerides value) + 1.797 (n = 495, R2 = 0.081) was used. Precision was 1.35%, 1.12% and 0.90%, respectively. Triglycerides measurements at Osaka were shown to be stable for 36 years. Conclusions The epidemiologic laboratory in Japan met acceptable accuracy goals for 88.7% of all samples and acceptable precision goals for 97.8% of all samples measured through the Centers for Disease Control and Prevention lipid standardization programme, and demonstrated stable results over an extended period of time.
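    The two regression equations reported above can be applied directly at the medical decision points; the short sketch below reproduces the quoted bias and CV values.

      decision_points = [1.13, 1.69, 2.26]   # triglycerides, mmol/L

      for tg in decision_points:
          bias_pct = 0.516 * tg - 1.292      # %bias vs. CDC reference value
          cv_pct = -0.398 * tg + 1.797       # combined precision (CV, %)
          print(f"TG {tg:.2f} mmol/L: bias {bias_pct:+.2f}%, CV {cv_pct:.2f}%")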

  8. Comparison between the triglycerides standardization of routine methods used in Japan and the chromotropic acid reference measurement procedure used by the CDC Lipid Standardization Programme

    PubMed Central

    Nakamura, Masakazu; Iso, Hiroyasu; Kitamura, Akihiko; Imano, Hironori; Noda, Hiroyuki; Kiyama, Masahiko; Sato, Shinichi; Yamagishi, Kazumasa; Nishimura, Kunihiro; Nakai, Michikazu; Vesper, Hubert W; Teramoto, Tamio; Miyamoto, Yoshihiro

    2017-01-01

    Background The US Centers for Disease Control and Prevention ensured adequate performance of the routine triglycerides methods used in Japan by a chromotropic acid reference measurement procedure used by the Centers for Disease Control and Prevention lipid standardization programme as a reference point. We examined standardized data to clarify the performance of routine triglycerides methods. Methods The two routine triglycerides methods were the fluorometric method of Kessler and Lederer and the enzymatic method. The methods were standardized using 495 Centers for Disease Control and Prevention reference pools with 98 different concentrations ranging between 0.37 and 5.15 mmol/L in 141 survey runs. The triglycerides criteria for laboratories which perform triglycerides analyses are used: accuracy, as bias ≤5% from the Centers for Disease Control and Prevention reference value and precision, as measured by CV, ≤5%. Results The correlation of the bias of both methods to the Centers for Disease Control and Prevention reference method was: y (%bias) = 0.516 × (Centers for Disease Control and Prevention reference value) −1.292 (n = 495, R2 = 0.018). Triglycerides bias at medical decision points of 1.13, 1.69 and 2.26 mmol/L was −0.71%, −0.42% and −0.13%, respectively. For the combined precision, the equation y (CV) = −0.398 × (triglycerides value) + 1.797 (n = 495, R2 = 0.081) was used. Precision was 1.35%, 1.12% and 0.90%, respectively. It was shown that triglycerides measurements at Osaka were stable for 36 years. Conclusions The epidemiologic laboratory in Japan met acceptable accuracy goals for 88.7% of all samples, and met acceptable precision goals for 97.8% of all samples measured through the Centers for Disease Control and Prevention lipid standardization programme and demonstrated stable results for an extended period of time. PMID:26680645

  9. A global multicenter study on reference values: 1. Assessment of methods for derivation and comparison of reference intervals.

    PubMed

    Ichihara, Kiyoshi; Ozarda, Yesim; Barth, Julian H; Klee, George; Qiu, Ling; Erasmus, Rajiv; Borai, Anwar; Evgina, Svetlana; Ashavaid, Tester; Khan, Dilshad; Schreier, Laura; Rolle, Reynan; Shimizu, Yoshihisa; Kimura, Shogo; Kawano, Reo; Armbruster, David; Mori, Kazuo; Yadav, Binod K

    2017-04-01

    The IFCC Committee on Reference Intervals and Decision Limits coordinated a global multicenter study on reference values (RVs) to explore rational and harmonizable procedures for derivation of reference intervals (RIs) and investigate the feasibility of sharing RIs through evaluation of sources of variation of RVs on a global scale. For the common protocol, rather lenient criteria for reference individuals were adopted to facilitate harmonized recruitment with planned use of the latent abnormal values exclusion (LAVE) method. As of July 2015, 12 countries had completed their study with total recruitment of 13,386 healthy adults. 25 analytes were measured chemically and 25 immunologically. A serum panel with assigned values was measured by all laboratories. RIs were derived by parametric and nonparametric methods. The effect of LAVE methods is prominent in analytes which reflect nutritional status, inflammation and muscular exertion, indicating that inappropriate results are frequent in any country. The validity of the parametric method was confirmed by the presence of analyte-specific distribution patterns and successful Gaussian transformation using the modified Box-Cox formula in all countries. After successful alignment of RVs based on the panel test results, nearly half the analytes showed variable degrees of between-country differences. This finding, however, requires confirmation after adjusting for BMI and other sources of variation. The results are reported in the second part of this paper. The collaborative study enabled us to evaluate rational methods for deriving RIs and comparing the RVs based on real-world datasets obtained in a harmonized manner. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
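    A minimal sketch of a parametric reference-interval derivation of the kind described above: reference values are transformed toward a Gaussian shape with a Box-Cox transformation, the central 95% limits are taken as mean ± 1.96 SD in the transformed space, and the limits are back-transformed. SciPy's standard one-parameter Box-Cox is used here, not the modified two-parameter formula or the LAVE step used in the study, and the data are hypothetical.

      import numpy as np
      from scipy import stats, special

      # Hypothetical right-skewed reference values (e.g., an enzyme activity).
      rng = np.random.default_rng(2)
      values = rng.lognormal(mean=3.0, sigma=0.35, size=500)

      # Box-Cox transformation toward a Gaussian distribution.
      transformed, lmbda = stats.boxcox(values)

      mean, sd = transformed.mean(), transformed.std(ddof=1)
      lower_t, upper_t = mean - 1.96 * sd, mean + 1.96 * sd

      # Back-transform the limits to the original measurement scale.
      lower = special.inv_boxcox(lower_t, lmbda)
      upper = special.inv_boxcox(upper_t, lmbda)
      print(f"parametric reference interval: {lower:.1f} - {upper:.1f}")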

  10. Reference Intervals of Hematology and Clinical Chemistry Analytes for 1-Year-Old Korean Children

    PubMed Central

    Lee, Hye Ryun; Roh, Eun Youn; Chang, Ju Young

    2016-01-01

    Background Reference intervals need to be established according to age. We established reference intervals of hematology and chemistry from community-based healthy 1-yr-old children and analyzed their iron status according to the feeding methods during the first six months after birth. Methods A total of 887 children who received a medical check-up between 2010 and 2014 at Boramae Hospital (Seoul, Korea) were enrolled. A total of 534 children (247 boys and 287 girls) were enrolled as reference individuals after the exclusion of data obtained from children with suspected iron deficiency. Hematology and clinical chemistry analytes were measured, and the reference value of each analyte was estimated by using parametric (mean±2 SD) or nonparametric methods (2.5-97.5th percentile). Iron, total iron-binding capacity, and ferritin were measured, and transferrin saturation was calculated. Results As there were no differences in the mean values between boys and girls, we established the reference intervals for 1-yr-old children regardless of sex. The analysis of serum iron status according to feeding methods during the first six months revealed higher iron, ferritin, and transferrin saturation levels in children exclusively or mainly fed formula than in children exclusively or mainly fed breast milk. Conclusions We established reference intervals of hematology and clinical chemistry analytes from community-based healthy children at one year of age. These reference intervals will be useful for interpreting results of medical check-ups at one year of age. PMID:27374715
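    A short illustration of the two estimators mentioned above, the parametric mean ± 2 SD and the nonparametric 2.5th-97.5th percentiles, applied to hypothetical data.

      import numpy as np

      rng = np.random.default_rng(3)
      hemoglobin = rng.normal(loc=12.0, scale=0.9, size=534)   # hypothetical g/dL values

      # Parametric reference interval (assumes approximate normality).
      mean, sd = hemoglobin.mean(), hemoglobin.std(ddof=1)
      lower_p, upper_p = mean - 2 * sd, mean + 2 * sd

      # Nonparametric reference interval (central 95% of observed values).
      lower_np, upper_np = np.percentile(hemoglobin, [2.5, 97.5])

      print(f"parametric:    {lower_p:.1f} - {upper_p:.1f} g/dL")
      print(f"nonparametric: {lower_np:.1f} - {upper_np:.1f} g/dL")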

  11. An automated and objective method for age partitioning of reference intervals based on continuous centile curves.

    PubMed

    Yang, Qian; Lew, Hwee Yeong; Peh, Raymond Hock Huat; Metz, Michael Patrick; Loh, Tze Ping

    2016-10-01

    Reference intervals are the most commonly used decision support tool when interpreting quantitative laboratory results. They may require partitioning to better describe subpopulations that display significantly different reference values. Partitioning by age is particularly important for the paediatric population since there are marked physiological changes associated with growth and maturation. However, most partitioning methods are either technically complex or require prior knowledge of the underlying physiology/biological variation of the population. There is growing interest in the use of continuous centile curves, which provides seamless laboratory reference values as a child grows, as an alternative to rigidly described fixed reference intervals. However, the mathematical functions that describe these curves can be complex and may not be easily implemented in laboratory information systems. Hence, the use of fixed reference intervals is expected to continue for a foreseeable time. We developed a method that objectively proposes optimised age partitions and reference intervals for quantitative laboratory data (http://research.sph.nus.edu.sg/pp/ppResult.aspx), based on the sum of gradient that best describes the underlying distribution of the continuous centile curves. It is hoped that this method may improve the selection of age intervals for partitioning, which is receiving increasing attention in paediatric laboratory medicine. Copyright © 2016 Royal College of Pathologists of Australasia. Published by Elsevier B.V. All rights reserved.

  12. Analysis of street drugs in seized material without primary reference standards.

    PubMed

    Laks, Suvi; Pelander, Anna; Vuori, Erkki; Ali-Tolppa, Elisa; Sippola, Erkki; Ojanperä, Ilkka

    2004-12-15

    A novel approach was used to analyze street drugs in seized material without primary reference standards. Identification was performed by liquid chromatography/time-of-flight mass spectrometry (LC/TOFMS), essentially based on accurate mass determination using a target library of 735 exact monoisotopic masses. Quantification was carried out by liquid chromatography/chemiluminescence nitrogen detection (LC/CLND) with a single secondary standard (caffeine), utilizing the detector's equimolar response to nitrogen. Sample preparation comprised dilution, first with methanol and further with the LC mobile phase. Altogether 21 seized drug samples were analyzed blind by the present method, and results were compared to accredited reference methods utilizing identification by gas chromatography/mass spectrometry and quantification by gas chromatography or liquid chromatography. The 31 drug findings by LC/TOFMS comprised 19 different drugs-of-abuse, byproducts, and adulterants, including amphetamine and tryptamine designer drugs, with one unresolved pair of compounds having an identical mass. By the reference methods, 27 findings could be confirmed, and among the four unconfirmed findings, only 1 apparent false positive was found. In the quantitative analysis of 11 amphetamine, heroin, and cocaine findings, mean relative difference between the results of LC/CLND and the reference methods was 11% (range 4.2-21%), without any observable bias. Mean relative standard deviation for three parallel LC/CLND results was 6%. Results suggest that the present combination of LC/TOFMS and LC/CLND offers a simple solution for the analysis of scheduled and designer drugs in seized material, independent of the availability of primary reference standards.
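    A minimal sketch of the single-secondary-standard idea behind LC/CLND quantification: because the detector responds (approximately) equimolarly to nitrogen, a caffeine standard of known concentration calibrates the nitrogen response, and an analyte's concentration follows from its peak area, nitrogen count, and molar mass. The peak areas and concentrations below are hypothetical.

      # Hypothetical LC/CLND peak areas and standard data.
      caffeine_conc_mg_L = 100.0
      caffeine_mw = 194.19          # g/mol, C8H10N4O2
      caffeine_n_atoms = 4

      analyte_mw = 135.21           # g/mol, e.g. amphetamine (C9H13N)
      analyte_n_atoms = 1

      area_caffeine = 250000.0
      area_analyte = 90000.0

      # Response factor: peak area per mmol/L of nitrogen.
      caffeine_mmol_N_per_L = caffeine_conc_mg_L / caffeine_mw * caffeine_n_atoms
      rf = area_caffeine / caffeine_mmol_N_per_L

      # Analyte nitrogen concentration, then analyte mass concentration.
      analyte_mmol_N_per_L = area_analyte / rf
      analyte_conc_mg_L = analyte_mmol_N_per_L / analyte_n_atoms * analyte_mw
      print(f"analyte concentration ~ {analyte_conc_mg_L:.1f} mg/L")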

  13. The use of digital PCR to improve the application of quantitative molecular diagnostic methods for tuberculosis.

    PubMed

    Devonshire, Alison S; O'Sullivan, Denise M; Honeyborne, Isobella; Jones, Gerwyn; Karczmarczyk, Maria; Pavšič, Jernej; Gutteridge, Alice; Milavec, Mojca; Mendoza, Pablo; Schimmel, Heinz; Van Heuverswyn, Fran; Gorton, Rebecca; Cirillo, Daniela Maria; Borroni, Emanuele; Harris, Kathryn; Barnard, Marinus; Heydenrych, Anthenette; Ndusilo, Norah; Wallis, Carole L; Pillay, Keshree; Barry, Thomas; Reddington, Kate; Richter, Elvira; Mozioğlu, Erkan; Akyürek, Sema; Yalçınkaya, Burhanettin; Akgoz, Muslum; Žel, Jana; Foy, Carole A; McHugh, Timothy D; Huggett, Jim F

    2016-08-03

    Real-time PCR (qPCR) based methods, such as the Xpert MTB/RIF, are increasingly being used to diagnose tuberculosis (TB). While qualitative methods are adequate for diagnosis, the therapeutic monitoring of TB patients requires quantitative methods, currently performed using smear microscopy. The potential use of quantitative molecular measurements for therapeutic monitoring has been investigated, but findings have been variable and inconclusive. The lack of an adequate reference method and reference materials is a barrier to understanding the source of such disagreement. Digital PCR (dPCR) offers the potential for an accurate method for quantification of specific DNA sequences in reference materials that can be used to evaluate quantitative molecular methods for TB treatment monitoring. To assess a novel approach for the development of quality assurance materials, we used dPCR to quantify specific DNA sequences in a range of prototype reference materials and evaluated accuracy between different laboratories and instruments. The materials were then also used to evaluate the quantitative performance of qPCR and Xpert MTB/RIF in eight clinical testing laboratories. dPCR was found to provide results in good agreement with the other methods tested and to be highly reproducible between laboratories without calibration, even when using different instruments. When the reference materials were analysed with qPCR and Xpert MTB/RIF by clinical laboratories, all laboratories were able to correctly rank the reference materials according to concentration; however, there was a marked difference in the measured magnitude. TB is a disease where quantification of the pathogen could lead to better patient management, and qPCR methods offer the potential to perform such analysis rapidly. However, our findings suggest that when precisely characterised materials are used to evaluate qPCR methods, the measurement result variation is too high to determine whether molecular quantification of Mycobacterium tuberculosis would provide a clinically useful readout. The methods described in this study provide a means by which the technical performance of quantitative molecular methods can be evaluated independently of clinical variability to improve the accuracy of measurement results. This will ultimately increase the likelihood that such approaches could be used to improve patient management of TB.

  14. [Study on ethnic medicine quantitative reference herb, Tibetan medicine fruits of Capsicum frutescens as a case].

    PubMed

    Zan, Ke; Cui, Gan; Guo, Li-Nong; Ma, Shuang-Cheng; Zheng, Jian

    2018-05-01

    The high price and limited availability of reference substances have become obstacles to HPLC assays of ethnic medicines. A new method based on a quantitative reference herb (QRH) was proposed. Specific chromatograms of fruits of Capsicum frutescens were employed to determine peak positions, and an HPLC quantitative reference herb was prepared from fruits of C. frutescens. The content of capsaicin and dihydrocapsaicin in the quantitative reference herb was determined by HPLC. Eleven batches of fruits of C. frutescens were analyzed with the quantitative reference herb and with reference substances, respectively; the results showed no difference. The present method is feasible for quality control of ethnic medicines, and the quantitative reference herb is suitable to replace reference substances in assays. Copyright© by the Chinese Pharmaceutical Association.

  15. Registration-based segmentation with articulated model from multipostural magnetic resonance images for hand bone motion animation.

    PubMed

    Chen, Hsin-Chen; Jou, I-Ming; Wang, Chien-Kuo; Su, Fong-Chin; Sun, Yung-Nien

    2010-06-01

    Quantitative measurements of hand bones, including volume, surface, orientation, and position, are essential in investigating hand kinematics. Within the measurement stage, bone segmentation is the most important step because of its strong influence on measurement accuracy. Since hand bones are small and tubular in shape, magnetic resonance (MR) imaging is prone to artifacts such as nonuniform intensity and fuzzy boundaries, so greater care is required to achieve accurate segmentation. The authors therefore propose a novel registration-based method using an articulated hand model to segment hand bones from multipostural MR images. The proposed method consists of a model construction stage and a registration-based segmentation stage. Given a reference postural image, the first stage constructs a drivable reference model characterized by hand bone shapes, intensity patterns, and an articulated joint mechanism. Applying the reference model in the second stage, the authors first perform a model-based registration, based on intensity distribution similarity, MR bone intensity properties, and constraints of model geometry, to align the reference model with the target bone regions of the given postural image. The resulting surface is then refined to improve the superimposition between the registered reference model and the target bone boundaries. For each subject, given a reference postural image, the proposed method can automatically segment all hand bones from all other postural images. Compared with ground truth from two experts, the resulting surfaces had an average margin of error within 1 mm. The proposed method also showed good agreement on the overlap of bone segmentations, measured by the Dice similarity coefficient, and demonstrated better segmentation results than conventional methods. The proposed registration-based segmentation method can successfully overcome drawbacks caused by inherent artifacts in MR images and obtain more accurate segmentation results automatically. Moreover, realistic hand motion animations can be generated based on the bone segmentation results. The proposed method is helpful for understanding hand bone geometries in dynamic postures, which can be used in simulating 3D hand motion from multipostural MR images.
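    A minimal example of the Dice similarity coefficient used above to quantify overlap between two bone segmentations, computed on hypothetical boolean masks.

      import numpy as np

      def dice(a: np.ndarray, b: np.ndarray) -> float:
          """Dice similarity coefficient between two boolean masks."""
          a, b = a.astype(bool), b.astype(bool)
          intersection = np.logical_and(a, b).sum()
          return 2.0 * intersection / (a.sum() + b.sum())

      # Hypothetical 3D segmentation masks (automatic vs. expert).
      rng = np.random.default_rng(4)
      auto = rng.random((32, 32, 32)) > 0.6
      expert = auto.copy()
      expert[:2] = ~expert[:2]          # perturb a slab to mimic disagreement

      print(f"DSC = {dice(auto, expert):.3f}")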

  16. Comparison of analytical methods for the determination of histamine in reference canned fish samples

    NASA Astrophysics Data System (ADS)

    Jakšić, S.; Baloš, M. Ž.; Mihaljev, Ž.; Prodanov Radulović, J.; Nešić, K.

    2017-09-01

    Two screening methods for histamine in canned fish, an enzymatic test and a competitive direct enzyme-linked immunosorbent assay (CD-ELISA), were compared with the reversed-phase liquid chromatography (RP-HPLC) standard method. For the enzymatic and CD-ELISA methods, determination was conducted according to the producers' manuals. For RP-HPLC, histamine was derivatized with dansyl chloride, followed by RP-HPLC with diode array detection. Results of analysis of canned fish, supplied as reference samples for proficiency testing, showed good agreement when histamine was present at higher concentrations (above 100 mg kg-1). At a lower level (16.95 mg kg-1), the enzymatic test produced somewhat higher results. Overall, analysis of four reference samples by CD-ELISA and RP-HPLC showed good agreement for histamine determination (r=0.977 in the concentration range 16.95-216 mg kg-1). The results show that the applied enzymatic test and CD-ELISA are suitable screening methods for the determination of histamine in canned fish.

  17. Effect of genotyped cows in the reference population on the genomic evaluation of Holstein cattle.

    PubMed

    Uemoto, Y; Osawa, T; Saburi, J

    2017-03-01

    This study evaluated the dependence of reliability and prediction bias on the prediction method, the contribution of including animals (bulls or cows), and the genetic relatedness, when including genotyped cows in the progeny-tested bull reference population. We performed genomic evaluation using a Japanese Holstein population, and assessed the accuracy of genomic enhanced breeding value (GEBV) for three production traits and 13 linear conformation traits. A total of 4564 animals for production traits and 4172 animals for conformation traits were genotyped using Illumina BovineSNP50 array. Single- and multi-step methods were compared for predicting GEBV in genotyped bull-only and genotyped bull-cow reference populations. No large differences in realized reliability and regression coefficient were found between the two reference populations; however, a slight difference was found between the two methods for production traits. The accuracy of GEBV determined by single-step method increased slightly when genotyped cows were included in the bull reference population, but decreased slightly by multi-step method. A validation study was used to evaluate the accuracy of GEBV when 800 additional genotyped bulls (POPbull) or cows (POPcow) were included in the base reference population composed of 2000 genotyped bulls. The realized reliabilities of POPbull were higher than those of POPcow for all traits. For the gain of realized reliability over the base reference population, the average ratios of POPbull gain to POPcow gain for production traits and conformation traits were 2.6 and 7.2, respectively, and the ratios depended on heritabilities of the traits. For regression coefficient, no large differences were found between the results for POPbull and POPcow. Another validation study was performed to investigate the effect of genetic relatedness between cows and bulls in the reference and test populations. The effect of genetic relationship among bulls in the reference population was also assessed. The results showed that it is important to account for relatedness among bulls in the reference population. Our studies indicate that the prediction method, the contribution ratio of including animals, and genetic relatedness could affect the prediction accuracy in genomic evaluation of Holstein cattle, when including genotyped cows in the reference population.

  18. Application of quantitative 1H NMR for the calibration of protoberberine alkaloid reference standards.

    PubMed

    Wu, Yan; He, Yi; He, Wenyi; Zhang, Yumei; Lu, Jing; Dai, Zhong; Ma, Shuangcheng; Lin, Ruichao

    2014-03-01

    Quantitative nuclear magnetic resonance spectroscopy (qNMR) has developed into an important tool in drug analysis, biomacromolecule detection, and metabolism studies. Compared with the mass balance method, qNMR offers some advantages in the calibration of reference standards (RS): it determines the absolute amount of a sample, and another chemical compound with its certified reference material (CRM) can be used as internal standard (IS) to obtain the purity of the sample. Protoberberine alkaloids have many biological activities and have been used as reference standards for the control of many herbal drugs. In the present study, qNMR methods were developed for the calibration of berberine hydrochloride, palmatine hydrochloride, tetrahydropalmatine, and phellodendrine hydrochloride with potassium hydrogen phthalate as IS. Method validation was carried out according to the guidelines for method validation of the Chinese Pharmacopoeia. The results of qNMR were compared with those of the mass balance method, and the differences between the results of the two methods were acceptable based on analysis of the estimated measurement uncertainties. Therefore, qNMR is an effective and reliable analysis method for the calibration of RS and can be used as a good complement to the mass balance method. Copyright © 2013 Elsevier B.V. All rights reserved.
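    A minimal sketch of the standard qNMR internal-standard relation underlying the calibration described above: the analyte purity follows from the ratio of signal integrals, the proton counts of the chosen signals, the molar masses, the weighed masses, and the certified purity of the internal standard. The integrals and masses below are hypothetical (a berberine hydrochloride-like analyte with a one-proton signal against the four aromatic protons of potassium hydrogen phthalate), not values from the paper.

      # Standard qNMR internal-standard equation:
      # P_x = (I_x / I_std) * (N_std / N_x) * (M_x / M_std) * (m_std / m_x) * P_std
      I_x, N_x, M_x, m_x = 0.138, 1, 371.81, 10.05          # analyte: integral, protons, g/mol, mg
      I_std, N_std, M_std, m_std = 1.000, 4, 204.22, 9.87   # potassium hydrogen phthalate
      P_std = 0.9995                                        # certified purity of the IS

      P_x = (I_x / I_std) * (N_std / N_x) * (M_x / M_std) * (m_std / m_x) * P_std
      print(f"estimated purity = {P_x * 100:.2f}%")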

  19. Certification of biological candidates reference materials by neutron activation analysis

    NASA Astrophysics Data System (ADS)

    Kabanov, Denis V.; Nesterova, Yulia V.; Merkulov, Viktor G.

    2018-03-01

    The paper gives the results of interlaboratory certification of new biological candidate reference materials, recommended by the Institute of Nuclear Chemistry and Technology (Warsaw, Poland), by neutron activation analysis. The correctness and accuracy of the applied method were statistically estimated for the determination of trace elements in the candidate reference materials. Irradiation was carried out in the reactor's thermal fuel assembly without the formation of fast neutrons, which excluded the formation of interfering isotopes that would lead to false results. The concentrations of more than 20 elements (e.g., Ba, Br, Ca, Co, Ce, Cr, Cs, Eu, Fe, Hf, La, Lu, Rb, Sb, Sc, Ta, Th, Tb, Yb, U, Zn) were determined in candidate reference materials of tobacco leaves and bottom sediment and compared with certified reference materials. It was shown that the average error of the applied method did not exceed 10%.

  20. MOISTURE IN COTTON BY THE KARL FISCHER TITRATION REFERENCE METHOD

    USDA-ARS?s Scientific Manuscript database

    Moisture is a critical parameter that influences many aspects of cotton fiber from harvesting and ginning to various fiber properties. Because of their importance, reference moisture methods that are more accurate than the existing oven-drying techniques and relatively easy to generate results are ...

  1. A multicenter nationwide reference intervals study for common biochemical analytes in Turkey using Abbott analyzers.

    PubMed

    Ozarda, Yesim; Ichihara, Kiyoshi; Aslan, Diler; Aybek, Hulya; Ari, Zeki; Taneli, Fatma; Coker, Canan; Akan, Pinar; Sisman, Ali Riza; Bahceci, Onur; Sezgin, Nurzen; Demir, Meltem; Yucel, Gultekin; Akbas, Halide; Ozdem, Sebahat; Polat, Gurbuz; Erbagci, Ayse Binnur; Orkmez, Mustafa; Mete, Nuriye; Evliyaoglu, Osman; Kiyici, Aysel; Vatansev, Husamettin; Ozturk, Bahadir; Yucel, Dogan; Kayaalp, Damla; Dogan, Kubra; Pinar, Asli; Gurbilek, Mehmet; Cetinkaya, Cigdem Damla; Akin, Okhan; Serdar, Muhittin; Kurt, Ismail; Erdinc, Selda; Kadicesme, Ozgur; Ilhan, Necip; Atali, Dilek Sadak; Bakan, Ebubekir; Polat, Harun; Noyan, Tevfik; Can, Murat; Bedir, Abdulkerim; Okuyucu, Ali; Deger, Orhan; Agac, Suret; Ademoglu, Evin; Kaya, Ayşem; Nogay, Turkan; Eren, Nezaket; Dirican, Melahat; Tuncer, GulOzlem; Aykus, Mehmet; Gunes, Yeliz; Ozmen, Sevda Unalli; Kawano, Reo; Tezcan, Sehavet; Demirpence, Ozlem; Degirmen, Elif

    2014-12-01

    A nationwide multicenter study was organized to establish reference intervals (RIs) in the Turkish population for 25 commonly tested biochemical analytes and to explore sources of variation in reference values, including regionality. Blood samples were collected nationwide in 28 laboratories from the seven regions (≥400 samples/region, 3066 in all). The sera were analyzed collectively at Uludag University in Bursa using Abbott reagents and an Abbott analyzer. Reference materials were used for standardization of test results. After secondary exclusion using the latent abnormal values exclusion method, RIs were derived by a parametric method employing the modified Box-Cox formula and compared with RIs derived by the non-parametric method. Three-level nested ANOVA was used to evaluate variations among sexes, ages and regions. Associations between test results and age, body mass index (BMI) and region were determined by multiple regression analysis (MRA). By ANOVA, differences in reference values among the seven regions were not significant for any of the 25 analytes. Significant sex-related and age-related differences were observed for 10 and seven analytes, respectively. MRA revealed BMI-related changes in results for uric acid, glucose, triglycerides, high-density lipoprotein (HDL)-cholesterol, alanine aminotransferase, and γ-glutamyltransferase. Their RIs were thus derived by applying stricter criteria excluding individuals with BMI >28 kg/m2. Ranges of RIs by the non-parametric method were wider than those by the parametric method, especially for the analytes affected by BMI. With the lack of regional differences and the well-standardized status of test results, the RIs derived from this nationwide study can be used for the entire Turkish population.

  2. Idiographic duo-trio tests using a constant-reference based on preference of each consumer: Sample presentation sequence in difference test can be customized for individual consumers to reduce error.

    PubMed

    Kim, Min-A; Sim, Hye-Min; Lee, Hye-Seong

    2016-11-01

    As reformulations and processing changes are increasingly needed in the food industry to produce healthier, more sustainable, and cost-effective products while maintaining superior quality, reliable measurements of consumers' sensory perception and discrimination are becoming more critical. Consumer discrimination methods using a preferred-reference duo-trio test design have been shown to improve discrimination performance by customizing sample presentation sequences. However, this design can add complexity to the discrimination task for some consumers, resulting in more errors in sensory discrimination. The objective of the present study was to investigate the effects of different types of test instructions with the preferred-reference duo-trio test design, in which a paired-preference test is followed by 6 repeated preferred-reference duo-trio tests, in comparison with the analytical method using the balanced-reference duo-trio. Analyses of d' estimates (a product-related measure) and of probabilistic sensory discriminators in momentary numbers of subjects showing statistical significance (a subject-related measure) revealed that only the preferred-reference duo-trio test using affective reference-framing, either by providing no information about the reference or information that it was a previously preferred sample, improved sensory discrimination more than the analytical method. No decrease in discrimination performance was observed with any type of instruction, confirming that consumers could handle the test methods. These results suggest that when repeated tests are feasible, using the affective discrimination method would be operationally more efficient as well as ecologically more reliable for measuring consumers' sensory discrimination ability. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Spectroscopic diagnostics for bacteria in biologic sample

    DOEpatents

    El-Sayed, Mostafa A.; El-Sayed, Ivan H.

    2002-01-01

    A method to analyze and diagnose specific bacteria in a biologic sample using spectroscopy is disclosed. The method includes obtaining the spectra of a biologic sample of a non-infected patient for use as a reference, subtracting the reference from the spectra of an infected sample, and comparing the fingerprint regions of the resulting differential spectrum with reference spectra of bacteria in saline. Using this diagnostic technique, specific bacteria can be identified sooner and without culturing, bacteria-specific antibiotics can be prescribed sooner, resulting in decreased likelihood of antibiotic resistance and an overall reduction of medical costs.
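    A minimal sketch of the differential-spectrum idea in the patent record above: subtract a non-infected reference spectrum from the infected-sample spectrum and compare the fingerprint region of the difference with library spectra of bacteria in saline. All spectra here are synthetic stand-ins, not real measurement data.

      import numpy as np

      rng = np.random.default_rng(5)
      wavenumbers = np.linspace(600, 1800, 600)          # fingerprint region, cm^-1

      def peak(center, width=15.0, height=1.0):
          return height * np.exp(-0.5 * ((wavenumbers - center) / width) ** 2)

      # Synthetic library of bacterial spectra in saline and a host background.
      library = {
          "E. coli":   peak(995) + peak(1240, height=0.6),
          "S. aureus": peak(1080) + peak(1450, height=0.7),
      }
      background = peak(1650, width=60, height=2.0)       # host/biologic background

      infected = background + 0.8 * library["S. aureus"] + 0.02 * rng.normal(size=wavenumbers.size)
      reference = background + 0.02 * rng.normal(size=wavenumbers.size)

      differential = infected - reference                 # remove host background

      # Identify the best match by correlation in the fingerprint region.
      scores = {name: np.corrcoef(differential, spec)[0, 1] for name, spec in library.items()}
      print(max(scores, key=scores.get), scores)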

  4. The impact of change in albumin assay on reference intervals, prevalence of 'hypoalbuminaemia' and albumin prescriptions.

    PubMed

    Coley-Grant, Deon; Herbert, Mike; Cornes, Michael P; Barlow, Ian M; Ford, Clare; Gama, Rousseau

    2016-01-01

    We studied the impact on reference intervals, on the classification of patients with hypoalbuminaemia, and on albumin infusion prescriptions of changing from a bromocresol green (BCG) to a bromocresol purple (BCP) serum albumin assay. Passing-Bablok regression analysis and a Bland-Altman plot were used to compare the Abbott BCP and Roche BCG methods. Linear regression analysis was used to compare in-house and external laboratory Abbott BCP serum albumin results. Reference intervals for Abbott BCP serum albumin were derived in two different laboratories using pathology data from adult patients in primary care. Prescriptions for 20% albumin infusions were compared for one year before and one year after changing the albumin method. The Abbott BCP assay had a negative bias of approximately 6 g/L compared with the Roche BCG method. There was good agreement (y = 1.04x - 1.03; R(2) = 0.9933) between in-house and external laboratory Abbott BCP results. Reference intervals for the serum albumin Abbott BCP assay were 31-45 g/L, different from those recommended by Pathology Harmony and the manufacturers (35-50 g/L). Following the change in method there was a large increase in the proportion of patients classified as hypoalbuminaemic using Pathology Harmony reference intervals (32%) compared with the previous year (12%), but not when retrospectively compared using locally derived reference intervals (16%). The method change was associated with a 44.6% increase in albumin prescriptions, equating to an annual increase in expenditure of £35,234. We suggest that serum albumin reference intervals be method specific to prevent misclassification of albumin status in patients. A change in albumin methodology may have significant impact on hospital resources. © The Author(s) 2015.
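    A short sketch of the Bland-Altman comparison mentioned above: the mean bias and 95% limits of agreement between paired BCG and BCP albumin results. The data are simulated to mimic the reported negative bias of roughly 6 g/L for BCP; they are not the study's data.

      import numpy as np

      rng = np.random.default_rng(6)
      bcg = rng.normal(loc=38.0, scale=5.0, size=100)          # hypothetical Roche BCG, g/L
      bcp = bcg - 6.0 + rng.normal(scale=1.5, size=100)        # Abbott BCP ~6 g/L lower

      diff = bcp - bcg
      bias = diff.mean()
      loa = 1.96 * diff.std(ddof=1)
      print(f"mean bias = {bias:.1f} g/L, 95% limits of agreement = "
            f"{bias - loa:.1f} to {bias + loa:.1f} g/L")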

  5. Determination of tungsten in geochemical reference material basalt Columbia River 2 by radiochemical neutron activation analysis and inductively coupled plasma mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrison, Samuel S.; Beck, Chelsie L.; Bowen, James M.

    Environmental tungsten (W) analyses are inhibited by a lack of reference materials and practical methods to remove isobaric and radiometric interferences. We present a method that evaluates the potential use of commercially available sediment, Basalt Columbia River-2 (BCR-2), as a reference material using neutron activation analysis (NAA) and mass spectrometry. Tungsten concentrations using both methods are in statistical agreement at the 95% confidence interval (92 ± 4 ng/g for NAA and 100 ± 7 ng/g for mass spectrometry) with recoveries greater than 95%. These results indicate that BCR-2 may be suitable as a reference material for future studies.

  6. Rapid iterative reanalysis for automated design

    NASA Technical Reports Server (NTRS)

    Bhatia, K. G.

    1973-01-01

    A method for iterative reanalysis in automated structural design is presented for a finite-element analysis using the direct stiffness approach. A basic feature of the method is that the generalized stiffness and inertia matrices are expressed as functions of structural design parameters, and these generalized matrices are expanded in Taylor series about the initial design. Only the linear terms are retained in the expansions. The method is approximate because it uses static condensation, modal reduction, and the linear Taylor series expansions. The exact linear representation of the expansions of the generalized matrices is also described, and a basis for the present method is established. Results of applying the present method to the recalculation of the natural frequencies of two simple plate-like structural models are presented and compared with results obtained using a commonly applied analysis procedure as a reference. In general, the results are in good agreement. A comparison of the computer times required by the present method and the reference method indicated that the present method required substantially less time for reanalysis. Although the results presented are for relatively small-order problems, the present method will become more efficient relative to the reference method as the problem size increases. An extension of the present method to static reanalysis is described, and a basis for unifying the static and dynamic reanalysis procedures is presented.
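    A minimal numerical sketch of the reanalysis idea described above, under stated simplifications: the stiffness matrix of a toy 2-DOF system is approximated by a first-order Taylor expansion about the initial design, and the natural frequencies are recomputed from the generalized eigenproblem. This is an illustration only, not the paper's finite-element formulation with static condensation and modal reduction.

      import numpy as np
      from scipy.linalg import eigh

      # Toy 2-DOF model: part of the stiffness scales with thickness^3 (plate-like).
      K_base = np.array([[2.0, -1.0],
                         [-1.0, 1.0]])
      K_fixed = np.array([[1.0, 0.0],
                          [0.0, 0.5]])
      M = np.diag([1.0, 2.0])

      def stiffness(t):
          return t**3 * K_base + K_fixed

      t0, dt = 1.0, 0.15                     # initial thickness and design change

      # First-order Taylor expansion of K about t0: dK/dt = 3 * t0^2 * K_base.
      K_approx = stiffness(t0) + 3.0 * t0**2 * K_base * dt

      # Natural frequencies (rad/s) from the generalized eigenproblem K x = w^2 M x.
      w2_exact, _ = eigh(stiffness(t0 + dt), M)
      w2_approx, _ = eigh(K_approx, M)
      print("exact :", np.sqrt(w2_exact))
      print("approx:", np.sqrt(w2_approx))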

  7. Preliminary comparative assessment of PM10 hourly measurement results from new monitoring stations type using stochastic and exploratory methodology and models

    NASA Astrophysics Data System (ADS)

    Czechowski, Piotr Oskar; Owczarek, Tomasz; Badyda, Artur; Majewski, Grzegorz; Rogulski, Mariusz; Ogrodnik, Paweł

    2018-01-01

    The paper presents selected key issues from the preliminary stage of a proposed extended equivalence assessment for new portable devices: the comparability of hourly PM10 concentration series with reference station measurements, evaluated with statistical methods. Technical aspects of the new portable meters are presented. The emphasis is placed on assessing the comparability of results using a methodology of stochastic and exploratory methods and models. The concept is based on the observation that simple comparability of result series in the time domain is insufficient; the comparison of regularity should be done in three complementary fields of statistical modeling: time, frequency and space. The proposal is based on modelling results for five annual series of measurements from the new mobile devices and the WIOS (Provincial Environmental Protection Inspectorate) reference station located in the city of Nowy Sacz. The results obtained indicate both the completeness of the comparison methodology and the high correspondence of the new devices' measurements with the reference station.

  8. Reference layer adaptive filtering (RLAF) for EEG artifact reduction in simultaneous EEG-fMRI

    NASA Astrophysics Data System (ADS)

    Steyrl, David; Krausz, Gunther; Koschutnig, Karl; Edlinger, Günter; Müller-Putz, Gernot R.

    2017-04-01

    Objective. Simultaneous electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) combines advantages of both methods, namely the high temporal resolution of EEG and the high spatial resolution of fMRI. However, EEG quality is limited due to severe artifacts caused by fMRI scanners. Approach. To improve EEG data quality substantially, we introduce methods that use a reusable reference layer EEG cap prototype in combination with adaptive filtering. The first method, reference layer adaptive filtering (RLAF), uses adaptive filtering with reference layer artifact data to optimize artifact subtraction from the EEG. In the second method, multi-band reference layer adaptive filtering (MBRLAF), adaptive filtering is performed on bandwidth-limited sub-bands of the EEG and the reference channels. Main results. The results suggest that RLAF outperforms the baseline method, average artifact subtraction, in all settings, and also its direct predecessor, reference layer artifact subtraction (RLAS), in lower (<35 Hz) frequency ranges. MBRLAF is computationally more demanding than RLAF, but highly effective in all EEG frequency ranges. Effectiveness is determined by visual inspection, as well as by root-mean-square voltage reduction and power reduction of the EEG, provided that physiological EEG components such as occipital EEG alpha power and visual evoked potentials (VEP) are preserved. We demonstrate that both RLAF and MBRLAF improve VEP quality. For that, we calculate the mean squared distance of single-trial VEPs to the mean VEP and estimate single-trial VEP classification accuracies. We found that the average mean squared distance is lowest and the average classification accuracy is highest after MBRLAF, with RLAF second best. Significance. In conclusion, the results suggest that RLAF and MBRLAF are potentially very effective in improving EEG quality in simultaneous EEG-fMRI. Highlights: We present a new and reusable reference layer cap prototype for simultaneous EEG-fMRI; we introduce new algorithms for reducing EEG artifacts due to simultaneous fMRI; the algorithms combine a reference layer and adaptive filtering; several evaluation criteria suggest superior effectiveness in terms of artifact reduction; we demonstrate that physiological EEG components are preserved.
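    A minimal sketch of the adaptive-filtering step at the core of the approach above, under simplifying assumptions: a reference-layer channel that records (mostly) artifact is scaled by a normalized-LMS-adapted weight and subtracted from the scalp EEG channel. This is a single-tap filter on synthetic signals; the published method uses multi-channel reference-layer data and, for MBRLAF, sub-band decomposition.

      import numpy as np

      rng = np.random.default_rng(7)
      fs, n = 250, 5000
      t = np.arange(n) / fs

      eeg = 10e-6 * np.sin(2 * np.pi * 10 * t)                  # 10 Hz alpha, 10 uV
      artifact = 200e-6 * np.sign(np.sin(2 * np.pi * 1.0 * t))  # scanner-related artifact (toy)
      scalp = eeg + 1.3 * artifact + 2e-6 * rng.normal(size=n)  # scalp channel
      ref = artifact + 2e-6 * rng.normal(size=n)                # reference-layer channel

      # Single-tap normalized LMS filter: estimate the artifact scaling in the
      # scalp channel from the reference channel and subtract it sample by sample.
      w, mu, eps = 0.0, 0.05, 1e-12
      cleaned = np.empty(n)
      for i in range(n):
          e = scalp[i] - w * ref[i]                     # error = artifact-reduced sample
          w += mu * e * ref[i] / (eps + ref[i] ** 2)    # NLMS weight update
          cleaned[i] = e

      print(f"learned weight ~ {w:.2f} (true scaling 1.3)")
      print(f"RMS before: {np.sqrt(np.mean(scalp**2))*1e6:.1f} uV, "
            f"after: {np.sqrt(np.mean(cleaned**2))*1e6:.1f} uV")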

  9. Comparison of three commercially available fit-test methods.

    PubMed

    Janssen, Larry L; Luinenburg, D Michael; Mullins, Haskell E; Nelson, Thomas J

    2002-01-01

    American National Standards Institute (ANSI) standard Z88.10, Respirator Fit Testing Methods, includes criteria to evaluate new fit-tests. The standard allows generated aerosol, particle counting, or controlled negative pressure quantitative fit-tests to be used as the reference method to determine acceptability of a new test. This study examined (1) comparability of three Occupational Safety and Health Administration-accepted fit-test methods, all of which were validated using generated aerosol as the reference method; and (2) the effect of the reference method on the apparent performance of a fit-test method under evaluation. Sequential fit-tests were performed using the controlled negative pressure and particle counting quantitative fit-tests and the bitter aerosol qualitative fit-test. Of 75 fit-tests conducted with each method, the controlled negative pressure method identified 24 failures; bitter aerosol identified 22 failures; and the particle counting method identified 15 failures. The sensitivity of each method, that is, agreement with the reference method in identifying unacceptable fits, was calculated using each of the other two methods as the reference. None of the test methods met the ANSI sensitivity criterion of 0.95 or greater when compared with either of the other two methods. These results demonstrate that (1) the apparent performance of any fit-test depends on the reference method used, and (2) the fit-tests evaluated use different criteria to identify inadequately fitting respirators. Although "acceptable fit" cannot be defined in absolute terms at this time, the ability of existing fit-test methods to reject poor fits can be inferred from workplace protection factor studies.
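
    The sensitivity figure used above (agreement with the reference method in identifying unacceptable fits) can be computed as in this sketch; the pass/fail arrays are hypothetical:

    ```python
    # Sensitivity = fraction of fits failed by the reference method that the
    # candidate fit-test method also fails. ANSI Z88.10 asks for >= 0.95.
    import numpy as np

    def sensitivity(candidate_fail, reference_fail):
        candidate_fail = np.asarray(candidate_fail, dtype=bool)
        reference_fail = np.asarray(reference_fail, dtype=bool)
        agreed_failures = np.sum(candidate_fail & reference_fail)
        return agreed_failures / np.sum(reference_fail)

    # Hypothetical paired results for 10 fit-tests (True = failed fit).
    candidate = [True, False, True, False, False, True, False, False, True, False]
    reference = [True, True, True, False, False, True, False, False, False, False]
    print(f"sensitivity = {sensitivity(candidate, reference):.2f}")
    ```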

  10. Reveal Listeria 2.0 test for detection of Listeria spp. in foods and environmental samples.

    PubMed

    Alles, Susan; Curry, Stephanie; Almy, David; Jagadeesan, Balamurugan; Rice, Jennifer; Mozola, Mark

    2012-01-01

    A Performance Tested Method validation study was conducted for a new lateral flow immunoassay (Reveal Listeria 2.0) for detection of Listeria spp. in foods and environmental samples. Results of inclusivity testing showed that the test detects all species of Listeria, with the exception of L. grayi. In exclusivity testing conducted under nonselective growth conditions, all non-listeriae tested produced negative Reveal assay results, except for three strains of Lactobacillus spp. However, these lactobacilli are inhibited by the selective Listeria Enrichment Single Step broth enrichment medium used with the Reveal method. Six foods were tested in parallel by the Reveal method and the U.S. Food and Drug Administration/Bacteriological Analytical Manual (FDA/BAM) reference culture procedure. Considering data from both internal and independent laboratory trials, overall sensitivity of the Reveal method relative to that of the FDA/BAM procedure was 101%. Four foods were tested in parallel by the Reveal method and the U.S. Department of Agriculture-Food Safety and Inspection Service (USDA-FSIS) reference culture procedure. Overall sensitivity of the Reveal method relative to that of the USDA-FSIS procedure was 98.2%. There were no statistically significant differences in the number of positives obtained by the Reveal and reference culture procedures in any food trials. In testing of swab or sponge samples from four types of environmental surfaces, sensitivity of Reveal relative to that of the USDA-FSIS reference culture procedure was 127%. For two surface types, differences in the number of positives obtained by the Reveal and reference methods were statistically significant, with more positives by the Reveal method in both cases. Specificity of the Reveal assay was 100%, as there were no unconfirmed positive results obtained in any phase of the testing. Results of ruggedness experiments showed that the Reveal assay is tolerant of modest deviations in test sample volume and device incubation time.

  11. Two new computational methods for universal DNA barcoding: a benchmark using barcode sequences of bacteria, archaea, animals, fungi, and land plants.

    PubMed

    Tanabe, Akifumi S; Toju, Hirokazu

    2013-01-01

    Taxonomic identification of biological specimens based on DNA sequence information (a.k.a. DNA barcoding) is becoming increasingly common in biodiversity science. Although several methods have been proposed, many of them are not universally applicable due to the need for prerequisite phylogenetic/machine-learning analyses, the need for huge computational resources, or the lack of a firm theoretical background. Here, we propose two new computational methods of DNA barcoding and show a benchmark for bacterial/archeal 16S, animal COX1, fungal internal transcribed spacer, and three plant chloroplast (rbcL, matK, and trnH-psbA) barcode loci that can be used to compare the performance of existing and new methods. The benchmark was performed under two alternative situations: query sequences were available in the corresponding reference sequence databases in one, but were not available in the other. In the former situation, the commonly used "1-nearest-neighbor" (1-NN) method, which assigns the taxonomic information of the most similar sequences in a reference database (i.e., BLAST-top-hit reference sequence) to a query, displays the highest rate and highest precision of successful taxonomic identification. However, in the latter situation, the 1-NN method produced extremely high rates of misidentification for all the barcode loci examined. In contrast, one of our new methods, the query-centric auto-k-nearest-neighbor (QCauto) method, consistently produced low rates of misidentification for all the loci examined in both situations. These results indicate that the 1-NN method is most suitable if the reference sequences of all potentially observable species are available in databases; otherwise, the QCauto method returns the most reliable identification results. The benchmark results also indicated that the taxon coverage of reference sequences is far from complete for genus or species level identification in all the barcode loci examined. Therefore, we need to accelerate the registration of reference barcode sequences to apply high-throughput DNA barcoding to genus or species level identification in biodiversity research.
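
    A toy sketch of the 1-NN assignment idea described above — give the query the taxonomy of its most similar reference sequence. Real pipelines use BLAST scores; the simple identity measure and the tiny reference dictionary here are illustrative assumptions:

    ```python
    # 1-NN taxonomic assignment: the query inherits the taxonomy of the nearest
    # reference sequence. Sequence identity stands in for a BLAST-style score.
    def identity(a, b):
        return sum(x == y for x, y in zip(a, b)) / max(len(a), len(b))

    def assign_1nn(query, reference_db):
        """reference_db maps sequence -> taxonomy string (hypothetical structure)."""
        best_seq = max(reference_db, key=lambda seq: identity(query, seq))
        return reference_db[best_seq], identity(query, best_seq)

    refs = {"ATGCCGTA": "Genus_A species_1", "TTGAAGCA": "Genus_B species_2"}
    print(assign_1nn("ATGCCGTT", refs))   # -> ('Genus_A species_1', 0.875)
    ```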

  12. Two New Computational Methods for Universal DNA Barcoding: A Benchmark Using Barcode Sequences of Bacteria, Archaea, Animals, Fungi, and Land Plants

    PubMed Central

    Tanabe, Akifumi S.; Toju, Hirokazu

    2013-01-01

    Taxonomic identification of biological specimens based on DNA sequence information (a.k.a. DNA barcoding) is becoming increasingly common in biodiversity science. Although several methods have been proposed, many of them are not universally applicable due to the need for prerequisite phylogenetic/machine-learning analyses, the need for huge computational resources, or the lack of a firm theoretical background. Here, we propose two new computational methods of DNA barcoding and show a benchmark for bacterial/archeal 16S, animal COX1, fungal internal transcribed spacer, and three plant chloroplast (rbcL, matK, and trnH-psbA) barcode loci that can be used to compare the performance of existing and new methods. The benchmark was performed under two alternative situations: query sequences were available in the corresponding reference sequence databases in one, but were not available in the other. In the former situation, the commonly used “1-nearest-neighbor” (1-NN) method, which assigns the taxonomic information of the most similar sequences in a reference database (i.e., BLAST-top-hit reference sequence) to a query, displays the highest rate and highest precision of successful taxonomic identification. However, in the latter situation, the 1-NN method produced extremely high rates of misidentification for all the barcode loci examined. In contrast, one of our new methods, the query-centric auto-k-nearest-neighbor (QCauto) method, consistently produced low rates of misidentification for all the loci examined in both situations. These results indicate that the 1-NN method is most suitable if the reference sequences of all potentially observable species are available in databases; otherwise, the QCauto method returns the most reliable identification results. The benchmark results also indicated that the taxon coverage of reference sequences is far from complete for genus or species level identification in all the barcode loci examined. Therefore, we need to accelerate the registration of reference barcode sequences to apply high-throughput DNA barcoding to genus or species level identification in biodiversity research. PMID:24204702

  13. Electrophysiological Responses to Expectancy Violations in Semantic and Gambling Tasks: A Comparison of Different EEG Reference Approaches

    PubMed Central

    Li, Ya; Wang, Yongchun; Zhang, Baoqiang; Wang, Yonghui; Zhou, Xiaolin

    2018-01-01

    Dynamically evaluating the outcomes of our actions and thoughts is a fundamental cognitive ability. Given its excellent temporal resolution, the event-related potential (ERP) technology has been used to address this issue. The feedback-related negativity (FRN) component of ERPs has been studied intensively with the averaged linked mastoid reference method (LM). However, it is unknown whether FRN can be induced by an expectancy violation in an antonym relations context and whether LM is the most suitable reference approach. To address these issues, the current research directly compared the ERP components induced by expectancy violations in antonym expectation and gambling tasks with a within-subjects design and investigated the effect of the reference approach on the experimental effects. Specifically, we systematically compared the influence of the LM, reference electrode standardization technique (REST) and average reference (AVE) approaches on the amplitude, scalp distribution and magnitude of ERP effects as a function of expectancy violation type. The expectancy deviation in the antonym expectation task elicited an N400 effect that differed from the FRN effect induced in the gambling task; this difference was confirmed by all the three reference methods. Both the amplitudes of the ERP effects (N400 and FRN) and the magnitude as the expectancy violation increased were greater under the LM approach than those under the REST approach, followed by those under the AVE approach. Based on the statistical results, the electrode sites that showed the N400 and FRN effects critically depended on the reference method, and the results of the REST analysis were consistent with previous ERP studies. Combined with evidence from simulation studies, we suggest that REST is an optional reference method to be used in future ERP data analysis. PMID:29615858
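
    A minimal re-referencing sketch for two of the compared schemes (linked mastoids and average reference); REST requires a volume-conductor model and is not shown. The array layout and mastoid channel labels are assumptions, and this is not the paper's analysis code:

    ```python
    # Re-reference a channels x samples EEG array recorded against some original
    # reference. "M1"/"M2" are assumed mastoid channel labels.
    import numpy as np

    def rereference(data, ch_names, scheme="average"):
        data = np.asarray(data, dtype=float)
        if scheme == "average":                      # AVE: subtract the common average
            return data - data.mean(axis=0, keepdims=True)
        if scheme == "linked_mastoids":              # LM: subtract the mean of the two mastoids
            m = [ch_names.index("M1"), ch_names.index("M2")]
            return data - data[m].mean(axis=0, keepdims=True)
        raise ValueError("REST needs a volume-conductor model and is not sketched here")
    ```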

  14. De novo assembly of highly polymorphic metagenomic data using in situ generated reference sequences and a novel BLAST-based assembly pipeline.

    PubMed

    Lin, You-Yu; Hsieh, Chia-Hung; Chen, Jiun-Hong; Lu, Xuemei; Kao, Jia-Horng; Chen, Pei-Jer; Chen, Ding-Shinn; Wang, Hurng-Yi

    2017-04-26

    The accuracy of metagenomic assembly is usually compromised by high levels of polymorphism due to divergent reads from the same genomic region recognized as different loci when sequenced and assembled together. A viral quasispecies is a group of abundant and diversified genetically related viruses found in a single carrier. Current mainstream assembly methods, such as Velvet and SOAPdenovo, were not originally intended for the assembly of such metagenomics data, and therefore demands for new methods to provide accurate and informative assembly results for metagenomic data. In this study, we present a hybrid method for assembling highly polymorphic data combining the partial de novo-reference assembly (PDR) strategy and the BLAST-based assembly pipeline (BBAP). The PDR strategy generates in situ reference sequences through de novo assembly of a randomly extracted partial data set which is subsequently used for the reference assembly for the full data set. BBAP employs a greedy algorithm to assemble polymorphic reads. We used 12 hepatitis B virus quasispecies NGS data sets from a previous study to assess and compare the performance of both PDR and BBAP. Analyses suggest the high polymorphism of a full metagenomic data set leads to fragmentized de novo assembly results, whereas the biased or limited representation of external reference sequences included fewer reads into the assembly with lower assembly accuracy and variation sensitivity. In comparison, the PDR generated in situ reference sequence incorporated more reads into the final PDR assembly of the full metagenomics data set along with greater accuracy and higher variation sensitivity. BBAP assembly results also suggest higher assembly efficiency and accuracy compared to other assembly methods. Additionally, BBAP assembly recovered HBV structural variants that were not observed amongst assembly results of other methods. Together, PDR/BBAP assembly results were significantly better than other compared methods. Both PDR and BBAP independently increased the assembly efficiency and accuracy of highly polymorphic data, and assembly performances were further improved when used together. BBAP also provides nucleotide frequency information. Together, PDR and BBAP provide powerful tools for metagenomic data studies.

  15. Torque ripple reduction of brushless DC motor based on adaptive input-output feedback linearization.

    PubMed

    Shirvani Boroujeni, M; Markadeh, G R Arab; Soltani, J

    2017-09-01

    Torque ripple reduction of Brushless DC Motors (BLDCs) is an interesting subject in variable speed AC drives. In this paper, a mathematical expression for the torque ripple harmonics is first obtained. Then, for a non-ideal BLDC motor with known harmonic content of the back-EMF, the calculation of the desired reference current amplitudes required to eliminate selected torque ripple harmonics is reviewed. In order to inject the reference harmonic currents into the motor windings, an Adaptive Input-Output Feedback Linearization (AIOFBL) control is proposed, which generates the reference voltages for a three-phase voltage source inverter in the stationary reference frame. Experimental results are presented to show the capability and validity of the proposed control method and are compared with the results of vector control in the Multi-Reference Frame (MRF) and of the Pseudo-Vector Control (P-VC) method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  16. Contributions of satellite-determined gravity results in geodesy

    NASA Technical Reports Server (NTRS)

    Khan, M. A.

    1974-01-01

    Different forms of the theoretical gravity formula are summarized and methods of standardization of gravity anomalies obtained from satellite gravity and terrestrial gravity data are discussed in the context of the three most commonly used reference figures, i.e., the International Reference Ellipsoid, the Reference Ellipsoid 1967, and the Equilibrium Reference Ellipsoid. These methods are important in the comparison and combination of satellite gravity and gravimetric data as well as the integration of surface gravity data, collected with different objectives, in a single reference system. For ready reference, tables for such reductions are computed. The nature of the satellite gravity anomalies is examined to aid the geophysical and geodetic interpretation of these anomalies in terms of the tectonic features of the earth and the structure of the earth's crust and mantle. Computation of the Potsdam correction from the satellite-determined geopotential is reviewed. The contribution of the satellite gravity results in decomposing the total observed gravity anomaly into components of geophysical interest is discussed. Recent work on possible temporal variations in the geogravity field is briefly reviewed.

  17. Validation of a modification to Performance-Tested Method 010403: microwell DNA hybridization assay for detection of Listeria spp. in selected foods and selected environmental surfaces.

    PubMed

    Alles, Susan; Peng, Linda X; Mozola, Mark A

    2009-01-01

    A modification to Performance-Tested Method 010403, GeneQuence Listeria Test (DNAH method), is described. The modified method uses a new media formulation, LESS enrichment broth, in single-step enrichment protocols for both foods and environmental sponge and swab samples. Food samples are enriched for 27-30 h at 30 degrees C, and environmental samples for 24-48 h at 30 degrees C. Implementation of these abbreviated enrichment procedures allows test results to be obtained on a next-day basis. In testing of 14 food types in internal comparative studies with inoculated samples, there were statistically significant differences in method performance between the DNAH method and reference culture procedures for only 2 foods (pasteurized crab meat and lettuce) at the 27 h enrichment time point and for only a single food (pasteurized crab meat) in one trial at the 30 h enrichment time point. Independent laboratory testing with 3 foods showed statistical equivalence between the methods for all foods, and results support the findings of the internal trials. Overall, considering both internal and independent laboratory trials, sensitivity of the DNAH method relative to the reference culture procedures was 90.5%. Results of testing 5 environmental surfaces inoculated with various strains of Listeria spp. showed that the DNAH method was more productive than the reference U.S. Department of Agriculture-Food Safety and Inspection Service (USDA-FSIS) culture procedure for 3 surfaces (stainless steel, plastic, and cast iron), whereas results were statistically equivalent to the reference method for the other 2 surfaces (ceramic tile and sealed concrete). An independent laboratory trial with ceramic tile inoculated with L. monocytogenes confirmed the effectiveness of the DNAH method at the 24 h time point. Overall, sensitivity of the DNAH method at 24 h relative to that of the USDA-FSIS method was 152%. The DNAH method exhibited extremely high specificity, with only 1% false-positive reactions overall.

  18. Reveal Salmonella 2.0 test for detection of Salmonella spp. in foods and environmental samples. Performance Tested Method 960801.

    PubMed

    Hoerner, Rebecca; Feldpausch, Jill; Gray, R Lucas; Curry, Stephanie; Islam, Zahidul; Goldy, Tim; Klein, Frank; Tadese, Theodros; Rice, Jennifer; Mozola, Mark

    2011-01-01

    Reveal Salmonella 2.0 is an improved version of the original Reveal Salmonella lateral flow immunoassay and is applicable to the detection of Salmonella enterica serogroups A-E in a variety of food and environmental samples. A Performance Tested Method validation study was conducted to compare performance of the Reveal 2.0 method with that of the U.S. Department of Agriculture-Food Safety and Inspection Service or U.S. Food and Drug Administration/Bacteriological Analytical Manual reference culture methods for detection of Salmonella spp. in chicken carcass rinse, raw ground turkey, raw ground beef, hot dogs, raw shrimp, a ready-to-eat meal product, dry pet food, ice cream, spinach, cantaloupe, peanut butter, stainless steel surface, and sprout irrigation water. In a total of 17 trials performed internally and four trials performed in an independent laboratory, there were no statistically significant differences in performance of the Reveal 2.0 and reference culture procedures as determined by Chi-square analysis, with the exception of one trial with stainless steel surface and one trial with sprout irrigation water where there were significantly more positive results by the Reveal 2.0 method. Considering all data generated in testing food samples using enrichment procedures specifically designed for the Reveal method, overall sensitivity of the Reveal method relative to the reference culture methods was 99%. In testing environmental samples, sensitivity of the Reveal method relative to the reference culture method was 164%. For select foods, use of the Reveal test in conjunction with reference method enrichment resulted in overall sensitivity of 92%. There were no unconfirmed positive results on uninoculated control samples in any trials for specificity of 100%. In inclusivity testing, 102 different Salmonella serovars belonging to serogroups A-E were tested and 99 were consistently positive in the Reveal test. In exclusivity testing of 33 strains of non-salmonellae representing 14 genera, 32 were negative when tested with Reveal following nonselective enrichment, and the remaining strain was found to be substantially inhibited by the enrichment media used with the Reveal method. Results of ruggedness testing showed that the Reveal test produces accurate results even with substantial deviation in sample volume or device development time.

  19. Automatic reference selection for quantitative EEG interpretation: identification of diffuse/localised activity and the active earlobe reference, iterative detection of the distribution of EEG rhythms.

    PubMed

    Wang, Bei; Wang, Xingyu; Ikeda, Akio; Nagamine, Takashi; Shibasaki, Hiroshi; Nakamura, Masatoshi

    2014-01-01

    EEG (Electroencephalograph) interpretation is important for the diagnosis of neurological disorders. The proper adjustment of the montage can highlight the EEG rhythm of interest and avoid false interpretation. The aim of this study was to develop an automatic reference selection method to identify a suitable reference. The results may contribute to the accurate inspection of the distribution of EEG rhythms for quantitative EEG interpretation. The method includes two pre-judgements and one iterative detection module. The diffuse case is initially identified by pre-judgement 1 when intermittent rhythmic waveforms occur over large areas along the scalp. The earlobe reference or averaged reference is adopted for the diffuse case due to the effect of the earlobe reference depending on pre-judgement 2. An iterative detection algorithm is developed for the localised case when the signal is distributed in a small area of the brain. The suitable averaged reference is finally determined based on the detected focal and distributed electrodes. The presented technique was applied to the pathological EEG recordings of nine patients. One example of the diffuse case is introduced by illustrating the results of the pre-judgements. The diffusely intermittent rhythmic slow wave is identified. The effect of active earlobe reference is analysed. Two examples of the localised case are presented, indicating the results of the iterative detection module. The focal and distributed electrodes are detected automatically during the repeating algorithm. The identification of diffuse and localised activity was satisfactory compared with the visual inspection. The EEG rhythm of interest can be highlighted using a suitable selected reference. The implementation of an automatic reference selection method is helpful to detect the distribution of an EEG rhythm, which can improve the accuracy of EEG interpretation during both visual inspection and automatic interpretation. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.

  20. X-ray Moiré deflectometry using synthetic reference images

    DOE PAGES

    Stutman, Dan; Valdivia, Maria Pia; Finkenthal, Michael

    2015-06-25

    Moiré fringe deflectometry with grating interferometers is a technique that enables refraction-based x-ray imaging using a single exposure of an object. To obtain the refraction image, the method requires a reference fringe pattern (without the object). Our study shows that, in order to avoid artifacts, the reference pattern must be exactly matched in phase with the object fringe pattern. In experiments, however, it is difficult to produce a perfectly matched reference pattern due to unavoidable interferometer drifts. We present a simple method to obtain matched reference patterns using a phase-scan procedure to generate synthetic Moiré images. As a result, the method will enable deflectometric diagnostics of transient phenomena such as laser-produced plasmas and could improve the sensitivity and accuracy of medical phase-contrast imaging.

  1. Comparison of Haemophilus parasuis reference strains and field isolates by using random amplified polymorphic DNA and protein profiles

    PubMed Central

    2012-01-01

    Background Haemophilus parasuis is the causative agent of Glässer’s disease and is a pathogen of swine in high-health status herds. Reports on serotyping of field strains from outbreaks describe that approximately 30% of them are nontypeable and therefore cannot be traced. Molecular typing methods have been used as alternatives to serotyping. This study was done to compare random amplified polymorphic DNA (RAPD) profiles and whole cell protein (WCP) lysate profiles as methods for distinguishing H. parasuis reference strains and field isolates. Results The DNA and WCP lysate profiles of 15 reference strains and 31 field isolates of H. parasuis were analyzed using the Dice and neighbor joining algorithms. The results revealed unique and reproducible DNA and protein profiles among the reference strains and field isolates studied. Simpson’s index of diversity showed significant discrimination between isolates when three 10mer primers were combined for the RAPD method and also when both the RAPD and WCP lysate typing methods were combined. Conclusions The RAPD profiles seen among the reference strains and field isolates did not appear to change over time which may reflect a lack of DNA mutations in the genes of the samples. The recent field isolates had different WCP lysate profiles than the reference strains, possibly because the number of passages of the type strains may affect their protein expression. PMID:22703293
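
    The Simpson's index of diversity mentioned above is commonly computed in the Hunter-Gaston form; a short sketch with hypothetical RAPD profile labels:

    ```python
    # Simpson's index of diversity: D = 1 - sum(n_j*(n_j-1)) / (N*(N-1)),
    # where n_j is the number of isolates assigned to type j and N is the total.
    from collections import Counter

    def simpsons_diversity(type_assignments):
        counts = Counter(type_assignments).values()
        n_total = sum(counts)
        return 1 - sum(n * (n - 1) for n in counts) / (n_total * (n_total - 1))

    profiles = ["A", "A", "B", "C", "C", "C", "D"]   # hypothetical RAPD profile labels
    print(f"D = {simpsons_diversity(profiles):.3f}")
    ```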

  2. Determination of the purity of pharmaceutical reference materials by 1H NMR using the standardless PULCON methodology.

    PubMed

    Monakhova, Yulia B; Kohl-Himmelseher, Matthias; Kuballa, Thomas; Lachenmeier, Dirk W

    2014-11-01

    A fast and reliable nuclear magnetic resonance spectroscopic method for quantitative determination (qNMR) of targeted molecules in reference materials has been established using the ERETIC2 methodology (electronic reference to access in vivo concentrations) based on the PULCON principle (pulse length based concentration determination). The developed approach was validated for the analysis of pharmaceutical samples in the context of official medicines control, including ibandronic acid, amantadine, ambroxol and lercanidipine. The PULCON recoveries were above 94.3% and coefficients of variation (CVs) obtained by quantification of different targeted resonances ranged between 0.7% and 2.8%, demonstrating that the qNMR method is a precise tool for rapid quantification (approximately 15min) of reference materials and medicinal products. Generally, the values were within specification (certified values) provided by the manufactures. The results were in agreement with NMR quantification using an internal standard and validated reference HPLC analysis. The PULCON method was found to be a practical alternative with competitive precision and accuracy to the classical internal reference method and it proved to be applicable to different solvent conditions. The method can be recommended for routine use in medicines control laboratories, especially when the availability and costs of reference compounds are problematic. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Automatic identification of the reference system based on the fourth ventricular landmarks in T1-weighted MR images.

    PubMed

    Fu, Yili; Gao, Wenpeng; Chen, Xiaoguang; Zhu, Minwei; Shen, Weigao; Wang, Shuguo

    2010-01-01

    The reference system based on the fourth ventricular landmarks (including the fastigial point and ventricular floor plane) is used in medical image analysis of the brain stem. The objective of this study was to develop a rapid, robust, and accurate method for the automatic identification of this reference system on T1-weighted magnetic resonance images. The fully automated method developed in this study consisted of four stages: preprocessing of the data set, expectation-maximization algorithm-based extraction of the fourth ventricle in the region of interest, a coarse-to-fine strategy for identifying the fastigial point, and localization of the base point. The method was evaluated on 27 Brain Web data sets qualitatively and 18 Internet Brain Segmentation Repository data sets and 30 clinical scans quantitatively. The results of qualitative evaluation indicated that the method was robust to rotation, landmark variation, noise, and inhomogeneity. The results of quantitative evaluation indicated that the method was able to identify the reference system with an accuracy of 0.7 +/- 0.2 mm for the fastigial point and 1.1 +/- 0.3 mm for the base point. It took <6 seconds for the method to identify the related landmarks on a personal computer with an Intel Core 2 6300 processor and 2 GB of random-access memory. The proposed method for the automatic identification of the reference system based on the fourth ventricular landmarks was shown to be rapid, robust, and accurate. The method has potentially utility in image registration and computer-aided surgery.

  4. SU-G-BRB-14: Uncertainty of Radiochromic Film Based Relative Dose Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Devic, S; Tomic, N; DeBlois, F

    2016-06-15

    Purpose: Due to inherently non-linear dose response, measurement of relative dose distribution with radiochromic film requires measurement of absolute dose using a calibration curve following previously established reference dosimetry protocol. On the other hand, a functional form that converts the inherently non-linear dose response curve of the radiochromic film dosimetry system into linear one has been proposed recently [Devic et al, Med. Phys. 39 4850–4857 (2012)]. However, there is a question what would be the uncertainty of such measured relative dose. Methods: If the relative dose distribution is determined going through the reference dosimetry system (conversion of the response by using calibration curve into absolute dose) the total uncertainty of such determined relative dose will be calculated by summing in quadrature total uncertainties of doses measured at a given and at the reference point. On the other hand, if the relative dose is determined using linearization method, the new response variable is calculated as ζ=a(netOD)n/ln(netOD). In this case, the total uncertainty in relative dose will be calculated by summing in quadrature uncertainties for a new response function (σζ) for a given and the reference point. Results: Except at very low doses, where the measurement uncertainty dominates, the total relative dose uncertainty is less than 1% for the linear response method as compared to almost 2% uncertainty level for the reference dosimetry method. The result is not surprising having in mind that the total uncertainty of the reference dose method is dominated by the fitting uncertainty, which is mitigated in the case of linearization method. Conclusion: Linearization of the radiochromic film dose response provides a convenient and a more precise method for relative dose measurements as it does not require reference dosimetry and creation of calibration curve. However, the linearity of the newly introduced function must be verified. Dave Lewis is inventor and runs a consulting company for radiochromic films.
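
    In LaTeX, the two uncertainty routes described in the abstract can be written as follows (the symbols are our paraphrase of the abstract's description, not taken from the paper):

    ```latex
    % Reference-dosimetry route: quadrature sum of the dose uncertainties at the
    % point of interest and at the reference point.
    \[
    \frac{\sigma_{D_\mathrm{rel}}}{D_\mathrm{rel}}
      = \sqrt{\left(\frac{\sigma_{D}}{D}\right)^{2}
            + \left(\frac{\sigma_{D_\mathrm{ref}}}{D_\mathrm{ref}}\right)^{2}}
    \]
    % Linearization route: new response variable and its quadrature sum.
    \[
    \zeta = \frac{a\,(\mathrm{netOD})^{n}}{\ln(\mathrm{netOD})},
    \qquad
    \frac{\sigma_{\zeta_\mathrm{rel}}}{\zeta_\mathrm{rel}}
      = \sqrt{\left(\frac{\sigma_{\zeta}}{\zeta}\right)^{2}
            + \left(\frac{\sigma_{\zeta_\mathrm{ref}}}{\zeta_\mathrm{ref}}\right)^{2}}
    \]
    ```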

  5. Note: Improved calibration of atomic force microscope cantilevers using multiple reference cantilevers.

    PubMed

    Sader, John E; Friend, James R

    2015-05-01

    Overall precision of the simplified calibration method in J. E. Sader et al., Rev. Sci. Instrum. 83, 103705 (2012), Sec. III D, is dominated by the spring constant of the reference cantilever. The question arises: How does one take measurements from multiple reference cantilevers, and combine these results, to improve uncertainty of the reference cantilever's spring constant and hence the overall precision of the method? This question is addressed in this note. Its answer enables manufacturers to specify a single set of data for the spring constant, resonant frequency, and quality factor, from measurements on multiple reference cantilevers. With this data set, users can trivially calibrate cantilevers of the same type.
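
    One standard way to combine spring-constant estimates from several reference cantilevers is inverse-variance weighting; the sketch below illustrates that generic approach and is not necessarily the exact prescription of the note:

    ```python
    # Pool spring-constant estimates k_i (with standard uncertainties u_i) from
    # several reference cantilevers into one value with reduced uncertainty.
    import numpy as np

    def pool_spring_constants(k, u):
        k, u = np.asarray(k, float), np.asarray(u, float)
        w = 1.0 / u**2                       # inverse-variance weights
        k_pooled = np.sum(w * k) / np.sum(w)
        u_pooled = np.sqrt(1.0 / np.sum(w))  # uncertainty of the pooled value
        return k_pooled, u_pooled

    # Hypothetical measurements in N/m.
    print(pool_spring_constants([0.82, 0.78, 0.80], [0.04, 0.05, 0.03]))
    ```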

  6. Methods for Scaling Icing Test Conditions

    NASA Technical Reports Server (NTRS)

    Anderson, David N.

    1995-01-01

    This report presents the results of tests at NASA Lewis to evaluate several methods to establish suitable alternative test conditions when the test facility limits the model size or operating conditions. The first method was proposed by Olsen. It can be applied when full-size models are tested and all the desired test conditions except liquid-water content can be obtained in the facility. The other two methods discussed are: a modification of the French scaling law and the AEDC scaling method. Icing tests were made with cylinders at both reference and scaled conditions representing mixed and glaze ice in the NASA Lewis Icing Research Tunnel. Reference and scale ice shapes were compared to evaluate each method. The Olsen method was tested with liquid-water content varying from 1.3 to 0.8 g/m³. Over this range, ice shapes produced using the Olsen method were unchanged. The modified French and AEDC methods produced scaled ice shapes which approximated the reference shapes when model size was reduced to half the reference size for the glaze-ice cases tested.

  7. Reference-free ground truth metric for metal artifact evaluation in CT images.

    PubMed

    Kratz, Bärbel; Ens, Svitlana; Müller, Jan; Buzug, Thorsten M

    2011-07-01

    In computed tomography (CT), metal objects in the region of interest introduce data inconsistencies during acquisition. Reconstructing these data results in an image with star shaped artifacts induced by the metal inconsistencies. To enhance image quality, the influence of the metal objects can be reduced by different metal artifact reduction (MAR) strategies. For an adequate evaluation of new MAR approaches a ground truth reference data set is needed. In technical evaluations, where phantoms can be measured with and without metal inserts, ground truth data can easily be obtained by a second reference data acquisition. Obviously, this is not possible for clinical data. Here, an alternative evaluation method is presented without the need of an additionally acquired reference data set. The proposed metric is based on an inherent ground truth for metal artifacts as well as MAR methods comparison, where no reference information in terms of a second acquisition is needed. The method is based on the forward projection of a reconstructed image, which is compared to the actually measured projection data. The new evaluation technique is performed on phantom and on clinical CT data with and without MAR. The metric results are then compared with methods using a reference data set as well as an expert-based classification. It is shown that the new approach is an adequate quantification technique for artifact strength in reconstructed metal or MAR CT images. The presented method works solely on the original projection data itself, which yields some advantages compared to distance measures in image domain using two data sets. Besides this, no parameters have to be manually chosen. The new metric is a useful evaluation alternative when no reference data are available.
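
    A generic sketch of the forward-projection idea behind the metric — re-project the reconstruction and compare it with the measured projection data — using scikit-image's Radon transform on a test phantom; this is an illustration, not the authors' exact metric:

    ```python
    # Re-project a reconstructed slice and quantify its inconsistency with the
    # measured sinogram. The phantom stands in for a real acquisition.
    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon, rescale

    image = rescale(shepp_logan_phantom(), 0.25)            # small test slice
    theta = np.linspace(0.0, 180.0, 90, endpoint=False)
    measured_sino = radon(image, theta=theta)               # stands in for the measured data
    recon = iradon(measured_sino, theta=theta)              # reconstruction (possibly after MAR)
    reprojected = radon(recon, theta=theta)                 # forward projection of the reconstruction

    rel_error = np.linalg.norm(reprojected - measured_sino) / np.linalg.norm(measured_sino)
    print(f"relative projection inconsistency: {rel_error:.3f}")
    ```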

  8. A candidate reference method for serum potassium measurement by inductively coupled plasma mass spectrometry.

    PubMed

    Yan, Ying; Han, Bingqing; Zeng, Jie; Zhou, Weiyan; Zhang, Tianjiao; Zhang, Jiangtao; Chen, Wenxiang; Zhang, Chuanbao

    2017-08-28

    Potassium is an important serum ion that is frequently assayed in clinical laboratories. Quality assurance requires reference methods; thus, the establishment of a candidate reference method for serum potassium measurements is important. An inductively coupled plasma mass spectrometry (ICP-MS) method was developed. Serum samples were gravimetrically spiked with an aluminum internal standard, digested with 69% ultrapure nitric acid, and diluted to the required concentration. The 39K/27Al ratios were measured by ICP-MS in hydrogen mode. The method was calibrated using 5% nitric acid matrix calibrators, and the calibration function was established using the bracketing method. The correlation coefficients between the measured 39K/27Al ratios and the analyte concentration ratios were >0.9999. The coefficients of variation were 0.40%, 0.68%, and 0.22% for the three serum samples, and the analytical recovery was 99.8%. The accuracy of the measurement was also verified by measuring certified reference materials, SRM909b and SRM956b. Comparison with the ion selective electrode routine method and international inter-laboratory comparisons gave satisfactory results. The new ICP-MS method is specific, precise, simple, and low-cost, and it may be used as a candidate reference method for standardizing serum potassium measurements.

  9. Fecal electrolyte testing for evaluation of unexplained diarrhea: Validation of body fluid test accuracy in the absence of a reference method.

    PubMed

    Voskoboev, Nikolay V; Cambern, Sarah J; Hanley, Matthew M; Giesen, Callen D; Schilling, Jason J; Jannetto, Paul J; Lieske, John C; Block, Darci R

    2015-11-01

    Validation of tests performed on body fluids other than blood or urine can be challenging due to the lack of a reference method to confirm accuracy. The aim of this study was to evaluate alternate assessments of accuracy that laboratories can rely on to validate body fluid tests in the absence of a reference method using the example of sodium (Na(+)), potassium (K(+)), and magnesium (Mg(2+)) testing in stool fluid. Validations of fecal Na(+), K(+), and Mg(2+) were performed on the Roche cobas 6000 c501 (Roche Diagnostics) using residual stool specimens submitted for clinical testing. Spiked recovery, mixing studies, and serial dilutions were performed and % recovery of each analyte was calculated to assess accuracy. Results were confirmed by comparison to a reference method (ICP-OES, PerkinElmer). Mean recoveries for fecal electrolytes were Na(+) upon spiking=92%, mixing=104%, and dilution=105%; K(+) upon spiking=94%, mixing=96%, and dilution=100%; and Mg(2+) upon spiking=93%, mixing=98%, and dilution=100%. When autoanalyzer results were compared to reference ICP-OES results, Na(+) had a slope=0.94, intercept=4.1, and R(2)=0.99; K(+) had a slope=0.99, intercept=0.7, and R(2)=0.99; and Mg(2+) had a slope=0.91, intercept=-4.6, and R(2)=0.91. Calculated osmotic gap using both methods were highly correlated with slope=0.95, intercept=4.5, and R(2)=0.97. Acid pretreatment increased magnesium recovery from a subset of clinical specimens. A combination of mixing, spiking, and dilution recovery experiments are an acceptable surrogate for assessing accuracy in body fluid validations in the absence of a reference method. Copyright © 2015 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
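
    The recovery figures quoted above follow simple arithmetic; a sketch with hypothetical concentrations:

    ```python
    # Percent recovery for the spiking and mixing experiments used to assess
    # accuracy in the absence of a reference method. All numbers are hypothetical.
    def percent_recovery(measured, expected):
        return 100.0 * measured / expected

    # Spiking: baseline sample plus a known added amount.
    baseline, added, measured_spiked = 40.0, 50.0, 84.0           # mmol/L
    print(f"spike recovery: {percent_recovery(measured_spiked, baseline + added):.1f}%")

    # Mixing: a 1:1 mix of two samples should read the average of the two.
    a, b, measured_mix = 30.0, 90.0, 62.0                          # mmol/L
    print(f"mixing recovery: {percent_recovery(measured_mix, (a + b) / 2):.1f}%")
    ```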

  10. Verification of chemistry reference ranges using a simple method in sub-Saharan Africa

    PubMed Central

    Taylor, Douglas; Mandala, Justin; Nanda, Kavita; Van Campenhout, Christel; Agingu, Walter; Madurai, Lorna; Barsch, Eva-Maria; Deese, Jennifer; Van Damme, Lut; Crucitti, Tania

    2016-01-01

    Background Chemistry safety assessments are interpreted by using chemistry reference ranges (CRRs). Verification of CRRs is time consuming and often requires a statistical background. Objectives We report on an easy and cost-saving method to verify CRRs. Methods Using a former method introduced by Sigma Diagnostics, three study sites in sub-Saharan Africa, Bondo, Kenya, and Pretoria and Bloemfontein, South Africa, verified the CRRs for hepatic and renal biochemistry assays performed during a clinical trial of HIV antiretroviral pre-exposure prophylaxis. The aspartate aminotransferase/alanine aminotransferase, creatinine and phosphorus results from 10 clinically-healthy participants at the screening visit were used. In the event the CRRs did not pass the verification, new CRRs had to be calculated based on 40 clinically-healthy participants. Results Within a few weeks, the study sites accomplished verification of the CRRs without additional costs. The aspartate aminotransferase reference ranges for the Bondo, Kenya site and the alanine aminotransferase reference ranges for the Pretoria, South Africa site required adjustment. The phosphorus CRR passed verification and the creatinine CRR required adjustment at every site. The newly-established CRR intervals were narrower than the CRRs used previously at these study sites due to decreases in the upper limits of the reference ranges. As a result, more toxicities were detected. Conclusion To ensure the safety of clinical trial participants, verification of CRRs should be standard practice in clinical trials conducted in settings where the CRR has not been validated for the local population. This verification method is simple, inexpensive, and can be performed by any medical laboratory. PMID:28879112

  11. Circuit Riding: A Method for Providing Reference Services.

    ERIC Educational Resources Information Center

    Plunket, Linda; And Others

    1983-01-01

    Discussion of the design and implementation of the Circuit Rider Librarian Program, a shared services project for delivering reference services to eight hospitals in Maine, includes a cost analysis of services and description of user evaluation survey. Five references, composite results of the survey, and postgrant options proposal are appended.…

  12. The development of a revised version of multi-center molecular Ornstein-Zernike equation

    NASA Astrophysics Data System (ADS)

    Kido, Kentaro; Yokogawa, Daisuke; Sato, Hirofumi

    2012-04-01

    Ornstein-Zernike (OZ)-type theory is a powerful tool to obtain the 3-dimensional solvent distribution around a solute molecule. Recently, we proposed the multi-center molecular OZ method, which is suitable for parallel computing of 3D solvation structure. The distribution function in this method consists of two components, namely reference and residue parts. Several types of the function were examined as the reference part to investigate the numerical robustness of the method. As the benchmark, the method is applied to water, benzene in aqueous solution and a single-walled carbon nanotube in chloroform solution. The results indicate that full parallelization is achieved by utilizing the newly proposed reference functions.

  13. A localization algorithm of adaptively determining the ROI of the reference circle in image

    NASA Astrophysics Data System (ADS)

    Xu, Zeen; Zhang, Jun; Zhang, Daimeng; Liu, Xiaomao; Tian, Jinwen

    2018-03-01

    To solve the problem of accurately positioning detection probes underwater, this paper proposes a computer-vision-based method. The idea is as follows. First, because a heat tube appears approximately circular in the image, we can find a circle whose physical location is well known and set it as the reference circle. Second, we calculate the pixel offset between the reference circle and the probes in the image and adjust the steering gear according to this offset. As a result, we can accurately measure the physical distance between the probes and the heat tubes under test, and thus determine the precise location of the probes underwater. However, choosing the reference circle in the image is a difficult problem. In this paper, we propose an algorithm that adaptively determines the region of interest (ROI) of the reference circle, such that this region contains only one circle, which is taken as the reference circle. The test results show that the accuracy of extracting the reference circle from the whole image without using the ROI is only 58.76%, whereas the proposed algorithm reaches 95.88%. The experimental results indicate that the proposed algorithm can effectively improve the efficiency of tube detection.
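
    As a generic baseline for the circle-detection step (not the paper's adaptive-ROI algorithm), a reference circle can be located inside a given ROI with OpenCV's Hough transform and the pixel offset to the probe computed; the ROI bounds, Hough parameters and `probe_xy` are assumptions:

    ```python
    # Detect the reference circle inside a given ROI of an 8-bit grayscale image
    # and return the pixel offset between the circle centre and the probe position.
    import cv2
    import numpy as np

    def reference_circle_offset(gray_image, roi, probe_xy):
        x0, y0, x1, y1 = roi                          # ROI assumed to contain only one circle
        patch = cv2.medianBlur(gray_image[y0:y1, x0:x1], 5)
        circles = cv2.HoughCircles(patch, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
                                   param1=100, param2=30, minRadius=10, maxRadius=80)
        if circles is None:
            return None
        cx, cy, _ = circles[0][0]                     # first detected circle (x, y, radius)
        center = np.array([cx + x0, cy + y0])         # back to full-image coordinates
        return center - np.asarray(probe_xy, dtype=float)   # offset used to steer the probe
    ```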

  14. A convenient method for X-ray analysis in TEM that measures mass thickness and composition

    NASA Astrophysics Data System (ADS)

    Statham, P.; Sagar, J.; Holland, J.; Pinard, P.; Lozano-Perez, S.

    2018-01-01

    We consider a new approach for quantitative analysis in transmission electron microscopy (TEM) that offers the same convenience as single-standard quantitative analysis in scanning electron microscopy (SEM). Instead of a bulk standard, a thin film with known mass thickness is used as a reference. The procedure involves recording an X-ray spectrum from the reference film for each session of acquisitions on real specimens. There is no need to measure the beam current; the current only needs to be stable for the duration of the session. A new reference standard with a large (1 mm x 1 mm) area of uniform thickness of 100 nm silicon nitride is used to reveal regions of X-ray detector occlusion that would give misleading results for any X-ray method that measures thickness. Unlike previous methods, the new X-ray method does not require an accurate beam current monitor but delivers equivalent accuracy in mass thickness measurement. Quantitative compositional results are also automatically corrected for specimen self-absorption. The new method is tested using a wedge specimen of Inconel 600 that is used to calibrate the high-angle annular dark field (HAADF) signal to provide a thickness reference and results are compared with electron energy-loss spectrometry (EELS) measurements. For the new X-ray method, element composition results are consistent with the expected composition for the alloy and the mass thickness measurement is shown to provide an accurate alternative to EELS for thickness determination in TEM without the uncertainty associated with mean free path estimates.

  15. Correlative multiple porosimetries for reservoir sandstones with adoption of a new reference-sample-guided computed-tomographic method.

    PubMed

    Jin, Jae Hwa; Kim, Junho; Lee, Jeong-Yil; Oh, Young Min

    2016-07-22

    One of the main interests in petroleum geology and reservoir engineering is to quantify the porosity of reservoir beds as accurately as possible. A variety of direct measurements, including methods of mercury intrusion, helium injection and petrographic image analysis, have been developed; however, their application frequently yields equivocal results because these methods are different in theoretical bases, means of measurement, and causes of measurement errors. Here, we present a set of porosities measured in Berea Sandstone samples by the multiple methods, in particular with adoption of a new method using computed tomography and reference samples. The multiple porosimetric data show a marked correlativeness among different methods, suggesting that these methods are compatible with each other. The new method of reference-sample-guided computed tomography is more effective than the previous methods when the accompanied merits such as experimental conveniences are taken into account.

  16. Correlative multiple porosimetries for reservoir sandstones with adoption of a new reference-sample-guided computed-tomographic method

    PubMed Central

    Jin, Jae Hwa; Kim, Junho; Lee, Jeong-Yil; Oh, Young Min

    2016-01-01

    One of the main interests in petroleum geology and reservoir engineering is to quantify the porosity of reservoir beds as accurately as possible. A variety of direct measurements, including methods of mercury intrusion, helium injection and petrographic image analysis, have been developed; however, their application frequently yields equivocal results because these methods are different in theoretical bases, means of measurement, and causes of measurement errors. Here, we present a set of porosities measured in Berea Sandstone samples by the multiple methods, in particular with adoption of a new method using computed tomography and reference samples. The multiple porosimetric data show a marked correlativeness among different methods, suggesting that these methods are compatible with each other. The new method of reference-sample-guided computed tomography is more effective than the previous methods when the accompanied merits such as experimental conveniences are taken into account. PMID:27445105

  17. Statistical considerations for harmonization of the global multicenter study on reference values.

    PubMed

    Ichihara, Kiyoshi

    2014-05-15

    The global multicenter study on reference values coordinated by the Committee on Reference Intervals and Decision Limits (C-RIDL) of the IFCC was launched in December 2011, targeting 45 commonly tested analytes with the following objectives: 1) to derive reference intervals (RIs) country by country using a common protocol, and 2) to explore regionality/ethnicity of reference values by aligning test results among the countries. To achieve these objectives, it is crucial to harmonize 1) the protocol for recruitment and sampling, 2) statistical procedures for deriving the RI, and 3) test results through measurement of a panel of sera in common. For harmonized recruitment, very lenient inclusion/exclusion criteria were adopted in view of differences in interpretation of what constitutes healthiness by different cultures and investigators. This policy may require secondary exclusion of individuals according to the standard of each country at the time of deriving RIs. An iterative optimization procedure, called the latent abnormal values exclusion (LAVE) method, can be applied to automate the process of refining the choice of reference individuals. For global comparison of reference values, test results must be harmonized, based on the among-country, pair-wise linear relationships of test values for the panel. Traceability of reference values can be ensured based on values assigned indirectly to the panel through collaborative measurement of certified reference materials. The validity of the adopted strategies is discussed in this article, based on interim results obtained to date from five countries. Special considerations are made for dissociation of RIs by parametric and nonparametric methods and between-country difference in the effect of body mass index on reference values. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Development of a new ferulic acid certified reference material for use in clinical chemistry and pharmaceutical analysis.

    PubMed

    Yang, Dezhi; Wang, Fengfeng; Zhang, Li; Gong, Ningbo; Lv, Yang

    2015-05-01

    This study compares the results of three certified methods, namely differential scanning calorimetry (DSC), the mass balance (MB) method and coulometric titrimetry (CT), in the purity assessment of ferulic acid certified reference material (CRM). Purity and expanded uncertainty as determined by the three methods were respectively 99.81%, 0.16%; 99.79%, 0.16%; and 99.81%, 0.26% with, in all cases, a coverage factor (k) of 2 (P=95%). The purity results are consistent indicating that the combination of DSC, the MB method and CT provides a confident assessment of the purity of suitable CRMs like ferulic acid.
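
    For reference, the expanded uncertainty reported for a CRM is conventionally related to the combined standard uncertainty as below; the listed uncertainty components are typical of CRM budgets and are not taken from the paper:

    ```latex
    % Expanded uncertainty with coverage factor k = 2 (approximately 95 % level);
    % u_c combines the standard uncertainty contributions in quadrature.
    \[
    U = k\,u_c, \qquad
    u_c = \sqrt{u_{\mathrm{char}}^{2} + u_{\mathrm{hom}}^{2} + u_{\mathrm{stab}}^{2}},
    \qquad k = 2 \ (P \approx 95\%).
    \]
    ```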

  19. Uptake of recommended common reference intervals for chemical pathology in Australia.

    PubMed

    Jones, Graham Rd; Koetsier, Sabrina

    2017-05-01

    Background Reference intervals are a vital part of reporting numerical pathology results. It is known, however, that variation in reference intervals between laboratories is common, even when analytical methods support common reference intervals. In response to this, in Australia, the Australasian Association of Clinical Biochemists together with the Royal College of Pathologists of Australasia published in 2014 a set of recommended common reference intervals for 11 common serum analytes (sodium, potassium, chloride, bicarbonate, creatinine male, creatinine female, calcium, calcium adjusted for albumin, phosphate, magnesium, lactate dehydrogenase, alkaline phosphatase and total protein). Methods Uptake of recommended common reference intervals in Australian laboratories was assessed using data from four annual cycles of the RCPAQAP reference intervals external quality assurance programme. Results Over three years, from 2013 to 2016, the use of the recommended upper and lower reference limits has increased from 40% to 83%. Nearly half of the intervals in use by enrolled laboratories in 2016 have been changed in this time period, indicating an active response to the guidelines. Conclusions These data support the activities of the Australasian Association of Clinical Biochemists and Royal College of Pathologists of Australasia in demonstrating a change in laboratory behaviour to reduce unnecessary variation in reference intervals and thus provide a consistent message to doctor and patients irrespective of the laboratory used.

  20. Absolute Radiometric Calibration of Narrow-Swath Imaging Sensors with Reference to Non-Coincident Wide-Swath Sensors

    NASA Technical Reports Server (NTRS)

    McCorkel, Joel; Thome, Kurtis; Lockwood, Ronald

    2012-01-01

    An inter-calibration method is developed to provide absolute radiometric calibration of narrow-swath imaging sensors with reference to non-coincident wide-swath sensors. The method predicts at-sensor radiance using non-coincident imagery from the reference sensor and knowledge of spectral reflectance of the test site. The imagery of the reference sensor is restricted to acquisitions that provide similar view and solar illumination geometry to reduce uncertainties due to directional reflectance effects. Spectral reflectance of the test site is found with a simple iterative radiative transfer method using radiance values of a well-understood wide-swath sensor and spectral shape information based on historical ground-based measurements. At-sensor radiance is calculated for the narrow-swath sensor using this spectral reflectance and atmospheric parameters that are also based on historical in situ measurements. Results of the inter-calibration method show agreement at the 2-5 percent level in most spectral regions with the vicarious calibration technique relying on coincident ground-based measurements referred to as the reflectance-based approach. While the variability of the inter-calibration method based on non-coincident image pairs is significantly larger, results are consistent with techniques relying on in situ measurements. The method is also insensitive to spectral differences between the sensors by transferring to surface spectral reflectance prior to prediction of at-sensor radiance. The utility of this inter-calibration method is made clear by its flexibility to utilize image pairings with acquisition dates differing in excess of 30 days, allowing frequent absolute calibration comparisons between wide- and narrow-swath sensors.

  1. Romer Labs RapidChek®Listeria monocytogenes Test System for the Detection of L. monocytogenes on Selected Foods and Environmental Surfaces.

    PubMed

    Juck, Gregory; Gonzalez, Verapaz; Allen, Ann-Christine Olsson; Sutzko, Meredith; Seward, Kody; Muldoon, Mark T

    2018-04-27

    The Romer Labs RapidChek ® Listeria monocytogenes test system (Performance Tested Method ℠ 011805) was validated against the U.S. Department of Agriculture-Food Safety and Inspection Service Microbiology Laboratory Guidebook (USDA-FSIS/MLG), U.S. Food and Drug Administration Bacteriological Analytical Manual (FDA/BAM), and AOAC Official Methods of Analysis ℠ (AOAC/OMA) cultural reference methods for the detection of L. monocytogenes on selected foods including hot dogs, frozen cooked breaded chicken, frozen cooked shrimp, cured ham, and ice cream, and environmental surfaces including stainless steel and plastic in an unpaired study design. The RapidChek method uses a proprietary enrichment media system, a 44-48 h enrichment at 30 ± 1°C, and detects L. monocytogenes on an immunochromatographic lateral flow device within 10 min. Different L. monocytogenes strains were used to spike each of the matrixes. Samples were confirmed based on the reference method confirmations and an alternate confirmation method. A total of 140 low-level spiked samples were tested by the RapidChek method after enrichment for 44-48 h in parallel with the cultural reference method. There were 88 RapidChek presumptive positives. One of the presumptive positives was not confirmed culturally. Additionally, one of the culturally confirmed samples did not exhibit a presumptive positive. No difference between the alternate confirmation method and reference confirmation method was observed. The respective cultural reference methods (USDA-FSIS/MLG, FDA/BAM, and AOAC/OMA) produced a total of 63 confirmed positive results. Nonspiked samples from all foods were reported as negative for L. monocytogenes by all methods. Probability of detection analysis demonstrated no significant differences in the number of positive samples detected by the RapidChek method and the respective cultural reference method.

  2. Validation of miRNA genes suitable as reference genes in qPCR analyses of miRNA gene expression in Atlantic salmon (Salmo salar).

    PubMed

    Johansen, Ilona; Andreassen, Rune

    2014-12-23

    MicroRNAs (miRNAs) are an abundant class of endogenous small RNA molecules that downregulate gene expression at the post-transcriptional level. They play important roles by regulating genes that control multiple biological processes, and in recent years there has been increased interest in studying miRNA genes and miRNA gene expression. The most common method applied to study gene expression of single genes is quantitative PCR (qPCR). However, before expression of mature miRNAs can be studied, robust qPCR methods (miRNA-qPCR) must be developed. This includes identification and validation of suitable reference genes. We are particularly interested in Atlantic salmon (Salmo salar). This is an economically important aquaculture species, but no reference genes dedicated for use in miRNA-qPCR methods have been validated for this species. Our aim was, therefore, to identify suitable reference genes for miRNA-qPCR methods in Salmo salar. We used a systematic approach where we utilized similar studies in other species, some biological criteria, results from deep sequencing of small RNAs and, finally, experimental validation of candidate reference genes by qPCR to identify the most suitable reference genes. Ssa-miR-25-3p was identified as the most suitable single reference gene. The best combination of two reference genes was ssa-miR-25-3p and ssa-miR-455-5p. These two genes were constitutively and stably expressed across many different tissues. Furthermore, infectious salmon anaemia did not seem to affect their expression levels. These genes were amplified with high specificity and good efficiency, and the qPCR assays showed good linearity when applying a simple SYBR Green miRNA-qPCR method using miRNA gene-specific forward primers. We have identified suitable reference genes for miRNA-qPCR in Atlantic salmon. These results will greatly facilitate further studies on miRNA genes in this species. The reference genes identified are conserved genes that are identical in their mature sequence in many aquaculture species. Therefore, they may also be suitable as reference genes in other teleosts. Finally, the systematic approach used in our study successfully identified suitable reference genes, suggesting that this may be a useful strategy to apply in similar validation studies in other aquaculture species.

  3. Reference test methods for total water in lint cotton by Karl Fischer Titration and low temperature distillation

    USDA-ARS?s Scientific Manuscript database

    In a study of comparability of total water contents (%) of conditioned cottons by Karl Fischer Titration (KFT) and Low Temperature Distillation (LTD) reference methods, we demonstrated a match of averaged results based on a large number of replications and weighing the test specimens at the same tim...

  4. Note: Improved calibration of atomic force microscope cantilevers using multiple reference cantilevers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sader, John E., E-mail: jsader@unimelb.edu.au; Friend, James R.

    2015-05-15

    Overall precision of the simplified calibration method in J. E. Sader et al., Rev. Sci. Instrum. 83, 103705 (2012), Sec. III D, is dominated by the spring constant of the reference cantilever. The question arises: How does one take measurements from multiple reference cantilevers, and combine these results, to reduce the uncertainty of the reference cantilever's spring constant and hence improve the overall precision of the method? This question is addressed in this note. Its answer enables manufacturers to specify a single set of data for the spring constant, resonant frequency, and quality factor from measurements on multiple reference cantilevers. With this data set, users can trivially calibrate cantilevers of the same type.
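
    As a minimal illustration of combining results from several reference cantilevers, the sketch below pools independent spring-constant estimates by inverse-variance weighting. This is a standard way to merge independent measurements and reduce the combined uncertainty, not necessarily the exact combination rule derived in the note, and all numbers are hypothetical.

      # Hedged sketch: pooling spring-constant estimates from several reference
      # cantilevers by inverse-variance weighting. Illustrative only; the note's
      # actual combination rule may differ.
      import numpy as np

      def pool_spring_constants(k_values, k_uncertainties):
          """Return the weighted mean spring constant and its standard uncertainty.

          k_values        -- spring-constant estimates (N/m), one per reference cantilever
          k_uncertainties -- their standard uncertainties (N/m)
          """
          k = np.asarray(k_values, dtype=float)
          u = np.asarray(k_uncertainties, dtype=float)
          w = 1.0 / u**2                       # inverse-variance weights
          k_pooled = np.sum(w * k) / np.sum(w)
          u_pooled = np.sqrt(1.0 / np.sum(w))  # uncertainty of the weighted mean
          return k_pooled, u_pooled

      # Example: three reference cantilevers of the same nominal type
      k_hat, u_hat = pool_spring_constants([0.95, 1.02, 0.98], [0.05, 0.04, 0.06])
      print(f"pooled spring constant: {k_hat:.3f} +/- {u_hat:.3f} N/m")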

  5. Adapt-Mix: learning local genetic correlation structure improves summary statistics-based analyses

    PubMed Central

    Park, Danny S.; Brown, Brielin; Eng, Celeste; Huntsman, Scott; Hu, Donglei; Torgerson, Dara G.; Burchard, Esteban G.; Zaitlen, Noah

    2015-01-01

    Motivation: Approaches to identifying new risk loci, training risk prediction models, imputing untyped variants and fine-mapping causal variants from summary statistics of genome-wide association studies are playing an increasingly important role in the human genetics community. Current summary statistics-based methods rely on global ‘best guess’ reference panels to model the genetic correlation structure of the dataset being studied. This approach, especially in admixed populations, has the potential to produce misleading results, ignores variation in local structure and is not feasible when appropriate reference panels are missing or small. Here, we develop a method, Adapt-Mix, that combines information across all available reference panels to produce estimates of local genetic correlation structure for summary statistics-based methods in arbitrary populations. Results: We applied Adapt-Mix to estimate the genetic correlation structure of both admixed and non-admixed individuals using simulated and real data. We evaluated our method by measuring the performance of two summary statistics-based methods: imputation and joint-testing. When using our method as opposed to the current standard of ‘best guess’ reference panels, we observed a 28% decrease in mean-squared error for imputation and a 73.7% decrease in mean-squared error for joint-testing. Availability and implementation: Our method is publicly available in a software package called ADAPT-Mix available at https://github.com/dpark27/adapt_mix. Contact: noah.zaitlen@ucsf.edu PMID:26072481
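
    To make the idea of combining reference panels concrete, the sketch below expresses a study sample's local LD (correlation) matrix as a non-negative mixture of panel-specific LD matrices fitted by least squares. This is only an illustration of the general mixture idea on synthetic data; it is not the published Adapt-Mix estimator, and all matrices and weights here are hypothetical.

      # Hedged sketch: fit non-negative mixture weights over reference-panel LD
      # matrices so that their combination approximates a target LD matrix.
      import numpy as np
      from scipy.optimize import nnls

      def mix_reference_panels(panel_ld, target_ld):
          """Estimate mixture weights over reference-panel LD matrices."""
          m = target_ld.shape[0]
          mask = ~np.eye(m, dtype=bool)                 # use off-diagonal entries only
          A = np.column_stack([P[mask] for P in panel_ld])
          w, _ = nnls(A, target_ld[mask])               # non-negative least squares
          w = w / w.sum()                               # normalize to a convex combination
          mixed = sum(wi * P for wi, P in zip(w, panel_ld))
          return w, mixed

      def toy_ld(seed, i, j):
          """Toy LD matrix with correlation induced between columns i and j."""
          X = np.random.default_rng(seed).standard_normal((500, 5))
          X[:, j] += 0.8 * X[:, i]
          return np.corrcoef(X.T)

      ld_a, ld_b = toy_ld(1, 0, 1), toy_ld(2, 2, 3)
      target = 0.7 * ld_a + 0.3 * ld_b                  # pretend the study LD is a 70/30 mixture
      weights, mixed_ld = mix_reference_panels([ld_a, ld_b], target)
      print("recovered mixture weights:", np.round(weights, 2))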

  6. Certification of elements in and use of standard reference material 3280 multivitamin/multielement tablets.

    PubMed

    Turk, Gregory C; Sharpless, Katherine E; Cleveland, Danielle; Jongsma, Candice; Mackey, Elizabeth A; Marlow, Anthony F; Oflaz, Rabia; Paul, Rick L; Sieber, John R; Thompson, Robert Q; Wood, Laura J; Yu, Lee L; Zeisler, Rolf; Wise, Stephen A; Yen, James H; Christopher, Steven J; Day, Russell D; Long, Stephen E; Greene, Ella; Harnly, James; Ho, I-Pin; Betz, Joseph M

    2013-01-01

    Standard Reference Material 3280 Multivitamin/ Multielement Tablets was issued by the National Institute of Standards and Technology in 2009, and has certified and reference mass fraction values for 13 vitamins, 26 elements, and two carotenoids. Elements were measured using two or more analytical methods at NIST with additional data contributed by collaborating laboratories. This reference material is expected to serve a dual purpose: to provide quality assurance in support of a database of dietary supplement products and to provide a means for analysts, dietary supplement manufacturers, and researchers to assess the appropriateness and validity of their analytical methods and the accuracy of their results.

  7. Evaluation of reference gene suitability for quantitative expression analysis by quantitative polymerase chain reaction in the mandibular condyle of sheep.

    PubMed

    Jiang, Xin; Xue, Yang; Zhou, Hongzhi; Li, Shouhong; Zhang, Zongmin; Hou, Rui; Ding, Yuxiang; Hu, Kaijin

    2015-10-01

    Reference genes are commonly used as a reliable approach to normalize the results of quantitative polymerase chain reaction (qPCR), and to reduce errors in the relative quantification of gene expression. Suitable reference genes belonging to numerous functional classes have been identified for various species and tissue types. However, little is currently known regarding the most suitable reference genes for bone, specifically for the sheep mandibular condyle. Sheep are important for the study of human bone diseases, particularly temporomandibular diseases. The present study aimed to identify a set of reference genes suitable for the normalization of qPCR data from the mandibular condyle of sheep. A total of 12 reference genes belonging to various functional classes were selected, and the expression stability of the reference genes was determined in both the normal and fractured area of the sheep mandibular condyle. RefFinder, which integrates the following currently available computational algorithms: geNorm, NormFinder, BestKeeper, and the comparative ΔCt method, was used to compare and rank the candidate reference genes. The results obtained from the four methods demonstrated a similar trend: RPL19, ACTB, and PGK1 were the most stably expressed reference genes in the sheep mandibular condyle. As determined by RefFinder comprehensive analysis, the results of the present study suggested that RPL19 is the most suitable reference gene for studies associated with the sheep mandibular condyle. In addition, ACTB and PGK1 may be considered suitable alternatives.
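
    For readers unfamiliar with one of the four ranking algorithms mentioned above, the sketch below implements the comparative ΔCt stability ranking on hypothetical Ct values; the gene names are taken from the abstract but the numbers are invented for illustration, and geNorm, NormFinder, and BestKeeper use different calculations.

      # Hedged sketch of the comparative delta-Ct method: genes whose pairwise
      # Ct differences vary least across samples are ranked as most stable.
      import itertools
      import numpy as np

      def comparative_dct_ranking(ct):
          """Rank candidate reference genes by mean pairwise delta-Ct variability."""
          genes = list(ct)
          scores = {g: [] for g in genes}
          for g1, g2 in itertools.combinations(genes, 2):
              sd = np.std(np.asarray(ct[g1]) - np.asarray(ct[g2]), ddof=1)
              scores[g1].append(sd)
              scores[g2].append(sd)
          # Lower mean standard deviation = more stable expression
          return sorted((np.mean(v), g) for g, v in scores.items())

      ct_values = {                      # hypothetical Ct values across four samples
          "RPL19": [18.1, 18.3, 18.0, 18.2],
          "ACTB":  [16.5, 16.9, 16.4, 16.8],
          "PGK1":  [20.2, 20.6, 20.1, 20.7],
      }
      for stability, gene in comparative_dct_ranking(ct_values):
          print(f"{gene}: mean pairwise SD = {stability:.3f}")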

  8. Semiautomatic registration of 3D transabdominal ultrasound images for patient repositioning during postprostatectomy radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Presles, Benoît, E-mail: benoit.presles@creatis.insa-lyon.fr; Rit, Simon; Sarrut, David

    2014-12-15

    Purpose: The aim of the present work is to propose and evaluate registration algorithms of three-dimensional (3D) transabdominal (TA) ultrasound (US) images to set up postprostatectomy patients during radiation therapy. Methods: Three registration methods have been developed and evaluated to register a reference 3D-TA-US image acquired during the planning CT session and a 3D-TA-US image acquired before each treatment session. The first method (method A) uses only gray value information, whereas the second one (method B) uses only gradient information. The third one (method C) combines both sets of information. All methods restrict the comparison to a region of interest computed from the dilated reference positioning volume drawn on the reference image and use mutual information as a similarity measure. The considered geometric transformations are translations and have been optimized by using the adaptive stochastic gradient descent algorithm. Validation has been carried out using manual registration, performed by three operators, of the same set of image pairs used for the algorithms. Sixty-two treatment US images of seven patients irradiated after a prostatectomy have been registered to their corresponding reference US image. The reference registration has been defined as the average of the manual registration values. Registration error has been calculated by subtracting the reference registration from the algorithm result. For each session, the method has been considered a failure if the registration error was above both the interoperator variability of the session and a global threshold of 3.0 mm. Results: All proposed registration algorithms have no systematic bias. Method B leads to the best results with mean errors of −0.6, 0.7, and −0.2 mm in left–right (LR), superior–inferior (SI), and anterior–posterior (AP) directions, respectively. With this method, the standard deviations of the mean error are 1.7, 2.4, and 2.6 mm in LR, SI, and AP directions, respectively. The latter are inferior to the interoperator registration variabilities, which are 2.5, 2.5, and 3.5 mm in LR, SI, and AP directions, respectively. Failures occur in 5%, 18%, and 10% of cases in LR, SI, and AP directions, respectively. 69% of the sessions have no failure. Conclusions: Results of the best proposed registration algorithm of 3D-TA-US images for postprostatectomy treatment have no bias and are in the same variability range as manual registration. As the algorithm requires a short computation time, it could be used in clinical practice provided that a visual review is performed.
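
    The sketch below illustrates the core of such a registration: finding the translation that maximizes mutual information between a reference image and a moving image. For brevity it uses 2D arrays and a brute-force grid search over integer shifts instead of the 3D volumes, region-of-interest masking, and adaptive stochastic gradient descent optimizer described above, so it is only a conceptual stand-in.

      # Hedged sketch: translation-only registration by maximizing mutual information.
      import numpy as np

      def mutual_information(a, b, bins=32):
          """Mutual information of two equally shaped intensity arrays."""
          hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
          p = hist / hist.sum()
          px, py = p.sum(axis=1), p.sum(axis=0)
          nz = p > 0
          return np.sum(p[nz] * np.log(p[nz] / np.outer(px, py)[nz]))

      def register_translation(reference, moving, max_shift=5):
          """Return the integer (dy, dx) shift of `moving` that maximizes MI."""
          best, best_shift = -np.inf, (0, 0)
          for dy in range(-max_shift, max_shift + 1):
              for dx in range(-max_shift, max_shift + 1):
                  shifted = np.roll(np.roll(moving, dy, axis=0), dx, axis=1)
                  mi = mutual_information(reference, shifted)
                  if mi > best:
                      best, best_shift = mi, (dy, dx)
          return best_shift

      rng = np.random.default_rng(1)
      ref = rng.random((64, 64))
      mov = np.roll(ref, (2, -3), axis=(0, 1)) + 0.05 * rng.random((64, 64))
      print("estimated shift to realign:", register_translation(ref, mov))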

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daleu, C. L.; Plant, R. S.; Woolnough, S. J.

    Here, as part of an international intercomparison project, a set of single-column models (SCMs) and cloud-resolving models (CRMs) are run under the weak-temperature gradient (WTG) method and the damped gravity wave (DGW) method. For each model, the implementation of the WTG or DGW method involves a simulated column which is coupled to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. The simulated column has the same surface conditions as the reference state and is initialized with profiles from the reference state. We performed systematic comparison of the behavior of different models under a consistent implementation of the WTG method and the DGW method and systematic comparison of the WTG and DGW methods in models with different physics and numerics. CRMs and SCMs produce a variety of behaviors under both WTG and DGW methods. Some of the models reproduce the reference state while others sustain a large-scale circulation which results in either substantially lower or higher precipitation compared to the value of the reference state. CRMs show a fairly linear relationship between precipitation and circulation strength. SCMs display a wider range of behaviors than CRMs. Some SCMs under the WTG method produce zero precipitation. Within an individual SCM, a DGW simulation and a corresponding WTG simulation can produce different signed circulation. When initialized with a dry troposphere, DGW simulations always result in a precipitating equilibrium state. The greatest sensitivities to the initial moisture conditions occur for multiple stable equilibria in some WTG simulations, corresponding to either a dry equilibrium state when initialized as dry or a precipitating equilibrium state when initialized as moist. Multiple equilibria are seen in more WTG simulations for higher SST. In some models, the existence of multiple equilibria is sensitive to some parameters in the WTG calculations.

  10. Noise-free recovery of optodigital encrypted and multiplexed images.

    PubMed

    Henao, Rodrigo; Rueda, Edgar; Barrera, John F; Torroba, Roberto

    2010-02-01

    We present a method that allows storing multiple encrypted data using digital holography and a joint transform correlator architecture with a controllable angle reference wave. In this method, the information is multiplexed by using a key and a different reference wave angle for each object. In the recovering process, the use of different reference wave angles prevents noise produced by the nonrecovered objects from being superimposed on the recovered object; moreover, the position of the recovered object in the exit plane can be fully controlled. We present the theoretical analysis and the experimental results that show the potential and applicability of the method.

  11. Flip-avoiding interpolating surface registration for skull reconstruction.

    PubMed

    Xie, Shudong; Leow, Wee Kheng; Lee, Hanjing; Lim, Thiam Chye

    2018-03-30

    Skull reconstruction is an important and challenging task in craniofacial surgery planning, forensic investigation and anthropological studies. Existing methods typically reconstruct approximating surfaces that regard corresponding points on the target skull as soft constraints, thus incurring non-zero error even for non-defective parts and high overall reconstruction error. This paper proposes a novel geometric reconstruction method that non-rigidly registers an interpolating reference surface that regards corresponding target points as hard constraints, thus achieving low reconstruction error. To overcome the shortcoming of interpolating a surface, a flip-avoiding method is used to detect and exclude conflicting hard constraints that would otherwise cause surface patches to flip and self-intersect. Comprehensive test results show that our method is more accurate and robust than existing skull reconstruction methods. By incorporating symmetry constraints, it can produce more symmetric and normal results than other methods in reconstructing defective skulls with a large number of defects. It is robust against severe outliers such as radiation artifacts in computed tomography due to dental implants. In addition, test results also show that our method outperforms thin-plate spline for model resampling, which enables the active shape model to yield more accurate reconstruction results. As the reconstruction accuracy of defective parts varies with the use of different reference models, we also study the implication of reference model selection for skull reconstruction. Copyright © 2018 John Wiley & Sons, Ltd.

  12. Reproducibility of polycarbonate reference material in toxicity evaluation

    NASA Technical Reports Server (NTRS)

    Hilado, C. J.; Huttlinger, P. A.

    1981-01-01

    A specific lot of bisphenol A polycarbonate has been used for almost four years as the reference material for the NASA-USF-PSC toxicity screening test method. The reproducibility of the test results over this period of time indicates that certain plastics may be more suitable reference materials than the more traditional cellulosic materials.

  13. Incorporating geographical factors with artificial neural networks to predict reference values of erythrocyte sedimentation rate

    PubMed Central

    2013-01-01

    Background The measurement of the Erythrocyte Sedimentation Rate (ESR) value is a standard procedure performed during a typical blood test. In order to formulate a unified standard of establishing reference ESR values, this paper presents a novel prediction model in which local normal ESR values and corresponding geographical factors are used to predict reference ESR values using multi-layer feed-forward artificial neural networks (ANN). Methods and findings Local normal ESR values were obtained from hospital data, while geographical factors that include altitude, sunshine hours, relative humidity, temperature and precipitation were obtained from the National Geographical Data Information Centre in China. The results show that predicted values are statistically in agreement with measured values. Model results exhibit significant agreement between training data and test data. Consequently, the model is used to predict the unseen local reference ESR values. Conclusions Reference ESR values can be established with geographical factors by using artificial intelligence techniques. ANN is an effective method for simulating and predicting reference ESR values because of its ability to model nonlinear and complex relationships. PMID:23497145
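
    A minimal sketch of the modelling idea, assuming the five geographical factors named above as inputs: a small feed-forward network is trained to map them to local reference ESR values. The data below are synthetic placeholders, not the hospital or National Geographical Data Information Centre data used in the study.

      # Hedged sketch: feed-forward ANN regression of reference ESR values on
      # geographical factors, trained on synthetic data.
      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(42)
      n = 200
      # Columns: altitude (m), sunshine hours, relative humidity (%),
      # mean temperature (deg C), annual precipitation (mm)
      X = np.column_stack([
          rng.uniform(0, 4000, n),
          rng.uniform(1200, 3200, n),
          rng.uniform(30, 90, n),
          rng.uniform(-5, 25, n),
          rng.uniform(100, 2000, n),
      ])
      # Synthetic "reference ESR" with a weak dependence on altitude and humidity
      y = 15 - 0.002 * X[:, 0] + 0.05 * X[:, 2] + rng.normal(0, 1.5, n)

      model = make_pipeline(
          StandardScaler(),
          MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0),
      )
      model.fit(X[:150], y[:150])
      print("held-out R^2:", round(model.score(X[150:], y[150:]), 3))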

  14. Diffuse reflectance spectrophotometry with visible light: comparison of four different methods in a tissue phantom

    NASA Astrophysics Data System (ADS)

    Gade, John; Palmqvist, Dorte; Plomgård, Peter; Greisen, Gorm

    2006-01-01

    The purpose of the study was to compare algorithms of four methods (plus two modifications) for spectrophotometric haemoglobin saturation measurements. Comparison was made in tissue phantoms basically consisting of a phosphate buffer, Intralipid and blood, allowing samples to be taken for reference measurements. Three experimental series were made. In experiment A (eight phantoms) we used the Knoefel method and measured specific extinction coefficients with a reflection spectrophotometer. In experiment B (six phantoms) the fully oxygenated phantoms were gradually deoxygenated with baker's yeast, and simultaneous measurements were made with our spectrophotometer and with a reference oximeter (ABL-605) in 3 min intervals. For each spectrophotometric measurement haemoglobin saturation was calculated with all algorithms and modifications, and compared with the reference. In experiment C (11 phantoms) we evaluated the ability of a modification of the Knoefel method to measure haemoglobin concentration in absolute quantities using extinction coefficients from experiment A. Results: Experiment A: with the Knoefel method the extinction coefficients (±SD) for oxyhaemoglobin at 553.04 and 573.75 nm were 1.117 (±0.0396) OD mM-1 and 1.680 (±0.0815) OD mM-1, respectively, and for deoxyhaemoglobin 1.205 (±0.0514) OD mM-1 and 0.953 (±0.0487) OD mM-1, respectively. Experiment B: high correlation with the reference was found in all methods (r = 0.94-0.97). However, agreement varied from evidently wrong in method 3 and the original method 4 (e.g. saturation above 160%) to high agreement in method 2 as well as the modifications of methods 1 and 4, where oxygen dissociation curves were close to the reference method. Experiment C: with the modified Knoefel method the mean haemoglobin concentration difference from the reference was 8.3% and the correlation was high (r = 0.91). We conclude that method 2 and the modifications of 1 and 4 were superior to the others, but depended on known values in the same or similar phantoms. The original method 1 was independent of results from the tissue phantoms, but agreement was slightly poorer. Method 3 and the original method 4 could not be recommended. The ability of the modified method 1 to measure haemoglobin concentration is promising, but needs further development.
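
    Using the experiment A extinction coefficients quoted above, a basic two-wavelength estimate of haemoglobin saturation reduces to solving a 2x2 Beer-Lambert system. The sketch below shows only that inversion, with path length and scattering corrections ignored, so it is not a full implementation of any of the four compared methods.

      # Hedged sketch: two-wavelength haemoglobin saturation from absorbances,
      # using the extinction coefficients reported for 553.04 and 573.75 nm.
      import numpy as np

      # Rows: 553.04 nm and 573.75 nm; columns: HbO2, Hb (OD mM-1)
      E = np.array([[1.117, 1.205],
                    [1.680, 0.953]])

      def saturation_from_absorbance(a_553, a_574):
          """Solve E @ [c_HbO2, c_Hb] = [a_553, a_574] and return SO2 in percent."""
          c_oxy, c_deoxy = np.linalg.solve(E, [a_553, a_574])
          return 100.0 * c_oxy / (c_oxy + c_deoxy)

      # Example: absorbances generated from a 75% saturated, 1 mM haemoglobin solution
      a = E @ np.array([0.75, 0.25])
      print(f"estimated SO2: {saturation_from_absorbance(*a):.1f}%")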

  15. How accurately can the peak skin dose in fluoroscopy be determined using indirect dose metrics?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, A. Kyle, E-mail: kyle.jones@mdanderson.org; Ensor, Joe E.; Pasciak, Alexander S.

    Purpose: Skin dosimetry is important for fluoroscopically-guided interventions, as peak skin doses (PSD) that result in skin reactions can be reached during these procedures. There is no consensus as to whether or not indirect skin dosimetry is sufficiently accurate for fluoroscopically-guided interventions. However, measuring PSD with film is difficult and the decision to do so must be made a priori. The purpose of this study was to assess the accuracy of different types of indirect dose estimates and to determine if PSD can be calculated within ±50% using indirect dose metrics for embolization procedures. Methods: PSD were measured directly using radiochromic film for 41 consecutive embolization procedures at two sites. Indirect dose metrics from the procedures were collected, including reference air kerma. Four different estimates of PSD were calculated from the indirect dose metrics and compared along with reference air kerma to the measured PSD for each case. The four indirect estimates included a standard calculation method, the use of detailed information from the radiation dose structured report, and two simplified calculation methods based on the standard method. Indirect dosimetry results were compared with direct measurements, including an analysis of uncertainty associated with film dosimetry. Factors affecting the accuracy of the different indirect estimates were examined. Results: When using the standard calculation method, calculated PSD were within ±35% for all 41 procedures studied. Calculated PSD were within ±50% for a simplified method using a single source-to-patient distance for all calculations. Reference air kerma was within ±50% for all but one procedure. Cases for which reference air kerma or calculated PSD exhibited large (±35%) differences from the measured PSD were analyzed, and two main causative factors were identified: unusually small or large source-to-patient distances and large contributions to reference air kerma from cone beam computed tomography or acquisition runs acquired at large primary gantry angles. When calculated uncertainty limits [−12.8%, 10%] were applied to directly measured PSD, most indirect PSD estimates remained within ±50% of the measured PSD. Conclusions: Using indirect dose metrics, PSD can be determined within ±35% for embolization procedures. Reference air kerma can be used without modification to set notification limits and substantial radiation dose levels, provided the displayed reference air kerma is accurate. These results can reasonably be extended to similar procedures, including vascular and interventional oncology. Considering these results, film dosimetry is likely an unnecessary effort for these types of procedures when indirect dose metrics are available.
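
    The sketch below shows the general shape of an indirect PSD estimate from cumulative reference air kerma: an inverse-square correction from the interventional reference point to the actual skin plane plus multiplicative correction factors. All factor values and distances here are illustrative assumptions, not the ones used in the study's standard calculation method.

      # Hedged sketch: simplified indirect peak-skin-dose estimate from reference air kerma.
      def estimate_psd(reference_air_kerma_gy,
                       source_to_skin_cm,
                       source_to_irp_cm=60.0,    # assumed distance to the interventional reference point
                       backscatter_factor=1.35,  # assumed typical backscatter factor
                       f_factor=1.06,            # assumed air-kerma-to-tissue-dose conversion
                       table_transmission=0.80): # assumed table/pad attenuation
          """Return an approximate peak skin dose in Gy."""
          inverse_square = (source_to_irp_cm / source_to_skin_cm) ** 2
          return (reference_air_kerma_gy * inverse_square
                  * backscatter_factor * f_factor * table_transmission)

      # Example: 3.0 Gy displayed reference air kerma, skin 70 cm from the focal spot
      print(f"estimated PSD: {estimate_psd(3.0, 70.0):.2f} Gy")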

  16. Development of a candidate reference measurement procedure for the analysis of cortisol in human serum samples by isotope dilution-gas chromatography-mass spectrometry.

    PubMed

    Kawaguchi, Migaku; Takatsu, Akiko

    2009-08-01

    A candidate reference measurement procedure involving isotope dilution coupled with gas chromatography-mass spectrometry (GC-MS) has been developed and critically evaluated. An isotopically labeled internal standard, cortisol-d(2), was added to a serum sample. After equilibration, solid-phase extraction (SPE) for sample preparation and derivatization with heptafluorobutyric anhydride (HFBA) were performed for GC-MS analysis. The limit of detection (LOD) and the limit of quantification (LOQ) were 5 and 20 ng g(-1), respectively. The recovery of the added cortisol ranged from 99.8 to 101.0%. Excellent precision was obtained with a within-day variation (RSD) of 0.7% for GC-MS analysis. The accuracy of the measurement was evaluated by comparing the results of this reference measurement procedure on lyophilized human serum reference materials for cortisol (European Reference Materials (ERM)-DA 192) as Certified Reference Materials (CRMs). The results of this method for total cortisol agreed with the certified values within the stated uncertainty. This method, which is simple, easy, accurate, and highly precise, and is free from interference from structural analogues, qualifies as a reference measurement procedure.

  17. A New Dual-purpose Quality Control Dosimetry Protocol for Diagnostic Reference-level Determination in Computed Tomography.

    PubMed

    Sohrabi, Mehdi; Parsi, Masoumeh; Sina, Sedigheh

    2018-05-17

    A diagnostic reference level is an advisory dose level set by a regulatory authority in a country as an efficient criterion for protection of patients from unwanted medical exposure. In computed tomography, the direct dose measurement and data collection methods are commonly applied for determination of diagnostic reference levels. Recently, a new quality-control-based dose survey method was proposed by the authors to simplify the diagnostic reference-level determination using a retrospective quality control database usually available at a regulatory authority in a country. In line with such a development, a prospective dual-purpose quality control dosimetry protocol is proposed for determination of diagnostic reference levels in a country, which can be simply applied by quality control service providers. This new proposed method was applied to five computed tomography scanners in Shiraz, Iran, and diagnostic reference levels for head, abdomen/pelvis, sinus, chest, and lumbar spine examinations were determined. The results were compared to those obtained by the data collection and quality-control-based dose survey methods, carried out in parallel in this study, and were found to agree well within approximately 6%. This is highly acceptable for quality-control-based methods according to International Atomic Energy Agency tolerance levels (±20%).
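
    Whatever the data source (direct measurement, data collection, or a quality-control database), a diagnostic reference level is conventionally set at the 75th percentile of the surveyed dose distribution. A minimal sketch of that final step, with hypothetical scanner values:

      # Hedged sketch: DRL as the 75th percentile of a dose-survey distribution.
      import numpy as np

      def diagnostic_reference_level(dose_values):
          """Return the 75th percentile of the surveyed dose distribution."""
          return np.percentile(np.asarray(dose_values, dtype=float), 75)

      # Median CTDIvol (mGy) for a head protocol on five hypothetical scanners
      ctdi_head = [42.0, 55.3, 48.7, 61.2, 50.1]
      print(f"head CTDIvol DRL: {diagnostic_reference_level(ctdi_head):.1f} mGy")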

  18. An Optimal Control Modification to Model-Reference Adaptive Control for Fast Adaptation

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.; Krishnakumar, Kalmanje; Boskovic, Jovan

    2008-01-01

    This paper presents a method that can achieve fast adaptation for a class of model-reference adaptive control. It is well-known that standard model-reference adaptive control exhibits high-gain control behaviors when a large adaptive gain is used to achieve fast adaptation in order to reduce tracking error rapidly. High-gain control creates high-frequency oscillations that can excite unmodeled dynamics and can lead to instability. The fast adaptation approach is based on the minimization of the squares of the tracking error, which is formulated as an optimal control problem. The necessary condition of optimality is used to derive an adaptive law using the gradient method. This adaptive law is shown to result in uniform boundedness of the tracking error by means of Lyapunov's direct method. Furthermore, this adaptive law allows a large adaptive gain to be used without causing undesired high-gain control effects. The method is shown to be more robust than standard model-reference adaptive control. Simulations demonstrate the effectiveness of the proposed method.
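
    For context, the sketch below simulates a first-order plant under the standard gradient (Lyapunov-based) model-reference adaptive law, i.e. the baseline scheme that the optimal control modification builds on; the modification term itself is not implemented here, and all plant and gain values are arbitrary.

      # Hedged sketch: standard first-order MRAC with gradient adaptive laws,
      # integrated with a simple Euler scheme.
      import numpy as np

      a, b = 1.0, 3.0          # unknown plant: x_dot = a*x + b*u
      a_m, b_m = -4.0, 4.0     # reference model: xm_dot = a_m*xm + b_m*r
      gamma = 10.0             # adaptive gain
      dt, T = 0.001, 10.0

      x = xm = 0.0
      kx = kr = 0.0            # adaptive feedback and feedforward gains
      for step in range(int(T / dt)):
          t = step * dt
          r = 1.0 if (t % 4.0) < 2.0 else -1.0   # square-wave reference command
          e = x - xm                             # tracking error
          u = kx * x + kr * r
          x += dt * (a * x + b * u)
          xm += dt * (a_m * xm + b_m * r)
          kx += dt * (-gamma * e * x * np.sign(b))
          kr += dt * (-gamma * e * r * np.sign(b))

      print(f"final tracking error: {x - xm:.4f}")
      print(f"ideal gains kx={(a_m - a) / b:.2f}, kr={b_m / b:.2f}; "
            f"adapted gains kx={kx:.2f}, kr={kr:.2f}")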

  19. POTENTIAL RADIOACTIVE POLLUTANTS RESULTING FROM EXPANDED ENERGY PROGRAMS

    EPA Science Inventory

    An effective environmental monitoring program must have a quality assurance component to assure the production of valid data. Quality assurance has many components: calibration standards, standard reference materials, standard reference methods, interlaboratory comparison studies...

  20. Gene expression studies of reference genes for quantitative real-time PCR: an overview in insects.

    PubMed

    Shakeel, Muhammad; Rodriguez, Alicia; Tahir, Urfa Bin; Jin, Fengliang

    2018-02-01

    Whenever gene expression is being examined, it is essential that a normalization process is carried out to eliminate non-biological variations. The use of reference genes, such as glyceraldehyde-3-phosphate dehydrogenase, actin, and ribosomal protein genes, is the usual method of choice for normalizing gene expression. Although reference genes are used to normalize target gene expression, a major problem is that the stability of these genes differs among tissues, developmental stages, species, and responses to abiotic factors. Therefore, the use and validation of multiple reference genes are required. This review discusses the reasons why RT-qPCR has become the preferred method for validating results of gene expression profiles, the use of specific and non-specific dyes, the importance of primers and probes for qPCR, and several statistical algorithms developed to help validate potential reference genes. The conflicts arising from the use of classical reference genes in gene normalization, and their replacement with novel reference genes, are also discussed, with reference to the high or low stability of classical and novel reference genes under various biotic and abiotic experimental conditions and the methods applied for their amplification.

  1. Automated color classification of urine dipstick image in urine examination

    NASA Astrophysics Data System (ADS)

    Rahmat, R. F.; Royananda; Muchtar, M. A.; Taqiuddin, R.; Adnan, S.; Anugrahwaty, R.; Budiarto, R.

    2018-03-01

    Urine examination using urine dipstick has long been used to determine the health status of a person. The economical and convenient use of the urine dipstick is one of the reasons it is still used to check people's health status. In practice, urine dipsticks are generally read manually by visually comparing them with the reference colors, which leads to differences in perception when reading the examination results. In this research, the authors used a scanner to obtain the urine dipstick color image. The use of a scanner can be one of the solutions for reading the result of a urine dipstick because the light produced is consistent. A method is required to overcome the problems of matching the urine dipstick colors against the test reference colors, which has so far been done manually. The method proposed by the authors combines Euclidean distance and Otsu thresholding with RGB color feature extraction to match the colors on the urine dipstick with the standard reference colors of the urine examination. The result shows that the proposed approach was able to classify the colors on a urine dipstick with an accuracy of 95.45%. The accuracy of color classification on the urine dipstick against the standard reference color is influenced by the resolution of the scanner used: the higher the scanner resolution, the higher the accuracy.
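
    A minimal sketch of the colour-matching step described above: each pad colour is assigned to the nearest reference colour by Euclidean distance in RGB space. The reference chart values are hypothetical placeholders, and the Otsu-based pad segmentation used in the paper is omitted.

      # Hedged sketch: nearest-reference-colour classification in RGB space.
      import numpy as np

      # Hypothetical reference chart for a "glucose" pad: label -> mean RGB
      GLUCOSE_CHART = {
          "negative": (112, 180, 170),
          "trace":    (120, 170, 120),
          "1+":       (140, 160,  80),
          "2+":       (150, 130,  60),
          "3+":       (130,  90,  60),
      }

      def classify_pad_color(rgb, chart):
          """Return the chart label whose reference colour is closest in RGB."""
          rgb = np.asarray(rgb, dtype=float)
          distances = {label: np.linalg.norm(rgb - np.asarray(ref, dtype=float))
                       for label, ref in chart.items()}
          return min(distances, key=distances.get)

      print(classify_pad_color((138, 158, 85), GLUCOSE_CHART))  # -> "1+"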

  2. A method for determining the conversion efficiency of multiple-cell photovoltaic devices

    NASA Astrophysics Data System (ADS)

    Glatfelter, Troy; Burdick, Joseph

    A method for accurately determining the conversion efficiency of any multiple-cell photovoltaic device under any arbitrary reference spectrum is presented. This method makes it possible to obtain not only the short-circuit current, but also the fill factor, the open-circuit voltage, and hence the conversion efficiency of a multiple-cell device under any reference spectrum. Results are presented which allow a comparison of the I-V parameters of two-terminal, two- and three-cell tandem devices measured under a multiple-source simulator with the same parameters measured under different reference spectra. It is determined that the uncertainty in the conversion efficiency of a multiple-cell photovoltaic device obtained with this method is less than +/-3 percent.

  3. Application of solid/liquid extraction for the gravimetric determination of lipids in royal jelly.

    PubMed

    Antinelli, Jean-François; Davico, Renée; Rognone, Catherine; Faucon, Jean-Paul; Lizzani-Cuvelier, Louisette

    2002-04-10

    Gravimetric lipid determination is a major parameter for the characterization and the authentication of royal jelly quality. A solid/liquid extraction was compared to the reference method, which is based on liquid/liquid extraction. The amount of royal jelly and the time of the extraction were optimized in comparison to the reference method. Boiling/rinsing ratio and spread of royal jelly onto the extraction thimble were identified as critical parameters, resulting in good accuracy and precision for the alternative method. Comparison of reproducibility and repeatability of both methods associated with gas chromatographic analysis of the composition of the extracted lipids showed no differences between the two methods. As the intra-laboratory validation tests were comparable to the reference method, while offering rapidity and a decrease in amount of solvent used, it was concluded that the proposed method should be used with no modification of quality criteria and norms established for royal jelly characterization.

  4. Example-Based Image Colorization Using Locality Consistent Sparse Representation.

    PubMed

    Bo Li; Fuchen Zhao; Zhuo Su; Xiangguo Liang; Yu-Kun Lai; Rosin, Paul L

    2017-11-01

    Image colorization aims to produce a natural looking color image from a given gray-scale image, which remains a challenging problem. In this paper, we propose a novel example-based image colorization method exploiting a new locality consistent sparse representation. Given a single reference color image, our method automatically colorizes the target gray-scale image by sparse pursuit. For efficiency and robustness, our method operates at the superpixel level. We extract low-level intensity features, mid-level texture features, and high-level semantic features for each superpixel, which are then concatenated to form its descriptor. The collection of feature vectors for all the superpixels from the reference image composes the dictionary. We formulate colorization of target superpixels as a dictionary-based sparse reconstruction problem. Inspired by the observation that superpixels with similar spatial location and/or feature representation are likely to match spatially close regions from the reference image, we further introduce a locality promoting regularization term into the energy formulation, which substantially improves the matching consistency and subsequent colorization results. Target superpixels are colorized based on the chrominance information from the dominant reference superpixels. Finally, to further improve coherence while preserving sharpness, we develop a new edge-preserving filter for chrominance channels with the guidance from the target gray-scale image. To the best of our knowledge, this is the first work on sparse pursuit image colorization from single reference images. Experimental results demonstrate that our colorization method outperforms the state-of-the-art methods, both visually and quantitatively using a user study.

  5. Rapid identification of oral Actinomyces species cultivated from subgingival biofilm by MALDI-TOF-MS

    PubMed Central

    Stingu, Catalina S.; Borgmann, Toralf; Rodloff, Arne C.; Vielkind, Paul; Jentsch, Holger; Schellenberger, Wolfgang; Eschrich, Klaus

    2015-01-01

    Background Actinomyces are a common part of the residential flora of the human intestinal tract, genitourinary system and skin. Isolation and identification of Actinomyces by conventional methods is often difficult and time consuming. In recent years, matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) has become a rapid and simple method to identify bacteria. Objective The present study evaluated a new in-house algorithm using MALDI-TOF-MS for rapid identification of different species of oral Actinomyces cultivated from subgingival biofilm. Design Eleven reference strains and 674 clinical strains were used in this study. All the strains were preliminarily identified using biochemical methods and then subjected to MALDI-TOF-MS analysis using both similarity-based analysis and classification methods (support vector machine [SVM]). The genotype of the reference strains and of 232 clinical strains was identified by sequence analysis of the 16S ribosomal RNA (rRNA). Results The sequence analysis of the 16S rRNA gene of all references strains confirmed their previous identification. The MALDI-TOF-MS spectra obtained from the reference strains and the other clinical strains undoubtedly identified as Actinomyces by 16S rRNA sequencing were used to create the mass spectra reference database. Already a visual inspection of the mass spectra of different species reveals both similarities and differences. However, the differences between them are not large enough to allow a reliable differentiation by similarity analysis. Therefore, classification methods were applied as an alternative approach for differentiation and identification of Actinomyces at the species level. A cross-validation of the reference database representing 14 Actinomyces species yielded correct results for all species which were represented by more than two strains in the database. Conclusions Our results suggest that a combination of MALDI-TOF-MS with powerful classification algorithms, such as SVMs, provide a useful tool for the differentiation and identification of oral Actinomyces. PMID:25597306
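
    As a rough illustration of the classification approach (not the in-house algorithm itself), the sketch below trains a linear support vector machine on synthetic spectral feature vectors and reports cross-validated accuracy; real MALDI-TOF spectra would first require peak alignment and preprocessing.

      # Hedged sketch: SVM classification of (synthetic) mass-spectral feature
      # vectors with 5-fold cross-validation.
      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(7)
      n_per_species, n_features, n_species = 30, 200, 4

      # Synthetic "spectra": each species gets its own mean intensity profile
      X = np.vstack([rng.normal(loc=rng.random(n_features), scale=0.3,
                                size=(n_per_species, n_features))
                     for _ in range(n_species)])
      y = np.repeat(np.arange(n_species), n_per_species)

      clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
      scores = cross_val_score(clf, X, y, cv=5)
      print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")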

  6. Validation of no-reference image quality index for the assessment of digital mammographic images

    NASA Astrophysics Data System (ADS)

    de Oliveira, Helder C. R.; Barufaldi, Bruno; Borges, Lucas R.; Gabarda, Salvador; Bakic, Predrag R.; Maidment, Andrew D. A.; Schiabel, Homero; Vieira, Marcelo A. C.

    2016-03-01

    To ensure optimal clinical performance of digital mammography, it is necessary to obtain images with high spatial resolution and low noise, keeping radiation exposure as low as possible. These requirements directly affect the interpretation of radiologists. The quality of a digital image should be assessed using objective measurements. In general, these methods measure the similarity between a degraded image and an ideal image without degradation (ground-truth), used as a reference. These methods are called Full-Reference Image Quality Assessment (FR-IQA). However, for digital mammography, an image without degradation is not available in clinical practice; thus, an objective method to assess the quality of mammograms must be performed without reference. The purpose of this study is to present a Normalized Anisotropic Quality Index (NAQI), based on the Rényi entropy in the pseudo-Wigner domain, to assess mammography images in terms of spatial resolution and noise without any reference. The method was validated using synthetic images acquired through an anthropomorphic breast software phantom, and the clinical exposures on anthropomorphic breast physical phantoms and patients' mammograms. The results reported by this no-reference index follow the same behavior as other well-established full-reference metrics, e.g., the peak signal-to-noise ratio (PSNR) and structural similarity index (SSIM). Reductions of 50% in the radiation dose in phantom images were translated as a decrease of 4 dB in the PSNR, 25% in the SSIM, and 33% in the NAQI, evidencing that the proposed metric is sensitive to the noise resulting from dose reduction. The clinical results showed that images reduced to 53% and 30% of the standard radiation dose reported reductions of 15% and 25% in the NAQI, respectively. Thus, this index may be used in clinical practice as an image quality indicator to improve the quality assurance programs in mammography; hence, the proposed method reduces the inter-observer subjectivity in the reporting of image quality assessment.

  7. Theoretical studies of floating-reference method for NIR blood glucose sensing

    NASA Astrophysics Data System (ADS)

    Shi, Zhenzhi; Yang, Yue; Zhao, Huijuan; Chen, Wenliang; Liu, Rong; Xu, Kexin

    2011-03-01

    Non-invasive blood glucose monitoring using NIR light has suffered from variations in the optical background that are mainly caused by changes in the human body, such as changes in temperature, water concentration, and so on. In order to eliminate these internal influences and external interferences, a so-called floating-reference method has been proposed to provide an internal reference. From the analysis of the diffuse reflectance spectrum, a position has been found where the diffusely reflected light is not sensitive to the glucose concentration. Our previous work has proved the existence of the reference position using the diffusion equation. However, since glucose monitoring generally uses NIR light in the region of 1000-2000 nm, the diffusion equation is not valid because of the high absorption coefficient and small source-detector separations. In this paper, a steady-state high-order approximate model is used to further investigate the existence of the floating-reference position in a semi-infinite medium. Based on the analysis of the impact of different optical parameters on the spatially resolved reflectance of light, we find that the existence of the floating-reference position is the result of the interaction of optical parameters. Comparing to the results of Monte Carlo simulation, the applicable regions of the diffusion approximation and the higher-order approximation for the calculation of the floating-reference position are discussed at wavelengths of 1000-1800 nm, using intralipid solutions of different concentrations. The results indicate that when the reduced albedo is greater than 0.93, the diffusion approximation results are closer to the simulation results; otherwise, the higher-order approximation is more applicable.

  8. Validation of the concentration profiles obtained from the near infrared/multivariate curve resolution monitoring of reactions of epoxy resins using high performance liquid chromatography as a reference method.

    PubMed

    Garrido, M; Larrechi, M S; Rius, F X

    2007-03-07

    This paper reports the validation of the results obtained by combining near infrared spectroscopy and multivariate curve resolution-alternating least squares (MCR-ALS) and using high performance liquid chromatography as a reference method, for the model reaction of phenylglycidylether (PGE) and aniline. The results are obtained as concentration profiles over the reaction time. The trueness of the proposed method has been evaluated in terms of lack of bias. The joint test for the intercept and the slope showed that there were no significant differences between the profiles calculated spectroscopically and the ones obtained experimentally by means of the chromatographic reference method at an overall level of confidence of 5%. The uncertainty of the results was estimated by using information derived from the process of assessment of trueness. Such operational aspects as the cost and availability of instrumentation and the length and cost of the analysis were evaluated. The method proposed is a good way of monitoring the reactions of epoxy resins, and it adequately shows how the species concentration varies over time.

  9. Methodological evaluation and comparison of five urinary albumin measurements.

    PubMed

    Liu, Rui; Li, Gang; Cui, Xiao-Fan; Zhang, Dong-Ling; Yang, Qing-Hong; Mu, Xiao-Yan; Pan, Wen-Jie

    2011-01-01

    Microalbuminuria is an indicator of kidney damage and a risk factor for the progression of kidney disease, cardiovascular disease, and so on. Therefore, accurate and precise measurement of urinary albumin is critical. However, there are no reference measurement procedures and reference materials for urinary albumin. Nephelometry, turbidimetry, colloidal gold method, radioimmunoassay, and chemiluminescence immunoassay were performed for methodological evaluation, based on imprecision tests, recovery rate, linearity, haemoglobin interference rate, and verified reference interval. Then we tested 40 urine samples from diabetic patients by each method, and compared the results between assays. The results indicate that nephelometry is the method with the best analytical performance among the five methods, with an average intraassay coefficient of variation (CV) of 2.6%, an average interassay CV of 1.7%, a mean recovery of 99.6%, a linearity of R=1.00 from 2 to 250 mg/l, and an interference rate of <10% at haemoglobin concentrations of <1.82 g/l. The correlation (r) between assays was from 0.701 to 0.982, and the Bland-Altman plots indicated each assay provided significantly different results from each other. Nephelometry is the clinical urinary albumin method with the best analytical performance in our study. © 2011 Wiley-Liss, Inc.

  10. Reference layer adaptive filtering (RLAF) for EEG artifact reduction in simultaneous EEG-fMRI.

    PubMed

    Steyrl, David; Krausz, Gunther; Koschutnig, Karl; Edlinger, Günter; Müller-Putz, Gernot R

    2017-04-01

    Simultaneous electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) combines advantages of both methods, namely the high temporal resolution of EEG and the high spatial resolution of fMRI. However, EEG quality is limited due to severe artifacts caused by fMRI scanners. To improve EEG data quality substantially, we introduce methods that use a reusable reference layer EEG cap prototype in combination with adaptive filtering. The first method, reference layer adaptive filtering (RLAF), uses adaptive filtering with reference layer artifact data to optimize artifact subtraction from EEG. In the second method, multi band reference layer adaptive filtering (MBRLAF), adaptive filtering is performed on bandwidth limited sub-bands of the EEG and the reference channels. The results suggest that RLAF outperforms the baseline method, average artifact subtraction, in all settings and also its direct predecessor, reference layer artifact subtraction (RLAS), in lower (<35 Hz) frequency ranges. MBRLAF is computationally more demanding than RLAF, but highly effective in all EEG frequency ranges. Effectivity is determined by visual inspection, as well as root-mean-square voltage reduction and power reduction of EEG, provided that physiological EEG components such as occipital EEG alpha power and visual evoked potentials (VEP) are preserved. We demonstrate that both RLAF and MBRLAF improve VEP quality. For that, we calculate the mean-squared distance of single trial VEP to the mean VEP and estimate single trial VEP classification accuracies. We found that the average mean-squared distance is lowest and the average classification accuracy is highest after MBRLAF; RLAF was second best. In conclusion, the results suggest that RLAF and MBRLAF are potentially very effective in improving EEG quality of simultaneous EEG-fMRI. Highlights: We present a new and reusable reference layer cap prototype for simultaneous EEG-fMRI. We introduce new algorithms for reducing EEG artifacts due to simultaneous fMRI. The algorithms combine a reference layer and adaptive filtering. Several evaluation criteria suggest superior effectivity in terms of artifact reduction. We demonstrate that physiological EEG components are preserved.
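
    The sketch below conveys the basic reference-layer adaptive-filtering idea with a single-channel least-mean-squares (LMS) filter: a reference-layer channel that carries artifact but no brain signal is adaptively filtered and subtracted from the EEG channel. The published method's exact filter configuration and its multi-band variant are not reproduced, and the signals are synthetic.

      # Hedged sketch: LMS adaptive filtering of an EEG channel against a
      # reference-layer channel that records only the shared artifact.
      import numpy as np

      def lms_clean(eeg, reference, n_taps=8, mu=0.01):
          """Subtract the adaptively filtered reference signal from the EEG channel."""
          w = np.zeros(n_taps)
          cleaned = np.zeros_like(eeg)
          for n in range(len(eeg)):
              x = reference[max(0, n - n_taps + 1):n + 1][::-1]   # newest sample first
              x = np.pad(x, (0, n_taps - len(x)))
              y = w @ x                      # artifact estimate
              e = eeg[n] - y                 # cleaned sample = estimation error
              w += 2 * mu * e * x            # LMS weight update
              cleaned[n] = e
          return cleaned

      rng = np.random.default_rng(3)
      t = np.arange(5000) / 250.0                       # 250 Hz sampling
      brain = 0.5 * np.sin(2 * np.pi * 10 * t)          # 10 Hz "alpha" activity
      artifact = rng.standard_normal(t.size)            # shared artifact source
      eeg = brain + 2.0 * artifact
      cleaned = lms_clean(eeg, artifact)
      print("residual artifact RMS before/after:",
            round(np.sqrt(np.mean((eeg - brain) ** 2)), 2),
            round(np.sqrt(np.mean((cleaned[500:] - brain[500:]) ** 2)), 2))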

  11. Method modification of the Legipid® Legionella fast detection test kit.

    PubMed

    Albalat, Guillermo Rodríguez; Broch, Begoña Bedrina; Bono, Marisa Jiménez

    2014-01-01

    Legipid(®) Legionella Fast Detection is a test based on combined magnetic immunocapture and enzyme-immunoassay (CEIA) for the detection of Legionella in water. The test is based on the use of anti-Legionella antibodies immobilized on magnetic microspheres. Target microorganism is preconcentrated by filtration. Immunomagnetic analysis is applied on these preconcentrated water samples in a final test portion of 9 mL. The test kit was certified by the AOAC Research Institute as Performance Tested Method(SM) (PTM) No. 111101 in a PTM validation which certifies the performance claims of the test method in comparison to the ISO reference method 11731-1998 and the revision 11731-2004 "Water Quality: Detection and Enumeration of Legionella pneumophila" in potable water, industrial water, and waste water. The modification of this test kit has been approved. The modification includes increasing the target analyte from L. pneumophila to Legionella species and adding an optical reader to the test method. In this study, 71 strains of Legionella spp. other than L. pneumophila were tested to determine its reactivity with the kit based on CEIA. All the strains of Legionella spp. tested by the CEIA test were confirmed positive by reference standard method ISO 11731. This test (PTM 111101) has been modified to include a final optical reading. A methods comparison study was conducted to demonstrate the equivalence of this modification to the reference culture method. Two water matrixes were analyzed. Results show no statistically detectable difference between the test method and the reference culture method for the enumeration of Legionella spp. The relative level of detection was 93 CFU/volume examined (LOD50). For optical reading, the LOD was 40 CFU/volume examined and the LOQ was 60 CFU/volume examined. Results showed that the test Legipid Legionella Fast Detection is equivalent to the reference culture method for the enumeration of Legionella spp.

  12. Standardization of gamma-glutamyltransferase assays by intermethod calibration. Effect on determining common reference limits.

    PubMed

    Steinmetz, Josiane; Schiele, Françoise; Gueguen, René; Férard, Georges; Henny, Joseph

    2007-01-01

    The improvement of the consistency of gamma-glutamyltransferase (GGT) activity results among different assays after calibration with a common material was estimated. We evaluated whether this harmonization could lead to reference limits common to different routine methods. Seven laboratories measured GGT activity using their own routine analytical system both according to the manufacturer's recommendation and after calibration with a multi-enzyme calibrator [value assigned by the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) reference procedure]. All samples were re-measured using the IFCC reference procedure. Two groups of subjects were selected in each laboratory: a group of healthy men aged 18-25 years without long-term medication and with alcohol consumption less than 44 g/day and a group of subjects with elevated GGT activity. The day-to-day coefficients of variation were less than 2.9% in each laboratory. The means obtained in the group of healthy subjects without common calibration (range of the means 16-23 U/L) were significantly different from those obtained by the IFCC procedure in five laboratories. After calibration, the means remained significantly different from the IFCC procedure results in only one laboratory. For three calibrated methods, the slope values of linear regression vs. the IFCC procedure were not different from the value 1. The results obtained with these three methods for healthy subjects (n=117) were gathered and reference limits were calculated. These were 11-49 U/L (2.5th-97.5th percentiles). The calibration also improved the consistency of elevated results when compared to the IFCC procedure. The common calibration improved the level of consistency between different routine methods. It made it possible to define common reference limits, which are quite similar to those proposed by the IFCC. This approach should lead to a real benefit in terms of prevention, screening, diagnosis, therapeutic monitoring and epidemiological studies.

  13. An evaluation of the distribution of sexual references among "Top 8" MySpace friends.

    PubMed

    Moreno, Megan A; Brockman, Libby; Rogers, Cara B; Christakis, Dimitri A

    2010-10-01

    To evaluate whether online friends of adolescents who display sexual references on a social networking site also display such references. The method used was content analysis. The results of this study showed that adolescents who displayed explicit sexual references were more likely to have online friends who also displayed such references. Thus, social networking sites present new opportunities to investigate adolescent sexual behavior. Copyright © 2010 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  14. Tables for Supersonic Flow Around Right Circular Cones at Small Angle of Attack

    NASA Technical Reports Server (NTRS)

    Sims, Joseph L.

    1964-01-01

    The solution of supersonic flow fields by the method of characteristics requires that starting conditions be known. Ferri, in reference 1, developed a method-of-characteristics solution for axially symmetric bodies of revolution at small angles of attack. With computing machinery that is now available, this has become a feasible method for computing the aerodynamic characteristics of bodies near zero angle of attack. For sharp-nosed bodies of revolution, the required starting line may be obtained by computing the flow field about a cone at a small angle of attack. This calculation is readily performed using Stone's theory in reference 2. Some solutions of this theory are available in reference 3. However, the manner in which these results are presented, namely in a wind-fixed coordinate system, makes their use somewhat cumbersome. Additionally, as pointed out in reference 4, the flow component perpendicular to the meridian planes was computed incorrectly. The results contained herein have been computed in the same basic manner as those of reference 3 with the correct velocity normal to the meridian planes. Also, all results have been transferred into the body-fixed coordinate system. Therefore, the values tabulated herein may be used, in conjunction with the respective zero-angle-of-attack results of reference 5, as starting conditions for the method-of-characteristics solution of the flow field about axially symmetric bodies of revolution at small angles of attack. As in the zero-angle-of-attack case (ref. 5) the present results have been computed using the ideal gas value of 1.4 for the ratio of the specific heats of air. Solutions are given for cone angles from 2.5 deg to 30 deg in increments of 2.5 deg. For each cone angle, results were computed for a constant series of free-stream Mach numbers from 1.5 to 20. In addition, a solution was computed which yielded the minimum free-stream Mach number for a completely supersonic conical flow field. For cone angles of 27.5 deg and 30 deg, this minimum free-stream Mach number was above 1.5. Consequently, solutions at this Mach number were not computed for these two cone angles.

  15. Reference Levels for Patient Radiation Doses in Interventional Radiology: Proposed Initial Values for U.S. Practice

    PubMed Central

    Miller, Donald L.; Kwon, Deukwoo; Bonavia, Grant H.

    2009-01-01

    Purpose: To propose initial values for patient reference levels for fluoroscopically guided procedures in the United States. Materials and Methods: This secondary analysis of data from the Radiation Doses in Interventional Radiology Procedures (RAD-IR) study was conducted under a protocol approved by the institutional review board and was HIPAA compliant. Dose distributions (percentiles) were calculated for each type of procedure in the RAD-IR study where there were data from at least 30 cases. Confidence intervals for the dose distributions were determined by using bootstrap resampling. Weight banding and size correction methods for normalizing dose to patient body habitus were tested. Results: The different methods for normalizing patient radiation dose according to patient weight gave results that were not significantly different (P > .05). The 75th percentile patient radiation doses normalized with weight banding were not significantly different from those that were uncorrected for body habitus. Proposed initial reference levels for various interventional procedures are provided for reference air kerma, kerma-area product, fluoroscopy time, and number of images. Conclusion: Sufficient data exist to permit an initial proposal of values for reference levels for interventional radiologic procedures in the United States. For ease of use, reference levels without correction for body habitus are recommended. A national registry of radiation-dose data for interventional radiologic procedures is a necessary next step to refine these reference levels. © RSNA, 2009 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.2533090354/-/DC1 PMID:19789226

  16. Development of a candidate reference material for adventitious virus detection in vaccine and biologicals manufacturing by deep sequencing

    PubMed Central

    Mee, Edward T.; Preston, Mark D.; Minor, Philip D.; Schepelmann, Silke; Huang, Xuening; Nguyen, Jenny; Wall, David; Hargrove, Stacey; Fu, Thomas; Xu, George; Li, Li; Cote, Colette; Delwart, Eric; Li, Linlin; Hewlett, Indira; Simonyan, Vahan; Ragupathy, Viswanath; Alin, Voskanian-Kordi; Mermod, Nicolas; Hill, Christiane; Ottenwälder, Birgit; Richter, Daniel C.; Tehrani, Arman; Jacqueline, Weber-Lehmann; Cassart, Jean-Pol; Letellier, Carine; Vandeputte, Olivier; Ruelle, Jean-Louis; Deyati, Avisek; La Neve, Fabio; Modena, Chiara; Mee, Edward; Schepelmann, Silke; Preston, Mark; Minor, Philip; Eloit, Marc; Muth, Erika; Lamamy, Arnaud; Jagorel, Florence; Cheval, Justine; Anscombe, Catherine; Misra, Raju; Wooldridge, David; Gharbia, Saheer; Rose, Graham; Ng, Siemon H.S.; Charlebois, Robert L.; Gisonni-Lex, Lucy; Mallet, Laurent; Dorange, Fabien; Chiu, Charles; Naccache, Samia; Kellam, Paul; van der Hoek, Lia; Cotten, Matt; Mitchell, Christine; Baier, Brian S.; Sun, Wenping; Malicki, Heather D.

    2016-01-01

    Background Unbiased deep sequencing offers the potential for improved adventitious virus screening in vaccines and biotherapeutics. Successful implementation of such assays will require appropriate control materials to confirm assay performance and sensitivity. Methods A common reference material containing 25 target viruses was produced and 16 laboratories were invited to process it using their preferred adventitious virus detection assay. Results Fifteen laboratories returned results, obtained using a wide range of wet-lab and informatics methods. Six of 25 target viruses were detected by all laboratories, with the remaining viruses detected by 4–14 laboratories. Six non-target viruses were detected by three or more laboratories. Conclusion The study demonstrated that a wide range of methods are currently used for adventitious virus detection screening in biological products by deep sequencing and that they can yield significantly different results. This underscores the need for common reference materials to ensure satisfactory assay performance and enable comparisons between laboratories. PMID:26709640

  17. IDLN-MSP: Idiolocal normalization of real-time methylation-specific PCR for genetic imbalanced DNA specimens.

    PubMed

    Santourlidis, Simeon; Ghanjati, Foued; Beermann, Agnes; Hermanns, Thomas; Poyet, Cédric

    2016-02-01

    Sensitive, accurate, and reliable measurements of tumor cell-specific DNA methylation changes are of fundamental importance in cancer diagnosis, prognosis, and monitoring. Real-time methylation-specific PCR (MSP) using intercalating dyes is an established method of choice for this purpose. Here we present a simple but crucial adaptation of this widely applied method that overcomes a major obstacle: genetic abnormalities in the DNA samples, such as aneuploidy or copy number variations, that could result in inaccurate results due to improper normalization if the copy numbers of the target and reference sequences are not the same. In our idiolocal normalization (IDLN) method, the locus for the normalizing, methylation-independent reference amplification is chosen close to the locus of the methylation-dependent target amplification. This ensures that the copy numbers of both the target and reference sequences will be identical in most cases if they are close enough to each other, resulting in accurate normalization and reliable comparative measurements of DNA methylation in clinical samples when using real-time MSP.
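
    A minimal sketch of the normalization idea, assuming a real-time MSP Ct for the target and a Ct for a methylation-independent reference amplicon chosen near the target locus (hypothetical Ct values; roughly 100% PCR efficiency assumed for both amplicons):

      def relative_methylation(ct_msp, ct_ref):
          # Delta-Ct normalization of a methylation-specific amplification against
          # a reference amplicon located near the target locus, so both share the
          # same local copy number; assumes ~100% PCR efficiency for both assays.
          return 2.0 ** -(ct_msp - ct_ref)

      # Hypothetical Ct values from an aneuploid tumor sample
      level = relative_methylation(ct_msp=28.4, ct_ref=24.1)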

  18. A series of strategies for solving the shortage of reference standards for multi-components determination of traditional Chinese medicine, Mahoniae Caulis as a case.

    PubMed

    Wang, Wenguang; Ma, Xiaoli; Guo, Xiaoyu; Zhao, Mingbo; Tu, Pengfei; Jiang, Yong

    2015-09-18

    To overcome the bottleneck created by the shortage of reference standards for comprehensive quality control of traditional Chinese medicines (TCMs), a series of strategies was proposed, including the use of one single reference standard to determine multiple compounds (SSDMC), quantitative analysis by standardized reference extract (QASRE), and quantitative nuclear magnetic resonance spectroscopy (qNMR). Mahoniae Caulis was selected as an example to develop and validate these methods for the simultaneous determination of four alkaloids: columbamine, jatrorrhizine, palmatine, and berberine. Comprehensive comparisons were carried out among these methods and against the conventional external standard method (ESM). The relative expanded uncertainty of measurement was used, for the first time, to compare their credibility. The results showed that all three newly developed methods can accurately accomplish quantification using only one purified reference standard, but each has its own advantages, disadvantages, and specific scope of application, which are discussed in detail in this paper. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Evaluation of Methods to Select Scale Velocities in Icing Scaling Tests

    NASA Technical Reports Server (NTRS)

    Anderson, David N.; Ruff, Gary A.; Bond, Thomas H. (Technical Monitor)

    2003-01-01

    A series of tests were made in the NASA Glenn Icing Research Tunnel to determine how icing scaling results were affected by the choice of scale velocity. Reference tests were performed with a 53.3-cm-chord NACA 0012 airfoil model, while scale tests used a 27.7-cm-chord 0012 model. Tests were made with rime, mixed, and glaze ice. Reference test conditions included airspeeds of 67 and 89 m/s, an MVD of 40 microns, and LWCs of 0.5 and 0.6 g/cu m. Scale test conditions were established by the modified Ruff (AEDC) scaling method with the scale velocity determined in five ways. The resulting scale velocities ranged from 85 to 220 percent of the reference velocity. This paper presents the ice shapes that resulted from those scale tests and compares them to the reference shapes. It was concluded that for freezing fractions greater than 0.8 as well as for a freezing fraction of 0.3, the value of the scale velocity had no effect on how well the scale ice shape simulated the reference shape. For freezing fractions of 0.5 and 0.7, the simulation of the reference shape appeared to improve as the scale velocity increased.

  20. Model Reduction via Principal Component Analysis and Markov Chain Monte Carlo (MCMC) Methods

    NASA Astrophysics Data System (ADS)

    Gong, R.; Chen, J.; Hoversten, M. G.; Luo, J.

    2011-12-01

    Geophysical and hydrogeological inverse problems often include a large number of unknown parameters, ranging from hundreds to millions, depending on the parameterization and the problem undertaken. This makes inverse estimation and uncertainty quantification very challenging, especially for problems in two- or three-dimensional spatial domains. Model reduction techniques have the potential to mitigate the curse of dimensionality by reducing the total number of unknowns while still describing complex subsurface systems adequately. In this study, we explore the use of principal component analysis (PCA) and Markov chain Monte Carlo (MCMC) sampling methods for model reduction through the use of synthetic datasets. We compare the performance of three different but closely related model reduction approaches: (1) PCA with geometric sampling (referred to as 'Method 1'), (2) PCA with MCMC sampling (referred to as 'Method 2'), and (3) PCA with MCMC sampling and the inclusion of random effects (referred to as 'Method 3'). We consider a simple convolution model with five unknown parameters, as our goal is to understand and visualize the advantages and disadvantages of each method by comparing their inversion results with the corresponding analytical solutions. We generated synthetic data with added noise and inverted them under two different situations: (1) the noisy data and the covariance matrix used for the PCA analysis are consistent (referred to as the unbiased case), and (2) the noisy data and the covariance matrix are inconsistent (referred to as the biased case). In the unbiased case, comparison between the analytical solutions and the inversion results shows that all three methods provide good estimates of the true values and that Method 1 is computationally more efficient. In terms of uncertainty quantification, Method 1 performs poorly because of the relatively small number of samples obtained, Method 2 performs best, and Method 3 overestimates uncertainty because of the inclusion of random effects. In the biased case, however, only Method 3 correctly estimates all the unknown parameters, while Methods 1 and 2 give incorrect values for the biased parameters. The synthetic case study demonstrates that if the covariance matrix used for the PCA analysis is inconsistent with the true models, PCA with either geometric or MCMC sampling will provide incorrect estimates.
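
    A toy sketch of the reduced-parameter workflow described above: a PCA basis is built from an ensemble of prior model realizations, and a plain Metropolis sampler then explores the PCA-coefficient space against a simple convolution forward model. All names, sizes and priors are illustrative assumptions, not the authors' code.

      import numpy as np

      rng = np.random.default_rng(1)

      # Forward model: convolution of a 5-parameter model with a known kernel
      kernel = np.array([0.2, 0.6, 0.2])
      def forward(m):
          return np.convolve(m, kernel, mode="same")

      # PCA basis from an ensemble of prior model realizations
      ensemble = rng.normal(size=(500, 5))
      mean = ensemble.mean(axis=0)
      U, s, Vt = np.linalg.svd(ensemble - mean, full_matrices=False)
      basis = Vt[:3]                          # keep 3 principal components

      def to_model(coeffs):
          return mean + coeffs @ basis        # reduced parameterization

      # Synthetic data from a "true" model, with added noise
      m_true = rng.normal(size=5)
      data = forward(m_true) + rng.normal(scale=0.05, size=5)

      def log_post(coeffs, sigma=0.05):
          resid = data - forward(to_model(coeffs))
          return -0.5 * np.sum(resid**2) / sigma**2 - 0.5 * np.sum(coeffs**2)

      # Plain Metropolis sampling in the reduced (PCA-coefficient) space
      coeffs = np.zeros(3)
      lp = log_post(coeffs)
      samples = []
      for _ in range(5000):
          prop = coeffs + rng.normal(scale=0.1, size=3)
          lp_prop = log_post(prop)
          if np.log(rng.uniform()) < lp_prop - lp:
              coeffs, lp = prop, lp_prop
          samples.append(to_model(coeffs))
      posterior_mean = np.mean(samples, axis=0)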

  1. Solving multi-objective optimization problems in conservation with the reference point method

    PubMed Central

    Dujardin, Yann; Chadès, Iadine

    2018-01-01

    Managing the biodiversity extinction crisis requires wise decision-making processes able to account for the limited resources available. In most decision problems in conservation biology, several conflicting objectives have to be taken into account. Most methods used in conservation either provide suboptimal solutions or use strong assumptions about the decision-maker’s preferences. Our paper reviews some of the existing approaches to solve multi-objective decision problems and presents new multi-objective linear programming formulations of two multi-objective optimization problems in conservation, allowing the use of a reference point approach. Reference point approaches solve multi-objective optimization problems by interactively representing the preferences of the decision-maker with a point in the criteria (objectives) space, called the reference point. We modelled and solved the following two problems in conservation: a dynamic multi-species management problem under uncertainty and a spatial allocation resource management problem. Results show that the reference point method outperforms classic methods while illustrating the use of an interactive methodology for solving combinatorial problems with multiple objectives. The method is general and can be adapted to a wide range of ecological combinatorial problems. PMID:29293650
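
    A minimal sketch of a reference point (achievement scalarizing) formulation solved as a linear program over a toy two-objective, budget-constrained allocation problem; the coefficients, weights, reference point and augmentation term are all illustrative assumptions.

      import numpy as np
      from scipy.optimize import linprog

      # Toy problem: choose effort x_j in [0, 1] for 4 sites under a budget, with
      # two linear objectives to maximize (e.g. expected persistence of two species).
      c1 = np.array([4.0, 2.0, 1.0, 3.0])   # objective 1 coefficients (hypothetical)
      c2 = np.array([1.0, 3.0, 4.0, 2.0])   # objective 2 coefficients (hypothetical)
      w = np.array([2.0, 1.0, 2.0, 1.0])    # cost per site
      budget = 3.0
      ref = np.array([8.0, 8.0])            # decision-maker's reference (aspiration) point
      lam = np.array([1.0, 1.0])            # scaling weights
      rho = 1e-3                            # small augmentation term

      n = c1.size
      # Variables: [x_1..x_n, t]; minimize t - rho * sum_i f_i(x)
      obj = np.concatenate([-rho * (c1 + c2), [1.0]])
      # Achievement constraints lam_i * (ref_i - c_i @ x) <= t, plus the budget
      A_ub = np.vstack([
          np.concatenate([-lam[0] * c1, [-1.0]]),
          np.concatenate([-lam[1] * c2, [-1.0]]),
          np.concatenate([w, [0.0]]),
      ])
      b_ub = np.array([-lam[0] * ref[0], -lam[1] * ref[1], budget])
      bounds = [(0.0, 1.0)] * n + [(None, None)]

      res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
      x = res.x[:n]
      print("effort:", np.round(x, 3), "objectives:", c1 @ x, c2 @ x)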

  2. Development of a mushroom powder Certified Reference Material for calcium, arsenic, cadmium and lead measurements.

    PubMed

    Chew, Gina; Sim, Lay Peng; Ng, Sin Yee; Ding, Yi; Shin, Richard Y C; Lee, Tong Kooi

    2016-01-01

    Isotope dilution mass spectrometry and standard addition techniques were developed for the analysis of four elements (Ca, As, Cd and Pb) in a mushroom powder material. Results from the validated methods were compared to those of other national metrology institutes in the CCQM-K89 intercomparisons and the results were in excellent agreement with the reference values. The same methods were then used for the assignment of reference values to a mushroom powder Certified Reference Material (CRM). The certified values obtained for Ca, As, Cd and Pb were 1.444 ± 0.099 mg/g, 5.61 ± 0.59 mg/kg, 1.191 ± 0.079 mg/kg and 5.23 ± 0.94 mg/kg, respectively. The expanded measurement uncertainties were obtained by combining the uncertainty contributions from characterization (uchar) and between-bottle homogeneity (ubb). Copyright © 2015 Elsevier Ltd. All rights reserved.
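
    The combination of the characterization and between-bottle homogeneity contributions described above can be sketched as a root-sum-of-squares expanded with a coverage factor k (hypothetical numbers, not the certified uncertainty budget):

      import math

      def expanded_uncertainty(u_char, u_bb, k=2.0):
          # Root-sum-of-squares combination of characterization and
          # between-bottle homogeneity uncertainties, expanded with k.
          return k * math.sqrt(u_char**2 + u_bb**2)

      # Hypothetical standard uncertainties for a Cd mass fraction (mg/kg)
      U = expanded_uncertainty(u_char=0.030, u_bb=0.020)   # expanded uncertainty U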

  3. Comparison of analytical and predictive methods for water, protein, fat, sugar, and gross energy in marine mammal milk.

    PubMed

    Oftedal, O T; Eisert, R; Barrell, G K

    2014-01-01

    Mammalian milks may differ greatly in composition from cow milk, and these differences may affect the performance of analytical methods. High-fat, high-protein milks with a preponderance of oligosaccharides, such as those produced by many marine mammals, present a particular challenge. We compared the performance of several methods against reference procedures using Weddell seal (Leptonychotes weddellii) milk of highly varied composition (by reference methods: 27-63% water, 24-62% fat, 8-12% crude protein, 0.5-1.8% sugar). A microdrying step preparatory to carbon-hydrogen-nitrogen (CHN) gas analysis slightly underestimated water content and had a higher repeatability relative standard deviation (RSDr) than did reference oven drying at 100°C. Compared with a reference macro-Kjeldahl protein procedure, the CHN (or Dumas) combustion method had a somewhat higher RSDr (1.56 vs. 0.60%) but correlation between methods was high (0.992), means were not different (CHN: 17.2±0.46% dry matter basis; Kjeldahl 17.3±0.49% dry matter basis), there were no significant proportional or constant errors, and predictive performance was high. A carbon stoichiometric procedure based on CHN analysis failed to adequately predict fat (reference: Röse-Gottlieb method) or total sugar (reference: phenol-sulfuric acid method). Gross energy content, calculated from energetic factors and results from reference methods for fat, protein, and total sugar, accurately predicted gross energy as measured by bomb calorimetry. We conclude that the CHN (Dumas) combustion method and calculation of gross energy are acceptable analytical approaches for marine mammal milk, but fat and sugar require separate analysis by appropriate analytic methods and cannot be adequately estimated by carbon stoichiometry. Some other alternative methods-low-temperature drying for water determination; Bradford, Lowry, and biuret methods for protein; the Folch and the Bligh and Dyer methods for fat; and enzymatic and reducing sugar methods for total sugar-appear likely to produce substantial error in marine mammal milks. It is important that alternative analytical methods be properly validated against a reference method before being used, especially for mammalian milks that differ greatly from cow milk in analyte characteristics and concentrations. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  4. Reference Materials in LIS Instruction: A Delphi Study

    ERIC Educational Resources Information Center

    Rabina, Debbie

    2013-01-01

    This paper presents the results of a Delphi study conducted over a two-month period in 2011. The purpose of the study was to identify reference sources that should be covered in basic reference courses taught in LIS programs in the United States. The Delphi method was selected for its appropriateness in soliciting expert opinions and assessing the…

  5. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    PubMed

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries generally between 95% and 105% were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on analysis of vitamins and caffeine in energy drinks and aromatic hydrocarbons in motor fuel with 10% error. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
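
    A rough sketch of the ICA-based idea on simulated spectra: resolve a calibration set with FastICA, regress the component scores against known concentrations, then predict an "unknown" mixture. This illustrates the general workflow only, not the authors' implementation; all spectra and concentrations below are simulated.

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(0)

      # Simulated pure-component spectra (rows) and calibration mixture spectra
      wavelengths = np.linspace(250, 500, 200)
      pure = np.vstack([
          np.exp(-0.5 * ((wavelengths - 320) / 15) ** 2),
          np.exp(-0.5 * ((wavelengths - 380) / 25) ** 2),
      ])
      conc = rng.uniform(0.1, 1.0, size=(20, 2))     # known calibration concentrations
      mixtures = conc @ pure + rng.normal(scale=0.005, size=(20, wavelengths.size))

      # Resolve the calibration mixtures into independent components
      ica = FastICA(n_components=2, random_state=0)
      scores = ica.fit_transform(mixtures)           # per-sample component intensities

      # Calibrate the ICA scores against known concentrations (least squares),
      # then predict an "unknown" mixture without reference solutions
      design = np.column_stack([scores, np.ones(20)])
      coef, *_ = np.linalg.lstsq(design, conc, rcond=None)
      unknown = np.array([0.3, 0.7]) @ pure
      unknown_scores = ica.transform(unknown.reshape(1, -1))
      predicted = np.column_stack([unknown_scores, np.ones(1)]) @ coef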

  6. Evaluation of home allergen sampling devices.

    PubMed

    Sercombe, J K; Liu-Brennan, D; Garcia, M L; Tovey, E R

    2005-04-01

    Simple, inexpensive methods of sampling from allergen reservoirs are necessary for large-scale studies or low-cost householder-operated allergen measurement. We tested two commercial devices: the Indoor Biotechnologies Mitest Dust Collector and the Drager Bio-Check Allergen Control; two devices of our own design: the Electrostatic Cloth Sampler (ECS) and the Press Tape Sampler (PTS); and a Vacuum Sampler as used in many allergen studies (our Reference Method). Devices were used to collect dust mite allergen samples from 16 domestic carpets. Results were examined for correlations between the sampling methods. With mite allergen concentration expressed as microg/g, the Mitest, the ECS and the PTS correlated with the Reference Method but not with each other. When mite allergen concentration was expressed as microg/m2, the Mitest and the ECS correlated with the Reference Method but the PTS did not. In the high-allergen conditions of this study, the Drager Bio-Check did not correlate with any of the other methods. The Mitest Dust Collector, the ECS and the PTS show performance consistent with the Reference Method. Many techniques can be used to collect dust mite allergen samples. More investigation is needed to establish any one method as superior for estimating allergen exposure.

  7. Image quality evaluation of full reference algorithm

    NASA Astrophysics Data System (ADS)

    He, Nannan; Xie, Kai; Li, Tong; Ye, Yushan

    2018-03-01

    Image quality evaluation is a classic research topic; the goal is to design algorithms whose evaluation values are consistent with subjective human perception. This paper mainly introduces several typical full-reference objective evaluation methods: Mean Squared Error (MSE), Peak Signal-to-Noise Ratio (PSNR), Structural Similarity Image Metric (SSIM) and Feature Similarity (FSIM). The different evaluation methods are tested in Matlab, and their advantages and disadvantages are obtained by analysis and comparison. MSE and PSNR are simple to compute, but they do not incorporate characteristics of the human visual system (HVS) into image quality evaluation, so their evaluation results are not ideal. SSIM correlates well with subjective quality and is simple to calculate because it brings human visual effects into the evaluation; however, the SSIM method rests on assumptions that limit its evaluation results. The FSIM method can be applied to both grayscale and color images, and its results are better. Experimental results show that the new image quality evaluation algorithm based on FSIM is more accurate.
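
    For reference, the simpler full-reference metrics named above can be computed as in the sketch below (Python with scikit-image rather than Matlab; the images are hypothetical):

      import numpy as np
      from skimage.metrics import peak_signal_noise_ratio, structural_similarity

      def mse(ref, test):
          # Mean squared error between a reference image and a test image
          return np.mean((ref.astype(float) - test.astype(float)) ** 2)

      # Hypothetical 8-bit grayscale images: a reference and a noisy version of it
      rng = np.random.default_rng(0)
      ref = rng.integers(0, 256, size=(128, 128)).astype(np.uint8)
      test = np.clip(ref + rng.normal(scale=5, size=ref.shape), 0, 255).astype(np.uint8)

      print("MSE :", mse(ref, test))
      print("PSNR:", peak_signal_noise_ratio(ref, test, data_range=255))
      print("SSIM:", structural_similarity(ref, test, data_range=255))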

  8. Methodological Issues in Antifungal Susceptibility Testing of Malassezia pachydermatis

    PubMed Central

    Peano, Andrea; Pasquetti, Mario; Tizzani, Paolo; Chiavassa, Elisa; Guillot, Jacques; Johnson, Elizabeth

    2017-01-01

    Reference methods for antifungal susceptibility testing of yeasts have been developed by the Clinical and Laboratory Standards Institute (CLSI) and the European Committee on Antibiotic Susceptibility Testing (EUCAST). These methods are intended to test the main pathogenic yeasts that cause invasive infections, namely Candida spp. and Cryptococcus neoformans, while testing other yeast species introduces several additional problems in standardization not addressed by these reference procedures. As a consequence, a number of procedures have been employed in the literature to test the antifungal susceptibility of Malassezia pachydermatis. This has resulted in conflicting results. The aim of the present study is to review the procedures and the technical parameters (growth media, inoculum preparation, temperature and length of incubation, method of reading) employed for susceptibility testing of M. pachydermatis, and when possible, to propose recommendations for or against their use. Such information may be useful for the future development of a reference assay. PMID:29371554

  9. Assessing noninferiority in a three-arm trial using the Bayesian approach.

    PubMed

    Ghosh, Pulak; Nathoo, Farouk; Gönen, Mithat; Tiwari, Ram C

    2011-07-10

    Non-inferiority trials, which aim to demonstrate that a test product is not worse than a competitor by more than a pre-specified small amount, are of great importance to the pharmaceutical community. As a result, methodology for designing and analyzing such trials is required, and developing new methods for such analysis is an important area of statistical research. The three-arm trial consists of a placebo, a reference and an experimental treatment, and simultaneously tests the superiority of the reference over the placebo along with comparing this reference to an experimental treatment. In this paper, we consider the analysis of non-inferiority trials using Bayesian methods which incorporate both parametric as well as semi-parametric models. The resulting testing approach is both flexible and robust. The benefit of the proposed Bayesian methods is assessed via simulation, based on a study examining home-based blood pressure interventions. Copyright © 2011 John Wiley & Sons, Ltd.

  10. Comparison of methods for the prediction of human clearance from hepatocyte intrinsic clearance for a set of reference compounds and an external evaluation set.

    PubMed

    Yamagata, Tetsuo; Zanelli, Ugo; Gallemann, Dieter; Perrin, Dominique; Dolgos, Hugues; Petersson, Carl

    2017-09-01

    1. We compared the direct scaling, regression model equation and so-called "Poulin et al." methods for scaling clearance (CL) from in vitro intrinsic clearance (CLint) measured in human hepatocytes, using two sets of compounds: one reference set comprising 20 compounds with known elimination pathways and one external evaluation set based on 17 compounds in development at Merck (MS). 2. A 90% prospective confidence interval was calculated using the reference set. This interval was found relevant for the regression equation method. The three outliers identified were justified on the basis of their elimination mechanism. 3. The direct scaling method showed a systematic underestimation of clearance in both the reference and evaluation sets. The "Poulin et al." and the regression equation methods showed no obvious bias in either the reference or evaluation sets. 4. The regression model equation was slightly superior to the "Poulin et al." method in the reference set, with a better absolute average fold error (AAFE) of 1.3 compared with 1.6. A larger difference was observed in the evaluation set, where the regression method and "Poulin et al." resulted in an AAFE of 1.7 and 2.6, respectively (after removing the three compounds with known issues mentioned above). A similar pattern was observed for the correlation coefficient. Based on these data, we suggest the regression equation method combined with a prospective confidence interval as the first choice for the extrapolation of human in vivo hepatic metabolic clearance from in vitro systems.
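
    The AAFE figure of merit quoted above can be computed as in this short sketch (hypothetical predicted and observed clearance values):

      import numpy as np

      def aafe(predicted, observed):
          # Absolute average fold error: 10 ** mean(|log10(predicted / observed)|)
          predicted = np.asarray(predicted, dtype=float)
          observed = np.asarray(observed, dtype=float)
          return 10 ** np.mean(np.abs(np.log10(predicted / observed)))

      # Hypothetical clearance values (mL/min/kg) for a few compounds
      observed = [2.0, 8.5, 15.0, 4.2]
      predicted = [2.6, 6.9, 21.0, 3.8]
      print(aafe(predicted, observed))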

  11. Traceable calibration of photovoltaic reference cells using natural sunlight

    NASA Astrophysics Data System (ADS)

    Müllejans, H.; Zaaiman, W.; Pavanello, D.; Dunlop, E. D.

    2018-02-01

    At the European Solar Test Installation (ESTI) photovoltaic (PV) reference cells are calibrated traceably to SI units via the World Radiometric Reference (WRR) using natural sunlight. The Direct Sunlight Method (DSM) is described in detail and the latest measurement results and an updated uncertainty budget are reported. These PV reference cells then provide a practical means for measuring the irradiance of natural or simulated sunlight during the calibration of other PV devices.

  12. Automated Spatial Brain Normalization and Hindbrain White Matter Reference Tissue Give Improved [(18)F]-Florbetaben PET Quantitation in Alzheimer's Model Mice.

    PubMed

    Overhoff, Felix; Brendel, Matthias; Jaworska, Anna; Korzhova, Viktoria; Delker, Andreas; Probst, Federico; Focke, Carola; Gildehaus, Franz-Josef; Carlsen, Janette; Baumann, Karlheinz; Haass, Christian; Bartenstein, Peter; Herms, Jochen; Rominger, Axel

    2016-01-01

    Preclinical PET studies of β-amyloid (Aβ) accumulation are of growing importance, but comparisons between research sites require standardized and optimized methods for quantitation. Therefore, we aimed to evaluate systematically the (1) impact of an automated algorithm for spatial brain normalization, and (2) intensity scaling methods of different reference regions for Aβ-PET in a large dataset of transgenic mice. PS2APP mice in a 6 week longitudinal setting (N = 37) and another set of PS2APP mice at a histologically assessed narrow range of Aβ burden (N = 40) were investigated by [(18)F]-florbetaben PET. Manual spatial normalization by three readers at different training levels was performed prior to application of an automated brain spatial normalization and inter-reader agreement was assessed by Fleiss Kappa (κ). For this method the impact of templates at different pathology stages was investigated. Four different reference regions for brain uptake normalization were used to calculate frontal cortical standardized uptake value ratios (SUVR(CTX/REF)), relative to raw SUV(CTX). Results were compared on the basis of longitudinal stability (Cohen's d), and in reference to gold standard histopathological quantitation (Pearson's R). Application of an automated brain spatial normalization resulted in nearly perfect agreement (all κ ≥ 0.99) between different readers, with constant or improved correlation with histology. Templates based on an inappropriate pathology stage resulted in up to 2.9% systematic bias for SUVR(CTX/REF). All SUVR(CTX/REF) methods performed better than SUV(CTX) both with regard to longitudinal stability (d ≥ 1.21 vs. d = 0.23) and histological gold standard agreement (R ≥ 0.66 vs. R ≥ 0.31). Voxel-wise analysis suggested a physiologically implausible longitudinal decrease by global mean scaling. The hindbrain white matter reference (mean R = 0.75) was slightly superior to the brainstem (mean R = 0.74) and the cerebellum (mean R = 0.73). Automated brain normalization with reference region templates presents an excellent method to avoid inter-reader variability in preclinical Aβ-PET scans. Intracerebral reference regions lacking Aβ pathology serve for precise longitudinal in vivo quantification of [(18)F]-florbetaben PET. The hindbrain white matter reference performed best when considering the composite of quality criteria.

  13. Precise Relative Earthquake Magnitudes from Cross Correlation

    DOE PAGES

    Cleveland, K. Michael; Ammon, Charles J.

    2015-04-21

    We present a method to estimate precise relative magnitudes using cross correlation of seismic waveforms. Our method incorporates the intercorrelation of all events in a group of earthquakes, as opposed to individual event pairings relative to a reference event. This method works well when a reliable reference event does not exist. We illustrate the method using vertical strike-slip earthquakes located in the northeast Pacific and Panama fracture zone regions. Our results are generally consistent with the Global Centroid Moment Tensor catalog, which we use to establish a baseline for the relative event sizes.
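
    A minimal sketch of the underlying idea: each cross-correlation-derived amplitude ratio yields one equation linking two relative magnitudes, and all pairs are inverted together by least squares. The pair values and the one-magnitude-unit-per-decade-of-amplitude scaling are assumptions of this sketch, not the authors' processing.

      import numpy as np

      # Pairwise relative amplitude measurements from waveform cross correlation:
      # each tuple is (i, j, log10 amplitude ratio A_i / A_j). Hypothetical values.
      pairs = [(0, 1, 0.30), (0, 2, 0.55), (1, 2, 0.22), (1, 3, -0.10), (2, 3, -0.35)]
      n_events = 4

      # Build the linear system d = G m, where m holds relative magnitudes and each
      # pair contributes one equation m_i - m_j = log10(A_i / A_j).
      G = np.zeros((len(pairs) + 1, n_events))
      d = np.zeros(len(pairs) + 1)
      for k, (i, j, dlog) in enumerate(pairs):
          G[k, i], G[k, j], d[k] = 1.0, -1.0, dlog
      G[-1, :] = 1.0        # constraint: magnitudes sum to zero (removes the null space)
      d[-1] = 0.0

      m_rel, *_ = np.linalg.lstsq(G, d, rcond=None)
      # Shift so the mean matches a catalog baseline (e.g. GCMT) if desired.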

  14. Roka Listeria detection method using transcription mediated amplification to detect Listeria species in select foods and surfaces. Performance Tested Method(SM) 011201.

    PubMed

    Hua, Yang; Kaplan, Shannon; Reshatoff, Michael; Hu, Ernie; Zukowski, Alexis; Schweis, Franz; Gin, Cristal; Maroni, Brett; Becker, Michael; Wisniewski, Michele

    2012-01-01

    The Roka Listeria Detection Assay was compared to the reference culture methods for nine select foods and three select surfaces. The Roka method used Half-Fraser Broth for enrichment at 35 +/- 2 degrees C for 24-28 h. Comparison of Roka's method to reference methods requires an unpaired approach. Each method had a total of 545 samples inoculated with a Listeria strain. Each food and surface was inoculated with a different strain of Listeria at two different levels per method. For the dairy products (Brie cheese, whole milk, and ice cream), our method was compared to AOAC Official Method(SM) 993.12. For the ready-to-eat meats (deli chicken, cured ham, chicken salad, and hot dogs) and environmental surfaces (sealed concrete, stainless steel, and plastic), these samples were compared to the U.S. Department of Agriculture/Food Safety and Inspection Service-Microbiology Laboratory Guidebook (USDA/FSIS-MLG) method MLG 8.07. Cold-smoked salmon and romaine lettuce were compared to the U.S. Food and Drug Administration/Bacteriological Analytical Manual, Chapter 10 (FDA/BAM) method. Roka's method had 358 positives out of 545 total inoculated samples compared to 332 positive for the reference methods. Overall the probability of detection analysis of the results showed better or equivalent performance compared to the reference methods.

  15. Plasma creatinine in dogs: intra- and inter-laboratory variation in 10 European veterinary laboratories

    PubMed Central

    2011-01-01

    Background There is substantial variation in reported reference intervals for canine plasma creatinine among veterinary laboratories, thereby influencing the clinical assessment of analytical results. The aims of the study were to determine the inter- and intra-laboratory variation in plasma creatinine among 10 veterinary laboratories, and to compare results from each laboratory with the upper limit of its reference interval. Methods Samples were collected from 10 healthy dogs, 10 dogs with expected intermediate plasma creatinine concentrations, and 10 dogs with azotemia. Overlap was observed for the first two groups. The 30 samples were divided into 3 batches and shipped in random order by postal delivery for plasma creatinine determination. Statistical testing was performed in accordance with ISO standard methodology. Results Inter- and intra-laboratory variation was clinically acceptable as plasma creatinine values for most samples were usually of the same magnitude. A few extreme outliers caused three laboratories to fail statistical testing for consistency. Laboratory sample means above or below the overall sample mean did not unequivocally reflect high or low reference intervals in that laboratory. Conclusions In spite of close analytical results, further standardization among laboratories is warranted. The discrepant reference intervals seem to largely reflect different populations used in establishing the reference intervals, rather than analytical variation due to different laboratory methods. PMID:21477356

  16. Critical Evaluation of Kinetic Method Measurements: Possible Origins of Nonlinear Effects

    NASA Astrophysics Data System (ADS)

    Bourgoin-Voillard, Sandrine; Afonso, Carlos; Lesage, Denis; Zins, Emilie-Laure; Tabet, Jean-Claude; Armentrout, P. B.

    2013-03-01

    The kinetic method is a widely used approach for the determination of thermochemical data such as proton affinities (PA) and gas-phase acidities (ΔH°acid). These data are easily obtained from decompositions of noncovalent heterodimers if care is taken in the choice of the method, references used, and experimental conditions. Previously, several papers have focused on theoretical considerations concerning the nature of the references. Few investigations have been devoted to conditions required to validate the quality of the experimental results. In the present work, we are interested in rationalizing the origin of nonlinear effects that can be obtained with the kinetic method. It is shown that such deviations result from intrinsic properties of the systems investigated but can also be enhanced by artifacts resulting from experimental issues. Overall, it is shown that orthogonal distance regression (ODR) analysis of kinetic method data provides the optimum way of acquiring accurate thermodynamic information.
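
    Orthogonal distance regression of kinetic-method data, as recommended above, can be sketched with scipy.odr (hypothetical data; the error magnitudes and the linear model are illustrative assumptions):

      import numpy as np
      from scipy.odr import ODR, Model, RealData

      rng = np.random.default_rng(0)

      # Hypothetical kinetic-method data: x = reference proton affinities (kJ/mol),
      # y = ln(k1/k2); both axes carry measurement error, so ODR is preferred to OLS.
      x_true = np.linspace(850, 920, 8)
      y_true = 0.12 * (x_true - 885.0)
      x = x_true + rng.normal(scale=1.5, size=x_true.size)
      y = y_true + rng.normal(scale=0.2, size=y_true.size)

      def linear(beta, x):
          return beta[0] * x + beta[1]

      data = RealData(x, y, sx=np.full(x.size, 1.5), sy=np.full(y.size, 0.2))
      output = ODR(data, Model(linear), beta0=[0.1, -100.0]).run()
      slope, intercept = output.beta
      apparent_value = -intercept / slope     # x at which y = 0 (illustrative)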

  17. Comparison of ambulatory blood pressure reference standards in children evaluated for hypertension

    PubMed Central

    Jones, Deborah P.; Richey, Phyllis A.; Alpert, Bruce S.

    2009-01-01

    Objective The purpose of this study was to systematically compare methods for standardization of blood pressure levels obtained by ambulatory blood pressure monitoring (ABPM) in a group of 111 children studied at our institution. Methods Blood pressure indices, blood pressure loads and standard deviation scores were calculated using the original ABPM reference standards and the modified reference standards. Bland-Altman plots and kappa statistics for the level of agreement were generated. Results Overall, the agreement between the two methods was excellent; however, approximately 5% of children were classified differently by one method as compared with the other. Conclusion Depending on which version of the German Working Group's reference standards is used for interpretation of ABPM data, the classification of the individual as having hypertension or normal blood pressure may vary. PMID:19433980
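
    A sketch of the two agreement analyses named above, Bland-Altman limits of agreement and a kappa statistic for the hypertension classification, on simulated standard deviation scores (the 1.645 SDS cut-off and all values are illustrative assumptions):

      import numpy as np
      from sklearn.metrics import cohen_kappa_score

      rng = np.random.default_rng(0)

      # Hypothetical SD scores from the two ABPM reference standards for 111 children
      sds_original = rng.normal(size=111)
      sds_modified = sds_original + rng.normal(scale=0.15, size=111)

      # Bland-Altman summary: mean difference (bias) and 95% limits of agreement
      diff = sds_modified - sds_original
      bias = diff.mean()
      loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))

      # Agreement of the hypertension classification (SD score above the 95th centile)
      kappa = cohen_kappa_score(sds_original >= 1.645, sds_modified >= 1.645)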

  18. Which method should be the reference method to evaluate the severity of rheumatic mitral stenosis? Gorlin's method versus 3D-echo.

    PubMed

    Pérez de Isla, Leopoldo; Casanova, Carlos; Almería, Carlos; Rodrigo, José Luis; Cordeiro, Pedro; Mataix, Luis; Aubele, Ada Lia; Lang, Roberto; Zamorano, José Luis

    2007-12-01

    Several studies have shown a wide variability among different methods to determine the valve area in patients with rheumatic mitral stenosis. Our aim was to evaluate if 3D-echo planimetry is more accurate than the Gorlin method to measure the valve area. Twenty-six patients with mitral stenosis underwent 2D and 3D-echo echocardiographic examinations and catheterization. Valve area was estimated by different methods. A median value of the mitral valve area, obtained from the measurements of three classical non-invasive methods (2D planimetry, pressure half-time and PISA method), was used as the reference method and it was compared with 3D-echo planimetry and Gorlin's method. Our results showed that the accuracy of 3D-echo planimetry is superior to the accuracy of the Gorlin method for the assessment of mitral valve area. We should keep in mind the fact that 3D-echo planimetry may be a better reference method than the Gorlin method to assess the severity of rheumatic mitral stenosis.
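
    For context, the catheterization comparator is usually written in the form below, the widely cited Gorlin formula with the empirical mitral constant 37.7 (44.3 × 0.85); the inputs are hypothetical and not data from this study.

      import math

      def gorlin_mitral_area(co_ml_min, dfp_s_per_beat, hr_bpm, mean_gradient_mmhg):
          # Mitral valve area (cm^2) by the commonly cited Gorlin formula; the
          # empirical constant 37.7 = 44.3 x 0.85 applies to the mitral valve.
          diastolic_flow = co_ml_min / (dfp_s_per_beat * hr_bpm)   # mL per second of diastole
          return diastolic_flow / (37.7 * math.sqrt(mean_gradient_mmhg))

      # Hypothetical catheterization values: CO 4200 mL/min, DFP 0.45 s/beat,
      # HR 75 beats/min, mean transmitral gradient 10 mmHg -> roughly 1 cm^2
      area = gorlin_mitral_area(4200.0, 0.45, 75.0, 10.0)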

  19. Establishment of the Ph. Eur. erythropoietin chemical reference substance batch 1.

    PubMed

    Burns, C; Bristow, A F; Buchheit, K H; Daas, A; Wierer, M; Costanzo, A

    2015-01-01

    The Erythropoietin (EPO) European Pharmacopoeia (Ph. Eur.) Biological Reference Preparation (BRP) batch 3 was calibrated in 2006 by in vivo bioassay and was used as a reference preparation for these assays as well as for the physicochemical methods in the Ph. Eur. monograph Erythropoietin concentrated solution (1316). In order to avoid the frequent replacement of this standard and thus reduce the use of animals, a new EPO Chemical Reference Substance (CRS) was established to be used solely for the physicochemical methods. Here we report the outcome of a collaborative study aimed at demonstrating the suitability of the candidate CRS (cCRS) as a reference for the physicochemical methods in the Ph. Eur. monograph. Results from the study demonstrated that for the physicochemical methods currently required in the monograph (capillary zone electrophoresis (CZE), polyacrylamide gel electrophoresis (PAGE)/immunoblotting and peptide mapping), the cCRS is essentially identical to the existing BRP. However, data also indicated that, for the physicochemical methods under consideration for inclusion in a revised monograph (test for oxidised forms and glycan mapping), the suitability of the cCRS as a reference needs to be confirmed with additional work. Further to completion of the study, the Ph. Eur. Commission adopted the cCRS as "Erythropoietin for physicochemical tests CRS batch 1" to be used for CZE, PAGE/immunoblotting and peptide mapping.

  20. A method of camera calibration in the measurement process with reference mark for approaching observation space target

    NASA Astrophysics Data System (ADS)

    Zhang, Hua; Zeng, Luan

    2017-11-01

    Binocular stereoscopic vision can be used for space-based close-range observation of space targets. To solve the problem that a traditional binocular vision system cannot work normally after being disturbed, an online calibration method for a binocular stereo measuring camera with a self-contained reference is proposed. The method uses an auxiliary optical imaging device to insert the image of a standard reference object into the edge of the main optical path so that it is imaged on the same focal plane as the target, which is equivalent to placing a standard reference inside the binocular imaging optical system. When the position of the system or the imaging device parameters are disturbed, the image of the standard reference changes accordingly in the image plane while the physical position of the reference object does not change, so the cameras' external parameters can be re-calibrated from the imaging relationship of the standard reference object. The experimental results show that the maximum mean square error for the same object can be reduced from 72.88 mm to 1.65 mm when the right camera is deflected by 0.4° and the left camera is rotated by 0.2° in pitch. This method realizes online calibration of a binocular stereoscopic vision measurement system and can effectively improve the anti-jamming ability of the system.

  1. In-house validation study of the DuPont Qualicon BAX system Q7 instrument with the BAX system PCR Assay for Salmonella (modification of AOAC Official Method 2003.09 and AOAC Research Institute Performance-Tested Method 100201).

    PubMed

    Tice, George; Andaloro, Bridget; White, H Kirk; Bolton, Lance; Wang, Siqun; Davis, Eugene; Wallace, Morgan

    2009-01-01

    In 2006, DuPont Qualicon introduced the BAX system Q7 instrument for use with its assays. To demonstrate the equivalence of the new and old instruments, a validation study was conducted using the BAX system PCR Assay for Salmonella, AOAC Official Method 2003.09, on three food types. The foods were simultaneously analyzed with the BAX system Q7 instrument and either the U.S. Food and Drug Administration Bacteriological Analytical Manual or the U.S. Department of Agriculture-Food Safety and Inspection Service Microbiology Laboratory Guidebook reference method for detecting Salmonella. Comparable performance between the BAX system and the reference methods was observed. Of the 75 paired samples analyzed, 39 samples were positive by both the BAX system and reference methods, and 36 samples were negative by both the BAX system and reference methods, demonstrating 100% correlation. Inclusivity and exclusivity for the BAX system Q7 instrument were also established by testing 50 Salmonella strains and 20 non-Salmonella isolates. All Salmonella strains returned positive results, and all non-Salmonella isolates returned a negative response.

  2. Intercomparison of methods of coupling between convection and large-scale circulation. 1. Comparison over uniform surface conditions

    DOE PAGES

    Daleu, C. L.; Plant, R. S.; Woolnough, S. J.; ...

    2015-10-24

    Here, as part of an international intercomparison project, a set of single-column models (SCMs) and cloud-resolving models (CRMs) are run under the weak-temperature gradient (WTG) method and the damped gravity wave (DGW) method. For each model, the implementation of the WTG or DGW method involves a simulated column which is coupled to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. The simulated column has the same surface conditions as the reference state and is initialized with profiles from the reference state. We performed systematic comparison of the behavior of different models under a consistent implementation of the WTG method and the DGW method and systematic comparison of the WTG and DGW methods in models with different physics and numerics. CRMs and SCMs produce a variety of behaviors under both WTG and DGW methods. Some of the models reproduce the reference state while others sustain a large-scale circulation which results in either substantially lower or higher precipitation compared to the value of the reference state. CRMs show a fairly linear relationship between precipitation and circulation strength. SCMs display a wider range of behaviors than CRMs. Some SCMs under the WTG method produce zero precipitation. Within an individual SCM, a DGW simulation and a corresponding WTG simulation can produce different signed circulation. When initialized with a dry troposphere, DGW simulations always result in a precipitating equilibrium state. The greatest sensitivities to the initial moisture conditions occur for multiple stable equilibria in some WTG simulations, corresponding to either a dry equilibrium state when initialized as dry or a precipitating equilibrium state when initialized as moist. Multiple equilibria are seen in more WTG simulations for higher SST. In some models, the existence of multiple equilibria is sensitive to some parameters in the WTG calculations.

  3. The importance of reference materials in doping-control analysis.

    PubMed

    Mackay, Lindsey G; Kazlauskas, Rymantas

    2011-08-01

    Currently a large range of pure substance reference materials are available for calibration of doping-control methods. These materials enable traceability to the International System of Units (SI) for the results generated by World Anti-Doping Agency (WADA)-accredited laboratories. Only a small number of prohibited substances have threshold limits for which quantification is highly important. For these analytes only the highest quality reference materials that are available should be used. Many prohibited substances have no threshold limits and reference materials provide essential identity confirmation. For these reference materials the correct identity is critical and the methods used to assess identity in these cases should be critically evaluated. There is still a lack of certified matrix reference materials to support many aspects of doping analysis. However, in key areas a range of urine matrix materials have been produced for substances with threshold limits, for example 19-norandrosterone and testosterone/epitestosterone (T/E) ratio. These matrix-certified reference materials (CRMs) are an excellent independent means of checking method recovery and bias and will typically be used in method validation and then regularly as quality-control checks. They can be particularly important in the analysis of samples close to threshold limits, in which measurement accuracy becomes critical. Some reference materials for isotope ratio mass spectrometry (IRMS) analysis are available and a matrix material certified for steroid delta values is currently under production. In other new areas, for example the Athlete Biological Passport, peptide hormone testing, designer steroids, and gene doping, reference material needs still need to be thoroughly assessed and prioritised.

  4. Toward Worldwide Hepcidin Assay Harmonization: Identification of a Commutable Secondary Reference Material.

    PubMed

    van der Vorm, Lisa N; Hendriks, Jan C M; Laarakkers, Coby M; Klaver, Siem; Armitage, Andrew E; Bamberg, Alison; Geurts-Moespot, Anneke J; Girelli, Domenico; Herkert, Matthias; Itkonen, Outi; Konrad, Robert J; Tomosugi, Naohisa; Westerman, Mark; Bansal, Sukhvinder S; Campostrini, Natascia; Drakesmith, Hal; Fillet, Marianne; Olbina, Gordana; Pasricha, Sant-Rayn; Pitts, Kelly R; Sloan, John H; Tagliaro, Franco; Weykamp, Cas W; Swinkels, Dorine W

    2016-07-01

    Absolute plasma hepcidin concentrations measured by various procedures differ substantially, complicating interpretation of results and rendering reference intervals method dependent. We investigated the degree of equivalence achievable by harmonization and the identification of a commutable secondary reference material to accomplish this goal. We applied technical procedures to achieve harmonization developed by the Consortium for Harmonization of Clinical Laboratory Results. Eleven plasma hepcidin measurement procedures (5 mass spectrometry based and 6 immunochemical based) quantified native individual plasma samples (n = 32) and native plasma pools (n = 8) to assess analytical performance and current and achievable equivalence. In addition, 8 types of candidate reference materials (3 concentrations each, n = 24) were assessed for their suitability, most notably in terms of commutability, to serve as secondary reference material. Absolute hepcidin values and reproducibility (intrameasurement procedure CVs 2.9%-8.7%) differed substantially between measurement procedures, but all were linear and correlated well. The current equivalence (intermeasurement procedure CV 28.6%) between the methods was mainly attributable to differences in calibration and could thus be improved by harmonization with a common calibrator. Linear regression analysis and standardized residuals showed that a candidate reference material consisting of native lyophilized plasma with cryolyoprotectant was commutable for all measurement procedures. Mathematically simulated harmonization with this calibrator resulted in a maximum achievable equivalence of 7.7%. The secondary reference material identified in this study has the potential to substantially improve equivalence between hepcidin measurement procedures and contributes to the establishment of a traceability chain that will ultimately allow standardization of hepcidin measurement results. © 2016 American Association for Clinical Chemistry.

  5. Reconstruction method for fringe projection profilometry based on light beams.

    PubMed

    Li, Xuexing; Zhang, Zhijiang; Yang, Chen

    2016-12-01

    A novel reconstruction method for fringe projection profilometry, based on light beams, is proposed and verified by experiments. Commonly used calibration techniques require projector calibration parameters or reference planes placed at many known positions. Introducing projector calibration can reduce the accuracy of the reconstruction result, and placing the reference planes at many known positions is a time-consuming process. Therefore, in this paper a reconstruction method that does not require the projector's parameters is proposed, and only two reference planes are introduced. A series of light beams determined by the subpixel point-to-point map on the two reference planes, combined with their reflected light beams determined by the camera model, are used to calculate the 3D coordinates of reconstruction points. Furthermore, the bundle adjustment strategy and the complementary gray-code phase-shifting method are utilized to ensure accuracy and stability. Qualitative and quantitative comparisons as well as experimental tests demonstrate the performance of our proposed approach, and the measurement accuracy can reach about 0.0454 mm.

  6. Food adulteration analysis without laboratory prepared or determined reference food adulterant values.

    PubMed

    Kalivas, John H; Georgiou, Constantinos A; Moira, Marianna; Tsafaras, Ilias; Petrakis, Eleftherios A; Mousdis, George A

    2014-04-01

    Quantitative analysis of food adulterants is an important health and economic issue that needs to be fast and simple. Spectroscopy has significantly reduced analysis time. However, still needed are preparations of analyte calibration samples matrix matched to prediction samples which can be laborious and costly. Reported in this paper is the application of a newly developed pure component Tikhonov regularization (PCTR) process that does not require laboratory prepared or reference analysis methods, and hence, is a greener calibration method. The PCTR method requires an analyte pure component spectrum and non-analyte spectra. As a food analysis example, synchronous fluorescence spectra of extra virgin olive oil samples adulterated with sunflower oil is used. Results are shown to be better than those obtained using ridge regression with reference calibration samples. The flexibility of PCTR allows including reference samples and is generic for use with other instrumental methods and food products. Copyright © 2013 Elsevier Ltd. All rights reserved.
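
    One simplified reading of a pure-component Tikhonov-style calibration is sketched below: the regression vector is required to give unit response to the analyte's pure spectrum and near-zero response to non-analyte spectra, with a ridge penalty on its norm. This is an interpretation for illustration, not the authors' exact PCTR algorithm; all spectra are simulated.

      import numpy as np

      def pctr_regression_vector(pure_spectrum, nonanalyte_spectra, lam=1e-3, tau=1.0):
          # Find a regression vector b with unit response to the analyte's pure
          # spectrum, near-zero response to non-analyte spectra, and a ridge
          # (Tikhonov) penalty on ||b||; solved as one stacked least-squares problem.
          k = np.atleast_2d(pure_spectrum)          # 1 x p
          N = np.atleast_2d(nonanalyte_spectra)     # m x p
          p = k.shape[1]
          A = np.vstack([k, tau * N, lam * np.eye(p)])
          y = np.concatenate([[1.0], np.zeros(N.shape[0]), np.zeros(p)])
          b, *_ = np.linalg.lstsq(A, y, rcond=None)
          return b

      # Tiny demo with simulated spectra (p = 50 points)
      rng = np.random.default_rng(0)
      p = 50
      pure = np.exp(-0.5 * ((np.arange(p) - 20) / 4.0) ** 2)
      interferents = rng.normal(scale=0.3, size=(5, p))
      b = pctr_regression_vector(pure, interferents)
      mixture = 0.7 * pure + interferents[0]        # contains 0.7 units of analyte
      print(mixture @ b)                            # roughly 0.7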

  7. Automated processing for proton spectroscopic imaging using water reference deconvolution.

    PubMed

    Maudsley, A A; Wu, Z; Meyerhoff, D J; Weiner, M W

    1994-06-01

    Automated formation of MR spectroscopic images (MRSI) is necessary before routine application of these methods is possible for in vivo studies; however, this task is complicated by the presence of spatially dependent instrumental distortions and the complex nature of the MR spectrum. A data processing method is presented for completely automated formation of in vivo proton spectroscopic images, and applied for analysis of human brain metabolites. This procedure uses the water reference deconvolution method (G. A. Morris, J. Magn. Reson. 80, 547(1988)) to correct for line shape distortions caused by instrumental and sample characteristics, followed by parametric spectral analysis. Results for automated image formation were found to compare favorably with operator dependent spectral integration methods. While the water reference deconvolution processing was found to provide good correction of spatially dependent resonance frequency shifts, it was found to be susceptible to errors for correction of line shape distortions. These occur due to differences between the water reference and the metabolite distributions.

  8. Technical Note: Modification of the standard gain correction algorithm to compensate for the number of used reference flat frames in detector performance studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Konstantinidis, Anastasios C.; Olivo, Alessandro; Speller, Robert D.

    2011-12-15

    Purpose: The x-ray performance evaluation of digital x-ray detectors is based on the calculation of the modulation transfer function (MTF), the noise power spectrum (NPS), and the resultant detective quantum efficiency (DQE). The flat images used for the extraction of the NPS should not contain any fixed pattern noise (FPN) to avoid contamination from nonstochastic processes. The "gold standard" method used for the reduction of the FPN (i.e., the different gain between pixels) in linear x-ray detectors is based on normalization with an average reference flat-field. However, the noise in the corrected image depends on the number of flat frames used for the average flat image. The aim of this study is to modify the standard gain correction algorithm to make it independent of the number of reference flat frames used. Methods: Many publications suggest the use of 10-16 reference flat frames, while other studies use higher numbers (e.g., 48 frames) to reduce the propagated noise from the average flat image. This study quantifies experimentally the effect of the number of reference flat frames used on the NPS and DQE values and appropriately modifies the gain correction algorithm to compensate for this effect. Results: It is shown that using the suggested gain correction algorithm a minimum number of reference flat frames (i.e., down to one frame) can be used to eliminate the FPN from the raw flat image. This saves computer memory and time during the x-ray performance evaluation. Conclusions: The authors show that the method presented in the study (a) leads to the maximum DQE value that one would obtain with the conventional method and a very large number of frames and (b) has been compared to an independent gain correction method based on the subtraction of flat-field images, leading to identical DQE values. They believe this provides robust validation of the proposed method.
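
    The conventional gain correction that the study modifies can be sketched as follows; the authors' specific modification is not reproduced here, and the gain map and frames are simulated.

      import numpy as np

      def gain_correct(raw, flat_frames):
          # Standard flat-field gain correction: divide by the average reference
          # flat and rescale to preserve the mean signal level. The residual noise
          # still depends on how many flats enter the average, which is the effect
          # the modified algorithm compensates for.
          flat_avg = np.mean(flat_frames, axis=0)
          return raw / flat_avg * flat_avg.mean()

      rng = np.random.default_rng(0)
      gain = 1.0 + 0.05 * rng.standard_normal((64, 64))      # fixed-pattern gain map
      flats = gain * (1000 + rng.normal(scale=30, size=(16, 64, 64)))
      raw = gain * (1000 + rng.normal(scale=30, size=(64, 64)))
      corrected = gain_correct(raw, flats)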

  9. Using Vision Metrology System for Quality Control in Automotive Industries

    NASA Astrophysics Data System (ADS)

    Mostofi, N.; Samadzadegan, F.; Roohy, Sh.; Nozari, M.

    2012-07-01

    The need for more accurate measurements at different stages of industrial applications, such as design, production, and installation, is the main reason industry is encouraged to use industrial photogrammetry (vision metrology systems). Given the main advantages of photogrammetric methods, such as greater economy, a high level of automation, the capability of non-contact measurement, more flexibility and high accuracy, this approach competes well with traditional industrial measurement methods. For industries that manufacture objects from a master reference model without any mathematical model of it, the main problem for producers is evaluating the production line. The problem becomes especially complicated when both the reference and the product exist only as physical objects, so they can be compared only by direct measurement. In such cases, producers build fixtures fitted to the reference with limited accuracy; in practical reports the available precision is sometimes no better than millimetres. We used a non-metric high-resolution digital camera for this investigation, and the case study examined in this paper is an automobile chassis. In this research, a stable photogrammetric network was designed for measuring the industrial object (both reference and product), and then, using bundle adjustment and self-calibration methods, the differences between the reference and product objects were obtained. These differences are useful to the producer for improving the production workflow and delivering more accurate products. The results of this research demonstrate the high potential of the proposed method in industrial fields and prove its efficiency and reliability in terms of RMSE criteria. The RMSE achieved for this case study is smaller than 200 microns, which shows the high capability of the implemented approach.

  10. Validating internal controls for quantitative plant gene expression studies

    PubMed Central

    Brunner, Amy M; Yakovlev, Igor A; Strauss, Steven H

    2004-01-01

    Background Real-time reverse transcription PCR (RT-PCR) has greatly improved the ease and sensitivity of quantitative gene expression studies. However, accurate measurement of gene expression with this method relies on the choice of a valid reference for data normalization. Studies rarely verify that gene expression levels for reference genes are adequately consistent among the samples used, nor compare alternative genes to assess which are most reliable for the experimental conditions analyzed. Results Using real-time RT-PCR to study the expression of 10 poplar (genus Populus) housekeeping genes, we demonstrate a simple method for determining the degree of stability of gene expression over a set of experimental conditions. Based on a traditional method for analyzing the stability of varieties in plant breeding, it defines measures of gene expression stability from analysis of variance (ANOVA) and linear regression. We found that the potential internal control genes differed widely in their expression stability over the different tissues, developmental stages and environmental conditions studied. Conclusion Our results support that quantitative comparisons of candidate reference genes are an important part of real-time RT-PCR studies that seek to precisely evaluate variation in gene expression. The method we demonstrated facilitates statistical and graphical evaluation of gene expression stability. Selection of the best reference gene for a given set of experimental conditions should enable detection of biologically significant changes in gene expression that are too small to be revealed by less precise methods, or when highly variable reference genes are unknowingly used in real-time RT-PCR experiments. PMID:15317655

  11. Intercomparison of Lab-Based Soil Water Extraction Methods for Stable Water Isotope Analysis

    NASA Astrophysics Data System (ADS)

    Pratt, D.; Orlowski, N.; McDonnell, J.

    2016-12-01

    The effect of pore water extraction technique on the resultant isotopic signature is poorly understood. Here we present results of an intercomparison of five common lab-based soil water extraction techniques: high pressure mechanical squeezing, centrifugation, direct vapor equilibration, microwave extraction, and cryogenic extraction. We applied the five extraction methods to two physicochemically different standard soil types (silty sand and clayey loam) that were oven-dried and rewetted with water of known isotopic composition at three different gravimetric water contents (8, 20, and 30%). We tested the null hypothesis that all extraction techniques would provide the same isotopic result independent of soil type and water content. Our results showed that the extraction technique had a significant effect on the soil water isotopic composition. Each method exhibited deviations from the spiked reference water, with soil type and water content showing a secondary effect. Cryogenic extraction showed the largest deviations from the reference water, whereas mechanical squeezing and centrifugation provided the closest match to the reference water for both soil types. We also compared results for each extraction technique that produced liquid water on both an OA-ICOS and an IRMS; differences between them were negligible.

  12. Reference method for detection of Pgp mediated multidrug resistance in human hematological malignancies: a method validated by the laboratories of the French Drug Resistance Network.

    PubMed

    Huet, S; Marie, J P; Gualde, N; Robert, J

    1998-12-15

    Multidrug resistance (MDR) associated with overexpression of the MDR1 gene and of its product, P-glycoprotein (Pgp), plays an important role in limiting cancer treatment efficacy. Many studies have investigated Pgp expression in clinical samples of hematological malignancies but have failed to give a definitive conclusion on its usefulness. One convenient method for fluorescent detection of Pgp in malignant cells is flow cytometry, which, however, gives variable results from one laboratory to another, partly due to the lack of a rigorously tested reference method. The purpose of this technical note is to describe each step of a reference flow cytometric method. The guidelines for sample handling, staining and analysis have been established both for Pgp detection with monoclonal antibodies directed against extracellular epitopes (MRK16, UIC2 and 4E3), and for Pgp functional activity measurement with Rhodamine 123 as a fluorescent probe. Both methods have been validated on cultured cell lines and clinical samples by 12 laboratories of the French Drug Resistance Network. This cross-validated multicentric study points out steps that are crucial for the accuracy and reproducibility of the results, such as cell viability, data analysis and expression.

  13. Nonparametric spirometry reference values for Hispanic Americans.

    PubMed

    Glenn, Nancy L; Brown, Vanessa M

    2011-02-01

    Recent literature cites ethnic origin as a major factor in developing pulmonary function reference values. Extensive studies have established reference values for European and African Americans, but not for Hispanic Americans. The Third National Health and Nutrition Examination Survey defines Hispanic as individuals of Spanish-speaking cultures. While no group was excluded from the target population, sample size requirements only allowed inclusion of individuals who identified themselves as Mexican Americans. This research constructs nonparametric reference value confidence intervals for Hispanic American pulmonary function. The method is applicable to all ethnicities. We use empirical likelihood confidence intervals to establish normal ranges for reference values. Their major advantage is that the approach is model free, yet shares the asymptotic properties of model-based methods. Statistical comparisons indicate that empirical likelihood interval lengths are comparable to normal theory intervals. Power and efficiency studies agree with previously published theoretical results.

  14. Analytical Bias Exceeding Desirable Quality Goal in 4 out of 5 Common Immunoassays: Results of a Native Single Serum Sample External Quality Assessment Program for Cobalamin, Folate, Ferritin, Thyroid-Stimulating Hormone, and Free T4 Analyses.

    PubMed

    Kristensen, Gunn B B; Rustad, Pål; Berg, Jens P; Aakre, Kristin M

    2016-09-01

    We undertook this study to evaluate method differences for 5 components analyzed by immunoassays, to explore whether the use of method-dependent reference intervals may compensate for method differences, and to investigate commutability of external quality assessment (EQA) materials. Twenty fresh native single serum samples, a fresh native serum pool, Nordic Federation of Clinical Chemistry Reference Serum X (serum X) (serum pool), and 2 EQA materials were sent to 38 laboratories for measurement of cobalamin, folate, ferritin, free T4, and thyroid-stimulating hormone (TSH) by 5 different measurement procedures [Roche Cobas (n = 15), Roche Modular (n = 4), Abbott Architect (n = 8), Beckman Coulter Unicel (n = 2), and Siemens ADVIA Centaur (n = 9)]. The target value for each component was calculated based on the mean of method means or measured by a reference measurement procedure (free T4). Quality specifications were based on biological variation. Local reference intervals were reported from all laboratories. Method differences that exceeded acceptable bias were found for all components except folate. Free T4 differences from the uncommonly used reference measurement procedure were large. Reference intervals differed between measurement procedures but also within 1 measurement procedure. The serum X material was commutable for all components and measurement procedures, whereas the EQA materials were noncommutable in 13 of 50 occasions (5 components, 5 methods, 2 EQA materials). The bias between the measurement procedures was unacceptably large in 4/5 tested components. Traceability to reference materials as claimed by the manufacturers did not lead to acceptable harmonization. Adjustment of reference intervals in accordance with method differences and use of commutable EQA samples are not implemented commonly. © 2016 American Association for Clinical Chemistry.

  15. A Calibration-Free Laser-Induced Breakdown Spectroscopy (CF-LIBS) Quantitative Analysis Method Based on the Auto-Selection of an Internal Reference Line and Optimized Estimation of Plasma Temperature.

    PubMed

    Yang, Jianhong; Li, Xiaomeng; Xu, Jinwu; Ma, Xianghong

    2018-01-01

    The quantitative analysis accuracy of calibration-free laser-induced breakdown spectroscopy (CF-LIBS) is severely affected by the self-absorption effect and estimation of plasma temperature. Herein, a CF-LIBS quantitative analysis method based on the auto-selection of internal reference line and the optimized estimation of plasma temperature is proposed. The internal reference line of each species is automatically selected from analytical lines by a programmable procedure through easily accessible parameters. Furthermore, the self-absorption effect of the internal reference line is considered during the correction procedure. To improve the analysis accuracy of CF-LIBS, the particle swarm optimization (PSO) algorithm is introduced to estimate the plasma temperature based on the calculation results from the Boltzmann plot. Thereafter, the species concentrations of a sample can be calculated according to the classical CF-LIBS method. A total of 15 certified alloy steel standard samples of known compositions and elemental weight percentages were used in the experiment. Using the proposed method, the average relative errors of Cr, Ni, and Fe calculated concentrations were 4.40%, 6.81%, and 2.29%, respectively. The quantitative results demonstrated an improvement compared with the classical CF-LIBS method and the promising potential of in situ and real-time application.
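
    For context, the Boltzmann-plot relation that underlies the plasma-temperature estimate in classical CF-LIBS can be written as follows (a standard form with conventional symbols, not an equation reproduced from this paper):

      \[
      \ln\!\left(\frac{I_{ki}}{g_k A_{ki}}\right) \;=\; -\frac{E_k}{k_B T} \;+\; \ln\!\left(\frac{F\,C_s}{U_s(T)}\right),
      \]

    where I_ki is the measured line intensity, g_k and E_k the upper-level degeneracy and energy, A_ki the transition probability, U_s(T) the species partition function, C_s the species concentration and F an experimental factor. The slope of the plot gives -1/(k_B T), so a more robust temperature estimate (here sought via PSO) directly improves the concentrations recovered from the intercepts.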

  16. Reference gene selection for quantitative gene expression studies during biological invasions: A test on multiple genes and tissues in a model ascidian Ciona savignyi.

    PubMed

    Huang, Xuena; Gao, Yangchun; Jiang, Bei; Zhou, Zunchun; Zhan, Aibin

    2016-01-15

    As invasive species have successfully colonized a wide range of dramatically different local environments, they offer a good opportunity to study interactions between species and rapidly changing environments. Gene expression represents one of the primary and crucial mechanisms for rapid adaptation to local environments. Here, we aim to select reference genes for quantitative gene expression analysis based on quantitative real-time PCR (qRT-PCR) for a model invasive ascidian, Ciona savignyi. We analyzed the stability of ten candidate reference genes in three tissues (siphon, pharynx and intestine) under two key environmental stresses in the marine realm (temperature and salinity) using three programs (geNorm, NormFinder and the delta Ct method). Our results demonstrated only minor differences in stability rankings among the three methods. The use of different single reference genes might influence the data interpretation, while multiple reference genes could minimize possible errors. Therefore, reference gene combinations were recommended for different tissues - the optimal reference gene combination for siphon was RPS15 and RPL17 under temperature stress, and RPL17, UBQ and TubA under salinity treatment; for pharynx, TubB, TubA and RPL17 were the most stable genes under temperature stress, while TubB, TubA and UBQ were the best under salinity stress; for intestine, UBQ, RPS15 and RPL17 were the most reliable reference genes under both treatments. Our results suggest the necessity of selecting and testing reference genes for different tissues under varying environmental stresses. The results obtained here are expected to help reveal mechanisms of gene expression-mediated invasion success using C. savignyi as a model species. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Recording multiple spatially-heterodyned direct to digital holograms in one digital image

    DOEpatents

    Hanson, Gregory R [Clinton, TN; Bingham, Philip R [Knoxville, TN

    2008-03-25

    Systems and methods are described for recording multiple spatially-heterodyned direct to digital holograms in one digital image. A method includes digitally recording, at a first reference beam-object beam angle, a first spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded first spatially-heterodyned hologram by shifting a first original origin of the recorded first spatially-heterodyned hologram to sit on top of a first spatial-heterodyne carrier frequency defined by the first reference beam-object beam angle; digitally recording, at a second reference beam-object beam angle, a second spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis; Fourier analyzing the recorded second spatially-heterodyned hologram by shifting a second original origin of the recorded second spatially-heterodyned hologram to sit on top of a second spatial-heterodyne carrier frequency defined by the second reference beam-object beam angle; applying a first digital filter to cut off signals around the first original origin and define a first result; performing a first inverse Fourier transform on the first result; applying a second digital filter to cut off signals around the second original origin and define a second result; and performing a second inverse Fourier transform on the second result, wherein the first reference beam-object beam angle is not equal to the second reference beam-object beam angle and a single digital image includes both the first spatially-heterodyned hologram and the second spatially-heterodyned hologram.
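
    A minimal numerical sketch of the per-hologram processing chain described above (FFT, shift of the spectrum so the chosen spatial-heterodyne carrier sits at the origin, digital filtering, inverse FFT) might look like the following; array names and the filter radius are illustrative and are not taken from the patent.

      import numpy as np

      def demodulate_hologram(hologram, carrier_cycles, cutoff_radius):
          """Recover one complex image from a spatially-heterodyned hologram.

          hologram:        2-D real array (the recorded digital image)
          carrier_cycles:  (fy, fx) spatial-heterodyne carrier frequency, in cycles per image
          cutoff_radius:   radius (in frequency bins) of the filter around the shifted carrier
          """
          ny, nx = hologram.shape
          spectrum = np.fft.fftshift(np.fft.fft2(hologram))
          # Shift the spectrum so the carrier peak moves onto the origin.
          spectrum = np.roll(spectrum, (-int(carrier_cycles[0]), -int(carrier_cycles[1])), axis=(0, 1))
          # Keep only frequencies near the shifted carrier, rejecting the original DC term.
          fy, fx = np.meshgrid(np.arange(ny) - ny // 2, np.arange(nx) - nx // 2, indexing="ij")
          mask = (fy ** 2 + fx ** 2) <= cutoff_radius ** 2
          return np.fft.ifft2(np.fft.ifftshift(spectrum * mask))

      # Two holograms recorded at different reference-beam angles could be demodulated from
      # the same digital image by calling this function with two different carrier frequencies.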

  18. A computerized procedure for teaching the relationship between graphic symbols and their referents.

    PubMed

    Isaacson, Mick; Lloyd, Lyle L

    2013-01-01

    Many individuals with little or no functional speech communicate through graphic symbols. Communication is enhanced when the relationship between symbols and their referents is learned to such a degree that retrieval is effortless, resulting in fluent communication. Developing fluency is a time-consuming endeavor for special educators and speech-language pathologists (SLPs). It would be beneficial for these professionals to have an automated procedure based on the most efficacious method for teaching the relationship between symbols and referents. Hence, this study investigated whether a procedure based on the generation effect would promote learning of the association between symbols and their referents. Results show that referent generation produces the best long-term retention of this relationship. These findings provide evidence that software based on referent generation would provide special educators and SLPs with an efficacious automated procedure, requiring minimal direct supervision, to facilitate symbol/referent learning and the development of communicative fluency.

  19. Bounded Linear Stability Analysis - A Time Delay Margin Estimation Approach for Adaptive Control

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.; Ishihara, Abraham K.; Krishnakumar, Kalmanje Srinivas; Bakhtiari-Nejad, Maryam

    2009-01-01

    This paper presents a method for estimating time delay margin for model-reference adaptive control of systems with almost linear structured uncertainty. The bounded linear stability analysis method seeks to represent the conventional model-reference adaptive law by a locally bounded linear approximation within a small time window using the comparison lemma. The locally bounded linear approximation of the combined adaptive system is cast in a form of an input-time-delay differential equation over a small time window. The time delay margin of this system represents a local stability measure and is computed analytically by a matrix measure method, which provides a simple analytical technique for estimating an upper bound of time delay margin. Based on simulation results for a scalar model-reference adaptive control system, both the bounded linear stability method and the matrix measure method are seen to provide a reasonably accurate and yet not too conservative time delay margin estimation.
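
    As background on the "matrix measure" invoked above (standard definitions, not details from this paper), the measure induced by a vector norm ||.|| is

      \[
      \mu(A) \;=\; \lim_{h \to 0^{+}} \frac{\lVert I + hA \rVert - 1}{h},
      \]

    which for the 2-norm evaluates to mu_2(A) = lambda_max((A + A^T)/2). Unlike an eigenvalue bound alone, the matrix measure certifies an exponential bound on the solution norm over a finite window, which is what makes it convenient for deriving an analytical upper bound on the time-delay margin of the locally bounded linear approximation.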

  20. Fundamental Importance of Reference Glucose Analyzer Accuracy for Evaluating the Performance of Blood Glucose Monitoring Systems (BGMSs)

    PubMed Central

    Bailey, Timothy S.; Klaff, Leslie J.; Wallace, Jane F.; Greene, Carmine; Pardo, Scott; Harrison, Bern; Simmons, David A.

    2016-01-01

    Background: As blood glucose monitoring system (BGMS) accuracy is based on comparison of BGMS and laboratory reference glucose analyzer results, reference instrument accuracy is important to discriminate small differences between BGMS and reference glucose analyzer results. Here, we demonstrate the important role of reference glucose analyzer accuracy in BGMS accuracy evaluations. Methods: Two clinical studies assessed the performance of a new BGMS, using different reference instrument procedures. BGMS and YSI analyzer results were compared for fingertip blood that was obtained by untrained subjects’ self-testing and study staff testing, respectively. YSI analyzer accuracy was monitored using traceable serum controls. Results: In study 1 (N = 136), 94.1% of BGMS results were within International Organization for Standardization (ISO) 15197:2013 accuracy criteria; YSI analyzer serum control results showed a negative bias (−0.64% to −2.48%) at the first site and a positive bias (3.36% to 6.91%) at the other site. In study 2 (N = 329), 97.8% of BGMS results were within accuracy criteria; serum controls showed minimal bias (<0.92%) at both sites. Conclusions: These findings suggest that the ability to demonstrate that a BGMS meets accuracy guidelines is influenced by reference instrument accuracy. PMID:26902794

  1. Assessment of data processing to improve reliability of microarray experiments using genomic DNA reference.

    PubMed

    Yang, Yunfeng; Zhu, Mengxia; Wu, Liyou; Zhou, Jizhong

    2008-09-16

    Using genomic DNA as common reference in microarray experiments has recently been tested by different laboratories. Conflicting results have been reported with regard to the reliability of microarray results using this method. To explain it, we hypothesize that data processing is a critical element that impacts the data quality. Microarray experiments were performed in a gamma-proteobacterium Shewanella oneidensis. Pair-wise comparison of three experimental conditions was obtained either with two labeled cDNA samples co-hybridized to the same array, or by employing Shewanella genomic DNA as a standard reference. Various data processing techniques were exploited to reduce the amount of inconsistency between both methods and the results were assessed. We discovered that data quality was significantly improved by imposing the constraint of minimal number of replicates, logarithmic transformation and random error analyses. These findings demonstrate that data processing significantly influences data quality, which provides an explanation for the conflicting evaluation in the literature. This work could serve as a guideline for microarray data analysis using genomic DNA as a standard reference.

  2. EC comparison on the determination of 226Ra, 228Ra, 234U and 238U in water among European monitoring laboratories.

    PubMed

    Wätjen, U; Benedik, L; Spasova, Y; Vasile, M; Altzitzoglou, T; Beyermann, M

    2010-01-01

    In anticipation of new European requirements for monitoring radioactivity concentration in drinking water, IRMM organized an interlaboratory comparison on the determination of low levels of activity concentrations (about 10-100 mBq L(-1)) of the naturally occurring radionuclides (226)Ra, (228)Ra, (234)U and (238)U in three commercially available mineral waters. Using two or three different methods with traceability to the International Reference System (SIR), the reference values of the water samples were determined prior to the proficiency test within combined standard uncertainties of the order of 3%-10%. An overview of the radiochemical separation and measurement methods used by the 45 participating laboratories is given. The results of the participants are evaluated against the reference values. Several of the participants' results deviate by more than a factor of two from the reference values, in particular for the radium isotopes. Such erroneous analysis results may lead to a crucial omission of remedial actions on drinking water supplies or to economic loss from an unjustified action. Copyright 2009 Elsevier Ltd. All rights reserved.

  3. Metabarcoding of marine nematodes – evaluation of reference datasets used in tree-based taxonomy assignment approach

    PubMed Central

    2016-01-01

    Abstract Background Metabarcoding is becoming a common tool used to assess and compare the diversity of organisms in environmental samples. Identification of OTUs is one of the critical steps in the process, and several taxonomy assignment methods have been proposed to accomplish this task. This publication evaluates the quality of reference datasets, along with several alignment and phylogeny inference methods used in one of the taxonomy assignment methods, called the tree-based approach. This approach assigns anonymous OTUs to taxonomic categories based on the relative placements of OTUs and reference sequences on the cladogram and the support that these placements receive. New information In the tree-based taxonomy assignment approach, reliable identification of anonymous OTUs is based on their placement in monophyletic and highly supported clades together with identified reference taxa. Therefore, it requires a high quality reference dataset. The resolution of phylogenetic trees is strongly affected by the presence of erroneous sequences as well as by the alignment and phylogeny inference methods used in the process. Two preparation steps are essential for the successful application of the tree-based taxonomy assignment approach. Curated collections of genetic information do include erroneous sequences. These sequences have a detrimental effect on the resolution of cladograms used in the tree-based approach. They must be identified and excluded from the reference dataset beforehand. Various combinations of multiple sequence alignment and phylogeny inference methods provide cladograms with different topology and bootstrap support. These combinations of methods need to be tested in order to determine the one that gives the highest resolution for the particular reference dataset. Completing the above-mentioned preparation steps is expected to decrease the number of unassigned OTUs and thus improve the results of the tree-based taxonomy assignment approach. PMID:27932919

  4. Metabarcoding of marine nematodes - evaluation of reference datasets used in tree-based taxonomy assignment approach.

    PubMed

    Holovachov, Oleksandr

    2016-01-01

    Metabarcoding is becoming a common tool used to assess and compare the diversity of organisms in environmental samples. Identification of OTUs is one of the critical steps in the process, and several taxonomy assignment methods have been proposed to accomplish this task. This publication evaluates the quality of reference datasets, along with several alignment and phylogeny inference methods used in one of the taxonomy assignment methods, called the tree-based approach. This approach assigns anonymous OTUs to taxonomic categories based on the relative placements of OTUs and reference sequences on the cladogram and the support that these placements receive. In the tree-based taxonomy assignment approach, reliable identification of anonymous OTUs is based on their placement in monophyletic and highly supported clades together with identified reference taxa. Therefore, it requires a high quality reference dataset. The resolution of phylogenetic trees is strongly affected by the presence of erroneous sequences as well as by the alignment and phylogeny inference methods used in the process. Two preparation steps are essential for the successful application of the tree-based taxonomy assignment approach. Curated collections of genetic information do include erroneous sequences. These sequences have a detrimental effect on the resolution of cladograms used in the tree-based approach. They must be identified and excluded from the reference dataset beforehand. Various combinations of multiple sequence alignment and phylogeny inference methods provide cladograms with different topology and bootstrap support. These combinations of methods need to be tested in order to determine the one that gives the highest resolution for the particular reference dataset. Completing the above-mentioned preparation steps is expected to decrease the number of unassigned OTUs and thus improve the results of the tree-based taxonomy assignment approach.

  5. Validating fatty acid intake as estimated by an FFQ: how does the 24 h recall perform as reference method compared with the duplicate portion?

    PubMed

    Trijsburg, Laura; de Vries, Jeanne Hm; Hollman, Peter Ch; Hulshof, Paul Jm; van 't Veer, Pieter; Boshuizen, Hendriek C; Geelen, Anouk

    2018-05-08

    To compare the performance of the commonly used 24 h recall (24hR) with the more distinct duplicate portion (DP) as reference method for validation of fatty acid intake estimated with an FFQ. Intakes of SFA, MUFA, n-3 fatty acids and linoleic acid (LA) were estimated by chemical analysis of two DP and by on average five 24hR and two FFQ. Plasma n-3 fatty acids and LA were used to objectively compare ranking of individuals based on DP and 24hR. Multivariate measurement error models were used to estimate validity coefficients and attenuation factors for the FFQ with the DP and 24hR as reference methods. Wageningen, the Netherlands. Ninety-two men and 106 women (aged 20-70 years). Validity coefficients for the fatty acid estimates by the FFQ tended to be lower when using the DP as reference method compared with the 24hR. Attenuation factors for the FFQ tended to be slightly higher based on the DP than those based on the 24hR as reference method. Furthermore, when using plasma fatty acids as reference, the DP showed comparable to slightly better ranking of participants according to their intake of n-3 fatty acids (0·33) and n-3:LA (0·34) than the 24hR (0·22 and 0·24, respectively). The 24hR gives only slightly different results compared with the distinctive but less feasible DP, therefore use of the 24hR seems appropriate as the reference method for FFQ validation of fatty acid intake.

  6. Fast first arrival picking algorithm for noisy microseismic data

    NASA Astrophysics Data System (ADS)

    Kim, Dowan; Byun, Joongmoo; Lee, Minho; Choi, Jihoon; Kim, Myungsun

    2017-01-01

    Most microseismic events occur during hydraulic fracturing. Thus microseismic monitoring, by recording seismic waves from microseismic events, is one of the best methods for locating the positions of hydraulic fractures. However, since microseismic events have very low energy, the data often have a low signal-to-noise ratio (S/N ratio) and it is not easy to pick the first arrival time. In this study, we suggest a new fast picking method optimised for noisy data using cross-correlation and stacking. In this method, a reference trace is selected and the time differences between the first arrivals of the reference trace and those of the other traces are computed by cross-correlation. Then, all traces are aligned with the reference trace by time shifting, and the aligned traces are summed together to produce a stacked reference trace that has a considerably improved S/N ratio. After the first arrival time of the stacked reference trace is picked, the first arrival time of each trace is calculated automatically using the time differences obtained in the cross-correlation process. In experiments with noisy synthetic data and field data, this method produces more reliable results than the traditional method, which picks the first arrival time of each noisy trace separately. In addition, the computation time is dramatically reduced.
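
    A compact sketch of the alignment-and-stacking idea described above (illustrative only; function and variable names are hypothetical, and the single-trace picker is deliberately simplistic compared with what a production implementation would use):

      import numpy as np

      def pick_first_arrivals(traces, reference_index=0):
          """traces: 2-D array (n_traces, n_samples); returns first-arrival picks in samples."""
          reference = traces[reference_index]
          n_samples = traces.shape[1]
          # 1) Time shift of each trace relative to the reference, via cross-correlation.
          shifts = []
          for trace in traces:
              xcorr = np.correlate(trace, reference, mode="full")
              shifts.append(np.argmax(xcorr) - (n_samples - 1))
          shifts = np.array(shifts)
          # 2) Align all traces with the reference and stack to raise the S/N ratio.
          stacked = np.zeros(n_samples)
          for trace, shift in zip(traces, shifts):
              stacked += np.roll(trace, -shift)
          # 3) Pick the first arrival once, on the high-S/N stacked reference trace
          #    (here: first sample exceeding a simple amplitude threshold).
          threshold = 0.5 * np.max(np.abs(stacked))
          stack_pick = int(np.argmax(np.abs(stacked) > threshold))
          # 4) Propagate the pick to every trace using the cross-correlation time shifts.
          return stack_pick + shifts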

  7. Adaptation of the Sensititre broth microdilution technique to antimicrobial susceptibility testing of Mycoplasma hyopneumoniae.

    PubMed

    Tanner, A C; Erickson, B Z; Ross, R F

    1993-09-01

    A broth microdilution technique is described for determining the antimicrobial susceptibility of Mycoplasma hyopneumoniae, using commercially prepared Sensititre plates. Twenty-five field isolates and two reference strains (J and 232) were tested against seven antimicrobials. Field isolates were tested in duplicate, and reference strains were tested four times, to estimate reproducibility. Ninety-seven percent of the duplicate MIC results for the field isolates were in agreement or within one log2 dilution. Similar results were obtained with the reference strains. The isolates were susceptible to lincomycin-spectinomycin, tylosin and oxytetracycline, and resistant to amoxycillin, apramycin and erythromycin. Susceptibility to furaltadone varied. This method retains the accuracy and reproducibility of broth MIC determinations, while avoiding the lengthy preparation of antimicrobial dilutions normally associated with more traditional methods.

  8. International Standards and Reference Materials for Quantitative Molecular Infectious Disease Testing

    PubMed Central

    Madej, Roberta M.; Davis, Jack; Holden, Marcia J.; Kwang, Stan; Labourier, Emmanuel; Schneider, George J.

    2010-01-01

    The utility of quantitative molecular diagnostics for patient management depends on the ability to relate patient results to prior results or to absolute values in clinical practice guidelines. To do this, those results need to be comparable across time and methods, either by producing the same value across methods and test versions or by using reliable and stable conversions. Universally available standards and reference materials specific to quantitative molecular technologies are critical to this process but are few in number. This review describes recent history in the establishment of international standards for nucleic acid test development, organizations involved in current efforts, and future issues and initiatives. PMID:20075208

  9. An analysis of methods for gravity determination and their utilization for the calculation of geopotential numbers in the Slovak national levelling network

    NASA Astrophysics Data System (ADS)

    Majkráková, Miroslava; Papčo, Juraj; Zahorec, Pavol; Droščák, Branislav; Mikuška, Ján; Marušiak, Ivan

    2016-09-01

    The vertical reference system in the Slovak Republic is realized by the National Levelling Network (NLN). Normal heights according to Molodensky were introduced as reference heights in the NLN in 1957. Since then, the gravity correction that is necessary to determine the reference heights in the NLN has been obtained by interpolation from either the simple or the complete Bouguer anomalies. We refer to this method as the "original" one. Currently, the method based on geopotential numbers is the preferred way to unify the European levelling networks. The core of this article is an analysis of different approaches to gravity determination and their application to the calculation of geopotential numbers at the points of the NLN. The first method is based on the calculation of gravity at levelling points from interpolated values of the complete Bouguer anomaly using the CBA2G_SK software. The second method is based on the global geopotential model EGM2008 improved by the Residual Terrain Model (RTM) approach. The calculated gravity is used to determine the normal heights according to Molodensky along parts of the levelling lines around the EVRF2007 datum point EH-V. Pitelová (UELN-1905325) and along the levelling line of the 2nd order NLN to Kráľova hoľa Mountain (the highest point measured by levelling). The results of our analysis show that the method based on interpolated gravity values is the better option for gravity determination when measured gravity is not available. This method proved suitable for the determination of geopotential numbers and reference heights in the Slovak national levelling network at points where gravity is not observed directly. We also demonstrated the necessity of using a precise RTM to refine the results derived solely from EGM2008.
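
    For reference, the geopotential number and the corresponding normal height used in this kind of analysis are conventionally defined as follows (standard geodetic relations, not formulas quoted from this paper):

      \[
      C_P \;=\; W_0 - W_P \;=\; \int_{0}^{P} g \, \mathrm{d}n \;\approx\; \sum_i \bar{g}_i \, \Delta n_i,
      \qquad
      H^{*}_P \;=\; \frac{C_P}{\bar{\gamma}},
      \]

    where g is surface gravity along the levelling line, Delta n_i are the measured levelled height differences and bar(gamma) is the mean normal gravity along the normal plumb line. This is why the choice of gravity values (interpolated, CBA-based versus EGM2008 plus RTM) propagates directly into the geopotential numbers and the normal heights.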

  10. Experimental study on cross-sensitivity of temperature and vibration of embedded fiber Bragg grating sensors

    NASA Astrophysics Data System (ADS)

    Chen, Tao; Ye, Meng-li; Liu, Shu-liang; Deng, Yan

    2018-03-01

    In view of the mechanism by which cross-sensitivity arises, a series of calibration experiments was carried out to address the cross-sensitivity problem of embedded fiber Bragg gratings (FBGs) using the reference grating method. Moreover, an ultrasonic-vibration-assisted grinding (UVAG) model was established, and finite element analysis (FEA) was carried out for the monitoring environment of the embedded temperature measurement system. In addition, the related temperature acquisition tests were set up in accordance with the requirements of the reference grating method. Finally, comparative analyses of the simulation and experimental results were performed, from which it may be concluded that the reference grating method can effectively resolve the cross-sensitivity of embedded FBGs.
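
    The reference grating method rests on the standard FBG response (a generic relation, not the authors' exact calibration model):

      \[
      \frac{\Delta\lambda_B}{\lambda_B} \;=\; (1 - p_e)\,\varepsilon \;+\; (\alpha + \xi)\,\Delta T ,
      \]

    where p_e is the effective photo-elastic coefficient, alpha the thermal expansion coefficient and xi the thermo-optic coefficient. A co-located, strain-isolated reference grating senses only the (alpha + xi) Delta T term, so subtracting its wavelength shift from that of the embedded sensing grating isolates the strain (vibration) contribution.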

  11. A novel method for the activity measurement of large-area beta reference sources.

    PubMed

    Stanga, D; De Felice, P; Keightley, J; Capogni, M; Ioan, M R

    2016-03-01

    A novel method has been developed for the activity measurement of large-area beta reference sources. It makes use of two emission rate measurements and is based on the weak dependence between the source activity and the activity distribution for a given value of transmission coefficient. The method was checked experimentally by measuring the activity of two ((60)Co and (137)Cs) large-area reference sources constructed from anodized aluminum foils. Measurement results were compared with the activity values measured by gamma spectrometry. For each source, they agree within one standard uncertainty and also agree within the same limits with the certified values of the source activity. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Accurate determination of reference materials and natural isolates by means of quantitative (1)H NMR spectroscopy.

    PubMed

    Frank, Oliver; Kreissl, Johanna Karoline; Daschner, Andreas; Hofmann, Thomas

    2014-03-26

    A fast and precise proton nuclear magnetic resonance (qHNMR) method for the quantitative determination of low molecular weight target molecules in reference materials and natural isolates has been validated using ERETIC 2 (Electronic REference To access In vivo Concentrations) based on the PULCON (PULse length based CONcentration determination) methodology and compared to the gravimetric results. Using an Avance III NMR spectrometer (400 MHz) equipped with a broad band observe (BBO) probe, the qHNMR method was validated by determining its linearity, range, precision, and accuracy as well as robustness and limit of quantitation. The linearity of the method was assessed by measuring samples of l-tyrosine, caffeine, or benzoic acid in a concentration range between 0.3 and 16.5 mmol/L (r(2) ≥ 0.99), whereas the interday and intraday precisions were found to be ≤2%. The recovery of a range of reference compounds was ≥98.5%, thus demonstrating the qHNMR method as a precise tool for the rapid quantitation (~15 min) of food-related target compounds in reference materials and natural isolates such as nucleotides, polyphenols, or cyclic peptides.
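
    In simplified form, the PULCON relation underlying ERETIC 2 can be stated as follows (assuming identical temperature, receiver gain and number of scans for sample and external reference; this is a generic statement of the principle, not the validated protocol itself):

      \[
      c_{\mathrm{a}} \;=\; c_{\mathrm{ref}} \cdot \frac{A_{\mathrm{a}}}{A_{\mathrm{ref}}} \cdot \frac{N_{\mathrm{ref}}}{N_{\mathrm{a}}} \cdot \frac{\theta_{90,\mathrm{a}}}{\theta_{90,\mathrm{ref}}},
      \]

    where A is the integrated signal area, N the number of protons contributing to the signal and theta_90 the 90° pulse length; the pulse-length ratio follows from the principle of reciprocity, which links detection sensitivity to the length of the 90° pulse.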

  13. The PneuCarriage Project: A Multi-Centre Comparative Study to Identify the Best Serotyping Methods for Examining Pneumococcal Carriage in Vaccine Evaluation Studies

    PubMed Central

    Satzke, Catherine; Dunne, Eileen M.; Porter, Barbara D.; Klugman, Keith P.; Mulholland, E. Kim

    2015-01-01

    Background The pneumococcus is a diverse pathogen whose primary niche is the nasopharynx. Over 90 different serotypes exist, and nasopharyngeal carriage of multiple serotypes is common. Understanding pneumococcal carriage is essential for evaluating the impact of pneumococcal vaccines. Traditional serotyping methods are cumbersome and insufficient for detecting multiple serotype carriage, and there are few data comparing the new methods that have been developed over the past decade. We established the PneuCarriage project, a large, international multi-centre study dedicated to the identification of the best pneumococcal serotyping methods for carriage studies. Methods and Findings Reference sample sets were distributed to 15 research groups for blinded testing. Twenty pneumococcal serotyping methods were used to test 81 laboratory-prepared (spiked) samples. The five top-performing methods were used to test 260 nasopharyngeal (field) samples collected from children in six high-burden countries. Sensitivity and positive predictive value (PPV) were determined for the test methods and the reference method (traditional serotyping of >100 colonies from each sample). For the alternate serotyping methods, the overall sensitivity ranged from 1% to 99% (reference method 98%), and PPV from 8% to 100% (reference method 100%), when testing the spiked samples. Fifteen methods had ≥70% sensitivity to detect the dominant (major) serotype, whilst only eight methods had ≥70% sensitivity to detect minor serotypes. For the field samples, the overall sensitivity ranged from 74.2% to 95.8% (reference method 93.8%), and PPV from 82.2% to 96.4% (reference method 99.6%). The microarray had the highest sensitivity (95.8%) and high PPV (93.7%). The major limitation of this study is that not all of the available alternative serotyping methods were included. Conclusions Most methods were able to detect the dominant serotype in a sample, but many performed poorly in detecting the minor serotype populations. Microarray with a culture amplification step was the top-performing method. Results from this comprehensive evaluation will inform future vaccine evaluation and impact studies, particularly in low-income settings, where pneumococcal disease burden remains high. PMID:26575033

  14. Paleodemographic age-at-death distributions of two Mexican skeletal collections: a comparison of transition analysis and traditional aging methods.

    PubMed

    Bullock, Meggan; Márquez, Lourdes; Hernández, Patricia; Ruíz, Fernando

    2013-09-01

    Traditional methods of aging adult skeletons suffer from the problem of age mimicry of the reference collection, as described by Bocquet-Appel and Masset (1982). Transition analysis (Boldsen et al., 2002) is a method of aging adult skeletons that addresses the problem of age mimicry of the reference collection by allowing users to select an appropriate prior probability. In order to evaluate whether transition analysis results in significantly different age estimates for adults, the method was applied to skeletal collections from Postclassic Cholula and Contact-Period Xochimilco. The resulting age-at-death distributions were then compared with age-at-death distributions for the two populations constructed using traditional aging methods. Although the traditional aging methods result in age-at-death distributions with high young adult mortality and few individuals living past the age of 50, the age-at-death distributions constructed using transition analysis indicate that most individuals who lived into adulthood lived past the age of 50. Copyright © 2013 Wiley Periodicals, Inc.

  15. Quantitative analysis of Si1-xGex alloy films by SIMS and XPS depth profiling using a reference material

    NASA Astrophysics Data System (ADS)

    Oh, Won Jin; Jang, Jong Shik; Lee, Youn Seoung; Kim, Ansoon; Kim, Kyung Joong

    2018-02-01

    Quantitative analysis methods for multi-element alloy films were compared. The atomic fractions of Si1-xGex alloy films were measured by depth profiling analysis with secondary ion mass spectrometry (SIMS) and X-ray photoelectron spectroscopy (XPS). An intensity-to-composition conversion factor (ICF) was used as a means of converting intensities to compositions, instead of relative sensitivity factors. The ICFs were determined from a reference Si1-xGex alloy film by the conventional method, the average intensity (AI) method and the total number counting (TNC) method. In the case of SIMS, although the atomic fractions measured with oxygen ion beams were not quantitative due to a severe matrix effect, the results obtained with a cesium ion beam were highly quantitative. The quantitative SIMS results using MCs2+ ions are comparable to the results by XPS. In the case of XPS, the measurement uncertainty was greatly improved by the AI method and the TNC method.
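
    The ICF-based conversion referred to above amounts to a normalized linear scaling of the measured intensities (a schematic form; the paper's AI and TNC variants differ in how the factor is derived from the reference film):

      \[
      x_i \;=\; \frac{\mathrm{ICF}_i \, I_i}{\sum_j \mathrm{ICF}_j \, I_j},
      \]

    where I_i is the measured SIMS (MCs2+) or XPS intensity of element i and ICF_i is determined from a reference film of known composition; normalizing by the sum enforces that the atomic fractions add to one.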

  16. Methods for collection and analysis of aquatic biological and microbiological samples

    USGS Publications Warehouse

    Britton, L.J.; Greeson, P.E.

    1988-01-01

    Chapter A4, methods for collection and analyses of aquatic biological and microbiological samples, contains methods used by the U.S. Geological Survey to collect, preserve, and analyze waters to determine their biological and microbiological properties. Part 1 consists of detailed descriptions of more than 45 individual methods, including those for bacteria, phytoplankton, zooplankton, seston, periphyton, macrophytes, benthic invertebrates, fish and other vertebrates, cellular contents, productivity and bioassay. Each method is summarized, and the applications, interferences, apparatus, reagents, analyses, calculations, reporting of results, precisions, and references are given. Part 2 consists of a glossary. Part 3 is a list of taxonomic references. (USGS)

  17. The PneuCarriage Project: A Multi-Centre Comparative Study to Identify the Best Serotyping Methods for Examining Pneumococcal Carriage in Vaccine Evaluation Studies.

    PubMed

    Satzke, Catherine; Dunne, Eileen M; Porter, Barbara D; Klugman, Keith P; Mulholland, E Kim

    2015-11-01

    The pneumococcus is a diverse pathogen whose primary niche is the nasopharynx. Over 90 different serotypes exist, and nasopharyngeal carriage of multiple serotypes is common. Understanding pneumococcal carriage is essential for evaluating the impact of pneumococcal vaccines. Traditional serotyping methods are cumbersome and insufficient for detecting multiple serotype carriage, and there are few data comparing the new methods that have been developed over the past decade. We established the PneuCarriage project, a large, international multi-centre study dedicated to the identification of the best pneumococcal serotyping methods for carriage studies. Reference sample sets were distributed to 15 research groups for blinded testing. Twenty pneumococcal serotyping methods were used to test 81 laboratory-prepared (spiked) samples. The five top-performing methods were used to test 260 nasopharyngeal (field) samples collected from children in six high-burden countries. Sensitivity and positive predictive value (PPV) were determined for the test methods and the reference method (traditional serotyping of >100 colonies from each sample). For the alternate serotyping methods, the overall sensitivity ranged from 1% to 99% (reference method 98%), and PPV from 8% to 100% (reference method 100%), when testing the spiked samples. Fifteen methods had ≥70% sensitivity to detect the dominant (major) serotype, whilst only eight methods had ≥70% sensitivity to detect minor serotypes. For the field samples, the overall sensitivity ranged from 74.2% to 95.8% (reference method 93.8%), and PPV from 82.2% to 96.4% (reference method 99.6%). The microarray had the highest sensitivity (95.8%) and high PPV (93.7%). The major limitation of this study is that not all of the available alternative serotyping methods were included. Most methods were able to detect the dominant serotype in a sample, but many performed poorly in detecting the minor serotype populations. Microarray with a culture amplification step was the top-performing method. Results from this comprehensive evaluation will inform future vaccine evaluation and impact studies, particularly in low-income settings, where pneumococcal disease burden remains high.

  18. 'Aussie normals': an a priori study to develop clinical chemistry reference intervals in a healthy Australian population.

    PubMed

    Koerbin, G; Cavanaugh, J A; Potter, J M; Abhayaratna, W P; West, N P; Glasgow, N; Hawkins, C; Armbruster, D; Oakman, C; Hickman, P E

    2015-02-01

    Development of reference intervals is difficult, time consuming, expensive and beyond the scope of most laboratories. The Aussie Normals study is a direct a priori study to determine reference intervals in healthy Australian adults. All volunteers completed a health and lifestyle questionnaire, and exclusion was based on conditions such as pregnancy, diabetes, renal or cardiovascular disease. Up to 91 biochemical analyses were undertaken on a variety of analytical platforms using serum samples collected from 1856 volunteers. We report on our findings for 40 of these analytes and two calculated parameters performed on the Abbott ARCHITECT ci8200/ci16200 analysers. Not all samples were analysed for all assays due to volume requirements or assay/instrument availability. Results with elevated interference indices, and those deemed unsuitable after clinical evaluation, were removed from the database. Reference intervals were partitioned based on the method of Harris and Boyd into three scenarios: combined genders, males and females separately, and age and gender. We have performed a detailed reference interval study on a healthy Australian population considering the effects of sex, age and body mass. These reference intervals may be adapted to other manufacturers' analytical methods using method transference.

  19. Comparison of results of fluconazole disk diffusion testing for Candida species with results from a central reference laboratory in the ARTEMIS global antifungal surveillance program.

    PubMed

    Pfaller, M A; Hazen, K C; Messer, S A; Boyken, L; Tendolkar, S; Hollis, R J; Diekema, D J

    2004-08-01

    The accuracy of antifungal susceptibility tests is important for accurate resistance surveillance and for the clinical management of patients with serious infections. Our main objective was to compare the results of fluconazole disk diffusion testing of Candida spp. performed by ARTEMIS participating centers with disk diffusion and MIC results obtained by the central reference laboratory. A total of 2,949 isolates of Candida spp. were tested by NCCLS disk diffusion and reference broth microdilution methods in the central reference laboratory. These results were compared to the results of disk diffusion testing performed in the 54 participating centers. All tests were performed and interpreted following NCCLS recommendations. Overall categorical agreement between participant disk diffusion test results and reference laboratory MIC results was 87.4%, with 0.2% very major errors (VME) and 3.3% major errors (ME). The categorical agreement between the disk diffusion test results obtained in the reference laboratory with the MIC test results was similar: 92.8%. Likewise, good agreement was observed between participant disk diffusion test results and reference laboratory disk diffusion test results: 90.4%, 0.4% VME, and 3.4% ME. The disk diffusion test was especially reliable in detecting those isolates of Candida spp. that were characterized as resistant by reference MIC testing. External quality assurance data obtained by surveillance programs such as the ARTEMIS Global Antifungal Surveillance Program ensure the generation of useful surveillance data and result in the continued improvement of antifungal susceptibility testing practices.
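
    As an illustration of how categorical agreement and error rates of the kind reported above are tallied (a generic sketch using the conventional definitions; the category labels and data are hypothetical, and denominators for VME/ME rates vary between studies - here all rates are relative to the total number of isolates as a simplification):

      def agreement_summary(reference_categories, test_categories):
          """Categorical agreement, very major errors (VME) and major errors (ME).

          VME: resistant by the reference method but susceptible by the test method.
          ME:  susceptible by the reference method but resistant by the test method.
          Categories: 'S' (susceptible), 'SDD' (susceptible dose-dependent), 'R' (resistant).
          """
          n = len(reference_categories)
          agree = sum(r == t for r, t in zip(reference_categories, test_categories))
          vme = sum(r == "R" and t == "S" for r, t in zip(reference_categories, test_categories))
          me = sum(r == "S" and t == "R" for r, t in zip(reference_categories, test_categories))
          return {"agreement_%": 100 * agree / n, "VME_%": 100 * vme / n, "ME_%": 100 * me / n}

      # Hypothetical example:
      print(agreement_summary(["S", "S", "R", "SDD", "S"], ["S", "R", "R", "S", "S"]))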

  20. A Comparison of the Kernel Equating Method with Traditional Equating Methods Using SAT[R] Data

    ERIC Educational Resources Information Center

    Liu, Jinghua; Low, Albert C.

    2008-01-01

    This study applied kernel equating (KE) in two scenarios: equating to a very similar population and equating to a very different population, referred to as a distant population, using SAT[R] data. The KE results were compared to the results obtained from analogous traditional equating methods in both scenarios. The results indicate that KE results…

  1. National geodetic satellite program, part 2

    NASA Technical Reports Server (NTRS)

    Schmid, H.

    1977-01-01

    Satellite geodesy and the creation of worldwide geodetic reference systems is discussed. The geometric description of the surface and the analytical description of the gravity field of the earth by means of worldwide reference systems, with the aid of satellite geodesy, are presented. A triangulation method based on photogrammetric principles is described in detail. Results are derived in the form of three dimensional models. These mathematical models represent the frame of reference into which one can fit the existing geodetic results from the various local datums, as well as future measurements.

  2. Reference Values for Spirometry Derived Using Lambda, Mu, Sigma (LMS) Method in Korean Adults: in Comparison with Previous References.

    PubMed

    Jo, Bum Seak; Myong, Jun Pyo; Rhee, Chin Kook; Yoon, Hyoung Kyu; Koo, Jung Wan; Kim, Hyoung Ryoul

    2018-01-15

    The present study aimed to update the prediction equations for spirometry and their lower limits of normal (LLN) by using the lambda, mu, sigma (LMS) method and to compare the outcomes with the values of previous spirometric reference equations. Spirometric data of 10,249 healthy non-smokers (8,776 females) were extracted from the fourth and fifth versions of the Korea National Health and Nutrition Examination Survey (KNHANES IV, 2007-2009; V, 2010-2012). Reference equations were derived using the LMS method which allows modeling skewness (lambda [L]), mean (mu [M]), and coefficient of variation (sigma [S]). The outcome equations were compared with previous reference values. Prediction equations were presented in the following form: predicted value = exp{a + b × ln(height) + c × ln(age) + M-spline}. The new predicted values for spirometry and their LLN derived using the LMS method were shown to more accurately reflect transitions in pulmonary function in young adults than previous prediction equations derived using conventional regression analysis in 2013. There were partial discrepancies between the new reference values and the reference values from the Global Lung Function Initiative in 2012. The results should be interpreted with caution for young adults and elderly males, particularly in terms of the LLN for forced expiratory volume in one second/forced vital capacity in elderly males. Serial spirometry follow-up, together with correlations with other clinical findings, should be emphasized in evaluating the pulmonary function of individuals. Future studies are needed to improve the accuracy of reference data and to develop continuous reference values for spirometry across all ages. © 2018 The Korean Academy of Medical Sciences.
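
    For orientation, in the LMS framework a centile curve, and hence the LLN, is usually obtained from the fitted L (skewness), M (median) and S (coefficient of variation) values as follows (the standard LMS relation, not the paper's specific coefficients):

      \[
      C_{100\alpha}(x) \;=\; M(x)\,\bigl(1 + L(x)\,S(x)\,z_{\alpha}\bigr)^{1/L(x)},
      \]

    so the LLN corresponds to the 5th centile with z = -1.645; the exponential prediction equation quoted above is the corresponding expression for M as a function of ln(height) and ln(age) plus an age-varying spline term.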

  3. VIDAS Listeria species Xpress (LSX).

    PubMed

    Johnson, Ronald; Mills, John

    2013-01-01

    The AOAC GovVal study compared the VIDAS Listeria species Xpress (LSX) to the Health Products and Food Branch MFHPB-30 reference method for detection of Listeria on stainless steel. The LSX method utilizes a novel and proprietary enrichment media, Listeria Xpress broth, enabling detection of Listeria species in environmental samples with the automated VIDAS in a minimum of 26 h. The LSX method also includes the use of the chromogenic media, chromID Ottaviani Agosti Agar (OAA) and chromID Lmono for confirmation of LSX presumptive results. In previous AOAC validation studies comparing VIDAS LSX to the U.S. Food and Drug Administration's Bacteriological Analytical Manual (FDA-BAM) and the U.S. Department of Agriculture-Food Safety and Inspection Service (USDA-FSIS) reference methods, the LSX method was approved as AOAC Official Method 2010.02 for the detection of Listeria species in dairy products, vegetables, seafood, raw meats and poultry, and processed meats and poultry, and as AOAC Performance Tested Method 100501 in a variety of foods and on environmental surfaces. The GovVal comparative study included 20 replicate test portions each at two contamination levels for stainless steel where fractionally positive results (5-15 positive results/20 replicate portions tested) were obtained by at least one method at one level. Five uncontaminated controls were included. In the stainless steel artificially contaminated surface study, there were 25 confirmed positives by the VIDAS LSX assay and 22 confirmed positives by the standard culture methods. Chi-square analysis indicated no statistical differences between the VIDAS LSX method and the MFHPB-30 standard methods at the 5% level of significance. Confirmation of presumptive LSX results with the chromogenic OAA and Lmono media was shown to be equivalent to the appropriate reference method agars. The data in this study demonstrate that the VIDAS LSX method is an acceptable alternative method to the MFHPB-30 standard culture method for the detection of Listeria species on stainless steel.

  4. A mapping closure for turbulent scalar mixing using a time-evolving reference field

    NASA Technical Reports Server (NTRS)

    Girimaji, Sharath S.

    1992-01-01

    A general mapping-closure approach for modeling scalar mixing in homogeneous turbulence is developed. This approach is different from the previous methods in that the reference field also evolves according to the same equations as the physical scalar field. The use of a time-evolving Gaussian reference field results in a model that is similar to the mapping closure model of Pope (1991), which is based on the methodology of Chen et al. (1989). Both models yield identical relationships between the scalar variance and higher-order moments, which are in good agreement with heat conduction simulation data and can be consistent with any type of epsilon(phi) evolution. The present methodology can be extended to any reference field whose behavior is known. The possibility of a beta-pdf reference field is explored. The shortcomings of the mapping closure methods are discussed, and the limit at which the mapping becomes invalid is identified.

  5. Reference Intervals of Common Clinical Chemistry Analytes for Adults in Hong Kong.

    PubMed

    Lo, Y C; Armbruster, David A

    2012-04-01

    Defining reference intervals is a major challenge because of the difficulty in recruiting volunteers to participate and testing samples from a significant number of healthy reference individuals. Historical literature citation intervals are often suboptimal because they may be based on obsolete methods and/or only a small number of poorly defined reference samples. Blood donors in Hong Kong gave permission for additional blood to be collected for reference interval testing. The samples were tested for twenty-five routine analytes on the Abbott ARCHITECT clinical chemistry system. Results were analyzed using the Rhoads EP Evaluator software program, which is based on the CLSI/IFCC C28-A guideline and defines the reference interval as the 95% central range. Method-specific reference intervals were established for twenty-five common clinical chemistry analytes for a Chinese ethnic population. The intervals were defined for each gender separately and for the genders combined, and gender-specific or combined-gender intervals were adapted as appropriate for each analyte. A large number of healthy, apparently normal blood donors from a local ethnic population were tested to provide current reference intervals for a new clinical chemistry system. Intervals were determined following an accepted international guideline. Laboratories using the same or similar methodologies may adopt these intervals if validated and deemed suitable for their patient population. Laboratories using different methodologies may be able to successfully adapt the intervals for their facilities using the reference interval transference technique, based on a method comparison study.
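
    A minimal sketch of the nonparametric 95% central range computation that such reference-interval software performs (illustrative only, with hypothetical data; it omits outlier handling, partitioning checks and the confidence intervals around each limit):

      import numpy as np

      def central_95_reference_interval(values):
          """Nonparametric reference interval: the 2.5th and 97.5th percentiles."""
          values = np.asarray(values, dtype=float)
          return np.percentile(values, 2.5), np.percentile(values, 97.5)

      # Hypothetical example: simulated sodium results (mmol/L) from 240 reference subjects.
      rng = np.random.default_rng(0)
      sodium = rng.normal(140, 2, 240)
      print(central_95_reference_interval(sodium))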

  6. A Framework for Establishing Standard Reference Scale of Texture by Multivariate Statistical Analysis Based on Instrumental Measurement and Sensory Evaluation.

    PubMed

    Zhi, Ruicong; Zhao, Lei; Xie, Nan; Wang, Houyin; Shi, Bolin; Shi, Jingye

    2016-01-13

    A framework for establishing a standard reference scale of texture is proposed using multivariate statistical analysis of instrumental measurements and sensory evaluation. Multivariate statistical analysis is conducted to rapidly select typical reference samples with the characteristics of universality, representativeness, stability, substitutability, and traceability. The reasonableness of the framework is verified by establishing a standard reference scale for the texture attribute hardness with well-known Chinese foods. More than 100 food products in 16 categories were tested using instrumental measurement (TPA test), and the results were analyzed with clustering analysis, principal component analysis, relative standard deviation, and analysis of variance. As a result, nine kinds of foods were selected to construct the hardness standard reference scale. The results indicate that the regression coefficient between the estimated sensory value and the instrumentally measured value is significant (R(2) = 0.9765), which fits well with Stevens's theory. The research provides a reliable theoretical basis and a practical guide for establishing quantitative standard reference scales for food texture characteristics.

  7. Ballistocardiogram Artifact Removal with a Reference Layer and Standard EEG Cap

    PubMed Central

    Luo, Qingfei; Huang, Xiaoshan; Glover, Gary H.

    2014-01-01

    Background In simultaneous EEG-fMRI, the EEG recordings are severely contaminated by ballistocardiogram (BCG) artifacts, which are caused by cardiac pulsations. To reconstruct and remove the BCG artifacts, one promising method is to measure the artifacts in the absence of EEG signal by placing a group of electrodes (BCG electrodes) on a conductive layer (reference layer) insulated from the scalp. However, current BCG reference layer (BRL) methods either use a customized EEG cap composed of electrode pairs, or need to construct the custom reference layer through additional model-building experiments for each EEG-fMRI experiment. These requirements have limited the versatility and efficiency of BRL. The aim of this study is to propose a more practical and efficient BRL method and compare its performance with the most popular BCG removal method, the optimal basis sets (OBS) algorithm. New Method By designing the reference layer as a permanent and reusable cap, the new BRL method is able to be used with a standard EEG cap, and no extra experiments and preparations are needed to use the BRL in an EEG-fMRI experiment. Results The BRL method effectively removed the BCG artifacts from both oscillatory and evoked potential scalp recordings and recovered the EEG signal. Comparison with Existing Method Compared to the OBS, this new BRL method improved the contrast-to-noise ratios of the alpha-wave, visual, and auditory evoked potential signals by 101%, 76%, and 75% respectively, employing 160 BCG electrodes. Using only 20 BCG electrodes, the BRL improved the EEG signal by 74%/26%/41% respectively. Conclusion The proposed method can substantially improve the EEG signal quality compared with traditional methods. PMID:24960423
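
    One simple way to exploit reference-layer channels for artifact removal is a least-squares projection of each scalp channel onto the BCG channels, sketched below. This is an illustrative regression approach under the assumption that the insulated reference layer records BCG but no neural signal; it is not necessarily the exact estimator used by the authors.

      import numpy as np

      def remove_bcg_by_regression(scalp, bcg_reference):
          """Subtract the BCG artifact estimated from reference-layer channels.

          scalp:          (n_samples, n_scalp_channels) EEG recorded on the scalp
          bcg_reference:  (n_samples, n_bcg_channels) channels on the insulated reference layer
          """
          # Least-squares weights mapping reference-layer channels to each scalp channel.
          weights, *_ = np.linalg.lstsq(bcg_reference, scalp, rcond=None)
          artifact_estimate = bcg_reference @ weights
          return scalp - artifact_estimate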

  8. Evaluation of Alternative Altitude Scaling Methods for Thermal Ice Protection System in NASA Icing Research Tunnel

    NASA Technical Reports Server (NTRS)

    Lee, Sam; Addy, Harold E. Jr.; Broeren, Andy P.; Orchard, David M.

    2017-01-01

    A test was conducted at the NASA Icing Research Tunnel (IRT) to evaluate altitude scaling methods for a thermal ice protection system. Two new scaling methods based on the Weber number were compared against a method based on the Reynolds number. The results generally agreed with the previous set of tests conducted in the NRCC Altitude Icing Wind Tunnel (AIWT), where the three scaling methods were also tested and compared along with reference (altitude) icing conditions. In those tests, the Weber number-based scaling methods yielded results much closer to those observed at the reference icing conditions than the Reynolds number-based method did. The test in the NASA IRT used a much larger, asymmetric airfoil with an ice protection system that more closely resembled designs used in commercial aircraft. Following the trends observed during the AIWT tests, the Weber number-based scaling methods resulted in smaller runback ice than the Reynolds number-based scaling, and the ice formed farther upstream. The results show that the new Weber number-based scaling methods, particularly the Weber number with water loading scaling, continue to show promise for ice protection system development and evaluation in atmospheric icing tunnels.

  9. Modifications to the NIST reference measurement procedure (RMP) for the determination of serum glucose by isotope dilution gas chromatography/mass spectrometry.

    PubMed

    Prendergast, Jocelyn L; Sniegoski, Lorna T; Welch, Michael J; Phinney, Karen W

    2010-07-01

    The definitive method (DM), now known as the reference measurement procedure (RMP), for the analysis of glucose in serum was originally published in 1982 by the National Institute of Standards and Technology (NIST). Over the years the method has been subject to a number of modifications to adapt to newer technologies and simplify sample preparation. We discuss here an adaptation of the method associated with serum glucose measurements using a modified isotope dilution gas chromatography/mass spectrometry (ID-GC/MS) method. NIST has used this modified method to certify the concentrations of glucose in SRM 965b, Glucose in Frozen Human Serum, and SRM 1950, Metabolites in Human Plasma. Comparison of results from the revised method with certified values for existing Standard Reference Materials (SRMs) demonstrated that these modifications have not affected the quality of the measurements, giving both good precision and accuracy, while reducing the sample preparation time by a day and a half.

  10. Evaluation of Terrestrial Laser Scanner Accuracy in the Control of Hydrotechnical Structures

    NASA Astrophysics Data System (ADS)

    Muszyński, Zbigniew; Rybak, Jarosław

    2017-12-01

    In many cases of monitoring or load testing of hydrotechnical structures, the measurement results obtained from dial gauges may be affected by random or systematic errors resulting from the instability of the reference beam. For example, the measurement of wall displacement or pile settlement may be increased (or decreased) by displacements of the reference beam due to ground movement. The application of surveying methods such as high-precision levelling, motorized tacheometry or even terrestrial laser scanning makes it possible to provide an independent reference measurement free from systematic errors. This is very important in the case of walls and piles embedded in rivers, where the construction of a reference structure is even more difficult than usual. Construction of an independent reference system is also complicated when horizontal testing of sheet piles or diaphragm walls is considered. In this case, any underestimation of the horizontal displacement of an anchored or strutted construction leads to an understated value of the strut's load. These measurements are even more important during modernization works and repairs of hydrotechnical structures. The purpose of this paper is to discuss the possibilities of using modern measurement methods for monitoring the horizontal displacements of an excavation wall. The methods under scrutiny (motorized tacheometry and terrestrial laser scanning) have been compared to classical techniques and described in the context of their practical use on an example hydrotechnical structure, a temporary cofferdam made from a sheet pile wall. The research continuously conducted at Wroclaw University of Science and Technology made it possible to collect and summarize measurement results and practical experience. This paper identifies advantages and disadvantages of both analysed methods and presents a comparison of the horizontal displacement results obtained. In conclusion, some recommendations have been formulated, which are relevant from the point of view of engineering practice.

  11. Applications of asymptotic confidence intervals with continuity corrections for asymmetric comparisons in noninferiority trials.

    PubMed

    Soulakova, Julia N; Bright, Brianna C

    2013-01-01

    A large-sample problem of illustrating noninferiority of an experimental treatment over a referent treatment for binary outcomes is considered. The methods of illustrating noninferiority involve constructing the lower two-sided confidence bound for the difference between binomial proportions corresponding to the experimental and referent treatments and comparing it with the negative value of the noninferiority margin. The three considered methods, Anbar, Falk-Koch, and Reduced Falk-Koch, handle the comparison in an asymmetric way, that is, only the referent proportion out of the two, experimental and referent, is directly involved in the expression for the variance of the difference between two sample proportions. Five continuity corrections (including zero) are considered with respect to each approach. The key properties of the corresponding methods are evaluated via simulations. First, the uncorrected two-sided confidence intervals can, potentially, have smaller coverage probability than the nominal level even for moderately large sample sizes, for example, 150 per group. Next, the 15 testing methods are discussed in terms of their Type I error rate and power. In the settings with a relatively small referent proportion (about 0.4 or smaller), the Anbar approach with Yates' continuity correction is recommended for balanced designs and the Falk-Koch method with Yates' correction is recommended for unbalanced designs. For relatively moderate (about 0.6) and large (about 0.8 or greater) referent proportion, the uncorrected Reduced Falk-Koch method is recommended, although in this case, all methods tend to be over-conservative. These results are expected to be used in the design stage of a noninferiority study when asymmetric comparisons are envisioned. Copyright © 2013 John Wiley & Sons, Ltd.
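
    As a minimal illustration of the comparison described above, the sketch below computes a generic Wald-type lower bound for the difference between two binomial proportions, with an optional Yates-style continuity correction, and checks it against the negative of the noninferiority margin. It does not reproduce the exact Anbar, Falk-Koch, or Reduced Falk-Koch variance expressions from the paper; the counts and margin in the example are illustrative.

        # Sketch of a noninferiority check for binary outcomes (generic Wald-type
        # bound, not the exact Anbar or Falk-Koch variance expressions).
        import math

        def noninferior(x_exp, n_exp, x_ref, n_ref, margin, yates=True):
            """True if the lower bound of the two-sided 95% CI for
            p_exp - p_ref exceeds -margin."""
            p_e, p_r = x_exp / n_exp, x_ref / n_ref
            diff = p_e - p_r
            # Wald variance of the difference between two independent proportions.
            var = p_e * (1 - p_e) / n_exp + p_r * (1 - p_r) / n_ref
            # Yates-style continuity correction (one of the corrections varied in the paper).
            cc = 0.5 * (1 / n_exp + 1 / n_ref) if yates else 0.0
            z = 1.959963984540054  # ~97.5th percentile of the standard normal
            lower = diff - cc - z * math.sqrt(var)
            return lower > -margin

        # Example: 150 subjects per arm, 10% noninferiority margin.
        print(noninferior(x_exp=120, n_exp=150, x_ref=123, n_ref=150, margin=0.10))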

  12. Modification of the BAX Salmonella test kit to include a hot start functionality (modification of AOAC Official Method 2003.09).

    PubMed

    Wallace, F Morgan; DiCosimo, Deana; Farnum, Andrew; Tice, George; Andaloro, Bridget; Davis, Eugene; Burns, Frank R

    2011-01-01

    In 2010, the BAX System PCR assay for Salmonella was modified to include a hot start functionality designed to keep the reaction enzyme inactive until PCR begins. To validate the assay's Official Methods of Analysis status to include this procedure modification, an evaluation was conducted on four food types that were simultaneously analyzed with the BAX System and either the U.S. Food and Drug Administration's Bacteriological Analytical Manual or the U.S. Department of Agriculture-Food Safety and Inspection Service Microbiology Laboratory Guidebook reference method for detecting Salmonella. Identical performance between the BAX System method and the reference methods was observed. Additionally, lysates were analyzed using both the BAX System Classic and BAX System Q7 instruments with identical results using both platforms for all samples tested. Of the 100 samples analyzed, 34 samples were positive for both the BAX System and reference methods, and 66 samples were negative by both the BAX System and reference methods, demonstrating 100% correlation. No instrument platform variation was observed. Additional inclusivity and exclusivity testing using the modified test kit demonstrated the test kit to be 100% accurate in evaluation of test panels of 352 Salmonella strains and 46 non-Salmonella strains.

  13. An aggregate method to calibrate the reference point of cumulative prospect theory-based route choice model for urban transit network

    NASA Astrophysics Data System (ADS)

    Zhang, Yufeng; Long, Man; Luo, Sida; Bao, Yu; Shen, Hanxia

    2015-12-01

    The transit route choice model is a key technology for public transit systems planning and management. Traditional route choice models are mostly based on expected utility theory, which has an evident shortcoming: it cannot accurately portray travelers' subjective route choice behavior because their risk preferences are not taken into consideration. Cumulative prospect theory (CPT) can instead be used to describe travelers' decision-making under uncertain transit supply and the risk preferences of multiple traveler types. The method used to calibrate the reference point, a key parameter of a CPT-based transit route choice model, determines the precision of the model to a great extent. In this paper, a new method that combines theoretical calculation with field investigation results is put forward to obtain the value of the reference point. Compared with the traditional method, the new method improves the quality of the CPT-based model by increasing the accuracy with which travelers' route choice behavior is simulated, based on a transit trip investigation from Nanjing City, China. The proposed method is of great significance to rational transit planning and management and, to some extent, remedies the shortcoming of deriving the reference point solely from qualitative analysis.
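
    To illustrate why the reference point matters in such a model, the sketch below evaluates a standard cumulative-prospect-theory value function for a travel-time outcome under two candidate reference points. The functional form and parameter values (alpha, beta, lambda) are the widely cited Tversky-Kahneman ones, not the values calibrated in the paper.

        # Minimal sketch of a CPT value function for travel time, showing how the
        # choice of reference point r changes the valuation of the same outcome.
        # Parameters are the commonly cited Tversky-Kahneman values, not the
        # paper's calibrated ones.

        def cpt_value(travel_time, reference, alpha=0.88, beta=0.88, lam=2.25):
            # For travel time, outcomes below the reference point are gains (time saved).
            x = reference - travel_time
            if x >= 0:
                return x ** alpha                # gain branch
            return -lam * ((-x) ** beta)         # loss branch: losses loom larger

        for r in (30.0, 35.0):                   # two candidate reference points (minutes)
            print(r, cpt_value(32.0, r))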

  14. Commutability of food microbiology proficiency testing samples.

    PubMed

    Abdelmassih, M; Polet, M; Goffaux, M-J; Planchon, V; Dierick, K; Mahillon, J

    2014-03-01

    Food microbiology proficiency testing (PT) is a useful tool to assess the analytical performances among laboratories. PT items should be close to routine samples to accurately evaluate the acceptability of the methods. However, most PT providers distribute exclusively artificial samples such as reference materials or irradiated foods. This raises the issue of the suitability of these samples because the equivalence-or 'commutability'-between results obtained on artificial vs. authentic food samples has not been demonstrated. In the clinical field, the use of noncommutable PT samples has led to erroneous evaluation of the performances when different analytical methods were used. This study aimed to provide a first assessment of the commutability of samples distributed in food microbiology PT. REQUASUD and IPH organized 13 food microbiology PTs including 10-28 participants. Three types of PT items were used: genuine food samples, sterile food samples and reference materials. The commutability of the artificial samples (reference material or sterile samples) was assessed by plotting the distribution of the results on natural and artificial PT samples. This comparison highlighted matrix-correlated issues when nonfood matrices, such as reference materials, were used. Artificially inoculated food samples, on the other hand, raised only isolated commutability issues. In the organization of a PT-scheme, authentic or artificially inoculated food samples are necessary to accurately evaluate the analytical performances. Reference materials, used as PT items because of their convenience, may present commutability issues leading to inaccurate penalizing conclusions for methods that would have provided accurate results on food samples. For the first time, the commutability of food microbiology PT samples was investigated. The nature of the samples provided by the organizer turned out to be an important factor because matrix effects can impact on the analytical results. © 2013 The Society for Applied Microbiology.

  15. Evaluation of stability and validation of reference genes for RT-qPCR expression studies in rice plants under water deficit.

    PubMed

    Auler, Priscila Ariane; Benitez, Letícia Carvalho; do Amaral, Marcelo Nogueira; Vighi, Isabel Lopes; Dos Santos Rodrigues, Gabriela; da Maia, Luciano Carlos; Braga, Eugenia Jacira Bolacel

    2017-05-01

    Many studies use strategies that allow for the identification of a large number of genes expressed in response to different stress conditions to which the plant is subjected throughout its cycle. In order to obtain accurate and reliable results in gene expression studies, it is necessary to use reference genes, which must have uniform expression in the majority of cells in the organism studied. RNA isolation of leaves and expression analysis in real-time quantitative polymerase chain reaction (RT-qPCR) were carried out. In this study, nine candidate reference genes were tested, actin 11 (ACT11), ubiquitin conjugated to E2 enzyme (UBC-E2), glyceraldehyde-3-phosphate dehydrogenase (GAPDH), beta tubulin (β-tubulin), eukaryotic initiation factor 4α (eIF-4α), ubiquitin 10 (UBQ10), ubiquitin 5 (UBQ5), aquaporin TIP41 (TIP41-Like) and cyclophilin, in two genotypes of rice, AN Cambará and BRS Querência, with different levels of soil moisture (20%, 10% and recovery) in the vegetative (V5) and reproductive stages (period preceding flowering). Currently, there are different softwares that perform stability analyses and define the most suitable reference genes for a particular study. In this study, we used five different methods: geNorm, BestKeeper, ΔCt method, NormFinder and RefFinder. The results indicate that UBC-E2 and UBQ5 can be used as reference genes in all samples and softwares evaluated. The genes β-tubulin and eIF-4α, traditionally used as reference genes, along with GAPDH, presented lower stability values. The gene expression of basic leucine zipper (bZIP23 and bZIP72) was used to validate the selected reference genes, demonstrating that the use of an inappropriate reference can induce erroneous results.

  16. Unsupervised change detection in a particular vegetation land cover type using spectral angle mapper

    NASA Astrophysics Data System (ADS)

    Renza, Diego; Martinez, Estibaliz; Molina, Iñigo; Ballesteros L., Dora M.

    2017-04-01

    This paper presents a new unsupervised change detection methodology for multispectral images applied to specific land covers. The proposed method compares each image against a reference spectrum, obtained from the spectral signature of the land-cover type to be detected. The method has been tested using multispectral images (SPOT5) of the Community of Madrid (Spain) and multispectral images (QuickBird) of an area over Indonesia that was impacted by the December 26, 2004 tsunami; here, the tests focused on the detection of changes in vegetation. The image comparison is obtained by applying the Spectral Angle Mapper between the reference spectrum and each multitemporal image. A threshold is then applied to produce a single change image corresponding to the vegetation zones. The results for the multitemporal images are combined through an exclusive-or (XOR) operation that selects vegetation zones that have changed over time. Finally, the derived results were compared against a supervised method based on classification with a Support Vector Machine (SVM), and the NDVI-differencing and basic Spectral Angle Mapper techniques were selected as unsupervised methods for comparison. The main novelty of the method is the detection of changes in a specific land cover type (vegetation); therefore, the most appropriate comparison is with methods that also aim to detect changes in that land cover type, which is the reason for selecting the NDVI-based method and the post-classification method (SVM implemented in a standard software tool). To evaluate the improvement from using a reference spectrum vector, the results are also compared with the basic SAM method. For the SPOT5 image, the overall accuracy was 99.36% and the κ index was 90.11%; for the QuickBird image, the overall accuracy was 97.5% and the κ index was 82.16%. The precision of the method is comparable to that of the supervised method, supported by low rates of false positives and false negatives together with high overall accuracy and a high kappa index, while the execution times were comparable to those of unsupervised methods with low computational load.
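
    The core comparison step can be sketched as follows: the spectral angle between each pixel and a reference spectrum, a threshold to obtain a vegetation mask per date, and an XOR of the two masks. Array shapes, band order, and the threshold value are illustrative assumptions, not the paper's settings.

        # Sketch of the core steps described above: spectral angle mapping against a
        # reference vegetation spectrum, thresholding, and XOR of the two date masks.
        import numpy as np

        def spectral_angle(image, ref):
            """image: (rows, cols, bands); ref: (bands,). Returns angle in radians."""
            dot = np.tensordot(image, ref, axes=([2], [0]))
            norm = np.linalg.norm(image, axis=2) * np.linalg.norm(ref)
            return np.arccos(np.clip(dot / (norm + 1e-12), -1.0, 1.0))

        def vegetation_change(img_t1, img_t2, ref_spectrum, angle_threshold=0.15):
            veg_t1 = spectral_angle(img_t1, ref_spectrum) < angle_threshold
            veg_t2 = spectral_angle(img_t2, ref_spectrum) < angle_threshold
            return np.logical_xor(veg_t1, veg_t2)   # pixels whose vegetation state changed

        # Random data standing in for two co-registered multispectral scenes.
        rng = np.random.default_rng(0)
        t1, t2 = rng.random((100, 100, 4)), rng.random((100, 100, 4))
        ref = np.array([0.05, 0.08, 0.06, 0.45])    # illustrative vegetation signature
        print(vegetation_change(t1, t2, ref).sum(), "changed pixels")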

  17. Preparation and value assignment of standard reference material 968e fat-soluble vitamins, carotenoids, and cholesterol in human serum.

    PubMed

    Thomas, Jeanice B; Duewer, David L; Mugenya, Isaac O; Phinney, Karen W; Sander, Lane C; Sharpless, Katherine E; Sniegoski, Lorna T; Tai, Susan S; Welch, Michael J; Yen, James H

    2012-01-01

    Standard Reference Material 968e Fat-Soluble Vitamins, Carotenoids, and Cholesterol in Human Serum provides certified values for total retinol, γ- and α-tocopherol, total lutein, total zeaxanthin, total β-cryptoxanthin, total β-carotene, 25-hydroxyvitamin D(3), and cholesterol. Reference and information values are also reported for nine additional compounds including total α-cryptoxanthin, trans- and total lycopene, total α-carotene, trans-β-carotene, and coenzyme Q(10). The certified values for the fat-soluble vitamins and carotenoids in SRM 968e were based on the agreement of results from the means of two liquid chromatographic methods used at the National Institute of Standards and Technology (NIST) and from the median of results of an interlaboratory comparison exercise among institutions that participate in the NIST Micronutrients Measurement Quality Assurance Program. The assigned values for cholesterol and 25-hydroxyvitamin D(3) in the SRM are the means of results obtained using the NIST reference method based upon gas chromatography-isotope dilution mass spectrometry and liquid chromatography-isotope dilution tandem mass spectrometry, respectively. SRM 968e is currently one of two available health-related NIST reference materials with concentration values assigned for selected fat-soluble vitamins, carotenoids, and cholesterol in human serum matrix. This SRM is used extensively by laboratories worldwide primarily to validate methods for determining these analytes in human serum and plasma and for assigning values to in-house control materials. The value assignment of the analytes in this SRM will help support measurement accuracy and traceability for laboratories performing health-related measurements in the clinical and nutritional communities.

  18. Development of a reference material for routine performance monitoring of methods measuring polychlorinated dibenzo-p-dioxins, polychlorinated dibenzofurans and dioxin-like polychlorinated biphenyls.

    PubMed

    Selliah, S S; Cussion, S; MacPherson, K A; Reiner, E J; Toner, D

    2001-06-01

    Matrix-matched environmental certified reference materials (CRMs) are one of the most useful tools to validate analytical methods, assess analytical laboratory performance and assist in the resolution of data conflicts between laboratories. This paper describes the development of a lake sediment as a CRM for polychlorinated dibenzo-p-dioxins (PCDDs), polychlorinated dibenzofurans (PCDFs) and dioxin-like polychlorinated biphenyls (DLPCBs). The presence of DLPCBs in the environment is of increasing concern, and analytical methods are being developed internationally for monitoring DLPCBs in the environment. This paper also reports the results of an international interlaboratory study involving thirty-five laboratories from seventeen countries, conducted to characterize and validate the levels of PCDDs, PCDFs and DLPCBs in the sediment reference material.

  19. Necklace: combining reference and assembled transcriptomes for more comprehensive RNA-Seq analysis.

    PubMed

    Davidson, Nadia M; Oshlack, Alicia

    2018-05-01

    RNA sequencing (RNA-seq) analyses can benefit from performing a genome-guided and de novo assembly, in particular for species where the reference genome or the annotation is incomplete. However, tools for integrating an assembled transcriptome with reference annotation are lacking. Necklace is a software pipeline that runs genome-guided and de novo assembly and combines the resulting transcriptomes with reference genome annotations. Necklace constructs a compact but comprehensive superTranscriptome out of the assembled and reference data. Reads are subsequently aligned and counted in preparation for differential expression testing. Necklace allows a comprehensive transcriptome to be built from a combination of assembled and annotated transcripts, which results in a more comprehensive transcriptome for the majority of organisms. In addition RNA-seq data are mapped back to this newly created superTranscript reference to enable differential expression testing with standard methods.

  20. Verification of chemistry reference ranges using a simple method in sub-Saharan Africa.

    PubMed

    De Baetselier, Irith; Taylor, Douglas; Mandala, Justin; Nanda, Kavita; Van Campenhout, Christel; Agingu, Walter; Madurai, Lorna; Barsch, Eva-Maria; Deese, Jennifer; Van Damme, Lut; Crucitti, Tania

    2016-01-01

    Chemistry safety assessments are interpreted by using chemistry reference ranges (CRRs). Verification of CRRs is time consuming and often requires a statistical background. We report on an easy and cost-saving method to verify CRRs. Using a former method introduced by Sigma Diagnostics, three study sites in sub-Saharan Africa, Bondo, Kenya, and Pretoria and Bloemfontein, South Africa, verified the CRRs for hepatic and renal biochemistry assays performed during a clinical trial of HIV antiretroviral pre-exposure prophylaxis. The aspartate aminotransferase/alanine aminotransferase, creatinine and phosphorus results from 10 clinically-healthy participants at the screening visit were used. In the event the CRRs did not pass the verification, new CRRs had to be calculated based on 40 clinically-healthy participants. Within a few weeks, the study sites accomplished verification of the CRRs without additional costs. The aspartate aminotransferase reference ranges for the Bondo, Kenya site and the alanine aminotransferase reference ranges for the Pretoria, South Africa site required adjustment. The phosphorus CRR passed verification and the creatinine CRR required adjustment at every site. The newly-established CRR intervals were narrower than the CRRs used previously at these study sites due to decreases in the upper limits of the reference ranges. As a result, more toxicities were detected. To ensure the safety of clinical trial participants, verification of CRRs should be standard practice in clinical trials conducted in settings where the CRR has not been validated for the local population. This verification method is simple, inexpensive, and can be performed by any medical laboratory.

  1. A novel no-reference objective stereoscopic video quality assessment method based on visual saliency analysis

    NASA Astrophysics Data System (ADS)

    Yang, Xinyan; Zhao, Wei; Ye, Long; Zhang, Qin

    2017-07-01

    This paper proposes a no-reference objective stereoscopic video quality assessment method, with the motivation of making the results of objective assessment closer to those of subjective evaluation. We believe that image regions with different degrees of visual saliency should not have the same weights when designing an assessment metric. Therefore, we first apply the GBVS algorithm to each frame pair and separate both the left and right viewing images into regions of strong, general, and weak saliency. In addition, local features such as blockiness, zero-crossing, and depth are extracted and combined in a mathematical model to calculate a quality assessment score, in which regions with different degrees of saliency are assigned different weights. Experimental results demonstrate the superiority of our method compared with existing state-of-the-art no-reference objective stereoscopic video quality assessment methods.

  2. Evaluation of Method-Specific Extraction Variability for the Measurement of Fatty Acids in a Candidate Infant/Adult Nutritional Formula Reference Material.

    PubMed

    Place, Benjamin J

    2017-05-01

    To address community needs, the National Institute of Standards and Technology has developed a candidate Standard Reference Material (SRM) for infant/adult nutritional formula based on milk and whey protein concentrates with isolated soy protein called SRM 1869 Infant/Adult Nutritional Formula. One major component of this candidate SRM is the fatty acid content. In this study, multiple extraction techniques were evaluated to quantify the fatty acids in this new material. Extraction methods that were based on lipid extraction followed by transesterification resulted in lower mass fraction values for all fatty acids than the values measured by methods utilizing in situ transesterification followed by fatty acid methyl ester extraction (ISTE). An ISTE method, based on the identified optimal parameters, was used to determine the fatty acid content of the new infant/adult nutritional formula reference material.

  3. Comparison of a New Cobinamide-Based Method to a Standard Laboratory Method for Measuring Cyanide in Human Blood

    PubMed Central

    Swezey, Robert; Shinn, Walter; Green, Carol; Drover, David R.; Hammer, Gregory B.; Schulman, Scott R.; Zajicek, Anne; Jett, David A.; Boss, Gerry R.

    2013-01-01

    Most hospital laboratories do not measure blood cyanide concentrations, and samples must be sent to reference laboratories. A simple method is needed for measuring cyanide in hospitals. The authors previously developed a method to quantify cyanide based on the high binding affinity of the vitamin B12 analog, cobinamide, for cyanide and a major spectral change observed for cyanide-bound cobinamide. This method is now validated in human blood, and the findings include a mean inter-assay accuracy of 99.1%, precision of 8.75% and a lower limit of quantification of 3.27 µM cyanide. The method was applied to blood samples from children treated with sodium nitroprusside and it yielded measurable results in 88 of 172 samples (51%), whereas the reference laboratory yielded results in only 19 samples (11%). In all 19 samples, the cobinamide-based method also yielded measurable results. The two methods showed reasonable agreement when analyzed by linear regression, but not when analyzed by a standard error of the estimate or paired t-test. Differences in results between the two methods may be because samples were assayed at different times on different sample types. The cobinamide-based method is applicable to human blood, and can be used in hospital laboratories and emergency rooms. PMID:23653045

  4. Calibration test of the temperature and strain sensitivity coefficient in regional reference grating method

    NASA Astrophysics Data System (ADS)

    Wu, Jing; Huang, Junbing; Wu, Hanping; Gu, Hongcan; Tang, Bo

    2014-12-01

    In order to verify the validity of the regional reference grating method for solving the strain/temperature cross-sensitivity problem in an actual ship structural health monitoring system, and to meet engineering requirements for the sensitivity coefficients of the method, national standard measurement equipment was used to calibrate the temperature sensitivity coefficient of the selected FBG temperature sensor and the strain sensitivity coefficient of the FBG strain sensor. The thermal expansion sensitivity coefficient of the ship steel was calibrated with a water bath method. The calibration results show that the temperature sensitivity coefficient of the FBG temperature sensor is 28.16 pm/°C within -10~30 °C with linearity greater than 0.999, the strain sensitivity coefficient of the FBG strain sensor is 1.32 pm/μɛ within -2900~2900 μɛ with linearity close to 1, and the thermal expansion sensitivity coefficient of the ship steel is 23.438 pm/°C within 30~90 °C with linearity greater than 0.998. Finally, the calibration parameters were used in the actual ship structural health monitoring system for temperature compensation. The results show that the temperature compensation is effective and that the calibration parameters meet the engineering requirements, providing an important reference for the wide use of fiber Bragg grating sensors in engineering.
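
    As a hedged illustration of how the calibrated coefficients above might be applied, the sketch below assumes the simplest compensation scheme: the strain-isolated reference grating gives the temperature change, and the thermal contribution to the bonded strain grating (assumed to be the same intrinsic temperature response plus the apparent shift from steel expansion) is subtracted before dividing by the strain coefficient. The actual compensation formula used in the monitoring system is not given in the abstract.

        # Hedged sketch of temperature compensation using the calibrated coefficients.
        # Assumptions (not stated in the abstract): the bonded strain FBG shares the
        # intrinsic temperature response of the calibrated temperature FBG, and the
        # apparent shift from steel thermal expansion simply adds to it.
        K_T = 28.16           # pm/degC, temperature sensitivity of the reference FBG
        K_STRAIN = 1.32       # pm/microstrain, strain sensitivity of the strain FBG
        K_EXPANSION = 23.438  # pm/degC, apparent shift from thermal expansion of ship steel

        def compensated_strain(d_lambda_strain_pm, d_lambda_temp_pm):
            """Wavelength shifts (pm) of the strain FBG and the strain-free reference FBG."""
            delta_t = d_lambda_temp_pm / K_T                # temperature change, degC
            thermal_shift = (K_T + K_EXPANSION) * delta_t   # assumed thermal contribution
            return (d_lambda_strain_pm - thermal_shift) / K_STRAIN  # mechanical microstrain

        print(compensated_strain(d_lambda_strain_pm=900.0, d_lambda_temp_pm=140.8))  # ~486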

  5. Numerical approach to reference identification of Staphylococcus, Stomatococcus, and Micrococcus spp.

    PubMed

    Rhoden, D L; Hancock, G A; Miller, J M

    1993-03-01

    A numerical-code system for the reference identification of Staphylococcus species, Stomatococcus mucilaginosus, and Micrococcus species was established by using a selected panel of conventional biochemicals. Results from 824 cultures (289 eye isolate cultures, 147 reference strains, and 388 known control strains) were used to generate a list of 354 identification code numbers. Each six-digit code number was based on results from 18 conventional biochemical reactions. Seven milliliters of purple agar base with 1% sterile carbohydrate solution added was poured into 60-mm-diameter agar plates. All biochemical tests were inoculated with 1 drop of a heavy broth suspension, incubated at 35 degrees C, and read daily for 3 days. All reactions were read and interpreted by the method of Kloos et al. (G. A. Hebert, C. G. Crowder, G. A. Hancock, W. R. Jarvis, and C. Thornsberry, J. Clin. Microbiol. 26:1939-1949, 1988; W. E. Kloos and D. W. Lambe, Jr., P. 222-237, in A. Balows, W. J. Hansler, Jr., K. L. Herrmann, H. D. Isenberg, and H. J. Shadomy, ed., Manual of Clinical Microbiology, 5th ed., 1991). This modified reference identification method was 96 to 98% accurate and could have value in reference and public health laboratory settings.
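
    The abstract states that each six-digit code is derived from 18 biochemical reactions. A common convention in numerical identification schemes is to group the reactions in triplets and weight positives 1, 2 and 4; the exact grouping and weights used by the authors are not given, so the sketch below is only an illustrative assumption.

        # Illustrative sketch of turning 18 positive/negative biochemical reactions
        # into a six-digit profile number. The triplet grouping with weights 1, 2, 4
        # is a common convention for numerical identification codes; the exact scheme
        # used in the paper is not specified in the abstract.

        def profile_code(reactions):
            """reactions: sequence of 18 booleans in a fixed test order."""
            assert len(reactions) == 18
            digits = []
            for i in range(0, 18, 3):
                triplet = reactions[i:i + 3]
                digits.append(sum(w for w, pos in zip((1, 2, 4), triplet) if pos))
            return "".join(str(d) for d in digits)

        results = [True, False, True, True, True, False,
                   False, False, False, True, False, True,
                   True, True, True, False, True, False]
        print(profile_code(results))   # "530572"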

  6. Feature weighting using particle swarm optimization for learning vector quantization classifier

    NASA Astrophysics Data System (ADS)

    Dongoran, A.; Rahmadani, S.; Zarlis, M.; Zakarias

    2018-03-01

    This paper discusses and proposes a feature weighting method for classification tasks with the competitive-learning artificial neural network LVQ. The feature weighting method searches for attribute weights using PSO so that they influence the resulting output. The method is then applied to the LVQ classifier and tested on three datasets obtained from the UCI Machine Learning Repository. Accuracy is analyzed for two approaches: the first uses LVQ1 and is referred to as LVQ-Classifier, and the second, referred to as PSOFW-LVQ, is the proposed model. The results show that the PSO algorithm is capable of finding attribute weights that increase the accuracy of the LVQ classifier.

  7. Methods for collection and analysis of aquatic biological and microbiological samples

    USGS Publications Warehouse

    Greeson, Phillip E.; Ehlke, T.A.; Irwin, G.A.; Lium, B.W.; Slack, K.V.

    1977-01-01

    Chapter A4 contains methods used by the U.S. Geological Survey to collect, preserve, and analyze waters to determine their biological and microbiological properties. Part 1 discusses biological sampling and sampling statistics. The statistical procedures are accompanied by examples. Part 2 consists of detailed descriptions of more than 45 individual methods, including those for bacteria, phytoplankton, zooplankton, seston, periphyton, macrophytes, benthic invertebrates, fish and other vertebrates, cellular contents, productivity, and bioassays. Each method is summarized, and the application, interferences, apparatus, reagents, collection, analysis, calculations, reporting of results, precision and references are given. Part 3 consists of a glossary. Part 4 is a list of taxonomic references.

  8. Effect of EEG Referencing Methods on Auditory Mismatch Negativity

    PubMed Central

    Mahajan, Yatin; Peter, Varghese; Sharma, Mridula

    2017-01-01

    Auditory event-related potentials (ERPs) have consistently been used in the investigation of auditory and cognitive processing in the research and clinical laboratories. There is currently no consensus on the choice of appropriate reference for auditory ERPs. The most commonly used references in auditory ERP research are the mathematically linked-mastoids (LM) and average referencing (AVG). Since LM and AVG referencing procedures do not solve the issue of electrically-neutral reference, Reference Electrode Standardization Technique (REST) was developed to create a neutral reference for EEG recordings. The aim of the current research is to compare the influence of the reference on amplitude and latency of auditory mismatch negativity (MMN) as a function of magnitude of frequency deviance across three commonly used electrode montages (16, 32, and 64-channel) using REST, LM, and AVG reference procedures. The current study was designed to determine if the three reference methods capture the variation in amplitude and latency of MMN with the deviance magnitude. We recorded MMN from 12 normal hearing young adults in an auditory oddball paradigm with 1,000 Hz pure tone as standard and 1,030, 1,100, and 1,200 Hz as small, medium and large frequency deviants, respectively. The EEG data recorded to these sounds was re-referenced using REST, LM, and AVG methods across 16-, 32-, and 64-channel EEG electrode montages. Results revealed that while the latency of MMN decreased with increment in frequency of deviant sounds, no effect of frequency deviance was present for amplitude of MMN. There was no effect of referencing procedure on the experimental effect tested. The amplitude of MMN was largest when the ERP was computed using LM referencing and the REST referencing produced the largest amplitude of MMN for 64-channel montage. There was no effect of electrode-montage on AVG referencing induced ERPs. Contrary to our predictions, the results suggest that the auditory MMN elicited as a function of increments in frequency deviance does not depend on the choice of referencing procedure. The results also suggest that auditory ERPs generated using REST referencing is contingent on the electrode arrays more than the AVG referencing. PMID:29066945
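
    For orientation, the sketch below shows the two conventional re-referencing schemes compared in the study, average reference and linked mastoids, applied to a channels-by-samples array (REST requires a head model and lead-field computation, so it is not shown). Channel indices for the mastoids are illustrative.

        # Sketch of the two conventional re-referencing schemes compared above.
        # data: channels x samples; mastoid channel indices are illustrative.
        import numpy as np

        def average_reference(data):
            return data - data.mean(axis=0, keepdims=True)

        def linked_mastoid_reference(data, left_mastoid_idx, right_mastoid_idx):
            ref = 0.5 * (data[left_mastoid_idx] + data[right_mastoid_idx])
            return data - ref[np.newaxis, :]

        eeg = np.random.default_rng(1).standard_normal((64, 1000))  # 64 ch, 1000 samples
        avg_ref = average_reference(eeg)
        lm_ref = linked_mastoid_reference(eeg, left_mastoid_idx=62, right_mastoid_idx=63)
        print(avg_ref.mean(axis=0)[:3])                  # ~0 across channels per sample
        print(np.allclose(lm_ref[62], -lm_ref[63]))      # mastoids become mirror images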

  9. Single-trial event-related potential extraction through one-unit ICA-with-reference

    NASA Astrophysics Data System (ADS)

    Lih Lee, Wee; Tan, Tele; Falkmer, Torbjörn; Leung, Yee Hong

    2016-12-01

    Objective. In recent years, ICA has been one of the more popular methods for extracting event-related potentials (ERPs) at the single-trial level. It is a blind source separation technique that allows the extraction of an ERP without making strong assumptions on the temporal and spatial characteristics of the ERP. However, the problem with traditional ICA is that the extraction is not direct and is time-consuming due to the need for source selection processing. In this paper, the application of a one-unit ICA-with-Reference (ICA-R), a constrained ICA method, is proposed. Approach. In cases where the time-region of the desired ERP is known a priori, this time information is utilized to generate a reference signal, which is then used to guide the one-unit ICA-R to extract the source signal of the desired ERP directly. Main results. Our results showed that, compared to traditional ICA, ICA-R is a more effective method for analysing ERPs because it avoids manual source selection and requires less computation, resulting in faster ERP extraction. Significance. In addition, since the method is automated, it reduces the risk of subjective bias in the ERP analysis. It is also a potential tool for extracting the ERP in online applications.

  10. Design of two-dimensional zero reference codes with cross-entropy method.

    PubMed

    Chen, Jung-Chieh; Wen, Chao-Kai

    2010-06-20

    We present a cross-entropy (CE)-based method for the design of optimum two-dimensional (2D) zero reference codes (ZRCs) in order to generate a zero reference signal for a grating measurement system and achieve absolute position, a coordinate origin, or a machine home position. In the absence of diffraction effects, the 2D ZRC design problem is known as the autocorrelation approximation. Based on the properties of the autocorrelation function, the design of the 2D ZRC is first formulated as a particular combination optimization problem. The CE method is then applied to search for an optimal 2D ZRC and thus obtain the desirable zero reference signal. Computer simulation results indicate that there are 15.38% and 14.29% reductions in the second maxima value for the 16x16 grating system with n(1)=64 and the 100x100 grating system with n(1)=300, respectively, where n(1) is the number of transparent pixels, compared with those of the conventional genetic algorithm.
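
    The figure of merit minimized in such a design is the second maximum of the autocorrelation of the binary code (the main peak equals the number of transparent pixels, n1). The sketch below evaluates that objective for a random 16x16 pattern; the cross-entropy sampling loop that searches over patterns is omitted.

        # Sketch of the objective evaluated when designing a zero reference code: the
        # second maximum of the 2D autocorrelation of a binary transparency pattern.
        import numpy as np
        from scipy.signal import correlate2d

        def second_maximum(code):
            """code: 2D array of 0/1 transparent pixels."""
            acf = correlate2d(code, code, mode="full")
            centre = np.unravel_index(np.argmax(acf), acf.shape)  # main peak = n1
            acf[centre] = -1                                      # mask the main peak
            return acf.max()

        rng = np.random.default_rng(2)
        code = (rng.random((16, 16)) < 64 / 256).astype(int)      # ~64 transparent pixels
        print("n1 =", int(code.sum()), "second maximum =", int(second_maximum(code)))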

  11. 40 CFR 53.16 - Supersession of reference methods.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 40 (Protection of Environment), Part 53, Ambient Air Monitoring Reference and Equivalent Methods, General Provisions, § 53.16 Supersession of reference methods (2010-07-01 edition): (a) This section prescribes procedures and criteria applicable to requests that …

  12. Harmonized Reference Ranges for Circulating Testosterone Levels in Men of Four Cohort Studies in the United States and Europe

    PubMed Central

    Travison, Thomas G.; Vesper, Hubert W.; Orwoll, Eric; Wu, Frederick; Kaufman, Jean Marc; Wang, Ying; Lapauw, Bruno; Fiers, Tom; Matsumoto, Alvin M.

    2017-01-01

    Background: Reference ranges for testosterone are essential for making a diagnosis of hypogonadism in men. Objective: To establish harmonized reference ranges for total testosterone in men that can be applied across laboratories by cross-calibrating assays to a reference method and standard. Population: The 9054 community-dwelling men in cohort studies in the United States and Europe: Framingham Heart Study; European Male Aging Study; Osteoporotic Fractures in Men Study; and Male Sibling Study of Osteoporosis. Methods: Testosterone concentrations in 100 participants in each of the four cohorts were measured using a reference method at Centers for Disease Control and Prevention (CDC). Generalized additive models and Bland-Altman analyses supported the use of normalizing equations for transformation between cohort-specific and CDC values. Normalizing equations, generated using Passing-Bablok regression, were used to generate harmonized values, which were used to derive standardized, age-specific reference ranges. Results: Harmonization procedure reduced intercohort variation between testosterone measurements in men of similar ages. In healthy nonobese men, 19 to 39 years, harmonized 2.5th, 5th, 50th, 95th, and 97.5th percentile values were 264, 303, 531, 852, and 916 ng/dL, respectively. Age-specific harmonized testosterone concentrations in nonobese men were similar across cohorts and greater than in all men. Conclusion: Harmonized normal range in a healthy nonobese population of European and American men, 19 to 39 years, is 264 to 916 ng/dL. A substantial proportion of intercohort variation in testosterone levels is due to assay differences. These data demonstrate the feasibility of generating harmonized reference ranges for testosterone that can be applied to assays, which have been calibrated to a reference method and calibrator. PMID:28324103
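
    The harmonization step relies on normalizing equations obtained from Passing-Bablok regression between each cohort assay and the CDC reference values. The sketch below shows a simplified slope/intercept estimate (median of pairwise slopes, median residual intercept); the full procedure's offset for negative slopes and confidence bounds are omitted, and the values are illustrative.

        # Simplified sketch of a Passing-Bablok style normalizing equation. The full
        # procedure also shifts the median for slopes below -1 and reports confidence
        # bounds, both omitted here.
        import itertools
        import numpy as np

        def passing_bablok(cohort_values, cdc_values):
            x, y = np.asarray(cohort_values, float), np.asarray(cdc_values, float)
            slopes = []
            for i, j in itertools.combinations(range(len(x)), 2):
                if x[i] != x[j]:
                    s = (y[j] - y[i]) / (x[j] - x[i])
                    if s != -1.0:              # slopes of exactly -1 are excluded by convention
                        slopes.append(s)
            slope = float(np.median(slopes))
            intercept = float(np.median(y - slope * x))
            return slope, intercept

        # Harmonize a cohort measurement to the reference scale: y_hat = a + b * x
        b, a = passing_bablok([310, 420, 505, 610, 730], [298, 415, 520, 605, 742])
        print(round(a, 1), round(b, 3))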

  13. Identification and validation of suitable reference genes for RT-qPCR analysis in mouse testis development.

    PubMed

    Gong, Zu-Kang; Wang, Shuang-Jie; Huang, Yong-Qi; Zhao, Rui-Qiang; Zhu, Qi-Fang; Lin, Wen-Zhen

    2014-12-01

    RT-qPCR is a commonly used method for evaluating gene expression; however, its accuracy and reliability are dependent upon the choice of appropriate reference gene(s), and there is limited information available on suitable reference gene(s) that can be used in mouse testis at different stages. In this study, using the RT-qPCR method, we investigated the expression variations of six reference genes representing different functional classes (Actb, Gapdh, Ppia, Tbp, Rps29, Hprt1) in mice testis during embryonic and postnatal development. The expression stabilities of putative reference genes were evaluated using five algorithms: geNorm, NormFinder, Bestkeeper, the comparative delta C(t) method and integrated tool RefFinder. Analysis of the results showed that Ppia, Gapdh and Actb were identified as the most stable genes and the geometric mean of Ppia, Gapdh and Actb constitutes an appropriate normalization factor for gene expression studies. The mRNA expression of AT1 as a test gene of interest varied depending upon which of the reference gene(s) was used as an internal control(s). This study suggested that Ppia, Gapdh and Actb are suitable reference genes among the six genes used for RT-qPCR normalization and provide crucial information for transcriptional analyses in future studies of gene expression in the developing mouse testis.
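
    A minimal sketch of the normalization implied above: the normalization factor is the geometric mean of the relative quantities of the three stable reference genes, and the target gene's relative quantity is divided by it. The Ct values are illustrative and 100% amplification efficiency (E = 2) is assumed, a common simplification.

        # Sketch of normalizing a target gene against the geometric mean of the three
        # stable reference genes identified above, assuming 100% PCR efficiency.
        import numpy as np

        def relative_quantity(ct, calibrator_ct, efficiency=2.0):
            return efficiency ** (calibrator_ct - ct)

        def normalized_expression(target_ct, ref_cts, calib_target_ct, calib_ref_cts):
            rq_target = relative_quantity(target_ct, calib_target_ct)
            rq_refs = [relative_quantity(ct, c) for ct, c in zip(ref_cts, calib_ref_cts)]
            norm_factor = float(np.exp(np.mean(np.log(rq_refs))))   # geometric mean
            return rq_target / norm_factor

        # Illustrative Ct values for a target gene and for Ppia, Gapdh, Actb in one
        # sample, with calibrator (control sample) Ct values given alongside.
        print(normalized_expression(26.4, [18.2, 17.9, 19.1], 27.0, [18.0, 18.1, 19.0]))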

  14. Noise Estimation and Quality Assessment of Gaussian Noise Corrupted Images

    NASA Astrophysics Data System (ADS)

    Kamble, V. M.; Bhurchandi, K.

    2018-03-01

    Evaluating the exact quantity of noise present in an image, and the quality of an image in the absence of a reference image, is a challenging task. We propose a near-perfect noise estimation method and a no-reference image quality assessment method for images corrupted by Gaussian noise. The proposed methods obtain an initial estimate of the noise standard deviation present in an image using the median of wavelet transform coefficients and then obtain a nearly exact estimate using curve fitting. The proposed noise estimation method provides the noise estimate within an average error of +/-4%. For quality assessment, this noise estimate is mapped to the Differential Mean Opinion Score (DMOS) using a nonlinear function. The proposed methods require minimal training and yield the noise estimate and image quality score. Images from the Laboratory for Image and Video Engineering (LIVE) database and the Computational Perception and Image Quality (CSIQ) database are used for validation of the proposed quality assessment method. Experimental results show that the performance of the proposed quality assessment method is on par with existing no-reference image quality assessment metrics for Gaussian noise corrupted images.
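
    The standard robust estimator that such methods start from is sigma ≈ median(|HH|)/0.6745, computed from the diagonal detail coefficients of a single-level wavelet transform. The sketch below shows that initial estimate on a synthetic image; the curve-fitting refinement described in the paper is not reproduced.

        # Sketch of the wavelet-median noise estimate: sigma ~ median(|diagonal
        # detail coefficients|) / 0.6745 (requires the PyWavelets package).
        import numpy as np
        import pywt

        def estimate_noise_sigma(image):
            _, (_, _, diag) = pywt.dwt2(image.astype(float), "db1")
            return np.median(np.abs(diag)) / 0.6745

        rng = np.random.default_rng(3)
        clean = np.tile(np.linspace(0, 255, 256), (256, 1))    # simple synthetic image
        noisy = clean + rng.normal(0.0, 10.0, clean.shape)     # Gaussian noise, sigma = 10
        print(round(estimate_noise_sigma(noisy), 2))           # close to 10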

  15. Reference Intervals of Hematology and Clinical Chemistry Analytes for 1-Year-Old Korean Children.

    PubMed

    Lee, Hye Ryun; Shin, Sue; Yoon, Jong Hyun; Roh, Eun Youn; Chang, Ju Young

    2016-09-01

    Reference intervals need to be established according to age. We established reference intervals of hematology and chemistry from community-based healthy 1-yr-old children and analyzed their iron status according to the feeding methods during the first six months after birth. A total of 887 children who received a medical check-up between 2010 and 2014 at Boramae Hospital (Seoul, Korea) were enrolled. A total of 534 children (247 boys and 287 girls) were enrolled as reference individuals after the exclusion of data obtained from children with suspected iron deficiency. Hematology and clinical chemistry analytes were measured, and the reference value of each analyte was estimated by using parametric (mean±2 SD) or nonparametric methods (2.5-97.5th percentile). Iron, total iron-binding capacity, and ferritin were measured, and transferrin saturation was calculated. As there were no differences in the mean values between boys and girls, we established the reference intervals for 1-yr-old children regardless of sex. The analysis of serum iron status according to feeding methods during the first six months revealed higher iron, ferritin, and transferrin saturation levels in children exclusively or mainly fed formula than in children exclusively or mainly fed breast milk. We established reference intervals of hematology and clinical chemistry analytes from community-based healthy children at one year of age. These reference intervals will be useful for interpreting results of medical check-ups at one year of age.
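
    The two estimation options mentioned above (parametric mean ± 2 SD and nonparametric 2.5th-97.5th percentiles) can be sketched as follows; the synthetic values are illustrative only.

        # Sketch of the two reference-interval estimators mentioned above.
        import numpy as np

        def reference_interval(values, parametric=True):
            v = np.asarray(values, dtype=float)
            if parametric:
                return v.mean() - 2 * v.std(ddof=1), v.mean() + 2 * v.std(ddof=1)
            return tuple(np.percentile(v, [2.5, 97.5]))

        rng = np.random.default_rng(4)
        hb = rng.normal(12.2, 0.9, 534)        # illustrative haemoglobin values, g/dL
        print(reference_interval(hb, parametric=True))
        print(reference_interval(hb, parametric=False))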

  16. Crop Field Reflectance Measurements

    NASA Astrophysics Data System (ADS)

    Weber, Christian; Schinca, Daniel C.; Tocho, Jorge O.; Videla, Fabian

    2008-04-01

    We present in this paper the results of reflectance measurements performed with a three-band passive radiometer with independent channels for the solar irradiance reference. We analyze the comparative operation of the traditional method, which alternates downward-looking measurements of the field and of a white reference panel, and the new approach, which involves duplicated spectral channels, each with its own diffuser pointing upwards towards the zenith (upward-looking). The results indicate that the latter method is more suitable for use with passive sensors under rapidly changing atmospheric conditions (such as clouds, dust, mist, smog and other scatterers), since a more reliable synchronous record of reference and incident light is achieved. In addition, having separate channels for the reference and the signal allows better balancing of the amplifier gains for each spectral channel. We show results for the determination of the Normalized Difference Vegetation Index (NDVI) from 2006 and 2007 field experiments concerning weed detection and the assessment of fertilizer levels in wheat, carried out to refine sensor-based fertilizer nitrogen rate recommendations. The variation of the radiometric normalization measurements taken at noon (nadir solar position) over the whole crop cycle for two seasons (winter and spring) is also shown.
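
    The processing implied above reduces to a per-band reflectance (downward-looking target signal divided by its simultaneous upward-looking irradiance reference) followed by the NDVI ratio of the red and near-infrared bands. The sketch below uses illustrative signal values and a placeholder cross-calibration factor between paired channels.

        # Sketch of the dual-channel processing: reflectance per band as the ratio of
        # the downward-looking (target) signal to its simultaneous upward-looking
        # (irradiance) reference, then NDVI from the red and near-infrared bands.

        def reflectance(target_signal, irradiance_signal, cross_cal=1.0):
            # cross_cal is an illustrative calibration factor between paired channels
            return cross_cal * target_signal / irradiance_signal

        def ndvi(red_target, red_irr, nir_target, nir_irr):
            rho_red = reflectance(red_target, red_irr)
            rho_nir = reflectance(nir_target, nir_irr)
            return (rho_nir - rho_red) / (rho_nir + rho_red)

        print(round(ndvi(red_target=0.9, red_irr=12.0, nir_target=5.4, nir_irr=11.5), 3))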

  17. Certification of caffeine reference material purity by ultraviolet/visible spectrophotometry and high-performance liquid chromatography with diode-array detection as two independent analytical methods.

    PubMed

    Shehata, A B; Rizk, M S; Rend, E A

    2016-10-01

    Caffeine reference material certified for purity is produced worldwide, but no research work on the details of the certification process has been published in the literature. In this paper, we report the scientific details of the preparation and certification of pure caffeine reference materials. Caffeine was prepared by extraction from roasted and ground coffee with dichloromethane after heating in deionized water mixed with magnesium oxide. The extract was purified, dried, and bottled in dark glass vials. Stratified random selection was applied to select a number of vials for homogeneity and stability studies, which revealed that the prepared reference material is homogeneous and sufficiently stable. Quantification of the caffeine purity (%) was carried out using a calibrated UV/visible spectrophotometer and a calibrated high-performance liquid chromatography method with diode-array detection. The results obtained from both methods were combined to derive the certified value and its associated uncertainty. The certified value of the reference material purity was found to be 99.86% and its associated uncertainty was ±0.65%, which makes the candidate reference material a very useful calibrant in food and drug chemical analysis. Copyright © 2016. Published by Elsevier B.V.

  18. Paradigm Diagnostics Salmonella Indicator Broth (PDX-SIB) for detection of Salmonella on selected environmental surfaces.

    PubMed

    Olstein, Alan; Griffith, Leena; Feirtag, Joellen; Pearson, Nicole

    2013-01-01

    The Paradigm Diagnostics Salmonella Indicator Broth (PDX-SIB) is intended as a single-step selective enrichment indicator broth to be used as a simple screening test for the presence of Salmonella spp. in environmental samples. This method permits the end user to avoid multistep sample processing to identify presumptively positive samples, as exemplified by standard U.S. reference methods. PDX-SIB permits the outgrowth of Salmonella while inhibiting the growth of competitive Gram-negative and -positive microflora. Growth of Salmonella-positive cultures results in a visual color change of the medium from purple to yellow when the sample is grown at 37 +/- 1 degree C. Performance of PDX-SIB has been evaluated in five different categories: inclusivity-exclusivity, methods comparison, ruggedness, lot-to-lot variability, and shelf stability. The inclusivity panel included 100 different Salmonella serovars, 98 of which were SIB-positive during the 30 to 48 h incubation period. The exclusivity panel included 33 different non-Salmonella microorganisms, 31 of which were SIB-negative during the incubation period. Methods comparison studies included four different surfaces: S. Newport on plastic, S. Anatum on sealed concrete, S. Abaetetuba on ceramic tile, and S. Typhimurium in the presence of 1 log excess of Citrobacter freundii. Results of the methods comparison studies demonstrated no statistical difference between the SIB method and the U.S. Food and Drug Administration-Bacteriological Analytical Manual reference method, as measured by the Mantel-Haenszel Chi-square test. Ruggedness studies demonstrated little variation in test results when SIB incubation temperatures were varied over a 34-40 degrees C range. Lot-to-lot consistency results suggest no detectable differences in manufactured goods using two reference Salmonella serovars and one non-Salmonella microorganism.

  19. Development of traceable measurement of the diffuse optical properties of solid reference standards for biomedical optics at National Institute of Standards and Technology.

    PubMed

    Lemaillet, Paul; Bouchard, Jean-Pierre; Allen, David W

    2015-07-01

    The development of a national reference instrument dedicated to the measurement of the scattering and absorption properties of solid tissue-mimicking phantoms used as reference standards is presented. The optical properties of the phantoms are measured with a double-integrating sphere setup in the steady-state domain, coupled with an inversion routine of the adding-doubling procedure that allows for the computation of the uncertainty budget for the measurements. The results are compared to the phantom manufacturer's values obtained by a time-resolved approach. The results suggest that the agreement between these two independent methods is within the estimated uncertainties. This new reference instrument will provide optical biomedical research laboratories with reference values for absolute diffuse optical properties of phantom materials.

  20. A novel and eco-friendly analytical method for phosphorus and sulfur determination in animal feed.

    PubMed

    Novo, Diogo L R; Pereira, Rodrigo M; Costa, Vanize C; Hartwig, Carla A; Mesko, Marcia F

    2018-04-25

    An eco-friendly method for the indirect determination of phosphorus and sulfur in animal feed by ion chromatography is proposed. Using this method, it was possible to digest 500 mg of animal feed in a microwave system under oxygen pressure (20 bar) using only a diluted acid solution (2 mol L⁻¹ HNO₃). The accuracy of the proposed method was evaluated by recovery tests, by analysis of a reference material (RM) and by comparison of the results with those obtained using conventional microwave-assisted digestion. Moreover, P results were compared with those obtained from the method recommended by AOAC International for animal feed (Method nr. 965.17) and no significant differences were found between the results. Recoveries for P and S were between 94 and 97%, and agreement with the reference values of the RM was better than 94%. Phosphorus and S concentrations in the animal feeds ranged from 10,026 to 28,357 mg kg⁻¹ and 2259 to 4601 mg kg⁻¹, respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Evaluation of the validity of a rapid method for measuring high and low haemoglobin levels in whole blood donors.

    PubMed

    Shahshahani, Hayedeh J; Meraat, Nahid; Mansouri, Fatemeh

    2013-07-01

    Haemoglobin screening methods need to be highly sensitive to detect both low and high haemoglobin levels and avoid unnecessary rejection of potential blood donors. The aim of this study was to evaluate the accuracy of measurements by HemoCue in blood donors. Three hundred and fourteen randomly selected, prospective blood donors were studied. Single fingerstick blood samples were obtained to determine the donors' haemoglobin levels by HemoCue, while venous blood samples were drawn for measurement of the haemoglobin level by both HemoCue and an automated haematology analyser as the reference method. The sensitivity, specificity, predictive values and correlation between the reference method and HemoCue were assessed. Cases with a haemoglobin concentration in the range of 12.5-17.9 g/dL were accepted for blood donation. Analysis of paired results showed that haemoglobin levels measured by HemoCue were higher than those measured by the reference method. There was a significant correlation between the reference method and HemoCue for haemoglobin levels less than 12.5 g/dL. The correlation was less strong for increasing haemoglobin levels. Linear correlation was poor for haemoglobin levels over 18 g/dL. Thirteen percent of donors, who had haemoglobin levels close to the upper limit, were unnecessarily rejected. HemoCue is suitable for screening for anaemia in blood donors. Most donors at Yazd are males and a significant percentage of them have haemoglobin values close to the upper limit for acceptance as a blood donor; since these subjects could be unnecessarily rejected on the basis of HemoCue results and testing with this method is expensive, it is recommended that qualitative methods are used for primary screening and accurate quantitative methods used in clinically suspicious cases or when qualitative methods fail.

  2. USDA FSIS, FDA BAM, and ISO culture methods BD BBL CHROMagar O157 media.

    PubMed

    Ritter, Vicki; Kircher, Susan; Dick, Nancy

    2009-01-01

    BBL CHROMagar O157 media (CO) was evaluated for detection of Escherichia coli O157:H7 in raw ground beef and unpasteurized apple cider. The recovery of E. coli O157:H7 on CO was compared to the U.S. Food and Drug Administration (FDA) Bacteriological Analytical Manual (BAM), U.S. Department of Agriculture (USDA) Food Safety and Inspection Service (FSIS), and International Organization for Standardization (ISO) reference plated media using the recommended enrichment broths. Of the 180 food samples tested, 45 were tested using BAM, 45 using the USDA method, and 90 using the ISO method. CO produced comparable results with the reference methods on all matrixes with a sensitivity of 100% and a specificity of 100%. No false negatives were found in testing the food matrixes. There was no statistical difference in recovery based on Chi-square analysis. Method agreement for raw ground beef was 85% for the USDA FSIS method and 95% for the ISO method. Method agreement for unpasteurized apple cider was 100% for the ISO and FDA BAM methods. In all cases where method agreement was <100%, CO detected more positives than the reference method media. Evaluation of known isolates on CO in inclusivity and exclusivity testing had a sensitivity and specificity of 100%. The results of this study demonstrate that CO is an effective medium for the recovery and detection of E. coli O157:H7 in raw ground beef and unpasteurized apple cider using FDA BAM, USDA FSIS, and ISO methods.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byerly, Benjamin L.; Stanley, Floyd; Spencer, Khal

    In our study, a certified plutonium metal reference material (CRM 126) with a known production history is examined using analytical methods that are commonly employed in nuclear forensics for provenancing and attribution. The measured plutonium isotopic composition and actinide assay are consistent with values reported on the reference material certificate, and model ages from U/Pu and Am/Pu chronometers agree with the documented production timeline. These results confirm the utility of these analytical methods and highlight the importance of a holistic approach for the forensic study of unknown materials.

  4. A method for evaluating the relation between sound source segregation and masking

    PubMed Central

    Lutfi, Robert A.; Liu, Ching-Ju

    2011-01-01

    Sound source segregation refers to the ability to hear as separate entities two or more sound sources comprising a mixture. Masking refers to the ability of one sound to make another sound difficult to hear. In studies, masking is often assumed to result from a failure of segregation, but this assumption may not always be correct. Here a method is offered for identifying the relation between masking and sound source segregation, and an example of its application is given. PMID:21302979

  5. Characterisation of a reference site for quantifying uncertainties related to soil sampling.

    PubMed

    Barbizzi, Sabrina; de Zorzi, Paolo; Belli, Maria; Pati, Alessandra; Sansone, Umberto; Stellato, Luisa; Barbina, Maria; Deluisa, Andrea; Menegon, Sandro; Coletti, Valter

    2004-01-01

    The paper reports a methodology adopted to face problems related to quality assurance in soil sampling. The SOILSAMP project, funded by the Environmental Protection Agency of Italy (APAT), is aimed at (i) establishing protocols for soil sampling in different environments; (ii) assessing uncertainties associated with different soil sampling methods in order to select the "fit-for-purpose" method; (iii) qualifying, in terms of trace element spatial variability, a reference site for national and international inter-comparison exercises. Preliminary results and considerations are illustrated.

  6. Discrimination of human and nonhuman blood using Raman spectroscopy with self-reference algorithm

    NASA Astrophysics Data System (ADS)

    Bian, Haiyi; Wang, Peng; Wang, Jun; Yin, Huancai; Tian, Yubing; Bai, Pengli; Wu, Xiaodong; Wang, Ning; Tang, Yuguo; Gao, Jing

    2017-09-01

    We report a self-reference algorithm that discriminates human from nonhuman blood by calculating the ratios of identification Raman peaks to reference Raman peaks and choosing appropriate threshold values. The influence of using different reference peaks and identification peaks was analyzed in detail. The Raman peak at 1003 cm-1 proved to be a stable reference peak that is insensitive to influencing factors such as the incident laser intensity and the amount of sample. The Raman peak at 1341 cm-1 was found to be an efficient identification peak, which indicates that the difference between human and nonhuman blood results from the C-H bend in tryptophan. A comparison between the self-reference algorithm and the partial least squares method showed that the self-reference algorithm not only obtained discrimination results with the same accuracy but also provided information on the difference in chemical composition. In addition, the algorithm's true positive rate of 100% is significant for customs inspection, where it helps avoid genetic disclosure, and for forensic science.
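
    The self-reference classification described above reduces to computing the ratio of an identification peak to a reference peak and comparing it with a threshold. The sketch below assumes hypothetical peak windows, intensities, threshold, and decision direction; it is not the authors' implementation.

```python
# Sketch of a self-reference ratio test: intensity near an identification peak
# (e.g., 1341 cm^-1) divided by intensity near a reference peak (e.g., 1003 cm^-1),
# compared with a chosen threshold. Windows, threshold, and the decision
# direction are assumptions for illustration.
import numpy as np

def peak_intensity(shifts, intensities, center, window=5.0):
    """Maximum intensity within +/- window cm^-1 of a nominal peak position."""
    mask = np.abs(shifts - center) <= window
    return float(intensities[mask].max())

def classify(shifts, intensities, ident=1341.0, ref=1003.0, threshold=0.8):
    ratio = peak_intensity(shifts, intensities, ident) / peak_intensity(shifts, intensities, ref)
    label = "class A" if ratio > threshold else "class B"   # mapping to human/nonhuman is data-driven
    return label, ratio

# Illustrative synthetic spectrum with peaks at the two positions
shifts = np.linspace(600.0, 1800.0, 1201)
intensities = (np.exp(-0.5 * ((shifts - 1003.0) / 4.0) ** 2)
               + 0.9 * np.exp(-0.5 * ((shifts - 1341.0) / 6.0) ** 2)
               + 0.02 * np.random.rand(shifts.size))
print(classify(shifts, intensities))
```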

  7. A fast and automatic mosaic method for high-resolution satellite images

    NASA Astrophysics Data System (ADS)

    Chen, Hongshun; He, Hui; Xiao, Hongyu; Huang, Jing

    2015-12-01

    We propose a fast and fully automatic mosaic method for high-resolution satellite images. First, the overlapped rectangle is computed from the geographical locations of the reference and mosaic images, and feature points are extracted from the overlapped region of both images by the scale-invariant feature transform (SIFT) algorithm. Then, the RANSAC method is used to match the feature points of both images. Finally, the two images are fused into a seamless panoramic image by simple linear weighted fusion or another blending method. The proposed method is implemented in C++ based on OpenCV and GDAL, and tested on WorldView-2 multispectral images with a spatial resolution of 2 meters. Results show that the proposed method detects feature points efficiently and mosaics images automatically.
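
    The pipeline described above (SIFT features restricted to the overlap region, RANSAC-based matching, then weighted blending) maps onto standard OpenCV calls. A minimal Python sketch is shown below; it assumes the overlap rectangles have already been derived from the images' geographic metadata, and it uses a homography and a simple 50/50 blend, which simplify the paper's transform and fusion steps.

```python
# Sketch of a SIFT + RANSAC mosaicking step with OpenCV. The overlap rectangles
# (x, y, w, h) are assumed to come from the images' geolocation metadata;
# everything here is illustrative, not the paper's implementation.
import cv2
import numpy as np

def mosaic(ref_img, mov_img, ref_roi, mov_roi):
    sift = cv2.SIFT_create()
    x1, y1, w1, h1 = ref_roi
    x2, y2, w2, h2 = mov_roi
    kp1, des1 = sift.detectAndCompute(ref_img[y1:y1 + h1, x1:x1 + w1], None)
    kp2, des2 = sift.detectAndCompute(mov_img[y2:y2 + h2, x2:x2 + w2], None)

    # Match descriptors and keep the strongest correspondences
    matches = sorted(cv2.BFMatcher(cv2.NORM_L2).match(des1, des2),
                     key=lambda m: m.distance)[:200]
    dst = np.float32([(kp1[m.queryIdx].pt[0] + x1, kp1[m.queryIdx].pt[1] + y1) for m in matches])
    src = np.float32([(kp2[m.trainIdx].pt[0] + x2, kp2[m.trainIdx].pt[1] + y2) for m in matches])

    # Robustly estimate the mapping from the moving image into the reference frame
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    warped = cv2.warpPerspective(mov_img, H, (ref_img.shape[1], ref_img.shape[0]))

    # Simple linear (50/50) blend; the paper's weighted fusion is more refined
    return cv2.addWeighted(ref_img, 0.5, warped, 0.5, 0)
```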

  8. Solution of steady and unsteady transonic-vortex flows using Euler and full-potential equations

    NASA Technical Reports Server (NTRS)

    Kandil, Osama A.; Chuang, Andrew H.; Hu, Hong

    1989-01-01

    Two methods are presented for inviscid transonic flows: unsteady Euler equations in a rotating frame of reference for transonic-vortex flows, and an integral solution of the full-potential equation with and without embedded Euler domains for transonic airfoil flows. The computational results cover steady and unsteady conical vortex flows, 3-D steady transonic vortex flow, and transonic airfoil flows. The results are in good agreement with other computational results and experimental data. The rotating frame of reference solution is potentially efficient compared with the space-fixed reference formulation with dynamic gridding. The integral equation solution with embedded Euler domains is computationally efficient and as accurate as the Euler equations.

  9. Extraction of the gate capacitance coupling coefficient in floating gate non-volatile memories: Statistical study of the effect of mismatching between floating gate memory and reference transistor in dummy cell extraction methods

    NASA Astrophysics Data System (ADS)

    Rafhay, Quentin; Beug, M. Florian; Duane, Russell

    2007-04-01

    This paper presents an experimental comparison of dummy cell extraction methods of the gate capacitance coupling coefficient for floating gate non-volatile memory structures from different geometries and technologies. These results show the significant influence of mismatching floating gate devices and reference transistors on the extraction of the gate capacitance coupling coefficient. In addition, it demonstrates the accuracy of the new bulk bias dummy cell extraction method and the importance of the β function, introduced recently in [Duane R, Beug F, Mathewson A. Novel capacitance coupling coefficient measurement methodology for floating gate non-volatile memory devices. IEEE Electr Dev Lett 2005;26(7):507-9], to determine matching pairs of floating gate memory and reference transistor.

  10. Block correlated second order perturbation theory with a generalized valence bond reference function.

    PubMed

    Xu, Enhua; Li, Shuhua

    2013-11-07

    The block correlated second-order perturbation theory with a generalized valence bond (GVB) reference (GVB-BCPT2) is proposed. In this approach, each geminal in the GVB reference is considered as a "multi-orbital" block (a subset of spin orbitals), and each occupied or virtual spin orbital is also taken as a single block. The zeroth-order Hamiltonian is set to be the summation of the individual Hamiltonians of all blocks (with explicit two-electron operators within each geminal) so that the GVB reference function and all excited configuration functions are its eigenfunctions. The GVB-BCPT2 energy can be directly obtained without iteration, just like the second order Møller-Plesset perturbation method (MP2), both of which are size consistent. We have applied this GVB-BCPT2 method to investigate the equilibrium distances and spectroscopic constants of 7 diatomic molecules, conformational energy differences of 8 small molecules, and bond-breaking potential energy profiles in 3 systems. GVB-BCPT2 is demonstrated to have noticeably better performance than MP2 for systems with significant multi-reference character, and provide reasonably accurate results for some systems with large active spaces, which are beyond the capability of all CASSCF-based methods.

  11. Visualization of the IMIA Yearbook of Medical Informatics Publications over the Last 25 Years

    PubMed Central

    Tam-Tham, H.; Minty, E. P.

    2016-01-01

    Summary Background The last 25 years have been a period of innovation in the area of medical informatics. The International Medical Informatics Association (IMIA) has published, every year for the last quarter century, the Yearbook of Medical Informatics, collating selected papers from various journals in an attempt to provide a summary of the academic medical informatics literature. The objective of this paper is to visualize the evolution of the medical informatics field over the last 25 years according to the frequency of word occurrences in the papers published in the IMIA Yearbook of Medical Informatics. Methods A literature review was conducted examining the IMIA Yearbook of Medical Informatics between 1992 and 2015. These references were collated into a reference manager application to examine the literature using keyword searches, word clouds, and topic clustering. The data was considered in its entirety, as well as segregated into 3 time periods to examine the evolution of main trends over time. Several methods were used, including word clouds, cluster maps, and custom developed web-based information dashboards. Results The literature search resulted in a total of 1210 references published in the Yearbook, of which 213 references were excluded, resulting in 997 references for visualization. Overall, we found that publications were more technical and methods-oriented between 1992 and 1999; more clinically and patient-oriented between 2000 and 2009; and noted the emergence of “big data”, decision support, and global health in the past decade between 2010 and 2015. Dashboards were additionally created to show individual reference data as well as aggregated information. Conclusion Medical informatics is a vast and expanding area with new methods and technologies being researched, implemented, and evaluated. Determining visualization approaches that enhance our understanding of literature is an active area of research and, like medical informatics, is constantly evolving as new software and algorithms are developed. This paper examined several approaches for visualizing the medical informatics literature to show historical trends, associations, and aggregated summarized information to illustrate the state and changes in the IMIA Yearbook publications over the last quarter century. PMID:27362591
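
    The word-occurrence visualizations described above start from term-frequency counts over the collected references. A minimal sketch of that counting step is shown below; the titles and stop-word list are hypothetical placeholders.

```python
# Sketch of the term-frequency counting behind a word cloud; the titles and
# stop-word list are hypothetical placeholders.
import re
from collections import Counter

STOPWORDS = {"the", "of", "and", "a", "for", "in", "to", "on"}

def term_frequencies(titles):
    counts = Counter()
    for title in titles:
        for word in re.findall(r"[a-z]+", title.lower()):
            if word not in STOPWORDS:
                counts[word] += 1
    return counts

titles = ["Decision support in clinical information systems",
          "Big data methods for global health informatics"]
print(term_frequencies(titles).most_common(5))
```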

  12. Homogeneity study of a corn flour laboratory reference material candidate for inorganic analysis.

    PubMed

    Dos Santos, Ana Maria Pinto; Dos Santos, Liz Oliveira; Brandao, Geovani Cardoso; Leao, Danilo Junqueira; Bernedo, Alfredo Victor Bellido; Lopes, Ricardo Tadeu; Lemos, Valfredo Azevedo

    2015-07-01

    In this work, a homogeneity study of a corn flour reference material candidate for inorganic analysis is presented. Seven kilograms of corn flour were used to prepare the material, which was distributed among 100 bottles. The elements Ca, K, Mg, P, Zn, Cu, Fe, Mn and Mo were quantified by inductively coupled plasma optical emission spectrometry (ICP OES) after an acid digestion procedure. The method accuracy was confirmed by analyzing the rice flour certified reference material, NIST 1568a. All results were evaluated by analysis of variance (ANOVA) and principal component analysis (PCA). In the study, a sample mass of 400 mg was established as the minimum mass required for analysis, according to the PCA. The between-bottle test was performed by analyzing 9 bottles of the material. Subsamples of a single bottle were analyzed for the within-bottle test. No significant differences were observed for the results obtained through the application of both statistical methods. This fact demonstrates that the material is homogeneous for use as a laboratory reference material. Copyright © 2015 Elsevier Ltd. All rights reserved.
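
    The between-bottle homogeneity test described above is, in essence, a one-way ANOVA with bottle as the grouping factor. The sketch below runs such a test on hypothetical replicate concentrations; it is not the study's data or its full ANOVA/PCA workflow.

```python
# Sketch of a between-bottle homogeneity check via one-way ANOVA; the
# replicate concentrations per bottle are hypothetical numbers.
from scipy.stats import f_oneway

bottles = [
    [12.1, 12.3, 12.0],   # bottle 1, e.g. Fe in mg/kg
    [12.2, 12.4, 12.1],   # bottle 2
    [12.0, 12.2, 12.3],   # bottle 3
]
stat, p = f_oneway(*bottles)
print(f"F={stat:.2f}, p={p:.3f}")   # a large p suggests no between-bottle effect
```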

  13. Edge Triggered Apparatus and Method for Measuring Strain in Bragg Gratings

    NASA Technical Reports Server (NTRS)

    Froggatt, Mark E. (Inventor)

    2003-01-01

    An apparatus and method for measuring strain of gratings written into an optical fiber. Optical radiation is transmitted over one or more contiguous predetermined wavelength ranges into a reference optical fiber network and an optical fiber network under test to produce a plurality of reference interference fringes and measurement interference fringes, respectively. The reference and measurement fringes are detected, and the reference fringes trigger the sampling of the measurement fringes. This results in the measurement fringes being sampled at 2π increments of the reference fringes. Each sampled measurement fringe of each wavelength sweep is transformed into a spatial domain waveform. The spatial domain waveforms are summed to form a summation spatial domain waveform that is used to determine location of each grating with respect to a reference reflector. A portion of each spatial domain waveform that corresponds to a particular grating is determined and transformed into a corresponding frequency spectrum representation. The strain on the grating at each wavelength of optical radiation is determined by determining the difference between the current wavelength and an earlier, zero-strain wavelength measurement.

  14. WHO Melting-Point Reference Substances

    PubMed Central

    Bervenmark, H.; Diding, N. Å.; Öhrner, B.

    1963-01-01

    Batches of 13 highly purified chemicals, intended for use as reference substances in the calibration of apparatus for melting-point determinations, have been subjected to a collaborative assay by 15 laboratories in 13 countries. All the laboratories performed melting-point determinations by the capillary methods described in the proposed text for the second edition of the Pharmacopoea Internationalis and some, in addition, carried out determinations by the microscope hot stage (Kofler) method, using both the “going-through” and the “equilibrium” technique. Statistical analysis of the data obtained by the capillary method showed that the within-laboratory variation was small and that the between-laboratory variation, though constituting the greatest part of the whole variance, was not such as to warrant the exclusion of any laboratory from the evaluation of the results. The average values of the melting-points obtained by the laboratories can therefore be used as constants for the substances in question, which have accordingly been established as WHO Melting-Point Reference Substances and included in the WHO collection of authentic chemical substances. As to the microscope hot stage method, analysis of the results indicated that the values obtained by the “going-through” technique did not differ significantly from those obtained by the capillary method, but the values obtained by the “equilibrium” technique were mostly significantly lower. PMID:20604137

  15. Deviation of landmarks in accordance with methods of establishing reference planes in three-dimensional facial CT evaluation.

    PubMed

    Yoon, Kaeng Won; Yoon, Suk-Ja; Kang, Byung-Cheol; Kim, Young-Hee; Kook, Min Suk; Lee, Jae-Seo; Palomo, Juan Martin

    2014-09-01

    This study aimed to investigate the deviation of landmarks from horizontal or midsagittal reference planes according to the methods of establishing reference planes. Computed tomography (CT) scans of 18 patients who received orthodontic and orthognathic surgical treatment were reviewed. Each CT scan was reconstructed by three methods for establishing three orthogonal reference planes (namely, the horizontal, midsagittal, and coronal reference planes). The horizontal (bilateral porions and bilateral orbitales) and midsagittal (crista galli, nasion, prechiasmatic point, opisthion, and anterior nasal spine) landmarks were identified on each CT scan. Vertical deviation of the horizontal landmarks and horizontal deviation of the midsagittal landmarks were measured. The porion and orbitale, which were not involved in establishing the horizontal reference plane, were found to deviate vertically from the horizontal reference plane in the three methods. The midsagittal landmarks, which were not used for the midsagittal reference plane, deviated horizontally from the midsagittal reference plane in the three methods. In a three-dimensional facial analysis, the vertical and horizontal deviations of the landmarks from the horizontal and midsagittal reference planes could vary depending on the methods of establishing reference planes.

  16. SEMI-VOLATILE SECONDARY AEROSOLS IN URBAN ATMOSPHERES: MEETING A MEASURED CHALLENGE

    EPA Science Inventory

    This presentation compares the results from various particle measurement methods as they relate to semi-volatile secondary aerosols in urban atmospheres. The methods include the PM2.5 Federal Reference Method; Particle Concentrator - BYU Organic Sampling System (PC-BOSS); the Re...

  17. Novel lipoprotein density profiling in healthy dogs of various breeds, healthy miniature schnauzers, and miniature schnauzers with hyperlipidemia

    PubMed Central

    2013-01-01

    Background Despite the importance of abnormalities in lipoprotein metabolism in clinical canine medicine, the fact that most previously used methods for lipoprotein profiling are rather laborious and time-consuming has been a major obstacle to the wide clinical application and use of lipoprotein profiling in this species. The aim of the present study was to assess the feasibility of a continuous lipoprotein density profile (CLPDP) generated within a bismuth sodium ethylenediaminetetraacetic acid (NaBiEDTA) density gradient to characterize and compare the lipoprotein profiles of healthy dogs of various breeds, healthy Miniature Schnauzers, and Miniature Schnauzers with primary hypertriacylglycerolemia. A total of 35 healthy dogs of various breeds with serum triacylglycerol (TAG) and cholesterol concentrations within their respective reference intervals were selected for use as a reference population. Thirty-one Miniature Schnauzers with serum TAG and cholesterol concentrations within their respective reference intervals and 31 Miniature Schnauzers with hypertriacylglyceridemia were also included in the study. Results The results suggest that CLPDP using NaBiEDTA provides unique diagnostic information in addition to measurements of serum TAG and cholesterol concentrations and that it is a useful screening method for dogs with suspected lipoprotein metabolism disorders. Using the detailed and continuous density distribution information provided by the CLPDP, important differences in lipoprotein profiles can be detected even among dogs that have serum TAG and cholesterol concentrations within the reference interval. Miniature Schnauzers with serum TAG and cholesterol concentrations within the reference interval had significantly different lipoprotein profiles than dogs of various other breeds. In addition, it was further established that specific lipoprotein fractions are associated with hypertriacylglyceridemia in Miniature Schnauzers. Conclusions The results of the present study suggest that density gradient ultracentrifugation using NaBiEDTA is a useful screening method for the study of lipoprotein profiles in dogs. Therefore, this method could potentially be used for diagnostic purposes for the separation of dogs suspected of having lipoprotein abnormalities from healthy dogs. PMID:23497598

  18. Temporal upscaling of instantaneous evapotranspiration on clear-sky days using the constant reference evaporative fraction method with fixed or variable surface resistances at two cropland sites

    NASA Astrophysics Data System (ADS)

    Tang, Ronglin; Li, Zhao-Liang; Sun, Xiaomin; Bi, Yuyun

    2017-01-01

    Surface evapotranspiration (ET) is an important component of the water and energy balance of land and atmospheric systems. This paper investigated whether using variable surface resistances in the reference ET estimates from the full-form Penman-Monteith (PM) equation could improve the upscaled daily ET estimates in the constant reference evaporative fraction (EFr, the ratio of actual to reference grass/alfalfa ET) method on clear-sky days using ground-based measurements. Half-hourly near-surface meteorological variables and eddy covariance (EC) system-measured latent heat flux data on clear-sky days were collected at two sites with different climatic conditions, namely, the subhumid Yucheng station in northern China and the arid Yingke site in northwestern China, and were used as the model input and ground truth, respectively. The results showed that using the Food and Agriculture Organization (FAO)-PM equation, the American Society of Civil Engineers-PM equation, and the full-form PM equation to estimate the reference ET in the constant EFr method produced progressively smaller upscaled daily ET at a given time from midmorning to midafternoon. All three PM equations produced the best results at noon at both sites, regardless of whether the energy imbalance of the EC measurements was closed. When the EC measurements were not corrected for energy imbalance, using variable surface resistance in the full-form PM equation could improve the ET upscaling in the midafternoon, but worse results may occur from midmorning to noon. Site-to-site and time-to-time variations were found in the performance of a given PM equation (with fixed or variable surface resistances) before and after the energy imbalance was closed.
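
    The constant-EFr upscaling evaluated above computes the reference evaporative fraction at an instantaneous (e.g., overpass) time and applies it to the daily reference ET. The sketch below shows that step with hypothetical values; the reference ET is taken as an input rather than computed from a Penman-Monteith implementation.

```python
# Sketch of the constant reference evaporative fraction (EFr) upscaling step.
# Inputs are hypothetical; the reference ET values would normally come from a
# Penman-Monteith calculation driven by the meteorological measurements.
def upscale_daily_et(et_inst, et_ref_inst, et_ref_daily):
    """Daily ET from an instantaneous estimate, assuming EFr = ET/ET_ref
    is constant over the day."""
    efr = et_inst / et_ref_inst
    return efr * et_ref_daily

# Illustrative numbers (mm/h for the instantaneous terms, mm/day for the daily term)
print(upscale_daily_et(et_inst=0.45, et_ref_inst=0.60, et_ref_daily=5.2))  # 3.9 mm/day
```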

  19. Evaluation of the methods for enumerating coliform bacteria from water samples using precise reference standards.

    PubMed

    Wohlsen, T; Bates, J; Vesey, G; Robinson, W A; Katouli, M

    2006-04-01

    To use BioBall cultures as a precise reference standard to evaluate methods for enumeration of Escherichia coli and other coliform bacteria in water samples. Eight methods were evaluated including membrane filtration, standard plate count (pour and spread plate methods), defined substrate technology methods (Colilert and Colisure), the most probable number method and the Petrifilm disposable plate method. Escherichia coli and Enterobacter aerogenes BioBall cultures containing 30 organisms each were used. All tests were performed using 10 replicates. The mean recovery of both bacteria varied with the different methods employed. The best and most consistent results were obtained with Petrifilm and the pour plate method. Other methods either yielded a low recovery or showed significantly high variability between replicates. The BioBall is a very suitable quality control tool for evaluating the efficiency of methods for bacterial enumeration in water samples.

  20. Assessing the accuracy of TDR-based water leak detection system

    NASA Astrophysics Data System (ADS)

    Fatemi Aghda, S. M.; GanjaliPour, K.; Nabiollahi, K.

    2018-03-01

    The use of TDR systems to detect leakage locations in underground pipes has been developed in recent years. In this system, a bi-wire is installed in parallel with the underground pipes and serves as a TDR sensor. This approach overcomes many of the limitations of the traditional acoustic leak-positioning method. TDR-based leak detection is relatively accurate when the TDR sensor is in contact with water at just one point, and researchers have been working to improve the accuracy of this method in recent years. In this study, the ability of the TDR method was evaluated when multiple leakage points appear simultaneously. For this purpose, several laboratory tests were conducted. In these tests, leakage points were simulated by putting the TDR sensor in contact with water at several points, and the number and size of the simulated leakage points were then gradually increased. The results showed that as the number and size of the leakage points increase, the error rate of the TDR-based water leak detection system also increases. Based on the laboratory results, the authors developed a method to improve the accuracy of TDR-based leak detection systems. To do so, they defined a few reference points on the TDR sensor. These points were created by increasing the distance between the two conductors of the TDR sensor and were easily identifiable in the TDR waveform. To calculate the exact distance of the leakage point, the authors developed an equation based on these reference points. The tests were repeated using the TDR sensor with reference points, and a comparison between the results of both tests (with and without reference points) showed that the proposed method and equation can significantly improve the accuracy of positioning the leakage points.
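
    The correction described above amounts to re-mapping apparent TDR distances onto the physical scale defined by the known reference points. The sketch below uses a simple piecewise-linear interpolation with hypothetical positions; it does not reproduce the authors' actual equation.

```python
# Sketch of a piecewise-linear correction of TDR distances using reference
# points of known physical position along the sensor. All numbers are
# hypothetical; this is not the equation developed by the authors.
import numpy as np

apparent_refs = np.array([0.0, 10.4, 20.9, 31.6])   # waveform-derived positions (m)
true_refs = np.array([0.0, 10.0, 20.0, 30.0])       # known physical positions (m)

def corrected_distance(apparent_leak_position):
    """Map an apparent leak position onto the physical scale by interpolating
    between the bracketing reference points."""
    return float(np.interp(apparent_leak_position, apparent_refs, true_refs))

print(corrected_distance(15.6))   # leak between the 2nd and 3rd reference points
```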

  1. In-vitro evaluation of the accuracy of conventional and digital methods of obtaining full-arch dental impressions.

    PubMed

    Ender, Andreas; Mehl, Albert

    2015-01-01

    To investigate the accuracy of conventional and digital impression methods used to obtain full-arch impressions by using an in-vitro reference model. Eight different conventional (polyether, POE; vinylsiloxanether, VSE; direct scannable vinylsiloxanether, VSES; and irreversible hydrocolloid, ALG) and digital (CEREC Bluecam, CER; CEREC Omnicam, OC; Cadent iTero, ITE; and Lava COS, LAV) full-arch impressions were obtained from a reference model with a known morphology, using a highly accurate reference scanner. The impressions obtained were then compared with the original geometry of the reference model and within each test group. A point-to-point measurement of the surface of the model using the signed nearest neighbour method resulted in a mean (10%-90%)/2 percentile value for the difference between the impression and original model (trueness) as well as the difference between impressions within a test group (precision). Trueness values ranged from 11.5 μm (VSE) to 60.2 μm (POE), and precision ranged from 12.3 μm (VSE) to 66.7 μm (POE). Among the test groups, VSE, VSES, and CER showed the highest trueness and precision. The deviation pattern varied with the impression method. Conventional impressions showed high accuracy across the full dental arch in all groups, except POE and ALG. Conventional and digital impression methods show differences regarding full-arch accuracy. Digital impression systems reveal higher local deviations of the full-arch model. Digital intraoral impression systems do not show superior accuracy compared to highly accurate conventional impression techniques. However, they provide excellent clinical results within their indications applying the correct scanning technique.

  2. Edge Detection Method Based on Neural Networks for COMS MI Images

    NASA Astrophysics Data System (ADS)

    Lee, Jin-Ho; Park, Eun-Bin; Woo, Sun-Hee

    2016-12-01

    Communication, Ocean and Meteorological Satellite (COMS) Meteorological Imager (MI) images are processed for radiometric and geometric correction from raw image data. When intermediate image data are matched and compared with reference landmark images in the geometric correction process, various edge detection techniques can be applied. It is essential to have a precise and correct edge image in this process, since its matching with the reference is directly related to the accuracy of the ground station output images. An edge detection method based on neural networks is applied in the ground processing of MI images to obtain sharp edges in the correct positions. The simulation results are analyzed and characterized by comparing them with the results of conventional methods, such as Sobel and Canny filters.
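
    The conventional baselines mentioned above, Sobel and Canny, are available directly in OpenCV and can serve as a comparison for the neural-network detector. The sketch below uses placeholder thresholds and file names.

```python
# Sketch of the conventional edge-detection baselines (Sobel and Canny) using
# OpenCV; thresholds and the input path are placeholders.
import cv2

img = cv2.imread("landmark_chip.png", cv2.IMREAD_GRAYSCALE)

# Sobel gradient magnitude
gx = cv2.Sobel(img, cv2.CV_32F, 1, 0, ksize=3)
gy = cv2.Sobel(img, cv2.CV_32F, 0, 1, ksize=3)
sobel_edges = cv2.convertScaleAbs(cv2.magnitude(gx, gy))

# Canny with hysteresis thresholds
canny_edges = cv2.Canny(img, 50, 150)

cv2.imwrite("edges_sobel.png", sobel_edges)
cv2.imwrite("edges_canny.png", canny_edges)
```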

  3. Validation of a T1 and T2* leakage correction method based on multi-echo DSC-MRI using MION as a reference standard

    PubMed Central

    Stokes, Ashley M.; Semmineh, Natenael; Quarles, C. Chad

    2015-01-01

    Purpose A combined biophysical- and pharmacokinetic-based method is proposed to separate, quantify, and correct for both T1 and T2* leakage effects using dual-echo DSC acquisitions to provide more accurate hemodynamic measures, as validated by a reference intravascular contrast agent (CA). Methods Dual-echo DSC-MRI data were acquired in two rodent glioma models. The T1 leakage effects were removed and also quantified in order to subsequently correct for the remaining T2* leakage effects. Pharmacokinetic, biophysical, and combined biophysical and pharmacokinetic models were used to obtain corrected cerebral blood volume (CBV) and cerebral blood flow (CBF), and these were compared with CBV and CBF from an intravascular CA. Results T1-corrected CBV was significantly overestimated compared to MION CBV, while T1+T2*-correction yielded CBV values closer to the reference values. The pharmacokinetic and simplified biophysical methods showed similar results and underestimated CBV in tumors exhibiting strong T2* leakage effects. The combined method was effective for correcting T1 and T2* leakage effects across tumor types. Conclusions Correcting for both T1 and T2* leakage effects yielded more accurate measures of CBV. The combined correction method yields more reliable CBV measures than either correction method alone, but for certain brain tumor types (e.g., gliomas) the simplified biophysical method may provide a robust and computationally efficient alternative. PMID:26362714

  4. Method for predicting dry mechanical properties from wet wood and standing trees

    DOEpatents

    Meglen, Robert R.; Kelley, Stephen S.

    2003-08-12

    A method for determining the dry mechanical strength of green wood comprising: illuminating a surface of the wood to be evaluated with light between 350-2,500 nm, the wood having a green moisture content; analyzing the surface using a spectrometric method, the method generating first spectral data; and using a multivariate analysis to predict the dry mechanical strength of the green wood when dry by comparing the first spectral data with a calibration model, the calibration model comprising second spectral data obtained by the spectrometric method from a reference wood having a green moisture content, the second spectral data correlated with a known mechanical strength analytical result obtained from the reference wood when dried and having a dry moisture content.
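
    The multivariate calibration described in the patent abstract relates spectra of green wood to the mechanical strength measured after drying. The sketch below uses partial least squares regression on synthetic spectra as one plausible multivariate model; it is an assumption for illustration, not the patented procedure.

```python
# Sketch of a multivariate calibration relating spectra of green wood to the
# strength measured after drying. Spectra and strength values are synthetic,
# and PLS regression is an assumed model choice, not the patented method.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 40, 300            # e.g. 350-2,500 nm resampled
spectra = rng.normal(size=(n_samples, n_wavelengths))
strength = spectra[:, 50] * 3.0 - spectra[:, 120] * 2.0 + rng.normal(scale=0.1, size=n_samples)

model = PLSRegression(n_components=5).fit(spectra[:30], strength[:30])
predicted = model.predict(spectra[30:]).ravel()
print(np.corrcoef(predicted, strength[30:])[0, 1])   # how well the calibration transfers
```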

  5. A comparative proteomics method for multiple samples based on a 18O-reference strategy and a quantitation and identification-decoupled strategy.

    PubMed

    Wang, Hongbin; Zhang, Yongqian; Gui, Shuqi; Zhang, Yong; Lu, Fuping; Deng, Yulin

    2017-08-15

    Comparisons across large numbers of samples are frequently necessary in quantitative proteomics. Many quantitative methods used in proteomics are based on stable isotope labeling, but most of these are only useful for comparing two samples. For up to eight samples, the iTRAQ labeling technique can be used. For greater numbers of samples, the label-free method has been used, but this method was criticized for low reproducibility and accuracy. An ingenious strategy has been introduced, comparing each sample against an 18O-labeled reference sample that was created by pooling equal amounts of all samples. However, it is necessary to use proportion-known protein mixtures to investigate and evaluate this new strategy. Another problem for comparative proteomics of multiple samples is the poor coincidence and reproducibility in protein identification results across samples. In the present study, a method combining an 18O-reference strategy and a quantitation and identification-decoupled strategy was investigated with proportion-known protein mixtures. The results clearly demonstrated that the 18O-reference strategy had greater accuracy and reliability than other previously used comparison methods based on transferring comparison or label-free strategies. By the decoupling strategy, the quantification data acquired by LC-MS and the identification data acquired by LC-MS/MS are matched and correlated to identify differentially expressed proteins, according to retention time and accurate mass. This strategy made protein identification possible for all samples using a single pooled sample, and therefore gave good reproducibility in protein identification across multiple samples, and allowed peptide identification to be optimized separately so as to identify more proteins. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. What Is the Reference? An Examination of Alternatives to the Reference Sources Used in IES TM-30-15

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Royer, Michael P.

    A study was undertaken to document the role of the reference illuminant in the IES TM-30-15 method for evaluating color rendition. TM-30-15 relies on a relative reference scheme; that is, the reference illuminant and test source always have the same correlated color temperature (CCT). The reference illuminant is a Planckian radiator, model of daylight, or combination of those two, depending on the exact CCT of the test source. Three alternative reference schemes were considered: 1) either using all Planckian radiators or all daylight models; 2) using only one of ten possible illuminants (Planckian, daylight, or equal energy), regardless of the CCT of the test source; 3) using an off-Planckian reference illuminant (i.e., a source with a negative Duv). No reference scheme is inherently superior to another, with differences in metric values largely a result of small differences in gamut shape of the reference alternatives. While using any of the alternative schemes is more reasonable in the TM-30-15 evaluation framework than it was with the CIE CRI framework, the differences still ultimately manifest only as changes in interpretation of the results. References are employed in color rendering measures to provide a familiar point of comparison, not to establish an ideal source.

  7. Sports Training Support Method by Self-Coaching with Humanoid Robot

    NASA Astrophysics Data System (ADS)

    Toyama, S.; Ikeda, F.; Yasaka, T.

    2016-09-01

    This paper proposes a new training support method called self-coaching with humanoid robots. In the proposed method, two small, inexpensive humanoid robots are used because of their availability. One robot, called the target robot, reproduces the motion of a target player, and the other, called the reference robot, reproduces the motion of an expert player. The target player can recognize the target technique from the reference robot and his/her inadequate skill from the target robot. By modifying the motion of the target robot as a form of self-coaching, the target player can gain a deeper understanding of his/her own technique. Experimental results show the potential of the new training method, as well as some issues with the self-coaching interface program to be addressed in future work.

  8. A comparison theorem for the SOR iterative method

    NASA Astrophysics Data System (ADS)

    Sun, Li-Ying

    2005-09-01

    In 1997, Kohno et al. reported numerically that the improving modified Gauss-Seidel method, referred to as the IMGS method, is superior to the SOR iterative method. In this paper, we prove that the spectral radius of the IMGS method is smaller than that of the SOR method and the Gauss-Seidel method if the relaxation parameter ω ∈ (0, 1]. As a result, we prove theoretically that this method succeeds in improving the convergence of some classical iterative methods. Some recent results are improved.
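
    The comparison in the abstract concerns spectral radii of iteration matrices. The sketch below simply evaluates the spectral radii of the Gauss-Seidel and SOR iteration matrices for a small test system and a relaxation parameter in (0, 1]; the IMGS construction itself is not reproduced.

```python
# Numerical sketch: spectral radii of the Gauss-Seidel and SOR iteration
# matrices for a small test matrix (the IMGS preconditioning itself is not
# reproduced here).
import numpy as np

def spectral_radius(T):
    return max(abs(np.linalg.eigvals(T)))

A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])

D = np.diag(np.diag(A))
L = -np.tril(A, -1)           # splitting A = D - L - U
U = -np.triu(A, 1)

T_gs = np.linalg.inv(D - L) @ U
omega = 0.9
T_sor = np.linalg.inv(D - omega * L) @ ((1.0 - omega) * D + omega * U)

print(f"rho(Gauss-Seidel) = {spectral_radius(T_gs):.4f}")
print(f"rho(SOR, omega={omega}) = {spectral_radius(T_sor):.4f}")
```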

  9. Preparation of canine C-reactive protein serum reference material: A feasibility study.

    PubMed

    Canalias, Francesca; Piñeiro, Matilde; Pato, Raquel; Peña, Raquel; Bosch, Lluís; Soler, Lourdes; García, Natalia; Lampreave, Fermín; Saco, Yolanda; Bassols, Anna

    2018-03-01

    The availability of a species-specific reference material is essential for the harmonization of results obtained in different laboratories by different methods. We describe the preparation of a canine C-reactive protein (cCRP) serum reference material containing purified cCRP stabilized in a serum matrix. The material can be used by manufacturers to assign values to their calibrator and control materials. The serum matrix was obtained using blood collected from healthy dogs, stabilized, and subjected to a delipidation process. The reference material was prepared by diluting purified cCRP in the serum matrix containing 1.0 mol/L HEPES buffer, 3.0 mmol/L calcium chloride, 80,000 kUI/L aprotinin, and 1.0 mmol/L benzamidine hydrochloride monohydrate at a pH of 7.2, and dispensing the matrix (0.5 mL) into vials that were then frozen. The pilot batch of 200 vials was shown to be homogeneous and stable after a stability study at various temperatures and over a total time of 110 days. The prepared material was submitted to a value-assignment study. Eight laboratories from different European countries participated by using the same reagents for an immunoturbidimetric method adapted for different analyzers. The obtained cCRP concentration in the reference material was 78.5 mg/L with an expanded uncertainty (k = 2) of 4.2 mg/L. Canine C-reactive protein serum reference material has been produced that allows harmonization of results obtained by different methods and different laboratories, thus reducing the possibility of errors and misunderstandings. © 2018 American Society for Veterinary Clinical Pathology.

  10. Quantification of drugs in plasma without primary reference standards by liquid chromatography-chemiluminescence nitrogen detection: application to tramadol metabolite ratios.

    PubMed

    Ojanperä, Suvi; Rasanen, Ilpo; Sistonen, Johanna; Pelander, Anna; Vuori, Erkki; Ojanperä, Ilkka

    2007-08-01

    Lack of availability of reference standards for drug metabolites, newly released drugs, and illicit drugs hinders the analysis of these substances in biologic samples. To counter this problem, an approach is presented here for quantitative drug analysis in plasma without primary reference standards by liquid chromatography-chemiluminescence nitrogen detection (LC-CLND). To demonstrate the feasibility of the method, metabolic ratios of the opioid drug tramadol were determined in the setting of a pharmacogenetic study. Four volunteers were given a single 100-mg oral dose of tramadol, and a blood sample was collected from each subject 1 hour later. Tramadol, O-desmethyltramadol, and nortramadol were determined in plasma by LC-CLND without reference standards and by a gas chromatography-mass spectrometry reference method. In contrast to previous CLND studies lacking an extraction step, a liquid-liquid extraction system was created for 5-mL plasma samples using n-butyl chloride-isopropyl alcohol (98 + 2) at pH 10. Extraction recovery estimation was based on model compounds chosen according to their similar physicochemical characteristics (retention time, pKa, logD). Instrument calibration was performed with a single secondary standard (caffeine) using the equimolar response of the detector to nitrogen. The mean differences between the results of the LC-CLND and gas chromatography-mass spectrometry methods for tramadol, O-desmethyltramadol, and nortramadol were 8%, 32%, and 19%, respectively. The sensitivity of LC-CLND was sufficient for therapeutic concentrations of tramadol and metabolites. A good correlation was obtained between genotype, expressed by the number of functional genes, and the plasma metabolite ratios. This experiment suggests that a recovery-corrected LC-CLND analysis produces sufficiently accurate results to be useful in a clinical context, particularly in instances in which reference standards are not readily accessible.

  11. Validating internal controls for quantitative plant gene expression studies.

    PubMed

    Brunner, Amy M; Yakovlev, Igor A; Strauss, Steven H

    2004-08-18

    Real-time reverse transcription PCR (RT-PCR) has greatly improved the ease and sensitivity of quantitative gene expression studies. However, accurate measurement of gene expression with this method relies on the choice of a valid reference for data normalization. Studies rarely verify that gene expression levels for reference genes are adequately consistent among the samples used, nor compare alternative genes to assess which are most reliable for the experimental conditions analyzed. Using real-time RT-PCR to study the expression of 10 poplar (genus Populus) housekeeping genes, we demonstrate a simple method for determining the degree of stability of gene expression over a set of experimental conditions. Based on a traditional method for analyzing the stability of varieties in plant breeding, it defines measures of gene expression stability from analysis of variance (ANOVA) and linear regression. We found that the potential internal control genes differed widely in their expression stability over the different tissues, developmental stages and environmental conditions studied. Our results support that quantitative comparisons of candidate reference genes are an important part of real-time RT-PCR studies that seek to precisely evaluate variation in gene expression. The method we demonstrated facilitates statistical and graphical evaluation of gene expression stability. Selection of the best reference gene for a given set of experimental conditions should enable detection of biologically significant changes in gene expression that are too small to be revealed by less precise methods, or when highly variable reference genes are unknowingly used in real-time RT-PCR experiments.

  12. Towards absolute quantification of allergenic proteins in food--lysozyme in wine as a model system for metrologically traceable mass spectrometric methods and certified reference materials.

    PubMed

    Cryar, Adam; Pritchard, Caroline; Burkitt, William; Walker, Michael; O'Connor, Gavin; Burns, Duncan Thorburn; Quaglia, Milena

    2013-01-01

    Current routine food allergen quantification methods, which are based on immunochemistry, offer high sensitivity but can suffer from issues of specificity and significant variability of results. MS approaches have been developed but currently lack metrological traceability. A feasibility study on the application of metrologically traceable MS-based reference procedures was therefore undertaken as a proof of concept, involving proteolytic digestion and isotope dilution MS for quantification of protein allergens in a food matrix, with lysozyme in wine as a model system. A concentration of lysozyme in wine of 0.95 ± 0.03 μg/g was calculated based on the concentrations of two peptides, confirming that this type of analysis is viable at allergenically meaningful concentrations. The challenges associated with this promising method were explored; these included peptide stability, chemical modification, enzymatic digestion, and sample cleanup. The method is suitable for the production of allergen-in-food certified reference materials, which, together with the achieved understanding of the effects of sample preparation and of the matrix on the final results, will assist in addressing the bias of the techniques routinely used and improve measurement confidence. Confirmation of the feasibility of MS methods for absolute quantification of an allergenic protein in a food matrix, with results traceable to the International System of Units, is a step towards meaningful comparison of results for allergen proteins among laboratories. This approach will also underpin risk assessment and risk management of allergens in the food industry, and regulatory compliance with the use of thresholds or action levels when adopted.

  13. Guidelines for the detection of Trichinella larvae at the slaughterhouse in a quality assurance system.

    PubMed

    Rossi, Patrizia; Pozio, Edoardo

    2008-01-01

    The European Community Regulation (EC) No. 2075/2005 lays down specific rules on official controls for the detection of Trichinella in fresh meat for human consumption, recommending the pooled-sample digestion method as the reference method. The aim of this document is to provide specific guidance to implement an appropriate Trichinella digestion method by a laboratory accredited according to the ISO/IEC 17025:2005 international standard, and performing microbiological testing following the EA-04/10:2002 international guideline. Technical requirements for the correct implementation of the method, such as the personnel competence, specific equipments and reagents, validation of the method, reference materials, sampling, quality assurance of results and quality control of performance are provided, pointing out the critical control points for the correct implementation of the digestion method.

  14. Diagnostic Performance of a Molecular Test versus Clinician Assessment of Vaginitis

    PubMed Central

    Gaydos, Charlotte A.; Nyirjesy, Paul; Paradis, Sonia; Kodsi, Salma; Cooper, Charles K.

    2018-01-01

    ABSTRACT Vaginitis is a common complaint, diagnosed either empirically or using Amsel's criteria and wet mount microscopy. This study sought to determine characteristics of an investigational test (a molecular test for vaginitis), compared to reference methods, for detection of bacterial vaginosis, Candida spp., and Trichomonas vaginalis. Vaginal specimens from a cross-sectional study were obtained from 1,740 women (≥18 years old) with vaginitis symptoms during routine clinic visits (across 10 sites in the United States). Specimens were analyzed using a commercial PCR/fluorogenic probe-based investigational test that detects bacterial vaginosis, Candida spp., and Trichomonas vaginalis. Clinician diagnosis and in-clinic testing (Amsel's test, potassium hydroxide preparation, and wet mount) were also employed to detect the three vaginitis causes. All testing methods were compared to the respective reference methods (Nugent Gram stain for bacterial vaginosis, detection of the Candida ITS2 gene, and Trichomonas vaginalis culture). The investigational test, clinician diagnosis, and in-clinic testing were compared to reference methods for bacterial vaginosis, Candida spp., and Trichomonas vaginalis. The investigational test resulted in significantly higher sensitivity and negative predictive value than clinician diagnosis or in-clinic testing. In addition, the investigational test showed a statistically higher overall percent agreement with each of the three reference methods than did clinician diagnosis or in-clinic testing. The investigational test also showed significantly higher sensitivity for detecting vaginitis involving more than one cause than did clinician diagnosis. Taken together, these results suggest that a molecular investigational test can facilitate accurate detection of vaginitis. PMID:29643195

  15. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; determination of dissolved arsenic, boron, lithium, selenium, strontium, thallium, and vanadium using inductively coupled plasma-mass spectrometry

    USGS Publications Warehouse

    Garbarino, John R.

    1999-01-01

    The inductively coupled plasma-mass spectrometric (ICP-MS) methods have been expanded to include the determination of dissolved arsenic, boron, lithium, selenium, strontium, thallium, and vanadium in filtered, acidified natural water. Method detection limits for these elements are now 10 to 200 times lower than by former U.S. Geological Survey (USGS) methods, thus providing lower variability at ambient concentrations. The bias and variability of the method were determined by using results from spike recoveries, standard reference materials, and validation samples. Spike recoveries at 5 to 10 times the method detection limit and 75 micrograms per liter in reagent-water, surface-water, and groundwater matrices averaged 93 percent for seven replicates, although selected elemental recoveries in a ground-water matrix with an extremely high iron sulfate concentration were negatively biased by 30 percent. Results for standard reference materials were within 1 standard deviation of the most probable value. Statistical analysis of the results from about 60 filtered, acidified natural-water samples indicated that there was no significant difference between ICP-MS and former USGS official methods of analysis.

  16. Optimisation and validation of a rapid and efficient microemulsion liquid chromatographic (MELC) method for the determination of paracetamol (acetaminophen) content in a suppository formulation.

    PubMed

    McEvoy, Eamon; Donegan, Sheila; Power, Joe; Altria, Kevin

    2007-05-09

    A rapid and efficient oil-in-water microemulsion liquid chromatographic method has been optimised and validated for the analysis of paracetamol in a suppository formulation. Excellent linearity, accuracy, precision and assay results were obtained. Lengthy sample pre-treatment/extraction procedures were eliminated due to the solubilising power of the microemulsion, and rapid analysis times were achieved. The method was optimised to achieve rapid analysis time and relatively high peak efficiencies. A standard microemulsion composition of 33 g SDS, 66 g butan-1-ol, 8 g n-octane in 1 L of 0.05% TFA modified with acetonitrile has been shown to be suitable for the rapid analysis of paracetamol in highly hydrophobic preparations under isocratic conditions. Validated assay results and the overall analysis time of the optimised method were compared to British Pharmacopoeia reference methods. Sample preparation and analysis times for the MELC analysis of paracetamol in a suppository were extremely rapid compared to the reference method, and similar assay results were achieved. A gradient MELC method using the same microemulsion has been optimised for the resolution of paracetamol and five of its related substances in approximately 7 min.

  17. Normative Databases for Imaging Instrumentation

    PubMed Central

    Realini, Tony; Zangwill, Linda; Flanagan, John; Garway-Heath, David; Patella, Vincent Michael; Johnson, Chris; Artes, Paul; Ben Gaddie, I.; Fingeret, Murray

    2015-01-01

    Purpose To describe the process by which imaging devices undergo reference database development and regulatory clearance. The limitations and potential improvements of reference (normative) data sets for ophthalmic imaging devices will be discussed. Methods A symposium was held in July 2013 in which a series of speakers discussed issues related to the development of reference databases for imaging devices. Results Automated imaging has become widely accepted and used in glaucoma management. The ability of such instruments to discriminate healthy from glaucomatous optic nerves, and to detect glaucomatous progression over time is limited by the quality of reference databases associated with the available commercial devices. In the absence of standardized rules governing the development of reference databases, each manufacturer’s database differs in size, eligibility criteria, and ethnic make-up, among other key features. Conclusions The process for development of imaging reference databases may be improved by standardizing eligibility requirements and data collection protocols. Such standardization may also improve the degree to which results may be compared between commercial instruments. PMID:25265003

  18. Accuracy of Referring Provider and Endoscopist Impressions of Colonoscopy Indication.

    PubMed

    Naveed, Mariam; Clary, Meredith; Ahn, Chul; Kubiliun, Nisa; Agrawal, Deepak; Cryer, Byron; Murphy, Caitlin; Singal, Amit G

    2017-07-01

    Background: Referring provider and endoscopist impressions of colonoscopy indication are used for clinical care, reimbursement, and quality reporting decisions; however, the accuracy of these impressions is unknown. This study assessed the sensitivity, specificity, positive and negative predictive value, and overall accuracy of methods to classify colonoscopy indication, including referring provider impression, endoscopist impression, and administrative algorithm compared with gold standard chart review. Methods: We randomly sampled 400 patients undergoing a colonoscopy at a Veterans Affairs health system between January 2010 and December 2010. Referring provider and endoscopist impressions of colonoscopy indication were compared with gold-standard chart review. Indications were classified into 4 mutually exclusive categories: diagnostic, surveillance, high-risk screening, or average-risk screening. Results: Of 400 colonoscopies, 26% were performed for average-risk screening, 7% for high-risk screening, 26% for surveillance, and 41% for diagnostic indications. Accuracy of referring provider and endoscopist impressions of colonoscopy indication were 87% and 84%, respectively, which were significantly higher than that of the administrative algorithm (45%; P <.001 for both). There was substantial agreement between endoscopist and referring provider impressions (κ=0.76). All 3 methods showed high sensitivity (>90%) for determining screening (vs nonscreening) indication, but specificity of the administrative algorithm was lower (40.3%) compared with referring provider (93.7%) and endoscopist (84.0%) impressions. Accuracy of endoscopist, but not referring provider, impression was lower in patients with a family history of colon cancer than in those without (65% vs 84%; P =.001). Conclusions: Referring provider and endoscopist impressions of colonoscopy indication are both accurate and may be useful data to incorporate into algorithms classifying colonoscopy indication. Copyright © 2017 by the National Comprehensive Cancer Network.

  19. An improved artifact removal in exposure fusion with local linear constraints

    NASA Astrophysics Data System (ADS)

    Zhang, Hai; Yu, Mali

    2018-04-01

    In exposure fusion, it is challenging to remove artifacts caused by camera motion and moving objects in the scene. An improved artifact removal method is proposed in this paper, which performs local linear adjustment during the artifact removal process. After determining a reference image, we first perform high-dynamic-range (HDR) deghosting to generate an intermediate image stack from the input image stack. Then, a linear intensity mapping function (IMF) is extracted in each window, based on the intensities of the intermediate and reference images and on the intensity mean and variance of the reference image. Finally, with the extracted local linear constraints, we reconstruct a target image stack that can be directly used to fuse a single HDR-like image. Experimental results demonstrate that the proposed method is robust and effective in removing artifacts, especially in the saturated regions of the reference image.
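
    The local linear constraint described above can be estimated per window as a least-squares line mapping intermediate-image intensities onto the reference image. The sketch below is a simplified illustration with an arbitrary window size; it omits the paper's use of the reference mean and variance in the exact formulation.

```python
# Sketch of fitting a linear intensity mapping function (IMF) per window,
# mapping intermediate-image intensities onto the reference image. This is a
# simplified illustration, not the paper's exact local linear model.
import numpy as np

def local_linear_imf(intermediate, reference, win=16):
    """Per-window (a, b) such that reference ~= a * intermediate + b."""
    h, w = reference.shape
    coeffs = {}
    for i in range(0, h - win + 1, win):
        for j in range(0, w - win + 1, win):
            x = intermediate[i:i + win, j:j + win].astype(np.float64).ravel()
            y = reference[i:i + win, j:j + win].astype(np.float64).ravel()
            a, b = np.polyfit(x, y, 1)        # least-squares line in this window
            coeffs[(i, j)] = (a, b)
    return coeffs

# Illustrative synthetic images
rng = np.random.default_rng(1)
ref = rng.integers(0, 255, size=(64, 64)).astype(np.uint8)
inter = np.clip(0.8 * ref + 10 + rng.normal(0, 2, ref.shape), 0, 255).astype(np.uint8)
print(local_linear_imf(inter, ref)[(0, 0)])   # roughly (1.25, -12.5)
```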

  20. Linking in situ LAI and fine resolution remote sensing data to map reference LAI over cropland and grassland using geostatistical regression method

    NASA Astrophysics Data System (ADS)

    He, Yaqian; Bo, Yanchen; Chai, Leilei; Liu, Xiaolong; Li, Aihua

    2016-08-01

    Leaf Area Index (LAI) is an important parameter of vegetation structure. A number of moderate-resolution LAI products have been produced in response to the urgent need for large-scale vegetation monitoring, and high-resolution LAI reference maps are necessary to validate these products. This study used a geostatistical regression (GR) method to estimate LAI reference maps by linking in situ LAI with Landsat TM/ETM+ and SPOT-HRV data over two cropland and two grassland sites. To explore the discrepancies arising from employing different vegetation indices (VIs), this study established GR models for the difference vegetation index (DVI), normalized difference vegetation index (NDVI), and ratio vegetation index (RVI). To further assess the performance of the GR model, the results from the GR and Reduced Major Axis (RMA) models were compared. The results show that the performance of the GR model varies between the cropland and grassland sites: at the cropland sites, the GR model based on DVI provides the best estimation, while at the grassland sites the GR model based on DVI performs poorly. Compared to the RMA model, the GR model improves the accuracy of the reference LAI maps in terms of root mean square error (RMSE) and bias.

  1. Using bibliometrics to demonstrate the value of library journal collections

    PubMed Central

    Belter, Christopher W; Kaske, Neal K

    2016-01-01

    Although cited reference studies are common in the library and information science literature, they are rarely performed in non-academic institutions or in the atmospheric and oceanic sciences. In this paper, we analyze over 400,000 cited references made by authors affiliated with the National Oceanic and Atmospheric Administration between 2009 and 2013. Our results suggest that these methods can be applied to research libraries in a variety of institutions, that the results of analyses performed at one institution may not be applicable to other institutions, and that cited reference analyses should be periodically updated to reflect changes in authors’ referencing behavior. PMID:27453584

  2. Effect of defuzzification method on fuzzy modeling

    NASA Astrophysics Data System (ADS)

    Lapohos, Tibor; Buchal, Ralph O.

    1994-10-01

    Imprecision can arise in fuzzy relational modeling as a result of fuzzification, inference and defuzzification. These three sources of imprecision are difficult to separate. We have determined through numerical studies that an important source of imprecision is the defuzzification stage, and this imprecision adversely affects the quality of the model output. The most widely used defuzzification algorithm is known as 'center of area' (COA) or 'center of gravity' (COG). In this paper, we show that this algorithm not only maps the near-limit values of the variables improperly but also introduces errors for middle-domain values of the same variables. Furthermore, the behavior of this algorithm depends on the shape of the reference sets. We compare the COA method to the weighted average of cluster centers (WACC) procedure, in which the transformation is carried out based on the values of the cluster centers belonging to each of the reference membership functions instead of the functions themselves. We show that this procedure is more effective and computationally much faster than the COA. The method is tested for a family of reference sets satisfying certain constraints: for any support value the sum of the reference membership function values equals one, and the peak values of the two marginal membership functions project to the boundaries of the universe of discourse. For all member sets of this family of reference sets, the defuzzification errors do not grow as the linguistic variables tend toward their extreme values. In addition, the more reference sets defined for a given linguistic variable, the smaller the average defuzzification error becomes. In the case of triangle-shaped reference sets there is no defuzzification error at all. Finally, an alternative solution is provided that improves the performance of the COA method.
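
    The two defuzzification schemes compared here can be sketched as follows (a simplified illustration on a discretised universe of discourse with triangular reference sets; the toy firing degrees are invented and this is not the authors' implementation):

    ```python
    import numpy as np

    def coa_defuzzify(x, memberships, activations):
        """Center-of-area (COA/COG): centroid of the clipped, aggregated fuzzy output."""
        # memberships: (n_sets, n_points) reference membership functions on grid x
        # activations: (n_sets,) firing degrees of the rules
        clipped = np.minimum(memberships, activations[:, None])
        aggregated = clipped.max(axis=0)
        return np.sum(x * aggregated) / np.sum(aggregated)

    def wacc_defuzzify(centers, activations):
        """Weighted average of cluster centers: uses only the set centers."""
        activations = np.asarray(activations, dtype=float)
        return np.sum(centers * activations) / np.sum(activations)

    # toy example: three triangular sets with centers 0.0, 0.5, 1.0 on [0, 1]
    x = np.linspace(0.0, 1.0, 201)
    centers = np.array([0.0, 0.5, 1.0])
    tri = lambda c, w: np.clip(1.0 - np.abs(x - c) / w, 0.0, None)
    memberships = np.vstack([tri(c, 0.5) for c in centers])
    activations = np.array([0.2, 0.7, 0.1])
    print(coa_defuzzify(x, memberships, activations), wacc_defuzzify(centers, activations))
    ```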

  3. Determination of serum calcium levels by 42Ca isotope dilution inductively coupled plasma mass spectrometry.

    PubMed

    Han, Bingqing; Ge, Menglei; Zhao, Haijian; Yan, Ying; Zeng, Jie; Zhang, Tianjiao; Zhou, Weiyan; Zhang, Jiangtao; Wang, Jing; Zhang, Chuanbao

    2017-11-27

    Serum calcium level is an important clinical index that reflects pathophysiological states. However, detection accuracy in laboratory tests is not ideal; as such, a high accuracy method is needed. We developed a reference method for measuring serum calcium levels by isotope dilution inductively coupled plasma mass spectrometry (ID ICP-MS), using 42Ca as the enriched isotope. Serum was digested with 69% ultrapure nitric acid and diluted to a suitable concentration. The 44Ca/42Ca ratio was detected in H2 mode; spike concentration was calibrated by reverse IDMS using standard reference material (SRM) 3109a, and sample concentration was measured by a bracketing procedure. We compared the performance of ID ICP-MS with those of three other reference methods in China using the same serum and aqueous samples. The relative expanded uncertainty of the sample concentration was 0.414% (k=2). The range of repeatability (within-run imprecision), intermediate imprecision (between-run imprecision), and intra-laboratory imprecision were 0.12%-0.19%, 0.07%-0.09%, and 0.16%-0.17%, respectively, for two of the serum samples. SRM909bI, SRM909bII, SRM909c, and GBW09152 were found to be within the certified value interval, with mean relative bias values of 0.29%, -0.02%, 0.10%, and -0.19%, respectively. The range of recovery was 99.87%-100.37%. Results obtained by ID ICP-MS showed a better accuracy than and were highly correlated with those of other reference methods. ID ICP-MS is a simple and accurate candidate reference method for serum calcium measurement and can be used to establish and improve serum calcium reference system in China.
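
    The principle behind the isotope dilution measurement can be written as a single mass balance on the 44Ca/42Ca ratio of the sample-spike blend; the sketch below is a generic, textbook-style illustration with made-up spike data (it is not the authors' working equation, and the abundances shown are approximate, not certified values):

    ```python
    def idms_amount(r_blend, n_spike, a44_nat, a42_nat, a44_spike, a42_spike):
        """Amount of natural Ca (mol) from the measured 44/42 ratio of the blend.

        Mass balance: r_blend = (a44_nat*n_x + a44_spike*n_spike)
                              / (a42_nat*n_x + a42_spike*n_spike),
        solved for n_x.
        """
        return n_spike * (a44_spike - r_blend * a42_spike) / (r_blend * a42_nat - a44_nat)

    # purely illustrative numbers (not certified abundances or real measurements)
    n_x = idms_amount(r_blend=1.05, n_spike=1.0e-6,
                      a44_nat=0.0209, a42_nat=0.00647,   # approximate natural abundances
                      a44_spike=0.005, a42_spike=0.95)   # hypothetical 42Ca-enriched spike
    print(n_x)
    ```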

  4. Defining the Reference Condition for Wadeable Streams in the Sand Hills Subdivision of the Southeastern Plains Ecoregion, USA

    NASA Astrophysics Data System (ADS)

    Kosnicki, Ely; Sefick, Stephen A.; Paller, Michael H.; Jarrell, Miller S.; Prusha, Blair A.; Sterrett, Sean C.; Tuberville, Tracey D.; Feminella, Jack W.

    2014-09-01

    The Sand Hills subdivision of the Southeastern Plains ecoregion has been impacted by historical land uses over the past two centuries and, with the additive effects of contemporary land use, determining reference condition for streams in this region is a challenge. We identified reference condition based on the combined use of 3 independent selection methods. Method 1 involved use of a multivariate disturbance gradient derived from several stressors, method 2 was based on variation in channel morphology, and method 3 was based on passing 6 of 7 environmental criteria. Sites selected as reference from all 3 methods were considered primary reference, whereas those selected by 2 or 1 methods were considered secondary or tertiary reference, respectively. Sites not selected by any of the methods were considered non-reference. In addition, best professional judgment (BPJ) was used to exclude some sites from any reference class, and comparisons were made to examine the utility of BPJ. Non-metric multidimensional scaling indicated that use of BPJ may help designate non-reference sites when unidentified stressors are present. The macroinvertebrate community measures Ephemeroptera, Plecoptera, Trichoptera richness and North Carolina Biotic Index showed no differences between primary and secondary reference sites when BPJ was ignored. However, there was no significant difference among primary, secondary, and tertiary reference sites when BPJ was used. We underscore the importance of classifying reference conditions, especially in regions that have endured significant anthropogenic activity. We suggest that the use of secondary reference sites may enable construction of models that target a broader set of management interests.

  5. Are we using the appropriate reference samples to develop juvenile age estimation methods based on bone size? An exploration of growth differences between average children and those who become victims of homicide.

    PubMed

    Spake, Laure; Cardoso, Hugo F V

    2018-01-01

    The population on which forensic juvenile skeletal age estimation methods are applied has not been critically considered. Previous research suggests that child victims of homicide tend to be from socioeconomically disadvantaged contexts, and that these contexts impair linear growth. This study investigates whether juvenile skeletal remains examined by forensic anthropologists are short for age compared to their normal healthy peers. Cadaver lengths were obtained from records of autopsies of 1256 individuals, aged birth to eighteen years at death, conducted between 2000 and 2015 in Australia, New Zealand, and the U.S. The growth status of the forensic population (represented by homicide victims) and of the general population (represented by accident victims) was compared using height-for-age Z-scores and independent-sample t-tests. Cadaver lengths of the accident victims were compared to growth references using one-sample t-tests to evaluate whether accident victims reflect the general population. Homicide victims are shorter for age than accident victims in samples from the U.S., but not in Australia and New Zealand. Accident victims are more representative of the general population in Australia and New Zealand. The differing results in Australia and New Zealand as opposed to the U.S. may be linked to socioeconomic inequality. These results suggest that physical anthropologists should critically select reference samples when devising forensic juvenile skeletal age estimation methods. Children examined in forensic investigations may be short for age, and thus methods developed on normal healthy children may yield inaccurate results. A healthy reference population may not necessarily constitute an appropriate growth comparison for the forensic anthropology population. Copyright © 2017 Elsevier B.V. All rights reserved.
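
    The group comparison described above amounts to computing height-for-age Z-scores against an external growth reference and applying t-tests; a minimal sketch follows (all heights and reference means/SDs are placeholders):

    ```python
    import numpy as np
    from scipy import stats

    def height_for_age_z(height_cm, ref_mean_cm, ref_sd_cm):
        """Height-for-age Z-score against an external growth reference."""
        return (np.asarray(height_cm) - np.asarray(ref_mean_cm)) / np.asarray(ref_sd_cm)

    # placeholder data: cadaver lengths with matched reference means/SDs for each child
    homicide_haz = height_for_age_z([118.0, 102.5, 131.0], [123.0, 106.0, 133.0], [5.5, 4.8, 6.0])
    accident_haz = height_for_age_z([125.0, 108.0, 135.5], [123.0, 106.0, 133.0], [5.5, 4.8, 6.0])

    # independent-samples t-test between the two groups
    t, p = stats.ttest_ind(homicide_haz, accident_haz, equal_var=True)
    # one-sample t-test of accident victims' HAZ against the reference mean of 0
    t1, p1 = stats.ttest_1samp(accident_haz, 0.0)
    print(t, p, t1, p1)
    ```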

  6. Shortwave Radiometer Calibration Methods Comparison and Resulting Solar Irradiance Measurement Differences: A User Perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habte, Aron; Sengupta, Manajit; Andreas, Afshin

    Banks financing solar energy projects require assurance that these systems will produce the energy predicted. Furthermore, utility planners and grid system operators need to understand the impact of the variable solar resource on solar energy conversion system performance. Accurate solar radiation data sets reduce the expense associated with mitigating performance risk and assist in understanding the impacts of solar resource variability. The accuracy of solar radiation measured by radiometers depends on the instrument performance specification, installation method, calibration procedure, measurement conditions, maintenance practices, location, and environmental conditions. This study addresses the effect of different calibration methods provided by radiometric calibration service providers, such as NREL and manufacturers of radiometers, on the resulting calibration responsivity. Some of these radiometers are calibrated indoors and some outdoors. To establish or understand the differences in calibration methodology, we processed and analyzed field-measured data from these radiometers. This study investigates calibration responsivities provided by NREL's broadband outdoor radiometer calibration (BORCAL) and a few prominent manufacturers. The BORCAL method provides the outdoor calibration responsivity of pyranometers and pyrheliometers at 45 degree solar zenith angle, and as a function of solar zenith angle determined by clear-sky comparisons with reference irradiance. The BORCAL method also employs a thermal offset correction to the calibration responsivity of single-black thermopile detectors used in pyranometers. Indoor calibrations of radiometers by their manufacturers are performed using a stable artificial light source in a side-by-side comparison between the test radiometer under calibration and a reference radiometer of the same type. In both methods, the reference radiometer calibrations are traceable to the World Radiometric Reference (WRR). These different methods of calibration demonstrated +1% to +2% differences in solar irradiance measurement. Analyzing these differences will ultimately help determine the uncertainty of the field radiometer data and guide the development of a consensus standard for calibration. Further advancing procedures for precisely calibrating radiometers to world reference standards that reduce measurement uncertainty will allow more accurate prediction of solar output and improve the bankability of solar projects.

  7. Selection of reference genes for miRNA qRT-PCR under abiotic stress in grapevine.

    PubMed

    Luo, Meng; Gao, Zhen; Li, Hui; Li, Qin; Zhang, Caixi; Xu, Wenping; Song, Shiren; Ma, Chao; Wang, Shiping

    2018-03-13

    Grapevine is among the fruit crops with high economic value, and because of the economic losses caused by abiotic stresses, the stress resistance of Vitis vinifera has become an increasingly important research area. Among the mechanisms responding to environmental stresses, the role of miRNA has received much attention recently. qRT-PCR is a powerful method for miRNA quantitation, but the accuracy of the method strongly depends on the appropriate reference genes. To determine the most suitable reference genes for grapevine miRNA qRT-PCR, 15 genes were chosen as candidate reference genes. After eliminating 6 candidate reference genes with unsatisfactory amplification efficiency, the expression stability of the remaining candidate reference genes under salinity, cold and drought was analysed using four algorithms, geNorm, NormFinder, deltaCt and Bestkeeper. The results indicated that U6 snRNA was the most suitable reference gene under salinity and cold stresses; whereas miR168 was the best for drought stress. The best reference gene sets for salinity, cold and drought stresses were miR160e + miR164a, miR160e + miR168 and ACT + UBQ + GAPDH, respectively. The selected reference genes or gene sets were verified using miR319 or miR408 as the target gene.
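
    As an illustration of one of the four algorithms, the comparative deltaCt approach ranks each candidate by the mean standard deviation of its Ct difference against every other candidate across samples (the Ct values and gene labels below are invented):

    ```python
    import numpy as np

    def delta_ct_stability(ct, gene_names):
        """Rank candidate reference genes by mean SD of pairwise delta-Ct values.

        ct: array of shape (n_samples, n_genes) with raw Ct values.
        A lower mean SD indicates a more stable candidate.
        """
        n_genes = ct.shape[1]
        mean_sd = []
        for i in range(n_genes):
            sds = [np.std(ct[:, i] - ct[:, j], ddof=1)
                   for j in range(n_genes) if j != i]
            mean_sd.append(np.mean(sds))
        order = np.argsort(mean_sd)
        return [(gene_names[k], mean_sd[k]) for k in order]

    # hypothetical Ct values for 4 candidates across 6 stress/control samples
    ct = np.array([[18.1, 22.3, 25.0, 19.8],
                   [18.4, 22.9, 25.2, 20.5],
                   [18.0, 22.1, 24.9, 19.6],
                   [18.6, 23.4, 25.3, 21.0],
                   [18.2, 22.5, 25.1, 20.0],
                   [18.3, 22.7, 25.0, 20.2]])
    print(delta_ct_stability(ct, ["U6", "miR168", "ACT", "GAPDH"]))
    ```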

  8. Optimization of GPS water vapor tomography technique with radiosonde and COSMIC historical data

    NASA Astrophysics Data System (ADS)

    Ye, Shirong; Xia, Pengfei; Cai, Changsheng

    2016-09-01

    Near-real-time, high-spatial-resolution knowledge of the atmospheric water vapor distribution is vital in numerical weather prediction, and the GPS tomography technique has been proven effective for three-dimensional water vapor reconstruction. In this study, the tomography processing is optimized in several respects with the aid of radiosonde and COSMIC historical data. Firstly, regional tropospheric zenith hydrostatic delay (ZHD) models are improved so that the zenith wet delay (ZWD) can be obtained with higher accuracy. Secondly, the regional conversion factor used to convert the ZWD to precipitable water vapor (PWV) is refined. Next, we develop a new method for dividing the tomography grid, with uneven voxel heights and a varying water vapor layer top. Finally, we propose a Gaussian exponential vertical interpolation method that better reflects the vertical variation of water vapor. GPS datasets collected in Hong Kong in February 2014 are employed to evaluate the optimized tomographic method against the conventional method, with radiosonde-derived and COSMIC-derived water vapor densities used as references. Using radiosonde products as references, the optimized method improves the water vapor density accuracy by 15% below the height of 3.75 km and by 12% above it, compared with the conventional method. Using COSMIC products as references, the accuracy is improved by 15% below 3.75 km and 19% above 3.75 km.
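
    For orientation, the widely used (non-regional) conversion from ZWD to PWV takes the form Pi = 10^6 / (rho_w * R_v * (k2' + k3/Tm)); the sketch below uses commonly quoted refractivity constants and is not the refined regional factor developed in this paper:

    ```python
    RHO_W = 1000.0    # density of liquid water, kg m^-3
    R_V = 461.5       # specific gas constant of water vapour, J kg^-1 K^-1
    K2_PRIME = 22.1   # K hPa^-1 (commonly quoted value)
    K3 = 3.739e5      # K^2 hPa^-1 (commonly quoted value)

    def pwv_from_zwd(zwd_mm, tm_kelvin):
        """Convert zenith wet delay (mm) to precipitable water vapour (mm)."""
        # convert the refractivity constants from per-hPa to per-Pa so that
        # the dimensionless conversion factor Pi comes out directly
        k2p = K2_PRIME / 100.0
        k3 = K3 / 100.0
        pi = 1.0e6 / (RHO_W * R_V * (k2p + k3 / tm_kelvin))
        return pi * zwd_mm

    print(pwv_from_zwd(zwd_mm=200.0, tm_kelvin=275.0))  # roughly 31 mm of PWV
    ```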

  9. Effect of endogenous reference genes on digital PCR assessment of genetically engineered canola events.

    PubMed

    Demeke, Tigst; Eng, Monika

    2018-05-01

    Droplet digital PCR (ddPCR) has been used for absolute quantification of genetically engineered (GE) events. Absolute quantification of GE events by duplex ddPCR requires appropriate primers and probes for the target and reference gene sequences in order to accurately determine the amount of GE material. Single-copy reference genes are generally preferred for absolute quantification of GE events by ddPCR, but no study had compared reference genes for absolute quantification of GE canola events by ddPCR. The suitability of four endogenous reference sequences (HMG-I/Y, FatA(A), CruA and Ccf) for absolute quantification of GE canola events by ddPCR was investigated, as was the effect of DNA extraction methods and DNA quality on the assessment of reference gene copy numbers. ddPCR results were affected by the use of single- vs. two-copy reference genes. The single-copy FatA(A) reference gene was found to be stable and suitable for absolute quantification of GE canola events by ddPCR. For the copy numbers measured, the HMG-I/Y reference gene was less consistent than the FatA(A) reference gene. The expected ddPCR values were underestimated when CruA and Ccf (two-copy endogenous Cruciferin sequences) were used because of their higher copy number. It is important to make an adjustment if two-copy reference genes are used for ddPCR in order to obtain accurate results. In contrast, real-time quantitative PCR results were not affected by the use of single- vs. two-copy reference genes.
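
    The adjustment mentioned for two-copy reference genes amounts to rescaling the measured reference concentration to genome equivalents before forming the GE ratio (a minimal sketch with illustrative droplet-derived concentrations):

    ```python
    def ge_percentage(target_copies_per_ul, reference_copies_per_ul,
                      reference_copies_per_genome=1):
        """Percent GE material from duplex ddPCR concentrations.

        For a two-copy endogenous reference sequence, set
        reference_copies_per_genome=2 so the reference signal is expressed
        per haploid genome before the ratio is formed.
        """
        genomes_per_ul = reference_copies_per_ul / reference_copies_per_genome
        return 100.0 * target_copies_per_ul / genomes_per_ul

    # the same sample interpreted with a single-copy vs. a two-copy reference gene
    print(ge_percentage(50.0, 1000.0, reference_copies_per_genome=1))  # 5.0 %
    print(ge_percentage(50.0, 2000.0, reference_copies_per_genome=2))  # 5.0 %
    ```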

  10. Liquid chromatography with absorbance detection and with isotope-dilution mass spectrometry for determination of isoflavones in soy standard reference materials.

    PubMed

    Phillips, Melissa M; Bedner, Mary; Reitz, Manuela; Burdette, Carolyn Q; Nelson, Michael A; Yen, James H; Sander, Lane C; Rimmer, Catherine A

    2017-02-01

    Two independent analytical approaches, based on liquid chromatography with absorbance detection and liquid chromatography with mass spectrometric detection, have been developed for determination of isoflavones in soy materials. These two methods yield comparable results for a variety of soy-based foods and dietary supplements. Four Standard Reference Materials (SRMs) have been produced by the National Institute of Standards and Technology to assist the food and dietary supplement community in method validation and have been assigned values for isoflavone content using both methods. These SRMs include SRM 3234 Soy Flour, SRM 3236 Soy Protein Isolate, SRM 3237 Soy Protein Concentrate, and SRM 3238 Soy-Containing Solid Oral Dosage Form. A fifth material, SRM 3235 Soy Milk, was evaluated using the methods and found to be inhomogeneous for isoflavones and unsuitable for value assignment. Graphical Abstract Separation of six isoflavone aglycones and glycosides found in Standard Reference Material (SRM) 3236 Soy Protein Isolate.

  11. Alternative Methods for Estimating Plane Parameters Based on a Point Cloud

    NASA Astrophysics Data System (ADS)

    Stryczek, Roman

    2017-12-01

    Non-contact measurement techniques based on triangulation optical sensors are increasingly popular in measurements performed with industrial robots directly on production lines. The result of such measurements is often a cloud of measurement points characterized by considerable measurement noise, points that differ from the reference model, and gross errors that must be eliminated from the analysis. To obtain, from the points contained in the cloud, the vector information that describes the reference model, the data acquired during a measurement must be subjected to appropriate processing operations. The present paper analyses the suitability of methods known as RANdom Sample Consensus (RANSAC), the Monte Carlo Method (MCM), and Particle Swarm Optimization (PSO) for extraction of the reference model. The effectiveness of the tested methods is illustrated by examples of measuring the height of an object and the angle of a plane, based on experiments carried out under workshop conditions.
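
    Of the three approaches compared, RANSAC is the simplest to sketch: repeatedly sample three points, fit a plane, and keep the model with the most inliers (a generic illustration on synthetic data, not the authors' implementation):

    ```python
    import numpy as np

    def ransac_plane(points, n_iter=500, dist_thresh=0.01, rng=None):
        """Return (normal, d) of the best plane n.x + d = 0 found by RANSAC."""
        rng = np.random.default_rng(rng)
        best_inliers, best_model = 0, None
        for _ in range(n_iter):
            p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
            normal = np.cross(p2 - p1, p3 - p1)
            norm = np.linalg.norm(normal)
            if norm < 1e-12:          # degenerate (collinear) sample
                continue
            normal /= norm
            d = -normal.dot(p1)
            dist = np.abs(points @ normal + d)
            inliers = np.count_nonzero(dist < dist_thresh)
            if inliers > best_inliers:
                best_inliers, best_model = inliers, (normal, d)
        return best_model

    # synthetic cloud: plane z = 0.02 with noise plus 20% gross outliers
    rng = np.random.default_rng(0)
    plane_pts = np.column_stack([rng.uniform(-1, 1, 800), rng.uniform(-1, 1, 800),
                                 0.02 + rng.normal(0, 0.002, 800)])
    outliers = rng.uniform(-1, 1, (200, 3))
    normal, d = ransac_plane(np.vstack([plane_pts, outliers]))
    print(normal, d)   # normal close to (0, 0, +/-1), |d| close to 0.02
    ```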

  12. 46 CFR 160.176-4 - Incorporation by reference.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... and Elongation, Breaking of Woven Cloth; Grab Method, incorporation by reference approved for § 160.176-13. (ii) Method 5132, Strength of Cloth, Tearing; Falling-Pendulum Method, incorporation by reference approved for § 160.176-13. (iii) Method 5134, Strength of Cloth, Tearing; Tongue Method...

  13. 46 CFR 160.176-4 - Incorporation by reference.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... and Elongation, Breaking of Woven Cloth; Grab Method, incorporation by reference approved for § 160.176-13. (ii) Method 5132, Strength of Cloth, Tearing; Falling-Pendulum Method, incorporation by reference approved for § 160.176-13. (iii) Method 5134, Strength of Cloth, Tearing; Tongue Method...

  14. Analysis of standard reference materials by absolute INAA

    NASA Astrophysics Data System (ADS)

    Heft, R. E.; Koszykowski, R. F.

    1981-07-01

    Three standard reference materials (fly ash, soil, and AISI 4340 steel) are analyzed by a method of absolute instrumental neutron activation analysis. Two different light-water pool-type reactors were used and yielded equivalent analytical results, even though the epithermal-to-thermal flux ratio in one reactor was higher than that in the other by a factor of two.

  15. Reference Intervals of Alpha-Fetoprotein and Carcinoembryonic Antigen in the Apparently Healthy Population.

    PubMed

    Zhang, Gao-Ming; Guo, Xu-Xiao; Ma, Xiao-Bo; Zhang, Guo-Ming

    2016-12-12

    BACKGROUND The aim of this study was to calculate 95% reference intervals and double-sided limits of serum alpha-fetoprotein (AFP) and carcinoembryonic antigen (CEA) according to the CLSI EP28-A3 guideline. MATERIAL AND METHODS Serum AFP and CEA values were measured in samples from 26 000 healthy subjects in the Shuyang area receiving general health checkups. The 95% reference intervals and upper limits were calculated by using MedCalc. RESULTS We provided continuous reference intervals from 20 years old to 90 years old for AFP and CEA. The reference intervals were: AFP, 1.31-7.89 ng/ml (males) and 1.01-7.10 ng/ml (females); CEA, 0.51-4.86 ng/ml (males) and 0.35-3.45 ng/ml (females). AFP and CEA were significantly positively correlated with age in both males (r=0.196 and r=0.198) and females (r=0.121 and r=0.197). CONCLUSIONS Different races or populations and different detection systems may result in different reference intervals for AFP and CEA. Continuous, age-dependent reference intervals are more accurate than intervals based on age groups.
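
    In its simplest nonparametric form, the central 95% reference interval recommended by CLSI EP28-A3 is just the 2.5th and 97.5th percentiles of the healthy-subject results; the sketch below uses simulated data, and the partitioning by sex and the continuous age modelling reported above are not reproduced:

    ```python
    import numpy as np

    def reference_interval(values, lower_pct=2.5, upper_pct=97.5):
        """Nonparametric central 95% reference interval."""
        values = np.asarray(values, dtype=float)
        return np.percentile(values, [lower_pct, upper_pct])

    # simulated, roughly log-normal AFP-like values (ng/ml) for illustration only
    rng = np.random.default_rng(1)
    afp = np.exp(rng.normal(np.log(3.0), 0.45, size=5000))
    low, high = reference_interval(afp)
    print(round(low, 2), round(high, 2))
    ```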

  16. Determination of the carbon, hydrogen and nitrogen contents of alanine and their uncertainties using the certified reference material L-alanine (NMIJ CRM 6011-a).

    PubMed

    Itoh, Nobuyasu; Sato, Ayako; Yamazaki, Taichi; Numata, Masahiko; Takatsu, Akiko

    2013-01-01

    The carbon, hydrogen, and nitrogen (CHN) contents of alanine and their uncertainties were estimated using a CHN analyzer and the certified reference material (CRM) L-alanine. The CHN contents and their uncertainties, as measured using the single-point calibration method, were 40.36 ± 0.20% for C, 7.86 ± 0.13% for H, and 15.66 ± 0.09% for N; the results obtained using the bracket calibration method were also comparable. The method described in this study is reasonable, convenient, and meets the general requirement of having uncertainties ≤ 0.4%.
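
    Single-point calibration against the CRM reduces to a proportionality between instrument responses corrected for the weighed masses; the sketch below uses invented signals and masses (only the nitrogen content of alanine, about 15.7% from its formula, is a known quantity):

    ```python
    def single_point_content(signal_sample, mass_sample_mg,
                             signal_crm, mass_crm_mg, crm_content_pct):
        """Element content (%) of the sample by single-point CRM calibration."""
        response_factor = crm_content_pct * mass_crm_mg / signal_crm  # % * mg per signal unit
        return response_factor * signal_sample / mass_sample_mg

    # illustrative N determination: L-alanine has a nitrogen content near 15.7 %
    print(single_point_content(signal_sample=10250, mass_sample_mg=2.01,
                               signal_crm=10100, mass_crm_mg=1.98, crm_content_pct=15.72))
    ```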

  17. Measurement of optical to electrical and electrical to optical delays with ps-level uncertainty.

    PubMed

    Peek, H Z; Pinkert, T J; Jansweijer, P P M; Koelemeij, J C J

    2018-05-28

    We present a new measurement principle to determine the absolute time delay of a waveform from an optical reference plane to an electrical reference plane and vice versa. We demonstrate a method based on this principle with 2 ps uncertainty. This method can be used to perform accurate time delay determinations of optical transceivers used in fiber-optic time-dissemination equipment. As a result the time scales in optical and electrical domain can be related to each other with the same uncertainty. We expect this method will be a new breakthrough in high-accuracy time transfer and absolute calibration of time-transfer equipment.

  18. Eliminating traditional reference services in an academic health sciences library: a case study

    PubMed Central

    Schulte, Stephanie J

    2011-01-01

    Question: How were traditional librarian reference desk services successfully eliminated at one health sciences library? Setting: The analysis was done at an academic health sciences library at a major research university. Method: A gap analysis was performed, evaluating changes in the first eleven months through analysis of reference transaction and instructional session data. Main Results: Substantial increases were seen in the overall number of specialized reference transactions and those conducted by librarians lasting more than thirty minutes. The number of reference transactions overall increased after implementing the new model. Several new small-scale instructional initiatives began, though perhaps not directly related to the new model. Conclusion: Traditional reference desk services were eliminated at one academic health sciences library without negative impact on reference and instructional statistics. Eliminating ties to the confines of the physical library due to staffing reference desk hours removed one significant barrier to a more proactive liaison program. PMID:22022221

  19. Devices for the Production of Reference Gas Mixtures.

    PubMed

    Fijało, Cyprian; Dymerski, Tomasz; Gębicki, Jacek; Namieśnik, Jacek

    2016-09-02

    For many years there has been growing demand for gaseous reference materials, which is connected with development in many fields of science and technology. As a result, new methodological and instrumental solutions appear that can be used for this purpose. Appropriate quality assurance/quality control (QA/QC) must be used to make sure that measurement data are a reliable source of information. Reference materials are a significant element of such systems. In the case of gas samples, such materials are generally called reference gas mixtures. This article presents the application and classification of reference gas mixtures, which are a specific type of reference materials, and the methods for obtaining them are described. Construction solutions of devices for the production of reference gas mixtures are detailed, and a description of a prototype device for dynamic production of reference gas mixtures containing aroma compounds is presented.

  20. Acquisition and replay systems for direct-to-digital holography and holovision

    DOEpatents

    Thomas, Clarence E.; Hanson, Gregory R.

    2003-02-25

    Improvements to the acquisition and replay systems for direct-to-digital holography and holovision are described. A method of recording an off-axis hologram includes: splitting a laser beam into an object beam and a reference beam; reflecting the reference beam from a reference beam mirror; reflecting the object beam from an illumination beamsplitter; passing the object beam through an objective lens; reflecting the object beam from an object; focusing the reference beam and the object beam at a focal plane of a digital recorder to form an off-axis hologram; digitally recording the off-axis hologram; and transforming the off-axis hologram in accordance with a Fourier transform to obtain a set of results. A method of writing an off-axis hologram includes: passing a laser beam through a spatial light modulator; and focusing the laser beam at a focal plane of a photorefractive crystal to impose a holographic diffraction grating pattern on the photorefractive crystal. A method of replaying an off-axis hologram includes: illuminating a photorefractive crystal having a holographic diffraction grating with a replay beam.

  1. Reference voltage calculation method based on zero-sequence component optimisation for a regional compensation DVR

    NASA Astrophysics Data System (ADS)

    Jian, Le; Cao, Wang; Jintao, Yang; Yinge, Wang

    2018-04-01

    This paper describes the design of a dynamic voltage restorer (DVR) that can simultaneously protect several sensitive loads from voltage sags in a region of an MV distribution network. A novel reference voltage calculation method based on zero-sequence voltage optimisation is proposed for this DVR to optimise cost-effectiveness in compensation of voltage sags with different characteristics in an ungrounded neutral system. Based on a detailed analysis of the characteristics of voltage sags caused by different types of faults and the effect of the wiring mode of the transformer on these characteristics, the optimisation target of the reference voltage calculation is presented with several constraints. The reference voltages under all types of voltage sags are calculated by optimising the zero-sequence component, which minimises the swell in the phase-to-ground voltage after compensation and improves the symmetry of the DVR output voltages, thereby effectively increasing the compensation capability. The validity and effectiveness of the proposed method are verified by simulation and experimental results.

  2. Simple equation for calculation of plasma clearance for evaluation of renal function without urine collection in rats.

    PubMed

    Liu, Xiang; Peng, Dejun; Tian, Hao; Lu, Chengyu

    2017-01-01

    To develop an equation for the evaluation of renal function in rats using three dilutions of plasma samples, and to validate this method by comparison with a reference method. The investigation was conducted in Sprague-Dawley (SD) rats after delivery of three doses of iohexol, with blood samples collected before and after dosing using a quantitative blood collection method. Plasma iohexol concentrations were detected by high-performance liquid chromatography (HPLC). The extraction recovery of iohexol from plasma was >97.30% and the calibration curve was linear (r² = 0.9997) over iohexol concentrations ranging from 10 to 1000 µg/mL. The method had an RE of <9.310 and intra- and inter-day RSDs of <5.137% and <3.693%, respectively. The plasma clearance values obtained from the equation correlated closely (r = 0.763) with those obtained using the reference method. This correlation between the results of the method under investigation and the reference method indicates that the new equation can be used for preliminary assessment of renal function in rats. © 2016 Asian Pacific Society of Nephrology.

  3. US Fish and Wildlife Service biomonitoring operations manual, Appendices A--K

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gianotto, D.F.; Rope, R.C.; Mondecar, M.

    1993-04-01

    Volume 2 contains Appendices and Summary Sheets for the following areas: A-Legislative Background and Key to Relevant Legislation, B-Biomonitoring Operations Workbook, C-Air Monitoring, D-Introduction to the Flora and Fauna for Biomonitoring, E-Decontamination Guidance Reference Field Methods, F-Documentation Guidance, Sample Handling, and Quality Assurance/Quality Control Standard Operating Procedures, G-Field Instrument Measurements Reference Field Methods, H-Ground Water Sampling Reference Field Methods, I-Sediment Sampling Reference Field Methods, J-Soil Sampling Reference Field Methods, K-Surface Water Reference Field Methods. Appendix B explains how to set up a strategy for entering information in the "disk workbook". Appendix B is enhanced by DE97006389, an on-line workbook that allows users to make revisions to their own biomonitoring data.

  4. Optimization of Composite Structures with Curved Fiber Trajectories

    NASA Astrophysics Data System (ADS)

    Lemaire, Etienne; Zein, Samih; Bruyneel, Michael

    2014-06-01

    This paper studies the problem of optimizing composite shells manufactured using Automated Tape Layup (ATL) or Automated Fiber Placement (AFP) processes. The optimization procedure relies on a new approach for generating equidistant fiber trajectories based on the Fast Marching Method. Starting with a (possibly curved) reference fiber direction defined on a (possibly curved) meshed surface, the new method determines the fiber orientations resulting from a uniform-thickness layup. The design variables are the parameters defining the position and shape of the reference curve, which keeps the number of design variables very small. Thanks to this efficient parameterization, numerical applications to maximum-stiffness optimization are presented. The shape of the design space is discussed with regard to local and global optimal solutions.

  5. Certification of a reference material for determination of total cyanide in soil to support implementation of the International Standard ISO 11262:2011.

    PubMed

    Scharf, Holger; Bremser, Wolfram

    2015-04-01

    Cyanides are among the most important inorganic pollutants to be tested and monitored in environmental compartments. They can be distinguished and determined as free cyanide, weak-acid-dissociable cyanide or total cyanide. In every case, however, the measurement results obtained are operationally defined with reference to the applied analytical method. In 2011, the International Standard ISO 11262 was published, which specifies a normative analytical method for the determination of total cyanide in soil. The objective of the project described in this paper was to provide the first soil reference material (CRM) certified for its mass fraction of total cyanide on the basis of this standard. A total of 12 German laboratories with proven experience in the determination of cyanides in environmental samples participated in the certification study. Measurement results were evaluated in full compliance with the requirements of ISO Guide 35. Taking into account the results of the inter-laboratory comparison as well as the outcome of the homogeneity and stability studies, a certified mass fraction of total cyanide of 21.1 mg/kg and an expanded uncertainty (k = 2) of 1.3 mg/kg were assigned to the material. The reference material has been issued as CRM BAM-U114.

  6. Regional probability distribution of the annual reference evapotranspiration and its effective parameters in Iran

    NASA Astrophysics Data System (ADS)

    Khanmohammadi, Neda; Rezaie, Hossein; Montaseri, Majid; Behmanesh, Javad

    2017-10-01

    Reference evapotranspiration (ET0) plays an important role in water management plans in arid or semi-arid countries such as Iran, so the regional analysis of this parameter is important. However, the ET0 process is affected by several meteorological parameters, such as wind speed, solar radiation, temperature and relative humidity. Therefore, the effect of the distribution type of these effective meteorological variables on the ET0 distribution was analyzed, and the regional probability distributions of the annual ET0 and its effective parameters were determined. The data used in this research were recorded at 30 synoptic stations in Iran during 1960-2014. Using the probability plot correlation coefficient (PPCC) test and the L-moment method, five common distributions were compared and the best distribution was selected. The results of the PPCC test and the L-moment diagram indicated that the Pearson type III distribution was the best probability distribution for fitting annual ET0 and its four effective parameters. The RMSE results showed that the PPCC test and the L-moment method had similar ability for regional analysis of reference evapotranspiration and its effective parameters. The results also showed that the distribution type of the parameters affecting ET0 can affect the distribution of reference evapotranspiration.
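
    A minimal scipy sketch of the distributional step described above fits a Pearson type III distribution to an annual ET0 series and reports the probability-plot correlation coefficient; the series is synthetic, and the L-moment regionalisation used in the paper is not reproduced:

    ```python
    import numpy as np
    from scipy import stats

    # synthetic annual ET0 series (mm/year), positively skewed for illustration
    et0 = stats.pearson3.rvs(skew=0.8, loc=1400.0, scale=120.0, size=55, random_state=7)

    # fit Pearson type III (shape = skew, plus location and scale)
    skew, loc, scale = stats.pearson3.fit(et0)

    # probability-plot correlation coefficient (PPCC) for the fitted distribution
    (_, _), (_, _, ppcc) = stats.probplot(et0, sparams=(skew,), dist=stats.pearson3, fit=True)
    print(skew, loc, scale, ppcc)   # a PPCC close to 1 indicates a good fit
    ```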

  7. Spatial Standard Observer

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B. (Inventor)

    2010-01-01

    The present invention relates to devices and methods for the measurement and/or for the specification of the perceptual intensity of a visual image, or the perceptual distance between a pair of images. Grayscale test and reference images are processed to produce test and reference luminance images. A luminance filter function is convolved with the reference luminance image to produce a local mean luminance reference image. Test and reference contrast images are produced from the local mean luminance reference image and the test and reference luminance images respectively, followed by application of a contrast sensitivity filter. The resulting images are combined according to mathematical prescriptions to produce a Just Noticeable Difference, JND value, indicative of a Spatial Standard Observer, SSO. Some embodiments include masking functions, window functions, special treatment for images lying on or near borders and pre-processing of test images.

  8. Spatial Standard Observer

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B. (Inventor)

    2012-01-01

    The present invention relates to devices and methods for the measurement and/or for the specification of the perceptual intensity of a visual image, or the perceptual distance between a pair of images. Grayscale test and reference images are processed to produce test and reference luminance images. A luminance filter function is convolved with the reference luminance image to produce a local mean luminance reference image. Test and reference contrast images are produced from the local mean luminance reference image and the test and reference luminance images respectively, followed by application of a contrast sensitivity filter. The resulting images are combined according to mathematical prescriptions to produce a Just Noticeable Difference, JND value, indicative of a Spatial Standard Observer, SSO. Some embodiments include masking functions, window functions, special treatment for images lying on or near borders and pre-processing of test images.

  9. A preliminary verification of the floating reference measurement method for non-invasive blood glucose sensing

    NASA Astrophysics Data System (ADS)

    Min, Xiaolin; Liu, Rong; Fu, Bo; Xu, Kexin

    2017-06-01

    In the non-invasive sensing of blood glucose by near-infrared diffuse reflectance spectroscopy, the spectrum is highly susceptible to unstable and complicated background variations from the human body and the environment. In in vitro analyses, background variations are usually corrected by the spectrum of a standard reference sample that has similar optical properties to the analyte of interest. However, it is hard to find a standard sample for in vivo measurement. Therefore, the floating reference measurement method is proposed to enable relative measurements in vivo, where the spectra at a special source-detector distance, defined as the floating reference position, are insensitive to changes in glucose concentration due to the absorption effect and scattering effect. Because the diffuse reflectance signals at the floating reference positions only reflect information on background variations during the measurement, they can be used as an internal reference. In this paper, the theoretical basis of the floating reference positions in a semi-infinite turbid medium was discussed based on the steady-state diffusion equation and its analytical solutions in a semi-infinite turbid medium (under the extrapolated boundary conditions). Then, Monte-Carlo (MC) simulations and in vitro experiments based on a custom-built continuous-moving spatially resolving double-fiber NIR measurement system, configured with two types of light source, a super luminescent diode (SLD) and a super-continuum laser, were carried out to verify the existence of the floating reference position in 5%, 10% and 20% Intralipid solutions. The results showed that the simulated values of the floating reference positions are close to the theoretical results, with a maximum deviation of approximately 0.3 mm in 1100-1320 nm. Great differences can be observed in 1340-1400 nm because the optical properties of Intralipid in this region do not satisfy the conditions of the steady-state diffusion equation. For the in vitro experiments, floating reference positions exist at 1220 nm and 1320 nm under the two types of light source, and the results are quite close. However, the reference positions obtained from the experiments are further from the light source than those obtained in the MC simulation. For the turbid media and the wavelengths investigated, the difference is up to 1 mm. This study is important for the design of optical fibers to be applied in floating reference measurements.

  10. Reference values of thirty-one frequently used laboratory markers for 75-year-old males and females

    PubMed Central

    Ryden, Ingvar; Lind, Lars

    2012-01-01

    Background We have previously reported reference values for common clinical chemistry tests in healthy 70-year-old males and females. We have now repeated this study 5 years later to establish reference values also at the age of 75. It is important to have adequate reference values for elderly patients as biological markers may change over time, and adequate reference values are essential for correct clinical decisions. Methods We have investigated 31 frequently used laboratory markers in 75-year-old males (n = 354) and females (n = 373) without diabetes. The 2.5 and 97.5 percentiles for these markers were calculated according to the recommendations of the International Federation of Clinical Chemistry. Results Reference values are reported for 75-year-old males and females for 31 frequently used laboratory markers. Conclusion There were minor differences between reference intervals calculated with and without individuals with cardiovascular diseases. Several of the reference intervals differed from Scandinavian reference intervals based on younger individuals (Nordic Reference Interval Project). PMID:22300333

  11. The application of artificial neural networks and support vector regression for simultaneous spectrophotometric determination of commercial eye drop contents

    NASA Astrophysics Data System (ADS)

    Valizadeh, Maryam; Sohrabi, Mahmoud Reza

    2018-03-01

    In the present study, artificial neural networks (ANNs) and support vector regression (SVR), as intelligent methods, were coupled with UV spectroscopy for the simultaneous quantitative determination of Dorzolamide (DOR) and Timolol (TIM) in eye drops. Several synthetic mixtures were analyzed to validate the proposed methods. First, a neural network time-series model, one type of artificial neural network, was employed and its efficiency was evaluated. Afterwards, a radial basis network was applied as another neural network. The results showed that the performance of this method is suitable for prediction. Finally, support vector regression was used to construct the Zilomole prediction model. The root mean square error (RMSE) and mean recovery (%) were also calculated for the SVR method. Moreover, the proposed methods were compared to high-performance liquid chromatography (HPLC) as a reference method. A one-way analysis of variance (ANOVA) test at the 95% confidence level, applied to compare the results of the suggested and reference methods, showed that there were no significant differences between them. The effect of interferences was also investigated in spiked solutions.
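
    A minimal scikit-learn sketch of the SVR step is shown below; synthetic two-component spectra stand in for the recorded UV spectra, and the preprocessing, wavelength ranges and network models of the paper are not reproduced:

    ```python
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    wavelengths = np.linspace(220, 320, 101)

    def gauss(center, width):
        return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

    # synthetic mixture spectra: two overlapping absorption bands plus noise
    c_dor = rng.uniform(2.0, 20.0, 120)   # "DOR" concentrations (arbitrary units)
    c_tim = rng.uniform(1.0, 10.0, 120)   # "TIM" concentrations (arbitrary units)
    spectra = (np.outer(c_dor, gauss(254, 12)) + np.outer(c_tim, gauss(295, 10))
               + rng.normal(0, 0.01, (120, wavelengths.size)))

    X_train, X_test, y_train, y_test = train_test_split(spectra, c_dor, random_state=0)
    model = SVR(kernel="rbf", C=100.0, epsilon=0.1).fit(X_train, y_train)
    pred = model.predict(X_test)
    rmse = float(np.sqrt(np.mean((pred - y_test) ** 2)))
    print(rmse)   # one SVR model per analyte; repeat with c_tim as the target
    ```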

  12. A statistical assessment of differences and equivalences between genetically modified and reference plant varieties

    PubMed Central

    2011-01-01

    Background Safety assessment of genetically modified organisms is currently often performed by comparative evaluation. However, natural variation of plant characteristics between commercial varieties is usually not considered explicitly in the statistical computations underlying the assessment. Results Statistical methods are described for the assessment of the difference between a genetically modified (GM) plant variety and a conventional non-GM counterpart, and for the assessment of the equivalence between the GM variety and a group of reference plant varieties which have a history of safe use. It is proposed to present the results of both difference and equivalence testing for all relevant plant characteristics simultaneously in one or a few graphs, as an aid for further interpretation in safety assessment. A procedure is suggested to derive equivalence limits from the observed results for the reference plant varieties using a specific implementation of the linear mixed model. Three different equivalence tests are defined to classify any result in one of four equivalence classes. The performance of the proposed methods is investigated by a simulation study, and the methods are illustrated on compositional data from a field study on maize grain. Conclusions A clear distinction of practical relevance is shown between difference and equivalence testing. The proposed tests are shown to have appropriate performance characteristics by simulation, and the proposed simultaneous graphical representation of results was found to be helpful for the interpretation of results from a practical field trial data set. PMID:21324199

  13. Validation of the ANSR(®) Listeria monocytogenes Method for Detection of Listeria monocytogenes in Selected Food and Environmental Samples.

    PubMed

    Caballero, Oscar; Alles, Susan; Le, Quynh-Nhi; Gray, R Lucas; Hosking, Edan; Pinkava, Lisa; Norton, Paul; Tolan, Jerry; Mozola, Mark; Rice, Jennifer; Chen, Yi; Ryser, Elliot; Odumeru, Joseph

    2016-01-01

    Work was conducted to validate performance of the ANSR(®) for Listeria monocytogenes method in selected food and environmental matrixes. This DNA-based assay involves amplification of nucleic acid via an isothermal reaction based on nicking enzyme amplification technology. Following single-step sample enrichment for 16-24 h for most matrixes, the assay is completed in 40 min using only simple instrumentation. When 50 distinct strains of L. monocytogenes were tested for inclusivity, 48 produced positive results, the exceptions being two strains confirmed by PCR to lack the assay target gene. Forty-seven nontarget strains (30 species), including multiple non-monocytogenes Listeria species as well as non-Listeria, Gram-positive bacteria, were tested, and all generated negative ANSR assay results. Performance of the ANSR method was compared with that of the U.S. Department of Agriculture, Food Safety and Inspection Service Microbiology Laboratory Guidebook reference culture procedure for detection of L. monocytogenes in hot dogs, pasteurized liquid egg, and sponge samples taken from an inoculated stainless steel surface. In addition, ANSR performance was measured against the U.S. Food and Drug Administration Bacteriological Analytical Manual reference method for detection of L. monocytogenes in Mexican-style cheese, cantaloupe, sprout irrigation water, and guacamole. With the single exception of pasteurized liquid egg at 16 h, ANSR method performance as quantified by the number of positives obtained was not statistically different from that of the reference methods. Robustness trials demonstrated that deliberate introduction of small deviations to the normal assay parameters did not affect ANSR method performance. Results of accelerated stability testing conducted using two manufactured lots of reagents predicts stability at the specified storage temperature of 4°C of more than 1 year.

  14. The impacts of speed cameras on road accidents: an application of propensity score matching methods.

    PubMed

    Li, Haojie; Graham, Daniel J; Majumdar, Arnab

    2013-11-01

    This paper aims to evaluate the impacts of speed limit enforcement cameras on reducing road accidents in the UK by accounting for both confounding factors and the selection of proper reference groups. The propensity score matching (PSM) method is employed to do this. A naïve before and after approach and the empirical Bayes (EB) method are compared with the PSM method. A total of 771 sites and 4787 sites for the treatment and the potential reference groups respectively are observed for a period of 9 years in England. Both the PSM and the EB methods show similar results: there are significant reductions in the number of accidents of all severities at speed camera sites. It is suggested that the propensity score can be used as the criterion for selecting the reference group in before-after control studies. Speed cameras were found to be most effective in reducing accidents up to 200 meters from camera sites, and no evidence of accident migration was found. Copyright © 2013 Elsevier Ltd. All rights reserved.
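
    The core of the PSM step can be sketched as logistic-regression propensity scores followed by nearest-neighbour matching; the covariates below are synthetic, and the paper's covariate set, caliper choices and EB comparison are not reproduced:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(0)
    n_treated, n_pool = 200, 1200

    # synthetic site covariates (e.g. traffic flow, baseline accidents, road length)
    X_treated = rng.normal([1.0, 0.5, 0.2], 1.0, (n_treated, 3))
    X_pool = rng.normal([0.0, 0.0, 0.0], 1.0, (n_pool, 3))
    X = np.vstack([X_treated, X_pool])
    z = np.r_[np.ones(n_treated), np.zeros(n_pool)]   # 1 = camera site

    # propensity score: probability of treatment given the covariates
    ps = LogisticRegression(max_iter=1000).fit(X, z).predict_proba(X)[:, 1]
    ps_treated, ps_pool = ps[:n_treated], ps[n_treated:]

    # 1-NN matching on the propensity score selects the reference group
    nn = NearestNeighbors(n_neighbors=1).fit(ps_pool.reshape(-1, 1))
    _, idx = nn.kneighbors(ps_treated.reshape(-1, 1))
    matched_reference = idx.ravel()    # indices into the potential reference pool
    print(matched_reference[:10])
    ```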

  15. MaCH-Admix: Genotype Imputation for Admixed Populations

    PubMed Central

    Liu, Eric Yi; Li, Mingyao; Wang, Wei; Li, Yun

    2012-01-01

    Imputation in admixed populations is an important problem but challenging due to the complex linkage disequilibrium (LD) pattern. The emergence of large reference panels such as that from the 1,000 Genomes Project enables more accurate imputation in general, and in particular for admixed populations and for uncommon variants. To efficiently benefit from these large reference panels, one key issue to consider in modern genotype imputation framework is the selection of effective reference panels. In this work, we consider a number of methods for effective reference panel construction inside a hidden Markov model and specific to each target individual. These methods fall into two categories: identity-by-state (IBS) based and ancestry-weighted approach. We evaluated the performance on individuals from recently admixed populations. Our target samples include 8,421 African Americans and 3,587 Hispanic Americans from the Women’s Health Initiative, which allow assessment of imputation quality for uncommon variants. Our experiments include both large and small reference panels; large, medium, and small target samples; and in genome regions of varying levels of LD. We also include BEAGLE and IMPUTE2 for comparison. Experiment results with large reference panel suggest that our novel piecewise IBS method yields consistently higher imputation quality than other methods/software. The advantage is particularly noteworthy among uncommon variants where we observe up to 5.1% information gain with the difference being highly significant (Wilcoxon signed rank test P-value < 0.0001). Our work is the first that considers various sensible approaches for imputation in admixed populations and presents a comprehensive comparison. PMID:23074066

  16. Forensic investigation of plutonium metal: a case study of CRM 126

    DOE PAGES

    Byerly, Benjamin L.; Stanley, Floyd; Spencer, Khal; ...

    2016-11-01

    In this study, a certified plutonium metal reference material (CRM 126) with a known production history is examined using analytical methods that are commonly employed in nuclear forensics for provenancing and attribution. The measured plutonium isotopic composition and actinide assay are consistent with the values reported on the reference material certificate, and model ages from the U/Pu and Am/Pu chronometers agree with the documented production timeline. These results confirm the utility of these analytical methods and highlight the importance of a holistic approach to the forensic study of unknown materials.

  17. Paediatric Cochlear Implantation in Patients with Waardenburg Syndrome

    PubMed Central

    van Nierop, Josephine W.I.; Snabel, Rebecca R.; Langereis, Margreet; Pennings, Ronald J.E.; Admiraal, Ronald J.C.; Mylanus, Emmanuel A.M.; Kunst, Henricus P.M.

    2016-01-01

    Objective To analyse the benefit of cochlear implantation in young deaf children with Waardenburg syndrome (WS) compared to a reference group of young deaf children without additional disabilities. Method A retrospective study was conducted on children with WS who underwent cochlear implantation at the age of 2 years or younger. The post-operative results for speech perception (phonetically balanced standard Dutch consonant-vocal-consonant word lists) and language comprehension (the Reynell Developmental Language Scales, RDLS), expressed as a language quotient (LQ), were compared between the WS group and the reference group by using multiple linear regression analysis. Results A total of 14 children were diagnosed with WS, and 6 of them had additional disabilities. The WS children were implanted at a mean age of 1.6 years and the 48 children of the reference group at a mean age of 1.3 years. The WS children had a mean phoneme score of 80% and a mean LQ of 0.74 at 3 years post-implantation, and these results were comparable to those of the reference group. Only the factor additional disabilities had a significant negative influence on auditory perception and language comprehension. Conclusions Children with WS performed similarly to the reference group in the present study, and these outcomes are in line with the previous literature. Although good counselling about additional disabilities concomitant to the syndrome is relevant, cochlear implantation is a good rehabilitation method for children with WS. PMID:27245679

  18. Quantitative estimation of α-PVP metabolites in urine by GC-APCI-QTOFMS with nitrogen chemiluminescence detection based on parent drug calibration.

    PubMed

    Mesihää, Samuel; Rasanen, Ilpo; Ojanperä, Ilkka

    2018-05-01

    Gas chromatography (GC) hyphenated with nitrogen chemiluminescence detection (NCD) and quadrupole time-of-flight mass spectrometry (QTOFMS) was applied for the first time to the quantitative analysis of new psychoactive substances (NPS) in urine, based on the N-equimolar response of NCD. A method was developed and validated to estimate the concentrations of three metabolites of the common stimulant NPS α-pyrrolidinovalerophenone (α-PVP) in spiked urine samples, simulating an analysis having no authentic reference standards for the metabolites and using the parent drug instead for quantitative calibration. The metabolites studied were OH-α-PVP (M1), 2″-oxo-α-PVP (M3), and N,N-bis-dealkyl-PVP (2-amino-1-phenylpentan-1-one; M5). Sample preparation involved liquid-liquid extraction with a mixture of ethyl acetate and butyl chloride at a basic pH and subsequent silylation of the sec-hydroxyl and prim-amino groups of M1 and M5, respectively. Simultaneous compound identification was based on the accurate masses of the protonated molecules for each compound by QTOFMS following atmospheric pressure chemical ionization. The accuracy of quantification of the parent-calibrated NCD method was compared with that of the corresponding parent-calibrated QTOFMS method, as well as with a reference QTOFMS method calibrated with the authentic reference standards. The NCD method produced an accuracy equal to that of the reference method for α-PVP, M3 and M5, while a higher negative bias (25%) was obtained for M1, best explained by recovery and stability issues. The performance of the parent-calibrated QTOFMS method was inferior to the reference method, with an especially high negative bias (60%) for M1. The NCD method enabled better quantitative precision than the QTOFMS methods. To evaluate the novel approach in casework, twenty post-mortem urine samples previously found positive for α-PVP were analyzed by the parent-calibrated NCD method and the reference QTOFMS method. The highest difference in the quantitative results between the two methods was only 33%, and the NCD method's precision as the coefficient of variation was better than 13%. The limit of quantification for the NCD method was approximately 0.25 μg/mL in urine, which generally allowed the analysis of α-PVP and the main metabolite M1. However, the sensitivity was not sufficient for the low concentrations of M3 and M5. Consequently, while having potential for instant analysis of NPS and metabolites in moderate concentrations without reference standards, the NCD method should be further developed for improved sensitivity to be more generally applicable. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Ballistocardiogram as Proximal Timing Reference for Pulse Transit Time Measurement: Potential for Cuffless Blood Pressure Monitoring

    PubMed Central

    Kim, Chang-Sei; Carek, Andrew M.; Mukkamala, Ramakrishna; Inan, Omer T.; Hahn, Jin-Oh

    2015-01-01

    Goal We tested the hypothesis that the ballistocardiogram (BCG) waveform could yield a viable proximal timing reference for measuring pulse transit time (PTT). Methods From fifteen healthy volunteers, we measured PTT as the time interval between BCG and a non-invasively measured finger blood pressure (BP) waveform. To evaluate the efficacy of the BCG-based PTT in estimating BP, we likewise measured pulse arrival time (PAT) using the electrocardiogram (ECG) as proximal timing reference and compared their correlations to BP. Results BCG-based PTT was correlated with BP reasonably well: the mean correlation coefficient (r) was 0.62 for diastolic (DP), 0.65 for mean (MP) and 0.66 for systolic (SP) pressures when the intersecting tangent method was used as distal timing reference. Comparing four distal timing references (intersecting tangent, maximum second derivative, diastolic minimum and systolic maximum), PTT exhibited the best correlation with BP when the systolic maximum method was used (mean r value was 0.66 for DP, 0.67 for MP and 0.70 for SP). PTT was more strongly correlated with DP than PAT regardless of the distal timing reference: mean r value was 0.62 versus 0.51 (p=0.07) for intersecting tangent, 0.54 versus 0.49 (p=0.17) for maximum second derivative, 0.58 versus 0.52 (p=0.37) for diastolic minimum, and 0.66 versus 0.60 (p=0.10) for systolic maximum methods. The difference between PTT and PAT in estimating DP was significant (p=0.01) when the r values associated with all the distal timing references were compared altogether. However, PAT appeared to outperform PTT in estimating SP (p=0.31 when the r values associated with all the distal timing references were compared altogether). Conclusion We conclude that BCG is an adequate proximal timing reference in deriving PTT, and that BCG-based PTT may be superior to ECG-based PAT in estimating DP. Significance PTT with BCG as proximal timing reference has potential to enable convenient and ubiquitous cuffless BP monitoring. PMID:26054058
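
    A minimal sketch of one beat's PTT computation on synthetic waveforms, using the BCG J-peak as the proximal fiducial and the systolic maximum of the BP waveform as the distal fiducial (one of the four distal references compared above); the waveforms and timings are invented:

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    fs = 500.0                        # sampling rate, Hz
    t = np.arange(0, 1.0, 1 / fs)     # one cardiac cycle of synthetic data

    # toy waveforms: a BCG J-wave near 0.18 s and a BP systolic peak near 0.35 s
    bcg = np.exp(-0.5 * ((t - 0.18) / 0.015) ** 2)
    bp = 80 + 40 * np.exp(-0.5 * ((t - 0.35) / 0.06) ** 2)

    j_peak, _ = find_peaks(bcg, height=0.5)
    sys_peak, _ = find_peaks(bp, height=100)

    ptt = (sys_peak[0] - j_peak[0]) / fs      # pulse transit time in seconds
    print(round(ptt, 3))                      # about 0.17 s for this toy beat
    ```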

  20. Quantitative determination of ambroxol in tablets by derivative UV spectrophotometric method and HPLC.

    PubMed

    Dinçer, Zafer; Basan, Hasan; Göger, Nilgün Günden

    2003-04-01

    A derivative UV spectrophotometric method for the determination of ambroxol in tablets was developed. Determination of ambroxol in tablets was conducted using a first-order derivative UV spectrophotometric method at 255 nm (n = 5). Standards for the calibration graph, ranging from 5.0 to 35.0 microg/ml, were prepared from a stock solution. The proposed method was accurate, with a recovery of 98.6+/-0.4%, and precise, with a coefficient of variation (CV) of 1.22. These results were compared with those obtained by the reference methods, a zero-order UV spectrophotometric method and a reversed-phase high-performance liquid chromatography (HPLC) method. A reversed-phase C(18) column with an aqueous phosphate (0.01 M)-acetonitrile-glacial acetic acid (59:40:1, v/v/v) (pH 3.12) mobile phase was used, and the UV detector was set to 252 nm. Calibration solutions used in HPLC ranged from 5.0 to 20.0 microg/ml. Results obtained by the derivative UV spectrophotometric method were comparable to those obtained by the reference methods, the zero-order UV spectrophotometric method and HPLC, as judged by an ANOVA test (F(calculated) = 0.762, F(theoretical) = 3.89). Copyright 2003 Elsevier Science B.V.

  1. Correction Approach for Delta Function Convolution Model Fitting of Fluorescence Decay Data in the Case of a Monoexponential Reference Fluorophore.

    PubMed

    Talbot, Clifford B; Lagarto, João; Warren, Sean; Neil, Mark A A; French, Paul M W; Dunsby, Chris

    2015-09-01

    A correction is proposed to the Delta function convolution method (DFCM) for fitting a multiexponential decay model to time-resolved fluorescence decay data using a monoexponential reference fluorophore. A theoretical analysis of the discretised DFCM multiexponential decay function shows the presence of an extra exponential decay term with the same lifetime as the reference fluorophore, which we denote the residual reference component. This extra decay component arises as a result of the discretised convolution of one of the two terms in the modified model function required by the DFCM. The effect of the residual reference component becomes more pronounced when the fluorescence lifetime of the reference is longer than all of the individual components of the specimen under inspection and when the temporal sampling interval is not negligible compared to the quantity (τR^(-1) - τ^(-1))^(-1), where τR and τ are the fluorescence lifetimes of the reference and the specimen, respectively. It is shown that the unwanted residual reference component results in systematic errors when fitting simulated data and that these errors are not present when the proposed correction is applied. The correction is also verified using real data obtained from experiment.

  2. Patient dose measurement in common medical X-ray examinations and propose the first local dose reference levels to diagnostic radiology in Iran

    NASA Astrophysics Data System (ADS)

    Rasuli, Behrouz; Tabari Juybari, Raheleh; Forouzi, Meysam; Ghorbani, Mohammad

    2017-09-01

    Introduction: The main purpose of this study was to investigate patient dose in pelvic and abdomen x-ray examinations. This work also provided the LDRLs (local diagnostic reference levels) in the Khuzestan region, southwest Iran, to help establish the NDRLs (national diagnostic reference levels). Methods: Patient doses were assessed from patients' anatomical data and exposure parameters based on the IAEA indirect dosimetry method. In this method, exposure parameters such as tube output, kVp, mAs, and FFD, together with patient anatomical data, were used to calculate the ESD (entrance skin dose) of patients. This study was conducted on 250 standard patients (50% men and 50% women) at eight high-patient-load imaging centers. Results: The results indicate that the mean ESDs for the pelvic and abdomen examinations, 2.3 and 3.7 mGy respectively, were lower than the IAEA and EC reference levels. Mean applied kVps were 67 and 70, and mean FFDs were 103 and 109, respectively. Tube loadings obtained in this study for the pelvic examination were lower than all the corresponding values in the reviewed literature. Likewise, the average annual patient load across all hospitals was more than 37,000 patients, i.e. more than 100 patients a day. Conclusions: The authors recommend that the DRLs (diagnostic reference levels) obtained in this region, which are the first available data, be used as local DRLs for pelvic and abdomen procedures. This work also suggests that on-the-job training programs for staff and closer collaboration between physicists and physicians should be strongly considered.

  3. Calibration of Valiantzas' reference evapotranspiration equations for the Pilbara region, Western Australia

    NASA Astrophysics Data System (ADS)

    Ahooghalandari, Matin; Khiadani, Mehdi; Jahromi, Mina Esmi

    2017-05-01

    Reference evapotranspiration (ET0) is a critical component of water resources management and planning. Different methods, with different data requirements, have been developed to estimate ET0. In this study, the Hargreaves, Turc, Oudin, Copais, and Abtew methods, together with three forms of Valiantzas' formulas developed in recent years, were used to estimate ET0 for the Pilbara region of Western Australia. The estimated ET0 values from these methods were compared with those from the FAO-56 Penman-Monteith (PM) method. The results showed that the Copais method and two of Valiantzas' equations, in their original forms, are suitable for estimating ET0 for the study area. A modified Honey-Bee Mating Optimization (MHBMO) algorithm was then implemented to calibrate the three Valiantzas' equations for this southern-hemisphere region.

  4. Method of predicting mechanical properties of decayed wood

    DOEpatents

    Kelley, Stephen S.

    2003-07-15

    A method for determining the mechanical properties of decayed wood that has been exposed to wood decay microorganisms, comprising: a) illuminating a surface of decayed wood that has been exposed to wood decay microorganisms with wavelengths from the visible and near infrared (VIS-NIR) spectra; b) analyzing the surface of the decayed wood using a spectrometric method, the method generating first spectral data of wavelengths in the VIS-NIR spectral region; and c) using a multivariate analysis to predict mechanical properties of the decayed wood by comparing the first spectral data with a calibration model, the calibration model comprising second spectral data of wavelengths in the VIS-NIR spectra obtained from reference decayed wood by a second spectrometric method, the second spectral data being correlated with a known mechanical property analytical result obtained from the reference decayed wood.
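
    As a hedged illustration of the multivariate-analysis step, the sketch below uses partial least squares regression to relate VIS-NIR spectra of reference decayed wood to a measured mechanical property and to predict that property for a new specimen; the patent does not prescribe a specific algorithm, and the file names, array layout, and number of latent variables are hypothetical.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        # hypothetical inputs: rows are reference specimens, columns are VIS-NIR wavelengths
        X_ref = np.loadtxt("reference_spectra.csv", delimiter=",")
        y_ref = np.loadtxt("reference_property.csv", delimiter=",")   # e.g. modulus of elasticity

        model = PLSRegression(n_components=8)   # latent variables would be chosen by cross-validation
        model.fit(X_ref, y_ref)

        x_new = np.loadtxt("new_spectrum.csv", delimiter=",").reshape(1, -1)
        print("predicted mechanical property:", float(model.predict(x_new).ravel()[0]))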

  5. Stable isotope analyses of oxygen (18O:17O:16O) and chlorine (37Cl:35Cl) in perchlorate: reference materials, calibrations, methods, and interferences

    USGS Publications Warehouse

    Böhlke, John Karl; Mroczkowski, Stanley J.; Sturchio, Neil C.; Heraty, Linnea J.; Richman, Kent W.; Sullivan, Donald B.; Griffith, Kris N.; Gu, Baohua; Hatzinger, Paul B.

    2017-01-01

    Rationale Perchlorate (ClO4−) is a common trace constituent of water, soils, and plants; it has both natural and synthetic sources and is subject to biodegradation. The stable isotope ratios of Cl and O provide three independent quantities for ClO4− source attribution and natural attenuation studies: δ37Cl, δ18O, and δ17O (or Δ17O or 17Δ) values. Documented reference materials, calibration schemes, methods, and interferences will improve the reliability of such studies. Methods Three large batches of KClO4 with contrasting isotopic compositions were synthesized and analyzed against VSMOW-SLAP, atmospheric O2, and international nitrate and chloride reference materials. Three analytical methods were tested for O isotopes: conversion of ClO4− to CO for continuous-flow IRMS (CO-CFIRMS), decomposition to O2 for dual-inlet IRMS (O2-DIIRMS), and decomposition to O2 with molecular-sieve trap (O2-DIIRMS+T). For Cl isotopes, KCl produced by thermal decomposition of KClO4 was reprecipitated as AgCl and converted into CH3Cl for DIIRMS. Results KClO4 isotopic reference materials (USGS37, USGS38, USGS39) represent a wide range of Cl and O isotopic compositions, including non-mass-dependent O isotopic variation. Isotopic fractionation and exchange can affect O isotope analyses of ClO4− depending on the decomposition method. Routine analyses can be adjusted for such effects by normalization, using reference materials prepared and analyzed as samples. Analytical errors caused by SO42−, NO3−, ReO42−, and C-bearing contaminants include isotope mixing and fractionation effects on CO and O2, plus direct interference from CO2 in the mass spectrometer. The results highlight the importance of effective purification of ClO4− from environmental samples. Conclusions KClO4 reference materials are available for testing methods and calibrating isotopic data for ClO4− and other substances with widely varying Cl or O isotopic compositions. Current ClO4− extraction, purification, and analysis techniques provide relative isotope-ratio measurements with uncertainties much smaller than the range of values in environmental ClO4−, permitting isotopic evaluation of environmental ClO4− sources and natural attenuation.

  6. Design verification of large time constant thermal shields for optical reference cavities.

    PubMed

    Zhang, J; Wu, W; Shi, X H; Zeng, X Y; Deng, K; Lu, Z H

    2016-02-01

    In order to achieve high frequency stability in ultra-stable lasers, the Fabry-Pérot reference cavities must be placed inside vacuum chambers with large thermal time constants to reduce the sensitivity to external temperature fluctuations. Currently, the determination of the thermal time constants of vacuum chambers is based either on theoretical calculation or on time-consuming experiments. The first approach applies only to simple systems, while the second requires a great deal of time to try out different designs. To overcome these limitations, we present thermal time constant simulations using finite element analysis (FEA) based on complete vacuum chamber models and verify the results with measured time constants. We measure the thermal time constants using ultrastable laser systems and a frequency comb. The thermal expansion coefficients of the optical reference cavities are precisely measured to reduce the measurement error of the time constants. The simulation results and the experimental results agree very well. With this knowledge, we simulate several simplified design models using FEA to obtain larger vacuum thermal time constants at room temperature, taking into account vacuum pressure, shielding layers, and the support structure. We adopt the Taguchi method for shielding layer optimization and demonstrate that layer material and layer number dominate the contributions to the thermal time constant, compared with layer thickness and layer spacing.
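
    Where measured step-response data are available, the time constant can be extracted by fitting a first-order response; the sketch below is a generic illustration of that fit, with a hypothetical data file and column layout, and is not the FEA procedure described above.

        import numpy as np
        from scipy.optimize import curve_fit

        t, temp = np.loadtxt("chamber_step_response.txt", unpack=True)   # time (s), temperature (K)

        def first_order(t, t_inf, t0, tau):
            # T(t) = T_inf + (T0 - T_inf) * exp(-t / tau)
            return t_inf + (t0 - t_inf) * np.exp(-t / tau)

        p0 = (temp[-1], temp[0], (t[-1] - t[0]) / 3.0)                   # rough initial guess
        (t_inf, t0, tau), _ = curve_fit(first_order, t, temp, p0=p0)
        print(f"thermal time constant: {tau / 3600:.1f} h")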

  7. Screening Chemicals for Estrogen Receptor Bioactivity Using a Computational Model.

    PubMed

    Browne, Patience; Judson, Richard S; Casey, Warren M; Kleinstreuer, Nicole C; Thomas, Russell S

    2015-07-21

    The U.S. Environmental Protection Agency (EPA) is considering high-throughput and computational methods to evaluate the endocrine bioactivity of environmental chemicals. Here we describe a multistep, performance-based validation of new methods and demonstrate that these new tools are sufficiently robust to be used in the Endocrine Disruptor Screening Program (EDSP). Results from 18 estrogen receptor (ER) ToxCast high-throughput screening assays were integrated into a computational model that can discriminate bioactivity from assay-specific interference and cytotoxicity. Model scores range from 0 (no activity) to 1 (bioactivity of 17β-estradiol). ToxCast ER model performance was evaluated against reference chemicals, as well as against the results of EDSP Tier 1 screening assays in current practice. The ToxCast ER model accuracy was 86% to 93% when compared to reference chemicals, and the model predicted the results of EDSP Tier 1 guideline and other uterotrophic studies with 84% to 100% accuracy. The performance of the high-throughput assays and ToxCast ER model predictions demonstrates that these methods correctly identify active and inactive reference chemicals, provide a measure of relative ER bioactivity, and rapidly identify chemicals with potential endocrine bioactivities for additional screening and testing. EPA is accepting ToxCast ER model data for 1812 chemicals as alternatives for EDSP Tier 1 ER binding, ER transactivation, and uterotrophic assays.

  8. Application of the Reference Method Isotope Dilution Gas Chromatography Mass Spectrometry (ID/GC/MS) to Establish Metrological Traceability for Calibration and Control of Blood Glucose Test Systems

    PubMed Central

    Andreis, Elisabeth; Küllmer, Kai

    2014-01-01

    Self-monitoring of blood glucose (BG) by means of handheld BG systems is a cornerstone in diabetes therapy. The aim of this article is to describe a procedure with proven traceability for calibration and evaluation of BG systems to guarantee reliable BG measurements. Isotope dilution gas chromatography mass spectrometry (ID/GC/MS) is a method that fulfills all requirements to be used in a higher-order reference measurement procedure. However, this method is not applicable to routine measurements because of the time-consuming sample preparation. A hexokinase method with perchloric acid (PCA) sample pretreatment is used in a measurement procedure for such purposes. This method is directly linked to the ID/GC/MS method by calibration with a glucose solution that has an ID/GC/MS-determined target value. BG systems are calibrated with whole blood samples. The glucose levels in such samples are analyzed by this ID/GC/MS-linked hexokinase method to establish traceability to higher-order reference material. For method comparison, the glucose concentrations in 577 whole blood samples were measured using the PCA-hexokinase method and the ID/GC/MS method; this resulted in a mean deviation of 0.1%. The mean deviation between BG levels measured with BG systems and by ID/GC/MS in >500 valid whole blood samples was 1.1%. BG systems allow reliable glucose measurement if a true reference measurement procedure is implemented, with an uninterrupted traceability chain that uses the ID/GC/MS-linked hexokinase method for calibration of the BG systems. Systems should be calibrated by means of a traceable and defined measurement procedure to avoid bias. PMID:24876614

  9. Identification and antimicrobial susceptibility testing of Staphylococcus vitulinus by the BD phoenix automated microbiology system.

    PubMed

    Cirković, Ivana; Hauschild, Tomasz; Jezek, Petr; Dimitrijević, Vladimir; Vuković, Dragana; Stepanović, Srdjan

    2008-08-01

    This study evaluated the performance of the BD Phoenix system for the identification (ID) and antimicrobial susceptibility testing (AST) of Staphylococcus vitulinus. Of the 10 S. vitulinus isolates included in the study, 2 were obtained from the Czech Collection of Microorganisms, 5 from the environment, 2 from human clinical samples, and 1 from an animal source. The results of conventional biochemical and molecular tests served as the reference method for ID, while antimicrobial susceptibility testing performed in accordance with Clinical and Laboratory Standards Institute recommendations, together with PCR for the mecA gene, served as the reference for AST. Three isolates were incorrectly identified by the BD Phoenix system; one of these was incorrectly identified at the genus level, and two at the species level. The results of AST by the BD Phoenix system were in agreement with those of the reference methods. While the results of susceptibility testing compared favorably, the 70% accuracy of the Phoenix system for identification of this unusual staphylococcal species was not fully satisfactory.

  10. Study on the calibration and optimization of double theodolites baseline

    NASA Astrophysics Data System (ADS)

    Ma, Jing-yi; Ni, Jin-ping; Wu, Zhi-chao

    2018-01-01

    Because the baseline of a double-theodolite measurement system serves as the scale benchmark of the system and affects its accuracy, this paper puts forward a method for calibrating and optimizing the double-theodolite baseline. The two theodolites measure a reference ruler of known length, and the baseline is then obtained by inverting the baseline formula. Based on the law of error propagation, the analyses show that the baseline error function is an important index of system accuracy, and that the position, posture, and other parameters of the reference ruler affect the baseline error. An optimization model is established with the baseline error function as the objective function, and the position and posture of the reference ruler are optimized. The simulation results show that the height of the reference ruler has no effect on the baseline error, that the effect of posture is not uniform, and that the baseline error is smallest when the reference ruler is placed at x=500 mm and y=1000 mm in the measurement space. The experimental results are consistent with the theoretical analyses over the measurement space. This study of reference ruler placement therefore provides a useful reference for improving the accuracy of double-theodolite measurement systems.

  11. Reference genes for normalization of gene expression studies in human osteoarthritic articular cartilage.

    PubMed

    Pombo-Suarez, Manuel; Calaza, Manuel; Gomez-Reino, Juan J; Gonzalez, Antonio

    2008-01-29

    Assessment of gene expression is an important component of osteoarthritis (OA) research, greatly improved by the development of quantitative real-time PCR (qPCR). This technique requires normalization for precise results, yet no suitable reference genes have been identified in human articular cartilage. We have examined ten well-known reference genes to determine the most adequate for this application. Analyses of expression stability in cartilage from 10 patients with hip OA, 8 patients with knee OA and 10 controls without OA were done with classical statistical tests and the software programs geNorm and NormFinder. Results from the three methods of analysis were broadly concordant. Some of the commonly used reference genes, GAPDH, ACTB and 18S RNA, performed poorly in our analysis. In contrast, the rarely used TBP, RPL13A and B2M genes were the best. It was necessary to use several of these three genes together to obtain the best results. The specific combination depended, to some extent, on the type of samples being compared. Our results provide a satisfactory set of previously unused reference genes for qPCR in hip and knee OA. This confirms the need to evaluate the suitability of reference genes in every tissue and experimental situation before starting the quantitative assessment of gene expression by qPCR.

  12. Reliability of Different Mark-Recapture Methods for Population Size Estimation Tested against Reference Population Sizes Constructed from Field Data

    PubMed Central

    Grimm, Annegret; Gruber, Bernd; Henle, Klaus

    2014-01-01

    Reliable estimates of population size are fundamental in many ecological studies and biodiversity conservation. Selecting appropriate methods to estimate abundance is often very difficult, especially if data are scarce. Most studies concerning the reliability of different estimators used simulation data based on assumptions about capture variability that do not necessarily reflect conditions in natural populations. Here, we used data from an intensively studied closed population of the arboreal gecko Gehyra variegata to construct reference population sizes for assessing twelve different population size estimators in terms of bias, precision, accuracy, and their 95%-confidence intervals. Two of the reference populations reflect natural biological entities, whereas the other reference populations reflect artificial subsets of the population. Since individual heterogeneity was assumed, we tested modifications of the Lincoln-Petersen estimator, a set of models in programs MARK and CARE-2, and a truncated geometric distribution. Ranking of methods was similar across criteria. Models accounting for individual heterogeneity performed best in all assessment criteria. For populations from heterogeneous habitats without obvious covariates explaining individual heterogeneity, we recommend using the moment estimator or the interpolated jackknife estimator (both implemented in CAPTURE/MARK). If data for capture frequencies are substantial, we recommend the sample coverage or the estimating equation (both models implemented in CARE-2). Depending on the distribution of catchabilities, our proposed multiple Lincoln-Petersen and a truncated geometric distribution obtained comparably good results. The former usually resulted in a minimum population size and the latter can be recommended when there is a long tail of low capture probabilities. Models with covariates and mixture models performed poorly. Our approach identified suitable methods and extended options to evaluate the performance of mark-recapture population size estimators under field conditions, which is essential for selecting an appropriate method and obtaining reliable results in ecology and conservation biology, and thus for sound management. PMID:24896260
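
    As a minimal, hedged example of the simplest estimator family compared above, the sketch below implements the bias-corrected (Chapman) form of the Lincoln-Petersen estimator for two capture occasions; the heterogeneity models fitted in MARK and CARE-2, and the study's multiple Lincoln-Petersen variant, are more elaborate, and the capture counts shown are illustrative.

        def chapman_estimate(n1, n2, m2):
            """n1, n2: animals caught on occasions 1 and 2; m2: marked animals recaptured."""
            n_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
            var = ((n1 + 1) * (n2 + 1) * (n1 - m2) * (n2 - m2)
                   / ((m2 + 1) ** 2 * (m2 + 2)))
            return n_hat, var

        n_hat, var = chapman_estimate(n1=45, n2=52, m2=18)   # illustrative capture counts
        print(f"estimated population size: {n_hat:.0f} (SE {var ** 0.5:.1f})")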

  13. Determination of Age-Dependent Reference Ranges for Coagulation Tests Performed Using Destiny Plus

    PubMed Central

    Arslan, Fatma Demet; Serdar, Muhittin; Merve Ari, Elif; Onur Oztan, Mustafa; Hikmet Kozcu, Sureyya; Tarhan, Huseyin; Cakmak, Ozgur; Zeytinli, Merve; Yasar Ellidag, Hamit

    2016-01-01

    Background In order to apply the right treatment for hemostatic disorders in pediatric patients, laboratory data should be interpreted with age-appropriate reference ranges. Objectives The purpose of this study was to determine age-dependent reference range values for prothrombin time (PT), activated partial thromboplastin time (aPTT), fibrinogen, and D-dimer tests. Materials and Methods A total of 320 volunteers were included in the study with the following ages: 1 month - 1 year (n = 52), 2 - 5 years (n = 50), 6 - 10 years (n = 48), 11 - 17 years (n = 38), and 18 - 65 years (n = 132). Each volunteer completed a survey to exclude hemostatic system disorders. Using a nonparametric method, the lower and upper limits, encompassing the central 95% of the distribution, and their 90% confidence intervals were calculated. Results No statistically significant differences were found between PT and aPTT values in the groups consisting of children. Thus, the reference ranges were separated into child and adult age groups. PT and aPTT values were significantly higher in the children than in the adults. Fibrinogen values in the 6 - 10 age group and the adult age group were significantly higher than in the other groups. D-dimer levels were significantly lower in those aged 2 - 17; thus, a separate reference range was established. Conclusions These results support other findings related to developmental hemostasis, confirming that adult and pediatric age groups should be evaluated using different reference ranges. PMID:27617078
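
    A minimal sketch of the nonparametric calculation, assuming hypothetical aPTT values for one age group: the reference limits are the 2.5th and 97.5th percentiles (central 95% of the distribution), and the 90% confidence intervals around each limit are obtained here by bootstrap rather than by the exact rank-based approach.

        import numpy as np

        rng = np.random.default_rng(0)
        aptt = rng.normal(32.0, 3.5, size=50)                 # placeholder measurements (s)

        lower, upper = np.percentile(aptt, [2.5, 97.5])       # nonparametric reference limits

        boot = np.array([np.percentile(rng.choice(aptt, aptt.size, replace=True), [2.5, 97.5])
                         for _ in range(2000)])
        lower_ci = np.percentile(boot[:, 0], [5, 95])         # 90% CI of the lower limit
        upper_ci = np.percentile(boot[:, 1], [5, 95])         # 90% CI of the upper limit
        print(f"reference interval: {lower:.1f}-{upper:.1f} s")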

  14. Investigation on the Reference Evapotranspiration Distribution at Regional Scale By Alternative Methods to Compute the FAO Penman-Monteith Equation

    NASA Astrophysics Data System (ADS)

    Snyder, R. L.; Mancosu, N.; Spano, D.

    2014-12-01

    This study derived the summer (June-August) reference evapotranspiration distribution map for Sardinia (Italy) based on weather station data and use of a geographic information system (GIS). A modified daily Penman-Monteith equation from the Food and Agriculture Organization of the United Nations (UN-FAO) and the American Society of Civil Engineers Environmental and Water Resources Institute (ASCE-EWRI) was used to calculate the Standardized Reference Evapotranspiration (ETos) for all weather stations having a "full" set of required data for the calculations. For stations having only temperature data (partial stations), the Hargreaves-Samani equation was used to estimate the reference evapotranspiration for a grass surface (ETo). The ETos and ETo results were different depending on the local climate, so two methods to estimate ETos from the ETo were tested. Substitution of missing solar radiation, wind speed, and humidity data from a nearby station within a similar microclimate was found to give better results than using a calibration factor that related ETos and ETo. Therefore, the substitution method was used to estimate ETos at "partial" stations having only temperature data. The combination of 63 full and partial stations was sufficient to use GIS to map ETos for Sardinia. Three interpolation methods were studied, and the ordinary kriging model fitted the observed data better than a radial basis function or the inverse distance weighting method. Using station data points to create a regional map simplified the zonation of ETos when large-scale computations were needed. Making a distinction based on ETos classes allows the simulation of crop water requirements for large areas, and it can potentially lead to improved irrigation management and water savings. It also provides a baseline to investigate the possible impact of climate change.
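
    For the temperature-only ("partial") stations, the standard Hargreaves-Samani form can be written as in the sketch below; Ra is extraterrestrial radiation in MJ m-2 day-1, and the 0.408 factor converts it to its evaporation equivalent in mm day-1. The input values are illustrative, and the study's substitution and kriging steps are not shown.

        def hargreaves_samani_et0(t_max, t_min, ra_mj):
            """Daily grass reference ET (mm/day) from daily max/min air temperature (deg C)."""
            t_mean = (t_max + t_min) / 2.0
            return 0.0023 * (t_mean + 17.8) * (t_max - t_min) ** 0.5 * 0.408 * ra_mj

        print(hargreaves_samani_et0(t_max=34.0, t_min=19.0, ra_mj=41.0))   # ~6.6 mm/day, illustrative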

  15. [Tasks and duties of veterinary reference laboratories for food borne zoonoses].

    PubMed

    Ellerbroek, Lüppo; Alter, T; Johne, R; Nöckler, K; Beutin, L; Helmuth, R

    2009-02-01

    Reference laboratories are of central importance for consumer protection. Field expertise and high scientific competence are basic requirements for the nomination of a national reference laboratory. To ensure a common approach in the analysis of zoonotic hazards, standards have been developed by the reference laboratories together with national official laboratories on the basis of Art. 33 of Regulation (EC) No. 882/2004. Reference laboratories function as arbitration boards in cases of ambiguous or disputed results. New methods for detection of zoonotic agents are developed and validated to provide tools for analysis, e.g., in legal cases, if results from different parties are disputed. Besides these tasks, national reference laboratories offer capacity building and advanced training courses and control the performance of ring trials to ensure consistency in the quality of analyses in official laboratories. All reference laboratories work according to the ISO standard 17025, which defines the grounds for strict laboratory quality rules, and in cooperation with the respective Community Reference Laboratories (CRL). From the group of veterinary reference laboratories for food-borne zoonoses, the national reference laboratories are responsible for Listeria monocytogenes, for Campylobacter, for the surveillance and control of viral and bacterial contamination of bivalve molluscs, for E. coli, for the performance of analysis and tests on zoonoses (Salmonella), and, from the group of parasitological zoonotic agents, the national reference laboratory for Trichinella.

  16. Single-trial event-related potential extraction through one-unit ICA-with-reference.

    PubMed

    Lee, Wee Lih; Tan, Tele; Falkmer, Torbjörn; Leung, Yee Hong

    2016-12-01

    In recent years, ICA has been one of the more popular methods for extracting an event-related potential (ERP) at the single-trial level. It is a blind source separation technique that allows the extraction of an ERP without making strong assumptions on the temporal and spatial characteristics of the ERP. However, the problem with traditional ICA is that the extraction is not direct and is time-consuming due to the need for source selection processing. In this paper, the application of a one-unit ICA-with-Reference (ICA-R), a constrained ICA method, is proposed. In cases where the time-region of the desired ERP is known a priori, this time information is utilized to generate a reference signal, which is then used to guide the one-unit ICA-R to extract the source signal of the desired ERP directly. Our results showed that, compared to traditional ICA, ICA-R is a more effective method for analysing ERPs because it avoids manual source selection and requires less computation, thus resulting in faster ERP extraction. In addition, since the method is automated, it reduces the risk of subjective bias in the ERP analysis. It is also a potential tool for extracting the ERP in online applications.

  17. An Adaptive Filter for the Removal of Drifting Sinusoidal Noise Without a Reference.

    PubMed

    Kelly, John W; Siewiorek, Daniel P; Smailagic, Asim; Wang, Wei

    2016-01-01

    This paper presents a method for filtering sinusoidal noise with a variable-bandwidth filter that is capable of tracking a sinusoid's drifting frequency. The method, which is based on the adaptive noise canceling (ANC) technique, is referred to here as the adaptive sinusoid canceler (ASC). The ASC eliminates sinusoidal contamination by tracking its frequency and achieving a narrower bandwidth than typical notch filters. The detected frequency is used to digitally generate an internal reference instead of relying on an external one, as ANC filters typically do. The filter's bandwidth adjusts to achieve faster and more accurate convergence. In this paper, the discussion and the data focus on physiological signals, specifically electrocorticographic (ECoG) neural data contaminated with power line noise, but the presented technique could be applicable to other recordings as well. On simulated data, the ASC was able to reliably track the noise's frequency, properly adjust its bandwidth, and outperform comparative methods including standard notch filters and an adaptive line enhancer. These results were reinforced by visual inspection of results obtained from real ECoG data. The ASC showed that it could be an effective method for increasing the signal-to-noise ratio in the presence of drifting sinusoidal noise, which is of significant interest for biomedical applications.
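
    A minimal sketch of the underlying idea, assuming a fixed detected frequency: two internally generated quadrature references at that frequency are combined with LMS-adapted weights and subtracted from the recording. The frequency tracking and bandwidth adaptation that define the ASC are omitted, and the signal, sampling rate, and step size below are illustrative.

        import numpy as np

        def cancel_sinusoid(x, fs, f0=60.0, mu=0.01):
            n = np.arange(x.size)
            r_cos = np.cos(2 * np.pi * f0 * n / fs)      # internal in-phase reference
            r_sin = np.sin(2 * np.pi * f0 * n / fs)      # internal quadrature reference
            w_cos = w_sin = 0.0
            out = np.empty_like(x, dtype=float)
            for i in range(x.size):
                y = w_cos * r_cos[i] + w_sin * r_sin[i]  # current interference estimate
                e = x[i] - y                             # cleaned sample
                w_cos += 2 * mu * e * r_cos[i]           # LMS weight updates
                w_sin += 2 * mu * e * r_sin[i]
                out[i] = e
            return out

        fs = 1000.0
        t = np.arange(0, 2, 1 / fs)
        ecog = 0.1 * np.random.randn(t.size) + 0.8 * np.sin(2 * np.pi * 60 * t + 0.7)
        clean = cancel_sinusoid(ecog, fs)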

  18. Spectral multivariate calibration without laboratory prepared or determined reference analyte values.

    PubMed

    Ottaway, Josh; Farrell, Jeremy A; Kalivas, John H

    2013-02-05

    An essential part of calibration is establishing the analyte calibration reference samples. These samples must characterize the sample matrix and measurement conditions (chemical, physical, instrumental, and environmental) of any sample to be predicted. Calibration usually requires measuring spectra for numerous reference samples in addition to determining the corresponding analyte reference values. Both tasks are typically time-consuming and costly. This paper reports on a method named pure component Tikhonov regularization (PCTR) that does not require laboratory prepared or determined reference values. Instead, an analyte pure component spectrum is used in conjunction with nonanalyte spectra for calibration. Nonanalyte spectra can be from different sources including pure component interference samples, blanks, and constant analyte samples. The approach is also applicable to calibration maintenance when the analyte pure component spectrum is measured in one set of conditions and nonanalyte spectra are measured in new conditions. The PCTR method balances the trade-offs between calibration model shrinkage and the degree of orthogonality to the nonanalyte content (model direction) in order to obtain accurate predictions. Using visible and near-infrared (NIR) spectral data sets, the PCTR results are comparable to those obtained using ridge regression (RR) with reference calibration sets. The flexibility of PCTR also allows including reference samples if such samples are available.
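
    One rough reading of the PCTR idea, not the authors' exact formulation: find a regression vector that gives a unit response to the analyte pure-component spectrum while staying small (ridge shrinkage) and nearly orthogonal to the nonanalyte spectra, solved below as a stacked least-squares problem with illustrative penalty weights.

        import numpy as np

        def pctr_like(s, nonanalyte, lam=1e-3, eta=1.0):
            """s: pure component spectrum (p,); nonanalyte: (m, p) spectra to be nulled."""
            p = s.size
            A = np.vstack([s.reshape(1, -1),            # s . b ~ 1 (unit analyte response)
                           np.sqrt(lam) * np.eye(p),    # Tikhonov shrinkage of b
                           np.sqrt(eta) * nonanalyte])  # nonanalyte . b ~ 0
            y = np.concatenate([[1.0], np.zeros(p), np.zeros(nonanalyte.shape[0])])
            b, *_ = np.linalg.lstsq(A, y, rcond=None)
            return b                                    # prediction for a spectrum x: x @ b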

  19. Evaluation of purity with its uncertainty value in high purity lead stick by conventional and electro-gravimetric methods

    PubMed Central

    2013-01-01

    Background A conventional gravimetry and electro-gravimetry study has been carried out for the precise and accurate purity determination of lead (Pb) in a high-purity lead stick and for the preparation of a reference standard. Reference materials are standards containing a known amount of an analyte and provide a reference value to determine unknown concentrations or to calibrate analytical instruments. A stock solution of approximately 2 kg was prepared by dissolving approximately 2 g of the Pb stick in 5% ultra-pure nitric acid. From the stock solution, five replicates of approximately 50 g were taken for determination of purity by each method. The Pb was determined as PbSO4 by conventional gravimetry and as PbO2 by electro-gravimetry. The percentage purity of the metallic Pb was calculated accordingly from the PbSO4 and PbO2. Results On the basis of the experimental observations, the purity of Pb by conventional gravimetry and electro-gravimetry was found to be 99.98 ± 0.24 and 99.97 ± 0.27 g/100 g, respectively, and on the basis of the Pb purity the concentrations of the reference standard solutions were found to be 1000.88 ± 2.44 and 1000.81 ± 2.68 mg kg-1, respectively, at the 95% confidence level (k = 2). The uncertainty evaluation of the Pb determination has also been carried out following EURACHEM/GUM guidelines. The final analytical results, with their quantified uncertainty, fulfill this requirement and give a measure of the confidence level of the laboratory concerned. Conclusions Gravimetry is the most reliable technique in comparison with titrimetry and instrumental methods, and the results of gravimetry are directly traceable to SI units. Gravimetric analysis, if the methods are followed carefully, provides exceedingly precise analysis. In classical gravimetry the major uncertainties are due to repeatability, but in electro-gravimetry several other factors also affect the final results. PMID:23800080
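
    A minimal sketch of the EURACHEM/GUM combination step referred to above: relative standard uncertainty components are added in quadrature and the result is expanded with a coverage factor k = 2 (about 95% confidence). The component names and magnitudes below are illustrative, not the paper's actual uncertainty budget.

        import math

        components = {                       # illustrative relative standard uncertainties
            "repeatability": 0.0010,
            "balance calibration": 0.0004,
            "reagent purity": 0.0006,
            "volume and temperature": 0.0003,
        }
        u_c = math.sqrt(sum(u ** 2 for u in components.values()))   # combined standard uncertainty
        purity = 99.98                       # g/100 g, gravimetric result
        U = 2 * u_c * purity                 # expanded uncertainty, k = 2
        print(f"purity = {purity:.2f} +/- {U:.2f} g/100 g (k = 2)")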

  20. Evaluating Level of Specificity of Normative Referents in Relation to Personal Drinking Behavior*

    PubMed Central

    Larimer, Mary E.; Kaysen, Debra L.; Lee, Christine M.; Kilmer, Jason R.; Lewis, Melissa A.; Dillworth, Tiara; Montoya, Heidi D.; Neighbors, Clayton

    2009-01-01

    Objective: Research has found perceived descriptive norms to be one of the strongest predictors of college student drinking, and several intervention approaches have incorporated normative feedback to correct misperceptions of peer drinking behavior. Little research has focused on the role of the reference group in normative perceptions. The current study sought to examine whether normative perceptions vary based on specificity of the reference group and whether perceived norms for more specific reference-group norms are related to individual drinking behavior. Method: Participants were first-year undergraduates (n = 1,276, 58% female) randomly selected from a university list of incoming students. Participants reported personal drinking behavior and perceived descriptive norms for eight reference groups, including typical student; same gender, ethnicity, or residence; and combinations of those reference groups (e.g., same gender and residence). Results: Findings indicated that participants distinguished among different reference groups in estimating descriptive drinking norms. Moreover, results indicated misperceptions in drinking norms were evident at all levels of specificity of the reference group. Additionally, findings showed perceived norms for more specific groups were uniquely related to participants' own drinking. Conclusions: These results suggest that providing normative feedback targeting at least one level of specificity to the participant (i.e., beyond what the “typical” student does) may be an important tool in normative feedback interventions. PMID:19538919

  1. A Comparative Study of Average, Linked Mastoid, and REST References for ERP Components Acquired during fMRI

    PubMed Central

    Yang, Ping; Fan, Chenggui; Wang, Min; Li, Ling

    2017-01-01

    In simultaneous electroencephalogram (EEG) and functional magnetic resonance imaging (fMRI) studies, average reference (AR), and digitally linked mastoid (LM) are popular re-referencing techniques in event-related potential (ERP) analyses. However, they may introduce their own physiological signals and alter the EEG/ERP outcome. A reference electrode standardization technique (REST) that calculated a reference point at infinity was proposed to solve this problem. To confirm the advantage of REST in ERP analyses of synchronous EEG-fMRI studies, we compared the reference effect of AR, LM, and REST on task-related ERP results of a working memory task during an fMRI scan. As we hypothesized, we found that the adopted reference did not change the topography map of ERP components (N1 and P300 in the present study), but it did alter the task-related effect on ERP components. LM decreased or eliminated the visual working memory (VWM) load effect on P300, and the AR distorted the distribution of VWM location-related effect at left posterior electrodes as shown in the statistical parametric scalp mapping (SPSM) of N1. ERP cortical source estimates, which are independent of the EEG reference choice, were used as the golden standard to infer the relative utility of different references on the ERP task-related effect. By comparison, REST reference provided a more integrated and reasonable result. These results were further confirmed by the results of fMRI activations and a corresponding EEG-only study. Thus, we recommend the REST, especially with a realistic head model, as the optimal reference method for ERP data analysis in simultaneous EEG-fMRI studies. PMID:28529472

  3. Development and Validation of a Rapid 13C6-Glucose Isotope Dilution UPLC-MRM Mass Spectrometry Method for Use in Determining System Accuracy and Performance of Blood Glucose Monitoring Devices

    PubMed Central

    Matsunami, Risë K.; Angelides, Kimon; Engler, David A.

    2015-01-01

    Background: There is currently considerable discussion about the accuracy of blood glucose concentrations determined by personal blood glucose monitoring systems (BGMS). To date, the FDA has allowed new BGMS to demonstrate accuracy in reference to other glucose measurement systems that use the same or similar enzymatic-based methods to determine glucose concentration. These types of reference measurement procedures are only comparative in nature and are subject to the same potential sources of error in measurement and system perturbations as the device under evaluation. It would be ideal to have a completely orthogonal primary method that could serve as a true standard reference measurement procedure for establishing the accuracy of new BGMS. Methods: An isotope-dilution liquid chromatography/mass spectrometry (ID-UPLC-MRM) assay was developed using 13C6-glucose as a stable isotope analogue to specifically measure glucose concentration in human plasma, and validated for use against NIST standard reference materials, and against fresh isolates of whole blood and plasma into which exogenous glucose had been spiked. Assay performance was quantified to NIST-traceable dry weight measures for both glucose and 13C6-glucose. Results: The newly developed assay method was shown to be rapid, highly specific, sensitive, accurate, and precise for measuring plasma glucose levels. The assay displayed sufficient dynamic range and linearity to measure across the range of both normal and diabetic blood glucose levels. Assay performance was measured to within the same uncertainty levels (<1%) as the NIST definitive method for glucose measurement in human serum. Conclusions: The newly developed ID UPLC-MRM assay can serve as a validated reference measurement procedure to which new BGMS can be assessed for glucose measurement performance. PMID:25986627
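
    A minimal sketch of the quantification step, under the assumption that calibrators and samples receive the same amount of 13C6-glucose internal standard: the analyte-to-internal-standard MRM area ratio is regressed against calibrator concentration, and sample concentrations are read from the line. All numbers below are hypothetical.

        import numpy as np

        cal_conc  = np.array([2.0, 5.0, 10.0, 20.0, 30.0])     # glucose calibrators, mmol/L
        cal_ratio = np.array([0.21, 0.52, 1.03, 2.05, 3.08])   # area(glucose) / area(13C6-glucose)

        slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)  # assumes a linear response

        sample_ratio = 1.47                                    # measured ratio for a plasma sample
        glucose = (sample_ratio - intercept) / slope
        print(f"plasma glucose: {glucose:.2f} mmol/L")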

  4. 40 CFR Table E-1 to Subpart E of... - Summary of Test Requirements for Reference and Class I Equivalent Methods for PM2.5 and PM10-2.5

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Reference and Class I Equivalent Methods for PM2.5 and PM10-2.5 E Table E-1 to Subpart E of Part 53... MONITORING REFERENCE AND EQUIVALENT METHODS Procedures for Testing Physical (Design) and Performance Characteristics of Reference Methods and Class I and Class II Equivalent Methods for PM2.5 or PM10-2.5 Pt. 53...

  5. Direct-to-digital holography reduction of reference hologram noise and fourier space smearing

    DOEpatents

    Voelkl, Edgar

    2006-06-27

    Systems and methods are described for reduction of reference hologram noise and reduction of Fourier space smearing, especially in the context of direct-to-digital holography (off-axis interferometry). A method of reducing reference hologram noise includes: recording a plurality of reference holograms; processing the plurality of reference holograms into a corresponding plurality of reference image waves; and transforming the corresponding plurality of reference image waves into a reduced noise reference image wave. A method of reducing smearing in Fourier space includes: recording a plurality of reference holograms; processing the plurality of reference holograms into a corresponding plurality of reference complex image waves; transforming the corresponding plurality of reference image waves into a reduced noise reference complex image wave; recording a hologram of an object; processing the hologram of the object into an object complex image wave; and dividing the complex image wave of the object by the reduced noise reference complex image wave to obtain a reduced smearing object complex image wave.
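
    A minimal numerical sketch of the claimed reduction steps, assuming the complex image waves have already been extracted from the off-axis holograms (for example by isolating the sideband in Fourier space, which is not shown): several reference complex image waves are averaged into a reduced-noise reference, and the object complex image wave is divided by it.

        import numpy as np

        def reduced_noise_reference(ref_waves):
            """ref_waves: sequence of 2-D complex reference image waves."""
            return np.mean(np.stack(list(ref_waves)), axis=0)

        def corrected_object_wave(obj_wave, ref_waves, eps=1e-12):
            ref = reduced_noise_reference(ref_waves)
            return obj_wave / (ref + eps)       # removes shared amplitude/phase structure and noise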

  6. A mixture approach to the acoustic properties of a macroscopically inhomogeneous porous aluminum in the equivalent fluid approximation.

    PubMed

    Sacristan, C J; Dupont, T; Sicot, O; Leclaire, P; Verdière, K; Panneton, R; Gong, X L

    2016-10-01

    The acoustic properties of an air-saturated macroscopically inhomogeneous aluminum foam in the equivalent fluid approximation are studied. A reference sample built by forcing a highly compressible melamine foam with a conical shape inside a constant-diameter rigid tube is studied first. In this process, a radial compression varying with depth is applied. With the help of an assumption on the compressed pore geometry, the properties of the reference sample can be modelled everywhere in the thickness, and it is possible to use the classical transfer matrix method as a theoretical reference. In the mixture approach, the material is viewed as a mixture of two known materials placed in a patchwork configuration, with the proportion of each varying with depth. The properties are derived from the use of a mixing law. For the reference sample, the classical transfer matrix method is used to validate the experimental results. These results are used to validate the mixture approach. The mixture approach is then used to characterize a porous aluminum for which only the properties of the external faces are known. A porosity profile is needed and is obtained from a simulated annealing optimization process.
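
    A normal-incidence transfer-matrix sketch for a layered sample on a rigid backing, as a hedged stand-in for the classical method mentioned above: each layer is reduced to a characteristic impedance Z and wavenumber k (complex values standing in for an equivalent-fluid model), and the layer values below are purely illustrative.

        import numpy as np

        def surface_impedance(layers):
            """layers: list of (Z, k, d) from front to back; Z and k may be complex."""
            T = np.eye(2, dtype=complex)
            for Z, k, d in layers:
                Tl = np.array([[np.cos(k * d), 1j * Z * np.sin(k * d)],
                               [1j * np.sin(k * d) / Z, np.cos(k * d)]])
                T = T @ Tl
            return T[0, 0] / T[1, 0]            # rigid backing: particle velocity is zero at the back

        Z0 = 413.0                               # characteristic impedance of air (Pa s/m)
        layers = [(600 + 150j, 90 - 25j, 0.02),  # illustrative lossy values (Z, k in rad/m, d in m)
                  (900 + 300j, 70 - 40j, 0.03)]
        Zs = surface_impedance(layers)
        alpha = 1 - abs((Zs - Z0) / (Zs + Z0)) ** 2   # normal-incidence absorption coefficient
        print(alpha)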

  7. TU-AB-BRC-03: Accurate Tissue Characterization for Monte Carlo Dose Calculation Using Dual- and Multi-Energy CT Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lalonde, A; Bouchard, H

    Purpose: To develop a general method for human tissue characterization with dual- and multi-energy CT and evaluate its performance in determining elemental compositions and the associated proton stopping power relative to water (SPR) and photon mass absorption coefficients (EAC). Methods: Principal component analysis is used to extract an optimal basis of virtual materials from a reference dataset of tissues. These principal components (PC) are used to perform two-material decomposition using simulated DECT data. The elemental mass fraction and the electron density in each tissue are retrieved by measuring the fraction of each PC. A stoichiometric calibration method is adapted to the technique to make it suitable for clinical use. The present approach is compared with two others: parametrization and three-material decomposition using the water-lipid-protein (WLP) triplet. Results: Monte Carlo simulations using TOPAS for four reference tissues show that characterizing them with only two PCs is enough to obtain submillimetric precision on proton range prediction. Based on the simulated DECT data of 43 reference tissues, the proposed method is in agreement with theoretical values of proton SPR and low-kV EAC with RMS errors of 0.11% and 0.35%, respectively. In comparison, parametrization and WLP respectively yield RMS errors of 0.13% and 0.29% on SPR, and 2.72% and 2.19% on EAC. Furthermore, the proposed approach shows potential applications for spectral CT. Using five PCs and five energy bins reduces the SPR RMS error to 0.03%. Conclusion: The proposed method shows good performance in determining elemental compositions from DECT data and physical quantities relevant to radiotherapy dose calculation, and generally shows better accuracy and unbiased results compared to reference methods. The proposed method is particularly suitable for Monte Carlo calculations and shows promise in using more than two energies to characterize human tissue with CT.

  8. A statistically robust EEG re-referencing procedure to mitigate reference effect

    PubMed Central

    Lepage, Kyle Q.; Kramer, Mark A.; Chu, Catherine J.

    2014-01-01

    Background The electroencephalogram (EEG) remains the primary tool for diagnosis of abnormal brain activity in clinical neurology and for in vivo recordings of human neurophysiology in neuroscience research. In EEG data acquisition, voltage is measured at positions on the scalp with respect to a reference electrode. When this reference electrode responds to electrical activity or artifact, all electrodes are affected. Successful analysis of EEG data often involves re-referencing procedures that modify the recorded traces and seek to minimize the impact of reference electrode activity upon functions of the original EEG recordings. New method We provide a novel, statistically robust procedure that adapts a robust maximum-likelihood-type estimator to the problem of reference estimation, reduces the influence of neural activity on the re-referencing operation, and maintains good performance in a wide variety of empirical scenarios. Results The performance of the proposed and existing re-referencing procedures is validated in simulation and with examples of EEG recordings. To facilitate this comparison, channel-to-channel correlations are investigated theoretically and in simulation. Comparison with existing methods The proposed procedure avoids using data contaminated by neural signal and remains unbiased in recording scenarios where physical references, the common average reference (CAR), and the reference electrode standardization technique (REST) are not optimal. Conclusion The proposed procedure is simple, fast, and avoids the potential for substantial bias when analyzing low-density EEG data. PMID:24975291
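
    Not the authors' maximum-likelihood-type estimator, but a simple robust alternative to the common average reference shown for comparison: subtracting the per-sample median across channels, which is less influenced by a few channels carrying large neural activity or artifact.

        import numpy as np

        def common_average_reference(eeg):
            """eeg: (n_channels, n_samples); standard CAR, for comparison."""
            return eeg - eeg.mean(axis=0, keepdims=True)

        def median_reference(eeg):
            """Robust variant: subtract the per-sample median across channels."""
            return eeg - np.median(eeg, axis=0, keepdims=True)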

  9. Identification of appropriate reference genes for human mesenchymal stem cell analysis by quantitative real-time PCR.

    PubMed

    Li, Xiuying; Yang, Qiwei; Bai, Jinping; Xuan, Yali; Wang, Yimin

    2015-01-01

    Normalization to a reference gene is the method of choice for quantitative reverse transcription-PCR (RT-qPCR) analysis. The stability of reference genes is critical for accurate experimental results and conclusions. We have evaluated the expression stability of eight commonly used reference genes in four different human mesenchymal stem cells (MSC). Using the geNorm, NormFinder and BestKeeper algorithms, we show that beta-2-microglobulin and peptidyl-prolyl isomerase A were the optimal reference genes for normalizing RT-qPCR data obtained from MSC, whereas the TATA box binding protein was not suitable due to its extensive variability in expression. Our findings emphasize the significance of validating reference genes for qPCR analyses. We offer a short list of reference genes to use for normalization and recommend some commercially available software programs as a rapid approach to validate reference genes. We also demonstrate that two frequently used reference genes, β-actin and glyceraldehyde-3-phosphate dehydrogenase, are not always suitable in many cases.

  10. Image classification at low light levels

    NASA Astrophysics Data System (ADS)

    Wernick, Miles N.; Morris, G. Michael

    1986-12-01

    An imaging photon-counting detector is used to achieve automatic sorting of two image classes. The classification decision is formed on the basis of the cross correlation between a photon-limited input image and a reference function stored in computer memory. Expressions for the statistical parameters of the low-light-level correlation signal are given and are verified experimentally. To obtain a correlation-based system for two-class sorting, it is necessary to construct a reference function that produces useful information for class discrimination. An expression for such a reference function is derived using maximum-likelihood decision theory. Theoretically predicted results are used to compare the performance of the maximum-likelihood reference function with that of Fukunaga-Koontz basis vectors and average filters. For each method, good class discrimination is found to result within milliseconds from a sparse sampling of the input image.

  11. A Profilometry-Based Dentifrice Abrasion Method for V8 Brushing Machines Part II: Comparison of RDA-PE and Radiotracer RDA Measures.

    PubMed

    Schneiderman, Eva; Colón, Ellen; White, Donald J; St John, Samuel

    2015-01-01

    The purpose of this study was to compare the abrasivity of commercial dentifrices by two techniques: the conventional gold standard radiotracer-based Radioactive Dentin Abrasivity (RDA) method; and a newly validated technique based on V8 brushing that included a profilometry-based evaluation of dentin wear. This profilometry-based method is referred to as RDA-Profilometry Equivalent, or RDA-PE. A total of 36 dentifrices were sourced from four global dentifrice markets (Asia Pacific [including China], Europe, Latin America, and North America) and tested blindly using both the standard radiotracer (RDA) method and the new profilometry method (RDA-PE), taking care to follow specific details related to specimen preparation and treatment. Commercial dentifrices tested exhibited a wide range of abrasivity, with virtually all falling well under the industry accepted upper limit of 250; that is, 2.5 times the level of abrasion measured using an ISO 11609 abrasivity reference calcium pyrophosphate as the reference control. RDA and RDA-PE comparisons were linear across the entire range of abrasivity (r2 = 0.7102) and both measures exhibited similar reproducibility with replicate assessments. RDA-PE assessments were not just linearly correlated, but were also proportional to conventional RDA measures. The linearity and proportionality of the results of the current study support that both methods (RDA or RDA-PE) provide similar results and justify a rationale for making the upper abrasivity limit of 250 apply to both RDA and RDA-PE.

  12. A symmetrical subtraction combined with interpolated values for eliminating scattering from fluorescence EEM data

    NASA Astrophysics Data System (ADS)

    Xu, Jing; Liu, Xiaofei; Wang, Yutian

    2016-08-01

    Parallel factor analysis is a widely used method to extract qualitative and quantitative information about the analyte of interest from fluorescence excitation-emission matrix (EEM) data containing unknown components. Large-amplitude scattering influences the results of parallel factor analysis. Many methods of eliminating scattering have been proposed, each with its own advantages and disadvantages. Here, the combination of symmetrical subtraction and interpolated values is discussed; the combination refers both to combining results and to combining methods. Nine methods were used for comparison. The results show that the combination of results yields better concentration predictions for all the components.
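
    A minimal sketch of the interpolation ingredient alone, under illustrative assumptions about the scatter band width and wavelength grids: intensities in a band around the first-order Rayleigh line (emission near excitation) are masked and refilled by linear interpolation along each emission scan; the symmetrical-subtraction step and the combination of results are not shown.

        import numpy as np

        def remove_rayleigh(eem, ex, em, half_width=15.0):
            """eem: (n_ex, n_em) intensities; ex, em: wavelength axes in nm."""
            out = eem.astype(float).copy()
            for i, ex_wl in enumerate(ex):
                band = np.abs(em - ex_wl) < half_width          # scatter band for this excitation
                keep = ~band
                out[i, band] = np.interp(em[band], em[keep], out[i, keep])
            return out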

  13. Diagnostic Performance of a Molecular Test versus Clinician Assessment of Vaginitis.

    PubMed

    Schwebke, Jane R; Gaydos, Charlotte A; Nyirjesy, Paul; Paradis, Sonia; Kodsi, Salma; Cooper, Charles K

    2018-06-01

    Vaginitis is a common complaint, diagnosed either empirically or using Amsel's criteria and wet mount microscopy. This study sought to determine characteristics of an investigational test (a molecular test for vaginitis), compared to reference, for detection of bacterial vaginosis, Candida spp., and Trichomonas vaginalis. Vaginal specimens from a cross-sectional study were obtained from 1,740 women (≥18 years old), with vaginitis symptoms, during routine clinic visits (across 10 sites in the United States). Specimens were analyzed using a commercial PCR/fluorogenic probe-based investigational test that detects bacterial vaginosis, Candida spp., and Trichomonas vaginalis. Clinician diagnosis and in-clinic testing (Amsel's test, potassium hydroxide preparation, and wet mount) were also employed to detect the three vaginitis causes. All testing methods were compared to the respective reference methods (Nugent Gram stain for bacterial vaginosis, detection of the Candida gene its2, and Trichomonas vaginalis culture). The investigational test, clinician diagnosis, and in-clinic testing were compared to reference methods for bacterial vaginosis, Candida spp., and Trichomonas vaginalis. The investigational test resulted in significantly higher sensitivity and negative predictive value than clinician diagnosis or in-clinic testing. In addition, the investigational test showed a statistically higher overall percent agreement with each of the three reference methods than did clinician diagnosis or in-clinic testing. The investigational test showed significantly higher sensitivity for detecting vaginitis, involving more than one cause, than did clinician diagnosis. Taken together, these results suggest that a molecular investigational test can facilitate accurate detection of vaginitis. Copyright © 2018 Schwebke et al.

  14. Evaluation of VIDAS Salmonella (SLM) easy Salmonella method for the detection of Salmonella in a variety of foods: collaborative study.

    PubMed

    Crowley, Erin; Bird, Patrick; Fisher, Kiel; Goetz, Katherine; Benzinger, M Joseph; Agin, James; Goins, David; Johnson, Ronald L

    2011-01-01

    The VIDAS Salmonella (SLM) Easy Salmonella method is a specific enzyme-linked fluorescent immunoassay performed in the automated VIDAS instrument. The VIDAS Easy Salmonella method is a simple 2-step enrichment procedure, using pre-enrichment followed by selective enrichment in a newly formulated broth, SX2 broth. This new method was compared in a multilaboratory collaborative study to the U.S. Food and Drug Administration's Bacteriological Analytical Manual, Chapter 5 method for five food matrixes (liquid egg, vanilla ice cream, spinach, raw shrimp, and peanut butter) and the U.S. Department of Agriculture's Microbiology Laboratory Guidebook 4.04 method for deli turkey. Each food type was artificially contaminated with Salmonella at three inoculation levels. A total of 15 laboratories representing government, academia, and industry, throughout the United States, participated. In this study, 1583 samples were analyzed, of which 792 were paired replicates and 791 were unpaired replicates. Of the 792 paired replicates, 285 were positive by both the VIDAS and reference methods. Of the 791 unpaired replicates, 341 were positive by the VIDAS method and 325 were positive by the cultural reference method. A Chi-square analysis of each of the six food types was performed at the three inoculation levels tested. For all foods evaluated, the VIDAS Easy SLM method demonstrated results comparable to those of the reference methods for the detection of Salmonella.

  15. 40 CFR Appendix A-3 to Part 60 - Test Methods 4 through 5I

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... isokinetic sampling rates prior to a pollutant emission measurement run. The approximation method described... with a pollutant emission measurement run. When it is, calculation of percent isokinetic, pollutant emission rate, etc., for the run shall be based upon the results of the reference method or its equivalent...

  16. Financing New Technologies, Equipment/Furniture Replacement, and Building Renovation: A Survey Report.

    ERIC Educational Resources Information Center

    Shirk, Gary M.

    1984-01-01

    Reports results of survey of methods used by 77 North American academic and public libraries to finance implementation of new technologies, replace equipment and furniture, and renovate buildings. Financing methods used, frequency of use, choice, and range of methods are discussed. Eight references and list of survey participants are appended.…

  17. Acid digestion of geological and environmental samples using open-vessel focused microwave digestion.

    PubMed

    Taylor, Vivien F; Toms, Andrew; Longerich, Henry P

    2002-01-01

    The application of open-vessel focused microwave acid digestion is described for the preparation of geological and environmental samples for analysis by inductively coupled plasma-mass spectrometry (ICP-MS). The method is compared to conventional closed-vessel high-pressure methods, which are limited in the use of HF to break down silicates. Open-vessel acid digestion more conveniently enables the use of HF to remove Si from geological and plant samples as volatile SiF4, as well as evaporation to dryness and sequential acid addition during the procedure. Rock reference materials (G-2 granite, MRG-1 gabbro, SY-2 syenite, JA-1 andesite, and JB-2 and SRM-688 basalts) and plant reference materials (BCR and IAEA lichens, peach leaves, apple leaves, durum wheat flour, and pine needles) were digested with results comparable to conventional hotplate digestion. The microwave digestion method gave poor results for granitic samples containing refractory minerals; fusion remained the preferred method of preparation for these samples. Sample preparation time was reduced from several days with the conventional hotplate digestion method to one hour per sample with the microwave method.

  18. Reference genes for measuring mRNA expression.

    PubMed

    Dundas, Jitesh; Ling, Maurice

    2012-12-01

    The aim of this review is to find answers to some of the questions surrounding reference genes and their reliability for quantitative experiments. Reference genes are assumed to be expressed at a constant level over a range of conditions such as temperature. These genes, such as GAPDH and beta-actin, are used extensively for gene expression studies using techniques like quantitative PCR. Several studies have been carried out on identifying reference genes; however, considerable evidence raises questions about the general suitability of these genes. Recent studies have shown that different factors, including the environment and the methods used, play an important role in changing the expression levels of reference genes. Thus, we conclude that there is no reference gene that can be deemed suitable for all experimental conditions. In addition, we believe that every experiment will require scientific evaluation and selection of the best candidate gene for use as a reference gene to obtain reliable scientific results.

  19. Simplified procedure for computing the absorption of sound by the atmosphere

    DOT National Transportation Integrated Search

    2007-10-31

    This paper describes a study that resulted in the development of a simplified method for calculating attenuation by atmospheric absorption for wide-band sounds analyzed by one-third octave-band filters. The new method [referred to herein as the...

  20. Time-efficient high-resolution whole-brain three-dimensional macromolecular proton fraction mapping

    PubMed Central

    Yarnykh, Vasily L.

    2015-01-01

    Purpose Macromolecular proton fraction (MPF) mapping is a quantitative MRI method that reconstructs parametric maps of the relative amount of macromolecular protons causing the magnetization transfer (MT) effect and provides a biomarker of myelination in neural tissues. This study aimed to develop a high-resolution whole-brain MPF mapping technique utilizing the minimum possible number of source images to reduce scan time. Methods The described technique is based on replacement of an actually acquired reference image without MT saturation by a synthetic one reconstructed from R1 and proton density maps, thus requiring only three source images. This approach enabled whole-brain three-dimensional MPF mapping with an isotropic 1.25×1.25×1.25 mm3 voxel size and a scan time of 20 minutes. The synthetic reference method was validated against standard MPF mapping with acquired reference images based on data from 8 healthy subjects. Results Mean MPF values in segmented white and gray matter appeared in close agreement with no significant bias and small within-subject coefficients of variation (<2%). High-resolution MPF maps demonstrated sharp white-gray matter contrast and clear visualization of anatomical details, including gray matter structures with high iron content. Conclusions The synthetic reference method improves the resolution of MPF mapping and combines accurate MPF measurements with unique neuroanatomical contrast features. PMID:26102097
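    A minimal sketch of the synthetic-reference idea, assuming the no-MT reference image follows the standard spoiled gradient-echo steady-state signal equation; the flip angle, TR and toy map values below are placeholders, not the protocol from the paper.

        import numpy as np

        def synthetic_reference(pd_map, r1_map, flip_deg, tr_s):
            """Synthesize a no-MT reference image from proton density (PD) and R1 maps
            using the spoiled gradient-echo steady-state signal equation."""
            alpha = np.deg2rad(flip_deg)
            e1 = np.exp(-tr_s * r1_map)
            return pd_map * np.sin(alpha) * (1.0 - e1) / (1.0 - np.cos(alpha) * e1)

        # Toy 2x2 "maps" (arbitrary PD units, R1 in 1/s) just to show the call:
        pd_map = np.array([[0.8, 1.0], [0.9, 0.7]])
        r1_map = np.array([[1.0, 0.7], [0.9, 1.2]])
        s_ref = synthetic_reference(pd_map, r1_map, flip_deg=10.0, tr_s=0.02)
        # An MPF fit would then use s_ref in place of an acquired no-MT reference image.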

  1. Quantity quotient reporting. A proposal for a standardized presentation of laboratory results.

    PubMed

    Haeckel, Rainer; Wosniok, Werner

    2009-01-01

    Laboratory results are reported in different units (despite international recommendations for SI units) together with different reference limits, of which several exist for many quantities. It is proposed to adopt the concept of the intelligence quotient and to report quantitative results as a quantity quotient (QQ) in laboratory medicine. This quotient is essentially the difference (measured result minus the mean or mode value of the reference interval) divided by the observed biological variation CV(o). Thus, all quantities are reported in the same unit system with the same reference limits (for convenience shifted to, e.g., 80-120). The critical difference can also be included in this standardization concept. In this way, the information of the reference interval and the original result are integrated into one combined value, which has the same format for all quantities suited for quotient reporting (QR). The QR proposal does not interfere with the current concepts of traceability, SI units, or method standardization. It represents a further step towards harmonization of reporting and provides simple values that can be interpreted easily by physicians and their patients.
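    A small worked sketch of the proposed quotient as described above: the measured result minus the centre of the reference interval, divided by the observed biological variation, alongside one plausible (assumed) linear mapping onto the 80-120 convenience band; the exact scaling constants in the original proposal may differ.

        def quantity_quotient(result, ref_low, ref_high, cv_o_percent):
            """Return (z, qq): the CV(o)-standardized difference described above,
            plus one plausible linear mapping that pins the reference limits to
            80 and 120 on the convenience scale (assumed, not the authors' exact scaling)."""
            centre = 0.5 * (ref_low + ref_high)
            sd_o = centre * cv_o_percent / 100.0          # CV(o) expressed as an absolute SD
            z = (result - centre) / sd_o                  # "difference / biological variation"
            qq = 100.0 + 20.0 * (result - centre) / (ref_high - centre)
            return z, qq

        # Example: fasting glucose 6.8 mmol/L, reference interval 3.9-5.6 mmol/L, CV(o) ~ 5%.
        z, qq = quantity_quotient(6.8, 3.9, 5.6, 5.0)
        print(f"standardized difference = {z:.1f}, convenience-scale value = {qq:.0f}")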

  2. Confirmation and Identification of Listeria monocytogenes, Listeria spp. and Other Gram-Positive Organisms by the Bruker MALDI Biotyper Method: Collaborative Study, First Action 2017.10.

    PubMed

    Bastin, Benjamin; Bird, Patrick; Crowley, Erin; Benzinger, M Joseph; Agin, James; Goins, David; Sohier, Daniele; Timke, Markus; Awad, Marian; Kostrzewa, Markus

    2018-04-27

    The Bruker MALDI Biotyper® method utilizes matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) MS for the rapid and accurate confirmation and identification of Gram-positive bacteria from select media types. This alternative method was evaluated using nonselective and selective agar plates to identify and confirm Listeria monocytogenes, Listeria species, and select Gram-positive bacteria. Results obtained by the Bruker MALDI Biotyper were compared with the traditional biochemical methods as prescribed in the appropriate reference method standards. Sixteen collaborators from 16 different laboratories located within the European Union participated in the collaborative study. A total of 36 blind-coded isolates were evaluated by each collaborator. In each set of 36 organisms, there were 16 L. monocytogenes strains, 12 non-monocytogenes Listeria species strains, and 8 additional Gram-positive exclusivity strains. After testing was completed, the percentage of correct identifications and confirmations across agar types was 99.9% at the genus level and 98.8% at the species level. The results indicated that the alternative method produced results equivalent to the confirmatory procedures specified by each reference method.

  3. Theoretical foundation, methods, and criteria for calibrating human vibration models using frequency response functions

    PubMed Central

    Dong, Ren G.; Welcome, Daniel E.; McDowell, Thomas W.; Wu, John Z.

    2015-01-01

    While simulations of the measured biodynamic responses of the whole human body or body segments to vibration are conventionally interpreted as summaries of biodynamic measurements, and the resulting models are considered quantitative, this study looked at these simulations from a different angle: model calibration. The specific aims of this study are to review and clarify the theoretical basis for model calibration, to help formulate the criteria for calibration validation, and to help appropriately select and apply calibration methods. In addition to established vibration theory, a novel theorem of mechanical vibration is also used to enhance the understanding of the mathematical and physical principles of the calibration. Based on this enhanced understanding, a set of criteria was proposed and used to systematically examine the calibration methods. Besides theoretical analyses, a numerical testing method is also used in the examination. This study identified the basic requirements for each calibration method to obtain a unique calibration solution. This study also confirmed that the solution becomes more robust if more than sufficient calibration references are provided. Practically, however, as more references are used, more inconsistencies can arise among the measured data for representing the biodynamic properties. To help account for the relative reliabilities of the references, a baseline weighting scheme is proposed. The analyses suggest that the best choice of calibration method depends on the modeling purpose, the model structure, and the availability and reliability of representative reference data. PMID:26740726

  4. Photoacoustic spectroscopy and thermal relaxation method to evaluate corn moisture content

    NASA Astrophysics Data System (ADS)

    Pedrochi, F.; Medina, A. N.; Bento, A. C.; Baesso, M. L.; Luz, M. L. S.; Dalpasquale, V. A.

    2005-06-01

    In this study, samples of popcorn with different degrees of moisture were analyzed. The optical absorption bands in the mid-infrared were measured using photoacoustic spectroscopy and were correlated to the sample moisture. The results were in agreement with moisture data determined by the well-known Karl Fischer reference method. In addition, the thermal relaxation method was used to determine the sample specific heat as a function of the moisture content. These results were also in agreement with the two aforementioned methods.

  5. Determination of selected elements in whole coal and in coal ash from the eight argonne premium coal samples by atomic absorption spectrometry, atomic emission spectrometry, and ion-selective electrode

    USGS Publications Warehouse

    Doughten, M.W.; Gillison, J.R.

    1990-01-01

    Methods for the determination of 24 elements in whole coal and coal ash by inductively coupled argon plasma-atomic emission spectrometry, flame, graphite furnace, and cold vapor atomic absorption spectrometry, and by ion-selective electrode are described. Coal ashes were analyzed in triplicate to determine the precision of the methods. Results of the analyses of NBS Standard Reference Materials 1633, 1633a, 1632a, and 1635 are reported. Accuracy of the methods is determined by comparison of the analysis of standard reference materials to their certified values as well as other values in the literature.

  6. First proficiency testing to evaluate the ability of European Union National Reference Laboratories to detect staphylococcal enterotoxins in milk products.

    PubMed

    Hennekinne, Jacques-Antoine; Gohier, Martine; Maire, Tiphaine; Lapeyre, Christiane; Lombard, Bertrand; Dragacci, Sylviane

    2003-01-01

    The European Commission has designed a network of European Union National Reference Laboratories (EU-NRLs), coordinated by a Community Reference Laboratory (CRL), for control of the hygiene of milk and milk products (Council Directive 92/46/EEC). As common contaminants of milk and milk products such as cheese, staphylococcal enterotoxins are often involved in human outbreaks and should be monitored regularly. The main tasks of the EU-CRL were to select and transfer to the EU-NRLs a reference method for the detection of enterotoxins, and to set up proficiency testing to evaluate the competency of the European laboratory network. The first interlaboratory exercise was performed on samples of freeze-dried cheese inoculated with two levels of staphylococcal enterotoxins (0.1 and 0.25 ng/g) and on an uninoculated control. These levels were chosen considering the EU regulation for staphylococcal enterotoxins in milk and milk products and the limit of detection of the enzyme-linked immunosorbent assay test recommended in the reference method. The trial was conducted according to the recommendations of ISO Guide 43. Results produced by the laboratories were compiled and compared through statistical analysis. Except for data from 2 laboratories for the uninoculated control and the cheese inoculated at 0.1 ng/g, all laboratories produced satisfactory results, showing the ability of the EU-NRL network to monitor the enterotoxin contaminant.

  7. The deconvolution of complex spectra by artificial immune system

    NASA Astrophysics Data System (ADS)

    Galiakhmetova, D. I.; Sibgatullin, M. E.; Galimullin, D. Z.; Kamalova, D. I.

    2017-11-01

    An application of the artificial immune system method for the decomposition of complex spectra is presented. The results of decomposing a model contour consisting of three Gaussian components are demonstrated. The artificial immune system is an optimization method inspired by the behaviour of the biological immune system and belongs to the family of modern heuristic search and optimization techniques.
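    To make the idea concrete, here is a minimal clonal-selection-style sketch that decomposes a model contour into three Gaussian components by maximizing affinity (the negative squared residual); the population size, mutation schedule and test contour are arbitrary illustrative choices, not the paper's settings.

        import numpy as np

        def three_gaussians(p, x):
            # p holds (amplitude, centre, width) for each of the three components.
            y = np.zeros_like(x)
            for a, mu, s in p.reshape(3, 3):
                y += a * np.exp(-0.5 * ((x - mu) / s) ** 2)
            return y

        def immune_decompose(x, target, n_pop=40, n_gen=400, seed=1):
            """Clonal-selection-style search: clone good antibodies, mutate the
            clones (better antibodies mutate less), keep the fittest, and inject
            a few fresh random antibodies each generation."""
            rng = np.random.default_rng(seed)
            lo = np.tile([0.05 * target.max(), x.min(), 0.01 * np.ptp(x)], 3)
            hi = np.tile([1.5 * target.max(), x.max(), 0.5 * np.ptp(x)], 3)
            pop = rng.uniform(lo, hi, size=(n_pop, 9))
            affinity = lambda p: -np.sum((three_gaussians(p, x) - target) ** 2)
            n_best = n_pop // 2
            for _ in range(n_gen):
                pop = pop[np.argsort([affinity(p) for p in pop])[::-1]]   # best first
                clones = []
                for rank in range(n_best):
                    n_clones = n_best - rank                 # more clones for better antibodies
                    step = (0.01 + 0.10 * rank / n_best) * (hi - lo)
                    c = pop[rank] + rng.normal(size=(n_clones, 9)) * step
                    clones.append(np.clip(c, lo, hi))
                cand = np.vstack([pop] + clones)
                keep = cand[np.argsort([affinity(p) for p in cand])[::-1][: n_pop - 4]]
                pop = np.vstack([keep, rng.uniform(lo, hi, size=(4, 9))])
            return max(pop, key=affinity)

        # Model contour: three overlapping Gaussian components plus a little noise.
        x = np.linspace(0.0, 100.0, 400)
        true_p = np.array([1.0, 30.0, 6.0, 0.8, 50.0, 9.0, 0.6, 72.0, 5.0])
        target = three_gaussians(true_p, x) + 0.01 * np.random.default_rng(0).normal(size=x.size)
        print(immune_decompose(x, target).reshape(3, 3))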

  8. The evaluation of evaporation by infrared thermography: A critical analysis of the measurements on the Crau test site. [France

    NASA Technical Reports Server (NTRS)

    Seguin, B.; Petit, V.; Devillard, R.; Reich, P.; Thouy, G. (Principal Investigator)

    1980-01-01

    Evapotranspiration was calculated for both the dry and the irrigated zone by four methods, which were compared with the energy balance method serving as a reference. Two methods did not involve the surface temperature: ET(Rn) = Rn, liable to be valid under wet conditions, and ET(eq) = (Δ/(Δ + γ))·Rn, i.e., the first term of Penman's equation, adapted to moderately dry conditions. The methods using surface temperature were the combined energy balance aerodynamic approach and a simplified approach proposed by Jackson et al. Tests show the surface temperature methods give relatively satisfactory results both in the dry and the wet zone, with a precision of 10% to 15% compared with the reference method. As was to be expected, ET(eq) gave satisfactory results only in the dry zone and ET(Rn) only in the irrigated zone. Thermography increased the precision of the ET estimate relative to the most suitable classical method by 5% to 8% and is equally suitable for dry and wet conditions. The Jackson method does not require extensive ground measurements or the evaluation of the surface roughness.
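    For reference, the two estimates that do not use surface temperature can be written compactly as below (Δ is the slope of the saturation vapour pressure curve, γ the psychrometric constant and Rn the net radiation); this simply restates the relations quoted in the abstract:

        ET_{R_n} = R_n, \qquad ET_{eq} = \frac{\Delta}{\Delta + \gamma}\, R_n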

  9. A novel method for rapid determination of total solid content in viscous liquids by multiple headspace extraction gas chromatography.

    PubMed

    Xin, Li-Ping; Chai, Xin-Sheng; Hu, Hui-Chao; Barnes, Donald G

    2014-09-05

    This work demonstrates a novel method for the rapid determination of total solid content in viscous liquid (polymer-enriched) samples. The method is based on multiple headspace extraction gas chromatography (MHE-GC) performed on a headspace vial at a temperature above the boiling point of water, so that the water loss from the tested liquid due to evaporation can be followed. With a limited number of MHE-GC extractions (e.g., five) and a one-point calibration procedure (i.e., recording the weight difference before and after analysis), the total amount of water in the sample can be determined, from which the total solid content in the liquid can be calculated. A number of black liquors were analyzed by the new method, which yielded results that closely matched those of the reference method; i.e., the results of the two methods differed by no more than 2.3%. Compared with the reference method, the MHE-GC method is much simpler and more practical, and is therefore suitable for the rapid determination of the solid content in many polymer-containing liquid samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hess, E.; Hamilton, D.

    The purpose of this ITER is to chronicle the development of the ROST (trademark), its capabilities, associated equipment, and accessories. The report concludes with an evaluation of how closely the results obtained using the technology compare to the results obtained using the reference methods.

  11. Ray Effect Mitigation Through Reference Frame Rotation

    DOE PAGES

    Tencer, John

    2016-05-01

    The discrete ordinates method is a popular and versatile technique for solving the radiative transport equation, a major drawback of which is the presence of ray effects. Mitigation of ray effects can yield significantly more accurate results and enhanced numerical stability for combined-mode codes. Moreover, when ray effects are present, the solution is highly dependent upon the relative orientation of the geometry and the global reference frame, which is an undesirable property. A novel ray effect mitigation technique of averaging the computed solution over various reference frame orientations is proposed.

  12. System for monitoring an industrial or biological process

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W.; Vilim, Rick B.; White, Andrew M.

    1998-01-01

    A method and apparatus for monitoring and responding to conditions of an industrial process. Industrial process signals, such as repetitive manufacturing, testing and operational machine signals, are generated by a system. Sensor signals characteristic of the process are generated over a time length and compared to reference signals over the time length. The industrial signals are adjusted over the time length relative to the reference signals, the phase shift of the industrial signals is optimized to the reference signals and the resulting signals output for analysis by systems such as SPRT.

  13. System for monitoring an industrial or biological process

    DOEpatents

    Gross, K.C.; Wegerich, S.W.; Vilim, R.B.; White, A.M.

    1998-06-30

    A method and apparatus are disclosed for monitoring and responding to conditions of an industrial process. Industrial process signals, such as repetitive manufacturing, testing and operational machine signals, are generated by a system. Sensor signals characteristic of the process are generated over a time length and compared to reference signals over the time length. The industrial signals are adjusted over the time length relative to the reference signals, the phase shift of the industrial signals is optimized to the reference signals and the resulting signals output for analysis by systems such as SPRT. 49 figs.

  14. Genetic Adaptive Control for PZT Actuators

    NASA Technical Reports Server (NTRS)

    Kim, Jeongwook; Stover, Shelley K.; Madisetti, Vijay K.

    1995-01-01

    A piezoelectric transducer (PZT) is capable of providing linear motion if controlled correctly and could provide a replacement for traditional heavy and large servo systems using motors. This paper focuses on a genetic model reference adaptive control technique (GMRAC) for a PZT that moves a mirror, where the goal is to keep the mirror velocity constant. Genetic Algorithms (GAs) are an integral part of the GMRAC technique, acting as the search engine for an optimal PID controller. Two methods are suggested to control the actuator in this research: the first changes the PID parameters, and the other adds an additional reference input to the system. The simulation results of these two methods are compared. Simulated Annealing (SA) is also used to solve the problem, and the simulation results of the GA and SA approaches are compared; the GA gives the best results. The entire model is designed using the MathWorks' Simulink tool.
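    A minimal sketch of the GA-searches-for-PID-gains idea on a generic, lightly damped second-order plant tracking a constant reference; the plant, cost function and GA settings are stand-ins, not the PZT/mirror model from the paper.

        import numpy as np

        def track_cost(gains, n_steps=400, dt=1e-3, r=1.0):
            """Integrated squared tracking error of a PID loop around a generic,
            lightly damped second-order plant (forward-Euler simulation)."""
            kp, ki, kd = gains
            wn, zeta = 40.0, 0.2                      # stand-in plant parameters
            y = ydot = integ = prev_err = 0.0
            cost = 0.0
            for _ in range(n_steps):
                err = r - y
                integ += err * dt
                deriv = (err - prev_err) / dt
                prev_err = err
                u = kp * err + ki * integ + kd * deriv
                yddot = wn**2 * (u - y) - 2.0 * zeta * wn * ydot
                ydot += yddot * dt
                y += ydot * dt
                if not np.isfinite(y) or abs(y) > 1e6:
                    return 1e9                        # penalize unstable gain sets
                cost += err**2 * dt
            return cost

        def ga_tune_pid(pop_size=30, n_gen=60, seed=0):
            rng = np.random.default_rng(seed)
            lo, hi = np.array([0.0, 0.0, 0.0]), np.array([5.0, 50.0, 0.05])
            pop = rng.uniform(lo, hi, size=(pop_size, 3))
            for _ in range(n_gen):
                cost = np.array([track_cost(g) for g in pop])
                elite = pop[np.argsort(cost)[: pop_size // 2]]          # selection
                pairs = elite[rng.integers(0, len(elite), size=(pop_size - len(elite), 2))]
                w = rng.uniform(size=(len(pairs), 1))
                children = w * pairs[:, 0] + (1.0 - w) * pairs[:, 1]    # blend crossover
                children += rng.normal(0.0, 0.02, size=children.shape) * (hi - lo)  # mutation
                pop = np.vstack([elite, np.clip(children, lo, hi)])
            return pop[np.argmin([track_cost(g) for g in pop])]

        print("tuned (Kp, Ki, Kd):", ga_tune_pid())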

  15. Spatial-Heterodyne Interferometry for Reflection and Transmission (SHIRT) Measurements

    DOEpatents

    Hanson, Gregory R [Clinton, TN; Bingham, Philip R [Knoxville, TN; Tobin, Ken W [Harriman, TN

    2006-02-14

    Systems and methods are described for spatial-heterodyne interferometry for reflection and transmission (SHIRT) measurements. A method includes digitally recording a first spatially-heterodyned hologram using a first reference beam and a first object beam; digitally recording a second spatially-heterodyned hologram using a second reference beam and a second object beam; Fourier analyzing the digitally recorded first spatially-heterodyned hologram to define a first analyzed image; Fourier analyzing the digitally recorded second spatially-heterodyned hologram to define a second analyzed image; digitally filtering the first analyzed image to define a first result; and digitally filtering the second analyzed image to define a second result; performing a first inverse Fourier transform on the first result, and performing a second inverse Fourier transform on the second result. The first object beam is transmitted through an object that is at least partially translucent, and the second object beam is reflected from the object.

  16. Spatial-heterodyne interferometry for transmission (SHIFT) measurements

    DOEpatents

    Bingham, Philip R.; Hanson, Gregory R.; Tobin, Ken W.

    2006-10-10

    Systems and methods are described for spatial-heterodyne interferometry for transmission (SHIFT) measurements. A method includes digitally recording a spatially-heterodyned hologram including spatial heterodyne fringes for Fourier analysis using a reference beam, and an object beam that is transmitted through an object that is at least partially translucent; Fourier analyzing the digitally recorded spatially-heterodyned hologram, by shifting an original origin of the digitally recorded spatially-heterodyned hologram to sit on top of a spatial-heterodyne carrier frequency defined by an angle between the reference beam and the object beam, to define an analyzed image; digitally filtering the analyzed image to cut off signals around the original origin to define a result; and performing an inverse Fourier transform on the result.
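    The Fourier steps described in the claim can be sketched with plain FFTs: transform the recorded hologram, shift the spectrum so the chosen spatial-heterodyne sideband sits at the origin, low-pass filter around the new origin, and inverse transform to recover the complex object wave. The carrier frequency, filter radius and synthetic hologram below are assumptions for illustration.

        import numpy as np

        def extract_sideband(holo, sideband_cycles, radius_frac=0.1):
            """Select one spatial-heterodyne sideband of a recorded hologram: FFT,
            shift that sideband onto the origin, low-pass filter around the new
            origin, and inverse FFT to obtain the complex wave."""
            ny, nx = holo.shape
            spec = np.fft.fftshift(np.fft.fft2(holo))
            shift = (-sideband_cycles[0], -sideband_cycles[1])   # move the sideband to the centre
            spec = np.roll(spec, shift, axis=(0, 1))
            fy = np.arange(ny) - ny // 2
            fx = np.arange(nx) - nx // 2
            FY, FX = np.meshgrid(fy, fx, indexing="ij")
            mask = (FY ** 2 + FX ** 2) <= (radius_frac * min(ny, nx)) ** 2
            return np.fft.ifft2(np.fft.ifftshift(spec * mask))

        # Synthetic example: object phase encoded on a tilted (carrier) reference beam.
        ny = nx = 256
        yy, xx = np.mgrid[0:ny, 0:nx]
        phase = 2.0 * np.exp(-((xx - 128) ** 2 + (yy - 128) ** 2) / (2 * 40.0 ** 2))
        cy, cx = 20, 32                                          # carrier, cycles per field of view
        holo = np.abs(np.exp(1j * phase)
                      + np.exp(2j * np.pi * (cy * yy / ny + cx * xx / nx))) ** 2
        rec = extract_sideband(holo, (-cy, -cx))                 # sideband holding obj * conj(ref)
        # np.angle(rec) approximates the object phase up to a constant offset.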

  17. Adaptive nonlinear control for autonomous ground vehicles

    NASA Astrophysics Data System (ADS)

    Black, William S.

    We present the background and motivation for ground vehicle autonomy, with a focus on uses for space exploration. Using a simple design example of an autonomous ground vehicle, we derive the equations of motion. After providing the mathematical background for nonlinear systems and control, we present two common methods for exactly linearizing nonlinear systems: feedback linearization and backstepping. We use these in combination with three adaptive control methods: model reference adaptive control, adaptive sliding mode control, and extremum-seeking model reference adaptive control. We show the performance of each combination through several simulation results. We then consider disturbances in the system and design nonlinear disturbance observers for both single-input-single-output and multi-input-multi-output systems. Finally, we show the performance of these observers with simulation results.

  18. [Geographical distribution of the Serum creatinine reference values of healthy adults].

    PubMed

    Wei, De-Zhi; Ge, Miao; Wang, Cong-Xia; Lin, Qian-Yi; Li, Meng-Jiao; Li, Peng

    2016-11-20

    To explore the relationship between serum creatinine (Scr) reference values in healthy adults and geographic factors, and to provide evidence for establishing Scr reference values in different regions, we collected 29 697 Scr reference values from healthy adults measured by 347 medical facilities in 23 provinces, 4 municipalities and 5 autonomous regions. We chose 23 geographical factors and analyzed their correlation with Scr reference values to identify the factors correlated significantly with them. Using principal component analysis and ridge regression analysis, two predictive models were constructed, and the optimal model was chosen after comparing the two models' goodness of fit between predicted and measured results. The distribution map of Scr reference values was drawn using the Kriging interpolation method. Seven geographic factors, including latitude, annual sunshine duration, annual average temperature, annual average relative humidity, annual precipitation, annual temperature range and topsoil (silt) cation exchange capacity, were found to correlate significantly with Scr reference values. The overall distribution of Scr reference values was high in the south and low in the north, varying consistently with latitude. The geographic factor data for a given region thus allow prediction of the Scr values in healthy adults, and analysis of these geographical factors can facilitate the determination of region-specific reference values to improve the accuracy of clinical diagnoses.

  19. Learning to Rank the Severity of Unrepaired Cleft Lip Nasal Deformity on 3D Mesh Data.

    PubMed

    Wu, Jia; Tse, Raymond; Shapiro, Linda G

    2014-08-01

    Cleft lip is a birth defect that results in deformity of the upper lip and nose. Its severity is widely variable and the results of treatment are influenced by the initial deformity. Objective assessment of severity would help to guide prognosis and treatment. However, most assessments are subjective. The purpose of this study is to develop and test quantitative computer-based methods of measuring cleft lip severity. In this paper, a grid-patch based measurement of symmetry is introduced, with which a computer program learns to rank the severity of cleft lip on 3D meshes of human infant faces. Three computer-based methods to define the midfacial reference plane were compared to two manual methods. Four different symmetry features were calculated based upon these reference planes, and evaluated. The result shows that the rankings predicted by the proposed features were highly correlated with the ranking orders provided by experts that were used as the ground truth.

  20. Incorporating geographical factors with artificial neural networks to predict reference values of erythrocyte sedimentation rate.

    PubMed

    Yang, Qingsheng; Mwenda, Kevin M; Ge, Miao

    2013-03-12

    The measurement of the Erythrocyte Sedimentation Rate (ESR) value is a standard procedure performed during a typical blood test. In order to formulate a unified standard of establishing reference ESR values, this paper presents a novel prediction model in which local normal ESR values and corresponding geographical factors are used to predict reference ESR values using multi-layer feed-forward artificial neural networks (ANN). Local normal ESR values were obtained from hospital data, while geographical factors that include altitude, sunshine hours, relative humidity, temperature and precipitation were obtained from the National Geographical Data Information Centre in China. The results show that predicted values are statistically in agreement with measured values. Model results exhibit significant agreement between training data and test data. Consequently, the model is used to predict the unseen local reference ESR values. Reference ESR values can be established with geographical factors by using artificial intelligence techniques. ANN is an effective method for simulating and predicting reference ESR values because of its ability to model nonlinear and complex relationships.
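    A minimal sketch of the modelling step, assuming scikit-learn is available; the geographic feature values and ESR targets below are synthetic placeholders, not the hospital or geodata used in the paper.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        n = 300
        # Columns: altitude (m), sunshine hours, relative humidity (%), temperature (C), precipitation (mm).
        X = np.column_stack([
            rng.uniform(0, 4500, n),
            rng.uniform(1200, 3200, n),
            rng.uniform(30, 90, n),
            rng.uniform(-5, 25, n),
            rng.uniform(50, 2000, n),
        ])
        # Synthetic "reference ESR" with an arbitrary dependence on the factors, plus noise.
        y = 12 + 0.002 * X[:, 0] + 0.05 * (X[:, 2] - 60) - 0.1 * X[:, 3] + rng.normal(0, 1.5, n)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        model = make_pipeline(
            StandardScaler(),
            MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0),
        )
        model.fit(X_tr, y_tr)
        print("R^2 on held-out data:", round(model.score(X_te, y_te), 2))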

  1. Pulmonary lobar volumetry using novel volumetric computer-aided diagnosis and computed tomography

    PubMed Central

    Iwano, Shingo; Kitano, Mariko; Matsuo, Keiji; Kawakami, Kenichi; Koike, Wataru; Kishimoto, Mariko; Inoue, Tsutomu; Li, Yuanzhong; Naganawa, Shinji

    2013-01-01

    OBJECTIVES To compare the accuracy of pulmonary lobar volumetry using the conventional number of segments method and novel volumetric computer-aided diagnosis using 3D computed tomography images. METHODS We acquired 50 consecutive preoperative 3D computed tomography examinations for lung tumours reconstructed at 1-mm slice thickness. We calculated the lobar volume and the emphysematous lobar volume < −950 HU of each lobe using (i) the slice-by-slice method (reference standard), (ii) the number of segments method, and (iii) semi-automatic and (iv) automatic computer-aided diagnosis. We determined Pearson correlation coefficients between the reference standard and the three other methods for lobar volumes and emphysematous lobar volumes. We also compared the relative errors among the three measurement methods. RESULTS Both semi-automatic and automatic computer-aided diagnosis results were more strongly correlated with the reference standard than the number of segments method. The correlation coefficients for automatic computer-aided diagnosis were slightly lower than those for semi-automatic computer-aided diagnosis because there was one outlier among 50 cases (2%) in the right upper lobe and two outliers among 50 cases (4%) in the other lobes. The relative error of the number of segments method was significantly greater than those of semi-automatic and automatic computer-aided diagnosis (P < 0.001). The computational time for automatic computer-aided diagnosis was one-half to two-thirds that of semi-automatic computer-aided diagnosis. CONCLUSIONS A novel lobar volumetry computer-aided diagnosis system could measure lobar volumes more precisely than the conventional number of segments method. Because semi-automatic computer-aided diagnosis and automatic computer-aided diagnosis were complementary, in clinical use it would be more practical to first measure volumes by automatic computer-aided diagnosis, and then use semi-automatic measurements if automatic computer-aided diagnosis failed. PMID:23526418

  2. Species identification of Aspergillus, Fusarium and Mucorales with direct surface analysis by matrix-assisted laser desorption ionization time-of-flight mass spectrometry.

    PubMed

    De Carolis, E; Posteraro, B; Lass-Flörl, C; Vella, A; Florio, A R; Torelli, R; Girmenia, C; Colozza, C; Tortorano, A M; Sanguinetti, M; Fadda, G

    2012-05-01

    Accurate species discrimination of filamentous fungi is essential, because some species have specific antifungal susceptibility patterns, and misidentification may result in inappropriate therapy. We evaluated matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF MS) for species identification through direct surface analysis of the fungal culture. By use of culture collection strains representing 55 species of Aspergillus, Fusarium and Mucorales, a reference database was established for MALDI-TOF MS-based species identification according to the manufacturer's recommendations for microflex measurements and MALDI BioTyper 2.0 software. The profiles of young and mature colonies were analysed for each of the reference strains, and species-specific spectral fingerprints were obtained. To evaluate the database, 103 blind-coded fungal isolates collected in the routine clinical microbiology laboratory were tested. As a reference method for species designation, multilocus sequencing was used. Eighty-five isolates were unequivocally identified to the species level (≥99% sequence similarity); 18 isolates producing ambiguous results at this threshold were initially rated as identified to the genus level only. Further molecular analysis definitively assigned these isolates to the species Aspergillus oryzae (17 isolates) and Aspergillus flavus (one isolate), concordant with the MALDI-TOF MS results. Excluding nine isolates that belong to the fungal species not included in our reference database, 91 (96.8%) of 94 isolates were identified by MALDI-TOF MS to the species level, in agreement with the results of the reference method; three isolates were identified to the genus level. In conclusion, MALDI-TOF MS is suitable for the routine identification of filamentous fungi in a medical microbiology laboratory. © 2011 The Authors. Clinical Microbiology and Infection © 2011 European Society of Clinical Microbiology and Infectious Diseases.

  3. Quantitative method for gait pattern detection based on fiber Bragg grating sensors

    NASA Astrophysics Data System (ADS)

    Ding, Lei; Tong, Xinglin; Yu, Lie

    2017-03-01

    This paper presents a method that uses fiber Bragg grating (FBG) sensors to distinguish the temporal gait patterns in gait cycles. Unlike most conventional methods that rely on electronic sensors to collect physical quantities (i.e., strains, forces, pressure, displacements, velocity, and accelerations), the proposed method utilizes the backreflected peak wavelength from FBG sensors to describe the motion characteristics of human walking. Specifically, the FBG sensors are sensitive to external strain, so their backreflected peak wavelength shifts according to the magnitude of the applied strain. Therefore, when subjects walk with different gait patterns, the strains on the FBG sensors differ and the backreflected peak wavelength varies accordingly. To test the reliability of the FBG sensor platform for gait pattern detection, the gold standard method using force-sensitive resistors (FSRs) for defining gait patterns is introduced as a reference platform. The reliability of the FBG sensor platform is determined by comparing the detection results between the FBG and FSR platforms. The experimental results show that the FBG sensor platform is reliable for gait pattern detection and achieves high reliability when compared with the reference platform.
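    The strain sensitivity that such a platform relies on is the usual Bragg-shift relation, Δλ_B/λ_B ≈ (1 − p_e)·ε when temperature is constant; the tiny sketch below converts a measured wavelength shift into strain, taking a typical effective photo-elastic coefficient of about 0.22 as an assumption.

        def strain_from_bragg_shift(delta_lambda_nm, lambda_b_nm=1550.0, p_e=0.22):
            """Axial strain from a Bragg wavelength shift at constant temperature:
            delta_lambda / lambda_B ~= (1 - p_e) * strain."""
            return delta_lambda_nm / (lambda_b_nm * (1.0 - p_e))

        # Example: a 0.6 nm shift at 1550 nm corresponds to roughly 500 microstrain.
        print(f"{strain_from_bragg_shift(0.6) * 1e6:.0f} microstrain")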

  4. Comparative study of methodologies for pulse wave velocity estimation.

    PubMed

    Salvi, P; Magnani, E; Valbusa, F; Agnoletti, D; Alecu, C; Joly, L; Benetos, A

    2008-10-01

    Arterial stiffness, estimated by pulse wave velocity (PWV), is an independent predictor of cardiovascular mortality and morbidity. However, the clinical applicability of these measurements and the elaboration of reference PWV values are difficult due to differences between the various devices used. In a population of 50 subjects aged 20-84 years, we compared PWV measurements with three frequently used devices: the Complior and the PulsePen, both of which determine aortic PWV as the delay between the carotid and femoral pressure waves, and the PulseTrace, which estimates the Stiffness Index (SI) by analyzing photoplethysmographic waves acquired on the fingertip. PWV was measured twice by each device. The coefficient of variation of PWV was 12.3, 12.4 and 14.5% for PulsePen, Complior and PulseTrace, respectively. These measurements were compared with the reference method, that is, a simultaneous acquisition of pressure waves using two tonometers. High correlation coefficients with the reference method were observed for PulsePen (r = 0.99) and Complior (r = 0.83), whereas for PulseTrace the correlation with the reference method was much lower (r = 0.55). On Bland-Altman analysis, mean differences +/- 2 s.d. versus the reference method were -0.15 +/- 0.62 m/s, 2.09 +/- 2.68 m/s and -1.12 +/- 4.92 m/s for PulsePen, Complior and PulseTrace, respectively. This study confirms the reliability of the Complior and PulsePen devices in estimating PWV, while the SI determined by the PulseTrace device was found to be inappropriate as a surrogate of PWV. The present results indicate the urgent need for evaluation and comparison of the different devices to standardize PWV measurements and establish reference values.

  5. Reference Intervals of Alpha-Fetoprotein and Carcinoembryonic Antigen in the Apparently Healthy Population

    PubMed Central

    Zhang, Gao-Ming; Guo, Xu-Xiao; Ma, Xiao-Bo; Zhang, Guo-Ming

    2016-01-01

    Background The aim of this study was to calculate 95% reference intervals and double-sided limits of serum alpha-fetoprotein (AFP) and carcinoembryonic antigen (CEA) according to the CLSI EP28-A3 guideline. Material/Methods Serum AFP and CEA values were measured in samples from 26 000 healthy subjects in the Shuyang area receiving general health checkups. The 95% reference intervals and upper limits were calculated by using MedCalc. Results We provided continuous reference intervals from 20 years old to 90 years old for AFP and CEA. The reference intervals were: AFP, 1.31–7.89 ng/ml (males) and 1.01–7.10 ng/ml (females); CEA, 0.51–4.86 ng/ml (males) and 0.35–3.45 ng/ml (females). AFP and CEA were significantly positively correlated with age in both males (r=0.196 and r=0.198) and females (r=0.121 and r=0.197). Conclusions Different races or populations and different detection systems may result in different reference intervals for AFP and CEA. Continuous, age-dependent reference intervals are more accurate than intervals based on age groups. PMID:27941709

  6. Evaluation of the platelet counting by Abbott CELL-DYN SAPPHIRE haematology analyser compared with flow cytometry.

    PubMed

    Grimaldi, E; Del Vecchio, L; Scopacasa, F; Lo Pardo, C; Capone, F; Pariante, S; Scalia, G; De Caterina, M

    2009-04-01

    The Abbott CELL-DYN Sapphire is a new-generation haematology analyser. The system uses optical/fluorescence flow cytometry in combination with electronic impedance to produce a full blood count. Optical and impedance are the default methods for platelet counting, while automated CD61-immunoplatelet analysis can be run as a selectable test. The aim of this study was to determine the platelet count performance of the three counting methods available on the instrument and to compare the results with those provided by the Becton Dickinson FACSCalibur flow cytometer used as the reference method. A lipid interference experiment was also performed. Linearity, carryover and precision were good, and satisfactory agreement with the reference method was found for the impedance, optical and CD61-immunoplatelet analyses, although the latter provided the closest results in comparison with flow cytometry. In the lipid interference experiment, moderate inaccuracy of the optical and immunoplatelet counts was observed starting from very high lipid values.

  7. What Analytic Method Should Clinicians Use to Derive Spine T-scores and Predict Incident Fractures in Men? Results from the MrOS study

    PubMed Central

    Hansen, Karen E; Blank, Robert D; Palermo, Lisa; Fink, Howard A; Orwoll, Eric S

    2014-01-01

    Summary In this study, the area under the curve was highest when using the lowest vertebral body T-score to diagnose osteoporosis. In men for whom hip imaging is not possible, the lowest vertebral body T-score improves the ability to identify those likely to have an incident fragility fracture. Purpose Spine T-scores have limited ability to predict fragility fracture. We hypothesized that using the lowest vertebral body T-score to diagnose osteoporosis would better predict fracture. Methods Among men enrolled in the Osteoporotic Fractures in Men Study, we identified cases with incident clinical fracture (n=484) and controls without fracture (n=1,516). We analyzed the lumbar spine BMD in cases and controls (n=2,000) to record the L1-L4 (referent), the lowest vertebral body, and ISCD-determined T-scores using a male normative database, and the L1-L4 T-score using a female normative database. We compared the ability of each method to diagnose osteoporosis and therefore predict incident clinical fragility fracture, using the area under the receiver operating characteristic curve (AUC) and the net reclassification index (NRI) as measures of diagnostic accuracy. ISCD-determined T-scores could be determined in only 60% of participants (n=1,205). Results Among the 1,205 men, the AUC for predicting incident clinical fracture was 0.546 for the L1-L4 male, 0.542 for the L1-L4 female, 0.585 for the lowest vertebral body, and 0.559 for the ISCD-determined T-score. The lowest vertebral body AUC was the only one significantly different from the referent method (p=0.002). Likewise, a diagnosis of osteoporosis based on the lowest vertebral body T-score demonstrated a significantly better NRI than the referent method (net NRI +0.077, p=0.005). By contrast, the net NRI for the other methods of analysis did not differ from the referent method. Conclusion Our study suggests that in men, the lowest vertebral body T-score is an acceptable method by which to estimate fracture risk. PMID:24850381

  8. Robust fluoroscopic respiratory gating for lung cancer radiotherapy without implanted fiducial markers

    NASA Astrophysics Data System (ADS)

    Cui, Ying; Dy, Jennifer G.; Sharp, Greg C.; Alexander, Brian; Jiang, Steve B.

    2007-02-01

    For gated lung cancer radiotherapy, it is difficult to generate accurate gating signals due to the large uncertainties when using external surrogates and the risk of pneumothorax when using implanted fiducial markers. We have previously investigated and demonstrated the feasibility of generating gating signals using the correlation scores between the reference template image and the fluoroscopic images acquired during the treatment. In this paper, we present an in-depth study, aiming at the improvement of robustness of the algorithm and its validation using multiple sets of patient data. Three different template generating and matching methods have been developed and evaluated: (1) single template method, (2) multiple template method, and (3) template clustering method. Using the fluoroscopic data acquired during patient setup before each fraction of treatment, reference templates are built that represent the tumour position and shape in the gating window, which is assumed to be at the end-of-exhale phase. For the single template method, all the setup images within the gating window are averaged to generate a composite template. For the multiple template method, each setup image in the gating window is considered as a reference template and used to generate an ensemble of correlation scores. All the scores are then combined to generate the gating signal. For the template clustering method, clustering (grouping of similar objects together) is performed to reduce the large number of reference templates into a few representative ones. Each of these methods has been evaluated against the reference gating signal as manually determined by a radiation oncologist. Five patient datasets were used for evaluation. In each case, gated treatments were simulated at both 35% and 50% duty cycles. False positive, negative and total error rates were computed. Experiments show that the single template method is sensitive to noise; the multiple template and clustering methods are more robust to noise due to the smoothing effect of aggregation of correlation scores; and the clustering method results in the best performance in terms of computational efficiency and accuracy.
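    The single- and multiple-template variants can be sketched with normalized cross-correlation: score each incoming frame (or a fixed region of interest within it) against the reference template(s), aggregate the scores, and gate the beam on when the aggregated score exceeds a threshold. The frame sizes, threshold and averaging of scores below are illustrative assumptions, not the paper's settings.

        import numpy as np

        def ncc(frame, template):
            """Normalized cross-correlation between two equally sized image patches."""
            f = frame - frame.mean()
            t = template - template.mean()
            denom = np.sqrt((f ** 2).sum() * (t ** 2).sum())
            return float((f * t).sum() / denom) if denom > 0 else 0.0

        def gating_signal(frames, templates, threshold=0.9):
            """Beam-on flag per frame: the mean correlation score against all
            reference templates must exceed the threshold."""
            scores = np.array([np.mean([ncc(fr, tp) for tp in templates]) for fr in frames])
            return scores > threshold, scores

        # Toy example: three noisy "setup" templates of a region of interest, then
        # "treatment" frames in which the anatomy gradually shifts out of the gate.
        rng = np.random.default_rng(0)
        base = rng.normal(size=(64, 64))
        templates = [base + 0.05 * rng.normal(size=base.shape) for _ in range(3)]
        frames = [np.roll(base, shift, axis=0) + 0.05 * rng.normal(size=base.shape)
                  for shift in range(0, 20, 2)]
        beam_on, scores = gating_signal(frames, templates)
        print(beam_on)   # only the unshifted frame(s) stay above the threshold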

  9. Investigation of Aerosol Surface Area Estimation from Number and Mass Concentration Measurements: Particle Density Effect.

    PubMed

    Ku, Bon Ki; Evans, Douglas E

    2012-04-01

    For nanoparticles with nonspherical morphologies, e.g., open agglomerates or fibrous particles, it is expected that the actual density of agglomerates may be significantly different from the bulk material density. It is further expected that using the material density may upset the relationship between surface area and mass when a method for estimating aerosol surface area from number and mass concentrations (referred to as "Maynard's estimation method") is used. Therefore, it is necessary to quantitatively investigate how much the Maynard's estimation method depends on particle morphology and density. In this study, aerosol surface area estimated from number and mass concentration measurements was evaluated and compared with values from two reference methods: a method proposed by Lall and Friedlander for agglomerates and a mobility based method for compact nonspherical particles using well-defined polydisperse aerosols with known particle densities. Polydisperse silver aerosol particles were generated by an aerosol generation facility. Generated aerosols had a range of morphologies, count median diameters (CMD) between 25 and 50 nm, and geometric standard deviations (GSD) between 1.5 and 1.8. The surface area estimates from number and mass concentration measurements correlated well with the two reference values when gravimetric mass was used. The aerosol surface area estimates from the Maynard's estimation method were comparable to the reference method for all particle morphologies within the surface area ratios of 3.31 and 0.19 for assumed GSDs 1.5 and 1.8, respectively, when the bulk material density of silver was used. The difference between the Maynard's estimation method and surface area measured by the reference method for fractal-like agglomerates decreased from 79% to 23% when the measured effective particle density was used, while the difference for nearly spherical particles decreased from 30% to 24%. The results indicate that the use of particle density of agglomerates improves the accuracy of the Maynard's estimation method and that an effective density should be taken into account, when known, when estimating aerosol surface area of nonspherical aerosol such as open agglomerates and fibrous particles.

  10. A method for the development of disease-specific reference standards vocabularies from textual biomedical literature resources

    PubMed Central

    Wang, Liqin; Bray, Bruce E.; Shi, Jianlin; Fiol, Guilherme Del; Haug, Peter J.

    2017-01-01

    Objective Disease-specific vocabularies are fundamental to many knowledge-based intelligent systems and applications like text annotation, cohort selection, disease diagnostic modeling, and therapy recommendation. Reference standards are critical in the development and validation of automated methods for disease-specific vocabularies. The goal of the present study is to design and test a generalizable method for the development of vocabulary reference standards from expert-curated, disease-specific biomedical literature resources. Methods We formed disease-specific corpora from literature resources like textbooks, evidence-based synthesized online sources, clinical practice guidelines, and journal articles. Medical experts annotated and adjudicated disease-specific terms in four classes (i.e., causes or risk factors, signs or symptoms, diagnostic tests or results, and treatment). Annotations were mapped to UMLS concepts. We assessed source variation, the contribution of each source to building disease-specific vocabularies, the saturation of the vocabularies with respect to the number of sources used, and the generalizability of the method to different diseases. Results The study resulted in 2588 string-unique annotations for heart failure in four classes, and 193 and 425, respectively, for pulmonary embolism and rheumatoid arthritis in the treatment class. Approximately 80% of the annotations were mapped to UMLS concepts. The agreement among heart failure sources ranged between 0.28 and 0.46. The contribution of these sources to the final vocabulary ranged between 18% and 49%. With the sources explored, the heart failure vocabulary reached near saturation in all four classes with the inclusion of a minimum of six sources (or between four and seven sources if only counting terms that occurred in two or more sources). It took fewer sources to reach near saturation for the other two diseases in terms of the treatment class. Conclusions We developed a method for the development of disease-specific reference vocabularies. Expert-curated biomedical literature resources are substantial sources for acquiring disease-specific medical knowledge. It is feasible to reach near saturation in a disease-specific vocabulary using a relatively small number of literature sources. PMID:26971304

  11. Visually testing the dynamic character of a blazed-angle adjustable grating by digital holographic microscopy.

    PubMed

    Qin, Chuan; Zhao, Jianlin; Di, Jianglei; Wang, Le; Yu, Yiting; Yuan, Weizheng

    2009-02-10

    We employed digital holographic microscopy to visually test microoptoelectromechanical systems (MOEMS). The sample is a blazed-angle adjustable grating. Considering the periodic structure of the sample, a local area unwrapping method based on a binary template was adopted to demodulate the fringes obtained by referring to a reference hologram. A series of holograms at different deformation states due to different drive voltages were captured to analyze the dynamic character of the MOEMS, and the uniformity of different microcantilever beams was also inspected. The results show this testing method is effective for a periodic structure.

  12. Comparative study of minutiae selection algorithms for ISO fingerprint templates

    NASA Astrophysics Data System (ADS)

    Vibert, B.; Charrier, C.; Le Bars, J.-M.; Rosenberger, C.

    2015-03-01

    We address the selection of fingerprint minutiae given a fingerprint ISO template. Minutiae selection plays a very important role when a secure element (i.e., a smart card) is used: because of its limited computation and memory capabilities, the number of minutiae in a stored reference on the secure element is limited. We propose in this paper a comparative study of six minutiae selection methods, including two methods from the literature and one used as a reference (no selection). Experimental results on three fingerprint databases from the Fingerprint Verification Competition show their relative efficiency in terms of performance and computation time.

  13. 40 CFR 75.22 - Reference test methods.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 17 2012-07-01 2012-07-01 false Reference test methods. 75.22 Section...) CONTINUOUS EMISSION MONITORING Operation and Maintenance Requirements § 75.22 Reference test methods. (a) The owner or operator shall use the following methods, which are found in appendices A-1 through A-4 to part...

  14. 7 CFR 801.7 - Reference methods and tolerances for near-infrared spectroscopy (NIRS) analyzers.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Reference methods and tolerances for near-infrared spectroscopy (NIRS) analyzers. 801.7 Section 801.7 Agriculture Regulations of the Department of Agriculture... methods and tolerances for near-infrared spectroscopy (NIRS) analyzers. (a) Reference methods. (1) The...

  15. 78 FR 40000 - Method for the Determination of Lead in Total Suspended Particulate Matter

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-03

    .... Purpose of the New Reference Method B. Rationale for Selection of the New Reference Method C. Comments on.../files/ambient/criteria/reference-equivalent-methods-list.pdf . C. Comments on the Proposed Rule On... information collection requirements beyond those imposed by the existing Pb monitoring requirements. C...

  16. Determination of relative ion chamber calibration coefficients from depth-ionization measurements in clinical electron beams

    NASA Astrophysics Data System (ADS)

    Muir, B. R.; McEwen, M. R.; Rogers, D. W. O.

    2014-10-01

    A method is presented to obtain ion chamber calibration coefficients relative to secondary standard reference chambers in electron beams using depth-ionization measurements. Results are obtained as a function of depth and average electron energy at depth in 4, 8, 12 and 18 MeV electron beams from the NRC Elekta Precise linac. The PTW Roos, Scanditronix NACP-02, PTW Advanced Markus and NE 2571 ion chambers are investigated. The challenges and limitations of the method are discussed. The proposed method produces useful data at shallow depths. At depths past the reference depth, small shifts in positioning or drifts in the incident beam energy affect the results, thereby providing a built-in test of incident electron energy drifts and/or chamber set-up. Polarity corrections for ion chambers as a function of average electron energy at depth agree with literature data. The proposed method produces results consistent with those obtained using the conventional calibration procedure while gaining much more information about the behavior of the ion chamber with similar data acquisition time. Measurement uncertainties in calibration coefficients obtained with this method are estimated to be less than 0.5%. These results open up the possibility of using depth-ionization measurements to yield chamber ratios which may be suitable for primary standards-level dissemination.

  17. Construction of measurement uncertainty profiles for quantitative analysis of genetically modified organisms based on interlaboratory validation data.

    PubMed

    Macarthur, Roy; Feinberg, Max; Bertheau, Yves

    2010-01-01

    A method is presented for estimating the size of uncertainty associated with the measurement of products derived from genetically modified organisms (GMOs). The method is based on the uncertainty profile, which is an extension, for the estimation of uncertainty, of a recent graphical statistical tool called an accuracy profile that was developed for the validation of quantitative analytical methods. The application of uncertainty profiles as an aid to decision making and assessment of fitness for purpose is also presented. Results of the measurement of the quantity of GMOs in flour by PCR-based methods collected through a number of interlaboratory studies followed the log-normal distribution. Uncertainty profiles built using the results generally give an expected range for measurement results of 50-200% of reference concentrations for materials that contain at least 1% GMO. This range is consistent with European Network of GM Laboratories and the European Union (EU) Community Reference Laboratory validation criteria and can be used as a fitness for purpose criterion for measurement methods. The effect on the enforcement of EU labeling regulations is that, in general, an individual analytical result needs to be < 0.45% to demonstrate compliance, and > 1.8% to demonstrate noncompliance with a labeling threshold of 0.9%.

  18. Time evolution of an SLR reference frame

    NASA Astrophysics Data System (ADS)

    Angermann, D.; Gerstl, M.; Kelm, R.; Müller, H.; Seemüller, W.; Vei, M.

    2002-07-01

    On the basis of LAGEOS-1 and LAGEOS-2 data we computed a 10-year (1990-2000) solution for SLR station positions and velocities. The paper describes the data processing with the DGFI software package DOGS. We present results for station coordinates and their time variation for 41 stations of the global SLR network, and discuss the stability and time evolution of the SLR reference frame established in the same way. We applied different methods to assess the quality and consistency of the SLR results. The results presented in this paper include: (1) a time series of weekly estimated station coordinates; (2) a comparison of the 10-year LAGEOS-1 and LAGEOS-2 solutions; (3) a comparison of 2.5-year solutions with the combined 10-year solution to assess the internal stability and the time evolution of the SLR reference frame; (4) a comparison of the SLR reference frame with ITRF97; and (5) a comparison of SLR station velocities with those of ITRF97 and NNR NUVEL-1A.

  19. Reference-free Shack-Hartmann wavefront sensor.

    PubMed

    Zhao, Liping; Guo, Wenjiang; Li, Xiang; Chen, I-Ming

    2011-08-01

    The traditional Shack-Hartmann wavefront sensing (SHWS) system measures the wavefront slope by calculating the centroid shift between the sample and a reference piece, and then the wavefront is reconstructed by a suitable iterative reconstruction method. The need for a reference raises many issues that limit the system in most applications. This Letter proposes a reference-free wavefront sensing (RFWS) methodology, and an RFWS system is built in which wavefront slope changes are measured by introducing a lateral disturbance to the sampling aperture. By applying Southwell reconstruction twice to the measured data, the form of the wavefront at the sampling plane can be well reconstructed. A theoretical simulation platform for RFWS is established, and various surface forms are investigated. Practical measurements with two measurement systems (SHWS and our RFWS) are conducted, analyzed, and compared. All the simulation and measurement results demonstrate the correctness and effectiveness of the method. © 2011 Optical Society of America

  20. Reference intervals for urinary renal injury biomarkers KIM-1 and NGAL in healthy children

    PubMed Central

    McWilliam, Stephen J; Antoine, Daniel J; Sabbisetti, Venkata; Pearce, Robin E; Jorgensen, Andrea L; Lin, Yvonne; Leeder, J Steven; Bonventre, Joseph V; Smyth, Rosalind L; Pirmohamed, Munir

    2014-01-01

    Aim The aim of this study was to establish reference intervals in healthy children for two novel urinary biomarkers of acute kidney injury, kidney injury molecule-1 (KIM-1) and neutrophil gelatinase-associated lipocalin (NGAL). Materials & Methods Urinary biomarkers were determined in samples from children in the UK (n = 120) and the USA (n = 171) using both Meso Scale Discovery (MSD) and Luminex-based analytical approaches. Results 95% reference intervals for each biomarker in each cohort are presented and stratified by sex or ethnicity where necessary, and age-related variability is explored using quantile regression. We identified consistently higher NGAL concentrations in females than males (p < 0.0001), and lower KIM-1 concentrations in African–Americans than Caucasians (p = 0.02). KIM-1 demonstrated diurnal variation, with higher concentrations in the morning (p < 0.001). Conclusion This is the first report of reference intervals for KIM-1 and NGAL using two analytical methods in a healthy pediatric population in both UK and US-based populations. PMID:24661102
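
    A central 95% reference interval of the kind reported here is, in its simplest nonparametric form, just the 2.5th and 97.5th percentiles of the healthy-cohort results. The sketch below shows that computation on simulated biomarker values; the simulated data and the absence of any sex or age stratification are simplifications, not the study's actual analysis.

    import numpy as np

    rng = np.random.default_rng(0)
    # Placeholder urinary biomarker concentrations from a healthy cohort (arbitrary units).
    values = rng.lognormal(mean=2.0, sigma=0.6, size=171)

    def reference_interval_95(x):
        """Nonparametric central 95% reference interval (2.5th and 97.5th percentiles)."""
        return np.percentile(x, [2.5, 97.5])

    low, high = reference_interval_95(values)
    print(f"95% reference interval: {low:.1f} - {high:.1f}")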

  1. Development of candidate reference materials for the measurement of lead in bone

    PubMed Central

    Hetter, Katherine M.; Bellis, David J.; Geraghty, Ciaran; Todd, Andrew C.; Parsons, Patrick J.

    2010-01-01

    The production of modest quantities of candidate bone lead (Pb) reference materials is described, and an optimized production procedure is presented. The reference materials were developed to enable an assessment of the interlaboratory agreement of laboratories measuring Pb in bone; method validation; and for calibration of solid sampling techniques such as laser ablation ICP-MS. Long bones obtained from Pb-dosed and undosed animals were selected to produce four different pools of a candidate powdered bone reference material. The Pb concentrations of these pools reflect both environmental and occupational exposure levels in humans. The animal bones were harvested post mortem, cleaned, defatted, and broken into pieces using the brittle fracture technique at liquid nitrogen temperature. The bone pieces were then ground in a knife mill to produce fragments of 2-mm size. These were further ground in an ultra-centrifugal mill, resulting in finely powdered bone material that was homogenized and then sampled-scooped into vials. Testing for contamination and homogeneity was performed via instrumental methods of analysis. PMID:18421443

  2. Towards the optimal fusion of high-resolution Digital Elevation Models for detailed urban flood assessment

    NASA Astrophysics Data System (ADS)

    Leitão, J. P.; de Sousa, L. M.

    2018-06-01

    Newly available, more detailed and accurate elevation data sets, such as Digital Elevation Models (DEMs) generated on the basis of imagery from terrestrial LiDAR (Light Detection and Ranging) systems or Unmanned Aerial Vehicles (UAVs), can be used to improve flood-model input data and consequently increase the accuracy of the flood modelling results. This paper presents the first application of the MBlend merging method and assesses the impact of combining different DEMs on flood modelling results. It was demonstrated that different raster merging methods can have different and substantial impacts on these results. In addition to the influence associated with the method used to merge the original DEMs, the magnitude of the impact also depends on (i) the systematic horizontal and vertical differences of the DEMs, and (ii) the orientation between the DEM boundary and the terrain slope. The greatest water depth and flow velocity differences between the flood modelling results obtained using the reference DEM and the merged DEMs ranged from -9.845 to 0.002 m, and from 0.003 to 0.024 m s-1, respectively; these differences can have a significant impact on flood hazard estimates. In most of the cases investigated in this study, the differences from the reference DEM results were smaller for the MBlend method than for the results of the two conventional methods. This study highlighted the importance of DEM merging when conducting flood modelling and provided hints on the best DEM merging methods to use.

  3. Strength Analysis on Ship Ladder Using Finite Element Method

    NASA Astrophysics Data System (ADS)

    Budianto; Wahyudi, M. T.; Dinata, U.; Ruddianto; Eko P., M. M.

    2018-01-01

    The design of a ship's structure should follow the rules of the applicable classification standards. In this case, the ladder (staircase) of a ferry ship must be reviewed against the loads experienced during ship operations, both while sailing and during port operations. The classification rules describe the calculation of the structural components, which can also be analysed using the Finite Element Method. The classification regulations used in the design of the ferry were those of BKI (Bureau of Classification Indonesia), so the material composition and mechanical properties of the materials should conform to the classification of the vessel. The structural analysis was performed with a software package based on the Finite Element Method. The analysis showed that the ladder can withstand a load of 140 kg under static, dynamic, and impact conditions. The resulting safety factors indicate that the structure is safe without being excessively strong.

  4. Detector-unit-dependent calibration for polychromatic projections of rock core CT.

    PubMed

    Li, Mengfei; Zhao, Yunsong; Zhang, Peng

    2017-01-01

    Computed tomography (CT) plays an important role in digital rock analysis, a promising new technique for the oil and gas industry. However, artifacts in CT images influence the accuracy of the digital rock model. In this study, we proposed and demonstrated a novel method to restore detector-unit-dependent functions for polychromatic projection calibration by scanning simply shaped reference samples. As long as the attenuation coefficients of the reference samples are similar to those of the scanned object, their size and position need not be exactly known. Both simulated and real data were used to verify the proposed method. The results showed that the new method effectively reduced both beam-hardening artifacts and ring artifacts. Moreover, the method appeared to be quite robust.

  5. [Study on HPLC fingerprint of flavonoids from Houttuynia cordata by comparing with fingerprint reference].

    PubMed

    Zhang, Ting-Ting; Wu, Yi; Hang, Tai-Jun

    2009-05-01

    To establish a stable and repeatable HPLC fingerprint standard and to evaluate the flavonoids from Houttuynia cordata qualitatively and quantitatively. HPLC separation was performed on a C18 column with a methanol-0.1% phosphoric acid mixture as mobile phase in gradient elution mode. The fingerprint reference was determined as one of the most typical chromatograms and was compared with the other samples using the cosine and relative Euclidean distance methods; the chromatographic fingerprints of flavonoids from Houttuynia cordata were thus evaluated by constituents and contents, respectively. Fourteen common peaks were identified in the HPLC fingerprint of flavonoids from Houttuynia cordata. The method gave good results in validation tests in which the quercitrin peak was set as the reference peak for calculating the relative retention times and areas of the other peaks in the chromatograms, with RSDs of less than 0.2% and 5.0%, respectively. The linear range for quercitrin was 1.07-83.4 microg/mL (r=0.9999) and the average recovery was 100.3%. The method shows good repeatability, ruggedness and reliability. Compared with the established reference fingerprint, the evaluation system including the cosine and relative Euclidean distance methods lays a dependable foundation for controlling the quality of Houttuynia cordata.
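
    The cosine and relative Euclidean distance comparisons mentioned above reduce to simple vector operations on the peak-area (or full-chromatogram) vectors. The sketch below shows one plausible formulation; the exact definition of the relative Euclidean distance and the peak areas used are assumptions for illustration, not values from the paper.

    import numpy as np

    def cosine_similarity(sample, reference):
        """Cosine of the angle between two fingerprint vectors (1.0 = identical shape)."""
        s, r = np.asarray(sample, float), np.asarray(reference, float)
        return float(np.dot(s, r) / (np.linalg.norm(s) * np.linalg.norm(r)))

    def relative_euclidean_distance(sample, reference):
        """Euclidean distance normalized by the norm of the reference fingerprint
        (assumed definition, for illustration only)."""
        s, r = np.asarray(sample, float), np.asarray(reference, float)
        return float(np.linalg.norm(s - r) / np.linalg.norm(r))

    reference_areas = [12.1, 3.4, 8.8, 1.2, 30.5]   # made-up peak areas of the reference fingerprint
    sample_areas = [11.8, 3.9, 8.1, 1.0, 29.7]      # made-up peak areas of a test sample
    print(cosine_similarity(sample_areas, reference_areas),
          relative_euclidean_distance(sample_areas, reference_areas))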

  6. The Current Status and Tendency of China Millimeter Coordinate Frame Implementation and Maintenance

    NASA Astrophysics Data System (ADS)

    Cheng, P.; Cheng, Y.; Bei, J.

    2017-12-01

    China Geodetic Coordinate System 2000 (CGCS2000) was first officially declared the national standard coordinate system on July 1, 2008. This reference frame was defined in the ITRF97 frame at epoch 2000.0 and included 2600 GPS geodetic control points. The paper discusses differences between CGCS2000 and later updated ITRF versions, such as ITRF2014, in terms of technical implementation and maintenance. With the development of the Beidou navigation satellite system (BDS), especially the third generation of BDS with global signal coverage in the future, and with progress in space geodetic technology, it is becoming possible to establish a global millimeter-level reference frame based on space geodetic techniques including BDS. The implementation of a millimeter reference frame concerns two factors: 1) estimation of the variation of geocenter motion, and 2) modeling of nonlinear site motion. In this paper, geocenter inversion methods are discussed and results derived from various techniques are compared. Our nonlinear site motion modeling focuses on the singular spectrum analysis method, which has apparent advantages over modeling of physical Earth effects. The work presented in this paper is expected to provide a reference for future CGCS2000 maintenance.

  7. Analytic Guidance for the First Entry in a Skip Atmospheric Entry

    NASA Technical Reports Server (NTRS)

    Garcia-Llama, Eduardo

    2007-01-01

    This paper presents an analytic method to generate a reference drag trajectory for the first entry portion of a skip atmospheric entry. The drag reference, expressed as a polynomial function of the velocity, will meet the conditions necessary to fit the requirements of the complete entry phase. The generic method proposed to generate the drag reference profile is further simplified by thinking of the drag and the velocity as density and cumulative distribution functions respectively. With this notion it will be shown that the reference drag profile can be obtained by solving a linear algebraic system of equations. The resulting drag profile is flown using the feedback linearization method of differential geometric control as guidance law with the error dynamics of a second order homogeneous equation in the form of a damped oscillator. This approach was first proposed as a revisited version of the Space Shuttle Orbiter entry guidance. However, this paper will show that it can be used to fly the first entry in a skip entry trajectory. In doing so, the gains in the error dynamics will be changed at a certain point along the trajectory to improve the tracking performance.
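
    The step of obtaining a polynomial drag-versus-velocity reference by solving a linear algebraic system can be illustrated with a small Hermite-style system: each boundary condition (a drag value or a drag slope at a given velocity) contributes one linear equation in the polynomial coefficients. The particular constraints and numbers below are invented for illustration and are not the entry conditions used in the paper.

    import numpy as np

    def drag_profile_coeffs(conditions, order=3):
        """Solve for coefficients c of D(V) = sum_k c[k] * V**k from linear constraints.
        Each condition is (V, value, deriv_order): deriv_order 0 constrains D(V),
        deriv_order 1 constrains dD/dV at that velocity."""
        A, b = [], []
        for V, value, d in conditions:
            if d == 0:
                row = [V ** k for k in range(order + 1)]
            else:  # first-derivative constraint
                row = [k * V ** (k - 1) if k >= 1 else 0.0 for k in range(order + 1)]
            A.append(row)
            b.append(value)
        return np.linalg.solve(np.array(A), np.array(b))

    # Hypothetical constraints: drag (m/s^2) at entry and exit velocities plus slopes there.
    conds = [(7500.0, 4.0, 0), (3000.0, 6.5, 0), (7500.0, -0.001, 1), (3000.0, -0.002, 1)]
    print("polynomial coefficients (low order first):", drag_profile_coeffs(conds, order=3))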

  8. Microwave-assisted wet digestion with H2O2 at high temperature and pressure using single reaction chamber for elemental determination in milk powder by ICP-OES and ICP-MS.

    PubMed

    Muller, Edson I; Souza, Juliana P; Muller, Cristiano C; Muller, Aline L H; Mello, Paola A; Bizzi, Cezar A

    2016-08-15

    In this work, a green digestion method using only H2O2 as the oxidant at high temperature and pressure in a single reaction chamber system (SRC-UltraWave™) was applied for subsequent elemental determination by inductively coupled plasma-based techniques. Milk powder was chosen to demonstrate the feasibility and advantages of the proposed method. Sample masses up to 500 mg were efficiently digested, and the determination of Ca, Fe, K, Mg and Na was performed by inductively coupled plasma optical emission spectrometry (ICP-OES), while trace elements (B, Ba, Cd, Cu, Mn, Mo, Pb, Sr and Zn) were determined by inductively coupled plasma mass spectrometry (ICP-MS). Residual carbon (RC) lower than 918 mg L(-1) of C was obtained for the digests, which contributed to minimizing interferences in determination by ICP-OES and ICP-MS. Accuracy was evaluated using certified reference materials NIST 1549 (non-fat milk powder certified reference material) and NIST 8435 (whole milk powder reference material). The results obtained by the proposed method were in agreement with the certified reference values (t-test, 95% confidence level). In addition, no significant difference was observed between results obtained by the proposed method and conventional wet digestion using concentrated HNO3. As digestion was performed without using any kind of acid, the characteristics of the final digests were in agreement with green chemistry principles when compared to digests obtained using the conventional wet digestion method with concentrated HNO3. Additionally, H2O2 digests were more suitable for subsequent analysis by ICP-based techniques due to water being the main product of organic matrix oxidation. The proposed method was suitable for quality control of major components and trace elements present in milk powder in consonance with green sample preparation. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Examination of a Method to Determine the Reference Region for Calculating the Specific Binding Ratio in Dopamine Transporter Imaging.

    PubMed

    Watanabe, Ayumi; Inoue, Yusuke; Asano, Yuji; Kikuchi, Kei; Miyatake, Hiroki; Tokushige, Takanobu

    2017-01-01

    The specific binding ratio (SBR) was first reported by Tossici-Bolt et al. as a quantitative indicator for dopamine transporter (DAT) imaging. It is defined as the ratio of the specific binding concentration of the striatum to the non-specific binding concentration of the whole brain other than the striatum. The non-specific binding concentration is calculated from a region of interest (ROI) set 20 mm inside the outer contour, which is defined by a threshold technique. Tossici-Bolt et al. used a 50% threshold, but with a 50% threshold we sometimes could not define the ROI of the non-specific binding concentration (reference region) and calculate the SBR appropriately. Therefore, we sought a new method for determining the reference region when calculating the SBR. Using data from 20 patients who had undergone DAT imaging at our hospital, we calculated the non-specific binding concentration by two approaches: fixing the threshold that defines the reference region at specific values (the fixing method), and having an examiner visually optimize the reference region for each examination (the visual optimization method). First, we assessed the reference region of each method visually; afterwards, we quantitatively compared the SBRs calculated with each method. In the visual assessment, the scores of the fixing method at 30% and of the visual optimization method were higher than the scores of the fixing method at other values, with or without scatter correction. In the quantitative assessment, the SBR obtained by visual optimization of the reference region, based on the consensus of three radiological technologists, was used as a baseline (the standard method). The SBR values showed good agreement between the standard method and both the fixing method at 30% and the visual optimization method, with or without scatter correction. Therefore, the fixing method at 30% and the visual optimization method were equally suitable for determining the reference region.
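
    Using the definition quoted above (specific striatal binding divided by the non-specific binding of the brain outside the striatum), the SBR itself is a one-line ratio once mean counts in the striatal and reference regions are available. The sketch below also applies a 30%-of-maximum threshold of the kind used by the fixing method; the toy volume, the cubic regions and the simple masking are illustrative stand-ins, not the contour-based procedure of the original report.

    import numpy as np

    def specific_binding_ratio(striatal_mean, reference_mean):
        """SBR = specific striatal binding / non-specific binding, with the
        non-specific concentration taken from the reference region."""
        return (striatal_mean - reference_mean) / reference_mean

    def threshold_reference_mask(volume, striatal_mask, threshold_fraction=0.30):
        """Voxels above a fraction of the maximum count, excluding the striatum --
        a simplified stand-in for the contour-based reference region."""
        brain = volume >= threshold_fraction * volume.max()
        return brain & ~striatal_mask

    # Toy count volume: low background, a cubic "brain" and a hotter "striatum".
    vol = np.full((30, 30, 30), 1.0)
    vol[5:25, 5:25, 5:25] = 10.0                 # brain-like non-specific uptake
    striatum = np.zeros_like(vol, dtype=bool)
    striatum[13:17, 13:17, 13:17] = True
    vol[striatum] = 14.0                         # specific striatal uptake

    ref_mask = threshold_reference_mask(vol, striatum, threshold_fraction=0.30)
    print(round(specific_binding_ratio(vol[striatum].mean(), vol[ref_mask].mean()), 2))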

  10. The Latest Developments in the Field of University Teaching Methods: A View from the German Democratic Republic.

    ERIC Educational Resources Information Center

    Klose-Berger, Annelore; Mohle, Horst

    1989-01-01

    Several aspects of East German research on university teaching methods, with special reference to Karl Marx University, are discussed: the development of teaching methods as part of the educational sciences field; selected recent research results, and the application of research findings to practice in the training and retraining of university…

  11. Establishing Upper Limits for Item Ratings for the Angoff Method: Are Resulting Standards More 'Realistic'?

    ERIC Educational Resources Information Center

    Reid, Jerry B.

    This report investigates an area of uncertainty in using the Angoff method for setting standards, namely whether or not a judge's conceptualizations of borderline group performance are realistic. Ratings are usually made with reference to the performance of this hypothetical group, therefore the Angoff method's success is dependent on this point.…

  12. Job Search Methods: Consequences for Gender-based Earnings Inequality.

    ERIC Educational Resources Information Center

    Huffman, Matt L.; Torres, Lisa

    2001-01-01

    Data from adults in Atlanta, Boston, and Los Angeles (n=1,942) who searched for work using formal (ads, agencies) or informal (networks) methods indicated that type of method used did not contribute to the gender gap in earnings. Results do not support formal job search as a way to reduce gender inequality. (Contains 55 references.) (SK)

  13. Three Methods of Estimating a Model of Group Effects: A Comparison with Reference to School Effect Studies.

    ERIC Educational Resources Information Center

    Igra, Amnon

    1980-01-01

    Three methods of estimating a model of school effects are compared: ordinary least squares; an approach based on the analysis of covariance; and, a residualized input-output approach. Results are presented using a matrix algebra formulation, and advantages of the first two methods are considered. (Author/GK)

  14. Artificial Satellites Observations Using the Complex of Telescopes of RI "MAO"

    NASA Astrophysics Data System (ADS)

    Sybiryakova, Ye. S.; Shulga, O. V.; Vovk, V. S.; Kaliuzny, M. P.; Bushuev, F. I.; Kulichenko, M. O.; Haloley, M. I.; Chernozub, V. M.

    2017-02-01

    Special methods, equipment and software for the observation of cosmic objects and the processing of the obtained results were developed. The combined method, which consists of separate accumulation of images of the reference stars and of the artificial objects, is the main method used in observations of artificial cosmic objects. It is used for observations of artificial objects in all types of orbits.

  15. Comparison of Three Different Methods for Pile Integrity Testing on a Cylindrical Homogeneous Polyamide Specimen

    NASA Astrophysics Data System (ADS)

    Lugovtsova, Y. D.; Soldatov, A. I.

    2016-01-01

    Three different methods for pile integrity testing are compared on a cylindrical homogeneous polyamide specimen. The methods are low strain pile integrity testing, multichannel pile integrity testing and testing with a shaker system. Since low strain pile integrity testing is a well-established and standardized method, its results are used as a reference for the other two methods.

  16. Reduced-Reference Quality Assessment Based on the Entropy of DWT Coefficients of Locally Weighted Gradient Magnitudes.

    PubMed

    Golestaneh, S Alireza; Karam, Lina

    2016-08-24

    Perceptual image quality assessment (IQA) attempts to use computational models to estimate the image quality in accordance with subjective evaluations. Reduced-reference (RR) image quality assessment (IQA) methods make use of partial information or features extracted from the reference image for estimating the quality of distorted images. Finding a balance between the number of RR features and the accuracy of the estimated image quality is essential and important in IQA. In this paper we propose a training-free low-cost RRIQA method that requires a very small number of RR features (6 RR features). The proposed RRIQA algorithm is based on the discrete wavelet transform (DWT) of locally weighted gradient magnitudes. We apply the human visual system's contrast sensitivity and neighborhood gradient information to weight the gradient magnitudes in a locally adaptive manner. The RR features are computed by measuring the entropy of each DWT subband, for each scale, and pooling the subband entropies along all orientations, resulting in L RR features (one average entropy per scale) for an L-level DWT. Extensive experiments performed on seven large-scale benchmark databases demonstrate that the proposed RRIQA method delivers highly competitive performance as compared to the state-of-the-art RRIQA models as well as full-reference ones for both natural and texture images. The MATLAB source code of REDLOG and the evaluation results are publicly available online at http://lab.engineering.asu.edu/ivulab/software/redlog/.
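
    A greatly simplified sketch of the scale-wise entropy feature described above: take gradient magnitudes, decompose them with an L-level DWT, and keep one pooled entropy per scale. The contrast-sensitivity and neighborhood weighting of the published method are omitted, and the wavelet choice and histogram binning below are assumptions, so this is not the REDLOG implementation.

    import numpy as np
    import pywt  # PyWavelets

    def subband_entropy(coeffs, bins=256):
        """Shannon entropy (bits) of the histogram of one set of DWT coefficients."""
        hist, _ = np.histogram(coeffs.ravel(), bins=bins)
        p = hist.astype(float) / hist.sum()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    def rr_features(image, levels=6, wavelet="db2"):
        """One pooled entropy per DWT scale of the gradient magnitude (L features)."""
        gy, gx = np.gradient(image.astype(float))
        grad_mag = np.hypot(gx, gy)
        coeffs = pywt.wavedec2(grad_mag, wavelet, level=levels)
        # coeffs[0] is the approximation; each following entry is (cH, cV, cD) for one scale.
        return [np.mean([subband_entropy(band) for band in detail]) for detail in coeffs[1:]]

    rng = np.random.default_rng(0)
    print(rr_features(rng.random((256, 256)), levels=6))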

  17. Two-Component Noncollinear Time-Dependent Spin Density Functional Theory for Excited State Calculations.

    PubMed

    Egidi, Franco; Sun, Shichao; Goings, Joshua J; Scalmani, Giovanni; Frisch, Michael J; Li, Xiaosong

    2017-06-13

    We present a linear response formalism for the description of the electronic excitations of a noncollinear reference defined via Kohn-Sham spin density functional methods. A set of auxiliary variables, defined using the density and noncollinear magnetization density vector, allows the generalization of spin density functional kernels commonly used in collinear DFT to noncollinear cases, including local density, GGA, meta-GGA and hybrid functionals. Working equations and derivations of functional second derivatives with respect to the noncollinear density, required in the linear response noncollinear TDDFT formalism, are presented in this work. This formalism takes all components of the spin magnetization into account independent of the type of reference state (open or closed shell). As a result, the method introduced here is able to afford a nonzero local xc torque on the spin magnetization while still satisfying the zero-torque theorem globally. The formalism is applied to a few test cases using the variational exact-two-component reference including spin-orbit coupling to illustrate the capabilities of the method.

  18. Synthesized view comparison method for no-reference 3D image quality assessment

    NASA Astrophysics Data System (ADS)

    Luo, Fangzhou; Lin, Chaoyi; Gu, Xiaodong; Ma, Xiaojun

    2018-04-01

    We develop a no-reference image quality assessment metric to evaluate the quality of synthesized view rendered from the Multi-view Video plus Depth (MVD) format. Our metric is named Synthesized View Comparison (SVC), which is designed for real-time quality monitoring at the receiver side in a 3D-TV system. The metric utilizes the virtual views in the middle which are warped from left and right views by Depth-image-based rendering algorithm (DIBR), and compares the difference between the virtual views rendered from different cameras by Structural SIMilarity (SSIM), a popular 2D full-reference image quality assessment metric. The experimental results indicate that our no-reference quality assessment metric for the synthesized images has competitive prediction performance compared with some classic full-reference image quality assessment metrics.
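
    The core comparison, SSIM between two virtual middle views warped from the left and right cameras, can be reproduced with a standard full-reference SSIM implementation. The sketch below uses scikit-image and synthetic arrays in place of actual DIBR-rendered views, so it only illustrates the final scoring step.

    import numpy as np
    from skimage.metrics import structural_similarity

    # Stand-ins for two virtual middle views warped (via DIBR) from the left and
    # right cameras; here they are a synthetic image and a noisy copy of it.
    rng = np.random.default_rng(0)
    view_from_left = rng.random((240, 320))
    view_from_right = np.clip(view_from_left + rng.normal(0.0, 0.05, (240, 320)), 0.0, 1.0)

    # SSIM between the two synthesized views serves as the no-reference quality score.
    score = structural_similarity(view_from_left, view_from_right, data_range=1.0)
    print(f"SVC-style score (SSIM between synthesized views): {score:.3f}")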

  19. Optical monitoring of QSO in the framework of the Gaia space mission

    NASA Astrophysics Data System (ADS)

    Taris, F.; Damljanovic, G.; Andrei, A.; Klotz, A.; Vachier, F.

    2015-08-01

    The Gaia astrometric mission of the European Space Agency was launched on 19 December 2013. It will provide an astrometric catalogue of 500 000 extragalactic sources that could be the basis of a new optical reference frame. On the other hand, the current International Celestial Reference Frame (ICRF) is based on observations of extragalactic sources at radio wavelengths. The astrometric coordinates of sources in these two reference systems will have roughly the same uncertainty. It is therefore mandatory to observe a set of common targets at both optical and radio wavelengths to link the ICRF with what could be called the GCRF (Gaia Celestial Reference Frame). In this paper we show some results obtained with the TJO, Telescopi Juan Oro, at the Observatori Astronomic del Montsec in Spain, as well as results obtained by applying the Lomb-Scargle and CLEAN algorithm methods to optical magnitudes obtained with the TAROT telescopes.
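
    For irregularly sampled optical magnitudes of the kind discussed here, a Lomb-Scargle periodogram can be computed directly with SciPy. The time series below is synthetic and the trial-period grid is an arbitrary choice, so this is only a sketch of that analysis step, not the authors' pipeline.

    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(42)
    # Irregular observation epochs (days) and a synthetic magnitude series with a
    # 25-day periodic component plus noise, standing in for QSO photometry.
    t = np.sort(rng.uniform(0.0, 400.0, 180))
    mag = 17.0 + 0.08 * np.sin(2 * np.pi * t / 25.0) + rng.normal(0.0, 0.02, t.size)

    periods = np.linspace(2.0, 100.0, 2000)   # trial periods in days
    ang_freqs = 2 * np.pi / periods           # lombscargle expects angular frequencies
    power = lombscargle(t, mag - mag.mean(), ang_freqs)
    print("strongest period (days): %.1f" % periods[np.argmax(power)])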

  20. Model reference adaptive control (MRAC)-based parameter identification applied to surface-mounted permanent magnet synchronous motor

    NASA Astrophysics Data System (ADS)

    Zhong, Chongquan; Lin, Yaoyao

    2017-11-01

    In this work, a model reference adaptive control-based estimation algorithm is proposed for online multi-parameter identification of surface-mounted permanent magnet synchronous machines. By taking the dq-axis equations of a practical motor as the reference model and the dq-axis estimation equations as the adjustable model, a standard model-reference-adaptive-system-based estimator was established. Additionally, the Popov hyperstability principle was used in the design of the adaptive law to guarantee accurate convergence. To reduce oscillation in the identification results, this work introduces a first-order low-pass digital filter to improve the precision of the parameter estimation. The proposed scheme was then applied to an SPM synchronous motor control system without any additional circuits and implemented using a TMS320LF2812 DSP. The experimental results reveal the effectiveness of the proposed method.
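
    The smoothing step mentioned above, a first-order low-pass digital filter applied to the raw parameter estimates, can be written in a few lines. The cutoff, sampling rate and the synthetic noisy estimate below are illustrative values, not those of the cited drive system.

    import numpy as np

    def first_order_lowpass(x, cutoff_hz, fs_hz):
        """Discrete first-order low-pass filter y[n] = a*y[n-1] + (1-a)*x[n],
        with the smoothing factor derived from the cutoff and sampling frequencies."""
        dt = 1.0 / fs_hz
        rc = 1.0 / (2.0 * np.pi * cutoff_hz)
        alpha = rc / (rc + dt)   # closer to 1 means heavier smoothing
        y = np.empty_like(x, dtype=float)
        y[0] = x[0]
        for n in range(1, len(x)):
            y[n] = alpha * y[n - 1] + (1.0 - alpha) * x[n]
        return y

    # Noisy online estimate of, say, a stator resistance oscillating around 2.5 ohm.
    rng = np.random.default_rng(3)
    raw_estimate = 2.5 + 0.3 * rng.standard_normal(2000)
    smoothed = first_order_lowpass(raw_estimate, cutoff_hz=10.0, fs_hz=10000.0)
    print(raw_estimate.std(), smoothed[500:].std())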

  1. Hematology and plasma chemistry reference intervals for cultured tilapia (Oreochromis hybrid).

    PubMed

    Hrubec, Terry C.; Cardinale, Jenifer L.; Smith, Stephen A.

    2000-01-01

    Tilapia are a commonly aquacultured fish yet little is known about their normal physiology and response to disease. In this study we determined the results of complete hematologic (n=40) and plasma biochemical profiles (n=63) in production tilapia (Oreochromis hybrids). The fish were raised in recirculating systems with a high stocking density (120 g/L), and were in the middle of a 15-month production cycle. Blood was analyzed using standard techniques, and reference intervals were determined using nonparametric methods. Non-production tilapia (n=15) from low-density tanks (4 g/L) also were sampled; the clinical chemistry results were compared to reference intervals from the fish raised in high-density tanks. Differences were noted in plasma protein, calcium and phosphorus concentrations, such that reference intervals for high-density production tilapia were not applicable to fish raised under different environmental and management conditions.

  2. Frequency References for Gravitational Wave Missions

    NASA Technical Reports Server (NTRS)

    Preston, Alix; Thrope, J. I.; Donelan, D.; Miner, L.

    2012-01-01

    The mitigation of laser frequency noise is an important aspect of interferometry for LISA-like missions. One portion of the baseline mitigation strategy in LISA is active stabilization utilizing opto-mechanical frequency references. The LISA optical bench is an attractive place to implement such frequency references due to its environmental stability and its access to primary and redundant laser systems. We have made an initial investigation of frequency references constructed using the techniques developed for the LISA and LISA Pathfinder optical benches. Both a Mach-Zehnder interferometer and triangular Fabry-Perot cavity have been successfully bonded to a Zerodur baseplate using the hydroxide bonding method. We will describe the construction of the bench along with preliminary stability results.

  3. Retinal status analysis method based on feature extraction and quantitative grading in OCT images.

    PubMed

    Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri

    2016-07-22

    Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and for assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic OCT image analysis method for computer-aided disease diagnosis, a critical part of eye fundus diagnosis. This study analyzed 300 OCT images acquired by Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). Firstly, a normal retinal reference model based on retinal boundaries was presented. Subsequently, two kinds of quantitative methods based on geometric features and morphological features were proposed. A decision-making method for grading retinal abnormality was then put forward and used in the analysis and evaluation of multiple OCT images; the detailed analysis process is illustrated with four retinal OCT images of differing degrees of abnormality. The final grading results verified that the analysis method can distinguish abnormal severity and lesion regions. In a simulation on the 150 test images, the analysis of retinal status showed a sensitivity of 0.94 and a specificity of 0.92. The proposed method can speed up the diagnostic process and objectively evaluate retinal status. This paper thus presents an automatic retinal status analysis method based on feature extraction and quantitative grading in OCT images: the method can obtain the parameters and features associated with retinal morphology, and quantitative analysis and evaluation of these features are combined with the reference model to judge whether the target image is abnormal and to provide a reference for disease diagnosis.

  4. Evaluation and improvements of a mayfly, Neocloeon (Centroptilum) triangulifer ?(Ephemeroptera: Baetidae) toxicity test method

    EPA Science Inventory

    A recently published test method for Neocloeon triangulifer assessed the sensitivities of larval mayflies to several reference toxicants (NaCl, KCl, and CuSO4). Subsequent exposures have shown discrepancies from those results previously reported. To identify potential sources of ...

  5. Compositional analysis of biomass reference materials: Results from an interlaboratory study

    DOE PAGES

    Templeton, David W.; Wolfrum, Edward J.; Yen, James H.; ...

    2015-10-29

    Biomass compositional methods are used to compare different lignocellulosic feedstocks, to measure component balances around unit operations and to determine process yields and therefore the economic viability of biomass-to-biofuel processes. Four biomass reference materials (RMs NIST 8491–8494) were prepared and characterized, via an interlaboratory comparison exercise in the early 1990s to evaluate biomass summative compositional methods, analysts, and laboratories. Having common, uniform, and stable biomass reference materials gives the opportunity to assess compositional data compared to other analysts, to other labs, and to a known compositional value. The expiration date for the original characterization of these RMs was reached and an effort to assess their stability and recharacterize the reference values for the remaining material using more current methods of analysis was initiated. We sent samples of the four biomass RMs to 11 academic, industrial, and government laboratories, familiar with sulfuric acid compositional methods, for recharacterization of the component reference values. In this work, we have used an expanded suite of analytical methods that are more appropriate for herbaceous feedstocks, to recharacterize the RMs’ compositions. We report the median values and the expanded uncertainty values for the four RMs on a dry-mass, whole-biomass basis. The original characterization data has been recalculated using median statistics to facilitate comparisons with this data. We found improved total component closures for three out of the four RMs compared to the original characterization, and the total component closures were near 100 %, which suggests that most components were accurately measured and little double counting occurred. Here, the major components were not statistically different in the recharacterization which suggests that the biomass materials are stable during storage and that additional components, not seen in the original characterization, were quantified here.

  6. Size consistent formulations of the perturb-then-diagonalize Møller-Plesset perturbation theory correction to non-orthogonal configuration interaction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yost, Shane R.; Head-Gordon, Martin, E-mail: mhg@cchem.berkeley.edu; Chemical Sciences Division, Lawrence Berkeley National Laboratory, Berkeley, California 94720

    2016-08-07

    In this paper we introduce two size consistent forms of the non-orthogonal configuration interaction with second-order Møller-Plesset perturbation theory method, NOCI-MP2. We show that the original NOCI-MP2 formulation [S. R. Yost, T. Kowalczyk, and T. VanVoorh, J. Chem. Phys. 193, 174104 (2013)], which is a perturb-then-diagonalize multi-reference method, is not size consistent. We also show that this causes significant errors in large systems like the linear acenes. By contrast, the size consistent versions of the method give satisfactory results for singlet and triplet excited states when compared to other multi-reference methods that include dynamic correlation. For NOCI-MP2 however, the number of required determinants to yield similar levels of accuracy is significantly smaller. These results show the promise of the NOCI-MP2 method, though work still needs to be done in creating a more consistent black-box approach to computing the determinants that comprise the many-electron NOCI basis.

  7. Adjusting for partial verification or workup bias in meta-analyses of diagnostic accuracy studies.

    PubMed

    de Groot, Joris A H; Dendukuri, Nandini; Janssen, Kristel J M; Reitsma, Johannes B; Brophy, James; Joseph, Lawrence; Bossuyt, Patrick M M; Moons, Karel G M

    2012-04-15

    A key requirement in the design of diagnostic accuracy studies is that all study participants receive both the test under evaluation and the reference standard test. For a variety of practical and ethical reasons, sometimes only a proportion of patients receive the reference standard, which can bias the accuracy estimates. Numerous methods have been described for correcting this partial verification bias or workup bias in individual studies. In this article, the authors describe a Bayesian method for obtaining adjusted results from a diagnostic meta-analysis when partial verification or workup bias is present in a subset of the primary studies. The method corrects for verification bias without having to exclude primary studies with verification bias, thus preserving the main advantages of a meta-analysis: increased precision and better generalizability. The results of this method are compared with the existing methods for dealing with verification bias in diagnostic meta-analyses. For illustration, the authors use empirical data from a systematic review of studies of the accuracy of the immunohistochemistry test for diagnosis of human epidermal growth factor receptor 2 status in breast cancer patients.

  8. The determination of mercury in mushrooms by CV-AAS and ICP-AES techniques.

    PubMed

    Jarzynska, Grazyna; Falandysz, Jerzy

    2011-01-01

    This research presents an applied study of analytical problems arising in the determination of hazardous mercury in environmental materials and of the validity of published results on the content of this element in wild-grown mushrooms. The total mercury content was analyzed in several species of wild-grown mushrooms and in some certified reference materials of herbal origin, using two analytical methods. One method was the commonly known and well-validated cold-vapour atomic absorption spectroscopy (CV-AAS) after direct sample pyrolysis coupled to a gold wool trap, which served as the reference method. The second method involved a final mercury measurement by inductively coupled plasma atomic emission spectroscopy (ICP-AES) at λ 194.163 nm, which has been used by some authors to report high mercury contents in large sets of wild-grown mushrooms. We found that the method using ICP-AES at λ 194.163 nm gave inaccurate and imprecise results. The results of this study imply that, because of the unsuitability of total mercury determination using ICP-AES at λ 194.163 nm, reports of high concentrations of this metal in large sets of wild-grown mushrooms examined using this method have to be treated with caution, since the data are highly biased.

  9. Monitoring hydrofrac-induced seismicity by surface arrays - the DHM-Project Basel case study

    NASA Astrophysics Data System (ADS)

    Blascheck, P.; Häge, M.; Joswig, M.

    2012-04-01

    The method "nanoseismic monitoring" was applied during the hydraulic stimulation at the Deep-Heat-Mining-Project (DHM-Project) Basel. Two small arrays at distances of 2.1 km and 4.8 km from the borehole recorded continuously for two days. During this time more than 2500 seismic events were detected. The surface monitoring of induced seismicity was compared to the reference provided by the hydrofrac monitoring, which was conducted with a network of borehole seismometers by Geothermal Explorers Limited. Array processing provides an outlier-resistant, graphical jack-knifing localization method, which resulted in an average deviation from the reference of 850 m. Additionally, by applying the relative localization master-event method, the NNW-SSE strike direction of the reference was confirmed. It was shown that 3 h segments of data are sufficient to successfully estimate the magnitude of completeness as well as the b-value at the event rate and detection sensitivity present. This is supported by two segments out of over 13 h of evaluated data; these segments were chosen to represent both the high seismic noise during normal working hours in the daytime and the minimum anthropogenic noise at night. The low signal-to-noise ratio was compensated by applying a sonogram event detection as well as a coincidence analysis within each array. Sonograms use autoadaptive, non-linear filtering to enhance signals whose amplitudes are just above the noise level. For these events the magnitude was determined by the master-event method, allowing the magnitude of completeness to be computed by the entire-magnitude-range method provided by the ZMAP toolbox. Additionally, the b-values were determined and compared to the reference values. An introduction to the method of "nanoseismic monitoring" is given, as well as a comparison to the reference data in the Basel case study.

  10. A quasiparticle-based multi-reference coupled-cluster method.

    PubMed

    Rolik, Zoltán; Kállay, Mihály

    2014-10-07

    The purpose of this paper is to introduce a quasiparticle-based multi-reference coupled-cluster (MRCC) approach. The quasiparticles are introduced via a unitary transformation which allows us to represent a complete active space reference function and other elements of an orthonormal multi-reference (MR) basis in a determinant-like form. The quasiparticle creation and annihilation operators satisfy the fermion anti-commutation relations. On the basis of these quasiparticles, a generalization of the normal-ordered operator products for the MR case can be introduced as an alternative to the approach of Mukherjee and Kutzelnigg [Recent Prog. Many-Body Theor. 4, 127 (1995); Mukherjee and Kutzelnigg, J. Chem. Phys. 107, 432 (1997)]. Based on the new normal ordering any quasiparticle-based theory can be formulated using the well-known diagram techniques. Beyond the general quasiparticle framework we also present a possible realization of the unitary transformation. The suggested transformation has an exponential form where the parameters, holding exclusively active indices, are defined in a form similar to the wave operator of the unitary coupled-cluster approach. The definition of our quasiparticle-based MRCC approach strictly follows the form of the single-reference coupled-cluster method and retains several of its beneficial properties. Test results for small systems are presented using a pilot implementation of the new approach and compared to those obtained by other MR methods.

  11. 40 CFR 53.11 - Cancellation of reference or equivalent method designation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 40, Protection of Environment, Volume 5 (2010-07-01). Section 53.11 - Cancellation of reference or equivalent method designation. Protection of Environment; ENVIRONMENTAL PROTECTION AGENCY (CONTINUED); AIR PROGRAMS (CONTINUED); AMBIENT AIR MONITORING REFERENCE AND EQUIVALENT METHODS; General...

  12. Investigation of Aerosol Surface Area Estimation from Number and Mass Concentration Measurements: Particle Density Effect

    PubMed Central

    Ku, Bon Ki; Evans, Douglas E.

    2015-01-01

    For nanoparticles with nonspherical morphologies, e.g., open agglomerates or fibrous particles, it is expected that the actual density of agglomerates may be significantly different from the bulk material density. It is further expected that using the material density may upset the relationship between surface area and mass when a method for estimating aerosol surface area from number and mass concentrations (referred to as “Maynard’s estimation method”) is used. Therefore, it is necessary to quantitatively investigate how much Maynard’s estimation method depends on particle morphology and density. In this study, aerosol surface area estimated from number and mass concentration measurements was evaluated and compared with values from two reference methods: a method proposed by Lall and Friedlander for agglomerates and a mobility-based method for compact nonspherical particles, using well-defined polydisperse aerosols with known particle densities. Polydisperse silver aerosol particles were generated by an aerosol generation facility. Generated aerosols had a range of morphologies, count median diameters (CMD) between 25 and 50 nm, and geometric standard deviations (GSD) between 1.5 and 1.8. The surface area estimates from number and mass concentration measurements correlated well with the two reference values when gravimetric mass was used. The aerosol surface area estimates from Maynard’s estimation method were comparable to the reference method for all particle morphologies within the surface area ratios of 3.31 and 0.19 for assumed GSDs 1.5 and 1.8, respectively, when the bulk material density of silver was used. The difference between Maynard’s estimation method and the surface area measured by the reference method for fractal-like agglomerates decreased from 79% to 23% when the measured effective particle density was used, while the difference for nearly spherical particles decreased from 30% to 24%. The results indicate that the use of the particle density of agglomerates improves the accuracy of Maynard’s estimation method and that an effective density should be taken into account, when known, when estimating the aerosol surface area of nonspherical aerosols such as open agglomerates and fibrous particles. PMID:26526560
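
    For readers unfamiliar with the estimation method being evaluated, the sketch below shows one common way such an estimate is formed: assume a lognormal size distribution of spheres with a fixed GSD, recover the count median diameter from the measured number and mass concentrations via the Hatch-Choate relations, and then compute the surface area. This is a generic reconstruction under those assumptions (with invented example values), not necessarily the exact formulation evaluated in the study.

    import numpy as np

    def estimated_surface_area(number_conc, mass_conc, density, gsd):
        """Estimate aerosol surface area concentration (m^2 per m^3 of air) from
        number concentration (particles/m^3), mass concentration (kg/m^3), an
        assumed particle density (kg/m^3) and an assumed GSD, for a lognormal
        distribution of spheres."""
        ln2s = np.log(gsd) ** 2
        # Hatch-Choate: M = N * rho * (pi/6) * CMD^3 * exp(4.5 * ln^2(sigma_g))
        cmd = (6.0 * mass_conc / (np.pi * density * number_conc * np.exp(4.5 * ln2s))) ** (1.0 / 3.0)
        # Hatch-Choate: S = N * pi * CMD^2 * exp(2 * ln^2(sigma_g))
        return number_conc * np.pi * cmd ** 2 * np.exp(2.0 * ln2s)

    # Illustrative comparison: bulk silver density versus a lower assumed effective density.
    for rho in (10490.0, 4000.0):
        s = estimated_surface_area(number_conc=5e10, mass_conc=20e-9, density=rho, gsd=1.8)
        print("density %5.0f kg/m^3 -> %.0f um^2/cm^3" % (rho, s * 1e6))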

  13. Automatic dynamic range adjustment for ultrasound B-mode imaging.

    PubMed

    Lee, Yeonhwa; Kang, Jinbum; Yoo, Yangmo

    2015-02-01

    In medical ultrasound imaging, dynamic range (DR) is defined as the difference between the maximum and minimum values of the signal to be displayed, and it is one of the most essential parameters determining image quality. Typically, DR is given a fixed value and adjusted manually by operators, which leads to low clinical productivity and high user dependency. Furthermore, in 3D ultrasound imaging, DR values cannot be adjusted during 3D data acquisition. A histogram matching method, which equalizes the histogram of an input image based on that of a reference image, can be applied to determine the DR value; however, it can lead to an over-contrasted image. In this paper, a new Automatic Dynamic Range Adjustment (ADRA) method is presented that adaptively adjusts the DR value so that input images become similar to a reference image. The proposed ADRA method uses the distance ratio between the log average and each extreme value of a reference image. To evaluate the performance of the ADRA method, the similarity between the reference and input images was measured by computing a correlation coefficient (CC). In in vivo experiments, applying the ADRA method increased the CC values from 0.6872 to 0.9870 and from 0.9274 to 0.9939 for kidney and liver data, respectively, compared to the fixed-DR case. In addition, the proposed ADRA method was shown to outperform the histogram matching method on in vivo liver and kidney data. When using 3D abdominal data with 70 frames, the CC value from the ADRA method was only slightly increased (i.e., 0.6%), but the proposed method showed improved image quality in the c-plane compared to its fixed counterpart, which suffered from a shadow artifact. These results indicate that the proposed method can enhance image quality in 2D and 3D ultrasound B-mode imaging by improving the similarity between the reference and input images while eliminating unnecessary manual interaction by the user. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Reference values assessment in a Mediterranean population for small dense low-density lipoprotein concentration isolated by an optimized precipitation method.

    PubMed

    Fernández-Cidón, Bárbara; Padró-Miquel, Ariadna; Alía-Ramos, Pedro; Castro-Castro, María José; Fanlo-Maresma, Marta; Dot-Bach, Dolors; Valero-Politi, José; Pintó-Sala, Xavier; Candás-Estébanez, Beatriz

    2017-01-01

    High serum concentrations of small dense low-density lipoprotein cholesterol (sd-LDL-c) particles are associated with risk of cardiovascular disease (CVD). Their clinical application has been hindered by the laborious current method used for their quantification. The aim was to optimize a simple and fast precipitation method to isolate sd-LDL particles and to establish a reference interval in a Mediterranean population. Forty-five serum samples were collected, and sd-LDL particles were isolated using a modified heparin-Mg2+ precipitation method. The sd-LDL-c concentration was calculated by subtracting high-density lipoprotein cholesterol (HDL-c) from the total cholesterol measured in the supernatant. This method was compared with the reference method (ultracentrifugation). Reference values were estimated according to the Clinical and Laboratory Standards Institute and The International Federation of Clinical Chemistry and Laboratory Medicine recommendations. The sd-LDL-c concentration was measured in serum from 79 subjects with no lipid metabolism abnormalities. The Passing-Bablok regression equation is y = 1.52 (0.72 to 1.73) + 0.07 x (-0.1 to 0.13), demonstrating no statistically significant differences between the modified precipitation method and the ultracentrifugation reference method. Similarly, no differences were detected when considering only sd-LDL-c from dyslipidemic patients, since the modifications added to the precipitation method facilitated the proper sedimentation of triglycerides and other lipoproteins. The reference interval for sd-LDL-c concentration estimated in a Mediterranean population was 0.04-0.47 mmol/L. An optimization of the heparin-Mg2+ precipitation method for sd-LDL particle isolation was performed, and reference intervals were established in a Spanish Mediterranean population. The measured values were equivalent to those obtained with the reference method, supporting its clinical application in both normolipidemic and dyslipidemic subjects.
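
    The quantification step after precipitation is a simple subtraction, sd-LDL-c = supernatant total cholesterol minus HDL-c, and a result can then be checked against the reported 0.04-0.47 mmol/L interval. The helper below only illustrates that arithmetic; the function names and flagging wording are choices made here, not taken from the paper.

    def sd_ldl_cholesterol(supernatant_total_chol, hdl_chol):
        """sd-LDL-c (mmol/L) = total cholesterol measured in the precipitation
        supernatant minus HDL cholesterol."""
        return supernatant_total_chol - hdl_chol

    def flag_against_reference(sd_ldl_c, low=0.04, high=0.47):
        """Compare a result with the 0.04-0.47 mmol/L reference interval reported
        for the Mediterranean population studied."""
        if sd_ldl_c < low:
            return "below reference interval"
        if sd_ldl_c > high:
            return "above reference interval"
        return "within reference interval"

    value = sd_ldl_cholesterol(supernatant_total_chol=1.95, hdl_chol=1.40)  # mmol/L
    print(round(value, 2), flag_against_reference(value))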

  15. Postural stabilization after single-leg vertical jump in individuals with chronic ankle instability.

    PubMed

    Nunes, Guilherme S; de Noronha, Marcos

    2016-11-01

    To investigate the impact that different ways of defining reference balance can have when analysing the time to stabilization (TTS), and secondarily, to investigate the difference in TTS between people with chronic ankle instability (CAI) and healthy controls. Cross-sectional study. Laboratory. Fifty recreational athletes (25 CAI, 25 controls). The outcomes were the TTS of the center of pressure (CoP) after a maximal single-leg vertical jump, computed using the single-leg stance, the pre-jump period, and the post-jump period as reference methods, and the CoP variability during the reference periods. The post-jump reference period gave lower TTS values in the anterior-posterior (AP) direction than the single-leg stance (P = 0.001) and the pre-jump period (P = 0.002). For TTS in the medio-lateral (ML) direction, the post-jump reference period gave lower TTS than the single-leg stance (P = 0.01). We found no difference between the CAI and control groups in TTS for any direction. The CAI group showed more CoP variability than the control group in the single-leg stance reference period for both directions. Different reference periods will produce different results for TTS. There is no difference in TTS after a maximum vertical jump between groups. People with CAI have more CoP variability in both directions during single-leg stance. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Determination of Reference Catalogs for Meridian Observations Using Statistical Method

    NASA Astrophysics Data System (ADS)

    Li, Z. Y.

    2014-09-01

    The meridian observational data are useful for developing high-precision planetary ephemerides of the solar system. These historical data are provided by the Jet Propulsion Laboratory (JPL) or the Institut De Mecanique Celeste Et De Calcul Des Ephemerides (IMCCE). However, we find that the reference systems (realized by the fundamental catalogs FK3 (Third Fundamental Catalogue), FK4 (Fourth Fundamental Catalogue), and FK5 (Fifth Fundamental Catalogue), or Hipparcos), to which the observations are referred, are not given explicitly for some sets of data. The incompleteness of this information prevents us from eliminating the systematic effects due to the different fundamental catalogs. The purpose of this paper is to specify clearly the reference catalogs of the observations with such problems in their records by using the JPL DE421 ephemeris. The data for the corresponding planets in the geocentric celestial reference system (GCRS) obtained from DE421 are transformed to apparent places under different hypotheses regarding the reference catalogs. The validity of each hypothesis is then tested with two kinds of statistical quantities that indicate the significance of the difference between the original and transformed data series. As a result, this method proves effective for specifying the reference catalogs, and the missing information is determined unambiguously. Finally, these meridian data are transformed to the GCRS for further applications in the development of planetary ephemerides.

  17. Minimum Information for Reporting Next Generation Sequence Genotyping (MIRING): Guidelines for Reporting HLA and KIR Genotyping via Next Generation Sequencing

    PubMed Central

    Mack, Steven J.; Milius, Robert P.; Gifford, Benjamin D.; Sauter, Jürgen; Hofmann, Jan; Osoegawa, Kazutoyo; Robinson, James; Groeneweg, Mathijs; Turenchalk, Gregory S.; Adai, Alex; Holcomb, Cherie; Rozemuller, Erik H.; Penning, Maarten T.; Heuer, Michael L.; Wang, Chunlin; Salit, Marc L.; Schmidt, Alexander H.; Parham, Peter R.; Müller, Carlheinz; Hague, Tim; Fischer, Gottfried; Fernandez-Viňa, Marcelo; Hollenbach, Jill A; Norman, Paul J.; Maiers, Martin

    2015-01-01

    The development of next-generation sequencing (NGS) technologies for HLA and KIR genotyping is rapidly advancing knowledge of genetic variation of these highly polymorphic loci. NGS genotyping is poised to replace older methods for clinical use, but standard methods for reporting and exchanging these new, high quality genotype data are needed. The Immunogenomic NGS Consortium, a broad collaboration of histocompatibility and immunogenetics clinicians, researchers, instrument manufacturers and software developers, has developed the Minimum Information for Reporting Immunogenomic NGS Genotyping (MIRING) reporting guidelines. MIRING is a checklist that specifies the content of NGS genotyping results as well as a set of messaging guidelines for reporting the results. A MIRING message includes five categories of structured information – message annotation, reference context, full genotype, consensus sequence and novel polymorphism – and references to three categories of accessory information – NGS platform documentation, read processing documentation and primary data. These eight categories of information ensure the long-term portability and broad application of this NGS data for all current histocompatibility and immunogenetics use cases. In addition, MIRING can be extended to allow the reporting of genotype data generated using pre-NGS technologies. Because genotyping results reported using MIRING are easily updated in accordance with reference and nomenclature databases, MIRING represents a bold departure from previous methods of reporting HLA and KIR genotyping results, which have provided static and less-portable data. More information about MIRING can be found online at miring.immunogenomics.org. PMID:26407912

  18. Validation of Modifications to the ANSR(®) Listeria Method for Improved Ease of Use and Performance.

    PubMed

    Caballero, Oscar; Alles, Susan; Le, Quynh-Nhi; Gray, R Lucas; Hosking, Edan; Pinkava, Lisa; Norton, Paul; Tolan, Jerry; Mozola, Mark; Rice, Jennifer; Chen, Yi; Odumeru, Joseph; Ryser, Elliot

    2016-01-01

    A study was conducted to validate minor reagent formulation, enrichment, and procedural changes to the ANSR(®) Listeria method, Performance-Tested Method(SM) 101202. In order to improve ease of use and diminish risk of amplicon contamination, the lyophilized reagent components were reformulated for increased solubility, thus eliminating the need to mix by pipetting. In the alternative procedure, an aliquot of the lysate is added to lyophilized ANSR reagents, immediately capped, and briefly mixed by vortexing. When three foods (hot dogs, Mexican-style cheese, and cantaloupe) and sponge samples taken from a stainless steel surface were tested, significant differences in performance between the ANSR and U.S. Food and Drug Administration Bacteriological Analytical Manual or U.S. Department of Agriculture, Food Safety and Inspection Service Microbiology Laboratory Guidebook reference culture procedures were seen with hot dogs and Mexican-style cheese after 16 h enrichment, with the reference methods producing more positive results. After 24 h enrichment, however, there were no significant differences in method performance for any of the four matrixes tested. Robustness testing was also conducted, with variations to lysis buffer volume, lysis time, and sample volume having no demonstrable effect on assay results. Accelerated stability testing was carried out over a 10-week period and showed no diminishment in assay performance. A second phase of the study examined performance of the ANSR assay following enrichment in a new medium, LESS Plus broth, designed for use with all food and environmental sample types. With the alternative LESS Plus broth, there were no significant differences in performance between the ANSR method and the reference culture procedures for any of the matrixes tested after either 16 or 24 h enrichment, although 24 h enrichment is recommended for hot dogs due to higher sensitivity. Results of inclusivity and exclusivity testing using LESS Plus broth showed that the ANSR assay is highly specific, with 100% expected results for target and nontarget bacteria.

  19. Virtual screening of cocrystal formers for CL-20

    NASA Astrophysics Data System (ADS)

    Zhou, Jun-Hong; Chen, Min-Bo; Chen, Wei-Ming; Shi, Liang-Wei; Zhang, Chao-Yang; Li, Hong-Zhen

    2014-08-01

    Based on the structural characteristics of 2,4,6,8,10,12-hexanitrohexaazaisowurtzitane (CL-20) and the kinetic mechanism of cocrystal formation, a method for virtually screening CL-20 cocrystal formers by the criterion of the strongest intermolecular site pairing energy (ISPE) is proposed. In this method, the strongest ISPE is assumed to govern the first step of cocrystal formation. Predictions for four sets of common drug-molecule cocrystals made with this criterion were compared with those of the total-ISPE method of Musumeci et al. (2011) and with experimental results. The method was then applied to screen CL-20 cocrystal formers, and the predictions were again compared with experiment.
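    The ranking step implied by this criterion can be sketched as follows; the candidate names and energies below are purely hypothetical placeholders, since the actual ISPEs come from the authors' intermolecular interaction calculations.

      # Hypothetical coformer candidates with lists of intermolecular site-pairing
      # energies (kJ/mol); in practice these come from molecular modelling.
      site_pair_energies = {
          "coformer_A": [-42.1, -35.7, -28.9],
          "coformer_B": [-55.3, -30.2, -26.4],
          "coformer_C": [-38.8, -37.5, -33.0],
      }

      def strongest_ispe(energies):
          # The most negative value corresponds to the strongest site pairing.
          return min(energies)

      # Rank candidates so the strongest single interaction comes first.
      ranking = sorted(site_pair_energies, key=lambda c: strongest_ispe(site_pair_energies[c]))
      for coformer in ranking:
          print(coformer, strongest_ispe(site_pair_energies[coformer]))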

  20. A symmetrical subtraction combined with interpolated values for eliminating scattering from fluorescence EEM data.

    PubMed

    Xu, Jing; Liu, Xiaofei; Wang, Yutian

    2016-08-05

    Parallel factor analysis is a widely used method for extracting qualitative and quantitative information about an analyte of interest from fluorescence excitation-emission matrices containing unknown components. Large-amplitude scattering, however, distorts the results of parallel factor analysis, and the many methods proposed for eliminating scattering each have their advantages and disadvantages. Here, the combination of symmetrical subtraction and interpolated values is discussed, where the combination refers both to the combination of results and to the combination of methods. Nine methods were used for comparison. The results show that the combination of results gives a better concentration prediction for all the components. Copyright © 2016 Elsevier B.V. All rights reserved.
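    One common way to implement the interpolation part of such scatter handling is to mask the band where emission is close to excitation and interpolate across it along the emission axis. The sketch below assumes a simple first-order Rayleigh band of fixed width; the paper's actual treatment, including the symmetrical subtraction step, may differ.

      import numpy as np

      def remove_rayleigh_scatter(eem, ex_wavelengths, em_wavelengths, width=15.0):
          # Replace the first-order Rayleigh band (emission ~ excitation) with values
          # interpolated along the emission axis, one excitation row at a time.
          cleaned = eem.astype(float).copy()
          for i, ex in enumerate(ex_wavelengths):
              mask = np.abs(em_wavelengths - ex) <= width
              if mask.any() and (~mask).sum() >= 2:
                  cleaned[i, mask] = np.interp(
                      em_wavelengths[mask], em_wavelengths[~mask], cleaned[i, ~mask]
                  )
          return cleaned

      # Synthetic example: a random EEM on a typical wavelength grid.
      ex = np.arange(250.0, 451.0, 5.0)     # excitation wavelengths (nm)
      em = np.arange(280.0, 601.0, 2.0)     # emission wavelengths (nm)
      eem = np.random.rand(ex.size, em.size)
      print(remove_rayleigh_scatter(eem, ex, em).shape)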

  1. Evaluation of the Water Film Weber Number in Glaze Icing Scaling

    NASA Technical Reports Server (NTRS)

    Tsao, Jen-Ching; Kreeger, Richard E.; Feo, Alejandro

    2010-01-01

    Icing scaling tests were performed in the NASA Glenn Icing Research Tunnel to evaluate a new scaling method, developed and proposed by Feo for glaze icing, in which the scale liquid water content and velocity were found by matching reference and scale values of the nondimensional water-film thickness expression and the film Weber number. For comparison purposes, tests were also conducted using the constant We(sub L) method for velocity scaling. The reference tests used a full-span, fiberglass, 91.4-cm-chord NACA 0012 model with velocities of 76 and 100 knots and MVD sizes of 150 and 195 microns. The scale-to-reference model size ratio was 1:2.6. All tests were made at 0 deg AOA. Results will be presented for stagnation point freezing fractions of 0.3 and 0.5.
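    For orientation, the Weber number in its general form is the ratio of inertial to surface-tension forces; the specific length scale and fluid properties used for We(sub L) and the film Weber number follow the cited scaling papers, so the values below are purely illustrative.

      def weber_number(rho, velocity, length, sigma):
          # General Weber number: inertial forces over surface-tension forces.
          return rho * velocity**2 * length / sigma

      # Illustrative SI values only: water density, a 50 m/s velocity,
      # a 150-micron droplet MVD as the length scale, and water surface tension.
      We = weber_number(rho=1000.0, velocity=50.0, length=150e-6, sigma=0.072)
      print(f"We = {We:.0f}")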

  2. Analysis of polycyclic aromatic hydrocarbons in sediment reference materials by microwave-assisted extraction.

    PubMed

    Shu, Y Y; Lao, R C; Chiu, C H; Turle, R

    2000-12-01

    Microwave-assisted extraction (MAE) of polycyclic aromatic hydrocarbons (PAHs) was conducted on harbor sediment reference material EC-1, marine sediment reference material HS-2 and PAH-spiked river bed soil. Extractions of EC-1 were carried out at 70 degrees C and 100 degrees C under pressure in closed vessels with cyclohexane-acetone (1:1), cyclohexane-water (3:1), hexane-acetone (1:1), and hexane-water (3:1) for 10 min. A comparison between MAE and a 16-h Soxhlet extraction (SX) method showed that both techniques gave results comparable with certified values. MAE has advantages over the currently used Soxhlet technique owing to its faster extraction time and the lower quantity of solvent used; the microwave method consumed less than one-tenth of the organic solvent required by Soxhlet extraction.

  3. HUGO: Hierarchical mUlti-reference Genome cOmpression for aligned reads

    PubMed Central

    Li, Pinghao; Jiang, Xiaoqian; Wang, Shuang; Kim, Jihoon; Xiong, Hongkai; Ohno-Machado, Lucila

    2014-01-01

    Background and objective Short-read sequencing is becoming the standard of practice for the study of structural variants associated with disease. However, with the growth of sequence data largely surpassing reasonable storage capability, the biomedical community is challenged with the management, transfer, archiving, and storage of sequence data. Methods We developed Hierarchical mUlti-reference Genome cOmpression (HUGO), a novel compression algorithm for aligned reads in the sorted Sequence Alignment/Map (SAM) format. We first aligned short reads against a reference genome and stored exactly mapped reads for compression. Inexactly mapped or unmapped reads were realigned against different reference genomes using an adaptive scheme that gradually shortens the read length. For base quality values, we offer lossy and lossless compression mechanisms. The lossy compression mechanism for the base quality values uses k-means clustering, where a user can adjust the balance between decompression quality and compression rate. Lossless compression can be produced by setting k (the number of clusters) to the number of different quality values. Results The proposed method produced a compression ratio in the range 0.5–0.65, which corresponds to 35–50% storage savings based on experimental datasets. The proposed approach achieved 15% more storage savings than CRAM and a compression ratio comparable to that of Samcomp (CRAM and Samcomp are two of the state-of-the-art genome compression algorithms). The software is freely available at https://sourceforge.net/projects/hierachicaldnac/ under the General Public License (GPL). Limitation Our method requires multiple reference genomes and prolongs execution time because of the additional alignments. Conclusions The proposed multi-reference-based compression algorithm for aligned reads outperforms existing single-reference based algorithms. PMID:24368726
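    As a rough illustration of the lossy quality-value scheme described above, the sketch below clusters Phred scores with k-means and replaces each score by its rounded cluster centroid; it is a minimal stand-in for the idea, not the HUGO implementation, and the synthetic scores are placeholders.

      import numpy as np
      from sklearn.cluster import KMeans

      def quantize_qualities(qualities, k=8, seed=0):
          # Cluster the quality scores and substitute each score with its centroid;
          # setting k to the number of distinct values makes the step lossless.
          q = np.asarray(qualities, dtype=float).reshape(-1, 1)
          km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(q)
          centroids = km.cluster_centers_.ravel()
          return np.rint(centroids[km.labels_]).astype(int)

      quals = np.random.randint(2, 41, size=10_000)   # synthetic Phred scores
      print(quantize_qualities(quals, k=4)[:20])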

  4. Rapid detection of multidrug-resistant Mycobacterium tuberculosis using the malachite green decolourisation assay

    PubMed Central

    Coban, Ahmet Yilmaz; Uzun, Meltem

    2013-01-01

    Early detection of drug resistance in Mycobacterium tuberculosis isolates allows for earlier and more effective treatment of patients. The aim of this study was to investigate the performance of the malachite green decolourisation assay (MGDA) in detecting isoniazid (INH) and rifampicin (RIF) resistance in M. tuberculosis clinical isolates. Fifty M. tuberculosis isolates, including 19 multidrug-resistant, eight INH-resistant and 23 INH and RIF-susceptible samples, were tested. The sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV) and agreement of the assay for INH were 92.5%, 91.3%, 92.5%, 91.3% and 92%, respectively. Similarly, the sensitivity, specificity, PPV, NPV and agreement of the assay for RIF were 94.7%, 100%, 100%, 96.8% and 98%, respectively. There was a major discrepancy in the tests of two isolates, as they were sensitive to INH by the MGDA test, but resistant by the reference method. There was a minor discrepancy in the tests of two additional isolates, as they were sensitive to INH by the reference method, but resistant by the MGDA test. The drug susceptibility test results were obtained within eight to nine days. In conclusion, the MGDA test is a reliable and accurate method for the rapid detection of INH and RIF resistance compared with the reference method, and it additionally requires less time to obtain results. PMID:24402143
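    The performance figures quoted above follow directly from a 2x2 comparison of assay calls against the reference method; a minimal calculation of those metrics looks like this (the counts in the example are illustrative, not the study's).

      def diagnostic_metrics(tp, fp, fn, tn):
          # Sensitivity, specificity, PPV, NPV and overall agreement from a 2x2 table
          # of test calls compared against the reference method.
          total = tp + fp + fn + tn
          return {
              "sensitivity": tp / (tp + fn),
              "specificity": tn / (tn + fp),
              "ppv": tp / (tp + fp),
              "npv": tn / (tn + fn),
              "agreement": (tp + tn) / total,
          }

      # Illustrative counts only, not taken from the study.
      print(diagnostic_metrics(tp=25, fp=2, fn=2, tn=21))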

  5. Verification of Abbott 25-OH-vitamin D assay on the architect system.

    PubMed

    Hutchinson, Katrina; Healy, Martin; Crowley, Vivion; Louw, Michael; Rochev, Yury

    2017-04-01

    Analytical and clinical verification of both old and new generations of the Abbott total 25-hydroxyvitamin D (25OHD) assays, and an examination of reference intervals. Determination of between-run precision, and Deming comparison between patient sample results for 25OHD on the Abbott Architect, DiaSorin Liaison and AB SCIEX API 4000 (LC-MS/MS). Establishment of uncertainty of measurement for 25OHD Architect methods using old and new generations of the reagents, and estimation of the reference interval in a healthy Irish population. For between-run precision, the manufacturer claims coefficients of variation (CVs) of 2.8% and 4.6% for their high and low controls, respectively. Our instrument showed CVs between 4% and 6.2% for all levels of the controls on both generations of the Abbott reagents. The between-run uncertainties were 0.28 and 0.36, with expanded uncertainties 0.87 and 0.98 for the old and the new generations of reagent, respectively. The difference between all methods used for patients' samples was within total allowable error, and the instruments produced clinically equivalent results. The results covered the medical decision points of 30, 40, 50 and 125 nmol/L. The reference interval for total 25OHD in our healthy Irish subjects was lower than recommended levels (24-111 nmol/L). In a clinical laboratory, Abbott 25OHD immunoassays are a useful, rapid and accurate method for measuring total 25OHD. The new generation of the assay was confirmed to be reliable, accurate, and a good indicator for 25OHD measurement. More study is needed to establish reference intervals that correctly represent the healthy population in Ireland.
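    Deming regression, used here for the method comparison, has a closed-form solution once the ratio of the two methods' error variances is fixed. The sketch below uses the textbook formula with synthetic data; it is not the vendor or study software, and an error-variance ratio of 1 (orthogonal regression) is assumed.

      import numpy as np

      def deming_fit(x, y, lam=1.0):
          # Deming regression slope and intercept; lam is the ratio of the y- to
          # x-error variances (1.0 gives orthogonal regression).
          x, y = np.asarray(x, float), np.asarray(y, float)
          xm, ym = x.mean(), y.mean()
          sxx = np.sum((x - xm) ** 2)
          syy = np.sum((y - ym) ** 2)
          sxy = np.sum((x - xm) * (y - ym))
          slope = (syy - lam * sxx + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
          return slope, ym - slope * xm

      # Synthetic 25OHD comparison data (nmol/L), for illustration only.
      rng = np.random.default_rng(1)
      ref = rng.uniform(20, 150, 40)
      test = 0.95 * ref + 3 + rng.normal(0, 4, 40)
      print(deming_fit(ref, test))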

  6. Additional Results of Glaze Icing Scaling in SLD Conditions

    NASA Technical Reports Server (NTRS)

    Tsao, Jen-Ching

    2016-01-01

    New guidance on acceptable means of compliance with super-cooled large drop (SLD) conditions was issued by the U.S. Department of Transportation's Federal Aviation Administration (FAA) in its Advisory Circular AC 25-28 in November 2014. Part 25, Appendix O was developed to define a representative icing environment for super-cooled large drops. Super-cooled large drops, which include freezing drizzle and freezing rain conditions, are not included in Appendix C. This paper reports results from recent glaze icing scaling tests conducted in the NASA Glenn Icing Research Tunnel (IRT) to evaluate how well the scaling methods recommended for Appendix C conditions might apply to SLD conditions. The models were straight NACA 0012 wing sections. The reference model had a chord of 72 inches and the scale model had a chord of 21 inches. Reference tests were run with airspeeds of 100 and 130.3 knots and with MVDs of 85 and 170 microns. Two scaling methods were considered. One was based on the modified Ruff method with scale velocity found by matching the Weber number We(sub L). The other was proposed and developed by Feo specifically for strong glaze icing conditions, in which the scale liquid water content and velocity were found by matching reference and scale values of the non-dimensional water-film thickness expression and the film Weber number We(sub f). All tests were conducted at 0 degrees angle of attack (AOA). Results will be presented for stagnation freezing fractions of 0.2 and 0.3. For non-dimensional reference and scale ice shape comparison, a new post-scanning ice shape digitization procedure was developed for extracting 2-dimensional ice shape profiles at any selected span-wise location from the high-fidelity 3-dimensional scanned ice shapes obtained in the IRT.

  8. 40 CFR 53.14 - Modification of a reference or equivalent method.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 5 2010-07-01 2010-07-01 false Modification of a reference or equivalent method. 53.14 Section 53.14 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) AMBIENT AIR MONITORING REFERENCE AND EQUIVALENT METHODS General Provisions...

  9. 40 CFR 53.8 - Designation of reference and equivalent methods.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 5 2010-07-01 2010-07-01 false Designation of reference and equivalent methods. 53.8 Section 53.8 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) AMBIENT AIR MONITORING REFERENCE AND EQUIVALENT METHODS General Provisions § 53.8...

  10. Thermal-noise-limited higher-order mode locking of a reference cavity

    NASA Astrophysics Data System (ADS)

    Zeng, X. Y.; Ye, Y. X.; Shi, X. H.; Wang, Z. Y.; Deng, K.; Zhang, J.; Lu, Z. H.

    2018-04-01

    Higher-order mode locking has been proposed to reduce the thermal noise limit of reference cavities. By locking a laser to the HG02 mode of a 10-cm-long all-ULE cavity and measuring its performance with the three-cornered-hat method among three independently stabilized lasers, we demonstrate thermal-noise-limited performance with a fractional frequency instability of 4.9E-16. The results match the theoretical models for higher-order optical modes. The achieved laser instability improves on previous all-ULE short-cavity results, reaching a new low level.

  11. Comparison between B·R·A·H·M·S PCT direct, a new sensitive point-of-care testing device for rapid quantification of procalcitonin in emergency department patients and established reference methods - a prospective multinational trial.

    PubMed

    Kutz, Alexander; Hausfater, Pierre; Oppert, Michael; Alan, Murat; Grolimund, Eva; Gast, Claire; Alonso, Christine; Wissmann, Christoph; Kuehn, Christian; Bernard, Maguy; Huber, Andreas; Mueller, Beat; Schuetz, Philipp

    2016-04-01

    Procalcitonin (PCT) is increasingly being used for the diagnostic and prognostic work-up of patients with suspected infections in the emergency department (ED). Recently, B·R·A·H·M·S PCT direct, the first highly sensitive point-of-care test (POCT), has been developed for fast PCT measurement on capillary or venous blood samples. This is a prospective, international comparison study conducted in three European EDs. Consecutive patients with suspicion of bacterial infection were included. Duplicate determination of PCT was performed in capillary (fingertip) and venous whole blood (EDTA), and compared to the reference method. The diagnostic accuracy was evaluated by correlation and concordance analyses. Three hundred and three patients were included over a 6-month period (60.4% male, median age 65.2 years). The correlation between capillary or venous whole blood and the reference method was excellent: r² = 0.96 and 0.97, sensitivity 88.1% and 93.0%, specificity 96.5% and 96.8%, concordance 93% and 95%, respectively, at a 0.25 μg/L threshold. No significant bias was observed (-0.04 and -0.02 for capillary and venous whole blood) although there were 6.8% and 5.1% outliers, respectively. B·R·A·H·M·S PCT direct had a shorter time to result as compared to the reference method (25 vs. 144 min, difference 119 min, 95% CI 110-134 min, p<0.0001). This study found a high diagnostic accuracy and a faster time to result of B·R·A·H·M·S PCT direct in the ED setting, allowing a shorter time to therapy and more widespread use of PCT.

  12. First Definition of Reference Intervals of Liver Function Tests in China: A Large-Population-Based Multi-Center Study about Healthy Adults

    PubMed Central

    Zhang, Chuanbao; Guo, Wei; Huang, Hengjian; Ma, Yueyun; Zhuang, Junhua; Zhang, Jie

    2013-01-01

    Background Reference intervals for liver function tests are very important for the screening, diagnosis, treatment, and monitoring of liver diseases. We aimed to establish common reference intervals of liver function tests specifically for the Chinese adult population. Methods A total of 3210 individuals (20–79 years) were enrolled in six representative geographical regions in China. ALT, AST, GGT, ALP, total protein, albumin and total bilirubin were measured using the three analytical systems mainly used in China. The newly established reference intervals were based on the results of traceability or multiple systems, and were then validated in 21 large hospitals located nationwide and qualified by the National External Quality Assessment (EQA) scheme of China. Results We established reference intervals for the seven liver function tests in the Chinese adult population and found apparent differences in reference values for the partitioning variables of gender (ALT, GGT, total bilirubin), age (ALP, albumin) and region (total protein). More than 86% of the 21 laboratories passed the validation in every subgroup of reference intervals, and overall about 95.3% to 98.8% of the 1220 validation results fell within the range of the new reference intervals for all liver function tests. In comparison with the currently recommended reference intervals in China, the single-sided observed proportions of reference values falling outside the intervals deviated significantly from the nominal 2.5% for most of the tests, for example total bilirubin (15.2%), ALP (0.2%) and albumin (0.0%). Most reference intervals in our study also differed appreciably from those reported for other populations. Conclusion The currently recommended reference intervals are no longer applicable to the Chinese population. We have established common reference intervals of liver function tests that are defined specifically for the Chinese population and can be used universally among EQA-approved laboratories located all over China. PMID:24058449

  13. Establishment of reference costs for occupational health services and implementation of cost management in Japanese manufacturing companies

    PubMed Central

    Nagata, Tomohisa; Mori, Koji; Aratake, Yutaka; Ide, Hiroshi; Nobori, Junichiro; Kojima, Reiko; Odagami, Kiminori; Kato, Anna; Hiraoka, Mika; Shiota, Naoki; Kobayashi, Yuichi; Ito, Masato; Tsutsumi, Akizumi; Matsuda, Shinya

    2016-01-01

    Objectives: We developed a standardized cost estimation method for occupational health (OH) services. The purpose of this study was to set reference OH services costs and to conduct OH services cost management assessments in two workplaces by comparing actual OH services costs with the reference costs. Methods: Data were obtained from retrospective analyses of OH services costs regarding 15 OH activities over a 1-year period in three manufacturing workplaces. We set the reference OH services costs in one of the three locations and compared OH services costs of each of the two other workplaces with the reference costs. Results: The total reference OH services cost was 176,654 Japanese yen (JPY) per employee. The personnel cost for OH staff to conduct OH services was JPY 47,993, and the personnel cost for non-OH staff was JPY 38,699. The personnel cost for receipt of OH services (opportunity cost) was JPY 19,747, expense was JPY 25,512, depreciation expense was JPY 34,849, and outsourcing cost was JPY 9,854. We compared actual OH services costs from two workplaces (the total OH services costs were JPY 182,151 and JPY 238,023) with the reference costs according to OH activity. The actual costs were different from the reference costs, especially in the case of personnel cost for non-OH staff, expense, and depreciation expense. Conclusions: Our cost estimation tool makes it straightforward to compare actual OH services cost data with reference cost data. The outcomes help employers make informed decisions regarding investment in OH services. PMID:27170449

  14. A structural SVM approach for reference parsing.

    PubMed

    Zhang, Xiaoli; Zou, Jie; Le, Daniel X; Thoma, George R

    2011-06-09

    Automated extraction of bibliographic data, such as article titles, author names, abstracts, and references, is essential to the affordable creation of large citation databases. References, typically appearing at the end of journal articles, can also provide valuable information for extracting other bibliographic data. Therefore, parsing individual references to extract author, title, journal, year, etc. is sometimes a necessary preprocessing step in building citation-indexing systems. The regular structure in references enables us to consider reference parsing a sequence learning problem and to apply the structural Support Vector Machine (structural SVM), a newly developed structured learning algorithm, to parsing references. In this study, we implemented structural SVM and used two types of contextual features to compare structural SVM with conventional SVM. Both methods achieve above 98% token classification accuracy and above 95% overall chunk-level accuracy for reference parsing. We also compared SVM and structural SVM to Conditional Random Field (CRF). The experimental results show that structural SVM and CRF achieve similar accuracies at token- and chunk-levels. When only basic observation features are used for each token, structural SVM achieves higher performance compared to SVM since it utilizes the contextual label features. However, when the contextual observation features from neighboring tokens are combined, SVM performance improves greatly, and is close to that of structural SVM after adding the second order contextual observation features. The comparison of these two methods with CRF using the same set of binary features shows that both structural SVM and CRF perform better than SVM, indicating their stronger sequence learning ability in reference parsing.
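    Token- and chunk-level accuracies of the kind reported above can be computed from predicted field labels; the sketch below uses one plausible definition of a chunk (a maximal run of tokens sharing a label) and entirely hypothetical label sequences, so it illustrates the scoring idea rather than the paper's exact evaluation.

      def token_accuracy(gold, pred):
          # Fraction of tokens whose predicted field label matches the gold label.
          return sum(g == p for g, p in zip(gold, pred)) / len(gold)

      def chunks(labels):
          # Maximal runs of identical labels -> (label, start, end) spans.
          spans, start = [], 0
          for i in range(1, len(labels) + 1):
              if i == len(labels) or labels[i] != labels[start]:
                  spans.append((labels[start], start, i))
                  start = i
          return spans

      def chunk_accuracy(gold, pred):
          # Fraction of gold chunks reproduced exactly in the prediction.
          gold_spans = chunks(gold)
          pred_spans = set(chunks(pred))
          return sum(s in pred_spans for s in gold_spans) / len(gold_spans)

      gold = ["AUTHOR", "AUTHOR", "TITLE", "TITLE", "JOURNAL", "YEAR"]
      pred = ["AUTHOR", "AUTHOR", "TITLE", "JOURNAL", "JOURNAL", "YEAR"]
      print(token_accuracy(gold, pred), chunk_accuracy(gold, pred))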

  15. Development of an evidence-based approach to external quality assurance for breast cancer hormone receptor immunohistochemistry: comparison of reference values.

    PubMed

    Makretsov, Nikita; Gilks, C Blake; Alaghehbandan, Reza; Garratt, John; Quenneville, Louise; Mercer, Joel; Palavdzic, Dragana; Torlakovic, Emina E

    2011-07-01

    External quality assurance and proficiency testing programs for breast cancer predictive biomarkers are based largely on traditional ad hoc design; at present there is no universal consensus on definition of a standard reference value for samples used in external quality assurance programs. To explore reference values for estrogen receptor and progesterone receptor immunohistochemistry in order to develop an evidence-based analytic platform for external quality assurance. There were 31 participating laboratories, 4 of which were previously designated as "expert" laboratories. Each participant tested a tissue microarray slide with 44 breast carcinomas for estrogen receptor and progesterone receptor and submitted it to the Canadian Immunohistochemistry Quality Control Program for analysis. Nuclear staining in 1% or more of the tumor cells was a positive score. Five methods for determining reference values were compared. All reference values showed 100% agreement for estrogen receptor and progesterone receptor scores, when indeterminate results were excluded. Individual laboratory performance (agreement rates, test sensitivity, test specificity, positive predictive value, negative predictive value, and κ value) was very similar for all reference values. Identification of suboptimal performance by all methods was identical for 30 of 31 laboratories. Estrogen receptor assessment of 1 laboratory was discordant: agreement was less than 90% for 3 of 5 reference values and greater than 90% with the use of 2 other reference values. Various reference values provide equivalent laboratory rating. In addition to descriptive feedback, our approach allows calculation of technical test sensitivity and specificity, positive and negative predictive values, agreement rates, and κ values to guide corrective actions.
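    Among the laboratory-performance statistics listed above, the κ value can be computed from a 2x2 table of a laboratory's calls against the chosen reference value; the counts in the example below are hypothetical.

      def cohens_kappa(tp, fp, fn, tn):
          # Chance-corrected agreement between a laboratory and the reference value
          # for binary positive/negative hormone-receptor scores.
          n = tp + fp + fn + tn
          po = (tp + tn) / n                                            # observed agreement
          pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2   # chance agreement
          return (po - pe) / (1 - pe)

      # Hypothetical counts for one laboratory scored against the reference value.
      print(round(cohens_kappa(tp=30, fp=1, fn=2, tn=11), 3))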

  16. Potential contributions of asphalt and coal tar to black carbon quantification in urban dust, soils, and sediments

    USGS Publications Warehouse

    Yang, Y.; Mahler, B.J.; Van Metre, P.C.; Ligouis, B.; Werth, C.J.

    2010-01-01

    Measurements of black carbon (BC) using either chemical or thermal oxidation methods are generally thought to indicate the amount of char and/or soot present in a sample. In urban environments, however, asphalt and coal-tar particles worn from pavement are ubiquitous and, because of their pyrogenic origin, could contribute to measurements of BC. Here we explored the effect of the presence of asphalt and coal-tar particles on the quantification of BC in a range of urban environmental sample types, and evaluated biases in the different methods used for quantifying BC. Samples evaluated were pavement dust, residential and commercial area soils, lake sediments from a small urban watershed, and reference materials of asphalt and coal tar. Total BC was quantified using chemical treatment through acid dichromate (Cr2O7) oxidation and chemo-thermal oxidation at 375 °C (CTO-375). BC species, including soot and char/charcoal, asphalt, and coal tar, were quantified with organic petrographic analysis. Comparison of results by the two oxidation methods and organic petrography indicates that both coal tar and asphalt contribute to BC quantified by Cr2O7 oxidation, and that coal tar contributes to BC quantified by CTO-375. These results are supported by treatment of asphalt and coal-tar reference samples with Cr2O7 oxidation and CTO-375. The reference asphalt is resistant to Cr2O7 oxidation but not to CTO-375, and the reference coal tar is resistant to both Cr2O7 oxidation and CTO-375. These results indicate that coal tar and/or asphalt can contribute to BC measurements in samples from urban areas using Cr2O7 oxidation or CTO-375, and caution is advised when interpreting BC measurements made with these methods. © 2010 Elsevier Ltd.

  17. Evaluation of purity with its uncertainty value in high purity lead stick by conventional and electro-gravimetric methods.

    PubMed

    Singh, Nahar; Singh, Niranjan; Tripathy, S Swarupa; Soni, Daya; Singh, Khem; Gupta, Prabhat K

    2013-06-26

    A conventional gravimetry and electro-gravimetry study was carried out for the precise and accurate determination of the purity of lead (Pb) in a high-purity lead stick and for the preparation of a reference standard. Reference materials are standards containing a known amount of an analyte and provide a reference value to determine unknown concentrations or to calibrate analytical instruments. A stock solution of approximately 2 kg was prepared by dissolving approximately 2 g of Pb stick in 5% ultra-pure nitric acid. From the stock solution, five replicates of approximately 50 g were taken for determination of purity by each method. The Pb was determined as PbSO4 by conventional gravimetry and as PbO2 by electro-gravimetry, and the percentage purity of metallic Pb was calculated accordingly from PbSO4 and PbO2. On the basis of the experimental observations, the purity of Pb by conventional gravimetry and electro-gravimetry was found to be 99.98 ± 0.24 and 99.97 ± 0.27 g/100 g, respectively, and on the basis of the Pb purity the concentrations of the reference standard solutions were found to be 1000.88 ± 2.44 and 1000.81 ± 2.68 mg kg-1, respectively, at the 95% confidence level (k = 2). The uncertainty evaluation of the Pb determination was carried out following EURACHEM/GUM guidelines; the final analytical results, with their quantified uncertainties, fulfil this requirement and give a measure of the confidence that can be placed in the laboratory's values. Gravimetry is the most reliable technique in comparison with titrimetry and instrumental methods, and gravimetric results are directly traceable to SI units. Gravimetric analysis, if methods are followed carefully, provides exceedingly precise results. In classical gravimetry the major uncertainties are due to repeatability, but in electro-gravimetry several other factors also affect the final results.
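    The conversion from weighed precipitate to Pb purity uses the usual gravimetric factors (the ratio of molar masses); the sketch below shows that arithmetic with invented precipitate and sample masses, not the study's actual weighings.

      # Gravimetric factors convert the weighed precipitate mass to Pb mass.
      M_PB, M_PBSO4, M_PBO2 = 207.2, 303.26, 239.20   # molar masses, g/mol

      def purity_from_precipitate(m_precipitate_g, m_sample_g, factor):
          # Purity in g/100 g: Pb recovered in the precipitate relative to sample mass.
          return 100.0 * m_precipitate_g * factor / m_sample_g

      # Hypothetical masses for a ~0.05 g Pb aliquot.
      print(purity_from_precipitate(0.0732, 0.0500, M_PB / M_PBSO4))   # PbSO4 route
      print(purity_from_precipitate(0.0577, 0.0500, M_PB / M_PBO2))    # PbO2 route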

  18. Establishment of reference intervals of clinical chemistry analytes for the adult population in Saudi Arabia: a study conducted as a part of the IFCC global study on reference values.

    PubMed

    Borai, Anwar; Ichihara, Kiyoshi; Al Masaud, Abdulaziz; Tamimi, Waleed; Bahijri, Suhad; Armbuster, David; Bawazeer, Ali; Nawajha, Mustafa; Otaibi, Nawaf; Khalil, Haitham; Kawano, Reo; Kaddam, Ibrahim; Abdelaal, Mohamed

    2016-05-01

    This study is part of the IFCC global study to derive reference intervals (RIs) for 28 chemistry analytes in Saudis. Healthy individuals (n=826) aged ≥18 years were recruited using the global study protocol. All specimens were measured using an Architect analyzer. RIs were derived by both parametric and non-parametric methods for comparative purposes. The need for secondary exclusion of reference values based on the latent abnormal values exclusion (LAVE) method was examined. The magnitude of variation attributable to gender, age and region was calculated as the standard deviation ratio (SDR). Sources of variation (age, BMI, physical exercise and smoking level) were investigated using multiple regression analysis. SDRs for gender, age and regional differences were significant for 14, 8 and 2 analytes, respectively. BMI-related changes in test results were most conspicuous for CRP. For some metabolism-related parameters the RIs derived by the non-parametric method were wider than those derived by the parametric method, and RIs derived using the LAVE method differed significantly from those derived without it. RIs were derived with and without gender partitioning (BMI, drugs and supplements were considered). RIs applicable to Saudis were established for the majority of chemistry analytes, whereas gender, regional and age partitioning was required for some analytes. The elevated upper limits of the metabolic analytes reflect the high prevalence of metabolic syndrome in the Saudi population.

  19. Establishment of a biological reference preparation for hepatitis A vaccine (inactivated, non-adsorbed).

    PubMed

    Stalder, J; Costanzo, A; Daas, A; Rautmann, G; Buchheit, K-H

    2010-04-01

    A reference standard calibrated in International Units (IU) is needed for the in vitro potency assay of hepatitis A vaccines prepared by formalin-inactivation of purified hepatitis A virus grown in cell cultures. Thus, a project was launched by the European Directorate for the Quality of Medicines & HealthCare (EDQM) to establish one or more non-adsorbed inactivated hepatitis A vaccine reference preparation(s) as working standard(s), calibrated against the 1st International Standard (IS), for the in vitro potency assay (ELISA) of all vaccines present on the European market. Four non-adsorbed liquid preparations of formalin-inactivated hepatitis A antigen with a known antigen content were obtained from 3 manufacturers as candidate Biological Reference Preparations (BRPs). Thirteen laboratories participated in the collaborative study. They were asked to use an in vitro ELISA method adapted from a commercially available kit for the detection of antibodies to hepatitis A virus. In-house validated assays were to be run in parallel, where available. Some participants also included commercially available hepatitis A vaccines in the assays, after appropriate desorption. During the collaborative study, several participants using the standard method were faced with problems with some of the most recent lots of the test kits. Due to these problems, the standard method did not perform satisfactorily and a high number of assays were invalid, whereas the in-house methods appeared to perform better. Despite this, the overall mean results of the valid assays using both methods were in agreement. Nonetheless, it was decided to base the assignment of the potency values on the in-house methods only. The results showed that all candidate BRPs were suitable for the intended purpose. However, based on availability of the material and on the results of end-product testing, 2 candidate reference preparations, Samples C and D, were selected. Both were from the same batch but filled on different days; no statistically significant difference in potency was observed. They were thus combined in 1 single batch. The candidate preparation (Sample C/D) was adopted at the June 2009 session of the European Pharmacopoeia (Ph. Eur.) Commission as the Ph. Eur. BRP batch 1 for hepatitis A vaccine (inactivated, non-adsorbed), with an assigned potency of 12 IU/ml for in vitro antigen content assays. Accelerated degradation studies have been initiated. The preliminary data show that the BRP is stable at the recommended storage temperature (< -50 degrees C). The BRP will be monitored at regular intervals throughout its lifetime.

  20. SSVEP recognition using common feature analysis in brain-computer interface.

    PubMed

    Zhang, Yu; Zhou, Guoxu; Jin, Jing; Wang, Xingyu; Cichocki, Andrzej

    2015-04-15

    Canonical correlation analysis (CCA) has been successfully applied to steady-state visual evoked potential (SSVEP) recognition for brain-computer interface (BCI) application. Although the CCA method outperforms the traditional power spectral density analysis through multi-channel detection, it requires additionally pre-constructed reference signals of sine-cosine waves. It is likely to encounter overfitting in using a short time window since the reference signals include no features from training data. We consider that a group of electroencephalogram (EEG) data trials recorded at a certain stimulus frequency on a same subject should share some common features that may bear the real SSVEP characteristics. This study therefore proposes a common feature analysis (CFA)-based method to exploit the latent common features as natural reference signals in using correlation analysis for SSVEP recognition. Good performance of the CFA method for SSVEP recognition is validated with EEG data recorded from ten healthy subjects, in contrast to CCA and a multiway extension of CCA (MCCA). Experimental results indicate that the CFA method significantly outperformed the CCA and the MCCA methods for SSVEP recognition in using a short time window (i.e., less than 1s). The superiority of the proposed CFA method suggests it is promising for the development of a real-time SSVEP-based BCI. Copyright © 2014 Elsevier B.V. All rights reserved.
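    The CCA baseline that the proposed CFA method is compared against correlates multi-channel EEG with sine-cosine reference signals at each candidate stimulus frequency and picks the frequency with the largest canonical correlation. The sketch below implements that baseline (not CFA itself) with scikit-learn on synthetic data; the channel count, sampling rate and candidate frequencies are arbitrary choices.

      import numpy as np
      from sklearn.cross_decomposition import CCA

      def cca_correlation(eeg, freq, fs, n_harmonics=2):
          # Canonical correlation between EEG (samples x channels) and sine-cosine
          # reference signals built at the candidate stimulus frequency.
          t = np.arange(eeg.shape[0]) / fs
          ref = np.column_stack(
              [f(2 * np.pi * (h + 1) * freq * t) for h in range(n_harmonics) for f in (np.sin, np.cos)]
          )
          u, v = CCA(n_components=1).fit_transform(eeg, ref)
          return np.corrcoef(u[:, 0], v[:, 0])[0, 1]

      # Synthetic 1-s window of 8-channel EEG at 250 Hz with an embedded 10 Hz SSVEP.
      rng = np.random.default_rng(0)
      fs, n = 250, 250
      eeg = rng.normal(size=(n, 8))
      eeg[:, 0] += np.sin(2 * np.pi * 10 * np.arange(n) / fs)
      scores = {f: cca_correlation(eeg, f, fs) for f in (8, 10, 12, 15)}
      print(max(scores, key=scores.get), scores)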

  1. Reference-free, high-resolution measurement method of timing jitter spectra of optical frequency combs

    PubMed Central

    Kwon, Dohyeon; Jeon, Chan-Gi; Shin, Junho; Heo, Myoung-Sun; Park, Sang Eon; Song, Youjian; Kim, Jungwon

    2017-01-01

    Timing jitter is one of the most important properties of femtosecond mode-locked lasers and optical frequency combs. Accurate measurement of timing jitter power spectral density (PSD) is a critical prerequisite for optimizing overall noise performance and further advancing comb applications both in the time and frequency domains. Commonly used jitter measurement methods require a reference mode-locked laser with timing jitter similar to or lower than that of the laser-under-test, which is a demanding requirement for many laser laboratories, and/or have limited measurement resolution. Here we show a high-resolution and reference-source-free measurement method of timing jitter spectra of optical frequency combs using an optical fibre delay line and optical carrier interference. The demonstrated method works well for both mode-locked oscillators and supercontinua, with a 2 × 10⁻⁹ fs²/Hz (equivalent to −174 dBc/Hz at 10-GHz carrier frequency) measurement noise floor. The demonstrated method can serve as a simple and powerful characterization tool for timing jitter PSDs of various comb sources including mode-locked oscillators, supercontinua and recently emerging Kerr-frequency combs; the jitter measurement results enabled by our method will provide new insights for understanding and optimizing timing noise in such comb sources. PMID:28102352

  2. Development and validation of a method for mercury determination in seawater for the process control of a candidate certified reference material.

    PubMed

    Sánchez, Raquel; Snell, James; Held, Andrea; Emons, Hendrik

    2015-08-01

    A simple, robust and reliable method for mercury determination in seawater matrices based on the combination of cold vapour generation and inductively coupled plasma mass spectrometry (CV-ICP-MS) and its complete in-house validation are described. The method validation covers parameters such as linearity, limit of detection (LOD), limit of quantification (LOQ), trueness, repeatability, intermediate precision and robustness. A calibration curve covering the whole working range was achieved with coefficients of determination typically higher than 0.9992. The repeatability of the method (RSDrep) was 0.5 %, and the intermediate precision was 2.3 % at the target mass fraction of 20 ng/kg. Moreover, the method was robust with respect to the salinity of the seawater. The limit of quantification was 2.7 ng/kg, which corresponds to 13.5 % of the target mass fraction in the future certified reference material (20 ng/kg). An uncertainty budget for the measurement of mercury in seawater has been established. The relative expanded (k = 2) combined uncertainty is 6 %. The performance of the validated method was demonstrated by generating results for process control and a homogeneity study for the production of a candidate certified reference material.
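    The limit of detection and limit of quantification quoted above are, by the usual convention, derived from the blank scatter and the calibration sensitivity; the sketch below applies the common 3.3σ/10σ rule with made-up numbers and is not necessarily the exact procedure used in this validation.

      import numpy as np

      def lod_loq(blank_signals, slope):
          # Common 3.3*sigma / 10*sigma convention: detection and quantification limits
          # from the standard deviation of replicate blanks and the calibration slope.
          s_blank = np.std(blank_signals, ddof=1)
          return 3.3 * s_blank / slope, 10 * s_blank / slope

      blanks = [0.8, 1.1, 0.9, 1.0, 0.7, 1.2]   # hypothetical blank intensities (counts)
      slope = 1.2e3                              # hypothetical sensitivity (counts per ng/kg)
      print(lod_loq(blanks, slope))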

  3. A Rapid Segmentation-Insensitive "Digital Biopsy" Method for Radiomic Feature Extraction: Method and Pilot Study Using CT Images of Non-Small Cell Lung Cancer.

    PubMed

    Echegaray, Sebastian; Nair, Viswam; Kadoch, Michael; Leung, Ann; Rubin, Daniel; Gevaert, Olivier; Napel, Sandy

    2016-12-01

    Quantitative imaging approaches compute features within images' regions of interest. Segmentation is rarely completely automatic, requiring time-consuming editing by experts. We propose a new paradigm, called "digital biopsy," that allows for the collection of intensity- and texture-based features from these regions at least 1 order of magnitude faster than the current manual or semiautomated methods. A radiologist reviewed automated segmentations of lung nodules from 100 preoperative volume computed tomography scans of patients with non-small cell lung cancer, and manually adjusted the nodule boundaries in each section, to be used as a reference standard, requiring up to 45 minutes per nodule. We also asked a different expert to generate a digital biopsy for each patient using a paintbrush tool to paint a contiguous region of each tumor over multiple cross-sections, a procedure that required an average of <3 minutes per nodule. We simulated additional digital biopsies using morphological procedures. Finally, we compared the features extracted from these digital biopsies with our reference standard using intraclass correlation coefficient (ICC) to characterize robustness. Comparing the reference standard segmentations to our digital biopsies, we found that 84/94 features had an ICC >0.7; comparing erosions and dilations, using a sphere of 1.5-mm radius, of our digital biopsies to the reference standard segmentations resulted in 41/94 and 53/94 features, respectively, with ICCs >0.7. We conclude that many intensity- and texture-based features remain consistent between the reference standard and our method while substantially reducing the amount of operator time required.

  4. Methodological considerations for implementation of lymphocyte subset analysis in a clinical reference laboratory.

    PubMed

    Muirhead, K A; Wallace, P K; Schmitt, T C; Frescatore, R L; Franco, J A; Horan, P K

    1986-01-01

    As the diagnostic utility of lymphocyte subset analysis has been recognized in the clinical research laboratory, a wide variety of reagents and cell preparation, staining and analysis methods have also been described. Methods that are perfectly suitable for analysis of smaller sample numbers in the biological or clinical research setting are not always appropriate and/or applicable in the setting of a high volume clinical reference laboratory. We describe here some of the specific considerations involved in choosing a method for flow cytometric analysis which minimizes sample preparation and data analysis time while maximizing sample stability, viability, and reproducibility. Monoclonal T- and B-cell reagents from three manufacturers were found to give equivalent results for a reference population of healthy individuals. This was true whether direct or indirect immunofluorescence staining was used and whether cells were prepared by Ficoll-Hypaque fractionation (FH) or by lysis of whole blood. When B cells were enumerated using a polyclonal anti-immunoglobulin reagent, less cytophilic immunoglobulin staining was present after lysis than after FH preparation. However, both preparation methods required additional incubation at 37 degrees C to obtain results concordant with monoclonal B-cell reagents. Standard reagents were chosen on the basis of maximum positive/negative separation and the availability of appropriate negative controls. The effects of collection medium and storage conditions on sample stability and reproducibility of subset analysis were also assessed. Specimens collected in heparin and stored at room temperature in buffered medium gave reproducible results for 3 days after specimen collection, using either FH or lysis as the preparation method. General strategies for instrument optimization, quality control, and biohazard containment are also discussed.

  5. A Qualitative Approach to Sketch the Graph of a Function.

    ERIC Educational Resources Information Center

    Alson, Pedro

    1992-01-01

    Presents a qualitative and global method of graphing functions that involves transformations of the graph of a known function in the cartesian coordinate system referred to as graphic operators. Explains how the method has been taught to students and some comments about the results obtained. (MDH)

  6. Smartphone-Based Point-of-Care Urinalysis Under Variable Illumination

    PubMed Central

    Ra, Moonsoo; Lim, Chiawei; Han, Sehui; Jung, Chansung; Kim, Whoi-Yul

    2018-01-01

    Urine tests are performed by using an off-the-shelf reference sheet to compare the color of test strips. However, the tabular representation is difficult to use and prone to visual errors, especially when the reference color swatches to be compared are spatially far apart, making it difficult to distinguish the subtle differences in shade on the reagent pads. This manuscript presents a new arrangement of reference arrays for urine test strips (urinalysis). Reference color swatches are grouped in a doughnut chart surrounding each reagent pad on the strip, so the urine test can be evaluated with the naked eye by referring to the strip alone, with no additional sheet necessary. Along with this new strip, an algorithm for a smartphone-based application is also proposed as an alternative way to deliver diagnostic results. The proposed colorimetric detection method evaluates the captured image of the strip under various color spaces and covers ten different urine tests. Thus, the proposed system can deliver results on the spot using either the naked eye or a smartphone. The proposed scheme delivered accurate results under various environmental illumination conditions without any calibration requirements, exhibiting performance suitable for real-life applications and ease of use for a common user. PMID:29333352

  7. Evaluating the quality of a cell counting measurement process via a dilution series experimental design.

    PubMed

    Sarkar, Sumona; Lund, Steven P; Vyzasatya, Ravi; Vanguri, Padmavathy; Elliott, John T; Plant, Anne L; Lin-Gibson, Sheng

    2017-12-01

    Cell counting measurements are critical in the research, development and manufacturing of cell-based products, yet determining cell quantity with accuracy and precision remains a challenge. Validating and evaluating a cell counting measurement process can be difficult because of the lack of appropriate reference material. Here we describe an experimental design and statistical analysis approach to evaluate the quality of a cell counting measurement process in the absence of appropriate reference materials or reference methods. The experimental design is based on a dilution series study with replicate samples and observations as well as measurement process controls. The statistical analysis evaluates the precision and proportionality of the cell counting measurement process and can be used to compare the quality of two or more counting methods. As an illustration of this approach, cell counting measurement processes (automated and manual methods) were compared for a human mesenchymal stromal cell (hMSC) preparation. For the hMSC preparation investigated, results indicated that the automated method performed better than the manual counting methods in terms of precision and proportionality. By conducting well controlled dilution series experimental designs coupled with appropriate statistical analysis, quantitative indicators of repeatability and proportionality can be calculated to provide an assessment of cell counting measurement quality. This approach does not rely on the use of a reference material or comparison to "gold standard" methods known to have limited assurance of accuracy and precision. The approach presented here may help the selection, optimization, and/or validation of a cell counting measurement process. Published by Elsevier Inc.
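    As a simplified illustration of the precision and proportionality indicators described above, the sketch below computes a per-dilution CV and a zero-intercept proportionality fit from replicate counts; the actual protocol uses a more formal statistical model, and the counts shown are invented.

      import numpy as np

      def dilution_series_summary(dilution_fractions, counts):
          # Per-dilution CV and an R^2 for a zero-intercept proportional model,
          # two simple indicators of precision and proportionality.
          d = np.asarray(dilution_fractions, float)
          y = np.asarray(counts, float)                 # shape: (n_dilutions, n_replicates)
          cv = y.std(axis=1, ddof=1) / y.mean(axis=1)
          means = y.mean(axis=1)
          slope = np.sum(d * means) / np.sum(d * d)     # least-squares slope through origin
          ss_res = np.sum((means - slope * d) ** 2)
          ss_tot = np.sum((means - means.mean()) ** 2)
          return cv, slope, 1 - ss_res / ss_tot

      # Hypothetical counts (cells/mL) for a 4-point dilution series, 3 replicates each.
      fractions = [1.0, 0.75, 0.5, 0.25]
      counts = [[1.02e6, 0.98e6, 1.01e6],
                [7.6e5, 7.4e5, 7.7e5],
                [5.1e5, 4.9e5, 5.0e5],
                [2.6e5, 2.4e5, 2.5e5]]
      print(dilution_series_summary(fractions, counts))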

  8. Evaluating new HbA1c methods for adoption by the IFCC and NGSP reference networks using international quality targets.

    PubMed

    Lenters-Westra, Erna; English, Emma

    2017-08-28

    As a reference laboratory for HbA1c, it is essential to have accurate and precise HbA1c methods covering a range of measurement principles. We report an evaluation of the Abbott Enzymatic (Architect c4000), Roche Gen.3 HbA1c (Cobas c513) and Tosoh G11 using different quality targets. The effect of hemoglobin variants, other potential interferences and the performance in comparison to both the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) and the National Glycohemoglobin Standardization Program (NGSP) reference systems was assessed using certified evaluation protocols. Each of the evaluated HbA1c methods had CVs <3% in SI units and <2% in NGSP units at 46 mmol/mol (6.4%) and 72 mmol/mol (8.7%) and passed the NGSP criteria when compared with six secondary reference measurement procedures (SRMPs). Sigma was 8.6 for Abbott Enzymatic, 3.3 for Roche Cobas c513 and 6.9 for Tosoh G11. No clinically significant interference was detected for the common Hb variants for the three methods. All three methods performed well and are suitable for clinical application in the analysis of HbA1c. Partly based on the result of this study, the Abbott Enzymatic method on the Architect c4000 and the Roche Gen.3 HbA1c on the Cobas c513 are now official, certified IFCC and NGSP SRMPs in the IFCC and NGSP networks. Sigma metrics quality criteria presented in a graph distinguish between good and excellent performance.
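    The sigma value reported for each analyser is conventionally computed from the allowable total error, the bias against the reference system and the imprecision; the sketch below applies that standard formula with illustrative numbers, since the study's exact inputs are not reproduced here.

      def sigma_metric(tea_pct, bias_pct, cv_pct):
          # Commonly used sigma metric: (allowable total error - |bias|) / CV, in percent.
          return (tea_pct - abs(bias_pct)) / cv_pct

      # Illustrative only: a 6% allowable total error, 0.5% bias and 1.5% CV give sigma ~3.7.
      print(round(sigma_metric(6.0, 0.5, 1.5), 2))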

  9. Biscayne aquifer drinking water (USGS45): a new isotopic reference material for δ2H and δ18O measurements of water

    USGS Publications Warehouse

    Lorenz, Jennifer M.; Tarbox, Lauren V.; Buck, Bryan; Qi, Haiping; Coplen, Tyler B.

    2014-01-01

    RATIONALE As a result of the scarcity of isotopic reference waters for daily use, a new secondary isotopic reference material for international distribution has been prepared from drinking water collected from the Biscayne aquifer in Ft. Lauderdale, Florida. METHODS This isotopic reference water was filtered, homogenized, loaded into glass ampoules, sealed with a torch, autoclaved to eliminate biological activity, and measured by dual-inlet isotope-ratio mass spectrometry. This reference material is available by the case of 144 glass ampoules containing either 4 mL or 5 mL of water in each ampoule. RESULTS The δ2H and δ18O values of this reference material are –10.3 ± 0.4 ‰ and –2.238 ± 0.011 ‰, respectively, relative to VSMOW, on scales normalized such that the δ2H and δ18O values of SLAP reference water are, respectively, –428 and –55.5 ‰. Each uncertainty is an estimated expanded uncertainty (U = 2uc) about the reference value that provides an interval that has about a 95 % probability of encompassing the true value. CONCLUSIONS This isotopic reference material, designated as USGS45, is intended as one of two isotopic reference waters for daily normalization of stable hydrogen and oxygen isotopic analysis of water with an isotope-ratio mass spectrometer or a laser absorption spectrometer. 

  10. 10 CFR 434.505 - Reference building method.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 3 2013-01-01 2013-01-01 false Reference building method. 434.505 Section 434.505 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS Building Energy Cost Compliance Alternative § 434.505 Reference building method. 505...

  11. 10 CFR 434.505 - Reference building method.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Reference building method. 434.505 Section 434.505 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS Building Energy Cost Compliance Alternative § 434.505 Reference building method. 505.1...

  12. 10 CFR 434.505 - Reference building method.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Reference building method. 434.505 Section 434.505 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS Building Energy Cost Compliance Alternative § 434.505 Reference building method. 505...

  13. 10 CFR 434.505 - Reference building method.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Reference building method. 434.505 Section 434.505 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS Building Energy Cost Compliance Alternative § 434.505 Reference building method. 505...

  14. 10 CFR 434.505 - Reference building method.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Reference building method. 434.505 Section 434.505 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION ENERGY CODE FOR NEW FEDERAL COMMERCIAL AND MULTI-FAMILY HIGH RISE RESIDENTIAL BUILDINGS Building Energy Cost Compliance Alternative § 434.505 Reference building method. 505...

  15. Applying Suffix Rules to Organization Name Recognition

    NASA Astrophysics Data System (ADS)

    Inui, Takashi; Murakami, Koji; Hashimoto, Taiichi; Utsumi, Kazuo; Ishikawa, Masamichi

    This paper presents a method for boosting the performance of organization name recognition, which is a part of named entity recognition (NER). Although gazetteers (lists of NEs) are known to be effective features for supervised machine learning approaches to the NER task, previous methods that applied gazetteers to NER were very simple: the gazetteers were used only to search for exact matches between the input text and the NEs they contain. The proposed method generates regular expression rules from gazetteers, and with these rules it can realize higher-coverage searches based on looser matches between the input text and NEs. To generate these rules, we focus on two well-known characteristics of NE expressions: 1) most NE expressions can be divided into two parts, a class-reference part and an instance-reference part, and 2) for most NE expressions the class-reference part is located at the suffix position. A pattern mining algorithm is run on the set of NEs in the gazetteers to find frequent word sequences from which NEs are constructed. Then, only the word sequences that have the class-reference part at the suffix position are employed as suffix rules. Experimental results showed that the proposed method improved the performance of organization name recognition, achieving an F-value of 84.58 on the evaluation data.
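    A toy version of the rule-generation idea, transplanted to English organization names (the original work targets Japanese text), might look like the following; the gazetteer, frequency threshold and regular-expression shape are all illustrative assumptions.

      import re
      from collections import Counter

      # Hypothetical gazetteer of organization names, pre-segmented into words.
      gazetteer = [
          ["Acme", "Holding", "Corporation"],
          ["Globex", "Corporation"],
          ["Initech", "Research", "Institute"],
          ["Umbrella", "Research", "Institute"],
          ["Stark", "Industries"],
      ]

      # Count word sequences that end the names; frequent ones approximate
      # class-reference parts sitting at the suffix position.
      suffix_counts = Counter()
      for name in gazetteer:
          for i in range(len(name)):
              suffix_counts[tuple(name[i:])] += 1
      frequent = [s for s, c in suffix_counts.items() if c >= 2 and len(s) <= 2]

      # Turn each frequent suffix into a loose-match rule: a few capitalised words
      # followed by that suffix (overlapping hits would be merged in practice).
      rules = [re.compile(r"(?:[A-Z]\w+\s+){1,3}" + r"\s+".join(map(re.escape, s))) for s in frequent]

      text = "The contract was signed by Wayne Research Institute and Hooli Corporation."
      for rule in rules:
          for match in rule.finditer(text):
              print(match.group(0))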

  16. Extrapolation-Based References Improve Motion and Eddy-Current Correction of High B-Value DWI Data: Application in Parkinson's Disease Dementia.

    PubMed

    Nilsson, Markus; Szczepankiewicz, Filip; van Westen, Danielle; Hansson, Oskar

    2015-01-01

    Conventional motion and eddy-current correction, where each diffusion-weighted volume is registered to a non-diffusion-weighted reference, suffers from poor accuracy for high b-value data. An alternative approach is to extrapolate reference volumes from low b-value data. We aim to compare the performance of conventional and extrapolation-based correction of diffusional kurtosis imaging (DKI) data, and to demonstrate the impact of the correction approach on group comparison studies. DKI was performed in patients with Parkinson's disease dementia (PDD) and healthy age-matched controls, using b-values of up to 2750 s/mm². The accuracy of conventional and extrapolation-based correction methods was investigated. Parameters from DTI and DKI were compared between patients and controls in the cingulum and the anterior thalamic projection tract. Conventional correction resulted in systematic registration errors for high b-value data. The extrapolation-based methods did not exhibit such errors, yielding more accurate tractography and up to 50% lower standard deviation in DKI metrics. Statistically significant differences were found between patients and controls when using the extrapolation-based motion correction that were not detected when using the conventional method. We recommend that conventional motion and eddy-current correction should be abandoned for high b-value data in favour of more accurate methods using extrapolation-based references.
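    One simple way to generate such extrapolated references is to fit a monoexponential decay to the low b-value signal in each voxel and evaluate it at the high b-value; the published method may use a richer signal model, so the sketch below is only an illustration of the idea on synthetic data.

      import numpy as np

      def extrapolated_reference(low_b_signals, low_b_values, target_b):
          # Voxel-wise monoexponential extrapolation S(b) = S0*exp(-b*ADC) from
          # low-b data to synthesize a reference volume at a high b-value.
          S = np.asarray(low_b_signals, float)          # shape: (n_low_b, *volume_shape)
          b = np.asarray(low_b_values, float)
          logS = np.log(np.clip(S, 1e-6, None)).reshape(len(b), -1)
          A = np.column_stack([np.ones_like(b), -b])    # model: log S = log S0 - b*ADC
          coef, *_ = np.linalg.lstsq(A, logS, rcond=None)
          log_s0, adc = coef
          return np.exp(log_s0 - target_b * adc).reshape(S.shape[1:])

      # Tiny synthetic example: two low b-values, a 4x4x4 "volume".
      rng = np.random.default_rng(2)
      vol_shape = (4, 4, 4)
      s0 = rng.uniform(500, 1000, vol_shape)
      adc = rng.uniform(0.5e-3, 1.5e-3, vol_shape)      # mm^2/s
      signals = np.stack([s0 * np.exp(-b * adc) for b in (0, 500)])
      print(extrapolated_reference(signals, [0, 500], target_b=2750).shape)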

  17. Baseline Assessment of 25-Hydroxyvitamin D Reference Material and Proficiency Testing/External Quality Assurance Material Commutability: A Vitamin D Standardization Program Study.

    PubMed

    Phinney, Karen W; Sempos, Christopher T; Tai, Susan S-C; Camara, Johanna E; Wise, Stephen A; Eckfeldt, John H; Hoofnagle, Andrew N; Carter, Graham D; Jones, Julia; Myers, Gary L; Durazo-Arvizu, Ramon; Miller, W Greg; Bachmann, Lorin M; Young, Ian S; Pettit, Juanita; Caldwell, Grahame; Liu, Andrew; Brooks, Stephen P J; Sarafin, Kurtis; Thamm, Michael; Mensink, Gert B M; Busch, Markus; Rabenberg, Martina; Cashman, Kevin D; Kiely, Mairead; Galvin, Karen; Zhang, Joy Y; Kinsella, Michael; Oh, Kyungwon; Lee, Sun-Wha; Jung, Chae L; Cox, Lorna; Goldberg, Gail; Guberg, Kate; Meadows, Sarah; Prentice, Ann; Tian, Lu; Brannon, Patsy M; Lucas, Robyn M; Crump, Peter M; Cavalier, Etienne; Merkel, Joyce; Betz, Joseph M

    2017-09-01

    The Vitamin D Standardization Program (VDSP) coordinated a study in 2012 to assess the commutability of reference materials and proficiency testing/external quality assurance materials for total 25-hydroxyvitamin D [25(OH)D] in human serum, the primary indicator of vitamin D status. A set of 50 single-donor serum samples as well as 17 reference and proficiency testing/external quality assessment materials were analyzed by participating laboratories that used either immunoassay or LC-MS methods for total 25(OH)D. The commutability test materials included National Institute of Standards and Technology Standard Reference Material 972a Vitamin D Metabolites in Human Serum as well as materials from the College of American Pathologists and the Vitamin D External Quality Assessment Scheme. Study protocols and data analysis procedures were in accordance with Clinical and Laboratory Standards Institute guidelines. The majority of the test materials were found to be commutable with the methods used in this commutability study. These results provide guidance for laboratories needing to choose appropriate reference materials and select proficiency or external quality assessment programs and will serve as a foundation for additional VDSP studies.

  18. Methylphenidate use in children with attention deficit hyperactivity disorder

    PubMed Central

    Machado, Felipe Salles Neves; Caetano, Sheila Cavalcante; Hounie, Ana Gabriela; Scivoletto, Sandra; Muszkat, Mauro; Gattás, Ivete Gianfaldoni; Casella, Erasmo Barbante; de Andrade, Ênio Roberto; Polanczyk, Guilherme Vanoni; do Rosário, Maria Conceição

    2015-01-01

    A Brazilian Health Technology Assessment Bulletin (BRATS) article regarding scientific evidence of the efficacy and safety of methylphenidate for treating attention deficit hyperactivity disorder (ADHD) has caused much controversy about its methods. Considering the relevance of BRATS for public health in Brazil, we critically reviewed this article by remaking the BRATS search and discussing its methods and results. Two questions were answered: did BRATS include all references available in the literature? Do the conclusions reflect the reviewed articles? The results indicate that BRATS did not include all the references from the literature on this subject and also that the proposed conclusions are different from the results of the articles chosen by the BRATS authors themselves. The articles selected by the BRATS authors showed that using methylphenidate is safe and effective. However, the BRATS final conclusion does not reflect the aforementioned and should not be used to support decisions on the use of methylphenidate. PMID:26061456

  19. Variability of bioaccessibility results using seventeen different methods on a standard reference material, NIST 2710.

    PubMed

    Koch, Iris; Reimer, Kenneth J; Bakker, Martine I; Basta, Nicholas T; Cave, Mark R; Denys, Sébastien; Dodd, Matt; Hale, Beverly A; Irwin, Rob; Lowney, Yvette W; Moore, Margo M; Paquin, Viviane; Rasmussen, Pat E; Repaso-Subang, Theresa; Stephenson, Gladys L; Siciliano, Steven D; Wragg, Joanna; Zagury, Gerald J

    2013-01-01

    Bioaccessibility is a measurement of a substance's solubility in the human gastro-intestinal system, and is often used in the risk assessment of soils. The present study was designed to determine the variability among laboratories using different methods to measure the bioaccessibility of 24 inorganic contaminants in one standardized soil sample, the standard reference material NIST 2710. Fourteen laboratories used a total of 17 bioaccessibility extraction methods. The variability between methods was assessed by calculating the reproducibility relative standard deviations (RSDs), where reproducibility is the sum of within-laboratory and between-laboratory variability. Whereas within-laboratory repeatability was usually better than (<) 15% for most elements, reproducibility RSDs were much higher, indicating more variability, although for many elements they were comparable to typical uncertainties (e.g., 30% in commercial laboratories). For five trace elements of interest, reproducibility RSDs were: arsenic (As), 22-44%; cadmium (Cd), 11-41%; Cu, 15-30%; lead (Pb), 45-83%; and Zn, 18-56%. Only one method variable, pH, was found to correlate significantly with bioaccessibility for aluminum (Al), Cd, copper (Cu), manganese (Mn), Pb and zinc (Zn) but other method variables could not be examined systematically because of the study design. When bioaccessibility results were directly compared with bioavailability results for As (swine and mouse) and Pb (swine), four methods returned results within uncertainty ranges for both elements: two that were defined as simpler (gastric phase only, limited chemicals) and two were more complex (gastric + intestinal phases, with a mixture of chemicals).
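
    A minimal sketch, under standard one-way ANOVA assumptions, of how repeatability and reproducibility relative standard deviations can be derived from replicate results reported by several laboratories; the study's exact statistical protocol is not reproduced, and the numbers are illustrative:

      import numpy as np

      def repeatability_reproducibility_rsd(results):
          """results: dict {lab: list of replicate bioaccessibility values (%)}."""
          groups = [np.asarray(v, dtype=float) for v in results.values()]
          n = np.array([len(g) for g in groups])
          means = np.array([g.mean() for g in groups])
          grand = np.concatenate(groups).mean()
          p, N = len(groups), n.sum()

          ms_within = sum(((g - m) ** 2).sum() for g, m in zip(groups, means)) / (N - p)
          ms_between = (n * (means - grand) ** 2).sum() / (p - 1)
          n0 = (N - (n ** 2).sum() / N) / (p - 1)            # effective replicates per lab

          var_repeat = ms_within                             # within-laboratory variance
          var_reprod = var_repeat + max((ms_between - ms_within) / n0, 0.0)

          return (100 * np.sqrt(var_repeat) / grand,         # repeatability RSD, %
                  100 * np.sqrt(var_reprod) / grand)         # reproducibility RSD, %

      labs = {"lab1": [42, 45, 44], "lab2": [55, 52], "lab3": [38, 40, 39]}
      print(repeatability_reproducibility_rsd(labs))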

  20. Identification of Suitable Reference Genes for Gene Expression Normalization in qRT-PCR Analysis in Watermelon

    PubMed Central

    Gao, Lingyun; Zhao, Shuang; Jiang, Wei; Huang, Yuan; Bie, Zhilong

    2014-01-01

    Watermelon is one of the major Cucurbitaceae crops and the recent availability of its genome sequence greatly facilitates fundamental research on it. Quantitative real-time reverse transcriptase PCR (qRT–PCR) is the preferred method for gene expression analyses, and using validated reference genes for normalization is crucial to ensure the accuracy of this method. However, a systematic validation of reference genes has not been conducted on watermelon. In this study, transcripts of 15 candidate reference genes were quantified in watermelon using qRT–PCR, and the stability of these genes was compared using geNorm and NormFinder. geNorm identified ClTUA and ClACT, ClEF1α and ClACT, and ClCAC and ClTUA as the best pairs of reference genes in watermelon organs and tissues under normal growth conditions, abiotic stress, and biotic stress, respectively. NormFinder identified ClYLS8, ClUBCP, and ClCAC as the best single reference genes under the above experimental conditions, respectively. ClYLS8 and ClPP2A were identified as the best reference genes across all samples. Two to nine reference genes were required for more reliable normalization depending on the experimental conditions. The widely used watermelon reference gene 18SrRNA was less stable than the other reference genes under the experimental conditions. Catalase family genes were identified in the watermelon genome and used to validate the reliability of the identified reference genes. ClCAT1 and ClCAT2 were induced and upregulated in the first 24 h, whereas ClCAT3 was downregulated in the leaves under low temperature stress. However, the expression levels of these genes were significantly overestimated and misinterpreted when 18SrRNA was used as a reference gene. These results provide a good starting point for reference gene selection in qRT–PCR analyses involving watermelon. PMID:24587403
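
    A minimal sketch of the geNorm stability measure M used in studies such as this one: for each candidate gene, M is the average standard deviation of the log2 expression ratios against every other candidate, and less stable genes receive higher M; relative expression quantities (not raw Cq values) are assumed as input, and the data are illustrative:

      import numpy as np

      def genorm_m(expr):
          """expr: array (n_genes, n_samples) of relative expression quantities."""
          log_expr = np.log2(np.asarray(expr, dtype=float))
          n_genes = log_expr.shape[0]
          m_values = []
          for j in range(n_genes):
              sds = [np.std(log_expr[j] - log_expr[k], ddof=1)
                     for k in range(n_genes) if k != j]
              m_values.append(np.mean(sds))
          return np.array(m_values)    # lower M = more stable reference gene

      # Three candidate genes across five samples; the last one is deliberately unstable
      expr = [[1.0, 1.1, 0.9, 1.05, 0.95],
              [2.0, 2.3, 1.8, 2.10, 1.90],
              [0.5, 1.5, 0.2, 0.80, 2.00]]
      print(genorm_m(expr))            # the third gene gets the highest M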

  2. Assessment of the Validity of the Research Diagnostic Criteria for Temporomandibular Disorders: Overview and Methodology

    PubMed Central

    Schiffman, Eric L.; Truelove, Edmond L.; Ohrbach, Richard; Anderson, Gary C.; John, Mike T.; List, Thomas; Look, John O.

    2011-01-01

    AIMS The purpose of the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD) Validation Project was to assess the diagnostic validity of this examination protocol. An overview is presented, including Axis I and II methodology and descriptive statistics for the study participant sample. This paper details the development of reliable methods to establish the reference standards for assessing criterion validity of the Axis I RDC/TMD diagnoses. Validity testing for the Axis II biobehavioral instruments was based on previously validated reference standards. METHODS The Axis I reference standards were based on the consensus of 2 criterion examiners independently performing a comprehensive history, clinical examination, and evaluation of imaging. Intersite reliability was assessed annually for criterion examiners and radiologists. Criterion exam reliability was also assessed within study sites. RESULTS Study participant demographics were comparable to those of participants in previous studies using the RDC/TMD. Diagnostic agreement of the criterion examiners with each other and with the consensus-based reference standards was excellent with all kappas ≥ 0.81, except for osteoarthrosis (moderate agreement, k = 0.53). Intrasite criterion exam agreement with reference standards was excellent (k ≥ 0.95). Intersite reliability of the radiologists for detecting computed tomography-disclosed osteoarthrosis and magnetic resonance imaging-disclosed disc displacement was good to excellent (k = 0.71 and 0.84, respectively). CONCLUSION The Validation Project study population was appropriate for assessing the reliability and validity of the RDC/TMD Axis I and II. The reference standards used to assess the validity of Axis I TMD were based on reliable and clinically credible methods. PMID:20213028

  3. Development of certified reference materials for electrolytes in human serum (GBW09124-09126).

    PubMed

    Feng, Liuxing; Wang, Jun; Cui, Yanjie; Shi, Naijie; Li, Haifeng; Li, Hongmei

    2017-05-01

    Three reference materials, at relatively low, middle, and high concentrations, were developed for analysis of the mass fractions of electrolytes (K, Ca, Na, Mg, Cl, and Li) in human serum. The reference materials were prepared by adding high-purity chloride salts to normal human serum. The concentration range of the three levels is within ±20% of normal human serum. It was shown that analysis of 14 units in duplicate was sufficient to demonstrate the homogeneity of these candidate reference materials. The statistical results also showed no significant trends in either the short-term stability test (1 week at 40 °C) or the long-term stability test (14 months). The certification methods for the six elements include isotope dilution inductively coupled plasma mass spectrometry (ID-ICP-MS), inductively coupled plasma optical emission spectroscopy (ICP-OES), atomic absorption spectroscopy (AAS), ion chromatography (IC), and ion-selective electrode (ISE). The certification methods were validated by international comparisons among a number of national metrology institutes (NMIs). The combined relative standard uncertainties of the property values were estimated by considering the uncertainties of the analytical methods, homogeneity, and stability. The expanded uncertainties of all the elements range from 2.2% to 3.9%. The certified reference materials (CRMs) are primarily intended for use in the calibration and validation of procedures in clinical analysis for the determination of electrolytes in human serum or plasma. Graphical Abstract Certified reference materials for K, Ca, Mg, Na, Cl and Li in human serum (GBW09124-09126).
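
    A minimal sketch of how a combined relative standard uncertainty and the corresponding expanded uncertainty are typically assembled from characterization, homogeneity and stability contributions; the coverage factor k = 2 and the example numbers are illustrative, not the certified values:

      import math

      def expanded_relative_uncertainty(u_char, u_hom, u_stab, k=2.0):
          """All inputs as relative standard uncertainties (e.g. 0.01 = 1%)."""
          u_combined = math.sqrt(u_char**2 + u_hom**2 + u_stab**2)
          return k * u_combined        # expanded relative uncertainty

      # e.g. 1.2% from the analytical methods, 0.6% homogeneity, 0.5% stability
      print(f"{100 * expanded_relative_uncertainty(0.012, 0.006, 0.005):.1f} %")   # about 2.9 %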

  4. MicroSEQ® Salmonella spp. Detection Kit Using the Pathatrix® 10-Pooling Salmonella spp. Kit Linked Protocol Method Modification.

    PubMed

    Wall, Jason; Conrad, Rick; Latham, Kathy; Liu, Eric

    2014-03-01

    Real-time PCR methods for detecting foodborne pathogens offer the advantages of simplicity and quick time to results compared to traditional culture methods. The addition of a recirculating pooled immunomagnetic separation method prior to real-time PCR analysis increases processing output while reducing both cost and labor. This AOAC Research Institute method modification study validates the MicroSEQ® Salmonella spp. Detection Kit [AOAC Performance Tested Method (PTM) 031001] linked with the Pathatrix® 10-Pooling Salmonella spp. Kit (AOAC PTM 090203C) in diced tomatoes, chocolate, and deli ham. The Pathatrix 10-Pooling protocol represents a method modification of the enrichment portion of the MicroSEQ Salmonella spp. detection method. The results of the method modification were compared to standard cultural reference methods for diced tomatoes, chocolate, and deli ham. All three matrixes were analyzed in a paired study design. An additional set of chocolate test portions was analyzed using an alternative enrichment medium in an unpaired study design. For all matrixes tested, there were no statistically significant differences in the number of positive test portions detected by the modified candidate method compared to the appropriate reference method. The MicroSEQ Salmonella spp. protocol linked with the Pathatrix individual or 10-Pooling procedure demonstrated reliability as a rapid, simplified method for the preparation of samples and subsequent detection of Salmonella in diced tomatoes, chocolate, and deli ham.

  5. A Rapid Identification Method for Calamine Using Near-Infrared Spectroscopy Based on Multi-Reference Correlation Coefficient Method and Back Propagation Artificial Neural Network.

    PubMed

    Sun, Yangbo; Chen, Long; Huang, Bisheng; Chen, Keli

    2017-07-01

    As a mineral, the traditional Chinese medicine calamine has a similar appearance to many other minerals, and investigations of commercially available calamine samples have shown that many fake and inferior calamine goods are sold on the market. The conventional identification method for calamine is complicated; given the large number of calamine samples, a rapid identification method is therefore needed. To establish a qualitative model using near-infrared (NIR) spectroscopy for rapid identification of various calamine samples, large quantities of calamine samples, including crude products, counterfeits and processed products, were collected and correctly identified using physicochemical tests and powder X-ray diffraction. The NIR spectra of these samples were analyzed by combining the multi-reference correlation coefficient (MRCC) method with the error back-propagation artificial neural network algorithm (BP-ANN) to realize qualitative identification of calamine samples. The accuracy rate of the model based on the NIR and MRCC methods was 85%; in addition, the model, which took multiple factors into consideration, can be used to identify crude calamine products, counterfeits and processed products. Furthermore, by inputting the correlation coefficients against multiple references as the spectral feature data of the samples into a BP-ANN, a qualitative identification model was established whose accuracy rate increased to 95%. The MRCC method can thus be used as a NIR-based method in the process of BP-ANN modeling.
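
    A minimal sketch of the multi-reference correlation coefficient idea: each sample spectrum is reduced to its Pearson correlation coefficients against several reference spectra, and those coefficients are used either with a simple threshold or as input features for a back-propagation neural network; the class structure and the use of class-mean spectra as references are assumptions for illustration:

      import numpy as np

      def mrcc_features(spectra, references):
          """Correlation of each sample NIR spectrum with each reference spectrum.

          spectra:    array (n_samples, n_wavelengths)
          references: array (n_refs, n_wavelengths), e.g. mean spectra of crude,
                      processed and counterfeit calamine classes
          returns     array (n_samples, n_refs) of Pearson correlation coefficients
          """
          S = np.asarray(spectra, dtype=float)
          R = np.asarray(references, dtype=float)
          S = S - S.mean(axis=1, keepdims=True)
          R = R - R.mean(axis=1, keepdims=True)
          num = S @ R.T
          den = np.outer(np.linalg.norm(S, axis=1), np.linalg.norm(R, axis=1))
          return num / den

      # features = mrcc_features(sample_spectra, reference_spectra)
      # Thresholding the best correlation gives the simpler model described above;
      # feeding `features` into a back-propagation ANN corresponds to the second model.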

  6. A Technique of Two-Stage Clustering Applied to Environmental and Civil Engineering and Related Methods of Citation Analysis.

    ERIC Educational Resources Information Center

    Miyamoto, S.; Nakayama, K.

    1983-01-01

    A method of two-stage clustering of literature based on citation frequency is applied to 5,065 articles from 57 journals in environmental and civil engineering. Results of related methods of citation analysis (hierarchical graph, clustering of journals, multidimensional scaling) applied to the same set of articles are compared. Ten references are…

  7. The Kjeldahl method as a primary reference procedure for total protein in certified reference materials used in clinical chemistry. I. A review of Kjeldahl methods adopted by laboratory medicine.

    PubMed

    Chromý, Vratislav; Vinklárková, Bára; Šprongl, Luděk; Bittová, Miroslava

    2015-01-01

    We found previously that albumin-calibrated total protein in certified reference materials causes unacceptable positive bias in the analysis of human sera. The simplest way to remedy this defect is the use of human-based serum/plasma standards calibrated by the Kjeldahl method. Such standards, commutable with serum samples, will compensate for bias caused by lipids and bilirubin in most human sera. To find a suitable primary reference procedure for total protein in reference materials, we reviewed Kjeldahl methods adopted by laboratory medicine. We found two methods recommended for total protein in human samples: an indirect analysis based on total Kjeldahl nitrogen corrected for its nonprotein nitrogen, and a direct analysis made on isolated protein precipitates. The methods found will be assessed in a subsequent article.
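
    A minimal worked example of the indirect Kjeldahl calculation mentioned above, assuming the conventional nitrogen-to-protein conversion factor of 6.25; both the factor and the numbers are illustrative only:

      def total_protein_g_per_L(total_kjeldahl_N, nonprotein_N, factor=6.25):
          """Indirect Kjeldahl estimate: protein nitrogen times a conversion factor.

          total_kjeldahl_N, nonprotein_N: nitrogen concentrations in g/L
          """
          return (total_kjeldahl_N - nonprotein_N) * factor

      # e.g. 11.5 g/L total Kjeldahl N with 0.3 g/L non-protein N -> 70 g/L protein
      print(total_protein_g_per_L(11.5, 0.3))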

  8. Improving the efficiency of quantitative (1)H NMR: an innovative external standard-internal reference approach.

    PubMed

    Huang, Yande; Su, Bao-Ning; Ye, Qingmei; Palaniswamy, Venkatapuram A; Bolgar, Mark S; Raglione, Thomas V

    2014-01-01

    The classical internal standard quantitative NMR (qNMR) method determines the purity of an analyte by measuring a solution containing the analyte and a standard. Therefore, the standard must meet the requirements of chemical compatibility and lack of resonance interference with the analyte, as well as having a known purity. The identification of such a standard can be time consuming and must be repeated for each analyte. In contrast, the external standard qNMR method utilizes a standard with a known purity to calibrate the NMR instrument. The external standard and the analyte are measured separately, thereby eliminating concerns about chemical compatibility and resonance interference between the standard and the analyte. However, the instrumental factors, including the quality of the NMR tubes, must be kept the same; any deviations will compromise the accuracy of the results. An innovative qNMR method reported herein utilizes an internal reference substance along with an external standard to assume the role of the standard used in the traditional internal standard qNMR method. In this new method, the internal reference substance need only be chemically compatible and free of resonance interference with the analyte or external standard, whereas the external standard need only be of known purity. The exact purity or concentration of the internal reference substance is not required as long as the same quantity is added to the external standard and the analyte. The new method significantly reduces the burden of searching for an appropriate standard for each analyte; the efficiency of the qNMR purity assay therefore increases while the precision of the internal standard method is retained. Copyright © 2013 Elsevier B.V. All rights reserved.
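
    A minimal sketch of the arithmetic implied by this approach, under the stated assumption that the same quantity of internal reference substance is added to the analyte and to the external standard solutions so that its purity cancels; the symbols (integral I, proton count N, molar mass M, weighed mass m) follow the usual qNMR relation and are illustrative, not taken from the paper:

      def qnmr_purity(I_a, I_ref_a, N_a, M_a, m_a,
                      I_s, I_ref_s, N_s, M_s, m_s, P_s):
          """Analyte purity from an external standard plus a shared internal reference.

          (I, N, M, m) = integral, proton count, molar mass, weighed mass of the
          analyte (a) and the external standard (s); I_ref_a and I_ref_s are the
          integrals of the internal reference resonance in each tube; P_s is the
          known purity of the external standard.
          """
          ratio_a = (I_a / I_ref_a) / N_a      # analyte signal per proton, vs reference
          ratio_s = (I_s / I_ref_s) / N_s      # standard signal per proton, vs reference
          return P_s * (ratio_a / ratio_s) * (M_a / M_s) * (m_s / m_a)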

  9. A simple method for HPLC retention time prediction: linear calibration using two reference substances.

    PubMed

    Sun, Lei; Jin, Hong-Yu; Tian, Run-Tao; Wang, Ming-Juan; Liu, Li-Na; Ye, Liu-Ping; Zuo, Tian-Tian; Ma, Shuang-Cheng

    2017-01-01

    Analysis of related substances in pharmaceutical chemicals and of multiple components in traditional Chinese medicines requires large numbers of reference substances to identify chromatographic peaks accurately, but reference substances are costly. Thus, the relative retention (RR) method has been widely adopted in pharmacopoeias and in the literature for characterizing the HPLC behavior of reference substances that are unavailable. The problem is that the RR is difficult to reproduce on different columns because, in some cases, of the error between measured and predicted retention time (tR). Therefore, it is useful to develop an alternative, simple method for accurate prediction of tR. In the present study, based on the thermodynamic theory of HPLC, a method named linear calibration using two reference substances (LCTRS) was proposed. The method includes three steps: two-point prediction, validation by multiple-point regression, and sequential matching. The tR of compounds on an HPLC column can be calculated from standard retention times and a linear relationship. The method was validated with two medicines on 30 columns. It was demonstrated that the LCTRS method is simple, yet more accurate and more robust across different HPLC columns than the RR method. Hence, quality standards using the LCTRS method are easy to reproduce in different laboratories with a lower cost of reference substances.
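
    A minimal sketch of the two-point step of LCTRS as described above: the measured retention times of the two reference substances on the local column are fitted linearly against their standard retention times, and that line predicts the local retention times of the remaining compounds; the data are illustrative:

      def lctrs_predict(t_std_refs, t_meas_refs, t_std_targets):
          """Two-point linear calibration of HPLC retention times.

          t_std_refs:    standard retention times (min) of the two reference substances
          t_meas_refs:   their measured retention times on the local column
          t_std_targets: standard retention times of the compounds to be located
          """
          (x1, x2), (y1, y2) = t_std_refs, t_meas_refs
          slope = (y2 - y1) / (x2 - x1)
          intercept = y1 - slope * x1
          return [slope * t + intercept for t in t_std_targets]

      # Reference peaks at 8.0 and 20.0 min (standard), measured locally at 8.6 and 21.4 min
      print(lctrs_predict((8.0, 20.0), (8.6, 21.4), [12.0, 16.5]))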

  10. Multi-viewpoint Image Array Virtual Viewpoint Rapid Generation Algorithm Based on Image Layering

    NASA Astrophysics Data System (ADS)

    Jiang, Lu; Piao, Yan

    2018-04-01

    The use of a multi-view image array combined with virtual viewpoint generation technology to record 3D information in large scenes has become one of the key technologies for the development of integrated imaging. This paper presents a virtual viewpoint rendering method based on an image layering algorithm. First, the depth information of the reference viewpoint image is quickly obtained, using SAD as the similarity measure function. The reference image is then layered and the parallax is calculated from the depth information. Based on the relative distance between the virtual viewpoint and the reference viewpoint, the image layers are weighted and panned. Finally, the virtual viewpoint image is rendered layer by layer according to the distance between the image layers and the viewer. This method avoids the disadvantages of the DIBR algorithm, such as its high-precision requirements on the depth map and its complex mapping operations. Experiments show that the algorithm can synthesize virtual viewpoints at any position within a 2×2 viewpoint range with fast rendering. On average the method yields satisfactory image quality: relative to real viewpoint images, the mean SSIM reaches 0.9525, the PSNR reaches 38.353 dB and the image histogram similarity reaches 93.77%.
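
    A minimal sketch of the SAD (sum of absolute differences) matching step mentioned above, which scores candidate disparities between a block in the reference view and a neighbouring view; the window size and search range are illustrative, and the layering and weighted-panning stages of the algorithm are not reproduced:

      import numpy as np

      def sad_disparity(ref_view, other_view, y, x, block=4, max_disp=16):
          """Best horizontal disparity for the block centred at (y, x) in ref_view."""
          h, w = ref_view.shape
          y0, y1 = max(y - block, 0), min(y + block + 1, h)
          x0, x1 = max(x - block, 0), min(x + block + 1, w)
          patch = ref_view[y0:y1, x0:x1].astype(float)

          best_d, best_sad = 0, np.inf
          for d in range(max_disp + 1):
              if x0 - d < 0:
                  break
              cand = other_view[y0:y1, x0 - d:x1 - d].astype(float)
              sad = np.abs(patch - cand).sum()      # similarity measure function
              if sad < best_sad:
                  best_d, best_sad = d, sad
          return best_d                             # larger disparity = nearer layer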

  11. Two-body potential model based on cosine series expansion for ionic materials

    DOE PAGES

    Oda, Takuji; Weber, William J.; Tanigawa, Hisashi

    2015-09-23

    We examine a method to construct a two-body potential model for ionic materials with a Fourier series basis. In this method, the coefficients of the cosine basis functions are uniquely determined by solving simultaneous linear equations to minimize the sum of weighted mean square errors in energy, force and stress, where first-principles calculation results are used as the reference data. As a validation test of the method, potential models for magnesium oxide are constructed. The mean square errors converge appropriately with respect to the truncation of the cosine series. This result mathematically indicates that the constructed potential model is sufficiently close to the one that would be achieved with the non-truncated Fourier series and demonstrates that this potential virtually provides the minimum error from the reference data within the two-body representation. The constructed potential models work appropriately in both molecular statics and dynamics simulations, especially if a two-step correction to revise errors expected in the reference data is performed, and the models clearly outperform two existing Buckingham potential models that were tested. Moreover, the good agreement with first-principles calculations over a broad range of energies and forces should enable the prediction of materials behavior away from equilibrium conditions, such as a system under irradiation.
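
    A minimal sketch of the fitting step described above, restricted for brevity to energies: the pair potential is expanded in cosine basis functions of the interatomic distance, and the coefficients follow from a weighted linear least-squares fit to first-principles reference energies; forces and stresses would add further rows to the same linear system, and the cut-off, basis size and basis form are illustrative assumptions:

      import numpy as np

      def fit_cosine_potential(configs, ref_energies, n_terms=8, r_cut=6.0, weights=None):
          """configs: list of 1-D arrays, each holding the pair distances (< r_cut)
          of one reference configuration; ref_energies: first-principles energies."""
          A = np.zeros((len(configs), n_terms))
          for i, dists in enumerate(configs):
              r = np.asarray(dists, dtype=float)
              for k in range(n_terms):
                  # contribution of basis function k to the total pair energy of configuration i
                  A[i, k] = np.cos(k * np.pi * r / r_cut).sum()
          w = np.ones(len(configs)) if weights is None else np.asarray(weights, dtype=float)
          sw = np.sqrt(w)                           # weighted least squares
          coef, *_ = np.linalg.lstsq(A * sw[:, None],
                                     sw * np.asarray(ref_energies, dtype=float), rcond=None)
          return coef                               # cosine-series coefficients

      def pair_energy(r, coef, r_cut=6.0):
          """Two-body potential value at separation r for the fitted coefficients."""
          k = np.arange(len(coef))
          return float(np.dot(coef, np.cos(k * np.pi * r / r_cut)))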

  12. Quantitative fluorescence angiography for neurosurgical interventions.

    PubMed

    Weichelt, Claudia; Duscha, Philipp; Steinmeier, Ralf; Meyer, Tobias; Kuß, Julia; Cimalla, Peter; Kirsch, Matthias; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Morgenstern, Ute

    2013-06-01

    Present methods for quantitative measurement of cerebral perfusion during neurosurgical operations require additional technology for measurement, data acquisition, and processing. This study used conventional fluorescence video angiography, an established method to visualize blood flow in brain vessels, enhanced by a perfusion-quantifying software tool. For this purpose, the fluorescent dye indocyanine green is given intravenously and, after excitation by a near-infrared light source, the fluorescence signal is recorded. Video data are analyzed by software algorithms to allow quantification of the blood flow. Additionally, perfusion is measured intraoperatively by a reference system. Furthermore, comparative reference measurements using a flow phantom were performed to verify the quantitative blood flow results of the software and to validate the software algorithm. Analysis of intraoperative video data provides characteristic biological parameters. These parameters were implemented in the special flow phantom for experimental validation of the developed software algorithms. In addition, various factors that influence the determination of perfusion parameters were analyzed by means of mathematical simulation. Comparing patient measurements, phantom experiments, and computer simulations under certain conditions (variable frame rate, vessel diameter, etc.), the results of the software algorithms are within the range of parameter accuracy of the reference methods. Therefore, the software algorithm for calculating cortical perfusion parameters from video data provides a helpful intraoperative tool without complex additional measurement technology.
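
    A minimal sketch of the kind of curve analysis such a software tool performs: from the fluorescence time-intensity curve of a vessel region of interest, simple perfusion surrogates such as time-to-peak and maximum rise slope can be extracted; the parameter names and smoothing choice are illustrative, not those of the cited software:

      import numpy as np

      def perfusion_parameters(intensity, frame_rate):
          """intensity: 1-D fluorescence signal of a region of interest; frame_rate in Hz."""
          t = np.arange(len(intensity)) / frame_rate
          i = np.convolve(intensity, np.ones(5) / 5, mode="same")   # light smoothing
          baseline = i[:max(int(frame_rate), 1)].mean()             # pre-bolus level
          peak_idx = int(np.argmax(i))
          slope = np.gradient(i, t)
          return {
              "time_to_peak_s": t[peak_idx],
              "peak_over_baseline": i[peak_idx] - baseline,
              "max_rise_slope": float(slope[:peak_idx + 1].max()),  # intensity units per second
          }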

  13. Antifungal Susceptibility Testing in HIV/AIDS Patients: a Comparison Between Automated Machine and Manual Method.

    PubMed

    Nelwan, Erni J; Indrasanti, Evi; Sinto, Robert; Nurchaida, Farida; Sosrosumihardjo, Rustadi

    2016-01-01

    The aim was to evaluate the performance of the Vitek2 compact machine (Biomerieux Inc. ver 04.02, France) in reference to manual methods for Candida susceptibility testing among HIV/AIDS patients. A comparison study evaluating the Vitek2 compact machine against manual methods for Candida susceptibility testing among HIV/AIDS patients was done. Categorical agreement between manual disc diffusion and the Vitek2 machine was calculated using predefined criteria, and time to susceptibility result was measured for the automated and manual methods. There were 137 Candida isolates comprising eight Candida species, with C. albicans and C. glabrata as the first (56.2%) and second (15.3%) most common species, respectively. For fluconazole, 2.6% of C. albicans isolates were found resistant by the manual disc diffusion method and none by the Vitek2 machine, whereas 100% of C. krusei isolates were identified as resistant by both methods. Resistance rates of C. glabrata to fluconazole, voriconazole and amphotericin B were 52.4%, 23.8% and 23.8% by the manual disc diffusion method vs. 9.5%, 9.5% and 4.8% by the Vitek2 machine, respectively. Time to susceptibility result was shorter with the automated method than with the manual method for all Candida species. In conclusion, there is good categorical agreement between manual disc diffusion and the Vitek2 machine, except for C. glabrata in measuring antifungal resistance, and time to susceptibility result with the automated method is shorter for all Candida species.
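
    A minimal sketch of a categorical agreement calculation of the kind reported above: each isolate's susceptibility category from the manual method is compared with the automated call, and the proportion of identical categories is reported; breakpoint interpretation is assumed to have been done already, and the data are illustrative:

      def categorical_agreement(manual, automated):
          """manual, automated: equal-length lists of 'S', 'I' or 'R' calls per isolate."""
          assert len(manual) == len(automated)
          matches = sum(m == a for m, a in zip(manual, automated))
          return 100.0 * matches / len(manual)

      manual    = ["S", "S", "R", "R", "S", "I", "S", "R"]
      automated = ["S", "S", "R", "S", "S", "I", "S", "R"]
      print(f"categorical agreement: {categorical_agreement(manual, automated):.1f}%")   # 87.5%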

  14. An Improved Calibration Method for Hydrazine Monitors for the United States Air Force

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korsah, K

    2003-07-07

    This report documents the results of Phase 1 of the "Air Force Hydrazine Detector Characterization and Calibration Project". A method for calibrating model MDA 7100 hydrazine detectors in the United States Air Force (AF) inventory has been developed. The calibration system consists of a Kintek 491 reference gas generation system, a humidifier/mixer system which combines the dry reference hydrazine gas with humidified diluent or carrier gas to generate the required humidified reference for calibrations, and a gas sampling interface. The Kintek reference gas generation system itself is periodically calibrated using an ORNL-constructed coulometric titration system to verify the hydrazine concentration of the sample atmosphere in the interface module. The Kintek reference gas is then used to calibrate the hydrazine monitors. Thus, coulometric titration is only used to periodically assess the performance of the Kintek reference gas generation system, and is not required for hydrazine monitor calibrations. One advantage of using coulometric titration for verifying the concentration of the reference gas is that it is a primary standard (if used for simple solutions), thereby guaranteeing, in principle, that measurements will be traceable to SI units (i.e., to the mole). The effect of the humidity of the reference gas was characterized by using the concentrations determined by coulometric titration to develop a humidity correction graph for the Kintek 491 reference gas generation system. Using this calibration method, calibration uncertainty has been reduced by 50% compared to the current method used to calibrate hydrazine monitors in the Air Force inventory, and calibration time has also been reduced by more than 20%. Significant findings from studies documented in this report are the following: (1) The Kintek 491 reference gas generation system (generator, humidifier and interface module) can be used to calibrate hydrazine detectors. (2) The Kintek system output concentration is less than the calculated output of the generator alone, but the system can be calibrated as a whole by using coulometric titration of gas samples collected with impingers. (3) The calibrated Kintek system output concentration is reproducible even after the system has been disassembled, moved and reassembled. (4) The uncertainty of the reference gas concentration generated by the Kintek system is less than half the uncertainty of the Zellweger Analytics (ZA) reference gas concentration and can easily be lowered to one third or less of that of the ZA method by using lower-uncertainty flow rate or total flow measuring instruments. (5) The largest sources of uncertainty in the current ORNL calibration system are the permeation rate of the permeation tubes and the flow rate of the impinger sampling pump used to collect gas samples for calibrating the Kintek system; upgrading the measurement equipment, as stated in (4), can reduce both of these. (6) The coulometric titration technique can be used to periodically assess the performance of the Kintek system and determine a suitable recalibration interval. (7) The Kintek system has been used to calibrate two MDA 7100s and an Interscan 4187 in less than one workday, and the system can be upgraded (e.g., by automating it) to provide more calibrations per day. (8) The humidity of both the reference gas and the environment of the Chemcassette affects the MDA 7100 hydrazine detector's readings; however, ORNL believes that the environmental effect is less significant than the effect of the reference gas humidity. (9) The ORNL calibration method based on the Kintek 491 M-B gas standard can correct for the effect of the humidity of the reference gas to produce the same calibration as ZA's; Zellweger Analytics calibrations are typically performed at 45%-55% relative humidity. (10) Tests using the Interscan 4187 showed that the instrument was not accurate in its lower (0-100 ppb) range. Subsequent discussions with Kennedy Space Center (KSC) personnel also indicated that the Interscan units were not reproducible when new sensors were used. KSC had discovered that the Interscan units read incorrectly on the low range because of the presence of carbon dioxide. ORNL did not test the carbon dioxide effect, but it was found that the units did not read zero when a test gas containing no hydrazine was sampled. According to the KSC personnel with whom ORNL had these discussions, NASA is phasing out the use of these Interscan detectors.
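
    A minimal sketch of the dilution arithmetic behind a permeation-tube reference gas generator of this kind, converting an emission rate and a total diluent flow into a volume mixing ratio; the 24.46 L/mol molar volume (25 °C, 1 atm), the hydrazine molar mass and the humidity correction factor are assumptions for illustration, not values taken from the report:

      def reference_gas_ppb(emission_ng_per_min, total_flow_mL_per_min,
                            molar_mass_g_per_mol=32.05, molar_volume_L=24.46,
                            humidity_correction=1.0):
          """Volume mixing ratio (ppb) of the generated reference gas.

          ppb here means nL of analyte vapour per mL of total gas flow.
          """
          nmol_per_min = emission_ng_per_min / molar_mass_g_per_mol
          nL_per_min = nmol_per_min * molar_volume_L      # 1 nmol of gas occupies ~24.46 nL
          return humidity_correction * nL_per_min / total_flow_mL_per_min

      # e.g. 6500 ng/min of hydrazine diluted into 0.5 L/min of humidified air
      print(f"{reference_gas_ppb(6500, 500):.1f} ppb")     # roughly 10 ppb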

  15. Selection of reference genes is critical for miRNA expression analysis in human cardiac tissue. A focus on atrial fibrillation.

    PubMed

    Masè, Michela; Grasso, Margherita; Avogaro, Laura; D'Amato, Elvira; Tessarolo, Francesco; Graffigna, Angelo; Denti, Michela Alessandra; Ravelli, Flavia

    2017-01-24

    MicroRNAs (miRNAs) are emerging as key regulators of complex biological processes in several cardiovascular diseases, including atrial fibrillation (AF). Reverse transcription-quantitative polymerase chain reaction is a powerful technique to quantitatively assess miRNA expression profiles, but reliable results depend on proper data normalization by suitable reference genes. Despite the increasing number of studies assessing miRNAs in cardiac disease, no consensus on the best reference genes has been reached. This work aims to assess reference gene stability in human cardiac tissue with a focus on AF investigation. We evaluated the stability of five reference genes (U6, SNORD48, SNORD44, miR-16, and 5S) in atrial tissue samples from eighteen cardiac-surgery patients in sinus rhythm and AF. Stability was quantified by combining the BestKeeper, delta-Cq, geNorm, and NormFinder statistical tools. All methods assessed SNORD48 as the best and U6 as the worst reference gene. Application of different normalization strategies significantly impacted miRNA expression profiles in the study population. Our results point out the necessity of a consensus on data normalization in AF studies to avoid the emergence of divergent biological conclusions.

  16. Hidden Markov random field model and Broyden-Fletcher-Goldfarb-Shanno algorithm for brain image segmentation

    NASA Astrophysics Data System (ADS)

    Guerrout, EL-Hachemi; Ait-Aoudia, Samy; Michelucci, Dominique; Mahiou, Ramdane

    2018-05-01

    Many routine medical examinations produce images of patients suffering from various pathologies. Given the huge number of medical images, manual analysis and interpretation have become a tedious task, so automatic image segmentation has become essential for diagnosis assistance. Segmentation consists of dividing the image into homogeneous and significant regions. We focus on hidden Markov random fields, referred to as HMRF, to model the segmentation problem. This modelling leads to a classical function minimisation problem. The Broyden-Fletcher-Goldfarb-Shanno algorithm, referred to as BFGS, is one of the most powerful methods for solving unconstrained optimisation problems. In this paper, we investigate the combination of HMRF and the BFGS algorithm to perform the segmentation operation. The proposed method shows very good segmentation results compared with well-known approaches. The tests are conducted on brain magnetic resonance image databases (BrainWeb and IBSR) widely used to objectively compare the results obtained. The well-known Dice coefficient (DC) was used as the similarity metric. The experimental results show that, in many cases, the proposed method approaches perfect segmentation with a Dice coefficient above 0.9. Moreover, it generally outperforms the other methods in the tests conducted.
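
    A minimal sketch of the optimisation and evaluation ingredients named above: a simplified HMRF-style energy (Gaussian data term per class mean plus a Potts label-smoothness term) whose class means are updated with SciPy's BFGS implementation, alternated with label reassignment, and the Dice coefficient used as the similarity metric; this toy energy and scheme stand in for, and do not reproduce, the authors' exact formulation:

      import numpy as np
      from scipy.optimize import minimize

      def dice(seg, ref):
          """Dice coefficient between two binary masks (1 = perfect overlap)."""
          seg, ref = seg.astype(bool), ref.astype(bool)
          return 2.0 * np.logical_and(seg, ref).sum() / (seg.sum() + ref.sum())

      def energy(means, image, labels, beta=1.0):
          """Toy HMRF-style energy: Gaussian data term plus a Potts smoothness term."""
          data = 0.5 * ((image - means[labels]) ** 2).sum()
          smooth = (labels[:, 1:] != labels[:, :-1]).sum() + (labels[1:, :] != labels[:-1, :]).sum()
          return data + beta * smooth

      def segment(image, k=3, iters=5):
          """Alternate BFGS updates of the class means with label reassignment."""
          means = np.quantile(image, np.linspace(0.1, 0.9, k))
          labels = np.abs(image[..., None] - means).argmin(axis=-1)
          for _ in range(iters):
              means = minimize(energy, means, args=(image, labels), method="BFGS").x
              labels = np.abs(image[..., None] - means).argmin(axis=-1)
          return labels, means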

  17. Selection of reference genes for tissue/organ samples on day 3 fifth-instar larvae in silkworm, Bombyx mori.

    PubMed

    Wang, Genhong; Chen, Yanfei; Zhang, Xiaoying; Bai, Bingchuan; Yan, Hao; Qin, Daoyuan; Xia, Qingyou

    2018-06-01

    The silkworm, Bombyx mori, is one of the world's most economically important insects. Surveying variations in gene expression among multiple tissue/organ samples will provide clues for gene function assignments and will be helpful for identifying genes related to economic traits or specific cellular processes. To ensure their accuracy, commonly used gene expression quantification methods require a set of stable reference genes for data normalization. In this study, 24 candidate reference genes were assessed in 10 tissue/organ samples of day 3 fifth-instar B. mori larvae using geNorm and NormFinder. The results revealed that the combination of BGIBMGA003186 and BGIBMGA008209 was the optimum choice for normalizing the expression data of the B. mori tissue/organ samples. The most stable gene, BGIBMGA003186, is recommended if just one reference gene is used. Moreover, the commonly used reference gene encoding cytoplasmic actin was the least appropriate reference gene for the samples investigated. The reliability of the selected reference genes was further confirmed by evaluating the expression profiles of two cathepsin genes. Our results may be useful for future studies involving the quantification of relative gene expression levels in different tissue/organ samples of B. mori. © 2018 Wiley Periodicals, Inc.

  18. Method for Statically Checking an Object-oriented Computer Program Module

    NASA Technical Reports Server (NTRS)

    Bierhoff, Kevin M. (Inventor); Aldrich, Jonathan (Inventor)

    2012-01-01

    A method for statically checking an object-oriented computer program module includes the step of identifying objects within a computer program module, at least one of the objects having a plurality of references thereto, possibly from multiple clients. A discipline of permissions is imposed on the objects identified within the computer program module. The permissions enable tracking, from among a discrete set of changeable states, a subset of states each object might be in. A determination is made regarding whether the imposed permissions are violated by a potential reference to any of the identified objects. The results of the determination are output to a user.
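
    A toy illustration (in Python, not the patented system) of the underlying idea of tracking, for each object, the subset of states it might be in and flagging operations that are not permitted in every one of those states; the protocol table and the representation of a module as a trace of (object, operation) pairs are simplifying assumptions:

      # Declared protocol: which states permit each operation, and the resulting state.
      PERMITS = {"open": {"closed"}, "read": {"open"}, "write": {"open"}, "close": {"open"}}
      NEXT    = {"open": "open", "read": "open", "write": "open", "close": "closed"}

      def check_module(operations, initial="closed"):
          """operations: list of (object_name, operation) pairs extracted from a module."""
          possible = {}                  # object -> set of states it might currently be in
          errors = []
          for obj, op in operations:
              states = possible.get(obj, {initial})
              ok = states <= PERMITS[op]           # permitted in every possible state?
              if not ok:
                  errors.append((obj, op, sorted(states)))
              # After the operation the object is in the new state; if the check failed,
              # it might also still be in any of the old states.
              possible[obj] = {NEXT[op]} if ok else states | {NEXT[op]}
          return errors

      trace = [("log", "open"), ("log", "write"), ("log", "close"), ("log", "write")]
      print(check_module(trace))         # flags the write that follows the close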

  19. Assessment of air quality microsensors versus reference methods: The EuNetAir joint exercise

    NASA Astrophysics Data System (ADS)

    Borrego, C.; Costa, A. M.; Ginja, J.; Amorim, M.; Coutinho, M.; Karatzas, K.; Sioumis, Th.; Katsifarakis, N.; Konstantinidis, K.; De Vito, S.; Esposito, E.; Smith, P.; André, N.; Gérard, P.; Francis, L. A.; Castell, N.; Schneider, P.; Viana, M.; Minguillón, M. C.; Reimringer, W.; Otjes, R. P.; von Sicard, O.; Pohle, R.; Elen, B.; Suriano, D.; Pfister, V.; Prato, M.; Dipinto, S.; Penza, M.

    2016-12-01

    The 1st EuNetAir Air Quality Joint Intercomparison Exercise, organized in Aveiro (Portugal) from 13th to 27th October 2014, focused on the evaluation and assessment of environmental gas, particulate matter (PM) and meteorological microsensors versus standard air quality reference methods through an experimental urban air quality monitoring campaign. The IDAD-Institute of Environment and Development Air Quality Mobile Laboratory was placed at an urban traffic location in the city centre of Aveiro to conduct continuous measurements with standard equipment and reference analysers for CO, NOx, O3, SO2, PM10, PM2.5, temperature, humidity, wind speed and direction, solar radiation and precipitation. The comparison of the sensor data generated by different microsensor-systems installed side-by-side with reference analysers contributes to the assessment of the performance and accuracy of microsensor-systems in a real-world context, and supports their calibration and further development. The overall performance of the sensors in terms of their statistical metrics and measurement profile indicates significant differences in the results depending on the platform and on the sensors considered. In terms of pollutants, some promising results were observed for O3 (r2: 0.12-0.77), CO (r2: 0.53-0.87), and NO2 (r2: 0.02-0.89). For PM (r2: 0.07-0.36) and SO2 (r2: 0.09-0.20) the results show poor performance, with low correlation coefficients between the reference and microsensor measurements. These field observations under specific environmental conditions suggest that the relevant microsensor platforms, if supported by proper post-processing and data modelling tools, have enormous potential for new strategies in air quality control.

  20. Standard setting: comparison of two methods.

    PubMed

    George, Sanju; Haque, M Sayeed; Oyebode, Femi

    2006-09-14

    The outcome of assessments is determined by the standard-setting method used. There is a wide range of standard-setting methods, and the two used most extensively in undergraduate medical education in the UK are the norm-reference and the criterion-reference methods. The aims of the study were to compare these two standard-setting methods for a multiple-choice question examination and to estimate the test-retest and inter-rater reliability of the modified Angoff method. The norm-reference method of standard-setting (mean minus 1 SD) was applied to the 'raw' scores of 78 4th-year medical students on a multiple-choice question (MCQ) examination. Two panels of raters also set the standard using the modified Angoff method for the same multiple-choice question paper on two occasions (6 months apart). We compared the pass/fail rates derived from the norm-reference and the Angoff methods and also assessed the test-retest and inter-rater reliability of the modified Angoff method. The pass rate with the norm-reference method was 85% (66/78) and that with the Angoff method was 100% (78/78). The percentage agreement between the Angoff and norm-reference methods was 78% (95% CI 69%-87%). The modified Angoff method had an inter-rater reliability of 0.81-0.82 and a test-retest reliability of 0.59-0.74. There were significant differences in the outcomes of these two standard-setting methods, as shown by the difference in the proportion of candidates that passed and failed the assessment. The modified Angoff method was found to have good inter-rater reliability and moderate test-retest reliability.
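
    A minimal sketch of the two standard-setting calculations compared above: the norm-reference cut score (cohort mean minus one standard deviation) and a modified Angoff cut score (the mean of the judges' per-item probability estimates for a borderline candidate, summed over items); the scores and judge data are illustrative:

      import statistics as st

      def norm_reference_cut(scores):
          """Pass mark = cohort mean minus one standard deviation."""
          return st.mean(scores) - st.stdev(scores)

      def angoff_cut(judge_estimates):
          """judge_estimates: per item, the judges' estimated probabilities that a
          borderline candidate answers correctly; cut = sum of the item means."""
          return sum(st.mean(item) for item in judge_estimates)

      scores = [52, 61, 58, 70, 66, 49, 73, 64, 55, 68]               # raw MCQ scores (%)
      judges = [[0.60, 0.50, 0.70], [0.40, 0.50, 0.45], [0.80, 0.70, 0.75]]   # 3 items x 3 judges

      cut_norm = norm_reference_cut(scores)
      cut_angoff = 100 * angoff_cut(judges) / len(judges)             # rescaled to a percentage
      print(f"norm-reference cut: {cut_norm:.1f}%, Angoff cut: {cut_angoff:.1f}%")
      print("pass rate (norm-reference):", sum(s >= cut_norm for s in scores), "of", len(scores))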
