Sample records for laboratory statistical analysis

  1. Statistical Analysis Tools for Learning in Engineering Laboratories.

    ERIC Educational Resources Information Center

    Maher, Carolyn A.

    1990-01-01

    Described are engineering programs that have used automated data acquisition systems to implement data collection and analyze experiments. Applications include a biochemical engineering laboratory, heat transfer performance, engineering materials testing, mechanical system reliability, statistical control laboratory, thermo-fluid laboratory, and a…

  2. Data Analysis and Graphing in an Introductory Physics Laboratory: Spreadsheet versus Statistics Suite

    ERIC Educational Resources Information Center

    Peterlin, Primoz

    2010-01-01

    Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared analysing data collected in three selected experiments taken from an introductory physics laboratory, which include a linear dependence, a nonlinear dependence and a histogram. The merits of each method are compared. (Contains 7…

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plemons, R.E.; Hopwood, W.H. Jr.; Hamilton, J.H.

    For a number of years the Oak Ridge Y-12 Plant Laboratory has been analyzing coal, predominantly for the utilities department of the Y-12 Plant. All laboratory procedures, except a Leco sulfur method which used the Leco Instruction Manual as a reference, were written based on the ASTM coal analyses. Sulfur is currently analyzed by two methods, gravimetric and Leco. The laboratory has two major endeavors for monitoring the quality of its coal analyses: (1) a control program run by the Plant Statistical Quality Control Department, which submits one sample for every nine samples submitted by the utilities departments; the laboratory analyzes the control sample along with the utilities samples; and (2) an exchange program with the DOE Coal Analysis Laboratory in Bruceton, Pennsylvania. The Y-12 Laboratory submits to the DOE Coal Laboratory, on even-numbered months, a sample that Y-12 has analyzed; the DOE Coal Laboratory submits, on odd-numbered months, one of its analyzed samples to the Y-12 Plant Laboratory for analysis. The results of these control and exchange programs are monitored not only by laboratory personnel but also by Statistical Quality Control personnel, who provide statistical evaluations. After analysis and reporting of results, all utilities samples are retained by the laboratory until the coal contracts have been settled. The utilities departments are responsible for initiating and preparing the coal samples. Samples normally arrive at the laboratory ground to 4-mesh, reduced to 0.5-gallon quantities, and sealed in air-tight containers. Sample identification numbers and a Request for Analysis are generated by the utilities departments.

  4. United States Air Force Summer Research Program 1991. High School Apprenticeship Program (HSAP) Reports. Volume 11. Phillips Laboratory, Civil Engineering Laboratory

    DTIC Science & Technology

    1992-01-09

    Crystal Polymers, Tracy Reed, Geophysics Laboratory (GEO); Analysis of Model Output Statistics Thunderstorm Prediction Model, Frank Lasley … four hours to twenty-four hours. It was predicted that the dogbones would turn brown once they reached the approximate annealing temperature. … Hanscom AFB, Frank A. Lasley. Abstract: Model Output Statistics (MOS) Thunderstorm prediction information and Service A weather observations

  5. Integrating Statistical Mechanics with Experimental Data from the Rotational-Vibrational Spectrum of HCl into the Physical Chemistry Laboratory

    ERIC Educational Resources Information Center

    Findley, Bret R.; Mylon, Steven E.

    2008-01-01

    We introduce a computer exercise that bridges spectroscopy and thermodynamics using statistical mechanics and the experimental data taken from the commonly used laboratory exercise involving the rotational-vibrational spectrum of HCl. Based on the results from the analysis of their HCl spectrum, students calculate bulk thermodynamic properties…

  6. Upward Flame Propagation and Wire Insulation Flammability: 2006 Round Robin Data Analysis

    NASA Technical Reports Server (NTRS)

    Hirsch, David B.

    2007-01-01

    This viewgraph document reviews round-robin test results for flame propagation and flammability of different wire-insulation materials. The presentation focused on investigating data variability both within and between laboratories. Between-laboratory consistency was evaluated through the consistency statistic h, which indicates how one laboratory's cell average compares with the averages from other laboratories; within-laboratory consistency was evaluated through the consistency statistic k, which indicates how one laboratory's within-laboratory variability compares with the combined variability of the other laboratories. Extreme results were tested to determine whether they arose by chance or from nonrandom causes (human error, instrument calibration shift, non-adherence to procedures, etc.).
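    The abstract names the consistency statistics h and k but gives no formulas; the sketch below follows the standard interlaboratory-study (ASTM E691-style) definitions, with purely illustrative cell averages and standard deviations:

```python
from statistics import mean, stdev
from math import sqrt

def h_statistic(cell_averages):
    """Between-laboratory consistency h for each lab: deviation of the lab's
    cell average from the grand mean, scaled by the std dev of cell averages."""
    grand = mean(cell_averages)
    s_xbar = stdev(cell_averages)
    return [(x - grand) / s_xbar for x in cell_averages]

def k_statistic(cell_stdevs):
    """Within-laboratory consistency k for each lab: the lab's cell std dev
    scaled by the pooled (root-mean-square) repeatability std dev."""
    s_r = sqrt(sum(s * s for s in cell_stdevs) / len(cell_stdevs))
    return [s / s_r for s in cell_stdevs]

# Hypothetical round-robin: five labs' cell averages and within-lab std devs
h = h_statistic([10.1, 9.8, 10.0, 10.4, 9.7])
k = k_statistic([0.2, 0.3, 0.25, 0.6, 0.22])
```

By construction the h values sum to zero, and a k value well above 1 flags a laboratory whose scatter exceeds its peers' (here the fourth lab).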

  7. Proficiency Testing for Determination of Water Content in Toluene of Chemical Reagents by iteration robust statistic technique

    NASA Astrophysics Data System (ADS)

    Wang, Hao; Wang, Qunwei; He, Ming

    2018-05-01

    To investigate and improve the detection of water content in liquid chemical reagents by domestic laboratories, proficiency testing provider PT0031 (CNAS) organized a proficiency testing program for water content in toluene; 48 laboratories from 18 provinces/cities/municipalities took part. This paper describes the implementation of the program, including sample preparation and homogeneity and stability testing; presents the statistical results obtained with the iterative robust statistic technique; summarizes and analyzes the different test standards widely used in the laboratories; and puts forward technological suggestions for improving the quality of water-content testing. Satisfactory results were obtained by 43 laboratories, amounting to 89.6% of the participating laboratories.
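    The abstract does not specify which iterative robust technique was used; a common choice for proficiency-testing data is the winsorizing iteration in the spirit of ISO 13528 Algorithm A, sketched here under that assumption with hypothetical results:

```python
from statistics import mean, median, stdev

def robust_mean_std(values, tol=1e-6, max_iter=100):
    """Iterative robust mean/std dev (ISO 13528 Algorithm A style):
    start from the median and scaled MAD, clip values outside
    x* +/- 1.5*s*, recompute, and repeat until the estimates settle."""
    x = median(values)
    s = 1.483 * median(abs(v - x) for v in values)
    for _ in range(max_iter):
        lo, hi = x - 1.5 * s, x + 1.5 * s
        w = [min(max(v, lo), hi) for v in values]      # winsorize
        x_new, s_new = mean(w), 1.134 * stdev(w)       # unbiasing factor
        if abs(x_new - x) < tol and abs(s_new - s) < tol:
            break
        x, s = x_new, s_new
    return x, s

# Hypothetical reported water contents, one of them a gross outlier
x_star, s_star = robust_mean_std([10.0, 10.1, 9.9, 10.2, 9.8, 15.0])
```

Unlike the plain mean (pulled to about 10.8 by the outlier), the robust assigned value stays near the consensus of the bulk of the laboratories.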

  8. The estimation of the measurement results with using statistical methods

    NASA Astrophysics Data System (ADS)

    Velychko, O.; Gordiyenko, T.

    2015-02-01

    A number of international standards and guides describe various statistical methods that apply to the management, control, and improvement of processes, for the purpose of analyzing technical measurement results. This paper analyzes these international standards and guides on statistical methods for estimating measurement results, and their recommendations for application in laboratories. To support the analysis, cause-and-effect Ishikawa diagrams concerning the application of statistical methods to the estimation of measurement results are constructed.

  9. Applying Statistics in the Undergraduate Chemistry Laboratory: Experiments with Food Dyes.

    ERIC Educational Resources Information Center

    Thomasson, Kathryn; Lofthus-Merschman, Sheila; Humbert, Michelle; Kulevsky, Norman

    1998-01-01

    Describes several experiments to teach different aspects of the statistical analysis of data using household substances and a simple analysis technique. Each experiment can be performed in three hours. Students learn about treatment of spurious data, application of a pooled variance, linear least-squares fitting, and simultaneous analysis of dyes…
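    The abstract names pooled variance and linear least-squares fitting without showing them; a minimal ordinary least-squares fit, of the kind students might apply to absorbance versus dye concentration, can be sketched as follows (the data points are hypothetical):

```python
from statistics import mean

def least_squares(x, y):
    """Ordinary least-squares fit y = m*x + b.
    Slope m is the covariance of x and y over the variance of x;
    the intercept follows from the line passing through the means."""
    mx, my = mean(x), mean(y)
    m = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return m, my - m * mx

# Hypothetical calibration points lying exactly on y = 2x + 1
m, b = least_squares([0, 1, 2, 3], [1, 3, 5, 7])
```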

  10. Effectiveness of Podcasts Delivered on Mobile Devices as a Support for Student Learning During General Chemistry Laboratories

    NASA Astrophysics Data System (ADS)

    Powell, Cynthia B.; Mason, Diana S.

    2013-04-01

    Chemistry instructors in teaching laboratories provide expert modeling of techniques and cognitive processes and provide assistance to enrolled students that may be described as scaffolding interaction. Such student support is particularly important in laboratories taught with an inquiry-based curriculum. In a teaching laboratory with a low instructor-to-student ratio, mobile devices can provide a platform for expert modeling and scaffolding during the laboratory sessions. This research study provides data collected on the effectiveness of podcasts delivered as needed in a first-semester general chemistry laboratory setting. Podcasts with audio and visual tracks covering essential laboratory techniques and central concepts that aid in experimental design or data processing were prepared and made available for students to access on an as-needed basis on iPhones® or iPod touches®. Research focused on three areas: the extent of podcast usage, the numbers and types of interactions between instructors and student laboratory teams, and student performance on graded assignments. Data analysis indicates that, on average, the podcast treatment laboratory teams accessed a podcast 2.86 times per laboratory period during each week that podcasts were available. Comparison of interaction data for the lecture treatment and podcast treatment laboratory teams reveals that scaffolding interactions with instructors were statistically significantly fewer for teams that had podcast access rather than a pre-laboratory lecture. The implication is that student laboratory teams were able to gather laboratory information more effectively when it was presented in an on-demand podcast format than in a pre-laboratory lecture format. Finally, statistical analysis of data on student performance on graded assignments indicates no significant differences between outcome measures for the treatment groups when compared as cohorts. The only statistically significant difference appears in the sub-group of students who demonstrated a high level of class participation in the concurrent general chemistry lecture course; within this sub-group, students in the podcast treatment group earned a course average that was statistically significantly higher than that of the lecture treatment group.

  11. Statistical Analysis Experiment for Freshman Chemistry Lab.

    ERIC Educational Resources Information Center

    Salzsieder, John C.

    1995-01-01

    Describes a laboratory experiment dissolving zinc from galvanized nails in which data can be gathered very quickly for statistical analysis. The data have sufficient significant figures and the experiment yields a nice distribution of random errors. Freshman students can gain an appreciation of the relationships between random error, number of…

  12. Strengthening Statistics Graduate Programs with Statistical Collaboration--The Case of Hawassa University, Ethiopia

    ERIC Educational Resources Information Center

    Goshu, Ayele Taye

    2016-01-01

    This paper describes the experiences gained from the statistical collaboration center established at Hawassa University in May 2015 as part of the LISA 2020 [Laboratory for Interdisciplinary Statistical Analysis] network. The center has a setup similar to LISA at Virginia Tech. Statisticians are trained on how to become more effective scientific…

  13. A Laboratory Course for Teaching Laboratory Techniques, Experimental Design, Statistical Analysis, and Peer Review Process to Undergraduate Science Students

    ERIC Educational Resources Information Center

    Gliddon, C. M.; Rosengren, R. J.

    2012-01-01

    This article describes a 13-week laboratory course called Human Toxicology taught at the University of Otago, New Zealand. This course used a guided inquiry based laboratory coupled with formative assessment and collaborative learning to develop in undergraduate students the skills of problem solving/critical thinking, data interpretation and…

  14. SOCR: Statistics Online Computational Resource

    ERIC Educational Resources Information Center

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an…

  15. Podcast effectiveness as scaffolding support for students enrolled in first-semester general chemistry laboratories

    NASA Astrophysics Data System (ADS)

    Powell, Mary Cynthia Barton

    Podcasts covering essential first-semester general chemistry laboratory techniques and central concepts that aid in experimental design or data processing were prepared and made available for students to access on an as-needed basis on iPhones® or iPod touches®. Research focused on three areas: the extent of podcast usage, the numbers and types of interactions between instructors and research teams, and student performance on graded assignments. Data analysis indicates that the podcast treatment research teams accessed a podcast 2.86 times on average during each week that podcasts were available. Comparison of interaction data for the lecture treatment and podcast treatment research teams reveals that interactions with instructors were statistically significantly fewer for teams that had podcast access rather than a pre-laboratory lecture. The implication is that student research teams were able to gather laboratory information more effectively when it was presented in an on-demand podcast format. Finally, statistical analysis of data on student performance on graded assignments indicates no significant differences between outcome measures for the treatment groups when compared as cohorts. The only statistically significant difference is among students judged to be highly motivated; within this sub-group, students in the podcast treatment group earned a course average that was statistically significantly higher than that of the lecture treatment group. This research study provides some of the first data collected on the effectiveness of podcasts delivered as needed in a first-semester general chemistry laboratory setting.

  16. Statistical functions and relevant correlation coefficients of clearness index

    NASA Astrophysics Data System (ADS)

    Pavanello, Diego; Zaaiman, Willem; Colli, Alessandra; Heiser, John; Smith, Scott

    2015-08-01

    This article presents a statistical analysis of the sky conditions, during the years 2010 to 2012, for three different locations: the Joint Research Centre site in Ispra (Italy, European Solar Test Installation - ESTI laboratories), the site of the National Renewable Energy Laboratory in Golden (Colorado, USA), and the site of Brookhaven National Laboratories in Upton (New York, USA). The key parameter is the clearness index kT, a dimensionless expression of the global irradiance impinging upon a horizontal surface at a given instant of time. In the first part, the sky conditions are characterized using daily averages, giving a general overview of the three sites. In the second part the analysis is performed using data sets with a short-term resolution of 1 sample per minute, demonstrating remarkable properties of the statistical distributions of the clearness index, reinforced by a proof using fuzzy logic methods. Subsequently, time-dependent correlations between different meteorological variables are presented in terms of the Pearson and Spearman correlation coefficients, and a new coefficient is introduced.
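    The Pearson and Spearman coefficients mentioned above differ only in that Spearman's rho is Pearson's r computed on ranks, so it captures any monotonic rather than strictly linear dependence. A minimal sketch (ignoring rank ties, which real meteorological data would require handling):

```python
from statistics import mean
from math import sqrt

def pearson(x, y):
    """Pearson's r: linear correlation of the raw values."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

def ranks(v):
    """Assign ranks 1..n by sorted order (no tie handling in this sketch)."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0] * len(v)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    """Spearman's rho: Pearson's r applied to the ranks."""
    return pearson(ranks(x), ranks(y))

# A monotonic but nonlinear relation: rho is exactly 1, r is not
x, y = [1, 2, 3, 4, 5], [1, 8, 27, 64, 125]
```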

  17. Ultra-trace analysis of 41Ca in urine by accelerator mass spectrometry: an inter-laboratory comparison

    PubMed Central

    Jackson, George S.; Hillegonds, Darren J.; Muzikar, Paul; Goehring, Brent

    2013-01-01

    A 41Ca interlaboratory comparison between Lawrence Livermore National Laboratory (LLNL) and the Purdue Rare Isotope Laboratory (PRIME Lab) has been completed. Analysis of the ratios assayed by accelerator mass spectrometry (AMS) shows no statistically significant difference between the laboratories. Further, Bayesian analysis shows that the uncertainties reported by both facilities are correct, with the possibility of a slight under-estimation by one laboratory. Finally, the chemistry procedures used by the two facilities to produce CaF2 for the cesium sputter ion source are robust and do not yield any significant differences in the final result. PMID:24179312

  18. Experimental design of an interlaboratory study for trace metal analysis of liquid fuels [for aerospace vehicles]

    NASA Technical Reports Server (NTRS)

    Greenbauer-Seng, L. A.

    1983-01-01

    The accurate determination of trace metals in fuels is an important requirement in much of the research and development of alternative fuels for aerospace applications. Because certain metals have detrimental effects on fuel performance and fuel systems at part-per-million, and in some cases part-per-billion, levels, improved accuracy is required in determining these low-concentration elements. Accurate analyses are also required to ensure that analysis results are interchangeable between vendor, researcher, and end user for purposes of quality control. Previous interlaboratory studies have demonstrated the inability of different laboratories to agree on the results of metal analysis, particularly at low concentration levels, even though good precision is typically reported within a laboratory. An interlaboratory study was designed to gain statistical information about the sources of variation in the reported concentrations. Five participant laboratories were used on a fee basis and were not informed of the purpose of the analyses. The effects of laboratory, analytical technique, concentration level, and ashing additive were studied in four fuel types for 20 elements of interest. The prescribed sample preparation schemes (variations of dry ashing) were used by all of the laboratories. The analytical data were statistically evaluated using a computer program for the analysis-of-variance technique.

  19. Experimental toxicology: Issues of statistics, experimental design, and replication.

    PubMed

    Briner, Wayne; Kirwan, Jeral

    2017-01-01

    The difficulty of replicating experiments has drawn considerable attention. Issues with replication occur for a variety of reasons ranging from experimental design to laboratory errors to inappropriate statistical analysis. Here we review a variety of guidelines for statistical analysis, design, and execution of experiments in toxicology. In general, replication can be improved by using hypothesis driven experiments with adequate sample sizes, randomization, and blind data collection techniques. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Clustered Stomates in "Begonia": An Exercise in Data Collection & Statistical Analysis of Biological Space

    ERIC Educational Resources Information Center

    Lau, Joann M.; Korn, Robert W.

    2007-01-01

    In this article, the authors present a laboratory exercise in data collection and statistical analysis in biological space using clustered stomates on leaves of "Begonia" plants. The exercise can be done in middle school classes by students making their own slides and seeing imprints of cells, or at the high school level through collecting data of…

  1. Job Analysis Techniques for Restructuring Health Manpower Education and Training in the Navy Medical Department. Attachment 9. Laboratory QPCB Task Sort for Medical Laboratory Technology.

    ERIC Educational Resources Information Center

    Technomics, Inc., McLean, VA.

    This publication is Attachment 9 of a set of 16 computer listed QPCB task sorts, by career level, for the entire Hospital Corps and Dental Technician fields. Statistical data are presented in tabular form for a detailed listing of job duties in medical laboratory technology. (BT)

  2. Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beggs, W.J.

    1981-02-01

    This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.

  3. How to Engage Medical Students in Chronobiology: An Example on Autorhythmometry

    ERIC Educational Resources Information Center

    Rol de Lama, M. A.; Lozano, J. P.; Ortiz, V.; Sanchez-Vazquez, F. J.; Madrid, J. A.

    2005-01-01

    This contribution describes a new laboratory experience that improves medical students' learning of chronobiology by introducing them to basic chronobiology concepts as well as to methods and statistical analysis tools specific for circadian rhythms. We designed an autorhythmometry laboratory session where students simultaneously played the role…

  4. Errors in logic and statistics plague a meta-analysis

    USDA-ARS?s Scientific Manuscript database

    The non-target effects of transgenic insecticidal crops have been a topic of debate for over a decade, and many laboratory and field studies have addressed the issue in numerous countries. In 2009 Lovei et al. (Transgenic Insecticidal Crops and Natural Enemies: A Detailed Review of Laboratory Studies)...

  5. [Statistical approach to evaluate the occurrence of out-of-acceptable-range results and accuracy for antimicrobial susceptibility tests in an inter-laboratory quality control program].

    PubMed

    Ueno, Tamio; Matuda, Junichi; Yamane, Nobuhisa

    2013-03-01

    To evaluate the occurrence of out-of-acceptable-range results and the accuracy of antimicrobial susceptibility tests, we applied a new statistical tool to the Inter-Laboratory Quality Control Program established by the Kyushu Quality Control Research Group. First, we defined acceptable ranges of minimum inhibitory concentration (MIC) for broth microdilution tests and of inhibitory zone diameter for disk diffusion tests, on the basis of Clinical and Laboratory Standards Institute (CLSI) document M100-S21. In the analysis, more than two out-of-acceptable-range results in 20 tests were considered not allowable according to the CLSI document. Of the 90 participating laboratories, 46 (51%) experienced one or more out-of-acceptable-range results. A binomial test was then applied to each participating laboratory. The results indicated that the occurrences of out-of-acceptable-range results in 11 laboratories were significantly higher than the CLSI recommendation (allowable rate < or = 0.05). Standard deviation indices (SDI) were calculated using the reported results and the mean and standard deviation values for the respective antimicrobial agents tested. In the evaluation of accuracy, the mean value from each laboratory was statistically compared with zero using a Student's t-test. The results revealed that 5 of the 11 laboratories above reported erroneous test results that systematically drifted toward the resistant side. In conclusion, our statistical approach has enabled us to detect significantly higher occurrences and sources of interpretive errors in antimicrobial susceptibility tests; it can therefore provide additional information to improve the accuracy of test results in clinical microbiology laboratories.
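    Neither the SDI formula nor the binomial test is spelled out in the abstract; under the usual definitions (SDI as the peer-scaled deviation, and a binomial tail probability for the count of out-of-range results at an allowable rate of 0.05), a sketch looks like this:

```python
from math import comb

def sdi(result, peer_mean, peer_sd):
    """Standard deviation index: how far a lab's result sits from the
    peer-group mean, expressed in peer-group standard deviations."""
    return (result - peer_mean) / peer_sd

def binom_tail(n, k, p=0.05):
    """P(X >= k) for X ~ Binomial(n, p): the chance of seeing at least k
    out-of-range results in n tests if the true rate is the allowable p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# A lab reporting 12 where peers average 10 with SD 1 has SDI = 2
example_sdi = sdi(12.0, 10.0, 1.0)
# Chance of 2 or more out-of-range results in 20 tests at the 5% rate
p_two_or_more = binom_tail(20, 2)
```

A small tail probability would indicate that a laboratory's out-of-range count is unlikely to be due to chance alone.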

  6. Modification of Poisson Distribution in Radioactive Particle Counting.

    ERIC Educational Resources Information Center

    Drotter, Michael T.

    This paper focuses on radioactive particle counting statistics in laboratory and field applications, and is intended to aid the Health Physics technician's understanding of the effect of indeterminate errors on radioactive particle counting. It indicates that although the statistical analysis of radioactive disintegration is best described by a Poisson…
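    The Poisson model for counting statistics implies that a single measurement of N counts carries a standard deviation of sqrt(N), so the relative uncertainty shrinks as counts accumulate; a brief illustration:

```python
from math import sqrt, exp

def count_uncertainty(n_counts):
    """For Poisson-distributed counts the std dev is sqrt(N); return
    (absolute uncertainty, relative uncertainty) for a single count."""
    return sqrt(n_counts), sqrt(n_counts) / n_counts

def poisson_pmf(k, mu):
    """P(k counts | mean mu): the distribution governing decay counts,
    computed iteratively to avoid large factorials."""
    p = exp(-mu)
    for i in range(1, k + 1):
        p *= mu / i
    return p

# 10,000 counts give a 1% relative uncertainty
sigma, rel = count_uncertainty(10000)
```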

  7. Examining a Terrorist Network Using Contingency Table Analysis

    DTIC Science & Technology

    2011-08-01

    Mathematics and Statistics, with a minor in Actuarial Science. This is my second year as a summer student at the U.S. Army Research Laboratory (ARL) … After graduation, I plan on either attending graduate school to concentrate in applied statistics or becoming a mathematical statistician for the

  8. [Errors in laboratory daily practice].

    PubMed

    Larrose, C; Le Carrer, D

    2007-01-01

    Legislation set by the GBEA (Guide de bonne exécution des analyses) requires that, before performing an analysis, laboratory directors check both the nature of the samples and the patient's identity. Data processing of requisition forms, which identifies key errors, was established in 2000 and 2002 by the specialized biochemistry laboratory, with the contribution of the reception centre for biological samples. The laboratories follow strict acceptability criteria at reception as a starting point, then check requisition forms and biological samples. All errors are logged into the laboratory database, and analysis reports are sent to the care unit specifying the problems and their consequences for the analysis. The data are then assessed by the laboratory directors to produce monthly or annual statistical reports. These indicate the number of errors, which are then indexed to patient files to reveal the specific problem areas, allowing the laboratory directors to instruct the nurses and enable corrective action.

  9. Analysis of vehicle classification and truck weight data of the New England states

    DOT National Transportation Integrated Search

    1998-09-01

    This report presents a statistical analysis of 1995-96 classification and weigh-in-motion (WIM) data from 17 continuous traffic-monitoring sites in New England. It documents work performed by Oak Ridge National Laboratory in fulfillment of 'Analysis ...

  10. Outcomes assessment of a residency program in laboratory medicine.

    PubMed

    Morse, E E; Pisciotto, P T; Hopfer, S M; Makowski, G; Ryan, R W; Aslanzadeh, J

    1997-01-01

    During a downsizing of residency programs at a state university medical school, hospital-based residents' positions were eliminated. The study sought to characterize the residents who graduated from the Laboratory Medicine Program, to compare women graduates with men graduates, and to compare international medical graduates (IMGs) with United States graduates. An assessment of the 25-year program in laboratory medicine, which had graduated 100 residents, showed no statistically significant difference by chi-squared analysis in positions (laboratory director or staff), in certification (American Board of Pathology [and subspecialties], American Board of Medical Microbiology, American Board of Clinical Chemistry), or in academic appointments (assistant professor to full professor) when male graduates were compared with female graduates, or when graduates of American medical schools were compared with graduates of foreign medical schools. There were statistically significant associations by chi-squared analysis between directorship positions and board certification, and between academic appointments and board certification. Of the 100 graduates, there were 57 directors, 52 certified, and 41 with academic appointments. Twenty-two graduates (11 women and 11 men) attained all three.
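    For a 2x2 table such as certification versus directorship, the chi-squared test of association reduces to a closed form. The abstract does not give the underlying tables, so the counts below are hypothetical:

```python
def chi2_2x2(a, b, c, d):
    """Chi-squared statistic for a 2x2 contingency table
        [[a, b],
         [c, d]]
    e.g. rows = board certified yes/no, columns = director yes/no.
    Compared against the 1-degree-of-freedom critical value 3.841 (p = 0.05)."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Hypothetical counts suggesting certification is associated with directorship
stat = chi2_2x2(30, 22, 10, 38)
significant = stat > 3.841
```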

  11. Determination of reference ranges for elements in human scalp hair.

    PubMed

    Druyan, M E; Bass, D; Puchyr, R; Urek, K; Quig, D; Harmon, E; Marquardt, W

    1998-06-01

    Expected values, reference ranges, or reference limits are necessary to enable clinicians to apply analytical chemical data in the delivery of health care. Determination of reference ranges is not straightforward, in terms of either selecting a reference population or performing statistical analysis. In light of logistical, scientific, and economic obstacles, it is understandable that clinical laboratories often combine approaches in developing health-associated reference values. A laboratory may choose to: (1) validate the reference ranges of other laboratories, published data from clinical research, or both, through comparison with patients' test data; (2) base the laboratory's reference values on statistical analysis of results from specimens assayed by the clinical reference laboratory itself; (3) adopt standards or recommendations of regulatory agencies and governmental bodies; or (4) initiate population studies to validate transferred reference ranges or to determine them anew. Effects of external contamination and anecdotal information from clinicians may be considered. The clinical utility of hair analysis is well accepted for some elements; for others, it remains in the realm of clinical investigation. This article elucidates an approach to establishing reference ranges for elements in human scalp hair. Observed levels of analytes from hair specimens from both our laboratory's total patient population and a physician-defined healthy American population have been evaluated. Examination of levels of elements often associated with toxicity serves to exemplify the process of determining reference ranges in hair. In addition, the approach serves as a model for setting reference ranges for analytes in a variety of matrices.

  12. Lack of grading agreement among international hemostasis external quality assessment programs

    PubMed Central

    Olson, John D.; Jennings, Ian; Meijer, Piet; Bon, Chantal; Bonar, Roslyn; Favaloro, Emmanuel J.; Higgins, Russell A.; Keeney, Michael; Mammen, Joy; Marlar, Richard A.; Meley, Roland; Nair, Sukesh C.; Nichols, William L.; Raby, Anne; Reverter, Joan C.; Srivastava, Alok; Walker, Isobel

    2018-01-01

    Laboratory quality programs rely on internal quality control and external quality assessment (EQA). EQA programs provide unknown specimens for the laboratory to test; the laboratory's result is compared with those of other (peer) laboratories performing the same test. EQA programs assign target values using a variety of statistical tools, and a performance assessment of 'pass' or 'fail' is made. EQA provider members of the international organization External Quality Assurance in Thrombosis and Hemostasis took part in a study to compare the outcomes of performance analysis using the same data set of laboratory results. Eleven EQA organizations using eight different analytical approaches participated. Data for a normal and a prolonged activated partial thromboplastin time (aPTT) and a normal and a reduced factor VIII (FVIII) from 218 laboratories were sent to the EQA providers, who analyzed the data set using their own method of evaluation for aPTT and FVIII, determining the performance for each laboratory record in the data set. Providers also summarized their statistical approach to the assignment of target values and laboratory performance. Each laboratory record in the data set was graded pass/fail by all EQA providers for each of the four analytes. There was a lack of agreement in pass/fail grading among the EQA programs: discordance in grading was 17.9 and 11% for normal and prolonged aPTT results, respectively, and 20.2 and 17.4% for normal and reduced FVIII results, respectively. All EQA programs in this study employed statistical methods compliant with International Organization for Standardization standard ISO 13528, yet the evaluation of laboratory results for all four analytes showed remarkable grading discordance. PMID:29232255

  13. Interlaboratory comparability, bias, and precision for four laboratories measuring constituents in precipitation, November 1982-August 1983

    USGS Publications Warehouse

    Brooks, M.H.; Schroder, L.J.; Malo, B.A.

    1985-01-01

    Four laboratories were evaluated in their analysis of identical natural and simulated precipitation water samples. Interlaboratory comparability was evaluated using analysis of variance coupled with Duncan's multiple range test, and linear-regression models describing the relations between individual laboratory analytical results for natural precipitation samples. Results of the statistical analyses indicate that certain pairs of laboratories produce different results when analyzing identical samples. Analyte bias for each laboratory was examined using analysis of variance coupled with Duncan's multiple range test on data produced by the laboratories from the analysis of identical simulated precipitation samples. Bias for a given analyte produced by a single laboratory has been indicated when the laboratory mean for that analyte is shown to be significantly different from the mean for the most-probable analyte concentrations in the simulated precipitation samples. Ion-chromatographic methods for the determination of chloride, nitrate, and sulfate have been compared with the colorimetric methods that were also in use during the study period. Comparisons were made using analysis of variance coupled with Duncan's multiple range test for means produced by the two methods. Analyte precision for each laboratory has been estimated by calculating a pooled variance for each analyte. Estimated analyte precisions have been compared using F-tests, and differences in analyte precisions for laboratory pairs have been reported. (USGS)
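
    The precision comparison described above (pooled within-run variances compared by F-ratio) can be illustrated with a minimal sketch; the replicate data and grouping below are hypothetical, not the study's measurements.

```python
import statistics

def pooled_variance(groups):
    """Pool within-run variances across replicate groups for one analyte."""
    num = sum((len(g) - 1) * statistics.variance(g) for g in groups)
    den = sum(len(g) - 1 for g in groups)
    return num / den

# hypothetical replicate chloride determinations (mg/L) from two laboratories,
# two analytical runs each
lab_a = [[1.02, 1.05, 1.01], [0.98, 1.00, 1.03]]
lab_b = [[1.10, 0.95, 1.04], [0.92, 1.08, 1.00]]
var_a = pooled_variance(lab_a)
var_b = pooled_variance(lab_b)
# the F-ratio of the two pooled variances is compared to an F critical value
f_ratio = max(var_a, var_b) / min(var_a, var_b)
```
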

  14. Statistical analysis of the calibration procedure for personnel radiation measurement instruments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, W.J.; Bengston, S.J.; Kalbeitzer, F.L.

    1980-11-01

    Thermoluminescent analyzer (TLA) calibration procedures were used to estimate personnel radiation exposure levels at the Idaho National Engineering Laboratory (INEL). A statistical analysis is presented herein based on data collected over a six month period in 1979 on four TLA's located in the Department of Energy (DOE) Radiological and Environmental Sciences Laboratory at the INEL. The data were collected according to the day-to-day procedure in effect at that time. Both gamma and beta radiation models are developed. Observed TLA readings of thermoluminescent dosimeters are correlated with known radiation levels. This correlation is then used to predict unknown radiation doses from future analyzer readings of personnel thermoluminescent dosimeters. The statistical techniques applied in this analysis include weighted linear regression, estimation of systematic and random error variances, prediction interval estimation using Scheffe's theory of calibration, the estimation of the ratio of the means of two normal bivariate distributed random variables and their corresponding confidence limits according to Kendall and Stuart, tests of normality, experimental design, a comparison between instruments, and quality control.

  15. Analysis of Doppler radar windshear data

    NASA Technical Reports Server (NTRS)

    Williams, F.; Mckinney, P.; Ozmen, F.

    1989-01-01

    The objective of this analysis is to process Lincoln Laboratory Doppler radar data obtained during FLOWS testing at Huntsville, Alabama, in the summer of 1986, to characterize windshear events. The processing includes plotting velocity and F-factor profiles, histogram analysis to summarize statistics, and correlation analysis to demonstrate any correlation between different data fields.

  16. A Statistical Analysis of Student Questions in a Cell Biology Laboratory

    ERIC Educational Resources Information Center

    Keeling, Elena L.; Polacek, Kelly M.; Ingram, Ella L.

    2009-01-01

    Asking questions is an essential component of the practice of science, but question-asking skills are often underemphasized in science education. In this study, we examined questions written by students as they prepared for laboratory exercises in a senior-level cell biology class. Our goals were to discover 1) what types of questions students…

  17. Informal Statistics Help Desk

    NASA Technical Reports Server (NTRS)

    Young, M.; Koslovsky, M.; Schaefer, Caroline M.; Feiveson, A. H.

    2017-01-01

    Back by popular demand, the JSC Biostatistics Laboratory and LSAH statisticians are offering an opportunity to discuss your statistical challenges and needs. Take the opportunity to meet the individuals offering expert statistical support to the JSC community. Join us for an informal conversation about any questions you may have encountered with issues of experimental design, analysis, or data visualization. Get answers to common questions about sample size, repeated measures, statistical assumptions, missing data, multiple testing, time-to-event data, and when to trust the results of your analyses.

  18. Diagnostic evaluation of HER-2 as a molecular target: an assessment of accuracy and reproducibility of laboratory testing in large, prospective, randomized clinical trials.

    PubMed

    Press, Michael F; Sauter, Guido; Bernstein, Leslie; Villalobos, Ivonne E; Mirlacher, Martina; Zhou, Jian-Yuan; Wardeh, Rooba; Li, Yong-Tian; Guzman, Roberta; Ma, Yanling; Sullivan-Halley, Jane; Santiago, Angela; Park, Jinha M; Riva, Alessandro; Slamon, Dennis J

    2005-09-15

    To critically assess the accuracy and reproducibility of human epidermal growth factor receptor type 2 (HER-2) testing in outside/local community-based hospitals versus two centralized reference laboratories, and its effect on selection of women for trastuzumab (Herceptin)-based clinical trials. Breast cancer specimens from 2,600 women were prospectively evaluated by fluorescence in situ hybridization (FISH) for entry into Breast Cancer International Research Group (BCIRG) clinical trials for HER-2-directed therapies. HER-2 gene amplification by FISH was observed in 657 of the 2,502 (26%) breast cancers successfully analyzed. Among 2,243 breast cancers with central laboratory immunohistochemistry (10H8-IHC) analysis, 504 (22.5%) showed overexpression (2+ or 3+). Outside/local laboratories assessed HER-2 status by immunohistochemistry in 1,536 of these cases and by FISH in 131 cases. Overall, the HER-2 alteration status determined by outside/local immunohistochemistry showed a 79% agreement rate [kappa statistic, 0.56; 95% confidence interval (95% CI), 0.52-0.60] with FISH done by the central laboratories. The agreement rate comparing BCIRG central laboratory 10H8-IHC and outside/local laboratory immunohistochemistry was 77.5% (kappa statistic, 0.51; 95% CI, 0.46-0.55). Finally, HER-2 status, determined by unspecified FISH assay methods at outside/local laboratories, showed a 92% agreement rate (kappa statistic, 0.83; 95% CI, 0.73-0.93) with FISH done at the BCIRG central laboratories. Compared with the HER-2 status determined at centralized BCIRG reference laboratories, these results indicate superiority of FISH to accurately and reproducibly assess tumors for the HER-2 alteration at outside/local laboratories for entry to clinical trials.
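
    Agreement rates of this kind are summarized by Cohen's kappa, which corrects observed agreement for the agreement expected by chance. A minimal sketch with hypothetical counts (not the study's data) follows.

```python
def cohens_kappa(table):
    """Cohen's kappa from a 2x2 agreement table.

    table[i][j] = number of cases rated i by method 1 and j by method 2
    (e.g. rows = local FISH positive/negative, cols = central FISH).
    """
    n = sum(sum(row) for row in table)
    observed = sum(table[i][i] for i in range(2)) / n
    expected = sum(
        (sum(table[i]) / n) * (sum(row[i] for row in table) / n)
        for i in range(2)
    )
    return (observed - expected) / (1 - expected)

# hypothetical counts: 940 concordant cases, 60 discordant cases
kappa = cohens_kappa([[200, 25], [35, 740]])
```
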

  19. Teaching statistics in biology: using inquiry-based learning to strengthen understanding of statistical analysis in biology laboratory courses.

    PubMed

    Metz, Anneke M

    2008-01-01

    There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology, even if they had previously completed an introductory statistics course (9% improvement, p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study.

  20. Universal immunogenicity validation and assessment during early biotherapeutic development to support a green laboratory.

    PubMed

    Bautista, Ami C; Zhou, Lei; Jawa, Vibha

    2013-10-01

    Immunogenicity support during nonclinical biotherapeutic development can be resource intensive if supported by conventional methodologies. A universal indirect species-specific immunoassay can eliminate the need for biotherapeutic-specific anti-drug antibody immunoassays without compromising quality. By implementing the R's of sustainability (reduce, reuse, rethink), conservation of resources and greener laboratory practices were achieved in this study. Statistical analysis across four biotherapeutics supported identification of consistent product performance standards (cut points, sensitivity and reference limits) and a streamlined universal anti-drug antibody immunoassay method implementation strategy. We propose an efficient, fit-for-purpose, scientifically and statistically supported nonclinical immunogenicity assessment strategy. Utilization of a universal method and streamlined validation, while retaining comparability to conventional immunoassays and meeting the industry recommended standards, provides environmental credits in the scientific laboratory. Collectively, individual reductions in critical material consumption, energy usage, waste and non-environment friendly consumables, such as plastic and paper, support a greener laboratory environment.

  1. 49 CFR 40.111 - When and how must a laboratory disclose statistical summaries and other information it maintains?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... statistical summaries and other information it maintains? 40.111 Section 40.111 Transportation Office of the... Testing Laboratories § 40.111 When and how must a laboratory disclose statistical summaries and other information it maintains? (a) As a laboratory, you must transmit an aggregate statistical summary, by employer...

  2. Occupational safety and health status of medical laboratories in Kajiado County, Kenya.

    PubMed

    Tait, Fridah Ntinyari; Mburu, Charles; Gikunju, Joseph

    2018-01-01

    Despite the increasing interest in Occupational Safety and Health (OSH), few studies on OSH in medical laboratories in developing countries are available, even though a high number of injuries occur without proper documentation. It is estimated that every day 6,300 people die as a result of occupational accidents or work-related diseases, resulting in over 2.3 million deaths per year. Medical laboratories handle a wide range of materials and potentially dangerous pathogenic agents, exposing health workers to numerous potential hazards. This study evaluated the status of OSH in medical laboratories in Kajiado County, Kenya. The objectives included establishing biological, chemical, and physical hazards; reviewing medical laboratories' control measures; and enumerating factors hindering implementation of good practices in OSH. This was a cross-sectional descriptive study. Observation checklists, interview schedules, and structured questionnaires were used. The study was carried out in 108 medical laboratories among 204 sampled respondents. Data were analysed using the Statistical Package for the Social Sciences (SPSS) version 20. The commonest types of hazards in medical laboratories included bacteria (80%) for biological hazards, handling unlabelled and unmarked chemicals (38.2%) for chemical hazards, and dangerously placed laboratory equipment (49.5%) for physical hazards. According to Pearson's product-moment correlation analysis, not wearing personal protective equipment was statistically associated with exposure to hazards. Individual control measures were statistically significant at the 0.01 significance level. Only 65.1% of the factors influencing implementation of OSH in medical laboratories were identified. Training had the highest contribution to good OSH practices.

  3. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning. PMID:21451741

  4. Effects of single sex lab groups on physics self-efficacy, behavior, and academic performance

    NASA Astrophysics Data System (ADS)

    Hunt, Gary L.

    The purpose of this study was to investigate the relationships between the gender composition of a laboratory group and student behaviors, self-efficacy, and quiz performance, within the college physics laboratory. A student population was chosen and subdivided into two groups, which were assigned either same-sex or coed laboratory teams while executing identical laboratory activities and instruction. Assessments were carried out prior to instruction, during the course, and at the end of one semester's worth of instruction and laboratory activities. Students were assessed in three areas: behaviors exhibited during laboratory activities, self-efficacy, and scores on laboratory quizzes. Analyses considered the differences in outcomes after a single semester of physics laboratories that differed only in team gender organization. The results indicated that there were no statistically significant differences in behavior variables, self-efficacy, or laboratory quiz scores between same-sex teams and coed teams. There were also no statistically significant differences between genders, and no interaction effect present. In a post-hoc analysis of the individual behaviors data, it was noted that a practical difference is present in the individual behaviors exhibited by males and females. This difference implies a difference in how males and females successfully engage in the laboratory activities.

  5. Assessment of statistical uncertainty in the quantitative analysis of solid samples in motion using laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Cabalín, L. M.; González, A.; Ruiz, J.; Laserna, J. J.

    2010-08-01

    Statistical uncertainty in the quantitative analysis of solid samples in motion by laser-induced breakdown spectroscopy (LIBS) has been assessed. For this purpose, a LIBS demonstrator was designed and constructed in our laboratory. The LIBS system consisted of a laboratory-scale conveyor belt, a compact optical module and a Nd:YAG laser operating at 532 nm. The speed of the conveyor belt was variable and could be adjusted up to a maximum speed of 2 m s⁻¹. Statistical uncertainty in the analytical measurements was estimated in terms of precision (reproducibility and repeatability) and accuracy. The results obtained by LIBS on shredded scrap samples under real conditions have demonstrated that the analytical precision and accuracy of LIBS is dependent on the sample geometry, position on the conveyor belt and surface cleanliness. Flat, relatively clean scrap samples exhibited acceptable reproducibility and repeatability; by contrast, samples with an irregular shape or a dirty surface exhibited a poor relative standard deviation.

  6. Analysis of ground water by different laboratories: a comparison of chloride and nitrate data, Nassau and Suffolk counties, New York

    USGS Publications Warehouse

    Katz, Brian G.; Krulikas, Richard K.

    1979-01-01

    Water samples from wells in Nassau and Suffolk Counties were analyzed for chloride and nitrate. Two samples were collected at each well; one was analyzed by the U.S. Geological Survey, the other by a laboratory in the county from which the sample was taken. Results were compared statistically by paired-sample t-test to indicate the degree of uniformity among laboratory results. Chloride analyses from one of the three county laboratories differed significantly (0.95 confidence level) from that of a Geological Survey laboratory. For nitrate analyses, a significant difference (0.95 confidence level) was noted between results from two of the three county laboratories and the Geological Survey laboratory. The lack of uniformity among results reported by the participating laboratories indicates a need for continuing participation in a quality-assurance program and exercise of strong quality control from time of sample collection through analysis so that differences can be evaluated. (Kosco-USGS)
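
    The paired-sample t-test used in this comparison treats the two analyses of each split sample as a pair and tests whether the mean difference departs from zero. A minimal sketch follows; the well data shown are hypothetical.

```python
import statistics
from math import sqrt

def paired_t(x, y):
    """Paired-sample t statistic for split samples analyzed by two labs.

    Compare |t| to the t critical value with df = len(x) - 1.
    """
    d = [a - b for a, b in zip(x, y)]
    mean_d = statistics.mean(d)
    se = statistics.stdev(d) / sqrt(len(d))
    return mean_d / se

# hypothetical chloride results (mg/L) for the same wells, two laboratories
usgs = [12.1, 8.4, 15.0, 9.7, 11.2, 14.3]
county = [12.8, 9.1, 15.6, 10.5, 11.9, 15.0]
t = paired_t(usgs, county)  # a consistent offset between labs gives a large |t|
```
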

  7. Development and analysis of a meteorological database, Argonne National Laboratory, Illinois

    USGS Publications Warehouse

    Over, Thomas M.; Price, Thomas H.; Ishii, Audrey L.

    2010-01-01

    A database of hourly values of air temperature, dewpoint temperature, wind speed, and solar radiation from January 1, 1948, to September 30, 2003, primarily using data collected at the Argonne National Laboratory station, was developed for use in continuous-time hydrologic modeling in northeastern Illinois. Missing and apparently erroneous data values were replaced with adjusted values from nearby stations used as 'backup'. Temporal variations in the statistical properties of the data resulting from changes in measurement and data-storage methodologies were adjusted to match the statistical properties resulting from the data-collection procedures that have been in place since January 1, 1989. The adjustments were computed based on the regressions between the primary data series from Argonne National Laboratory and the backup series using data obtained during common periods; the statistical properties of the regressions were used to assign estimated standard errors to values that were adjusted or filled from other series. Each hourly value was assigned a corresponding data-source flag that indicates the source of the value and its transformations. An analysis of the data-source flags indicates that all the series in the database except dewpoint have a similar fraction of Argonne National Laboratory data, with about 89 percent for the entire period, about 86 percent from 1949 through 1988, and about 98 percent from 1989 through 2003. The dewpoint series, for which observations at Argonne National Laboratory did not begin until 1958, has only about 71 percent Argonne National Laboratory data for the entire period, about 63 percent from 1948 through 1988, and about 93 percent from 1989 through 2003, indicating a lower reliability of the dewpoint sensor. 
A basic statistical analysis of the filled and adjusted data series in the database, and a series of potential evapotranspiration computed from them using the computer program LXPET (Lamoreux Potential Evapotranspiration) also was carried out. This analysis indicates annual cycles in solar radiation and potential evapotranspiration that follow the annual cycle of extraterrestrial solar radiation, whereas temperature and dewpoint annual cycles are lagged by about 1 month relative to the solar cycle. The annual cycle of wind has a late summer minimum, and spring and fall maximums. At the annual time scale, the filled and adjusted data series and computed potential evapotranspiration have significant serial correlation and possibly have significant temporal trends. The inter-annual fluctuations of temperature and dewpoint are weakest, whereas those of wind and potential evapotranspiration are strongest.
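
    The regression-based filling of missing primary-station values from a backup series can be sketched as follows. The hourly temperatures are hypothetical, and the actual database work additionally adjusted for methodological changes and assigned estimated standard errors to filled values.

```python
import statistics

def fit_line(x, y):
    """Ordinary least-squares slope and intercept of y on x."""
    mx, my = statistics.mean(x), statistics.mean(y)
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def fill_missing(primary, backup):
    """Fill gaps (None) in the primary series using a regression on the backup,
    fitted over the hours when both series are present."""
    pairs = [(b, p) for p, b in zip(primary, backup) if p is not None]
    slope, intercept = fit_line([b for b, _ in pairs], [p for _, p in pairs])
    return [p if p is not None else slope * b + intercept
            for p, b in zip(primary, backup)]

# hypothetical hourly temperatures with a gap in the primary station record
primary = [10.0, 11.0, None, 13.0, 14.0]
backup = [10.5, 11.4, 12.6, 13.6, 14.4]
filled = fill_missing(primary, backup)
```
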

  8. Interlaboratory studies with the Chinese hamster V79 cell metabolic cooperation assay to detect tumor-promoting agents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bohrman, J.S.; Burg, J.R.; Elmore, E.

    1988-01-01

    Three laboratories participated in an interlaboratory study to evaluate the usefulness of the Chinese hamster V79 cell metabolic cooperation assay to predict the tumor-promoting activity of selected chemicals. Twenty-three chemicals of different chemical structures (phorbol esters, barbiturates, phenols, artificial sweeteners, alkanes, and peroxides) were chosen for testing based on in vivo promotion activities, as reported in the literature. Assay protocols and materials were standardized, and the chemicals were coded to facilitate unbiased evaluation. A chemical was tested only once in each laboratory, with one of the three laboratories testing only 15 out of 23 chemicals. Dunnett's test was used for statistical analysis. Chemicals were scored as positive (at least two concentration levels statistically different than control), equivocal (only one concentration statistically different), or negative. For 15 chemicals tested in all three laboratories, there was complete agreement among the laboratories for nine chemicals. For the 23 chemicals tested in only two laboratories, there was agreement on 16 chemicals. With the exception of the peroxides and alkanes, the metabolic cooperation data were in general agreement with in vivo data. However, an overall evaluation of the V79 cell system for predicting in vivo promotion activity was difficult because of the organ specificity of certain chemicals and/or the limited number of adequately tested nonpromoting chemicals.

  9. Permutation entropy and statistical complexity analysis of turbulence in laboratory plasmas and the solar wind.

    PubMed

    Weck, P J; Schaffner, D A; Brown, M R; Wicks, R T

    2015-02-01

    The Bandt-Pompe permutation entropy and the Jensen-Shannon statistical complexity are used to analyze fluctuating time series of three different turbulent plasmas: the magnetohydrodynamic (MHD) turbulence in the plasma wind tunnel of the Swarthmore Spheromak Experiment (SSX), drift-wave turbulence of ion saturation current fluctuations in the edge of the Large Plasma Device (LAPD), and fully developed turbulent magnetic fluctuations of the solar wind taken from the Wind spacecraft. The entropy and complexity values are presented as coordinates on the CH plane for comparison among the different plasma environments and other fluctuation models. The solar wind is found to have the highest permutation entropy and lowest statistical complexity of the three data sets analyzed. Both laboratory data sets have larger values of statistical complexity, suggesting that these systems have fewer degrees of freedom in their fluctuations, with SSX magnetic fluctuations having slightly less complexity than the LAPD edge I(sat). The CH plane coordinates are compared to the shape and distribution of a spectral decomposition of the wave forms. These results suggest that fully developed turbulence (solar wind) occupies the lower-right region of the CH plane, and that other plasma systems considered to be turbulent have less permutation entropy and more statistical complexity. This paper presents use of this statistical analysis tool on solar wind plasma, as well as on an MHD turbulent experimental plasma.
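
    The Bandt-Pompe permutation entropy used in this analysis can be computed in a few lines: slide a window over the series, count the ordinal patterns (rank orderings) of the windows, and normalize the Shannon entropy of the pattern distribution. A minimal sketch, with illustrative series rather than any of the plasma data sets, follows.

```python
import random
from collections import Counter
from math import log, factorial

def permutation_entropy(series, order=3):
    """Normalized Bandt-Pompe permutation entropy: 0 = fully ordered, 1 = maximal."""
    counts = Counter()
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # ordinal pattern: the argsort of the values within the window
        counts[tuple(sorted(range(order), key=lambda k: window[k]))] += 1
    total = sum(counts.values())
    h = -sum((c / total) * log(c / total) for c in counts.values())
    return h / log(factorial(order))

ramp = permutation_entropy(list(range(100)))  # a monotonic ramp has one pattern
random.seed(1)
noise = permutation_entropy([random.random() for _ in range(2000)])
```

    The statistical complexity used on the CH plane combines this entropy with a divergence of the pattern distribution from the uniform one, which is why fully developed turbulence can show high entropy but low complexity.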

  10. Exceedance statistics of accelerations resulting from thruster firings on the Apollo-Soyuz mission

    NASA Technical Reports Server (NTRS)

    Fichtl, G. H.; Holland, R. L.

    1981-01-01

    Spacecraft acceleration resulting from firings of vernier control system thrusters is an important consideration in the design, planning, execution and post-flight analysis of laboratory experiments in space. In particular, scientists and technologists involved with the development of experiments to be performed in space in many instances require statistical information on the magnitude and rate of occurrence of spacecraft accelerations. Typically, these accelerations are stochastic in nature, so that it is useful to characterize them in statistical terms. Statistics of spacecraft accelerations are summarized.

  11. Teaching Statistics in Biology: Using Inquiry-based Learning to Strengthen Understanding of Statistical Analysis in Biology Laboratory Courses

    PubMed Central

    2008-01-01

    There is an increasing need for students in the biological sciences to build a strong foundation in quantitative approaches to data analyses. Although most science, engineering, and math field majors are required to take at least one statistics course, statistical analysis is poorly integrated into undergraduate biology course work, particularly at the lower-division level. Elements of statistics were incorporated into an introductory biology course, including a review of statistics concepts and opportunity for students to perform statistical analysis in a biological context. Learning gains were measured with an 11-item statistics learning survey instrument developed for the course. Students showed a statistically significant 25% (p < 0.005) increase in statistics knowledge after completing introductory biology. Students improved their scores on the survey after completing introductory biology, even if they had previously completed an introductory statistics course (9% improvement, p < 0.005). Students retested 1 yr after completing introductory biology showed no loss of their statistics knowledge as measured by this instrument, suggesting that the use of statistics in biology course work may aid long-term retention of statistics knowledge. No statistically significant differences in learning were detected between male and female students in the study. PMID:18765754

  12. Transportation energy data book

    DOT National Transportation Integrated Search

    2008-01-01

    The Transportation Energy Data Book: Edition 27 is a statistical compendium prepared and published by Oak Ridge National Laboratory (ORNL) under contract with the Office of Planning, Budget Formulation, and Analysis, under the Energy Efficiency and R...

  13. Transportation energy data book

    DOT National Transportation Integrated Search

    2006-01-01

    The Transportation Energy Data Book: Edition 25 is a statistical compendium prepared and published by Oak Ridge National Laboratory (ORNL) under contract with the Office of Planning, Budget Formulation, and Analysis, under the Energy Efficiency and R...

  14. Secure and scalable deduplication of horizontally partitioned health data for privacy-preserving distributed statistical computation.

    PubMed

    Yigzaw, Kassaye Yitbarek; Michalas, Antonis; Bellika, Johan Gustav

    2017-01-03

    Techniques have been developed to compute statistics on distributed datasets without revealing private information except the statistical results. However, duplicate records in a distributed dataset may lead to incorrect statistical results. Therefore, to increase the accuracy of the statistical analysis of a distributed dataset, secure deduplication is an important preprocessing step. We designed a secure protocol for the deduplication of horizontally partitioned datasets with deterministic record linkage algorithms. We provided a formal security analysis of the protocol in the presence of semi-honest adversaries. The protocol was implemented and deployed across three microbiology laboratories located in Norway, and we ran experiments on the datasets in which the number of records for each laboratory varied. Experiments were also performed on simulated microbiology datasets and data custodians connected through a local area network. The security analysis demonstrated that the protocol protects the privacy of individuals and data custodians under a semi-honest adversarial model. More precisely, the protocol remains secure with the collusion of up to N - 2 corrupt data custodians. The total runtime for the protocol scales linearly with the addition of data custodians and records. One million simulated records distributed across 20 data custodians were deduplicated within 45 s. The experimental results showed that the protocol is more efficient and scalable than previous protocols for the same problem. The proposed deduplication protocol is efficient and scalable for practical uses while protecting the privacy of patients and data custodians.
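
    The deterministic-record-linkage idea underlying the deduplication can be sketched without the cryptographic machinery: each custodian derives a deterministic linkage key from normalized identifying fields, and records sharing a key are treated as duplicates. The field names and records below are hypothetical, and the published protocol additionally protects these linkage values from disclosure to other parties.

```python
import hashlib

def linkage_key(record):
    """Deterministic linkage key: hash of normalized identifying fields.

    In the secure protocol such values are exchanged under cryptographic
    protection; this sketch shows only the deterministic-linkage step.
    """
    basis = "|".join(str(record[f]).strip().lower() for f in ("name", "dob"))
    return hashlib.sha256(basis.encode()).hexdigest()

def deduplicate(records):
    """Keep the first record for each linkage key, drop later duplicates."""
    seen, unique = set(), []
    for r in records:
        k = linkage_key(r)
        if k not in seen:
            seen.add(k)
            unique.append(r)
    return unique

# hypothetical laboratory records; the second is a formatting-variant duplicate
labs = [
    {"name": "Ann Berg", "dob": "1970-01-02", "result": "E. coli"},
    {"name": "ann berg ", "dob": "1970-01-02", "result": "E. coli"},
    {"name": "Per Haug", "dob": "1985-06-30", "result": "S. aureus"},
]
unique = deduplicate(labs)
```
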

  15. Statistical analysis of DOE EML QAP data from 1982 to 1998.

    PubMed

    Mizanur Rahman, G M; Isenhour, T L; Larget, B; Greenlaw, P D

    2001-01-01

    The historical database from the Environmental Measurements Laboratory's Quality Assessment Program from 1982 to 1998 has been analyzed to determine control limits for future performance evaluations of the different laboratories contracted to the U.S. Department of Energy. Seventy-three radionuclides in four different matrices (air filter, soil, vegetation, and water) were analyzed. The evaluation criteria were established based on a z-score calculation.
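
    Control limits derived from such a historical database can be sketched as follows: Shewhart-style limits at the historical mean plus or minus k standard deviations, which is equivalent to requiring |z| ≤ k for a future result. The reported-to-known ratios below are hypothetical, not values from the EML database.

```python
import statistics

def control_limits(historical, k=3):
    """Shewhart-style limits from historical results; equivalent to |z| <= k."""
    mean = statistics.mean(historical)
    sd = statistics.stdev(historical)
    return mean - k * sd, mean + k * sd

# hypothetical reported-to-known activity ratios for one analyte/matrix pair
ratios = [0.98, 1.02, 1.01, 0.97, 1.03, 0.99, 1.00, 1.04, 0.96, 1.01]
lo, hi = control_limits(ratios)
in_control = lo <= 1.05 <= hi  # grade a future result against the limits
```
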

  16. DNA Fingerprinting in a Forensic Teaching Experiment

    ERIC Educational Resources Information Center

    Wagoner, Stacy A.; Carlson, Kimberly A.

    2008-01-01

    This article presents an experiment designed to provide students, in a classroom laboratory setting, a hands-on demonstration of the steps used in DNA forensic analysis by performing DNA extraction, DNA fingerprinting, and statistical analysis of the data. This experiment demonstrates how DNA fingerprinting is performed and how long it takes. It…

  17. Statistical analysis of an inter-laboratory comparison of small-scale safety and thermal testing of RDX

    DOE PAGES

    Brown, Geoffrey W.; Sandstrom, Mary M.; Preston, Daniel N.; ...

    2014-11-17

    In this study, the Integrated Data Collection Analysis (IDCA) program has conducted a proficiency test for small-scale safety and thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results from this test for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Class 5 Type II standard. The material was tested as a well-characterized standard several times during the proficiency test to assess differences among participants and the range of results that may arise for well-behaved explosive materials.

  18. Considering whether Medicaid is worth the cost: revisiting the Oregon Health Study.

    PubMed

    Muennig, Peter A; Quan, Ryan; Chiuzan, Codruta; Glied, Sherry

    2015-05-01

    The Oregon Health Study was a groundbreaking experiment in which uninsured participants were randomized to either apply for Medicaid or stay with their current care. The study showed that Medicaid produced numerous important socioeconomic and health benefits but had no statistically significant impact on hypertension, hypercholesterolemia, or diabetes. Medicaid opponents interpreted the findings to mean that Medicaid is not a worthwhile investment. Medicaid proponents viewed the experiment as statistically underpowered and, irrespective of the laboratory values, suggestive that Medicaid is a good investment. We tested these competing claims and, using a sensitive joint test and statistical power analysis, confirmed that the Oregon Health Study did not improve laboratory values. However, we also found that Medicaid is a good value, with a cost of just $62,000 per quality-adjusted life-year gained.

  19. GHEP-ISFG collaborative exercise on mixture profiles (GHEP-MIX06). Reporting conclusions: Results and evaluation.

    PubMed

    Barrio, P A; Crespillo, M; Luque, J A; Aler, M; Baeza-Richer, C; Baldassarri, L; Carnevali, E; Coufalova, P; Flores, I; García, O; García, M A; González, R; Hernández, A; Inglés, V; Luque, G M; Mosquera-Miguel, A; Pedrosa, S; Pontes, M L; Porto, M J; Posada, Y; Ramella, M I; Ribeiro, T; Riego, E; Sala, A; Saragoni, V G; Serrano, A; Vannelli, S

    2018-07-01

    One of the main goals of the Spanish and Portuguese-Speaking Group of the International Society for Forensic Genetics (GHEP-ISFG) is to promote and contribute to the development and dissemination of scientific knowledge in the field of forensic genetics. To this end, GHEP-ISFG runs several working commissions that develop activities on scientific aspects of general interest. One of them, the Mixture Commission of GHEP-ISFG, has organized annually, since 2009, a collaborative exercise on the analysis and interpretation of autosomal short tandem repeat (STR) mixture profiles. Six exercises have been organized to date. In the present edition (GHEP-MIX06), with 25 participating laboratories, the main aim of the exercise was to assess mixture profile results reported for a proposed complex mock case. One conclusion from this exercise is the increasing tendency of participating laboratories to validate DNA mixture profile analysis following international recommendations. However, the results showed differences among laboratories in both the editing and the interpretation of mixture profiles. Moreover, although the latest revision of ISO/IEC 17025:2017 gives indications of how results should be reported, not all laboratories strictly follow its recommendations. Regarding the statistical aspect, all laboratories that performed a statistical evaluation of the data employed the likelihood ratio (LR) as the parameter for evaluating statistical compatibility. However, the LR values obtained show a wide range of variation. This cannot be attributed to the software employed, since the vast majority of laboratories that performed LR calculations used the same software (LRmixStudio). Thus, the final allelic composition of the edited mixture profile and the parameters employed in the software could explain this dispersion. This highlights the need for each laboratory to define, through internal validation, its criteria for editing and interpreting mixtures, and to train continuously in software handling. Copyright © 2018 Elsevier B.V. All rights reserved.
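
    For context, the LR weighs the evidence under two competing hypotheses, and per-locus ratios are multiplied across independent loci. The numbers below are hypothetical; real mixture LRs (e.g., as computed by LRmixStudio) additionally model dropout, drop-in, and population allele frequencies.

```python
from math import prod

def locus_lr(p_e_given_hp, p_e_given_hd):
    # Single-locus likelihood ratio: P(E | Hp) / P(E | Hd).
    return p_e_given_hp / p_e_given_hd

def combined_lr(locus_probs):
    # Assuming independent loci, the overall LR is the product of per-locus LRs.
    return prod(locus_lr(hp, hd) for hp, hd in locus_probs)

# Hypothetical per-locus probabilities of the observed mixture profile
# if the suspect contributed (Hp) vs. if an unknown person did (Hd).
loci = [(0.8, 0.05), (0.6, 0.10), (0.9, 0.20)]
lr = combined_lr(loci)  # roughly 16 x 6 x 4.5 = 432
```

    Because per-locus ratios multiply, small differences in how each laboratory edits the profile at individual loci compound into the wide overall LR ranges reported above.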

  20. Statistical compilation of NAPAP chemical erosion observations

    USGS Publications Warehouse

    Mossotti, Victor G.; Eldeeb, A. Raouf; Reddy, Michael M.; Fries, Terry L.; Coombs, Mary Jane; Schmiermund, Ron L.; Sherwood, Susan I.

    2001-01-01

    In the mid-1980s, the National Acid Precipitation Assessment Program (NAPAP), in cooperation with the National Park Service (NPS) and the U.S. Geological Survey (USGS), initiated a Materials Research Program (MRP) that included a series of field and laboratory studies with the broad objective of providing scientific information on acid rain effects on calcareous building stone. Among the several effects investigated, the chemical dissolution of limestone and marble by rainfall was given particular attention because of the pervasive appearance of erosion effects on cultural materials situated outdoors. In order to track the chemical erosion of stone objects in the field and in the laboratory, the Ca2+ ion concentration was monitored in the runoff solution from a variety of test objects located both outdoors and under more controlled conditions in the laboratory. This report provides a graphical and statistical overview of the Ca2+ chemistry in the runoff solutions from (1) five urban and rural sites (DC, NY, NJ, NC, and OH) established by the MRP for materials studies over the period 1984 to 1989, (2) a subevent study at the New York MRP site, (3) an in situ study of limestone and marble monuments at Gettysburg, (4) laboratory experiments on calcite dissolution conducted by Baedecker, (5) laboratory simulations by Schmiermund, and (6) a laboratory investigation of the surface reactivity of calcareous stone conducted by Fries and Mossotti. The graphical representations provided a means for identifying erroneous data that can randomly appear in a database when field operations are semi-automated; a purged database suitable for the evaluation of quantitative models of stone erosion is appended to this report. An analysis of the sources of statistical variability in the data revealed that the rate of stone erosion is weakly dependent on the type of calcareous stone, the ambient temperature, and the H+ concentration delivered in the incident rain. The analysis also showed that the rate of stone erosion is strongly dependent on the rain-delivery conditions and on the surface morphology and orientation.

  1. Forensic analysis of explosives using isotope ratio mass spectrometry (IRMS)--part 2: forensic inter-laboratory trial: bulk carbon and nitrogen stable isotopes in a range of chemical compounds (Australia and New Zealand).

    PubMed

    Benson, Sarah J; Lennard, Christopher J; Maynard, Philip; Hill, David M; Andrew, Anita S; Neal, Ken; Stuart-Williams, Hilary; Hope, Janet; Walker, G Stewart; Roux, Claude

    2010-01-01

    Comparability of data over time and between laboratories is a key issue for consideration in the development of global databases, and more broadly for quality assurance in general. One mechanism that can be utilized for evaluating traceability is an inter-laboratory trial. This paper addresses an inter-laboratory trial conducted across a number of Australian and New Zealand isotope ratio mass spectrometry (IRMS) laboratories. The main objective of this trial was to determine whether IRMS laboratories in these countries would record comparable values for the distributed samples. Four carbon-containing and four nitrogen-containing compounds were distributed to seven laboratories in Australia and one in New Zealand. The laboratories were requested to analyze the samples using their standard procedures. The data from each laboratory were evaluated collectively using International Standard ISO 13528 (Statistical methods for use in proficiency testing by inter-laboratory comparisons). "Warning signals" were raised against one participant in this trial. "Action signals" requiring corrective action were raised against four participants. These participants reviewed the data and possible sources for the discrepancies. This inter-laboratory trial was successful in providing an initial snapshot of the potential for traceability between the participating laboratories. The statistical methods described in this article could be used as a model for others needing to evaluate stable isotope results derived from multiple laboratories, e.g., inter-laboratory trials/proficiency testing. Ongoing trials will be conducted to improve traceability across the Australian and New Zealand IRMS community.

  2. Evaluation of Resilient Modulus of Subgrade and Base Materials in Indiana and Its Implementation in MEPDG

    PubMed Central

    Siddiki, Nayyarzia; Nantung, Tommy; Kim, Daehyeon

    2014-01-01

    In order to implement MEPDG hierarchical inputs for unbound and subgrade soil, a database containing subgrade MR, index properties, standard Proctor, and laboratory MR for 140 undisturbed roadbed soil samples from six different districts in Indiana was created. The MR data were categorized in accordance with the AASHTO soil classifications and divided into several groups. Based on each group, this study developed statistical analyses and evaluation datasets to validate stress-based regression models. These models were evaluated using analysis of variance (ANOVA) and the Z-test, and pertinent material constants (k1, k2, and k3) were determined for different soil types. Reasonably good correlations of the material constants, along with MR, with routine soil properties were established. Furthermore, falling weight deflectometer (FWD) tests were conducted on several Indiana highways in different seasons, and laboratory resilient modulus tests were performed on the subgrade soils that were collected from the FWD test sites. A comparison was made of the resilient moduli obtained from the laboratory resilient modulus tests with those from the FWD tests. Correlations between the laboratory resilient modulus and the FWD modulus were developed and are discussed in this paper. PMID:24701162

  3. The breaking load method - Results and statistical modification from the ASTM interlaboratory test program

    NASA Technical Reports Server (NTRS)

    Colvin, E. L.; Emptage, M. R.

    1992-01-01

    The breaking load test provides quantitative stress corrosion cracking data by determining the residual strength of tension specimens that have been exposed to corrosive environments. Eight laboratories have participated in a cooperative test program under the auspices of ASTM Committee G-1 to evaluate the new test method. All eight laboratories were able to distinguish between three tempers of aluminum alloy 7075. The statistical analysis procedures that were used in the test program do not work well in all situations. An alternative procedure using Box-Cox transformations shows a great deal of promise. An ASTM standard method has been drafted which incorporates the Box-Cox procedure.
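
    The Box-Cox transformation itself is simple; in practice the exponent λ is chosen by maximum likelihood (e.g., scipy.stats.boxcox). A minimal sketch of the transform, with hypothetical breaking-load data and a hypothetical λ:

```python
from math import log

def box_cox(x, lam):
    # Box-Cox power transform: (x**lam - 1)/lam for lam != 0, ln(x) at lam == 0.
    # Requires x > 0 (residual breaking strengths are positive).
    if lam == 0:
        return log(x)
    return (x ** lam - 1.0) / lam

# Hypothetical residual breaking strengths, transformed at lambda = 0.5.
strengths = [410.0, 385.0, 402.0, 377.0]
transformed = [box_cox(x, 0.5) for x in strengths]
```

    The transform's appeal for stress-corrosion data is that a suitable λ can stabilize variance across tempers, which is why the ASTM draft procedure builds on it.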

  4. Exceedance statistics of accelerations resulting from thruster firings on the Apollo-Soyuz mission

    NASA Technical Reports Server (NTRS)

    Fichtl, G. H.; Holland, R. L.

    1983-01-01

    Spacecraft acceleration resulting from firings of vernier control system thrusters is an important consideration in the design, planning, execution, and post-flight analysis of laboratory experiments in space. In particular, scientists and technologists involved with the development of experiments to be performed in space in many instances require statistical information on the magnitude and rate of occurrence of spacecraft accelerations. Typically, these accelerations are stochastic in nature, so it is useful to characterize them in statistical terms. Statistics of spacecraft accelerations are summarized. Previously announced in STAR as N82-12127.

  5. An approach to quality and performance control in a computer-assisted clinical chemistry laboratory.

    PubMed Central

    Undrill, P E; Frazer, S C

    1979-01-01

    A locally developed, computer-based clinical chemistry laboratory system has been in operation since 1970. This utilises a Digital Equipment Co Ltd PDP 12 and an interconnected PDP 8/F computer. Details are presented of the performance and quality control techniques incorporated into the system. Laboratory performance is assessed through analysis of results from fixed-level control sera as well as from cumulative sum methods. At a simple level the presentation may be considered purely indicative, while at a more sophisticated level statistical concepts have been introduced to aid the laboratory controller in decision-making processes. PMID:438340

  6. Online Statistical Modeling (Regression Analysis) for Independent Responses

    NASA Astrophysics Data System (ADS)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially for modelling the relationship between response and explanatory variables. Statistical models have been developed in various directions to handle diverse types of data and complex relationships. Rich varieties of advanced and recent statistical models are available mostly in open-source software (one of them being R). However, these advanced statistical models are not very friendly to novice R users, since they are based on programming scripts or a command-line interface. Our research aims to develop a web interface (based on R and Shiny) so that the most recent and advanced statistical models are readily available, accessible, and applicable on the web. We previously built interfaces in the form of e-tutorials for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM, and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including models using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible in our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and easier to compare, in order to find the most appropriate model for the data.
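
    As one example of the computer-intensive methods mentioned, a percentile bootstrap confidence interval can be sketched with the standard library alone. The data, seed, and resample count below are hypothetical; this is not the paper's R/Shiny implementation.

```python
import random
from statistics import mean

def bootstrap_ci(data, stat=mean, n_boot=2000, alpha=0.05, seed=42):
    # Percentile bootstrap: resample with replacement, take empirical quantiles
    # of the resampled statistic as the interval endpoints.
    rng = random.Random(seed)
    boots = sorted(stat(rng.choices(data, k=len(data))) for _ in range(n_boot))
    lo = boots[int(alpha / 2 * n_boot)]
    hi = boots[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

data = [4.1, 5.0, 3.8, 4.6, 5.2, 4.4, 4.9, 4.3]
lo, hi = bootstrap_ci(data)  # an interval straddling the sample mean
```
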

  7. Quality Evaluation of Zirconium Dioxide Frameworks Produced in Five Dental Laboratories from Different Countries.

    PubMed

    Schneebeli, Esther; Brägger, Urs; Scherrer, Susanne S; Keller, Andrea; Wittneben, Julia G; Hicklin, Stefan P

    2017-07-01

    The aim of this study was to assess and compare quality as well as economic aspects of CAD/CAM high strength ceramic three-unit FDP frameworks ordered from dental laboratories located in emerging countries and Switzerland. The master casts of six cases were sent to five dental laboratories located in Thailand (Bangkok), China (Peking and Shenzhen), Turkey (Izmir), and Switzerland (Bern). Each laboratory was using a different CAD/CAM system. The clinical fit of the frameworks was qualitatively assessed, and the thickness of the framework material, the connector height, the width, and the diameter were evaluated using a measuring sensor. The analysis of the internal fit of the frameworks was performed by means of a replica technique, whereas the inner and outer surfaces of the frameworks were evaluated for traces of postprocessing and damage to the intaglio surface with light and electronic microscopes. Groups (dental laboratories and cases) were compared for statistically significant differences using Mann-Whitney U-tests after Bonferroni correction. An acceptable clinical fit was found at 97.9% of the margins produced in laboratory E, 87.5% in B, 93.7% in C, 79.2% in A, and 62.5% in D. The mean framework thicknesses were not statistically significantly different for the premolar regions; however, for the molar area 4/8 of the evaluated sites were statistically significantly different. Circumference, surface, and width of the connectors produced in the different laboratories were statistically significantly different but not the height. There were great differences in the designs for the pontic and connector regions, and some of the frameworks would not be recommended for clinical use. Traces of heavy postprocessing were found in frameworks from some of the laboratories. The prices per framework ranged from US$177 to US$896. 
By ordering laboratory work in developing countries, a considerable price reduction was obtained compared to the price level in Switzerland. Despite the use of the standardized CAD/CAM chains of production in all laboratories, a large variability in the quality aspects, such as clinical marginal fit, connector and pontic design, as well as postprocessing traces was noted. Recommended sound handling of postprocessing was not applied in all laboratories. Dentists should be aware of the true and factitious advantages of CAD/CAM production chains and not lose control over the process. © 2015 by the American College of Prosthodontists.

  8. Testing high SPF sunscreens: a demonstration of the accuracy and reproducibility of the results of testing high SPF formulations by two methods and at different testing sites.

    PubMed

    Agin, Patricia Poh; Edmonds, Susan H

    2002-08-01

    The goals of this study were (i) to demonstrate that existing and widely used sun protection factor (SPF) test methodologies can produce accurate and reproducible results for high SPF formulations and (ii) to provide data on the number of test-subjects needed, the variability of the data, and the appropriate exposure increments needed for testing high SPF formulations. Three high SPF formulations were tested, according to the Food and Drug Administration's (FDA) 1993 tentative final monograph (TFM) 'very water resistant' test method and/or the 1978 proposed monograph 'waterproof' test method, within one laboratory. A fourth high SPF formulation was tested at four independent SPF testing laboratories, using the 1978 waterproof SPF test method. All laboratories utilized xenon arc solar simulators. The data illustrate that the testing conducted within one laboratory, following either the 1978 proposed or the 1993 TFM SPF test method, was able to reproducibly determine the SPFs of the formulations tested, using either the statistical analysis method in the proposed monograph or the statistical method described in the TFM. When one formulation was tested at four different laboratories, the anticipated variation in the data owing to the equipment and other operational differences was minimized through the use of the statistical method described in the 1993 monograph. The data illustrate that either the 1978 proposed monograph SPF test method or the 1993 TFM SPF test method can provide accurate and reproducible results for high SPF formulations. Further, these results can be achieved with panels of 20-25 subjects with an acceptable level of variability. Utilization of the statistical controls from the 1993 sunscreen monograph can help to minimize lab-to-lab variability for well-formulated products.

  9. Interlaboratory comparability, bias, and precision for four laboratories measuring analytes in wet deposition, October 1983-December 1984

    USGS Publications Warehouse

    Brooks, Myron H.; Schroder, LeRoy J.; Willoughby, Timothy C.

    1987-01-01

    Four laboratories involved in the routine analysis of wet-deposition samples participated in an interlaboratory comparison program managed by the U.S. Geological Survey. The four participants were: the Illinois State Water Survey central analytical laboratory in Champaign, Illinois; the U.S. Geological Survey national water-quality laboratories in Atlanta, Georgia, and Denver, Colorado; and the Inland Waters Directorate national water-quality laboratory in Burlington, Ontario, Canada. Analyses of interlaboratory samples performed by the four laboratories from October 1983 through December 1984 were compared. Participating laboratories analyzed three types of interlaboratory samples--natural wet deposition, simulated wet deposition, and deionized water--for pH and specific conductance, and for dissolved calcium, magnesium, sodium, potassium, chloride, sulfate, nitrate, ammonium, and orthophosphate. Natural wet-deposition samples were aliquots of actual wet-deposition samples. Analyses of these samples by the four laboratories were compared using analysis of variance. Test results indicated that pH, calcium, nitrate, and ammonium results were not directly comparable among the four laboratories. Statistically significant differences between laboratory results probably were meaningful only for analyses of dissolved calcium. Simulated wet-deposition samples with known analyte concentrations were used to test each laboratory for analyte bias. Laboratory analyses of calcium, magnesium, sodium, potassium, chloride, sulfate, and nitrate were not significantly different from the known concentrations of these analytes when tested using analysis of variance. Deionized-water samples were used to test each laboratory for reporting of false positive values. The Illinois State Water Survey laboratory reported the smallest percentage of false positive values for most analytes. Analyte precision was estimated for each laboratory from results of replicate measurements.
In general, the Illinois State Water Survey laboratory achieved the greatest precision, whereas the U.S. Geological Survey laboratories achieved the least precision.
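
    The analysis-of-variance comparison used above reduces, for a one-way layout, to an F ratio. A minimal sketch with hypothetical per-laboratory results (the data are illustrative, not from the report):

```python
from statistics import mean

def one_way_anova_f(groups):
    # F = between-group mean square / within-group mean square.
    grand = mean(x for g in groups for x in g)
    k = len(groups)
    n = sum(len(g) for g in groups)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical dissolved-calcium results (mg/L) for one interlaboratory sample.
lab1 = [0.52, 0.49, 0.51]
lab2 = [0.55, 0.56, 0.54]
lab3 = [0.50, 0.48, 0.49]
f = one_way_anova_f([lab1, lab2, lab3])  # a large F suggests the labs differ
```

    A large F relative to the F distribution with (k - 1, n - k) degrees of freedom is what flags an analyte, such as dissolved calcium here, as not directly comparable across laboratories.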

  10. Digital immunohistochemistry platform for the staining variation monitoring based on integration of image and statistical analyses with laboratory information system.

    PubMed

    Laurinaviciene, Aida; Plancoulaine, Benoit; Baltrusaityte, Indra; Meskauskas, Raimundas; Besusparis, Justinas; Lesciute-Krilaviciene, Daiva; Raudeliunas, Darius; Iqbal, Yasir; Herlin, Paulette; Laurinavicius, Arvydas

    2014-01-01

    Digital immunohistochemistry (IHC) is one of the most promising applications brought by new generation image analysis (IA). While conventional IHC staining quality is monitored by semi-quantitative visual evaluation of tissue controls, IA may require more sensitive measurement. We designed an automated system to digitally monitor IHC multi-tissue controls, based on SQL-level integration of laboratory information system with image and statistical analysis tools. Consecutive sections of TMA containing 10 cores of breast cancer tissue were used as tissue controls in routine Ki67 IHC testing. Ventana slide label barcode ID was sent to the LIS to register the serial section sequence. The slides were stained and scanned (Aperio ScanScope XT), IA was performed by the Aperio/Leica Colocalization and Genie Classifier/Nuclear algorithms. SQL-based integration ensured automated statistical analysis of the IA data by the SAS Enterprise Guide project. Factor analysis and plot visualizations were performed to explore slide-to-slide variation of the Ki67 IHC staining results in the control tissue. Slide-to-slide intra-core IHC staining analysis revealed rather significant variation of the variables reflecting the sample size, while Brown and Blue Intensity were relatively stable. To further investigate this variation, the IA results from the 10 cores were aggregated to minimize tissue-related variance. Factor analysis revealed association between the variables reflecting the sample size detected by IA and Blue Intensity. Since the main feature to be extracted from the tissue controls was staining intensity, we further explored the variation of the intensity variables in the individual cores. MeanBrownBlue Intensity ((Brown+Blue)/2) and DiffBrownBlue Intensity (Brown-Blue) were introduced to better contrast the absolute intensity and the colour balance variation in each core; relevant factor scores were extracted. 
Finally, tissue-related factors of IHC staining variance were explored in the individual tissue cores. Our solution enabled monitoring of IHC multi-tissue control staining by means of IA, followed by automated statistical analysis, integrated into the laboratory workflow. We found that, even in consecutive serial tissue sections, tissue-related factors affected the IHC IA results; meanwhile, less intense blue counterstain was associated with a smaller amount of tissue detected by the IA tools.

  11. Digital immunohistochemistry platform for the staining variation monitoring based on integration of image and statistical analyses with laboratory information system

    PubMed Central

    2014-01-01

    Background Digital immunohistochemistry (IHC) is one of the most promising applications brought by new generation image analysis (IA). While conventional IHC staining quality is monitored by semi-quantitative visual evaluation of tissue controls, IA may require more sensitive measurement. We designed an automated system to digitally monitor IHC multi-tissue controls, based on SQL-level integration of laboratory information system with image and statistical analysis tools. Methods Consecutive sections of TMA containing 10 cores of breast cancer tissue were used as tissue controls in routine Ki67 IHC testing. Ventana slide label barcode ID was sent to the LIS to register the serial section sequence. The slides were stained and scanned (Aperio ScanScope XT), IA was performed by the Aperio/Leica Colocalization and Genie Classifier/Nuclear algorithms. SQL-based integration ensured automated statistical analysis of the IA data by the SAS Enterprise Guide project. Factor analysis and plot visualizations were performed to explore slide-to-slide variation of the Ki67 IHC staining results in the control tissue. Results Slide-to-slide intra-core IHC staining analysis revealed rather significant variation of the variables reflecting the sample size, while Brown and Blue Intensity were relatively stable. To further investigate this variation, the IA results from the 10 cores were aggregated to minimize tissue-related variance. Factor analysis revealed association between the variables reflecting the sample size detected by IA and Blue Intensity. Since the main feature to be extracted from the tissue controls was staining intensity, we further explored the variation of the intensity variables in the individual cores. MeanBrownBlue Intensity ((Brown+Blue)/2) and DiffBrownBlue Intensity (Brown-Blue) were introduced to better contrast the absolute intensity and the colour balance variation in each core; relevant factor scores were extracted. 
Finally, tissue-related factors of IHC staining variance were explored in the individual tissue cores. Conclusions Our solution enabled monitoring of IHC multi-tissue control staining by means of IA, followed by automated statistical analysis, integrated into the laboratory workflow. We found that, even in consecutive serial tissue sections, tissue-related factors affected the IHC IA results; meanwhile, less intense blue counterstain was associated with a smaller amount of tissue detected by the IA tools. PMID:25565007

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This document comprises Pacific Northwest National Laboratory's report for Fiscal Year 1996 on research and development programs. The document contains 161 project summaries in 16 areas of research and development. The 16 areas of research and development reported on are: atmospheric sciences, biotechnology, chemical instrumentation and analysis, computer and information science, ecological science, electronics and sensors, health protection and dosimetry, hydrological and geologic sciences, marine sciences, materials science and engineering, molecular science, process science and engineering, risk and safety analysis, socio-technical systems analysis, statistics and applied mathematics, and thermal and energy systems. In addition, this report provides an overview of the research and development program, program management, program funding, and Fiscal Year 1997 projects.

  13. Optimizing the design of a reproduction toxicity test with the pond snail Lymnaea stagnalis.

    PubMed

    Charles, Sandrine; Ducrot, Virginie; Azam, Didier; Benstead, Rachel; Brettschneider, Denise; De Schamphelaere, Karel; Filipe Goncalves, Sandra; Green, John W; Holbech, Henrik; Hutchinson, Thomas H; Faber, Daniel; Laranjeiro, Filipe; Matthiessen, Peter; Norrgren, Leif; Oehlmann, Jörg; Reategui-Zirena, Evelyn; Seeland-Fremer, Anne; Teigeler, Matthias; Thome, Jean-Pierre; Tobor Kaplon, Marysia; Weltje, Lennart; Lagadic, Laurent

    2016-11-01

    This paper presents the results from two ring-tests addressing the feasibility, robustness and reproducibility of a reproduction toxicity test with the freshwater gastropod Lymnaea stagnalis (RENILYS strain). Sixteen laboratories (from inexperienced to expert laboratories in mollusc testing) from nine countries participated in these ring-tests. Survival and reproduction were evaluated in L. stagnalis exposed to cadmium, tributyltin, prochloraz and trenbolone according to an OECD draft Test Guideline. In total, 49 datasets were analysed to assess the practicability of the proposed experimental protocol, and to estimate the between-laboratory reproducibility of toxicity endpoint values. The statistical analysis of count data (number of clutches or eggs per individual-day) leading to ECx estimation was specifically developed and automated through a free web-interface. Based on a complementary statistical analysis, the optimal test duration was established and the most sensitive and cost-effective reproduction toxicity endpoint was identified, to be used as the core endpoint. This validation process and the resulting optimized protocol were used to consolidate the OECD Test Guideline for the evaluation of reproductive effects of chemicals in L. stagnalis. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Differences in results of analyses of concurrent and split stream-water samples collected and analyzed by the US Geological Survey and the Illinois Environmental Protection Agency, 1985-91

    USGS Publications Warehouse

    Melching, C.S.; Coupe, R.H.

    1995-01-01

    During water years 1985-91, the U.S. Geological Survey (USGS) and the Illinois Environmental Protection Agency (IEPA) cooperated in the collection and analysis of concurrent and split stream-water samples from selected sites in Illinois. Concurrent samples were collected independently by field personnel from each agency at the same time and sent to the IEPA laboratory, whereas the split samples were collected by USGS field personnel and divided into aliquots that were sent to each agency's laboratory for analysis. The water-quality data from these programs were examined by means of the Wilcoxon signed ranks test to identify statistically significant differences between results of the USGS and IEPA analyses. The data sets for constituents and properties identified by the Wilcoxon test as having significant differences were further examined by use of the paired t-test, mean relative percentage difference, and scattergrams to determine if the differences were important. Of the 63 constituents and properties in the concurrent-sample analysis, differences in only 2 (pH and ammonia) were statistically significant and large enough to concern water-quality engineers and planners. Of the 27 constituents and properties in the split-sample analysis, differences in 9 (turbidity, dissolved potassium, ammonia, total phosphorus, dissolved aluminum, dissolved barium, dissolved iron, dissolved manganese, and dissolved nickel) were statistically significant and large enough to concern water-quality engineers and planners. The differences in concentration between pairs of concurrent samples were compared to the precision of the laboratory or field method used, and the differences between pairs of split samples were compared to the precision of the laboratory method used and the interlaboratory precision of measuring a given concentration or property. Consideration of method precision indicated that differences between concurrent samples were insignificant for all constituents and properties except pH, and that differences between split samples were significant for all constituents and properties. Consideration of interlaboratory precision indicated that the differences between the split samples were not unusually large. The results for the split samples illustrate the difficulty in obtaining comparable and accurate water-quality data.
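
    The Wilcoxon signed ranks statistic used in this comparison can be sketched as follows. The paired results are hypothetical, and real use would also require the null distribution or a normal approximation to obtain a p-value.

```python
def wilcoxon_w(pairs):
    # Signed-rank statistic: rank |differences| (zeros dropped, ties get
    # average ranks), then take the smaller of the positive/negative rank sums.
    diffs = [a - b for a, b in pairs if a != b]
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank for the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return min(w_plus, w_minus)

# Hypothetical paired USGS vs. IEPA results for one constituent (scaled units).
w = wilcoxon_w([(52, 50), (49, 51), (58, 50), (53, 53), (55, 51)])
```

    Integer (or scaled) values are used here so that tied absolute differences compare exactly; with raw floating-point concentrations the tie test would need a tolerance.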

  15. Compliance program data management system for The Idaho National Engineering Laboratory/Environmental Protection Agency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertzler, C.L.; Poloski, J.P.; Bates, R.A.

    1988-01-01

    The Compliance Program Data Management System (DMS) developed at the Idaho National Engineering Laboratory (INEL) validates and maintains the integrity of data collected to support the Consent Order and Compliance Agreement (COCA) between the INEL and the Environmental Protection Agency (EPA). The system uses dBase III Plus programs and dBase III Plus in an interactive mode to enter, store, validate, manage, and retrieve analytical information provided on EPA Contract Laboratory Program (CLP) forms and CLP forms modified to accommodate 40 CFR 264 Appendix IX constituent analyses. Data analysis and presentation is performed utilizing SAS, a statistical analysis software program. Archiving of data and results is performed at appropriate stages of data management. The DMS is useful for sampling and analysis programs where adherence to EPA CLP protocol, along with maintenance and retrieval of waste site investigation sampling results, is desired or requested. 3 refs.

  16. Effects of pH, lactate, hematocrit and potassium level on the accuracy of continuous glucose monitoring (CGM) in pediatric intensive care unit.

    PubMed

    Marics, Gábor; Koncz, Levente; Eitler, Katalin; Vatai, Barbara; Szénási, Boglárka; Zakariás, David; Mikos, Borbála; Körner, Anna; Tóth-Heyn, Péter

    2015-03-19

    Continuous glucose monitoring (CGM) was originally developed for diabetic patients, and it may be a useful tool for monitoring glucose changes in the pediatric intensive care unit (PICU). Its use is, however, limited by the lack of sufficient data on its reliability at insufficient peripheral perfusion. We aimed to correlate the accuracy of CGM with laboratory markers relevant to disturbed tissue perfusion. In 38 pediatric patients (age range, 0-18 years) requiring intensive care we tested the effect of pH, lactate, hematocrit and serum potassium on the difference between CGM and meter glucose measurements. Guardian® (Medtronic®) CGM results were compared to GEM 3000 (Instrumentation Laboratory®) and point-of-care measurements. The clinical accuracy of CGM was evaluated by Clarke error grid analysis, Bland-Altman analysis and Pearson's correlation. We used the Friedman test for statistical analysis (statistical significance was established as p < 0.05). CGM values exhibited a considerable variability without any correlation with the examined laboratory parameters. Clarke, Bland-Altman analysis and Pearson's correlation coefficient demonstrated a good clinical accuracy of CGM (zones A and B = 96%; the mean difference between reference and CGM glucose was 1.3 mg/dL, with 48 of the 780 calibration pairs falling outside 2 standard deviations; Pearson's correlation coefficient: 0.83). The accuracy of CGM measurements is independent of laboratory parameters relevant to tissue hypoperfusion. CGM may prove a reliable tool for continuous monitoring of glucose changes in PICUs, not much influenced by tissue perfusion, but still not appropriate as the basis for clinical decisions.
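The Bland-Altman analysis used in this study to express agreement between reference and CGM glucose reduces to a mean difference (bias) and 95% limits of agreement. A minimal sketch, with hypothetical paired readings:

```python
import statistics

def bland_altman(reference, cgm):
    """Bland-Altman agreement statistics for paired glucose readings.

    Returns the mean difference (bias) and the 95% limits of
    agreement (bias ± 1.96 * SD of the paired differences).
    """
    diffs = [r - c for r, c in zip(reference, cgm)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

Points outside the limits of agreement (like the 48 of 780 calibration pairs mentioned above) are the ones a reviewer would inspect individually.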

  17. Performance testing of NIOSH Method 5524/ASTM Method D-7049-04, for determination of metalworking fluids.

    PubMed

    Glaser, Robert; Kurimo, Robert; Shulman, Stanley

    2007-08-01

    A performance test of NIOSH Method 5524/ASTM Method D-7049-04 for analysis of metalworking fluids (MWF) was conducted. These methods involve determination of the total and extractable weights of MWF samples; extractions are performed using a ternary blend of toluene:dichloromethane:methanol and a binary blend of methanol:water. Six laboratories participated in this study. A preliminary analysis of 20 blank samples was made to familiarize the laboratories with the procedure(s) and to estimate the methods' limits of detection/quantitation (LODs/LOQs). Synthetically generated samples of a semisynthetic MWF aerosol were then collected on tared polytetrafluoroethylene (PTFE) filters and analyzed according to the methods by all participants. Sample masses deposited (approximately 400-500 µg) corresponded to amounts expected in an 8-hr shift at the NIOSH recommended exposure levels (REL) of 0.4 mg/m(3) (thoracic) and 0.5 mg/m(3) (total particulate). The generator output was monitored with a calibrated laser particle counter. One laboratory significantly underreported the sampled masses relative to the other five labs. A follow-up study compared only gravimetric results of this laboratory with those of two other labs. In the preliminary analysis of blanks, the average LOQs were 0.094 mg for the total weight analysis and 0.136 mg for the extracted weight analyses. For the six-lab study, the average LOQs were 0.064 mg for the total weight analyses and 0.067 mg for the extracted weight analyses. Using ASTM conventions, h and k statistics were computed to determine the degree of consistency of each laboratory with the others. One laboratory experienced problems with precision but not bias. The precision estimates for the remaining five labs were not statistically different (alpha = 0.005) for either the total or extractable weights. For all six labs, the average fraction extracted was ≥0.94 (CV = 0.025).
Pooled estimates of the total coefficients of variation of analysis were 0.13 for the total weight samples and 0.13 for the extracted weight samples. An overall method bias of -5% was determined by comparing the overall mean concentration reported by the participants to that determined by the particle counter. In the three-lab follow-up study, the nonconsistent lab reported results that were unbiased but statistically less precise than the others; the average LOQ was 0.133 mg for the total weight analyses. It is concluded that aerosolized MWF sampled at concentrations corresponding to either of the NIOSH RELs can generally be shipped unrefrigerated, stored refrigerated up to 7 days, and then analyzed quantitatively and precisely for MWF using the NIOSH/ASTM procedures.
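Limits of detection and quantitation from blank filters, as estimated in the preliminary phase above, are conventionally derived from the mean and standard deviation of replicate blanks. A sketch under the common 3-SD/10-SD convention; the multipliers and blank masses are assumptions for illustration, not values from the paper:

```python
import statistics

def lod_loq(blank_masses, k_lod=3.0, k_loq=10.0):
    """Detection and quantitation limits from replicate blank results.

    Uses the common convention LOD = mean + 3*SD and LOQ = mean + 10*SD
    of the blanks; the multipliers are assumptions, not taken from the
    NIOSH/ASTM text.
    """
    m = statistics.mean(blank_masses)
    s = statistics.stdev(blank_masses)
    return m + k_lod * s, m + k_loq * s
```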

  18. Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    In order to provide a complete description of a material's thermoelectric power factor, an uncertainty interval is required in addition to the measured nominal value. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, thermocouple cold-finger effect, and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.
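The power factor combines the Seebeck coefficient S and electrical resistivity ρ as PF = S²/ρ, so a first-order propagation of independent uncertainties gives u_PF/PF = sqrt((2·u_S/S)² + (u_ρ/ρ)²). A minimal sketch; the numeric values in the test are illustrative, not ZEM-3 specifications:

```python
import math

def power_factor_uncertainty(S, uS, rho, urho):
    """Propagate uncertainties into the power factor PF = S**2 / rho.

    Standard first-order (Gaussian) propagation assuming independent
    errors; returns (PF, u_PF).
    """
    pf = S ** 2 / rho
    rel = math.sqrt((2 * uS / S) ** 2 + (urho / rho) ** 2)
    return pf, pf * rel
```

Note the factor of 2 on the Seebeck term: because S enters squared, its relative uncertainty counts twice.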

  19. A comparison of two microscale laboratory reporting methods in a secondary chemistry classroom

    NASA Astrophysics Data System (ADS)

    Martinez, Lance Michael

    This study attempted to determine if there was a difference between the laboratory achievement of students who used a modified reporting method and those who used traditional laboratory reporting. The study also determined the relationships between laboratory performance scores and the independent variables score on the Group Assessment of Logical Thinking (GALT) test, chronological age in months, gender, and ethnicity for each of the treatment groups. The study was conducted using 113 high school students who were enrolled in first-year general chemistry classes at Pueblo South High School in Colorado. The research design used was the quasi-experimental Nonequivalent Control Group Design. The statistical treatment consisted of the Multiple Regression Analysis and the Analysis of Covariance. Based on the GALT, students in the two groups were generally in the concrete and transitional stages of the Piagetian cognitive levels. The findings of the study revealed that the traditional and the modified methods of laboratory reporting did not have any effect on the laboratory performance outcome of the subjects. However, the students who used the traditional method of reporting showed a higher laboratory performance score when evaluation was conducted using the New Standards rubric recommended by the state. Multiple Regression Analysis revealed that there was a significant relationship between the criterion variable student laboratory performance outcome of individuals who employed traditional laboratory reporting methods and the composite set of predictor variables. On the contrary, there was no significant relationship between the criterion variable student laboratory performance outcome of individuals who employed modified laboratory reporting methods and the composite set of predictor variables.

  20. Effect of the statin therapy on biochemical laboratory tests--a chemometrics study.

    PubMed

    Durceková, Tatiana; Mocák, Ján; Boronová, Katarína; Balla, Ján

    2011-01-05

    Statins are the first-line choice for lowering total and LDL cholesterol levels and very important medicaments for reducing the risk of coronary artery disease. The aim of this study was therefore to assess the results of biochemical tests characterizing the condition of 172 patients before and after administration of statins. For this purpose, several chemometric tools, namely principal component analysis, cluster analysis, discriminant analysis, logistic regression, KNN classification, ROC analysis, descriptive statistics and ANOVA, were used. Mutual relations of 11 biochemical laboratory tests, the patient's age and gender were investigated in detail. The results obtained make it possible to evaluate the extent of the statin treatment in each individual case. They may also help in monitoring the dynamic progression of the disease. Copyright © 2010 Elsevier B.V. All rights reserved.

  1. A case-control study of malignant melanoma among Lawrence Livermore National Laboratory employees: A critical evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kupper, L.L.; Setzer, R.W.; Schwartzbaum, J.

    1987-07-01

    This document reports on a reevaluation of data obtained in a previous report on occupational factors associated with the development of malignant melanomas at Lawrence Livermore National Laboratory. The current report reduces the number of these factors from five to three based on a rigorous statistical analysis of the original data. Recommendations include restructuring the original questionnaire and trying to contact more individuals that worked with volatile photographic chemicals. 17 refs., 7 figs., 22 tabs. (TEM)

  2. Preanalytical errors in medical laboratories: a review of the available methodologies of data collection and analysis.

    PubMed

    West, Jamie; Atherton, Jennifer; Costelloe, Seán J; Pourmahram, Ghazaleh; Stretton, Adam; Cornes, Michael

    2017-01-01

    Preanalytical errors have previously been shown to contribute a significant proportion of errors in laboratory processes and contribute to a number of patient safety risks. Accreditation against ISO 15189:2012 requires that laboratory Quality Management Systems consider the impact of preanalytical processes in areas such as the identification and control of non-conformances, continual improvement, internal audit and quality indicators. Previous studies have shown that there is a wide variation in the definition, repertoire and collection methods for preanalytical quality indicators. The International Federation of Clinical Chemistry Working Group on Laboratory Errors and Patient Safety has defined a number of quality indicators for the preanalytical stage, and the adoption of harmonized definitions will support interlaboratory comparisons and continual improvement. There are a variety of data collection methods, including audit, manual recording processes, incident reporting mechanisms and laboratory information systems. Quality management processes such as benchmarking, statistical process control, Pareto analysis and failure mode and effect analysis can be used to review data and should be incorporated into clinical governance mechanisms. In this paper, The Association for Clinical Biochemistry and Laboratory Medicine PreAnalytical Specialist Interest Group review the various data collection methods available. Our recommendation is the use of the laboratory information management systems as a recording mechanism for preanalytical errors as this provides the easiest and most standardized mechanism of data capture.
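Pareto analysis, one of the quality management tools recommended above, ranks error categories by frequency and accumulates their percentage contribution so that the "vital few" causes stand out. A minimal sketch; the category names and counts are invented for illustration:

```python
def pareto(counts):
    """Sort error categories by frequency and add cumulative percentages.

    Takes a dict of {category: count}; returns a list of
    (category, count, cumulative_percent) tuples, largest first.
    """
    total = sum(counts.values())
    ordered = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    out, cum = [], 0
    for name, n in ordered:
        cum += n
        out.append((name, n, round(100.0 * cum / total, 1)))
    return out
```

Plotting the counts as bars with the cumulative line overlaid gives the familiar Pareto chart used in clinical governance reviews.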

  3. New dimensions from statistical graphics for GIS (geographic information system) analysis and interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCord, R.A.; Olson, R.J.

    1988-01-01

    Environmental research and assessment activities at Oak Ridge National Laboratory (ORNL) include the analysis of spatial and temporal patterns of ecosystem response at a landscape scale. Analysis through use of a geographic information system (GIS) involves an interaction between the user and thematic data sets frequently expressed as maps. A portion of GIS analysis has a mathematical or statistical aspect, especially for the analysis of temporal patterns. ARC/INFO is an excellent tool for manipulating GIS data and producing the appropriate map graphics. INFO also has some limited ability to produce statistical tabulations. At ORNL we have extended our capabilities by graphically interfacing ARC/INFO and SAS/GRAPH to provide a combined mapping and statistical graphics environment. With the data management, statistical, and graphics capabilities of SAS added to ARC/INFO, we have expanded the analytical and graphical dimensions of the GIS environment. Pie or bar charts, frequency curves, hydrographs, or scatter plots as produced by SAS can be added to maps from attribute data associated with ARC/INFO coverages. Numerous small, simplified graphs can also become a source of complex map "symbols." These additions extend the dimensions of GIS graphics to include time, details of the thematic composition, distribution, and interrelationships. 7 refs., 3 figs.

  4. Assessment of the beryllium lymphocyte proliferation test using statistical process control.

    PubMed

    Cher, Daniel J; Deubner, David C; Kelsh, Michael A; Chapman, Pamela S; Ray, Rose M

    2006-10-01

    Despite more than 20 years of surveillance and epidemiologic studies using the beryllium blood lymphocyte proliferation test (BeBLPT) as a measure of beryllium sensitization (BeS) and as an aid for diagnosing subclinical chronic beryllium disease (CBD), improvements in specific understanding of the inhalation toxicology of CBD have been limited. Although epidemiologic data suggest that BeS and CBD risks vary by process/work activity, it has proven difficult to reach specific conclusions regarding the dose-response relationship between workplace beryllium exposure and BeS or subclinical CBD. One possible reason for this uncertainty could be misclassification of BeS resulting from variation in BeBLPT testing performance. The reliability of the BeBLPT, a biological assay that measures beryllium sensitization, is unknown. To assess the performance of four laboratories that conducted this test, we used data from a medical surveillance program that offered testing for beryllium sensitization with the BeBLPT. The study population was workers exposed to beryllium at various facilities over a 10-year period (1992-2001). Workers with abnormal results were offered diagnostic workups for CBD. Our analyses used a standard statistical technique, statistical process control (SPC), to evaluate test reliability. The study design involved a repeated measures analysis of BeBLPT results generated from the company-wide, longitudinal testing. Analytical methods included use of (1) statistical process control charts that examined temporal patterns of variation for the stimulation index, a measure of cell reactivity to beryllium; (2) correlation analysis that compared prior perceptions of BeBLPT instability to the statistical measures of test variation; and (3) assessment of the variation in the proportion of missing test results and how time periods with more missing data influenced SPC findings. 
During the period of this study, all laboratories displayed variation in test results that were beyond what would be expected due to chance alone. Patterns of test results suggested that variations were systematic. We conclude that laboratories performing the BeBLPT or other similar biological assays of immunological response could benefit from a statistical approach such as SPC to improve quality management.
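A Shewhart-style individuals control chart of the kind used in this SPC analysis sets limits from an in-control baseline period and flags later results that fall outside them. The data and the ±3-sigma convention below are assumptions for illustration, not the study's actual chart parameters:

```python
import statistics

def shewhart_limits(baseline, k=3.0):
    """Centre-line ± k-sigma control limits from an in-control baseline."""
    centre = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return centre - k * sigma, centre + k * sigma

def flag_points(baseline, new_values, k=3.0):
    """Indices of new results that fall outside the control limits."""
    lo, hi = shewhart_limits(baseline, k)
    return [i for i, v in enumerate(new_values) if v < lo or v > hi]
```

Systematic drift of the kind the authors describe would show up as runs of points on one side of the centre line even before any single point breaches a limit.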

  5. Integrating teaching and authentic research in the field and laboratory settings

    NASA Astrophysics Data System (ADS)

    Daryanto, S.; Wang, L.; Kaseke, K. F.; Ravi, S.

    2016-12-01

    Typically, authentic research activities are separated from rigorous classroom teaching. Here we assessed the potential of integrating teaching and research activities both in the field and in the laboratory. We worked with students from both the US and abroad, without strong science backgrounds, using advanced environmental sensors and statistical tools to conduct innovative projects. The students included one from Namibia and two local high school students in Indianapolis (through Project SEED, Summer Experience for the Economically Disadvantaged). They conducted leaf potential measurements, isotope measurements and meta-analysis. The experience showed us the great potential of integrating teaching and research in both field and laboratory settings.

  6. Measurement and Prediction Errors in Body Composition Assessment and the Search for the Perfect Prediction Equation.

    ERIC Educational Resources Information Center

    Katch, Frank I.; Katch, Victor L.

    1980-01-01

    Sources of error in body composition assessment by laboratory and field methods can be found in hydrostatic weighing, residual air volume, skinfolds, and circumferences. Statistical analysis can and should be used in the measurement of body composition. (CJ)

  7. Sampling surface and subsurface particle-size distributions in wadable gravel- and cobble-bed streams for analyses in sediment transport, hydraulics, and streambed monitoring

    Treesearch

    Kristin Bunte; Steven R. Abt

    2001-01-01

    This document provides guidance for sampling surface and subsurface sediment from wadable gravel- and cobble-bed streams. After a short introduction to stream types and classifications in gravel-bed rivers, the document explains the field and laboratory measurement of particle sizes and the statistical analysis of particle-size distributions. Analysis of particle...

  8. The Penny Experiment Revisited: An Illustration of Significant Figures, Accuracy, Precision, and Data Analysis

    ERIC Educational Resources Information Center

    Bularzik, Joseph

    2007-01-01

    Measuring the mass of many pennies has been used as an easy way to generate data for exercises with statistical analysis. In this general chemistry laboratory the densities of pennies are measured by weighing the pennies and using two different methods to measure their volumes. There is much to be discovered by the students on the variability of…

  9. DB4US: A Decision Support System for Laboratory Information Management.

    PubMed

    Carmona-Cejudo, José M; Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael

    2012-11-14

    Until recently, laboratory automation has focused primarily on improving hardware. Future advances are concentrated on intelligent software, since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates the management of laboratory quality indicator information. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to the use of demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. The objective is to develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like user web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators, and offers the possibility to drill down from high-level metrics to more detailed summaries.
The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of different parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. In our evaluation we show the positive impact of this methodology for laboratory professionals, since the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources by the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. The proposed methodology and the accompanying web application, DB4US, automates the processing of information related to laboratory quality indicators and offers a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources.

  10. A SURVEY OF LABORATORY AND STATISTICAL ISSUES RELATED TO FARMWORKER EXPOSURE STUDIES

    EPA Science Inventory

    Developing internally valid, and perhaps generalizable, farmworker exposure studies is a complex process that involves many statistical and laboratory considerations. Statistics are an integral component of each study beginning with the design stage and continuing to the final da...

  11. Laboratory animal science: a resource to improve the quality of science.

    PubMed

    Forni, M

    2007-08-01

    The contribution of animal experimentation to biomedical research is of undoubted value, nevertheless the real usefulness of animal models is still being hotly debated. Laboratory Animal Science is a multidisciplinary approach to humane animal experimentation that allows the choice of the correct animal model and the collection of unbiased data. Refinement, Reduction and Replacement, the "3Rs rule", are now widely accepted and have a major influence on animal experimentation procedures. Refinement, namely any decrease in the incidence or severity of inhumane procedures applied to animals, has been today extended to the entire lives of the experimental animals. Reduction of the number of animals used to obtain statistically significant data may be achieved by improving experimental design and statistical analysis of data. Replacement refers to the development of validated alternative methods. A Laboratory Animal Science training program in biomedical degrees can promote the 3Rs and improve the welfare of laboratory animals as well as the quality of science with ethical, scientific and economic advantages complying with the European requirement that "persons who carry out, take part in, or supervise procedures on animals, or take care of animals used in procedures, shall have had appropriate education and training".

  12. Experimental Analysis of Cell Function Using Cytoplasmic Streaming

    ERIC Educational Resources Information Center

    Janssens, Peter; Waldhuber, Megan

    2012-01-01

    This laboratory exercise investigates the phenomenon of cytoplasmic streaming in the fresh water alga "Nitella". Students use the fungal toxin cytochalasin D, an inhibitor of actin polymerization, to investigate the mechanism of streaming. Students use simple statistical methods to analyze their data. Typical student data are provided. (Contains 3…

  13. Visual Data Analysis for Satellites

    NASA Technical Reports Server (NTRS)

    Lau, Yee; Bhate, Sachin; Fitzpatrick, Patrick

    2008-01-01

    The Visual Data Analysis Package is a collection of programs and scripts that facilitate visual analysis of data available from NASA and NOAA satellites, as well as dropsonde, buoy, and conventional in-situ observations. The package features utilities for data extraction, data quality control, statistical analysis, and data visualization. The Hierarchical Data Format (HDF) satellite data extraction routines from NASA's Jet Propulsion Laboratory were customized for specific spatial coverage and file input/output. Statistical analysis includes the calculation of the relative error, the absolute error, and the root mean square error. Other capabilities include curve fitting through the data points to fill in missing data points between satellite passes or where clouds obscure satellite data. For data visualization, the software provides customizable Generic Mapping Tool (GMT) scripts to generate difference maps, scatter plots, line plots, vector plots, histograms, time series, and color fill images.
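The three error statistics listed (relative error, absolute error, root mean square error) for matched satellite and in-situ series can be sketched as follows; the sample values are invented:

```python
import math

def error_stats(obs, sat):
    """Mean absolute error, mean relative error, and RMSE
    between matched observation and satellite series."""
    diffs = [s - o for o, s in zip(obs, sat)]
    abs_err = sum(abs(d) for d in diffs) / len(diffs)
    rel_err = sum(abs(d) / abs(o) for o, d in zip(obs, diffs)) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return abs_err, rel_err, rmse
```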

  14. Three-wave and four-wave interactions in gravity wave turbulence

    NASA Astrophysics Data System (ADS)

    Aubourg, Quentin; Campagne, Antoine; Peureux, Charles; Ardhuin, Fabrice; Sommeria, Joel; Viboud, Samuel; Mordant, Nicolas

    2017-11-01

    Weak-turbulence theory is a statistical framework to describe a large ensemble of nonlinearly interacting waves. The archetypal example of such a system is the ocean surface, which is made of interacting surface gravity waves. Here we describe a laboratory experiment dedicated to probing the statistical properties of turbulent gravity waves. We set up an isotropic state of interacting gravity waves in the Coriolis facility (13-m-diam circular wave tank) by exciting waves at 1 Hz with wedge wave makers. We implement a stereoscopic technique to obtain a measurement of the surface elevation that is resolved in both space and time. Fourier analysis shows that the laboratory spectra are systematically steeper than the theoretical predictions and the field observations in the Black Sea by Leckler et al. [F. Leckler et al., J. Phys. Oceanogr. 45, 2484 (2015), 10.1175/JPO-D-14-0237.1]. We identify a strong impact of surface dissipation on the scaling of the Fourier spectrum at the scales that are accessible in the experiments. We use bicoherence and tricoherence statistical tools in frequency and/or wave-vector space to identify the active nonlinear couplings. These analyses are also performed on the field data by Leckler et al. for comparison with the laboratory data. Three-wave coupling is characterized and shown to involve mostly quasiresonances of waves with second- or higher-order harmonics. Four-wave coupling is not observed in the laboratory but is evidenced in the field data. We discuss temporal scale separation to explain our observations.
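For deep-water gravity waves with dispersion ω = √(gk), the sum ω(k₁) + ω(k₂) always exceeds ω(k₁ + k₂) because the square root is strictly subadditive, so exact three-wave resonances are forbidden and only the quasiresonances reported above are possible. A sketch of the frequency mismatch for a collinear triad:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def omega(k):
    """Deep-water gravity-wave dispersion relation, omega = sqrt(g*k)."""
    return math.sqrt(G * k)

def triad_mismatch(k1, k2):
    """Frequency mismatch of a collinear triad k3 = k1 + k2.

    Strictly positive for k1, k2 > 0, which is why only quasi-resonant
    three-wave coupling can occur for these waves.
    """
    return omega(k1) + omega(k2) - omega(k1 + k2)
```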

  15. Analysis of negative historical control group data from the in vitro micronucleus assay using TK6 cells.

    PubMed

    Lovell, David P; Fellows, Mick; Marchetti, Francesco; Christiansen, Joan; Elhajouji, Azeddine; Hashimoto, Kiyohiro; Kasamoto, Sawako; Li, Yan; Masayasu, Ozaki; Moore, Martha M; Schuler, Maik; Smith, Robert; Stankowski, Leon F; Tanaka, Jin; Tanir, Jennifer Y; Thybaud, Veronique; Van Goethem, Freddy; Whitwell, James

    2018-01-01

    The recent revisions of the Organisation for Economic Co-operation and Development (OECD) genetic toxicology test guidelines emphasize the importance of historical negative controls both for data quality and interpretation. The goal of a HESI Genetic Toxicology Technical Committee (GTTC) workgroup was to collect data from participating laboratories and to conduct a statistical analysis to understand and publish the range of values that are normally seen in experienced laboratories using TK6 cells to conduct the in vitro micronucleus assay. Data from negative control samples from in vitro micronucleus assays using TK6 cells from 13 laboratories were collected using a standard collection form. Although in some cases statistically significant differences can be seen within laboratories for different test conditions, they were very small. The mean incidence of micronucleated cells/1000 cells ranged from 3.2/1000 to 13.8/1000. These almost four-fold differences in micronucleus levels cannot be explained by differences in scoring method, presence or absence of exogenous metabolic activation (S9), length of treatment, presence or absence of cytochalasin B or different solvents used as vehicles. The range of means from the four laboratories using flow cytometry methods (3.7-fold: 3.5-12.9 micronucleated cells/1000 cells) was similar to that from the nine laboratories using other scoring methods (4.3-fold: 3.2-13.8 micronucleated cells/1000 cells). No laboratory could be identified as an outlier or as showing unacceptably high variability. Quality Control (QC) methods applied to analyse the intra-laboratory variability showed that there was evidence of inter-experimental variability greater than would be expected by chance (i.e. over-dispersion). However, in general, this was low. 
This study demonstrates the value of QC methods in helping to analyse the reproducibility of results, building up a 'normal' range of values, and as an aid to identify variability within a laboratory in order to implement processes to maintain and improve uniformity. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
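The over-dispersion that the QC analysis detects can be quantified with a simple Poisson dispersion (heterogeneity) statistic on replicate micronucleus counts; under a Poisson model the statistic is approximately chi-square with n−1 degrees of freedom, so values well above n−1 indicate inter-experimental variability beyond chance. The counts below are invented for illustration:

```python
import statistics

def dispersion_index(counts):
    """Poisson dispersion (heterogeneity) statistic for replicate counts.

    Computes sum((x_i - mean)^2) / mean; approximately chi-square with
    n - 1 degrees of freedom under a Poisson model.
    """
    mean = statistics.mean(counts)
    return sum((c - mean) ** 2 for c in counts) / mean
```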

  16. [Investigation of reference intervals of blood gas and acid-base analysis assays in China].

    PubMed

    Zhang, Lu; Wang, Wei; Wang, Zhiguo

    2015-10-01

    To investigate and analyze the upper and lower limits of reference intervals, and their sources, in blood gas and acid-base analysis assays. The data on reference intervals were collected from the first run of the 2014 External Quality Assessment (EQA) program in blood gas and acid-base analysis assays performed by the National Center for Clinical Laboratories (NCCL). All abnormal values and errors were eliminated. Statistical analysis was performed with SPSS 13.0 and Excel 2007 on the upper and lower limits of reference intervals, and their sources, for 7 blood gas and acid-base analysis assays, i.e. pH value, partial pressure of carbon dioxide (PCO2), partial pressure of oxygen (PO2), Na+, K+, Ca2+ and Cl-. Values were further grouped by instrument system and the differences between groups were analyzed. There were 225 laboratories submitting information on the reference intervals they had been using. The three main sources of reference intervals were the National Guide to Clinical Laboratory Procedures [37.07% (400/1 079)], instructions of instrument manufacturers [31.23% (337/1 079)] and instructions of reagent manufacturers [23.26% (251/1 079)]. Approximately 35.1% (79/225) of the laboratories had validated the reference intervals they used. The differences in upper and lower limits of most assays among laboratories were moderate, in both minimum and maximum (i.e. the upper limits of pH value were 7.00-7.45, the lower limits of Na+ were 130.00-156.00 mmol/L), mean and median (i.e. the upper limits of K+ were 5.04 mmol/L and 5.10 mmol/L, the upper limits of PCO2 were 45.65 mmHg and 45.00 mmHg, 1 mmHg = 0.133 kPa), as well as in P2.5 and P97.5 between each instrument system group. The Kruskal-Wallis method showed that the P values for the upper and lower limits of all parameters were lower than 0.001, excepting the lower limits of Na+, for which the P value was 0.029.
It was shown by Mann-Whitney that the statistic differences were found among instrument system groups and between most of two instrument system groups in all assays. The difference of reference intervals of blood gas and acid-base analysis assays used in China laboratories is moderate, which is better than other specialties in clinical laboratories.
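The grouped comparison described here, a Kruskal-Wallis omnibus test across instrument systems followed by pairwise Mann-Whitney tests, can be sketched with SciPy. The instrument-system names and limit values below are invented for illustration; they are not the survey's data.

```python
# Sketch of the survey's grouped analysis on synthetic data:
# upper reference limits of K+ grouped by instrument system.
from itertools import combinations
from scipy.stats import kruskal, mannwhitneyu

groups = {
    "system_A": [5.10, 5.00, 5.10, 5.30, 5.10, 5.00],
    "system_B": [5.50, 5.40, 5.50, 5.60, 5.50, 5.40],
    "system_C": [5.10, 5.20, 5.00, 5.10, 5.20, 5.10],
}

# Omnibus test: do the limits differ across all instrument systems?
h_stat, p_overall = kruskal(*groups.values())
print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_overall:.4f}")

# Pairwise follow-up: which pairs of systems differ?
for (name1, vals1), (name2, vals2) in combinations(groups.items(), 2):
    u_stat, p_pair = mannwhitneyu(vals1, vals2, alternative="two-sided")
    print(f"{name1} vs {name2}: U = {u_stat:.1f}, p = {p_pair:.4f}")
```

The nonparametric tests are appropriate here because reference-interval limits across laboratories need not be normally distributed.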

  17. Soils element activities for the period October 1973--September 1974

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fowler, E.B.; Essington, E.H.; White, M.G.

Soils Element activities were conducted on behalf of the U.S. Atomic Energy Commission's Nevada Applied Ecology Group (NAEG) program to provide source term information for the other program elements and maintain continuous cognizance of program requirements for sampling, sample preparation, and analysis. Activities included presentation of papers; participation in workshops; analysis of soil, vegetation, and animal tissue samples for (238)Pu, (239-240)Pu, (241)Am, (137)Cs, (60)Co, and gamma scan for routine and laboratory quality control purposes; preparation and analysis of animal tissue samples for NAEG laboratory certification; studies on a number of analytical, sample preparation, and sample collection procedures; and contributions to the evaluation of procedures for calculation of specialized counting statistics. (auth)

  18. Environmental Impact Assessment Sandia Laboratories, New Mexico.

    DTIC Science & Technology

    1977-05-01

an airplane crashing into the tank farm on takeoff or landing (Appendix E). The essence of the analysis is that national statistics indicate that a...happening is analyzed. The essence of the analysis is the estimation of several probabilities: of an aircraft being in or flying into this airspace...Carrot Family 245. Chimaya (Cymopterus fendleri) *246. (Aletes acaulis) Primrose Family *247. Rock jasmine (Androsace septentrionalis) 189 Olive Family

  19. Statistical Analysis of a Round-Robin Measurement Survey of Two Candidate Materials for a Seebeck Coefficient Standard Reference Material

    PubMed Central

    Lu, Z. Q. J.; Lowhorn, N. D.; Wong-Ng, W.; Zhang, W.; Thomas, E. L.; Otani, M.; Green, M. L.; Tran, T. N.; Caylor, C.; Dilley, N. R.; Downey, A.; Edwards, B.; Elsner, N.; Ghamaty, S.; Hogan, T.; Jie, Q.; Li, Q.; Martin, J.; Nolas, G.; Obara, H.; Sharp, J.; Venkatasubramanian, R.; Willigan, R.; Yang, J.; Tritt, T.

    2009-01-01

    In an effort to develop a Standard Reference Material (SRM™) for Seebeck coefficient, we have conducted a round-robin measurement survey of two candidate materials—undoped Bi2Te3 and Constantan (55 % Cu and 45 % Ni alloy). Measurements were performed in two rounds by twelve laboratories involved in active thermoelectric research using a number of different commercial and custom-built measurement systems and techniques. In this paper we report the detailed statistical analyses on the interlaboratory measurement results and the statistical methodology for analysis of irregularly sampled measurement curves in the interlaboratory study setting. Based on these results, we have selected Bi2Te3 as the prototype standard material. Once available, this SRM will be useful for future interlaboratory data comparison and instrument calibrations. PMID:27504212

  20. Identification of crop cultivars with consistently high lignocellulosic sugar release requires the use of appropriate statistical design and modelling

    PubMed Central

    2013-01-01

Background In this study, a multi-parent population of barley cultivars was grown in the field for two consecutive years and then straw saccharification (sugar release by enzymes) was subsequently analysed in the laboratory to identify the cultivars with the highest consistent sugar yield. This experiment was used to assess the benefit of accounting for both the multi-phase and multi-environment aspects of large-scale phenotyping experiments with field-grown germplasm through sound statistical design and analysis. Results Complementary designs at both the field and laboratory phases of the experiment ensured that non-genetic sources of variation could be separated from the genetic variation of cultivars, which was the main target of the study. The field phase included biological replication and plot randomisation. The laboratory phase employed re-randomisation and technical replication of samples within a batch, with a subset of cultivars chosen as duplicates that were randomly allocated across batches. The resulting data was analysed using a linear mixed model that incorporated field and laboratory variation and a cultivar by trial interaction, and ensured that the cultivar means were more accurately represented than if the non-genetic variation was ignored. The heritability detected was more than doubled in each year of the trial by accounting for the non-genetic variation in the analysis, clearly showing the benefit of this design and approach. Conclusions The importance of accounting for both field and laboratory variation, as well as the cultivar by trial interaction, by fitting a single statistical model (multi-environment trial, MET, model), was evidenced by changes in the list of the top 40 cultivars showing the highest sugar yields. Failure to account for this interaction resulted in only eight cultivars that were consistently in the top 40 in different years. Under the MET model, the correspondence between rankings was much higher, with 25 cultivars consistently in the top 40. 
This approach is suited to any multi-phase and multi-environment population-based genetic experiment. PMID:24359577
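The reported doubling of heritability can be made concrete with the standard variance-component formula for heritability on a cultivar-mean basis, H² = Vg / (Vg + Ve/r). This formula and the numbers below are an illustrative assumption; the paper's actual mixed-model estimates are not reproduced here.

```python
# Hedged sketch: line-mean heritability from variance components, assuming
# the standard formula H2 = Vg / (Vg + Ve / r) for r replicates.
# All numeric values are invented for illustration.
def heritability(v_genetic: float, v_residual: float, n_reps: int) -> float:
    """Broad-sense heritability on a cultivar-mean basis."""
    return v_genetic / (v_genetic + v_residual / n_reps)

# Leaving laboratory/batch variation unmodelled inflates the residual
# variance, which deflates heritability -- consistent with the reported
# doubling once non-genetic variation was accounted for.
naive = heritability(v_genetic=1.0, v_residual=6.0, n_reps=3)     # noise unmodelled
adjusted = heritability(v_genetic=1.0, v_residual=2.0, n_reps=3)  # noise removed
print(f"naive H2 = {naive:.2f}, adjusted H2 = {adjusted:.2f}")
```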

  1. Consequences of common data analysis inaccuracies in CNS trauma injury basic research.

    PubMed

    Burke, Darlene A; Whittemore, Scott R; Magnuson, David S K

    2013-05-15

    The development of successful treatments for humans after traumatic brain or spinal cord injuries (TBI and SCI, respectively) requires animal research. This effort can be hampered when promising experimental results cannot be replicated because of incorrect data analysis procedures. To identify and hopefully avoid these errors in future studies, the articles in seven journals with the highest number of basic science central nervous system TBI and SCI animal research studies published in 2010 (N=125 articles) were reviewed for their data analysis procedures. After identifying the most common statistical errors, the implications of those findings were demonstrated by reanalyzing previously published data from our laboratories using the identified inappropriate statistical procedures, then comparing the two sets of results. Overall, 70% of the articles contained at least one type of inappropriate statistical procedure. The highest percentage involved incorrect post hoc t-tests (56.4%), followed by inappropriate parametric statistics (analysis of variance and t-test; 37.6%). Repeated Measures analysis was inappropriately missing in 52.0% of all articles and, among those with behavioral assessments, 58% were analyzed incorrectly. Reanalysis of our published data using the most common inappropriate statistical procedures resulted in a 14.1% average increase in significant effects compared to the original results. Specifically, an increase of 15.5% occurred with Independent t-tests and 11.1% after incorrect post hoc t-tests. Utilizing proper statistical procedures can allow more-definitive conclusions, facilitate replicability of research results, and enable more accurate translation of those results to the clinic.
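The article's central finding, that uncorrected pairwise t-tests inflate the rate of "significant" effects, can be demonstrated with a small null simulation. Everything below (group counts, sample sizes, simulation settings) is invented for illustration and is not a reanalysis of the article's data.

```python
# Simulation sketch: under a true null, running all pairwise t-tests without
# correction inflates the family-wise error rate, while a Bonferroni
# correction keeps it near the nominal 5%.
from itertools import combinations
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
n_sims, n_groups, n_per_group, alpha = 500, 4, 10, 0.05
n_pairs = len(list(combinations(range(n_groups), 2)))  # 6 pairwise tests

false_pos_uncorrected = 0
false_pos_bonferroni = 0
for _ in range(n_sims):
    data = [rng.normal(0, 1, n_per_group) for _ in range(n_groups)]  # no real effect
    pvals = [ttest_ind(data[i], data[j]).pvalue
             for i, j in combinations(range(n_groups), 2)]
    false_pos_uncorrected += any(p < alpha for p in pvals)
    false_pos_bonferroni += any(p < alpha / n_pairs for p in pvals)

print(f"uncorrected FWER ~ {false_pos_uncorrected / n_sims:.2f}")
print(f"Bonferroni FWER  ~ {false_pos_bonferroni / n_sims:.2f}")
```

With six pairwise comparisons the uncorrected family-wise error rate runs several times the nominal level, which is the mechanism behind the 14.1% average increase in significant effects the authors observed.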

  2. Real-time, continuous water-quality monitoring in Indiana and Kentucky

    USGS Publications Warehouse

    Shoda, Megan E.; Lathrop, Timothy R.; Risch, Martin R.

    2015-01-01

    Water-quality “super” gages (also known as “sentry” gages) provide real-time, continuous measurements of the physical and chemical characteristics of stream water at or near selected U.S. Geological Survey (USGS) streamgages in Indiana and Kentucky. A super gage includes streamflow and water-quality instrumentation and representative stream sample collection for laboratory analysis. USGS scientists can use statistical surrogate models to relate instrument values to analyzed chemical concentrations at a super gage. Real-time, continuous and laboratory-analyzed concentration and load data are publicly accessible on USGS Web pages.
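A surrogate model of the kind described can be as simple as a least-squares regression relating a continuously sensed quantity to laboratory-analyzed concentrations. The turbidity/suspended-sediment pairing and all values below are invented for illustration; operational USGS surrogate models are developed site by site and often use transformed variables.

```python
# Minimal surrogate-model sketch: fit lab-analyzed concentrations against a
# continuously measured sensor value, then predict concentration in real time.
import numpy as np

turbidity = np.array([5.0, 12.0, 20.0, 35.0, 50.0, 80.0])   # sensor reading (FNU)
sediment = np.array([8.0, 19.0, 33.0, 55.0, 79.0, 130.0])   # lab result (mg/L)

slope, intercept = np.polyfit(turbidity, sediment, deg=1)   # least-squares fit

def predict_sediment(fnu: float) -> float:
    """Estimate suspended-sediment concentration from a turbidity reading."""
    return slope * fnu + intercept

print(f"predicted at 40 FNU: {predict_sediment(40.0):.1f} mg/L")
```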

  3. Evaluation of a New Spraying Machine for Barrier Treatment and Penetration of Bifenthrin on Vegetation Against Mosquitoes

    DTIC Science & Technology

    2015-03-01

one at the University of Florida Veterinary Entomology Laboratory (UF-VEL). Leaf samples for both laboratories were collected together. All samples...Mulla's formula (Mulla et al. 1971): % reduction = 100 − (C1/T1 × T2/C2) × 100. The C1 variable was the mean number of mosquitoes from the control site...statistical analysis was performed using JMP 11.1 software (SAS Institute Inc., Cary, NC). Treatment mortality was corrected with Abbott's formula
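The two formulas named in this record, written out. Mulla's formula compares control (C) and treatment (T) counts before (1) and after (2) treatment; Abbott's formula corrects observed treatment mortality for control mortality. The argument names follow the abstract's C1/T1/T2/C2 notation; the example numbers are invented.

```python
def mulla_percent_reduction(c1: float, t1: float, c2: float, t2: float) -> float:
    """Mulla et al. (1971): % reduction = 100 - (C1/T1 * T2/C2) * 100."""
    return 100.0 - (c1 / t1) * (t2 / c2) * 100.0

def abbott_corrected_mortality(treated_pct: float, control_pct: float) -> float:
    """Abbott's formula: correct observed % mortality for control % mortality."""
    return (treated_pct - control_pct) / (100.0 - control_pct) * 100.0

# Counts halve at the treatment site while the control site is unchanged:
print(mulla_percent_reduction(c1=50, t1=50, c2=50, t2=25))  # 50.0
print(abbott_corrected_mortality(treated_pct=80, control_pct=10))
```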

  4. Using Self-Reflection To Increase Science Process Skills in the General Chemistry Laboratory

    NASA Astrophysics Data System (ADS)

    Veal, William R.; Taylor, Dawne; Rogers, Amy L.

    2009-03-01

Self-reflection is a tool of instruction that has been used in the science classroom. Research has shown great promise in using video as a learning tool in the classroom. However, the integration of self-reflective practice using video in the general chemistry laboratory to help students develop process skills has not been done. Immediate video feedback and direct instruction were employed in a general chemistry laboratory course to improve students' mastery and understanding of basic and advanced process skills. Qualitative results and statistical analysis of quantitative data showed that self-reflection significantly helped students develop basic and advanced process skills, yet did not seem to influence the general understanding of the science content.

  5. Appraisal of within- and between-laboratory reproducibility of non-radioisotopic local lymph node assay using flow cytometry, LLNA:BrdU-FCM: comparison of OECD TG429 performance standard and statistical evaluation.

    PubMed

    Yang, Hyeri; Na, Jihye; Jang, Won-Hee; Jung, Mi-Sook; Jeon, Jun-Young; Heo, Yong; Yeo, Kyung-Wook; Jo, Ji-Hoon; Lim, Kyung-Min; Bae, SeungJin

    2015-05-05

Mouse local lymph node assay (LLNA, OECD TG429) is an alternative test replacing conventional guinea pig tests (OECD TG406) for the skin sensitization test, but the use of a radioisotopic agent, (3)H-thymidine, deters its active dissemination. A new non-radioisotopic LLNA, LLNA:BrdU-FCM, employs a non-radioisotopic analog, 5-bromo-2'-deoxyuridine (BrdU), and flow cytometry. For an analogous method, the OECD TG429 performance standard (PS) advises that two reference compounds be tested repeatedly and that the ECt(threshold) values obtained fall within acceptable ranges to prove within- and between-laboratory reproducibility. However, these criteria are somewhat arbitrary and the sample size for ECt is less than 5, raising concerns about insufficient reliability. Here, we explored various statistical methods to evaluate the reproducibility of LLNA:BrdU-FCM with stimulation index (SI), the raw data for ECt calculation, produced from 3 laboratories. Descriptive statistics along with graphical representation of SI were presented. For inferential statistics, parametric and non-parametric methods were applied to test the reproducibility of SI of a concurrent positive control, and the robustness of results was investigated. Descriptive statistics and graphical representation of SI alone could illustrate the within- and between-laboratory reproducibility. Inferential statistics employing parametric and nonparametric methods drew similar conclusions. While all labs passed within- and between-laboratory reproducibility criteria given by the OECD TG429 PS based on ECt values, statistical evaluation based on SI values showed that only two labs succeeded in achieving within-laboratory reproducibility. For those two labs that satisfied the within-lab reproducibility, between-laboratory reproducibility could also be attained based on inferential as well as descriptive statistics. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  6. A Laboratory Experiment on the Statistical Theory of Nuclear Reactions

    ERIC Educational Resources Information Center

    Loveland, Walter

    1971-01-01

Describes an undergraduate laboratory experiment on the statistical theory of nuclear reactions. The experiment involves measuring the relative cross sections for formation of a nucleus in its metastable excited state and its ground state by applying gamma-ray spectroscopy to an irradiated sample. Involves 3-4 hours of laboratory time plus…

  7. The effect of ion-exchange purification on the determination of plutonium at the New Brunswick Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, W.G.; Spaletto, M.I.; Lewis, K.

The method of plutonium (Pu) determination at the New Brunswick Laboratory (NBL) consists of a combination of ion-exchange purification followed by controlled-potential coulometric analysis (IE/CPC). The present report's purpose is to quantify any detectable Pu loss occurring in the ion-exchange (IE) purification step which would cause a negative bias in the NBL method for Pu analysis. The magnitude of any such loss would be contained within the reproducibility (0.05%) of the IE/CPC method, which utilizes a state-of-the-art autocoulometer developed at NBL. When the NBL IE/CPC method is used for Pu analysis, any loss in ion-exchange purification (<0.05%) is confounded with the repeatability of the ion exchange and the precision of the CPC analysis technique (<0.05%). Consequently, to detect a bias in the IE/CPC method due to the IE alone using the IE/CPC method itself requires that many randomized analyses on a single material be performed over time and that statistical analysis of the data be performed. The initial approach described in this report to quantify any IE loss was an independent method, Isotope Dilution Mass Spectrometry; however, the number of analyses performed was insufficient to assign a statistically significant value to the IE loss (<0.02% of 10 mg samples of Pu). The second method used for quantifying any IE loss of Pu was multiple ion exchanges of the same Pu aliquant; the small number of analyses possible per individual IE together with the column-to-column variability over multiple ion exchanges prevented statistical detection of any loss of <0.05%. 12 refs.
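Why "many randomized analyses" are needed can be seen from a back-of-envelope power calculation: the number of replicates required to resolve a mean bias delta against a method standard deviation sigma, using the standard normal-approximation formula n = ((z_alpha + z_beta) * sigma / delta)². The 0.05% precision and <0.02% loss figures come from the abstract; treating them directly as sigma and delta is an illustrative assumption.

```python
# Hedged sketch: replicate analyses needed to detect a small mean bias with
# a one-sample test at ~80% power and two-sided alpha = 0.05.
def analyses_needed(sigma: float, delta: float,
                    z_alpha: float = 1.96, z_beta: float = 0.84) -> float:
    """Approximate sample size n = ((z_alpha + z_beta) * sigma / delta)**2."""
    return ((z_alpha + z_beta) * sigma / delta) ** 2

# Resolving a 0.02% bias against 0.05% method precision:
print(f"~{analyses_needed(sigma=0.05, delta=0.02):.0f} analyses")  # ~49 analyses
```

Dozens of randomized replicate analyses of a single material are needed before a bias smaller than half the method's reproducibility becomes statistically detectable.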

  8. Use of statistical tools to evaluate the reductive dechlorination of high levels of TCE in microcosm studies.

    PubMed

    Harkness, Mark; Fisher, Angela; Lee, Michael D; Mack, E Erin; Payne, Jo Ann; Dworatzek, Sandra; Roberts, Jeff; Acheson, Carolyn; Herrmann, Ronald; Possolo, Antonio

    2012-04-01

    A large, multi-laboratory microcosm study was performed to select amendments for supporting reductive dechlorination of high levels of trichloroethylene (TCE) found at an industrial site in the United Kingdom (UK) containing dense non-aqueous phase liquid (DNAPL) TCE. The study was designed as a fractional factorial experiment involving 177 bottles distributed between four industrial laboratories and was used to assess the impact of six electron donors, bioaugmentation, addition of supplemental nutrients, and two TCE levels (0.57 and 1.90 mM or 75 and 250 mg/L in the aqueous phase) on TCE dechlorination. Performance was assessed based on the concentration changes of TCE and reductive dechlorination degradation products. The chemical data was evaluated using analysis of variance (ANOVA) and survival analysis techniques to determine both main effects and important interactions for all the experimental variables during the 203-day study. The statistically based design and analysis provided powerful tools that aided decision-making for field application of this technology. The analysis showed that emulsified vegetable oil (EVO), lactate, and methanol were the most effective electron donors, promoting rapid and complete dechlorination of TCE to ethene. Bioaugmentation and nutrient addition also had a statistically significant positive impact on TCE dechlorination. In addition, the microbial community was measured using phospholipid fatty acid analysis (PLFA) for quantification of total biomass and characterization of the community structure and quantitative polymerase chain reaction (qPCR) for enumeration of Dehalococcoides organisms (Dhc) and the vinyl chloride reductase (vcrA) gene. The highest increase in levels of total biomass and Dhc was observed in the EVO microcosms, which correlated well with the dechlorination results. Copyright © 2012 Elsevier B.V. All rights reserved.
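The screening logic of such a study, testing whether the electron-donor factor has a significant main effect on dechlorination performance, can be sketched with a one-way ANOVA. The donor names come from the abstract; the yield values and replicate counts below are invented for illustration.

```python
# Illustrative one-way ANOVA (synthetic data): is there a donor main effect
# on the fraction of TCE converted to ethene?
from scipy.stats import f_oneway

yields = {  # fraction of TCE converted to ethene per microcosm (invented)
    "EVO":      [0.92, 0.88, 0.95, 0.90],
    "lactate":  [0.85, 0.89, 0.83, 0.87],
    "methanol": [0.80, 0.84, 0.82, 0.86],
    "none":     [0.10, 0.15, 0.08, 0.12],
}

f_stat, p_value = f_oneway(*yields.values())
print(f"donor main effect: F = {f_stat:.1f}, p = {p_value:.2e}")
```

The actual study used a fractional factorial design, so its ANOVA also estimated interactions among donors, bioaugmentation, nutrients, and TCE level; this sketch shows only the single-factor case.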

  9. Determination of Soil Moisture Content using Laboratory Experimental and Field Electrical Resistivity Values

    NASA Astrophysics Data System (ADS)

    Hazreek, Z. A. M.; Rosli, S.; Fauziah, A.; Wijeyesekera, D. C.; Ashraf, M. I. M.; Faizal, T. B. M.; Kamarudin, A. F.; Rais, Y.; Dan, M. F. Md; Azhar, A. T. S.; Hafiz, Z. M.

    2018-04-01

The efficiency of civil engineering structures requires comprehensive geotechnical data obtained from site investigation. In the past, conventional site investigation relied heavily on drilling techniques and thus suffered from several limitations: it is time-consuming, expensive, and provides limited data coverage. Consequently, this study presents the determination of soil moisture content using laboratory experimental and field electrical resistivity values (ERV). Field and laboratory electrical resistivity (ER) tests were performed using an ABEM SAS4000 and a Nilsson 400 soil resistance meter. The soil samples used for the resistivity tests were characterized by particle size distribution and moisture content tests according to BS 1377 (1990). Field ER data were processed using RES2DINV software, while laboratory ER data were analyzed using SPSS and Excel. The correlation of ERV with moisture content showed a moderate relationship (r = 0.506). Moreover, the coefficient of determination showed that the statistical correlation obtained was very good (R2 = 0.9382). To determine soil moisture content from the statistical correlation (w = 110.68ρ^-0.347), a correction factor C, established from the laboratory and field ERV, was given as 19.27. Finally, this study has shown that a basic geotechnical property, water content, can be determined by integrating laboratory and field ERV data analysis, complementing the conventional approach through its economy, speed, and wider data coverage.
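The reported power-law correlation can be written out directly. The equation w = 110.68·ρ^-0.347 is quoted from the abstract; how the correction factor C = 19.27 is applied is not specified there, so it is left as a separately reported constant rather than guessed into the formula.

```python
# The abstract's reported laboratory correlation between soil moisture
# content w (%) and electrical resistivity rho (ohm.m).
def moisture_content_pct(resistivity_ohm_m: float) -> float:
    """Estimate soil moisture content (%) from electrical resistivity."""
    return 110.68 * resistivity_ohm_m ** -0.347

for rho in (10.0, 100.0, 1000.0):
    print(f"rho = {rho:7.1f} ohm.m  ->  w = {moisture_content_pct(rho):5.1f} %")
```

As expected physically, estimated moisture content falls as resistivity rises, since pore water is the dominant conduction path in soil.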

  10. Reconnection properties in Kelvin-Helmholtz instabilities

    NASA Astrophysics Data System (ADS)

    Vernisse, Y.; Lavraud, B.; Eriksson, S.; Gershman, D. J.; Dorelli, J.; Pollock, C. J.; Giles, B. L.; Aunai, N.; Avanov, L. A.; Burch, J.; Chandler, M. O.; Coffey, V. N.; Dargent, J.; Ergun, R.; Farrugia, C. J.; Genot, V. N.; Graham, D.; Hasegawa, H.; Jacquey, C.; Kacem, I.; Khotyaintsev, Y. V.; Li, W.; Magnes, W.; Marchaudon, A.; Moore, T. E.; Paterson, W. R.; Penou, E.; Phan, T.; Retino, A.; Schwartz, S. J.; Saito, Y.; Sauvaud, J. A.; Schiff, C.; Torbert, R. B.; Wilder, F. D.; Yokota, S.

    2017-12-01

Kelvin-Helmholtz instabilities are unique laboratories for studying strong-guide-field reconnection processes. In particular, unlike at the typical dayside magnetopause, conditions across the magnetopause in KH vortices are quasi-symmetric, with small differences in beta and low magnetic shear angles. We study these properties by means of a statistical analysis of high-resolution data from the Magnetospheric Multiscale mission. Several Kelvin-Helmholtz instability events past the terminator plane and a long-lasting dayside instability event were used to produce this statistical analysis. Early results show consistency between the data and the theory. In addition, the results emphasize the importance of the thickness of the magnetopause as a driver of magnetic reconnection in low-magnetic-shear events.

  11. How To Make an Impact with Planetary Science. Part II.

    ERIC Educational Resources Information Center

    Scott, Robert

    2002-01-01

    Explains how the moon provides information about the evolution of the solar system and offers scope for physics-based investigations. Uses statistical analysis of real scientific data with which students can predict the diameter and depth of impact craters then compare them with data gathered in institutions or laboratories. (Author/YDS)

  12. Argonne National Laboratory Li-alloy/FeS cell testing and R and D programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gay, E.C.

    1982-01-01

Groups of 12 or more identical Li-alloy/FeS cells fabricated by Eagle-Picher Industries, Inc. and Gould Inc. were operated at Argonne National Laboratory (ANL) in the status cell test program to obtain data for statistical analysis of cell cycle life and failure modes. The cells were full-size electric vehicle battery cells (150 to 350 Ah capacity) and they were cycled at the 4-h discharge rate and 8-h charge rate. The end of life was defined as a 20% loss of capacity or a decrease in the coulombic efficiency to less than 95%. Seventy-four cells (six groups of identical cells) were cycle-life tested and the results were analyzed statistically. The ultimate goal of this analysis was to predict cell and battery reliability. Testing of groups of identical cells also provided a means of identifying common failure modes which were eliminated by cell design changes. Mean time to failure (MTTF) for the cells based on the Weibull distribution is presented.
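The final step mentioned, MTTF from a fitted Weibull distribution, follows the standard identity MTTF = η·Γ(1 + 1/β) for scale η and shape β. The formula is standard reliability theory; the parameter values below are invented and do not reproduce the report's fits.

```python
# Mean time to failure of a two-parameter Weibull life distribution.
import math

def weibull_mttf(scale_eta: float, shape_beta: float) -> float:
    """MTTF = eta * Gamma(1 + 1/beta), in the same units as the scale."""
    return scale_eta * math.gamma(1.0 + 1.0 / shape_beta)

# beta = 1 reduces to the exponential case, where MTTF equals the scale:
print(weibull_mttf(scale_eta=300.0, shape_beta=1.0))   # 300.0 cycles
# beta > 1 models wear-out failures, typical of cycled battery cells:
print(round(weibull_mttf(scale_eta=300.0, shape_beta=2.0), 1))
```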

  13. Statistical analysis of Turbine Engine Diagnostic (TED) field test data

    NASA Astrophysics Data System (ADS)

    Taylor, Malcolm S.; Monyak, John T.

    1994-11-01

During the summer of 1993, a field test of turbine engine diagnostic (TED) software, developed jointly by the U.S. Army Research Laboratory and the U.S. Army Ordnance Center and School, was conducted at Fort Stewart, GA. The data were collected in conformance with a cross-over design, some of whose considerations are detailed. The initial analysis of the field test data was exploratory, followed by a more formal investigation. Technical aspects of the data analysis and the insights that were elicited are reported.

  14. Scaling similarities of multiple fracturing of solid materials

    NASA Astrophysics Data System (ADS)

    Kapiris, P. G.; Balasis, G. T.; Kopanas, J. A.; Antonopoulos, G. N.; Peratzakis, A. S.; Eftaxias, K. A.

    2004-02-01

It has recently been reported that electromagnetic flashes of low-energy gamma-rays emitted during multi-fracturing on a neutron star, and electromagnetic pulses emitted in the laboratory by a disordered material subjected to an increasing external load, share distinctive statistical properties with earthquakes, such as power-law energy distributions (Cheng et al., 1996; Kossobokov et al., 2000; Rabinovitch et al., 2001; Sornette and Helmstetter, 2002). Neutron starquakes may release strain energies of up to 10^46 erg, while fractures in laboratory samples release strain energies of only a fraction of an erg. An earthquake fault region can build up strain energy of up to approximately 10^26 erg for the strongest earthquakes. Clear sequences of kilohertz-megahertz electromagnetic avalanches have been detected from a few days up to a few hours prior to recent destructive earthquakes in Greece. A question that naturally arises is whether the pre-seismic electromagnetic fluctuations also share these statistical properties. Our study justifies a positive answer. Our analysis also reveals "symptoms" of a transition to the main rupture common to earthquake sequences and the acoustic emission pulses observed during laboratory experiments (Maes et al., 1998).
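The kind of scaling check described, estimating a power-law exponent from an energy distribution, is commonly done with the continuous maximum-likelihood estimator alpha = 1 + n / Σ ln(x_i / x_min) (the Clauset-style estimator). The data below are synthetic draws from a known power law, not the paper's electromagnetic measurements.

```python
# Power-law exponent estimation sketch on synthetic data.
import math
import random

def powerlaw_alpha_mle(samples: list[float], x_min: float) -> float:
    """Continuous power-law exponent MLE for samples >= x_min."""
    tail = [x for x in samples if x >= x_min]
    return 1.0 + len(tail) / sum(math.log(x / x_min) for x in tail)

# Draw from p(x) ~ x**-2.0 above x_min via inverse-transform sampling:
random.seed(42)
true_alpha, x_min = 2.0, 1.0
data = [x_min * (1.0 - random.random()) ** (-1.0 / (true_alpha - 1.0))
        for _ in range(20000)]
print(f"estimated alpha = {powerlaw_alpha_mle(data, x_min):.2f}")
```

With enough samples the estimate recovers the true exponent closely, which is the property that lets starquake, laboratory, and earthquake catalogs be compared on the same footing.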

  15. Statistics of natural movements are reflected in motor errors.

    PubMed

    Howard, Ian S; Ingram, James N; Körding, Konrad P; Wolpert, Daniel M

    2009-09-01

    Humans use their arms to engage in a wide variety of motor tasks during everyday life. However, little is known about the statistics of these natural arm movements. Studies of the sensory system have shown that the statistics of sensory inputs are key to determining sensory processing. We hypothesized that the statistics of natural everyday movements may, in a similar way, influence motor performance as measured in laboratory-based tasks. We developed a portable motion-tracking system that could be worn by subjects as they went about their daily routine outside of a laboratory setting. We found that the well-documented symmetry bias is reflected in the relative incidence of movements made during everyday tasks. Specifically, symmetric and antisymmetric movements are predominant at low frequencies, whereas only symmetric movements are predominant at high frequencies. Moreover, the statistics of natural movements, that is, their relative incidence, correlated with subjects' performance on a laboratory-based phase-tracking task. These results provide a link between natural movement statistics and motor performance and confirm that the symmetry bias documented in laboratory studies is a natural feature of human movement.

  16. Laboratory-based versus non-laboratory-based method for assessment of cardiovascular disease risk: the NHANES I Follow-up Study cohort

    PubMed Central

    Gaziano, Thomas A; Young, Cynthia R; Fitzmaurice, Garrett; Atwood, Sidney; Gaziano, J Michael

    2008-01-01

Summary Background Around 80% of all cardiovascular deaths occur in developing countries. Assessment of those patients at high risk is an important strategy for prevention. Since developing countries have limited resources for prevention strategies that require laboratory testing, we assessed whether a risk prediction method that did not require any laboratory tests could be as accurate as one requiring laboratory information. Methods The National Health and Nutrition Examination Survey (NHANES) was a prospective cohort study of 14 407 US participants aged 25–74 years at the time they were first examined (between 1971 and 1975). Our follow-up study population included participants with complete information on these surveys who did not report a history of cardiovascular disease (myocardial infarction, heart failure, stroke, angina) or cancer, yielding an analysis dataset of N=6186. We compared how well either method could predict first-time fatal and non-fatal cardiovascular disease events in this cohort. For the laboratory-based model, which required blood testing, we used standard risk factors to assess risk of cardiovascular disease: age, systolic blood pressure, smoking status, total cholesterol, reported diabetes status, and current treatment for hypertension. For the non-laboratory-based model, we substituted body-mass index for cholesterol. Findings In the cohort of 6186, there were 1529 first-time cardiovascular events and 578 (38%) deaths due to cardiovascular disease over 21 years. In women, the laboratory-based model was useful for predicting events, with a c statistic of 0·829. The c statistic of the non-laboratory-based model was 0·831. In men, the results were similar (0·784 for the laboratory-based model and 0·783 for the non-laboratory-based model). Results were similar between the laboratory-based and non-laboratory-based models in both men and women when restricted to fatal events only. 
Interpretation A method that uses non-laboratory-based risk factors predicted cardiovascular events as accurately as one that relied on laboratory-based values. This approach could simplify risk assessment in situations where laboratory testing is inconvenient or unavailable. PMID:18342687
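The discrimination measure compared in this study, the c statistic, is the fraction of event/non-event pairs in which the model assigns the event the higher predicted risk (ties counting one half). A minimal pure-Python version, with invented risk scores for illustration:

```python
# Pairwise-concordance c statistic (equivalent to the ROC AUC for a
# binary outcome).
from itertools import product

def c_statistic(event_scores: list[float], nonevent_scores: list[float]) -> float:
    """Concordance between predicted risks of events and non-events."""
    concordant = 0.0
    for e, n in product(event_scores, nonevent_scores):
        if e > n:
            concordant += 1.0
        elif e == n:
            concordant += 0.5   # ties count one half
    return concordant / (len(event_scores) * len(nonevent_scores))

events = [0.90, 0.70, 0.40]       # predicted risks for people with an event
nonevents = [0.10, 0.30, 0.40, 0.20]
print(f"c = {c_statistic(events, nonevents):.3f}")
```

A c of 0.5 means no discrimination and 1.0 means perfect ranking, so the study's near-identical values (e.g. 0.829 vs 0.831 in women) indicate the two models rank risk almost equally well.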

  17. DB4US: A Decision Support System for Laboratory Information Management

    PubMed Central

    Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael

    2012-01-01

Background Until recently, laboratory automation has focused primarily on improving hardware. Future advances are concentrated on intelligent software since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates the management of laboratory quality-indicator information. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to the use of demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. Objective To develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. Methods We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like user web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators, and offers the possibility to drill down from high-level metrics to more detailed summaries. 
The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of different parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. Results DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. In our evaluation we show the positive impact of this methodology for laboratory professionals, since the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources by the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. Conclusions The proposed methodology and the accompanying web application, DB4US, automates the processing of information related to laboratory quality indicators and offers a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources. PMID:23608745

  18. 49 CFR 40.111 - When and how must a laboratory disclose statistical summaries and other information it maintains?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Secretary of Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Drug Testing Laboratories § 40.111 When and how must a laboratory disclose statistical summaries and other... a report indicating that not enough testing was conducted to warrant a summary. You may transmit the...

  19. 49 CFR 40.111 - When and how must a laboratory disclose statistical summaries and other information it maintains?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Secretary of Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Drug Testing Laboratories § 40.111 When and how must a laboratory disclose statistical summaries and other... a report indicating that not enough testing was conducted to warrant a summary. You may transmit the...

  20. 49 CFR 40.111 - When and how must a laboratory disclose statistical summaries and other information it maintains?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Secretary of Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Drug Testing Laboratories § 40.111 When and how must a laboratory disclose statistical summaries and other... a report indicating that not enough testing was conducted to warrant a summary. You may transmit the...

  1. 49 CFR 40.111 - When and how must a laboratory disclose statistical summaries and other information it maintains?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Secretary of Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Drug Testing Laboratories § 40.111 When and how must a laboratory disclose statistical summaries and other... a report indicating that not enough testing was conducted to warrant a summary. You may transmit the...

  2. FY 1999 Laboratory Directed Research and Development annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PJ Hughes

    2000-06-13

    A short synopsis of each project is given covering the following main areas of research and development: Atmospheric sciences; Biotechnology; Chemical and instrumentation analysis; Computer and information science; Design and manufacture engineering; Ecological science; Electronics and sensors; Experimental technology; Health protection and dosimetry; Hydrologic and geologic science; Marine sciences; Materials science; Nuclear science and engineering; Process science and engineering; Sociotechnical systems analysis; Statistics and applied mathematics; and Thermal and energy systems.

  3. Clinical pharmacology quality assurance program: models for longitudinal analysis of antiretroviral proficiency testing for international laboratories.

    PubMed

    DiFrancesco, Robin; Rosenkranz, Susan L; Taylor, Charlene R; Pande, Poonam G; Siminski, Suzanne M; Jenny, Richard W; Morse, Gene D

    2013-10-01

    Among National Institutes of Health HIV Research Networks conducting multicenter trials, samples from protocols that span several years are analyzed at multiple clinical pharmacology laboratories (CPLs) for multiple antiretrovirals. Drug assay data are, in turn, entered into study-specific data sets that are used for pharmacokinetic analyses, merged to conduct cross-protocol pharmacokinetic analysis, and integrated with pharmacogenomics research to investigate pharmacokinetic-pharmacogenetic associations. The CPLs participate in a semiannual proficiency testing (PT) program implemented by the Clinical Pharmacology Quality Assurance program. Using results from multiple PT rounds, longitudinal analyses of recovery reflect accuracy and precision within and across laboratories. The objectives of this longitudinal analysis of PT across multiple CPLs were to develop and test statistical models that longitudinally: (1) assess the precision and accuracy of concentrations reported by individual CPLs and (2) determine factors associated with round-specific and long-term assay accuracy, precision, and bias using a new regression model. A measure of absolute recovery is explored as a simultaneous measure of accuracy and precision. Overall, the analysis outcomes assured 97% accuracy (within ±20% of the final target concentration) for all 21 drug concentration results reported for clinical trial samples by multiple CPLs. Using the Clinical Laboratory Improvement Act acceptance criterion of meeting limits for ≥2/3 consecutive rounds, all 10 laboratories that participated in 3 or more rounds per analyte maintained Clinical Laboratory Improvement Act proficiency. Significant associations were present between magnitude of error and CPL (Kruskal-Wallis P < 0.001) and antiretroviral (Kruskal-Wallis P < 0.001).
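
    The ±20% acceptance window used above is a simple percent-recovery check. A minimal sketch; the function names and example concentrations are invented for illustration:

```python
def percent_recovery(measured, target):
    """Reported concentration as a percentage of the PT target concentration."""
    return 100.0 * measured / target

def within_acceptance(measured, target, tolerance_pct=20.0):
    """True if the reported value lies within +/-20% of the target."""
    return abs(percent_recovery(measured, target) - 100.0) <= tolerance_pct

print(within_acceptance(1.15, 1.00))  # True  (115% recovery)
print(within_acceptance(0.75, 1.00))  # False (75% recovery)
```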

  4. Chromosome aberration analysis in peripheral lymphocytes of Gulf War and Balkans War veterans.

    PubMed

    Schröder, H; Heimers, A; Frentzel-Beyme, R; Schott, A; Hoffmann, W

    2003-01-01

    Chromosome aberrations and sister chromatid exchanges (SCEs) were determined in standard peripheral lymphocyte metaphase preparations of 13 British Gulf War veterans, two veterans of the recent war in the Balkans and one veteran of both wars. All 16 volunteers suspected exposure to depleted uranium (DU) while deployed at the two different theatres of war in 1990 and later. The Bremen laboratory control served as a reference in this study. Compared with this control there was a statistically significant increase in the frequency of dicentric chromosomes (dic) and centric ring chromosomes (cR) in the veterans' group, indicating a previous exposure to ionising radiation. The statistically significant overdispersion of dic and cR indicates non-uniform irradiation, as would be expected after non-uniform exposure and/or exposure to radiation with a high linear energy transfer (LET). The frequency of SCEs was decreased when compared with the laboratory control.
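
    Overdispersion of aberration counts is conventionally assessed with an index-of-dispersion (variance-to-mean) test on per-cell counts. A sketch using the common normal approximation u = (D - (n-1)) / sqrt(2(n-1)); this is the standard textbook test, not necessarily the exact procedure of the study, and the example counts are invented:

```python
from statistics import mean, variance

def dispersion_u(counts):
    """Index-of-dispersion test for per-cell counts (e.g. dicentrics).

    Under a Poisson model, D = (n-1)*s^2/xbar is ~chi-square(n-1); u is its
    normal approximation, and u > 1.96 signals significant overdispersion.
    """
    n = len(counts)
    d = (n - 1) * variance(counts) / mean(counts)
    return (d - (n - 1)) / (2 * (n - 1)) ** 0.5

uniform_counts = [0, 1, 0, 2, 1, 0, 1, 1, 0, 2]  # roughly Poisson-like
clustered_counts = [0] * 90 + [5] * 10           # strongly overdispersed
```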

  5. Recovery of several volatile organic compounds from simulated water samples: Effect of transport and storage

    USGS Publications Warehouse

    Friedman, L.C.; Schroder, L.J.; Brooks, M.G.

    1986-01-01

    Solutions containing volatile organic compounds were prepared in organic-free water and 2% methanol and submitted to two U.S. Geological Survey laboratories. Data from the determination of volatile compounds in these samples were compared to analytical data for the same volatile compounds that had been kept in solutions 100 times more concentrated until immediately before analysis; there was no statistically significant difference in the analytical recoveries. Addition of 2% methanol to the storage containers hindered the recovery of bromomethane and vinyl chloride. Methanol addition did not enhance sample stability. Further, there was no statistically significant difference in results from the two laboratories, and the recovery efficiency was more than 80% in more than half of the determinations made. In a subsequent study, six of eight volatile compounds showed no significant loss of recovery after 34 days.

  6. A Laboratory Experiment, Based on the Maillard Reaction, Conducted as a Project in Introductory Statistics

    ERIC Educational Resources Information Center

    Kravchuk, Olena; Elliott, Antony; Bhandari, Bhesh

    2005-01-01

    A simple laboratory experiment, based on the Maillard reaction, served as a project in Introductory Statistics for undergraduates in Food Science and Technology. By using the principles of randomization and replication and reflecting on the sources of variation in the experimental data, students reinforced the statistical concepts and techniques…

  7. Evaluation of Brazilian Sugarcane Bagasse Characterization: An Interlaboratory Comparison Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sluiter, Justin B.; Chum, Helena; Gomes, Absai C.

    2016-05-01

    This paper describes a study of the variability of measured composition for a single bulk sugarcane bagasse conducted across eight laboratories using similar analytical methods, with the purpose of determining the expected variation for compositional analysis performed by different laboratories. The results show good agreement of measured composition within a single laboratory, but greater variability when results are compared among laboratories. These interlaboratory variabilities do not seem to be associated with a specific method or technique or any single piece of instrumentation. The summary censored statistics provide mean values and pooled standard deviations as follows: total extractives 6.7% (0.6%), whole ash 1.5% (0.2%), glucan 42.3% (1.2%), xylan 22.3% (0.5%), total lignin 21.3% (0.4%), and total mass closure 99.4% (2.9%).
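
    The pooled standard deviations quoted in parentheses combine within-laboratory variances weighted by their degrees of freedom. A minimal sketch of that standard formula; the replicate data below are invented, not the study's measurements:

```python
from statistics import variance

def pooled_sd(groups):
    """Pooled standard deviation across groups (e.g. laboratories),
    each group being a list of replicate measurements."""
    num = sum((len(g) - 1) * variance(g) for g in groups)
    den = sum(len(g) - 1 for g in groups)
    return (num / den) ** 0.5

# Two hypothetical labs measuring glucan content (%), three replicates each.
labs = [[42.1, 42.5, 42.9], [41.0, 41.5, 42.0]]
```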

  8. WHO Melting-Point Reference Substances

    PubMed Central

    Bervenmark, H.; Diding, N. Å.; Öhrner, B.

    1963-01-01

    Batches of 13 highly purified chemicals, intended for use as reference substances in the calibration of apparatus for melting-point determinations, have been subjected to a collaborative assay by 15 laboratories in 13 countries. All the laboratories performed melting-point determinations by the capillary methods described in the proposed text for the second edition of the Pharmacopoea Internationalis and some, in addition, carried out determinations by the microscope hot stage (Kofler) method, using both the “going-through” and the “equilibrium” technique. Statistical analysis of the data obtained by the capillary method showed that the within-laboratory variation was small and that the between-laboratory variation, though constituting the greatest part of the whole variance, was not such as to warrant the exclusion of any laboratory from the evaluation of the results. The average values of the melting-points obtained by the laboratories can therefore be used as constants for the substances in question, which have accordingly been established as WHO Melting-Point Reference Substances and included in the WHO collection of authentic chemical substances. As to the microscope hot stage method, analysis of the results indicated that the values obtained by the “going-through” technique did not differ significantly from those obtained by the capillary method, but the values obtained by the “equilibrium” technique were mostly significantly lower. PMID:20604137

  9. [How reliable is the monitoring for doping?].

    PubMed

    Hüsler, J

    1990-12-01

    The reliability of doping control, i.e. of the chemical analysis of urine samples in the accredited laboratories and of their decisions, is discussed using probabilistic and statistical methods. Essentially, we evaluated and estimated the positive predictive value, the probability that a urine sample contains prohibited doping substances given a positive test decision. Since statistical data and evidence are lacking for some important quantities related to the predictive value, an exact evaluation is not possible; only conservative lower bounds can be given. We found that the predictive value is at least 90% or 95% with respect to the analysis and decision based on the A-sample only, and at least 99% with respect to both the A- and B-samples. A more realistic assessment, though without sufficient statistical confidence, suggests that the true predictive value is considerably larger than these lower estimates.
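
    The positive predictive value discussed here follows from Bayes' theorem given the prevalence of doping, test sensitivity, and test specificity. A sketch with invented illustrative numbers, not the paper's actual estimates:

```python
def positive_predictive_value(prevalence, sensitivity, specificity):
    """P(substance actually present | positive test), via Bayes' theorem."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# Hypothetical figures: 5% of samples doped, 95% sensitivity, 99.9% specificity.
ppv_a = positive_predictive_value(0.05, 0.95, 0.999)
# Confirming with the B-sample effectively raises specificity, raising the PPV.
ppv_ab = positive_predictive_value(0.05, 0.95, 0.9999)
```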

  10. A statistically derived index for classifying East Coast fever reactions in cattle challenged with Theileria parva under experimental conditions.

    PubMed

    Rowlands, G J; Musoke, A J; Morzaria, S P; Nagda, S M; Ballingall, K T; McKeever, D J

    2000-04-01

    A statistically derived disease reaction index based on parasitological, clinical and haematological measurements observed in 309 5 to 8-month-old Boran cattle following laboratory challenge with Theileria parva is described. Principal component analysis was applied to 13 measures including first appearance of schizonts, first appearance of piroplasms and first occurrence of pyrexia, together with the duration and severity of these symptoms, and white blood cell count. The first principal component, which was based on approximately equal contributions of the 13 variables, provided the definition for the disease reaction index, defined on a scale of 0-10. As well as providing a more objective measure of the severity of the reaction, the continuous nature of the index score enables more powerful statistical analysis of the data compared with that which has been previously possible through clinically derived categories of non-, mild, moderate and severe reactions.
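
    Because the first principal component loaded the 13 measures roughly equally, such an index can be approximated by standardizing each measure, averaging, and rescaling to 0-10. A simplified sketch under that equal-loading assumption, not the authors' exact principal component analysis; two invented measures stand in for the 13:

```python
from statistics import mean, stdev

def reaction_index(data):
    """Severity index on a 0-10 scale: z-score each column, average across
    columns (approximating a first PC with equal loadings), then min-max
    rescale so the mildest animal scores 0 and the most severe scores 10."""
    cols = list(zip(*data))
    mus = [mean(c) for c in cols]
    sds = [stdev(c) for c in cols]
    raw = [mean((x - m) / s for x, m, s in zip(row, mus, sds)) for row in data]
    lo, hi = min(raw), max(raw)
    return [10 * (r - lo) / (hi - lo) for r in raw]

# Three hypothetical animals, two severity measures each.
animals = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
```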

  11. The statistical analysis of circadian phase and amplitude in constant-routine core-temperature data

    NASA Technical Reports Server (NTRS)

    Brown, E. N.; Czeisler, C. A.

    1992-01-01

    Accurate estimation of the phases and amplitude of the endogenous circadian pacemaker from constant-routine core-temperature series is crucial for making inferences about the properties of the human biological clock from data collected under this protocol. This paper presents a set of statistical methods based on a harmonic-regression-plus-correlated-noise model for estimating the phases and the amplitude of the endogenous circadian pacemaker from constant-routine core-temperature data. The methods include a Bayesian Monte Carlo procedure for computing the uncertainty in these circadian functions. We illustrate the techniques with a detailed study of a single subject's core-temperature series and describe their relationship to other statistical methods for circadian data analysis. In our laboratory, these methods have been successfully used to analyze more than 300 constant routines and provide a highly reliable means of extracting phase and amplitude information from core-temperature data.
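
    For evenly spaced samples spanning whole 24-hour cycles, the single-harmonic regression estimates of mesor, amplitude, and phase reduce to closed-form sums. A sketch of that special case only; the paper's method additionally models correlated noise and quantifies uncertainty, which this omits:

```python
from math import atan2, cos, pi, sin, sqrt

def cosinor(times_h, temps, period_h=24.0):
    """Fit T(t) = M + a*cos(w t) + b*sin(w t) using the closed-form sums valid
    for even sampling over whole cycles; return (mesor, amplitude, peak hour)."""
    n = len(times_h)
    w = 2 * pi / period_h
    m = sum(temps) / n
    a = 2 / n * sum((y - m) * cos(w * t) for t, y in zip(times_h, temps))
    b = 2 / n * sum((y - m) * sin(w * t) for t, y in zip(times_h, temps))
    amplitude = sqrt(a * a + b * b)
    peak_h = (atan2(b, a) / w) % period_h  # time at which the fitted curve peaks
    return m, amplitude, peak_h

# Synthetic core-temperature series: mesor 37.0, amplitude 0.4, peak at 17 h.
ts = [i * 0.5 for i in range(48)]
ys = [37.0 + 0.4 * cos(2 * pi / 24 * (t - 17.0)) for t in ts]
```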

  12. Psychology, Science, and Knowledge Construction: Broadening Perspectives from the Replication Crisis.

    PubMed

    Shrout, Patrick E; Rodgers, Joseph L

    2018-01-04

    Psychology advances knowledge by testing statistical hypotheses using empirical observations and data. The expectation is that most statistically significant findings can be replicated in new data and in new laboratories, but in practice many findings have replicated less often than expected, leading to claims of a replication crisis. We review recent methodological literature on questionable research practices, meta-analysis, and power analysis to explain the apparently high rates of failure to replicate. Psychologists can improve research practices to advance knowledge in ways that improve replicability. We recommend that researchers adopt open science conventions of preregistration and full disclosure and that replication efforts be based on multiple studies rather than on a single replication attempt. We call for more sophisticated power analyses, careful consideration of the various influences on effect sizes, and more complete disclosure of nonsignificant as well as statistically significant findings.

  13. A Laboratory Exercise for Ecology Teaching: The Use of Photographs in Detecting Dispersion Patterns in Animals

    ERIC Educational Resources Information Center

    Lenton, G. M.

    1975-01-01

    Photographs of a beetle, Catamerus rugosus, were taken at different stages in its life cycle. Students were asked to relate these to real life and carry out a statistical analysis to determine the degree of dispersion of animals. Results demonstrate a change in dispersion throughout the life cycle. (Author/EB)

  14. Fatigue Countermeasures in Support of CF CC130 Air Transport Operations; from the Operation to the Laboratory and Back to the Operation

    DTIC Science & Technology

    2003-10-01

    Again compared with placebo, subjects who had taken zopiclone had less difficulty falling asleep (p < 0.001) and had woken... [the remainder of this excerpt is table-of-contents residue: Multitask (MT); Experimental Design Considerations; Experimental Design; Statistical Analysis]

  15. Integrative Analysis of Salmonellosis Outbreaks in Israel 1999-2012 Revealed an Invasive S. enterica Serovar 9,12:l,v:- and Endemic S. Typhimurium DT104 strain

    USDA-ARS?s Scientific Manuscript database

    Salmonella enterica is the leading etiologic agent of bacterial foodborne outbreaks worldwide. Methods. Laboratory-based statistical surveillance, molecular and genomics analyses were applied to characterize Salmonella outbreak patterns in Israel. 65,087 Salmonella isolates reported to the National ...

  16. Patient safety in the clinical laboratory: a longitudinal analysis of specimen identification errors.

    PubMed

    Wagar, Elizabeth A; Tamashiro, Lorraine; Yasin, Bushra; Hilborne, Lee; Bruckner, David A

    2006-11-01

    Patient safety is an increasingly visible and important mission for clinical laboratories. Attention to improving processes related to patient identification and specimen labeling is being paid by accreditation and regulatory organizations because errors in these areas that jeopardize patient safety are common and avoidable through improvement in the total testing process. To assess patient identification and specimen labeling improvement after multiple implementation projects using longitudinal statistical tools. Specimen errors were categorized by a multidisciplinary health care team. Patient identification errors were grouped into 3 categories: (1) specimen/requisition mismatch, (2) unlabeled specimens, and (3) mislabeled specimens. Specimens with these types of identification errors were compared preimplementation and postimplementation for 3 patient safety projects: (1) reorganization of phlebotomy (4 months); (2) introduction of an electronic event reporting system (10 months); and (3) activation of an automated processing system (14 months) for a 24-month period, using trend analysis and Student t test statistics. Of 16,632 total specimen errors, mislabeled specimens, requisition mismatches, and unlabeled specimens represented 1.0%, 6.3%, and 4.6% of errors, respectively. Student t test showed a significant decrease in the most serious error, mislabeled specimens (P < .001) when compared to before implementation of the 3 patient safety projects. Trend analysis demonstrated decreases in all 3 error types for 26 months. Applying performance-improvement strategies that focus longitudinally on specimen labeling errors can significantly reduce errors, therefore improving patient safety. This is an important area in which laboratory professionals, working in interdisciplinary teams, can improve safety and outcomes of care.
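
    Preimplementation/postimplementation comparisons of error proportions like these can be sketched with a two-proportion z-test, used here as a simpler stand-in for the Student t test the authors applied; the counts are invented:

```python
from math import sqrt

def two_proportion_z(errors_before, n_before, errors_after, n_after):
    """z-statistic comparing error proportions before vs after an intervention;
    |z| > 1.96 indicates a significant change at the 5% level."""
    p1 = errors_before / n_before
    p2 = errors_after / n_after
    p = (errors_before + errors_after) / (n_before + n_after)
    se = sqrt(p * (1 - p) * (1 / n_before + 1 / n_after))
    return (p1 - p2) / se

# Hypothetical: 60 mislabeled specimens per 10,000 before, 30 per 10,000 after.
z = two_proportion_z(60, 10_000, 30, 10_000)
```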

  17. Analysis/forecast experiments with a multivariate statistical analysis scheme using FGGE data

    NASA Technical Reports Server (NTRS)

    Baker, W. E.; Bloom, S. C.; Nestler, M. S.

    1985-01-01

    A three-dimensional, multivariate, statistical analysis method, optimal interpolation (OI) is described for modeling meteorological data from widely dispersed sites. The model was developed to analyze FGGE data at the NASA-Goddard Laboratory of Atmospherics. The model features a multivariate surface analysis over the oceans, including maintenance of the Ekman balance and a geographically dependent correlation function. Preliminary comparisons are made between the OI model and similar schemes employed at the European Center for Medium Range Weather Forecasts and the National Meteorological Center. The OI scheme is used to provide input to a GCM, and model error correlations are calculated for forecasts of 500 mb vertical water mixing ratios and the wind profiles. Comparisons are made between the predictions and measured data. The model is shown to be as accurate as a successive corrections model out to 4.5 days.
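
    At the heart of optimal interpolation is the update analysis = background + gain × (observation − background), with the gain weighting background error against observation error. A one-point scalar sketch; the full OI scheme is multivariate, with matrix-valued, geographically dependent covariances:

```python
def oi_update(background, observation, var_background, var_obs):
    """Scalar optimal-interpolation (best linear unbiased) update at one site."""
    gain = var_background / (var_background + var_obs)
    return background + gain * (observation - background)

# Equal error variances weight background and observation equally.
analysis = oi_update(10.0, 14.0, 1.0, 1.0)  # 12.0
```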

  18. Using Pooled Data and Data Visualization to Introduce Statistical Concepts in the General Chemistry Laboratory

    ERIC Educational Resources Information Center

    Olsen, Robert J.

    2008-01-01

    I describe how data pooling and data visualization can be employed in the first-semester general chemistry laboratory to introduce core statistical concepts such as central tendency and dispersion of a data set. The pooled data are plotted as a 1-D scatterplot, a purpose-designed number line through which statistical features of the data are…

  19. Effect of Blended Feedstock on Pyrolysis Oil Composition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Kristin M; Gaston, Katherine R

    Current techno-economic analysis results indicate biomass feedstock cost represents 27% of the overall minimum fuel selling price for biofuels produced from fast pyrolysis followed by hydrotreating (hydro-deoxygenation, HDO). As a result, blended feedstocks have been proposed as a way to both reduce cost as well as tailor key chemistry for improved fuel quality. For this study, two feedstocks were provided by Idaho National Laboratory (INL). Both were pyrolyzed and collected under the same conditions in the National Renewable Energy Laboratory's (NREL) Thermochemical Process Development Unit (TCPDU). The resulting oil properties were then analyzed and characterized for statistical differences.

  20. Science Laboratory Environment and Academic Performance

    NASA Astrophysics Data System (ADS)

    Aladejana, Francisca; Aderibigbe, Oluyemisi

    2007-12-01

    The study determined how students assess the various components of their science laboratory environment. It also identified how the laboratory environment affects students' learning outcomes. The modified ex-post facto design was used. A sample of 328 randomly selected students was taken from a population of all Senior Secondary School chemistry students in a state in Nigeria. The research instrument, the Science Laboratory Environment Inventory (SLEI) designed and validated by Fraser et al. (Sci Educ 77:1-24, 1993), was administered to the selected students. Data analysis was done using descriptive statistics and Product Moment Correlation. Findings revealed that students could assess the five components (Student Cohesiveness, Open-endedness, Integration, Rule Clarity, and Material Environment) of the laboratory environment. Student cohesiveness had the highest assessment while material environment had the least. The results also showed that the five components of the science laboratory environment are positively correlated with students' academic performance. The findings are discussed with a view to improving the quality of the laboratory environment, subsequent academic performance in science, and ultimately the enrolment and retention of learners in science.

  1. [Concordance among analysts from Latin-American laboratories for rice grain appearance determination using a gallery of digital images].

    PubMed

    Avila, Manuel; Graterol, Eduardo; Alezones, Jesús; Criollo, Beisy; Castillo, Dámaso; Kuri, Victoria; Oviedo, Norman; Moquete, Cesar; Romero, Marbella; Hanley, Zaida; Taylor, Margie

    2012-06-01

    The appearance of the rice grain is a key aspect in quality determination. This analysis is performed mainly by expert analysts through visual observation; however, due to the subjective nature of the analysis, the results may vary among analysts. In order to evaluate the concordance between analysts from Latin-American rice quality laboratories in assessing rice grain appearance through digital images, an inter-laboratory test was performed with ten analysts and images of 90 grains captured with a high-resolution scanner. Rice grains were classified into four categories: translucent, chalky, white belly, and damaged grain. Data were categorized using statistical parameters such as the mode and its frequency, the relative concordance, and the reproducibility parameter kappa. Additionally, a referential image gallery of typical grains for each category was constructed based on mode frequency. Results showed a kappa value of 0.49, corresponding to moderate reproducibility, attributable to subjectivity in the visual analysis of the grain images. These results reveal the need to standardize the evaluation criteria among analysts to improve the confidence of rice grain appearance determination.
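
    Reproducibility parameters of the kappa family compare observed agreement with the agreement expected by chance. A sketch of the two-rater Cohen's kappa; the study pooled ten analysts, which calls for a multi-rater variant such as Fleiss' kappa, and the example classifications are invented:

```python
def cohens_kappa(rater1, rater2):
    """Two-rater Cohen's kappa for categorical classifications."""
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    expected = sum((rater1.count(c) / n) * (rater2.count(c) / n)
                   for c in categories)
    return (observed - expected) / (1 - expected)

r1 = ["translucent", "translucent", "chalky", "chalky"]
r2 = ["translucent", "chalky", "chalky", "chalky"]
```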

  2. A Nonparametric Statistical Approach to the Validation of Computer Simulation Models

    DTIC Science & Technology

    1985-11-01

    Ballistic Research Laboratory, the Experimental Design and Analysis Branch of the Systems Engineering and Concepts Analysis Division was funded to... Winter, E. M., Wisemiller, D. P., and Ujihara, J. K., "Verification and Validation of Engineering Simulations with Minimal Data," Proceedings of the 1976 Summer... used by numerous authors. Law has augmented their approach with specific suggestions for each of the three stages: 1. develop high face-validity

  3. Comparison of multianalyte proficiency test results by sum of ranking differences, principal component analysis, and hierarchical cluster analysis.

    PubMed

    Škrbić, Biljana; Héberger, Károly; Durišić-Mladenović, Nataša

    2013-10-01

    Sum of ranking differences (SRD) was applied for comparing multianalyte results obtained by several analytical methods used in one or in different laboratories, i.e., for ranking the overall performances of the methods (or laboratories) in simultaneous determination of the same set of analytes. The data sets for testing the applicability of SRD contained the results reported during one of the proficiency tests (PTs) organized by the EU Reference Laboratory for Polycyclic Aromatic Hydrocarbons (EU-RL-PAH). In this way, SRD was also tested as a discriminant method alternative to the existing average performance scores used to compare multianalyte PT results. SRD should be used along with the z scores, the most commonly used PT performance statistic. SRD was further developed to handle identical rankings (ties) among laboratories. Two benchmark concentration series were selected as reference: (a) the assigned PAH concentrations (determined precisely beforehand by the EU-RL-PAH) and (b) the averages of all individual PAH concentrations determined by each laboratory. Ranking relative to the assigned values and also to the average (or median) values pointed to the laboratories with the most extreme results, as well as revealing groups of laboratories with similar overall performances. SRD reveals differences between methods or laboratories even when classical tests cannot. The ranking was validated using comparison of ranks by random numbers (a randomization test) and using sevenfold cross-validation, which highlighted the similarities among the laboratories (and the methods they use). If the PAH concentrations are row-scaled (i.e., z scores are analyzed as input for ranking), SRD can still be used for checking the normality of errors. Moreover, cross-validation of SRD on z scores groups the laboratories similarly.
The SRD technique is general in nature, i.e., it can be applied to any experimental problem in which multianalyte results obtained either by several analytical procedures, analysts, instruments, or laboratories need to be compared.
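
    The core of SRD is straightforward: rank the analytes by the reference values and by each laboratory's values, then sum the absolute rank differences; a perfectly concordant laboratory scores 0. A minimal sketch (ties, which the paper handles explicitly, are ignored here, and the concentrations are invented):

```python
def ranks(values):
    """Rank positions (1 = smallest); ties are not handled in this sketch."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for position, i in enumerate(order, start=1):
        r[i] = position
    return r

def srd(reference, candidate):
    """Sum of ranking differences between a lab's results and the reference."""
    return sum(abs(a - b) for a, b in zip(ranks(reference), ranks(candidate)))

reference = [1.2, 3.4, 5.6, 7.8]    # e.g. assigned PAH concentrations
concordant = [1.3, 3.3, 5.5, 7.9]   # same ordering as the reference -> SRD 0
reversed_lab = [7.8, 5.6, 3.4, 1.2]
```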

  4. An overview of quality control practices in Ontario with particular reference to cholesterol analysis.

    PubMed

    Krishnan, S; Webb, S; Henderson, A R; Cheung, C M; Nazir, D J; Richardson, H

    1999-03-01

    The Laboratory Proficiency Testing Program (LPTP) assesses the analytical performance of all licensed laboratories in Ontario. The LPTP Enzymes, Cardiac Markers, and Lipids Committee conducted a "Patterns of Practice" survey to assess the in-house quality control (QC) practices of laboratories in Ontario, using cholesterol as the QC paradigm. The survey was questionnaire-based, seeking information on statistical calculations, software rules, the review process, data retention, and so on. Copies of the in-house cholesterol QC graphs were requested. A total of 120 of 210 laboratories were randomly chosen to receive the questionnaires during 1995 and 1996; 115 laboratories responded, although some did not answer all questions. The majority calculate means and standard deviations (SD) every month, using anywhere from 4 to >100 data points. 65% use a fixed mean and SD, while 17% use means calculated from the previous month. A few use a floating or cumulative mean. Some laboratories that do not use fixed means use a fixed SD. About 90% use some form of statistical quality control rules. The most common rules used to detect random error are 1_3s/R_4s, while 2_2s/4_1s/10_x are used for systematic errors. About 20% did not assay any QC at levels >5.5 mmol/L. Quality control data are reviewed daily (technologists), weekly and monthly (supervisors/directors). Most laboratories retain their QC records for up to 3 years on paper and magnetic media. On some QC graphs the mean and SD, QC product lot number, or reference to action logs are not apparent. Quality control practices in Ontario are, therefore, disappointing. Improvement is required in the use of clinically appropriate concentrations of QC material and in documentation on QC graphs.
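
    The multirules cited above test standardized control results (z scores relative to the control mean and SD) against fixed limits. A sketch of the 1_3s and 2_2s rules; the example z scores are invented:

```python
def rule_1_3s(z_scores):
    """1_3s: any single control beyond +/-3 SD rejects the run (random error)."""
    return any(abs(z) > 3 for z in z_scores)

def rule_2_2s(z_scores):
    """2_2s: two consecutive controls beyond +/-2 SD on the same side
    (systematic error)."""
    return any((z1 > 2 and z2 > 2) or (z1 < -2 and z2 < -2)
               for z1, z2 in zip(z_scores, z_scores[1:]))

print(rule_1_3s([0.5, -3.2]))      # True
print(rule_2_2s([2.3, 2.5, 0.1]))  # True
print(rule_2_2s([2.3, -2.5]))      # False (opposite sides of the mean)
```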

  5. The emergence of modern statistics in agricultural science: analysis of variance, experimental design and the reshaping of research at Rothamsted Experimental Station, 1919-1933.

    PubMed

    Parolini, Giuditta

    2015-01-01

    During the twentieth century statistical methods have transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results, and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper will investigate this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every elementary course of statistics for the experimental and social sciences. These methods were developed by the mathematician and geneticist R. A. Fisher during the 1920s, while he was working at Rothamsted Experimental Station, where agricultural research was in turn reshaped by Fisher's methods. Analysis of variance and experimental design required new practices and instruments in field and laboratory research, and imposed a redistribution of expertise among statisticians, experimental scientists and the farm staff. On the other hand, the use of statistical methods in agricultural science called for a systematization of information management and made computing an activity integral to the experimental research done at Rothamsted, permanently integrating the statisticians' tools and expertise into the station research programme. Fisher's statistical methods did not remain confined to agricultural research, and by the end of the 1950s they had become established in psychology, sociology, education, chemistry, medicine, engineering, economics and quality control, to mention just a few of the disciplines that adopted them.

  6. PCI fuel failure analysis: a report on a cooperative program undertaken by Pacific Northwest Laboratory and Chalk River Nuclear Laboratories.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohr, C.L.; Pankaskie, P.J.; Heasler, P.G.

    Reactor fuel failure data sets in the form of initial power (P_i), final power (P_f), transient increase in power (ΔP), and burnup (Bu) were obtained for pressurized heavy water reactors (PHWRs), boiling water reactors (BWRs), and pressurized water reactors (PWRs). These data sets were evaluated and used as the basis for developing two predictive fuel failure models: a graphical concept called the PCI-OGRAM, and a nonlinear-regression-based model called PROFIT. The PCI-OGRAM is an extension of the FUELOGRAM developed by AECL. It is based on a critical threshold concept for stress-dependent stress corrosion cracking. The PROFIT model, developed at Pacific Northwest Laboratory, is the result of applying standard statistical regression methods to the available PCI fuel failure data and an analysis of the environmental and strain-rate-dependent stress-strain properties of the Zircaloy cladding.

  7. QPROT: Statistical method for testing differential expression using protein-level intensity data in label-free quantitative proteomics.

    PubMed

    Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I

    2015-11-03

    We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for the analysis of continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Determination of differential expression of each protein is based on the standardized Z-statistic computed from the posterior distribution of the log fold change parameter, guided by the false discovery rate estimated by a well-known empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the clinical proteomic technology assessment for cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework with a computational software tool for comparative quantitative proteomics analysis. It features various extensions of the QSPEC method originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics data, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
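    The decision rule described above (a standardized Z from the posterior of the log fold change, filtered by an FDR criterion) can be sketched in a few lines. This is a hedged illustration, not the QPROT code: the posterior samples, the normal reference for two-sided p-values, and the Benjamini-Hochberg step-up rule standing in for the empirical Bayes FDR estimate are all simplifying assumptions.

```python
import math

def z_statistic(posterior_samples):
    """Standardized Z: posterior mean of the log fold change divided by
    its posterior standard deviation (illustrative simplification)."""
    n = len(posterior_samples)
    mean = sum(posterior_samples) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in posterior_samples) / (n - 1))
    return mean / sd

def two_sided_p(z):
    """Two-sided p-value of Z against a standard normal reference."""
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up rule; returns the set of indices
    declared differentially expressed at FDR level alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= alpha * rank / m:
            k = rank
    return set(order[:k])
```

    Posterior draws tightly centred away from zero give a large |Z| and a small p-value; the step-up rule then controls the expected proportion of false discoveries across the proteins tested.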

  8. Signal analysis techniques for incipient failure detection in turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1985-01-01

    Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.
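    The "statistical classification in terms of moments" mentioned above can be illustrated with a minimal sketch (not the NASA implementation): time-domain statistics such as RMS and excess kurtosis are standard screening features, since impulsive bearing or gear faults raise kurtosis well above the Gaussian baseline of zero.

```python
import math

def moment_features(x):
    """Time-domain moment statistics commonly used to screen vibration
    signals: RMS, skewness and excess kurtosis (a sketch; thresholds for
    flagging incipient failure would be set from healthy baseline data)."""
    n = len(x)
    mean = sum(x) / n
    rms = math.sqrt(sum(v * v for v in x) / n)
    var = sum((v - mean) ** 2 for v in x) / n
    sd = math.sqrt(var)
    skew = sum((v - mean) ** 3 for v in x) / (n * sd ** 3)
    kurt = sum((v - mean) ** 4 for v in x) / (n * var ** 2) - 3.0
    return {"rms": rms, "skewness": skew, "kurtosis": kurt}
```

    A clean sinusoid-like signal has negative excess kurtosis, while a signal punctuated by repetitive impacts drives it strongly positive.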

  9. Exploring the links between quality assurance and laboratory resources. An audit-based study.

    PubMed

    Singh, Navjeevan; Panwar, Aru; Masih, Vipin Fazal; Arora, Vinod K; Bhatia, Arati

    2003-01-01

    To investigate and rectify problems related to Ziehl-Neelsen (Z-N) staining in a cytology laboratory in the context of quality assurance. An audit-based quality assurance study of 1,421 patients with clinical diagnoses of tubercular lymphadenopathy who underwent fine needle aspiration cytology. Data from 8 months were audited (group 1). Laboratory practices related to the selection of smears for Z-N staining were studied. A 2-step corrective measure based on the results of the audit was introduced for 2 months (group 2). Results were subjected to statistical analysis using the chi-squared test. Of 1,172 patients in group 1, 368 had diagnoses other than tuberculosis. Overall acid-fast bacillus (AFB) positivity was 42%. AFB positivity in the 249 patients in group 2 was 89% (P < .0001). Several issues in the laboratory are linked to quality assurance. Solving everyday problems can have far-reaching benefits for the performance of laboratory personnel, resources and work flow.
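    For a 2×2 table (group × AFB result), the chi-squared statistic used in audits like this one has a compact closed form. A minimal sketch; the counts in the test below are invented, not the paper's table:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-squared statistic for the 2x2 table [[a, b], [c, d]],
    without continuity correction. Compare the result against the
    chi-squared distribution with 1 degree of freedom."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den
```

    With rows as audit groups and columns as AFB-positive/negative counts, a large statistic (e.g. above 3.84 for p < .05 at 1 df) indicates that positivity rates differ between groups.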

  10. Transferability and inter-laboratory variability assessment of the in vitro bovine oocyte fertilization test.

    PubMed

    Tessaro, Irene; Modina, Silvia C; Crotti, Gabriella; Franciosi, Federica; Colleoni, Silvia; Lodde, Valentina; Galli, Cesare; Lazzari, Giovanna; Luciano, Alberto M

    2015-01-01

    The dramatic increase in the number of animals required for reproductive toxicity testing imposes the validation of alternative methods to reduce the use of laboratory animals. As we previously demonstrated for the in vitro maturation test of bovine oocytes, the present study describes the transferability assessment and the inter-laboratory variability of an in vitro test able to identify chemical effects during the process of bovine oocyte fertilization. Eight chemicals with well-known toxic properties (benzo[a]pyrene, busulfan, cadmium chloride, cycloheximide, diethylstilbestrol, ketoconazole, methylacetoacetate, mifepristone/RU-486) were tested in two well-trained laboratories. The statistical analysis demonstrated no differences in the EC50 values for each chemical in either the within-laboratory (inter-run) or the between-laboratory variability of the proposed test. We therefore conclude that the bovine in vitro fertilization test could advance toward the validation process as an alternative in vitro method and become part of an integrated testing strategy to predict chemical hazards to mammalian fertility. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Statistical Models for Averaging of the Pump–Probe Traces: Example of Denoising in Terahertz Time-Domain Spectroscopy

    NASA Astrophysics Data System (ADS)

    Skorobogatiy, Maksim; Sadasivan, Jayesh; Guerboukha, Hichem

    2018-05-01

    In this paper, we first discuss the main types of noise in a typical pump-probe system, and then focus specifically on terahertz time domain spectroscopy (THz-TDS) setups. We then introduce four statistical models for the noisy pulses obtained in such systems, and detail rigorous mathematical algorithms to de-noise such traces, find the proper averages and characterise various types of experimental noise. Finally, we perform a comparative analysis of the performance, advantages and limitations of the algorithms by testing them on experimental data collected using a particular THz-TDS system available in our laboratories. We conclude that using advanced statistical models for trace averaging results in fitting errors that are significantly smaller than those obtained when only a simple statistical average is used.
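    As a hedged illustration of why a statistical model can beat the simple average, the sketch below weights each repeated trace by the inverse of its estimated noise variance, so that cleaner acquisitions dominate the result. This generic inverse-variance estimator is an assumption chosen for illustration, not one of the authors' four models.

```python
def inverse_variance_average(traces, noise_vars):
    """Average repeated pulse traces, weighting each trace by 1/variance
    of its noise estimate. traces: list of equal-length sample lists;
    noise_vars: one positive variance estimate per trace."""
    weights = [1.0 / v for v in noise_vars]
    wsum = sum(weights)
    n = len(traces[0])
    return [sum(w + 0.0 and w * tr[i] for w, tr in zip(weights, traces)) / wsum
            for i in range(n)] if False else [
        sum(w * tr[i] for w, tr in zip(weights, traces)) / wsum
        for i in range(n)]
```

    With equal variances this reduces to the plain mean; with one very noisy trace, that trace is effectively down-weighted out of the average.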

  12. A Lean Six Sigma approach to the improvement of the selenium analysis method.

    PubMed

    Cloete, Bronwyn C; Bester, André

    2012-11-02

    Reliable results represent the pinnacle assessment of quality of an analytical laboratory, and therefore variability is considered to be a critical quality problem associated with the selenium analysis method executed at Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a scientific method that is empirical, inductive, deductive and systematic, and that relies on data and facts. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed by using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features.
Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and represents both a management discipline, and a standardised approach to problem solving and process optimisation.

  13. Analysis of STAT laboratory turnaround times before and after conversion of the hospital information system.

    PubMed

    Lowe, Gary R; Griffin, Yolanda; Hart, Michael D

    2014-08-01

    Modern electronic health record systems (EHRS) reportedly offer advantages including improved quality, error prevention, cost reduction, and increased efficiency. This project reviewed the impact on specimen turnaround times (TAT) and percent compliance for specimens processed in a STAT laboratory after implementation of an upgraded EHRS. Before EHRS implementation, laboratory personnel received instruction and training for specimen processing. One laboratory member per shift received additional training. TAT and percent compliance data sampling occurred 4 times monthly for 13 months post-conversion and were compared with the mean of data collected for 3 months pre-conversion. Percent compliance was gauged using a benchmark of reporting 95% of all specimens within 7 min from receipt. Control charts were constructed for TAT and percent compliance with control limits set at 2 SD and applied continuously through the data collection period. TAT recovered to pre-conversion levels by the 6th month post-conversion. Percent compliance consistently returned to pre-conversion levels by the 10th month post-conversion. Statistical analyses revealed the TAT were significantly longer for 3 months post-conversion (P < .001) compared with pre-conversion levels. Statistical significance was not observed for subsequent groups. Percent compliance results were significantly lower for 6 months post-conversion (P < .001). Statistical significance was not observed for subsequent groups. Extensive efforts were made to train and prepare personnel for challenges expected after the EHRS upgrade. Specific causes identified with the upgraded EHRS included multiple issues involving personnel and the EHRS. These data suggest that system and user issues contributed to delays in returning to pre-conversion TAT and percent compliance levels following the upgrade in the EHRS.
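    The 2-SD control charts used above to track TAT and compliance are standard Shewhart-style charts. A minimal sketch, with invented baseline numbers in the usage below rather than the project's data:

```python
import math

def control_limits(baseline, k=2.0):
    """Center line and lower/upper control limits set at mean +/- k
    sample standard deviations of the pre-conversion baseline data."""
    n = len(baseline)
    mean = sum(baseline) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in baseline) / (n - 1))
    return mean - k * sd, mean, mean + k * sd

def out_of_control(points, lcl, ucl):
    """Indices of post-conversion observations falling outside the
    control limits, i.e. signals worth investigating."""
    return [i for i, p in enumerate(points) if p < lcl or p > ucl]
```

    Applied continuously through the collection period, points outside the limits flag months where TAT or percent compliance had not yet recovered to the pre-conversion process behaviour.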

  14. Vapor Pressure Data and Analysis for Selected Organophosphorus Compounds: DIBMP, DCMP, IMMP, IMPA, EMPA, and MPFA

    DTIC Science & Technology

    2017-04-01

    Methodology, Statistics, and Applications; CRDEC-TR-386; U.S. Army Chemical Research, Development and Engineering Center: Aberdeen Proving Ground… Approved for public release; distribution unlimited. Abstract (truncated): Recent work from our laboratory has focused on chemical … Subject terms: …vaporization; volatility; differential scanning calorimetry (DSC); vapor saturation; boiling point; diisobutyl methylphosphonate (DIBMP); Chemical Abstracts…

  15. Generic Techniques for the Calibration of Robots with Application of the 3-D Fixtures and Statistical Technique on the PUMA 500 and ARID Robots

    NASA Technical Reports Server (NTRS)

    Tawfik, Hazem

    1991-01-01

    A relatively simple, inexpensive, and generic technique that could be used in both laboratories and some operation site environments is introduced at the Robotics Applications and Development Laboratory (RADL) at Kennedy Space Center (KSC). In addition, this report gives a detailed explanation of the set up procedure, data collection, and analysis using this new technique that was developed at the State University of New York at Farmingdale. The technique was used to evaluate the repeatability, accuracy, and overshoot of the Unimate Industrial Robot, PUMA 500. The data were statistically analyzed to provide an insight into the performance of the systems and components of the robot. Also, the same technique was used to check the forward kinematics against the inverse kinematics of RADL's PUMA robot. Recommendations were made for RADL to use this technique for laboratory calibration of the currently existing robots such as the ASEA, high speed controller, Automated Radiator Inspection Device (ARID) etc. Also, recommendations were made to develop and establish other calibration techniques that will be more suitable for site calibration environment and robot certification.

  16. Accuracy of trace element determinations in alternate fuels

    NASA Technical Reports Server (NTRS)

    Greenbauer-Seng, L. A.

    1980-01-01

    A review of the techniques used at Lewis Research Center (LeRC) in trace metals analysis is presented, including the results of Atomic Absorption Spectrometry and DC Arc Emission Spectrometry of blank levels and recovery experiments for several metals. The design of an Interlaboratory Study conducted by LeRC is presented. Several factors were investigated, including: laboratory, analytical technique, fuel type, concentration, and ashing additive. Conclusions drawn from the statistical analysis will help direct research efforts toward those areas most responsible for the poor interlaboratory analytical results.

  17. How Should Students Learn in the School Science Laboratory? The Benefits of Cooperative Learning

    NASA Astrophysics Data System (ADS)

    Raviv, Ayala; Cohen, Sarit; Aflalo, Ester

    2017-07-01

    Despite the inherent potential of cooperative learning, there has been very little research into its effectiveness in middle school laboratory classes. This study focuses on an empirical comparison between cooperative learning and individual learning in the school science laboratory, evaluating the quality of learning and the students' attitudes. The research included 67 seventh-grade students who undertook four laboratory experiments on the subject of "volume measuring skills." Each student engaged both in individual and cooperative learning in the laboratory, and the students wrote individual or group reports, accordingly. A total of 133 experiment reports were evaluated, 108 of which also underwent textual analysis. The findings show that the group reports were superior, both in terms of understanding the concept of "volume" and in terms of acquiring skills for measuring volume. The students' attitudes results were statistically significant and demonstrated that they preferred cooperative learning in the laboratory. These findings demonstrate that science teachers should be encouraged to implement cooperative learning in the laboratory. This will enable them to improve the quality and efficiency of laboratory learning while using a smaller number of experimental kits. Saving these expenditures, together with the possibility to teach a larger number of students simultaneously in the laboratory, will enable greater exposure to learning in the school science laboratory.

  18. A Highly Efficient Design Strategy for Regression with Outcome Pooling

    PubMed Central

    Mitchell, Emily M.; Lyles, Robert H.; Manatunga, Amita K.; Perkins, Neil J.; Schisterman, Enrique F.

    2014-01-01

    The potential for research involving biospecimens can be hindered by the prohibitive cost of performing laboratory assays on individual samples. To mitigate this cost, strategies such as randomly selecting a portion of specimens for analysis or randomly pooling specimens prior to performing laboratory assays may be employed. These techniques, while effective in reducing cost, are often accompanied by a considerable loss of statistical efficiency. We propose a novel pooling strategy based on the k-means clustering algorithm to reduce laboratory costs while maintaining a high level of statistical efficiency when predictor variables are measured on all subjects, but the outcome of interest is assessed in pools. We perform simulations motivated by the BioCycle study to compare this k-means pooling strategy with current pooling and selection techniques under simple and multiple linear regression models. While all of the methods considered produce unbiased estimates and confidence intervals with appropriate coverage, pooling under k-means clustering provides the most precise estimates, closely approximating results from the full data and losing minimal precision as the total number of pools decreases. The benefits of k-means clustering evident in the simulation study are then applied to an analysis of the BioCycle dataset. In conclusion, when the number of lab tests is limited by budget, pooling specimens based on k-means clustering prior to performing lab assays can be an effective way to save money with minimal information loss in a regression setting. PMID:25220822
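    The pooling idea above (cluster subjects on their fully observed predictors, then assay one physical pool per cluster) can be sketched with a toy one-predictor Lloyd's algorithm. This is a hedged stand-in for the paper's k-means pooling step, not the BioCycle analysis; function names and data are invented.

```python
import random

def kmeans_1d(xs, k, iters=50, seed=0):
    """Plain Lloyd's algorithm on a single predictor: alternate between
    assigning points to the nearest center and recomputing centers."""
    rng = random.Random(seed)
    centers = rng.sample(xs, k)
    assign = [0] * len(xs)
    for _ in range(iters):
        assign = [min(range(k), key=lambda j: (x - centers[j]) ** 2)
                  for x in xs]
        for j in range(k):
            members = [x for x, a in zip(xs, assign) if a == j]
            if members:
                centers[j] = sum(members) / len(members)
    return assign, centers

def pool_outcomes(ys, assign, k):
    """Averaged outcome per cluster, mimicking assaying one physical
    pool instead of each individual specimen."""
    pools = []
    for j in range(k):
        members = [y for y, a in zip(ys, assign) if a == j]
        pools.append(sum(members) / len(members) if members else None)
    return pools
```

    Because specimens within a cluster have similar predictor values, the pooled outcome loses little of the regression information that individual assays would have provided, which is the intuition behind the efficiency result reported above.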

  19. A highly efficient design strategy for regression with outcome pooling.

    PubMed

    Mitchell, Emily M; Lyles, Robert H; Manatunga, Amita K; Perkins, Neil J; Schisterman, Enrique F

    2014-12-10

    The potential for research involving biospecimens can be hindered by the prohibitive cost of performing laboratory assays on individual samples. To mitigate this cost, strategies such as randomly selecting a portion of specimens for analysis or randomly pooling specimens prior to performing laboratory assays may be employed. These techniques, while effective in reducing cost, are often accompanied by a considerable loss of statistical efficiency. We propose a novel pooling strategy based on the k-means clustering algorithm to reduce laboratory costs while maintaining a high level of statistical efficiency when predictor variables are measured on all subjects, but the outcome of interest is assessed in pools. We perform simulations motivated by the BioCycle study to compare this k-means pooling strategy with current pooling and selection techniques under simple and multiple linear regression models. While all of the methods considered produce unbiased estimates and confidence intervals with appropriate coverage, pooling under k-means clustering provides the most precise estimates, closely approximating results from the full data and losing minimal precision as the total number of pools decreases. The benefits of k-means clustering evident in the simulation study are then applied to an analysis of the BioCycle dataset. In conclusion, when the number of lab tests is limited by budget, pooling specimens based on k-means clustering prior to performing lab assays can be an effective way to save money with minimal information loss in a regression setting. Copyright © 2014 John Wiley & Sons, Ltd.

  20. An audit of immunofixation requesting practices at a South African referral laboratory

    PubMed Central

    Rampursat, Yashna

    2014-01-01

    Background It is common practice in most chemical pathology laboratories for reflective immunofixation electrophoresis (IFE) to occur following the detection or suspicion of a paraprotein on serum protein electrophoresis (SPEP). The chemical pathology laboratory at Inkosi Albert Luthuli Central Hospital (IALCH) in Durban, South Africa, is currently the only non-private laboratory in the KwaZulu Natal province that performs SPEP analysis, with current practice requiring that the clinician request IFE following suggestion by the laboratory after a suspicious SPEP result. Objectives To review the current process for IFE at IALCH in the context of reflective testing and to examine the use of the alpha-2-globulin/alpha-1-globulin ratio as a predictor of a positive IFE result. Methods Data for 1260 consecutive SPEP tests performed at the IALCH National Health Laboratory Service were collected between February and July 2011. SPEP and IFE were performed with a Sebia Hydrasys automated electrophoresis system. The alpha-2-globulin/alpha-1-globulin ratio was calculated using density of corresponding fractions on SPEP. Results Analysis revealed that of the 1260 SPEPs performed during the analysis period, 304 IFEs were suggested by the reviewing pathologist. A total of 45 (15%) of the suggested IFEs were subsequently requested by the attending clinicians. Almost half (46.5%) (n = 20) of the suggested IFEs that were performed revealed the presence of a paraprotein. There was no statistically-significant difference between the alpha-2-globulin/alpha-1-globulin ratio for patients with positive or negative IFEs (p-value = 0.2). Conclusions This study reveals the need for reflective addition of IFE testing by the laboratory following suspicious findings on SPEP. PMID:29043173

  1. Effectiveness of the Flipped Classroom Model in Anatomy and Physiology Laboratory Courses at a Hispanic Serving Institution

    NASA Astrophysics Data System (ADS)

    Sanchez, Gerardo

    A flipped laboratory model involves significant preparation by the students on lab material prior to entry to the laboratory. This allows laboratory time to be focused on active learning through experiments. The aim of this study was to observe changes in student performance through the transition from a traditional laboratory format to a flipped format. The data showed that for both Anatomy and Physiology (I and II) laboratories, a more normal distribution of grades was observed once labs were flipped, and lecture grade averages increased. Chi-square and analysis of variance tests showed grade changes to a statistically significant degree, with a p value of less than 0.05 on both analyses. Regression analyses gave decreasing numbers after the flipped labs were introduced, with an r² value of .485 for A&P I and .564 for A&P II. Results indicate improved scores for the lecture part of the A&P course, decreased outlying scores above 100, and score distributions that approached a more normal distribution.

  2. EQUAL-quant: an international external quality assessment scheme for real-time PCR.

    PubMed

    Ramsden, Simon C; Daly, Sarah; Geilenkeuser, Wolf-Jochen; Duncan, Graeme; Hermitte, Fabienne; Marubini, Ettore; Neumaier, Michael; Orlando, Claudio; Palicka, Vladimir; Paradiso, Angelo; Pazzagli, Mario; Pizzamiglio, Sara; Verderio, Paolo

    2006-08-01

    Quantitative gene expression analysis by real-time PCR is important in several diagnostic areas, such as the detection of minimum residual disease in leukemia and the prognostic assessment of cancer patients. To address quality assurance in this technically challenging area, the European Union (EU) has funded the EQUAL project to develop methodologic external quality assessment (EQA) relevant to diagnostic and research laboratories among the EU member states. We report here the results of the EQUAL-quant program, which assesses standards in the use of TaqMan probes, one of the most widely used assays in the implementation of real-time PCR. The EQUAL-quant reagent set was developed to assess the technical execution of a standard TaqMan assay, including RNA extraction, reverse transcription, and real-time PCR quantification of target DNA copy number. The multidisciplinary EQA scheme included 137 participating laboratories from 29 countries. We demonstrated significant differences in performance among laboratories, with 20% of laboratories reporting at least one result lacking in precision and/or accuracy according to the statistical procedures described. No differences in performance were observed for the >10 different testing platforms used by the study participants. This EQA scheme demonstrated both the requirement and demand for external assessment of technical standards in real-time PCR. The reagent design and the statistical tools developed within this project will provide a benchmark for defining acceptable working standards in this emerging technology.

  3. Signal Detection Techniques for Diagnostic Monitoring of Space Shuttle Main Engine Turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, Thomas; Jong, Jen-Yi

    1986-01-01

    An investigation to develop, implement, and evaluate signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery is reviewed. A brief description of the Space Shuttle Main Engine (SSME) test/measurement program is presented. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques have been implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. A unique coherence function (the hyper-coherence) was developed through the course of this investigation, which appears promising as a diagnostic tool. This technique and several other non-linear methods of signal analysis are presented and illustrated by application. Software for application of these techniques has been installed on the signal processing system at the NASA/MSFC Systems Dynamics Laboratory.

  4. UNCERTAINTY ON RADIATION DOSES ESTIMATED BY BIOLOGICAL AND RETROSPECTIVE PHYSICAL METHODS.

    PubMed

    Ainsbury, Elizabeth A; Samaga, Daniel; Della Monaca, Sara; Marrale, Maurizio; Bassinet, Celine; Burbidge, Christopher I; Correcher, Virgilio; Discher, Michael; Eakins, Jon; Fattibene, Paola; Güçlü, Inci; Higueras, Manuel; Lund, Eva; Maltar-Strmecki, Nadica; McKeever, Stephen; Rääf, Christopher L; Sholom, Sergey; Veronese, Ivan; Wieser, Albrecht; Woda, Clemens; Trompier, Francois

    2018-03-01

    Biological and physical retrospective dosimetry are recognised as key techniques to provide individual estimates of dose following unplanned exposures to ionising radiation. Whilst there has been a relatively large amount of recent development in the biological and physical procedures, development of statistical analysis techniques has failed to keep pace. The aim of this paper is to review the current state of the art in uncertainty analysis techniques across the 'EURADOS Working Group 10-Retrospective dosimetry' members, to give concrete examples of implementation of the techniques recommended in the international standards, and to further promote the use of Monte Carlo techniques to support characterisation of uncertainties. It is concluded that sufficient techniques are available and in use by most laboratories for acute, whole body exposures to highly penetrating radiation, but further work will be required to ensure that statistical analysis is always wholly sufficient for the more complex exposure scenarios.
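    The Monte Carlo approach to uncertainty characterisation promoted above can be sketched with a toy retrospective-dosimetry model in which dose is an aberration yield divided by an uncertain calibration coefficient. This is an illustrative sketch under invented assumptions (Gaussian inputs, a simple linear calibration), not a EURADOS-endorsed procedure.

```python
import math
import random

def mc_dose_uncertainty(yield_mean, yield_sd, cal_mean, cal_sd,
                        n=100000, seed=1):
    """Propagate input uncertainties through dose = yield / calibration
    by Monte Carlo sampling; returns the mean and standard deviation of
    the simulated dose distribution."""
    rng = random.Random(seed)
    doses = []
    for _ in range(n):
        y = rng.gauss(yield_mean, yield_sd)
        c = rng.gauss(cal_mean, cal_sd)
        if c > 0:  # discard unphysical calibration draws
            doses.append(y / c)
    m = sum(doses) / len(doses)
    sd = math.sqrt(sum((d - m) ** 2 for d in doses) / (len(doses) - 1))
    return m, sd
```

    Unlike first-order error propagation, the sampled distribution directly captures the skew introduced by dividing by an uncertain denominator, which matters in the more complex exposure scenarios the paper highlights.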

  5. US-VISIT Identity Matching Algorithm Evaluation Program: ADIS Algorithm Evaluation Project Plan Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grant, C W; Lenderman, J S; Gansemer, J D

    This document is an update to the 'ADIS Algorithm Evaluation Project Plan' specified in the Statement of Work for the US-VISIT Identity Matching Algorithm Evaluation Program, as deliverable II.D.1. The original plan was delivered in August 2010. This document modifies the plan to reflect modified deliverables reflecting delays in obtaining a database refresh. This document describes the revised schedule of the program deliverables. The detailed description of the processes used, the statistical analysis processes and the results of the statistical analysis will be described fully in the program deliverables. The US-VISIT Identity Matching Algorithm Evaluation Program is work performed by Lawrence Livermore National Laboratory (LLNL) under IAA HSHQVT-07-X-00002 P00004 from the Department of Homeland Security (DHS).

  6. Validation of a single-platform method for hematopoietic CD34+ stem cells enumeration according to accreditation procedure.

    PubMed

    Massin, Frédéric; Huili, Cai; Decot, Véronique; Stoltz, Jean-François; Bensoussan, Danièle; Latger-Cannard, Véronique

    2015-01-01

    Stem cells for autologous and allogenic transplantation are obtained from several sources, including bone marrow, peripheral blood or cord blood. Accurate enumeration of viable CD34+ hematopoietic stem cells (HSC) is routinely used in clinical settings, especially to monitor progenitor cell mobilization and apheresis. The number of viable CD34+ HSC has also been shown to be the most critical factor in haematopoietic engraftment. The International Society for Cellular Therapy currently recommends the use of a single-platform flow cytometry system using 7-AAD as a viability dye. To move routine analysis from a BD FACSCalibur™ instrument to a BD FACSCanto™ II in accordance with ISO 15189 standard guidelines, we defined laboratory performance data of the BD™ Stem Cell Enumeration (SCE) kit on a CE-IVD system comprising a BD FACSCanto II flow cytometer and the BD FACSCanto™ Clinical Software. InterQC™ software, a real-time internet laboratory QC management system developed by Vitro™ and distributed by Becton Dickinson™, was also tested to monitor daily QC data, to define the internal laboratory statistics and to compare them with those of external laboratories. Precision was evaluated with BD™ Stem Cell Control (high and low) results and the InterQC software, which drew Levey-Jennings curves and generated numerical statistical parameters allowing detection of potential changes in system performance as well as interlaboratory comparisons. Repeatability, linearity and lower limits of detection were obtained with routine samples from different origins. Agreement between the BD FACSCanto II system and the BD FACSCalibur system was evaluated on fresh peripheral blood, freeze-thawed apheresis, fresh bone marrow and fresh cord blood samples. Instrument measurement and staining repeatability clearly evidenced acceptable variability on the different samples tested. Intra- and inter-laboratory CVs in CD34+ cell absolute count are consistent and reproducible. Linearity analysis, established between 2 and 329 cells/μl, showed a linear relation between expected and measured counts (R2 = 0.97). Linear regression and Bland-Altman representations showed an excellent correlation between the two systems on samples from different sources and allowed the transfer of routine analysis from BD FACSCalibur to BD FACSCanto II. The BD SCE kit provides an accurate measure of CD34+ HSC and can be used in daily routine to optimize the enumeration of hematopoietic CD34+ stem cells by flow cytometry. Moreover, the InterQC system appears to be a very useful tool for daily laboratory quality monitoring and thus for accreditation.

  7. Influence of Decontaminating Agents and Swipe Materials on Laboratory Simulated Working Surfaces Wet Spilled with Sodium Pertechnetate

    PubMed Central

    Akchata, Suman; Lavanya, K; Shivanand, Bhushan

    2017-01-01

Context: Decontamination of various working surfaces after minor spillage of sodium pertechnetate is essential for maintaining good radiation safety practices as well as for regulatory compliance. Aim: To observe the influence of decontaminating agents and swipe materials on different types of surfaces used in the nuclear medicine laboratory work area wet-spilled with 99m-technetium (99mTc) sodium pertechnetate. Settings and Design: Laboratory-simulated working surface materials; experimental study design. Materials and Methods: The direct decontamination method was applied to dust-free, laboratory-simulated new working surfaces [stainless steel, polyvinyl chloride (PVC), Perspex, resin] using four decontaminating agents [tap water, soap water (SW), Radiacwash, and spirit] with four different swipe materials [cotton, tissue paper (TP), Whatman paper (WP), adsorbent sheet (AS)]; 10 samples (n = 10) were taken for each group. Statistical Analysis: The parametric two-way analysis of variance test, with a significance level of 0.005, was used to evaluate statistical differences between the groups of decontaminating agents and swipe materials; results are expressed as mean ± SD. Results: The decontamination factor was calculated after five cleanings for each group. Results were calculated for a total of 160 samples, using the four decontaminating agents (tap water, SW, Radiacwash, and spirit) and four swipe materials (cotton, TP, WP, and AS) on the commonly used surfaces (stainless steel, PVC, Perspex, resin) with the direct method and 10 samples (n = 10) per group. Conclusions: Tap water is the best decontaminating agent compared with SW, Radiacwash, and spirit for the laboratory-simulated stainless steel, PVC, and Perspex surface materials, whereas for the resin surface material, the SW decontaminating agent shows better effectiveness. Cotton is the best swipe material compared to WP, AS, and TP for the laboratory-simulated stainless steel, PVC, Perspex, and resin surface materials. 
Perspex and stainless steel are the most suitable and recommended laboratory surface materials compared to PVC and resin in nuclear medicine. Radiacwash may show better results for 99mTc-labelled products and other radionuclide contamination on laboratory working surface areas. PMID:28680198
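The balanced two-factor design described above (agents x swipe materials, 10 replicates per cell) can be sketched with a hand-rolled two-way ANOVA. The data below are randomly generated stand-ins for the study's decontamination factors, not its actual measurements:

```python
import numpy as np

def two_way_anova(data):
    """Two-way ANOVA with replication for a balanced design.
    data has shape (a, b, n): a levels of factor A (agents),
    b levels of factor B (swipe materials), n replicates per cell.
    Returns the F statistics for A, B and the A x B interaction."""
    a, b, n = data.shape
    grand = data.mean()
    mean_a = data.mean(axis=(1, 2))   # factor A level means
    mean_b = data.mean(axis=(0, 2))   # factor B level means
    mean_ab = data.mean(axis=2)       # cell means

    ss_a = b * n * ((mean_a - grand) ** 2).sum()
    ss_b = a * n * ((mean_b - grand) ** 2).sum()
    ss_ab = n * ((mean_ab - mean_a[:, None]
                  - mean_b[None, :] + grand) ** 2).sum()
    ss_err = ((data - mean_ab[:, :, None]) ** 2).sum()

    ms_err = ss_err / (a * b * (n - 1))
    f_a = (ss_a / (a - 1)) / ms_err
    f_b = (ss_b / (b - 1)) / ms_err
    f_ab = (ss_ab / ((a - 1) * (b - 1))) / ms_err
    return f_a, f_b, f_ab

rng = np.random.default_rng(0)
# Hypothetical decontamination factors: 4 agents x 4 swipes x 10 samples
data = rng.normal(5.0, 0.5, size=(4, 4, 10))
data[0] += 1.0                        # simulate one clearly better agent
f_agent, f_swipe, f_inter = two_way_anova(data)
print(f_agent, f_swipe, f_inter)
```

Each F statistic is then compared against the F distribution with the corresponding degrees of freedom at the chosen significance level.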

  8. Parametric Analysis to Study the Influence of Aerogel-Based Renders' Components on Thermal and Mechanical Performance.

    PubMed

    Ximenes, Sofia; Silva, Ana; Soares, António; Flores-Colen, Inês; de Brito, Jorge

    2016-05-04

    Statistical models using multiple linear regression are some of the most widely used methods to study the influence of independent variables in a given phenomenon. This study's objective is to understand the influence of the various components of aerogel-based renders on their thermal and mechanical performance, namely cement (three types), fly ash, aerial lime, silica sand, expanded clay, type of aerogel, expanded cork granules, expanded perlite, air entrainers, resins (two types), and rheological agent. The statistical analysis was performed using SPSS (Statistical Package for Social Sciences), based on 85 mortar mixes produced in the laboratory and on their values of thermal conductivity and compressive strength obtained using tests in small-scale samples. The results showed that aerial lime assumes the main role in improving the thermal conductivity of the mortars. Aerogel type, fly ash, expanded perlite and air entrainers are also relevant components for a good thermal conductivity. Expanded clay can improve the mechanical behavior and aerogel has the opposite effect.
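The screening approach above, fitting a multiple linear regression and ranking components by their influence on the response, can be sketched with NumPy in place of SPSS. The mix-design matrix, component names and coefficients below are invented for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical mix-design matrix: 85 mortar mixes x 3 component dosages
# (stand-ins for aerial lime, fly ash and expanded perlite contents).
X = rng.uniform(0.0, 1.0, size=(85, 3))
# Simulated thermal conductivity (W/m.K): the first component is made
# to lower conductivity the most, mimicking the reported role of lime.
y = (0.10 - 0.04 * X[:, 0] - 0.02 * X[:, 1] - 0.01 * X[:, 2]
     + rng.normal(0.0, 0.002, size=85))

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(len(y)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
intercept, betas = coef[0], coef[1:]

# Standardized coefficients make the components' influences comparable.
std_betas = betas * X.std(axis=0) / y.std()
print(np.argmax(np.abs(std_betas)))   # index of most influential component
```

Standardizing the coefficients is what allows components measured in different units or dosage ranges to be ranked against each other.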

  9. Parametric Analysis to Study the Influence of Aerogel-Based Renders’ Components on Thermal and Mechanical Performance

    PubMed Central

    Ximenes, Sofia; Silva, Ana; Soares, António; Flores-Colen, Inês; de Brito, Jorge

    2016-01-01

    Statistical models using multiple linear regression are some of the most widely used methods to study the influence of independent variables in a given phenomenon. This study’s objective is to understand the influence of the various components of aerogel-based renders on their thermal and mechanical performance, namely cement (three types), fly ash, aerial lime, silica sand, expanded clay, type of aerogel, expanded cork granules, expanded perlite, air entrainers, resins (two types), and rheological agent. The statistical analysis was performed using SPSS (Statistical Package for Social Sciences), based on 85 mortar mixes produced in the laboratory and on their values of thermal conductivity and compressive strength obtained using tests in small-scale samples. The results showed that aerial lime assumes the main role in improving the thermal conductivity of the mortars. Aerogel type, fly ash, expanded perlite and air entrainers are also relevant components for a good thermal conductivity. Expanded clay can improve the mechanical behavior and aerogel has the opposite effect. PMID:28773460

  10. Utility of Gram stain for the microbiological analysis of burn wound surfaces.

    PubMed

    Elsayed, Sameer; Gregson, Daniel B; Lloyd, Tracie; Crichton, Marilyn; Church, Deirdre L

    2003-11-01

Surface swab cultures have attracted attention as a potential alternative to biopsy histology or quantitative culture methods for microbiological burn wound monitoring. To our knowledge, the utility of adding a Gram-stained slide in this context has not been evaluated previously. To determine the degree of correlation of Gram stain with culture for the microbiological analysis of burn wound surfaces. Prospective laboratory analysis. Urban health region/centralized diagnostic microbiology laboratory. Burn patients hospitalized in any Calgary Health Region burn center from November 2000 to September 2001. Gram stain plus culture of burn wound surface swab specimens obtained during routine dressing changes or based on clinical signs of infection. Degree of correlation (complete, high, partial, none), including weighted kappa statistic (kappa(w)), of Gram stain with culture based on quantitative microscopy and degree of culture growth. A total of 375 specimens from 50 burn patients were evaluated. Of these, 239 were negative by culture and Gram stain, 7 were positive by Gram stain only, 89 were positive by culture only, and 40 were positive by both methods. The degree of complete, high, partial, and no correlation of Gram stain with culture was 70.9% (266/375), 1.1% (4/375), 2.4% (9/375), and 25.6% (96/375), respectively. The degree of correlation for all 375 specimens, as expressed by the weighted kappa statistic, was found to be fair (kappa(w) = 0.32). Conclusions: The Gram stain is not suitable for the microbiological analysis of burn wound surfaces.
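The weighted kappa used above as the agreement measure can be sketched from a square contingency table of graded results. The 3x3 table below is hypothetical, not the study's data:

```python
import numpy as np

def weighted_kappa(table, weights="linear"):
    """Weighted Cohen's kappa from a square agreement table.
    Rows: gradings by method 1 (e.g. Gram stain), columns: gradings
    by method 2 (e.g. culture). Disagreements are penalized in
    proportion to their distance in grade."""
    table = np.asarray(table, dtype=float)
    k = table.shape[0]
    n = table.sum()
    i, j = np.indices((k, k))
    if weights == "linear":
        w = np.abs(i - j) / (k - 1)
    else:                              # quadratic weighting
        w = ((i - j) / (k - 1)) ** 2
    observed = table / n
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / n ** 2
    return 1.0 - (w * observed).sum() / (w * expected).sum()

# Hypothetical 3-grade agreement table (no growth / scant / heavy)
table = [[230, 20, 5],
         [15, 40, 10],
         [5, 10, 40]]
kw = weighted_kappa(table)
print(round(kw, 2))
```

Values near 0 indicate chance-level agreement; the study's kappa(w) of 0.32 falls in the conventional "fair" band.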

  11. Applications of "Integrated Data Viewer'' (IDV) in the classroom

    NASA Astrophysics Data System (ADS)

    Nogueira, R.; Cutrim, E. M.

    2006-06-01

Conventionally, weather products utilized in synoptic meteorology reduce phenomena occurring in four dimensions to a 2-dimensional form. This constitutes a road-block for non-atmospheric-science majors who need to take meteorology as a non-mathematical course complementary to their major programs. This research examines the use of the Integrated Data Viewer (IDV) as a teaching tool, as it allows a 4-dimensional representation of weather products. IDV was tested in the teaching of synoptic meteorology, weather analysis, and weather map interpretation to non-science students in the laboratory sessions of an introductory meteorology class at Western Michigan University. Student exam scores were compared according to the laboratory teaching technique (traditional lab manual vs. IDV) for short- and long-term learning. Results of the statistical analysis show that the Fall 2004 students in the IDV-based lab session retained learning. However, in Spring 2005 the exam scores did not reflect retention of learning when IDV-based and MANUAL-based lab scores were compared (short-term learning, i.e., exam taken one week after the lab exercise). Testing of long-term learning (seven weeks between the two exams in Spring 2005) showed no statistically significant difference between IDV-based and MANUAL-based group scores, although the IDV group obtained an exam score average slightly higher than the MANUAL group. Statistical testing of the principal hypothesis in this study leads to the conclusion that the IDV-based method did not prove to be a better teaching tool than the traditional paper-based method. Future studies could potentially find significant differences in the effectiveness of the two methods under more controlled conditions; in particular, students in the control group should not be exposed to weather analysis using IDV during lecture.

  12. Clinical validation of robot simulation of toothbrushing - comparative plaque removal efficacy

    PubMed Central

    2014-01-01

Background Clinical validation of laboratory toothbrushing tests has important advantages. It was, therefore, the aim to demonstrate the correlation of the tooth cleaning efficiency of a new robot brushing simulation technique with clinical plaque removal. Methods Clinical programme: 27 subjects received dental cleaning prior to a 3-day plaque-regrowth interval. Plaque was stained, photographically documented and scored using a planimetrical index. Subjects brushed teeth 33–47 with three techniques (horizontal, rotating, vertical), each for 20 s buccally and 20 s orally, in 3 consecutive intervals. The force was calibrated, and the brushing technique was video-supported. Two different brushes were randomly assigned to the subjects. Robot programme: The clinical brushing programmes were transferred to a 6-axis robot. Artificial teeth 33–47 were covered with a plaque-simulating substrate. All brushing techniques were repeated 7 times, and results were scored according to clinical planimetry. All data underwent statistical analysis by t-test, U-test and multivariate analysis. Results The individual clinical cleaning patterns are well reproduced by the robot programmes. Differences in plaque removal are statistically significant for the two brushes and are reproduced in the clinical and robot data. Multivariate analysis confirms the higher cleaning efficiency for anterior teeth and for buccal sites. Conclusions The robot toothbrushing simulation programme showed good correlation with clinically standardized toothbrushing. This new robot brushing simulation programme can be used for rapid, reproducible laboratory testing of tooth cleaning. PMID:24996973

  13. Clinical validation of robot simulation of toothbrushing--comparative plaque removal efficacy.

    PubMed

    Lang, Tomas; Staufer, Sebastian; Jennes, Barbara; Gaengler, Peter

    2014-07-04

Clinical validation of laboratory toothbrushing tests has important advantages. It was, therefore, the aim to demonstrate the correlation of the tooth cleaning efficiency of a new robot brushing simulation technique with clinical plaque removal. Clinical programme: 27 subjects received dental cleaning prior to a 3-day plaque-regrowth interval. Plaque was stained, photographically documented and scored using a planimetrical index. Subjects brushed teeth 33-47 with three techniques (horizontal, rotating, vertical), each for 20 s buccally and 20 s orally, in 3 consecutive intervals. The force was calibrated, and the brushing technique was video-supported. Two different brushes were randomly assigned to the subjects. Robot programme: The clinical brushing programmes were transferred to a 6-axis robot. Artificial teeth 33-47 were covered with a plaque-simulating substrate. All brushing techniques were repeated 7 times, and results were scored according to clinical planimetry. All data underwent statistical analysis by t-test, U-test and multivariate analysis. The individual clinical cleaning patterns are well reproduced by the robot programmes. Differences in plaque removal are statistically significant for the two brushes and are reproduced in the clinical and robot data. Multivariate analysis confirms the higher cleaning efficiency for anterior teeth and for buccal sites. The robot toothbrushing simulation programme showed good correlation with clinically standardized toothbrushing. This new robot brushing simulation programme can be used for rapid, reproducible laboratory testing of tooth cleaning.
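The paired t-test/U-test comparison of the two brushes can be sketched with SciPy's `ttest_ind` and `mannwhitneyu`. The scores below are invented planimetric values, assuming 7 robot repetitions per brush as in the abstract:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical planimetric plaque-removal scores (%) for two brushes,
# 7 robot repetitions each; values are invented, not the study's data.
brush_a = rng.normal(62.0, 4.0, size=7)
brush_b = rng.normal(50.0, 4.0, size=7)

# Parametric comparison (t-test) and its rank-based counterpart (U-test)
t_stat, t_p = stats.ttest_ind(brush_a, brush_b)
u_stat, u_p = stats.mannwhitneyu(brush_a, brush_b, alternative="two-sided")
print(t_p, u_p)
```

Running both tests, as the study did, guards against the t-test's normality assumption being violated by small samples.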

14. Different spectrophotometric methods applied for the analysis of simeprevir in the presence of its oxidative degradation product: A comparative study

    NASA Astrophysics Data System (ADS)

    Attia, Khalid A. M.; El-Abasawi, Nasr M.; El-Olemy, Ahmed; Serag, Ahmed

    2018-02-01

Five simple spectrophotometric methods were developed for the determination of simeprevir in the presence of its oxidative degradation product, namely ratio difference, mean centering, derivative ratio using Savitzky-Golay filters, second derivative and continuous wavelet transform. These methods are linear in the range of 2.5-40 μg/mL and were validated according to the ICH guidelines. The obtained results for accuracy, repeatability and precision were found to be within the acceptable limits. The specificity of the proposed methods was tested using laboratory-prepared mixtures and assessed by applying the standard addition technique. Furthermore, these methods were statistically comparable to an RP-HPLC method, and good results were obtained. Thus, they can be used for the routine analysis of simeprevir in quality-control laboratories.

  15. Research of facial feature extraction based on MMC

    NASA Astrophysics Data System (ADS)

    Xue, Donglin; Zhao, Jiufen; Tang, Qinhong; Shi, Shaokun

    2017-07-01

Based on the maximum margin criterion (MMC), a new algorithm for statistically uncorrelated optimal discriminant vectors and a new algorithm for orthogonal optimal discriminant vectors for feature extraction are proposed. The purpose of the maximum margin criterion is to maximize the inter-class scatter while simultaneously minimizing the intra-class scatter after projection. Compared with the original MMC method and the principal component analysis (PCA) method, the proposed methods are better at reducing or eliminating the statistical correlation between features and improving the recognition rate. Experimental results on the Olivetti Research Laboratory (ORL) face database show that the new feature extraction method based on the statistically uncorrelated maximum margin criterion (SUMMC) is better in terms of recognition rate and stability. In addition, the relations between the maximum margin criterion and the Fisher criterion for feature extraction are revealed.
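The basic MMC projection, maximizing inter-class minus intra-class scatter, amounts to taking the leading eigenvectors of Sb - Sw. A minimal sketch on synthetic two-class data (the uncorrelatedness and orthogonality refinements of the paper are not reproduced here):

```python
import numpy as np

def mmc_projection(X, y, n_components):
    """Feature extraction by the maximum margin criterion (MMC):
    project onto the leading eigenvectors of Sb - Sw, the difference
    of the between-class and within-class scatter matrices."""
    d = X.shape[1]
    mean_all = X.mean(axis=0)
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sb += len(Xc) * np.outer(mc - mean_all, mc - mean_all)
        Sw += (Xc - mc).T @ (Xc - mc)
    vals, vecs = np.linalg.eigh(Sb - Sw)     # symmetric eigenproblem
    order = np.argsort(vals)[::-1]           # largest margin first
    return vecs[:, order[:n_components]]     # columns are orthonormal

rng = np.random.default_rng(3)
# Synthetic two-class data: classes separated along feature 0 only
X = np.vstack([rng.normal(0.0, 1.0, (50, 5)),
               rng.normal(0.0, 1.0, (50, 5)) + [6, 0, 0, 0, 0]])
y = np.array([0] * 50 + [1] * 50)
W = mmc_projection(X, y, 2)
print(np.argmax(np.abs(W[:, 0])))            # dominant discriminant axis
```

Because Sb - Sw is symmetric, the eigenvectors come out orthonormal, and, unlike Fisher's criterion, no inversion of Sw is needed.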

  16. Mathematic model analysis of Gaussian beam propagation through an arbitrary thickness random phase screen.

    PubMed

    Tian, Yuzhen; Guo, Jin; Wang, Rui; Wang, Tingfeng

    2011-09-12

In order to research the statistical properties of Gaussian beam propagation through an arbitrary-thickness random phase screen for adaptive optics and laser communication applications in the laboratory, we establish mathematical models of the statistical quantities involved in the propagation process, based on the Rytov method and the thin-phase-screen model. Analytic results are developed for an arbitrary-thickness phase screen based on the Kolmogorov power spectrum. The comparison between the arbitrary-thickness phase screen and the thin phase screen shows that our results are better suited to describing the generalized case, especially the scintillation index.

  17. Using experimental design modules for process characterization in manufacturing/materials processes laboratories

    NASA Technical Reports Server (NTRS)

    Ankenman, Bruce; Ermer, Donald; Clum, James A.

    1994-01-01

    Modules dealing with statistical experimental design (SED), process modeling and improvement, and response surface methods have been developed and tested in two laboratory courses. One course was a manufacturing processes course in Mechanical Engineering and the other course was a materials processing course in Materials Science and Engineering. Each module is used as an 'experiment' in the course with the intent that subsequent course experiments will use SED methods for analysis and interpretation of data. Evaluation of the modules' effectiveness has been done by both survey questionnaires and inclusion of the module methodology in course examination questions. Results of the evaluation have been very positive. Those evaluation results and details of the modules' content and implementation are presented. The modules represent an important component for updating laboratory instruction and to provide training in quality for improved engineering practice.

  18. Homogeneity study of a corn flour laboratory reference material candidate for inorganic analysis.

    PubMed

    Dos Santos, Ana Maria Pinto; Dos Santos, Liz Oliveira; Brandao, Geovani Cardoso; Leao, Danilo Junqueira; Bernedo, Alfredo Victor Bellido; Lopes, Ricardo Tadeu; Lemos, Valfredo Azevedo

    2015-07-01

In this work, a homogeneity study of a corn flour reference material candidate for inorganic analysis is presented. Seven kilograms of corn flour were used to prepare the material, which was distributed among 100 bottles. The elements Ca, K, Mg, P, Zn, Cu, Fe, Mn and Mo were quantified by inductively coupled plasma optical emission spectrometry (ICP OES) after an acid digestion procedure. The method's accuracy was confirmed by analyzing the rice flour certified reference material NIST 1568a. All results were evaluated by analysis of variance (ANOVA) and principal component analysis (PCA). In the study, a sample mass of 400 mg was established as the minimum mass required for analysis, according to the PCA. The between-bottle test was performed by analyzing 9 bottles of the material. Subsamples of a single bottle were analyzed for the within-bottle test. No significant differences were observed in the results obtained through the application of both statistical methods. This demonstrates that the material is homogeneous enough for use as a laboratory reference material. Copyright © 2015 Elsevier Ltd. All rights reserved.
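The between-bottle test above reduces to a one-way ANOVA of subsample results grouped by bottle. A sketch with invented element concentrations (the element, values and bottle counts are illustrative; the study used 9 bottles):

```python
import numpy as np

def one_way_anova_f(groups):
    """F statistic of a one-way ANOVA: between-bottle variance over
    within-bottle variance, the core of a between-bottle homogeneity
    test for a reference material."""
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(((np.asarray(g) - np.mean(g)) ** 2).sum()
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n_total - k))

rng = np.random.default_rng(4)
# Hypothetical Fe contents (mg/kg): 9 bottles, 3 subsamples each,
# drawn from a single homogeneous distribution.
bottles = [rng.normal(25.0, 0.4, size=3) for _ in range(9)]
f = one_way_anova_f(bottles)
# For a homogeneous material F stays near 1; the 5% critical value
# of F(8, 18) is about 2.5.
print(f)
```

An F below the critical value means bottle-to-bottle variation is indistinguishable from subsampling noise, i.e. the material can be treated as homogeneous.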

  19. Organic Laboratory Experiments: Micro vs. Conventional.

    ERIC Educational Resources Information Center

    Chloupek-McGough, Marge

    1989-01-01

    Presents relevant statistics accumulated in a fall organic laboratory course. Discusses laboratory equipment setup to lower the amount of waste. Notes decreased solid wastes were produced compared to the previous semester. (MVL)

  20. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    NASA Technical Reports Server (NTRS)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.

  1. Evaluation of clinical, laboratory and morphologic prognostic factors in colon cancer

    PubMed Central

    Grande, Michele; Milito, Giovanni; Attinà, Grazia Maria; Cadeddu, Federica; Muzi, Marco Gallinella; Nigro, Casimiro; Rulli, Francesco; Farinon, Attilio Maria

    2008-01-01

    Background The long-term prognosis of patients with colon cancer is dependent on many factors. To investigate the influence of a series of clinical, laboratory and morphological variables on prognosis of colon carcinoma we conducted a retrospective analysis of our data. Methods Ninety-two patients with colon cancer, who underwent surgical resection between January 1999 and December 2001, were analyzed. On survival analysis, demographics, clinical, laboratory and pathomorphological parameters were tested for their potential prognostic value. Furthermore, univariate and multivariate analysis of the above mentioned data were performed considering the depth of tumour invasion into the bowel wall as independent variable. Results On survival analysis we found that depth of tumour invasion (P < 0.001; F-ratio 2.11), type of operation (P < 0.001; F-ratio 3.51) and CT scanning (P < 0.001; F-ratio 5.21) were predictors of survival. Considering the degree of mural invasion as independent variable, on univariate analysis, we observed that mucorrhea, anismus, hematocrit, WBC count, fibrinogen value and CT scanning were significantly related to the degree of mural invasion of the cancer. On the multivariate analysis, fibrinogen value was the most statistically significant variable (P < 0.001) with the highest F-ratio (F-ratio 5.86). Finally, in the present study, the tumour site was significantly related neither to the survival nor to the mural invasion of the tumour. Conclusion The various clinical, laboratory and patho-morphological parameters showed different prognostic value for colon carcinoma. In the future, preoperative prognostic markers will probably gain relevance in order to make a proper choice between surgery, chemotherapy and radiotherapy. Nevertheless, current data do not provide sufficient evidence for preoperative stratification of high and low risk patients. Further assessments in prospective large studies are warranted. PMID:18778464

  2. A statistical data analysis and plotting program for cloud microphysics experiments

    NASA Technical Reports Server (NTRS)

    Jordan, A. J.

    1981-01-01

    The analysis software developed for atmospheric cloud microphysics experiments conducted in the laboratory as well as aboard a KC-135 aircraft is described. A group of four programs was developed and implemented on a Hewlett Packard 1000 series F minicomputer running under HP's RTE-IVB operating system. The programs control and read data from a MEMODYNE Model 3765-8BV cassette recorder, format the data on the Hewlett Packard disk subsystem, and generate statistical data (mean, variance, standard deviation) and voltage and engineering unit plots on a user selected plotting device. The programs are written in HP FORTRAN IV and HP ASSEMBLY Language with the graphics software using the HP 1000 Graphics. The supported plotting devices are the HP 2647A graphics terminal, the HP 9872B four color pen plotter, and the HP 2608A matrix line printer.

  3. Extending the scope of pooled analyses of individual patient biomarker data from heterogeneous laboratory platforms and cohorts using merging algorithms.

    PubMed

    Burke, Órlaith; Benton, Samantha; Szafranski, Pawel; von Dadelszen, Peter; Buhimschi, S Catalin; Cetin, Irene; Chappell, Lucy; Figueras, Francesc; Galindo, Alberto; Herraiz, Ignacio; Holzman, Claudia; Hubel, Carl; Knudsen, Ulla; Kronborg, Camilla; Laivuori, Hannele; Lapaire, Olav; McElrath, Thomas; Moertl, Manfred; Myers, Jenny; Ness, Roberta B; Oliveira, Leandro; Olson, Gayle; Poston, Lucilla; Ris-Stalpers, Carrie; Roberts, James M; Schalekamp-Timmermans, Sarah; Schlembach, Dietmar; Steegers, Eric; Stepan, Holger; Tsatsaris, Vassilis; van der Post, Joris A; Verlohren, Stefan; Villa, Pia M; Williams, David; Zeisler, Harald; Redman, Christopher W G; Staff, Anne Cathrine

    2016-01-01

A common challenge in medicine, exemplified in the analysis of biomarker data, is that large studies are needed for sufficient statistical power. Often, this may only be achievable by aggregating multiple cohorts. However, different studies may use disparate platforms for laboratory analysis, which can hinder merging. Using circulating placental growth factor (PlGF), a potential biomarker for hypertensive disorders of pregnancy (HDP) such as preeclampsia, as an example, we investigated how such issues can be overcome by inter-platform standardization and merging algorithms. We studied 16,462 pregnancies from 22 study cohorts. PlGF measurements (gestational age ⩾20 weeks), analyzed on one of four platforms (R&D Systems, Alere Triage, Roche Elecsys or Abbott Architect), were available for 13,429 women. Two merging algorithms, using Z-score and Multiple of Median transformations, were applied. Best reference curves (BRC), based on merged, transformed PlGF measurements in uncomplicated pregnancy across six gestational age groups, were estimated. Identification of HDP by these PlGF-BRCs was compared to that of platform-specific curves. We demonstrate the feasibility of merging PlGF concentrations from different analytical platforms. Overall, BRC identification of HDP performed at least as well as platform-specific curves. Our method can be extended to any set of biomarkers obtained from different laboratory platforms in any field. Merged biomarker data from multiple studies will improve statistical power and enlarge our understanding of the pathophysiology and management of medical syndromes. Copyright © 2015 International Society for the Study of Hypertension in Pregnancy. Published by Elsevier B.V. All rights reserved.
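The two merging transforms named above can be sketched as follows. The concentrations, platform names and reference medians are hypothetical; the point is only that both transforms cancel a constant calibration offset between platforms:

```python
import numpy as np

def to_mom(values, reference_medians, ga_group):
    """Multiple-of-median (MoM) transform: divide each measurement by
    the platform- and gestational-age-specific reference median so that
    values from different analytical platforms become comparable."""
    return np.asarray(values, dtype=float) / reference_medians[ga_group]

def to_zscore(values):
    """Z-score transform: centre and scale within one platform's cohort."""
    v = np.asarray(values, dtype=float)
    return (v - v.mean()) / v.std()

# Hypothetical PlGF concentrations (pg/mL) for the same pregnancies
# measured on two platforms whose calibrations differ by a constant factor.
platform_a = np.array([120.0, 300.0, 480.0, 600.0])
platform_b = platform_a * 1.5

med_a = {"28-32wk": float(np.median(platform_a))}
med_b = {"28-32wk": float(np.median(platform_b))}

mom_a = to_mom(platform_a, med_a, "28-32wk")
mom_b = to_mom(platform_b, med_b, "28-32wk")
print(np.allclose(mom_a, mom_b))   # the platform offset is removed
```

Once every cohort is expressed in MoM or Z-score units, pooled reference curves can be fitted across cohorts regardless of the assay used.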

  4. [Analysis on 2011 quality control results on aerobic plate count of microbiology laboratories in China].

    PubMed

Han, Haihong; Li, Ning; Li, Yepeng; Fu, Ping; Yu, Dongmin; Li, Zhigang; Du, Chunming; Guo, Yunchang

    2015-01-01

To test the aerobic plate count examination capability of microbiology laboratories, to ensure the accuracy and comparability of quantitative bacterial examination results, and to improve the quality of monitoring. Four aerobic plate count test-piece samples of different concentrations were prepared and labeled I, II, III and IV. After homogeneity and stability tests, the samples were delivered to the monitoring institutions. The results for samples I, II and III were log-transformed and evaluated with the Z-score method using the robust average and standard deviation. The results for sample IV were evaluated as "satisfactory" when reported as < 10 CFU/piece or as "not satisfactory" otherwise. The Pearson χ2 test was used to analyze the rate results. A total of 309 monitoring institutions (99.04% of all institutions) reported their results. Of these, 271 institutions reported a satisfactory result, giving a satisfactory rate of 87.70%. There was no statistical difference among the satisfactory rates for samples I, II and III, which were 81.52%, 88.30% and 91.40%, respectively. The satisfactory rate for sample IV was 93.33%. There was no statistical difference in satisfactory rates between provincial and municipal CDCs. The quality control program provided scientific evidence that the aerobic plate count capability of the laboratories meets the requirements of the monitoring tasks.
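The Z-score evaluation of log-transformed plate counts can be sketched as below. Median and scaled MAD are used here as one common robust location/scale choice; the program may have used a different robust estimator (e.g. ISO 13528's Algorithm A), and the counts are invented:

```python
import numpy as np

def robust_z_scores(counts):
    """Z-scores of log10 plate counts using robust location and scale
    (median and scaled MAD here; a proficiency-testing scheme might
    instead use ISO 13528's Algorithm A)."""
    logs = np.log10(np.asarray(counts, dtype=float))
    center = np.median(logs)
    mad = np.median(np.abs(logs - center))
    scale = 1.4826 * mad            # MAD -> sigma for normal data
    return (logs - center) / scale

# Hypothetical aerobic plate counts (CFU/piece) reported by six labs
# for one sample concentration; the last laboratory is an outlier.
reported = [9.5e4, 1.1e5, 1.0e5, 8.8e4, 1.2e5, 1.0e6]
z = robust_z_scores(reported)
# Conventionally |z| <= 2 is satisfactory and |z| >= 3 unsatisfactory.
print(np.round(z, 2))
```

Robust estimators keep a single wildly discrepant laboratory from inflating the scale and masking its own deviation.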

  5. A combination of routine blood analytes predicts fitness decrement in elderly endurance athletes.

    PubMed

    Haslacher, Helmuth; Ratzinger, Franz; Perkmann, Thomas; Batmyagmar, Delgerdalai; Nistler, Sonja; Scherzer, Thomas M; Ponocny-Seliger, Elisabeth; Pilger, Alexander; Gerner, Marlene; Scheichenberger, Vanessa; Kundi, Michael; Endler, Georg; Wagner, Oswald F; Winker, Robert

    2017-01-01

    Endurance sports are enjoying greater popularity, particularly among new target groups such as the elderly. Predictors of future physical capacities providing a basis for training adaptations are in high demand. We therefore aimed to estimate the future physical performance of elderly marathoners (runners/bicyclists) using a set of easily accessible standard laboratory parameters. To this end, 47 elderly marathon athletes underwent physical examinations including bicycle ergometry and a blood draw at baseline and after a three-year follow-up period. In order to compile a statistical model containing baseline laboratory results allowing prediction of follow-up ergometry performance, the cohort was subgrouped into a model training (n = 25) and a test sample (n = 22). The model containing significant predictors in univariate analysis (alanine aminotransferase, urea, folic acid, myeloperoxidase and total cholesterol) presented with high statistical significance and excellent goodness of fit (R2 = 0.789, ROC-AUC = 0.951±0.050) in the model training sample and was validated in the test sample (ROC-AUC = 0.786±0.098). Our results suggest that standard laboratory parameters could be particularly useful for predicting future physical capacity in elderly marathoners. It hence merits further research whether these conclusions can be translated to other disciplines or age groups.

  6. A combination of routine blood analytes predicts fitness decrement in elderly endurance athletes

    PubMed Central

    Ratzinger, Franz; Perkmann, Thomas; Batmyagmar, Delgerdalai; Nistler, Sonja; Scherzer, Thomas M.; Ponocny-Seliger, Elisabeth; Pilger, Alexander; Gerner, Marlene; Scheichenberger, Vanessa; Kundi, Michael; Endler, Georg; Wagner, Oswald F.; Winker, Robert

    2017-01-01

    Endurance sports are enjoying greater popularity, particularly among new target groups such as the elderly. Predictors of future physical capacities providing a basis for training adaptations are in high demand. We therefore aimed to estimate the future physical performance of elderly marathoners (runners/bicyclists) using a set of easily accessible standard laboratory parameters. To this end, 47 elderly marathon athletes underwent physical examinations including bicycle ergometry and a blood draw at baseline and after a three-year follow-up period. In order to compile a statistical model containing baseline laboratory results allowing prediction of follow-up ergometry performance, the cohort was subgrouped into a model training (n = 25) and a test sample (n = 22). The model containing significant predictors in univariate analysis (alanine aminotransferase, urea, folic acid, myeloperoxidase and total cholesterol) presented with high statistical significance and excellent goodness of fit (R2 = 0.789, ROC-AUC = 0.951±0.050) in the model training sample and was validated in the test sample (ROC-AUC = 0.786±0.098). Our results suggest that standard laboratory parameters could be particularly useful for predicting future physical capacity in elderly marathoners. It hence merits further research whether these conclusions can be translated to other disciplines or age groups. PMID:28475643
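The ROC-AUC used above to validate the prediction model can be computed directly from its rank-statistic definition. The scores and outcomes below are invented, not the study's data:

```python
import numpy as np

def roc_auc(scores, labels):
    """ROC area via its rank-statistic form: the probability that a
    randomly chosen positive case scores above a randomly chosen
    negative one (ties count one half)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical model scores for athletes whose ergometry performance
# declined (1) or did not (0) at follow-up; values are invented.
scores = [0.9, 0.8, 0.75, 0.6, 0.4, 0.3, 0.2, 0.1]
declined = [1, 1, 0, 1, 0, 0, 1, 0]
print(roc_auc(scores, declined))   # 0.75 for this toy data
```

Evaluating the AUC on a held-out test sample, as the study did, is what distinguishes validated discrimination from in-sample fit.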

  7. The Computerized Laboratory Notebook concept for genetic toxicology experimentation and testing.

    PubMed

    Strauss, G H; Stanford, W L; Berkowitz, S J

    1989-03-01

    We describe a microcomputer system utilizing the Computerized Laboratory Notebook (CLN) concept developed in our laboratory for the purpose of automating the Battery of Leukocyte Tests (BLT). The BLT was designed to evaluate blood specimens for toxic, immunotoxic, and genotoxic effects after in vivo exposure to putative mutagens. A system was developed with the advantages of low cost, limited spatial requirements, ease of use for personnel inexperienced with computers, and applicability to specific testing yet flexibility for experimentation. This system eliminates cumbersome record keeping and repetitive analysis inherent in genetic toxicology bioassays. Statistical analysis of the vast quantity of data produced by the BLT would not be feasible without a central database. Our central database is maintained by an integrated package which we have adapted to develop the CLN. The clonal assay of lymphocyte mutagenesis (CALM) section of the CLN is demonstrated. PC-Slaves expand the microcomputer to multiple workstations so that our computerized notebook can be used next to a hood while other work is done in an office and instrument room simultaneously. Communication with peripheral instruments is an indispensable part of many laboratory operations, and we present a representative program, written to acquire and analyze CALM data, for communicating with both a liquid scintillation counter and an ELISA plate reader. In conclusion we discuss how our computer system could easily be adapted to the needs of other laboratories.

  8. Computer-Assisted Instruction in Statistics. Technical Report.

    ERIC Educational Resources Information Center

    Cooley, William W.

    A paper given at a conference on statistical computation discussed teaching statistics with computers. It concluded that computer-assisted instruction is most appropriately employed in the numerical demonstration of statistical concepts, and for statistical laboratory instruction. The student thus learns simultaneously about the use of computers…

  9. Communication methods and production techniques in fixed prosthesis fabrication: a UK based survey. Part 2: Production techniques

    PubMed Central

    Berry, J.; Nesbit, M.; Saberi, S.; Petridis, H.

    2014-01-01

    Aim: The aim of this study was to identify the communication methods and production techniques used by dentists and dental technicians for the fabrication of fixed prostheses within the UK from the dental technicians' perspective. This second paper reports on the production techniques utilised. Materials and methods: Seven hundred and eighty-two online questionnaires were distributed to the Dental Laboratories Association membership and included a broad range of topics, such as demographics, impression disinfection and suitability, and various production techniques. Settings were managed in order to ensure anonymity of respondents. Statistical analysis was undertaken to test the influence of various demographic variables such as the source of information, the location, and the size of the dental laboratory. Results: The number of completed responses totalled 248 (32% response rate). Ninety percent of the respondents were based in England and the majority of dental laboratories were categorised as small sized (working with up to 25 dentists). Concerns were raised regarding inadequate disinfection protocols between dentists and dental laboratories and the poor quality of master impressions. Full arch plastic trays were the most popular impression tray used by dentists in the fabrication of crowns (61%) and bridgework (68%). The majority (89%) of jaw registration records were considered inaccurate. Forty-four percent of dental laboratories preferred using semi-adjustable articulators. Axial and occlusal under-preparation of abutment teeth was reported as an issue in about 25% of cases. Base metal alloy was the most commonly used alloy material (52%). Metal-ceramic crowns were the most popular choice for anterior (69%) and posterior (70%) cases. The various factors considered did not have any statistically significant effect on the answers provided. The only notable exception was that more methods of communicating the size and shape of crowns were utilised by large laboratories. Conclusion: This study suggests that there are continuing issues in the production techniques utilised between dentists and dental laboratories. PMID:25257017

  10. Inclusion probability for DNA mixtures is a subjective one-sided match statistic unrelated to identification information

    PubMed Central

    Perlin, Mark William

    2015-01-01

    Background: DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. Materials and Methods: The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI⁻¹ value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI⁻¹) values were examined and compared with corresponding log(LR) values. Results: The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI⁻¹ increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Conclusions: Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. 
However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN rather than measuring identification information. A quantitative CPI number adds little meaningful information beyond the analyst's initial qualitative assessment that a person's DNA is included in a mixture. Statistical methods for reporting on DNA mixture evidence should be scientifically validated before they are relied upon by criminal justice. PMID:26605124
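    The multiplicative arithmetic behind this behaviour is easy to sketch. A minimal illustration, assuming Hardy-Weinberg equilibrium and invented allele frequencies (the function names and numbers are hypothetical, not taken from the study):

```python
def locus_pi(allele_freqs):
    """Probability of inclusion at one locus: the chance that a random
    person's two alleles both fall among those seen in the mixture
    (assumes Hardy-Weinberg equilibrium)."""
    p = sum(allele_freqs)      # combined frequency of the observed alleles
    return p * p

def cpi(loci):
    """Combined probability of inclusion across independent loci."""
    out = 1.0
    for freqs in loci:
        out *= locus_pi(freqs)
    return out

# Hypothetical 13-locus mixture in which ~60% of the population is
# included at every locus -- the frequencies are invented.
loci = [[0.2, 0.25, 0.15]] * 13
value = cpi(loci)
print(f"CPI = {value:.3e}, CPI^-1 = {1 / value:.3e}")
```

    At roughly 60% population inclusion per locus, thirteen independent loci multiply out to a CPI⁻¹ in the neighbourhood of a million, regardless of how informative any single comparison was: log(CPI⁻¹) simply grows linearly with the number of loci tested, which is the LLN effect the abstract describes.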

  11. Inclusion probability for DNA mixtures is a subjective one-sided match statistic unrelated to identification information.

    PubMed

    Perlin, Mark William

    2015-01-01

    DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI⁻¹ value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI⁻¹) values were examined and compared with corresponding log(LR) values. The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI⁻¹ increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN rather than measuring identification information. 
A quantitative CPI number adds little meaningful information beyond the analyst's initial qualitative assessment that a person's DNA is included in a mixture. Statistical methods for reporting on DNA mixture evidence should be scientifically validated before they are relied upon by criminal justice.

  12. "I got it on Ebay!": cost-effective approach to surgical skills laboratories.

    PubMed

    Schneider, Ethan; Schenarts, Paul J; Shostrom, Valerie; Schenarts, Kimberly D; Evans, Charity H

    2017-01-01

    Surgical education is witnessing a surge in the use of simulation. However, implementation of simulation is often cost-prohibitive. Online shopping offers a low-budget alternative. The aim of this study was to implement cost-effective skills laboratories and analyze online versus manufacturers' prices to evaluate for savings. Four skills laboratories were designed for the surgery clerkship from July 2014 to June 2015. Skills laboratories were implemented using hand-built simulation and instruments purchased online. Trademarked simulation was priced online and instruments priced from a manufacturer. Costs were compiled, and a descriptive cost analysis of online and manufacturers' prices was performed. Learners rated their level of satisfaction for all educational activities, and levels of satisfaction were compared. A total of 119 third-year medical students participated. Supply lists and costs were compiled for each laboratory. The cost analysis showed online prices were substantially lower than manufacturers', with a per-laboratory savings of $1779.26 (suturing), $1752.52 (chest tube), $2448.52 (anastomosis), and $1891.64 (laparoscopic), resulting in a year 1 savings of $47,285. Mean student satisfaction scores for the skills laboratories were 4.32, significantly higher than for live lectures (2.96; P < 0.05) and small group activities (3.67; P < 0.05). A cost-effective approach to the implementation of skills laboratories showed substantial savings. By using hand-built simulation boxes and online resources to purchase surgical equipment, surgical educators can overcome financial obstacles limiting the use of simulation and provide learning opportunities that medical students perceive as beneficial. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Lipid membranes and single ion channel recording for the advanced physics laboratory

    NASA Astrophysics Data System (ADS)

    Klapper, Yvonne; Nienhaus, Karin; Röcker, Carlheinz; Ulrich Nienhaus, G.

    2014-05-01

    We present an easy-to-handle, low-cost, and reliable setup to study various physical phenomena on a nanometer-thin lipid bilayer using the so-called black lipid membrane technique. The apparatus allows us to precisely measure optical and electrical properties of free-standing lipid membranes, to study the formation of single ion channels, and to gain detailed information on the ion conduction properties of these channels using statistical physics and autocorrelation analysis. The experiments are well suited as part of an advanced physics or biophysics laboratory course; they interconnect physics, chemistry, and biology and will be appealing to students of the natural sciences who are interested in quantitative experimentation.
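    The autocorrelation analysis mentioned above can be sketched for a simulated channel; the random-telegraph model and switching probability below are illustrative assumptions, not the course apparatus:

```python
import random

def autocorrelation(x, max_lag):
    """Normalized autocorrelation C(k) of a signal for lags 0..max_lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    return [sum((x[i] - mean) * (x[i + k] - mean) for i in range(n - k))
            / ((n - k) * var)
            for k in range(max_lag + 1)]

# Simulate a single two-state (open/closed) channel as a random
# telegraph signal; the switching probability is invented.
random.seed(1)
p_switch = 0.05            # chance of changing state per time step
state, trace = 0, []
for _ in range(20000):
    if random.random() < p_switch:
        state = 1 - state
    trace.append(state)

c = autocorrelation(trace, 50)
# For a symmetric two-state process, C(k) decays roughly
# exponentially with rate 2 * p_switch.
```

    For such a symmetric two-state channel, the decay rate of C(k) estimates the sum of the opening and closing rates, which is how kinetic information is extracted from noisy single-channel recordings.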

  14. GHEP-ISFG collaborative exercise on mixture profiles of autosomal STRs (GHEP-MIX01, GHEP-MIX02 and GHEP-MIX03): results and evaluation.

    PubMed

    Crespillo, M; Barrio, P A; Luque, J A; Alves, C; Aler, M; Alessandrini, F; Andrade, L; Barretto, R M; Bofarull, A; Costa, S; García, M A; García, O; Gaviria, A; Gladys, A; Gorostiza, A; Hernández, A; Herrera, M; Hombreiro, L; Ibarra, A A; Jiménez, M J; Luque, G M; Madero, P; Martínez-Jarreta, B; Masciovecchio, M V; Modesti, N M; Moreno, F; Pagano, S; Pedrosa, S; Plaza, G; Prat, E; Puente, J; Rendo, F; Ribeiro, T; Sala, A; Santamaría, E; Saragoni, V G; Whittle, M R

    2014-05-01

    One of the main objectives of the Spanish and Portuguese-Speaking Group of the International Society for Forensic Genetics (GHEP-ISFG) is to promote and contribute to the development and dissemination of scientific knowledge in the area of forensic genetics. Due to this fact, GHEP-ISFG holds different working commissions that are set up to develop activities in scientific aspects of general interest. One of them, the Mixture Commission of GHEP-ISFG, has organized annually, since 2009, a collaborative exercise on analysis and interpretation of autosomal short tandem repeat (STR) mixture profiles. Until now, three exercises have been organized (GHEP-MIX01, GHEP-MIX02 and GHEP-MIX03), with 32, 24 and 17 participant laboratories respectively. The exercise aims to give a general vision by addressing, through the proposal of mock cases, aspects related to the edition of mixture profiles and the statistical treatment. The main conclusions obtained from these exercises may be summarized as follows. Firstly, the data show an increased tendency of the laboratories toward validation of DNA mixture profiles analysis following international recommendations (ISO/IEC 17025:2005). Secondly, the majority of discrepancies are mainly encountered in stutters positions (53.4%, 96.0% and 74.9%, respectively for the three editions). On the other hand, the results submitted reveal the importance of performing duplicate analysis by using different kits in order to reduce errors as much as possible. Regarding the statistical aspect (GHEP-MIX02 and 03), all participants employed the likelihood ratio (LR) parameter to evaluate the statistical compatibility and the formulas employed were quite similar. When the hypotheses to evaluate the LR value were locked by the coordinators (GHEP-MIX02) the results revealed a minor number of discrepancies that were mainly due to clerical reasons. 
However, the GHEP-MIX03 exercise allowed the participants to freely come up with their own hypotheses to calculate the LR value. In this situation the laboratories reported several options to explain the mock cases proposed and therefore significant differences between the final LR values were obtained. Complete information concerning the background of the criminal case is a critical aspect in order to select the adequate hypotheses to calculate the LR value. Although this should be a task for the judicial court to decide, it is important for the expert to account for the different possibilities and scenarios, and also offer this expertise to the judge. In addition, continuing education in the analysis and interpretation of mixture DNA profiles may also be a priority for the vast majority of forensic laboratories. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  15. Statistical analysis of major ion and trace element geochemistry of water, 1986-2006, at seven wells transecting the freshwater/saline-water interface of the Edwards Aquifer, San Antonio, Texas

    USGS Publications Warehouse

    Mahler, Barbara J.

    2008-01-01

    The statistical analyses taken together indicate that the geochemistry at the freshwater-zone wells is more variable than that at the transition-zone wells. The geochemical variability at the freshwater-zone wells might result from dilution of ground water by meteoric water. This is indicated by relatively constant major ion molar ratios; a preponderance of positive correlations between SC, major ions, and trace elements; and a principal components analysis in which the major ions are strongly loaded on the first principal component. Much of the variability at three of the four transition-zone wells might result from the use of different laboratory analytical methods or reporting procedures during the period of sampling. This is reflected by a lack of correlation between SC and major ion concentrations at the transition-zone wells and by a principal components analysis in which the variability is fairly evenly distributed across several principal components. The statistical analyses further indicate that, although the transition-zone wells are less well connected to surficial hydrologic conditions than the freshwater-zone wells, some connection exists, with a longer response time. 
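    The positive SC-ion correlations that signal dilution can be illustrated with a plain Pearson coefficient; the well data below are invented for demonstration:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented freshwater-zone series: dilution by meteoric water moves
# specific conductance (SC) and major-ion concentrations together.
sc = [520, 480, 450, 500, 430, 410]                 # uS/cm
chloride = [21.0, 18.5, 17.0, 19.8, 16.1, 15.2]     # mg/L
print(pearson(sc, chloride))
```

    At a transition-zone well affected by changing laboratory methods, the same computation would hover near zero, which is the lack of correlation the summary describes.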

  16. Selection of nontarget arthropod taxa for field research on transgenic insecticidal crops: using empirical data and statistical power.

    PubMed

    Prasifka, J R; Hellmich, R L; Dively, G P; Higgins, L S; Dixon, P M; Duan, J J

    2008-02-01

    One of the possible adverse effects of transgenic insecticidal crops is the unintended decline in the abundance of nontarget arthropods. Field trials designed to evaluate potential nontarget effects can be more complex than expected because decisions to conduct field trials and the selection of taxa to include are not always guided by the results of laboratory tests. Also, recent studies emphasize the potential for indirect effects (adverse impacts to nontarget arthropods without feeding directly on plant tissues), which are difficult to predict because of interactions among nontarget arthropods, target pests, and transgenic crops. As a consequence, field studies may attempt to monitor expansive lists of arthropod taxa, making the design of such broad studies more difficult and reducing the likelihood of detecting any negative effects that might be present. To improve the taxonomic focus and statistical rigor of future studies, existing field data and corresponding power analysis may provide useful guidance. Analysis of control data from several nontarget field trials using repeated-measures designs suggests that while detection of small effects may require considerable increases in replication, there are taxa from different ecological roles that are sampled effectively using standard methods. The use of statistical power to guide selection of taxa for nontarget trials reflects scientists' inability to predict the complex interactions among arthropod taxa, particularly when laboratory trials fail to provide guidance on which groups are more likely to be affected. However, scientists still may exercise judgment, including taxa that are not included in or supported by power analyses.
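    The replication argument can be made concrete with a normal-approximation power calculation; the effect sizes and standard deviation below are invented, and a two-sample z-test stands in for the repeated-measures designs discussed:

```python
import math
from statistics import NormalDist

def replicates_needed(effect, sd, alpha=0.05, power=0.8):
    """Replicates per treatment needed to detect a mean difference of
    `effect` with a two-sided, two-sample z-test (normal approximation)."""
    za = NormalDist().inv_cdf(1 - alpha / 2)   # critical value
    zb = NormalDist().inv_cdf(power)           # power quantile
    return math.ceil(2 * ((za + zb) * sd / effect) ** 2)

# Invented abundances: a large effect (drop of 5 in a taxon's mean
# count) versus a subtle one, with the same sampling noise (sd = 6).
large = replicates_needed(effect=5, sd=6)
subtle = replicates_needed(effect=1, sd=6)
print(large, subtle)
```

    Shrinking the detectable effect from 5 to 1 count per sample raises the required replication about 25-fold, which is why detecting small nontarget effects can demand impractical increases in field-plot replication.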

  17. Statistical summaries of streamflow data for selected gaging stations on and near the Idaho National Engineering Laboratory, Idaho, through September 1990

    USGS Publications Warehouse

    Stone, M.A.J.; Mann, Larry J.; Kjelstrom, L.C.

    1993-01-01

    Statistical summaries and graphs of streamflow data were prepared for 13 gaging stations with 5 or more years of continuous record on and near the Idaho National Engineering Laboratory. Statistical summaries of streamflow data for the Big and Little Lost Rivers and Birch Creek were analyzed as a requisite for a comprehensive evaluation of the potential for flooding of facilities at the Idaho National Engineering Laboratory. The type of statistical analyses performed depended on the length of streamflow record for a gaging station. Streamflow statistics generated for stations with 5 to 9 years of record were: (1) magnitudes of monthly and annual flows; (2) duration of daily mean flows; and (3) maximum, median, and minimum daily mean flows. Streamflow statistics generated for stations with 10 or more years of record were: (1) magnitudes of monthly and annual flows; (2) magnitudes and frequencies of daily low, high, instantaneous peak (flood frequency), and annual mean flows; (3) duration of daily mean flows; (4) exceedance probabilities of annual low, high, instantaneous peak, and mean annual flows; (5) maximum, median, and minimum daily mean flows; and (6) annual mean and mean annual flows.
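    A flow-duration statistic of the kind listed above can be computed directly from ranked daily mean flows; the data and the Weibull plotting-position convention below are illustrative assumptions:

```python
import math

def flow_duration(daily_means, percents=(10, 50, 90)):
    """Flow-duration values: the discharge equaled or exceeded p% of the
    time, from ranked daily mean flows (Weibull plotting positions)."""
    ranked = sorted(daily_means, reverse=True)
    n = len(ranked)
    out = {}
    for p in percents:
        # smallest rank m whose exceedance probability m/(n+1) reaches p/100
        m = min(max(math.ceil(p / 100 * (n + 1)), 1), n)
        out[p] = ranked[m - 1]
    return out

# One month of invented daily mean flows (ft^3/s) for illustration
flows = [12, 15, 9, 30, 22, 18, 11, 8, 7, 25,
         16, 14, 13, 10, 9, 8, 20, 17, 15, 12,
         11, 10, 9, 8, 7, 6, 19, 21, 13, 12]
fd = flow_duration(flows)
print(fd)
```

    The exceedance-probability statistics in the report follow the same ranking logic, applied to much longer records and to annual extremes rather than daily means.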

  18. The Evolution of the Language Laboratory: Changes During Fifteen Years of Operation

    ERIC Educational Resources Information Center

    Stack, Edward M.

    1977-01-01

    This article summarizes conditions and changes in language laboratories. Types of laboratories and equipment are listed; laboratory personnel include technicians, librarians and student assistants. Most maintenance was done by institution personnel; student use is outlined. Professional attitudes and equipment statistics are surveyed. (CHK)

  19. SOBA: sequence ontology bioinformatics analysis.

    PubMed

    Moore, Barry; Fan, Guozhen; Eilbeck, Karen

    2010-07-01

    The advent of cheaper, faster sequencing technologies has pushed the task of sequence annotation from the exclusive domain of large-scale multi-national sequencing projects to that of research laboratories and small consortia. The bioinformatics burden placed on these laboratories, some with very little programming experience, can be daunting. Fortunately, there exist software libraries and pipelines designed with these groups in mind, to ease the transition from an assembled genome to an annotated and accessible genome resource. We have developed the Sequence Ontology Bioinformatics Analysis (SOBA) tool to provide a simple statistical and graphical summary of an annotated genome. We envisage its use during annotation jamborees and genome comparison, and by developers needing rapid feedback during annotation software development and testing. SOBA also provides annotation consistency feedback to ensure correct use of terminology within annotations, and guides users to add new terms to the Sequence Ontology when required. SOBA is available at http://www.sequenceontology.org/cgi-bin/soba.cgi.

  20. Comparative study on the selectivity of various spectrophotometric techniques for the determination of binary mixture of fenbendazole and rafoxanide.

    PubMed

    Saad, Ahmed S; Attia, Ali K; Alaraki, Manal S; Elzanfaly, Eman S

    2015-11-05

    Five different spectrophotometric methods were applied for the simultaneous determination of fenbendazole and rafoxanide in their binary mixture, namely first derivative, derivative ratio, ratio difference, dual wavelength and H-point standard addition spectrophotometric methods. Different factors affecting each of the applied spectrophotometric methods were studied and the selectivity of the applied methods was compared. The applied methods were validated as per the ICH guidelines, and good accuracy, specificity and precision were demonstrated within the concentration range of 5-50 μg/mL for both drugs. Statistical analysis using one-way ANOVA showed no significant differences among the proposed methods for the determination of the two drugs. The proposed methods successfully determined both drugs in laboratory-prepared and commercially available binary mixtures, and were found applicable for routine analysis in quality control laboratories. Copyright © 2015 Elsevier B.V. All rights reserved.
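    A one-way ANOVA comparison of this kind can be sketched as follows; the recovery data are invented for illustration:

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA comparing k group means."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented recovery percentages for one drug by the five methods;
# near-identical groups give an F value far below any critical value.
recoveries = [[99.8, 100.2, 100.1], [100.0, 99.9, 100.3],
              [99.7, 100.1, 100.0], [100.2, 99.8, 99.9],
              [100.0, 100.2, 99.8]]
f = one_way_anova_f(recoveries)
print(f)
```

    When the computed F falls below the tabulated critical value for the relevant degrees of freedom, the methods are judged statistically equivalent, which is the conclusion the abstract reports.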

  1. Survey of safety practices among hospital laboratories in Oromia Regional State, Ethiopia.

    PubMed

    Sewunet, Tsegaye; Kebede, Wakjira; Wondafrash, Beyene; Workalemau, Bereket; Abebe, Gemeda

    2014-10-01

    Unsafe working practices, working environments, waste products, and chemicals in clinical laboratories contribute to infectious and non-infectious hazards, making staff, the community, and patients less safe. Furthermore, such practices compromise the quality of laboratory services. We conducted a study to describe safety practices in public hospital laboratories of Oromia Regional State, Ethiopia. Ten randomly selected public hospital laboratories in Oromia Regional State were studied from Oct 2011 to Feb 2012. A self-administered structured questionnaire and observation checklists were used for data collection. The respondents were heads of the laboratories, senior technicians, and safety officers. The questionnaire addressed biosafety labelling, microbial hazards, chemical hazards, physical/mechanical hazards, personal protective equipment, first aid kits and the waste disposal system. The data were analyzed using descriptive analysis with SPSS version 16 statistical software. All of the respondents reported that none of the hospital laboratories was labeled with the appropriate safety labels and safety symbols. The respondents also reported that their laboratories may handle organisms grouped under risk group IV in the absence of microbiological safety cabinets. Overall, the respondents reported poor safety regulations or standards in their laboratories, with high risks of microbial, chemical and physical/mechanical hazards. Laboratory safety in public hospitals of Oromia Regional State is below standard, and laboratory workers are at high risk of combined physical, chemical and microbial hazards. Prompt recognition of the problem and immediate action are needed to ensure a safe working environment in health laboratories.

  2. Use of platelet-rich plasma in the treatment of rotator cuff pathology. What has been scientifically proven?

    PubMed

    Miranda, I; Sánchez-Alepuz, E; Lucas, F J; Carratalá, V; González-Jofre, C A

    To analyze the current scientific and/or clinical evidence supporting the use of platelet-rich plasma (PRP) in the treatment of rotator cuff pathology. After a systematic review in PubMed, studies assessing PRP efficacy in the treatment of rotator cuff pathology published from 2013 to date were identified. Data were grouped based on type of study (laboratory, clinical or meta-analysis); accordingly, study design, pathology treated and clinical outcomes were summarized. Thirty-five articles were analyzed: 10 laboratory studies, 17 clinical studies and 8 meta-analyses. While laboratory studies report positive or partially positive results for the use of PRP, 70.6% of clinical studies and 75% of meta-analyses found no statistically significant differences between the PRP group and the control group. The positive results of laboratory studies do not translate well to clinical practice. There is no concordance among the few positive results reported in the clinical studies, and even some contradictory effects have been reported. There is no solid scientific and/or clinical evidence supporting the use of PRP in the treatment of rotator cuff pathology in routine clinical practice. Copyright © 2017 SECOT. Published by Elsevier España, S.L.U. All rights reserved.

  3. Cocaine profiling for strategic intelligence, a cross-border project between France and Switzerland: part II. Validation of the statistical methodology for the profiling of cocaine.

    PubMed

    Lociciro, S; Esseiva, P; Hayoz, P; Dujourdy, L; Besacier, F; Margot, P

    2008-05-20

    Harmonisation and optimization of analytical and statistical methodologies were carried out between two forensic laboratories (Lausanne, Switzerland and Lyon, France) in order to provide drug intelligence for cross-border cocaine seizures. Part I dealt with the optimization of the analytical method and its robustness. This second part investigates statistical methodologies that provide reliable comparison of cocaine seizures analysed on two different gas chromatographs interfaced with flame ionisation detectors (GC-FID) in two distinct laboratories. Sixty-six statistical combinations (ten data pre-treatments followed by six different distance measurements and correlation coefficients) were applied. One pre-treatment (N+S: the area of each peak is divided by its standard deviation calculated from the whole data set) followed by the Cosine or Pearson correlation coefficient was found to be the best statistical compromise for optimal discrimination of linked and non-linked samples. The centralisation of analyses in one single laboratory is therefore no longer a requirement for comparing samples seized in different countries. This allows collaboration while retaining jurisdictional control over data.
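    The winning combination (N+S pre-treatment followed by a Cosine or Pearson coefficient) can be sketched as below; the peak areas are invented and far fewer than in the real chromatographic profiles:

```python
import math

def ns_pretreatment(profiles):
    """N+S pre-treatment: divide each peak's area by that peak's
    standard deviation computed over the whole data set."""
    n_peaks = len(profiles[0])
    sds = []
    for j in range(n_peaks):
        col = [p[j] for p in profiles]
        mean = sum(col) / len(col)
        sds.append(math.sqrt(sum((v - mean) ** 2 for v in col) / len(col)))
    return [[p[j] / sds[j] for j in range(n_peaks)] for p in profiles]

def cosine(a, b):
    """Cosine similarity between two pre-treated peak-area profiles."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Illustrative GC-FID peak areas (one row per seizure, one column per
# peak); the real study used many more peaks and samples.
profiles = [
    [100.0, 40.0, 5.0, 12.0],
    [98.0, 41.0, 5.2, 11.5],    # chemically linked to the first
    [60.0, 80.0, 1.0, 30.0],    # unrelated batch
]
pre = ns_pretreatment(profiles)
sim_linked = cosine(pre[0], pre[1])
sim_unlinked = cosine(pre[0], pre[2])
print(sim_linked, sim_unlinked)
```

    Linked seizures score near 1 while unrelated batches score visibly lower; in practice a decision threshold on this similarity separates the linked and non-linked populations.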

  4. Interlaboratory trial for the measurement of total cobalt in equine urine and plasma by ICP-MS.

    PubMed

    Popot, Marie-Agnes; Ho, Emmie N M; Stojiljkovic, Natali; Bagilet, Florian; Remy, Pierre; Maciejewski, Pascal; Loup, Benoit; Chan, George H M; Hargrave, Sabine; Arthur, Rick M; Russo, Charlie; White, James; Hincks, Pamela; Pearce, Clive; Ganio, George; Zahra, Paul; Batty, David; Jarrett, Mark; Brooks, Lydia; Prescott, Lise-Anne; Bailly-Chouriberry, Ludovic; Bonnaire, Yves; Wan, Terence S M

    2017-09-01

    Cobalt is an essential mineral micronutrient and is regularly present in equine nutritional and feed supplements. Therefore, cobalt is naturally present at low concentrations in biological samples. The administration of cobalt chloride is considered to be blood doping and is thus prohibited. To control the misuse of cobalt, it was necessary to establish an international threshold for cobalt in plasma and/or in urine. To achieve this goal, an international collaboration, consisting of an interlaboratory comparison between 5 laboratories for the urine study and 8 laboratories for the plasma study, was undertaken. Quantification of cobalt in the biological samples was performed by inductively coupled plasma-mass spectrometry (ICP-MS). Ring tests were based on the analysis of 5 urine samples supplemented at concentrations ranging from 5 up to 500 ng/mL and 5 plasma samples spiked at concentrations ranging from 0.5 up to 25 ng/mL. The results obtained from the different laboratories were collected, compiled, and compared to assess the reproducibility and robustness of cobalt quantification measurements. The statistical approach for the ring test for total cobalt in urine was based on the determination of percentage deviations from the calculated means, while robust statistics based on the calculated median were applied to the ring test for total cobalt in plasma. The inter-laboratory comparisons in urine and in plasma were successful, with 97.6% of the urine samples and 97.5% of the plasma samples giving satisfactory results. Threshold values for cobalt in plasma and urine were established from data obtained only by laboratories involved in the ring test. Copyright © 2017 John Wiley & Sons, Ltd.
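    The two statistical approaches (percentage deviation from the mean for urine, robust median-based statistics for plasma) can be sketched as follows, with invented laboratory results:

```python
from statistics import median

def percent_deviation(results):
    """Each laboratory's percentage deviation from the all-lab mean
    (the approach used here for the urine ring test)."""
    mean = sum(results) / len(results)
    return [100 * (r - mean) / mean for r in results]

def robust_z_scores(results):
    """Median/MAD z-scores (a robust, plasma-style approach); 1.4826
    makes the MAD consistent with the SD of normally distributed data."""
    med = median(results)
    mad = median(abs(r - med) for r in results)
    return [(r - med) / (1.4826 * mad) for r in results]

# Invented cobalt results (ng/mL) from 8 laboratories for one spiked
# plasma sample; the last laboratory is deliberately aberrant.
cobalt = [5.1, 4.9, 5.0, 5.2, 4.8, 5.0, 5.1, 7.5]
print(percent_deviation(cobalt))
print(robust_z_scores(cobalt))
```

    The median-based score flags the aberrant laboratory sharply while leaving the consistent laboratories near zero, which is why robust statistics are preferred when a few outlying results could otherwise drag the consensus value.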

  5. The impact of pneumatic tube system on routine laboratory parameters: a systematic review and meta-analysis.

    PubMed

    Kapoula, Georgia V; Kontou, Panagiota I; Bagos, Pantelis G

    2017-10-26

    Pneumatic tube system (PTS) transport is a widely used method of moving blood samples in hospitals. The aim of this study was to evaluate the effects of PTS transport on certain routine laboratory parameters, as it has been implicated in hemolysis. A systematic review and a meta-analysis were conducted. PubMed and Scopus databases were searched (up until November 2016) to identify prospective studies evaluating the impact of PTS transport on hematological, biochemical and coagulation measurements. The random-effects model was used in the meta-analysis, utilizing the mean difference (MD). Heterogeneity was quantitatively assessed using Cochran's Q and the I² index. Subgroup analysis, meta-regression analysis, sensitivity analysis, cumulative meta-analysis and assessment of publication bias were performed for all outcomes. From a total of 282 studies identified by the search, 24 were finally included in the meta-analysis. The meta-analysis yielded statistically significant results for potassium (K) [MD=0.04 mmol/L; 95% confidence interval (CI)=0.015-0.065; p=0.002], lactate dehydrogenase (LDH) (MD=10.343 U/L; 95% CI=6.132-14.554; p<10⁻⁴) and aspartate aminotransferase (AST) (MD=1.023 IU/L; 95% CI=0.344-1.702; p=0.003). Subgroup analysis and random-effects meta-regression analysis according to the speed and distance the samples traveled via the PTS revealed that both are related to the measurements of K, LDH, white blood cells and red blood cells. This meta-analysis suggests that PTS may be associated with alterations in K, LDH and AST measurements. Although these findings may not have any significant clinical effect on laboratory results, it is wise for each hospital to validate its own PTS.
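    A random-effects mean-difference pooling of this kind can be sketched with the DerSimonian-Laird estimator; the per-study values below are invented and are not the review's data:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled mean difference (DerSimonian-Laird) with
    Cochran's Q and the I^2 heterogeneity index."""
    w = [1 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)             # between-study variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, se, q, i2

# Invented per-study mean differences in potassium (mmol/L) between
# PTS-transported and hand-carried samples, with their variances.
md = [0.02, 0.10, 0.01, 0.08, 0.04]
var = [0.0004, 0.0009, 0.0016, 0.0004, 0.0009]
pooled, se, q, i2 = dersimonian_laird(md, var)
print(pooled, pooled - 1.96 * se, pooled + 1.96 * se, i2)
```

    When Q exceeds its degrees of freedom, the between-study variance widens the pooled confidence interval; with homogeneous studies, tau² and I² collapse to zero and the estimate reduces to the fixed-effect result.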

  6. Results of PBX 9501 and PBX 9502 Round-Robin Quasi-Static Tension Tests from JOWOG-9/39 Focused Exchange.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, D. G.

    2002-01-01

    A round-robin study was conducted with the participation of three laboratory facilities: Los Alamos National Laboratory (LANL), BWXT Pantex Plant (PX), and Lawrence Livermore National Laboratory (LLNL). The study involved the machining and quasi-static tension testing of two plastic-bonded high explosive (PBX) composites, PBX 9501 and PBX 9502. Nine tensile specimens for each type of PBX were to be machined at each of the three facilities; 3 of these specimens were to be sent to each of the participating materials testing facilities for tensile testing. The resultant data was analyzed to look for trends associated with specimen machining location and/or trends associated with materials testing location. The analysis provides interesting insights into the variability and statistical nature of mechanical properties testing on PBX composites. Caution is warranted when results are compared/exchanged between testing facilities.

  7. Estimating Fault Friction From Seismic Signals in the Laboratory

    NASA Astrophysics Data System (ADS)

    Rouet-Leduc, Bertrand; Hulbert, Claudia; Bolton, David C.; Ren, Christopher X.; Riviere, Jacques; Marone, Chris; Guyer, Robert A.; Johnson, Paul A.

    2018-02-01

    Nearly all aspects of earthquake rupture are controlled by the friction along the fault that progressively increases with tectonic forcing but in general cannot be directly measured. We show that fault friction can be determined at any time, from the continuous seismic signal. In a classic laboratory experiment of repeating earthquakes, we find that the seismic signal follows a specific pattern with respect to fault friction, allowing us to determine the fault's position within its failure cycle. Using machine learning, we show that instantaneous statistical characteristics of the seismic signal are a fingerprint of the fault zone shear stress and frictional state. Further analysis of this fingerprint leads to a simple equation of state quantitatively relating the seismic signal power and the friction on the fault. These results show that fault zone frictional characteristics and the state of stress in the surroundings of the fault can be inferred from seismic waves, at least in the laboratory.
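    The final step, recovering a simple relation between signal power and friction, can be imitated on synthetic data; the generating model and coefficients below are invented, and ordinary least squares stands in for the machine-learning analysis:

```python
import math
import random

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for y ≈ a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

# Synthetic stand-in for the laboratory data: friction rises through the
# loading cycle and the log signal power tracks it (coefficients invented).
random.seed(0)
friction, log_power = [], []
for t in range(200):
    mu = 0.4 + 0.002 * t                       # frictional state
    lp = 3.0 * mu + random.gauss(0, 0.05)      # windowed log signal power
    friction.append(mu)
    log_power.append(lp)

a, b = linear_fit(log_power, friction)
print(a, b)
```

    Because the synthetic log power is generated as three times the friction plus noise, the recovered slope sits near 1/3; the study's point is that such a fit, learned from windowed signal statistics, lets friction be read off the seismic signal at any moment.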

  8. Clinical Pharmacology Quality Assurance (CPQA) Program: Models for Longitudinal Analysis of Antiretroviral (ARV) Proficiency Testing for International Laboratories

    PubMed Central

    DiFrancesco, Robin; Rosenkranz, Susan L.; Taylor, Charlene R.; Pande, Poonam G.; Siminski, Suzanne M.; Jenny, Richard W.; Morse, Gene D.

    2013-01-01

    Among National Institutes of Health (NIH) HIV Research Networks conducting multicenter trials, samples from protocols that span several years are analyzed at multiple clinical pharmacology laboratories (CPLs) for multiple antiretrovirals (ARV). Drug assay data are, in turn, entered into study-specific datasets that are used for pharmacokinetic analyses, merged to conduct cross-protocol pharmacokinetic analysis and integrated with pharmacogenomics research to investigate pharmacokinetic-pharmacogenetic associations. The CPLs participate in a semi-annual proficiency testing (PT) program implemented by the Clinical Pharmacology Quality Assurance (CPQA) program. Using results from multiple PT rounds, longitudinal analyses of recovery are reflective of accuracy and precision within/across laboratories. The objectives of this longitudinal analysis of PT across multiple CPLs were to develop and test statistical models that longitudinally: (1) assess the precision and accuracy of concentrations reported by individual CPLs; (2) determine factors associated with round-specific and long-term assay accuracy, precision and bias using a new regression model. A measure of absolute recovery is explored as a simultaneous measure of accuracy and precision. Overall, the analysis outcomes assured 97% accuracy (±20% of the final target concentration) for all 21 drug concentration results reported for clinical trial samples by multiple CPLs. Using the CLIA acceptance of meeting criteria for ≥2/3 consecutive rounds, all ten laboratories that participated in three or more rounds per analyte maintained CLIA proficiency. Significant associations were present between magnitude of error and CPL (Kruskal-Wallis [KW] p<0.001), and ARV (KW p<0.001). PMID:24052065

  9. Laboratory-Based Prospective Surveillance for Community Outbreaks of Shigella spp. in Argentina

    PubMed Central

    Viñas, María R.; Tuduri, Ezequiel; Galar, Alicia; Yih, Katherine; Pichel, Mariana; Stelling, John; Brengi, Silvina P.; Della Gaspera, Anabella; van der Ploeg, Claudia; Bruno, Susana; Rogé, Ariel; Caffer, María I.; Kulldorff, Martin; Galas, Marcelo

    2013-01-01

    Background To implement effective control measures, timely outbreak detection is essential. Shigella is the most common cause of bacterial diarrhea in Argentina. Highly resistant clones of Shigella have emerged, and outbreaks have been recognized in closed settings and in whole communities. We hereby report our experience with an evolving, integrated, laboratory-based, near real-time surveillance system operating in six contiguous provinces of Argentina during April 2009 to March 2012. Methodology To detect localized shigellosis outbreaks in a timely manner, we used the prospective space-time permutation scan statistic algorithm of SaTScan, embedded in WHONET software. Twenty-three laboratories sent updated Shigella data on a weekly basis to the National Reference Laboratory. Cluster detection analysis was performed at several taxonomic levels: for all Shigella spp., for serotypes within species and for antimicrobial resistance phenotypes within species. Shigella isolates associated with statistically significant signals (clusters in time/space with recurrence interval ≥365 days) were subtyped by pulsed-field gel electrophoresis (PFGE) using PulseNet protocols. Principal Findings In three years of active surveillance, our system detected 32 statistically significant events, 26 of them identified before hospital staff was aware of any unexpected increase in the number of Shigella isolates. Twenty-six signals were investigated by PFGE, which confirmed a close relationship among the isolates for 22 events (84.6%). Seven events were investigated epidemiologically, which revealed links among the patients. Seventeen events were found at the resistance profile level. The system detected events of public health importance: infrequent resistance profiles, long-lasting and/or re-emergent clusters and events important for their duration or size, which were reported to local public health authorities.
Conclusions/Significance The WHONET-SaTScan system may serve as a model for surveillance and can be applied to other pathogens, implemented by other networks, and scaled up to national and international levels for early detection and control of outbreaks. PMID:24349586
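The signal threshold used above (clusters with a recurrence interval ≥365 days) can be sketched as follows. The conversion from a scan-statistic p-value to a recurrence interval assumes the usual SaTScan convention for prospective analyses (RI = time between analyses / p-value) with weekly analyses, as in the study; the p-values below are illustrative:

```python
def recurrence_interval_days(p_value, days_between_analyses=7):
    """Expected days between one false signal of this strength or stronger."""
    return days_between_analyses / p_value

def is_signal(p_value, days_between_analyses=7, threshold_days=365):
    """Flag a cluster whose recurrence interval meets the study's threshold."""
    return recurrence_interval_days(p_value, days_between_analyses) >= threshold_days

print(is_signal(0.01))   # RI ~ 700 days -> flagged as a signal
print(is_signal(0.05))   # RI ~ 140 days -> not flagged
```

The recurrence-interval framing avoids the multiple-testing problem of repeated weekly analyses: it states how often a signal of that strength would arise by chance.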

  10. Laboratory-based prospective surveillance for community outbreaks of Shigella spp. in Argentina.

    PubMed

    Viñas, María R; Tuduri, Ezequiel; Galar, Alicia; Yih, Katherine; Pichel, Mariana; Stelling, John; Brengi, Silvina P; Della Gaspera, Anabella; van der Ploeg, Claudia; Bruno, Susana; Rogé, Ariel; Caffer, María I; Kulldorff, Martin; Galas, Marcelo

    2013-01-01

    To implement effective control measures, timely outbreak detection is essential. Shigella is the most common cause of bacterial diarrhea in Argentina. Highly resistant clones of Shigella have emerged, and outbreaks have been recognized in closed settings and in whole communities. We hereby report our experience with an evolving, integrated, laboratory-based, near real-time surveillance system operating in six contiguous provinces of Argentina during April 2009 to March 2012. To detect localized shigellosis outbreaks in a timely manner, we used the prospective space-time permutation scan statistic algorithm of SaTScan, embedded in WHONET software. Twenty-three laboratories sent updated Shigella data on a weekly basis to the National Reference Laboratory. Cluster detection analysis was performed at several taxonomic levels: for all Shigella spp., for serotypes within species and for antimicrobial resistance phenotypes within species. Shigella isolates associated with statistically significant signals (clusters in time/space with recurrence interval ≥365 days) were subtyped by pulsed-field gel electrophoresis (PFGE) using PulseNet protocols. In three years of active surveillance, our system detected 32 statistically significant events, 26 of them identified before hospital staff was aware of any unexpected increase in the number of Shigella isolates. Twenty-six signals were investigated by PFGE, which confirmed a close relationship among the isolates for 22 events (84.6%). Seven events were investigated epidemiologically, which revealed links among the patients. Seventeen events were found at the resistance profile level. The system detected events of public health importance: infrequent resistance profiles, long-lasting and/or re-emergent clusters and events important for their duration or size, which were reported to local public health authorities.
The WHONET-SaTScan system may serve as a model for surveillance and can be applied to other pathogens, implemented by other networks, and scaled up to national and international levels for early detection and control of outbreaks.

  11. Experimental assessment of the spatial variability of porosity, permeability and sorption isotherms in an ordinary building concrete

    NASA Astrophysics Data System (ADS)

    Issaadi, N.; Hamami, A. A.; Belarbi, R.; Aït-Mokhtar, A.

    2017-10-01

    In this paper, the spatial variability of several transfer and storage properties of a concrete wall was assessed. The studied parameters are water porosity, water vapor permeability, intrinsic permeability and water vapor sorption isotherms. For this purpose, a concrete wall was built in the laboratory and specimens were periodically taken and tested. The obtained results provide statistical estimates of the mean value, the standard deviation and the spatial correlation length of the studied fields for each parameter. These results were discussed and a statistical analysis was performed in order to identify, for each of these parameters, the appropriate probability density function.
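A minimal sketch of the statistical treatment described above, on synthetic data: estimate the mean, standard deviation, and a spatial correlation length, here taken as the lag at which the sample autocorrelation first drops below 1/e (one common convention). The specimen spacing `dx` and the AR(1) porosity field are assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
dx = 0.1                                   # specimen spacing, m (assumed)
n, phi = 400, 0.8                          # AR(1) process mimics spatial correlation
noise = rng.normal(size=n)
x = np.empty(n)
x[0] = noise[0]
for i in range(1, n):
    x[i] = phi * x[i - 1] + noise[i]
field = 0.15 + 0.005 * x                   # synthetic porosity values

mean, std = field.mean(), field.std(ddof=1)
centered = field - mean

def autocorr(h):
    """Sample autocorrelation of the field at lag h (in specimens)."""
    return np.corrcoef(centered[:-h], centered[h:])[0, 1]

# correlation length: first lag where autocorrelation drops below 1/e
L = next(h * dx for h in range(1, n // 2) if autocorr(h) < 1 / np.e)
print(f"mean = {mean:.3f}, sd = {std:.4f}, correlation length ~ {L:.2f} m")
```

With a probability density function then fitted to the estimated moments, such fields feed directly into probabilistic transfer models.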

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodroffe, J. R.; Brito, T. V.; Jordanova, V. K.

    In the standard practice of neutron multiplicity counting, the first three sampled factorial moments of the event-triggered neutron count distribution are used to quantify the three main neutron source terms: the effective mass of spontaneously fissile material, the relative (α,n) production, and the induced fission source responsible for multiplication. Our study compares three methods of quantifying the statistical uncertainty of the estimated mass: the bootstrap method, propagation of variance through moments, and the statistical analysis of cycle data. Each of the three methods was implemented on a set of four different NMC measurements, held at the JRC laboratory in Ispra, Italy, sampling four different Pu samples in a standard Plutonium Scrap Multiplicity Counter (PSMC) well counter.
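Of the three uncertainty methods named above, the bootstrap is the easiest to sketch: resample the cycle data with replacement and take the spread of the re-computed estimate. The cycle counts and the mean-as-estimator below are illustrative stand-ins, not the study's mass estimator:

```python
import random
import statistics

random.seed(1)
cycles = [random.gauss(100.0, 5.0) for _ in range(500)]  # synthetic counts per cycle

def bootstrap_se(data, estimator, n_boot=2000):
    """Standard error of `estimator` via resampling with replacement."""
    stats = []
    for _ in range(n_boot):
        resample = random.choices(data, k=len(data))
        stats.append(estimator(resample))
    return statistics.stdev(stats)

est = statistics.mean(cycles)
se = bootstrap_se(cycles, statistics.mean)
print(f"estimate = {est:.2f}, bootstrap SE = {se:.2f}")
```

For the mean, the bootstrap SE should approach the analytic value s/√n, which is what makes it a useful cross-check against moment-propagation formulas.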

  13. Odor measurements according to EN 13725: A statistical analysis of variance components

    NASA Astrophysics Data System (ADS)

    Klarenbeek, Johannes V.; Ogink, Nico W. M.; van der Voet, Hilko

    2014-04-01

    In Europe, dynamic olfactometry, as described by the European standard EN 13725, has become the preferred method for evaluating odor emissions emanating from industrial and agricultural sources. Key elements of this standard are the quality criteria for trueness and precision (repeatability). Both are linked to standard values of n-butanol in nitrogen. It is assumed in this standard that whenever a laboratory complies with the overall sensory quality criteria for n-butanol, the quality level is transferable to other, environmental, odors. Although olfactometry is well established, little has been done to investigate inter-laboratory variance (reproducibility). Therefore, the objective of this study was to estimate the reproducibility of odor laboratories complying with EN 13725 as well as to investigate the transferability of n-butanol quality criteria to other odorants. Based upon the statistical analysis of 412 odor measurements on 33 sources, distributed in 10 proficiency tests, it was established that laboratory, panel and panel session are components of variance that significantly differ between n-butanol and other odorants (α = 0.05). This finding does not support the transferability of the quality criteria, as determined on n-butanol, to other odorants and as such is a cause for reconsideration of the present single reference odorant as laid down in EN 13725. For non-butanol odorants, the repeatability standard deviation (sr) and reproducibility standard deviation (sR) were calculated to be 0.108 and 0.282, respectively (log base-10). The latter implies that the difference between two consecutive single measurements, performed on the same testing material by two or more laboratories under reproducibility conditions, will not be larger than a factor of 6.3 in 95% of cases. As far as n-butanol odorants are concerned, it was found that the present repeatability standard deviation (sr = 0.108) compares favorably to that of EN 13725 (sr = 0.172).
It is therefore suggested that the repeatability limit (r), as laid down in EN 13725, can be reduced from r ≤ 0.477 to r ≤ 0.31.
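The repeatability and reproducibility limits quoted above follow from the standard deviations via the usual ISO 5725 coverage factor 1.96·√2 ≈ 2.77; EN 13725 may round differently, which is why the computed values land near, not exactly on, the quoted 0.31 and factor 6.3:

```python
import math

COVERAGE = 1.96 * math.sqrt(2)        # ~2.77: 95% limit on |x1 - x2|

s_r, s_R = 0.108, 0.282               # log10-based sd's reported in the study

r_limit = COVERAGE * s_r              # repeatability limit (log10)
R_limit = COVERAGE * s_R              # reproducibility limit (log10)

print(f"r = {r_limit:.3f} (log10)")
print(f"R = {R_limit:.3f} (log10) -> factor {10**R_limit:.1f} between labs")
```

Because the limits are on log10-transformed concentrations, exponentiating the reproducibility limit yields the multiplicative factor by which two laboratories' single measurements may differ.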

  14. Reducing cognitive load in the chemistry laboratory by using technology-driven guided inquiry experiments

    NASA Astrophysics Data System (ADS)

    Hubacz, Frank, Jr.

    The chemistry laboratory is an integral component of the learning experience for students enrolled in college-level general chemistry courses. Science education research has shown that guided inquiry investigations provide students with an optimum learning environment within the laboratory. These investigations reflect the basic tenets of constructivism by engaging students in a learning environment that allows them to experience what they learn and to then construct, in their own minds, a meaningful understanding of the ideas and concepts investigated. However, educational research also indicates that the physical plant of the laboratory environment combined with the procedural requirements of the investigation itself often produces a great demand upon a student's working memory. This demand, which is often superfluous to the chemical concept under investigation, creates a sensory overload or extraneous cognitive load within the working memory and becomes a significant obstacle to student learning. Extraneous cognitive load inhibits necessary schema formation within the learner's working memory thereby impeding the transfer of ideas to the learner's long-term memory. Cognitive Load Theory suggests that instructional material developed to reduce extraneous cognitive load leads to an improved learning environment for the student which better allows for schema formation. This study first compared the cognitive load demand, as measured by mental effort, experienced by 33 participants enrolled in a first-year general chemistry course in which the treatment group, using technology based investigations, and the non-treatment group, using traditional labware, investigated identical chemical concepts on five different exercises. Mental effort was measured via a mental effort survey, a statistical comparison of individual survey results to a procedural step count, and an analysis of fourteen post-treatment interviews. 
Next, a statistical analysis of achievement was completed by comparing lab grade averages, final exam averages, and final course grade averages between the two groups. Participant mental effort survey results showed significant positive effects of technology in reducing cognitive load for two laboratory investigations. One investigation revealed a significant difference in achievement measured by lab grade average comparisons. Although results of this study are inconclusive as to the usefulness of technology-driven investigations to affect learning, recommendations for further study are discussed.

  15. Obtaining mathematical models for assessing efficiency of dust collectors using integrated system of analysis and data management STATISTICA Design of Experiments

    NASA Astrophysics Data System (ADS)

    Azarov, A. V.; Zhukova, N. S.; Kozlovtseva, E. Yu; Dobrinsky, D. R.

    2018-05-01

    The article considers obtaining mathematical models to assess the efficiency of dust collectors using the integrated analysis and data management system STATISTICA Design of Experiments. The procedure for obtaining mathematical models and processing the data is illustrated by laboratory studies on a mounted installation containing a dust collector in counter-swirling flows (CSF), using gypsum dust of various fractions. The experimental studies were planned so as to reduce the number of experiments and the cost of experimental research: a second-order Box-Behnken design was used, which reduced the number of trials from 81 to 27. The statistical treatment of the Box-Behnken design data using standard tools of the integrated analysis and data management system STATISTICA Design of Experiments is described. Results of statistical data processing, with significance estimates for the coefficients and adequacy checks of the mathematical models, are presented.
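The run-count reduction mentioned above (81 → 27) is a property of the Box-Behnken design for four 3-level factors: each pair of factors is varied at ±1 while the others sit at the center level, plus replicated center points. A minimal sketch of generating such a design:

```python
from itertools import combinations, product

def box_behnken(n_factors, n_center=3):
    """Coded (-1/0/+1) runs of a Box-Behnken design."""
    runs = []
    # 4 corner combinations for each of the C(n,2) factor pairs
    for i, j in combinations(range(n_factors), 2):
        for a, b in product((-1, 1), repeat=2):
            run = [0] * n_factors
            run[i], run[j] = a, b
            runs.append(run)
    # replicated center points
    runs += [[0] * n_factors for _ in range(n_center)]
    return runs

design = box_behnken(4)
print(len(design), "runs vs", 3 ** 4, "for the full 3-level factorial")  # 27 vs 81
```

For 4 factors this gives 6 pairs × 4 combinations + 3 center points = 27 runs, enough to fit a full second-order (quadratic) response surface.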

  16. Los Alamos National Laboratory W76 Pit Tube Lifetime Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abeln, Terri G.

    2012-04-25

    A metallurgical study was requested as part of the Los Alamos National Laboratory (LANL) W76-1 life-extension program (LEP), involving a lifetime analysis of type 304 stainless steel pit tubes subject to repeated bending loads during assembly and disassembly operations at BWXT/Pantex. This initial test phase was completed during the calendar years 2004-2006, and the report was not issued until additional recommended tests could be performed. Those tests have not been funded to date, and this report is therefore considered final. Tubes were reportedly fabricated according to Rocky Flats specification P14548 - Seamless Type 304 VIM/VAR Stainless Steel Tubing. Tube diameter was specified as 0.125 inches and wall thickness as 0.028 inches. A heat treat condition is not specified, and the hardness range specification can be characteristic of both 1/8 and 1/4 hard conditions. Properties of all tubes tested were within specification. Metallographic analysis could not conclusively determine a specified limit to the number of bends allowable. A statistical analysis suggests a range of 5-7 bends with a 99.95% confidence limit; see the 'Statistical Analysis' section of this report. The initial phase of this study involved two separate sets of test specimens. The first group was part of an investigation originating in the ESA-GTS [now Gas Transfer Systems (W-7) Group]. After the bend cycle test parameters were chosen (all three required bends subjected to the same number of bend cycles) and the tubes bent, the investigation was transferred to Terri Abeln (Metallurgical Science and Engineering) for analysis. Subsequently, another limited quantity of tubes became available for testing and was cycled with the same bending fixture, but with different test parameters determined by T. Abeln.

  17. Multiplex cytokine profiling with highly pathogenic material: use of formalin solution in luminex analysis.

    PubMed

    Dowall, Stuart D; Graham, Victoria A; Tipton, Thomas R W; Hewson, Roger

    2009-08-31

    Work with highly pathogenic material mandates the use of biological containment facilities, involving microbiological safety cabinets and specialist laboratory engineering structures typified by containment level 3 (CL3) and CL4 laboratories. A consequence of working in high containment is the practical difficulty of containing the specialist assays and equipment often essential for experimental analyses. In an era of increased interest in biodefence pathogens and emerging diseases, immunological analysis has developed rapidly alongside traditional techniques in virology and molecular biology. For example, in order to maximise the use of small sample volumes, multiplexing has become a more popular and widespread approach to quantify multiple analytes simultaneously, such as cytokines and chemokines. The luminex microsphere system allows for the detection of many cytokines and chemokines in a single sample, but its detection method, which uses aligned lasers and fluidics, means that samples often have to be analysed in low containment facilities. In order to perform cytokine analysis on materials from high containment (CL3 and CL4) laboratories, we have developed an appropriate inactivation methodology applied after the staining steps, which, although it reduces median fluorescent intensity, produces statistically comparable outcomes when judged against non-inactivated samples. This methodology thus extends the use of luminex technology to material that contains highly pathogenic biological agents.

  18. Considerations for estimating microbial environmental data concentrations collected from a field setting

    PubMed Central

    Silvestri, Erin E; Yund, Cynthia; Taft, Sarah; Bowling, Charlena Yoder; Chappie, Daniel; Garrahan, Kevin; Brady-Roberts, Eletha; Stone, Harry; Nichols, Tonya L

    2017-01-01

    In the event of an indoor release of an environmentally persistent microbial pathogen such as Bacillus anthracis, the potential for human exposure will be considered when remedial decisions are made. Microbial site characterization and clearance sampling data collected in the field might be used to estimate exposure. However, there are many challenges associated with estimating environmental concentrations of B. anthracis or other spore-forming organisms after such an event before being able to estimate exposure. These challenges include: (1) collecting environmental field samples that are adequate for the intended purpose, (2) conducting laboratory analyses and selecting the reporting format needed for the laboratory data, and (3) analyzing and interpreting the data using appropriate statistical techniques. This paper summarizes some key challenges faced in collecting, analyzing, and interpreting microbial field data from a contaminated site. Although the paper was written with considerations for B. anthracis contamination, it may also be applicable to other bacterial agents. It explores the implications and limitations of using field data for determining environmental concentrations both before and after decontamination. Several findings were of interest. First, to date, the only validated surface/sampling device combinations are swabs and sponge-sticks on stainless steel surfaces, thus limiting the availability of quantitative analytical results that could be used for statistical analysis. Second, agreement needs to be reached with the analytical laboratory on the definition of the countable range and on reporting of data below the limit of quantitation. Finally, the distribution of the microbial field data and the statistical methods needed for a particular data set could vary depending on the data collected, and guidance is needed on appropriate statistical software for handling microbial data.
Further, research is needed to develop better methods to estimate human exposure from pathogens using environmental data collected from a field setting. PMID:26883476

  19. Statistical analysis of target acquisition sensor modeling experiments

    NASA Astrophysics Data System (ADS)

    Deaver, Dawne M.; Moyer, Steve

    2015-05-01

    The U.S. Army RDECOM CERDEC NVESD Modeling and Simulation Division is charged with the development and advancement of military target acquisition models to estimate expected soldier performance when using all types of imaging sensors. Two elements of sensor modeling are (1) laboratory-based psychophysical experiments used to measure task performance and calibrate the various models and (2) field-based experiments used to verify the model estimates for specific sensors. In both types of experiments, it is common practice to control or measure environmental, sensor, and target physical parameters in order to minimize uncertainty in the physics-based modeling. Predicting the minimum number of test subjects required to calibrate or validate the model should be, but is not always, done during test planning. The objective of this analysis is to develop guidelines for test planners which recommend the number and types of test samples required to yield a statistically significant result.
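The test-planning question raised above (how many subjects are needed for a statistically significant result) is commonly answered with a power calculation. A sketch for comparing task-performance probabilities between two sensor conditions via the standard two-proportion sample-size formula; the target probabilities, α, and power below are illustrative assumptions:

```python
import math
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.8):
    """Subjects per group to detect p1 vs p2 with a two-sided z-test."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    num = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
           + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return math.ceil(num / (p1 - p2) ** 2)

# e.g. detect a 70% vs 85% probability of target identification
print(n_per_group(0.70, 0.85), "subjects per sensor condition")
```

Smaller expected differences between sensors drive the required subject count up quadratically, which is why this estimate belongs in the planning stage.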

  20. Application of random match probability calculations to mixed STR profiles.

    PubMed

    Bille, Todd; Bright, Jo-Anne; Buckleton, John

    2013-03-01

    Mixed DNA profiles are being encountered more frequently as laboratories analyze increasing amounts of touch evidence. If it is determined that an individual could be a possible contributor to the mixture, it is necessary to perform a statistical analysis to allow an assignment of weight to the evidence. Currently, the combined probability of inclusion (CPI) and the likelihood ratio (LR) are the most commonly used methods to perform the statistical analysis. A third method, random match probability (RMP), is available. This article compares the advantages and disadvantages of the CPI and LR methods to the RMP method. We demonstrate that although the LR method is still considered the most powerful of the binary methods, the RMP and LR methods make similar use of the observed data such as peak height, assumed number of contributors, and known contributors where the CPI calculation tends to waste information and be less informative. © 2013 American Academy of Forensic Sciences.
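The CPI statistic discussed above has a simple closed form: at each locus, sum the population frequencies of all alleles observed in the mixture and square the sum; the CPI is the product across loci. The loci and allele frequencies below are illustrative, not from a real database:

```python
from math import prod

# frequencies of the alleles observed in the mixture, per locus (illustrative)
mixture = {
    "D8S1179": [0.10, 0.15, 0.20],
    "D21S11":  [0.05, 0.12],
    "TH01":    [0.08, 0.22, 0.10, 0.05],
}

def cpi(loci):
    """Combined probability of inclusion across independent loci."""
    return prod(sum(freqs) ** 2 for freqs in loci.values())

print(f"CPI = {cpi(mixture):.3e}")  # chance a random person is "included"
```

Note that this calculation ignores peak heights and the assumed number of contributors, which is exactly the information the article says the LR and RMP methods exploit and the CPI wastes.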

  1. A behavior-analytic view of psychological health

    PubMed Central

    Follette, William C.; Bach, Patricia A.; Follette, Victoria M.

    1993-01-01

    This paper argues that a behavioral analysis of psychological health is useful and appropriate. Such an analysis will allow us to better evaluate intervention outcomes without resorting only to the assessment of pathological behavior, thus providing an alternative to the Diagnostic and Statistical Manual system of conceptualizing behavior. The goals of such an analysis are to distinguish between people and outcomes using each term of the three-term contingency as a dimension to consider. A brief review of other efforts to define psychological health is provided. Laboratory approaches to a behavioral analysis of healthy behavior are presented along with shortcomings in our science that impede our analysis. Finally, we present some of the functional characteristics of psychological health that we value. PMID:22478160

  2. [Infective endocarditis in intensive cardiac care unit - clinical and biochemical differences of blood-culture negative infective endocarditis].

    PubMed

    Kaziród-Wolski, Karol; Sielski, Janusz; Ciuraszkiewicz, Katarzyna

    2017-01-23

    Diagnosis and treatment of infective endocarditis (IE) remains a challenge for physicians. The group of patients with the worst prognosis is treated in the Intensive Cardiac Care Unit (ICCU). In a substantial number of patients, the etiologic agent cannot be identified. The aim of the study was to find differences between patients with blood culture-negative infective endocarditis (BCNIE) and blood culture-positive infective endocarditis (BCPIE) treated in the ICCU by comparing their clinical course and laboratory parameters. A retrospective analysis was performed of 30 patients with IE hospitalized in the ICCU of the Swietokrzyskie Cardiac Centre between 2010 and 2016. The group consisted of 26 men (86.7%) and 4 women (13.3%); mean age was 58 ± 13 years. Most cases were new disease; recurrence was observed in 2 cases (6.7%). Eight patients (26.7%) required mechanical ventilation, 11 (36.7%) received inotropes and 6 (20%) vasopressors. In 14 cases (46.7%) blood cultures were negative (BCNIE); the remaining 16 patients (53.3%) had blood culture-positive infective endocarditis (BCPIE). The two groups were clinically similar: there were no statistically significant differences in the incidence of cardiac implants, localization of bacterial vegetations, administered catecholamines, antibiotic therapy, mechanical ventilation, surgical treatment, complications or in-hospital mortality. The occurrence of cardiac complications in all BCNIE cases versus 81.3% of BCPIE cases is notable, but the difference is not statistically significant (p = 0.08). There was a statistically significant difference in mean BNP blood concentration (3005.17 ± 2045.2 ng/ml vs 1013.42 ± 1087.6 ng/ml; p = 0.01), but no statistically significant differences in the remaining laboratory parameters. The BCNIE group had a higher mean BNP blood concentration than the BCPIE group; there were no statistically significant differences between the groups in other laboratory parameters, clinical course or administered antibiotic therapy. 
In our endemic region, the major cause of BCNIE appears to be antibiotic therapy administered before collection of blood samples, but further studies are necessary.

  3. Plasma creatinine in dogs: intra- and inter-laboratory variation in 10 European veterinary laboratories

    PubMed Central

    2011-01-01

    Background There is substantial variation in reported reference intervals for canine plasma creatinine among veterinary laboratories, thereby influencing the clinical assessment of analytical results. The aims of the study were to determine the inter- and intra-laboratory variation in plasma creatinine among 10 veterinary laboratories, and to compare results from each laboratory with the upper limit of its reference interval. Methods Samples were collected from 10 healthy dogs, 10 dogs with expected intermediate plasma creatinine concentrations, and 10 dogs with azotemia. Overlap was observed for the first two groups. The 30 samples were divided into 3 batches and shipped in random order by postal delivery for plasma creatinine determination. Statistical testing was performed in accordance with ISO standard methodology. Results Inter- and intra-laboratory variation was clinically acceptable, as plasma creatinine values for most samples were usually of the same magnitude. A few extreme outliers caused three laboratories to fail statistical testing for consistency. Laboratory sample means above or below the overall sample mean did not unequivocally reflect high or low reference intervals in that laboratory. Conclusions In spite of close analytical results, further standardization among laboratories is warranted. The discrepant reference intervals seem largely to reflect different populations used in establishing the reference intervals, rather than analytical variation due to different laboratory methods. PMID:21477356
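The consistency testing mentioned above typically involves outlier screening before inter-laboratory statistics are computed. A sketch of Grubbs' test on hypothetical per-laboratory creatinine results for one sample; the critical value is the standard two-sided 5% table value for n = 10 and the data are invented for illustration:

```python
import statistics

# hypothetical creatinine results (umol/L) from 10 labs for one dog;
# the last lab looks suspicious
values = [92, 95, 90, 94, 93, 96, 91, 95, 94, 140]

mean = statistics.mean(values)
sd = statistics.stdev(values)

# Grubbs' statistic: largest absolute deviation in units of the sample sd
G = max(abs(v - mean) for v in values) / sd
G_CRIT_N10 = 2.176   # two-sided 5% critical value for n = 10 (Grubbs table)

print(f"G = {G:.2f}; outlier present: {G > G_CRIT_N10}")
```

A laboratory repeatedly flagged this way would fail the consistency check, as three laboratories did in the study.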

  4. An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques

    DTIC Science & Technology

    2018-01-09

    ARL-TR-8272 ● JAN 2018 ● US Army Research Laboratory. Technical report: An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques.

  5. Integrated Data Collection Analysis (IDCA) Program - Final Review September 12, 2012 at DHS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandstrom, Mary M.; Brown, Geoffrey W.; Warner, Kirstin F.

    The Integrated Data Collection Analysis (IDCA) program conducted a final program review at the Department of Homeland Security on September 12, 2012. The review was focused on the results of the program over the complete performance period. A summary presentation delineating the accomplished tasks started the meeting, followed by technical presentations on various issues that arose during the performance period. The presentations were completed with a statistical evaluation of the testing results from all the participants in the IDCA Proficiency Test study. The meeting closed with a discussion of potential sources of funding for continuing work to resolve some of these technical issues. This effort, funded by the Department of Homeland Security (DHS), put the issues of safe handling of these materials in perspective with standard military explosives. The study added Small-Scale Safety and Thermal (SSST) testing results for a broad suite of different HMEs to the literature, and suggested new guidelines and methods to develop safe handling practices for HMEs. Each participating testing laboratory used identical test materials and preparation methods wherever possible. Note, however, that the test procedures differ among the laboratories. The results were compared among the laboratories and then compared to historical data from various sources. The testing performers involved were Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Naval Surface Warfare Center, Indian Head Division (NSWC IHD), Sandia National Laboratories (SNL), and Air Force Research Laboratory, Tyndall AFB (AFRL/RXQL). These tests were conducted as a proficiency study in order to establish some consistency in test protocols, procedures, and experiments and to compare results when these testing variables cannot be made consistent.

  6. Statistical learning in reading: variability in irrelevant letters helps children learn phonics skills.

    PubMed

    Apfelbaum, Keith S; Hazeltine, Eliot; McMurray, Bob

    2013-07-01

    Early reading abilities are widely considered to derive in part from statistical learning of regularities between letters and sounds. Although there is substantial evidence from laboratory work to support this, how it occurs in the classroom setting has not been extensively explored; there are few investigations of how statistics among letters and sounds influence how children actually learn to read or what principles of statistical learning may improve learning. We examined 2 conflicting principles that may apply to learning grapheme-phoneme-correspondence (GPC) regularities for vowels: (a) variability in irrelevant units may help children derive invariant relationships and (b) similarity between words may force children to use a deeper analysis of lexical structure. We trained 224 first-grade students on a small set of GPC regularities for vowels, embedded in words with either high or low consonant similarity, and tested their generalization to novel tasks and words. Variability offered a consistent benefit over similarity for trained and new words in both trained and new tasks.

  7. [Blood volume for biochemistry determinations--laboratory needs and everyday practice].

    PubMed

    Sztefko, Krystyna; Mamica, Katarzyna; Bugajska, Jolanta; Maziarz, Barbara; Tomasik, Przemysław

    2014-01-01

    Blood loss due to diagnostic phlebotomy is a very serious problem, especially for newborns, infants and critically ill patients in intensive care units. Although a single blood loss is easily tolerated by adults, iatrogenic anaemia can occur in small babies and in patients who are frequently monitored by laboratory tests. The aim was to evaluate the blood volume drawn for routine biochemistry tests in relation to patient age and the number of parameters requested. Blood volumes drawn for routine biochemistry measurements from patients hospitalized in the University Children's Hospital (N = 2980, children aged from one day to 18 years) and in the University Hospital (N = 859, adults aged over 18 years) in Cracow were analyzed. Blood volume was calculated from the regular tube diameter and the height of blood in the tube; for microvettes, the blood volume was taken as 0.2 ml. Statistical analysis was performed using PRISM 5.0, with statistical significance set at p < 0.05. The mean blood volumes were 3.02 +/- 0.92 ml and 4.12 +/- 0.68 ml in children and adults, respectively. When blood drawn into both microvettes and regular tubes was analyzed, blood volume correlated significantly with patient age (p < 0.001) as well as with the number of requested parameters (p < 0.001); the latter relationship held only for up to five parameters. However, when only blood drawn into regular tubes was analyzed, blood volume was not related to patient age or to the number of laboratory tests requested. The proportion of microvettes used for blood collection was highest for newborns and infants, and in all cases where only one to three laboratory tests were requested. Conclusions: 1) All educational programs for nurses and doctors should include information about current laboratory automation and method miniaturization; 2) the blood volume needed by the laboratory for the requested number of tests should always be taken into account when diagnostic phlebotomy is necessary.
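
    The study's tube-based volume estimate rests on simple cylinder geometry (1 ml = 1000 mm³). A minimal sketch of that calculation, with hypothetical tube dimensions since the abstract does not give the actual diameters:

```python
import math

def tube_blood_volume_ml(inner_diameter_mm: float, blood_height_mm: float) -> float:
    """Approximate blood volume in a cylindrical tube (1 ml = 1000 mm^3)."""
    radius_mm = inner_diameter_mm / 2.0
    volume_mm3 = math.pi * radius_mm ** 2 * blood_height_mm
    return volume_mm3 / 1000.0

# Hypothetical example: a 10 mm inner-diameter tube filled to 38 mm
volume = tube_blood_volume_ml(10.0, 38.0)  # about 2.98 ml
```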

  8. Addendum to Sampling and Analysis Plan (SAP) for Assessment of LANL-Derived Residual Radionuclides in Soils within Tract A-16-d for Land Conveyance and Transfer for Sewage Treatment Facility Area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whicker, Jeffrey Jay; Gillis, Jessica Mcdonnel; Ruedig, Elizabeth

    This report summarizes the sampling design used, associated statistical assumptions, as well as general guidelines for conducting post-sampling data analysis. Sampling plan components presented here include how many sampling locations to choose and where within the sampling area to collect those samples. The type of medium to sample (i.e., soil, groundwater, etc.) and how to analyze the samples (in-situ, fixed laboratory, etc.) are addressed in other sections of the sampling plan.

  9. Discriminant analysis of Raman spectra for body fluid identification for forensic purposes.

    PubMed

    Sikirzhytski, Vitali; Virkler, Kelly; Lednev, Igor K

    2010-01-01

    Detection and identification of blood, semen and saliva stains, the most common body fluids encountered at a crime scene, are very important aspects of forensic science today. This study targets the development of a nondestructive, confirmatory method for body fluid identification based on Raman spectroscopy coupled with advanced statistical analysis. Dry traces of blood, semen and saliva obtained from multiple donors were probed using a confocal Raman microscope with a 785-nm excitation wavelength under controlled laboratory conditions. Results demonstrated the capability of Raman spectroscopy to identify an unknown substance to be semen, blood or saliva with high confidence.
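
    The abstract does not name the specific classifier used. As a hedged illustration of the general idea only (matching an unknown spectrum against per-class mean spectra), here is a minimal nearest-centroid classifier using cosine similarity; the toy "spectra" below are made up and stand in for real Raman intensity vectors:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length intensity vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def classify(spectrum, centroids):
    """Return the class whose mean spectrum is most similar to the input."""
    return max(centroids, key=lambda label: cosine(spectrum, centroids[label]))

# Toy class centroids (mean spectra per body fluid); values are illustrative only
centroids = {
    "blood":  [5.0, 1.0, 0.2, 3.0],
    "semen":  [0.5, 4.0, 2.0, 0.1],
    "saliva": [1.0, 0.3, 5.0, 1.5],
}
result = classify([4.8, 1.2, 0.3, 2.9], centroids)  # 'blood'
```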

  10. Long-term radiation effects on GaAs solar cell characteristics

    NASA Technical Reports Server (NTRS)

    Heinbockel, J. H.; Doviak, M. J.

    1978-01-01

    This report investigates preliminary design considerations for a space experiment involving Gallium Arsenide (GaAs) solar cells. Electron irradiation of GaAs solar cells was carried out in a laboratory environment, and a statistical analysis of the data is presented. To augment the limited laboratory data, a theoretical investigation of the effect of radiation on GaAs solar cells is also developed. The results of this study are empirical prediction equations which can be used to estimate the degradation of electrical characteristics in a space environment. The experimental and theoretical studies also indicate how GaAs solar cell parameters should be designed in order to withstand the effects of electron radiation damage.

  11. An inferentialist perspective on the coordination of actions and reasons involved in making a statistical inference

    NASA Astrophysics Data System (ADS)

    Bakker, Arthur; Ben-Zvi, Dani; Makar, Katie

    2017-12-01

    To understand how statistical and other types of reasoning are coordinated with actions to reduce uncertainty, we conducted a case study in vocational education that involved statistical hypothesis testing. We analyzed an intern's research project in a hospital laboratory in which reducing uncertainties was crucial to make a valid statistical inference. In his project, the intern, Sam, investigated whether patients' blood could be sent through pneumatic post without influencing the measurement of particular blood components. We asked, in the process of making a statistical inference, how are reasons and actions coordinated to reduce uncertainty? For the analysis, we used the semantic theory of inferentialism, specifically, the concept of webs of reasons and actions—complexes of interconnected reasons for facts and actions; these reasons include premises and conclusions, inferential relations, implications, motives for action, and utility of tools for specific purposes in a particular context. Analysis of interviews with Sam, his supervisor and teacher as well as video data of Sam in the classroom showed that many of Sam's actions aimed to reduce variability, rule out errors, and thus reduce uncertainties so as to arrive at a valid inference. Interestingly, the decisive factor was not the outcome of a t test but of the reference change value, a clinical chemical measure of analytic and biological variability. With insights from this case study, we expect that students can be better supported in connecting statistics with context and in dealing with uncertainty.
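
    The reference change value that proved decisive here has a standard closed form in clinical chemistry: RCV = √2 · Z · √(CV_A² + CV_I²), where CV_A is the analytical and CV_I the within-subject biological coefficient of variation. A sketch; the CV values below are hypothetical, not those from Sam's project:

```python
import math

def reference_change_value(cv_analytical: float, cv_biological: float,
                           z: float = 1.96) -> float:
    """Two-sided 95% reference change value, in the same (%) units as the CVs."""
    return math.sqrt(2.0) * z * math.sqrt(cv_analytical ** 2 + cv_biological ** 2)

# Hypothetical CVs: 3% analytical and 6% within-subject biological variation.
# A change between two results smaller than the RCV is within expected noise.
rcv = reference_change_value(3.0, 6.0)  # about 18.6%
```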

  12. Risk prediction for chronic kidney disease progression using heterogeneous electronic health record data and time series analysis.

    PubMed

    Perotte, Adler; Ranganath, Rajesh; Hirsch, Jamie S; Blei, David; Elhadad, Noémie

    2015-07-01

    As adoption of electronic health records continues to increase, there is an opportunity to incorporate clinical documentation as well as laboratory values and demographics into risk prediction modeling. The authors develop a risk prediction model for chronic kidney disease (CKD) progression from stage III to stage IV that includes longitudinal data and features drawn from clinical documentation. The study cohort consisted of 2908 primary-care clinic patients who had at least three visits prior to January 1, 2013 and developed CKD stage III during their documented history. Development and validation cohorts were randomly selected from this cohort, and the study datasets included longitudinal inpatient and outpatient data from these populations. Time series analysis (Kalman filter) and survival analysis (Cox proportional hazards) were combined to produce a range of risk models. These models were evaluated using concordance, a discriminatory statistic. A risk model incorporating longitudinal data on clinical documentation and laboratory test results (concordance 0.849) predicts progression from stage III CKD to stage IV CKD more accurately than a similar model without laboratory test results (concordance 0.733, P < .001), a model that only considers the most recent laboratory test results (concordance 0.819, P < .031) and a model based on estimated glomerular filtration rate (concordance 0.779, P < .001). A risk prediction model that takes longitudinal laboratory test results and clinical documentation into consideration can predict CKD progression from stage III to stage IV more accurately than three models that do not take all of these variables into consideration. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
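
    The concordance statistic used to compare the models can be sketched in a few lines. This is a simplified version that assumes all event times are observed (it ignores the censoring handling a full survival c-index needs), on made-up risk scores:

```python
from itertools import combinations

def concordance_index(event_times, risk_scores):
    """Fraction of comparable pairs in which the subject with the earlier event
    also has the higher predicted risk (ties in risk count as 0.5)."""
    concordant, comparable = 0.0, 0
    for i, j in combinations(range(len(event_times)), 2):
        if event_times[i] == event_times[j]:
            continue  # tied event times are skipped in this simplified sketch
        comparable += 1
        early, late = (i, j) if event_times[i] < event_times[j] else (j, i)
        if risk_scores[early] > risk_scores[late]:
            concordant += 1.0
        elif risk_scores[early] == risk_scores[late]:
            concordant += 0.5
    return concordant / comparable

times = [2, 5, 7, 9]          # hypothetical times to CKD progression
risks = [0.9, 0.7, 0.8, 0.1]  # hypothetical model risk scores
c = concordance_index(times, risks)  # 5 of 6 pairs are concordant
```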

  13. Point-of-care testing of electrolytes and calcium using blood gas analysers: it is time we trusted the results.

    PubMed

    Mirzazadeh, Mehdi; Morovat, Alireza; James, Tim; Smith, Ian; Kirby, Justin; Shine, Brian

    2016-03-01

    Point-of-care testing allows rapid analysis of samples to facilitate prompt clinical decisions. Electrolyte and calcium abnormalities are common in acutely ill patients and can be associated with life-threatening consequences. There is uncertainty whether clinical decisions can be based on the results obtained from blood gas analysers or if laboratory results should be awaited. The aim was to assess the agreement between sodium, potassium and calcium results from blood gas and laboratory mainstream analysers in a tertiary centre with a network consisting of one referral and two peripheral hospitals, comprising three networked clinical biochemistry laboratories. Using the laboratory information management system database and over 11 000 paired samples in three hospital sites, the results of sodium, potassium and ionised calcium on blood gas analysers were studied over a 5-year period and compared with the corresponding laboratory results from the same patients booked in the laboratory within 1 h. The Pearson's linear correlation coefficients between laboratory and blood gas results for sodium, potassium and calcium were 0.92, 0.84 and 0.78, respectively. Deming regression analysis showed a slope of 1.04 and an intercept of -5.7 for sodium, a slope of 0.93 and an intercept of 0.22 for potassium, and a slope of 1.23 with an intercept of -0.55 for calcium. With some strict statistical assumptions, the percentages of results lying outside the least significant difference were 9%, 26.7% and 20.8% for sodium, potassium and calcium, respectively. Most clinicians wait for laboratory confirmation of results generated by blood gas analysers. In a large retrospective study we have shown that there is sufficient agreement between the results obtained from the blood gas and laboratory analysers to enable prompt clinical decisions to be made. Published by the BMJ Publishing Group Limited.
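
    Deming regression, used above to compare the two analysers, fits a line while allowing measurement error in both x and y. A minimal sketch with the error-variance ratio λ assumed equal to 1 (the paper does not state the λ it used), on hypothetical paired sodium results:

```python
def deming(x, y, lam=1.0):
    """Deming regression slope and intercept; lam is the ratio of the
    y-error variance to the x-error variance."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    # Standard closed-form Deming slope
    slope = ((syy - lam * sxx) +
             ((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2) ** 0.5) / (2 * sxy)
    return slope, my - slope * mx

# Hypothetical paired sodium results (blood gas vs laboratory analyser), mmol/l
bg = [138.0, 140.0, 142.0, 145.0, 136.0]
lab = [139.1, 141.0, 143.2, 146.0, 137.0]
slope, intercept = deming(bg, lab)
```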

  14. Safety leadership in the teaching laboratories of electrical and electronic engineering departments at Taiwanese Universities.

    PubMed

    Wu, Tsung-Chih

    2008-01-01

    Safety has always been one of the principal goals in teaching laboratories. Laboratories cannot serve their educational purpose when accidents occur. The leadership of department heads has a major impact on laboratory safety, so this study discusses the factors affecting safety leadership in teaching laboratories. This study uses a mail survey to explore the perceived safety leadership in electrical and electronic engineering departments at Taiwanese universities. An exploratory factor analysis shows that there are three main components of safety leadership, as measured on a safety leadership scale: safety controlling, safety coaching, and safety caring. The descriptive statistics also reveal that among faculty, the perception of department heads' safety leadership is in general positive. A two-way MANOVA shows that there are interaction effects on safety leadership between university size and instructor age; there are also interaction effects between the presence of a safety committee and both faculty gender and faculty age. It is therefore necessary to assess organizational factors when determining whether individual factors are the cause of differing perceptions among faculty members. The author also presents advice on improving safety leadership for department heads at small universities and at universities without safety committees.

  15. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    PubMed

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.
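
    Of the tool's three models, simple linear regression is the easiest to sketch: fit volume = a + b·t by ordinary least squares on the time index and project forward (the Holt-Winters variants additionally smooth level, trend and seasonality). The monthly volumes below are hypothetical:

```python
def linear_forecast(volumes, horizon):
    """Fit volume = a + b * t by ordinary least squares and project forward."""
    n = len(volumes)
    t = list(range(n))
    mt, mv = sum(t) / n, sum(volumes) / n
    b = sum((ti - mt) * (vi - mv) for ti, vi in zip(t, volumes)) / \
        sum((ti - mt) ** 2 for ti in t)
    a = mv - b * mt
    return [a + b * (n + h) for h in range(horizon)]

# Hypothetical monthly test volumes with a steady upward trend
history = [1000, 1040, 1110, 1130, 1205, 1240]
forecast = linear_forecast(history, 3)  # projected volumes for the next 3 months
```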

  17. Marginal fit of indirect composite inlays using a new system for manual fabrication.

    PubMed

    Pott, P; Rzasa, A; Stiesch, M; Eisenburger, M

    2016-09-01

    This in vitro study compares a new system for manual chairside fabrication of indirect composite restorations, which uses silicone models after alginate impressions, with CAD/CAM technology and laboratory manual production techniques. Ten composite inlays each were fabricated using one of four production techniques: CAD/CAM technology (A), the new inlay system (B), a plaster model after alginate impression (C), or a plaster model after silicone impression (D). The inlays were adapted to a metal tooth, and silicone replicas of the cement gaps were made and measured. Statistical analysis was performed using ANOVA and Tukey's test. The largest marginal gaps were found in group A (174.9 μm ± 106.2 μm). In group B the gaps were significantly smaller (119.5 μm ± 90.6 μm) than in group A (p = 0.035). No significant difference was found between groups C (64.6 μm ± 68.0 μm) and D (58.2 μm ± 61.7 μm) (p = 0.998), but the gaps in both were significantly smaller than in group B. Chairside manufacturing of composite inlays resulted in better marginal precision than CAD/CAM technology, and in comparison to having restorations built in a laboratory, the new system is a time-saving and inexpensive alternative. Nevertheless, production of indirect composite restorations in the dental laboratory showed the highest precision.
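
    The group comparison here rests on one-way ANOVA (with Tukey's post-hoc test); the core F statistic can be sketched in a few lines. The measurements below are toy numbers, not the study's gap data:

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over a list of groups of measurements."""
    all_values = [v for g in groups for v in g]
    grand_mean = sum(all_values) / len(all_values)
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(sum((v - sum(g) / len(g)) ** 2 for v in g) for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_values) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Toy marginal-gap measurements for three fabrication techniques
f_stat = one_way_anova_f([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0], [3.0, 4.0, 5.0]])
```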

  18. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    PubMed

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
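
    At its simplest, a POD curve is the detection proportion at each spiking level, from which a level of interest (e.g. the 50% detection concentration) can be read off by interpolation. The full model in the paper goes further (repeatability, reproducibility and laboratory effects); the counts below are hypothetical:

```python
def pod_curve(levels, detected, trials):
    """POD estimate (detections / trials) at each concentration level."""
    return {c: d / n for c, d, n in zip(levels, detected, trials)}

def interpolate_level(curve, target=0.5):
    """Linearly interpolate the concentration at which POD crosses `target`."""
    pts = sorted(curve.items())
    for (c0, p0), (c1, p1) in zip(pts, pts[1:]):
        if p0 <= target <= p1:
            return c0 + (target - p0) * (c1 - c0) / (p1 - p0)
    return None  # target not bracketed by the observed levels

# Hypothetical qualitative-assay data: detections out of 12 trials per level
curve = pod_curve([0.1, 0.5, 1.0, 2.0], [2, 6, 11, 12], [12, 12, 12, 12])
lod50 = interpolate_level(curve, 0.5)  # concentration with 50% detection
```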

  19. Global Sensitivity Analysis of Environmental Systems via Multiple Indices based on Statistical Moments of Model Outputs

    NASA Astrophysics Data System (ADS)

    Guadagnini, A.; Riva, M.; Dell'Oca, A.

    2017-12-01

    We propose to ground sensitivity of uncertain parameters of environmental models on a set of indices based on the main (statistical) moments, i.e., mean, variance, skewness and kurtosis, of the probability density function (pdf) of a target model output. This enables us to perform Global Sensitivity Analysis (GSA) of a model in terms of multiple statistical moments and yields a quantification of the impact of model parameters on features driving the shape of the pdf of model output. Our GSA approach includes the possibility of being coupled with the construction of a reduced complexity model that allows approximating the full model response at a reduced computational cost. We demonstrate our approach through a variety of test cases. These include a commonly used analytical benchmark, a simplified model representing pumping in a coastal aquifer, a laboratory-scale tracer experiment, and the migration of fracturing fluid through a naturally fractured reservoir (source) to reach an overlying formation (target). Our strategy allows discriminating the relative importance of model parameters to the four statistical moments considered. We also provide an appraisal of the error associated with the evaluation of our sensitivity metrics by replacing the original system model through the selected surrogate model. Our results suggest that one might need to construct a surrogate model with increasing level of accuracy depending on the statistical moment considered in the GSA. The methodological framework we propose can assist the development of analysis techniques targeted to model calibration, design of experiment, uncertainty quantification and risk assessment.
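
    A crude way to see the moment-based idea for the mean: compare the conditional mean of the output, given one parameter fixed, against the unconditional mean, and average the normalized shift over that parameter's range (the authors do the analogous thing for variance, skewness and kurtosis). The model below is a toy, not one of the paper's test cases:

```python
import random

def mean_based_index(model, which, n_bins=20, n_samples=20000, seed=1):
    """Average normalized shift of E[Y | x_which] away from E[Y];
    both parameters are sampled uniformly on (0, 1)."""
    rng = random.Random(seed)
    samples = [(rng.random(), rng.random()) for _ in range(n_samples)]
    outputs = [model(*s) for s in samples]
    global_mean = sum(outputs) / len(outputs)
    bins = [[] for _ in range(n_bins)]
    for s, y in zip(samples, outputs):
        bins[min(int(s[which] * n_bins), n_bins - 1)].append(y)
    shifts = [abs(sum(b) / len(b) - global_mean) for b in bins if b]
    return sum(shifts) / len(shifts) / abs(global_mean)

# Toy model (not from the paper): x2 has five times the influence of x1
model = lambda x1, x2: x1 + 5.0 * x2
index_x1 = mean_based_index(model, 0)
index_x2 = mean_based_index(model, 1)  # expect index_x2 > index_x1
```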

  20. A model for the statistical description of analytical errors occurring in clinical chemical laboratories with time.

    PubMed

    Hyvärinen, A

    1985-01-01

    The main purpose of the present study was to describe the statistical behaviour of daily analytical errors in the dimensions of place and time, providing a statistical basis for realistic estimates of the analytical error, and hence allowing the importance of the error and the relative contributions of its different sources to be re-evaluated. The observation material consists of creatinine and glucose results for control sera measured in daily routine quality control in five laboratories for a period of one year. The observation data were processed and computed by means of an automated data processing system. Graphic representations of time series of daily observations, as well as their means and dispersion limits when grouped over various time intervals, were investigated. For partition of the total variation, several two-way analyses of variance were done with laboratory and various time classifications as factors. Pooled sets of observations were tested for normality of distribution and for consistency of variances, and the distribution characteristics of error variation in different categories of place and time were compared. The time series showed that errors typically varied from day to day. Because of irregular fluctuations in general, and particular seasonal effects for creatinine, stable estimates of the means or dispersions of errors in individual laboratories could not easily be obtained over short periods of time, but only from data sets pooled over long intervals (preferably at least one year). Pooled estimates of proportions of intralaboratory variation were relatively low (less than 33%) when the variation was pooled within days. However, when the variation was pooled over longer intervals this proportion increased considerably, even to a maximum of 89-98% (95-98% in each method category) when an outlying laboratory in glucose was omitted, with a concomitant decrease in the interaction component (representing laboratory-dependent variation with time).
This indicates that a substantial part of the variation comes from intralaboratory variation with time rather than from constant interlaboratory differences. Normality and consistency of statistical distributions were best achieved in the long-term intralaboratory sets of the data, under which conditions the statistical estimates of error variability were also most characteristic of the individual laboratories rather than necessarily being similar to one another. Mixing of data from different laboratories may give heterogeneous and nonparametric distributions and hence is not advisable.(ABSTRACT TRUNCATED AT 400 WORDS)

  1. Applying the LANL Statistical Pattern Recognition Paradigm for Structural Health Monitoring to Data from a Surface-Effect Fast Patrol Boat

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoon Sohn; Charles Farrar; Norman Hunter

    2001-01-01

    This report summarizes the analysis of fiber-optic strain gauge data obtained from a surface-effect fast patrol boat being studied by the staff at the Norwegian Defense Research Establishment (NDRE) in Norway and the Naval Research Laboratory (NRL) in Washington D.C. Data from two different structural conditions were provided to the staff at Los Alamos National Laboratory. The problem was then approached from a statistical pattern recognition paradigm. This paradigm can be described as a four-part process: (1) operational evaluation, (2) data acquisition & cleansing, (3) feature extraction and data reduction, and (4) statistical model development for feature discrimination. Given that the first two portions of this paradigm were mostly completed by the NDRE and NRL staff, this study focused on data normalization, feature extraction, and statistical modeling for feature discrimination. The feature extraction process began by looking at relatively simple statistics of the signals and progressed to using the residual errors from auto-regressive (AR) models fit to the measured data as the damage-sensitive features. Data normalization proved to be the most challenging portion of this investigation. A novel approach to data normalization, where the residual errors in the AR model are considered to be an unmeasured input and an auto-regressive model with exogenous inputs (ARX) is then fit to portions of the data exhibiting similar waveforms, was successfully applied to this problem. With this normalization procedure, a clear distinction between the two different structural conditions was obtained. A false-positive study was also run, and the procedure developed herein did not yield any false-positive indications of damage. Finally, the results must be qualified by the fact that this procedure has only been applied to very limited data samples. A more complete analysis of additional data taken under various operational and environmental conditions as well as other structural conditions is necessary before one can definitively state that the procedure is robust enough to be used in practice.
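
    The damage-sensitive features came from residuals of AR models fit to the measured response. The idea reduces to a few lines for an AR(1) model fit by least squares; the toy signal below is illustrative, not the NDRE strain data:

```python
def fit_ar1(signal):
    """Least-squares AR(1) coefficient and the residual time series."""
    num = sum(signal[t] * signal[t - 1] for t in range(1, len(signal)))
    den = sum(signal[t - 1] ** 2 for t in range(1, len(signal)))
    phi = num / den
    residuals = [signal[t] - phi * signal[t - 1] for t in range(1, len(signal))]
    return phi, residuals

# A toy signal that is exactly AR(1) with phi = 0.5: residuals vanish here,
# while growth of residual errors on new data would indicate a structural change
y = [8.0]
for _ in range(20):
    y.append(0.5 * y[-1])
phi, res = fit_ar1(y)
```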

  2. The role of diagnostic laboratories in support of animal disease surveillance systems.

    PubMed

    Zepeda, C

    2007-01-01

    Diagnostic laboratories are an essential component of animal disease surveillance systems. To understand the occurrence of disease in populations, surveillance systems rely on random or targeted surveys using three approaches: clinical, serological and virological surveillance. Clinical surveillance is the basis for early detection of disease and is usually centered on the detection of syndromes and clinical findings requiring confirmation by diagnostic laboratories. Although most of the tests applied usually perform to an acceptable standard, several have not been properly validated in terms of their diagnostic sensitivity and specificity. Sensitivity and specificity estimates can vary according to local conditions and, ideally, should be determined by national laboratories where the tests are to be applied. The importance of sensitivity and specificity estimates in the design and interpretation of statistically based surveys and risk analysis is fundamental to establishing appropriate disease control and prevention strategies. The World Organisation for Animal Health's (OIE) network of reference laboratories acts as a set of centres of expertise for the diagnosis of OIE-listed diseases and has a role in promoting the validation of OIE prescribed tests for international trade. This paper discusses the importance of the epidemiological evaluation of diagnostic tests and the role of the OIE Reference Laboratories and Collaborating Centres in this process.

  3. Standardisation of a European measurement method for organic carbon and elemental carbon in ambient air: results of the field trial campaign and the determination of a measurement uncertainty and working range.

    PubMed

    Brown, Richard J C; Beccaceci, Sonya; Butterfield, David M; Quincey, Paul G; Harris, Peter M; Maggos, Thomas; Panteliadis, Pavlos; John, Astrid; Jedynska, Aleksandra; Kuhlbusch, Thomas A J; Putaud, Jean-Philippe; Karanasiou, Angeliki

    2017-10-18

    The European Committee for Standardisation (CEN) Technical Committee 264 'Air Quality' has recently produced a standard method for the measurement of organic carbon and elemental carbon in PM2.5 within its working group 35, in response to the requirements of European Directive 2008/50/EC. It is expected that this method will be used in future by all Member States making measurements of the carbonaceous content of PM2.5. This paper details the results of a laboratory and field measurement campaign and the statistical analysis performed to validate the standard method, assess its uncertainty and define its working range, to provide clarity and confidence in the underpinning science for future users of the method. The statistical analysis showed that the expanded combined uncertainty for transmittance protocol measurements of OC, EC and TC is expected to be below 25%, at the 95% level of confidence, above filter loadings of 2 μg cm-2. The detection limit of the method for total carbon was estimated as 2 μg cm-2. As a result of the laboratory and field measurement campaign, the EUSAAR2 transmittance measurement protocol was chosen as the basis of the standard method EN 16909:2017.
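
    The "expanded combined uncertainty at the 95% level of confidence" follows the usual GUM recipe: combine independent standard-uncertainty components in quadrature and multiply by a coverage factor k ≈ 2. A sketch with hypothetical relative components (the campaign's actual budget is not broken down in the abstract):

```python
def expanded_uncertainty(components, k=2.0):
    """Combine independent standard-uncertainty components (here in %) in
    quadrature and expand by coverage factor k (k = 2 gives about 95%)."""
    combined = sum(u ** 2 for u in components) ** 0.5
    return k * combined

# Hypothetical relative standard-uncertainty components for an OC measurement (%)
U = expanded_uncertainty([5.0, 8.0, 6.0])  # about 22.4%
```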

  4. Analysis of the HLA population data (AHPD) submitted to the 15th International Histocompatibility/Immunogenetics Workshop by using the Gene[rate] computer tools accommodating ambiguous data (AHPD project report).

    PubMed

    Nunes, J M; Riccio, M E; Buhler, S; Di, D; Currat, M; Ries, F; Almada, A J; Benhamamouch, S; Benitez, O; Canossi, A; Fadhlaoui-Zid, K; Fischer, G; Kervaire, B; Loiseau, P; de Oliveira, D C M; Papasteriades, C; Piancatelli, D; Rahal, M; Richard, L; Romero, M; Rousseau, J; Spiroski, M; Sulcebe, G; Middleton, D; Tiercy, J-M; Sanchez-Mazas, A

    2010-07-01

    During the 15th International Histocompatibility and Immunogenetics Workshop (IHIWS), 14 human leukocyte antigen (HLA) laboratories participated in the Analysis of HLA Population Data (AHPD) project where 18 new population samples were analyzed statistically and compared with data available from previous workshops. To that aim, an original methodology was developed and used (i) to estimate frequencies by taking into account ambiguous genotypic data, (ii) to test for Hardy-Weinberg equilibrium (HWE) by using a nested likelihood ratio test involving a parameter accounting for HWE deviations, (iii) to test for selective neutrality by using a resampling algorithm, and (iv) to provide explicit graphical representations including allele frequencies and basic statistics for each series of data. A total of 66 data series (1-7 loci per population) were analyzed with this standard approach. Frequency estimates were compliant with HWE in all but one population of mixed stem cell donors. Neutrality testing confirmed the observation of heterozygote excess at all HLA loci, although a significant deviation was established in only a few cases. Population comparisons showed that HLA genetic patterns were mostly shaped by geographic and/or linguistic differentiations in Africa and Europe, but not in America where both genetic drift in isolated populations and gene flow in admixed populations led to a more complex genetic structure. Overall, a fruitful collaboration between HLA typing laboratories and population geneticists allowed finding useful solutions to the problem of estimating gene frequencies and testing basic population diversity statistics on highly complex HLA data (high numbers of alleles and ambiguities), with promising applications in either anthropological, epidemiological, or transplantation studies.
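
    The AHPD project tested HWE with a nested likelihood-ratio test that accommodates ambiguous multi-allele genotypes; for intuition only, the classical biallelic chi-square version takes a few lines. The genotype counts below are toys, not workshop data:

```python
def hwe_chi_square(n_aa, n_ab, n_bb):
    """Chi-square statistic (1 df) for Hardy-Weinberg equilibrium, two alleles."""
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)  # frequency of allele A by gene counting
    q = 1.0 - p
    expected = [n * p * p, 2 * n * p * q, n * q * q]
    observed = [n_aa, n_ab, n_bb]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Toy counts exactly at HWE proportions (p = 0.5): the statistic is 0
chi2 = hwe_chi_square(25, 50, 25)
```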

  5. Particle size distributions by transmission electron microscopy: an interlaboratory comparison case study

    PubMed Central

    Rice, Stephen B; Chan, Christopher; Brown, Scott C; Eschbach, Peter; Han, Li; Ensor, David S; Stefaniak, Aleksandr B; Bonevich, John; Vladár, András E; Hight Walker, Angela R; Zheng, Jiwen; Starnes, Catherine; Stromberg, Arnold; Ye, Jia; Grulke, Eric A

    2015-01-01

    This paper reports an interlaboratory comparison that evaluated a protocol for measuring and analysing the particle size distribution of discrete, metallic, spheroidal nanoparticles using transmission electron microscopy (TEM). The study was focused on automated image capture and automated particle analysis. NIST RM8012 gold nanoparticles (30 nm nominal diameter) were measured for area-equivalent diameter distributions by eight laboratories. Statistical analysis was used to (1) assess the data quality without using size distribution reference models, (2) determine reference model parameters for different size distribution reference models and non-linear regression fitting methods and (3) assess the measurement uncertainty of a size distribution parameter by using its coefficient of variation. The interlaboratory area-equivalent diameter mean, 27.6 nm ± 2.4 nm (computed based on a normal distribution), was quite similar to the area-equivalent diameter, 27.6 nm, assigned to NIST RM8012. The lognormal reference model was the preferred choice for these particle size distributions as, for all laboratories, its parameters had lower relative standard errors (RSEs) than the other size distribution reference models tested (normal, Weibull and Rosin–Rammler–Bennett). The RSEs for the fitted standard deviations were two orders of magnitude higher than those for the fitted means, suggesting that most of the parameter estimate errors were associated with estimating the breadth of the distributions. The coefficients of variation for the interlaboratory statistics also confirmed the lognormal reference model as the preferred choice. From quasi-linear plots, the typical range for good fits between the model and cumulative number-based distributions was 1.9 fitted standard deviations less than the mean to 2.3 fitted standard deviations above the mean. 
Automated image capture, automated particle analysis and statistical evaluation of the data and fitting coefficients provide a framework for assessing nanoparticle size distributions using TEM for image acquisition. PMID:26361398
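
    In its simplest form, fitting the preferred lognormal reference model reduces to taking moments of log diameter, giving the geometric mean and geometric standard deviation (the study itself used non-linear regression on full distributions). The diameters below are made up:

```python
import math

def lognormal_fit(diameters):
    """Geometric mean and geometric standard deviation from log-moments."""
    logs = [math.log(d) for d in diameters]
    mu = sum(logs) / len(logs)
    var = sum((x - mu) ** 2 for x in logs) / (len(logs) - 1)
    return math.exp(mu), math.exp(math.sqrt(var))  # (GM, GSD)

# Hypothetical area-equivalent diameters (nm) from TEM image analysis
gm, gsd = lognormal_fit([25.1, 27.3, 28.0, 26.5, 29.4, 27.8])
```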

  6. Application of statistical process control to qualitative molecular diagnostic assays.

    PubMed

    O'Brien, Cathal P; Finn, Stephen P

    2014-01-01

    Modern pathology laboratories and in particular high throughput laboratories such as clinical chemistry have developed a reliable system for statistical process control (SPC). Such a system is absent from the majority of molecular laboratories and where present is confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data was also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data.
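
    The paper's approach, a frequency estimate coupled with a confidence interval that flags runs whose expected mutation frequency falls outside it, can be sketched with a Wilson score interval. The counts and expected frequency below are toy numbers; the paper's own thresholds and minimum sample numbers differ:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

def out_of_control(observed_mutants, n, expected_freq):
    """Flag the assay if the expected frequency lies outside the interval."""
    lo, hi = wilson_interval(observed_mutants, n)
    return not (lo <= expected_freq <= hi)

# Toy run: 2 mutation-positive of 50 samples against an expected 30% frequency
flag = out_of_control(2, 50, 0.30)  # flagged: 0.30 is implausibly high here
```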

  7. Completely automated modal analysis procedure based on the combination of different OMA methods

    NASA Astrophysics Data System (ADS)

    Ripamonti, Francesco; Bussini, Alberto; Resta, Ferruccio

    2018-03-01

    In this work a completely automated output-only Modal Analysis procedure is presented and its benefits are discussed. By merging different Operational Modal Analysis methods with a statistical approach, the identification process has been made more robust, returning only the true natural frequencies, damping ratios and mode shapes of the system. The effect of temperature can be taken into account as well, yielding a better tool for automated Structural Health Monitoring. The algorithm has been developed and tested on a numerical model of a scaled three-story steel building located in the laboratories of Politecnico di Milano.

  8. Beyond existence and aiming outside the laboratory: estimating frequency-dependent and pay-off-biased social learning strategies.

    PubMed

    McElreath, Richard; Bell, Adrian V; Efferson, Charles; Lubell, Mark; Richerson, Peter J; Waring, Timothy

    2008-11-12

    The existence of social learning has been confirmed in diverse taxa, from apes to guppies. In order to advance our understanding of the consequences of social transmission and evolution of behaviour, however, we require statistical tools that can distinguish among diverse social learning strategies. In this paper, we advance two main ideas. First, social learning is diverse, in the sense that individuals can take advantage of different kinds of information and combine them in different ways. Examining learning strategies for different information conditions illuminates the more detailed design of social learning. We construct and analyse an evolutionary model of diverse social learning heuristics, in order to generate predictions and illustrate the impact of design differences on an organism's fitness. Second, in order to eventually escape the laboratory and apply social learning models to natural behaviour, we require statistical methods that do not depend upon tight experimental control. Therefore, we examine strategic social learning in an experimental setting in which the social information itself is endogenous to the experimental group, as it is in natural settings. We develop statistical models for distinguishing among different strategic uses of social information. The experimental data strongly suggest that most participants employ a hierarchical strategy that uses both average observed pay-offs of options as well as frequency information, the same model predicted by our evolutionary analysis to dominate a wide range of conditions.

  9. Influence of Artisan Bakery- or Laboratory-Propagated Sourdoughs on the Diversity of Lactic Acid Bacterium and Yeast Microbiotas

    PubMed Central

    Minervini, Fabio; Lattanzi, Anna; De Angelis, Maria; Gobbetti, Marco

    2012-01-01

    Seven mature type I sourdoughs were comparatively back-slopped (80 days) at artisan bakery and laboratory levels under constant technology parameters. The cell density of presumptive lactic acid bacteria and related biochemical features were not affected by the environment of propagation. On the contrary, the number of yeasts markedly decreased from artisan bakery to laboratory propagation. During late laboratory propagation, denaturing gradient gel electrophoresis (DGGE) showed that the DNA band corresponding to Saccharomyces cerevisiae was no longer detectable in several sourdoughs. Twelve species of lactic acid bacteria were variously identified through a culture-dependent approach. All sourdoughs harbored a certain number of species and strains, which were dominant throughout time and, in several cases, varied depending on the environment of propagation. As shown by statistical permutation analysis, the lactic acid bacterium populations differed among sourdoughs propagated at artisan bakery and laboratory levels. Lactobacillus plantarum, Lactobacillus sakei, and Weissella cibaria dominated in only some sourdoughs back-slopped at artisan bakeries, and Leuconostoc citreum seemed to be more persistent under laboratory conditions. Strains of Lactobacillus sanfranciscensis were indifferently found in some sourdoughs. Together with the other stable species and strains, other lactic acid bacteria temporarily contaminated the sourdoughs and largely differed between artisan bakery and laboratory levels. The environment of propagation has an undoubted influence on the composition of sourdough yeast and lactic acid bacterium microbiotas. PMID:22635989

  10. First proficiency testing to evaluate the ability of European Union National Reference Laboratories to detect staphylococcal enterotoxins in milk products.

    PubMed

    Hennekinne, Jacques-Antoine; Gohier, Martine; Maire, Tiphaine; Lapeyre, Christiane; Lombard, Bertrand; Dragacci, Sylviane

    2003-01-01

    The European Commission has designed a network of European Union National Reference Laboratories (EU-NRLs), coordinated by a Community Reference Laboratory (CRL), for control of hygiene of milk and milk products (Council Directive 92/46/EEC). As common contaminants of milk and milk products such as cheese, staphylococcal enterotoxins are often involved in human outbreaks and should be monitored regularly. The main tasks of the CRL were to select and transfer to the EU-NRLs a reference method for detection of enterotoxins, and to set up proficiency testing to evaluate the competency of the European laboratory network. The first interlaboratory exercise was performed on samples of freeze-dried cheese inoculated with 2 levels of staphylococcal enterotoxins (0.1 and 0.25 ng/g) and on an uninoculated control. These levels were chosen considering the EU regulation for staphylococcal enterotoxins in milk and milk products and the limit of detection of the enzyme-linked immunosorbent assay test recommended in the reference method. The trial was conducted according to the recommendations of ISO Guide 43. Results produced by laboratories were compiled and compared through statistical analysis. Except for data from 2 laboratories for the uninoculated control and cheese inoculated at 0.1 ng/g, all laboratories produced satisfactory results, showing the ability of the EU-NRL network to monitor the enterotoxin contaminant.

  11. ProteoSign: an end-user online differential proteomics statistical analysis platform.

    PubMed

    Efstathiou, Georgios; Antonakis, Andreas N; Pavlopoulos, Georgios A; Theodosiou, Theodosios; Divanach, Peter; Trudgian, David C; Thomas, Benjamin; Papanikolaou, Nikolas; Aivaliotis, Michalis; Acuto, Oreste; Iliopoulos, Ioannis

    2017-07-03

    Profiling of proteome dynamics is crucial for understanding cellular behavior in response to intrinsic and extrinsic stimuli and maintenance of homeostasis. Over the last 20 years, mass spectrometry (MS) has emerged as the most powerful tool for large-scale identification and characterization of proteins. Bottom-up proteomics, the most common MS-based proteomics approach, has always been challenging in terms of data management, processing, analysis and visualization, with modern instruments capable of producing several gigabytes of data from a single experiment. Here, we present ProteoSign, a freely available web application dedicated to allowing users to perform proteomics differential expression/abundance analysis in a user-friendly and self-explanatory way. Although several non-commercial standalone tools have been developed for post-quantification statistical analysis of proteomics data, most of them are unappealing to end users, as they often require laborious installation of programming environments, third-party software packages and sometimes further scripting or computer programming. To avoid this bottleneck, we have developed a user-friendly software platform accessible via a web interface in order to enable proteomics laboratories and core facilities to statistically analyse quantitative proteomics data sets in a resource-efficient manner. ProteoSign is available at http://bioinformatics.med.uoc.gr/ProteoSign and the source code at https://github.com/yorgodillo/ProteoSign. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  12. Implementation of novel statistical procedures and other advanced approaches to improve analysis of CASA data.

    PubMed

    Ramón, M; Martínez-Pastor, F

    2018-04-23

    Computer-aided sperm analysis (CASA) produces a wealth of data that is frequently ignored. The use of multiparametric statistical methods can help explore these datasets, unveiling the subpopulation structure of sperm samples. In this review we analyse the significance of the internal heterogeneity of sperm samples and its relevance. We also provide a brief description of the statistical tools used for extracting sperm subpopulations from the datasets, namely unsupervised clustering (with non-hierarchical, hierarchical and two-step methods) and the most advanced supervised methods, based on machine learning. The former has allowed exploration of subpopulation patterns in many species, whereas the latter offers further possibilities, especially considering functional studies and the practical use of subpopulation analysis. We also consider novel approaches, such as the use of geometric morphometrics or imaging flow cytometry. Finally, although clustering analyses of CASA data provide valuable information on sperm samples, there are several caveats. Protocols for capturing and analysing motility or morphometry should be standardised and adapted to each experiment, and the algorithms should be open in order to allow comparison of results between laboratories. Moreover, we must be aware of new technology that could change the paradigm for studying sperm motility and morphology.
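    As a toy illustration of the non-hierarchical clustering the review discusses, the sketch below runs a plain k-means over two invented CASA-style parameters (curvilinear velocity VCL and linearity LIN); real analyses use many more kinematic variables and validated software.

```python
import math
import random

random.seed(1)

# Synthetic CASA-style records (invented for illustration): two motility
# parameters per cell, curvilinear velocity VCL (um/s) and linearity LIN (%).
fast_linear = [(random.gauss(150, 12), random.gauss(80, 5)) for _ in range(60)]
slow_nonlinear = [(random.gauss(60, 10), random.gauss(35, 6)) for _ in range(60)]
cells = fast_linear + slow_nonlinear

def kmeans(points, k, iters=50):
    """Plain (non-hierarchical) k-means with simple deterministic seeding."""
    step = len(points) // k
    centroids = [points[i * step] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: math.dist(p, centroids[j]))
            clusters[i].append(p)
        # Recompute each centroid as the mean of its assigned points.
        centroids = [
            tuple(sum(v) / len(cl) for v in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

centroids, clusters = kmeans(cells, k=2)
for c, cl in zip(centroids, clusters):
    print(f"subpopulation at VCL={c[0]:.0f} um/s, LIN={c[1]:.0f}% ({len(cl)} cells)")
```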

  13. Radiation shielding quality assurance

    NASA Astrophysics Data System (ADS)

    Um, Dallsun

    For radiation shielding quality assurance, the validity and reliability of the neutron transport code MCNP, now one of the most widely used radiation shielding analysis codes, were checked against a number of benchmark experiments; as a practical example, the following was performed in this thesis. An integral neutron transport experiment to measure the effect of neutron streaming in iron and void was performed with the Dog-Legged Void Assembly at Knolls Atomic Power Laboratory in 1991. Neutron flux was measured at six different locations with methane detectors and a BF-3 detector. The main purpose of the measurements was to provide a benchmark against which various neutron transport calculation tools could be compared. Those data were used in verification of the Monte Carlo Neutron & Photon Transport Code, MCNP, with a model of the assembly. Experimental and calculated results were compared in two ways: as the total integrated neutron flux over the energy range from 10 keV to 2 MeV, and as the neutron spectrum across that range. Both sets of results agree within the statistical error of ±20%. MCNP results were also compared with those of TORT, a three-dimensional discrete ordinates code developed by Oak Ridge National Laboratory. MCNP results were superior to the TORT results at all detector locations except one. This shows that MCNP is a very powerful tool for the analysis of neutron transport through iron and air, and further that it could serve as a powerful tool for radiation shielding analysis. As one application of the analysis of variance (ANOVA) to neutron and gamma transport problems, uncertainties in the calculated values of the criticality constant k were evaluated by applying ANOVA to the statistical data.

  14. Experimental study on Statistical Damage Detection of RC Structures based on Wavelet Packet Analysis

    NASA Astrophysics Data System (ADS)

    Zhu, X. Q.; Law, S. S.; Jayawardhan, M.

    2011-07-01

    A novel damage indicator based on the wavelet packet transform is developed in this study for structural health monitoring. The response signal of a structure under an impact load is normalized and then decomposed into wavelet packet components. The energies of these components are then calculated to obtain the energy distribution, and a statistical indicator is developed to describe the damage extent of the structure. This approach is applied to test results from simply supported reinforced concrete beams in the laboratory. Single-damage cases are created by static loading, and accelerations of the structure under impact loads are analyzed. Results show that the method can be used for damage monitoring and assessment of the structure.
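    The energy-distribution indicator described above can be sketched with a hand-rolled Haar wavelet packet transform; real studies typically use richer wavelets and calibrated thresholds, and the signals and "damage" frequency shift here are synthetic.

```python
import math

def haar_packet(signal, level):
    """Full Haar wavelet packet decomposition down to 2**level terminal nodes
    (a minimal stand-in for a production wavelet packet transform)."""
    nodes = [list(signal)]
    for _ in range(level):
        nxt = []
        for x in nodes:
            pairs = list(zip(x[::2], x[1::2]))
            nxt.append([(a + b) / math.sqrt(2) for a, b in pairs])  # approximation
            nxt.append([(a - b) / math.sqrt(2) for a, b in pairs])  # detail
        nodes = nxt
    return nodes

def energy_distribution(signal, level=3):
    """Relative energy carried by each terminal wavelet packet node."""
    energies = [sum(c * c for c in node) for node in haar_packet(signal, level)]
    total = sum(energies)
    return [e / total for e in energies]

def damage_indicator(baseline, response, level=3):
    """Sum of absolute changes in relative node energies: a simple statistic
    of the kind used to grade damage extent from the energy distribution."""
    return sum(abs(a - b) for a, b in
               zip(energy_distribution(baseline, level),
                   energy_distribution(response, level)))

# Synthetic impact responses: damage lowers the dominant modal frequency
# (here from 80 Hz to 50 Hz), redistributing wavelet packet energy.
t = [i / 1024 for i in range(1024)]
healthy = [math.exp(-3 * x) * math.sin(2 * math.pi * 80 * x) for x in t]
damaged = [math.exp(-3 * x) * math.sin(2 * math.pi * 50 * x) for x in t]

print(f"healthy vs healthy: {damage_indicator(healthy, healthy):.3f}")
print(f"healthy vs damaged: {damage_indicator(healthy, damaged):.3f}")
```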

  15. Data-optimized source modeling with the Backwards Liouville Test–Kinetic method

    DOE PAGES

    Woodroffe, J. R.; Brito, T. V.; Jordanova, V. K.; ...

    2017-09-14

  16. Estimating the mass variance in neutron multiplicity counting-A comparison of approaches

    NASA Astrophysics Data System (ADS)

    Dubi, C.; Croft, S.; Favalli, A.; Ocherashvili, A.; Pedersen, B.

    2017-12-01

    In the standard practice of neutron multiplicity counting, the first three sampled factorial moments of the event-triggered neutron count distribution are used to quantify the three main neutron source terms: the spontaneous fissile material effective mass, the relative (α,n) production and the induced fission source responsible for multiplication. This study compares three methods to quantify the statistical uncertainty of the estimated mass: the bootstrap method, propagation of variance through moments, and the statistical analysis of cycle data method. Each of the three methods was implemented on a set of four different NMC measurements, held at the JRC laboratory in Ispra, Italy, sampling four different Pu samples in a standard Plutonium Scrap Multiplicity Counter (PSMC) well counter.
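    Of the three methods compared, the bootstrap is the simplest to sketch: resample the per-cycle data with replacement and take the spread of the re-computed statistic as its standard error. The per-cycle counts below are simulated, and the statistic shown is a plain mean rather than the full multiplicity mass estimate.

```python
import random
import statistics

random.seed(7)

# Simulated per-cycle counts from a multiplicity counter; the real analysis
# bootstraps the full triggered count distribution, but the mechanics of the
# cycle-data bootstrap are identical for any per-cycle statistic.
cycles = [random.gauss(1000, 40) for _ in range(500)]

def bootstrap_se(data, statistic, n_resamples=2000):
    """Standard error of an estimator: resample cycles with replacement and
    take the spread of the re-computed statistic."""
    n = len(data)
    estimates = [
        statistic([data[random.randrange(n)] for _ in range(n)])
        for _ in range(n_resamples)
    ]
    return statistics.stdev(estimates)

se_boot = bootstrap_se(cycles, statistics.fmean)
se_analytic = statistics.stdev(cycles) / len(cycles) ** 0.5  # s/sqrt(n) check
print(f"bootstrap SE: {se_boot:.2f}, analytic SE: {se_analytic:.2f}")
```

For the mean, the bootstrap should closely reproduce the analytic s/sqrt(n); its value is that the same recipe works for estimators with no closed-form variance.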

  19. The use of computers to teach human anatomy and physiology to allied health and nursing students

    NASA Astrophysics Data System (ADS)

    Bergeron, Valerie J.

    Educational institutions are under tremendous pressure to adopt the newest technologies in order to prepare their students to meet the challenges of the twenty-first century. For the last twenty years huge amounts of money have been spent on computers, printers, software, multimedia projection equipment, and so forth. A reasonable question is, "Has it worked?" Has this infusion of resources, financial as well as human, resulted in improved learning? Are the students meeting the intended learning goals? Any attempt to develop answers to these questions should include examining the intended goals and exploring the effects of the changes on students and faculty. This project investigated the impact of a specific application of a computer program in a community college setting on students' attitudes and understanding of human anatomy and physiology. In this investigation two sites of the same community college, seven miles apart and with seemingly similar student populations, used different laboratory activities to teach human anatomy and physiology. At one site nursing students were taught using traditional dissections and laboratory activities; at the other site two of the dissections, specifically cat and sheep pluck, were replaced with the A.D.A.M.® (Animated Dissection of Anatomy for Medicine) computer program. Analysis of the attitude data indicated that students at both sites were extremely positive about their laboratory experiences. Analysis of the content data indicated a statistically significant difference in performance between the two sites in two of the eight content areas studied; for both topics the students using the computer program scored higher. A detailed analysis of the surveys, interviews with faculty and students, examination of laboratory materials, observations of laboratory facilities at both sites, and a cost-benefit analysis led to the development of seven recommendations.
The recommendations call for action at the level of the institution requiring investment in additional resources, and at the level of the faculty requiring a commitment to exploration and reflective practice.

  20. Effectiveness of practices to reduce blood culture contamination: a Laboratory Medicine Best Practices systematic review and meta-analysis.

    PubMed

    Snyder, Susan R; Favoretto, Alessandra M; Baetz, Rich Ann; Derzon, James H; Madison, Bereneice M; Mass, Diana; Shaw, Colleen S; Layfield, Christopher D; Christenson, Robert H; Liebow, Edward B

    2012-09-01

    This article is a systematic review of the effectiveness of three practices for reducing blood culture contamination rates: venipuncture, phlebotomy teams, and prepackaged preparation/collection (prep) kits. The CDC-funded Laboratory Medicine Best Practices Initiative systematic review methods for quality improvement practices were used. Studies included as evidence were 9 for venipuncture (versus intravenous catheter), 5 for phlebotomy teams, and 7 for prep kits. All studies for venipuncture and phlebotomy teams favored these practices, with meta-analysis mean odds ratios of 2.69 for venipuncture and 2.58 for phlebotomy teams. For prep kits, 6 studies' effect sizes were not statistically significantly different from no effect (meta-analysis mean odds ratio 1.12). Venipuncture and the use of phlebotomy teams are effective practices for reducing blood culture contamination rates in diverse hospital settings and are recommended as evidence-based "best practices" with high overall strength of evidence and substantial effect size ratings. No recommendation is made for or against prep kits based on uncertain improvement. Copyright © 2012 The Canadian Society of Clinical Chemists. All rights reserved.
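    The "meta-analysis mean odds ratio" quoted above is conventionally obtained by inverse-variance pooling of per-study log odds ratios; the sketch below shows the mechanics on invented 2x2 counts, not the review's data.

```python
import math

# Invented 2x2 study counts: (contaminated, clean) under venipuncture and
# (contaminated, clean) under intravenous-catheter draws, per study.
studies = [
    (12, 988, 34, 966),
    (8, 492, 21, 479),
    (15, 1485, 40, 1460),
]

def pooled_odds_ratio(studies):
    """Fixed-effect inverse-variance pooling of log odds ratios: the standard
    mechanics behind a meta-analysis mean odds ratio."""
    num = den = 0.0
    for a, b, c, d in studies:
        log_or = math.log((c * b) / (a * d))  # odds(catheter) / odds(venipuncture)
        var = 1 / a + 1 / b + 1 / c + 1 / d   # variance of the log odds ratio
        w = 1 / var
        num += w * log_or
        den += w
    return math.exp(num / den)

print(f"pooled odds ratio: {pooled_odds_ratio(studies):.2f}")
```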

  1. Applications of High Resolution Laser Induced Breakdown Spectroscopy for Environmental and Biological Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, Madhavi Z; Labbe, Nicole; Wagner, Rebekah J.

    2013-01-01

    This chapter details the application of LIBS in a number of environmental areas of research, such as carbon sequestration and climate change. LIBS has also been shown to be useful in other high-resolution environmental applications, for example elemental mapping and detection of metals in plant materials, and in phytoremediation applications. Other biological research involves a detailed understanding of wood chemistry response to precipitation variations and also to forest fires. A cross-section of Mountain pine (Pinaceae: Pinus pungens Lamb.) was scanned using a translational stage to determine the differences in chemical features both before and after a fire event. Consequently, by monitoring the elemental composition pattern of a tree and looking for abrupt changes, one can reconstruct the disturbance history of a tree and a forest. Lastly, we have shown that multivariate analysis of the LIBS data is necessary to standardize the analysis and correlate it to other standard laboratory techniques. LIBS combined with multivariate statistical analysis is a very powerful technology that can be transferred from laboratory to field applications with ease.

  2. Sampling and Data Analysis for Environmental Microbiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murray, Christopher J.

    2001-06-01

    A brief review of the literature indicates the importance of statistical analysis in applied and environmental microbiology. Sampling designs are particularly important for successful studies, and it is highly recommended that researchers review their sampling design before heading to the laboratory or the field. Most statisticians have numerous stories of scientists who approached them after their study was complete, only to have to tell them that the data they gathered could not be used to test the hypothesis they wanted to address. Once the data are gathered, a large and complex body of statistical techniques is available for analysis of the data. Those methods include both numerical and graphical techniques for exploratory characterization of the data. Hypothesis testing and analysis of variance (ANOVA) are techniques that can be used to compare the mean and variance of two or more groups of samples. Regression can be used to examine the relationships between sets of variables and is often used to examine the dependence of microbiological populations on environmental parameters. Multivariate statistics provides several methods that can be used for interpretation of datasets with a large number of variables and to partition samples into similar groups, a task that is very common in taxonomy but also has applications in other fields of microbiology. Geostatistics and other techniques have been used to examine the spatial distribution of microorganisms. The objectives of this chapter are to provide a brief survey of some of the statistical techniques that can be used for sample design and data analysis of microbiological data in environmental studies, and to provide some examples of their use from the literature.
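    As a small worked example of the ANOVA mentioned above, the F statistic for comparing group means can be computed directly from between- and within-group sums of squares; the site counts below are invented.

```python
import statistics

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA: ratio of between-group to
    within-group mean squares."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = statistics.fmean([x for g in groups for x in g])
    ss_between = sum(len(g) * (statistics.fmean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - statistics.fmean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical CFU counts (log10) from three soil sampling sites.
sites = [
    [5.1, 5.3, 5.0, 5.2, 5.4],
    [5.9, 6.1, 6.0, 5.8, 6.2],
    [5.2, 5.1, 5.3, 5.0, 5.2],
]
print(f"F = {one_way_anova_f(sites):.1f}")  # compare with F-crit for (2, 12) df
```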

  3. Learning Outcomes in a Laboratory Environment vs. Classroom for Statistics Instruction: An Alternative Approach Using Statistical Software

    ERIC Educational Resources Information Center

    McCulloch, Ryan Sterling

    2017-01-01

    The role of any statistics course is to increase the understanding and comprehension of statistical concepts and those goals can be achieved via both theoretical instruction and statistical software training. However, many introductory courses either forego advanced software usage, or leave its use to the student as a peripheral activity. The…

  4. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform

    DTIC Science & Technology

    2018-01-01

    ARL-TR-8270 ● JAN 2018 ● US Army Research Laboratory ● An Automated Energy Detection Algorithm Based on Morphological Filter… "…collected data. These statistical techniques are under the area of descriptive statistics, which is a methodology to condense the data in quantitative…"

  5. What's to Be Done About Laboratory Quality? Process Indicators, Laboratory Stewardship, the Outcomes Problem, Risk Assessment, and Economic Value: Responding to Contemporary Global Challenges.

    PubMed

    Meier, Frederick A; Badrick, Tony C; Sikaris, Kenneth A

    2018-02-17

    For 50 years, structure, process, and outcomes measures have assessed health care quality. For clinical laboratories, structural quality has generally been assessed by inspection. For assessing process, quality indicators (QIs), statistical monitors of steps in the clinical laboratory total testing process, have proliferated across the globe. Connections between structural and process laboratory measures and patient outcomes, however, have rarely been demonstrated. To inform further development of clinical laboratory quality systems, we conducted a selective but worldwide review of publications on clinical laboratory quality assessment. Some QIs, like the seven generic College of American Pathologists Q-Tracks monitors, have demonstrated significant process improvement; other measures have uncovered critical opportunities to improve test selection and result management. The Royal College of Pathologists of Australasia Key Incident Monitoring and Management System has deployed risk calculations, introduced from failure mode effects analysis, as surrogate measures for outcomes. Showing economic value from clinical laboratory testing quality is a challenge. Clinical laboratories should converge on fewer (7-14) rather than more (21-35) process monitors; monitors should cover all steps of the testing process under laboratory control and should especially include high-risk specimen-quality QIs. Clinical laboratory stewardship, the combination of educational interventions among clinician test orderers and report consumers with revision of test order formats and result reporting schemes, improves test ordering, but improving result reception is more difficult. Risk calculation reorders the importance of quality monitors by balancing three probabilities: defect frequency, weight of potential harm, and detection difficulty.
The triple approach of (1) a more focused suite of generic consensus quality indicators, (2) more active clinical laboratory testing stewardship, and (3) integration of formal risk assessment, rather than competing with economic value, enhances it.
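    The three-probability risk balance described above follows the failure-mode-effects-analysis pattern; a minimal sketch, with invented monitor names and 1-10 scales:

```python
# FMEA-style risk scoring of quality monitors, sketching the idea of
# reordering QIs by frequency, harm, and detection difficulty.
# Monitor names and scores are invented for illustration.
monitors = {
    # (defect frequency 1-10, potential harm 1-10, detection difficulty 1-10)
    "wrong-blood-in-tube": (2, 10, 8),
    "hemolyzed specimen": (7, 4, 2),
    "delayed critical-value call": (3, 9, 5),
    "mislabeled aliquot": (4, 8, 7),
}

def risk_priority(scores):
    frequency, harm, detectability = scores
    return frequency * harm * detectability  # classic RPN product

ranked = sorted(monitors, key=lambda m: risk_priority(monitors[m]), reverse=True)
for name in ranked:
    print(f"{risk_priority(monitors[name]):4d}  {name}")
```

A frequent but benign and easily caught defect thus ranks below a rare defect that is harmful and hard to detect, which is exactly the reordering the abstract describes.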

  6. Multivariate Statistical Analysis of Orthogonal Mass Spectral Data for the Identification of Chemical Attribution Signatures of 3-Methylfentanyl

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayer, B. P.; Valdez, C. A.; DeHope, A. J.

    Critical to many modern forensic investigations is the chemical attribution of the origin of an illegal drug. This process relies heavily on the identification of compounds indicative of its clandestine or commercial production. The results of these studies can yield detailed information on the method of manufacture, the sophistication of the synthesis operation, the starting material source, and the final product. In the present work, chemical attribution signatures (CAS) associated with the synthesis of the analgesic 3-methylfentanyl, N-(3-methyl-1-phenethylpiperidin-4-yl)-N-phenylpropanamide, were investigated. Six synthesis methods were studied in an effort to identify and classify route-specific signatures. These methods were chosen to minimize the use of scheduled precursors, complicated laboratory equipment, the number of overall steps, and demanding reaction conditions. Using gas and liquid chromatographies combined with mass spectrometric methods (GC-QTOF and LC-QTOF) in conjunction with inductively coupled plasma mass spectrometry (ICP-MS), over 240 distinct compounds and elements were monitored. As seen in our previous work with CAS of fentanyl synthesis, the complexity of the resultant data matrix necessitated the use of multivariate statistical analysis. Using partial least squares discriminant analysis (PLS-DA), 62 statistically significant, route-specific CAS were identified. Statistical classification models using a variety of machine learning techniques were then developed with the ability to predict the method of 3-methylfentanyl synthesis from three blind crude samples generated by synthetic chemists without prior experience with these methods.

  7. An experimental test of an extended discretion approach for high school biology laboratory investigations

    NASA Astrophysics Data System (ADS)

    Leonard, William H.; Cavana, Gordon R.; Lowery, Lawrence F.

    Discretion, the exercise of independent judgment, was observed to be lacking in most commercially available laboratory investigations for high school biology. An Extended Discretion (ED) laboratory approach was developed and tested experimentally against the BSCS Green Version laboratory program, using ten classes of 10th-grade biology in a suburban California high school. Five teachers were each assigned one experimental and one control group. The primary differences between the two approaches were that the BSCS was more prescriptive and directive than the ED approach, and that the ED approach increased discretionary demands upon the student over the school year. A treatment verification procedure showed statistically significant differences between the two approaches. The hypothesis under test was that when high school biology students are taught laboratory concepts under comparatively high discretionary demands, they perform as well as or better than a similar group of students taught with BSCS Green Version investigations. A second hypothesis was that teachers would prefer the ED approach over the BSCS approach for their future classes. A t-test between experimental and control groups for each teacher was employed. There were significant differences in favor of the ED group on laboratory report scores for three teachers and no differences for two teachers. There were significant differences in favor of the ED group on laboratory concepts quiz scores for three teachers, no differences for one teacher, and significant differences in favor of the BSCS group for only one teacher. A t-test of teacher evaluations of the two approaches showed a significant overall teacher preference for the ED approach. Both experimental hypotheses were accepted.
The ED approach was observed to be difficult for students at first, but it was found to be a workable and productive means of teaching laboratory concepts in biology which also required extensive use of individual student discretion.

  8. Tank 241-T-204, core 188 analytical results for the final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nuzum, J.L.

    TANK 241-T-204, CORE 188, ANALYTICAL RESULTS FOR THE FINAL REPORT. This document is the final laboratory report for Tank 241-T-204. Push mode core segments were removed from Riser 3 between March 27, 1997, and April 11, 1997. Segments were received and extruded at the 222-S Laboratory. Analyses were performed in accordance with the Tank 241-T-204 Push Mode Core Sampling and Analysis Plan (TSAP) (Winkleman, 1997), the Letter of Instruction for Core Sample Analysis of Tanks 241-T-201, 241-T-202, 241-T-203, and 241-T-204 (LOI) (Bell, 1997), and the Safety Screening Data Quality Objective (DQO) (Dukelow et al., 1995). None of the subsamples submitted for total alpha activity or differential scanning calorimetry (DSC) analyses exceeded the notification limits stated in the DQO. The statistical results of the 95% confidence interval on the mean calculations are provided by the Tank Waste Remediation System Technical Basis Group and are not considered in this report.

  9. New NIST sediment SRM for inorganic analysis.

    PubMed

    Zeisler, Rolf

    2004-03-01

    NIST maintains a portfolio of more than 1300 Standard Reference Materials (SRMs), more than a third of which relate to measurements in the biological and environmental fields. As part of the continuous renewal and replacement efforts, a set of new marine sediments has recently been developed covering organic and inorganic determinations. This paper describes the steps taken in sample preparation, homogeneity assay, and analytical characterization and certification, with specific emphasis on SRM 2702 inorganics in marine sediment. Neutron activation analysis showed the SRM to be highly homogeneous, opening the possibility for use with solid sampling techniques. The certificate provides certified mass fraction values for 25 elements, reference values for eight elements, and information values for 11 elements, covering most of the priority pollutants with relative uncertainties of only a few percent. The values were obtained by combining results from different laboratories and techniques using a Bayesian statistical model. An intercomparison carried out in field laboratories with the material before certification illustrates the high commutability of this SRM.

  10. Regional and Temporal Variation in Methamphetamine-Related Incidents: Applications of Spatial and Temporal Scan Statistics

    PubMed Central

    Sudakin, Daniel L.

    2009-01-01

    Introduction: This investigation utilized spatial scan statistics, geographic information systems and multiple data sources to assess spatial clustering of statewide methamphetamine-related incidents. Temporal and spatial associations with regulatory interventions to reduce access to precursor chemicals (pseudoephedrine) were also explored. Methods: Four statewide data sources were utilized, including regional poison control center statistics, fatality incidents, methamphetamine laboratory seizures, and hazardous substance releases involving methamphetamine laboratories. Spatial clustering of methamphetamine incidents was assessed using SaTScan™. SaTScan™ was also utilized to assess space-time clustering of methamphetamine laboratory incidents in relation to the enactment of regulations to reduce access to pseudoephedrine. Results: Five counties with a significantly higher relative risk of methamphetamine-related incidents were identified. The county identified as the most likely cluster had a significantly elevated relative risk of methamphetamine laboratories (RR=11.5), hazardous substance releases (RR=8.3), and fatalities relating to methamphetamine (RR=1.4). A significant increase in the relative risk of methamphetamine laboratory incidents was apparent in this same geographic area (RR=20.7) during the period when regulations restricting access to pseudoephedrine were enacted in 2004 and 2005. Subsequent to the enactment of these regulations, a significantly lower rate of incidents (RR=0.111, p=0.0001) was observed over a large geographic area of the state, including regions that previously had significantly higher rates. Conclusions: Spatial and temporal scan statistics can be effectively applied to multiple data sources to assess regional variation in methamphetamine-related incidents and to explore the impact of preventive regulatory interventions. PMID:19225949
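    The cluster detection described above can be illustrated with a minimal sketch of Kulldorff's Poisson scan statistic. SaTScan itself searches over many circular space-time windows; this toy version (function names and data are hypothetical, not from the study) scores single-region windows and estimates a p-value by Monte Carlo randomisation:

    ```python
    import math
    import random

    def poisson_llr(c, e, C):
        """Kulldorff-style Poisson log-likelihood ratio for a candidate cluster
        with observed count c, expected count e, and overall total count C."""
        if c == 0 or c / e <= (C - c) / (C - e):
            return 0.0  # window is not elevated relative to the rest
        if c == C:
            return c * math.log(c / e)
        return c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))

    def most_likely_cluster(counts, expected, n_sims=999, seed=0):
        """Find the most elevated single-region window and estimate its
        p-value by multinomial randomisation under the null."""
        C = sum(counts)
        E = sum(expected)
        expected = [e * C / E for e in expected]  # rescale so totals match
        llrs = [poisson_llr(c, e, C) for c, e in zip(counts, expected)]
        best = max(range(len(counts)), key=lambda i: llrs[i])
        probs = [e / C for e in expected]
        rng = random.Random(seed)
        exceed = 0
        for _ in range(n_sims):
            draws = rng.choices(range(len(counts)), weights=probs, k=C)
            sim = [draws.count(i) for i in range(len(counts))]
            sim_best = max(poisson_llr(c, e, C) for c, e in zip(sim, expected))
            if sim_best >= llrs[best]:
                exceed += 1
        p_value = (exceed + 1) / (n_sims + 1)
        rr = (counts[best] / expected[best]) / ((C - counts[best]) / (C - expected[best]))
        return best, rr, p_value
    ```

    A region with 50 observed versus 14 expected incidents among five regions would be flagged as the most likely cluster with a large relative risk and a small Monte Carlo p-value, mirroring the RR and p values reported in the abstract.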

  11. Normative Bilateral Brainstem Evoked Response Data for a Naval Aviation Student Population: Group Statistics

    DTIC Science & Technology

    1979-08-07

    ...lus levels of the present study all fall within the plus and minus one-standard-deviation boundary limits of the composite laboratory data plotted by... to be the case in the present study in that the amplitude of the contralateral response produced by a given stimulus level followed, in general, that... equivalent Gaussian distribution was applied to the study data. Such an analysis, performed by Thornton (36) on the latency and amplitude measurements

  12. Advanced Artificial Intelligence Technology Testbed

    NASA Technical Reports Server (NTRS)

    Anken, Craig S.

    1993-01-01

    The Advanced Artificial Intelligence Technology Testbed (AAITT) is a laboratory testbed for the design, analysis, integration, evaluation, and exercising of large-scale, complex software systems composed of both knowledge-based and conventional components. The AAITT assists its users in the following ways: configuring various problem-solving application suites; observing and measuring the behavior of these applications and the interactions between their constituent modules; gathering and analyzing statistics about the occurrence of key events; and flexibly and quickly altering the interaction of modules within the applications for further study.

  13. The MetabolomeExpress Project: enabling web-based processing, analysis and transparent dissemination of GC/MS metabolomics datasets.

    PubMed

    Carroll, Adam J; Badger, Murray R; Harvey Millar, A

    2010-07-14

    Standardization of analytical approaches and reporting methods via community-wide collaboration can work synergistically with web-tool development to result in rapid community-driven expansion of online data repositories suitable for data mining and meta-analysis. In metabolomics, the inter-laboratory reproducibility of gas chromatography/mass spectrometry (GC/MS) makes it an obvious target for such development. While a number of web-tools offer access to datasets and/or tools for raw data processing and statistical analysis, none of these systems is currently set up to act as a public repository by easily accepting, processing and presenting publicly submitted GC/MS metabolomics datasets for public re-analysis. Here, we present MetabolomeExpress, a new File Transfer Protocol (FTP) server and web-tool for the online storage, processing, visualisation and statistical re-analysis of publicly submitted GC/MS metabolomics datasets. Users may search a quality-controlled database of metabolite response statistics from publicly submitted datasets by a number of parameters (e.g. metabolite, species, organ/biofluid). Users may also perform meta-analysis comparisons of multiple independent experiments or re-analyse public primary datasets via user-friendly tools for t-test, principal components analysis, hierarchical cluster analysis and correlation analysis. They may interact with chromatograms, mass spectra and peak detection results via an integrated raw data viewer. Researchers who register for a free account may upload (via FTP) their own data to the server for online processing via a novel raw data processing pipeline. MetabolomeExpress (https://www.metabolome-express.org) provides a new opportunity for the general metabolomics community to transparently present online the raw and processed GC/MS data underlying their metabolomics publications. Transparent sharing of these data will allow researchers to assess data quality and draw their own insights from published metabolomics datasets.
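    The per-metabolite t-tests and PCA that such re-analysis tools offer can be sketched generically. This is not MetabolomeExpress code; it assumes a samples-by-metabolites response matrix and applies standard routines:

    ```python
    import numpy as np
    from scipy import stats

    def metabolite_ttests(control, treated):
        """Welch t-test per metabolite (columns) across two sample groups (rows)."""
        t, p = stats.ttest_ind(control, treated, axis=0, equal_var=False)
        return t, p

    def pca_scores(matrix, n_components=2):
        """Principal-component scores of a samples-by-metabolites matrix via SVD
        of the column-centred data."""
        centred = matrix - matrix.mean(axis=0)
        u, s, vt = np.linalg.svd(centred, full_matrices=False)
        return u[:, :n_components] * s[:n_components]
    ```

    A metabolite whose abundance shifts strongly between groups yields a very small p-value, while the PCA scores give the 2-D sample map typically plotted in such re-analyses.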

  14. Teacher Professional Development to Foster Authentic Student Research Experiences

    NASA Astrophysics Data System (ADS)

    Conn, K.; Iyengar, E.

    2004-12-01

    This presentation reports on a new teacher workshop design that encourages teachers to initiate and support long-term student-directed research projects in the classroom setting. Teachers were recruited and engaged in an intensive marine ecology learning experience at Shoals Marine Laboratory, Appledore Island, Maine. Part of the weeklong summer workshop was spent in field work, part in laboratory work, and part in learning experimental design and basic statistical analysis of experimental results. Teachers were presented with strategies to adapt their workshop learnings to formulate plans for initiating and managing authentic student research projects in their classrooms. The authors will report on the different considerations and constraints facing the teachers in their home school settings and teachers' progress in implementing their plans. Suggestions for replicating the workshop will be offered.

  15. Studies of annual and seasonal variations in four species of reptiles and amphibians at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, E.I.; Haarmann, T.; Keller, D.C.

    1998-11-01

    Baseline studies of reptiles and amphibians of the Pajarito wetlands at Los Alamos National Laboratory have been conducted by the Ecology Group since 1990. Using data gathered from 1990-1997 (excluding 1992), the authors plan to determine whether patterns can be found in the annual and seasonal population changes of four species of reptiles and amphibians over the past seven years. The four species studied are the Woodhouse toad, the western chorus frog, the many-lined skink, and the plateau striped whiptail lizard. Statistical analysis results show that significant changes occurred on a seasonal basis for the western chorus frog and the many-lined skink. Results indicate a significant difference in the annual population of the Woodhouse toad.

  16. Mathematical and Statistical Software Index. Final Report.

    ERIC Educational Resources Information Center

    Black, Doris E., Comp.

    Brief descriptions are provided of general-purpose mathematical and statistical software, including 27 "stand-alone" programs, three subroutine systems, and two nationally recognized statistical packages, which are available in the Air Force Human Resources Laboratory (AFHRL) software library. This index was created to enable researchers…

  17. Estimating Fault Friction From Seismic Signals in the Laboratory

    DOE PAGES

    Rouet-Leduc, Bertrand; Hulbert, Claudia; Bolton, David C.; ...

    2018-01-29

    Nearly all aspects of earthquake rupture are controlled by the friction along the fault, which progressively increases with tectonic forcing but in general cannot be directly measured. We show that fault friction can be determined at any time from the continuous seismic signal. In a classic laboratory experiment of repeating earthquakes, we find that the seismic signal follows a specific pattern with respect to fault friction, allowing us to determine the fault's position within its failure cycle. Using machine learning, we show that instantaneous statistical characteristics of the seismic signal are a fingerprint of the fault zone shear stress and frictional state. Further analysis of this fingerprint leads to a simple equation of state quantitatively relating the seismic signal power and the friction on the fault. Finally, these results show that fault zone frictional characteristics and the state of stress in the surroundings of the fault can be inferred from seismic waves, at least in the laboratory.
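    The workflow described, mapping statistical characteristics of a continuous signal to friction with machine learning, might be sketched as follows. The signal here is synthetic and the feature set is an assumption for illustration, not the authors' actual pipeline:

    ```python
    import numpy as np
    from scipy.stats import kurtosis, skew
    from sklearn.ensemble import RandomForestRegressor

    def window_features(signal, win):
        """Statistical summaries of non-overlapping windows of a continuous signal."""
        n = len(signal) // win
        wins = signal[: n * win].reshape(n, win)
        return np.column_stack([
            wins.var(axis=1),                    # signal power tracks friction
            kurtosis(wins, axis=1),
            skew(wins, axis=1),
            np.percentile(wins, 90, axis=1),
        ])

    # Synthetic stand-in: a "failure cycle" of friction values, and a noise
    # signal whose amplitude scales with friction (a toy assumption).
    rng = np.random.default_rng(1)
    friction = np.linspace(0.4, 0.7, 200)
    signal = np.repeat(friction, 100) * rng.normal(0, 1, 200 * 100)

    X = window_features(signal, 100)
    model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, friction)
    ```

    Because window variance grows with the friction-scaled amplitude, the regressor recovers the friction level from the signal statistics alone, which is the essence of the "fingerprint" result.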

  18. Estimating Fault Friction From Seismic Signals in the Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rouet-Leduc, Bertrand; Hulbert, Claudia; Bolton, David C.

    Nearly all aspects of earthquake rupture are controlled by the friction along the fault, which progressively increases with tectonic forcing but in general cannot be directly measured. We show that fault friction can be determined at any time from the continuous seismic signal. In a classic laboratory experiment of repeating earthquakes, we find that the seismic signal follows a specific pattern with respect to fault friction, allowing us to determine the fault's position within its failure cycle. Using machine learning, we show that instantaneous statistical characteristics of the seismic signal are a fingerprint of the fault zone shear stress and frictional state. Further analysis of this fingerprint leads to a simple equation of state quantitatively relating the seismic signal power and the friction on the fault. Finally, these results show that fault zone frictional characteristics and the state of stress in the surroundings of the fault can be inferred from seismic waves, at least in the laboratory.

  19. Patients' rights in laboratory examinations: do they realize?

    PubMed

    Leino-Kilpi, H; Nyrhinen, T; Katajisto, J

    1997-11-01

    This article discusses the rights of patients who are attending hospital for the most common laboratory examinations and who may also be taking part in research studies. A distinction is made between five kinds of rights: protection of privacy, physical integrity, mental integrity, information and self-determination. The data were collected (n = 204) by means of a structured questionnaire specifically developed for this study in the clinical chemistry, haematological, physiological and neurophysiological laboratories of one randomly selected university hospital in Finland. The data were analysed statistically. On the whole, patients' rights were realized reasonably well. This was most particularly the case with protection of privacy, as well as with the rights of physical and mental integrity. The rights to information and self-determination were less well realized. There are various steps that health care professionals and organizations can take to make sure that patients can enjoy their full rights: counselling the patient, giving opportunities to plan the examinations in advance, and arranging a sufficient number of small examination rooms.

  20. Statistical analysis of measured free-space laser signal intensity over a 2.33 km optical path.

    PubMed

    Tunick, Arnold

    2007-10-17

    Experimental research is conducted to determine the characteristic behavior of high-frequency laser signal intensity data collected over a 2.33 km optical path. Results focus mainly on calculated power spectra and frequency distributions. In addition, a model is developed to calculate optical turbulence intensity (C_n^2) as a function of receiving and transmitting aperture diameter, log-amplitude variance, and path length. Initial comparisons of calculated to measured C_n^2 data are favorable. It is anticipated that this kind of signal data analysis will benefit laser communication systems development and testing at the U.S. Army Research Laboratory (ARL) and elsewhere.

  1. Study of Isospin Correlation in High Energy Heavy Ion Interactions with the RHIC PHENIX. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, Y.

    This report describes the research work performed under the support of the DOE research grant DE-FG02-97ER4108. The work is composed of three parts: (1) visual analysis and quality control of the Micro Vertex Detector (MVD) of the PHENIX experiments, carried out at Brookhaven National Laboratory; (2) continuation of the data analysis of the EMU05/09/16 experiments for the study of the inclusive particle production spectra and multi-particle correlations; (3) exploration of a new statistical means to study very high-multiplicity nuclear-particle ensembles and its prospects for application to higher-energy experiments.

  2. An Evaluation of CPRA (Cost Performance Report Analysis) Estimate at Completion Techniques Based Upon AFWAL (Air Force Wright Aeronautical Laboratories) Cost/Schedule Control System Criteria Data

    DTIC Science & Technology

    1985-09-01

    4 C/SCSC Terms and Definitions ... 5 Cost Performance Report Analysis (CPRA) Program 6 Description of CPRA Terms and Formulas... The hypotheses are: C1: σ1² = σ2²; C2: σ1² ≠ σ2². The test statistic is then calculated as: F* = [SSE1 / (n1 - 2)] / [SSE2 / (n2 - 2)]. The critical F value is: F(α, n1... 353.90767 SIGNIF F = .0000 ... TABLE B.4, General Linear Test for EAC1 and EAC5 (mean 827534.056, std dev 1202737.882, 1630 cases)

  3. Discriminant Analysis of Raman Spectra for Body Fluid Identification for Forensic Purposes

    PubMed Central

    Sikirzhytski, Vitali; Virkler, Kelly; Lednev, Igor K.

    2010-01-01

    Detection and identification of blood, semen and saliva stains, the most common body fluids encountered at a crime scene, are very important aspects of forensic science today. This study targets the development of a nondestructive, confirmatory method for body fluid identification based on Raman spectroscopy coupled with advanced statistical analysis. Dry traces of blood, semen and saliva obtained from multiple donors were probed using a confocal Raman microscope with a 785-nm excitation wavelength under controlled laboratory conditions. Results demonstrated the capability of Raman spectroscopy to identify an unknown substance to be semen, blood or saliva with high confidence. PMID:22319277
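    The pipeline described, spectra classified by advanced statistics, can be sketched with a generic linear discriminant analysis. The "spectra" below are synthetic Gaussian-peak stand-ins, not the authors' Raman data, and the class labels are hypothetical:

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_per_class, n_channels = 30, 50

    def make_class(peak):
        """Synthetic spectra: one characteristic peak position per class."""
        base = np.exp(-0.5 * ((np.arange(n_channels) - peak) / 3.0) ** 2)
        return base + rng.normal(0, 0.1, (n_per_class, n_channels))

    # Three body-fluid stand-ins, each with its own spectral signature.
    X = np.vstack([make_class(p) for p in (10, 25, 40)])
    y = np.repeat([0, 1, 2], n_per_class)

    lda = LinearDiscriminantAnalysis()
    scores = cross_val_score(lda, X, y, cv=5)  # cross-validated accuracy
    ```

    When the classes carry distinct spectral signatures, cross-validated accuracy approaches 100%, which is the "high confidence" identification the abstract reports for real blood, semen and saliva spectra.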

  4. Gait patterns for crime fighting: statistical evaluation

    NASA Astrophysics Data System (ADS)

    Sulovská, Kateřina; Bělašková, Silvie; Adámek, Milan

    2013-10-01

    Criminality has been omnipresent throughout human history. Modern technology brings novel opportunities for identifying a perpetrator. One of these opportunities is the analysis of video recordings, which may be taken during the crime itself or before/after it. Video analysis can be classed as an identification analysis, i.e. identification of a person via external characteristics. Bipedal locomotion analysis focuses on human movement on the basis of anatomical-physiological features. Nowadays, human gait is tested by many laboratories to learn whether identification via bipedal locomotion is possible. The aim of our study is to use 2D components out of 3D data from the VICON Mocap system for deep statistical analyses. This paper introduces recent results of a fundamental study focused on various gait patterns under different conditions. The study contains data from 12 participants. Curves obtained from these measurements were sorted, averaged and statistically tested to estimate the stability and distinctiveness of this biometric. Results show satisfactory distinctness of some chosen points, while others do not show significant differences. However, the results presented in this paper are from the initial phase of further, deeper and more exacting analyses of gait patterns under different conditions.

  5. A European inter-laboratory validation of alternative endpoints of the murine local lymph node assay: 2nd round.

    PubMed

    Ehling, G; Hecht, M; Heusener, A; Huesler, J; Gamer, A O; van Loveren, H; Maurer, Th; Riecke, K; Ullmann, L; Ulrich, P; Vandebriel, R; Vohr, H-W

    2005-08-15

    The original local lymph node assay (LLNA) is based on the use of radioactive labelling to measure cell proliferation. Other endpoints for the assessment of proliferation are also authorized by OECD Guideline 429, provided there is appropriate scientific support, including full citations and description of the methodology (OECD, 2002. OECD Guideline for the Testing of Chemicals; Skin Sensitization: Local Lymph Node Assay, Guideline 429. Paris, adopted 24th April 2002). Here, we describe the outcome of the second round of an inter-laboratory validation of alternative endpoints in the LLNA conducted in nine laboratories in Europe. The validation study was managed and supervised by the Swiss Agency for Therapeutic Products (Swissmedic) in Bern. Ear-draining lymph node (LN) weight and cell counts were used to assess LN cell proliferation instead of [3H]TdR incorporation. In addition, the acute inflammatory skin reaction was measured by ear weight determination of circular biopsies of the ears to identify skin irritation properties of the test items. The statistical analysis was performed in the Department of Statistics at the University of Bern. Similar to the EC3 values defined for the radioactive method, threshold values were calculated for the endpoints measured in this modification of the LLNA. It was concluded that all parameters measured have to be taken into consideration for the categorisation of compounds according to their sensitising potencies. Therefore, an assessment scheme was developed which turned out to be of great importance for consistently assessing sensitisation versus irritancy based on the data of the different parameters. In contrast to the radioactive method, irritants were picked up by all the laboratories applying this assessment scheme.

  6. A meta-analytic review of the impact of intranasal oxytocin administration on cortisol concentrations during laboratory tasks: moderation by method and mental health.

    PubMed

    Cardoso, Christopher; Kingdon, Danielle; Ellenbogen, Mark A

    2014-11-01

    A large body of research has examined the acute effects of intranasal oxytocin administration on social cognition and stress-regulation. While progress has been made with respect to understanding the effect of oxytocin administration on social cognition in clinical populations (e.g. autism, schizophrenia), less is known about its impact on the functioning of the hypothalamic-pituitary-adrenal (HPA) axis among individuals with a mental disorder. We conducted a meta-analysis on the acute effect of intranasal oxytocin administration on the cortisol response to laboratory tasks. The search yielded eighteen studies employing a randomized, placebo-controlled design (k=18, N=675). Random-effects models and moderator analyses were performed using the metafor package for the statistical program R. The overall effect size estimate was modest and not statistically significant (Hedges g=-0.151, p=0.11) with moderate heterogeneity in this effect across studies (I(2)=31%). Controlling for baseline differences in cortisol concentrations, moderation analyses revealed that this effect was larger in response to challenging laboratory tasks that produced a robust stimulation of the HPA-axis (Hedges g=-0.433, 95% CI[-0.841, -0.025]), and in clinical populations relative to healthy controls (Hedges g=-0.742, 95% CI[-1.405, -0.078]). Overall, oxytocin administration showed greater attenuation of the cortisol response to laboratory tasks that strongly activated the HPA-axis, relative to tasks that did not. The effect was more robust among clinical populations, suggesting possible increased sensitivity to oxytocin among those with a clinical diagnosis and concomitant social difficulties. These data support the view that oxytocin may play an important role in HPA dysfunction associated with psychopathology. Copyright © 2014 Elsevier Ltd. All rights reserved.
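    The study fitted its random-effects models with R's metafor package. As a language-neutral illustration of the same idea, a minimal DerSimonian-Laird random-effects estimator (the classic method; not necessarily the exact estimator metafor applied here) can be written as:

    ```python
    import numpy as np

    def dersimonian_laird(effects, variances):
        """Random-effects pooled estimate (DerSimonian-Laird) from per-study
        effect sizes (e.g. Hedges g) and their sampling variances."""
        effects = np.asarray(effects, float)
        v = np.asarray(variances, float)
        w = 1.0 / v                                   # fixed-effect weights
        fixed = np.sum(w * effects) / np.sum(w)
        q = np.sum(w * (effects - fixed) ** 2)        # Cochran's Q
        df = len(effects) - 1
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)                 # between-study variance
        w_star = 1.0 / (v + tau2)                     # random-effects weights
        pooled = np.sum(w_star * effects) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # heterogeneity %
        return pooled, se, tau2, i2
    ```

    With perfectly homogeneous studies the between-study variance tau² collapses to zero and the pooled estimate reduces to the inverse-variance fixed-effect mean; heterogeneity across studies (the I² ≈ 31% reported above) inflates tau² and widens the pooled standard error.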

  7. Evaluation on the use of cerium in the NBL Titrimetric Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zebrowski, J.P.; Orlowicz, G.J.; Johnson, K.D.

    An alternative to potassium dichromate as the titrant in the New Brunswick Laboratory Titrimetric Method for uranium analysis was sought, since chromium in the waste makes disposal difficult. Substitution of a ceric-based titrant was statistically evaluated. Analysis of the data indicated statistically equivalent precisions for the two methods, but a significant overall bias of +0.035% for the ceric titrant procedure. The cause of the bias was investigated, alterations to the procedure were made, and a second statistical study was performed. This second study revealed no statistically significant bias, nor any analyst-to-analyst variation in the ceric titration procedure. A statistically significant day-to-day variation was detected, but this was physically small (0.015%) and was only detected because of the within-day precision of the method. The standard deviation of the %RD for a single measurement was found to be 0.031%. A comparison with quality-control blind dichromate titration data again indicated similar overall precision. The effects of ten elements (Co, Ti, Cu, Ni, Na, Mg, Gd, Zn, Cd, and Cr) on the ceric titration's performance were determined; in previous work at NBL these impurities did not interfere with the potassium dichromate titrant. This study indicated similar results for the ceric titrant, with the exception of Ti. All the elements (excluding Ti and Cr) caused no statistically significant bias in uranium measurements at levels of 10 mg impurity per 20-40 mg uranium. The presence of Ti was found to cause a bias of -0.05%; this is attributed to the presence of sulfate ions, resulting in precipitation of titanium sulfate and occlusion of uranium. A negative bias of 0.012% was also statistically observed in the samples containing chromium impurities.

  8. Senior Computational Scientist | Center for Cancer Research

    Cancer.gov

    The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP), Basic Science Program, HLA Immunogenetics Section, under the leadership of Dr. Mary Carrington, studies the influence of human leukocyte antigens (HLA) and specific KIR/HLA genotypes on risk of and outcomes to infection, cancer, autoimmune disease, and maternal-fetal disease. Recent studies have focused on the impact of HLA gene expression in disease, the molecular mechanisms regulating expression levels, and the functional basis for the effect of differential expression on disease outcome. The lab's further focus is on the genetic basis for resistance/susceptibility to disease conferred by immunogenetic variation. KEY ROLES/RESPONSIBILITIES: The Senior Computational Scientist will provide research support to the CIP-BSP-HLA Immunogenetics Section, performing biostatistical design, analysis and reporting of research projects conducted in the lab. This individual will be involved in the implementation of statistical models and data preparation. The successful candidate should have:
    - 5 or more years of competent, innovative biostatistics/bioinformatics research experience beyond doctoral training
    - Considerable experience with statistical software, such as SAS, R and S-Plus
    - Sound knowledge and demonstrated experience of theoretical and applied statistics
    - The ability to write program code to analyze data using statistical analysis software
    - The ability to contribute to the interpretation and publication of research results

  9. Report on von Willebrand Disease in Malaysia

    PubMed Central

    Periayah, Mercy Halleluyah; Halim, Ahmad Sukari; Saad, Arman Zaharil Mat; Yaacob, Nik Soriani; Karim, Faraizah Abdul

    2016-01-01

    BACKGROUND: Von Willebrand disease (vWD) is an inherited hemostatic disorder that affects the hemostasis pathway. The worldwide prevalence of vWD is estimated to be 1% of the general population, but only 0.002% in Malaysia. AIM: Our present paper discloses the statistical counts on the number of vWD cases reported from 2011 to 2013. MATERIAL AND METHODS: This article is based on sociodemographic data, diagnoses and laboratory findings of vWD in Malaysia. A total of 92 patients were reported to have vWD in Malaysia from 2011 to 2013. RESULTS: Sociodemographic analysis revealed that 60% were females, 63% were of Malay ethnicity, 41.3% were in the 19-44 year age group and 15.2% were from Sabah, the East region having the highest registered number of vWD cases. In Malaysia, most patients are predominantly affected by vWD type 1 (77.2%). Factor VIII, von Willebrand factor antigen (vWF:Ag) and vWF collagen-binding (vWF:CB) were the strongest determinants in the laboratory profiles of vWD. CONCLUSION: This report has been prepared to provide a contribution from Malaysia by revealing the statistical counts on vWD from 2011-2013. PMID:27275342

  10. Soft Tissue Response to Titanium Abutments with Different Surface Treatment: Preliminary Histologic Report of a Randomized Controlled Trial.

    PubMed

    Canullo, Luigi; Dehner, Jan Friedrich; Penarrocha, David; Checchi, Vittorio; Mazzoni, Annalisa; Breschi, Lorenzo

    2016-01-01

    The aim of this preliminary prospective RCT was to histologically evaluate peri-implant soft tissues around titanium abutments treated using different cleaning methods. Sixteen patients were randomized into three groups: laboratory customized abutments underwent Plasma of Argon treatment (Plasma Group), laboratory customized abutments underwent cleaning by steam (Steam Group), and abutments were used as they came from industry (Control Group). Seven days after the second surgery, soft tissues around abutments were harvested. Samples were histologically analyzed. Soft tissues surrounding Plasma Group abutments predominantly showed diffuse chronic infiltrate, almost no acute infiltrate, with presence of few polymorphonuclear neutrophil granulocytes, and a diffuse presence of collagenization bands. Similarly, in Steam Group, the histological analysis showed a high variability of inflammatory expression factors. Tissues harvested from Control Group showed presence of few neutrophil granulocytes, moderate presence of lymphocytes, and diffuse collagenization bands in some sections, while they showed absence of acute infiltrate in 40% of sections. However, no statistical difference was found among the tested groups for each parameter (p > 0.05). Within the limit of the present study, results showed no statistically significant difference concerning inflammation and healing tendency between test and control groups.

  11. Quantifying Safety Margin Using the Risk-Informed Safety Margin Characterization (RISMC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Brunett, Acacia

    2015-04-26

    The Risk-Informed Safety Margin Characterization (RISMC), developed by Idaho National Laboratory as part of the Light-Water Reactor Sustainability Project, utilizes a probabilistic safety margin comparison between a load and a capacity distribution, rather than a deterministic comparison between two values, as is usually done in best-estimate plus uncertainty analyses. The goal is to determine the failure probability, or in other words, the probability of the system load equaling or exceeding the system capacity. While this method has been used in pilot studies, there has been little work investigating the statistical significance of the resulting failure probability. In particular, it is difficult to determine how many simulations are necessary to properly characterize the failure probability. This work uses classical (frequentist) statistics and confidence intervals to examine the impact on statistical accuracy when the number of simulations is varied. Two methods are proposed to establish confidence intervals related to the failure probability established using a RISMC analysis. The confidence interval provides information about the statistical accuracy of the method utilized to explore the uncertainty space, and offers a quantitative method to gauge the increase in statistical accuracy due to performing additional simulations.
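    One standard frequentist construction for this problem, a sketch, not necessarily one of the two methods the report proposes, is the exact Clopper-Pearson interval for a failure probability estimated as (observed failures) / (number of simulations):

    ```python
    from scipy.stats import beta

    def clopper_pearson(failures, n, conf=0.95):
        """Exact (Clopper-Pearson) confidence interval for a failure
        probability estimated from n Monte Carlo simulations."""
        alpha = 1.0 - conf
        lo = 0.0 if failures == 0 else beta.ppf(alpha / 2, failures, n - failures + 1)
        hi = 1.0 if failures == n else beta.ppf(1 - alpha / 2, failures + 1, n - failures)
        return lo, hi
    ```

    The interval width shrinks roughly as 1/sqrt(n), which quantifies the gain in statistical accuracy from running additional simulations; with zero observed failures it still yields a non-trivial upper bound on the failure probability.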

  12. The Current State of Sustainability in Bioscience Laboratories: A Statistical Examination of a UK Tertiary Institute

    ERIC Educational Resources Information Center

    Wright, Hazel A.; Ironside, Joseph E.; Gwynn-Jones, Dylan

    2008-01-01

    Purpose: This study aims to identify the current barriers to sustainability in the bioscience laboratory setting and to determine which mechanisms are likely to increase sustainable behaviours in this specialised environment. Design/methodology/approach: The study gathers qualitative data from a sample of laboratory researchers presently…

  13. Home Performance with ENERGY STAR: Utility Bill Analysis on Homes Participating in Austin Energy's Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belzer, D.; Mosey, G.; Plympton, P.

    2007-07-01

    Home Performance with ENERGY STAR (HPwES) is a jointly managed program of the U.S. Department of Energy (DOE) and the U.S. Environmental Protection Agency (EPA). This program focuses on improving energy efficiency in existing homes via a whole-house approach to assessing and improving a home's energy performance, and helping to protect the environment. As one of HPwES's local sponsors, Austin Energy's HPwES program offers a complete home energy analysis and a list of recommendations for efficiency improvements, along with cost estimates. To determine the benefits of this program, the National Renewable Energy Laboratory (NREL) collaborated with the Pacific Northwest National Laboratory (PNNL) to conduct a statistical analysis using energy consumption data of HPwES homes provided by Austin Energy. This report provides preliminary estimates of average savings per home from the HPwES Loan Program for the period 1998 through 2006. The results from this preliminary analysis suggest that the HPwES program sponsored by Austin Energy had a very significant impact on reducing average cooling electricity use for participating households. Overall, average savings were in the range of 25%-35%, and appear to be robust under various criteria for the number of households included in the analysis.

  14. Anthropometric measures in cardiovascular disease prediction: comparison of laboratory-based versus non-laboratory-based model.

    PubMed

    Dhana, Klodian; Ikram, M Arfan; Hofman, Albert; Franco, Oscar H; Kavousi, Maryam

    2015-03-01

    Body mass index (BMI) has been used to simplify cardiovascular risk prediction models by substituting total cholesterol and high-density lipoprotein cholesterol. In the elderly, the ability of BMI to predict cardiovascular disease (CVD) declines. We aimed to find the most predictive anthropometric measure for CVD risk to construct a non-laboratory-based model and to compare it with the model including laboratory measurements. The study included 2675 women and 1902 men aged 55-79 years from the prospective population-based Rotterdam Study. We used Cox proportional hazards regression analysis to evaluate the association of BMI, waist circumference, waist-to-hip ratio and a body shape index (ABSI) with CVD, including coronary heart disease and stroke. The performance of the laboratory-based and non-laboratory-based models was evaluated by studying discrimination, calibration, correlation and risk agreement. Among men, ABSI was the most informative measure associated with CVD, so ABSI was used to construct the non-laboratory-based model. Discrimination of the non-laboratory-based model did not differ from that of the laboratory-based model (c-statistic: 0.680 vs 0.683, p=0.71); both models were well calibrated (15.3% observed CVD risk vs 16.9% and 17.0% predicted CVD risks by the non-laboratory-based and laboratory-based models, respectively), and the Spearman rank correlation and agreement between the non-laboratory-based and laboratory-based models were 0.89 and 91.7%, respectively. Among women, none of the anthropometric measures was independently associated with CVD. Among middle-aged and elderly individuals, where the ability of BMI to predict CVD declines, a non-laboratory-based model based on ABSI could predict CVD risk in men as accurately as the laboratory-based model. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
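    The discrimination comparison above rests on the c-statistic. The paper works in a survival setting (Harrell's concordance for Cox models); a minimal sketch of the simpler binary-outcome case conveys the idea: the c-statistic is the fraction of case/non-case pairs in which the case received the higher predicted risk. The risks and outcomes below are hypothetical:

    ```python
    def c_statistic(risks, outcomes):
        """Concordance for a binary outcome: the fraction of case/non-case
        pairs in which the case received the higher predicted risk
        (ties count as 0.5)."""
        cases = [r for r, y in zip(risks, outcomes) if y == 1]
        controls = [r for r, y in zip(risks, outcomes) if y == 0]
        concordant = 0.0
        for case_risk in cases:
            for control_risk in controls:
                if case_risk > control_risk:
                    concordant += 1.0
                elif case_risk == control_risk:
                    concordant += 0.5
        return concordant / (len(cases) * len(controls))

    # Hypothetical predicted risks and observed events
    risks = [0.05, 0.10, 0.20, 0.40, 0.70]
    outcomes = [0, 0, 1, 0, 1]
    c = c_statistic(risks, outcomes)  # 5/6: one discordant pair (0.20 vs 0.40)
    ```

    A value of 0.5 means the model discriminates no better than chance, which is why the 0.680 vs 0.683 comparison in the abstract indicates essentially equal discrimination.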

  15. Aerosol in selected laboratories at Faculty of Mechanical Engineering, Opole University of Technology

    NASA Astrophysics Data System (ADS)

    Olszowski, Tomasz

    2017-10-01

    The paper contains the results of a study of the mass concentration of the dispersed aerosol fraction with aerodynamic diameters of up to 2.5 and 10 micrometers (PM2.5 and PM10). The study was conducted during classes, with students present, in two laboratories located at the Faculty of Mechanical Engineering, Opole University of Technology, as well as outdoors outside the building. It was demonstrated that the values of the mass concentration of PM2.5 and PM10 measured in the laboratories differ considerably from the levels measured in the ambient air in the outdoor areas surrounding the faculty building. It was concluded that the diversity of the PM2.5/PM10 ratio was greater in the laboratories. No direct correlation was established between the concentrations of the particular PM fractions in the two investigated environments. It was demonstrated that there is a statistically significant relation between the concentrations of PM2.5 and PM10 and the number of people present in the laboratory. The conducted cluster analysis led to the detection of dominant structures determining air quality parameters. For the analyzed case, endogenic factors are responsible for the aerosanitary condition. The study demonstrated that the evaluation of air quality needs to be performed individually for specific rooms.

  16. Bonneville Power Administration Communication Alarm Processor expert system:

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goeltz, R.; Purucker, S.; Tonn, B.

    This report describes the Communications Alarm Processor (CAP), a prototype expert system developed for the Bonneville Power Administration by Oak Ridge National Laboratory. The system is designed to receive and diagnose alarms from Bonneville's Microwave Communications System (MCS). The prototype encompasses one of seven branches of the communications network and a subset of alarm systems and alarm types from each system. The expert system employs a backward chaining approach to diagnosing alarms. Alarms are fed into the expert system directly from the communication system via RS232 ports and sophisticated alarm filtering and mailbox software. Alarm diagnoses are presented to operators for their review and concurrence before the diagnoses are archived. Statistical software is incorporated to allow analysis of archived data for report generation and maintenance studies. The delivered system resides on a Digital Equipment Corporation VAX 3200 workstation and utilizes Nexpert Object and SAS for the expert system and statistical analysis, respectively. 11 refs., 23 figs., 7 tabs.

  17. Increased frequencies of aberrant sperm as indicators of mutagenic damage in mice.

    PubMed

    Soares, E R; Sheridan, W; Haseman, J K; Segall, M

    1979-02-01

    We have tested the effects of TEM in 3 strains of mice using the sperm morphology assay. In addition, we have attempted to evaluate this test system with respect to experimental design, statistical problems and possible interlaboratory differences. Treatment with TEM results in significant increases in the percentage of abnormally shaped sperm. These increases are readily detectable in sperm treated during the spermatocyte and spermatogonial stages. Our data indicate possible problems associated with inter-laboratory variation in slide analysis. We have found that despite the introduction of such sources of variation, our data were consistent with respect to the effects of TEM. Another area of concern in the sperm morphology test is the presence of "outlier" animals. In our study, such animals comprised 4% of the total number of animals considered. Statistical analysis of the slides from these animals has shown that this problem can be dealt with and that, when recognized as such, "outliers" do not affect the outcome of the sperm morphology assay.

  18. [THE COMPARATIVE ANALYSIS OF INFORMATION VALUE OF MAIN CLINICAL CRITERIA USED TO DIAGNOSE OF BACTERIAL VAGINOSIS].

    PubMed

    Tsvetkova, A V; Murtazina, Z A; Markusheva, T V; Mavzutov, A R

    2015-05-01

    Bacterial vaginosis is one of the most frequent reasons for women to visit a gynecologist. Its diagnosis is predominantly based on the Amsel criteria (1983), whose objectivity is now increasingly disputed. Discharge from the mucous membrane of the posterolateral vaginal fornix was analyzed in 640 women with a clinical diagnosis of bacterial vaginosis. Light microscopy of mounts of the discharge confirmed the diagnosis of bacterial vaginosis in the laboratory in 100 (15.63%) women. Complaints of burning and unpleasant smell, and the Amsel criterion of detection of "key cells" against a background of pH > 4.5, were established as statistically significant for bacterial vaginosis. According to the study data, the presence of discharge alone is not a statistically reliable criterion for differentiating bacterial vaginosis from other inflammatory pathological conditions of the female reproductive tract. At the same time, detection of "key cells" in the mount reliably correlated with bacterial vaginosis.

  19. Evidence for social learning in wild lemurs (Lemur catta).

    PubMed

    Kendal, Rachel L; Custance, Deborah M; Kendal, Jeremy R; Vale, Gillian; Stoinski, Tara S; Rakotomalala, Nirina Lalaina; Rasamimanana, Hantanirina

    2010-08-01

    Interest in social learning has been fueled by claims of culture in wild animals. These remain controversial because alternative explanations to social learning, such as asocial learning or ecological differences, remain difficult to refute. Compared with laboratory-based research, the study of social learning in natural contexts is in its infancy. Here, for the first time, we apply two new statistical methods, option-bias analysis and network-based diffusion analysis, to data from the wild, complemented by standard inferential statistics. Contrary to common thought regarding the cognitive abilities of prosimian primates, our evidence is consistent with social learning within subgroups in the ring-tailed lemur (Lemur catta), supporting the theory of directed social learning (Coussi-Korbel & Fragaszy, 1995). We also caution that, as the toolbox for capturing social learning in natural contexts grows, care is required in ensuring that the methods employed are appropriate-in particular, regarding social dynamics among study subjects. Supplemental materials for this article may be downloaded from http://lb.psychonomic-journals.org/content/supplemental.

  20. Computer Aided Statistics Instruction Protocol (CASIP) Restructuring Undergraduate Statistics in Psychology: An Integration of Computers into Instruction and Evaluation.

    ERIC Educational Resources Information Center

    Rah, Ki-Young; Scuello, Michael

    As a result of the development of two computer statistics laboratories in the psychology department at New York's Brooklyn College, a project was undertaken to develop and implement computer program modules in undergraduate and graduate statistics courses. Rather than use the technology to merely make course presentations more exciting, the…

  1. Affirmative Action Plans, January 1, 1994--December 31, 1994. Revision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-02-16

    This document is the Affirmative Action Plan for January 1, 1994 through December 31, 1994 for the Lawrence Berkeley Laboratory, University of California ("LBL" or "the Laboratory"). This is an official document that will be presented upon request to the Office of Federal Contract Compliance Programs, US Department of Labor. The plan is prepared in accordance with Executive Order 11246 and 41 CFR Section 60-1 et seq. covering equal employment opportunity and will be updated during the year, if appropriate. Analyses included in this volume as required by government regulations are based on statistical comparisons. All statistical comparisons involve the use of geographic areas and various sources of statistics. The geographic areas and sources of statistics used here are in compliance with the government regulations, as interpreted. The use of any geographic area or statistic does not indicate agreement that the geographic area is the most appropriate or that the statistic is the most relevant. The use of such geographic areas and statistics is intended to have no significance outside the context of this Affirmative Action Plan, although, of course, such statistics and geographic areas will be used in good faith with respect to this Affirmative Action Plan.

  2. Assay of potency of the proposed Fifth International Standard for Gas-Gangrene Antitoxin (Perfringens)

    PubMed Central

    Prigge, R.; Micke, H.; Krüger, J.

    1963-01-01

    As part of a collaborative assay of the proposed Fifth International Standard for Gas-Gangrene Antitoxin (Perfringens), five ampoules of the proposed replacement material were assayed in the authors' laboratory against the then current Fourth International Standard. Both in vitro and in vivo methods were used. This paper presents the results and their statistical analysis. The two methods yielded different results which were not likely to have been due to chance, but exact statistical comparison is not possible. It is thought, however, that the differences may be due, at least in part, to differences in the relative proportions of zeta-antitoxin and alpha-antitoxin in the Fourth and Fifth International Standards and the consequent different reactions with the test toxin that was used for titration. PMID:14107746

  3. Assuring the Quality of Test Results in the Field of Nuclear Techniques and Ionizing Radiation. The Practical Implementation of Section 5.9 of the EN ISO/IEC 17025 Standard

    NASA Astrophysics Data System (ADS)

    Cucu, Daniela; Woods, Mike

    2008-08-01

    The paper aims to present a practical approach for testing laboratories to ensure the quality of their test results. It is based on the experience gained in assessing a large number of testing laboratories, discussing with management and staff, reviewing results obtained in national and international PTs and ILCs, and exchanging information in the EA laboratory committee. According to EN ISO/IEC 17025, an accredited laboratory has to implement a programme to ensure the quality of its test results for each measurand. Pre-analytical, analytical and post-analytical measures shall be applied in a systematic manner, and shall include both quality control and quality assurance measures. When designing the quality assurance programme, a laboratory should consider pre-analytical activities (such as personnel training, selection and validation of test methods, and equipment qualification), analytical activities ranging from sampling and sample preparation to instrumental analysis, and post-analytical activities (such as decoding, calculation, use of statistical tests or packages, and management of results). Designed on different levels (analyst, quality manager and technical manager) and including a variety of measures, the programme shall ensure the validity and accuracy of test results, confirm the adequacy of the management system, prove the laboratory's competence in performing tests under accreditation and, last but not least, show the comparability of test results. Laboratory management should establish performance targets and periodically review QC/QA results against them, implementing appropriate measures in case of non-compliance.

  4. The effect of restructuring student writing in the general chemistry laboratory on student understanding of chemistry and on students' approach to the laboratory course

    NASA Astrophysics Data System (ADS)

    Rudd, James Andrew, II

    Many students encounter difficulties engaging with laboratory-based instruction, and reviews of research have indicated that the value of such instruction is not clearly evident. Traditional forms of writing associated with laboratory activities are commonly in a style used by professional scientists to communicate developed explanations. Students probably lack the interpretative skills of a professional, and writing in this style may not support students in learning how to develop scientific explanations. The Science Writing Heuristic (SWH) is an inquiry-based approach to laboratory instruction designed in part to promote student ability in developing such explanations. However, there is not a convincing body of evidence for the superiority of inquiry-based laboratory instruction in chemistry. In a series of studies, the performance of students using the SWH student template in place of the standard laboratory report format was compared to the performance of students using the standard format. The standard reports had Title, Purpose, Procedure, Data & Observations, Calculations & Graphs, and Discussion sections. The SWH reports had Beginning Questions & Ideas, Tests & Procedures, Observations, Claims, Evidence, and Reflection sections. The pilot study produced evidence that using the SWH improved the quality of laboratory reports, improved student performance on a laboratory exam, and improved student approach to laboratory work. A main study found that SWH students exhibited a statistically better understanding of physical equilibrium, based on analysis of written explanations and equations on a lecture exam, and performed descriptively better on a physical equilibrium practical exam task.
In another main study, the activities covering the general equilibrium concept were restructured as an additional change, and it was found that SWH students exhibited a better understanding of chemical equilibrium as shown by statistically greater success in overcoming the common confusion of interpreting equilibrium as equal concentrations and by statistically better performance when explaining aspects of chemical equilibrium. Both main studies found that students and instructors spent less time on the SWH reports and that students preferred the SWH approach because it increased their level of mental engagement. The studies supported the conclusion that inquiry-based laboratory instruction benefits student learning and attitudes.

  5. Spectral characterization of V-type asteroids: are all the basaltic objects coming from Vesta?

    NASA Astrophysics Data System (ADS)

    Ieva, S.; Fulvio, D.; Dotto, E.; Lazzaro, D.; Perna, D.; Strazzulla, G.; Fulchignoni, M.

    In the last twenty-five years, several small basaltic V-type asteroids have been identified throughout the main belt. Most are members of the Vesta dynamical family, but an increasingly large number appear to have no link with it. The question that arises is whether all these basaltic objects do indeed come from Vesta. In the light of the Dawn mission, which visited Vesta in 2011-2012, recent works were dedicated to the observation of several new V-type asteroids and their comparison with laboratory data (Fulvio et al. 2015), and to a statistical analysis of the spectroscopic and mineralogical properties of the largest sample of V-types ever collected (Ieva et al. 2015), with the objective of highlighting similarities and differences among objects belonging and not belonging to the Vesta dynamical family. Laboratory experiments support the idea that V-type NEA spectral properties could be due to a balance of space weathering and rejuvenation processes triggered by close encounters with terrestrial planets. Statistical analysis shows that although most of the V-type asteroids in the inner main belt do have a surface composition compatible with Vesta family members, this seems not to be the case for V-types in the middle and outer main belt. For these Middle and Outer V-types (MOVs), their sizes, spectral parameters and location far from the Vesta dynamical region point to an origin other than Vesta.

  6. Inter-laboratory exercise on antibiotic drugs analysis in aqueous samples.

    PubMed

    Roig, B; Brogat, M; Mompelat, S; Leveque, J; Cadiere, A; Thomas, O

    2012-08-30

    An inter-laboratory exercise was organized under the PHARMAS EU project by the Advanced School of Public Health (EHESP), in order to evaluate the performance of analytical methods for the measurement of antibiotics in waters (surface and tap). This is the first time such an exercise on antibiotics has been organized in Europe using different kinds of analytical methods and devices. Thirteen laboratories from five countries (Canada, France, Italy, the Netherlands and Portugal) participated in the exercise, and a total of 78 samples were distributed. During the exercise, two testing samples (three bottles of each), prepared from tap water and river water respectively and spiked with antibiotics, were sent to participants and analyzed over a period of one month. In the end, 77 (98.7%) of the testing samples were considered. Depending on the substances studied by each participant, 305 values in duplicate were collected, with the results for each sample being expressed as the target concentration. A statistical study was conducted using 611 results. The mean value, standard deviation, coefficient of variation, standard uncertainty of the mean, median, the minimum and maximum values of each series, as well as the 95% confidence interval, were obtained for each participating laboratory. In this exercise, 36 results (6% of the counted values) were outliers according to a box-plot analysis of the distribution around the median, and were excluded. In order to establish the stability of the testing samples over the course of the exercise, differences between variances obtained for every type of sample at different intervals were evaluated. The results showed no notable variations, and all samples can be considered to have been stable during the exercise. 
The goals of this inter-laboratory study were to assess the variability of results when analysis is conducted by different laboratories, to evaluate the influence of different sample matrices, and to determine the rate at which participating laboratories successfully completed the tests initiated. Copyright © 2012 Elsevier B.V. All rights reserved.
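    The per-laboratory summaries used in the exercise (mean, standard deviation, coefficient of variation, standard uncertainty of the mean, median, 95% confidence interval) and a box-plot outlier screen can be sketched as follows. The replicate values are hypothetical, and the fixed 1.96 multiplier assumes a large number of replicates (a Student-t quantile would be more appropriate for small n):

    ```python
    import math
    import statistics

    def lab_summary(values, t_factor=1.96):
        """Descriptive summary of one laboratory's replicate results."""
        n = len(values)
        mean = statistics.mean(values)
        sd = statistics.stdev(values)
        sem = sd / math.sqrt(n)  # standard uncertainty of the mean
        return {
            "mean": mean,
            "sd": sd,
            "cv_percent": 100 * sd / mean,
            "sem": sem,
            "median": statistics.median(values),
            "ci95": (mean - t_factor * sem, mean + t_factor * sem),
        }

    def tukey_outliers(values, k=1.5):
        """Flag values outside the box-plot fences (quartiles +/- k*IQR)."""
        q1, _, q3 = statistics.quantiles(values, n=4)
        iqr = q3 - q1
        low, high = q1 - k * iqr, q3 + k * iqr
        return [v for v in values if v < low or v > high]

    # Hypothetical replicate results (ng/L) reported for one spiked sample
    reported = [98.0, 101.5, 99.2, 102.3, 100.1, 97.8, 135.0, 100.6]
    summary = lab_summary(reported)
    outliers = tukey_outliers(reported)  # flags the 135.0 result
    ```

    Excluding box-plot outliers before pooling, as done in the exercise, keeps a single aberrant result from inflating a laboratory's apparent variability.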

  7. Avian malaria: a new lease of life for an old experimental model to study the evolutionary ecology of Plasmodium.

    PubMed

    Pigeault, Romain; Vézilier, Julien; Cornet, Stéphane; Zélé, Flore; Nicot, Antoine; Perret, Philippe; Gandon, Sylvain; Rivero, Ana

    2015-08-19

    Avian malaria has historically played an important role as a model in the study of human malaria, being a stimulus for the development of medical parasitology. Avian malaria has recently come back to the research scene as a unique animal model to understand the ecology and evolution of the disease, both in the field and in the laboratory. Avian malaria is highly prevalent in birds and mosquitoes around the world and is amenable to laboratory experimentation at each stage of the parasite's life cycle. Here, we take stock of 5 years of experimental laboratory research carried out using Plasmodium relictum SGS1, the most prevalent avian malaria lineage in Europe, and its natural vector, the mosquito Culex pipiens. For this purpose, we compile and analyse data obtained in our laboratory in 14 different experiments. We provide statistical relationships between different infection-related parameters, including parasitaemia, gametocytaemia, host morbidity (anaemia) and transmission rates to mosquitoes. This analysis provides a wide-ranging picture of the within-host and between-host parameters that may bear on malaria transmission and epidemiology. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  8. Reporting standards for Bland-Altman agreement analysis in laboratory research: a cross-sectional survey of current practice.

    PubMed

    Chhapola, Viswas; Kanwal, Sandeep Kumar; Brar, Rekha

    2015-05-01

    To carry out a cross-sectional survey of laboratory research papers published after 2012 and available in common search engines (PubMed, Google Scholar), assessing the quality of statistical reporting of method comparison studies using Bland-Altman (B-A) analysis. Fifty clinical studies were identified which had undertaken method comparison of laboratory analytes using B-A. The reporting of B-A was evaluated using a predesigned checklist with the following six items: (1) correct representation of the x-axis on the B-A plot, (2) representation and correct definition of the limits of agreement (LOA), (3) reporting of the confidence interval (CI) of the LOA, (4) comparison of the LOA with a priori defined clinical criteria, (5) evaluation of the pattern of the relationship between the difference (y-axis) and the average (x-axis) and (6) measures of repeatability. The x-axis and LOA were presented correctly in 94%, comparison with a priori clinical criteria in 74%, CI reporting in 6%, evaluation of pattern in 28% and repeatability assessment in 38% of studies. There is incomplete reporting of B-A in published clinical studies. Despite its simplicity, B-A appears not to be completely understood by researchers, reviewers and editors of journals. There appear to be differences in the reporting of B-A between laboratory medicine journals and other clinical journals. Uniform reporting of the B-A method will enhance the generalizability of results. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
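    Checklist items (2) and (3), the limits of agreement and their confidence intervals, follow directly from the paired differences. A minimal sketch with hypothetical paired measurements, using the widely cited approximation Var(LOA) ≈ 3·SD²/n for the interval around each limit:

    ```python
    import math
    import statistics

    def bland_altman(method_a, method_b, z=1.96):
        """Bland-Altman agreement: mean difference (bias), limits of
        agreement (bias +/- z*SD of the differences), and approximate
        95% CIs for each limit via Var(LOA) ~ 3*SD^2/n."""
        diffs = [a - b for a, b in zip(method_a, method_b)]
        n = len(diffs)
        bias = statistics.mean(diffs)
        sd = statistics.stdev(diffs)
        loa = (bias - z * sd, bias + z * sd)       # limits of agreement
        se_loa = math.sqrt(3 * sd * sd / n)        # SE of each limit
        ci = tuple((lim - z * se_loa, lim + z * se_loa) for lim in loa)
        return bias, loa, ci

    # Hypothetical paired measurements of one analyte by two methods
    a = [102, 98, 110, 95, 100, 105, 97, 103]
    b = [100, 99, 107, 96, 98, 106, 95, 101]
    bias, loa, ci = bland_altman(a, b)
    ```

    The survey's item (4) then asks whether the computed limits fall inside clinically acceptable bounds defined before the analysis, a step that no amount of computation can replace.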

  9. Antimicrobial susceptibility of Escherichia coli F4, Pasteurella multocida, and Streptococcus suis isolates from a diagnostic veterinary laboratory and recommendations for a surveillance system

    PubMed Central

    Glass-Kaastra, Shiona K.; Pearl, David L.; Reid-Smith, Richard J.; McEwen, Beverly; Slavic, Durda; McEwen, Scott A.; Fairles, Jim

    2014-01-01

    Antimicrobial susceptibility data on Escherichia coli F4, Pasteurella multocida, and Streptococcus suis isolates from Ontario swine (January 1998 to October 2010) were acquired from a comprehensive diagnostic veterinary laboratory in Ontario, Canada. In relation to the possible development of a surveillance system for antimicrobial resistance, data were assessed for ease of management, completeness, consistency, and applicability for temporal and spatial statistical analyses. Limited farm location data precluded spatial analyses and missing demographic data limited their use as predictors within multivariable statistical models. Changes in the standard panel of antimicrobials used for susceptibility testing reduced the number of antimicrobials available for temporal analyses. Data consistency and quality could improve over time in this and similar diagnostic laboratory settings by encouraging complete reporting with sample submission and by modifying database systems to limit free-text data entry. These changes could make more statistical methods available for disease surveillance and cluster detection. PMID:24688133

  10. Antimicrobial susceptibility of Escherichia coli F4, Pasteurella multocida, and Streptococcus suis isolates from a diagnostic veterinary laboratory and recommendations for a surveillance system.

    PubMed

    Glass-Kaastra, Shiona K; Pearl, David L; Reid-Smith, Richard J; McEwen, Beverly; Slavic, Durda; McEwen, Scott A; Fairles, Jim

    2014-04-01

    Antimicrobial susceptibility data on Escherichia coli F4, Pasteurella multocida, and Streptococcus suis isolates from Ontario swine (January 1998 to October 2010) were acquired from a comprehensive diagnostic veterinary laboratory in Ontario, Canada. In relation to the possible development of a surveillance system for antimicrobial resistance, data were assessed for ease of management, completeness, consistency, and applicability for temporal and spatial statistical analyses. Limited farm location data precluded spatial analyses and missing demographic data limited their use as predictors within multivariable statistical models. Changes in the standard panel of antimicrobials used for susceptibility testing reduced the number of antimicrobials available for temporal analyses. Data consistency and quality could improve over time in this and similar diagnostic laboratory settings by encouraging complete reporting with sample submission and by modifying database systems to limit free-text data entry. These changes could make more statistical methods available for disease surveillance and cluster detection.

  11. Selected quality assurance data for water samples collected by the US Geological Survey, Idaho National Engineering Laboratory, Idaho, 1980 to 1988

    USGS Publications Warehouse

    Wegner, S.J.

    1989-01-01

    Multiple water samples from 115 wells and 3 surface water sites were collected between 1980 and 1988 for the ongoing quality assurance program at the Idaho National Engineering Laboratory. The reported results from the six laboratories involved were analyzed for agreement using descriptive statistics. The constituents and properties included: tritium, plutonium-238, plutonium-239, -240 (undivided), strontium-90, americium-241, cesium-137, total dissolved chromium, selected dissolved trace metals, sodium, chloride, nitrate, selected purgeable organic compounds, and specific conductance. Agreement could not be calculated for purgeable organic compounds, trace metals, some nitrates and blank sample analyses because analytical uncertainties were not consistently reported. However, differences between results for most of these data were calculated. The blank samples were not analyzed for differences. The laboratory results analyzed using descriptive statistics showed a median agreement between all useable data pairs of 95%. (USGS)

  12. A Survey of the Practices, Procedures, and Techniques in Undergraduate Organic Chemistry Teaching Laboratories

    ERIC Educational Resources Information Center

    Martin, Christopher B.; Schmidt, Monica; Soniat, Michael

    2011-01-01

    A survey was conducted of four-year institutions that teach undergraduate organic chemistry laboratories in the United States. The data include results from over 130 schools, describe the current practices at these institutions, and discuss statistical results such as the scale of the laboratories performed, the chemical techniques applied,…

  13. Missing data treatments matter: an analysis of multiple imputation for anterior cervical discectomy and fusion procedures.

    PubMed

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Cui, Jonathan J; Basques, Bryce A; Albert, Todd J; Grauer, Jonathan N

    2018-04-09

    The presence of missing data is a limitation of large datasets, including the National Surgical Quality Improvement Program (NSQIP). In addressing this issue, most studies use complete case analysis, which excludes cases with missing data, thus potentially introducing selection bias. Multiple imputation, a statistically rigorous approach that approximates missing data and preserves sample size, may be an improvement over complete case analysis. The present study aims to evaluate the impact of using multiple imputation in comparison with complete case analysis for assessing the associations between preoperative laboratory values and adverse outcomes following anterior cervical discectomy and fusion (ACDF) procedures. This is a retrospective review of prospectively collected data. Patients undergoing one-level ACDF were identified in NSQIP 2012-2015. Perioperative adverse outcome variables assessed included the occurrence of any adverse event, severe adverse events, and hospital readmission. Missing preoperative albumin and hematocrit values were handled using complete case analysis and multiple imputation. These preoperative laboratory levels were then tested for associations with 30-day postoperative outcomes using logistic regression. A total of 11,999 patients were included. Of this cohort, 63.5% of patients had missing preoperative albumin and 9.9% had missing preoperative hematocrit. When using complete case analysis, only 4,311 patients were studied. The removed patients were significantly younger, healthier, of a common body mass index, and male. Logistic regression analysis failed to identify either preoperative hypoalbuminemia or preoperative anemia as significantly associated with adverse outcomes. When employing multiple imputation, all 11,999 patients were included. Preoperative hypoalbuminemia was significantly associated with the occurrence of any adverse event and severe adverse events. 
Preoperative anemia was significantly associated with the occurrence of any adverse event, severe adverse events, and hospital readmission. Multiple imputation is a rigorous statistical procedure that is being increasingly used to address missing values in large datasets. Using this technique for ACDF avoided the loss of cases that may have affected the representativeness and power of the study and led to different results than complete case analysis. Multiple imputation should be considered for future spine studies. Copyright © 2018 Elsevier Inc. All rights reserved.
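
    The contrast between complete case analysis and multiple imputation described above can be sketched in a few lines. The albumin values below are made up, and the imputation model is deliberately simplified (real multiple imputation, e.g. chained equations, conditions on covariates and pools variances by Rubin's rules); this only illustrates why dropping incomplete cases shrinks the sample while imputation preserves it.

```python
import random
import statistics

random.seed(0)

# Hypothetical cohort: preoperative albumin (g/dL) observed for only some
# patients, mimicking NSQIP-style missingness; all values are illustrative.
observed = [round(random.gauss(4.0, 0.4), 2) for _ in range(400)]
records = observed + [None] * 600  # ~60% missing, like albumin in the study

# Complete case analysis: drop every record with a missing value.
complete = [v for v in records if v is not None]
cc_mean = statistics.mean(complete)

# Simplified multiple imputation: draw each missing value from the observed
# distribution, repeat m times, and pool the per-dataset estimates.
# (Real MI conditions on covariates and pools variances by Rubin's rules.)
m = 20
pooled = []
for _ in range(m):
    imputed = [v if v is not None
               else random.gauss(cc_mean, statistics.stdev(complete))
               for v in records]
    pooled.append(statistics.mean(imputed))
mi_mean = statistics.mean(pooled)

print(f"complete cases: n={len(complete)}, mean={cc_mean:.2f}")
print(f"multiple imputation: n={len(records)}, pooled mean={mi_mean:.2f}")
```

    Note that complete case analysis here studies only 400 of 1,000 records, the same kind of loss (11,999 down to 4,311) that motivated the study's use of imputation.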

  14. [Interlaboratory Study on Evaporation Residue Test for Food Contact Products (Report 1)].

    PubMed

    Ohno, Hiroyuki; Mutsuga, Motoh; Abe, Tomoyuki; Abe, Yutaka; Amano, Homare; Ishihara, Kinuyo; Ohsaka, Ikue; Ohno, Haruka; Ohno, Yuichiro; Ozaki, Asako; Kakihara, Yoshiteru; Kobayashi, Hisashi; Sakuragi, Hiroshi; Shibata, Hiroshi; Shirono, Katsuhiro; Sekido, Haruko; Takasaka, Noriko; Takenaka, Yu; Tajima, Yoshiyasu; Tanaka, Aoi; Tanaka, Hideyuki; Tonooka, Hiroyuki; Nakanishi, Toru; Nomura, Chie; Haneishi, Nahoko; Hayakawa, Masato; Miura, Toshihiko; Yamaguchi, Miku; Watanabe, Kazunari; Sato, Kyoko

    2018-01-01

    An interlaboratory study was performed to evaluate the equivalence between an official method and a modified method of evaporation residue test using three food-simulating solvents (water, 4% acetic acid and 20% ethanol), based on the Japanese Food Sanitation Law for food contact products. Twenty-three laboratories participated, and tested the evaporation residues of nine test solutions as blind duplicates. For evaporation, a water bath was used in the official method, and a hot plate in the modified method. In most laboratories, the test solutions were heated until just prior to evaporation to dryness, and then allowed to dry under residual heat. Statistical analysis revealed that there was no significant difference between the two methods, regardless of the heating equipment used. Accordingly, the modified method provides performance equal to the official method, and is available as an alternative method.
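
    A two-method equivalence check of the kind described above can be illustrated with a paired comparison on the same test solutions. The residue values below are invented and the study's actual statistical design is not detailed here, so this is only a sketch of the general approach.

```python
import math
import statistics

# Hypothetical evaporation-residue results (in arbitrary concentration
# units) for the same test solutions measured by the official (water bath)
# and modified (hot plate) methods; the numbers are illustrative only.
official = [12.1, 8.4, 15.0, 9.7, 11.2, 14.6, 7.9, 10.3, 13.5]
modified = [12.3, 8.2, 14.8, 9.9, 11.0, 14.9, 7.7, 10.5, 13.3]

# Paired t-test on the per-solution differences.
diffs = [a - b for a, b in zip(official, modified)]
n = len(diffs)
mean_d = statistics.mean(diffs)
se_d = statistics.stdev(diffs) / math.sqrt(n)
t_stat = mean_d / se_d

# Two-sided 5% critical value for df = n - 1 = 8.
t_crit = 2.306
print(f"t = {t_stat:.3f}; significant difference: {abs(t_stat) > t_crit}")
```

    With these illustrative numbers the paired differences scatter around zero, so no significant method difference is detected, mirroring the study's conclusion of equivalence.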

  15. Surface-wave and refraction tomography at the FACT Site, Sandia National Laboratories, Albuquerque, New Mexico.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abbott, Robert E.; Bartel, Lewis Clark; Pullammanappallil, Satish

    2006-08-01

    We present a technique that allows for the simultaneous acquisition and interpretation of both shear-wave and compressive-wave 3-D velocities. The technique requires no special seismic sources or array geometries, and is suited to studies with small source-receiver offsets. The method also effectively deals with unwanted seismic arrivals by using the statistical properties of the data itself to discriminate against spurious picks. We demonstrate the technique with a field experiment at the Facility for Analysis, Calibration, and Testing at Sandia National Laboratories, Albuquerque, New Mexico. The resulting 3-D shear-velocity and compressive-velocity distributions are consistent with surface geologic mapping. The averaged velocities and Vp/Vs ratio in the upper 30 meters are also consistent with examples found in the scientific literature.

  16. Differences in gender participation in college physical science laboratory as perceived by students and instructors

    NASA Astrophysics Data System (ADS)

    Gifford, Fay Evan

    The purpose of this study was to determine the difference in gender participation in the college physical science laboratory as perceived by students. The sample in this study consisted of 168 college sophomore architecture students (56 males and 33 females) and engineering students (61 males and 18 females). Depending on the type of information desired, a number of analyses were used, including independent samples t-tests, two-way ANOVA, general linear model analysis, univariate analysis of variance, and descriptive statistics. In the analysis of data for the first fourteen questions of the questionnaire, which are called descriptive data, both gender and academic discipline differences were examined. Both genders picked personal choice as how their role in the lab was determined, and the roles were recorder, computer operator, and set-up. There was no major difference between the two disciplines, except that engineers (by four to one over the architects) thought one member took the lead and assigned the roles. There was no statistically significant difference in attitude toward group laboratory work between the two genders, but there was a significant difference by academic discipline. There was a significant difference between genders in the preferred way of assigning students to small groups (i.e., the females would prefer the professor assign the roles). For the open-ended student question dealing with suggestions for improving student participation in the labs, about one-third responded. One major difference between the disciplines was that the architecture students, by a twenty-to-one ratio over the engineers, thought they did not need a physics lab. For Hypothesis 4, students and instructors generally agreed that there was no difference between the students' responses by gender and the instructors' responses. 
For Hypothesis 5, the responses to the four special gender questions show that the males did not agree with the instructors on any of the four questions, while the females agreed with the instructors on two of the questions.

  17. CIDR

    Science.gov Websites

    Diagnostic Laboratory and the Molecular Pathology Laboratory. Johns Hopkins Genomics is staffed with clinical molecular geneticists, bioinformaticists, statistical geneticists, and clinical geneticists.

  18. Nuclear forensic analysis of an unknown uranium ore concentrate sample seized in a criminal investigation in Australia

    DOE PAGES

    Keegan, Elizabeth; Kristo, Michael J.; Colella, Michael; ...

    2014-04-13

    In early 2009, a state policing agency raided a clandestine drug laboratory in a suburb of a major city in Australia. While searching the laboratory, they discovered a small glass jar labelled “Gamma Source” and containing a green powder. The powder was radioactive. This paper documents the detailed nuclear forensic analysis undertaken to characterize and identify the material and determine its provenance. Isotopic and impurity content, phase composition, microstructure and other characteristics were measured on the seized sample, and the results were compared with similar material obtained from the suspected source (ore and ore concentrate material). While an extensive range of parameters were measured, the key ‘nuclear forensic signatures’ used to identify the material were the U isotopic composition, Pb and Sr isotope ratios, and the rare earth element pattern. These measurements, in combination with statistical analysis of the elemental and isotopic content of the material against a database of uranium ore concentrates sourced from mines located worldwide, led to the conclusion that the seized material (a uranium ore concentrate of natural isotopic abundance) most likely originated from Mary Kathleen, a former Australian uranium mine.

  19. Long term fine aerosol analysis by XRF and PIXE techniques in the city of Rijeka, Croatia

    NASA Astrophysics Data System (ADS)

    Ivošević, Tatjana; Orlić, Ivica; Radović, Iva Bogdanović

    2015-11-01

    The results of a long-term, multi-elemental XRF and PIXE analysis of fine aerosol pollution in the city of Rijeka, Croatia, are reported for the first time. The samples were collected during a seven-month period (6 Aug 2013-28 Feb 2014) on thin stretched Teflon filters and analyzed by energy dispersive X-ray fluorescence (EDXRF) at the Laboratory for Elemental Micro-Analysis (LEMA), University of Rijeka, and by Particle Induced X-ray Emission (PIXE) using 1.6 MeV protons at the Laboratory for Ion Beam Interactions (LIBI), Ruđer Bošković Institute, Zagreb. The newly developed micro-XRF system at LEMA provided results for 19 elements in the range from Si to Pb. PIXE at LIBI provided information for the same elements as well as for light elements such as Na, Mg and Al. Black carbon was determined with the Laser Integrated Plate Method (LIPM). The results were statistically evaluated by means of positive matrix factorization (PMF). Seven major pollution sources were identified together with their relative contributions: secondary sulfates, road traffic, smoke, road dust, sea spray, ship emissions and soil dust.

  20. Analysis of Precipitation (Rain and Snow) Levels and Straight-line Wind Speeds in Support of the 10-year Natural Phenomena Hazards Review for Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, Elizabeth J.; Dewart, Jean Marie; Deola, Regina

    This report provides site-specific return level analyses for rain, snow, and straight-line wind extreme events. These analyses are in support of the 10-year review plan for the assessment of meteorological natural phenomena hazards at Los Alamos National Laboratory (LANL). These analyses follow guidance from the Department of Energy standard, Natural Phenomena Hazards Analysis and Design Criteria for DOE Facilities (DOE-STD-1020-2012), the Nuclear Regulatory Commission Standard Review Plan (NUREG-0800, 2007), and ANSI/ANS-2.3-2011, Estimating Tornado, Hurricane, and Extreme Straight-Line Wind Characteristics at Nuclear Facility Sites. LANL precipitation and snow level data have been collected since 1910, although not all years are complete. In this report the results from the more recent data (1990–2014) are compared to those of past analyses and a 2004 National Oceanographic and Atmospheric Administration report. Given the many differences in the data sets used in these different analyses, the lack of statistically significant differences in return level estimates increases confidence in the data and in the modeling and analysis approach.

  1. Nuclear forensic analysis of an unknown uranium ore concentrate sample seized in a criminal investigation in Australia.

    PubMed

    Keegan, Elizabeth; Kristo, Michael J; Colella, Michael; Robel, Martin; Williams, Ross; Lindvall, Rachel; Eppich, Gary; Roberts, Sarah; Borg, Lars; Gaffney, Amy; Plaue, Jonathan; Wong, Henri; Davis, Joel; Loi, Elaine; Reinhard, Mark; Hutcheon, Ian

    2014-07-01

    Early in 2009, a state policing agency raided a clandestine drug laboratory in a suburb of a major city in Australia. During the search of the laboratory, a small glass jar labelled "Gamma Source" and containing a green powder was discovered. The powder was radioactive. This paper documents the detailed nuclear forensic analysis undertaken to characterise and identify the material and determine its provenance. Isotopic and impurity content, phase composition, microstructure and other characteristics were measured on the seized sample, and the results were compared with similar material obtained from the suspected source (ore and ore concentrate material). While an extensive range of parameters were measured, the key 'nuclear forensic signatures' used to identify the material were the U isotopic composition, Pb and Sr isotope ratios, and the rare earth element pattern. These measurements, in combination with statistical analysis of the elemental and isotopic content of the material against a database of uranium ore concentrates sourced from mines located worldwide, led to the conclusion that the seized material (a uranium ore concentrate of natural isotopic abundance) most likely originated from Mary Kathleen, a former Australian uranium mine. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  2. Nuclear forensic analysis of an unknown uranium ore concentrate sample seized in a criminal investigation in Australia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keegan, Elizabeth; Kristo, Michael J.; Colella, Michael

    In early 2009, a state policing agency raided a clandestine drug laboratory in a suburb of a major city in Australia. While searching the laboratory, they discovered a small glass jar labelled “Gamma Source” and containing a green powder. The powder was radioactive. This paper documents the detailed nuclear forensic analysis undertaken to characterize and identify the material and determine its provenance. Isotopic and impurity content, phase composition, microstructure and other characteristics were measured on the seized sample, and the results were compared with similar material obtained from the suspected source (ore and ore concentrate material). While an extensive range of parameters were measured, the key ‘nuclear forensic signatures’ used to identify the material were the U isotopic composition, Pb and Sr isotope ratios, and the rare earth element pattern. These measurements, in combination with statistical analysis of the elemental and isotopic content of the material against a database of uranium ore concentrates sourced from mines located worldwide, led to the conclusion that the seized material (a uranium ore concentrate of natural isotopic abundance) most likely originated from Mary Kathleen, a former Australian uranium mine.

  3. Characterization, monitoring, and sensor technology catalogue

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matalucci, R.V.; Esparza-Baca, C.; Jimenez, R.D.

    1995-12-01

    This document summarizes 58 technologies being developed by the Department of Energy's (DOE's) Office of Science and Technology (OST) to provide site, waste, and process characterization and monitoring solutions to the DOE weapons complex. The information was compiled to provide performance data on OST-developed technologies to scientists and engineers responsible for preparing Remedial Investigation/Feasibility Studies (RI/FSs) and preparing plans and compliance documents for DOE cleanup and waste management programs. The information may also be used to identify opportunities for partnering and commercialization with industry, DOE laboratories, other federal and state agencies, and the academic community. Each technology is featured in a format that provides: (1) a description, (2) technical performance data, (3) applicability, (4) development status, (5) regulatory considerations, (6) potential commercial applications, (7) intellectual property, and (8) points-of-contact. Technologies are categorized into the following areas: (1) Bioremediation Monitoring, (2) Decontamination and Decommissioning, (3) Field Analytical Laboratories, (4) Geophysical and Hydrologic Characterization, (5) Hazardous Inorganic Contaminant Analysis, (6) Hazardous Organic Contaminant Analysis, (7) Mixed Waste, (8) Radioactive Contaminant Analysis, (9) Remote Sensing, (10) Sampling and Drilling, (11) Statistically Guided Sampling, and (12) Tank Waste.

  4. Statistical Engineering in Air Traffic Management Research

    NASA Technical Reports Server (NTRS)

    Wilson, Sara R.

    2015-01-01

    NASA is working to develop an integrated set of advanced technologies to enable efficient arrival operations in high-density terminal airspace for the Next Generation Air Transportation System. This integrated arrival solution is being validated and verified in laboratories and transitioned to a field prototype for an operational demonstration at a major U.S. airport. Within NASA, this is a collaborative effort between Ames and Langley Research Centers involving a multi-year iterative experimentation process. Designing and analyzing a series of sequential batch computer simulations and human-in-the-loop experiments across multiple facilities and simulation environments involves a number of statistical challenges. Experiments conducted in separate laboratories typically have different limitations and constraints, and can take different approaches with respect to the fundamental principles of statistical design of experiments. This often makes it difficult to compare results from multiple experiments and incorporate findings into the next experiment in the series. A statistical engineering approach is being employed within this project to support risk-informed decision making and maximize the knowledge gained within the available resources. This presentation describes a statistical engineering case study from NASA, highlights statistical challenges, and discusses areas where existing statistical methodology is adapted and extended.

  5. QADATA user's manual; an interactive computer program for the retrieval and analysis of the results from the external blind sample quality- assurance project of the U.S. Geological Survey

    USGS Publications Warehouse

    Lucey, K.J.

    1990-01-01

    The U.S. Geological Survey conducts an external blind sample quality-assurance project for its National Water Quality Laboratory in Denver, Colorado, based on the analysis of reference water samples. Reference samples containing selected inorganic and nutrient constituents are disguised as environmental samples at the Survey's office in Ocala, Florida, and are sent periodically through other Survey offices to the laboratory. The results of this blind sample project indicate the quality of analytical data produced by the laboratory. This report provides instructions on the use of QADATA, an interactive, menu-driven program that allows users to retrieve the results of the blind sample quality-assurance project. The QADATA program, which is available on the U.S. Geological Survey's national computer network, accesses a blind sample database that contains more than 50,000 determinations from the last five water years for approximately 40 constituents at various concentrations. The data can be retrieved from the database for any user-defined time period and for any or all available constituents. After the user defines the retrieval, the program prepares statistical tables, control charts, and precision plots and generates a report which can be transferred to the user's office through the computer network. A discussion of the interpretation of the program output is also included. This quality-assurance information will permit users to document the quality of the analytical results received from the laboratory. The blind sample data are entered into the database within weeks after being produced by the laboratory and can be retrieved to meet the needs of specific projects or programs. (USGS)
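
    Control charts of the kind QADATA prepares are conventionally center-line-plus-3-sigma (Shewhart) charts. A minimal sketch with hypothetical blind-sample determinations, where a baseline establishes the control limits and later results are flagged if they fall outside them:

```python
import statistics

# Hypothetical blind-sample determinations (mg/L) for one constituent.
# The first results establish the baseline; later determinations are
# checked against 3-sigma control limits, as on a Shewhart control chart.
baseline = [5.02, 4.97, 5.10, 4.95, 5.05, 5.00, 4.98, 5.04, 4.96, 5.03]
center = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
ucl, lcl = center + 3 * sigma, center - 3 * sigma  # upper/lower control limits

new_results = [5.01, 4.99, 5.30, 5.02]
flags = [not (lcl <= x <= ucl) for x in new_results]
for x, out in zip(new_results, flags):
    print(f"{x:.2f}  {'OUT OF CONTROL' if out else 'in control'}")
```

    A determination outside the limits (here the third value) signals a possible laboratory problem worth investigating, which is exactly how blind-sample results are used to monitor analytical quality.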

  6. Impact of satellite-based data on FGGE general circulation statistics

    NASA Technical Reports Server (NTRS)

    Salstein, David A.; Rosen, Richard D.; Baker, Wayman E.; Kalnay, Eugenia

    1987-01-01

    The NASA Goddard Laboratory for Atmospheres (GLA) analysis/forecast system was run in two different parallel modes in order to evaluate the influence that data from satellites and other FGGE observation platforms can have on analyses of large scale circulation; in the first mode, data from all observation systems were used, while in the second only conventional upper air and surface reports were used. The GLA model was also integrated for the same period without insertion of any data; an independent objective analysis based only on rawinsonde and pilot balloon data is also performed. A small decrease in the vigor of the general circulation is noted to follow from the inclusion of satellite observations.

  7. Facility-specific radiation exposure risks and their implications for radiation workers at Department of Energy laboratories

    NASA Astrophysics Data System (ADS)

    Davis, Adam Christopher

    This research develops a new framework for evaluating the occupational risks of exposure to hazardous substances in any setting where As Low As Reasonably Achievable (ALARA) practices are mandated or used. The evaluation is performed by developing a hypothesis-test-based procedure for evaluating the homogeneity of various epidemiological cohorts, and thus the appropriateness of the application of aggregate data-pooling techniques to those cohorts. A statistical methodology is then developed as an alternative to aggregate pooling for situations in which individual cohorts show heterogeneity between them and are thus unsuitable for pooled analysis. These methods are then applied to estimate the all-cancer mortality risks incurred by workers at four Department-of-Energy nuclear weapons laboratories. Both linear, no-threshold and dose-bin averaged risks are calculated and it is further shown that aggregate analysis tends to overestimate the risks with respect to those calculated by the methods developed in this work. The risk estimates developed in Chapter 2 are, in Chapter 3, applied to assess the risks to workers engaged in americium recovery operations at Los Alamos National Laboratory. The work described in Chapter 3 develops a full radiological protection assessment for the new americium recovery project, including development of exposure cases, creation and modification of MCNP5 models, development of a time-and-motion study, and the final synthesis of all data. This work also develops a new risk-based method of determining whether administrative controls, such as staffing increases, are ALARA-optimized. The EPA's estimate of the value of statistical life is applied to these risk estimates to determine a monetary value for risk. 
The rate of change of this "risk value" (marginal risk) is then compared with the rate of change of workers' compensations as additional workers are added to the project to reduce the dose (and therefore, presumably, risk) to each individual.

  8. Reliability on intra-laboratory and inter-laboratory data of hair mineral analysis comparing with blood analysis.

    PubMed

    Namkoong, Sun; Hong, Seung Phil; Kim, Myung Hwa; Park, Byung Cheol

    2013-02-01

    Although its clinical value remains controversial, many institutions utilize hair mineral analysis. Arguments about the reliability of hair mineral analysis persist, and there have been evaluations of commercial laboratories performing it. The objective of this study was to assess the reliability of intra-laboratory and inter-laboratory data at three commercial laboratories conducting hair mineral analysis, compared to serum mineral analysis. Two divided hair samples taken from near the scalp of one healthy volunteer were submitted for analysis to all laboratories at the same time. Each laboratory sent a report consisting of quantitative results and their interpretation of health implications. Differences among intra-laboratory and inter-laboratory data were analyzed using SPSS version 12.0 (SPSS Inc., USA). All the laboratories used identical methods for quantitative analysis, and they generated consistent numerical results according to Friedman analysis of variance. However, the normal reference ranges of each laboratory varied; as such, each laboratory interpreted the patient's health differently. On intra-laboratory data, Wilcoxon analysis suggested the laboratories generated relatively coherent data, but laboratory B did not for one element, so its reliability was doubtful. In comparison with the blood test, laboratory C generated identical results, but laboratories A and B did not. Hair mineral analysis has limitations in the reliability of inter- and intra-laboratory analysis compared with blood analysis, so clinicians should be cautious when applying it as an ancillary tool. Each laboratory included in this study requires continuous refinement to establish standardized normal reference ranges.
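
    A Friedman analysis of variance like the one used to compare the three laboratories can be written directly from its rank-sum definition. The element concentrations below are hypothetical, and this sketch omits the tie correction, so it is an illustration of the method rather than the study's actual computation.

```python
# Split hair samples analysed by three hypothetical laboratories (A, B, C);
# each row of ranks is taken within one element across the labs.
elements = ["Ca", "Mg", "Zn", "Cu", "Fe", "Se"]
lab_a = [450, 52, 160, 18, 9.0, 0.9]
lab_b = [480, 55, 150, 17, 9.5, 1.0]
lab_c = [460, 50, 165, 19, 8.8, 0.8]

def friedman_chi2(*treatments):
    """Friedman chi-square statistic (no tie correction)."""
    k, n = len(treatments), len(treatments[0])
    rank_sums = [0.0] * k
    for i in range(n):
        # Rank the k laboratories within element i (1 = smallest value).
        order = sorted(range(k), key=lambda j: treatments[j][i])
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    return (12.0 / (n * k * (k + 1))
            * sum(r * r for r in rank_sums) - 3 * n * (k + 1))

chi2 = friedman_chi2(lab_a, lab_b, lab_c)
# 5% critical value for chi-square with k - 1 = 2 degrees of freedom: 5.991
print(f"chi2_F = {chi2:.2f}; labs differ at 5% level: {chi2 > 5.991}")
```

    With these illustrative values the statistic stays below the critical value, consistent with the study's finding that the labs' quantitative results were coherent even though their reference ranges, and hence interpretations, differed.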

  9. Field and laboratory analyses of water from the Columbia aquifer in Eastern Maryland

    USGS Publications Warehouse

    Bachman, L.J.

    1984-01-01

    Field and laboratory analyses of pH, alkalinity, and specific conductance from water samples collected from the Columbia aquifer on the Delmarva Peninsula in eastern Maryland were compared to determine if laboratory analyses could be used for making regional water-quality interpretations. Kruskal-Wallis tests of field and laboratory data indicate that the difference between field and laboratory values is usually not enough to affect the outcome of the statistical tests. Thus, laboratory measurements of these constituents may be adequate for making certain regional water-quality interpretations, although they may result in errors if used for geochemical interpretations.
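
    A Kruskal-Wallis test of the kind used here can be written from its rank-based definition. The pH readings below are hypothetical, and the sketch omits the tie-correction divisor (ties are still given midranks), so it illustrates the comparison of field versus laboratory measurements rather than reproducing the study's data.

```python
# Hypothetical field and laboratory pH readings for the same wells.
field_ph = [5.6, 5.9, 6.1, 5.8, 6.0, 5.7, 6.2, 5.9]
lab_ph = [5.7, 6.0, 6.1, 5.9, 6.1, 5.8, 6.3, 6.0]

def midranks(values):
    """1-based ranks of values, with ties receiving their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic (tie-correction divisor omitted)."""
    data = [v for g in groups for v in g]
    ranks = midranks(data)
    n_total = len(data)
    h, idx = 0.0, 0
    for g in groups:
        r = ranks[idx:idx + len(g)]
        idx += len(g)
        h += len(g) * (sum(r) / len(g) - (n_total + 1) / 2) ** 2
    return 12.0 / (n_total * (n_total + 1)) * h

h = kruskal_wallis_h(field_ph, lab_ph)
# 5% critical value for chi-square with 1 degree of freedom: 3.841
print(f"H = {h:.3f}; field and lab differ at 5% level: {h > 3.841}")
```

    Here the statistic falls well below the critical value, the same kind of outcome that led the study to conclude laboratory measurements can stand in for field ones in regional interpretations.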

  10. Fluorescent-Antibody Measurement Of Cancer-Cell Urokinase

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R.

    1993-01-01

    Combination of laboratory techniques provides measurements of amounts of urokinase in and between normal and cancer cells. Includes use of fluorescent antibodies specific against different forms of urokinase-type plasminogen activator, (uPA), fluorescence microscopy, quantitative analysis of images of sections of tumor tissue, and flow cytometry of different uPA's and deoxyribonucleic acid (DNA) found in suspended-tumor-cell preparations. Measurements provide statistical method for indicating or predicting metastatic potentials of some invasive tumors. Assessments of metastatic potentials based on such measurements used in determining appropriate follow-up procedures after surgical removal of tumors.

  11. Lift truck safety review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cadwallader, L.C.

    1997-03-01

    This report presents safety information about powered industrial trucks. The basic lift truck, the counterbalanced sit-down rider truck, is the primary focus of the report. Lift truck engineering is briefly described, then a hazard analysis is performed on the lift truck. Case histories and accident statistics are also given. Rules and regulations about lift trucks, such as the US Occupational Safety and Health Administration laws and the Underwriters Laboratories standards, are discussed. Safety issues with lift trucks are reviewed, and lift truck safety and reliability are discussed. Some quantitative reliability values are given.

  12. Bi-telescopic, deep, simultaneous meteor observations

    NASA Technical Reports Server (NTRS)

    Taff, L. G.

    1986-01-01

    A statistical summary is presented of 10 hours of observing sporadic meteors and two meteor showers using the Experimental Test System of the Lincoln Laboratory. The observatory is briefly described along with the real-time and post-processing hardware, the analysis, and the data reduction. The principal observational results are given for the sporadic meteor zenithal hourly rates. The unique properties of the observatory include twin telescopes to allow the discrimination of meteors by parallax, deep limiting magnitude, good time resolution, and sophisticated real-time and post-observing video processing.

  13. A study of the portability of an Ada system in the software engineering laboratory (SEL)

    NASA Technical Reports Server (NTRS)

    Jun, Linda O.; Valett, Susan Ray

    1990-01-01

    A particular porting effort is discussed, and various statistics on analyzing the portability of Ada and the total staff months (overall and by phase) required to accomplish the rehost, are given. This effort is compared to past experiments on the rehosting of FORTRAN systems. The discussion includes an analysis of the types of errors encountered during the rehosting, the changes required to rehost the system, experiences with the Alsys IBM Ada compiler, the impediments encountered, and the lessons learned during this study.

  14. Statistical significance approximation in local trend analysis of high-throughput time-series data using the theory of Markov chains.

    PubMed

    Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu

    2015-09-21

    Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in the dynamics of biological systems. However, slow permutation procedures to evaluate the statistical significance of local trend scores have limited its applications to high-throughput time series data analysis, e.g., data from studies based on next-generation sequencing technology. By extending the theories for the tail probability of the range of a sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with delay of at most three time steps), in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations, making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large-scale all-versus-all comparisons possible. We also propose a hybrid approach, integrating the approximation and permutations, to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and found interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package, which now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.
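
    The slow permutation procedure that the paper's approximation replaces can be sketched as follows. The trend encoding and run-based scoring here are a simplified stand-in for eLSA's actual local trend alignment, and the series are made up; the point is only to show why per-pair permutation testing is expensive compared with a closed-form tail approximation.

```python
import random

random.seed(1)

# Encode each series as its direction of change: +1 (up), 0 (flat), -1 (down).
def trend(series):
    return [(a < b) - (a > b) for a, b in zip(series, series[1:])]

# Score: longest contiguous run where the two trend sequences agree on a
# non-flat direction (+1 per match, reset on mismatch) -- a local, ungapped
# alignment, a simplified stand-in for eLSA's local trend score.
def local_trend_score(x, y):
    best = run = 0
    for a, b in zip(trend(x), trend(y)):
        run = run + 1 if a == b and a != 0 else 0
        best = max(best, run)
    return best

# Permutation p-value: how often does a shuffled series score as well?
# This is the slow step the paper's Markov-chain tail formula approximates.
def permutation_p(x, y, n_perm=2000):
    observed = local_trend_score(x, y)
    hits = sum(local_trend_score(x, random.sample(y, len(y))) >= observed
               for _ in range(n_perm))
    return (hits + 1) / (n_perm + 1)

x = [1, 2, 3, 2, 4, 5, 6, 5, 7, 8, 9, 8]
y = [2, 3, 4, 3, 5, 6, 7, 6, 8, 9, 10, 9]
score = local_trend_score(x, y)
p_value = permutation_p(x, y)
print(f"score = {score}, permutation p = {p_value:.4f}")
```

    Even this toy version needs thousands of rescorings per pair, which is why an analytical approximation matters for all-versus-all comparisons across many taxa.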

  15. Computers in the General Physics Laboratory.

    ERIC Educational Resources Information Center

    Preston, Daryl W.; Good, R. H.

    1996-01-01

    Provides ideas and outcomes for nine computer laboratory experiments using a commercial eight-bit analog to digital (ADC) interface. Experiments cover statistics; rotation; harmonic motion; voltage, current, and resistance; ADC conversions; temperature measurement; single slit diffraction; and radioactive decay. Includes necessary schematics. (MVL)

  16. Computerized Cognition Laboratory.

    ERIC Educational Resources Information Center

    Motes, Michael A.; Wiegmann, Douglas A.

    1999-01-01

    Describes a software package entitled the "Computerized Cognition Laboratory" that helps integrate the teaching of cognitive psychology and research methods. Allows students to explore short-term memory, long-term memory, and decision making. Can also be used to teach the application of several statistical procedures. (DSK)

  17. Teaching method validation in the clinical laboratory science curriculum.

    PubMed

    Moon, Tara C; Legrys, Vicky A

    2008-01-01

    With the Clinical Laboratory Improvement Amendments' (CLIA) final rule, the ability of the Clinical Laboratory Scientist (CLS) to perform method validation has become increasingly important. Knowledge of the statistical methods and procedures used in method validation is imperative for clinical laboratory scientists. However, incorporating these concepts in a CLS curriculum can be challenging, especially at a time of limited resources. This paper provides an outline of one approach to addressing these topics in lecture courses and integrating them in the student laboratory and the clinical practicum for direct application.

  18. Circulating Carotenoids and Risk of Breast Cancer: Pooled Analysis of Eight Prospective Studies

    PubMed Central

    2012-01-01

    Background Carotenoids, micronutrients in fruits and vegetables, may reduce breast cancer risk. Most, but not all, past studies of circulating carotenoids and breast cancer have found an inverse association with at least one carotenoid, although the specific carotenoid has varied across studies. Methods We conducted a pooled analysis of eight cohort studies comprising more than 80% of the world’s published prospective data on plasma or serum carotenoids and breast cancer, including 3055 case subjects and 3956 matched control subjects. To account for laboratory differences and examine population differences across studies, we recalibrated participant carotenoid levels to a common standard by reassaying 20 plasma or serum samples from each cohort together at the same laboratory. Using conditional logistic regression, adjusting for several breast cancer risk factors, we calculated relative risks (RRs) and 95% confidence intervals (CIs) using quintiles defined among the control subjects from all studies. All P values are two-sided. Results Statistically significant inverse associations with breast cancer were observed for α-carotene (top vs bottom quintile RR = 0.87, 95% CI = 0.71 to 1.05, Ptrend = .04), β-carotene (RR = 0.83, 95% CI = 0.70 to 0.98, Ptrend = .02), lutein+zeaxanthin (RR = 0.84, 95% CI = 0.70 to 1.01, Ptrend = .05), lycopene (RR = 0.78, 95% CI = 0.62 to 0.99, Ptrend = .02), and total carotenoids (RR = 0.81, 95% CI = 0.68 to 0.96, Ptrend = .01). β-Cryptoxanthin was not statistically significantly associated with risk. Tests for heterogeneity across studies were not statistically significant. For several carotenoids, associations appeared stronger for estrogen receptor negative (ER−) than for ER+ tumors (eg, β-carotene: ER−: top vs bottom quintile RR = 0.52, 95% CI = 0.36 to 0.77, Ptrend = .001; ER+: RR = 0.83, 95% CI = 0.66 to 1.04, Ptrend = .06; Pheterogeneity = .01). 
Conclusions This comprehensive prospective analysis suggests women with higher circulating levels of α-carotene, β-carotene, lutein+zeaxanthin, lycopene, and total carotenoids may be at reduced risk of breast cancer. PMID:23221879

  19. Statistical benchmarking for orthogonal electrostatic quantum dot qubit devices

    NASA Astrophysics Data System (ADS)

    Gamble, John; Frees, Adam; Friesen, Mark; Coppersmith, S. N.

    2014-03-01

    Quantum dots in semiconductor systems have emerged as attractive candidates for implementing quantum information processors because of the promise of scalability, manipulability, and integration with existing classical electronics. A limitation in current devices is that the electrostatic gates used for qubit manipulation exhibit strong cross-capacitance, presenting a barrier to practical scale-up. Here, we introduce a statistical framework for making precise the notion of orthogonality. We apply our method to analyze recently implemented designs at the University of Wisconsin-Madison that exhibit substantially more orthogonal control than was previously possible. We then apply our statistical modeling to future device designs, providing practical guidelines for devices to have robust control properties. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the US Government. This work was supported in part by the Laboratory Directed Research and Development program at Sandia National Laboratories, by ARO (W911NF-12-0607), and by the United States Department of Defense.

  20. An Inferentialist Perspective on the Coordination of Actions and Reasons Involved in Making a Statistical Inference

    ERIC Educational Resources Information Center

    Bakker, Arthur; Ben-Zvi, Dani; Makar, Katie

    2017-01-01

    To understand how statistical and other types of reasoning are coordinated with actions to reduce uncertainty, we conducted a case study in vocational education that involved statistical hypothesis testing. We analyzed an intern's research project in a hospital laboratory in which reducing uncertainties was crucial to make a valid statistical…

  1. Standard reference water samples for rare earth element determinations

    USGS Publications Warehouse

    Verplanck, P.L.; Antweiler, Ronald C.; Nordstrom, D. Kirk; Taylor, Howard E.

    2001-01-01

    Standard reference water samples (SRWS) were collected from two mine sites, one near Ophir, CO, USA and the other near Redding, CA, USA. The samples were filtered, preserved, and analyzed for rare earth element (REE) concentrations (La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Yb, and Lu) by inductively coupled plasma-mass spectrometry (ICP-MS). These two samples were acid mine waters with elevated concentrations of REEs (0.45–161 µg/L). Seventeen international laboratories participated in a 'round-robin' chemical analysis program, which made it possible to evaluate the data by robust statistical procedures that are insensitive to outliers. The resulting most probable values are reported. Ten to 15 of the participants also reported values for Ba, Y, and Sc. Field parameters, major ion, and other trace element concentrations, not subject to statistical evaluation, are provided.

  2. Database Performance Monitoring for the Photovoltaic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Katherine A.

    The Database Performance Monitoring (DPM) software (copyright in process) is being developed at Sandia National Laboratories to perform quality control analysis on time series data. The software loads time-indexed databases (currently csv format), performs a series of quality control tests defined by the user, and creates reports that include summary statistics, tables, and graphics. DPM can be set up to run on an automated schedule defined by the user. For example, the software can be run once per day to analyze data collected on the previous day. HTML-formatted reports can be sent via email or hosted on a website. To compare performance of several databases, summary statistics and graphics can be gathered in a dashboard view which links to detailed reporting information for each database. The software can be customized for specific applications.

  3. Shiga toxigenic Escherichia coli incidence is related to small area variation in cattle density in a region in Ireland.

    PubMed

    Brehony, C; Cullinan, J; Cormican, M; Morris, D

    2018-10-01

    Shiga toxigenic Escherichia coli (STEC) are pathogenic E. coli that cause infectious diarrhoea. In some cases infection may be complicated by renal failure and death. The incidence of human infection with STEC in Ireland is the highest in Europe. The objective of the study was to examine the spatial incidence of human STEC infection in a region of Ireland with significantly higher rates of STEC incidence than the national average and to identify possible risk factors for STEC incidence at the area level. Anonymised laboratory records (n = 379) from 2009 to 2015 were obtained from laboratories serving three counties in the West of Ireland. Data included location and sample date. Population and electoral division (ED) data were obtained from the Irish 2011 Census of Population. STEC incidence was calculated for each ED (n = 498) and used to map hotspots/coldspots using the Getis-Ord Gi* spatial statistic and significant spatial clustering using Anselin's Local Moran's I statistic. Multivariable regression analysis was used to consider the importance of a number of potential predictors of STEC incidence. Incidence rates for the seven-year period ranged from 0 to 10.9 cases per 1000. A number of areas with significant local clustering of STEC incidence were identified, as well as variation in the spatial distribution of the two main serogroups associated with disease in the region (i.e. O26 and O157). Cattle density was found to be a statistically significant predictor of STEC in the region. GIS analysis of routine data indicates that cattle density is associated with STEC infection in this high-incidence region. This finding points to the importance of agricultural practices for human health and the importance of a "one-health" approach to public policy in relation to agriculture, health and environment. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Simpler score of routine laboratory tests predicts liver fibrosis in patients with chronic hepatitis B.

    PubMed

    Zhou, Kun; Gao, Chun-Fang; Zhao, Yun-Peng; Liu, Hai-Lin; Zheng, Rui-Dan; Xian, Jian-Chun; Xu, Hong-Tao; Mao, Yi-Min; Zeng, Min-De; Lu, Lun-Gen

    2010-09-01

    In recent years, a great interest has been dedicated to the development of noninvasive predictive models to substitute liver biopsy for fibrosis assessment and follow-up. Our aim was to provide a simpler model consisting of routine laboratory markers for predicting liver fibrosis in patients chronically infected with hepatitis B virus (HBV) in order to optimize their clinical management. Liver fibrosis was staged in 386 chronic HBV carriers who underwent liver biopsy and routine laboratory testing. Correlations between routine laboratory markers and fibrosis stage were statistically assessed. After logistic regression analysis, a novel predictive model was constructed. This S index was validated in an independent cohort of 146 chronic HBV carriers in comparison to the SLFG model, Fibrometer, Hepascore, Hui model, Forns score and APRI using receiver operating characteristic (ROC) curves. The diagnostic values of each marker panels were better than single routine laboratory markers. The S index consisting of gamma-glutamyltransferase (GGT), platelets (PLT) and albumin (ALB) (S-index: 1000 x GGT/(PLT x ALB(2))) had a higher diagnostic accuracy in predicting degree of fibrosis than any other mathematical model tested. The areas under the ROC curves (AUROC) were 0.812 and 0.890 for predicting significant fibrosis and cirrhosis in the validation cohort, respectively. The S index, a simpler mathematical model consisting of routine laboratory markers predicts significant fibrosis and cirrhosis in patients with chronic HBV infection with a high degree of accuracy, potentially decreasing the need for liver biopsy.
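    The S index reported above is a closed-form expression, so it can be computed directly from routine laboratory values. A minimal sketch (the function name and example values are illustrative, not from the study, and the study's diagnostic cutoffs are not reproduced here; callers must supply values in the units the original paper used):

    ```python
    def s_index(ggt, plt, alb):
        """S index = 1000 * GGT / (PLT * ALB^2).

        ggt: gamma-glutamyltransferase
        plt: platelet count
        alb: albumin
        Consistent units (as in the original study) are the caller's responsibility.
        """
        return 1000.0 * ggt / (plt * alb ** 2)

    # Illustrative values only (not patient data):
    print(s_index(50.0, 200.0, 40.0))  # 0.15625
    ```

    Higher GGT raises the score while higher platelet and albumin values lower it, which matches the direction of association with fibrosis described in the abstract.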

  5. Statistical analysis and data mining of digital reconstructions of dendritic morphologies.

    PubMed

    Polavaram, Sridevi; Gillette, Todd A; Parekh, Ruchi; Ascoli, Giorgio A

    2014-01-01

    Neuronal morphology is diverse among animal species, developmental stages, brain regions, and cell types. The geometry of individual neurons also varies substantially even within the same cell class. Moreover, specific histological, imaging, and reconstruction methodologies can differentially affect morphometric measures. The quantitative characterization of neuronal arbors is necessary for in-depth understanding of the structure-function relationship in nervous systems. The large collection of community-contributed digitally reconstructed neurons available at NeuroMorpho.Org constitutes a "big data" research opportunity for neuroscience discovery beyond the approaches typically pursued in single laboratories. To illustrate this potential and the related challenges, we present a database-wide statistical analysis of dendritic arbors enabling the quantification of major morphological similarities and differences across broadly adopted metadata categories. Furthermore, we adopt a complementary unsupervised approach based on clustering and dimensionality reduction to identify the main morphological parameters leading to the most statistically informative structural classification. We find that specific combinations of measures related to branching density, overall size, tortuosity, bifurcation angles, arbor flatness, and topological asymmetry can capture anatomically and functionally relevant features of dendritic trees. The reported results only represent a small fraction of the relationships available for data exploration and hypothesis testing enabled by sharing of digital morphological reconstructions.

  6. Effectiveness of practices to reduce blood culture contamination: A Laboratory Medicine Best Practices systematic review and meta-analysis☆

    PubMed Central

    Snyder, Susan R.; Favoretto, Alessandra M.; Baetz, Rich Ann; Derzon, James H.; Madison, Bereneice M.; Mass, Diana; Shaw, Colleen S.; Layfield, Christopher D.; Christenson, Robert H.; Liebow, Edward B.

    2015-01-01

    Objectives This article is a systematic review of the effectiveness of three practices for reducing blood culture contamination rates: venipuncture, phlebotomy teams, and prepackaged preparation/collection (prep) kits. Design and methods The CDC-funded Laboratory Medicine Best Practices Initiative systematic review methods for quality improvement practices were used. Results Studies included as evidence were: 9 venipuncture (versus intravenous catheter), 5 phlebotomy team, and 7 prep kit. All studies for venipuncture and phlebotomy teams favored these practices, with meta-analysis mean odds ratios of 2.69 for venipuncture and 2.58 for phlebotomy teams. For prep kits, 6 studies’ effect sizes were not statistically significantly different from no effect (meta-analysis mean odds ratio 1.12). Conclusions Venipuncture and the use of phlebotomy teams are effective practices for reducing blood culture contamination rates in diverse hospital settings and are recommended as evidence-based “best practices” with high overall strength of evidence and substantial effect size ratings. No recommendation is made for or against prep kits based on uncertain improvement. PMID:22709932

  7. A portable x-ray fluorescence instrument for analyzing dust wipe samples for lead: evaluation with field samples.

    PubMed

    Sterling, D A; Lewis, R D; Luke, D A; Shadel, B N

    2000-06-01

    Dust wipe samples collected in the field were tested by nondestructive X-ray fluorescence (XRF) followed by laboratory analysis with flame atomic absorption spectrophotometry (FAAS). Data were analyzed for precision and accuracy of measurement. Replicate samples with the XRF show high precision with an intraclass correlation coefficient (ICC) of 0.97 (P<0.0001) and an overall coefficient of variation of 11.6%. Paired comparison indicates no statistical difference (P=0.272) between XRF and FAAS analysis. Paired samples are highly correlated with an R(2) ranging between 0.89 for samples that contain paint chips and 0.93 for samples that do not contain paint chips. The ICC for absolute agreement between XRF and laboratory results was 0.95 (P<0.0001). The relative error over the concentration range of 25 to 14,200 microgram Pb is -12% (95% CI, -18 to -5). The XRF appears to be an excellent method for rapid on-site evaluation of dust wipes for clearance and risk assessment purposes, although there are indications of some confounding when paint chips are present. Copyright 2000 Academic Press.
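    The precision figures cited above (e.g., an overall coefficient of variation of 11.6%) summarize spread in replicate measurements relative to their mean; the coefficient of variation itself is simple to compute. A minimal sketch using the Python standard library (the function name and replicate values are illustrative, not the study's data):

    ```python
    import statistics

    def coefficient_of_variation(replicates):
        """CV (%) = sample standard deviation / mean * 100."""
        return statistics.stdev(replicates) / statistics.mean(replicates) * 100.0

    # Illustrative replicate XRF readings of one wipe (micrograms Pb)
    print(coefficient_of_variation([9.0, 10.0, 11.0]))  # 10.0
    ```

    A lower CV across replicates indicates higher precision, which is what the reported 11.6% figure characterizes for the XRF instrument.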

  8. Agreement between allergen-specific IgE assays and ensuing immunotherapy recommendations from four commercial laboratories in the USA

    PubMed Central

    Plant, Jon D; Neradelik, Moni B; Polissar, Nayak L; Fadok, Valerie A; Scott, Brian A

    2014-01-01

    Background Canine allergen-specific IgE assays in the USA are not subjected to an independent laboratory reliability monitoring programme. Hypothesis/Objectives The aim of this study was to evaluate the agreement of diagnostic results and treatment recommendations of four serum IgE assays commercially available in the USA. Methods Replicate serum samples from 10 atopic dogs were submitted to each of four laboratories for allergen-specific IgE assays (ACTT®, VARL Liquid Gold, ALLERCEPT® and Greer® Aller-g-complete®). The interlaboratory agreement of standard, regional panels and ensuing treatment recommendations were analysed with the kappa statistic (κ) to account for agreement that might occur merely by chance. Six comparisons of pairs of laboratories and overall agreement among laboratories were analysed for ungrouped allergens (as tested) and also with allergens grouped according to reported cross-reactivity and taxonomy. Results The overall chance-corrected agreement of the positive/negative test results for ungrouped and grouped allergens was slight (κ = 0.14 and 0.13, respectively). Subset analysis of the laboratory pair with the highest level of diagnostic agreement (κ = 0.36) found slight agreement (κ = 0.13) for ungrouped plants and fungi, but substantial agreement (κ = 0.71) for ungrouped mites. The overall agreement of the treatment recommendations was slight (κ = 0.11). Altogether, 85.1% of ungrouped allergen treatment recommendations were unique to one laboratory or another. Conclusions and clinical importance Our study indicated that the choice of IgE assay may have a major influence on the positive/negative results and ensuing treatment recommendations. PMID:24461034
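    The chance-corrected agreement reported above is the kappa statistic; for a pair of laboratories' binary positive/negative calls it reduces to Cohen's kappa. A minimal sketch, assuming calls encoded as 0/1 (the function and the example data are illustrative, not taken from the study):

    ```python
    def cohens_kappa(calls_a, calls_b):
        """Cohen's kappa for two raters' binary calls (1 = positive)."""
        n = len(calls_a)
        # Observed proportion of agreement
        po = sum(a == b for a, b in zip(calls_a, calls_b)) / n
        # Expected agreement by chance, from each rater's marginal positive rates
        pa, pb = sum(calls_a) / n, sum(calls_b) / n
        pe = pa * pb + (1 - pa) * (1 - pb)
        return (po - pe) / (1 - pe)

    # Illustrative: two labs agree on 3 of 4 allergen calls
    print(cohens_kappa([1, 1, 0, 0], [1, 0, 0, 0]))  # 0.5
    ```

    Values near 0 indicate agreement no better than chance, which is why the study characterizes kappa around 0.11 to 0.14 as only "slight" agreement.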

  9. Inter-laboratory comparison of radiometric culture for Mycobacterium avium subsp. paratuberculosis using raw milk from known infected herds and individual dairy cattle in Victoria.

    PubMed

    Ridge, S E; Andreata, S; Jones, K; Cantlon, K; Francis, B; Florisson, N; Gwozdz, J

    2010-07-01

    To compare the results of radiometric culture conducted in three Australian laboratories for Mycobacterium avium subsp. paratuberculosis (Mptb) using bulk vat and individual animal milk samples. Milk samples were collected from 15 cows exhibiting clinical signs of Johne's disease, and subsequently confirmed as infected with Mptb, and from the bulk milk vats on 91 farms running herds known to be infected with Mptb. Each milk sample was divided into three equivalent samples and one of each of the replicates was forwarded to the three participating laboratories. The identity and nature of the samples were protected from the study collaborators. The laboratories processed the samples and undertook radiometric culture for Mptb using their standard method. Results of testing were provided to the principal investigator for collation and analysis. In total, 2 (2.2%) of 91 vat-milk samples and 8 (53.3%) of 15 individual cows' milk samples returned positive radiometric milk culture results. Only one sample, from a clinical case of Johne's disease, was identified as positive by more than one laboratory. There were differences in the absolute frequency with which Mptb was identified in the milk samples by the collaborating laboratories. Mptb was cultured from a very small percentage of Australian raw bulk milk samples sourced from known infected herds. By contrast, Mptb was successfully cultured from half of the milk samples collected from clinically affected cows. There was no statistical difference between laboratories in the proportion of vat samples or individual animal milk samples in which Mptb was detected.

  10. Methods for detection of haemophilia carriers: a Memorandum*

    PubMed Central

    1977-01-01

    This Memorandum discusses the problems and techniques involved in the detection of carriers of haemophilia A (blood coagulation factor VIII deficiency) and haemophilia B (factor IX deficiency), particularly with a view to its application to genetic counselling. Apart from the personal suffering caused by haemophilia, the proper treatment of haemophiliacs places a great strain on the blood transfusion services, and it is therefore important that potential carriers should have precise information about the consequences of their having children. The Memorandum classifies the types of carrier and describes the laboratory methods used for the assessment of coagulant activity and antigen concentration in blood. Particular emphasis is laid on the establishment of international, national, and laboratory (working) standards for factors VIII and IX and their calibration in international units (IU). This is followed by a detailed account of the statistical analysis of pedigree and laboratory data, which leads to an assessment of the likelihood that a particular person will transmit the haemophilia gene to her children. Finally, the problems and responsibilities involved in genetic counselling are considered. PMID:304395

  11. A Computerized Data-Capture System for Animal Biosafety Level 4 Laboratories

    PubMed Central

    Bente, Dennis A; Friesen, Jeremy; White, Kyle; Koll, Jordan; Kobinger, Gary P

    2011-01-01

    The restrictive nature of an Animal Biosafety Level 4 (ABSL4) laboratory complicates even simple clinical evaluation including data capture. Typically, clinical data are recorded on paper during procedures, faxed out of the ABSL4, and subsequently manually entered into a computer. This system has many disadvantages including transcriptional errors. Here, we describe the development of a highly customizable, tablet-PC-based computerized data-capture system, allowing reliable collection of observational and clinical data from experimental animals in a restrictive biocontainment setting. A multidisciplinary team with skills in containment laboratory animal science, database design, and software engineering collaborated on the development of this system. The goals were to design an easy-to-use and flexible user interface on a touch-screen tablet PC with user-supportable processes for recovery, full auditing capabilities, and cost effectiveness. The system simplifies data capture, reduces the necessary time in an ABSL4 environment, offers timely reporting and review of data, facilitates statistical analysis, reduces potential of erroneous data entry, improves quality assurance of animal care, and advances the use and refinement of humane endpoints. PMID:22330712

  12. The concordance of serial ANA tests in an Australian tertiary hospital pathology laboratory.

    PubMed

    Lee, Adrian Y S; Hudspeth, Andrew R; Adelstein, Stephen

    2016-10-01

    The antinuclear antibody (ANA) tests are some of the more frequently requested tests for the diagnosis of autoimmunity. Although they are used primarily as diagnostic blood tests, multiple requests on the same patient continue to be encountered in the laboratory. This retrospective analysis of serial ANA testing at one pathology laboratory in Australia is the first study that examines the statistical concordance and possible implications of this on clinical practice. High-titred ANA have quite good repeatability for titre and pattern, and low-titred ANA, which can be non-specific, have poor repeatability. Staining patterns are, in general, almost random in nature on serial tests when compared to the first-obtained ANA pattern for each patient. This study confirms that there is little benefit in serial ANA testing, and only if there is a clear change in the patient's clinical picture would repeat of an initial low-titred ANA be useful. The findings reinforce the need for pathology stewardship to minimise costs, wasted resources and unnecessary referrals. Copyright © 2016 Royal College of Pathologists of Australasia. All rights reserved.

  13. 75 FR 8993 - Agency Information Collection Activities: Reinstatement, With Change, of a Previously Approved...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-26

    ... Publicly Funded Forensic Crime Laboratories. The Department of Justice (DOJ), Bureau of Justice Statistics... of Publicly Funded Forensic Crime Laboratories. (3) Agency form number, if any, and the applicable... perform forensic analyses on criminal evidence.

  14. Uranium hydrogeochemical and stream sediment reconnaissance of the Solomon NTMS quadrangle, Alaska

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langfeldt, S.L.; Youngquist, C.A.; D'Andrea, R.F. Jr.

    This report presents results of a Hydrogeochemical and Stream Sediment Reconnaissance (HSSR) of the Solomon NTMS quadrangle, Alaska. In addition to this abbreviated data release, more complete data are available to the public in machine-readable form through the Grand Junction Office Information System at Oak Ridge National Laboratory. Presented in this data release are location data, field analyses, and laboratory analyses of several different sample media. For the sake of brevity, many field site observations have not been included in this volume. These data are, however, available on the magnetic tape. Appendices A and B describe the sample media and summarize the analytical results for each medium. The data were subdivided by one of the Los Alamos National Laboratory (LANL) sorting programs of Zinkl and others into groups of stream sediment and stream water samples. For each group which contains a sufficient number of observations, statistical tables, tables of raw data, and 1:1000000 scale maps of pertinent elements have been included in this report. In addition, maps showing results of multivariate statistical analyses have been included. Further information about the HSSR program in general, or about the LANL portion of the program in particular, can be obtained in quarterly or semiannual program progress reports on open-file at DOE's Technical Library in Grand Junction. Information about the field and analytical procedures used by LANL during sample collection and analysis may be found in any HSSR data release prepared by the LANL and will not be included in this report.

  15. Discourse in science communities: Issues of language, authority, and gender in a life sciences laboratory

    NASA Astrophysics Data System (ADS)

    Conefrey, Theresa Catherine

    Government-sponsored and private research initiatives continue to document the underrepresentation of women in the sciences. Despite policy initiatives, women's attrition rates at each stage of their scientific careers remain higher than those of their male colleagues. In order to improve retention rates, more information is needed about why many drop out or do not succeed as well as they could. While broad sociological studies and statistical surveys offer a valuable overview of institutional practices, in-depth qualitative analyses are needed to complement these large-scale studies. The present study goes behind statistical generalizations about the situation of women in science to explore the actual experience of scientific socialization and professionalization. Beginning with one reason often cited by women who have dropped out of science: "a bad lab experience," I explore through detailed observation in a naturalistic setting what this phrase might actually mean. Using ethnographic and discourse analytic methods, I present a detailed analysis of the discourse patterns in a life sciences laboratory group at a large research university. I show how language accomplishes the work of indexing and constituting social constraints, of maintaining or undermining the hierarchical power dynamics of the laboratory, of shaping members' presentation of self, and of modeling social and professional skills required to "do science." Despite the widespread conviction among scientists that "the mind has no sex," my study details how gender marks many routine interactions in the lab, including an emphasis on competition, a reinforcement of sex-role stereotypes, and a conversational style that is in several respects more compatible with men's than women's forms of talk.

  16. Evaluation of mericon E. coli O157 Screen Plus and mericon E. coli STEC O-Type Pathogen Detection Assays in Select Foods: Collaborative Study, First Action 2017.05.

    PubMed

    Bird, Patrick; Benzinger, M Joseph; Bastin, Benjamin; Crowley, Erin; Agin, James; Goins, David; Armstrong, Marcia

    2018-05-01

    QIAGEN mericon Escherichia coli O157 Screen Plus and mericon E. coli Shiga toxin-producing E. coli (STEC) O-Type Pathogen Detection Assays use Real-Time PCR technology for the rapid, accurate detection of E. coli O157 and the "big six" (O26, O45, O103, O111, O121, O145) (non-O157 STEC) in select food types. Using a paired study design, the assays were compared with the U.S. Department of Agriculture, Food Safety Inspection Service Microbiology Laboratory Guidebook Chapter 5.09 reference method for the detection of E. coli O157:H7 in raw ground beef. Both mericon assays were evaluated using the manual and an automated DNA extraction method. Thirteen technicians from five laboratories located within the continental United States participated in the collaborative study. Three levels of contamination were evaluated. Statistical analysis was conducted according to the probability of detection (POD) statistical model. Results obtained for the low-inoculum level test portions produced a difference between laboratories POD (dLPOD) value with a 95% confidence interval of 0.00 (-0.12, 0.12) for the mericon E. coli O157 Screen Plus with manual and automated extraction and mericon E. coli STEC O-Type with manual extraction and -0.01 (-0.13, 0.10) for the mericon E. coli STEC O-Type with automated extraction. The dLPOD results indicate equivalence between the candidate methods and the reference method.
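    The POD model used in the collaborative study summarizes each method as a probability of detection (detections divided by test portions analyzed), with dLPOD as the difference between candidate and reference method PODs; the published confidence intervals come from the full AOAC procedure, which is not reproduced here. A minimal sketch of the point estimates only (function names and counts are illustrative):

    ```python
    def pod(detections, trials):
        """Point estimate of the probability of detection."""
        return detections / trials

    def dpod(cand_detections, cand_trials, ref_detections, ref_trials):
        """Difference in POD between candidate and reference methods."""
        return pod(cand_detections, cand_trials) - pod(ref_detections, ref_trials)

    # Illustrative counts: candidate detects 10/20, reference detects 12/20
    print(round(dpod(10, 20, 12, 20), 3))  # -0.1
    ```

    A dPOD whose confidence interval contains zero, as in the results above, is the basis for concluding equivalence between the candidate and reference methods.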

  17. Statistical Methods and Tools for Uxo Characterization (SERDP Final Technical Report)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pulsipher, Brent A.; Gilbert, Richard O.; Wilson, John E.

    2004-11-15

    The Strategic Environmental Research and Development Program (SERDP) issued a statement of need for FY01 titled Statistical Sampling for Unexploded Ordnance (UXO) Site Characterization that solicited proposals to develop statistically valid sampling protocols for cost-effective, practical, and reliable investigation of sites contaminated with UXO; protocols that could be validated through subsequent field demonstrations. The SERDP goal was the development of a sampling strategy for which a fraction of the site is initially surveyed by geophysical detectors to confidently identify clean areas and subsections (target areas, TAs) that had elevated densities of anomalous geophysical detector readings that could indicate the presence of UXO. More detailed surveys could then be conducted to search the identified TAs for UXO. SERDP funded three projects: those proposed by the Pacific Northwest National Laboratory (PNNL) (SERDP Project No. UXO 1199), Sandia National Laboratory (SNL), and Oak Ridge National Laboratory (ORNL). The projects were closely coordinated to minimize duplication of effort and facilitate use of shared algorithms where feasible. This final report for PNNL Project 1199 describes the methods developed by PNNL to address SERDP's statement-of-need for the development of statistically-based geophysical survey methods for sites where 100% surveys are unattainable or cost prohibitive.

  18. Evaluation of the VIDAS Listeria (LIS) immunoassay for the detection of Listeria in foods using demi-Fraser and Fraser enrichment broths, as modification of AOAC Official Method 999.06 (AOAC Official Method 2004.06).

    PubMed

    Silbernagel, Karen M; Jechorek, Robert P; Kaufer, Amanda L; Johnson, Ronald L; Aleo, V; Brown, B; Buen, M; Buresh, J; Carson, M; Franklin, J; Ham, P; Humes, L; Husby, G; Hutchins, J; Jechorek, R; Jenkins, J; Kaufer, A; Kexel, N; Kora, L; Lam, L; Lau, D; Leighton, S; Loftis, M; Luc, S; Martin, J; Nacar, I; Nogle, J; Park, J; Schultz, A; Seymore, D; Smith, C; Smith, J; Thou, P; Ulmer, M; Voss, R; Weaver, V

    2005-01-01

    A multilaboratory study was conducted to compare the VIDAS LIS immunoassay with the standard cultural methods for the detection of Listeria in foods using an enrichment modification of AOAC Official Method 999.06. The modified enrichment protocol was implemented to harmonize the VIDAS LIS assay with the VIDAS LMO2 assay. Five food types--brie cheese, vanilla ice cream, frozen green beans, frozen raw tilapia fish, and cooked roast beef--at 3 inoculation levels, were analyzed by each method. A total of 15 laboratories representing government and industry participated. In this study, 1206 test portions were tested, of which 1170 were used in the statistical analysis. There were 433 positives by the VIDAS LIS assay and 396 positives by the standard culture methods. A Chi-square analysis of each of the 5 food types, at the 3 inoculation levels tested, was performed. The resulting average Chi-square value, 0.42, indicated that, overall, there were no statistical differences between the VIDAS LIS assay and the standard methods at the 5% level of significance.

  19. In-situ structural integrity evaluation for high-power pulsed spallation neutron source - Effects of cavitation damage on structural vibration

    NASA Astrophysics Data System (ADS)

    Wan, Tao; Naoe, Takashi; Futakawa, Masatoshi

    2016-01-01

    A double-wall structure mercury target will be installed at the high-power pulsed spallation neutron source in the Japan Proton Accelerator Research Complex (J-PARC). Cavitation damage on the inner wall is an important factor governing the lifetime of the target vessel. To monitor the structural integrity of the target vessel, the displacement velocity at a point on the outer surface of the target vessel is measured using a laser Doppler vibrometer (LDV). The measured signals can be used to evaluate damage inside the target vessel caused by cyclic loading and by cavitation-bubble collapse induced by pulsed-beam pressure waves. Wavelet differential analysis (WDA) was applied to reveal the effects of the damage on the vibrational cycling. To reduce the influence of noise superimposed on the vibration signals on the WDA results, two statistical methods, analysis of variance (ANOVA) and analysis of covariance (ANCOVA), were applied. Results from laboratory experiments, numerical simulations with random noise added, and target-vessel field data were analyzed with the WDA and the statistical methods. The analyses demonstrated that the established in-situ diagnostic technique can effectively evaluate the structural response of the target vessel.

  20. GHEP-ISFG collaborative simulated exercise for DVI/MPI: Lessons learned about large-scale profile database comparisons.

    PubMed

    Vullo, Carlos M; Romero, Magdalena; Catelli, Laura; Šakić, Mustafa; Saragoni, Victor G; Jimenez Pleguezuelos, María Jose; Romanini, Carola; Anjos Porto, Maria João; Puente Prieto, Jorge; Bofarull Castro, Alicia; Hernandez, Alexis; Farfán, María José; Prieto, Victoria; Alvarez, David; Penacino, Gustavo; Zabalza, Santiago; Hernández Bolaños, Alejandro; Miguel Manterola, Irati; Prieto, Lourdes; Parsons, Thomas

    2016-03-01

    The GHEP-ISFG Working Group has recognized the importance of assisting DNA laboratories to gain expertise in handling DVI or missing persons identification (MPI) projects, which involve large-scale genetic profile comparisons. Eleven laboratories participated in a DNA matching exercise to identify victims from a hypothetical conflict with 193 missing persons. The post mortem database comprised 87 skeletal-remain profiles from a secondary mass grave representing a minimum of 58 individuals, with evidence of commingling. The reference database consisted of 286 family reference profiles with diverse pedigrees. The goal of the exercise was to correctly discover re-associations and family matches. The results of direct matching for commingled-remains re-associations were correct and fully concordant among all laboratories. However, the kinship analysis for missing persons identifications showed variable results among the participants. One group of laboratories obtained correct, concordant results, but nearly half of the others reported discrepant results, exhibiting likelihood ratio differences of several orders of magnitude in some cases. Three main errors were detected: (a) some laboratories did not use the complete reference family genetic data to report the match with the remains, (b) the identity and/or non-identity hypotheses were sometimes wrongly expressed in the likelihood ratio calculations, and (c) many laboratories did not properly evaluate the prior odds for the event. The results suggest that large-scale profile comparison for DVI or MPI is a challenge for forensic genetics laboratories, and that the statistical treatment of DNA matching and the Bayesian framework should be better standardized among laboratories. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
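
    Error (c), mishandling of prior odds, comes down to Bayes' theorem in odds form: posterior odds = likelihood ratio x prior odds. A hypothetical sketch (the numbers are illustrative, not taken from the exercise):

```python
def posterior_probability(lr, prior_odds):
    """Posterior probability of identity via Bayes' theorem in odds form:
    posterior odds = likelihood ratio * prior odds."""
    post_odds = lr * prior_odds
    return post_odds / (1.0 + post_odds)

# Hypothetical DVI scenario: 193 missing persons, so reasonable prior odds
# for one specific candidate are about 1 to 192, not 1 to 1.
lr = 100_000.0
p_flat = posterior_probability(lr, 1.0)          # ignores the prior odds
p_dvi = posterior_probability(lr, 1.0 / 192.0)   # accounts for them
```

    Even with a large likelihood ratio, ignoring the prior odds overstates the posterior probability of identity.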

  1. Quantifying Variation in Gait Features from Wearable Inertial Sensors Using Mixed Effects Models

    PubMed Central

    Cresswell, Kellen Garrison; Shin, Yongyun; Chen, Shanshan

    2017-01-01

    The emerging technology of wearable inertial sensors has shown its advantages in collecting continuous longitudinal gait data outside laboratories. This freedom also presents challenges in collecting high-fidelity gait data. In the free-living environment, without constant supervision from researchers, sensor-based gait features are susceptible to variation from confounding factors such as gait speed and mounting uncertainty, which are challenging to control or estimate. This paper is one of the first attempts in the field to tackle such challenges using statistical modeling. By accepting the uncertainties and variation associated with wearable sensor-based gait data, we shift our efforts from detecting and correcting those variations to modeling them statistically. From gait data collected on one healthy, non-elderly subject during 48 full-factorial trials, we identified four major sources of variation, and quantified their impact on one gait outcome—range per cycle—using a random effects model and a fixed effects model. The methodology developed in this paper lays the groundwork for a statistical framework to account for sources of variation in wearable gait data, thus facilitating informative statistical inference for free-living gait analysis. PMID:28245602

  2. New reporting procedures based on long-term method detection levels and some considerations for interpretations of water-quality data provided by the U.S. Geological Survey National Water Quality Laboratory

    USGS Publications Warehouse

    Childress, Carolyn J. Oblinger; Foreman, William T.; Connor, Brooke F.; Maloney, Thomas J.

    1999-01-01

    This report describes the U.S. Geological Survey National Water Quality Laboratory's approach for determining long-term method detection levels and establishing reporting levels, details relevant new reporting conventions, and provides preliminary guidance on interpreting data reported with the new conventions. At the long-term method detection level concentration, the risk of a false positive detection (analyte reported present at the long-term method detection level when not in the sample) is no more than 1 percent. However, at the long-term method detection level, the risk of a false negative occurrence (analyte reported not present when present at the long-term method detection level concentration) is up to 50 percent. Because this false negative rate is too high for use as a default 'less than' reporting level, a more reliable laboratory reporting level is set at twice the determined long-term method detection level. For all methods, concentrations measured between the laboratory reporting level and the long-term method detection level will be reported as estimated concentrations. Non-detections will be censored to the laboratory reporting level. Adoption of the new reporting conventions requires a full understanding of how low-concentration data can be used and interpreted, and places responsibility for using and presenting final data with the user rather than with the laboratory. Users must consider that (1) new laboratory reporting levels may differ from previously established minimum reporting levels, (2) long-term method detection levels and laboratory reporting levels may change over time, and (3) estimated concentrations are less certain than concentrations reported above the laboratory reporting level. The availability of uncensored but qualified low-concentration data for interpretation and statistical analysis is a substantial benefit to the user.
A decision to censor data after they are reported from the laboratory may still be made by the user, if merited, on the basis of the intended use of the data.
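
    The reporting convention just described reduces to a simple rule. A hypothetical sketch (the function and names are illustrative, not USGS software):

```python
def report(concentration, lt_mdl, detected):
    """Apply the reporting convention described above: the laboratory
    reporting level (LRL) is twice the long-term method detection level
    (LT-MDL); detections between the LT-MDL and the LRL are reported as
    estimated ('E'); non-detections are censored to '< LRL'."""
    lrl = 2.0 * lt_mdl
    if not detected:
        return f"< {lrl}"
    if concentration < lrl:
        return f"E {concentration}"  # estimated concentration
    return str(concentration)
```

    For example, with an LT-MDL of 0.5 (hypothetical units), a detected 0.8 is reported as the estimated value "E 0.8", while a non-detection is censored to "< 1.0".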

  3. Evaluation of 3M™ Molecular Detection Assay (MDA) Listeria for the Detection of Listeria species in Selected Foods and Environmental Surfaces: Collaborative Study, First Action 2014.06.

    PubMed

    Bird, Patrick; Flannery, Jonathan; Crowley, Erin; Agin, James; Goins, David; Monteroso, Lisa; Benesh, DeAnn

    2015-01-01

    The 3M™ Molecular Detection Assay (MDA) Listeria is used with the 3M Molecular Detection System for the detection of Listeria species in food, food-related, and environmental samples after enrichment. The assay utilizes loop-mediated isothermal amplification to rapidly amplify Listeria target DNA with high specificity and sensitivity, combined with bioluminescence to detect the amplification. The 3M MDA Listeria method was evaluated using an unpaired study design in a multilaboratory collaborative study and compared to AOAC Official Methods of Analysis (OMA) 993.12, Listeria monocytogenes in Milk and Dairy Products, as the reference method for the detection of Listeria species in full-fat (4% milk fat) cottage cheese (25 g test portions). A total of 15 laboratories located in the continental United States and Canada participated. The matrix was evaluated at three inoculation levels: an uninoculated control level (0 CFU/test portion) and two levels artificially contaminated with Listeria monocytogenes using non-heat-stressed cells, a low inoculum level (0.2-2 CFU/test portion) and a high inoculum level (2-5 CFU/test portion). In total, 792 unpaired replicate portions were analyzed. Statistical analysis was conducted according to the probability of detection (POD) model. Results obtained for the low inoculum level test portions produced a difference in cross-laboratory POD values of -0.07 with a 95% confidence interval of (-0.19, 0.06). No statistically significant differences were observed in the number of positive samples detected by the 3M MDA Listeria method versus the AOAC OMA method.

  4. Space debris characterization in support of a satellite breakup model

    NASA Technical Reports Server (NTRS)

    Fortson, Bryan H.; Winter, James E.; Allahdadi, Firooz A.

    1992-01-01

    The Space Kinetic Impact and Debris Branch began an ambitious program to construct a fully analytical model of the breakup of a satellite under hypervelocity impact. To provide empirical data with which to substantiate the model, debris from hypervelocity experiments conducted in a controlled laboratory environment was characterized to provide information on its mass, velocity, and ballistic-coefficient distributions. Data on the debris were collected in one master data file, and a simple FORTRAN program allows users to describe the debris from any subset of these experiments that may be of interest to them. A statistical analysis was performed, allowing users to determine the precision of the velocity measurements for the data. Attempts are being made to include and correlate other laboratory data, as well as data obtained from the explosion or collision of spacecraft in low Earth orbit.

  5. A Possible Tool for Checking Errors in the INAA Results, Based on Neutron Data and Method Validation

    NASA Astrophysics Data System (ADS)

    Cincu, Em.; Grigore, Ioana Manea; Barbos, D.; Cazan, I. L.; Manu, V.

    2008-08-01

    This work presents preliminary results of a new type of possible application in INAA elemental-analysis experiments, useful for checking errors that occur during investigation of unknown samples; it relies on the INAA method-validation experiments and on the accuracy of neutron data from the literature. The paper comprises two sections. The first briefly presents the steps of the experimental tests carried out for INAA method validation and for establishing the performance of the `ACTIVA-N' laboratory, which is, at the same time, an illustration of the laboratory's evolution toward high performance. Section 2 presents our recent INAA results on CRMs, whose interpretation opens a discussion about the usefulness of a tool for checking possible errors that differs from the usual statistical procedures. The questionable aspects and the requirements for developing a practical checking tool are discussed.

  6. Evolution, Energy Landscapes and the Paradoxes of Protein Folding

    PubMed Central

    Wolynes, Peter G.

    2014-01-01

    Protein folding has been viewed as a difficult problem of molecular self-organization. The search problem involved in folding however has been simplified through the evolution of folding energy landscapes that are funneled. The funnel hypothesis can be quantified using energy landscape theory based on the minimal frustration principle. Strong quantitative predictions that follow from energy landscape theory have been widely confirmed both through laboratory folding experiments and from detailed simulations. Energy landscape ideas also have allowed successful protein structure prediction algorithms to be developed. The selection constraint of having funneled folding landscapes has left its imprint on the sequences of existing protein structural families. Quantitative analysis of co-evolution patterns allows us to infer the statistical characteristics of the folding landscape. These turn out to be consistent with what has been obtained from laboratory physicochemical folding experiments signalling a beautiful confluence of genomics and chemical physics. PMID:25530262

  7. Voice measures of workload in the advanced flight deck: Additional studies

    NASA Technical Reports Server (NTRS)

    Schneider, Sid J.; Alpert, Murray

    1989-01-01

    These studies investigated acoustical analysis of the voice as a measure of workload in individual operators. In the first study, voice samples were recorded from a single operator during high, medium, and low workload conditions. Mean amplitude, frequency, syllable duration, and emphasis all tended to increase as workload increased. In the second study, NASA test pilots performed a laboratory task, and used a flight simulator under differing work conditions. For two of the pilots, high workload in the simulator brought about greater amplitude, peak duration, and stress. In both the laboratory and simulator tasks, high workload tended to be associated with more statistically significant drop-offs in the acoustical measures than were lower workload levels. There was a great deal of intra-subject variability in the acoustical measures. The results suggested that in individual operators, increased workload might be revealed by high initial amplitude and frequency, followed by rapid drop-offs over time.

  8. USE OF NATURAL WATERS AS U. S. GEOLOGICAL SURVEY REFERENCE SAMPLES.

    USGS Publications Warehouse

    Janzer, Victor J.

    1985-01-01

    The U. S. Geological Survey conducts research and collects hydrologic data relating to the Nation's water resources. Seven types of natural matrix reference water samples are prepared for use in the Survey's quality assurance program. These include samples containing major constituents, trace metals, nutrients, herbicides, insecticides, trace metals in a water and suspended-sediment mixture, and precipitation (snowmelt). To prepare these reference samples, natural water is collected in plastic drums and the sediment is allowed to settle. The water is then filtered, selected constituents are added, and if necessary the water is acidified and sterilized by ultraviolet irradiation before bottling in plastic or glass. These reference samples are distributed twice yearly to more than 100 laboratories for chemical analysis. The most probable values for each constituent are determined by evaluating the data submitted by the laboratories using statistical techniques recommended by ASTM.

  9. Statistics of software vulnerability detection in certification testing

    NASA Astrophysics Data System (ADS)

    Barabanov, A. V.; Markov, A. S.; Tsirlov, V. L.

    2018-05-01

    The paper discusses practical aspects of introducing software-vulnerability detection methods into the day-to-day activities of an accredited testing laboratory. It presents approval results for the vulnerability detection methods applied in studies of open-source software and of software submitted as a test object for certification testing under information-security requirements, including software for communication networks. Results of the study showing the distribution of identified vulnerabilities by attack type, country of origin, programming language used in development, vulnerability detection method, etc. are given. The experience of foreign information-security certification systems with detecting vulnerabilities in certified software is analyzed. The main conclusion of the study is the need to implement secure-software development practices in the development life-cycle processes. Conclusions and recommendations for testing laboratories on implementing vulnerability analysis methods are laid down.

  10. Computer Series, 61: Bits & Pieces, 24.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1985-01-01

    Describes: (1) laboratory information science in the clinical chemistry curriculum; (2) testing Boyle's Law, a context for statistical methods in undergraduate laboratories; (3) acquiring chemical concepts using microcomputers as tutees; and (4) using Data Interchange Format files for Apple microcomputers. Includes feedback from a previous article…

  11. A laboratory nanoseismological study on deep-focus earthquake micromechanics

    PubMed Central

    Wang, Yanbin; Zhu, Lupei; Shi, Feng; Schubnel, Alexandre; Hilairet, Nadege; Yu, Tony; Rivers, Mark; Gasc, Julien; Addad, Ahmed; Deldicque, Damien; Li, Ziyu; Brunet, Fabrice

    2017-01-01

    The global earthquake occurrence rate displays an exponential decay down to ~300 km depth and then peaks around 550 to 600 km before terminating abruptly near 700 km. How fractures initiate, nucleate, and propagate at these depths remains one of the greatest puzzles in earth science, as increasing pressure inhibits fracture propagation. We report nanoseismological analysis of high-resolution acoustic emission (AE) records obtained during ruptures triggered by partial transformation from olivine to spinel in Mg2GeO4, an analog to the dominant upper-mantle mineral (Mg,Fe)2SiO4 olivine, using state-of-the-art seismological techniques in the laboratory. The AEs’ focal mechanisms, as well as their distribution in both space and time during deformation, are carefully analyzed. Microstructure analysis shows that AEs are produced by the dynamic propagation of shear bands consisting of nanograined spinel. These nanoshear bands have a near-constant thickness (~100 nm) but varying lengths, and self-organize during deformation. This precursory seismic process leads to ultimate macroscopic failure of the samples. Several source parameters of AE events were extracted from the recorded waveforms, allowing close tracking of event initiation, clustering, and propagation throughout the deformation/transformation process. AEs follow the Gutenberg-Richter statistics with a well-defined b value of 1.5 over three orders of moment magnitudes, suggesting that laboratory failure processes are self-affine. The seismic relation between magnitude and rupture area correctly predicts AE magnitude at millimeter scales. A rupture propagation model based on strain localization theory is proposed. Future numerical analyses may help resolve scaling issues between laboratory AE events and deep-focus earthquakes. PMID:28776024
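
    A Gutenberg-Richter b value like the one reported is conventionally estimated from event magnitudes, for example with Aki's maximum-likelihood formula. A sketch on synthetic magnitudes (the estimator choice is an assumption, not taken from the paper, and the catalog is invented):

```python
import math

def b_value(magnitudes, m_min):
    """Aki's maximum-likelihood estimate of the Gutenberg-Richter b value
    for events at or above the completeness magnitude m_min."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - m_min)

# Synthetic AE moment magnitudes (illustrative, not the study's catalog):
mags = [0.1, 0.5, 0.2, 0.35, 0.3, 0.25, 0.4, 0.15, 0.6, 0.05]
b = b_value(mags, m_min=0.0)
```

    A mean magnitude excess of about 0.29 above the completeness level corresponds to b close to 1.5, the value found for the laboratory AEs.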

  12. Quantification and statistical significance analysis of group separation in NMR-based metabonomics studies

    PubMed Central

    Goodpaster, Aaron M.; Kennedy, Michael A.

    2015-01-01

    Currently, no standard metrics are used to quantify cluster separation in PCA or PLS-DA scores plots for metabonomics studies or to determine if cluster separation is statistically significant. Lack of such measures makes it virtually impossible to compare independent or inter-laboratory studies and can lead to confusion in the metabonomics literature when authors putatively identify metabolites distinguishing classes of samples based on visual and qualitative inspection of scores plots that exhibit marginal separation. While previous papers have addressed quantification of cluster separation in PCA scores plots, none have advocated routine use of a quantitative measure of separation that is supported by a standard and rigorous assessment of whether or not the cluster separation is statistically significant. Here quantification and statistical significance of separation of group centroids in PCA and PLS-DA scores plots are considered. The Mahalanobis distance is used to quantify the distance between group centroids, and the two-sample Hotelling's T2 test is computed for the data, related to an F-statistic, and then an F-test is applied to determine if the cluster separation is statistically significant. We demonstrate the value of this approach using four datasets containing various degrees of separation, ranging from groups that had no apparent visual cluster separation to groups that had no visual cluster overlap. Widespread adoption of such concrete metrics to quantify and evaluate the statistical significance of PCA and PLS-DA cluster separation would help standardize reporting of metabonomics data. PMID:26246647
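
    The procedure outlined above (Mahalanobis distance between group centroids, Hotelling's T2, then an F-test) can be sketched for two-dimensional scores. An illustrative pure-Python version, not the authors' code, for two groups of 2-D points:

```python
def hotelling_t2(x, y):
    """Two-sample Hotelling's T2 and its F-statistic for 2-dimensional
    data given as lists of (a, b) pairs (e.g., PCA scores)."""
    n1, n2, p = len(x), len(y), 2
    mx = [sum(v[i] for v in x) / n1 for i in range(p)]
    my = [sum(v[i] for v in y) / n2 for i in range(p)]
    # pooled covariance matrix
    s = [[0.0, 0.0], [0.0, 0.0]]
    for data, m in ((x, mx), (y, my)):
        for v in data:
            for i in range(p):
                for j in range(p):
                    s[i][j] += (v[i] - m[i]) * (v[j] - m[j])
    for i in range(p):
        for j in range(p):
            s[i][j] /= (n1 + n2 - 2)
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    inv = [[s[1][1] / det, -s[0][1] / det],
           [-s[1][0] / det, s[0][0] / det]]
    d = [mx[i] - my[i] for i in range(p)]
    # squared Mahalanobis distance between the two centroids
    maha2 = sum(d[i] * inv[i][j] * d[j] for i in range(p) for j in range(p))
    t2 = (n1 * n2 / (n1 + n2)) * maha2
    f = (n1 + n2 - p - 1) / ((n1 + n2 - 2) * p) * t2
    return t2, f  # F has (p, n1 + n2 - p - 1) degrees of freedom

g1 = [(0, 0), (1, 0), (0, 1), (1, 1)]   # tight cluster near the origin
g2 = [(5, 5), (6, 5), (5, 6), (6, 6)]   # well-separated cluster
t2, f = hotelling_t2(g1, g2)
```

    Comparing f against the appropriate F distribution then gives the significance of the cluster separation.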

  13. Evaluation of the 3M™ Molecular Detection Assay (MDA) 2 - Salmonella for the Detection of Salmonella spp. in Select Foods and Environmental Surfaces: Collaborative Study, First Action 2016.01.

    PubMed

    Bird, Patrick; Flannery, Jonathan; Crowley, Erin; Agin, James R; Goins, David; Monteroso, Lisa

    2016-07-01

    The 3M™ Molecular Detection Assay (MDA) 2 - Salmonella uses real-time isothermal technology for the rapid and accurate detection of Salmonella spp. from enriched select food, feed, and food-process environmental samples. The 3M MDA 2 - Salmonella was evaluated in a multilaboratory collaborative study using an unpaired study design. It was compared to the U.S. Food and Drug Administration Bacteriological Analytical Manual Chapter 5 reference method for the detection of Salmonella in creamy peanut butter, and to the U.S. Department of Agriculture, Food Safety and Inspection Service Microbiology Laboratory Guidebook Chapter 4.08 reference method, "Isolation and Identification of Salmonella from Meat, Poultry, Pasteurized Egg and Catfish Products and Carcass and Environmental Samples", for the detection of Salmonella in raw ground beef (73% lean). Technicians from 16 laboratories located within the continental United States participated. Each matrix was evaluated at three levels of contamination: an uninoculated control level (0 CFU/test portion), a low inoculum level (0.2-2 CFU/test portion), and a high inoculum level (2-5 CFU/test portion). Statistical analysis was conducted according to the probability of detection (POD) statistical model. Results obtained for the low inoculum level test portions produced differences in cross-laboratory POD values of 0.03 (95% confidence interval, -0.10 to 0.16) for raw ground beef and 0.06 (95% confidence interval, -0.06 to 0.18) for creamy peanut butter, indicating no statistically significant difference between the candidate and reference methods.
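
    The POD comparison above pairs a difference in detection probabilities with a confidence interval. A simplified illustration with a plain Wald interval and hypothetical counts (the AOAC POD model uses a Wilson-type interval, so this is only an approximation of the idea, not the official procedure):

```python
import math

def dpod_wald(x1, n1, x2, n2, z=1.96):
    """Difference between two probabilities of detection (POD = x/n),
    with a simple Wald 95% confidence interval. Note: the AOAC POD
    model uses a Wilson-type interval; this Wald version is only an
    illustrative approximation."""
    p1, p2 = x1 / n1, x2 / n2
    d = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return d, (d - z * se, d + z * se)

# Hypothetical low-level counts: candidate 38/60 positive, reference 36/60.
d, (lo, hi) = dpod_wald(38, 60, 36, 60)
```

    An interval that contains 0 indicates no statistically significant difference between the two methods.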

  14. Accounting for immunoprecipitation efficiencies in the statistical analysis of ChIP-seq data.

    PubMed

    Bao, Yanchun; Vinciotti, Veronica; Wit, Ernst; 't Hoen, Peter A C

    2013-05-30

    ImmunoPrecipitation (IP) efficiencies may vary largely between different antibodies and between repeated experiments with the same antibody. These differences have a large impact on the quality of ChIP-seq data: a more efficient experiment will necessarily lead to a higher signal-to-background ratio, and therefore to an apparently larger number of enriched regions, compared to a less efficient experiment. In this paper, we show how IP efficiencies can be explicitly accounted for in the joint statistical modelling of ChIP-seq data. We fit a latent mixture model to eight experiments on two proteins, from two laboratories where different antibodies are used for the two proteins. We use the model parameters to estimate the efficiencies of individual experiments, and find that these are clearly different for the different laboratories, and amongst technical replicates from the same lab. When we account for ChIP efficiency, we find more regions bound in the more efficient experiments than in the less efficient ones, at the same false discovery rate. A priori knowledge of the same number of binding sites across experiments can also be included in the model for a more robust detection of differentially bound regions among two different proteins. We propose a statistical model for the detection of enriched and differentially bound regions from multiple ChIP-seq data sets. The framework that we present accounts explicitly for IP efficiencies in ChIP-seq data, and allows replicates and experiments from different proteins to be modelled jointly rather than individually, leading to more robust biological conclusions.

  15. The quality of veterinary in-clinic and reference laboratory biochemical testing.

    PubMed

    Rishniw, Mark; Pion, Paul D; Maher, Tammy

    2012-03-01

    Although evaluation of biochemical analytes in blood is common in veterinary practice, studies assessing the global quality of veterinary in-clinic and reference laboratory testing have not been reported. The aim of this study was to assess the quality of biochemical testing in veterinary laboratories using results obtained from analyses of 3 levels of assayed quality control materials over 5 days. Quality was assessed by comparison of calculated total error with quality requirements, determination of sigma metrics, use of a quality goal index to determine factors contributing to poor performance, and agreement between in-clinic and reference laboratory mean results. The suitability of in-clinic and reference laboratory instruments for statistical quality control was determined using adaptations from the computerized program, EZRules3. Reference laboratories were able to achieve desirable quality requirements more frequently than in-clinic laboratories. Across all 3 materials, > 50% of in-clinic analyzers achieved a sigma metric ≥ 6.0 for measurement of 2 analytes, whereas > 50% of reference laboratory analyzers achieved a sigma metric ≥ 6.0 for measurement of 6 analytes. Expanded uncertainty of measurement and ± total allowable error resulted in the highest mean percentages of analytes demonstrating agreement between in-clinic and reference laboratories. Owing to marked variation in bias and coefficient of variation between analyzers of the same and different types, the percentages of analytes suitable for statistical quality control varied widely. These findings reflect the current state-of-the-art with regard to in-clinic and reference laboratory analyzer performance and provide a baseline for future evaluations of the quality of veterinary laboratory testing. © 2012 American Society for Veterinary Clinical Pathology.
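
    The sigma metric used above has a standard definition relating total allowable error, bias, and imprecision. A minimal sketch with hypothetical inputs (the study's actual quality requirements are not reproduced here):

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric for an assay: the number of analytical standard
    deviations (CV) that fit between the observed bias and the total
    allowable error (TEa); all inputs in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical analyte: TEa 10%, bias 1%, CV 1.5%.
sigma = sigma_metric(10.0, 1.0, 1.5)  # reaches the sigma >= 6.0 benchmark
```

    A sigma metric of 6.0 or more, as achieved by only a few analytes in the study, indicates performance where simple statistical quality control rules suffice.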

  16. Laboratory determination of effective stress laws for deformation and permeability of chalk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teufel, L W; Warpinski, N R

    1990-01-01

    Laboratory deformation and permeability measurements have been made on chalk samples from Ekofisk area fields as a function of confining stress and pore pressure to determine the effective stress laws for chalk. An understanding of the effective stress law is essential to obtain correct reservoir-property data from core analysis and is critical for reservoir management studies and reservoir compaction models. A powerful statistical technique known as the response surface method has been used to analyze our laboratory data and determine the form of the effective stress law for deformation and permeability. Experiments were conducted on chalk samples with a range of porosities from 15% to 36%, because porosity is the dominant intrinsic property that affects the deformation and permeability behavior of chalk. Deformation of a 36% porosity chalk was highly nonlinear, but the effective stress law was linear, with α equal to about unity. Lower-porosity samples showed linear strain behavior and a linear effective stress law with α as low as 0.74. Analysis of the effective stress law for permeability is presented only for the lowest-porosity chalk sample, because the changes in permeability in the higher-porosity chalk samples due to increasing confining stress or pore pressure were not large enough to deduce meaningful effective stress relationships. 15 refs., 8 figs., 2 tabs.

  17. The stability of hydrogen ion and specific conductance in filtered wet-deposition samples stored at ambient temperatures

    USGS Publications Warehouse

    Gordon, J.D.; Schroder, L.J.; Morden-Moore, A. L.; Bowersox, V.C.

    1995-01-01

    Separate experiments by the U.S. Geological Survey (USGS) and the Illinois State Water Survey Central Analytical Laboratory (CAL) independently assessed the stability of hydrogen ion and specific conductance in filtered wet-deposition samples stored at ambient temperatures. The USGS experiment represented a test of sample stability under a diverse range of conditions, whereas the CAL experiment was a controlled test of sample stability. In the experiment by the USGS, a statistically significant (α = 0.05) relation between [H+] and time was found for the composited filtered, natural, wet-deposition solution when all reported values are included in the analysis. However, if two outlying pH values most likely representing measurement error are excluded from the analysis, the change in [H+] over time was not statistically significant. In the experiment by the CAL, randomly selected samples were reanalyzed between July 1984 and February 1991. The original analysis and reanalysis pairs revealed that [H+] differences, although very small, were statistically different from zero, whereas specific-conductance differences were not. Nevertheless, the results of the CAL reanalysis project indicate there appears to be no consistent, chemically significant degradation in sample integrity with regard to [H+] and specific conductance while samples are stored at room temperature at the CAL. Based on the results of the CAL and USGS studies, short-term (45-60 day) stability of [H+] and specific conductance in natural filtered wet-deposition samples that are shipped and stored unchilled at ambient temperatures was satisfactory.

  18. Analyzing the efficiency of small and medium-sized enterprises of a national technology innovation research and development program.

    PubMed

    Park, Sungmin

    2014-01-01

    This study analyzes the efficiency of small and medium-sized enterprises (SMEs) in a national technology innovation research and development (R&D) program. In particular, an empirical analysis is presented that aims to answer the following question: "Is there a difference in efficiency between R&D collaboration types and between government R&D subsidy sizes?" Methodologically, the efficiency of a government-sponsored R&D project (i.e., GSP) is measured by Data Envelopment Analysis (DEA), and a nonparametric analysis-of-variance method, the Kruskal-Wallis (KW) test, is adopted to determine whether the efficiency differences between R&D collaboration types and between government R&D subsidy sizes are statistically significant. This study's major findings are as follows. First, contrary to our hypothesis, when the influence of government R&D subsidy size was controlled, there was no statistically significant difference in efficiency between R&D collaboration types. However, the R&D collaboration type "SME-University-Laboratory" Joint-Venture was superior to the others, achieving the largest median and the smallest interquartile range of DEA efficiency scores. Second, the differences in efficiency between government R&D subsidy sizes were statistically significant, and a phenomenon of diseconomies of scale was identified on the whole. As the government R&D subsidy size increased, the central measures of the DEA efficiency scores decreased, while the dispersion measures tended to grow.
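
    The Kruskal-Wallis test named above reduces to ranking the pooled observations and comparing rank sums across groups. A simplified sketch (hypothetical efficiency scores; no tie correction):

```python
def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic for k independent groups
    (simplified: assumes no tied values, so no tie correction)."""
    pooled = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
    n = len(pooled)
    rank_sums = [0.0] * len(groups)
    for rank, (_, gi) in enumerate(pooled, start=1):
        rank_sums[gi] += rank
    h = 12.0 / (n * (n + 1)) * sum(
        rs ** 2 / len(g) for rs, g in zip(rank_sums, groups)) - 3 * (n + 1)
    return h  # compare against chi-square with k - 1 degrees of freedom

# Hypothetical DEA efficiency scores for three subsidy-size classes:
h = kruskal_wallis_h([0.91, 0.84, 0.88],
                     [0.72, 0.69, 0.75],
                     [0.55, 0.61, 0.58])
```

    An H statistic exceeding the chi-square critical value (5.99 for two degrees of freedom at the 5% level) indicates a significant efficiency difference between the classes.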

  19. Perceived emotional aptitude of clinical laboratory sciences students compared to students in other healthcare profession majors.

    PubMed

    Adams, Austin; McCabe, Kristin; Zundel, Cassandra; Price, Travis; Dahl, Corey

    2011-01-01

    Emotional aptitude can be defined as the ability to recognize and manage one's own emotions and to interpret the emotions of others. It has been speculated that Clinical Laboratory Sciences students may lack the emotional skills to interact most effectively with patients and other healthcare professionals; a logical hypothesis, therefore, is that they would rate their own emotional intelligence lower than students from other healthcare majors. While this has been a topic of discussion in healthcare, little research has been conducted to validate this assumption. This study assesses the perceived emotional aptitude of Clinical Laboratory Sciences students compared to students of other healthcare majors in the Dumke College of Health Professions at Weber State University. The perceived emotional aptitude of the healthcare students was determined by completion of a self-evaluation questionnaire that included questions about one's own emotions, one's understanding of others' emotions, and how one manages conflict. A total of 401 questionnaires were completed, compiled, and analyzed. Although minor differences were seen in the responses, statistical analysis found these differences to be insignificant. The perceived emotional aptitude of Clinical Laboratory Sciences students did not differ significantly from that of students of other healthcare majors at the Dumke College of Health Professions.

  20. Gambling as a teaching aid in the introductory physics laboratory

    NASA Astrophysics Data System (ADS)

    Horodynski-Matsushigue, L. B.; Pascholati, P. R.; Vanin, V. R.; Dias, J. F.; Yoneama, M.-L.; Siqueira, P. T. D.; Amaku, M.; Duarte, J. L. M.

    1998-07-01

    Dice throwing is used to illustrate relevant concepts of the statistical theory of uncertainties, in particular the meaning of a limiting distribution, the standard deviation, and the standard deviation of the mean. It is an important part of a sequence of specially programmed laboratory activities, developed for freshmen, at the Institute of Physics of the University of São Paulo. It is shown how this activity is employed within a constructive teaching approach, which aims at a growing understanding of the measuring processes and of the fundamentals of correct statistical handling of experimental data.
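
    The dice activity can be reproduced in a few lines. A minimal simulation (not the São Paulo course material) showing how the spread of single throws stays fixed while the standard deviation of the mean shrinks as 1/√N:

```python
import random
import statistics

random.seed(42)
N = 10_000
throws = [random.randint(1, 6) for _ in range(N)]

mean = statistics.fmean(throws)
sd = statistics.stdev(throws)      # spread of individual throws
sd_of_mean = sd / N ** 0.5         # standard deviation (standard error) of the mean

# Limiting values for a fair die: mean = 3.5, sd = sqrt(35/12) ~ 1.708
print(f"mean = {mean:.3f}, sd = {sd:.3f}, sd of mean = {sd_of_mean:.4f}")
```

    Re-running with larger N shows the sample mean converging on 3.5 while the per-throw standard deviation stays near its limiting value.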

  1. Comparison of upper extremity kinematics in children with spastic diplegic cerebral palsy using anterior and posterior walkers.

    PubMed

    Strifling, Kelly M B; Lu, Na; Wang, Mei; Cao, Kevin; Ackman, Jeffrey D; Klein, John P; Schwab, Jeffrey P; Harris, Gerald F

    2008-10-01

    This prospective study analyzes the upper extremity kinematics of 10 children with spastic diplegic cerebral palsy using anterior and posterior walkers. Although both types of walkers are commonly prescribed by clinicians, no quantitative data comparing the two with regard to upper extremity motion have been published. The study methodology included testing of each subject with both types of walkers in a motion analysis laboratory after an acclimation period of at least 1 month. Overall results showed that, statistically, both walkers are relatively similar. With both anterior and posterior walkers, the shoulders were extended, elbows flexed, and wrists extended. Energy expenditure, walking speed, and stride length were also similar with both walker types. Several differences were noted, although they were not statistically significant: anterior torso tilt was reduced with the posterior walker, and shoulder extension and elbow flexion were increased. Outcomes analysis indicated that differences in upper extremity torso and joint motion were not dependent on spasticity or hand dominance. These findings may help to build an understanding of upper extremity motion in walker-assisted gait and potentially to improve walker prescription.

  2. Long-term Results of an Analytical Assessment of Student Compounded Preparations

    PubMed Central

    Roark, Angie M.; Anksorus, Heidi N.

    2014-01-01

    Objective. To investigate the long-term (ie, 6-year) impact of a required remake vs an optional remake on student performance in a compounding laboratory course in which students’ compounded preparations were analyzed. Methods. The analysis data for several preparations made by students were compared for differences in the analyzed content of the active pharmaceutical ingredient (API) and the number of students who successfully compounded the preparation on the first attempt. Results. There was a consistent statistical difference in the API amount or concentration in 4 of the preparations (diphenhydramine, ketoprofen, metoprolol, and progesterone) in each optional remake year compared to the required remake year. As the analysis requirement was continued, the outcome for each preparation approached and/or attained the expected API result. Two preparations required more than 1 year to demonstrate a statistical difference. Conclusion. The analytical assessment resulted in a consistent, long-term improvement in student performance during the 5-year period after the optional remake policy was instituted. Our assumption is that investment in such an assessment would result in similar benefits at other colleges and schools of pharmacy. PMID:26056402

  3. Genetic programming based models in plant tissue culture: An addendum to traditional statistical approach.

    PubMed

    Mridula, Meenu R; Nair, Ashalatha S; Kumar, K Satheesh

    2018-02-01

    In this paper, we compared the efficacy of observation based modeling approach using a genetic algorithm with the regular statistical analysis as an alternative methodology in plant research. Preliminary experimental data on in vitro rooting was taken for this study with an aim to understand the effect of charcoal and naphthalene acetic acid (NAA) on successful rooting and also to optimize the two variables for maximum result. Observation-based modelling, as well as traditional approach, could identify NAA as a critical factor in rooting of the plantlets under the experimental conditions employed. Symbolic regression analysis using the software deployed here optimised the treatments studied and was successful in identifying the complex non-linear interaction among the variables, with minimalistic preliminary data. The presence of charcoal in the culture medium has a significant impact on root generation by reducing basal callus mass formation. Such an approach is advantageous for establishing in vitro culture protocols as these models will have significant potential for saving time and expenditure in plant tissue culture laboratories, and it further reduces the need for specialised background.

  4. Metabolic syndrome and other factors associated with gonarthrosis.

    PubMed

    Charles-Lozoya, Sergio; Treviño-Báez, Joaquín Darío; Ramos-Rivera, Jesús Alejandro; Rangel-Flores, Jesús María; Tamez-Montes, Juan Carlos; Brizuela-Ventura, Jesús Miguel

    2017-01-01

    To evaluate whether an association exists between gonarthrosis and metabolic syndrome X (MS) as well as other potential risk factors. Comparative cross-sectional study of 310 patients evaluated for knee pathology, grouped into patients with and without gonarthrosis. Sociodemographic, anthropometric, and laboratory data were obtained. Gonarthrosis was defined as a score ≥ 2 on the Kellgren-Lawrence radiological scale, and MS was assessed using the International Diabetes Federation criteria. Odds ratios and logistic regression were used for bivariate and multivariate analysis, respectively. The prevalence of MS in patients with gonarthrosis was 79.9%, statistically higher than in patients without gonarthrosis (p = 0.001). Other factors with a statistically higher frequency in this group included diabetes mellitus (p = 0.02) and hypertension (p = 0.02). Multivariate analysis revealed that MS was associated with a higher prevalence of gonarthrosis (p = 0.003), while high-density lipoproteins (p = 0.02) were associated with a lower prevalence. MS and its related alterations are associated with gonarthrosis; their adequate control could prevent patients from developing the disease. Copyright: © 2017 Secretaría de Salud

  5. Uncertainty Analysis of Instrument Calibration and Application

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine the precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated; often, calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified, and the effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimating both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
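
    Propagating individual measurement uncertainties through a defining functional expression, as described above, can be sketched numerically. This first-order sketch assumes uncorrelated inputs; the paper's full treatment also covers correlated error and bias terms. The dynamic-pressure example (q = ½ρv²) is an illustrative assumption, not taken from the paper:

```python
import math

def propagate(f, values, sigmas, eps=1e-6):
    """First-order propagation: sigma_f^2 = sum_i (df/dx_i)^2 * sigma_i^2,
    with partial derivatives estimated by central differences.
    Assumes uncorrelated inputs."""
    var = 0.0
    for i, (x, s) in enumerate(zip(values, sigmas)):
        hi = list(values); hi[i] = x + eps
        lo = list(values); lo[i] = x - eps
        dfdx = (f(*hi) - f(*lo)) / (2 * eps)  # central-difference derivative
        var += (dfdx * s) ** 2
    return math.sqrt(var)

# Example: dynamic pressure q = 0.5 * rho * v**2
q = lambda rho, v: 0.5 * rho * v**2
sigma_q = propagate(q, [1.225, 50.0], [0.01, 0.5])
print(f"q = {q(1.225, 50.0):.1f} ± {sigma_q:.1f} Pa")
```

    Here the velocity term dominates the combined uncertainty, which is the kind of budget breakdown the propagation analysis makes explicit.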

  6. Long-term Results of an Analytical Assessment of Student Compounded Preparations.

    PubMed

    Roark, Angie M; Anksorus, Heidi N; Shrewsbury, Robert P

    2014-11-15

    To investigate the long-term (ie, 6-year) impact of a required remake vs an optional remake on student performance in a compounding laboratory course in which students' compounded preparations were analyzed. The analysis data for several preparations made by students were compared for differences in the analyzed content of the active pharmaceutical ingredient (API) and the number of students who successfully compounded the preparation on the first attempt. There was a consistent statistical difference in the API amount or concentration in 4 of the preparations (diphenhydramine, ketoprofen, metoprolol, and progesterone) in each optional remake year compared to the required remake year. As the analysis requirement was continued, the outcome for each preparation approached and/or attained the expected API result. Two preparations required more than 1 year to demonstrate a statistical difference. The analytical assessment resulted in a consistent, long-term improvement in student performance during the 5-year period after the optional remake policy was instituted. Our assumption is that investment in such an assessment would result in similar benefits at other colleges and schools of pharmacy.

  7. Analysis of Acoustic Emission Parameters from Corrosion of AST Bottom Plate in Field Testing

    NASA Astrophysics Data System (ADS)

    Jomdecha, C.; Jirarungsatian, C.; Suwansin, W.

    Field testing of aboveground storage tanks (ASTs) to monitor corrosion of the bottom plate is presented in this chapter. Acoustic emission (AE) testing data from ten ASTs of different sizes, materials, and products were employed to monitor bottom-plate condition. AE sensors at 30 and 150 kHz were used to monitor corrosion activity on up to 24 channels, including guard sensors. AE parameters were analyzed to explore the parameter patterns of occurring corrosion compared to laboratory results; amplitude, count, duration, and energy were the main parameters of the analysis. A pattern recognition technique combined with statistical analysis was implemented to eliminate electrical and environmental noise. The results showed specific AE patterns of corrosion activity consistent with the empirical results. In addition, a planar location algorithm was utilized to locate the significant AE events from corrosion. Both the parameter patterns and the AE event locations can be used to interpret and locate corrosion activity. Finally, a basic statistical grading technique was used to evaluate the bottom-plate condition of the ASTs.

  8. Technician Consistency in Specular Microscopy Measurements: A "Real-World" Retrospective Analysis of a United States Eye Bank.

    PubMed

    Rand, Gabriel M; Kwon, Ji Won; Gore, Patrick K; McCartney, Mitchell D; Chuck, Roy S

    2017-10-01

    To quantify consistency of endothelial cell density (ECD) measurements among technicians in a single US eye bank operating under typical operating conditions. In this retrospective analysis of 51 microscopy technicians using a semiautomated counting method on 35,067 eyes from July 2007 to May 2015, technician- and date-related marginal ECD effects were calculated using linear regression models. ECD variance was correlated with the number of specular microscopy technicians. Technician mean ECDs ranged from 2386 ± 431 to 3005 ± 560 cells/mm². Nine technicians had statistically and clinically significant marginal effects. Annual mean ECDs adjusted for changes in technicians ranged from 2422 ± 433 to 2644 ± 430 cells/mm². The period of 2007 to 2009 had statistically and clinically significant marginal effects. There was a statistically nonsignificant association between the number of technicians and ECD standard deviation. There was significant ECD variability associated with specular microscopy technicians and with the date of measurement. We recommend that eye banks collect data related to laboratory factors that have been shown to influence ECD variability.

  9. An evaluation of the uncertainty of extreme events statistics at the WMO/CIMO Lead Centre on precipitation intensity

    NASA Astrophysics Data System (ADS)

    Colli, M.; Lanza, L. G.; La Barbera, P.

    2012-12-01

    Improving the quality of point-scale rainfall measurements is a crucial issue fostered in recent years by the WMO Commission for Instruments and Methods of Observation (CIMO) through recommendations on the standardization of equipment and exposure, instrument calibration, and data correction arising from various comparative campaigns involving manufacturers and national meteorological services from the participating countries. The WMO/CIMO Lead Centre on Precipitation Intensity (LC) was recently constituted, in a joint effort between the Dep. of Civil, Chemical and Environmental Engineering of the University of Genova and the Italian Air Force Met Service, gathering the considerable asset of data and information achieved by past in-field and laboratory campaigns with the aim of researching novel methodologies for improving the accuracy of rainfall intensity (RI) measurement techniques. Among the ongoing experimental activities carried out by the LC laboratory, particular attention is paid to evaluating the reliability of extreme rainfall event statistics, a common tool in engineering practice for urban and non-urban drainage system design, based on real-world observations obtained from weighing gauges. Extreme event statistics were already proven to be highly affected by the measurement inaccuracy of traditional tipping-bucket rain gauges (La Barbera et al., 2002), and the time resolution of the available RI series certainly constitutes another key factor in the reliability of the derived hyetographs. The present work reports the LC laboratory efforts in assembling a rainfall simulation system to reproduce the inner temporal structure of the rainfall process by means of dedicated calibration and validation tests. This allowed testing of catching-type rain gauges under non-steady flow conditions and quantifying, in a first instance, the dynamic behaviour of the investigated instruments.
Considerations about the influence of the dynamic response on the uncertainty budget of modern rain gauges are also presented. The analysis proceeds with the laboratory simulation of the annual maximum rainfall events recorded for different durations at the Villa Cambiaso meteo-station (University of Genova) over the last two decades. Results are reported and discussed in a comparative form involving the derived extreme event statistics. REFERENCES: La Barbera P., Lanza L.G. and Stagi L. (2002). Influence of systematic mechanical errors of tipping-bucket rain gauges on the statistics of rainfall extremes. Water Sci. Techn., 45(2), 1-9. Colli M., Lanza L.G., and Chan P.W. (2011). Co-located tipping-bucket and optical drop counter RI measurements and a simulated correction algorithm. Atmos. Res., doi:10.1016/j.atmosres.2011.07.018. Colli M., Lanza L.G., and La Barbera P. (2012). Weighing gauges measurement errors and the design rainfall for urban scale applications. 9th International Workshop on Precipitation in Urban Areas, St. Moritz, Switzerland, 6-9 December 2012. Lanza L.G. and Vuerich E. (2009). The WMO Field Intercomparison of Rain Intensity Gauges. Atmos. Res., 94, 534-543.

  10. A laboratory evaluation of the influence of weighing gauges performance on extreme events statistics

    NASA Astrophysics Data System (ADS)

    Colli, Matteo; Lanza, Luca

    2014-05-01

    The effects of inaccurate ground-based rainfall measurements on the information derived from rain records are not yet well documented in the literature. La Barbera et al. (2002) investigated the propagation of the systematic mechanical errors of tipping-bucket rain gauges (TBR) into the most common statistics of rainfall extremes, e.g. in the assessment of the return period T (or the related non-exceedance probability) of short-duration/high-intensity events. Colli et al. (2012) and Lanza et al. (2012) extended the analysis to a 22-year long precipitation data set obtained from a virtual weighing-type gauge (WG). The artificial WG time series was generated from real precipitation data measured at the meteo-station of the University of Genova, modelling the weighing gauge output as a linear dynamic system. This approximation was previously validated with dedicated laboratory experiments and is based on the evidence that the accuracy of WG measurements under real-world, time-varying rainfall conditions is mainly affected by the dynamic response of the gauge (as revealed during the last WMO Field Intercomparison of Rainfall Intensity Gauges). The investigation is now completed by analyzing actual measurements performed by two common weighing gauges, the OTT Pluvio2 load-cell gauge and the GEONOR T-200 vibrating-wire gauge, since both instruments demonstrated very good performance in previous constant flow rate calibration efforts. A laboratory dynamic rainfall generation system has been arranged and validated in order to simulate a number of precipitation events with variable reference intensities. Such artificial events were generated from real-world rainfall intensity (RI) records obtained from the meteo-station of the University of Genova so that the statistical structure of the time series is preserved.
The influence of the accuracy of the WG RI measurements on the associated extreme event statistics is analyzed by comparing the original intensity-duration-frequency (IDF) curves with those obtained by measuring the simulated rain events. References: Colli, M., L.G. Lanza, and P. La Barbera (2012). Weighing gauges measurement errors and the design rainfall for urban scale applications. 9th International Workshop on Precipitation in Urban Areas, 6-9 December 2012, St. Moritz, Switzerland. Lanza, L.G., M. Colli, and P. La Barbera (2012). On the influence of rain gauge performance on extreme events statistics: the case of weighing gauges. EGU General Assembly 2012, April 22, Wien, Austria. La Barbera, P., L.G. Lanza, and L. Stagi (2002). Influence of systematic mechanical errors of tipping-bucket rain gauges on the statistics of rainfall extremes. Water Sci. Techn., 45(2), 1-9.
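
    Extreme-event statistics of the kind compared above are commonly summarized by fitting an extreme-value distribution to annual maxima and reading off return levels. A generic sketch with synthetic maxima (not the Genova record) using a Gumbel (EV1) fit:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic annual-maximum rainfall intensities (mm/h), 22 "years"
annual_max = rng.gumbel(loc=60.0, scale=15.0, size=22)

# Fit a Gumbel (EV1) distribution to the annual maxima
loc, scale = stats.gumbel_r.fit(annual_max)

# Return level for return period T: the intensity exceeded, on average,
# once every T years (non-exceedance probability 1 - 1/T)
T = 50
return_level = stats.gumbel_r.ppf(1 - 1 / T, loc, scale)
print(f"50-year return level ≈ {return_level:.1f} mm/h")
```

    Systematic measurement errors of the gauge shift the fitted location and scale, which is precisely how they propagate into the derived return levels and IDF curves.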

  11. 76 FR 76441 - Emergency Clearance; Public Information Collection Requirements Submitted to the Office of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-07

    ... surveyed. Basic analyses will include descriptive statistics on each category of information requested by... Equipment Donations for Schools. Title of Collection: Survey of Laboratory Equipment Donations for Schools... laboratory equipment to elementary schools and secondary schools. The Director * * * shall survey...

  12. Dendropedagogy: Teaching Botany, Ecology and Statistical Principles through Tree-Ring Studies.

    ERIC Educational Resources Information Center

    Rubino, Darrin L.; McCarthy, Brian C.

    2002-01-01

    Develops a simple tree-ring laboratory to demonstrate the basics of dendrochronology. Provides two upper-level laboratory exercises primarily intended to demonstrate the specific dendrochronology subdisciplines of dendroclimatology and dendroecology. Suggests using the exercises separately or in unison as part of a multidisciplinary capstone…

  13. Computer and laboratory simulation in the teaching of neonatal nursing: innovation and impact on learning 1

    PubMed Central

    Fonseca, Luciana Mara Monti; Aredes, Natália Del' Angelo; Fernandes, Ananda Maria; Batalha, Luís Manuel da Cunha; Apóstolo, Jorge Manuel Amado; Martins, José Carlos Amado; Rodrigues, Manuel Alves

    2016-01-01

    ABSTRACT Objectives: to evaluate the cognitive learning of nursing students in neonatal clinical evaluation from a blended course with the use of computer and laboratory simulation; to compare the cognitive learning of students in a control and an experimental group testing the laboratory simulation; and to assess the extracurricular blended course offered on the clinical assessment of preterm infants, according to the students. Method: a quasi-experimental study with 14 Portuguese students, containing pretest, midterm test, and post-test. The technologies offered in the course were the serious game e-Baby, instructional software of semiology and semiotechnique, and laboratory simulation. Data collection tools developed for this study were used for the course evaluation and characterization of the students. Nonparametric statistics were used: Mann-Whitney and Wilcoxon. Results: the use of validated digital technologies and laboratory simulation demonstrated a statistically significant difference (p = 0.001) in the participants' learning. The course was evaluated as very satisfactory by the students. The laboratory simulation alone did not produce a significant difference in learning. Conclusions: the cognitive learning of participants increased significantly. The use of technology can be partly responsible for the course's success, showing it to be an important teaching tool for innovation and motivation of learning in healthcare. PMID:27737376
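
    The nonparametric tests named above can be run with SciPy. The scores below are hypothetical, not the study's data: the Wilcoxon signed-rank test handles the paired pretest/post-test comparison, and the Mann-Whitney U test the independent control-vs-experimental comparison:

```python
from scipy.stats import wilcoxon, mannwhitneyu

# Hypothetical pretest/post-test knowledge scores (0-100), paired by student
pre  = [55, 60, 48, 62, 58, 65, 50, 57, 63, 59, 61, 54, 66, 52]
post = [70, 72, 65, 80, 69, 78, 66, 71, 77, 73, 75, 68, 82, 70]
stat, p_paired = wilcoxon(pre, post)   # paired, nonparametric
print(f"Wilcoxon signed-rank: p = {p_paired:.4f}")

# Hypothetical independent-group scores: control vs experimental
control = [58, 61, 55, 60, 57, 63, 59]
experimental = [70, 72, 65, 80, 69, 78, 66]
u_stat, p_groups = mannwhitneyu(control, experimental)
print(f"Mann-Whitney U: p = {p_groups:.4f}")
```

    Nonparametric tests are the natural choice here because a sample of 14 students is too small to justify normality assumptions.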

  14. Statistics for NAEG: past efforts, new results, and future plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, R.O.; Simpson, J.C.; Kinnison, R.R.

    A brief review of Nevada Applied Ecology Group (NAEG) objectives is followed by a summary of past statistical analyses conducted by Pacific Northwest Laboratory for the NAEG. Estimates of the spatial pattern of radionuclides and other statistical analyses at NSs 201, 219, and 221 are reviewed as background for new analyses presented in this paper. Suggested NAEG activities and statistical analyses needed for the projected termination date of NAEG studies in March 1986 are given.

  15. Possible signatures of dissipation from time-series analysis techniques using a turbulent laboratory magnetohydrodynamic plasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schaffner, D. A.; Brown, M. R.; Rock, A. B.

    The frequency spectrum of magnetic fluctuations as measured on the Swarthmore Spheromak Experiment is broadband and exhibits a nearly Kolmogorov 5/3 scaling. It features a steepening region which is indicative of dissipation of magnetic fluctuation energy similar to that observed in fluid and magnetohydrodynamic turbulence systems. Two non-spectrum-based time-series analysis techniques are implemented on this data set in order to seek other possible signatures of turbulent dissipation beyond just the steepening of fluctuation spectra. Presented here are results for the flatness, permutation entropy, and statistical complexity, each of which exhibits a particular character at spectral steepening scales which can then be compared to the behavior of the frequency spectrum.

  16. Environmental Testing Philosophy for a Sandia National Laboratories' Small Satellite Project - A Retrospective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CAP,JEROME S.

    2000-08-24

    Sandia has recently completed the flight certification test series for the Multi-Spectral Thermal Imaging satellite (MTI), a small satellite for which Sandia was the system integrator. A paper was presented at the 16th Aerospace Testing Seminar discussing plans for performing the structural dynamics certification program for that satellite. The testing philosophy was originally based on a combination of system-level vibroacoustic tests and component-level shock and vibration tests. However, the plans evolved to include computational analyses using both Finite Element Analysis and Statistical Energy Analysis techniques. This paper outlines the final certification process and discusses lessons learned, including both things that went well and things that should or could have been done differently.

  17. The death spiral: predicting death in Drosophila cohorts.

    PubMed

    Mueller, Laurence D; Shahrestani, Parvin; Rauser, Casandra L; Rose, Michael R

    2016-11-01

    Drosophila research has identified a new feature of aging that has been called the death spiral. The death spiral is a period prior to death during which there is a decline in life-history characters, such as fecundity, as well as physiological characters. First, we review the data from the Drosophila and medfly literature that suggest the existence of death spirals. Second, we re-analyze five cases with such data from four laboratories using a generalized statistical framework, a re-analysis that strengthens the case for the salience of the death spiral phenomenon. Third, we raise the issue of whether death spirals need to be taken into account in the analysis of functional characters over age, in aging research with model species as well as with human data.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, Eric J

    The ResStock analysis tool is helping states, municipalities, utilities, and manufacturers identify which home upgrades save the most energy and money. Across the country there is vast diversity in the age, size, construction practices, installed equipment, appliances, and resident behavior of the housing stock, not to mention the range of climates. These variations have hindered the accuracy of predicting savings for existing homes. Researchers at the National Renewable Energy Laboratory (NREL) developed ResStock, a versatile tool that takes a new approach to large-scale residential energy analysis by combining large public and private data sources, statistical sampling, detailed subhourly building simulations, and high-performance computing. This combination achieves unprecedented granularity and, most importantly, accuracy in modeling the diversity of the single-family housing stock.

  19. Quality assurance of temporal variability of natural decay chain and neutron induced background for low-level NORM analysis

    DOE PAGES

    Yoho, Michael; Porterfield, Donivan R.; Landsberger, Sheldon

    2015-09-22

    In this study, twenty-one high purity germanium (HPGe) background spectra were collected over 2 years at Los Alamos National Laboratory. A quality assurance methodology was developed to monitor spectral background levels from thermal and fast neutron flux levels and naturally occurring radioactive material decay series radionuclides. 238U decay products above 222Rn demonstrated minimal temporal variability beyond that expected from counting statistics. 238U and 232Th progeny below Rn gas displayed at most twice the expected variability. Further, an analysis of the 139 keV 74Ge(n, γ) and 691 keV 72Ge(n, n') spectral features demonstrated temporal stability for both thermal and fast neutron fluxes.

  20. Severity of Illness Scores May Misclassify Critically Ill Obese Patients.

    PubMed

    Deliberato, Rodrigo Octávio; Ko, Stephanie; Komorowski, Matthieu; Armengol de La Hoz, M A; Frushicheva, Maria P; Raffa, Jesse D; Johnson, Alistair E W; Celi, Leo Anthony; Stone, David J

    2018-03-01

    Severity of illness scores rest on the assumption that patients have normal physiologic values at baseline and that patients with similar severity of illness scores have the same degree of deviation from their usual state. Prior studies have reported differences in baseline physiology, including laboratory markers, between obese and normal weight individuals, but these differences have not been analyzed in the ICU. We compared deviation from baseline of pertinent ICU laboratory test results between obese and normal weight patients, adjusted for the severity of illness. Retrospective cohort study in a large ICU database. Tertiary teaching hospital. Obese and normal weight patients who had laboratory results documented between 3 days and 1 year prior to hospital admission. None. Seven hundred sixty-nine normal weight patients were compared with 1,258 obese patients. After adjusting for the severity of illness score, age, comorbidity index, baseline laboratory result, and ICU type, the following deviations were found to be statistically significant: WBC 0.80 (95% CI, 0.27-1.33) × 10⁹/L; p = 0.003; log (blood urea nitrogen) 0.01 (95% CI, 0.00-0.02); p = 0.014; log (creatinine) 0.03 (95% CI, 0.02-0.05), p < 0.001; with all deviations higher in obese patients. A logistic regression analysis suggested that after adjusting for age and severity of illness at least one of these deviations had a statistically significant effect on hospital mortality (p = 0.009). Among patients with the same severity of illness score, we detected clinically small but significant deviations in WBC, creatinine, and blood urea nitrogen from baseline in obese compared with normal weight patients. These small deviations are likely to be increasingly important as bigger data are analyzed in increasingly precise ways. 
Recognition of the extent to which all critically ill patients may deviate from their own baseline may improve the objectivity, precision, and generalizability of ICU mortality prediction and severity adjustment models.

  1. Evaluation of the accuracy of extraoral laboratory scanners with a single-tooth abutment model: A 3D analysis.

    PubMed

    Mandelli, Federico; Gherlone, Enrico; Gastaldi, Giorgio; Ferrari, Marco

    2017-10-01

    The aim of this study was to compare the accuracy of different laboratory scanners using a calibrated coordinate measuring machine as reference. A sandblasted titanium reference model (RM) was scanned with an industrial 3D scanner in order to obtain a reference digital model (dRM) that was saved in the standard tessellation format (.stl). The RM was scanned ten times with each of the tested scanners (GC Europe Aadva, Zfx Evolution, 3Shape D640, 3Shape D700, NobilMetal Sinergia, EGS DScan3, Open Technologies Concept Scan Top) and all scans were exported in .stl format for comparison. All files were imported into a dedicated software (Geomagic Qualify 2013). Accuracy was evaluated by calculating trueness and precision. Trueness values (μm [95% confidence interval]) were: Aadva 7.7 [6.8-8.5]; Zfx Evolution 9.2 [8.6-9.8]; D640 18.1 [12.2-24.0]; D700 12.8 [12.4-13.3]; Sinergia 31.1 [26.3-35.9]; DScan3 15.6 [11.5-19.7]; Concept Scan Top 28.6 [25.6-31.6]. Differences between scanners were statistically significant (p < .0005). Precision values (μm [95% CI]) were: Aadva 4.0 [3.8-4.2]; Zfx Evolution 5.1 [4.4-5.9]; D640 12.7 [12.4-13.1]; D700 11.0 [10.7-11.3]; Sinergia 16.3 [15.0-17.5]; DScan3 9.5 [8.3-10.6]; Concept Scan Top 19.5 [19.1-19.8]. Differences between scanners were statistically significant (p < .0005). The use of a standardized scanning procedure with a fabricated titanium reference model is useful for comparing the trueness and precision of different laboratory scanners; two laboratory scanners (Aadva, Zfx Evolution) were significantly better than the other tested scanners. Copyright © 2016 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
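
    In this context, trueness is the systematic deviation of the scans from the reference geometry, and precision is the agreement among repeated scans of the same object. A rough sketch with hypothetical per-scan deviation values (the study's actual comparison was done in dedicated inspection software on full meshes):

```python
import statistics

# Hypothetical mean absolute deviations (micrometers) of ten repeated
# scans of the same reference model, each measured against the reference mesh
deviations = [7.1, 8.0, 7.6, 8.3, 7.4, 7.9, 8.1, 7.2, 7.8, 7.7]

# Trueness: how close the scans are to the reference, on average
trueness = statistics.fmean(deviations)

# Precision: how tightly the repeated scans agree with each other
precision = statistics.stdev(deviations)

print(f"trueness = {trueness:.2f} um, precision = {precision:.2f} um")
```

    A scanner can be precise but not true (tightly clustered scans that are all biased) or true but imprecise, which is why the study reports both values.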

  2. Statistical process control in Deep Space Network operations

    NASA Technical Reports Server (NTRS)

    Hodder, J. A.

    2002-01-01

    This report describes how the Deep Space Mission System (DSMS) Operations Program Office at the Jet Propulsion Laboratory (JPL) uses Statistical Process Control (SPC) to monitor performance and evaluate initiatives for improving processes on the National Aeronautics and Space Administration's (NASA) Deep Space Network (DSN).
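
    The core of SPC monitoring is a control chart: a center line plus control limits at ±3 standard deviations, with points outside the limits flagged for investigation. A minimal sketch, using invented weekly performance figures rather than actual DSN data:

```python
# Minimal individuals control chart: mean +/- 3 sigma limits.
# 'measurements' are HYPOTHETICAL weekly process-performance values.
from statistics import mean, stdev

measurements = [98.2, 97.9, 98.5, 98.1, 97.8, 98.4, 98.0, 98.3]
center = mean(measurements)
sigma = stdev(measurements)
ucl, lcl = center + 3 * sigma, center - 3 * sigma  # control limits

# Points outside the limits signal a process that is out of control
out_of_control = [x for x in measurements if not (lcl <= x <= ucl)]
print(f"center={center:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}, flagged={out_of_control}")
```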

  3. Humidity-corrected Arrhenius equation: The reference condition approach.

    PubMed

    Naveršnik, Klemen; Jurečič, Rok

    2016-03-16

    Accelerated and stress stability data are often used to predict the shelf life of pharmaceuticals. Temperature, combined with humidity, accelerates chemical decomposition, and the Arrhenius equation is used to extrapolate accelerated stability results to long-term stability. Statistical estimation of the humidity-corrected Arrhenius equation is not straightforward due to its non-linearity. A two-stage nonlinear fitting approach is used in practice, followed by a prediction stage. We developed a single-stage statistical procedure, called the reference condition approach, which has better statistical properties (less collinearity, direct estimation of uncertainty, narrower prediction interval) and is significantly easier to use than the existing approaches. Our statistical model was populated with data from a 35-day stress stability study on a laboratory batch of vitamin tablets and required a mere 30 laboratory assay determinations. The stability prediction agreed well with the actual 24-month long-term stability of the product. The approach has high potential to assist product formulation, specification setting and stability statements. Copyright © 2016 Elsevier B.V. All rights reserved.
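
    A common form of the humidity-corrected Arrhenius model is k = A·exp(-Ea/(R·T))·exp(B·RH). The sketch below evaluates that form with hypothetical parameter values (A, Ea, B are invented for illustration); it does not reproduce the paper's reference-condition reparameterization or its fitting procedure.

```python
# Humidity-corrected Arrhenius model (illustrative parameters only):
#   k = A * exp(-Ea / (R*T)) * exp(B * RH)
import math

R = 8.314  # gas constant, J/(mol*K)

def degradation_rate(T_kelvin, rh_percent, A=1e9, Ea=8.0e4, B=0.04):
    """Degradation rate constant at temperature T and relative humidity RH."""
    return A * math.exp(-Ea / (R * T_kelvin)) * math.exp(B * rh_percent)

# Accelerated (40 C / 75% RH) vs. long-term (25 C / 60% RH) conditions
k_acc = degradation_rate(313.15, 75.0)
k_long = degradation_rate(298.15, 60.0)
print(f"acceleration factor: {k_acc / k_long:.1f}")
```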

  4. Systematic reviews: I. The correlation between laboratory tests on marginal quality and bond strength. II. The correlation between marginal quality and clinical outcome.

    PubMed

    Heintze, Siegward D

    2007-01-01

    An accepted principle in restorative dentistry states that restorations should be placed with the best marginal quality possible to avoid postoperative sensitivity, marginal discoloration, and secondary caries. Different laboratory methods claim to predict the clinical performance of restorative materials, for example, tests of bond strength and microleakage and gap analysis. The purpose of this review was twofold: (1) to find studies that correlated the results of bond strength tests with either microleakage or gap analysis for the same materials, and (2) to find studies that correlated the results of microleakage and/or gaps with clinical parameters for the same materials. Furthermore, factors influencing the results of the laboratory tests were reviewed and assessed. For the first question, searches for studies were conducted in the MEDLINE database and IADR/AADR abstracts online with specific search and inclusion criteria. The outcome for each study was assessed on the basis of the statistical test applied in the study, and finally the number of studies with or without correlation was compiled. For the second question, results of the quantitative marginal analysis of Class V restorations published by the University of Zürich with the same test protocol were compared with prospective clinical trials that investigated the same materials for at least 2 years in Class V cavities. Pearson correlation coefficients were calculated for pooled data of materials and clinical outcome parameters such as retention loss, marginal discoloration, marginal integrity, and secondary caries. For the correlation of dye penetration and clinical outcome, studies on Class V restorations published by the same research institute were searched in MEDLINE that examined the same adhesive systems as the selected clinical trials. For the correlation bond strength/microleakage, 30 studies were included in the review, and for the correlation bond strength/gap analysis, 18 studies. 
For both topics, about 80% of the studies revealed that there was no correlation between the two methods. For the correlation quantitative marginal analysis/clinical outcome, data were compared to the clinical outcome of 11 selected clinical studies. In only 2 out of the 11 studies (18%) did the clinical outcome match the prognosis based on the laboratory tests; the remaining studies did not show any correlation. When pooling data on 20 adhesive systems, no correlation was found between the percentage of continuous margin of restorations placed in extracted premolars and the percentage of teeth that showed no retention loss in clinical studies, no discoloured margins, acceptable margins, or absence of secondary caries. With regard to the correlation of dye penetration and clinical studies, no sufficient number of studies was found that matched the inclusion criteria. However, literature data suggest that there is no correlation between microleakage data as measured in the laboratory and clinical parameters. The results of bond strength tests did not correlate with laboratory tests that evaluated the marginal seal of restorations such as microleakage or gap analysis. The quantitative marginal analysis of Class V fillings in the laboratory was unable to predict the performance of the same materials in vivo. Therefore, microleakage tests or the quantitative marginal analysis should be abandoned and research should focus on laboratory tests that are validated with regard to their ability to satisfactorily predict the clinical performance of restorative materials.
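
    The review's pooled comparisons rest on the Pearson correlation coefficient. A self-contained sketch, with invented laboratory and clinical percentages standing in for the pooled adhesive-system data:

```python
# Pure-Python Pearson correlation coefficient.
# 'lab' and 'clinic' below are HYPOTHETICAL values, e.g. % continuous
# margin in vitro vs. % of restorations without retention loss in vivo.
from math import sqrt

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

lab = [92.0, 85.0, 78.0, 95.0, 70.0]
clinic = [88.0, 91.0, 86.0, 84.0, 90.0]
print(f"r = {pearson_r(lab, clinic):.2f}")
```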

  5. Report on the Project for Establishment of the Standardized Korean Laboratory Terminology Database, 2015.

    PubMed

    Jung, Bo Kyeung; Kim, Jeeyong; Cho, Chi Hyun; Kim, Ju Yeon; Nam, Myung Hyun; Shin, Bong Kyung; Rho, Eun Youn; Kim, Sollip; Sung, Heungsup; Kim, Shinyoung; Ki, Chang Seok; Park, Min Jung; Lee, Kap No; Yoon, Soo Young

    2017-04-01

    The National Health Information Standards Committee was established in 2004 in Korea. The practical subcommittee for laboratory test terminology was placed in charge of standardizing laboratory medicine terminology in Korean. We aimed to establish a standardized Korean laboratory terminology database, Korea-Logical Observation Identifier Names and Codes (K-LOINC) based on former products sponsored by this committee. The primary product was revised based on the opinions of specialists. Next, we mapped the electronic data interchange (EDI) codes that were revised in 2014, to the corresponding K-LOINC. We established a database of synonyms, including the laboratory codes of three reference laboratories and four tertiary hospitals in Korea. Furthermore, we supplemented the clinical microbiology section of K-LOINC using an alternative mapping strategy. We investigated other systems that utilize laboratory codes in order to investigate the compatibility of K-LOINC with statistical standards for a number of tests. A total of 48,990 laboratory codes were adopted (21,539 new and 16,330 revised). All of the LOINC synonyms were translated into Korean, and 39,347 Korean synonyms were added. Moreover, 21,773 synonyms were added from reference laboratories and tertiary hospitals. Alternative strategies were established for mapping within the microbiology domain. When we applied these to a smaller hospital, the mapping rate was successfully increased. Finally, we confirmed K-LOINC compatibility with other statistical standards, including a newly proposed EDI code system. This project successfully established an up-to-date standardized Korean laboratory terminology database, as well as an updated EDI mapping to facilitate the introduction of standard terminology into institutions. © 2017 The Korean Academy of Medical Sciences.
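
    At its simplest, terminology mapping of the kind described above is a synonym-table lookup from local laboratory codes to standard codes, with unmapped codes tracked to compute a mapping rate. A toy sketch (the code/name pairs below are illustrative, not entries from K-LOINC):

```python
# Toy synonym-table mapping from local lab codes to LOINC-style codes.
# The table entries are ILLUSTRATIVE, not actual K-LOINC content.
synonyms = {
    "GLU": "2345-7",
    "GLUCOSE SERUM": "2345-7",
    "HGB": "718-7",
}

def map_code(local_name):
    """Return the standard code for a local name, or None if unmapped."""
    return synonyms.get(local_name.upper())

local_codes = ["glu", "hgb", "unknown test"]
mapped = [map_code(c) for c in local_codes]
unmapped = [c for c, m in zip(local_codes, mapped) if m is None]
print(f"mapping rate: {1 - len(unmapped) / len(local_codes):.0%}")
```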

  6. Estimation of spatial-temporal gait parameters using a low-cost ultrasonic motion analysis system.

    PubMed

    Qi, Yongbin; Soh, Cheong Boon; Gunawan, Erry; Low, Kay-Soon; Thomas, Rijil

    2014-08-20

    In this paper, a low-cost motion analysis system using a wireless ultrasonic sensor network is proposed and investigated. A methodology has been developed to extract spatial-temporal gait parameters including stride length, stride duration, stride velocity, stride cadence, and stride symmetry from 3D foot displacements estimated by the combination of a spherical positioning technique and an unscented Kalman filter. The performance of this system is validated against a camera-based system in the laboratory with 10 healthy volunteers. Numerical results show the feasibility of the proposed system, with an average error of 2.7% across all the estimated gait parameters. The influence of walking speed on the measurement accuracy of the proposed system is also evaluated. Statistical analysis demonstrates its capability of being used as a gait assessment tool for some medical applications.
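
    Once foot positions are tracked, the stride parameters listed above follow from consecutive heel-strike events. A simplified 1D stand-in for the paper's 3D ultrasonic tracking, with invented (time, position) samples:

```python
# Stride parameters from consecutive heel-strike events.
# heel_strikes are HYPOTHETICAL (time_s, forward_position_m) samples.
heel_strikes = [(0.0, 0.00), (1.1, 1.32), (2.2, 2.61), (3.3, 3.95)]

pairs = list(zip(heel_strikes, heel_strikes[1:]))
stride_lengths = [b[1] - a[1] for a, b in pairs]        # meters
stride_durations = [b[0] - a[0] for a, b in pairs]      # seconds
stride_velocities = [l / d for l, d in zip(stride_lengths, stride_durations)]
cadence = 60.0 / (sum(stride_durations) / len(stride_durations))  # strides/min
print(stride_lengths, [round(v, 2) for v in stride_velocities], round(cadence, 1))
```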

  7. Comparison of estrogen receptor results from pathology reports with results from central laboratory testing.

    PubMed

    Collins, Laura C; Marotti, Jonathan D; Baer, Heather J; Tamimi, Rulla M

    2008-02-06

    We compared estrogen receptor (ER) assay results abstracted from pathology reports with ER results determined on the same specimens by a central laboratory with an immunohistochemical assay. Paraffin sections were cut from tissue microarrays containing 3093 breast cancer specimens from women enrolled in the Nurses' Health Study, 1851 of which had both pathology reports and tissue available for central laboratory testing. All sections were immunostained for ER at the same time. The original assays were biochemical for 1512 (81.7%) of the 1851 specimens, immunohistochemical for 336 (18.2%), and immunofluorescent for three (0.2%). ER results from pathology reports and repeat central laboratory testing were in agreement for 87.3% of specimens (1615 of the 1851 specimens; kappa statistic = 0.64, P < .001). When the comparison was restricted to the specimens for which the ER assays were originally performed by immunohistochemistry, the agreement rate increased to 92.3% of specimens (310 of the 336 specimens; kappa statistic = 0.78, P < .001). Thus, ER assay results from pathology reports appear to be a reasonable alternative to central laboratory ER testing for large, population-based studies of patients with breast cancer.
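
    The kappa statistic quoted above corrects the raw agreement rate for agreement expected by chance. A pure-Python Cohen's kappa, applied here to invented positive/negative ER calls rather than the study's data:

```python
# Cohen's kappa for two raters' categorical calls.
# 'report' and 'central' are ILLUSTRATIVE +/- ER results, not study data.
def cohens_kappa(a, b):
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n   # raw agreement
    labels = set(a) | set(b)
    # Agreement expected by chance from each rater's marginal frequencies
    expected = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)
    return (observed - expected) / (1 - expected)

report  = ["+", "+", "-", "+", "-", "+", "+", "-"]
central = ["+", "+", "-", "-", "-", "+", "+", "+"]
print(f"kappa = {cohens_kappa(report, central):.2f}")
```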

  8. Epidemiological study of phylogenetic transmission clusters in a local HIV-1 epidemic reveals distinct differences between subtype B and non-B infections.

    PubMed

    Chalmet, Kristen; Staelens, Delfien; Blot, Stijn; Dinakis, Sylvie; Pelgrom, Jolanda; Plum, Jean; Vogelaers, Dirk; Vandekerckhove, Linos; Verhofstede, Chris

    2010-09-07

    The number of HIV-1 infected individuals in the Western world continues to rise. More in-depth understanding of regional HIV-1 epidemics is necessary for the optimal design and adequate use of future prevention strategies. The use of a combination of phylogenetic analysis of HIV sequences, with data on patients' demographics, infection route, clinical information and laboratory results, will allow a better characterization of individuals responsible for local transmission. Baseline HIV-1 pol sequences, obtained through routine drug-resistance testing, from 506 patients, newly diagnosed between 2001 and 2009, were used to construct phylogenetic trees and identify transmission-clusters. Patients' demographics, laboratory and clinical data, were retrieved anonymously. Statistical analysis was performed to identify subtype-specific and transmission-cluster-specific characteristics. Multivariate analysis showed significant differences between the 59.7% of individuals with subtype B infection and the 40.3% non-B infected individuals, with regard to route of transmission, origin, infection with Chlamydia (p = 0.01) and infection with Hepatitis C virus (p = 0.017). More and larger transmission-clusters were identified among the subtype B infections (p < 0.001). Overall, in multivariate analysis, clustering was significantly associated with Caucasian origin, infection through homosexual contact and younger age (all p < 0.001). Bivariate analysis additionally showed a correlation between clustering and syphilis (p < 0.001), higher CD4 counts (p = 0.002), Chlamydia infection (p = 0.013) and primary HIV (p = 0.017). Combination of phylogenetics with demographic information, laboratory and clinical data, revealed that HIV-1 subtype B infected Caucasian men-who-have-sex-with-men with high prevalence of sexually transmitted diseases, account for the majority of local HIV-transmissions. 
This finding elucidates observed epidemiological trends through molecular analysis, and justifies sustained focus in prevention on this high risk group.

  9. Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Dean N.

    2011-07-20

    This report summarizes work carried out by the Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) Team for the period of January 1, 2011 through June 30, 2011. It discusses highlights, overall progress, period goals, and collaborations and lists papers and presentations. To learn more about our project, please visit our UV-CDAT website (URL: http://uv-cdat.org). This report will be forwarded to the program manager for the Department of Energy (DOE) Office of Biological and Environmental Research (BER), national and international collaborators and stakeholders, and to researchers working on a wide range of other climate model, reanalysis, and observation evaluation activities. The UV-CDAT executive committee consists of Dean N. Williams of Lawrence Livermore National Laboratory (LLNL); Dave Bader and Galen Shipman of Oak Ridge National Laboratory (ORNL); Phil Jones and James Ahrens of Los Alamos National Laboratory (LANL); Claudio Silva of Polytechnic Institute of New York University (NYU-Poly); and Berk Geveci of Kitware, Inc. The UV-CDAT team consists of researchers and scientists with diverse domain knowledge whose home institutions also include the National Aeronautics and Space Administration (NASA) and the University of Utah. All work is accomplished under DOE open-source guidelines and in close collaboration with the project's stakeholders, domain researchers, and scientists. Working directly with BER climate science analysis projects, this consortium will develop and deploy data and computational resources useful to a wide variety of stakeholders, including scientists, policymakers, and the general public. Members of this consortium already collaborate with other institutions and universities in researching data discovery, management, visualization, workflow analysis, and provenance. 
The UV-CDAT team will address the following high-level visualization requirements: (1) Alternative parallel streaming statistics and analysis pipelines - Data parallelism, Task parallelism, Visualization parallelism; (2) Optimized parallel input/output (I/O); (3) Remote interactive execution; (4) Advanced intercomparison visualization; (5) Data provenance processing and capture; and (6) Interfaces for scientists - Workflow data analysis and visualization construction tools, and Visualization interfaces.

  10. A Statistics-Based Cracking Criterion of Resin-Bonded Silica Sand for Casting Process Simulation

    NASA Astrophysics Data System (ADS)

    Wang, Huimin; Lu, Yan; Ripplinger, Keith; Detwiler, Duane; Luo, Alan A.

    2017-02-01

    Cracking of sand molds/cores can result in many casting defects such as veining. A robust cracking criterion is needed in casting process simulation for predicting/controlling such defects. A cracking probability map, relating to fracture stress and effective volume, was proposed for resin-bonded silica sand based on Weibull statistics. Three-point bending test results of sand samples were used to generate the cracking map and set up a safety line for cracking criterion. Tensile test results confirmed the accuracy of the safety line for cracking prediction. A laboratory casting experiment was designed and carried out to predict cracking of a cup mold during aluminum casting. The stress-strain behavior and the effective volume of the cup molds were calculated using a finite element analysis code ProCAST®. Furthermore, an energy dispersive spectroscopy fractographic examination of the sand samples confirmed the binder cracking in resin-bonded silica sand.
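
    The Weibull-statistics view underlying the cracking map relates failure probability to stress and effective volume, commonly written P = 1 - exp(-(V/V0)(σ/σ0)^m). The sketch below evaluates that form with hypothetical parameters (m, σ0, V0 are invented, not the paper's fitted values) to show the volume effect:

```python
# Weibull failure probability: P = 1 - exp(-(V/V0) * (sigma/sigma0)^m).
# All parameter values are HYPOTHETICAL, for illustration only.
import math

def failure_probability(sigma, volume, m=8.0, sigma0=2.5, v0=1.0):
    """sigma in MPa, volume in cm^3; m is the Weibull modulus."""
    return 1.0 - math.exp(-(volume / v0) * (sigma / sigma0) ** m)

# At the same stress, a larger effective volume cracks more often
p_small = failure_probability(2.0, 1.0)
p_large = failure_probability(2.0, 10.0)
print(round(p_small, 3), round(p_large, 3))
```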

  11. Enabling High-Energy, High-Voltage Lithium-Ion Cells: Standardization of Coin-Cell Assembly, Electrochemical Testing, and Evaluation of Full Cells

    DOE PAGES

    Long, Brandon R.; Rinaldo, Steven G.; Gallagher, Kevin G.; ...

    2016-11-09

    Coin-cells are often the test format of choice for laboratories engaged in battery research and development as they provide a convenient platform for rapid testing of new materials on a small scale. However, reliable, reproducible data via the coin-cell format is inherently difficult, particularly in the full-cell configuration. In addition, statistical evaluation to prove the consistency and reliability of such data is often neglected. Herein we report on several studies aimed at formalizing physical process parameters and coin-cell construction related to full cells. Statistical analysis and performance benchmarking approaches are advocated as a means to more confidently track changes in cell performance. Finally, we show that trends in the electrochemical data obtained from coin-cells can be reliable and informative when standardized approaches are implemented in a consistent manner.

  12. Bootstrapping Methods Applied for Simulating Laboratory Works

    ERIC Educational Resources Information Center

    Prodan, Augustin; Campean, Remus

    2005-01-01

    Purpose: The aim of this work is to implement bootstrapping methods into software tools, based on Java. Design/methodology/approach: This paper presents a category of software e-tools aimed at simulating laboratory works and experiments. Findings: Both students and teaching staff use traditional statistical methods to infer the truth from sample…

  13. Building a database for statistical characterization of ELMs on DIII-D

    NASA Astrophysics Data System (ADS)

    Fritch, B. J.; Marinoni, A.; Bortolon, A.

    2017-10-01

    Edge localized modes (ELMs) are bursty instabilities which occur in the edge region of H-mode plasmas and have the potential to damage in-vessel components of future fusion machines by exposing the divertor region to large energy and particle fluxes during each ELM event. While most ELM studies focus on average quantities (e.g. energy loss per ELM), this work investigates the statistical distributions of ELM characteristics, as a function of plasma parameters. A semi-automatic algorithm is being used to create a database documenting trigger times of the tens of thousands of ELMs for DIII-D discharges in scenarios relevant to ITER, thus allowing statistically significant analysis. Probability distributions of inter-ELM periods and energy losses will be determined and related to relevant plasma parameters such as density, stored energy, and current in order to constrain models and improve estimates of the expected inter-ELM periods and sizes, both of which must be controlled in future reactors. Work supported in part by US DoE under the Science Undergraduate Laboratory Internships (SULI) program, DE-FC02-04ER54698 and DE-FG02- 94ER54235.

  14. [Efficacy and tolerability of cisapride in a new formula of 10 mg effervescent capsules for the treatment of functional dyspepsia].

    PubMed

    Grossi, L; Di Felice, F; Marzio, L

    1993-06-01

    The efficacy and tolerability of Cisapride effervescent granules and a metoclopramide-dimethicone combination were compared double-blind in two comparable groups of 15 patients each with dyspepsia. All patients received three sachets daily of either drug for 6 consecutive weeks. As for efficacy, Cisapride effervescent granules was found to reduce 85% (11/13) of symptoms to a statistically significant extent, as against 42% (5/12) in the reference group. Statistical analysis showed Cisapride effervescent granules to be more effective than the reference drug for 6 out of 11 evaluable symptoms. Mean global improvement was 86% for Cisapride effervescent granules vs 41% for the reference combination. Final judgment by the physician was more favorable for Cisapride effervescent granules than for the reference drug (p < 0.0001). Treatment withdrawal was never necessary and no significant changes of laboratory values were observed. No statistically significant difference between the two treatments as to tolerability was observed. In conclusion, Cisapride effervescent granules was found to have a better risk/benefit ratio than the reference combination.

  15. Radar image enhancement and simulation as an aid to interpretation and training

    NASA Technical Reports Server (NTRS)

    Frost, V. S.; Stiles, J. A.; Holtzman, J. C.; Dellwig, L. F.; Held, D. N.

    1980-01-01

    Greatly increased activity in the field of radar image applications in the coming years demands that techniques of radar image analysis, enhancement, and simulation be developed now. Since the statistical nature of radar imagery differs from that of photographic imagery, one finds that the required digital image processing algorithms (e.g., for improved viewing and feature extraction) differ from those currently existing. This paper addresses these problems and discusses work at the Remote Sensing Laboratory in image simulation and processing, especially for systems comparable to the formerly operational SEASAT synthetic aperture radar.

  16. External quality-assurance results for the national atmospheric deposition program/national trends network, 2000-2001

    USGS Publications Warehouse

    Wetherbee, Gregory A.; Latysh, Natalie E.; Gordon, John D.

    2004-01-01

    Five external quality-assurance programs were operated by the U.S. Geological Survey for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN) from 2000 through 2001 (study period): the intersite-comparison program, the blind-audit program, the field-audit program, the interlaboratory-comparison program, and the collocated-sampler program. Each program is designed to measure specific components of the total error inherent in NADP/NTN wet-deposition measurements. The intersite-comparison program assesses the variability and bias of pH and specific-conductance determinations made by NADP/NTN site operators with respect to accuracy goals. The accuracy goals are statistically based using the median of all of the measurements obtained for each of four intersite-comparison studies. The percentage of site operators responding on time that met the pH accuracy goals ranged from 84.2 to 90.5 percent. In these same four intersite-comparison studies, 88.9 to 99.0 percent of the site operators met the accuracy goals for specific conductance. The blind-audit program evaluates the effects of routine sample handling, processing, and shipping on the chemistry of weekly precipitation samples. The blind-audit data for the study period indicate that sample handling introduced a small amount of sulfate contamination and slight changes to hydrogen-ion content of the precipitation samples. The magnitudes of the paired differences are not environmentally significant to NADP/NTN data users. The field-audit program (also known as the 'field-blank program') was designed to measure the effects of field exposure, handling, and processing on the chemistry of NADP/NTN precipitation samples. The results indicate potential low-level contamination of NADP/NTN samples with calcium, ammonium, chloride, and nitrate. Less sodium contamination was detected by the field-audit data than in previous years. 
Statistical analysis of the paired differences shows that contaminant ions are entrained into the solutions from the field-exposed buckets, but the positive bias that results from the minor amount of contamination appears to affect the analytical results by less than 6 percent. An interlaboratory-comparison program is used to estimate the analytical variability and bias of participating laboratories, especially the NADP Central Analytical Laboratory (CAL). Statistical comparison of the analytical results of participating laboratories implies that analytical data from the various monitoring networks can be compared. Bias was identified in the CAL data for ammonium, chloride, nitrate, sulfate, hydrogen-ion, and specific-conductance measurements, but the absolute value of the bias was less than analytical minimum reporting limits for all constituents except ammonium and sulfate. Control charts show brief time periods when the CAL's analytical precision for sodium, ammonium, and chloride was not within the control limits. Data for the analysis of ultrapure deionized-water samples indicated that the laboratories are maintaining good control of laboratory contamination. Estimated analytical precision among the laboratories indicates that the magnitudes of chemical-analysis errors are not environmentally significant to NADP data users. Overall precision of the precipitation-monitoring system used by the NADP/NTN was estimated by evaluation of samples from collocated monitoring sites at CA99, CO08, and NH02. Precision defined by the median of the absolute percent difference (MAE) was estimated to be approximately 10 percent or less for calcium, magnesium, sodium, chloride, nitrate, sulfate, specific conductance, and sample volume. The MAE values for ammonium and hydrogen-ion concentrations were estimated to be less than 10 percent for CA99 and NH02 but nearly 20 percent for ammonium concentration and about 17 percent for hydrogen-ion concentration for CO08. 
As in past years, the variability in the collocated-site data for sam
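
    The collocated-sampler precision metric above, the median of the absolute percent difference (MAE), can be computed directly from paired measurements. A sketch with invented concentration pairs in place of the actual collocated-site data:

```python
# Median absolute percent difference (MAE) for collocated sampler pairs.
# 'calcium_pairs' are HYPOTHETICAL paired concentrations (mg/L).
from statistics import median

def mae_percent(primary, collocated):
    diffs = [abs(p - c) / ((p + c) / 2) * 100  # percent of the pair mean
             for p, c in zip(primary, collocated)]
    return median(diffs)

calcium_pairs = ([0.10, 0.22, 0.15, 0.31], [0.11, 0.21, 0.16, 0.30])
print(f"MAE = {mae_percent(*calcium_pairs):.1f}%")
```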

  17. Exploring laser-induced breakdown spectroscopy for nuclear materials analysis and in-situ applications

    NASA Astrophysics Data System (ADS)

    Martin, Madhavi Z.; Allman, Steve; Brice, Deanne J.; Martin, Rodger C.; Andre, Nicolas O.

    2012-08-01

    Laser-induced breakdown spectroscopy (LIBS) has been used to determine the limits of detection of strontium (Sr) and cesium (Cs), common nuclear fission products. Additionally, detection limits were determined for cerium (Ce), often used as a surrogate for radioactive plutonium in laboratory studies. Results were obtained using a laboratory instrument with a Nd:YAG laser at fundamental wavelength of 1064 nm, frequency doubled to 532 nm with energy of 50 mJ/pulse. The data was compared for different concentrations of Sr and Ce dispersed in a CaCO3 (white) and carbon (black) matrix. We have addressed the sampling errors, limits of detection, reproducibility, and accuracy of measurements as they relate to multivariate analysis in pellets that were doped with the different elements at various concentrations. These results demonstrate that LIBS technique is inherently well suited for in situ analysis of nuclear materials in hot cells. Three key advantages are evident: (1) small samples (mg) can be evaluated; (2) nuclear materials can be analyzed with minimal sample preparation; and (3) samples can be remotely analyzed very rapidly (ms-seconds). Our studies also show that the methods can be made quantitative. Very robust multivariate models have been used to provide quantitative measurement and statistical evaluation of complex materials derived from our previous research on wood and soil samples.
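
    A conventional way to express the limits of detection discussed above is LOD = 3·σ_blank / slope, from repeated blank measurements and the slope of a calibration curve. The numbers below are illustrative only, not values from the study:

```python
# Detection-limit estimate from calibration: LOD = 3 * sigma_blank / slope.
# All signal values and the slope are HYPOTHETICAL.
from statistics import stdev

blank_signals = [102.0, 98.5, 101.2, 99.8, 100.5]  # blank-pellet intensities
calibration_slope = 250.0                           # counts per (mg/kg)

lod = 3 * stdev(blank_signals) / calibration_slope  # in mg/kg
print(f"LOD = {lod:.4f} mg/kg")
```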

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merkley, Eric D.; Sego, Landon H.; Lin, Andy

    Adaptive processes in bacterial species can occur rapidly in laboratory culture, leading to genetic divergence between naturally occurring and laboratory-adapted strains. Differentiating wild and closely related laboratory strains is clearly important for biodefense and bioforensics; however, DNA sequence data alone has thus far not provided a clear signature, perhaps due to a lack of understanding of how diverse genome changes lead to adapted phenotypes. Protein abundance profiles from mass spectrometry-based proteomics analyses are a molecular measure of phenotype. Proteomics data contain sufficient information for powerful statistical methods to uncover signatures that distinguish wild strains of Yersinia pestis from laboratory-adapted strains.

  19. A multicenter randomized comparison of cycle control and laboratory findings with oral contraceptive agents containing 100 microg levonorgestrel with 20 microg ethinyl estradiol or triphasic norethindrone with ethinyl estradiol.

    PubMed

    Reisman, H; Martin, D; Gast, M J

    1999-11-01

    This study was undertaken to compare the effects of 2 oral contraceptive regimens on menstrual cycle control and laboratory findings. In a multicenter randomized study 100 microg levonorgestrel with 20 microg ethinyl estradiol (Alesse or Loette) was given to 155 healthy women. A triphasic preparation of 500, 750, and 1000 microg norethindrone with 35 microg ethinyl estradiol (Ortho-Novum 7/7/7 or TriNovum) was given to 167 women for 1 to 4 cycles of treatment. Overall, the percentages of normal menstrual cycles and the percentages of cycles with intermenstrual and withdrawal bleeding were similar between the 2 treatment groups. In the levonorgestrel with ethinyl estradiol group, there was a statistically significantly longer latent period and a statistically significantly shorter withdrawal bleeding episode. Adverse events were similar between treatment groups, and none were serious. Most mean changes from baseline laboratory values were comparable between groups, although the mean increase in cholesterol concentration was statistically significantly lower in the levonorgestrel with ethinyl estradiol group. Changes in triglyceride and glucose concentrations were not statistically significantly different between groups. Levonorgestrel (100 microg) with ethinyl estradiol (20 microg) provides menstrual cycle control equivalent to that obtained with triphasic norethindrone with ethinyl estradiol (75% higher estrogen dose) with similar safety and tolerability.

  20. Clinical use and misuse of automated semen analysis.

    PubMed

    Sherins, R J

    1991-01-01

    During the past six years, there has been an explosion of technology which allows automated machine-vision for sperm analysis. CASA clearly provides an opportunity for objective, systematic assessment of sperm motion. But there are many caveats in using this type of equipment. CASA requires a disciplined and standardized approach to semen collection, specimen preparation, machine settings, calibration and avoidance of sampling bias. Potential sources of error can be minimized. Unfortunately, the rapid commercialization of this technology preceded detailed statistical analysis of such data to allow equally rapid comparisons of data between different CASA machines and among different laboratories. Thus, it is now imperative that we standardize use of this technology and obtain more detailed biological insights into sperm motion parameters in semen and after capacitation before we empirically employ CASA for studies of fertility prediction. In the basic science arena, CASA technology will likely evolve to provide new algorithms for accurate sperm motion analysis and give us an opportunity to address the biophysics of sperm movement. In the clinical arena, CASA instruments provide the opportunity to share and compare sperm motion data among laboratories by virtue of its objectivity, assuming standardized conditions of utilization. Identification of men with specific sperm motion disorders is certain, but the biological relevance of motility dysfunction to actual fertilization remains uncertain and surely the subject for further study.

  1. Agreement between allergen-specific IgE assays and ensuing immunotherapy recommendations from four commercial laboratories in the USA.

    PubMed

    Plant, Jon D; Neradelik, Moni B; Polissar, Nayak L; Fadok, Valerie A; Scott, Brian A

    2014-02-01

    Canine allergen-specific IgE assays in the USA are not subjected to an independent laboratory reliability monitoring programme. The aim of this study was to evaluate the agreement of diagnostic results and treatment recommendations of four serum IgE assays commercially available in the USA. Replicate serum samples from 10 atopic dogs were submitted to each of four laboratories for allergen-specific IgE assays (ACTT®, VARL Liquid Gold, ALLERCEPT® and Greer® Aller-g-complete®). The interlaboratory agreement of standard, regional panels and ensuing treatment recommendations was analysed with the kappa statistic (κ) to account for agreement that might occur merely by chance. Six comparisons of pairs of laboratories and overall agreement among laboratories were analysed for ungrouped allergens (as tested) and also with allergens grouped according to reported cross-reactivity and taxonomy. The overall chance-corrected agreement of the positive/negative test results for ungrouped and grouped allergens was slight (κ = 0.14 and 0.13, respectively). Subset analysis of the laboratory pair with the highest level of diagnostic agreement (κ = 0.36) found slight agreement (κ = 0.13) for ungrouped plants and fungi, but substantial agreement (κ = 0.71) for ungrouped mites. The overall agreement of the treatment recommendations was slight (κ = 0.11). Altogether, 85.1% of ungrouped allergen treatment recommendations were unique to one laboratory or another. Our study indicated that the choice of IgE assay may have a major influence on the positive/negative results and ensuing treatment recommendations. © 2014 The Authors. Veterinary Dermatology published by John Wiley & Sons Ltd on behalf of the ESVD and the ACVD.
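    The kappa statistic used above corrects raw percentage agreement for the agreement expected by chance alone. A minimal sketch of Cohen's kappa for one pair of laboratories' positive/negative calls (the call vectors are hypothetical, not the study's data):

```python
def cohens_kappa(a, b):
    """Chance-corrected agreement between two raters' binary calls."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n   # observed agreement
    pa = sum(a) / n                                  # rater A positive rate
    pb = sum(b) / n                                  # rater B positive rate
    p_exp = pa * pb + (1 - pa) * (1 - pb)            # agreement expected by chance
    return (p_obs - p_exp) / (1 - p_exp)

# hypothetical positive (1) / negative (0) calls from two laboratories
lab_a = [1, 1, 0, 0, 1, 0, 1, 0]
lab_b = [1, 0, 0, 1, 1, 0, 0, 0]
kappa = cohens_kappa(lab_a, lab_b)  # 0.25: only "fair" despite 62.5% raw agreement
```

    Note that the study's overall agreement across all four laboratories would use a multi-rater generalization such as Fleiss' kappa rather than pairwise Cohen's kappa.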

  2. Carpal tunnel syndrome among laboratory technicians in relation to personal and ergonomic factors at work.

    PubMed

    El-Helaly, Mohamed; Balkhy, Hanan H; Vallenius, Laura

    2017-11-25

    Work-related carpal tunnel syndrome (CTS) has been reported in different occupations, including laboratory technicians, so this study was carried out to determine the prevalence and the associated personal and ergonomic factors for CTS among laboratory technicians. A cross-sectional study was conducted among 279 laboratory technicians at King Fahd Hospital, Saudi Arabia, who filled in a self-administered questionnaire, including questions regarding their demographic criteria, occupational history, job tasks, workplace tools, ergonomic factors at work, and symptoms suggestive of CTS. Physical examinations and electrodiagnostic studies were carried out for those who had symptoms suggestive of CTS to confirm the diagnosis. Univariate and multivariate analyses were performed for both personal and physical factors in association with confirmed CTS among laboratory technicians. The prevalence of CTS among the laboratory technicians was 9.7% (27/279). The following were the statistically significant risk factors for CTS among them: gender (all cases of CTS were female, P=0.00), arm/hand exertion (OR: 7.96; 95% CI: 1.84-34.33), pipetting (OR: 7.27; 95% CI: 3.15-16.78), repetitive tasks (OR: 4.60; 95% CI: 1.39-15.70), using unadjustable chairs or desks (OR: 3.35; 95% CI: 1.23-9.15), and working with a biosafety cabinet (OR: 2.49; 95% CI: 1.11-5.59). CTS cases had a significantly longer work duration (17.9 ± 5.6 years) than CTS non-cases (11.5 ± 7.4 years), with a low OR (1.108). This study demonstrates some personal and ergonomic factors associated with CTS among the laboratory technicians, including female gender, arm/hand exertion, pipetting, repetitive tasks, working with a biosafety cabinet, and an unadjusted workstation.
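    The odds ratios with 95% confidence intervals reported above are typically computed from a 2x2 exposure-by-outcome table, with a Wald interval taken on the log scale. A minimal sketch with hypothetical counts (not the study's raw data):

```python
import math

def odds_ratio_wald_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# hypothetical counts for one ergonomic exposure
or_, lower, upper = odds_ratio_wald_ci(20, 80, 7, 172)
```

    The interval is asymmetric around the point estimate because it is symmetric on the log scale, which is why the reported CIs above stretch further on the high side.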

  3. Standardization of Quantitative PCR for Human T-Cell Leukemia Virus Type 1 in Japan: a Collaborative Study

    PubMed Central

    Okuma, Kazu; Yamochi, Tadanori; Sato, Tomoo; Sasaki, Daisuke; Hasegawa, Hiroo; Umeki, Kazumi; Kubota, Ryuji; Sobata, Rieko; Matsumoto, Chieko; Kaneko, Noriaki; Naruse, Isao; Yamagishi, Makoto; Nakashima, Makoto; Momose, Haruka; Araki, Kumiko; Mizukami, Takuo; Mizusawa, Saeko; Okada, Yoshiaki; Ochiai, Masaki; Utsunomiya, Atae; Koh, Ki-Ryang; Ogata, Masao; Nosaka, Kisato; Uchimaru, Kaoru; Iwanaga, Masako; Sagara, Yasuko; Yamano, Yoshihisa; Satake, Masahiro; Okayama, Akihiko; Mochizuki, Manabu; Izumo, Shuji; Saito, Shigeru; Itabashi, Kazuo; Kamihira, Shimeru; Yamaguchi, Kazunari; Watanabe, Toshiki

    2015-01-01

    Quantitative PCR (qPCR) analysis of human T-cell leukemia virus type 1 (HTLV-1) was used to assess the amount of HTLV-1 provirus DNA integrated into the genomic DNA of host blood cells. Accumulating evidence indicates that a high proviral load is one of the risk factors for the development of adult T-cell leukemia/lymphoma and HTLV-1-associated myelopathy/tropical spastic paraparesis. However, interlaboratory variability in qPCR results makes it difficult to assess the differences in reported proviral loads between laboratories. To remedy this situation, we attempted to minimize discrepancies between laboratories through standardization of HTLV-1 qPCR in a collaborative study. TL-Om1 cells that harbor the HTLV-1 provirus were serially diluted with peripheral blood mononuclear cells to prepare a candidate standard. By statistically evaluating the proviral loads of the standard and those determined using in-house qPCR methods at each laboratory, we determined the relative ratios of the measured values in the laboratories to the theoretical values of the TL-Om1 standard. The relative ratios of the laboratories ranged from 0.84 to 4.45. Next, we corrected the proviral loads of the clinical samples from HTLV-1 carriers using the relative ratio. As expected, the overall differences between the laboratories were reduced by half, from 7.4-fold to 3.8-fold on average, after applying the correction. HTLV-1 qPCR can be standardized using TL-Om1 cells as a standard and by determining the relative ratio of the measured to the theoretical standard values in each laboratory. PMID:26292315

  4. Standardization of Quantitative PCR for Human T-Cell Leukemia Virus Type 1 in Japan: a Collaborative Study.

    PubMed

    Kuramitsu, Madoka; Okuma, Kazu; Yamochi, Tadanori; Sato, Tomoo; Sasaki, Daisuke; Hasegawa, Hiroo; Umeki, Kazumi; Kubota, Ryuji; Sobata, Rieko; Matsumoto, Chieko; Kaneko, Noriaki; Naruse, Isao; Yamagishi, Makoto; Nakashima, Makoto; Momose, Haruka; Araki, Kumiko; Mizukami, Takuo; Mizusawa, Saeko; Okada, Yoshiaki; Ochiai, Masaki; Utsunomiya, Atae; Koh, Ki-Ryang; Ogata, Masao; Nosaka, Kisato; Uchimaru, Kaoru; Iwanaga, Masako; Sagara, Yasuko; Yamano, Yoshihisa; Satake, Masahiro; Okayama, Akihiko; Mochizuki, Manabu; Izumo, Shuji; Saito, Shigeru; Itabashi, Kazuo; Kamihira, Shimeru; Yamaguchi, Kazunari; Watanabe, Toshiki; Hamaguchi, Isao

    2015-11-01

    Quantitative PCR (qPCR) analysis of human T-cell leukemia virus type 1 (HTLV-1) was used to assess the amount of HTLV-1 provirus DNA integrated into the genomic DNA of host blood cells. Accumulating evidence indicates that a high proviral load is one of the risk factors for the development of adult T-cell leukemia/lymphoma and HTLV-1-associated myelopathy/tropical spastic paraparesis. However, interlaboratory variability in qPCR results makes it difficult to assess the differences in reported proviral loads between laboratories. To remedy this situation, we attempted to minimize discrepancies between laboratories through standardization of HTLV-1 qPCR in a collaborative study. TL-Om1 cells that harbor the HTLV-1 provirus were serially diluted with peripheral blood mononuclear cells to prepare a candidate standard. By statistically evaluating the proviral loads of the standard and those determined using in-house qPCR methods at each laboratory, we determined the relative ratios of the measured values in the laboratories to the theoretical values of the TL-Om1 standard. The relative ratios of the laboratories ranged from 0.84 to 4.45. Next, we corrected the proviral loads of the clinical samples from HTLV-1 carriers using the relative ratio. As expected, the overall differences between the laboratories were reduced by half, from 7.4-fold to 3.8-fold on average, after applying the correction. HTLV-1 qPCR can be standardized using TL-Om1 cells as a standard and by determining the relative ratio of the measured to the theoretical standard values in each laboratory. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
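    The correction described in these two records amounts to estimating a lab-specific scale factor against the TL-Om1 standard and dividing clinical measurements by it. A sketch under the assumption (not stated in the abstract) that the ratio is averaged geometrically across the dilution series:

```python
import math

def relative_ratio(measured, theoretical):
    """Lab-specific scaling factor: geometric mean of measured/theoretical
    across the standard dilution series."""
    logs = [math.log(m / t) for m, t in zip(measured, theoretical)]
    return math.exp(sum(logs) / len(logs))

def corrected_load(sample_load, ratio):
    """Correct a clinical proviral load by the laboratory's relative ratio."""
    return sample_load / ratio

# hypothetical dilution series (proviral copies per 100 cells)
theoretical = [10.0, 1.0, 0.1]
measured = [20.0, 2.1, 0.19]   # this lab reads roughly 2x high
ratio = relative_ratio(measured, theoretical)
```

    A laboratory with ratio 2.0 that reports a clinical load of 8.0 would thus report a corrected load of 4.0; applying each lab's own ratio is what shrank the interlaboratory spread from 7.4-fold to 3.8-fold.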

  5. Carpal tunnel syndrome among laboratory technicians in relation to personal and ergonomic factors at work

    PubMed Central

    El-Helaly, Mohamed; Balkhy, Hanan H.; Vallenius, Laura

    2017-01-01

    Objectives: Work-related carpal tunnel syndrome (CTS) has been reported in different occupations, including laboratory technicians, so this study was carried out to determine the prevalence and the associated personal and ergonomic factors for CTS among laboratory technicians. Methods: A cross-sectional study was conducted among 279 laboratory technicians at King Fahd Hospital, Saudi Arabia, who filled in a self-administered questionnaire, including questions regarding their demographic criteria, occupational history, job tasks, workplace tools, ergonomic factors at work, and symptoms suggestive of CTS. Physical examinations and electrodiagnostic studies were carried out for those who had symptoms suggestive of CTS to confirm the diagnosis. Univariate and multivariate analyses were performed for both personal and physical factors in association with confirmed CTS among laboratory technicians. Results: The prevalence of CTS among the laboratory technicians was 9.7% (27/279). The following were the statistically significant risk factors for CTS among them: gender (all cases of CTS were female, P=0.00), arm/hand exertion (OR: 7.96; 95% CI: 1.84-34.33), pipetting (OR: 7.27; 95% CI: 3.15-16.78), repetitive tasks (OR: 4.60; 95% CI: 1.39-15.70), using unadjustable chairs or desks (OR: 3.35; 95% CI: 1.23-9.15), and working with a biosafety cabinet (OR: 2.49; 95% CI: 1.11-5.59). CTS cases had a significantly longer work duration (17.9 ± 5.6 years) than CTS non-cases (11.5 ± 7.4 years), with a low OR (1.108). Conclusion: This study demonstrates some personal and ergonomic factors associated with CTS among the laboratory technicians, including female gender, arm/hand exertion, pipetting, repetitive tasks, working with a biosafety cabinet, and an unadjusted workstation. PMID:28855446

  6. HEART: an automated beat-to-beat cardiovascular analysis package using Matlab.

    PubMed

    Schroeder, Mark J; Perreault, Bill; Ewert, Daniel L; Koenig, Steven C

    2004-07-01

    A computer program is described for beat-to-beat analysis of cardiovascular parameters from high-fidelity pressure and flow waveforms. The Hemodynamic Estimation and Analysis Research Tool (HEART) is a post-processing analysis software package developed in Matlab that enables scientists and clinicians to document, load, view, calibrate, and analyze experimental data that have been digitally saved in ASCII or binary format. Analysis routines include traditional hemodynamic parameter estimates as well as more sophisticated analyses such as lumped arterial model parameter estimation and vascular impedance frequency spectra. Cardiovascular parameter values of all analyzed beats can be viewed and statistically analyzed. An attractive feature of the HEART program is the ability to analyze data with visual quality assurance throughout the process, thus establishing a framework through which Good Laboratory Practice (GLP) compliance can be achieved. Additionally, the development of HEART on the Matlab platform provides users with the flexibility to adapt or create study-specific analysis files according to their specific needs. Copyright 2003 Elsevier Ltd.

  7. Method and data evaluation at NASA endocrine laboratory [Skylab 3 experiments]

    NASA Technical Reports Server (NTRS)

    Johnston, D. A.

    1974-01-01

    The biomedical data of the astronauts on Skylab 3 were analyzed to evaluate the univariate statistical methods for comparing endocrine series experiments in relation to other medical experiments. It was found that an information storage and retrieval system was needed to facilitate statistical analyses.

  8. DARHT Multi-intelligence Seismic and Acoustic Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Garrison Nicole; Van Buren, Kendra Lu; Hemez, Francois M.

    The purpose of this report is to document the analysis of seismic and acoustic data collected at the Dual-Axis Radiographic Hydrodynamic Test (DARHT) facility at Los Alamos National Laboratory for robust, multi-intelligence decision making. The data utilized herein are obtained from two tri-axial seismic sensors and three acoustic sensors, resulting in a total of nine data channels. The goal of this analysis is to develop a generalized, automated framework to determine internal operations at DARHT using informative features extracted from measurements collected external to the facility. Our framework involves four components: (1) feature extraction, (2) data fusion, (3) classification, and finally (4) robustness analysis. Two approaches are taken for extracting features from the data. The first of these, generic feature extraction, involves extraction of statistical features from the nine data channels. The second approach, event detection, identifies specific events relevant to traffic entering and leaving the facility as well as explosive activities at DARHT and nearby explosive testing sites. Event detection is completed using a two-stage method, first utilizing signatures in the frequency domain to identify outliers and second extracting short-duration events of interest among these outliers by evaluating residuals of an autoregressive exogenous time series model. Features extracted from each data set are then fused to perform analysis with a multi-intelligence paradigm, where information from multiple data sets is combined to generate more information than is available through analysis of each independently. The fused feature set is used to train a statistical classifier and predict the state of operations to inform a decision maker. We demonstrate this classification using both generic statistical features and event detection and provide a comparison of the two methods.
    Finally, the concept of decision robustness is presented through a preliminary analysis where uncertainty is added to the system through noise in the measurements.
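    The second-stage event detector described above (flagging short-duration events via residuals of an autoregressive model) can be sketched as follows; the plain least-squares AR fit and the k-sigma threshold are illustrative assumptions, not the report's exact settings:

```python
import numpy as np

def ar_residual_events(x, order=2, k=3.0):
    """Fit an AR(order) model by least squares; return the indices of
    samples whose one-step-ahead residual exceeds k residual std devs."""
    x = np.asarray(x, dtype=float)
    # lagged design matrix: column i holds x[t-1-i] for t = order..n-1
    X = np.column_stack([x[order - 1 - i : len(x) - 1 - i] for i in range(order)])
    y = x[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return order + np.flatnonzero(np.abs(resid) > k * resid.std())

# smooth background signal with one injected transient at sample 100
sig = np.sin(np.linspace(0.0, 20.0, 200))
sig[100] += 5.0
events = ar_residual_events(sig, order=2, k=4.0)
```

    Because the AR model predicts the smooth background almost exactly, the transient (and the samples it contaminates through the lags) stands out in the residuals even though it would be modest relative to the raw signal variance.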

  9. Effects of Laboratory Disinfecting Agents on Dimensional Stability of Three Commercially Available Heat-Cured Denture Acrylic Resins in India: An In-Vitro Study

    PubMed Central

    Jujare, Ravikanth Haridas; Varghese, Rana Kalappattil; Singh, Vishwa Deepak; Gaurav, Amit

    2016-01-01

    Introduction: Dental professionals are exposed to a wide variety of microorganisms, which calls for the use of effective infection control procedures in the dental office and laboratories that can prevent cross-contamination that could extend to dentists, dental office staff, dental technicians as well as patients. This concern has led to a renewed interest in denture sterilization and disinfection. Heat-polymerized dentures exhibit dimensional change during disinfection procedures. Aim: The purpose of this study was to determine the influence of different types of widely used laboratory disinfecting agents on the dimensional stability of heat-cured denture acrylic resins and to compare the dimensional stability of three commercially available heat-cured denture acrylic resins in India. Materials and Methods: Twelve specimens of uniform dimension each of three different brands, namely Stellon, Trevalon and Acralyn-H, were prepared using a circular metal disc. Chemical disinfectants, namely 2% alkaline glutaraldehyde, 1% povidone-iodine and 0.5% sodium hypochlorite, were used, with water as the control group. The diameter of each specimen was measured before immersion and after immersion at time intervals of 1 hour and 12 hours. The data were evaluated statistically using one-way analysis of variance. Results: All the specimens in the three disinfectants and in water exhibited a very small amount of linear expansion. Among the three disinfectants, specimens in 2% alkaline glutaraldehyde exhibited the least (0.005 mm) and water showed the highest (0.009 mm) amount of dimensional change. Among the resins, Trevalon showed the least (0.067 mm) and Acralyn-H exhibited the highest (0.110 mm) amount of dimensional change. Conclusion: Although all the specimens of the three different brands of heat-cured denture acrylic resins exhibited an increase in linear dimensional change in all the disinfectants and water, the changes were found to be statistically insignificant. PMID:27134996
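    The one-way analysis of variance used above reduces to comparing the between-group mean square to the within-group mean square. A minimal sketch with made-up readings (the values below are hypothetical, not the study's measurements):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA across k groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # variation of group means around the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # variation of observations around their own group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# hypothetical dimensional-change readings (mm) for three disinfectant groups
f_stat = one_way_anova_f([[0.004, 0.005, 0.006],
                          [0.006, 0.007, 0.008],
                          [0.008, 0.009, 0.010]])
```

    The F statistic is then compared against the F distribution with (k - 1, n - k) degrees of freedom; a small F, as reported in this study, means group differences are indistinguishable from within-group scatter.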

  10. Validations of a portable home sleep study with twelve-lead polysomnography: comparisons and insights into a variable gold standard.

    PubMed

    Michaelson, Peter G; Allan, Patrick; Chaney, John; Mair, Eric A

    2006-11-01

    Accurate and timely diagnosis for patients with obstructive sleep apnea (OSA) is imperative. Unfortunately, growing interest in this diagnosis has resulted in increased requests and waiting times for polysomnography (PSG), as well as a potential delay in diagnosis and treatment. This study evaluated the accuracy and viability of utilizing SNAP (SNAP Laboratories, LLC, Wheeling, Illinois), a portable home sleep test, as an alternative to traditional PSG in diagnosing OSA. This prospective clinical trial included 59 patients evaluated at our institution's sleep laboratory. Concurrent PSG and SNAP testing was performed for 1 night on each patient. Independent, blinded readers at our institution and at an outside-accredited institution read the PSG data, and 2 independent, blinded readers interpreted the SNAP data at SNAP laboratories. The apnea-hypopnea index (AHI) was used to compare the 2 testing modalities. The correlation coefficient, receiver operating characteristic curve analysis, and the Bland-Altman curves, as well as sensitivity, specificity, inter-reader variability, positive predictive value, and negative predictive value, were used to compare SNAP and PSG. There is a definitive, statistically sound correlation between the AHIs determined from both PSG and SNAP. This relationship holds true for all measures of comparison, while displaying a concerning, weaker correlation between the different PSG interpretations. There is a convincing correlation between the study-determined AHIs of both PSG and SNAP. This finding supports SNAP as a suitable alternative to PSG in identifying OSA, while accentuating the inherent variation present in a PSG-derived AHI. This test expands the diagnostic and therapeutic prowess of the practicing otolaryngologist by offering an alternative OSA testing modality that is associated with not only less expense, decreased waiting time, and increased convenience, but also statistically proven accuracy.
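    Of the comparison tools listed above, the Bland-Altman analysis is the most direct check of method agreement: it summarizes the paired AHI differences as a bias plus 95% limits of agreement. A minimal sketch with hypothetical paired values (not the trial's data):

```python
import math

def bland_altman_limits(method_a, method_b):
    """Bias and 95% limits of agreement for two paired measurement methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# hypothetical paired AHI values (events/hour) from PSG and the portable test
psg = [12.0, 30.5, 8.2, 45.0, 22.1]
portable = [10.5, 33.0, 7.9, 41.2, 24.0]
bias, lower, upper = bland_altman_limits(psg, portable)
```

    Unlike a correlation coefficient, which can be high even when one method reads systematically higher, the limits of agreement show how far an individual portable-test AHI may plausibly sit from the PSG value.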

  11. GC–MS-Based Metabonomic Profiling Displayed Differing Effects of Borna Disease Virus Natural Strain Hu-H1 and Laboratory Strain V Infection in Rat Cortical Neurons

    PubMed Central

    Liu, Siwen; Bode, Liv; Zhang, Lujun; He, Peng; Huang, Rongzhong; Sun, Lin; Chen, Shigang; Zhang, Hong; Guo, Yujie; Zhou, Jingjing; Fu, Yuying; Zhu, Dan; Xie, Peng

    2015-01-01

    Borna disease virus (BDV) persists in the central nervous systems of a wide variety of vertebrates and causes behavioral disorders. Previous studies have revealed that metabolic perturbations are associated with BDV infection. However, the pathophysiological effects of different viral strains remain largely unknown. Rat cortical neurons infected with human strain BDV Hu-H1, laboratory BDV Strain V, and non-infected control (CON) cells were cultured in vitro. At day 12 post-infection, a gas chromatography coupled with mass spectrometry (GC–MS) metabonomic approach was used to differentiate the metabonomic profiles of 35 independent intracellular samples from Hu-H1-infected cells (n = 12), Strain V-infected cells (n = 12), and CON cells (n = 11). Partial least squares discriminant analysis (PLS-DA) was performed to demonstrate discrimination between the three groups. Further statistical testing determined which individual metabolites displayed significant differences between groups. PLS-DA demonstrated that the whole metabolic pattern enabled statistical discrimination between groups. We identified 31 differential metabolites in the Hu-H1 and CON groups (21 decreased and 10 increased in Hu-H1 relative to CON), 35 differential metabolites in the Strain V and CON groups (30 decreased and 5 increased in Strain V relative to CON), and 21 differential metabolites in the Hu-H1 and Strain V groups (8 decreased and 13 increased in Hu-H1 relative to Strain V). Comparative metabonomic profiling revealed divergent perturbations in key energy and amino acid metabolites between natural strain Hu-H1 and laboratory Strain V of BDV. The two BDV strains differentially alter metabolic pathways of rat cortical neurons in vitro. Their systematic classification provides a valuable template for improved BDV strain definition in future studies. PMID:26287181

  12. Laboratory examination of seronegative and seropositive rheumatoid arthritis.

    PubMed

    Sahatçiu-Meka, Vjollca; Anton, Kukeli

    2010-01-01

    In response to the continuing debate whether seronegative and seropositive rheumatoid arthritis (RA) are parts of the same disease spectrum or are distinct disorders, we performed a comparative analysis of some laboratory characteristics. The test group consisted of 125 seronegative RA patients (93 female, 32 male), with titers lower than 1/64 as defined by the Rose-Waaler test. The control group consisted of 125 seropositive RA patients (93 female, 32 male), with titers of 1/64 or higher. Patients all belonged to the 2nd and 3rd functional classes (ARA), and were between 25 and 60 years of age. The duration of the disease was 1-27 years. Elevated average values of erythrocyte sedimentation rate (ESR), C-reactive protein (CRP) and erythrocytes (Er) were found in seropositive patients, but they did not present a statistically significant difference with regard to sero-status. Reduced values of hemoglobin (Hb) were found more frequently in seropositive patients (t = 2.26, p < 0.05), especially female seropositive patients (t = 4.38, p < 0.01), without correlation to disability and duration of the disease. A statistically significant difference was found in average values of fibrinogen in the seropositive subset (t = 2.10, p < 0.05), especially in female seropositive patients (t = 2.65, p < 0.01), and in average values of leukocytes (t = 1.37, p < 0.05) among male seropositive patients. Elevated immunoglobulin (IgM) values were more prominent in the seropositive subset (chi2 = 47.6, p < 0.01), especially among seropositive females (chi2 = 35.68, p < 0.01). Values of IgA and IgG did not present a statistically significant difference with regard to sero-status. Levels of C3 and C4 components of the complement were reduced in seropositive tested subjects, without significant difference between sero-subsets.
Increased values of gamma-globulin were confirmed with statistical significance (chi2 = 3.39, p < 0.05) in seropositive subjects, while alpha-2 globulin values were nearly equally distributed in both subsets.

  13. Linked electronic medication systems in community pharmacies for preventing pseudoephedrine diversion: a review of international practice and analysis of results in Australia.

    PubMed

    Berbatis, Constantine G; Sunderland, Vivian Bruce; Dhaliwal, Satvinder S

    2009-11-01

    Pseudoephedrine is a precursor often diverted into the illegal manufacture of amphetamine-type substances (ATS). The aim of this study was to evaluate the effectiveness of a linked electronic medication recording system (LEMS) established in Australian pharmacies in 2005 for preventing the diversion of pseudoephedrine. The number of illegal ATS laboratories detected in each jurisdiction of Australia from 1996-1997 to 2004-2005 was analysed by linear regression nationally and by each jurisdiction. The statistical significance of seizures in 2005-2006 was based on the comparison of the observed value to the 95% prediction confidence intervals calculated from the historical data for each jurisdiction and nationally. Pharmacies in Queensland commenced an LEMS in late 2005 to minimise retail pseudoephedrine diversion. The number of ATS laboratories seized in 2005-2006 in Queensland was significantly lower (P < 0.05) than predicted by historical data. For all other jurisdictions and nationally, the totals of laboratories seized in 2005-2006 were not significantly different from predicted values. The significant decline in illegal ATS laboratories seized in Queensland in 2005-2006 suggests the effective use of LEMS in pharmacies to minimise pseudoephedrine diversion. In order to evaluate a national LEMS, more frequent data on numbers of linked pharmacies, ATS laboratories seized and indicators of pseudoephedrine sales and misuse are required. Testing the use of LEMS by pharmacies for preventing the diversion of other medicines seems appropriate.
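    The test used above, comparing an observed count against the 95% prediction interval from a regression on historical counts, can be sketched as follows. The data are hypothetical (not the seizure counts), and the critical t value is supplied by the caller rather than looked up:

```python
import math

def ols_prediction_interval(x, y, x_new, t_crit):
    """OLS fit of y on x plus a prediction interval at x_new.
    t_crit is the two-sided critical t value on n - 2 degrees of
    freedom (about 2.365 for n = 9 at the 95% level)."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    slope = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sxx
    intercept = y_bar - slope * x_bar
    resid = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
    s = math.sqrt(sum(r * r for r in resid) / (n - 2))  # residual std error
    half = t_crit * s * math.sqrt(1 + 1 / n + (x_new - x_bar) ** 2 / sxx)
    y_hat = intercept + slope * x_new
    return y_hat - half, y_hat + half

# hypothetical yearly laboratory counts for nine historical years
years = list(range(9))
counts = [14, 17, 18, 22, 23, 27, 28, 31, 34]
lower, upper = ols_prediction_interval(years, counts, 9, t_crit=2.365)
# an observed year-10 count below `lower` would be significantly lower than predicted
```

    The extra `1 +` inside the square root is what distinguishes a prediction interval (for a single new observation) from the narrower confidence interval for the fitted mean.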

  14. [GSTP1, APC and RASSF1 gene methylation in prostate cancer samples: comparative analysis of the diagnostic value of the MS-HRM method and the Infinium HumanMethylation450 BeadChip array].

    PubMed

    Skorodumova, L O; Babalyan, K A; Sultanov, R; Vasiliev, A O; Govorov, A V; Pushkar, D Y; Prilepskaya, E A; Danilenko, S A; Generozov, E V; Larin, A K; Kostryukova, E S; Sharova, E I

    2016-11-01

    There is a clear need for molecular markers for prostate cancer (PC) risk stratification. Alteration of DNA methylation is one of the processes that occur during PC progression. Methylation-sensitive PCR with high-resolution melting curve analysis (MS-HRM) can be used for gene methylation analysis in routine laboratory practice. This method requires very small amounts of DNA for analysis. Numerous results have been accumulated on DNA methylation in PC samples analyzed by the Infinium HumanMethylation450 BeadChip (HM450). However, the consistency of MS-HRM results with chip hybridization results has not been examined yet. The aim of this study was to assess the consistency of results of GSTP1, APC and RASSF1 gene methylation analysis in PC biopsy samples obtained by MS-HRM and chip hybridization. The methylation levels of each gene determined by MS-HRM were statistically different between the group of PC tissue samples and the samples without signs of tumor growth. Chip hybridization data analysis confirmed the results obtained with MS-HRM. Differences in methylation levels between tumor tissue and histologically intact tissue of each sample, determined by MS-HRM and chip hybridization, were consistent with each other. Thus, we showed that the assessment of GSTP1, APC and RASSF1 gene methylation using MS-HRM is suitable for the design of laboratory assays that will differentiate PC tissue from tissue without signs of tumor growth.

  15. Laboratory medicine handoff gaps experienced by primary care practices: A report from the shared networks of collaborative ambulatory practices and partners (SNOCAP).

    PubMed

    West, David R; James, Katherine A; Fernald, Douglas H; Zelie, Claire; Smith, Maxwell L; Raab, Stephen S

    2014-01-01

    The majority of errors in laboratory medicine testing are thought to occur in the pre- and postanalytic testing phases, and a large proportion of these errors are secondary to failed handoffs. Because most laboratory tests originate in ambulatory primary care, understanding the gaps in handoff processes within and between laboratories and practices is imperative for patient safety. Therefore, the purpose of this study was to understand, based on information from primary care practice personnel, the perceived gaps in laboratory processes as a precursor to initiating process improvement activities. A survey was used to assess perceptions of clinicians, staff, and management personnel of gaps in handoffs between primary care practices and laboratories working in 21 Colorado primary care practices. Data were analyzed to determine statistically significant associations between categorical variables. In addition, qualitative analysis of responses to open-ended survey questions was conducted. Primary care practices consistently reported challenges and a desire/need to improve their efforts to systematically track laboratory test status, confirm receipt of laboratory results, and report results to patients. Automated tracking systems existed in roughly 61% of practices, and all but one of those had electronic health record-based tracking systems in place. One fourth of these electronic health record-enabled practices expressed sufficient mistrust in these systems to warrant the concurrent operation of a paper-based tracking system as backup. Practices also reported 12 different procedures used to notify patients of test results, varying by test result type. The results highlight the lack of standardization and definition of roles in handoffs in primary care laboratory practices for test ordering, monitoring, and receiving and reporting test results.
Results also identify high-priority gaps in processes and the perceptions by practice personnel that practice improvement in these areas is needed. Commonalities in these areas warrant the development and support of tools for use in primary care settings. © Copyright 2014 by the American Board of Family Medicine.

  16. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    PubMed

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  17. Cardiovascular Reactivity During Marital Conflict in Laboratory and Naturalistic Settings: Differential Associations with Relationship and Individual Functioning Across Contexts.

    PubMed

    Baucom, Brian R W; Baucom, Katherine J W; Hogan, Jasara N; Crenshaw, Alexander O; Bourne, Stacia V; Crowell, Sheila E; Georgiou, Panayiotis; Goodwin, Matthew S

    2018-03-25

    Cardiovascular reactivity during spousal conflict is considered to be one of the main pathways for relationship distress to impact physical, mental, and relationship health. However, the magnitude of association between cardiovascular reactivity during laboratory marital conflict and relationship functioning is small and inconsistent given the scope of its importance in theoretical models of intimate relationships. This study tests the possibility that cardiovascular data collected in laboratory settings downwardly bias the magnitude of these associations when compared to measures obtained in naturalistic settings. Ambulatory cardiovascular reactivity data were collected from 20 couples during two relationship conflicts in a research laboratory, two planned relationship conflicts at couples' homes, and two spontaneous relationship conflicts during couples' daily lives. Associations between self-report measures of relationship functioning, individual functioning, and cardiovascular reactivity across settings are tested using multilevel models. Cardiovascular reactivity was significantly larger during planned and spontaneous relationship conflicts in naturalistic settings than during planned relationship conflicts in the laboratory. Similarly, associations with relationship and individual functioning variables were statistically significantly larger for cardiovascular data collected in naturalistic settings than the same data collected in the laboratory. Our findings suggest that cardiovascular reactivity during spousal conflict in naturalistic settings is statistically significantly different from that elicited in laboratory settings both in magnitude and in the pattern of associations with a wide range of inter- and intrapersonal variables. These differences in findings across laboratory and naturalistic physiological responses highlight the value of testing physiological phenomena across interaction contexts in romantic relationships. 
© 2018 Family Process Institute.

  18. Correction of stream quality trends for the effects of laboratory measurement bias

    USGS Publications Warehouse

    Alexander, Richard B.; Smith, Richard A.; Schwarz, Gregory E.

    1993-01-01

    We present a statistical model relating measurements of water quality to associated errors in laboratory methods. Estimation of the model allows us to correct trends in water quality for long-term and short-term variations in laboratory measurement errors. An illustration of the bias correction method for a large national set of stream water quality and quality assurance data shows that reductions in the bias of estimates of water quality trend slopes are achieved at the expense of increases in the variance of these estimates. Slight improvements occur in the precision of estimates of trend in bias by using correlative information on bias and water quality to estimate random variations in measurement bias. The results of this investigation stress the need for reliable, long-term quality assurance data and efficient statistical methods to assess the effects of measurement errors on the detection of water quality trends.
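
The trade-off the abstract describes (removing laboratory bias from a trend at the cost of extra variance) can be illustrated with a minimal sketch: subtract a QA-derived bias series from the reported concentrations before fitting the trend slope. All numbers below are hypothetical, not from the USGS data set:

```python
def ols_slope(t, y):
    """Ordinary least-squares slope of y against time t."""
    n = len(t)
    tbar = sum(t) / n
    ybar = sum(y) / n
    num = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y))
    den = sum((ti - tbar) ** 2 for ti in t)
    return num / den

years = [0, 1, 2, 3, 4, 5]
conc  = [10.0, 10.1, 10.4, 10.3, 10.9, 11.0]   # reported stream concentrations
bias  = [0.0, 0.0, 0.2, 0.2, 0.5, 0.5]         # lab bias estimated from QA samples

raw_trend = ols_slope(years, conc)
corrected_trend = ols_slope(years, [c - b for c, b in zip(conc, bias)])
```

Here a drifting laboratory bias masquerades as part of the water-quality trend; correcting for it roughly halves the apparent slope, at the cost of inheriting the uncertainty of the bias estimates themselves.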

  19. Experimental econophysics: Complexity, self-organization, and emergent properties

    NASA Astrophysics Data System (ADS)

    Huang, J. P.

    2015-03-01

    Experimental econophysics is concerned with the statistical physics of humans in the laboratory: controlled human experiments, developed by physicists, that address problems in economics and finance. It combines such laboratory experiments with agent-based modeling (for computer simulations and/or analytical theory), in an attempt to reveal general cause-effect relationships between specific conditions and the emergent properties of real economic/financial markets (a kind of complex adaptive system). Here I review the latest progress in the field, namely stylized facts, herd behavior, contrarian behavior, spontaneous cooperation, partial information, and risk management. I also highlight the connections between this progress and other topics in traditional statistical physics. The main theme of the review is to show the diverse emergent properties of laboratory markets, originating from self-organization due to the nonlinear interactions among heterogeneous humans or agents (complexity).


  20. Chemical and Physical Signatures for Microbial Forensics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cliff, John B.; Kreuzer, Helen W.; Ehrhardt, Christopher J.

    Chemical and physical signatures for microbial forensics. John Cliff and Helen Kreuzer-Martin, eds., Humana Press. Chapter 1. Introduction: Review of history and statement of need. Randy Murch, Virginia Tech. Chapter 2. The Microbe: Structure, morphology, and physiology of the microbe as they relate to potential signatures of growth conditions. Joany Jackman, Johns Hopkins University. Chapter 3. Science for Forensics: Special considerations for the forensic arena - quality control, sample integrity, etc. Mark Wilson (retired FBI), Western Carolina University. Chapter 4. Physical signatures: Light and electron microscopy, atomic force microscopy, gravimetry, etc. Joseph Michael, Sandia National Laboratory. Chapter 5. Lipids: FAME, PLFA, steroids, LPS, etc. James Robertson, Federal Bureau of Investigation. Chapter 6. Carbohydrates: Cell wall components, cytoplasm components, methods. Alvin Fox, University of South Carolina School of Medicine; David Wunschel, Pacific Northwest National Laboratory. Chapter 7. Peptides: Peptides, proteins, lipoproteins. David Wunschel, Pacific Northwest National Laboratory. Chapter 8. Elemental content: CNOHPS (treated in passing), metals, prospective cell types. John Cliff, International Atomic Energy Agency. Chapter 9. Isotopic signatures: Stable isotopes C, N, H, O, S; 14C dating; potential for heavy elements. Helen Kreuzer-Martin, Pacific Northwest National Laboratory; Michaele Kashgarian, Lawrence Livermore National Laboratory. Chapter 10. Extracellular signatures: Cellular debris, heme, agar, headspace, spent media, etc. Karen Wahl, Pacific Northwest National Laboratory. Chapter 11. Data Reduction and Integrated Microbial Forensics: Statistical concepts, parametric and multivariate statistics, integrating signatures. Kristin Jarman, Pacific Northwest National Laboratory.

  1. Experimental design in chemistry: A tutorial.

    PubMed

    Leardi, Riccardo

    2009-10-12

    In this tutorial the main concepts and applications of experimental design in chemistry will be explained. Unfortunately, experimental design is not as widely known and applied as it should be, and many papers can be found in which the "optimization" of a procedure is performed one variable at a time. The goal of this paper is to show the real advantages, in terms of reduced experimental effort and increased quality of information, that can be obtained if this approach is followed. To do that, three real examples will be shown. Rather than on the mathematical aspects, this paper will focus on the mental attitude required by experimental design. Readers interested in deepening their knowledge of the mathematical and algorithmic aspects can find very good books and tutorials in the references [G.E.P. Box, W.G. Hunter, J.S. Hunter, Statistics for Experimenters: An Introduction to Design, Data Analysis, and Model Building, John Wiley & Sons, New York, 1978; R. Brereton, Chemometrics: Data Analysis for the Laboratory and Chemical Plant, John Wiley & Sons, Chichester, 2003; R. Carlson, J.E. Carlson, Design and Optimization in Organic Synthesis: Second Revised and Enlarged Edition, in: Data Handling in Science and Technology, vol. 24, Elsevier, Amsterdam, 2005; J.A. Cornell, Experiments with Mixtures: Designs, Models and the Analysis of Mixture Data, in: Series in Probability and Statistics, John Wiley & Sons, New York, 1991; R.E. Bruns, I.S. Scarminio, B. de Barros Neto, Statistical Design-Chemometrics, in: Data Handling in Science and Technology, vol. 25, Elsevier, Amsterdam, 2006; D.C. Montgomery, Design and Analysis of Experiments, 7th edition, John Wiley & Sons, Inc., 2009; T. Lundstedt, E. Seifert, L. Abramo, B. Thelin, A. Nyström, J. Pettersen, R. Bergman, Chemolab 42 (1998) 3; Y. Vander Heyden, LC-GC Europe 19 (9) (2006) 469].

  2. Comparison between Linear and Nonlinear Regression in a Laboratory Heat Transfer Experiment

    ERIC Educational Resources Information Center

    Gonçalves, Carine Messias; Schwaab, Marcio; Pinto, José Carlos

    2013-01-01

    In order to interpret laboratory experimental data, undergraduate students are used to performing linear regression with linearized versions of nonlinear models. However, the use of linearized models can lead to statistically biased parameter estimates. Even so, it is not an easy task to introduce nonlinear regression and show the students…
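
The bias the abstract warns about can be made concrete: fitting ln(y) linearly implicitly assumes multiplicative errors, whereas a direct nonlinear fit minimizes the sum of squared residuals on the original scale. Below is a minimal pure-Python sketch (the data are synthetic, generated from y = 2·exp(0.5x) plus noise; the grids and model are illustrative only):

```python
import math

def linfit(x, y):
    """Ordinary least squares: returns (intercept, slope)."""
    n = len(x)
    xb = sum(x) / n
    yb = sum(y) / n
    b = sum((xi - xb) * (yi - yb) for xi, yi in zip(x, y)) / \
        sum((xi - xb) ** 2 for xi in x)
    return yb - b * xb, b

# Synthetic heat-transfer-style data, nominally y = 2 * exp(0.5 * x).
x = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
y = [2.05, 2.52, 3.35, 4.05, 5.6, 7.1]

# Linearized fit: ln y = ln a + b x  (assumes multiplicative errors).
la, b_lin = linfit(x, [math.log(v) for v in y])
a_lin = math.exp(la)

# Direct nonlinear fit by brute-force grid search over (a, b), minimizing SSE.
a_grid = [1.8 + 0.01 * i for i in range(41)]    # 1.80 .. 2.20
b_grid = [0.40 + 0.005 * j for j in range(41)]  # 0.40 .. 0.60
def sse(a, b):
    return sum((yi - a * math.exp(b * xi)) ** 2 for xi, yi in zip(x, y))
a_nl, b_nl = min(((a, b) for a in a_grid for b in b_grid), key=lambda p: sse(*p))
```

With well-behaved data the two estimates are close, but with heteroscedastic or large additive errors the linearized estimates drift away from the nonlinear ones, which is the pedagogical point.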

  3. Quality Assurance in Clinical Chemistry: A Touch of Statistics and A Lot of Common Sense

    PubMed Central

    2016-01-01

    Summary Working in laboratories of clinical chemistry, we risk feeling that our personal contribution to quality is small and that statistical models and manufacturers play the major roles. It is seldom sufficiently acknowledged that personal knowledge, skills and common sense are crucial for quality assurance in the interest of patients. The employees, environment and procedures inherent to the laboratory including its interactions with the clients are crucial for the overall result of the total testing chain. As the measurement systems, reagents and procedures are gradually improved, work on the preanalytical, postanalytical and clinical phases is likely to pay the most substantial dividends in accomplishing further quality improvements. This means changing attitudes and behaviour, especially of the users of the laboratory. It requires understanding people and how to engage them in joint improvement processes. We need to use our knowledge and common sense expanded with new skills e.g. from the humanities, management, business and change sciences in order to bring this about together with the users of the laboratory. PMID:28356868

  4. A laboratory nanoseismological study on deep-focus earthquake micromechanics

    DOE PAGES

    Wang, Yanbin; Zhu, Lupei; Shi, Feng; ...

    2017-07-21

    The global earthquake occurrence rate displays an exponential decay down to ~300 km depth and then peaks around 550 to 600 km before terminating abruptly near 700 km. How fractures initiate, nucleate, and propagate at these depths remains one of the greatest puzzles in earth science, as increasing pressure inhibits fracture propagation. We report nanoseismological analysis of high-resolution acoustic emission (AE) records obtained during ruptures triggered by partial transformation from olivine to spinel in Mg2GeO4, an analog to the dominant upper-mantle mineral (Mg,Fe)2SiO4 olivine, using state-of-the-art seismological techniques in the laboratory. AEs' focal mechanisms, as well as their distribution in both space and time during deformation, are carefully analyzed. Microstructure analysis shows that AEs are produced by the dynamic propagation of shear bands consisting of nanograined spinel. These nanoshear bands have a near-constant thickness (~100 nm) but varying lengths and self-organize during deformation. This precursory seismic process leads to ultimate macroscopic failure of the samples. Several source parameters of AE events were extracted from the recorded waveforms, allowing close tracking of event initiation, clustering, and propagation throughout the deformation/transformation process. AEs follow the Gutenberg-Richter statistics with a well-defined b value of 1.5 over three orders of moment magnitude, suggesting that laboratory failure processes are self-affine. The seismic relation between magnitude and rupture area correctly predicts AE magnitudes at millimeter scales. A rupture propagation model based on strain localization theory is proposed. Future numerical analyses may help resolve scaling issues between laboratory AE events and deep-focus earthquakes.
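
The Gutenberg-Richter b value quoted above is commonly estimated by Aki's maximum-likelihood formula, b = log10(e) / (mean(M) − Mc), for magnitudes above the completeness threshold Mc. A minimal sketch on a synthetic catalog (the catalog is simulated, not the study's AE data):

```python
import math, random

def b_value(mags, m_c):
    """Aki (1965) maximum-likelihood b-value for magnitudes at or above
    completeness m_c, assuming Gutenberg-Richter exponential statistics."""
    above = [m for m in mags if m >= m_c]
    return math.log10(math.e) / (sum(above) / len(above) - m_c)

# Synthetic catalog drawn from a G-R law with b = 1.5 (beta = b * ln 10).
random.seed(0)
beta = 1.5 * math.log(10)
catalog = [2.0 + random.expovariate(beta) for _ in range(20000)]
b_hat = b_value(catalog, m_c=2.0)
```

Recovering b ≈ 1.5 from the simulated catalog mirrors the well-defined b value the AE analysis reports over three orders of moment magnitude.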

  5. Quality-assurance data for routine water analysis in the National Water-Quality Laboratory of the US Geological Survey for water year 1988

    USGS Publications Warehouse

    Lucey, K.J.

    1989-01-01

    The US Geological Survey maintains a quality assurance program based on the analysis of reference samples for its National Water Quality Laboratory located in Denver, Colorado. Reference samples containing selected inorganic, nutrient, and precipitation (low-level concentration) constituents are prepared at the Survey's Water Quality Services Unit in Ocala, Florida, disguised as routine samples, and sent daily or weekly, as appropriate, to the laboratory through other Survey offices. The results are stored permanently in the National Water Data Storage and Retrieval System (WATSTORE), the Survey's database for all water data. These data are analyzed statistically for precision and bias. An overall evaluation of the inorganic major ion and trace metal constituent data for water year 1988 indicated a lack of precision in the National Water Quality Laboratory for the determination of 8 out of 58 constituents: calcium (inductively coupled plasma emission spectrometry), fluoride, iron (atomic absorption spectrometry), iron (total recoverable), magnesium (atomic absorption spectrometry), manganese (total recoverable), potassium, and sodium (inductively coupled plasma emission spectrometry). The results for 31 constituents had positive or negative bias during water year 1988. A lack of precision was indicated in the determination of three of the six nutrient constituents: nitrate plus nitrite nitrogen as nitrogen, nitrite nitrogen as nitrogen, and orthophosphate as phosphorus. A biased condition was indicated in the determination of ammonia nitrogen as nitrogen, ammonia plus organic nitrogen as nitrogen, and nitrate plus nitrite nitrogen as nitrogen. There was acceptable precision in the determination of all 10 constituents contained in precipitation samples. Results for ammonia nitrogen as nitrogen, sodium, and fluoride indicated a biased condition. (Author's abstract)
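
A precision/bias evaluation of the kind described reduces, per constituent, to comparing disguised reference-sample results against the known value. This sketch uses hypothetical numbers and an illustrative RSD acceptance limit, not the USGS criteria:

```python
import statistics

def evaluate_constituent(measured, known, rsd_limit=10.0):
    """Bias and precision summary for one constituent from disguised
    reference samples: mean bias (measured minus known value) and
    relative standard deviation in percent. The rsd_limit threshold
    is illustrative, not an official acceptance criterion."""
    diffs = [m - known for m in measured]
    bias = statistics.mean(diffs)
    rsd = 100.0 * statistics.stdev(measured) / known
    return {"bias": bias, "rsd_percent": rsd, "precise": rsd <= rsd_limit}

# Hypothetical results for a reference sample with known value 50 mg/L.
result = evaluate_constituent([49.0, 51.5, 50.5, 48.0, 52.0], known=50.0)
```

A constituent would be flagged for lack of precision when its relative standard deviation exceeds the chosen limit, and for bias when the mean difference from the known value is significantly nonzero.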

  6. A laboratory nanoseismological study on deep-focus earthquake micromechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yanbin; Zhu, Lupei; Shi, Feng

    The global earthquake occurrence rate displays an exponential decay down to ~300 km depth and then peaks around 550 to 600 km before terminating abruptly near 700 km. How fractures initiate, nucleate, and propagate at these depths remains one of the greatest puzzles in earth science, as increasing pressure inhibits fracture propagation. We report nanoseismological analysis of high-resolution acoustic emission (AE) records obtained during ruptures triggered by partial transformation from olivine to spinel in Mg2GeO4, an analog to the dominant upper-mantle mineral (Mg,Fe)2SiO4 olivine, using state-of-the-art seismological techniques in the laboratory. AEs' focal mechanisms, as well as their distribution in both space and time during deformation, are carefully analyzed. Microstructure analysis shows that AEs are produced by the dynamic propagation of shear bands consisting of nanograined spinel. These nanoshear bands have a near-constant thickness (~100 nm) but varying lengths and self-organize during deformation. This precursory seismic process leads to ultimate macroscopic failure of the samples. Several source parameters of AE events were extracted from the recorded waveforms, allowing close tracking of event initiation, clustering, and propagation throughout the deformation/transformation process. AEs follow the Gutenberg-Richter statistics with a well-defined b value of 1.5 over three orders of moment magnitude, suggesting that laboratory failure processes are self-affine. The seismic relation between magnitude and rupture area correctly predicts AE magnitudes at millimeter scales. A rupture propagation model based on strain localization theory is proposed. Future numerical analyses may help resolve scaling issues between laboratory AE events and deep-focus earthquakes.

  7. A laboratory nanoseismological study on deep-focus earthquake micromechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yanbin; Zhu, Lupei; Shi, Feng

    The global earthquake occurrence rate displays an exponential decay down to ~300 km depth and then peaks around 550 to 600 km before terminating abruptly near 700 km. How fractures initiate, nucleate, and propagate at these depths remains one of the greatest puzzles in earth science, as increasing pressure inhibits fracture propagation. We report nanoseismological analysis of high-resolution acoustic emission (AE) records obtained during ruptures triggered by partial transformation from olivine to spinel in Mg2GeO4, an analog to the dominant upper-mantle mineral (Mg,Fe)2SiO4 olivine, using state-of-the-art seismological techniques in the laboratory. AEs' focal mechanisms, as well as their distribution in both space and time during deformation, are carefully analyzed. Microstructure analysis shows that AEs are produced by the dynamic propagation of shear bands consisting of nanograined spinel. These nanoshear bands have a near-constant thickness (~100 nm) but varying lengths and self-organize during deformation. This precursory seismic process leads to ultimate macroscopic failure of the samples. Several source parameters of AE events were extracted from the recorded waveforms, allowing close tracking of event initiation, clustering, and propagation throughout the deformation/transformation process. AEs follow the Gutenberg-Richter statistics with a well-defined b value of 1.5 over three orders of moment magnitude, suggesting that laboratory failure processes are self-affine. The seismic relation between magnitude and rupture area correctly predicts AE magnitudes at millimeter scales. A rupture propagation model based on strain localization theory is proposed. Future numerical analyses may help resolve scaling issues between laboratory AE events and deep-focus earthquakes.

  8. Risk-based audit selection of dairy farms.

    PubMed

    van Asseldonk, M A P M; Velthuis, A G J

    2014-02-01

    Dairy farms in the Netherlands are audited on numerous process standards, each farm once every 2 years. Increasing demands for cost-effectiveness in farm audits can be met by introducing risk-based principles, which implies targeting subpopulations with a higher risk of poor process standards. To select higher-risk farms for audit, a statistical analysis was conducted to test the relationship between the outcome of farm audits and bulk milk laboratory results before the audit. The analysis comprised 28,358 farm audits and all laboratory tests conducted on bulk milk samples in the 12 months before each audit. The overall outcome of each farm audit was classified as approved or rejected. Laboratory results included somatic cell count (SCC), total bacterial count (TBC), antimicrobial drug residues (ADR), level of butyric acid spores (BAB), freezing point depression (FPD), level of free fatty acids (FFA), and cleanliness of the milk (CLN). The bulk milk laboratory results were significantly related to audit outcomes. Rejected audits are more likely to occur on dairy farms with higher mean levels of SCC, TBC, ADR, and BAB. Moreover, in a multivariable model, maxima for TBC, SCC, and FPD, as well as standard deviations for TBC and FPD, are risk factors for negative audit outcomes. The efficiency curve of a risk-based selection approach, based on the derived regression results, dominated the current random selection approach. To capture 25, 50, or 75% of the population with poor process standards (i.e., an audit outcome of rejected), only 8, 20, or 47% of the population, respectively, had to be sampled under a risk-based selection approach. Milk quality information can thus be used to preselect high-risk farms to be audited more frequently. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
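
The efficiency-curve logic above (what fraction of farms must be audited, in descending risk order, to capture a given share of the rejected ones) can be sketched directly. The scores and outcomes below are hypothetical, not the study's regression output:

```python
def fraction_to_sample(risk_scores, rejected, capture):
    """Fraction of the population that must be audited, visiting units in
    descending risk-score order, to capture the given share (0..1) of the
    truly rejected units."""
    order = sorted(range(len(risk_scores)), key=lambda i: -risk_scores[i])
    need = capture * sum(rejected)
    caught = 0
    for k, i in enumerate(order, start=1):
        caught += rejected[i]
        if caught >= need:
            return k / len(order)
    return 1.0

# Hypothetical model scores (e.g. from bulk-milk predictors) and audit outcomes.
scores   = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1, 0.05]
rejected = [1,   1,   0,   1,   0,   0,   1,   0,   0,   0]
frac50 = fraction_to_sample(scores, rejected, capture=0.50)
frac75 = fraction_to_sample(scores, rejected, capture=0.75)
```

When the scores rank rejected farms well, the fraction sampled stays far below the capture fraction, which is exactly how the risk-based curve dominates random selection.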

  9. U.S. Geological Survey Standard Reference Sample Project: Performance Evaluation of Analytical Laboratories

    USGS Publications Warehouse

    Long, H. Keith; Daddow, Richard L.; Farrar, Jerry W.

    1998-01-01

    Since 1962, the U.S. Geological Survey (USGS) has operated the Standard Reference Sample Project to evaluate the performance of USGS, cooperator, and contractor analytical laboratories that analyze chemical constituents of environmental samples. The laboratories are evaluated by using performance evaluation samples, called Standard Reference Samples (SRSs). SRSs are submitted to laboratories semi-annually for round-robin laboratory performance comparison purposes. Currently, approximately 100 laboratories are evaluated for their analytical performance on six SRSs for inorganic and nutrient constituents. As part of the SRS Project, a surplus of homogeneous, stable SRSs is maintained for purchase by USGS offices and participating laboratories for use in continuing quality-assurance and quality-control activities. Statistical evaluation of the laboratories' results provides information to compare the analytical performance of the laboratories and to determine possible analytical deficiencies and problems. SRS results also provide information on the bias and variability of different analytical methods used in the SRS analyses.
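
Round-robin comparisons of this kind are often scored by each laboratory's deviation from the consensus value in robust units. A minimal sketch using the median and a MAD-based spread (the results below are hypothetical, and the |z| > 2 flag is illustrative of common proficiency-testing practice rather than the SRS Project's actual criterion):

```python
import statistics

def robust_z_scores(results):
    """Round-robin performance scores: each laboratory's deviation from the
    consensus median, scaled by a robust spread estimate (1.4826 * MAD)."""
    med = statistics.median(results)
    mad = statistics.median([abs(r - med) for r in results])
    scale = 1.4826 * mad
    return [(r - med) / scale for r in results]

# Hypothetical SRS results for one constituent from seven laboratories.
z = robust_z_scores([10.1, 9.9, 10.0, 10.2, 10.0, 10.1, 11.4])
flagged = [abs(zi) > 2 for zi in z]   # illustrative outlier flag
```

The median/MAD consensus resists distortion by the very outliers the comparison is meant to detect, which is why robust statistics are preferred over the plain mean here.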

  10. Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound

    NASA Astrophysics Data System (ADS)

    Galperin, Michael

    2003-05-01

    A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) from 0 to 5 based on a qualitative lexicon describing the ultrasound appearance of breast lesions. The purpose of the research is to assess and select one of the automated quantitative LOS scoring methods developed during preliminary studies on the reduction of benign biopsies. The study used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies on masses with lower levels of suspicion, rather than to increase the accuracy of diagnosis of cancers (which will require biopsy anyway). On complex cysts and fibroadenoma cases, experienced radiologists were up to 50% less certain in true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy best. This paper presents current results of applying statistical analysis to automated LOS scoring quantification for breast masses with known biopsy results. It was found that the First Order Ranking method yielded the most accurate results. The CAIS system (Image Companion, Data Companion software) was developed by Almen Laboratories and was used to achieve these results.

  11. Different CAD/CAM-processing routes for zirconia restorations: influence on fitting accuracy.

    PubMed

    Kohorst, Philipp; Junghanns, Janet; Dittmer, Marc P; Borchers, Lothar; Stiesch, Meike

    2011-08-01

    The aim of the present in vitro study was to evaluate the influence of different processing routes on the fitting accuracy of four-unit zirconia fixed dental prostheses (FDPs) fabricated by computer-aided design/computer-aided manufacturing (CAD/CAM). Three groups of zirconia frameworks with ten specimens each were fabricated. Frameworks of one group (CerconCAM) were produced by means of a laboratory CAM-only system. The other frameworks were made with different CAD/CAM systems; on the one hand by in-laboratory production (CerconCAD/CAM) and on the other hand by centralized production in a milling center (Compartis) after forwarding geometrical data. Frameworks were then veneered with the recommended ceramics, and marginal accuracy was determined using a replica technique. Horizontal marginal discrepancy, vertical marginal discrepancy, absolute marginal discrepancy, and marginal gap were evaluated. Statistical analyses were performed by one-way analysis of variance (ANOVA), with the level of significance chosen at 0.05. Mean horizontal discrepancies ranged between 22 μm (CerconCAM) and 58 μm (Compartis), vertical discrepancies ranged between 63 μm (CerconCAD/CAM) and 162 μm (CerconCAM), and absolute marginal discrepancies ranged between 94 μm (CerconCAD/CAM) and 181 μm (CerconCAM). The marginal gap varied between 72 μm (CerconCAD/CAM) and 112 μm (CerconCAM, Compartis). Statistical analysis revealed that, with all measurements, the marginal accuracy of the zirconia FDPs was significantly influenced by the processing route used (p < 0.05). Within the limitations of this study, all restorations showed a clinically acceptable marginal accuracy; however, the results suggest that the CAD/CAM systems are more precise than the CAM-only system for the manufacture of four-unit FDPs.
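
The one-way ANOVA used above compares between-group and within-group variability via the F statistic. A self-contained sketch (the marginal-gap numbers are invented for illustration, not the study's measurements):

```python
def one_way_anova(groups):
    """One-way ANOVA F statistic: ratio of between-group to within-group
    mean squares across k independent groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical marginal-gap measurements (micrometers) for three routes.
f = one_way_anova([
    [70, 74, 72, 73, 71],        # e.g. in-lab CAD/CAM route
    [110, 114, 112, 111, 113],   # e.g. CAM-only route
    [108, 112, 110, 113, 107],   # e.g. milling-center route
])
```

A large F relative to the F(k−1, n−k) critical value at alpha = 0.05 corresponds to the study's finding that processing route significantly influenced marginal accuracy.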

  12. A statistical methodology for estimating transport parameters: Theory and applications to one-dimensional advective-dispersive systems

    USGS Publications Warehouse

    Wagner, Brian J.; Gorelick, Steven M.

    1986-01-01

    A simulation nonlinear multiple-regression methodology for estimating parameters that characterize the transport of contaminants is developed and demonstrated. Finite difference contaminant transport simulation is combined with a nonlinear weighted least squares multiple-regression procedure. The technique provides optimal parameter estimates and gives statistics for assessing the reliability of these estimates under certain general assumptions about the distributions of the random measurement errors. Monte Carlo analysis is used to estimate parameter reliability for a hypothetical homogeneous soil column for which concentration data contain large random measurement errors. The value of data collected spatially versus data collected temporally was investigated for estimation of velocity, dispersion coefficient, effective porosity, first-order decay rate, and zero-order production. The use of spatial data gave estimates that were 2–3 times more reliable than estimates based on temporal data for all parameters except velocity. Comparison of estimated linear and nonlinear confidence intervals based upon Monte Carlo analysis showed that the linear approximation is poor for dispersion coefficient and zero-order production coefficient when data are collected over time. In addition, examples demonstrate transport parameter estimation for two real one-dimensional systems. First, the longitudinal dispersivity and effective porosity of an unsaturated soil are estimated using laboratory column data. We compare the reliability of estimates based upon data from individual laboratory experiments versus estimates based upon pooled data from several experiments. Second, the simulation nonlinear regression procedure is extended to include an additional governing equation that describes delayed storage during contaminant transport. The model is applied to analyze the trends, variability, and interrelationship of parameters in a mountain stream in northern California.
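
The Monte Carlo reliability analysis described above can be sketched in miniature: repeatedly simulate noisy observations from a known model, re-estimate the parameter each time, and take the spread of the estimates as a reliability measure. The linear "transport response" and all numbers below are an illustrative toy, not the paper's finite-difference model:

```python
import random

def mc_parameter_sd(true_v, sigma, n_obs, n_trials=2000, seed=1):
    """Monte Carlo reliability of a velocity-like parameter: simulate noisy
    observations of y = v * t, re-estimate v by least squares each trial,
    and report the mean and standard deviation of the estimates."""
    rng = random.Random(seed)
    t = [i + 1.0 for i in range(n_obs)]
    est = []
    for _ in range(n_trials):
        y = [true_v * ti + rng.gauss(0.0, sigma) for ti in t]
        # Least-squares slope through the origin.
        est.append(sum(ti * yi for ti, yi in zip(t, y)) /
                   sum(ti * ti for ti in t))
    mean = sum(est) / n_trials
    var = sum((e - mean) ** 2 for e in est) / (n_trials - 1)
    return mean, var ** 0.5

mean_v, sd_v = mc_parameter_sd(true_v=2.0, sigma=0.5, n_obs=10)
```

The empirical spread of the estimates can then be compared against linear-approximation confidence intervals, which is how the study diagnoses where the linear intervals are poor.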

  13. Excess Foundry Sand Characterization and Experimental Investigation in Controlled Low-Strength Material and Hot-Mixing Asphalt

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tikalsky, Paul J.; Bahia, Hussain U.; Deng, An

    2004-10-15

    This report provides technical data regarding the reuse of excess foundry sand. The report addresses three topics: a statistically sound evaluation of the characterization of foundry sand, a laboratory investigation to qualify excess foundry sand as a major component in controlled low-strength material (CLSM), and the identification of the best methods for using foundry sand as a replacement for natural aggregates for construction purposes, specifically in asphalt paving materials. The survival analysis statistical technique was used to characterize foundry sand over a full spectrum of general chemical parameters, metallic elements, and organic compounds regarding bulk analysis and leachate characterization. Not limited to characterization and environmental impact, foundry sand was evaluated by factor analyses, which contributes to proper selection of factors and maximization of the reuse marketplace for foundry sand. Regarding the integration of foundry sand into CLSM, excavatable CLSM and structural CLSM containing different types of excess foundry sands were investigated through laboratory experiments. Foundry sand was approved to constitute a major component in CLSM. Regarding the integration of foundry sand into asphalt paving materials, the optimum asphalt content was determined for each mixture, as well as the bulk density, maximum density, asphalt absorption, and air voids at N_ini, N_des, and N_max. It was found that foundry sands can be used as an aggregate in hot-mix asphalt production, but each sand should be evaluated individually. Foundry sands tend to lower the strength of mixtures and also may make them more susceptible to moisture damage. Finally, traditional anti-stripping additives may decrease the moisture sensitivity of a mixture containing foundry sand, but not to the level allowed by most highway agencies.

  14. Excess Foundry Sand Characterization and Experimental Investigation in Controlled Low-Strength Material and Hot-Mixing Asphalt

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tikalsky, Paul J.

    2004-10-31

    This report provides technical data regarding the reuse of excess foundry sand. The report addresses three topics: (1) a statistically sound evaluation of the characterization of foundry sand, (2) a laboratory investigation to qualify excess foundry sand as a major component in controlled low-strength material (CLSM), and (3) the identification of the best methods for using foundry sand as a replacement for natural aggregates for construction purposes, specifically in asphalt paving materials. The survival analysis statistical technique was used to characterize foundry sand over a full spectrum of general chemical parameters, metallic elements, and organic compounds regarding bulk analysis and leachate characterization. Not limited to characterization and environmental impact, foundry sand was evaluated by factor analyses, which contributes to proper selection of factors and maximization of the reuse marketplace for foundry sand. Regarding the integration of foundry sand into CLSM, excavatable CLSM and structural CLSM containing different types of excess foundry sands were investigated through laboratory experiments. Foundry sand was approved to constitute a major component in CLSM. Regarding the integration of foundry sand into asphalt paving materials, the optimum asphalt content was determined for each mixture, as well as the bulk density, maximum density, asphalt absorption, and air voids at N_ini, N_des, and N_max. It was found that foundry sands can be used as an aggregate in hot-mix asphalt production, but each sand should be evaluated individually. Foundry sands tend to lower the strength of mixtures and also may make them more susceptible to moisture damage. Finally, traditional anti-stripping additives may decrease the moisture sensitivity of a mixture containing foundry sand, but not to the level allowed by most highway agencies.

  15. Compositional analysis of biomass reference materials: Results from an interlaboratory study

    DOE PAGES

    Templeton, David W.; Wolfrum, Edward J.; Yen, James H.; ...

    2015-10-29

    Biomass compositional methods are used to compare different lignocellulosic feedstocks, to measure component balances around unit operations, and to determine process yields and therefore the economic viability of biomass-to-biofuel processes. Four biomass reference materials (RMs NIST 8491–8494) were prepared and characterized via an interlaboratory comparison exercise in the early 1990s to evaluate biomass summative compositional methods, analysts, and laboratories. Having common, uniform, and stable biomass reference materials makes it possible to assess compositional data against other analysts, other labs, and a known compositional value. When the expiration date for the original characterization of these RMs was reached, an effort was initiated to assess their stability and recharacterize the reference values for the remaining material using more current methods of analysis. We sent samples of the four biomass RMs to 11 academic, industrial, and government laboratories familiar with sulfuric acid compositional methods for recharacterization of the component reference values. In this work, we used an expanded suite of analytical methods that are more appropriate for herbaceous feedstocks to recharacterize the RMs’ compositions. We report the median values and the expanded uncertainty values for the four RMs on a dry-mass, whole-biomass basis. The original characterization data have been recalculated using median statistics to facilitate comparisons with these data. We found improved total component closures for three of the four RMs compared to the original characterization, and the total component closures were near 100%, which suggests that most components were accurately measured and little double counting occurred. The major components were not statistically different in the recharacterization, which suggests that the biomass materials are stable during storage; in addition, components not seen in the original characterization were quantified here.
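    The median-statistics approach mentioned above can be sketched as follows. The laboratory values, the robust spread estimator (scaled median absolute deviation), and the coverage factor k = 2 are all illustrative assumptions, not the study's actual data or protocol:

    ```python
    import numpy as np

    def median_reference_value(lab_results, k=2.0):
        """Median consensus value with an expanded uncertainty (coverage factor k),
        using the scaled MAD as a robust, outlier-resistant spread estimate."""
        x = np.asarray(lab_results, dtype=float)
        median = np.median(x)
        # 1.4826 * MAD approximates the standard deviation for normal data
        s_robust = 1.4826 * np.median(np.abs(x - median))
        # standard uncertainty of the median, then expand by k
        u_median = s_robust * np.sqrt(np.pi / (2 * len(x)))
        return median, k * u_median

    # Hypothetical glucan results (% dry mass) reported by 11 laboratories;
    # note the robust estimate is barely moved by the 35.1 outlier
    glucan = [33.9, 34.1, 34.4, 33.8, 34.0, 34.2, 34.1, 33.7, 34.3, 34.0, 35.1]
    value, U = median_reference_value(glucan)
    ```

    A consensus based on the mean and standard deviation would be pulled toward outlying laboratories; the median-based version is why interlaboratory studies often prefer it.
    
    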

  16. Serum thyroid stimulating hormone, total and free T4 during the neonatal period: Establishing regional reference intervals

    PubMed Central

    Sheikhbahaei, Sara; Mahdaviani, Behnaz; Abdollahi, Alireza; Nayeri, Fatemeh

    2014-01-01

    Context: Congenital hypothyroidism (CH), the most common etiology of preventable mental retardation in children, is estimated to be more prevalent among Asian populations. Aims: Since thyroid function tests (TFTs) vary among different ages and geographical regions, in this study the neonatal thyroid reference intervals in a healthy neonatal population were determined for the first time in Iran. Settings and Design: A cross-sectional study performed on 246 healthy term newborns aged between 2 days and 1 month. Materials and Methods: Blood samples were obtained by venipuncture from all subjects. The median, 2.5th, 5th, 95th, and 97.5th percentiles of serum thyroid-stimulating hormone (TSH), as well as total and free T4, were assessed among different age groups. Statistical Analysis Used: Predictive Analytics Software (PASW Statistics 18) was used for the analysis. Results: Serum TSH, total and free T4 concentrations peaked on the 5th to 7th days of life, remained elevated for over 2 weeks, then decreased toward the adult reference range. A significant negative correlation between age and serum concentration of TSH (P = 0.02), total T4 (P = 0.01) and free T4 (P = 0.01) was found. Conclusion: This study yielded values for TFTs that differ from those found in other countries and from the values reported for the laboratory kits we used. These differences were assumed to be due to variations in ethnicity, age, and laboratory methods used. Due to the lack of international standardization, conducting multicenter studies helps in making a more precise evaluation of thyroid status in neonates. PMID:24701428

  17. Stability of 35 biochemical and immunological routine tests after 10 hours storage and transport of human whole blood at 21°C.

    PubMed

    Henriksen, Linda O; Faber, Nina R; Moller, Mette F; Nexo, Ebba; Hansen, Annebirthe B

    2014-10-01

    Suitable procedures for transport of blood samples from general practitioners to hospital laboratories are requested. Here we explore routine testing on samples stored and transported as whole blood in lithium-heparin or serum tubes. Blood samples were collected from 106 hospitalized patients and analyzed on Architect c8000 or Advia Centaur XP for 35 analytes at baseline and after storage and transport of whole blood in lithium-heparin or serum tubes at 21 ± 1°C for 10 h. Bias and imprecision (representing variation from analysis and storage) were calculated from values at baseline and after storage, and differences were tested by paired t-tests. Results were compared to goals set by the laboratory. We observed no statistically significant bias, and results within the goal for imprecision, between baseline samples and 10-h samples for albumin, alkaline phosphatase, antitrypsin, bilirubin, creatinine, free triiodothyronine, γ-glutamyl transferase, haptoglobin, immunoglobulin G, lactate dehydrogenase, prostate specific antigen, total carbon dioxide, and urea. Alanine aminotransferase, amylase, C-reactive protein, calcium, cholesterol, creatine kinase, ferritin, free thyroxine, immunoglobulin A, immunoglobulin M, orosomucoid, sodium, transferrin, and triglycerides met goals for imprecision, though they showed a minor but statistically significant bias after storage. Cobalamin, folate, HDL-cholesterol, iron, phosphate, potassium, thyroid stimulating hormone and urate warranted concern, but only folate and phosphate showed deviations of clinical importance. We conclude that whole blood in lithium-heparin or serum tubes stored for 10 h at 21 ± 1°C may be used for routine analysis without restrictions for all investigated analytes except folate and phosphate.
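    The bias-versus-goal logic of this kind of stability study, where a bias can be statistically significant yet still analytically acceptable, can be sketched with synthetic paired data. The analyte, effect size, and 0.05 threshold below are hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical paired measurements: baseline vs. after 10 h storage at 21 °C
    baseline = rng.normal(40.0, 4.0, size=106)           # e.g. albumin, g/L
    stored = baseline + rng.normal(0.5, 0.5, size=106)   # small storage shift

    diff = stored - baseline
    bias_pct = 100 * diff.mean() / baseline.mean()

    # Paired t statistic; |t| > ~1.98 is significant at alpha = 0.05 for n = 106
    t_stat = diff.mean() / (diff.std(ddof=1) / np.sqrt(len(diff)))
    significant = abs(t_stat) > 1.98
    ```

    With large n, even a ~1% bias is flagged as significant, which is exactly why the study also compares the bias against the laboratory's analytical goal before restricting an analyte.
    
    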

  18. Interim estimates of the effectiveness of the influenza vaccine against A(H3N2) influenza in adults in South Korea, 2016-2017 season.

    PubMed

    Noh, Ji Yun; Lim, Sooyeon; Song, Joon Young; Choi, Won Suk; Jeong, Hye Won; Heo, Jung Yeon; Lee, Jacob; Seo, Yu Bin; Lee, Jin-Soo; Wie, Seong Heon; Kim, Young Keun; Park, Kyung Hwa; Jung, Sook-In; Kim, Shin Woo; Lee, Sun Hee; Lee, Han Sol; Yoon, Young Hoon; Cheong, Hee Jin; Kim, Woo Joo

    2017-01-01

    In the 2016-2017 season, the A(H3N2) influenza epidemic presented an unusually early peak compared with past seasons in South Korea. The interim vaccine effectiveness (VE) of influenza vaccination in preventing laboratory-confirmed influenza was estimated using a test-negative design through the tertiary hospital-based influenza surveillance system in South Korea. From 1 September 2016 to 7 January 2017, the adjusted VE of influenza vaccination in preventing laboratory-confirmed A(H3N2) was -52.1% (95% confidence interval [CI], -147.2 to 6.4); -70.0% (95% CI, -212.0 to 7.4) in those aged 19-64 years and 4.3% (95% CI, -137.8 to 61.5) in the elderly. Circulating A(H3N2) viruses belonged to three phylogenetic subclades of 3C.2a, differing from A/Hong Kong/4801/2014, the current vaccine strain. Amino acid substitutions in the hemagglutinin of circulating viruses seem to contribute to the low VE. In conclusion, the interim VE analysis showed that protection against laboratory-confirmed influenza by seasonal influenza vaccination did not reach statistical significance in South Korea in the 2016-2017 influenza season.

  19. Designing experiments on thermal interactions by secondary-school students in a simulated laboratory environment

    NASA Astrophysics Data System (ADS)

    Lefkos, Ioannis; Psillos, Dimitris; Hatzikraniotis, Euripides

    2011-07-01

    Background and purpose: The aim of this study was to explore the effect of investigative activities with manipulations in a virtual laboratory on students' ability to design experiments. Sample: Fourteen students in a lower secondary school in Greece attended a teaching sequence on thermal phenomena based on the use of information and communication technology, and specifically the simulated virtual laboratory 'ThermoLab'. Design and methods: A pre-post comparison was applied. Students' design of experiments was rated in eight dimensions, namely hypothesis forming and verification, selection of variables, initial conditions, device settings, materials and devices used, and process and phenomena description. A three-level ranking scheme was employed for the evaluation of students' answers in each dimension. Results: A Wilcoxon signed-rank test revealed a statistically significant difference between the students' pre- and post-test scores. Additional analysis comparing the pre- and post-test scores using the Hake gain showed high gains in all but one dimension, suggesting that the improvement extended across nearly all dimensions. Conclusions: We consider that our findings support the statement that there was an improvement in students' ability to design experiments.
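    The normalized (Hake) gain used above is simply the fraction of the maximum possible improvement that was achieved. A minimal sketch with hypothetical scores on a three-level (0-2) ranking scheme like the study's; the score lists are invented for illustration:

    ```python
    import numpy as np

    def hake_gain(pre, post, max_score):
        """Normalized (Hake) gain: achieved improvement divided by the
        improvement that was still possible given the pre-test mean."""
        pre, post = np.asarray(pre, float), np.asarray(post, float)
        return (post.mean() - pre.mean()) / (max_score - pre.mean())

    # Hypothetical scores on one design dimension for 14 students
    pre = [0, 1, 0, 1, 1, 0, 2, 1, 0, 1, 1, 0, 1, 1]
    post = [2, 2, 1, 2, 2, 1, 2, 2, 1, 2, 2, 1, 2, 2]

    g = hake_gain(pre, post, max_score=2)  # g > 0.7 is conventionally "high gain"
    ```

    Normalizing by the remaining headroom is what lets gains be compared across dimensions with different pre-test starting points.
    
    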

  20. Clinical, demographic, and laboratory characteristics of children with nephrolithiasis.

    PubMed

    Sas, David J; Becton, Lauren J; Tutman, Jeffrey; Lindsay, Laura A; Wahlquist, Amy H

    2016-06-01

    While the incidence of pediatric kidney stones appears to be increasing, little is known about the demographic, clinical, laboratory, imaging, and management variables in this patient population. We sought to describe various characteristics of our stone-forming pediatric population. To that end, we retrospectively reviewed the charts of pediatric patients with nephrolithiasis confirmed by imaging. Data were collected on multiple variables from each patient and analyzed for trends. For body mass index (BMI) controls, data from a general pediatrics population similar to our nephrolithiasis population were used. Data on 155 pediatric nephrolithiasis patients were analyzed. Of the 54 calculi available for analysis, 98% were calcium-based. Low urine volume, elevated supersaturation of calcium phosphate, elevated supersaturation of calcium oxalate, and hypercalciuria were the most commonly identified abnormalities on analysis of 24-h urine collections. Our stone-forming population did not have a higher BMI than our general pediatrics population, making it unlikely that obesity is a risk factor for nephrolithiasis in children. More girls presented with their first stone during adolescence, suggesting a role for reproductive hormones in stone risk, while boys tended to present more commonly at a younger age, though this did not reach statistical significance. These intriguing findings warrant further investigation.

  1. Methodological approach to crime scene investigation: the dangers of technology

    NASA Astrophysics Data System (ADS)

    Barnett, Peter D.

    1997-02-01

    The visitor to any modern forensic science laboratory is confronted with equipment and processes that did not exist even 10 years ago: thermocyclers that allow genetic typing of nanogram amounts of DNA isolated from a few spermatozoa; scanning electron microscopes that can nearly automatically detect submicrometer-sized particles of molten lead, barium and antimony produced by the discharge of a firearm and deposited on the hands of the shooter; and computers that can compare an image of a latent fingerprint with millions of fingerprints stored in computer memory. Analysis of populations of physical evidence has permitted statistically minded forensic scientists to use Bayesian inference to draw conclusions based on a priori assumptions which are often poorly understood, irrelevant, or misleading. National commissions studying quality control in DNA analysis propose that people with barely relevant graduate degrees and little forensic science experience be placed in charge of forensic DNA laboratories. It is undeniable that high-tech methods have reversed some miscarriages of justice by establishing the innocence of a number of people who were imprisoned for years for crimes that they did not commit. However, this paper deals with the dangers of technology in criminal investigations.

  2. Improving the efficiency of the cardiac catheterization laboratories through understanding the stochastic behavior of the scheduled procedures.

    PubMed

    Stepaniak, Pieter S; Soliman Hamad, Mohamed A; Dekker, Lukas R C; Koolen, Jacques J

    2014-01-01

    In this study, we sought to analyze the stochastic behavior of Catheterization Laboratory (Cath Lab) procedures in our institution. Statistical models may help to improve estimated case durations to support management in the cost-effective use of expensive surgical resources. We retrospectively analyzed all the procedures performed in the Cath Labs in 2012. Procedure durations are strictly positive (larger than zero) and mostly have a substantial minimum duration. Because of this strictly positive character, a lognormal model is a natural fit; the minimum duration additionally requires an estimate of the threshold (shift) parameter of the lognormal model, which makes the 3-parameter lognormal model interesting. To avoid heterogeneous groups of observations, we tested every group-cardiologist-procedure combination against the normal, 2- and 3-parameter lognormal distributions. The total number of elective and emergency procedures performed was 6,393 (8,186 h). The final analysis included 6,135 procedures (7,779 h). Electrophysiology procedures fit the 3-parameter lognormal model in 86.1% of cases (intervention procedures: 80.1%). Using Friedman test statistics, we conclude that the 3-parameter lognormal model is superior to the 2-parameter lognormal model, and that the 2-parameter lognormal model is in turn superior to the normal model. Cath Lab procedures are well modelled by lognormal models. This information helps to improve and refine Cath Lab schedules and hence their efficient use.
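    A 3-parameter lognormal adds a threshold (shift) to the usual lognormal, so that log(duration − shift) is normally distributed. The sketch below uses synthetic durations and a crude moment-style estimate (shift placed just below the observed minimum), not the study's actual fitting procedure; maximum-likelihood or quantile-based estimators are preferred in practice:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic case durations (minutes): a hard 20-minute minimum (setup time)
    # plus a lognormal component, mimicking strictly positive threshold durations
    durations = 20.0 + rng.lognormal(mean=3.5, sigma=0.4, size=500)

    # Crude estimate: put the shift just below the smallest observation,
    # then fit normal parameters to the log-shifted data
    shift_hat = 0.99 * durations.min()
    logs = np.log(durations - shift_hat)
    mu_hat, sigma_hat = logs.mean(), logs.std(ddof=1)

    # Median duration implied by the fitted model: shift + exp(mu)
    median_hat = shift_hat + np.exp(mu_hat)
    ```

    The threshold trick is only meant to show why the third parameter matters: a plain 2-parameter lognormal cannot represent a duration distribution whose support starts well above zero.
    
    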

  3. Interlaboratory study of free cyanide methods compared to total cyanide measurements and the effect of preservation with sodium hydroxide for secondary- and tertiary-treated waste water samples.

    PubMed

    Stanley, Brett J; Antonio, Karen

    2012-11-01

    Several methods exist for the measurement of cyanide levels in treated wastewater, typically requiring preservation of the sample with sodium hydroxide to minimize loss of hydrogen cyanide gas (HCN). Recent reports have shown that cyanide levels may increase with chlorination or preservation. In this study, three flow injection analysis methods involving colorimetric and amperometric detection were compared within one laboratory, as well as across separate laboratories and equipment. Split wastewater samples from eight facilities and three different sampling periods were tested. An interlaboratory confidence interval of 3.5 ppb was calculated, compared with the intralaboratory reporting limit of 2 ppb. The results show that free cyanide measurements are not statistically different from total cyanide levels. An artificial increase in cyanide level is observed with all methods for preserved samples relative to nonpreserved samples, with an average increase of 2.3 ppb. The possible loss of cyanide without preservation is shown to be statistically insignificant if samples are properly stored for up to 48 hours. The cyanide increase with preservation is further substantiated with the method of standard additions and is not a matrix interference. The increase appears to be correlated with the amount of cyanide observed without preservation, which appears to be greater in those facilities that disinfect their wastewater with chlorine, followed by dechlorination with sodium bisulfite.
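    The method of standard additions mentioned above quantifies an analyte by spiking known amounts into the sample and extrapolating the response line back to zero signal; the unspiked concentration is the magnitude of the x-intercept. A minimal sketch with made-up cyanide numbers:

    ```python
    import numpy as np

    # Hypothetical standard-addition data: spiked cyanide (ppb) vs. signal
    added = np.array([0.0, 2.0, 4.0, 6.0])
    signal = np.array([1.0, 2.0, 3.0, 4.0])  # arbitrary instrument units

    # Fit signal = slope * added + intercept, then extrapolate to zero signal
    slope, intercept = np.polyfit(added, signal, 1)
    conc = intercept / slope  # concentration already present in the sample
    ```

    Because the calibration is built in the sample's own matrix, agreement between standard-addition and direct results is evidence against a matrix interference, which is how the study uses it.
    
    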

  4. Meteorological Development Laboratory Student Career Experience Program

    NASA Astrophysics Data System (ADS)

    McCalla, C., Sr.

    2007-12-01

    The National Oceanic and Atmospheric Administration's (NOAA) National Weather Service (NWS) provides weather, hydrologic, and climate forecasts and warnings for the protection of life and property and the enhancement of the national economy. The NWS's Meteorological Development Laboratory (MDL) supports this mission by developing meteorological prediction methods. Given this mission, NOAA, NWS, and MDL all have a need to continually recruit talented scientists. One avenue for recruiting such talented scientists is the Student Career Experience Program (SCEP). Through SCEP, MDL offers undergraduate and graduate students majoring in meteorology, computer science, mathematics, oceanography, physics, and statistics the opportunity to alternate full-time paid employment with periods of full-time study. Using SCEP as a recruiting vehicle, MDL has employed students who possess some of the very latest technical skills and knowledge needed to make meaningful contributions to projects within the lab. MDL has recently expanded its use of SCEP and has increased the number of students (sometimes called co-ops) in its program. As a co-op, a student can expect to develop and implement computer-based scientific techniques, participate in the development of statistical algorithms, assist in the analysis of meteorological data, and verify forecasts. This presentation will focus on describing recruitment, projects, and the application process related to MDL's SCEP. In addition, this presentation will also briefly explore the career paths of students who successfully completed the program.

  5. Effects of a Research-Infused Botanical Curriculum on Undergraduates’ Content Knowledge, STEM Competencies, and Attitudes toward Plant Sciences

    PubMed Central

    Clarke, H. David; Horton, Jonathan L.

    2014-01-01

    In response to the American Association for the Advancement of Science's Vision and Change in Undergraduate Biology Education initiative, we infused authentic, plant-based research into majors’ courses at a public liberal arts university. Faculty members designed a financially sustainable pedagogical approach, utilizing vertically integrated curricular modules based on undergraduate researchers’ field and laboratory projects. Our goals were to 1) teach botanical concepts, from cells to ecosystems; 2) strengthen competencies in statistical analysis and scientific writing; 3) pique plant science interest; and 4) allow all undergraduates to contribute to genuine research. Our series of inquiry-centered exercises mitigated potential faculty barriers to adopting research-rich curricula, facilitating teaching/research balance by gathering publishable scholarly data during laboratory class periods. Student competencies were assessed with pre- and postcourse quizzes and rubric-graded papers, and attitudes were evaluated with pre- and postcourse surveys. Our revised curriculum increased students’ knowledge and awareness of plant science topics, improved scientific writing, enhanced statistical knowledge, and boosted interest in conducting research. More than 300 classroom students have participated in our program, and data generated from these modules’ assessment allowed faculty and students to present 28 contributed talks or posters and publish three papers in 4 yr. Future steps include analyzing the effects of repeated module exposure on student learning and creating a regional consortium to increase our project's pedagogical impact. PMID:25185223

  6. Modeling and optimization of trihalomethanes formation potential of surface water (a drinking water source) using Box-Behnken design.

    PubMed

    Singh, Kunwar P; Rai, Premanjali; Pandey, Priyanka; Sinha, Sarita

    2012-01-01

    The present research aims to investigate the individual and interactive effects of chlorine dose/dissolved organic carbon ratio, pH, temperature, bromide concentration, and reaction time on trihalomethanes (THMs) formation in surface water (a drinking water source) during disinfection by chlorination in a prototype laboratory-scale simulation, and to develop a model for the prediction and optimization of THMs levels in chlorinated water for their effective control. A five-factor Box-Behnken experimental design combined with response surface and optimization modeling was used for predicting the THMs levels in chlorinated water. The adequacy of the selected model and the statistical significance of the regression coefficients, independent variables, and their interactions were tested by analysis of variance and t-test statistics. The THMs levels predicted by the model were very close to the experimental values (R(2) = 0.95). The maximum THMs formation level (highest risk) predicted by optimization modeling (192 μg/l) was very close to the experimental value (186.8 ± 1.72 μg/l) determined in laboratory experiments. The pH of the water, followed by reaction time and temperature, were the most significant factors affecting THMs formation during chlorination. The developed model can be used to determine the optimum characteristics of raw water and chlorination conditions for maintaining THMs levels within the safe limit.
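    A Box-Behnken design places runs at the midpoints of the edges of the factor cube (±1 on each pair of factors, 0 elsewhere) plus replicated center points, which is what allows a quadratic response surface to be fitted without corner runs. A sketch in coded units; the three-factor size and three center points are illustrative, not the study's five-factor design:

    ```python
    from itertools import combinations, product

    def box_behnken(k, center_points=3):
        """Box-Behnken design in coded units: +/-1 on each pair of factors,
        0 on all other factors, plus replicated center points."""
        runs = []
        for i, j in combinations(range(k), 2):
            for a, b in product((-1, 1), repeat=2):
                run = [0] * k
                run[i], run[j] = a, b
                runs.append(run)
        runs.extend([[0] * k for _ in range(center_points)])
        return runs

    design = box_behnken(3)  # 12 edge-midpoint runs + 3 center runs = 15
    ```

    Each run varies at most two factors at once, so extreme all-factors-at-their-limits combinations (often physically awkward, e.g. maximum chlorine dose at maximum temperature) never occur in the experiment.
    
    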

  7. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef; Conrad, Patrick; Bigoni, Daniele

    QUEST (\\url{www.quest-scidac.org}) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT Uncertainty Quantification library, called MUQ (\\url{muq.mit.edu}).

  8. Integrated Data Collection Analysis (IDCA) Program - Statistical Analysis of RDX Standard Data Sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandstrom, Mary M.; Brown, Geoffrey W.; Preston, Daniel N.

    2015-10-30

    The Integrated Data Collection Analysis (IDCA) program is conducting a Proficiency Test for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard. The material was tested as a well-characterized standard several times during the proficiency study to assess differences among participants and the range of results that may arise for well-behaved explosive materials. The analyses show that there are detectable differences among the results from IDCA participants. While these differences are statistically significant, most of them can be disregarded for comparison purposes to assess potential variability when laboratories attempt to measure identical samples using methods assumed to be nominally the same. The results presented in this report include the average sensitivity results for the IDCA participants and the ranges of values obtained. The ranges represent variation about the mean values of the tests of between 26% and 42%. The magnitude of this variation is attributed to differences in operator, method, and environment, as well as the use of different instruments that are also of varying age. The results appear to be a good representation of the broader safety testing community based on the range of methods, instruments, and environments included in the IDCA Proficiency Test.

  9. Use of error grid analysis to evaluate acceptability of a point of care prothrombin time meter.

    PubMed

    Petersen, John R; Vonmarensdorf, Hans M; Weiss, Heidi L; Elghetany, M Tarek

    2010-02-01

    Statistical methods (linear regression, correlation analysis, etc.) are frequently employed in comparing methods in the central laboratory (CL). Assessing acceptability of point of care testing (POCT) equipment, however, is more difficult because statistically significant biases may not have an impact on clinical care. We showed how error grid (EG) analysis can be used to compare POCT PT INR with the CL. We compared results from 103 patients seen in an anti-coagulation clinic who were on Coumadin maintenance therapy, using fingerstick samples for POCT (Roche CoaguChek XS and S) and citrated venous blood samples for CL (Stago STAR). To compare clinical acceptability of results we developed an EG with zones A, B, C and D. Using 2nd order polynomial equation analysis, POCT results correlate highly with the CL for the CoaguChek XS (R(2)=0.955) and the CoaguChek S (R(2)=0.93), but this does not indicate whether POCT results are clinically interchangeable with the CL. Using EG, it is readily apparent which levels can be considered clinically identical to the CL despite analytical bias. We have demonstrated the usefulness of EG in determining acceptability of POCT PT INR testing and how it can be used to determine cut-offs where differences in POCT results may impact clinical care. Copyright 2009 Elsevier B.V. All rights reserved.
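    An error grid replaces purely statistical agreement with clinically defined zones. A toy classifier for paired INR results; the zone boundaries below are invented for illustration only and are not the zones developed in the study:

    ```python
    def inr_error_zone(lab_inr, poct_inr):
        """Toy error-grid zone for a paired PT INR result. Hypothetical
        thresholds: A = clinically identical, B = no dosing change,
        C = possible dosing change, D = dangerous discrepancy."""
        rel = abs(poct_inr - lab_inr) / lab_inr
        if rel <= 0.10:
            return "A"
        if rel <= 0.20:
            return "B"
        if rel <= 0.30:
            return "C"
        return "D"

    # Hypothetical (central lab, POCT) INR pairs
    pairs = [(2.5, 2.6), (3.0, 3.5), (3.0, 3.8), (4.0, 6.0)]
    zones = [inr_error_zone(lab, poct) for lab, poct in pairs]
    ```

    Two devices can share a high R² while distributing their discrepancies into very different zones, which is the point of plotting agreement on a grid rather than reporting a single correlation.
    
    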

  10. Analysis of lactate concentrations in canine synovial fluid.

    PubMed

    Proot, J L J; de Vicente, F; Sheahan, D E

    2015-01-01

    To report synovial fluid lactate concentrations in normal and pathological canine joints. Controlled, prospective study. Lactate was measured in synovial fluid using a hand-held meter, and the rest of the fluid was sent to a commercial laboratory for analysis. Samples were divided into four groups; group 1: control, group 2: osteoarthritis, group 3: immune-mediated inflammatory arthritis, and group 4: septic arthritis. Statistical analysis was performed to compare lactate concentrations between the four groups and to examine the predictive value of lactate in the diagnosis of septic arthritis. A correlation was sought between synovial fluid lactate and synovial fluid total nucleated cell count and total protein. Seventy-four samples were investigated from 55 dogs. Statistical analysis found that lactate concentrations were significantly higher in the septic arthritis group than in each of the other three groups. No significant correlation could be found between synovial fluid lactate concentrations and synovial fluid total nucleated cell count or synovial fluid total protein. Lactate concentration was found to be a useful predictor of septic arthritis, with a low concentration more useful for excluding septic arthritis than a high concentration for confirming the diagnosis. Synovial fluid lactate concentration is not a good marker for osteoarthritis or immune-mediated inflammatory arthritis, but it is significantly increased in septic arthritis and could help the clinician in ruling out this condition in a quick and cost-effective way.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goltz, G.; Kaiser, L.M.; Weiner, H.

    A major mission of the U.S. Coast Guard is the task of providing and maintaining Maritime Aids to Navigation. These aids are located on and near the coastline and inland waters of the United States and its possessions. A computer program, Design Synthesis and Performance Analysis (DSPA), has been developed by the Jet Propulsion Laboratory to demonstrate the feasibility of low-cost solar array/battery power systems for use on flashing lamp buoys. To provide detailed, realistic temperature, wind, and solar insolation data for analysis of the flashing lamp buoy power systems, the two DSPA support computer program sets, MERGE and STAT, were developed. A general description of these two packages is presented in this program summary report. The MERGE program set will enable the Coast Guard to combine temperature and wind velocity data (NOAA TDF-14 tapes) with solar insolation data (NOAA DECK-280 tapes) onto a single sequential MERGE file containing up to 12 years of hourly observations. This MERGE file can then be used as direct input to the DSPA program. The STAT program set will enable a statistical analysis to be performed on the MERGE data and produce high, low, or mean profiles of the data and/or perform a worst-case analysis. The STAT output file consists of a one-year set of hourly statistical weather data which can be used as input to the DSPA program.

  12. Multivariate evaluation of the cutting performance of rotary instruments with electric and air-turbine handpieces.

    PubMed

    Funkenbusch, Paul D; Rotella, Mario; Chochlidakis, Konstantinos; Ercoli, Carlo

    2016-10-01

    Laboratory studies of tooth preparation often involve single values for all variables other than the one being tested. In contrast, in clinical settings, not all variables can be adequately controlled. For example, a new dental rotary cutting instrument may be tested in the laboratory by making a specific cut with a fixed force, but, in clinical practice, the instrument must make different cuts with individual dentists applying different forces. Therefore, the broad applicability of laboratory results to diverse clinical conditions is uncertain and the comparison of effects across studies is difficult. The purpose of this in vitro study was to examine the effects of 9 process variables on the dental cutting of rotary cutting instruments used with an electric handpiece and compare them with those of a previous study that used an air-turbine handpiece. The effects of 9 key process variables on the efficiency of a simulated dental cutting operation were measured. A fractional factorial experiment was conducted by using an electric handpiece in a computer-controlled, dedicated testing apparatus to simulate dental cutting procedures with Macor blocks as the cutting substrate. Analysis of variance (ANOVA) was used to assess statistical significance (α=.05). Four variables (targeted applied load, cut length, diamond grit size, and cut type) consistently produced large, statistically significant effects, whereas 5 variables (rotations per minute, number of cooling ports, rotary cutting instrument diameter, disposability, and water flow rate) produced relatively small, statistically insignificant effects. These results are generally similar to those previously found for an air-turbine handpiece. Regardless of whether an electric or air-turbine handpiece was used, the control exerted by the dentist, simulated in this study by targeting a specific level of applied force, was the single most important factor affecting cutting efficiency.
Cutting efficiency was also significantly affected by factors simulating patient/clinical circumstances and hardware choices. These results highlight the greater importance of local clinical conditions (procedure, dentist) in understanding dental cutting as opposed to other hardware-related factors. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  13. Designed experiment evaluation of key variables affecting the cutting performance of rotary instruments.

    PubMed

    Funkenbusch, Paul D; Rotella, Mario; Ercoli, Carlo

    2015-04-01

    Laboratory studies of tooth preparation are often performed under a limited range of conditions involving single values for all variables other than the one being tested. In contrast, in clinical settings not all variables can be tightly controlled. For example, a new dental rotary cutting instrument may be tested in the laboratory by making a specific cut with a fixed force, but in clinical practice, the instrument must make different cuts with individual dentists applying a range of different forces. Therefore, the broad applicability of laboratory results to diverse clinical conditions is uncertain and the comparison of effects across studies is difficult. The purpose of this study was to examine the effect of 9 process variables on dental cutting in a single experiment, allowing each variable to be robustly tested over a range of values for the other 8 and permitting a direct comparison of the relative importance of each on the cutting process. The effects of 9 key process variables on the efficiency of a simulated dental cutting operation were measured. A fractional factorial experiment was conducted by using a computer-controlled, dedicated testing apparatus to simulate dental cutting procedures and Macor blocks as the cutting substrate. Analysis of variance (ANOVA) was used to judge the statistical significance (α=.05). Five variables consistently produced large, statistically significant effects (target applied load, cut length, starting rpm, diamond grit size, and cut type), while 4 variables produced relatively small, statistically insignificant effects (number of cooling ports, rotary cutting instrument diameter, disposability, and water flow rate). The control exerted by the dentist, simulated in this study by targeting a specific level of applied force, was the single most important factor affecting cutting efficiency. Cutting efficiency was also significantly affected by factors simulating patient/clinical circumstances as well as hardware choices. 
These results highlight the importance of local clinical conditions (procedure, dentist) in understanding dental cutting procedures and in designing adequate experimental methodologies for future studies. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
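    The variable-screening logic above can be sketched numerically. The snippet below estimates main effects from a two-level fractional factorial in the way such designs are commonly analyzed; the three factor names and all responses are hypothetical stand-ins, not the study's data.

```python
# Sketch: main-effect estimation in a two-level fractional factorial.
# A half-fraction of 2^3 with defining relation I = ABC; responses invented.
from itertools import product

runs = [(a, b, a * b) for a, b in product((-1, 1), repeat=2)]
response = [12.1, 15.8, 14.9, 22.3]  # hypothetical cutting rates

def main_effect(col):
    """Average response at the high (+1) level minus average at the low (-1)."""
    hi = [y for x, y in zip(runs, response) if x[col] == 1]
    lo = [y for x, y in zip(runs, response) if x[col] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {name: main_effect(i)
           for i, name in enumerate(("load", "grit", "cut_type"))}
```

Ranking the absolute effect sizes then separates the "large" from the "small" factors, which is the screening step ANOVA formalizes with significance tests.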

  14. Real-time forecasting and predictability of catastrophic failure events: from rock failure to volcanoes and earthquakes

    NASA Astrophysics Data System (ADS)

    Main, I. G.; Bell, A. F.; Naylor, M.; Atkinson, M.; Filguera, R.; Meredith, P. G.; Brantut, N.

    2012-12-01

    Accurate prediction of catastrophic brittle failure in rocks and in the Earth presents a significant challenge on theoretical and practical grounds. The governing equations are not known precisely, but are known to produce highly non-linear behavior similar to that of near-critical dynamical systems, with a large and irreducible stochastic component due to material heterogeneity. In a laboratory setting, mechanical, hydraulic and rock physical properties are known to change in systematic ways prior to catastrophic failure, often with significant non-Gaussian fluctuations about the mean signal at a given time, for example in the rate of remotely-sensed acoustic emissions. The effectiveness of such signals in real-time forecasting has never been tested before in a controlled laboratory setting, and previous work has often been qualitative in nature and subject to retrospective selection bias, though it has often been invoked as a basis for forecasting natural hazard events such as volcanoes and earthquakes. Here we describe a collaborative experiment in real-time data assimilation to explore the limits of predictability of rock failure in a best-case scenario. Data are streamed from a remote rock deformation laboratory to a user-friendly portal, where several proposed physical/stochastic models can be analysed in parallel in real time, using a variety of statistical fitting techniques, including least squares regression, maximum likelihood fitting, Markov-chain Monte-Carlo and Bayesian analysis. The results are posted and regularly updated on the web site prior to catastrophic failure, to ensure a true and verifiable prospective test of forecasting power. Preliminary tests on synthetic data with known non-Gaussian statistics show how forecasting power is likely to evolve in the live experiments. 
In general the predicted failure time does converge on the real failure time, illustrating the bias associated with the 'benefit of hindsight' in retrospective analyses. Inference techniques that account explicitly for non-Gaussian statistics significantly reduce the bias, and increase the reliability and accuracy, of the forecast failure time in prospective mode.
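    The simplest of the fitting techniques listed above, least squares regression, can be illustrated with the classic inverse-rate linearisation widely used in failure forecasting (Voight's relation): if a precursor rate accelerates as r(t) ∝ 1/(t_f - t), then 1/r is linear in time and the x-intercept of a fitted line estimates the failure time t_f. The synthetic data and this specific functional form are illustrative assumptions, not the experiment's actual models.

```python
# Sketch: inverse-rate failure-time forecast via ordinary least squares.
# Synthetic data with known t_f = 100 s stand in for streamed AE rates.

def failure_time_from_inverse_rate(ts, rates):
    """Fit 1/rate vs time by least squares; the x-intercept estimates t_f."""
    ys = [1.0 / r for r in rates]
    n = len(ts)
    mx = sum(ts) / n
    my = sum(ys) / n
    sxx = sum((t - mx) ** 2 for t in ts)
    sxy = sum((t - mx) * (y - my) for t, y in zip(ts, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    return -intercept / slope  # time at which the fitted 1/rate reaches zero

t_f = 100.0
ts = [10.0, 30.0, 50.0, 70.0, 90.0]
rates = [1.0 / (t_f - t) for t in ts]  # accelerating precursor rate
forecast = failure_time_from_inverse_rate(ts, rates)
```

With noise-free data the intercept recovers t_f exactly; with realistic non-Gaussian fluctuations the estimate wanders, which is why the abstract emphasizes inference techniques that model those statistics explicitly.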

  15. Detection of QT prolongation using a novel ECG analysis algorithm applying intelligent automation: Prospective blinded evaluation using the Cardiac Safety Research Consortium ECG database

    PubMed Central

    Green, Cynthia L.; Kligfield, Paul; George, Samuel; Gussak, Ihor; Vajdic, Branislav; Sager, Philip; Krucoff, Mitchell W.

    2013-01-01

    Background The Cardiac Safety Research Consortium (CSRC) provides both “learning” and blinded “testing” digital ECG datasets from thorough QT (TQT) studies annotated for submission to the US Food and Drug Administration (FDA) to developers of ECG analysis technologies. This manuscript reports the first results from a blinded “testing” dataset that examines Developer re-analysis of original Sponsor-reported core laboratory data. Methods 11,925 anonymized ECGs, including both moxifloxacin and placebo arms of a parallel-group TQT in 191 subjects, were blindly analyzed using a novel ECG analysis algorithm applying intelligent automation. Developer-measured ECG intervals were submitted to CSRC for unblinding, temporal reconstruction of the TQT exposures, and statistical comparison to core laboratory findings previously submitted to FDA by the pharmaceutical sponsor. Primary comparisons included baseline-adjusted interval measurements, baseline- and placebo-adjusted moxifloxacin QTcF changes (ddQTcF), and associated variability measures. Results Developer- and Sponsor-reported baseline-adjusted data were similar, with average differences of less than 1 millisecond (ms) for all intervals. Both Developer- and Sponsor-reported data demonstrated assay sensitivity with similar ddQTcF changes. Average within-subject standard deviation for triplicate QTcF measurements was significantly lower for Developer- than Sponsor-reported data (5.4 ms and 7.2 ms, respectively; p<0.001). Conclusion The virtually automated ECG algorithm used for this analysis produced similar yet less variable TQT results compared to the Sponsor-reported study, without the use of a manual core laboratory. These findings indicate CSRC ECG datasets can be useful for evaluating novel methods and algorithms for determining QT/QTc prolongation by drugs. 
While the results should not constitute endorsement of specific algorithms by either CSRC or FDA, the value of a public domain digital ECG warehouse to provide prospective, blinded comparisons of ECG technologies applied for QT/QTc measurement is illustrated. PMID:22424006
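    The variability measure compared in this study, the average within-subject standard deviation of triplicate QTcF readings, is straightforward to compute. The triplicates below are invented millisecond values, not the CSRC dataset.

```python
# Sketch: average within-subject SD of triplicate QTcF measurements
# (hypothetical readings in ms for three subjects).
from statistics import mean, stdev

triplicates = [
    [402.0, 405.0, 399.0],  # subject 1
    [410.0, 414.0, 412.0],  # subject 2
    [396.0, 401.0, 398.0],  # subject 3
]

# Sample SD within each subject's triplicate, then averaged across subjects.
avg_within_subject_sd = mean(stdev(t) for t in triplicates)
```

A lower value of this statistic for one measurement method over another, as reported here (5.4 ms vs 7.2 ms), indicates tighter repeatability across a subject's replicate ECGs.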

  16. Development of a rapid soil water content detection technique using active infrared thermal methods for in-field applications.

    PubMed

    Antonucci, Francesca; Pallottino, Federico; Costa, Corrado; Rimatori, Valentina; Giorgi, Stefano; Papetti, Patrizia; Menesatti, Paolo

    2011-01-01

    The aim of this study was to investigate the suitability of active infrared thermography and thermometry, in combination with multivariate statistical partial least squares analysis, as rapid soil water content detection techniques both in the laboratory and in the field. Such techniques allow fast soil water content measurements useful in both agricultural and environmental applications. These techniques, based on the theory of heat dissipation, were tested by directly measuring the dynamic temperature variation of samples after heating. For the assessment of dynamic temperature variations, data were collected during three intervals (3, 6 and 10 s). To account for the difference in specific heat between water and soil, the analyses used slopes to describe the temperature trends linearly. For all analyses, the best model was achieved with the 10 s slope. Three different approaches were considered, two in the laboratory and one in the field. The first laboratory approach, based on active infrared thermography, used the measured temperature variation as the independent variable and gave r = 0.74. The second laboratory approach, based on active infrared thermometry, added irradiation as an independent variable and gave r = 0.76. The in-field experiment was performed by active infrared thermometry, heating bare soil by solar irradiance after exposure due to primary tillage. Several meteorological parameters were included as independent variables in the prediction model, which gave r = 0.61. To obtain more general in-field estimates, a partial least squares discriminant analysis on three classes of percentage soil water content was performed, obtaining a high rate of correct classification in the test set (88.89%). The prediction error values were lower in the field than in the laboratory analyses. 
Both techniques could be used in conjunction with a Geographic Information System for obtaining detailed information on soil heterogeneity.
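    As a rough illustration of the multivariate calibration named above, here is a minimal one-component PLS1 regression. The toy predictor rows (standing in for 10 s cooling-curve slopes) and water-content targets are invented, and a real analysis would use several components and proper validation.

```python
# Minimal one-component PLS1 regression sketch; all numbers invented.

def pls1_one_component(X, y):
    """Fit a single PLS component to rows X and targets y; return a predictor."""
    n, p = len(X), len(X[0])
    xm = [sum(row[j] for row in X) / n for j in range(p)]
    ym = sum(y) / n
    Xc = [[row[j] - xm[j] for j in range(p)] for row in X]
    yc = [v - ym for v in y]
    # Weight vector w is proportional to X'y, normalised to unit length.
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(p)]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    scores = [sum(Xc[i][j] * w[j] for j in range(p)) for i in range(n)]
    # Regressing y on the scores gives the loading q.
    q = sum(t * v for t, v in zip(scores, yc)) / sum(t * t for t in scores)

    def predict(row):
        return ym + q * sum((row[j] - xm[j]) * w[j] for j in range(p))

    return predict

# Toy calibration: rows stand in for cooling-curve slope features,
# targets for soil water content (%).
predict = pls1_one_component([[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]],
                             [10.0, 20.0, 30.0])
```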

  17. EzArray: A web-based highly automated Affymetrix expression array data management and analysis system

    PubMed Central

    Zhu, Yuerong; Zhu, Yuelin; Xu, Wei

    2008-01-01

    Background Though microarray experiments are very popular in life science research, managing and analyzing microarray data are still challenging tasks for many biologists. Most microarray programs require users to have sophisticated knowledge of mathematics and statistics, as well as computer skills. With accumulating microarray data deposited in public databases, easy-to-use programs to re-analyze previously published microarray data are in high demand. Results EzArray is a web-based Affymetrix expression array data management and analysis system for researchers who need to organize microarray data efficiently and get data analyzed instantly. EzArray organizes microarray data into projects that can be analyzed online with predefined or custom procedures. EzArray performs data preprocessing and detection of differentially expressed genes with statistical methods. All analysis procedures are optimized and highly automated so that even novice users with limited prior knowledge of microarray data analysis can complete initial analysis quickly. Since all input files, analysis parameters, and executed scripts can be downloaded, EzArray provides maximum reproducibility for each analysis. In addition, EzArray integrates with Gene Expression Omnibus (GEO) and allows instantaneous re-analysis of published array data. Conclusion EzArray is a novel Affymetrix expression array data analysis and sharing system. EzArray provides easy-to-use tools for re-analyzing published microarray data and will help both novice and experienced users perform initial analysis of their microarray data from the location of data storage. We believe EzArray will be a useful system for facilities with microarray services and laboratories with multiple members involved in microarray data analysis. EzArray is freely available from . PMID:18218103

  18. A cohort mortality study of employees exposed to chlorinated chemicals.

    PubMed

    Wong, O

    1988-01-01

    The cohort of this historical prospective mortality study consisted of 697 male employees at a chlorination plant. A majority of the cohort was potentially exposed to benzotrichloride, benzyl chloride, benzoyl chloride, and other related chemicals. The mortality experience of the cohort was observed from 1943 through 1982. For the cohort as a whole, no statistically significant mortality excess was detected. The overall Standardized Mortality Ratio (SMR) was 100, and the SMR for all cancers combined was 122 (not significant). The respiratory cancer SMR for the cohort as a whole was 246 (7 observed vs. 2.8 expected). The excess was of borderline statistical significance, the lower 95% confidence limit being 99. Analysis by race showed that all 7 respiratory cancer deaths came from the white male employees, with an SMR of 265 (p < 0.05). The respiratory cancer mortality excess was higher among employees in maintenance (SMR = 229) than among those in operations or production (SMR = 178). The lung cancer mortality excess among the laboratory employees was statistically significant (SMR = 1292). However, this observation should be viewed with caution, since it was based on only 2 deaths. Further analysis indicated that the respiratory cancer mortality excess was limited to the male employees with 15 or more years of employment (SMR = 379, p < 0.05). Based on animal data as well as other epidemiologic studies, together with the internal consistency of analysis by length of employment, the data suggest an association between the chlorination process of toluene at the plant and an increased risk of respiratory cancer. (ABSTRACT TRUNCATED AT 250 WORDS)
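    The SMRs quoted above follow the standard definition, 100 × observed/expected deaths. The sketch below also attaches an approximate 95% confidence interval using Byar's approximation to the exact Poisson limits; the abstract does not state which interval method was used, so that choice is an assumption, and O = 7 vs E = 2.8 only roughly reproduces the reported figures.

```python
# Sketch: SMR with Byar's approximate 95% Poisson limits on the observed count.
import math

def smr_with_ci(observed, expected, z=1.96):
    """SMR = 100 * O/E; Byar's approximation bounds the observed count O."""
    smr = 100.0 * observed / expected
    low_o = observed * (1 - 1 / (9 * observed)
                        - z / (3 * math.sqrt(observed))) ** 3
    upper_n = observed + 1
    high_o = upper_n * (1 - 1 / (9 * upper_n)
                        + z / (3 * math.sqrt(upper_n))) ** 3
    return smr, 100.0 * low_o / expected, 100.0 * high_o / expected

smr, lower, upper = smr_with_ci(7, 2.8)
```

A lower limit hovering around 100, as computed here, matches the abstract's description of "borderline" significance: the interval barely includes the null SMR of 100.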

  19. Laboratory performance in the Sediment Laboratory Quality-Assurance Project, 1996-98

    USGS Publications Warehouse

    Gordon, John D.; Newland, Carla A.; Gagliardi, Shane T.

    2000-01-01

    Analytical results from all sediment quality-control samples are compiled and statistically summarized by the USGS, Branch of Quality Systems, both on an intra- and interlaboratory basis. When evaluating these data, the reader needs to keep in mind that every measurement has an error component associated with it. It is premature to use the data from the first five SLQA studies to judge any of the laboratories as performing in an unacceptable manner. There were, however, some notable differences in the results for the 12 laboratories that participated in the five SLQA studies. For example, the overall median percent difference for suspended-sediment concentration on an individual laboratory basis ranged from –18.04 to –0.33 percent. Five of the 12 laboratories had an overall median percent difference for suspended-sediment concentration of –2.02 to –0.33 percent. There was less variability in the median difference for the measured fine-size material mass. The overall median percent difference for fine-size material mass ranged from –10.11 to –4.27 percent. Except for one laboratory, the median difference for fine-size material mass was within a fairly narrow range of –6.76 to –4.27 percent. The median percent difference for sand-size material mass differed among laboratories more than any other physical sediment property measured in the study. The overall median percent difference for the sand-size material mass ranged from –1.49 percent to 26.39 percent. Five of the nine laboratories that do sand/fine separations had overall median percent differences that ranged from –1.49 to 2.98 percent for sand-size material mass. Careful review of the data reveals that certain laboratories consistently produced data within statistical control limits for some or all of the physical sediment properties measured in this study, whereas other laboratories occasionally produced data that exceeded the control limits.
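    The percent-difference summaries above reduce to a simple computation: 100 × (reported - reference) / reference for each QC sample, summarised per laboratory by its median. The sample pairs below are invented, not the USGS data.

```python
# Sketch: per-laboratory median percent difference for QC samples.
from statistics import median

def median_percent_difference(pairs):
    """pairs: (laboratory-reported value, known reference value)."""
    return median(100.0 * (lab - ref) / ref for lab, ref in pairs)

# Invented suspended-sediment concentration QC pairs (mg/L).
ssc_pairs = [(98.0, 100.0), (195.0, 200.0), (48.5, 50.0), (305.0, 300.0)]
mpd = median_percent_difference(ssc_pairs)
```

The median is used rather than the mean so a single gross analytical error does not dominate a laboratory's summary statistic.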

  20. 7 CFR 160.17 - Laboratory analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Laboratory analysis. 160.17 Section 160.17 Agriculture... STANDARDS FOR NAVAL STORES Methods of Analysis, Inspection, Sampling and Grading § 160.17 Laboratory analysis. The analysis and laboratory testing of naval stores shall be conducted, so far as is practicable...

  1. 7 CFR 160.17 - Laboratory analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Laboratory analysis. 160.17 Section 160.17 Agriculture... STANDARDS FOR NAVAL STORES Methods of Analysis, Inspection, Sampling and Grading § 160.17 Laboratory analysis. The analysis and laboratory testing of naval stores shall be conducted, so far as is practicable...

  2. Crop identification technology assessment for remote sensing (CITARS). Volume 6: Data processing at the laboratory for applications of remote sensing

    NASA Technical Reports Server (NTRS)

    Bauer, M. E.; Cary, T. K.; Davis, B. J.; Swain, P. H.

    1975-01-01

    The results of classifications and experiments for the crop identification technology assessment for remote sensing are summarized. Using two analysis procedures, 15 data sets were classified. One procedure used class weights while the other assumed equal probabilities of occurrence for all classes. Additionally, 20 data sets were classified using training statistics from another segment or date. The classification and proportion estimation results of the local and nonlocal classifications are reported. Data also describe several other experiments to provide additional understanding of the results of the crop identification technology assessment for remote sensing. These experiments investigated alternative analysis procedures, training set selection and size, effects of multitemporal registration, spectral discriminability of corn, soybeans, and other, and analyses of aircraft multispectral data.

  3. Computer experimental analysis of the CHP performance of a 100 kWe SOFC Field Unit by a factorial design

    NASA Astrophysics Data System (ADS)

    Calì, M.; Santarelli, M. G. L.; Leone, P.

    Gas Turbine Technologies (GTT) and Politecnico di Torino, both located in Torino (Italy), have been involved in the design and installation of a SOFC laboratory in order to analyse the operation, in cogenerative configuration, of the CHP 100 kWe SOFC Field Unit, built by Siemens-Westinghouse Power Corporation (SWPC), which is at present (May 2005) starting its operation and which will supply electric and thermal power to the GTT factory. In order to take better advantage of the analysis of the on-site operation, and especially to correctly design the scheduled experimental tests on the system, we developed a mathematical model and ran a simulated experimental campaign, applying a rigorous statistical approach to the analysis of the results. The aim of this work is the computer experimental analysis, through a statistical methodology (2^k factorial experiments), of the CHP 100 performance. First, the mathematical model was calibrated with the results acquired during the first CHP 100 demonstration at EDB/ELSAM in Westerwoort. Then, the simulated tests were performed in the form of a computer experimental session, and the measurement uncertainties were simulated with perturbations imposed on the model's independent variables. The statistical methodology used for the computer experimental analysis is the factorial design (Yates' technique): using the ANOVA technique, the effect of the main independent variables (air utilization factor U_ox, fuel utilization factor U_F, internal fuel and air preheating, and anodic recycling flow rate) has been investigated in a rigorous manner. The analysis accounts for the effects of these parameters on stack electric power, thermal recovered power, single cell voltage, cell operative temperature, consumed fuel flow and steam-to-carbon ratio. Each main effect and interaction effect of the parameters is shown, with particular attention to generated electric power and stack heat recovered.
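    The factorial analysis named above (Yates' technique) can be sketched for a small 2^2 case. The two factors and response values below are toy numbers, not outputs of the CHP 100 model.

```python
# Sketch: Yates' algorithm for contrasts/effects in a 2^k factorial design.

def yates(responses):
    """Responses in standard order ((1), a, b, ab, ...); after k
    halving-and-combining passes, returns contrasts [total, A, B, AB, ...]."""
    work = list(responses)
    n = len(work)
    for _ in range(n.bit_length() - 1):  # k passes for 2^k runs
        sums = [work[i] + work[i + 1] for i in range(0, n, 2)]
        diffs = [work[i + 1] - work[i] for i in range(0, n, 2)]
        work = sums + diffs
    return work

# Toy 2^2 example: U_ox low/high crossed with U_F low/high (invented values).
contrasts = yates([55.0, 61.0, 58.0, 66.0])
effects = [c / 2.0 for c in contrasts[1:]]  # divide contrasts by 2^(k-1) = 2
```

ANOVA then partitions the response variability among these contrasts to judge which effects are statistically meaningful.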

  4. Time-Efficiency Analysis Comparing Digital and Conventional Workflows for Implant Crowns: A Prospective Clinical Crossover Trial.

    PubMed

    Joda, Tim; Brägger, Urs

    2015-01-01

    To compare time-efficiency in the production of implant crowns using a digital workflow versus the conventional pathway. This prospective clinical study used a crossover design that included 20 study participants receiving single-tooth replacements in posterior sites. Each patient received a customized titanium abutment plus a computer-aided design/computer-assisted manufacture (CAD/CAM) zirconia suprastructure (for those in the test group, using the digital workflow) and a standardized titanium abutment plus a porcelain-fused-to-metal crown (for those in the control group, using the conventional pathway). The start of the implant prosthetic treatment was established as the baseline. Time-efficiency analysis was defined as the primary outcome and was measured for every single clinical and laboratory work step in minutes. Statistical analysis was performed with the Wilcoxon rank sum test. All crowns could be provided within two clinical appointments, independent of the manufacturing process. The mean total production time, as the sum of clinical plus laboratory work steps, was significantly different. The mean ± standard deviation (SD) time was 185.4 ± 17.9 minutes for the digital workflow process and 223.0 ± 26.2 minutes for the conventional pathway (P = .0001). Therefore, digital processing for overall treatment was 16% faster. Detailed analysis of the clinical treatment revealed a significantly reduced mean ± SD chair time of 27.3 ± 3.4 minutes for the test group compared with 33.2 ± 4.9 minutes for the control group (P = .0001). Similar results were found for the mean laboratory work time, with a significant decrease to 158.1 ± 17.2 minutes for the test group vs 189.8 ± 25.3 minutes for the control group (P = .0001). Only a few studies have investigated efficiency parameters of digital workflows compared with conventional pathways in implant dental medicine. 
This investigation shows that the digital workflow seems to be more time-efficient than the established conventional production pathway for fixed implant-supported crowns. Both clinical chair time and laboratory manufacturing steps could be effectively shortened with the digital process of intraoral scanning plus CAD/CAM technology.
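    The Wilcoxon rank sum comparison used for the primary outcome can be sketched as follows; the minute values are invented, and only their direction mirrors the abstract.

```python
# Sketch: Wilcoxon rank sum / Mann-Whitney U statistic for two small samples.

def rank_sum_U(sample_a, sample_b):
    """Mann-Whitney U for sample_a vs sample_b, using midranks for ties."""
    combined = sorted(sample_a + sample_b)

    def midrank(v):
        first = combined.index(v) + 1          # 1-based rank of first copy
        return first + (combined.count(v) - 1) / 2.0

    w = sum(midrank(v) for v in sample_a)      # Wilcoxon rank sum
    n_a = len(sample_a)
    return w - n_a * (n_a + 1) / 2.0           # U statistic

digital = [182.0, 171.0, 190.0, 178.0]        # invented total minutes
conventional = [220.0, 231.0, 205.0, 226.0]
u = rank_sum_U(digital, conventional)          # 0 means complete separation
```

A U of zero, as in this toy case, means every digital time ranked below every conventional time; a significance test would then compare U (or the rank sum) against its null distribution.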

  5. A web-based laboratory information system to improve quality of care of tuberculosis patients in Peru: functional requirements, implementation and usage statistics.

    PubMed

    Blaya, Joaquin A; Shin, Sonya S; Yagui, Martin J A; Yale, Gloria; Suarez, Carmen Z; Asencios, Luis L; Cegielski, J Peter; Fraser, Hamish S F

    2007-10-28

    Multi-drug resistant tuberculosis patients in resource-poor settings experience large delays in starting appropriate treatment and may not be monitored appropriately due to an overburdened laboratory system, delays in communication of results, and missing or error-prone laboratory data. The objective of this paper is to describe an electronic laboratory information system implemented to alleviate these problems and its expanding use by the Peruvian public sector, as well as examine the broader issues of implementing such systems in resource-poor settings. A web-based laboratory information system "e-Chasqui" has been designed and implemented in Peru to improve the timeliness and quality of laboratory data. It was deployed in the national TB laboratory, two regional laboratories and twelve pilot health centres. Using needs assessment and workflow analysis tools, e-Chasqui was designed to provide for improved patient care, increased quality control, and more efficient laboratory monitoring and reporting. Since its full implementation in March 2006, 29,944 smear microscopy, 31,797 culture and 7,675 drug susceptibility test results have been entered. Over 99% of these results have been viewed online by the health centres. High user satisfaction and heavy use have led to the expansion of e-Chasqui to additional institutions. In total, e-Chasqui will serve a network of institutions providing medical care for over 3.1 million people. The cost to maintain this system is approximately US$0.53 per sample or 1% of the National Peruvian TB program's 2006 budget. Electronic laboratory information systems have a large potential to improve patient care and public health monitoring in resource-poor settings. Some of the challenges faced in these settings, such as lack of trained personnel, limited transportation, and large coverage areas, are obstacles that a well-designed system can overcome. e-Chasqui has the potential to provide a national TB laboratory network in Peru. 
Furthermore, the core functionality of e-Chasqui has been implemented in the open source medical record system OpenMRS (http://www.openmrs.org) for other countries to use.

  6. A web-based laboratory information system to improve quality of care of tuberculosis patients in Peru: functional requirements, implementation and usage statistics

    PubMed Central

    Blaya, Joaquin A; Shin, Sonya S; Yagui, Martin JA; Yale, Gloria; Suarez, Carmen Z; Asencios, Luis L; Cegielski, J Peter; Fraser, Hamish SF

    2007-01-01

    Background Multi-drug resistant tuberculosis patients in resource-poor settings experience large delays in starting appropriate treatment and may not be monitored appropriately due to an overburdened laboratory system, delays in communication of results, and missing or error-prone laboratory data. The objective of this paper is to describe an electronic laboratory information system implemented to alleviate these problems and its expanding use by the Peruvian public sector, as well as examine the broader issues of implementing such systems in resource-poor settings. Methods A web-based laboratory information system "e-Chasqui" has been designed and implemented in Peru to improve the timeliness and quality of laboratory data. It was deployed in the national TB laboratory, two regional laboratories and twelve pilot health centres. Using needs assessment and workflow analysis tools, e-Chasqui was designed to provide for improved patient care, increased quality control, and more efficient laboratory monitoring and reporting. Results Since its full implementation in March 2006, 29,944 smear microscopy, 31,797 culture and 7,675 drug susceptibility test results have been entered. Over 99% of these results have been viewed online by the health centres. High user satisfaction and heavy use have led to the expansion of e-Chasqui to additional institutions. In total, e-Chasqui will serve a network of institutions providing medical care for over 3.1 million people. The cost to maintain this system is approximately US$0.53 per sample or 1% of the National Peruvian TB program's 2006 budget. Conclusion Electronic laboratory information systems have a large potential to improve patient care and public health monitoring in resource-poor settings. Some of the challenges faced in these settings, such as lack of trained personnel, limited transportation, and large coverage areas, are obstacles that a well-designed system can overcome. 
e-Chasqui has the potential to provide a national TB laboratory network in Peru. Furthermore, the core functionality of e-Chasqui has been implemented in the open source medical record system OpenMRS for other countries to use. PMID:17963522

  7. Analytical performance evaluation of a high-volume hematology laboratory utilizing sigma metrics as standard of excellence.

    PubMed

    Shaikh, M S; Moiz, B

    2016-04-01

    Around two-thirds of important clinical decisions about the management of patients are based on laboratory test results. Clinical laboratories are required to adopt quality control (QC) measures to ensure the provision of accurate and precise results. Six sigma is a statistical tool that provides an opportunity to assess performance against the highest level of excellence. The purpose of this study was to assess the performance of our hematological parameters on the sigma scale in order to identify gaps, and hence areas of improvement, in patient care. The 12 analytes included in the study were hemoglobin (Hb), hematocrit (Hct), red blood cell count (RBC), mean corpuscular volume (MCV), red cell distribution width (RDW), total leukocyte count (TLC) with percentages of neutrophils (Neutr%) and lymphocytes (Lymph%), platelet count (Plt), mean platelet volume (MPV), prothrombin time (PT), and fibrinogen (Fbg). Internal quality control data and external quality assurance survey results were utilized for the calculation of sigma metrics for each analyte. An acceptable sigma value of ≥3 was obtained for the majority of the analytes included in the analysis. MCV, Plt, and Fbg achieved values of <3 for the level 1 (low abnormal) control. PT performed poorly on both level 1 and level 2 controls, with sigma values of <3. Even when conventional QC tools indicate acceptable performance, the application of sigma metrics can identify analytical deficits and hence opportunities for improvement in clinical laboratories. © 2016 John Wiley & Sons Ltd.
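    The abstract does not spell out its sigma calculation, but sigma metrics in laboratory medicine are commonly computed as sigma = (TEa - |bias|) / CV, where TEa is the total allowable error, bias comes from external quality assurance, and CV from internal QC. The percentages below are hypothetical, not the study's values.

```python
# Sketch: the common sigma-metric formula used in laboratory QC evaluation.

def sigma_metric(tea_pct, bias_pct, cv_pct):
    """sigma = (TEa - |bias|) / CV, with all three terms in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical hemoglobin figures: TEa = 7%, bias = 1.5%, CV = 1.1%.
sigma_hb = sigma_metric(7.0, 1.5, 1.1)
acceptable = sigma_hb >= 3  # the >= 3 sigma criterion applied in the study
```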

  8. Virtual lab demonstrations improve students' mastery of basic biology laboratory techniques.

    PubMed

    Maldarelli, Grace A; Hartmann, Erica M; Cummings, Patrick J; Horner, Robert D; Obom, Kristina M; Shingles, Richard; Pearlman, Rebecca S

    2009-01-01

    Biology laboratory classes are designed to teach concepts and techniques through experiential learning. Students who have never performed a technique must be guided through the process, which is often difficult to standardize across multiple lab sections. Visual demonstration of laboratory procedures is a key element of teaching pedagogy. The main goals of the study were to create videos explaining and demonstrating a variety of lab techniques that would serve as teaching tools for undergraduate and graduate lab courses and to assess the impact of these videos on student learning. Demonstrations of individual laboratory procedures were videotaped and then edited with iMovie. Narration for the videos was edited with Audacity. To assess the impact of the videos on student lab performance, undergraduate students anonymously completed two Participant Perception Indicator surveys, one before and one after viewing. A total of 203 and 171 students completed the pre- and posttesting surveys, respectively. Statistical analyses were performed to compare student perceptions of knowledge of, confidence in, and experience with the lab techniques before and after viewing the videos. Eleven demonstrations were recorded. Chi-square analysis revealed a significant increase in the number of students reporting increased knowledge of, confidence in, and experience with the lab techniques after viewing the videos. Incorporation of instructional videos as prelaboratory exercises has the potential to standardize techniques and to promote successful experimental outcomes.
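    The chi-square analysis of pre/post survey counts can be sketched on a 2 × 2 table; the counts below are invented (only the survey sizes, 203 and 171, echo the abstract).

```python
# Sketch: Pearson chi-square test of independence for a 2 x 2 table.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the table [[a, b], [c, d]]; compare against
    3.841, the 0.05 critical value at 1 degree of freedom."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Invented counts: students reporting confidence vs not,
# before (60 of 203) and after (120 of 171) viewing the videos.
stat = chi_square_2x2(60, 143, 120, 51)
significant = stat > 3.841
```

A statistic far above the critical value, as in this toy table, corresponds to the significant pre/post shift the study reports.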

  9. Vigilance in the Laboratory Predicts Avoidance in the Real World: A Dimensional Analysis of Neural, Behavioral, and Ecological Momentary Data in Anxious Youth

    PubMed Central

    Silk, Jennifer S.; Ladouceur, Cecile D.; Ryan, Neal D.; Dahl, Ronald E.; Forbes, Erika E.; Siegle, Greg J.

    2016-01-01

    Vigilance and avoidance of threat are observed in anxious adults during laboratory tasks, and are posited to have real-world clinical relevance, but data are mixed in anxious youth. We propose that vigilance-avoidance patterns will become evident in anxious youth through a focus on individual differences and real-world strategic avoidance. Decreased functional connectivity between the amygdala and prefrontal cortex (PFC) could play a mechanistic role in this link. 78 clinically anxious youth completed a dot-probe task to assess vigilance to threat while undergoing fMRI. Real-world avoidance was assessed using Ecological Momentary Assessment (EMA) of self-reported suppression and distraction during negative life events. Vigilance towards threat was positively associated with EMA distraction and suppression. Functional connectivity between a right amygdala seed region and dorsomedial and right dorsolateral PFC regions was inversely related to EMA distraction. Dorsolateral PFC-amygdalar connectivity statistically mediated the relationship between attentional vigilance and real-world distraction. Findings suggest anxious youth showing attentional vigilance toward threat are more likely to use suppression and distraction to regulate negative emotions. Reduced PFC control over limbic reactivity is a possible neural substrate of this pattern. These findings lend ecological validity to laboratory vigilance assessments and suggest PFC-amygdalar connectivity is a neural mechanism bridging laboratory and naturalistic contexts. PMID:27010577

  10. Fukushima Daiichi Unit 1 Uncertainty Analysis-Exploration of Core Melt Progression Uncertain Parameters-Volume II.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denman, Matthew R.; Brooks, Dusty Marie

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi unit 1 (1F1) accident progression with the MELCOR code. Volume I of the 1F1 UA discusses the physical modeling details and time history results of the UA. Volume II of the 1F1 UA discusses the statistical viewpoint. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). The goal of this work was to perform a focused evaluation of uncertainty in core damage progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, fraction of intact fuel, vessel lower head failure) and in doing so assess the applicability of traditional sensitivity analysis techniques.

  11. QTest: Quantitative Testing of Theories of Binary Choice.

    PubMed

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.

  12. Protein abundances can distinguish between naturally-occurring and laboratory strains of Yersinia pestis, the causative agent of plague

    DOE PAGES

    Merkley, Eric D.; Sego, Landon H.; Lin, Andy; ...

    2017-08-30

    Adaptive processes in bacterial species can occur rapidly in laboratory culture, leading to genetic divergence between naturally occurring and laboratory-adapted strains. Differentiating wild and closely-related laboratory strains is clearly important for biodefense and bioforensics; however, DNA sequence data alone has thus far not provided a clear signature, perhaps due to lack of understanding of how diverse genome changes lead to adapted phenotypes. Protein abundance profiles from mass spectrometry-based proteomics analyses are a molecular measure of phenotype. Proteomics data contains sufficient information that powerful statistical methods can uncover signatures that distinguish wild strains of Yersinia pestis from laboratory-adapted strains.

  13. Learning Axes and Bridging Tools in a Technology-Based Design for Statistics

    ERIC Educational Resources Information Center

    Abrahamson, Dor; Wilensky, Uri

    2007-01-01

    We introduce a design-based research framework, "learning axes and bridging tools," and demonstrate its application in the preparation and study of an implementation of a middle-school experimental computer-based unit on probability and statistics, "ProbLab" (Probability Laboratory, Abrahamson and Wilensky 2002 [Abrahamson, D., & Wilensky, U.…

  14. 75 FR 79035 - Nationally Recognized Testing Laboratories; Supplier's Declaration of Conformity

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-17

    ... statistic, however, covers only a narrow subset of ICT equipment, and excludes laptop computers and computer... between 2003 and March of 2009 shows a total of 60 product recalls, including laptop computers, scanners..., statistical or similar data and studies, of a credible nature, supporting any claims made by commenters.'' (73...

  15. Challenges faced by cervical cancer prevention programs in developing countries: a situational analysis of program organization in Argentina.

    PubMed

    Arrossi, Silvina; Paolino, Melisa; Sankaranarayanan, Rengaswamy

    2010-10-01

    Objective: to carry out a situational analysis of cervical cancer prevention activities in Argentina, specifically regarding (a) the organizational framework of cervical cancer prevention activities; (b) Pap-smear coverage; (c) cytology laboratory organization; and (d) follow-up/treatment of women with abnormal lesions. Methods: a situational analysis of provincial cervical cancer programs using data from an ad-hoc questionnaire sent to the leaders of cervical cancer prevention programs in Argentina's 24 provinces. In addition, the provinces' program guidelines, statistical reports, laws, and program regulations were reviewed and certain key leaders were personally interviewed. Results: data were obtained for 19 of Argentina's 24 provinces. Four of the 19 provinces had no formal program framework. Conventional cytology was the most commonly used screening test. Screening was mainly opportunistic. The recommended interval between normal tests was 3 years in most provinces. The eligible age for screening ranged from 10-70 years of age; however, annual or biannual screening was the usual practice after becoming sexually active. None of the provincial programs had data available regarding Pap-smear coverage. Most of the cytology laboratories did not have a quality control policy. The number of smears read varied greatly by laboratory (650-24 000 per year). A log of events related to screening and treatment did not exist in most provinces. Conclusions: screening in Argentina is mainly opportunistic, characterized by an estimated low coverage, coexisting with over-screening of women with access to health services, and an absence of quality control procedures. Policies for cervical cancer screening in the provinces vary and, most often, deviate from the national recommendation of one Pap smear every 3 years for women 35-64 years of age. Ensuring compliance with national program guidelines is an essential step toward significantly reducing the burden of cervical cancer.

  16. Determination of Calcium in Dietary Supplements: Statistical Comparison of Methods in the Analytical Laboratory

    ERIC Educational Resources Information Center

    Garvey, Sarah L.; Shahmohammadi, Golbon; McLain, Derek R.; Dietz, Mark L.

    2015-01-01

    A laboratory experiment is described in which students compare two methods for the determination of the calcium content of commercial dietary supplement tablets. In a two-week sequence, the sample tablets are first analyzed via complexometric titration with ethylenediaminetetraacetic acid and then, following ion exchange of the calcium ion present…

  17. Suggestions for Teaching Mathematics Using Laboratory Approaches. 6. Probability. Experimental Edition.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Bureau of Elementary Curriculum Development.

    This guide is the sixth in a series of publications to assist teachers in using a laboratory approach to mathematics. Twenty activities on probability and statistics for the elementary grades are described in terms of purpose, materials needed, and procedures to be used. Objectives of these activities include basic probability concepts; gathering,…

  18. Melanie II--a third-generation software package for analysis of two-dimensional electrophoresis images: I. Features and user interface.

    PubMed

    Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F

    1997-12-01

    Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments that were possible thanks to new graphical interface standards such as XWindows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory that is involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison, statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "Virtual Lab" of the post-genome era.

  19. An effect size filter improves the reproducibility in spectral counting-based comparative proteomics.

    PubMed

    Gregori, Josep; Villarreal, Laura; Sánchez, Alex; Baselga, José; Villanueva, Josep

    2013-12-16

    The microarray community has shown that the low reproducibility observed in gene expression-based biomarker discovery studies is partially due to relying solely on p-values to generate lists of differentially expressed genes, and has recommended complementing the p-value cutoff with effect-size criteria. The aim of this work was to evaluate the influence of such an effect-size filter on spectral counting-based comparative proteomic analysis. The filter increased the number of true positives and decreased both the number of false positives and the false discovery rate of the dataset. These results were confirmed by simulation experiments in which the effect-size filter was applied to systematically varied fractions of differentially expressed proteins. Our results suggest that relaxing the p-value cutoff and then applying a post-test filter based on effect-size and signal-level thresholds can increase the reproducibility of statistical results obtained in comparative proteomic analysis. Based on this work, we recommend a filter consisting of a minimum absolute log2 fold change of 0.8 and a minimum signal of 2-4 SpC in the more abundant condition for the general practice of comparative proteomics. Implementing such feature-filtering approaches could improve proteomic biomarker discovery initiatives by increasing the reproducibility of results obtained among independent laboratories and MS platforms. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 Elsevier B.V. All rights reserved.
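    The recommended cutoff (minimum absolute log2 fold change of 0.8 and a minimum spectral-count signal in the more abundant condition) can be sketched as a simple post-test filter. The function name and the pseudocount used to avoid log2(0) are illustrative choices, not part of the published method:

```python
import math

def passes_effect_size_filter(spc_a, spc_b,
                              min_abs_log2_fc=0.8, min_signal=2,
                              pseudocount=0.5):
    """Post-test filter on spectral counts (SpC) for one protein that is
    already below the p-value cutoff. Keeps it only if the more abundant
    condition has at least `min_signal` SpC and the absolute log2 fold
    change is at least `min_abs_log2_fc`. The pseudocount that avoids
    log2(0) is an illustrative choice."""
    if max(spc_a, spc_b) < min_signal:
        return False  # signal too low in both conditions
    log2_fc = math.log2((spc_a + pseudocount) / (spc_b + pseudocount))
    return abs(log2_fc) >= min_abs_log2_fc

# 10 vs 3 SpC: log2(10.5 / 3.5) ≈ 1.58 >= 0.8, so the protein is kept
print(passes_effect_size_filter(10, 3))   # True
# 5 vs 4 SpC: log2(5.5 / 4.5) ≈ 0.29 < 0.8, so it is filtered out
print(passes_effect_size_filter(5, 4))    # False
```

    Applied after the (relaxed) p-value cutoff, this removes low-signal, small-effect hits that drive the false discovery rate.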

  20. Variability in P-Glycoprotein Inhibitory Potency (IC50) Using Various in Vitro Experimental Systems: Implications for Universal Digoxin Drug-Drug Interaction Risk Assessment Decision Criteria

    PubMed Central

    Bentz, Joe; O’Connor, Michael P.; Bednarczyk, Dallas; Coleman, JoAnn; Lee, Caroline; Palm, Johan; Pak, Y. Anne; Perloff, Elke S.; Reyner, Eric; Balimane, Praveen; Brännström, Marie; Chu, Xiaoyan; Funk, Christoph; Guo, Ailan; Hanna, Imad; Herédi-Szabó, Krisztina; Hillgren, Kate; Li, Libin; Hollnack-Pusch, Evelyn; Jamei, Masoud; Lin, Xuena; Mason, Andrew K.; Neuhoff, Sibylle; Patel, Aarti; Podila, Lalitha; Plise, Emile; Rajaraman, Ganesh; Salphati, Laurent; Sands, Eric; Taub, Mitchell E.; Taur, Jan-Shiang; Weitz, Dietmar; Wortelboer, Heleen M.; Xia, Cindy Q.; Xiao, Guangqing; Yabut, Jocelyn; Yamagata, Tetsuo; Zhang, Lei

    2013-01-01

    A P-glycoprotein (P-gp) IC50 working group was established with 23 participating pharmaceutical and contract research laboratories and one academic institution to assess interlaboratory variability in P-gp IC50 determinations. Each laboratory followed its in-house protocol to determine in vitro IC50 values for 16 inhibitors using four different test systems: human colon adenocarcinoma cells (Caco-2; eleven laboratories), Madin-Darby canine kidney cells transfected with MDR1 cDNA (MDCKII-MDR1; six laboratories), Lilly Laboratories Cells—Porcine Kidney Nr. 1 cells transfected with MDR1 cDNA (LLC-PK1-MDR1; four laboratories), and membrane vesicles containing human P-glycoprotein (P-gp; five laboratories). For cell models, various equations to calculate remaining transport activity (e.g., efflux ratio, unidirectional flux, net-secretory-flux) were also evaluated. The difference in IC50 values for each of the inhibitors across all test systems and equations ranged from a minimum of 20- and 24-fold between lowest and highest IC50 values for sertraline and isradipine, to a maximum of 407- and 796-fold for telmisartan and verapamil, respectively. For telmisartan and verapamil, variability was greatly influenced by data from one laboratory in each case. Excluding these two data sets brings the range in IC50 values for telmisartan and verapamil down to 69- and 159-fold. The efflux ratio-based equation generally resulted in severalfold lower IC50 values compared with unidirectional or net-secretory-flux equations. Statistical analysis indicated that variability in IC50 values was mainly due to interlaboratory variability, rather than an implicit systematic difference between test systems. Potential reasons for variability are discussed and the simplest, most robust experimental design for P-gp IC50 determination is proposed.
The impact of these findings on drug-drug interaction risk assessment is discussed in the companion article (Ellens et al., 2013) and recommendations are provided. PMID:23620485
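    As a rough illustration of how a single IC50 value is read off an inhibition curve, 50% remaining activity can be located by log-linear interpolation between the two bracketing inhibitor concentrations. This is a simplified sketch with hypothetical data; the laboratories in the study fit full concentration-response models, and as the abstract notes, the choice of activity equation (efflux ratio vs. flux-based) itself shifts the result severalfold.

```python
import math

def interpolate_ic50(concs, activities):
    """Estimate IC50 by log-linear interpolation between the two inhibitor
    concentrations (ascending, e.g. in uM) that bracket 50% remaining
    transport activity (% of control). A simplified stand-in for the
    nonlinear curve fits used in practice; all data here are hypothetical."""
    pairs = list(zip(concs, activities))
    for (c1, a1), (c2, a2) in zip(pairs, pairs[1:]):
        if a1 >= 50.0 >= a2:  # activity falls through 50% on this interval
            frac = (a1 - 50.0) / (a1 - a2)
            # Interpolate on log10(concentration), the usual dose axis
            return 10 ** (math.log10(c1) + frac * (math.log10(c2) - math.log10(c1)))
    raise ValueError("50% activity is not bracketed by the tested concentrations")

# Hypothetical curve: activity drops through 50% between 1 and 10 uM
print(round(interpolate_ic50([0.1, 1, 10, 100], [95, 80, 40, 10]), 2))  # 5.62
```

    Because the estimate depends on how "remaining activity" is computed, two laboratories using efflux-ratio vs. unidirectional-flux equations can report quite different IC50 values from similar raw data, which is one source of the fold-ranges discussed above.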
