Sample records for Deming regression analysis

  1. DEMES rotary joint: theories and applications

    NASA Astrophysics Data System (ADS)

    Wang, Shu; Hao, Zhaogang; Li, Mingyu; Huang, Bo; Sun, Lining; Zhao, Jianwen

    2017-04-01

As a kind of dielectric elastomer actuator, the dielectric elastomer minimum energy structure (DEMES) can realize large angular deformations from small voltage-induced strains, which makes it an attractive candidate for biomimetic robotics. Because the rotary joint is a basic and common component of many biomimetic robots, we have fabricated rotary joints from DEMES and characterized their performance over the past two years. In this paper, we discuss the static analysis, dynamic analysis and some characteristics of the DEMES rotary joint. Based on the theoretical analysis, several applications of the DEMES rotary joint are presented, such as a flapping wing, a biomimetic fish and a two-legged walker. All of these robots are built from DEMES rotary joints and can realize basic biomimetic motions. Compared with a traditional rigid robot, a robot based on DEMES is soft and light, giving it an advantage in collision resistance.

  2. Semi-automatic assessment of skin capillary density: proof of principle and validation.

    PubMed

    Gronenschild, E H B M; Muris, D M J; Schram, M T; Karaca, U; Stehouwer, C D A; Houben, A J H M

    2013-11-01

Skin capillary density and recruitment have been proven to be relevant measures of microvascular function. Unfortunately, the assessment of skin capillary density from movie files is very time-consuming, since this is done manually. This impedes the use of this technique in large-scale studies. We aimed to develop a (semi-)automated assessment of skin capillary density. CapiAna (Capillary Analysis) is a newly developed semi-automatic image analysis application. The technique involves four steps: 1) movement correction, 2) selection of the frame range and positioning of the region of interest (ROI), 3) automatic detection of capillaries, and 4) manual correction of detected capillaries. To gain insight into the performance of the technique, skin capillary density was measured in twenty participants (ten women; mean age 56.2 [42-72] years). To investigate the agreement between CapiAna and the classic manual counting procedure, we used weighted Deming regression and Bland-Altman analyses. In addition, intra- and inter-observer coefficients of variation (CVs) and differences in analysis time were assessed. We found a good agreement between CapiAna and the classic manual method, with a Pearson's correlation coefficient (r) of 0.95 (P<0.001) and a Deming regression coefficient of 1.01 (95%CI: 0.91; 1.10). In addition, we found no significant differences between the two methods, with an intercept of the Deming regression of 1.75 (-6.04; 9.54), while the Bland-Altman analysis showed a mean difference (bias) of 2.0 (-13.5; 18.4) capillaries/mm². The intra- and inter-observer CVs of CapiAna were 2.5% and 5.6%, respectively, while for the classic manual counting procedure these were 3.2% and 7.2%, respectively. Finally, the analysis time for CapiAna ranged between 25 and 35 min versus 80 and 95 min for the manual counting procedure.
We have developed a semi-automatic image analysis application (CapiAna) for the assessment of skin capillary density, which agrees well with the classic manual counting procedure, is time-saving, and has better reproducibility than the classic manual counting procedure. As a result, the use of skin capillaroscopy is feasible in large-scale studies, which importantly extends the possibilities to perform microcirculation research in humans. © 2013.
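The two agreement statistics this study reports can be sketched numerically. The snippet below is a minimal illustration (not the CapiAna implementation): a Deming regression, here assuming an error-variance ratio of 1 (orthogonal regression), and the Bland-Altman bias with 95% limits of agreement.

```python
import numpy as np

def deming(x, y, lam=1.0):
    """Deming regression of y on x.

    `lam` is the assumed ratio of the y-error variance to the
    x-error variance; lam=1 gives orthogonal regression.
    Returns (slope, intercept).
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx, syy = np.var(x, ddof=1), np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    slope = ((syy - lam * sxx)
             + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    return slope, y.mean() - slope * x.mean()

def bland_altman(x, y):
    """Mean difference (bias) and 95% limits of agreement of y - x."""
    d = np.asarray(y, float) - np.asarray(x, float)
    bias, half = d.mean(), 1.96 * d.std(ddof=1)
    return bias, bias - half, bias + half
```

A slope near 1 and an intercept near 0, as reported above, indicate the absence of proportional and constant bias, respectively.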

  3. Equivalent dynamic model of DEMES rotary joint

    NASA Astrophysics Data System (ADS)

    Zhao, Jianwen; Wang, Shu; Xing, Zhiguang; McCoul, David; Niu, Junyang; Huang, Bo; Liu, Liwu; Leng, Jinsong

    2016-07-01

The dielectric elastomer minimum energy structure (DEMES) can realize large angular deformations from a small voltage-induced strain of the dielectric elastomer (DE), so it is a suitable candidate for the rotary joint of a soft robot. Dynamic analysis is necessary for some applications, but the dynamic response of DEMESs is difficult to model because of the complicated morphology and viscoelasticity of the DE film. In this paper, a method combining theoretical analysis and experimental measurement is presented to model the dynamic response of a DEMES rotary joint under an alternating voltage. The model is derived from measurements of the equivalent driving force and damping of the DEMES. Experiments were carried out to validate the equivalent dynamic model. Although the maximum angle error between model and experiment is greater than ten degrees, the model is acceptable for predicting the angular velocity of the DEMES, so it can be applied in feedforward-feedback compound control.

  4. Genetic diversity in aspen and its relation to arthropod abundance

    PubMed Central

    Zhang, Chunxia; Vornam, Barbara; Volmer, Katharina; Prinz, Kathleen; Kleemann, Frauke; Köhler, Lars; Polle, Andrea; Finkeldey, Reiner

    2015-01-01

The ecological consequences of biodiversity have become a prominent public issue, yet little is known about the effect of genetic diversity on ecosystem services. Here, a diversity experiment was established with European and North American aspen (Populus tremula, P. tremuloides) planted in plots representing either a single deme only or combinations of two, four and eight demes. The goals of this study were to explore the complex inter- and intraspecific genetic diversity of aspen and then to relate three measures of diversity (deme diversity, and genetic diversity determined as the Shannon index or as expected heterozygosity) to arthropod abundance. Microsatellite and AFLP markers were used to analyze the genetic variation patterns within and between the aspen demes and deme mixtures. Large differences were observed in the genetic diversity within demes. An analysis of molecular variance revealed that most of the total genetic diversity was found within demes, but the genetic differentiation among demes was also high. The complex patterns of genetic diversity and differentiation resulted in large differences in the genetic variation within plots. The average diversity increased from plots with only one deme to plots with two, four, and eight demes, respectively, and separated plots with and without American aspen. To test whether intra- and interspecific diversity impacts ecosystem services, arthropod abundance was determined. Increasing genetic diversity of aspen was related to increasing abundance of arthropods. However, the relationship was mainly driven by the presence of American aspen, suggesting that species identity overrode the effect of intraspecific variation of European aspen. PMID:25674097

  5. W. Edwards Deming, quality analysis, and total behavior management.

    PubMed

    Saunders, R R; Saunders, J L

    1994-01-01

    During the past 10 years, the inclusion of the word "quality" in descriptions of production methods, management approaches, educational systems, service system changes, and so forth, has grown exponentially. It appears that no new approach to any problem is likely to be given much consideration today without overt acknowledgment that some improvement in quality must be the outcome. The origins of the importance of quality are primarily rooted in the awakening recognition of the influence of W. Edwards Deming in the post-World War II restoration of Japanese industry. We provide a brief overview of Deming's approach to modernizing management methods and discuss recent criticisms from the field of organizational behavior management that his approach lacks emphasis on the role of reinforcement. We offer a different analysis of Deming's approach and relate its evolution to the contingencies of reinforcement for the behavior of consulting. We also provide an example of problem solving with Deming's approach in a social service setting familiar to many behavior analysts.

  6. Linear regression techniques for use in the EC tracer method of secondary organic aerosol estimation

    NASA Astrophysics Data System (ADS)

    Saylor, Rick D.; Edgerton, Eric S.; Hartsell, Benjamin E.

A variety of linear regression techniques and simple slope estimators are evaluated for use in the elemental carbon (EC) tracer method of secondary organic carbon (OC) estimation. Linear regression techniques based on ordinary least squares are not suitable for situations where measurement uncertainties exist in both regressed variables. In the past, regression based on the method of Deming [1943. Statistical Adjustment of Data. Wiley, London] has been the preferred choice for EC tracer method parameter estimation. In agreement with Chu [2005. Stable estimate of primary OC/EC ratios in the EC tracer method. Atmospheric Environment 39, 1383-1392], we find that in the limited case where primary non-combustion OC (OC_non-comb) is assumed to be zero, the ratio of averages (ROA) approach provides a stable and reliable estimate of the primary OC-EC ratio, (OC/EC)_pri. In contrast with Chu [2005. Stable estimate of primary OC/EC ratios in the EC tracer method. Atmospheric Environment 39, 1383-1392], however, we find that the optimal use of Deming regression (and the more general York et al. [2004. Unified equations for the slope, intercept, and standard errors of the best straight line. American Journal of Physics 72, 367-375] regression) provides excellent results as well. For the more typical case where OC_non-comb is allowed to obtain a non-zero value, we find that regression based on the method of York is the preferred choice for EC tracer method parameter estimation. In the York regression technique, detailed information on uncertainties in the measurement of OC and EC is used to improve the linear best fit to the given data. If only limited information is available on the relative uncertainties of OC and EC, then Deming regression should be used. On the other hand, use of ROA in the estimation of secondary OC, and thus the assumption of a zero OC_non-comb value, generally leads to an overestimation of the contribution of secondary OC to total measured OC.
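The EC tracer arithmetic described above is simple to sketch. The snippet below is an illustrative implementation (not the authors' code, and variable names are our own) of the ratio-of-averages estimator and the resulting secondary-OC estimate, OC_sec = OC_total - (OC/EC)_pri * EC - OC_non-comb.

```python
import numpy as np

def roa_primary_ratio(oc, ec):
    """Ratio-of-averages (ROA) estimate of (OC/EC)_pri.

    Valid under the assumption that primary non-combustion OC is zero,
    as in the limited case discussed by Chu (2005)."""
    return np.mean(oc) / np.mean(ec)

def secondary_oc(oc, ec, ratio_pri, oc_noncomb=0.0):
    """EC tracer estimate of secondary OC:
    OC_sec = OC_total - (OC/EC)_pri * EC - OC_non-comb."""
    return np.asarray(oc, float) - ratio_pri * np.asarray(ec, float) - oc_noncomb
```

With a non-zero `oc_noncomb`, the slope `ratio_pri` would instead come from an errors-in-variables fit such as Deming or York regression, as the abstract recommends.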

  7. An interactive website for analytical method comparison and bias estimation.

    PubMed

    Bahar, Burak; Tuncel, Ayse F; Holmes, Earle W; Holmes, Daniel T

    2017-12-01

    Regulatory standards mandate laboratories to perform studies to ensure accuracy and reliability of their test results. Method comparison and bias estimation are important components of these studies. We developed an interactive website for evaluating the relative performance of two analytical methods using R programming language tools. The website can be accessed at https://bahar.shinyapps.io/method_compare/. The site has an easy-to-use interface that allows both copy-pasting and manual entry of data. It also allows selection of a regression model and creation of regression and difference plots. Available regression models include Ordinary Least Squares, Weighted-Ordinary Least Squares, Deming, Weighted-Deming, Passing-Bablok and Passing-Bablok for large datasets. The server processes the data and generates downloadable reports in PDF or HTML format. Our website provides clinical laboratories a practical way to assess the relative performance of two analytical methods. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  8. An Analysis of Business Process Re-Engineering for Government Micro-Purchasing

    DTIC Science & Technology

    2014-09-01

45 B. AREAS FOR FURTHER RESEARCH .......................................................46 APPENDIX A. DEMING'S 14 POINTS FOR THE TRANSFORMATION OF...operational effectiveness. This research was based on the teachings of W. Edwards Deming. The first case study directed by Naval Air Systems Command...9 Crosby, W. Edwards Deming, and Joseph M. Juran. They each had their own definition of TQM, but among these three experts, they agreed, "that it is

  9. Comparison of low‐dose, half‐rotation, cone‐beam CT with electronic portal imaging device for registration of fiducial markers during prostate radiotherapy

    PubMed Central

    Wee, Leonard; Hackett, Sara Lyons; Jones, Andrew; Lim, Tee Sin; Harper, Christopher Stirling

    2013-01-01

    This study evaluated the agreement of fiducial marker localization between two modalities — an electronic portal imaging device (EPID) and cone‐beam computed tomography (CBCT) — using a low‐dose, half‐rotation scanning protocol. Twenty‐five prostate cancer patients with implanted fiducial markers were enrolled. Before each daily treatment, EPID and half‐rotation CBCT images were acquired. Translational shifts were computed for each modality and two marker‐matching algorithms, seed‐chamfer and grey‐value, were performed for each set of CBCT images. The localization offsets, and systematic and random errors from both modalities were computed. Localization performances for both modalities were compared using Bland‐Altman limits of agreement (LoA) analysis, Deming regression analysis, and Cohen's kappa inter‐rater analysis. The differences in the systematic and random errors between the modalities were within 0.2 mm in all directions. The LoA analysis revealed a 95% agreement limit of the modalities of 2 to 3.5 mm in any given translational direction. Deming regression analysis demonstrated that constant biases existed in the shifts computed by the modalities in the superior–inferior (SI) direction, but no significant proportional biases were identified in any direction. Cohen's kappa analysis showed good agreement between the modalities in prescribing translational corrections of the couch at 3 and 5 mm action levels. Images obtained from EPID and half‐rotation CBCT showed acceptable agreement for registration of fiducial markers. The seed‐chamfer algorithm for tracking of fiducial markers in CBCT datasets yielded better agreement than the grey‐value matching algorithm with EPID‐based registration. PACS numbers: 87.55.km, 87.55.Qr PMID:23835391

  10. Polymorphism in the two-locus Levene model with nonepistatic directional selection.

    PubMed

    Bürger, Reinhard

    2009-11-01

    For the Levene model with soft selection in two demes, the maintenance of polymorphism at two diallelic loci is studied. Selection is nonepistatic and dominance is intermediate. Thus, there is directional selection in every deme and at every locus. We assume that selection is in opposite directions in the two demes because otherwise no polymorphism is possible. If at one locus there is no dominance, then a complete analysis of the dynamical and equilibrium properties is performed. In particular, a simple necessary and sufficient condition for the existence of an internal equilibrium and sufficient conditions for global asymptotic stability are obtained. These results are extended to deme-independent degree of dominance at one locus. A perturbation analysis establishes structural stability within the full parameter space. In the absence of genotype-environment interaction, which requires deme-independent dominance at both loci, nongeneric equilibrium behavior occurs, and the introduction of arbitrarily small genotype-environment interaction changes the equilibrium structure and may destroy stable polymorphism. The volume of the parameter space for which a (stable) two-locus polymorphism is maintained is computed numerically. It is investigated how this volume depends on the strength of selection and on the dominance relations. If the favorable allele is (partially) dominant in its deme, more than 20% of all parameter combinations lead to a globally asymptotically stable, fully polymorphic equilibrium.

  11. W. Edwards Deming and total quality management: an interpretation for nursing practice.

    PubMed

    Williams, T; Howe, R

    1992-01-01

    The purpose of this article is to introduce nurses to W. Edwards Deming and the 14 points of his management philosophy, the basis of total quality management (TQM) (Deming, 1986). Each of Deming's points has been subject to in-depth analysis from business executives for the past 40 years. Quality improvement is at the very center of TQM. To adopt TQM will require a major thought transformation for many nursing leaders, but the benefits that nurses and the profession as a whole can reap from this revolutionary style of management make the effort to change worthwhile. If you are not satisfied with the status quo and are looking for a better way to conduct business, the information in this article will begin to define quality improvement and will help you strive for the highest possible level of service to your ultimate customer--the patient.

  12. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing

    PubMed Central

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-01-01

    Aims A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. Methods We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing assays (NGS). NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Results Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. Conclusions The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. PMID:28747393
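The abstract's point about reading constant and proportional error off a linear fit can be sketched directly: the intercept estimates the constant error and the slope's departure from one estimates the proportional error. This is an illustrative sketch, not the authors' code.

```python
import numpy as np

def error_components(reference, test):
    """Fit test-method values against reference values by ordinary
    least squares and decompose the disagreement.

    Returns (constant_error, proportional_error), where the intercept
    estimates the constant error and (slope - 1) the proportional error.
    """
    slope, intercept = np.polyfit(reference, test, 1)
    return intercept, slope - 1.0
```

Reporting these two components, alongside a Bland-Altman bias, conveys far more than a bare R2 value, which is the abstract's central argument.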

13. cp-R, an interface to the R programming language for clinical laboratory method comparisons.

    PubMed

    Holmes, Daniel T

    2015-02-01

Clinical scientists frequently need to compare two different bioanalytical methods as part of assay validation/monitoring. As a matter of necessity, regression methods for quantitative comparison in clinical chemistry, hematology and other clinical laboratory disciplines must allow for error in both the x and y variables. Traditionally the methods popularized by 1) Deming and 2) Passing and Bablok have been recommended. While commercial tools exist, no simple open-source tool is available. The purpose of this work was to develop an entirely open-source, GUI-driven program for bioanalytical method comparisons capable of performing these regression methods and able to produce highly customized graphical output. The GUI is written in Python and PyQt4, with R scripts performing the regression and graphical functions. The program can be run from source code or as a pre-compiled binary executable. The software performs three forms of regression and offers weighting where applicable. Confidence bands of the regression are calculated using bootstrapping for the Deming and Passing-Bablok methods. Users can customize regression plots according to the tools available in R and can produce output in any of jpg, png, tiff or bmp at any desired resolution, or in ps and pdf vector formats. Bland-Altman plots and some regression diagnostic plots are also generated. Correctness of regression parameter estimates was confirmed against existing R packages. The program allows for rapid and highly customizable graphical output capable of conforming to the publication requirements of any clinical chemistry journal. Quick method comparisons can also be performed and the results cut and pasted into spreadsheet or word processing applications. We present a simple and intuitive open-source tool for quantitative method comparison in a clinical laboratory environment. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
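The bootstrapped confidence bands mentioned above can be sketched for the Deming slope alone. This illustrates the percentile-bootstrap idea only; it is not cp-R's actual implementation (which uses R scripts), and the function names are our own.

```python
import numpy as np

rng = np.random.default_rng(0)

def deming_slope(x, y, lam=1.0):
    """Deming regression slope with assumed error-variance ratio `lam`."""
    sxx, syy = np.var(x, ddof=1), np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    return ((syy - lam * sxx)
            + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)

def bootstrap_slope_ci(x, y, n_boot=2000, alpha=0.05):
    """Percentile-bootstrap confidence interval for the Deming slope:
    resample (x, y) pairs with replacement, refit, take quantiles."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    idx = rng.integers(0, len(x), size=(n_boot, len(x)))
    slopes = np.array([deming_slope(x[i], y[i]) for i in idx])
    return np.quantile(slopes, [alpha / 2, 1 - alpha / 2])
```

A full confidence band would repeat this for the fitted line at each x value rather than for the slope alone.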

  14. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing.

    PubMed

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-02-01

A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing assays (NGS). NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  15. Quality Management in Career Services, a la Deming.

    ERIC Educational Resources Information Center

    Korschgen, Ann J.; Rounds, Dan

    1992-01-01

    Describes career services program at University of Wisconsin-La Crosse that adapted material from W.E. Deming's quality control manual to the needs of its office. Discusses Deming's work and its implications for career services professionals, then describes application of Deming's model to the career services program. (NB)

  16. Dr. Deming and the Improvement of Schooling: No Instant Pudding.

    ERIC Educational Resources Information Center

    Holt, Maurice

    1993-01-01

    To keep Deming's ideas alive, this article reexamines how Deming's philosophy connects with education and discusses implications for educators, parents, and legislators. Expecting to achieve "quality" results in little time encourages "quick fixes" and a crisis mentality that distort Deming's message and lead nowhere. There are…

  17. Comparison of the vanadate oxidase method with the diazo method for serum bilirubin determination in dog, monkey, and rat.

    PubMed

    Ameri, Mehrdad; Schnaars, Henry; Sibley, John; Honor, David

    2011-01-01

The most widely used method for bilirubin concentration determination is the diazo method, which measures the color of azobilirubin. The vanadate oxidase method is based on oxidation of bilirubin to biliverdin by vanadate. The objective of this study was to compare total and direct bilirubin concentration ([Bt] and [Bd], respectively) determined by the diazo and vanadate oxidase methods in pooled serum samples from dogs, monkeys, and rats spiked with panels of different concentrations of bilirubin standards. Pooled serum samples from 40 dogs, 40 monkeys, and 60 rats were spiked with either ditaurine conjugates of bilirubin or a standard reference material. The results obtained from both assays were compared using Deming regression analysis. The intra- and interassay precision, expressed as a percentage of the coefficient of variation (%CV), was determined for [Bt] and [Bd], and the mean percentage of recovery was calculated. The vanadate oxidase method displayed an excellent correlation (r = 0.99-1.00) with the diazo method. Using Deming regression, there were minimal negative or positive constant and proportional biases for [Bt] and [Bd]. The precision studies revealed that the vanadate oxidase method has comparable between-run and within-run CVs to those of the diazo method. The recovery study demonstrated that the diazo method more closely approximates the expected values of [Bt]. In conclusion, the vanadate oxidase method is a simple and rapid method that can be employed as an alternative to the diazo method when interfering substances are present in the serum samples of dog, monkey, and rat.

  18. Linear regression analysis for comparing two measurers or methods of measurement: but which regression?

    PubMed

    Ludbrook, John

    2010-07-01

    1. There are two reasons for wanting to compare measurers or methods of measurement. One is to calibrate one method or measurer against another; the other is to detect bias. Fixed bias is present when one method gives higher (or lower) values across the whole range of measurement. Proportional bias is present when one method gives values that diverge progressively from those of the other. 2. Linear regression analysis is a popular method for comparing methods of measurement, but the familiar ordinary least squares (OLS) method is rarely acceptable. The OLS method requires that the x values are fixed by the design of the study, whereas it is usual that both y and x values are free to vary and are subject to error. In this case, special regression techniques must be used. 3. Clinical chemists favour techniques such as major axis regression ('Deming's method'), the Passing-Bablok method or the bivariate least median squares method. Other disciplines, such as allometry, astronomy, biology, econometrics, fisheries research, genetics, geology, physics and sports science, have their own preferences. 4. Many Monte Carlo simulations have been performed to try to decide which technique is best, but the results are almost uninterpretable. 5. I suggest that pharmacologists and physiologists should use ordinary least products regression analysis (geometric mean regression, reduced major axis regression): it is versatile, can be used for calibration or to detect bias and can be executed by hand-held calculator or by using the loss function in popular, general-purpose, statistical software.
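The ordinary least products (geometric mean, reduced major axis) regression that Ludbrook recommends has a simple closed form: the slope is sign(r) times the ratio of the standard deviations, and the line passes through the means. The sketch below is an illustration of that formula, not code from the article.

```python
import numpy as np

def olp_regression(x, y):
    """Ordinary least products (geometric mean / reduced major axis)
    regression: slope = sign(r) * sd(y) / sd(x), line through the means.
    Returns (slope, intercept)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    r = np.corrcoef(x, y)[0, 1]
    slope = np.copysign(y.std(ddof=1) / x.std(ddof=1), r)
    return slope, y.mean() - slope * x.mean()
```

As the abstract notes, the result can be computed by hand-held calculator: it needs only the two standard deviations, the two means and the sign of the correlation.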

  19. 40 CFR 721.9517 - Siloxanes and silicones, de-Me, 3-[4-[[[3-(dimethyl amino) propyl] amino]carbonyl]-2-oxo-1...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 30 2010-07-01 2010-07-01 false Siloxanes and silicones, de-Me, 3-[4... Siloxanes and silicones, de-Me, 3-[4-[[[3-(dimethyl amino) propyl] amino]carbonyl]-2-oxo-1-pyrrolidinyl... substance identified as siloxanes and silicones, de-Me, 3-[4-[[[3-(dimethylamino) propyl]amino] carbonyl]-2...

  20. 40 CFR 721.9517 - Siloxanes and silicones, de-Me, 3-[4-[[[3-(dimethyl amino) propyl] amino]carbonyl]-2-oxo-1...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 31 2014-07-01 2014-07-01 false Siloxanes and silicones, de-Me, 3-[4... Siloxanes and silicones, de-Me, 3-[4-[[[3-(dimethyl amino) propyl] amino]carbonyl]-2-oxo-1-pyrrolidinyl... substance identified as siloxanes and silicones, de-Me, 3-[4-[[[3-(dimethylamino) propyl]amino] carbonyl]-2...

  1. 40 CFR 721.9517 - Siloxanes and silicones, de-Me, 3-[4-[[[3-(dimethyl amino) propyl] amino]carbonyl]-2-oxo-1...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 32 2012-07-01 2012-07-01 false Siloxanes and silicones, de-Me, 3-[4... Siloxanes and silicones, de-Me, 3-[4-[[[3-(dimethyl amino) propyl] amino]carbonyl]-2-oxo-1-pyrrolidinyl... substance identified as siloxanes and silicones, de-Me, 3-[4-[[[3-(dimethylamino) propyl]amino] carbonyl]-2...

  2. 40 CFR 721.9517 - Siloxanes and silicones, de-Me, 3-[4-[[[3-(dimethyl amino) propyl] amino]carbonyl]-2-oxo-1...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 31 2011-07-01 2011-07-01 false Siloxanes and silicones, de-Me, 3-[4... Siloxanes and silicones, de-Me, 3-[4-[[[3-(dimethyl amino) propyl] amino]carbonyl]-2-oxo-1-pyrrolidinyl... substance identified as siloxanes and silicones, de-Me, 3-[4-[[[3-(dimethylamino) propyl]amino] carbonyl]-2...

  3. 40 CFR 721.9517 - Siloxanes and silicones, de-Me, 3-[4-[[[3-(dimethyl amino) propyl] amino]carbonyl]-2-oxo-1...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 32 2013-07-01 2013-07-01 false Siloxanes and silicones, de-Me, 3-[4... Siloxanes and silicones, de-Me, 3-[4-[[[3-(dimethyl amino) propyl] amino]carbonyl]-2-oxo-1-pyrrolidinyl... substance identified as siloxanes and silicones, de-Me, 3-[4-[[[3-(dimethylamino) propyl]amino] carbonyl]-2...

  4. The IAEA neutron coincidence counting (INCC) and the DEMING least-squares fitting programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krick, M.S.; Harker, W.C.; Rinard, P.M.

    1998-12-01

Two computer programs are described: (1) the INCC (IAEA or International Neutron Coincidence Counting) program and (2) the DEMING curve-fitting program. The INCC program is an IAEA version of the Los Alamos NCC (Neutron Coincidence Counting) code. The DEMING program is an upgrade of earlier Windows® and DOS codes with the same name. The versions described are INCC 3.00 and DEMING 1.11. The INCC and DEMING codes provide inspectors with the software support needed to perform calibration and verification measurements with all of the neutron coincidence counting systems used in IAEA inspections for the nondestructive assay of plutonium and uranium.

  5. Use of dried blood spots for the determination of serum concentrations of tamoxifen and endoxifen.

    PubMed

    Jager, N G L; Rosing, H; Schellens, J H M; Beijnen, J H; Linn, S C

    2014-07-01

The anti-estrogenic effect of tamoxifen is suggested to be mainly attributable to its metabolite (Z)-endoxifen, and a minimum therapeutic threshold for (Z)-endoxifen in serum has been proposed. The objective of this research was to establish the relationship between dried blood spot (DBS) and serum concentrations of tamoxifen and (Z)-endoxifen to allow the use of DBS sampling, a simple and patient-friendly alternative to venous sampling, in clinical practice. Paired DBS and serum samples were obtained from 50 patients using tamoxifen and analyzed using HPLC-MS/MS. Serum concentrations were calculated from DBS concentrations using the formula: calculated serum concentration = DBS concentration / ([1 - haematocrit (Hct)] + blood cell-to-serum ratio × Hct). The blood cell-to-serum ratio was determined ex vivo by incubating a batch of whole blood spiked with both analytes. The average Hct for female adults was imputed as a fixed value. Calculated and analyzed serum concentrations were compared using weighted Deming regression. Weighted Deming regression analysis comparing 44 matching pairs of DBS and serum samples showed a proportional bias for both analytes. Serum concentrations were calculated using [Tamoxifen]serum,calculated = [Tamoxifen]DBS/0.779 and [(Z)-Endoxifen]serum,calculated = [(Z)-Endoxifen]DBS/0.663. Calculated serum concentrations were within 20% of analyzed serum concentrations in 84% and 100% of patient samples for tamoxifen and (Z)-endoxifen, respectively. In conclusion, DBS concentrations of tamoxifen and (Z)-endoxifen were equal to serum concentrations after correction for Hct and blood cell-to-serum ratio. DBS sampling can be used in clinical practice.
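The haematocrit correction quoted in the abstract can be written directly as a function. This is a sketch of the published formula only, not the authors' software; the fixed divisors 0.779 (tamoxifen) and 0.663 ((Z)-endoxifen) presumably correspond to this denominator evaluated at the study's imputed Hct and each analyte's measured blood cell-to-serum ratio.

```python
def dbs_to_serum(dbs_conc, hct, cell_to_serum_ratio):
    """Estimate a serum concentration from a dried-blood-spot concentration:

        serum = DBS / ((1 - Hct) + blood_cell_to_serum_ratio * Hct)

    where Hct is the (imputed) haematocrit fraction and the ratio is the
    analyte's blood cell-to-serum partition ratio determined ex vivo."""
    return dbs_conc / ((1.0 - hct) + cell_to_serum_ratio * hct)
```

When the denominator is below 1, as for both analytes here, the calculated serum concentration is higher than the DBS concentration, consistent with the proportional bias the weighted Deming regression detected.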

  6. Stable estimate of primary OC/EC ratios in the EC tracer method

    NASA Astrophysics Data System (ADS)

    Chu, Shao-Hang

    In fine particulate matter studies, the primary OC/EC ratio plays an important role in estimating the secondary organic aerosol contribution to PM2.5 concentrations using the EC tracer method. In this study, numerical experiments are carried out to test and compare various statistical techniques in the estimation of primary OC/EC ratios. The influence of random measurement errors in both primary OC and EC measurements on the estimation of the expected primary OC/EC ratios is examined. It is found that random measurement errors in EC generally create an underestimation of the slope and an overestimation of the intercept of the ordinary least-squares regression line. The Deming regression analysis performs much better than the ordinary regression, but it tends to overcorrect the problem by slightly overestimating the slope and underestimating the intercept. Averaging the ratios directly is usually undesirable because the average is strongly influenced by unrealistically high values of OC/EC ratios resulting from random measurement errors at low EC concentrations. The errors generally result in a skewed distribution of the OC/EC ratios even if the parent distributions of OC and EC are close to normal. When measured OC contains a significant amount of non-combustion OC, Deming regression is a much better tool and should be used to estimate both the primary OC/EC ratio and the non-combustion OC. However, if the non-combustion OC is negligibly small, the best and most robust estimator of the OC/EC ratio turns out to be the simple ratio of the OC and EC averages. It not only reduces random errors by averaging individual variables separately but also acts as a weighted average of ratios to minimize the influence of unrealistically high OC/EC ratios created by measurement errors at low EC concentrations. The median of OC/EC ratios ranks a close second, and the geometric mean of ratios ranks third.
This is because these estimators are insensitive to questionable extreme values. A real world example is given using the ambient data collected from an Atlanta STN site during the winter of 2001-2002.
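The estimators compared above can be illustrated on synthetic data. This sketch assumes a true primary OC/EC ratio of 2.0 with random error added to both OC and EC; all numeric values are assumptions for demonstration, not data from the study.

```python
# Compare simple estimators of the primary OC/EC ratio on synthetic data
# with random measurement error in both OC and EC.
import math
import random
import statistics

random.seed(0)
TRUE_RATIO = 2.0
ec_true = [random.uniform(0.3, 3.0) for _ in range(500)]
ec = [max(0.05, e + random.gauss(0, 0.1)) for e in ec_true]    # noisy EC
oc = [TRUE_RATIO * e + random.gauss(0, 0.1) for e in ec_true]  # noisy OC

ratios = [o / e for o, e in zip(oc, ec)]
estimates = {
    # Ratio of the averages: robust when non-combustion OC is negligible.
    "ratio of averages": sum(oc) / sum(ec),
    # Direct mean of ratios: sensitive to inflated ratios at low EC.
    "mean of ratios": statistics.mean(ratios),
    "median of ratios": statistics.median(ratios),
    "geometric mean": math.exp(statistics.fmean(math.log(r) for r in ratios)),
}
for name, est in estimates.items():
    print(f"{name:18s} {est:.3f}")
```

With these settings the ratio of averages stays close to the true value because the independent zero-mean errors largely cancel in the separate sums, mirroring the argument in the abstract.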

  7. Bioimpedance analysis vs. DEXA as a screening tool for osteosarcopenia in lean, overweight and obese Caucasian postmenopausal females.

    PubMed

    Peppa, Melpomeni; Stefanaki, Charikleia; Papaefstathiou, Athanasios; Boschiero, Dario; Dimitriadis, George; Chrousos, George P

    2017-04-01

    We aimed at evaluating the efficiency of a newly developed, advanced Bioimpedance Analysis (BIA-ACC®) device as a screening tool for determining the degree of obesity and osteosarcopenia in postmenopausal women with normal or decreased bone density determined by Dual-Energy X-Ray absorptiometry (DEXA) in a representative sample of Greek postmenopausal women. This is a single-gate cross-sectional study of body composition measured by BIA-ACC® and DEXA. Postmenopausal females with BMI ranging from 18.5 to 40 kg/m2 were subjected to two consecutive measurements of DEXA and BIA-ACC® within 5-10 minutes of each other. We used Pearson's coefficient to examine linear correlations, the intraclass correlation coefficient (ICC) to test reliability, Bland-Altman plots to assess bias and Deming regressions to establish the agreement in parameters measured by BIA-ACC® and DEXA. Last, we used ANOVA, with Bonferroni correction and Dunnett T3 post hoc tests, for assessing differences in quantitative variables, and Pearson's χ2 for qualitative variables. Our sample consisted of 84 overweight/obese postmenopausal women, aged 39-83 years, of whom 22 had normal bone density, 38 had osteopenia and 24 had osteoporosis based on DEXA measurements, using quota sampling. ICCs and Deming regressions showed strong agreement between BIA-ACC® and DEXA and demonstrated minimal proportional differences of no apparent clinical significance. Bland-Altman plots indicated minimal biases. Fat, skeletal and bone mass measured by BIA-ACC® and DEXA were increased in the non-osteopenic/non-osteoporotic women compared with those of the osteopenic and osteoporotic groups. BIA-ACC® is a rapid, bloodless and useful screening tool for determining body composition, adiposity and the presence of osteosarcopenic features in postmenopausal women.
Women with osteopenia and osteoporosis evaluated by DEXA had decreased fat, skeletal and bone mass compared with normal bone density women, suggesting concordance in the change of these three organ masses in postmenopausal women.
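The Bland-Altman analysis used in the record above reduces to a short computation: the bias is the mean of the paired differences and the 95% limits of agreement are bias ± 1.96 SD. The paired values below are synthetic, not study data.

```python
# Minimal Bland-Altman agreement computation on synthetic paired measurements.
import statistics

method_a = [10.2, 12.1, 9.8, 11.5, 13.0, 10.9]  # e.g., device A readings
method_b = [10.0, 12.5, 9.5, 11.9, 12.6, 11.2]  # e.g., device B readings

diffs = [a - b for a, b in zip(method_a, method_b)]
bias = statistics.mean(diffs)
sd = statistics.stdev(diffs)
# 95% limits of agreement: bias +/- 1.96 * SD of the differences.
loa = (bias - 1.96 * sd, bias + 1.96 * sd)
print(f"bias={bias:.3f}, LoA=({loa[0]:.3f}, {loa[1]:.3f})")
```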

  8. Deming on Education: A View from the Seminar.

    ERIC Educational Resources Information Center

    Holt, Maurice

    1993-01-01

    W. Edwards Deming rejects school-improvement proposals based on formulating higher standards and enforcing them with performance assessments. To Deming, the Bush Administration's America 2000 education strategy is "a horrible example of numerical goals, tests, rewards, but no method." Emphasizing rationalist performance measures…

  9. Acute Ozone-Induced Pulmonary and Systemic Metabolic Effects Are Diminished in Adrenalectomized Rats

    PubMed Central

    Miller, Desinia B.; Snow, Samantha J.; Schladweiler, Mette C.; Richards, Judy E.; Ghio, Andrew J.; Ledbetter, Allen D.; Kodavanti, Urmila P.

    2016-01-01

    Acute ozone exposure increases circulating stress hormones and induces metabolic alterations in animals. We hypothesized that the increase of adrenal-derived stress hormones is necessary for both ozone-induced metabolic effects and lung injury. Male Wistar-Kyoto rats underwent bilateral adrenal demedullation (DEMED), total bilateral adrenalectomy (ADREX), or sham surgery (SHAM). After a 4 day recovery, rats were exposed to air or ozone (1 ppm), 4 h/day for 1 or 2 days and responses assessed immediately postexposure. Circulating adrenaline levels dropped to nearly zero in DEMED and ADREX rats relative to SHAM. Corticosterone tended to be low in DEMED rats and dropped to nearly zero in ADREX rats. Adrenalectomy in air-exposed rats caused modest changes in metabolites and lung toxicity parameters. Ozone-induced hyperglycemia and glucose intolerance were markedly attenuated in DEMED rats with nearly complete reversal in ADREX rats. Ozone increased circulating epinephrine and corticosterone in SHAM but not in DEMED or ADREX rats. Free fatty acids (P = .15) and branched-chain amino acids increased after ozone exposure in SHAM but not in DEMED or ADREX rats. Lung minute volume was not affected by surgery or ozone but ozone-induced labored breathing was less pronounced in ADREX rats. Ozone-induced increases in lung protein leakage and neutrophilic inflammation were markedly reduced in DEMED and ADREX rats (ADREX > DEMED). Ozone-mediated decreases in circulating white blood cells in SHAM were not observed in DEMED and ADREX rats. We demonstrate that ozone-induced peripheral metabolic effects and lung injury/inflammation are mediated through adrenal-derived stress hormones, likely via activation of the stress response pathway. PMID:26732886

  10. The Deming Method: Systems Theory for Educational Technology Services.

    ERIC Educational Resources Information Center

    Richie, Mark L.

    1993-01-01

    Discusses quality management principles as taught by W. Edwards Deming and describes their applications to educational technology services. Traditional organizational charts are explained; and benefits of using flow charts in Deming's systems are described, including better communications between departments, building teamwork, and opportunities…

  11. Restructuring Schools by Applying Deming's Management Theories.

    ERIC Educational Resources Information Center

    Melvin, Charles A., III

    1991-01-01

    Four school districts adopted a school restructuring project using Deming's business management method. Deming offered alternative views of organizations based on psychology, systems, perceptual framework, and causes of variance. He listed 14 points for quality improvement. Evaluation indicated that key staff members willingly engaged in…

  12. Dr. Deming Comes to Class.

    ERIC Educational Resources Information Center

    Gartner, William B.

    1993-01-01

    One college faculty member's experiences in applying Deming management theory to his business courses to improve instruction are discussed. Key issues in the Deming philosophy are outlined, course changes based on them are described, and outcomes are examined. Suggestions are offered for overcoming institutional and ideological barriers. (MSE)

  13. Contemporary divergence in early life history in grayling (Thymallus thymallus).

    PubMed

    Thomassen, Gaute; Barson, Nicola J; Haugen, Thrond O; Vøllestad, L Asbjørn

    2011-12-13

    Following colonization of new habitats and subsequent selection, adaptation to environmental conditions might be expected to be rapid. In a mountain lake in Norway, Lesjaskogsvatnet, more than 20 distinct spawning demes of grayling have been established since the lake was colonized, some 20-25 generations ago. The demes spawn in tributaries consistently exhibiting either colder or warmer temperature conditions during spawning in spring and subsequent early development during early summer. In order to explore the degree of temperature-related divergence in early development, a multi-temperature common-garden experiment was performed on embryos from four different demes experiencing different spring temperatures. Early developmental characters were measured to test if individuals from the four demes respond differently to the treatment temperatures. There was clear evidence of among-deme differences (genotype–environment interactions) in larval growth and yolk-to-body-size conversion efficiency. Under the cold treatment regime, larval growth rates were highest for individuals belonging to cold streams. Individuals from warm streams had the highest yolk-consumption rate under cold conditions. As a consequence, yolk-to-body-mass conversion efficiency was highest for cold-deme individuals under cold conditions. As we observed response parallelism between individuals from demes belonging to similar thermal groups for these traits, some of the differentiation seems likely to result from local adaptation. The observed differences in length at age during early larval development most likely have a genetic component, even though both directional and random processes are likely to have influenced evolutionary change in the demes under study.

  14. The Educational Consequences of W. Edwards Deming.

    ERIC Educational Resources Information Center

    Holt, Maurice

    1993-01-01

    Taylorism (the rational-managerial model) still dominates U.S. education. Deming's quality and improvement concepts cut much deeper than "total quality management" externalities and differ markedly from management by objectives or outcome-based education approaches. The Deming approach is no quick fix but requires a fundamental change in…

  15. Think Quality! The Deming Approach Does Work in Libraries.

    ERIC Educational Resources Information Center

    Mackey, Terry; Mackey, Kitty

    1992-01-01

    Presents W. Edwards Deming's Total Quality Management method and advocates its adoption in libraries. The 14 points that form the basis of Deming's philosophy are discussed in the context of the library setting. A flow chart of the reference process and user survey questions are included. (MES)

  16. Total Quality Management Case Study in a Navy Headquarters Organization

    DTIC Science & Technology

    1990-02-01

    APPENDIX B--DEMING’S 14 PRINCIPLES OF MANAGEMENT...APPENDIX C--NAVAIR QM B CHARTER...Taylor, Logistics Intern...DEMING’S 14 PRINCIPLES OF MANAGEMENT: 1. Create constancy of purpose

  17. Process Approach to Determining Quality Inspection Deployment

    DTIC Science & Technology

    2015-06-08

    B.1 The Deming Rule...k1/k2? [5] At this stage it is assumed that the manufacturing process is capable and that inspection is effective. The Deming rule is explained in...justify reducing inspectors. (See Appendix B for Deming rule discussion.) Three quantities must be determined: p, the probability of a nonconformity

  18. Deming's System of Profound Knowledge: An Overview for International Educators.

    ERIC Educational Resources Information Center

    Evans, Thomas J.

    W. Edwards Deming called for the transformation to a new style of organizational management based on greater cooperation between managers and employees. This transformation could be achieved by introducing "profound knowledge" into the system. This paper is a presentation outline that was used to introduce the basics of Deming's theory…

  19. Process Approach to Determining Quality Inspection Deployment

    DTIC Science & Technology

    2015-06-08

    B.1 The Deming Rule...inspection is effective. The Deming rule is explained in more detail in Appendix B. Basically the probability of error is compared to the cost of...cost or risk to repair defects escaped from inspection justify reducing inspectors. (See Appendix B for Deming rule discussion.) Three quantities must

  20. Why Deming and OBE Don't Mix.

    ERIC Educational Resources Information Center

    Holt, Maurice

    1995-01-01

    The central idea in W. Edwards Deming's approach to quality management is the need to improve process. Outcome-based education's central defect is its failure to address process. Deming would reject OBE along with management-by-objectives. Education is not a product defined by specific output measures, but a process to develop the mind. (MLH)

  1. Quality in Higher Education: The Contribution of Edward Deming's Principles

    ERIC Educational Resources Information Center

    Redmond, Richard; Curtis, Elizabeth; Noone, Tom; Keenan, Paul

    2008-01-01

    Purpose--There can be little doubt about the importance and relevance of quality for any service industry. One of the most influential contributors to service quality developments was W. Edwards Deming (1900-1993). An important component of Deming's philosophy is reflected in his 14 principles for transforming a service as they indicate what…

  2. Modeling the expected lifetime and evolution of a deme's principal genetic sequence.

    NASA Astrophysics Data System (ADS)

    Clark, Brian

    2014-03-01

    The principal genetic sequence (PGS) is the most common genetic sequence in a deme. The PGS changes over time because new genetic sequences are created by inversions, compete with the current PGS, and a small fraction become PGSs. A set of coupled difference equations provides a description of the evolution of the PGS distribution function in an ensemble of demes. Solving the set of equations produces the survival probability of a new genetic sequence and the expected lifetime of an existing PGS as a function of inversion size and rate, recombination rate, and deme size. Additionally, the PGS distribution function is used to explain the transition pathway from old to new PGSs. We compare these results to a cellular-automaton-based representation of a deme and the Drosophila species D. melanogaster and D. yakuba.

  3. Acute Ozone-Induced Pulmonary and Systemic Metabolic ...

    EPA Pesticide Factsheets

    Acute ozone exposure increases circulating stress hormones and induces metabolic alterations in animals and humans. We hypothesized that the increase of adrenal-derived stress hormones is necessary for both ozone-induced metabolic effects and lung injury. Male Wistar-Kyoto rats underwent adrenal demedullation (DEMED), total bilateral adrenalectomy (ADREX), or sham surgery (SHAM). After a 4 day recovery, rats were exposed to air or ozone (1 ppm), 4 h/day for 1 or 2 days. Circulating adrenaline levels dropped to nearly zero in DEMED and ADREX rats relative to air-exposed SHAM. Corticosterone levels tended to be low in DEMED rats and dropped to nearly zero in ADREX rats. Adrenalectomy in air-exposed rats caused modest changes in metabolites and lung toxicity parameters. Ozone-induced hyperglycemia and glucose intolerance were markedly attenuated in DEMED rats with nearly complete reversal in ADREX rats. Ozone increased circulating epinephrine and corticosterone in SHAM but not in DEMED or ADREX rats. Free fatty acids (p=0.15) and branched-chain amino acids increased after ozone exposure in SHAM but not in DEMED or ADREX rats. Lung minute volume was not affected by surgery or ozone but ozone-induced labored breathing was less pronounced in ADREX rats. Ozone-induced increases in lung protein leakage and neutrophilic inflammation were markedly reduced in DEMED and ADREX rats (ADREX > DEMED). Ozone-mediated decreases in circulating white blood cells in SHAM were not obser

  4. Acute Ozone-Induced Pulmonary and Systemic Metabolic ...

    EPA Pesticide Factsheets

    Acute ozone exposure increases circulating stress hormones and induces peripheral metabolic alterations in animals and humans. We hypothesized that the increase of adrenal-derived stress hormones is necessary for ozone-induced systemic metabolic effects and lung injury. Male Wistar-Kyoto rats (12 week-old) underwent total bilateral adrenalectomy (ADREX), adrenal demedullation (DEMED) or sham surgery (SHAM). After 4 day recovery, rats were exposed to air or ozone (1 ppm), 4 h/day for 1 or 2 days. Circulating adrenaline levels dropped to nearly zero in DEMED and ADREX rats relative to air-exposed SHAM. Corticosterone levels tended to be low in DEMED rats and dropped to nearly zero in ADREX rats. Adrenalectomy in air-exposed rats caused modest changes in metabolites and lung toxicity parameters. Ozone-induced hyperglycemia and glucose intolerance were markedly attenuated in DEMED rats with nearly complete reversal in ADREX rats. Ozone increased circulating epinephrine and corticosterone in SHAM but not in DEMED or ADREX rats. Free fatty acids and branched-chain amino acids tended to increase after ozone exposure in SHAM but not in DEMED or ADREX rats. Lung minute volume was not affected by surgery or ozone but ozone-induced labored breathing was less pronounced in ADREX rats. Ozone-induced increases in lung protein leakage and neutrophilic inflammation were markedly reduced in DEMED and ADREX rats (ADREX > DEMED). Ozone-mediated decrease in circulating WBC in SHAM was not

  5. The Reconciliation of W. Edwards Deming and John Dewey: An Exploration of Similarities in Motivation Theory.

    ERIC Educational Resources Information Center

    Towns, William C.

    1996-01-01

    Interrogates similarities and misconceptions common to W. Edwards Deming and John Dewey, examining a reconciliation of the two within the context of motivation theory and concluding that Deming and Dewey are very similar in general outlook and the shared belief in the integrity of the individual within the social system. (SM)

  6. The Case of Deming, New Mexico: International Public Education. Multicultural Videocase Series.

    ERIC Educational Resources Information Center

    Herbert, Joanne M., Ed.; McNergney, Robert F., Ed.

    This guide accompanies one of a pair of videocases depicting educational life in Deming, New Mexico. The videocase includes 28 minutes of unstaged but edited videotape footage of teaching and learning in and around junior high and mid-high schools in Deming. The first section of the guide, "Teaching Note" (Todd Kent) contains a…

  7. Comparison of multiple methods to measure maternal fat mass in late gestation12

    PubMed Central

    Marshall, Nicole E; Murphy, Elizabeth J; King, Janet C; Haas, E Kate; Lim, Jeong Y; Wiedrick, Jack; Thornburg, Kent L; Purnell, Jonathan Q

    2016-01-01

    Background: Measurements of maternal fat mass (FM) are important for studies of maternal and fetal health. Common methods of estimating FM have not been previously compared in pregnancy with measurements using more complete body composition models. Objectives: The goal of this pilot study was to compare multiple methods that estimate FM, including 2-, 3- and 4-compartment models in pregnant women at term, and to determine how these measures compare with FM by dual-energy X-ray absorptiometry (DXA) 2 wk postpartum. Design: Forty-one healthy pregnant women with prepregnancy body mass index (in kg/m2) 19 to 46 underwent skinfold thickness (SFT), bioelectrical impedance analysis (BIA), body density (Db) via air displacement plethysmography (ADP), and deuterium dilution of total body water (TBW) with and without adjustments for gestational age using van Raaij (VRJ) equations at 37–38 wk of gestation and 2 wk postpartum to derive 8 estimates of maternal FM. Deming regression analysis and Bland-Altman plots were used to compare methods of FM assessment. Results: Systematic differences in FM estimates were found. Methods for FM estimates from lowest to highest were 4-compartment, DXA, TBW(VRJ), 3-compartment, Db(VRJ), BIA, air displacement plethysmography body density, and SFT ranging from a mean ± SD of 29.5 ± 13.2 kg via 4-compartment to 39.1 ± 11.7 kg via SFT. Compared with postpartum DXA values, Deming regressions revealed no substantial departures from trend lines in maternal FM in late pregnancy for any of the methods. The 4-compartment method showed substantial negative (underestimating) constant bias, and the air displacement plethysmography body density and SFT methods showed positive (overestimating) constant bias. ADP via Db(VRJ) and 3-compartment methods had the highest precision; BIA had the lowest. 
Conclusions: ADP that uses gestational age-specific equations may provide a reasonable and practical measurement of maternal FM across a spectrum of body weights in late pregnancy. SFT would be acceptable for use in larger studies. This trial was registered at clinicaltrials.gov as NCT02586714. PMID:26888714

  8. Quality Management for Educational Technology Services: A Guide to Application of the Deming Management Method for District, University and Regional Media and Technology Centers.

    ERIC Educational Resources Information Center

    Richie, Mark L.

    This book shows how the quality management approach pioneered in Japan by Dr. W. Edwards Deming allows educational service centers to expand services and be more flexible by reducing waste and rework. Deming's method shows how to change from reactive management to a dynamic system of continuous improvement that restores worker pride, increases…

  9. Microhabitat Types Promote the Genetic Structure of a Micro-Endemic and Critically Endangered Mole Salamander (Ambystoma leorae) of Central Mexico

    PubMed Central

    Sunny, Armando; Monroy-Vilchis, Octavio; Reyna-Valencia, Carlos; Zarco-González, Martha M.

    2014-01-01

    The reduced immigration and emigration rates resulting from the lack of landscape connectivity of patches and the hospitality of the intervening matrix could favor the loss of alleles through genetic drift and an increased chance of inbreeding. In order for isolated populations to maintain sufficient levels of genetic diversity and adapt to environmental changes, one important conservation goal must be to preserve or reestablish connectivity among patches in a fragmented landscape. We studied the last known population of Ambystoma leorae, an endemic and critically threatened species. The aims of this study were: (1) to assess the demographic parameters of A. leorae and to distinguish and characterize the microhabitats in the river, (2) to determine the number of existing genetic groups or demes of A. leorae and to describe possible relationships between microhabitat types and demes, (3) to determine gene flow between demes, and (4) to search for geographic locations of genetic discontinuities that limit gene flow between demes. We found three types of microhabitats and three genetically differentiated subpopulations with a significant level of genetic structure. In addition, we found slight genetic barriers. Our results suggest that mole salamander species are very sensitive to microhabitat features and relatively narrow obstacles in their path. The estimates of bidirectional gene flow are consistent with the pattern of a stepping stone model between demes, where migration occurs between adjacent demes, but there is low gene flow between distant demes. We can also conclude that there is a positive correlation between microhabitats and genetic structure in this population. PMID:25076052

  10. Evaluation of Government Quality Assurance Oversight for DoD Acquisition Programs

    DTIC Science & Technology

    2014-11-03

    W. Edwards Deming, Joseph M. Juran,” July 1992, the Aerospace Standard (AS) 9100, and an industrially recognized quality management handbook... Deming, Joseph M. Juran,” July 1992, and Juran’s Quality Handbook (5th Edition). The Navy TQLO publication summarizes the best practices developed by...world renowned quality management experts who have set quality management best practices for more than 50 years; Philip B. Crosby, W. Edwards Deming, and

  11. Total Quality Leadership as it Applies to the Surface Navy

    DTIC Science & Technology

    1990-12-01

    with statistical control methods. Dr. Deming opened the eyes of the Japanese. They embraced his ideas and accepted his 14 principles of management shown...move closer to fully embracing Deming’s fourteen principles of management. 3. Shipboard Leadership Compared To TQL Many activities on board Navy ships...The results of the comparison of Deming’s principles of management and the Navalized TQL principles show both similarities and differences do appear

  12. Determinants of genetic structure in a nonequilibrium metapopulation of the plant Silene latifolia.

    PubMed

    Fields, Peter D; Taylor, Douglas R

    2014-01-01

    Population genetic differentiation will be influenced by the demographic history of populations, opportunities for migration among neighboring demes and founder effects associated with repeated extinction and recolonization. In natural populations, these factors are expected to interact with each other and their magnitudes will vary depending on the spatial distribution and age structure of local demes. Although each of these effects has been individually identified as important in structuring genetic variance, their relative magnitude is seldom estimated in nature. We conducted a population genetic analysis in a metapopulation of the angiosperm, Silene latifolia, from which we had more than 20 years of data on the spatial distribution, demographic history, and extinction and colonization of demes. We used hierarchical Bayesian methods to disentangle which features of the populations contributed to among population variation in allele frequencies, including the magnitude and direction of their effects. We show that population age, long-term size and degree of connectivity all combine to affect the distribution of genetic variance; small, recently-founded, isolated populations contributed most to increase FST in the metapopulation. However, the effects of population size and population age are best understood as being modulated through the effects of connectivity to other extant populations, i.e. FST diminishes as populations age, but at a rate that depends on how isolated the population is. These spatial and temporal correlates of population structure give insight into how migration, founder effect and within-deme genetic drift have combined to enhance and restrict genetic divergence in a natural metapopulation.

  13. Fixation Times in Deme Structured, Finite Populations with Rare Migration

    NASA Astrophysics Data System (ADS)

    Hauert, Christoph; Chen, Yu-Ting; Imhof, Lorens A.

    2014-08-01

    Population structure affects both the outcome and the speed of evolutionary dynamics. Here we consider a finite population that is divided into subpopulations called demes. The dynamics within the demes are stochastic and frequency-dependent. Individuals can adopt one of two strategic types, A or B. The fitness of each individual is determined by interactions with other individuals in the same deme. With small probability, proportional to fitness, individuals migrate to other demes. The outcome of these dynamics has been studied earlier by analyzing the fixation probability of a single mutant in an otherwise homogeneous population. These results give only a partial picture of the dynamics, because the time when fixation occurs can be exceedingly large. In this paper, we study the impact of deme structures on the speed of evolution. We derive analytical approximations of fixation times in the limit of rare migration and rare mutation. In this limit, the conditional fixation time of a single A mutant in a B population is the same as that of a single B mutant in an A population. For the prisoner's dilemma game, simulation results fit very well with our analytical predictions and demonstrate that fixation takes place in a moderate amount of time as compared to the expected waiting time until a mutant successfully invades and fixates. The simulations also confirm that the conditional fixation time of a single cooperator is indeed the same as that of a single defector.

  14. Performance Evaluation: A Deadly Disease?

    ERIC Educational Resources Information Center

    Aluri, Rao; Reichel, Mary

    1994-01-01

    W. Edwards Deming condemned performance evaluations as a deadly disease afflicting American management. He argued that performance evaluations nourish fear, encourage short-term thinking, stifle teamwork, and are no better than lotteries. This article examines library literature from Deming's perspective. Although that literature accepts…

  15. Temporal change in the electromechanical properties of dielectric elastomer minimum energy structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buchberger, G., E-mail: erda.buchberger@jku.at; Hauser, B.; Jakoby, B.

    Dielectric elastomer minimum energy structures (DEMES) are soft electronic transducers and energy harvesters with potential for consumer goods. The temporal change in their electromechanical properties is of major importance for engineering tasks. Therefore, we study acrylic DEMES by impedance spectroscopy and by optical methods for a total time period of approx. 4.5 months. We apply either compliant electrodes from carbon black particles only or fluid electrodes from a mixture of carbon black particles and silicone oil. From the measurement data, the equivalent series capacitances and resistances as well as the bending angles of the transducers are obtained. We find that the equivalent series capacitances change on average between −12 %/1000 h and −4.0 %/1000 h, while the bending angles decrease linearly with slopes ranging from −15 %/1000 h to −7 %/1000 h. Transducers with high initial bending angles and electrodes from carbon black particles show the smallest changes of the electromechanical characteristics. The capacitances decrease faster for DEMES with fluid electrodes. Some DEMES of this type reveal huge and unpredictable fluctuations of the resistances over time due to the ageing of the contacts. Design guidelines for DEMES follow directly from the observed transient changes of their electromechanical performance.

  16. Simultaneous Determination of Underivatized Vitamin B1 and B6 in Whole Blood by Reversed Phase Ultra High Performance Liquid Chromatography Tandem Mass Spectrometry

    PubMed Central

    Puts, Johan; de Groot, Monique; Haex, Martin; Jakobs, Bernadette

    2015-01-01

    Background Vitamin B1 (thiamine-diphosphate) and B6 (pyridoxal-5’phosphate) are micronutrients. Analysis of these micronutrients is important to diagnose potential deficiency which often occurs in elderly people due to malnutrition, in severe alcoholism and in gastrointestinal compromise due to bypass surgery or disease. Existing High Performance Liquid Chromatography (HPLC) based methods include the need for derivatization and long analysis time. We developed an Ultra High Performance Liquid Chromatography Tandem Mass Spectrometry (UHPLC-MS/MS) assay with internal standards for simultaneous measurement of underivatized thiamine-diphosphate and pyridoxal-5’phosphate without use of ion pairing reagent. Methods Whole blood, deproteinized with perchloric acid, containing deuterium labelled internal standards thiamine-diphosphate (thiazole-methyl-D3) and pyridoxal-5’phosphate (methyl-D3), was analyzed by UHPLC-MS/MS. The method was validated for imprecision, linearity, recovery and limit of quantification. Alternate (quantitative) method comparisons of the new versus currently used routine HPLC methods were established with Deming regression. Results Thiamine-diphosphate and pyridoxal-5’phosphate were measured within 2.5 minutes instrumental run time. Limits of detection were 2.8 nmol/L and 7.8 nmol/L for thiamine-diphosphate and pyridoxal-5’phosphate respectively. Limit of quantification was 9.4 nmol/L for thiamine-diphosphate and 25.9 nmol/L for pyridoxal-5’phosphate. The total imprecision ranged from 3.5–7.7% for thiamine-diphosphate (44–157 nmol/L) and 6.0–10.4% for pyridoxal-5’phosphate (30–130 nmol/L). Extraction recoveries were 101–102% ± 2.5% (thiamine-diphosphate) and 98–100% ± 5% (pyridoxal-5’phosphate).
Deming regression yielded slopes of 0.926 and 0.990 in patient samples (n = 282) and national proficiency testing samples (n = 12) respectively, intercepts of +3.5 and +3 for thiamine-diphosphate (n = 282 and n = 12) and slopes of 1.04 and 0.84, intercepts of -2.9 and +20 for pyridoxal-5’phosphate (n = 376 and n = 12). Conclusion The described UHPLC-MS/MS method allows simultaneous determination of underivatized thiamine-diphosphate and pyridoxal-5’phosphate in whole blood without intensive sample preparation. PMID:26134844
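    The Deming comparisons above treat both the new UHPLC-MS/MS assay and the routine HPLC method as subject to measurement error, unlike ordinary least squares. A minimal sketch of the closed-form Deming estimator (not the authors' code; the error-variance ratio `delta` must be estimated separately, e.g. from replicate imprecision data):

```python
import math

def deming_fit(x, y, delta=1.0):
    """Closed-form Deming regression of y on x.

    delta is the ratio of the y-method's error variance to the
    x-method's error variance; delta = 1 gives orthogonal regression.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # sample variances and covariance of the paired measurements
    sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    slope = (syy - delta * sxx
             + math.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)
             ) / (2 * sxy)
    intercept = my - slope * mx
    return intercept, slope
```

    On exactly collinear data, e.g. x = [1, 2, 3, 4], y = [3, 5, 7, 9], it recovers slope 2 and intercept 1 for any delta; in a method comparison, a slope near 1 and intercept near 0, as reported above, indicate agreement.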

  17. The Total Quality Classroom.

    ERIC Educational Resources Information Center

    Bonstingl, John Jay

    1992-01-01

    Deming's quality-focused principles contradict Taylor's top-down, authoritarian industrial model. Deming advises educators to fix the system instead of blaming individual teachers for student apathy and low achievement. In the new paradigm, education will be a process that encourages continual progress through improving one's abilities, expanding…

  18. Our Deming Users' Group.

    ERIC Educational Resources Information Center

    Dinklocker, Christina

    1992-01-01

    After training in the Total Quality Management concept, a suburban Ohio school district created a Deming Users' Group to link agencies, individuals, and ideas. The group has facilitated ongoing school/business collaboration, networking among individuals from diverse school systems, mentoring and cooperative learning activities, and resource…

  19. Adrenal-derived stress hormones modulate ozone-induced lung injury and inflammation.

    PubMed

    Henriquez, Andres; House, John; Miller, Desinia B; Snow, Samantha J; Fisher, Anna; Ren, Hongzu; Schladweiler, Mette C; Ledbetter, Allen D; Wright, Fred; Kodavanti, Urmila P

    2017-08-15

    Ozone-induced systemic effects are modulated through activation of the neuro-hormonal stress response pathway. Adrenal demedullation (DEMED) or bilateral total adrenalectomy (ADREX) inhibits systemic and pulmonary effects of acute ozone exposure. To understand the influence of adrenal-derived stress hormones in mediating ozone-induced lung injury/inflammation, we assessed global gene expression (mRNA sequencing) and selected proteins in lung tissues from male Wistar-Kyoto rats that underwent DEMED, ADREX, or sham surgery (SHAM) prior to their exposure to air or ozone (1 ppm), 4 h/day for 1 or 2 days. Ozone exposure significantly changed the expression of over 2300 genes in lungs of SHAM rats, and these changes were markedly reduced in DEMED and ADREX rats. SHAM surgery but not DEMED or ADREX resulted in activation of multiple ozone-responsive pathways, including glucocorticoid, acute phase response, NRF2, and PI3K-AKT. Predicted targets from sequencing data showed a similarity between transcriptional changes induced by ozone and adrenergic and steroidal modulation of effects in SHAM but not ADREX rats. Ozone-induced increases in lung Il6 in SHAM rats coincided with neutrophilic inflammation, but were diminished in DEMED and ADREX rats. Although ozone exposure in SHAM rats did not significantly alter mRNA expression of Ifnγ and Il-4, the IL-4 protein and ratio of IL-4 to IFNγ (IL-4/IFNγ) proteins increased, suggesting a tendency for a Th2 response. This did not occur in ADREX and DEMED rats. We demonstrate that ozone-induced lung injury and neutrophilic inflammation require the presence of circulating epinephrine and corticosterone, which transcriptionally regulate signaling mechanisms involved in this response. Published by Elsevier Inc.

  20. Using structural equation modeling to construct calibration equations relating PM2.5 mass concentration samplers to the federal reference method sampler

    NASA Astrophysics Data System (ADS)

    Bilonick, Richard A.; Connell, Daniel P.; Talbott, Evelyn O.; Rager, Judith R.; Xue, Tao

    2015-02-01

    The objective of this study was to remove systematic bias among fine particulate matter (PM2.5) mass concentration measurements made by different types of samplers used in the Pittsburgh Aerosol Research and Inhalation Epidemiology Study (PARIES). PARIES is a retrospective epidemiology study that aims to provide a comprehensive analysis of the associations between air quality and human health effects in the Pittsburgh, Pennsylvania, region from 1999 to 2008. Calibration was needed in order to minimize the amount of systematic error in PM2.5 exposure estimation as a result of including data from 97 different PM2.5 samplers at 47 monitoring sites. Ordinary regression often has been used for calibrating air quality measurements from pairs of measurement devices; however, this is only appropriate when one of the two devices (the "independent" variable) is free from random error, which is rarely the case. A group of methods known as "errors-in-variables" (e.g., Deming regression, reduced major axis regression) has been developed to handle calibration between two devices when both are subject to random error, but these methods require information on the relative sizes of the random errors for each device, which typically cannot be obtained from the observed data. When data from more than two devices (or repeats of the same device) are available, the additional information is not used to inform the calibration. A more general approach that often has been overlooked is the use of a measurement error structural equation model (SEM) that allows the simultaneous comparison of three or more devices (or repeats). The theoretical underpinnings of all of these approaches to calibration are described, and the pros and cons of each are discussed. In particular, it is shown that both ordinary regression (when used for calibration) and Deming regression are particular examples of SEMs but with substantial deficiencies. 
    To illustrate the use of SEMs, the 7865 daily average PM2.5 mass concentration measurements made by seven collocated samplers at an urban monitoring site in Pittsburgh, Pennsylvania, were used. These samplers, which included three federal reference method (FRM) samplers, three speciation samplers, and a tapered element oscillating microbalance (TEOM), operated at various times during the 10-year PARIES study period. Because TEOM measurements are known to depend on temperature, the constructed SEM provided calibration equations relating the TEOM to the FRM and speciation samplers as a function of ambient temperature. It was shown that TEOM imprecision and TEOM bias (relative to the FRM) both decreased as temperature increased. It was also shown that the temperature dependency for bias was non-linear and followed a sigmoidal (logistic) pattern. The speciation samplers exhibited only small bias relative to the FRM samplers, although the FRM samplers were shown to be substantially more precise than both the TEOM and the speciation samplers. Comparison of the SEM results to pairwise simple linear regression results showed that the regression results can differ substantially from the correctly derived calibration equations, especially if the less precise device is used as the independent variable in the regression.
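    The study's key argument, that pairwise Deming regression needs an externally supplied error-variance ratio while three or more collocated devices make the error variances identifiable, can be illustrated with a toy simulation. Assuming three unbiased devices observing the same latent concentration (device names and noise levels below are illustrative, not PARIES estimates), every pairwise covariance estimates the variance of the truth alone, and each device's excess variance is its own error variance:

```python
import random
import statistics

random.seed(0)
n = 50000

# latent "true" daily PM2.5 series and three unbiased devices with
# different noise levels (illustrative values only)
truth = [random.gauss(15.0, 5.0) for _ in range(n)]
noise_sd = {"FRM": 0.5, "speciation": 1.0, "TEOM": 2.0}
obs = {name: [t + random.gauss(0.0, sd) for t in truth]
       for name, sd in noise_sd.items()}

def cov(a, b):
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

# pairwise covariances estimate Var(truth), because device errors
# are independent of the truth and of each other
names = list(obs)
pair_covs = [cov(obs[a], obs[b])
             for i, a in enumerate(names) for b in names[i + 1:]]
var_truth = statistics.fmean(pair_covs)

# each device's error variance: its total variance minus Var(truth)
err_var = {name: cov(series, series) - var_truth
           for name, series in obs.items()}
print(var_truth, err_var)
```

    With only two devices there is one covariance and two unknown error variances, so the ratio has to be assumed, which is exactly the deficiency of pairwise Deming regression that the SEM avoids.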

  1. Officer Professional Military Education: a New Distance Learning Evolution

    DTIC Science & Technology

    2015-01-01

    diversity, and maximize its most valued asset: human capital. Bibliography Broughton, R. (n.d.). Dr. Deming Point 13 Institute a Vigorous Program of...Education and Retraining. Retrieved 4 23, 2015, from Quality Assurance Solutions: www.quality-assurance-solutions.com/Deming-Point-13.html

  2. Deming's Quality: Our Last but Best Hope.

    ERIC Educational Resources Information Center

    Schenkat, Randy

    1993-01-01

    If educators endorse Alfie Kohn's surface message about Total Quality Management, they may miss opportunity to professionalize education. Deming's system of profound knowledge (interaction of theories of systems, knowledge, psychology, and variation) is a model for educated people grappling with life's complexities. Moreover, gaining community…

  3. Intercalibration of Two Polar Satellite Instruments Without Simultaneous Nadir Observations

    NASA Astrophysics Data System (ADS)

    Manninen, Terhikki; Riihela, Aku; Schaaf, Crystal; Key, Jeffrey; Lattanzio, Alessio

    2016-08-01

    A new intercalibration method for two polar satellite instruments is presented. It is based on statistical fitting of two data sets covering the same area during the same period, but not simultaneously. Deming regression with iterative weights is used. The accuracy of the method was better than about 0.5% for the MODIS vs. MODIS and AVHRR vs. AVHRR test data sets. The intercalibration of AVHRR vs. MODIS red and NIR channels was carried out and showed a difference in reflectance values of 2% (red) and 6% (NIR). The red channel intercalibration has slightly higher accuracy for all cases studied.

  4. Transforming Schools through Total Quality Education.

    ERIC Educational Resources Information Center

    Schmoker, Mike; Wilson, Richard B.

    1993-01-01

    Deming's work emphasizes advantages of teamwork, investment in ongoing training for all employees to increase their value to the company, and insistence that research and employee-gathered data guide and inform every decision and improvement effort. The parallel between psychologist Mihaly Csikszenmihalyi's work and Deming's shows that Total…

  5. Translating Deming's 14 Points for Education.

    ERIC Educational Resources Information Center

    Melvin, Charles A., III

    1991-01-01

    A consortium of four Wisconsin school districts has decided to stop tinkering with the educational system and apply W. Edwards Deming's 14 management points to total system improvement. The rewritten precepts involve creating and adopting a fitting purpose, infusing quality into the educational "product," and working toward zero defects…

  6. Lessons from Enlightened Corporations.

    ERIC Educational Resources Information Center

    Blankstein, Alan M.

    1992-01-01

    The formula for improving U.S. schools can be found in the philosophy that helped transform Japanese industry and in Deming's 14 principles, emulated by many corporations. Deming's arguments against appraising individual performance through quotas or numerical goals call into question schools' current grading and merit pay practices. (12…

  7. Training Quality: Before and after Winning the Deming Prize.

    ERIC Educational Resources Information Center

    Magennis, Jo P.

    1995-01-01

    Describes the Quality Improvement Program developed by Florida Power and Light's Nuclear Training organization that was awarded the Deming Application Prize for quality control. Training quality, team activities, training's role in business planning, customer involvement and evaluation, and continuous improvement of training are discussed. (LRW)

  8. Adrenal-derived stress hormones modulate ozone-induced ...

    EPA Pesticide Factsheets

    Ozone-induced systemic effects are modulated through activation of the neuro-hormonal stress response pathway. Adrenal demedullation (DEMED) or bilateral total adrenalectomy (ADREX) inhibits systemic and pulmonary effects of acute ozone exposure. To understand the influence of adrenal-derived stress hormones in mediating ozone-induced lung injury/inflammation, we assessed global gene expression (mRNA sequencing) and selected proteins in lung tissues from male Wistar-Kyoto rats that underwent DEMED, ADREX, or sham surgery (SHAM) prior to their exposure to air or ozone (1 ppm), 4 h/day for 1 or 2 days. Ozone exposure significantly changed the expression of over 2300 genes in lungs of SHAM rats, and these changes were markedly reduced in DEMED and ADREX rats. SHAM surgery but not DEMED or ADREX resulted in activation of multiple ozone-responsive pathways, including glucocorticoid, acute phase response, NRF2, and PI3K-AKT. Predicted targets from sequencing data showed a similarity between transcriptional changes induced by ozone and adrenergic and steroidal modulation of effects in SHAM but not ADREX rats. Ozone-induced increases in lung Il6 in SHAM rats coincided with neutrophilic inflammation, but were diminished in DEMED and ADREX rats. Although ozone exposure in SHAM rats did not significantly alter mRNA expression of Ifnγ and Il-4, the IL-4 protein and ratio of IL-4 to IFNγ (IL-4/IFNγ) proteins increased, suggesting a tendency for a Th2 response. This did not occur in ADREX and DEMED rats.

  9. The structured ancestral selection graph and the many-demes limit.

    PubMed

    Slade, Paul F; Wakeley, John

    2005-02-01

    We show that the unstructured ancestral selection graph applies to part of the history of a sample from a population structured by restricted migration among subpopulations, or demes. The result holds in the limit as the number of demes tends to infinity with proportionately weak selection, and we have also made the assumptions of island-type migration and that demes are equivalent in size. After an instantaneous sample-size adjustment, this structured ancestral selection graph converges to an unstructured ancestral selection graph with a mutation parameter that depends inversely on the migration rate. In contrast, the selection parameter for the population is independent of the migration rate and is identical to the selection parameter in an unstructured population. We show analytically that estimators of the migration rate, based on pairwise sequence differences, derived under the assumption of neutrality should perform equally well in the presence of weak selection. We also modify an algorithm for simulating genealogies conditional on the frequencies of two selected alleles in a sample. This permits efficient simulation of stronger selection than was previously possible. Using this new algorithm, we simulate gene genealogies under the many-demes ancestral selection graph and identify some situations in which migration has a strong effect on the time to the most recent common ancestor of the sample. We find that a similar effect also increases the sensitivity of the genealogy to selection.

  10. On Deming and School Quality: A Conversation with Enid Brown.

    ERIC Educational Resources Information Center

    Brandt, Ron

    1992-01-01

    A Deming expert explains that Deming's 14 principles are no recipe but must be combined with the theory of profound knowledge, which poses essential questions and recognizes the importance of human variation, intrinsic motivation, and external rewards. She also debunks grading, formal teacher evaluation, tracking, and decentralized management. (MLH)

  11. Quality in Higher Education: Lessons Learned from the Baldrige Award, Deming Prize, and ISO 9000 Registration.

    ERIC Educational Resources Information Center

    Izadi, Mahyar; And Others

    1996-01-01

    Compares the Baldrige Award, Deming Prize, and ISO 9000 registration in terms of purpose, focus, eligibility, time frame, information sharing, number of recipients, and assessment. Suggests that vocational-technical programs in higher education could be improved using the criteria for these awards. (SK)

  12. Notes on TQM (Total Quality Management) and Education

    ERIC Educational Resources Information Center

    Daniel, Carter A.

    2005-01-01

    Application of Deming's TQM principles to education is long overdue. Principles that have proven their worth in businesses for decades could revolutionize our thinking about education. But they require a total commitment, from the highest to the lowest level. Deming's 14 points, and Gray Rinehart's suggestions, are presented, discussed, and…

  13. Effects of Serum Creatinine Calibration on Estimated Renal Function in African Americans: the Jackson Heart Study

    PubMed Central

    Wang, Wei; Young, Bessie A.; Fülöp, Tibor; de Boer, Ian H.; Boulware, L. Ebony; Katz, Ronit; Correa, Adolfo; Griswold, Michael E.

    2015-01-01

    Background The calibration to Isotope Dilution Mass Spectroscopy (IDMS) traceable creatinine is essential for valid use of the new Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation to estimate the glomerular filtration rate (GFR). Methods For 5,210 participants in the Jackson Heart Study (JHS), serum creatinine was measured with a multipoint enzymatic spectrophotometric assay at the baseline visit (2000–2004) and re-measured using the Roche enzymatic method, traceable to IDMS in a subset of 206 subjects. The 200 eligible samples (6 were excluded, 1 for failure of the re-measurement and 5 for outliers) were divided into three disjoint sets - training, validation, and test - to select a calibration model, estimate true errors, and assess performance of the final calibration equation. The calibration equation was applied to serum creatinine measurements of 5,210 participants to estimate GFR and the prevalence of CKD. Results The selected Deming regression model provided a slope of 0.968 (95% Confidence Interval (CI), 0.904 to 1.053) and intercept of −0.0248 (95% CI, −0.0862 to 0.0366) with R squared 0.9527. Calibrated serum creatinine showed high agreement with actual measurements when applying to the unused test set (concordance correlation coefficient 0.934, 95% CI, 0.894 to 0.960). The baseline prevalence of CKD in the JHS (2000–2004) was 6.30% using calibrated values, compared with 8.29% using non-calibrated serum creatinine with the CKD-EPI equation (P < 0.001). Conclusions A Deming regression model was chosen to optimally calibrate baseline serum creatinine measurements in the JHS and the calibrated values provide a lower CKD prevalence estimate. PMID:25806862

  14. Effects of serum creatinine calibration on estimated renal function in african americans: the Jackson heart study.

    PubMed

    Wang, Wei; Young, Bessie A; Fülöp, Tibor; de Boer, Ian H; Boulware, L Ebony; Katz, Ronit; Correa, Adolfo; Griswold, Michael E

    2015-05-01

    The calibration to isotope dilution mass spectrometry-traceable creatinine is essential for valid use of the new Chronic Kidney Disease Epidemiology Collaboration equation to estimate the glomerular filtration rate. For 5,210 participants in the Jackson Heart Study (JHS), serum creatinine was measured with a multipoint enzymatic spectrophotometric assay at the baseline visit (2000-2004) and remeasured using the Roche enzymatic method, traceable to isotope dilution mass spectrometry in a subset of 206 subjects. The 200 eligible samples (6 were excluded, 1 for failure of the remeasurement and 5 for outliers) were divided into 3 disjoint sets-training, validation and test-to select a calibration model, estimate true errors and assess performance of the final calibration equation. The calibration equation was applied to serum creatinine measurements of 5,210 participants to estimate glomerular filtration rate and the prevalence of chronic kidney disease (CKD). The selected Deming regression model provided a slope of 0.968 (95% confidence interval [CI], 0.904-1.053) and intercept of -0.0248 (95% CI, -0.0862 to 0.0366) with R value of 0.9527. Calibrated serum creatinine showed high agreement with actual measurements when applying to the unused test set (concordance correlation coefficient 0.934, 95% CI, 0.894-0.960). The baseline prevalence of CKD in the JHS (2000-2004) was 6.30% using calibrated values compared with 8.29% using noncalibrated serum creatinine with the Chronic Kidney Disease Epidemiology Collaboration equation (P < 0.001). A Deming regression model was chosen to optimally calibrate baseline serum creatinine measurements in the JHS, and the calibrated values provide a lower CKD prevalence estimate.
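    As a sketch of how such a calibration equation is applied (the direction, calibrated = intercept + slope × original, is an assumption consistent with the lower calibrated prevalence reported above):

```python
# Deming calibration coefficients reported in the study
SLOPE, INTERCEPT = 0.968, -0.0248

def calibrate(scr):
    """Map an original serum creatinine value (mg/dL) onto the
    IDMS-traceable scale via the study's Deming regression line."""
    return INTERCEPT + SLOPE * scr

print(calibrate(1.00))  # slightly below the measured 1.00 mg/dL
```

    Because the calibrated creatinine is slightly lower, the CKD-EPI equation yields a higher estimated GFR, which is consistent with the lower CKD prevalence (6.30% vs. 8.29%) reported above.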

  15. Bilateral Symmetry of Visual Function Loss in Cone-Rod Dystrophies.

    PubMed

    Galli-Resta, Lucia; Falsini, Benedetto; Rossi, Giuseppe; Piccardi, Marco; Ziccardi, Lucia; Fadda, Antonello; Minnella, Angelo; Marangoni, Dario; Placidi, Giorgio; Campagna, Francesca; Abed, Edoardo; Bertelli, Matteo; Zuntini, Monia; Resta, Giovanni

    2016-07-01

    To investigate bilateral symmetry of visual impairment in cone-rod dystrophy (CRD) patients and understand the feasibility of clinical trial designs treating one eye and using the untreated eye as an internal control. This was a retrospective study of visual function loss measures in 436 CRD patients followed at the Ophthalmology Department of the Catholic University in Rome. Clinical measures considered were best-corrected visual acuity, focal macular cone electroretinogram (fERG), and Ganzfeld cone-mediated and rod-mediated electroretinograms. Interocular agreement in each of these clinical indexes was assessed by t- and Wilcoxon tests for paired samples, structural (Deming) regression analysis, and intraclass correlation. Baseline and follow-up measures were analyzed. A separate analysis was performed on the subset of 61 CRD patients carrying likely disease-causing mutations in the ABCA4 gene. Statistical tests show a very high degree of bilateral symmetry in the extent and progression of visual impairment in the fellow eyes of CRD patients. These data contribute to a better understanding of CRDs and support the feasibility of clinical trial designs involving unilateral eye treatment with the use of fellow eye as internal control.
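    Interocular agreement here was quantified with, among other measures, the intraclass correlation. A minimal sketch of a one-way random-effects ICC for paired right/left measurements (this is ICC(1,1); the abstract does not state which ICC variant was used):

```python
def icc_1_1(pairs):
    """One-way random-effects intraclass correlation, ICC(1,1),
    for paired fellow-eye measurements (right, left)."""
    n = len(pairs)
    grand = sum(a + b for a, b in pairs) / (2 * n)
    # between-subject and within-subject mean squares (k = 2 eyes)
    msb = 2 * sum(((a + b) / 2 - grand) ** 2 for a, b in pairs) / (n - 1)
    msw = sum((a - b) ** 2 / 2 for a, b in pairs) / n
    return (msb - msw) / (msb + msw)
```

    Identical fellow-eye values give an ICC of 1; values that vary independently between the two eyes drive it toward 0.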

  16. Application of the Deming management method to equipment-inspection processes.

    PubMed

    Campbell, C A

    1996-01-01

    The Biomedical Engineering staff at the Washington Hospital Center has designed an inspection process that optimizes timely completion of scheduled equipment inspections. The method used to revise the process was primarily Deming's, though it also incorporates the re-engineering concept of questioning the basic assumptions around which the original process was designed. This effort involved a review of the existing process in its entirety by task groups made up of representatives from all involved departments. Complete success in all areas has remained elusive. However, the lower variability of inspection completion ratios follows Deming's description of a successfully revised process. Further CQI efforts targeted at specific areas with low completion ratios will decrease this variability even further.

  17. Restructure Staff Development for Systemic Change

    ERIC Educational Resources Information Center

    Kelly, Thomas F.

    2012-01-01

    This paper presents a systems approach, based on the work of W. Edwards Deming, to system-wide, high-impact staff development. Deming has pointed out the significance of structure in systems. By restructuring the process of staff development we can bring about cost-effective improvement of the whole system. We can improve student achievement while…

  18. The Quality Improvement Management Approach as Implemented in a Middle School.

    ERIC Educational Resources Information Center

    Bayless, David L.; And Others

    1992-01-01

    The Total Quality Management Theory of W. E. Deming can be adapted within an educational organization to build structures that support educators' beliefs. A case study of the implementation of Deming principles at the LeRoy Martin Middle School in Raleigh (North Carolina) illustrates the effectiveness of this management approach. (SLD)

  19. Implications of the Fourteen Points of Total Quality Management (TQM) for Science Education.

    ERIC Educational Resources Information Center

    Aliff, John Vincent

    The management theories of W. Edwards Deming are known as Total Quality Management (TQM) and advocate building quality into organizational processes rather than analyzing outcomes. Although TQM was originally developed for the workplace, educational reformers have been applying its principles to higher education. The original 14 points of Deming's…

  20. TQM in Education: The Theory and How To Put It To Work.

    ERIC Educational Resources Information Center

    Tribus, Myron

    This paper describes how Deming's theory of management can be applied to the educational process. Following an overview of Deming's theory, nine specific questions to ask any theory of education are posed. The differences between education and industry, as well as the differences between quality management and traditional educational approaches,…

  1. Using Total Quality Management Principles To Implement School-Based Management.

    ERIC Educational Resources Information Center

    Terry, Paul M.

    Those engaged in school restructuring can find direction in the philosophy of W. Edwards Deming, which has guided the operations of many American corporations. This paper provides an overview of Deming's Fourteen Points of Total Quality Management (TQM) and discusses their applications to education. To develop a successful TQM system, the school…

  2. The Total Quality Movement in Education.

    ERIC Educational Resources Information Center

    Leuenberger, John A.; Whitaker, Sheldon V., Jr.

    The total quality movement began as a result of the desire of W. Edwards Deming, an American statistician, to permit the economic system to maintain its edge in a growing global market. The 14 points Deming listed as essential to "total quality management" have recently been adapted to the field of education. The success of the total…

  3. Exploring the link between intrinsic motivation and quality

    NASA Astrophysics Data System (ADS)

    Christy, Steven M.

    1992-12-01

    This thesis proposes that it is workers' intrinsic motivation that leads them to produce quality work. It reviews two different types of evidence--expert opinion and empirical studies--to attempt to evaluate a link between intrinsic motivation and work quality. The thesis reviews the works of Total Quality writers and behavioral scientists for any connection they might have made between intrinsic motivation and quality. The thesis then looks at the works of Deming and his followers in an attempt to establish a match between Deming's motivational assumptions and the four task rewards in the Thomas/Tymon model of intrinsic motivation: choice, competence, meaningfulness, and progress. Based upon this analysis, it is proposed that the four Thomas/Tymon task rewards are a promising theoretical foundation for explaining the motivational basis of quality for workers in Total Quality organizations.

  4. Analysis of the Astronomy Diagnostic Test

    ERIC Educational Resources Information Center

    Brogt, Erik; Sabers, Darrell; Prather, Edward E.; Deming, Grace L.; Hufnagel, Beth; Slater, Timothy F.

    2007-01-01

    Seventy undergraduate class sections were examined from the database of Astronomy Diagnostic Test (ADT) results of Deming and Hufnagel to determine if course format correlated with ADT normalized gain scores. Normalized gains were calculated for four different classroom scenarios: lecture, lecture with discussion, lecture with lab, and lecture…

  5. Two-step voltage dual electromembrane extraction: A new approach to simultaneous extraction of acidic and basic drugs.

    PubMed

    Asadi, Sakine; Nojavan, Saeed

    2016-06-07

    In the present work, acidic and basic drugs were simultaneously extracted by a novel method of high efficiency herein referred to as two-step voltage dual electromembrane extraction (TSV-DEME). After optimization of effective parameters, such as the composition of the organic liquid membrane, the pH values of the donor and acceptor solutions, and the voltage and duration of each step, the figures of merit of the method were investigated in pure water, human plasma, wastewater, and breast milk samples. Simultaneous extraction of acidic and basic drugs was done by applying potentials of 150 V and 400 V for 6 min and 19 min as the first and second steps, respectively. The model compounds were extracted from 4 mL of sample solution (pH = 6) into 20 μL of each acceptor solution (32 mM NaOH for acidic drugs and 32 mM HCl for basic drugs). 1-Octanol was immobilized within the pores of a porous polypropylene hollow fiber as the supported liquid membrane (SLM) for acidic drugs, and 2-ethylhexanol served as the SLM for basic drugs. The proposed TSV-DEME technique provided good linearity, with correlation coefficients ranging from 0.993 to 0.998 over a concentration range of 1-1000 ng mL(-1). The limits of detection of the drugs were found to range within 0.3-1.5 ng mL(-1), while the corresponding repeatability ranged from 7.7 to 15.5% (n = 4). The proposed method was further compared to simple dual electromembrane extraction (DEME), indicating significantly higher recoveries for the TSV-DEME procedure (38.1-68%) as compared to the simple DEME procedure (17.7-46%). Finally, the optimized TSV-DEME was applied to extract and quantify model compounds in breast milk, wastewater, and plasma samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. The Art of Chairing: What Deming Taught the Japanese and the Japanese Taught Me.

    ERIC Educational Resources Information Center

    Rodd, Laurel Rasplica

    2001-01-01

    Reveals how a business model--based on the work of W. Edwards Deming--helped a foreign language department chair become a better leader. Outlines seven principles for department chairs: create constancy of purpose; change and improvement are ongoing; drive out fear; work with suppliers to continually improve the quality of incoming people,…

  7. Total Quality Management and Media Services: The Deming Method.

    ERIC Educational Resources Information Center

    Richie, Mark L.

    1992-01-01

    W. Edwards Deming built a 40-year record of quality management in Japan known as Total Quality Management (TQM). His 14 points require a change in the belief system of managers and media directors, but their implementation in government agencies and schools will produce increased time for better services, better communications, and new programs.…

  8. Toward a System of Total Quality Management: Applying the Deming Approach to the Education Setting.

    ERIC Educational Resources Information Center

    McLeod, Willis B.; And Others

    1992-01-01

    Recently, the Petersburg (Virginia) Public Schools have moved away from a highly centralized organizational structure to a Total Quality Management system featuring shared decision making and school-based management practices. The district was guided by Deming's philosophy that all stakeholders should be involved in defining the level of products…

  9. Using Deming To Improve Quality in Colleges and Universities.

    ERIC Educational Resources Information Center

    Cornesky, Robert A.; And Others

    Of all the people known for stressing quality in industry, W. Edwards Deming is the pioneer. He stresses statistical process control (SPC) and a 14-point process for managers to improve quality and productivity. His approach is humanistic and treats people as intelligent human beings who want to do a good job. Twelve administrators in a university…

  10. Total Quality Education: Profiles of Schools That Demonstrate the Power of Deming's Management Principles.

    ERIC Educational Resources Information Center

    Schmoker, Michael J.; Wilson, Richard B.

    This book presents profiles of schools that have demonstrated the power of Deming's Total Quality Management (TQM) principles. It describes schools that have successfully applied those strategies for change. The book explores what public education needs most--a compelling but flexible action plan for improvement. Chapter 1 offers a rationale for…

  11. The Teaching of Dr. W. E. Deming and the Performance Domains of the National Commission for the Principalship.

    ERIC Educational Resources Information Center

    House, Jess E.

    This paper considers the total-quality-management teachings of W. E. Deming as a basis for the redesign of educational-administration preparation programs. The performance domains developed by the National Commission for the Principalship (jointly sponsored by the National Association of Secondary School Principals and the National Association of…

  12. Creating Learning Organizations: The Deming Management Method Applied to Instruction (Quality Teaching & Quality Learning). A Paradigm Application.

    ERIC Educational Resources Information Center

    Loehr, Peter

    This paper presents W. Edwards Deming's 14 management points, 7 deadly diseases, and 4 obstacles that thwart productivity, and discusses how these principles relate to teaching and learning. Application of these principles is expected to increase the quality of learning in classrooms from kindergarten through graduate level. Examples of the…

  13. Divergence and evolution of assortative mating in a polygenic trait model of speciation with gene flow.

    PubMed

    Sachdeva, Himani; Barton, Nicholas H

    2017-06-01

Assortative mating is an important driver of speciation in populations with gene flow and is predicted to evolve under certain conditions in few-locus models. However, the evolution of assortment is less understood for mating based on quantitative traits, which are often characterized by high genetic variability and extensive linkage disequilibrium between trait loci. We explore this scenario for a two-deme model with migration, by considering a single polygenic trait subject to divergent viability selection across demes, as well as assortative mating and sexual selection within demes, and investigate how trait divergence is shaped by various evolutionary forces. Our analysis reveals the existence of sharp thresholds of assortment strength, at which divergence increases dramatically. We also study the evolution of assortment via invasion of modifiers of mate discrimination and show that the evolutionarily stable (ES) assortment strength has an intermediate value under a range of migration-selection parameters, even in diverged populations, due to subtle effects which depend sensitively on the extent of phenotypic variation within these populations. The evolutionary dynamics of the polygenic trait is studied using the hypergeometric and infinitesimal models. We further investigate the sensitivity of our results to the assumptions of the hypergeometric model, using individual-based simulations. © 2017 The Author(s). Evolution © 2017 The Society for the Study of Evolution.

  14. Resistance to genetic insect control: Modelling the effects of space.

    PubMed

    Watkinson-Powell, Benjamin; Alphey, Nina

    2017-01-21

Genetic insect control, such as self-limiting RIDL (Release of Insects Carrying a Dominant Lethal) technology, is a development of the sterile insect technique which is proposed to suppress wild populations of a number of major agricultural and public health insect pests. This is achieved by mass rearing and releasing male insects that are homozygous for a repressible dominant lethal genetic construct, which causes death in progeny when inherited. The released genetically engineered ('GE') insects compete for mates with wild individuals, resulting in population suppression. A previous study modelled the evolution of a hypothetical resistance to the lethal construct using a frequency-dependent population genetic and population dynamic approach. This found that proliferation of resistance is possible but can be diluted by the introgression of susceptible alleles from the released homozygous-susceptible GE males. We develop this approach within a spatial context by modelling the spread of a lethal construct and resistance trait, and the effect on population control, in a two-deme metapopulation, with GE release in one deme. Results show that spatial effects can drive an increased or decreased evolution of resistance in both the target and non-target demes, depending on the effectiveness and associated costs of the resistant trait, and on the rate of dispersal. A recurrent theme is the potential for the non-target deme to act as a source of resistant or susceptible alleles for the target deme through dispersal. This can in turn have a major impact on the effectiveness of insect population control. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. Alive and Well: Optimizing the Fitness of an Organization

    ERIC Educational Resources Information Center

    Wayne, David

    2008-01-01

    Grounded in the work of W. Edwards Deming, this article describes the basics of systems thinking, viewing a business as a system, and contrasts improving a system with solving a problem. The article uses the human body as a metaphor to describe the various aspects of viewing a business as a system at the concept level and maps the Deming cycle,…

  16. Through Deming's Eyes: A Cross-National Analysis of Quality Assurance Policies in Higher Education.

    ERIC Educational Resources Information Center

    Dill, David D.

    1995-01-01

    Efforts to improve the quality of academic programs in the United Kingdom, United States, and Netherlands have followed three general approaches: the logic of competitive markets; application of incentives; and professional self-regulation. Strengths and weaknesses of these approaches for improving academic quality are examined through the lens of…

  17. The Development of a Survey Instrument on South Dakota's School District Leadership Climate as Related to Deming's Fourteen Points.

    ERIC Educational Resources Information Center

    Holmes, Lawrence W. O.; And Others

    Development of an instrument to measure baseline levels of applied Total Quality Management (TQM) practices in South Dakota before the introduction and dissemination of TQM theory to the state's educational leaders is described. Using the interpretation of Deming's 14 points that was developed by J. J. Bonstigl, a 115-item initial item pool was…

  18. Managing Productivity and Quality in the 1990s. Some Observations on TIMS (The Institute for Management Sciences) XXIX, 23-26 July 1989, Osaka, Japan

    DTIC Science & Technology

    1989-10-01

often attempted. This profound knowledge must be possessed by management; hence Dr. Deming's slogan: "Quality is Made in the Boardroom." Dr. Deming...Out of the Crisis, MIT Center for Advanced Engineering Study, Cambridge, MA, 1982. 5. Lawrence M. Miller, Barbarians to Bureaucrats, Corporate Life

  19. Quantifying the Role of Population Subdivision in Evolution on Rugged Fitness Landscapes

    PubMed Central

    Bitbol, Anne-Florence; Schwab, David J.

    2014-01-01

    Natural selection drives populations towards higher fitness, but crossing fitness valleys or plateaus may facilitate progress up a rugged fitness landscape involving epistasis. We investigate quantitatively the effect of subdividing an asexual population on the time it takes to cross a fitness valley or plateau. We focus on a generic and minimal model that includes only population subdivision into equivalent demes connected by global migration, and does not require significant size changes of the demes, environmental heterogeneity or specific geographic structure. We determine the optimal speedup of valley or plateau crossing that can be gained by subdivision, if the process is driven by the deme that crosses fastest. We show that isolated demes have to be in the sequential fixation regime for subdivision to significantly accelerate crossing. Using Markov chain theory, we obtain analytical expressions for the conditions under which optimal speedup is achieved: valley or plateau crossing by the subdivided population is then as fast as that of its fastest deme. We verify our analytical predictions through stochastic simulations. We demonstrate that subdivision can substantially accelerate the crossing of fitness valleys and plateaus in a wide range of parameters extending beyond the optimal window. We study the effect of varying the degree of subdivision of a population, and investigate the trade-off between the magnitude of the optimal speedup and the width of the parameter range over which it occurs. Our results, obtained for fitness valleys and plateaus, also hold for weakly beneficial intermediate mutations. Finally, we extend our work to the case of a population connected by migration to one or several smaller islands. Our results demonstrate that subdivision with migration alone can significantly accelerate the crossing of fitness valleys and plateaus, and shed light onto the quantitative conditions necessary for this to occur. PMID:25122220

  20. Shaping America's Future III: Proceedings of the National Forum on Transforming Our System of Educating Youth with W. Edwards Deming (June 8, 1992).

    ERIC Educational Resources Information Center

    National Educational Service, Bloomington, IN.

    On June 8, 1992, the presidents of the nation's two largest teachers unions joined the directors and presidents of virtually every educational organization, as well as political leaders and executives from Ford, General Motors, and Chrysler in an effort to redesign U.S. schools using the quality principles of W. Edwards Deming. Panelists spent the…

  1. Thermal boundary conductance of hydrophilic and hydrophobic ionic liquids

    NASA Astrophysics Data System (ADS)

    Oyake, Takafumi; Sakata, Masanori; Yada, Susumu; Shiomi, Junichiro

    2015-03-01

A solid/liquid interface plays a critical role in understanding mechanisms in biological and physical science. Moreover, the carrier density of a surface is dramatically enhanced by the electric double layer formed with an ionic liquid, a salt in the liquid state. Here, we have measured the thermal boundary conductance (TBC) across the interface between a gold thin film and an ionic liquid using the time-domain thermoreflectance technique. Following prior research, we have identified the TBC of two interfaces: one between gold and the hydrophilic ionic liquid N,N-diethyl-N-methyl-N-(2-methoxyethyl)ammonium tetrafluoroborate (DEME-BF4), and the other between gold and the hydrophobic ionic liquid N,N-diethyl-N-methyl-N-(2-methoxyethyl)ammonium bis(trifluoromethanesulfonyl)imide (DEME-TFSI). We found that the TBC between gold and DEME-TFSI (19 MWm-2K-1) is surprisingly lower than that between gold and DEME-BF4 (45 MWm-2K-1). With these data, the importance of the wetting angle and ion concentration for thermal transport at the solid/ionic liquid interface is discussed. Part of this work is financially supported by the Japan Society for the Promotion of Science (JSPS) and the Japan Science and Technology Agency. The author is financially supported by a JSPS Fellowship.

  2. Validation of a Second-Generation Near-Infrared Spectroscopy Monitor in Children With Congenital Heart Disease.

    PubMed

    Nasr, Viviane G; Bergersen, Lisa T; Lin, Hung-Mo; Benni, Paul B; Bernier, Rachel S; Anderson, Michelle E; Kussman, Barry D

    2018-01-09

    Cerebral oximetry using near-infrared spectroscopy is a noninvasive optical technology to detect cerebral hypoxia-ischemia and develop interventions to prevent and ameliorate hypoxic brain injury. Cerebral oximeters are calibrated and validated by comparison of the near-infrared spectroscopy-measured cerebral O2 saturation (SctO2) to a "field" or reference O2 saturation (REF CX) calculated as a weighted average from arterial and jugular bulb oxygen saturations. In this study, we calibrated and validated the second-generation, 5 wavelength, FORE-SIGHT Elite with the medium sensor (source-detector separation 12 and 40 mm) for measurement of SctO2 in children with congenital heart disease. After institutional review board approval and written informed consent, 63 children older than 1 month and ≥2.5 kg scheduled for cardiac catheterization were enrolled. Self-adhesive FORE-SIGHT Elite medium sensors were placed on the right and left sides of the forehead. Blood samples for calculation of REF CX were drawn simultaneously from the aorta or femoral artery and the jugular bulb before (T1) and shortly after (T2) baseline hemodynamic measurements. FORE-SIGHT Elite SctO2 measurements were compared to the REF CX (REF CX = [0.3 SaO2] + [0.7 SjbO2]) using Deming regression, least squares linear regression, and Bland-Altman analysis. Sixty-one subjects (4.5 [standard deviation 4.4] years of age; 17 [standard deviation 13] kg, male 56%) completed the study protocol. Arterial oxygen saturation ranged from 64.7% to 99.1% (median 96.0%), jugular bulb venous oxygen saturation from 34.1% to 88.1% (median 68.2%), the REF CX from 43.8% to 91.4% (median 76.9%), and the SctO2 from 47.8% to 90.8% (median 76.3%). 
There was a high degree of correlation in SctO2 between the right and left sensors at a given time point (within subject between sensor correlation r = 0.91 and 95% confidence interval [CI], 0.85-0.94) or between T1 and T2 for the right and left sensors (replicates, within subject between time point correlation r = 0.95 and 95% CI, 0.92-0.96). By Deming regression, the estimated slope was 0.966 (95% CI, 0.786-1.147; P = .706 for testing against null hypothesis of slope = 1) with a y intercept of 2.776 (95% CI, -11.102 to 16.654; P = .689). The concordance correlation coefficient was 0.873 (95% CI, 0.798-0.922). Bland-Altman analysis for agreement between SctO2 and REF CX that accounted for repeated measures (both in times and sensors) found a bias of -0.30% (95% limits of agreement: -10.56% to 9.95%). This study calibrated and validated the FORE-SIGHT Elite tissue oximeter to accurately measure SctO2 in pediatric patients with the medium sensor.
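Several of the method-comparison records in this collection, including the one above, report a Deming regression slope and intercept. As a minimal sketch of how such a fit is computed (a hypothetical helper, not the pipeline used in any of the cited studies), the closed-form Deming estimator with error-variance ratio lambda can be written as:

```python
import numpy as np

def deming(x, y, lam=1.0):
    """Deming regression of y on x.

    lam is the assumed ratio of the y-measurement error variance to the
    x-measurement error variance; lam = 1 gives orthogonal regression.
    Returns (slope, intercept).
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    sxx = np.mean((x - mx) ** 2)   # variance of x
    syy = np.mean((y - my) ** 2)   # variance of y
    sxy = np.mean((x - mx) * (y - my))  # covariance
    # Closed-form maximum-likelihood slope for errors in both variables
    slope = (syy - lam * sxx
             + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    intercept = my - slope * mx
    return slope, intercept
```

Unlike ordinary least squares, this estimator allows measurement error in both variables, which is why it is favored for comparing two imperfect assays. The confidence intervals reported in studies such as the one above are typically obtained by jackknife or bootstrap resampling, which this sketch omits.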

  3. Comparative evaluation of the Aptima HIV-1 Quant Dx assay and COBAS TaqMan HIV-1 v2.0 assay using the Roche High Pure System for the quantification of HIV-1 RNA in plasma.

    PubMed

    Schalasta, Gunnar; Börner, Anna; Speicher, Andrea; Enders, Martin

    2016-03-01

    Quantification of human immunodeficiency virus type 1 (HIV-1) RNA in plasma has become the standard of care in the management of HIV-infected patients. There are several commercially available assays that have been implemented for the detection of HIV-1 RNA in plasma. Here, the new Hologic Aptima® HIV-1 Quant Dx assay (Aptima HIV) was compared to the Roche COBAS® TaqMan® HIV-1 Test v2.0 for use with the High Pure System (HPS/CTM). The performance characteristics of the assays were assessed using commercially available HIV reference panels, dilution of the WHO 3rd International HIV-1 RNA International Standard (WHO-IS) and plasma from clinical specimens. Assay performance was determined by linear regression, Deming correlation analysis and Bland-Altman analysis. Testing of HIV-1 reference panels revealed excellent agreement. The 61 clinical specimens quantified in both assays were linearly associated and strongly correlated. The Aptima HIV assay offers performance comparable to that of the HPS/CTM assay and, as it is run on a fully automated platform, a significantly improved workflow.

  4. Interlaboratory assessment of mitotic index by flow cytometry confirms superior reproducibility relative to microscopic scoring.

    PubMed

    Roberts, D J; Spellman, R A; Sanok, K; Chen, H; Chan, M; Yurt, P; Thakur, A K; DeVito, G L; Murli, H; Stankowski, L F

    2012-05-01

    A flow cytometric procedure for determining mitotic index (MI) as part of the metaphase chromosome aberrations assay, developed and utilized routinely at Pfizer as part of their standard assay design, has been adopted successfully by Covance laboratories. This method, using antibodies against phosphorylated histone tails (H3PS10) and nucleic acid stain, has been evaluated by the two independent test sites and compared to manual scoring. Primary human lymphocytes were treated with cyclophosphamide, mitomycin C, benzo(a)pyrene, and etoposide at concentrations inducing dose-dependent cytotoxicity. Deming regression analysis indicates that the results generated via flow cytometry (FCM) were more consistent between sites than those generated via microscopy. Further analysis using the Bland-Altman modification of the Tukey mean difference method supports this finding, as the standard deviations (SDs) of differences in MI generated by FCM were less than half of those generated manually. Decreases in scoring variability owing to the objective nature of FCM, and the greater number of cells analyzed, make FCM a superior method for MI determination. In addition, the FCM method has proven to be transferable and easily integrated into standard genetic toxicology laboratory operations. Copyright © 2012 Wiley Periodicals, Inc.
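The Bland-Altman analyses reported here and in neighboring records summarize agreement between two methods by the mean paired difference (bias) and the limits of agreement. A minimal sketch, using a hypothetical helper name and ignoring repeated-measures adjustments such as those used in the oximetry study above:

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman agreement statistics for paired measurements.

    Returns (bias, lower_loa, upper_loa), where the limits of agreement
    are bias +/- 1.96 * SD of the paired differences.
    """
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)  # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

The key design point is that agreement is assessed on the differences themselves rather than on correlation, since two methods can correlate strongly while still disagreeing systematically.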

  5. The Effect of the Adoption of the Quality Philosophy by Teachers on Student Achievement

    ERIC Educational Resources Information Center

    Sandifer, Cody Clark

    2009-01-01

    The purpose of this study was to determine if the adoption of the Deming philosophy by teachers and use of the LtoJ[R] process resulted in greater academic achievement. Results of internal consistency analysis indicated that the instrument, the "Commitment to Quality Inventory for Educators," was a reliable measure of the Deming…

  6. Effective Report Preparation: Streamlining the Reporting Process. AIR 1999 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Dalrymple, Margaret; Wang, Mindy; Frost, Jacquelyn

This paper describes the processes and techniques used to improve and streamline the standard student reports used at Purdue University (Indiana). Various models for analyzing reporting processes are described, especially the model used in the study, the Shewhart or Deming Cycle, a method that aids in continuous analysis and improvement through a…

  7. [Quality management: its use in nursing].

    PubMed

    Antunes, A V; Trevizan, M A

    2000-01-01

Quality Management has been adopted widely and is now a reality in hospitals. The authors comment on its importance for nursing and analyze its use in the nursing service of a private hospital, with the purpose of evaluating how it was implemented, the nurses' involvement, and the application of Deming's principles. Data show that the implementation has brought good results, that nurses are engaged in the process, and that Deming's principles have been applied, whether adequately or inadequately.

  8. Total Quality Management: Implications for Nursing Information Systems

    DTIC Science & Technology

    1992-01-01

the Defense Department's TQM model (see Appendix M) and TQM master plan (see Appendix N) (Hunt, 1992). Ishikawa. Kaoru Ishikawa was the best known of...the subject -- Deming, Juran, Crosby, Feigenbaum, Ishikawa, etc. (see Chapter 2). One of the most concise definitions on TQM was given in a speech by...with the contributions of key quality authors, including Deming, Juran, Crosby, Costello, and Ishikawa. The use of TQM in business and in healthcare

  9. An Introduction to Quality Management: Selected Readings.

    DTIC Science & Technology

total quality management (TQM). Through the kind permission of a number of publishers, we have been able to reproduce here some key articles about...TQM. It is not the intent of this technical note to provide a comprehensive study of quality management, but rather to aid in planning for an...implementation of the Deming approach to TQM. Although the Navy aviation community chose the Deming approach to quality management, as reflected in the selected

  10. Exploring the Link between Intrinsic Motivation and Quality

    DTIC Science & Technology

    1992-12-01

to and motivation for quality. "Effective human relations is basic to quality control," Feigenbaum (1991) says. A major effect of this activity is... overjustification (explained later in this chapter). Deming "fervently believes in the intrinsic motivation of mankind," wrote Gabor (1990, p. 13). "All...see the congruence between Deming's philosophy and overjustification theory (Deci, 1975). This is the idea that extrinsic motivators can be emphasized

  11. [Genetic Structure of Urban Population of the Common Hamster (Cricetus cricetus)].

    PubMed

    Feoktistova, N Yu; Meschersky, I G; Surov, A V; Bogomolov, P L; Tovpinetz, N N; Poplavskaya, N S

    2016-02-01

    Over the past half-century, the common hamster (Cricetus cricetus), along with range-wide decline of natural populations, has actively populated the cities. The study of the genetic structure of urban populations of common hamster may shed light on features of the habitation of this species in urban landscapes. This article is focused on the genetic structure of common hamster populations in Simferopol (Crimea), one of the largest known urban populations of this species. On the basis of the analysis of nucleotide sequences of the cytochrome b gene and mtDNA control region, and the allelic composition of ten microsatellite loci of nDNA, we revealed that, despite the fact that some individuals can move throughout the city at considerable distances, the entire population of the city is represented by separate demes confined to different areas. These demes are characterized by a high degree of the genetic isolation and reduced genetic diversity compared to that found for the city as a whole.

  12. Deming meets Braverman: toward a progressive analysis of the continuous quality improvement paradigm.

    PubMed

    Schiff, G D; Goldfield, N I

    1994-01-01

The continuous quality improvement (CQI) model has rapidly become the dominant management paradigm in U.S. industrial and health care leadership circles. Despite its widespread corporate acceptance and its relevance to public sector policy issues, there has been a paucity of progressive analysis of CQI. The authors begin by noting remarkable similarities between CQI critiques of Taylorism (so-called scientific management of work) and those made by Braverman, a leading Marxist analyst of the work process. Each of the 14 principles of CQI pioneer W. E. Deming is explained and analyzed for its progressive content. These pluses are then contrasted with 18 problematic issues in an attempt to challenge and go beyond the constraints of CQI as it is currently being applied in health care and other sectors. These issues include (1) mismatch between rhetoric and reality, (2) public sector issues, and (3) broader contradictions. The authors emphasize the genuine need for improving health care quality and the relevance of CQI for addressing this need. They challenge progressives to grapple with the profound contradictions posed by the CQI paradigm, inviting a broader dialogue on CQI's meaning for improving the public's health.

  13. Low-Polarization Lithium-Oxygen Battery Using [DEME][TFSI] Ionic Liquid Electrolyte.

    PubMed

    Ulissi, Ulderico; Elia, Giuseppe Antonio; Jeong, Sangsik; Mueller, Franziska; Reiter, Jakub; Tsiouvaras, Nikolaos; Sun, Yang-Kook; Scrosati, Bruno; Passerini, Stefano; Hassoun, Jusef

    2018-01-10

The room-temperature molten salt mixture of N,N-diethyl-N-(2-methoxyethyl)-N-methylammonium bis(trifluoromethanesulfonyl)imide ([DEME][TFSI]) and lithium bis(trifluoromethanesulfonyl)imide (LiTFSI) salt is herein reported as electrolyte for application in Li-O2 batteries. The [DEME][TFSI]-LiTFSI solution is studied in terms of ionic conductivity, viscosity, electrochemical stability, and compatibility with lithium metal at 30 °C, 40 °C, and 60 °C. The electrolyte shows suitable properties for application in the Li-O2 battery, allowing a reversible, low-polarization discharge-charge performance with a capacity of about 13 Ah g-1 carbon in the positive electrode and coulombic efficiency approaching 100 %. The reversibility of the oxygen reduction reaction (ORR)/oxygen evolution reaction (OER) is demonstrated by ex situ XRD and SEM studies. Furthermore, the study of the cycling behavior of the Li-O2 cell using the [DEME][TFSI]-LiTFSI electrolyte at increasing temperatures (from 30 to 60 °C) evidences enhanced energy efficiency together with morphology changes of the deposited species at the working electrode. In addition, the use of a carbon-coated Zn0.9Fe0.1O (TMO-C) lithium-conversion anode in an ionic-liquid-based Li-ion/oxygen configuration is preliminarily demonstrated. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Trade-offs between xylem hydraulic properties, wood anatomy and yield in Populus.

    PubMed

    Hajek, Peter; Leuschner, Christoph; Hertel, Dietrich; Delzon, Sylvain; Schuldt, Bernhard

    2014-07-01

    Trees face the dilemma that achieving high plant productivity is accompanied by a risk of drought-induced hydraulic failure due to a trade-off in the trees' vascular system between hydraulic efficiency and safety. By investigating the xylem anatomy of branches and coarse roots, and measuring branch axial hydraulic conductivity and vulnerability to cavitation in 4-year-old field-grown aspen plants of five demes (Populus tremula L. and Populus tremuloides Michx.) differing in growth rate, we tested the hypotheses that (i) demes differ in wood anatomical and hydraulic properties, (ii) hydraulic efficiency and safety are related to xylem anatomical traits, and (iii) aboveground productivity and hydraulic efficiency are negatively correlated to cavitation resistance. Significant deme differences existed in seven of the nine investigated branch-related anatomical and hydraulic traits but only in one of the four coarse-root-related anatomical traits; this likely is a consequence of high intra-plant variation in root morphology and the occurrence of a few 'high-conductivity roots'. Growth rate was positively related to branch hydraulic efficiency (xylem-specific conductivity) but not to cavitation resistance; this indicates that no marked trade-off exists between cavitation resistance and growth. Both branch hydraulic safety and hydraulic efficiency significantly depended on vessel size and were related to the genetic distance between the demes, while the xylem pressure causing 88% loss of hydraulic conductivity (P88 value) was more closely related to hydraulic efficiency than the commonly used P50 value. Deme-specific variation in the pit membrane structure may explain why vessel size was not directly linked to growth rate. We conclude that branch hydraulic efficiency is an important growth-influencing trait in aspen, while the assumed trade-off between productivity and hydraulic safety is weak. © The Author 2014. Published by Oxford University Press. 
All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  15. Evolution and polymorphism in the multilocus Levene model with no or weak epistasis.

    PubMed

    Bürger, Reinhard

    2010-09-01

    Evolution and the maintenance of polymorphism under the multilocus Levene model with soft selection are studied. The number of loci and alleles, the number of demes, the linkage map, and the degree of dominance are arbitrary, but epistasis is absent or weak. We prove that, without epistasis and under mild, generic conditions, every trajectory converges to a stationary point in linkage equilibrium. Consequently, the equilibrium and stability structure can be determined by investigating the much simpler gene-frequency dynamics on the linkage-equilibrium manifold. For a haploid species an analogous result is shown. For weak epistasis, global convergence to quasi-linkage equilibrium is established. As an application, the maintenance of multilocus polymorphism is explored if the degree of dominance is intermediate at every locus and epistasis is absent or weak. If there are at least two demes, then arbitrarily many multiallelic loci can be maintained polymorphic at a globally asymptotically stable equilibrium. Because this holds for an open set of parameters, such equilibria are structurally stable. If the degree of dominance is not only intermediate but also deme independent, and loci are diallelic, an open set of parameters yielding an internal equilibrium exists only if the number of loci is strictly less than the number of demes. Otherwise, a fully polymorphic equilibrium exists only nongenerically, and if it exists, it consists of a manifold of equilibria. Its dimension is determined. In the absence of genotype-by-environment interaction, however, a manifold of equilibria occurs for an open set of parameters. In this case, the equilibrium structure is not robust to small deviations from no genotype-by-environment interaction. 
In a quantitative-genetic setting, the assumptions of no epistasis and intermediate dominance are equivalent to assuming that in every deme directional selection acts on a trait that is determined additively, i.e., by nonepistatic loci with dominance. Some of our results are exemplified in this quantitative-genetic context. Copyright 2010 Elsevier Inc. All rights reserved.

  16. Assessment of the Applicability of Total Quality Leadership into the Argentine Army.

    DTIC Science & Technology

    1995-03-01

    Leadership " or TQL (see Chapter III, Section F for further explanation). After analyzing various approaches to quality management , the leaders of the Navy...organizations, learning and change. Theory of knowledge. 18 The Deming Approach to Quality Management E D, EDE (’f’, BI.,, IM0) Figure 3-4. The Deming Approach... managers lack profound knowledge. "Profound knowledge is a lens which provides the needed theory to optimize organizations" [Ref 3:p 94]. According to

  17. Out of the healthcare crisis.

    PubMed

    Siriwardena, A Niroshan

    2011-01-01

W. Edwards Deming's Out of the Crisis was first published almost three decades ago.(1) It was a bestseller and remains a classic text written by one of the foremost quality improvement experts of the 20th century. It is a book which certainly warrants re-examination in light of today's challenges for health care. This discussion paper reviews what Deming can teach us about the causes of failure in management, including in health care, what can be done to remedy them, and how to avert problems in the future.

  18. 40 CFR Appendix C to Part 136 - Determination of Metals and Trace Elements in Water and Wastes by Inductively Coupled Plasma...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... comparative data to other methods and SRM materials are presented in Reference 23 of Section 16.0. 13... Plasma, Anal. Chem. 52:1965, 1980. 20. Deming, S.N. and S.L. Morgan. Experimental Design for Quality and... Statistical Designs, 9941 Rowlett, Suite 6, Houston, TX 77075, 1989. 21. Winefordner, J.D., Trace Analysis...

  19. 40 CFR Appendix C to Part 136 - Determination of Metals and Trace Elements in Water and Wastes by Inductively Coupled Plasma...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... comparative data to other methods and SRM materials are presented in Reference 23 of Section 16.0. 13... Plasma, Anal. Chem. 52:1965, 1980. 20. Deming, S.N. and S.L. Morgan. Experimental Design for Quality and... Statistical Designs, 9941 Rowlett, Suite 6, Houston, TX 77075, 1989. 21. Winefordner, J.D., Trace Analysis...

  20. 40 CFR Appendix C to Part 136 - Determination of Metals and Trace Elements in Water and Wastes by Inductively Coupled Plasma...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... comparative data to other methods and SRM materials are presented in Reference 23 of Section 16.0. 13... Plasma, Anal. Chem. 52:1965, 1980. 20. Deming, S.N. and S.L. Morgan. Experimental Design for Quality and... Statistical Designs, 9941 Rowlett, Suite 6, Houston, TX 77075, 1989. 21. Winefordner, J.D., Trace Analysis...

  1. A perspectivist review of supermetallicity studies. II

    NASA Astrophysics Data System (ADS)

    Taylor, B. J.

    A summary of indirect deductions is provided, taking into account a high-dispersion analysis of Delta Pav conducted by Rodgers (1969), a study of three K giant - F dwarf binaries performed by Deming and Butler (1979), and investigations involving the Hyades giants. Attention is given to an analysis of explanations, the analyses reported by Peterson (1976), the most recent results, future work on the VSL Giants, a summary of deficiencies in the methodology of supermetallicity, and the present state of the M67 problem.

  2. DASHBOARDS & CONTROL CHARTS EXPERIENCES IN IMPROVING SAFETY AT HANFORD WASHINGTON

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PREVETTE, S.S.

    2006-02-27

    The aim of this paper is to demonstrate the integration of safety methodology, quality tools, leadership, and teamwork at Hanford and their significant positive impact on safe performance of work. Dashboards, Leading Indicators, Control Charts, Pareto Charts, Dr. W. Edwards Deming's Red Bead Experiment, and Dr. Deming's System of Profound Knowledge have been the principal tools and theory of an integrated management system. Coupled with involved leadership and teamwork, they have led to significant improvements in worker safety and protection, and environmental restoration at one of the nation's largest nuclear cleanup sites.

  3. Human Milk Fatty Acid Composition: Comparison of Novel Dried Milk Spot Versus Standard Liquid Extraction Methods.

    PubMed

    Rudolph, Michael C; Young, Bridget E; Jackson, Kristina Harris; Krebs, Nancy F; Harris, William S; MacLean, Paul S

    2016-12-01

    Accurate assessment of the long chain polyunsaturated fatty acid (LC-PUFA) content of human milk (HM) provides a powerful means to evaluate the FA nutrient status of breastfed infants. The conventional standard for FA composition analysis of HM is liquid extraction, trans-methylation, and analyte detection resolved by gas chromatography. This standard approach requires fresh or frozen samples, storage in deep freeze, organic solvents, and specialized equipment in processing and analysis. Further, HM collection is often impractical for many studies in the free-living environment, particularly for studies in developing countries. In the present study, we compare a novel and more practical approach to sample collection and processing that involves spotting and drying ~50 μL of HM on a specialized paper, stored and transported at ambient temperature until analysis. Deming regression indicated the two methods aligned very well for all LC-PUFA and the abundant HM FA. Additionally, strong correlations (r > 0.85) were observed for DHA, ARA, EPA, linoleic acid (LA), and alpha-linolenic acid (ALA), which are of particular interest to the health of the developing infant. Taken together, our data suggest this more practical and inexpensive method of collection, storage, and transport of HM samples could dramatically facilitate studies of HM, as well as understanding of how its lipid composition influences human health and development.
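Method-comparison results like those above come from the classic Deming estimator, which, unlike ordinary least squares, allows measurement error in both methods. A minimal sketch in Python (the paired values and the error-variance ratio `lam` are illustrative assumptions, not the study's data):

```python
import numpy as np

def deming_fit(x, y, lam=1.0):
    """Deming regression of y on x.

    lam is the ratio of y-error variance to x-error variance;
    lam = 1 reduces to orthogonal regression.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx = np.sum((x - x.mean()) ** 2)
    syy = np.sum((y - y.mean()) ** 2)
    sxy = np.sum((x - x.mean()) * (y - y.mean()))
    slope = ((syy - lam * sxx)
             + np.sqrt((syy - lam * sxx) ** 2 + 4.0 * lam * sxy ** 2)) / (2.0 * sxy)
    return slope, y.mean() - slope * x.mean()

# Hypothetical paired measurements (e.g., dried-spot vs. liquid extraction):
# two methods that agree should give slope ~1 and intercept ~0.
spot = [10.1, 15.2, 19.8, 25.3, 30.1, 34.8]
liquid = [10.0, 15.0, 20.0, 25.0, 30.0, 35.0]
slope, intercept = deming_fit(spot, liquid)
```

Agreement is then judged by whether the slope's confidence interval covers 1 and the intercept's covers 0, as in the abstracts above.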

  4. Ground Penetrating Radar, Magnetic and Compositional Analysis of Sediment Cores and Surface Samples: The Relationships Between Lacustrine Sediments and Holocene Lake-Level and Climate Change at Deming Lake, Minnesota, USA

    NASA Astrophysics Data System (ADS)

    Murray, R.; Lascu, I.; Plank, C.

    2007-12-01

    Deming Lake is a small (<1 square km), deep (about 17 m), meromictic kettle lake situated near the prairie-forest boundary, in Itasca State Park, MN. Because of the lake's location and morphology, the accumulated sediments comprise a high-resolution record of limnological and ecological changes in response to Holocene climate variations. We used a shore-perpendicular transect of three cores (located in littoral, mid-slope, and profundal settings) and ground penetrating radar (GPR) profiles to investigate Holocene lake-level variability at Deming. Cores were sampled continuously at a 1-2 cm resolution and sediment composition (in terms of percent organic matter, carbonate material, and minerogenic residue) was determined via loss on ignition (LOI). Isothermal remanent magnetization (IRM) and anhysteretic remanent magnetization (ARM) were used as proxies of magnetic mineral concentration and grain size. Four lithostratigraphic units were identified and correlated between cores based on these analyses. Changes in GPR facies corroborate the correlation between the two shallow cores. In order to inform our interpretation of down-core variations in magnetic properties and LOI values in terms of variations in lake depth, a suite of over 70 modern sediment samples was collected from the basin and analyzed. LOI compositional variability across the basin was high, with no clear trends related to depth or distance from shore. A sharp decrease in minerogenic content was observed at depths consistent with a predicted wave-base of 0.5 m, but aside from this trend it appears that the steep slopes of much of the basin promote gravity-driven slumping and mixing of sediments at depth. In the profundal sediments, IRM values are routinely 5% higher than in the slope and littoral environments, while ARM/IRM ratios indicate an increase in magnetic grain size with water depth.
We infer that an increase in coarse organic material in the shallow-water cores of Deming records a period of aridity (associated with a decrease in lake level of less than 2 m, based on GPR profiles) and/or increased water clarity during the regionally expansive mid-Holocene dry period. We do not see clear evidence of late-Holocene lake-level change of a significant magnitude (i.e. >1 m). While remanence measurements (especially IRM) often correlate with the LOI residue, interference in the IRM resulting from the dissolution of magnetic minerals casts uncertainty on the reliability of our magnetic measurements as a signal of climate-driven limnological change. Additional measurements must be performed before definite interpretations about the lake-level changes at Deming can be made. We suggest that future studies look more closely at the near-shore record (water depths <1 m), as our results indicate shoreline migration in response to moisture-balance fluctuations during the last 1000 years (as recorded at numerous sites in the Great Plains and upper Midwest) may have been subtle.

  5. Seasonal Variations in Water Chemistry and Sediment Composition in Three Minnesota Lakes

    NASA Astrophysics Data System (ADS)

    Lascu, I.; Ito, E.; Banerjee, S.

    2006-12-01

    Variations in water chemistry, isotopic composition of dissolved inorganic carbon, sediment geochemistry and mineral magnetism were monitored for several months in three Minnesota lakes. Lake McCarrons, Deming Lake and Steel Lake are all small (<1 km2), deep (>16 m), stratified lakes that contain varved sediments for some time intervals or throughout. Deming Lake and Steel Lake are situated in north-central Minnesota, about 40 km apart, while Lake McCarrons is located in the heart of the Twin Cities and is heavily used for recreational purposes. The lakes have different mixing regimes (Steel is dimictic, Deming is meromictic and McCarrons is oligomictic) but all have well defined epilimnia and hypolimnia during the ice-free season. Water samples were collected bi-weekly from the epilimnia, upper and lower hypolimnia, while sediments were collected monthly from sediment traps placed in shallow and deep parts of the lakes. All lakes are moderately alkaline (80-280 ppm HCO3-) carbonate-producing systems, although calcite is being dissolved in the slightly acidic hypolimnetic waters of Deming Lake. The magnetic parameters reveal different distributions of the magnetic components in the three lakes, but all exhibit a general increase in the concentration of bacterial magnetosomes towards the end of summer. Differences in elemental concentrations, cation and anion profiles, and chemical behavior as the season progressed are also obvious among the three lakes. For the two lakes situated in the same climatic regime, this implies additional controls (besides climate) on water and sediment composition, such as local hydrology, substrate composition and biogeochemical in-lake processes.

  6. Microsatellite Analyses of Blacktip Reef Sharks (Carcharhinus melanopterus) in a Fragmented Environment Show Structured Clusters

    PubMed Central

    Vignaud, Thomas; Clua, Eric; Mourier, Johann; Maynard, Jeffrey; Planes, Serge

    2013-01-01

    The population dynamics of shark species are generally poorly described because highly mobile marine life is challenging to investigate. Here we investigate the genetic population structure of the blacktip reef shark (Carcharhinus melanopterus) in French Polynesia. Five demes were sampled from five islands with different inter-island distances (50–1500 km). Whether dispersal occurs between islands frequently enough to prevent moderate genetic structure is unknown. We used 11 microsatellite loci from 165 individuals, and a strong genetic structure was found among demes with both F-statistics and Bayesian approaches. This differentiation is correlated with the geographic distance between islands. It is likely that the genetic structure seen is the result of all or some combination of the following: low gene flow, time since divergence, small effective population sizes, and the standard issues with the extent to which mutation models actually fit reality. We suggest low levels of gene flow as at least a partial explanation of the level of genetic structure seen among the sampled blacktip demes. This explanation is consistent with the ecological traits of blacktip reef sharks, and with the fact that suitable habitat for blacktips in French Polynesia is highly fragmented. Evidence for spatial genetic structure of the blacktip demes we studied highlights that similar species may have populations with as yet undetected or underestimated structure. Shark biology and the market for their fins make them highly vulnerable, and many species are in rapid decline. Our results add weight to the case that total bans on shark fishing are a better conservation approach for sharks than marine protected area networks. PMID:23585872

  7. Evaluation of linear regression techniques for atmospheric applications: the importance of appropriate weighting

    NASA Astrophysics Data System (ADS)

    Wu, Cheng; Zhen Yu, Jian

    2018-03-01

    Linear regression techniques are widely used in atmospheric science, but they are often improperly applied due to lack of consideration or inappropriate handling of measurement uncertainty. In this work, numerical experiments are performed to evaluate the performance of five linear regression techniques, significantly extending previous works by Chu and Saylor. The five techniques are ordinary least squares (OLS), Deming regression (DR), orthogonal distance regression (ODR), weighted ODR (WODR), and York regression (YR). We first introduce a new data generation scheme that employs the Mersenne twister (MT) pseudorandom number generator. The numerical simulations are also improved by (a) refining the parameterization of nonlinear measurement uncertainties, (b) inclusion of a linear measurement uncertainty, and (c) inclusion of WODR for comparison. Results show that DR, WODR and YR produce an accurate slope, but the intercept by WODR and YR is overestimated, and the degree of bias is more pronounced with a low-R2 XY dataset. The importance of a proper weighting parameter λ in DR is investigated by sensitivity tests, and it is found that an improper λ in DR can lead to a bias in both the slope and intercept estimation. Because the λ calculation depends on the actual form of the measurement error, it is essential to determine the exact form of measurement error in the XY data during the measurement stage. If the a priori error in one of the variables is unknown, or the description of the measurement error cannot be trusted, DR, WODR and YR can provide the least bias in slope and intercept among all tested regression techniques. For these reasons, DR, WODR and YR are recommended for atmospheric studies when both X and Y data have measurement errors. An Igor Pro-based program (Scatter Plot) was developed to facilitate the implementation of error-in-variables regressions.
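The λ sensitivity discussed above is easy to reproduce numerically. In the sketch below (synthetic data; the error variances and λ values are assumptions for illustration), OLS attenuates the slope when X carries error, the correctly weighted Deming fit recovers it, and a badly chosen λ lands close to the OLS result:

```python
import numpy as np

def deming_slope(x, y, lam):
    # lam = var(y-error) / var(x-error)
    sxx = np.sum((x - x.mean()) ** 2)
    syy = np.sum((y - y.mean()) ** 2)
    sxy = np.sum((x - x.mean()) * (y - y.mean()))
    return ((syy - lam * sxx)
            + np.sqrt((syy - lam * sxx) ** 2 + 4.0 * lam * sxy ** 2)) / (2.0 * sxy)

rng = np.random.default_rng(42)
truth = np.linspace(0.0, 10.0, 500)
x = truth + rng.normal(0.0, 1.0, truth.size)         # X measured with error
y = 2.0 * truth + rng.normal(0.0, 1.0, truth.size)   # true slope = 2

ols_slope = np.polyfit(x, y, 1)[0]        # ignores X error -> attenuated toward 0
dr_right = deming_slope(x, y, lam=1.0)    # correct error-variance ratio
dr_wrong = deming_slope(x, y, lam=100.0)  # improper lam -> biased like OLS
```

As λ grows without bound the Deming slope converges to the OLS slope, which is why an overestimated λ reintroduces the attenuation bias the method is meant to remove.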

  8. Validating urinary measurement of beta-2-microglobulin with a Roche reagent kit designed for serum measurements.

    PubMed

    Chan, Pak Cheung R; Kulasingam, Vathany; Lem-Ragosnig, Bonny

    2012-11-01

    To validate the use of a Roche serum beta-2-microglobulin (B2MG) kit for urinary B2MG measurements, and to establish reference limits for the urinary B2MG/creatinine ratio in healthy individuals. The Roche B2MG Tina-Quant serum kit was used to measure urinary B2MG immunoturbidimetrically. Using human urine as a diluent, the B2MG method was linear from 73-2156 μg/L. The imprecision on a commercially available urine QC was 4.4% at a concentration of 380 μg/L. The limit of quantification at <20% CV was 40 μg/L. Method comparison with the Immulite 2000 (Siemens) yielded slope=1.180 (95% CI 1.14-1.22), intercept=11.5 (95% CI -3.6-26.6), SEE=27.6 and r=0.99 (n=26) by Deming regression analysis. The upper reference limit of the B2MG/creatinine ratio determined from 195 healthy adults was 29 μg/mmol (97.5th centile). The serum B2MG Tina-Quant reagent kit is acceptable for measuring urinary B2MG. Copyright © 2012. Published by Elsevier Inc.

  9. An Analysis of the State of Total Quality In Academia

    DTIC Science & Technology

    1991-12-01

    sense, quality is anything that can be improved." [Imai, 1986] 3. Joseph M. Juran: "Quality is fitness for use." [Juran, 1989] 4. Kaoru Ishikawa ... X. INTRODUCTION "Quality Control begins with education and ends with education" Kaoru Ishikawa [Ishikawa, 1985] ... revolution, the magnitude of which has been compared to that of the industrial revolution. [Ishikawa, 1985; Deming, 1982] Dramatic changes are

  10. Hospital quality: a product of good management as much as good treatment.

    PubMed

    Hyde, Andy; Frafjord, Anders

    2013-01-01

    In Norway, as in most countries, the demands placed on hospitals to reduce costs and improve the quality of services are intense. Although many say that improving quality reduces costs, few can prove it. Furthermore, fewer still can show that improving quality improves patient satisfaction. Diakonhjemmet hospital in Norway has designed and implemented a hospital management system based on lean principles and the PDCA (Plan-Do-Check-Act) quality circle introduced by W.E. Deming (Deming 2000). The results are quite impressive, with improvements in quality and patient satisfaction. The hospital also runs at a profit.

  11. An Exploratory Survey of Methods Used to Develop Measures of Performance

    DTIC Science & Technology

    1993-09-01

    Genichi Taguchi, Robert C. Camp, Kaoru Ishikawa, Dorsey J. Talley, Philip B. Crosby, J.M. Juran, Arthur R. Tenner, W. Edwards Deming ... authored books or papers on the subject of quality? (Mark all that apply) Nancy Brady, H. James Harrington, Genichi Taguchi, Robert C. Camp, Kaoru Ishikawa, Dorsey J. Talley, Philip B. Crosby, J.M. Juran, Arthur R. Tenner, W. Edwards Deming, Dennis Kinlaw, Hans J. Thamhain, Irving J...

  12. Using statistical process control to make data-based clinical decisions.

    PubMed

    Pfadt, A; Wheeler, D J

    1995-01-01

    Applied behavior analysis is based on an investigation of variability due to interrelationships among antecedents, behavior, and consequences. This permits testable hypotheses about the causes of behavior, as well as the course of treatment, to be evaluated empirically. Such information provides corrective feedback for making data-based clinical decisions. This paper considers how a different approach to the analysis of variability, based on the writings of Walter Shewhart and W. Edwards Deming in the area of industrial quality control, helps to achieve similar objectives. Statistical process control (SPC) was developed to implement a process of continual product improvement while achieving compliance with production standards and other requirements for promoting customer satisfaction. SPC involves the use of simple statistical tools, such as histograms and control charts, as well as problem-solving techniques, such as flow charts, cause-and-effect diagrams, and Pareto charts, to implement Deming's management philosophy. These data-analytic procedures can be incorporated into a human service organization to help achieve its stated objectives in a manner that leads to continuous improvement in the functioning of the clients who are its customers. Examples are provided to illustrate how SPC procedures can be used to analyze behavioral data. Issues related to the application of these tools for making data-based clinical decisions and for creating an organizational climate that promotes their routine use in applied settings are also considered.
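The control-chart machinery described above is small enough to sketch directly. An individuals (XmR) chart places limits at the mean ± 3 sigma, with sigma estimated from the average moving range divided by the d2 constant 1.128; the behavioral counts below are invented for illustration:

```python
import numpy as np

def xmr_limits(data):
    """Shewhart individuals (XmR) chart limits.

    Sigma is estimated from the mean moving range divided by
    d2 = 1.128 (the bias-correction constant for subgroups of 2).
    """
    data = np.asarray(data, float)
    center = data.mean()
    sigma = np.abs(np.diff(data)).mean() / 1.128
    return center - 3 * sigma, center, center + 3 * sigma

# Hypothetical daily counts of a target behavior
counts = [12, 15, 11, 14, 13, 16, 12, 15, 14, 13]
lcl, cl, ucl = xmr_limits(counts)
signals = [x for x in counts if not lcl <= x <= ucl]  # points beyond 3-sigma
```

Points outside the limits flag special-cause variation worth clinical attention; points inside are treated as common-cause noise, which is exactly the distinction the paper borrows from industrial SPC.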

  13. Comparison of approaches to Total Quality Management. Including an examination of the Department of Energy's position on quality management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, C.T.

    1994-03-01

    This paper presents a comparison of several qualitatively different approaches to Total Quality Management (TQM). The continuum ranges from management approaches that are primarily standards -- with specific guidelines, but few theoretical concepts -- to approaches that are primarily philosophical, with few specific guidelines. The approaches to TQM discussed in this paper include the International Organization for Standardization (ISO) 9000 Standard, the Malcolm Baldrige National Quality Award, Senge's Learning Organization, Watkins and Marsick's approach to organizational learning, Covey's Seven Habits of Highly Effective People, and Deming's Fourteen Points for Management. Some of these approaches (Deming and ISO 9000) are then compared to the DOE's official position on quality management and conduct of operations (DOE Orders 5700.6C and 5480.19). Using a tabular format, it is shown that while 5700.6C (Quality Assurance) maps well to many of the current approaches to TQM, DOE's principal guide to management, Order 5480.19 (Conduct of Operations), has many significant conflicts with some of the modern approaches to continuous quality improvement.

  14. Thermal conductivity anisotropy of rocks

    NASA Astrophysics Data System (ADS)

    Lee, Youngmin; Keehm, Youngseuk; Shin, Sang Ho

    2013-04-01

    The interior heat of the lithosphere of the Earth is mainly transferred by conduction, which depends on the thermal conductivity of rocks. Many sedimentary and metamorphic rocks have thermal conductivity anisotropy, i.e. heat is preferentially transferred in the direction parallel to the bedding and foliation of these rocks. Deming (JGR, 1994) proposed an empirical relationship between K(perp) and anisotropy (K(par)/K(perp)) using 89 measurements on rock samples from the literature. In Deming's model, thermal conductivity is almost isotropic for K(perp) > 4 W/mK, but anisotropy increases exponentially with decreasing K(perp), reaching a final anisotropy of ~2.5 at K(perp) < 1.0 W/mK. However, Davis et al. (JGR, 2007) argued that there is little evidence for Deming's suggestion that the thermal conductivity anisotropy of all rocks increases systematically to about 2.5 for rocks with low thermal conductivity. Davis et al. insisted that Deming's increase in anisotropy for 1 < K(perp) < 4 W/mK with decreasing K(perp) could be due to fractures filled with air or water, which cause thermal conductivity anisotropy. To test Deming's suggestion and Davis et al.'s argument on thermal conductivity anisotropy, we measured thermal conductivity parallel (K(par)) and perpendicular (K(perp)) to bedding or foliation and performed analytical & numerical modeling. Our measurements on 53 rock samples show anisotropy ranging from 0.79 to 1.36 for 1.84 < K(perp) < 4.06 W/mK. Analytical models show that anisotropy can increase or stay the same in the range of 1 < K(perp) < 4 W/mK. Numerical modeling for gneiss shows that anisotropy ranges from 1.21 to 1.36 for 2.5 < K(perp) < 4.8 W/mK. Another numerical model with interbedded coal layers in high thermal conductivity rocks (3.5 W/mK) shows an anisotropy of 1.87 when K(perp) is 1.7 W/mK. Finally, numerical modeling with fractures indicates that fractures do not seem to affect thermal conductivity anisotropy significantly.
In conclusion, our preliminary results imply that thermal conductivity anisotropy can increase or stay low in the range of 1.0 < K(perp) < 4.0 W/mK. Both cases are shown to be possible through lab measurements and analytical & numerical modeling.
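The interbedded-layer behavior above follows from the standard series/parallel bounds for a layered stack: heat flowing parallel to layering sees the thickness-weighted arithmetic mean of the conductivities, while heat flowing across them sees the harmonic mean. A sketch in Python (the 0.3 W/mK coal conductivity and 10% coal fraction are assumed illustrative values, not the study's model inputs):

```python
def layered_k(fractions, conductivities):
    """Effective conductivities of a layered stack.

    Parallel to layering: thickness-weighted arithmetic mean.
    Perpendicular: harmonic mean (thermal resistances in series).
    """
    k_par = sum(f * k for f, k in zip(fractions, conductivities))
    k_perp = 1.0 / sum(f / k for f, k in zip(fractions, conductivities))
    return k_par, k_perp

# 10% coal (assumed 0.3 W/mK) interbedded in 3.5 W/mK rock
k_par, k_perp = layered_k([0.1, 0.9], [0.3, 3.5])
anisotropy = k_par / k_perp
```

With these assumed values the stack gives K(perp) ≈ 1.7 W/mK and anisotropy ≈ 1.9, the same order as the modeled coal-layer case quoted above; a small fraction of low-conductivity layers depresses K(perp) far more than K(par).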

  15. A quality improvement management model for renal care.

    PubMed

    Vlchek, D L; Day, L M

    1991-04-01

    The purpose of this article is to explore the potential for applying the theory and tools of quality improvement (total quality management) in the renal care setting. We believe that the coupling of the statistical techniques used in the Deming method of quality improvement, with modern approaches to outcome and process analysis, will provide the renal care community with powerful tools, not only for improved quality (i.e., reduced morbidity and mortality), but also for technology evaluation and resource allocation.

  16. Lessons Learned from the Implementation of Total Quality Management at the Naval Aviation Depot, North Island, California

    DTIC Science & Technology

    1988-12-01

    Kaoru Ishikawa recognized the potential of statistical process control during one of Dr. Deming's many instructional visits to Japan. He wrote the Guide to Quality Control, which has been utilized for both self-study and classroom training. In the Guide to Quality Control, Dr. Ishikawa describes ... job data are essential for making a proper evaluation. (Ishikawa, p. 14) The gathering of data and its subsequent analysis are the foundation of

  17. Genetic identification of Theobroma cacao L. trees with high Criollo ancestry in Soconusco, Chiapas, Mexico.

    PubMed

    Vázquez-Ovando, J A; Molina-Freaner, F; Nuñez-Farfán, J; Ovando-Medina, I; Salvador-Figueroa, M

    2014-12-12

    Criollo-type cacao trees are an important pool of genes with potential to be used in cacao breeding and selection programs. For that reason, we assessed the diversity and population structure of Criollo-type trees (108 cultivars with Criollo phenotypic characteristics and 10 Criollo references) using 12 simple sequence repeat (SSR) markers. Cultivars were selected from 7 demes in the Soconusco region of southern Mexico. SSRs amplified 74 alleles with an average of 3.6 alleles per population. The overall populations showed an average observed heterozygosity of 0.28, indicating heterozygote deficiency (average fixation index F = 0.50). However, moderate allelic diversity was found within populations (Shannon index for all populations I = 0.97). Bayesian method analysis determined 2 genetic clusters (K = 2) within individuals. In concordance, an assignment test grouped 37 multilocus genotypes (including 10 references) into a first cluster (Criollo), 54 into a second (presumably Amelonado), and 27 admixed individuals unassigned at the 90% threshold, likely corresponding to the Trinitario genotype. This classification was supported by the principal coordinate analysis and analysis of molecular variance, which showed 12% of variation among populations (FST = 0.123, P < 0.0001). Sampled deme sites (1-7) in the Soconusco region did not show any evidence of clustering by geographic location, and this was supported by the Mantel test (Rxy = 0.54, P = 0.120). Individuals with high Criollo lineage planted in Soconusco farms could be an important reservoir of genes for future breeding programs in search of cocoa with fine taste, flavor, and aroma.
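The heterozygote-deficiency statistics quoted above (observed heterozygosity Ho and the fixation index F = 1 - Ho/He) reduce to allele-count arithmetic. A minimal single-locus sketch (the genotypes are invented for illustration, not the cacao data):

```python
from collections import Counter

def fixation_index(genotypes):
    """Single-locus Ho, He, and fixation index F = 1 - Ho/He.

    genotypes: list of diploid genotypes, each a pair of allele labels.
    """
    n = len(genotypes)
    ho = sum(a != b for a, b in genotypes) / n  # observed heterozygosity
    counts = Counter(a for pair in genotypes for a in pair)
    total = 2 * n
    # expected heterozygosity under Hardy-Weinberg: 1 - sum(p_i^2)
    he = 1.0 - sum((c / total) ** 2 for c in counts.values())
    return ho, he, 1.0 - ho / he

# 6 AA, 2 AB, 2 BB  ->  allele frequencies p(A) = 0.7, q(B) = 0.3
genos = [("A", "A")] * 6 + [("A", "B")] * 2 + [("B", "B")] * 2
ho, he, f = fixation_index(genos)
```

F near 0 means genotype frequencies match Hardy-Weinberg expectations; a strongly positive F, as reported for the cacao demes, signals a heterozygote deficit (for example from inbreeding or population substructure).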

  18. Clinical Evaluation of the BD FACSPresto™ Near-Patient CD4 Counter in Kenya

    PubMed Central

    Angira, Francis; Akoth, Benta; Omolo, Paul; Opollo, Valarie; Bornheimer, Scott; Judge, Kevin; Tilahun, Henok; Lu, Beverly; Omana-Zapata, Imelda; Zeh, Clement

    2016-01-01

    Background The BD FACSPresto™ Near-Patient CD4 Counter was developed to expand HIV/AIDS management in resource-limited settings. It measures absolute CD4 counts (AbsCD4), percent CD4 (%CD4), and hemoglobin (Hb) from a single drop of capillary or venous blood in approximately 23 minutes, with a throughput of 10 samples per hour. We assessed the performance of the BD FACSPresto system, evaluating accuracy, stability, linearity, precision, and reference intervals using capillary and venous blood at the KEMRI/CDC HIV-research laboratory, Kisumu, Kenya, and precision and linearity at BD Biosciences, California, USA. Methods For accuracy, venous samples were tested using the BD FACSCalibur™ instrument with BD Tritest™ CD3/CD4/CD45 reagent, BD Trucount™ tubes, and BD Multiset™ software for AbsCD4 and %CD4, and the Sysmex™ KX-21N for Hb. Stability studies evaluated the duration of staining (18–120-minute incubation) and the effects of venous blood storage <6–24 hours post-draw. A normal cohort was tested for reference intervals. Precision covered multiple days, operators, and instruments. Linearity required mixing two pools of samples to obtain evenly spaced concentrations for AbsCD4, total lymphocytes, and Hb. Results AbsCD4 and %CD4 venous/capillary (N = 189/N = 162) accuracy results gave Deming regression slopes within 0.97–1.03 and R2 ≥0.96. For Hb, Deming regression results were R2 ≥0.94 and slope ≥0.94 for both venous and capillary samples. Stability was within 10% up to 2 hours after staining and for venous blood stored less than 24 hours. Reference-interval results showed that differences by gender, but not by age, were statistically significant (p<0.05). Precision results had <3.5% coefficient of variation for AbsCD4, %CD4, and Hb, except for low AbsCD4 samples (<6.8%). Linearity was 42–4,897 cells/μL for AbsCD4, 182–11,704 cells/μL for total lymphocytes, and 2–24 g/dL for Hb.
Conclusions The BD FACSPresto system provides accurate, precise clinical results for capillary or venous blood samples and is suitable for near-patient CD4 testing. Trial Registration ClinicalTrials.gov NCT02396355 PMID:27483008

  19. Transcriptional over-expression of chloride intracellular channels 3 and 4 in malignant pleural mesothelioma.

    PubMed

    Tasiopoulou, Vasiliki; Magouliotis, Dimitrios; Solenov, Evgeniy I; Vavougios, Georgios; Molyvdas, Paschalis-Adam; Gourgoulianis, Konstantinos I; Hatzoglou, Chrissi; Zarogiannis, Sotirios G

    2015-12-01

    Chloride Intracellular Channels (CLICs) contribute to the regulation of multiple cellular functions. CLICs have been found over-expressed in several malignancies, and therefore they are currently considered as potential drug targets. The goal of our study was to assess the gene expression levels of CLICs 1-6 in malignant pleural mesothelioma (MPM) as compared to controls. We used gene expression data from a publicly available microarray dataset comparing MPM versus healthy tissue in order to investigate the differential expression profile of CLICs 1-6. False discovery rates were calculated, the interactome of the significantly differentially expressed CLICs was constructed, and Functional Enrichment Analysis for Gene Ontologies (FEAGO) was performed. In MPM, the gene expressions of CLIC3 and CLIC4 were significantly increased compared to controls (p=0.001 and p<0.001, respectively). A significant positive correlation between the gene expressions of CLIC3 and CLIC4 (p=0.0008 and Pearson's r=0.51) was found. Deming regression analysis provided an association equation between the CLIC3 and CLIC4 gene expressions: CLIC3 = 4.42 × CLIC4 - 10.07. Our results indicate that CLIC3 and CLIC4 are over-expressed in human MPM. Moreover, their expressions correlate, suggesting that they either share common gene expression inducers or that their products act synergistically. FEAGO showed that the CLIC interactome might contribute to TGF beta signaling and water transport. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Removing Batch Effects from Longitudinal Gene Expression - Quantile Normalization Plus ComBat as Best Approach for Microarray Transcriptome Data

    PubMed Central

    Müller, Christian; Schillert, Arne; Röthemeier, Caroline; Trégouët, David-Alexandre; Proust, Carole; Binder, Harald; Pfeiffer, Norbert; Beutel, Manfred; Lackner, Karl J.; Schnabel, Renate B.; Tiret, Laurence; Wild, Philipp S.; Blankenberg, Stefan

    2016-01-01

    Technical variation plays an important role in microarray-based gene expression studies, and batch effects explain a large proportion of this noise. It is therefore mandatory to eliminate technical variation while maintaining biological variability. Several strategies have been proposed for the removal of batch effects, although they have not been evaluated in large-scale longitudinal gene expression data. In this study, we aimed at identifying a suitable method for batch effect removal in a large study of microarray-based longitudinal gene expression. Monocytic gene expression was measured in 1092 participants of the Gutenberg Health Study at baseline and 5-year follow-up. Replicates of selected samples were measured at both time points to identify technical variability. Deming regression, Passing-Bablok regression, linear mixed models, non-linear models as well as ReplicateRUV and ComBat were applied to eliminate batch effects between replicates. In a second step, quantile normalization prior to batch effect correction was performed for each method. Technical variation between batches was evaluated by principal component analysis. Associations between body mass index and transcriptomes were calculated before and after batch removal. Results from association analyses were compared to evaluate maintenance of biological variability. Quantile normalization, separately performed in each batch, combined with ComBat successfully reduced batch effects and maintained biological variability. ReplicateRUV performed perfectly in the replicate data subset of the study, but failed when applied to all samples. All other methods did not substantially reduce batch effects in the replicate data subset. Quantile normalization plus ComBat appears to be a valuable approach for batch correction in longitudinal gene expression data. PMID:27272489
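Quantile normalization, the first step of the winning pipeline above, forces every sample (column) onto a common reference distribution: the row-wise mean of the per-sample sorted values. A minimal sketch (toy matrix; ties, if present, are broken by sort order here, whereas production implementations usually average tied ranks):

```python
import numpy as np

def quantile_normalize(mat):
    """Quantile-normalize the columns of a genes x samples matrix."""
    order = np.argsort(mat, axis=0)           # per-column sort order
    ref = np.sort(mat, axis=0).mean(axis=1)   # shared reference distribution
    out = np.empty_like(mat, dtype=float)
    for j in range(mat.shape[1]):
        out[order[:, j], j] = ref             # map each column's ranks to the reference
    return out

# Toy expression matrix: 4 genes x 3 samples (e.g., 3 batches)
expr = np.array([[5.0, 4.0, 3.0],
                 [2.0, 1.0, 4.0],
                 [3.0, 2.0, 6.0],
                 [4.0, 3.0, 8.0]])
norm = quantile_normalize(expr)
```

After each batch is normalized this way, batch-specific location and scale shifts can be removed with an empirical-Bayes method such as ComBat, which is the combination the study found most effective.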

  1. Management Tools in Engineering Education.

    ERIC Educational Resources Information Center

    Fehr, M.

    1999-01-01

    Describes a teaching model that applies management tools such as delegation, total quality management, time management, teamwork, and Deming rules. Promotes the advantages of efficiency, reporting, independent scheduling, and quality. (SK)

  2. British standard (BS) 5750--quality assurance?

    PubMed

    Pratt, D J

    1995-04-01

    BS5750 is the British Standard on "Quality Systems". Its equivalent in European standards is EN29000 and, from the International Organization for Standardization, ISO9000. This paper points out that these standards lay down formalised procedures and require documentation but do not ipso facto lead to quality assurance. The author points to the Japanese post-war industrial success as an example of Total Quality Management within the framework provided by the philosophy of Dr. W. Edwards Deming (1988 and 1993). This philosophy on the management of "systems" to provide high-quality products and services is briefly outlined. The author argues that improvement in prosthetic and orthotic services will not be reached through implementation of BS5750 but rather through radical rethinking and the adoption and application of the Deming philosophy.

  3. Inferred vs Realized Patterns of Gene Flow: An Analysis of Population Structure in the Andros Island Rock Iguana

    PubMed Central

    Colosimo, Giuliano; Knapp, Charles R.; Wallace, Lisa E.; Welch, Mark E.

    2014-01-01

    Ecological data, the primary source of information on patterns and rates of migration, can be integrated with genetic data to more accurately describe the realized connectivity between geographically isolated demes. In this paper we implement this approach and discuss its implications for managing populations of the endangered Andros Island Rock Iguana, Cyclura cychlura cychlura. This iguana is endemic to Andros, a highly fragmented landmass of large islands and smaller cays. Field observations suggest that geographically isolated demes were panmictic due to high, inferred rates of gene flow. We expand on these observations using 16 polymorphic microsatellites to investigate the genetic structure and rates of gene flow from 188 Andros Iguanas collected across 23 island sites. Bayesian clustering of specimens assigned individuals to three distinct genotypic clusters. An analysis of molecular variance (AMOVA) indicates that allele frequency differences are responsible for a significant portion of the genetic variance across the three defined clusters (Fst = 0.117, p < 0.01). These clusters are associated with larger islands and satellite cays isolated by broad water channels with strong currents. These findings imply that broad water channels present greater obstacles to gene flow than was inferred from field observation alone. Additionally, rates of gene flow were indirectly estimated using BAYESASS 3.0. The proportion of individuals originating from within each identified cluster varied from 94.5 to 98.7%, providing further support for local isolation. Our assessment reveals a major disparity between inferred and realized gene flow. We discuss our results from a conservation perspective for species inhabiting highly fragmented landscapes. PMID:25229344

  4. Neoglacial fluctuations of Deming Glacier, Mt. Baker, Washington USA.

    NASA Astrophysics Data System (ADS)

    Osborn, G.; Menounos, B.; Scott, K.; Clague, J. J.; Tucker, D.; Riedel, J.; Davis, P.

    2007-12-01

    Deming Glacier flows from the upper west slopes of Mt. Baker, a stratovolcano in the Cascade Range of Washington, USA. The north and south lateral moraines of Deming Glacier are composed of at least four tills separated by layers of detrital wood and sheared stumps in growth position. The stratigraphy records fluctuations of the glacier during the Holocene. The outer ten rings of an in situ stump from the middle wood layer, which is about 40 m below the north lateral moraine crest and 1.2 km downvalley from the present glacier terminus, yielded an age of 1750 ± 50 14C yr BP [1810-1550 cal yr BP]. The stump revealed at least 300 rings and thus records a period of landscape stability and relatively restricted glaciation for several hundred years prior to ca. 1750 14C yr BP. Samples from the lowest wood layer have also been submitted for radiocarbon dating. Outer rings of detrital wood samples collected from two wood mats exposed in the south lateral moraine, 2.3 km downvalley of the glacier terminus, returned radiocarbon ages of 1600 ± 30 14C yr BP [1550-1410 cal yr BP] and 430 ± 30 14C yr BP [AD 1420-1620]. These data indicate that Deming Glacier advanced over a vegetated moraine sometime after 1810 cal yr BP to a position less extensive than the one it achieved at the peak of the Little Ice Age. The glacier then receded before it began its final and most extensive Holocene advance after AD 1420. The older advance is correlative with the 'First Millennium AD' advance, recently recognized throughout western North America. The younger advance coincides with an advance of Mt. Baker's Easton Glacier [AD 1430-1630], and advances of many alpine glaciers elsewhere in western North America. Our data suggest that glaciers on Mt. Baker fluctuated in a similar manner to alpine glaciers in the Coast Mountains of British Columbia and in other mountain ranges of northwest North America during Neoglaciation.

  5. Evaluation of the accuracy of a microdialysis-based glucose sensor during insulin-induced hypoglycemia, its recovery, and post-hypoglycemic hyperglycemia in humans.

    PubMed

    Rossetti, P; Porcellati, F; Fanelli, C G; Bolli, G B

    2006-06-01

    These studies were designed to evaluate the accuracy of a microdialysis-based subcutaneous glucose sensor (GlucoDay, A. Menarini Diagnostics, Firenze, Italy) compared with a standard reference method of plasma glucose measurement during insulin-induced hypoglycemia. Nine subjects without diabetes were studied in eu-, hypo-, and hyperglycemia (clamp technique). The GlucoDay was calibrated against one arterialized plasma glucose measurement (Glucose Analyzer, Beckman, Brea, CA), and plasma glucose estimates every 3 min were compared with paired plasma glucose values. Accuracy of glucose estimates was not homogeneously distributed among subjects and depended on the stability of the sensor's current signal during spontaneous euglycemia (R = -0.68). Linear regression analysis showed a good correlation between the two methods of measurement (R = 0.9), Deming regression showed the inclusion of unity in the confidence interval of the slope (slope 0.95, 95% confidence interval 0.87-1.02), and the accuracy of the GlucoDay reached 40 +/- 15% (American Diabetes Association criteria). The mean relative difference was 6 +/- 8% in euglycemia, 13 +/- 14% during plasma glucose fall, 5 +/- 22% in the hypoglycemic plateau, and -14 +/- 16% during recovery from hypoglycemia. The Bland-Altman analysis indicated a bias of -1.9 +/- 16.6 mg/dL, whereas the Error Grid Analysis showed 94% of the GlucoDay measurements in the acceptable zones of the grid. The time to reach the glycemic nadir was longer when measured with the GlucoDay (90 +/- 5 vs. 72.5 +/- 9 min, P < 0.05). However, absolute values of the glycemic nadir, time spent in hypoglycemia, the rate of fall of glycemia, and the rate of recovery from hypoglycemia were not statistically different. GlucoDay closely monitors changes in plasma glucose before, during, and after hypoglycemia. However, these results can be achieved only if calibration of the GlucoDay is performed under conditions of sensor signal stability. Similar studies have to be performed in subjects with diabetes to validate the GlucoDay system.
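    Many of the records in this collection compare two measurement methods by Deming regression, which, unlike ordinary least squares, allows measurement error in both methods. A minimal sketch of the slope and intercept computation, assuming an error-variance ratio lambda = 1 (orthogonal fit) and purely illustrative data rather than any study's measurements:

```python
from math import sqrt

def deming(x, y, lam=1.0):
    """Deming regression of y on x; lam is the ratio of the error
    variance in y to the error variance in x (lam=1 -> orthogonal)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = ((syy - lam * sxx)
             + sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    return slope, my - slope * mx  # (slope, intercept)

# Illustrative paired readings from two hypothetical methods:
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [3.0, 5.1, 6.9, 9.1, 11.0]   # roughly y = 2x + 1
slope, intercept = deming(x, y)
```

    A slope confidence interval containing 1 and an intercept near 0, as reported in several records here, indicate agreement between the two methods.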

  6. Feasibility Study of Economics and Performance of Solar Photovoltaics at the Peru Mill Industrial Park in the City of Deming, New Mexico. A Study Prepared in Partnership with the Environmental Protection Agency for the RE-Powering America's Land Initiative: Siting Renewable Energy on Potentially Contaminated Land and Mine Sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiatreungwattana, K.; Geiger, J.; Healey, V.

    2013-04-01

    The U.S. Environmental Protection Agency (EPA), in accordance with the RE-Powering America's Land initiative, selected the Peru Mill Industrial Park site in the City of Deming, New Mexico, for a feasibility study of renewable energy production. The National Renewable Energy Laboratory (NREL) provided technical assistance for this project. The purpose of this report is to assess the site for a possible photovoltaic (PV) system installation and estimate the cost, performance, and site impacts of different PV options. In addition, the report recommends financing options that could assist in the implementation of a PV system at the site.

  7. Local extinction and recolonization, species effective population size, and modern human origins.

    PubMed

    Eller, Elise; Hawks, John; Relethford, John H

    2004-10-01

    A primary objection from a population genetics perspective to a multiregional model of modern human origins is that the model posits a large census size, whereas genetic data suggest a small effective population size. The relationship between census size and effective size is complex, but arguments based on an island model of migration show that if the effective population size reflects the number of breeding individuals and the effects of population subdivision, then an effective population size of 10,000 is inconsistent with the census size of 500,000 to 1,000,000 that has been suggested by archeological evidence. However, these models have ignored the effects of population extinction and recolonization, which increase the expected variance among demes and reduce the inbreeding effective population size. Using models developed for population extinction and recolonization, we show that a large census size consistent with the multiregional model can be reconciled with an effective population size of 10,000, but genetic variation among demes must be high, reflecting low interdeme migration rates and a colonization process that involves a small number of colonists or kin-structured colonization. Ethnographic and archeological evidence is insufficient to determine whether such demographic conditions existed among Pleistocene human populations, and further work needs to be done. More realistic models that incorporate isolation by distance and heterogeneity in extinction rates and effective deme sizes also need to be developed. However, if true, a process of population extinction and recolonization has interesting implications for human demographic history.

  8. Evaluation of the Aptima HBV Quant assay vs. the COBAS TaqMan HBV test using the high pure system for the quantitation of HBV DNA in plasma and serum samples.

    PubMed

    Schalasta, Gunnar; Börner, Anna; Speicher, Andrea; Enders, Martin

    2018-03-28

    Proper management of patients with chronic hepatitis B virus (HBV) infection requires monitoring of plasma or serum HBV DNA levels using a highly sensitive nucleic acid amplification test. Because commercially available assays differ in performance, we compared herein the performance of the Hologic Aptima HBV Quant assay (Aptima) to that of the Roche Cobas TaqMan HBV test for use with the high pure system (HPS/CTM). Assay performance was assessed using HBV reference panels as well as plasma and serum samples from chronically HBV-infected patients. Method correlation, analytical sensitivity, precision/reproducibility, linearity, bias and influence of genotype were evaluated. Data analysis was performed using linear regression, Deming correlation analysis and Bland-Altman analysis. Agreement between the assays for the two reference panels was good, with a difference in assay values vs. target <0.5 log. Qualitative assay results for 159 clinical samples showed good concordance (88.1%; κ=0.75; 95% confidence interval: 0.651-0.845). For the 106 samples quantitated by both assays, viral load results were highly correlated (R=0.92) and differed on average by 0.09 log, with 95.3% of the samples being within the 95% limit of agreement of the assays. Linearity for viral loads 1-7 log was excellent for both assays (R2>0.98). The two assays had similar bias and precision across the different genotypes tested at low viral loads (25-1000 IU/mL). Aptima has a performance comparable with that of HPS/CTM, making it suitable for use for HBV infection monitoring. Aptima runs on a fully automated platform (the Panther system) and therefore offers a significantly improved workflow compared with HPS/CTM.
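    The concordance statistic quoted above (88.1%; κ = 0.75) is a kappa coefficient, which corrects raw agreement for chance. A minimal sketch of Cohen's kappa for binary assay calls, using made-up detection results rather than the study's 159 samples:

```python
def cohens_kappa(a, b):
    """Chance-corrected agreement between two assays on binary
    outcomes (1 = target detected, 0 = not detected)."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n   # observed agreement
    pa, pb = sum(a) / n, sum(b) / n
    pe = pa * pb + (1 - pa) * (1 - pb)           # agreement expected by chance
    return (po - pe) / (1 - pe)

# Made-up qualitative results for two assays on ten samples:
assay_1 = [1, 1, 1, 0, 0, 1, 0, 1, 1, 0]
assay_2 = [1, 1, 0, 0, 0, 1, 0, 1, 1, 1]
kappa = cohens_kappa(assay_1, assay_2)
```

    Values above about 0.6 are conventionally read as good agreement, which is how the κ = 0.75 in this record is interpreted.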

  9. Comparison of the quantification of acetaminophen in plasma, cerebrospinal fluid and dried blood spots using high-performance liquid chromatography-tandem mass spectrometry

    PubMed Central

    Taylor, Rachel R.; Hoffman, Keith L.; Schniedewind, Björn; Clavijo, Claudia; Galinkin, Jeffrey L.; Christians, Uwe

    2013-01-01

    Acetaminophen (paracetamol, N-(4-hydroxyphenyl)acetamide) is one of the most commonly prescribed drugs for the management of pain in children. Quantification of acetaminophen in pre-term and term neonates and small children requires the availability of highly sensitive assays in small-volume blood samples. We developed and validated an LC-MS/MS assay for the quantification of acetaminophen in human plasma, cerebrospinal fluid (CSF) and dried blood spots (DBS). Reconstitution in water (DBS only) and addition of a protein precipitation solution containing the deuterated internal standard were the only manual steps. Extracted samples were analyzed on a Kinetex 2.6 μm PFP column using an acetonitrile/formic acid gradient. The analytes were detected in the positive multiple reaction mode. Alternatively, DBS were automatically processed using direct desorption in a sample card and preparation (SCAP) robotic autosampler in combination with online extraction. The range of reliable response in plasma and CSF was 3.05-20,000 ng/ml (r2 > 0.99) and 27.4-20,000 ng/ml (r2 > 0.99) for DBS (manual extraction and automated direct desorption). Inter-day accuracy was always within 85-115%, and inter-day precision for plasma, CSF and manually extracted DBS was less than 15%. Deming regression analysis comparing 167 matching pairs of plasma and DBS samples showed a correlation coefficient of 0.98. Bland-Altman analysis indicated a 26.6% positive bias in DBS, most likely reflecting the blood:plasma distribution ratio of acetaminophen. DBS are a valid matrix for acetaminophen pharmacokinetic studies. PMID:23670126

  10. Management Manifesto.

    ERIC Educational Resources Information Center

    Siu-Runyan, Yvonne; Heart, Sally Joy

    1992-01-01

    Deming's 14 principles, which have revitalized Japanese industry, can help restructure the education workplace. Administrators should agree on future goals and priorities; adopt a cooperative, "lead management" philosophy; cease dependency on mass inspection; constantly improve production and service; institute training and retraining;…

  11. Adsorption Characteristics of LaNi5 Particles

    NASA Astrophysics Data System (ADS)

    Song, M. Y.; Park, H. R.

    1997-11-01

    Nitrogen adsorption on an intermetallic compound, LaNi5, was studied before and after activation and after hydriding-dehydriding cycling. The specific surface area of activated LaNi5 was 0.271 ± 0.004 m²/g. Adsorption and desorption isotherms of activated LaNi5 were obtained. The adsorption isotherm was similar to type II among the five types of isotherms classified by S. Brunauer, L. S. Deming, W. S. Deming, and E. Teller (J. Am. Chem. Soc. 62, 1723, 1940). Its hysteresis curve had the type B form among de Boer's five types of hysteresis. Desorption pore-size analyses showed that the activated LaNi5 had only a few mesopores, the diameters of which were around 20-110 Å. The average adsorption rate of the activated LaNi5 showed a first-order dependence on nitrogen pressure at 77 K.

  12. Language-learning disabilities: Paradigms for the nineties.

    PubMed

    Wiig, E H

    1991-01-01

    We are beginning a decade, during which many traditional paradigms in education, special education, and speech-language pathology will undergo change. Among paradigms considered promising for speech-language pathology in the schools are collaborative language intervention and strategy training for language and communication. This presentation introduces management models for developing a collaborative language intervention process, among them the Deming Management Method for Total Quality (TQ) (Deming 1986). Implementation models for language assessment and IEP planning and multicultural issues are also introduced (Damico and Nye 1990; Secord and Wiig in press). While attention to processes involved in developing and implementing collaborative language intervention is paramount, content should not be neglected. To this end, strategy training for language and communication is introduced as a viable paradigm. Macro- and micro-level process models for strategy training are featured and general issues are discussed (Ellis, Deshler, and Schumaker 1989; Swanson 1989; Wiig 1989).

  13. Rewarding Teachers without Pay Increases.

    ERIC Educational Resources Information Center

    Hayden, Gary

    1993-01-01

    Today's educational institutions should establish a system of intrinsic rewards for teachers and other staff. This article reviews research on intrinsic motivators, including Deming's total quality concepts, and recommends providing teachers with more individualized instruction, reorganizing faculty supervision practices, giving teachers greater…

  14. Population dynamics of aberrant chromosome 1 in mice.

    PubMed

    Sabantsev, I; Spitsin, O; Agulnik, S; Ruvinsky, A

    1993-05-01

    Natural populations of two semispecies of house mouse, Mus musculus domesticus and M.m. musculus, were found to be polymorphic for an aberrant chromosome 1 bearing a large inserted block of homogeneously staining heterochromatin. Strong meiotic drive for the aberrant chromosome from M.m. musculus was previously observed in heterozygous female mice. There are at least three meiotic drive levels determined by different allelic variants of distorter. Homozygotes had low viability and females showed low fertility. Both homo- and heterozygous males had normal fertility and their segregation patterns did not deviate from normal. Computer simulations were performed of the dynamics of aberrant chromosome 1 in demes and populations. The data demonstrate that a spontaneous mutation (inversion) of an aberrant chromosome 1, once arisen, has a high probability of spreading in a population at high coefficients of meiotic drive and migration. In the long-term, the population attains a stationary state which is determined by the drive level and migration intensity. The state of stable genotypic equilibrium is independent of deme and population size, as well as of the initial concentration of the aberrant chromosome. As populations initially polymorphic for the distorters approach the stationary state, the stronger distorter is eliminated. The frequencies of the aberrant chromosome determined by computer analysis agree well with those obtained for the studied Asian M.m. musculus populations. The evolutionary pathways for the origin and fixation of the aberrant chromosome in natural populations are considered.
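    The computer simulations summarized above track the frequency of the aberrant chromosome under meiotic drive, selection, and migration. A deterministic one-generation recursion can sketch the stationary state the record describes; all parameter values below are chosen for illustration only, not taken from the paper:

```python
def next_freq(p, k=0.85, w=0.5, m=0.05, p_mig=0.0):
    """One generation of a one-locus model: selection against aberrant
    homozygotes (relative fitness w), female meiotic drive k averaged
    with Mendelian (0.5) male transmission, then migration at rate m
    from a deme where the aberrant chromosome has frequency p_mig.
    All parameter values are illustrative."""
    q = 1.0 - p
    mean_w = w * p * p + 2 * p * q + q * q   # Aa and normal homozygote fitness = 1
    t = (k + 0.5) / 2.0                      # sex-averaged transmission from Aa
    p_sel = (w * p * p + t * 2 * p * q) / mean_w
    return (1.0 - m) * p_sel + m * p_mig

# A rare aberrant chromosome spreads to a stable intermediate frequency:
p = 0.01
for _ in range(500):
    p = next_freq(p)
```

    As the record states, the equilibrium depends on the drive coefficient and migration intensity, not on the starting frequency: rerunning the loop from p = 0.2 converges to the same value.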

  15. “Edifice Rex” Sulfide Recovery Project: Analysis of submarine hydrothermal, microbial habitat

    NASA Astrophysics Data System (ADS)

    Delaney, John R.; Kelley, Deborah S.; Mathez, Edmond A.; Yoerger, Dana R.; Baross, John; Schrenk, Matt O.; Tivey, Margaret K.; Kaye, Jonathan; Robigou, Veronique

    Recent scientific developments place inquiries about submarine volcanic systems in a broad planetary context. Among these is the discovery that submarine eruptions are intimately linked with massive effusions of microbes and their products from below the sea floor [Holden et al., 1998]. This material includes microbes that only grow at temperatures tens of degrees higher than the temperatures of the vent fluids from which they were sampled. Such results lend support for the existence of a potentially extensive, but currently unexplored sub-sea floor microbial biosphere associated with active submarine volcanoes [Deming and Baross, 1993; Delaney et al., 1998; Summit and Baross, 1998].

  16. Toward the 4-Micron Infrared Spectrum of the Transiting Extrasolar Planet HD 209458 b

    NASA Astrophysics Data System (ADS)

    Richardson, L. J.; Deming, D.

    2003-12-01

    We have continued our analysis of infrared spectra of the "transiting planet" system, HD 209458, recorded at the NASA IRTF in September 2001. The spectra cover two predicted secondary eclipse events, wherein the planet passed behind the star and re-emerged. We are attempting to detect the planet's infrared continuum peaks, by exploiting the spectral modulation which accompanies the secondary eclipse. Our initial analysis placed the strongest limits to date on the spectrum of the planet near 2.2 microns (Richardson, Deming & Seager 2003, recently appeared in ApJ). Further analysis of our long wavelength data (3.0--4.2 microns) decorrelates and removes most of the systematic errors due to seeing and guiding fluctuations. This decorrelation has improved the precision of our analysis to the level where a predicted 4-micron planetary flux peak may now be detectable.

  17. Performance of plasma free metanephrines measured by liquid chromatography-tandem mass spectrometry in the diagnosis of pheochromocytoma.

    PubMed

    Peaston, Robert T; Graham, Kendon S; Chambers, Erin; van der Molen, Jan C; Ball, Stephen

    2010-04-02

    Plasma free metanephrines have proved a highly sensitive biochemical test for the diagnosis of pheochromocytoma. We have developed and validated a simple LC-MS/MS method to determine plasma metanephrines and compared the diagnostic efficacy of the method with an enzyme immunoassay procedure in 151 patients, 38 with histologically confirmed pheochromocytoma. Off-line solid phase extraction in a 96-well plate format was used to isolate metanephrines from 100 μL of plasma, followed by rapid separation with hydrophilic interaction chromatography. Mass spectrometry detection was performed in multiple-reaction monitoring mode using a tandem quadrupole mass spectrometer with positive electrospray ionization. Detection limits were <0.1 nmol/L with method linearity up to 23.0 nmol/L for normetanephrine (NMN), metanephrine (MN) and 3-methoxytyramine (3-MT). Method comparison with an automated LC-MS/MS yielded Deming regression slopes of 0.94 for NMN, 0.98 for MN and 0.94 for 3-MT. Method comparison with enzyme immunoassay revealed regression slopes of 1.28 (NMN) and 1.25 (MN), with values approximately 25% lower than LC-MS/MS. Plasma metanephrines by LC-MS/MS identified all 38 patients with pheochromocytoma compared with 36 cases by immunoassay. Plasma metanephrines measured by LC-MS/MS are a reliable and sensitive test for the biochemical detection of pheochromocytoma. 2010 Elsevier B.V. All rights reserved.

  18. Some Remarks on the Improvement of Engineering Education

    NASA Astrophysics Data System (ADS)

    Tribus, Myron

    2005-03-01

    This paper combines the quality management principles of Deming with the educational theories of Feuerstein and Bloom in the design of a system of engineering education in which every student, on graduation, will display mastery of the subjects in the curriculum.

  19. Some Remarks on the Improvement of Engineering Education

    ERIC Educational Resources Information Center

    Tribus, Myron

    2005-01-01

    This paper combines the quality management principles of Deming with the educational theories of Feuerstein and Bloom in the design of a system of engineering education in which every student, on graduation, will display mastery of the subjects in the curriculum.

  20. A Quality Approach to Writing Assessment.

    ERIC Educational Resources Information Center

    Andrade, Joanne; Ryley, Helen

    1992-01-01

    A Colorado elementary school began its Total Quality Management work about a year ago after several staff members participated in an IBM Leadership Training Program addressing applications of Deming's theories. The school's new writing assessment has increased collegiality and cross-grade collaboration. (MLH)

  1. Is TQM Right for Schools?

    ERIC Educational Resources Information Center

    Blankstein, Alan M.; Swain, Heather

    1994-01-01

    Examines eight reasons why Total Quality Management cannot succeed in education and shows how one Florida elementary school surmounted these obstacles and implemented Deming's quality principles. Principal Nancy Duden overcame resistance to change, leadership misconceptions, reliance on external motivators (promotions and grades), increased…

  2. Adrenal-derived stress hormones modulate ozone-induced lung injury and inflammation

    EPA Science Inventory

    Ozone-induced systemic effects are modulated through activation of the neuro-hormonal stress response pathway. Adrenal demedullation (DEMED) or bilateral total adrenalectomy (ADREX) inhibits systemic and pulmonary effects of acute ozone exposure. To understand the influence of adre...

  3. The Quality Revolution in Education.

    ERIC Educational Resources Information Center

    Bonstingl, John Jay

    1992-01-01

    Whether viewed through Deming's 14 points, Juran's Trilogy, or Kaoru Ishikawa's Thought Revolution, Total Quality Management embodies 4 fundamental tenets: primary focus on customers and suppliers, universal commitment to continuous improvement, a systems approach, and top management responsibility. Educational organizations are recreating their…

  4. Optimization and Development of a Human Scent Collection Method

    DTIC Science & Technology

    2007-06-04

    19. Schoon, G. A. A., Scent Identification Lineups by Dogs (Canis familiaris): Experimental Design and Forensic Application. Applied Animal...Parker, Lloyd R., Morgan, Stephen L., Deming, Stanley N., Sequential Simplex Optimization. Chemometrics Series, ed. S.D. Brown. 1991, Boca Raton

  5. TQM in Tupelo.

    ERIC Educational Resources Information Center

    Rist, Marilee C.

    1993-01-01

    Tupelo (Mississippi) Superintendent Mike Walters eschewed his former "happy bureaucrat" role for a facilitative role allowing teachers to reinvent curriculum and instruction. Inspired by Deming's continuous-improvement precept and aided by a $3.5 million grant from the area's Fortune-500 business community, this superintendent finds…

  6. Validation of a serum immunoassay to measure progesterone and diagnose pregnancy in the West Indian manatee (Trichechus manatus)

    USGS Publications Warehouse

    Tripp, K.M.; Verstegen, J.P.; Deutsch, C.J.; Bonde, R.K.; Rodriguez, M.; Morales, B.; Schmitt, D.L.; Harr, K.E.

    2008-01-01

    The objective was to validate a high-sensitivity chemiluminescent assay of serum progesterone concentrations for pregnancy diagnosis in manatees. Assay analytical sensitivity was 0.1 ng/mL, with mean intra- and inter-assay coefficients of variation of 9.7 and 9.2%, respectively, and accuracy had a mean adjusted R2 of 0.98. Methods comparison (relative to Siemen's Coat-A-Count RIA) demonstrated r = 0.98, Deming regression slope of 0.95, and an intercept of 0.01. Based on ROC analysis, a progesterone concentration ≥0.4 ng/mL was indicative of pregnancy. Assay results were not significantly altered by two freeze-thaw cycles of samples. Characteristic progesterone concentrations during pregnancy were Months 1-4 (1.7-4.7 ng/mL), 5-8 (~1.0 ng/mL), and 10 and 11 (0.3-0.5 ng/mL), whereas two late-pregnant females with impending abortion had progesterone concentrations of 0.1 ng/mL. Among pregnant females, maximum progesterone concentrations occurred in autumn (3.9 ± 1.8 ng/mL), and were greater during all seasons than concentrations in non-pregnant females (0.1-0.2 ng/mL). Progesterone concentrations were also significantly higher in pregnant females than in non-pregnant females and males. This highly sensitive, specific, and diagnostic assay will be valuable for monitoring pregnancy and abortion in manatees. © 2008 Elsevier Inc.
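    The ROC-derived pregnancy cutoff reported above can be found by scanning candidate thresholds and maximizing Youden's J (sensitivity + specificity - 1). A sketch with hypothetical progesterone values, not the study's data:

```python
def best_cutoff(pos, neg):
    """Threshold maximizing Youden's J = sensitivity + specificity - 1,
    scanning every observed value as a candidate cutoff."""
    best_c, best_j = None, -1.0
    for c in sorted(set(pos + neg)):
        sens = sum(v >= c for v in pos) / len(pos)   # true-positive rate
        spec = sum(v < c for v in neg) / len(neg)    # true-negative rate
        j = sens + spec - 1.0
        if j > best_j:
            best_c, best_j = c, j
    return best_c

# Hypothetical serum progesterone values (ng/mL):
pregnant = [0.5, 0.9, 1.7, 3.2, 4.1, 0.4, 2.5]
non_pregnant = [0.1, 0.15, 0.2, 0.1, 0.2, 0.3]
cutoff = best_cutoff(pregnant, non_pregnant)
```

    With well-separated groups, as in this record, the optimal threshold sits at the lowest value seen in the positive group.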

  7. Use of refractometry for determination of psittacine plasma protein concentration.

    PubMed

    Cray, Carolyn; Rodriguez, Marilyn; Arheart, Kristopher L

    2008-12-01

    Previous studies have demonstrated both poor and good correlation of total protein concentrations in various avian species using refractometry and biuret methodologies. The purpose of the current study was to compare these 2 techniques of total protein determination using plasma samples from several psittacine species and to determine the effect of cholesterol and other solutes on refractometry results. Total protein concentration in heparinized plasma samples without visible lipemia was analyzed by refractometry and an automated biuret method on a dry reagent analyzer (Ortho 250). Cholesterol, glucose, and uric acid concentrations were measured using the same analyzer. Results were compared using Deming regression analysis, Bland-Altman bias plots, and Spearman's rank correlation. Correlation coefficients (r) for total protein results by refractometry and biuret methods were 0.49 in African grey parrots (n=28), 0.77 in Amazon parrots (20), 0.57 in cockatiels (20), 0.73 in cockatoos (36), 0.86 in conures (20), and 0.93 in macaws (38) (P ≤ .01). Cholesterol concentration, but not glucose or uric acid concentrations, was significantly correlated with total protein concentration obtained by refractometry in Amazon parrots, conures, and macaws (n=25 each, P<.05), and trended towards significance in African grey parrots and cockatoos (P=.06). Refractometry can be used to accurately measure total protein concentration in nonlipemic plasma samples from some psittacine species. Method and species-specific reference intervals should be used in the interpretation of total protein values.
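    Several records here pair Deming regression with Bland-Altman bias plots. The Bland-Altman summary reduces to a mean difference (bias) and 95% limits of agreement; a minimal sketch with hypothetical paired total-protein values (g/dL), not the study's data:

```python
import statistics

def bland_altman(a, b):
    """Bias (mean difference) and 95% limits of agreement
    between paired measurements from two methods."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired total-protein results from two methods:
refractometer = [4.0, 4.5, 5.0, 3.8, 4.2]
biuret = [3.8, 4.4, 4.7, 3.9, 4.0]
bias, loa = bland_altman(refractometer, biuret)
```

    A bias near zero with limits of agreement spanning zero, as in several records above, supports interchangeability of the two methods.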

  8. Validation of a serum immunoassay to measure progesterone and diagnose pregnancy in the West Indian manatee (Trichechus manatus).

    PubMed

    Tripp, K M; Verstegen, J P; Deutsch, C J; Bonde, R K; Rodriguez, M; Morales, B; Schmitt, D L; Harr, K E

    2008-10-15

    The objective was to validate a high-sensitivity chemiluminescent assay of serum progesterone concentrations for pregnancy diagnosis in manatees. Assay analytical sensitivity was 0.1 ng/mL, with mean intra- and inter-assay coefficients of variation of 9.7 and 9.2%, respectively, and accuracy had a mean adjusted R2 of 0.98. Methods comparison (relative to Siemen's Coat-A-Count RIA) demonstrated r=0.98, Deming regression slope of 0.95, and an intercept of 0.01. Based on ROC analysis, a progesterone concentration ≥0.4 ng/mL was indicative of pregnancy. Assay results were not significantly altered by two freeze-thaw cycles of samples. Characteristic progesterone concentrations during pregnancy were Months 1-4 (1.7-4.7 ng/mL), 5-8 (~1.0 ng/mL), and 10 and 11 (0.3-0.5 ng/mL), whereas two late-pregnant females with impending abortion had progesterone concentrations of 0.1 ng/mL. Among pregnant females, maximum progesterone concentrations occurred in autumn (3.9 ± 1.8 ng/mL), and were greater during all seasons than concentrations in non-pregnant females (0.1-0.2 ng/mL). Progesterone concentrations were also significantly higher in pregnant females than in non-pregnant females and males. This highly sensitive, specific, and diagnostic assay will be valuable for monitoring pregnancy and abortion in manatees.

  9. Total Quality Management in Higher Education: Applying Deming's Fourteen Points.

    ERIC Educational Resources Information Center

    Masters, Robert J.; Leiker, Linda

    1992-01-01

    This article presents guidelines to aid administrators of institutions of higher education in applying the 14 principles of Total Quality Management. The principles stress understanding process improvements, handling variation, fostering prediction, and using psychology to capitalize on human resources. (DB)

  10. SSET: Spatially-scanned Spectra of Exoplanet Transits

    NASA Astrophysics Data System (ADS)

    McCullough, Peter R.; Berta, Z. K.; Howard, A. W.; MacKenty, J. W.; WFC3 Team

    2012-01-01

    Spatial scanning is expected to have some advantages over staring-mode observations with the HST WFC3 instrument, especially for very bright stars, i.e. those that intrinsically can provide the highest sensitivity observations. We analyze 1.1-1.7 micron spectra of a transit of the super-Earth GJ1214b obtained 2011-4-18 during re-commissioning of a technique for spatially scanning the Hubble Space Telescope. These are the first data of this type obtained with the HST instrument WFC3. Results are directly compared to staring-mode observations with the same instrument of the same target by Berta et al. (2011). We also describe a case study of the sub-Neptune-sized planet HD 97658b in terms of proposed observations and what they may reveal of that planet. We also summarize publicly-available descriptions of additional HST programs that use the spatial-scanning technique (Table 1).
    Table 1. HST program | Title | Investigators | Scanned Targets
    12181 | The Atmospheric Structure of Giant Hot Exoplanets | Deming, L. D., et al. | HD 209458 and HD 189733
    12325 | Photometry with Spatial Scans | MacKenty, J. W., & McCullough, P. R. | GJ1214
    12336 | Scan Enabled Photometry | MacKenty, J. W., McCullough, P. R., & Deustua, S. | Vega and other calibration stars
    12449 | Atmospheric Composition of the ExoNeptune HAT-P-11 | Deming, L. D., et al. | HAT-P-11
    12473 | An Optical Transmission Spectral Survey of hot-Jupiter Exoplanetary Atmospheres | Sing, D. K., et al. | WASP-31, HAT-P-1
    12495 | Near-IR Spectroscopy of the Hottest Known Exoplanet, WASP-33b | Deming, L. D., et al. | WASP-33
    12679 | Luminosity-Distance Standards from Gaia and HST | Riess, A., et al. | Milky Way Cepheids
    12713 | Spatial Scanned L-flat Validation Pathfinder | McCullough and MacKenty | nearly identical double stars

  11. Science Outreach at NASA's Marshall Space Flight Center

    NASA Astrophysics Data System (ADS)

    Lebo, George

    2002-07-01

    At the end of World War II, W. Edwards Deming, an internationally known statistician and management consultant, enunciated what later came to be called "Total Quality Management" (TQM). The basic thrust of this theory called for companies and governments to identify their customers and to do whatever was necessary to meet their demands and to keep them satisfied. It also called for companies to compete internally. That is, they were to build products that competed with their own so that they were always improving. Unfortunately most U.S. corporations failed to heed this advice. Consequently, the Japanese, who actively sought Deming's advice and instituted it in their corporate planning, built an economy that outstripped that of the U.S. for the next three to four decades. Only after U.S. corporations reorganized and fashioned joint ventures which incorporated the tenets of TQM with their Japanese competitors did they start to catch up. Other institutions such as the U.S. government and its agencies and schools face the same problem. While the power of the U.S. government is in no danger of being usurped, its agencies and schools face real problems which can be traced back to not heeding Deming's advice. For example, the public schools are facing real pressure from private schools and home-school families because they are not meeting the needs of the general public. Likewise, NASA and other government agencies find themselves shortchanged in funding because they have failed to convince the general public that their missions are important. In an attempt to convince the general public that its science mission is both interesting and important, in 1998 the Science Directorate at NASA's Marshall Space Flight Center (MSFC) instituted a new outreach effort using the internet to reach the general public as well as students. They have called it 'Science@NASA'.

  12. Genetic divergence and diversity in the Mona and Virgin Islands Boas, Chilabothrus monensis (Epicrates monensis) (Serpentes: Boidae), West Indian snakes of special conservation concern.

    PubMed

    Rodríguez-Robles, Javier A; Jezkova, Tereza; Fujita, Matthew K; Tolson, Peter J; García, Miguel A

    2015-07-01

    Habitat fragmentation reduces the extent and connectivity of suitable habitats, and can lead to changes in population genetic structure. Limited gene flow among isolated demes can result in increased genetic divergence among populations, and decreased genetic diversity within demes. We assessed patterns of genetic variation in the Caribbean boa Chilabothrus monensis (Epicrates monensis) using two mitochondrial and seven nuclear markers, and relying on the largest number of specimens of these snakes examined to date. Two disjunct subspecies of C. monensis are recognized: the threatened C. m. monensis, endemic to Mona Island, and the rare and endangered C. m. granti, which occurs on various islands of the Puerto Rican Bank. Mitochondrial and nuclear markers revealed unambiguous genetic differences between the taxa, and coalescent species delimitation methods indicated that these snakes likely are different evolutionary lineages, which we recognize at the species level, C. monensis and C. granti. All examined loci in C. monensis (sensu stricto) are monomorphic, which may indicate a recent bottleneck event. Each population of C. granti exclusively contains private mtDNA haplotypes, but five of the seven nuclear genes assayed are monomorphic, and nucleotide diversity is low in the two remaining markers. The faster pace of evolution of mtDNA possibly reflects the present-day isolation of populations of C. granti, whereas the slower substitution rate of nuDNA may instead mirror the relatively recent episodes of connectivity among the populations facilitated by the lower sea level during the Pleistocene. The small degree of overall genetic variation in C. granti suggests that demes of this snake could be managed as a single unit, a practice that would significantly increase their effective population size. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Science Outreach at NASA's Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Lebo, George

    2002-01-01

    At the end of World War II, Duane Deming, an internationally known economist, enunciated what later came to be called "Total Quality Management" (TQM). The basic thrust of this economic theory called for companies and governments to identify their customers and to do whatever was necessary to meet their demands and keep them satisfied. It also called for companies to compete internally; that is, they were to build products that competed with their own so that they were always improving. Unfortunately, most U.S. corporations failed to heed this advice. Consequently, the Japanese, who actively sought Deming's advice and instituted it in their corporate planning, built an economy that outstripped that of the U.S. for the next three to four decades. Only after U.S. corporations reorganized and fashioned joint ventures that incorporated the tenets of TQM with their Japanese competitors did they start to catch up. Other institutions, such as the U.S. government, its agencies, and schools, face the same problem. While the power of the U.S. government is in no danger of being usurped, its agencies and schools face real problems that can be traced back to not heeding Deming's advice. For example, the public schools are facing real pressure from private schools and home-school families because they are not meeting the needs of the general public. Likewise, NASA and other government agencies find themselves shortchanged in funding because they have failed to convince the general public that their missions are important. In an attempt to convince the general public that its science mission is both interesting and important, in 1998 the Science Directorate at NASA's Marshall Space Flight Center (MSFC) instituted a new outreach effort using the internet to reach the general public as well as students. They have called it 'Science@NASA'.

  14. Seven propositions of the science of improvement: exploring foundations.

    PubMed

    Perla, Rocco J; Provost, Lloyd P; Parry, Gareth J

    2013-01-01

    The phrase "Science of Improvement" or "Improvement Science" is commonly used today by a range of people and professions to mean different things, creating confusion for those trying to learn about improvement. In this article, we briefly define the concepts of improvement and science, and review the history of the consideration of "improvement" as a science. We trace key concepts and ideas in improvement to their philosophical and theoretical foundations, with a focus on Deming's System of Profound Knowledge. We suggest that Deming's system has a firm association with many contemporary and historic philosophical and scientific debates and concepts. With reference to these debates and concepts, we identify 7 propositions that provide the scientific and philosophical foundation for the science of improvement. A standard view of the science of improvement, grounded in the philosophical and theoretical basis of the field, does not presently exist. The 7 propositions outlined here demonstrate the value of examining the underpinnings of improvement; this is needed both to advance the field and to minimize confusion about what the phrase "science of improvement" represents. We argue that advanced scientists of improvement are those who, like Deming and Shewhart, can integrate ideas, concepts, and models across scientific disciplines for the purpose of developing more robust improvement models, tools, and techniques, with a focus on application and problem solving in real-world contexts. The epistemological foundations and theoretical basis of the science of improvement and its reasoning methods need to be critically examined to ensure its continued development and relevance.
If improvement efforts and projects in health care are to be characterized under the canon of science, then health care professionals engaged in quality improvement work would benefit from a standard set of core principles, a standard lexicon, and an understanding of the evolution of the science of improvement.

  15. Analytical performance of the Hologic Aptima HBV Quant Assay and the COBAS Ampliprep/COBAS TaqMan HBV test v2.0 for the quantification of HBV DNA in plasma samples.

    PubMed

    Schønning, Kristian; Johansen, Kim; Nielsen, Lone Gilmor; Weis, Nina; Westh, Henrik

    2018-07-01

    Quantification of HBV DNA is used for initiating and monitoring antiviral treatment; analytical test performance consequently impacts treatment decisions. The aim was to compare the analytical performance of the Aptima HBV Quant Assay (Aptima) and the COBAS Ampliprep/COBAS TaqMan HBV Test v2.0 (CAPCTMv2) for the quantification of HBV DNA in plasma samples. The performance of the two tests was compared on 129 prospective plasma samples and on 63 archived plasma samples, of which 53 were genotyped. Linearity of the two assays was assessed on dilution series of three clinical samples (genotypes B, C, and D). Bland-Altman analysis of 120 clinical samples that quantified in both tests showed an average quantification bias (Aptima - CAPCTMv2) of -0.19 Log IU/mL (SD: 0.33 Log IU/mL). A single sample quantified more than three standard deviations higher in Aptima than in CAPCTMv2. Only minor differences were observed between genotypes A (N = 4; average difference -0.01 Log IU/mL), B (N = 8; -0.13 Log IU/mL), C (N = 8; -0.31 Log IU/mL), D (N = 25; -0.22 Log IU/mL), and E (N = 7; -0.03 Log IU/mL). Deming regression showed that the two tests were highly correlated (slope of the regression line 1.03; 95% CI: 0.998-1.068), and evaluation on the dilution series confirmed the linearity of both tests. Both tests were precise, with %CV less than 3% for HBV DNA ≥3 Log IU/mL. The Aptima and CAPCTMv2 tests are highly correlated, and both tests are useful for monitoring patients chronically infected with HBV. Copyright © 2018 Elsevier B.V. All rights reserved.
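Deming regression, unlike ordinary least squares, allows for measurement error in both assays; once the ratio of the two error variances is fixed, the slope and intercept have a closed form. A minimal sketch of the computation (assuming an error-variance ratio of 1, i.e. orthogonal regression; the function name `deming_fit` is illustrative and not taken from the study):

```python
import math
from statistics import mean


def deming_fit(x, y, delta=1.0):
    """Deming regression of y on x.

    delta is the assumed ratio of the error variances (var_y_err / var_x_err);
    delta = 1 gives orthogonal regression. Requires a nonzero covariance.
    """
    n = len(x)
    mx, my = mean(x), mean(y)
    # Sample variances and covariance
    sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    # Closed-form Deming slope and intercept
    slope = (syy - delta * sxx
             + math.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
    intercept = my - slope * mx
    return slope, intercept


# Exact linear data recovers the underlying line: slope 2, intercept 1.
slope, intercept = deming_fit([1, 2, 3], [3, 5, 7])  # slope = 2.0, intercept = 1.0
```

A slope near 1 and an intercept near 0, as reported above, indicate that the two assays agree across the measuring range.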

  16. Comparison of the Hologic Aptima HIV-1 Quant Dx Assay to the Roche COBAS Ampliprep/COBAS TaqMan HIV-1 Test v2.0 for the quantification of HIV-1 RNA in plasma samples.

    PubMed

    Schønning, Kristian; Johansen, Kim; Landt, Bodil; Benfield, Thomas; Westh, Henrik

    2017-07-01

    HIV-RNA is the most important parameter for monitoring antiviral treatment in individuals infected with HIV-1, so knowledge of the performance of different tests for the quantification of HIV-1 RNA is important for clinical care. The aim was to compare the analytical performance of the Aptima HIV-1 Quant Dx Assay (Aptima) and the COBAS Ampliprep/COBAS TaqMan HIV-1 Test v2.0 (CAPCTMv2) for the quantification of HIV-1 RNA in plasma samples. The performance of the two tests was compared on 216 clinical plasma samples, on dilution series in seven replicates of five clinical samples of known subtype, and on ten replicates of the Acrometrix High and Low Positive Controls. Bland-Altman analysis of 130 samples that quantified in both tests did not show indications of gross mis-quantification by either test. A tendency of the Aptima assay to quantify higher at high viral loads than the CAPCTMv2 was observed in the Bland-Altman analysis, by Deming regression (slope 1.13), and in the dilution series of clinical samples. Precision evaluated using the Acrometrix Positive Controls was similar for the High Control (CV: 1.2% vs. 1.3%; Aptima assay vs. CAPCTMv2 test, respectively) but differed for the Low Control (CV: 17.9% vs. 7.1%; Aptima assay vs. CAPCTMv2 test, respectively). However, this did not impact the clinical categorization of samples at either the 50 cp/mL or the 200 cp/mL level. The Aptima assay and the CAPCTMv2 test are highly correlated and are useful for monitoring HIV-infected individuals. Copyright © 2017 Elsevier B.V. All rights reserved.
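The precision figures quoted in these records are coefficients of variation: the standard deviation of replicate measurements expressed as a percentage of their mean. A minimal sketch of the computation (the helper name `percent_cv` is ours, not from the assay documentation):

```python
from statistics import mean, stdev


def percent_cv(replicates):
    """Coefficient of variation of replicate measurements, as a percentage.

    Uses the sample standard deviation; assumes a nonzero mean.
    """
    return 100.0 * stdev(replicates) / mean(replicates)


# Replicates 9, 10, 11 have mean 10 and sample SD 1, so %CV = 10.0.
cv = percent_cv([9.0, 10.0, 11.0])  # 10.0
```

On log-transformed viral loads, a %CV of 1-2% at a high-positive control (as reported above) reflects very tight replicate agreement.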

  17. Comparison of the quantification of acetaminophen in plasma, cerebrospinal fluid and dried blood spots using high-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Taylor, Rachel R; Hoffman, Keith L; Schniedewind, Björn; Clavijo, Claudia; Galinkin, Jeffrey L; Christians, Uwe

    2013-09-01

    Acetaminophen (paracetamol, N-(4-hydroxyphenyl)acetamide) is one of the most commonly prescribed drugs for the management of pain in children. Quantification of acetaminophen in pre-term and term neonates and small children requires highly sensitive assays for small-volume blood samples. We developed and validated an LC-MS/MS assay for the quantification of acetaminophen in human plasma, cerebrospinal fluid (CSF) and dried blood spots (DBS). Reconstitution in water (DBS only) and addition of a protein precipitation solution containing the deuterated internal standard were the only manual steps. Extracted samples were analyzed on a Kinetex 2.6 μm PFP column using an acetonitrile/formic acid gradient. The analytes were detected in the positive multiple reaction mode. Alternatively, DBS were automatically processed using direct desorption in a sample card and preparation (SCAP) robotic autosampler in combination with online extraction. The range of reliable response was 3.05-20,000 ng/ml in plasma and CSF (r² > 0.99) and 27.4-20,000 ng/ml for DBS (manual extraction and automated direct desorption; r² > 0.99). Inter-day accuracy was always within 85-115%, and inter-day precision for plasma, CSF and manually extracted DBS was less than 15%. Deming regression analysis comparing 167 matching pairs of plasma and DBS samples showed a correlation coefficient of 0.98. Bland-Altman analysis indicated a 26.6% positive bias in DBS, most likely reflecting the blood:plasma distribution ratio of acetaminophen. DBS are a valid matrix for acetaminophen pharmacokinetic studies. Copyright © 2013 Elsevier B.V. All rights reserved.
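Bland-Altman analysis, used repeatedly in these records, summarizes agreement between two measurement methods as the mean of the paired differences (the bias) and its 95% limits of agreement (bias ± 1.96 SD of the differences). A minimal sketch, with `bland_altman` as an illustrative name rather than a function from any of these studies:

```python
from statistics import mean, stdev


def bland_altman(a, b):
    """Bias and 95% limits of agreement between paired measurements a and b.

    Returns (bias, (lower_limit, upper_limit)), where bias = mean(a - b)
    and the limits are bias ± 1.96 * SD of the paired differences.
    """
    diffs = [ai - bi for ai, bi in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)


# Every pair differs by exactly 1, so the bias is 1.0 and the SD is 0.
bias, limits = bland_altman([1.0, 2.0, 3.0, 4.0], [0.0, 1.0, 2.0, 3.0])  # bias = 1.0
```

In a method comparison, the bias captures systematic offset between the methods, while the limits of agreement bound the scatter of individual disagreements.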

  18. Multisite evaluation of the BD™ Stem Cell Enumeration (SCE) Kit for CD34+ cell enumeration on the BD FACSCanto II™ and BD FACSCalibur™ flow cytometers

    PubMed Central

    Preti, Robert A; Chan, Wai Shun; Kurtzberg, Joanne; Dornsife, Ronna E.; Wallace, Paul K.; Furlange, Rosemary; Lin, Anna; Omana-Zapata, Imelda; Bonig, Halvard; Tonn, Thorsten

    2018-01-01

    Background Evaluation of the BD™ Stem Cell Enumeration (SCE) Kit was conducted at four clinical sites with flow cytometry CD34+ enumeration, to assess agreement between two investigational methods, the BD FACSCanto™ II and BD FACSCalibur™ systems, and the predicate method (Beckman Coulter Stem-Kit™ reagents). Methods Leftover and delinked specimens (n = 1,032) from clinical flow cytometry testing were analyzed on the BD FACSCanto II (n = 918) and BD FACSCalibur (n = 905) in normal and mobilized blood, frozen and thawed bone marrow, and leucopheresis and cord blood anticoagulated with CPD, ACD-A, heparin, and EDTA alone or in combination. Fresh leucopheresis analysis addressed site equivalency for sample preparation, testing, and analysis. Results The mean relative bias showed agreement within predefined parameters for the BD FACSCanto II (−2.81 to 4.31 ±7.1) and BD FACSCalibur (−2.69 to 5.2 ±7.9). Results are reported as absolute and relative differences compared to the predicate for viable CD34+, percentage of CD34+ in CD45+, and viable CD45+ populations (or gates). Bias analyses of the distribution of the predicate low, mid, and high bin values were done using BD FACSCanto II optimal gating and BD FACSCalibur manual gating for viable CD34+, percentage of CD34+ in CD45+, and viable CD45+. Bias results from both investigational methods show agreement. Deming regression analyses showed a linear relationship with R² > 0.92 for both investigational methods. Discussion In conclusion, the results from both investigational methods demonstrated agreement and equivalence with the predicate method for enumeration of absolute viable CD34+, percentage of viable CD34+ in CD45+, and absolute viable CD45+ populations. PMID:24927716

  19. Comparison of the COBAS TAQMAN HIV-1 HPS with VERSANT HIV-1 RNA 3.0 assay (bDNA) for plasma RNA quantitation in different HIV-1 subtypes.

    PubMed

    Gomes, Perpétua; Palma, Ana Carolina; Cabanas, Joaquim; Abecasis, Ana; Carvalho, Ana Patrícia; Ziermann, Rainer; Diogo, Isabel; Gonçalves, Fátima; Lobo, Céu Sousa; Camacho, Ricardo

    2006-08-01

    Quantitation of HIV-1 RNA levels in plasma has an undisputed prognostic value and is extremely important for evaluating response to antiretroviral therapy. The purpose of this study was to evaluate the performance of the real-time PCR COBAS TaqMan 48 analyser by comparing it to the existing VERSANT 3.0 (bDNA) assay for HIV-1 RNA quantitation in plasma from individuals infected with different HIV-1 subtypes (104 blood samples). A positive linear correlation between the two tests (r² = 0.88) was found. Quantitation by the COBAS TaqMan assay was approximately 0.32 log10 higher than by bDNA. The relationship between the two assays was similar within all subtypes, with a Deming regression slope <1 and a mean difference <0 in the Bland-Altman plots. Overall, no significant differences were found in plasma viral load quantitation between the two assays across the different HIV-1 subtypes; these assays are therefore suitable for viral load quantitation of highly genetically diverse HIV-1 plasma samples.

  20. The Quality Fit.

    ERIC Educational Resources Information Center

    Vertiz, Virginia C.; Downey, Carolyn J.

    This paper proposes a two-pronged approach for examining an educational program's "quality of fit." The American Association of School Administrators' (AASA's) Curriculum Management Audit for quality indicators is reviewed, using the Downey Quality Fit Framework and Deming's 4 areas of profound knowledge and 14 points. The purpose is to…

  1. Getting Started with TQM.

    ERIC Educational Resources Information Center

    Freeston, Kenneth R.

    1992-01-01

    Tired of disjointed programs and projects, the staff of Newtown (Connecticut) Public Schools developed their own Success-Oriented School Model, blending elements of Deming's 14 points with William Glasser's approach to quality. Obtaining quality outcomes means stressing continuous improvement and staying close to the customer. (six references)…

  2. A Move from Effective to Quality.

    ERIC Educational Resources Information Center

    Duden, Nancy

    1993-01-01

    For past three years, Tallahassee, Florida, elementary school has changed from good effective school to quality learning organization. Inspired by Deming's principles, the school's broad-based task force developed a culture suitable for continuous quality improvement. Newly established core values included the importance of individuals, teachers…

  3. Quality Management in Education.

    ERIC Educational Resources Information Center

    Tribus, Myron

    When transferring the methods of quality management from industry to academia, there are important differences that must be considered. This paper describes the differences between traditional management and quality management, and shows how Deming's principles of Total Quality Management (TQM) can be applied to education. Some of these principles…

  4. The Quality Education Challenge.

    ERIC Educational Resources Information Center

    Downey, Carolyn J.; And Others

    Attempts to implement W. Edwards Deming's Total Quality Management (TQM) principles in education and transform school systems into world-class, quality learning environments have proved somewhat disappointing. This book asserts that educators need a way to translate the ideas about corporate quality for adaptation and use in schools. The…

  5. Selected Works: 1990-1994

    DTIC Science & Technology

    1995-08-01

    soldiers: 43 tanks: 37 Isaacs, Betty: 336 Isherwood, Mike: 336 Ishikawa, Kaoru: 263 Isolationism: 137 Israel: 29, 229, 335, 383 Israeli Air Force: 335...Deming, Juran, Ishikawa, and others—have long been practiced by the Air Force. We’ve used these principles from our beginnings as an institution—long

  6. Quality Management: Survey of Federal Organizations

    DTIC Science & Technology

    1992-10-01

    Deming, Armand Feigenbaum, Kaoru Ishikawa, and J. M. Juran. For purposes of our survey, quality improvement efforts which have the same basic goals and...Feigenbaum, Kaoru Ishikawa, and J.M. Juran. Even though this approach may have different names, it most often includes the following five concepts

  7. Student evaluations of a year-long mentorship program: a quality improvement initiative.

    PubMed

    van Eps, Mary Ann; Cooke, Marie; Creedy, Debra K; Walker, Rachel

    2006-08-01

    Mentoring is an important teaching-learning process in undergraduate nursing curricula. There are relatively few studies specifically evaluating nursing students' perceptions of mentorship. In the period 1999-2002, 39 students were mentored during a year-long program. This descriptive, exploratory study used a quality improvement framework informed by the Deming cycle of Plan, Do, Check and Act [Deming, W.E., 1982. Quality, Productivity and Competitive Position. Massachusetts Institute of Technology, Cambridge] to evaluate the mentorship program from the students' perspective. Information was gathered through surveys, focus group discussions and interviews and analyzed to identify themes of responses. Identified themes were 'The doing of nursing', 'The thinking of nursing' and 'Being a nurse'. The study confirmed the value of mentorship in undergraduate nursing and highlighted the importance of skill competence as a basis for professional role identity by graduating students. The benefits of mentorship were derived from a long term, supportive relationship with the same registered nurse who was committed to the student's professional development.

  8. Mt. Edgecumbe's Venture in Quality.

    ERIC Educational Resources Information Center

    Rocheleau, Larrae

    1991-01-01

    Having rewritten W. Edwards Deming's 14 points from an educational perspective, the superintendent of a state-run boarding school serving native Alaskans describes the transformation that the school and his own administrator role have undergone thanks to systems thinking and a paradigm shift demanding change at the top. (MLH)

  9. AACSB Accreditation and Possible Unintended Consequences: A Deming View

    ERIC Educational Resources Information Center

    Stepanovich, Paul; Mueller, James; Benson, Dan

    2014-01-01

    The AACSB accreditation process reflects basic quality principles, providing standards and a process for feedback for continuous improvement. However, implementation can lead to unintended negative consequences. The literature shows that while institutionalism and critical theory have been used as a theoretical base for evaluating accreditation,…

  10. Total Quality Management: Implications for Educational Assessment.

    ERIC Educational Resources Information Center

    Rankin, Stuart C.

    1992-01-01

    Deming's "System of Profound Knowledge" is even more fundamental than his 14-principle system transformation guide and is based on 4 elements: systems theory, statistical variation, a theory of knowledge, and psychology. Management should revamp total system processes so that quality of product is continually improved. Implications for…

  11. Quality Management for Schools.

    ERIC Educational Resources Information Center

    Mulligan, Dorothy

    1992-01-01

    W. Edwards Deming introduced management principles that helped Japan become a world economic power. Virginia is attempting to adapt these techniques to education with a grant that provides training and support for school personnel in several school districts. Describes a quality management program at Christa McAuliffe Elementary School that has…

  12. On the Road to Quality: Turning Stumbling Blocks into Stepping Stones.

    ERIC Educational Resources Information Center

    Bonstingl, John Jay

    1996-01-01

    W. Edwards Deming's quality philosophy can help organizations develop collaborative, community-building leadership practices. This article outlines five personal practices of quality based on personal leadership, partnerships, a systems focus, a process orientation, and constant dedication to continuous improvement. Stumbling blocks can be…

  13. Why Quality Is within Our Grasp ... If We Reach.

    ERIC Educational Resources Information Center

    Rhodes, Lewis A.

    1990-01-01

    Today's calls for school restructuring demand a greater response than piecemeal tinkering. The impetus for total organizational change started over 30 years ago in Japan as industrial leaders adapted W. Edwards Deming's beliefs and strategies concerning psychology, systems, perceptual theoretical frameworks, and causes of variation. Quality…

  14. Quality Implementation in Transition: A Framework for Specialists and Administrators.

    ERIC Educational Resources Information Center

    Wald, Judy L.; Repetto, Jeanne B.

    1995-01-01

    Quality Implementation in Transition is a framework designed to guide transition specialists and administrators in the implementation of total quality management. The framework uses the tenets set forth by W. Edwards Deming and is intended to help professionals facilitate change within transition programs. (Author/JOW)

  15. Beyond Your Beliefs: Quantum Leaps toward Quality Schools.

    ERIC Educational Resources Information Center

    Rhodes, Lewis A.

    1990-01-01

    W. Edwards Deming's concepts offer an integrated approach to quality schooling. Three barriers must be overcome: fear of industrial models, poor knowledge of workers and work processes, and unquestioned beliefs. Instead, educators must develop community understanding and commitment, establish business-education partnerships, and manage schools as…

  16. Curriculum Transformation through Total Quality Management.

    ERIC Educational Resources Information Center

    Edwards, Barbara; Algozzine, Bob

    1995-01-01

    Describes a massive cultural transformation project at two Charlotte, North Carolina, elementary schools that used Deming's total quality management principles to restructure curricula according to Boyer's eight commonalities of learning. Shows how the FADE (focus, analyze, develop, and execute) model was used to develop a well-coordinated,…

  17. Quality assurance programs for pressure ulcers.

    PubMed

    Xakellis, G C

    1997-08-01

    Traditional medical quality assurance programs are beginning to incorporate the principles of continuous quality improvement pioneered by Juran and Deming. Strategies for incorporating these principles into a long-term care facility are described, and two examples of successful implementation of continuous quality improvement programs for pressure ulcers are presented.

  18. Using Defined Processes as a Context for Resilience Measures

    DTIC Science & Technology

    2011-12-01

    processes. 1 "W. Edwards Deming." BrainyQuote.com. Xplore Inc, 2010. Accessed September 22, 2011. http://www.brainyquote.com/quotes/quotes/w...process owner of this process element or another related organizational home page, e.g., Software Engineering Institute, IEEE or a government regulatory

  19. Self-Ratings of Eight Factors of Quality Management at Naval Avionics Center

    DTIC Science & Technology

    1991-12-01

    revised edition, North Rivers Press, Inc., 1986. Ishikawa, Kaoru, What is Total Quality Control? The Japanese Way, Prentice-Hall, Inc., 1985. Jaeger...including such authors as Deming, Juran, Ishikawa, and Crosby. The questionnaire was validated using a sample from private sector organizations in

  20. Total Quality Management. ERIC Digest, Number 73.

    ERIC Educational Resources Information Center

    Weaver, Tyler

    The Japanese success story has made W. Edwards Deming's Total Quality Management (TQM) theory increasingly popular among American managers, from car manufacturers to educational leaders. TQM is based on two tenets: the primacy of customer satisfaction and the necessity of tapping nontraditional sources (especially employee ideas) to institute…

  1. Total Quality Management in Higher Education.

    ERIC Educational Resources Information Center

    Sherr, Lawrence A.; Lozier, G. Gregory

    1991-01-01

    Total Quality Management, based on theories of W. Edwards Deming and others, is a style of management using continuous process improvement characterized by mission and customer focus, a systematic approach to operations, vigorous development of human resources, long-term thinking, and a commitment to ensuring quality. The values espoused by this…

  2. Coalescing a School Community around Total Quality: A Superintendent's Perspective.

    ERIC Educational Resources Information Center

    Manley, Robert J.

    1996-01-01

    Inspired by Deming's work, the superintendent of West Babylon (New York) Schools convened his administrative team to build a consensus about the schools' mission. The vision statement outlines four primary functions: protective care, civic training, personality development, and teaching of knowledge. The district's goal-driven process surmounted…

  3. Staff Development and Total Quality Management.

    ERIC Educational Resources Information Center

    Norris, Gerald L.; Norris, Joye H.

    Professional development is an emerging view of faculty development that places teachers in charge of their own professional growth. The emergence of Total Quality Management (TQM) provides a vehicle for designing professional development to meet the needs of individuals and the organizations that employ them. The eight tenets of Deming's theory…

  4. Creating the Total Quality Effective School.

    ERIC Educational Resources Information Center

    Lezotte, Lawrence W.

    This book shows how Deming's Total Quality Management (TQM) theory for organizational management can be integrated with the effective-schools literature. Part 1 compares the 14 principles of TQM with the tenets of effective-schools research. The second part develops a blueprint for creating the total quality effective school. The conceptual…

  5. New Paradigms for Creating Quality Schools.

    ERIC Educational Resources Information Center

    Greene, Brad

    The Quality Schools movement combines the principles of control theory with Edward Deming's principles of total quality management. The outcome is a school environment in which the focus is on quality work, discipline is maintained without coercion, and students continuously evaluate their own work. This book describes the application of Quality…

  6. TQ What?: Applying Total Quality Management to Child Care.

    ERIC Educational Resources Information Center

    Hewes, Dorothy

    1994-01-01

    Discusses the concept of Total Quality Management (TQM), developed by W. Edwards Deming and Joseph Juran in the 1940s, and its applications for child care centers. Discusses how TQM focuses on customer satisfaction, measuring performance, benchmarking, employee empowerment, and continuous training. Includes a list of suggested readings on TQM. (MDM)

  7. A School-Based Quality Improvement Program.

    ERIC Educational Resources Information Center

    Rappaport, Lewis A.

    1993-01-01

    As one Brooklyn high school discovered, quality improvement begins with administrator commitment and participants' immersion in the literature. Other key elements include ongoing training of personnel involved in the quality-improvement process, tools such as the Deming Cycle (plan-do-check-act), voluntary and goal-oriented teamwork, and a worthy…

  8. Total Quality Management.

    ERIC Educational Resources Information Center

    Focus in Change, 1992

    1992-01-01

    The philosophy known as Total Quality Management (TQM) is frequently presented as a way to change and improve public education. This issue of "Focus in Change" examines Deming's original 14 TQM points and their application to education. Myron Tribus lays out the core philosophy of the movement and discusses its possible application to…

  9. The Quality Movements in Higher Education in the United States.

    ERIC Educational Resources Information Center

    Miller, Richard I.

    1996-01-01

    Discussion of various quality control strategies in American higher education looks at and compares Total Quality Management (TQM), outcomes assessment, Deming's 14 points, the Malcolm Baldrige National Quality Award, the ISO 9000 series, restructuring, reengineering, and performance indicators. It is suggested that colleges and universities will…

  10. TQM in Rural Education: Managing Schools from a Business Perspective.

    ERIC Educational Resources Information Center

    Nelson, William

    1994-01-01

    Outlines the 14 points of Deming's business philosophy of Total Quality Management in terms of rural education, including adoption of a common mission, movement from mass inspection (standardized testing) to individualized assessment, constant system improvement, training for those involved in the process, improved communication, employee rewards…

  11. Are Learning Organizations Pragmatic?

    ERIC Educational Resources Information Center

    Cavaleri, Steven A.

    2008-01-01

    Purpose: The purpose of this paper is to evaluate the future prospects of the popular concept known as the learning organization; to trace the influence of philosophical pragmatism on the learning organization and to consider its potential impact on the future; and to emphasize how pragmatic theories have shaped the development of Deming's total…

  12. Total Quality Management: Empirical, Conceptual, and Practical Issues.

    ERIC Educational Resources Information Center

    Hackman, J. Richard; Wageman, Ruth

    1995-01-01

    Total quality management (TQM) has become a U.S. social movement. This commentary analyzes the writings of W. Edwards Deming, Joseph Juran, and Kaoru Ishikawa to assess TQM's coherence, distinctiveness, and likely perseverance. Rhetoric is winning over substance, unrelated interventions are being herded under the TQM banner, and research is not…

  13. How Configuration Management Helps Projects Innovate and Communicate

    NASA Technical Reports Server (NTRS)

    Cioletti, Louis A.; Guidry, Carla F.

    2009-01-01

    This slide presentation reviews the concept of Configuration Management (CM) and compares it to the standard view of Project Management (PM). It presents two PM models, the Kepner-Tregoe and Deming models, describes why projects fail, and shows how CM helps projects innovate and communicate.

  14. A Guide for Implementing Total Quality Management in the U.S. Coast Guard Reserve

    DTIC Science & Technology

    1991-12-01

    quality field were reviewed. The main ones studied were: Total Quality (Deming 1988); Single-Minute Exchange of Die (SMED) (Shingo 1985); Poka-yoke (mistake...Productivity Press, 1985. Shingo, Shigeo. Zero Quality Control: Source Inspection and the Poka-Yoke System. Cambridge, MA: Productivity Press, 1986. Snead

  15. 77 FR 18651 - Qualifying Urban Areas for the 2010 Census

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-27

    ...,158 Delhi, LA 2,770 Delhi, NY 2,953 Dell Rapids, SD 3,584 Delphi, IN 3,008 Delphos, OH 7,440 Delta, CO 7,849 Delta, OH 3,158 Delta, UT 3,258 Deming, NM 14,903 Demopolis, AL 6,452 Denison, IA 8,240...

  16. A Quality Classroom: Quality Teaching Tools That Facilitate Student Success.

    ERIC Educational Resources Information Center

    Cooke, Brian

    This presentation described practical applications and quality tools for educators that are based on original classroom research and the theories of motivation, learning, profound knowledge, systems thinking, and service quality advanced by Karl Albrecht, William Glasser, and W. Edwards Deming. The presentation was conducted in a way that…

  17. Quality Management Plus: The Continuous Improvement of Education.

    ERIC Educational Resources Information Center

    Kaufman, Roger; Zahn, Douglas

    This book applies quality management, an organizational theory that has been successful in business and industry, to education. Chapter 1 describes the basic elements of quality management (QM)--continuous improvement, client satisfaction, positive return on investment, and doing it right the first and every time. Ways to implement Deming's 14…

  18. Total Quality Management: Application in Vocational Education. ERIC Digest No. 125.

    ERIC Educational Resources Information Center

    Lankard, Bettina A.

    Total Quality Management (TQM) establishes business and industry standards and techniques that ensure the quality of products leaving and reaching firms through continuous actions rather than one final inspection. Deming, Juran, and Crosby, who initiated the process, share a common theme of participatory management. Management participation and…

  19. A Matter of Metaphors: Education as a Handmade Process.

    ERIC Educational Resources Information Center

    Sztajn, Paola

    1992-01-01

    Applying Deming's principles to education represents change in metaphor, not paradigm shift. Exchanging factory metaphor for enlightened corporation metaphor updates business/economics image but perpetuates view of students as raw materials to be processed efficiently. No business metaphor truly aims at improving society as a whole. If production…

  20. Total Quality Management (TQM) in a Community College.

    ERIC Educational Resources Information Center

    Knowles, Tony

    In September 1991, Red River Community College (RRCC) in Winnipeg, Manitoba, decided to embrace the concepts of Total Quality Management (TQM) to provide an operational philosophy, enhance program curricula, and establish business opportunities. RRCC adapted W. Edward Deming's manufacturing philosophy to create its own approach, which focused on:…

  1. Method comparison of ultrasound and kilovoltage x-ray fiducial marker imaging for prostate radiotherapy targeting

    NASA Astrophysics Data System (ADS)

    Fuller, Clifton David; Thomas, Charles R., Jr.; Schwartz, Scott; Golden, Nanalei; Ting, Joe; Wong, Adrian; Erdogmus, Deniz; Scarbrough, Todd J.

    2006-10-01

    Several measurement techniques have been developed to address the capability for target volume reduction via target localization in image-guided radiotherapy; among these have been ultrasound (US) and fiducial marker (FM) software-assisted localization. In order to assess interchangeability between methods, US and FM localization were compared using established techniques for determination of agreement between measurement methods when a 'gold-standard' comparator does not exist, after performing both techniques daily on a sequential series of patients. At least 3 days prior to CT simulation, four gold seeds were placed within the prostate. FM software-assisted localization utilized the ExacTrac X-Ray 6D (BrainLab AG, Germany) kVp x-ray image acquisition system to determine prostate position; US prostate targeting was performed on each patient using the SonArray (Varian, Palo Alto, CA). Patients were aligned daily using laser alignment of skin marks. Directional shifts were then calculated by each respective system in the X, Y and Z dimensions before each daily treatment fraction, prior to any treatment or couch adjustment, as well as a composite vector of displacement. Directional shift agreement in each axis was compared using Bland-Altman limits of agreement, Lin's concordance coefficient with Partik's grading schema, and Deming orthogonal bias-weighted correlation methodology. 1019 software-assisted shifts were suggested by US and FM in 39 patients. The 95% limits of agreement in the X, Y and Z axes were ±9.4 mm, ±11.3 mm and ±13.4 mm, respectively. Three-dimensionally, measurements agreed within 13.4 mm in 95% of all paired measures. In all axes, concordance was graded as 'poor' or 'unacceptable'. Deming regression detected proportional bias in both directional axes and three-dimensional vectors. Our data suggest substantial differences between US and FM image-guided measures and subsequent suggested directional shifts.
Analysis reveals that the vast majority of all individual US and FM directional measures may be expected to agree with each other within a range of 1-1.5 cm. Since neither system represents a gold standard, clinical judgment must dictate whether such a difference is of import. As IMRT protocols seek dose escalation and PTV reduction predicated on US- and FM-guided imaging, future studies are needed to address these potential clinically relevant issues regarding the interchangeability and accuracy of novel positional verification techniques. Comparison series with multiple image-guidance systems are needed to refine comparisons between targeting methods. However, we do not advocate interchangeability of US and FM localization methods. Portions of this data were presented at the American Society of Clinical Oncology/American Society for Therapeutic Radiology and Oncology/Society of Surgical Oncology 2006 Prostate Cancer Symposium, San Francisco, CA, USA.
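    The Bland-Altman limits of agreement used in this comparison can be sketched as follows; the paired shift values below are illustrative, not the study's data.

```python
# Minimal sketch of Bland-Altman 95% limits of agreement for paired
# localization shifts from two systems. Example arrays are made up.
import statistics


def bland_altman_limits(a, b):
    """Return (bias, lower, upper): mean difference +/- 1.96 * SD
    of the paired differences."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd


us = [1.2, -0.5, 3.1, 0.8, -2.0, 1.5]   # hypothetical US shifts (mm)
fm = [0.9, -1.1, 2.4, 1.6, -2.8, 0.7]   # hypothetical FM shifts (mm)
bias, lo, hi = bland_altman_limits(us, fm)
```

    If 95% of paired differences fall inside (lo, hi), the width of that interval, rather than the correlation alone, indicates whether two methods are clinically interchangeable.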

  2. Clinical comparison of branched DNA and reverse transcriptase-PCR and nucleic acid sequence-based amplification assay for the quantitation of circulating recombinant form_BC HIV-1 RNA in plasma.

    PubMed

    Pan, Pinliang; Tao, Xiaoxia; Zhang, Qi; Xing, Wenge; Sun, Xianguang; Pei, Lijian; Jiang, Yan

    2007-12-01

    To investigate the correlation between three viral load assays for circulating recombinant form (CRF)_BC. Recent studies in HIV-1 molecular epidemiology reveal that CRF_BC is the dominant subtype of HIV-1 virus in mainland China, representing over 45% of the HIV-1 infected population. The performances of nucleic acid sequence-based amplification (NASBA), branched DNA (bDNA) and reverse transcriptase polymerase chain reaction (RT-PCR) were compared for HIV-1 viral load detection and quantitation of CRF_BC in China. Sixteen HIV-1 positive and three HIV-1 negative samples were collected. Sequencing of the positive samples in the gp41 region was conducted. The HIV-1 viral load values were determined using bDNA, RT-PCR and NASBA assays. Deming regression analysis with SPSS 12.0 (SPSS Inc., Chicago, Illinois, USA) was performed for data analysis. Sequencing and phylogenetic analysis of the env gene (gp41) region of the 16 HIV-1 positive clinical specimens from Guizhou Province in southwest China revealed the dominance of the subtype CRF_BC in that region. A good correlation of viral load values was observed among the three assays. Pearson's correlation between RT-PCR and bDNA is 0.969, Lg(VL)RT-PCR = 0.969 * Lg(VL)bDNA + 0.55; Pearson's correlation between RT-PCR and NASBA is 0.968, Lg(VL)RT-PCR = 0.968 * Lg(VL)NASBA + 0.937; Pearson's correlation between NASBA and bDNA is 0.980, Lg(VL)NASBA = 0.980 * Lg(VL)bDNA - 0.318. When the 16 HIV-1 positive samples were tested with the three assays, viral load values were highest for RT-PCR, followed by bDNA and then NASBA, which is consistent with previous results for subtype B. The three viral load assays are highly correlative for CRF_BC in China.
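    A between-assay equation of this kind can be applied directly to translate a result from one assay's log10 scale to another's. A minimal sketch using the RT-PCR vs. bDNA relationship quoted in the abstract; the input value is hypothetical.

```python
# Convert a log10 bDNA viral load to the RT-PCR scale using the
# reported regression equation Lg(VL)RT-PCR = 0.969 * Lg(VL)bDNA + 0.55.

def rtpcr_from_bdna(log10_bdna):
    """Predicted log10 RT-PCR viral load for a given log10 bDNA result."""
    return 0.969 * log10_bdna + 0.55


# e.g. a hypothetical bDNA result of 10^4 copies/mL (log10 = 4.0)
predicted = rtpcr_from_bdna(4.0)  # 0.969 * 4.0 + 0.55 = 4.426
```

    Note that the predicted RT-PCR value exceeds the bDNA input, consistent with the abstract's observation that RT-PCR gave the highest viral load values.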

  3. Concurrent agreement between an anthropometric model to predict thigh volume and dual-energy X-Ray absorptiometry assessment in female volleyball players aged 14-18 years.

    PubMed

    Tavares, Óscar M; Valente-Dos-Santos, João; Duarte, João P; Póvoas, Susana C; Gobbo, Luís A; Fernandes, Rômulo A; Marinho, Daniel A; Casanova, José M; Sherar, Lauren B; Courteix, Daniel; Coelho-E-Silva, Manuel J

    2016-11-24

    A variety of performance outputs are strongly determined by lower limb volume and composition in children and adolescents. The current study aimed to examine the validity of thigh volume (TV) estimated by anthropometry in late adolescent female volleyball players. Dual-energy X-ray absorptiometry (DXA) measures were used as the reference method. Total and regional body composition was assessed with a Lunar DPX NT/Pro/MD+/Duo/Bravo scanner in a cross-sectional sample of 42 Portuguese female volleyball players aged 14-18 years (165.2 ± 0.9 cm; 61.1 ± 1.4 kg). TV was estimated with the reference method (TV-DXA) and with the anthropometric method (TV-ANTH). Agreement between procedures was assessed with Deming regression. The analysis also considered a calibration of the anthropometric approach. The equation that best predicted TV-DXA was: -0.899 + 0.876 × log10(body mass) + 0.113 × log10(TV-ANTH). This new model (NM) was validated using the predicted residual sum of squares (PRESS) method (R²PRESS = 0.838). Correlation between the reference method and the NM was 0.934 (95%CI: 0.880-0.964, Sy·x = 0.325 L). A new and accurate anthropometric method to estimate TV in adolescent female volleyball players was obtained from the equation of Jones and Pearson along with adjustments for body mass.
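    The PRESS validation used for the new model can be sketched with leave-one-out refitting: refit the model with each observation held out, predict the held-out point, and compare the summed squared prediction errors to the total sum of squares. A simple one-predictor linear regression stands in here for the paper's two-predictor model, and the data are illustrative.

```python
# Leave-one-out PRESS statistic and R^2_PRESS for simple linear
# regression. Each point is predicted from a fit that excludes it.
import statistics


def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx


def r2_press(xs, ys):
    """1 - PRESS / SS_total, a leave-one-out analogue of R^2."""
    press = 0.0
    for i in range(len(xs)):
        xr, yr = xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:]
        slope, intercept = fit_line(xr, yr)
        press += (ys[i] - (slope * xs[i] + intercept)) ** 2
    ss_tot = sum((y - statistics.mean(ys)) ** 2 for y in ys)
    return 1.0 - press / ss_tot
```

    Because every prediction comes from a model that never saw that point, R²PRESS is a more honest estimate of out-of-sample accuracy than the ordinary R².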

  4. Requirement for specific gravity and creatinine adjustments for urinary steroids and luteinizing hormone concentrations in adolescents.

    PubMed

    Singh, Gurmeet K S; Balzer, Ben W R; Desai, Reena; Jimenez, Mark; Steinbeck, Katharine S; Handelsman, David J

    2015-11-01

    Urinary hormone concentrations are often adjusted to correct for hydration status. We aimed to determine whether first morning void urine hormones in growing adolescents require adjustment and, if so, whether urinary creatinine or specific gravity is the better adjustment. The study population comprised adolescents aged 10.1 to 14.3 years at baseline who provided fasting morning blood samples at 0 and 12 months (n = 343) and first morning urine every three months (n = 644). Unadjusted, creatinine-adjusted and specific gravity-adjusted hormonal concentrations were compared by Deming regression and Bland-Altman analysis and grouped according to self-rated Tanner stage or chronological age. F-ratios for self-rated Tanner stages and age groups were used to compare unadjusted and adjusted hormonal changes in growing young adolescents. Correlations of paired serum and urinary hormonal concentrations, unadjusted and adjusted by creatinine or specific gravity, were also compared. Fasting first morning void hormone concentrations correlated well and showed no bias whether unadjusted or adjusted by either creatinine or specific gravity. Urine creatinine concentration increased with Tanner stage, age and male gender, whereas urine specific gravity was not influenced by Tanner stage, age or gender. Adjustment by creatinine or specific gravity of urinary luteinizing hormone, estradiol, testosterone, dihydrotestosterone and dehydroepiandrosterone concentrations did not improve correlation with paired serum concentrations. Urine steroid and luteinizing hormone concentrations in first morning void samples of adolescents are not significantly influenced by hydration status and may not require adjustment; however, if desired, both creatinine and specific gravity adjustments are equally suitable. © The Author(s) 2015.
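    The two hydration adjustments compared in this study are conventionally computed as analyte-per-creatinine and as a specific-gravity correction toward a reference value. A sketch assuming a reference specific gravity of 1.020, which is a common convention rather than a value taken from this paper.

```python
# Two commonly used hydration corrections for urinary analytes.
# The specific-gravity form scales by (SG_ref - 1) / (SG - 1).

def creatinine_adjust(conc, creatinine):
    """Analyte concentration expressed per unit creatinine."""
    return conc / creatinine


def sg_adjust(conc, sg, sg_ref=1.020):
    """Specific-gravity-normalized concentration (assumed SG_ref)."""
    return conc * (sg_ref - 1.0) / (sg - 1.0)


# A dilute sample (SG 1.010) is scaled up toward the reference:
adjusted = sg_adjust(5.0, 1.010)  # 5.0 * 0.020 / 0.010 = 10.0
```

    The study's conclusion is that for fasting first morning voids in adolescents, either correction (or none) gives equivalent results.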

  5. Evaluation of the COBAS Hepatitis C Virus (HCV) TaqMan analyte-specific reagent assay and comparison to the COBAS Amplicor HCV Monitor V2.0 and Versant HCV bDNA 3.0 assays.

    PubMed

    Konnick, Eric Q; Williams, Sheri M; Ashwood, Edward R; Hillyard, David R

    2005-05-01

    Performance characteristics of the COBAS hepatitis C virus (HCV) TaqMan analyte-specific reagent (TM-ASR) assay using the QIAGEN BioRobot 9604 for RNA extraction were evaluated and compared to the COBAS Amplicor HCV Monitor V2.0 (Amplicor) and Versant HCV bDNA 3.0 (Versant) assays using clinical samples. Calibration of TM-ASR using Armored RNA allowed determination of the distribution of HCV RNA in clinical samples, using 22,399 clinical samples. Limit of detection, linearity, and inter- and intra-assay precision were determined for the TM-ASR assay using multiple clinical specimen panels across multiple determinations. Genotype specificity for the TM-ASR assay was determined using samples with different HCV RNA genotypes evaluated and compared against predetermined results. Contamination control of the TM-ASR assay was evaluated using pools of HCV RNA-positive and -negative samples tested in a checkerboard pattern over 12 runs of 96 samples. Correlation of the TM-ASR, Amplicor, and Versant assays was determined using 100 paired clinical samples and Deming regression analysis. The TM-ASR performed well with respect to linearity, precision, and contamination control. The correlation between TM-ASR and the Amplicor and Versant assays was poor, with large differences between assay results for individual samples. Calibration of the TM-ASR assay with Armored RNA allowed for a wide dynamic range and description of the distribution of HCV RNA in clinical samples.

  6. Evaluation of the COBAS Hepatitis C Virus (HCV) TaqMan Analyte-Specific Reagent Assay and Comparison to the COBAS Amplicor HCV Monitor V2.0 and Versant HCV bDNA 3.0 Assays

    PubMed Central

    Konnick, Eric Q.; Williams, Sheri M.; Ashwood, Edward R.; Hillyard, David R.

    2005-01-01

    Performance characteristics of the COBAS hepatitis C virus (HCV) TaqMan analyte-specific reagent (TM-ASR) assay using the QIAGEN BioRobot 9604 for RNA extraction were evaluated and compared to the COBAS Amplicor HCV Monitor V2.0 (Amplicor) and Versant HCV bDNA 3.0 (Versant) assays using clinical samples. Calibration of TM-ASR using Armored RNA allowed determination of the distribution of HCV RNA in clinical samples, using 22,399 clinical samples. Limit of detection, linearity, and inter- and intra-assay precision were determined for the TM-ASR assay using multiple clinical specimen panels across multiple determinations. Genotype specificity for the TM-ASR assay was determined using samples with different HCV RNA genotypes evaluated and compared against predetermined results. Contamination control of the TM-ASR assay was evaluated using pools of HCV RNA-positive and -negative samples tested in a checkerboard pattern over 12 runs of 96 samples. Correlation of the TM-ASR, Amplicor, and Versant assays was determined using 100 paired clinical samples and Deming regression analysis. The TM-ASR performed well with respect to linearity, precision, and contamination control. The correlation between TM-ASR and the Amplicor and Versant assays was poor, with large differences between assay results for individual samples. Calibration of the TM-ASR assay with Armored RNA allowed for a wide dynamic range and description of the distribution of HCV RNA in clinical samples. PMID:15872232

  7. Quantifying MMA by SLE LC-MS/MS: Unexpected challenges in assay development.

    PubMed

    Lo, Sheng-Ying; Gordon, Cindy; Sadilkova, Katerina; Jack, Rhona M; Dickerson, Jane A

    2016-09-01

    Analysis of serum/plasma methylmalonic acid (MMA) is important for the diagnosis and management of methylmalonic acidemia in pediatric populations. This work focuses on developing and validating a liquid chromatography tandem mass spectrometry (LC-MS/MS) method to monitor methylmalonic acidemia using simple method preparation. MMA and stable isotope-labeled d3-MMA were extracted using supported liquid extraction (SLE). Assay imprecision, bias, linearity, recovery and carryover were determined. The relationship between MMA and propionyl acylcarnitine (C3-acylcarnitine) was also evaluated using historical paired results from 51 unique individuals. Baseline separation between MMA and succinic acid was completed in 7 min. The assay was linear from 0.1 to 500 μM. The intra-day and inter-day imprecision CVs ranged from 4.1 to 13.2% (0.3 to 526 μM) and 5.0 to 15.7% (0.3 to 233 μM), respectively. Recovery ranged from 93 to 125%. Correlation with a national reference laboratory LC-MS/MS assay showed a Deming regression slope of 1.026 and an intercept of -1.335. Carryover was determined to be <0.04%. Patient-specific correlation was observed between MMA and C3-acylcarnitine. This report describes the first LC-MS/MS method using SLE for MMA extraction. In addition, we illustrate the challenges encountered during this method development that should be assessed and resolved by any laboratory implementing an SLE LC-MS/MS assay designed to quantify analytes across several orders of magnitude. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
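    A Deming regression slope and intercept, as reported for the method comparison above, can be computed in closed form. A minimal sketch assuming an error-variance ratio of 1 (the general method uses the ratio of the two assays' analytical variances); the data are made up.

```python
# Deming regression: errors-in-variables fit that allows measurement
# error in both methods. delta is the ratio of y-error to x-error
# variance; delta = 1 gives orthogonal regression.
import math
import statistics


def deming(xs, ys, delta=1.0):
    """Return (slope, intercept) of the Deming fit of ys on xs."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = (syy - delta * sxx
             + math.sqrt((syy - delta * sxx) ** 2
                         + 4 * delta * sxy ** 2)) / (2 * sxy)
    return slope, my - slope * mx


# Sanity check on an exact line y = 2x + 1 (made-up data):
slope, intercept = deming([1.0, 2.0, 3.0, 4.0], [3.0, 5.0, 7.0, 9.0])
```

    A slope near 1 with an intercept near 0, as in the reported comparison (slope 1.026, intercept -1.335), indicates minimal proportional and constant bias between the two assays.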

  8. Quantitative determination of risperidone, paliperidone and olanzapine in human serum by liquid chromatography-tandem mass spectrometry coupled with on-line solid-phase extraction.

    PubMed

    Ruan, Can-Jun; Guo, Wei; Zhou, Miao; Guo, Gui-Xin; Wang, Chuan-Yue; Li, Wen-Biao; de Leon, Jose

    2018-07-01

    A recent guideline recommends therapeutic drug monitoring for risperidone, paliperidone and olanzapine, which are frequently used second-generation antipsychotics. We developed a simple high-performance liquid chromatography-tandem mass spectrometry method coupled with online solid-phase extraction that can be used to measure risperidone, paliperidone and olanzapine using small (40 μL) samples. The analytes were extracted from serum samples, automatically pre-concentrated and purified by C8 (5 μm, 2.1 × 30 mm) solid-phase extraction cartridges, then chromatographed on an XBridge™ C18 column (3.5 μm, 100 × 2.1 mm) thermostatted at 30°C with a mobile phase consisting of 70% acetonitrile and 30% ammonium hydroxide 1% solution at an isocratic flow rate of 0.3 mL/min, and detected with tandem mass spectrometry. The assay was validated in the concentration range from 2.5 to 160 ng/mL. Intra- and inter-day precision for all analytes was between 1.1 and 8.2%; method accuracy was between 6.6 and 7.6%. The risperidone and paliperidone assay was compared with a high-performance liquid chromatography-ultraviolet assay currently used in our hospital for risperidone and paliperidone therapeutic drug monitoring, and the results of weighted Deming regression analysis showed good agreement. For the olanzapine assay, we compared 20 samples in separate re-assays on different days; all the relative errors were within the 20% recommended limit. Copyright © 2018 John Wiley & Sons, Ltd.

  9. Evaluating and Improving Tech Prep: Development, Validation, and Results of the Minnesota Self-Assessment Model.

    ERIC Educational Resources Information Center

    Pucel, David J.; And Others

    1996-01-01

    The Minnesota Tech Prep Self-Evaluation System is a framework based on ideas from Deming, Juran, and the Baldrige award. Testing with 17 Minnesota tech prep consortia found it effective in identifying areas needing improvement and promoting dialog among those involved in tech prep implementation. (SK)

  10. Red Beads and Profound Knowledge: Deming and Quality of Education

    ERIC Educational Resources Information Center

    Lohr, Sharon L.

    2015-01-01

    Value-added models are being implemented in many states in an attempt to measure the contributions of individual teachers and schools toward students' learning. Scores from these models are increasingly used for high-stakes purposes such as setting compensation, hiring or dismissing teachers, awarding tenure, and closing schools. The statistician…

  11. Quality Management in U.S. High Schools: Evidence from the Field.

    ERIC Educational Resources Information Center

    Detert, James R.; Bauerly Kopel, Michelle E.; Mauriel, John J.; Jenni, Roger W.

    2000-01-01

    Reports on a longitudinal study examining implementation of a Quality Management reform based on Deming's seven principles. Interview and survey data from a national sample of purposefully chosen high schools show limited results as to teachers' effective use and institutionalization of TQM principles. The principal's role is critical. (Contains…

  12. School Attendance Problems: Using the TQM Tools To Identify Root Causes.

    ERIC Educational Resources Information Center

    Weller, L. David

    2000-01-01

    Deming's principles and TQM problem-solving tools and techniques can be used to solve noninstructional problems such as vandalism, dropouts, and student absenteeism. This case study presents a model for principals to apply to identify root causes, resolve problems, and provide quality outcomes (at reduced cost) in noninstructional areas. (Contains…

  13. Defining Instructional Quality by Employing the Total Quality Management (TQM) Method: A Research Project.

    ERIC Educational Resources Information Center

    Croker, Robert E.; And Others

    The feasibility of using W. E. Deming's total quality management (TQM) method to define instructional quality was examined by surveying three groups of students attending Idaho State University's College of Education and School of Applied Technology: 31 students seeking cosmetology certification; 75 undergraduates pursuing degrees in corporate…

  14. Strategic Management of Quality: An American and British Perspective.

    ERIC Educational Resources Information Center

    Weller, L. David; McElwee, Gerard

    1997-01-01

    Total Quality Management is being implemented in American and British schools to improve educational outcomes. The 14 points of Deming's quality model and Porter's models of competition and drivers of cost provide a systematic, structured template to promote educational excellence and meet the demands of social, political, and economic forces.…

  15. Quality in Education: An Implementation Handbook.

    ERIC Educational Resources Information Center

    Arcaro, Jerome S.

    This book describes how the principles of quality can be applied to education. Based on the work of W. Edwards Deming and Joseph M. Juran, the book outlines a systematic and practical approach to implementing quality in educational settings. It also describes how to encourage staff participation in quality initiatives. Total-quality schools are…

  16. Implementing Continuous Improvement Management (CIM) in the Public Schools.

    ERIC Educational Resources Information Center

    Borgers, William E.; Thompson, Tommy A.

    This book traces the restructuring of a Texas school district that moved from management by coercion to continuous improvement for quality. In 1990, the Dickinson Independent School District (Texas) began implementation of Continuous Improvement Management (CIM), based on the teachings of W. Edwards Deming, William Glasser, and J. M. Juran.…

  17. A Paradigm Shift for Educational Administrators: The Total Quality Movement.

    ERIC Educational Resources Information Center

    Hough, M. J.

    This paper reviews the major ideas of the seminal total quality management theorists, such as Deming, Crosby, Juran, Ishikawa, and Imai, to illustrate how total quality management is applicable to education. It is argued that there is a need for a paradigm shift in educational administration. The first part reviews current Australian societal…

  18. Quality Improvement Awards and Vocational Education Assessment. ERIC Digest No. 182.

    ERIC Educational Resources Information Center

    Brown, Bettina Lankard

    Quality system awards offer blueprints for assessing quality in vocational education as well as in business and industry. The three most prestigious awards recognizing quality improvement in business and industry are the Malcolm Baldrige Quality Award, Deming Application Prize, and ISO 9000 Registration. When comparing standards for the quality…

  19. Schools of Quality. Third Edition.

    ERIC Educational Resources Information Center

    Bonstingl, John Jay

    This book presents the concept that quality as a keystone philosophy in today's business world can be applied to school systems as a means to improving education and all aspects of school culture, producing a school of quality. The author uses examples such as Japan's adopting William E. Deming's quality-control principles to help it skyrocket…

  20. Effects of Business School Student's Study Time on the Learning Process

    ERIC Educational Resources Information Center

    Tetteh, Godson Ayertei

    2016-01-01

    Purpose: This paper aims to clarify the relationship between the student's study time and the learning process in the higher education system by adapting the total quality management (TQM) principles-process approach. Contrary to Deming's (1982) constancy of purpose to improve the learning process, some students in higher education postpone their…

  1. Adapting Total Quality Doesn't Mean "Turning Learning into a Business."

    ERIC Educational Resources Information Center

    Schmoker, Mike; Wilson, Richard B.

    1993-01-01

    Although Alfie Kohn is a first-rate thinker, his article in the same "Educational Leadership" issue confuses adopting Total Quality Management methods with intelligently adapting them. Kohn wrestles too hard with the "worker/student" metaphor and wrongly disparages Deming's emphasis on data and performance. Schools can definitely benefit from…

  2. Program Manager: Journal of the Defense Systems Management College, Volume 22, Number 5, September-October 1993

    DTIC Science & Technology

    1993-10-01

    Edwards Deming, Joseph M. Juran, Kaoru Ishikawa and Philip B. Crosby stress the importance of work-force employee involvement to...contracts. An atmosphere of mutual trust and respect is necessary... DOD executives have lectured at...

  3. The Concepts of Quality for Rural and Small School Decision Makers.

    ERIC Educational Resources Information Center

    Wilson, Alfred P.; Hedlund, Paul H.

    This report briefly introduces the ideas of six influential individuals in the field of quality control, and relates these concepts to current educational innovations. Quality is defined by Philip B. Crosby as the result of a culture of relationships within an organization. W. Edwards Deming espouses intrinsic motivation for all employees,…

  4. Total Quality Management and Organizational Behavior Management: An Integration for Continual Improvement.

    ERIC Educational Resources Information Center

    Mawhinney, Thomas C.

    1992-01-01

    The history and main features of organizational behavior management (OBM) are compared and integrated with those of total quality management (TQM), with emphasis on W.E. Deming's 14 points and OBM's operant-based approach to performance management. Interventions combining OBM, TQM, and statistical process control are recommended. (DB)

  5. Organize Your School for Improvement

    ERIC Educational Resources Information Center

    Truby, William F.

    2017-01-01

    W. Edwards Deming has suggested 96% of organization performance is a function of the organization's structure. He contends only about 4% of an organization's performance is attributable to the people. This is a fundamental difference as most school leaders work with the basic assumption that 80% of a school's performance is related to staff and…

  6. The Quality Professor: Implementing TQM in the Classroom.

    ERIC Educational Resources Information Center

    Cornesky, Robert A.

    This volume describes Total Quality Management (TQM) in the higher education classroom and guides college faculty in implementing TQM to improve their teaching. Chapter 1 introduces TQM and gives pointers on how to begin implementing it. Chapter 2 describes TQM approaches and principles including the Deming and Crosby approaches, describes the TQM…

  7. Teacher-Student Perspectives of Invisible Pedagogy: New Directions in Online Problem-Based Learning Environments

    ERIC Educational Resources Information Center

    Barber, Wendy; King, Sherry

    2016-01-01

    Universities and institutions of higher education are facing economic pressures to sustain large classes, while simultaneously maintaining the quality of the online learning environment (Deming et al., 2015). Digital learning environments require significant pedagogical shifts on the part of the teacher. This paper is a qualitative examination of…

  8. Baldrige Educational Quality Criteria as Another Model for Accreditation in American Community Colleges.

    ERIC Educational Resources Information Center

    Faulkner, Jane B.

    Institutional accreditation is a voluntary, non-governmental activity administered by the eight postsecondary accrediting institutions that are part of the six regional associations that serve colleges and universities in the United States. The author cites W. Edwards Deming's work on corporate quality improvement, and its applicability to…

  9. The role of colonization in the dynamics of patchy populations of a cyclic vole species.

    PubMed

    Glorvigen, Petter; Gundersen, Gry; Andreassen, Harry P; Ims, Rolf A

    2013-09-01

    The crash phase of vole populations with cyclic dynamics regularly leads to vast areas of uninhabited habitat. Yet although the capacity of cyclic voles to re-colonize such empty space is likely to be large and is predicted to have evolved as a distinct life-history trait, the process of colonization and its effect on spatio-temporal dynamics have been little studied. Here we report on an experiment with root voles (Microtus oeconomus) specifically targeted at quantifying the process of colonization of empty patches from distant source patches and its resultant effect on local vole deme size variation in a patchy landscape. Three experimental factors, habitat quality, predation risk and inter-patch distance, were employed among 24 habitat patches in a 100 × 300-m experimental area. The first-born cohort in the spring efficiently colonized almost all empty patches irrespective of the degree of patch isolation and predation risk, but this was dependent on habitat quality. Just after the initial colonization wave, the deme sizes in patches of the same quality were underdispersed relative to Poisson variance, indicating regulated (density-dependent) settlement. Towards the end of the breeding season, local demographic processes acted to smooth out the initial post-colonization differences among source and colonization patches, and among patches of initially different quality. However, at this time demographic stochasticity had also given rise to large (overdispersed) variation in deme sizes that may have contributed to overshadowing the effect of other factors. The results of this experiment confirmed our expectation that the space-filling capacity of voles is large. The costs associated with transience appeared to be so low, at least at the spatial scale considered in this experiment, that such costs are not likely to substantially constrain habitat selection and colonization in the increase phase of cyclic patchy populations.

  10. TaqMan RT-PCR and VERSANT HIV-1 RNA 3.0 (bDNA) assay Quantification of HIV-1 RNA viral load in breast milk.

    PubMed

    Israel-Ballard, Kiersten; Ziermann, Rainer; Leutenegger, Christian; Di Canzio, James; Leung, Kimmy; Strom, Lynn; Abrams, Barbara; Chantry, Caroline

    2005-12-01

    Transmission of HIV via breast milk is a primary cause of pediatric HIV infection in developing countries, so reliable methods to quantify breast milk viral load are important. Our aim was to compare the VERSANT HIV 3.0 (bDNA) assay with real-time TaqMan PCR for quantifying breast milk HIV-1 RNA. Forty-six breast milk samples spiked with cell-free HIV-1 and eight samples spiked with cell-associated HIV-1 were assayed for HIV-1 RNA by both the VERSANT HIV 3.0 and TaqMan RNA assays; only the cell-free samples were compared statistically. Both a Deming regression slope and a Bland-Altman slope indicated a linear relationship between the two assays. TaqMan quantitations were on average 2.6 times higher than those of HIV 3.0. A linear relationship was also observed between serial dilutions of spiked cell-free HIV-1 and both assays. The two methods correlated well, although the VERSANT HIV 3.0 research protocol quantified HIV-1 RNA slightly lower than TaqMan.
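
    A Deming regression slope, as reported here, differs from ordinary least squares in that it allows measurement error in both assays, which is why it is the standard fit for method-comparison studies. A generic sketch (not the authors' code), assuming an error-variance ratio delta:

```python
import numpy as np

def deming(x, y, delta=1.0):
    """Deming regression of y on x, allowing measurement error in both
    variables; delta is the assumed ratio of the y- to x-error
    variances (delta = 1 gives orthogonal regression)."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    sxx = np.sum((x - mx) ** 2)          # sum of squares in x
    syy = np.sum((y - my) ** 2)          # sum of squares in y
    sxy = np.sum((x - mx) * (y - my))    # cross products
    slope = (syy - delta * sxx +
             np.sqrt((syy - delta * sxx) ** 2 +
                     4 * delta * sxy ** 2)) / (2 * sxy)
    intercept = my - slope * mx
    return slope, intercept
```

    A slope near 1 with an intercept near 0 indicates the two assays agree up to noise; a slope well above 1 reproduces the kind of proportional difference seen here between the TaqMan and bDNA quantitations.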

  11. Analysis of the Astronomy Diagnostic Test

    NASA Astrophysics Data System (ADS)

    Brogt, Erik; Sabers, Darrell; Prather, Edward E.; Deming, Grace L.; Hufnagel, Beth; Slater, Timothy F.

    Seventy undergraduate class sections were examined from the database of Astronomy Diagnostic Test (ADT) results of Deming and Hufnagel to determine if course format correlated with ADT normalized gain scores. Normalized gains were calculated for four different classroom scenarios: lecture, lecture with discussion, lecture with lab, and lecture with both lab and discussion. Statistical analysis shows that there are no significant differences in normalized gain among the self-reported classroom formats. Prerequisites related to mathematics courses did show differences in normalized gain. Of all reported course activities, only the lecture and the readings for the course correlate significantly with the normalized gain. This analysis suggests that the ADT may not have enough sensitivity to measure differences in the effectiveness of different course formats because of the wide range of topics that the ADT addresses with few questions. Different measures of gain and their biases are discussed. We argue that the use of the normalized gain is not always warranted because of its strong bias toward high pretest scores.
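
    The normalized gain used in studies like this is usually Hake's g: the achieved gain divided by the maximum possible gain. A sketch, assuming a 100-point score scale:

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: fraction of the possible pre-to-post
    improvement actually achieved. Undefined at a perfect pretest."""
    if pre >= max_score:
        raise ValueError("gain undefined at a perfect pretest score")
    return (post - pre) / (max_score - pre)
```

    The pretest bias discussed above is visible in the definition itself: a section moving from 80 to 92 gets g = 0.6, while one moving from 20 to 50 gets g = 0.375 despite a much larger raw gain.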

  12. Applying the PDCA Cycle to the Complex Task of Teaching and Assessing Public Relations Writing

    ERIC Educational Resources Information Center

    Knight, John E.; Allen, Sandra

    2012-01-01

    Teaching skills, knowledge and abilities appropriate for career-ready graduates and assessing learning are complex issues. Developing a valid and reliable approach is often by trial and error. Instead, the authors employed Deming's PDCA Cycle of continuous improvement as a systematic procedure to incrementally move closer to their goal. This paper…

  13. Apply Deming's Methods to K-12 Curriculum and Improve Student Achievement

    ERIC Educational Resources Information Center

    Kelly, Thomas F.

    2008-01-01

    The United States has been engaged in school reform for three decades. The federal government as well as all fifty states have passed numerous versions of reform legislation to mandate and regulate the process. Educators have adjusted their practices to the policy created by this legislation. They have also allocated hundreds of billions of…

  14. Should Business Reform Public Education? A "Rainy Night" for Georgia Teachers and Implications for Science Education.

    ERIC Educational Resources Information Center

    Aliff, John Vincent

    Into the "quality of public schools" issue step politicians with quick fixes--"proven" business practices variously rejected by experts Peter Drucker (Management by Objectives) and W. E. Deming (Quality Management). These include the following. Determine product quality by inspection--hence, compare school quality by testing teachers and students.…

  15. Echoes of the Vision: When the Rest of the Organization Talks Total Quality.

    ERIC Educational Resources Information Center

    Fairhurst, Gail T.

    1993-01-01

    Describes a case study of an organization that recently began implementing W. E. Deming's Total Quality (TQ). Finds and discusses five framing devices used in routine work conversations between leaders and members to implement the TQ vision: communicated predicaments, possible futures, jargon and vision themes, positive spin, and agenda setting.…

  16. Using Quality Management as a Bridge in Educating for Sustainability in a Business School

    ERIC Educational Resources Information Center

    Rusinko, Cathy A.

    2005-01-01

    Purpose: To demonstrate how quality management (QM), a widely accepted management paradigm, can be used to advance education for sustainability in the business curriculum. Design/methodology/approach: The assumptions of QM and environmental sustainability are explored. A class exercise is developed that uses QM tools--and in particular, Deming's…

  17. Evaluating For-Profit Higher Education: Evidence from the Education Longitudinal Study. A CAPSEE Working Paper

    ERIC Educational Resources Information Center

    Liu, Yuen Ting; Belfield, Clive

    2014-01-01

    This study evaluates the postsecondary and labor market outcomes of students who attended for-profit colleges. The evaluation complements a similar study by Deming, Goldin, and Katz (2012) that found significant differences in outcomes between students in for-profit colleges and those in other sectors. In this study we use the Education…

  18. Can Deming's Concept of Total Quality Management Be Applied to Education?

    ERIC Educational Resources Information Center

    Sevick, Charles

    This paper explores the meaning of Total Quality Management (TQM), examines the development of the concept, and assesses the application of TQM to education. In summary, TQM has the following points of relevance for education: (1) The interest and welfare of every student must be a primary concern; (2) the authoritarian management model does not…

  19. DCASR (Defense Contract Administration Services Region) Dallas Total Quality Management Implementation Plan

    DTIC Science & Technology

    1989-07-01

    Competitive Success; e. What is Total Quality Control? The Japanese Way (Kaoru Ishikawa); f. Managerial Breakthrough (J. M. Juran); g. The Deming Route to...Berger and Thomas H. Hart; p. Juran's Quality Control Handbook, Fourth Edition (J. M. Juran); q. Guide to Quality Control (Kaoru Ishikawa); r. Quality Assurance

  20. Strategies for Meeting High Standards: Quality Management and the Baldrige Criteria in Education. Lessons from the States.

    ERIC Educational Resources Information Center

    Barth, John; Burk, Zona Sharp; Serfass, Richard; Harms, Barbara Ann; Houlihan, G. Thomas; Anderson, Gerald; Farley, Raymond P.; Rigsby, Ken; O'Rourke, John

    This document, one of a series of reports, focuses on the adoption of principles of quality management, originally developed by W. Edwards Deming, and the Baldrige Criteria for use in education. These processes and tools for systemic organizational management, when comprehensively applied, produce performance excellence and continuous improvement.…

  1. Efficiency vs. Effectiveness: Can W. Edwards Deming's Principles of Quality Management Be Applied Successfully to American Education.

    ERIC Educational Resources Information Center

    Petry, John R.

    The field of education has been slow to recognize the Total Quality Management (TQM) concept. This resistance may result from entrenched management styles characterized by hierarchical decision-making structures. TQM emphasizes management based on leadership instead of management by objective, command, and coercion. The TQM concept consists of…

  2. The Relation among School District Health, Total Quality Principles for School Organization and Student Achievement

    ERIC Educational Resources Information Center

    Marshall, Jon; Pritchard, Ruie; Gunderson, Betsey

    2004-01-01

    The purpose of this study was to determine the congruence among W. E. Deming's 14 points for Total Quality Management (TQM), the organizational health of school districts, and student achievement. Based on Kanter's (1983) concept of a Culture of Pride with a Climate of Success, healthy districts were defined as having an organizational culture…

  3. Deming, quality and the small medical group administrator.

    PubMed

    Noll, D C

    1992-01-01

    As administrators, writes Douglas Noll, we can coordinate and implement quality measures that affect our practices and impact the patient's total medical experience. Unfortunately, many smaller groups cannot hire an outside consultant or a single employee whose sole purpose would be to monitor quality. Noll offers several simple practices that administrators can use to improve the quality of service in their groups.

  4. Outcomes-Balanced Framework for Emergency Management: A Predictive Model for Preparedness

    DTIC Science & Technology

    2013-09-01

    Total Quality Management (TQM) was developed by W. Edwards Deming in the post-World War II reconstruction period in Japan. It ushered in a...Figure 1. From Total Quality Management Principles; Figure 2. Outcomes Logic Model (After...THIRA: Threat and Hazard Identification and Risk Assessment; TQM: Total Quality Management; UTL: Universal Task List

  5. Some Recommendations for Education (and All of Us): Valuing Differences as Collaboration beyond Outcomes Assessment and Total Quality Management/Demingism.

    ERIC Educational Resources Information Center

    Sadler, Lynn Veach

    Recommendations for a national educational agenda that is based on tolerance for cultural diversity and real collaboration are presented in this paper with emphasis on the W. E. Deming model of Total Quality Management, or "Demingism." Two problems in American education are academic performance and the failure of disadvantaged schools. Ten…

  6. On the Calculation of Steady Flow in the Boundary Layer Near the Surface of a Cylinder in a Stream

    DTIC Science & Technology

    1934-07-17


  7. DefenseLink Special: Blum Visits 'Operation Jump Start' Troops

    Science.gov Websites

    DEMING, N.M., Dec. 1, 2006 - The National Guard Bureau chief got a look this week at how Army and Air National Guard troops are serving here along the southwestern U.S. border.

  8. Self-Organization of Ions at the Interface between Graphene and Ionic Liquid DEME-TFSI.

    PubMed

    Hu, Guangliang; Pandey, Gaind P; Liu, Qingfeng; Anaredy, Radhika S; Ma, Chunrui; Liu, Ming; Li, Jun; Shaw, Scott K; Wu, Judy

    2017-10-11

    Electrochemical effects manifest as nonlinear responses to an applied electric field in electrochemical devices, and are linked intimately to the molecular orientation of ions in the electric double layer (EDL). Herein, we probe the origin of the electrochemical effect using a double-gate graphene field effect transistor (GFET) with an ionic liquid N,N-diethyl-N-(2-methoxyethyl)-N-methylammonium bis(trifluoromethylsulfonyl)imide (DEME-TFSI) top-gate, paired with a ferroelectric Pb0.92La0.08Zr0.52Ti0.48O3 (PLZT) back-gate of compatible gating efficiency. The orientation of the interfacial molecular ions can be extracted by measuring the GFET Dirac point shift, and their dynamic response to ultraviolet-visible light and a gate electric field was quantified. We have observed that the strong electrochemical effect is due to TFSI anions self-organizing on a treated GFET surface. Moreover, a reversible order-disorder transition of TFSI anions self-organized on the GFET surface can be triggered by illuminating the interface with ultraviolet-visible light, revealing a useful method to control the surface ion configuration and the overall performance of the device.

  9. Spontaneous Ionic Polarization in Ammonia-Based Ionic Liquid [Spontaneous Ionic Polarization in Ionic Liquid]

    DOE PAGES

    Kim, Ki-jeong; Yuan, Hongtao; Jang, Hoyoung; ...

    2018-05-24

    Ionic liquids and gels have attracted attention for a variety of energy storage applications, as well as for high-performance electrolytes for batteries and supercapacitors. Although the electronic structure of ionic electrolytes in these applications is of practical importance for device design and improved performance, the understanding of the electronic structure of ionic liquids and gels is still at an early stage. Here we report soft x-ray spectroscopic measurements of the surface electronic structure of a representative ammonia-based ionic gel (DEME-TFSI with a PS-PMMA-PS copolymer). We observe that near the outermost surface, the area of the anion peak (the 1s N- core level in TFSI) is relatively larger than that of the cation peak (N+ in DEME). This spontaneous ionic polarization of the electrolyte surface, which is absent for the pure ionic liquid without copolymer, can be directly tuned by the copolymer content in the ionic gel, and further results in a modulation of the work function. Finally, these results shed new light on the control of surface electronic properties of ionic electrolytes, as well as on differences between their implementation in ionic liquids and gels.

  10. Isolation By Distance (IBD) signals in the deep-water rose shrimp Parapenaeus longirostris (Lucas, 1846) (Decapoda, Panaeidae) in the Mediterranean Sea.

    PubMed

    Lo Brutto, S; Maggio, T; Arculeo, M

    2013-09-01

    The identification of the boundaries of genetic demes is one of the major goals of fishery management, and a few Mediterranean commercial species have not yet been studied from a genetic point of view. The deep-water rose shrimp Parapenaeus longirostris (Lucas, 1846) is one of the most important components of commercial landings in the Mediterranean; its fishery aspects have received much attention, regrettably without any concern for the genetic architecture of its populations. The population structure in the central and eastern Mediterranean Sea (captures from six Italian and two Greek landings) has been analysed on the basis of surveys carried out with mitochondrial and AFLP markers. Data revealed a gradual discrepancy along a west-east axis. This species, occurring mainly at depths of between 100 and 400 m, is not strongly confined to isolated demes, but follows an 'Isolation By Distance' model within the Mediterranean Sea, which includes geographical areas with some degree of isolation. The role of hydrodynamic forces, such as currents and water fronts, is discussed, and further evidence of the 'Levantine isolation' within the Mediterranean basin is shown. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Spontaneous Ionic Polarization in Ammonia-Based Ionic Liquid [Spontaneous Ionic Polarization in Ionic Liquid]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Ki-jeong; Yuan, Hongtao; Jang, Hoyoung

    Ionic liquids and gels have attracted attention for a variety of energy storage applications, as well as for high-performance electrolytes for batteries and supercapacitors. Although the electronic structure of ionic electrolytes in these applications is of practical importance for device design and improved performance, the understanding of the electronic structure of ionic liquids and gels is still at an early stage. Here we report soft x-ray spectroscopic measurements of the surface electronic structure of a representative ammonia-based ionic gel (DEME-TFSI with a PS-PMMA-PS copolymer). We observe that near the outermost surface, the area of the anion peak (the 1s N- core level in TFSI) is relatively larger than that of the cation peak (N+ in DEME). This spontaneous ionic polarization of the electrolyte surface, which is absent for the pure ionic liquid without copolymer, can be directly tuned by the copolymer content in the ionic gel, and further results in a modulation of the work function. Finally, these results shed new light on the control of surface electronic properties of ionic electrolytes, as well as on differences between their implementation in ionic liquids and gels.

  12. Serum cystatin C is independently associated with renal impairment and high sensitivity C-reactive protein in systemic lupus erythematosus.

    PubMed

    Chew, Christine; Pemberton, Philip W; Husain, Awal Al-M; Haque, Sahena; Bruce, Ian N

    2013-01-01

    In systemic lupus erythematosus (SLE) patients, glomerular filtration rate (GFR) is usually estimated using the modified Cockcroft-Gault (mCG) and Modification of Diet in Renal Disease (MDRD) equations. We aimed to study cystatin C (sCysC) in SLE to assess its agreement with standard renal indices and to investigate factors affecting sCysC in SLE. SLE patients (≥4 ACR criteria) and healthy women from Greater Manchester were recruited and clinical assessments were undertaken. sCysC was measured using R & D Systems' ELISA. Agreement between renal measures was assessed using Deming plots, and factors associated with sCysC in SLE were examined by multiple linear regression analyses. 178 patients and 68 controls had median (IQR) ages of 53 (46-61) and 50 (39-60) years, respectively. In an age-adjusted analysis, SLE patients had higher sCysC (1.16 [0.98-1.36] vs. 0.95 [0.73-1.13] mg/l; p<0.0001), and within SLE those with a history of lupus nephritis had higher sCysC (1.31 [1.10-1.66] vs. 1.11 [0.95-1.29] mg/l; p<0.005). sCysC correlated positively with serum creatinine and inversely with the renal measures (r=-0.530; p<0.0001 [mCG], and r=-0.620; p<0.0001 [MDRD]). There was closer agreement between the two eGFR measures than between either eGFR measure and sCysC. In addition to age and serum creatinine, multivariate analysis (β, p) found that high-sensitivity C-reactive protein (hs-CRP) (0.03, 0.026) was also independently associated with sCysC in SLE. In SLE, sCysC may be influenced by low-grade inflammation as well as by renal dysfunction. Therefore, sCysC should not supplant current assessment of renal dysfunction in SLE.

  13. Point-of-care testing of electrolytes and calcium using blood gas analysers: it is time we trusted the results.

    PubMed

    Mirzazadeh, Mehdi; Morovat, Alireza; James, Tim; Smith, Ian; Kirby, Justin; Shine, Brian

    2016-03-01

    Point-of-care testing allows rapid analysis of samples to facilitate prompt clinical decisions. Electrolyte and calcium abnormalities are common in acutely ill patients and can be associated with life-threatening consequences. There is uncertainty whether clinical decisions can be based on the results obtained from blood gas analysers or if laboratory results should be awaited. To assess the agreement between sodium, potassium and calcium results from blood gas and mainstream laboratory analysers in a tertiary centre with a network of one referral and two peripheral hospitals served by three networked clinical biochemistry laboratories. Using the laboratory information management system database and over 11 000 paired samples across the three hospital sites, the results of sodium, potassium and ionised calcium on blood gas analysers were studied over a 5-year period and compared with the corresponding laboratory results from the same patients booked in the laboratory within 1 h. Pearson's linear correlation coefficients between laboratory and blood gas results for sodium, potassium and calcium were 0.92, 0.84 and 0.78, respectively. Deming regression analysis showed a slope of 1.04 and an intercept of -5.7 for sodium, a slope of 0.93 and an intercept of 0.22 for potassium, and a slope of 1.23 with an intercept of -0.55 for calcium. Under some strict statistical assumptions, the percentages of results lying outside the least significant difference were 9%, 26.7% and 20.8% for sodium, potassium and calcium, respectively. Most clinicians wait for laboratory confirmation of results generated by blood gas analysers. In a large retrospective study we have shown that there is sufficient agreement between the results obtained from the blood gas and laboratory analysers to enable prompt clinical decisions to be made. Published by the BMJ Publishing Group Limited.

  14. Multiple reaction monitoring with multistage fragmentation (MRM3) detection enhances selectivity for LC-MS/MS analysis of plasma free metanephrines.

    PubMed

    Wright, Michael J; Thomas, Rebecca L; Stanford, Phoebe E; Horvath, Andrea R

    2015-03-01

    LC-MS/MS with multiple reaction monitoring (MRM) is a powerful tool for quantifying target analytes in complex matrices. However, the technique lacks selectivity when plasma free metanephrines are measured. We propose the use of multistage fragmentation (MRM(3)) to improve the analytical selectivity of plasma free metanephrine measurement. Metanephrines were extracted from plasma with weak cation exchange solid-phase extraction before separation by hydrophilic interaction liquid chromatography. We quantified normetanephrine and metanephrine by either MRM or MRM(3) transitions m/z 166→134→79 and m/z 180→149→121, respectively. Over a 6-month period, approximately 1% (n = 21) of patient samples showed uncharacterized coeluting substances that interfered with the routine assay, resulting in an inability to report results. Quantification with MRM(3) removed these interferences and enabled measurement of the target compounds. For patient samples unaffected by interferences, Deming regression analysis demonstrated a correlation between MRM(3) and MRM methods of y = 1.00x - 0.00 nmol/L for normetanephrine and y = 0.99x + 0.03 nmol/L for metanephrine. Between the MRM(3) method and the median of all LC-MS/MS laboratories enrolled in a quality assurance program, the correlations were y = 0.97x + 0.03 nmol/L for normetanephrine and y = 1.03x - 0.04 nmol/L for metanephrine. Imprecision for the MRM(3) method was 6.2%-7.0% for normetanephrine and 6.1%-9.9% for metanephrine (n = 10). The lower limits of quantification for the MRM(3) method were 0.20 nmol/L for normetanephrine and 0.16 nmol/L for metanephrine. The use of MRM(3) technology improves the analytical selectivity of plasma free metanephrine quantification by LC-MS/MS while demonstrating sufficient analytical sensitivity and acceptable imprecision. © 2014 American Association for Clinical Chemistry.

  15. An evaluation of the DRI-ETG EIA method for the determination of ethyl glucuronide concentrations in clinical and post-mortem urine.

    PubMed

    Turfus, Sophie C; Vo, Tu; Niehaus, Nadia; Gerostamoulos, Dimitri; Beyer, Jochen

    2013-06-01

    A commercial enzyme immunoassay for the qualitative and semi-quantitative measurement of ethyl glucuronide (EtG) in urine was evaluated. Post-mortem (n=800) and clinical (n=200) urine samples were assayed using a Hitachi 902 analyzer. The determined concentrations were compared with those obtained using a previously published liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the quantification of EtG and ethyl sulfate. Using a cut-off of 0.5 µg/ml and an LC-MS/MS limit of reporting of 0.1 µg/ml, there was a sensitivity of 60.8% and a specificity of 100% for clinical samples. For post-mortem samples, sensitivity and specificity were 82.4% and 97.1%, respectively. When reducing the cut-off to 0.1 µg/ml, the sensitivity and specificity were 83.3% and 100% for clinical samples, whereas for post-mortem samples they were 90.3% and 88.3%, respectively. The best trade-offs between sensitivity and specificity for LC-MS/MS limits of reporting of 0.5 and 0.1 µg/ml were achieved when using immunoassay cut-offs of 0.3 and 0.092 µg/ml, respectively. There was good correlation between the quantitative results obtained by the two methods, but analysis by LC-MS/MS gave higher concentrations than enzyme immunoassay (EIA), with a statistically significant proportional bias (P<0.0001, Deming regression) for both sample types. The immunoassay is reliable for the qualitative and semi-quantitative presumptive detection of ethyl glucuronide in urine. Copyright © 2012 John Wiley & Sons, Ltd.
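
    Best trade-off cut-offs like those reported here are conventionally found by scanning candidate thresholds and maximizing Youden's J (sensitivity + specificity - 1) against the confirmatory method. A generic sketch; the 0.3 and 0.092 µg/ml values come from the authors' data, not from this code:

```python
import numpy as np

def best_cutoff(values, positive, cutoffs):
    """Return the (cutoff, J) pair maximizing Youden's J.

    values:   screening-assay results
    positive: booleans from the confirmatory method
    cutoffs:  candidate decision thresholds
    Assumes both classes are represented in `positive`.
    """
    values = np.asarray(values, dtype=float)
    positive = np.asarray(positive, dtype=bool)
    best = None
    for c in cutoffs:
        pred = values >= c                  # screen-positive calls
        sens = pred[positive].mean()        # true-positive rate
        spec = (~pred[~positive]).mean()    # true-negative rate
        j = sens + spec - 1.0
        if best is None or j > best[1]:
            best = (c, j)
    return best
```

    Because sensitivity and specificity move in opposite directions as the threshold shifts, J peaks at the threshold giving the best overall trade-off, which is the logic behind the reported optimum cut-offs.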

  16. Performance Assessment of a Trypanosoma cruzi Chimeric Antigen in Multiplex Liquid Microarray Assays.

    PubMed

    Santos, Fred Luciano Neves; Celedon, Paola Alejandra Fiorani; Zanchin, Nilson Ivo Tonin; Leitolis, Amanda; Crestani, Sandra; Foti, Leonardo; de Souza, Wayner Vieira; Gomes, Yara de Miranda; Krieger, Marco Aurélio

    2017-10-01

    Diagnosing chronic Chagas disease (CD) requires antibody-antigen detection methods, which are traditionally based on enzymatic assay techniques whose performance depends on the type and quality of antigen used. Previously, 4 recombinant chimeric proteins from the Instituto de Biologia Molecular do Paraná (IBMP-8.1 to 8.4) comprising immunodominant regions of diverse Trypanosoma cruzi antigens showed excellent diagnostic performance in enzyme-linked immunosorbent assays. Considering that next-generation platforms offer improved CD diagnostic accuracy with different T. cruzi-specific recombinant antigens, we assessed the performance of these chimeras in liquid microarrays (LMAs). The chimeric proteins were expressed in Escherichia coli and purified by chromatography. Sera from 653 chagasic and 680 healthy individuals were used to assess the performance of these chimeras in detecting specific anti-T. cruzi antibodies. Accuracies ranged from 98.1 to 99.3%, and diagnostic odds ratio values were 3,548 for IBMP-8.3, 4,826 for IBMP-8.1, 7,882 for IBMP-8.2, and 25,000 for IBMP-8.4. A separate sera bank (851 samples) was employed to assess cross-reactivity with other tropical diseases. Leishmania, a pathogen with high similarity to T. cruzi, showed cross-reactivity rates ranging from 0 to 2.17%. Inconclusive results were negligible (0 to 0.71%). Bland-Altman and Deming regression analysis based on 200 randomly selected CD-positive and negative samples demonstrated interchangeability with respect to CD diagnostic performance in both singleplex and multiplex assays. Our results suggested that these chimeras can potentially replace antigens currently used in commercially available assay kits. Moreover, the use of multiplex platforms, such as LMA assays employing 2 or more IBMP antigens, would abrogate the need for 2 different testing techniques when diagnosing CD. Copyright © 2017 American Society for Microbiology.
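
    The diagnostic odds ratio quoted above condenses sensitivity and specificity into one figure of merit; in its standard form it can be sketched as:

```python
def diagnostic_odds_ratio(sensitivity, specificity):
    """Odds of a positive result in the diseased divided by the odds
    of a positive result in the healthy. Undefined when either
    argument equals 1 (a perfect test)."""
    return (sensitivity * specificity) / \
           ((1.0 - sensitivity) * (1.0 - specificity))
```

    Reproducing the reported values (3,548 to 25,000) would require the underlying 2x2 counts; the function only illustrates how sensitivity and specificity combine, e.g. 90%/90% already yields a ratio of about 81.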

  17. Use of the Vettest 8008 and refractometry for determination of total protein, albumin, and globulin concentrations in feline effusions.

    PubMed

    Papasouliotis, Kostas; Murphy, Kate; Dodkin, Steve; Torrance, Andy G

    2002-01-01

    Pleural and peritoneal effusion is a common clinical finding in feline practice. Determination of fluid albumin (ALB) and globulin (GLOB) concentrations in addition to total protein (TP) concentration can be helpful in diagnosing or ruling out certain diseases in cats, especially feline infectious peritonitis (FIP). The objective of this study was to compare effusion TP, ALB, and GLOB results obtained by a refractometer and a bench-top dry chemistry analyzer with those results obtained by a reference method. Twenty-six pleural and 14 peritoneal effusion samples were analyzed from 40 cats with various diseases. TP and ALB concentrations were determined by a reference automated wet chemistry analyzer (Kone Specific, Kone Instruments, Espoo, Finland), a bench-top dry chemistry analyzer (Vettest 8008, IDEXX Laboratories Ltd, Chalfont St Peter, UK), and a refractometer (Atago SPR-T2, Atago Co, Tokyo, Japan). GLOB, albumin to globulin (A/G) ratio, and globulins as a percentage of total proteins (GLOB%) were calculated. Results were analyzed by paired t tests, difference plots, and Deming s regression analysis. Correlation coefficients (r) for TP with Vettest versus Kone and refractometer versus Kone methods were.97 and.94, respectively. GLOB and GLOB% values were significantly higher and A/G ratios were significantly lower with Vettest versus Kone methods. Correlation coefficients for ALB, GLOB, GLOB% and A/G ratio with Vettest versus Kone methods were.86,.93,.82, and.73, respectively. Although correlation with other methods was good, the refractometer underestimated TP concentrations in 3 samples. The refractometer is an acceptable method for determination of TP concentration in feline effusions. The Vettest 8008 also is an acceptable method for the determination of TP and ALB concentrations, however, calculated A/G ratios obtained with the Vettest are unacceptable.

  18. Estimating glomerular filtration rate in black South Africans by use of the modification of diet in renal disease and Cockcroft-Gault equations.

    PubMed

    van Deventer, Hendrick E; George, Jaya A; Paiker, Janice E; Becker, Piet J; Katz, Ivor J

    2008-07-01

    The 4-variable Modification of Diet in Renal Disease (4-v MDRD) and Cockcroft-Gault (CG) equations are commonly used for estimating glomerular filtration rate (GFR); however, neither of these equations has been validated in an indigenous African population. The aim of this study was to evaluate the performance of the 4-v MDRD and CG equations for estimating GFR in black South Africans against measured GFR and to assess the appropriateness for the local population of the ethnicity factor established for African Americans in the 4-v MDRD equation. We enrolled 100 patients in the study. The plasma clearance of chromium-51-EDTA ((51)Cr-EDTA) was used to measure GFR, and serum creatinine was measured using an isotope dilution mass spectrometry (IDMS) traceable assay. We estimated GFR using both the reexpressed 4-v MDRD and CG equations and compared it to measured GFR using 4 modalities: correlation coefficient, weighted Deming regression analysis, percentage bias, and proportion of estimated GFR within 30% of measured GFR (P(30)). The Spearman correlation coefficient between measured and estimated GFR for both equations was similar (4-v MDRD R(2) = 0.80 and CG R(2) = 0.79). Using the 4-v MDRD equation with the ethnicity factor of 1.212 as established for African Americans resulted in a median positive bias of 13.1 (95% CI 5.5 to 18.3) mL/min/1.73 m(2). Without the ethnicity factor, median bias was 1.9 (95% CI -0.8 to 4.5) mL/min/1.73 m(2). The 4-v MDRD equation, without the ethnicity factor of 1.212, can be used for estimating GFR in black South Africans.
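
    For reference, the two estimating equations compared in the study take the following standard forms; this is a generic sketch for IDMS-traceable creatinine in mg/dL, not the authors' implementation:

```python
def egfr_mdrd4(scr_mg_dl, age, female=False, black=False):
    """Re-expressed 4-variable MDRD eGFR in mL/min/1.73 m^2."""
    egfr = 175.0 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212  # African-American ethnicity factor the study questions
    return egfr

def crcl_cockcroft_gault(scr_mg_dl, age, weight_kg, female=False):
    """Cockcroft-Gault creatinine clearance estimate in mL/min."""
    crcl = (140.0 - age) * weight_kg / (72.0 * scr_mg_dl)
    if female:
        crcl *= 0.85
    return crcl
```

    Dropping the 1.212 factor simply divides the MDRD estimate by 1.212, which is the adjustment the study's bias results argue for in black South Africans.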

  19. A Case Study of Two Regional State Universities Qualifying as Learning Organizations Based on Administration and Staff Viewpoints

    ERIC Educational Resources Information Center

    Rich, Tammy Morrison

    2011-01-01

    This case study of 2 state universities qualifying as learning organizations, based on administration and staff viewpoints, was completed using a qualitative methodology. The idea of what a learning organization is can be different depending on who or what is being analyzed. For this study, the work of theorists including W. Edwards Deming,…

  20. Performance of the Defense Acquisition System, 2013 Annual Report

    DTIC Science & Technology

    2013-06-28

well beyond their start dates, which could indicate factors not attributable to the initial management of the contracts. Figure 2-13. DoD Total...Attributed to W. Edwards Deming. While the United States achieves its national security missions by equipping its military forces with the best weapons...annual reports on the performance of the defense acquisition system—its programs, institutions, workforce, managers, executives, and industrial partners

  1. A TQM involvement plan.

    PubMed

    Laws, H F

    1993-03-01

    We present the plan used to change our 35-bed medical treatment facility to a Deming Total Quality Improvement environment. This was successfully intermeshed with the military hierarchical chain of command and has resulted in a paradigm shift in the least amount of time within the facility. We present this plan, its steps for implementation, and our Quality Council organization so that similar military medical units may learn from our experience.

  2. Total Quality Management (TQM) Awareness Seminar. Revision 8

    DTIC Science & Technology

    1990-04-18

in the United States and abroad, including Dr. W. Edwards Deming, Dr. Joseph Juran, Philip Crosby, Genichi Taguchi, Kaoru Ishikawa, and Armand...Hall, Inc., Englewood Cliffs, NJ 07022. 1985 Ishikawa, Kaoru, Guide to Quality Control, Tokyo; Asian Productivity Organization, 1976 (Available from...Random House Business Division, 201 East 50th Street, New York, NY 10022. Ishikawa, Kaoru, What is Total Quality Control?: The Japanese Way, Prentice

  3. Retention of First and Second Class Petty Officers in the U.S. Coast Guard

    DTIC Science & Technology

    1979-09-01

enlistment Intentions). Intrinsic/extrinsic job satisfaction factors (as in the Herzberg model of what constitutes employee motivation), other specific...HERZBERG'S THEORY OF MOTIVATION --------------------------- 11 8. COMMITMENT----------------------------------------------- 12 C. EXPECTATIONS AND...deme has shown a major interest in studying and devising both descriptive and predictive models and explanations of job turnover and job motivations. A

  4. Demographic inference under the coalescent in a spatial continuum.

    PubMed

    Guindon, Stéphane; Guo, Hongbin; Welch, David

    2016-10-01

    Understanding population dynamics from the analysis of molecular and spatial data requires sound statistical modeling. Current approaches assume that populations are naturally partitioned into discrete demes, thereby failing to be relevant in cases where individuals are scattered on a spatial continuum. Other models predict the formation of increasingly tight clusters of individuals in space, which, again, conflicts with biological evidence. Building on recent theoretical work, we introduce a new genealogy-based inference framework that alleviates these issues. This approach effectively implements a stochastic model in which the distribution of individuals is homogeneous and stationary, thereby providing a relevant null model for the fluctuation of genetic diversity in time and space. Importantly, the spatial density of individuals in a population and their range of dispersal during the course of evolution are two parameters that can be inferred separately with this method. The validity of the new inference framework is confirmed with extensive simulations and the analysis of influenza sequences collected over five seasons in the USA. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Rivaroxaban Levels in Patients' Plasmas are Comparable by Using Two Different Anti Xa Assay/Coagulometer Systems Calibrated with Two Different Calibrators.

    PubMed

    Martinuzzo, Marta E; Duboscq, Cristina; Lopez, Marina S; Barrera, Luis H; Vinuales, Estela S; Ceresetto, Jose; Forastiero, Ricardo R; Oyhamburu, Jose

    2018-06-01

Rivaroxaban is an oral anticoagulant that does not require laboratory monitoring, but in some situations measurement of plasma levels is useful. The objective of this paper was to verify the analytical performance of, and compare, two rivaroxaban-calibrated anti-Xa assay/coagulometer systems using specific or other-brand calibrators. In 59 samples drawn at trough or peak from patients taking rivaroxaban, plasma levels were measured by HemosIL Liquid Anti Xa on the ACL TOP 300/500, and STA Liquid Anti Xa on the TCoag Destiny Plus. HemosIL and STA rivaroxaban calibrators and controls were used. CLSI guideline procedures were followed: EP15-A3 for precision and trueness, EP6 for linearity, and EP9 for methods comparison. Within-run and total coefficients of variation of plasma rivaroxaban were < 4.2% and < 4.85%, and bias was < 7.4% and < 6.5%, for the HemosIL-ACL TOP and STA-Destiny systems, respectively. Linearity was verified from 8 to 525 ng/mL, and Deming regression for methods comparison presented R = 0.963, 0.968, and 0.982, with a mean CV of 13.3% when using different systems and calibrations. The analytical performance of plasma rivaroxaban measurement was acceptable on both systems, and results from the reagent/coagulometer systems are comparable even when calibrating with different-brand material.
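The EP15-style verification of precision and trueness mentioned in this record can be sketched from a days-by-replicates experiment. The nested variance decomposition below is a simplified illustration, not the CLSI procedure itself; the function name and data layout are assumptions.

```python
import math
import statistics as st

def precision_verification(runs, assigned):
    """Estimate within-run CV%, total CV%, and bias% from a nested
    precision experiment.

    `runs` is a list of per-run replicate lists (e.g. 5 days x 5
    replicates); `assigned` is the target (assigned) value used to
    compute bias.
    """
    r = len(runs[0])                          # replicates per run
    run_means = [st.mean(run) for run in runs]
    grand = st.mean(run_means)
    var_within = st.mean(st.variance(run) for run in runs)  # pooled repeatability
    var_between = max(st.variance(run_means) - var_within / r, 0.0)
    var_total = var_within + var_between      # total (within-lab) variance
    cv_within = 100 * math.sqrt(var_within) / grand
    cv_total = 100 * math.sqrt(var_total) / grand
    bias = 100 * (grand - assigned) / assigned
    return cv_within, cv_total, bias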

  6. Analytical and clinical performance of the Hologic Aptima HCV Quant Dx Assay for the quantification of HCV RNA in plasma samples.

    PubMed

    Schønning, Kristian; Pedersen, Martin Schou; Johansen, Kim; Landt, Bodil; Nielsen, Lone Gilmor; Weis, Nina; Westh, Henrik

    2017-10-01

Chronic hepatitis C virus (HCV) infection can be effectively treated with directly acting antiviral (DAA) therapy. Measurement of HCV RNA is used to evaluate patient compliance and virological response during and after treatment. To compare the analytical performance of the Aptima HCV Quant Dx Assay (Aptima) and the COBAS Ampliprep/COBAS TaqMan HCV Test v2.0 (CAPCTMv2) for the quantification of HCV RNA in plasma samples, and compare the clinical utility of the two tests in patients undergoing treatment with DAA therapy. Analytical performance was evaluated on two sets of plasma samples: 125 genotyped samples and 172 samples referred for quantification of HCV RNA. Furthermore, performance was evaluated using dilution series of four samples containing HCV genotype 1a, 2b, 3a, and 4a, respectively. Clinical utility was evaluated on 118 plasma samples obtained from 13 patients undergoing treatment with DAAs. Deming regression of results from 187 plasma samples with HCV RNA >2 Log IU/mL indicated that the Aptima assay quantified higher than the CAPCTMv2 test for HCV RNA >4.9 Log IU/mL. The linearity of the Aptima assay was excellent across dilution series of four HCV genotypes (slope of the regression line: 1.00-1.02). The Aptima assay detected significantly more replicates below targeted 2 Log IU/mL than the CAPCTMv2 test, and yielded clearly interpretable results when used to analyze samples from patients treated with DAAs. The analytical performance of the Aptima assay makes it well suited for monitoring patients with chronic HCV infection undergoing antiviral treatment. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Wave-clouds coupling in the Jovian troposphere.

    NASA Astrophysics Data System (ADS)

    Gaulme, P.; Mosser, B.

    2003-05-01

The first studies of Jovian oscillations are due to Vorontsov et al. (1976). Attempts to observe them started in the late 1980s (Deming et al. 1989, Mosser et al. 1991). The micro-satellite Jovis and ground-based observation campaigns such as SYMPA (e.g., Baglin et al. 1999) call for an accurate analysis of the cloud response to an acoustic wave. Therefore, the propagation of sound or gravity waves in the Jovian troposphere is revisited in order to estimate their effect on the highest cloud layer. From basic thermodynamics, the troposphere should be stratified into three major ice cloud layers: water-ammonia, ammonium hydrosulfide and, highest of all, ammonia ice. The presence of ammonia ice clouds was inferred by Kuiper in 1952, and they were predicted to dominate the Jovian skies. However, they have been observed spectroscopically over less than one percent of the surface. This absence of spectral proof could come from a coating of the ammonia particles by other substances (Baines et al. 2002). In this work, we study the behaviour of a cloud subjected to a periodic pressure perturbation. We suppose a vertical wave propagating in a plane-parallel atmosphere that includes an ammonia ice cloud layer. We determine the relation between the Lagrangian pressure perturbation and the variation of the fraction of solid ammonia. The linearized equations governing the evolution of the perturbed Eulerian pressure and density terms allow us to study how the propagation is altered by the clouds and how the clouds move with the wave. Finally, because a pressure perturbation modifies the fraction of solid ammonia, we estimate how much an ammonia crystal should grow or shrink and how the cloud albedo could change with the wave. Baglin et al. 1999. BAAS 31, 813. Baines et al. 2002. Icarus 159, 74. Deming et al. 1989. Icarus 21, 943. Kuiper 1952. The Atmospheres of the Earth and Planets, pp. 306-405. Univ. of Chicago Press, Chicago. Mosser et al. 1991. A&A 251, 356.
Vorontsov et al. 1976. Icarus 27, 109.

  8. Total Quality Management: A Guide to Implementation

    DTIC Science & Technology

    1989-08-01

Kaizen. New York: Random House. 1986. Ishikawa, Kaoru. Guide to Quality Control. Asian Productivity Organization. 1984. Ishikawa, Kaoru. What is Total...with the problems and the new management principles are based on the complexities of the new Systems Age. The theories of Deming, Juran, Ishikawa, and...Progress. Jun 1985. Miller, Jeffery G. and Thomas E. Vollmann. "The Hidden Factory." Harvard Business Review. Sep-Oct 1985. Shimoyamada, Kaoru. "The

  9. Total Quality Management Guide. Volume 2. A Guide to Implementation

    DTIC Science & Technology

    1990-02-15

Kaoru. Guide to Quality Control. Asian Productivity Organization. 1984. Ishikawa, Kaoru. What is Total Quality Control? The Japanese Way. Englewood...of the new Systems Age. The theories of Deming, Juran, Ishikawa, and other management methods that still predominate are pioneers of Systems Age...Feigenbaum, Armand V. Total Quality Control. New York: McGraw-Hill Book Company. 1983. Imai, Masaaki. Kaizen. New York: Random House. 1986. Ishikawa

  10. TQM - Total Quality Management (Bibliography)

    DTIC Science & Technology

    1990-05-01

ISBN: 0-941893-00-6 Price: $27.00 NAVSWC Library Call No: TS156 S358 GUIDE TO QUALITY CONTROL Author: Ishikawa, Kaoru Publisher: Asian Productivity...teachings of Juran, Deming, Feigenbaum, Crosby, Taguchi, Shewhart, Ishikawa, and others. 19 OUT OF BEDLAM: MANAGEMENT BY QUALITY LEADERSHIP Author...Price: $60.00 WHAT IS TOTAL QUALITY CONTROL? THE JAPANESE WAY Author: Ishikawa, Kaoru Publisher: Prentice Hall Date: 1985 Pagination: 215pp ISBN: 0-13

  11. Measuring quality progress

    NASA Astrophysics Data System (ADS)

    Lambert, Larry D.

The study by the American Productivity & Quality Center (APQC) was commissioned by Loral Space Information Systems, Inc. and the National Aeronautics and Space Administration (NASA) to evaluate internal assessment systems. APQC benchmarked approaches to the internal assessment of quality management systems in three phases. The first phase included work conducted for the International Benchmarking Clearinghouse (IBC) and consisted of an in-depth analysis of the 1991 Malcolm Baldrige National Quality Award criteria. The second phase was also performed for the IBC and compared the 1991 award criteria among the following quality awards: Deming Prize, Malcolm Baldrige National Quality Award, The President's Award for Quality and Productivity Improvement, The NASA Excellence Award (The George M. Low Trophy) for Quality and Productivity Improvement and the Shigeo Shingo Award for Excellence in Manufacturing. The third phase compared the internal implementation approaches of 23 companies selected from American industry for their recognized, formal assessment systems.

  12. Measuring quality progress

    NASA Technical Reports Server (NTRS)

    Lambert, Larry D.

    1992-01-01

The study by the American Productivity & Quality Center (APQC) was commissioned by Loral Space Information Systems, Inc. and the National Aeronautics and Space Administration (NASA) to evaluate internal assessment systems. APQC benchmarked approaches to the internal assessment of quality management systems in three phases. The first phase included work conducted for the International Benchmarking Clearinghouse (IBC) and consisted of an in-depth analysis of the 1991 Malcolm Baldrige National Quality Award criteria. The second phase was also performed for the IBC and compared the 1991 award criteria among the following quality awards: Deming Prize, Malcolm Baldrige National Quality Award, The President's Award for Quality and Productivity Improvement, The NASA Excellence Award (The George M. Low Trophy) for Quality and Productivity Improvement and the Shigeo Shingo Award for Excellence in Manufacturing. The third phase compared the internal implementation approaches of 23 companies selected from American industry for their recognized, formal assessment systems.

  13. Isolation-By-Distance-and-Time in a stepping-stone model

    PubMed Central

    Duforet-Frebourg, Nicolas; Slatkin, Montgomery

    2015-01-01

    With the great advances in ancient DNA extraction, genetic data are now obtained from geographically separated individuals from both present and past. However, population genetics theory about the joint effect of space and time has not been thoroughly studied. Based on the classical stepping-stone model, we develop the theory of Isolation by Distance and Time. We derive the correlation of allele frequencies between demes in the case where ancient samples are present, and investigate the impact of edge effects with forward-in-time simulations. We also derive results about coalescent times in circular and toroidal models. As one of the most common ways to investigate population structure is principal components analysis (PCA), we evaluate the impact of our theory on PCA plots. Our results demonstrate that time between samples is an important factor. Ancient samples tend to be drawn to the center of a PCA plot. PMID:26592162
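The stepping-stone model underlying this record can be sketched with a minimal forward-in-time simulation: migration mixes each deme with its neighbours, then binomial resampling of 2N genes supplies drift. This is an illustrative toy, not the authors' framework; all parameter names and defaults are assumptions, and correlations between demes would be estimated across many replicate runs.

```python
import random

def stepping_stone(L=20, N=100, m=0.1, gens=200, p0=0.5, seed=1):
    """Wright-Fisher simulation on a circular 1-D stepping-stone model.

    Each generation, a deme's gene pool receives migrants from its two
    neighbours at total rate m (m/2 from each side), then 2N genes are
    resampled binomially (genetic drift). Returns the final allele
    frequency in each of the L demes.
    """
    rng = random.Random(seed)
    freqs = [p0] * L
    for _ in range(gens):
        # deterministic migration step on the circle
        mixed = [(1 - m) * freqs[i]
                 + (m / 2) * (freqs[i - 1] + freqs[(i + 1) % L])
                 for i in range(L)]
        # binomial drift: resample 2N genes per deme
        freqs = [sum(rng.random() < p for _ in range(2 * N)) / (2 * N)
                 for p in mixed]
    return freqs
```

Recording `freqs` at several past generations rather than only the last one would give the time-stratified (ancient vs. present) samples whose correlations the paper studies.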

  14. The squid-Vibrio symbioses: from demes to genes.

    PubMed

    Kimbell, Jennifer R; McFall-Ngai, Margaret J

    2003-04-01

    The monospecific light organ association between the Hawaiian sepiolid squid Euprymna scolopes and the marine luminous bacterium Vibrio fischeri has been used as a model for the study of the most common type of coevolved animal-bacterial interaction; i.e., the association of Gram-negative bacteria with the extracellular apical surfaces of polarized epithelia. Analysis of the squid-vibrio symbiosis has ranged from characterizations of the harvesting mechanisms by which the host ensures colonization by the appropriate symbiont to identification of bacteria-induced changes in host gene expression that accompany the establishment and maintenance of the relationship. Studies of this model have been enhanced by extensive collaboration with microbiologists, who are able to manipulate the genetics of the bacterial symbiont. The results of our studies have indicated that initiation and persistence of the association requires a complex, reciprocal molecular dialogue between these two phylogenetically distant partners.

  15. Reading Reform and the Role of Policy, Practice and Instructional Leadership on Reading Achievement: A Case Study of Grissom Elementary School

    ERIC Educational Resources Information Center

    Morrison, Faith Andrea

    2012-01-01

    The purpose of this study was to explore whether William Deming's 8 Step Model would increase reading achievement in 3rd grade students. The study investigated how well the process based plan-do-check-act model when used as a treatment with fidelity, coupled with the principal as instructional leader would result in success in the age of federal…

  16. Calculated and Observed Speeds of Cavitation About Two- and Three- Dimensional Bodies in Water

    DTIC Science & Technology

    1942-11-01

[Figure 1: sections of cylinders and bodies of revolution, with ellipse, parabola, and circle profiles]..."Propellers," by E.Z. Stowell and A.F. Deming, NACA Report 526, 1935. (2) "Flow about a Pair of Adjacent, Parallel Cylinders Normal to a Stream, Theoretical

  17. Statistics of Mass Production

    DTIC Science & Technology

    1993-05-01

smaller ones, and the second after Kaoru Ishikawa, who developed it in 1943. [Control chart figure: UCL and LCL vs. sample number; cycles and shifts on a control chart]...include Phillip Crosby, W. Edwards Deming, Armand Feigenbaum, Kaoru Ishikawa, and Joseph Juran. As an example of the ideas, the well-known 14 points of...tool for analyzing process dispersion. It is also referred to as the Ishikawa diagram, because Kaoru Ishikawa developed it, and the fishbone diagram

  18. Determining Successful Approaches for a Total Quality Management Training Program for Tripler Army Medical Center, Hawaii

    DTIC Science & Technology

    1993-08-01

Krowinski, W. J. (1990). Measuring and managing patient satisfaction. Chicago, IL: American Hospital Publishing. Stiffler, R. (1992). Making it work: The...Currently, there is no consensus among the experts on what method to use in executing a TQM training program. There are a wide variety of viewpoints...burnout, reenergize employees, and send a clear message to employees that management considers employees to be a valuable resource (Walton, 1986; Deming

  19. QMHC interview: Peter R. Scholtes [by Marie E. Sinioris].

    PubMed

    Scholtes, P R

    1993-01-01

    Peter R. Scholtes has a unique perspective on what it takes to build a world-class quality organization: A transformation of the relationships, environment, and dynamics within and between individuals and groups throughout an organization. He brings an organizational development perspective to quality management and, in particular, to the approach and practices advocated by W. Edwards Deming. This interview explores Mr. Scholtes' in-depth understanding and sometimes controversial views on quality improvement teams, team training, and performance appraisal.

  20. The Subject Matter of Process Improvement: A Topic and Reference Source for Software Engineering Educators and Trainers.

    DTIC Science & Technology

    1995-05-01

Random House Business Division, 1986. Ishikawa, Kaoru. What is Total Quality Control?: The Japanese Way. Englewood Cliffs, NJ: Prentice-Hall...by drawing attention to the vital few truly important problems. Cause-and-effect diagrams. Also called fishbone and Ishikawa diagrams due to their...February 1994. 5.2 The Seeds [Aguayo 91] [Crosby 79] [Crosby 92] [Deming 86] [Fellers 92] [Gitlow 87] [Gluckman 93] [Imai 86] [Ishikawa 85] [Juran

  1. The Coast Artillery Journal. Volume 86, Number 2, March-April 1943

    DTIC Science & Technology

    1943-04-01

enemy succeeded in pushing back the Soviet units. In the ensuing street fighting one of our batteries, shifting from place to place, kept up a...Eldrisco Apartments, Pacific Avenue and Broderick Street, San Francisco, California. "Charles G. Sage, Colonel, 200th Coast Artillery, United States...duced numbers for the armament manned." Colonel Sage was born at Sparks, Kansas. His wife, Mrs. Dorothy H. Sage, lives at 333 South Tin Street, Deming

  2. Army Support to the United States Border Patrol in the 21st Century

    DTIC Science & Technology

    2011-05-19

and Lieutenant Colonel (Promotable) Clifford J. Weinstein (United States Marine Corps). Thank you for letting me travel this important journey and...Operating Bases in Deming and Playas, New Mexico. The 4-14 CAV was preparing for its deployment to the Joint Readiness Training Center (JRTC) at Fort Polk...write strategic policy. Once a suitable bench of key planners comes back to USBP, they can travel throughout the UCs and train other agents across the

  3. A two-locus model of spatially varying stabilizing or directional selection on a quantitative trait

    PubMed Central

    Geroldinger, Ludwig; Bürger, Reinhard

    2014-01-01

The consequences of spatially varying, stabilizing or directional selection on a quantitative trait in a subdivided population are studied. A deterministic two-locus two-deme model is employed to explore the effects of migration, the degree of divergent selection, and the genetic architecture, i.e., the recombination rate and ratio of locus effects, on the maintenance of genetic variation. The possible equilibrium configurations are determined as functions of the migration rate. They depend crucially on the strength of divergent selection and the genetic architecture. The maximum migration rates are investigated below which a stable fully polymorphic equilibrium or a stable single-locus polymorphism can exist. Under stabilizing selection, but with different optima in the demes, strong recombination may facilitate the maintenance of polymorphism. Usually, however, and in particular with directional selection in opposite directions, the critical migration rates are maximized by a concentrated genetic architecture, i.e., by a major locus and a tightly linked minor one. Thus, complementing previous work on the evolution of genetic architectures in subdivided populations subject to diversifying selection, it is shown that concentrated architectures may aid the maintenance of polymorphism. Conditions are obtained when this is the case. Finally, the dependence of the phenotypic variance, linkage disequilibrium, and various measures of local adaptation and differentiation on the parameters is elaborated. PMID:24726489

  4. A two-locus model of spatially varying stabilizing or directional selection on a quantitative trait.

    PubMed

    Geroldinger, Ludwig; Bürger, Reinhard

    2014-06-01

The consequences of spatially varying, stabilizing or directional selection on a quantitative trait in a subdivided population are studied. A deterministic two-locus two-deme model is employed to explore the effects of migration, the degree of divergent selection, and the genetic architecture, i.e., the recombination rate and ratio of locus effects, on the maintenance of genetic variation. The possible equilibrium configurations are determined as functions of the migration rate. They depend crucially on the strength of divergent selection and the genetic architecture. The maximum migration rates are investigated below which a stable fully polymorphic equilibrium or a stable single-locus polymorphism can exist. Under stabilizing selection, but with different optima in the demes, strong recombination may facilitate the maintenance of polymorphism. Usually, however, and in particular with directional selection in opposite directions, the critical migration rates are maximized by a concentrated genetic architecture, i.e., by a major locus and a tightly linked minor one. Thus, complementing previous work on the evolution of genetic architectures in subdivided populations subject to diversifying selection, it is shown that concentrated architectures may aid the maintenance of polymorphism. Conditions are obtained when this is the case. Finally, the dependence of the phenotypic variance, linkage disequilibrium, and various measures of local adaptation and differentiation on the parameters is elaborated. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  5. WFC3: Precision Infrared Spectrophotometry with Spatial Scans of HD 189733b and Vega

    NASA Astrophysics Data System (ADS)

    McCullough, Peter R.; Crouzet, N.; Deming, D.; Madhusudhan, N.; Deustua, S. E.; WFC3

    2014-01-01

    The Wide Field Camera 3 (WFC3) on the Hubble Space Telescope (HST) now routinely provides near-infrared spectroscopy of transiting extrasolar planet atmospheres with better than ~50 ppm precision per 0.05-micron resolution bin per transit, for sufficiently bright host stars. Two improvements of WFC3 (the detector) and HST (the spatial scanning technique) have made transiting planet spectra more sensitive and more repeatable than was feasible with NICMOS. In addition, the data analysis is much simpler with WFC3 than with NICMOS. We present time-series spectra of HD 189733b from 1.1 to 1.7 microns in transit and eclipse with fidelity similar to that of the WFC3 transit spectrum of HD 209458b (Deming et al. 2013). In a separate program, we obtained scanned infrared spectra of the bright star, Vega, thereby extending the dynamic range of WFC3 to ~26 magnitudes! Analysis of these data will affect the absolute spectrophotometric calibration of the WFC3, placing it on an SI traceable scale.

  6. A stochastic simulator of birth-death master equations with application to phylodynamics.

    PubMed

    Vaughan, Timothy G; Drummond, Alexei J

    2013-06-01

    In this article, we present a versatile new software tool for the simulation and analysis of stochastic models of population phylodynamics and chemical kinetics. Models are specified via an expressive and human-readable XML format and can be used as the basis for generating either single population histories or large ensembles of such histories. Importantly, phylogenetic trees or networks can be generated alongside the histories they correspond to, enabling investigations into the interplay between genealogies and population dynamics. Summary statistics such as means and variances can be recorded in place of the full ensemble, allowing for a reduction in the amount of memory used--an important consideration for models including large numbers of individual subpopulations or demes. In the case of population size histories, the resulting simulation output is written to disk in the flexible JSON format, which is easily read into numerical analysis environments such as R for visualization or further processing. Simulated phylogenetic trees can be recorded using the standard Newick or NEXUS formats, with extensions to these formats used for non-tree-like inheritance relationships.
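A minimal version of the kind of birth-death master-equation simulation this tool performs is the Gillespie stochastic simulation algorithm. The sketch below handles a single unstructured population, without the demes, genealogies, or XML model specification the article describes, and all parameter names are illustrative:

```python
import random

def gillespie_birth_death(n0, birth, death, t_max, seed=None):
    """Exact stochastic simulation of a linear birth-death process.

    Each individual gives birth at rate `birth` and dies at rate
    `death`, so with n individuals the total event rate is
    n * (birth + death). Returns the trajectory as (time, size) pairs.
    """
    rng = random.Random(seed)
    t, n = 0.0, n0
    traj = [(t, n)]
    while n > 0:
        total = n * (birth + death)
        t += rng.expovariate(total)          # waiting time to next event
        if t >= t_max:
            break
        if rng.random() < birth / (birth + death):
            n += 1                           # birth event
        else:
            n -= 1                           # death event
        traj.append((t, n))
    return traj
```

Running many replicates of such trajectories and summarising them by means and variances, rather than storing every history, corresponds to the ensemble mode described in the abstract.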

  7. A Stochastic Simulator of Birth–Death Master Equations with Application to Phylodynamics

    PubMed Central

    Vaughan, Timothy G.; Drummond, Alexei J.

    2013-01-01

    In this article, we present a versatile new software tool for the simulation and analysis of stochastic models of population phylodynamics and chemical kinetics. Models are specified via an expressive and human-readable XML format and can be used as the basis for generating either single population histories or large ensembles of such histories. Importantly, phylogenetic trees or networks can be generated alongside the histories they correspond to, enabling investigations into the interplay between genealogies and population dynamics. Summary statistics such as means and variances can be recorded in place of the full ensemble, allowing for a reduction in the amount of memory used—an important consideration for models including large numbers of individual subpopulations or demes. In the case of population size histories, the resulting simulation output is written to disk in the flexible JSON format, which is easily read into numerical analysis environments such as R for visualization or further processing. Simulated phylogenetic trees can be recorded using the standard Newick or NEXUS formats, with extensions to these formats used for non-tree-like inheritance relationships. PMID:23505043

  8. Three Experts on Quality Management: Philip B. Crosby, W. Edwards Deming, Joseph M. Juran

    DTIC Science & Technology

    1992-07-01

Department of the Navy Office of the Under Secretary of the Navy Total Quality Leadership Office THREE EXPERTS ON QUALITY MANAGEMENT: PHILIP B. CROSBY, W...research, as the "price of nonconformance." To aid managers in statistical theory, statistical thinking, and the application tracking the cost of doing...Quality Management emphasizes that the process must become a way of life in Theory of Systems. "A system is a series of the organization. Continuance is

  9. Total Quality Management in Logistics: A Case Study from the Trucking Industry

    DTIC Science & Technology

    1992-06-01

    Quality Management (TQM) movement on the logistics industry as a whole, and, more specifically, its impact within the trucking industry. Its focus then narrows to study the practical aspects of implementing a W. Edwards Deming-based quality program within a particular trucking company, Mason Transporters, Inc. The effectiveness of the company’s implementation effort is assessed using data collected from a survey questionnaire, formal interviews, and personal observations during an on- site visit. Successes and shortcomings of the implementation process are highlighted

  10. An Experimental Evaluation of Image Quality for Various Scenarios in a Chromotomographic System With a Spinning Prism

    DTIC Science & Technology

    2014-03-27

    pixels. The proprietary Phantom software allowed the user to crop down the usable pixels for FOV and speed purposes. Each recording was made with only the...sampled at discrete wavelengths and not a continuous spectrum, Mooney and Deming represent Equation (2.10) as a summation rather than an integral: Gm ...project were arbitrarily chosen between 1 and 10, with 5 being the most common. The full importance of µ was not investigated in this project. Lastly

  11. A Study as to the Feasibility of the Department of Defense Mandating its Supplier Base Adopt Total Quality Management

    DTIC Science & Technology

    1989-12-01

leadership for change. 3. Cease dependence on inspection to achieve quality. 4. End the practice of awarding business on the basis of price tag. 5. Improve...training on the job. 7. Institute leadership. The aim of supervision should be to help people do a better job. 8. Eliminate fear from the workplace...and the deadly diseases and the obstacles. Only then may they place themselves in roles of leadership. (Deming, p. 119) The same screening and

  12. The U.S. Automobile Industry: Will It Survive Increasing International Competition

    DTIC Science & Technology

    1991-04-22

the best-selling automobile in the country by almost a 2:1 margin was the Japanese-designed Honda Accord. 1 It has been estimated that since 1980 over...the industry made to U.S. war efforts. I will compare the American automobile industry to its foreign competitors, take a look at the strategies used...during WW II in bomber production with excellent results. After the war, Deming's methods were ignored by domestic automobile producers. Nissan was

  13. Gender disparity in BMD conversion: a comparison between Lunar and Hologic densitometers.

    PubMed

    Ganda, Kirtan; Nguyen, Tuan V; Pocock, Nicholas

    2014-01-01

Female-derived inter-conversion and standardised BMD equations at the lumbar spine and hip have not been validated in men. This study of 110 male subjects scanned on Hologic and Lunar densitometers demonstrates that published equations may not be applicable to men at the lumbar spine. Male inter-conversion equations have also been derived. Currently available equations for inter-manufacturer conversion of bone mineral density (BMD) and calculation of standardised BMD (sBMD) are used in both males and females, despite being derived and validated only in women. Our aim was to test the validity of the published equations in men. One hundred ten men underwent lumbar spine (L2-4), femoral neck (FN) and total hip (TH) dual X-ray absorptiometry (DXA) using Hologic and Lunar scanners. Hologic BMD was converted to Lunar using published equations derived from women for L2-4 and FN. Actual Lunar BMD (A-Lunar) was compared to converted (Lunar equivalent) Hologic BMD values (H-Lunar). sBMD was calculated separately using Hologic (sBMD-H) and Lunar BMD (sBMD-L) at L2-4, FN and TH. Conversion equations in men for Hologic to Lunar BMD were derived using Deming regression analysis. There was a strong linear correlation between Lunar and Hologic BMD at all skeletal sites. A-Lunar BMD was however significantly higher than derived H-Lunar BMD (p < 0.001) at L2-L4 (mean difference, 0.07 g/cm(2)). There was no significant difference at the FN (mean difference, 0.01 g/cm(2)). sBMD-L at the spine was significantly higher than sBMD-H (mean difference, 0.06 g/cm(2), p < 0.001), whilst there was little difference at the FN and TH (mean difference, 0.01 g/cm(2)). Published conversion equations for Lunar BMD to Hologic BMD, and formulae for lumbar spine sBMD, derived in women may not be applicable to men.

  14. Perceptions of a HIV testing message targeted for at-risk adults with low functional health literacy

    NASA Astrophysics Data System (ADS)

    Hunter, Susan L.

    This study analyses warehoused data collected by Georgia State University and Centers for Disease Control and Prevention (GSU/CDC) researchers after developing an HIV testing message for urban adults with low functional health literacy. It expands previous work by examining data collected when 202 primarily African-American homeless clients of an urban community based organization (CBO) reviewed both the low literacy brochure (Wallace et al., 2006) and a standard HIV brochure (Georgia Department of Human Resources, 1997). Participants' health literacy was assessed using 2 measures: the Rapid Estimate of Adult Literacy in Medicine or REALM (Davis, Crouch, Long & Green) and the Test of Functional Health Literacy Assessment or TOFHLA (Nurss, Parker & Baker, 2001). HIV risk was determined using an interview questionnaire developed by the research group (Belcher, Deming, Hunter & Wallace, 2005) which allowed participants to self-report recent alcohol and drug use, sexual behavior, sexually transmitted disease (STD) history and exposure to abuse and sexual coercion. Open-ended response questions regarding readability, understanding, main message, and importance for each brochure provided the qualitative data. This analysis confirms previous work showing accessibility, readability, cultural sensitivity and user-friendly formatting are important when attempting to engage at-risk adults with varying levels of functional health literacy in an HIV testing message. The visual aspects of the brochure can be essential in capturing the reader's attention and should be relevant to the target audience (Wallace, Deming, Hunter, Belcher & Choi, 2006). Mono-colored graphics may be perceived as dated and irrelevant or, worse yet, threatening to some readers. Whenever possible, culturally appropriate color photos of people depicting relevant content should replace excess text, and difficult medical terms should be eliminated. 
Wording on the cover and within the brochure should be used to focus the reader on a single main message. This data also shows that many participants considered the quantity of information just as important. For reasons not elucidated here, many respondents equated quantity of information with message quality. Based on these results it is important to further clarify how much information is enough to maintain legitimacy and the reader's attention while simultaneously avoiding confusing mixed messages.

  15. Analysis of Repeatability and Reliability of Warm IRAC Observations of Transiting Exoplanets

    NASA Astrophysics Data System (ADS)

    Carey, Sean J.; Krick, Jessica; Ingalls, James

    2015-12-01

    Extracting information about thermal profiles and composition of the atmospheres of transiting exoplanets is extremely challenging due to the small differential signal of the atmosphere in observations of transits, secondary eclipses, and full phase curves for exoplanets. The relevant signals are often at the level of 100 ppm or smaller and require the removal of significant instrumental systematics in the two infrared instruments currently capable of providing information at this precision, WFC3 on HST and IRAC aboard the Spitzer Space Telescope. For IRAC, the systematics are due to the interplay of residual telescope pointing variation with intra-pixel gain variations in the moderately undersampled camera. There is currently a debate in the community on the reliability of repeated IRAC observations of exoplanets, particularly those in eclipse, from which inferences about atmospheric temperature and pressure profiles can be made. To assess the repeatability and reliability of post-cryogenic observations with IRAC, the Spitzer Science Center in conjunction with volunteers from the astronomical community has performed a systematic analysis of the removal of systematics and repeatability of warm IRAC observations. Recently, a data challenge consisting of the measurement of ten secondary eclipses of XO-3b (see Wong et al. 2014) and a complementary analysis of a synthetic version of the XO-3b data was undertaken. We report on the results of this data challenge. Five different techniques were applied to the data (BLISS mapping [Stevenson et al. (2012)], kernel regression using the science data [Wong et al. (2015)] and calibration data [Krick et al. (2015)], pixel-level decorrelation [Deming et al. (2015)], ICA [Morello et al. (2015)] and Gaussian Processes [Evans et al. (2015)]) and found consistent results in terms of eclipse depth and reliability in both the actual and synthetic data. 
In addition, each technique obtained the input eclipse depth in the simulated data within the stated measurement uncertainty. The reported uncertainties for each measurement approach the photon noise limit. These findings generally refute the results of Hansen et al. (2014) and suggest that inferences about atmospheric properties can be reasonably made using warm IRAC data. Application of our test methods to future observations using JWST (in particular the MIRI instrument) will be discussed.

  16. Outbreeding effects in an inbreeding insect, Cimex lectularius.

    PubMed

    Fountain, Toby; Butlin, Roger K; Reinhardt, Klaus; Otti, Oliver

    2015-01-01

    In some species, populations with few founding individuals can be resilient to extreme inbreeding. Inbreeding seems to be the norm in the common bed bug, Cimex lectularius, a flightless insect that, nevertheless, can reach large deme sizes and persist successfully. However, bed bugs can also be dispersed passively by humans, exposing inbred populations to gene flow from genetically distant populations. The introduction of genetic variation through this outbreeding could lead to increased fitness (heterosis) or be costly by causing a loss of local adaptation or exposing genetic incompatibility between populations (outbreeding depression). Here, we addressed how inbreeding within demes and outbreeding between distant populations impact fitness over two generations in this re-emerging public health pest. We compared fitness traits of families that were inbred (mimicking reproduction following a founder event) or outbred (mimicking reproduction following a gene flow event). We found that outbreeding led to increased starvation resistance compared to inbred families, but this benefit was lost after two generations of outbreeding. No other fitness benefits of outbreeding were observed in either generation, including no differences in fecundity between the two treatments. Resilience to inbreeding is likely to result from the history of small founder events in the bed bug. Outbreeding benefits may only be detectable under stress and when heterozygosity is maximized without disruption of coadaptation. We discuss the consequences of these results both in terms of inbreeding and outbreeding in populations with genetic and spatial structuring, as well as for the recent resurgence of bed bug populations.

  17. A multinational study to develop universal standardization of whole-body bone density and composition using GE Healthcare Lunar and Hologic DXA systems.

    PubMed

    Shepherd, John A; Fan, Bo; Lu, Ying; Wu, Xiao P; Wacker, Wynn K; Ergun, David L; Levine, Michael A

    2012-10-01

    Dual-energy x-ray absorptiometry (DXA) is used to assess bone mineral density (BMD) and body composition, but measurements vary among instruments from different manufacturers. We sought to develop cross-calibration equations for whole-body bone density and composition derived using GE Healthcare Lunar and Hologic DXA systems. This multinational study recruited 199 adult and pediatric participants from a site in the US (n = 40, ages 6 through 16 years) and one in China (n = 159, ages 5 through 81 years). The mean age of the participants was 44.2 years. Each participant was scanned on both GE Healthcare Lunar and Hologic Discovery or Delphi DXA systems on the same day (US) or within 1 week (China) and all scans were centrally analyzed by a single technologist using GE Healthcare Lunar Encore version 14.0 and Hologic Apex version 3.0. Paired t-tests were used to test for differences in results between the systems. Multiple regression and Deming regressions were used to derive the cross-conversion equations between the GE Healthcare Lunar and Hologic whole-body scans. Bone and soft tissue measures were highly correlated between the GE Healthcare Lunar and Hologic systems, with r ranging from 0.96 (percent fat [PFAT]) to 0.98 (BMC). Significant differences were found between the two systems, with average absolute differences for PFAT, BMC, and BMD of 1.4%, 176.8 g and 0.013 g/cm(2), respectively. After cross-calibration, no significant differences remained between GE Healthcare Lunar measured results and the results converted from Hologic. The equations we derived reduce differences between BMD and body composition as determined by GE Healthcare Lunar and Hologic systems and will facilitate combining study results in clinical or epidemiological studies. Copyright © 2012 American Society for Bone and Mineral Research.

  18. Process safety improvement--quality and target zero.

    PubMed

    Van Scyoc, Karl

    2008-11-15

    Process safety practitioners have adopted quality management principles in design of process safety management systems with positive effect, yet achieving safety objectives sometimes remains a distant target. Companies regularly apply tools and methods which have roots in quality and productivity improvement. The "plan, do, check, act" improvement loop, statistical analysis of incidents (non-conformities), and performance trending popularized by Dr. Deming are now commonly used in the context of process safety. Significant advancements in HSE performance are reported after applying methods viewed as fundamental for quality management. In pursuit of continual process safety improvement, the paper examines various quality improvement methods, and explores how methods intended for product quality can be additionally applied to continual improvement of process safety. Methods such as Kaizen, Poka-yoke, and TRIZ, while long established for quality improvement, are quite unfamiliar in the process safety arena. These methods are discussed for application in improving both process safety leadership and field work team performance. Practical ways to advance process safety, based on the methods, are given.

  19. Application of the Deming Management Method to Implement Total Quality Management in the DoD

    DTIC Science & Technology

    1991-09-01

    of the authors and do not reflect the official policy or position of the Department of Defense or the U.S. Government. ...Plans should be made that reflect the long-term well-being of a firm, not just the next quarterly report. Investments must be made in the...world leaders. While this was happening U.S. firms continued with their post-WWII mindset. Industries in the U.S. must realize they can not survive

  20. Reply [to “An open letter to Ellen Druffel” and to “A second look at gender distribution among AGU Fellows”]

    NASA Astrophysics Data System (ADS)

    Druffel, Ellen R. M.

    The purpose of my brief article [Druffel, 1994] was to inform AGU members of gender inequity within the AGU Fellows. I raised the question 2 years ago, then assembled the statistics and reported the results to the Fellows Committee. At their recommendation, I wrote the article for Eos to report the facts. The reasons for this inequity were intended to be discussed separately. Deming presents several hypotheses that he feels may explain the inequity. I address them below.

  1. The primacy of the patient and family in a quality-improvement environment.

    PubMed

    Walker, J K

    1995-09-01

    The primary customers of health care services are the patient and family. It is important to adopt a mission and philosophy that put the patient and family at the center of all quality improvement programs. The principles put forth by Deming in his 14 points can be applied to patient-focused quality improvement measures. Creating a foundation for the professional practice of nursing and using and expanding tools that are already in use can help care providers meet the needs of their customers and help people to live healthier, better lives.

  2. The role of CQI in the strategic planning process.

    PubMed

    Sahney, V K; Warden, G L

    1993-01-01

    This article describes the strategic planning process used to define the health care needs of a region and to prepare Henry Ford Health System (HFHS) to meet the needs of the 21st century. It presents key applications of continuous quality improvement in the development and implementation of the strategic plans for HFHS; explains how HFHS adapted the Deming/Shewhart cycle of continuous improvement for the purpose of improving its planning process; and delineates how the strategic planning, financial planning, and quality planning processes have been integrated.

  3. Hardware Acceleration Of Multi-Deme Genetic Algorithm for DNA Codeword Searching

    DTIC Science & Technology

    2008-01-01

    C and G are complementary to each other. A Watson-Crick complement of a DNA sequence is another DNA sequence which replaces all the A with T or vice...versa and replaces all the T with A or vice versa, and also switches the 5’ and 3’ ends. A DNA sequence binds most stably with its Watson-Crick...bind with 5 Watson-Crick pairs. The length of the longest complementary sequence between two flexible DNA strands, A and B, is the same as the

  4. A complementary organic inverter of porphyrazine thin films: low-voltage operation using ionic liquid gate dielectrics.

    PubMed

    Fujimoto, Takuya; Miyoshi, Yasuhito; Matsushita, Michio M; Awaga, Kunio

    2011-05-28

    We studied a complementary organic inverter consisting of a p-type semiconductor, metal-free phthalocyanine (H(2)Pc), and an n-type semiconductor, tetrakis(thiadiazole)porphyrazine (H(2)TTDPz), operated through the ionic-liquid gate dielectrics of N,N-diethyl-N-methyl(2-methoxyethyl)ammonium bis(trifluoromethylsulfonyl)imide (DEME-TFSI). This organic inverter exhibits high performance with a very low operation voltage below 1.0 V and a dynamic response up to 20 Hz. © The Royal Society of Chemistry 2011

  5. Near patient cholesterol testing in patients with peripheral arterial disease.

    PubMed

    Hobbs, S D; Jones, A; Wilmink, A B; Bradbury, A W

    2003-09-01

    To assess the bias, precision and utility of the Bioscanner 2000 for near patient testing of total cholesterol (NPTC) in patients with peripheral arterial disease (PAD). One hundred consecutive patients attending a hospital-based clinic with symptomatic PAD underwent non-fasting NPTC using a finger-prick blood sample and a laboratory total cholesterol (TC) using blood drawn from an antecubital fossa vein. The Bioscanner 2000 showed good precision with a coefficient of variation of 1.8-3.8%. NPTC was significantly lower than laboratory TC (mean (S.D.) 4.67 (1.1) vs. 5.12 (1.2) mmol/l), p ≤ 0.01, paired Student's t-test. Comparing the two methods using Deming regression revealed a 15% negative bias for the Bioscanner 2000 compared to laboratory testing, which was demonstrated to be a systematic bias using a Bland-Altman plot. Almost half (46%) of the readings differed by > 0.5 mmol/l, 16% by > 1.0 mmol/l and 3% by > 2 mmol/l. This means that if the cut-off for statin treatment were taken as a TC of 5.0 or 3.5 mmol/l then, based on NPTC alone, 18 and 6% of patients, respectively, would not have received a statin. In the present study, NPTC significantly under-estimated TC when compared to laboratory testing. However, in the majority of cases, this would not have affected the decision to prescribe a statin and NPTC testing allows the immediate institution or titration of statin treatment.
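    The Bland-Altman analysis used in this and several other records above summarizes agreement between two methods as the mean paired difference (bias) plus 95% limits of agreement. A minimal sketch (illustrative only, not the study's code; the input lists are hypothetical paired measurements):

    ```python
    import math

    def bland_altman(a, b):
        """Return (bias, lower limit, upper limit) for paired measurements.

        bias is the mean of the differences a - b; the 95% limits of
        agreement are bias +/- 1.96 * SD of the differences.
        """
        d = [ai - bi for ai, bi in zip(a, b)]
        n = len(d)
        bias = sum(d) / n
        sd = math.sqrt(sum((di - bias) ** 2 for di in d) / (n - 1))
        return bias, bias - 1.96 * sd, bias + 1.96 * sd
    ```

    A roughly constant bias across the measurement range (as reported here for the Bioscanner 2000) shows up as differences scattered evenly around a nonzero mean line on the Bland-Altman plot.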

  6. Development and Validation of a Simple and Rapid UPLC-MS Assay for Valproic Acid and Its Comparison With Immunoassay and HPLC Methods.

    PubMed

    Zhao, Mingming; Li, Guofei; Qiu, Feng; Sun, Yaxin; Xu, Yinghong; Zhao, Limei

    2016-04-01

    Valproic acid (VPA), a widely used antiepileptic drug, has a narrow therapeutic range of 50-100 mcg/mL and shows large individual variability. It is very important to monitor the trough concentration of VPA using a reliable method. Therefore, the aim of this study was to develop and validate a rapid ultraperformance liquid chromatographic-mass spectrometry (UPLC-MS) method for quantification of VPA in human serum and to compare with fluorescence polarization immunoassay (FPIA), chemiluminescence microparticle immunoassay (CMIA), and high-performance liquid chromatography (HPLC) methods. The method included extraction of VPA in serum by deproteinization with acetonitrile. The analysis was performed using an EC-C18 column (2.7 μm, 4.6 × 50 mm) under isocratic conditions with a mobile phase of acetonitrile/water (containing 0.1% formic acid) (45/55, vol/vol) at a flow rate of 0.6 mL/min. The detection was performed on a triple-quadrupole tandem mass spectrometer using an electrospray probe in the negative ionization mode. The method was validated by studies of selectivity, linearity, lower limit of quantification, accuracy, precision, recovery, matrix effect, and stability. Furthermore, all 4 methods (UPLC-MS, FPIA, CMIA, and HPLC) were subsequently used to assay the VPA concentration in 498 clinical serum samples collected from patients who received VPA. These methods were compared by Deming regression and Bland-Altman analysis. The retention time of VPA was 2.09 minutes. The calibration curve was linear over the concentration range of 1-200 mcg/mL, with a lower limit of quantification of 1 mcg/mL. The interday and intraday precision (RSD %) was less than 4.6% and 4.5%, respectively, and the accuracy (RE %) was below 7.9%. The recoveries and matrix effect of VPA at concentrations of 2, 50, and 160 mcg/mL met the requirement for the analysis of biological samples. 
No obvious degradation of VPA was observed under various storage conditions including room temperature for 12 hours, 3 freeze-thaw cycles, and -20°C for 3 months. Regression analysis showed that the correlation coefficients for the UPLC-MS versus FPIA, CMIA, and HPLC were 0.989, 0.988, and 0.987, respectively. The results of agreement tests between UPLC-MS and other methods showed that the mean difference of UPLC-MS and FPIA was -1.4 mcg/mL with a 95% confidence interval of -7.7 to 4.9 mcg/mL; the values for UPLC-MS and CMIA were -0.8 mcg/mL and -7.5 to 5.8 mcg/mL, and for UPLC-MS and HPLC were 1.1 mcg/mL and -5.7 to 7.9 mcg/mL. The rapid UPLC-MS method we developed showed a good analytical performance required for therapeutic drug monitoring, leading to potential improvements in patient care and laboratory management. Compared with the FPIA, CMIA, and HPLC methods, the UPLC-MS method correlated well and displayed comparable VPA concentrations.

  7. The Astronomy Workshop: Computer Assisted Learning Tools with Instructor Support Materials and Student Activities

    NASA Astrophysics Data System (ADS)

    Deming, Grace; Hamilton, D.; Hayes-Gehrke, M.

    2006-12-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is a collection of interactive World Wide Web tools that were developed under the direction of Doug Hamilton for use in undergraduate classes, as supplementary materials appropriate for grades 9-12, and by the general public. The philosophy of the website is to foster student and public interest in astronomy by capitalizing on their fascination with computers and the internet. Many of the tools were developed by graduate and undergraduate students at UMD. This website contains over 20 tools on topics including scientific notation, giant impacts, extrasolar planets, astronomical distances, planets, moons, comets, and asteroids. Educators around the country at universities, colleges, and secondary schools have used the Astronomy Workshop’s tools and activities as homework assignments, in-class demos, or extra credit. Since 2005, Grace Deming has assessed several of the Astronomy Workshop’s tools for clarity and effectiveness by interviewing students as they used tools on the website. Based on these interviews, Deming wrote student activities and instructor support materials and posted them to the website. Over the next three years, we will continue to interview students, develop web materials, and field-test activities. We are targeting classes in introductory undergraduate astronomy courses and grades 11-12 for our Spring 2007 field tests. We are interested in hearing your ideas on how we can make the Astronomy Workshop more appealing to educators, museum directors, specialty programs, and professors. This research is funded by NASA EPO grants NNG04GM18G and NNG06GGF99G.

  8. Isolation-by-distance in landscapes: considerations for landscape genetics

    PubMed Central

    van Strien, M J; Holderegger, R; Van Heck, H J

    2015-01-01

    In landscape genetics, isolation-by-distance (IBD) is regarded as a baseline pattern that is obtained without additional effects of landscape elements on gene flow. However, the configuration of suitable habitat patches determines deme topology, which in turn should affect rates of gene flow. IBD patterns can be characterized either by monotonically increasing pairwise genetic differentiation (for example, FST) with increasing interdeme geographic distance (case-I pattern) or by monotonically increasing pairwise genetic differentiation up to a certain geographical distance beyond which no correlation is detectable anymore (case-IV pattern). We investigated if landscape configuration influenced the rate at which a case-IV pattern changed to a case-I pattern. We also determined at what interdeme distance the highest correlation was measured between genetic differentiation and geographic distance and whether this distance corresponded to the maximum migration distance. We set up a population genetic simulation study and assessed the development of IBD patterns for several habitat configurations and maximum migration distances. We show that the rate and likelihood of the transition of case-IV to case-I FST–distance relationships was strongly influenced by habitat configuration and maximum migration distance. We also found that the maximum correlation between genetic differentiation and geographic distance was not related to the maximum migration distance and was measured across all deme pairs in a case-I pattern and, for a case-IV pattern, at the distance where the FST–distance curve flattens out. We argue that in landscape genetics, separate analyses should be performed to either assess IBD or the landscape effects on gene flow. PMID:25052412

  9. Clines in quantitative traits: The role of migration patterns and selection scenarios

    PubMed Central

    Geroldinger, Ludwig; Bürger, Reinhard

    2015-01-01

    The existence, uniqueness, and shape of clines in a quantitative trait under selection toward a spatially varying optimum is studied. The focus is on deterministic diploid two-locus n-deme models subject to various migration patterns and selection scenarios. Migration patterns may exhibit isolation by distance, as in the stepping-stone model, or random dispersal, as in the island model. The phenotypic optimum may change abruptly in a single environmental step, more gradually, or not at all. Symmetry assumptions are imposed on phenotypic optima and migration rates. We study clines in the mean, variance, and linkage disequilibrium (LD). Clines result from polymorphic equilibria. The possible equilibrium configurations are determined as functions of the migration rate. Whereas for weak migration, many polymorphic equilibria may be simultaneously stable, their number decreases with increasing migration rate. Also, for intermediate migration rates, polymorphic equilibria are in general not unique; however, for loci of equal effects the corresponding clines in the mean, variance, and LD are unique. For sufficiently strong migration, no polymorphism is maintained. Both migration pattern and selection scenario exert strong influence on the existence and shape of clines. The results for discrete demes are compared with those from models in which space varies continuously and dispersal is modeled by diffusion. Comparisons with previous studies, which investigated clines under neutrality or under linkage equilibrium, are performed. If there is no long-distance migration, the environment does not change abruptly, and linkage is not very tight, populations are almost everywhere close to linkage equilibrium. PMID:25446959

  10. Collaborative problem solving with a total quality model.

    PubMed

    Volden, C M; Monnig, R

    1993-01-01

    A collaborative problem-solving system committed to the interests of those involved complies with the teachings of the total quality management movement in health care. Deming espoused that any quality system must become an integral part of routine activities. A process that is used consistently in dealing with problems, issues, or conflicts provides a mechanism for accomplishing total quality improvement. The collaborative problem-solving process described here results in quality decision-making. This model incorporates Ishikawa's cause-and-effect (fishbone) diagram, Moore's key causes of conflict, and the steps of the University of North Dakota Conflict Resolution Center's collaborative problem solving model.

  11. Total quality management: care dealers vs. car dealers.

    PubMed

    Rubin, I M

    1992-01-01

    Let's turn our "flawed system into the Toyota City of world health care," proposes Fortune magazine. I shudder at the thought. Deming-Juran-type TQM procedures can help to ensure that cars and their drivers do not die on the road. Skillfully adapted for health care, these same procedures can help keep patients from dying on the operating table. These procedures can also respond to Fortune's indictment that the "U.S. medical system is as wasteful and managerially backward as Detroit before Henry Ford." However, people are not cars, and care dealers are not car dealers.

  12. Global epidemiology and public health in the 21st century. Applications of new technology.

    PubMed

    Laporte, R E; Barinas, E; Chang, Y F; Libman, I

    1996-03-01

    Epidemiology and public health need to change for the upcoming problems of the 21st century and beyond. We outline a four-point approach to produce this change. The first one is to take a systems approach to disease. The second approach discussed is the use of new techniques to "count" disease using capture-recapture. The third represents the application of telecommunications, especially the Internet, to public health. The fourth and final component represents the application, at the local health department level, of a total quality approach, as espoused by Deming, for the prevention of disease.

  13. Total quality management in American industry.

    PubMed

    Widtfeldt, A K; Widtfeldt, J R

    1992-07-01

    The definition of total quality management is conformance to customer requirements and specifications, fitness for use, buyer satisfaction, and value at an affordable price. The three individuals who have developed the total quality management concepts in the United States are W.E. Deming, J.M. Juran, and Philip Crosby. The universal principles of total quality management are (a) a customer focus, (b) management commitment, (c) training, (d) process capability and control, and (e) measurement through quality improvement tools. Results from the National Demonstration Project on Quality Improvement in Health Care showed the principles of total quality management could be applied to healthcare.

  14. H2 Emission Nebulosity Associated with KH 15D

    NASA Astrophysics Data System (ADS)

    Tokunaga, A. T.; Dahm, S.; Gässler, W.; Hayano, Yutaka; Hayashi, Masahiko; Iye, Masanori; Kanzawa, Tomio; Kobayashi, Naoto; Kamata, Yukiko; Minowa, Yosuke; Nedachi, Ko; Oya, Shin; Pyo, Tae-Soo; Saint-Jacques, D.; Terada, Hiroshi; Takami, Hideki; Takato, Naruhisa

    2004-01-01

    An H2 emission filament is found in close proximity to the unique object KH 15D using the adaptive optics system of the Subaru Telescope. The morphology of the filament, the presence of spectroscopic outflow signatures observed by Hamilton et al., and the detection of extended H2 emission from KH 15D by Deming, Charbonneau, & Harrington suggest that this filament arises from shocked H2 in an outflow. The filament extends about 15" to the north of KH 15D. Based on data collected at Subaru Telescope, which is operated by the National Astronomical Observatory of Japan.

  15. Comparative genetic structure of two mangrove species in Caribbean and Pacific estuaries of Panama

    PubMed Central

    2012-01-01

    Background Mangroves are ecologically important and highly threatened forest communities. Observational and genetic evidence has confirmed the long distance dispersal capacity of water-dispersed mangrove seeds, but less is known about the relative importance of pollen vs. seed gene flow in connecting populations. We analyzed 980 Avicennia germinans for 11 microsatellite loci and 940 Rhizophora mangle for six microsatellite loci and subsampled two non-coding cpDNA regions in order to understand population structure, and gene flow within and among four major estuaries on the Caribbean and Pacific coasts of Panama. Results Both species showed similar rates of outcrossing (t= 0.7 in A. germinans and 0.8 in R. mangle) and strong patterns of spatial genetic structure within estuaries, although A. germinans had greater genetic structure in nuclear and cpDNA markers (7 demes > 4 demes and Sp= 0.02 > 0.002), and much greater cpDNA diversity (Hd= 0.8 > 0.2) than R. mangle. The Central American Isthmus serves as an exceptionally strong barrier to gene flow, with high levels of nuclear (FST= 0.3-0.5) and plastid (FST= 0.5-0.8) genetic differentiation observed within each species between coasts and no shared cpDNA haplotypes between species on each coast. Finally, evidence of low ratios of pollen to seed dispersal (r = −0.6 in A. germinans and 7.7 in R. mangle), coupled with the strong observed structure in nuclear and plastid DNA among most estuaries, suggests low levels of gene flow in these mangrove species. Conclusions We conclude that gene dispersal in mangroves is usually limited within estuaries and that coastal geomorphology and rare long distance dispersal events could also influence levels of structure. PMID:23078287

  16. Planning and assessing a cross-training initiative with multi-skilled employees.

    PubMed

    Wermers, M A; Dagnillo, R; Glenn, R; Macfarlane, R; St Clair, V; Scott, D

    1996-06-01

    An improvement initiative begun by nurses at Parkview Episcopal Medical Center (Pueblo, Colo) to develop patient-focused care delivered by multiskilled workers followed a quality improvement methodology. Implementation of the new care delivery system on a model unit--2 South--provided the opportunity to plan, analyze data, and make changes as appropriate. Parkview's adoption of the teachings of W. Edwards Deming has helped leaders and staff realize the integral role of training in improvement activities. In his 14 points, Deming emphasizes the importance of employee education and of the employee having a clear understanding of his or her job. The time and money put into up-front education should help ensure the long-term success of this initiative. DEFINING THE CAREPARTNER: Three new multi-skilled positions were developed on 2 South--a Personal CarePartner, a Business CarePartner, and a Clinical CarePartner. By cross-training each of these roles to perform duties formerly done by centralized departments, 2 South was able to cut costs and time while ensuring quality care. TRAINING THE CAREPARTNER: An internally developed training program provided the new CarePartners with up-front education to prepare them to deliver patient-centered care. 2 South has experienced drops in patient falls and medication errors--areas that are often negatively affected when multi-skilled programs are instituted. Patient and physician surveys have shown increased satisfaction with care provided on the unit. The increased efficiency of the model unit has produced these outcomes while cutting costs substantially. The interdisciplinary team coordinating the improvement project learned many lessons in the process, including the importance of communication, education, and a sense of humor.

  17. Evaluation of an in-practice wet-chemistry analyzer using canine and feline serum samples.

    PubMed

    Irvine, Katherine L; Burt, Kay; Papasouliotis, Kostas

    2016-01-01

    A wet-chemistry biochemical analyzer was assessed for in-practice veterinary use. Its small size may mean a cost-effective method for low-throughput in-house biochemical analyses for first-opinion practice. The objectives of our study were to determine imprecision, total observed error, and acceptability of the analyzer for measurement of common canine and feline serum analytes, and to compare clinical sample results to those from a commercial reference analyzer. Imprecision was determined by within- and between-run repeatability for canine and feline pooled samples, and manufacturer-supplied quality control material (QCM). Total observed error (TEobs) was determined for pooled samples and QCM. Performance was assessed for canine and feline pooled samples by sigma metric determination. Agreement and errors between the in-practice and reference analyzers were determined for canine and feline clinical samples by Bland-Altman and Deming regression analyses. Within- and between-run precision was high for most analytes, and TEobs(%) was mostly lower than total allowable error. Performance based on sigma metrics was good (σ > 4) for many analytes and marginal (σ > 3) for most of the remainder. Correlation between the analyzers was very high for most canine analytes and high for most feline analytes. Between-analyzer bias was generally attributed to high constant error. The in-practice analyzer showed good overall performance, with only calcium and phosphate analyses identified as significantly problematic. Agreement for most analytes was insufficient for transposition of reference intervals, and we recommend that in-practice-specific reference intervals be established in the laboratory. © 2015 The Author(s).

  18. Commutability of control materials for external quality assessment of serum apolipoprotein A-I measurement.

    PubMed

    Zeng, Jie; Qi, Tianqi; Wang, Shu; Zhang, Tianjiao; Zhou, Weiyan; Zhao, Haijian; Ma, Rong; Zhang, Jiangtao; Yan, Ying; Dong, Jun; Zhang, Chuanbao; Chen, Wenxiang

    2018-04-25

    The aim of the current study was to evaluate the commutability of commercial control materials and human serum pools and to investigate the suitability of the materials for the external quality assessment (EQA) of serum apolipoprotein A-I (apo A-I) measurement. The Clinical and Laboratory Standards Institute (CLSI) EP14-A3 protocol was used for the commutability study. Apo A-I concentrations in two levels of commercial control materials used in EQA program, two fresh-frozen human serum pools (FSPs) and two frozen human serum pools prepared from residual clinical specimens (RSPs) were measured along with 50 individual samples using nine commercial assays. Measurement results of the 50 individual samples obtained with different assays were pairwise analyzed by Deming regression, and 95% prediction intervals (PIs) were calculated. The commutability of the processed materials was evaluated by comparing the measurement results of the materials with the limits of the PIs. The FSP-1 was commutable for all the 36 assay pairs, and FSP-2 was commutable for 30 pairs; RSP-1 and RSP-2 showed commutability for 27/36 and 22/36 assay pairs, respectively, whereas the two EQA materials were commutable only for 4/36 and 5/36 assay pairs, respectively. Non-commutability of the tested EQA materials has been observed among current apo A-I assays. EQA programs need either to take into account the commutability-related biases in the interpretation of the EQA results or to use more commutable materials. Frozen human serum pools were commutable for most of the assays.

  19. The new Greiner FC-Mix tubes equal the old Terumo ones and are useful as glucose stabilizer after prolonged storage of samples.

    PubMed

    Bonetti, Graziella; Carta, Mariarosa

    2017-10-15

    The aim of our study was to compare new Greiner tubes containing granulated citrate buffer with the Terumo ones and to verify whether they are suitable for glucose stabilization after prolonged storage. In Study 1, blood was collected in two Terumo and two Greiner tubes from 40 healthy volunteers. Samples were stored at room temperature (RT) for 1 and 2 hours, respectively. Comparison was made by Deming regression. In Study 2, glucose was measured in a reference tube (N = 50), according to the ADA-NACB guidelines, and in aliquots of Greiner samples maintained un-centrifuged at RT for 1, 2, 4 (N = 50) and 24, 48, 72 hours (N = 35). Biases between the Terumo and Greiner tubes were mixed in direction and not statistically significant. Compared to reference (5.3 mmol/L), glucose concentration in the new tubes was 5.4 (P < 0.05), 5.4 (P < 0.05), 5.3 (P = 0.265), 5.2 (P = 0.156), 5.3 (P < 0.05) and 5.2 (P < 0.05) mmol/L after 1, 2, 4, 24, 48 and 72 hours at RT, respectively. There was no biological difference between any of the time points up to 48 h (bias < ± 1.95%). The study shows that the new tubes perform equally well as the Terumo ones, ensure glucose stabilization up to 48 h, and permit a link to be made between previous studies demonstrating the clinical utility of granulated citrate buffer and future ones.

  20. A framework for the continual improvement of behavioral healthcare. Part II--Policy for leadership.

    PubMed

    Redelheim, P S; Pomeroy, L H; Batalden, P

    1994-01-01

    In the first part of this article, published in the November/December 1993 issue of Behavioral Healthcare Tomorrow, the authors presented a framework for understanding the process of continuous quality improvement in the behavioral healthcare setting. Four elements of continual improvement were identified: underlying knowledge, policy for leadership, tools and methods, and daily work applications. They showed how traditional professional knowledge of one's subject, discipline and values must be augmented by improvement knowledge--which quality improvement guru W. Edwards Deming calls "the system of profound knowledge." In Part II, they focus on the second element of continual improvement, the importance of organizational leadership.

  1. Quality improvement in hospitals: how much does it reduce healthcare costs?

    PubMed

    Jones, S B

    1995-01-01

    The philosophy of W.E. Deming suggests that continuous quality improvement efforts, when properly applied, ultimately will lead to financial dividends and will help ensure business longevity. Reducing hospital charges can be exciting for the participants and can provide an impetus for expanding quality improvement efforts. Americans, however, tend to demand almost instant gratification and have limited patience for longer-term results. This factor, coupled with minimal knowledge of actual operational costs and inaccurate charge accounting systems, may lead hospital managers to misinterpret the potential net long-term effects of their quality improvement efforts. In the approaching environment of capitated reimbursement, such mistakes may have serious consequences.

  2. The Matrilineal Ancestry of Ashkenazi Jewry: Portrait of a Recent Founder Event

    PubMed Central

    Behar, Doron M.; Metspalu, Ene; Kivisild, Toomas; Achilli, Alessandro; Hadid, Yarin; Tzur, Shay; Pereira, Luisa; Amorim, Antonio; Quintana-Murci, Lluís; Majamaa, Kari; Herrnstadt, Corinna; Howell, Neil; Balanovsky, Oleg; Kutuev, Ildus; Pshenichnov, Andrey; Gurwitz, David; Bonne-Tamir, Batsheva; Torroni, Antonio; Villems, Richard; Skorecki, Karl

    2006-01-01

    Both the extent and location of the maternal ancestral deme from which the Ashkenazi Jewry arose remain obscure. Here, using complete sequences of the maternally inherited mitochondrial DNA (mtDNA), we show that close to one-half of Ashkenazi Jews, estimated at 8,000,000 people, can be traced back to only 4 women carrying distinct mtDNAs that are virtually absent in other populations, with the important exception of low frequencies among non-Ashkenazi Jews. We conclude that four founding mtDNAs, likely of Near Eastern ancestry, underwent major expansion(s) in Europe within the past millennium. PMID:16404693

  3. Resistive Plate Chambers with Gd-coated electrodes as thermal neutron detectors

    NASA Astrophysics Data System (ADS)

    Abbrescia, M.; Iaselli, G.; Mongelli, T.; Paticchio, V.; Ranieri, A.; Trentadue, R.

    2003-12-01

    Resistive Plate Chambers (RPCs) are widespread, cheap, easy-to-build, large-size detectors used mainly to detect ionizing particles in high-energy experiments. Here a technique is reported that consists of coating the inner surface of the bakelite electrodes with a mixture of linseed oil and Gd2O3; this makes RPCs sensitive to thermal neutrons as well, making them suitable for industrial, medical or demining applications. This new type of position-sensitive gas detector can be operated at atmospheric pressure, is lightweight, has low γ-ray sensitivity, and is easy to handle even when large areas are to be covered.

  4. Principal component regression analysis with SPSS.

    PubMed

    Liu, R X; Kuang, J; Gong, Q; Hou, X L

    2003-06-01

    The paper introduces the indices used in multicollinearity diagnosis, the basic principle of principal component regression, and the method for determining the 'best' equation. An example describes how to perform principal component regression analysis with SPSS 10.0, including all calculation steps of the principal component regression and all operations of the linear regression, factor analysis, descriptives, compute variable and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance of multicollinearity, yielding a simpler, faster and still accurate statistical analysis.
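
    The SPSS procedure above can also be sketched outside SPSS. The following is a minimal illustration of principal component regression, not taken from the paper: predictors are standardized, the centered response is regressed on the first k principal components, and the coefficients are mapped back to the original variables (the function name `pcr_fit` and the SVD-based implementation are choices of this sketch):

```python
import numpy as np

def pcr_fit(X, y, k):
    """Principal component regression: regress centered y on the first k
    principal components of the standardized predictors, then map the
    coefficients back to the original variables."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    mu, sd = X.mean(axis=0), X.std(axis=0, ddof=1)
    Z = (X - mu) / sd                          # standardize predictors
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    T = Z @ Vt[:k].T                           # scores on the first k components
    g, *_ = np.linalg.lstsq(T, y - y.mean(), rcond=None)
    beta = (Vt[:k].T @ g) / sd                 # back-transform to the raw scale
    intercept = y.mean() - mu @ beta
    return intercept, beta
```

    With k equal to the number of predictors the fit reduces to ordinary least squares; choosing a smaller k discards the low-variance directions that multicollinearity inflates.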

  5. Performance evaluation of the Roche Tina-quant Cystatin C assay and reference interval for cystatin C in healthy blood donors.

    PubMed

    Erlandsen, Erland J; Randers, Else

    2010-07-01

    To evaluate the performance of the Roche Diagnostics Tina-quant Cystatin C particle-enhanced immunoturbidimetric assay for the measurement of plasma and serum cystatin C, and to establish reference intervals for cystatin C in healthy blood donors. The cystatin C measurements were performed on the Roche Modular Analytics P automated clinical chemistry analyzer. The cystatin C assay was linear in the measuring range 0.40-7.00 mg/L. Within-run CVs were ≤2.0%, between-run CVs ≤4.2%, and total CVs ≤5.5% in plasma pools and in commercial cystatin C control materials (range 1.0-4.7 mg/L). Recovery was 99.4-109.3%. No interference was detected from haemoglobin < 0.9 mmol/L, bilirubin < 330 micromol/L and Intralipid < 20 g/L. Measurement of cystatin C in Li-heparin plasma did not differ significantly from cystatin C measured in serum. Forty patient samples run on the Modular Analytics P (y) were compared to the Siemens Cystatin C assay on the BN II (x): y = 0.817x + 0.270, Sy.x = 0.168 (Deming regression). The non-parametric reference interval for cystatin C was calculated to be 0.41-0.91 mg/L in females (n = 86), and 0.43-0.94 mg/L in males (n = 76). The Mann-Whitney U test showed a significant difference between the two genders (p = 0.015), but the difference was without clinical relevance. A common reference interval for both genders (n = 162) was calculated to be 0.41-0.92 mg/L. The performance of the Tina-quant Cystatin C assay was acceptable for clinical use.

  6. Recalibration of blood analytes over 25 years in the Atherosclerosis Risk in Communities Study: The impact of recalibration on chronic kidney disease prevalence and incidence

    PubMed Central

    Parrinello, Christina M.; Grams, Morgan E.; Couper, David; Ballantyne, Christie M.; Hoogeveen, Ron C.; Eckfeldt, John H.; Selvin, Elizabeth; Coresh, Josef

    2016-01-01

    Background Equivalence of laboratory tests over time is important for longitudinal studies. Even a small systematic difference (bias) can result in substantial misclassification. Methods We selected 200 Atherosclerosis Risk in Communities Study participants attending all 5 study visits over 25 years. Eight analytes were re-measured in 2011–13 from stored blood samples from multiple visits: creatinine, uric acid, glucose, total cholesterol, HDL-cholesterol, LDL-cholesterol, triglycerides, and high-sensitivity C-reactive protein. Original values were recalibrated to re-measured values using Deming regression. Differences >10% were considered to reflect substantial bias, and correction equations were applied to affected analytes in the total study population. We examined trends in chronic kidney disease (CKD) pre- and post-recalibration. Results Repeat measures were highly correlated with original values (Pearson’s r>0.85 after removing outliers [median 4.5% of paired measurements]), but 2 of 8 analytes (creatinine and uric acid) had differences >10%. Original values of creatinine and uric acid were recalibrated to current values using correction equations. CKD prevalence differed substantially after recalibration of creatinine (visits 1, 2, 4 and 5 pre-recalibration: 21.7%, 36.1%, 3.5%, 29.4%; post-recalibration: 1.3%, 2.2%, 6.4%, 29.4%). For HDL-cholesterol, the current direct enzymatic method differed substantially from magnesium dextran precipitation used during visits 1–4. Conclusions Analytes re-measured in samples stored for ~25 years were highly correlated with original values, but two of the 8 analytes showed substantial bias at multiple visits. Laboratory recalibration improved reproducibility of test results across visits and resulted in substantial differences in CKD prevalence. We demonstrate the importance of consistent recalibration of laboratory assays in a cohort study. PMID:25952043
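
    The recalibration step above relies on the Deming errors-in-variables fit, which has a closed-form solution. As a hedged sketch (not the authors' code; the function name `deming_fit` and the error-variance ratio `delta` are assumptions of this sketch, with `delta=1` giving orthogonal regression):

```python
import numpy as np

def deming_fit(x, y, delta=1.0):
    """Deming regression of y on x.

    delta is the (assumed known) ratio of the error variance of y to the
    error variance of x; delta=1 reduces to orthogonal regression.
    """
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    # Closed-form maximum-likelihood slope for the errors-in-variables model.
    slope = ((syy - delta * sxx)
             + np.sqrt((syy - delta * sxx) ** 2 + 4.0 * delta * sxy ** 2)) / (2.0 * sxy)
    intercept = y.mean() - slope * x.mean()
    return slope, intercept
```

    Recalibrated values are then `intercept + slope * original`; a systematic difference above the study's 10% threshold would prompt applying such a correction equation to the affected analyte.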

  7. Iterative outlier removal: A method for identifying outliers in laboratory recalibration studies

    PubMed Central

    Parrinello, Christina M.; Grams, Morgan E.; Sang, Yingying; Couper, David; Wruck, Lisa M.; Li, Danni; Eckfeldt, John H.; Selvin, Elizabeth; Coresh, Josef

    2016-01-01

    Background Extreme values that arise for any reason, including through non-laboratory measurement procedure-related processes (inadequate mixing, evaporation, mislabeling), lead to outliers and inflate errors in recalibration studies. We present an approach termed iterative outlier removal (IOR) for identifying such outliers. Methods We previously identified substantial laboratory drift in uric acid measurements in the Atherosclerosis Risk in Communities (ARIC) Study over time. Serum uric acid was originally measured in 1990–92 on a Coulter DACOS instrument using an uricase-based measurement procedure. To recalibrate previous measured concentrations to a newer enzymatic colorimetric measurement procedure, uric acid was re-measured in 200 participants from stored plasma in 2011–13 on a Beckman Olympus 480 autoanalyzer. To conduct IOR, we excluded data points >3 standard deviations (SDs) from the mean difference. We continued this process using the resulting data until no outliers remained. Results IOR detected more outliers and yielded greater precision in simulation. The original mean difference (SD) in uric acid was 1.25 (0.62) mg/dL. After four iterations, 9 outliers were excluded, and the mean difference (SD) was 1.23 (0.45) mg/dL. Conducting only one round of outlier removal (standard approach) would have excluded 4 outliers (mean difference [SD] = 1.22 [0.51] mg/dL). Applying the recalibration (derived from Deming regression) from each approach to the original measurements, the prevalence of hyperuricemia (>7 mg/dL) was 28.5% before IOR and 8.5% after IOR. Conclusion IOR is a useful method for removal of extreme outliers irrelevant to recalibrating laboratory measurements, and identifies more extraneous outliers than the standard approach. PMID:27197675
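
    The IOR procedure described above (drop points more than 3 SDs from the mean difference, recompute on the remainder, and repeat until no outliers remain) can be sketched as follows; this is an illustrative reimplementation, not the authors' code:

```python
import numpy as np

def iterative_outlier_removal(diff, k=3.0):
    """Return a boolean mask of points to keep: repeatedly drop paired
    differences more than k standard deviations from the current mean
    until no further outliers remain."""
    diff = np.asarray(diff, float)
    keep = np.ones(diff.shape, dtype=bool)
    while True:
        kept = diff[keep]
        m, s = kept.mean(), kept.std(ddof=1)
        new_out = keep & (np.abs(diff - m) > k * s)
        if not new_out.any():
            return keep
        keep &= ~new_out
```

    A single pass of the same loop corresponds to the standard one-round approach the authors compare against; iterating matters because a large outlier inflates the SD and can shield smaller outliers in the first round.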

  8. Variation in clinical vitamin D status by DiaSorin Liaison and LC-MS/MS in the presence of elevated 25-OH vitamin D2.

    PubMed

    de Koning, Lawrence; Al-Turkmani, M Rabie; Berg, Anders H; Shkreta, Aida; Law, Terence; Kellogg, Mark D

    2013-01-16

    We compared total 25-OH vitamin D status measured by DiaSorin Liaison and tandem mass spectrometry (LC-MS/MS) among patients with high and low 25-OH vitamin D2. Total 25-OH vitamin D was measured in plasma containing high (>25 nmol/l or >50%, n=26) and low (<2.5 nmol/l, n=29) 25-OH vitamin D2 using DiaSorin Liaison and an LC-MS/MS method using NIST 972-verified calibrators. Samples were classified as vitamin D adequate (total 25-OH vitamin D ≥50 nmol/l), and inadequate or deficient (<50 nmol/l) by each method. Deming and multiple linear regression were used to compare methods. Samples were significantly more likely to be classified as inadequate or deficient by DiaSorin Liaison (36%) vs LC-MS/MS (9%). This increased in the presence of high 25-OH vitamin D2 (42% vs 0%). Total 25-OH vitamin D by DiaSorin Liaison was 26.0 nmol/l lower than LC-MS/MS, which increased to 34.1 nmol/l among samples with high 25-OH vitamin D2. This was attributed to lower recovery of 25-OH vitamin D2 (proportional bias=0.64 nmol/l) by DiaSorin Liaison, independent of D3 (proportional bias=0.86 nmol/l). Patients were more likely to be classified as vitamin D inadequate or deficient by DiaSorin Liaison compared to an LC-MS/MS method, which was in part due to the presence of 25-OH vitamin D2. Copyright © 2012 Elsevier B.V. All rights reserved.

  9. Prospective validation of an automated chemiluminescence-based assay of renin and aldosterone for the work-up of arterial hypertension.

    PubMed

    Rossi, Gian Paolo; Ceolotto, Giulio; Rossitto, Giacomo; Seccia, Teresa Maria; Maiolino, Giuseppe; Berton, Chiara; Basso, Daniela; Plebani, Mario

    2016-09-01

    The availability of simple and accurate assays of plasma active renin (DRC) and aldosterone concentration (PAC) can improve the detection of secondary forms of arterial hypertension. Thus, we investigated the performance of an automated chemiluminescent assay for DRC and PAC in referred hypertensive patients. We prospectively recruited 260 consecutive hypertensive patients referred to an ESH Center for Hypertension. After exclusion of six protocol violations, 254 patients were analyzed: 67.3% had primary hypertension, 17.3% an aldosterone producing adenoma (APA), 11.4% idiopathic hyperaldosteronism (IHA), 2.4% renovascular hypertension (RVH), 0.8% familial hyperaldosteronism type 1 (FH-1), 0.4% apparent mineralocorticoid excess (AME), 0.4% a renin-producing tumor, and 3.9% were adrenalectomized APA patients. Bland-Altman plots and Deming regression were used to analyze results. The diagnostic accuracy (area under the curve, AUC of the ROC) of the DRC-based aldosterone-renin ratio (ARRCL) was compared with that of the PRA-based ARR (ARRRIA) using as reference the conclusive diagnosis of APA. At Bland-Altman plot, the DRC and PAC assay showed no bias as compared to the PRA and PAC assay. A tight relation was found between the DRC and the PRA values (concordance correlation coefficient=0.92, p<0.0001) and the PAC values measured with radioimmunoassay and chemiluminescence (concordance correlation coefficient=0.93, p<0.001). For APA identification the AUC of the ARRCL was higher than that of the ARRRIA [0.974 (95% CI 0.940-0.991) vs. 0.894 (95% CI 0.841-0.933), p=0.02]. This rapid automated chemiluminescent DRC/PAC assay performed better than validated PRA/PAC radioimmunoassays for the identification of APA in referred hypertensive patients.

  10. Simultaneous Quantification of Apolipoprotein A-I and Apolipoprotein B by Liquid-Chromatography–Multiple-Reaction–Monitoring Mass Spectrometry

    PubMed Central

    Agger, Sean A.; Marney, Luke C.; Hoofnagle, Andrew N.

    2011-01-01

    BACKGROUND If liquid-chromatography–multiple-reaction–monitoring mass spectrometry (LC-MRM/MS) could be used in the large-scale preclinical verification of putative biomarkers, it would obviate the need for the development of expensive immunoassays. In addition, the translation of novel biomarkers to clinical use would be accelerated if the assays used in preclinical studies were the same as those used in the clinical laboratory. To validate this approach, we developed a multiplexed assay for the quantification of 2 clinically well-known biomarkers in human plasma, apolipoprotein A-I and apolipoprotein B (apoA-I and apoB). METHODS We used PeptideAtlas to identify candidate peptides. Human samples were denatured with urea or trifluoroethanol, reduced and alkylated, and digested with trypsin. We compared reversed-phase chromatographic separation of peptides with normal flow and microflow, and we normalized endogenous peptide peak areas to internal standard peptides. We evaluated different methods of calibration and compared the final method with a nephelometric immunoassay. RESULTS We developed a final method using trifluoroethanol denaturation, 21-h digestion, normal flow chromatography-electrospray ionization, and calibration with a single normal human plasma sample. For samples injected in duplicate, the method had intraassay CVs <6% and interassay CVs <12% for both proteins, and compared well with immunoassay (n = 47; Deming regression, LC-MRM/MS = 1.17 × immunoassay – 36.6; Sx|y = 10.3 for apoA-I and LC-MRM/MS = 1.21 × immunoassay + 7.0; Sx|y = 7.9 for apoB). CONCLUSIONS Multiplexed quantification of proteins in human plasma/serum by LC-MRM/MS is possible and compares well with clinically useful immunoassays. The potential application of single-point calibration to large clinical studies could simplify efforts to reduce day-to-day digestion variability. PMID:20923952

  11. A validation method for near-infrared spectroscopy based tissue oximeters for cerebral and somatic tissue oxygen saturation measurements.

    PubMed

    Benni, Paul B; MacLeod, David; Ikeda, Keita; Lin, Hung-Mo

    2018-04-01

    We describe the validation methodology for the NIRS-based FORE-SIGHT ELITE® (CAS Medical Systems, Inc., Branford, CT, USA) tissue oximeter for cerebral and somatic tissue oxygen saturation (StO2) measurements in adult subjects, submitted to the United States Food and Drug Administration (FDA) to obtain clearance for clinical use. This validation methodology evolved from a history of NIRS validations in the literature and the FDA-recommended use of Deming regression and bootstrapping statistical validation methods. For cerebral validation, forehead cerebral StO2 measurements were compared to a weighted 70:30 reference (REFCXB) of co-oximeter internal jugular venous and arterial blood saturation of healthy adult subjects during a controlled hypoxia sequence, with a sensor placed on the forehead. For somatic validation, somatic StO2 measurements were compared to a weighted 70:30 reference (REFCXS) of co-oximetry central venous and arterial saturation values following a similar protocol, with sensors placed on the flank, quadriceps muscle, and calf muscle. With informed consent, 25 subjects successfully completed the cerebral validation study. The bias and precision (1 SD) of cerebral StO2 compared to REFCXB was -0.14 ± 3.07%. With informed consent, 24 subjects successfully completed the somatic validation study. The bias and precision of somatic StO2 compared to REFCXS was 0.04 ± 4.22% from the average of flank, quadriceps, and calf StO2 measurements to best represent the global whole-body REFCXS. The NIRS validation methods presented potentially provide a reliable means to test NIRS monitors and qualify them for clinical use.
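
    The bias-and-precision summary used above (mean and SD of paired device-minus-reference differences) pairs naturally with bootstrapping for interval estimates. A minimal sketch, not from the submission (the function name, resampling count, and percentile CI are assumptions of this sketch):

```python
import numpy as np

def bias_precision_bootstrap(device, ref, n_boot=2000, seed=0):
    """Bias (mean paired difference), precision (1 SD of the differences),
    and a bootstrap percentile CI for the bias."""
    rng = np.random.default_rng(seed)
    d = np.asarray(device, float) - np.asarray(ref, float)
    bias, precision = d.mean(), d.std(ddof=1)
    # Resample the paired differences with replacement to get the
    # sampling distribution of the mean difference.
    boot_means = np.array([rng.choice(d, size=d.size, replace=True).mean()
                           for _ in range(n_boot)])
    ci = np.percentile(boot_means, [2.5, 97.5])
    return bias, precision, ci
```

    Resampling at the subject level (rather than per reading) would be the more conservative choice when each subject contributes multiple saturation measurements.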

  12. Geological analysis and evaluation of ERTS-A imagery for the state of New Mexico

    NASA Technical Reports Server (NTRS)

    Kottlowski, F. E. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. Coverage of approximately one-third of the test site had been received by January 31, 1973 and all of the images received were MSS products. Images recorded during the first two months of the ERTS-1 mission were of poor quality, owing largely to high ground reflectance. Later images were of better quality and MSS bands 5 and 7 have proven to be particularly useful. Features noted during visual inspection of 9 1/2 x 9 1/2 prints include major structural forms, vegetation patterns, drainage patterns, and outcrops of geologic formations having marked color contrasts. The Border Hills Structural Zone and the Y-O Structural Zone are prominently reflected in coverage of the Pecos Valley. A study of available maps and remote sensing material covering the Deming-Columbus area indicated that the limit of detection and the resolution of MSS products are not as good as those of aerial photographs, geologic maps, and manned satellite photographs. The limit of detection of high contrast features on MSS prints is approximately 1000 feet or 300 meters for linear features and about 18 acres for roughly circular areas.

  13. Four Easy Steps to Drastically Improve Your Phone-Based Customer Service.

    PubMed

    Peller, Spencer; Beimes, Zachary

    2015-01-01

    Japan is renowned for impeccable customer service (as anyone who's watched an apple get wrapped up like a crown jewel in a Tokyo grocery store will tell you). The Japanese concept of kaizen (constant improvement) is a fundamental reason for this, and for the enduring success of conglomerates such as Toyota, Honda, and Sony. From afar, you may think this trait is caused by something in the waters from Mt. Fuji, but many in the know credit the work of an American engineer named W. Edwards Deming as the catalyst for this movement. If his ideas could transform a nation, there's no question they can improve the patient satisfaction rates at your practice.

  14. A performance improvement plan to increase nurse adherence to use of medication safety software.

    PubMed

    Gavriloff, Carrie

    2012-08-01

    Nurses can protect patients receiving intravenous (IV) medication by using medication safety software to program "smart" pumps to administer IV medications. After a patient safety event identified inconsistent use of medication safety software by nurses, a performance improvement team implemented the Deming Cycle performance improvement methodology. The combined use of improved direct care nurse communication, programming strategies, staff education, medication safety champions, adherence monitoring, and technology acquisition resulted in a statistically significant (p < .001) increase in nurse adherence to using medication safety software from 28% to above 85%, exceeding national benchmark adherence rates (Cohen, Cooke, Husch & Woodley, 2007; Carefusion, 2011). Copyright © 2012 Elsevier Inc. All rights reserved.

  15. A conceptual persistent healthcare quality improvement process for software development management.

    PubMed

    Lin, Jen-Chiun; Su, Mei-Ju; Cheng, Po-Hsun; Weng, Yung-Chien; Chen, Sao-Jie; Lai, Jin-Shin; Lai, Feipei

    2007-01-01

    This paper illustrates a sustained conceptual service quality improvement process for the management of software development within a healthcare enterprise. Our proposed process is revised from Niland's healthcare quality information system (HQIS). This process includes functions to survey the satisfaction of system functions, describe the operation bylaws on-line, and provide on-demand training. To achieve these goals, we integrate five information systems in National Taiwan University Hospital, including healthcare information systems, health quality information system, requirement management system, executive information system, and digital learning system, to form a full Deming cycle. A preliminary user satisfaction survey showed that our outpatient information system scored an average of 71.31 in 2006.

  16. Regression Analysis by Example. 5th Edition

    ERIC Educational Resources Information Center

    Chatterjee, Samprit; Hadi, Ali S.

    2012-01-01

    Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. "Regression Analysis by Example, Fifth Edition" has been expanded and thoroughly…

  17. Incorporating Budget Impact Analysis in the Implementation of Complex Interventions: A Case of an Integrated Intervention for Multimorbid Patients within the CareWell Study.

    PubMed

    Soto-Gordoa, Myriam; Arrospide, Arantzazu; Merino Hernández, Marisa; Mora Amengual, Joana; Fullaondo Zabala, Ane; Larrañaga, Igor; de Manuel, Esteban; Mar, Javier

    2017-01-01

    To develop a framework for the management of complex health care interventions within the Deming continuous improvement cycle and to test the framework in the case of an integrated intervention for multimorbid patients in the Basque Country within the CareWell project. Statistical analysis alone, although necessary, may not always represent the practical significance of the intervention. Thus, to ascertain the true economic impact of the intervention, the statistical results can be integrated into the budget impact analysis. The intervention of the case study consisted of a comprehensive approach that integrated new provider roles and new technological infrastructure for multimorbid patients, with the aim of reducing patient decompensations by 10% over 5 years. The study period was 2012 to 2020. Given the aging of the general population, the conventional scenario predicts an increase of 21% in the health care budget for care of multimorbid patients during the study period. With a successful intervention, this figure should drop to 18%. The statistical analysis, however, showed no significant differences in costs either in primary care or in hospital care between 2012 and 2014. The real costs in 2014 were by far closer to those in the conventional scenario than to the reductions expected in the objective scenario. The present implementation should be reappraised, because the present expenditure did not move closer to the objective budget. This work demonstrates the capacity of budget impact analysis to enhance the implementation of complex interventions. Its integration in the context of the continuous improvement cycle is transferable to other contexts in which implementation depth and time are important. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  18. The use of segmented regression in analysing interrupted time series studies: an example in pre-hospital ambulance care.

    PubMed

    Taljaard, Monica; McKenzie, Joanne E; Ramsay, Craig R; Grimshaw, Jeremy M

    2014-06-19

    An interrupted time series design is a powerful quasi-experimental approach for evaluating effects of interventions introduced at a specific point in time. To utilize the strength of this design, a modification to standard regression analysis, such as segmented regression, is required. In segmented regression analysis, the change in intercept and/or slope from pre- to post-intervention is estimated and used to test causal hypotheses about the intervention. We illustrate segmented regression using data from a previously published study that evaluated the effectiveness of a collaborative intervention to improve quality in pre-hospital ambulance care for acute myocardial infarction (AMI) and stroke. In the original analysis, a standard regression model was used with time as a continuous variable. We contrast the results from this standard regression analysis with those from segmented regression analysis. We discuss the limitations of the former and advantages of the latter, as well as the challenges of using segmented regression in analysing complex quality improvement interventions. Based on the estimated change in intercept and slope from pre- to post-intervention using segmented regression, we found insufficient evidence of a statistically significant effect on quality of care for stroke, although potential clinically important effects for AMI cannot be ruled out. Segmented regression analysis is the recommended approach for analysing data from an interrupted time series study. Several modifications to the basic segmented regression analysis approach are available to deal with challenges arising in the evaluation of complex quality improvement interventions.
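
    The segmented model adds a level-change and a slope-change term at the interruption point. A minimal sketch of such a fit (illustrative only; the change-point convention `t >= t0` and the function name are assumptions of this sketch):

```python
import numpy as np

def segmented_fit(t, y, t0):
    """Fit y = b0 + b1*t + b2*step + b3*(t - t0)*step, where
    step = 1 for t >= t0 (post-intervention) and 0 otherwise.

    b2 is the change in level and b3 the change in slope at the
    interruption t0."""
    t = np.asarray(t, float)
    y = np.asarray(y, float)
    post = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, post, (t - t0) * post])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [baseline level, baseline slope, level change, slope change]
```

    Testing whether the level-change and slope-change coefficients differ from zero is what distinguishes segmented regression from the standard model that treats time as a single continuous term; a full analysis would also address autocorrelation of the residuals.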

  19. Standardized Regression Coefficients as Indices of Effect Sizes in Meta-Analysis

    ERIC Educational Resources Information Center

    Kim, Rae Seon

    2011-01-01

    When conducting a meta-analysis, it is common to find many collected studies that report regression analyses, because multiple regression analysis is widely used in many fields. Meta-analysis uses effect sizes drawn from individual studies as a means of synthesizing a collection of results. However, indices of effect size from regression analyses…

  20. Evaluation of the implementation of an integrated program for musculoskeletal system care.

    PubMed

    Larrañaga, Igor; Soto-Gordoa, Myriam; Arrospide, Arantzazu; Jauregi, María Luz; Millas, Jesús; San Vicente, Ricardo; Aguirrebeña, Jabier; Mar, Javier

    The chronic nature of musculoskeletal diseases requires integrated care involving Primary Care and the specialities of Rheumatology, Traumatology and Rehabilitation. The aim of this study was to assess the implementation of an integrated organizational model for osteoporosis, low back pain, shoulder disease and knee disease using Deming's continuous improvement process and considering referrals and resource consumption. A simulation model was used in the planning stage to predict the evolution of resource consumption for musculoskeletal diseases and to carry out a Budget Impact Analysis from 2012 to 2020 in the Goierri-Alto Urola region. In the checking stage, the status of the process in 2014 was evaluated using statistical analysis to check the degree of achievement of the objectives for each speciality. Simulation models showed that the population with musculoskeletal disease in Goierri-Alto Urola will increase by 4.4% by 2020. Because of that, the expenses of a conventional healthcare system will have increased by 5.9%. However, if the intervention reaches its objectives, the budget would decrease by 8.5%. The statistical analysis evidenced a decline in referrals to the Traumatology service and a reduction of successive consultations in all specialities. The implementation of the integrated organizational model in osteoporosis, low back pain, shoulder disease and knee disease is still at an early stage. However, the empowerment of Primary Care improved patient referrals and reduced costs. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Reumatología y Colegio Mexicano de Reumatología. All rights reserved.

  1. CA II K-line metallicity indicator for field RR Lyrae stars

    NASA Astrophysics Data System (ADS)

    Clementini, Gisella; Tosi, Monica; Merighi, Roberto

    In order to check and possibly improve Preston's Delta S calibration scale, CCD spectra have been obtained for 25 field RR Lyrae variables. Eleven of the program stars have values of (Fe/H) derived by Butler and Deming (1979) from the strength of the Fe II lines. For these stars we find that the equivalent width of the Ca II K line is extremely well correlated with the (Fe/H) values, the best-fit relation being (Fe/H) = 0.43W(K) - 2.75, where W(K) is the equivalent width of the K line. We conclude that the use of the K-line equivalent width is at present the best method to derive the (Fe/H) abundance of RR Lyrae stars.
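
    The calibration above is a single linear relation, so applying it is a one-liner. A sketch only: the sample value of W(K) below is invented, and the units of W(K) are those used by the authors.

```python
# The paper's best-fit relation: (Fe/H) = 0.43*W(K) - 2.75,
# with W(K) the equivalent width of the Ca II K line.

def feh_from_k_line(w_k):
    """Metallicity estimate from the Ca II K-line equivalent width."""
    return 0.43 * w_k - 2.75

# A hypothetical star with W(K) = 4.0 would be assigned
# (Fe/H) = 0.43*4.0 - 2.75 = -1.03.
example = feh_from_k_line(4.0)
```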

  2. The quest for quality and productivity in health services.

    PubMed

    Sahney, V K; Warden, G L

    1991-01-01

    The leaders of health care organizations across the country are facing significant pressures to improve the quality of their services while reducing the rate of cost increases within the industry. Total Quality Management (TQM) has been credited, by many leaders in the manufacturing industry, as an effective tool to manage their organizations. This article presents key concepts of TQM as discussed by quality experts, namely, Deming, Juran, and Crosby. It discusses 12 key concepts that have formed the foundation of TQM implementation at Henry Ford Health System. The process of implementation is presented in detail, and the role of TQM in clinical applications is discussed. Success factors and visible actions by senior management designed to reinforce the implementation of TQM in any organization are presented.

  3. Using Dominance Analysis to Determine Predictor Importance in Logistic Regression

    ERIC Educational Resources Information Center

    Azen, Razia; Traxel, Nicole

    2009-01-01

    This article proposes an extension of dominance analysis that allows researchers to determine the relative importance of predictors in logistic regression models. Criteria for choosing logistic regression R[superscript 2] analogues were determined and measures were selected that can be used to perform dominance analysis in logistic regression. A…

  4. Multicenter Comparison of Roche COBAS AMPLICOR MONITOR Version 1.5, Organon Teknika NucliSens QT with Extractor, and Bayer Quantiplex Version 3.0 for Quantification of Human Immunodeficiency Virus Type 1 RNA in Plasma

    PubMed Central

    Murphy, Donald G.; Côté, Louise; Fauvel, Micheline; René, Pierre; Vincelette, Jean

    2000-01-01

    The performance and characteristics of Roche COBAS AMPLICOR HIV-1 MONITOR version 1.5 (CA MONITOR 1.5) UltraSensitive (usCA MONITOR 1.5) and Standard (stCA MONITOR 1.5) procedures, Organon Teknika NucliSens HIV-1 RNA QT with Extractor (NucliSens), and Bayer Quantiplex HIV RNA version 3.0 (bDNA 3.0) were compared in a multicenter trial. Samples used in this study included 460 plasma specimens from human immunodeficiency virus (HIV) type 1 (HIV-1)-infected persons, 100 plasma specimens from HIV antibody (anti-HIV)-negative persons, and culture supernatants of HIV-1 subtype A to E isolates diluted in anti-HIV-negative plasma. Overall, bDNA 3.0 showed the least variation in RNA measures upon repeat testing. For the Roche assays, usCA MONITOR 1.5 displayed less variation in RNA measures than stCA MONITOR 1.5. NucliSens, at an input volume of 2 ml, showed the best sensitivity. Deming regression analysis indicated that the results of all three assays were significantly correlated (P < 0.0001). However, the mean difference in values between CA MONITOR 1.5 and bDNA 3.0 (0.274 log10 RNA copies/ml; 95% confidence interval, 0.192 to 0.356) was significantly different from 0, indicating that CA MONITOR 1.5 values were regularly higher than bDNA 3.0 values. Upon testing of 100 anti-HIV-negative plasma specimens, usCA MONITOR 1.5 and NucliSens displayed 100% specificity, while bDNA 3.0 showed 98% specificity. NucliSens quantified 2 of 10 non-subtype B viral isolates at 1 log10 lower than both CA MONITOR 1.5 and bDNA 3.0. For NucliSens, testing of specimens with greater than 1,000 RNA copies/ml at input volumes of 0.1, 0.2, and 2.0 ml did not affect the quality of results. Additional factors differing between assays included specimen throughput and volume requirements, limit of detection, ease of execution, instrument work space, and costs of disposal. These characteristics, along with assay performance, should be considered when one is selecting a viral load assay. 
PMID:11060065
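
    For readers unfamiliar with the method used in this comparison: Deming regression allows for measurement error in both variables, and with an assumed ratio of error variances the slope has a closed form. The following is a minimal pure-Python sketch, not the trial's software; the assay values in the usage example are invented.

```python
# Deming regression: fit y = intercept + slope*x when both x and y carry
# measurement error. delta is the assumed ratio of the y-error variance
# to the x-error variance (delta = 1 gives orthogonal regression).

def deming(x, y, delta=1.0):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    slope = (syy - delta * sxx
             + ((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2) ** 0.5) / (2 * sxy)
    return slope, my - slope * mx  # (slope, intercept)

# Two hypothetical assays measuring the same specimens (log10 RNA copies/ml):
xs = [2.1, 2.9, 3.8, 4.7, 5.5]
ys = [2.4, 3.1, 4.0, 5.0, 5.7]
slope, intercept = deming(xs, ys)
```

    A slope near 1 with a non-zero intercept is the pattern described in the abstract: the two assays track each other but one reads systematically higher.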

  5. Regression: The Apple Does Not Fall Far From the Tree.

    PubMed

    Vetter, Thomas R; Schober, Patrick

    2018-05-15

    Researchers and clinicians are frequently interested in either: (1) assessing whether there is a relationship or association between 2 or more variables and quantifying this association; or (2) determining whether 1 or more variables can predict another variable. The strength of such an association is mainly described by the correlation. However, regression analysis and regression models can be used not only to identify whether there is a significant relationship or association between variables but also to generate estimations of such a predictive relationship between variables. This basic statistical tutorial discusses the fundamental concepts and techniques related to the most common types of regression analysis and modeling, including simple linear regression, multiple regression, logistic regression, ordinal regression, and Poisson regression, as well as the common yet often underrecognized phenomenon of regression toward the mean. The various types of regression analysis are powerful statistical techniques, which when appropriately applied, can allow for the valid interpretation of complex, multifactorial data. Regression analysis and models can assess whether there is a relationship or association between 2 or more observed variables and estimate the strength of this association, as well as determine whether 1 or more variables can predict another variable. Regression is thus being applied more commonly in anesthesia, perioperative, critical care, and pain research. However, it is crucial to note that regression can identify plausible risk factors; it does not prove causation (a definitive cause and effect relationship). The results of a regression analysis instead identify independent (predictor) variable(s) associated with the dependent (outcome) variable. As with other statistical methods, applying regression requires that certain assumptions be met, which can be tested with specific diagnostics.

  6. Cumulative sum control charts for assessing performance in arterial surgery.

    PubMed

    Beiles, C Barry; Morton, Anthony P

    2004-03-01

    The Melbourne Vascular Surgical Association (Melbourne, Australia) undertakes surveillance of mortality following aortic aneurysm surgery, patency at discharge following infrainguinal bypass and stroke and death following carotid endarterectomy. Quality improvement protocol employing the Deming cycle requires that the system for performing surgery first be analysed and optimized. Then process and outcome data are collected and these data require careful analysis. There must be a mechanism so that the causes of unsatisfactory outcomes can be determined and a good feedback mechanism must exist so that good performance is acknowledged and unsatisfactory performance corrected. A simple method for analysing these data that detects changes in average outcome rates is available using cumulative sum statistical control charts. Data have been analysed both retrospectively from 1999 to 2001, and prospectively during 2002 using cumulative sum control methods. A pathway to deal with control chart signals has been developed. The standard of arterial surgery in Victoria, Australia, is high. In one case a safe and satisfactory outcome was achieved by following the pathway developed by the audit committee. Cumulative sum control charts are a simple and effective tool for the identification of variations in performance standards in arterial surgery. The establishment of a pathway to manage problem performance is a vital part of audit activity.
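
    A minimal Bernoulli (log-likelihood-ratio) CUSUM of the kind used for monitoring binary surgical outcomes can be sketched as follows. This is an illustration only: p0, p1, and the decision limit h are invented values, not those used by the Melbourne audit.

```python
from math import log

# CUSUM for a sequence of binary outcomes. p0 is the acceptable
# adverse-event rate, p1 the unacceptable rate being guarded against.
# The score accumulates log-likelihood-ratio increments, is reset at
# zero, and signals when it crosses the decision limit h.

def cusum_signals(outcomes, p0=0.05, p1=0.10, h=4.5):
    """outcomes: 1 = adverse event, 0 = success. Returns (scores, signal_index)."""
    w_fail = log(p1 / p0)            # positive increment on an adverse event
    w_ok = log((1 - p1) / (1 - p0))  # small negative increment on a success
    s, scores = 0.0, []
    for i, o in enumerate(outcomes):
        s = max(0.0, s + (w_fail if o else w_ok))
        scores.append(s)
        if s >= h:
            return scores, i         # performance flagged at case i
    return scores, None
```

    A run of adverse events drives the score across h and triggers review, while isolated events decay back toward zero, which is what makes the chart sensitive to changes in the average outcome rate rather than to single cases.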

  7. Applied Multiple Linear Regression: A General Research Strategy

    ERIC Educational Resources Information Center

    Smith, Brandon B.

    1969-01-01

    Illustrates some of the basic concepts and procedures for using regression analysis in experimental design, analysis of variance, analysis of covariance, and curvilinear regression. Applications to evaluation of instruction and vocational education programs are illustrated. (GR)

  8. External Tank Liquid Hydrogen (LH2) Prepress Regression Analysis Independent Review Technical Consultation Report

    NASA Technical Reports Server (NTRS)

    Parsons, Vickie s.

    2009-01-01

    The request to conduct an independent review of regression models, developed for determining the expected Launch Commit Criteria (LCC) External Tank (ET)-04 cycle count for the Space Shuttle ET tanking process, was submitted to the NASA Engineering and Safety Center NESC on September 20, 2005. The NESC team performed an independent review of regression models documented in Prepress Regression Analysis, Tom Clark and Angela Krenn, 10/27/05. This consultation consisted of a peer review by statistical experts of the proposed regression models provided in the Prepress Regression Analysis. This document is the consultation's final report.

  9. Needs assessment and implementation of an employee assistance program: promoting a healthier work force.

    PubMed

    Monfils, M K

    1995-05-01

    1. The functions of a continuous quality improvement tool used by Deming--the Plan, Do, Check, Act Cycle--can be applied to the assessment, implementation, and ongoing evaluation of an Employee Assistance Program (EAP). 2. Various methods are available to assess the need for an EAP. As much data as possible should be collected to qualify and quantify the need so that management can make an informed decision and develop measures to determine program effectiveness. 3. Once an EAP is implemented, it should be monitored continually against the effectiveness measures initially developed. Using a continuous quality improvement process, the occupational health nurse and the EAP provider can establish a dynamic relationship that allows for growth beyond the original design and increased effectiveness of service to employees.

  10. Maintenance Operations in Mission Oriented Protective Posture Level IV (MOPPIV)

    DTIC Science & Technology

    1987-10-01

    [Abstract not extracted cleanly; only table-of-contents fragments survive, indicating sections on data analysis techniques (multiple linear regression), an example regression analysis, regression results for all tasks, task groupings for analysis, and individual task data such as "Repair FADAC Printed Circuit Board" and "Remove/Replace H60A3 Power Pack".]

  11. Creep-Rupture Data Analysis - Engineering Application of Regression Techniques. Ph.D. Thesis - North Carolina State Univ.

    NASA Technical Reports Server (NTRS)

    Rummler, D. R.

    1976-01-01

    The results are presented of investigations to apply regression techniques to the development of methodology for creep-rupture data analysis. Regression analysis techniques are applied to the explicit description of the creep behavior of materials for space shuttle thermal protection systems. A regression analysis technique is compared with five parametric methods for analyzing three simulated and twenty real data sets, and a computer program for the evaluation of creep-rupture data is presented.

  12. Resting-state functional magnetic resonance imaging: the impact of regression analysis.

    PubMed

    Yeh, Chia-Jung; Tseng, Yu-Sheng; Lin, Yi-Ru; Tsai, Shang-Yueh; Huang, Teng-Yi

    2015-01-01

    To investigate the impact of regression methods on resting-state functional magnetic resonance imaging (rsfMRI). During rsfMRI preprocessing, regression analysis is considered effective for reducing the interference of physiological noise on the signal time course. However, it is unclear whether the regression method benefits rsfMRI analysis. Twenty volunteers (10 men and 10 women; aged 23.4 ± 1.5 years) participated in the experiments. We used node analysis and functional connectivity mapping to assess the brain default mode network by using five combinations of regression methods. The results show that regressing the global mean plays a major role in the preprocessing steps. When a global regression method is applied, the values of functional connectivity are significantly lower (P ≤ .01) than those calculated without a global regression. This step increases inter-subject variation and produces anticorrelated brain areas. rsfMRI data processed using regression should be interpreted carefully. The significance of the anticorrelated brain areas produced by global signal removal is unclear. Copyright © 2014 by the American Society of Neuroimaging.
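
    The global-regression step discussed above amounts to regressing the global mean time course out of each voxel's series. A toy sketch on invented data shapes, not an rsfMRI pipeline:

```python
# Global signal regression: for each voxel time course v, estimate
# beta = cov(v, g) / var(g) against the global mean signal g and keep
# the residual v - beta*g (both series demeaned first).

def regress_out_global(voxels):
    """voxels: list of equal-length time courses. Returns residual series."""
    n = len(voxels[0])
    g = [sum(v[t] for v in voxels) / len(voxels) for t in range(n)]  # global mean
    gm = sum(g) / n
    gd = [gt - gm for gt in g]
    var_g = sum(d * d for d in gd)
    out = []
    for v in voxels:
        vm = sum(v) / n
        vd = [vt - vm for vt in v]
        beta = sum(a * b for a, b in zip(vd, gd)) / var_g
        out.append([a - beta * b for a, b in zip(vd, gd)])
    return out
```

    The abstract's caution follows directly from the arithmetic: because the residuals of all voxels are constrained to be uncorrelated with the shared global signal, some regions are pushed into negative correlations with others, which is why anticorrelated areas after global regression are hard to interpret.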

  13. Standards for Standardized Logistic Regression Coefficients

    ERIC Educational Resources Information Center

    Menard, Scott

    2011-01-01

    Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…

  14. Statistical Optics

    NASA Astrophysics Data System (ADS)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I RIchard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  15. Linear regression analysis: part 14 of a series on evaluation of scientific publications.

    PubMed

    Schneider, Astrid; Hommel, Gerhard; Blettner, Maria

    2010-11-01

    Regression analysis is an important statistical method for the analysis of medical data. It enables the identification and characterization of relationships among multiple factors. It also enables the identification of prognostically relevant risk factors and the calculation of risk scores for individual prognostication. This article is based on selected textbooks of statistics, a selective review of the literature, and our own experience. After a brief introduction of the uni- and multivariable regression models, illustrative examples are given to explain what the important considerations are before a regression analysis is performed, and how the results should be interpreted. The reader should then be able to judge whether the method has been used correctly and interpret the results appropriately. The performance and interpretation of linear regression analysis are subject to a variety of pitfalls, which are discussed here in detail. The reader is made aware of common errors of interpretation through practical examples. Both the opportunities for applying linear regression analysis and its limitations are presented.

  16. An improved multiple linear regression and data analysis computer program package

    NASA Technical Reports Server (NTRS)

    Sidik, S. M.

    1972-01-01

    NEWRAP, an improved version of a previous multiple linear regression program called RAPIER, CREDUC, and CRSPLT, allows for a complete regression analysis including cross plots of the independent and dependent variables, correlation coefficients, regression coefficients, analysis of variance tables, t-statistics and their probability levels, rejection of independent variables, plots of residuals against the independent and dependent variables, and a canonical reduction of quadratic response functions useful in optimum seeking experimentation. A major improvement over RAPIER is that all regression calculations are done in double precision arithmetic.

  17. [A SAS macro program for batch processing of univariate Cox regression analysis for large databases].

    PubMed

    Yang, Rendong; Xiong, Jie; Peng, Yangqin; Peng, Xiaoning; Zeng, Xiaomin

    2015-02-01

    To realize batch processing of univariate Cox regression analysis for large databases with a SAS macro program. We wrote a SAS macro program in SAS 9.2 that can filter and integrate results and export P values to Excel. The program was used for screening survival-correlated RNA molecules in ovarian cancer. The SAS macro program could complete the batch processing of univariate Cox regression analyses, together with the selection and export of the results. The SAS macro program has potential applications in reducing the workload of statistical analysis and providing a basis for batch processing of univariate Cox regression analysis.

  18. Exact Analysis of Squared Cross-Validity Coefficient in Predictive Regression Models

    ERIC Educational Resources Information Center

    Shieh, Gwowen

    2009-01-01

    In regression analysis, the notion of population validity is of theoretical interest for describing the usefulness of the underlying regression model, whereas the presumably more important concept of population cross-validity represents the predictive effectiveness for the regression equation in future research. It appears that the inference…

  19. Selective principal component regression analysis of fluorescence hyperspectral image to assess aflatoxin contamination in corn

    USDA-ARS?s Scientific Manuscript database

    Selective principal component regression analysis (SPCR) uses a subset of the original image bands for principal component transformation and regression. For optimal band selection before the transformation, this paper used genetic algorithms (GA). In this case, the GA process used the regression co...

  20. Development of a User Interface for a Regression Analysis Software Tool

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and the graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  1. Regression Analysis and the Sociological Imagination

    ERIC Educational Resources Information Center

    De Maio, Fernando

    2014-01-01

    Regression analysis is an important aspect of most introductory statistics courses in sociology but is often presented in contexts divorced from the central concerns that bring students into the discipline. Consequently, we present five lesson ideas that emerge from a regression analysis of income inequality and mortality in the USA and Canada.

  2. Laboratory Simulations on Haze Formation in Cool Exoplanet Atmospheres

    NASA Astrophysics Data System (ADS)

    He, Chao; Horst, Sarah; Lewis, Nikole; Yu, Xinting; McGuiggan, Patricia; Moses, Julianne I.

    2017-10-01

    The Kepler mission has shown that super-Earths and mini-Neptunes are the most abundant types of planets among the ~3500 confirmed exoplanets, and these types of exoplanets are expected to exhibit a wide variety of atmospheric compositions. Recent transit spectra have demonstrated that clouds and/or hazes could play a significant role in these planetary atmospheres (Deming et al. 2013, Knutson et al. 2014, Kreidberg et al. 2014, Pont et al. 2013). However, very little laboratory work has been done to understand the formation of haze over a broad range of atmospheric compositions. Here we conducted a series of laboratory simulations to investigate haze formation in a range of planetary atmospheres using our newly built Planetary HAZE Research (PHAZER) chamber (He et al. 2017). We ran experimental simulations for nine different atmospheres: three temperatures (300 K, 400 K, and 600 K) and three metallicities (100, 1000, and 10000 times solar metallicity) using AC glow discharge as an energy source to irradiate gas mixtures. We found that haze particles are formed in all nine experiments, but the haze production rates are dramatically different for different cases. We investigated the particle sizes of the haze particles deposited on quartz discs using atomic force microscopy (AFM). The AFM images show that the particle size varies from 30 nm to 200 nm. The haze particles are more uniform for 100x solar metallicity experiments (30 nm to 40 nm), while the particle sizes for 1000x and 10000x solar metallicity experiments have wider distributions (30 nm to 200 nm). The particle size affects the scattering of light, and thus the temperature structure of planetary atmospheres. The haze production rates and particle size distributions obtained here can serve as critical inputs to atmospheric physical and chemical tools to understand the exoplanetary atmospheres and help guide future TESS and JWST observations of super-Earths and mini-Neptunes.
    References: Deming, D., et al. 2013, ApJ, 774, 95; He, C., et al. 2017, ApJL, 841, L31; Knutson, H. A., et al. 2014, Nature, 505, 66; Kreidberg, L., et al. 2014, Nature, 505, 69; Pont, F., et al. 2013, MNRAS, 432, 2917.

  3. Modified mercalli intensities for nine earthquakes in central and western Washington between 1989 and 1999

    USGS Publications Warehouse

    Brocher, Thomas M.; Dewey, James W.; Cassidy, John F.

    2017-08-15

    We determine Modified Mercalli (Seismic) Intensities (MMI) for nine onshore earthquakes of magnitude 4.5 and larger that occurred in central and western Washington between 1989 and 1999, on the basis of effects reported in postal questionnaires, the press, and professional collaborators. The earthquakes studied include four earthquakes of M5 and larger: the M5.0 Deming earthquake of April 13, 1990, the M5.0 Point Robinson earthquake of January 29, 1995, the M5.4 Duvall earthquake of May 3, 1996, and the M5.8 Satsop earthquake of July 3, 1999. The MMI are assigned using data and procedures that evolved at the U.S. Geological Survey (USGS) and its Department of Commerce predecessors and that were used to assign MMI to felt earthquakes occurring in the United States between 1931 and 1986. We refer to the MMI assigned in this report as traditional MMI, because they are based on responses to postal questionnaires and on newspaper reports, and to distinguish them from MMI calculated from data contributed by the public by way of the internet. Maximum traditional MMI documented for the M5 and larger earthquakes are VII for the 1990 Deming earthquake, V for the 1995 Point Robinson earthquake, VI for the 1996 Duvall earthquake, and VII for the 1999 Satsop earthquake; the five other earthquakes were variously assigned maximum intensities of IV, V, or VI. Starting in 1995, the Pacific Northwest Seismic Network (PNSN) published MMI maps for four of the studied earthquakes, based on macroseismic observations submitted by the public by way of the internet. With the availability now of the traditional USGS MMI interpreted for all the sites from which USGS postal questionnaires were returned, the four Washington earthquakes join a rather small group of earthquakes for which both traditional USGS MMI and some type of internet-based MMI have been assigned. 
The values and distributions of the traditional MMI are broadly similar to the internet-based PNSN intensities; we discuss some differences in detail that reflect differences in data-sampling procedure, differences in the procedure used to assign intensity numbers from macroseismic observations, and differences in how intensities are mapped.

  4. Kin groups and trait groups: population structure and epidemic disease selection.

    PubMed

    Fix, A G

    1984-10-01

    A Monte Carlo simulation based on the population structure of a small-scale human population, the Semai Senoi of Malaysia, has been developed to study the combined effects of group, kin, and individual selection. The population structure resembles D.S. Wilson's structured deme model in that local breeding populations (Semai settlements) are subdivided into trait groups (hamlets) that may be kin-structured and are not themselves demes. Additionally, settlement breeding populations are connected by two-dimensional stepping-stone migration approaching 30% per generation. Group and kin-structured group selection occur among hamlets the survivors of which then disperse to breed within the settlement population. Genetic drift is modeled by the process of hamlet formation; individual selection as a deterministic process, and stepping-stone migration as either random or kin-structured migrant groups. The mechanism for group selection is epidemics of infectious disease that can wipe out small hamlets particularly if most adults become sick and social life collapses. Genetic resistance to a disease is an individual attribute; however, hamlet groups with several resistant adults are less likely to disintegrate and experience high social mortality. A specific human gene, hemoglobin E, which confers resistance to malaria, is studied as an example of the process. The results of the simulations show that high genetic variance among hamlet groups may be generated by moderate degrees of kin-structuring. This strong microdifferentiation provides the potential for group selection. The effect of group selection in this case is rapid increase in gene frequencies among the total set of populations. In fact, group selection in concert with individual selection produced a faster rate of gene frequency increase among a set of 25 populations than the rate within a single unstructured population subject to deterministic individual selection. 
Such rapid evolution with plausible rates of extinction, individual selection, and migration and a population structure realistic in its general form, has implications for specific human polymorphisms such as hemoglobin variants and for the more general problem of the tempo of evolution as well.

  5. Multivariate Regression Analysis and Slaughter Livestock,

    DTIC Science & Technology

    AGRICULTURE, *ECONOMICS), (*MEAT, PRODUCTION), MULTIVARIATE ANALYSIS, REGRESSION ANALYSIS , ANIMALS, WEIGHT, COSTS, PREDICTIONS, STABILITY, MATHEMATICAL MODELS, STORAGE, BEEF, PORK, FOOD, STATISTICAL DATA, ACCURACY

  6. Regression Analysis: Legal Applications in Institutional Research

    ERIC Educational Resources Information Center

    Frizell, Julie A.; Shippen, Benjamin S., Jr.; Luna, Andrew L.

    2008-01-01

    This article reviews multiple regression analysis, describes how its results should be interpreted, and instructs institutional researchers on how to conduct such analyses using an example focused on faculty pay equity between men and women. The use of multiple regression analysis will be presented as a method with which to compare salaries of…

  7. RAWS II: A MULTIPLE REGRESSION ANALYSIS PROGRAM,

    DTIC Science & Technology

    This memorandum gives instructions for the use and operation of a revised version of RAWS, a multiple regression analysis program. The program...of preprocessed data, the directed retention of variables, listing of the matrix of the normal equations and its inverse, and the bypassing of the regression analysis to provide the input variable statistics only. (Author)

  8. Quality management and the work environment: an empirical investigation in a public sector organization.

    PubMed

    Taveira, Alvaro D; James, Craig A; Karsh, Ben-Tzion; Sainfort, François

    2003-07-01

    The integration of quality management initiatives, particularly total quality management (TQM), and ergonomics has received increasing attention from scholars and practitioners. Above all, the question of how TQM programs relate to ergonomic aspects of organizational design and culture is at the center of this discussion. This study examines how elements of a "typical", Deming-inspired, TQM program in the public sector interact with the work environment. Elements of the TQM program were defined and measured using the Malcolm Baldrige Award criteria. The specific elements examined were "Management Support of Quality", "Information and Analysis", "Human Resources", "Processes and Quality Results", and "Customer Focus and Satisfaction". The relationships between these TQM elements and the work environment were defined through five separate hypotheses. The work environment was described by the constructs "Supervisor Support", "Task Clarity", "Task Orientation", and "Innovation". Data were obtained through survey questionnaires administered to employees of four departments in a municipal government organization. Results supported three of the hypotheses, but produced some unanticipated outcomes with regard to the other two. Namely, "Management Support of Quality" was significantly related to "Supervisor Support", "Task Orientation", "Task Clarity" and "Innovation"; "Human Resources" was significantly related to "Supervisor Support"; "Processes and Quality Results" was significantly related to "Task Orientation" and "Innovation". Contrary to prediction, "Information and Analysis" was negatively related to "Innovation", and "Customer Focus" was unrelated to any of the outcome variables. The relationships between these TQM elements and work environment dimensions are discussed. Implications for TQM and ergonomic practice are analyzed, and directions for future research are proposed.

  9. Evaluation of Performance Characteristics of the Aptima HIV-1 Quant Dx Assay for Detection and Quantitation of Human Immunodeficiency Virus Type 1 in Plasma and Cervicovaginal Lavage Samples.

    PubMed

    Sam, Soya S; Kurpewski, Jaclynn R; Cu-Uvin, Susan; Caliendo, Angela M

    2016-04-01

    Quantification of HIV-1 RNA has become the standard of care in the clinical management of HIV-1-infected individuals. The objective of this study was to evaluate performance characteristics and relative workflow of the Aptima HIV-1 Quant Dx assay in comparison with the Abbott RealTime HIV-1 assay using plasma and cervicovaginal lavage (CVL) specimens. Assay performance was evaluated by using an AcroMetrix HIV-1 panel, AcroMetrix positive controls, Qnostics and SeraCare HIV-1 evaluation panels, 208 clinical plasma samples, and 205 matched CVL specimens on the Panther and m2000 platforms. The Aptima assay demonstrated good linearity over the quantification range tested (2 to 5 log10 copies/ml), and there was strong linear correlation between the assays (R2 = 0.99), with a comparable coefficient of variation of <5.5%. For the plasma samples, Deming regression analyses and Bland-Altman plots showed excellent agreement between the assays, with an interassay concordance of 91.35% (kappa = 0.75; 95% confidence interval [CI], 0.65 to 0.85), and on average, the viral loads determined by the Aptima assay were 0.21 log10 copies/ml higher than those determined by the RealTime assay. The assays differed in their sensitivity for quantifying HIV-1 RNA loads in CVL samples, with the Aptima and RealTime assays detecting 30% and 20%, respectively. Aptima had fewer invalid results, and on average, the viral loads in CVL samples quantified by the Aptima assay were 0.072 log10 copies/ml higher than those of the RealTime assay. Our results demonstrate that the Aptima assay is sensitive and accurate in quantifying viral loads in both plasma and CVL specimens and that the fully automated Panther system has all the necessary features suitable for clinical laboratories demanding high-throughput sample processing. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
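    The Bland-Altman analysis used in studies like this one summarizes between-assay agreement as a mean difference (bias) with 95% limits of agreement. A minimal sketch; the function name and example values are illustrative, not the study's data:

```python
import math

def bland_altman(a, b):
    """Bland-Altman agreement statistics for two methods measured on the
    same samples: mean difference (bias) and 95% limits of agreement."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# e.g. paired log10 viral loads from two hypothetical assays
bias, lo, hi = bland_altman([1.0, 2.0, 3.0, 4.0], [0.9, 1.9, 2.9, 3.9])
```

    A positive bias here would correspond to the first assay reading systematically higher, as reported above for Aptima versus RealTime.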

  10. Evaluation of Performance Characteristics of the Aptima HIV-1 Quant Dx Assay for Detection and Quantitation of Human Immunodeficiency Virus Type 1 in Plasma and Cervicovaginal Lavage Samples

    PubMed Central

    Kurpewski, Jaclynn R.; Cu-Uvin, Susan; Caliendo, Angela M.

    2016-01-01

    Quantification of HIV-1 RNA has become the standard of care in the clinical management of HIV-1-infected individuals. The objective of this study was to evaluate performance characteristics and relative workflow of the Aptima HIV-1 Quant Dx assay in comparison with the Abbott RealTime HIV-1 assay using plasma and cervicovaginal lavage (CVL) specimens. Assay performance was evaluated by using an AcroMetrix HIV-1 panel, AcroMetrix positive controls, Qnostics and SeraCare HIV-1 evaluation panels, 208 clinical plasma samples, and 205 matched CVL specimens on the Panther and m2000 platforms. The Aptima assay demonstrated good linearity over the quantification range tested (2 to 5 log10 copies/ml), and there was strong linear correlation between the assays (R2 = 0.99), with a comparable coefficient of variation of <5.5%. For the plasma samples, Deming regression analyses and Bland-Altman plots showed excellent agreement between the assays, with an interassay concordance of 91.35% (kappa = 0.75; 95% confidence interval [CI], 0.65 to 0.85), and on average, the viral loads determined by the Aptima assay were 0.21 log10 copies/ml higher than those determined by the RealTime assay. The assays differed in their sensitivity for quantifying HIV-1 RNA loads in CVL samples, with the Aptima and RealTime assays detecting 30% and 20%, respectively. Aptima had fewer invalid results, and on average, the viral loads in CVL samples quantified by the Aptima assay were 0.072 log10 copies/ml higher than those of the RealTime assay. Our results demonstrate that the Aptima assay is sensitive and accurate in quantifying viral loads in both plasma and CVL specimens and that the fully automated Panther system has all the necessary features suitable for clinical laboratories demanding high-throughput sample processing. PMID:26842702

  11. Performance characteristics of the ARCHITECT Active-B12 (Holotranscobalamin) assay.

    PubMed

    Merrigan, Stephen D; Owen, William E; Straseski, Joely A

    2015-01-01

    Vitamin B12 (cobalamin) is a necessary cofactor in methionine and succinyl-CoA metabolism. Studies estimate the deficiency prevalence as high as 30% in the elderly population. Ten to thirty percent of circulating cobalamin is bound to transcobalamin (holotranscobalamin, holoTC), which can readily enter cells and is therefore considered the bioactive form. The objective of our study was to evaluate the analytical performance of a high-throughput, automated holoTC assay (ARCHITECT i2000SR Active-B12 (Holotranscobalamin)) and compare it to other available methods. Manufacturer-specified limits of blank (LoB), detection (LoD), and quantitation (LoQ), imprecision, interference, and linearity were evaluated for the ARCHITECT HoloTC assay. Residual de-identified serum samples were used to compare the ARCHITECT HoloTC assay with the automated AxSYM Active-B12 (Holotranscobalamin) assay (Abbott Diagnostics) and the manual Active-B12 (Holotranscobalamin) Enzyme Immunoassay (EIA) (Axis-Shield Diagnostics, Dundee, Scotland, UK). Manufacturer's claims of LoB, LoD, LoQ, imprecision, interference, and linearity to the highest point tested (113.4 pmol/L) were verified for the ARCHITECT HoloTC assay. Method comparison of the ARCHITECT HoloTC to the AxSYM HoloTC produced the following Deming regression statistics: ARCHITECT HoloTC = 0.941 × (AxSYM HoloTC) + 1.2 pmol/L, Sy/x = 6.4, r = 0.947 (n = 98). Comparison to the Active-B12 EIA produced: ARCHITECT HoloTC = 1.105 × (Active-B12 EIA) - 6.8 pmol/L, Sy/x = 11.0, r = 0.950 (n = 221). This assay performed acceptably for LoB, LoD, LoQ, imprecision, interference, linearity and method comparison to the predicate device (AxSYM). An additional comparison to a manual Active-B12 EIA method performed similarly, with minor exceptions. 
This study determined that the ARCHITECT HoloTC assay is suitable for routine clinical use, which provides a high-throughput alternative for automated testing of this emerging marker of cobalamin deficiency.
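    Deming regression lines like the ones quoted above can be reproduced with the standard closed-form estimator, which allows measurement error in both methods. A sketch assuming an error-variance ratio of 1 unless specified; the test data are synthetic, not the study's:

```python
import math

def deming(x, y, lam=1.0):
    """Deming regression: fit y = a + b*x allowing measurement error in
    both variables; lam is the ratio of error variances (var_y / var_x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    # closed-form slope; reduces to orthogonal regression when lam == 1
    b = (syy - lam * sxx + math.sqrt((syy - lam * sxx) ** 2
         + 4 * lam * sxy ** 2)) / (2 * sxy)
    a = my - b * mx
    return a, b
```

    Unlike ordinary least squares, the fitted slope is not biased toward zero when the comparison (x) method is itself noisy, which is why method-comparison studies prefer it.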

  12. CD4 Lymphocyte Enumeration and Hemoglobin Assessment Aid for Priority Decisions: A Multisite Evaluation of the BD FACSPresto™ System

    PubMed Central

    Thakar, Madhuri; Angira, Francis; Pattanapanyasat, Kovit; Wu, Alan H.B.; O’Gorman, Maurice; Zeng, Hui; Qu, Chenxue; Mahajan, Bharati; Sukapirom, Kasama; Chen, Danying; Hao, Yu; Gong, Yan; Indig, Monika De Arruda; Graminske, Sharon; Orta, Diana; d’Empaire, Nicole; Lu, Beverly; Omana-Zapata, Imelda; Zeh, Clement

    2017-01-01

    Background: The BD FACSPresto™ system uses capillary and venous blood to measure CD4 absolute counts (CD4), %CD4 in lymphocytes, and hemoglobin (Hb) in approximately 25 minutes. CD4 cell count is used with portable CD4 counters in resource-limited settings to manage HIV/AIDS patients. A method comparison was performed using capillary and venous samples from seven clinical laboratories in five countries. The BD FACSPresto system was assessed for variability between laboratory, instrument/operators, cartridge lots and within-run at four sites. Methods: Samples were collected under approved voluntary consent. EDTA-anticoagulated venous samples were tested for CD4 and %CD4 T cells using the gold-standard BD FACSCalibur™ system, and for Hb, using the Sysmex® KX-21N™ analyzer. Venous and capillary samples were tested on the BD FACSPresto system. Matched data was analyzed for bias (Deming linear regression and Bland-Altman methods), and for concordance around the clinical decision point. The coefficient of variation was estimated per site, instrument/operator, cartridge-lot and between-runs. Results: For method comparison, 93% of the 720 samples were from HIV-positive and 7% from HIV-negative or normal subjects. CD4 and %CD4 T cells venous and capillary results gave slopes within 0.96–1.05 and R2 ≥0.96; Hb slopes were ≥1.00 and R2 ≥0.89. Variability across sites/operators gave %CV <5.8% for CD4 counts, <1.9% for %CD4 and <3.2% for Hb. The total %CV was <7.7% across instrument/cartridge lot. Conclusion: The BD FACSPresto system provides accurate, reliable, precise CD4/%CD4/Hb results compared to gold-standard methods, irrespective of venous or capillary blood sampling. The data showed good agreement between the BD FACSPresto, BD FACSCalibur and Sysmex systems. PMID:29290885

  13. CD4 Lymphocyte Enumeration and Hemoglobin Assessment Aid for Priority Decisions: A Multisite Evaluation of the BD FACSPresto™ System.

    PubMed

    Thakar, Madhuri; Angira, Francis; Pattanapanyasat, Kovit; Wu, Alan H B; O'Gorman, Maurice; Zeng, Hui; Qu, Chenxue; Mahajan, Bharati; Sukapirom, Kasama; Chen, Danying; Hao, Yu; Gong, Yan; Indig, Monika De Arruda; Graminske, Sharon; Orta, Diana; d'Empaire, Nicole; Lu, Beverly; Omana-Zapata, Imelda; Zeh, Clement

    2017-01-01

    The BD FACSPresto™ system uses capillary and venous blood to measure CD4 absolute counts (CD4), %CD4 in lymphocytes, and hemoglobin (Hb) in approximately 25 minutes. CD4 cell count is used with portable CD4 counters in resource-limited settings to manage HIV/AIDS patients. A method comparison was performed using capillary and venous samples from seven clinical laboratories in five countries. The BD FACSPresto system was assessed for variability between laboratory, instrument/operators, cartridge lots and within-run at four sites. Samples were collected under approved voluntary consent. EDTA-anticoagulated venous samples were tested for CD4 and %CD4 T cells using the gold-standard BD FACSCalibur™ system, and for Hb, using the Sysmex® KX-21N™ analyzer. Venous and capillary samples were tested on the BD FACSPresto system. Matched data was analyzed for bias (Deming linear regression and Bland-Altman methods), and for concordance around the clinical decision point. The coefficient of variation was estimated per site, instrument/operator, cartridge-lot and between-runs. For method comparison, 93% of the 720 samples were from HIV-positive and 7% from HIV-negative or normal subjects. CD4 and %CD4 T cells venous and capillary results gave slopes within 0.96-1.05 and R2 ≥0.96; Hb slopes were ≥1.00 and R2 ≥0.89. Variability across sites/operators gave %CV <5.8% for CD4 counts, <1.9% for %CD4 and <3.2% for Hb. The total %CV was <7.7% across instrument/cartridge lot. The BD FACSPresto system provides accurate, reliable, precise CD4/%CD4/Hb results compared to gold-standard methods, irrespective of venous or capillary blood sampling. The data showed good agreement between the BD FACSPresto, BD FACSCalibur and Sysmex systems.

  14. Clinical Validation and Implications of Dried Blood Spot Sampling of Carbamazepine, Valproic Acid and Phenytoin in Patients with Epilepsy

    PubMed Central

    Kong, Sing Teang; Lim, Shih-Hui; Lee, Wee Beng; Kumar, Pasikanthi Kishore; Wang, Hwee Yi Stella; Ng, Yan Lam Shannon; Wong, Pei Shieen; Ho, Paul C.

    2014-01-01

    To facilitate therapeutic monitoring of antiepileptic drugs (AEDs) by healthcare professionals for patients with epilepsy (PWE), we applied a GC-MS assay to measure three AEDs: carbamazepine (CBZ), phenytoin (PHT) and valproic acid (VPA) levels concurrently in one dried blood spot (DBS), and validated the DBS-measured levels against their plasma levels. 169 PWE on either mono- or polytherapy of CBZ, PHT and/or VPA were included. One DBS, containing ∼15 µL of blood, was acquired for the simultaneous measurement of the drug levels using GC-MS. Simple Deming regressions were performed to correlate the DBS levels with the plasma levels determined by the conventional immunoturbidimetric assay in clinical practice. Statistical analyses of the results were done using MedCalc Version 12.6.1.0 and SPSS 21. DBS concentrations (Cdbs) were well-correlated with the plasma concentrations (Cplasma): r = 0.8381, 0.9305 and 0.8531 for CBZ, PHT and VPA, respectively. The conversion formulas from Cdbs to plasma concentrations were [0.89 × Cdbs(CBZ) + 1.00] µg/mL, [1.11 × Cdbs(PHT) − 1.00] µg/mL and [0.92 × Cdbs(VPA) + 12.48] µg/mL, respectively. Inclusion of the red blood cells (RBC)/plasma partition ratio (K) and the individual hematocrit levels in the estimation of the theoretical Cplasma from Cdbs of PHT and VPA further improved the identity between the observed and the estimated theoretical Cplasma. Bland-Altman plots indicated that the theoretical and observed Cplasma of PHT and VPA agreed well, and >93.0% of concentrations were within the 95% CI (±2 SD); similar agreement (1:1) was also found between the observed Cdbs and Cplasma of CBZ. As the Cplasma of CBZ, PHT and VPA can be accurately estimated from their Cdbs, DBS can therefore be used for drug monitoring in PWE on any of these AEDs. PMID:25255292
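    The reported conversion formulas can be applied directly. The sketch below is a plain transcription of the published equations (in µg/mL), ignoring the hematocrit/partition-ratio refinement; the function and dictionary names are illustrative:

```python
# Conversion formulas from the abstract (µg/mL): plasma level as a
# linear function of the dried-blood-spot (DBS) level for each drug.
CONVERSIONS = {
    "CBZ": lambda c: 0.89 * c + 1.00,
    "PHT": lambda c: 1.11 * c - 1.00,
    "VPA": lambda c: 0.92 * c + 12.48,
}

def plasma_from_dbs(drug, c_dbs):
    """Estimate a plasma concentration (µg/mL) from a DBS concentration
    using the study's Deming-derived conversion formula for that drug."""
    return CONVERSIONS[drug](c_dbs)
```

    For example, a DBS carbamazepine level of 10 µg/mL maps to an estimated plasma level of 0.89 × 10 + 1.00 = 9.9 µg/mL.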

  15. An Improved Enzymatic Indirect Method for Simultaneous Determinations of 3-MCPD Esters and Glycidyl Esters in Fish Oils.

    PubMed

    Miyazaki, Kinuko; Koyama, Kazuo

    2017-10-01

    The enzymatic indirect method for simultaneous determinations of 3-chloro-1,2-propanediol fatty acid esters (3-MCPD-Es) and glycidyl fatty acid esters (Gly-Es) makes use of lipase from Candida cylindracea (previously referred to as C. rugosa). Because of the low substrate specificity of the lipase for esters of polyunsaturated fatty acids (PUFA), such as docosahexaenoic acid (DHA) and eicosapentaenoic acid (EPA), fish oils high in PUFAs are currently excluded from the range of application of the method. The objective of this study was to make the enzymatic indirect method applicable to fats and oils containing PUFAs. By using a Burkholderia cepacia lipase, and by removing sodium bromide from the hydrolysis step and adding it after completion of the hydrolysis step, satisfactory recovery rates of 91-109% for 3-MCPD, and 91-110% for glycidol (Gly) were obtained from an EPA and DHA concentrated sardine oil, three DHA concentrated tuna oils, two fish oils, and five fish-oil-based dietary supplements spiked with DHA-esters or oleic acid-esters of 3-MCPD and Gly at 20 mg/kg. Further, results from unspiked samples of seven fish-oil-based dietary supplements and five DHA concentrated tuna oils analyzed by the improved enzymatic indirect method were compared with the results analyzed by AOCS Cd 29a. For all 3-MCPD, 2-MCPD and Gly, the 95% confidence intervals determined by the weighted Deming regression for slopes and intercepts contained the values 1 and 0, respectively. It was therefore concluded that the results from the two methods were not statistically different. These results suggest that fish oils high in PUFAs may be included in the range of application for the improved enzymatic indirect method for simultaneous determinations of 3-MCPD and Gly esters in fats and oils.

  16. [Comparison of application of Cochran-Armitage trend test and linear regression analysis for rate trend analysis in epidemiology study].

    PubMed

    Wang, D Z; Wang, C; Shen, C F; Zhang, Y; Zhang, H; Song, G D; Xue, X D; Xu, Z L; Zhang, S; Jiang, G H

    2017-05-10

    We described the time trend of the acute myocardial infarction (AMI) incidence rate in Tianjin from 1999 to 2013 with the Cochran-Armitage trend (CAT) test and linear regression analysis, and the results were compared. Based on actual population, the CAT test had much stronger statistical power than linear regression analysis for both the overall incidence trend and age-specific incidence trends (Cochran-Armitage trend P value
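    The Cochran-Armitage trend test compared here can be computed in closed form from case counts across ordered groups (e.g. calendar years). A sketch using the usual normal approximation; the function name is illustrative and group scores default to 0, 1, 2, …:

```python
import math

def cochran_armitage(cases, totals, scores=None):
    """Cochran-Armitage test for a linear trend in proportions across
    ordered groups; returns the Z statistic and a two-sided p-value."""
    k = len(cases)
    scores = scores if scores is not None else list(range(k))
    N, R = sum(totals), sum(cases)
    pbar = R / N  # pooled proportion under the no-trend null
    t = sum(s * (r - n * pbar) for s, r, n in zip(scores, cases, totals))
    var = pbar * (1 - pbar) * (
        sum(s * s * n for s, n in zip(scores, totals))
        - sum(s * n for s, n in zip(scores, totals)) ** 2 / N)
    z = t / math.sqrt(var)
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p
```

    Because the test pools all groups into a single one-degree-of-freedom statistic aimed at monotone change, it typically has more power against a steady trend than fitting a regression line to the yearly rates.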

  17. A primer for biomedical scientists on how to execute model II linear regression analysis.

    PubMed

    Ludbrook, John

    2012-04-01

    1. There are two very different ways of executing linear regression analysis. One is Model I, when the x-values are fixed by the experimenter. The other is Model II, in which the x-values are free to vary and are subject to error. 2. I have received numerous complaints from biomedical scientists that they have great difficulty in executing Model II linear regression analysis. This may explain the results of a Google Scholar search, which showed that the authors of articles in journals of physiology, pharmacology and biochemistry rarely use Model II regression analysis. 3. I repeat my previous arguments in favour of using least products linear regression analysis for Model II regressions. I review three methods for executing ordinary least products (OLP) and weighted least products (WLP) regression analysis: (i) scientific calculator and/or computer spreadsheet; (ii) specific purpose computer programs; and (iii) general purpose computer programs. 4. Using a scientific calculator and/or computer spreadsheet, it is easy to obtain correct values for OLP slope and intercept, but the corresponding 95% confidence intervals (CI) are inaccurate. 5. Using specific purpose computer programs, the freeware computer program smatr gives the correct OLP regression coefficients and obtains 95% CI by bootstrapping. In addition, smatr can be used to compare the slopes of OLP lines. 6. When using general purpose computer programs, I recommend the commercial programs systat and Statistica for those who regularly undertake linear regression analysis and I give step-by-step instructions in the Supplementary Information as to how to use loss functions. © 2011 The Author. Clinical and Experimental Pharmacology and Physiology. © 2011 Blackwell Publishing Asia Pty Ltd.
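    Ordinary least products (OLP) regression, the Model II method recommended above, has a simple closed form: the slope is the geometric mean of the y-on-x least-squares slope and the reciprocal of the x-on-y slope, with the sign of the correlation. A sketch of the point estimates only (the abstract notes that valid confidence intervals need bootstrapping, which is omitted here):

```python
import math

def least_products(x, y):
    """Ordinary least products (geometric mean) regression, a Model II
    method appropriate when x as well as y is subject to error."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    # slope magnitude sqrt(syy/sxx), signed by the covariance
    b = math.copysign(math.sqrt(syy / sxx), sxy)
    return my - b * mx, b  # intercept, slope
```

    This matches the values a scientific calculator or spreadsheet would give for the OLP coefficients, which the abstract describes as easy to obtain correctly.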

  18. Water quality parameter measurement using spectral signatures

    NASA Technical Reports Server (NTRS)

    White, P. E.

    1973-01-01

    Regression analysis is applied to the problem of measuring water quality parameters from remote sensing spectral signature data. The equations necessary to perform regression analysis are presented and methods of testing the strength and reliability of a regression are described. An efficient algorithm for selecting an optimal subset of the independent variables available for a regression is also presented.
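    An optimal-subset search like the one described can be approximated by greedy forward selection on R². A sketch of that simpler strategy, not necessarily the report's algorithm; the function name is illustrative:

```python
import numpy as np

def forward_select(X, y, k):
    """Greedy forward selection: repeatedly add the predictor column
    that most increases the R^2 of an ordinary least-squares fit."""
    def r2(cols):
        A = np.column_stack([np.ones(len(y))] + [X[:, c] for c in cols])
        resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
        return 1 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)
    chosen = []
    while len(chosen) < k:
        best = max((c for c in range(X.shape[1]) if c not in chosen),
                   key=lambda c: r2(chosen + [c]))
        chosen.append(best)
    return chosen
```

    Greedy selection evaluates only O(k·p) fits instead of all 2^p subsets, at the cost of occasionally missing the truly optimal combination.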

  19. Multiplication factor versus regression analysis in stature estimation from hand and foot dimensions.

    PubMed

    Krishan, Kewal; Kanchan, Tanuj; Sharma, Abhilasha

    2012-05-01

    Estimation of stature is an important parameter in identification of human remains in forensic examinations. The present study aims to compare the reliability and accuracy of stature estimation and to demonstrate the variability in estimated stature and actual stature using multiplication factor and regression analysis methods. The study is based on a sample of 246 subjects (123 males and 123 females) from North India aged between 17 and 20 years. Four anthropometric measurements (hand length, hand breadth, foot length and foot breadth), taken on the left side in each subject, were included in the study. Stature was measured using standard anthropometric techniques. Multiplication factors were calculated and linear regression models were derived for estimation of stature from hand and foot dimensions. Derived multiplication factors and regression formulas were applied to the hand and foot measurements in the study sample. The estimated stature from the multiplication factors and regression analysis was compared with the actual stature to find the error in estimated stature. The results indicate that the range of error in estimation of stature from the regression analysis method is less than that of the multiplication factor method, thus confirming that the regression analysis method is better than the multiplication factor method in stature estimation. Copyright © 2012 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
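    The two estimation methods being compared can be contrasted on synthetic data. A sketch (all names and numbers illustrative): stature is given a nonzero intercept in its relation to the body dimension, the situation where a pure multiplication factor is expected to do worse than a fitted regression line:

```python
def mf_and_regression(x, stature):
    """Compare stature estimation by a multiplication factor (mean
    stature/dimension ratio) versus a least-squares regression line;
    returns the mean absolute error of each method on the same data."""
    n = len(x)
    mf = sum(s / xi for s, xi in zip(stature, x)) / n
    mx, ms = sum(x) / n, sum(stature) / n
    b = (sum((xi - mx) * (s - ms) for xi, s in zip(x, stature))
         / sum((xi - mx) ** 2 for xi in x))
    a = ms - b * mx
    mf_err = sum(abs(mf * xi - s) for xi, s in zip(x, stature)) / n
    reg_err = sum(abs(a + b * xi - s) for xi, s in zip(x, stature)) / n
    return mf_err, reg_err
```

    A multiplication factor forces the fitted line through the origin, so whenever the true stature-dimension relationship has an intercept, its errors exceed those of the regression line, consistent with the study's conclusion.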

  20. Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.

    PubMed

    Ritz, Christian; Van der Vliet, Leana

    2009-09-01

    The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly, this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions (variance homogeneity and normality) that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions often is caused by reduced variance at the concentrations showing severe adverse effects. Although commonly used with linear regression analysis, transformation of the response variable alone is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the deprecation of less-desirable and less-flexible analytical techniques, such as linear interpolation.
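    The Box-Cox transformation mentioned above is a one-parameter family of power transforms. A minimal sketch of the transform itself (selecting λ, e.g. by maximum likelihood, is omitted, and the function name is illustrative):

```python
import math

def box_cox(y, lam):
    """Box-Cox power transform, used to stabilize variance and restore
    normality before regression; lam = 0 gives the log transform."""
    if lam == 0:
        return [math.log(v) for v in y]
    return [(v ** lam - 1) / lam for v in y]
```

    The transform is continuous in λ: as λ → 0, (y^λ − 1)/λ → log y, so the log transform is the limiting member of the family rather than a special case bolted on.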

  1. Using evidence-based practice to create a venous access team: the Venous Access Task Force of the Children's Hospital of Denver.

    PubMed

    MacPhee, Maura

    2002-12-01

    The following article is an example of evidence-based practice applied to an institutional Quality Improvement (QI) project. QI originated in the 1980s and is best associated with the work of W. Deming (1986). It is also known as Continuous Quality Improvement, because a major principle of this approach is constant improvement of services or products. This improvement process contains other critical components: scientific method, employee participation and teamwork, accountable leadership, appropriate training and ongoing education, and client focus (Deming, 1986). QI has been globally successful and has helped transform American industry, including health care services. The following clinically based project illustrates the application of QI concepts and evidence-based practice to enhance outcomes. Copyright 2002, Elsevier Science (USA). All rights reserved.

  2. Defining the best quality-control systems by design and inspection.

    PubMed

    Hinckley, C M

    1997-05-01

    Not all of the many approaches to quality control are equally effective. Nonconformities in laboratory testing are caused basically by excessive process variation and mistakes. Statistical quality control can effectively control process variation, but it cannot detect or prevent most mistakes. Because mistakes or blunders are frequently the dominant source of nonconformities, we conclude that statistical quality control by itself is not effective. I explore the 100% inspection methods essential for controlling mistakes. Unlike the inspection techniques that Deming described as ineffective, the new "source" inspection methods can detect mistakes and enable corrections before nonconformities are generated, achieving the highest degree of quality at a fraction of the cost of traditional methods. Key relationships between task complexity and nonconformity rates are also described, along with cultural changes that are essential for implementing the best quality-control practices.

  3. The Evolution of Latent Genes in Subdivided Populations

    PubMed Central

    Moody, M. E.; Basten, C. J.

    1990-01-01

    We define latent genes as phenotypically silent DNA sequences which may be reactivated by various genetic mechanisms. Of interest is how they and their functional counterparts can be maintained at high frequency in the face of mutation and selection pressure. We propose a two-deme, three-allele model incorporating viability selection, mutation and migration in haploid populations. It is shown that polymorphism for the three alleles can be easily maintained for a wide range of biologically meaningful parameter values. Computer simulations were employed to gain qualitative insight into the global dynamics of the system. It was found that the dynamics of the latent allele is closely correlated with that of the functional allele. In addition, bias in the migration rates can strengthen or weaken selective conditions for preservation of the functional and latent alleles. PMID:2307354

  4. Exoplanetary Photometry

    NASA Astrophysics Data System (ADS)

    Harrington, Joseph

    2008-09-01

    The Spitzer Space Telescope measured the first photons from exoplanets (Charbonneau et al. 2005, Deming et al. 2005). These secondary eclipses (planet passing behind star) revealed the planet's emitted infrared flux, and under a blackbody assumption provide a brightness temperature in each measured bandpass. Since the initial direct detections, Spitzer has made numerous measurements in the four Infrared Array Camera bandpasses at 3.6, 4.5, 5.8, and 8.0 microns; the Infrared Spectrograph's Blue Peakup Array at 16 microns; and the Multiband Imaging Photometer for Spitzer's 24-micron array. Initial measurements of orbital variation and further photometric study (Harrington et al. 2006, 2007) revealed the extreme day-night variability of some exoplanets, but full orbital phase curves of different planets (Knutson et al. 2007, 2008) demonstrated that not all planets are so variable. This talk will review progress and prospects in exoplanetary photometry.

  5. Using Robust Standard Errors to Combine Multiple Regression Estimates with Meta-Analysis

    ERIC Educational Resources Information Center

    Williams, Ryan T.

    2012-01-01

    Combining multiple regression estimates with meta-analysis has continued to be a difficult task. A variety of methods have been proposed and used to combine multiple regression slope estimates with meta-analysis, however, most of these methods have serious methodological and practical limitations. The purpose of this study was to explore the use…

  6. A Quality Assessment Tool for Non-Specialist Users of Regression Analysis

    ERIC Educational Resources Information Center

    Argyrous, George

    2015-01-01

    This paper illustrates the use of a quality assessment tool for regression analysis. It is designed for non-specialist "consumers" of evidence, such as policy makers. The tool provides a series of questions such consumers of evidence can ask to interrogate regression analysis, and is illustrated with reference to a recent study published…

  7. A retrospective analysis to identify the factors affecting infection in patients undergoing chemotherapy.

    PubMed

    Park, Ji Hyun; Kim, Hyeon-Young; Lee, Hanna; Yun, Eun Kyoung

    2015-12-01

    This study compares the performance of the logistic regression and decision tree analysis methods for assessing the risk factors for infection in cancer patients undergoing chemotherapy. The subjects were 732 cancer patients who were receiving chemotherapy at K university hospital in Seoul, Korea. The data were collected between March 2011 and February 2013 and were processed for descriptive analysis, logistic regression and decision tree analysis using the IBM SPSS Statistics 19 and Modeler 15.1 programs. The most common risk factors for infection in cancer patients receiving chemotherapy were identified as alkylating agents, vinca alkaloid and underlying diabetes mellitus. The logistic regression model achieved a sensitivity of 66.7% and a specificity of 88.9%. The decision tree analysis achieved a sensitivity of 55.0% and a specificity of 89.0%. As for the overall classification accuracy, the logistic regression reached 88.0% and the decision tree analysis 87.2%. The logistic regression analysis showed a higher degree of sensitivity and classification accuracy. Therefore, logistic regression analysis is concluded to be the more effective and useful method for establishing an infection prediction model for patients undergoing chemotherapy. Copyright © 2015 Elsevier Ltd. All rights reserved.
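    Sensitivity, specificity and overall accuracy, the criteria used to compare the two classifiers above, follow directly from confusion-matrix counts. A sketch with illustrative counts (the function name and numbers are not from the study):

```python
def classification_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and overall accuracy from the four
    confusion-matrix counts (true/false positives and negatives)."""
    sensitivity = tp / (tp + fn)   # infected patients correctly flagged
    specificity = tn / (tn + fp)   # uninfected patients correctly cleared
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy
```

    Reporting all three matters because, with an imbalanced outcome such as infection, a model can post high overall accuracy while its sensitivity (the clinically important figure here) stays low.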

  8. Visual grading characteristics and ordinal regression analysis during optimisation of CT head examinations.

    PubMed

    Zarb, Francis; McEntee, Mark F; Rainford, Louise

    2015-06-01

    To evaluate visual grading characteristics (VGC) and ordinal regression analysis during head CT optimisation as a potential alternative to visual grading assessment (VGA), traditionally employed to score anatomical visualisation. Patient images (n = 66) were obtained using current and optimised imaging protocols from two CT suites: a 16-slice scanner at the national Maltese centre for trauma and a 64-slice scanner in a private centre. Local resident radiologists (n = 6) performed VGA followed by VGC and ordinal regression analysis. VGC alone indicated that the optimised protocols provided image quality similar to that of the current protocols. Ordinal logistic regression analysis provided an in-depth, criterion-by-criterion evaluation, allowing the selective implementation of the protocols. The local radiology review panel supported the implementation of optimised protocols for brain CT examinations (including trauma) in one centre, achieving radiation dose reductions ranging from 24 % to 36 %. In the second centre a 29 % reduction in radiation dose was achieved for follow-up cases. The combined use of VGC and ordinal logistic regression analysis led to clinical decisions being taken on the implementation of the optimised protocols. This improved method of image quality analysis provided the evidence to support imaging protocol optimisation, resulting in significant radiation dose savings. • There is a need for scientifically based image quality evaluation during CT optimisation. • VGC and ordinal regression analysis in combination led to better informed clinical decisions. • VGC and ordinal regression analysis led to dose reductions without compromising diagnostic efficacy.

  9. REGRESSION ANALYSIS OF SEA-SURFACE-TEMPERATURE PATTERNS FOR THE NORTH PACIFIC OCEAN.

    DTIC Science & Technology

    SEA WATER, *SURFACE TEMPERATURE, *OCEANOGRAPHIC DATA, PACIFIC OCEAN, REGRESSION ANALYSIS, STATISTICAL ANALYSIS, UNDERWATER EQUIPMENT, DETECTION, UNDERWATER COMMUNICATIONS, DISTRIBUTION, THERMAL PROPERTIES, COMPUTERS.

  10. The process and utility of classification and regression tree methodology in nursing research

    PubMed Central

    Kuhn, Lisa; Page, Karen; Ward, John; Worrall-Carter, Linda

    2014-01-01

    Aim This paper presents a discussion of classification and regression tree analysis and its utility in nursing research. Background Classification and regression tree analysis is an exploratory research method used to illustrate associations between variables not suited to traditional regression analysis. Complex interactions are demonstrated between covariates and variables of interest in inverted tree diagrams. Design Discussion paper. Data sources English language literature was sourced from eBooks, Medline Complete and CINAHL Plus databases, Google and Google Scholar, hard copy research texts and retrieved reference lists for terms including classification and regression tree* and derivatives and recursive partitioning from 1984–2013. Discussion Classification and regression tree analysis is an important method used to identify previously unknown patterns amongst data. Whilst there are several reasons to embrace this method as a means of exploratory quantitative research, issues regarding quality of data as well as the usefulness and validity of the findings should be considered. Implications for Nursing Research Classification and regression tree analysis is a valuable tool to guide nurses to reduce gaps in the application of evidence to practice. With the ever-expanding availability of data, it is important that nurses understand the utility and limitations of the research method. Conclusion Classification and regression tree analysis is an easily interpreted method for modelling interactions between health-related variables that would otherwise remain obscured. Knowledge is presented graphically, providing insightful understanding of complex and hierarchical relationships in an accessible and useful way to nursing and other health professions. PMID:24237048

  11. The process and utility of classification and regression tree methodology in nursing research.

    PubMed

    Kuhn, Lisa; Page, Karen; Ward, John; Worrall-Carter, Linda

    2014-06-01

    This paper presents a discussion of classification and regression tree analysis and its utility in nursing research. Classification and regression tree analysis is an exploratory research method used to illustrate associations between variables not suited to traditional regression analysis. Complex interactions are demonstrated between covariates and variables of interest in inverted tree diagrams. Discussion paper. English language literature was sourced from eBooks, Medline Complete and CINAHL Plus databases, Google and Google Scholar, hard copy research texts and retrieved reference lists for terms including classification and regression tree* and derivatives and recursive partitioning from 1984-2013. Classification and regression tree analysis is an important method used to identify previously unknown patterns amongst data. Whilst there are several reasons to embrace this method as a means of exploratory quantitative research, issues regarding quality of data as well as the usefulness and validity of the findings should be considered. Classification and regression tree analysis is a valuable tool to guide nurses to reduce gaps in the application of evidence to practice. With the ever-expanding availability of data, it is important that nurses understand the utility and limitations of the research method. Classification and regression tree analysis is an easily interpreted method for modelling interactions between health-related variables that would otherwise remain obscured. Knowledge is presented graphically, providing insightful understanding of complex and hierarchical relationships in an accessible and useful way to nursing and other health professions. © 2013 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.
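
    The recursive partitioning these two records describe is built from one repeated step: choose the split that makes the resulting branches purest. A minimal sketch of that single step, assuming binary labels, synthetic data, and the Gini impurity criterion commonly used in CART:

```python
# Hedged sketch of one recursive-partitioning step. A real CART
# implementation recurses on each branch and then prunes; only the
# split-selection step is shown here, on invented data.

def gini(labels):
    """Gini impurity for a list of 0/1 labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p1 = sum(labels) / n
    return 2.0 * p1 * (1.0 - p1)

def best_split(xs, ys):
    """Threshold on x minimizing the weighted Gini impurity of the two branches."""
    best_t, best_g = None, float("inf")
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        g = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if g < best_g:
            best_t, best_g = t, g
    return best_t, best_g

xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [0, 0, 0, 0, 1, 1, 1, 1]   # perfectly separable at x = 4
print(best_split(xs, ys))        # (4, 0.0)
```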

  12. Advantages of the net benefit regression framework for economic evaluations of interventions in the workplace: a case study of the cost-effectiveness of a collaborative mental health care program for people receiving short-term disability benefits for psychiatric disorders.

    PubMed

    Hoch, Jeffrey S; Dewa, Carolyn S

    2014-04-01

    Economic evaluations commonly accompany trials of new treatments or interventions; however, regression methods and their corresponding advantages for the analysis of cost-effectiveness data are not well known. To illustrate regression-based economic evaluation, we present a case study investigating the cost-effectiveness of a collaborative mental health care program for people receiving short-term disability benefits for psychiatric disorders. We implement net benefit regression to illustrate its strengths and limitations. Net benefit regression offers a simple option for cost-effectiveness analyses of person-level data. By placing economic evaluation in a regression framework, regression-based techniques can facilitate the analysis and provide simple solutions to commonly encountered challenges. Economic evaluations of person-level data (eg, from a clinical trial) should use net benefit regression to facilitate analysis and enhance results.
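
    The core of the net benefit framework can be sketched briefly: each person's net benefit is nb = λ × effect − cost for a willingness-to-pay λ, and regressing nb on a 0/1 treatment indicator yields the incremental net benefit as the slope, which for a single binary regressor equals the difference in group means. All numbers below are invented for illustration; this is not the study's analysis.

```python
# Hedged sketch of person-level net benefit regression with a binary
# treatment indicator; the slope equals the treated-minus-control mean
# difference in net benefit. Data are synthetic.

def incremental_net_benefit(effects, costs, treated, lam):
    """Slope of nb ~ treatment, where nb_i = lam * effect_i - cost_i."""
    nb = [lam * e - c for e, c in zip(effects, costs)]
    t = [v for v, d in zip(nb, treated) if d == 1]
    u = [v for v, d in zip(nb, treated) if d == 0]
    return sum(t) / len(t) - sum(u) / len(u)

# Invented person-level data: effects (e.g. QALYs), costs, treatment indicator.
effects = [0.8, 0.9, 0.5, 0.6]
costs = [1000, 1200, 400, 600]
treated = [1, 1, 0, 0]
print(round(incremental_net_benefit(effects, costs, treated, lam=5000), 2))
```

    A positive slope at a given λ indicates the intervention is cost-effective at that willingness-to-pay; repeating over a range of λ traces out a cost-effectiveness acceptability picture.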

  13. CADDIS Volume 4. Data Analysis: Basic Analyses

    EPA Pesticide Factsheets

    Use of statistical tests to determine if an observation is outside the normal range of expected values. Details of CART, regression analysis, use of quantile regression analysis, CART in causal analysis, simplifying or pruning resulting trees.

  14. Population heterogeneity in the salience of multiple risk factors for adolescent delinquency.

    PubMed

    Lanza, Stephanie T; Cooper, Brittany R; Bray, Bethany C

    2014-03-01

    To present mixture regression analysis as an alternative to more standard regression analysis for predicting adolescent delinquency. We demonstrate how mixture regression analysis allows for the identification of population subgroups defined by the salience of multiple risk factors. We identified population subgroups (i.e., latent classes) of individuals based on their coefficients in a regression model predicting adolescent delinquency from eight previously established risk indices drawn from the community, school, family, peer, and individual levels. The study included N = 37,763 10th-grade adolescents who participated in the Communities That Care Youth Survey. Standard, zero-inflated, and mixture Poisson and negative binomial regression models were considered. Standard and mixture negative binomial regression models were selected as optimal. The five-class regression model was interpreted based on the class-specific regression coefficients, indicating that risk factors had varying salience across classes of adolescents. Standard regression showed that all risk factors were significantly associated with delinquency. Mixture regression provided more nuanced information, suggesting a unique set of risk factors that were salient for different subgroups of adolescents. Implications for the design of subgroup-specific interventions are discussed. Copyright © 2014 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  15. The Astronomy Diagnostic Test: Past, Present and Future

    NASA Astrophysics Data System (ADS)

    Deming, G. L.; Hufnagel, B. R.

    2000-12-01

    During 1998, the Collaboration for Astronomy Education Research (Adams, Adrian, Brick, Deming, Hufnagel, Slater, and Zeilik) developed a content-based diagnostic test for undergraduate non-science majors taking their first introductory level astronomy course. Student interviews and written feedback were used to construct a series of questions reflecting the students' natural language and with distractors (wrong answers) that mirror commonly held misconceptions. Version 1.9 of the Astronomy Diagnostic Test (ADT) was administered during Spring 1999 by volunteers teaching astronomy at 22 institutions across the United States. Minor modifications were made and Version 2.0 was released on June 21, 1999. The ADT 2.0 currently is available to the astronomical community through two websites and we continue to collect pretest/posttest results. Award of an NSF Small Grant for Exploratory Research has enabled us to work with a team of education researchers at the Ontario Institute for Studies in Education. Our database will be subjected to a statistical analysis in order to establish reliability of ADT 2.0. In addition, content, face, and construct validity are being examined. If you are teaching an introductory astronomy course aimed at non-science majors for Spring 2001, your class can be part of this project. We are looking for volunteers! We are also interested in hearing your ideas for a "next-generation" version of the ADT. Funding provided by NSF grant REC-0089239

  16. Length Distributions of Identity by Descent Reveal Fine-Scale Demographic History

    PubMed Central

    Palamara, Pier Francesco; Lencz, Todd; Darvasi, Ariel; Pe’er, Itsik

    2012-01-01

    Data-driven studies of identity by descent (IBD) were recently enabled by high-resolution genomic data from large cohorts and scalable algorithms for IBD detection. Yet, haplotype sharing currently represents an underutilized source of information for population-genetics research. We present analytical results on the relationship between haplotype sharing across purportedly unrelated individuals and a population’s demographic history. We express the distribution of IBD sharing across pairs of individuals for segments of arbitrary length as a function of the population’s demography, and we derive an inference procedure to reconstruct such demographic history. The accuracy of the proposed reconstruction methodology was extensively tested on simulated data. We applied this methodology to two densely typed data sets: 500 Ashkenazi Jewish (AJ) individuals and 56 Kenyan Maasai (MKK) individuals (HapMap 3 data set). Reconstructing the demographic history of the AJ cohort, we recovered two subsequent population expansions, separated by a severe founder event, consistent with previous analysis of lower-throughput genetic data and historical accounts of AJ history. In the MKK cohort, high levels of cryptic relatedness were detected. The spectrum of IBD sharing is consistent with a demographic model in which several small-sized demes intermix through high migration rates and result in enrichment of shared long-range haplotypes. This scenario of historically structured demographies might explain the unexpected abundance of runs of homozygosity within several populations. PMID:23103233

  17. Improving the quality of mass produced maps

    USGS Publications Warehouse

    Simley, J.

    2001-01-01

    Quality is critical in cartography because key decisions are often made based on the information the map communicates. The mass production of digital cartographic information to support geographic information science has now added a new dimension to the problem of cartographic quality, as problems once limited to small volumes can now proliferate in mass production programs. These problems can also affect the economics of map production by diverting a sizeable portion of production cost to pay for rework on maps with poor quality. Such problems are common to general industry; in response, the quality engineering profession has developed a number of successful methods to overcome them. Two important methods are the reduction of error through statistical analysis and addressing the quality environment in which people work. Once initial and obvious quality problems have been solved, outside influences periodically appear that cause adverse variations in quality and consequently increase production costs. Such errors can be difficult to detect before the customer is affected. However, a number of statistical techniques can be employed to detect variation so that the problem is eliminated before significant damage is caused. Additionally, the environment in which the workforce operates must be conducive to quality. Managers have a powerful responsibility to create this environment. Two sets of guidelines, known as Deming's Fourteen Points and ISO-9000, provide models for this environment.

  18. A Note on the Relationship between the Number of Indicators and Their Reliability in Detecting Regression Coefficients in Latent Regression Analysis

    ERIC Educational Resources Information Center

    Dolan, Conor V.; Wicherts, Jelte M.; Molenaar, Peter C. M.

    2004-01-01

    We consider the question of how variation in the number and reliability of indicators affects the power to reject the hypothesis that the regression coefficients are zero in latent linear regression analysis. We show that power remains constant as long as the coefficient of determination remains unchanged. Any increase in the number of indicators…

  19. Moderation analysis using a two-level regression model.

    PubMed

    Yuan, Ke-Hai; Cheng, Ying; Maxwell, Scott

    2014-10-01

    Moderation analysis is widely used in social and behavioral research. The most commonly used model for moderation analysis is moderated multiple regression (MMR) in which the explanatory variables of the regression model include product terms, and the model is typically estimated by least squares (LS). This paper argues for a two-level regression model in which the regression coefficients of a criterion variable on predictors are further regressed on moderator variables. An algorithm for estimating the parameters of the two-level model by normal-distribution-based maximum likelihood (NML) is developed. Formulas for the standard errors (SEs) of the parameter estimates are provided and studied. Results indicate that, when heteroscedasticity exists, NML with the two-level model gives more efficient and more accurate parameter estimates than the LS analysis of the MMR model. When error variances are homoscedastic, NML with the two-level model leads to essentially the same results as LS with the MMR model. Most importantly, the two-level regression model permits estimating the percentage of variance of each regression coefficient that is due to moderator variables. When applied to data from General Social Surveys 1991, NML with the two-level model identified a significant moderation effect of race on the regression of job prestige on years of education while LS with the MMR model did not. An R package is also developed and documented to facilitate the application of the two-level model.
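
    The MMR model this abstract starts from regresses the criterion on a predictor, a moderator, and their product term. A minimal least-squares sketch on synthetic, noise-free data (the paper's two-level NML estimator is not reproduced here; this only illustrates the MMR baseline):

```python
# Hedged sketch of moderated multiple regression (MMR): y is regressed on
# x, m, and the product x*m via ordinary least squares (normal equations
# solved by Gaussian elimination). Data and coefficients are synthetic.

def ols(X, y):
    """Least-squares coefficients solving X'X b = X'y."""
    n, k = len(X), len(X[0])
    XtX = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(k)] for a in range(k)]
    Xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(k)]
    A = [row[:] + [v] for row, v in zip(XtX, Xty)]   # augmented matrix
    for c in range(k):                               # forward elimination
        p = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            for j in range(c, k + 1):
                A[r][j] -= f * A[c][j]
    b = [0.0] * k                                    # back substitution
    for c in range(k - 1, -1, -1):
        b[c] = (A[c][k] - sum(A[c][j] * b[j] for j in range(c + 1, k))) / A[c][c]
    return b

# Synthetic data generated from y = 1 + 2x + 3m + 0.5*x*m with no noise,
# so OLS recovers the coefficients exactly.
grid = [(x, m) for x in range(-3, 4) for m in range(-2, 3)]
X = [[1.0, x, m, x * m] for x, m in grid]
y = [1 + 2 * x + 3 * m + 0.5 * x * m for x, m in grid]
print([round(c, 6) for c in ols(X, y)])  # [1.0, 2.0, 3.0, 0.5]
```

    The nonzero coefficient on the product term is what signals moderation; the paper's contribution is to model how such coefficients themselves vary with moderator variables.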

  20. Multiple Correlation versus Multiple Regression.

    ERIC Educational Resources Information Center

    Huberty, Carl J.

    2003-01-01

    Describes differences between multiple correlation analysis (MCA) and multiple regression analysis (MRA), showing how these approaches involve different research questions and study designs, different inferential approaches, different analysis strategies, and different reported information. (SLD)

  1. Functional Relationships and Regression Analysis.

    ERIC Educational Resources Information Center

    Preece, Peter F. W.

    1978-01-01

    Using a degenerate multivariate normal model for the distribution of organismic variables, the form of least-squares regression analysis required to estimate a linear functional relationship between variables is derived. It is suggested that the two conventional regression lines may be considered to describe functional, not merely statistical,…

  2. Isolating and Examining Sources of Suppression and Multicollinearity in Multiple Linear Regression

    ERIC Educational Resources Information Center

    Beckstead, Jason W.

    2012-01-01

    The presence of suppression (and multicollinearity) in multiple regression analysis complicates interpretation of predictor-criterion relationships. The mathematical conditions that produce suppression in regression analysis have received considerable attention in the methodological literature but until now nothing in the way of an analytic…

  3. General Nature of Multicollinearity in Multiple Regression Analysis.

    ERIC Educational Resources Information Center

    Liu, Richard

    1981-01-01

    Discusses multiple regression, a very popular statistical technique in the field of education. One of the basic assumptions in regression analysis requires that independent variables in the equation should not be highly correlated. The problem of multicollinearity and some of the solutions to it are discussed. (Author)

  4. Logistic Regression: Concept and Application

    ERIC Educational Resources Information Center

    Cokluk, Omay

    2010-01-01

    The main focus of logistic regression analysis is classification of individuals in different groups. The aim of the present study is to explain basic concepts and processes of binary logistic regression analysis intended to determine the combination of independent variables which best explain the membership in certain groups called dichotomous…

  5. Applying Regression Analysis to Problems in Institutional Research.

    ERIC Educational Resources Information Center

    Bohannon, Tom R.

    1988-01-01

    Regression analysis is one of the most frequently used statistical techniques in institutional research. Principles of least squares, model building, residual analysis, influence statistics, and multi-collinearity are described and illustrated. (Author/MSE)

  6. Multicollinearity in Regression Analyses Conducted in Epidemiologic Studies

    PubMed Central

    Vatcheva, Kristina P.; Lee, MinJae; McCormick, Joseph B.; Rahbar, Mohammad H.

    2016-01-01

    The adverse impact of ignoring multicollinearity on findings and data interpretation in regression analysis is very well documented in the statistical literature. The failure to identify and report multicollinearity could result in misleading interpretations of the results. A review of epidemiological literature in PubMed from January 2004 to December 2013, illustrated the need for a greater attention to identifying and minimizing the effect of multicollinearity in analysis of data from epidemiologic studies. We used simulated datasets and real life data from the Cameron County Hispanic Cohort to demonstrate the adverse effects of multicollinearity in the regression analysis and encourage researchers to consider the diagnostic for multicollinearity as one of the steps in regression analysis. PMID:27274911

  7. Multicollinearity in Regression Analyses Conducted in Epidemiologic Studies.

    PubMed

    Vatcheva, Kristina P; Lee, MinJae; McCormick, Joseph B; Rahbar, Mohammad H

    2016-04-01

    The adverse impact of ignoring multicollinearity on findings and data interpretation in regression analysis is very well documented in the statistical literature. The failure to identify and report multicollinearity could result in misleading interpretations of the results. A review of epidemiological literature in PubMed from January 2004 to December 2013, illustrated the need for a greater attention to identifying and minimizing the effect of multicollinearity in analysis of data from epidemiologic studies. We used simulated datasets and real life data from the Cameron County Hispanic Cohort to demonstrate the adverse effects of multicollinearity in the regression analysis and encourage researchers to consider the diagnostic for multicollinearity as one of the steps in regression analysis.
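
    One standard diagnostic the abstract encourages is the variance inflation factor (VIF). In the two-predictor case it reduces to 1 / (1 − r²), where r is the predictors' Pearson correlation; a common rule of thumb flags VIF above about 10. A minimal sketch on synthetic, nearly collinear data:

```python
# Hedged sketch: variance inflation factor for two predictors,
# VIF = 1 / (1 - r^2). Data below are invented and nearly collinear.
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def vif_two_predictors(x1, x2):
    r = pearson_r(x1, x2)
    return 1.0 / (1.0 - r * r)

x1 = [1, 2, 3, 4, 5, 6]
x2 = [1.1, 2.0, 3.2, 3.9, 5.1, 6.0]   # nearly collinear with x1
print(round(vif_two_predictors(x1, x2), 1))  # well above 10: multicollinearity
```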

  8. Stepwise versus Hierarchical Regression: Pros and Cons

    ERIC Educational Resources Information Center

    Lewis, Mitzi

    2007-01-01

    Multiple regression is commonly used in social and behavioral data analysis. In multiple regression contexts, researchers are very often interested in determining the "best" predictors in the analysis. This focus may stem from a need to identify those predictors that are supportive of theory. Alternatively, the researcher may simply be interested…

  9. Interpreting Bivariate Regression Coefficients: Going beyond the Average

    ERIC Educational Resources Information Center

    Halcoussis, Dennis; Phillips, G. Michael

    2010-01-01

    Statistics, econometrics, investment analysis, and data analysis classes often review the calculation of several types of averages, including the arithmetic mean, geometric mean, harmonic mean, and various weighted averages. This note shows how each of these can be computed using a basic regression framework. By recognizing when a regression model…
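
    The note's central idea can be shown in a few lines: an intercept-only least-squares fit recovers the arithmetic mean, and intercept-only weighted least squares recovers the weighted average. A minimal sketch with illustrative values:

```python
# Hedged sketch: averages as regression estimates.
# Minimizing sum (y_i - b)^2 gives b = arithmetic mean;
# minimizing sum w_i (y_i - b)^2 gives b = weighted average.

def intercept_only_ols(y):
    return sum(y) / len(y)

def intercept_only_wls(y, w):
    return sum(wi * yi for wi, yi in zip(w, y)) / sum(w)

y = [2.0, 4.0, 6.0, 8.0]
print(intercept_only_ols(y))                 # 5.0 (arithmetic mean)
print(intercept_only_wls(y, [1, 1, 1, 5]))   # 6.5 (weighted average)
```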

  10. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…

  11. Precision Efficacy Analysis for Regression.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.

    When multiple linear regression is used to develop a prediction model, sample size must be large enough to ensure stable coefficients. If the derivation sample size is inadequate, the model may not predict well for future subjects. The precision efficacy analysis for regression (PEAR) method uses a cross- validity approach to select sample sizes…

  12. Estimation of 1RM for knee extension based on the maximal isometric muscle strength and body composition.

    PubMed

    Kanada, Yoshikiyo; Sakurai, Hiroaki; Sugiura, Yoshito; Arai, Tomoaki; Koyama, Soichiro; Tanabe, Shigeo

    2017-11-01

    [Purpose] To create a regression formula in order to estimate 1RM for knee extensors, based on the maximal isometric muscle strength measured using a hand-held dynamometer and data regarding the body composition. [Subjects and Methods] Measurement was performed in 21 healthy males in their twenties to thirties. Single regression analysis was performed, with measurement values representing 1RM and the maximal isometric muscle strength as dependent and independent variables, respectively. Furthermore, multiple regression analysis was performed, with data regarding the body composition incorporated as another independent variable, in addition to the maximal isometric muscle strength. [Results] Through single regression analysis with the maximal isometric muscle strength as an independent variable, the following regression formula was created: 1RM (kg)=0.714 + 0.783 × maximal isometric muscle strength (kgf). On multiple regression analysis, only the total muscle mass was extracted. [Conclusion] A highly accurate regression formula to estimate 1RM was created based on both the maximal isometric muscle strength and body composition. Using a hand-held dynamometer and body composition analyzer, it was possible to measure these items in a short time, and obtain clinically useful results.
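
    The single-regression formula reported above can be applied directly; the input value below is illustrative, not study data:

```python
# Hedged sketch: the abstract's published formula,
# 1RM (kg) = 0.714 + 0.783 x maximal isometric muscle strength (kgf).

def estimate_1rm_kg(isometric_strength_kgf):
    """Estimated knee-extension 1RM from hand-held dynamometer strength."""
    return 0.714 + 0.783 * isometric_strength_kgf

print(round(estimate_1rm_kg(40.0), 2))  # 32.03
```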

  13. Regression Model Optimization for the Analysis of Experimental Data

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.

    2009-01-01

    A candidate math model search algorithm was developed at Ames Research Center that determines a recommended math model for the multivariate regression analysis of experimental data. The search algorithm is applicable to classical regression analysis problems as well as wind tunnel strain gage balance calibration analysis applications. The algorithm compares the predictive capability of different regression models using the standard deviation of the PRESS residuals of the responses as a search metric. This search metric is minimized during the search. Singular value decomposition is used during the search to reject math models that lead to a singular solution of the regression analysis problem. Two threshold dependent constraints are also applied. The first constraint rejects math models with insignificant terms. The second constraint rejects math models with near-linear dependencies between terms. The math term hierarchy rule may also be applied as an optional constraint during or after the candidate math model search. The final term selection of the recommended math model depends on the regressor and response values of the data set, the user's function class combination choice, the user's constraint selections, and the result of the search metric minimization. A frequently used regression analysis example from the literature is used to illustrate the application of the search algorithm to experimental data.
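
    The search metric named above is built from PRESS residuals: each observation's prediction error when the model is refit without that observation. A minimal sketch computing the PRESS statistic by leave-one-out refitting of a simple linear regression (the candidate-model search itself is not reproduced; data are synthetic):

```python
# Hedged sketch: PRESS (prediction sum of squares) for a simple linear
# regression, via explicit leave-one-out refits. Lower PRESS indicates
# better predictive capability. Data are invented.

def fit_line(xs, ys):
    """Least-squares intercept and slope for y = b0 + b1 * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
         sum((x - mx) ** 2 for x in xs)
    return my - b1 * mx, b1

def press(xs, ys):
    """Sum of squared leave-one-out prediction errors."""
    total = 0.0
    for i in range(len(xs)):
        b0, b1 = fit_line(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        total += (ys[i] - (b0 + b1 * xs[i])) ** 2
    return total

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.1, 1.9, 3.2, 3.8, 5.1]
print(round(press(xs, ys), 4))
```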

  14. [Application of negative binomial regression and modified Poisson regression in the research of risk factors for injury frequency].

    PubMed

    Cao, Qingqing; Wu, Zhenqiang; Sun, Ying; Wang, Tiezhu; Han, Tengwei; Gu, Chaomei; Sun, Yehuan

    2011-11-01

    To explore the application of negative binomial regression and modified Poisson regression analysis in analyzing the influential factors for injury frequency and the risk factors leading to the increase of injury frequency. 2917 primary and secondary school students were selected from Hefei by the cluster random sampling method and surveyed by questionnaire. The count data on injury events were used to fit modified Poisson regression and negative binomial regression models. The risk factors leading to an increase in unintentional injury frequency among juvenile students were explored, so as to probe the efficiency of these two models in studying the influential factors for injury frequency. The Poisson model showed over-dispersion (P < 0.0001) based on the Lagrange multiplier test. The over-dispersed data were therefore better fitted by the modified Poisson regression and negative binomial regression models. Both showed that male gender, younger age, a father working outside of the hometown, a guardian's education level above junior high school and smoking might be associated with higher injury frequencies. For clustered, over-dispersed count data on injury events, both modified Poisson regression analysis and negative binomial regression analysis can be used. However, based on our data, the modified Poisson regression fitted better and this model could give a more accurate interpretation of relevant factors affecting the frequency of injury.
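
    The motivation for moving beyond plain Poisson regression is over-dispersion: under a Poisson model the variance equals the mean, so a sample variance well above the mean points toward a negative binomial (or modified Poisson) specification. A minimal dispersion check on invented injury counts, not the study's data:

```python
# Hedged sketch: a quick over-dispersion check for count data.
# A variance-to-mean ratio well above 1 suggests the Poisson
# equal-dispersion assumption is violated. Counts are invented.

def dispersion_ratio(counts):
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)   # sample variance
    return var / mean

counts = [0, 0, 0, 1, 0, 2, 0, 0, 5, 0, 1, 7]
ratio = dispersion_ratio(counts)
print(round(ratio, 2), "over-dispersed" if ratio > 1 else "ok")  # 4.0 over-dispersed
```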

  15. Logistic regression analysis of conventional ultrasonography, strain elastosonography, and contrast-enhanced ultrasound characteristics for the differentiation of benign and malignant thyroid nodules

    PubMed Central

    Deng, Yingyuan; Wang, Tianfu; Chen, Siping; Liu, Weixiang

    2017-01-01

    The aim of the study is to screen the significant sonographic features by logistic regression analysis and fit a model to diagnose thyroid nodules. A total of 525 pathological thyroid nodules were retrospectively analyzed. All the nodules underwent conventional ultrasonography (US), strain elastosonography (SE), and contrast-enhanced ultrasound (CEUS). Those nodules’ 12 suspicious sonographic features were used to assess thyroid nodules. The significant features of diagnosing thyroid nodules were picked out by logistic regression analysis. All variables that were statistically related to diagnosis of thyroid nodules, at a level of p < 0.05, were entered into a logistic regression analysis model. The significant features in the logistic regression model of diagnosing thyroid nodules were calcification, suspected cervical lymph node metastasis, hypoenhancement pattern, margin, shape, vascularity, posterior acoustic, echogenicity, and elastography score. According to the results of logistic regression analysis, the formula that could predict whether or not thyroid nodules are malignant was established. The area under the receiver operating curve (ROC) was 0.930 and the sensitivity, specificity, accuracy, positive predictive value, and negative predictive value were 83.77%, 89.56%, 87.05%, 86.04%, and 87.79% respectively. PMID:29228030

  16. Logistic regression analysis of conventional ultrasonography, strain elastosonography, and contrast-enhanced ultrasound characteristics for the differentiation of benign and malignant thyroid nodules.

    PubMed

    Pang, Tiantian; Huang, Leidan; Deng, Yingyuan; Wang, Tianfu; Chen, Siping; Gong, Xuehao; Liu, Weixiang

    2017-01-01

    The aim of the study is to screen the significant sonographic features by logistic regression analysis and fit a model to diagnose thyroid nodules. A total of 525 pathological thyroid nodules were retrospectively analyzed. All the nodules underwent conventional ultrasonography (US), strain elastosonography (SE), and contrast-enhanced ultrasound (CEUS). Those nodules' 12 suspicious sonographic features were used to assess thyroid nodules. The significant features of diagnosing thyroid nodules were picked out by logistic regression analysis. All variables that were statistically related to diagnosis of thyroid nodules, at a level of p < 0.05, were entered into a logistic regression analysis model. The significant features in the logistic regression model of diagnosing thyroid nodules were calcification, suspected cervical lymph node metastasis, hypoenhancement pattern, margin, shape, vascularity, posterior acoustic, echogenicity, and elastography score. According to the results of logistic regression analysis, the formula that could predict whether or not thyroid nodules are malignant was established. The area under the receiver operating curve (ROC) was 0.930 and the sensitivity, specificity, accuracy, positive predictive value, and negative predictive value were 83.77%, 89.56%, 87.05%, 86.04%, and 87.79% respectively.
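
    The ROC area reported here (0.930) has a simple probabilistic reading: it is the probability that a randomly chosen malignant nodule receives a higher model score than a randomly chosen benign one. A minimal sketch of that rank-based (Mann-Whitney) AUC estimate, on invented scores:

```python
# Hedged sketch: ROC AUC via the Mann-Whitney statistic; ties count half.
# The predicted scores below are invented for illustration.

def auc(pos_scores, neg_scores):
    """Fraction of (positive, negative) pairs ranked correctly."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

malignant = [0.9, 0.8, 0.7, 0.6]   # model scores for malignant nodules
benign = [0.5, 0.4, 0.7, 0.2]      # model scores for benign nodules
print(auc(malignant, benign))       # 0.90625
```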

  17. Inferring Temperature Inversions in Hot Jupiters Via Spitzer Emission Spectroscopy

    NASA Astrophysics Data System (ADS)

    Garhart, Emily; Deming, Drake; Mandell, Avi

    2016-10-01

    We present a systematic study of 35 hot Jupiter secondary eclipses, including 16 hot Jupiters never before characterized via emission, observed at the 3.6 μm and 4.5 μm bandpasses of Warm Spitzer in order to classify their atmospheric structure, namely, the existence of temperature inversions. This is a robust study in that these planets orbit stars with a wide range of compositions, temperatures, and activity levels. This diverse sample allows us to investigate the source of planetary temperature inversions, specifically, its correlation with stellar irradiance and magnetic activity. We correct for systematic and intra-pixel sensitivity effects with a pixel level decorrelation (PLD) method described in Deming et al. (2015). The relationship between eclipse depths and a best-fit blackbody function versus stellar activity, a method described in Knutson et al. (2010), will ultimately enable us to appraise the current hypotheses of temperature inversions.

  18. The application of total quality management principles to spacecraft mission operations

    NASA Astrophysics Data System (ADS)

    Sweetin, Maury

    1993-03-01

    By now, the philosophies of Total Quality Management have had an impact on every aspect of American industrial life. The trail-blazing work of Deming, Juran, and Crosby, first implemented in Japan, has 're-migrated' across the Pacific and now plays a growing role in America's management culture. While initially considered suited only for a manufacturing environment, TQM has moved rapidly into the 'service' areas of offices, sales forces, and even fast-food restaurants. The next logical step has also been taken - TQM has found its way into virtually all departments of the Federal Government, including NASA. Because of this widespread success, it seems fair to ask whether this new discipline is directly applicable to the profession of spacecraft operations. The results of quality emphasis on OAO Corporation's contract at JPL provide strong support for Total Quality Management as a useful tool in spacecraft operations.

  19. The quality march. National survey profiles quality improvement activities.

    PubMed

    1993-12-05

    This nationwide profile of CQI/TQM adopters and non-adopters provides important baseline information with which to chart the growing involvement of hospitals with formal quality improvement efforts. Using a stringent definition, the findings suggest rather widespread adoption of CQI/TQM, although most of it has been very recent. Further, there are systematic differences by bed size, teaching orientation, and system membership. Though the Deming method is the most popular approach to CQI/TQM, nearly as many hospitals report using a combination of approaches, and approximately 22 percent report that they have not selected any specific approach. Of particular note is the finding that those involved with CQI/TQM activities perceive fewer barriers to their quality improvement efforts than those not involved. The impact of these differences on perceived costs and outcomes will be addressed in the next issue of Hospitals & Health Networks.

  20. The quality management journey: the progress of health facilities in Australia.

    PubMed

    Carr, B J

    1994-12-01

    Many health facilities in Australia have taken the Total Quality Management (TQM) step. The objective of this study was to examine the progress of formal quality systems adopted in health care. Sixty per cent of the organizations surveyed have adopted formal systems. Of these, Deming adherents are the most common, followed by eclectic approaches. Only 35% considered the transition to quality reasonably easy. No relationship between accreditation and formal quality systems was identified. The most common improvement techniques were flow charts, histograms, and cause-and-effect diagrams. Quality practitioners prefer to use a few tools exceptionally well rather than have many tools at their disposal. The greatest impediment to the adoption of quality was lack of top management support. This study did not support the view that clinicians fail to actively support quality initiatives. Total Quality Management is not yet a mature concept; however, Chief Executive Officers are assured that rewards will be realized over time.

  1. Initial evaluation of the geologic applications of ERTS-1 imagery for New Mexico

    NASA Technical Reports Server (NTRS)

    Vonderlinden, K.; Kottlowski, F. E.

    1973-01-01

    Coverage of approximately one-third of the test site, the state of New Mexico, had been received by January 31, 1973; all of the images received were MSS products. Features noted during visual inspection of 9 1/2 x 9 1/2 inch prints include major structural forms, vegetation patterns, drainage patterns, and outcrops of geologic formations having marked color contrasts. The Border Hills Structural Zone and the Y-O Structural Zone are prominently reflected in coverage of the Pecos Valley. A study of available maps and remote sensing material covering the Deming-Columbus area indicated that the limit of detection and the resolution of MSS products are not as good as those of aerial photographs, geologic maps, and manned-satellite photographs. The limit of detection of high-contrast features on MSS prints is approximately 1000 feet (300 meters) for linear features and about 18 acres for roughly circular areas.

  2. The application of total quality management principles to spacecraft mission operations

    NASA Technical Reports Server (NTRS)

    Sweetin, Maury

    1993-01-01

    By now, the philosophies of Total Quality Management have had an impact on every aspect of American industrial life. The trail-blazing work of Deming, Juran, and Crosby, first implemented in Japan, has 're-migrated' across the Pacific and now plays a growing role in America's management culture. While initially considered suited only for a manufacturing environment, TQM has moved rapidly into the 'service' areas of offices, sales forces, and even fast-food restaurants. The next logical step has also been taken - TQM has found its way into virtually all departments of the Federal Government, including NASA. Because of this widespread success, it seems fair to ask whether this new discipline is directly applicable to the profession of spacecraft operations. The results of quality emphasis on OAO Corporation's contract at JPL provide strong support for Total Quality Management as a useful tool in spacecraft operations.

  3. Measuring the Infrared Spectrum of the Transiting Extrasolar Planet HD 209458b

    NASA Astrophysics Data System (ADS)

    Richardson, L. Jeremy; Cho, James; Deming, Drake; Hansen, Brad; Harrington, Joseph; Menou, Kristen; Seager, Sara

    2005-06-01

    Researchers from two independent groups recently detected the first infrared signal from an extrasolar planet. Deming et al. (2005a) detected the 24-micron flux density of HD 209458b using MIPS at secondary eclipse, and Charbonneau et al. (2005) detected the infrared signal of TrES-1 using IRAC at 4.5 and 8 microns. These results have dramatically demonstrated the ability of Spitzer to characterize extrasolar planets. We propose to build on these observations with IRS spectroscopy of HD 209458b from 7.4 to 14.5 microns. By observing the system both during and outside of secondary eclipse, we will derive the planetary spectrum from the change in the shape of the continuum spectrum in combined light. These observations will lead directly to a measurement of the temperature gradient in the planetary atmosphere and the column density of water above the clouds, and we will search for variability due to atmospheric dynamics.

  4. Quality improvement in basic histotechnology: the lean approach.

    PubMed

    Clark, David

    2016-01-01

    Lean is a comprehensive system of management based on the Toyota production system (TPS), encompassing all the activities of an organization. It focuses management activity on creating value for the end-user by continuously improving operational effectiveness and removing waste. Lean management creates a culture of continuous quality improvement with a strong emphasis on developing the problem-solving capability of staff using the scientific method (Deming's Plan, Do, Check, Act cycle). Lean management systems have been adopted by a number of histopathology departments throughout the world to simultaneously improve quality (reducing errors and shortening turnround times) and lower costs (by increasing efficiency). This article describes the key concepts that make up a lean management system, and how these concepts have been adapted from manufacturing industry and applied to histopathology using a case study of lean implementation and evidence from the literature. It discusses the benefits, limitations, and pitfalls encountered when implementing lean management systems.

  5. Patterns of medicinal plant use: an examination of the Ecuadorian Shuar medicinal flora using contingency table and binomial analyses.

    PubMed

    Bennett, Bradley C; Husby, Chad E

    2008-03-28

    Botanical pharmacopoeias are non-random subsets of floras, with some taxonomic groups over- or under-represented. Moerman [Moerman, D.E., 1979. Symbols and selectivity: a statistical analysis of Native American medical ethnobotany, Journal of Ethnopharmacology 1, 111-119] introduced linear regression/residual analysis to examine these patterns. However, regression, the commonly employed analysis, suffers from several statistical flaws. We use contingency table and binomial analyses to examine patterns of Shuar medicinal plant use (from Amazonian Ecuador). We first analyzed the Shuar data using Moerman's approach, modified to better meet the requirements of linear regression analysis. Second, we assessed the exact randomization contingency table test for goodness of fit. Third, we developed a binomial model to test for non-random selection of plants in individual families. Modified regression models (which accommodated the assumptions of linear regression) reduced R(2) from 0.59 to 0.38, but did not eliminate all problems associated with regression analyses. Contingency table analyses revealed that the entire flora departs from the null model of equal proportions of medicinal plants in all families. In the binomial analysis, only 10 angiosperm families (of 115) differed significantly from the null model. These 10 families are largely responsible for patterns seen at higher taxonomic levels. Contingency table and binomial analyses offer an easy and statistically valid alternative to the regression approach.
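
    The two alternatives the authors advocate are straightforward to run in standard software. A hedged sketch in Python (SciPy, not the authors' own implementation) with invented family counts:

```python
from scipy.stats import binomtest, chi2_contingency

# Hypothetical (medicinal, non-medicinal) species counts per family
families = {
    "Asteraceae": (40, 160),
    "Solanaceae": (25, 35),
    "Poaceae":    (5, 195),
}
total_med = sum(m for m, _ in families.values())
total_all = sum(m + n for m, n in families.values())
p_flora = total_med / total_all  # flora-wide medicinal proportion (null model)

# Contingency table test: do medicinal proportions differ across families?
table = [[m, n] for m, n in families.values()]
chi2, p_table, dof, _expected = chi2_contingency(table)

# Binomial test per family against the flora-wide proportion
family_pvalues = {
    name: binomtest(m, m + n, p_flora).pvalue
    for name, (m, n) in families.items()
}
```

Families whose binomial p-value falls below the chosen significance level are the over- or under-used ones driving the flora-wide departure.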

  6. The Precision Efficacy Analysis for Regression Sample Size Method.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Barcikowski, Robert S.

    The general purpose of this study was to examine the efficiency of the Precision Efficacy Analysis for Regression (PEAR) method for choosing appropriate sample sizes in regression studies used for precision. The PEAR method, which is based on the algebraic manipulation of an accepted cross-validity formula, essentially uses an effect size to…

  7. Effect of Contact Damage on the Strength of Ceramic Materials.

    DTIC Science & Technology

    1982-10-01

    Dimensional analysis identifies the variables that are important to erosion, and a multivariate linear regression analysis is used to fit the data to the dimensional analysis. The exponents of Equations 7 and 8 are estimated by a multivariable regression analysis of the room-temperature data, with standard errors reported for the computed regression exponents.

  8. Common pitfalls in statistical analysis: Linear regression analysis

    PubMed Central

    Aggarwal, Rakesh; Ranganathan, Priya

    2017-01-01

    In a previous article in this series, we explained correlation analysis which describes the strength of relationship between two continuous variables. In this article, we deal with linear regression analysis which predicts the value of one continuous variable from another. We also discuss the assumptions and pitfalls associated with this analysis. PMID:28447022

  9. Quality of life in breast cancer patients--a quantile regression analysis.

    PubMed

    Pourhoseingholi, Mohamad Amin; Safaee, Azadeh; Moghimi-Dehkordi, Bijan; Zeighami, Bahram; Faghihzadeh, Soghrat; Tabatabaee, Hamid Reza; Pourhoseingholi, Asma

    2008-01-01

    Quality of life studies have an important role in health care, especially in chronic diseases, in clinical judgment, and in the allocation of medical resources. Statistical tools like linear regression are widely used to assess the predictors of quality of life, but when the response is not normally distributed the results are misleading. The aim of this study was to determine the predictors of quality of life in breast cancer patients using a quantile regression model and to compare it with linear regression. A cross-sectional study was conducted on 119 breast cancer patients admitted and treated in the chemotherapy ward of Namazi hospital in Shiraz. We used the QLQ-C30 questionnaire to assess quality of life in these patients. Quantile regression was employed to assess the associated factors, and the results were compared with linear regression. All analyses were carried out using SAS. The mean score for global health status for breast cancer patients was 64.92+/-11.42. Linear regression showed that only grade of tumor, occupational status, menopausal status, financial difficulties, and dyspnea were statistically significant. In contrast to linear regression, financial difficulties were not significant in the quantile regression analysis, and dyspnea was significant only for the first quartile. Emotional functioning and duration of disease also statistically predicted the QOL score in the third quartile. The results demonstrate that using quantile regression leads to better interpretation and richer inference about predictors of breast cancer patients' quality of life.
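
    Fitting separate quartiles of a skewed outcome, as the study does, can be sketched as follows. The variable names, effect sizes, and simulated skew are illustrative only, and statsmodels stands in for SAS, which the authors used:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 119  # sample size matching the study
df = pd.DataFrame({"dyspnea": rng.integers(0, 4, n).astype(float)})
# Skewed (non-normal) QOL scores: linear predictor plus right-skewed gamma noise
df["qol"] = 65 - 3 * df["dyspnea"] + rng.gamma(2, 5, n)

# Mean model (ordinary least squares) versus conditional-quartile models
ols = smf.ols("qol ~ dyspnea", df).fit()
q25 = smf.quantreg("qol ~ dyspnea", df).fit(q=0.25)  # first quartile
q75 = smf.quantreg("qol ~ dyspnea", df).fit(q=0.75)  # third quartile
```

Comparing the `dyspnea` coefficient across `ols`, `q25`, and `q75` shows how a predictor can matter in one part of the outcome distribution but not another, which is the study's central point.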

  10. The microcomputer scientific software series 2: general linear model--regression.

    Treesearch

    Harold M. Rauscher

    1983-01-01

    The general linear model regression (GLMR) program provides the microcomputer user with a sophisticated regression analysis capability. The output provides a regression ANOVA table, estimators of the regression model coefficients, their confidence intervals, confidence intervals around the predicted Y-values, residuals for plotting, a check for multicollinearity, a...

  11. USAF (United States Air Force) Stability and Control DATCOM (Data Compendium)

    DTIC Science & Technology

    1978-04-01

    regression analysis involves the study of a group of variables to determine their effect on a given parameter. Because of the empirical nature of this...regression analysis of mathematical statistics. In general, a regression analysis involves the study of a group of variables to determine their effect on a...Excperiment, OSR TN 58-114, MIT Fluid Dynamics Research Group Rapt. 57-5, 1957. (U) 90. Kennet, H., and Ashley, H.: Review of Unsteady Aerodynamic Studies in

  12. A Method of Calculating Functional Independence Measure at Discharge from Functional Independence Measure Effectiveness Predicted by Multiple Regression Analysis Has a High Degree of Predictive Accuracy.

    PubMed

    Tokunaga, Makoto; Watanabe, Susumu; Sonoda, Shigeru

    2017-09-01

    Multiple linear regression analysis is often used to predict the outcome of stroke rehabilitation. However, the predictive accuracy may not be satisfactory. The objective of this study was to elucidate the predictive accuracy of a method of calculating motor Functional Independence Measure (mFIM) at discharge from mFIM effectiveness predicted by multiple regression analysis. The subjects were 505 patients with stroke who were hospitalized in a convalescent rehabilitation hospital. The formula "mFIM at discharge = mFIM effectiveness × (91 points - mFIM at admission) + mFIM at admission" was used. By including the predicted mFIM effectiveness obtained through multiple regression analysis in this formula, we obtained the predicted mFIM at discharge (A). We also used multiple regression analysis to directly predict mFIM at discharge (B). The correlation between the predicted and the measured values of mFIM at discharge was compared between A and B. The correlation coefficients were .916 for A and .878 for B. Calculating mFIM at discharge from mFIM effectiveness predicted by multiple regression analysis had a higher degree of predictive accuracy than predicting mFIM at discharge directly.
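
    The conversion formula quoted in the abstract is simple to transcribe directly, with 91 points as the mFIM ceiling:

```python
def mfim_at_discharge(mfim_admission: float, predicted_effectiveness: float,
                      ceiling: float = 91.0) -> float:
    """mFIM at discharge = mFIM effectiveness x (91 - mFIM at admission) + mFIM at admission."""
    return predicted_effectiveness * (ceiling - mfim_admission) + mfim_admission

# Illustrative values only: admission score 40, predicted effectiveness 0.6
print(mfim_at_discharge(40.0, 0.6))  # → 70.6
```

The predicted effectiveness plugged in here is itself the output of the study's multiple regression model; the effectiveness scaling keeps the prediction bounded by the 91-point ceiling.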

  13. Use of Multiple Regression and Use-Availability Analyses in Determining Habitat Selection by Gray Squirrels (Sciurus Carolinensis)

    Treesearch

    John W. Edwards; Susan C. Loeb; David C. Guynn

    1994-01-01

    Multiple regression and use-availability analyses are two methods for examining habitat selection. Use-availability analysis is commonly used to evaluate macrohabitat selection whereas multiple regression analysis can be used to determine microhabitat selection. We compared these techniques using behavioral observations (n = 5534) and telemetry locations (n = 2089) of...

  14. A simple linear regression method for quantitative trait loci linkage analysis with censored observations.

    PubMed

    Anderson, Carl A; McRae, Allan F; Visscher, Peter M

    2006-07-01

    Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.

  15. Latest Pleistocene and Holocene glacier fluctuations on Mount Baker, Washington

    NASA Astrophysics Data System (ADS)

    Osborn, Gerald; Menounos, Brian; Ryane, Chanone; Riedel, Jon; Clague, John J.; Koch, Johannes; Clark, Douglas; Scott, Kevin; Davis, P. Thompson

    2012-08-01

    Glaciers on stratovolcanoes of the Pacific Northwest of North America offer opportunities for dating late Pleistocene and Holocene glacier advances because tephra and fossil wood are common in lateral moraines and in glacier forefields. We capitalize on this opportunity by examining the Holocene glacial record at Mount Baker, an active stratovolcano in northwest Washington. Earlier workers concluded that glaciers on Mount Baker during the early Holocene were more extensive than during the Little Ice Age and hypothesized that the explanation lay in unusual climatic or hypsometric effects peculiar to large volcanoes. We show that the main argument for an early Holocene glacier advance on Mount Baker, namely the absence of ca 10,000-year-old tephra on part of the south flank of the mountain, is incorrect. Moreover, a lake-sediment core indicates that a small cirque moraine previously thought to be of early Holocene age is also likely older than the tephra and consequently of late Pleistocene age. Lateral and end moraines and wood mats ca 2 km downvalley of the present snout of Deming Glacier indicate that an advance during the Younger Dryas interval was little more extensive than the climactic Little Ice Age advance. Tephra and wood between tills in the left lateral moraine of Easton Glacier suggest that ice on Mount Baker was restricted in the early Holocene and that Neoglaciation began ca 6 ka. A series of progressively more extensive Neoglacial advances, dated to about 2.2, 1.6, 0.9, and 0.4 ka, is recorded by stacked tills in the right lateral moraine of Deming Glacier. Intervening retreats were long enough to allow establishment of forests on the moraine. Wood mats in moraines of Coleman and Easton glaciers indicate that Little Ice Age expansion began before 0.7 ka and was followed by retreat and a readvance ca 0.5 ka. Tree-ring and lichen data indicate glaciers on the south side of the mountain reached their maximum extents in the mid-1800s.
The similarity between glacier fluctuations at Mount Baker and those elsewhere in the Cascades and in British Columbia suggests a coherent history of Holocene climate change over a broad area of the western Cordillera. We found no evidence that glaciers on stratovolcanoes behave differently than glaciers elsewhere.

  16. [Application of SAS macro to evaluated multiplicative and additive interaction in logistic and Cox regression in clinical practices].

    PubMed

    Nie, Z Q; Ou, Y Q; Zhuang, J; Qu, Y J; Mai, J Z; Chen, J M; Liu, X Q

    2016-05-01

    Conditional logistic regression analysis and unconditional logistic regression analysis are commonly used in case-control studies, whereas the Cox proportional hazards model is often used in survival data analysis. Most of the literature refers only to main-effect models; however, generalized linear models differ from general linear models, and interaction comprises multiplicative interaction and additive interaction. The former is only of statistical significance, but the latter has biological significance. In this paper, macros were written using SAS 9.4, and the contrast ratio, attributable proportion due to interaction, and synergy index were calculated along with the interaction terms of the logistic and Cox regressions; Wald, delta, and profile-likelihood confidence intervals were used to evaluate additive interaction, for reference in big-data analysis in clinical epidemiology and in the analysis of genetic multiplicative and additive interactions.

  17. Prediction by regression and intrarange data scatter in surface-process studies

    USGS Publications Warehouse

    Toy, T.J.; Osterkamp, W.R.; Renard, K.G.

    1993-01-01

    Modeling is a major component of contemporary earth science, and regression analysis occupies a central position in the parameterization, calibration, and validation of geomorphic and hydrologic models. Although this methodology can be used in many ways, we are primarily concerned with the prediction of values for one variable from another variable. Examination of the literature reveals considerable inconsistency in the presentation of the results of regression analysis and the occurrence of patterns in the scatter of data points about the regression line. Both circumstances confound utilization and evaluation of the models. Statisticians are well aware of various problems associated with the use of regression analysis and offer improved practices; often, however, their guidelines are not followed. After a review of the aforementioned circumstances and until standard criteria for model evaluation become established, we recommend, as a minimum, inclusion of scatter diagrams, the standard error of the estimate, and sample size in reporting the results of regression analyses for most surface-process studies. © 1993 Springer-Verlag.
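
    The standard error of the estimate that the authors recommend reporting measures residual spread about the fitted line. A minimal sketch with simulated data (the variable names and values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50
x = rng.uniform(0, 10, n)
y = 2.0 + 0.5 * x + rng.normal(0, 1, n)  # true noise sd = 1

# Simple linear regression by least squares
slope, intercept = np.polyfit(x, y, 1)
resid = y - (intercept + slope * x)

# Standard error of the estimate: residual sd with n - 2 degrees of freedom
see = np.sqrt(np.sum(resid**2) / (n - 2))
```

Reporting `see` together with the sample size `n` and a scatter diagram of `(x, y)` with the fitted line is exactly the minimum the authors call for.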

  18. Quantile regression for the statistical analysis of immunological data with many non-detects.

    PubMed

    Eilers, Paul H C; Röder, Esther; Savelkoul, Huub F J; van Wijk, Roy Gerth

    2012-07-07

    Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techniques currently available for the analysis of datasets with non-detects can only be used if a small percentage of the data are non-detects. Quantile regression, a generalization of percentiles to regression models, models the median or higher percentiles and tolerates very high numbers of non-detects. We present a non-technical introduction and illustrate it with an implementation to real data from a clinical trial. We show that by using quantile regression, groups can be compared and that meaningful linear trends can be computed, even if more than half of the data consists of non-detects. Quantile regression is a valuable addition to the statistical methods that can be used for the analysis of immunological datasets with non-detects.
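
    The tolerance of quantile regression for non-detects can be sketched as follows. The simulated concentrations, dose-response relationship, and detection limit are invented for illustration; the key point is that a percentile lying above the detection limit is estimable regardless of the exact values of the censored observations:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 200
dose = rng.uniform(0, 5, n)
# Log-normal-style concentrations with a dose effect
true_conc = np.exp(0.5 + 0.4 * dose + rng.normal(0, 0.8, n))

LOD = 3.0  # detection limit; everything below it is a non-detect
observed = true_conc.copy()
observed[true_conc < LOD] = LOD  # substitute non-detects with the LOD

df = pd.DataFrame({"conc": observed, "dose": dose})

# The 75th conditional percentile remains estimable even with many non-detects
fit = smf.quantreg("conc ~ dose", df).fit(q=0.75)
```

Unlike the mean in ANOVA or ordinary regression, the fitted quantile depends on censored points only through their being below the line, so substituting any value at or below the LOD leaves it unchanged.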

  19. Noninvasive spectral imaging of skin chromophores based on multiple regression analysis aided by Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Nishidate, Izumi; Wiswadarma, Aditya; Hase, Yota; Tanaka, Noriyuki; Maeda, Takaaki; Niizeki, Kyuichi; Aizu, Yoshihisa

    2011-08-01

    In order to visualize melanin and blood concentrations and oxygen saturation in human skin tissue, a simple imaging technique based on multispectral diffuse reflectance images acquired at six wavelengths (500, 520, 540, 560, 580, and 600 nm) was developed. The technique utilizes multiple regression analysis aided by Monte Carlo simulation of diffuse reflectance spectra. Using the absorbance spectrum as a response variable and the extinction coefficients of melanin, oxygenated hemoglobin, and deoxygenated hemoglobin as predictor variables, multiple regression analysis provides regression coefficients. Concentrations of melanin and total blood are then determined from the regression coefficients using conversion vectors that are deduced numerically in advance, while oxygen saturation is obtained directly from the regression coefficients. Experiments with a tissue-like agar gel phantom validated the method. In vivo experiments with skin of the human hand during upper limb occlusion and of the inner forearm exposed to UV irradiation demonstrated the ability of the method to evaluate physiological reactions of human skin tissue.
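
    The regression step can be sketched in a few lines: absorbance at the six wavelengths is the response, and the chromophore extinction spectra are the predictors. The spectra and coefficients below are invented placeholders, not real extinction coefficients:

```python
import numpy as np

wavelengths = np.array([500, 520, 540, 560, 580, 600])  # nm
rng = np.random.default_rng(5)

# Hypothetical extinction-coefficient spectra (predictor variables)
eps_melanin = np.linspace(1.0, 0.4, 6)
eps_hbo2 = np.array([0.6, 0.9, 1.1, 0.8, 1.0, 0.3])  # oxygenated hemoglobin
eps_hb = np.array([0.7, 0.8, 0.9, 1.0, 0.9, 0.5])    # deoxygenated hemoglobin
E = np.column_stack([eps_melanin, eps_hbo2, eps_hb, np.ones(6)])  # + baseline

# Synthetic absorbance spectrum (response variable) with measurement noise
true_coef = np.array([0.8, 0.5, 0.3, 0.1])
absorbance = E @ true_coef + rng.normal(0, 0.01, 6)

# Multiple regression: per-chromophore regression coefficients by least squares
coef, *_ = np.linalg.lstsq(E, absorbance, rcond=None)
```

In the paper these regression coefficients are then mapped to concentrations via conversion vectors precomputed from Monte Carlo simulation; that mapping is omitted here.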

  20. CADDIS Volume 4. Data Analysis: PECBO Appendix - R Scripts for Non-Parametric Regressions

    EPA Pesticide Factsheets

    Script for computing nonparametric regression analysis. Overview of using scripts to infer environmental conditions from biological observations, statistically estimating species-environment relationships, statistical scripts.
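
    The CADDIS scripts themselves are written in R; an equivalent nonparametric (locally weighted) regression can be sketched in Python with statsmodels:

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(2)
# Hypothetical species response along an environmental gradient
x = np.sort(rng.uniform(0, 10, 200))
y = np.sin(x) + rng.normal(0, 0.3, 200)

# Locally weighted regression; frac controls the smoothing window width
smoothed = lowess(y, x, frac=0.3, return_sorted=True)
x_fit, y_fit = smoothed[:, 0], smoothed[:, 1]
```

The smoothed curve estimates the species-environment relationship without assuming a parametric form, which is the purpose the factsheet's R scripts serve.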

  1. Clinical evaluation of a novel population-based regression analysis for detecting glaucomatous visual field progression.

    PubMed

    Kovalska, M P; Bürki, E; Schoetzau, A; Orguel, S F; Orguel, S; Grieshaber, M C

    2011-04-01

    The distinction of real progression from test variability in visual field (VF) series may be based on clinical judgment, on trend analysis based on follow-up of test parameters over time, or on identification of a significant change related to the mean of baseline exams (event analysis). The aim of this study was to compare a new population-based method (Octopus field analysis, OFA) with classic regression analyses and clinical judgment for detecting glaucomatous VF changes. 240 VF series of 240 patients with at least 9 consecutive examinations available were included into this study. They were independently classified by two experienced investigators. The results of such a classification served as a reference for comparison for the following statistical tests: (a) t-test global, (b) r-test global, (c) regression analysis of 10 VF clusters and (d) point-wise linear regression analysis. 32.5 % of the VF series were classified as progressive by the investigators. The sensitivity and specificity were 89.7 % and 92.0 % for r-test, and 73.1 % and 93.8 % for the t-test, respectively. In the point-wise linear regression analysis, the specificity was comparable (89.5 % versus 92 %), but the sensitivity was clearly lower than in the r-test (22.4 % versus 89.7 %) at a significance level of p = 0.01. A regression analysis for the 10 VF clusters showed a markedly higher sensitivity for the r-test (37.7 %) than the t-test (14.1 %) at a similar specificity (88.3 % versus 93.8 %) for a significant trend (p = 0.005). In regard to the cluster distribution, the paracentral clusters and the superior nasal hemifield progressed most frequently. The population-based regression analysis seems to be superior to the trend analysis in detecting VF progression in glaucoma, and may eliminate the drawbacks of the event analysis. 
Further, it may assist the clinician in the evaluation of VF series and may allow better visualization of the correlation between function and structure via VF clusters.
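
    The cluster-wise trend analysis that the study compares can be sketched as follows, with invented sensitivity series and one artificially progressing cluster (SciPy stands in for the Octopus software):

```python
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(7)
n_exams = 9                      # at least 9 consecutive examinations, as in the study
t = np.arange(n_exams)           # follow-up visit index
# Hypothetical mean sensitivities (dB) for 10 VF clusters over the exams
clusters = 28 + rng.normal(0, 0.8, size=(10, n_exams))
clusters[3] -= 0.5 * t           # one cluster with a true progressive decline

# Trend analysis: flag clusters with a significant negative slope
alpha = 0.005                    # significance level used for clusters in the study
progressing = []
for i, series in enumerate(clusters):
    fit = linregress(t, series)
    if fit.slope < 0 and fit.pvalue < alpha:
        progressing.append(i)
```

Event- and population-based analyses (such as OFA) instead compare each exam against a baseline or a reference population, which is why their sensitivity/specificity trade-offs differ from this slope test.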

  2. Using "Excel" for White's Test--An Important Technique for Evaluating the Equality of Variance Assumption and Model Specification in a Regression Analysis

    ERIC Educational Resources Information Center

    Berenson, Mark L.

    2013-01-01

    There is consensus in the statistical literature that severe departures from its assumptions invalidate the use of regression modeling for purposes of inference. The assumptions of regression modeling are usually evaluated subjectively through visual, graphic displays in a residual analysis but such an approach, taken alone, may be insufficient…

  3. Rex fortran 4 system for combinatorial screening or conventional analysis of multivariate regressions

    Treesearch

    L.R. Grosenbaugh

    1967-01-01

    Describes an expansible computerized system that provides data needed in regression or covariance analysis of as many as 50 variables, 8 of which may be dependent. Alternatively, it can screen variously generated combinations of independent variables to find the regression with the smallest mean-squared-residual, which will be fitted if desired. The user can easily...

  4. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.

    PubMed

    Karabatsos, George

    2017-02-01

    Most of applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed from MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. 
This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected functionals and values of covariates. The software is illustrated through the BNP regression analysis of real data.

  5. Applications of statistics to medical science, III. Correlation and regression.

    PubMed

    Watanabe, Hiroshi

    2012-01-01

    In this third part of a series surveying medical statistics, the concepts of correlation and regression are reviewed. In particular, methods of linear regression and logistic regression are discussed. Arguments related to survival analysis will be made in a subsequent paper.

  6. Optimizing methods for linking cinematic features to fMRI data.

    PubMed

    Kauttonen, Janne; Hlushchuk, Yevhen; Tikka, Pia

    2015-04-15

One of the challenges of naturalistic neurosciences using movie-viewing experiments is how to interpret observed brain activations in relation to the multiplicity of time-locked stimulus features. As previous studies have shown less inter-subject synchronization across viewers of random video footage than of story-driven films, new methods need to be developed for the analysis of less story-driven content. To optimize the linkage between our fMRI data collected during viewing of a deliberately non-narrative silent film 'At Land' by Maya Deren (1944) and its annotated content, we combined the method of elastic-net regularization with model-driven linear regression and the well-established data-driven independent component analysis (ICA) and inter-subject correlation (ISC) methods. In the linear regression analysis, both IC and region-of-interest (ROI) time-series were fitted with time-series of a total of 36 binary-valued and one real-valued tactile annotation of film features. Elastic-net regularization and cross-validation were applied in the ordinary least-squares linear regression to avoid over-fitting due to the multicollinearity of regressors; the results were compared against both partial least-squares (PLS) regression and un-regularized full-model regression. A non-parametric permutation testing scheme was applied to evaluate the statistical significance of the regression. We found statistically significant correlation between the annotation model and 9 of the 40 ICs. Regression analysis was also repeated for a large set of cubic ROIs covering the grey matter. Both IC- and ROI-based regression analyses revealed activations in parietal and occipital regions, with additional smaller clusters in the frontal lobe. Furthermore, we found elastic-net based regression more sensitive than PLS and un-regularized regression, since it detected a larger number of significant ICs and ROIs. 
Along with the ISC ranking methods, our regression analysis proved to be a feasible method for ordering the ICs based on their functional relevance to the annotated cinematic features. The novelty of our method lies, in comparison to the hypothesis-driven manual pre-selection and observation of individual regressors biased by choice, in applying a data-driven approach to all content features simultaneously. We found the combination of regularized regression and ICA especially useful when analyzing fMRI data obtained using a non-narrative movie stimulus with a large set of complex and correlated features. Copyright © 2015. Published by Elsevier Inc.
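
    The regularized-regression step described above can be sketched in a few lines. This is a minimal illustration with synthetic data only: the 37 "annotation" regressors and the IC time-series stand-in are random numbers, not the study's data, and the hyperparameters are placeholders.

    ```python
    import numpy as np
    from sklearn.linear_model import ElasticNetCV

    rng = np.random.default_rng(0)

    # Synthetic stand-ins: 200 time points, 37 correlated "annotation" regressors
    n, p = 200, 37
    latent = rng.normal(size=(n, 5))
    X = latent @ rng.normal(size=(5, p)) + 0.5 * rng.normal(size=(n, p))  # multicollinear
    beta = np.zeros(p)
    beta[:4] = [1.5, -1.0, 0.8, 0.6]      # only a few features truly drive the signal
    y = X @ beta + rng.normal(size=n)     # stand-in for one IC/ROI time-series

    # Cross-validated elastic net picks the penalty strength automatically,
    # guarding against over-fitting under multicollinearity
    model = ElasticNetCV(l1_ratio=0.5, cv=5, random_state=0).fit(X, y)
    n_active = np.sum(model.coef_ != 0)
    print(f"alpha={model.alpha_:.4f}, active regressors={n_active}/{p}")
    ```

    The sparsity induced by the L1 part of the penalty is what allows a subset of the correlated regressors to be retained while the rest are zeroed out.
    
    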

  7. Off-Line Quality Control In Integrated Circuit Fabrication Using Experimental Design

    NASA Astrophysics Data System (ADS)

    Phadke, M. S.; Kackar, R. N.; Speeney, D. V.; Grieco, M. J.

    1987-04-01

Off-line quality control is a systematic method of optimizing production processes and product designs. It is widely used in Japan to produce high-quality products at low cost. The method was introduced to us by Professor Genichi Taguchi, who is a Deming-award winner and a former Director of the Japanese Academy of Quality. In this paper we will i) describe the off-line quality control method, and ii) document our efforts to optimize the process for forming contact windows in 3.5 μm CMOS circuits fabricated in the Murray Hill Integrated Circuit Design Capability Laboratory. In the fabrication of integrated circuits it is critically important to produce contact windows of size very near the target dimension. Windows which are too small or too large lead to loss of yield. The off-line quality control method has improved both the process quality and productivity. The variance of the window size has been reduced by a factor of four. Also, processing time for window photolithography has been substantially reduced. The key steps of off-line quality control are: i) Identify important manipulatable process factors and their potential working levels. ii) Perform fractional factorial experiments on the process using orthogonal array designs. iii) Analyze the resulting data to determine the optimum operating levels of the factors. Both the process mean and the process variance are considered in this analysis. iv) Conduct an additional experiment to verify that the new factor levels indeed give an improvement.
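
    The factorial-experiment analysis in step iii) can be sketched on a toy design. This is not the paper's data: the 2^3 full factorial below and the "window size" responses are made up purely to show how main effects are read off an orthogonal design (mean response at a factor's high level minus mean at its low level).

    ```python
    import numpy as np

    # Full 2^3 factorial: columns are coded levels (-1/+1) of three process factors.
    # Hypothetical responses standing in for contact-window-size measurements.
    design = np.array([[a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)])
    window_size = np.array([2.8, 3.1, 3.0, 3.4, 3.3, 3.6, 3.5, 4.0])  # microns, made up

    # Main effect of each factor = mean at high level minus mean at low level
    effects = {f"factor{j+1}": window_size[design[:, j] == 1].mean()
                             - window_size[design[:, j] == -1].mean()
               for j in range(3)}
    print(effects)
    ```

    Because the columns of an orthogonal design are uncorrelated, each main effect can be estimated independently this way; a variance analysis over replicated runs would address the process-variance half of the optimization.
    
    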

  8. The Population Structure of Glossina palpalis gambiensis from Island and Continental Locations in Coastal Guinea

    PubMed Central

    Solano, Philippe; Ravel, Sophie; Bouyer, Jeremy; Camara, Mamadou; Kagbadouno, Moise S.; Dyer, Naomi; Gardes, Laetitia; Herault, Damien; Donnelly, Martin J.; De Meeûs, Thierry

    2009-01-01

    Background We undertook a population genetics analysis of the tsetse fly Glossina palpalis gambiensis, a major vector of sleeping sickness in West Africa, using microsatellite and mitochondrial DNA markers. Our aims were to estimate effective population size and the degree of isolation between coastal sites on the mainland of Guinea and Loos Islands. The sampling locations encompassed Dubréka, the area with the highest Human African Trypanosomosis (HAT) prevalence in West Africa, mangrove and savannah sites on the mainland, and two islands, Fotoba and Kassa, within the Loos archipelago. These data are discussed with respect to the feasibility and sustainability of control strategies in those sites currently experiencing, or at risk of, sleeping sickness. Principal Findings We found very low migration rates between sites except between those sampled around the Dubréka area that seems to contain a widely dispersed and panmictic population. In the Kassa island samples, various effective population size estimates all converged on surprisingly small values (10

  9. INNOVATIVE INSTRUMENTATION AND ANALYSIS OF THE TEMPERATURE MEASUREMENT FOR HIGH TEMPERATURE GASIFICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seong W. Lee

During this reporting period, the literature survey, including the gasifier temperature measurement literature, the ultrasonic application and its background study in cleaning applications, and the spray coating process, was completed. The gasifier simulator (cold model) testing has been successfully conducted. Four factors (blower voltage, ultrasonic application, injection time intervals, particle weight) were considered as significant factors that affect the temperature measurement. Analysis of Variance (ANOVA) was applied to analyze the test data. The analysis shows that all four factors are significant to the temperature measurements in the gasifier simulator (cold model). The regression analysis for the case with the normalized room temperature shows that a linear model fits the temperature data with 82% accuracy (18% error). The regression analysis for the case without the normalized room temperature shows 72.5% accuracy (27.5% error). The nonlinear regression analysis indicates a better fit than that of the linear regression. The nonlinear regression model's accuracy is 88.7% (11.3% error) for the normalized room temperature case, which is better than that of the linear regression analysis. The hot model thermocouple sleeve design and fabrication are completed. The gasifier simulator (hot model) design and fabrication are completed. The system tests of the gasifier simulator (hot model) have been conducted and some modifications have been made. Based on the system tests and results analysis, the gasifier simulator (hot model) has met the proposed design requirements and is ready for system testing. The ultrasonic cleaning method is under evaluation and will be further studied for the gasifier simulator (hot model) application. The progress of this project has been on schedule.
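
    The linear-versus-nonlinear fit comparison the abstract reports can be illustrated generically. The data below are synthetic (a mildly curved temperature trend with noise), not the project's measurements; the point is only that an in-sample R^2 comparison will favor the nonlinear model when curvature is present.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical temperature readings with a mildly nonlinear trend
    x = np.linspace(0, 10, 60)
    y = 20 + 3.0 * x + 0.4 * x**2 + rng.normal(scale=2.0, size=x.size)

    def r_squared(y_obs, y_fit):
        ss_res = np.sum((y_obs - y_fit) ** 2)
        ss_tot = np.sum((y_obs - y_obs.mean()) ** 2)
        return 1.0 - ss_res / ss_tot

    lin = np.polyval(np.polyfit(x, y, 1), x)    # linear model
    quad = np.polyval(np.polyfit(x, y, 2), x)   # simple nonlinear (quadratic) model

    print(f"linear R^2={r_squared(y, lin):.3f}, quadratic R^2={r_squared(y, quad):.3f}")
    ```
    
    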

  10. The Economic Value of Mangroves: A Meta-Analysis

    Treesearch

    Marwa Salem; D. Evan Mercer

    2012-01-01

    This paper presents a synthesis of the mangrove ecosystem valuation literature through a meta-regression analysis. The main contribution of this study is that it is the first meta-analysis focusing solely on mangrove forests, whereas previous studies have included different types of wetlands. The number of studies included in the regression analysis is 44 for a total...

  11. Estimation of diffusion coefficients from voltammetric signals by support vector and gaussian process regression

    PubMed Central

    2014-01-01

    Background Support vector regression (SVR) and Gaussian process regression (GPR) were used for the analysis of electroanalytical experimental data to estimate diffusion coefficients. Results For simulated cyclic voltammograms based on the EC, Eqr, and EqrC mechanisms these regression algorithms in combination with nonlinear kernel/covariance functions yielded diffusion coefficients with higher accuracy as compared to the standard approach of calculating diffusion coefficients relying on the Nicholson-Shain equation. The level of accuracy achieved by SVR and GPR is virtually independent of the rate constants governing the respective reaction steps. Further, the reduction of high-dimensional voltammetric signals by manual selection of typical voltammetric peak features decreased the performance of both regression algorithms compared to a reduction by downsampling or principal component analysis. After training on simulated data sets, diffusion coefficients were estimated by the regression algorithms for experimental data comprising voltammetric signals for three organometallic complexes. Conclusions Estimated diffusion coefficients closely matched the values determined by the parameter fitting method, but reduced the required computational time considerably for one of the reaction mechanisms. The automated processing of voltammograms according to the regression algorithms yields better results than the conventional analysis of peak-related data. PMID:24987463
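
    The two regression algorithms named in the abstract can be sketched side by side. Everything below is a stand-in: the "voltammogram features" are random summaries and the diffusion-coefficient relationship is invented, so the kernels and hyperparameters are illustrative choices only.

    ```python
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(2)

    # Hypothetical training set: feature vectors derived from simulated
    # voltammograms (here random summaries) mapped to diffusion coefficients.
    X = rng.uniform(-1, 1, size=(80, 3))
    D = 1e-5 * (1.0 + 0.5 * X[:, 0] - 0.3 * X[:, 1] ** 2)   # made-up relationship
    y = np.log10(D)                                         # regress on log-scale

    svr = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)
    gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-6).fit(X, y)

    x_new = np.array([[0.2, -0.1, 0.4]])
    print("SVR:", 10 ** svr.predict(x_new)[0], "GPR:", 10 ** gpr.predict(x_new)[0])
    ```

    Training on the log scale keeps the small diffusion-coefficient magnitudes numerically well-behaved, mirroring the common practice of estimating such parameters on a transformed scale.
    
    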

  12. Introduction to methodology of dose-response meta-analysis for binary outcome: With application on software.

    PubMed

    Zhang, Chao; Jia, Pengli; Yu, Liu; Xu, Chang

    2018-05-01

Dose-response meta-analysis (DRMA) is widely applied to investigate the dose-specific relationship between independent and dependent variables. Such methods have been in use for over 30 years and are increasingly employed in healthcare and clinical decision-making. In this article, we give an overview of the methodology used in DRMA. We summarize the commonly used regression models and the pooling methods in DRMA, and use an example to illustrate how to conduct a DRMA with these methods. Five regression models, namely linear regression, piecewise regression, natural polynomial regression, fractional polynomial regression, and restricted cubic spline regression, are illustrated in this article for fitting the dose-response relationship. Two types of pooling approaches, the one-stage approach and the two-stage approach, are illustrated for pooling the dose-response relationship across studies. The example showed similar results among these models. Several dose-response meta-analysis methods can be used for investigating the relationship between exposure level and the risk of an outcome; however, the methodology of DRMA still needs to be improved. © 2018 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.
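
    A stripped-down sketch of the two-stage approach with a linear trend: fit a within-study slope of log relative risk on dose, then pool the slopes by inverse-variance weighting. The three "studies" below are invented numbers, and the sketch deliberately ignores the correlation among log-RRs that share a reference group (which a full DRMA, e.g. the Greenland-Longnecker method, would account for).

    ```python
    import numpy as np

    # Hypothetical per-study data: dose levels, log relative risks, and their SEs.
    studies = [
        (np.array([0., 5., 10.]), np.array([0., 0.10, 0.22]), np.array([.05, .06, .07])),
        (np.array([0., 4., 8., 12.]), np.array([0., 0.07, 0.18, 0.25]), np.array([.04, .05, .06, .08])),
        (np.array([0., 6., 12.]), np.array([0., 0.15, 0.26]), np.array([.06, .07, .09])),
    ]

    # Stage 1: weighted least-squares slope within each study (no intercept,
    # since the reference dose has log-RR fixed at 0).
    slopes, variances = [], []
    for dose, logrr, se in studies:
        w = 1.0 / se[1:] ** 2              # reference level carries no information
        d, r = dose[1:], logrr[1:]
        slopes.append(np.sum(w * d * r) / np.sum(w * d * d))
        variances.append(1.0 / np.sum(w * d * d))

    # Stage 2: inverse-variance (fixed-effect) pooling of the study-specific slopes
    wts = 1.0 / np.asarray(variances)
    pooled = np.sum(wts * slopes) / np.sum(wts)
    print(f"pooled log-RR per unit dose: {pooled:.4f}")
    ```

    The one-stage approach would instead fit all studies jointly in a single (mixed) model rather than pooling study-level estimates.
    
    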

  13. Predictors of postoperative outcomes of cubital tunnel syndrome treatments using multiple logistic regression analysis.

    PubMed

    Suzuki, Taku; Iwamoto, Takuji; Shizu, Kanae; Suzuki, Katsuji; Yamada, Harumoto; Sato, Kazuki

    2017-05-01

This retrospective study was designed to investigate prognostic factors for postoperative outcomes of cubital tunnel syndrome (CubTS) using multiple logistic regression analysis with a large number of patients. Eighty-three patients with CubTS who underwent surgery were enrolled. The following potential prognostic factors for disease severity were selected according to previous reports: sex, age, type of surgery, disease duration, body mass index, cervical lesion, presence of diabetes mellitus, Workers' Compensation status, preoperative severity, and preoperative electrodiagnostic testing. Postoperative severity of disease was assessed 2 years after surgery by Messina's criteria, an outcome measure specific to CubTS. Bivariate analysis was performed to select candidate prognostic factors for multiple linear regression analyses. Multiple logistic regression analysis was conducted to identify the association between postoperative severity and selected prognostic factors. Both bivariate and multiple linear regression analysis revealed only preoperative severity as an independent risk factor for poor prognosis, while other factors did not show any significant association. Although conflicting results exist regarding the prognosis of CubTS, this study supports evidence from previous studies and concludes that early surgical intervention portends the most favorable prognosis. Copyright © 2017 The Japanese Orthopaedic Association. Published by Elsevier B.V. All rights reserved.
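
    The kind of multiple logistic regression used here can be sketched on synthetic data. The predictors below (age, disease duration, a 1-3 preoperative severity grade) only loosely mirror the factors screened in the study; the cohort, coefficients, and outcome are all invented, so the fitted odds ratios illustrate the mechanics, not the paper's findings.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)

    # Synthetic cohort with a binary "poor outcome" driven mainly by severity
    n = 300
    age = rng.normal(60, 10, n)
    duration = rng.exponential(12, n)
    preop_severity = rng.integers(1, 4, n)          # grades 1-3
    logit = -6.0 + 0.02 * age + 0.01 * duration + 1.5 * preop_severity
    poor_outcome = rng.random(n) < 1 / (1 + np.exp(-logit))

    X = np.column_stack([age, duration, preop_severity])
    # Large C makes the fit effectively unpenalized (near maximum likelihood)
    fit = LogisticRegression(C=1e6, max_iter=2000).fit(X, poor_outcome)
    odds_ratios = np.exp(fit.coef_[0])
    print(dict(zip(["age", "duration", "preop_severity"], odds_ratios.round(2))))
    ```

    Exponentiating the coefficients turns them into odds ratios, the scale on which such prognostic-factor studies usually report associations.
    
    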

  14. Drug treatment rates with beta-blockers and ACE-inhibitors/angiotensin receptor blockers and recurrences in takotsubo cardiomyopathy: A meta-regression analysis.

    PubMed

    Brunetti, Natale Daniele; Santoro, Francesco; De Gennaro, Luisa; Correale, Michele; Gaglione, Antonio; Di Biase, Matteo

    2016-07-01

In a recent paper Singh et al. analyzed the effect of drug treatment on recurrence of takotsubo cardiomyopathy (TTC) in a comprehensive meta-analysis. The study found that recurrence rates were independent of clinical utilization of BB prescription, but inversely correlated with ACEi/ARB prescription; the authors therefore conclude that ACEi/ARB rather than BB may reduce risk of recurrence. We aimed to re-analyze the data reported in the study, this time weighted for population size, in a meta-regression analysis. After multiple meta-regression analysis, we found a significant regression between rates of prescription of ACEi and rates of recurrence of TTC; the regression was not statistically significant for BBs. On the basis of our re-analysis, we confirm that rates of recurrence of TTC are lower in populations of patients with higher rates of treatment with ACEi/ARB. That does not necessarily imply that ACEi may prevent recurrence of TTC, but merely that, for example, rates of recurrence are lower in cohorts more compliant with therapy or more prescribed with ACEi because more carefully followed. Randomized prospective studies are surely warranted. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  15. A general framework for the use of logistic regression models in meta-analysis.

    PubMed

    Simmonds, Mark C; Higgins, Julian Pt

    2016-12-01

    Where individual participant data are available for every randomised trial in a meta-analysis of dichotomous event outcomes, "one-stage" random-effects logistic regression models have been proposed as a way to analyse these data. Such models can also be used even when individual participant data are not available and we have only summary contingency table data. One benefit of this one-stage regression model over conventional meta-analysis methods is that it maximises the correct binomial likelihood for the data and so does not require the common assumption that effect estimates are normally distributed. A second benefit of using this model is that it may be applied, with only minor modification, in a range of meta-analytic scenarios, including meta-regression, network meta-analyses and meta-analyses of diagnostic test accuracy. This single model can potentially replace the variety of often complex methods used in these areas. This paper considers, with a range of meta-analysis examples, how random-effects logistic regression models may be used in a number of different types of meta-analyses. This one-stage approach is compared with widely used meta-analysis methods including Bayesian network meta-analysis and the bivariate and hierarchical summary receiver operating characteristic (ROC) models for meta-analyses of diagnostic test accuracy. © The Author(s) 2014.
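
    The one-stage model on summary contingency-table data can be sketched in its simplest, fixed-effect form: a binomial logistic regression of events/totals on trial dummies plus a treatment indicator, maximizing the exact binomial likelihood rather than assuming normal effect estimates. The four trials below are invented, and the random effects that make the model "random-effects" are omitted here for brevity.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Hypothetical summary 2x2 data from 4 trials: (control, treatment) pairs
    events = np.array([12, 20, 9, 30, 8, 15, 6, 11])
    totals = np.array([100, 98, 80, 85, 60, 62, 75, 70])
    treat = np.array([0, 1, 0, 1, 0, 1, 0, 1])            # arm indicator
    trial = np.repeat([0, 1, 2, 3], 2)

    # One-stage fixed-effect logistic model: trial dummies + common treatment
    # effect, fitted on the binomial likelihood of the summary counts.
    X = np.column_stack([np.eye(4)[trial], treat])
    endog = np.column_stack([events, totals - events])     # successes, failures
    fit = sm.GLM(endog, X, family=sm.families.Binomial()).fit()
    log_or = fit.params[-1]
    print(f"pooled log odds ratio: {log_or:.3f}")
    ```

    Replacing the fixed trial dummies and common treatment effect with random effects recovers the one-stage random-effects model the paper discusses.
    
    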

  16. Examination of influential observations in penalized spline regression

    NASA Astrophysics Data System (ADS)

    Türkan, Semra

    2013-10-01

In parametric or nonparametric regression models, the results of regression analysis are affected by anomalous observations in the data set. Thus, detection of these observations is one of the major steps in regression analysis. Such observations are detected by well-known influence measures, one of which is Pena's statistic. In this study, Pena's approach is formulated for penalized spline regression in terms of ordinary residuals and leverages. Real and artificial data are used to illustrate the effectiveness of Pena's statistic relative to Cook's distance in detecting influential observations. The results of the study clearly reveal that the proposed measure is superior to Cook's distance in detecting these observations in large data sets.
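
    As a baseline for the comparison above, Cook's distance for ordinary linear regression can be computed directly from residuals and leverages, which are exactly the ingredients the abstract says Pena's statistic reuses. The data below are synthetic, with one planted outlier.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Small regression data set with one planted anomalous observation
    x = np.linspace(0, 10, 30)
    y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=x.size)
    y[15] += 4.0                                   # the outlier

    X = np.column_stack([np.ones_like(x), x])      # design matrix with intercept
    H = X @ np.linalg.inv(X.T @ X) @ X.T           # hat (leverage) matrix
    e = y - H @ y                                  # ordinary residuals
    p = X.shape[1]
    s2 = e @ e / (len(y) - p)                      # residual variance estimate
    h = np.diag(H)

    # Cook's distance for each observation
    cooks = e**2 * h / (p * s2 * (1 - h) ** 2)
    print("most influential index:", int(np.argmax(cooks)))
    ```

    For penalized spline regression the hat matrix is replaced by the smoother matrix, which is the direction in which Pena's statistic is extended in the paper.
    
    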

  17. Robust analysis of trends in noisy tokamak confinement data using geodesic least squares regression

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verdoolaege, G., E-mail: geert.verdoolaege@ugent.be; Laboratory for Plasma Physics, Royal Military Academy, B-1000 Brussels; Shabbir, A.

Regression analysis is a very common activity in fusion science for unveiling trends and parametric dependencies, but it can be a difficult matter. We have recently developed the method of geodesic least squares (GLS) regression that is able to handle errors in all variables, is robust against data outliers and uncertainty in the regression model, and can be used with arbitrary distribution models and regression functions. We here report on first results of application of GLS to estimation of the multi-machine scaling law for the energy confinement time in tokamaks, demonstrating improved consistency of the GLS results compared to standard least squares.

  18. Retro-regression--another important multivariate regression improvement.

    PubMed

    Randić, M

    2001-01-01

    We review the serious problem associated with instabilities of the coefficients of regression equations, referred to as the MRA (multivariate regression analysis) "nightmare of the first kind". This is manifested when in a stepwise regression a descriptor is included or excluded from a regression. The consequence is an unpredictable change of the coefficients of the descriptors that remain in the regression equation. We follow with consideration of an even more serious problem, referred to as the MRA "nightmare of the second kind", arising when optimal descriptors are selected from a large pool of descriptors. This process typically causes at different steps of the stepwise regression a replacement of several previously used descriptors by new ones. We describe a procedure that resolves these difficulties. The approach is illustrated on boiling points of nonanes which are considered (1) by using an ordered connectivity basis; (2) by using an ordering resulting from application of greedy algorithm; and (3) by using an ordering derived from an exhaustive search for optimal descriptors. A novel variant of multiple regression analysis, called retro-regression (RR), is outlined showing how it resolves the ambiguities associated with both "nightmares" of the first and the second kind of MRA.

  19. Regression analysis for solving diagnosis problem of children's health

    NASA Astrophysics Data System (ADS)

    Cherkashina, Yu A.; Gerget, O. M.

    2016-04-01

This paper presents the results of research devoted to the application of statistical techniques, namely regression analysis, to assess the health status of children in the neonatal period based on medical data (hemostatic parameters, parameters of blood tests, gestational age, vascular-endothelial growth factor) measured at 3-5 days of life. A detailed description of the studied medical data is given, and a binary logistic regression procedure is discussed. The basic results of the research are presented: a classification table of predicted versus observed values is shown, and the overall percentage of correct recognition is determined. Regression equation coefficients are calculated, and the general regression equation is written based on them. Based on the results of the logistic regression, ROC analysis was performed; the sensitivity and specificity of the model are calculated and ROC curves are constructed. These mathematical techniques allow diagnostics of children's health with a high quality of recognition. The results make a significant contribution to the development of evidence-based medicine and have high practical importance in the professional activity of the author.
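
    The pipeline described, binary logistic regression followed by ROC analysis, can be sketched on synthetic data. The two "markers" below are generic stand-ins, not the study's hemostatic or growth-factor variables, and the in-sample AUC shown is purely illustrative.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(5)

    # Synthetic neonatal-style data: two hypothetical continuous markers and a
    # binary health-status outcome driven by them.
    n = 250
    marker1 = rng.normal(0, 1, n)     # e.g. a standardized blood-test parameter
    marker2 = rng.normal(0, 1, n)     # e.g. standardized gestational age
    logit = -0.3 + 1.2 * marker1 - 0.8 * marker2
    status = rng.random(n) < 1 / (1 + np.exp(-logit))

    X = np.column_stack([marker1, marker2])
    model = LogisticRegression().fit(X, status)
    auc = roc_auc_score(status, model.predict_proba(X)[:, 1])
    print(f"in-sample ROC AUC: {auc:.3f}")
    ```

    Sensitivity and specificity at any chosen probability cut-off follow from the same predicted probabilities used for the AUC.
    
    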

  20. Regression analysis using dependent Polya trees.

    PubMed

    Schörgendorfer, Angela; Branscum, Adam J

    2013-11-30

    Many commonly used models for linear regression analysis force overly simplistic shape and scale constraints on the residual structure of data. We propose a semiparametric Bayesian model for regression analysis that produces data-driven inference by using a new type of dependent Polya tree prior to model arbitrary residual distributions that are allowed to evolve across increasing levels of an ordinal covariate (e.g., time, in repeated measurement studies). By modeling residual distributions at consecutive covariate levels or time points using separate, but dependent Polya tree priors, distributional information is pooled while allowing for broad pliability to accommodate many types of changing residual distributions. We can use the proposed dependent residual structure in a wide range of regression settings, including fixed-effects and mixed-effects linear and nonlinear models for cross-sectional, prospective, and repeated measurement data. A simulation study illustrates the flexibility of our novel semiparametric regression model to accurately capture evolving residual distributions. In an application to immune development data on immunoglobulin G antibodies in children, our new model outperforms several contemporary semiparametric regression models based on a predictive model selection criterion. Copyright © 2013 John Wiley & Sons, Ltd.

  1. A comparison of methods for the analysis of binomial clustered outcomes in behavioral research.

    PubMed

    Ferrari, Alberto; Comelli, Mario

    2016-12-01

In behavioral research, data consisting of a per-subject proportion of "successes" and "failures" over a finite number of trials often arise. Such clustered binary data are usually non-normally distributed, which can distort inference if the usual general linear model is applied and sample size is small. A number of more advanced methods are available, but they are often technically challenging, and a comparative assessment of their performance in behavioral setups has not been performed. We studied the performance of some methods applicable to the analysis of proportions, namely linear regression, Poisson regression, beta-binomial regression and Generalized Linear Mixed Models (GLMMs). We report on a simulation study evaluating power and Type I error rate of these models in hypothetical scenarios met by behavioral researchers; in addition, we describe results from the application of these methods on data from real experiments. Our results show that, while GLMMs are powerful instruments for the analysis of clustered binary outcomes, beta-binomial regression can outperform them in a range of scenarios. Linear regression gave results consistent with the nominal level of significance, but was overall less powerful. Poisson regression, instead, mostly led to anticonservative inference. GLMMs and beta-binomial regression are generally more powerful than linear regression; yet linear regression is robust to model misspecification in some conditions, whereas Poisson regression suffers heavily from violations of the assumptions when used to model proportion data. We conclude by providing directions for behavioral scientists dealing with clustered binary data and small sample sizes. Copyright © 2016 Elsevier B.V. All rights reserved.
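
    The core problem driving these comparisons is overdispersion: clustered binary counts vary more than a plain binomial model allows, which is why models that ignore clustering (e.g. Poisson on proportions) become anticonservative. A minimal simulation with invented beta-binomial parameters makes the point:

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    # Clustered binary outcomes: each subject has their own success probability
    # (beta-distributed), then a finite number of trials -> beta-binomial counts.
    subjects, trials = 500, 20
    p_i = rng.beta(4, 6, size=subjects)          # subject-level probabilities
    successes = rng.binomial(trials, p_i)

    p_hat = successes.mean() / trials
    binomial_var = trials * p_hat * (1 - p_hat)  # variance if data were plain binomial
    observed_var = successes.var(ddof=1)

    print(f"observed/binomial variance ratio: {observed_var / binomial_var:.2f}")
    ```

    A ratio well above 1 is the overdispersion that beta-binomial regression models explicitly and that GLMMs absorb through random effects.
    
    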

  2. Regression analysis of informative current status data with the additive hazards model.

    PubMed

    Zhao, Shishun; Hu, Tao; Ma, Ling; Wang, Peijie; Sun, Jianguo

    2015-04-01

    This paper discusses regression analysis of current status failure time data arising from the additive hazards model in the presence of informative censoring. Many methods have been developed for regression analysis of current status data under various regression models if the censoring is noninformative, and also there exists a large literature on parametric analysis of informative current status data in the context of tumorgenicity experiments. In this paper, a semiparametric maximum likelihood estimation procedure is presented and in the method, the copula model is employed to describe the relationship between the failure time of interest and the censoring time. Furthermore, I-splines are used to approximate the nonparametric functions involved and the asymptotic consistency and normality of the proposed estimators are established. A simulation study is conducted and indicates that the proposed approach works well for practical situations. An illustrative example is also provided.

  3. Comparison of cranial sex determination by discriminant analysis and logistic regression.

    PubMed

    Amores-Ampuero, Anabel; Alemán, Inmaculada

    2016-04-05

    Various methods have been proposed for estimating dimorphism. The objective of this study was to compare sex determination results from cranial measurements using discriminant analysis or logistic regression. The study sample comprised 130 individuals (70 males) of known sex, age, and cause of death from San José cemetery in Granada (Spain). Measurements of 19 neurocranial dimensions and 11 splanchnocranial dimensions were subjected to discriminant analysis and logistic regression, and the percentages of correct classification were compared between the sex functions obtained with each method. The discriminant capacity of the selected variables was evaluated with a cross-validation procedure. The percentage accuracy with discriminant analysis was 78.2% for the neurocranium (82.4% in females and 74.6% in males) and 73.7% for the splanchnocranium (79.6% in females and 68.8% in males). These percentages were higher with logistic regression analysis: 85.7% for the neurocranium (in both sexes) and 94.1% for the splanchnocranium (100% in females and 91.7% in males).
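
    The kind of comparison the abstract reports can be sketched generically: fit both a linear discriminant and a logistic classifier to the same measurements and compare cross-validated correct-classification rates. The "craniometric" data below are synthetic (two groups with shifted means on five invented measurements), so the accuracies shown say nothing about the study's results.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(6)

    # Synthetic sample of 130 with ~70 "males", mirroring only the sample sizes
    n = 130
    male = rng.random(n) < 70 / 130
    shift = np.array([1.0, 0.6, 0.4, 0.3, 0.2])   # made-up dimorphism per variable
    X = rng.normal(size=(n, 5)) + np.outer(male.astype(float), shift)

    lda_acc = cross_val_score(LinearDiscriminantAnalysis(), X, male, cv=5).mean()
    log_acc = cross_val_score(LogisticRegression(max_iter=500), X, male, cv=5).mean()
    print(f"LDA: {lda_acc:.3f}, logistic: {log_acc:.3f}")
    ```

    Cross-validation plays the role of the study's procedure for checking that classification percentages are not inflated by evaluating on the training data.
    
    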

  4. Building Regression Models: The Importance of Graphics.

    ERIC Educational Resources Information Center

    Dunn, Richard

    1989-01-01

    Points out reasons for using graphical methods to teach simple and multiple regression analysis. Argues that a graphically oriented approach has considerable pedagogic advantages in the exposition of simple and multiple regression. Shows that graphical methods may play a central role in the process of building regression models. (Author/LS)

  5. Testing Different Model Building Procedures Using Multiple Regression.

    ERIC Educational Resources Information Center

    Thayer, Jerome D.

    The stepwise regression method of selecting predictors for computer assisted multiple regression analysis was compared with forward, backward, and best subsets regression, using 16 data sets. The results indicated the stepwise method was preferred because of its practical nature, when the models chosen by different selection methods were similar…

  6. [Regression on order statistics and its application in estimating nondetects for food exposure assessment].

    PubMed

    Yu, Xiaojin; Liu, Pei; Min, Jie; Chen, Qiguang

    2009-01-01

To explore the application of regression on order statistics (ROS) in estimating nondetects for food exposure assessment, ROS was applied in the analysis of a cadmium residual data set from global food contaminant monitoring; the mean residual was estimated using SAS programming and compared with the results from substitution methods. The results show that the ROS method clearly performs better than substitution methods, being robust and convenient for subsequent analysis. Regression on order statistics is worth adopting, but more effort should be made on the details of applying this method.
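
    A minimal ROS sketch on invented data: assuming lognormal residuals, regress the logs of the detected values on the normal quantiles of their plotting positions, extrapolate the fitted line below the detection limit to impute the nondetects, and compare the resulting mean with simple LOD/2 substitution. The detection limit, distribution parameters, and Blom plotting positions are illustrative choices.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)

    # Hypothetical lognormal cadmium residuals with a detection limit (LOD)
    true_vals = rng.lognormal(mean=-1.0, sigma=0.6, size=100)
    LOD = 0.25
    detected = true_vals[true_vals >= LOD]
    n_total, n_cens = true_vals.size, int(np.sum(true_vals < LOD))

    # ROS: regress log(detected values) on normal quantiles of their plotting
    # positions (detected values occupy the top ranks), then extrapolate the
    # line downward to impute the censored observations.
    order = np.sort(np.log(detected))
    pp = (np.arange(n_cens, n_total) + 1 - 0.375) / (n_total + 0.25)  # Blom positions
    slope, intercept, *_ = stats.linregress(stats.norm.ppf(pp), order)
    pp_cens = (np.arange(n_cens) + 1 - 0.375) / (n_total + 0.25)
    imputed = np.exp(intercept + slope * stats.norm.ppf(pp_cens))

    ros_mean = np.concatenate([imputed, detected]).mean()
    sub_mean = np.concatenate([np.full(n_cens, LOD / 2), detected]).mean()
    print(f"ROS mean: {ros_mean:.4f}, LOD/2 substitution mean: {sub_mean:.4f}")
    ```

    Unlike substitution, the imputed values depend on the shape of the detected distribution, which is what makes ROS more robust for subsequent analyses.
    
    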

  7. Prediction of unwanted pregnancies using logistic regression, probit regression and discriminant analysis

    PubMed Central

    Ebrahimzadeh, Farzad; Hajizadeh, Ebrahim; Vahabi, Nasim; Almasian, Mohammad; Bakhteyar, Katayoon

    2015-01-01

    Background: Unwanted pregnancy not intended by at least one of the parents has undesirable consequences for the family and the society. In the present study, three classification models were used and compared to predict unwanted pregnancies in an urban population. Methods: In this cross-sectional study, 887 pregnant mothers referring to health centers in Khorramabad, Iran, in 2012 were selected by the stratified and cluster sampling; relevant variables were measured and for prediction of unwanted pregnancy, logistic regression, discriminant analysis, and probit regression models and SPSS software version 21 were used. To compare these models, indicators such as sensitivity, specificity, the area under the ROC curve, and the percentage of correct predictions were used. Results: The prevalence of unwanted pregnancies was 25.3%. The logistic and probit regression models indicated that parity and pregnancy spacing, contraceptive methods, household income and number of living male children were related to unwanted pregnancy. The performance of the models based on the area under the ROC curve was 0.735, 0.733, and 0.680 for logistic regression, probit regression, and linear discriminant analysis, respectively. Conclusion: Given the relatively high prevalence of unwanted pregnancies in Khorramabad, it seems necessary to revise family planning programs. Despite the similar accuracy of the models, if the researcher is interested in the interpretability of the results, the use of the logistic regression model is recommended. PMID:26793655

  8. Prediction of unwanted pregnancies using logistic regression, probit regression and discriminant analysis.

    PubMed

    Ebrahimzadeh, Farzad; Hajizadeh, Ebrahim; Vahabi, Nasim; Almasian, Mohammad; Bakhteyar, Katayoon

    2015-01-01

    Unwanted pregnancy not intended by at least one of the parents has undesirable consequences for the family and the society. In the present study, three classification models were used and compared to predict unwanted pregnancies in an urban population. In this cross-sectional study, 887 pregnant mothers referring to health centers in Khorramabad, Iran, in 2012 were selected by the stratified and cluster sampling; relevant variables were measured and for prediction of unwanted pregnancy, logistic regression, discriminant analysis, and probit regression models and SPSS software version 21 were used. To compare these models, indicators such as sensitivity, specificity, the area under the ROC curve, and the percentage of correct predictions were used. The prevalence of unwanted pregnancies was 25.3%. The logistic and probit regression models indicated that parity and pregnancy spacing, contraceptive methods, household income and number of living male children were related to unwanted pregnancy. The performance of the models based on the area under the ROC curve was 0.735, 0.733, and 0.680 for logistic regression, probit regression, and linear discriminant analysis, respectively. Given the relatively high prevalence of unwanted pregnancies in Khorramabad, it seems necessary to revise family planning programs. Despite the similar accuracy of the models, if the researcher is interested in the interpretability of the results, the use of the logistic regression model is recommended.

  9. Regression Analysis of Physician Distribution to Identify Areas of Need: Some Preliminary Findings.

    ERIC Educational Resources Information Center

    Morgan, Bruce B.; And Others

    A regression analysis was conducted of factors that help to explain the variance in physician distribution and which identify those factors that influence the maldistribution of physicians. Models were developed for different geographic areas to determine the most appropriate unit of analysis for the Western Missouri Area Health Education Center…

  10. Criteria for the use of regression analysis for remote sensing of sediment and pollutants

    NASA Technical Reports Server (NTRS)

    Whitlock, C. H.; Kuo, C. Y.; Lecroy, S. R. (Principal Investigator)

    1982-01-01

    Data analysis procedures for the quantification of water quality parameters that are already identified and known to exist within the water body are considered. The linear multiple-regression technique was examined as a procedure for defining and calibrating data analysis algorithms for such instruments as spectrometers and multispectral scanners.
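
The calibration step described here, regressing a ground-truth water-quality measurement on sensor output, reduces to ordinary least squares. A minimal single-channel sketch with hypothetical radiance and sediment values (the actual instruments use multiple channels):

```python
def ols_fit(x, y):
    """Closed-form simple linear regression: returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

radiance = [1.0, 2.0, 3.0, 4.0]      # scanner digital counts (assumed)
sediment = [12.0, 22.0, 32.0, 42.0]  # suspended sediment, mg/L (assumed)
slope, intercept = ols_fit(radiance, sediment)
print(slope, intercept)  # 10.0 2.0
```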

  11. The Analysis of the Regression-Discontinuity Design in R

    ERIC Educational Resources Information Center

    Thoemmes, Felix; Liao, Wang; Jin, Ze

    2017-01-01

    This article describes the analysis of regression-discontinuity designs (RDDs) using the R packages rdd, rdrobust, and rddtools. We discuss similarities and differences between these packages and provide directions on how to use them effectively. We use real data from the Carolina Abecedarian Project to show how an analysis of an RDD can be…
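
The packages named above are R tools; as a language-neutral sketch of what a sharp-RDD estimate computes (not these packages' internals), fit a regression line on each side of the cutoff and take the jump between the two fitted values at the cutoff. The data here are synthetic with a known treatment effect of 3:

```python
def fit_line(pts):
    """Simple OLS on a list of (x, y) points: returns (slope, intercept)."""
    n = len(pts)
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    slope = (sum((x - mx) * (y - my) for x, y in pts)
             / sum((x - mx) ** 2 for x, _ in pts))
    return slope, my - slope * mx

def rdd_effect(data, cutoff=0.0):
    """Sharp-RDD estimate: jump in fitted values at the cutoff."""
    left = [(x, y) for x, y in data if x < cutoff]
    right = [(x, y) for x, y in data if x >= cutoff]
    (sl, il), (sr, ir) = fit_line(left), fit_line(right)
    return (ir + sr * cutoff) - (il + sl * cutoff)

# Outcome rises by 0.5 per unit of the running variable, plus a jump of 3.
data = [(x, 1 + 0.5 * x + (3 if x >= 0 else 0)) for x in range(-4, 5)]
print(rdd_effect(data))  # 3.0
```

Real analyses (as in rdrobust) restrict the fit to a bandwidth around the cutoff and use robust standard errors; this sketch keeps only the core idea.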

  12. Addressing the identification problem in age-period-cohort analysis: a tutorial on the use of partial least squares and principal components analysis.

    PubMed

    Tu, Yu-Kang; Krämer, Nicole; Lee, Wen-Chung

    2012-07-01

    In the analysis of trends in health outcomes, an ongoing issue is how to separate and estimate the effects of age, period, and cohort. As these 3 variables are perfectly collinear by definition, regression coefficients in a general linear model are not unique. In this tutorial, we review why identification is a problem, and how this problem may be tackled using partial least squares and principal components regression analyses. Both methods produce regression coefficients that fulfill the same collinearity constraint as the variables age, period, and cohort. We show that, because the constraint imposed by partial least squares and principal components regression is inherent in the mathematical relation among the 3 variables, this leads to more interpretable results. We use one dataset from a Taiwanese health-screening program to illustrate how to use partial least squares regression to analyze the trends in body heights with 3 continuous variables for age, period, and cohort. We then use another dataset of hepatocellular carcinoma mortality rates for Taiwanese men to illustrate how to use partial least squares regression to analyze tables with aggregated data. We use the second dataset to show the relation between the intrinsic estimator, a recently proposed method for the age-period-cohort analysis, and partial least squares regression. We also show that the inclusion of all indicator variables provides a more consistent approach. R code for our analyses is provided in the eAppendix.
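
The identification problem the tutorial addresses can be seen in a few lines: because cohort = period - age, the three variables satisfy age - period + cohort = 0 for every record, so shifting the coefficients along that exact linear dependency leaves every fitted value unchanged (numbers here are hypothetical):

```python
# Each row is (age, period, cohort) with cohort = period - age.
rows = [(20, 1990, 1970), (30, 1990, 1960), (40, 2000, 1960)]

def predict(row, b0, ba, bp, bc):
    a, p, c = row
    return b0 + ba * a + bp * p + bc * c

# Shifting (ba, bp, bc) by (+t, -t, +t) moves along the null direction
# age - period + cohort = 0, so predictions are identical for any t.
t = 5.0
for row in rows:
    assert abs(predict(row, 1.0, 0.2, 0.3, 0.1)
               - predict(row, 1.0, 0.2 + t, 0.3 - t, 0.1 + t)) < 1e-9
print("fitted values identical under the shift")
```

Partial least squares and principal components regression resolve this by picking the one coefficient vector that satisfies the same collinearity constraint as the data, which is why the tutorial argues their results are more interpretable.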

  13. MODELING SNAKE MICROHABITAT FROM RADIOTELEMETRY STUDIES USING POLYTOMOUS LOGISTIC REGRESSION

    EPA Science Inventory

    Multivariate analysis of snake microhabitat has historically used techniques that were derived under assumptions of normality and common covariance structure (e.g., discriminant function analysis, MANOVA). In this study, polytomous logistic regression (PLR which does not require ...
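
Polytomous (multinomial) logistic regression replaces the binary logit with a softmax over per-class linear scores, and requires no normality or common-covariance assumption. A minimal sketch with hypothetical habitat classes and coefficients:

```python
import math

def softmax_probs(x, weights):
    """P(class k | features x); weights is one (intercept, coefs...) row
    per habitat class (hypothetical values here)."""
    scores = [w[0] + sum(wj * xj for wj, xj in zip(w[1:], x)) for w in weights]
    m = max(scores)                      # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    tot = sum(exps)
    return [e / tot for e in exps]

# Three hypothetical microhabitat classes, two features (e.g. cover, slope).
probs = softmax_probs([1.0, 2.0],
                      [(0.0, 0.5, -0.2), (0.3, -0.1, 0.4), (0.0, 0.0, 0.0)])
print(probs)  # three probabilities summing to 1
```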

  14. Estimating regression coefficients from clustered samples: Sampling errors and optimum sample allocation

    NASA Technical Reports Server (NTRS)

    Kalton, G.

    1983-01-01

    A number of surveys have been conducted to study the relationship between the level of aircraft or traffic noise exposure experienced by people living in a particular area and their annoyance with it. These surveys generally employ a clustered sample design, which affects the precision of the survey estimates. Regression analysis of annoyance on noise measures and other variables is often an important component of the survey analysis. Formulae are presented for estimating the standard errors of regression coefficients and ratios of regression coefficients that are applicable with a two- or three-stage clustered sample design. Using a simple cost function, the optimum allocation of the sample across the stages of the sample design is also determined for the estimation of a regression coefficient.
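
The quantities involved build on standard survey-sampling results (Kish); the following is a sketch of those classic formulae, not necessarily the paper's exact expressions: the design effect for a two-stage clustered sample, and the cluster size that minimizes variance under a simple cost function with per-cluster cost c1 and per-interview cost c2:

```python
import math

def design_effect(b, roh):
    """Variance inflation for mean cluster size b and intraclass
    correlation roh: deff = 1 + (b - 1) * roh."""
    return 1 + (b - 1) * roh

def optimal_cluster_size(c1, c2, roh):
    """Cost-optimal b for total cost C = c1*m + c2*m*b:
    b_opt = sqrt((c1/c2) * (1 - roh)/roh)."""
    return math.sqrt((c1 / c2) * (1 - roh) / roh)

print(round(design_effect(10, 0.05), 3))            # 1.45
print(round(optimal_cluster_size(100, 4, 0.05), 1))  # 21.8
```

With roh = 0.05, interviewing 10 people per cluster inflates the variance of a mean by 45% relative to simple random sampling; the paper's contribution is the analogous machinery for regression coefficients.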

  15. Regression Model for Light Weight and Crashworthiness Enhancement Design of Automotive Parts in Frontal Car Crash

    NASA Astrophysics Data System (ADS)

    Bae, Gihyun; Huh, Hoon; Park, Sungho

    This paper deals with a regression model for light-weight and crashworthiness-enhancement design of automotive parts in a frontal car crash. The ULSAB-AVC model is employed for the crash analysis, and effective parts are selected based on the amount of energy absorbed during the crash. Finite element analyses are carried out for the designated design cases in order to investigate the crashworthiness and weight according to the material and thickness of the main energy-absorbing parts. Based on the simulation results, a regression analysis is performed to construct a regression model for light-weight and crashworthiness-enhancement design of automotive parts. An example of weight reduction of the main energy-absorbing parts demonstrates the validity of the constructed regression model.
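
Once fitted, such a surrogate regression model is inverted to trade weight against crashworthiness without rerunning the finite element solver. A toy sketch with hypothetical linear coefficients (not the ULSAB-AVC model): absorbed energy E(t) = a + b*t as a function of panel thickness t, and mass proportional to t.

```python
a, b = 2.0, 6.0    # hypothetical surrogate: E in kJ, thickness t in mm
rho = 1.8          # hypothetical mass per mm of thickness, kg/mm
target = 14.0      # required absorbed energy, kJ

def lightest_thickness(a, b, target):
    """Invert the (linear) surrogate to find the thinnest gauge that
    still meets the energy target."""
    return (target - a) / b

t = lightest_thickness(a, b, target)
print(t, rho * t)  # 2.0 mm of thickness, 3.6 kg of mass
```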

  16. Logistic regression analysis of factors associated with avascular necrosis of the femoral head following femoral neck fractures in middle-aged and elderly patients.

    PubMed

    Ai, Zi-Sheng; Gao, You-Shui; Sun, Yuan; Liu, Yue; Zhang, Chang-Qing; Jiang, Cheng-Hua

    2013-03-01

    Risk factors for femoral neck fracture-induced avascular necrosis of the femoral head have not been clearly elucidated in middle-aged and elderly patients. Moreover, the high incidence of screw removal in China and its effect on the fate of the involved femoral head require statistical methods that reflect their intrinsic relationship. Ninety-nine patients older than 45 years with femoral neck fracture were treated by internal fixation between May 1999 and April 2004. Descriptive analysis, interaction analysis between associated factors, single-factor logistic regression, multivariate logistic regression, and detailed interaction analysis were employed to explore potential relationships among associated factors. Avascular necrosis of the femoral head was found in 15 cases (15.2%). Age × the status of implants (removal vs. maintenance) and gender × the timing of reduction were interactive according to the two-factor interaction analysis. Age, the displacement of fractures, the quality of reduction, and the status of implants were significant factors in single-factor logistic regression analysis. Age, age × the status of implants, and the quality of reduction were significant factors in multivariate logistic regression analysis. In the detailed interaction analysis after multivariate logistic regression, implant removal was the most important risk factor for avascular necrosis in 56-to-85-year-old patients, with a risk ratio of 26.00 (95% CI = 3.076-219.747). Middle-aged and elderly patients have a lower incidence of avascular necrosis of the femoral head following femoral neck fractures treated by cannulated screws. The removal of cannulated screws can induce a significantly higher incidence of avascular necrosis of the femoral head in elderly patients, while a high-quality reduction helps to reduce avascular necrosis.
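
The stratified risk statements above reduce to odds ratios from 2x2 tables within subgroups. A sketch using Woolf's logit method for the confidence interval, with hypothetical counts rather than the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d):
    """OR and Woolf 95% CI for a 2x2 table:
    a,b = events/non-events in the exposed group (e.g. implant removed),
    c,d = events/non-events in the unexposed group (implant retained)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = or_ * math.exp(-1.96 * se)
    hi = or_ * math.exp(1.96 * se)
    return or_, lo, hi

print(odds_ratio_ci(8, 20, 2, 60))  # OR = 12.0 with a wide CI
```

Wide intervals like the study's 3.076-219.747 arise exactly this way: small cell counts dominate the standard error of log(OR).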

  17. Performance of a completely automated system for monitoring CMV DNA in plasma.

    PubMed

    Mengelle, C; Sandres-Sauné, K; Mansuy, J-M; Haslé, C; Boineau, J; Izopet, J

    2016-06-01

    Completely automated systems for monitoring CMV-DNA in plasma samples are now available. The aim was to evaluate the analytical and clinical performance of the VERIS™/MDx System CMV Assay(®). Analytical performance was assessed using quantified quality controls. Clinical performance was assessed by comparison with the COBAS(®) Ampliprep™/COBAS(®) Taqman CMV test using 169 plasma samples that had tested positive with the in-house technique in whole blood. The specificity of the VERIS™/MDx System CMV Assay(®) was 99% [CI 95%: 97.7-100]. Intra-assay reproducibilities were 0.03, 0.04, 0.05 and 0.04 log10IU/ml (means 2.78, 3.70, 4.64 and 5.60 log10IU/ml) for expected values of 2.70, 3.70, 4.70 and 5.70 log10IU/ml. The inter-assay reproducibilities were 0.12 and 0.08 (means 6.30 and 2.85 log10IU/ml) for expected values of 6.28 and 2.80 log10IU/ml. The lower limit of detection was 14.6IU/ml, and the assay was linear from 2.34 to 5.58 log10IU/ml. The results for the positive samples were concordant (r=0.71, p<0.0001; slope of the Deming regression 0.79 [CI 95%: 0.56-1.57] and y-intercept 0.79 [CI 95%: 0.63-0.95]). The VERIS™/MDx System CMV Assay(®) detected 18 more positive samples than the COBAS(®) Ampliprep™/COBAS(®) Taqman CMV test, and its mean virus load was higher (by 0.41 log10IU/ml). Patient monitoring of 68 samples collected from 17 immunosuppressed patients showed similar trends between the two assays. As a secondary question, virus loads detected by the VERIS™/MDx System CMV Assay(®) were compared with those of the in-house procedure on whole blood. The results were similar between the two assays (-0.09 log10IU/ml), as were the patient monitoring trends. The performance of the VERIS™/MDx System CMV Assay(®) supports its routine use for monitoring CMV-DNA loads in plasma samples.
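
Deming regression, used above to compare the two assays, allows measurement error in both variables (unlike OLS, which assumes the x-axis assay is error-free); its slope has a closed form in the sample variances and covariance. A minimal sketch, where lam is the assumed ratio of the two assays' error variances (lam = 1 gives orthogonal regression):

```python
import math

def deming(x, y, lam=1.0):
    """Deming regression slope and intercept; lam = var_err(y)/var_err(x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x) / (n - 1)
    syy = sum((b - my) ** 2 for b in y) / (n - 1)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)
    slope = ((syy - lam * sxx
              + math.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2))
             / (2 * sxy))
    return slope, my - slope * mx

# Two assays that agree exactly (y = x) recover slope ≈ 1, intercept ≈ 0,
# the ideal against which the reported 0.79 slope is judged.
s, i = deming([2.3, 3.1, 4.6, 5.5], [2.3, 3.1, 4.6, 5.5])
```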

  18. Two Paradoxes in Linear Regression Analysis.

    PubMed

    Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong

    2016-12-25

    Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.
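
One way the informal selection procedures criticized above go wrong can be reproduced in a toy simulation: screen many pure-noise predictors, keep the one most correlated with the outcome, and the selected predictor looks sizeable even though every true correlation is zero. This is an illustrative sketch, not the authors' simulation:

```python
import random
random.seed(1)

def corr(x, y):
    """Pearson correlation of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x)
           * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

n = 30
y = [random.gauss(0, 1) for _ in range(n)]
# Screen 50 independent noise predictors; keep the best |correlation|.
best = max(abs(corr([random.gauss(0, 1) for _ in range(n)], y))
           for _ in range(50))
print(best)  # every true correlation is 0, yet the winner looks real
```

A formal procedure would account for this selection step, e.g. by penalizing or validating on held-out data; naively reporting the winner's p-value is the misuse the paper warns against.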

  19. Multiple Linear Regression Analysis of Factors Affecting Real Property Price Index From Case Study Research In Istanbul/Turkey

    NASA Astrophysics Data System (ADS)

    Denli, H. H.; Koc, Z.

    2015-12-01

    Valuation of real properties based on fixed standards is difficult to apply consistently across time and location. Regression analysis constructs mathematical models that describe or explain relationships that may exist between variables. The problem of identifying price differences among properties to obtain a price index can be converted into a regression problem, and standard techniques of regression analysis can be used to estimate the index. Applying regression analysis to real estate valuation, using properties offered on the current market together with their characteristics and quantifiers, helps to find the factors or variables that are effective in the formation of value. In this study, prices of housing for sale in Zeytinburnu, a district in Istanbul, are associated with their characteristics to derive a price index, based on information obtained from a real estate web page. The variables used in the analysis are age, size in m2, the number of floors of the building, the floor on which the property is located, and the number of rooms. The price of the property is the dependent variable, and the rest are independent variables. Prices of 60 properties were used for the analysis. Locations with equal prices were identified and plotted on the map, and equivalence curves were drawn to delineate zones of equal value.
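
The hedonic model described, price regressed on property characteristics, is ordinary least squares with several predictors, solvable via the normal equations X'X b = X'y. A sketch on synthetic data with hypothetical coefficients (price = 50 + 3*size - 2*age), small enough to solve with hand-coded Gaussian elimination:

```python
def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for A x = b."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(n):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [a - f * c for a, c in zip(M[r], M[i])]
    return [M[i][n] / M[i][i] for i in range(n)]

def ols(X, y):
    """Multiple linear regression via the normal equations X'X b = X'y."""
    Xt = list(zip(*X))
    XtX = [[sum(a * b for a, b in zip(r, c)) for c in Xt] for r in Xt]
    Xty = [sum(a * b for a, b in zip(r, y)) for r in Xt]
    return solve(XtX, Xty)

# Columns: intercept, size in m2, age in years (synthetic, exact data).
X = [[1, 60, 5], [1, 80, 10], [1, 100, 2], [1, 120, 20]]
y = [50 + 3 * s - 2 * a for _, s, a in X]
print([round(b, 6) for b in ols(X, y)])  # ≈ [50.0, 3.0, -2.0]
```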

  20. A Bayesian goodness of fit test and semiparametric generalization of logistic regression with measurement data.

    PubMed

    Schörgendorfer, Angela; Branscum, Adam J; Hanson, Timothy E

    2013-06-01

    Logistic regression is a popular tool for risk analysis in medical and population health science. With continuous response data, it is common to create a dichotomous outcome for logistic regression analysis by specifying a threshold for positivity. Fitting a linear regression to the nondichotomized response variable assuming a logistic sampling model for the data has been empirically shown to yield more efficient estimates of odds ratios than ordinary logistic regression of the dichotomized endpoint. We illustrate that risk inference is not robust to departures from the parametric logistic distribution. Moreover, the model assumption of proportional odds is generally not satisfied when the condition of a logistic distribution for the data is violated, leading to biased inference from a parametric logistic analysis. We develop novel Bayesian semiparametric methodology for testing goodness of fit of parametric logistic regression with continuous measurement data. The testing procedures hold for any cutoff threshold and our approach simultaneously provides the ability to perform semiparametric risk estimation. Bayes factors are calculated using the Savage-Dickey ratio for testing the null hypothesis of logistic regression versus a semiparametric generalization. We propose a fully Bayesian and a computationally efficient empirical Bayesian approach to testing, and we present methods for semiparametric estimation of risks, relative risks, and odds ratios when parametric logistic regression fails. Theoretical results establish the consistency of the empirical Bayes test. Results from simulated data show that the proposed approach provides accurate inference irrespective of whether parametric assumptions hold or not. Evaluation of risk factors for obesity shows that different inferences are derived from an analysis of a real data set when deviations from a logistic distribution are permissible in a flexible semiparametric framework. 
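
The proportional-odds property the authors test follows directly from the logistic sampling model: if the continuous response Y is Logistic(b0 + b1*x, scale), then P(Y > t | x) is itself a logistic function of x, and moving the dichotomization threshold t shifts only the intercept, never the slope. A sketch with hypothetical parameters:

```python
import math

def risk(x, b0, b1, scale, t):
    """P(Y > t | x) when Y ~ Logistic(location=b0 + b1*x, scale)."""
    return 1.0 / (1.0 + math.exp((t - b0 - b1 * x) / scale))

def logit(p):
    return math.log(p / (1 - p))

# The log-odds gap between two thresholds is the same at every x, which is
# exactly the proportional-odds property; it fails when the true error
# distribution is not logistic.
gap_at_0 = (logit(risk(0.0, 0.0, 2.0, 1.0, 0.0))
            - logit(risk(0.0, 0.0, 2.0, 1.0, 1.0)))
gap_at_3 = (logit(risk(3.0, 0.0, 2.0, 1.0, 0.0))
            - logit(risk(3.0, 0.0, 2.0, 1.0, 1.0)))
print(round(gap_at_0, 6), round(gap_at_3, 6))  # equal gaps: 1.0 1.0
```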
