Pope, Larry M.; Diaz, A.M.
1982-01-01
Quality-of-water data, collected October 21-23, 1980, and a statistical summary are presented for 42 coal-mined strip pits in Crawford and Cherokee Counties, southeastern Kansas. The statistical summary includes minimum and maximum observed values, mean, and standard deviation. Simple linear regression equations relating specific conductance, dissolved solids, and acidity to concentrations of dissolved solids, sulfate, calcium, magnesium, potassium, aluminum, and iron are also presented. (USGS)
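The simple linear regressions described above can be sketched as an ordinary least-squares fit. The code below is a minimal illustration; the paired conductance and dissolved-solids values are invented for demonstration and are not from the report.

```python
# Hypothetical illustration: fit a simple linear regression of dissolved-solids
# concentration (mg/L) on specific conductance (microsiemens/cm), in the spirit
# of the regression equations described above. All values below are invented.

def linear_regression(x, y):
    """Ordinary least squares for y = a + b*x; returns (intercept, slope)."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept, slope

conductance = [250, 480, 720, 950, 1150]   # specific conductance
dissolved = [160, 310, 460, 610, 740]      # dissolved solids

a, b = linear_regression(conductance, dissolved)
print(f"dissolved_solids ~= {a:.1f} + {b:.3f} * conductance")
```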
Lambing, J.H.
1988-01-01
Water quality sampling was conducted at seven sites on the Clark Fork and selected tributaries from Deer Lodge to Missoula, Montana, from July 1986 through September 1987. This report presents tabulations and statistical summaries of the water quality data. The data presented in this report supplement previous data collected from March 1985 through June 1986 for six of the seven sites. Included in this report are tabulations of instantaneous values of streamflow, onsite water quality, hardness, and concentrations of trace elements and suspended sediment for periodic samples. Also included are tables and hydrographs of daily mean values for streamflow, suspended-sediment concentration, and suspended-sediment discharge at three mainstem stations and one tributary. Statistical summaries are presented for periodic water quality data collected from March 1986 through September 1987. Selected data are illustrated by graphs showing the relation of trace-element concentrations to suspended-sediment concentrations and median concentrations of trace elements in suspended sediment. (USGS)
Lambing, J.H.
1990-01-01
Water quality sampling was conducted at eight sites on the Clark Fork and selected tributaries from Galen to Missoula, Mont., from October 1988 through September 1989. This report presents tabulations and statistical summaries of the water quality data. Included are tabulations of streamflow, onsite water quality, and concentrations of trace elements and suspended sediment for periodic samples. Also included are tables and hydrographs of daily mean values for streamflow, suspended-sediment concentration, and suspended-sediment discharge at three mainstem stations and one tributary. Statistical summaries are presented for periodic water quality data collected from March 1985 through September 1989. Selected data are illustrated by graphs showing median concentrations of trace elements in water, the relation of trace-element concentrations to suspended-sediment concentrations, and median concentrations of trace elements in suspended sediment. (USGS)
Lambing, John H.
1989-01-01
Water quality sampling was conducted at eight sites on the Clark Fork and selected tributaries from Galen to Missoula, Mont., from October 1987 through September 1988. This report presents tabulations and statistical summaries of the water quality data. Included in this report are tabulations of streamflow, onsite water quality, and concentrations of trace elements and suspended sediment for periodic samples. Also included are tables and hydrographs of daily mean values for streamflow, suspended-sediment concentration, and suspended-sediment discharge at three mainstem stations and one tributary. Statistical summaries are presented for periodic water quality data collected from March 1985 through September 1988. Selected data are illustrated by graphs showing median concentrations of trace elements in water, the relation of trace-element concentrations to suspended-sediment concentrations, and median concentrations of trace elements in suspended sediment. (USGS)
Distribution of water quality parameters in Dhemaji district, Assam (India).
Buragohain, Mridul; Bhuyan, Bhabajit; Sarma, H P
2010-07-01
The primary objective of this study is to present a statistically significant water quality database of Dhemaji district, Assam (India), with special reference to pH, fluoride, nitrate, arsenic, iron, sodium and potassium. Twenty-five water samples collected from different locations in five development blocks of Dhemaji district have been studied separately. The implications presented are based on statistical analyses of the raw data. Normal distribution statistics and reliability analysis (correlation and covariance matrix) have been employed to find out the distribution pattern, localisation of data, and other related information. Statistical observations show that all the parameters under investigation exhibit a non-uniform distribution with a long asymmetric tail on either the right or left side of the median. The width of the third quartile was consistently found to be greater than that of the second quartile for each parameter. Differences among mean, mode and median, together with significant skewness and kurtosis values, indicate that the distribution of various water quality parameters in the study area is far from normal. Thus, the intrinsic water quality is not encouraging, owing to the asymmetric distribution of various water quality parameters in the study area.
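The quartile and mean-versus-median checks described above can be sketched with the standard library alone. The sample fluoride values below are invented for illustration, not data from the study.

```python
# Hedged sketch of the distribution diagnostics described: compare quartile
# widths and mean vs. median to flag asymmetry. Sample values are hypothetical.
import statistics

def quartiles(data):
    """Return [Q1, Q2, Q3] cut points for the data."""
    return statistics.quantiles(sorted(data), n=4)

fluoride = [0.2, 0.3, 0.3, 0.4, 0.5, 0.6, 0.9, 1.4, 2.1, 3.0]  # mg/L, invented
q1, q2, q3 = quartiles(fluoride)
mean = statistics.mean(fluoride)
median = statistics.median(fluoride)

# A third-quartile width exceeding the second-quartile width, together with
# mean > median, suggests a right-skewed (non-normal) distribution.
right_skewed = (q3 - q2) > (q2 - q1) and mean > median
print(q1, q2, q3, mean, median, right_skewed)
```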
Nebraska's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; Dacia M. Meneguzzo; Charles J. Barnett
2011-01-01
The first full annual inventory of Nebraska's forests was completed in 2005 after 8,335 plots were selected and 274 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Tables of various important resource statistics are presented. A detailed analysis of the inventory data is...
Kansas's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; W. Keith Moser; Charles J. Barnett
2011-01-01
The first full annual inventory of Kansas's forests was completed in 2005 after 8,868 plots were selected and 468 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of Kansas inventory is presented...
Statistical quality control through overall vibration analysis
NASA Astrophysics Data System (ADS)
Carnero, M. a. Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos
2010-05-01
The present study introduces the concept of statistical quality control in automotive wheel bearing manufacturing processes. Defects in the products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes typically occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, an evaluation of the quality results of the finished parts under different combinations of process variables is assessed. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley, and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control.
This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the results of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust sensitive process variables identified by ANOVA once the vibration readings exceed established quality limits.
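The classical ANOVA approach described can be sketched as a one-way F test across process settings. The code below is a simplified illustration, not the paper's full factorial design, and all vibration readings are invented.

```python
# Illustrative one-way ANOVA on overall vibration readings under different
# process settings (all numbers invented). This mirrors the classical
# analysis-of-variance approach described, without the full factorial design.

def one_way_anova(groups):
    """Return the F statistic for a one-way ANOVA across sample groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Vibration displacement readings (arbitrary units) at three wheel speeds
readings = [
    [2.1, 2.3, 2.2, 2.4],
    [2.9, 3.1, 3.0, 3.2],
    [2.2, 2.1, 2.3, 2.2],
]
f_stat = one_way_anova(readings)
print(f"F = {f_stat:.2f}")
```

A large F statistic relative to the critical value for (k-1, n-k) degrees of freedom indicates that the process setting affects the vibration level.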
Statistical approaches used to assess and redesign surface water-quality-monitoring networks.
Khalil, B; Ouarda, T B M J
2009-11-01
An up-to-date review of the statistical approaches utilized for the assessment and redesign of surface water quality monitoring (WQM) networks is presented. The main technical aspects of network design are covered in four sections, addressing monitoring objectives, water quality variables, sampling frequency and spatial distribution of sampling locations. This paper discusses various monitoring objectives and related procedures used for the assessment and redesign of long-term surface WQM networks. The appropriateness of each approach for the design, contraction or expansion of monitoring networks is also discussed. For each statistical approach, its advantages and disadvantages are examined from a network design perspective. Possible methods to overcome disadvantages and deficiencies in the statistical approaches that are currently in use are recommended.
1990-01-01
This document contains summaries of fifteen of the well-known books that underlie the Total Quality Management philosophy. Members of the DCASR St. Louis staff offer comments and opinions on how the authors have presented the quality concept in today's business environment. Keywords: TQM (Total Quality Management), quality concepts, statistical process control.
Bello, Jibril Oyekunle
2013-11-14
Nigeria is one of the top three countries in Africa in terms of science research output, and Nigerian urologists' biomedical research output contributes to this. Each year, urologists in Nigeria gather to present their recent research at the conference of the Nigerian Association of Urological Surgeons (NAUS). These abstracts are not as thoroughly vetted as full-length manuscripts published in peer-reviewed journals, but the information they disseminate may affect the clinical practice of attendees. This study aims to describe the characteristics of abstracts presented at the annual conferences of NAUS, the quality of the abstracts as determined by the subsequent publication of full-length manuscripts in peer-reviewed indexed journals, and the factors that influence such successful publication. Abstracts presented at the 2007 to 2010 NAUS conferences were identified through conference abstract books. Using a strict search protocol, publication in peer-reviewed journals was determined. The abstracts' characteristics were analyzed and their quality judged by the subsequent successful publishing of full-length manuscripts. Statistical analysis was performed using SPSS 16.0 software to determine factors predictive of successful publication. Only 75 abstracts were presented at the NAUS 2007 to 2010 conferences; a quarter (24%) of the presented abstracts were subsequently published as full-length manuscripts. Median time to publication was 15 months (range 2-40 months). Manuscripts whose result data were analyzed with statistics beyond basic frequencies and averages were more likely to be published than those with basic or no statistics. Quality of the abstracts, and thus subsequent publication success, is influenced by the use of 'beyond basic' statistics in the analysis of the result data presented. There is a need for improvement in the quality of urological research from Nigeria.
Langley Wind Tunnel Data Quality Assurance-Check Standard Results
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.; Grubb, John P.; Krieger, William B.; Cler, Daniel L.
2000-01-01
A framework for statistical evaluation, control, and improvement of wind tunnel measurement processes is presented. The methodology is adapted from elements of the Measurement Assurance Plans developed by the National Bureau of Standards (now the National Institute of Standards and Technology) for standards and calibration laboratories. The present methodology is based on the notions of statistical quality control (SQC) together with check standard testing and a small number of customer repeat-run sets. The results of check standard and customer repeat-run sets are analyzed using the statistical control-chart methods of Walter A. Shewhart, long familiar to the SQC community. Control chart results are presented for various measurement processes in five facilities at Langley Research Center. The processes include test section calibration, force and moment measurements with a balance, and instrument calibration.
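Shewhart control limits for repeated check-standard results can be sketched as below. This is a generic individuals-chart computation, not the report's own procedure, and the drag-coefficient values are invented.

```python
# A minimal sketch of Shewhart individuals-chart limits for check-standard
# results, in the spirit of the SQC framework described. Data are invented.
import statistics

def shewhart_limits(values):
    """Three-sigma control limits from a moving-range estimate of sigma."""
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = statistics.mean(moving_ranges)
    sigma = mr_bar / 1.128  # d2 constant for subgroups of size 2
    center = statistics.mean(values)
    return center - 3 * sigma, center, center + 3 * sigma

# Repeated drag-coefficient measurements of a check standard (hypothetical)
cd = [0.0251, 0.0249, 0.0252, 0.0250, 0.0248, 0.0251, 0.0250]
lcl, center, ucl = shewhart_limits(cd)
out_of_control = [x for x in cd if not lcl <= x <= ucl]
print(lcl, center, ucl, out_of_control)
```

Points falling outside the limits would signal that the measurement process has drifted and needs investigation before customer data are trusted.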
Statistical summaries of water-quality data for two coal areas of Jackson County, Colorado
Kuhn, Gerhard
1982-01-01
Statistical summaries of water-quality data are compiled for eight streams in two separate coal areas of Jackson County, Colo. The quality-of-water data were collected from October 1976 to September 1980. For inorganic constituents, the maximum, minimum, and mean concentrations, as well as other statistics, are presented; for minor elements, only the maximum, minimum, and mean values are included. Least-squares equations (regressions) are also given relating specific conductance of the streams to the concentrations of the major ions. The observed range of specific conductance was 85 to 1,150 micromhos per centimeter for the eight sites. (USGS)
Source apportionment of groundwater pollution around landfill site in Nagpur, India.
Pujari, Paras R; Deshpande, Vijaya
2005-12-01
The present work attempts a statistical analysis of groundwater quality near a landfill site in Nagpur, India. The objective is to determine the impact of different factors on the quality of groundwater in the study area. Statistical analysis of the data has been carried out by applying factor analysis. The analysis brings out the effect of five different factors governing the groundwater quality in the study area. Based on the contributions of the different parameters present in the extracted factors, the latter are linked to the geological setting, leaching from the host rock, leachate of heavy metals from the landfill, and bacterial contamination from the landfill site and other anthropogenic activities. The analysis brings out the vulnerability of the unconfined aquifer to contamination.
USING STATISTICAL METHODS FOR WATER QUALITY MANAGEMENT: ISSUES, PROBLEMS AND SOLUTIONS
This book is readable, comprehensible, and, I anticipate, usable. The author has an enthusiasm which comes out in the text. Statistics is presented as a living, breathing subject, still being debated, defined, and refined. This statistics book actually has examples in the field...
Comparison of Data Quality of NOAA's ISIS and SURFRAD Networks to NREL's SRRL-BMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderberg, M.; Sengupta, M.
2014-11-01
This report provides analyses of broadband solar radiometric data quality for the National Oceanic and Atmospheric Administration's Integrated Surface Irradiance Study and Surface Radiation Budget Network (SURFRAD) solar measurement networks. The data quality of these networks is compared to that of the National Renewable Energy Laboratory's Solar Radiation Research Laboratory Baseline Measurement System (SRRL-BMS) at native data resolutions and as hourly averages of the data from the years 2002 through 2013. This report describes the solar radiometric data quality testing and flagging procedures and the method used to determine and tabulate data quality statistics. Monthly data quality statistics for each network were plotted by year against the statistics for the SRRL-BMS. Some of the plots are presented in the body of the report, but most are in the appendix. These plots indicate that the overall solar radiometric data quality of the SURFRAD network is superior to that of the Integrated Surface Irradiance Study network and can be comparable to SRRL-BMS.
Kail, Robert V.
2013-01-01
According to dual-process models that include analytic and heuristic modes of processing, analytic processing is often expected to become more common with development. Consistent with this view, on reasoning problems, adolescents are more likely than children to select alternatives that are backed by statistical evidence. It is shown here that this pattern depends on the quality of the statistical evidence and the quality of the testimonial that is the typical alternative to statistical evidence. In Experiment 1, 9- and 13-year-olds (N = 64) were presented with scenarios in which solid statistical evidence was contrasted with casual or expert testimonial evidence. When testimony was casual, children relied on it but adolescents did not; when testimony was expert, both children and adolescents relied on it. In Experiment 2, 9- and 13-year-olds (N = 83) were presented with scenarios in which casual testimonial evidence was contrasted with weak or strong statistical evidence. When statistical evidence was weak, children and adolescents relied on both testimonial and statistical evidence; when statistical evidence was strong, most children and adolescents relied on it. Results are discussed in terms of their implications for dual-process accounts of cognitive development. PMID:23735681
ERIC Educational Resources Information Center
Schwabe, Robert A.
Interest in Total Quality Management (TQM) at institutions of higher education has been stressed in recent years as an important area of activity for institutional researchers. Two previous AIR Forum papers have presented some of the statistical and graphical methods used for TQM. This paper, the third in the series, first discusses some of the…
Using Statistical Process Control to Enhance Student Progression
ERIC Educational Resources Information Center
Hanna, Mark D.; Raichura, Nilesh; Bernardes, Ednilson
2012-01-01
Public interest in educational outcomes has markedly increased in the most recent decade; however, quality management and statistical process control have not deeply penetrated the management of academic institutions. This paper presents results of an attempt to use Statistical Process Control (SPC) to identify a key impediment to continuous…
[Notes on vital statistics for the study of perinatal health].
Juárez, Sol Pía
2014-01-01
Vital statistics, published by the National Statistics Institute in Spain, are a highly important source for the study of perinatal health nationwide. However, the process of data collection is not well known and has implications both for the quality and the interpretation of the epidemiological results derived from this source. The aim of this study was to present how the information is collected and some of the associated problems. This study is the result of an analysis of the methodological notes from the National Statistics Institute and first-hand information obtained from hospitals, the Central Civil Registry of Madrid, and the Madrid Institute for Statistics. Greater integration between these institutions is required to improve the quality of birth and stillbirth statistics. Copyright © 2014 SESPAS. Published by Elsevier España. All rights reserved.
ERIC Educational Resources Information Center
National Academy of Sciences - National Research Council, Washington, DC.
The Panel on Guidelines for Statistical Software was organized in 1990 to document, assess, and prioritize problem areas regarding quality and reliability of statistical software; present prototype guidelines in high priority areas; and make recommendations for further research and discussion. This document provides the following papers presented…
Correction of stream quality trends for the effects of laboratory measurement bias
Alexander, Richard B.; Smith, Richard A.; Schwarz, Gregory E.
1993-01-01
We present a statistical model relating measurements of water quality to associated errors in laboratory methods. Estimation of the model allows us to correct trends in water quality for long-term and short-term variations in laboratory measurement errors. An illustration of the bias-correction method for a large national set of stream water quality and quality assurance data shows that reductions in the bias of estimates of water quality trend slopes are achieved at the expense of increases in the variance of these estimates. Slight improvements occur in the precision of estimates of trend in bias by using correlative information on bias and water quality to estimate random variations in measurement bias. The results of this investigation stress the need for reliable, long-term quality assurance data and efficient statistical methods to assess the effects of measurement errors on the detection of water quality trends.
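The bias-correction idea can be sketched in miniature: subtract an estimated laboratory bias from each observation before computing the trend slope. The code below is a simplified assumption-laden illustration, not the authors' statistical model, and all concentrations and bias estimates are invented.

```python
# Hypothetical sketch of the bias-correction idea: subtract an estimated
# laboratory measurement bias (e.g., from quality-assurance samples) from each
# water-quality observation before estimating the trend slope. All values and
# the simple annual-bias series are assumptions for illustration.

def trend_slope(years, values):
    """Least-squares slope of values against years."""
    n = len(years)
    my, mv = sum(years) / n, sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den

years = [1980, 1981, 1982, 1983, 1984]
measured = [10.0, 10.6, 11.1, 11.4, 12.0]   # concentration, mg/L (invented)
lab_bias = [0.0, 0.3, 0.5, 0.6, 0.9]        # estimated lab bias, mg/L (invented)

raw = trend_slope(years, measured)
corrected = trend_slope(years, [m - b for m, b in zip(measured, lab_bias)])
print(raw, corrected)  # correction reduces the apparent upward trend here
```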
The Environmental Data Book: A Guide to Statistics on the Environment and Development.
ERIC Educational Resources Information Center
Sheram, Katherine
This book presents statistics on countries with populations of more than 1 million related to the quality of the environment, economic development, and how each is affected by the other. Sometimes called indicators, the statistics are measures of environmental, economic, and social conditions in developing and industrial countries. The book is…
Bevans, Hugh E.; Diaz, Arthur M.
1980-01-01
Summaries of descriptive statistics are compiled for 14 data-collection sites located on streams draining areas that have been shaft mined and strip mined for coal in Cherokee and Crawford Counties in southeastern Kansas. These summaries include water-quality data collected from October 1976 through April 1979. Regression equations relating specific conductance and instantaneous streamflow to concentrations of bicarbonate, sulfate, chloride, fluoride, calcium, magnesium, sodium, potassium, silica, and dissolved solids are presented.
CRN5EXP: Expert system for statistical quality control
NASA Technical Reports Server (NTRS)
Hentea, Mariana
1991-01-01
The purpose of the expert system CRN5EXP is to assist in checking the quality of the coils at two very important mills, Hot Rolling and Cold Rolling, in a steel plant. The system interprets the statistical quality control charts, and diagnoses and predicts the quality of the steel. Measurements of process control variables are recorded in a database, and sample statistics such as the mean and the range are computed and plotted on a control chart. The chart is analyzed for patterns using the C Language Integrated Production System (CLIPS) and a forward-chaining technique to reach a conclusion about the causes of defects and to take management measures for the improvement of quality control techniques. The expert system combines the certainty factors associated with the process control variables to predict the quality of the steel. The paper presents the approach to extracting data from the database, the reason for combining certainty factors, and the architecture and use of the expert system. The interpretation of control chart patterns, however, requires the human expert's knowledge and lends itself to expert-system rules.
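The system described encodes its pattern tests as CLIPS rules; as a hypothetical illustration, one classic Shewhart pattern test (a sustained run of points on one side of the center line) can be written as a plain rule in Python. The sample means below are invented.

```python
# Hypothetical rule in the spirit of control-chart pattern interpretation:
# flag a run of consecutive sample means on one side of the center line.
# (The actual CRN5EXP system uses CLIPS rules; this is only an illustration.)

def run_rule(sample_means, center, run_length=8):
    """Flag a run of `run_length` consecutive points on one side of center."""
    run = 0
    side = 0
    for x in sample_means:
        s = 1 if x > center else (-1 if x < center else 0)
        run = run + 1 if s == side and s != 0 else (1 if s != 0 else 0)
        side = s
        if run >= run_length:
            return True
    return False

means = [5.1, 5.2, 5.3, 5.1, 5.2, 5.4, 5.2, 5.3, 5.1]  # coil means, invented
print(run_rule(means, center=5.0))  # every point above center: run flagged
```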
[Influence of demographic and socioeconomic characteristics on the quality of life].
Grbić, Gordana; Djokić, Dragoljub; Kocić, Sanja; Mitrašinović, Dejan; Rakić, Ljiljana; Prelević, Rade; Krivokapić, Žarko; Miljković, Snežana
2011-01-01
The quality of life is a multidimensional concept, which is best expressed by subjective well-being. Evaluation of the quality of life is the basis for measuring well-being, and determination of the factors that shape the quality of life is the basis for its improvement. The aim was to evaluate the perceived quality of life and to assess its demographic and socioeconomic determinants. This was a cross-sectional study of a representative sample of the population in Serbia aged over 20 years (9,479 examinees). The quality of life was expressed by the perception of well-being (pleasure in life). Data on the examinees (demographic and socioeconomic characteristics) were collected by using a questionnaire for adults of each household. To process, analyze and present the data, we used the methods of parametric descriptive statistics (mean value, standard deviation, coefficient of variation), variance analysis and factor analysis. Although men evaluated the quality of life with a slightly higher grade, there was no statistically significant difference in the evaluation of the quality of life in relation to the examinee's gender (p > 0.005). Among the examinees there was a highly statistically significant difference in grading the quality of life depending on age, level of education, marital status and type of job (p < 0.001). In relation to the number of children, there was no statistically significant difference in the grading of the quality of life (p > 0.005). The quality of life is influenced by numerous factors that characterize each person (the demographic and socioeconomic characteristics of the individual). The determining factors of the quality of life are numerous and diverse, and the manner and strength of their influence are variable.
The statistical reporting quality of articles published in 2010 in five dental journals.
Vähänikkilä, Hannu; Tjäderhane, Leo; Nieminen, Pentti
2015-01-01
Statistical methods play an important role in medical and dental research. In earlier studies it has been observed that the current use of methods and reporting of statistics is responsible for some of the errors in the interpretation of results. The aim of this study was to investigate the quality of statistical reporting in dental research articles. A total of 200 articles published in 2010 were analysed, covering five dental journals: Journal of Dental Research, Caries Research, Community Dentistry and Oral Epidemiology, Journal of Dentistry and Acta Odontologica Scandinavica. Each paper underwent careful scrutiny for the use of statistical methods and reporting. A paper with at least one poor reporting item was classified as having 'problems with reporting statistics' and a paper without any poor reporting item as 'acceptable'. The investigation showed that 18 (9%) papers were acceptable and 182 (91%) papers contained at least one poor reporting item. The proportion of papers with at least one poor reporting item in this survey was high (91%). The authors of dental journals should be encouraged to improve the statistical sections of their research articles and to present the results in a way that is in line with the policy and presentation of the leading dental journals.
Gray, Alistair; Veale, Jaimie F.; Binson, Diane; Sell, Randell L.
2013-01-01
Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand's Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens. PMID:23840231
Design, analysis, and interpretation of field quality-control data for water-sampling projects
Mueller, David K.; Schertz, Terry L.; Martin, Jeffrey D.; Sandstrom, Mark W.
2015-01-01
The report provides extensive information about statistical methods used to analyze quality-control data in order to estimate potential bias and variability in environmental data. These methods include construction of confidence intervals on various statistical measures, such as the mean, percentiles and percentages, and the standard deviation. The methods are used to compare quality-control results with the larger set of environmental data in order to determine whether the effects of bias and variability might interfere with interpretation of these data. Examples from published reports are presented to illustrate how the methods are applied, how bias and variability are reported, and how the interpretation of environmental data can be qualified based on the quality-control analysis.
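One of the methods the report covers, a confidence interval on the mean of quality-control results, can be sketched as follows. The field-blank copper values and the choice of a 95% t interval are assumptions for illustration only.

```python
# Sketch (with invented numbers) of a confidence interval on mean bias
# estimated from field-blank quality-control samples, one of the statistical
# methods described in the report.
import math
import statistics

def mean_ci(values, t_crit):
    """Two-sided confidence interval for the mean; t_crit is for n-1 df."""
    n = len(values)
    m = statistics.mean(values)
    se = statistics.stdev(values) / math.sqrt(n)
    return m - t_crit * se, m + t_crit * se

# Copper detected in field blanks, micrograms per liter (hypothetical)
blanks = [0.1, 0.0, 0.2, 0.1, 0.0, 0.3, 0.1, 0.2]
low, high = mean_ci(blanks, t_crit=2.365)  # t for 7 df, 95% confidence
print(f"mean blank bias 95% CI: [{low:.3f}, {high:.3f}] ug/L")
```

An interval that excludes zero would suggest systematic contamination bias large enough to qualify interpretation of the environmental data.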
NASA Technical Reports Server (NTRS)
Powers, B. G.
1972-01-01
The magnitude and frequency of occurrence of aircraft responses and control inputs during 27 flights of the XB-70 airplane were measured. Exceedance curves are presented for the airplane responses and control usage. A technique is presented which makes use of these exceedance curves to establish or verify handling qualities criteria. This technique can provide a means of incorporating current operational experience in handling qualities requirements for future aircraft.
Highway runoff quality models for the protection of environmentally sensitive areas
NASA Astrophysics Data System (ADS)
Trenouth, William R.; Gharabaghi, Bahram
2016-11-01
This paper presents novel highway runoff quality models using artificial neural networks (ANN) which take into account site-specific highway traffic and seasonal storm event meteorological factors to predict the event mean concentration (EMC) statistics and mean daily unit area load (MDUAL) statistics of common highway pollutants for the design of roadside ditch treatment systems (RDTS) to protect environmentally sensitive receiving environments. A dataset of 940 monitored highway runoff events from fourteen sites located in five countries (Canada, USA, Australia, New Zealand, and China) was compiled and used to develop ANN models for the prediction of highway runoff total suspended solids (TSS) seasonal EMC statistical distribution parameters, as well as the MDUAL statistics for four different heavy metal species (Cu, Zn, Cr and Pb). TSS EMCs are needed to estimate the minimum removal efficiency the RDTS must achieve to improve highway runoff quality to meet applicable standards, and MDUALs are needed to calculate the minimum required capacity of the RDTS to ensure performance longevity.
Advani, Aneel; Jones, Neil; Shahar, Yuval; Goldstein, Mary K; Musen, Mark A
2004-01-01
We develop a method and algorithm for deciding the optimal approach to creating quality-auditing protocols for guideline-based clinical performance measures. An important element of the audit protocol design problem is deciding which guideline elements to audit. Specifically, the problem is how and when to aggregate individual patient case-specific guideline elements into population-based quality measures. The key statistical issue is the trade-off between the increased reliability of more general population-based quality measures and the increased validity of individually case-adjusted but more restricted measures obtained at greater audit cost. Our intelligent algorithm for auditing protocol design is based on hierarchically modeling incrementally case-adjusted quality constraints. We select quality constraints to measure using an optimization criterion based on statistical generalizability coefficients. We present results of the approach from a deployed decision support system for a hypertension guideline.
Kouris, Anargyros; Christodoulou, Christos; Efstathiou, Vasiliki; Tsatovidou, Revekka; Torlidi-Kordera, Evangelia; Zouridaki, Eftychia; Kontochristopoulos, George
2016-03-01
Psoriasis and leg ulcers have a marked impact on the patient's quality of life and represent a life-long burden for affected patients. The aim of this study is to compare quality of life, anxiety and depression, self-esteem, and loneliness in patients with psoriasis and patients with leg ulcers. Eighty patients with leg ulcers, eighty patients with psoriasis, and eighty healthy controls were included in this study. Quality of life, depression and anxiety, loneliness, and self-esteem were assessed using the Dermatology Life Quality Index (DLQI), the Hospital Anxiety and Depression Scale (HADS), the UCLA Loneliness Scale (UCLA-Version 3), and Rosenberg's Self-esteem Scale (RSES), respectively. The DLQI score was 12.74 ± 4.89 among patients with psoriasis and 13.28 ± 2.57 among leg ulcer patients. The patients with psoriasis presented statistically significant higher anxiety (9.87 ± 4.56) than both leg ulcer patients (8.26 ± 2.82) and controls (6.45 ± 1.89), while leg ulcer patients also presented higher anxiety than controls. Regarding self-esteem, although there were no significant differences between the patients with psoriasis (15.25 ± 3.20) and those with leg ulcers (15.89 ± 2.93), both presented statistically significant lower self-esteem scores than the control group (18.53 ± 3.04). The patients with psoriasis presented statistically significant higher levels of loneliness and social isolation (46.18 ± 6.63) compared with both leg ulcer patients (43.73 ± 5.68) and controls (42.49 ± 3.41). Psoriasis and leg ulcers are long-term skin diseases associated with significant impairment of the patient's quality of life, anxiety, and self-esteem, which are frequently under-recognized. © 2016 by the Wound Healing Society.
A PERFORMANCE EVALUATION OF THE ETA- CMAQ AIR QUALITY FORECAST SYSTEM FOR THE SUMMER OF 2005
This poster presents an evaluation of the Eta-CMAQ Air Quality Forecast System's experimental domain using O3 observations obtained from EPA's AIRNOW program and a suite of statistical metrics examining both discrete and categorical forecasts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nicholson, W L; Harris, J L
1976-03-01
The First ERDA Statistical Symposium was organized to provide a means for communication among ERDA statisticians, and the sixteen papers presented at the meeting are given. Topics include techniques of numerical analysis used for accelerators, nuclear reactors, skewness and kurtosis statistics, radiochemical spectral analysis, quality control, and other statistics problems. Nine of the papers were previously announced in Nuclear Science Abstracts (NSA), while the remaining seven were abstracted for ERDA Energy Research Abstracts (ERA) and INIS Atomindex. (PMA)
The Statistical point of view of Quality: the Lean Six Sigma methodology
Viti, Andrea; Terzi, Alberto
2015-01-01
Six Sigma and Lean are two quality improvement methodologies. The Lean Six Sigma methodology is applicable to repetitive procedures; its use in the health-care arena has therefore focused mainly on areas of business operations, throughput, and case management, and on efficiency outcomes. After reviewing the methodology, the paper presents a brief clinical example of the use of Lean Six Sigma as a quality improvement method to reduce complications during and after lobectomies. Using the Lean Six Sigma methodology, multidisciplinary teams could identify multiple modifiable points across the surgical process. These process improvements could be applied to different surgical specialties and could yield a statistically grounded measurement of surgical quality. PMID:25973253
The Statistical point of view of Quality: the Lean Six Sigma methodology.
Bertolaccini, Luca; Viti, Andrea; Terzi, Alberto
2015-04-01
Six Sigma and Lean are two quality improvement methodologies. The Lean Six Sigma methodology is applicable to repetitive procedures; its use in the health-care arena has therefore focused mainly on areas of business operations, throughput, and case management, and on efficiency outcomes. After reviewing the methodology, the paper presents a brief clinical example of the use of Lean Six Sigma as a quality improvement method to reduce complications during and after lobectomies. Using the Lean Six Sigma methodology, multidisciplinary teams could identify multiple modifiable points across the surgical process. These process improvements could be applied to different surgical specialties and could yield a statistically grounded measurement of surgical quality.
Michigan's forests, 2004: statistics and quality assurance
Scott A. Pugh; Mark H. Hansen; Gary Brand; Ronald E. McRoberts
2010-01-01
The first annual inventory of Michigan's forests was completed in 2004 after 18,916 plots were selected and 10,355 forested plots were visited. This report includes detailed information on forest inventory methods, quality of estimates, and additional tables. An earlier publication presented analyses of the inventoried data (Pugh et al. 2009).
ERIC Educational Resources Information Center
Sheram, Katherine
This teaching guide accompanies "The Environmental Data Book," published by the World Bank. Environmental and development statistics are presented for discussion and analysis. Indicators of environmental quality and economic development are defined with accompanying charts and maps. Activities accompany each lesson, along with a worksheet at the…
Publishing in "SERJ": An Analysis of Papers from 2002-2009
ERIC Educational Resources Information Center
Zieffler, Andrew; Garfield, Joan; delMas, Robert C.; Le, Laura; Isaak, Rebekah; Bjornsdottir, Audbjorg; Park, Jiyoon
2011-01-01
"SERJ" has provided a high quality professional publication venue for researchers in statistics education for close to a decade. This paper presents a review of the articles published to explore what they suggest about the field of statistics education, the researchers, the questions addressed, and the growing knowledge base on teaching and…
Falk, Sarah E.; Anderholm, Scott K.; Engdahl, Nicholas B.
2011-01-01
The Albuquerque Bernalillo County Water Utility Authority (ABCWUA) is supplementing the municipal water supply for Albuquerque, New Mexico, and the surrounding area with water diverted from the Rio Grande. The distribution of surface water for municipal supply has raised questions about the quality of water in the Rio Grande and the possibility of contaminants in the water. The U.S. Geological Survey (USGS), in cooperation with ABCWUA, has compiled existing water-quality data collected on the Rio Grande and its main tributary, the Rio Chama, by various Federal and State agencies to provide a comprehensive overview of water quality in the Rio Grande basin upstream from Albuquerque. This report describes selected water-quality investigations conducted by various Federal and State agencies and 2007 USGS surface-water-quality investigations and data-collection activities and presents a statistical summary of selected water-quality data collected on the Rio Grande and the Rio Chama in central and northern New Mexico.
Wartberg, Lutz; Kriston, Levente; Kammerl, Rudolf
2017-07-01
Internet Gaming Disorder (IGD) has been included in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5). In the present study, the relationship among social support, friends known only through the Internet, health-related quality of life, and IGD in adolescence was explored for the first time. For this purpose, 1,095 adolescents aged 12 to 14 years were surveyed with a standardized questionnaire concerning IGD, self-perceived social support, the proportion of friends known only through the Internet, and health-related quality of life. The authors conducted unpaired t-tests, a chi-square test, and correlation and logistic regression analyses. According to the statistical analyses, adolescents with IGD reported lower self-perceived social support, more friends known only through the Internet, and a lower health-related quality of life compared with the group without IGD. In both bivariate and multivariate logistic regression models, statistically significant associations were revealed between IGD and male gender, a higher proportion of friends known only through the Internet, and a lower health-related quality of life (multivariate model: Nagelkerke's R² = 0.37). Lower self-perceived social support was related to IGD in the bivariate model only. In summary, quality of life and social aspects seem to be important factors for IGD in adolescence and should therefore be incorporated in further (longitudinal) studies. The findings of the present survey may provide starting points for the development of prevention and intervention programs for adolescents affected by IGD.
Dodge, Kent A.; Hornberger, Michelle I.; Dyke, Jessica
2014-01-01
This report presents the analytical results and quality-assurance data for water-quality, bed-sediment, and biota samples collected at sites from October 2012 through September 2013. Water-quality data include concentrations of selected major ions, trace elements, and suspended sediment. Turbidity and dissolved organic carbon were analyzed for water samples collected at the four sites where seasonal daily values of turbidity were being determined. Daily values of mean suspended-sediment concentration and suspended-sediment discharge were determined for four sites. Bed-sediment data include trace-element concentrations in the fine-grained fraction. Biological data include trace-element concentrations in whole-body tissue of aquatic benthic insects. Statistical summaries of water-quality, bed-sediment, and biological data for sites in the upper Clark Fork Basin are provided for the period of record.
Predicting perceptual quality of images in realistic scenario using deep filter banks
NASA Astrophysics Data System (ADS)
Zhang, Weixia; Yan, Jia; Hu, Shiyong; Ma, Yang; Deng, Dexiang
2018-03-01
Classical image perceptual quality assessment models usually resort to natural scene statistics methods, which are based on the assumption that certain reliable statistical regularities hold for undistorted images and are corrupted by introduced distortions. However, these models usually fail to accurately predict the degradation severity of images in realistic scenarios, where complex, multiple, and interactive authentic distortions typically appear. We propose a quality prediction model based on a convolutional neural network. Quality-aware features extracted from the filter banks of multiple convolutional layers are aggregated into the image representation. Furthermore, an easy-to-implement and effective feature selection strategy is used to further refine the image representation, and finally a linear support vector regression model is trained to map the image representation to images' subjective perceptual quality scores. The experimental results on benchmark databases demonstrate the effectiveness and generalizability of the proposed model.
Spriestersbach, Albert; Röhrig, Bernd; du Prel, Jean-Baptist; Gerhold-Ay, Aslihan; Blettner, Maria
2009-09-01
Descriptive statistics are an essential part of biometric analysis and a prerequisite for the understanding of further statistical evaluations, including the drawing of inferences. When data are well presented, it is usually obvious whether the author has collected and evaluated them correctly and in keeping with accepted practice in the field. Statistical variables in medicine may be of either the metric (continuous, quantitative) or categorical (nominal, ordinal) type. Easily understandable examples are given. Basic techniques for the statistical description of collected data are presented and illustrated with examples. The goal of a scientific study must always be clearly defined. The definition of the target value or clinical endpoint determines the level of measurement of the variables in question. Nearly all variables, whatever their level of measurement, can be usefully presented graphically and numerically. The level of measurement determines what types of diagrams and statistical values are appropriate. There are also different ways of presenting combinations of two independent variables graphically and numerically. The description of collected data is indispensable. If the data are of good quality, valid and important conclusions can already be drawn when they are properly described. Furthermore, data description provides a basis for inferential statistics.
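The distinction drawn above between metric and categorical variables determines which descriptive statistics apply. A short sketch with hypothetical clinical values:

```python
import statistics
from collections import Counter

# Metric (continuous) variable: summarize numerically
systolic_bp = [118, 125, 130, 122, 140, 128, 135, 121]  # hypothetical, mmHg
print("mean  :", statistics.mean(systolic_bp))
print("median:", statistics.median(systolic_bp))
print("stdev :", round(statistics.stdev(systolic_bp), 2))

# Categorical (nominal) variable: summarize as a frequency table
blood_group = ["A", "O", "B", "O", "A", "AB", "O", "A"]
for group, count in Counter(blood_group).most_common():
    print(f"{group}: {count} ({100 * count / len(blood_group):.1f} %)")
```

Mean and standard deviation are meaningless for the nominal variable; the frequency table (and, graphically, a bar chart rather than a histogram) is the appropriate description.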
Multivariate Statistical Analysis of Water Quality data in Indian River Lagoon, Florida
NASA Astrophysics Data System (ADS)
Sayemuzzaman, M.; Ye, M.
2015-12-01
The Indian River Lagoon, part of the longest barrier island complex in the United States, is of particular concern to environmental scientists because of the rapid rate of human development throughout the region and its geographical position between the colder temperate zone and the warmer sub-tropical zone. Surface water quality analysis in this region therefore continually yields new information. In the present study, multivariate statistical procedures were applied to analyze the spatial and temporal water quality of the Indian River Lagoon over the period 1998-2013. Twelve parameters were analyzed at twelve key water monitoring stations in and beside the lagoon using monthly datasets (27,648 observations in total). The dataset was treated using cluster analysis (CA), principal component analysis (PCA), and non-parametric trend analysis. The CA grouped the twelve monitoring stations into four clusters, with stations having similar surrounding characteristics falling in the same group. The PCA was then applied within each cluster to identify the important water quality parameters. Principal components (PCs) PC1 to PC5 were retained, based on cumulative explained variances of 75% to 85% in each cluster. Nutrient species (phosphorus and nitrogen), salinity, specific conductivity, and erosion factors (TSS, turbidity) were the major variables in the construction of the PCs. Statistically significant positive or negative trends, and abrupt trend shifts, were detected by applying the Mann-Kendall trend test and the Sequential Mann-Kendall (SQMK) test to the important water quality parameters at each station. Land-use/land-cover change patterns, local anthropogenic activities, and extreme climate events such as drought might be associated with these trends. This study presents a multivariate statistical assessment designed to yield better information about surface water quality, so that effective pollution control and management of the surface waters can be undertaken.
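The Mann-Kendall trend test used in studies of this kind has a compact closed form: an S statistic counting concordant minus discordant pairs, plus a normal approximation. The sketch below implements it without the correction for tied values, on a hypothetical monotone series:

```python
import math

def mann_kendall(x):
    """Mann-Kendall test statistic S and normal-approximation Z.

    Sketch only: no correction for tied values is applied.
    """
    n = len(x)
    # S = sum over all pairs (i < j) of sign(x[j] - x[i])
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18  # variance of S, no ties
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)  # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Hypothetical monthly values with a steadily rising trend
series = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
s, z = mann_kendall(series)  # |z| > 1.96 indicates a trend at the 5 % level
```

Being rank-based, the test needs no distributional assumption about the data, which is why it is the standard non-parametric choice for water-quality trend detection.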
1987-03-01
statistics for storm water quality variables and fractions of phosphorus, solids, and carbon are presented in Tables 7 and 8, respectively. The correlation...matrix and factor analysis (same method as used for baseflow) of storm water quality variables suggested three groups: Group I - TMG, TCA, TNA, TSI...models to predict storm water quality . The 11 static and 3 dynamic storm variables were used as potential dependent variables. All independent and
NASA Technical Reports Server (NTRS)
Wallace, G. R.; Weathers, G. D.; Graf, E. R.
1973-01-01
The statistics of filtered pseudorandom digital sequences called hybrid-sum sequences, formed from the modulo-two sum of several maximum-length sequences, are analyzed. The results indicate that a relation exists between the statistics of the filtered sequence and the characteristic polynomials of the component maximum length sequences. An analysis procedure is developed for identifying a large group of sequences with good statistical properties for applications requiring the generation of analog pseudorandom noise. By use of the analysis approach, the filtering process is approximated by the convolution of the sequence with a sum of unit step functions. A parameter reflecting the overall statistical properties of filtered pseudorandom sequences is derived. This parameter is called the statistical quality factor. A computer algorithm to calculate the statistical quality factor for the filtered sequences is presented, and the results for two examples of sequence combinations are included. The analysis reveals that the statistics of the signals generated with the hybrid-sum generator are potentially superior to the statistics of signals generated with maximum-length generators. Furthermore, fewer calculations are required to evaluate the statistics of a large group of hybrid-sum generators than are required to evaluate the statistics of the same size group of approximately equivalent maximum-length sequences.
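The basic construction of a hybrid-sum sequence, the modulo-two sum of maximum-length sequences, can be sketched as follows. The register lengths, taps, and seeds here are illustrative choices taken from standard primitive-polynomial tables, not the generators analyzed in the report:

```python
def lfsr_sequence(taps, state, length):
    """Generate `length` bits from a Fibonacci LFSR.

    `taps` are 1-based register positions whose XOR forms the feedback;
    `state` is the initial register contents (list of 0/1 bits).
    """
    state = list(state)
    out = []
    for _ in range(length):
        out.append(state[-1])          # output the last stage
        fb = 0
        for t in taps:
            fb ^= state[t - 1]         # feedback from tapped stages
        state = [fb] + state[:-1]      # shift, insert feedback at front
    return out

# Two maximum-length sequences from primitive polynomials
# x^3 + x^2 + 1 and x^4 + x^3 + 1 (illustrative choices):
m1 = lfsr_sequence(taps=(3, 2), state=[1, 0, 0], length=21)     # period 7
m2 = lfsr_sequence(taps=(4, 3), state=[1, 0, 0, 0], length=21)  # period 15

# Hybrid-sum sequence: modulo-two sum of the component sequences
hybrid = [a ^ b for a, b in zip(m1, m2)]
```

Because the component periods 7 and 15 are coprime, the hybrid sequence has period 105, far longer than either component; the paper's statistical quality factor then evaluates the filtered version of such sequences.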
Student laboratory reports: an approach to improving feedback and quality
NASA Astrophysics Data System (ADS)
Ellingsen, Pål Gunnar; Støvneng, Jon Andreas
2018-05-01
We present an ongoing effort in improving the quality of laboratory reports written by first and second year physics students. The effort involves a new approach where students are given the opportunity to submit reports at intermediate deadlines, receive feedback, and then resubmit for the final deadline. In combination with a differential grading system, instead of pass/fail, the improved feedback results in higher quality reports. Improvement in the quality of the reports is visible through the grade statistics.
Parsons, Nick R; Price, Charlotte L; Hiskens, Richard; Achten, Juul; Costa, Matthew L
2012-04-25
The application of statistics in reported research in trauma and orthopaedic surgery has become ever more important and complex. Despite the extensive use of statistical analysis, it is still a subject which is often not conceptually well understood, resulting in clear methodological flaws and inadequate reporting in many papers. A detailed statistical survey sampled 100 representative orthopaedic papers using a validated questionnaire that assessed the quality of the trial design and statistical analysis methods. The survey found evidence of failings in study design, statistical methodology and presentation of the results. Overall, in 17% (95% confidence interval; 10-26%) of the studies investigated the conclusions were not clearly justified by the results, in 39% (30-49%) of studies a different analysis should have been undertaken and in 17% (10-26%) a different analysis could have made a difference to the overall conclusions. It is only by an improved dialogue between statistician, clinician, reviewer and journal editor that the failings in design methodology and analysis highlighted by this survey can be addressed.
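The survey reports its proportions with 95% confidence intervals (e.g. 17%, CI 10-26%). The paper does not state which interval construction was used; as a sketch, a Wilson score interval gives similar bounds for 17 flawed papers out of 100:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score confidence interval for a binomial proportion
    (z = 1.96 for a two-sided 95 % interval)."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# 17 of the 100 sampled papers had conclusions not justified by the results
lo, hi = wilson_interval(17, 100)  # roughly 11 % to 26 %
```

The Wilson interval is preferred over the naive normal approximation for proportions near 0 or 1, since its bounds always stay within [0, 1].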
New statistical potential for quality assessment of protein models and a survey of energy functions
2010-01-01
Background Scoring functions, such as molecular mechanics forcefields and statistical potentials, are fundamentally important tools in protein structure modeling and quality assessment. Results The performances of a number of publicly available scoring functions are compared with statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and of other features of scoring functions, such as using information on solvent accessibility and torsion angles, and accounting for secondary structure preferences and side chain orientation. Partially based on these observations, we present a novel residue-based statistical potential, which employs a shuffled reference state definition and takes into account the mutual orientation of residue side chains. Atom- and residue-level statistical potentials and Linux executables to calculate the energy of a given protein proposed in this work can be downloaded from http://www.fiserlab.org/potentials. Conclusions Among the most influential terms, we observed a critical role of a proper reference state definition and the benefits of including information about the microenvironment of interaction centers. Molecular mechanical potentials were also tested and found to be over-sensitive to small local imperfections in a structure, requiring unfeasibly long energy relaxation before energy scores started to correlate with model quality. PMID:20226048
An operational definition of a statistically meaningful trend.
Bryhn, Andreas C; Dimberg, Peter H
2011-04-28
Linear trend analysis of time series is standard procedure in many scientific disciplines. If the number of data points is large, a trend may be statistically significant even if the data are scattered far from the trend line. This study introduces and tests a quality criterion for time trends, referred to as statistical meaningfulness, which is stricter than high statistical significance. The time series is divided into intervals and interval mean values are calculated. Thereafter, r² and p values are calculated from regressions of the interval mean values on time. If r² ≥ 0.65 at p ≤ 0.05 in any of these regressions, the trend is regarded as statistically meaningful. Out of ten investigated time series from different scientific disciplines, five displayed statistically meaningful trends. A Microsoft Excel application (add-in) was developed that can perform statistical meaningfulness tests and may increase the operationality of the test. The presented method for distinguishing statistically meaningful trends should be reasonably uncomplicated for researchers with basic statistics skills and may thus be useful for determining which trends are worth analysing further, for instance with respect to causal factors. The method can also be used for determining which segments of a time trend may be particularly worthwhile to focus on.
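The interval-mean construction described above can be sketched as follows. The r² threshold follows the text, but the accompanying p ≤ 0.05 check is omitted for brevity (it needs a t-distribution), and the example series are hypothetical:

```python
def interval_means_r2(series, n_intervals):
    """r² from regressing interval mean values on interval index.

    Any trailing points that do not fill a whole interval are dropped.
    """
    size = len(series) // n_intervals
    means = [
        sum(series[i * size:(i + 1) * size]) / size
        for i in range(n_intervals)
    ]
    xs = list(range(n_intervals))
    mx = sum(xs) / n_intervals
    my = sum(means) / n_intervals
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, means))
    sxx = sum((x - mx) ** 2 for x in xs)
    ss_tot = sum((y - my) ** 2 for y in means)
    if ss_tot == 0:            # constant interval means: no trend at all
        return 0.0
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum(
        (y - (slope * x + intercept)) ** 2 for x, y in zip(xs, means)
    )
    return 1 - ss_res / ss_tot

def statistically_meaningful(series, n_intervals=5, r2_threshold=0.65):
    # The full criterion also requires p <= 0.05 for the same regression;
    # that check is omitted in this sketch.
    return interval_means_r2(series, n_intervals) >= r2_threshold

trend = list(range(20))                # clear upward trend
flat = [5, 1, 4, 2, 3, 3, 2, 4, 1, 5]  # scatter, no trend
```

Averaging within intervals first is what makes the criterion strict: a large but noisy dataset can reach significance on the raw points while its interval means show no clean linear pattern.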
NASA Astrophysics Data System (ADS)
Jokhio, Gul A.; Syed Mohsin, Sharifah M.; Gul, Yasmeen
2018-04-01
It has been established that Adobe, in addition to being sustainable and economical, provides better indoor air quality than modern synthetic materials without extensive energy expenditure. The material, however, exhibits weak structural behaviour when subjected to adverse loading conditions. A wide range of mechanical properties has been reported in the literature owing to a lack of research and standardization. The present paper presents a statistical analysis of results obtained through compressive and flexural tests on Adobe samples. Adobe specimens with and without wire mesh reinforcement were tested and the results reported. It was found that the compressive strength of Adobe increases by about 43% after adding a single layer of wire mesh reinforcement, and that this increase is statistically significant. The flexural response of Adobe also showed improvement with the addition of wire mesh reinforcement; however, the statistical significance of this improvement could not be established.
Dodge, Kent A.; Hornberger, Michelle I.
2015-12-24
This report presents the analytical results and quality-assurance data for water-quality, bed-sediment, and biota samples collected at sites from October 2013 through September 2014. Water-quality data include concentrations of selected major ions, trace elements, and suspended sediment. At 12 sites, dissolved organic carbon and turbidity samples were collected. In addition, nitrogen (nitrate plus nitrite) samples were collected at two sites. Daily values of mean suspended-sediment concentration and suspended-sediment discharge were determined for four sites. Seasonal daily values of turbidity were determined for four sites. Bed-sediment data include trace-element concentrations in the fine-grained fraction. Biological data include trace-element concentrations in whole-body tissue of aquatic benthic insects. Statistical summaries of water-quality, bed-sediment, and biological data for sites in the upper Clark Fork Basin are provided for the period of record.
Use of statistical procedures in Brazilian and international dental journals.
Ambrosano, Gláucia Maria Bovi; Reis, André Figueiredo; Giannini, Marcelo; Pereira, Antônio Carlos
2004-01-01
A descriptive survey was performed in order to assess the statistical content and quality of Brazilian and international dental journals, and compare their evolution throughout the last decades. The authors identified the reporting and accuracy of statistical techniques in 1000 papers published from 1970 to 2000 in seven dental journals: three Brazilian (Brazilian Dental Journal, Revista de Odontologia da Universidade de Sao Paulo and Revista de Odontologia da UNESP) and four international journals (Journal of the American Dental Association, Journal of Dental Research, Caries Research and Journal of Periodontology). Papers were divided into two time periods: from 1970 to 1989, and from 1990 to 2000. A slight increase in the number of articles that presented some form of statistical technique was noticed for Brazilian journals (from 61.0 to 66.7%), whereas for international journals, a significant increase was observed (65.8 to 92.6%). In addition, a decrease in the number of statistical errors was verified. The most commonly used statistical tests as well as the most frequent errors found in dental journals were assessed. Hopefully, this investigation will encourage dental educators to better plan the teaching of biostatistics, and to improve the statistical quality of submitted manuscripts.
Quality Control Statistics
CIDR is dedicated to producing the highest quality data for our investigators. These cumulative quality control statistics are based on data from 419 released CIDR Program
Assessment of the quality of primary care for the elderly according to the Chronic Care Model 1
Silva, Líliam Barbosa; Soares, Sônia Maria; Silva, Patrícia Aparecida Barbosa; Santos, Joseph Fabiano Guimarães; Miranda, Lívia Carvalho Viana; Santos, Raquel Melgaço
2018-01-01
ABSTRACT Objective: to evaluate the quality of care provided to older people with diabetes mellitus and/or hypertension in the Primary Health Care (PHC) according to the Chronic Care Model (CCM) and identify associations with care outcomes. Method: cross-sectional study involving 105 older people with diabetes mellitus and/or hypertension. The Patient Assessment of Chronic Illness Care (PACIC) questionnaire was used to evaluate the quality of care. The total score was compared with care outcomes that included biochemical parameters, body mass index, pressure levels and quality of life. Data analysis was based on descriptive statistics and multiple logistic regression. Results: there was a predominance of females and a median age of 72 years. The median PACIC score was 1.55 (IQ 1.30-2.20). Among the PACIC dimensions, the “delivery system design/decision support” was the one that presented the best result. There was no statistical difference between the medians of the overall PACIC score and individual care outcomes. However, when the quality of life and health satisfaction were simultaneously evaluated, a statistical difference between the medians was observed. Conclusion: the low PACIC scores found indicate that chronic care according to the CCM in the PHC seems still to fall short of its assumptions. PMID:29538582
Assessment of the quality of primary care for the elderly according to the Chronic Care Model.
Silva, Líliam Barbosa; Soares, Sônia Maria; Silva, Patrícia Aparecida Barbosa; Santos, Joseph Fabiano Guimarães; Miranda, Lívia Carvalho Viana; Santos, Raquel Melgaço
2018-03-08
to evaluate the quality of care provided to older people with diabetes mellitus and/or hypertension in Primary Health Care (PHC) according to the Chronic Care Model (CCM) and to identify associations with care outcomes. This cross-sectional study involved 105 older people with diabetes mellitus and/or hypertension. The Patient Assessment of Chronic Illness Care (PACIC) questionnaire was used to evaluate the quality of care. The total score was compared with care outcomes that included biochemical parameters, body mass index, pressure levels, and quality of life. Data analysis was based on descriptive statistics and multiple logistic regression. There was a predominance of females and a median age of 72 years. The median PACIC score was 1.55 (IQ 1.30-2.20). Among the PACIC dimensions, "delivery system design/decision support" presented the best result. There was no statistical difference between the medians of the overall PACIC score and individual care outcomes. However, when quality of life and health satisfaction were evaluated simultaneously, a statistical difference between the medians was observed. The low PACIC scores found indicate that chronic care according to the CCM in the PHC seems still to fall short of its assumptions.
Using luminosity data as a proxy for economic statistics
Chen, Xi
2011-01-01
A pervasive issue in social and environmental research has been how to improve the quality of socioeconomic data in developing countries. Given the shortcomings of standard sources, the present study examines luminosity (measures of nighttime lights visible from space) as a proxy for standard measures of output (gross domestic product). We compare output and luminosity at the country level and at the 1° latitude × 1° longitude grid-cell level for the period 1992–2008. We find that luminosity has informational value for countries with low-quality statistical systems, particularly for those countries with no recent population or economic censuses. PMID:21576474
BINGE EATING DISORDER AND QUALITY OF LIFE OF CANDIDATES TO BARIATRIC SURGERY.
Costa, Ana Júlia Rosa Barcelos; Pinto, Sônia Lopes
2015-01-01
Obesity decreases quality of life, which is aggravated by associated comorbidities, and binge eating disorder is directly related to body image and predisposes to overweight. The aim was to evaluate the association between the presence and severity of binge eating disorder and the quality of life of obese candidates for bariatric surgery. This cross-sectional study analyzed anthropometric data (weight and height) and socioeconomic data (age, sex, marital status, education, and income). The Binge Eating Scale was applied for the diagnosis of binge eating disorder, and the Medical Outcomes Study 36-item Short-Form Health Survey was used to assess quality of life. The total sample comprised 96 patients, with a mean age of 38.15±9.6 years; 80.2% were female, 67.7% were married, 41% had complete or incomplete higher education, 77.1% had an income lower than or equal to four minimum salaries, and 59.3% had grade III obesity. Binge eating disorder was observed in 44.2% of patients (29.9% moderate and 14.3% severe), and these patients had the worst scores in all domains of quality of life on the SF-36 scale; however, this difference was not statistically significant. Only nutritional status presented a statistically significant association with the presence of binge eating disorder. A high prevalence of patients with binge eating disorder was found, and they presented the worst scores in all domains of quality of life.
Thinking Globally, Acting Locally: Using the Local Environment to Explore Global Issues.
ERIC Educational Resources Information Center
Simmons, Deborah
1994-01-01
Asserts that water pollution is a global problem and presents statistics indicating how much of the world's water is threatened. Presents three elementary school classroom activities on water quality and local water resources. Includes a figure describing the work of the Global Rivers Environmental Education Network. (CFR)
ERIC Educational Resources Information Center
Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain
2004-01-01
Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noo, F; Guo, Z
2016-06-15
Purpose: Penalized weighted least-squares reconstruction has become an important research topic in CT as a means to reduce dose without affecting image quality. Two components impact image quality in this reconstruction: the statistical weights and the use of an edge-preserving penalty term. We are interested in assessing the influence of the statistical weights on their own, without the edge-preserving feature. Methods: The influence of statistical weights on image quality was assessed in terms of low-contrast detail detection using LROC analysis. The task was to detect and localize a 6-mm lesion with random contrast inside the FORBILD head phantom. A two-alternative forced-choice experiment was used, with two human observers performing the task. Reconstructions without and with statistical weights were compared, both using the same quadratic penalty term. The beam energy was set to 30 keV to amplify spatial differences in attenuation and thereby the role of the statistical weights. A fan-beam data acquisition geometry was used. Results: Visual inspection of the images clearly showed a difference in noise between the two reconstruction methods. As expected, the reconstruction without statistical weights exhibited noise streaks. The other reconstruction appeared better in this respect, but presented other disturbing noise patterns and artifacts induced by the weights. The LROC analysis yielded the following 95-percent confidence interval for the difference in reader-averaged AUC (reconstruction without weights minus reconstruction with weights): [0.0026, 0.0599]. The mean AUC value was 0.9094. Conclusion: We have investigated the impact of statistical weights without the use of an edge-preserving penalty in penalized weighted least-squares reconstruction. A decrease rather than an increase in image quality was observed when using statistical weights.
Thus, the observers were better able to cope with the noise streaks than with the noise patterns and artifacts induced by the statistical weights. It may be that different results would be obtained if the penalty term were used with a pixel-dependent weight. F Noo receives research support from Siemens Healthcare GmbH.
Cardiac surgery report cards: comprehensive review and statistical critique.
Shahian, D M; Normand, S L; Torchiana, D F; Lewis, S M; Pastore, J O; Kuntz, R E; Dreyer, P I
2001-12-01
Public report cards and confidential, collaborative peer education represent distinctly different approaches to cardiac surgery quality assessment and improvement. This review discusses the controversies regarding their methodology and relative effectiveness. Report cards have been the more commonly used approach, typically as a result of state legislation. They are based on the presumption that publication of outcomes effectively motivates providers, and that market forces will reward higher quality. Numerous studies have challenged the validity of these hypotheses. Furthermore, although states with report cards have reported significant decreases in risk-adjusted mortality, it is unclear whether this improvement resulted from public disclosure or, rather, from the development of internal quality programs by hospitals. An additional confounding factor is the nationwide decline in heart surgery mortality, including states without quality monitoring. Finally, report cards may engender negative behaviors such as high-risk case avoidance and "gaming" of the reporting system, especially if individual surgeon results are published. The alternative approach, continuous quality improvement, may provide an opportunity to enhance performance and reduce interprovider variability while avoiding the unintended negative consequences of report cards. This collaborative method, which uses exchange visits between programs and determination of best practice, has been highly effective in northern New England and in the Veterans Affairs Administration. However, despite their potential advantages, quality programs based solely on confidential continuous quality improvement do not address the issue of public accountability. For this reason, some states may continue to mandate report cards. 
In such instances, it is imperative that appropriate statistical techniques and report formats are used, and that professional organizations simultaneously implement continuous quality improvement programs. The statistical methodology underlying current report cards is flawed, and does not justify the degree of accuracy presented to the public. All existing risk-adjustment methods have substantial inherent imprecision, and this is compounded when the results of such patient-level models are aggregated and used inappropriately to assess provider performance. Specific problems include sample size differences, clustering of observations, multiple comparisons, and failure to account for the random component of interprovider variability. We advocate the use of hierarchical or multilevel statistical models to address these concerns, as well as report formats that emphasize the statistical uncertainty of the results.
Using SERVQUAL and Kano research techniques in a patient service quality survey.
Christoglou, Konstantinos; Vassiliadis, Chris; Sigalas, Ioakim
2006-01-01
This article presents the results of a service quality study. After an introduction to the SERVQUAL and Kano research techniques, a Kano analysis of 75 patients from the General Hospital of Katerini in Greece is presented. Service quality was assessed using satisfaction and dissatisfaction indices. The results of the Kano statistical analysis strengthened the hypothesis of previous research regarding the importance of personal knowledge, the courtesy of the hospital employees, and their ability to convey trust and confidence (the assurance dimension). Managerial suggestions are made regarding the best way of approaching hospital patients based on the basic SERVQUAL model.
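Kano survey results are commonly condensed into Berger's customer satisfaction (CS) and dissatisfaction (DS) coefficients; whether the Katerini study computed exactly these indices is an assumption, and the category counts below are invented for illustration:

```python
# Hypothetical Kano category counts for one service attribute:
# A = attractive, O = one-dimensional, M = must-be, I = indifferent
A, O, M, I = 20, 25, 18, 12

total = A + O + M + I
cs = (A + O) / total   # satisfaction coefficient: gain when the requirement is met
ds = -(O + M) / total  # dissatisfaction coefficient: loss when it is not met

print(f"CS={cs:.2f} DS={ds:.2f}")
```

A CS near 1 marks attributes whose presence delights patients, while a DS near -1 marks attributes whose absence causes strong dissatisfaction.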
Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio
2013-03-01
To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a score of <6. Our findings document that only a few of the studies reviewed applied statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.
Hansen, John P
2003-01-01
Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 1, presents basic information about data including a classification system that describes the four major types of variables: continuous quantitative variable, discrete quantitative variable, ordinal categorical variable (including the binomial variable), and nominal categorical variable. A histogram is a graph that displays the frequency distribution for a continuous variable. The article also demonstrates how to calculate the mean, median, standard deviation, and variance for a continuous variable.
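The descriptive measures this first article covers (mean, median, standard deviation, and variance for a continuous variable) can be computed with Python's standard library; the wait-time sample below is invented for illustration:

```python
import statistics

# Illustrative sample of a continuous variable (e.g., clinic wait times in minutes)
waits = [12.0, 15.5, 9.0, 22.0, 18.5, 14.0, 11.5, 16.0]

mean = statistics.mean(waits)      # arithmetic average
median = statistics.median(waits)  # middle value of the sorted sample
sd = statistics.stdev(waits)       # sample standard deviation (n - 1 denominator)
var = statistics.variance(waits)   # square of the sample standard deviation

print(f"mean={mean:.2f} median={median:.2f} sd={sd:.2f} variance={var:.2f}")
```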
Game Location and Team Quality Effects on Performance Profiles in Professional Soccer
Lago-Peñas, Carlos; Lago-Ballesteros, Joaquin
2011-01-01
Home advantage in team sports has an important role in determining the outcome of a game. The aim of the present study was to identify the soccer game-related statistics that best discriminate home and visiting teams according to team quality. The sample included all 380 games of the Spanish professional men's league. The independent variables were game location (home or away) and team quality. Teams were classified into four groups according to their final ranking at the end of the league. The game-related statistics registered were divided into three groups: (i) variables related to goals scored; (ii) variables related to offense; and (iii) variables related to defense. Univariate (t-test and Mann-Whitney U) and multivariate (discriminant analysis) analyses of the data were performed. Results showed that home teams had significantly higher means for goals scored, total shots, shots on goal, attacking moves, box moves, crosses, offsides committed, assists, passes made, successful passes, dribbles made, successful dribbles, ball possession, and gains of possession, while visiting teams presented higher means for losses of possession and yellow cards. In addition, the findings of the current study confirm that game location and team quality are important in determining technical and tactical performances in matches. Teams described as superior and those described as inferior did not experience the same home advantage. Future research should consider the influence of other confounding variables such as weather conditions, game status and team form. Key points: Home teams have significantly higher figures for attack indicators, probably due to facility familiarity and crowd effects. The teams' game-related statistics profile varied according to game location and team quality. Teams described as superior and those described as inferior did not experience the same home advantage. PMID:24150619
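A minimal sketch of the univariate comparisons described above, pairing the parametric t-test with the non-parametric Mann-Whitney U; the shot counts are invented and SciPy is assumed to be available:

```python
from scipy import stats

# Hypothetical shots-on-goal counts for home and visiting teams (illustrative only)
home = [7, 9, 6, 8, 10, 7, 9, 11, 8, 6]
away = [4, 5, 6, 3, 5, 4, 6, 5, 4, 7]

# Parametric comparison of means
t_stat, t_p = stats.ttest_ind(home, away)

# Non-parametric rank-based comparison
u_stat, u_p = stats.mannwhitneyu(home, away, alternative="two-sided")

print(f"t-test p={t_p:.4f}, Mann-Whitney U p={u_p:.4f}")
```

Agreement between the two tests, as here, suggests the difference is robust to the normality assumption of the t-test.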
A quality assessment of randomized controlled trial reports in endodontics.
Lucena, C; Souza, E M; Voinea, G C; Pulgar, R; Valderrama, M J; De-Deus, G
2017-03-01
To assess the quality of the randomized clinical trial (RCT) reports published in Endodontics between 1997 and 2012. Retrieval of RCTs in Endodontics was based on a search of the Thomson Reuters Web of Science (WoS) database (March 2013). Quality evaluation was performed using a checklist based on the Jadad criteria, CONSORT (Consolidated Standards of Reporting Trials) statement and SPIRIT (Standard Protocol Items: Recommendations for Interventional Trials). Descriptive statistics were used for frequency distribution of data. Student's t-test and Welch test were used to identify the influence of certain trial characteristics upon report quality (α = 0.05). A total of 89 RCTs were evaluated, and several methodological flaws were found: only 45% had random sequence generation at low risk of bias, 75% did not provide information on allocation concealment, and 19% were nonblinded designs. Regarding statistics, only 55% of the RCTs performed adequate sample size estimations, only 16% presented confidence intervals, and 25% did not provide the exact P-value. Also, 2% of the articles used no statistical tests, and in 87% of the RCTs, the information provided was insufficient to determine whether the statistical methodology applied was appropriate or not. Significantly higher scores were observed for multicentre trials (P = 0.023), RCTs signed by more than 5 authors (P = 0.03), articles belonging to journals ranked above the JCR median (P = 0.03), and articles complying with the CONSORT guidelines (P = 0.000). The quality of RCT reports in key areas for internal validity of the study was poor. Several measures, such as compliance with the CONSORT guidelines, are important in order to raise the quality of RCTs in Endodontics. © 2016 International Endodontic Journal. Published by John Wiley & Sons Ltd.
Understanding quantitative research: part 1.
Hoe, Juanita; Hoare, Zoë
This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.
What affects the subjective sleep quality of hospitalized elderly patients?
Park, Mi Jeong; Kim, Kon Hee
2017-03-01
The present study aimed to identify the factors affecting subjective sleep quality in elderly inpatients. The participants were 290 older adults admitted to three general hospitals. Data were collected using a structured questionnaire consisting of scales for general characteristics, sleep quality, activities of daily living, instrumental activities of daily living and depression. Collected data were analyzed by descriptive statistics, t-test, one-way ANOVA, Scheffé post-hoc test, Pearson's correlation coefficient and stepwise multiple regression. There were statistically significant differences in sleep quality according to age, education level, marital status, monthly income and number of cohabitants. The most powerful predictor of sleep quality was depression (P < 0.01, R² = 0.30). Five variables (depression, perceived health status, diagnosis, number of cohabitants and duration of hospitalization) explained 43.0% of the total variance in sleep quality. Elderly inpatients suffered from low sleep quality, and depression affected their sleep. Hospital-tailored sleep interventions that take older adults' depression into account should be developed and applied to improve hospitalized older adults' sleep. Furthermore, it would be useful to identify other sleep-related factors. Geriatr Gerontol Int 2017; 17: 471-479. © 2016 Japan Geriatrics Society.
Statistical tools for transgene copy number estimation based on real-time PCR.
Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal
2007-11-01
Compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR-based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and data quality control needed to render a reliable estimation of copy number with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advancements into real-time PCR-based transgene copy number determination. Three experimental designs and four statistical models with integrated data quality control are presented. In the first method, external calibration curves are established for the transgene based on serially diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are compared to derive the transgene copy number or zygosity estimation. Simple linear regression and two-group t-test procedures were combined to model the data from this design. In the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of the transgene was compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, transgene copy number is compared with the reference gene without a standard curve, based directly on fluorescence data. Two different multiple regression models were proposed to analyze the data, based on two different approaches to amplification efficiency integration. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination.
These statistical methods make real-time PCR-based transgene copy number estimation more reliable and precise. Proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four statistical methods are compared for their advantages and disadvantages. Moreover, the statistical methods can also be applied to other real-time PCR-based quantification assays, including transfection efficiency analysis and pathogen quantification.
Optimizing construction quality management of pavements using mechanistic performance analysis.
DOT National Transportation Integrated Search
2004-08-01
This report presents a statistical-based algorithm that was developed to reconcile the results from several pavement performance models used in the state of practice with systematic process control techniques. These algorithms identify project-specif...
Computer assisted outcomes research in orthopedics: total joint replacement.
Arslanian, C; Bond, M
1999-06-01
Long-term studies are needed to determine clinically relevant outcomes within the practice of orthopedic surgery. Historically, the patient's subjective feelings of quality of life have been largely ignored. However, there has been a strong movement toward measuring perceived quality of life through such instruments as the SF-36. Results from a large database from an orthopedic practice are presented. First, computerized data entry using touch-screen technology is not only cost-effective but user friendly. Second, patients undergoing hip or knee arthroplasty make statistically significant improvements in seven of the eight domains of the SF-36 in the first 3 months after surgery. Additional statistically significant improvements are seen over the next 6 to 12 months. The data are presented here in detail to demonstrate the benefits of a patient outcomes program, to enhance the understanding and use of outcomes data, and to encourage further work in outcomes measurement in orthopedics.
Evaluation of air quality in a megacity using statistics tools
NASA Astrophysics Data System (ADS)
Ventura, Luciana Maria Baptista; de Oliveira Pinto, Fellipe; Soares, Laiza Molezon; Luna, Aderval Severino; Gioda, Adriana
2018-06-01
Local physical characteristics (e.g., meteorology and topography) associated with particle concentrations are important for evaluating air quality in a region, because meteorology and topography affect air pollutant dispersion. This study used statistical tools (PCA, HCA, Kruskal-Wallis, the Mann-Whitney test and others) to better understand the relationship between fine particulate matter (PM2.5) levels and seasons, meteorological conditions and air basins. To our knowledge, it is one of the few studies performed in Latin America involving all of these parameters together. PM2.5 samples were collected at six sampling sites with different emission sources (industrial, vehicular, soil dust) in Rio de Janeiro, Brazil. The PM2.5 daily concentrations ranged from 1 to 61 µg m-3, with averages higher than the annual limit (15 µg m-3) for some of the sites. The statistical evaluation showed that PM2.5 concentrations were not influenced by seasonality. Furthermore, previously defined air basins were not confirmed, because some sites presented similar emission sources. Therefore, the air basins need to be redefined, since they are important to air quality management.
Evaluation of air quality in a megacity using statistics tools
NASA Astrophysics Data System (ADS)
Ventura, Luciana Maria Baptista; de Oliveira Pinto, Fellipe; Soares, Laiza Molezon; Luna, Aderval Severino; Gioda, Adriana
2017-03-01
Local physical characteristics (e.g., meteorology and topography) associated with particle concentrations are important for evaluating air quality in a region, because meteorology and topography affect air pollutant dispersion. This study used statistical tools (PCA, HCA, Kruskal-Wallis, the Mann-Whitney test and others) to better understand the relationship between fine particulate matter (PM2.5) levels and seasons, meteorological conditions and air basins. To our knowledge, it is one of the few studies performed in Latin America involving all of these parameters together. PM2.5 samples were collected at six sampling sites with different emission sources (industrial, vehicular, soil dust) in Rio de Janeiro, Brazil. The PM2.5 daily concentrations ranged from 1 to 61 µg m-3, with averages higher than the annual limit (15 µg m-3) for some of the sites. The statistical evaluation showed that PM2.5 concentrations were not influenced by seasonality. Furthermore, previously defined air basins were not confirmed, because some sites presented similar emission sources. Therefore, the air basins need to be redefined, since they are important to air quality management.
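The seasonal comparison described in these abstracts can be sketched with SciPy's Kruskal-Wallis test; the PM2.5 values below are invented and do not come from the Rio de Janeiro dataset:

```python
from scipy import stats

# Hypothetical daily PM2.5 concentrations (µg/m³) grouped by season (illustrative)
summer = [12, 15, 9, 18, 14, 11, 16, 13]
winter = [14, 10, 17, 12, 15, 13, 11, 16]
autumn = [13, 16, 12, 10, 15, 14, 17, 11]

# Kruskal-Wallis H-test: do the seasonal distributions differ?
h_stat, p_value = stats.kruskal(summer, winter, autumn)

# A large p-value would be consistent with the study's finding that
# PM2.5 concentrations were not influenced by seasonality
print(f"H={h_stat:.3f}, p={p_value:.3f}")
```

The rank-based test is a natural choice here because daily pollutant concentrations are typically skewed and need not be normally distributed.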
Towards a new tool for the evaluation of the quality of ultrasound compressed images.
Delgorge, Cécile; Rosenberger, Christophe; Poisson, Gérard; Vieyres, Pierre
2006-11-01
This paper presents a new tool for the evaluation of ultrasound image compression. The goal is to measure image quality as easily as with a statistical criterion, and with the same reliability as that provided by medical assessment. An initial experiment involving medical experts provides our reference value for the comparison of evaluation criteria. Twenty-one statistical criteria are selected from the literature. A cumulative absolute similarity measure is defined as a distance between the criterion to evaluate and the reference value. A first fusion method based on a linear combination of criteria is proposed to improve the results obtained by each of them separately. The second proposed approach combines different statistical criteria and uses the medical assessment in a training phase with a support vector machine. Experimental results are given and show the benefit of fusion.
Defraene, Bruno; van Waterschoot, Toon; Diehl, Moritz; Moonen, Marc
2016-07-01
Subjective audio quality evaluation experiments have been conducted to assess the performance of embedded-optimization-based precompensation algorithms for mitigating perceptible linear and nonlinear distortion in audio signals. It is concluded with statistical significance that the perceived audio quality is improved by applying an embedded-optimization-based precompensation algorithm, both when (i) nonlinear distortion and (ii) a combination of linear and nonlinear distortion is present. Moreover, a significant positive correlation is reported between the collected subjective and objective PEAQ audio quality scores, supporting the validity of using PEAQ to predict the impact of linear and nonlinear distortion on perceived audio quality.
A statistical summary of data from the U.S. Geological Survey's national water quality networks
Smith, R.A.; Alexander, R.B.
1983-01-01
The U.S. Geological Survey operates two nationwide networks to monitor water quality, the National Hydrologic Bench-Mark Network and the National Stream Quality Accounting Network (NASQAN). The Bench-Mark network is composed of 51 stations in small drainage basins which are as close as possible to their natural state, with no human influence and little likelihood of future development. Stations in the NASQAN program are located to monitor flow from accounting units (subregional drainage basins) which collectively encompass the entire land surface of the nation. Data collected at both networks include streamflow and concentrations of major inorganic constituents, nutrients, and trace metals. The goals of the two water quality sampling programs include the determination of mean constituent concentrations and transport rates as well as the analysis of long-term trends in those variables. This report presents a station-by-station statistical summary of data from the two networks for the period 1974 through 1981. (Author's abstract)
A ranking index for quality assessment of forensic DNA profiles
2010-01-01
Background Assessment of DNA profile quality is vital in forensic DNA analysis, both in order to determine the evidentiary value of DNA results and to compare the performance of different DNA analysis protocols. Generally the quality assessment is performed through manual examination of the DNA profiles based on empirical knowledge, or by comparing the intensities (allelic peak heights) of the capillary electrophoresis electropherograms. Results We recently developed a ranking index for unbiased and quantitative quality assessment of forensic DNA profiles, the forensic DNA profile index (FI) (Hedman et al. Improved forensic DNA analysis through the use of alternative DNA polymerases and statistical modeling of DNA profiles, Biotechniques 47 (2009) 951-958). FI uses electropherogram data to combine the intensities of the allelic peaks with the balances within and between loci, using Principal Components Analysis. Here we present the construction of FI. We explain the mathematical and statistical methodologies used and present details about the applied data reduction method. Thereby we show how to adapt the ranking index for any Short Tandem Repeat-based forensic DNA typing system through validation against a manual grading scale and calibration against a specific set of DNA profiles. Conclusions The developed tool provides unbiased quality assessment of forensic DNA profiles. It can be applied to any DNA profiling system based on Short Tandem Repeat markers. Apart from crime-related DNA analysis, FI can therefore be used as a quality tool in paternity or familial testing as well as in disaster victim identification. PMID:21062433
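A toy sketch of a PCA-based quality index in the spirit of FI; this is not the published FI construction, and the feature values, the choice of features, and the use of the first principal component as the score are all assumptions for illustration (NumPy assumed available):

```python
import numpy as np

# Hypothetical per-profile features: mean peak height, intra-locus balance,
# inter-locus balance (rows = DNA profiles; all values are invented)
X = np.array([
    [1200.0, 0.92, 0.85],
    [ 300.0, 0.60, 0.40],
    [ 900.0, 0.88, 0.80],
    [ 150.0, 0.45, 0.30],
    [1100.0, 0.90, 0.82],
])

# Standardize each feature so no single scale dominates the components
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA via eigendecomposition of the covariance matrix of the standardized data
eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))

# Project onto the first principal component (largest eigenvalue) to get a
# single ranking score per profile
pc1 = eigvecs[:, np.argmax(eigvals)]
scores = Z @ pc1
print(scores)
```

Because peak intensities and balances are strongly correlated in practice, the first component captures most of the variance and yields a one-number ranking; the sign of the component is arbitrary and would be fixed by calibration against manually graded profiles.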
The accurate assessment of small-angle X-ray scattering data
Grant, Thomas D.; Luft, Joseph R.; Carter, Lester G.; ...
2015-01-23
Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics to quantitatively describe SAXS data in an objective manner using statistical evaluations are defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. Studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality.
Statistical issues in reporting quality data: small samples and casemix variation.
Zaslavsky, A M
2001-12-01
To present two key statistical issues that arise in the analysis and reporting of quality data. Casemix variation is relevant to quality reporting when the units being measured have differing distributions of patient characteristics that also affect the quality outcome. When this is the case, adjustment using stratification or regression may be appropriate. Such adjustments may be controversial when the patient characteristic does not have an obvious relationship to the outcome. Stratified reporting poses problems for sample size and reporting format, but may be useful when casemix effects vary across units. Although there are no absolute standards of reliability, high reliabilities (inter-unit F ≥ 10 or reliability ≥ 0.9) are desirable for distinguishing above- and below-average units. When small or unequal sample sizes complicate reporting, precision may be improved using indirect estimation techniques that incorporate auxiliary information, and 'shrinkage' estimation can help to summarize the strength of evidence about units with small samples. With a broader understanding of casemix adjustment and of methods for analyzing small samples, quality data can be analyzed and reported more accurately.
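The 'shrinkage' idea can be sketched as pulling each unit's observed rate toward the overall mean, with small-sample units pulled hardest; this is a simplified empirical-Bayes-style estimator, not the specific method advocated in the article, and all numbers are invented:

```python
# Hypothetical observed quality rates at units with very different sample sizes
rates = [0.95, 0.70, 0.88, 1.00]  # observed success proportions per unit
ns = [400, 12, 150, 5]            # patients sampled per unit

# Overall (sample-size-weighted) mean across all units
grand_mean = sum(r * n for r, n in zip(rates, ns)) / sum(ns)

# Shrink each unit toward the grand mean in proportion to its sample size;
# k is an assumed prior strength standing in for the variance-based weights
# of a full hierarchical model
k = 30
shrunk = [(n * r + k * grand_mean) / (n + k) for r, n in zip(rates, ns)]

for r, n, s in zip(rates, ns, shrunk):
    print(f"n={n:4d} observed={r:.2f} shrunk={s:.3f}")
```

Note how the unit with 5 patients and a perfect observed rate is pulled well below 1.00, while the 400-patient unit barely moves: exactly the behavior that prevents small-sample units from dominating a report card.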
VoroMQA: Assessment of protein structure quality using interatomic contact areas.
Olechnovič, Kliment; Venclovas, Česlovas
2017-06-01
In the absence of experimentally determined protein structure many biological questions can be addressed using computational structural models. However, the utility of protein structural models depends on their quality. Therefore, the estimation of the quality of predicted structures is an important problem. One of the approaches to this problem is the use of knowledge-based statistical potentials. Such methods typically rely on the statistics of distances and angles of residue-residue or atom-atom interactions collected from experimentally determined structures. Here, we present VoroMQA (Voronoi tessellation-based Model Quality Assessment), a new method for the estimation of protein structure quality. Our method combines the idea of statistical potentials with the use of interatomic contact areas instead of distances. Contact areas, derived using Voronoi tessellation of protein structure, are used to describe and seamlessly integrate both explicit interactions between protein atoms and implicit interactions of protein atoms with solvent. VoroMQA produces scores at atomic, residue, and global levels, all in the fixed range from 0 to 1. The method was tested on the CASP data and compared to several other single-model quality assessment methods. VoroMQA showed strong performance in the recognition of the native structure and in the structural model selection tests, thus demonstrating the efficacy of interatomic contact areas in estimating protein structure quality. The software implementation of VoroMQA is freely available as a standalone application and as a web server at http://bioinformatics.lt/software/voromqa. Proteins 2017; 85:1131-1145. © 2017 Wiley Periodicals, Inc.
Integrating image quality in 2nu-SVM biometric match score fusion.
Vatsa, Mayank; Singh, Richa; Noore, Afzel
2007-10-01
This paper proposes an intelligent 2nu-support vector machine based match score fusion algorithm to improve the performance of face and iris recognition by integrating the quality of images. The proposed algorithm applies redundant discrete wavelet transform to evaluate the underlying linear and non-linear features present in the image. A composite quality score is computed to determine the extent of smoothness, sharpness, noise, and other pertinent features present in each subband of the image. The match score and the corresponding quality score of an image are fused using 2nu-support vector machine to improve the verification performance. The proposed algorithm is experimentally validated using the FERET face database and the CASIA iris database. The verification performance and statistical evaluation show that the proposed algorithm outperforms existing fusion algorithms.
EPA-ORD MEASUREMENT SCIENCE SUPPORT FOR HOMELAND SECURITY
This presentation will describe the organization and the research and development activities of the ORD National Exposure Measurements Center and will focus on the Center's planned role in providing analytical method development, statistical sampling and design guidance, quality ...
Azarbarzin, Mehrdad; Malekian, Azadeh; Taleghani, Fariba
2015-01-01
Cancer has significant traumatic effects on the family members of patients, particularly in Asia's tightly knit families. Research evidence suggests a debilitating impact of cancer on the quality of life of the afflicted individuals, their spouses, and their families. Because few studies have examined the quality of life of adolescents living with parents diagnosed with cancer, especially in Iran, the research team decided to evaluate their quality of life and to investigate the effects of a supportive-educative program on it. The present quasi-experimental, one-group study had a pre-test-post-test design and was performed in Esfahan in 2014. The sample of this study consisted of 30 adolescents. The data gathering tool was the short form of the quality of life questionnaire (SF-36). Data were analyzed by descriptive statistics and the paired sample t-test. P-values below 0.05 were considered significant. The paired sample t-test showed that before and after presenting the program, there were statistically significant differences in some aspects of quality of life, such as physical functioning (P = 0.01), energy/fatigue (P < 0.0001), emotional well-being (P < 0.0001), social functioning (P = 0.001), pain (P < 0.0001), and general health (P = 0.01). This research showed that a supportive-educative program can enhance some aspects of quality of life. Therefore, nurses and other health professionals can use this scheme or similar programs to help adolescents living with a parent with cancer.
Quality assessment of butter cookies applying multispectral imaging
Andresen, Mette S; Dissing, Bjørn S; Løje, Hanne
2013-01-01
A method for characterization of butter cookie quality by assessing the surface browning and water content using multispectral images is presented. Based on evaluations of the browning of butter cookies, cookies were manually divided into groups. From this categorization, reference values were calculated for a statistical prediction model correlating multispectral images with a browning score. The browning score is calculated as a function of oven temperature and baking time. It is presented as a quadratic response surface. The investigated process window was the intervals 4–16 min and 160–200°C in a forced convection electrically heated oven. In addition to the browning score, a model for predicting the average water content based on the same images is presented. This shows how multispectral images of butter cookies may be used for the assessment of different quality parameters. Statistical analysis showed that the most significant wavelengths for browning predictions were in the interval 400–700 nm and the wavelengths significant for water prediction were primarily located in the near-infrared spectrum. The water prediction model was found to correctly estimate the average water content with an absolute error of 0.22%. From the images it was also possible to follow the browning and drying propagation from the cookie edge toward the center. PMID:24804036
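A quadratic response surface of the kind described can be fitted by ordinary least squares. The sketch below is illustrative only and uses a single predictor (baking time) rather than the paper's two (time and temperature); it solves the normal equations for y = a + b·t + c·t² with Cramer's rule:

```python
def det3(m):
    """Determinant of a 3x3 matrix (cofactor expansion along row 0)."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def fit_quadratic(t, y):
    """Least-squares fit of y = a + b*t + c*t^2 via the normal equations."""
    n = len(t)
    s1, s2 = sum(t), sum(x * x for x in t)
    s3, s4 = sum(x ** 3 for x in t), sum(x ** 4 for x in t)
    rhs = [sum(y),
           sum(x * v for x, v in zip(t, y)),
           sum(x * x * v for x, v in zip(t, y))]
    a_mat = [[n, s1, s2], [s1, s2, s3], [s2, s3, s4]]
    d = det3(a_mat)
    coeffs = []
    for i in range(3):
        m_i = [row[:] for row in a_mat]
        for r in range(3):
            m_i[r][i] = rhs[r]
        coeffs.append(det3(m_i) / d)
    return coeffs  # [intercept a, linear b, quadratic c]

# Baking times (min) with browning scores generated from an exact quadratic
times = [0, 1, 2, 3, 4]
scores = [1.0, 3.5, 7.0, 11.5, 17.0]  # 1 + 2*t + 0.5*t^2
a, b, c = fit_quadratic(times, scores)
```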
Evaluating the statistical methodology of randomized trials on dentin hypersensitivity management.
Matranga, Domenica; Matera, Federico; Pizzo, Giuseppe
2017-12-27
The present study aimed to evaluate the characteristics and quality of the statistical methodology used in clinical studies on dentin hypersensitivity management. An electronic search was performed for data published from 2009 to 2014 using the PubMed, Ovid/MEDLINE, and Cochrane Library databases. The primary search terms were used in combination. Eligibility criteria included randomized clinical trials that evaluated the efficacy of desensitizing agents in terms of reducing dentin hypersensitivity. A total of 40 studies were considered eligible for assessment of the quality of their statistical methodology. The four main concerns identified were i) use of nonparametric tests in the presence of large samples, coupled with lack of information about normality and equality of variances of the response; ii) lack of P-value adjustment for multiple comparisons; iii) failure to account for interactions between treatment and follow-up time; and iv) no information about the number of teeth examined per patient and the consequent lack of a cluster-specific approach in data analysis. Owing to these concerns, statistical methodology was judged as inappropriate in 77.1% of the 35 studies that used parametric methods. Additional studies with appropriate statistical analysis are required to obtain an adequate assessment of the efficacy of desensitizing agents.
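The second concern above, missing P-value adjustment for multiple comparisons, is straightforward to address. A minimal sketch of the Bonferroni correction (the p-values are illustrative, not taken from the reviewed trials):

```python
def bonferroni(p_values, alpha=0.05):
    """Bonferroni correction: multiply each raw p-value by the number
    of comparisons (capped at 1.0), then compare to the nominal alpha."""
    m = len(p_values)
    adjusted = [min(1.0, p * m) for p in p_values]
    significant = [p_adj < alpha for p_adj in adjusted]
    return adjusted, significant

# Three pairwise comparisons from a hypothetical desensitizing-agent trial
adj, sig = bonferroni([0.01, 0.04, 0.30])
```

With three comparisons, a raw p-value of 0.04 is no longer significant after adjustment, which is precisely the kind of error the review flags.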
State Estimates of Adolescent Cigarette Use and Perceptions of Risk of Smoking: 2012 and 2013
... data from the combined 2012 and 2013 National Surveys on Drug Use and Health (NSDUHs) to present ... Center for Behavioral Health Statistics and Quality, National Surveys on Drug Use and Health (NSDUHs), 2012 and ...
Qualitative Assessment of IVHS Emission and Air Quality Impacts
DOT National Transportation Integrated Search
2000-04-07
The purpose of this document is to present state-level statistics for the CVISN deployment described in the national report. These data will allow state stakeholders to evaluate their own deployment standings in relation to national averages. The nat...
Shuman, William P; Chan, Keith T; Busey, Janet M; Mitsumori, Lee M; Choi, Eunice; Koprowicz, Kent M; Kanal, Kalpana M
2014-12-01
To investigate whether reduced radiation dose liver computed tomography (CT) images reconstructed with model-based iterative reconstruction (MBIR) might compromise depiction of clinically relevant findings or might have decreased image quality when compared with clinical standard radiation dose CT images reconstructed with adaptive statistical iterative reconstruction (ASIR). With institutional review board approval, informed consent, and HIPAA compliance, 50 patients (39 men, 11 women) who underwent liver CT were prospectively included. After a portal venous pass with ASIR images, a 60% reduced radiation dose pass was added with MBIR images. One reviewer scored ASIR image quality and marked findings. Two additional independent reviewers noted whether marked findings were present on MBIR images and assigned scores for relative conspicuity, spatial resolution, image noise, and image quality. Liver and aorta Hounsfield units and image noise were measured. Volume CT dose index and size-specific dose estimate (SSDE) were recorded. Qualitative reviewer scores were summarized. Formal statistical inference for signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), volume CT dose index, and SSDE was made (paired t tests), with Bonferroni adjustment. Two independent reviewers identified all 136 ASIR image findings (n = 272) on MBIR images, scoring them as equal or better for conspicuity, spatial resolution, and image noise in 94.1% (256 of 272), 96.7% (263 of 272), and 99.3% (270 of 272), respectively.
In 50 image sets, two reviewers (n = 100) scored overall image quality as sufficient or good with MBIR in 99% (99 of 100). Liver SNR was significantly greater for MBIR (10.8 ± 2.5 [standard deviation] vs 7.7 ± 1.4, P < .001); there was no difference for CNR (2.5 ± 1.4 vs 2.4 ± 1.4, P = .45). For ASIR and MBIR, respectively, volume CT dose index was 15.2 mGy ± 7.6 versus 6.2 mGy ± 3.6; SSDE was 16.4 mGy ± 6.6 versus 6.7 mGy ± 3.1 (P < .001). Liver CT images reconstructed with MBIR may allow up to 59% radiation dose reduction compared with the dose with ASIR, without compromising depiction of findings or image quality. © RSNA, 2014.
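The signal-to-noise and contrast-to-noise figures reported above follow from simple region-of-interest (ROI) statistics. A hedged sketch using one common set of definitions (the abstract does not spell out its exact formulas, and the measurements below are made up):

```python
def snr(roi_mean, noise_sd):
    """Signal-to-noise ratio: mean ROI attenuation over image noise."""
    return roi_mean / noise_sd

def cnr(roi_mean, background_mean, noise_sd):
    """Contrast-to-noise ratio: attenuation difference over image noise."""
    return abs(roi_mean - background_mean) / noise_sd

# Hypothetical ROI measurements in Hounsfield units
liver_snr = snr(108.0, 10.0)        # -> 10.8
liver_cnr = cnr(108.0, 83.0, 10.0)  # -> 2.5
```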
Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia
2010-05-25
High quality clinical research not only requires advanced professional knowledge, but also needs sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China during the first decade of the new millennium. Ten (10) leading Chinese medical journals were selected and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportions in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008, the error/defect proportion in statistical analyses decreased significantly (χ² = 12.03, p < 0.001), from 59.8% (545/1,335) in 1998 to 52.2% (664/1,578) in 2008. The overall error/defect proportion in study design also decreased (χ² = 21.22, p < 0.001), from 50.9% (680/1,335) to 42.4% (669/1,578). In 2008, randomized clinical trials remained in the single digits (3.8%, 60/1,578), with two-thirds showing poor reporting of results (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature: 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Decreases in defect proportions were observed in both results presentation (χ² = 93.26, p < 0.001), from 92.7% (945/1,019) to 78.2% (1,023/1,309), and interpretation (χ² = 27.26, p < 0.001), from 9.7% (99/1,019) to 4.3% (56/1,309), although some serious defects persisted.
Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative.
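The year-to-year comparisons of error proportions reported above are chi-square tests on 2×2 contingency tables. A self-contained sketch of the Pearson χ² statistic (the table is illustrative, not the study's data):

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], without continuity correction."""
    row_totals = [sum(row) for row in table]
    col_totals = [table[0][j] + table[1][j] for j in range(2)]
    grand = sum(row_totals)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# e.g. articles with vs without defects, in two publication years (made up)
stat = chi_square_2x2([[10, 20], [20, 10]])
```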
van Gelder, P.H.A.J.M.; Nijs, M.
2011-01-01
Decisions about pharmacotherapy are taken by medical doctors and authorities based on comparative studies of the use of medications. In studies on fertility treatments in particular, methodological quality is of utmost importance for the application of evidence-based medicine and systematic reviews. Nevertheless, flaws and omissions appear quite regularly in these types of studies. The current study aims to present an overview of some typical statistical flaws, illustrated by a number of example studies that have been published in peer-reviewed journals. Based on an investigation of eleven randomly selected studies on fertility treatments with cryopreservation, it appeared that the methodological quality of these studies often did not fulfil the required statistical criteria. The following statistical flaws were identified: flaws in study design, patient selection, and units of analysis or in the definition of the primary endpoints. Other errors could be found in p-value and power calculations or in critical p-value definitions. Proper interpretation of the results and/or use of these study results in a meta-analysis should therefore be conducted with care. PMID:24753877
Artificial Intelligence Approach to Support Statistical Quality Control Teaching
ERIC Educational Resources Information Center
Reis, Marcelo Menezes; Paladini, Edson Pacheco; Khator, Suresh; Sommer, Willy Arno
2006-01-01
Statistical quality control--SQC (consisting of Statistical Process Control, Process Capability Studies, Acceptance Sampling and Design of Experiments) is a very important tool to obtain, maintain and improve the Quality level of goods and services produced by an organization. Despite its importance, and the fact that it is taught in technical and…
Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona
2012-01-01
Statistical process control is the application of statistical methods to the measurement and analysis of variation in a process. Various regulatory authorities such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessment, normal probability distributions, control charts, and capability charts are employed for the selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to areas where applying quality improvement and quality risk assessment principles can achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters.
Sequential application of normal probability distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable processes, which are candidates for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
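The process capability discussed above is commonly summarized by the Cp and Cpk indices. A minimal sketch with hypothetical specification limits and tablet-weight statistics (not values from the study):

```python
def capability_indices(mean, sd, lsl, usl):
    """Cp compares the specification width to the 6-sigma process spread;
    Cpk additionally penalizes an off-center process mean."""
    cp = (usl - lsl) / (6 * sd)
    cpk = min(usl - mean, mean - lsl) / (3 * sd)
    return cp, cpk

# Hypothetical tablet weights (mg): spec 90-110, process mean 102, sd 2
cp, cpk = capability_indices(mean=102.0, sd=2.0, lsl=90.0, usl=110.0)
```

Here Cpk is noticeably lower than Cp because the process mean sits off-center within the specification window, the typical signal that centering the process would pay off before reducing its spread.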
Wisconsin's forest, 2004: statistics and quality assurance
Mark H. Hansen; Charles H. Perry; Gary Brand; Ronald E. McRoberts
2008-01-01
The first full, annualized inventory of Wisconsin's forests was completed in 2004 after 6,478 forested plots were visited. An earlier publication summarized the results and presented issue-driven analyses (Perry et al. 2008). This report includes detailed information on forest inventory methods...
Assessment of water quality parameters using multivariate analysis for Klang River basin, Malaysia.
Mohamed, Ibrahim; Othman, Faridah; Ibrahim, Adriana I N; Alaa-Eldin, M E; Yunus, Rossita M
2015-01-01
This case study uses several univariate and multivariate statistical techniques to evaluate and interpret a water quality data set obtained from the Klang River basin located within the state of Selangor and the Federal Territory of Kuala Lumpur, Malaysia. The river drains an area of 1,288 km(2), from the steep mountain rainforests of the main Central Range along Peninsular Malaysia to the river mouth in Port Klang, into the Straits of Malacca. Water quality was monitored at 20 stations, nine of which are situated along the main river and 11 along six tributaries. Data was collected from 1997 to 2007 for seven parameters used to evaluate the status of the water quality, namely dissolved oxygen, biochemical oxygen demand, chemical oxygen demand, suspended solids, ammoniacal nitrogen, pH, and temperature. The data were first investigated using descriptive statistical tools, followed by two practical multivariate analyses that reduced the data dimensions for better interpretation. The analyses employed were factor analysis and principal component analysis, which explain 60 and 81.6% of the total variation in the data, respectively. We found that the resulting latent variables from the factor analysis are interpretable and beneficial for describing the water quality in the Klang River. This study presents the usefulness of several statistical methods in evaluating and interpreting water quality data for the purpose of monitoring the effectiveness of water resource management. The results should provide more straightforward data interpretation as well as valuable insight for managers to conceive optimum action plans for controlling pollution in river water.
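The dimension reduction used in the study can be illustrated in miniature. For just two variables, the principal components follow in closed form from the 2×2 covariance matrix; the numbers below are made up and are not the Klang River data:

```python
import math

def pca_2x2(var_x, var_y, cov_xy):
    """Eigenvalues of a 2x2 covariance matrix in closed form, plus the
    fraction of total variance explained by the first principal component."""
    trace = var_x + var_y
    det = var_x * var_y - cov_xy ** 2
    disc = math.sqrt(trace ** 2 - 4 * det)
    lam1 = (trace + disc) / 2
    lam2 = (trace - disc) / 2
    return lam1, lam2, lam1 / (lam1 + lam2)

# Two strongly correlated water-quality indicators (illustrative)
lam1, lam2, explained = pca_2x2(2.0, 2.0, 1.0)
```

With a covariance of 1.0 between the two variables, the first component alone explains 75% of the variance, which is the same logic by which the study's principal component analysis compresses seven parameters into a few interpretable factors.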
Design of experiments (DoE) in pharmaceutical development.
N Politis, Stavros; Colombo, Paolo; Colombo, Gaia; M Rekkas, Dimitrios
2017-06-01
At the beginning of the twentieth century, Sir Ronald Fisher introduced the concept of applying statistical analysis during the planning stages of research rather than at the end of experimentation. When statistical thinking is applied from the design phase, it enables quality to be built into the product, by adopting Deming's profound knowledge approach, comprising system thinking, understanding of variation, theory of knowledge, and psychology. The pharmaceutical industry was late in adopting these paradigms, compared to other sectors. It heavily focused on blockbuster drugs, while formulation development was mainly performed by One Factor At a Time (OFAT) studies, rather than implementing Quality by Design (QbD) and modern engineering-based manufacturing methodologies. Among various mathematical modeling approaches, Design of Experiments (DoE) is extensively used for the implementation of QbD in both research and industrial settings. In QbD, product and process understanding is the key enabler of assuring quality in the final product. Knowledge is achieved by establishing models correlating the inputs with the outputs of the process. The mathematical relationships of the Critical Process Parameters (CPPs) and Critical Material Attributes (CMAs) with the Critical Quality Attributes (CQAs) define the design space. Consequently, process understanding is well assured and rationally leads to a final product meeting the Quality Target Product Profile (QTPP). This review illustrates the principles of quality theory through the work of major contributors, the evolution of the QbD approach, and the statistical toolset for its implementation. As such, DoE is presented in detail, since it represents the first choice for rational pharmaceutical development.
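The simplest DoE run sheet, a two-level full factorial design, can be generated in a few lines. The factor names and levels below are hypothetical examples for a granulation step, not values from the review:

```python
from itertools import product

def full_factorial(factors):
    """Enumerate every combination of factor levels (a 2^k design when
    each factor has exactly two levels)."""
    names = list(factors)
    return [dict(zip(names, combo))
            for combo in product(*(factors[n] for n in names))]

# Hypothetical critical process parameters and their low/high levels
design = full_factorial({
    "mixing_time_min": [5, 15],
    "binder_pct": [2.0, 4.0],
    "temp_C": [40, 60],
})
```

Three two-level factors yield 2³ = 8 runs; fitting a model to the responses from these runs is what lets QbD map CPPs and CMAs to CQAs instead of varying one factor at a time.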
Reflexion on linear regression trip production modelling method for ensuring good model quality
NASA Astrophysics Data System (ADS)
Suprayitno, Hitapriya; Ratnasari, Vita
2017-11-01
Transport modelling is important. For certain cases the conventional model still has to be used, in which having a good trip production model is essential. A good model can only be obtained from a good sample. Two basic principles of good sampling are having a sample capable of representing the population characteristics and capable of producing an acceptable error at a given confidence level. It seems that these principles are not yet well understood and applied in trip production modelling. Therefore, investigating trip production modelling practice in Indonesia and trying to formulate a better modelling method for ensuring model quality is necessary. The results of this research are as follows. Statistics provides a method to calculate the span of predicted values at a given confidence level for linear regression, called the Confidence Interval of the Predicted Value. Common modelling practice uses R2 as the principal quality measure, while sampling practice varies and does not always conform to sampling principles. An experiment indicates that a small sample is already capable of giving an excellent R2 value and that sample composition can significantly change the model. Hence, a good R2 value does not, in fact, always mean good model quality. These observations lead to three basic ideas for ensuring good model quality: reformulating the quality measure, the calculation procedure, and the sampling method. A quality measure is defined as having both a good R2 value and a good Confidence Interval of the Predicted Value. The calculation procedure must incorporate statistical calculation methods and the appropriate statistical tests needed. A good sampling method must incorporate random, well-distributed, stratified sampling with a certain minimum number of samples. These three ideas need to be further developed and tested.
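The Confidence Interval of the Predicted Value can be sketched for simple linear regression. The example below uses a large-sample normal quantile (1.96) in place of the exact Student-t value, and made-up trip data:

```python
import math

def predict_with_interval(x, y, x0, z=1.96):
    """Least-squares fit of y = a + b*x, plus an approximate 95%
    prediction interval at x0 (normal quantile used for simplicity)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    resid_var = sum((yi - (a + b * xi)) ** 2
                    for xi, yi in zip(x, y)) / (n - 2)
    se = math.sqrt(resid_var * (1 + 1 / n + (x0 - mx) ** 2 / sxx))
    y0 = a + b * x0
    return y0, y0 - z * se, y0 + z * se

# Hypothetical zone sizes (x) and observed trip productions (y)
y0, lo, hi = predict_with_interval([1, 2, 3, 4, 5],
                                   [2.1, 3.9, 6.2, 8.0, 9.8], x0=6)
```

Note that the interval widens as x0 moves away from the sample mean, so a model with an excellent R2 can still give unacceptably wide prediction intervals outside the sampled range, which is the paper's central point.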
Phung, Dung; Huang, Cunrui; Rutherford, Shannon; Dwirahmadi, Febi; Chu, Cordia; Wang, Xiaoming; Nguyen, Minh; Nguyen, Nga Huy; Do, Cuong Manh; Nguyen, Trung Hieu; Dinh, Tuan Anh Diep
2015-05-01
The present study is an evaluation of temporal/spatial variations of surface water quality using multivariate statistical techniques, comprising cluster analysis (CA), principal component analysis (PCA), factor analysis (FA) and discriminant analysis (DA). Eleven water quality parameters were monitored at 38 different sites in Can Tho City, a Mekong Delta area of Vietnam from 2008 to 2012. Hierarchical cluster analysis grouped the 38 sampling sites into three clusters, representing mixed urban-rural areas, agricultural areas and industrial zone. FA/PCA resulted in three latent factors for the entire research location, three for cluster 1, four for cluster 2, and four for cluster 3 explaining 60, 60.2, 80.9, and 70% of the total variance in the respective water quality. The varifactors from FA indicated that the parameters responsible for water quality variations are related to erosion from disturbed land or inflow of effluent from sewage plants and industry, discharges from wastewater treatment plants and domestic wastewater, agricultural activities and industrial effluents, and contamination by sewage waste with faecal coliform bacteria through sewer and septic systems. Discriminant analysis (DA) revealed that nephelometric turbidity units (NTU), chemical oxygen demand (COD) and NH₃ are the discriminating parameters in space, affording 67% correct assignation in spatial analysis; pH and NO₂ are the discriminating parameters according to season, assigning approximately 60% of cases correctly. The findings suggest a possible revised sampling strategy that can reduce the number of sampling sites and the indicator parameters responsible for large variations in water quality. This study demonstrates the usefulness of multivariate statistical techniques for evaluation of temporal/spatial variations in water quality assessment and management.
Impact of syncope on quality of life: validation of a measure in patients undergoing tilt testing.
Nave-Leal, Elisabete; Oliveira, Mário; Pais-Ribeiro, José; Santos, Sofia; Oliveira, Eunice; Alves, Teresa; Cruz Ferreira, Rui
2015-03-01
Recurrent syncope has a significant impact on quality of life. The development of measurement scales to assess this impact that are easy to use in clinical settings is crucial. The objective of the present study is a preliminary validation of the Impact of Syncope on Quality of Life questionnaire for the Portuguese population. The instrument underwent a process of translation, validation, analysis of cultural appropriateness and cognitive debriefing. A population of 39 patients with a history of recurrent syncope (>1 year) who underwent tilt testing, aged 52.1 ± 16.4 years (21-83), 43.5% male, most in active employment (n=18) or retired (n=13), constituted a convenience sample. The resulting Portuguese version is similar to the original, with 12 items in a single aggregate score, and underwent statistical validation, with assessment of reliability, validity and stability over time. With regard to reliability, the internal consistency of the scale is 0.9. Assessment of convergent and discriminant validity showed statistically significant results (p<0.01). Regarding stability over time, a test-retest of this instrument at six months after tilt testing with 22 patients of the sample who had not undergone any clinical intervention found no statistically significant changes in quality of life. The results indicate that this instrument is of value for assessing quality of life in patients with recurrent syncope in Portugal. Copyright © 2014 Sociedade Portuguesa de Cardiologia. Published by Elsevier España. All rights reserved.
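The internal-consistency figure of 0.9 reported above is a Cronbach's alpha. A self-contained sketch of the computation, using toy responses rather than the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score columns (one list of
    respondent scores per item), using sample variances."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    return (k / (k - 1)) * (1 - sum(var(it) for it in items) / var(totals))

# Three perfectly consistent items give the maximum alpha of 1.0
alpha = cronbach_alpha([[1, 2, 3], [1, 2, 3], [1, 2, 3]])
```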
ERIC Educational Resources Information Center
Wolf, Fredric M.
2000-01-01
Presents statistics of deaths caused by medical errors and argues the effects of misconceptions in diagnosis and treatment. Suggests evidence-based medicine to enhance the quality of practice and minimize error rates. Presents 10 evidence-based lessons and discusses the possible benefits of evidence-based medicine to evidence-based education and…
Improved statistical method for temperature and salinity quality control
NASA Astrophysics Data System (ADS)
Gourrion, Jérôme; Szekely, Tanguy
2017-04-01
Climate research and Ocean monitoring benefit from the continuous development of global in-situ hydrographic networks in the last decades. Apart from the increasing volume of observations available on a large range of temporal and spatial scales, a critical aspect concerns the ability to constantly improve the quality of the datasets. In the context of the Coriolis Dataset for ReAnalysis (CORA) version 4.2, a new quality control method based on a local comparison to historical extreme values ever observed is developed, implemented and validated. Temperature, salinity and potential density validity intervals are directly estimated from minimum and maximum values from an historical reference dataset, rather than from traditional mean and standard deviation estimates. Such an approach avoids strong statistical assumptions on the data distributions such as unimodality, absence of skewness and spatially homogeneous kurtosis. As a new feature, it also allows addressing simultaneously the two main objectives of an automatic quality control strategy, i.e. maximizing the number of good detections while minimizing the number of false alarms. The reference dataset is presently built from the fusion of 1) all ARGO profiles up to late 2015, 2) 3 historical CTD datasets and 3) the Sea Mammals CTD profiles from the MEOP database. All datasets are extensively and manually quality controlled. In this communication, the latest method validation results are also presented. The method has already been implemented in the latest version of the delayed-time CMEMS in-situ dataset and will be deployed soon in the equivalent near-real time products.
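The validity-interval test described above can be sketched as a simple range check against historical extremes. The reference values below are illustrative, not taken from the CORA dataset:

```python
def validity_interval(reference_values, margin=0.0):
    """Build a [min, max] validity interval from a historical reference
    sample, optionally widened by a fixed margin."""
    return min(reference_values) - margin, max(reference_values) + margin

def flag_outliers(observations, interval):
    """Return the observations falling outside the validity interval."""
    lo, hi = interval
    return [x for x in observations if x < lo or x > hi]

# Historical sea temperatures (degC) at one location (made-up values)
interval = validity_interval([11.2, 12.8, 14.1, 15.3, 13.7], margin=0.5)
bad = flag_outliers([13.0, 10.2, 16.4, 14.9], interval)
```

Because the bounds come from observed extremes rather than mean-plus-k-sigma, the check makes no assumption of unimodality or symmetry in the data distribution, which is the key advantage the abstract describes.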
Improved Statistical Method For Hydrographic Climatic Records Quality Control
NASA Astrophysics Data System (ADS)
Gourrion, J.; Szekely, T.
2016-02-01
Climate research benefits from the continuous development of global in-situ hydrographic networks in the last decades. Apart from the increasing volume of observations available on a large range of temporal and spatial scales, a critical aspect concerns the ability to constantly improve the quality of the datasets. In the context of the Coriolis Dataset for ReAnalysis (CORA) version 4.2, a new quality control method based on a local comparison to historical extreme values ever observed is developed, implemented and validated. Temperature, salinity and potential density validity intervals are directly estimated from minimum and maximum values from an historical reference dataset, rather than from traditional mean and standard deviation estimates. Such an approach avoids strong statistical assumptions on the data distributions such as unimodality, absence of skewness and spatially homogeneous kurtosis. As a new feature, it also allows addressing simultaneously the two main objectives of a quality control strategy, i.e. maximizing the number of good detections while minimizing the number of false alarms. The reference dataset is presently built from the fusion of 1) all ARGO profiles up to early 2014, 2) 3 historical CTD datasets and 3) the Sea Mammals CTD profiles from the MEOP database. All datasets are extensively and manually quality controlled. In this communication, the latest method validation results are also presented. The method has been implemented in the latest version of the CORA dataset and will benefit to the next version of the Copernicus CMEMS dataset.
Wu, Robert; Glen, Peter; Ramsay, Tim; Martel, Guillaume
2014-06-28
Observational studies dominate the surgical literature. Statistical adjustment is an important strategy to account for confounders in observational studies. Research has shown that published articles are often poor in statistical quality, which may jeopardize their conclusions. The Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines have been published to help establish standards for statistical reporting. This study will seek to determine whether the quality of statistical adjustment and the reporting of these methods are adequate in surgical observational studies. We hypothesize that incomplete reporting will be found in all surgical observational studies, and that the quality and reporting of these methods will be of lower quality in surgical journals when compared with medical journals. Finally, this work will seek to identify predictors of high-quality reporting. This work will examine the top five general surgical and medical journals, based on a 5-year impact factor (2007-2012). All observational studies investigating an intervention related to an essential component area of general surgery (defined by the American Board of Surgery), with an exposure, outcome, and comparator, will be included in this systematic review. Essential elements related to statistical reporting and quality were extracted from the SAMPL guidelines and include domains such as intent of analysis, primary analysis, multiple comparisons, numbers and descriptive statistics, association and correlation analyses, linear regression, logistic regression, Cox proportional hazard analysis, analysis of variance, survival analysis, propensity analysis, and independent and correlated analyses. Each article will be scored as a proportion based on fulfilling criteria in relevant analyses used in the study. A logistic regression model will be built to identify variables associated with high-quality reporting.
A comparison will be made between the scores of surgical observational studies published in medical versus surgical journals. Secondary outcomes will pertain to individual domains of analysis. Sensitivity analyses will be conducted. This study will explore the reporting and quality of statistical analyses in surgical observational studies published in the most referenced surgical and medical journals in 2013 and examine whether variables (including the type of journal) can predict high-quality reporting.
Interactive Video: Meeting the Ford Challenge.
ERIC Educational Resources Information Center
Copeland, Peter
Many companies using Statistical Process Control (SPC) in their manufacturing processes have found that, despite the training difficulties presented by the technique, the rewards of successful SPC include increased productivity, quality, and market leadership. The Ford Motor Company has developed its SPC training with interactive video, which…
Quality control analysis : part IV : field simulation of asphaltic concrete specifications.
DOT National Transportation Integrated Search
1969-02-01
The report presents some of the major findings from a simulated study of statistical specifications on three asphaltic concrete projects representing a total of approximately 30,000 tons of hot mix. The major emphasis of the study has been on the a...
Bodenburg, Sebastian; Dopslaff, Nina
2008-01-01
The Dysexecutive Questionnaire (DEX, Behavioural Assessment of the Dysexecutive Syndrome, 1996) is a standardized instrument for measuring possible behavioral changes resulting from the dysexecutive syndrome. Although initially intended only as a qualitative instrument, the DEX has also increasingly been used to address quantitative problems. Until now, there have been no more fundamental statistical analyses of the questionnaire's test quality. The present study is based on an unselected sample of 191 patients with acquired brain injury and reports data on the quality of the items, the reliability, and the factorial structure of the DEX. Item 3 displayed too great an item difficulty, whereas item 11 was not sufficiently discriminating. The DEX's reliability in self-rating is r = 0.85. In addition to presenting the statistical values of the tests, a clinical severity classification of the overall scores of the 4 factors found and of the questionnaire as a whole is carried out on the basis of quartile standards.
A clinical research analytics toolkit for cohort study.
Yu, Yiqin; Zhu, Yu; Sun, Xingzhi; Tao, Ying; Zhang, Shuo; Xu, Linhao; Pan, Yue
2012-01-01
This paper presents a clinical informatics toolkit that can assist physicians to conduct cohort studies effectively and efficiently. The toolkit has three key features: 1) support of procedures defined in epidemiology, 2) recommendation of statistical methods in data analysis, and 3) automatic generation of research reports. On one hand, our system can help physicians control research quality by leveraging the integrated knowledge of epidemiology and medical statistics; on the other hand, it can improve productivity by reducing the complexities for physicians during their cohort studies.
Danucalov, Marcelo Ad; Kozasa, Elisa H; Afonso, Rui F; Galduroz, José Cf; Leite, José R
2017-01-01
To investigate the effects of the practice of yoga in combination with compassion meditation on the quality of life, attention, vitality and self-compassion of family caregivers of patients with Alzheimer's disease. A total of 46 volunteers were randomly allocated to two groups, the yoga and compassion meditation program group (n = 25), and the control group (CG) that received no treatment (n = 21). The program lasted 8 weeks, and comprised three yoga and meditation practices per week, with each session lasting 1 h and 15 min. Quality of life, attention, vitality, and self-compassion scores were measured pre- and postintervention. The yoga and compassion meditation program group showed statistically significant improvements (P < 0.05) in quality of life, attention, vitality and self-compassion scores as compared with the control group, which showed no statistically significant differences at the postintervention time-point. The findings of the present study suggest that an 8-week yoga and compassion meditation program can improve the quality of life, vitality, attention, and self-compassion of family caregivers of Alzheimer's disease patients. Geriatr Gerontol Int 2017; 17: 85-91. © 2015 Japan Geriatrics Society.
Blind image quality assessment based on aesthetic and statistical quality-aware features
NASA Astrophysics Data System (ADS)
Jenadeleh, Mohsen; Masaeli, Mohammad Masood; Moghaddam, Mohsen Ebrahimi
2017-07-01
The main goal of image quality assessment (IQA) methods is the emulation of human perceptual image quality judgments. Therefore, the correlation between objective scores of these methods and human perceptual scores is considered as their performance metric. Human judgment of image quality implicitly includes many factors, such as aesthetics, semantics, context, and various types of visual distortions. The main idea of this paper is to use a host of features that are commonly employed in image aesthetics assessment in order to improve the accuracy of blind image quality assessment (BIQA) methods. We propose an approach that enriches the features of BIQA methods by integrating a host of aesthetics image features with features of natural image statistics derived from multiple domains. The proposed features have been used for augmenting five different state-of-the-art BIQA methods, which use natural scene statistics features. Experiments were performed on seven benchmark image quality databases. The experimental results showed significant improvement of the accuracy of the methods.
Fritscher, Karl; Grunerbl, Agnes; Hanni, Markus; Suhm, Norbert; Hengg, Clemens; Schubert, Rainer
2009-10-01
Currently, conventional X-ray and CT images as well as invasive methods performed during the surgical intervention are used to judge the local quality of a fractured proximal femur. However, these approaches are either dependent on the surgeon's experience or cannot assist diagnostic and planning tasks preoperatively. Therefore, in this work a method for the individual analysis of local bone quality in the proximal femur, based on model-based analysis of CT and X-ray images of femur specimens, is proposed. A combined representation of shape and spatial intensity distribution of an object and different statistical approaches for dimensionality reduction are used to create a statistical appearance model in order to assess the local bone quality in CT and X-ray images. The developed algorithms are tested and evaluated on 28 femur specimens. It is shown that the tools and algorithms presented herein can automatically and objectively predict bone mineral density values as well as a biomechanical parameter of the bone that can be measured intraoperatively.
Ambiguity of Quality in Remote Sensing Data
NASA Technical Reports Server (NTRS)
Lynnes, Christopher; Leptoukh, Greg
2010-01-01
This slide presentation reviews some of the issues in quality of remote sensing data. Data "quality" is used in several different contexts in remote sensing data, with quite different meanings. At the pixel level, quality typically refers to a quality control process exercised by the processing algorithm, not an explicit declaration of accuracy or precision. File level quality is usually a statistical summary of the pixel-level quality but is of doubtful use for scenes covering large areal extents. Quality at the dataset or product level, on the other hand, usually refers to how accurately the dataset is believed to represent the physical quantities it purports to measure. This assessment often bears but an indirect relationship at best to pixel level quality. In addition to ambiguity at different levels of granularity, ambiguity is endemic within levels. Pixel-level quality terms vary widely, as do recommendations for use of these flags. At the dataset/product level, quality for low-resolution gridded products is often extrapolated from validation campaigns using high spatial resolution swath data, a suspect practice at best. Making use of quality at all levels is complicated by the dependence on application needs. We will present examples of the various meanings of quality in remote sensing data and possible ways forward toward a more unified and usable quality framework.
An Adaptive Buddy Check for Observational Quality Control
NASA Technical Reports Server (NTRS)
Dee, Dick P.; Rukhovets, Leonid; Todling, Ricardo; DaSilva, Arlindo M.; Larson, Jay W.; Einaudi, Franco (Technical Monitor)
2000-01-01
An adaptive buddy check algorithm is presented that adjusts tolerances for outlier observations based on the variability of surrounding data. The algorithm derives from a statistical hypothesis test combined with maximum-likelihood covariance estimation. Its stability is shown to depend on the initial identification of outliers by a simple background check. The adaptive feature ensures that the final quality control decisions are not very sensitive to prescribed statistics of first-guess and observation errors, nor to other approximations introduced into the algorithm. The implementation of the algorithm in a global atmospheric data assimilation system is described. Its performance is contrasted with that of a non-adaptive buddy check, for the surface analysis of an extreme storm that took place in Europe on 27 December 1999. The adaptive algorithm allowed the inclusion of many important observations that differed greatly from the first guess and that would have been excluded on the basis of prescribed statistics. The analysis of the storm development was much improved as a result of these additional observations.
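A drastically simplified version of such an adaptive tolerance can be sketched as follows. A maximum-likelihood variance estimate over the buddies stands in for the paper's full covariance estimation, and all numbers and names are invented:

```python
from math import sqrt

def buddy_check(residuals, suspect_idx, k=3.0, sigma0=1.0):
    """Toy adaptive buddy check (illustrative, not the operational code).
    The tolerance for a suspect observation is inflated when nearby
    residuals (observation minus first guess) are themselves large, via a
    maximum-likelihood variance estimate over the buddies."""
    buddies = [r for i, r in enumerate(residuals) if i != suspect_idx]
    # ML estimate of residual variance from the surrounding data
    sigma_ml = sqrt(sum(r * r for r in buddies) / len(buddies))
    tol = k * max(sigma0, sigma_ml)  # adapt, but never tighter than prior
    return abs(residuals[suspect_idx]) <= tol  # True = accept

# A large innovation is accepted when its neighbours agree with it...
print(buddy_check([5.0, 4.8, 5.2, 4.9], suspect_idx=0))  # True
# ...but rejected when the neighbours show no such departure.
print(buddy_check([5.0, 0.2, -0.1, 0.3], suspect_idx=0))  # False
```

This captures the storm example in miniature: observations that differ greatly from the first guess survive when surrounding data vary consistently with them.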
Wavelet methodology to improve single unit isolation in primary motor cortex cells
Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A.
2016-01-01
The proper isolation of action potentials recorded extracellularly from neural tissue is an active area of research in the fields of neuroscience and biomedical signal processing. This paper presents an isolation methodology for neural recordings using the wavelet transform (WT), a statistical thresholding scheme, and the principal component analysis (PCA) algorithm. The effectiveness of five different mother wavelets was investigated: biorthogonal, Daubechies, discrete Meyer, symmetric, and Coifman; along with three different wavelet coefficient thresholding schemes: fixed form threshold, Stein’s unbiased estimate of risk, and minimax; and two different thresholding rules: soft and hard thresholding. The signal quality was evaluated using three different statistical measures: mean-squared error, root mean square, and signal-to-noise ratio. The clustering quality was evaluated using two different statistical measures: isolation distance, and L-ratio. This research shows that the selection of the mother wavelet has a strong influence on the clustering and isolation of single unit neural activity, with the Daubechies 4 wavelet and minimax thresholding scheme performing the best. PMID:25794461
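As a toy illustration of the thresholding pipeline (not the paper's implementation), a one-level Haar transform combined with the "fixed form" universal threshold and soft thresholding can be written in pure Python:

```python
from math import sqrt, log

def haar_denoise(signal):
    """One-level Haar wavelet denoising with the 'fixed form' (universal)
    threshold and soft thresholding. A stand-in sketch for the paper's
    multi-wavelet comparison; assumes an even-length signal."""
    s = 1 / sqrt(2)
    approx = [(a + b) * s for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) * s for a, b in zip(signal[0::2], signal[1::2])]
    # Noise scale from the median absolute deviation of detail coefficients
    mad = sorted(abs(d) for d in detail)[len(detail) // 2]
    sigma = mad / 0.6745 if mad else 0.0
    thr = sigma * sqrt(2 * log(len(signal)))  # universal threshold
    soft = lambda d: (abs(d) - thr) * (1 if d > 0 else -1) if abs(d) > thr else 0.0
    detail = [soft(d) for d in detail]
    out = []
    for a, d in zip(approx, detail):  # inverse Haar transform
        out += [(a + d) * s, (a - d) * s]
    return out

print([round(v, 4) for v in haar_denoise([1, 1, 2, 2])])  # [1.0, 1.0, 2.0, 2.0]
```

Swapping the mother wavelet (e.g. Daubechies 4) or the threshold rule (SURE, minimax, hard) changes only the decomposition and the `soft`/`thr` steps, which is precisely the design space the paper compares.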
Chen, Qing; Xu, Pengfei; Liu, Wenzhong
2016-01-01
Computer vision as a fast, low-cost, noncontact, and online monitoring technology has been an important tool to inspect product quality, particularly on a large-scale assembly production line. However, the current industrial vision system is far from satisfactory in the intelligent perception of complex grain images, comprising a large number of local homogeneous fragmentations or patches without distinct foreground and background. We attempt to solve this problem based on the statistical modeling of spatial structures of grain images. We present a physical explanation in advance to indicate that the spatial structures of the complex grain images are subject to a representative Weibull distribution according to the theory of sequential fragmentation, which is well known in the continued comminution of ore grinding. To delineate the spatial structure of the grain image, we present a method of multiscale and omnidirectional Gaussian derivative filtering. Then, a product quality classifier based on sparse multikernel–least squares support vector machine is proposed to solve the low-confidence classification problem of imbalanced data distribution. The proposed method is applied on the assembly line of a food-processing enterprise to classify (or identify) automatically the production quality of rice. The experiments on the real application case, compared with the commonly used methods, illustrate the validity of our method. PMID:26986726
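Fitting the Weibull model mentioned above to image-derived magnitudes can be sketched with a textbook maximum-likelihood estimator; this is a generic two-parameter fit, not the authors' filtering-and-classification pipeline:

```python
from math import log

def weibull_mle(data, lo=0.05, hi=50.0, iters=100):
    """Maximum-likelihood fit of a two-parameter Weibull distribution to
    positive samples (e.g. filter-response magnitudes), solving the
    profile equation for the shape k by bisection. Illustrative only."""
    logs = [log(x) for x in data]
    mean_log = sum(logs) / len(logs)

    def g(k):  # derivative of the profile log-likelihood w.r.t. k
        xk = [x ** k for x in data]
        return sum(v, ) if False else (
            sum(v * l for v, l in zip(xk, logs)) / sum(xk) - 1.0 / k - mean_log)

    for _ in range(iters):  # g is increasing in k, so bisection converges
        mid = 0.5 * (lo + hi)
        lo, hi = (lo, mid) if g(mid) > 0 else (mid, hi)
    k = 0.5 * (lo + hi)
    scale = (sum(x ** k for x in data) / len(data)) ** (1.0 / k)
    return k, scale
```

With samples drawn from a known Weibull distribution, the recovered shape and scale should be close to the true values; the fitted parameters would then serve as structure features for the quality classifier.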
Statistical auditing of toxicology reports.
Deaton, R R; Obenchain, R L
1994-06-01
Statistical auditing is a new report review process used by the quality assurance unit at Eli Lilly and Co. Statistical auditing allows the auditor to review the process by which the report was generated, as opposed to the process by which the data were generated. We have the flexibility to use different sampling techniques and still obtain thorough coverage of the report data. By properly implementing our auditing process, we can work smarter rather than harder and continue to help our customers increase the quality of their products (reports). Statistical auditing is helping our quality assurance unit meet our customers' needs while maintaining or increasing the quality of our regulatory obligations.
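A minimal sketch of sampling-based report auditing, with invented report entries and a naive mismatch check (the actual sampling designs used at Lilly are not reproduced here):

```python
import random

def audit_sample(report, source, n, seed=0):
    """Draw a random sample of report entries and check each against the
    source data it was derived from; returns the observed error rate and
    the mismatching entries. A toy sketch of report-level auditing."""
    rng = random.Random(seed)
    keys = rng.sample(sorted(report), min(n, len(report)))
    mismatches = [k for k in keys if report[k] != source.get(k)]
    return len(mismatches) / len(keys), mismatches

# Hypothetical reported values vs. values recomputed from raw data.
report = {"tableA/mean": 4.2, "tableA/sd": 1.1, "tableB/n": 48}
source = {"tableA/mean": 4.2, "tableA/sd": 1.0, "tableB/n": 48}
rate, bad = audit_sample(report, source, n=3)
print(rate, bad)  # 0.333..., ['tableA/sd']
```

Changing `n` and the seed corresponds to choosing among sampling techniques while still covering the report data.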
Rosset, Saharon; Aharoni, Ehud; Neuvirth, Hani
2014-07-01
Issues of publication bias, lack of replicability, and false discovery have long plagued the genetics community. Proper utilization of public and shared data resources presents an opportunity to ameliorate these problems. We present an approach to public database management that we term Quality Preserving Database (QPD). It enables perpetual use of the database for testing statistical hypotheses while controlling false discovery and avoiding publication bias on the one hand, and maintaining testing power on the other hand. We demonstrate it on a use case of a replication server for GWAS findings, underlining its practical utility. We argue that a shift to using QPD in managing current and future biological databases will significantly enhance the community's ability to make efficient and statistically sound use of the available data resources. © 2014 WILEY PERIODICALS, INC.
Rule-based statistical data mining agents for an e-commerce application
NASA Astrophysics Data System (ADS)
Qin, Yi; Zhang, Yan-Qing; King, K. N.; Sunderraman, Rajshekhar
2003-03-01
Intelligent data mining techniques have useful e-Business applications. Because an e-Commerce application is related to multiple domains such as statistical analysis, market competition, price comparison, profit improvement, and personal preferences, this paper presents a hybrid knowledge-based e-Commerce system fusing intelligent techniques, statistical data mining, and personal information to enhance QoS (Quality of Service) of e-Commerce. A Web-based e-Commerce application software system, eDVD Web Shopping Center, is successfully implemented using Java servlets and an Oracle8i database server. Simulation results have shown that the hybrid intelligent e-Commerce system is able to make smart decisions for different customers.
The Content of Statistical Requirements for Authors in Biomedical Research Journals
Liu, Tian-Yi; Cai, Si-Yu; Nie, Xiao-Lu; Lyu, Ya-Qi; Peng, Xiao-Xia; Feng, Guo-Shuang
2016-01-01
Background: Robust statistical designing, sound statistical analysis, and standardized presentation are important to enhance the quality and transparency of biomedical research. This systematic review was conducted to summarize the statistical reporting requirements introduced by biomedical research journals with an impact factor of 10 or above, so that researchers are able to give serious consideration to statistical issues not only at the stage of data analysis but also at the stage of methodological design. Methods: Detailed statistical instructions for authors were downloaded from the homepage of each of the included journals or obtained from the editors directly via email. Then, we described the types and numbers of statistical guidelines introduced by different press groups. Items from statistical reporting guidelines as well as particular requirements were summarized by frequency and grouped into design, method of analysis, and presentation, respectively. Finally, updated statistical guidelines and particular requirements for improvement were summed up. Results: In total, 21 of 23 press groups introduced at least one statistical guideline. More than half of the press groups update their statistical instructions for authors as new statistical reporting guidelines are issued. In addition, 16 press groups, covering 44 journals, address particular statistical requirements. Most of the particular requirements focused on the performance of statistical analysis and transparency in statistical reporting, including "address issues relevant to research design, including participant flow diagram, eligibility criteria, and sample size estimation," and "statistical methods and the reasons." Conclusions: Statistical requirements for authors are becoming increasingly refined.
Statistical requirements for authors remind researchers that they should give sufficient consideration not only to statistical methods during research design but also to standardized statistical reporting, which would be beneficial in providing stronger evidence and making critical appraisal of the evidence more accessible. PMID:27748343
Smith, Samuel G; Wolf, Michael S; von Wagner, Christian
2010-01-01
The increasing trend of exposing patients seeking health advice to numerical information has the potential to adversely impact patient-provider relationships especially among individuals with low literacy and numeracy skills. We used the HINTS 2007 to provide the first large scale study linking statistical confidence (as a marker of subjective numeracy) to demographic variables and a health-related outcome (in this case the quality of patient-provider interactions). A cohort of 7,674 individuals answered sociodemographic questions, a question on how confident they were in understanding medical statistics, a question on preferences for words or numbers in risk communication, and a measure of patient-provider interaction quality. Over thirty-seven percent (37.4%) of individuals lacked confidence in their ability to understand medical statistics. This was particularly prevalent among the elderly, low income, low education, and non-White ethnic minority groups. Individuals who lacked statistical confidence demonstrated clear preferences for having risk-based information presented with words rather than numbers and were 67% more likely to experience a poor patient-provider interaction, after controlling for gender, ethnicity, insurance status, the presence of a regular health care professional, and the language of the telephone interview. We will discuss the implications of our findings for health care professionals.
Control by quality: proposition of a typology.
Pujo, P; Pillet, M
The application of quality tools and methods in industrial management has always had a fundamental impact on the control of production. It influences the behavior of the actors concerned, while introducing the necessary notions and formalizations, especially for production systems with little or no automation, which constitute a large part of industrial activity. Several quality approaches are applied in the workshop and are implemented at the level of the control. In this paper, the authors present a typology of the various approaches that have successively influenced control, such as statistical process control, quality assurance, and continuous improvement. First, the authors present a parallel between production control and quality organizational structure. They note the duality between control, which is aimed at increasing productivity, and quality, which aims to satisfy the needs of the customer. They also note the hierarchical organizational structure of these two systems of management with, at each level, the notion of a feedback loop. This notion is fundamental to any kind of decision making. The paper is organized around the operational, tactical, and strategic levels, describing for each level the main methods and tools for control by quality. The overview of these tools and methods starts at the operational level, with Statistical Process Control, the Taguchi technique, and the "six sigma" approach. On the tactical level, we find a quality system approach, with a documented description of the procedures introduced in the firm. The management system can refer here to Quality Assurance, Total Productive Maintenance, or Management by Total Quality. Formalizing, through procedures, the decision rules governing process control enhances the validity of those rules, leading to greater reliability and consolidation.
All this counterbalances the intrinsically fluctuating behavior of the human control operators. Strategic control by quality is then detailed, and the two main approaches, the continuous improvement approach and the proactive improvement approach, are introduced. Finally, the authors observe that at each of the three levels, continuous process improvement, which is a component of Total Quality, becomes an essential preoccupation of the control. Ultimately, the recursive use of the Deming cycle remains the best practice for control by quality.
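The Statistical Process Control cited at the operational level rests on control charts; a textbook 3-sigma individuals chart can be sketched as follows (baseline data invented, not tied to any implementation in the paper):

```python
from statistics import mean, stdev

def shewhart_limits(samples):
    """3-sigma Shewhart control limits for an individuals chart, the basic
    SPC device: center line at the process mean, limits three sample
    standard deviations either side."""
    center = mean(samples)
    sigma = stdev(samples)
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(samples, new_points):
    """Points falling outside the control limits established from baseline data."""
    lcl, _, ucl = shewhart_limits(samples)
    return [x for x in new_points if not lcl <= x <= ucl]

baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.0, 10.1, 9.9]
print(out_of_control(baseline, [10.05, 10.9, 9.95]))  # [10.9]
```

The feedback-loop notion in the typology corresponds to acting on the process whenever a point signals out-of-control, then re-establishing the limits.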
Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling
NASA Technical Reports Server (NTRS)
Mog, Robert A.
1997-01-01
Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited, prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.
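The Kendall test mentioned above is built on Kendall's rank correlation coefficient; a textbook O(n^2) computation of tau-a (a generic illustration, not the model-validation code itself):

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's tau-a: (concordant - discordant) pairs over all pairs.
    Positive values indicate positive dependence between the rankings."""
    concordant = discordant = 0
    for (xi, yi), (xj, yj) in combinations(zip(x, y), 2):
        s = (xi - xj) * (yi - yj)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    pairs = len(x) * (len(x) - 1) // 2
    return (concordant - discordant) / pairs

print(kendall_tau([1, 2, 3, 4], [1, 2, 3, 4]))  # 1.0
print(kendall_tau([1, 2, 3, 4], [4, 3, 2, 1]))  # -1.0
```

A tau significantly above zero between model outputs and observed quantities is the kind of positive dependence such a verification step would look for.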
Electric Power Monthly, June 1990
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1990-09-13
The EPM is prepared by the Electric Power Division, Office of Coal, Nuclear, Electric and Alternate Fuels, Energy Information Administration (EIA), Department of Energy. This publication provides monthly statistics at the national, Census division, and State levels for net generation, fuel consumption, fuel stocks, quantity and quality of fuel, electricity sales, and average revenue per kilowatthour of electricity sold. Data on net generation are also displayed at the North American Electric Reliability Council (NERC) region level. Additionally, company and plant level information are published in the EPM on capability of new plants, net generation, fuel consumption, fuel stocks, quantity and quality of fuel, and cost of fuel. Quantity, quality, and cost of fuel data lag the net generation, fuel consumption, fuel stocks, electricity sales, and average revenue per kilowatthour data by 1 month. This difference in reporting appears in the national, Census division, and State level tables. However, at the plant level, all statistics presented are for the earlier month for the purpose of comparison. 40 tabs.
An approach to quality and performance control in a computer-assisted clinical chemistry laboratory.
Undrill, P E; Frazer, S C
1979-01-01
A locally developed, computer-based clinical chemistry laboratory system has been in operation since 1970. This utilises a Digital Equipment Co Ltd PDP 12 and an interconnected PDP 8/F computer. Details are presented of the performance and quality control techniques incorporated into the system. Laboratory performance is assessed through analysis of results from fixed-level control sera as well as from cumulative sum methods. At a simple level the presentation may be considered purely indicative, while at a more sophisticated level statistical concepts have been introduced to aid the laboratory controller in decision-making processes. PMID:438340
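The cumulative sum method mentioned above can be sketched with a standard one-sided tabular CUSUM on control-sera results; the parameters k and h are conventional textbook defaults, not values from this laboratory:

```python
def cusum(values, target, k=0.5, h=4.0):
    """Two-sided tabular CUSUM on control-sera results, signalling a
    sustained shift away from the target value. k is the allowance
    (slack) and h the decision interval, both in the same units as the
    data; returns indices at which an alarm is raised."""
    s_hi = s_lo = 0.0
    alarms = []
    for i, x in enumerate(values):
        s_hi = max(0.0, s_hi + (x - target) - k)  # upward drift accumulator
        s_lo = max(0.0, s_lo - (x - target) - k)  # downward drift accumulator
        if s_hi > h or s_lo > h:
            alarms.append(i)
    return alarms

# A small persistent upward shift that a fixed-level check might miss:
print(cusum([0.1, -0.2, 0.0, 1.5, 1.4, 1.6, 1.5, 1.3], target=0.0))  # [7]
```

Unlike a fixed-level control check, the CUSUM accumulates small deviations, which is why it complements the fixed-level control sera described in the system.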
Kim, Boram; Joo, Nami
2014-10-01
Although the issues of singles' dietary style and quality of life are becoming important due to the increasing number of singles with economic power, little research has been conducted to date on singles' use of convenience food and quality of life in relation to their dietary style. Thus, the present study intends to provide basic data to improve quality of life by determining the current status of the use of convenience food and explicating its relationship with quality of life through analyzing the dietary lifestyles of singles. The targets of this study were singles, identified as adults between the ages of 25 and 54, living alone, either legally or in actuality having no partner. A statistical analysis of 208 surveys from Seoul was conducted using SPSS 12.0 for Windows and structural equation modeling using the AMOS 5.0 statistics package. The convenience-oriented dietary lifestyle was shown to have a significant positive effect on convenience food satisfaction. HMR (home meal replacement) satisfaction was found to have a significant effect on positive psychological satisfaction, and the convenience-oriented lifestyle was found to have a significant negative effect on all aspects of quality of life satisfaction. There must be persistent development of food industries considering the distinctive characteristics of the lives of singles in order to satisfy their needs and improve the quality of their lives.
Clustering and Flow Conservation Monitoring Tool for Software Defined Networks.
Puente Fernández, Jesús Antonio; García Villalba, Luis Javier; Kim, Tai-Hoon
2018-04-03
Prediction systems face challenges on two fronts: modeling the relation between video quality and observed session features, and coping with dynamic changes in video quality. Software Defined Networking (SDN) is a new network architecture concept that separates the control plane (controller) from the data plane (switches) in network devices. Thanks to the southbound interface, monitoring tools can be deployed to obtain the network status and retrieve collections of statistics. Achieving the most accurate statistics therefore depends on the strategy for monitoring and requesting information from network devices. In this paper, we propose an enhanced algorithm for requesting statistics to measure traffic flow in SDN networks. The algorithm groups network switches into clusters according to their number of ports and applies different monitoring techniques to each cluster. This grouping avoids monitoring queries to network switches with common characteristics and thus omits redundant information. In this way, the present proposal decreases the number of monitoring queries to switches, improving network traffic and preventing switch overload. We have tested our optimization in a video streaming simulation using different types of videos. The experiments, and a comparison with traditional monitoring techniques, demonstrate the feasibility of our proposal, which maintains similar measurement values while decreasing the number of queries to the switches.
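The port-count clustering idea can be sketched as follows; the switch names and the one-representative polling policy are illustrative assumptions, not the paper's exact algorithm:

```python
from collections import defaultdict

def cluster_by_ports(switches):
    """Group switches by port count so that switches with common
    characteristics can share monitoring treatment."""
    clusters = defaultdict(list)
    for name, n_ports in switches.items():
        clusters[n_ports].append(name)
    return dict(clusters)

def polling_plan(switches):
    """Query one representative per cluster instead of every switch,
    reducing redundant statistics requests."""
    return {ports: sorted(members)[0]
            for ports, members in cluster_by_ports(switches).items()}

switches = {"s1": 4, "s2": 4, "s3": 8, "s4": 4, "s5": 8}
plan = polling_plan(switches)
print(plan)  # {4: 's1', 8: 's3'}
print(len(plan), "queries instead of", len(switches))
```

The saving grows with cluster size: here two queries replace five, which is the mechanism by which the proposal reduces switch load.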
The quality assessment of family physician service in rural regions, Northeast of Iran in 2012
Vafaee-Najar, Ali; Nejatzadegan, Zohreh; Pourtaleb, Arefeh; Kaffashi, Shahnaz; Vejdani, Marjan; Molavi-Taleghani, Yasamin; Ebrahimipour, Hosein
2014-01-01
Background: Following the implementation of the family physician plan in rural areas, the quantity of services provided has increased, but the next concern is improving the quality of those services as well. The present study aims at determining the gap between patients' expectations and perceptions of the quality of services provided by family physicians during the spring and summer of 2012. Methods: This was a cross-sectional study in which 480 patients who visited family physician centers were selected by cluster and simple random sampling. Data were collected through the standard SERVQUAL questionnaire and were analyzed with descriptive statistics, using the t-test, Kruskal-Wallis, and Wilcoxon signed-rank tests in SPSS 16 at a significance level of 0.05. Results: The difference between the mean scores of expectation and perception was about -0.93, a statistically significant difference (P≤0.05). The gaps in the five dimensions of quality were as follows: tangibles -1.10, reliability -0.87, responsiveness -1.06, assurance -0.83, and empathy -0.82. Findings showed a significant difference between expectation and perception in all five dimensions of the provided services (P≤0.05). Conclusion: There was a gap between the ideal situation and the current situation of family physician service quality. Our suggestion is to maintain a strong focus on patients, create a medical practice that would exceed patients' expectations, provide high-quality healthcare services, and pursue continuous improvement of all processes. The gaps in tangibles and responsiveness were greater than in the other dimensions, so more attention should be paid to the physical appearance of the health center environment and to the availability of staff and employees. PMID:24757691
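The SERVQUAL gap scores reported above are perception-minus-expectation means per quality dimension; a minimal sketch with made-up Likert responses (the two dimensions and all response values below are illustrative, not the study's data):

```python
def servqual_gaps(expectations, perceptions):
    """Mean perception-minus-expectation gap per quality dimension.
    Negative gaps mean the service falls short of expectations."""
    gaps = {}
    for dim in expectations:
        e, p = expectations[dim], perceptions[dim]
        gaps[dim] = round(sum(p) / len(p) - sum(e) / len(e), 2)
    return gaps

# Hypothetical 1-5 Likert responses from four respondents
expectations = {"tangibles": [5, 5, 4, 5], "empathy": [4, 5, 4, 4]}
perceptions  = {"tangibles": [3, 4, 3, 4], "empathy": [4, 3, 4, 3]}
gaps = servqual_gaps(expectations, perceptions)
```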
Patro, Satya N; Chakraborty, Santanu; Sheikh, Adnan
2016-01-01
Objective: The aim of this study was to evaluate the impact of adaptive statistical iterative reconstruction (ASiR) technique on the image quality and radiation dose reduction. The comparison was made with the traditional filtered back projection (FBP) technique. Methods: We retrospectively reviewed 78 patients, who underwent cervical spine CT for blunt cervical trauma between 1 June 2010 and 30 November 2010. 48 patients were imaged using traditional FBP technique and the remaining 30 patients were imaged using the ASiR technique. The patient demographics, radiation dose, objective image signal and noise were recorded; while subjective noise, sharpness, diagnostic acceptability and artefacts were graded by two radiologists blinded to the techniques. Results: We found that the ASiR technique was able to reduce the volume CT dose index, dose–length product and effective dose by 36%, 36.5% and 36.5%, respectively, compared with the FBP technique. There was no significant difference in the image noise (p = 0.39), signal (p = 0.82) and signal-to-noise ratio (p = 0.56) between the groups. The subjective image quality was minimally better in the ASiR group but not statistically significant. There was excellent interobserver agreement on the subjective image quality and diagnostic acceptability for both groups. Conclusion: The use of ASiR technique allowed approximately 36% radiation dose reduction in the evaluation of cervical spine without degrading the image quality. Advances in knowledge: The present study highlights that the ASiR technique is extremely helpful in reducing the patient radiation exposure while maintaining the image quality. It is highly recommended to utilize this novel technique in CT imaging of different body regions. PMID:26882825
Li, Tzu-Wei; Tu, Pei-Weng; Liu, Li-Ling; Wu, Shiow-Ing
2015-01-01
The implementation of an effective quality management system has always been considered a principal method for a manufacturer to maintain and improve its product and service quality. Globally, many regulatory authorities incorporate a quality management system as one of the mandatory requirements for the regulatory control of high-risk medical devices. The present study aims to analyze the GMP enforcement experience in Taiwan between 1998 and 2013. It describes the regulatory implementation of the medical device GMP requirements and the initiatives taken to assist small and medium-sized enterprises in complying with them. Based on statistical data collected by the competent authority and industry research institutes, the present paper reports the growth of Taiwan's local medical device industry after the enforcement of the GMP regulation. The transition in production, technologies, and number of employees in Taiwan's medical device industry between 1998 and 2013 provides competent authorities around the world with an empirical foundation for further policy development. PMID:26075255
Ranking and validation of spallation models for isotopic production cross sections of heavy residua
NASA Astrophysics Data System (ADS)
Sharma, Sushil K.; Kamys, Bogusław; Goldenbaum, Frank; Filges, Detlef
2017-07-01
The production cross sections of isotopically identified residual nuclei from spallation reactions induced by 136Xe projectiles at 500A MeV on a hydrogen target were analyzed in a two-step model. The first stage of the reaction was described by the INCL4.6 model of an intranuclear cascade of nucleon-nucleon and pion-nucleon collisions, whereas the second stage was analyzed by means of four different models: ABLA07, GEM2, GEMINI++ and SMM. The quality of the data description was judged quantitatively using two statistical deviation factors: the H-factor and the M-factor. It was found that the present analysis leads to a different ranking of the models than that obtained from a qualitative inspection of the data reproduction. The disagreement was caused by the sensitivity of the deviation factors to the large statistical errors present in some of the data. A new deviation factor, the A-factor, was proposed that is not sensitive to the statistical errors of the cross sections. The quantitative ranking of models performed using the A-factor agreed well with the qualitative analysis of the data. It was concluded that using deviation factors weighted by statistical errors may lead to erroneous conclusions when the data cover a large range of values. The quality of data reproduction by the theoretical models is discussed, and some systematic deviations of the theoretical predictions from the experimental results are observed.
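The exact formulas for the paper's H-, M- and A-factors are not given in the abstract; the sketch below assumes one common pair of definitions (an error-weighted quadratic factor and an unweighted logarithmic one) purely to illustrate why error weighting makes a factor dominated by data points with small quoted uncertainties:

```python
import math

def h_factor(exp, err, calc):
    """Error-weighted deviation: large when (calc - exp) exceeds the
    experimental uncertainty, so points with tiny errors dominate.
    Assumed definition, for illustration only."""
    n = len(exp)
    return math.sqrt(sum(((c - e) / s) ** 2
                         for e, s, c in zip(exp, err, calc)) / n)

def log_factor(exp, calc):
    """Unweighted mean |log10(calc/exp)|: depends only on the
    calc/exp ratio, not on the quoted statistical errors.
    Assumed definition, for illustration only."""
    return sum(abs(math.log10(c / e)) for e, c in zip(exp, calc)) / len(exp)

# Two points with the same 20% model deviation but different errors:
# the weighted factor is driven by the precisely measured point,
# the logarithmic factor treats both points alike.
h = h_factor([10.0, 10.0], [1.0, 5.0], [12.0, 12.0])
a = log_factor([10.0, 10.0], [12.0, 12.0])
```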
Air quality assessment in Portugal and the special case of the Tâmega e Sousa region
NASA Astrophysics Data System (ADS)
de Almeida, Fátima; Correia, Aldina; Silva, Eliana Costa e.
2017-06-01
Air pollution is a major environmental problem that can present a significant risk to human health. This paper presents an evaluation of air quality in several regions of Portugal, with special focus on the region of Tâmega e Sousa, where ESTG/P. Porto is located. ANOVA and MANOVA techniques are applied to study differences in air quality between 2009 and 2012 across several regions of Portugal. The data include altitude, area, expenditure of environmental measures on protection of air quality and climate, expenditure on protection of biodiversity and landscape, burned area, number of forest fires, and extractive and manufacturing industries, per municipality and per year. Using information gathered by the QualAr project on concentrations of the pollutants CO, NO2, O3, PM10 and SO2, an air quality indicator with five levels is considered. The results point to significant differences in air quality across the regions and years considered. Additionally, a multivariate regression model was used to identify the factors that influence air quality in 2012. The results show statistical evidence that air quality in 2011, the number of forest fires in 2012 and 2010, and the number of manufacturing industries per km2 in 2012 are the variables that contribute most to air quality in 2012.
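The one-way ANOVA underlying such region comparisons reduces to a ratio of between-group to within-group mean squares; a self-contained sketch with hypothetical air-quality index values for two regions (the function and data are illustrative, not the paper's analysis):

```python
def one_way_anova_F(groups):
    """F statistic = between-group mean square / within-group mean
    square for a list of sample lists (one list per group)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical indicator values for two regions over four periods
F = one_way_anova_F([[1, 2, 1, 2], [4, 5, 4, 5]])
```

A large F relative to the F distribution's critical value (here with 1 and 6 degrees of freedom) would indicate a regional difference; MANOVA extends the same idea to several response variables at once.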
NASA Astrophysics Data System (ADS)
Bencomo, Jose Antonio Fagundez
The main goal of this study was to relate physical changes in image quality, measured by the Modulation Transfer Function (MTF), to diagnostic accuracy. One hundred and fifty Kodak Min-R screen/film conventional craniocaudal mammograms obtained with the Pfizer Microfocus Mammographic system were selected from the files of the Department of Radiology at M.D. Anderson Hospital and Tumor Institute. The mammograms included 88 cases with a variety of benign biopsy diagnoses and 62 cases with a variety of malignant biopsy diagnoses. The average age of the patient population was 55 years. Seventy cases presented calcifications, 30 of them with calcifications smaller than 0.5 mm; 46 cases presented irregularly bordered masses larger than 1 cm; and 30 cases presented smooth-bordered masses, 20 of them larger than 1 cm. Four separate copies of each original image were made, each with a different change in MTF, using a defocusing technique whereby copies of the original were obtained by light exposure through different thicknesses (spacings) of transparent film base. The mammograms were randomized and evaluated by three experienced mammographers for the degree of visibility of various anatomical breast structures and pathological lesions (masses and calcifications), subjective image quality, and mammographic interpretation. The 3,000 separate evaluations were analyzed by several statistical techniques, including Receiver Operating Characteristic curve analysis, the McNemar test for differences between proportions, and the weighted-kappa agreement method of Landis et al. for ordinal categorical data. Results from the statistical analysis show: (1) There were no statistically significant differences in the diagnostic accuracy of the observers when diagnosing from mammograms with the same MTF. (2) There were no statistically significant differences in diagnostic accuracy for each observer when diagnosing from mammograms with the different MTFs used in the study. 
(3) There were statistically significant differences in detail visibility between the copies and the originals; detail visibility was better in the originals. (4) Feature interpretations were not significantly different between the originals and the copies. (5) Perception of image quality did not affect image interpretation. Continuation and improvement of this research can be accomplished by using a case population more sensitive to MTF changes (i.e., asymptomatic women with minimal breast cancer), by involving more observers (including less experienced radiologists and experienced technologists), and by using a minimum of 200 benign and 200 malignant cases.
Kids Count in Nebraska: 2001 Report.
ERIC Educational Resources Information Center
Johnston, Janet M.
This Kids Count report examines statewide trends and county data on the well-being of Nebraska's children. Section 1 contains a commentary on promoting quality early childhood care and education services. Section 2, the bulk of this statistical report, presents findings on indicators of well-being in eight areas: (1) child abuse and…
Corporate Use of Information regarding Natural Resources and Environmental Quality.
ERIC Educational Resources Information Center
Train, Russell E.
This report presents findings and recommendations from a 1-year study which identified corporate needs for resource information (particularly statistical information) and assessed the extent to which these needs are being met by various resource-information services, including those of the federal government. Chapter I discusses 11 types of…
Deriving health utilities from the MacNew Heart Disease Quality of Life Questionnaire.
Chen, Gang; McKie, John; Khan, Munir A; Richardson, Jeff R
2015-10-01
Quality of life is included in the economic evaluation of health services by measuring the preference for health states, i.e. health state utilities. However, most intervention studies include a disease-specific instrument, not a utility instrument. Consequently, there has been increasing use of statistical mapping algorithms which permit utilities to be estimated from a disease-specific instrument. The present paper provides such algorithms between the MacNew Heart Disease Quality of Life Questionnaire (MacNew) instrument and six multi-attribute utility (MAU) instruments: the EuroQol (EQ-5D), the Short Form 6D (SF-6D), the Health Utilities Index (HUI) 3, the Quality of Wellbeing (QWB), the 15D (15 Dimension) and the Assessment of Quality of Life (AQoL-8D). Heart disease patients and members of the healthy public were recruited from six countries. Non-parametric rank tests were used to compare subgroup utilities and MacNew scores. Mapping algorithms were estimated using three separate statistical techniques and achieved a high degree of precision. Based on the mean absolute error and the intraclass correlation, the preferred mapping is MacNew into SF-6D or 15D; using the R-squared statistic, the preferred mapping is MacNew into AQoL-8D. The algorithms reported in this paper enable MacNew data to be mapped into utilities predicted from any of the six instruments. This permits studies which have included the MacNew to be used in cost-utility analyses, which, in turn, allows the comparison of services with interventions across the health system. © The European Society of Cardiology 2014.
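In its simplest form, a mapping algorithm is a least-squares regression from the disease-specific score to the utility; the sketch below uses a single predictor and invented score-utility pairs (the paper's actual algorithms use three different, more elaborate statistical techniques and real patient data):

```python
def fit_linear_map(scores, utilities):
    """Least-squares line u = a + b * m mapping a disease-specific
    score m to a predicted utility u."""
    n = len(scores)
    mx, my = sum(scores) / n, sum(utilities) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(scores, utilities))
         / sum((x - mx) ** 2 for x in scores))
    a = my - b * mx
    return a, b

# Invented MacNew-like scores (1-7 scale) and utilities (0-1 scale)
a, b = fit_linear_map([3, 4, 5, 6, 7], [0.5, 0.6, 0.7, 0.8, 0.9])
predicted = a + b * 5.5  # utility predicted for a score of 5.5
```

Precision of such a map is then judged, as in the paper, by mean absolute error, intraclass correlation, or R-squared on held-out data.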
Irvine, Kathryn M.; Manlove, Kezia; Hollimon, Cynthia
2012-01-01
An important consideration for long term monitoring programs is determining the required sampling effort to detect trends in specific ecological indicators of interest. To enhance the Greater Yellowstone Inventory and Monitoring Network’s water resources protocol(s) (O’Ney 2006 and O’Ney et al. 2009 [under review]), we developed a set of tools to: (1) determine the statistical power for detecting trends of varying magnitude in a specified water quality parameter over different lengths of sampling (years) and different within-year collection frequencies (monthly or seasonal sampling) at particular locations using historical data, and (2) perform periodic trend analyses for water quality parameters while addressing seasonality and flow weighting. A power analysis for trend detection is a statistical procedure used to estimate the probability of rejecting the hypothesis of no trend when in fact there is a trend, within a specific modeling framework. In this report, we base our power estimates on using the seasonal Kendall test (Helsel and Hirsch 2002) for detecting trend in water quality parameters measured at fixed locations over multiple years. We also present procedures (R-scripts) for conducting a periodic trend analysis using the seasonal Kendall test with and without flow adjustment. This report provides the R-scripts developed for power and trend analysis, tutorials, and the associated tables and graphs. The purpose of this report is to provide practical information for monitoring network staff on how to use these statistical tools for water quality monitoring data sets.
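A toy version of the approach described above (the actual R-scripts use the seasonal Kendall test with its variance-based significance rule of Helsel and Hirsch 2002, plus flow adjustment; the fixed detection threshold and all parameters below are illustrative simplifications):

```python
import random

def mann_kendall_S(x):
    """Kendall S: number of increasing minus decreasing pairs."""
    return sum((x[j] > x[i]) - (x[j] < x[i])
               for i in range(len(x)) for j in range(i + 1, len(x)))

def seasonal_kendall_S(series_by_season):
    """Seasonal Kendall: sum Mann-Kendall S over seasons, comparing
    only values from the same season across years."""
    return sum(mann_kendall_S(s) for s in series_by_season)

def power_by_simulation(trend, sd, n_years, n_seasons,
                        n_sim=500, threshold=0):
    """Crude power estimate: fraction of simulated datasets whose
    seasonal S exceeds a detection threshold. A real analysis would
    compare S to its variance-based critical value instead."""
    random.seed(1)
    hits = 0
    for _ in range(n_sim):
        data = [[trend * yr + random.gauss(0, sd)
                 for yr in range(n_years)]
                for _ in range(n_seasons)]
        if seasonal_kendall_S(data) > threshold:
            hits += 1
    return hits / n_sim
```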
NASA Astrophysics Data System (ADS)
Boswijk, G.; Fowler, A. M.; Palmer, J. G.; Fenwick, P.; Hogg, A.; Lorrey, A.; Wunder, J.
2014-04-01
Millennial and multi-millennial tree-ring chronologies can provide useful proxy records of past climate, giving insight into a more complete range of natural climate variability prior to the 20th century. Since the 1980s a multi-millennial tree-ring chronology has been developed from kauri (Agathis australis) from the upper North Island, New Zealand. Previous work has demonstrated the sensitivity of kauri to the El Niño-Southern Oscillation (ENSO). Here we present recent additions and extensions to the late Holocene kauri chronology (LHKC), and assess the potential of a composite master chronology, AGAUc13, for palaeoclimate reconstruction. The updated composite kauri chronology now spans 4491 years (2488 BCE-2002 CE) and includes data from 18 modern sites, 25 archaeological sites, and 18 sub-fossil (swamp) kauri sites. Consideration of the composition and statistical quality of AGAUc13 suggests the LHKC has utility for palaeoclimate reconstruction but there are caveats. These include: (a) differences in character between the three assemblages including growth rate and sensitivity; (b) low sample depth and low statistical quality in the 10th-13th century CE, when the record transitions from modern and archaeological material to the swamp kauri; (c) a potential difference in amplitude of the signal in the swamp kauri; (d) a westerly bias in site distribution prior to 911 CE; (e) variable statistical quality across the entire record associated with variable replication; and (f) complex changes in sample depth and tree age and size which may influence centennial scale trends in the data. Further tree ring data are required to improve statistical quality, particularly in the first half of the second millennium CE.
Research Design and Statistical Methods in Indian Medical Journals: A Retrospective Survey
Hassan, Shabbeer; Yellur, Rajashree; Subramani, Pooventhan; Adiga, Poornima; Gokhale, Manoj; Iyer, Manasa S.; Mayya, Shreemathi S.
2015-01-01
Good quality medical research generally requires not only expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles published in Indian medical journals has increased quite substantially in the past decade. The aim of this study was to collate all evidence on study design quality and statistical analyses used in selected leading Indian medical journals. Ten leading Indian medical journals were selected based on impact factors, and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation of the articles. Main outcomes considered in the present study were study design types and their frequencies, the proportion of errors/defects in study design and statistical analyses, and implementation of the CONSORT checklist in RCTs (randomized clinical trials). From 2003 to 2013: The proportion of erroneous statistical analyses did not decrease (χ2=0.592, Φ=0.027, p=0.4418), 25% (80/320) in 2003 compared to 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013; the proportion of papers using statistical tests increased significantly (χ2=26.96, Φ=0.16, p<0.0001) from 42.5% (250/588) to 56.7% (439/774). The overall proportion of errors in study design decreased significantly (χ2=16.783, Φ=0.12, p<0.0001), 41.3% (243/588) compared to 30.6% (237/774). In 2013, the proportion of randomized clinical trial designs remained very low (7.3%, 43/588), with the majority showing some errors (41 papers, 95.3%). The majority of the published studies were retrospective in nature, both in 2003 [79.1% (465/588)] and in 2013 [78.2% (605/774)]. 
Major decreases in error proportions were observed in both results presentation (χ2=24.477, Φ=0.17, p<0.0001), 82.2% (263/320) compared to 66.3% (325/490), and interpretation (χ2=25.616, Φ=0.173, p<0.0001), 32.5% (104/320) compared to 17.1% (84/490), though some serious errors were still present. Indian medical research seems to have made no major progress in the use of correct statistical analyses, but errors/defects in study designs have decreased significantly. Randomized clinical trials are published quite rarely and have a high proportion of methodological problems. PMID:25856194
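The chi-square statistics above are ordinary Pearson tests on 2x2 tables; as a check, the sketch below reproduces the survey's first reported statistic from its own counts (80 erroneous of 320 in 2003 vs. 111 of 490 in 2013):

```python
def chi2_two_proportions(a, n1, c, n2):
    """Pearson chi-square for a 2x2 table comparing the proportion
    a/n1 with c/n2 (shortcut formula n(ad-bc)^2 / row/col totals)."""
    b, d = n1 - a, n2 - c
    n = n1 + n2
    return n * (a * d - b * c) ** 2 / (n1 * n2 * (a + c) * (b + d))

# The survey's own comparison of erroneous statistical analyses
chi2 = chi2_two_proportions(80, 320, 111, 490)
# chi2 matches the reported value of 0.592
```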
Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia
2010-01-01
Background High quality clinical research not only requires advanced professional knowledge, but also needs sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China in the first decade of the new millennium. Methodology/Principal Findings Ten leading Chinese medical journals were selected and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportions in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008: The error/defect proportion in statistical analyses decreased significantly (χ2 = 12.03, p<0.001), 59.8% (545/1,335) in 1998 compared to 52.2% (664/1,578) in 2008. The overall error/defect proportion in study design also decreased (χ2 = 21.22, p<0.001), 50.9% (680/1,335) compared to 42.4% (669/1,578). In 2008, randomized clinical trial designs remained in the single digits (3.8%, 60/1,578), with two-thirds showing poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature, 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Decreases in defect proportions were observed in both results presentation (χ2 = 93.26, p<0.001), 92.7% (945/1,019) compared to 78.2% (1,023/1,309), and interpretation (χ2 = 27.26, p<0.001), 9.7% (99/1,019) compared to 4.3% (56/1,309), though some serious ones persisted. 
Conclusions/Significance Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative. PMID:20520824
Regojo Zapata, O; Lamata Hernández, F; Sánchez Zalabardo, J M; Elizalde Benito, A; Navarro Gil, J; Valdivia Uría, J G
2004-09-01
Studies of the quality of theses and research projects in the biomedical sciences are unusual, but they are very important in university teaching because it is necessary to improve the quality of thesis preparation. The objectives of the study were to determine the quality of thesis projects in our department, according to their fulfillment of scientific methodology, and to establish whether a relation exists between the global quality of a project and the statistical resources used. This is a descriptive study of 273 thesis projects carried out between 1995 and 2002 in the surgery department of the University of Zaragoza. The review was performed by 15 observers, who analyzed 28 indicators for every project. By assigning a value to each indicator, the projects were scored on a scale from 1 to 10 according to their fulfillment of scientific methodology. The mean project quality score was 5.53 (SD: 1.77). In 13.9% of cases the thesis project was concluded with the defense of the work. The three indicators of statistical resource use showed a significant association with project quality. The quality of the statistical resources is very important when a thesis project is to be carried out with good methodology, because it ensures that sound conclusions can be reached. In our study, we found that more than a third of the variability in project quality is explained by the three statistical indicators mentioned above.
Kawata, Masaaki; Sato, Chikara
2007-06-01
In determining the three-dimensional (3D) structure of macromolecular assemblies in single particle analysis, a large representative dataset of two-dimensional (2D) average images, drawn from a huge number of raw images, is key to high resolution. Because alignments prior to averaging are computationally intensive, currently available multireference alignment (MRA) software does not survey every possible alignment. This leads to misaligned images, creating blurred averages and reducing the quality of the final 3D reconstruction. We present a new method in which multireference alignment is harmonized with classification (multireference multiple alignment: MRMA). This method enables a statistical comparison of multiple alignment peaks, reflecting the similarities between each raw image and a set of reference images. Among the selected alignment candidates for each raw image, misaligned images are statistically excluded, based on the principle that aligned raw images of similar projections have a dense distribution around the correctly aligned coordinates in image space. The newly developed method was examined for accuracy and speed using model image sets with various signal-to-noise ratios, and with electron microscope images of the Transient Receptor Potential C3 channel and the sodium channel. In every dataset, the new method outperformed conventional methods in robustness against noise and in speed, creating 2D average images of higher quality. This statistically harmonized alignment-classification combination should greatly improve the quality of single particle analysis.
Statistical analysis of arthroplasty data
2011-01-01
It is envisaged that guidelines for statistical analysis and presentation of results will improve the quality and value of research. The Nordic Arthroplasty Register Association (NARA) has therefore developed guidelines for the statistical analysis of arthroplasty register data. The guidelines are divided into two parts: one with an introduction and a discussion of the background to the guidelines (Ranstam et al. 2011a, see pages x-y in this issue), and this one with a more technical statistical discussion of how specific problems can be handled. This second part contains (1) recommendations for the interpretation of methods used to calculate survival, (2) recommendations on how to deal with bilateral observations, and (3) a discussion of problems and pitfalls associated with analysis of factors that influence survival or comparisons between outcomes extracted from different hospitals. PMID:21619500
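Implant survival in arthroplasty registers is conventionally summarized with the Kaplan-Meier estimator; a minimal sketch with invented follow-up times, treating revision as the event and assuming no tied times (the guidelines themselves discuss when this estimator is and is not appropriate, e.g. in the presence of competing risks):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates at each event time.
    events[i] is 1 for an event (revision), 0 for censoring.
    Assumes no tied observation times."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, s = len(times), 1.0
    curve = []
    for i in order:
        if events[i]:
            s *= (at_risk - 1) / at_risk  # survive this event time
            curve.append((times[i], round(s, 4)))
        at_risk -= 1  # event or censoring removes one from risk set
    return curve

# Invented data: revisions at years 2, 3 and 7; one hip censored at 5
curve = kaplan_meier([2, 3, 5, 7], [1, 1, 0, 1])
```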
[Cause-of-death statistics and ICD, quo vadis?]
Eckert, Olaf; Vogel, Ulrich
2018-07-01
The International Statistical Classification of Diseases and Related Health Problems (ICD) is the worldwide binding standard for generating underlying cause-of-death statistics. What are the effects of former revisions of the ICD on underlying cause-of-death statistics, and which opportunities and challenges are becoming apparent in a possible transition from ICD-10 to ICD-11? This article presents the calculation of the exploitation grade of ICD-9 and ICD-10 in the German cause-of-death statistics and the quality of documentation. Approximately 67,000 anonymized German death certificates are processed by Iris/MUSE, and official German cause-of-death statistics are analyzed. In addition to substantial changes in the exploitation grade in the transition from ICD-9 to ICD-10, regional effects become visible. The rate of so-called "ill-defined" conditions exceeds 10%. Despite substantial improvements across ICD revisions, there are long-known deficits in the coroner's inquest, the completion of death certificates, and the quality of coding. To make better use of the ICD as a methodological framework for mortality statistics and health reporting in Germany, the following measures are necessary: 1. General use of Iris/MUSE, 2. Establishing multiple underlying cause-of-death statistics, 3. Introduction of an electronic death certificate, 4. Improvement of the medical assessment of causes of death. Within a short time, the WHO will release the 11th revision of the ICD, which will provide additional opportunities for the development of underlying cause-of-death statistics and their use in science, public health and politics. A coordinated effort including participants in the process and users is necessary to meet the related challenges.
Rogala, James T.; Gray, Brian R.
2006-01-01
The Long Term Resource Monitoring Program (LTRMP) uses a stratified random sampling design to obtain water quality statistics within selected study reaches of the Upper Mississippi River System (UMRS). LTRMP sampling strata are based on aquatic area types generally found in large rivers (e.g., main channel, side channel, backwater, and impounded areas). For hydrologically well-mixed strata (i.e., main channel), variance associated with spatial scales smaller than the strata scale is a relatively minor issue for many water quality parameters. However, analysis of LTRMP water quality data has shown that within-strata variability at the strata scale is high in off-channel areas (i.e., backwaters). A portion of that variability may be associated with differences among individual backwater lakes (i.e., small and large backwater regions separated by channels) that cumulatively make up the backwater stratum. The objective of the statistical modeling presented here is to determine if differences among backwater lakes account for a large portion of the variance observed in the backwater stratum for selected parameters. If variance associated with backwater lakes is high, then inclusion of backwater lake effects within statistical models is warranted. Further, lakes themselves may represent natural experimental units where associations of interest to management may be estimated.
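The question posed above — whether differences among backwater lakes account for a large portion of the within-stratum variance — is, in a balanced design, a classical one-way random-effects variance-component problem. The following is a minimal numpy sketch of the ANOVA estimator, not the LTRMP statistical models themselves; the data and the chlorophyll framing are hypothetical.

```python
import numpy as np

def variance_components(y):
    """ANOVA estimator for a balanced one-way random-effects model.

    y : (g, n) array -- g lakes, n samples per lake, of one water-quality
    parameter. Returns (between-lake variance, within-lake variance).
    """
    g, n = y.shape
    grand = y.mean()
    means = y.mean(axis=1)
    msb = n * ((means - grand) ** 2).sum() / (g - 1)        # between-lake MS
    msw = ((y - means[:, None]) ** 2).sum() / (g * (n - 1))  # within-lake MS
    sigma2_lake = max(0.0, (msb - msw) / n)                  # truncated at 0
    return sigma2_lake, msw

# Simulated chlorophyll-like data with strong lake-to-lake differences:
rng = np.random.default_rng(1)
lake_effects = rng.normal(0.0, 3.0, size=8)   # sd 3 between 8 lakes
y = lake_effects[:, None] + rng.normal(0.0, 1.0, size=(8, 12))

s2_lake, s2_within = variance_components(y)
icc = s2_lake / (s2_lake + s2_within)   # share of variance due to lakes
```

A large intraclass correlation (`icc`) would support including lake effects in the statistical models, as the abstract argues.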
History of water quality parameters - a study on the Sinos River/Brazil.
Konzen, G B; Figueiredo, J A S; Quevedo, D M
2015-05-01
Water is increasingly becoming a valuable resource, constituting one of the central themes of environmental, economic and social discussions. The Sinos River, located in southern Brazil, is the main river of the Sinos River Basin, representing a source of drinking water supply for a highly populated region. Considering its size and importance, it becomes necessary to conduct a study to follow up the water quality of this river, which is considered by some experts to be one of the most polluted rivers in Brazil. The great importance of this study lies in its historical analysis of indicators. In this sense, we sought to develop aspects related to the management of water resources by performing a historical analysis of the Water Quality Index (WQI) of the Sinos River, using statistical methods. With regard to the methodological procedures, it should be pointed out that this study performs a time analysis of monitoring data on parameters related to a punctual measurement that is variable in time, using statistical tools. The data used refer to analyses of the water quality of the Sinos River (WQI) from the State Environmental Protection Agency Henrique Luiz Roessler (Fundação Estadual de Proteção Ambiental Henrique Luiz Roessler, FEPAM) covering the period between 2000 and 2008, as well as to a theoretical analysis focusing on the management of water resources. The study of the WQI and its parameters by statistical analysis proved effective as a tool for the management of water resources. The descriptive analysis of the WQI and its parameters showed that the water quality of the Sinos River is concerningly low, which reaffirms that it is one of the most polluted rivers in Brazil. It should be highlighted that there was an overall difficulty in obtaining data with the appropriate periodicity, as well as a long complete series, which limited the conduct of statistical studies such as the present one.
Development and application of a statistical quality assessment method for dense-graded mixes.
DOT National Transportation Integrated Search
2004-08-01
This report describes the development of the statistical quality assessment method and the procedure for mapping the measures obtained from the quality assessment method to a composite pay factor. The application to dense-graded mixes is demonstrated...
Crawford, J. Kent
1985-01-01
Historical water-quality data collected by the U.S. Geological Survey from the Cape Fear River at Lock 1, near Kelly, North Carolina, show increasing concentrations of total-dissolved solids, specific conductance, sulfate, chloride, nitrite plus nitrate nitrogen, magnesium, sodium, and potassium during the past 25 years. Silica and pH show decreasing trends during the same 1957-80 period. These long-term changes in water quality are statistically related to increasing population in the basin and especially to manufacturing employment. Comparisons of water-quality data for present conditions with estimated natural conditions indicate that over 50 percent of the loads of most major dissolved substances in the river at Lock 1 are the result of development impacts in the basin. Over 80 percent of the nutrients (nitrite plus nitrate nitrogen, ammonia nitrogen, and total phosphorus) presently in the streams originate from development. At four sampling stations on the Cape Fear River and its tributaries, recent water-quality data show that most constituents are always within North Carolina water-quality standards and Environmental Protection Agency water-quality criteria. However, iron, manganese and mercury concentrations usually exceed standards. Although no algal problems have been identified in the Cape Fear River, nitrogen and phosphorus are present in concentrations that have produced nuisance algal growths in lakes.
Learning process mapping heuristics under stochastic sampling overheads
NASA Technical Reports Server (NTRS)
Ieumwananonthachai, Arthur; Wah, Benjamin W.
1991-01-01
A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to obtain the best possible heuristics while trading off the solution quality of the process mapping heuristics against their execution time. The statistical selection method is extended to take into consideration the variations in the amount of time used to evaluate heuristics on a problem instance. The improvement in performance under this more realistic assumption is presented, along with some methods that alleviate the additional complexity.
Two statistical approaches, weighted regression on time, discharge, and season and generalized additive models, have recently been used to evaluate water quality trends in estuaries. Both models have been used in similar contexts despite differences in statistical foundations and...
Analyzing longitudinal data with the linear mixed models procedure in SPSS.
West, Brady T
2009-09-01
Many applied researchers analyzing longitudinal data share a common misconception: that specialized statistical software is necessary to fit hierarchical linear models (also known as linear mixed models [LMMs], or multilevel models) to longitudinal data sets. Although several specialized statistical software programs of high quality are available that allow researchers to fit these models to longitudinal data sets (e.g., HLM), rapid advances in general purpose statistical software packages have recently enabled analysts to fit these same models when using preferred packages that also enable other more common analyses. One of these general purpose statistical packages is SPSS, which includes a very flexible and powerful procedure for fitting LMMs to longitudinal data sets with continuous outcomes. This article aims to present readers with a practical discussion of how to analyze longitudinal data using the LMMs procedure in the SPSS statistical software package.
An Exercise in Exploring Big Data for Producing Reliable Statistical Information.
Rey-Del-Castillo, Pilar; Cardeñosa, Jesús
2016-06-01
The availability of copious data about many human, social, and economic phenomena is considered an opportunity for the production of official statistics. National statistical organizations and other institutions are more and more involved in new projects for developing what is sometimes seen as a possible change of paradigm in the way statistical figures are produced. Nevertheless, there are hardly any systems in production using Big Data sources. Arguments of confidentiality, data ownership, representativeness, and others make it a difficult task to get results in the short term. Using Call Detail Records from Ivory Coast as an illustration, this article shows some of the issues that must be dealt with when producing statistical indicators from Big Data sources. A proposal of a graphical method to evaluate one specific aspect of the quality of the computed figures is also presented, demonstrating that the visual insight provided improves the results obtained using other traditional procedures.
Kouris, Anargyros; Armyra, Kalliopi; Christodoulou, Christos; Katoulis, Alexandros; Potouridou, Irene; Tsatovidou, Revekka; Rigopoulos, Dimitrios; Kontochristopoulos, Georgios
2015-06-01
Chronic hand eczema is a common dermatological disorder of multifactorial aetiology. It affects physical, material, social and psychological aspects of life, thereby impairing health-related quality of life. The aim of the present study was to assess quality of life, anxiety, depression and obsessive-compulsive tendencies in patients with chronic hand eczema. Seventy-one patients with chronic hand eczema were included in the study. Quality of life was evaluated according to the Dermatology Life Quality Index (DLQI). Patients were also assessed for anxiety and depression with the Hospital Anxiety and Depression Scale (HADS), and for compulsive behaviour with the Leyton Trait Scale. The DLQI score was 11.11 ± 1.81 in patients with chronic hand eczema. Scores on the Leyton Trait Scale were significantly higher than those of healthy controls (p < 0.027). On the HADS-Anxiety subscale, patients with hand dermatitis had statistically significantly higher scores than healthy volunteers (p = 0.002). In contrast, no statistically significant difference was found between the two groups with regard to the HADS-Depression subscale score and total HADS score. Hand eczema treatment should address the severity of skin lesions as well as the psychological impact of hand eczema.
Méndez, Jesús; González, Mónica; Lobo, M Gloria; Carnero, Aurelio
2004-03-10
The commercial value of a cochineal (Dactylopius coccus Costa) sample is associated with its color quality. Because the cochineal is a legal food colorant, its color quality is generally understood as its pigment content. Simply put, the higher this content, the more valuable the sample is to the market. In an effort to devise a way to measure the color quality of a cochineal, the present study evaluates different parameters of color measurement such as chromatic attributes (L* and a*), percentage of carminic acid, tint determination, and chromatographic profile of pigments. Tint determination did not achieve this objective because this parameter does not correlate with carminic acid content. On the other hand, carminic acid showed a highly significant correlation (r = -0.922, p = 0.000) with L* values determined from powdered cochineal samples. The combination of the information from the spectrophotometric determination of carminic acid with that of the pigment profile acquired by liquid chromatography (LC) and the composition of the red and yellow pigment groups, also acquired by LC, enables greater accuracy in judging the quality of the final sample. As a result of this study, it was possible to achieve the separation of cochineal samples according to geographical origin using two statistical techniques: cluster analysis and principal component analysis.
[Review of meta-analysis research on exercise in South Korea].
Song, Youngshin; Gang, Moonhee; Kim, Sun Ae; Shin, In Soo
2014-10-01
The purpose of this study was to evaluate the quality of meta-analyses regarding exercise using the Assessment of Multiple Systematic Reviews (AMSTAR) as well as to compare effect sizes according to outcomes. Electronic databases including the Korean Studies Information Service System (KISS), the National Assembly Library and the DBpia, HAKJISA and RISS4U for the dates 1990 to January 2014 were searched for 'meta-analysis' and 'exercise' in the fields of medicine, nursing, physical therapy and physical exercise in Korea. AMSTAR was scored for quality assessment of the 33 articles included in the study. Data were analyzed using descriptive statistics, t-test, ANOVA and χ²-test. The mean score for AMSTAR evaluations was 4.18 (SD=1.78); about 67% of articles were classified at the low-quality level and 30% at the moderate-quality level. The quality scores differed statistically by field of research, number of participants, number of databases, financial support and approval by IRB. The effect sizes presented in individual studies differed by type of exercise in the applied intervention. This critical appraisal of meta-analyses published in various fields that focused on exercise indicates that a guideline such as the PRISMA checklist should be strongly recommended for optimum reporting of meta-analysis across research fields.
Melo Freire, C A; Borges, G A; Caldas, Dbm; Santos, R S; Ignácio, S A; Mazur, R F
To evaluate the cement line thickness and the interface quality in milled or injected lithium disilicate ceramic restorations and their influence on marginal adaptation using different cement types and different adhesive cementation techniques. Sixty-four bovine teeth were prepared for full crown restoration (7.0±0.5 mm in height, 8.0 mm in cervical diameter, and 4.2 mm in incisal diameter) and were divided into two groups: CAD/CAM automation technology, IPS e.max CAD (CAD), and isostatic injection by heat technology, IPS e.max Press (PRESS). RelyX ARC (ARC) and RelyX U200 resin cements were used as luting agents in two activation methods: initial self-activation and light pre-activation for one second (tack-cure). Next, the specimens were stored in distilled water at 23°C ± 2°C for 72 hours. The cement line thickness was measured in micrometers, and the interface quality received scores according to the characteristics and sealing aspects. The evaluations were performed with an optical microscope, and scanning electron microscope images were presented to demonstrate the various features found in the cement line. For the cement line thickness, data were analyzed with three-way analysis of variance (ANOVA) and the Games-Howell test (α=0.05). For the variable interface quality, the data were analyzed with the Mann-Whitney U-test, the Kruskal-Wallis test, and multiple comparisons nonparametric Dunn test (α=0.05). The ANOVA presented statistical differences among the ceramic restoration manufacturing methods as well as a significant interaction between the manufacturing methods and types of cement (p<0.05). The U200 presented lower cement line thickness values when compared to the ARC with both cementation techniques (p<0.05). With regard to the interface quality, the Mann-Whitney U-test and the Kruskal-Wallis test demonstrated statistical differences between the ceramic restoration manufacturing methods and cementation techniques. 
The PRESS ceramics obtained lower scores than did the CAD ceramics when using ARC cement (p<0.05). Milled restorations cemented with self-adhesive resin cement resulted in a thinner cement line that is statistically different from that of CAD or pressed ceramics cemented with resin cement with adhesive application. No difference between one-second tack-cure and self-activation was noted.
A full year evaluation of the CALIOPE-EU air quality modeling system over Europe for 2004
NASA Astrophysics Data System (ADS)
Pay, M. T.; Piot, M.; Jorba, O.; Gassó, S.; Gonçalves, M.; Basart, S.; Dabdub, D.; Jiménez-Guerrero, P.; Baldasano, J. M.
The CALIOPE-EU high-resolution air quality modeling system, namely WRF-ARW/HERMES-EMEP/CMAQ/BSC-DREAM8b, is developed and applied to Europe (12 km × 12 km, 1 h). The model's performance is tested in terms of air quality levels and dynamics reproducibility on a yearly basis. The present work describes a quantitative evaluation of gas-phase species (O3, NO2 and SO2) and particulate matter (PM2.5 and PM10) against ground-based measurements from the EMEP (European Monitoring and Evaluation Programme) network for the year 2004. The evaluation is based on statistics. Simulated O3 achieves satisfactory performance for both daily mean and daily maximum concentrations, especially in summer, with annual mean correlations of 0.66 and 0.69, respectively. Mean normalized errors fall within the recommendations proposed by the United States Environmental Protection Agency (US-EPA). The general trends and daily variations of the primary pollutants (NO2 and SO2) are satisfactory. Daily mean concentrations of NO2 correlate well with observations (annual correlation r = 0.67) but tend to be underestimated. For SO2, mean concentrations are well simulated (mean bias = 0.5 μg m⁻³) with a relatively high annual mean correlation (r = 0.60), although peaks are generally overestimated. The dynamics of PM2.5 and PM10 are well reproduced (0.49 < r < 0.62), but mean concentrations remain systematically underestimated. Deficiencies in particulate matter source characterization are discussed. Also, the spatially distributed statistics and the general patterns for each pollutant over Europe are examined. The model's performance is compared with that of other European studies. While O3 statistics generally remain lower than those obtained by the other considered studies, statistics for NO2, SO2, PM2.5 and PM10 present higher scores than most models.
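Evaluation statistics of the kind reported above (mean bias, annual correlation, mean normalized error) take only a few lines of numpy. This is a hedged sketch with made-up numbers, not the CALIOPE evaluation code; the function name and data are illustrative.

```python
import numpy as np

def eval_stats(model, obs):
    """Common model-evaluation statistics used in air-quality studies.

    Returns mean bias (model - obs), Pearson correlation r, RMSE, and the
    mean normalized error MNE = mean(|model - obs| / obs) * 100, in percent.
    """
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    bias = (model - obs).mean()
    r = np.corrcoef(model, obs)[0, 1]
    rmse = np.sqrt(((model - obs) ** 2).mean())
    mne = (np.abs(model - obs) / obs).mean() * 100.0
    return bias, r, rmse, mne

obs = np.array([40.0, 55.0, 62.0, 48.0, 70.0])   # e.g. daily max O3 (units arbitrary)
mod = np.array([38.0, 50.0, 66.0, 45.0, 64.0])
bias, r, rmse, mne = eval_stats(mod, obs)        # bias < 0 -> model underestimates
```

Here a negative mean bias with a high correlation reproduces the pattern the abstract describes for NO2: good dynamics but systematic underestimation.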
NASA Astrophysics Data System (ADS)
Awi; Ahmar, A. S.; Rahman, A.; Minggi, I.; Mulbar, U.; Asdar; Ruslan; Upu, H.; Alimuddin; Hamda; Rosidah; Sutamrin; Tiro, M. A.; Rusli
2018-01-01
This research aims to reveal the level of creativity and the statistical problem-posing ability of students in the 2014 batch of the Mathematics Education program at the State University of Makassar, in terms of their cognitive style. The research uses an explorative qualitative method, providing meta-cognitive scaffolding during the research. The hypothesis is that students with a field-independent (FI) cognitive style, when posing statistical problems from the provided information, are already able to propose solvable statistical problems that create new data, and such problems qualify as high-quality statistical problems, while students with a field-dependent (FD) cognitive style are generally still limited to posing solvable statistical problems that do not involve new data, and such problems qualify as medium-quality statistical problems.
Jadidi, Masoud; Båth, Magnus; Nyrén, Sven
2018-04-09
To compare the quality of images obtained with two different protocols with different acquisition time and the influence from image post processing in a chest digital tomosynthesis (DTS) system. 20 patients with suspected lung cancer were imaged with a chest X-ray equipment with tomosynthesis option. Two examination protocols with different acquisition times (6.3 and 12 s) were performed on each patient. Both protocols were presented with two different image post-processing (standard DTS processing and more advanced processing optimised for chest radiography). Thus, 4 series from each patient, altogether 80 series, were presented anonymously and in a random order. Five observers rated the quality of the reconstructed section images according to predefined quality criteria in three different classes. Visual grading characteristics (VGC) was used to analyse the data and the area under the VGC curve (AUC_VGC) was used as figure-of-merit. The 12 s protocol and the standard DTS processing were used as references in the analyses. The protocol with 6.3 s acquisition time had a statistically significant advantage over the vendor-recommended protocol with 12 s acquisition time for the classes of criteria, Demarcation (AUC_VGC = 0.56, p = 0.009) and Disturbance (AUC_VGC = 0.58, p < 0.001). A similar value of AUC_VGC was found also for the class Structure (definition of bone structures in the spine) (0.56) but it could not be statistically separated from 0.5 (p = 0.21). For the image processing, the VGC analysis showed a small but statistically significant advantage for the standard DTS processing over the more advanced processing for the classes of criteria Demarcation (AUC_VGC = 0.45, p = 0.017) and Disturbance (AUC_VGC = 0.43, p = 0.005). A similar value of AUC_VGC was found also for the class Structure (0.46), but it could not be statistically separated from 0.5 (p = 0.31). 
The study indicates that the protocol with 6.3 s acquisition time yields slightly better image quality than the vendor-recommended protocol with 12 s acquisition time for several anatomical structures. Furthermore, the standard gradation processing (the vendor-recommended post-processing for DTS) yields, to some extent, an advantage over the gradation processing/multiobjective frequency processing/flexible noise control processing in terms of image quality for all classes of criteria. Advances in knowledge: The study shows that image quality may be strongly affected by the selection of DTS protocol and that the vendor-recommended protocol may not always be the optimal choice.
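The figure-of-merit used above, the area under the VGC curve, is closely related to the nonparametric probability that a rating for one condition exceeds a rating for the other (ties counted half), the same quantity underlying the Mann-Whitney U statistic. A toy sketch with hypothetical ordinal ratings, not the VGC software used in the study:

```python
import numpy as np

def rating_auc(a, b):
    """Nonparametric AUC for two sets of ordinal image-quality ratings.

    Equals P(a > b) + 0.5 * P(a == b) over all cross pairs. A value of 0.5
    means no quality difference; > 0.5 favours condition `a`.
    """
    a, b = np.asarray(a), np.asarray(b)
    gt = (a[:, None] > b[None, :]).sum()   # pairs where `a` rated higher
    eq = (a[:, None] == b[None, :]).sum()  # tied pairs count half
    return (gt + 0.5 * eq) / (a.size * b.size)

# Ratings (1 = worst ... 5 = best) for two hypothetical protocols:
short_acq = [4, 4, 5, 3, 4, 5, 3, 4]
long_acq  = [3, 4, 4, 3, 3, 4, 2, 4]
auc = rating_auc(short_acq, long_acq)   # > 0.5 favours the short protocol
```

Swapping the arguments gives the complementary area, mirroring how the study's AUC_VGC values below 0.5 favour the reference condition.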
Influence of leadership on quality nursing care.
Mendes, Luis; Fradique, Maria de Jesus José Gil
2014-01-01
The purpose of this paper is to investigate the extent to which nursing leadership, perceived by nursing staff, influences nursing quality. Data were collected between August and October 2011 in a Portuguese health center via a questionnaire completed by nurses. Our original sample included 283 employees; 184 questionnaires were received (65% response). The theoretical model presents reasonably satisfactory fit indices (values above literature reference). Path analysis between latent constructs clearly suggests that nursing leadership has a direct (beta = 0.724) and statistically significant (p = 0.007) effect on nursing quality. Results reinforce several ideas propagated throughout the literature, which suggests the relationship's relevance, but lacks empirical support, which this study corrects.
A generalised significance test for individual communities in networks.
Kojaku, Sadamori; Masuda, Naoki
2018-05-09
Many empirical networks have community structure, in which nodes are densely interconnected within each community (i.e., a group of nodes) and sparsely across different communities. Like other local and meso-scale structures of networks, communities are generally heterogeneous in various aspects such as size, density of edges, connectivity to other communities and significance. In the present study, we propose a method to statistically test the significance of individual communities in a given network. Compared to previous methods, the present algorithm is unique in that it accepts different community-detection algorithms and the corresponding quality function for single communities. The present method requires that the quality of each community can be quantified and that community detection is performed as optimisation of such a quality function summed over the communities. Various community detection algorithms, including modularity maximisation and graph partitioning, meet this criterion. Our method estimates a distribution of the quality function for randomised networks to calculate a likelihood of each community in the given network. We illustrate our algorithm with synthetic and empirical networks.
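The core idea above — estimate the null distribution of a community quality function on randomised networks and derive a likelihood for each observed community — can be sketched in a few lines. This sketch deliberately substitutes a simpler setup for the paper's general method: the quality function is the community's internal edge count, and the null model is an Erdős-Rényi graph with matched density (both are assumptions for illustration only).

```python
import numpy as np

def community_pvalue(adj, members, n_null=2000, seed=0):
    """Monte Carlo significance of one community's internal edge count.

    Under the assumed Erdos-Renyi null with the observed edge density, the
    internal edge count of a k-node group is Binomial(k*(k-1)/2, p), so the
    null distribution can be sampled directly.
    """
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    p = adj[np.triu_indices(n, 1)].mean()                 # observed density
    k = len(members)
    idx = np.ix_(members, members)
    obs = adj[idx][np.triu_indices(k, 1)].sum()           # internal edges
    m = k * (k - 1) // 2                                  # internal node pairs
    null = rng.binomial(m, p, size=n_null)                # null edge counts
    return (null >= obs).mean()                           # one-sided p-value

# Toy network: a 6-node clique embedded in a sparse 30-node background.
rng = np.random.default_rng(42)
n = 30
adj = (rng.random((n, n)) < 0.08).astype(int)
adj = np.triu(adj, 1); adj = adj + adj.T                  # symmetric, no loops
clique = list(range(6))
for i in clique:
    for j in clique:
        if i != j:
            adj[i, j] = 1

p_val = community_pvalue(adj, clique)   # small: the clique is significant
```

The paper's method generalises this by plugging in the quality function of whatever detection algorithm was used, rather than raw edge counts under an Erdős-Rényi null.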
MUSTANG: A Community-Facing Web Service to Improve Seismic Data Quality Awareness Through Metrics
NASA Astrophysics Data System (ADS)
Templeton, M. E.; Ahern, T. K.; Casey, R. E.; Sharer, G.; Weertman, B.; Ashmore, S.
2014-12-01
IRIS DMC is engaged in a new effort to provide broad and deep visibility into the quality of data and metadata found in its terabyte-scale geophysical data archive. Taking advantage of large and fast disk capacity, modern advances in open database technologies, and nimble provisioning of virtual machine resources, we are creating an openly accessible treasure trove of data measurements for scientists and the general public to utilize in providing new insights into the quality of this data. We have branded this statistical gathering system MUSTANG, and have constructed it as a component of the web services suite that IRIS DMC offers. MUSTANG measures over forty data metrics addressing issues with archive status, data statistics and continuity, signal anomalies, noise analysis, metadata checks, and station state of health. These metrics could potentially be used both by network operators to diagnose station problems and by data users to sort suitable data from unreliable or unusable data. Our poster details what MUSTANG is, how users can access it, what measurements they can find, and how MUSTANG fits into the IRIS DMC's data access ecosystem. Progress in data processing, approaches to data visualization, and case studies of MUSTANG's use for quality assurance will be presented. We want to illustrate what is possible with data quality assurance, the need for data quality assurance, and how the seismic community will benefit from this freely available analytics service.
Quality of life and psychosocial aspects in Greek patients with psoriasis: a cross-sectional study.
Kouris, Anargyros; Christodoulou, Christos; Stefanaki, Christina; Livaditis, Miltiadis; Tsatovidou, Revekka; Kouskoukis, Constantinos; Petridis, Athanasios; Kontochristopoulos, George
2015-01-01
Psoriasis is a common, long-term skin disease associated with high levels of psychological distress and a considerable adverse impact on life. The effects of psoriasis, beyond skin affliction, are seldom recognized and often undertreated. The aim of the study is to evaluate the quality of life, anxiety and depression, self-esteem and loneliness in patients with psoriasis. Eighty-four patients with psoriasis were enrolled in the study. The quality of life, depression and anxiety, loneliness and self-esteem of the patients were assessed using the Dermatology Life Quality Index, the Hospital Anxiety and Depression Scale, the UCLA Loneliness Scale (UCLA-Version 3) and Rosenberg's Self-esteem Scale, respectively. The Dermatology Life Quality Index score among psoriasis patients was 12.61 ± 4.88. They had statistically significantly higher scores on the Hospital Anxiety and Depression Scale anxiety subscale (p=0.032) compared with healthy volunteers. Moreover, a statistically significant difference was found between the two groups concerning the UCLA scale (p=0.033) and RSES scale (p<0.0001). Female patients presented with lower self-esteem than male patients. Psoriasis is a distressing, recurrent disorder that significantly impairs quality of life. Therefore, the recognition and future management of psoriasis may require the involvement of multi-disciplinary teams to manage the physical, psychological and social aspects of the condition, as is the case for systemic, long-term conditions.
Liu, Yingchun; Sun, Guoxiang; Wang, Yan; Yang, Lanping; Yang, Fangliang
2015-06-01
Micellar electrokinetic chromatography fingerprinting combined with quantification was successfully developed and applied to monitor the quality consistency of Weibizhi tablets, a classical compound preparation used to treat gastric ulcers. A background electrolyte composed of 57 mmol/L sodium borate, 21 mmol/L sodium dodecylsulfate and 100 mmol/L sodium hydroxide was used to separate compounds. To optimize capillary electrophoresis conditions, multivariate statistical analyses were applied. First, the most important factors influencing sample electrophoretic behavior were identified as background electrolyte concentrations. Then, a Box-Behnken design response surface strategy using the resolution index RF as an integrated response was set up to correlate factors with response. RF reflects the effective signal amount, resolution, and signal homogenization in an electropherogram; thus, it was regarded as an excellent indicator. In fingerprint assessments, a simple quantified ratio fingerprint method was established for comprehensive quality discrimination of traditional Chinese medicines/herbal medicines from qualitative and quantitative perspectives, by which the quality of 27 samples from the same manufacturer was well differentiated. In addition, the fingerprint-efficacy relationship between fingerprints and antioxidant activities was established using partial least squares regression, which provided important medicinal efficacy information for quality control. The present study offered an efficient means for monitoring Weibizhi tablet quality consistency.
Selen, Arzu; Cruañes, Maria T; Müllertz, Anette; Dickinson, Paul A; Cook, Jack A; Polli, James E; Kesisoglou, Filippos; Crison, John; Johnson, Kevin C; Muirhead, Gordon T; Schofield, Timothy; Tsong, Yi
2010-09-01
A biopharmaceutics and Quality by Design (QbD) conference was held on June 10-12, 2009 in Rockville, Maryland, USA to provide a forum and identify approaches for enhancing product quality for patient benefit. Presentations concerned the current biopharmaceutical toolbox (i.e., in vitro, in silico, pre-clinical, in vivo, and statistical approaches), as well as case studies, and reflections on new paradigms. Plenary and breakout session discussions evaluated the current state and envisioned a future state that more effectively integrates QbD and biopharmaceutics. Breakout groups discussed the following four topics: Integrating Biopharmaceutical Assessment into the QbD Paradigm, Predictive Statistical Tools, Predictive Mechanistic Tools, and Predictive Analytical Tools. Nine priority areas, further described in this report, were identified for advancing integration of biopharmaceutics and support a more fundamentally based, integrated approach to setting product dissolution/release acceptance criteria. Collaboration among a broad range of disciplines and fostering a knowledge sharing environment that places the patient's needs as the focus of drug development, consistent with science- and risk-based spirit of QbD, were identified as key components of the path forward.
Brady, Amie M.G.; Plona, Meg B.
2009-01-01
During the recreational season of 2008 (May through August), a regression model relating turbidity to concentrations of Escherichia coli (E. coli) was used to predict recreational water quality in the Cuyahoga River at the historical community of Jaite, within the present city of Brecksville, Ohio, a site centrally located within Cuyahoga Valley National Park. Samples were collected three days per week at Jaite and at three other sites on the river. Concentrations of E. coli were determined and compared to environmental and water-quality measures and to concentrations predicted with a regression model. Linear relations between E. coli concentrations and turbidity, gage height, and rainfall were statistically significant for Jaite. Relations between E. coli concentrations and turbidity were statistically significant for the three additional sites, and relations between E. coli concentrations and gage height were significant at the two sites where gage-height data were available. The turbidity model correctly predicted concentrations of E. coli above or below Ohio's single-sample standard for primary-contact recreation for 77 percent of samples collected at Jaite.
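A turbidity-based model of this kind is essentially a least-squares regression in log space. The sketch below uses synthetic data: the coefficients, sample values, and the 235 CFU/100 mL screening limit (borrowed from the US EPA single-sample criterion as a stand-in for Ohio's standard) are illustrative assumptions, not the study's fitted values.

```python
import numpy as np

# Synthetic paired observations standing in for the calibration data:
# turbidity (NTU) and E. coli (CFU/100 mL), both log10-transformed,
# the usual form for turbidity-surrogate regression models.
rng = np.random.default_rng(42)
log_turb = rng.uniform(0.3, 2.5, 60)                    # ~2 to ~316 NTU
log_ecoli = 1.2 + 0.9 * log_turb + 0.15 * rng.standard_normal(60)

b1, b0 = np.polyfit(log_turb, log_ecoli, 1)             # OLS in log space

def predict_ecoli(turbidity_ntu):
    """Back-transformed point prediction (ignores retransformation bias)."""
    return 10.0 ** (b0 + b1 * np.log10(turbidity_ntu))

# Screen a new reading against an illustrative single-sample limit;
# 235 CFU/100 mL is the US EPA criterion, used here as a placeholder.
exceeds = predict_ecoli(50.0) > 235.0
```

In practice such models are also checked against held-out samples, as the study does by comparing predicted and observed exceedances.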
Williams-Sether, Tara
2004-01-01
The Dakota Water Resources Act, passed by the U.S. Congress on December 15, 2000, authorized the Secretary of the Interior to conduct a comprehensive study of future water-quantity and water-quality needs of the Red River of the North Basin in North Dakota and possible options to meet those needs. Previous Red River of the North Basin studies conducted by the Bureau of Reclamation used streamflow and water-quality databases developed by the U.S. Geological Survey that included data for 1931-84. As a result of the recent congressional authorization and the results of previous studies by the Bureau of Reclamation, redevelopment of the streamflow and water-quality databases with current data through 1999 is needed in order to evaluate and predict water-quantity and water-quality effects within the Red River of the North Basin. This report provides updated statistical summaries of selected water-quality constituents and streamflow and the regression relations between them. Available data for 1931-99 were used to develop regression equations between five selected water-quality constituents and streamflow for 38 gaging stations in the Red River of the North Basin. The water-quality constituents that were regressed against streamflow were hardness (as CaCO3), sodium, chloride, sulfate, and dissolved solids. Statistical summaries of the selected water-quality constituents and streamflow for the gaging stations used in developing the regression equations, and the applications and limitations of the regression equations, are presented in this report.
The effect of different types of employment on quality of life.
Kober, R; Eggleton, I R C
2005-10-01
Despite research that has investigated whether the financial benefits of open employment exceed the costs, there has been scant research as to the effect sheltered and open employment have upon the quality of life of participants. The importance of this research is threefold: it investigates outcomes explicitly in terms of quality of life; the sample size is comparatively large; and it uses an established and validated questionnaire. One hundred and seventeen people with intellectual disability (ID) who were employed in either open or sheltered employment by disability employment agencies were interviewed. Quality of life was assessed using the Quality of Life Questionnaire. After making an initial assessment to see whether the outcomes achieved depended on type of employment, quality of life scores were analyzed controlling for participants' level of functional work ability (assessed via the Functional Assessment Inventory). The results showed that participants placed in open employment reported statistically significantly higher quality of life scores. When the sample was split based upon participants' functional work ability, the type of employment had no effect on the reported quality of life of participants with low functional work ability. However, among participants with high functional work ability, those in open employment reported statistically significantly higher quality of life. The results of this study support the placement of people with ID with high functional work ability into open employment. However, a degree of caution needs to be taken in interpreting the results presented, given the disparity in income levels between the two types of employment.
2011-01-01
Background Despite more than a decade of research on hospitalists and their performance, disagreement still exists regarding whether and how hospital-based physicians improve the quality of inpatient care delivery. This systematic review summarizes the findings from 65 comparative evaluations to determine whether hospitalists provide a higher quality of inpatient care relative to traditional inpatient physicians who maintain hospital privileges with concurrent outpatient practices. Methods Articles on hospitalist performance published between January 1996 and December 2010 were identified through MEDLINE, Embase, Science Citation Index, CINAHL, NHS Economic Evaluation Database and a hand-search of reference lists, key journals and editorials. Comparative evaluations presenting original, quantitative data on processes, efficiency or clinical outcome measures of care between hospitalists, community-based physicians and traditional academic attending physicians were included (n = 65). After proposing a conceptual framework for evaluating inpatient physician performance, major findings on quality are summarized according to their percentage change, direction and statistical significance. Results The majority of reviewed articles demonstrated that hospitalists are efficient providers of inpatient care on the basis of reductions in their patients' average length of stay (69%) and total hospital costs (70%); however, the clinical quality of hospitalist care appears to be comparable to that provided by their colleagues. The methodological quality of hospitalist evaluations remains a concern and has not improved over time. Persistent issues include insufficient reporting of source or sample populations (n = 30), patients lost to follow-up (n = 42) and estimates of effect or random variability (n = 35); inappropriate use of statistical tests (n = 55); and failure to adjust for established confounders (n = 37). 
Conclusions Future research should include an expanded focus on the specific structures of care that differentiate hospitalists from other inpatient physician groups as well as the development of better conceptual and statistical models that identify and measure underlying mechanisms driving provider-outcome associations in quality. PMID:21592322
IEEE International Symposium on Biomedical Imaging.
2017-01-01
The IEEE International Symposium on Biomedical Imaging (ISBI) is a scientific conference dedicated to mathematical, algorithmic, and computational aspects of biological and biomedical imaging, across all scales of observation. It fosters knowledge transfer among different imaging communities and contributes to an integrative approach to biomedical imaging. ISBI is a joint initiative from the IEEE Signal Processing Society (SPS) and the IEEE Engineering in Medicine and Biology Society (EMBS). The 2018 meeting will include tutorials, and a scientific program composed of plenary talks, invited special sessions, challenges, as well as oral and poster presentations of peer-reviewed papers. High-quality papers are requested containing original contributions to the topics of interest including image formation and reconstruction, computational and statistical image processing and analysis, dynamic imaging, visualization, image quality assessment, and physical, biological, and statistical modeling. Accepted 4-page regular papers will be published in the symposium proceedings published by IEEE and included in IEEE Xplore. To encourage attendance by a broader audience of imaging scientists and offer additional presentation opportunities, ISBI 2018 will continue to have a second track featuring posters selected from 1-page abstract submissions without subsequent archival publication.
Taguchi Approach to Design Optimization for Quality and Cost: An Overview
NASA Technical Reports Server (NTRS)
Unal, Resit; Dean, Edwin B.
1990-01-01
Calibrations to the existing cost of doing business in space indicate that establishing a human presence on the Moon and Mars under the Space Exploration Initiative (SEI) will require resources that many feel exceed what the national budget can afford. For SEI to succeed, we must actually design and build space systems at lower cost this time, even with tremendous increases in quality and performance requirements, such as extremely high reliability. This implies that both government and industry must change the way they do business. Therefore, new philosophy and technology must be employed to design and produce reliable, high-quality space systems at low cost. Recognizing the need to reduce cost and improve quality and productivity, the Department of Defense (DoD) and the National Aeronautics and Space Administration (NASA) have initiated Total Quality Management (TQM). TQM is a revolutionary management strategy for quality assurance and cost reduction. TQM requires complete management commitment, employee involvement, and the use of statistical tools. The quality engineering methods of Dr. Taguchi, employing design of experiments (DOE), are among the most important statistical tools of TQM for designing high-quality systems at reduced cost. Taguchi methods provide an efficient and systematic way to optimize designs for performance, quality, and cost. They have been used successfully in Japan and the United States to design reliable, high-quality products at low cost in such areas as automobiles and consumer electronics. However, these methods are just beginning to see application in the aerospace industry. The purpose of this paper is to present an overview of the Taguchi methods for improving quality and reducing cost, and to describe the current state of applications and their role in identifying cost-sensitive design parameters.
Wellner, Ulrich F; Klinger, Carsten; Lehmann, Kai; Buhr, Heinz; Neugebauer, Edmund; Keck, Tobias
2017-04-05
Pancreatic resections are among the most complex procedures in visceral surgery. While mortality has decreased substantially over the past decades, morbidity remains high. The volume-outcome correlation in pancreatic surgery is among the strongest in the field of surgery. The German Society for General and Visceral Surgery (DGAV) established a national registry for quality control, risk assessment and outcomes research in pancreatic surgery in Germany (DGAV StuDoQ|Pancreas). Here, we present the aims and scope of the DGAV StuDoQ|Pancreas registry, together with a systematic assessment of its quality based on the recommendations of the German network for outcomes research (DNVF). The registry was assessed against the DNVF consensus criteria in the domains of systematics and appropriateness, standardization, validity of the sampling procedure, validity of data collection, validity of statistical analysis and reports, and general demands for registry quality. In summary, DGAV StuDoQ|Pancreas meets most of the criteria of a high-quality clinical registry and provides a valuable platform for quality assessment, outcomes research and randomized registry trials in pancreatic surgery.
[Effect of a multidisciplinary protocol on the clinical results obtained after bariatric surgery].
Cánovas Gaillemin, B; Sastre Martos, J; Moreno Segura, G; Llamazares Iglesias, O; Familiar Casado, C; Abad de Castro, S; López Pardo, R; Sánchez-Cabezudo Muñoz, M A
2011-01-01
Bariatric surgery has been shown to be an effective therapy for weight loss in patients with severe obesity, and the implementation of a multidisciplinary management protocol is recommended. To assess the usefulness of the implementation of a management protocol in obesity surgery based on the Spanish Consensus Document of the SEEDO. Retrospective comparative study of the outcomes in patients operated on before (51 patients) and after (66 patients) the implementation of the protocol. The following data were gathered: anthropometry, pre- and post-surgery comorbidities, post-surgical nutritional and surgical complications, a validated quality-of-life questionnaire, and dietary habits. Withdrawals (17.6%) and alcoholism (5.8%) were higher in patients before implementation of the protocol than after (4.5% and 3%, respectively), the differences being statistically significant. The mortality rate was 2% in the pre-protocol group and 0% in the post-protocol group. Dietary habits were better in the post-protocol group, with the pre-protocol group presenting a higher percentage of feeding-behavior disorders (5.1%), although without reaching statistical significance. The improvement in quality of life was greater in the post-protocol group for all items, but only reached statistical significance for sexual activity (p = 0.004). In the pre-protocol group, 70.5% of the patients had more than one nutritional complication vs. 32.8% in the post-protocol group (p < 0.05). There were no differences in the percentage of excess weight lost at two years (> 50% in 81.3% of the pre-protocol group vs. 74.8% of the post-protocol group) or in comorbidities. Bariatric surgery achieves excellent outcomes in weight loss, comorbidities, and quality of life, but presents nutritional, surgical, and psychiatric complications that require a protocol-based and multidisciplinary approach.
Our protocol improves the outcomes regarding the withdrawal rates, feeding-behavior disorders, dietary habits, nutritional complications, and quality of life.
Wavelet methodology to improve single unit isolation in primary motor cortex cells.
Ortiz-Rosario, Alexis; Adeli, Hojjat; Buford, John A
2015-05-15
The proper isolation of action potentials recorded extracellularly from neural tissue is an active area of research in the fields of neuroscience and biomedical signal processing. This paper presents an isolation methodology for neural recordings using the wavelet transform (WT), a statistical thresholding scheme, and the principal component analysis (PCA) algorithm. The effectiveness of five different mother wavelets was investigated: biorthogonal, Daubechies, discrete Meyer, symlet, and coiflet; along with three different wavelet coefficient thresholding schemes: fixed-form threshold, Stein's unbiased estimate of risk, and minimax; and two different thresholding rules: soft and hard thresholding. The signal quality was evaluated using three different statistical measures: mean-squared error, root mean square, and signal-to-noise ratio. The clustering quality was evaluated using two different statistical measures: isolation distance and L-ratio. This research shows that the selection of the mother wavelet has a strong influence on the clustering and isolation of single-unit neural activity, with the Daubechies 4 wavelet and minimax thresholding scheme performing best. Copyright © 2015. Published by Elsevier B.V.
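Wavelet-domain denoising of the kind compared here can be sketched in a few lines. This is a minimal illustration only: it uses a one-level Haar transform and the fixed-form (universal) threshold with the soft rule, not the Daubechies 4 wavelet and minimax scheme the study found best, and the "spike train" is a synthetic piecewise-constant stand-in.

```python
import numpy as np

def haar_fwd(x):
    # One-level orthonormal Haar DWT; len(x) must be even.
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    return a, d

def haar_inv(a, d):
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def haar_denoise(x):
    a, d = haar_fwd(x)
    sigma = np.median(np.abs(d)) / 0.6745          # robust noise estimate
    thr = sigma * np.sqrt(2.0 * np.log(x.size))    # fixed-form (universal) threshold
    d_soft = np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)  # soft rule
    return haar_inv(a, d_soft)

rng = np.random.default_rng(0)
clean = np.repeat([0.0, 1.0, 0.0, 2.0], 64)        # synthetic piecewise signal
noisy = clean + 0.1 * rng.standard_normal(clean.size)
den = haar_denoise(noisy)
```

Swapping in a different mother wavelet or threshold rule changes only the transform and the shrinkage step, which is what the paper's comparison varies.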
QSAR models for anti-malarial activity of 4-aminoquinolines.
Masand, Vijay H; Toropov, Andrey A; Toropova, Alla P; Mahajan, Devidas T
2014-03-01
In the present study, predictive quantitative structure-activity relationship (QSAR) models for the anti-malarial activity of 4-aminoquinolines have been developed. CORAL, which is freely available on the internet (http://www.insilico.eu/coral), has been used as a QSAR analysis tool to establish statistically robust QSAR models of the anti-malarial activity of 4-aminoquinolines. Six random splits into a visible training sub-system and an invisible validation sub-system were examined. The statistical qualities of these splits vary, but in all cases the statistical quality of prediction for anti-malarial activity was quite good. The optimal SMILES-based descriptor was used to derive a single-descriptor QSAR model for a data set of 112 aminoquinolines. All the splits had r² > 0.85 and r² > 0.78 for the subtraining and validation sets, respectively. The three-parameter multilinear regression (MLR) QSAR model has Q² = 0.83, R² = 0.84 and F = 190.39. The anti-malarial activity correlates strongly with the presence/absence of nitrogen and oxygen at a topological distance of six.
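The three-parameter MLR step can be sketched with ordinary least squares. The descriptors and activities below are invented placeholders, not the CORAL SMILES-based descriptors or the 112-compound data set; only the fitting and R² computation mirror the reported model form.

```python
import numpy as np

# Synthetic stand-in: 112 compounds, 3 hypothetical descriptors.
rng = np.random.default_rng(7)
n = 112
X = rng.standard_normal((n, 3))                  # placeholder descriptors
true_w = np.array([0.8, -0.5, 0.3])
y = X @ true_w + 0.05 * rng.standard_normal(n)   # pIC50-like activity

A = np.column_stack([np.ones(n), X])             # intercept + descriptors
coef, *_ = np.linalg.lstsq(A, y, rcond=None)     # MLR fit
y_hat = A @ coef
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot                       # coefficient of determination
```

A Q² of the kind reported would additionally require cross-validated (e.g. leave-one-out) predictions rather than the in-sample fit shown here.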
Knobel, LeRoy L.
2006-01-01
This report presents qualitative and quantitative comparisons of water-quality data from the Idaho National Laboratory, Idaho, to determine if the change from purging three wellbore volumes to one wellbore volume has a discernible effect on the comparability of the data. Historical water-quality data for 30 wells were visually compared to water-quality data collected after purging only 1 wellbore volume from the same wells. Of the 322 qualitatively examined constituent plots, 97.5 percent met 1 or more of the criteria established for determining data comparability. A simple statistical equation to determine if water-quality data collected from 28 wells at the INL with long purge times (after pumping 1 and 3 wellbore volumes of water) were statistically the same at the 95-percent confidence level indicated that 97.9 percent of 379 constituent pairs were equivalent. Comparability of water-quality data determined from both the qualitative (97.5 percent comparable) and quantitative (97.9 percent comparable) evaluations after purging 1 and 3 wellbore volumes of water indicates that the change from purging 3 to 1 wellbore volumes had no discernible effect on comparability of water-quality data at the INL. However, the qualitative evaluation was limited because only October-November 2003 data were available for comparison to historical data. This report was prepared by the U.S. Geological Survey in cooperation with the U.S. Department of Energy.
Kim, Boram
2014-01-01
BACKGROUND/OBJECTIVES Although the issues of singles' dietary style and quality of life are becoming important due to the increasing number of singles with economic power, little research has been conducted to date on singles' use of convenience food and quality of life in relation to their dietary style. Thus, the present study intends to provide basic data to improve quality of life by determining the current status of the use of convenience food and explicating its relationship with quality of life through an analysis of the dietary lifestyles of singles. SUBJECTS/METHODS The subjects of this study were singles: adults between the ages of 25 and 54 who live alone and, either legally or in actuality, have no partner. A statistical analysis of 208 surveys from Seoul was conducted using SPSS 12.0 for Windows and SEM using the AMOS 5.0 statistics package. RESULTS The convenience-oriented dietary style was shown to have a significant positive effect on convenience food satisfaction. Satisfaction with home meal replacement (HMR) was found to have a significant effect on positive psychological satisfaction, and the convenience-oriented style was found to have a significant negative effect on all aspects of quality-of-life satisfaction. CONCLUSIONS There must be persistent development of food industries considering the distinctive characteristics of the lives of singles in order to satisfy their needs and improve the quality of their lives. PMID:25324938
Kouris, Anargyros; Armyra, Kalliopi; Christodoulou, Christos; Sgontzou, Themis; Karypidis, Dimitrios; Kontochristopoulos, George; Liordou, Fotini; Zakopoulou, Nikoletta; Zouridaki, Eftychia
2016-10-01
Chronic leg ulcers are a public health problem that can have a significant impact on the patient's physical, socioeconomic and psychological status. The aim of this study is to evaluate the quality of life, anxiety and depression, self-esteem and loneliness in patients suffering from leg ulcers. A total of 102 patients were enrolled in the study. The quality of life, anxiety and depression, self-esteem and loneliness of the patients were assessed using the Dermatology Life Quality Index (DLQI), Hospital Anxiety and Depression Scale (HADS), Rosenberg's Self-esteem Scale (RSES) and the UCLA Loneliness Scale (UCLA-Version 3), respectively. The mean DLQI score was 13.38 ± 2.59, suggesting a serious effect on the quality of life of patients. Those with leg ulcers had statistically significantly higher scores on the HADS-total scale (P = 0.031) and HADS-anxiety subscale (P = 0.015) compared with healthy volunteers. Moreover, a statistically significant difference was found between the two groups on the UCLA scale (P = 0.029). Female patients presented with higher scores for anxiety (P = 0.027) and social isolation (P = 0.048), and worse quality of life (P = 0.018), than male patients. A severe quality-of-life impairment was documented, reflecting a significant psychosocial impact on patients with leg ulcers. © 2014 The Authors. International Wound Journal © 2014 Medicalhelplines.com Inc and John Wiley & Sons Ltd.
Consistency errors in p-values reported in Spanish psychology journals.
Caperos, José Manuel; Pardo, Antonio
2013-01-01
Recent reviews have drawn attention to frequent consistency errors when reporting statistical results. We have reviewed the statistical results reported in 186 articles published in four Spanish psychology journals. Of these articles, 102 contained at least one of the statistics selected for our study: Fisher's F, Student's t and Pearson's χ². Out of the 1,212 complete statistics reviewed, 12.2% presented a consistency error, meaning that the reported p-value did not correspond to the reported value of the statistic and its degrees of freedom. In 2.3% of the cases, the correct calculation would have led to a different conclusion than the one reported. In terms of articles, 48% included at least one consistency error, and 17.6% would have had to change at least one conclusion. In meta-analytical terms, with a focus on effect size, consistency errors can be considered substantial in 9.5% of the cases. These results imply a need to improve the quality and precision with which statistical results are reported in Spanish psychology journals.
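A consistency check of this kind recomputes the p-value from the reported statistic and degrees of freedom and compares it with the reported p-value. As an illustration for the Pearson χ² case only (the closed-form survival function below is exact solely for even degrees of freedom; the authors' actual recomputation procedure is not described here):

```python
import math

def chi2_sf_even_df(x, df):
    """Survival function P(X >= x) of a chi-square variable, exact for even
    df: p = exp(-x/2) * sum_{k=0}^{df/2 - 1} (x/2)^k / k!."""
    if df % 2 != 0:
        raise ValueError("closed form holds only for even df")
    half = x / 2.0
    term, total = 1.0, 1.0
    for k in range(1, df // 2):
        term *= half / k          # next Poisson-series term (x/2)^k / k!
        total += term
    return math.exp(-half) * total

def consistent(reported_p, stat, df, tol=0.0005):
    # Flag a consistency error when the recomputed p-value differs from
    # the reported one by more than a rounding tolerance.
    return abs(chi2_sf_even_df(stat, df) - reported_p) <= tol
```

For example, a reported "χ²(2) = 5.99, p = .05" passes the check, whereas the same statistic reported with p = .25 would be flagged.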
Healthy mom, healthy baby: pregnancy and prenatal issues.
Mack, L
2000-01-01
Information on how HIV-positive women can maintain a healthy pregnancy, presented at the 1999 Conference on Women and HIV, is summarized. Topics include receiving quality care, the safety of Emivirine, and a comparison of statistics on pregnancy outcomes in infected and non-infected women. A brief summary of maternal factors related to transmission is presented. The impact of recreational drug use, as well as domestic abuse on mother and fetus, are also discussed.
Bradshaw, Debbie; Groenewald, Pamela; Bourne, David E.; Mahomed, Hassan; Nojilana, Beatrice; Daniels, Johan; Nixon, Jo
2006-01-01
OBJECTIVE: To review the quality of the coding of the cause of death (COD) statistics and assess the mortality information needs of the City of Cape Town. METHODS: Using an action research approach, a study was set up to investigate the quality of COD information, the accuracy of COD coding and consistency of coding practices in the larger health subdistricts. Mortality information needs and the best way of presenting the statistics to assist health managers were explored. FINDINGS: Useful information was contained in 75% of death certificates, but nearly 60% had only a single cause certified; 55% of forms were coded accurately. Disagreement was mainly because routine coders coded the immediate instead of the underlying COD. An abridged classification of COD, based on causes of public health importance, prevalent causes and selected combinations of diseases was implemented with training on underlying cause. Analysis of the 2001 data identified the leading causes of death and premature mortality and illustrated striking differences in the disease burden and profile between health subdistricts. CONCLUSION: Action research is particularly useful for improving information systems and revealed the need to standardize the coding practice to identify underlying cause. The specificity of the full ICD classification is beyond the level of detail on the death certificates currently available. An abridged classification for coding provides a practical tool appropriate for local level public health surveillance. Attention to the presentation of COD statistics is important to enable the data to inform decision-makers. PMID:16583080
An Optical Flow-Based Full Reference Video Quality Assessment Algorithm.
K, Manasa; Channappayya, Sumohana S
2016-06-01
We present a simple yet effective optical flow-based full-reference video quality assessment (FR-VQA) algorithm for assessing the perceptual quality of natural videos. Our algorithm is based on the premise that local optical flow statistics are affected by distortions and that the deviation from pristine flow statistics is proportional to the amount of distortion. We characterize the local flow statistics using the mean, the standard deviation, the coefficient of variation (CV), and the minimum eigenvalue (λ_min) of the local flow patches. Temporal distortion is estimated as the change in the CV of the distorted flow with respect to the reference flow, and the correlation between λ_min of the reference and of the distorted patches. We rely on the robust multi-scale structural similarity index for spatial quality estimation. The computed temporal and spatial distortions are then pooled using a perceptually motivated heuristic to generate a spatio-temporal quality score. The proposed method is shown to be competitive with the state-of-the-art when evaluated on the LIVE SD database, the EPFL-PoliMI SD database, and the LIVE Mobile HD database. The distortions considered in these databases include those due to compression, packet loss, wireless channel errors, and rate adaptation. Our algorithm is flexible enough to allow for any robust FR spatial distortion metric for spatial distortion estimation. In addition, the proposed method is not only parameter-free but also independent of the choice of the optical flow algorithm. Finally, we show that replacing the optical flow vectors in our proposed method with much coarser block motion vectors also results in an acceptable FR-VQA algorithm. Our algorithm is called the flow similarity index.
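The local flow statistics can be sketched as follows. This summary does not specify the exact matrix whose minimum eigenvalue is used, so taking λ_min from the 2×2 covariance of the patch's flow vectors is an assumption of this sketch, as are the patch data themselves.

```python
import numpy as np

def flow_patch_stats(flow_patch):
    """flow_patch: (N, 2) array of (u, v) optical-flow vectors in a patch.
    Returns mean magnitude, std, coefficient of variation, and the minimum
    eigenvalue of the patch's 2x2 flow covariance matrix (assumed λ_min)."""
    mag = np.linalg.norm(flow_patch, axis=1)
    mu, sd = mag.mean(), mag.std()
    cv = sd / mu if mu > 0 else 0.0
    cov = np.cov(flow_patch.T)                 # 2x2 covariance of (u, v)
    lam_min = np.linalg.eigvalsh(cov).min()
    return mu, sd, cv, lam_min

# A constant-motion patch: no dispersion, so CV and λ_min are ~0.
uniform = np.tile([1.0, 0.5], (64, 1))
_, _, cv_u, lam_u = flow_patch_stats(uniform)

# A "distorted" patch with scattered flow raises both statistics,
# which is the deviation the temporal-distortion measure keys on.
rng = np.random.default_rng(1)
distorted = uniform + 0.3 * rng.standard_normal((64, 2))
_, _, cv_d, lam_d = flow_patch_stats(distorted)
```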
NASA Astrophysics Data System (ADS)
Lemaire, Vincent; Colette, Augustin; Menut, Laurent
2016-04-01
Because of its sensitivity to weather patterns, climate change will have an impact on air pollution, so that, in the future, a climate penalty could jeopardize the expected efficiency of air pollution mitigation measures. A common method to assess the impact of climate on air quality consists of implementing chemistry-transport models forced by climate projections. At present, however, such impact assessments lack multi-model ensemble approaches to address uncertainties because of the substantial computing cost. Therefore, as a preliminary step towards exploring large climate ensembles with air quality models, we developed an ensemble exploration technique in order to point out the climate models that should be investigated in priority. Using a training dataset from a deterministic projection of climate and air quality over Europe, we identified the main meteorological drivers of air quality for 8 regions in Europe and developed statistical models that could be used to estimate future air pollutant concentrations. Applying this statistical model to the whole EURO-CORDEX ensemble of climate projections, we find a climate penalty for six subregions out of eight (Eastern Europe, France, Iberian Peninsula, Mid Europe and Northern Italy). On the contrary, a climate benefit for PM2.5 was identified for three regions (Eastern Europe, Mid Europe and Northern Italy). The uncertainty of this statistical model, however, limits the confidence we can attribute to the associated quantitative projections. The technique does, however, allow selecting a subset of relevant regional climate model members that should be used in priority for future deterministic projections, in order to provide adequate coverage of the uncertainties. We are thereby proposing a smart ensemble exploration strategy that can also be used for impact studies beyond air quality.
Statistics Report on TEQSA Registered Higher Education Providers, 2016
ERIC Educational Resources Information Center
Australian Government Tertiary Education Quality and Standards Agency, 2016
2016-01-01
This Statistics Report is the third release of selected higher education sector data held by the Australian Government Tertiary Education Quality and Standards Agency (TEQSA) for its quality assurance activities. It provides a snapshot of national statistics on all parts of the sector by bringing together data collected directly by TEQSA with data…
Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.
Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias
2016-01-01
To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
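As a minimal illustration of the statistical process control side of this comparison, a Shewhart individuals (XmR) chart places control limits at the mean ± 2.66 times the average moving range and flags points outside them. The monthly indicator values below are hypothetical, not data from the cited evaluation.

```python
def individuals_chart(data):
    """Shewhart individuals (XmR) chart: center line at the mean,
    control limits at mean ± 2.66 * average moving range
    (2.66 = 3 / d2 with d2 = 1.128 for a moving range of 2)."""
    n = len(data)
    mean = sum(data) / n
    mr_bar = sum(abs(data[i] - data[i - 1]) for i in range(1, n)) / (n - 1)
    ucl = mean + 2.66 * mr_bar
    lcl = mean - 2.66 * mr_bar
    signals = [i for i, x in enumerate(data) if x > ucl or x < lcl]
    return lcl, ucl, signals

# Hypothetical monthly indicator values; the final point represents a
# post-intervention shift the chart should flag as a special cause.
values = [50, 52, 49, 51, 50, 48, 52, 51, 49, 50, 72]
lcl, ucl, out = individuals_chart(values)
```

Interrupted time series with segmented regression would instead fit separate level and slope terms before and after the intervention date, which is the methodological contrast the article examines.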
Clustering and Flow Conservation Monitoring Tool for Software Defined Networks
Puente Fernández, Jesús Antonio
2018-01-01
Prediction systems face challenges on two fronts: on the one hand, the relation between video quality and observed session features; on the other, dynamic changes in video quality. Software-Defined Networking (SDN) is a new concept of network architecture that provides the separation of the control plane (controller) and data plane (switches) in network devices. Thanks to the southbound interface, it is possible to deploy monitoring tools that obtain the network status and retrieve a collection of statistics. Achieving the most accurate statistics therefore depends on the strategy for monitoring and requesting information from network devices. In this paper, we propose an enhanced algorithm for requesting statistics to measure traffic flow in SDN networks. The algorithm groups network switches into clusters according to their number of ports in order to apply different monitoring techniques; the grouping avoids monitoring queries to network switches with common characteristics and thereby omits redundant information. In this way, the present proposal decreases the number of monitoring queries to switches, improving network traffic and preventing switch overload. We have tested our optimization in a video streaming simulation using different types of videos. The experiments and a comparison with traditional monitoring techniques demonstrate the feasibility of our proposal, which maintains similar values while decreasing the number of queries to the switches. PMID:29614049
Alikari, Victoria; Sachlas, Athanasios; Giatrakou, Stavroula; Stathoulis, John; Fradelos, Evagelos; Theofilou, Paraskevi; Lavdaniti, Maria; Zyga, Sofia
2017-01-01
An important factor which influences the quality of life of patients with arthritis is the fatigue they experience. The purpose of this study was to assess the relationship between fatigue and quality of life among patients with osteoarthritis and rheumatoid arthritis. Between January 2015 and March 2015, 179 patients with osteoarthritis and rheumatoid arthritis completed the Fatigue Assessment Scale and the Missoula-VITAS Quality of Life Index-15 (MVQoLI-15). The study was conducted in rehabilitation centers located in the area of the Peloponnese, Greece. Data related to sociodemographic characteristics and individual medical histories were recorded. Statistical analysis was performed using IBM SPSS Statistics version 19. The analysis did not reveal a statistically significant correlation between fatigue and quality of life, either in the total sample or among patients with osteoarthritis (r = -0.159; p = 0.126) or rheumatoid arthritis. However, there was a statistically significant relationship between some aspects of fatigue and dimensions of quality of life. Osteoarthritis patients had a statistically significantly lower MVQoLI-15 score than rheumatoid arthritis patients (13.73 ± 1.811 vs 14.61 ± 1.734) and a lower FAS score than rheumatoid patients (26.14 ± 3.668 vs 29.94 ± 3.377) (p-value < 0.001). The finding that different aspects of fatigue may affect dimensions of quality of life may help health care professionals by suggesting early treatment of fatigue in order to gain benefits for quality of life.
[Quality of service provided to heart surgery patients of the Unified Health System-SUS].
Borges, Juliana Bassalobre Carvalho; Carvalho, Sebastião Marcos Ribeiro de; Silva, Marcos Augusto de Moraes
2010-01-01
To evaluate the service quality provided to heart surgery patients during their hospital stay, identifying the patients' expectations and perceptions, and to associate service quality with gender, age and the use of extracorporeal circulation (ECC). We studied 82 elective heart surgery patients (52.4% females and 47.6% males), operated on through midsternal thoracotomy, aged 31 to 83 years (60.4 +/- 13.2 years), from March to September 2006. Service quality was evaluated at two points: expectations at the preoperative stage and perceptions of the service received on the 6th postoperative day, through application of the modified SERVQUAL scale (SERVQUAL-Card). The result was obtained from the difference between the sum of the perception scores and that of the expectation scores, followed by statistical analysis. The SERVQUAL-Card scale was statistically validated, showing an adequate level of internal consistency. We found a higher frequency of myocardial revascularization, 55 (67.0%); first heart surgery, 72 (87.8%); and use of ECC, 69 (84.1%). We observed high mean values for expectations and perceptions, with significant results (P<0.05), and a significant relationship between service quality and gender, in empathy (P=0.04), and age, in reliability (P=0.02). There was no significant association between ECC and service quality. Service quality was satisfactory. Patients demonstrated high expectations of hospital medical service. Women presented a higher perception of quality in empathy, and younger patients in reliability. The use of ECC was not related to service quality in this sample. The data obtained in this study suggest that the quality of this health service can be monitored through periodic application of the SERVQUAL scale.
Quality of life in patients with diabetic foot ulcer in Visegrad countries.
Nemcová, Jana; Hlinková, Edita; Farský, Ivan; Žiaková, Katarína; Jarošová, Darja; Zeleníková, Renáta; Bužgová, Radka; Janíková, Eva; Zdzieblo, Kazimiera; Wiraszka, Grazyna; Stepien, Renata; Nowak-Starz, Grazyna; Csernus, Mariann; Balogh, Zoltan
2017-05-01
To identify the quality of life of patients with diabetic foot ulcers in the Visegrad countries. Diabetics with foot ulcers are principally evaluated on the basis of physical parameters, but this does not always reveal much about the patient's experience of life with ulceration. A cross-sectional study was conducted. The standardised generic questionnaire World Health Organization Quality of Life-BREF was used. The sample was made up of 525 participants, and the calculations were performed using the IBM SPSS statistical program. Significant negative correlations between demographic data such as age, duration of diabetes mellitus and duration of diabetic ulceration treatment, and a lower level of quality of life were found across the sample. Statistically significant differences in quality of life according to clinical characteristics such as Wagner classification, frequency of foot ulcers, presence of peripheral vascular disease, and pain were also revealed. Significant differences in quality of life among the Visegrad countries were revealed: Hungary's participants had a worse quality of life than others, while Slovak participants expressed lower satisfaction with their health than Czech participants. Socio-demographic factors and clinical characteristics influence the quality of life of patients with diabetic foot ulcer. Significant differences between patients of the Visegrad countries were found in all domains of quality of life: physical, psychological, social and environmental. The quality of life of patients with diabetic foot ulcer reflects the conditions and healthcare system in each of the Visegrad countries. We have to respect socio-demographic factors and clinical characteristics in nursing care. This could have an impact on managing patient care not only with regard to their diabetic foot ulcer but also with regard to the patient as a personality with their own problems in relation to physical, psychosocial and environmental conditions. © 2016 John Wiley & Sons Ltd.
Mast, M. Alisa
2007-01-01
This report summarizes historical water-quality data for six National Park units that compose the Rocky Mountain Network. The park units in Colorado are Florissant Fossil Beds National Monument, Great Sand Dunes National Park and Preserve, and Rocky Mountain National Park; and in Montana, they are Glacier National Park, Grant-Kohrs Ranch National Historic Site, and Little Bighorn Battlefield National Monument. This study was conducted in cooperation with the Inventory and Monitoring Program of the National Park Service to aid in the design of an effective and efficient water-quality monitoring plan for each park. Data were retrieved from a number of sources for the period of record through 2004 and compiled into a relational database. Descriptions of the environmental setting of each park and an overview of the park's water resources are presented. Statistical summaries of water-quality constituents are presented and compared to aquatic-life and drinking-water standards. Spatial, seasonal, and temporal patterns in constituent concentrations also are described and suggestions for future water-quality monitoring are provided.
Quality of life of victims of traumatic brain injury six months after the trauma.
Vieira, Rita de Cássia Almeida; Hora, Edilene Curvelo; de Oliveira, Daniel Vieira de; Ribeiro, Maria do Carmo de Oliveira; de Sousa, Regina Márcia Cardoso
2013-01-01
To describe the quality of life of victims of traumatic brain injury six months after the event and to show the relationship between the results observed and the clinical, sociodemographic and return-to-productivity data. Data were analyzed from 47 victims assisted in a trauma reference hospital in the municipality of Aracaju and monitored in an outpatient neurosurgery clinic. The data were obtained through analysis of the patient records and structured interviews, with application of the World Health Organization Quality of Life questionnaire, brief version. The victims presented positive perceptions of their quality of life, and the physical domain presented the highest mean value (68.4±22.9). Among the sociodemographic characteristics, a statistically significant correlation was found between marital status and the psychological domain, whereas return to productivity was related to all the domains. Return to productivity was an important factor in the quality of life of victims of traumatic brain injury and should direct public policies in promoting the health of these victims.
Balneotherapy for osteoarthritis. A cochrane review.
Verhagen, Arianne; Bierma-Zeinstra, Sita; Lambeck, Johan; Cardoso, Jefferson Rosa; de Bie, Rob; Boers, Maarten; de Vet, Henrica C W
2008-06-01
Balneotherapy (or spa therapy, mineral baths) for patients with arthritis is one of the oldest forms of therapy. We assessed the effectiveness of balneotherapy for patients with osteoarthritis (OA). We performed a broad search strategy to retrieve eligible studies, selecting randomized controlled trials comparing balneotherapy with any intervention or with no intervention. Two authors independently assessed quality and extracted data. Disagreements were resolved by consensus. In the event of clinical heterogeneity or lack of data we refrained from statistical pooling. Seven trials (498 patients) were included in this review: one performed an intention-to-treat analysis, two provided data for our own analysis, and one reported a "quality of life" outcome. We found silver-level evidence for mineral baths compared to no treatment (effect sizes 0.34-1.82). Adverse events were not measured or found in the included trials. We found silver-level evidence concerning the beneficial effects of mineral baths compared to no treatment. For all other balneological treatments, no clear effects were found. However, the scientific evidence is weak because of the poor methodological quality of the studies and the absence of an adequate statistical analysis and data presentation.
Analysis of the sleep quality of elderly people using biomedical signals.
Moreno-Alsasua, L; Garcia-Zapirain, B; Mendez-Zorrilla, A
2015-01-01
This paper presents a technical solution that analyses sleep signals captured by biomedical sensors to find possible disorders during rest. Specifically, the method evaluates electrooculogram (EOG) signals, skin conductance (GSR), air flow (AS), and body temperature. Next, a quantitative sleep quality analysis determines significant changes in the biological signals, and any similarities between them in a given time period. Filtering techniques such as the Fourier transform method and IIR filters process the signal and identify significant variations. Once these changes have been identified, all significant data are compared and a quantitative and statistical analysis is carried out to determine the level of a person's rest. To evaluate the correlation and significant differences, a statistical analysis was calculated showing correlation between the EOG and AS signals (p=0.005), the EOG and GSR signals (p=0.037) and, finally, the EOG and body temperature (p=0.04). Doctors could use this information to monitor changes within a patient.
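As a toy illustration of the kind of correlation analysis described above, the sketch below computes Pearson's r between two synthetic channels before and after low-pass smoothing. The moving average is a crude stand-in for the paper's IIR filtering, and the channel names are hypothetical.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length signals."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(xc @ yc / np.sqrt((xc @ xc) * (yc @ yc)))

def moving_average(sig, k):
    """Crude low-pass smoothing, standing in for the paper's IIR filtering."""
    return np.convolve(sig, np.ones(k) / k, mode="valid")

# Two hypothetical channels sharing a slow component plus independent noise
t = np.linspace(0.0, 10.0, 1000)
rng = np.random.default_rng(1)
eog = np.sin(t) + rng.normal(0.0, 0.5, t.size)
gsr = np.sin(t) + rng.normal(0.0, 0.5, t.size)
r_raw = pearson_r(eog, gsr)
r_smooth = pearson_r(moving_average(eog, 50), moving_average(gsr, 50))
print(r_smooth > r_raw)  # smoothing exposes the shared slow component: True
```

Filtering before correlating matters because independent sensor noise dilutes r; removing it reveals the physiologically shared component.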
Estimation of descriptive statistics for multiply censored water quality data
Helsel, Dennis R.; Cohn, Timothy A.
1988-01-01
This paper extends the work of Gilliom and Helsel (1986) on procedures for estimating descriptive statistics of water quality data that contain “less than” observations. Previously, procedures were evaluated when only one detection limit was present. Here we investigate the performance of estimators for data that have multiple detection limits. Probability plotting and maximum likelihood methods perform substantially better than the simple substitution procedures now commonly in use. Therefore simple substitution procedures (e.g., substitution of the detection limit) should be avoided. Probability plotting methods are more robust than maximum likelihood methods to misspecification of the parent distribution, and their use should be encouraged in the typical situation where the parent distribution is unknown. When utilized correctly, “less than” values frequently contain nearly as much information for estimating population moments and quantiles as would the same observations had the detection limit been below them.
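The bias of simple substitution is easy to demonstrate with synthetic data. The sketch below (not the paper's probability-plotting or maximum likelihood estimators) censors a lognormal sample at a hypothetical detection limit and shows that substituting zero or the detection limit biases the estimated mean low or high, respectively.

```python
import numpy as np

# Synthetic "true" concentrations: lognormal, a common model for water quality
rng = np.random.default_rng(42)
sample = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)
dl = 1.0                      # hypothetical detection limit (~50% censored)
detected = sample[sample >= dl]
n_censored = (sample < dl).sum()

def substituted_mean(fraction):
    """Mean estimate after replacing each censored value with fraction * dl."""
    return (detected.sum() + n_censored * fraction * dl) / sample.size

# Substituting zero biases the mean low; substituting the DL biases it high
print(substituted_mean(0.0) < sample.mean() < substituted_mean(1.0))  # True
```

Probability-plotting (regression on order statistics) instead fits a distribution to the detected values and imputes the censored portion from that fit, which is why it outperforms substitution.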
Optimal experimental designs for fMRI when the model matrix is uncertain.
Kao, Ming-Hung; Zhou, Lin
2017-07-15
This study concerns optimal designs for functional magnetic resonance imaging (fMRI) experiments when the model matrix of the statistical model depends on both the selected stimulus sequence (fMRI design) and the subject's uncertain feedback (e.g., answer) to each mental stimulus (e.g., question) presented to her/him. While practically important, this design issue is challenging, mainly because the information matrix cannot be fully determined at the design stage, making it difficult to evaluate the quality of the selected designs. To tackle this challenging issue, we propose an easy-to-use optimality criterion for evaluating the quality of designs, and an efficient approach for obtaining designs optimizing this criterion. Compared with a previously proposed method, our approach requires much less computing time to achieve designs with high statistical efficiencies. Copyright © 2017 Elsevier Inc. All rights reserved.
The difficulty in assessing errors in numerical models of air quality is a major obstacle to improving their ability to predict and retrospectively map air quality. In this paper, using simulation outputs from the Community Multi-scale Air Quality Model (CMAQ), the statistic...
Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.
ERIC Educational Resources Information Center
Dunlap, Dale
This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…
Statistical process control methods allow the analysis and improvement of anesthesia care.
Fasting, Sigurd; Gisvold, Sven E
2003-10-01
Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
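The p-chart limits used in this kind of analysis follow directly from the binomial standard error. A minimal sketch with hypothetical subgroup sizes, using the article's 18.3% overall event rate as the centre line:

```python
import numpy as np

def p_chart_limits(p_bar, n):
    """3-sigma control limits for a p-chart monitoring an event proportion.

    p_bar: centre line (overall adverse-event rate); n: subgroup size
    (e.g. anesthetics per month). Limits are clipped to [0, 1].
    """
    sigma = np.sqrt(p_bar * (1.0 - p_bar) / n)
    return max(0.0, p_bar - 3.0 * sigma), min(1.0, p_bar + 3.0 * sigma)

# Hypothetical monthly subgroups of 1000 anesthetics, centre line 18.3%
lcl, ucl = p_chart_limits(0.183, 1000)
print(f"LCL={lcl:.3f} UCL={ucl:.3f}")  # LCL=0.146 UCL=0.220
```

A monthly rate outside these limits signals special-cause variation worth investigating; rates inside them represent the predictable normal variation the authors describe. Very low event rates (as with the medication errors) make the normal approximation behind these limits unreliable, which is why the methodology was not suited there.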
Leadership in nursing and patient satisfaction in hospital context.
Nunes, Elisabete Maria Garcia Teles; Gaspar, Maria Filomena Mendes
2016-06-01
Objectives: to know the quality of the leadership relationship from the perspectives of chief nurses and nurses, patient satisfaction, and the relationship between the quality of the relationship perceived by both and patient satisfaction. Methods: a quantitative, cross-sectional and correlational approach. A non-probabilistic convenience sample consisted of 15 chief nurses, 342 nurses and 273 patients. Data were collected at the Central Lisbon Hospital Center, between January and March 2013, through the LMX-7, CLMX-7 and SUCEH21 scales. Statistical analysis was performed with SPSS® Statistics 19. Results: chief nurses consider the quality of the leadership relationship good, nurses consider it satisfactory, and patients are satisfied with nursing care. There is a statistically significant correlation between the quality of the leadership relationship from the perspective of chief nurses and patient satisfaction, but no statistically significant correlation between the quality of the leadership relationship from the nurses' perspective and satisfaction. Conclusion: the chief nurse has a major role in patient satisfaction.
ERIC Educational Resources Information Center
Rojas, Mariano
2011-01-01
In 2009 the Stiglitz Commission presented its report on the Measurement of Progress in Societies. The report was commissioned by President Sarkozy of France in 2008. Among its members, the Commission had five Nobel laureates. The report emphasizes three areas which require further attention by statistical offices and policy makers: A better…
Climate change presents increased potential for very large fires in the contiguous United States
R. Barbero; J. T. Abatzoglou; Sim Larkin; C. A. Kolden; B. Stocks
2015-01-01
Very large fires (VLFs) have important implications for communities, ecosystems, air quality and fire suppression expenditures. VLFs over the contiguous US have been strongly linked with meteorological and climatological variability. Building on prior modelling of VLFs (>5000 ha), an ensemble of 17 global climate models were statistically downscaled over the US...
Comparison of de novo assembly statistics of Cucumis sativus L.
NASA Astrophysics Data System (ADS)
Wojcieszek, Michał; Kuśmirek, Wiktor; Pawełkowicz, Magdalena; Pląder, Wojciech; Nowak, Robert M.
2017-08-01
Genome sequencing is the core of genomic research. With the development of NGS and the falling cost of sequencing, another bottleneck has emerged: genome assembly. Developing the proper tool for this task is essential, as the quality of the genome has an important impact on further research. Here we present a comparison of several de Bruijn graph assemblers tested on C. sativus genomic reads. The assessment shows that the newly developed software, dnaasm, provides better results in terms of quantity and quality: the number of generated sequences is lower by 5-33%, with up to twofold higher N50. Quality checks showed that dnaasm generated reliable results. This provides a very strong basis for future genomic analysis.
Gatti, Marco; Marchisio, Filippo; Fronda, Marco; Rampado, Osvaldo; Faletti, Riccardo; Bergamasco, Laura; Ropolo, Roberto; Fonio, Paolo
The aim of this study was to evaluate the impact on dose reduction and image quality of the new iterative reconstruction technique: adaptive statistical iterative reconstruction (ASIR-V). Fifty consecutive oncologic patients acted as case controls, undergoing during their follow-up computed tomography scans with both ASIR and ASIR-V. Each study was analyzed in a double-blinded fashion by 2 radiologists. Both quantitative and qualitative analyses of image quality were conducted. Computed tomography scanner radiation output was 38% (29%-45%) lower (P < 0.0001) for the ASIR-V examinations than for the ASIR ones. The quantitative image noise was significantly lower (P < 0.0001) for ASIR-V. ASIR-V had a higher performance for subjective image noise (P = 0.01 for 5 mm and P = 0.009 for 1.25 mm), while the other parameters (image sharpness, diagnostic acceptability, and overall image quality) were similar (P > 0.05). ASIR-V is a new iterative reconstruction technique that has the potential to provide image quality equal to or greater than ASIR, with a dose reduction of around 40%.
Sarkar, Sumona; Lund, Steven P; Vyzasatya, Ravi; Vanguri, Padmavathy; Elliott, John T; Plant, Anne L; Lin-Gibson, Sheng
2017-12-01
Cell counting measurements are critical in the research, development and manufacturing of cell-based products, yet determining cell quantity with accuracy and precision remains a challenge. Validating and evaluating a cell counting measurement process can be difficult because of the lack of appropriate reference material. Here we describe an experimental design and statistical analysis approach to evaluate the quality of a cell counting measurement process in the absence of appropriate reference materials or reference methods. The experimental design is based on a dilution series study with replicate samples and observations as well as measurement process controls. The statistical analysis evaluates the precision and proportionality of the cell counting measurement process and can be used to compare the quality of two or more counting methods. As an illustration of this approach, cell counting measurement processes (automated and manual methods) were compared for a human mesenchymal stromal cell (hMSC) preparation. For the hMSC preparation investigated, results indicated that the automated method performed better than the manual counting methods in terms of precision and proportionality. By conducting well controlled dilution series experimental designs coupled with appropriate statistical analysis, quantitative indicators of repeatability and proportionality can be calculated to provide an assessment of cell counting measurement quality. This approach does not rely on the use of a reference material or comparison to "gold standard" methods known to have limited assurance of accuracy and precision. The approach presented here may help the selection, optimization, and/or validation of a cell counting measurement process. Published by Elsevier Inc.
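One way to quantify proportionality in a dilution-series design is a zero-intercept fit of counts against dilution fraction. The sketch below, with hypothetical replicate counts, is one simple version of such an analysis, not the exact statistical method of the paper:

```python
import numpy as np

def proportionality_fit(fractions, counts):
    """Zero-intercept least-squares fit counts = slope * fraction.

    Returns the slope and the R^2 of the proportional model; R^2 close
    to 1 indicates counts scale linearly through the origin with dilution.
    """
    x = np.asarray(fractions, float)
    y = np.asarray(counts, float)
    slope = (x @ y) / (x @ x)
    ss_res = ((y - slope * x) ** 2).sum()
    ss_tot = ((y - y.mean()) ** 2).sum()
    return slope, 1.0 - ss_res / ss_tot

# Hypothetical triplicate counts at four dilution fractions
fractions = np.repeat([1.0, 0.75, 0.5, 0.25], 3)
counts = [98, 102, 100, 76, 74, 75, 51, 49, 50, 26, 24, 25]
slope, r2 = proportionality_fit(fractions, counts)
print(f"slope={slope:.0f} R2={r2:.3f}")  # slope=100 R2=0.999
```

Comparing slopes and residual spread between two counting methods on the same dilution series gives the kind of precision-and-proportionality comparison the authors describe, without needing a reference material.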
Härkönen, Kati; Kivekäs, Ilkka; Kotti, Voitto; Sivonen, Ville; Vasama, Juha-Pekka
2017-10-01
The objective of the present study is to evaluate the effect of hybrid cochlear implantation (hCI) on quality of life (QoL), quality of hearing (QoH), and working performance in adult patients, and to compare the long-term results of patients with hCI to those of patients with conventional unilateral cochlear implantation (CI), bilateral CI, and single-sided deafness (SSD) with CI. Sound localization accuracy and speech-in-noise test were also compared between these groups. Eight patients with high-frequency sensorineural hearing loss of unknown etiology were selected in the study. Patients with hCI had better long-term speech perception in noise than uni- or bilateral CI patients, but the difference was not statistically significant. The sound localization accuracy was equal in the hCI, bilateral CI, and SSD patients. QoH was statistically significantly better in bilateral CI patients than in the others. In hCI patients, residual hearing was preserved in all patients after the surgery. During the 3.6-year follow-up, the mean hearing threshold at 125-500 Hz decreased on average by 15 dB HL in the implanted ear. QoL and working performance improved significantly in all CI patients. Hearing outcomes with hCI are comparable to the results of bilateral CI or CI with SSD, but hearing in noise and sound localization are statistically significantly better than with unilateral CI. Interestingly, the impact of CI on QoL, QoH, and working performance was similar in all groups.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plemons, R.E.; Hopwood, W.H. Jr.; Hamilton, J.H.
For a number of years the Oak Ridge Y-12 Plant Laboratory has been analyzing coal, predominately for the utilities department of the Y-12 Plant. All laboratory procedures, except a Leco sulfur method which used the Leco Instruction Manual as a reference, were written based on the ASTM coal analyses. Sulfur is analyzed at the present time by two methods, gravimetric and Leco. The laboratory has two major endeavors for monitoring the quality of its coal analyses: (1) a control program run by the Plant Statistical Quality Control Department, in which Quality Control submits one sample for every nine samples submitted by the utilities departments and the laboratory analyzes a control sample along with the utilities samples; and (2) an exchange program with the DOE Coal Analysis Laboratory in Bruceton, Pennsylvania, in which the Y-12 Laboratory submits to the DOE Coal Laboratory, on even-numbered months, a sample that Y-12 has analyzed, and the DOE Coal Laboratory submits, on odd-numbered months, one of their analyzed samples to the Y-12 Plant Laboratory to be analyzed. The results of these control and exchange programs are monitored not only by laboratory personnel, but also by Statistical Quality Control personnel who provide statistical evaluations. After analysis and reporting of results, all utilities samples are retained by the laboratory until the coal contracts have been settled. The utilities departments have responsibility for the initiation and preparation of the coal samples. The samples normally received by the laboratory have been ground to 4-mesh, reduced to 0.5-gallon quantities, and sealed in air-tight containers. Sample identification numbers and a Request for Analysis are generated by the utilities departments.
Validation of the H-SAF precipitation product H03 over Greece using rain gauge data
NASA Astrophysics Data System (ADS)
Feidas, H.; Porcu, F.; Puca, S.; Rinollo, A.; Lagouvardos, C.; Kotroni, V.
2018-01-01
This paper presents an extensive validation of the combined infrared/microwave H-SAF (EUMETSAT Satellite Application Facility on Support to Operational Hydrology and Water Management) precipitation product H03, for a 1-year period, using gauge observations from a relatively dense network of 233 stations over Greece. First, the quality of the interpolated data used to validate the precipitation product is assessed, and a quality index is constructed based on parameters such as the density of the station network and the orography. Then, a validation analysis is conducted based on comparisons of satellite (H03) with interpolated rain gauge data to produce continuous and multi-categorical statistics at monthly and annual timescales, taking into account the different geophysical characteristics of the terrain (land, coast, sea, elevation). Finally, the impact of the quality of the interpolated data on the validation statistics is examined in terms of different configurations of the interpolation model and the rain gauge network characteristics used in the interpolation. The possibility of using a quality index of the interpolated data as a filter in the validation procedure is also investigated. The continuous validation statistics show yearly root mean squared error (RMSE) and mean absolute error (MAE) corresponding to 225% and 105% of the mean rain rate, respectively. Mean error (ME) indicates a slight overall tendency for underestimation of the rain gauge rates, which becomes large for high rain rates. In general, the H03 algorithm cannot retrieve very well the light (<1 mm/h) and the convective-type (>10 mm/h) precipitation. The poor correlation between satellite and gauge data points to algorithm problems in co-locating precipitation patterns. Seasonal comparison shows that retrieval errors are lower in the cold months than in the summer months.
The multi-categorical statistics indicate that the H03 algorithm is able to discriminate efficiently the rain from the no rain events although a large number of rain events are missed. The most prominent feature is the very high false alarm ratio (FAR) (more than 70 %), the relatively low probability of detection (POD) (less than 40 %), and the overestimation of the rainy pixels. Although the different geophysical features of the terrain (land, coast, sea, elevation) and the quality of the interpolated data have an effect on the validation statistics, this, in general, is not significant and seems to be more distinct in the categorical than in the continuous statistics.
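The categorical and continuous scores quoted above are standard verification statistics. A sketch of their definitions, with hypothetical counts on the same order as the article's results:

```python
import numpy as np

def categorical_scores(hits, false_alarms, misses):
    """POD and FAR from a satellite-vs-gauge rain/no-rain contingency table."""
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    return pod, far

def continuous_scores(sat, gauge):
    """Mean error, mean absolute error and RMSE of satellite rain rates."""
    d = np.asarray(sat, float) - np.asarray(gauge, float)
    return d.mean(), np.abs(d).mean(), float(np.sqrt((d ** 2).mean()))

# Hypothetical counts consistent with POD < 40% and FAR > 70%
pod, far = categorical_scores(hits=400, false_alarms=1000, misses=650)
print(f"POD={pod:.2f} FAR={far:.2f}")  # POD=0.38 FAR=0.71
me, mae, rmse = continuous_scores([0.0, 3.0, 1.0], [1.0, 1.0, 1.0])
print(f"ME={me:.2f} MAE={mae:.2f} RMSE={rmse:.2f}")  # ME=0.33 MAE=1.00 RMSE=1.29
```

A positive ME indicates satellite overestimation and a negative ME underestimation; RMSE exceeding MAE, as in the article, reflects a heavy-tailed error distribution dominated by a few large misses.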
NASA Astrophysics Data System (ADS)
Kulkarni, R. D.; Agarwal, Vivek
2008-08-01
An ion chamber amplifier (ICA) is used as a safety device for neutronic power (flux) measurement in regulation and protection systems of nuclear reactors. Therefore, performance reliability of an ICA is an important issue. Appropriate quality engineering is essential to achieve a robust design and performance of the ICA circuit. It is observed that the low input bias current operational amplifiers used in the input stage of the ICA circuit are the most critical devices for proper functioning of the ICA. They are very sensitive to the gamma radiation present in their close vicinity. Therefore, the response of the ICA deteriorates with exposure to gamma radiation resulting in a decrease in the overall reliability, unless desired performance is ensured under all conditions. This paper presents a performance enhancement scheme for an ICA operated in the nuclear environment. The Taguchi method, which is a proven technique for reliability enhancement, has been used in this work. It is demonstrated that if a statistical, optimal design approach, like the Taguchi method is used, the cost of high quality and reliability may be brought down drastically. The complete methodology and statistical calculations involved are presented, as are the experimental and simulation results to arrive at a robust design of the ICA.
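The Taguchi method summarizes robustness with a signal-to-noise ratio computed over noise conditions. The sketch below uses the nominal-the-best form with hypothetical ICA readings; the article does not specify which S/N form was used.

```python
import numpy as np

def sn_nominal_the_best(y):
    """Taguchi nominal-the-best S/N ratio, 10*log10(mean^2 / variance), in dB.

    Higher values mean the response stays close to its level relative to its
    spread across noise (here, gamma-radiation) conditions.
    """
    y = np.asarray(y, float)
    return float(10.0 * np.log10(y.mean() ** 2 / y.var(ddof=1)))

# Hypothetical ICA outputs (volts) for two factor settings across noise trials
setting_a = [5.02, 4.98, 5.01, 4.99]   # tight spread -> more robust
setting_b = [5.30, 4.60, 5.20, 4.80]   # wide spread  -> less robust
print(sn_nominal_the_best(setting_a) > sn_nominal_the_best(setting_b))  # True
```

In a Taguchi design, each row of the orthogonal array of factor settings gets such an S/N value, and the levels maximizing it are chosen as the robust design point.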
Development of an integrated data base for land use and water quality planning
NASA Technical Reports Server (NTRS)
Adams, J.; Vanschayk, C.; Istvan, L. B.
1977-01-01
To help understand the role played by different land resources in water quality management, a computer-based data system was created. The Land Resource Information System (LRIS) allows data to be readily retrieved or statistically analyzed for a variety of purposes. It is specifically formatted to coordinate water quality data with other land-resource data. New understanding of the region gained through the use of LRIS has gone well beyond the initial purpose of assessing water quality conditions. The land use and natural features information has provided a well-defined starting point for a systematic evaluation of proposed land uses, transportation, housing, and other public investments. It has laid the foundation for a comprehensive and integrated approach to many different planning and investment programs presently underway.
Lee, Jacky W Y; Chan, Catherine W S; Chan, Jonathan C H; Li, Q; Lai, Jimmy S M
2014-08-01
OBJECTIVE. To investigate the association between clinical measurements and glaucoma-specific quality of life in Chinese glaucoma patients. DESIGN. Cross-sectional study. SETTING. An academic hospital in Hong Kong. PATIENTS. A Chinese translation of the Glaucoma Quality of Life-15 questionnaire was completed by 51 consecutive patients with bilateral primary open-angle glaucoma. The binocular means of several clinical measurements were correlated with Glaucoma Quality of Life-15 findings using Pearson's correlation coefficient and linear regression. The measurements were the visual field index and pattern standard deviation from the Humphrey Field Analyzer, Snellen best-corrected visual acuity, presenting intra-ocular pressure, current intra-ocular pressure, average retinal nerve fibre layer thickness via optical coherence tomography, and the number of topical anti-glaucoma medications being used. RESULTS. In these patients, there was a significant correlation and linear relationship between a poorer Glaucoma Quality of Life-15 score and a lower visual field index (r=0.3, r(2)=0.1, P=0.01) and visual acuity (r=0.3, r(2)=0.1, P=0.03). A thinner retinal nerve fibre layer also correlated with a poorer Glaucoma Quality of Life-15 score, but did not attain statistical significance (r=0.3, P=0.07). There were no statistically significant correlations for the other clinical parameters with the Glaucoma Quality of Life-15 scores (all P values being >0.7). The three most problematic activities affecting quality of life were "adjusting to bright lights", "going from a light to a dark room or vice versa", and "seeing at night". CONCLUSION. For Chinese primary open-angle glaucoma patients, binocular visual field index and visual acuity correlated linearly with glaucoma-specific quality of life, and activities involving dark adaptation were the most problematic.
Quality of Life of Patients with Oral Cavity Cancer.
Dzebo, Senada; Mahmutovic, Jasmina; Erkocevic, Hasiba
2017-03-01
In recent years, the quality of life of patients has become very important in monitoring the success of treatment and therapeutic procedures. It has become a significant factor in assessing the accomplishment of a therapeutic procedure, and for the first time the patient alone can assess the success of the respective therapy. Cancer of the oral cavity is one of the most common cancers of the head and neck, and one of the ten most common causes of death in the world. In the majority of cases, cancer of the oral cavity is detected at an advanced stage, when therapeutic options are reduced and the prognosis is much worse. Cancer of the oral cavity is 10 times more common in men. Assessment of quality of life should be an indicator of the success of multidisciplinary treatment, and it should point to areas in which the affected person requires support. The aim was to examine the quality of life of patients with oral cavity cancer. The study was conducted at the Clinic of Maxillofacial Surgery of the Clinical Center University of Sarajevo (CCUS), through a survey of patients with verified oral cavity cancer, a questionnaire on the socio-demographic characteristics of the patients, and the University of Washington Quality of Life Questionnaire (UW-QOL). The results were entered into a database and statistically processed in SPSS, version 19.0 for Windows. The results were then thoroughly analyzed, documented, and presented in absolute numbers and statistical values, using statistical indicators in simple and understandable tables and figures. The study results showed that, out of a total score of 100, the median quality-of-life value for patients with oral cavity cancer was M = 69.75 ± 29.12 for the physical-health component and M = 65.11 ± 27.47 for social-emotional health. This can be considered a satisfactory quality of life, above the midpoint of the rating scale, although both values deviate significantly from the UW-QOL scale norm.
Physical and socio-emotional health components are strongly positively correlated (R² = 0.750, p = 0.0001).
Management of defects on lower extremities with the use of matriderm and skin graft.
Choi, Jun-Young; Kim, Seong-Hun; Oh, Gwang-Jin; Roh, Si-Gyun; Lee, Nae-Ho; Yang, Kyung-Moo
2014-07-01
The reconstruction of large skin and soft tissue defects on the lower extremities is challenging. The skin graft is a simple and frequently used method for covering a skin defect. However, poor skin quality and architecture are well-known problems that lead to scar contracture. The collagen-elastin matrix, Matriderm, has been used to improve the quality of skin grafts; however, no statistical and objective review of the results has been reported. Thirty-four patients (23 male and 11 female) who previously received a skin graft and simultaneous application of Matriderm between January 2010 and June 2012 were included in this study. The quality of the skin graft was evaluated using Cutometer, occasionally accompanied by pathologic findings. All 34 patients showed good skin quality compared to a traditional skin graft and were satisfied with their results. The statistical data for the measurement of the mechanical properties of the skin were similar to those for normal skin. In addition, there was no change in the engraftment rate. The biggest problem of a traditional skin graft is scar contracture. However, the dermal matrix presents an improvement in skin quality with elastin and collagen. Therefore, a skin graft along with a simultaneous application of Matriderm is safe and effective and leads to a significantly better outcome from the perspective of skin elasticity.
Illinois' Forests, 2005: Statistics, Methods, and Quality Assurance
Susan J. Crocker; Charles J. Barnett; Mark A. Hatfield
2013-01-01
The first full annual inventory of Illinois' forests was completed in 2005. This report contains 1) descriptive information on methods, statistics, and quality assurance of data collection, 2) a glossary of terms, 3) tables that summarize quality assurance, and 4) a core set of tabular estimates for a variety of forest resources. A detailed analysis of inventory...
Simkó, Myrtill; Remondini, Daniel; Zeni, Olga; Scarfi, Maria Rosaria
2016-01-01
Possible hazardous effects of radiofrequency electromagnetic fields (RF-EMF) at low exposure levels are controversially discussed due to inconsistent study findings. Therefore, the main focus of the present study is to detect whether any statistical association exists between RF-EMF and cellular responses, considering cell proliferation and apoptosis endpoints separately and with both combined as a group of “cellular life” to increase the statistical power of the analysis. We searched the PubMed database for publications on RF-EMF in vitro studies for the period 1995–2014 and extracted data on the relevant parameters, such as cell culture type, frequency, exposure duration, SAR, and five exposure-related quality criteria. These parameters were used for an association study with the experimental outcome in terms of the defined endpoints. We identified 104 published articles, from which 483 different experiments were extracted and analyzed. Cellular responses after exposure to RF-EMF were significantly associated with cell lines rather than with primary cells. No other experimental parameter was significantly associated with cellular responses. A highly significant negative association between exposure-condition quality and cellular responses was detected, showing that the more the quality criteria requirements were satisfied, the smaller the number of detected cellular responses. To our knowledge, this is the first systematic analysis of specific RF-EMF bio-effects in association with exposure quality, highlighting the need for more stringent quality procedures for the exposure conditions. PMID:27420084
Feature maps driven no-reference image quality prediction of authentically distorted images
NASA Astrophysics Data System (ADS)
Ghadiyaram, Deepti; Bovik, Alan C.
2015-03-01
Current blind image quality prediction models rely on benchmark databases comprised of singly and synthetically distorted images, thereby learning image features that are only adequate to predict human perceived visual quality on such inauthentic distortions. However, real world images often contain complex mixtures of multiple distortions. Rather than a) discounting the effect of these mixtures of distortions on an image's perceptual quality and considering only the dominant distortion or b) using features that are only proven to be efficient for singly distorted images, we deeply study the natural scene statistics of authentically distorted images, in different color spaces and transform domains. We propose a feature-maps-driven statistical approach which avoids any latent assumptions about the type of distortion(s) contained in an image, and focuses instead on modeling the remarkable consistencies in the scene statistics of real world images in the absence of distortions. We design a deep belief network that takes model-based statistical image features derived from a very large database of authentically distorted images as input and discovers good feature representations by generalizing over different distortion types, mixtures, and severities, which are later used to learn a regressor for quality prediction. We demonstrate the remarkable competence of our features for improving automatic perceptual quality prediction on a benchmark database and on the newly designed LIVE Authentic Image Quality Challenge Database and show that our approach of combining robust statistical features and the deep belief network dramatically outperforms the state-of-the-art.
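The "remarkable consistencies in the scene statistics" that such no-reference models exploit are commonly captured by mean-subtracted, contrast-normalized (MSCN) coefficients. As a hedged illustration only, here is a minimal 1-D sliding-window version of that local normalization; the paper itself works with 2-D feature maps in multiple color spaces and transform domains.

```python
from statistics import mean, pstdev

def mscn_1d(signal, window=3, c=1.0):
    """Mean-subtracted, contrast-normalized coefficients over a sliding window.
    A 1-D toy analogue of the local normalization used in natural-scene-statistics
    quality models; c stabilizes the division in flat regions."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        patch = signal[lo:hi]
        mu, sigma = mean(patch), pstdev(patch)
        out.append((signal[i] - mu) / (sigma + c))
    return out

# The outlier at index 3 stands out after normalization.
coeffs = mscn_1d([10, 12, 11, 50, 13, 12, 11])
```

For pristine natural images, the distribution of such coefficients is highly regular; distortions perturb it, which is what the learned features detect.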
Getting a scientific paper published in Epilepsia: an editor's perspective.
Schwartzkroin, Philip A
2013-11-01
Getting a paper published in Epilepsia depends first and foremost on the quality of the work reported, and on the clarity and convincingness of the presentation. Papers should focus on important and interesting topics with clearly stated objectives and goals. The observations and findings are of greatest interest when they are novel and change our views on the mechanisms and/or treatment of an epileptic disease. Studies should be carefully designed to include adequate sample size, comparison groups, and statistical analyses. Critically, the data must be clearly presented and appropriately interpreted. If followed, these recommendations will improve an author's chances of having his/her paper accepted in a high quality journal like Epilepsia. Wiley Periodicals, Inc. © 2013 International League Against Epilepsy.
Pitfalls in statistical landslide susceptibility modelling
NASA Astrophysics Data System (ADS)
Schröder, Boris; Vorpahl, Peter; Märker, Michael; Elsenbeer, Helmut
2010-05-01
The use of statistical methods is a well-established approach to predict landslide occurrence probabilities and to assess landslide susceptibility. This is achieved by applying statistical methods that relate historical landslide inventories to topographic indices as predictor variables. In our contribution, we compare several new and powerful methods developed in machine learning and well established in landscape ecology and macroecology for predicting the distribution of shallow landslides in tropical mountain rainforests in southern Ecuador (among others: boosted regression trees, multivariate adaptive regression splines, maximum entropy). Although these methods are powerful, we think it is necessary to follow a basic set of guidelines to avoid some pitfalls regarding data sampling, predictor selection, and model quality assessment, especially if a comparison of different models is contemplated. We therefore suggest applying a novel toolbox to evaluate approaches to the statistical modelling of landslide susceptibility. Additionally, we propose some methods to open the "black box" inherent in machine learning methods in order to achieve further explanatory insights into the preparatory factors that control landslides. Sampling of training data should be guided by hypotheses regarding the processes that lead to slope failure, taking into account their respective spatial scales. This approach leads to the selection of a set of candidate predictor variables considered on adequate spatial scales. This set should be checked for multicollinearity in order to facilitate interpretation of model response curves. Model quality assessment measures how well a model is able to reproduce independent observations of its response variable. This includes criteria to evaluate different aspects of model performance, i.e. model discrimination, model calibration, and model refinement.
In order to assess a possible violation of the assumption of independence in the training samples, or a possible lack of explanatory information in the chosen set of predictor variables, the model residuals need to be checked for spatial autocorrelation. Therefore, we calculate spline correlograms. In addition, we investigate partial dependence plots and bivariate interaction plots, considering possible interactions between predictors, to improve model interpretation. In presenting this toolbox for model quality assessment, we investigate the influence of strategies for constructing training datasets on the quality of statistical models.
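Of the model-performance criteria named above, discrimination is most often summarized by the area under the ROC curve (AUC). A minimal sketch via the rank-sum formulation; the susceptibility scores and slope-failure labels are hypothetical, not from the Ecuador study.

```python
def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) formulation:
    the probability that a randomly chosen positive case outranks a randomly
    chosen negative case, counting ties as one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical landslide susceptibility scores and observed failures (1/0).
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,   0,   0]
```

An AUC of 0.5 means no discrimination; 1.0 means the model ranks every observed landslide above every non-landslide.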
Bonn, Bernadine A.
2008-01-01
A long-term method detection level (LT-MDL) and laboratory reporting level (LRL) are used by the U.S. Geological Survey's National Water Quality Laboratory (NWQL) when reporting results from most chemical analyses of water samples. Changing to this method provided data users with additional information about their data and often resulted in more reported values in the low concentration range. Before this method was implemented, many of these values would have been censored. The use of the LT-MDL and LRL presents some challenges for the data user. Interpreting data in the low concentration range increases the need for adequate quality assurance, because even small contamination or recovery problems can be relatively large compared to concentrations near the LT-MDL and LRL. In addition, the definition of the LT-MDL, as well as the inclusion of low values, can result in complex data sets with multiple censoring levels and reported values that are less than a censoring level. Improper interpretation or statistical manipulation of low-range results in these data sets can introduce bias and lead to incorrect conclusions. This document is designed to help data users use and interpret data reported with the LT-MDL/LRL method. The calculation and application of the LT-MDL and LRL are described. This document shows how to extract statistical information from the LT-MDL and LRL and how to use that information in USGS investigations, such as assessing the quality of field data, interpreting field data, and planning data collection for new projects. A set of 19 detailed examples is included in this document to help data users think about their data and properly interpret low-range data without introducing bias. Although this document is not meant to be a comprehensive resource of statistical methods, several useful methods of analyzing censored data are demonstrated, including Regression on Order Statistics and Kaplan-Meier Estimation.
These two statistical methods handle complex censored data sets without resorting to substitution, thereby avoiding a common source of bias and inaccuracy.
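Kaplan-Meier estimation is usually stated for right-censored survival times; for left-censored concentration data it is applied to suitably "flipped" values. As a hedged sketch of the estimator itself (not the USGS report's worked examples), assuming the standard convention that events at a given time precede censorings at that time:

```python
def kaplan_meier(times, observed):
    """Kaplan-Meier estimate of the survival function S(t).
    times: event/censoring times; observed: 1 if the event was observed,
    0 if censored.  Returns (time, S(t)) pairs at each event time."""
    data = sorted(zip(times, observed))
    n_at_risk = len(data)
    s, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        events = sum(1 for tt, ev in data if tt == t and ev == 1)
        ties = sum(1 for tt, ev in data if tt == t)
        if events:
            s *= 1 - events / n_at_risk  # step down at each observed event
            curve.append((t, s))
        n_at_risk -= ties
        i += ties
    return curve

# Toy data: times 2 and 4 include censored observations.
km = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
```

Unlike substitution of half the detection limit, this uses the censored observations' at-risk information directly, which is why the report recommends it.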
ERIC Educational Resources Information Center
Vaughn, Brandon K.; Wang, Pei-Yu
2009-01-01
The emergence of technology has led to numerous changes in mathematical and statistical teaching and learning, which have improved the quality of instruction and teacher/student interactions. The teaching of statistics, for example, has shifted from mathematical calculations to higher-level cognitive abilities such as reasoning, interpretation, and…
The Development of Official Social Statistics in Italy with a Life Quality Approach
ERIC Educational Resources Information Center
Sabbadini, Linda Laura
2011-01-01
The article covers the main steps of official statistics in the second half of the Nineties through an illustration of the transition from economically oriented official statistics to the quality-of-life approach. The system of Multipurpose Surveys, introduced in 1993 to answer questions at the social level and to provide indicators for…
Statistical issues in quality control of proteomic analyses: good experimental design and planning.
Cairns, David A
2011-03-01
Quality control is becoming increasingly important in proteomic investigations as experiments become more multivariate and quantitative. Quality control applies to all stages of an investigation and statistics can play a key role. In this review, the role of statistical ideas in the design and planning of an investigation is described. This involves the design of unbiased experiments using key concepts from statistical experimental design, the understanding of the biological and analytical variation in a system using variance components analysis and the determination of a required sample size to perform a statistically powerful investigation. These concepts are described through simple examples and an example data set from a 2-D DIGE pilot experiment. Each of these concepts can prove useful in producing better and more reproducible data. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
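The sample-size determination mentioned in this review is often done with the standard normal-approximation formula for comparing two group means. A minimal sketch, assuming equal group sizes and a known within-group standard deviation; this is the generic textbook formula, not a calculation taken from the review itself.

```python
from math import ceil
from statistics import NormalDist

def two_sample_n(delta, sigma, alpha=0.05, power=0.8):
    """Approximate per-group sample size for a two-sample comparison of means:
    n = 2 * ((z_{1-alpha/2} + z_{power}) * sigma / delta)^2.
    delta: smallest difference worth detecting; sigma: within-group SD."""
    z = NormalDist()
    za = z.inv_cdf(1 - alpha / 2)
    zb = z.inv_cdf(power)
    return ceil(2 * ((za + zb) * sigma / delta) ** 2)

# Detecting a one-SD difference at 80% power needs about 16 samples per group.
n = two_sample_n(delta=1.0, sigma=1.0)
```

Halving the detectable difference roughly quadruples the required sample size, which is why the variance-components step matters before committing to an experiment.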
Statistics of Statisticians: Critical Mass of Statistics and Operational Research Groups
NASA Astrophysics Data System (ADS)
Kenna, Ralph; Berche, Bertrand
Using a recently developed model, inspired by mean field theory in statistical physics, and data from the UK's Research Assessment Exercise, we analyse the relationship between the qualities of statistics and operational research groups and the quantities of researchers in them. Similar to other academic disciplines, we provide evidence for a linear dependency of quality on quantity up to an upper critical mass, which is interpreted as the average maximum number of colleagues with whom a researcher can communicate meaningfully within a research group. The model also predicts a lower critical mass, which research groups should strive to achieve to avoid extinction. For statistics and operational research, the lower critical mass is estimated to be 9 ± 3. The upper critical mass, beyond which research quality does not significantly depend on group size, is 17 ± 6.
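The quality-quantity relationship described can be caricatured as a piecewise-linear function: quality grows with group size up to the upper critical mass, then plateaus, with the lower critical mass acting as a viability threshold. The `slope` and `base` values below are hypothetical; only the critical masses (9 ± 3 and 17 ± 6) come from the abstract.

```python
def group_quality(n, n_lower=9, n_upper=17, slope=2.0, base=10.0):
    """Piecewise-linear quality-vs-size sketch of the critical-mass model:
    linear growth in group size up to the upper critical mass n_upper, then a
    plateau.  slope and base are invented for illustration; n_lower is the
    estimated lower critical mass below which a group risks extinction."""
    viable = n >= n_lower
    quality = base + slope * min(n, n_upper)
    return quality, viable

q, viable = group_quality(12)  # a mid-sized, viable group
```

Beyond `n_upper`, adding researchers no longer raises quality in this model, matching the paper's interpretation of a communication limit.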
Petzold, Thomas; Steinwitz, Adrienne; Schmitt, Jochen; Eberlein-Gonska, Maria
2013-01-01
Obligatory external quality assurance is an established method used to ensure the quality of inpatient care in Germany; the comprehensive approach is unique in international comparison. In addition to the statutory requirement, the health insurance funds require this form of external quality control in order to foster quality-based competition between hospitals. Ever since its introduction, healthcare providers have scrutinised the effects of the mandatory use of this survey. The study was based on all patients in the University Hospital Dresden for whom a quality assurance sheet (n = 45,639) had to be recorded between 2003 and 2011. The documentation of these sheets was carried out by specially trained personnel. For each performance area, the time required to document the quality sheets was assessed, and a descriptive analysis of all quality assurance sheets was conducted. Where results were statistically significant, the so-called "Structured Dialogues" were analysed. Over the whole period, 167 statistically noticeable problems occurred. Nine of these were rated as noticeable problems in medical quality by the specialised working groups of the project office quality assurance (PGSQS) at the Saxon State Medical Association (SLÄK). The remaining 158 statistical anomalies included 25 documentation errors; 96 were classified as statistically significant, and only 37 were marked as requiring re-observation by the PGSQS. The total effort for the documentation of quality assurance sheets was estimated at approximately 1,420 working days over the observation period. As far as the quality of patient care is concerned, the results can be considered positive, because only a small number of quality indicators pointed to noticeable qualitative problems. This statement is based primarily on the comparison of the Saxony and Germany groups, which are included in the quality report of external quality assurance in accordance with sect. 137 SGB V.
The majority of statistically noticeable problems were due to documentation errors. Other statistically noticeable problems that are medically explicable, but without effect on the extramural care of patients, recur with the respective quality indicators. Examples include the postoperative mobility indicators for endoprosthesis implantation, which cannot be used to draw conclusions about patient outcomes. Information on quality of life, as well as on the post-hospital course of disease, would be important in this context but is still lacking. The use of external quality assurance data in accordance with sect. 137 SGB V for evaluation research has so far been handled quite restrictively, so in-depth analyses of treatment quality cannot be derived. Copyright © 2013. Published by Elsevier GmbH.
PRECISE: PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare
Chen, Feng; Wang, Shuang; Mohammed, Noman; Cheng, Samuel; Jiang, Xiaoqian
2015-01-01
Quality improvement (QI) requires systematic and continuous efforts to enhance healthcare services. A healthcare provider might wish to compare local statistics with those from other institutions in order to identify problems and develop intervention to improve the quality of care. However, the sharing of institution information may be deterred by institutional privacy as publicizing such statistics could lead to embarrassment and even financial damage. In this article, we propose a PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare (PRECISE), which aims at enabling cross-institution comparison of healthcare statistics while protecting privacy. The proposed framework relies on a set of state-of-the-art cryptographic protocols including homomorphic encryption and Yao’s garbled circuit schemes. By securely pooling data from different institutions, PRECISE can rank the encrypted statistics to facilitate QI among participating institutes. We conducted experiments using MIMIC II database and demonstrated the feasibility of the proposed PRECISE framework. PMID:26146645
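The paper's actual protocols (homomorphic encryption and Yao's garbled circuits) are too involved to reproduce here. As a deliberately simplified stand-in, a toy additive secret-sharing "secure sum" illustrates the core idea of pooling institutional statistics so that only the aggregate, never an individual contribution, is revealed. This is an illustrative substitute, not PRECISE's construction.

```python
import random

MODULUS = 2**61 - 1  # a large prime; all arithmetic is modular

def share(value, n_parties, modulus=MODULUS):
    """Split an integer into n additive shares that sum to value mod modulus."""
    shares = [random.randrange(modulus) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % modulus)
    return shares

def secure_sum(values, modulus=MODULUS):
    """Each institution splits its statistic into shares; aggregating the
    share columns and combining the partial sums reveals only the total."""
    n = len(values)
    all_shares = [share(v, n, modulus) for v in values]
    # party i sums the i-th share from every institution
    partial = [sum(col) % modulus for col in zip(*all_shares)]
    return sum(partial) % modulus

# Hypothetical per-hospital event counts pooled without disclosure.
total = secure_sum([12, 7, 30])
```

Each individual share is uniformly random, so no single party learns anything about another's count; only the final combination equals the true total.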
NASA Astrophysics Data System (ADS)
Cameron, Enrico; Pilla, Giorgio; Stella, Fabio A.
2018-06-01
The application of statistical classification methods is investigated—in comparison also to spatial interpolation methods—for predicting the acceptability of well-water quality in a situation where an effective quantitative model of the hydrogeological system under consideration cannot be developed. In the example area in northern Italy, in particular, the aquifer is locally affected by saline water and the concentration of chloride is the main indicator of both saltwater occurrence and groundwater quality. The goal is to predict if the chloride concentration in a water well will exceed the allowable concentration so that the water is unfit for the intended use. A statistical classification algorithm achieved the best predictive performances and the results of the study show that statistical classification methods provide further tools for dealing with groundwater quality problems concerning hydrogeological systems that are too difficult to describe analytically or to simulate effectively.
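The abstract does not name the winning classification algorithm, so as a hedged sketch only, a k-nearest-neighbour vote shows how spatial well data can be used to classify exceedance of a chloride limit. The well coordinates and labels below are invented.

```python
def knn_predict(train, query, k=3):
    """Classify a point by majority vote among its k nearest training points.
    train: list of ((x, y), label); label 1 means chloride exceeds the
    allowable concentration, 0 means the water is acceptable."""
    ranked = sorted(train, key=lambda p: (p[0][0] - query[0]) ** 2 + (p[0][1] - query[1]) ** 2)
    votes = [label for _, label in ranked[:k]]
    return 1 if sum(votes) * 2 > len(votes) else 0

# Hypothetical wells: a saline cluster near the origin, a fresh cluster at (5, 5).
wells = [((0, 0), 1), ((1, 0), 1), ((0, 1), 1),
         ((5, 5), 0), ((6, 5), 0), ((5, 6), 0)]
```

Unlike spatial interpolation, the classifier predicts the exceedance class directly rather than estimating the concentration field first.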
PRECISE: PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare.
Chen, Feng; Wang, Shuang; Mohammed, Noman; Cheng, Samuel; Jiang, Xiaoqian
2014-10-01
Quality improvement (QI) requires systematic and continuous efforts to enhance healthcare services. A healthcare provider might wish to compare local statistics with those from other institutions in order to identify problems and develop intervention to improve the quality of care. However, the sharing of institution information may be deterred by institutional privacy as publicizing such statistics could lead to embarrassment and even financial damage. In this article, we propose a PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare (PRECISE), which aims at enabling cross-institution comparison of healthcare statistics while protecting privacy. The proposed framework relies on a set of state-of-the-art cryptographic protocols including homomorphic encryption and Yao's garbled circuit schemes. By securely pooling data from different institutions, PRECISE can rank the encrypted statistics to facilitate QI among participating institutes. We conducted experiments using MIMIC II database and demonstrated the feasibility of the proposed PRECISE framework.
Probabilistic Evaluation of Competing Climate Models
NASA Astrophysics Data System (ADS)
Braverman, A. J.; Chatterjee, S.; Heyman, M.; Cressie, N.
2017-12-01
A standard paradigm for assessing the quality of climate model simulations is to compare what these models produce for past and present time periods, to observations of the past and present. Many of these comparisons are based on simple summary statistics called metrics. Here, we propose an alternative: evaluation of competing climate models through probabilities derived from tests of the hypothesis that climate-model-simulated and observed time sequences share common climate-scale signals. The probabilities are based on the behavior of summary statistics of climate model output and observational data, over ensembles of pseudo-realizations. These are obtained by partitioning the original time sequences into signal and noise components, and using a parametric bootstrap to create pseudo-realizations of the noise sequences. The statistics we choose come from working in the space of decorrelated and dimension-reduced wavelet coefficients. We compare monthly sequences of CMIP5 model output of average global near-surface temperature anomalies to similar sequences obtained from the well-known HadCRUT4 data set, as an illustration.
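The parametric-bootstrap step can be sketched minimally: model the noise (here simply as Gaussian with the residuals' standard deviation, a simplification of the paper's signal/noise partitioning and wavelet machinery), generate pseudo-realizations, and locate the observed statistic in the resulting null distribution. All data and names below are illustrative.

```python
import random
from statistics import mean, pstdev

def bootstrap_p_value(observed_stat, residuals, stat, n_boot=2000, seed=1):
    """Parametric bootstrap: fit a Gaussian noise model to the residuals,
    simulate pseudo-realizations of the noise sequence, and return the
    fraction of simulated statistics at least as large as the observed one."""
    rng = random.Random(seed)
    sigma = pstdev(residuals)
    n = len(residuals)
    null = [stat([rng.gauss(0.0, sigma) for _ in range(n)]) for _ in range(n_boot)]
    exceed = sum(1 for s in null if s >= observed_stat)
    return exceed / n_boot

noise = [(-1) ** i for i in range(50)]          # toy residuals with SD 1
p_extreme = bootstrap_p_value(3.0, noise, stat=mean)  # a 3-sigma mean: tiny p
p_null = bootstrap_p_value(0.0, noise, stat=mean)     # a zero mean: p near 0.5
```

The resulting probabilities play the role of the hypothesis-test evidence the abstract describes for comparing model output against HadCRUT4.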
Non-homogeneous updates for the iterative coordinate descent algorithm
NASA Astrophysics Data System (ADS)
Yu, Zhou; Thibault, Jean-Baptiste; Bouman, Charles A.; Sauer, Ken D.; Hsieh, Jiang
2007-02-01
Statistical reconstruction methods show great promise for improving resolution, and reducing noise and artifacts in helical X-ray CT. In fact, statistical reconstruction seems to be particularly valuable in maintaining reconstructed image quality when the dosage is low and the noise is therefore high. However, high computational cost and long reconstruction times remain as a barrier to the use of statistical reconstruction in practical applications. Among the various iterative methods that have been studied for statistical reconstruction, iterative coordinate descent (ICD) has been found to have relatively low overall computational requirements due to its fast convergence. This paper presents a novel method for further speeding the convergence of the ICD algorithm, and therefore reducing the overall reconstruction time for statistical reconstruction. The method, which we call nonhomogeneous iterative coordinate descent (NH-ICD) uses spatially non-homogeneous updates to speed convergence by focusing computation where it is most needed. Experimental results with real data indicate that the method speeds reconstruction by roughly a factor of two for typical 3D multi-slice geometries.
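The coordinate-wise updates at the heart of ICD can be illustrated on a toy least-squares problem. This is a generic coordinate-descent sketch, not the non-homogeneous update ordering (NH-ICD) the paper proposes: each step minimizes the objective exactly along one coordinate while keeping the residual up to date, which is what makes per-voxel updates cheap.

```python
def coordinate_descent_ls(A, b, n_sweeps=100):
    """Solve min ||Ax - b||^2 by exact one-coordinate updates, the core idea
    behind iterative coordinate descent (ICD) reconstruction."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    r = list(b)  # residual b - A x (x starts at zero)
    for _ in range(n_sweeps):
        for j in range(n):
            col = [A[i][j] for i in range(m)]
            # optimal step along coordinate j given the current residual
            step = sum(col[i] * r[i] for i in range(m)) / sum(c * c for c in col)
            x[j] += step
            for i in range(m):
                r[i] -= step * col[i]
    return x

# 2x2 system A x = b with exact solution (1, 2).
A = [[2.0, 1.0], [1.0, 3.0]]
b = [4.0, 7.0]
x = coordinate_descent_ls(A, b)
```

NH-ICD's contribution is to spend more of these updates where the image is changing fastest, rather than sweeping coordinates uniformly as above.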
2008 Niday Perinatal Database quality audit: report of a quality assurance project.
Dunn, S; Bottomley, J; Ali, A; Walker, M
2011-12-01
This quality assurance project was designed to determine the reliability, completeness and comprehensiveness of the data entered into Niday Perinatal Database. Quality of the data was measured by comparing data re-abstracted from the patient record to the original data entered into the Niday Perinatal Database. A representative sample of hospitals in Ontario was selected and a random sample of 100 linked mother and newborn charts were audited for each site. A subset of 33 variables (representing 96 data fields) from the Niday dataset was chosen for re-abstraction. Of the data fields for which Cohen's kappa statistic or intraclass correlation coefficient (ICC) was calculated, 44% showed substantial or almost perfect agreement (beyond chance). However, about 17% showed less than 95% agreement and a kappa or ICC value of less than 60% indicating only slight, fair or moderate agreement (beyond chance). Recommendations to improve the quality of these data fields are presented.
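Cohen's kappa, the chance-corrected agreement statistic used in this audit, can be computed directly. The ratings below are hypothetical stand-ins for one audited data field, not values from the Niday database.

```python
def cohens_kappa(a, b):
    """Cohen's kappa: agreement between two raters beyond chance.
    a, b: equal-length lists of categorical ratings."""
    n = len(a)
    cats = set(a) | set(b)
    p_obs = sum(1 for x, y in zip(a, b) if x == y) / n   # observed agreement
    p_exp = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)  # chance
    return (p_obs - p_exp) / (1 - p_exp)

# Originally entered vs. re-abstracted values for one field (hypothetical).
original   = ["y", "y", "n", "y", "n", "n", "y", "n"]
reabstract = ["y", "y", "n", "n", "n", "n", "y", "y"]
kappa = cohens_kappa(original, reabstract)
```

By the conventional bands the report uses, kappa below about 0.6 indicates only slight-to-moderate agreement beyond chance, even when raw percent agreement looks high.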
Global map of lithosphere thermal thickness on a 1 deg x 1 deg grid - digitally available
NASA Astrophysics Data System (ADS)
Artemieva, Irina
2014-05-01
This presentation reports a 1 deg ×1 deg global thermal model for the continental lithosphere (TC1). The model is digitally available from the author's web-site: www.lithosphere.info. Geotherms for continental terranes of different ages (early Archean to present) are constrained by reliable data on borehole heat flow measurements (Artemieva and Mooney, 2001), checked with the original publications for data quality, and corrected for paleo-temperature effects where needed. These data are supplemented by cratonic geotherms based on xenolith data. Since heat flow measurements cover not more than half of the continents, the remaining areas (ca. 60% of the continents) are filled by the statistical numbers derived from the thermal model constrained by borehole data. Continental geotherms are statistically analyzed as a function of age and are used to estimate lithospheric temperatures in continental regions with no or low quality heat flow data. This analysis requires knowledge of lithosphere age globally. A compilation of tectono-thermal ages of lithospheric terranes on a 1 deg × 1 deg grid forms the basis for the statistical analysis. It shows that, statistically, lithospheric thermal thickness z (in km) depends on tectono-thermal age t (in Ma) as: z=0.04t+93.6. This relationship formed the basis for a global thermal model of the continental lithosphere (TC1). Statistical analysis of continental geotherms also reveals that this relationship holds for the Archean cratons in general, but not in detail. Particularly, thick (more than 250 km) lithosphere is restricted solely to young Archean terranes (3.0-2.6 Ga), while in old Archean cratons (3.6-3.0 Ga) lithospheric roots do not extend deeper than 200-220 km. The TC1 model is presented by a set of maps, which show significant thermal heterogeneity within continental upper mantle. The strongest lateral temperature variations (as large as 800 deg C) are typical of the shallow mantle (depth less than 100 km). 
A map of the depth to a 600 deg C isotherm in continental upper mantle is presented as a proxy to the elastic thickness of the cratonic lithosphere, in which flexural rigidity is dominated by olivine rheology of the mantle. The TC1 model of the lithosphere thickness is used to calculate the growth and preservation rates of the lithosphere since the Archean.
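The fitted relation quoted in the abstract, z = 0.04 t + 93.6, maps tectono-thermal age directly to statistical lithosphere thermal thickness. A trivial evaluation sketch:

```python
def thermal_thickness_km(age_ma):
    """Statistical lithosphere thermal thickness (km) from tectono-thermal
    age (Ma), evaluating the abstract's fitted relation z = 0.04*t + 93.6."""
    return 0.04 * age_ma + 93.6

# e.g. a late Archean terrane of 2600 Ma
z = thermal_thickness_km(2600)
```

For a 2600-Ma terrane this gives about 198 km, consistent with the abstract's observation that roots of the oldest cratons do not extend much beyond 200-220 km in the statistical average, even though individual young-Archean terranes exceed 250 km.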
Electric power quarterly, April-June 1987
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1987-10-13
The EPQ presents monthly summaries of electric utility statistics at the national, divisional, state, company, and plant levels on the following subjects: quantity of fuel, cost of fuel, quality of fuel, net generation, fuel consumption, fuel stocks. In addition, the EPQ presents a quarterly summary of reported major disturbances and unusual occurrences. These data are collected on the Form IE-417R. Every electric utility engaged in the generation, transmission, or distribution of electric energy must file a report with DOE if it experiences a major power system emergency.
Electric power quarterly, July-September 1987
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-01-22
The EPQ presents monthly summaries of electric utility statistics at the national, divisional, state, company, and plant levels on the following subjects: quantity of fuel, cost of fuel, quality of fuel, net generation, fuel consumption, fuel stocks. In addition, the EPQ presents a quarterly summary of reported major disturbances and unusual occurrences. These data are collected on the Form IE-417R. Every electric utility engaged in the generation, transmission, or distribution of electric energy must file a report with DOE if it experiences a major power system emergency.
Vermont's use-value appraisal property tax program: a forest inventory and analysis
Paul E. Sendak; Donald F. Dennis; Donald F. Dennis
1989-01-01
A statistical report and analysis of the timberland enrolled in the Vermont Use Value Appraisal (UVA) property tax program. The study was conducted using data collected in the fourth forest survey of Vermont (1983). Estimates are presented on land area, timber volumes, tree quality, numbers of live trees, and biomass for timberland enrolled in the UVA program and for...
Schwitzer, Jonathan A.; Albino, Frank P.; Mathis, Ryan K.; Scott, Amie M.; Gamble, Laurie; Baker, Stephen B.
2015-01-01
Background As rhinoplasty patient demographics evolve, surgeons must consider the impact of demographics on patient satisfaction. Objectives The objective of this study was to identify independent demographic predictors of differences in satisfaction with appearance and quality of life following rhinoplasty, utilizing the FACE-Q patient-reported outcome instrument. Methods Patients presenting for rhinoplasty completed the following FACE-Q scales: Satisfaction with Facial Appearance, Satisfaction with Nose, Social Function, and Psychological Well-being. Higher FACE-Q scores indicate greater satisfaction with appearance or superior quality of life. Pre- and post-treatment scores were compared in the context of patient demographics. Results The scales were completed by 59 patients. Women demonstrated statistically significant improvements in Satisfaction with Facial Appearance and quality of life, while men experienced significant improvement only in Satisfaction with Facial Appearance. Caucasians demonstrated statistically significant improvement in Satisfaction with Facial Appearance and quality of life, while non-Caucasians did not. Patients younger than 35 years old were more likely to experience enhanced Satisfaction with Facial Appearance and quality of life compared with patients older than 35 years old. Patients with income ≥$100,000 were more likely to experience significant increases in Satisfaction with Facial Appearance and quality of life than patients with incomes <$100,000. Conclusions In an objective study using a validated patient-reported outcome instrument, the authors were able to quantify differences in the clinically meaningful change in perception of appearance and quality of life that rhinoplasty patients gain based on demographic variables. The authors also demonstrated that these variables are potential predictors of differences in satisfaction. Level of Evidence 3 Therapeutic PMID:26063837
Dodge, Kent A.; Hornberger, Michelle I.; Dyke, Jessica
2007-01-01
Water, bed sediment, and biota were sampled in streams from Butte to below Milltown Reservoir as part of a long-term monitoring program in the upper Clark Fork basin; additional water-quality samples were collected in the Clark Fork basin from sites near Milltown Reservoir downstream to near the confluence of the Clark Fork and Flathead River as part of a supplemental sampling program. The sampling programs were conducted in cooperation with the U.S. Environmental Protection Agency to characterize aquatic resources in the Clark Fork basin of western Montana, with emphasis on trace elements associated with historic mining and smelting activities. Sampling sites were located on the Clark Fork and selected tributaries. Water-quality samples were collected periodically at 22 sites from October 2005 through September 2006. Bed-sediment and biological samples were collected once at 12 sites during August 2006. This report presents the analytical results and quality-assurance data for water-quality, bed-sediment, and biota samples collected at all long-term and supplemental monitoring sites from October 2005 through September 2006. Water-quality data include concentrations of selected major ions, trace elements, and suspended sediment. Nutrients also were analyzed in the supplemental water-quality samples. Daily values of suspended-sediment concentration and suspended-sediment discharge were determined for four sites, and seasonal daily values of turbidity were determined for four sites. Bed-sediment data include trace-element concentrations in the fine-grained fraction. Biological data include trace-element concentrations in whole-body tissue of aquatic benthic insects. Statistical summaries of long-term water-quality, bed-sediment, and biological data for sites in the upper Clark Fork basin are provided for the period of record since 1985.
George, Stephen L; Buyse, Marc
2015-01-01
Highly publicized cases of fabrication or falsification of data in clinical trials have occurred in recent years and it is likely that there are additional undetected or unreported cases. We review the available evidence on the incidence of data fraud in clinical trials, describe several prominent cases, present information on motivation and contributing factors and discuss cost-effective ways of early detection of data fraud as part of routine central statistical monitoring of data quality. Adoption of these clinical trial monitoring procedures can identify potential data fraud not detected by conventional on-site monitoring and can improve overall data quality. PMID:25729561
Effectiveness of propolis on oral health: a meta-analysis.
Hwu, Yueh-Juen; Lin, Feng-Yu
2014-12-01
The use of propolis mouth rinse or gel as a supplementary intervention has increased during the last decade in Taiwan. However, the effect of propolis on oral health is not well understood. The purpose of this meta-analysis was to present the best available evidence regarding the effects of propolis use on oral health, including oral infection, dental plaque, and stomatitis. Researchers searched seven electronic databases for relevant articles published between 1969 and 2012. Data were collected using inclusion and exclusion criteria. The Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument was used to evaluate the quality of the identified articles. Eight trials published from 1997 to 2011 with 194 participants had extractable data. The results of the meta-analysis indicated that, although propolis had an effect on reducing dental plaque, this effect was not statistically significant. The results were not statistically significant for oral infection or stomatitis. Although there are a number of promising indications, in view of the limited number and quality of studies and the variation in results among studies, this review highlights the need for additional well-designed trials to draw conclusions that are more robust.
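The pooling step behind a meta-analysis of this kind can be sketched with a generic inverse-variance fixed-effect calculation. This is an illustrative sketch only; the effect sizes and standard errors below are hypothetical placeholders, not data from the review:

```python
import math

def fixed_effect_pool(effects, ses):
    """Inverse-variance fixed-effect pooled estimate with a 95% CI."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

# Hypothetical standardized mean differences (plaque reduction) and their SEs
effects = [-0.40, -0.15, -0.30]
ses = [0.30, 0.35, 0.28]
est, lo, hi = fixed_effect_pool(effects, ses)
# A confidence interval crossing zero corresponds to "not statistically significant"
print(round(est, 3), lo < 0.0 < hi)
```

With these invented inputs the pooled estimate favors propolis but its interval crosses zero, mirroring the review's "effect present but not significant" pattern.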
21 CFR 211.165 - Testing and release for distribution.
Code of Federal Regulations, 2010 CFR
2010-04-01
... products meet each appropriate specification and appropriate statistical quality control criteria as a condition for their approval and release. The statistical quality control criteria shall include appropriate acceptance levels and/or appropriate rejection levels. (e) The accuracy, sensitivity, specificity, and...
Some Dimensions of Data Quality in Statistical Systems
DOT National Transportation Integrated Search
1997-07-01
An important objective of a statistical data system is to enable users of the data to recommend (and organizations to take) rational action for solving problems or for improving the quality of a service or manufactured product. With this view in mind, this ...
Conradi, Una; Joffe, Ari R
2017-07-07
To determine a direct measure of publication bias by determining subsequent full-paper publication (P) of studies reported in animal research abstracts presented at an international conference (A). We selected 100 random (using a random-number generator) A from the 2008 Society of Critical Care Medicine Conference. Using a data collection form and study manual, we recorded methodology and result variables from A. We searched PubMed and EMBASE to June 2015, and DOAJ and Google Scholar to May 2017 to screen for subsequent P. Methodology and result variables were recorded from P to determine changes in reporting from A. Predictors of P were examined using Fisher's Exact Test. 62% (95% CI 52-71%) of studies described in A were subsequently P after a median 19 [IQR 9-33.3] months from conference presentation. Reporting of studies in A was of low quality: randomized 27% (the method of randomization and allocation concealment not described), blinded 0%, sample-size calculation stated 0%, specifying the primary outcome 26%, numbers given with denominators 6%, and stating number of animals used 47%. Only being an orally presented (vs. poster presented) A (14/16 vs. 48/84, p = 0.025) predicted P. Reporting of studies in P was of poor quality: randomized 39% (the method of randomization and allocation concealment not described), likely blinded 6%, primary outcome specified 5%, sample size calculation stated 0%, numbers given with denominators 34%, and number of animals used stated 56%. Changes in reporting from A to P occurred: from non-randomized to randomized 19%, from non-blinded to blinded 6%, from negative to positive outcomes 8%, from having to not having a stated primary outcome 16%, and from non-statistically to statistically significant findings 37%. Post-hoc, using publication data, P was predicted by having positive outcomes (published 62/62, unpublished 33/38; p = 0.003), or statistically significant results (published 58/62, unpublished 20/38; p < 0.001). 
Only 62% (95% CI 52-71%) of animal research A are subsequently P; this was predicted by oral presentation of the A, finally having positive outcomes, and finally having statistically significant results. Publication bias is prevalent in critical care animal research.
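The oral-versus-poster comparison reported above (14 of 16 oral abstracts published vs. 48 of 84 poster abstracts) can be checked with a self-contained two-sided Fisher's exact test. This is an illustrative reimplementation using only the standard library, not the authors' code:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]."""
    r1, c1, n = a + b, a + c, a + b + c + d
    def prob(x):  # hypergeometric probability that cell (0, 0) equals x
        return comb(c1, x) * comb(n - c1, r1 - x) / comb(n, r1)
    p_obs = prob(a)
    x_lo, x_hi = max(0, r1 - (n - c1)), min(r1, c1)
    # Sum all tables at least as extreme (probability <= observed table's)
    return sum(prob(x) for x in range(x_lo, x_hi + 1)
               if prob(x) <= p_obs * (1 + 1e-9))

# Oral abstracts: 14 published, 2 not; poster abstracts: 48 published, 36 not
p = fisher_exact_two_sided(14, 2, 48, 36)
```

Run on this table, the test should reproduce a p-value near the reported p = 0.025.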
Granato, Gregory E.
2014-01-01
The U.S. Geological Survey (USGS) developed the Stochastic Empirical Loading and Dilution Model (SELDM) in cooperation with the Federal Highway Administration (FHWA) to indicate the risk for stormwater concentrations, flows, and loads to be above user-selected water-quality goals and the potential effectiveness of mitigation measures to reduce such risks. SELDM models the potential effect of mitigation measures by using Monte Carlo methods with statistics that approximate the net effects of structural and nonstructural best management practices (BMPs). In this report, structural BMPs are defined as the components of the drainage pathway between the source of runoff and a stormwater discharge location that affect the volume, timing, or quality of runoff. SELDM uses a simple stochastic statistical model of BMP performance to develop planning-level estimates of runoff-event characteristics. This statistical approach can be used to represent a single BMP or an assemblage of BMPs. The SELDM BMP-treatment module has provisions for stochastic modeling of three stormwater treatments: volume reduction, hydrograph extension, and water-quality treatment. In SELDM, these three treatment variables are modeled by using the trapezoidal distribution and the rank correlation with the associated highway-runoff variables. This report describes methods for calculating the trapezoidal-distribution statistics and rank correlation coefficients for stochastic modeling of volume reduction, hydrograph extension, and water-quality treatment by structural stormwater BMPs and provides the calculated values for these variables. This report also provides robust methods for estimating the minimum irreducible concentration (MIC), which is the lowest expected effluent concentration from a particular BMP site or a class of BMPs. These statistics are different from the statistics commonly used to characterize or compare BMPs. 
They are designed to provide a stochastic transfer function to approximate the quantity, duration, and quality of BMP effluent given the associated inflow values for a population of storm events. A database application and several spreadsheet tools are included in the digital media accompanying this report for further documentation of methods and for future use. In this study, analyses were done with data extracted from a modified copy of the January 2012 version of International Stormwater Best Management Practices Database, designated herein as the January 2012a version. Statistics for volume reduction, hydrograph extension, and water-quality treatment were developed with selected data. Sufficient data were available to estimate statistics for 5 to 10 BMP categories by using data from 40 to more than 165 monitoring sites. Water-quality treatment statistics were developed for 13 runoff-quality constituents commonly measured in highway and urban runoff studies including turbidity, sediment and solids; nutrients; total metals; organic carbon; and fecal coliforms. The medians of the best-fit statistics for each category were selected to construct generalized cumulative distribution functions for the three treatment variables. For volume reduction and hydrograph extension, interpretation of available data indicates that selection of a Spearman’s rho value that is the average of the median and maximum values for the BMP category may help generate realistic simulation results in SELDM. The median rho value may be selected to help generate realistic simulation results for water-quality treatment variables. MIC statistics were developed for 12 runoff-quality constituents commonly measured in highway and urban runoff studies by using data from 11 BMP categories and more than 167 monitoring sites. Four statistical techniques were applied for estimating MIC values with monitoring data from each site. These techniques produce a range of lower-bound estimates for each site. 
Four MIC estimators are proposed as alternatives for selecting a value from among the estimates from multiple sites. Correlation analysis indicates that the MIC estimates from multiple sites were weakly correlated with the geometric mean of inflow values, which indicates that there may be a qualitative or semiquantitative link between the inflow quality and the MIC. Correlations probably are weak because the MIC is influenced by the inflow water quality and the capability of each individual BMP site to reduce inflow concentrations.
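SELDM's use of the trapezoidal distribution for stochastic treatment variables can be illustrated with a minimal inverse-CDF sampler. This is a generic sketch of trapezoidal Monte Carlo sampling, not SELDM's implementation, and the distribution bounds below are hypothetical rather than statistics from the report:

```python
import random

def sample_trapezoid(a, b, c, d, rng):
    """One draw from a trapezoidal density: rising a->b, flat b->c, falling c->d."""
    areas = [(b - a) / 2.0, (c - b), (d - c) / 2.0]  # relative region areas
    region = rng.choices([0, 1, 2], weights=areas)[0]
    u = rng.random()
    if region == 0:                     # rising edge: triangular, peak at b
        return a + (b - a) * u ** 0.5
    if region == 1:                     # plateau: uniform on [b, c]
        return b + (c - b) * u
    return d - (d - c) * u ** 0.5       # falling edge: triangular, peak at c

rng = random.Random(42)
# Hypothetical BMP volume-reduction fractions: min 0, most-likely range 0.2-0.6, max 1
draws = [sample_trapezoid(0.0, 0.2, 0.6, 1.0, rng) for _ in range(20000)]
mean = sum(draws) / len(draws)
```

Splitting the trapezoid into two triangles and a rectangle keeps each region's inverse CDF a one-liner; the sample mean converges to the analytic mean of about 0.457 for these bounds.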
NASA Astrophysics Data System (ADS)
Stewart, Brent K.; Carter, Stephen J.; Langer, Steven G.; Andrew, Rex K.
1998-06-01
Experiments using NASA's Advanced Communications Technology Satellite were conducted to provide an estimate of the compressed video quality required for preservation of clinically relevant features for the detection of trauma. Bandwidth rates of 128, 256 and 384 kbps were used. A five-point Likert scale (1 = no useful information, 5 = good diagnostic quality) was used for a subjective preference questionnaire to evaluate the quality of the compressed ultrasound imagery at the three compression rates for several anatomical regions of interest. At 384 kbps the Likert scores (mean ± SD) were abdomen (4.45 ± 0.71), carotid artery (4.70 ± 0.36), kidney (5.0 ± 0.0), liver (4.67 ± 0.58) and thyroid (4.03 ± 0.74). Due to the volatile nature of the H.320 compressed digital video stream, no statistically significant results can be derived through this methodology. As the MPEG standard has at its roots many of the same intraframe and motion vector compression algorithms as the H.261 (such as that used in the previous ACTS/AMT experiments), we are using the MPEG compressed video sequences to best gauge what minimum bandwidths are necessary for preservation of clinically relevant features for the detection of trauma. We have been using an MPEG codec board to collect losslessly compressed video clips from high-quality S-VHS tapes and through direct digitization of S-video. Due to the large number of videoclips and questions to be presented to the radiologists and for ease of application, we have developed a web browser interface for this video visual perception study. Due to the large numbers of observations required to reach statistical significance in most ROC studies, Kappa statistical analysis is used to analyze the degree of agreement between observers and between viewing assessments.
If the degree of agreement amongst readers is high, then there is a possibility that the ratings (i.e., average Likert score at each bandwidth) do in fact reflect the dimension they are purported to reflect (video quality versus bandwidth). It is then possible to make intelligent choice of bandwidth for streaming compressed video and compressed videoclips.
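The kappa agreement analysis mentioned above can be sketched as a plain Cohen's kappa computation for two raters. The Likert ratings below are invented for illustration, not data from the study:

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(r1)
    cats = sorted(set(r1) | set(r2))
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n
    # Expected agreement by chance, from each rater's marginal frequencies
    p_exp = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical five-point Likert ratings of the same clips by two observers
rater1 = [5, 4, 4, 3, 5, 4, 2, 5, 4, 3]
rater2 = [5, 4, 3, 3, 5, 4, 2, 4, 4, 3]
kappa = cohens_kappa(rater1, rater2)
```

Here 8 of 10 ratings agree (p_obs = 0.8) against an expected chance agreement of 0.29, giving kappa of about 0.72, which would conventionally be read as substantial agreement.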
Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Shiyu, E-mail: shiyu.xu@gmail.com; Chen, Ying, E-mail: adachen@siu.edu; Lu, Jianping
2015-09-15
Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors’ goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications.
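The core idea of statistical IR, iteratively minimizing a data-fit term plus a penalty (prior), can be shown on a deliberately tiny toy system. This gradient-descent sketch is purely illustrative and unrelated to the authors' DBT algorithm; the system matrix, penalty weight, and step size are all invented:

```python
# Toy "reconstruction": 3 ray measurements of 2 voxels, true image x = [1.0, 2.0]
A = [[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]]
y = [1.0, 3.0, 2.0]
lam = 0.01    # quadratic penalty weight, standing in for the regularizing prior
x = [0.0, 0.0]
step = 0.1

for _ in range(500):
    # residual r = A x - y
    r = [sum(A[i][j] * x[j] for j in range(2)) - y[i] for i in range(3)]
    # gradient of the penalized objective ||A x - y||^2 + lam * ||x||^2
    g = [2.0 * sum(A[i][j] * r[i] for i in range(3)) + 2.0 * lam * x[j]
         for j in range(2)]
    x = [x[j] - step * g[j] for j in range(2)]
```

The iteration converges to approximately (1.000, 1.990): the small penalty shrinks the exact solution slightly, the same trade-off between data fidelity and prior that drives full-scale statistical IR.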
A neighborhood statistics model for predicting stream pathogen indicator levels.
Pandey, Pramod K; Pasternack, Gregory B; Majumder, Mahbubul; Soupir, Michelle L; Kaiser, Mark S
2015-03-01
Because elevated levels of water-borne Escherichia coli in streams are a leading cause of water quality impairments in the U.S., water-quality managers need tools for predicting aqueous E. coli levels. Presently, E. coli levels may be predicted using complex mechanistic models that have a high degree of unchecked uncertainty or simpler statistical models. To assess spatio-temporal patterns of instream E. coli levels, herein we measured E. coli, a pathogen indicator, at 16 sites (at four different times) within the Squaw Creek watershed, Iowa, and subsequently, the Markov Random Field model was exploited to develop a neighborhood statistics model for predicting instream E. coli levels. Two observed covariates, local water temperature (degrees Celsius) and mean cross-sectional depth (meters), were used as inputs to the model. Predictions of E. coli levels in the water column were compared with independent observational data collected from 16 in-stream locations. The results revealed that spatio-temporal averages of predicted and observed E. coli levels were extremely close. Approximately 66% of individual predicted E. coli concentrations were within a factor of 2 of the observed values. In only one event, the difference between prediction and observation was beyond one order of magnitude. The mean of all predicted values at 16 locations was approximately 1% higher than the mean of the observed values. The approach presented here will be useful while assessing instream contaminations such as pathogen/pathogen indicator levels at the watershed scale.
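The "within a factor of 2" agreement metric used above is simple to compute from paired predictions and observations. The predicted and observed values below are hypothetical, not the study's data:

```python
def fraction_within_factor(pred, obs, factor=2.0):
    """Fraction of predictions within a multiplicative factor of the observations."""
    return sum(1 for p, o in zip(pred, obs)
               if 1.0 / factor <= p / o <= factor) / len(pred)

# Hypothetical predicted vs. observed E. coli levels (CFU per 100 mL)
predicted = [100.0, 50.0, 200.0, 400.0, 80.0, 30.0]
observed = [120.0, 180.0, 150.0, 150.0, 60.0, 40.0]
f = fraction_within_factor(predicted, observed)
```

For these six invented pairs, four ratios fall between 0.5 and 2, so the metric is 4/6, analogous to the roughly 66% reported in the abstract.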
Kabir-Mokamelkhah, Elaheh; Bahrami-Ahmadi, Amir; Aghili, Negar
2016-01-01
Background: Impairment in quality of life and mental health had been reported in previous studies as a result of musculoskeletal disorders among workers. Mental health is a wide concept and covers different disorders including anxiety, depression, or even decreased quality of life, all of which have considerable impacts on work-related characteristics such as work productivity and absenteeism. The present study aimed at evaluating work-related stress and quality of life among Iranian blue-collar workers of Fars ABFA Company with self-reported low back pain. Methods: In the present study, we focused on low back pain among 451 blue-collar workers and assessed their work-related stress and quality of life status using DASS-21 and the short form questionnaire (SF-36), respectively. The independent-samples t-test was used to compare the quantitative variables, and the chi-square test was used for statistical analysis of the qualitative variables. Results: The mean total score of quality of life among workers with low back pain was significantly lower than in workers without low back pain. The mean work-related stress score was significantly higher in workers with low back pain than in workers without low back pain. The mean quality-of-life subdomain scores in workers with low back pain were significantly lower than in workers without low back pain. Conclusion: Findings of the present study revealed that workers with low back pain had lower quality of life scores and higher work-related stress scores. These findings should be considered in designing preventive programs rather than merely controlling the pain.
Three Experts on Quality Management: Philip B. Crosby, W. Edwards Deming, Joseph M. Juran
1992-07-01
Department of the Navy Office of the Under Secretary of the Navy Total Quality Leadership Office THREE EXPERTS ON QUALITY MANAGEMENT: PHILIP B. CROSBY W...research, as the "price of nonconformance." To aid managers in statistical theory, statistical thinking, and the application tracking the cost of doing...Quality Management emphasizes that the process must become a way of life in Theory of Systems. "A system is a series of the organization. Continuance is
Development and Application of New Quality Model for Software Projects
Karnavel, K.; Dillibabu, R.
2014-01-01
The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects. PMID:25478594
River water quality and pollution sources in the Pearl River Delta, China.
Ouyang, Tingping; Zhu, Zhaoyu; Kuang, Yaoqiu
2005-07-01
Some physicochemical parameters were determined for thirty field water samples collected from different water channels in the Pearl River Delta Economic Zone river system. The analytical results were compared with the environmental quality standards for surface water. Using the SPSS software, statistical analyses were performed to determine the main pollutants of the river water. The main purpose of the present research is to investigate the river water quality and to determine the main pollutants and pollution sources. Furthermore, the research provides some approaches for protecting and improving river water quality. The results indicate that the predominant pollutants are ammonium, phosphorus, and organic compounds. The wastewater discharged from households in urban and rural areas, industrial facilities, and non-point sources from agricultural areas are the main sources of pollution in river water in the Pearl River Delta Economic Zone.
Examining the Quality of IEPs for Young Children with Autism
Ruble, Lisa A.; McGrew, John; Dalrymple, Nancy; Jung, Lee Ann
2011-01-01
The purpose of this study was to develop an Individual Education Program (IEP) evaluation tool based on Individuals with Disabilities Education Act (IDEA) requirements and National Research Council recommendations for children with autism; determine the tool’s reliability; test the tool on a pilot sample of IEPs of young children; and examine associations between IEP quality and school, teacher, and child characteristics. IEPs for 35 students with autism (Mage = 6.1 years; SD = 1.6) from 35 different classrooms were examined. The IEP tool had adequate interrater reliability (ICC = .70). Results identified no statistically significant association between demographics and IEP quality, and IEPs contained relatively clear descriptions of present levels of performance. Weaknesses of IEPs were described and recommendations provided. PMID:20373007
Carroll, Adam J; Badger, Murray R; Harvey Millar, A
2010-07-14
Standardization of analytical approaches and reporting methods via community-wide collaboration can work synergistically with web-tool development to result in rapid community-driven expansion of online data repositories suitable for data mining and meta-analysis. In metabolomics, the inter-laboratory reproducibility of gas-chromatography/mass-spectrometry (GC/MS) makes it an obvious target for such development. While a number of web-tools offer access to datasets and/or tools for raw data processing and statistical analysis, none of these systems are currently set up to act as a public repository by easily accepting, processing and presenting publicly submitted GC/MS metabolomics datasets for public re-analysis. Here, we present MetabolomeExpress, a new File Transfer Protocol (FTP) server and web-tool for the online storage, processing, visualisation and statistical re-analysis of publicly submitted GC/MS metabolomics datasets. Users may search a quality-controlled database of metabolite response statistics from publicly submitted datasets by a number of parameters (e.g., metabolite, species, organ/biofluid). Users may also perform meta-analysis comparisons of multiple independent experiments or re-analyse public primary datasets via user-friendly tools for t-test, principal components analysis, hierarchical cluster analysis and correlation analysis. They may interact with chromatograms, mass spectra and peak detection results via an integrated raw data viewer. Researchers who register for a free account may upload (via FTP) their own data to the server for online processing via a novel raw data processing pipeline. MetabolomeExpress https://www.metabolome-express.org provides a new opportunity for the general metabolomics community to transparently present online the raw and processed GC/MS data underlying their metabolomics publications. 
Transparent sharing of these data will allow researchers to assess data quality and draw their own insights from published metabolomics datasets.
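One of the re-analysis tools listed above, the t-test, can be sketched as a self-contained Welch's t computation for two independent groups of metabolite responses. The numbers below are invented for illustration and are not from any MetabolomeExpress dataset:

```python
from statistics import mean, variance

def welch_t(x, y):
    """Welch's t statistic and degrees of freedom for two independent samples."""
    nx, ny = len(x), len(y)
    vx, vy = variance(x), variance(y)   # sample variances (n - 1 denominator)
    se = (vx / nx + vy / ny) ** 0.5
    t = (mean(x) - mean(y)) / se
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = (vx / nx + vy / ny) ** 2 / (
        (vx / nx) ** 2 / (nx - 1) + (vy / ny) ** 2 / (ny - 1))
    return t, df

# Hypothetical normalized metabolite responses, control vs. treatment
control = [1.0, 1.2, 0.9, 1.1]
treated = [2.0, 2.2, 1.9, 2.1]
t, df = welch_t(control, treated)
```

Welch's version is preferred over the pooled-variance t-test when group variances may differ, a common situation across metabolomics experiments.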
Intelligent Systems Approaches to Product Sound Quality Analysis
NASA Astrophysics Data System (ADS)
Pietila, Glenn M.
As a product market becomes more competitive, consumers become more discriminating in the way in which they differentiate between engineered products. The consumer often makes a purchasing decision based on the sound emitted from the product during operation by using the sound to judge quality or annoyance. Therefore, in recent years, many sound quality analysis tools have been developed to evaluate the consumer preference as it relates to a product sound and to quantify this preference based on objective measurements. This understanding can be used to direct a product design process in order to help differentiate the product from competitive products or to establish an impression on consumers regarding a product's quality or robustness. The sound quality process is typically a statistical tool that is used to model subjective preference, or merit score, based on objective measurements, or metrics. In this way, new product developments can be evaluated in an objective manner without the laborious process of gathering a sample population of consumers for subjective studies each time. The most common model used today is the Multiple Linear Regression (MLR), although recently non-linear Artificial Neural Network (ANN) approaches are gaining popularity. This dissertation will review publicly available published literature and present additional intelligent systems approaches that can be used to improve on the current sound quality process. The focus of this work is to address shortcomings in the current paired comparison approach to sound quality analysis. This research will propose a framework for an adaptive jury analysis approach as an alternative to the current Bradley-Terry model. The adaptive jury framework uses statistical hypothesis testing to focus on sound pairings that are most interesting and is expected to address some of the restrictions required by the Bradley-Terry model. 
It will also provide a more amenable framework for an intelligent systems approach. Next, an unsupervised jury clustering algorithm is used to identify and classify subgroups within a jury who have conflicting preferences. In addition, a nested Artificial Neural Network (ANN) architecture is developed to predict subjective preference based on objective sound quality metrics, in the presence of non-linear preferences. Finally, statistical decomposition and correlation algorithms are reviewed that can help an analyst establish a clear understanding of the variability of the product sounds used as inputs into the jury study and to identify correlations between preference scores and sound quality metrics in the presence of non-linearities.
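The Bradley-Terry model referenced above can be fitted with the classic Zermelo/MM iteration. This is a generic sketch with hypothetical paired-comparison counts, not the dissertation's adaptive jury method:

```python
def bradley_terry(wins, iterations=500):
    """MM (Zermelo) fit of Bradley-Terry strengths; wins[i][j] = i preferred over j."""
    n = len(wins)
    p = [1.0] * n
    for _ in range(iterations):
        updated = []
        for i in range(n):
            w_i = sum(wins[i])                      # total wins for item i
            denom = sum((wins[i][j] + wins[j][i]) / (p[i] + p[j])
                        for j in range(n) if j != i)
            updated.append(w_i / denom if denom > 0 else p[i])
        total = sum(updated)
        p = [v * n / total for v in updated]        # normalize to sum to n
    return p

# Hypothetical paired-comparison wins among three product sounds A, B, C
wins = [[0, 8, 9],
        [2, 0, 7],
        [1, 3, 0]]
strengths = bradley_terry(wins)
```

The fitted strengths recover the expected preference ordering A > B > C; larger strength ratios correspond to more lopsided predicted preferences in future pairings.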
Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.
Westgard, James O; Westgard, Sten A
2017-03-01
Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
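The sigma-metric point estimate the abstract refers to is conventionally computed as sigma = (TEa − |bias|) / CV, with all quantities expressed in percent at the medical decision concentration. A minimal sketch, with illustrative numbers that are not from the paper:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma-metric at a medical decision concentration.

    tea_pct  : allowable total error (%)
    bias_pct : observed bias of the method (%)
    cv_pct   : observed imprecision as a coefficient of variation (%)
    """
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical example: 10% allowable total error, 2% bias, 2% CV.
sigma = sigma_metric(10.0, 2.0, 2.0)   # a "four-sigma" process
```

Higher sigma indicates lower risk of medically important errors and permits simpler SQC rules with fewer control measurements.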
Danish; Baloch, Muhammad Awais
2018-03-01
The focus of the present research work is to investigate the dynamic relationship between economic growth, road transport energy consumption, and environmental quality. To this end, we rely on time series data for the period 1971 to 2014 in the context of Pakistan. Using sulfur dioxide (SO2) emissions from the transport sector as a new proxy for environmental quality, the present work employs the ARDL time series technique, which allows energy consumption from the transport sector, urbanization, and road infrastructure to be linked by symmetric relationships with SO2 emissions and economic growth. From the statistical results, we confirm that road infrastructure boosts economic growth. At the same time, road infrastructure and urbanization hamper environmental quality and accelerate the emission of SO2 into the atmosphere. Furthermore, economic growth has a diminishing negative impact on total SO2 emissions. Moreover, we did not find any evidence of the expected role of transport energy consumption in SO2 emissions. The results suggest that care should be taken in the expansion of road infrastructure and that green-city policies and planning are required in the country.
NASA Astrophysics Data System (ADS)
Roy, P. K.; Pal, S.; Banerjee, G.; Biswas Roy, M.; Ray, D.; Majumder, A.
2014-12-01
Rivers are considered among the main sources of freshwater all over the world; hence the analysis and maintenance of this water resource is globally considered a matter of major concern. This paper deals with the assessment of the surface water quality of the Ichamati River using multivariate statistical techniques. Eight distinct surface water quality observation stations were located and samples were collected. Statistical techniques were applied to the physico-chemical parameters and siltation depth of the collected samples. Cluster analysis was performed to determine the relations between surface water quality and siltation depth of the Ichamati River, and multiple regression and mathematical equation modeling were used to characterize the river's surface water quality on the basis of physico-chemical parameters. It was found that the surface water quality of the downstream river differed from that of the upstream. The analysis of the water quality parameters of the Ichamati River clearly indicates a high pollution load on the river water, which can be attributed to agricultural discharge, tidal effects, and soil erosion. The results further reveal that water quality degraded as the depth of siltation increased.
Petroleum marketing monthly, May 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-05-26
The Petroleum Marketing Monthly (PMM) provides information and statistical data on a variety of crude oils and refined petroleum products. The publication presents statistics on crude oil costs and refined petroleum product sales for use by industry, government, private sector analysts, educational institutions, and consumers. Data on crude oil include the domestic first purchase price and the f.o.b. and landed cost of imported crude oil; petroleum product sales data include motor gasoline, distillates, residuals, aviation fuels, kerosene, and propane. The Petroleum Marketing Division, Office of Oil and Gas, Energy Information Administration ensures the accuracy, quality, and confidentiality of the data published in the Petroleum Marketing Monthly.
Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril
2014-07-01
Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement. The aim was to use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. This was an observational descriptive study in a French maternity unit in the Rhône-Alpes region. Quarterly clinical audit meetings were held to analyse all cases of severe postpartum haemorrhage after vaginal delivery and provide feedback on quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries (p<0.001). Since 2010, the quarterly rate of severe PPH has not exceeded the upper control limit, that is, it has not been out of statistical control. The proportion of cases managed consistently with the guidelines increased for all of the main components. Continuous quality improvement efforts began seven years ago and used, among other tools, statistical process control charts; during this period, the prevalence of severe postpartum haemorrhage after vaginal delivery was reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
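A quarterly rate monitored against control limits, as described above, is typically handled with a p-chart, whose 3-sigma limits are p̄ ± 3·sqrt(p̄(1−p̄)/n). The counts below are invented for illustration and are not the study's data:

```python
import math

def p_chart_limits(events, totals):
    """Center line and 3-sigma limits for a proportion (p) control chart.

    events[i] / totals[i] is the quarterly rate (e.g. severe PPH per
    vaginal delivery); the limits vary with each quarter's denominator.
    """
    p_bar = sum(events) / sum(totals)          # overall center line
    limits = []
    for n in totals:
        s = 3 * math.sqrt(p_bar * (1 - p_bar) / n)
        limits.append((max(0.0, p_bar - s), p_bar + s))  # LCL clipped at 0
    return p_bar, limits

# Hypothetical quarterly counts showing a declining rate:
events = [12, 10, 9, 7, 6, 6, 5, 4]
totals = [1000] * 8
p_bar, limits = p_chart_limits(events, totals)
out_of_control = [i for i, (e, n) in enumerate(zip(events, totals))
                  if not (limits[i][0] <= e / n <= limits[i][1])]
```

Quarters whose observed rate falls outside the limits would be flagged for audit; a sustained run below the center line is the signature of a real improvement rather than common-cause variation.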
Mindfulness for palliative care patients. Systematic review.
Latorraca, Carolina de Oliveira Cruz; Martimbianco, Ana Luiza Cabrera; Pachito, Daniela Vianna; Pacheco, Rafael Leite; Riera, Rachel
2017-12-01
Nineteen million adults worldwide are in need of palliative care, and of those who have access to it, 80% fail to receive efficient management of symptoms. The objective was to assess the effectiveness and safety of mindfulness meditation for palliative care patients. We searched CENTRAL, MEDLINE, Embase, LILACS, PEDro, CINAHL, PsycINFO, Opengrey, ClinicalTrials.gov and WHO-ICTRP, with no restriction on language, status or date of publication. We considered randomised clinical trials (RCTs) comparing any mindfulness meditation scheme vs any comparator for palliative care. The Cochrane Risk of Bias (RoB) table was used for assessing the methodological quality of RCTs. Screening, data extraction and methodological assessments were performed by two reviewers. Mean differences (MD) with 95% confidence intervals (95% CI) were used to estimate effect size. Quality of evidence was appraised by GRADE. Four RCTs with 234 participants were included. All studies presented high risk of bias in at least one RoB criterion. We assessed 4 comparisons, but only 2 studies showed a statistically significant difference for at least one outcome: 1. mindfulness meditation (eight weeks, one session/week, daily individual practice) vs control, with a statistically significant difference in favour of control for quality of life - physical aspects; 2. mindfulness meditation (single 5-minute session) vs control, with a benefit in favour of mindfulness for the stress outcome at both time points. None of the included studies analysed safety and harms outcomes. Although two studies showed a statistically significant difference, only one showed effectiveness of mindfulness meditation in improving perceived stress; this study focused on a single 5-minute session of mindfulness for adult cancer patients in palliative care, but it was considered to be at high risk of bias. Other schemes of mindfulness meditation did not show benefit in any outcome evaluated (low and very low quality evidence).
© 2017 John Wiley & Sons Ltd.
Cotton genotypes selection through artificial neural networks.
Júnior, E G Silva; Cardoso, D B O; Reis, M C; Nascimento, A F O; Bortolin, D I; Martins, M R; Sousa, L B
2017-09-27
Breeding programs currently use statistical analysis to assist in the identification of superior genotypes at various stages of a cultivar's development. Unlike these analyses, the computational intelligence approach has been little explored in the genetic improvement of cotton. Thus, this study was carried out with the objective of presenting the use of artificial neural networks as auxiliary tools in cotton breeding to improve fiber quality. To demonstrate the applicability of this approach, the research used evaluation data from 40 genotypes. In order to classify the genotypes for fiber quality, the artificial neural networks were trained with replicate data of 20 cotton genotypes evaluated in the 2013/14 and 2014/15 harvests for fiber length, length uniformity, fiber strength, micronaire index, elongation, short fiber index, maturity index, reflectance degree, and fiber quality index. The quality index was estimated as a weighted average of the score (1 to 5) determined for each HVI characteristic evaluated, according to industry standards. The artificial neural networks showed a high capacity for correct classification of the 20 selected genotypes based on the fiber quality index: when fiber length was used together with the short fiber index, fiber maturity, and micronaire index, the artificial neural networks gave better results than when fiber length alone or the previous associations were used. It was also observed that submitting the mean data of new genotypes to neural networks trained with replicate data provides better genotype classification. These results indicate that artificial neural networks have great potential for use at the different stages of a cotton breeding program aimed at improving the fiber quality of future cultivars.
Electric Power Quarterly, January-March 1983
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-07-01
The Electric Power Quarterly (EPQ), a new series in the EIA statistical publications, provides electric utilities' plant-level information about the cost, quantity, and quality of fossil fuel receipts, net generation, fuel consumption, and fuel stocks. The EPQ contains monthly data and quarterly totals for the reporting quarter. The data presented in this report were collected and published by the EIA to fulfill its responsibilities as specified in the Federal Energy Administration Act of 1974 (P.L. 93-275). This edition of the EPQ contains monthly data for the first quarter of 1983. In this report, data collected on Form EIA-759 regarding electric utilities' net generation, fuel consumption, and fuel stocks are presented for the first time on a plant-by-plant basis. In addition, quantity, cost, and quality of fossil fuel receipts collected on the Federal Energy Regulatory Commission (FERC) Form 423 are presented on a plant-by-plant basis.
NASA Technical Reports Server (NTRS)
Butler, C. M.; Hogge, J. E.
1978-01-01
Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards, or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of the data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Further software was developed for processing air quality data, including time series analysis and goodness-of-fit tests, and to (1) calculate a larger number of daily statistical measures of location, along with daily, monthly, and yearly measures of location, dispersion, skewness, and kurtosis, (2) decompose the extended time series model, and (3) perform goodness-of-fit tests. The computer program is described, documented, and illustrated by examples. Recommendations are made for continued development of research on processing air quality data.
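The daily measures of location and simple correlation coefficients described above can be sketched with the Python standard library; the data values and dictionary layout here are hypothetical, not from the original system:

```python
import math
import statistics

def daily_summaries(readings):
    """Daily measures of location for an air-quality series.

    readings maps a day label to its list of hourly values and returns
    per-day mean, median, and maximum (illustrative set of statistics).
    """
    return {day: {"mean": statistics.fmean(v),
                  "median": statistics.median(v),
                  "max": max(v)}
            for day, v in readings.items()}

def pearson_r(x, y):
    """Simple (Pearson) correlation coefficient between two parameters."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

hourly = {"1972-07-01": [30.0, 42.0, 38.0],
          "1972-07-02": [25.0, 31.0, 28.0]}
stats_by_day = daily_summaries(hourly)
r = pearson_r([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])   # perfectly linear series
```

A scatter diagram of two parameters with r near ±1 would plot as a near-straight line, which is the visual check items (3) and (4) above pair together.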
Dexter, Franklin; Shafer, Steven L
2017-03-01
Considerable attention has been drawn to poor reproducibility in the biomedical literature. One explanation is inadequate reporting of statistical methods by authors and inadequate assessment of statistical reporting and methods during peer review. In this narrative review, we examine scientific studies of several well-publicized efforts to improve statistical reporting, along with several retrospective assessments of the impact of these efforts. These studies show, first, that instructions to authors and statistical checklists are not sufficient; no findings suggested that either improves the quality of statistical methods and reporting. Second, even basic statistics, such as power analyses, are frequently missing or incorrectly performed. Third, statistical review is needed for all papers that involve data analysis. A consistent finding in the studies was that nonstatistical reviewers (eg, "scientific reviewers") and journal editors generally assess statistical quality poorly. We finish by discussing our experience with statistical review at Anesthesia & Analgesia from 2006 to 2016.
Using a statistical process control chart during the quality assessment of cancer registry data.
Myles, Zachary M; German, Robert R; Wilson, Reda J; Wu, Manxia
2011-01-01
Statistical process control (SPC) charts may be used to detect acute variations in the data while simultaneously evaluating unforeseen aberrations that may warrant further investigation by the data user. Using cancer stage data captured by the Summary Stage 2000 (SS2000) variable, we sought to present a brief report highlighting the utility of the SPC chart during the quality assessment of cancer registry data. Using a county-level caseload for the diagnosis period of 2001-2004 (n=25,648), we found the overall variation of the SS2000 variable to be in control during diagnosis years of 2001 and 2002, exceeded the lower control limit (LCL) in 2003, and exceeded the upper control limit (UCL) in 2004; in situ/localized stages were in control throughout the diagnosis period, regional stage exceeded UCL in 2004, and distant stage exceeded the LCL in 2001 and the UCL in 2004. Our application of the SPC chart with cancer registry data illustrates that the SPC chart may serve as a readily available and timely tool for identifying areas of concern during the data collection and quality assessment of central cancer registry data.
Electric power quarterly, July--September 1988
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1989-01-19
The Electric Power Quarterly (EPQ) is prepared by the Electric Power Division; Office of Coal, Nuclear, Electric and Alternate Fuels; Energy Information Administration (EIA). The EPQ is designed to provide energy decisionmakers with accurate and timely generation and fuel cost and quality information on a plant-by-plant basis. This publication is designed for use by electric utilities, fuel suppliers, consumers, educational institutions, and government in recognition of the importance of energy planning. The EPQ presents monthly summaries of electric utility statistics at the national, Census division, state, company, and plant levels on the following subjects: quantity of fuel; cost of fuel; quality of fuel; net generation; fuel consumption; and fuel stocks. In addition, the EPQ presents a quarterly summary of reported major disturbances and unusual occurrences. 1 fig., 15 tabs.
Operator agency in process intervention: tampering versus application of tacit knowledge
NASA Astrophysics Data System (ADS)
Van Gestel, P.; Pons, D. J.; Pulakanam, V.
2015-09-01
Statistical process control (SPC) theory takes a negative view of adjustment of process settings, which is termed tampering. In contrast, quality and lean programmes actively encourage operators toward acts of intervention and personal agency in the improvement of production outcomes. This creates a conflict that requires operator judgement: how does one differentiate between unnecessary tampering and needful intervention? A further difficulty is that operators apply tacit knowledge to such judgements. There is a need to determine where in a given production process operators are applying tacit knowledge, and whether this is hindering or aiding quality outcomes. The work involved the conjoint application of systems engineering, statistics, and knowledge management principles in the context of a case study. Systems engineering was used to create a functional model of a real plant. Actual plant data were analysed with the statistical methods of ANOVA, feature selection, and link analysis. This identified the variables to which the output quality was most sensitive. These key variables were mapped back to the functional model. Fieldwork was then directed to those areas to prospect for operator judgement activities. A natural conversational approach was used to determine where and how operators were applying judgement; this contrasts with the interrogative approach of conventional knowledge management. Data are presented for a case study of a meat rendering plant. The results identify specific areas where operators' tacit knowledge and mental models contribute to quality outcomes and untangle the motivations behind their agency. Also evident is how novice and expert operators apply their knowledge differently. Novices were focussed on meeting throughput objectives, and their incomplete understanding of the plant characteristics led them, in certain situations, to inadvertently sacrifice quality in the pursuit of productivity.
Operators' responses to the plant are affected by their individual mental models of the plant, which differ between operators and have variable validity. Their behaviour is also affected by differing interpretations of how their personal agency should be applied to the achievement of production objectives. The methodology developed here is an integration of systems engineering, statistical analysis, and knowledge management. It shows how to determine where in a given production process the operator intervention is occurring, how it affects quality outcomes, and what tacit knowledge operators are using. It thereby assists the continuous quality improvement processes in a different way to SPC. A second contribution is the provision of a novel methodology for knowledge management, one that circumvents the usual codification barriers to knowledge management.
The purpose of this report is to describe the outputs of the Data Quality Objectives (DQO) process and discussions about developing a statistical design that will be used to implement the research study of recreational beach waters.
[Development of bakery products for greater adult consumption based on wheat and rice flour].
Reyes Aguilar, María José; Palomo, Patricia de; Bressani, Ricardo
2004-09-01
The present investigation was developed as a contribution to the food and nutrition of Guatemala's elderly. Its main objective was to evaluate the chemical, nutritional, and sensory quality of bread prepared by partially substituting wheat flour with rice flour. Substitutions of wheat flour with rice flour of 15, 20, 30, 40, 50, and 60% were evaluated. Differences with the control (100% wheat bread) were found during the preparation process, as well as in texture, volume, height, weight, and specific volume. Important effects on dough handling were noted, specifically in the 40, 50, and 60% rice breads, and a sandy texture was found in the breads with higher rice levels. The bread protein quality increased with the level of substitution; however, the difference in protein quality between the wheat bread and the bread with 60% rice flour did not achieve statistical significance. Based on a statistical analysis of the physical properties, the breads with 30 and 40% rice flour were selected, and through a preference test between these last two, the 30% rice flour bread was selected as the sample best suited to the present study's purposes. This bread did not differ from wheat bread in many nutritional parameters, and in others it proved superior. Each serving of bread weighs 80 grams (2 slices) and contributes adequate quantities of calories, protein, and sodium, although a little less dietary fiber than 100% wheat bread.
21 CFR 820.250 - Statistical techniques.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...
21 CFR 820.250 - Statistical techniques.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...
Jaynes, M.L.
1994-01-01
Hydrologic, water-quality, and meteorologic data were collected from January 1993 through March 1994 as part of a water-quality investigation of the Upper Catawba River Basin, North Carolina. Specific objectives of the investigation were to characterize the water quality of Rhodhiss Lake, Lake Hickory, and three tributary streams, and to calibrate hydrodynamic water-quality models for the two reservoirs. Sampling locations included 11 sites in Rhodhiss Lake, 14 sites in Lake Hickory, and 3 tributary sites. Tributary sites were located at Lower Creek upstream from Rhodhiss Lake and at Upper Little River and Middle Little River upstream from Lake Hickory. During 21 sampling visits, specific conductance, pH, water temperature, dissolved-oxygen concentration, and water transparency were measured at all sampling locations. Water samples were collected for analysis of biochemical oxygen demand, fecal coliform bacteria, hardness, alkalinity, total and volatile suspended solids, suspended sediment, nutrients, total organic carbon, chlorophyll, iron, calcium, and magnesium from three sites in each reservoir and from the three tributary sites. Chemical and particle-size analyses of bottom material from Rhodhiss Lake and Lake Hickory were performed once during the study. At selected locations, automated instruments recorded water level, streamflow, water temperature, solar radiation, and air temperature at 15-minute intervals throughout the study. Hydrologic data presented in the report include monthly water-level statistics and daily mean values of discharge. Diagrams, tables, and statistical summaries of water-quality data are provided. Meteorologic data in the report include monthly precipitation, and daily mean values of solar radiation and air temperature.
Clustering of reads with alignment-free measures and quality values.
Comin, Matteo; Leoni, Andrea; Schimd, Michele
2015-01-01
The data volume generated by Next-Generation Sequencing (NGS) technologies is growing at a pace that is now challenging the storage and data processing capacities of modern computer systems. In this context an important aspect is the reduction of data complexity by collapsing redundant reads into a single cluster to improve the run time, memory requirements, and quality of post-processing steps like assembly and error correction. Several alignment-free measures, based on k-mer counts, have been used to cluster reads. Quality scores produced by NGS platforms are fundamental for various analyses of NGS data, such as read mapping and error detection. Moreover, future-generation sequencing platforms will produce long reads but with a large number of erroneous bases (up to 15%). In this scenario it will be fundamental to exploit quality value information within the alignment-free framework. To the best of our knowledge this is the first study that incorporates quality value information and k-mer counts, in the context of alignment-free measures, for the comparison of read data. Based on these principles, in this paper we present a family of alignment-free measures called D(q)-type. A set of experiments on simulated and real read data confirms that the new measures are superior to other classical alignment-free statistics, especially when erroneous reads are considered. Results on de novo assembly and metagenomic read classification also show that the introduction of quality values improves over standard alignment-free measures. These statistics are implemented in a software called QCluster (http://www.dei.unipd.it/~ciompin/main/qcluster.html).
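The core idea of weighting k-mer counts by quality values can be sketched as follows: each k-mer occurrence contributes the probability that all of its bases were called correctly, rather than a hard count of 1. This is only an illustration of the principle; the D(q)-type measures implemented in QCluster have their own exact formulation.

```python
def expected_kmer_counts(read, probs, k):
    """Quality-weighted k-mer counts for one read.

    probs[i] is the probability that base read[i] was called correctly,
    derived from the Phred score as p = 1 - 10 ** (-Q / 10).
    """
    counts = {}
    for i in range(len(read) - k + 1):
        kmer = read[i:i + k]
        w = 1.0
        for p in probs[i:i + k]:   # P(all k bases correct), assuming independence
            w *= p
        counts[kmer] = counts.get(kmer, 0.0) + w
    return counts

# Hypothetical short read with one low-quality base at the end:
read = "ACGTACG"
probs = [0.99, 0.99, 0.9, 0.99, 0.99, 0.99, 0.5]
counts = expected_kmer_counts(read, probs, 3)
```

Two reads can then be compared by a distance between their weighted count vectors; low-quality bases automatically contribute less to the comparison.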
Bench to bedside: the quest for quality in experimental stroke research.
Dirnagl, Ulrich
2006-12-01
Over the past decades, great progress has been made in clinical as well as experimental stroke research. Disappointingly, however, hundreds of clinical trials testing neuroprotective agents have failed despite efficacy in experimental models. Recently, several systematic reviews have exposed a number of important deficits in the quality of preclinical stroke research. Many of the issues raised in these reviews are not specific to experimental stroke research, but apply to studies of animal models of disease in general. The aim of this article is to review some quality-related sources of bias with a particular focus on experimental stroke research. Weaknesses discussed include, among others, low statistical power (and hence poor reproducibility), defects in statistical analysis, lack of blinding and randomization, lack of quality-control mechanisms, deficiencies in reporting, and negative publication bias. Although quantitative evidence for quality problems is at present restricted to preclinical stroke research, I have also included some quality-related sources of bias that have not been systematically studied, to spur discussion and in the hope that they will be subjected to meta-analysis in the near future. Importantly, these may also be relevant to mechanism-driven basic stroke research. I propose that a number of rather simple measures may improve the reproducibility of experimental results, as well as the success of the step from bench to bedside in stroke research. However, the ultimate proof of this must await successful phase III stroke trials built on basic research conforming to the criteria put forward in this article.
Yan, Binjun; Fang, Zhonghua; Shen, Lijuan; Qu, Haibin
2015-01-01
The batch-to-batch quality consistency of herbal drugs has always been an important issue. The objective was to propose a methodology for batch-to-batch quality control based on HPLC-MS fingerprints and a process knowledgebase. The extraction process of Compound E-jiao Oral Liquid was taken as a case study. After establishing the HPLC-MS fingerprint analysis method, the fingerprints of the extract solutions produced under normal and abnormal operation conditions were obtained. Multivariate statistical models were built for fault detection, and a discriminant analysis model was built using the probabilistic discriminant partial-least-squares method for fault diagnosis. Based on multivariate statistical analysis, process knowledge was acquired and the cause-effect relationship between process deviations and quality defects was revealed. The quality defects were detected successfully by multivariate statistical control charts, and the types of process deviations were diagnosed correctly by discriminant analysis. This work has demonstrated the benefits of combining HPLC-MS fingerprints, process knowledge, and multivariate analysis for the quality control of herbal drugs. Copyright © 2015 John Wiley & Sons, Ltd.
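Multivariate statistical control charts for fault detection of this kind are commonly based on a Hotelling T² statistic computed against in-control batches. The sketch below assumes a plain covariance-based T² (the study's exact models may differ) and uses synthetic stand-ins for fingerprint peak areas:

```python
import numpy as np

def hotelling_t2(X_train, x_new):
    """Hotelling T-squared statistic of a new observation against
    in-control training data.

    X_train: rows = normal batches, columns = variables (e.g. peak
    areas from an HPLC-MS fingerprint); x_new: one new batch.
    """
    mu = X_train.mean(axis=0)
    S = np.cov(X_train, rowvar=False)      # sample covariance of normal batches
    d = x_new - mu
    return float(d @ np.linalg.inv(S) @ d)

rng = np.random.default_rng(0)
X_normal = rng.normal(loc=10.0, scale=1.0, size=(30, 3))   # 30 normal batches
in_spec = hotelling_t2(X_normal, np.array([10.0, 10.0, 10.0]))
faulty = hotelling_t2(X_normal, np.array([16.0, 4.0, 10.0]))  # gross deviation
```

A batch whose T² exceeds a control limit (derived from an F-distribution) would be flagged; the faulty batch here scores far above the in-spec one.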
Scientific, statistical, practical, and regulatory considerations in design space development.
Debevec, Veronika; Srčič, Stanko; Horvat, Matej
2018-03-01
The quality by design (QbD) paradigm guides the pharmaceutical industry towards improved understanding of products and processes, and at the same time facilitates a high degree of manufacturing and regulatory flexibility throughout the establishment of the design space. This review article presents scientific, statistical and regulatory considerations in design space development. All key development milestones, starting with planning, selection of factors, experimental execution, data analysis, model development and assessment, verification, and validation, and ending with design space submission, are presented and discussed. The focus is especially on frequently ignored topics, like management of factors and CQAs that will not be included in experimental design, evaluation of risk of failure on design space edges, or modeling scale-up strategy. Moreover, development of a design space that is independent of manufacturing scale is proposed as the preferred approach.
ERIC Educational Resources Information Center
National Center for Health Statistics (DHHS/PHS), Hyattsville, MD.
This report summarizes current knowledge and research on the quality and reliability of death rates by race and Hispanic origin in official mortality statistics of the United States produced by the National Center for Health Statistics (NCHS). It provides a quantitative assessment of bias in death rates by race and Hispanic origin and identifies…
NRC TLD Direct Radiation Monitoring Network. Progress report, October--December 1996
DOE Office of Scientific and Technical Information (OSTI.GOV)
Struckmeyer, R.
This report presents the results of the NRC Direct Radiation Monitoring Network for the fourth quarter of 1996. It provides the ambient radiation levels measured in the vicinity of 74 sites throughout the United States. In addition, it describes the equipment used, monitoring station selection criteria, characterization of the dosimeter response, calibration procedures, statistical methods, intercomparison, and quality assurance program. 3 figs., 4 tabs.
Code of Federal Regulations, 2014 CFR
2014-07-01
... comparative data to other methods and SRM materials are presented in Reference 23 of Section 16.0. 13... Plasma, Anal. Chem. 52:1965, 1980. 20. Deming, S.N. and S.L. Morgan. Experimental Design for Quality and... Statistical Designs, 9941 Rowlett, Suite 6, Houston, TX 77075, 1989. 21. Winefordner, J.D., Trace Analysis...
Code of Federal Regulations, 2013 CFR
2013-07-01
... comparative data to other methods and SRM materials are presented in Reference 23 of Section 16.0. 13... Plasma, Anal. Chem. 52:1965, 1980. 20. Deming, S.N. and S.L. Morgan. Experimental Design for Quality and... Statistical Designs, 9941 Rowlett, Suite 6, Houston, TX 77075, 1989. 21. Winefordner, J.D., Trace Analysis...
Code of Federal Regulations, 2012 CFR
2012-07-01
... comparative data to other methods and SRM materials are presented in Reference 23 of Section 16.0. 13... Plasma, Anal. Chem. 52:1965, 1980. 20. Deming, S.N. and S.L. Morgan. Experimental Design for Quality and... Statistical Designs, 9941 Rowlett, Suite 6, Houston, TX 77075, 1989. 21. Winefordner, J.D., Trace Analysis...
ERIC Educational Resources Information Center
Vermont Inst. for Self-Reliance, Rutland.
This guide provides a description of Responsive Text (RT), a method for presenting job-relevant information within a computer-based support system. A summary of what RT is and why it is important is provided first. The first section of the guide provides a brief overview of what research tells about the reading process and how the general design…
Spectral signature verification using statistical analysis and text mining
NASA Astrophysics Data System (ADS)
DeCoster, Mallory E.; Firpi, Alexe H.; Jacobs, Samantha K.; Cone, Shelli R.; Tzeng, Nigel H.; Rodriguez, Benjamin M.
2016-05-01
In the spectral science community, numerous spectral signatures are stored in databases representing many sample materials collected from a variety of spectrometers and spectroscopists. Because of the variety and variability of the spectra that comprise many spectral databases, it is necessary to establish a metric for validating the quality of spectral signatures. This has been an area of great discussion and debate in the spectral science community. This paper discusses a method that independently validates two different aspects of a spectral signature, the textual meta-data and the numerical spectral data, to arrive at a final qualitative assessment. Results associated with the spectral data stored in the Signature Database (SigDB) are presented. The numerical data comprising a sample material's spectrum are validated on the basis of statistical properties derived from an ideal population set. The quality of the test spectrum is ranked by a spectral angle mapper (SAM) comparison against the mean spectrum derived from the population set. Additionally, the contextual data of a test spectrum are qualitatively analyzed using lexical-analysis text mining. This technique analyzes the syntax of the meta-data to uncover local learning patterns and trends within the spectral data that are indicative of the test spectrum's quality. Text mining has been implemented successfully in security (text encryption/decryption), biomedical, and marketing applications. The text-mining lexical-analysis algorithm is trained on the meta-data patterns of a subset of high- and low-quality spectra, in order to have a model to apply to the entire SigDB data set. The statistical and textual methods combine to assess the quality of a test spectrum existing in a database without the need for an expert user.
This method has been compared to other validation methods accepted by the spectral science community, and has provided promising results when a baseline spectral signature is present for comparison. The proposed spectral validation method is described from both a practical and an analytical perspective.
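The spectral angle mapper (SAM) comparison underlying the numerical validation is a standard angular measure between spectra; a minimal sketch (the population spectra below are made up for illustration, not taken from SigDB):

```python
import numpy as np

def spectral_angle(test: np.ndarray, reference: np.ndarray) -> float:
    """Spectral angle (radians) between a test spectrum and a reference
    spectrum; smaller angles indicate closer spectral shape."""
    cos_theta = np.dot(test, reference) / (
        np.linalg.norm(test) * np.linalg.norm(reference)
    )
    # Clip to guard against round-off pushing cos_theta just outside [-1, 1].
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Hypothetical population of spectra for one material; rank a test spectrum
# against the mean spectrum of the population set.
population = np.array([[0.10, 0.42, 0.80],
                       [0.12, 0.40, 0.78],
                       [0.11, 0.44, 0.82]])
mean_spectrum = population.mean(axis=0)
angle = spectral_angle(np.array([0.11, 0.41, 0.79]), mean_spectrum)
```

A test spectrum with a small angle to the population mean would rank as higher quality under such a scheme; the angle is insensitive to overall scaling of the spectrum.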
Mapping air quality zones for coastal urban centers.
Freeman, Brian; Gharabaghi, Bahram; Thé, Jesse; Munshed, Mohammad; Faisal, Shah; Abdullah, Meshal; Al Aseed, Athari
2017-05-01
This study presents a new method that incorporates modern air dispersion models, allowing local terrain and land-sea breeze effects to be considered along with political and natural boundaries for more accurate mapping of air quality zones (AQZs) for coastal urban centers. The method uses local coastal wind patterns and key urban air pollution sources in each zone to calculate air pollutant concentration statistics more accurately. The new approach distributes virtual air pollution sources within each small grid cell of an area of interest and runs a puff dispersion model over a full year of 1-hr prognostic weather data. The difference in wind patterns between coastal and inland areas produces significantly different skewness (S) and kurtosis (K) statistics for the annually averaged pollutant concentrations at ground-level receptor points in each grid cell. Plotting the S-K data highlights the grouping of sources predominantly impacted by coastal winds versus inland winds. The application of the new method is demonstrated through a case study for the nation of Kuwait, developing new AQZs to support local air management programs. The zone boundaries established by the S-K method were validated in three ways: the MM5 and WRF prognostic meteorological data used in the air dispersion modeling were compared; a support vector machine classifier was trained to check the results of the graphical classification method; and the final zones were compared with data collected from Earth-observation satellites to confirm the locations of high-exposure-risk areas. The resulting AQZs are more accurate and support efficient management strategies for air quality compliance targets affected by local coastal microclimates. A novel method to determine air quality zones in coastal urban areas is introduced using skewness (S) and kurtosis (K) statistics calculated from gridded concentration results of air dispersion models.
The method identifies land-sea breeze effects that can be used to manage local air quality in areas of similar microclimates.
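The S-K coordinates that drive the zone grouping are ordinary sample moments; a minimal sketch using synthetic concentration series (the distributions below are illustrative stand-ins, not the study's model output):

```python
import numpy as np

# Hypothetical hourly ground-level concentrations at two grid-cell receptors;
# a real application would use a full year of 1-hr puff-model output per cell.
rng = np.random.default_rng(0)
coastal_like = rng.lognormal(mean=0.0, sigma=0.9, size=8760)  # skewed, heavy-tailed
inland_like = rng.normal(loc=3.0, scale=0.5, size=8760)       # near-symmetric

def sk_point(conc):
    """Skewness and excess kurtosis of a receptor's concentration series,
    the coordinates used to group grid cells in the S-K plane."""
    z = (conc - conc.mean()) / conc.std()
    return float((z ** 3).mean()), float((z ** 4).mean() - 3.0)

s_coastal, k_coastal = sk_point(coastal_like)
s_inland, k_inland = sk_point(inland_like)
```

Receptors dominated by intermittent land-sea-breeze transport would plot far from near-symmetric inland receptors in the S-K plane, which is the separation the graphical classification exploits.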
Using Quality Management Tools to Enhance Feedback from Student Evaluations
ERIC Educational Resources Information Center
Jensen, John B.; Artz, Nancy
2005-01-01
Statistical tools found in the service quality assessment literature (the T² statistic combined with factor analysis) can enhance the feedback instructors receive from student ratings. T² examines variability across multiple sets of ratings to isolate individual respondents with aberrant response…
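As a hedged illustration of the T² screening described above, one can compute Hotelling's T² for each respondent against the class mean and covariance (the ratings matrix and the pseudo-inverse safeguard are assumptions for the sketch, not the article's exact procedure):

```python
import numpy as np

def t2_scores(ratings: np.ndarray) -> np.ndarray:
    """Hotelling's T^2 for each respondent's vector of item ratings,
    relative to the class mean and covariance; large values flag
    respondents whose rating pattern is aberrant."""
    mean = ratings.mean(axis=0)
    cov = np.cov(ratings, rowvar=False)
    cov_inv = np.linalg.pinv(cov)  # pseudo-inverse guards against singular covariance
    diff = ratings - mean
    # Row-wise quadratic form: diff[i] @ cov_inv @ diff[i]
    return np.einsum("ij,jk,ik->i", diff, cov_inv, diff)

# Hypothetical ratings on five items: nine consistent respondents plus one
# respondent with an alternating, aberrant pattern.
rng = np.random.default_rng(1)
ratings = np.clip(rng.normal(4.0, 0.3, size=(9, 5)), 1, 5)
ratings = np.vstack([ratings, [1.0, 5.0, 1.0, 5.0, 1.0]])
scores = t2_scores(ratings)
```

The aberrant respondent (last row) stands out with a T² score well above the cluster of consistent raters.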
Federal and state agencies responsible for protecting water quality rely mainly on statistically-based methods to assess and manage risks to the nation's streams, lakes and estuaries. Although statistical approaches provide valuable information on current trends in water quality...
Sleep quality and quality of life in female shift-working nurses.
Shao, Ming-Fen; Chou, Yu-Ching; Yeh, Mei-Yu; Tzeng, Wen-Chii
2010-07-01
This paper is a report of a study of the factors that influence sleep quality and quality of life among shift-working nurses, and of the relationship between their sleep quality and quality of life. Although shift-working nurses strive to adapt their life schedules to shift rotations, they tend to suffer from severe sleep disturbances and increased rates of cancer, cardiovascular disease, digestive disease, and irregular menstrual cycles. Poor sleep is also associated with medical errors and occupational injuries. A cross-sectional study was conducted in 2008 with a convenience sample of 435 female nurses from five regional hospitals in Taiwan. Data were collected on sleep quality and quality of life using the Pittsburgh Sleep Quality Index and the World Health Organization Quality of Life Instrument-BREF Taiwan version, respectively. Data were analysed using descriptive statistics, independent t-tests, analysis of variance, and Pearson correlations. The majority of the female shift workers (57%) had global sleep-quality scores ≥ 5, indicating poor sleep, and all mean scores in the four domains of the quality-of-life measure were statistically significantly lower than those of females in Taiwan's general population. Scores for poor sleep quality and quality of life were related to premenstrual dysphoria, occupational injury, illness, and medication use. In addition, the sleep-quality and quality-of-life scale scores were statistically significantly inversely correlated. Advice should be included in both undergraduate programmes and continuing education to help nurses recognize and improve their own sleep quality and quality of life, and managers should create a supportive environment to encourage shift-working nurses to engage in healthy behaviours.
Global aesthetic surgery statistics: a closer look.
Heidekrueger, Paul I; Juran, S; Ehrl, D; Aung, T; Tanna, N; Broer, P Niclas
2017-08-01
Obtaining quality global statistics about surgical procedures remains an important yet challenging task. The International Society of Aesthetic Plastic Surgery (ISAPS) reports the total number of surgical and non-surgical procedures performed worldwide on a yearly basis. While providing valuable insight, ISAPS' statistics leave two important factors unaccounted for: (1) the underlying base population, and (2) the number of surgeons performing the procedures. Statistics of the published ISAPS' 'International Survey on Aesthetic/Cosmetic Surgery' were analysed by country, taking into account the underlying national base population according to the official United Nations population estimates. Further, the number of surgeons per country was used to calculate the number of surgeries performed per surgeon. In 2014, based on ISAPS statistics, national surgical procedures ranked in the following order: 1st USA, 2nd Brazil, 3rd South Korea, 4th Mexico, 5th Japan, 6th Germany, 7th Colombia, and 8th France. When considering the size of the underlying national populations, the demand for surgical procedures per 100,000 people changes the overall ranking substantially. It was also found that the rate of surgical procedures per surgeon shows great variation between the responding countries. While the US and Brazil are often quoted as the countries with the highest demand for plastic surgery, according to the presented analysis, other countries surpass these countries in surgical procedures per capita. While data acquisition and quality should be improved in the future, valuable insight regarding the demand for surgical procedures can be gained by taking specific demographic and geographic factors into consideration.
Total Quality Management Implementation Strategy: Directorate of Quality Assurance
1989-05-01
Total Quality Control; Harrington, H. James: The Improvement Process; Imai, Masaaki: Kaizen; Ishikawa, Kaoru: What is Total Quality Control; Ishikawa, Kaoru: … Statistical Quality Control; Juran, J. M.: Managerial Breakthrough; Juran, J. M.: Quality Control Handbook; Mizuno (Ed.): Managing for Quality Improvements
MyPMFs: a simple tool for creating statistical potentials to assess protein structural models.
Postic, Guillaume; Hamelryck, Thomas; Chomilier, Jacques; Stratmann, Dirk
2018-05-29
Evaluating the model quality of protein structures that evolve in environments with particular physicochemical properties requires scoring functions that are adapted to their specific residue compositions and/or structural characteristics. Thus, computational methods developed for structures from the cytosol cannot work properly on membrane or secreted proteins. Here, we present MyPMFs, an easy-to-use tool that allows users to train statistical potentials of mean force (PMFs) on the protein structures of their choice, with all parameters being adjustable. We demonstrate its use by creating an accurate statistical potential for transmembrane protein domains. We also show its usefulness to study the influence of the physical environment on residue interactions within protein structures. Our open-source software is freely available for download at https://github.com/bibip-impmc/mypmfs. Copyright © 2018. Published by Elsevier B.V.
A Statistical Project Control Tool for Engineering Managers
NASA Technical Reports Server (NTRS)
Bauch, Garland T.
2001-01-01
This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer), as well as to project success factors, traditional project control tools, and performance measures, all of which are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failure is on the rise; existing methods are limited, and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs produced with the SPCT method for three successful and three failed projects are reviewed, with success and failure defined by the owner.
Kepler Planet Detection Metrics: Statistical Bootstrap Test
NASA Technical Reports Server (NTRS)
Jenkins, Jon M.; Burke, Christopher J.
2016-01-01
This document describes the data produced by the Statistical Bootstrap Test over the final three Threshold Crossing Event (TCE) deliveries to NExScI: SOC 9.1 (Q1-Q16) (Tenenbaum et al. 2014), SOC 9.2 (Q1-Q17), also known as DR24 (Seader et al. 2015), and SOC 9.3 (Q1-Q17), also known as DR25 (Twicken et al. 2016). The last few years have seen significant improvements in the SOC science data processing pipeline, leading to higher quality light curves and more sensitive transit searches. The statistical bootstrap analysis results presented here and the numerical results archived at NASA's Exoplanet Science Institute (NExScI) bear witness to these software improvements. This document attempts to introduce and describe the main features and differences between these three data sets as a consequence of the software changes.
NASA Astrophysics Data System (ADS)
Jia, Huizhen; Sun, Quansen; Ji, Zexuan; Wang, Tonghan; Chen, Qiang
2014-11-01
The goal of no-reference/blind image quality assessment (NR-IQA) is to devise a perceptual model that can accurately predict the quality of a distorted image as human opinions, in which feature extraction is an important issue. However, the features used in the state-of-the-art "general purpose" NR-IQA algorithms are usually natural scene statistics (NSS) based or are perceptually relevant; therefore, the performance of these models is limited. To further improve the performance of NR-IQA, we propose a general purpose NR-IQA algorithm which combines NSS-based features with perceptually relevant features. The new method extracts features in both the spatial and gradient domains. In the spatial domain, we extract the point-wise statistics for single pixel values which are characterized by a generalized Gaussian distribution model to form the underlying features. In the gradient domain, statistical features based on neighboring gradient magnitude similarity are extracted. Then a mapping is learned to predict quality scores using a support vector regression. The experimental results on the benchmark image databases demonstrate that the proposed algorithm correlates highly with human judgments of quality and leads to significant performance improvements over state-of-the-art methods.
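A rough sketch of a neighboring gradient-magnitude-similarity feature of the kind described above (the gradient operator, neighbor direction, and stabilizing constant c are assumptions for illustration; the paper's exact feature set is not reproduced here):

```python
import numpy as np

def gradient_magnitude(img: np.ndarray) -> np.ndarray:
    """Gradient magnitude from simple finite differences."""
    gy, gx = np.gradient(img.astype(float))
    return np.sqrt(gx ** 2 + gy ** 2)

def neighbor_gms(img: np.ndarray, c: float = 0.0026) -> np.ndarray:
    """Similarity between horizontally neighboring gradient magnitudes;
    c is a small stabilizing constant (its value is an assumption here).
    Each similarity lies in (0, 1], with 1 for identical magnitudes."""
    m = gradient_magnitude(img)
    m1, m2 = m[:, :-1], m[:, 1:]
    return (2 * m1 * m2 + c) / (m1 ** 2 + m2 ** 2 + c)

# Hypothetical test image in [0, 1]; summary statistics of the similarity
# map could feed a support vector regression as quality features.
rng = np.random.default_rng(2)
img = rng.random((64, 64))
features = [float(neighbor_gms(img).mean()), float(neighbor_gms(img).std())]
```

Distortions that disrupt local gradient structure would shift these summary statistics, which is the intuition behind using them as quality-aware features.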
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jassal, K; Sarkar, B; Mohanti, B
Objective: The study presents the application of a simple statistical process control (SPC) concept to the analysis of pre-treatment quality assurance procedures for planar dose measurements performed with a 2D array and an amorphous-silicon electronic portal imaging device (a-Si EPID). Method: A total of 195 patients at four different anatomical sites were selected for the study: brain (n1 = 45), head and neck (n2 = 45), thorax (n3 = 50), and pelvis (n4 = 55). Pre-treatment quality assurance for the clinically acceptable IMRT/VMAT plans was measured with the 2D array and the accelerator's a-Si EPID. After the γ-analysis, control charts and the quality index Cpm were evaluated for each cohort. Results: Mean and σ of γ (3%/3 mm) were: EPID γ% ≤ 1 = 99.9% ± 1.15% and array γ% < 1 = 99.6% ± 1.06%. Among all plans, γmax was consistently lower for the 2D array than for the a-Si EPID. Fig. 1 presents the X-bar control charts for every cohort. Cpm values for the a-Si EPID were found to be higher than for the array; detailed results are presented in Table 1. Conclusion: The present study demonstrates the significance of control charts used for quality management purposes in newer radiotherapy clinics, and provides a pictorial overview of clinic performance for advanced radiotherapy techniques. Higher Cpm values for the EPID indicate its higher efficiency relative to array-based measurements.
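For reference, the quality index Cpm evaluated in the study has the standard Taguchi closed form; a sketch with hypothetical specification limits and target (the limits below are not taken from the abstract):

```python
import math

def cpm(mu: float, sigma: float, lsl: float, usl: float, target: float) -> float:
    """Taguchi process capability index Cpm: specification width over six
    times the RMS deviation from target. Larger values mean the QA results
    sit tighter around the target."""
    return (usl - lsl) / (6.0 * math.sqrt(sigma ** 2 + (mu - target) ** 2))

# Hypothetical tolerances for a gamma pass rate (percent) with target 100%:
example_cpm = cpm(mu=99.9, sigma=1.15, lsl=95.0, usl=100.0, target=100.0)
```

Unlike Cp or Cpk, Cpm penalizes a process mean that drifts away from the target even when it stays inside the specification limits.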
Berlage, Silvia; Wenzlaff, Paul; Damm, Gabriele; Sens, Brigitte
2010-01-01
The concept of the "ZQ In-house Seminars" provided by external trainers/experts pursues the specific aim of enabling all healthcare staff members of hospital departments to analyse statistical data (especially from external quality measurements) and to initiate in-hospital measures of quality improvement based on structured teamwork. The results of an evaluation in Lower Saxony for the period between 2004 and 2008 demonstrate a sustainable increase in the outcome quality of care and a strengthening of team and process orientation in clinical care.
Statistical Process Control: Going to the Limit for Quality.
ERIC Educational Resources Information Center
Training, 1987
1987-01-01
Defines the concept of statistical process control, a quality control method used especially in manufacturing. Generally, concept users set specific standard levels that must be met. Makes the point that although employees work directly with the method, management is responsible for its success within the plant. (CH)
Accountability Indicators from the Viewpoint of Statistical Method.
ERIC Educational Resources Information Center
Jordan, Larry
Few people seriously regard students as "products" coming off an educational assembly line, but notions about accountability and quality improvement in higher education are pervaded by manufacturing ideas and metaphors. Because numerical indicators of quality are inevitably expressed by trend lines or statistical control charts of some kind, they…
Meat Quality Assessment by Electronic Nose (Machine Olfaction Technology)
Ghasemi-Varnamkhasti, Mahdi; Mohtasebi, Seyed Saeid; Siadat, Maryam; Balasubramanian, Sundar
2009-01-01
Over the last twenty years, newly developed chemical sensor systems (so called “electronic noses”) have made odor analyses possible. These systems involve various types of electronic chemical gas sensors with partial specificity, as well as suitable statistical methods enabling the recognition of complex odors. As commercial instruments have become available, a substantial increase in research into the application of electronic noses in the evaluation of volatile compounds in food, cosmetic and other items of everyday life is observed. At present, the commercial gas sensor technologies comprise metal oxide semiconductors, metal oxide semiconductor field effect transistors, organic conducting polymers, and piezoelectric crystal sensors. Further sensors based on fibreoptic, electrochemical and bi-metal principles are still in the developmental stage. Statistical analysis techniques range from simple graphical evaluation to multivariate analysis such as artificial neural network and radial basis function. The introduction of electronic noses into the area of food is envisaged for quality control, process monitoring, freshness evaluation, shelf-life investigation and authenticity assessment. Considerable work has already been carried out on meat, grains, coffee, mushrooms, cheese, sugar, fish, beer and other beverages, as well as on the odor quality evaluation of food packaging material. This paper describes the applications of these systems for meat quality assessment, where fast detection methods are essential for appropriate product management. The results suggest the possibility of using this new technology in meat handling. PMID:22454572
Correlation of phthalate exposures with semen quality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pant, Niraj; Shukla, Manju; Kumar Patel, Devendra
2008-08-15
Phthalates are widely used man-made chemicals released into the environment, and human exposure occurs mainly through diet. Because phthalate plasticizers are not covalently bound to PVC, they can leach, migrate, or evaporate into the environment and as a result have become ubiquitous contaminants. The present study investigates the correlation, if any, between phthalate esters (DEP, DEHP, DBP, DMP, DOP) and sperm mitochondrial status, ROS, LPO, SCSA, and sperm quality. The study was conducted in the urban/rural population of Lucknow visiting the Obstetrics and Gynecology Department, CSMMU, Lucknow. Semen analysis was performed according to the WHO guidelines, phthalate analysis by HPLC, LPO by spectrophotometry, and sperm mitochondrial status, ROS, and SCSA by flow cytometry. The questionnaire data showed no significant difference in demographic characteristics among the groups. In general, the urban population had statistically significantly higher levels of phthalate esters than the rural population. Further, infertile men showed statistically significantly (p < 0.05) higher levels of pollutants in semen than fertile men. A negative correlation between semen phthalate levels (viz. DEHP) and sperm quality, and a positive association with depolarized mitochondria, elevated ROS production and LPO, and DNA fragmentation, were established. The findings suggest that phthalates might be one of the contributing factors associated with the deterioration in semen quality, and that these adverse effects might be mediated by ROS, LPO, and mitochondrial dysfunction.
Zantut, Fabio; Holzchuh, Ricardo; Boni, Reginaldo Carlos; Mackus, Eva Cristina; Zantut, Paulo Roberto; Nakano, Claudio; Netto, Adamo Lui; Hida, Richard Yudi
2012-01-01
To compare the interval between death and enucleation (ΔT-O-E), the interval between enucleation and preservation (ΔT-E-P), and the quality of the cornea before and after the implementation of a new technique and sanitary rules. A retrospective study evaluated the records of cornea donors at Sao Paulo's Santa Casa Eye Tissue Bank 2 years before and 2 years after the implementation of the new sanitary rules. An increase was observed in the absolute number of donors, from 205 to 374, following the adopted changes. There was no statistically significant difference in ΔT-O-E and ΔT-E-P before and after the implemented changes. Of the total of 1,105 donor corneas, 388 were observed before the changes and 717 after. We observed a statistically significant increase in the grading of donor cornea quality, from 1.76 ± 0.90 to 1.94 ± 0.88, after the implementation of the new standards of the resolution. After the changes required by Resolution 347, there was a large increase in the number of donated, harvested, and preserved corneas. The BTO did not reduce ΔT-O-E or ΔT-E-P. Cornea quality was lower after the new rules.
Statistical Reviewers Improve Reporting in Biomedical Articles: A Randomized Trial
Cobo, Erik; Selva-O'Callagham, Albert; Ribera, Josep-Maria; Cardellach, Francesc; Dominguez, Ruth; Vilardell, Miquel
2007-01-01
Background Although peer review is widely considered to be the most credible way of selecting manuscripts and improving the quality of accepted papers in scientific journals, there is little evidence to support its use. Our aim was to estimate the effects on manuscript quality of either adding a statistical peer reviewer or suggesting the use of checklists such as CONSORT or STARD to clinical reviewers or both. Methodology and Principal Findings Interventions were defined as 1) the addition of a statistical reviewer to the clinical peer review process, and 2) suggesting reporting guidelines to reviewers; with “no statistical expert” and “no checklist” as controls. The two interventions were crossed in a 2×2 balanced factorial design including original research articles consecutively selected, between May 2004 and March 2005, by the Medicina Clinica (Barc) editorial committee. We randomized manuscripts to minimize differences in terms of baseline quality and type of study (intervention, longitudinal, cross-sectional, others). Sample-size calculations indicated that 100 papers provide an 80% power to test a 55% standardized difference. We specified the main outcome as the increment in quality of papers as measured on the Goodman Scale. Two blinded evaluators rated the quality of manuscripts at initial submission and final post peer review version. Of the 327 manuscripts submitted to the journal, 131 were accepted for further review, and 129 were randomized. Of those, 14 that were lost to follow-up showed no differences in initial quality to the followed-up papers. Hence, 115 were included in the main analysis, with 16 rejected for publication after peer review. 21 (18.3%) of the 115 included papers were interventions, 46 (40.0%) were longitudinal designs, 28 (24.3%) cross-sectional and 20 (17.4%) others. The 16 (13.9%) rejected papers had a significantly lower initial score on the overall Goodman scale than accepted papers (difference 15.0, 95% CI: 4.6–24.4). 
The effect of suggesting a guideline to the reviewers had no effect on change in overall quality as measured by the Goodman scale (0.9, 95% CI: −0.3–+2.1). The estimated effect of adding a statistical reviewer was 5.5 (95% CI: 4.3–6.7), showing a significant improvement in quality. Conclusions and Significance This prospective randomized study shows the positive effect of adding a statistical reviewer to the field-expert peers in improving manuscript quality. We did not find a statistically significant positive effect by suggesting reviewers use reporting guidelines. PMID:17389922
1993-08-01
subtitled "Simulation Data," consists of detailed information on the design parameter variations tested, subsequent statistical analyses conducted...used with confidence during the design process. The data quality can be examined in various forms such as statistical analyses of measure-of-merit data...merit, such as time to capture or maximum pitch rate, can be calculated from the simulation time history data. Statistical techniques are then used
Emerging Techniques for Dose Optimization in Abdominal CT
Platt, Joel F.; Goodsitt, Mitchell M.; Al-Hawary, Mahmoud M.; Maturen, Katherine E.; Wasnik, Ashish P.; Pandya, Amit
2014-01-01
Recent advances in computed tomographic (CT) scanning technique such as automated tube current modulation (ATCM), optimized x-ray tube voltage, and better use of iterative image reconstruction have allowed maintenance of good CT image quality with reduced radiation dose. ATCM varies the tube current during scanning to account for differences in patient attenuation, ensuring a more homogeneous image quality, although selection of the appropriate image quality parameter is essential for achieving optimal dose reduction. Reducing the x-ray tube voltage is best suited for evaluating iodinated structures, since the effective energy of the x-ray beam will be closer to the k-edge of iodine, resulting in a higher attenuation for the iodine. The optimal kilovoltage for a CT study should be chosen on the basis of imaging task and patient habitus. The aim of iterative image reconstruction is to identify factors that contribute to noise on CT images with use of statistical models of noise (statistical iterative reconstruction) and selective removal of noise to improve image quality. The degree of noise suppression achieved with statistical iterative reconstruction can be customized to minimize the effect of altered image quality on CT images. Unlike with statistical iterative reconstruction, model-based iterative reconstruction algorithms model both the statistical noise and the physical acquisition process, allowing CT to be performed with further reduction in radiation dose without an increase in image noise or loss of spatial resolution. Understanding these recently developed scanning techniques is essential for optimization of imaging protocols designed to achieve the desired image quality with a reduced dose. © RSNA, 2014 PMID:24428277
NASA Astrophysics Data System (ADS)
Osterman, G. B.; Neu, J. L.; Eldering, A.; Pinder, R. W.; Tang, Y.; McQueen, J.
2014-12-01
Most regional scale models that are used for air quality forecasts and ozone source attribution do not adequately capture the distribution of ozone in the mid- and upper troposphere, but it is unclear how this shortcoming relates to their ability to simulate surface ozone. We combine ozone profile data from the NASA Earth Observing System (EOS) Tropospheric Emission Spectrometer (TES) and a new joint product from TES and the Ozone Monitoring Instrument along with ozonesonde measurements and EPA AirNow ground station ozone data to examine air quality events during August 2006 in the Community Multi-Scale Air Quality (CMAQ) and National Air Quality Forecast Capability (NAQFC) models. We present both aggregated statistics and case-study analyses with the goal of assessing the relationship between the models' ability to reproduce surface air quality events and their ability to capture the vertical distribution of ozone. We find that the models lack the mid-tropospheric ozone variability seen in TES and the ozonesonde data, and discuss the conditions under which this variability appears to be important for surface air quality.
Experimental Study of Quantum Graphs With and Without Time-Reversal Invariance
NASA Astrophysics Data System (ADS)
Anlage, Steven Mark; Fu, Ziyuan; Koch, Trystan; Antonsen, Thomas; Ott, Edward
An experimental setup consisting of a microwave network is used to simulate quantum graphs. The random coupling model (RCM) is applied to describe the universal statistical properties of the system with and without time-reversal invariance. The networks, which are large compared to the wavelength, are constructed from coaxial cables connected by T junctions, and by making nodes with circulators, time-reversal invariance for microwave propagation in the networks can be broken. The results of an experimental study of microwave networks with and without time-reversal invariance are presented in both the frequency domain and the time domain. With the measured S-parameter data of two-port networks, the impedance statistics and the nearest-neighbor spacing statistics are examined. Moreover, time-reversal-mirror experiments on the networks demonstrate that the reconstruction quality can be used to quantify the degree of time-reversal invariance for wave propagation. Numerical models of the networks are also presented to verify the time-domain experiments. We acknowledge support under contract AFOSR COE Grant FA9550-15-1-0171 and ONR Grant N000141512134.
Caballero Morales, Santiago Omar
2013-01-01
Preventive maintenance (PM) and statistical process control (SPC) are important practices for achieving high product quality, a low frequency of failures, and cost reduction in a production process. However, some aspects of their joint application have not been explored in depth. First, most SPC is performed with the X-bar control chart, which does not fully consider the variability of the production process. Second, many studies on the design of control charts consider only the economic aspect, whereas statistical restrictions must also be considered to achieve charts with low probabilities of falsely detecting failures. Third, the effect of PM on processes with different failure probability distributions has not been studied. Hence, this paper covers these points, presenting the economic statistical design (ESD) of joint X-bar-S control charts with a cost model that integrates PM with a general failure distribution. Experiments showed statistically significant cost reductions when PM is performed on processes with high failure rates, along with reductions in the sampling frequency of units tested under SPC. PMID:23527082
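The joint X-bar-S charts at the center of the design can be sketched with textbook Shewhart limits (standard constants for subgroups of five; the measurement data below are hypothetical):

```python
import math

# Control-chart constants for subgroup size n = 5 (standard SPC tables).
A3, B3, B4 = 1.427, 0.0, 2.089

def xbar_s_limits(subgroups):
    """Center lines and 3-sigma control limits for joint X-bar and S charts.
    Tracking subgroup standard deviations (the S chart) captures process
    variability that the X-bar chart alone does not."""
    means, sds = [], []
    for g in subgroups:
        m = sum(g) / len(g)
        means.append(m)
        sds.append(math.sqrt(sum((x - m) ** 2 for x in g) / (len(g) - 1)))
    xbar = sum(means) / len(means)
    sbar = sum(sds) / len(sds)
    return {
        "xbar": (xbar - A3 * sbar, xbar, xbar + A3 * sbar),  # (LCL, CL, UCL)
        "s": (B3 * sbar, sbar, B4 * sbar),
    }

# Hypothetical subgroup measurements (five units sampled per lot):
data = [[10.1, 9.9, 10.0, 10.2, 9.8],
        [10.0, 10.1, 9.9, 10.0, 10.1],
        [9.9, 10.0, 10.2, 10.1, 10.0]]
limits = xbar_s_limits(data)
```

An economic statistical design such as the paper's would then tune the sample size, sampling interval, and limit width subject to cost and false-alarm constraints, rather than fixing the 3-sigma defaults used here.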
Rostami, Reza; Nahm, Meredith; Pieper, Carl F
2009-04-01
Despite a pressing and well-documented need for better sharing of information on clinical trials data quality assurance methods, many research organizations remain reluctant to publish descriptions of and results from their internal auditing and quality assessment methods. We present findings from a review of a decade of internal data quality audits performed at the Duke Clinical Research Institute, a large academic research organization that conducts data management for a diverse array of clinical studies, both academic and industry-sponsored. In so doing, we hope to stimulate discussions that could benefit the wider clinical research enterprise by providing insight into methods of optimizing data collection and cleaning, ultimately helping patients and furthering essential research. We present our audit methodologies, including sampling methods, audit logistics, sample sizes, counting rules used for error rate calculations, and characteristics of audited trials. We also present database error rates as computed according to two analytical methods, which we address in detail, and discuss the advantages and drawbacks of two auditing methods used during this 10-year period. Our review of the DCRI audit program indicates that higher data quality may be achieved from a series of small audits throughout the trial rather than through a single large database audit at database lock. We found that error rates trended upward from year to year in the period characterized by traditional audits performed at database lock (1997-2000), but consistently trended downward after periodic statistical process control type audits were instituted (2001-2006). These increases in data quality were also associated with cost savings in auditing, estimated at 1000 h per year, or the efforts of one-half of a full time equivalent (FTE). Our findings are drawn from retrospective analyses and are not the result of controlled experiments, and may therefore be subject to unanticipated confounding. 
In addition, the scope and type of audits we examine here are specific to our institution, and our results may not be broadly generalizable. Use of statistical process control methodologies may afford advantages over more traditional auditing methods, and further research will be necessary to confirm the reliability and usability of such techniques. We believe that open and candid discussion of data quality assurance issues among academic and clinical research organizations will ultimately benefit the entire research community in the coming era of increased data sharing and re-use.
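The counting rules behind such database error rates reduce, at their simplest, to errors found per fields inspected, conventionally normalized per 10,000 fields. A minimal Python sketch (the function name, the per-10,000 convention, and the example numbers are illustrative assumptions, not the DCRI's actual counting rules):

```python
def error_rate_per_10k(errors_found: int, fields_audited: int) -> float:
    """Database error rate expressed as errors per 10,000 fields audited."""
    if fields_audited <= 0:
        raise ValueError("fields_audited must be positive")
    return 10_000 * errors_found / fields_audited

# Hypothetical example: 37 discrepant fields in an audit sample of 52,000 fields.
rate = error_rate_per_10k(37, 52_000)  # roughly 7 errors per 10,000 fields
```

Normalizing to a fixed denominator is what makes small periodic audits comparable to one large audit at database lock.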
Lack, N
2001-08-01
The introduction of the modified data set for quality assurance in obstetrics (formerly the perinatal survey) in Lower Saxony and Bavaria in 1999 created an urgent need for a correspondingly revised statistical analysis of the data. The general outline of a new data-reporting concept was originally presented by the Bavarian Commission for Perinatology and Neonatology at the Munich Perinatal Conference in November 1997. These ideas shaped the content and layout of the new quality report for obstetrics, currently in its nationwide harmonisation phase coordinated by the federal office for quality assurance in hospital care. A flexible, modular, database-oriented analysis tool developed in Bavaria is now in its second year of successful operation. The functionalities of this system are described in detail.
Electric power quarterly, October-December 1987
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-04-19
The EPQ presents monthly summaries of electric utility statistics at the national, divisional, state, company, and plant levels on the following subjects: quantity of fuel, cost of fuel, quality of fuel, net generation, fuel consumption, and fuel stocks. In addition, the EPQ presents a quarterly summary of reported major disturbances and unusual occurrences. These data are collected on the Form IE-417R. Every electric utility engaged in the generation, transmission, or distribution of electric energy must file a report with DOE if it experiences a major power system emergency.
Leontjevas, Ruslan; Gerritsen, Debby L; Koopmans, Raymond T C M; Smalbrugge, Martin; Vernooij-Dassen, Myrra J F J
2012-06-01
A multidisciplinary, evidence-based care program to improve the management of depression in nursing home residents, "Act in case of Depression" (AiD), was implemented and tested using a stepped-wedge design in 23 nursing homes (NHs). The aim was, before the effect analyses, to evaluate AiD process data on sampling quality (recruitment and randomization, reach) and intervention quality (relevance and feasibility, extent to which AiD was performed), which can be used for understanding internal and external validity. In this article, a model is presented that divides process evaluation data into first- and second-order process data. Qualitative and quantitative data based on personal files of residents, interviews of nursing home professionals, and a research database were analyzed according to the following process evaluation components: sampling quality and intervention quality. The setting was the nursing home. The pattern of residents' informed consent rates differed for dementia special care units and somatic units during the study. The nursing home staff was satisfied with the AiD program and reported that the program was feasible and relevant. With the exception of the first screening step (nursing staff members using a short observer-based depression scale), AiD components were not performed fully by NH staff as prescribed in the AiD protocol. Although NH staff found the program relevant and feasible and was satisfied with the program content, individual AiD components may have different feasibility. The results on sampling quality implied that statistical analyses of AiD effectiveness should account for the type of unit, whereas the findings on intervention quality implied that, next to the type of unit, analyses should account for the extent to which individual AiD program components were performed. In general, our first-order process data evaluation confirmed the internal and external validity of the AiD trial, and this evaluation enabled further statistical fine-tuning.
The importance of evaluating the first-order process data before executing statistical effect analyses is thus underlined. Copyright © 2012 American Medical Directors Association, Inc. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Hoover, F. A.; Bowling, L. C.; Prokopy, L. S.
2015-12-01
Urban stormwater is an ongoing management concern in municipalities of all sizes. In both combined and separate sewer systems, pollutants from stormwater runoff enter the natural waterway system during heavy rain events. Urban flooding during more frequent and more intense storms is also a growing concern. Therefore, stormwater best-management practices (BMPs) are being implemented in efforts to reduce and manage stormwater pollution and overflow. The majority of BMP water quality studies focus on the small-scale, individual effects of a BMP and the change in water quality directly from the runoff of these infrastructures. At the watershed scale, it is difficult to establish statistically whether or not these BMPs are making a difference in water quality, given that watershed-scale monitoring is often costly and time-consuming, relying on significant sources of funds that a city may not have. Hence, there is a need to quantify the level of sampling needed to detect the water quality impact of BMPs at the watershed scale. In this study, a power analysis was performed on data from an urban watershed in Lafayette, Indiana, to determine the frequency of sampling required to detect a significant change in water quality measurements. Using the R platform, results indicate that detecting a significant change in watershed-level water quality would require hundreds of weekly measurements, even when improvement is present. The second part of this study investigates whether the difficulty in demonstrating water quality change represents a barrier to adoption of stormwater BMPs. Semi-structured interviews of community residents and organizations in Chicago, IL, are being used to investigate residents' understanding of water quality and best management practices and to identify their attitudes and perceptions towards stormwater BMPs. Second-round interviews will examine how information on uncertainty in water quality improvements influences their BMP attitudes and perceptions.
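The study's power analysis was run in R; the underlying arithmetic can be sketched with the standard normal-approximation sample-size formula for a two-sided one-sample test, which reproduces the "hundreds of measurements" conclusion for a subtle effect. A hedged Python sketch (the effect size and test choice are illustrative assumptions, not the study's actual model):

```python
from math import ceil
from statistics import NormalDist

def n_required(effect_size: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Samples needed to detect a standardized effect with a two-sided z-test."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for the significance level
    z_beta = z.inv_cdf(power)           # quantile corresponding to desired power
    return ceil(((z_alpha + z_beta) / effect_size) ** 2)

# A subtle improvement of 0.1 standard deviations needs on the order of
# 800 samples -- many years of weekly monitoring.
n = n_required(0.1)
```

The quadratic dependence on effect size is the barrier: halving the detectable effect quadruples the sampling burden.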
Exploring Marine Corps Officer Quality: An Analysis of Promotion to Lieutenant Colonel
2017-03-01
[Table of contents excerpt] G. Descriptive Statistics — 1. Dependent Variable Summary Statistics; 2. Performance …; 4. Further Research; Appendix A. Summary Statistics of FITREP and …
Optimization of Thick, Large Area YBCO Film Growth Through Response Surface Methods
NASA Astrophysics Data System (ADS)
Porzio, J.; Mahoney, C. H.; Sullivan, M. C.
2014-03-01
We present our work on the optimization of thick, large-area YBa2Cu3O7-δ (YBCO) film growth through response surface methods. Thick, large-area films have commercial uses and have recently been used in dramatic demonstrations of levitation and suspension. Our films are grown via pulsed laser deposition (PLD), and we have optimized growth parameters via response surface methods, a statistical tool for optimizing selected quantities with respect to a set of variables. We optimized our YBCO films' critical temperatures, thicknesses, and structures with respect to three PLD growth parameters: deposition temperature, laser energy, and deposition pressure. We will present an overview of YBCO growth via pulsed laser deposition, the statistical theory behind response surface methods, and the application of response surface methods to pulsed laser deposition growth of YBCO. Results from the experiment will be presented in a discussion of the optimized film quality. Supported by NSF grant DMR-1305637.
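At its core, response surface methodology fits a low-order polynomial to measured responses and locates the stationary point of the fitted surface. A toy one-factor sketch in Python (the temperatures and critical-temperature values are invented, not the authors' data):

```python
import numpy as np

# Synthetic response: critical temperature Tc (K) vs. deposition temperature (C).
temps = np.array([700.0, 730.0, 760.0, 790.0, 820.0])
tc = np.array([85.0, 88.5, 90.2, 89.0, 86.0])

# Fit a second-order (quadratic) response surface in one factor.
a, b, c = np.polyfit(temps, tc, 2)

# The stationary point (vertex of the parabola) estimates the optimal setting.
optimum = -b / (2 * a)
```

The real analysis optimizes three factors at once, so the fitted surface is a quadratic in three variables and the stationary point comes from solving a small linear system rather than a single vertex formula.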
A quality score for coronary artery tree extraction results
NASA Astrophysics Data System (ADS)
Cao, Qing; Broersen, Alexander; Kitslaar, Pieter H.; Lelieveldt, Boudewijn P. F.; Dijkstra, Jouke
2018-02-01
Coronary artery trees (CATs) are often extracted to aid the fully automatic analysis of coronary artery disease on coronary computed tomography angiography (CCTA) images. Automatically extracted CATs often miss some arteries or include wrong extractions, which require manual correction before successive steps can be performed. For analyzing a large number of datasets, a manual quality check of the extraction results is time-consuming. This paper presents a method to automatically calculate quality scores for extracted CATs in terms of the clinical significance of the extracted arteries and the completeness of the extracted CAT. Both right-dominant (RD) and left-dominant (LD) anatomical statistical models are generated and exploited in developing the quality score. To automatically determine which model should be used, a dominance-type detection method is also designed. Experiments are performed on the automatically extracted and manually refined CATs from 42 datasets to evaluate the proposed quality score. In 39 (92.9%) cases, the proposed method is able to measure the quality of the manually refined CATs with higher scores than the automatically extracted CATs. On a 100-point scale, the average scores for the automatically extracted and manually refined CATs are 82.0 (±15.8) and 88.9 (±5.4), respectively. The proposed quality score will assist the automatic processing of CAT extractions for large cohorts containing both RD and LD cases. To the best of our knowledge, this is the first time that a general quality score for an extracted CAT is presented.
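One plausible way to score completeness against an anatomical model is a weighted fraction of expected segments actually extracted, with clinically important arteries weighted more heavily. The segment names and weights below are illustrative assumptions, not the paper's actual statistical models or scoring rules:

```python
# Hypothetical per-segment weights: main arteries count more than branches.
SEGMENT_WEIGHTS = {"RCA": 3.0, "LM": 3.0, "LAD": 3.0, "LCX": 3.0,
                   "D1": 1.5, "OM1": 1.5, "PDA": 2.0, "R-PLB": 1.0}

def cat_quality_score(extracted_segments) -> float:
    """Score an extracted coronary tree on a 100-point scale: the weighted
    fraction of expected model segments present in the extraction."""
    total = sum(SEGMENT_WEIGHTS.values())
    found = sum(w for seg, w in SEGMENT_WEIGHTS.items() if seg in extracted_segments)
    return 100.0 * found / total

# An extraction that found the four main arteries plus the PDA.
score = cat_quality_score({"RCA", "LM", "LAD", "LCX", "PDA"})
```

A dominance-type detector would simply select between an RD and an LD weight table before scoring.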
Quality of the leader-member relationship and the organizational commitment of nurses.
Nunes, Elisabete Maria Garcia Teles; Gaspar, Maria Filomena Mendes
2017-12-18
To understand nurses' perception of the quality of the leader-member relationship and their organizational commitment, and to analyze the influence of this relationship quality. Cross-sectional and correlational study, with a quantitative approach, using a non-probability convenience sample of 408 nurses. The data were collected through questionnaires at Central Hospital in Lisbon between January and March 2013. The statistical analysis of the data was carried out using IBM® SPSS® Statistics 19 software. Three hundred forty-two questionnaires were considered valid. The quality of the leadership relationship was satisfactory, and the nurses were poorly committed to the organization. The quality of the leadership relationship was statistically correlated with organizational commitment: a moderate association was found with affective commitment (rs=0.42, p<0.05), a low association with normative commitment (rs=0.37, p<0.05), and a very low association with calculative commitment (rs=0.14, p<0.05). Leadership exerts influence on organizational commitment. An opportunity was found to improve the quality of the leadership relationship between nurses and their leaders, with the consequent possibility of developing organizational commitment.
Valerian for sleep: a systematic review and meta-analysis.
Bent, Stephen; Padula, Amy; Moore, Dan; Patterson, Michael; Mehling, Wolf
2006-12-01
Insomnia affects approximately one-third of the adult population and contributes to increased rates of absenteeism, health care use, and social disability. Extracts of the roots of valerian (Valeriana officinalis) are widely used for inducing sleep and improving sleep quality. A systematic review of randomized, placebo-controlled trials of valerian for improving sleep quality is presented. An extensive literature search identified 16 eligible studies examining a total of 1093 patients. Most studies had significant methodologic problems, and the valerian doses, preparations, and length of treatment varied considerably. A dichotomous outcome of sleep quality (improved or not) was reported by 6 studies and showed a statistically significant benefit (relative risk of improved sleep = 1.8, 95% confidence interval, 1.2-2.9), but there was evidence of publication bias in this summary measure. The available evidence suggests that valerian might improve sleep quality without producing side effects. Future studies should assess a range of doses of standardized preparations of valerian and include standard measures of sleep quality and safety.
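The pooled relative risk in such a meta-analysis is conventionally computed by inverse-variance weighting of the study log relative risks (a fixed-effect model). A hedged Python sketch (the study inputs are invented for illustration, not the six valerian trials):

```python
from math import exp, log, sqrt

def pooled_rr(studies):
    """Fixed-effect (inverse-variance) pooling of relative risks.
    Each study is (rr, se_log_rr): the RR and the standard error of log(RR)."""
    weights = [1 / se ** 2 for _, se in studies]
    wsum = sum(weights)
    log_pooled = sum(w * log(rr) for (rr, _), w in zip(studies, weights)) / wsum
    se_pooled = sqrt(1 / wsum)
    ci = (exp(log_pooled - 1.96 * se_pooled), exp(log_pooled + 1.96 * se_pooled))
    return exp(log_pooled), ci

# Illustrative numbers only.
rr, (lo, hi) = pooled_rr([(1.6, 0.30), (2.1, 0.40), (1.7, 0.25)])
```

Publication bias, as flagged in the review, would inflate such a pooled estimate because small null studies are missing from the inputs.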
Statistical process management: An essential element of quality improvement
NASA Astrophysics Data System (ADS)
Buckner, M. R.
Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people, and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis and concurrent engineering are important elements of systematic planning and analysis that are needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within the members' own Westinghouse departments as well as within other US and foreign industry. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.
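The Statistical Process Control techniques named above rest on control limits computed from an in-control baseline: points beyond the limits signal that the process has drifted from its intrinsic capability. A minimal Shewhart-style individuals-chart sketch in Python (the measurements are invented, not a Westinghouse process):

```python
from statistics import mean, stdev

def control_limits(baseline):
    """Three-sigma Shewhart limits from an in-control baseline sample."""
    centre = mean(baseline)
    sigma = stdev(baseline)
    return centre - 3 * sigma, centre + 3 * sigma

def out_of_control(values, lcl, ucl):
    """Flag measurements outside the control limits."""
    return [v for v in values if v < lcl or v > ucl]

baseline = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.1]
lcl, ucl = control_limits(baseline)
flags = out_of_control([10.0, 10.1, 11.5, 9.9], lcl, ucl)  # 11.5 is flagged
```

Production SPC typically estimates sigma from moving ranges rather than the sample standard deviation, but the decision rule is the same.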
DOE Office of Scientific and Technical Information (OSTI.GOV)
Struckmeyer, R.
This report presents the results of the NRC Direct Radiation Monitoring Network for the fourth quarter of 1995. It provides the ambient radiation levels measured in the vicinity of 75 sites throughout the United States. In addition, it describes the equipment used, monitoring station selection criteria, characterization of the dosimeter response, calibration procedures, statistical methods, intercomparison, and quality assurance program.
NASA Astrophysics Data System (ADS)
Aubrecht, Gordon J.; Aubrecht, Judith D.
1983-07-01
True-false or multiple-choice tests can be useful instruments for evaluating student progress. We examine strategies for planning objective tests that serve to test the material covered in science (physics) courses. We also examine strategies for writing questions for tests within a test blueprint. The statistical basis for judging the quality of test items is discussed. Reliability, difficulty, and discrimination indices are defined and examples are presented. Our recommendations are rather easily put into practice.
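The difficulty and discrimination indices mentioned above have simple classical definitions: difficulty is the proportion answering correctly, and discrimination compares that proportion between high- and low-scoring groups. A short Python sketch (the response data are invented):

```python
def item_difficulty(responses):
    """Proportion of examinees answering the item correctly (0 = hard, 1 = easy)."""
    return sum(responses) / len(responses)

def discrimination_index(upper_group, lower_group):
    """Classical D index: item difficulty in the top-scoring group minus the
    bottom-scoring group; D >= 0.3 is conventionally considered good."""
    return item_difficulty(upper_group) - item_difficulty(lower_group)

# 1 = correct, 0 = incorrect; upper and lower groups by total test score.
d = discrimination_index([1, 1, 1, 0, 1], [0, 1, 0, 0, 0])
```

An item with D near zero (or negative) fails to separate strong from weak students and is a candidate for rewriting.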
smwrGraphs—An R package for graphing hydrologic data, version 1.1.2
Lorenz, David L.; Diekoff, Aliesha L.
2017-01-31
This report describes an R package called smwrGraphs, which consists of a collection of graphing functions for hydrologic data within R, a programming language and software environment for statistical computing. The functions in the package have been developed by the U.S. Geological Survey to create high-quality graphs for publication or presentation of hydrologic data that meet U.S. Geological Survey graphics guidelines.
1994-06-30
[Extraction fragment] "…Crack-tip Opening Displacement (CTOD) Fracture Toughness Measurement". The method has found application in elastic-plastic fracture mechanics (EPFM). [Table of contents excerpt] 6.1 Proposed Material Property Database Format and Hierarchy; 6.2 Sample Application of the Material Property Database … the E 49.05 sub-committee. The relevant quality indicators applicable to the present program are: source of data and statistical basis of data.
Statistical inference of protein structural alignments using information and compression.
Collier, James H; Allison, Lloyd; Lesk, Arthur M; Stuckey, Peter J; Garcia de la Banda, Maria; Konagurthu, Arun S
2017-04-01
Structural molecular biology depends crucially on computational techniques that compare protein three-dimensional structures and generate structural alignments (the assignment of one-to-one correspondences between subsets of amino acids based on atomic coordinates). Despite its importance, the structural alignment problem has not been formulated, much less solved, in a consistent and reliable way. To overcome these difficulties, we present here a statistical framework for the precise inference of structural alignments, built on the Bayesian and information-theoretic principle of Minimum Message Length (MML). The quality of any alignment is measured by its explanatory power: the amount of lossless compression achieved to explain the protein coordinates using that alignment. We have implemented this approach in MMLigner, the first program able to infer statistically significant structural alignments. We also demonstrate the reliability of MMLigner's alignment results when compared with the state of the art. Importantly, MMLigner can also discover different structural alignments of comparable quality, a challenging problem for oligomers and protein complexes. Source code, binaries, and an interactive web version are available at http://lcb.infotech.monash.edu.au/mmligner. Contact: arun.konagurthu@monash.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Chandrasekaran, A; Ravisankar, R; Harikrishnan, N; Satapathy, K K; Prasad, M V R; Kanagasabapathy, K V
2015-02-25
Anthropogenic activities increase the accumulation of heavy metals in the soil environment. Soil pollution significantly reduces environmental quality and affects human health. In the present study, soil samples were collected at different locations of Yelagiri Hills, Tamilnadu, India, for heavy metal analysis. The samples were analyzed for twelve selected heavy metals (Mg, Al, K, Ca, Ti, Fe, V, Cr, Mn, Co, Ni and Zn) using energy-dispersive X-ray fluorescence (EDXRF) spectroscopy. Heavy metal concentrations in soil were investigated using the enrichment factor (EF), geo-accumulation index (Igeo), contamination factor (CF), and pollution load index (PLI) to determine metal accumulation, distribution, and pollution status. Heavy metal toxicity risk was assessed using the soil quality guidelines (SQGs) given by the target and intervention values of the Dutch soil standards. The concentrations of Ni, Co, Zn, Cr, Mn, Fe, Ti, K, Al, and Mg were mainly controlled by natural sources. Multivariate statistical methods such as correlation matrix, principal component analysis, and cluster analysis were applied for the identification of heavy metal sources (anthropogenic/natural origin). Geo-statistical methods such as kriging identified hot spots of metal contamination in road areas, influenced mainly by the presence of natural rocks. Copyright © 2014 Elsevier B.V. All rights reserved.
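The enrichment factor and geo-accumulation index used in such studies have standard formulas: EF compares a sample's metal-to-reference-element ratio against the background ratio, and Igeo is a log2 ratio against 1.5 times the background value. A Python sketch (the concentrations below are invented, not the Yelagiri Hills measurements; Fe is assumed as the conservative reference element):

```python
from math import log2

def enrichment_factor(c_metal, c_ref, bg_metal, bg_ref):
    """EF = (C_metal / C_ref)_sample / (C_metal / C_ref)_background.
    EF near 1 suggests a crustal (natural) origin; EF >> 1 suggests enrichment."""
    return (c_metal / c_ref) / (bg_metal / bg_ref)

def geoaccumulation_index(c_metal, bg_metal):
    """Igeo = log2(Cn / (1.5 * Bn)); Igeo <= 0 is classed as unpolluted."""
    return log2(c_metal / (1.5 * bg_metal))

# Hypothetical Ni concentrations (mg/kg) with Fe as reference.
ef = enrichment_factor(c_metal=95.0, c_ref=70_000.0, bg_metal=68.0, bg_ref=80_000.0)
igeo = geoaccumulation_index(95.0, 68.0)
```

The 1.5 factor in Igeo is the conventional allowance for natural lithogenic variability in the background values.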
NASA Astrophysics Data System (ADS)
Zan, Tao; Wang, Min; Hu, Jianzhong
2010-12-01
Machining status monitoring with multiple sensors can acquire and analyze machining process information to implement abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal fluctuations through statistical methods. In this paper, by comparing the advantages and disadvantages of the two methods, the necessity and feasibility of their integration and fusion are introduced. An approach is then proposed that integrates multi-sensor status monitoring and statistical process control based on artificial intelligence, internet, and database techniques. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and acoustic emission (AE) signal information of the wheel-dressing process, the reason for machining quality fluctuation has been obtained. The experimental results indicate that the approach is suitable for status monitoring and analysis of the machining process.
Guo, Hui; Zhang, Zhen; Yao, Yuan; Liu, Jialin; Chang, Ruirui; Liu, Zhao; Hao, Hongyuan; Huang, Taohong; Wen, Jun; Zhou, Tingting
2018-08-30
Semen sojae praeparatum, with homology of medicine and food, is a famous traditional Chinese medicine. A simple and effective quality fingerprint analysis, coupled with chemometrics methods, was developed for the quality assessment of Semen sojae praeparatum. First, similarity analysis (SA) and hierarchical clustering analysis (HCA) were applied to select the qualitative markers that most influence the quality of Semen sojae praeparatum. Twenty-one chemicals were selected and characterized by liquid chromatography coupled with high-resolution ion trap/time-of-flight mass spectrometry (LC-IT-TOF-MS). Subsequently, principal component analysis (PCA) and orthogonal partial least squares discriminant analysis (OPLS-DA) were conducted to select the quantitative markers of Semen sojae praeparatum samples from different origins. Moreover, 11 compounds with statistical significance were determined quantitatively, providing accurate and informative data for quality evaluation. This study proposes a new strategy of "statistical analysis-based fingerprint establishment", which should be a valuable reference for further study. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Krajewski, Witold F.; Rexroth, David T.; Kiriaki, Kiriakie
1991-01-01
Two problems related to radar rainfall estimation are described. The first part is a description of a preliminary data analysis for the purpose of statistical estimation of rainfall from multiple (radar and raingage) sensors. Raingage, radar, and joint radar-raingage estimation is described, and some results are given. Statistical parameters of rainfall spatial dependence are calculated and discussed in the context of optimal estimation. Quality control of radar data is also described. The second part describes radar scattering by ellipsoidal raindrops. An analytical solution is derived for the Rayleigh scattering regime. Single and volume scattering are presented. Comparison calculations with the known results for spheres and oblate spheroids are shown.
Petroleum marketing monthly, September 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The Petroleum Marketing Monthly (PMM) provides information and statistical data on a variety of crude oils and refined petroleum products. The publication presents statistics on crude oil costs and refined petroleum product sales for use by industry, government, private sector analysts, educational institutions, and consumers. Data on crude oil include the domestic first purchase price, the f.o.b. and landed cost of imported crude oil, and the refiners' acquisition cost of crude oil. Refined petroleum product sales data include motor gasoline, distillates, residuals, aviation fuels, kerosene, and propane. The Petroleum Marketing Division, Office of Oil and Gas, Energy Information Administration ensures the accuracy, quality, and confidentiality of the published data in the Petroleum Marketing Monthly.
Misconceptions of the p-value among Chilean and Italian Academic Psychologists
Badenes-Ribera, Laura; Frias-Navarro, Dolores; Iotti, Bryan; Bonilla-Campos, Amparo; Longobardi, Claudio
2016-01-01
Common misconceptions of p-values are based on certain beliefs and attributions about the significance of the results. Thus, they affect the professionals' decisions and jeopardize the quality of interventions and the accumulation of valid scientific knowledge. We conducted a survey on 164 academic psychologists (134 Italian, 30 Chilean) questioned on this topic. Our findings are consistent with previous research and suggest that some participants do not know how to correctly interpret p-values. The inverse probability fallacy presents the greatest comprehension problems, followed by the replication fallacy. These results highlight the importance of the statistical re-education of researchers. Recommendations for improving statistical cognition are proposed. PMID:27602007
Advances for the Topographic Characterisation of SMC Materials
Calvimontes, Alfredo; Grundke, Karina; Müller, Anett; Stamm, Manfred
2009-01-01
For a comprehensive study of Sheet Moulding Compound (SMC) surfaces, topographical data obtained by a contact-free optical method (chromatic aberration confocal imaging) were systematically acquired to characterise these surfaces with regard to their statistical, functional and volumetrical properties. Optimal sampling conditions (cut-off length and resolution) were obtained by a topographical-statistical procedure proposed in the present work. By using different length scales specific morphologies due to the influence of moulding conditions, metallic mould topography, glass fibre content and glass fibre orientation can be characterized. The aim of this study is to suggest a systematic topographical characterization procedure for composite materials in order to study and recognize the influence of production conditions on their surface quality.
Evaluation of statistical protocols for quality control of ecosystem carbon dioxide fluxes
Jorge F. Perez-Quezada; Nicanor Z. Saliendra; William E. Emmerich; Emilio A. Laca
2007-01-01
The process of quality control of micrometeorological and carbon dioxide (CO2) flux data can be subjective and may lack repeatability, which would undermine the results of many studies. Multivariate statistical methods and time series analysis were used together and independently to detect and replace outliers in CO2 flux...
Quality and Consistency of the NASA Ocean Color Data Record
NASA Technical Reports Server (NTRS)
Franz, Bryan A.
2012-01-01
The NASA Ocean Biology Processing Group (OBPG) recently reprocessed the multimission ocean color time-series from SeaWiFS, MODIS-Aqua, and MODIS-Terra using common algorithms and improved instrument calibration knowledge. Here we present an analysis of the quality and consistency of the resulting ocean color retrievals, including spectral water-leaving reflectance, chlorophyll a concentration, and diffuse attenuation. Statistical analysis of satellite retrievals relative to in situ measurements will be presented for each sensor, as well as an assessment of consistency in the global time-series for the overlapping periods of the missions. Results will show that the satellite retrievals are in good agreement with in situ measurements, and that the sensor ocean color data records are highly consistent over the common mission lifespan for the global deep oceans, but with degraded agreement in higher productivity, higher complexity coastal regions.
High dynamic range subjective testing
NASA Astrophysics Data System (ADS)
Allan, Brahim; Nilsson, Mike
2016-09-01
This paper describes a set of subjective tests that the authors carried out to assess end-user perception of video encoded with High Dynamic Range technology when viewed in a typical home environment. Viewers scored individual single clips of content presented in High Definition (HD) and Ultra High Definition (UHD), in Standard Dynamic Range (SDR), and in High Dynamic Range (HDR) using both the Perceptual Quantizer (PQ) and Hybrid Log-Gamma (HLG) transfer characteristics, and presented in SDR as the backwards-compatible rendering of the HLG representation. The quality of SDR HD was improved by approximately equal amounts by either increasing the dynamic range or increasing the resolution to UHD. A further, smaller increase in quality was observed in the Mean Opinion Scores of the viewers when both the dynamic range and the resolution were increased, but this was not quite statistically significant.
Sb2Te3 and Its Superlattices: Optimization by Statistical Design.
Behera, Jitendra K; Zhou, Xilin; Ranjan, Alok; Simpson, Robert E
2018-05-02
The objective of this work is to demonstrate the usefulness of fractional factorial design for optimizing the crystal quality of chalcogenide van der Waals (vdW) crystals. We statistically analyze the growth parameters of highly c-axis-oriented Sb2Te3 crystals and Sb2Te3-GeTe phase-change vdW heterostructured superlattices. The statistical significance of the growth parameters of temperature, pressure, power, buffer materials, and buffer layer thickness was found by fractional factorial design and response surface analysis. Temperature, pressure, power, and their second-order interactions are the major factors that significantly influence the quality of the crystals. Additionally, using tungsten rather than molybdenum as a buffer layer significantly enhances the crystal quality. Fractional factorial design minimizes the number of experiments that are necessary to find the optimal growth conditions, resulting in an order of magnitude improvement in the crystal quality. We highlight that statistical design-of-experiments methods, which are more commonly used in product design, should be considered more broadly by those designing and optimizing materials.
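A fractional factorial design trades runs for confounded interactions: a half fraction of a two-level, three-factor design needs only four runs instead of eight, generated from a single defining relation. A Python sketch (the mapping of coded factors to temperature, pressure, and power is an illustrative assumption, not the paper's actual design matrix):

```python
from itertools import product

# Half fraction of a 2^3 design with generator C = A*B (resolution III).
# Factors are coded -1/+1; e.g. A = temperature, B = pressure, C = power.
runs = [(a, b, a * b) for a, b in product((-1, 1), repeat=2)]

for a, b, c in runs:
    print(f"A={a:+d}  B={b:+d}  C={c:+d}")
```

The cost of the saved runs is aliasing: with C = A*B, the main effect of C is confounded with the A×B interaction, which is why significant second-order interactions (as reported above) must be disentangled with follow-up runs or response surface analysis.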
NASA Astrophysics Data System (ADS)
Baker, Allison H.; Hu, Yong; Hammerling, Dorit M.; Tseng, Yu-heng; Xu, Haiying; Huang, Xiaomeng; Bryan, Frank O.; Yang, Guangwen
2016-07-01
The Parallel Ocean Program (POP), the ocean model component of the Community Earth System Model (CESM), is widely used in climate research. Most current work in CESM-POP focuses on improving the model's efficiency or accuracy, such as improving numerical methods, advancing parameterization, porting to new architectures, or increasing parallelism. Since ocean dynamics are chaotic in nature, achieving bit-for-bit (BFB) identical results in ocean solutions cannot be guaranteed for even tiny code modifications, and determining whether modifications are admissible (i.e., statistically consistent with the original results) is non-trivial. In recent work, an ensemble-based statistical approach was shown to work well for software verification (i.e., quality assurance) on atmospheric model data. The general idea of the ensemble-based statistical consistency testing is to use a qualitative measurement of the variability of the ensemble of simulations as a metric with which to compare future simulations and make a determination of statistical distinguishability. The capability to determine consistency without BFB results boosts model confidence and provides the flexibility needed, for example, for more aggressive code optimizations and the use of heterogeneous execution environments. Since ocean and atmosphere models have differing characteristics in terms of dynamics, spatial variability, and timescales, we present a new statistical method to evaluate ocean model simulation data that requires the evaluation of ensemble means and deviations in a spatial manner. In particular, the statistical distribution from an ensemble of CESM-POP simulations is used to determine the standard score of any new model solution at each grid point. Then the percentage of points that have scores greater than a specified threshold indicates whether the new model simulation is statistically distinguishable from the ensemble simulations. Both ensemble size and composition are important.
Our experiments indicate that the new POP ensemble consistency test (POP-ECT) tool is capable of distinguishing cases that should be statistically consistent with the ensemble from those that should not, as well as providing a simple, objective and systematic way to detect errors in CESM-POP due to the hardware or software stack, positively contributing to quality assurance for the CESM-POP code.
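The grid-point consistency check described in this abstract can be sketched in a few lines. This is an illustrative simplification, not the actual POP-ECT implementation; the z-score threshold and allowable failure fraction below are assumed values for demonstration only.

```python
import random
import statistics

def consistency_test(ensemble, new_run, z_threshold=2.0, max_fail_frac=0.2):
    """Return (fail_fraction, passed): the fraction of grid points whose
    standard score against the ensemble exceeds z_threshold, and whether
    the new run is judged statistically consistent with the ensemble."""
    n_fail = 0
    for i, value in enumerate(new_run):
        vals = [run[i] for run in ensemble]        # ensemble values at point i
        mu = statistics.mean(vals)
        sd = statistics.stdev(vals)
        z = abs(value - mu) / sd if sd > 0 else 0.0
        if z > z_threshold:
            n_fail += 1
    frac = n_fail / len(new_run)
    return frac, frac <= max_fail_frac

# Toy data: a 20-member ensemble on a 50-point "grid"
random.seed(1)
ensemble = [[random.gauss(15.0, 0.5) for _ in range(50)] for _ in range(20)]
ok_run = [random.gauss(15.0, 0.5) for _ in range(50)]    # consistent solution
bad_run = [random.gauss(17.0, 0.5) for _ in range(50)]   # biased solution
print(consistency_test(ensemble, ok_run)[1],
      consistency_test(ensemble, bad_run)[1])
```

A biased run scores far from the ensemble mean at nearly every grid point and fails the test, while a run drawn from the same distribution passes.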
Cirera, Lluís; Salmerón, Diego; Martínez, Consuelo; Bañón, Rafael María; Navarro, Carmen
2018-06-06
After the return of Spain to democracy and the assumption of government powers by the regions, actions were initiated to improve the cause-of-death mortality statistics. The objective of this work was to describe the evolution of quality-improvement activities in the cause-of-death statistics of the Region of Murcia from 1989 to 2011. Descriptive epidemiological study of all death documents processed by the Murcia mortality registry. Use of indicators related to the quality of completion of the death certificate in medical and judicial notifications; recovery of information on the causes and circumstances of death; and impact on the statistics of ill-defined, unspecific and less-specific causes. During the study period, medical notifications without a temporal sequence on the death certificate (DC) decreased from an initial 46% to a final 21% (p<0.001). Information retrieval from sources was successful in 93% of cases in 2001, compared to 38% at the beginning of the period (p<0.001). Regional rates of ill-defined and unspecific causes fell more than the national ones; in the last year the differentials were 10.3 points (p<0.001) and 2.8 points (p=0.001), respectively. Medical death certification improved in form and suitability. Regulated recovery of the causes and circumstances of death corrected the medical and judicial information. The Region of Murcia presented lower rates of less-specific causes and ill-defined entities than the national averages.
Resende, Lucas; Carmo, Carolina do; Mocellin, Leão; Pasinato, Rogério; Mocellin, Marcos
2017-07-29
Septal deviations may cause nasal obstruction and negatively impact the quality of life of individuals. The efficacy of septoplasty for treatment of septal deviation and the predictors of satisfactory surgical outcomes remain controversial. Technical variability, heterogeneity of research samples and the absence of a solid tool for clinical evaluation are the main hindrances to the establishment of reliable statistical data regarding the procedure. To evaluate the improvement in disease-specific quality of life in patients submitted to septoplasty with bilateral outfracture of the inferior turbinates under sedation and local anesthesia in a tertiary hospital, and to assess possible clinical-epidemiological variables associated with functional outcome. Fifty-two patients consecutively submitted to septoplasty with bilateral outfracture of the inferior turbinates for treatment of nasal obstruction filled in forms with clinical and epidemiological information at enrollment and had their symptoms objectively quantified using the Nasal Obstruction Symptom Evaluation (NOSE) scale preoperatively and one and three months after the procedure. Statistical analysis aimed to determine overall and stratified surgical outcomes and to investigate correlations between the clinical-epidemiological variables and the scores obtained. A statistically significant improvement in the preoperative NOSE score compared to the scores obtained three months after surgery was demonstrated (p<0.001, Wilcoxon), with a strong correlation between the preoperative score and the postoperative improvement during this period (r=-0.614, p<0.001, Spearman). After one month, patients had reached on average 87.15% of the result obtained at study termination. Smokers and patients with rhinitis and/or pulmonary comorbidity showed higher average preoperative NOSE scores, although without statistical significance (p>0.05).
Gender, age, history of rhinitis and presence of pulmonary comorbidity did not significantly influence surgical outcomes (p>0.05). Smokers presented a greater reduction in NOSE scores during the study (p=0.043, Mann-Whitney U). Septoplasty with bilateral outfracture of the inferior turbinates has proven to significantly improve disease-specific quality of life, and this favorable outcome seems to occur early. Copyright © 2017 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.
Hezel, Marcus; von Usslar, Kathrin; Kurzweg, Thiemo; Lörincz, Balazs B; Knecht, Rainald
2016-04-01
This article reviews the methodical and statistical basics of designing a trial, with a special focus on the process of defining and choosing endpoints and cutpoints as the foundations of clinical research, and ultimately that of evidence-based medicine. There has been significant progress in the treatment of head and neck cancer in the past few decades. Currently available treatment options can have a variety of different goals, depending, for example, on tumor stage among other factors. The outcome of a specific treatment in clinical trials is measured using endpoints. Besides classical endpoints, such as overall survival or organ preservation, other endpoints like quality of life are becoming increasingly important in designing and conducting a trial. The present work is based on electronic research and focuses on the solid methodical and statistical basics of a clinical trial, on the structure of study designs and on the presentation of various endpoints.
Hansen, John P
2003-01-01
Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
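As an illustration of the inferential concepts this Part-2 article covers (samples, the standard normal distribution, and interval estimates), the following sketch builds a 95% z-interval for a sample mean. The data are hypothetical, and the use of a large-sample z-interval rather than a t-interval is an assumption made for simplicity.

```python
import statistics

# Hypothetical sample of 20 measurements from a quality-improvement study
data = [7.2, 6.8, 7.5, 7.0, 6.9, 7.3, 7.1, 6.7, 7.4, 7.1,
        7.0, 6.8, 7.2, 7.3, 6.9, 7.1, 7.0, 7.2, 6.8, 7.4]
n = len(data)
mean = statistics.mean(data)
sem = statistics.stdev(data) / n ** 0.5      # standard error of the mean

# 97.5th percentile of the standard normal distribution (about 1.96),
# so the interval covers the central 95% of the sampling distribution
z = statistics.NormalDist().inv_cdf(0.975)
lo, hi = mean - z * sem, mean + z * sem
print(f"mean = {mean:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

The interval quantifies the uncertainty in inferring the population mean from the sample, which is the central idea of the series.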
Creating Near-Term Climate Scenarios for AgMIP
NASA Astrophysics Data System (ADS)
Goddard, L.; Greene, A. M.; Baethgen, W.
2012-12-01
For the next assessment report of the IPCC (AR5), attention is being given to development of climate information that is appropriate for adaptation, such as decadal-scale and near-term predictions intended to capture the combined effects of natural climate variability and the emerging climate change signal. While the science and practice evolve for the production and use of dynamic decadal prediction, information relevant to agricultural decision-makers can be gained from analysis of past decadal-scale trends and variability. Statistical approaches that mimic the characteristics of observed year-to-year variability can indicate the range of possibilities and their likelihood. In this talk we present work towards the development of near-term climate scenarios, which are needed to engage decision-makers and stakeholders in the regions in current decision-making. The work includes analyses of decadal-scale variability and trends in the AgMIP regions, and statistical approaches that capture year-to-year variability and the associated persistence of wet and dry years. We will outline the general methodology and some of the specific considerations in the regional application of the methodology for different AgMIP regions, such as those for Western Africa versus southern Africa. We will also show some examples of quality checks and informational summaries of the generated data, including (1) metrics of information quality such as probabilistic reliability for a suite of relevant climate variables and indices important for agriculture; (2) quality checks relative to the use of this climate data in crop models; and (3) summary statistics (e.g., for 5-10-year periods or across given spatial scales).
Paré, Pierre; Lee, Joanna; Hawes, Ian A
2010-03-01
To determine whether strategies to counsel and empower patients with heartburn-predominant dyspepsia could improve health-related quality of life. Using a cluster randomized, parallel group, multicentre design, nine centres were assigned to provide either basic or comprehensive counselling to patients (age range 18 to 50 years) presenting with heartburn-predominant upper gastrointestinal symptoms, who would be considered for drug therapy without further investigation. Patients were treated for four weeks with esomeprazole 40 mg once daily, followed by six months of treatment that was at the physician's discretion. The primary end point was the baseline change in Quality of Life in Reflux and Dyspepsia (QOLRAD) questionnaire score. A total of 135 patients from nine centres were included in the intention-to-treat analysis. There was a statistically significant baseline improvement in all domains of the QOLRAD questionnaire in both study arms at four and seven months (P<0.0001). After four months, the overall mean change in QOLRAD score appeared greater in the comprehensive counselling group than in the basic counselling group (1.77 versus 1.47, respectively); however, this difference was not statistically significant (P=0.07). After seven months, the overall mean baseline change in QOLRAD score between the comprehensive and basic counselling groups was not statistically significant (1.69 versus 1.56, respectively; P=0.63). A standardized, comprehensive counselling intervention showed a positive initial trend in improving quality of life in patients with heartburn-predominant uninvestigated dyspepsia. Further investigation is needed to confirm the potential benefits of providing patients with comprehensive counselling regarding disease management.
The impact of primary open-angle glaucoma: Quality of life in Indian patients.
Kumar, Suresh; Ichhpujani, Parul; Singh, Roopali; Thakur, Sahil; Sharma, Madhu; Nagpal, Nimisha
2018-03-01
Glaucoma significantly affects the quality of life (QoL) of a patient. Despite the huge number of glaucoma patients in India, not many, QoL studies have been carried out. The purpose of the present study was to evaluate the QoL in Indian patients with varying severity of glaucoma. This was a hospital-based, cross-sectional, analytical study of 180 patients. The QoL was assessed using orally administered QoL instruments comprising of two glaucoma-specific instruments; Glaucoma Quality of Life-15 (GQL-15) and Viswanathan 10 instrument, and 1 vision-specific instrument; National Eye Institute Visual Function Questionnaire-25 (NEIVFQ25). Using NEIVFQ25, the difference between mean QoL scores among cases (88.34 ± 4.53) and controls (95.32 ± 5.76) was statistically significant. In GQL-15, there was a statistically significant difference between mean scores of cases (22.58 ± 5.23) and controls (16.52 ± 1.24). The difference in mean scores with Viswanathan 10 instrument in cases (7.92 ± 0.54) and controls (9.475 ± 0.505) was also statistically significant. QoL scores also showed moderate correlation with mean deviation, pattern standard deviation, and vertical cup-disc ratio. In our study, all the three instruments showed decrease in QoL in glaucoma patients compared to controls. With the increase in severity of glaucoma, corresponding decrease in QoL was observed. It is important for ophthalmologists to understand about the QoL in glaucoma patients so as to have a more holistic approach to patients and for effective delivery of treatment.
The Effect of Hydration on the Voice Quality of Future Professional Vocal Performers.
van Wyk, Liezl; Cloete, Mariaan; Hattingh, Danel; van der Linde, Jeannie; Geertsema, Salome
2017-01-01
The application of systemic hydration as an instrument for optimal voice quality has been common practice among professional voice users over the years. Although the physiological action has been determined, the benefits on acoustic and perceptual characteristics are relatively unknown. The present study aimed to determine whether systemic hydration has beneficial outcomes on the voice quality of future professional voice users. A within-subject, pretest-posttest design was applied to obtain quantitative results from female singing students between 18 and 32 years of age without a history of voice pathology. Acoustic and perceptual data were collected before and after a 2-hour singing rehearsal. The difference between the hypohydrated (control) condition and the hydrated (experimental) condition, and the relationship between adequate hydration and acoustic and perceptual parameters of voice, were then investigated. A statistically significant (P = 0.041) increase in jitter values was obtained for the hypohydrated condition. Increased maximum phonation time (MPT /z/) and a higher maximum frequency under hydration indicated further statistically significant changes in voice quality (P = 0.028 and P = 0.015, respectively). Systemic hydration has positive outcomes on perceptual and acoustic parameters of voice quality for future professional singers. The singer's ability to sustain notes longer and reach higher frequencies may reflect well in performances. Any positive change in voice quality may benefit the singer's occupational success and subsequently their social, emotional, and vocational well-being. More research evidence is needed to determine the parameters for implementing adequate hydration in vocal hygiene programs. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
[Quality of life of children with bronchial asthma disease].
Chromá, Jana; Slaný, Jaroslav
2011-01-01
The aim of this study was to determine how children with bronchial asthma assess their quality of life and to examine domains of physical and psychosocial health in relation to age and gender. Quality of life was also compared between asthmatic and healthy children, and between parents of asthmatic children and parents of healthy children. The research sample consisted of 199 children and 125 parents. Participants were assessed with the standardized pediatric quality-of-life questionnaire PedsQL 4.0 and the PedsQL 2.0 Family Impact Module. The research was conducted between September 2010 and January 2011 in pediatric allergology outpatient clinics of the University and Municipal Hospital in Ostrava. The mean quality-of-life score of asthmatic children was 74.41, with a statistically significant difference between the physical (78.81) and psychosocial (72.06) dimensions of health. The analysis shows that girls evaluate their quality of life worse than boys. The worst quality of life was found among children in the age group 5-7 years. No statistically significant difference in quality of life was found between asthmatic and healthy children, whereas a statistically significant difference was found between parents of asthmatic children and parents of healthy children. We must not forget that the quality of life of the parents of asthmatic children is significantly influenced by the chronic disease of their children.
The frequency of fibromyalgia syndrome and quality of life in hospitalized cancer patients.
Eyigor, S; Karapolat, H; Korkmaz, O K; Eyigor, C; Durmaz, B; Uslu, R; Uyar, M
2009-03-01
To explore the frequency of fibromyalgia syndrome (FMS) among hospitalized cancer patients and address the relationships between pain, fatigue and quality of life with regard to the extent of pain, a cross-sectional and descriptive study was carried out in the Oncology Supportive Care Unit on 122 hospitalized cancer patients. Pain, sleep, disease impact (Fibromyalgia Impact Questionnaire), fatigue (Brief Fatigue Inventory), quality of life (Short Form 36 and European Organization for Research on Treatment of Cancer questionnaires Quality of Life-C30) were gathered using standardized measures. Thirteen of the hospitalized cancer patients (10.7%) included in the study were diagnosed with FMS. There were no statistically significant differences among three pain groups with respect to demographic characteristics (P > 0.05). There were significant differences among groups with regard to the presence of metastasis, fatigue, sleep disorder, pain, Brief Fatigue Inventory, Fibromyalgia Impact Questionnaire, most of subscores of Short Form 36 and European Organization for Research on Treatment of Cancer questionnaires Quality of Life-C30 scores (P < 0.05). In the present study, we have calculated the frequency of FMS among patients admitted to the oncology hospital in addition to establishing the relationships between pain, fatigue and quality of life with regard to the extent of pain. We believe that the descriptive data presented in this study would be helpful in future studies and therapeutic approaches.
Control of maglev vehicles with aerodynamic and guideway disturbances
NASA Technical Reports Server (NTRS)
Flueckiger, Karl; Mark, Steve; Caswell, Ruth; Mccallum, Duncan
1994-01-01
A modeling, analysis, and control design methodology is presented for maglev vehicle ride quality performance improvement as measured by the Pepler Index. Ride quality enhancement is considered through active control of secondary suspension elements and active aerodynamic surfaces mounted on the train. To analyze and quantify the benefits of active control, the authors have developed a five degree-of-freedom lumped parameter model suitable for describing a large class of maglev vehicles, including both channel and box-beam guideway configurations. Elements of this modeling capability have been recently employed in studies sponsored by the U.S. Department of Transportation (DOT). A perturbation analysis about an operating point, defined by vehicle and average crosswind velocities, yields a suitable linearized state space model for multivariable control system analysis and synthesis. Neglecting passenger compartment noise, the ride quality as quantified by the Pepler Index is readily computed from the system states. A statistical analysis is performed by modeling the crosswind disturbances and guideway variations as filtered white noise, whereby the Pepler Index is established in closed form through the solution to a matrix Lyapunov equation. Data is presented which indicates the anticipated ride quality achieved through various closed-loop control arrangements.
Workflow for Criticality Assessment Applied in Biopharmaceutical Process Validation Stage 1.
Zahel, Thomas; Marschall, Lukas; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Mueller, Eric M; Murphy, Patrick; Natschläger, Thomas; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph
2017-10-12
Identification of critical process parameters that impact product quality is a central task during regulatory-requested process validation. Commonly, this is done via design of experiments and identification of parameters significantly impacting product quality (rejection of the null hypothesis that the effect equals 0). However, parameters whose effects carry large uncertainty, and which might drive product quality beyond a limit critical to the product, may be missed. This might occur during the evaluation of experiments because the residual/un-modelled variance in the experiments is larger than expected a priori. Estimating such a risk is the task of the presented novel retrospective power analysis permutation test. This is evaluated using a data set for two unit operations established during characterization of a biopharmaceutical process in industry. The results show that, for one unit operation, the observed variance in the experiments is much larger than expected a priori, resulting in low power levels for all non-significant parameters. Moreover, we present a workflow for mitigating the risk associated with overlooked parameter effects. This enables a statistically sound identification of critical process parameters. The developed workflow will substantially support industry in delivering constant product quality, reducing process variance and increasing patient safety.
Petroleum supply annual, 1990. [Contains Glossary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-05-30
The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1990 through annual and monthly surveys. The PSA is divided into two volumes. This first volume contains three sections, Summary Statistics, Detailed Statistics, and Refinery Capacity, each with final annual data. The second volume contains final statistics for each month of 1990, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary. 35 tabs.
Petroleum supply annual 1992. [Contains glossary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-05-27
The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1992 through annual and monthly surveys. The PSA is divided into two volumes. The first volume contains four sections: Summary Statistics, Detailed Statistics, Refinery Capacity, and Oxygenate Capacity, each with final annual data. This second volume contains final statistics for each month of 1992, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary.
NASA Technical Reports Server (NTRS)
Tamayo, Tak Chai
1987-01-01
Quality of software is not only vital to the successful operation of the space station; it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software has the characteristics of no wearout and costly redundancies, making traditional statistical analysis unsuitable for evaluating software reliability. A statistical model was developed to represent the number as well as the types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria for terminating testing based on reliability objectives and methods for estimating the expected number of fixes required are also presented.
Statistical analysis of QC data and estimation of fuel rod behaviour
NASA Astrophysics Data System (ADS)
Heins, L.; Groß, H.; Nissen, K.; Wunderlich, F.
1991-02-01
The behaviour of fuel rods while in reactor is influenced by many parameters. As far as fabrication is concerned, fuel pellet diameter and density, and inner cladding diameter are important examples. Statistical analyses of quality control data show a scatter of these parameters within the specified tolerances. At present it is common practice to use a combination of superimposed unfavorable tolerance limits (worst case dataset) in fuel rod design calculations. Distributions are not considered. The results obtained in this way are very conservative but the degree of conservatism is difficult to quantify. Probabilistic calculations based on distributions allow the replacement of the worst case dataset by a dataset leading to results with known, defined conservatism. This is achieved by response surface methods and Monte Carlo calculations on the basis of statistical distributions of the important input parameters. The procedure is illustrated by means of two examples.
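The probabilistic alternative to worst-case tolerance stacking described in this abstract can be illustrated with a toy Monte Carlo calculation. Everything below is hypothetical: the response function, the QC distributions, and the tolerance numbers are invented for illustration and are not a real fuel-rod model.

```python
import random

random.seed(42)

def gap_width(pellet_d, clad_id):
    """Hypothetical response: diametral gap between pellet and cladding (mm)."""
    return clad_id - pellet_d

# Assumed normal QC distributions (mean, std) within the specified tolerances
pellet_d = lambda: random.gauss(8.05, 0.005)   # pellet diameter, mm
clad_id  = lambda: random.gauss(8.22, 0.008)   # inner cladding diameter, mm

# Monte Carlo: propagate the input distributions through the response
samples = sorted(gap_width(pellet_d(), clad_id()) for _ in range(20000))

# Worst case: stack unfavorable 3-sigma limits of both inputs at once
worst_case = gap_width(8.05 + 3 * 0.005, 8.22 - 3 * 0.008)

# Low extreme of the Monte Carlo result (~0.05th percentile)
p05 = samples[int(0.0005 * len(samples))]
print(f"worst-case gap: {worst_case:.4f} mm, MC low percentile: {p05:.4f} mm")
```

The Monte Carlo low percentile sits above the stacked worst case, showing how the worst-case dataset is conservative by an amount that the probabilistic calculation can actually quantify.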
Dodge, Kent A.; Hornberger, Michelle I.; Dyke, Jessica
2012-01-01
Water, bed sediment, and biota were sampled in streams from Butte to near Missoula, Montana, as part of a monitoring program in the upper Clark Fork basin. The sampling program was conducted by the U.S. Geological Survey in cooperation with the U.S. Environmental Protection Agency to characterize aquatic resources in the Clark Fork basin of western Montana, with emphasis on trace elements associated with historic mining and smelting activities. Sampling sites were located on the Clark Fork and selected tributaries. Water samples were collected periodically at 20 sites from October 2009 through September 2010. Bed-sediment and biota samples were collected once at 13 sites during August 2010. This report presents the analytical results and quality-assurance data for water-quality, bed-sediment, and biota samples collected at sites from October 2009 through September 2010. Water-quality data include concentrations of selected major ions, trace elements, and suspended sediment. Turbidity was analyzed for water samples collected at the four sites where seasonal daily values of turbidity were being determined. Daily values of suspended-sediment concentration and suspended-sediment discharge were determined for four sites. Bed-sediment data include trace-element concentrations in the fine-grained fraction. Biological data include trace-element concentrations in whole-body tissue of aquatic benthic insects. Statistical summaries of water-quality, bed-sediment, and biological data for sites in the upper Clark Fork basin are provided for the period of record since 1985.
Dodge, Kent A.; Hornberger, Michelle I.; Dyke, Jessica
2014-01-01
Water, bed sediment, and biota were sampled in streams from Butte to near Missoula, Montana, as part of a monitoring program in the upper Clark Fork Basin of western Montana. The sampling program was conducted by the U.S. Geological Survey in cooperation with the U.S. Environmental Protection Agency to characterize aquatic resources in the Clark Fork Basin, with emphasis on trace elements associated with historic mining and smelting activities. Sampling sites were located on the Clark Fork and selected tributaries. Water samples were collected periodically at 20 sites from October 2011 through September 2012. Bed-sediment and biota samples were collected once at 13 sites during August 2012. This report presents the analytical results and quality-assurance data for water-quality, bed-sediment, and biota samples collected at sites from October 2011 through September 2012. Water-quality data include concentrations of selected major ions, trace elements, and suspended sediment. Turbidity was analyzed for water samples collected at the four sites where seasonal daily values of turbidity were being determined. Daily values of suspended-sediment concentration and suspended-sediment discharge were determined for four sites. Bed-sediment data include trace-element concentrations in the fine-grained fraction. Biological data include trace-element concentrations in whole-body tissue of aquatic benthic insects. Statistical summaries of water-quality, bed-sediment, and biological data for sites in the upper Clark Fork Basin are provided for the period of record since 1985.
Schenone, Mauro; Ziebarth, Sarah; Duncan, Jose; Stokes, Lea; Hernandez, Angela
2018-02-05
To investigate the proportion of documented ultrasound findings that were unsupported by stored ultrasound images in the obstetric ultrasound unit, before and after the implementation of a quality improvement process consisting of a checklist and feedback. A quality improvement process was created involving utilization of a checklist and feedback from physician to sonographer. The feedback was based on findings of the physician's review of the report and images using a check list. To assess the impact of this process, two groups were compared. Group 1 consisted of 58 ultrasound reports created prior to initiation of the process. Group 2 included 65 ultrasound reports created after process implementation. Each chart was reviewed by a physician and a sonographer. Findings considered unsupported by stored images by both reviewers were used for analysis, and the proportion of unsupported findings was compared between the two groups. Results are expressed as mean ± standard error. A p value of < .05 was used to determine statistical significance. Univariate analysis of baseline characteristics and potential confounders showed no statistically significant difference between the groups. The mean proportion of unsupported findings in Group 1 was 5.1 ± 0.87, with Group 2 having a significantly lower proportion (2.6 ± 0.62) (p value = .018). Results suggest a significant decrease in the proportion of unsupported findings in ultrasound reports after quality improvement process implementation. Thus, we present a simple yet effective quality improvement process to reduce unsupported ultrasound findings.
NASA Astrophysics Data System (ADS)
Denamiel, Cléa; Šepić, Jadranka; Vilibić, Ivica
2018-05-01
In engineering studies, harbor resonance, including quality and amplification factors, is typically computed for swell and waves with periods shorter than 10 min. However, in various locations around the world, such as Vela Luka Bay in Croatia, meteotsunami waves with periods greater than 10 min can excite the bay or harbor natural modes and produce substantial structural damage. In this theoretical study, the impact of several geomorphological changes to Vela Luka Bay (i.e., deepening the bay, dredging the harbor, adding a pier or a marina) on the amplification of meteotsunami waves is presented for a set of 6401 idealized pressure wave field forcings used to derive robust statistics. The most substantial increase in maximum elevation is found when the Vela Luka harbor is dredged to a 5 m depth, which contradicts the calculation of the quality factor, which shows a decrease of the harbor's natural resonance. It is shown that the forcing energy content in different frequency bands should also be taken into account when estimating the quality and amplification factors, as their typical definitions, derived from the peak frequency of the sea level spectrum, fail to represent the harbor response during meteotsunami events. New definitions of these factors are proposed in this study and are shown to be in good agreement with the statistical analysis of the Vela Luka Bay maximum-elevation results. In addition, the presented methodology can easily be applied to any other location in the world where meteotsunamis occur.
No-Reference Video Quality Assessment Based on Statistical Analysis in 3D-DCT Domain.
Li, Xuelong; Guo, Qun; Lu, Xiaoqiang
2016-05-13
It is an important task to design models for universal no-reference video quality assessment (NR-VQA) in multiple video processing and computer vision applications. However, most existing NR-VQA metrics are designed for specific distortion types, which are often unknown in practical applications. A further deficiency is that the spatial and temporal information of videos is hardly considered simultaneously. In this paper, we propose a new NR-VQA metric based on spatiotemporal natural video statistics (NVS) in the 3D discrete cosine transform (3D-DCT) domain. In the proposed method, a set of features is first extracted based on the statistical analysis of 3D-DCT coefficients to characterize the spatiotemporal statistics of videos in different views. These features are then used to predict perceived video quality via an efficient linear support vector regression (SVR) model. The contributions of this paper are: 1) we explore the spatiotemporal statistics of videos in the 3D-DCT domain, which has an inherent spatiotemporal encoding advantage over other widely used 2D transformations; 2) we extract a small set of simple but effective statistical features for video visual quality prediction; 3) the proposed method is universal for multiple types of distortions and robust across different databases. The proposed method is tested on four widely used video databases. Extensive experimental results demonstrate that the proposed method is competitive with state-of-the-art NR-VQA metrics and the top-performing FR-VQA and RR-VQA metrics.
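The block-wise 3D-DCT feature extraction described above can be sketched as follows. The specific statistics computed here (mean, standard deviation, and a kurtosis proxy of the AC coefficients) are illustrative stand-ins for the paper's actual feature design, which is not reproduced, and the SVR stage is omitted:

```python
import numpy as np
from scipy.fft import dctn

def nvs_features(video, block=8):
    """Extract simple spatiotemporal statistics of 3D-DCT coefficients.

    `video` is a (frames, height, width) array.  Each 8x8x8 cube is
    transformed with a 3D-DCT; simple moments of the AC coefficients
    are pooled over blocks as a crude NVS feature vector.
    """
    T, H, W = video.shape
    feats = []
    for t in range(0, T - block + 1, block):
        for y in range(0, H - block + 1, block):
            for x in range(0, W - block + 1, block):
                cube = video[t:t + block, y:y + block, x:x + block]
                c = dctn(cube, norm='ortho').ravel()
                ac = c[1:]                    # drop the DC coefficient
                feats.append([ac.mean(), ac.std(),
                              np.mean(ac**4) / (ac.var()**2 + 1e-12)])
    return np.asarray(feats).mean(axis=0)     # pool block-level features

rng = np.random.default_rng(0)
video = rng.normal(size=(16, 32, 32))         # stand-in for a real clip
f = nvs_features(video)
print(f.shape)
```

In a full pipeline, such per-video feature vectors would be fed to a regressor (linear SVR in the paper) trained against subjective quality scores.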
Tang, Jie; Nett, Brian E; Chen, Guang-Hong
2009-10-07
Of all available reconstruction methods, statistical iterative reconstruction algorithms appear particularly promising because they enable accurate physical noise modeling. The newly developed compressive sampling/compressed sensing (CS) algorithm has shown the potential to accurately reconstruct images from highly undersampled data, and it can be implemented in the statistical reconstruction framework as well. In this study, we compared the performance of two standard statistical reconstruction algorithms (penalized weighted least squares and q-GGMRF) with the CS algorithm. In assessing image quality using these iterative reconstructions, it is critical to utilize realistic background anatomy, as the reconstruction results are object dependent. A cadaver head was scanned on a Varian Trilogy system at different dose levels. Several figures of merit, including the relative root mean square error and a quality factor accounting for noise performance and spatial resolution, were introduced to objectively evaluate reconstruction performance. A comparison is presented between the three algorithms at several dose levels for a constant undersampling factor. To facilitate this comparison, the original CS method was formulated in the framework of the statistical image reconstruction algorithms. Important conclusions from our measurements are that (1) for realistic neuro-anatomy, over 100 projections are required to avoid streak artifacts in the reconstructed images, even with CS reconstruction; (2) regardless of the algorithm employed, it is beneficial to distribute the total dose over more views as long as each view remains quantum noise limited; and (3) the total-variation-based CS method is not appropriate for very low dose levels because, while it can mitigate streaking artifacts, the images exhibit patchy behavior that is potentially harmful for medical diagnosis.
Kamioka, Hiroharu; Tsutani, Kiichiro; Okuizumi, Hiroyasu; Mutoh, Yoshiteru; Ohta, Miho; Handa, Shuichi; Okada, Shinpei; Kitayuguchi, Jun; Kamada, Masamitsu; Shiozawa, Nobuyoshi; Honda, Takuya
2010-01-01
The objective of this review was to summarize findings on aquatic exercise and balneotherapy and to assess the quality of systematic reviews based on randomized controlled trials (RCTs). Studies were eligible if they were systematic reviews based on RCTs (with or without a meta-analysis) that included at least 1 treatment group that received aquatic exercise or balneotherapy. We searched the following databases for articles published from 1990 to August 17, 2008: Cochrane Database of Systematic Reviews, MEDLINE, CINAHL, Web of Science, JDream II, and Ichushi-Web. Two of the present authors independently assessed the quality of articles using the AMSTAR checklist. We found evidence that aquatic exercise had small but statistically significant effects on pain relief and related outcome measures of locomotor diseases (eg, arthritis, rheumatoid diseases, and low back pain); however, long-term effectiveness was unclear. Because evidence was lacking due to the poor methodological quality of balneotherapy studies, we were unable to draw any conclusions on the effects of that intervention. Frequent flaws concerned the description of excluded RCTs and the assessment of publication bias. In summary, aquatic exercise had a small but statistically significant short-term effect on locomotor diseases, whereas the effectiveness of balneotherapy in curing disease or improving health remains unclear.
NASA Astrophysics Data System (ADS)
Lee, Soon Hwan; Kim, Ji Sun; Lee, Kang Yeol; Shon, Keon Tae
2017-04-01
Air quality in Korea is worsening due to increasing particulate matter (PM). At present, the PM forecast is announced based on PM concentrations predicted by an air-quality numerical prediction model. However, forecast accuracy is not as high as expected due to various uncertainties in PM physical and chemical characteristics. The purpose of this study was to develop a numerical-statistical ensemble model to improve the accuracy of PM10 concentration predictions. The numerical models used in this study are the three-dimensional atmospheric Weather Research and Forecasting (WRF) model and the Community Multiscale Air Quality (CMAQ) model. The target areas for the PM forecast are the Seoul, Busan, Daegu, and Daejeon metropolitan areas in Korea. The data used in model development are observed PM concentrations and CMAQ predictions over a 3-month period (March 1 - May 31, 2014). A dynamic-statistical technique for reducing the systematic error of the CMAQ predictions was applied through a dynamic linear model (DLM) based on the Bayesian Kalman filter. When the corrections generated by the dynamic linear model were applied to the forecasting of PM concentrations, accuracy was improved. In particular, excellent improvement is shown at high PM concentrations, where the damage is relatively large.
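The bias-correction idea behind a DLM/Kalman-filter post-processor can be reduced to a scalar sketch: recursively estimate the systematic error of the numerical forecast and subtract it from the next prediction. The state and observation noise variances (`q`, `r`) below are illustrative values, not the study's settings:

```python
import numpy as np

def dlm_bias_correct(model_pred, observed, q=1.0, r=25.0):
    """Recursive (Kalman-filter) estimate of the additive bias of a
    model forecast, in the spirit of a dynamic linear model.  Today's
    forecast is corrected with the bias learned from past days only.
    """
    bias, p = 0.0, 1.0                 # initial bias estimate and variance
    corrected = []
    for f, o in zip(model_pred, observed):
        corrected.append(f - bias)     # correct today's forecast
        p += q                         # time update (state noise)
        k = p / (p + r)                # Kalman gain
        bias += k * ((f - o) - bias)   # measurement update
        p *= (1 - k)
    return np.array(corrected)

rng = np.random.default_rng(1)
truth = 40 + 10 * rng.standard_normal(200)            # hypothetical PM10 obs
forecast = truth + 15 + 3 * rng.standard_normal(200)  # model biased by +15
corrected = dlm_bias_correct(forecast, truth)
print(abs((forecast - truth).mean()), abs((corrected - truth).mean()))
```

After a short spin-up the estimated bias converges toward the model's systematic error, and the mean error of the corrected series drops well below that of the raw forecast.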
Dodge, Kent A.; Hornberger, Michelle I.; Dyke, Jessica
2008-01-01
Water, bed sediment, and biota were sampled in streams from Butte to below Milltown Reservoir as part of a long-term monitoring program in the upper Clark Fork basin; additional water-quality samples were collected in the Clark Fork basin from sites near Milltown Reservoir downstream to near the confluence of the Clark Fork and Flathead River as part of a supplemental sampling program. The sampling programs were conducted in cooperation with the U.S. Environmental Protection Agency to characterize aquatic resources in the Clark Fork basin of western Montana, with emphasis on trace elements associated with historic mining and smelting activities. Sampling sites were located on the Clark Fork and selected tributaries. Water-quality samples were collected periodically at 22 sites from October 2006 through September 2007. Bed-sediment and biological samples were collected once at 12 sites during August 2007. This report presents the analytical results and quality-assurance data for water-quality, bed-sediment, and biota samples collected at all long-term and supplemental monitoring sites from October 2006 through September 2007. Water-quality data include concentrations of selected major ions, trace elements, and suspended sediment. Turbidity was analyzed for samples collected at sites where seasonal daily values of turbidity were being determined. Nutrients also were analyzed in the supplemental water-quality samples. Daily values of suspended-sediment concentration and suspended-sediment discharge were determined for four sites, and seasonal daily values of turbidity were determined for five sites. Bed-sediment data include trace-element concentrations in the fine-grained fraction. Biological data include trace-element concentrations in whole-body tissue of aquatic benthic insects. Statistical summaries of long-term water-quality, bed-sediment, and biological data for sites in the upper Clark Fork basin are provided for the period of record since 1985.
North Dakota's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; David E. Haugen; Charles J. Barnett
2011-01-01
The first full annual inventory of North Dakota's forests was completed in 2005 after 7,622 plots were selected and 164 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the North Dakota...
South Dakota's forests, 2005: statistics, methods, and quality assurance
Patrick D. Miles; Ronald J. Piva; Charles J. Barnett
2011-01-01
The first full annual inventory of South Dakota's forests was completed in 2005 after 8,302 plots were selected and 325 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the South Dakota...
Analysis of ETMS Data Quality for Traffic Flow Management Decisions
NASA Technical Reports Server (NTRS)
Chatterji, Gano B.; Sridhar, Banavar; Kim, Douglas
2003-01-01
The data needed for air traffic flow management decision support tools are provided by the Enhanced Traffic Management System (ETMS). This includes both the tools in current use and those being developed for future deployment. Since the quality of decision support provided by all these tools is influenced by the quality of the input ETMS data, an assessment of ETMS data quality is needed. Motivated by this need, ETMS data quality is examined in this paper in terms of the unavailability of flight plans, deviation from filed flight plans, departure delays, altitude errors, and track-data drops. Although many of these data quality issues are not new, little is known about their extent. A goal of this paper is to document the magnitude of the data quality issues, supported by numerical analysis of ETMS data. Guided by this goal, ETMS data for a 24-hour period were processed to determine the number of aircraft with missing flight plan messages at any given instant of time. Results are presented for aircraft above 18,000 feet altitude and also at all altitudes. Since deviation from the filed flight plan is also a major cause of trajectory-modeling errors, statistics of deviations are presented. Errors in proposed departure times and ETMS-generated vertical profiles are also shown. A method for conditioning the vertical profiles to improve demand prediction accuracy is described. Graphs of actual sector counts obtained using these vertical profiles are compared with those obtained using Host data for sectors in the Fort Worth Center to demonstrate the benefit of preprocessing. Finally, results are presented to quantify the extent of data drops, and a method for propagating track positions during ETMS data drops is described.
Potential errors and misuse of statistics in studies on leakage in endodontics.
Lucena, C; Lopez, J M; Pulgar, R; Abalos, C; Valderrama, M J
2013-04-01
To assess the quality of the statistical methodology used in studies of leakage in endodontics, and to compare the results found using appropriate versus inappropriate inferential statistical methods. The search strategy used the descriptors 'root filling', 'microleakage', 'dye penetration', 'dye leakage', 'polymicrobial leakage' and 'fluid filtration' for the time interval 2001-2010 in journals within the categories 'Dentistry, Oral Surgery and Medicine' and 'Materials Science, Biomaterials' of the Journal Citation Reports. All retrieved articles were reviewed to find potential pitfalls in statistical methodology that may be encountered during study design, data management or data analysis. The database included 209 papers. In all the studies reviewed, the statistical methods used were appropriate for the category attributed to the outcome variable, but in 41% of cases the chi-square test or parametric methods were subsequently applied inappropriately. In 2% of the papers, no statistical test was used. In 99% of cases, a statistically 'significant' or 'not significant' effect was reported as a main finding, whilst only 1% also presented an estimate of the magnitude of the effect. When appropriate statistical methods were applied in the studies with originally inappropriate data analysis, the conclusions changed in 19% of the cases. Statistical deficiencies in leakage studies may affect their results and interpretation and might be one of the reasons for the poor agreement amongst reported findings. Therefore, more effort should be made to standardize statistical methodology. © 2012 International Endodontic Journal.
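The kind of methodological choice at issue can be illustrated with simulated ordinal leakage scores (these data are invented for illustration and are not from the review). Dye-penetration scores are ordinal, so a rank-based test such as Mann-Whitney U is appropriate, whereas Student's t-test treats the scores as interval data:

```python
import numpy as np
from scipy import stats

# Hypothetical ordinal dye-penetration scores (0 = none .. 3 = severe)
# for two root-canal sealers, 40 teeth per group.
rng = np.random.default_rng(2)
sealer_a = rng.choice([0, 1, 2, 3], size=40, p=[0.50, 0.30, 0.15, 0.05])
sealer_b = rng.choice([0, 1, 2, 3], size=40, p=[0.05, 0.15, 0.30, 0.50])

# Parametric analysis (assumes interval-scale, normally distributed data)
t_p = stats.ttest_ind(sealer_a, sealer_b).pvalue
# Rank-based analysis appropriate for ordinal scores
u_p = stats.mannwhitneyu(sealer_a, sealer_b, alternative='two-sided').pvalue
print(f"t-test p = {t_p:.2g}, Mann-Whitney p = {u_p:.2g}")
```

With clearly separated groups both tests agree; the review's point is that near the significance threshold the inappropriate test can flip the conclusion, which it found happened in 19% of re-analysed studies.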
NASA Astrophysics Data System (ADS)
Edjah, Adwoba; Stenni, Barbara; Cozzi, Giulio; Turetta, Clara; Dreossi, Giuliano; Tetteh Akiti, Thomas; Yidana, Sandow
2017-04-01
Affiliations: Department of Earth Science, University of Ghana, Legon, Ghana; Department of Environmental Sciences, Informatics and Statistics, Ca' Foscari University of Venice, Italy; Institute for the Dynamics of Environmental Processes, CNR, Venice, Italy; Department of Nuclear Application and Techniques, Graduate School of Nuclear and Allied Sciences, University of Ghana, Legon. This research is part of the PhD project "Hydrogeological Assessment of the Lower Tano River Basin for Sustainable Economic Usage, Ghana, West Africa". In this study, surface-water and groundwater quality in the Lower Tano River basin was investigated at selected sampling sites associated with mining activities and with oil and gas development. A statistical approach was applied to characterize the quality of surface water and groundwater, and water stable isotopes, natural tracers of the hydrological cycle, were used to investigate the origin of groundwater recharge in the basin. The study revealed that Pb and Ni concentrations in the surface-water and groundwater samples exceeded the WHO standards for drinking water. In addition, a water quality index (WQI) based on physicochemical parameters (EC, TDS, pH) and major ions (Ca2+, Na+, Mg2+, HCO3-, NO3-, Cl-, SO42-, K+) indicated good-quality water for 60% of the sampled surface water and groundwater. Other indices, such as the heavy metal pollution index (HPI), degree of contamination (Cd), and heavy metal evaluation index (HEI), based on trace-element concentrations in the water samples, reveal that 90% of the surface-water and groundwater samples show a high level of pollution. Principal component analysis (PCA) also suggests that water quality in the basin is likely affected by rock-water interaction and anthropogenic activities (seawater intrusion).
This was confirmed by further statistical analysis (cluster analysis and correlation matrix) of the water quality parameters. The spatial distribution of the water quality parameters, trace elements, and statistical-analysis results was mapped using a geographical information system (GIS). In addition, isotopic analysis of the sampled surface water and groundwater revealed that most of the water is of meteoric origin, with little or no isotopic variation. It is expected that the outcomes of this research will form a baseline for appropriate water-quality management decisions in the Lower Tano River basin. Keywords: water stable isotopes, trace elements, multivariate statistics, evaluation indices, Lower Tano River basin.
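A heavy-metal pollution index of the kind cited above is typically a weighted mean of sub-indices relative to permissible limits. The sketch below uses the common simplified HPI formulation with unit weights inversely proportional to the standards; the exact formulation and standards used in the study may differ, and the sample concentrations are hypothetical:

```python
import numpy as np

def heavy_metal_pollution_index(conc, standard):
    """Heavy-metal pollution index (HPI), simplified common form:
    HPI = sum(Wi*Qi) / sum(Wi), with sub-index Qi = 100 * Mi / Si and
    unit weight Wi = 1 / Si, where Mi is the measured concentration and
    Si the permissible (standard) concentration of metal i.
    """
    m = np.asarray(conc, dtype=float)
    s = np.asarray(standard, dtype=float)
    w = 1.0 / s
    q = 100.0 * m / s
    return (w * q).sum() / w.sum()

# Hypothetical sample: Pb and Ni above assumed limits, Fe below (mg/L).
hpi = heavy_metal_pollution_index(conc=[0.05, 0.12, 0.10],
                                  standard=[0.01, 0.07, 0.30])
print(round(hpi, 1))
```

Values above a critical threshold (100 is widely used) are commonly read as indicating a high pollution level, consistent with the 90% figure reported above.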
[Sensory evaluation of grape wine quality using methods of mathematical statistics].
Yakuba, Yu F; Khalaphyan, A A; Temerdashev, Z A; Bessonov, V V; Malinkin, A D
2016-01-01
Questions concerning the formation of an integral assessment of wine flavour during tasting are discussed, and the advantages and disadvantages of the procedures are outlined. The materials investigated were natural white and red wines of Russian manufacture, made by traditional technologies from Vitis vinifera, direct-producer hybrids, blends, and experimental wines (more than 300 different samples). The aim of the research was to establish, by methods of mathematical statistics, the correlation between the content of a wine's nonvolatile components and its tasting quality rating. The contents of organic acids, amino acids, and cations in the wines were considered the main factors influencing flavour, as they largely define the beverage's quality. These components were determined in the wine samples by capillary electrophoresis (CAPEL system). Alongside the analytical characterization of the wine samples, a representative group of specialists simultaneously carried out a tasting evaluation of the wines using a 100-point system. The possibility of statistically modelling the correlation between the tasting evaluation and analytical data on amino acid and cation content, reasonably describing wine flavour, was examined. Statistical modelling of the correlation between the tasting evaluation and the content of the major cations (ammonium, potassium, sodium, magnesium, calcium) and free amino acids (proline, threonine, arginine), taking into account their level of influence on flavour and the analytical values within fixed limits of quality conformity, was carried out with Statistica. Adequate statistical models were constructed that can predict the tasting evaluation, that is, determine a wine's quality from the content of the components forming its flavour properties.
It is emphasized that, along with aromatic (volatile) substances, nonvolatile components (mineral substances and amino acids such as proline, threonine, and arginine) influence a wine's flavour properties. It has been shown that nonvolatile components contribute to the organoleptic and flavour quality of wines as much as aromatic volatile substances do, and that they take part in forming the experts' evaluation.
Duncan, Fiona; Haigh, Carol
2013-10-01
To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex, and intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate of one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. Using process control and quality improvement methods, routine prospective data collection was started in 2006. Patients were asked about their pain and the side effects of treatment, and Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean pain score (VNRS) was four. There was no special-cause variation when data were stratified by surgeon, clinical area, or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management, and its methods offer the potential to learn more about the process of change and outcomes in an Acute Pain Service, both locally and nationally.
We have been able to develop measures for improvement and benchmarking in routine care that have led to the establishment of a national pain registry. © 2013 Blackwell Publishing Ltd.
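A p-chart is a standard SPC tool for a measure like "percentage of patients with an epidural in severe pain": monthly proportions are plotted against 3-sigma control limits, and points outside the limits signal special-cause variation. The monthly counts below are hypothetical, not the study's data:

```python
import numpy as np

def p_chart_limits(counts, n):
    """3-sigma control limits for a p-chart: `counts` are monthly numbers
    of patients in severe pain out of n epidurals per month."""
    p = np.asarray(counts, dtype=float) / n
    p_bar = p.mean()                               # centre line
    sigma = np.sqrt(p_bar * (1 - p_bar) / n)       # binomial sigma
    ucl = min(1.0, p_bar + 3 * sigma)              # upper control limit
    lcl = max(0.0, p_bar - 3 * sigma)              # lower control limit
    return p_bar, lcl, ucl

# Hypothetical monthly counts out of 50 epidurals; the last two months
# follow a quality improvement intervention.
counts = [15, 16, 13, 17, 14, 16, 12, 15, 16, 13, 3, 3]
p_bar, lcl, ucl = p_chart_limits(counts, n=50)
special = [i for i, c in enumerate(counts) if not lcl <= c / 50 <= ucl]
print(p_bar, lcl, ucl, special)
```

Here the final two months fall below the lower control limit, the special-cause signal one would hope to see after an effective improvement intervention.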
Initial Steps toward Validating and Measuring the Quality of Computerized Provider Documentation
Hammond, Kenric W.; Efthimiadis, Efthimis N.; Weir, Charlene R.; Embi, Peter J.; Thielke, Stephen M.; Laundry, Ryan M.; Hedeen, Ashley
2010-01-01
Background: Concerns exist about the quality of electronic health care documentation. Prior studies have focused on physicians. This investigation studied document quality perceptions of practitioners (including physicians), nurses and administrative staff. Methods: An instrument developed from staff interviews and literature sources was administered to 110 practitioners, nurses and administrative staff. Short, long and original versions of records were rated. Results: Length transformation did not affect quality ratings. On several scales practitioners rated notes less favorably than administrators or nurses. The original source document was associated with the quality rating, as was tf·idf, a relevance statistic computed from document text. Tf·idf was strongly associated with practitioner quality ratings. Conclusion: Document quality estimates were not sensitive to modifying redundancy in documents. Some perceptions of quality differ by role. Intrinsic document properties are associated with staff judgments of document quality. For practitioners, the tf·idf statistic was strongly associated with the quality dimensions evaluated. PMID:21346983
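The tf·idf relevance statistic used above weights each term by its frequency in a document and the rarity of that term across the corpus. The sketch below uses one common formulation (raw term frequency times log inverse document frequency) summed over a note's terms as a crude informativeness score; the paper's exact formulation is not specified here, and the notes are invented:

```python
import math
from collections import Counter

def tfidf_score(doc_tokens, corpus):
    """Sum of tf-idf weights over a document's terms: tf(t) * log(N / df(t)),
    where df(t) is the number of corpus documents containing term t."""
    n = len(corpus)
    df = Counter()
    for d in corpus:
        df.update(set(d))                  # count each term once per document
    tf = Counter(doc_tokens)
    return sum(count * math.log(n / df[t]) for t, count in tf.items())

# Hypothetical tokenized clinical notes.
corpus = [["chest", "pain", "resolved"],
          ["chest", "pain", "pain", "radiating", "left", "arm"],
          ["routine", "follow", "up"]]
scores = [tfidf_score(d, corpus) for d in corpus]
print(scores)
```

Notes dominated by terms common to every document score low, while notes with distinctive content score higher, the intuition behind associating tf·idf with perceived document quality.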
Lee, L.; Helsel, D.
2005-01-01
Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
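The core of ROS can be sketched for the simplest case of a single detection limit: fit a line between normal quantiles and the log concentrations of the detected values, then impute the censored observations from the fitted line. This is a deliberately reduced illustration; the R tools described above (and the related NADA implementations) handle multiple detection limits and other refinements:

```python
import numpy as np
from scipy import stats

def ros_mean(values, censored):
    """Simplified regression-on-order-statistics (ROS) mean for data with
    one detection limit.  Censored observations are coded at the detection
    limit with censored=True and are imputed from a lognormal fit to the
    detected values on a normal probability plot.
    """
    values = np.asarray(values, float)
    censored = np.asarray(censored, bool)
    n = len(values)
    order = np.argsort(values, kind='stable')
    ranks = np.empty(n, int)
    ranks[order] = np.arange(n)
    pp = (np.arange(1, n + 1) - 0.375) / (n + 0.25)   # plotting positions
    z = stats.norm.ppf(pp)                            # normal quantiles
    detected = ~censored
    slope, intercept, *_ = stats.linregress(z[ranks[detected]],
                                            np.log(values[detected]))
    filled = values.copy()
    filled[censored] = np.exp(intercept + slope * z[ranks[censored]])
    return filled.mean()

# Three "<1" results coded at the detection limit, seven detected values.
vals = np.array([1, 1, 1, 2.1, 2.4, 3.0, 4.2, 5.1, 6.6, 9.8])
cens = np.array([True, True, True] + [False] * 7)
print(round(ros_mean(vals, cens), 2))
```

Unlike substituting zero or half the detection limit, the imputed values follow the distributional shape of the detected data, which is why ROS performs well for summary statistics of censored datasets.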
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.
1996-01-01
As part of a continuing effort to re-engineer the wind tunnel testing process, a comprehensive data quality assurance program is being established at NASA Langley Research Center (LaRC). The ultimate goal of the program is routine provision of tunnel-to-tunnel reproducibility with total uncertainty levels acceptable for test and evaluation of civilian transports. The operational elements for reaching such levels of reproducibility are: (1) statistical control, which provides long-term measurement uncertainty predictability and a base for continuous improvement; (2) measurement uncertainty prediction, which provides test designs that can meet data quality expectations within the system's predictable variation; and (3) national standards, which provide a means for resolving tunnel-to-tunnel differences. The paper presents the LaRC design for the program and discusses the process of implementation.
Soltani, Shahla; Asghari Moghaddam, Asghar; Barzegar, Rahim; Kazemian, Naeimeh; Tziritis, Evangelos
2017-08-18
Kordkandi-Duzduzan plain is one of the fertile plains of East Azarbaijan Province, northwest Iran. Groundwater is an important resource for drinking and agricultural purposes due to the lack of surface-water resources in the region. The main objectives of the present study are to identify the hydrogeochemical processes and the potential sources of major, minor, and trace metals and metalloids such as Cr, Mn, Cd, Fe, Al, and As using joint hydrogeochemical techniques and multivariate statistical analysis, and to evaluate groundwater quality deterioration with the PoS environmental index. To achieve these objectives, 23 groundwater samples were collected in September 2015. A Piper diagram shows that mixed Ca-Mg-Cl is the dominant groundwater type, and some of the samples have Ca-HCO3, Ca-Cl, and Na-Cl types. Multivariate statistical analyses indicate that weathering and dissolution of different rocks and minerals (e.g., silicates, gypsum, and halite), ion exchange, and agricultural activities influence the hydrogeochemistry of the study area. Cluster analysis divides the samples into two distinct clusters that differ markedly in EC (and its dependent variables such as Na+, K+, Ca2+, Mg2+, SO42-, and Cl-), Cd, and Cr according to an ANOVA test. Based on median values, pH and the concentrations of NO3-, SiO2, and As in cluster 1 are elevated compared with those of cluster 2, while their maximum values occur in cluster 2. According to the PoS index, the dominant parameter controlling quality deterioration is As, with a 60% contribution. Samples with the lowest PoS values are located in the southern and northern parts (recharge area), while samples with the highest values are located in the discharge area and the eastern part.
Data on customer perceptions on the role of celebrity endorsement on brand preference.
Ibidunni, Ayodotun Stephen; Olokundun, Maxwell Ayodele; Ibidunni, Oyebisi Mary; Borishade, Taiye Tairat; Falola, Hezekiah Olubusayo; Salau, Odunayo Paul; Amaihian, Augusta Bosede; Fred, Peter
2018-06-01
This article presents data on the effect of celebrity endorsement on consumers' brand preference. Copies of a structured questionnaire were administered to 384 customers in the telecommunication industry. Using descriptive, correlation, and regression statistical analyses, the data revealed that celebrity image has an effect on consumer brand loyalty and that celebrity trustworthiness influences consumer brand association. Furthermore, a relationship between celebrity expertise and the perceived quality of the product was established.
ERIC Educational Resources Information Center
Wendling, Wayne
This report is divided into four sections. Section 1 is a short discussion of the economic theory underlying the construction of the cost of education index and an example of how the index is calculated. Also presented are descriptions of the factors included in the statistical analysis to control for quality, quantity, and cost differences and…
Timmermans, Catherine; Doffagne, Erik; Venet, David; Desmet, Lieven; Legrand, Catherine; Burzykowski, Tomasz; Buyse, Marc
2016-01-01
Data quality may impact the outcome of clinical trials; hence, there is a need to implement quality control strategies for the data collected. Traditional approaches to quality control have primarily used source data verification during on-site monitoring visits, but these approaches are hugely expensive as well as ineffective. There is growing interest in central statistical monitoring (CSM) as an effective way to ensure data quality and consistency in multicenter clinical trials. CSM with SMART™ uses advanced statistical tools that help identify centers with atypical data patterns which might be the sign of an underlying quality issue. This approach was used to assess the quality and consistency of the data collected in the Stomach Cancer Adjuvant Multi-institutional Trial Group Trial, involving 1495 patients across 232 centers in Japan. In the Stomach Cancer Adjuvant Multi-institutional Trial Group Trial, very few atypical data patterns were found among the participating centers, and none of these patterns were deemed to be related to a quality issue that could significantly affect the outcome of the trial. CSM can be used to provide a check of the quality of the data from completed multicenter clinical trials before analysis, publication, and submission of the results to regulatory agencies. It can also form the basis of a risk-based monitoring strategy in ongoing multicenter trials. CSM aims at improving data quality in clinical trials while also reducing monitoring costs.
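A very reduced form of the central statistical monitoring idea is to compare each center's summary of a reported variable against the all-center distribution and flag robust outliers; SMART applies many such tests across all variables and aggregates them per center. The variable, threshold, and data below are illustrative assumptions:

```python
import numpy as np

def atypical_centers(center_means, threshold=3.0):
    """Flag centers whose mean of a reported variable deviates atypically
    from the all-center distribution, using a robust (median/MAD) z-score."""
    x = np.asarray(center_means, float)
    med = np.median(x)
    mad = np.median(np.abs(x - med)) * 1.4826    # robust sigma estimate
    z = (x - med) / mad
    return np.where(np.abs(z) > threshold)[0]

rng = np.random.default_rng(3)
means = rng.normal(120, 5, size=50)   # e.g. mean systolic BP per center
means[17] = 152.0                     # one center reporting atypical data
print(atypical_centers(means))
```

Flagged centers are then investigated for possible quality issues (miscalibrated equipment, transcription problems, or fabrication) rather than being assumed faulty.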
42 CFR 480.101 - Scope and definitions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... ORGANIZATION REVIEW INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs... Quality Control Quality Improvement Organization (QIO) (or the review component of a QIO subcontractor) in... sought under Title XVIII of the Act. Aggregate statistical data means any utilization, admission...
Quality control and quality assurance of hot mix asphalt construction in Delaware.
DOT National Transportation Integrated Search
2006-07-01
Since the mid-1960s, the Federal Highway Administration has encouraged departments of transportation and contractors to use quality control and quality assurance (QA/QC) specifications, which are statistically based. For example,...
MR spectroscopy of the fetal brain: is it possible without sedation?
Berger-Kulemann, V; Brugger, P C; Pugash, D; Krssak, M; Weber, M; Wielandner, A; Prayer, D
2013-02-01
The quality of spectroscopic studies may be limited by unrestricted fetal movement. Sedation is recommended to avoid motion artefacts; however, sedation involves side effects. The aim of this study was to assess the feasibility and quality of brain (1)H-MR spectroscopy in unsedated fetuses and to evaluate whether quality depends on the type of spectra, fetal presentation, GA, and/or fetal pathology. Seventy-five single-voxel spectroscopic studies of the fetal brain, performed at gestational weeks 19-38 at 1.5T, were evaluated retrospectively. A PRESS sequence (TE = 144 or 35 ms) was used. Fetal presentation, GA, and type of pathology were recorded. The quality of the spectra was assessed by reviewing the spectral appearance (line width, signal-to-noise ratio) of the creatine resonance relative to concentrations (ratios to creatine) of choline, myo-inositol, and NAA. Of 75 studies, 50 (66.6%) were rated as readable: short TE = 17/50 (34%), long TE = 33/50 (66%); cephalic presentation in 36/50 (72%) studies, breech in 10/50 (20%), and "other" presentation in 4/50 (8%) (mean GA, 31.0 weeks). Twenty-eight of 50 fetuses (56%) showed normal development (short TE = 12/28, long TE = 16/28), and 22/50 (44%) showed pathology. Of the 75 studies, 25 (33.3%) were not readable: short TE = 14/25 (56%), long TE = 11/25 (44%); cephalic presentation in 20/25 (80%) studies, breech in 4/25 (16%), and other presentation in 1 study (4%) (mean GA, 30.1 weeks). Thirteen of 25 fetuses (52%) showed normal development; 12/25 (48%) showed pathology. Statistical analysis revealed no impact of these parameters on the quality of the spectra. Single-voxel spectroscopy can be performed in approximately two-thirds of unsedated fetuses, regardless of the type of spectra, fetal presentation, GA, and pathology.
Hydrological influences on the water quality trends in Tamiraparani Basin, South India.
Ravichandran, S
2003-09-01
Water quality variables--turbidity, pH, electrical conductivity (EC), chlorides, and total hardness (TH)--were monitored at a downstream location in the Tamiraparani River during 1978-1992. The observations were made at weekly intervals in a water treatment and supply plant using standard methods. Graphical and statistical analyses were used for data exploration, trend detection, and assessment. Box-whisker plots of annual and seasonal changes in the variables indicated apparent trends in the data and their response to the seasonal influence of the monsoon rainfall. Further, examination of the median values of the variables indicated that changes in the direction of trend occurred during 1985-1986, especially in pH, EC, and TH. The statistical analyses were done using non-parametric methods: ANCOVA on rank-transformed data and the seasonal Mann-Kendall test. A monotonic trend was confirmed in all the water quality variables, although the direction of change differed among them. The trend line was fitted by the method of least squares. The estimated values indicated a significant increase in EC (28 microS cm(-1)), while significant decreases were observed in turbidity (90 NTU), pH (0.78), and total hardness (23 ppm) over a span of 15 years. Changes induced in river flow by the addition of a stabilizing reservoir, the influence of the seasonal and spatial pattern of monsoon rainfall across the river basin, and increased agriculture appear to be the causative factors for the water quality trends seen in the Tamiraparani River system.
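As background to the trend tests named above, the Mann-Kendall S statistic counts upward minus downward pairs over time; a minimal sketch of the non-seasonal test (without the seasonal blocking or tie corrections used in the study) might look like:

```python
import math

def mann_kendall_s(series):
    """Mann-Kendall S statistic: count of upward minus downward pairs."""
    s = 0
    n = len(series)
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)
    return s

def mann_kendall_z(series):
    """Normal approximation of the Mann-Kendall test (no tie correction)."""
    n = len(series)
    s = mann_kendall_s(series)
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        return (s - 1) / math.sqrt(var_s)
    if s < 0:
        return (s + 1) / math.sqrt(var_s)
    return 0.0
```

A |z| above 1.96 corresponds to a monotonic trend at the 5% significance level; the seasonal variant applied in the study sums S and its variance over the seasons before standardizing.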
NASA Technical Reports Server (NTRS)
Neustadter, H. E.; Sidik, S. M.; Burr, J. C., Jr.
1972-01-01
Air quality data for Cleveland, Ohio, for the period 1967 to 1971 were collated and subjected to statistical analysis. The total suspended particulate component is lognormally distributed, while sulfur dioxide and nitrogen dioxide are reasonably approximated by lognormal distributions. Only sulfur dioxide, in some residential neighborhoods, meets Ohio air quality standards. Air quality has definitely improved in the industrial valley, while in the rest of the city only sulfur dioxide has shown consistent improvement. A pollution index is introduced that directly displays the degree to which the ambient air conforms to mandated standards.
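A two-parameter lognormal fit of the kind described can be obtained from the mean and standard deviation of log concentrations; the following sketch (illustrative only, not the authors' code) also computes the probability that a lognormally distributed pollutant exceeds a given air quality standard:

```python
import math

def lognormal_fit(samples):
    """Fit a two-parameter lognormal by the mean and sd of log concentrations."""
    logs = [math.log(x) for x in samples]
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / (n - 1))
    return mu, sigma

def exceedance_probability(mu, sigma, standard):
    """P(X > standard) for lognormal X, via the normal tail of log(standard)."""
    z = (math.log(standard) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))
```

An exceedance probability computed this way is one natural ingredient for a pollution index of the type the paper introduces, though the paper's actual index definition is not reproduced here.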
Baqué, Michèle; Amendt, Jens
2013-01-01
Developmental data for juvenile blow flies (Diptera: Calliphoridae) are typically used to calculate the age of immature stages found on or around a corpse and thus to estimate a minimum post-mortem interval (PMI(min)). However, many of those data sets do not take into account that immature blow flies grow in a non-linear fashion. Linear models do not provide sufficiently reliable age estimates and may even lead to an erroneous determination of the PMI(min). In line with the Daubert standard and the need for improvements in forensic science, new statistical tools such as smoothing methods and mixed models allow the modelling of non-linear relationships and expand the field of statistical analyses. The present study introduces the background and application of these statistical techniques by analysing a model that describes the development of the forensically important blow fly Calliphora vicina at different temperatures. The comparison of three statistical methods (linear regression, generalised additive modelling, and generalised additive mixed modelling) clearly demonstrates that only the latter provided regression parameters that reflect the data adequately. We focus explicitly on both the exploration of the data--to assure their quality and to show the importance of checking them carefully prior to conducting statistical tests--and the validation of the resulting models. Hence, we present a common method for evaluating and testing forensic entomological data sets, using generalised additive mixed models for the first time.
Water Quality Sensing and Spatio-Temporal Monitoring Structure with Autocorrelation Kernel Methods.
Vizcaíno, Iván P; Carrera, Enrique V; Muñoz-Romero, Sergio; Cumbal, Luis H; Rojo-Álvarez, José Luis
2017-10-16
Pollution of water resources is usually analyzed with monitoring campaigns, which consist of programmed sampling, measurement, and recording of the most representative water quality parameters. These campaign measurements yield a non-uniform spatio-temporal sampled data structure that characterizes complex dynamic phenomena. In this work, we propose an enhanced statistical interpolation method to provide water quality managers with statistically interpolated representations of spatio-temporal dynamics. Specifically, our proposal makes efficient use of the a priori available information on the quality parameter measurements through Support Vector Regression (SVR) based on Mercer's kernels. The methods are benchmarked against previously proposed methods in three segments of the Machángara River and one segment of the San Pedro River in Ecuador, and their different dynamics are shown by statistically interpolated spatio-temporal maps. The best interpolation performance in terms of mean absolute error was obtained by the SVR with Mercer's kernel given by either the Mahalanobis spatio-temporal covariance matrix or the bivariate estimated autocorrelation function. In particular, the autocorrelation kernel provides a significant improvement in estimation quality, consistently for all six water quality variables, which points out the relevance of including a priori knowledge of the problem.
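The paper's SVR machinery and custom kernels are not reproduced here; as a simplified stand-in for kernel-based spatio-temporal interpolation, a Gaussian-kernel ridge regression over (space, time) coordinates captures the same idea of interpolating non-uniformly sampled measurements (the length scale and ridge values are placeholders):

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    """Gaussian (RBF) kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

def fit_kernel_interpolator(X, y, length_scale=1.0, ridge=1e-6):
    """Kernel ridge interpolator over (space, time) sample coordinates.

    X: (n, d) array of coordinates, y: (n,) measured values.
    Returns a predict(Xnew) function for new coordinate arrays.
    """
    K = rbf_kernel(X, X, length_scale)
    # Small ridge term regularizes the solve, as in kernel ridge regression
    alpha = np.linalg.solve(K + ridge * np.eye(len(X)), y)

    def predict(Xnew):
        return rbf_kernel(np.asarray(Xnew, float), X, length_scale) @ alpha

    return predict
```

The paper's contribution is precisely the choice of kernel (Mahalanobis spatio-temporal covariance, or an estimated autocorrelation function) in place of the generic RBF used in this sketch.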
Influence of pilates training on the quality of life of chronic stroke patients.
Yun, Seok-Min; Park, Sang-Kyoon; Lim, Hee Sung
2017-10-01
[Purpose] The purpose of this study was to observe the influence of Pilates training on the quality of life of chronic stroke patients. [Subjects and Methods] Forty chronic stroke patients participated in this study. They were divided equally into an experimental group (EG) and a control group (CG). The EG participated in a 60-min Pilates training program twice a week for 12 weeks, while the CG participated in general occupational therapy without any exercise-related activities for the same duration. The MMSE-K was administered before and after Pilates training to observe its influence on the quality of life of chronic stroke patients. [Results] Statistically significant improvement in the physical, social, and psychological domains was found in the EG after the training. No statistically significant difference was found in any of the three quality of life domains for the CG. The EG experienced a statistically significant improvement in all quality of life domains compared with the CG. [Conclusion] Participation in Pilates training was found to effectively improve the quality of life of stroke patients. Pilates training involves low- and intermediate-intensity resistance and repetition matched to the patient's physical ability, and can serve as a remedial exercise program that improves physical ability and quality of life.
NASA Astrophysics Data System (ADS)
Lin, Yuan; Choudhury, Kingshuk R.; McAdams, H. Page; Foos, David H.; Samei, Ehsan
2014-03-01
We previously proposed a novel image-based quality assessment technique [1] to assess the perceptual quality of clinical chest radiographs. In this paper, an observer study was designed and conducted to systematically validate this technique. Ten metrics were involved in the observer study: lung grey level, lung detail, lung noise, rib-lung contrast, rib sharpness, mediastinum detail, mediastinum noise, mediastinum alignment, subdiaphragm-lung contrast, and subdiaphragm area. For each metric, three tasks were successively presented to the observers. In each task, six ROI images were randomly presented in a row, and observers were asked to rank the images based only on a designated quality, disregarding the other qualities. A range slider above the images allowed observers to indicate the acceptable range for the corresponding perceptual attribute. Five board-certified radiologists from Duke participated in this observer study on a DICOM-calibrated diagnostic display workstation under low ambient lighting conditions. The observer data were analyzed in terms of the correlations between the observer ranking orders and the algorithmic ranking orders. Based on the collected acceptable ranges, quality consistency ranges were statistically derived. The observer study showed that, for each metric, the averaged ranking orders of the participating observers were strongly correlated with the algorithmic orders. For the lung grey level, the observer ranking orders accorded completely with the algorithmic ranking orders. The quality consistency ranges derived from this observer study were close to those derived from our previous study. The observer study indicates that the proposed image-based quality assessment technique provides a robust reflection of the perceptual image quality of clinical chest radiographs. The derived quality consistency ranges can be used to automatically predict the acceptability of a clinical chest radiograph.
Littin, Gregory R.; Schnoebelen, Douglas J.
2010-01-01
The Cedar River alluvial aquifer is the primary source of municipal water in the Cedar Rapids, Iowa area. Municipal wells are completed in the alluvial aquifer at approximately 40 to 80 feet deep. The City of Cedar Rapids and the U.S. Geological Survey have been conducting a cooperative study of the groundwater-flow system and water quality near the well fields since 1992. Previous cooperative studies between the City of Cedar Rapids and the U.S. Geological Survey have documented hydrologic and water-quality data, geochemistry, and groundwater models. Water-quality samples were collected for studies involving well field monitoring, trends, source-water protection, groundwater geochemistry, evaluation of surface and ground-water interaction, assessment of pesticides in groundwater and surface water, and to evaluate water quality near a wetland area in the Seminole well field. Typical water-quality analyses included major ions (boron, bromide, calcium, chloride, fluoride, iron, magnesium, manganese, potassium, silica, sodium, and sulfate), nutrients (ammonia as nitrogen, nitrite as nitrogen, nitrite plus nitrate as nitrogen, and orthophosphate as phosphorus), dissolved organic carbon, and selected pesticides including two degradates of the herbicide atrazine. In addition, two synoptic samplings included analyses of additional pesticide degradates in water samples. Physical field parameters (alkalinity, dissolved oxygen, pH, specific conductance and water temperature) were recorded with each water sample collected. This report presents the results of water quality data-collection activities from January 1999 through December 2005. Methods of data collection, quality-assurance samples, water-quality analyses, and statistical summaries are presented. Data include the results of water-quality analyses from quarterly and synoptic sampling from monitoring wells, municipal wells, and the Cedar River.
The interactive effects of housing and neighbourhood quality on psychological well-being.
Jones-Rounds, McKenzie L; Evans, Gary W; Braubach, Matthias
2014-02-01
Many individuals are subject to the physically and mentally detrimental effects of living in substandard housing and inadequate neighbourhoods. We propose that better physical neighbourhood quality can partially offset some of the negative effects of poor housing quality on psychological well-being. Interviews and questionnaires were used to collect data in a cross-sectional study of housing quality, the state of the surrounding environment, and individual health and well-being for 5605 European adults from the Large Analysis and Review of European housing and health Status conducted by WHO in eight European cities. Multilevel random coefficient modelling was used to statistically analyse the main and interactive effects of housing quality and neighbourhood quality on psychological well-being. Socioeconomic status, employment status, gender and marital status were included as statistical controls. Substandard housing quality and poor neighbourhood quality each contribute to lower psychological well-being. Furthermore better neighbourhood quality buffers against the negative effects of poor housing quality on psychological well-being. These results fill a gap in research concerning the ability of neighbourhood quality to amplify or attenuate housing quality impacts on well-being.
A Methodology for Anatomic Ultrasound Image Diagnostic Quality Assessment.
Hemmsen, Martin Christian; Lange, Theis; Brandt, Andreas Hjelm; Nielsen, Michael Bachmann; Jensen, Jorgen Arendt
2017-01-01
This paper discusses methods for the assessment of ultrasound image quality based on our experiences in evaluating new methods for anatomic imaging. It presents a methodology to ensure a fair assessment between competing imaging methods using clinically relevant evaluations. The methodology is valuable in the continuing process of method optimization and guided development of new imaging methods. It includes a three-phase study plan covering initial prototype development through clinical assessment. Recommendations for the clinical assessment protocol, software, and statistical analysis are presented. Earlier uses of the methodology have shown that it ensures the validity of the assessment, as it separates the influences of developer, investigator, and assessor once a research protocol has been established. This separation reduces confounding influences of the developer on the result, properly revealing the clinical value. This paper exemplifies the methodology using recent studies of synthetic aperture sequential beamforming tissue harmonic imaging.
Stawowczyk, Ewa
2018-01-01
Introduction Ulcerative colitis (UC) is an idiopathic inflammatory bowel disorder which requires lifelong treatment. It generates substantial direct and indirect costs and significantly affects the quality of life, especially in the active state of the disease. Aim To evaluate the direct and indirect costs of UC and to assess disease activity and quality of life reported by patients with UC in the Polish setting. Material and methods A cross-sectional questionnaire study was conducted among UC patients and the physicians involved in their therapy. The Clinical Activity Index (CAI) was used to assess disease activity, and the WPAI questionnaire to assess productivity loss. Quality of life was presented as utility calculated using the EQ-5D-3L questionnaire. Indirect costs, which included absenteeism, presenteeism, and informal care, were assessed with the Human Capital Approach and expressed in euros (€). Productivity loss among informal caregivers was valued at the average wage in Poland. Correlations were presented using Spearman's coefficient, and between-group differences were assessed with the Mann-Whitney U-test. Results One hundred and forty-seven patients participated in the study, including 95 working persons. The mean costs of absenteeism and presenteeism were €1615.2 (95% CI: 669.5–2561.0) and €3684.4 (95% CI: 2367.8–5001.1), respectively, per year per patient with disease in remission. The mean yearly cost of productivity loss due to informal care was estimated at €256.6 (range: 0.0–532.6). The corresponding values for patients with active disease were €8,913.3 (95% CI: 6223.3–11,603.3), €4325.1 (95% CI: 2282.4–6367.8), and €2396.1 (95% CI: 402.0–4390.3). The between-group differences in total indirect costs, cost of absenteeism, and cost of informal care were statistically significant (p < 0.05). The average weighted monthly costs of therapy with particular drug categories (e.g. mesalazine or biologic drugs) differed significantly between patients with active disease and those in remission. The difference in utility values between patients with disease in remission (0.898 ±0.126) and patients with active disease (0.646 ±0.302) was statistically significant. Conclusions Our study revealed the social burden of UC and the strong dependency of direct and indirect costs, as well as quality of life, on the severity of UC in Poland. Statistically significant differences were identified in total direct and indirect costs, cost of absenteeism, cost of informal care, and health-related quality of life between patients with active disease and patients with disease in remission. PMID:29657613
Set-free Markov state model building
NASA Astrophysics Data System (ADS)
Weber, Marcus; Fackeldey, Konstantin; Schütte, Christof
2017-03-01
Molecular dynamics (MD) simulations face challenging problems, since the time scales of interest are often much longer than what is possible to simulate; and even when sufficiently long simulations are possible, the complex nature of the resulting simulation data makes interpretation difficult. Markov State Models (MSMs) help to overcome these problems by making experimentally relevant time scales accessible via coarse-grained representations that also allow for convenient interpretation. However, standard set-based MSMs exhibit some caveats limiting their approximation quality and statistical significance. One of the main caveats results from the fact that typical MD trajectories repeatedly re-cross the boundary between the sets used to build the MSM, which causes statistical bias in estimating the transition probabilities between these sets. In this article, we present a set-free approach to MSM building that utilizes smooth overlapping ansatz functions instead of sets, together with an adaptive refinement approach. This kind of meshless discretization helps to overcome the recrossing problem and yields an adaptive refinement procedure that allows us to improve the quality of the model while exploring state space and inserting new ansatz functions into the MSM.
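For contrast with the set-free approach, the standard set-based MSM estimate referred to above simply counts transitions between discrete states at a chosen lag time and row-normalizes; a minimal sketch:

```python
import numpy as np

def transition_matrix(dtraj, n_states, lag=1):
    """Row-stochastic transition matrix estimated from a discrete trajectory.

    dtraj: sequence of state indices; lag: lag time in trajectory steps.
    """
    counts = np.zeros((n_states, n_states))
    for a, b in zip(dtraj[:-lag], dtraj[lag:]):
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    rows[rows == 0] = 1.0  # avoid dividing by zero for unvisited states
    return counts / rows
```

The recrossing bias discussed in the abstract arises because fast boundary recrossings inflate the off-diagonal counts at short lag times; the set-free construction replaces the hard state assignment with smooth overlapping membership functions.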
Gunnarsson, Bjarni Kristinn; Hansdottir, Ingun; Bjornsdottir, Erla; Birgisdottir, Erl Bjorg; Arnadottir, Anna Thora; Magnusson, Bjorn
2016-02-01
The aim of this treatment study was to evaluate both the short- and long-term effects of a multidisciplinary obesity treatment. Long-term outcomes of patients receiving gastric bypass surgery in addition to behavioral obesity treatment were compared with those of patients who did not undergo surgery. The participants were 100 patients undergoing a four-week inpatient obesity treatment at the Hospital in Neskaupstaður (Fjórðungsjúkrahúsið í Neskaupstað, FSN). After this treatment was completed, 28 of the patients underwent further treatment, receiving gastric bypass surgery. All patients were followed for two years after completing the four-week treatment. Body mass index (BMI), quality of life, and symptoms of depression and anxiety were measured for all participants before and after treatment, and again using mailed questionnaires in a cross-sectional data collection in the summer of 2012. Participants achieved statistically significant weight loss (median 1.85 BMI points) and improved their quality of life and mental health after the four-week obesity treatment, and the long-term results remained significant. Three years after the conclusion of treatment, statistically significant weight loss was still present among patients who had not undergone gastric bypass surgery (median 2.13 BMI points), but improvements in mental health and quality of life were no longer present among these subjects. Patients who underwent gastric bypass surgery achieved greater weight loss (median 13.12 BMI points) and longer-lasting improvements in mental health and quality of life. The results show that the multidisciplinary obesity treatment is effective in reducing obesity and improving mental health and quality of life in the short term. With follow-up treatment, the weight loss is maintained for up to three years after treatment for all participants. The bypass surgery group lost more weight and showed more lasting improvements in mental health and quality of life.
These results underline the necessity of providing long-term treatment to maintain improvements when treating obesity.
Robust GPS autonomous signal quality monitoring
NASA Astrophysics Data System (ADS)
Ndili, Awele Nnaemeka
The Global Positioning System (GPS), introduced by the U.S. Department of Defense in 1973, provides unprecedented world-wide navigation capabilities through a constellation of 24 satellites in global orbit, each emitting a low-power radio-frequency signal for ranging. GPS receivers track these transmitted signals, computing position to within 30 meters from range measurements made to four satellites. GPS has a wide range of applications, including aircraft, marine, and land vehicle navigation. Each application places demands on GPS for various levels of accuracy, integrity, system availability, and continuity of service. Radio frequency interference (RFI), which results from sources such as TV/FM harmonics, radar, or Mobile Satellite Systems (MSS), presents a challenge in the use of GPS by posing a threat to the accuracy, integrity, and availability of the GPS navigation solution. In order to use GPS for integrity-sensitive applications, it is therefore necessary to monitor the quality of the received signal, with the objective of promptly detecting the presence of RFI and thus providing a timely warning of degradation of system accuracy. This presents a challenge, since the myriad kinds of RFI affect the GPS receiver in different ways. What is required, then, is a robust method of detecting GPS accuracy degradation that is effective regardless of the origin of the threat. This dissertation presents a new method of robust signal quality monitoring for GPS. Algorithms for receiver-autonomous interference detection and integrity monitoring are demonstrated. Candidate test statistics are derived from fundamental receiver measurements of in-phase and quadrature correlation outputs, and the gain of the automatic gain control (AGC).
Performance of selected test statistics is evaluated in the presence of RFI (broadband interference, pulsed and non-pulsed interference, and coherent CW at different frequencies) and of non-RFI events (GPS signal fading due to physical blockage and multipath). Results are presented that verify the effectiveness of the proposed methods. The benefits of pseudolites in reducing service outages due to interference are demonstrated. Pseudolites also enhance the geometry of the GPS constellation, improving overall system accuracy. Designs for pseudolite signals that reduce the near-far problem associated with pseudolite use are also presented.
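The dissertation's actual test statistics are not reproduced here; as a toy illustration of monitoring signal quality from correlator outputs, one could flag possible interference whenever the window-averaged I/Q correlator power drops by more than a set margin relative to its nominal value (the function name, drop threshold, and single-statistic design are hypothetical simplifications):

```python
import math
import statistics

def rfi_flag(i_samples, q_samples, nominal_power, drop_db=3.0):
    """Flag possible interference from a drop in averaged correlator power.

    i_samples, q_samples: in-phase/quadrature correlator outputs over a window.
    Power is I^2 + Q^2 averaged over the window; a loss of more than drop_db
    decibels relative to nominal_power raises a flag.
    Returns (flagged, loss_db).
    """
    power = statistics.fmean(i * i + q * q for i, q in zip(i_samples, q_samples))
    loss_db = 10.0 * math.log10(nominal_power / power)
    return loss_db > drop_db, loss_db
```

A real monitor combines several such statistics (including the AGC gain) so that it responds to interference types that do not all reduce correlator power in the same way.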
Uribe-Patarroyo, Néstor; Bouma, Brett E.
2015-01-01
We present a new technique for the correction of nonuniform rotation distortion in catheter-based optical coherence tomography (OCT), based on the statistics of speckle between A-lines using intensity-based dynamic light scattering. This technique does not rely on tissue features and can be performed on single frames of data, thereby enabling real-time image correction. We demonstrate its suitability in a gastrointestinal balloon-catheter OCT system, determining the actual rotational speed with high temporal resolution, and present corrected cross-sectional and en face views showing significant enhancement of image quality. PMID:26625040
Taking the initiative: A leadership conference for women in science and engineering
NASA Technical Reports Server (NTRS)
1994-01-01
The conference sprang from discussions on the current climate that women face in science, mathematics, engineering, and technology. The conference (and this document) is a beginning, not a culmination, of women's learning of leadership skills. Conferees were active, articulate, energetic, and ready to learn leadership qualities, some of which seem universal, while others appear specific to particular fields. After the introduction, the workshops and presentations are arranged under vision and direction, barriers, alignment and communication, and motivation and inspiration. Some statistics are presented on women's degrees and employment in various fields.
2012-01-01
Background Health status is one of the basic factors of a high quality of life, and the acceptance of illness is important for adaptation to the limitations imposed by it. The purpose of the study was to evaluate the quality of life, satisfaction with life, and acceptance of illness by malaria patients, and to discover relationships between the studied parameters. Methods The study was undertaken in August 2010 on 120 Nigerian patients with confirmed malaria. A diagnostic survey method was used, based on standardized scales (the Acceptance of Illness Scale and the Satisfaction With Life Scale) and a standardized questionnaire (the World Health Organization Quality of Life/BREF). Descriptive statistics, variability range, 95% confidence intervals, correlation analysis, Spearman's non-parametric correlation coefficient, the Mann-Whitney test, and the Kruskal-Wallis test were applied; the test statistic was calculated, followed by the calculation of the test probability p. Results of the analyses were presented in box plots and a scatter plot. Results The dominant category in the adjective scale of the AIS was "no acceptance", given by 71.7% of respondents. The average level of the "somatic domain" was 41.7, and of the "social domain" 62.8. The mean satisfaction with life evaluation on the SWLS scale was 18 points. The correlation between acceptance of the disease and quality of life in the psychological domain was 0.39***, and between acceptance of the disease and satisfaction with life 0.40***. The correlation between satisfaction with life and quality of life in the psychological domain was 0.65***, and between satisfaction with life and quality of life in the environment domain 0.60***. The mean level of AIS for the studied population of men was 16.5 (test probability p = 0.0014**), and for the environment domain the level was 50 (test probability p = 0.0073**).
For quality of life in the social domain, the test probability was p = 0.0013** in relatively older individuals. Conclusion The majority of patients do not accept their condition. Evaluation of the quality of life was highest in the social domain and lowest in the somatic domain. There is a statistically significant correlation between the level of acceptance of illness and both quality of life and satisfaction with life. The strongest correlations are between satisfaction with life and the evaluation of quality of life in the psychological and environmental domains. Men evaluate their quality of life in the environmental domain higher and demonstrate a higher acceptance of their disease. Relatively older people report a significantly higher quality of life in the social domain. PMID:22616635
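The rank-based statistics reported in this abstract (Spearman's non-parametric coefficient) can be sketched as follows. This is an illustrative example on hypothetical AIS and QoL scores, not the study's data:

```python
# Spearman's rank correlation: Pearson correlation of rank-transformed data.
# Hypothetical AIS and QoL values are used purely for illustration.

def average_ranks(values):
    """Assign 1-based ranks, sharing the average rank among ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied positions, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Pearson correlation computed on the ranks of x and y."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical AIS scores and psychological-domain QoL scores
ais = [12, 25, 18, 30, 8, 22, 15, 27]
qol = [40, 70, 55, 80, 35, 60, 50, 65]
print(round(spearman_rho(ais, qol), 3))  # → 0.976
```

A monotone but non-linear relationship would still score near 1 here, which is why rank-based measures suit ordinal questionnaire data.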
Oelsner, Gretchen P.; Sprague, Lori A.; Murphy, Jennifer C.; Zuellig, Robert E.; Johnson, Henry M.; Ryberg, Karen R.; Falcone, James A.; Stets, Edward G.; Vecchia, Aldo V.; Riskin, Melissa L.; De Cicco, Laura A.; Mills, Taylor J.; Farmer, William H.
2017-04-04
Since passage of the Clean Water Act in 1972, Federal, State, and local governments have invested billions of dollars to reduce pollution entering rivers and streams. To understand the return on these investments and to effectively manage and protect the Nation’s water resources in the future, we need to know how and why water quality has been changing over time. As part of the National Water-Quality Assessment Project of the U.S. Geological Survey’s National Water-Quality Program, data from the U.S. Geological Survey and multiple other Federal, State, Tribal, regional, and local agencies have been used to support the most comprehensive assessment conducted to date of surface-water-quality trends in the United States. This report documents the methods used to determine trends in water quality and ecology because these methods are vital to ensuring the quality of the results. Specific objectives are to document (1) the data compilation and processing steps used to identify river and stream sites throughout the Nation suitable for water-quality, pesticide, and ecology trend analysis, (2) the statistical methods used to determine trends in target parameters, (3) considerations for water-quality, pesticide, and ecology data and streamflow data when modeling trends, (4) sensitivity analyses for selecting data and interpreting trend results with the Weighted Regressions on Time, Discharge, and Season method, and (5) the final trend results at each site. The scope of this study includes trends in water-quality concentrations and loads (nutrient, sediment, major ion, salinity, and carbon), pesticide concentrations and loads, and metrics for aquatic ecology (fish, invertebrates, and algae) for four time periods: (1) 1972–2012, (2) 1982–2012, (3) 1992–2012, and (4) 2002–12. In total, nearly 12,000 trends in concentration, load, and ecology metrics were evaluated in this study; there were 11,893 combinations of sites, parameters, and trend periods.
The final trend results are presented with examples of how to interpret the results from each trend model. Interpretation of the trend results, such as causal analysis, is not included.
Hold My Calls: An Activity for Introducing the Statistical Process
ERIC Educational Resources Information Center
Abel, Todd; Poling, Lisa
2015-01-01
Working with practicing teachers, this article demonstrates, through the facilitation of a statistical activity, how to introduce and investigate the unique qualities of the statistical process including: formulate a question, collect data, analyze data, and interpret data.
Petroleum marketing monthly with data for May 1997
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-08-01
The Petroleum Marketing Monthly (PMM) provides information and statistical data on a variety of crude oils and refined petroleum products. The publication presents statistics on crude oil costs and refined petroleum products sales for use by industry, government, private sector analysts, educational institutions, and consumers. Data on crude oil include the domestic first purchase price, the f.o.b. and landed cost of crude oil, and the refiners' acquisition cost of crude oil. Refined petroleum product sales data include motor gasoline, distillates, residuals, aviation fuels, kerosene, and propane. The Petroleum Marketing Division, Office of Oil and Gas, Energy Information Administration ensures the accuracy, quality, and confidentiality of the published data in the Petroleum Marketing Monthly.
Bayesian methods in reliability
NASA Astrophysics Data System (ADS)
Sander, P.; Badoux, R.
1991-11-01
The present proceedings from a course on Bayesian methods in reliability encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
Fassihi, Afshin; Sabet, Razieh
2008-01-01
Quantitative relationships between molecular structure and p56lck protein tyrosine kinase inhibitory activity of 50 flavonoid derivatives were explored using MLR and GA-PLS methods. Different QSAR models revealed that substituent electronic descriptor (SED) parameters have a significant impact on the protein tyrosine kinase inhibitory activity of the compounds. Of the two statistical methods employed, GA-PLS gave superior results. The resultant GA-PLS model had high statistical quality (R2 = 0.74 and Q2 = 0.61) for predicting the activity of the inhibitors. The models proposed in the present work are more useful in describing the QSAR of flavonoid derivatives as p56lck protein tyrosine kinase inhibitors than those provided previously. PMID:19325836
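The cross-validated Q2 statistic quoted for the GA-PLS model is conventionally computed by leave-one-out cross-validation as Q2 = 1 - PRESS/SStot. A minimal sketch on synthetic one-descriptor data (not the paper's descriptors or compounds, and using simple linear regression rather than PLS):

```python
# Leave-one-out cross-validated Q^2 for a one-descriptor linear model.
# Synthetic data; illustrates the statistic, not the GA-PLS model itself.

def fit_line(xs, ys):
    """Least-squares slope and intercept for a simple linear model."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
             / sum((a - mx) ** 2 for a in xs))
    return slope, my - slope * mx

def q2_loo(xs, ys):
    """Q^2 = 1 - PRESS/SS_tot, with PRESS from leave-one-out predictions."""
    press = 0.0
    for i in range(len(xs)):
        xt, yt = xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:]
        slope, intercept = fit_line(xt, yt)
        press += (ys[i] - (slope * xs[i] + intercept)) ** 2
    my = sum(ys) / len(ys)
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - press / ss_tot

x = [1, 2, 3, 4, 5, 6, 7, 8]                      # a single descriptor
y = [3.1, 4.9, 7.2, 9.0, 10.8, 13.1, 15.0, 16.9]  # activity, nearly linear
print(round(q2_loo(x, y), 3))
```

Because each prediction is made without the point being predicted, Q2 is always at or below the fitted R2, which is why it is the preferred guard against overfitting in QSAR work.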
Assessment of NDE reliability data
NASA Technical Reports Server (NTRS)
Yee, B. G. W.; Couchman, J. C.; Chang, F. H.; Packman, D. F.
1975-01-01
Twenty sets of relevant nondestructive test (NDT) reliability data were identified, collected, compiled, and categorized. A criterion for the selection of data for statistical analysis considerations was formulated, and a model to grade the quality and validity of the data sets was developed. Data input formats, which record the pertinent parameters of the defect/specimen and inspection procedures, were formulated for each NDE method. A comprehensive computer program was written and debugged to calculate the probability of flaw detection at several confidence limits by the binomial distribution. This program also selects the desired data sets for pooling and tests the statistical pooling criteria before calculating the composite detection reliability. An example of the calculated reliability of crack detection in bolt holes by an automatic eddy current method is presented.
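The probability-of-flaw-detection calculation described above, a binomial estimate at a stated confidence limit, can be sketched as follows. This is a generic Clopper-Pearson lower bound, not the report's actual program:

```python
import math

def binom_tail(n, s, p):
    """P(X >= s) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(s, n + 1))

def pod_lower_bound(n, s, confidence=0.95):
    """Lower confidence bound on probability of detection (POD):
    the p solving P(X >= s | n, p) = 1 - confidence (Clopper-Pearson),
    found by bisection since the tail is monotone in p."""
    alpha = 1 - confidence
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if binom_tail(n, s, mid) < alpha:
            lo = mid
        else:
            hi = mid
    return lo

# Classic result: 29 detections in 29 trials demonstrates
# roughly 90% POD at 95% confidence.
print(round(pod_lower_bound(29, 29), 3))  # → 0.902
```

The same routine handles partial detection records (s < n), which is what makes pooling of data sets attractive: larger n tightens the bound.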
Nieri, Michele; Clauser, Carlo; Franceschi, Debora; Pagliaro, Umberto; Saletta, Daniele; Pini-Prato, Giovanpaolo
2007-08-01
The aim of the present study was to investigate the relationships among reported methodological, statistical, clinical and paratextual variables of randomized clinical trials (RCTs) in implant therapy, and their influence on subsequent research. The material consisted of the RCTs in implant therapy published through the end of the year 2000. Methodological, statistical, clinical and paratextual features of the articles were assessed and recorded. The perceived clinical relevance was subjectively evaluated by an experienced clinician on anonymous abstracts. The impact on research was measured by the number of citations found in the Science Citation Index. A new statistical technique (Structural learning of Bayesian Networks) was used to assess the relationships among the considered variables. Descriptive statistics revealed that the reported methodology and statistics of RCTs in implant therapy were defective. Follow-up of the studies was generally short. The perceived clinical relevance appeared to be associated with the objectives of the studies and with the number of published images in the original articles. The impact on research was related to the nationality of the involved institutions and to the number of published images. RCTs in implant therapy (until 2000) show important methodological and statistical flaws and may not be appropriate for guiding clinicians in their practice. The methodological and statistical quality of the studies did not appear to affect their impact on practice and research. Bayesian Networks suggest new and unexpected relationships among the methodological, statistical, clinical and paratextual features of RCTs.
Assessment and prediction of air quality using fuzzy logic and autoregressive models
NASA Astrophysics Data System (ADS)
Carbajal-Hernández, José Juan; Sánchez-Fernández, Luis P.; Carrasco-Ochoa, Jesús A.; Martínez-Trinidad, José Fco.
2012-12-01
In recent years, artificial intelligence methods have been used for the treatment of environmental problems. This work presents two models for assessment and prediction of air quality. First, we develop a new computational model for air quality assessment in order to evaluate toxic compounds that can harm sensitive people in urban areas, affecting their normal activities. In this model we propose to use a Sigma operator to statistically assess air quality parameters using their historical data and to determine their negative impact on air quality based on toxicity limits, frequency averages and deviations of toxicological tests. We also introduce a fuzzy inference system to perform parameter classification using a reasoning process, integrating the parameters into an air quality index that describes the pollution levels in five stages: excellent, good, regular, bad and danger, respectively. The second model proposed in this work predicts air quality concentrations using an autoregressive model, providing a predicted air quality index based on the fuzzy inference system previously developed. Using data from the Mexico City Atmospheric Monitoring System, we perform a comparison among air quality indices developed by environmental agencies and similar models. Our results show that our models are an appropriate tool for assessing site pollution and for providing guidance to improve contingency actions in urban areas.
[Applications of the hospital statistics management system].
Zhai, Hong; Ren, Yong; Liu, Jing; Li, You-Zhang; Ma, Xiao-Long; Jiao, Tao-Tao
2008-01-01
The Hospital Statistics Management System is built on an Office Automation Platform of the Shandong provincial hospital system. Its workflow, role and permission technologies are used to standardize and optimize the statistics management program in the total quality control of hospital statistics. The system's applications have combined the office automation platform with statistics management in a hospital, providing a practical example of a modern hospital statistics management model.
Quality Control of the Print with the Application of Statistical Methods
NASA Astrophysics Data System (ADS)
Simonenko, K. V.; Bulatova, G. S.; Antropova, L. B.; Varepo, L. G.
2018-04-01
The basis for standardizing the offset printing process is the control of print quality indicators. There are various approaches to this problem, among which the most important are statistical methods. Their practical implementation for managing the quality of the printing process is highly relevant and is reflected in this paper. The possibility of using control charts to identify the causes of deviations in optical density for a triad of inks in offset printing is shown.
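A control chart of the kind described can be sketched as a Shewhart individuals chart. The optical-density readings below are hypothetical, and sigma is estimated from the average moving range (divided by the standard constant d2 = 1.128 for subgroups of two):

```python
# Shewhart individuals chart for print optical density (hypothetical data):
# center line at the mean, control limits at +/- 3 sigma, with sigma
# estimated from the average moving range of consecutive readings.

def individuals_chart(readings):
    n = len(readings)
    center = sum(readings) / n
    moving_ranges = [abs(b - a) for a, b in zip(readings, readings[1:])]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128  # d2 for n=2
    ucl, lcl = center + 3 * sigma, center - 3 * sigma
    out_of_control = [i for i, x in enumerate(readings) if x > ucl or x < lcl]
    return center, lcl, ucl, out_of_control

# Hypothetical optical-density readings for one ink across press sheets
density = [1.42, 1.45, 1.43, 1.44, 1.41, 1.43, 1.60, 1.44, 1.42, 1.45]
center, lcl, ucl, flagged = individuals_chart(density)
print(flagged)  # → [6]  (the 1.60 reading falls above the upper limit)
```

Points outside the limits signal an assignable cause (e.g., ink feed drift) rather than common-cause variation, which is exactly the diagnostic use the paper describes.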
Smylie, Janet; Firestone, Michelle
2015-01-01
Canada is known internationally for excellence in both the quality and public policy relevance of its health and social statistics. There is a double standard however with respect to the relevance and quality of statistics for Indigenous populations in Canada. Indigenous specific health and social statistics gathering is informed by unique ethical, rights-based, policy and practice imperatives regarding the need for Indigenous participation and leadership in Indigenous data processes throughout the spectrum of indicator development, data collection, management, analysis and use. We demonstrate how current Indigenous data quality challenges including misclassification errors and non-response bias systematically contribute to a significant underestimate of inequities in health determinants, health status, and health care access between Indigenous and non-Indigenous people in Canada. The major quality challenge underlying these errors and biases is the lack of Indigenous specific identifiers that are consistent and relevant in major health and social data sources. The recent removal of an Indigenous identity question from the Canadian census has resulted in further deterioration of an already suboptimal system. A revision of core health data sources to include relevant, consistent, and inclusive Indigenous self-identification is urgently required. These changes need to be carried out in partnership with Indigenous peoples and their representative and governing organizations. PMID:26793283
Defining the best quality-control systems by design and inspection.
Hinckley, C M
1997-05-01
Not all of the many approaches to quality control are equally effective. Nonconformities in laboratory testing are caused basically by excessive process variation and mistakes. Statistical quality control can effectively control process variation, but it cannot detect or prevent most mistakes. Because mistakes or blunders are frequently the dominant source of nonconformities, we conclude that statistical quality control by itself is not effective. I explore the 100% inspection methods essential for controlling mistakes. Unlike the inspection techniques that Deming described as ineffective, the new "source" inspection methods can detect mistakes and enable corrections before nonconformities are generated, achieving the highest degree of quality at a fraction of the cost of traditional methods. Key relationships between task complexity and nonconformity rates are also described, along with cultural changes that are essential for implementing the best quality-control practices.
A tutorial in displaying mass spectrometry-based proteomic data using heat maps.
Key, Melissa
2012-01-01
Data visualization plays a critical role in interpreting the results of proteomic experiments. Heat maps are particularly useful for this task, as they allow us to find quantitative patterns across proteins and biological samples simultaneously. The quality of a heat map can be vastly improved by understanding the options available to display and organize the data in it. This tutorial illustrates how to optimize heat maps for proteomics data by incorporating known characteristics of the data into the image. First, the concepts used to guide the creation of heat maps are demonstrated. Then, these concepts are applied to two types of analysis: visualizing spectral features across biological samples, and presenting the results of tests of statistical significance. For all examples we provide details of computer code in the open-source statistical programming language R, which can be used by biologists and clinicians with little statistical background. Heat maps are a useful tool for presenting quantitative proteomic data organized in a matrix format. Understanding and optimizing the parameters used to create the heat map can vastly improve both the appearance and the interpretation of heat map data.
Evaluation of asbestos levels in two schools before and after asbestos removal. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karaffa, M.A.; Chesson, J.; Russell, J.
This report presents a statistical evaluation of airborne asbestos data collected at two schools before and after removal of asbestos-containing material (ACM). Although the monitoring data are not totally consistent with new Asbestos Hazard Emergency Response Act (AHERA) requirements and recent EPA guidelines, the study evaluates these historical data by standard statistical methods to determine if abated work areas meet proposed clearance criteria. The objectives of this statistical analysis were to compare (1) airborne asbestos levels indoors after removal with levels outdoors, (2) airborne asbestos levels before and after removal of asbestos, and (3) static sampling and aggressive sampling of airborne asbestos. The results of this evaluation indicated the following: the effect of asbestos removal on indoor air quality is unpredictable; the variability in fiber concentrations among different sampling sites within the same building indicates the need to treat different sites as separate areas for the purpose of clearance; and aggressive sampling is appropriate for clearance testing because it captures more entrainable asbestos structures. Aggressive sampling lowers the chance of declaring a worksite clean when entrainable asbestos is still present.
Baier, Bernhard; Thömke, Frank; Wilting, Janine; Heinze, Caroline; Geber, Christian; Dieterich, Marianne
2012-10-24
The perceived subjective visual vertical (SVV) is an important sign of a vestibular otolith tone imbalance in the roll plane. Previous studies suggested that unilateral pontomedullary brainstem lesions cause ipsiversive roll-tilt of the SVV, whereas pontomesencephalic lesions cause contraversive roll-tilt. However, previous data were of limited quality and lacked a statistical approach. We therefore tested roll-tilt of the SVV in 79 patients with acute unilateral brainstem lesions due to stroke by applying modern statistical lesion-behavior mapping analysis. Roll-tilt of the SVV was verified to be a brainstem sign, and for the first time it was confirmed statistically that lesions of the medial longitudinal fasciculus (MLF) and the medial vestibular nucleus are associated with ipsiversive tilt of the SVV, whereas contraversive tilts are associated with lesions affecting the rostral interstitial nucleus of the MLF, the superior cerebellar peduncle, the oculomotor nucleus, and the interstitial nucleus of Cajal. Thus, these structures constitute the anatomical pathway in the brainstem for verticality perception. The present data indicate that graviceptive otolith signals play a predominant role in the multisensory system of verticality perception.
Liumbruno, Giancarlo Maria; Panetta, Valentina; Bonini, Rosaria; Chianese, Rosa; Fiorin, Francesco; Lupi, Maria Antonietta; Tomasini, Ivana; Grazzini, Giuliano
2011-01-01
Introduction The aim of the survey described in this article was to determine decisional and strategic factors useful for redefining minimum structural, technological and organisational requisites for transfusion structures, as well as for the production of guidelines for accreditation of transfusion structures by the National Blood Centre. Materials and methods A structured questionnaire containing 65 questions was sent to all Transfusion Services in Italy. The questions covered: management of the quality system, accreditation, conformity with professional standards, structural and technological requisites, as well as potential to supply transfusion medicine-related health care services. All the questionnaires returned underwent statistical analysis. Results Replies were received from 64.7% of the Transfusion Services. Thirty-nine percent of these had an ISO 9001 certificate, with marked differences according to geographical location; location-related differences were also present for responses to other questions and were confirmed by multivariate statistical analysis. Over half of the Transfusion Services (53.6%) had blood donation sites run by donor associations. The statistical analysis revealed only one statistically significant difference between these donation sites: those connected to certified Transfusion Services were more likely themselves to have ISO 9001 certification than those connected to services who did not have such certification. Conclusions The data collected in this survey are representative of the Italian national transfusion system. A re-definition of the authorisation and accreditation requisites for transfusion activities must take into account European and national legislation when determining these requisites in order to facilitate their effective applicability, promote their efficient fulfilment and enhance the development of homogeneous and transparent quality systems. PMID:21839026
Quality in End User Documentation.
ERIC Educational Resources Information Center
Morrison, Ronald
1994-01-01
Discusses quality in end-user documentation for computer applications and explains four approaches to improving quality in end-user documents. Highlights include online help, usability testing, technical writing elements, statistical approaches, and concepts relating to software quality that are also applicable to user manuals. (LRW)
[Quality assessment in anesthesia].
Kupperwasser, B
1996-01-01
Quality assessment (assurance/improvement) is the set of methods used to measure and improve the delivered care and the department's performance against pre-established criteria or standards. The four stages of the self-maintained quality assessment cycle are: problem identification, problem analysis, problem correction and evaluation of corrective actions. Quality assessment is a measurable entity for which it is necessary to define and calibrate measurement parameters (indicators) from available data gathered from the hospital anaesthesia environment. Problem identification comes from the accumulation of indicators. There are four types of quality indicators: structure, process, outcome and sentinel indicators. The latter signal a quality defect, are independent of outcomes, are easier to analyse by statistical methods and are closely related to processes and main targets of quality improvement. The three types of methods to analyse the problems (indicators) are: peer review, quantitative methods and risk-management techniques. Peer review is performed by qualified anaesthesiologists. To improve its validity, the review process should be made explicit and conclusions based on standards of practice and literature references. The quantitative methods are statistical analyses applied to the collected data and presented in a graphic format (histogram, Pareto diagram, control charts). The risk-management techniques include: a) critical incident analysis, establishing an objective relationship between a 'critical' event and the associated human behaviours; b) system accident analysis, based on the fact that accidents continue to occur despite safety systems and sophisticated technologies, which examines all the process components leading to the unpredictable outcome and not just the human factors; c) cause-effect diagrams, which facilitate problem analysis by reducing its causes to four fundamental components (persons, regulations, equipment, process).
Definition and implementation of corrective measures, based on the findings of the two previous stages, are the third step of the evaluation cycle. The Hawthorne effect is an outcome improvement, before the implementation of any corrective actions. Verification of the implemented actions is the final and mandatory step closing the evaluation cycle.
Statistical moments of the Strehl ratio
NASA Astrophysics Data System (ADS)
Yaitskova, Natalia; Esselborn, Michael; Gladysz, Szymon
2012-07-01
Knowledge of the statistical characteristics of the Strehl ratio is essential for the performance assessment of existing and future adaptive optics systems. For a full assessment, not only the mean value of the Strehl ratio but also the higher statistical moments are important. Variance is related to the stability of an image, and skewness reflects the chance of having, in a set of short-exposure images, more or fewer images with quality exceeding the mean. Skewness is a central parameter in the domain of lucky imaging. We present a rigorous theory for the calculation of the mean value, the variance and the skewness of the Strehl ratio. In our approach we represent the residual wavefront as being formed by independent cells. The level of the adaptive optics correction defines the number of cells and the variance of the cells, which are the two main parameters of our theory. The deliverables are the values of the three moments as functions of the correction level. We make no further assumptions except for the statistical independence of the cells.
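The cell model described above lends itself to a simple Monte-Carlo check. This is a sketch under the paper's independence assumption, not the authors' analytic derivation: each frame's Strehl ratio is taken as the squared modulus of the average phasor over N independent cells with Gaussian phase errors, for which the exact mean is exp(-sigma2) + (1 - exp(-sigma2))/N.

```python
import cmath
import math
import random
import statistics

# Monte-Carlo model of short-exposure Strehl ratios: N independent wavefront
# cells with Gaussian phase errors of variance sigma2; the instantaneous
# Strehl ratio is |<exp(i*phi)>|^2 averaged over the cells.

def simulate_strehl(n_cells, sigma2, n_frames, seed=1):
    rng = random.Random(seed)
    sigma = math.sqrt(sigma2)
    strehls = []
    for _ in range(n_frames):
        field = sum(cmath.exp(1j * rng.gauss(0.0, sigma))
                    for _ in range(n_cells))
        strehls.append(abs(field / n_cells) ** 2)
    return strehls

s = simulate_strehl(n_cells=100, sigma2=0.5, n_frames=2000)
mean = statistics.fmean(s)
var = statistics.pvariance(s)
skew = sum((x - mean) ** 3 for x in s) / len(s) / var ** 1.5

# Exact mean for independent Gaussian cells: exp(-s2) + (1 - exp(-s2)) / N
print(round(mean, 3), round(var, 4))
```

Sweeping `n_cells` and `sigma2` reproduces the qualitative behaviour the paper analyses: better correction (smaller sigma2, more cells) raises the mean and shrinks the variance and skewness of the frame-to-frame Strehl distribution.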
Lehmann, Thomas; Redies, Christoph
2017-01-01
For centuries, oil paintings have been a major segment of the visual arts. The JenAesthetics data set consists of a large number of high-quality images of oil paintings of Western provenance from different art periods. With this database, we studied the relationship between objective image measures and subjective evaluations of the images, especially evaluations on aesthetics (defined as artistic value) and beauty (defined as individual liking). The objective measures represented low-level statistical image properties that have been associated with aesthetic value in previous research. Subjective rating scores on aesthetics and beauty correlated not only with each other but also with different combinations of the objective measures. Furthermore, we found that paintings from different art periods vary with regard to the objective measures, that is, they exhibit specific patterns of statistical image properties. In addition, clusters of participants preferred different combinations of these properties. In conclusion, the results of the present study provide evidence that statistical image properties vary between art periods and subject matters and, in addition, they correlate with the subjective evaluation of paintings by the participants. PMID:28694958
Berkő, Péter
2016-05-01
It is a regrettable deficiency of Hungarian healthcare that a culture and system of quality control of care have not been established (except in a few subspecialties, units or wards). If hospital wards do not have a national, professionally unified and modern information system that annually presents the most important quantitative and qualitative indicators of their medical activity, a stable basis for defining future tasks is absent. The author puts forward a proposal for the establishment of information systems for different professional fields. On the basis of experience with the perinatological information system that has operated for over 3 decades in Borsod-Abaúj-Zemplén county, he also proposes the introduction of a nationally unified, Europeristat-compatible information system, following the Tauffer statistics, which may serve as a uniform quality control of obstetric and perinatological care, as well as the introduction of its basis, the data form "TePERA" (Form of Obstetrics and Perinatological Care Risk).
The impact of primary open-angle glaucoma: Quality of life in Indian patients
Kumar, Suresh; Ichhpujani, Parul; Singh, Roopali; Thakur, Sahil; Sharma, Madhu; Nagpal, Nimisha
2018-01-01
Purpose: Glaucoma significantly affects the quality of life (QoL) of a patient. Despite the huge number of glaucoma patients in India, not many QoL studies have been carried out. The purpose of the present study was to evaluate the QoL in Indian patients with varying severity of glaucoma. Methods: This was a hospital-based, cross-sectional, analytical study of 180 patients. The QoL was assessed using orally administered QoL instruments comprising two glaucoma-specific instruments, the Glaucoma Quality of Life-15 (GQL-15) and the Viswanathan 10 instrument, and one vision-specific instrument, the National Eye Institute Visual Function Questionnaire-25 (NEIVFQ25). Results: Using the NEIVFQ25, the difference between mean QoL scores among cases (88.34 ± 4.53) and controls (95.32 ± 5.76) was statistically significant. In the GQL-15, there was a statistically significant difference between mean scores of cases (22.58 ± 5.23) and controls (16.52 ± 1.24). The difference in mean scores with the Viswanathan 10 instrument in cases (7.92 ± 0.54) and controls (9.475 ± 0.505) was also statistically significant. QoL scores also showed moderate correlation with mean deviation, pattern standard deviation, and vertical cup-disc ratio. Conclusion: In our study, all three instruments showed a decrease in QoL in glaucoma patients compared with controls. With increasing severity of glaucoma, a corresponding decrease in QoL was observed. It is important for ophthalmologists to understand the QoL of glaucoma patients so as to take a more holistic approach to patients and deliver treatment effectively. PMID:29480254
Lichen planus affecting the female genitalia: A retrospective review of patients at Mayo Clinic.
Fahy, Caoimhe M R; Torgerson, Rochelle R; Davis, Mark D P
2017-12-01
Genital or vulval lichen planus (VLP) may have a disabling effect on a patient's quality of life. Evidence-based management guidelines are lacking for VLP. We sought to review clinical presentation and treatment of patients who received a diagnosis of VLP. The 100 consecutive patients who received a diagnosis of VLP at Mayo Clinic between January 1, 1997, and December 31, 2015, were reviewed retrospectively. Descriptive statistics were used for data analysis. Fisher's exact test and the Wilcoxon rank sum test were used for analysis of categorical and continuous variables, respectively. All statistical tests were 2 sided, with the α level set at .05 for statistical significance. The time to diagnosis for 49% of patients was more than 1 year. Three patients (3%) had vulval dysplasia, including invasive squamous cell carcinoma. Sixty-eight patients (68%) had multisite lichen planus disease. Eleven patients (11%) had disease remission. Dermatology was the lead specialty for 9 of these cases of remission. This was a retrospective, small-cohort study. A low frequency of disease remission was seen in patients with VLP. Patients with lichen planus benefit considerably from dermatology consultation. Further research is warranted to establish high-quality, evidence-based guidelines for multidisciplinary management of this challenging disease. Copyright © 2017 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.
Statistical Process Control: A Quality Tool for a Venous Thromboembolic Disease Registry.
Posadas-Martinez, Maria Lourdes; Rojas, Liliana Paloma; Vazquez, Fernando Javier; De Quiros, Fernan Bernaldo; Waisman, Gabriel Dario; Giunta, Diego Hernan
2016-01-01
We aim to describe Statistical Process Control as a quality tool for the Institutional Registry of Venous Thromboembolic Disease (IRTD), a registry developed in a community-care tertiary hospital in Buenos Aires, Argentina. The IRTD is a prospective cohort. The process of data acquisition began with the creation of a computerized alert generated whenever physicians requested an imaging or laboratory study to diagnose venous thromboembolism, which defined eligible patients. The process then followed a structured methodology for patient inclusion, evaluation, and subsequent data entry. To control this process, process performance indicators were designed to be measured monthly. These included the number of eligible patients, the number of included patients, median time to patient evaluation, and percentage of patients lost to evaluation. Control charts were graphed for each indicator. The registry was evaluated over 93 months, during which 25,757 patients were reported and 6,798 patients met inclusion criteria. The median time to evaluation was 20 hours (SD, 12), and 7.7% of the total was lost to evaluation. Each indicator presented trends over time, caused by structural changes and improvement cycles, and therefore the center line underwent inflections. Statistical process control through process performance indicators allowed us to control the performance of the registry over time and to detect systematic problems. We postulate that this approach could be reproduced for other clinical registries.
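An attribute (p) chart of the kind suited to the "percentage of patients lost to evaluation" indicator can be sketched as follows, on hypothetical monthly counts (not the IRTD's data):

```python
# p-chart for a monthly proportion indicator: center line at the pooled
# proportion, with per-month 3-sigma binomial limits that account for
# varying monthly denominators. Data are hypothetical.

def p_chart(lost, eligible):
    total_lost, total_n = sum(lost), sum(eligible)
    pbar = total_lost / total_n  # pooled proportion = center line
    out_of_control = []
    for i, (d, n) in enumerate(zip(lost, eligible)):
        sigma = (pbar * (1 - pbar) / n) ** 0.5
        ucl = pbar + 3 * sigma
        lcl = max(0.0, pbar - 3 * sigma)
        p = d / n
        if p > ucl or p < lcl:
            out_of_control.append(i)
    return pbar, out_of_control

# Hypothetical monthly eligible patients and patients lost to evaluation
eligible = [70, 75, 72, 68, 80, 74, 71, 77]
lost = [5, 6, 5, 4, 21, 6, 5, 6]
pbar, flagged = p_chart(lost, eligible)
print(round(pbar, 3), flagged)  # → 0.099 [4]
```

Month 4's spike beyond its 3-sigma limit is the chart's signal of a systematic problem (e.g., a staffing gap), exactly the kind of special-cause event the registry's indicators are meant to surface.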
Advances and Best Practices in Airborne Gravimetry from the U.S. GRAV-D Project
NASA Astrophysics Data System (ADS)
Diehl, Theresa; Childers, Vicki; Preaux, Sandra; Holmes, Simon; Weil, Carly
2013-04-01
The Gravity for the Redefinition of the American Vertical Datum (GRAV-D) project, an official policy of the U.S. National Geodetic Survey as of 2007, is working to survey the entire U.S. and its holdings with high-altitude airborne gravimetry. The goal of the project is to provide a consistent, high-quality gravity dataset that will become the cornerstone of a new gravimetric geoid and national vertical datum in 2022. Over the last five years, the GRAV-D project has surveyed more than 25% of the country, accomplishing almost 500 flights on six different aircraft platforms and producing more than 3.7 million square kilometers of data thus far. This wealth of experience has led to advances in the collection, processing, and evaluation of high-altitude (20,000-35,000 ft) airborne gravity data. This presentation will highlight the most important practical and theoretical advances of the GRAV-D project, giving an introduction to each. Examples of innovation include: 1. Use of navigation-grade inertial measurement unit data and precise lever-arm measurements for positioning; 2. New quality-control tests and software for near real-time analysis of data in the field; 3. Increased accuracy of gravity post-processing by reexamining assumptions and simplifications that were inconsistent with a goal of 1 mGal precision; and 4. Better final data evaluation through crossovers, additional statistics, and inclusion of airborne data into harmonic models that use EGM08 as a base model. The increases in data quality that resulted from implementation of the above advances (and others) will be shown with a case study of the GRAV-D 2008 southern Alaska survey near Anchorage, over Cook Inlet. The case study's statistics and comparisons to global models illustrate the impact that these advances have had on the final airborne gravity data quality. Finally, the presentation will summarize the best practices identified by the project from its last five years of experience.
NASA Astrophysics Data System (ADS)
Robichaud, A.; Ménard, R.
2013-05-01
We present multi-year objective analyses (OA) at high spatio-temporal resolution (15 or 21 km, every hour) for the warm-season period (1 May-31 October) for ground-level ozone (2002-2012) and for fine particulate matter (particles with diameter less than 2.5 microns; PM2.5) (2004-2012). The OA used here combines the Canadian Air Quality forecast suite with US and Canadian surface air quality monitoring sites. The analysis is based on an optimal interpolation with capabilities for adaptive error statistics for ozone and PM2.5 and an explicit bias correction scheme for the PM2.5 analyses. The error statistics have been estimated using a modified version of the Hollingsworth-Lönnberg (H-L) method. Various quality controls (gross error check, sudden jump test, and background check) have been applied to the observations to remove outliers. An additional quality control checks the consistency of the error statistics estimation model at each observing station and for each hour. The error statistics are further tuned "on the fly" using a χ2 (chi-square) diagnostic, a procedure that yields significantly better verification scores than the untuned analysis. Successful cross-validation experiments were performed with an OA set-up using 90% of observations to build the objective analysis, with the remainder left out as an independent set of data for verification purposes. Furthermore, comparisons with other external sources of information (global models and satellite-derived PM2.5 surface measurements) show reasonable agreement. The multi-year analyses obtained provide relatively high precision, with an absolute yearly averaged systematic error of less than 0.6 ppbv (parts per billion by volume) for ozone and 0.7 μg m-3 (micrograms per cubic meter) for PM2.5, and a random error generally less than 9 ppbv for ozone and under 12 μg m-3 for PM2.5.
In this paper, we focus on two applications: (1) presenting long-term averages of the objective analyses and analysis increments as a form of summer climatology and (2) analyzing long-term (decadal) trends and inter-annual fluctuations using OA outputs. Our results show that high percentiles of ozone and PM2.5 are both following a decreasing trend overall in North America, with the eastern part of the United States (US) showing the largest decrease, likely due to more effective pollution controls. Some locations, however, exhibited an increasing trend in mean ozone and PM2.5, such as the northwestern part of North America (northwest US and Alberta). The low percentiles are generally rising for ozone, which may be linked to increasing emissions from emerging countries and the resulting pollution brought by intercontinental transport. After removing the decadal trend, we demonstrate that the inter-annual fluctuations of the high percentiles are significantly correlated with temperature fluctuations for ozone and precipitation fluctuations for PM2.5. We also show a moderately significant correlation between the inter-annual fluctuations of the high percentiles of ozone and PM2.5 and economic indices such as the Dow Jones Industrial Average and/or the US gross domestic product growth rate.
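The χ2 diagnostic mentioned above rests on the innovation-consistency property that, if the assumed background and observation error covariances B and R are correct, the statistic dᵀ(HBHᵀ + R)⁻¹d has expectation p, the number of observations, for innovation vector d = y − Hx_b. A ratio far from 1 signals mis-specified error variances that can be rescaled. A schematic sketch with synthetic diagonal covariances (not the operational analysis code; the variances here are invented for illustration):

```python
import numpy as np

def chi2_ratio(innovations, assumed_innovation_cov):
    """chi2/p diagnostic: ~1 when the assumed innovation covariance
    S = H B H^T + R is consistent with the observed innovations d."""
    d = np.asarray(innovations, dtype=float)
    S = np.asarray(assumed_innovation_cov, dtype=float)
    return float(d @ np.linalg.solve(S, d)) / d.size

rng = np.random.default_rng(42)
p = 2000
d = rng.normal(0.0, 2.0, size=p)   # true innovation std dev = 2
S_assumed = np.eye(p)              # assumed variance 1: too optimistic
ratio = chi2_ratio(d, S_assumed)   # ~4: inflate assumed variances by this factor
```

Tuning "on the fly" then amounts to multiplying the assumed variances by the running ratio until the diagnostic settles near 1.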
Rostami, Reza; Nahm, Meredith; Pieper, Carl F.
2011-01-01
Background: Despite a pressing and well-documented need for better sharing of information on clinical trials data quality assurance methods, many research organizations remain reluctant to publish descriptions of and results from their internal auditing and quality assessment methods. Purpose: We present findings from a review of a decade of internal data quality audits performed at the Duke Clinical Research Institute (DCRI), a large academic research organization that conducts data management for a diverse array of clinical studies, both academic and industry-sponsored. In so doing, we hope to stimulate discussions that could benefit the wider clinical research enterprise by providing insight into methods of optimizing data collection and cleaning, ultimately helping patients and furthering essential research. Methods: We present our audit methodologies, including sampling methods, audit logistics, sample sizes, counting rules used for error rate calculations, and characteristics of audited trials. We also present database error rates as computed according to two analytical methods, which we address in detail, and discuss the advantages and drawbacks of two auditing methods used during this ten-year period. Results: Our review of the DCRI audit program indicates that higher data quality may be achieved from a series of small audits throughout the trial rather than through a single large database audit at database lock. We found that error rates trended upward from year to year in the period characterized by traditional audits performed at database lock (1997-2000), but consistently trended downward after periodic statistical process control type audits were instituted (2001-2006). These increases in data quality were also associated with cost savings in auditing, estimated at 1,000 hours per year, or the effort of one-half of a full-time equivalent (FTE).
Limitations: Our findings are drawn from retrospective analyses and are not the result of controlled experiments, and may therefore be subject to unanticipated confounding. In addition, the scope and type of audits we examine here are specific to our institution, and our results may not be broadly generalizable. Conclusions: Use of statistical process control methodologies may afford advantages over more traditional auditing methods, and further research will be necessary to confirm the reliability and usability of such techniques. We believe that open and candid discussion of data quality assurance issues among academic and clinical research organizations will ultimately benefit the entire research community in the coming era of increased data sharing and re-use. PMID:19342467
Statistical characteristics of excess fiber length in loose tubes of optical cable
NASA Astrophysics Data System (ADS)
Andreev, Vladimir A.; Gavryushin, Sergey A.; Popov, Boris V.; Popov, Victor B.; Vazhdaev, Michael A.
2017-04-01
This paper presents an analysis of measurements of excess fiber length in the loose tubes of optical cable obtained during post-process quality control of finished products. To estimate the numerical characteristics of excess fiber length, a method for processing the results of direct, multiple, equally accurate measurements was used. The experimental results indicate that the excess fiber length remains constant as long as the loose-tube manufacturing technology is unchanged.
Neilson, Jennifer R.; Lamb, Berton Lee; Swann, Earlene M.; Ratz, Joan; Ponds, Phadrea D.; Liverca, Joyce
2005-01-01
The findings presented in this report are the basic results of the attitude assessment survey conducted in the last quarter of 2004, namely the frequency distributions for each question in the survey instrument for all respondents. The only statistics provided are descriptive in character, namely means and associated standard deviations.
Visual ergonomic aspects of glare on computer displays: glossy screens and angular dependence
NASA Astrophysics Data System (ADS)
Brunnström, Kjell; Andrén, Börje; Konstantinides, Zacharias; Nordström, Lukas
2007-02-01
Recently, flat-panel computer displays and notebook computers designed with a so-called glare panel, i.e., a highly glossy screen, have emerged on the market. The shiny look of the display appeals to customers, and it is also argued that contrast, colour saturation, etc., improve with a glare panel. LCD displays often suffer from angle-dependent picture quality. This has become even more pronounced with the introduction of Prism Light Guide plates into displays for notebook computers. The TCO label is the leading labelling system for computer displays. Currently about 50% of all computer displays on the market are certified according to the TCO requirements. The requirements are periodically updated to keep up with technical development and the latest research in, e.g., visual ergonomics. The gloss level of the screen and the angular dependence have recently been investigated in user studies. A study of the effect of highly glossy screens compared to matt screens has been performed. The results show a slight advantage for the glossy screen when no disturbing reflections are present; however, the difference was not statistically significant. When disturbing reflections are present, the advantage turns into a larger disadvantage, and this difference is statistically significant. Another study, of angular dependence, has also been performed. The results indicate a linear relationship between picture quality and the centre luminance of the screen.
Sharif, Farkhondeh; Jahanbin, Iran; Amirsadat, Afsar; Hosseini Moghadam, Mahboobeh
2018-04-01
Life review therapy, used as part of a comprehensive therapy plan for increasing the quality of life of the elderly, helps them to resolve their past conflicts, reconstruct their life stories, and accept their present conditions. The present study aimed to explore the effectiveness of life review therapy on the quality of life of the elderly. The study was a randomized controlled trial with a pretest-posttest design, conducted from April to August 2014 on 35 members of elderly day care centers in Shiraz, Iran, who were randomly assigned to two groups (experimental and control). The subjects in the experimental group attended 8 two-hour sessions of life review therapy. The quality of life of the elderly participants was evaluated before the intervention and immediately, one month, and three months after it, using the quality of life questionnaire (WHOQOL-BREF). Data analysis was conducted in SPSS version 22, using statistical tests including the chi-square test, repeated-measures tests, and t-tests, with a significance level of 0.05. The results showed that life review therapy significantly improved the quality of life of the elderly (P<0.05). Moreover, the group-by-time interaction was also significant, indicating that the pattern of change differed between the two groups. The findings confirm the research hypotheses, showing that the application of life review is effective and viable. It is recommended that nursing homes and even the families of the elderly employ this convenient, inexpensive, quick, and practical method. Trial Registration Number: IRCT2015021621106N1.
On Statistical Approaches for Demonstrating Analytical Similarity in the Presence of Correlation.
Yang, Harry; Novick, Steven; Burdick, Richard K
Analytical similarity is the foundation for demonstration of biosimilarity between a proposed product and a reference product. For this assessment, the U.S. Food and Drug Administration (FDA) currently recommends a tiered system in which quality attributes are categorized into three tiers commensurate with their risk, and approaches of varying statistical rigor are subsequently used for the three tiers of quality attributes. Key to the analyses of Tier 1 and Tier 2 quality attributes is the establishment of an equivalence acceptance criterion and a quality range. For particular licensure applications, the FDA has provided advice on statistical methods for demonstration of analytical similarity. For example, for Tier 1 assessment, an equivalence test can be used based on an equivalence margin of 1.5σR, where σR is the reference product variability estimated by the sample standard deviation SR from a sample of reference lots. The quality range for demonstrating Tier 2 analytical similarity is of the form X̄R ± K × σR, where the constant K is appropriately justified. To demonstrate Tier 2 analytical similarity, a large percentage (e.g., 90%) of test product lots must fall within the quality range. In this paper, through both theoretical derivations and simulations, we show that when the reference drug product lots are correlated, the sample standard deviation SR underestimates the true reference product variability σR. As a result, substituting SR for σR in the Tier 1 equivalence acceptance criterion and the Tier 2 quality range inappropriately reduces the statistical power and the ability to declare analytical similarity. Also explored is the impact of correlation among drug product lots on Type I error rate and power. Three methods based on generalized pivotal quantities are introduced, and their performance is compared against a two one-sided tests (TOST) approach. Finally, strategies to mitigate the risk of correlation among the reference product lots are discussed.
A biosimilar is a generic version of the original biological drug product. A key component of a biosimilar development is the demonstration of analytical similarity between the biosimilar and the reference product. Such demonstration relies on application of statistical methods to establish a similarity margin and appropriate test for equivalence between the two products. This paper discusses statistical issues with demonstration of analytical similarity and provides alternate approaches to potentially mitigate these problems. © PDA, Inc. 2016.
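The Tier 2 rule described above (at least ~90% of test lots within X̄R ± K·σR) can be sketched as follows. The sketch uses the sample standard deviation SR in place of σR, which is exactly the substitution the paper critiques; the potency values, K = 3, and the 90% threshold are illustrative assumptions, not regulatory values for any actual product:

```python
import statistics

def tier2_quality_range(ref_lots, K=3.0):
    """Quality range X_bar_R +/- K * S_R, with the sample SD S_R
    standing in for the unknown sigma_R."""
    x_bar = statistics.mean(ref_lots)
    s_r = statistics.stdev(ref_lots)
    return x_bar - K * s_r, x_bar + K * s_r

def tier2_pass(test_lots, ref_lots, K=3.0, min_fraction=0.9):
    """True if at least min_fraction of test lots fall in the quality range."""
    lo, hi = tier2_quality_range(ref_lots, K)
    inside = sum(lo <= x <= hi for x in test_lots)
    return inside / len(test_lots) >= min_fraction

# Hypothetical potency values (% of label claim)
ref = [98.2, 99.1, 100.4, 101.0, 99.6, 100.8, 98.9, 100.1]
test = [99.0, 100.2, 99.8, 101.1, 98.5, 100.0]
```

Per the paper's argument, if the reference lots were positively correlated, SR would shrink, the range would narrow, and a truly similar test product could fail this check more often than the nominal rate suggests.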
Kim, Kon Hee; Hwang, Eun Hee
2017-01-01
The purpose of the present study was to compare sleep quality, depression, and life satisfaction between nursing home and long-term care hospital residents. Data were collected through a structured questionnaire survey of 61 nursing home residents and 74 long-term care hospital residents. Descriptive statistics, t-tests, χ2-tests, ANOVA, and Pearson's correlation were used to analyze the data. Residents living in a nursing home showed higher subjective health status and sleep quality than long-term care hospital residents. Depression did not differ significantly between the two groups; however, there was a significant difference in depression score by subjective health status. Sleep quality and depression showed a significant negative correlation for both groups of residents. In terms of depression and life satisfaction, nursing home residents showed a significant negative correlation, and long-term care hospital residents showed a significant positive correlation. These results show that environmental management is essential to enhance sleep quality, which in turn should improve depression and subjective health status. Geriatr Gerontol Int 2017; 17: 142-149. © 2015 Japan Geriatrics Society.
The influence of stigma on the quality of life for prostate cancer survivors.
Wood, Andrew W; Barden, Sejal; Terk, Mitchell; Cesaretti, Jamie
2017-01-01
The purpose of the present study was to investigate the influence of stigma on prostate cancer (PCa) survivors' quality of life. Stigma for lung cancer survivors has been the focus of considerable research (Else-Quest & Jackson, 2014); however, gaps remain in understanding the experience of PCa stigma. A cross-sectional correlational study was designed to assess the incidence of PCa stigma and its influence on the quality of life of survivors. Eighty-five PCa survivors were administered survey packets consisting of a stigma measure, a PCa-specific quality of life measure, and a demographic survey during treatment of their disease. A linear regression analysis was conducted with the data received from PCa survivors. Results indicated that PCa stigma has a significant, negative influence on the quality of life of survivors (R² = 0.33, F(4, 80) = 11.53, p < 0.001). There were no statistically significant differences in PCa stigma based on demographic variables (e.g., race and age). Implications for physical and mental health practitioners and researchers are discussed.
Dodge, Kent A.; Hornberger, Michelle I.; Turner, Matthew A.
2017-01-19
Water, bed sediment, and biota were sampled in selected streams from Butte to near Missoula, Montana, as part of a monitoring program in the upper Clark Fork Basin of western Montana. The sampling program was led by the U.S. Geological Survey, in cooperation with the U.S. Environmental Protection Agency, to characterize aquatic resources in the Clark Fork Basin, with emphasis on trace elements associated with historic mining and smelting activities. Sampling sites were located on the Clark Fork and selected tributaries. Water samples were collected periodically at 20 sites from October 2014 through September 2015. Bed-sediment and biota samples were collected once at 13 sites during August 2015. This report presents the analytical results and quality-assurance data for water-quality, bed-sediment, and biota samples collected at sites from October 2014 through September 2015. Water-quality data include concentrations of selected major ions, trace elements, and suspended sediment. At 12 sites, samples for analysis of dissolved organic carbon and turbidity were collected. In addition, samples for analysis of nitrogen (nitrate plus nitrite) were collected at two sites. Daily values of mean suspended-sediment concentration and suspended-sediment discharge were determined for three sites. Seasonal daily values of turbidity were determined for four sites. Bed-sediment data include trace-element concentrations in the fine-grained fraction. Biological data include trace-element concentrations in whole-body tissue of aquatic benthic insects. Statistical summaries of water-quality, bed-sediment, and biological data for sites in the upper Clark Fork Basin are provided for the period of record.
Dodge, Kent A.; Hornberger, Michelle I.; Dyke, Jessica
2013-01-01
Water, bed sediment, and biota were sampled in streams from Butte to near Missoula, Montana, as part of a monitoring program in the upper Clark Fork basin of western Montana; additional water samples were collected from near Galen to near Missoula at select sites as part of a supplemental sampling program. The sampling program was conducted by the U.S. Geological Survey in cooperation with the U.S. Environmental Protection Agency to characterize aquatic resources in the Clark Fork basin, with emphasis on trace elements associated with historic mining and smelting activities. Sampling sites were located on the Clark Fork and selected tributaries. Water samples were collected periodically at 20 sites from October 2010 through September 2011. Bed-sediment and biota samples were collected once at 14 sites during August 2011. This report presents the analytical results and quality-assurance data for water-quality, bed-sediment, and biota samples collected at sites from October 2010 through September 2011. Water-quality data include concentrations of selected major ions, trace elements, and suspended sediment. Turbidity was analyzed for water samples collected at the four sites where seasonal daily values of turbidity were being determined. Daily values of suspended-sediment concentration and suspended-sediment discharge were determined for four sites. Bed-sediment data include trace-element concentrations in the fine-grained fraction. Biological data include trace-element concentrations in whole-body tissue of aquatic benthic insects. Statistical summaries of water-quality, bed-sediment, and biological data for sites in the upper Clark Fork basin are provided for the period of record since 1985.
Tsai, William; Lu, Qian
2017-10-01
Ambivalence over emotional expression (AEE) is the inner conflict between desiring to express emotion and fearing the consequences of emotional expression. Few studies to date have examined the effects of AEE within an ethnic group that prioritizes emotional self-control. The present study examined the associations between AEE and well-being (viz., quality of life and depressive symptoms) as a function of acculturation among a sample of Chinese American breast cancer survivors. Ninety-six Chinese breast cancer survivors (mean age = 54.64 years, SD = 7.98) were recruited from Southern California. Participants filled out a paper-and-pencil questionnaire containing the Ambivalence over Emotional Expression Questionnaire (AEQ), the Functional Assessment of Cancer Therapy-Breast (FACT-B), and the Center for Epidemiologic Studies Depression Scale-Short Form (CESD-10). Acculturation was a statistically significant moderator of the relation between AEE and depressive symptoms, and a marginally significant moderator of the relation between AEE and quality of life. Simple slopes revealed that AEE was negatively associated with quality of life (B = -.45, p < .001) and positively associated with depressive symptoms (B = .20, p < .001) for women with high acculturation, but not associated with either for women with low acculturation (Bs = -.15 and .04, ps > .05, for quality of life and depressive symptoms, respectively). These results suggest that less acculturated Chinese breast cancer survivors are protected by Chinese cultural values of emotional self-control and restraint, and thus do not experience the detrimental effects of AEE on their depressive symptoms and quality of life. Implications are discussed.
Dodge, Kent A.; Hornberger, Michelle I.; Turner, Matthew A.
2018-03-30
Water, bed sediment, and biota were sampled in selected streams from Butte to near Missoula, Montana, as part of a monitoring program in the upper Clark Fork Basin of western Montana. The sampling program was led by the U.S. Geological Survey, in cooperation with the U.S. Environmental Protection Agency, to characterize aquatic resources in the Clark Fork Basin, with emphasis on trace elements associated with historic mining and smelting activities. Sampling sites were on the Clark Fork and selected tributaries. Water samples were collected periodically at 20 sites from October 2015 through September 2016. Bed-sediment and biota samples were collected once at 13 sites during August 2016. This report presents the analytical results and quality-assurance data for water-quality, bed-sediment, and biota samples collected at sites from October 2015 through September 2016. Water-quality data include concentrations of selected major ions, trace elements, and suspended sediment. Samples for analysis of turbidity were collected at 13 sites, whereas samples for analysis of dissolved organic carbon were collected at 10 sites. In addition, samples for analysis of nitrogen (nitrate plus nitrite) were collected at two sites. Daily values of mean suspended-sediment concentration and suspended-sediment discharge were determined for three sites. Seasonal daily values of turbidity were determined for five sites. Bed-sediment data include trace-element concentrations in the fine-grained (less than 0.063 millimeter) fraction. Biological data include trace-element concentrations in whole-body tissue of aquatic benthic insects. Statistical summaries of water-quality, bed-sediment, and biological data for sites in the upper Clark Fork Basin are provided for the period of record.
Chukmaitov, Askar; Harless, David W; Bazzoli, Gloria J; Carretta, Henry J; Siangphoe, Umaporn
2015-01-01
Implementation of accountable care organizations (ACOs) is currently underway, but there is limited empirical evidence on the merits of the ACO model. The aim was to study the associations of delivery system characteristics and ACO competencies (centralization strategies to manage organizations, hospital integration with physicians and outpatient facilities, health information technology, and infrastructure to monitor community health and report quality) with risk-adjusted 30-day all-cause mortality and case-mix-adjusted inpatient costs for the Medicare population. Panel data (2006-2009) were assembled for Florida from multiple sources: inpatient hospital discharge records, vital statistics, the American Hospital Association, the Healthcare Information and Management Systems Society, and other databases. We applied a panel study design, controlling for hospital and market characteristics. Hospitals that were in centralized health systems or became more centralized over the study period had significantly larger reductions in mortality compared with hospitals that remained freestanding. Surprisingly, tightly integrated hospital-physician arrangements were associated with increased mortality; as such, hospitals may wish to proceed cautiously when developing specific types of alignment with local physician organizations. We observed no statistically significant differences in the growth rate of costs across hospitals in any of the health systems studied relative to freestanding hospitals. Although we observed quality improvement in some organizational types, these outcome improvements were not coupled with the additional desired objective of lower cost growth. This implies that additional changes not present during our study period, potentially changes in provider payment approaches, are essential for achieving the ACO objectives of higher quality of care at lower costs.
Provider organizations implementing ACOs should consider centralizing service delivery as a viable strategy to improve quality of care, although the strategy did not result in lower cost growth.
Alzarea, Bader K
2016-04-01
Peri-implant tissue health is a requisite for the success of dental implant therapy. Plaque accumulation leads to the initiation of gingivitis around natural teeth and peri-implantitis around dental implants. Peri-implantitis may result in implant failure, so timely assessment of the dental implant site is mandatory for long-term success. The aim was to assess the health of peri-implant tissues and to evaluate the Oral Health-Related Quality of Life (OHRQoL) of individuals with dental implants using the Oral Health Impact Profile (OHIP-14). A total of 92 patients were evaluated for the health of peri-implant tissues by recording the Plaque Index (PI), Probing Pocket Depth (PD), Bleeding on Probing (BOP), and Probing Attachment Level (PAL), compared with contralateral natural teeth (control). In the same patients, quality of life was assessed using the OHIP-14. The mean plaque index around natural teeth was higher than around implants, and the difference was statistically significant. For the other three parameters (mean bleeding on probing, mean probing attachment level, and mean pocket depth), differences between natural teeth and implant surfaces were not statistically significant. The OHIP-14 revealed that patients with dental implants were satisfied with their OHRQoL. Similar inflammatory conditions are present around both natural teeth and implant prostheses, as suggested by the results for mean plaque index, mean bleeding on probing, mean pocket depth, and mean probing attachment level, reinforcing the importance of periodontal health maintenance both before and after placement of dental implants. The influence of implant prostheses on patients' oral health-related quality of life (as depicted by the OHIP-14), together with patients' perceptions and expectations, may guide the clinician in providing the best implant services.
Statistical process control: a practical application for hospitals.
VanderVeen, L M
1992-01-01
A six-step plan based on statistical methods was designed to improve quality in the central processing and distribution department of a 223-bed hospital in Oakland, CA. This article describes how the plan was implemented sequentially, starting with the crucial first step of obtaining administrative support. The QI project succeeded in overcoming beginners' fear of statistics and in training both managers and staff to use inspection checklists, Pareto charts, cause-and-effect diagrams, and control charts. The best outcome of the program was the increased commitment to quality improvement by the members of the department.
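Among the tools listed, a Pareto analysis is the simplest to reproduce: tally defect categories by frequency and report cumulative shares so the "vital few" causes stand out. A standard-library sketch with made-up defect labels (the article does not publish its department's defect data):

```python
from collections import Counter

def pareto_table(defects):
    """Defect categories sorted by frequency, with cumulative share of total."""
    counts = Counter(defects).most_common()
    total = sum(n for _, n in counts)
    rows, cum = [], 0
    for category, n in counts:
        cum += n
        rows.append((category, n, cum / total))
    return rows

# Hypothetical defect log from an inspection checklist
log = ["mislabeled", "missing item", "mislabeled", "late delivery", "mislabeled"]
table = pareto_table(log)
# table[0] -> ("mislabeled", 3, 0.6)
```

Plotting the counts as bars with the cumulative share as a line gives the familiar Pareto chart used to pick which defect to attack first.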
Petroleum supply monthly, February 1991. [Glossary included]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-02-01
Data presented in the Petroleum Supply Monthly (PSM) describe the supply and disposition of petroleum products in the United States and major US geographic regions. The data series describe production, imports and exports, inter-Petroleum Administration for Defense (PAD) District movements, and inventories by the primary suppliers of petroleum products in the United States (50 States and the District of Columbia). The reporting universe includes those petroleum sectors in Primary Supply. Included are: petroleum refiners, motor gasoline blenders, operators of natural gas processing plants and fractionators, inter-PAD transporters, importers, and major inventory holders of petroleum products and crude oil. When aggregated, the data reported by these sectors approximately represent the consumption of petroleum products in the United States. Data presented in the PSM are divided into two sections: (1) the Summary Statistics and (2) the Detailed Statistics. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements, and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary. 12 figs., 54 tabs.
A Simple and Practical Index to Measure Dementia-Related Quality of Life.
Arons, Alexander M M; Schölzel-Dorenbos, Carla J M; Olde Rikkert, Marcel G M; Krabbe, Paul F M
2016-01-01
Research on new treatments for dementia is gaining pace worldwide in an effort to alleviate this growing health care problem. The optimal evaluation of such interventions, however, calls for a practical and credible patient-reported outcome measure. To describe the refinement of the Dementia Quality-of-life Instrument (DQI) and present its revised version. A prototype of the DQI was adapted to cover a broader range of health-related quality of life (HRQOL) and to improve consistency in the descriptions of its domains. A valuation study was then conducted to assign meaningful numbers to all DQI health states. Pairs of DQI states were presented to a sample of professionals working with people with dementia and a representative sample of the Dutch population. They had to repeatedly select the best DQI state, and their responses were statistically modeled to obtain values for each health state. In total, 207 professionals working with people with dementia and 631 members of the general population completed the paired comparison tasks. Statistically significant differences between the two samples were found for the domains of social functioning, mood, and memory. Severe problems with physical health and severe memory problems were deemed most important by the general population. In contrast, severe mood problems were considered most important by professionals working with people with dementia. The DQI is a simple and feasible measurement instrument that expresses the overall HRQOL of people suffering from dementia in a single meaningful number. Current results suggest that revisiting the discussion of using values from the general population might be warranted in the dementia context. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
McLean, Andrew; Lawlor, Jenine; Mitchell, Rob; Kault, David; O'Kane, Carl; Lees, Michelle
2015-02-01
To evaluate the impact of More Learning for Interns in Emergency (MoLIE) on clinical documentation in the ED of a large regional hospital. MoLIE was implemented at The Townsville Hospital (TTH) in 2010, and has since provided ED interns with structured off-floor teaching and a dedicated clinical supervisor. A pre- and post-intervention study was conducted using retrospective medical record review methodology. Charts were selected by identifying all TTH ED patients seen by interns in the period 2008-2011. Two hundred pre-intervention records (2008-2009) and 200 post-intervention records (2010-2011) were reviewed. These were randomly selected following an initial screen by an ED staff specialist. The quality of clinical documentation for five common ED presentations (asthma, chest pain, lacerations, abdominal pain and upper limb fractures) was assessed. For each presentation, documentation quality was scored out of 10 using predefined criteria. An improvement of two or more was thought to be clinically significant. Mean scores for each group were compared using a Student's t-test for independent samples. Mean documentation scores (and 95% confidence intervals) were 5.55 (5.17-5.93) in 2008, 5.42 (4.98-5.86) in 2009, 6.37 (5.99-6.75) in 2010 and 6.08 (5.71-6.45) in 2011. There was a statistically but not clinically significant improvement in scores pre- and post-intervention (P ≤ 0.001). The introduction of MoLIE was associated with a small but statistically significant improvement in documentation, despite an 80% increase in intern placements. These results suggest that structured training programmes have potential to improve intern performance while simultaneously enhancing training capacity. The impact on quality of care requires further evaluation. © 2015 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
Noordin, Mohamed I; Chung, L Y
2004-01-01
This study adopts Differential Scanning Calorimetry (DSC) to analyze the thermal properties of samples (2.5-4.0 mg) from the tip, middle, and base sections of individual paracetamol suppositories, which were sampled carefully using a stainless steel scalpel. The contents of paracetamol present in the samples obtained from these sections were determined from the enthalpies of fusion of paracetamol and expressed as % w/w paracetamol to allow comparison of the amount of paracetamol found in each section. The tip, middle, and base sections contained 10.1+/-0.2%, 10.1+/-0.2%, and 10.3+/-0.2% w/w paracetamol, and are statistically similar (one-way ANOVA; p>0.05). This indicates that the preparation technique adopted produces high-quality suppositories in terms of content uniformity. The contents of paracetamol in the 120-mg paracetamol suppositories determined by DSC and UV spectrophotometry were statistically equivalent (Student's t-test; p>0.05), 120.8+/-2.6 mg and 120.8+/-1.5 mg, respectively, making DSC a clear alternative method for the measurement of drug content in suppositories. The main advantages of the method are that samples of only 2.5-4.0 mg are required and the procedure does not require an extraction process, which allows the analysis to be completed rapidly. In addition, it is highly sensitive and reproducible, with a lower detection limit of 4.0% w/w paracetamol, which is about 2.5 times lower than the content of paracetamol (10% w/w) present in our 120-mg paracetamol suppositories and in commercial paracetamol suppositories, which contained about 125 mg paracetamol. Therefore, this method is particularly suited for the determination of content uniformity in individual suppositories in quality control (QC) and in-process quality control (PQC).
Odor measurements according to EN 13725: A statistical analysis of variance components
NASA Astrophysics Data System (ADS)
Klarenbeek, Johannes V.; Ogink, Nico W. M.; van der Voet, Hilko
2014-04-01
In Europe, dynamic olfactometry, as described by the European standard EN 13725, has become the preferred method for evaluating odor emissions emanating from industrial and agricultural sources. Key elements of this standard are the quality criteria for trueness and precision (repeatability). Both are linked to standard values of n-butanol in nitrogen. It is assumed in this standard that whenever a laboratory complies with the overall sensory quality criteria for n-butanol, the quality level is transferable to other, environmental, odors. Although olfactometry is well established, little has been done to investigate interlaboratory variance (reproducibility). Therefore, the objective of this study was to estimate the reproducibility of odor laboratories complying with EN 13725 as well as to investigate the transferability of n-butanol quality criteria to other odorants. Based upon the statistical analysis of 412 odor measurements on 33 sources, distributed in 10 proficiency tests, it was established that laboratory, panel and panel session are components of variance that differ significantly between n-butanol and other odorants (α = 0.05). This finding does not support the transferability of the quality criteria, as determined on n-butanol, to other odorants and as such is a cause for reconsideration of the present single reference odorant as laid down in EN 13725. For non-butanol odorants, the repeatability standard deviation (sr) and reproducibility standard deviation (sR) were calculated to be 0.108 and 0.282, respectively (log base-10). The latter implies that the difference between two consecutive single measurements, performed on the same testing material by two or more laboratories under reproducibility conditions, will not be larger than a factor of 6.3 in 95% of cases. As far as n-butanol odorants are concerned, it was found that the present repeatability standard deviation (sr = 0.108) compares favorably to that of EN 13725 (sr = 0.172).
It is therefore suggested that the repeatability limit (r), as laid down in EN 13725, can be reduced from r ≤ 0.477 to r ≤ 0.31.
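As a sketch of the arithmetic linking these log base-10 standard deviations to the quoted limits (assuming the conventional precision limit of z·√2·s on the log scale with z ≈ 2, which reproduces the figures above; consult EN 13725 itself for the exact formula):

```python
import math

# Reproducibility: convert the log10-scale standard deviation sR into the
# maximum expected ratio between two single measurements made by different
# laboratories on the same material. Assumption: limit = 2 * sqrt(2) * s
# on the log10 scale (z ~ 2 convention), then back-transform.
s_R = 0.282                              # reproducibility std. dev. (log10)
R_log10 = 2 * math.sqrt(2) * s_R         # reproducibility limit, log10 scale
factor = 10 ** R_log10                   # ratio between two measurements

# Repeatability: the same conversion applied to sr gives the suggested
# repeatability limit r quoted above.
s_r = 0.108                              # repeatability std. dev. (log10)
r_limit = 2 * math.sqrt(2) * s_r         # ~0.31 on the log10 scale

print(round(factor, 1))   # ~6.3, the factor quoted in the abstract
print(round(r_limit, 2))  # ~0.31, the suggested repeatability limit
```

Both reported values (the factor of 6.3 and the proposed limit r ≤ 0.31) fall out of the same conversion, which supports the assumed convention.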
The RCP Information Laboratory (iLab): breaking the cycle of poor data quality.
Croft, Giles P; Williams, John G
2005-01-01
A review of data quality in the NHS by the Audit Commission cited a lack of clinician involvement in the validation and use of centrally held activity data as one of the key issues to resolve. The perception that hospital episode statistics cannot support the needs of the individual clinician results in mistrust and disinterest. This in turn leads to under-development of such data from a clinical perspective, and the cycle continues. The RCP Information Laboratory (iLab) aims to address this problem by accessing, analysing and presenting information from these central repositories concerning the activity of visiting individual consultant physicians. With support from iLab staff--an information analyst and a clinician--local data quality issues are highlighted and local solutions sought. The information obtained can be used as an objective measure of activity to support the processes of appraisal and revalidation.
Colman, John A.; Sanzolone, R.F.
1991-01-01
Geochemical data are presented from a synoptic survey of 46 elements in fine-fraction streambed sediments of the Upper Illinois River Basin during the fall of 1987. The survey was a component study of the Illinois pilot project of the U.S. Geological Survey's National Water-Quality Assessment program. Most of the sampling sites were randomly chosen--135 on main stems of rivers and 238 on first- and second-order streams. In addition, 196 samples were collected for quality-assurance and special-study purposes. The report includes element concentration data and summary-statistics tables of percentiles, nested analysis of variance, and correlation coefficients. All concentration data are included in tabular form and can be selected by map reference number, latitude and longitude, or remark code indicating purpose for collecting sample.
The development of the Pictorial Thai Quality of Life.
Phattharayuttawat, Sucheera; Ngamthipwatthana, Thienchai; Pitiyawaranun, Buncha
2005-11-01
"Quality of life" has become a main focus of interest in medicine. The Pictorial Thai Quality of Life (PTQL) was developed to measure quality of life among Thai people with mental illness in both clinical and community settings. The purpose of this study was to develop the PTQL with adequate construct validity, discriminant power, concurrent validity, and reliability. Two sample groups were used in the present study: (1) a pilot-study group of 30 participants and (2) a survey group of 672 participants consisting of normal subjects and psychiatric patients. The test items were drawn from a review of the literature, with all items based on the WHO definition of quality of life. Expert judgment using the Delphi technique was applied in the first stage. A pilot study was then used to evaluate the test administration and the wording of the test items. In the final stage, data were collected from the survey samples. The results showed that the final test comprised 25 items. Its construct validity consists of six domains: Physical, Cognitive, Affective, Social Function, Economic and Self-Esteem. All PTQL items have sufficient discriminant power; differences between people with mental disorders and normal people were statistically significant at the .001 level. Concurrent validity with the WHOQOL-BREF was high; the Pearson correlation coefficient and the area under the ROC curve were 0.92 and 0.97, respectively. The alpha reliability coefficient for the PTQL total test was 0.88, and the values for the six scales ranged from 0.81 to 0.91. The present study was directed at developing a pictorial quality-of-life questionnaire with sound psychometric properties. The resulting instrument offers a more direct and meaningful means of detecting poor quality of life among people with mental illness in Thai communities.
Writing Quality in Chinese Children: Speed and Fluency Matter
Yan, Cathy Ming Wai; McBride-Chang, Catherine; Wagner, Richard K.; Zhang, Juan; Wong, Anita M. Y.; Shu, Hua
2015-01-01
There were two goals of the present study. The first was to create a scoring scheme by which 9-year-old Chinese children’s writing compositions could be rated to form a total score for writing quality. The second was to examine cognitive correlates of writing quality at age 9 from measures administered at ages 6–9. Age 9 writing compositions were scored using a 7-element rubric; following confirmatory factor analyses, 5 of these elements were retained to represent overall writing quality for subsequent analyses. Measures of vocabulary knowledge, Chinese word dictation, phonological awareness, speed of processing, speeded naming, and handwriting fluency at ages 6–9 were all significantly associated with the obtained overall writing quality measure even when the statistical effect of age was removed. With vocabulary knowledge, dictation skill, age, gender, and phonological awareness included in a regression equation, 35% of the variance in age 9 writing quality was explained. With the variables of speed of processing, speeded naming, and handwriting fluency additionally included as a block, 12% additional variance in the equation was explained. In addition to gender, overall unique correlates of writing quality were dictation, speed of processing, and handwriting fluency, underscoring the importance of both general automaticity and specific writing fluency for writing quality development in children. PMID:25750486
ERIC Educational Resources Information Center
Garfield, Joan; delMas, Robert
2010-01-01
The Assessment Resource Tools for Improving Statistical Thinking (ARTIST) Web site was developed to provide high-quality assessment resources for faculty who teach statistics at the tertiary level, but the resources are also useful to statistics teachers at the secondary level. This article describes some of the numerous ARTIST resources and suggests…
Educational quality and the crisis of educational research
NASA Astrophysics Data System (ADS)
Heyneman, Stephen
1993-11-01
This paper was designed not as a research product but as a speech to comparative education colleagues. It argues that there is a crisis of educational quality in many parts of the world, and that there is a parallel crisis in the quality of educational research and statistics. Compared to other major public responsibilities in health, agriculture, population and family planning, educational statistics are poor and often getting worse. Our international and national statistical institutions are impoverished, and we as a profession have been part of the problem. We have been so busy arguing over differing research paradigms that we have not paid sufficient attention to our common professional responsibilities and common professional goals. The paper suggests that we, as professionals interested in comparative education issues, begin to act together more on these common and important issues.
Dietrich, Shellene K; Francis-Jimenez, Coleen M; Knibbs, Melida Delcina; Umali, Ismael L; Truglio-Londrigan, Marie
2016-09-01
Sleep health is essential for overall health, quality of life and safety. Researchers have found a reduction in the average hours of sleep among college students. Poor sleep has been associated with deficits in attention, reduction in academic performance, impaired driving, risk-taking behaviors, depression, impaired social relationships and poorer health. College students may have limited knowledge about sleep hygiene and the behaviors that support sleep health, which may lead to poor sleep hygiene behavior. To identify, appraise and synthesize the best available evidence on the effectiveness of sleep education programs in improving sleep hygiene knowledge, sleep hygiene behavior and/or sleep quality versus traditional strategies. All undergraduate or graduate college students, male or female, 18 years and older and of any culture or ethnicity. Formal sleep education programs that included a curriculum on sleep hygiene behavior. Educational delivery took place throughout the participants' college experience and included a variety of delivery methods. Randomized controlled trials (RCTs) and quasi-experimental studies. Sleep hygiene knowledge, sleep hygiene behavior and/or sleep quality. Literature including published and unpublished studies in the English language from January 1, 1980 through August 17, 2015. A search of CINAHL, CENTRAL, EMBASE, Academic Search Complete, PsychINFO, Healthsource: Nursing/Academic edition, ProQuest Central, PubMed and ERIC was conducted using identified keywords and indexed terms. A gray literature search was also performed. Quantitative papers were assessed by two reviewers using critical appraisal instruments from the Joanna Briggs Institute Meta-Analysis of Statistics Assessment and Review Instrument (JBI-MAStARI). Data were extracted using the JBI-MAStARI data extraction tool. Data extracted included interventions, populations, study methods and outcomes of significance to the review question and objectives.
Meta-analysis was not possible due to the limited number of studies and the variability of designs and interventions; therefore, results are presented in narrative form. This systematic review yielded three RCTs and one quasi-experimental study for inclusion. Two studies reported outcomes on sleep hygiene knowledge; one showed a statistically significant improvement (P = 0.025) and the other reported no difference (test of significance not provided). Two studies reported on sleep hygiene behavior; one showed no difference (P > 0.05) and the other reported a statistically significant improvement (P = 0.0001). Four studies reported on sleep quality; three reported no difference (P > 0.05) and the other reported a statistically significant improvement (P = 0.017). This review identified insufficient evidence to determine the effectiveness of sleep education on sleep hygiene knowledge, sleep hygiene behavior or sleep quality in this population.
Ebrahimi, Milad; Gerber, Erin L; Rockaway, Thomas D
2017-05-15
For most water treatment plants, a significant number of performance data variables are recorded on a time-series basis. Due to the interconnectedness of the variables, it is often difficult to assess over-arching trends and quantify operational performance. The objective of this study was to establish simple and reliable predictive models to correlate target variables with specific measured parameters. This study presents a multivariate analysis of the physicochemical parameters of municipal wastewater. Fifteen quality and quantity parameters were analyzed using data recorded from 2010 to 2016. To determine the overall quality condition of raw and treated wastewater, a Wastewater Quality Index (WWQI) was developed. The index summarizes a large number of measured quality parameters into a single water quality term by considering pre-established quality limitation standards. To identify treatment process performance, the interdependencies between the variables were determined by using Principal Component Analysis (PCA). The five extracted components from the 15 variables accounted for 75.25% of total dataset information and adequately represented the organic, nutrient, oxygen demanding, and ion activity loadings of influent and effluent streams. The study also utilized the model to predict quality parameters such as Biological Oxygen Demand (BOD), Total Phosphorus (TP), and WWQI. High accuracies ranging from 71% to 97% were achieved for fitting the models with the training dataset and relative prediction percentage errors less than 9% were achieved for the testing dataset. The techniques and procedures presented in this paper provide an assessment framework for wastewater treatment monitoring programs. Copyright © 2017 Elsevier Ltd. All rights reserved.
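A minimal, self-contained sketch of the PCA step described above, using synthetic stand-ins for the study's 15 wastewater parameters (not the authors' data or pipeline), shows how the share of total variance explained by the first components is computed:

```python
import numpy as np

# Synthetic data: 300 observations of 15 "parameters" driven by a few
# hidden factors plus noise, mimicking the correlated structure of
# wastewater quality variables. Purely illustrative.
rng = np.random.default_rng(7)
latent = rng.normal(size=(300, 4))               # 4 hidden drivers
loadings = rng.normal(size=(4, 15))
X = latent @ loadings + 0.3 * rng.normal(size=(300, 15))

Xc = X - X.mean(axis=0)                          # center each parameter
cov = np.cov(Xc, rowvar=False)                   # 15 x 15 covariance
eigvals = np.linalg.eigvalsh(cov)[::-1]          # descending eigenvalues
explained = eigvals / eigvals.sum()              # variance ratio per component

top5 = explained[:5].sum()
print(f"first 5 components explain {top5:.1%} of total variance")
```

The quantity printed is the analogue of the study's "75.25% of total dataset information" figure: the cumulative explained-variance ratio of the retained components.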
Methodological quality of behavioural weight loss studies: a systematic review
Lemon, S. C.; Wang, M. L.; Haughton, C. F.; Estabrook, D. P.; Frisard, C. F.; Pagoto, S. L.
2018-01-01
This systematic review assessed the methodological quality of behavioural weight loss intervention studies conducted among adults and associations between quality and statistically significant weight loss outcome, strength of intervention effectiveness and sample size. Searches for trials published between January 2009 and December 2014 were conducted using PUBMED, MEDLINE and PSYCINFO and identified ninety studies. Methodological quality indicators included study design, anthropometric measurement approach, sample size calculations, intent-to-treat (ITT) analysis, loss to follow-up rate, missing data strategy, sampling strategy, report of treatment receipt and report of intervention fidelity (mean number of indicators met = 6.3). Indicators most commonly utilized included randomized design (100%), objectively measured anthropometrics (96.7%), ITT analysis (86.7%) and reporting treatment adherence (76.7%). Most studies (62.2%) had a follow-up rate >75% and reported a loss-to-follow-up analytic strategy or minimal missing data (69.9%). Describing intervention fidelity (34.4%) and sampling from a known population (41.1%) were least common. Methodological quality was not associated with reporting a statistically significant result, effect size or sample size. This review found the published literature of behavioural weight loss trials to be of high quality for specific indicators, including study design and measurement. Areas identified for improvement include the use of more rigorous statistical approaches to loss to follow-up and better fidelity reporting. PMID:27071775
Zvara, Bharathi J.; Mills-Koonce, W. Roger; Heilbron, Nicole; Clincy, Amanda; Cox, Martha J.
2015-01-01
The present study extends the spillover and crossover hypotheses to more carefully model the potential interdependence between parent–parent interaction quality and parent–child interaction quality in family systems. Using propensity score matching, the present study attempted to isolate family processes that are unique across African American and European American couples that are independent of other socio-demographic factors to further clarify how interparental relationships may be related to parenting in a rural, low-income sample. The Actor–Partner Interdependence Model (APIM), a statistical analysis technique that accounts for the interdependence of relationship data, was used with a sample of married and non-married cohabiting African American and European American couples (n = 82 dyads) to evaluate whether mothers' and fathers' observed parenting behaviours are related to their behaviours and their partner's behaviours observed in a couple problem-solving interaction. Findings revealed that interparental withdrawal behaviour, but not conflict behaviour, was associated with less optimal parenting for fathers but not mothers, and specifically so for African American fathers. Our findings support the notion of interdependence across subsystems within the family and suggest that African American fathers may be specifically responsive to variations in interparental relationship quality. PMID:26430390
Groundwater quality in the Upper Susquehanna River Basin, New York, 2009
Reddy, James E.; Risen, Amy J.
2012-01-01
Water samples were collected from 16 production wells and 14 private residential wells in the Upper Susquehanna River Basin from August through December 2009 and were analyzed to characterize the groundwater quality in the basin. Wells at 16 of the sites were completed in sand and gravel aquifers, and 14 were finished in bedrock aquifers. In 2004–2005, six of these wells were sampled in the first Upper Susquehanna River Basin study. Water samples from the 2009 study were analyzed for 10 physical properties and 137 constituents that included nutrients, organic carbon, major inorganic ions, trace elements, radionuclides, pesticides, volatile organic compounds, and 4 types of bacterial analyses. Results of the water-quality analyses are presented in tabular form for individual wells, and summary statistics for specific constituents are presented by aquifer type. The results are compared with Federal and New York State drinking-water standards, which typically are identical. The results indicate that groundwater generally is of acceptable quality, although concentrations of some constituents exceeded at least one drinking-water standard at 28 of the 30 wells. These constituents and properties include pH, sodium, aluminum, manganese, iron, arsenic, radon-222, residue on evaporation, and total and fecal coliform bacteria, including Escherichia coli, and heterotrophic plate count.
Relevance of the c-statistic when evaluating risk-adjustment models in surgery.
Merkow, Ryan P; Hall, Bruce L; Cohen, Mark E; Dimick, Justin B; Wang, Edward; Chow, Warren B; Ko, Clifford Y; Bilimoria, Karl Y
2012-05-01
The measurement of hospital quality based on outcomes requires risk adjustment. The c-statistic is a popular tool used to judge model performance, but can be limited, particularly when evaluating specific operations in focused populations. Our objectives were to examine the interpretation and relevance of the c-statistic when used in models with increasingly similar case mix and to consider an alternative perspective on model calibration based on a graphical depiction of model fit. From the American College of Surgeons National Surgical Quality Improvement Program (2008-2009), patients were identified who underwent a general surgery procedure, and procedure groups were increasingly restricted: colorectal-all, colorectal-elective cases only, and colorectal-elective cancer cases only. Mortality and serious morbidity outcomes were evaluated using logistic regression-based risk adjustment, and model c-statistics and calibration curves were used to compare model performance. During the study period, 323,427 general, 47,605 colorectal-all, 39,860 colorectal-elective, and 21,680 colorectal cancer patients were studied. Mortality ranged from 1.0% in general surgery to 4.1% in the colorectal-all group, and serious morbidity ranged from 3.9% in general surgery to 12.4% in the colorectal-all procedural group. As case mix was restricted, c-statistics progressively declined from the general to the colorectal cancer surgery cohorts for both mortality and serious morbidity (mortality: 0.949 to 0.866; serious morbidity: 0.861 to 0.668). Calibration was evaluated graphically by examining predicted vs observed number of events over risk deciles. For both mortality and serious morbidity, there was no qualitative difference in calibration identified between the procedure groups. 
In the present study, we demonstrate how the c-statistic can become less informative and, in certain circumstances, can lead to incorrect model-based conclusions as case mix is restricted and patients become more homogeneous. Although it remains an important tool, caution is advised when the c-statistic is advanced as the sole measure of model performance. Copyright © 2012 American College of Surgeons. All rights reserved.
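A toy simulation (synthetic data, not the NSQIP cohorts) illustrates the abstract's central point: the same risk score discriminates less well, and the c-statistic falls, when the population is restricted to a narrow, homogeneous case mix:

```python
import math
import random

random.seed(1)

def simulate(x_lo, x_hi, n=2000):
    """Draw a risk score x uniformly; outcome is Bernoulli(sigmoid(x))."""
    pairs = []
    for _ in range(n):
        x = random.uniform(x_lo, x_hi)
        y = 1 if random.random() < 1 / (1 + math.exp(-x)) else 0
        pairs.append((x, y))
    return pairs

def c_statistic(pairs):
    """Probability a random event outranks a random non-event (ties = 0.5)."""
    pos = [x for x, y in pairs if y == 1]
    neg = [x for x, y in pairs if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

c_broad = c_statistic(simulate(-4, 4))        # heterogeneous case mix
c_narrow = c_statistic(simulate(-0.5, 0.5))   # restricted, homogeneous mix

print(f"broad: {c_broad:.3f}  restricted: {c_narrow:.3f}")
```

The risk score is equally "correct" in both populations; only the spread of underlying risk changes, which is why the c-statistic alone can mislead when comparing models fit to increasingly restricted cohorts.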
Brown, Juliane B.
2008-01-01
Historical water-quality data in the National Park Service Southern Colorado Plateau Network have been collected irregularly and with little followup interpretation, restricting the value of the data. To help address these issues, to inform future water-quality monitoring planning efforts, and to address relevant National Park Service Inventory and Monitoring Program objectives, the U.S. Geological Survey, in cooperation with the National Park Service, compiled, reviewed, and summarized available historical water-quality data for 19 park units in the Southern Colorado Plateau Network. The data are described in terms of availability by major water-quality classes, park unit, site type, and selected identified water sources. The report also describes the geology, water resources, water-quality issues, data gaps, and water-quality standard exceedances identified in five of the park units determined to be of high priority. The five park units are Bandelier National Monument in New Mexico, Canyon de Chelly National Monument in Arizona, Chaco Culture National Historical Park in New Mexico, Glen Canyon National Recreation Area in Arizona and Utah, and Mesa Verde National Park in Colorado. Statistical summaries of water-quality characteristics are presented and considerations for future water-quality monitoring are provided for these five park units.
Sañudo-Fontaneda, Luis A; Charlesworth, Susanne M; Castro-Fresno, Daniel; Andres-Valeri, Valerio C A; Rodriguez-Hernandez, Jorge
2014-01-01
Pervious pavements have become one of the most used sustainable urban drainage system (SUDS) techniques in car parks. This research paper presents the results of monitoring water quality from several experimental car park areas designed and constructed in Spain with bays made of interlocking concrete block pavement, porous asphalt, polymer-modified porous concrete and reinforced grass with plastic and concrete cells. Moreover, two different sub-base materials were used (limestone aggregates and basic oxygen furnace slag). This study therefore encompasses the majority of the materials used as permeable surfaces and sub-base layers all over the world. Effluent from the test bays was monitored for dissolved oxygen, pH, electric conductivity, total suspended solids, turbidity and total petroleum hydrocarbons in order to analyze the behaviour shown by each combination of surface and sub-base materials. In addition, permeability tests were undertaken in all car parks using the 'Laboratorio Caminos Santander' permeameter and the Cantabrian Portable Infiltrometer. All results are presented together with the influence of surface and sub-base materials on water quality indicators using bivariate correlation statistical analysis at a confidence level of 95%. The polymer-modified porous concrete surface course in combination with limestone aggregate sub-base presented the best performance.
Li, Yan; Zhang, Ji; Jin, Hang; Liu, Honggao; Wang, Yuanzhong
2016-08-05
A quality assessment system composed of a tandem technique of ultraviolet (UV) spectroscopy and ultra-fast liquid chromatography (UFLC), aided by multivariate analysis, was presented for the determination of the geographic origin of Wolfiporia extensa collected from five regions in Yunnan Province of China. Characteristic UV spectroscopic fingerprints of samples were determined based on their methanol extracts. UFLC was applied for the determination of pachymic acid (a biomarker) present in individual test samples. The spectrum data matrix and the content of pachymic acid were integrated and analyzed by partial least squares discriminant analysis (PLS-DA) and hierarchical cluster analysis (HCA). The results showed that the chemical properties of samples were clearly dominated by the epidermis and inner part as well as by geographical origin. The relationships among samples obtained from these five regions have also been presented. Moreover, an interesting finding implied that geographical origin had a much greater influence on the chemical properties of the epidermis compared with those of the inner part. This study demonstrated that a rapid tool for accurate discrimination of W. extensa by UV spectroscopy and UFLC could be available for quality control of complicated medicinal mushrooms. Copyright © 2016 Elsevier B.V. All rights reserved.
Thirty Meter Telescope Site Testing V: Seeing and Isoplanatic Angle
NASA Astrophysics Data System (ADS)
Skidmore, Warren; Els, Sebastian; Travouillon, Tony; Riddle, Reed; Schöck, Matthias; Bustos, Edison; Seguel, Juan; Walker, David
2009-10-01
In this article we present an analysis of the statistical and temporal properties of seeing and isoplanatic angle measurements obtained with combined Differential Image Motion Monitor (DIMM) and Multi-Aperture Scintillation Sensor (MASS) units at the Thirty Meter Telescope (TMT) candidate sites. For each of the five candidate sites we obtained multiyear, high-cadence, high-quality seeing measurements. These data allow for a broad and detailed analysis, giving us a good understanding of the characteristics of each of the sites. The overall seeing statistics for the five candidate sites are presented, broken into total seeing (measured by the DIMM), free-atmosphere seeing and isoplanatic angle (measured by the MASS), and ground-layer seeing (difference between the total and free-atmosphere seeing). We examine the statistical distributions of seeing measurements and investigate annual and nightly behavior. The properties of the seeing measurements are discussed in terms of the geography and meteorological conditions at each site. The temporal variability of the seeing measurements over timescales of minutes to hours is derived for each site. We find that each of the TMT candidate sites has its own strengths and weaknesses when compared against the other candidate sites. The results presented in this article form part of the full set of results that are used for the TMT site-selection process. This is the fifth article in a series discussing the TMT site-testing project.
Water Resources Data - New Jersey, Water Year 1999, Volume 3, Water-Quality Data
DeLuca, M.J.; Romanok, K.M.; Riskin, M.L.; Mattes, G.L.; Thomas, A.M.; Gray, B.J.
2000-01-01
Water-resources data for the 1999 water year for New Jersey are presented in three volumes and consist of records of stage, discharge, and water quality of streams; stage and contents of lakes and reservoirs; and water levels and water quality of ground water. Volume 3 contains a summary of surface- and ground-water hydrologic conditions for the 1999 water year; a listing of current water-resource projects in New Jersey; a bibliography of water-related reports, articles, and fact sheets for New Jersey completed by the Geological Survey in recent years; water-quality records of chemical analyses from 133 surface-water stations, 46 miscellaneous surface-water sites, 30 ground-water stations, and 41 miscellaneous ground-water sites; and records of daily statistics of temperature and other physical measurements from 17 continuous-monitoring stations. Locations of water-quality stations are shown in figures 11 and 17-20. Locations of miscellaneous water-quality sites are shown in figures 29-32 and 34. These data represent the part of the National Water Data System operated by the U.S. Geological Survey and cooperating Federal, State, and local agencies in New Jersey.
NASA Astrophysics Data System (ADS)
Osterman, G. B.; Neu, J. L.; Eldering, A.; Pinder, R. W.; Tang, Y.; McQueen, J.
2012-12-01
At night, ozone can be transported long distances above the surface inversion layer without chemical destruction or deposition. As the boundary layer breaks up in the morning, this nocturnal ozone can be mixed down to the surface and rapidly increase ozone concentrations at a rate that can rival chemical ozone production. Most regional scale models that are used for air quality forecasts and ozone source attribution do not adequately capture nighttime ozone concentrations and transport. We combine ozone profile data from the NASA Earth Observing System (EOS) Tropospheric Emission Spectrometer (TES) and other sensors, ozonesonde data collected during the INTEX Ozonesonde Network Study (IONS), EPA AirNow ground station ozone data, the Community Multi-Scale Air Quality (CMAQ) model, and the National Air Quality Forecast Capability (NAQFC) model to examine air quality events during August 2006. We present both aggregated statistics and case-study analyses that assess the relationship between the models' ability to reproduce surface air quality events and their ability to capture the vertical distribution of ozone both during the day and at night. We perform the comparisons looking at the geospatial dependence in the differences between the measurements and models under different surface ozone conditions.
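Aggregated model-measurement statistics of the kind described often start with mean bias and root-mean-square error between modeled and observed surface ozone. A small sketch with invented values (not CMAQ, NAQFC, or AirNow data):

```python
import math

def bias_and_rmse(model, obs):
    """Mean bias (model minus observation) and root-mean-square error."""
    n = len(obs)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    return bias, rmse

# Invented afternoon surface-ozone values (ppb): model vs. observed
bias, rmse = bias_and_rmse([62, 71, 80, 55], [60, 75, 78, 50])
```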
The concept of a composite perioperative quality index in kidney transplantation.
Taber, David J; McGillicuddy, John W; Bratton, Charles F; Lin, Angello; Chavin, Kenneth D; Baliga, Prabhakar K
2014-04-01
Public reporting of patient and graft outcomes in a national registry and close Centers for Medicare and Medicaid Services oversight have resulted in transplantation being a highly regulated surgical discipline. Despite this, transplantation surgery lacks comprehensive tracking and reporting of perioperative quality measures. Therefore, the aim of this study was to determine the association between a kidney transplantation center's perioperative quality benchmarking and graft and patient outcomes. This was an analysis of 2011 aggregate data compiled from 2 national datasets that track outcomes from member hospitals and transplantation centers. The transplantation centers included in this study were composed of accredited US kidney transplantation centers that report data through the national registry and are associate members of the University HealthSystem Consortium. A total of 16,811 kidney transplantations were performed at 236 centers in the United States in 2011, of which 10,241 (61%) from 93 centers were included in the analysis. Of the 6 perioperative quality indicators, 3 benchmarked metrics were significantly associated with a kidney transplantation center's underperformance: mean ICU length of stay (C-statistic 0.731; p = 0.002), 30-day readmissions (C-statistic 0.697; p = 0.012), and in-hospital complications (C-statistic 0.785; p = 0.001). The composite quality index strongly correlated with inadequate center performance (C-statistic 0.854; p < 0.001, R(2) = 0.349). The centers in the lowest quartile of the quality index performed 2,400 kidney transplantations in 2011, which led to 2,640 more hospital days, 4,560 more ICU days, 120 more postoperative complications, and 144 more patients with 30-day readmissions, when compared with centers in the 3 higher-quality quartiles. An objective index of a transplantation center's quality of perioperative care is significantly associated with patient and graft survival.
Copyright © 2014 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
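The C-statistics reported above are areas under ROC curves: the probability that a randomly chosen underperforming center scores higher on a metric than a randomly chosen adequately performing one. A pure-Python sketch with made-up composite-index scores (not registry data):

```python
def c_statistic(scores, labels):
    """C-statistic (area under the ROC curve): probability that a
    randomly chosen positive case (here, an underperforming center)
    receives a higher score than a randomly chosen negative case.
    Ties count as one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical composite quality indices; label 1 = underperforming center
auc = c_statistic([0.9, 0.8, 0.7, 0.4, 0.3, 0.2],
                  [1,   1,   0,   1,   0,   0])
```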
An introduction to statistical process control in research proteomics.
Bramwell, David
2013-12-16
Statistical process control is a well-established and respected method which provides a general-purpose, consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. Our aim is to introduce statistical process control as an objective strategy for quality control and to show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multivariate. QC is critical for clinical chemistry measurements, and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single-analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author.
Published by Elsevier B.V. All rights reserved.
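The control-rule idea advocated above can be made concrete with a basic Shewhart individuals chart: estimate a center line and 3-sigma limits from an in-control baseline run, then flag points outside the limits. A minimal sketch with invented QC-standard measurements (not data from the article):

```python
import statistics

def control_limits(baseline):
    """Center line and 3-sigma Shewhart limits estimated from an
    in-control baseline run (e.g. repeated QC-standard measurements)."""
    mean = statistics.fmean(baseline)
    sd = statistics.stdev(baseline)
    return mean, mean - 3 * sd, mean + 3 * sd

def out_of_control(values, center, lcl, ucl):
    """Indices of points violating the basic rule: outside 3 sigma."""
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]

# Invented spike-recovery percentages from a stable instrument
baseline = [98.2, 101.1, 99.5, 100.4, 99.0, 100.9, 99.8, 100.2]
center, lcl, ucl = control_limits(baseline)
flags = out_of_control([99.7, 100.3, 94.0, 100.1], center, lcl, ucl)
```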
NASA Technical Reports Server (NTRS)
Raiman, Laura B.
1992-01-01
Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization.
Effectiveness of Exercise on Functional Mobility in Adults with Cerebral Palsy: A Systematic Review
Lawrence, Hillary; Hills, Sara; Kline, Nicole; Weems, Kyra
2016-01-01
Purpose: We identified evidence evaluating the effect of exercise on functional mobility in adults (aged 18 y or older) with cerebral palsy (CP). Method: An exhaustive search was conducted using the electronic databases PubMed, MEDLINE, CINAHL, PsycINFO, SPORTDiscus, and Cochrane Database of Systematic Reviews from the earliest available evidence (1975) to the present (January 2016) for studies whose participants were ambulatory adults with CP receiving conservative treatment to address functional mobility limitations. Two independent reviewers agreed on the eligibility, inclusion, and level of evidence of each study. The Maastricht-Amsterdam List (MAL) was used to assess evidence quality. Results: Five of the six studies included were randomized controlled trials, and one was a pre–post case series. Interventions included whole-body vibration, treadmill training without body-weight support, rhythmic auditory stimulation, dynamic balance and gait activities, progressive resistance training, and interactive serious gaming for balance. All studies were considered high quality, as indicated by their MAL scores. Four studies showed no statistically significant difference and trivial effect sizes between the intervention and control groups. Rhythmic auditory stimulation and interactive serious gaming were found to produce statistically significant benefits for adults with CP. Conclusions: Evidence of the effect of exercise on functional mobility for ambulatory adults with CP is lacking. A need exists for quality research to determine the best interventions for adults with CP to maximize functional mobility. PMID:27904240
Does quality of drinking water matter in kidney stone disease: A study in West Bengal, India.
Mitra, Pubali; Pal, Dilip Kumar; Das, Madhusudan
2018-05-01
The combined interaction of epidemiology, environmental exposure, dietary habits, and genetic factors causes kidney stone disease (KSD), a common public health problem worldwide. Because a high water intake (>3 L daily) is widely recommended by physicians to prevent KSD, the present study evaluated whether the quantity of water that people consume daily is associated with KSD and whether the quality of drinking water has any effect on disease prevalence. Information regarding residential address, daily volume of water consumption, and source of drinking water was collected from 1,266 patients with kidney stones in West Bengal, India. Drinking water was collected by use of proper methods from case (high stone prevalence) and control (zero stone prevalence) areas thrice yearly. Water samples were analyzed for pH, alkalinity, hardness, total dissolved solids, electrical conductivity, and salinity. Average values of the studied parameters were compared to determine whether there were any statistically significant differences between the case and control areas. We observed that as many as 53.6% of the patients consumed <3 L of water daily. Analysis of drinking water samples from case and control areas, however, did not show any statistically significant alterations in the studied parameters. All water samples were found to be suitable for consumption. It is not the quality of water, rather the quantity of water consumed that matters most in the occurrence of KSD.
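Comparing average parameter values between case and control areas for statistical significance can be done with, for example, a Welch two-sample t-test; the abstract does not name the exact test used, so the following is a hedged sketch with toy hardness values, not study data:

```python
import statistics

def welch_t(a, b):
    """Welch two-sample t statistic and Welch-Satterthwaite degrees of
    freedom for comparing a water-quality parameter between case and
    control areas without assuming equal variances."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    sa = statistics.variance(a) / len(a)
    sb = statistics.variance(b) / len(b)
    t = (ma - mb) / (sa + sb) ** 0.5
    df = (sa + sb) ** 2 / (sa ** 2 / (len(a) - 1) + sb ** 2 / (len(b) - 1))
    return t, df

# Toy hardness values (mg/L as CaCO3) for case vs. control areas
t, df = welch_t([182, 190, 175, 188, 181], [179, 186, 184, 177, 189])
```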
Kamioka, Hiroharu; Tsutani, Kiichiro; Okuizumi, Hiroyasu; Mutoh, Yoshiteru; Ohta, Miho; Handa, Shuichi; Okada, Shinpei; Kitayuguchi, Jun; Kamada, Masamitsu; Shiozawa, Nobuyoshi; Honda, Takuya
2010-01-01
Background The objective of this review was to summarize findings on aquatic exercise and balneotherapy and to assess the quality of systematic reviews based on randomized controlled trials. Methods Studies were eligible if they were systematic reviews based on randomized clinical trials (with or without a meta-analysis) that included at least 1 treatment group that received aquatic exercise or balneotherapy. We searched the following databases: Cochrane Database Systematic Review, MEDLINE, CINAHL, Web of Science, JDream II, and Ichushi-Web for articles published from the year 1990 to August 17, 2008. Results We found evidence that aquatic exercise had small but statistically significant effects on pain relief and related outcome measures of locomotor diseases (eg, arthritis, rheumatoid diseases, and low back pain). However, long-term effectiveness was unclear. Because evidence was lacking due to the poor methodological quality of balneotherapy studies, we were unable to make any conclusions on the effects of intervention. There were frequent flaws regarding the description of excluded RCTs and the assessment of publication bias in several trials. Two of the present authors independently assessed the quality of articles using the AMSTAR checklist. Conclusions Aquatic exercise had a small but statistically significant short-term effect on locomotor diseases. However, the effectiveness of balneotherapy in curing disease or improving health remains unclear. PMID:19881230
Quality control for quantitative PCR based on amplification compatibility test.
Tichopad, Ales; Bar, Tzachi; Pecen, Ladislav; Kitchen, Robert R; Kubista, Mikael; Pfaffl, Michael W
2010-04-01
Quantitative PCR (qPCR) is a routinely used method for the accurate quantification of nucleic acids. Yet it may generate erroneous results if the amplification process is obscured by inhibition or by generation of aberrant side-products such as primer dimers. Several methods have been established to control for pre-processing performance that rely on the introduction of a co-amplified reference sequence; however, there is currently no method that allows reliable control of the amplification process without directly modifying the sample mix. Herein we present a statistical approach based on multivariate analysis of the amplification response data generated in real time. The amplification trajectory in its most resolved and dynamic phase is fitted with a suitable model. Two parameters of this model, related to amplification efficiency, are then used to calculate Z-score statistics. Each studied sample is compared to a predefined reference set of reactions, typically calibration reactions. A probabilistic decision for each individual Z-score is then used to identify the majority of inhibited reactions in our experiments. We compare this approach to univariate methods that use only the sample-specific amplification efficiency as a reporter of compatibility, and demonstrate improved identification performance with the multivariate approach. Finally, we stress that the performance of the amplification compatibility test as a quality-control procedure depends on the quality of the reference set. Copyright 2010 Elsevier Inc. All rights reserved.
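The Z-score screening described can be sketched in its simplest, univariate form: score each reaction's efficiency parameter against a calibration reference set and flag outliers. The paper's actual method is multivariate over two model parameters; the toy efficiencies below are invented:

```python
import statistics

def z_scores(reference, samples):
    """Z-score of each sample's amplification-efficiency parameter
    against a reference (calibration) set; |z| above a cutoff flags a
    reaction as incompatible (e.g. inhibited)."""
    mu = statistics.fmean(reference)
    sd = statistics.stdev(reference)
    return [(s - mu) / sd for s in samples]

# Toy efficiencies: calibration reactions vs. two test reactions
ref = [0.93, 0.95, 0.94, 0.96, 0.94, 0.95]
z = z_scores(ref, [0.94, 0.80])
flagged = [i for i, v in enumerate(z) if abs(v) > 3]
```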
Dong, Yabing; Zhu, Yong; Ma, Chuan; Zhao, Huaqiang
2015-01-01
To determine whether combined steroid-antiviral treatment achieves better recovery in patients with Bell's palsy than treatment with steroids alone. We conducted an exhaustive search of PubMed/MEDLINE, Ovid, Elsevier search engines, and the Cochrane Library, collecting randomized controlled trials of the treatment of patients with Bell's palsy with steroid-antivirals and with steroids. The quality of the relevant articles was assessed by GRADE, which was used to present the overall quality of evidence as recommended by the Cochrane Handbook for Systematic Reviews of Interventions. Two investigators evaluated these papers independently and resolved disagreements by discussion. In all, 8 eligible papers (1816 patients: 896 treated with steroid-antivirals and 920 treated with steroids alone) matched the criteria. Based on the formal test for heterogeneity (chi(2) = 12.57, P = 0.08, I(2) = 44%), the fixed-effect meta-analysis model was chosen. Facial muscle recovery differed significantly between the steroid-antivirals group and the steroids-alone group (OR = 1.52, 95% CI: 1.20-1.94), whereas the difference in adverse effects was not statistically significant (OR = 1.28, 95% CI: 0.71-2.31). The present meta-analysis indicates that steroid-antiviral treatment improves the recovery rate in patients with Bell's palsy compared with steroids alone, and clinicians should consider steroid-antiviral therapy as an alternative choice for patients with Bell's palsy.
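A fixed-effect (inverse-variance) pooled odds ratio of the kind reported can be computed from 2x2 counts per trial. The counts below are invented for illustration and are not the trials in this meta-analysis:

```python
import math

def pooled_or(studies):
    """Fixed-effect (inverse-variance) pooled odds ratio from 2x2
    counts: (events_trt, n_trt, events_ctl, n_ctl) per study."""
    num = den = 0.0
    for et, nt, ec, nc in studies:
        log_or = math.log((et * (nc - ec)) / (ec * (nt - et)))
        var = 1 / et + 1 / (nt - et) + 1 / ec + 1 / (nc - ec)
        w = 1 / var  # weight = inverse variance of log OR
        num += w * log_or
        den += w
    return math.exp(num / den)

# Two invented trials: recovery counts for steroid-antivirals vs. steroids
studies = [(45, 60, 35, 60), (80, 100, 70, 100)]
or_pooled = pooled_or(studies)
```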
Hamel, Jean-Francois; Saulnier, Patrick; Pe, Madeline; Zikos, Efstathios; Musoro, Jammbe; Coens, Corneel; Bottomley, Andrew
2017-09-01
Over the last decades, Health-related Quality of Life (HRQoL) end-points have become an important outcome of the randomised controlled trials (RCTs). HRQoL methodology in RCTs has improved following international consensus recommendations. However, no international recommendations exist concerning the statistical analysis of such data. The aim of our study was to identify and characterise the quality of the statistical methods commonly used for analysing HRQoL data in cancer RCTs. Building on our recently published systematic review, we analysed a total of 33 published RCTs studying the HRQoL methods reported in RCTs since 1991. We focussed on the ability of the methods to deal with the three major problems commonly encountered when analysing HRQoL data: their multidimensional and longitudinal structure and the commonly high rate of missing data. All studies reported HRQoL being assessed repeatedly over time for a period ranging from 2 to 36 months. Missing data were common, with compliance rates ranging from 45% to 90%. From the 33 studies considered, 12 different statistical methods were identified. Twenty-nine studies analysed each of the questionnaire sub-dimensions without type I error adjustment. Thirteen studies repeated the HRQoL analysis at each assessment time again without type I error adjustment. Only 8 studies used methods suitable for repeated measurements. Our findings show a lack of consistency in statistical methods for analysing HRQoL data. Problems related to multiple comparisons were rarely considered leading to a high risk of false positive results. It is therefore critical that international recommendations for improving such statistical practices are developed. Copyright © 2017. Published by Elsevier Ltd.
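The type I error adjustment the authors find missing can be as simple as a Holm step-down correction across the sub-dimension tests. A sketch with made-up p-values:

```python
def holm_adjust(pvals):
    """Holm step-down adjusted p-values, controlling the family-wise
    type I error across multiple tests (e.g. one test per HRQoL
    sub-dimension or assessment time)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        running_max = max(running_max, (m - rank) * pvals[i])
        adjusted[i] = min(1.0, running_max)
    return adjusted

# Invented raw p-values from five HRQoL sub-dimension tests
adj = holm_adjust([0.01, 0.04, 0.03, 0.20, 0.45])
```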
Statistical study of air pollutant concentrations via generalized gamma distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marani, A.; Lavagnini, I.; Buttazzoni, C.
1986-11-01
This paper deals with modeling observed frequency distributions of air quality data measured in the area of Venice, Italy. The paper discusses the application of the generalized gamma distribution (ggd) which has not been commonly applied to air quality data notwithstanding the fact that it embodies most distribution models used for air quality analyses. The approach yields important simplifications for statistical analyses. A comparison among the ggd and other relevant models (standard gamma, Weibull, lognormal), carried out on daily sulfur dioxide concentrations in the area of Venice underlines the efficiency of ggd models in portraying experimental data.
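The generalized gamma density, f(x; a, d, p) = (p / a^d) x^(d-1) exp(-(x/a)^p) / Γ(d/p), reduces to the standard gamma for p = 1 and to the Weibull for d = p, which is why it subsumes most models applied to air-quality data. A direct transcription under the usual parameterisation (the paper's own notation may differ):

```python
import math

def ggd_pdf(x, a, d, p):
    """Generalized gamma density with scale a and shape parameters d, p.
    p = 1 recovers the standard gamma; d = p recovers the Weibull."""
    return (p / a**d) * x**(d - 1) * math.exp(-(x / a)**p) / math.gamma(d / p)

# Sanity check: a = d = p = 1 reduces to the exponential distribution
val = ggd_pdf(1.0, 1.0, 1.0, 1.0)  # equals exp(-1)
```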
Metrology: Calibration and measurement processes guidelines
NASA Technical Reports Server (NTRS)
Castrup, Howard T.; Eicke, Woodward G.; Hayes, Jerry L.; Mark, Alexander; Martin, Robert E.; Taylor, James L.
1994-01-01
The guide is intended as a resource to aid engineers and systems contractors in the design, implementation, and operation of metrology, calibration, and measurement systems, and to assist NASA personnel in the uniform evaluation of such systems supplied or operated by contractors. Methodologies and techniques acceptable in fulfilling metrology quality requirements for NASA programs are outlined. The measurement process is covered from a high level through more detailed discussions of key elements within the process. Emphasis is given to the flowdown of project requirements to measurement system requirements, then through the activities that will provide measurements with defined quality. In addition, innovations and techniques for error analysis, development of statistical measurement process control, optimization of calibration recall systems, and evaluation of measurement uncertainty are presented.
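The evaluation of measurement uncertainty mentioned above typically combines uncorrelated contributions in root-sum-square form per the law of propagation of uncertainty. A minimal sketch with an invented uncertainty budget:

```python
import math

def combined_standard_uncertainty(contributions):
    """Root-sum-square combination of uncorrelated contributions, each
    given as (sensitivity coefficient, standard uncertainty)."""
    return math.sqrt(sum((c * u) ** 2 for c, u in contributions))

# Invented budget: reference standard, repeatability, temperature effect
u_c = combined_standard_uncertainty([(1.0, 0.02), (1.0, 0.05), (0.5, 0.01)])
k = 2  # coverage factor for an approximately 95% expanded uncertainty
U = k * u_c
```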
Zelt, R.B.; Jordan, P.R.
1993-01-01
Among the first activities undertaken in each National Water-Quality Assessment (NAWQA) program study-unit investigation are compilation, screening, and statistical summary of available data concerning recent, general water-quality conditions in the study unit. This report (1) identifies which of the existing water-quality data are suitable for characterizing general conditions in a nationally consistent manner and (2) describes, to the extent possible, recent, general water-quality conditions in the Central Nebraska Basins. The study unit consists of the area drained by the Platte River between the confluence of the North Platte and South Platte Rivers near North Platte downstream to its confluence with the Missouri River south of Omaha. The report includes (1) a description of the sources and characteristics of water-quality data that are available, (2) a description of the approach used for screening data to identify a subset of the data suitable for summary and comparisons, (3) a presentation of the results of statistical and graphical summaries of recent, general water-quality conditions, and (4) comparisons of recent, general water-quality conditions to established national water-quality criteria, where applicable. Stream- and lake-water data are summarized for selected sampling sites, and data are summarized by major subunits of the study unit (the Sandhills, Loess Hills, Glaciated Area, and Platte Valley subunits) for streambed-sediment, fish-tissue, aquatic-ecological, and ground-water data. The summaries focus on the central tendencies and typical variation in the data and use nonparametric statistics such as frequencies and percentile values.
Quality evaluation of no-reference MR images using multidirectional filters and image statistics.
Jang, Jinseong; Bang, Kihun; Jang, Hanbyol; Hwang, Dosik
2018-09-01
This study aimed to develop a fully automatic, no-reference image-quality assessment (IQA) method for MR images. New quality-aware features were obtained by applying multidirectional filters to MR images and examining the feature statistics. A histogram of these features was then fitted to a generalized Gaussian distribution function for which the shape parameters yielded different values depending on the type of distortion in the MR image. Standard feature statistics were established through a training process based on high-quality MR images without distortion. Subsequently, the feature statistics of a test MR image were calculated and compared with the standards. The quality score was calculated as the difference between the shape parameters of the test image and the undistorted standard images. The proposed IQA method showed a >0.99 correlation with the conventional full-reference assessment methods; accordingly, this proposed method yielded the best performance among no-reference IQA methods for images containing six types of synthetic, MR-specific distortions. In addition, for authentically distorted images, the proposed method yielded the highest correlation with subjective assessments by human observers, thus demonstrating its superior performance over other no-reference IQAs. Our proposed IQA was designed to consider MR-specific features and outperformed other no-reference IQAs designed mainly for photographic images. Magn Reson Med 80:914-924, 2018. © 2018 International Society for Magnetic Resonance in Medicine.
Defect Analysis Of Quality Palm Kernel Meal Using Statistical Quality Control In Kernels Factory
NASA Astrophysics Data System (ADS)
Sembiring, M. T.; Marbun, N. J.
2018-04-01
Production quality is important for retaining the totality of characteristics of a product or service and for meeting established requirements. The quality criteria for Palm Kernel Meal (PKM) set by the kernel factory are as follows: oil content max 8.50%, water content max 12.00%, and impurity content max 4.00%, whereas the measured averages were 8.94% oil content, 5.51% water content, and 8.45% impurity content. To identify defects in the quality of the PKM produced, Statistical Quality Control (SQC) was used as the method of analysis. The factory's PKM exceeded the predetermined maximum by 0.44% for oil content and by 4.50% for impurity content. These excesses of oil and impurities caused defective production amounting to 854.6078 kg of PKM for oil content and 8643.193 kg for impurity content. From the cause-and-effect diagram and SQC analysis, the factors leading to poor PKM quality are the amperage and operating hours of the second-press oil expeller.
Ritter, Lutz; Mischkowski, Robert A; Neugebauer, Jörg; Dreiseidler, Timo; Scheer, Martin; Keeve, Erwin; Zöller, Joachim E
2009-09-01
The aim was to determine the influence of patient age, gender, body mass index (BMI), amount of dental restorations, and implants on image quality of cone-beam computerized tomography (CBCT). Fifty CBCT scans of a preretail version of Galileos (Sirona, Germany) were investigated retrospectively by 4 observers regarding image quality of 6 anatomic structures, pathologic findings detection, subjective exposure quality, and artifacts. Patient age, BMI, gender, amount of dental restorations, and implants were recorded and statistically tested for correlations to image quality. A negative effect on image quality was found to be statistically significantly correlated with age and the amount of dental restorations. None of the investigated image features were garbled by any of the investigated influence factors. Age and the amount of dental restorations appear to have a negative impact on CBCT image quality, whereas gender and BMI do not. Image quality of the mental foramen, mandibular canal, and nasal floor is affected negatively by age but not by the amount of dental restorations. Further studies are required to elucidate influence factors on CBCT image quality.
Opportunities for Applied Behavior Analysis in the Total Quality Movement.
ERIC Educational Resources Information Center
Redmon, William K.
1992-01-01
This paper identifies critical components of recent organizational quality improvement programs and specifies how applied behavior analysis can contribute to quality technology. Statistical Process Control and Total Quality Management approaches are compared, and behavior analysts are urged to build their research base and market behavior change…
DNA barcode identification of Podocarpaceae--the second largest conifer family.
Little, Damon P; Knopf, Patrick; Schulz, Christian
2013-01-01
We have generated matK, rbcL, and nrITS2 DNA barcodes for 320 specimens representing all 18 extant genera of the conifer family Podocarpaceae. The sample includes 145 of the 198 recognized species. Comparative analyses of sequence quality and species discrimination were conducted on the 159 individuals from which all three markers were recovered (representing 15 genera and 97 species). The vast majority of sequences were of high quality (B 30 = 0.596-0.989). Even the lowest quality sequences exceeded the minimum requirements of the BARCODE data standard. In the few instances that low quality sequences were generated, the responsible mechanism could not be discerned. There were no statistically significant differences in the discriminatory power of markers or marker combinations (p = 0.05). The discriminatory power of the barcode markers individually and in combination is low (56.7% of species at maximum). In some instances, species discrimination failed in spite of ostensibly useful variation being present (genotypes were shared among species), but in many cases there was simply an absence of sequence variation. Barcode gaps (minimum interspecific p-distance > maximum intraspecific p-distance) were observed in 50.5% of species when all three markers were considered simultaneously. The presence of a barcode gap was not predictive of discrimination success (p = 0.02) and there was no statistically significant difference in the frequency of barcode gaps among markers (p = 0.05). In addition, there was no correlation between number of individuals sampled per species and the presence of a barcode gap (p = 0.27).
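A barcode gap, in the conventional sense, exists for a species when its minimum p-distance to any other species exceeds its maximum intraspecific p-distance. A toy sketch (the sequences are invented; real barcodes are matK, rbcL, and nrITS2):

```python
def p_distance(seq1, seq2):
    """Uncorrected p-distance: fraction of aligned sites that differ."""
    return sum(a != b for a, b in zip(seq1, seq2)) / len(seq1)

def has_barcode_gap(species_seqs, focal):
    """True if the focal species' minimum interspecific p-distance
    exceeds its maximum intraspecific p-distance."""
    own = species_seqs[focal]
    intra = max((p_distance(a, b) for i, a in enumerate(own)
                 for b in own[i + 1:]), default=0.0)
    inter = min(p_distance(a, b)
                for sp, seqs in species_seqs.items() if sp != focal
                for a in own for b in seqs)
    return inter > intra

# Invented 8-bp "barcodes" for two species
seqs = {"sp1": ["ACGTACGT", "ACGTACGA"], "sp2": ["ACGTTTTT"]}
gap = has_barcode_gap(seqs, "sp1")
```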
Generalized watermarking attack based on watermark estimation and perceptual remodulation
NASA Astrophysics Data System (ADS)
Voloshynovskiy, Sviatoslav V.; Pereira, Shelby; Herrigel, Alexander; Baumgartner, Nazanin; Pun, Thierry
2000-05-01
Digital image watermarking has become a popular technique for authentication and copyright protection. For verifying the security and robustness of watermarking algorithms, specific attacks have to be applied to test them. In contrast to the known Stirmark attack, which degrades the quality of the image while destroying the watermark, this paper presents a new approach which is based on the estimation of a watermark and the exploitation of the properties of Human Visual System (HVS). The new attack satisfies two important requirements. First, image quality after the attack as perceived by the HVS is not worse than the quality of the stego image. Secondly, the attack uses all available prior information about the watermark and cover image statistics to perform the best watermark removal or damage. The proposed attack is based on a stochastic formulation of the watermark removal problem, considering the embedded watermark as additive noise with some probability distribution. The attack scheme consists of two main stages: (1) watermark estimation and partial removal by a filtering based on a Maximum a Posteriori (MAP) approach; (2) watermark alteration and hiding through addition of noise to the filtered image, taking into account the statistics of the embedded watermark and exploiting HVS characteristics. Experiments on a number of real world and computer generated images show the high efficiency of the proposed attack against known academic and commercial methods: the watermark is completely destroyed in all tested images without altering the image quality. The approach can be used against watermark embedding schemes that operate either in coordinate domain, or transform domains like Fourier, DCT or wavelet.
77 FR 62602 - Privacy Act of 1974, as Amended; System of Records Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-15
... Record; (12) Statistical Reports--retrievable by names: (a) Personnel Transcript Report, (b) Class... training processes, such as the collection of statistical information on training programs, development of... systems, creating and reviewing statistics to improve the quality of services provided, or conducting debt...
The effect of baryons in the cosmological lensing PDFs
NASA Astrophysics Data System (ADS)
Castro, Tiago; Quartin, Miguel; Giocoli, Carlo; Borgani, Stefano; Dolag, Klaus
2018-07-01
Observational cosmology is passing through a unique moment of grandeur, with the amount of quality data growing fast. However, in order to better take advantage of this moment, data analysis tools have to keep pace. Understanding the effect of baryonic matter on the large-scale structure is one of the challenges to be faced in cosmology. In this work, we have thoroughly studied the effect of baryonic physics on different lensing statistics. Making use of the Magneticum Pathfinder suite of simulations, we show that the influence of luminous matter on the 1-point lensing statistics of point sources is significant, enhancing the probability of magnified objects with μ > 3 by a factor of 2 and the occurrence of multiple images by a factor of 5–500, depending on the source redshift and size. We also discuss the dependence of the lensing statistics on the angular resolution of sources. Our results and methodology were carefully tested to guarantee that our uncertainties are much smaller than the effects here presented.
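For readers wanting to reproduce this kind of 1-point statistic, the sketch below shows the generic computation: the μ > 3 tail probability of a magnification PDF estimated from samples, and the enhancement factor between two runs. The lognormal mock distributions stand in for actual ray-tracing outputs and are purely illustrative, not the paper's data:

```python
import numpy as np

def tail_probability(mu_samples, mu_min=3.0):
    """Fraction of sight lines with magnification above mu_min, i.e. the
    tail of the 1-point magnification PDF, with a binomial error bar."""
    mu_samples = np.asarray(mu_samples)
    p = np.mean(mu_samples > mu_min)
    err = np.sqrt(p * (1.0 - p) / mu_samples.size)
    return p, err

# Toy comparison in the spirit of the paper: a "hydro" run with a heavier
# high-magnification tail than a dark-matter-only run (both mocked here).
rng = np.random.default_rng(0)
dmo = rng.lognormal(mean=0.0, sigma=0.3, size=200_000)
hydro = rng.lognormal(mean=0.0, sigma=0.4, size=200_000)
p_dmo, _ = tail_probability(dmo)
p_hyd, _ = tail_probability(hydro)
enhancement = p_hyd / p_dmo  # baryons boost the mu > 3 tail
```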
Croft, Giles P; Williams, John G; Mann, Robin Y; Cohen, David; Phillips, Ceri J
2007-08-01
Hospital episode statistics were originally designed to monitor activity and allocate resources in the NHS. Recently their uses have widened to include analysis of individuals' activity, to inform appraisal and revalidation, and to monitor performance. This study investigated physician attitudes to the validity and usefulness of these data for such purposes, and the effect of supporting individuals in data interpretation. A randomised study was conducted with consultant physicians in England, Wales and Scotland. The intervention group was supported by a clinician and an information analyst in obtaining and analysing their own data. The control group was unsupported. Attitudes to the data and confidence in their ability to reflect clinical practice were examined before and after the intervention. It was concluded that hospital episode statistics are not presently fit for monitoring the performance of individual physicians. A more comprehensive description of activity is required for these purposes. Improvements in the quality of existing data through clinical engagement at a local level, however, are possible.
Statistical model for speckle pattern optimization.
Su, Yong; Zhang, Qingchuan; Gao, Zeren
2017-11-27
Image registration is the key technique of optical metrologies such as digital image correlation (DIC), particle image velocimetry (PIV), and speckle metrology. Its performance depends critically on the quality of the image pattern, and thus pattern optimization attracts extensive attention. In this article, a statistical model is built to optimize speckle patterns that are composed of randomly positioned speckles. It is found that the process of speckle pattern generation is essentially a filtered Poisson process. The dependence of measurement errors (including systematic errors, random errors, and overall errors) upon speckle pattern generation parameters is characterized analytically. By minimizing the errors, formulas for the optimal speckle radius are presented. Although the primary motivation is from the field of DIC, we believe that scholars in other optical measurement communities, such as PIV and speckle metrology, will benefit from these discussions.
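A filtered Poisson process of this kind is easy to simulate: draw a Poisson number of speckles, place them uniformly at random, and superpose a speckle profile at each position. The sketch below is an illustrative generator using a Gaussian speckle profile; the density and radius parameters are assumptions, not the paper's optimal values:

```python
import numpy as np

def speckle_pattern(size=256, density=0.02, radius=3.0, rng=None):
    """Generate a speckle pattern as a filtered Poisson process:
    speckle count ~ Poisson(density * size^2), positions uniform over the
    image, each speckle a Gaussian blob of the given radius."""
    rng = np.random.default_rng(0) if rng is None else rng
    n = rng.poisson(density * size * size)          # Poisson point count
    xs = rng.uniform(0, size, n)                    # uniform positions
    ys = rng.uniform(0, size, n)
    yy, xx = np.mgrid[0:size, 0:size]
    img = np.zeros((size, size))
    for x0, y0 in zip(xs, ys):                      # superpose blobs
        img += np.exp(-((xx - x0)**2 + (yy - y0)**2) / (2 * radius**2))
    return np.clip(img / img.max(), 0.0, 1.0)       # normalize to [0, 1]
```

In the paper's framework one would then sweep `radius` (and density) and evaluate the analytic error formulas to locate the optimum; the generator above only supplies the candidate patterns.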
Shen, Yu-Ming; Le, Lien D; Wilson, Rory; Mansmann, Ulrich
2017-01-09
Biomarkers providing evidence for patient-treatment interaction are key in the development and practice of personalized medicine. Knowledge that a patient with a specific feature - as demonstrated through a biomarker - would have an advantage under a given treatment vs. a competing treatment can aid immensely in medical decision-making. Statistical strategies to establish evidence of continuous biomarkers are complex and their formal results are thus not easy to communicate. Good graphical representations would help to translate such findings for use in the clinical community. Although general guidelines on how to present figures in clinical reports are available, there remains little guidance for figures elucidating the role of continuous biomarkers in patient-treatment interaction (CBPTI). To combat the current lack of comprehensive reviews or adequate guides on graphical presentation within this topic, our study proposes presentation principles for CBPTI plots. In order to understand current practice, we review the development of CBPTI methodology and how CBPTI plots are currently used in clinical research. The quality of a CBPTI plot is determined by how well the presentation provides key information for clinical decision-making. Several criteria for a good CBPTI plot are proposed, including general principles of visual display, use of units presenting absolute outcome measures, appropriate quantification of statistical uncertainty, correct display of benchmarks, and informative content for answering clinical questions, especially on the quantitative advantage for an individual patient with regard to a specific treatment. We examined the development of CBPTI methodology from 2000–2014, and reviewed how CBPTI plots were currently used in clinical research in six major clinical journals from 2013–2014 using the principle of theoretical saturation. Each CBPTI plot found was assessed for appropriateness of its presentation and clinical utility.
In our review, a total of seven methodological papers and five clinical reports used CBPTI plots, which we categorized into four types: those that distinguish the outcome effect for each treatment group; those that show the outcome differences between treatment groups (by either partitioning all individuals into subpopulations or modelling the functional form of the interaction); those that evaluate the proportion of population impact of the biomarker; and those that show the classification accuracy of the biomarker. The current practice of utilizing CBPTI plots in clinical reports suffers from methodological shortcomings: lack of presentation of statistical uncertainty, outcome measures scaled in relative rather than absolute units, incorrect use of benchmarks, and plots that are uninformative in answering clinical questions. There is considerable scope for improvement in the graphical representation of CBPTI in clinical reports. The current challenge is to develop instruments for high-quality graphical plots which not only convey quantitative concepts to readers with limited statistical knowledge, but also facilitate medical decision-making.
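As a concrete illustration of the "absolute outcome units" criterion, the sketch below converts a logistic model with a treatment-by-biomarker interaction into an absolute risk difference curve, the scale the review recommends plotting instead of odds ratios. The coefficients are hypothetical, standing in for a fitted model:

```python
import math

def risk(b0, b_t, b_x, b_tx, treat, x):
    """Event probability from a logistic model with a treatment-by-biomarker
    interaction: logit(p) = b0 + b_t*treat + b_x*x + b_tx*treat*x.
    All coefficients here are hypothetical placeholders for a fitted model."""
    lp = b0 + b_t * treat + b_x * x + b_tx * treat * x
    return 1.0 / (1.0 + math.exp(-lp))

def absolute_risk_difference(coefs, x):
    """Treatment-vs-control difference on the absolute (probability) scale,
    as a function of the continuous biomarker x."""
    return risk(*coefs, 1, x) - risk(*coefs, 0, x)

coefs = (-1.0, 0.2, 0.5, -0.8)   # hypothetical fit with a crossover
curve = [(x, absolute_risk_difference(coefs, x)) for x in (0.0, 0.5, 1.0, 1.5)]
```

With these coefficients the difference changes sign as the biomarker grows, i.e. a qualitative interaction: low-biomarker patients benefit from treatment, high-biomarker patients do not. Plotting this curve with a confidence band and a zero-benefit benchmark line would satisfy the review's criteria.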
Predicting the Ability of Marine Mammal Populations to Compensate for Behavioral Disturbances
2015-09-30
approaches, including simple theoretical models as well as statistical analysis of data-rich conditions. Building on models developed for PCoD [2,3], we...conditions is population trajectory most likely to be affected (the central aim of PCoD). For the revised model presented here, we include a population...averaged condition individuals (here used as a proxy for individual health as defined in PCoD), and E is the quality of the environment in which the
NASA Technical Reports Server (NTRS)
Neustadter, H. E.; King, R. B.; Fordyce, J. S.; Burr, J. C., Jr.
1972-01-01
The NASA Lewis Research Center is assisting the City of Cleveland, Ohio, in its effort to monitor its air pollution. This report describes the Cleveland program of the past 4 years and the supportive Lewis program currently being developed. The data accumulated by Cleveland over the past 4 years are presented together with some preliminary statistical analyses indicating, in a semiquantitative manner, the degree of air pollution existing within the boundaries of Cleveland.
Setting of index system of environmental and economic accounting of water
NASA Astrophysics Data System (ADS)
Tan, Yarong
2017-10-01
To improve the quality of integrated water management in China, a scientific and complete index system of environmental and economic accounting should be built. At present, the water shortage in China is becoming increasingly serious, which further highlights the importance of efficient water management and of improving the index system of water economic accounting. Based on the internal structure of the new statistical method of environmental and economic accounting, this paper analyzes and discusses the indices that such a system should include.
Understanding reverberating chambers as an alternative facility for EMC testing
NASA Astrophysics Data System (ADS)
Ma, M. T.
A relatively new facility called a reverberating chamber designed for EMC testing is described. The purpose is to create a statistically uniform electric field inside a metal enclosure for testing the radiated susceptibility or immunity of equipment. Design criteria in terms of the number of cavity modes, mode density, and composite quality factor are presented in detail to provide physical insight and to aid interpretation of measurement results. Recent experimental data are included to illustrate the underlying principle.
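The mode-count and mode-density criteria mentioned above follow from the standard Weyl approximation for a cavity of volume V; the sketch below evaluates N(f) ≈ (8π/3)V(f/c)³ and dN/df = 8πVf²/c³, with wall-correction terms omitted (the example volume and frequency are illustrative, not from this report):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def mode_count(volume_m3, f_hz):
    """Weyl approximation for the number of cavity modes below frequency f:
    N(f) ~ (8*pi/3) * V * (f/c)^3  (wall-correction terms omitted)."""
    return (8 * math.pi / 3) * volume_m3 * (f_hz / C) ** 3

def mode_density(volume_m3, f_hz):
    """Modes per hertz, dN/df = 8*pi*V*f^2 / c^3.  A common rule of thumb
    is that a chamber needs a sufficiently high mode count and density
    (plus stirring) before the field can be treated as statistically
    uniform over an ensemble of stirrer positions."""
    return 8 * math.pi * volume_m3 * f_hz ** 2 / C ** 3
```

For example, a 10 m³ chamber at 1 GHz supports roughly 3 × 10³ modes, comfortably in the overmoded regime; the same chamber at 100 MHz supports only about 3, which is why reverberation chambers have a lowest usable frequency.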
[Functional impairment and quality of life after rectal cancer surgery].
Mora, Laura; Zarate, Alba; Serra-Aracil, Xavier; Pallisera, Anna; Serra, Sheila; Navarro-Soto, Salvador
2018-01-01
This study determines the quality of life and the anorectal function of these patients. Observational study of two cohorts comparing patients undergoing rectal tumor surgery by transanal total mesorectal excision (TaTME) or conventional TME, a minimum of six months after reconstruction of intestinal transit. The EORTC QLQ-C30 and QLQ-CR29 quality-of-life questionnaires and the anorectal function assessment questionnaire (LARS score) were applied; general variables were also collected. Thirty-one patients were included between 2011 and 2014: 15 in the TME group and 16 in the TaTME group. We found no statistically significant differences in the quality-of-life questionnaires or in anorectal function. Statistically significant general variables: longer operative time in the TaTME group. Nosocomial infection and minor suture failure also occurred in the TaTME group. TaTME achieves the same results in terms of quality of life and anorectal function as conventional TME. Copyright: © 2018 Permanyer.