Opportunities for Applied Behavior Analysis in the Total Quality Movement.
ERIC Educational Resources Information Center
Redmon, William K.
1992-01-01
This paper identifies critical components of recent organizational quality improvement programs and specifies how applied behavior analysis can contribute to quality technology. Statistical Process Control and Total Quality Management approaches are compared, and behavior analysts are urged to build their research base and market behavior change…
Small Steps, Big Reward: Quality Improvement through Pilot Groups.
ERIC Educational Resources Information Center
Bindl, Jim; Schuler, Jim
1988-01-01
Because of a need for quality improvement, Wisconsin Power and Light trained two six-person pilot groups in statistical process control, had them apply that knowledge to actual problems, and showed management the dollars-and-cents savings that come from quality improvement. (JOW)
Applying Statistical Process Quality Control Methodology to Educational Settings.
ERIC Educational Resources Information Center
Blumberg, Carol Joyce
A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (Range), X (individual observations), MR (moving…
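The chart limits this methodology introduces reduce to simple arithmetic. A minimal sketch for X-bar and R charts, using the standard tabulated constants for subgroups of size 5 (the grand mean and average range below are hypothetical, not from the paper):

```python
# X-bar and R control-chart limits for subgroups of size n = 5,
# using the standard tabulated constants A2, D3, D4.
A2, D3, D4 = 0.577, 0.0, 2.114  # constants for subgroup size 5

def xbar_r_limits(xbar_bar, r_bar):
    """Return (LCL, UCL) for the X-bar chart and (LCL, UCL) for the R chart."""
    xbar_limits = (xbar_bar - A2 * r_bar, xbar_bar + A2 * r_bar)
    r_limits = (D3 * r_bar, D4 * r_bar)
    return xbar_limits, r_limits

# Hypothetical process: grand mean 10.0, average subgroup range 2.0
(xlcl, xucl), (rlcl, rucl) = xbar_r_limits(10.0, 2.0)
print(round(xlcl, 3), round(xucl, 3), round(rlcl, 3), round(rucl, 3))
```

Points falling outside these limits signal special-cause variation worth investigating, which is the core use of control charting in educational or industrial settings alike.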
Statistical issues in quality control of proteomic analyses: good experimental design and planning.
Cairns, David A
2011-03-01
Quality control is becoming increasingly important in proteomic investigations as experiments become more multivariate and quantitative. Quality control applies to all stages of an investigation and statistics can play a key role. In this review, the role of statistical ideas in the design and planning of an investigation is described. This involves the design of unbiased experiments using key concepts from statistical experimental design, the understanding of the biological and analytical variation in a system using variance components analysis and the determination of a required sample size to perform a statistically powerful investigation. These concepts are described through simple examples and an example data set from a 2-D DIGE pilot experiment. Each of these concepts can prove useful in producing better and more reproducible data. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
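The sample-size determination the review discusses can be sketched with the standard normal-approximation formula for a two-group comparison; the effect size and variance below are illustrative values, not taken from the 2-D DIGE example:

```python
# Sample size per group for a two-sided test at significance alpha with a
# given power, using the normal approximation n = 2 (z_a + z_b)^2 (sigma/delta)^2.
import math

Z_975 = 1.959964  # standard normal quantile for alpha = 0.05, two-sided
Z_80 = 0.841621   # standard normal quantile for 80% power

def n_per_group(sigma, delta, z_alpha=Z_975, z_power=Z_80):
    """sigma: within-group standard deviation; delta: difference to detect."""
    return math.ceil(2 * (z_alpha + z_power) ** 2 * (sigma / delta) ** 2)

# Detecting a difference of half a standard deviation needs ~63 per group
print(n_per_group(sigma=1.0, delta=0.5))
```

The quadratic dependence on sigma/delta is why the variance-components step matters: halving the analytical variation can cut the required sample size by far more than half.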
The Strengths and Weaknesses of Total Quality Management in Higher Education.
ERIC Educational Resources Information Center
Hazzard, Terry
This paper defines Total Quality Management (TQM), discusses its origins, and identifies its strengths and weaknesses as they apply to higher education. The paper defines TQM as a philosophy of organizations that defines quality and improves organizational performance and administrative systems. The system originated from statistical quality…
Multivariate Statistical Analysis of Water Quality data in Indian River Lagoon, Florida
NASA Astrophysics Data System (ADS)
Sayemuzzaman, M.; Ye, M.
2015-12-01
The Indian River Lagoon, part of the longest barrier island complex in the United States, is a region of particular concern to environmental scientists because of the rapid rate of human development throughout the region and its geographical position between the colder temperate zone and the warmer sub-tropical zone. Surface water quality analysis in this region therefore continues to yield new information. In the present study, multivariate statistical procedures were applied to analyze the spatial and temporal water quality in the Indian River Lagoon over the period 1998-2013. Twelve parameters were analyzed at twelve key water monitoring stations in and beside the lagoon using monthly datasets (a total of 27,648 observations). The dataset was treated using cluster analysis (CA), principal component analysis (PCA) and non-parametric trend analysis. The CA grouped the twelve monitoring stations into four clusters, with stations sharing similar surrounding characteristics falling in the same group. The PCA was then applied within each group to find the important water quality parameters. Principal components PC1 to PC5 were retained based on cumulative explained variances of 75% to 85% in each cluster group. Nutrient species (phosphorus and nitrogen), salinity, specific conductivity and erosion factors (TSS, turbidity) were the major variables involved in the construction of the PCs. Statistically significant positive or negative trends, and abrupt trend shifts, were detected by applying the Mann-Kendall trend test and Sequential Mann-Kendall (SQMK) test to each individual station for the important water quality parameters. Land use and land cover change patterns, local anthropogenic activities and extreme climate events such as drought might be associated with these trends. This study presents a multivariate statistical assessment aimed at better characterizing surface water quality, so that effective pollution control and management of these surface waters can be undertaken.
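The Mann-Kendall trend test applied to each station's series is straightforward to sketch. The version below omits the tie correction for simplicity (monthly water-quality data often contain ties, so a production analysis would include it):

```python
# Minimal Mann-Kendall trend test (no tie correction): S statistic and
# normal-approximation z-score for a time series x.
import math

def mann_kendall(x):
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0   # variance of S under H0, no ties
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)          # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z  # |z| > 1.96 -> significant monotone trend at the 5% level

s, z = mann_kendall([1.0, 2.0, 3.0, 4.0, 5.0])  # strictly increasing series
print(s, round(z, 3))
```

The Sequential Mann-Kendall (SQMK) variant used in the study applies this same statistic progressively along the series to locate abrupt trend shifts.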
Operator agency in process intervention: tampering versus application of tacit knowledge
NASA Astrophysics Data System (ADS)
Van Gestel, P.; Pons, D. J.; Pulakanam, V.
2015-09-01
Statistical process control (SPC) theory takes a negative view of adjustment of process settings, which is termed tampering. In contrast, quality and lean programmes actively encourage acts of intervention and personal agency by operators in the improvement of production outcomes. This creates a conflict that requires operator judgement: How does one differentiate between unnecessary tampering and needful intervention? A further difficulty is that operators apply tacit knowledge to such judgements. There is a need to determine where in a given production process operators are applying tacit knowledge, and whether this is hindering or aiding quality outcomes. The work involved the conjoint application of systems engineering, statistics, and knowledge management principles, in the context of a case study. Systems engineering was used to create a functional model of a real plant. Actual plant data were analysed with the statistical methods of ANOVA, feature selection, and link analysis. This identified the variables to which the output quality was most sensitive. These key variables were mapped back to the functional model. Fieldwork was then directed to those areas to prospect for operator judgement activities. A natural conversational approach was used to determine where and how operators were applying judgement. This contrasts with the interrogative approach of conventional knowledge management. Data are presented for a case study of a meat rendering plant. The results identify specific areas where operators' tacit knowledge and mental models contribute to quality outcomes, and untangle the motivations behind their agency. Also evident is how novice and expert operators apply their knowledge differently. Novices were focussed on meeting throughput objectives, and their incomplete understanding of the plant characteristics led them to inadvertently sacrifice quality in the pursuit of productivity in certain situations.
Operators' responses to the plant are affected by their individual mental models of the plant, which differ between operators and have variable validity. Their behaviour is also affected by differing interpretations of how their personal agency should be applied to the achievement of production objectives. The methodology developed here is an integration of systems engineering, statistical analysis, and knowledge management. It shows how to determine where in a given production process the operator intervention is occurring, how it affects quality outcomes, and what tacit knowledge operators are using. It thereby assists the continuous quality improvement processes in a different way to SPC. A second contribution is the provision of a novel methodology for knowledge management, one that circumvents the usual codification barriers to knowledge management.
Statistical study of air pollutant concentrations via generalized gamma distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marani, A.; Lavagnini, I.; Buttazzoni, C.
1986-11-01
This paper deals with modeling observed frequency distributions of air quality data measured in the area of Venice, Italy. The paper discusses the application of the generalized gamma distribution (ggd) which has not been commonly applied to air quality data notwithstanding the fact that it embodies most distribution models used for air quality analyses. The approach yields important simplifications for statistical analyses. A comparison among the ggd and other relevant models (standard gamma, Weibull, lognormal), carried out on daily sulfur dioxide concentrations in the area of Venice underlines the efficiency of ggd models in portraying experimental data.
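The key property the paper exploits, that the generalized gamma embodies the standard gamma and Weibull (and, as a limit, the lognormal) as special cases, can be verified numerically with scipy's `gengamma(a, c)` parameterization:

```python
# The generalized gamma distribution reduces to the standard gamma when
# c = 1 and to the Weibull when a = 1; a quick numerical check with scipy.
import numpy as np
from scipy.stats import gengamma, gamma, weibull_min

x = np.linspace(0.1, 5.0, 50)

# c = 1: density x^(a-1) exp(-x) / Gamma(a), i.e. the standard gamma
gamma_ok = np.allclose(gengamma.pdf(x, a=2.5, c=1.0), gamma.pdf(x, a=2.5))

# a = 1: density c x^(c-1) exp(-x^c), i.e. the Weibull with shape c
weibull_ok = np.allclose(gengamma.pdf(x, a=1.0, c=2.0),
                         weibull_min.pdf(x, c=2.0))
print(gamma_ok, weibull_ok)
```

This nesting is what yields the simplification the paper notes: a single three-parameter fit can be compared directly against each candidate sub-model, for instance when portraying daily sulfur dioxide concentrations.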
Elevation, rootstock, and soil depth affect the nutritional quality of mandarin oranges
USDA-ARS?s Scientific Manuscript database
The effects of elevation, rootstock, and soil depth on the nutritional quality of mandarin oranges from 11 groves in California were investigated by nuclear magnetic resonance (NMR) spectroscopy by quantifying 29 compounds and applying multivariate statistical data analysis. A comparison of the juic...
Total Quality Management in Education.
ERIC Educational Resources Information Center
Johnson, James H.
1993-01-01
Ways to apply the concepts and processes of Total Quality Management (TQM) to education are discussed in this document. Following the introduction and the preface, chapter 1 provides a historical overview and describes the four cornerstones of TQM--an understanding of systems, psychology, knowledge, and statistics. Chapter 2 describes some of the…
Process safety improvement--quality and target zero.
Van Scyoc, Karl
2008-11-15
Process safety practitioners have adopted quality management principles in the design of process safety management systems with positive effect, yet achieving safety objectives sometimes remains a distant target. Companies regularly apply tools and methods which have roots in quality and productivity improvement. The "plan, do, check, act" improvement loop, statistical analysis of incidents (non-conformities), and performance trending popularized by Dr. Deming are now commonly used in the context of process safety. Significant advancements in HSE performance are reported after applying methods viewed as fundamental for quality management. In pursuit of continual process safety improvement, the paper examines various quality improvement methods, and explores how methods intended for product quality can additionally be applied to continual improvement of process safety. Methods such as Kaizen, Poka-yoke, and TRIZ, while long established for quality improvement, are quite unfamiliar in the process safety arena. These methods are discussed for application in improving both process safety leadership and field work team performance. Practical ways to advance process safety, based on the methods, are given.
Groundwater quality assessment of urban Bengaluru using multivariate statistical techniques
NASA Astrophysics Data System (ADS)
Gulgundi, Mohammad Shahid; Shetty, Amba
2018-03-01
Groundwater quality deterioration due to anthropogenic activities has become a subject of prime concern. The objective of the study was to assess the spatial and temporal variations in groundwater quality and to identify the sources in the western half of the Bengaluru city using multivariate statistical techniques. Water quality index rating was calculated for pre and post monsoon seasons to quantify overall water quality for human consumption. The post-monsoon samples show poorer quality for drinking purposes than the pre-monsoon samples. Cluster analysis (CA), principal component analysis (PCA) and discriminant analysis (DA) were applied to the groundwater quality data measured on 14 parameters from 67 sites distributed across the city. Hierarchical cluster analysis (CA) grouped the 67 sampling stations into two groups, cluster 1 having high pollution and cluster 2 having lesser pollution. Discriminant analysis (DA) was applied to delineate the most meaningful parameters accounting for temporal and spatial variations in groundwater quality of the study area. Temporal DA identified pH as the most important parameter, which discriminates between water quality in the pre-monsoon and post-monsoon seasons and accounts for 72% seasonal assignation of cases. Spatial DA identified Mg, Cl and NO3 as the three most important parameters discriminating between the two clusters and accounting for 89% spatial assignation of cases. Principal component analysis was applied to the dataset obtained from the two clusters, which yielded three factors in each cluster, explaining 85.4 and 84% of the total variance, respectively. Varifactors obtained from principal component analysis showed that groundwater quality variation is mainly explained by dissolution of minerals from rock-water interactions in the aquifer, the effect of anthropogenic activities, and ion exchange processes in water.
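The PCA step common to these groundwater studies can be sketched with plain NumPy as an eigendecomposition of the correlation matrix of standardized parameters. The data below are synthetic stand-ins (a correlated Mg/Cl pair mimicking rock-water interaction, an independent NO3 series mimicking a separate anthropogenic source), not the study's measurements:

```python
# PCA on standardized water-quality parameters via eigendecomposition of
# the correlation matrix; synthetic data for three parameters.
import numpy as np

rng = np.random.default_rng(0)
n = 200
mg = rng.normal(25.0, 3.0, n)            # magnesium, mg/L (synthetic)
cl = 2.0 * mg + rng.normal(0.0, 1.0, n)  # chloride tracking Mg (one source)
no3 = rng.normal(10.0, 2.0, n)           # nitrate, independent source
X = np.column_stack([mg, cl, no3])

Z = (X - X.mean(axis=0)) / X.std(axis=0)            # standardize each column
evals, evecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
evals = evals[::-1]                                 # sort descending
explained = evals / evals.sum()                     # explained variance ratio
print(np.round(explained, 2))
```

With one strongly correlated pair and one independent parameter, the first component absorbs the shared source and the second the independent one, which is exactly how the varifactors here separate rock-water interaction from anthropogenic inputs.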
Guo, Hui; Zhang, Zhen; Yao, Yuan; Liu, Jialin; Chang, Ruirui; Liu, Zhao; Hao, Hongyuan; Huang, Taohong; Wen, Jun; Zhou, Tingting
2018-08-30
Semen sojae praeparatum with homology of medicine and food is a famous traditional Chinese medicine. A simple and effective quality fingerprint analysis, coupled with chemometrics methods, was developed for quality assessment of Semen sojae praeparatum. First, similarity analysis (SA) and hierarchical clustering analysis (HCA) were applied to select the qualitative markers, which obviously influence the quality of Semen sojae praeparatum. Twenty-one chemicals were selected and characterized by high resolution ion trap/time-of-flight mass spectrometry (LC-IT-TOF-MS). Subsequently, principal components analysis (PCA) and orthogonal partial least squares discriminant analysis (OPLS-DA) were conducted to select the quantitative markers of Semen sojae praeparatum samples from different origins. Moreover, 11 compounds with statistical significance were determined quantitatively, which provided accurate and informative data for quality evaluation. This study proposes a new strategy for "statistic analysis-based fingerprint establishment", which would be a valuable reference for further study. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Roy, P. K.; Pal, S.; Banerjee, G.; Biswas Roy, M.; Ray, D.; Majumder, A.
2014-12-01
Rivers are considered one of the main sources of freshwater worldwide; hence, analysis and maintenance of this water resource is globally considered a matter of major concern. This paper deals with the assessment of the surface water quality of the Ichamati river using multivariate statistical techniques. Eight distinct surface water quality observation stations were located and samples were collected. Statistical techniques were applied to the physico-chemical parameters and depth of siltation of the collected samples. Cluster analysis was performed to determine the relations between surface water quality and siltation depth of the river Ichamati. Multiple regression and mathematical equation modeling were carried out to characterize the surface water quality of the Ichamati river on the basis of physico-chemical parameters. It was found that the surface water quality of the downstream river differed from the water quality upstream. The analysis of the water quality parameters of the Ichamati river clearly indicates a high pollution load on the river water, which can be attributed to agricultural discharge, tidal effects and soil erosion. The results further reveal that water quality degraded as the depth of siltation increased.
Statistical process control methods allow the analysis and improvement of anesthesia care.
Fasting, Sigurd; Gisvold, Sven E
2003-10-01
Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
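The p-chart construction used here follows the standard attribute-chart formula: limits at the mean proportion plus or minus three binomial standard errors. A minimal sketch (the 18.3% overall rate is from the paper; the subgroup size of 500 anesthetics is hypothetical):

```python
# p-chart limits for a proportion of nonconforming events: p-bar +/- 3 sigma,
# with sigma = sqrt(p-bar (1 - p-bar) / n) for subgroups of size n.
import math

def p_chart_limits(p_bar, n):
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    lcl = max(0.0, p_bar - 3 * sigma)   # clamp: a proportion cannot be < 0
    ucl = min(1.0, p_bar + 3 * sigma)
    return lcl, ucl

# Overall adverse-event rate 18.3% (reported), hypothetical monthly
# subgroups of 500 anesthetics
lcl, ucl = p_chart_limits(0.183, 500)
print(round(lcl, 3), round(ucl, 3))
```

Monthly rates inside these limits indicate a stable (predictable) process, the "stable but unacceptably high" and "unstable because of improvement efforts" verdicts in the abstract are exactly the two ways points relate to such limits. The paper's observation that the method failed for rare medication errors also follows: with very small p-bar, the normal approximation behind these limits breaks down.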
Design, analysis, and interpretation of field quality-control data for water-sampling projects
Mueller, David K.; Schertz, Terry L.; Martin, Jeffrey D.; Sandstrom, Mark W.
2015-01-01
The report provides extensive information about statistical methods used to analyze quality-control data in order to estimate potential bias and variability in environmental data. These methods include construction of confidence intervals on various statistical measures, such as the mean, percentiles and percentages, and standard deviation. The methods are used to compare quality-control results with the larger set of environmental data in order to determine whether the effects of bias and variability might interfere with interpretation of these data. Examples from published reports are presented to illustrate how the methods are applied, how bias and variability are reported, and how the interpretation of environmental data can be qualified based on the quality-control analysis.
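One of the simplest methods the report covers, a t-based confidence interval on the mean of replicate quality-control measurements, can be sketched with the standard library alone. The replicate concentrations below are hypothetical, not from the report:

```python
# Two-sided 95% confidence interval on the mean of replicate QC measurements,
# using the t distribution: mean +/- t * s / sqrt(n).
import math
from statistics import mean, stdev

qc = [4.9, 5.1, 5.0, 5.3, 4.8, 5.2, 5.0, 4.9]  # hypothetical replicates, mg/L
n = len(qc)
t_crit = 2.365                       # two-sided 95% critical value, df = n-1 = 7
half_width = t_crit * stdev(qc) / math.sqrt(n)
lo, hi = mean(qc) - half_width, mean(qc) + half_width
print(round(lo, 3), round(hi, 3))
```

In the QC context, an interval like this around the mean of blank or spike results is what supports statements such as "potential bias is not significantly different from zero" when interpreting the environmental data.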
Statistics usage in the American Journal of Obstetrics and Gynecology: has anything changed?
Welch, Gerald E; Gabbe, Steven G
2002-03-01
Our purpose was to compare statistical listing and usage between articles published in the American Journal of Obstetrics and Gynecology in 1994 and those published in 1999. All papers included in the obstetrics, fetus-placenta-newborn, and gynecology sections and the transactions of societies sections of the January through June 1999 issues of the American Journal of Obstetrics and Gynecology (volume 180, numbers 1 to 6) were reviewed for statistical usage. Each paper was given a rating for the cataloging of applied statistics and a rating for the appropriateness of statistical usage, when possible. These results were compared with the data collected in a similar review of articles published in 1994. Of the 238 available articles, 195 contained statistics and were reviewed. In comparison with the articles published in 1994, there were significantly more articles that completely cataloged applied statistics (74.3% vs 47.4%) (P <.0001), and there was a significant improvement in appropriateness of statistical usage (56.4% vs 30.3%) (P <.0001). Changes in the Instructions to Authors regarding the description of applied statistics and probable changes in the behavior of researchers and Editors have led to an improvement in the quality of statistics in papers published in the American Journal of Obstetrics and Gynecology.
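Comparisons like the reported 74.3% vs 47.4% are typically tested with a two-proportion z-test. A sketch: the 1999 numerator and denominator (145 of 195) are consistent with the paper's figures, but the 1994 counts below are hypothetical, chosen only to match the reported 47.4%:

```python
# Pooled two-proportion z-test: z = (p1 - p2) / sqrt(p (1 - p) (1/n1 + 1/n2)),
# where p is the pooled proportion under the null hypothesis.
import math

def two_prop_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 1999: 145/195 complete cataloging (74.4%); 1994 counts hypothetical (47.4%)
z = two_prop_z(145, 195, 90, 190)
print(round(z, 2))  # |z| > 3.29 corresponds to P < .001, two-sided
```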
Quality control troubleshooting tools for the mill floor
John Dramm
2000-01-01
Statistical Process Control (SPC) provides effective tools for improving process quality in the forest products industry resulting in reduced costs and improved productivity. Implementing SPC helps identify and locate problems that occur in wood products manufacturing. SPC tools achieve their real value when applied on the mill floor for monitoring and troubleshooting...
A quality improvement management model for renal care.
Vlchek, D L; Day, L M
1991-04-01
The purpose of this article is to explore the potential for applying the theory and tools of quality improvement (total quality management) in the renal care setting. We believe that the coupling of the statistical techniques used in the Deming method of quality improvement, with modern approaches to outcome and process analysis, will provide the renal care community with powerful tools, not only for improved quality (i.e., reduced morbidity and mortality), but also for technology evaluation and resource allocation.
Uhm, Dong-choon
2010-08-01
The purpose of this study was to evaluate the effects of foot massage on immigrant couples' communication, intimacy, conflict, and quality of life. The research design was a pretest-posttest experimental design with a nonequivalent control group. Data were collected from July 6, 2009 to February 27, 2010. The 36 couples were divided into two groups, experimental and control, with 18 couples in each group. Foot massage was applied twice a week for 6 weeks by the couples in the experimental group. There were statistically significant increases in communication (p=.011), intimacy (p<.001), and quality of life (p=.017) among the couples in the experimental group compared to the control group. There was also a statistically significant decrease in conflict (p=.003) among the couples in the experimental group compared to the control group. Foot massage can be applied as a nursing intervention for the improvement of marital relationships in immigrant couples.
Design of experiments (DoE) in pharmaceutical development.
N Politis, Stavros; Colombo, Paolo; Colombo, Gaia; M Rekkas, Dimitrios
2017-06-01
At the beginning of the twentieth century, Sir Ronald Fisher introduced the concept of applying statistical analysis during the planning stages of research rather than at the end of experimentation. When statistical thinking is applied from the design phase, it enables quality to be built into the product, by adopting Deming's profound knowledge approach, comprising system thinking, variation understanding, theory of knowledge, and psychology. The pharmaceutical industry was late in adopting these paradigms, compared to other sectors. It heavily focused on blockbuster drugs, while formulation development was mainly performed by One Factor At a Time (OFAT) studies, rather than implementing Quality by Design (QbD) and modern engineering-based manufacturing methodologies. Among various mathematical modeling approaches, Design of Experiments (DoE) is extensively used for the implementation of QbD in both research and industrial settings. In QbD, product and process understanding is the key enabler of assuring quality in the final product. Knowledge is achieved by establishing models correlating the inputs with the outputs of the process. The mathematical relationships of the Critical Process Parameters (CPPs) and Critical Material Attributes (CMAs) with the Critical Quality Attributes (CQAs) define the design space. Consequently, process understanding is well assured and rationally leads to a final product meeting the Quality Target Product Profile (QTPP). This review illustrates the principles of quality theory through the work of major contributors, the evolution of the QbD approach and the statistical toolset for its implementation. As such, DoE is presented in detail since it represents the first choice for rational pharmaceutical development.
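The smallest useful DoE building block is a two-level full factorial with a least-squares main-effects model, the kind of model that links CPPs/CMAs to a CQA. The factors and simulated response below are hypothetical:

```python
# 2^3 full factorial design in coded (-1/+1) units with a least-squares
# main-effects fit; the design matrix is orthogonal, so coefficients are exact.
import itertools
import numpy as np

design = np.array(list(itertools.product([-1, 1], repeat=3)))  # 8 runs, 3 factors

# Simulated CQA: baseline 50, strong effect of factor A, small effect of B,
# no effect of C (hypothetical response, no noise for clarity)
y = 50 + 4 * design[:, 0] - 1.5 * design[:, 1] + 0 * design[:, 2]

X = np.column_stack([np.ones(len(design)), design])  # intercept + 3 factors
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 2))  # recovers intercept 50 and effects 4, -1.5, 0
```

Because the columns of a two-level factorial are orthogonal, every effect is estimated independently, which is precisely the efficiency OFAT experimentation sacrifices.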
Indirect Comparisons: A Review of Reporting and Methodological Quality
Donegan, Sarah; Williamson, Paula; Gamble, Carrol; Tudur-Smith, Catrin
2010-01-01
Background The indirect comparison of two interventions can be valuable in many situations. However, the quality of an indirect comparison will depend on several factors including the chosen methodology and validity of underlying assumptions. Published indirect comparisons are increasingly common in the medical literature, but as yet, there are no published recommendations of how they should be reported. Our aim is to systematically review the quality of published indirect comparisons to add to existing empirical data suggesting that improvements can be made when reporting and applying indirect comparisons. Methodology/Findings Reviews applying statistical methods to indirectly compare the clinical effectiveness of two interventions using randomised controlled trials were eligible. We searched (1966–2008) Database of Abstracts and Reviews of Effects, The Cochrane library, and Medline. Full review publications were assessed for eligibility. Specific criteria to assess quality were developed and applied. Forty-three reviews were included. Adequate methodology was used to calculate the indirect comparison in 41 reviews. Nineteen reviews assessed the similarity assumption using sensitivity analysis, subgroup analysis, or meta-regression. Eleven reviews compared trial-level characteristics. Twenty-four reviews assessed statistical homogeneity. Twelve reviews investigated causes of heterogeneity. Seventeen reviews included direct and indirect evidence for the same comparison; six reviews assessed consistency. One review combined both evidence types. Twenty-five reviews urged caution in interpretation of results, and 24 reviews indicated when results were from indirect evidence by stating this term with the result. Conclusions This review shows that the underlying assumptions are not routinely explored or reported when undertaking indirect comparisons.
We recommend, therefore, that the quality of indirect comparisons should be improved, in particular, by assessing assumptions and reporting the assessment methods applied. We propose that the quality criteria applied in this article may provide a basis to help review authors carry out indirect comparisons and to aid appropriate interpretation. PMID:21085712
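The adjusted indirect comparison most such reviews compute is the Bucher method: the A-vs-C effect is derived from A-vs-B and C-vs-B trials sharing comparator B, with variances added. A sketch with hypothetical log odds ratios and standard errors:

```python
# Bucher adjusted indirect comparison: d_AC = d_AB - d_CB, with
# SE(d_AC) = sqrt(SE(d_AB)^2 + SE(d_CB)^2); effects on the log scale.
import math

def bucher(d_ab, se_ab, d_cb, se_cb):
    d_ac = d_ab - d_cb
    se_ac = math.sqrt(se_ab ** 2 + se_cb ** 2)
    ci = (d_ac - 1.96 * se_ac, d_ac + 1.96 * se_ac)  # 95% CI
    return d_ac, se_ac, ci

# Hypothetical: A vs B log OR -0.50 (SE 0.20); C vs B log OR -0.30 (SE 0.25)
d_ac, se_ac, (lo, hi) = bucher(-0.50, 0.20, -0.30, 0.25)
print(round(d_ac, 2), round(se_ac, 3), round(lo, 2), round(hi, 2))
```

The widened standard error illustrates why the reviewed papers are urged to report uncertainty and the similarity assumption: the indirect estimate is only valid if the A-vs-B and C-vs-B trial populations are comparable.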
[Functional impairment and quality of life after rectal cancer surgery].
Mora, Laura; Zarate, Alba; Serra-Aracil, Xavier; Pallisera, Anna; Serra, Sheila; Navarro-Soto, Salvador
2018-01-01
This study determines the quality of life and the anorectal function of these patients. Observational study of two cohorts comparing patients undergoing rectal tumor surgery using TaETM or conventional ETM after a minimum of six months since intestinal transit reconstruction. The EORTC-30 and EORTC-29 quality of life questionnaires and the anorectal function assessment questionnaire (LARS score) were applied. General variables were also collected. Thirty-one patients were included between 2011 and 2014: 15 in the ETM group and 16 in the TaETM group. We did not find statistically significant differences in the quality of life questionnaires or in anorectal function. Among the general variables, the statistically significant findings were a longer surgical time in the TaETM group, and nosocomial infection and minor suture failure in the TaETM group. TaETM achieves the same results in terms of quality of life and anorectal function as conventional ETM. Copyright: © 2018 Permanyer.
NASA Astrophysics Data System (ADS)
Ye, M.; Pacheco Castro, R. B.; Pacheco Avila, J.; Cabrera Sansores, A.
2014-12-01
The karstic aquifer of Yucatan is a vulnerable and complex system. The first fifteen meters of this aquifer have been polluted; protection of this resource is therefore important, because it is the only source of potable water for the entire State. Through the assessment of groundwater quality we can gain knowledge about the main processes governing water chemistry, as well as spatial patterns that are important for establishing protection zones. In this work, multivariate statistical techniques are used to assess the groundwater quality of the supply wells (30 to 40 meters deep) in the hydrogeologic region of the Ring of Cenotes, located in Yucatan, Mexico. Cluster analysis and principal component analysis are applied to groundwater chemistry data from the study area. Results of the principal component analysis show that the main sources of variation in the data are due to sea water intrusion, the interaction of the water with the carbonate rocks of the system, and some pollution processes. The cluster analysis shows that the data can be divided into four clusters. The spatial distribution of the clusters seems to be random, but is consistent with sea water intrusion and pollution with nitrates. The overall results show that multivariate statistical analysis can be successfully applied in the groundwater quality assessment of this karstic aquifer.
NASA Astrophysics Data System (ADS)
Valder, J.; Kenner, S.; Long, A.
2008-12-01
Portions of the Cheyenne River are characterized as impaired by the U.S. Environmental Protection Agency because of water-quality exceedances. The Cheyenne River watershed includes the Black Hills National Forest and part of the Badlands National Park. Preliminary analysis indicates that the Badlands National Park is a major contributor to the exceedances of the water-quality constituents for total dissolved solids and total suspended solids. Water-quality data have been collected continuously since 2007, and in the second year of collection (2008), monthly grab and passive sediment samplers are being used to collect total suspended sediment and total dissolved solids in both base-flow and runoff-event conditions. In addition, sediment samples from the river channel, including bed, bank, and floodplain, have been collected. These samples are being analyzed at the South Dakota School of Mines and Technology's X-Ray Diffraction Lab to quantify the mineralogy of the sediments. A multivariate statistical approach (including principal components, least squares, and maximum likelihood techniques) is applied to the mineral percentages characterized for each site to identify the contributing source areas that are causing the sediment-related exceedances in the Cheyenne River watershed. Results of the multivariate analysis demonstrate the likely sources of solids found in the Cheyenne River samples. A further refinement of the methods is in progress that utilizes a conceptual model which, when applied with the multivariate statistical approach, provides a better estimate of sediment sources.
The Statistical point of view of Quality: the Lean Six Sigma methodology
Bertolaccini, Luca; Viti, Andrea; Terzi, Alberto
2015-01-01
Six Sigma and Lean are two quality improvement methodologies. The Lean Six Sigma methodology is applicable to repetitive procedures. Therefore, the use of this methodology in the health-care arena has focused mainly on areas of business operations, throughput, and case management and has focused on efficiency outcomes. After the revision of methodology, the paper presents a brief clinical example of the use of Lean Six Sigma as a quality improvement method in the reduction of the complications during and after lobectomies. Using Lean Six Sigma methodology, the multidisciplinary teams could identify multiple modifiable points across the surgical process. These process improvements could be applied to different surgical specialties and could result in a measurement, from statistical point of view, of the surgical quality. PMID:25973253
Code of Federal Regulations, 2010 CFR
2010-07-01
..., other techniques, such as the use of statistical models or the use of historical data could be..., mathematical techniques should be applied to account for the trends to ensure that the expected annual values... emission patterns, either the most recent representative year(s) could be used or statistical techniques or...
A measure of the signal-to-noise ratio of microarray samples and studies using gene correlations.
Venet, David; Detours, Vincent; Bersini, Hugues
2012-01-01
The quality of gene expression data can vary dramatically from platform to platform, study to study, and sample to sample. As reliable statistical analysis rests on reliable data, determining such quality is of the utmost importance. Quality measures to spot problematic samples exist, but they are platform-specific and cannot be used to compare studies. As a proxy for quality, we propose a signal-to-noise ratio for microarray data, the "Signal-to-Noise Applied to Gene Expression Experiments", or SNAGEE. SNAGEE is based on the consistency of gene-gene correlations. We applied SNAGEE to a compendium of 80 large datasets on 37 platforms, for a total of 24,380 samples, and assessed the signal-to-noise ratio of studies and samples. This allowed us to discover serious issues with three studies. We show that the signal-to-noise ratios of both studies and samples are linked to the statistical significance of the biological results. We also show that SNAGEE is an effective way to measure data quality for most types of gene expression studies, and that it often outperforms existing techniques. Furthermore, SNAGEE is platform-independent and does not require raw data files. The SNAGEE R package is available in BioConductor.
Speckle reduction in optical coherence tomography by adaptive total variation method
NASA Astrophysics Data System (ADS)
Wu, Tong; Shi, Yaoyao; Liu, Youwen; He, Chongjun
2015-12-01
An adaptive total variation method based on the combination of speckle statistics and total variation restoration is proposed and developed for reducing speckle noise in optical coherence tomography (OCT) images. The statistical distribution of the speckle noise in OCT image is investigated and measured. With the measured parameters such as the mean value and variance of the speckle noise, the OCT image is restored by the adaptive total variation restoration method. The adaptive total variation restoration algorithm was applied to the OCT images of a volunteer's hand skin, which showed effective speckle noise reduction and image quality improvement. For image quality comparison, the commonly used median filtering method was also applied to the same images to reduce the speckle noise. The measured results demonstrate the superior performance of the adaptive total variation restoration method in terms of image signal-to-noise ratio, equivalent number of looks, contrast-to-noise ratio, and mean square error.
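The restoration step can be illustrated with a one-dimensional sketch of total variation denoising: gradient descent on a data-fidelity term plus a smoothed TV penalty. This is a simplified illustration, not the paper's algorithm; in the adaptive variant described above, the regularization weight would be tied to the measured speckle mean and variance, whereas here it is a fixed, hypothetical value.

```python
def tv_denoise_1d(f, lam, iters=500, step=0.1, eps=1e-3):
    """Gradient descent on 0.5 * ||u - f||^2 + lam * TV(u), using a
    smoothed absolute value sqrt(d^2 + eps) so the gradient is defined
    everywhere."""
    u = list(f)
    for _ in range(iters):
        g = [u[i] - f[i] for i in range(len(u))]  # data-fidelity gradient
        for i in range(len(u) - 1):
            d = u[i + 1] - u[i]
            w = d / (d * d + eps) ** 0.5          # derivative of smoothed |d|
            g[i] -= lam * w
            g[i + 1] += lam * w
        u = [u[i] - step * g[i] for i in range(len(u))]
    return u

# noisy step signal: TV restoration suppresses the noise but keeps the edge
noisy = [0.1, -0.05, 0.08, 0.02, 1.1, 0.95, 1.02, 0.9]
smooth = tv_denoise_1d(noisy, lam=0.05)
```

Edge preservation is the reason TV methods outperform simple median filtering on speckled images: the penalty charges for total variation, not for a single large jump.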
Enhance Video Film using Retinex method
NASA Astrophysics Data System (ADS)
Awad, Rasha; Al-Zuky, Ali A.; Al-Saleh, Anwar H.; Mohamad, Haidar J.
2018-05-01
An enhancement technique is used to improve the quality of the studied video. Algorithms based on the mean and standard deviation are used as criteria in this paper and applied to each video clip, which is divided into 80 images. The studied filming environments have different light intensities (315, 566, and 644 Lux); these different environments approximate the conditions of outdoor filming. The outputs of the suggested algorithm are compared with the results before applying it. The method is applied in two ways: first, to the full video clip, to obtain the enhanced film; second, to every individual image, after which the enhanced images are compiled into the enhanced film. This paper shows that the enhancement technique yields a good-quality video film based on a statistical method, and its use is recommended in different applications.
Liu, Yingchun; Sun, Guoxiang; Wang, Yan; Yang, Lanping; Yang, Fangliang
2015-06-01
Micellar electrokinetic chromatography fingerprinting combined with quantification was successfully developed and applied to monitor the quality consistency of Weibizhi tablets, which is a classical compound preparation used to treat gastric ulcers. A background electrolyte composed of 57 mmol/L sodium borate, 21 mmol/L sodium dodecylsulfate and 100 mmol/L sodium hydroxide was used to separate compounds. To optimize capillary electrophoresis conditions, multivariate statistical analyses were applied. First, the most important factors influencing sample electrophoretic behavior were identified as background electrolyte concentrations. Then, a Box-Benhnken design response surface strategy using resolution index RF as an integrated response was set up to correlate factors with response. RF reflects the effective signal amount, resolution, and signal homogenization in an electropherogram, thus, it was regarded as an excellent indicator. In fingerprint assessments, simple quantified ratio fingerprint method was established for comprehensive quality discrimination of traditional Chinese medicines/herbal medicines from qualitative and quantitative perspectives, by which the quality of 27 samples from the same manufacturer were well differentiated. In addition, the fingerprint-efficacy relationship between fingerprints and antioxidant activities was established using partial least squares regression, which provided important medicinal efficacy information for quality control. The present study offered an efficient means for monitoring Weibizhi tablet quality consistency. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
THE ATMOSPHERIC MODEL EVALUATION TOOL
This poster describes a model evaluation tool that is currently being developed and applied for meteorological and air quality model evaluation. The poster outlines the framework and provides examples of statistical evaluations that can be performed with the model evaluation tool...
Statistical analysis of subjective preferences for video enhancement
NASA Astrophysics Data System (ADS)
Woods, Russell L.; Satgunam, PremNandhini; Bronstad, P. Matthew; Peli, Eli
2010-02-01
Measuring preferences for moving video quality is harder than for static images due to the fleeting and variable nature of moving video. Subjective preferences for image quality can be tested by observers indicating their preference for one image over another. Such pairwise comparisons can be analyzed using Thurstone scaling (Farrell, 1999). Thurstone (1927) scaling is widely used in applied psychology, marketing, food tasting and advertising research. Thurstone analysis constructs an arbitrary perceptual scale for the items that are compared (e.g. enhancement levels). However, Thurstone scaling does not determine the statistical significance of the differences between items on that perceptual scale. Recent papers have provided inferential statistical methods that produce an outcome similar to Thurstone scaling (Lipovetsky and Conklin, 2004). Here, we demonstrate that binary logistic regression can analyze preferences for enhanced video.
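Thurstone Case V scaling, as referenced above, converts pairwise preference proportions into interval scale values by averaging inverse-normal transforms of the win proportions. A minimal sketch with hypothetical preference counts (the clipping constant is an illustrative choice to keep the transform finite for unanimous preferences):

```python
from statistics import NormalDist

def thurstone_case_v(wins):
    """Case V scale values from a pairwise win-count matrix.
    wins[i][j] = number of times item i was preferred over item j."""
    n = len(wins)
    z = NormalDist()
    scores = []
    for i in range(n):
        zs = []
        for j in range(n):
            if i == j:
                continue
            p = wins[i][j] / (wins[i][j] + wins[j][i])
            p = min(max(p, 0.01), 0.99)  # clip so inv_cdf stays finite
            zs.append(z.inv_cdf(p))
        scores.append(sum(zs) / len(zs))
    return scores

# hypothetical counts: 3 enhancement levels, 20 judgments per pair
wins = [[0, 14, 17],
        [6, 0, 12],
        [3, 8, 0]]
print(thurstone_case_v(wins))  # highest value = most preferred enhancement level
```

As the abstract notes, such scale values carry no significance test; the logistic-regression route fits the same pairwise data within an inferential framework.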
[Triple-type theory of statistics and its application in the scientific research of biomedicine].
Hu, Liang-ping; Liu, Hui-gang
2005-07-20
To identify the crux of why so many people fail to grasp statistics, and to put forward a "triple-type theory of statistics" to solve the problem in a creative way. Based on long experience in teaching and research in statistics, the triple-type theory was developed and clarified. Examples are provided to demonstrate that the three types, i.e., the expressive type, the prototype, and the standardized type, are essential for people to apply statistics rationally in both theory and practice; instances also demonstrate that the three types are correlated with each other. The theory can help people see the essence of a problem when interpreting and analyzing experimental designs and statistical analyses in medical research. Investigations reveal that for some questions the three types are mutually identical; for some, the prototype is also the standardized type; and for others, the three types are distinct from each other. In some multifactor experimental research, no standardized type corresponding to the prototype exists at all, because the researchers committed the mistake of "incomplete control" in setting up experimental groups; this problem should be solved by the concept and method of "division". Once the triple-type for each question is clarified, a proper experimental design and statistical method can be chosen easily. The triple-type theory of statistics can help people avoid statistical mistakes, or at least decrease the misuse rate dramatically, and improve the quality, level, and speed of biomedical research when statistics is applied. It can also help improve the quality of statistical textbooks and the teaching of statistics.
Source apportionment of groundwater pollution around landfill site in Nagpur, India.
Pujari, Paras R; Deshpande, Vijaya
2005-12-01
The present work presents a statistical analysis of groundwater quality near a landfill site in Nagpur, India, with the objective of determining the impact of different factors on groundwater quality in the study area. Statistical analysis of the data was performed using factor analysis. The analysis brings out the effect of five different factors governing groundwater quality in the study area. Based on the contributions of the parameters present in the extracted factors, the latter are linked to the geological setting, leaching from the host rock, leachate of heavy metals from the landfill, and bacterial contamination from the landfill site and other anthropogenic activities. The analysis brings out the vulnerability of the unconfined aquifer to contamination.
Estimation of urban runoff and water quality using remote sensing and artificial intelligence.
Ha, S R; Park, S Y; Park, D H
2003-01-01
Water quality and the quantity of runoff are strongly dependent on land-use and land-cover (LULC) criteria. In this study, we developed an improved parameter estimation procedure for an environmental model using remote sensing (RS) and artificial intelligence (AI) techniques. Landsat TM multi-band (7 bands) and Korea Multi-Purpose Satellite (KOMPSAT) panchromatic data were selected for input data processing. We employed two kinds of artificial intelligence techniques, an RBF-NN (radial-basis-function neural network) and an ANN (artificial neural network), to classify the LULC of the study area. A bootstrap resampling method, a statistical technique, was employed to generate the confidence intervals and distribution of the unit load. SWMM was used to simulate the urban runoff and water quality and was applied to the study watershed. Urban flow and non-point contamination were simulated with rainfall-runoff and measured water quality data. The estimated total runoff, peak time, and pollutant generation varied considerably according to the classification accuracy and the percentile unit load applied. The proposed procedure can be efficiently applied to water quality and runoff simulation in a rapidly changing urban area.
Hyatt, M.W.; Hubert, W.A.
2001-01-01
We assessed relative weight (Wr) distributions among 291 samples of stock-to-quality-length brook trout Salvelinus fontinalis, brown trout Salmo trutta, rainbow trout Oncorhynchus mykiss, and cutthroat trout O. clarki from lentic and lotic habitats. Statistics describing Wr sample distributions varied slightly among species and habitat types. The average sample was leptokurtic and slightly skewed to the right with a standard deviation of about 10, but the shapes of Wr distributions varied widely among samples. Twenty-two percent of the samples had nonnormal distributions, suggesting the need to evaluate sample distributions before applying statistical tests to determine whether assumptions are met. In general, our findings indicate that samples of about 100 stock-to-quality-length fish are needed to obtain confidence interval widths of four Wr units around the mean. Power analysis revealed that samples of about 50 stock-to-quality-length fish are needed to detect a 2% change in mean Wr at a relatively high level of power (beta = 0.01, alpha = 0.05).
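The sample-size figure quoted above follows from the standard width of a normal-based confidence interval, w = 2zσ/√n, solved for n. A small sketch, assuming σ ≈ 10 Wr units and a 95% interval, as the abstract's figures suggest:

```python
from math import ceil
from statistics import NormalDist

def n_for_ci_width(sd, width, conf=0.95):
    """Smallest n giving a confidence interval of the requested total width."""
    z = NormalDist().inv_cdf(0.5 + conf / 2)  # two-sided critical value
    return ceil((2 * z * sd / width) ** 2)

# sd ~ 10 Wr units, target width 4 Wr units -> roughly 100 fish
print(n_for_ci_width(10, 4))  # -> 97
```

The computed 97 rounds naturally to the "about 100 fish" quoted in the abstract.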
Statistical process control applied to mechanized peanut sowing as a function of soil texture.
Zerbato, Cristiano; Furlani, Carlos Eduardo Angeli; Ormond, Antonio Tassio Santana; Gírio, Lucas Augusto da Silva; Carneiro, Franciele Morlin; da Silva, Rouverson Pereira
2017-01-01
The successful establishment of agricultural crops depends on sowing quality, machinery performance, soil type and conditions, among other factors. This study evaluates the operational quality of mechanized peanut sowing in three soil types (sand, silt, and clay) with variable moisture contents. The experiment was conducted in three locations in the state of São Paulo, Brazil. The track-sampling scheme was used for 80 sampling locations of each soil type. Descriptive statistics and statistical process control (SPC) were used to evaluate the quality indicators of mechanized peanut sowing. The variables had normal distributions and were stable from the viewpoint of SPC. The best performance for peanut sowing density, normal spacing, and the initial seedling growing stand was found for clayey soil followed by sandy soil and then silty soil. Sandy or clayey soils displayed similar results regarding sowing depth, which was deeper than in the silty soil. Overall, the texture and the moisture of clayey soil provided the best operational performance for mechanized peanut sowing.
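The SPC stability assessment the study relies on can be illustrated with an individuals (X) control chart, where 3-sigma limits are estimated from the average moving range. The sowing-density readings below are hypothetical; a process is judged stable when no points fall outside the limits.

```python
from statistics import mean

def individuals_chart(x):
    """Control limits for an individuals (X) chart using the average
    moving range; d2 = 1.128 for a moving range of span 2."""
    mr = [abs(b - a) for a, b in zip(x, x[1:])]
    sigma = mean(mr) / 1.128          # Shewhart estimate of process sigma
    center = mean(x)
    ucl, lcl = center + 3 * sigma, center - 3 * sigma
    out = [i for i, v in enumerate(x) if v > ucl or v < lcl]
    return center, lcl, ucl, out

# hypothetical sowing-density readings (seeds per metre)
data = [14.8, 15.1, 15.0, 14.9, 15.2, 15.0, 14.7, 15.1]
center, lcl, ucl, out = individuals_chart(data)
print(out)  # -> [] (all points within the limits: stable process)
```

Estimating sigma from the moving range rather than the overall standard deviation is the standard Shewhart choice: it captures short-term variation only, so drifts and shifts show up as out-of-control points.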
[Quality of life domains affected in women with breast cancer].
Garcia, Sabrina Nunes; Jacowski, Michele; Castro, Gisele Cordeiro; Galdino, Carila; Guimarães, Paulo Ricardo Bittencourt; Kalinke, Luciana Puchalski
2015-06-01
This study aimed to investigate the quality of life of women with breast cancer undergoing chemotherapy in the public and private health care systems. It is an observational, prospective study of 64 women with breast cancer. Data were collected with two instruments: the Quality of Life Questionnaire C30 and the Breast Cancer Module BR23. Applying the Mann-Whitney and Friedman statistical tests, p values < 0.05 were considered statistically significant. The significant results in the public health care system were physical function, pain symptoms, body image, systemic effects, and outlook for the future; in the private health care system, they were sexual function, social function, and body image. Women's quality of life was harmed by chemotherapy in both institutions, but in different domains, indicating the type of nursing care that should be provided according to the characteristics of each group.
A ranking index for quality assessment of forensic DNA profiles
2010-01-01
Background Assessment of DNA profile quality is vital in forensic DNA analysis, both in order to determine the evidentiary value of DNA results and to compare the performance of different DNA analysis protocols. Generally the quality assessment is performed through manual examination of the DNA profiles based on empirical knowledge, or by comparing the intensities (allelic peak heights) of the capillary electrophoresis electropherograms. Results We recently developed a ranking index for unbiased and quantitative quality assessment of forensic DNA profiles, the forensic DNA profile index (FI) (Hedman et al. Improved forensic DNA analysis through the use of alternative DNA polymerases and statistical modeling of DNA profiles, Biotechniques 47 (2009) 951-958). FI uses electropherogram data to combine the intensities of the allelic peaks with the balances within and between loci, using Principal Components Analysis. Here we present the construction of FI. We explain the mathematical and statistical methodologies used and present details about the applied data reduction method. Thereby we show how to adapt the ranking index for any Short Tandem Repeat-based forensic DNA typing system through validation against a manual grading scale and calibration against a specific set of DNA profiles. Conclusions The developed tool provides unbiased quality assessment of forensic DNA profiles. It can be applied for any DNA profiling system based on Short Tandem Repeat markers. Apart from crime related DNA analysis, FI can therefore be used as a quality tool in paternal or familial testing as well as in disaster victim identification. PMID:21062433
Impact of magnetic field strength and receiver coil in ocular MRI: a phantom and patient study.
Erb-Eigner, K; Warmuth, C; Taupitz, M; Willerding, G; Bertelmann, E; Asbach, P
2013-09-01
Generally, high-resolution MRI of the eye is performed with small loop surface coils. The purpose of this phantom and patient study was to investigate the influence of magnetic field strength and receiver coils on image quality in ocular MRI. The eyeball and the complex geometry of the facial bone were simulated by a skull phantom with swine eyes. MR images were acquired with two small loop surface coils with diameters of 4 cm and 7 cm and with a multi-channel head coil, at both 1.5 and 3 Tesla. Furthermore, MRI of the eye was performed prospectively in 20 patients at 1.5 Tesla (7 cm loop surface coil) and 3 Tesla (head coil). These images were analysed qualitatively and quantitatively, and statistical significance was tested using the Wilcoxon signed-rank test (a p-value of less than 0.05 was considered to indicate statistical significance). The analysis of the phantom images yielded the highest mean signal-to-noise ratio (SNR) at 3 Tesla with the 4 cm loop surface coil. In both the phantom experiment and the patient studies, the SNR was higher at 1.5 Tesla with the 7 cm surface coil than at 3 Tesla with the head coil. Concerning the delineation of anatomic structures, no statistically significant differences were found. Our results show that the influence of small loop surface coils on image quality (expressed as SNR) in ocular MRI is greater than the influence of the magnetic field strength. The similar visibility of detailed anatomy leads to the conclusion that the image quality of ocular MRI at 3 Tesla remains acceptable with the head coil as receiver coil. © Georg Thieme Verlag KG Stuttgart · New York.
Fu, Liya; Wang, You-Gan
2011-02-15
Environmental data usually include measurements, such as water quality data, which fall below detection limits because of limitations of the instruments or of certain analytical methods used. The fact that some responses are not detected needs to be properly taken into account in statistical analysis of such data. However, it is well known that analyzing a data set with detection limits is challenging, and analysts often have to rely on traditional parametric methods or simple imputation methods. Distributional assumptions can lead to biased inference, and justification of distributions is often not possible when the data are correlated and a large proportion of the data fall below detection limits; the extent of bias is usually unknown. To draw valid conclusions, and hence provide useful advice for environmental management authorities, it is essential to develop and apply an appropriate statistical methodology. This paper proposes rank-based procedures for analyzing non-normally distributed data collected at different sites over a period of time in the presence of multiple detection limits. To take account of temporal correlations within each site, we propose an optimal linear combination of estimating functions and apply the induced smoothing method to reduce the computational burden. Finally, we apply the proposed method to water quality data collected in the Susquehanna River Basin in the United States, which clearly demonstrates the advantages of the rank regression models.
Ibikunle, Adebayo A; Adeyemo, Wasiu L
2016-09-01
To evaluate the effect of ice pack therapy on oral health-related quality of life (OHRQoL) following third molar surgery. All consecutive subjects who required surgical extraction of lower third molars and satisfied the inclusion criteria were randomly allocated into two groups. Subjects in group A were instructed to apply ice packs intermittently directly over the masseteric region on the operated side after third molar surgery. The first application was supervised in the clinic and was repeated at the 24-h postoperative review. Subjects in group A were further instructed to apply the ice pack at home every one and a half hours on postoperative days 0 and 1 while awake. Group B subjects did not receive ice pack therapy. Facial swelling, pain, trismus, and quality of life (using the Oral Health Impact Profile-14 (OHIP-14) instrument) were evaluated both preoperatively and postoperatively, and postoperative scores in the two groups were compared. A significant postoperative increase in the mean total and subscale scores of the OHIP-14 was found in both groups compared with preoperative values. Subjects who received ice pack therapy had a better quality of life than those who did not: the proportion of subjects whose postoperative QoL was affected was statistically significantly higher in group B than in group A at all postoperative evaluation points (P < 0.05). Statistically significant differences were also observed between the groups in the various subscales analyzed, with better quality of life seen among subjects in group A. Quality of life after third molar surgery was significantly better in subjects who had cryotherapy than in those who did not. Cryotherapy is a viable alternative or adjunct to other established modes of improving the quality of life of patients following surgical extraction of third molars.
Defraene, Bruno; van Waterschoot, Toon; Diehl, Moritz; Moonen, Marc
2016-07-01
Subjective audio quality evaluation experiments have been conducted to assess the performance of embedded-optimization-based precompensation algorithms for mitigating perceptible linear and nonlinear distortion in audio signals. It is concluded with statistical significance that the perceived audio quality is improved by applying an embedded-optimization-based precompensation algorithm, both when (i) nonlinear distortion alone is present and when (ii) a combination of linear and nonlinear distortion is present. Moreover, a significant positive correlation is reported between the collected subjective and objective PEAQ audio quality scores, supporting the validity of using PEAQ to predict the impact of linear and nonlinear distortion on the perceived audio quality.
Assessing National Data on Education.
ERIC Educational Resources Information Center
Plisko, Valena White; And Others
This paper applies questions of coverage, quality and linkages to the current collection of national statistics on education at the preprimary, elementary/secondary, and higher education levels. The main questions raised at the preprimary level pertain to availability of programs, standards, and family-school interaction. At the…
Statistical Quality Control of Moisture Data in GEOS DAS
NASA Technical Reports Server (NTRS)
Dee, D. P.; Rukhovets, L.; Todling, R.
1999-01-01
A new statistical quality control algorithm was recently implemented in the Goddard Earth Observing System Data Assimilation System (GEOS DAS). The final step in the algorithm consists of an adaptive buddy check that either accepts or rejects outlier observations based on a local statistical analysis of nearby data. A basic assumption in any such test is that the observed field is spatially coherent, in the sense that nearby data can be expected to confirm each other. However, the buddy check resulted in excessive rejection of moisture data, especially during the Northern Hemisphere summer. The analysis moisture variable in GEOS DAS is water vapor mixing ratio. Observational evidence shows that the distribution of mixing ratio errors is far from normal. Furthermore, spatial correlations among mixing ratio errors are highly anisotropic and difficult to identify. Both factors contribute to the poor performance of the statistical quality control algorithm. To alleviate the problem, we applied the buddy check to relative humidity data instead. This variable explicitly depends on temperature and therefore exhibits a much greater spatial coherence. As a result, reject rates of moisture data are much more reasonable and homogeneous in time and space.
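The buddy check described above can be sketched as follows: each suspect observation is compared with the mean of the remaining nearby data and rejected only when the discrepancy exceeds a tolerance scaled by the local spread. The data, threshold, and function below are hypothetical illustrations of the idea, not the GEOS DAS implementation.

```python
from statistics import mean, pstdev

def buddy_check(obs, suspects, tol=3.0):
    """Accept or reject each suspect value against the remaining ('buddy')
    observations: reject when it lies more than tol local standard
    deviations from the buddy mean."""
    decisions = {}
    for i in suspects:
        buddies = [v for j, v in enumerate(obs) if j != i]
        m, s = mean(buddies), pstdev(buddies)
        decisions[i] = "reject" if abs(obs[i] - m) > tol * s else "accept"
    return decisions

# hypothetical relative-humidity observations (%); index 4 is an outlier
rh = [72.0, 70.5, 71.2, 69.8, 30.0, 70.9]
print(buddy_check(rh, suspects=[4, 5]))  # index 4 rejected, index 5 accepted
```

The sketch also shows why the choice of analysis variable matters: the test assumes nearby values confirm each other, which holds for the spatially coherent relative humidity far better than for mixing ratio.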
Murphy, Thomas; Schwedock, Julie; Nguyen, Kham; Mills, Anna; Jones, David
2015-01-01
New recommendations for the validation of rapid microbiological methods have been included in the revised Technical Report 33 release from the PDA. The changes include a more comprehensive review of the statistical methods to be used to analyze data obtained during validation. This case study applies those statistical methods to accuracy, precision, ruggedness, and equivalence data obtained using a rapid microbiological methods system being evaluated for water bioburden testing. Results presented demonstrate that the statistical methods described in the PDA Technical Report 33 chapter can all be successfully applied to the rapid microbiological method data sets and gave the same interpretation for equivalence to the standard method. The rapid microbiological method was in general able to pass the requirements of PDA Technical Report 33, though the study shows that there can be occasional outlying results and that caution should be used when applying statistical methods to low average colony-forming unit values. Prior to use in a quality-controlled environment, any new method or technology has to be shown to work as designed by the manufacturer for the purpose required. For new rapid microbiological methods that detect and enumerate contaminating microorganisms, additional recommendations have been provided in the revised PDA Technical Report No. 33. The changes include a more comprehensive review of the statistical methods to be used to analyze data obtained during validation. This paper applies those statistical methods to analyze accuracy, precision, ruggedness, and equivalence data obtained using a rapid microbiological method system being validated for water bioburden testing. The case study demonstrates that the statistical methods described in the PDA Technical Report No. 33 chapter can be successfully applied to rapid microbiological method data sets and give the same comparability results for similarity or difference as the standard method. © PDA, Inc. 2015.
Application of Statistical Quality Control Techniques to Detonator Fabrication: Feasibility Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, J. Frank
1971-05-20
A feasibility study was performed on the use of process control techniques that might reduce the need for duplicate inspection by both production inspection and quality control inspection. Two active detonator fabrication programs were selected for the study. Inspection areas accounting for the greatest percentage of total inspection costs were selected by applying "Pareto's Principle of Maldistribution." Data from these areas were then gathered and analyzed by a process capability study.
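Pareto's principle, as applied in the study, amounts to ranking inspection areas by cost and keeping the few that dominate the total. A sketch with hypothetical inspection categories and cost figures (none of these names or numbers come from the report):

```python
def pareto_select(costs, share=0.8):
    """Return the smallest set of categories that together account for
    at least `share` of total cost, largest first."""
    total = sum(costs.values())
    selected, running = [], 0.0
    for name, cost in sorted(costs.items(), key=lambda kv: -kv[1]):
        selected.append(name)
        running += cost
        if running / total >= share:
            break
    return selected

# hypothetical inspection-cost breakdown (arbitrary units)
inspection_costs = {"bridgewire weld": 450, "header seal": 300,
                    "cup crimp": 120, "paint mark": 80, "label": 50}
print(pareto_select(inspection_costs))  # the few areas covering >= 80% of cost
```

Concentrating the process capability study on the selected few areas is what makes the duplicate-inspection reduction economical.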
Ding, S N; Pan, H Y; Zhang, J G
2017-03-14
Objective: To evaluate the methodological quality and impact on outcomes of systematic reviews (SRs) of accelerated rehabilitation versus traditional control for colorectal surgery. Methods: We comprehensively searched six databases and additional websites to collect SRs or meta-analyses from inception to July 2016. The Overview Quality Assessment Questionnaire (OQAQ) was applied for quality assessment of the included studies, the tools recommended by the Cochrane Collaboration were applied for quality assessment of RCTs and CCTs, and the Newcastle-Ottawa Scale (NOS) was applied to assess observational studies. Relative risks (RRs) and 95% confidence intervals (CIs) were pooled using Review Manager 5.3 software. Results: Fourteen meta-analyses were included in total. The mean OQAQ score was 3.8 (95% CI 3.2 to 4.3). Only three meta-analyses were assessed as good quality, and two studies misused statistical models. A total of 42 primary studies referenced by the meta-analyses were included, of which 25 RCTs were rated grade B and 1 CCT was rated grade C. The mean NOS score of the 16 observational studies was 6.75 out of a possible 9 (95% CI 6.4 to 7.1); 10 studies scoring ≥7 were of high quality, and 6 studies scoring 6 were of moderate quality. Conclusions: Currently, the overall quality of meta-analyses comparing the effects and safety of accelerated rehabilitation and traditional control for colorectal surgery is fairly poor and the evidence level is low. Health providers should apply the evidence with caution in clinical practice.
Selected papers in the hydrologic sciences, 1986
Subitzky, Seymour
1987-01-01
Water-quality data from long-term (24 years), fixed-station monitoring at the Cape Fear River at Lock 1 near Kelly, N.C., and various measures of basin development are correlated. Subbasin population, number of acres of cropland in the subbasin, number of people employed in manufacturing, and tons of fertilizer applied in the basin are considered as measures of basinwide development activity. Linear correlations show statistically significant positive relations between both population and manufacturing activity and most of the dissolved constituents considered. Negative correlations were found between the acres of harvested cropland and most of the water-quality measures. The amount of fertilizer sold in the subbasin was not statistically related to the water-quality measures considered in this report. The statistical analysis was limited to several commonly used measures of water quality including specific conductance, pH, dissolved solids, several major dissolved ions, and a few nutrients. The major dissolved ions included in the analysis were calcium, sodium, potassium, magnesium, chloride, sulfate, silica, bicarbonate, and fluoride. The nutrients included were dissolved nitrite plus nitrate nitrogen, dissolved ammonia nitrogen, total nitrogen, dissolved phosphates, and total phosphorus. For the chemicals evaluated, manufacturing and population sources are more closely associated with water quality in the Cape Fear River at Lock 1 than are agricultural variables.
Quality evaluation of no-reference MR images using multidirectional filters and image statistics.
Jang, Jinseong; Bang, Kihun; Jang, Hanbyol; Hwang, Dosik
2018-09-01
This study aimed to develop a fully automatic, no-reference image-quality assessment (IQA) method for MR images. New quality-aware features were obtained by applying multidirectional filters to MR images and examining the feature statistics. A histogram of these features was then fitted to a generalized Gaussian distribution function for which the shape parameters yielded different values depending on the type of distortion in the MR image. Standard feature statistics were established through a training process based on high-quality MR images without distortion. Subsequently, the feature statistics of a test MR image were calculated and compared with the standards. The quality score was calculated as the difference between the shape parameters of the test image and the undistorted standard images. The proposed IQA method showed a >0.99 correlation with the conventional full-reference assessment methods; accordingly, this proposed method yielded the best performance among no-reference IQA methods for images containing six types of synthetic, MR-specific distortions. In addition, for authentically distorted images, the proposed method yielded the highest correlation with subjective assessments by human observers, thus demonstrating its superior performance over other no-reference IQAs. Our proposed IQA was designed to consider MR-specific features and outperformed other no-reference IQAs designed mainly for photographic images. Magn Reson Med 80:914-924, 2018. © 2018 International Society for Magnetic Resonance in Medicine.
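The feature-statistics step above relies on fitting a generalized Gaussian distribution (GGD) whose shape parameter shifts with the type of distortion. A minimal sketch of moment-matching GGD shape estimation follows; the function name, search bounds and bisection scheme are illustrative assumptions, not the authors' implementation:

```python
import math
import numpy as np

def ggd_shape(samples, lo=0.1, hi=10.0, iters=80):
    """Estimate the shape parameter b of a zero-mean generalized Gaussian
    distribution by moment matching: for a GGD, the ratio
    E[x^2] / (E[|x|])^2 equals Gamma(1/b)*Gamma(3/b)/Gamma(2/b)**2.
    The ratio decreases as b grows, so the equation is solved by bisection."""
    x = np.asarray(samples, dtype=float)
    r = np.mean(x**2) / np.mean(np.abs(x))**2
    rho = lambda b: math.gamma(1/b) * math.gamma(3/b) / math.gamma(2/b)**2
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if rho(mid) > r:      # observed ratio implies a larger shape
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Gaussian data has shape b = 2; heavier-tailed distortions pull b below 2.
rng = np.random.default_rng(0)
b_hat = ggd_shape(rng.normal(size=100000))
```

A quality score in this spirit would compare `b_hat` for a test image's filtered features against the value learned from undistorted training images.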
Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beggs, W.J.
1981-02-01
This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.
Ueno, Tamio; Matuda, Junichi; Yamane, Nobuhisa
2013-03-01
To evaluate the occurrence of out-of-acceptable-range results and the accuracy of antimicrobial susceptibility tests, we applied a new statistical tool to the Inter-Laboratory Quality Control Program established by the Kyushu Quality Control Research Group. First, we defined acceptable ranges of minimum inhibitory concentration (MIC) for broth microdilution tests and of inhibitory zone diameter for disk diffusion tests on the basis of Clinical and Laboratory Standards Institute (CLSI) document M100-S21. In the analysis, more than two out-of-acceptable-range results in 20 tests were considered not allowable according to the CLSI document. Of the 90 participating laboratories, 46 (51%) experienced one or more out-of-acceptable-range results. A binomial test was then applied to each participating laboratory. The results indicated that the occurrence of out-of-acceptable-range results in 11 laboratories was significantly higher than the CLSI recommendation (allowable rate ≤ 0.05). Standard deviation indices (SDI) were calculated using the reported results and the mean and standard deviation values for the respective antimicrobial agents tested. In the evaluation of accuracy, the mean value from each laboratory was statistically compared with zero using Student's t-test. The results revealed that 5 of the 11 laboratories above reported erroneous test results that systematically drifted towards the resistant side. In conclusion, our statistical approach has enabled us to detect significantly higher occurrences and the sources of interpretive errors in antimicrobial susceptibility tests; it can therefore provide additional information to improve the accuracy of test results in clinical microbiology laboratories.
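The laboratory-level check described above can be sketched as an exact binomial tail test: with an allowable error rate of 0.05, how surprising is a laboratory's count of out-of-acceptable-range results in 20 tests? This is an illustrative reconstruction, and the function name is an assumption:

```python
from math import comb

def binom_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): the probability of observing k or
    more out-of-acceptable-range results in n tests if each test goes out
    of range independently with probability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# A laboratory with 4 out-of-range results in 20 tests, judged against the
# allowable rate of 0.05: a tail probability below 0.05 flags the lab.
p_value = binom_tail(4, 20, 0.05)
```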
Pitfalls in statistical landslide susceptibility modelling
NASA Astrophysics Data System (ADS)
Schröder, Boris; Vorpahl, Peter; Märker, Michael; Elsenbeer, Helmut
2010-05-01
The use of statistical methods is a well-established approach to predicting landslide occurrence probabilities and assessing landslide susceptibility, achieved by relating historical landslide inventories to topographic indices as predictor variables. In our contribution, we compare several new and powerful methods developed in machine learning and well established in landscape ecology and macroecology for predicting the distribution of shallow landslides in tropical mountain rainforests in southern Ecuador (among others: boosted regression trees, multivariate adaptive regression splines, maximum entropy). Although these methods are powerful, we think it is necessary to follow a basic set of guidelines to avoid pitfalls regarding data sampling, predictor selection and model quality assessment, especially if a comparison of different models is contemplated. We therefore suggest applying a novel toolbox to evaluate approaches to the statistical modelling of landslide susceptibility. Additionally, we propose some methods to open the "black box" inherent in machine learning methods in order to gain further explanatory insights into the preparatory factors that control landslides. Sampling of training data should be guided by hypotheses regarding the processes that lead to slope failure, taking into account their respective spatial scales. This approach leads to the selection of a set of candidate predictor variables considered at adequate spatial scales. This set should be checked for multicollinearity in order to facilitate the interpretation of model response curves. Model quality assessment evaluates how well a model is able to reproduce independent observations of its response variable. This includes criteria to evaluate different aspects of model performance, i.e. model discrimination, model calibration and model refinement.
To assess a possible violation of the assumption of independence in the training samples, or a possible lack of explanatory information in the chosen set of predictor variables, the model residuals need to be checked for spatial autocorrelation; we therefore calculate spline correlograms. In addition, we investigate partial dependency plots and bivariate interaction plots, considering possible interactions between predictors, to improve model interpretation. In presenting this toolbox for model quality assessment, we investigate the influence of training-dataset construction strategies on the quality of statistical models.
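The authors use spline correlograms to check residuals; as a simpler stand-in for the same diagnostic idea, the sketch below computes Moran's I for model residuals using binary distance weights. The statistic, distance cutoff and function name are illustrative assumptions, not the toolbox described in the abstract:

```python
import numpy as np

def morans_i(values, coords, cutoff):
    """Moran's I for model residuals: values near +1 indicate that nearby
    residuals are similar (spatial autocorrelation), values near 0 indicate
    spatial independence.  Binary weights: pairs closer than `cutoff`."""
    z = np.asarray(values, float) - np.mean(values)
    xy = np.asarray(coords, float)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    w = (d > 0) & (d < cutoff)          # exclude self-pairs
    n, s0 = len(z), w.sum()
    return (n / s0) * (z[None, :] * z[:, None] * w).sum() / (z @ z)
```

Values near zero are consistent with spatially independent residuals; clearly positive values signal structure that the chosen predictors have not absorbed.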
Duncan, Fiona; Haigh, Carol
2013-10-01
To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. The design was process control and quality improvement. Routine, prospectively acquired data collection started in 2006. Patients were asked about their pain and the side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean pain score on a verbal numerical rating scale (VNRS) was four. There was no special-cause variation when data were stratified by surgeon, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The application of Statistical Process Control methods offers the potential to learn more about the process of change and outcomes in an Acute Pain Service, both locally and nationally.
We have been able to develop measures for improvement and benchmarking in routine care that have led to the establishment of a national pain registry. © 2013 Blackwell Publishing Ltd.
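The key measure above, the percentage of patients with an epidural in severe pain per audit period, is a proportion, so a p-chart is the natural Statistical Process Control tool. A minimal sketch with illustrative counts (not the study's data) follows:

```python
import numpy as np

def p_chart_limits(failures, sample_sizes):
    """Three-sigma control limits for a p-chart monitoring the proportion
    of patients reporting severe pain in each audit period."""
    failures = np.asarray(failures, float)
    n = np.asarray(sample_sizes, float)
    p_bar = failures.sum() / n.sum()          # overall proportion
    sigma = np.sqrt(p_bar * (1 - p_bar) / n)  # per-period standard error
    ucl = np.clip(p_bar + 3 * sigma, 0, 1)    # upper control limit
    lcl = np.clip(p_bar - 3 * sigma, 0, 1)    # lower control limit
    return p_bar, lcl, ucl

# Monthly counts of epidural patients in severe pain (illustrative numbers):
fails = [9, 7, 12, 5, 14, 6]
sizes = [30, 28, 33, 27, 35, 30]
p_bar, lcl, ucl = p_chart_limits(fails, sizes)
out_of_control = [(f / n < l) or (f / n > u)
                  for f, n, l, u in zip(fails, sizes, lcl, ucl)]
```

Points outside the limits indicate special-cause variation worth a multidisciplinary review; points inside reflect the common-cause variation of the current process.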
Statistical Techniques for Assessing Water-Quality Effects of BMPs
Walker, John F.
1994-01-01
Little has been published on the effectiveness of various management practices in small rural lakes and streams at the watershed scale. In this study, statistical techniques were used to test for changes in water‐quality data from watersheds where best management practices (BMPs) were implemented. Reductions in data variability due to climate and seasonality were accomplished through the use of regression methods. This study discusses the merits of using storm‐mass‐transport data as a means of improving the ability to detect BMP effects on stream‐water quality. Statistical techniques were applied to suspended‐sediment records from three rural watersheds in Illinois for the period 1981–84. None of the techniques identified changes in suspended sediment, primarily because of the small degree of BMP implementation and because of potential errors introduced through the estimation of storm‐mass transport. A Monte Carlo sensitivity analysis was used to determine the level of discrete change that could be detected for each watershed. In all cases, the use of regressions improved the ability to detect trends.
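A Monte Carlo sensitivity analysis of the kind mentioned above can be sketched as follows: repeatedly simulate a step change against residual noise and report the smallest step detected with the desired power. This is a simplified illustration using a two-sample z-test; the paper's actual procedure and parameters are not specified here:

```python
import numpy as np

def detectable_step(n_per_period, sigma, alpha_z=1.96, power=0.8,
                    trials=500, seed=0):
    """Monte Carlo estimate of the smallest step change in the mean that a
    two-sample z-test detects with the requested power, given residual noise
    of standard deviation `sigma` (a simplified sketch of a detectable-change
    sensitivity analysis)."""
    rng = np.random.default_rng(seed)
    se = sigma * np.sqrt(2.0 / n_per_period)
    for step in np.arange(0.0, 5.0 * sigma, 0.05 * sigma):
        hits = 0
        for _ in range(trials):
            pre = rng.normal(0.0, sigma, n_per_period)
            post = rng.normal(step, sigma, n_per_period)
            if abs((post.mean() - pre.mean()) / se) > alpha_z:
                hits += 1
        if hits / trials >= power:
            return step
    return None
```

With 50 observations per period and unit noise, the theoretical detectable step is roughly 2.8 standard errors; the simulation recovers a value close to that.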
Implementing clinical protocols in oncology: quality gaps and the learning curve phenomenon.
Kedikoglou, Simos; Syrigos, Konstantinos; Skalkidis, Yannis; Ploiarchopoulou, Fani; Dessypris, Nick; Petridou, Eleni
2005-08-01
The quality improvement effort in clinical practice has focused mostly on 'performance quality', i.e. on the development of comprehensive, evidence-based guidelines. This study aimed to assess the 'conformance quality', i.e. the extent to which guidelines once developed are correctly and consistently applied. It also aimed to assess the existence of quality gaps in the treatment of certain patient segments as defined by age or gender and to investigate methods to improve overall conformance quality. A retrospective audit of clinical practice in a well-defined oncology setting was undertaken and the results compared to those obtained from prospectively applying an internally developed clinical protocol in the same setting and using specific tools to increase conformance quality. All indicators showed improvement after the implementation of the protocol that in many cases reached statistical significance, while in the entire cohort advanced age was associated (although not significantly) with sub-optimal delivery of care. A 'learning curve' phenomenon in the implementation of quality initiatives was detected, with all indicators improving substantially in the second part of the prospective study. Clinicians should pay separate attention to the implementation of chosen protocols and employ specific tools to increase conformance quality in patient care.
What affects the subjective sleep quality of hospitalized elderly patients?
Park, Mi Jeong; Kim, Kon Hee
2017-03-01
The present study aimed to identify the factors affecting subjective sleep quality in elderly inpatients. The participants were 290 older adults admitted to three general hospitals. Data were collected using a structured questionnaire consisting of scales for general characteristics, sleep quality, activities of daily living, instrumental activities of daily living and depression. Collected data were analyzed by descriptive statistics, t-test, one-way ANOVA, Scheffé post-hoc test, Pearson's correlation coefficient and stepwise multiple regression. There were statistically significant differences in sleep quality according to age, education level, marital status, monthly income and number of cohabitants. The most powerful predictor of sleep quality was depression (P < 0.01, R² = 0.30). Five variables (depression, perceived health status, diagnosis, number of cohabitants and duration of hospitalization) explained 43.0% of the total variance in sleep quality. Elderly inpatients suffered from low sleep quality, and depression affected their sleep. Hospital-tailored sleep interventions that take older adults' depression into account should be developed and applied to improve the sleep of hospitalized older adults. It would also be useful to identify other sleep-related factors. Geriatr Gerontol Int 2017; 17: 471-479. © 2016 Japan Geriatrics Society.
Cicchetti, D V; Rosenheck, R; Showalter, D; Charney, D; Cramer, J
1999-05-01
Sir Ronald Fisher used a single-subject design to derive the concepts of appropriate research design, randomization, sensitivity, and tests of statistical significance. The seminal work of Broca demonstrated that valid and generalizable findings can and have emerged from studies of a single patient in neuropsychology. In order to assess the reliability and/or validity of any clinical phenomena that derive from single subject research, it becomes necessary to apply appropriate biostatistical methodology. The authors develop just such an approach and apply it successfully to the evaluation of the functioning, quality of life, and neuropsychological symptomatology of a single schizophrenic patient.
Olmo, M; Galvan, L; Capdevila, J; Serna, C; Mangues, I; Schoenenberger, J A
2011-01-01
To verify that implementing a policy of management by objectives, based on collaboration between hospital pharmacy, primary care and specialised medical managers, improves prescription quality indicators in specialised care and reduces unwanted "induced" prescriptions (i.e. those issued by specialists, hospital doctors or the patients themselves) in primary care. A four-year quasi-experimental controlled intervention study of prescriptions at discharge and in outpatient hospital consultations was conducted. In hospital A, a quality cycle was applied (assessment, identification of improvement opportunities, implementation of corrective actions and re-assessment); it was not applied in control hospital B. The indicators chosen were the percentage of generic medicines prescribed, the percentage of prescriptions for new therapies with no added value and the percentage of recommended ACE-inhibitor prescriptions. In hospital A, statistically significant increases in indicators 1 and 3 were observed between the year before the intervention and its last year. Hospital A reduced indicator 2 to 4.5%, while this indicator increased to 8.8% in hospital B. Furthermore, a statistically significant difference in the indicators between the two hospitals was registered. Pay-for-performance programs in the prescription practices of hospital physicians are effective actions to improve quality indicators of medication use. Copyright © 2010 SEFH. Published by Elsevier Espana. All rights reserved.
Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio
2013-03-01
To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a score of <6. Our findings document that only a few of the studies reviewed applied statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.
The accurate assessment of small-angle X-ray scattering data
Grant, Thomas D.; Luft, Joseph R.; Carter, Lester G.; ...
2015-01-23
Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics to quantitatively describe SAXS data in an objective manner using statistical evaluations are defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. Studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality.
Comfort and quality of life in patients with breast cancer undergoing radiation therapy.
Pehlivan, Seda; Kuzhan, Abdurrahman; Yildirim, Yasemin; Fadiloglu, Cicek
2016-01-01
Radiation therapy is generally applied after surgery for the treatment of breast cancer, which is among the most frequently observed types of cancer in females. Radiation therapy may negatively affect quality of life through various side effects such as skin changes, mucositis and fatigue. Our study was planned as a descriptive study to examine the relationship between comfort and quality of life in breast cancer patients undergoing radiation therapy. The study involved 61 patients with breast cancer undergoing radiation therapy. Data were collected using the "Patient Information Form", the "Radiation Therapy Comfort Questionnaire" and the "EORTC QLQ-BR23". The scales were applied twice, before the start and at the end of treatment. Data were evaluated via the Wilcoxon test and Spearman correlation analyses. No statistically significant difference was found in mean comfort and quality of life scores before and after radiotherapy (p>0.05). A positive relationship was found between pain and the symptom quality of life domain (p<0.05). A positive relationship was also found between the comfort score and the functional and general quality of life domains, together with a negative relationship with the symptom quality of life domain (p<0.01). Radiation therapy did not affect the comfort and quality of life of breast cancer patients; rather, patients' quality of life increased along with their comfort levels, and comfort levels decreased as the experienced symptoms increased.
A quality assessment of randomized controlled trial reports in endodontics.
Lucena, C; Souza, E M; Voinea, G C; Pulgar, R; Valderrama, M J; De-Deus, G
2017-03-01
To assess the quality of the randomized clinical trial (RCT) reports published in Endodontics between 1997 and 2012. Retrieval of RCTs in Endodontics was based on a search of the Thomson Reuters Web of Science (WoS) database (March 2013). Quality evaluation was performed using a checklist based on the Jadad criteria, the CONSORT (Consolidated Standards of Reporting Trials) statement and SPIRIT (Standard Protocol Items: Recommendations for Interventional Trials). Descriptive statistics were used for the frequency distribution of data. Student's t-test and the Welch test were used to identify the influence of certain trial characteristics upon report quality (α = 0.05). A total of 89 RCTs were evaluated, and several methodological flaws were found: only 45% had random sequence generation at low risk of bias, 75% did not provide information on allocation concealment, and 19% were nonblinded designs. Regarding statistics, only 55% of the RCTs performed adequate sample size estimations, only 16% presented confidence intervals, and 25% did not provide the exact P-value. Also, 2% of the articles used no statistical tests, and in 87% of the RCTs, the information provided was insufficient to determine whether the statistical methodology applied was appropriate or not. Significantly higher scores were observed for multicentre trials (P = 0.023), RCTs signed by more than 5 authors (P = 0.03), articles belonging to journals ranked above the JCR median (P = 0.03), and articles complying with the CONSORT guidelines (P < 0.001). The quality of RCT reports in key areas for the internal validity of the study was poor. Several measures, such as compliance with the CONSORT guidelines, are important in order to raise the quality of RCTs in Endodontics. © 2016 International Endodontic Journal. Published by John Wiley & Sons Ltd.
Statistical analysis of global horizontal solar irradiation GHI in Fez city, Morocco
NASA Astrophysics Data System (ADS)
Bounoua, Z.; Mechaqrane, A.
2018-05-01
An accurate knowledge of the solar energy reaching the ground is necessary for sizing solar installations and optimizing their performance. This paper describes a statistical analysis of global horizontal solar irradiation (GHI) at Fez city, Morocco. For better reliability, we first applied a set of check procedures to test the quality of the hourly GHI measurements and eliminated erroneous values, which are generally due to measurement or cosine-effect errors. The statistical analysis shows that the annual mean daily GHI is approximately 5 kWh/m²/day. Monthly mean daily values and other parameters are also calculated.
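The abstract does not spell out its check procedures, but a typical physical-limits test for hourly GHI compares each value against the extraterrestrial irradiance on a horizontal plane. The sketch below is a generic check in that spirit; the solar constant and tolerance terms are illustrative assumptions:

```python
def ghi_plausible(ghi_wm2, cos_zenith, solar_constant=1361.0):
    """Coarse physical-limit check for an hourly GHI measurement: the value
    must be non-negative and below the extraterrestrial irradiance on a
    horizontal plane, with a small allowance for cloud-enhancement events.
    The 1.2 factor and 50 W/m^2 offset are illustrative, not standardized."""
    if ghi_wm2 < 0:
        return False
    top_of_atmosphere = solar_constant * max(cos_zenith, 0.0)
    return ghi_wm2 <= 1.2 * top_of_atmosphere + 50.0
```

Hourly records failing the check would be discarded before computing the daily and monthly means reported in the paper.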
Added value in health care with six sigma.
Lenaz, Maria P
2004-06-01
Six sigma is the structured application of the tools and techniques of quality management, applied on a project basis, that can enable organizations to achieve superior performance and strategic business results. The Greek letter sigma is used as a statistical term that measures how much a process varies from perfection, based on the number of defects per million units. Health care organizations using this model proceed from lower levels of quality performance to the highest level, at which the process is nearly error free.
NASA Astrophysics Data System (ADS)
Lee, Soon Hwan; Kim, Ji Sun; Lee, Kang Yeol; Shon, Keon Tae
2017-04-01
Air quality in Korea is worsening due to increasing particulate matter (PM). At present, the PM forecast is announced based on PM concentrations predicted by an air-quality numerical prediction model. However, forecast accuracy is not as high as expected because of various uncertainties in the physical and chemical characteristics of PM. The purpose of this study was to develop a numerical-statistical ensemble model to improve the accuracy of PM10 concentration predictions. The numerical models used in this study are the three-dimensional atmospheric Weather Research and Forecasting (WRF) model and the Community Multiscale Air Quality (CMAQ) model. The target areas for the PM forecast are the Seoul, Busan, Daegu and Daejeon metropolitan areas in Korea. The data used in the model development are PM concentrations and CMAQ predictions over a three-month period (March 1 - May 31, 2014). A dynamic-statistical technique for reducing the systematic error of the CMAQ predictions was applied via a dynamic linear model (DLM) based on Bayesian Kalman filtering. Applying the corrections generated by the dynamic linear model to the forecast PM concentrations improved accuracy, particularly at the high PM concentrations where the damage is relatively large.
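The bias-correction idea, a dynamic linear model updated by Kalman filtering, can be sketched as a local-level model that tracks the slowly drifting systematic error of the raw forecast. The evolution and observation variances below are illustrative assumptions, not the study's tuned values:

```python
import numpy as np

def kalman_bias(errors, q=0.01, r=1.0):
    """Track the drifting systematic error (bias) of a model forecast with a
    local-level dynamic linear model: the state is the bias, evolving as a
    random walk with variance q, observed with noise variance r.  Returns
    the filtered bias estimate after each observed forecast error."""
    bias, p = 0.0, 1.0           # initial state mean and variance
    out = []
    for e in errors:
        p += q                   # predict: bias persists, uncertainty grows
        k = p / (p + r)          # Kalman gain
        bias += k * (e - bias)   # update with the observed forecast error
        p *= (1.0 - k)
        out.append(bias)
    return np.array(out)

# Tomorrow's corrected forecast would subtract the latest bias estimate:
# corrected = raw_cmaq_forecast - kalman_bias(past_forecast_errors)[-1]
```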
Analysis of Statistical Methods and Errors in the Articles Published in the Korean Journal of Pain
Yim, Kyoung Hoon; Han, Kyoung Ah; Park, Soo Young
2010-01-01
Background Statistical analysis is essential for obtaining objective reliability in medical research. However, medical researchers do not have enough statistical knowledge to properly analyze their study data. To help understand and potentially alleviate this problem, we have analyzed the statistical methods and errors of articles published in the Korean Journal of Pain (KJP), with the intention of improving the statistical quality of the journal. Methods All the articles, except case reports and editorials, published from 2004 to 2008 in the KJP were reviewed. The types of applied statistical methods and errors in the articles were evaluated. Results One hundred and thirty-nine original articles were reviewed. Inferential statistics and descriptive statistics were used in 119 papers and 20 papers, respectively. Only 20.9% of the papers were free from statistical errors. The most commonly adopted statistical method was the t-test (21.0%), followed by the chi-square test (15.9%). Errors of omission were encountered 101 times in 70 papers. Among the errors of omission, "no statistics used even though statistical methods were required" was the most common (40.6%). Errors of commission were encountered 165 times in 86 papers, among which "parametric inference for nonparametric data" was the most common (33.9%). Conclusions We found various types of statistical errors in the articles published in the KJP. This suggests that meticulous attention should be given not only to applying statistical procedures but also to the reviewing process, to improve the value of the articles. PMID:20552071
NASA Technical Reports Server (NTRS)
Benediktsson, Jon A.; Swain, Philip H.; Ersoy, Okan K.
1990-01-01
Neural network learning procedures and statistical classification methods are applied and compared empirically in the classification of multisource remote sensing and geographic data. Statistical multisource classification, by means of a method based on Bayesian classification theory, is also investigated and modified. The modifications permit control of the influence of the data sources involved in the classification process. Reliability measures are introduced to rank the quality of the data sources. The data sources are then weighted according to these rankings in the statistical multisource classification. Four data sources are used in the experiments: Landsat MSS data and three forms of topographic data (elevation, slope and aspect). Experimental results show that the two different approaches have unique advantages and disadvantages in this classification application.
An introduction to statistical process control in research proteomics.
Bramwell, David
2013-12-16
Statistical process control is a well-established and respected method which provides a general-purpose, consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. The aim of this review is to introduce statistical process control as an objective strategy for quality control and to show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multi-variate. QC is critical for clinical chemistry measurements and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single-analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author.
Published by Elsevier B.V. All rights reserved.
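Control rules of the kind the author argues are easy to derive can be illustrated for a monitored QC metric (for example, the intensity of a spike-in standard in each run). The two rules and thresholds below are classic control-chart choices, not ones prescribed by the article:

```python
import numpy as np

def control_violations(x, mean, sd):
    """Two basic control-chart rules for a monitored QC metric:
    flag a point beyond 3 sigma of the reference mean, or the last
    of 8 consecutive points falling on one side of the mean."""
    x = np.asarray(x, float)
    flags = []
    for i, v in enumerate(x):
        beyond_3sigma = abs(v - mean) > 3 * sd
        window = x[max(0, i - 7):i + 1]
        run_of_8 = len(window) == 8 and (np.all(window > mean)
                                         or np.all(window < mean))
        if beyond_3sigma or run_of_8:
            flags.append(i)
    return flags
```

A flagged run would prompt investigation of the instrument or sample-preparation step before its data are trusted in a discovery experiment.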
Simultaneous Analysis and Quality Assurance for Diffusion Tensor Imaging
Simultaneous analysis and quality assurance for diffusion tensor imaging.
Lauzon, Carolyn B.; Asman, Andrew J.; Esparza, Michael L.; Burns, Scott S.; Fan, Qiuyun; Gao, Yurui; Anderson, Adam W.; Davis, Nicole; Cutting, Laurie E.; Landman, Bennett A.
2013-01-01
Diffusion tensor imaging (DTI) enables non-invasive, cyto-architectural mapping of in vivo tissue microarchitecture through voxel-wise mathematical modeling of multiple magnetic resonance imaging (MRI) acquisitions, each differently sensitized to water diffusion. DTI computations are fundamentally estimation processes and are sensitive to noise and artifacts. Despite widespread adoption in the neuroimaging community, maintaining consistent DTI data quality remains challenging given the propensity for patient motion, artifacts associated with fast imaging techniques, and the possibility of hardware changes/failures. Furthermore, the quantity of data acquired per voxel, the non-linear estimation process, and numerous potential use cases complicate traditional visual data inspection approaches. Currently, quality inspection of DTI data has relied on visual inspection and individual processing in DTI analysis software programs (e.g. DTIPrep, DTI-studio). However, recent advances in applied statistical methods have yielded several different metrics to assess noise level, artifact propensity, quality of tensor fit, variance of estimated measures, and bias in estimated measures. To date, these metrics have been largely studied in isolation. Herein, we select complementary metrics for integration into an automatic DTI analysis and quality assurance pipeline. The pipeline completes in 24 hours, stores statistical outputs, and produces a graphical summary quality analysis (QA) report. We assess the utility of this streamlined approach for empirical quality assessment on 608 DTI datasets from pediatric neuroimaging studies. The efficiency and accuracy of quality analysis using the proposed pipeline is compared with quality analysis based on visual inspection. The unified pipeline is found to save a statistically significant amount of time (over 70%) while improving the consistency of QA between a DTI expert and a pool of research associates. Projection of QA metrics to a low-dimensional manifold reveals qualitative, but clear, QA-study associations and suggests that automated outlier/anomaly detection would be feasible. PMID:23637895
Péron, Julien; Pond, Gregory R; Gan, Hui K; Chen, Eric X; Almufti, Roula; Maillet, Denis; You, Benoit
2012-07-03
The Consolidated Standards of Reporting Trials (CONSORT) guidelines were developed in the mid-1990s for the explicit purpose of improving clinical trial reporting. However, there is little information regarding the adherence to CONSORT guidelines of recent publications of randomized controlled trials (RCTs) in oncology. All phase III RCTs published between 2005 and 2009 were reviewed using an 18-point overall quality score for reporting based on the 2001 CONSORT statement. Multivariable linear regression was used to identify features associated with improved reporting quality. To provide baseline data for future evaluations of reporting quality, RCTs were also assessed according to the 2010 revised CONSORT statement. All statistical tests were two-sided. A total of 357 RCTs were reviewed. The mean 2001 overall quality score was 13.4 on a scale of 0-18, whereas the mean 2010 overall quality score was 19.3 on a scale of 0-27. The overall RCT reporting quality score improved by 0.21 points per year from 2005 to 2009. Poorly reported items included method used to generate the random allocation (adequately reported in 29% of trials), whether and how blinding was applied (41%), method of allocation concealment (51%), and participant flow (59%). High impact factor (IF, P = .003), recent publication date (P = .008), and geographic origin of RCTs (P = .003) were independent factors statistically significantly associated with higher reporting quality in a multivariable regression model. Sample size, tumor type, and positivity of trial results were not associated with higher reporting quality, whereas funding source and treatment type had a borderline statistically significant impact. The results show that numerous items remained unreported for many trials. Thus, given the potential impact of poorly reported trials, oncology journals should require even stricter adherence to the CONSORT guidelines.
Statistical process control: separating signal from noise in emergency department operations.
Pimentel, Laura; Barrueto, Fermin
2015-05-01
Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
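The moving range chart the authors recommend is simple to state concretely. The sketch below (illustrative only, not from the paper) builds an individuals/moving-range (XmR) chart: the center line is the mean, the control limits are the mean plus or minus 2.66 times the average moving range, and points outside the limits flag special cause variation. The daily door-to-doctor medians are hypothetical.

```python
def xmr_limits(values):
    """Return (center, lcl, ucl) for an individuals (X) chart.
    2.66 is the standard XmR scaling constant (3/d2, d2 = 1.128)."""
    center = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return center, center - 2.66 * mr_bar, center + 2.66 * mr_bar

def special_cause_points(values):
    """Indices outside the control limits: special cause variation (signal),
    as opposed to common cause variation (noise) inside the limits."""
    _, lcl, ucl = xmr_limits(values)
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]

# Hypothetical daily median door-to-doctor times (minutes); the last day
# breaches the upper control limit and warrants investigation:
times = [32, 35, 30, 33, 31, 34, 36, 29, 33, 58]
print(xmr_limits(times))
print(special_cause_points(times))  # → [9]
```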
Pandey, Anil Kumar; Sharma, Param Dev; Dheer, Pankaj; Parida, Girish Kumar; Goyal, Harish; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh
2017-01-01
99mTc-methylene diphosphonate (99mTc-MDP) bone scan images have a limited number of counts per pixel and hence inferior image quality compared to X-rays. Theoretically, the global histogram equalization (GHE) technique can improve the contrast of a given image, though the practical benefits of doing so have had only limited acceptance. In this study, we investigated the effect of the GHE technique on 99mTc-MDP bone scan images. A set of 89 low-contrast 99mTc-MDP whole-body bone scan images was included in this study. These images were acquired with parallel-hole collimation on a Symbia E gamma camera and then processed with the histogram equalization technique. The image quality of the input and processed images was reviewed by two nuclear medicine physicians on a 5-point scale, where a score of 1 denotes very poor and 5 the best image quality. A statistical test was applied to find the significance of the difference between the mean scores assigned to the input and processed images. The technique improves the contrast of the images; however, oversaturation was noticed in the processed images. Student's t-test was applied, and a statistically significant difference between input and processed image quality was found at P < 0.001 (with α = 0.05). However, further improvement in image quality is needed to meet the requirements of nuclear medicine physicians. The GHE technique can be used on low-contrast bone scan images. In some cases, histogram equalization combined with another postprocessing technique is useful.
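As a sketch of what GHE does (illustrative, not the authors' implementation), the routine below remaps each gray level through the image's normalized cumulative histogram, stretching a narrow band of levels across the full dynamic range:

```python
def equalize(pixels, levels=256):
    """Global histogram equalization of a flat list of gray levels."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, total = [], 0
    for count in hist:
        total += count
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)  # CDF at the first occupied level
    n = len(pixels)
    if n == cdf_min:                         # constant image: nothing to equalize
        return list(pixels)
    # Classic mapping: rescale the CDF so output spans 0..levels-1.
    table = [round((c - cdf_min) / (n - cdf_min) * (levels - 1)) for c in cdf]
    return [table[p] for p in pixels]

# A low-contrast band of gray levels spreads over the whole range:
print(equalize([100, 100, 101, 102, 102, 103]))  # → [0, 0, 64, 191, 191, 255]
```

This also illustrates the oversaturation the reviewers noted: sparsely populated extreme levels get pushed hard toward 0 and 255.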
Impact of Requirements Quality on Project Success or Failure
NASA Astrophysics Data System (ADS)
Tamai, Tetsuo; Kamata, Mayumi Itakura
We are interested in the relationship between the quality of the requirements specifications for software projects and the subsequent outcome of the projects. To examine this relationship, we investigated 32 projects started and completed between 2003 and 2005 by the software development division of a large company in Tokyo. The company has collected reliable data on requirements specification quality, as evaluated by software quality assurance teams, and overall project performance data relating to cost and time overruns. The data for requirements specification quality were first converted into a multiple-dimensional space, with each dimension corresponding to an item of the recommended structure for software requirements specifications (SRS) defined in IEEE Std. 830-1998. We applied various statistical analysis methods to the SRS quality data and project outcomes.
1999-01-01
Section 211(m) of the Clean Air Act (42 U.S.C. 7401-7671q) requires that gasoline containing at least 2.7% oxygen by weight be used in the wintertime in those areas of the country that exceed the carbon monoxide National Ambient Air Quality Standards (NAAQS). The winter oxygenated gasoline program applies to all gasoline sold in the larger of the Consolidated Metropolitan Statistical Area (CMSA) or Metropolitan Statistical Area (MSA) in which the nonattainment area is located.
Statistical tools for transgene copy number estimation based on real-time PCR.
Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal
2007-11-01
As compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR-based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and data quality control needed to render a reliable copy number estimate with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advancements into real-time PCR-based transgene copy number determination. Three experimental designs and four data-quality-control-integrated statistical models are presented. For the first method, external calibration curves are established for the transgene based on serially diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are compared to derive the transgene copy number or zygosity estimate. Simple linear regression and two-group t-test procedures were combined to model the data from this design. For the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of the transgene was compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, the transgene copy number is compared with that of the reference gene without a standard curve, based directly on fluorescence data. Two different multiple regression models were proposed to analyze the data, based on two different approaches to amplification efficiency integration. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination.
These statistical methods allow real-time PCR-based transgene copy number estimation to be more reliable and precise. Proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four statistical methods are compared for their advantages and disadvantages. Moreover, they can also be applied to other real-time PCR-based quantification assays, including transfection efficiency analysis and pathogen quantification.
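The first design, comparing the Ct of a putative event against a control event of known copy number, reduces in the idealized case to the familiar ΔCt relationship. A minimal sketch, assuming equal template input and a stated per-cycle amplification efficiency (the paper's regression and t-test machinery is what turns this point estimate into a statistically defensible one with a confidence interval):

```python
def copy_number(ct_sample, ct_calibrator, calibrator_copies=1, efficiency=2.0):
    """Relative copy number from the Ct difference against a calibrator
    event of known copy number. Assumes equal template input and the
    stated amplification efficiency (2.0 = perfect doubling per cycle)."""
    return calibrator_copies * efficiency ** (ct_calibrator - ct_sample)

# Amplifying one cycle earlier than a single-copy calibrator suggests
# roughly two copies:
print(copy_number(ct_sample=23.0, ct_calibrator=24.0))  # → 2.0
```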
Water quality analysis of the Rapur area, Andhra Pradesh, South India using multivariate techniques
NASA Astrophysics Data System (ADS)
Nagaraju, A.; Sreedhar, Y.; Thejaswi, A.; Sayadi, Mohammad Hossein
2017-10-01
The groundwater samples from the Rapur area were collected from different sites to evaluate the major ion chemistry. The large number of data can lead to difficulties in the integration, interpretation, and representation of the results. Two multivariate statistical methods, hierarchical cluster analysis (HCA) and factor analysis (FA), were applied to evaluate their usefulness for classifying and identifying the geochemical processes controlling groundwater geochemistry. Four statistically significant clusters were obtained from 30 sampling stations. This resulted in two important clusters, viz., cluster 1 (pH, Si, CO3, Mg, SO4, Ca, K, HCO3, alkalinity, Na, Na + K, Cl, and hardness) and cluster 2 (EC and TDS), whose constituents are released to the study area from different sources. The application of different multivariate statistical techniques, such as principal component analysis (PCA), assists in the interpretation of complex data matrices for a better understanding of the water quality of a study area. From the PCA, it is clear that the first factor (factor 1), accounting for 36.2% of the total variance, had high positive loadings on EC, Mg, Cl, TDS, and hardness. Based on the PCA scores, four significant cluster groups of sampling locations were detected on the basis of the similarity of their water quality.
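The tight EC-TDS grouping (cluster 2) is easy to reproduce: for two standardized variables the covariance matrix is [[1, r], [r, 1]], whose eigenvalues are 1 ± |r|, so the variance share of the first principal component follows directly from the Pearson correlation. A sketch with hypothetical EC/TDS readings (not the Rapur data):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

def pc1_variance_share(xs, ys):
    """For two standardized variables the covariance matrix [[1, r], [r, 1]]
    has eigenvalues 1 +/- |r|, so PC1 explains (1 + |r|) / 2 of the variance."""
    return (1 + abs(pearson(xs, ys))) / 2

# Hypothetical paired readings; TDS tracks EC almost exactly, so the two
# variables collapse onto a single component (their own cluster):
ec  = [820, 1150, 990, 1430, 760, 1280]   # uS/cm
tds = [525, 736, 634, 915, 486, 819]      # mg/L
print(round(pc1_variance_share(ec, tds), 3))
```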
A unified framework for physical print quality
NASA Astrophysics Data System (ADS)
Eid, Ahmed; Cooper, Brian; Rippetoe, Ed
2007-01-01
In this paper we present a unified framework for physical print quality. This framework includes a design for a testbed, testing methodologies, and quality measures of physical print characteristics. An automatic belt-fed flatbed scanning system is calibrated to acquire L* data for a wide range of flat field imagery. Testing methodologies based on wavelet pre-processing and spectral/statistical analysis are designed. We apply the proposed framework to three common printing artifacts: banding, jitter, and streaking. Since these artifacts are directional, wavelet-based approaches are used to extract one artifact at a time and filter out the others. Banding is characterized as a medium-to-low frequency, vertical periodic variation down the page. The same definition applies to the jitter artifact, except that the jitter signal is characterized as a high-frequency signal above the banding frequency range. Streaking, however, is characterized as a horizontal aperiodic variation in the high-to-medium frequency range. Wavelets at different levels are applied to the input images in different directions to extract each artifact within specified frequency bands. Following wavelet reconstruction, images are converted into 1-D signals describing the artifact of concern. Accurate spectral analysis using a DFT with a Blackman-Harris windowing technique is used to extract the power (strength) of the periodic signals (banding and jitter). Since streaking is an aperiodic signal, a statistical measure is used to quantify its strength. Experiments on 100 print samples scanned at 600 dpi from 10 different printers show high correlation (75% to 88%) between the ranking of these samples by the proposed methodologies and the experts' visual ranking.
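The periodic-artifact measurement can be sketched as a windowed DFT of the extracted 1-D profile. The code below is an illustrative reconstruction, not the authors' pipeline: it detrends a profile, applies a 4-term Blackman-Harris window, and reports the dominant frequency bin, which for a synthetic banding signal at 8 cycles per scan lands on bin 8.

```python
import cmath, math

def blackman_harris(n):
    """4-term Blackman-Harris window (symmetric form)."""
    a = (0.35875, 0.48829, 0.14128, 0.01168)
    return [a[0]
            - a[1] * math.cos(2 * math.pi * i / (n - 1))
            + a[2] * math.cos(4 * math.pi * i / (n - 1))
            - a[3] * math.cos(6 * math.pi * i / (n - 1))
            for i in range(n)]

def power_spectrum(signal):
    """Power of the windowed DFT, positive-frequency bins only."""
    n = len(signal)
    mean = sum(signal) / n
    w = blackman_harris(n)
    x = [(s - mean) * wi for s, wi in zip(signal, w)]   # detrend, then window
    return [abs(sum(x[i] * cmath.exp(-2j * math.pi * k * i / n)
                    for i in range(n))) ** 2
            for k in range(n // 2)]

# Synthetic banding profile: 8 cycles across a 64-sample scan plus an offset.
profile = [5.0 + math.sin(2 * math.pi * 8 * i / 64) for i in range(64)]
spectrum = power_spectrum(profile)
peak = max(range(len(spectrum)), key=spectrum.__getitem__)
print(peak)  # → 8
```

The low sidelobes of the Blackman-Harris window are the reason to prefer it over a plain (rectangular) DFT: a weak jitter tone is not swamped by leakage from a strong banding tone.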
Dasari, Surendra; Chambers, Matthew C.; Martinez, Misti A.; Carpenter, Kristin L.; Ham, Amy-Joan L.; Vega-Montoto, Lorenzo J.; Tabb, David L.
2012-01-01
Spectral libraries have emerged as a viable alternative to protein sequence databases for peptide identification. These libraries contain previously detected peptide sequences and their corresponding tandem mass spectra (MS/MS). Search engines can then identify peptides by comparing experimental MS/MS scans to those in the library. Many of these algorithms employ the dot product score for measuring the quality of a spectrum-spectrum match (SSM). This scoring system does not offer a clear statistical interpretation and ignores fragment ion m/z discrepancies in the scoring. We developed a new spectral library search engine, Pepitome, which employs statistical systems for scoring SSMs. Pepitome outperformed the leading library search tool, SpectraST, when analyzing data sets acquired on three different mass spectrometry platforms. We characterized the reliability of spectral library searches by confirming shotgun proteomics identifications through RNA-Seq data. Applying spectral library and database searches on the same sample revealed their complementary nature. Pepitome identifications enabled the automation of quality analysis and quality control (QA/QC) for shotgun proteomics data acquisition pipelines. PMID:22217208
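For context, the dot product score that Pepitome moves beyond fits in a few lines. This sketch (hypothetical binned intensities, not SpectraST's exact normalization) shows why it yields a similarity on [0, 1] but nothing resembling a p-value, and why it ignores fragment m/z discrepancies once peaks share a bin:

```python
import math

def dot_product_score(spectrum_a, spectrum_b):
    """Normalized dot product between two peak-intensity vectors binned on
    a shared m/z axis; 1.0 means identical relative intensities."""
    norm_a = math.sqrt(sum(i * i for i in spectrum_a))
    norm_b = math.sqrt(sum(i * i for i in spectrum_b))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return sum(a * b for a, b in zip(spectrum_a, spectrum_b)) / (norm_a * norm_b)

# Hypothetical binned intensities for a library spectrum and a query scan:
library = [0.0, 10.0, 0.0, 5.0, 2.0]
query   = [0.0,  9.0, 1.0, 5.0, 2.0]
print(round(dot_product_score(query, library), 3))  # → 0.994
```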
Gronewold, Andrew D; Sobsey, Mark D; McMahan, Lanakila
2017-06-01
For the past several years, the compartment bag test (CBT) has been employed in water quality monitoring and public health protection around the world. To date, however, the statistical basis for the design and recommended procedures for enumerating fecal indicator bacteria (FIB) concentrations from CBT results have not been formally documented. Here, we provide that documentation following protocols for communicating the evolution of similar water quality testing procedures. We begin with an overview of the statistical theory behind the CBT, followed by a description of how that theory was applied to determine an optimal CBT design. We then provide recommendations for interpreting CBT results, including procedures for estimating quantiles of the FIB concentration probability distribution, and the confidence of compliance with recognized water quality guidelines. We synthesize these values in custom user-oriented 'look-up' tables similar to those developed for other FIB water quality testing methods. Modified versions of our tables are currently distributed commercially as part of the CBT testing kit. Published by Elsevier B.V.
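The statistical theory behind the CBT is most-probable-number (MPN) estimation: each compartment is an independent presence/absence trial whose positive probability is 1 − exp(−c·v) for concentration c and compartment volume v. The sketch below (illustrative only; the published look-up tables also carry quantiles and confidence statements) finds the maximum-likelihood c by bisection on the score equation. The compartment volumes 1, 3, 10, 30, and 56 mL follow the CBT's 100 mL design; the outcome pattern is hypothetical.

```python
import math

def mpn_per_ml(volumes_ml, positive):
    """MLE of concentration c (organisms/mL) from compartment outcomes.
    Solves d(log L)/dc = 0, where log L = sum over positive compartments of
    log(1 - exp(-c*v)) minus c times the total negative volume."""
    pos = [v for v, p in zip(volumes_ml, positive) if p]
    neg_total = sum(v for v, p in zip(volumes_ml, positive) if not p)
    if not pos:
        return 0.0
    if neg_total == 0:
        return float("inf")       # every compartment positive: no finite MLE

    def score(c):                 # log-likelihood derivative, decreasing in c
        return sum(v * math.exp(-c * v) / (1 - math.exp(-c * v))
                   for v in pos) - neg_total

    lo, hi = 1e-9, 1.0
    while score(hi) > 0:          # expand until the root is bracketed
        hi *= 2
    for _ in range(100):          # bisection
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if score(mid) > 0 else (lo, mid)
    return (lo + hi) / 2

# The two smallest compartments negative, the three largest positive:
volumes = [1, 3, 10, 30, 56]
print(mpn_per_ml(volumes, [False, False, True, True, True]))
```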
ERIC Educational Resources Information Center
Gálvez, Jaime; Conejo, Ricardo; Guzmán, Eduardo
2013-01-01
One of the most popular student modeling approaches is Constraint-Based Modeling (CBM). It is an efficient approach that can be easily applied inside an Intelligent Tutoring System (ITS). Even with these characteristics, building new ITSs requires carefully designing the domain model to be taught because different sources of errors could affect…
Statistical Model Selection for TID Hardness Assurance
NASA Technical Reports Server (NTRS)
Ladbury, R.; Gorelick, J. L.; McClure, S.
2010-01-01
Radiation Hardness Assurance (RHA) methodologies against Total Ionizing Dose (TID) degradation impose rigorous statistical treatments for data from a part's Radiation Lot Acceptance Test (RLAT) and/or its historical performance. However, no similar methods exist for using "similarity" data - that is, data for similar parts fabricated in the same process as the part under qualification. This is despite the greater difficulty and potential risk in interpreting similarity data. In this work, we develop methods to disentangle part-to-part, lot-to-lot and part-type-to-part-type variation. The methods we develop apply not just to qualification decisions, but also to quality control and detection of process changes and other "out-of-family" behavior. We begin by discussing the data used in the study and the challenges of developing a statistic providing a meaningful measure of degradation across multiple part types, each with its own performance specifications. We then develop analysis techniques and apply them to the different data sets.
Crawford, Charles G.; Wangsness, David J.
1993-01-01
The City of Indianapolis has constructed state-of-the-art advanced municipal wastewater-treatment systems to enlarge and upgrade the existing secondary-treatment processes at its Belmont and Southport treatment plants. These new advanced-wastewater-treatment plants became operational in 1983. A nonparametric statistical procedure--a modified form of the Wilcoxon-Mann-Whitney rank-sum test--was used to test for trends in time-series water-quality data from four sites on the White River and from the Belmont and Southport wastewater-treatment plants. Time-series data representative of pre-advanced- (1978-1980) and post-advanced- (1983--86) wastewater-treatment conditions were tested for trends, and the results indicate substantial changes in water quality of treated effluent and of the White River downstream from Indianapolis after implementation of advanced wastewater treatment. Water quality from 1981 through 1982 was highly variable due to plant construction. Therefore, this time period was excluded from the analysis. Water quality at sample sites located upstream from the wastewater-treatment plants was relatively constant during the period of study (1978-86). Analysis of data from the two plants and downstream from the plants indicates statistically significant decreasing trends in effluent concentrations of total ammonia, 5-day biochemical-oxygen demand, fecal-coliform bacteria, total phosphate, and total solids at all sites where sufficient data were available for testing. Because of in-plant nitrification, increases in nitrate concentration were statistically significant in the two plants and in the White River. The decrease in ammonia concentrations and 5-day biochemical-oxygen demand in the White River resulted in a statistically significant increasing trend in dissolved-oxygen concentration in the river because of reduced oxygen demand for nitrification and biochemical oxidation processes. 
Following implementation of advanced wastewater treatment, the number of river-quality samples that failed to meet the water-quality standards for ammonia and dissolved oxygen that apply to the White River decreased substantially.
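The rank-sum idea behind the trend testing can be shown in its plain two-sample form (the study used a modified, trend-oriented variant; this sketch and its concentration values are illustrative only):

```python
import math

def mann_whitney(before, after):
    """Two-sample Mann-Whitney U with a normal approximation (no tie
    correction). U counts how often a 'before' value exceeds an 'after'
    value; ties contribute one half."""
    u = 0.0
    for b in before:
        for a in after:
            u += 1.0 if b > a else 0.5 if b == a else 0.0
    n1, n2 = len(before), len(after)
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    return u, (u - mu) / sigma

# Hypothetical effluent ammonia (mg/L) before vs. after advanced treatment;
# complete separation of the samples gives the maximal U:
pre  = [9.1, 8.4, 7.9, 8.8, 9.5, 8.2]
post = [2.1, 1.8, 2.6, 1.5, 2.2, 1.9]
u, z = mann_whitney(pre, post)
print(u, round(z, 2))  # → 36.0 2.88
```

Being rank-based, the statistic is insensitive to the skewed distributions and outliers common in water quality records, which is why this family of tests suits such data.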
Exploring the use of statistical process control methods to assess course changes
NASA Astrophysics Data System (ADS)
Vollstedt, Ann-Marie
This dissertation pertains to the field of Engineering Education. The Department of Mechanical Engineering at the University of Nevada, Reno (UNR) is hosting this dissertation under a special agreement. This study was motivated by the desire to find an improved, quantitative measure of student quality that is both convenient to use and easy to evaluate. While traditional statistical analysis tools such as ANOVA (analysis of variance) are useful, they are somewhat time-consuming and are subject to error because they are based on grades, which are influenced by numerous variables independent of student ability and effort (e.g. inflation and curving). Additionally, grades are currently the only measure of quality in most engineering courses even though most faculty agree that grades do not accurately reflect student quality. Based on a literature search, in this study quality was defined as content knowledge, cognitive level, self-efficacy, and critical thinking. Nineteen treatments were applied to a pair of freshman classes in an effort to increase these qualities. The qualities were measured via quiz grades, essays, surveys, and online critical thinking tests. Results from the quality tests were adjusted and filtered prior to analysis. All test results were subjected to Chauvenet's criterion in order to detect and remove outlying data. In addition to removing outliers from data sets, it was felt that individual course grades needed adjustment to account for the large portion of the grade that was defined by group work. A new method was developed to adjust grades within each group based on the residual of the individual grades within the group and the portion of the course grade defined by group work. It was found that the grade adjustment method agreed 78% of the time with the manual grade changes instructors made in 2009, and also increased the correlation between group grades and individual grades.
Using these adjusted grades, Statistical Process Control (SPC) methods were employed to evaluate the impact of the treatments applied to improve the courses. It was determined that using SPC is advantageous because it does not require additional resources and is not affected if a course is curved by adding the same number of points to each student's grade. It was also determined that SPC results, unlike average grade, correlated well with anecdotal evidence from the instructors concerning how well the students performed in any given year. In addition to the application of SPC to evaluate curriculum change, statistical analysis was used to show that course grades correlate with quiz grades, but do not correlate with critical thinking, self-efficacy, or cognitive level, which implies that treatments need to be implemented to increase these qualities.
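Chauvenet's criterion, used in the study to filter the quality-test results, rejects a point when fewer than half an observation that extreme would be expected in a sample of that size under a normal model. A single-pass sketch with hypothetical quiz grades:

```python
import math

def chauvenet_filter(data):
    """One pass of Chauvenet's criterion: keep x only if the expected count
    of observations at least as deviant, n * P(|Z| > |z|), is >= 0.5."""
    n = len(data)
    mean = sum(data) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))

    def two_tail(z):
        return math.erfc(z / math.sqrt(2))   # P(|Z| > z) for standard normal

    return [x for x in data if n * two_tail(abs(x - mean) / sd) >= 0.5]

# The stray 30 is rejected; the cluster of passing grades survives:
print(chauvenet_filter([82, 85, 79, 88, 84, 30]))  # → [82, 85, 79, 88, 84]
```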
NASA Astrophysics Data System (ADS)
Edjah, Adwoba; Stenni, Barbara; Cozzi, Giulio; Turetta, Clara; Dreossi, Giuliano; Tetteh Akiti, Thomas; Yidana, Sandow
2017-04-01
This research is part of a PhD research work, "Hydrogeological Assessment of the Lower Tano river basin for sustainable economic usage, Ghana, West Africa". In this study, the researcher investigated surface water and groundwater quality in the Lower Tano river basin. This assessment was based on selected sampling sites associated with mining activities and the development of oil and gas. A statistical approach was applied to characterize the quality of surface water and groundwater. Also, water stable isotopes, natural tracers of the hydrological cycle, were used to investigate the origin of groundwater recharge in the basin. The study revealed that Pb and Ni values of the surface water and groundwater samples exceeded the WHO standards for drinking water. In addition, a water quality index (WQI) based on physicochemical parameters (EC, TDS, pH) and major ions (Ca2+, Na+, Mg2+, HCO3-, NO3-, Cl-, SO42-, K+) exhibited good quality water for 60% of the sampled surface water and groundwater. Other statistical techniques, such as the heavy metal pollution index (HPI), degree of contamination (Cd), and heavy metal evaluation index (HEI), based on trace element parameters in the water samples, reveal that 90% of the surface water and groundwater samples belong to a high level of pollution. Principal component analysis (PCA) also suggests that the water quality in the basin is likely affected by rock-water interaction and anthropogenic activities (sea water intrusion).
This was confirmed by further statistical analysis (cluster analysis and correlation matrix) of the water quality parameters. The spatial distribution of water quality parameters, trace elements, and the results obtained from the statistical analysis was mapped using a geographical information system (GIS). In addition, the isotopic analysis of the sampled surface water and groundwater revealed that most of the waters were of meteoric origin with little or no isotopic variation. It is expected that the outcomes of this research will form a baseline for appropriate decisions on water quality management by decision makers in the Lower Tano river basin. Keywords: Water stable isotopes, Trace elements, Multivariate statistics, Evaluation indices, Lower Tano river basin.
NASA Astrophysics Data System (ADS)
Asal, F. F.
2012-07-01
Digital elevation data obtained from different engineering surveying techniques are utilized in generating a Digital Elevation Model (DEM), which is employed in many engineering and environmental applications. These data are usually in discrete point format, making it necessary to utilize an interpolation approach for DEM creation. Quality assessment of the DEM is a vital issue controlling its use in different applications; however, this assessment relies heavily on statistical methods while neglecting visual methods. This research applies visual analysis to DEMs generated using the IDW interpolator at varying powers in order to examine its potential for assessing the effects of variation of the IDW power on DEM quality. Real elevation data were collected in the field using a total station instrument in corrugated terrain. DEMs were generated from the data at a unified cell size using the IDW interpolator with power values ranging from one to ten. Visual analysis was undertaken using 2D and 3D views of the DEM; in addition, statistical analysis was performed to assess the validity of the visual techniques in such analysis. Visual analysis showed that smoothing of the DEM decreases as the power value increases up to a power of four; however, increasing the power beyond four leaves no noticeable changes in the 2D and 3D views of the DEM. The statistical analysis supported these results, in that the standard deviation (SD) of the DEM increased with increasing power. More specifically, changing the power from one to two produced 36% of the total increase in SD (the increase due to changing the power from one to ten), and changing to powers of three and four gave 60% and 75% respectively. This reflects the decrease in DEM smoothing as the IDW power increases.
The study has also shown that visual methods supported by statistical analysis hold good potential for DEM quality assessment.
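The interpolator under study is simple to state: the estimate at a point is the weighted mean of the samples with weights 1/d^p, so raising p concentrates influence on the nearest samples (less smoothing), which is exactly the behavior the visual analysis tracks. An illustrative sketch with hypothetical spot heights:

```python
def idw(samples, x, y, power=2.0):
    """Inverse distance weighted estimate at (x, y) from (xi, yi, zi) samples."""
    num = den = 0.0
    for xi, yi, zi in samples:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return zi              # exactly on a sample point
        w = d2 ** (-power / 2)     # 1 / distance**power
        num += w * zi
        den += w
    return num / den

# Four hypothetical spot heights on a 10 m grid:
pts = [(0, 0, 10.0), (10, 0, 20.0), (0, 10, 30.0), (10, 10, 40.0)]
# Near the low corner, a higher power pulls the estimate toward the
# nearest sample (10.0), i.e. less smoothing:
print(round(idw(pts, 2, 2, power=2), 2))   # → 14.16
print(round(idw(pts, 2, 2, power=4), 2))   # → 10.52
```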
NASA Astrophysics Data System (ADS)
Lee, Rena; Kim, Kyubo; Cho, Samju; Lim, Sangwook; Lee, Suk; Shim, Jang Bo; Huh, Hyun Do; Lee, Sang Hoon; Ahn, Sohyun
2017-11-01
This study applied statistical process control to set and verify a quality assurance (QA) tolerance standard tailored to our hospital's characteristics, with criteria applied to all treatment sites in this analysis. The gamma test criterion for delivery quality assurance (DQA) was 3%/3 mm. Head and neck, breast, and prostate cases treated with intensity modulated radiation therapy (IMRT) or volumetric modulated arc therapy (VMAT) were selected as the QA treatment sites. The numbers of datasets used in the analysis were 73 and 68 for head and neck patients, and 49 for prostate and 152 for breast, measured by MapCHECK and ArcCHECK respectively. The Cp values of head and neck and prostate QA were above 1.0, and Cpml was 1.53 and 1.71 respectively, close to the target value of 100%. The Cpml value of breast (IMRT) QA was 1.67, with data values close to the target value of 95%; however, a value of 0.90 indicates that the data values are widely distributed. Cp and Cpml of breast VMAT QA were 1.07 and 2.10 respectively, suggesting that the VMAT QA has better process capability than the IMRT QA. Consequently, we should pay more attention to planning and QA before treatment for breast radiotherapy.
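The capability indices cited can be sketched in a few lines. Below, Cp measures two-sided spec width against six process sigmas, and a one-sided lower-limit index (Cpl, shown in place of the paper's target-based Cpml, whose exact definition is not given here) scores DQA gamma passing rates against a minimum acceptable rate. The rates are hypothetical.

```python
import math

def process_sd(values):
    """Sample standard deviation of the process measurements."""
    mean = sum(values) / len(values)
    return math.sqrt(sum((v - mean) ** 2 for v in values) / (len(values) - 1))

def cp(values, lsl, usl):
    """Two-sided capability: spec width over six process sigmas."""
    return (usl - lsl) / (6 * process_sd(values))

def cpl(values, lsl):
    """One-sided capability against a lower spec limit; > 1.0 means the
    process mean clears the limit by more than three sigmas."""
    mean = sum(values) / len(values)
    return (mean - lsl) / (3 * process_sd(values))

# Hypothetical gamma passing rates (%) with a 95% lower action limit:
rates = [98.5, 99.1, 97.8, 98.9, 99.3, 98.2, 98.7, 99.0]
print(round(cpl(rates, 95.0), 2))  # → 2.46
```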
Analysis of trends in water-quality data for water conservation area 3A, the Everglades, Florida
Mattraw, H.C.; Scheidt, D.J.; Federico, A.C.
1987-01-01
Rainfall and water quality data bases from the South Florida Water Management District were used to evaluate water quality trends at 10 locations near or in Water Conservation Area 3A in The Everglades. The Seasonal Kendall test was applied to specific conductance, orthophosphate-phosphorus, nitrate-nitrogen, total Kjeldahl nitrogen, and total nitrogen regression residuals for the period 1978-82. Residuals of orthophosphate and nitrate quadratic models, based on antecedent 7-day rainfall at inflow gate S-11B, were the only two constituent-structure pairs that showed apparently significant (p < 0.05) increases in constituent concentrations. Elimination of regression models with distinct residual patterns and data outliers resulted in 17 statistically significant station-water quality combinations for trend analysis. No other water quality trends were observed. The 1979 Memorandum of Agreement outlining the water quality monitoring program between the Everglades National Park and the U.S. Army Corps of Engineers stressed collection four times a year at three stations, and extensive coverage of water quality properties. Trend analysis and other rigorous statistical evaluation programs are better suited to data monitoring programs that include more frequent sampling and that are organized in a water quality data management system. Pronounced areal differences in water quality suggest that a water quality monitoring system for Shark River Slough in Everglades National Park should include collection locations near the source of inflow to Water Conservation Area 3A. (Author's abstract)
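The Kendall-family statistic underlying the Seasonal Kendall test can be sketched in its non-seasonal form (the seasonal version computes S and its variance within each season and sums them; the residual series below is hypothetical and the tie correction is omitted):

```python
import math

def mann_kendall(series):
    """Plain Mann-Kendall trend statistics. Returns (S, z): S > 0 suggests
    an upward trend; |z| > 1.96 is significant at p < 0.05 under the
    normal approximation (continuity-corrected, no tie correction)."""
    n = len(series)
    s = sum((series[j] > series[i]) - (series[j] < series[i])
            for i in range(n) for j in range(i + 1, n))
    var = n * (n - 1) * (2 * n + 5) / 18
    z = 0.0 if s == 0 else (s - (1 if s > 0 else -1)) / math.sqrt(var)
    return s, z

# Hypothetical orthophosphate regression residuals drifting upward:
residuals = [0.1, 0.3, 0.2, 0.5, 0.4, 0.6, 0.8]
s, z = mann_kendall(residuals)
print(s, round(z, 2))  # → 17 2.4
```

Because S depends only on the ordering of the values, the test tolerates the non-normal, seasonally varying concentrations typical of such monitoring records.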
Analyzing longitudinal data with the linear mixed models procedure in SPSS.
West, Brady T
2009-09-01
Many applied researchers analyzing longitudinal data share a common misconception: that specialized statistical software is necessary to fit hierarchical linear models (also known as linear mixed models [LMMs], or multilevel models) to longitudinal data sets. Although several specialized statistical software programs of high quality are available that allow researchers to fit these models to longitudinal data sets (e.g., HLM), rapid advances in general purpose statistical software packages have recently enabled analysts to fit these same models when using preferred packages that also enable other more common analyses. One of these general purpose statistical packages is SPSS, which includes a very flexible and powerful procedure for fitting LMMs to longitudinal data sets with continuous outcomes. This article aims to present readers with a practical discussion of how to analyze longitudinal data using the LMMs procedure in the SPSS statistical software package.
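The random-intercept LMM described for the SPSS MIXED procedure can be mirrored in other general-purpose software. A sketch using statsmodels (an assumption for illustration; the article itself covers SPSS) on simulated longitudinal data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated longitudinal data: 30 subjects, 4 repeated measurements each,
# true fixed slope 0.5 and subject-level random intercepts
rng = np.random.default_rng(42)
n_subj, n_obs = 30, 4
subj = np.repeat(np.arange(n_subj), n_obs)
time = np.tile(np.arange(n_obs), n_subj)
u = rng.normal(0.0, 1.0, n_subj)                 # random intercepts
y = 2.0 + 0.5 * time + u[subj] + rng.normal(0.0, 0.5, subj.size)
df = pd.DataFrame({"subject": subj, "time": time, "y": y})

# Random-intercept LMM: y ~ time, grouped by subject (the statsmodels
# analogue of the SPSS setup discussed in the article)
result = smf.mixedlm("y ~ time", df, groups=df["subject"]).fit()
print(result.params["time"])  # fixed-effect slope estimate, near the true 0.5
```

The grouping argument plays the role of the SUBJECT specification in SPSS; a random slope could be added via `re_formula="~time"`.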
Ekici, Behice; Cimete, Güler
2015-01-01
OBJECTIVES To determine the effects of an asthma training and monitoring program on children's disease management and quality of life. MATERIAL AND METHODS The sample consisted of 120 children and their parents. Data were collected at the beginning of, during, and at the end of the 3-month monitoring period using four forms and a quality of life scale. After an initial evaluation, approaches to controlling symptoms and asthma triggers, and measures that might be taken against them, were taught to the children and parents. The children recorded trigger exposure, disease symptoms, their effects on daily activities, and therapeutic implementations on a daily basis. RESULTS During the 3-month monitoring period, the number of days on which the children were exposed to triggers (p<0.001) and experienced disease symptoms (p=0.006) decreased to a statistically significant degree. The majority of domestic triggers disappeared, but those stemming from the structure of the house and non-domestic triggers showed no change (p>0.05). Moreover, 30.8% of the children presented to a physician/hospital/emergency service, 4.2% were hospitalized, and 30% could not go to school. The number of presentations to a physician/hospital/emergency service (p=0.013), the number of times medicines were used (p=0.050), and the number of days of missed school (p=0.002) decreased to a statistically significant degree, and quality of life increased (p=0.001). CONCLUSION The asthma training and monitoring program decreased the children's rate of experiencing asthma symptoms and their use of therapeutic interventions, and increased their quality of life. PMID:29404097
Pandey, Anil Kumar; Sharma, Param Dev; Dheer, Pankaj; Parida, Girish Kumar; Goyal, Harish; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh
2017-01-01
Purpose of the Study: 99mTechnetium-methylene diphosphonate (99mTc-MDP) bone scan images have a limited number of counts per pixel, and hence they have inferior image quality compared to X-rays. Theoretically, the global histogram equalization (GHE) technique can improve the contrast of a given image, though the practical benefits of doing so have gained only limited acceptance. In this study, we investigated the effect of the GHE technique on 99mTc-MDP bone scan images. Materials and Methods: A set of 89 low-contrast 99mTc-MDP whole-body bone scan images was included in this study. These images were acquired with parallel hole collimation on a Symbia E gamma camera. The images were then processed with the histogram equalization technique. The image quality of input and processed images was reviewed by two nuclear medicine physicians on a 5-point scale, where a score of 1 denotes very poor and 5 the best image quality. A statistical test was applied to find the significance of the difference between the mean scores assigned to input and processed images. Results: This technique improves the contrast of the images; however, oversaturation was noticed in the processed images. Student's t-test was applied, and a statistically significant difference between the input and processed image quality was found at P < 0.001 (with α = 0.05). However, further improvement in image quality is needed as per the requirements of nuclear medicine physicians. Conclusion: GHE techniques can be used on low-contrast bone scan images. In some cases, a histogram equalization technique in combination with some other postprocessing technique is useful. PMID:29142344
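Global histogram equalization as evaluated here can be sketched in a few lines. The synthetic low-contrast image below is illustrative, and the sketch includes no guard for constant images:

```python
import numpy as np

def global_histogram_equalization(img, levels=256):
    """Remap grey levels so the cumulative histogram becomes ~linear."""
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]
    # Classic GHE transfer function, rescaled back to [0, levels-1]
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * (levels - 1))
    return lut.astype(np.uint8)[img]

# Low-contrast synthetic "scan": grey values squeezed into [100, 139]
img = np.tile(np.arange(100, 140, dtype=np.uint8), (40, 1))
out = global_histogram_equalization(img)
print(img.min(), img.max(), "->", out.min(), out.max())  # → 100 139 -> 0 255
```

Stretching the squeezed range to the full scale is exactly what produces both the contrast gain and the oversaturation of hot regions noted in the abstract.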
1994-03-01
optimize, and perform "what-if" analysis on a complicated simulation model of the greenhouse effect. Regression metamodels were applied to several modules of...the large integrated assessment model of the greenhouse effect. In this study, the metamodels gave "acceptable forecast errors" and were shown to
Verhulst, Brad
2016-01-01
P values have become the scapegoat for a wide variety of problems in science. P values are generally over-emphasized, often incorrectly applied, and in some cases even abused. However, alternative methods of hypothesis testing will likely fall victim to the same criticisms currently leveled at P values if more fundamental changes are not made in the research process. Increasing the general level of statistical literacy and enhancing training in statistical methods provide a potential avenue for identifying, correcting, and preventing erroneous conclusions from entering the academic literature and for improving the general quality of patient care. PMID:28366961
Application of machine vision to pup loaf bread evaluation
NASA Astrophysics Data System (ADS)
Zayas, Inna Y.; Chung, O. K.
1996-12-01
Intrinsic end-use quality of hard winter wheat breeding lines is routinely evaluated at the USDA, ARS, USGMRL, Hard Winter Wheat Quality Laboratory. The experimental baking test of pup loaves is the ultimate test for evaluating hard wheat quality. Computer vision was applied to develop an objective methodology for bread quality evaluation of the 1994 and 1995 crop wheat breeding line samples. Computer-extracted features for bread crumb grain were studied, using subimages (32 by 32 pixels) and features computed for the slices with different threshold settings. A subsampling grid was located with respect to the axis of symmetry of a slice to provide identical topological subimage information. Different ranking techniques were applied to the databases. Statistical analysis was run on the database with digital image and breadmaking features. Several ranking algorithms and data visualization techniques were employed to create a sensitive scale for porosity patterns of bread crumb. There were significant linear correlations between machine-vision-extracted features and breadmaking parameters. Crumb grain scores by human experts correlated more highly with some image features than with breadmaking parameters.
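The subimage feature extraction can be illustrated with a toy sketch. Note two simplifications: the study anchored its subsampling grid on the slice's axis of symmetry, whereas this version tiles from the top-left, and the dark-pixel porosity proxy below is an assumption, not the paper's feature set:

```python
import numpy as np

def subimage_features(slice_img, win=32, thresh=128):
    """Tile a bread-slice image into win x win subimages and return a
    simple porosity proxy per tile: the fraction of pixels darker than
    the threshold (dark pixels standing in for crumb pores)."""
    h, w = slice_img.shape
    feats = []
    for r in range(0, h - win + 1, win):
        for c in range(0, w - win + 1, win):
            tile = slice_img[r:r + win, c:c + win]
            feats.append((tile < thresh).mean())
    return np.array(feats)

# Synthetic 64x64 "slice": left half dark (porous), right half bright
img = np.full((64, 64), 200, dtype=np.uint8)
img[:, :32] = 50
print(subimage_features(img))  # → [1. 0. 1. 0.]
```

Per-tile features like these would then be correlated against breadmaking parameters, as the abstract describes.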
Alves, Darlan Daniel; Riegel, Roberta Plangg; de Quevedo, Daniela Müller; Osório, Daniela Montanari Migliavacca; da Costa, Gustavo Marques; do Nascimento, Carlos Augusto; Telöken, Franko
2018-06-08
Assessment of surface water quality is an issue of high current importance, especially in polluted rivers which provide water for treatment and distribution as drinking water, as is the case of the Sinos River, southern Brazil. Multivariate statistical techniques allow a better understanding of the seasonal variations in water quality, as well as the source identification and source apportionment of water pollution. In this study, the multivariate statistical techniques of cluster analysis (CA), principal component analysis (PCA), and positive matrix factorization (PMF) were used, along with the Kruskal-Wallis test and Spearman's correlation analysis, in order to interpret a water quality data set resulting from a monitoring program conducted over a period of almost two years (May 2013 to April 2015). The water samples were collected from the raw water inlet of the municipal water treatment plant (WTP) operated by the Water and Sewage Services of Novo Hamburgo (COMUSA). CA allowed the data to be grouped into three periods (autumn and summer (AUT-SUM); winter (WIN); spring (SPR)). Through the PCA, it was possible to identify that the parameters contributing most to water quality variations are total coliforms (TCOLI) in AUT-SUM; water level (WL), water temperature (WT), and electrical conductivity (EC) in WIN; and color (COLOR) and turbidity (TURB) in SPR. PMF was applied to the complete data set and enabled the source apportionment of water pollution through three factors, which are related to anthropogenic sources, such as the discharge of domestic sewage (mostly represented by Escherichia coli (ECOLI)), industrial wastewaters, and agriculture runoff.
The results provided by this study demonstrate the contribution provided by the use of integrated statistical techniques in the interpretation and understanding of large data sets of water quality, showing also that this approach can be used as an efficient methodology to optimize indicators for water quality assessment.
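The PCA step of such a workflow can be sketched with a synthetic data matrix; the parameter count and the induced TURB-COLOR correlation below are illustrative assumptions, not the Sinos River data:

```python
import numpy as np

# Synthetic monitoring matrix: rows = sampling dates, columns = parameters
# standing in for e.g. WT, EC, TURB, COLOR, TCOLI
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
X[:, 3] = 0.9 * X[:, 2] + rng.normal(scale=0.3, size=100)  # COLOR tracks TURB

# Standardize each parameter, then PCA via SVD of the standardized matrix
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / (s**2).sum()
print(explained.round(2))  # variance share per component, PC1 largest
print(Vt[0].round(2))      # PC1 loadings: the correlated pair dominates
```

Inspecting which parameters load heavily on the leading components is how studies like this one attribute seasonal water quality variation to a few dominant drivers.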
Granato, Gregory E.
2014-01-01
The U.S. Geological Survey (USGS) developed the Stochastic Empirical Loading and Dilution Model (SELDM) in cooperation with the Federal Highway Administration (FHWA) to indicate the risk for stormwater concentrations, flows, and loads to be above user-selected water-quality goals and the potential effectiveness of mitigation measures to reduce such risks. SELDM models the potential effect of mitigation measures by using Monte Carlo methods with statistics that approximate the net effects of structural and nonstructural best management practices (BMPs). In this report, structural BMPs are defined as the components of the drainage pathway between the source of runoff and a stormwater discharge location that affect the volume, timing, or quality of runoff. SELDM uses a simple stochastic statistical model of BMP performance to develop planning-level estimates of runoff-event characteristics. This statistical approach can be used to represent a single BMP or an assemblage of BMPs. The SELDM BMP-treatment module has provisions for stochastic modeling of three stormwater treatments: volume reduction, hydrograph extension, and water-quality treatment. In SELDM, these three treatment variables are modeled by using the trapezoidal distribution and the rank correlation with the associated highway-runoff variables. This report describes methods for calculating the trapezoidal-distribution statistics and rank correlation coefficients for stochastic modeling of volume reduction, hydrograph extension, and water-quality treatment by structural stormwater BMPs and provides the calculated values for these variables. This report also provides robust methods for estimating the minimum irreducible concentration (MIC), which is the lowest expected effluent concentration from a particular BMP site or a class of BMPs. These statistics are different from the statistics commonly used to characterize or compare BMPs. 
They are designed to provide a stochastic transfer function to approximate the quantity, duration, and quality of BMP effluent given the associated inflow values for a population of storm events. A database application and several spreadsheet tools are included in the digital media accompanying this report for further documentation of methods and for future use. In this study, analyses were done with data extracted from a modified copy of the January 2012 version of International Stormwater Best Management Practices Database, designated herein as the January 2012a version. Statistics for volume reduction, hydrograph extension, and water-quality treatment were developed with selected data. Sufficient data were available to estimate statistics for 5 to 10 BMP categories by using data from 40 to more than 165 monitoring sites. Water-quality treatment statistics were developed for 13 runoff-quality constituents commonly measured in highway and urban runoff studies including turbidity, sediment and solids; nutrients; total metals; organic carbon; and fecal coliforms. The medians of the best-fit statistics for each category were selected to construct generalized cumulative distribution functions for the three treatment variables. For volume reduction and hydrograph extension, interpretation of available data indicates that selection of a Spearman’s rho value that is the average of the median and maximum values for the BMP category may help generate realistic simulation results in SELDM. The median rho value may be selected to help generate realistic simulation results for water-quality treatment variables. MIC statistics were developed for 12 runoff-quality constituents commonly measured in highway and urban runoff studies by using data from 11 BMP categories and more than 167 monitoring sites. Four statistical techniques were applied for estimating MIC values with monitoring data from each site. These techniques produce a range of lower-bound estimates for each site. 
Four MIC estimators are proposed as alternatives for selecting a value from among the estimates from multiple sites. Correlation analysis indicates that the MIC estimates from multiple sites were weakly correlated with the geometric mean of inflow values, which indicates that there may be a qualitative or semiquantitative link between the inflow quality and the MIC. Correlations probably are weak because the MIC is influenced by the inflow water quality and the capability of each individual BMP site to reduce inflow concentrations.
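Stochastic sampling from a trapezoidal distribution, as SELDM uses for the three treatment variables, can be sketched as an area-weighted mixture of two triangular edges and a uniform plateau. The BMP statistics below are invented for illustration:

```python
import numpy as np

def sample_trapezoidal(a, b, c, d, size, rng):
    """Draw from a trapezoidal distribution with support [a, d] and
    plateau [b, c], via an area-weighted mixture of a rising triangle,
    a uniform centre, and a falling triangle."""
    h = 2.0 / ((d + c) - (a + b))                      # plateau density height
    areas = np.array([(b - a) * h / 2, (c - b) * h, (d - c) * h / 2])
    comp = rng.choice(3, size=size, p=areas)
    u = rng.random(size)
    out = np.empty(size)
    out[comp == 0] = a + (b - a) * np.sqrt(u[comp == 0])   # rising edge
    out[comp == 1] = b + (c - b) * u[comp == 1]            # plateau
    out[comp == 2] = d - (d - c) * np.sqrt(u[comp == 2])   # falling edge
    return out

rng = np.random.default_rng(7)
# Hypothetical volume-reduction ratios: min 0.1, plateau 0.3-0.6, max 0.9
x = sample_trapezoidal(0.1, 0.3, 0.6, 0.9, 10_000, rng)
print(x.min() >= 0.1 and x.max() <= 0.9)  # → True
```

SELDM additionally imposes rank correlation between treatment variables and the associated runoff variables; that reordering step is omitted here.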
Parson, W; Gusmão, L; Hares, D R; Irwin, J A; Mayr, W R; Morling, N; Pokorak, E; Prinz, M; Salas, A; Schneider, P M; Parsons, T J
2014-11-01
The DNA Commission of the International Society of Forensic Genetics (ISFG) regularly publishes guidelines and recommendations concerning the application of DNA polymorphisms to the question of human identification. Previous recommendations published in 2000 addressed the analysis and interpretation of mitochondrial DNA (mtDNA) in forensic casework. While the foundations set forth in the earlier recommendations still apply, new approaches to the quality control, alignment and nomenclature of mitochondrial sequences, as well as the establishment of mtDNA reference population databases, have been developed. Here, we describe these developments and discuss their application to both mtDNA casework and mtDNA reference population databasing applications. While the generation of mtDNA for forensic casework has always been guided by specific standards, it is now well-established that data of the same quality are required for the mtDNA reference population data used to assess the statistical weight of the evidence. As a result, we introduce guidelines regarding sequence generation, as well as quality control measures based on the known worldwide mtDNA phylogeny, that can be applied to ensure the highest quality population data possible. For both casework and reference population databasing applications, the alignment and nomenclature of haplotypes is revised here and the phylogenetic alignment proffered as acceptable standard. In addition, the interpretation of heteroplasmy in the forensic context is updated, and the utility of alignment-free database searches for unbiased probability estimates is highlighted. Finally, we discuss statistical issues and define minimal standards for mtDNA database searches. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Simon, Heather; Baker, Kirk R; Akhtar, Farhan; Napelenok, Sergey L; Possiel, Norm; Wells, Benjamin; Timin, Brian
2013-03-05
In setting primary ambient air quality standards, the EPA's responsibility under the law is to establish standards that protect public health. As part of the current review of the ozone National Ambient Air Quality Standard (NAAQS), the US EPA evaluated the health exposure and risks associated with ambient ozone pollution using a statistical approach to adjust recent air quality to simulate just meeting the current standard level, without specifying emission control strategies. One drawback of this purely statistical concentration rollback approach is that it does not take into account spatial and temporal heterogeneity of ozone response to emissions changes. The application of the higher-order decoupled direct method (HDDM) in the community multiscale air quality (CMAQ) model is discussed here to provide an example of a methodology that could incorporate this variability into the risk assessment analyses. Because this approach includes a full representation of the chemical production and physical transport of ozone in the atmosphere, it does not require assumed background concentrations, which have been applied to constrain estimates from past statistical techniques. The CMAQ-HDDM adjustment approach is extended to measured ozone concentrations by determining typical sensitivities at each monitor location and hour of the day based on a linear relationship between first-order sensitivities and hourly ozone values. This approach is demonstrated by modeling ozone responses for monitor locations in Detroit and Charlotte to domain-wide reductions in anthropogenic NOx and VOCs emissions. As seen in previous studies, ozone response calculated using HDDM compared well to brute-force emissions changes up to approximately a 50% reduction in emissions. A new stepwise approach is developed here to apply this method to emissions reductions beyond 50% allowing for the simulation of more stringent reductions in ozone concentrations. 
Compared to previous rollback methods, this application of modeled sensitivities to ambient ozone concentrations provides a more realistic spatial response of ozone concentrations at monitors inside and outside the urban core and at hours of both high and low ozone concentrations.
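The monitor-level adjustment described, in which the first-order sensitivity at each monitor and hour is a linear function of the hourly ozone value, can be sketched as below. The slope, intercept, and emission-cut fraction are illustrative assumptions, not fitted CMAQ-HDDM values:

```python
import numpy as np

def adjust_ozone(obs, sens_slope, sens_intercept, emis_cut):
    """First-order HDDM-style adjustment: sensitivity of ozone to a
    fractional emissions cut is modeled as linear in the hourly ozone
    value (slope/intercept would be fitted from CMAQ-HDDM output)."""
    sensitivity = sens_slope * obs + sens_intercept   # dO3 per unit cut
    return obs - emis_cut * sensitivity

# Hypothetical hourly ozone observations (ppb) and a 30% emissions cut
obs = np.array([30.0, 45.0, 60.0, 75.0])
adj = adjust_ozone(obs, sens_slope=0.4, sens_intercept=-6.0, emis_cut=0.3)
print(adj)
```

Because the modeled sensitivity grows with the ozone value, high-ozone hours are reduced more than low-ozone hours, which is the spatial and temporal heterogeneity that flat percentage rollbacks cannot capture.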
Volume reconstruction optimization for tomo-PIV algorithms applied to experimental data
NASA Astrophysics Data System (ADS)
Martins, Fabio J. W. A.; Foucaut, Jean-Marc; Thomas, Lionel; Azevedo, Luis F. A.; Stanislas, Michel
2015-08-01
Tomographic PIV is a three-component volumetric velocity measurement technique based on the tomographic reconstruction of a particle distribution imaged by multiple camera views. In essence, the performance and accuracy of this technique are highly dependent on the parametric adjustment and the reconstruction algorithm used. Although synthetic data have been widely employed to optimize experiments, the resulting reconstructed volumes might not have optimal quality. The purpose of the present study is to offer quality indicators that can be applied to data samples in order to improve the quality of velocity results obtained by the tomo-PIV technique. The methodology proposed can potentially lead to a significant reduction in the time required to optimize a tomo-PIV reconstruction, as well as to better-quality velocity results. Tomo-PIV data provided by a six-camera turbulent boundary-layer experiment were used to optimize the reconstruction algorithms according to this methodology. Velocity statistics obtained by the optimized BIMART, SMART, and MART algorithms were compared with hot-wire anemometer data, and velocity measurement uncertainties were computed. Results indicated that the BIMART and SMART algorithms produced reconstructed volumes of quality equivalent to the standard MART, with the benefit of reduced computational time.
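A minimal MART iteration illustrates the reconstruction family being compared above; the toy 1D system below (three rays, three voxels, binary weights) stands in for the full tomographic problem:

```python
import numpy as np

def mart(W, p, n_iter=50, relax=1.0):
    """Multiplicative ART: cycling over rays, rescale the reconstruction
    so each ray's projection matches its measured intensity."""
    f = np.ones(W.shape[1])                      # uniform initial guess
    for _ in range(n_iter):
        for i in range(W.shape[0]):
            proj = W[i] @ f
            if proj > 0:
                f *= (p[i] / proj) ** (relax * W[i])
    return f

# Toy consistent system: 3 rays viewing 3 voxels, true intensities known
W = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
true_f = np.array([1.0, 2.0, 1.5])
p = W @ true_f
rec = mart(W, p)
print(rec.round(2))  # recovers the true intensities [1.0, 2.0, 1.5]
```

BIMART and SMART are block-iterative and simultaneous variants of this multiplicative update, which is where their computational savings come from.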
[Burnout and quality of life in medical residents].
Prieto-Miranda, Sergio Emilio; Rodríguez-Gallardo, Gisela Bethsabé; Jiménez-Bernardino, Carlos Alberto; Guerrero-Quintero, Laura Guadalupe
2013-01-01
Burnout and quality of life are poorly studied phenomena in postgraduate students, and their effects are unknown. The aim was to investigate the relationship between quality of life and burnout in medical residents. A longitudinal study was performed. We included medical residents who began their postgraduate studies in 2010. The Spanish version of the Quality of Life Profile for the Chronically Ill (PLC, according to its initials in German) and the Maslach Burnout Inventory specific to physicians were applied at the beginning, and six and 12 months later. Descriptive statistics were used for nominal variables; chi-square and ANOVA were applied to numerical variables. We included 45 residents; the average age was 26.9 ± 2.93 years, 18 (40 %) were female and 27 (60 %) were male. The PLC survey found a significant decrease in four of the six scales assessed across the three measurements. The Maslach Burnout Inventory found high levels of emotional exhaustion in all three tests, and low levels of depersonalization and personal accomplishment at the beginning, rising at six and 12 months. The most affected specialty was Internal Medicine. Burnout and impaired quality of life exist in postgraduate physicians and are maintained during the first year of residency.
Longitudinal flying qualities criteria for single-pilot instrument flight operations
NASA Technical Reports Server (NTRS)
Stengel, R. F.; Bar-Gill, A.
1983-01-01
Modern estimation and control theory, flight testing, and statistical analysis were used to deduce flying qualities criteria for General Aviation Single Pilot Instrument Flight Rule (SPIFR) operations. The principal concern is that unsatisfactory aircraft dynamic response combined with high navigation/communication workload can produce problems of safety and efficiency. To alleviate these problems, the relative importance of these factors must be determined. This objective was achieved by flying SPIFR tasks with different aircraft dynamic configurations and assessing the effects of such variations under these conditions. The experimental results yielded quantitative indicators of pilot performance and workload; for each of them, multivariate regression was applied to evaluate several candidate flying qualities criteria.
Does a hospital's quality depend on the quality of other hospitals? A spatial econometrics approach
Gravelle, Hugh; Santos, Rita; Siciliani, Luigi
2014-01-01
We examine whether a hospital's quality is affected by the quality provided by other hospitals in the same market. We first sketch a theoretical model with regulated prices and derive conditions on demand and cost functions which determine whether a hospital will increase its quality if its rivals increase their quality. We then apply spatial econometric methods to a sample of English hospitals in 2009–10 and a set of 16 quality measures including mortality rates, readmission, revision and redo rates, and three patient reported indicators, to examine the relationship between the quality of hospitals. We find that a hospital's quality is positively associated with the quality of its rivals for seven out of the sixteen quality measures. There are no statistically significant negative associations. In those cases where there is a significant positive association, an increase in rivals' quality by 10% increases a hospital's quality by 1.7% to 2.9%. The finding suggests that for some quality measures a policy which improves the quality in one hospital will have positive spillover effects on the quality in other hospitals. PMID:25843994
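The spillover mechanism estimated above can be illustrated with a toy spatial model. The ring market, weight matrix, and rho value below are assumptions for illustration, not the paper's estimates:

```python
import numpy as np

# Toy market: 5 hospitals on a ring, each with two rivals (row-standardized W)
n = 5
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5

rho = 0.2                  # assumed strength of quality interdependence
x = np.ones(n)             # hospital-specific quality drivers
# Equilibrium of the quality reaction functions q = rho*W*q + x:
q_base = np.linalg.solve(np.eye(n) - rho * W, x)

x2 = x.copy()
x2[0] += 0.1               # exogenous quality improvement at hospital 0
q_new = np.linalg.solve(np.eye(n) - rho * W, x2)
print((q_new - q_base > 0).all())  # → True: rivals' quality rises too
```

The inverse (I - rho*W)^(-1) expands as a geometric series of neighbor effects, which is exactly why a positive spatial coefficient implies policy spillovers across the whole market.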
Chen, C; Xiang, J Y; Hu, W; Xie, Y B; Wang, T J; Cui, J W; Xu, Y; Liu, Z; Xiang, H; Xie, Q
2015-11-01
To screen and identify safe micro-organisms used during Douchi fermentation, and verify the feasibility of producing high-quality Douchi using these identified micro-organisms. PCR-denaturing gradient gel electrophoresis (DGGE) and automatic amino-acid analyser were used to investigate the microbial diversity and free amino acids (FAAs) content of 10 commercial Douchi samples. The correlations between microbial communities and FAAs were analysed by statistical analysis. Ten strains with significant positive correlation were identified. Then an experiment on Douchi fermentation by identified strains was carried out, and the nutritional composition in Douchi was analysed. Results showed that FAAs and relative content of isoflavone aglycones in verification Douchi samples were generally higher than those in commercial Douchi samples. Our study indicated that fungi, yeasts, Bacillus and lactic acid bacteria were the key players in Douchi fermentation, and with identified probiotic micro-organisms participating in fermentation, a higher quality Douchi product was produced. This is the first report to analyse and confirm the key micro-organisms during Douchi fermentation by statistical analysis. This work proves fermentation micro-organisms to be the key influencing factor of Douchi quality, and demonstrates the feasibility of fermenting Douchi using identified starter micro-organisms. © 2015 The Society for Applied Microbiology.
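The correlation screening between microbial abundance and FAA content can be sketched with a rank correlation; the abundance and FAA values below are invented for illustration:

```python
import numpy as np

def spearman_rho(a, b):
    """Spearman rank correlation as the Pearson correlation of the ranks
    (no tie correction; adequate for a sketch)."""
    ra = np.argsort(np.argsort(a))
    rb = np.argsort(np.argsort(b))
    return np.corrcoef(ra, rb)[0, 1]

# Hypothetical relative abundance of one strain vs total FAA across samples
abundance = np.array([0.02, 0.10, 0.05, 0.20, 0.15])
faa = np.array([1.1, 2.3, 1.6, 3.0, 2.8])
print(round(spearman_rho(abundance, faa), 2))  # → 1.0 (perfectly monotone)
```

Strains whose abundance correlates positively and significantly with FAA content across samples are the candidates such a study would carry forward into verification fermentations.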
Acoustic correlates of Japanese expressions associated with voice quality of male adults
NASA Astrophysics Data System (ADS)
Kido, Hiroshi; Kasuya, Hideki
2004-05-01
Japanese expressions associated with the voice quality of male adults were extracted by a series of questionnaire surveys and statistical multivariate analysis. One hundred and thirty-seven Japanese expressions were collected through the first questionnaire and careful investigations of well-established Japanese dictionaries and articles. From the second questionnaire about familiarity with each of the expressions and synonymity that were addressed to 249 subjects, 25 expressions were extracted. The third questionnaire was about an evaluation of their own voice quality. By applying a statistical clustering method and a correlation analysis to the results of the questionnaires, eight bipolar expressions and one unipolar expression were obtained. They constituted high-pitched/low-pitched, masculine/feminine, hoarse/clear, calm/excited, powerful/weak, youthful/elderly, thick/thin, tense/lax, and nasal, respectively. Acoustic correlates of each of the eight bipolar expressions were extracted by means of perceptual evaluation experiments that were made with sentence utterances of 36 males and by a statistical decision tree method. They included an average of the fundamental frequency (F0) of the utterance, speaking rate, spectral tilt, formant frequency parameter, standard deviation of F0 values, and glottal noise, when SPL of each of the stimuli was maintained identical in the perceptual experiments.
What Is the Methodologic Quality of Human Therapy Studies in ISI Surgical Publications?
Manterola, Carlos; Pineda, Viviana; Vial, Manuel; Losada, Héctor
2006-01-01
Objective: To determine the methodologic quality of therapy articles about humans published in ISI surgical journals, and to explore the association between methodologic quality, origin, and subject matter. Summary Background Data: It is supposed that ISI journals contain the best methodologic articles. Methods: This is a bibliometric study. All journals listed in the 2002 ISI under the subject heading of “Surgery” were included. A simple randomized sampling was conducted for selected journals (Annals of Surgery, The American Surgeon, Archives of Surgery, British Journal of Surgery, European Journal of Surgery, Journal of the American College of Surgeons, Surgery, and World Journal of Surgery). Published articles related to therapy on humans of the selected journals were reviewed and analyzed. All kinds of clinical designs were considered, excluding editorials, review articles, letters to the editor, and experimental studies. The variables considered were: place of origin, design, and the methodologic quality of articles, which was determined by applying a valid and reliable scale. The review was performed interchangeably and independently by 2 research teams. Descriptive and analytical statistics were used. Statistical significance was defined as P values less than 1%. Results: A total of 653 articles were studied. Studies came predominantly from the United States and Europe (43.6% and 36.8%, respectively). The subject areas most frequently found were digestive and hepatobiliopancreatic surgery (29.1% and 24.5%, respectively). Average and median methodologic quality scores of the entire series were 11.6 ± 4.9 points and 11 points, respectively. The association between methodologic quality and journals was determined. Also, the association between methodologic quality and origin was observed, but no association with subject area was verified. 
Conclusions: The methodologic quality of therapy articles published in the journals analyzed is low; however, statistically significant differences between journals were found. An association was observed between methodologic quality and origin, but not with subject matter. PMID:17060778
NASA Astrophysics Data System (ADS)
Al-Hayani, Nazar; Al-Jawad, Naseer; Jassim, Sabah A.
2014-05-01
Video compression and encryption have become essential for secure real-time video transmission. Applying both techniques simultaneously is challenging when both size and quality matter in multimedia transmission. In this paper we propose a new technique for video compression and encryption. Both are based on edges extracted from the high-frequency sub-bands of the wavelet decomposition. The compression algorithm is based on a hybrid of discrete wavelet transform, discrete cosine transform, vector quantization, wavelet-based edge detection, and phase sensing. The compression encoding algorithm treats reference and non-reference video frames in two different ways. The encryption algorithm utilizes the A5 cipher combined with a chaotic logistic map to encrypt the significant parameters and wavelet coefficients. Both algorithms can be applied simultaneously after applying the discrete wavelet transform to each individual frame. Experimental results show that the proposed algorithms offer high compression, acceptable quality, and resistance to statistical and brute-force attacks with low computational processing.
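The chaotic-logistic-map half of such a keystream can be sketched as follows; the A5 stage is omitted, and the seed and map parameter are illustrative:

```python
import numpy as np

def logistic_keystream(x0, n, r=3.99):
    """Byte keystream from the chaotic logistic map
    x_{k+1} = r * x_k * (1 - x_k); r near 4 gives chaotic orbits."""
    x = x0
    out = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1 - x)
        out[i] = int(x * 256) % 256
    return out

# XOR-mask hypothetical quantized wavelet-coefficient bytes
coeffs = np.arange(16, dtype=np.uint8)
key = logistic_keystream(0.3141, coeffs.size)
cipher = coeffs ^ key
print(np.array_equal(cipher ^ key, coeffs))  # → True: XOR masking inverts
```

XOR with a keystream is symmetric, so the same map seed recovers the coefficients at the decoder; the security rests on keeping the seed and the A5 key material secret.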
NASA Astrophysics Data System (ADS)
Cisar, J. L.; Williams, K. E.; Vivas, H. E.; Haydu, J. J.
2000-05-01
Even with routine irrigation, soil-water repellency on sand-based turfgrass systems can occur. This study evaluated three commercially available surfactants alone or in combination in 1996, four experimental surfactant formulations in 1997, and four commercially available surfactants and one experimental surfactant in 1998 for their effect on reducing soil-water repellency in mature Cynodon dactylon X Cynodon transvaalensis cv. Tifdwarf sand-based greens. The treatments in 1996 were a commercial standard, AquaGro (AG), and two new products, Primer (P) and Aqueduct (AD), applied as liquids at rates of 250, 190, and 250 ml per 100 m2, respectively, and a control. Combination treatments of P+AG and P+AD were also applied at standard rates. Surfactants were evaluated for their effect on turfgrass quality and percent dry spot incidence through a period of drought that induced soil-water repellency symptoms and subsequently through a period of recovery. Water drop penetration times (WDPT) on the soil cores were determined. Data were analyzed for statistical significance (P<0.05) by automated ANOVA procedures. Results in 1996 demonstrated that during a period of drought, P or AD generally provided both significantly (P<0.05) higher turfgrass quality and reduced percent dry spotting compared with AG and the control. Primer or AD significantly (P<0.05) reduced WDPT. Furthermore, during a recovery period following the drought, P or AD provided significantly (P<0.05) higher turfgrass quality than untreated controls. Combinations of P+AG or P+AD did not provide significantly higher turfgrass quality or fewer percent dry spots than individual applications of either P or AD. The second experiment in 1997 consisted of four experimental surfactant formulations (ACA 1257, ACA 1313, ACA 1455, and ACA 1457) and a control, applied at the recommended rate of 250 ml per 100 m2, weekly, to plots.
As in 1996, surfactants were visually evaluated for turfgrass quality and percent dry spot incidence, and soil cores were evaluated for WDPT. Results demonstrated that ACA treatments generally provided significantly (P<0.10) higher turfgrass quality and less percent dry spotting than the untreated control. In 1998, for the third experiment, AD, P, Cascade, LescoFlo, and an experimental surfactant (N-07/05) were applied to a green with extensive soil-water repellency to alleviate its symptoms. The four commercially available surfactants performed well, providing statistically equivalent performance to one another and significantly (P<0.01) better turfgrass quality and percent dry spot reduction than the untreated control. The N-07/05 treatment also improved turfgrass quality and reduced dry spots compared to the untreated plots, but on most dates did not perform as well as the commercial standards.
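The treatment screening described above rests on one-way ANOVA at a stated significance level. A minimal sketch of that comparison, assuming SciPy is available; the function name and the plot-level quality ratings below are illustrative placeholders, not the study's data:

```python
from scipy import stats

def surfactant_anova(*groups):
    """One-way ANOVA across surfactant treatments (e.g. control, P, AD),
    mirroring the abstract's P < 0.05 significance screening."""
    f, p = stats.f_oneway(*groups)
    return f, p

# Hypothetical plot-level turfgrass quality ratings per treatment
control = [5.1, 5.3, 4.9, 5.0, 5.2]
primer = [7.0, 7.2, 6.9, 7.1, 6.8]
aqueduct = [6.8, 7.1, 7.0, 6.9, 7.2]
f_stat, p_value = surfactant_anova(control, primer, aqueduct)
```

A significant result (p below the chosen threshold) would then justify pairwise comparisons between treatments, as in the abstract.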
Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A
2010-06-01
The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity-modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results; it is therefore necessary to put portal dosimetry measurements under control (the ionisation chamber measurements were already monitored with statistical process control tools). The second objective was to state whether it is possible to substitute portal dosimetry for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphical tools such as control charts to follow up the process and warn the operator in case of failure, and quantitative tools to evaluate the process's ability to respect guidelines: the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Control charts of the mean and standard deviation were established; they revealed drifts both slow and weak and strong and fast, and identified an introduced special cause (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point by the EPID and by the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified.
The study demonstrated the feasibility of reducing the time devoted to pretreatment controls by substituting EPID measurements for those of the ionisation chamber, and showed that statistical process control monitoring of the data provided a guarantee of security. 2010 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
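The control charts of the mean described in this abstract are classic Shewhart charts. A minimal sketch of computing X-bar chart limits from subgroup statistics, using the standard tabulated A2 constants; the function name and example subgroups are illustrative, not the study's dosimetry data:

```python
import numpy as np

def xbar_limits(subgroup_means, subgroup_ranges, n):
    """Shewhart X-bar chart limits from subgroup means and ranges.
    A2 constants for subgroup sizes 2-5 come from standard SPC tables.
    Returns (lower control limit, center line, upper control limit)."""
    A2 = {2: 1.880, 3: 1.023, 4: 0.729, 5: 0.577}
    xbar = np.mean(subgroup_means)   # grand mean = center line
    rbar = np.mean(subgroup_ranges)  # mean range drives limit width
    return xbar - A2[n] * rbar, xbar, xbar + A2[n] * rbar

# Hypothetical subgroups of 3 dose-deviation measurements
lcl, center, ucl = xbar_limits([10.0, 10.2, 9.8], [0.4, 0.6, 0.5], 3)
```

Points falling outside (lcl, ucl), or systematic drifts within the limits, are the "special causes" that such monitoring flags, like the multileaf collimator leaf-gap shift reported here.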
Huedo-Medina, Tania B; Garcia, Marissa; Bihuniak, Jessica D; Kenny, Anne; Kerstetter, Jane
2016-03-01
Several systematic reviews/meta-analyses published within the past 10 y have examined the associations of Mediterranean-style diets (MedSDs) with cardiovascular disease (CVD) risk. However, these reviews have not been evaluated for satisfying contemporary methodologic quality standards. This study evaluated the quality of recent systematic reviews/meta-analyses on MedSD and CVD risk outcomes by using an established methodologic quality scale. The relation between review quality and the impact-per-publication value of the journal in which the article had been published was also evaluated. To assess compliance with current standards, we applied a modified version of the Assessment of Multiple Systematic Reviews (AMSTARMedSD) quality scale to systematic reviews/meta-analyses retrieved from electronic databases that met our selection criteria: 1) used systematic or meta-analytic procedures to review the literature, 2) examined MedSD trials, and 3) had MedSD interventions independently or combined with other interventions. Reviews completely satisfied from 8% to 75% of the AMSTARMedSD items (mean ± SD: 31.2% ± 19.4%), with those published in higher-impact journals having greater quality scores. At a minimum, 60% of the 24 reviews did not disclose full search details or apply appropriate statistical methods to combine study findings. Only 5 of the reviews included participant or study characteristics in their analyses, and none evaluated MedSD diet characteristics. These data suggest that current meta-analyses/systematic reviews evaluating the effect of MedSD on CVD risk do not fully comply with contemporary methodologic quality standards. As a result, more research questions remain to be answered to enhance our understanding of how MedSD affects CVD risk and how these effects may be modified by participant or MedSD characteristics.
To clarify the associations between MedSD and CVD risk, future meta-analyses and systematic reviews should not only follow methodologic quality standards but also include more statistical modeling results when data allow. © 2016 American Society for Nutrition.
Selen, Arzu; Cruañes, Maria T; Müllertz, Anette; Dickinson, Paul A; Cook, Jack A; Polli, James E; Kesisoglou, Filippos; Crison, John; Johnson, Kevin C; Muirhead, Gordon T; Schofield, Timothy; Tsong, Yi
2010-09-01
A biopharmaceutics and Quality by Design (QbD) conference was held on June 10-12, 2009 in Rockville, Maryland, USA to provide a forum and identify approaches for enhancing product quality for patient benefit. Presentations concerned the current biopharmaceutical toolbox (i.e., in vitro, in silico, pre-clinical, in vivo, and statistical approaches), as well as case studies and reflections on new paradigms. Plenary and breakout session discussions evaluated the current state and envisioned a future state that more effectively integrates QbD and biopharmaceutics. Breakout groups discussed the following four topics: Integrating Biopharmaceutical Assessment into the QbD Paradigm, Predictive Statistical Tools, Predictive Mechanistic Tools, and Predictive Analytical Tools. Nine priority areas, further described in this report, were identified for advancing the integration of biopharmaceutics and supporting a more fundamentally based, integrated approach to setting product dissolution/release acceptance criteria. Collaboration among a broad range of disciplines and the fostering of a knowledge-sharing environment that places the patient's needs at the focus of drug development, consistent with the science- and risk-based spirit of QbD, were identified as key components of the path forward.
Connecting optical and X-ray tracers of galaxy cluster relaxation
NASA Astrophysics Data System (ADS)
Roberts, Ian D.; Parker, Laura C.; Hlavacek-Larrondo, Julie
2018-04-01
Substantial effort has been devoted to determining the ideal proxy for quantifying the morphology of the hot intracluster medium in clusters of galaxies. These proxies, based on X-ray emission, typically require expensive, high-quality X-ray observations, making them difficult to apply to large surveys of groups and clusters. Here, we compare optical relaxation proxies with X-ray asymmetries and centroid shifts for a sample of Sloan Digital Sky Survey clusters with high-quality, archival X-ray data from Chandra and XMM-Newton. The three optical relaxation measures considered are the shape of the member-galaxy projected velocity distribution - measured by the Anderson-Darling (AD) statistic - the stellar mass gap between the most-massive and second-most-massive cluster galaxy, and the offset between the most-massive galaxy (MMG) position and the luminosity-weighted cluster centre. The AD statistic and stellar mass gap correlate significantly with X-ray relaxation proxies, with the AD statistic being the stronger correlator. Conversely, we find no evidence for a correlation between X-ray asymmetry or centroid shift and the MMG offset. High-mass clusters (Mhalo > 10^14.5 M⊙) in this sample have X-ray asymmetries, centroid shifts, and Anderson-Darling statistics that are systematically larger than for low-mass systems. Finally, considering the dichotomy of Gaussian and non-Gaussian clusters (measured by the AD test), we show that the probability of being a non-Gaussian cluster correlates significantly with X-ray asymmetry but only shows a marginal correlation with centroid shift. These results confirm the shape of the radial velocity distribution as a useful proxy for cluster relaxation, which can then be applied to large redshift surveys lacking extensive X-ray coverage.
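The Anderson-Darling statistic used above as an optical relaxation proxy can be computed directly with SciPy; a larger statistic indicates a less Gaussian velocity distribution, i.e. a less relaxed cluster. A sketch under the assumption of line-of-sight member velocities; the function name is mine:

```python
import numpy as np
from scipy import stats

def ad_gaussianity(velocities):
    """Anderson-Darling test of Gaussianity for a cluster's member-galaxy
    line-of-sight velocities. Larger statistic = less Gaussian."""
    result = stats.anderson(np.asarray(velocities), dist='norm')
    return result.statistic
```

For example, a merging system with two kinematic components (a bimodal velocity distribution) yields a much larger AD statistic than a single relaxed Gaussian cluster.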
NASA Technical Reports Server (NTRS)
Stolzer, Alan J.; Halford, Carl
2007-01-01
In a previous study, multiple regression techniques were applied to Flight Operations Quality Assurance-derived data to develop parsimonious model(s) for fuel consumption on the Boeing 757 airplane. The present study examined several data mining algorithms, including neural networks, on the fuel consumption problem and compared them to the multiple regression results obtained earlier. Using regression methods, parsimonious models were obtained that explained approximately 85% of the variation in fuel flow. In general, data mining methods were more effective in predicting fuel consumption. Classification and Regression Tree methods reported correlation coefficients of .91 to .92, and General Linear Models and Multilayer Perceptron neural networks reported correlation coefficients of about .99. These data mining models show great promise for use in further examining large FOQA databases for operational and safety improvements.
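The comparison reported here, a parsimonious linear regression against tree-based data mining methods judged by the correlation between predicted and observed fuel flow, can be sketched as follows. This assumes scikit-learn is available; the features and data are synthetic placeholders, not FOQA data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

def compare_models(X, y):
    """Fit a linear model and a regression tree, and score each by the
    correlation coefficient between fitted and observed values."""
    results = {}
    for name, model in [("linear", LinearRegression()),
                        ("tree", DecisionTreeRegressor(max_depth=5, random_state=0))]:
        pred = model.fit(X, y).predict(X)
        results[name] = np.corrcoef(y, pred)[0, 1]
    return results

# Synthetic nonlinear relationship, standing in for fuel-flow drivers
rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, (300, 2))
y = np.sin(3 * X[:, 0]) * X[:, 1] + rng.normal(0, 0.1, 300)
scores = compare_models(X, y)
```

On nonlinear data like this, the tree's correlation exceeds the linear model's, which is the qualitative pattern the abstract reports for the data mining methods.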
[Review of meta-analysis research on exercise in South Korea].
Song, Youngshin; Gang, Moonhee; Kim, Sun Ae; Shin, In Soo
2014-10-01
The purpose of this study was to evaluate the quality of meta-analyses regarding exercise using the Assessment of Multiple Systematic Reviews (AMSTAR), as well as to compare effect sizes according to outcomes. Electronic databases, including the Korean Studies Information Service System (KISS), the National Assembly Library, DBpia, HAKJISA and RISS4U, were searched for the dates 1990 to January 2014 for 'meta-analysis' and 'exercise' in the fields of medicine, nursing, physical therapy and physical exercise in Korea. AMSTAR was scored for quality assessment of the 33 articles included in the study. Data were analyzed using descriptive statistics, t-test, ANOVA and χ²-test. The mean score on AMSTAR evaluations was 4.18 (SD=1.78); about 67% of the articles were classified at the low-quality level and 30% at the moderate-quality level. Quality scores differed statistically by field of research, number of participants, number of databases, financial support and approval by an IRB. The effect sizes presented in individual studies differed by the type of exercise applied in the intervention. This critical appraisal of meta-analyses published in various fields that focused on exercise indicates that a guideline such as the PRISMA checklist should be strongly recommended for optimum reporting of meta-analyses across research fields.
Pizolato, Raquel Aparecida; Rehder, Maria Inês Beltrati Cornacchioni; Meneghim, Marcelo de Castro; Ambrosano, Glaucia Maria Bovi; Mialhe, Fábio Luiz; Pereira, Antonio Carlos
2013-02-27
Voice problems are more common in teachers due to intensive voice use during their work routine. There is evidence that occupational dysphonia prevention programs are important in improving voice quality and, consequently, subjects' quality of life. To investigate the impact of educational voice interventions for teachers on quality of life and voice. A longitudinal interventional study involved 70 teachers randomly selected from 11 public schools: 30 received an educational intervention with vocal training exercises and vocal hygiene habits (experimental group) and 40 received guidance on vocal hygiene habits only (control group). Before the educational activities, the Voice-Related Quality of Life instrument (V-RQOL) was applied, and 3 months after conclusion of the activities, the subjects were interviewed again using the same instrument. For data analysis, Proc MIXED was applied, with a level of significance of α < 0.05. Teachers showed significantly higher domain and overall V-RQOL scores after the preventive intervention, in both the control and experimental groups. Nevertheless, there was no statistical difference in scores between the groups. Educational actions for vocal health had a positive impact on the quality of life of the participants, and the incorporation of permanent educational actions at the institutional level is suggested.
Integrating image quality in 2nu-SVM biometric match score fusion.
Vatsa, Mayank; Singh, Richa; Noore, Afzel
2007-10-01
This paper proposes an intelligent 2nu-support vector machine based match score fusion algorithm to improve the performance of face and iris recognition by integrating the quality of images. The proposed algorithm applies the redundant discrete wavelet transform to evaluate the underlying linear and non-linear features present in the image. A composite quality score is computed to determine the extent of smoothness, sharpness, noise, and other pertinent features present in each subband of the image. The match score and the corresponding quality score of an image are fused using a 2nu-support vector machine to improve the verification performance. The proposed algorithm is experimentally validated using the FERET face database and the CASIA iris database. The verification performance and statistical evaluation show that the proposed algorithm outperforms existing fusion algorithms.
Journal of Air Transportation, Volume 12, No. 1
NASA Technical Reports Server (NTRS)
Bowers, Brent D. (Editor); Kabashkin, Igor (Editor)
2007-01-01
Topics discussed include: a) Data Mining Methods Applied to Flight Operations Quality Assurance Data: A Comparison to Standard Statistical Methods; b) Financial Comparisons across Different Business Models in the Canadian Airline Industry; c) Carving a Niche for the "No-Frills" Carrier, Air Arabia, in Oil-Rich Skies; d) Situational Leadership in Air Traffic Control; and e) The Very Light Jet Arrives: Stakeholders and Their Perceptions.
Michael Arbaugh; Larry Bednar
1996-01-01
The sampling methods used to monitor ozone injury to ponderosa and Jeffrey pines depend on the objectives of the study, geographic and genetic composition of the forest, and the source and composition of air pollutant emissions. By using a standardized sampling methodology, it may be possible to compare conditions within local areas more accurately, and to apply the...
Lee, Sangyun; Kwon, Heejin; Cho, Jihan
2016-12-01
To investigate image quality characteristics of abdominal computed tomography (CT) scans reconstructed with adaptive statistical iterative reconstruction V (ASIR-V) vs the currently applied adaptive statistical iterative reconstruction (ASIR). This institutional review board-approved study included 35 consecutive patients who underwent CT of the abdomen. Among these 35 patients, 27 with focal liver lesions underwent abdominal CT with a 128-slice multidetector unit using the following parameters: fixed noise index of 30, 1.25 mm slice thickness, 120 kVp, and a gantry rotation time of 0.5 seconds. CT images were analyzed depending on the method of reconstruction: ASIR (30%, 50%, and 70%) vs ASIR-V (30%, 50%, and 70%). Three radiologists independently assessed randomized images in a blinded manner. Imaging sets were compared for focal lesion detection numbers, overall image quality, and objective noise with a paired-sample t test. Interobserver agreement was assessed with the intraclass correlation coefficient. The detection of small focal liver lesions (<10 mm) was significantly higher when ASIR-V was used compared to ASIR (P <0.001). Subjective image noise, artifact, and objective image noise in the liver were generally significantly better for ASIR-V compared to ASIR, especially at 50% ASIR-V. Image sharpness and diagnostic acceptability were significantly worse at 70% ASIR-V compared to the various levels of ASIR. Images analyzed using 50% ASIR-V were significantly better than the three series of ASIR or the other ASIR-V conditions at providing diagnostically acceptable CT scans without compromising image quality and at detecting focal liver lesions. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Parks, Nathan A.; Gannon, Matthew A.; Long, Stephanie M.; Young, Madeleine E.
2016-01-01
Analysis of event-related potential (ERP) data includes several steps to ensure that ERPs meet an appropriate level of signal quality. One such step, subject exclusion, rejects subject data if ERP waveforms fail to meet an appropriate level of signal quality. Subject exclusion is an important quality control step in the ERP analysis pipeline as it ensures that statistical inference is based only upon those subjects exhibiting clear evoked brain responses. This critical quality control step is most often performed simply through visual inspection of subject-level ERPs by investigators. Such an approach is qualitative, subjective, and susceptible to investigator bias, as there are no standards as to what constitutes an ERP of sufficient signal quality. Here, we describe a standardized and objective method for quantifying waveform quality in individual subjects and establishing criteria for subject exclusion. The approach uses bootstrap resampling of ERP waveforms (from a pool of all available trials) to compute a signal-to-noise ratio confidence interval (SNR-CI) for individual subject waveforms. The lower bound of this SNR-CI (SNRLB) yields an effective and objective measure of signal quality as it ensures that ERP waveforms statistically exceed a desired signal-to-noise criterion. SNRLB provides a quantifiable metric of individual subject ERP quality and eliminates the need for subjective evaluation of waveform quality by the investigator. We detail the SNR-CI methodology, establish the efficacy of employing this approach with Monte Carlo simulations, and demonstrate its utility in practice when applied to ERP datasets. PMID:26903849
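The SNR-CI procedure described above (bootstrap resampling of trials from the available pool, averaging into an ERP, and taking the lower confidence bound on the signal-to-noise ratio) can be sketched as follows. The RMS-based SNR definition and the window arguments are my assumptions for illustration, not necessarily the authors' exact formulation:

```python
import numpy as np

def snr_lower_bound(trials, signal_win, noise_win, n_boot=1000, alpha=0.05, seed=0):
    """Bootstrap lower confidence bound (SNR_LB) on ERP signal-to-noise ratio.
    trials: (n_trials, n_samples) array of single-trial epochs.
    signal_win, noise_win: slices into the averaged waveform (e.g. post- and
    pre-stimulus windows). SNR = RMS(signal window) / RMS(noise window)."""
    rng = np.random.default_rng(seed)
    n = trials.shape[0]
    snrs = np.empty(n_boot)
    for b in range(n_boot):
        # Resample trials with replacement, then average into an ERP
        erp = trials[rng.integers(0, n, n)].mean(axis=0)
        snrs[b] = (np.sqrt(np.mean(erp[signal_win] ** 2))
                   / np.sqrt(np.mean(erp[noise_win] ** 2)))
    # Lower bound of the (1 - alpha) bootstrap percentile interval
    return np.quantile(snrs, alpha / 2)
```

A subject would then be excluded if this lower bound fails to exceed the chosen SNR criterion, replacing visual inspection with an objective rule.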
Savithra, Prakash; Nagesh, Lakshminarayan Shetty
2013-01-01
To assess a) whether the quality of reporting of randomised controlled trials (RCTs) has improved since the formulation of the Consolidated Standards of Reporting Trials (CONSORT) statement and b) whether there is any difference in the reporting of RCTs between the selected public health dentistry journals. A hand search of journals of public health dentistry was performed and four journals were identified for the study: Community Dentistry and Oral Epidemiology (CDOE), Community Dental Health (CDH), Journal of Public Health Dentistry (JPHD) and Oral Health and Preventive Dentistry (OHPD). A total of 114 RCTs published between 1990 and 2009 were selected. CONSORT guidelines were applied to each selected article in order to assess and determine any improvement since the publication of the CONSORT guidelines. The chi-square test was employed to determine any statistically significant difference in the quality of reporting of RCTs before and after the publication of the CONSORT guidelines. A comparison was also done to determine any statistically significant difference in the quality of reporting of RCTs between the selected journals. The title, abstract, discussion and conclusion sections of the selected articles showed adherence to the CONSORT guidelines, whereas compliance was poor with respect to the methodology section. The quality of reporting of RCTs is generally poor in public health dentistry journals. Overall, the quality of reporting has not substantially improved since the publication of the CONSORT guidelines.
Vanniyasingam, Thuva; Daly, Caitlin; Jin, Xuejing; Zhang, Yuan; Foster, Gary; Cunningham, Charles; Thabane, Lehana
2018-06-01
This study reviews simulation studies of discrete choice experiments (DCEs) to determine (i) how survey design features affect statistical efficiency and (ii) to appraise their reporting quality. Statistical efficiency was measured using relative design (D-) efficiency, D-optimality, or D-error. For this systematic survey, we searched Journal Storage (JSTOR), ScienceDirect, PubMed, and OVID, which included a search within EMBASE. Searches were conducted up to the year 2016 for simulation studies investigating the impact of DCE design features on statistical efficiency. Studies were screened and data were extracted independently and in duplicate. Results for each included study were summarized by design characteristic. Previously developed criteria for the reporting quality of simulation studies were also adapted and applied to each included study. Of 371 potentially relevant studies, 9 were found to be eligible, with several varying in study objectives. Statistical efficiency improved when increasing the number of choice tasks or alternatives; decreasing the number of attributes and attribute levels; using an unrestricted continuous "manipulator" attribute; using model-based approaches with covariates incorporating response behaviour; using sampling approaches that incorporate previous knowledge of response behaviour; incorporating heterogeneity in a model-based design; correctly specifying Bayesian priors; minimizing parameter prior variances; and using an appropriate method to create the DCE design for the research question. The simulation studies performed well in terms of reporting quality. Improvement is needed with regard to clearly specifying study objectives, number of failures, random number generators, starting seeds, and the software used. These results identify the best approaches to structuring a DCE. An investigator can manipulate design characteristics to help reduce response burden and increase statistical efficiency.
Since the studies varied in their objectives, conclusions were drawn on several design characteristics; however, the validity of each conclusion was limited. Further research should explore these conclusions in various design settings and scenarios. Additional reviews exploring other statistical efficiency outcomes and databases could also be performed to strengthen the conclusions identified in this review.
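The efficiency measures surveyed above (D-efficiency, D-optimality, D-error) all derive from the determinant of the design's information matrix. A minimal sketch of D-error for a linear-model design matrix, assuming the common normalization det((X'X/n)^(-1))^(1/p); D-errors for actual choice models additionally depend on assumed parameter values:

```python
import numpy as np

def d_error(X):
    """D-error of a design matrix X (n observations x p parameters):
    det((X'X / n)^-1) ** (1/p). Lower D-error = more efficient design."""
    n, p = X.shape
    info = X.T @ X / n               # normalized information matrix
    return np.linalg.det(np.linalg.inv(info)) ** (1.0 / p)

# An orthogonal +/-1 design vs a design with correlated columns
X_orth = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]], float)
X_corr = np.array([[1, 1], [1, 1], [-1, -1], [-1, 1]], float)
```

The orthogonal design attains the lower D-error, which is why balanced, orthogonal layouts are the usual starting point before the Bayesian-prior and heterogeneity refinements the review discusses.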
It's Only a Phase: Applying the 5 Phases of Clinical Trials to the NSCR Model Improvement Process
NASA Technical Reports Server (NTRS)
Elgart, S. R.; Milder, C. M.; Chappell, L. J.; Semones, E. J.
2017-01-01
NASA limits astronaut radiation exposures to a 3% risk of exposure-induced death from cancer (REID) at the upper 95% confidence level. As astronauts approach this limit, it is important that the estimate of REID be as accurate as possible. The NASA Space Cancer Risk 2012 (NSCR-2012) model has been the standard for NASA's space radiation protection guidelines since its publication in 2013. The model incorporates elements from U.S. baseline statistics, Japanese atomic bomb survivor research, animal models, cellular studies, and radiation transport to calculate astronauts' baseline risk of cancer and REID. The NSCR model is under constant revision to ensure that emerging research is incorporated into radiation protection standards. It is important, however, to develop guidelines for determining what new research is appropriate for integration. Certain standards of transparency are necessary in order to assess data quality, statistical quality, and analytical quality. To this end, all original source code and any raw data used to develop the code are required, to confirm there are no errors that significantly change reported outcomes. It is possible to apply a clinical trials approach to select and assess the improvement concepts that will be incorporated into future iterations of NSCR. This poster describes the five phases of clinical trials research (pre-clinical research and clinical research phases I-IV), explaining how each step can be translated into an appropriate NSCR model selection guideline.
Sales, C; Cervera, M I; Gil, R; Portolés, T; Pitarch, E; Beltran, J
2017-02-01
The novel atmospheric pressure chemical ionization (APCI) source has been used in combination with gas chromatography (GC) coupled to hybrid quadrupole time-of-flight (QTOF) mass spectrometry (MS) for determination of the volatile components of olive oil, enhancing its potential for classification of olive oil samples according to their quality using a metabolomics-based approach. Full-spectrum acquisition allowed the detection of volatile organic compounds (VOCs) in olive oil samples of Extra Virgin, Virgin and Lampante qualities. A dynamic headspace extraction with cartridge solvent elution was applied. The metabolomics strategy consisted of three steps: a full mass spectral alignment of GC-MS data using MzMine 2.0, a multivariate analysis using Ez-Info, and the creation of the statistical model with combinations of responses for molecular fragments. The model was finally validated using blind samples, obtaining an accuracy in oil classification of 70%, taking the officially established method, the "PANEL TEST", as reference. Copyright © 2016 Elsevier Ltd. All rights reserved.
de Couto Nascimento, Vanessa; de Castro Ferreira Conti, Ana Cláudia; de Almeida Cardoso, Maurício; Valarelli, Danilo Pinelli; de Almeida-Pedrin, Renata Rodrigues
2016-09-01
To evaluate whether orthodontic treatment in adults requiring oral rehabilitation is effective for increasing patients' self-esteem and quality of life (QoL). The sample consisted of 102 adult patients (77 women and 25 men) aged between 18 and 66 years (mean, 35.1 years) requiring oral rehabilitation and orthodontic treatment simultaneously. Rosenberg's Self-Esteem (RSE) Scale and a questionnaire about QoL based on the Oral Health Impact Profile (OHIP-14) were used to determine self-esteem and QoL scores retrospectively. Questionnaires were carried out in two stages, T1 (start of treatment) and T2 (6 months after). To compare score changes between T1 and T2, the data obtained from the RSE Scale were evaluated with paired t tests, and data from the quality-of-life questionnaire were assessed by applying descriptive statistics. The results showed a statistically significant increase in self-esteem (P < .001) and a great improvement on patients' QoL. Orthodontic treatment causes a significant increase in self-esteem and QoL, providing psychological benefits for adult patients in need of oral rehabilitation.
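The pre/post comparison of RSE Scale scores described above is a paired t-test between T1 and T2. A minimal SciPy sketch; the scores below are illustrative placeholders, not the study's data:

```python
from scipy import stats

def paired_change(before, after):
    """Paired t-test for pre/post questionnaire scores (T1 vs T2).
    A positive t statistic indicates an increase from T1 to T2."""
    t, p = stats.ttest_rel(after, before)
    return t, p

# Hypothetical RSE scores for 8 patients at T1 and T2
before = [20, 22, 19, 25, 21, 23, 18, 24]
after = [24, 25, 23, 28, 24, 27, 22, 27]
t_stat, p_value = paired_change(before, after)
```

Pairing matters here because each patient serves as their own control, so the test examines within-patient change rather than the (noisier) difference between group means.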
Methods for trend analysis: Examples with problem/failure data
NASA Technical Reports Server (NTRS)
Church, Curtis K.
1989-01-01
Statistics play an important role in quality control and reliability. Accordingly, the NASA standard Trend Analysis Techniques recommends a variety of statistical methodologies that can be applied to time series data. The major goal of this working handbook, which uses data from the MSFC Problem Assessment System, is to illustrate some of the techniques in the NASA standard along with some different techniques, and to identify patterns in the data. The techniques used for trend estimation are regression (exponential, power, reciprocal, straight line) and Kendall's rank correlation coefficient. The important details of a statistical strategy for estimating a trend component are covered in the examples. However, careful analysis and interpretation are necessary because of small samples and frequent zero problem reports in a given time period. Further investigations to deal with these issues are being conducted.
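Kendall's rank correlation, one of the trend techniques named above, is attractive for sparse problem-report series precisely because it is nonparametric and tolerates ties and zeros. A sketch assuming SciPy; the function name and example counts are illustrative, not MSFC data:

```python
import numpy as np
from scipy import stats

def kendall_trend(counts):
    """Kendall's rank correlation of a count series against time index,
    used as a nonparametric trend test (tau > 0: upward trend)."""
    t = np.arange(len(counts))
    tau, p_value = stats.kendalltau(t, counts)
    return tau, p_value

# Hypothetical monthly problem-report counts showing growth
tau, p = kendall_trend([0, 1, 1, 2, 3, 5, 8])
```

For a mostly increasing series like this, tau is close to 1; a series dominated by zeros would give a tau near 0 with a large p-value, which is exactly the small-sample caution the abstract raises.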
Applied learning-based color tone mapping for face recognition in video surveillance system
NASA Astrophysics Data System (ADS)
Yew, Chuu Tian; Suandi, Shahrel Azmin
2012-04-01
In this paper, we present an applied learning-based color tone mapping technique for video surveillance systems. The technique can be applied to both color and grayscale surveillance images. The basic idea is to learn the color or intensity statistics from a training dataset of photorealistic images of the candidates appearing in the surveillance images, and to remap the color or intensity of the input image so that its statistics match those of the training dataset. Differences among commercial surveillance camera models and the signal-processing chipsets used by different manufacturers cause the color and intensity of images to differ from one another, creating additional challenges for face recognition in video surveillance systems. Using multi-class support vector machines as the classifier on a publicly available video surveillance camera database, the SCface database, this approach is validated and compared to the results of using a holistic approach on grayscale images. The results show that the technique is suitable for improving the color or intensity quality of video surveillance images for face recognition.
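The core remapping idea (adjusting an input image so its intensity statistics match statistics learned from a training dataset) can be sketched in its simplest mean/variance form; full histogram matching via cumulative distributions is the natural extension. The function name and reference statistics below are illustrative assumptions, not the paper's exact mapping:

```python
import numpy as np

def tone_map_to_reference(image, ref_mean, ref_std):
    """Remap a grayscale image so its intensity mean/std match reference
    statistics learned from a training set (simplified statistics-matching
    tone mapping). Output is clipped to the valid 8-bit range."""
    img = image.astype(float)
    mapped = (img - img.mean()) / (img.std() + 1e-12) * ref_std + ref_mean
    return np.clip(mapped, 0, 255)

# Hypothetical dark, low-contrast surveillance frame remapped toward
# reference statistics (mean 128, std 30) learned from training images
rng = np.random.default_rng(2)
frame = rng.integers(0, 100, (32, 32)).astype(float)
remapped = tone_map_to_reference(frame, 128.0, 30.0)
```

Per-channel application of the same idea handles color images; matching the full histogram rather than just two moments handles cameras with nonlinear response differences.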
A novel approach to generating CER hypotheses based on mining clinical data.
Zhang, Shuo; Li, Lin; Yu, Yiqin; Sun, Xingzhi; Xu, Linhao; Zhao, Wei; Teng, Xiaofei; Pan, Yue
2013-01-01
Comparative effectiveness research (CER) is a scientific method for investigating the effectiveness of alternative intervention methods. In a CER study, clinical researchers typically start with a CER hypothesis and aim to evaluate it by applying a series of medical statistical methods. Traditionally, CER hypotheses are defined manually by clinical researchers. This makes the task of hypothesis generation very time-consuming and leaves the quality of the hypotheses heavily dependent on the researchers' skills. Recently, with more electronic medical data being collected, it has become highly promising to apply computerized methods for discovering CER hypotheses from clinical data sets. In this poster, we propose a novel approach to automatically generating CER hypotheses based on mining clinical data, and present a case study showing that the approach can help clinical researchers identify potentially valuable hypotheses and eventually define high-quality CER studies.
NASA Astrophysics Data System (ADS)
Lemaire, Vincent; Colette, Augustin; Menut, Laurent
2016-04-01
Because of its sensitivity to weather patterns, climate change will have an impact on air pollution, so that, in the future, a climate penalty could jeopardize the expected efficiency of air pollution mitigation measures. A common method to assess the impact of climate on air quality consists in implementing chemistry-transport models forced by climate projections. However, at present, such impact assessments lack multi-model ensemble approaches to address uncertainties because of the substantial computing cost. Therefore, as a preliminary step towards exploring large climate ensembles with air quality models, we developed an ensemble exploration technique in order to point out the climate models that should be investigated in priority. Using a training dataset from a deterministic projection of climate and air quality over Europe, we identified the main meteorological drivers of air quality for 8 regions in Europe and developed statistical models that could be used to estimate future air pollutant concentrations. Applying this statistical model to the whole EuroCordex ensemble of climate projections, we find a climate penalty for six subregions out of eight (Eastern Europe, France, Iberian Peninsula, Mid Europe and Northern Italy). Conversely, a climate benefit for PM2.5 was identified for three regions (Eastern Europe, Mid Europe and Northern Italy). The uncertainty of this statistical model, however, limits the confidence that can be placed in the associated quantitative projections. The technique does, however, allow selecting a subset of relevant regional climate model members that should be used in priority for future deterministic projections, so as to propose an adequate coverage of uncertainties. We are thereby proposing a smart ensemble exploration strategy that can also be used for other impact studies beyond air quality.
Garcia, Ana Maria.; Alexander, Richard B.; Arnold, Jeffrey G.; Norfleet, Lee; White, Michael J.; Robertson, Dale M.; Schwarz, Gregory E.
2016-01-01
Despite progress in the implementation of conservation practices, related improvements in water quality have been challenging to measure in larger river systems. In this paper we quantify these downstream effects by applying the empirical U.S. Geological Survey water-quality model SPARROW to investigate whether spatial differences in conservation intensity were statistically correlated with variations in nutrient loads. In contrast to other forms of water quality data analysis, the application of SPARROW controls for confounding factors such as hydrologic variability, multiple sources and environmental processes. A measure of conservation intensity was derived from the USDA-CEAP regional assessment of the Upper Mississippi River and used as an explanatory variable in a model of the Upper Midwest. The spatial pattern of conservation intensity was negatively correlated (p = 0.003) with the total nitrogen loads in streams in the basin. Total phosphorus loads were weakly negatively correlated with conservation (p = 0.25). Regional nitrogen reductions were estimated to range from 5 to 34% and phosphorus reductions from 1 to 10% in major river basins of the Upper Mississippi region. The statistical associations between conservation and nutrient loads are consistent with hydrological and biogeochemical processes such as denitrification. The results provide empirical evidence at the regional scale that conservation practices have had a larger statistically detectable effect on nitrogen than on phosphorus loadings in streams and rivers of the Upper Mississippi Basin.
Iwasaki, Satoshi; Usami, Shin-Ichi; Takahashi, Haruo; Kanda, Yukihiko; Tono, Tetsuya; Doi, Katsumi; Kumakawa, Kozo; Gyo, Kiyofumi; Naito, Yasushi; Kanzaki, Sho; Yamanaka, Noboru; Kaga, Kimitaka
2017-07-01
To report on the safety and efficacy of an investigational active middle ear implant (AMEI) in Japan, and to compare results with preoperative hearing aid results. Prospective study conducted in Japan in which 23 Japanese-speaking adults with conductive or mixed hearing loss received a VIBRANT SOUNDBRIDGE implanted at the round window. Postoperative thresholds, speech perception results (word recognition scores, speech reception thresholds, signal-to-noise ratio [SNR]), and quality of life questionnaires at 20 weeks were compared with preoperative results, with all patients receiving the same, best available hearing aid (HA). Statistically significant improvements in postoperative AMEI-aided thresholds (1, 2, 4, and 8 kHz) and in speech reception thresholds and word recognition scores, compared with preoperative HA-aided results, were observed. On the SNR, the subjects' mean values showed a statistically significant improvement, with -5.7 dB SNR for the AMEI-aided mean versus -2.1 dB SNR for the preoperative HA-aided mean. The APHAB quality of life questionnaire also showed statistically significant improvement with the AMEI. Results with the AMEI applied to the round window exceeded those of the best available hearing aid on speech perception as well as quality of life questionnaires. There were minimal adverse events or changes to patients' residual hearing.
Li, Yi; Liu, Xue-bing; Zhang, Yao
2012-08-01
To study the efficacy and safety of acupuncture therapy for improving the sleep quality of outpatients receiving methadone maintenance treatment (MMT). Using a randomized, double-blinded, controlled design, seventy-five MMT outpatients with low sleep quality [Pittsburgh Sleep Quality Index (PSQI) score ≥ 8] were randomly assigned to the acupuncture group (38 cases) or the sham-acupuncture group (37 cases). All patients maintained their previous MMT. Acupuncture was applied to Baihui (GV20), Shenmen (bilateral, TF4), Shenting (GV24), Sanyinjiao (bilateral, SP6), and Sishencong (EX-HN1) in the acupuncture group. The same procedures were performed in the sham-acupuncture group, but at non-acupoints (5 mm lateral to the acupoints selected in the acupuncture group) with a shallow needling technique. The treatment was performed 5 times each week for 8 successive weeks. The PSQI was assessed before treatment and at the end of the 2nd, 4th, 6th, and 8th week of treatment. The detection ratio of low sleep quality and the incidence of adverse acupuncture reactions were compared between the two groups at the end of the 8th week. The improvement in the overall PSQI score was significantly greater in the acupuncture group than in the sham-acupuncture group (P < 0.01). The detection ratio of low sleep quality at the end of the 8th week was lower in the acupuncture group (60.53%, 23/38 cases) than in the sham-acupuncture group (83.78%, 31/37 cases), a statistically significant difference (P < 0.05). The rate of adverse acupuncture reactions was 5.26% (2/38 cases) in the acupuncture group and 2.70% (1/37 cases) in the sham-acupuncture group, with no statistically significant difference (P > 0.05). Acupuncture therapy could effectively and safely improve the sleep quality of outpatients receiving MMT.
Robertson, Dale M.; Saad, D.A.; Heisey, D.M.
2006-01-01
Various approaches are used to subdivide large areas into regions containing streams that have similar reference or background water quality and that respond similarly to different factors. For many applications, such as establishing reference conditions, it is preferable to use physical characteristics that are not affected by human activities to delineate these regions. However, most approaches, such as ecoregion classifications, rely on land use to delineate regions or have difficulties compensating for the effects of land use. Land use not only directly affects water quality, but it is often correlated with the factors used to define the regions. In this article, we describe modifications to SPARTA (spatial regression-tree analysis), a relatively new approach applied to water-quality and environmental characteristic data to delineate zones with similar factors affecting water quality. In this modified approach, land-use-adjusted (residualized) water quality and environmental characteristics are computed for each site. Regression-tree analysis is applied to the residualized data to determine the most statistically important environmental characteristics describing the distribution of a specific water-quality constituent. Geographic information for small basins throughout the study area is then used to subdivide the area into relatively homogeneous environmental water-quality zones. For each zone, commonly used approaches are subsequently used to define its reference water quality and how its water quality responds to changes in land use. SPARTA is used to delineate zones of similar reference concentrations of total phosphorus and suspended sediment throughout the upper Midwestern part of the United States. © 2006 Springer Science+Business Media, Inc.
Self-correcting multi-atlas segmentation
NASA Astrophysics Data System (ADS)
Gao, Yi; Wilford, Andrew; Guo, Liang
2016-03-01
In multi-atlas segmentation, one typically registers several atlases to the new image, and their respective segmented label images are transformed and fused to form the final segmentation. After each registration, the quality of the registration is reflected only by a single global value: the final registration cost. Ideally, if the quality of the registration could be evaluated at each point, independently of the registration process, in a way that also provides a direction in which the deformation can be further improved, the overall segmentation performance could be improved. We propose such a self-correcting multi-atlas segmentation method. The method is applied to hippocampus segmentation from brain images, and a statistically significant improvement is observed.
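The fusion step described above can be illustrated with the simplest fusion rule, a per-voxel majority vote over the transformed atlas label maps. This is a generic pure-Python sketch of label fusion, not the authors' self-correcting method; the atlas data are hypothetical.

```python
from collections import Counter

def majority_vote_fusion(label_maps):
    """Fuse several aligned label maps by per-voxel majority vote.

    label_maps: list of equal-length flat label sequences, one per atlas.
    Returns the fused label list.
    """
    fused = []
    for votes in zip(*label_maps):          # one tuple of atlas labels per voxel
        fused.append(Counter(votes).most_common(1)[0][0])
    return fused

# Three toy "atlas" segmentations of a 6-voxel image (0 = background, 1 = hippocampus)
atlas_labels = [
    [0, 1, 1, 1, 0, 0],
    [0, 1, 1, 0, 0, 0],
    [1, 1, 1, 1, 0, 0],
]
print(majority_vote_fusion(atlas_labels))  # -> [0, 1, 1, 1, 0, 0]
```

With an odd number of atlases and binary labels there are no ties; more elaborate fusion rules weight each atlas by its local registration quality, which is the gap the self-correcting method targets.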
NASA Astrophysics Data System (ADS)
Czernecki, Bartosz; Nowosad, Jakub; Jabłońska, Katarzyna
2018-04-01
Changes in the timing of plant phenological phases are important proxies in contemporary climate research. However, most of the commonly used traditional phenological observations do not give any coherent spatial information. While consistent spatial data can be obtained from airborne sensors and preprocessed gridded meteorological data, not many studies robustly benefit from these data sources. Therefore, the main aim of this study is to create and evaluate different statistical models for reconstructing, predicting, and improving the quality of phenological phase monitoring with the use of satellite and meteorological products. A quality-controlled dataset of 13 BBCH plant phenophases in Poland was collected for the period 2007-2014. For each phenophase, statistical models were built using the most commonly applied regression-based machine learning techniques, such as multiple linear regression, the lasso, principal component regression, generalized boosted models, and random forests. The quality of the models was estimated using k-fold cross-validation. The obtained results showed varying potential for coupling meteorologically derived indices with remote sensing products in terms of phenological modeling; however, using both data sources improves model accuracy by 0.6 to 4.6 days in terms of RMSE. It is shown that robust prediction of early phenological phases is mostly related to meteorological indices, whereas for autumn phenophases there is a stronger information signal provided by satellite-derived vegetation metrics. Choosing a specific set of predictors and applying robust preprocessing procedures is more important for the final results than the selection of a particular statistical model. The average RMSE for the best models across all phenophases is 6.3 days, while individual RMSEs vary seasonally from 3.5 to 10 days. The models give a reliable proxy for ground observations, with RMSE below 5 days for early-spring and late-spring phenophases. For other phenophases, RMSEs are higher, rising to 9-10 days in the case of the earliest spring phenophases.
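The k-fold cross-validation used above to estimate model quality can be sketched in pure Python. The thermal index, onset dates, and simple least-squares fit below are illustrative stand-ins for the paper's meteorological predictors and machine-learning models (which included random forests and boosted models); only the cross-validation scheme itself is the point.

```python
import math
import random

def kfold_rmse(xs, ys, k, fit, predict):
    """Estimate a model's RMSE by k-fold cross-validation."""
    idx = list(range(len(xs)))
    random.Random(0).shuffle(idx)            # fixed seed for reproducibility
    folds = [idx[i::k] for i in range(k)]    # k disjoint test folds
    sq_errors = []
    for fold in folds:
        train = [i for i in idx if i not in fold]
        model = fit([xs[i] for i in train], [ys[i] for i in train])
        for i in fold:                       # evaluate on the held-out fold
            sq_errors.append((predict(model, xs[i]) - ys[i]) ** 2)
    return math.sqrt(sum(sq_errors) / len(sq_errors))

def fit_linear(x, y):
    """Ordinary least squares for a single predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return (my - b * mx, b)

def predict_linear(model, xi):
    a, b = model
    return a + b * xi

# Hypothetical data: phenophase onset (day of year) vs. a growing-degree-day index
gdd = [5, 8, 11, 14, 17, 20, 23, 26]
onset = [130, 124, 119, 112, 108, 101, 96, 90]
print(round(kfold_rmse(gdd, onset, 4, fit_linear, predict_linear), 2))
```

Swapping `fit_linear`/`predict_linear` for any other regression model reuses the same validation scheme, which is what makes cross-validated RMSE a fair basis for the model comparison described in the abstract.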
de Souza, Ana Célia Caetano; Moreira, Thereza Maria Magalhaes; de Oliveira, Edmar Souza; de Menezes, Anaíze Viana Bezerra; Loureiro, Aline Maria Oliveira; Silva, Camila Brasileiro de Araújo; Linard, Jair Gomes; de Almeida, Italo Lennon Sales; Mattos, Samuel Miranda; Borges, José Wicto Pereira
2016-01-01
The objective of this study was to test the effectiveness of an educational intervention using an educational technology (flipchart) to promote quality of life (QOL) and treatment adherence in people with hypertension. It was a before-and-after intervention study conducted with 116 hypertensive people registered at Primary Health Care Units. The educational interventions were conducted using the flipchart educational technology. Quality of life was assessed with the MINICHAL (lower score = better QOL), and the QATSH (higher score = better adherence) was used to assess adherence to hypertension treatment. Both were measured before and after applying the intervention. In the analysis, we used the Student's t-test for paired data. The average baseline quality of life score was 11.66 ± 7.55, versus 7.71 ± 5.72 two months after the intervention, a statistically significant reduction (p < 0.001) with a mean difference of 3.95. The average baseline treatment adherence score was 98.03 ± 7.08, versus 100.71 ± 6.88 two months after the intervention, a statistically significant increase (p < 0.001) with a mean difference of 2.68. The conclusion was that the educational intervention using the flipchart improved the total quality of life score as well as the physical and mental domain scores, and increased adherence to hypertension treatment in people with the disease. PMID:27851752
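The paired Student's t-test used for this before-and-after comparison reduces to a simple statistic on the per-patient differences: the mean difference divided by its standard error. A minimal sketch with hypothetical MINICHAL-style scores (lower = better):

```python
import math
from statistics import mean, stdev

def paired_t(before, after):
    """Paired-samples t statistic: mean difference over its standard error."""
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Hypothetical scores for five patients before and after the intervention
before = [14, 12, 16, 10, 13]
after = [11, 8, 11, 6, 9]
print(round(paired_t(before, after), 3))  # -> 12.649
```

The resulting t value is compared against the t distribution with n - 1 degrees of freedom to obtain the p-value reported in studies like this one.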
pcr: an R package for quality assessment, analysis and testing of qPCR data
Ahmed, Mahmoud
2018-01-01
Background: Real-time quantitative PCR (qPCR) is a broadly used technique in biomedical research. Currently, a few different analysis models are used to determine the quality of data and to quantify mRNA levels across experimental conditions. Methods: We developed an R package that implements methods for quality assessment, analysis and statistical testing of qPCR data. Double Delta CT and standard curve models were implemented to quantify the relative expression of target genes from CT values in standard control-group qPCR experiments. In addition, calculation of amplification efficiencies and curves from serial-dilution qPCR experiments is used to assess the quality of the data. Finally, two-group tests and linear models were used to test for significant differences in expression between control groups and conditions of interest. Results: Using two datasets from qPCR experiments, we applied the different quality assessment, analysis and statistical testing methods in the pcr package and compared the results to the original published articles. The final relative expression values from the different models, as well as the intermediary outputs, were checked against the expected results in the original papers and found to be accurate and reliable. Conclusion: The pcr package provides an intuitive and unified interface for its main functions, allowing biologists to perform all necessary steps of qPCR analysis and produce graphs in a uniform way. PMID:29576953
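The Double Delta CT model mentioned above quantifies relative expression as 2^-ΔΔCt, where ΔCt normalizes the target gene to a reference gene and ΔΔCt compares the treated condition to the control. A minimal sketch, assuming roughly 100% amplification efficiency and hypothetical Ct values (the pcr package itself is an R implementation; this Python translation is illustrative only):

```python
def double_delta_ct(ct_target_test, ct_ref_test, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression by the 2^-ΔΔCt method.

    ΔCt = Ct(target) - Ct(reference); ΔΔCt = ΔCt(test) - ΔCt(control).
    Assumes ~100% amplification efficiency for both genes.
    """
    delta_test = ct_target_test - ct_ref_test
    delta_ctrl = ct_target_ctrl - ct_ref_ctrl
    return 2 ** -(delta_test - delta_ctrl)

# Hypothetical Ct values: target gene vs. a reference gene (e.g. GAPDH)
fold_change = double_delta_ct(22.0, 16.0, 24.0, 16.0)
print(fold_change)  # -> 4.0
```

The standard curve model drops the efficiency assumption by calibrating each gene against a serial-dilution curve, which is why the package computes amplification efficiencies as a quality check.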
Gozuyesil, Ebru; Baser, Muruvvet
2016-08-01
This study aims to identify the effects of foot reflexology applied to women on their vasomotor complaints and quality of life. A randomised controlled study was conducted with 120 women. The experimental group received foot reflexology treatment, while the control group received nonspecific foot massage. The mean scores for hot flashes, sweats, and night sweats were lower in the reflexology group than in the control group after the intervention, and the difference between the groups was statistically significant (p < 0.001). The mean scores for the MENQOL subscales improved in both groups after the application (p < 0.001). In the sexual domain, there was a significant improvement in the reflexology group (p < 0.05), but no improvement was found in the control group (p > 0.05). Results showed that reflexology might be effective in decreasing vasomotor problems and increasing quality of life in women in the menopausal period. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Cheong, Kwang-Ho; Lee, Me-Yeon; Kang, Sei-Kwon; Yoon, Jai-Woong; Park, Soah; Hwang, Taejin; Kim, Haeyoung; Kim, Kyoung Ju; Han, Tae Jin; Bae, Hoonsik
2015-07-01
The aim of this study is to set up statistical quality control for monitoring volumetric modulated arc therapy (VMAT) delivery error by using the machine's log data. Eclipse and a Clinac iX linac with the RapidArc system (Varian Medical Systems, Palo Alto, USA) are used for delivery of the VMAT plan. During delivery of the RapidArc fields, the machine determines the accuracy of the delivered monitor units (MUs) and the gantry angle position, and the standard deviations of the MU (σMU: dosimetric error) and the gantry angle (σGA: geometric error) are displayed on the console monitor after completion of the RapidArc delivery. In the present study, the log data were first analyzed to confirm their validity and usability; then, statistical process control (SPC) was applied to monitor σMU and σGA in a timely manner for all RapidArc fields: a total of 195 arc fields for 99 patients. The MU and the GA were determined twice for all fields, first during the patient-specific plan QA and then again during the first treatment. The σMU and σGA time series were quite stable irrespective of the treatment site; however, σGA depended strongly on the gantry rotation speed. The σGA of the RapidArc delivery for stereotactic body radiation therapy (SBRT) was smaller than that for typical VMAT. Therefore, SPC was applied separately for SBRT cases and general cases. Moreover, the accuracy of the potentiometer of the gantry rotation is important because σGA can change dramatically depending on its condition. By applying SPC to σMU and σGA, we could monitor the delivery error efficiently. However, the upper and lower limits of SPC need to be determined carefully, with full knowledge of the machine and its log data.
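One common SPC tool for individual measurements like σMU is the Shewhart individuals chart, whose control limits are derived from the average moving range; any value outside the limits flags a delivery for investigation. This is a generic sketch with hypothetical per-field σMU values, not the authors' specific SPC configuration:

```python
from statistics import mean

def individuals_chart_limits(values):
    """Shewhart individuals (I-MR) chart limits from the average moving range.

    LCL/UCL = mean ± 2.66 * MRbar, using the standard I-chart constant 2.66.
    """
    mrbar = mean(abs(b - a) for a, b in zip(values, values[1:]))
    centre = mean(values)
    return centre - 2.66 * mrbar, centre, centre + 2.66 * mrbar

# Hypothetical sigma_MU log values from eight successive RapidArc deliveries
sigma_mu = [0.21, 0.19, 0.22, 0.20, 0.18, 0.23, 0.21, 0.20]
lcl, centre, ucl = individuals_chart_limits(sigma_mu)
out_of_control = [v for v in sigma_mu if not lcl <= v <= ucl]
print(out_of_control)  # -> [] (process in control)
```

Running SBRT and general VMAT fields through separate charts, as the abstract describes, simply means computing limits from each group's own baseline data.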
Kaiser, Ulrike; Sabatowski, Rainer; Balck, Friedrich
2017-08-01
The assessment of treatment effectiveness in public health settings is ensured by indicators that reflect the changes caused by specific interventions. These indicators are also applied in benchmarking systems. The selection of constructs should be guided by their relevance for affected patients (patient-reported outcomes). Interdisciplinary multimodal pain therapy (IMPT) is a complex intervention based on a biopsychosocial understanding of chronic pain. For quality assurance purposes, psychological parameters (depression, general anxiety, health-related quality of life) are included in the standardized therapy assessment in pain medicine (KEDOQ), which can also be used for comparative analyses in a benchmarking system. The aim of the present study was to investigate the relevance of depressive symptoms, general anxiety and mental quality of life in patients undergoing IMPT under real-life conditions. In this retrospective, single-arm, exploratory observational study we used secondary data from the routine documentation of IMPT in routine care, applying several variables of the German Pain Questionnaire and the facility's comprehensive basic documentation. 352 participants with IMPT (from 2006 to 2010) were included, and follow-up was performed over two years with six assessments. Because of statistically heterogeneous characteristics, a complex analysis consisting of factor and cluster analyses was applied to build subgroups. These subgroups were explored to identify differences in depressive symptoms (HADS-D), general anxiety (HADS-A), and mental quality of life (SF-36 PSK) at the time of therapy admission and their development, estimated by means of effect sizes. Analyses were performed using SPSS 21.0®. Six subgroups were derived and mainly proved to be clinically and psychologically unremarkable, with the exception of one subgroup that consistently showed psychological impairment on all three parameters.
The follow-up of the total study population revealed medium or large effects; the changes in the subgroups were driven consistently by two subgroups, while the other four showed little or no change. In summary, only a small proportion of the target population (20%) showed clinically relevant scores on the psychological parameters applied. When selecting indicators for quality assurance, the heterogeneity of the target populations as well as conceptual and methodological aspects should be considered. The characteristics of the intended parameters, along with the clinical and personal relevance of indicators for patients, should be investigated by specific procedures such as patient surveys and statistical analyses. Copyright © 2017. Published by Elsevier GmbH.
Anani, Nadim; Mazya, Michael V; Chen, Rong; Prazeres Moreira, Tiago; Bill, Olivier; Ahmed, Niaz; Wahlgren, Nils; Koch, Sabine
2017-01-10
Interoperability standards intend to standardise health information, clinical practice guidelines intend to standardise care procedures, and patient data registries are vital for monitoring quality of care and for clinical research. This study combines all three: it uses interoperability specifications to model guideline knowledge and applies the result to registry data. We applied the openEHR Guideline Definition Language (GDL) to data from 18,400 European patients in the Safe Implementation of Treatments in Stroke (SITS) registry to retrospectively check their compliance with European recommendations for acute stroke treatment. Comparing compliance rates obtained with GDL to those obtained by conventional statistical data analysis yielded a complete match, suggesting that GDL technology is reliable for guideline compliance checking. The successful application of a standard guideline formalism to a large patient registry dataset is an important step toward widespread implementation of computer-interpretable guidelines in clinical practice and registry-based research. Application of the methodology gave important results on the evolution of stroke care in Europe, important both for quality of care monitoring and clinical research.
Modular design and implementation of field-programmable-gate-array-based Gaussian noise generator
NASA Astrophysics Data System (ADS)
Li, Yuan-Ping; Lee, Ta-Sung; Hwang, Jeng-Kuang
2016-05-01
The modular design of a Gaussian noise generator (GNG) based on field-programmable gate array (FPGA) technology was studied. A new range reduction architecture was included in a series of elementary function evaluation modules and was integrated into the GNG system. The approximation and quantisation errors for the square root module with a first-order polynomial approximation were high; therefore, we used the central limit theorem (CLT) to improve the noise quality, which resulted in an output rate of one sample per clock cycle. We subsequently applied Newton's method for the square root module, eliminating the need for the CLT and raising the output rate to two samples per clock cycle (>200 million samples per second). Two statistical tests confirmed that our GNG is of high quality. Furthermore, the range reduction, which addresses the limited intervals of the function approximation algorithms of the System Generator platform using Xilinx FPGAs, achieved higher numerical accuracy, operated at >350 MHz, and can be suitably applied to any function evaluation.
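The CLT approach referred to above approximates a standard normal sample by summing independent uniform draws: the sum of 12 U(0,1) draws minus 6 has mean 0 and variance 1. A software sketch of the idea follows (the paper's implementation is in FPGA hardware, and the constant 12 is the classic textbook choice, not necessarily the paper's):

```python
import random
from statistics import mean, stdev

def clt_gaussian(rng):
    """Approximate one standard normal sample via the central limit theorem:
    sum of 12 independent U(0,1) draws has mean 6 and variance 12 * (1/12) = 1,
    so subtracting 6 yields mean 0, variance 1."""
    return sum(rng.random() for _ in range(12)) - 6.0

rng = random.Random(42)
samples = [clt_gaussian(rng) for _ in range(10000)]
print(round(mean(samples), 2), round(stdev(samples), 2))
```

The CLT approximation is cheap but has truncated tails (no sample can exceed ±6), which is one reason hardware generators often move to transformation methods once the required throughput and tail accuracy permit.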
Revising the lower statistical limit of x-ray grating-based phase-contrast computed tomography.
Marschner, Mathias; Birnbacher, Lorenz; Willner, Marian; Chabior, Michael; Herzen, Julia; Noël, Peter B; Pfeiffer, Franz
2017-01-01
Phase-contrast x-ray computed tomography (PCCT) is currently being investigated as an interesting extension of conventional CT, providing high soft-tissue contrast even when examining weakly absorbing specimens. Until now, the potential for dose reduction was thought to be limited compared with attenuation CT, since meaningful phase retrieval fails for scans with very low photon counts when the conventional phase retrieval method via phase stepping is used. In this work, we examine the statistical behaviour of the reverse projection method, an alternative phase retrieval approach, and compare the results to the conventional phase retrieval technique. We investigate the noise levels in the projections as well as the image quality and quantitative accuracy of the reconstructed tomographic volumes. The results of our study show that this method performs better in a low-dose scenario than the conventional phase retrieval approach, yielding lower noise levels, enhanced image quality and more accurate quantitative values. Overall, we demonstrate that the lower statistical limit of the phase stepping procedure proposed in recent literature does not apply to this alternative phase retrieval technique. However, further development is necessary to overcome the experimental challenges posed by this method, which would enable mainstream or even clinical application of PCCT.
Welzenbach, Julia; Neuhoff, Christiane; Looft, Christian; Schellander, Karl; Tholen, Ernst; Große-Brinkhaus, Christine
2016-01-01
The aim of this study was to elucidate the underlying biochemical processes and to identify potential key molecules of the meat quality traits drip loss, pH of meat 1 h post-mortem (pH1), pH of meat 24 h post-mortem (pH24) and meat color. An untargeted metabolomics approach detected the profiles of 393 annotated and 1,600 unknown metabolites in 97 Duroc × Pietrain pigs. Despite obvious differences between the statistical approaches, the four applied methods, namely correlation analysis, principal component analysis, weighted network analysis (WNA) and random forest regression (RFR), revealed mainly concordant results. Our findings lead to the conclusion that the meat quality traits pH1, pH24 and color are strongly influenced by processes of post-mortem energy metabolism such as glycolysis and the pentose phosphate pathway, whereas drip loss is significantly associated with metabolites of lipid metabolism. In the case of drip loss, RFR was the most suitable method for identifying reliable biomarkers and predicting the phenotype from metabolites. On the other hand, WNA provided the best parameters for investigating metabolite interactions and clarifying the complex molecular background of meat quality traits. In summary, it was possible to obtain findings on the interaction of meat quality traits and their underlying biochemical processes. The detected key metabolites might be better indicators of meat quality, especially of drip loss, than the measured phenotype itself and could potentially be used as bioindicators. PMID:26919205
Correlations between state anxiety and quality of life in metastatic breast cancer patients.
Dragomir, Bianca-Iasmina; Fodoreanu, Liana
2013-01-01
To evaluate the correlations between perceived state anxiety during chemotherapy and quality of life in metastatic breast cancer patients. 62 metastatic breast cancer patients were evaluated during chemotherapy concerning age, living environment, marital status, social support, preexisting financial difficulties, the histological type of cancer, the site of the metastasis, the time from diagnosis, the type of surgical intervention, dexamethasone use, somatic comorbidities and radiotherapy. The STAI-X1, BDI-IIA and the QLQ-C30 (Quality of Life Questionnaire 30) plus BR23 (Breast 23) questionnaires were applied. For the statistical analysis we used the SPSS 13 package. 24 subjects were experiencing low state anxiety (≤39), whilst 38 had significant state anxiety (>39). Statistically significant differences were found between the two subgroups concerning living environment, type of surgical intervention, marital status, social support and mean BDI scores; adjusted means were calculated for the items considered likely to influence quality of life. Social, emotional and role functioning had lower scores in the low state anxiety group. Fatigue, future perspective, chemotherapy-induced side effects, breast symptoms, upset by hair loss and sexual functioning were more disturbing in the high state anxiety group. The general health/quality of life mean score was lower in the low state anxiety group. Higher state anxiety correlates with certain quality of life items, suggesting that psychological counselling and appropriate management of therapy-induced side effects should be a priority in the palliative care of metastatic breast cancer patients.
US EPA 2012 Air Quality Fused Surface for the Conterminous U.S. Map Service
This web service contains a polygon layer that depicts fused air quality predictions for 2012 for census tracts in the conterminous United States. Fused air quality predictions (for ozone and PM2.5) are modeled using a Bayesian space-time downscaling fusion model approach described in a series of three published journal papers: 1) Berrocal, V., Gelfand, A. E. and Holland, D. M. (2012). Space-time fusion under error in computer model output: an application to modeling air quality. Biometrics 68, 837-848; 2) Berrocal, V., Gelfand, A. E. and Holland, D. M. (2010). A bivariate space-time downscaler under space and time misalignment. The Annals of Applied Statistics 4, 1942-1975; and 3) Berrocal, V., Gelfand, A. E., and Holland, D. M. (2010). A spatio-temporal downscaler for output from numerical models. J. of Agricultural, Biological, and Environmental Statistics 15, 176-197. The model is used to provide daily predictive PM2.5 (daily average) and O3 (daily 8-hr maximum) surfaces for 2012; summer (O3) and annual (PM2.5) means are calculated and published. The downscaling fusion model uses both air quality monitoring data from the National Air Monitoring Stations/State and Local Air Monitoring Stations (NAMS/SLAMS) and numerical output from the Models-3/Community Multiscale Air Quality (CMAQ) model. Currently, predictions at the US census tract centroid locations within the 12 km CMAQ domain are archived. Predictions at the CMAQ grid cell centroids, or any desired set of locations co
Considering whether Medicaid is worth the cost: revisiting the Oregon Health Study.
Muennig, Peter A; Quan, Ryan; Chiuzan, Codruta; Glied, Sherry
2015-05-01
The Oregon Health Study was a groundbreaking experiment in which uninsured participants were randomized either to apply for Medicaid or to stay with their current care. The study showed that Medicaid produced numerous important socioeconomic and health benefits but had no statistically significant impact on hypertension, hypercholesterolemia, or diabetes. Medicaid opponents interpreted the findings to mean that Medicaid is not a worthwhile investment. Medicaid proponents viewed the experiment as statistically underpowered and, irrespective of the laboratory values, suggestive that Medicaid is a good investment. We tested these competing claims and, using a sensitive joint test and statistical power analysis, confirmed that the Oregon Health Study did not improve laboratory values. However, we also found that Medicaid is a good value, with a cost of just $62,000 per quality-adjusted life-year gained.
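The headline figure is an incremental cost-effectiveness ratio (ICER): additional dollars spent divided by quality-adjusted life-years gained. A minimal sketch of the arithmetic, using purely illustrative numbers rather than the study's actual cost and QALY inputs:

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra dollars per QALY gained."""
    if delta_qaly <= 0:
        raise ValueError("QALY gain must be positive for a meaningful ICER")
    return delta_cost / delta_qaly

# Illustrative only: $6,200 extra spending per enrollee for a 0.1-QALY gain
cost_per_qaly = icer(6200, 0.1)  # 62000.0
```

An ICER below a conventional willingness-to-pay threshold (often cited near $100,000 per QALY in the US) is what "a good value" means in this context.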
Total Quality Leadership as it Applies to the Surface Navy
1990-12-01
with statistical control methods. Dr. Deming opened the eyes of the Japanese. They embraced his ideas and accepted his 14 principles of management shown… move closer to fully embracing Deming's fourteen principles of management. 3. Shipboard Leadership Compared To TQL: Many activities on board Navy ships… The results of the comparison of Deming's principles of management and the Navalized TQL principles show that both similarities and differences appear
Kottner, Jan; Halfens, Ruud
2010-05-01
Institutionally acquired pressure ulcers are used as outcome indicators to assess the quality of pressure ulcer prevention programs. Determining whether quality improvement projects that aim to decrease the proportions of institutionally acquired pressure ulcers lead to real changes in clinical practice depends on the measurement method and statistical analysis used. To examine whether nosocomial pressure ulcer prevalence rates in hospitals in the Netherlands changed, a secondary data analysis using different statistical approaches was conducted of annual (1998-2008) nationwide nursing-sensitive health problem prevalence studies in the Netherlands. Institutions that participated regularly in all survey years were identified. Risk-adjusted nosocomial pressure ulcers prevalence rates, grade 2 to 4 (European Pressure Ulcer Advisory Panel system) were calculated per year and hospital. Descriptive statistics, chi-square trend tests, and P charts based on statistical process control (SPC) were applied and compared. Six of the 905 healthcare institutions participated in every survey year and 11,444 patients in these six hospitals were identified as being at risk for pressure ulcers. Prevalence rates per year ranged from 0.05 to 0.22. Chi-square trend tests revealed statistically significant downward trends in four hospitals but based on SPC methods, prevalence rates of five hospitals varied by chance only. Results of chi-square trend tests and SPC methods were not comparable, making it impossible to decide which approach is more appropriate. P charts provide more valuable information than single P values and are more helpful for monitoring institutional performance. Empirical evidence about the decrease of nosocomial pressure ulcer prevalence rates in the Netherlands is contradictory and limited.
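The P charts referred to here are SPC charts for proportions: 3-sigma limits are placed around the long-run mean rate, and only points outside those limits signal special-cause variation. A minimal sketch with a hypothetical mean prevalence and subgroup size, not the Dutch survey data:

```python
import math

def p_chart_limits(p_bar, n):
    """3-sigma control limits for a proportion (P) chart with subgroup size n."""
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)

def special_cause(p, p_bar, n):
    """True if an observed proportion falls outside the control limits."""
    lcl, ucl = p_chart_limits(p_bar, n)
    return p < lcl or p > ucl

# Hypothetical hospital: mean prevalence 0.12, 200 at-risk patients per survey
lcl, ucl = p_chart_limits(0.12, 200)
```

Proportions that stay inside (lcl, ucl) vary by chance only, which is how an SPC reading of the same data can disagree with a chi-square trend test.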
Fu, Zhibiao; Baker, Daniel; Cheng, Aili; Leighton, Julie; Appelbaum, Edward; Aon, Juan
2016-05-01
The principle of quality by design (QbD) has been widely applied to biopharmaceutical manufacturing processes. Process characterization is an essential step to implement the QbD concept to establish the design space and to define the proven acceptable ranges (PAR) for critical process parameters (CPPs). In this study, we present characterization of a Saccharomyces cerevisiae fermentation process using risk assessment analysis, statistical design of experiments (DoE), and the multivariate Bayesian predictive approach. The critical quality attributes (CQAs) and CPPs were identified with a risk assessment. The statistical model for each attribute was established using the results from the DoE study with consideration given to interactions between CPPs. Both the conventional overlapping contour plot and the multivariate Bayesian predictive approaches were used to establish the region of process operating conditions where all attributes met their specifications simultaneously. The quantitative Bayesian predictive approach was chosen to define the PARs for the CPPs, which apply to the manufacturing control strategy. Experience from the 10,000 L manufacturing scale process validation, including 64 continued process verification batches, indicates that the CPPs remain under a state of control and within the established PARs. The end product quality attributes were within their drug substance specifications. The probability generated with the Bayesian approach was also used as a tool to assess CPP deviations. This approach can be extended to develop other production process characterization and quantify a reliable operating region. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:799-812, 2016. © 2016 American Institute of Chemical Engineers.
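The Bayesian predictive approach described here amounts to computing, at candidate operating conditions, the probability that all quality attributes simultaneously meet their specifications. A simplified Monte Carlo sketch with hypothetical attribute names, specs, and predictive distributions (not the paper's fitted models):

```python
import random

def predictive_pass_probability(simulate_attributes, specs, n_draws=10000, seed=1):
    """Monte Carlo estimate of the probability that every quality attribute
    falls inside its specification, given a sampler from the predictive
    distribution of the attributes at fixed process settings."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_draws):
        values = simulate_attributes(rng)
        if all(lo <= values[name] <= hi for name, (lo, hi) in specs.items()):
            hits += 1
    return hits / n_draws

# Hypothetical example: two attributes with normal predictive distributions
specs = {"purity": (95.0, 100.0), "yield": (70.0, 110.0)}
sampler = lambda rng: {"purity": rng.gauss(97.0, 0.8), "yield": rng.gauss(85.0, 5.0)}
prob = predictive_pass_probability(sampler, specs)
```

Operating conditions whose pass probability exceeds a chosen reliability target would fall inside the PAR; the overlapping-contour approach instead intersects the per-attribute acceptable regions deterministically.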
Technical Quality of Complete Dentures: Influence on Masticatory Efficiency and Quality of Life.
Tôrres, Ana Clara Soares Paiva; Maciel, Amanda de Queiroz; de Farias, Danielle Bezerra; de Medeiros, Annie Karoline Bezerra; Vieira, Flávia Patrícia Tavares Veras; Carreiro, Adriana da Fonte Porto
2017-11-09
To evaluate the effect of the technical quality of conventional complete dentures (CD) on masticatory efficiency and quality of life (QoL) of denture wearers during a 1-year follow-up, a prospective clinical trial with 32 edentulous patients (mean age 60.2 years) wearing mandibular and maxillary dentures was conducted. All patients were evaluated wearing their preexisting dentures and at 3, 6, and 12 months after insertion of the new dentures. A reproducible method for objective evaluation of the technical quality of CDs was employed. Masticatory efficiency was evaluated by the colorimetric method using beads as artificial test food. The oral health impact on patient QoL was measured using the OHIP-EDENT (Oral Health Impact Profile in Edentulous Adults) questionnaire. The nonparametric Wilcoxon test was applied to reveal differences in technical quality between the preexisting and new dentures. The Friedman test was used to detect differences in masticatory efficiency and oral health impact on QoL. Spearman's correlation was applied to assess correlation between the variables. Comparing preexisting and new dentures, a significant improvement in technical quality was found (p < 0.001). There was no statistically significant difference in masticatory efficiency. A significant decrease was found in the total OHIP-EDENT scores after denture replacement. A positive correlation was found between technical quality and OHIP scores in the new denture wearers (p = 0.011). According to the results of this study, denture quality significantly improved patients' oral health-related QoL; however, insertion of new dentures did not influence masticatory efficiency. © 2017 by the American College of Prosthodontists.
A data-driven approach to quality risk management.
Alemayehu, Demissie; Alvir, Jose; Levenstein, Marcia; Nickerson, David
2013-10-01
An effective clinical trial strategy to ensure patient safety as well as trial quality and efficiency involves an integrated approach, including prospective identification of risk factors, mitigation of the risks through proper study design and execution, and assessment of quality metrics in real time. Such an integrated quality management plan may also be enhanced by using data-driven techniques to identify the risk factors that are most relevant in predicting quality issues associated with a trial. In this paper, we illustrate such an approach using data collected from actual clinical trials. Several statistical methods were employed, including the Wilcoxon rank-sum test and logistic regression, to identify associations between risk factors and the occurrence of quality issues, applied to data on the quality of clinical trials sponsored by Pfizer. Only a subset of the risk factors had a significant association with quality issues; these included whether the study used a placebo, whether the agent was a biologic, unusual packaging labels, complex dosing, and more than 25 planned procedures. Proper implementation of the strategy can help to optimize resource utilization without compromising trial integrity and patient safety.
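The Wilcoxon rank-sum screen mentioned here compares a metric between trials with and without quality issues. A minimal normal-approximation sketch, with hypothetical data and no correction for ties:

```python
import math
from statistics import NormalDist

def rank_sum_p(x, y):
    """Two-sided Wilcoxon rank-sum p-value via the normal approximation.
    Screens whether a metric differs between two groups of trials.
    Assumes no tied values and applies no tie/continuity correction."""
    pooled = sorted(x + y)
    rank = {v: i + 1 for i, v in enumerate(pooled)}  # distinct values assumed
    w = sum(rank[v] for v in x)                      # rank sum of group x
    n1, n2 = len(x), len(y)
    mu = n1 * (n1 + n2 + 1) / 2
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mu) / sd
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical metric values for trials without / with quality issues
p = rank_sum_p([1, 2, 3], [4, 5, 6])
```

In practice a library routine with exact small-sample tables and tie handling would be preferred; the sketch only shows the shape of the screen.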
JPEG2000 still image coding quality.
Chen, Tzong-Jer; Lin, Sheng-Chieh; Lin, You-Chen; Cheng, Ren-Gui; Lin, Li-Hui; Wu, Wei
2013-10-01
This work compares the image quality produced by two popular JPEG2000 programs. Both medical image compression algorithms implement JPEG2000, but they differ in interface, convenience, speed of computation, and characteristic options influenced by the encoder, quantization, tiling, etc. The differences in image quality and compression ratio are also affected by the modality and by the implementation of the compression algorithm. Do they provide the same quality? The qualities of compressed medical images from two image compression programs, Apollo and JJ2000, were evaluated extensively using objective metrics. These algorithms were applied to three medical image modalities at compression ratios ranging from 10:1 to 100:1. The quality of the reconstructed images was then evaluated using five objective metrics, and the Spearman rank correlation coefficients between the two programs were measured under every metric. We found that JJ2000 and Apollo exhibited indistinguishable image quality for all images evaluated using the above five metrics (r > 0.98, p < 0.001). It can be concluded that the image quality of the JJ2000 and Apollo algorithms is statistically equivalent for medical image compression.
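Full-reference metrics such as RMSE and PSNR, the kind of objective metrics used in evaluations like this one, compare each reconstructed image against the original. A minimal sketch over flattened pixel lists (toy values, not the study's images):

```python
import math

def rmse(ref, test):
    """Root-mean-square error between reference and test images (flat pixel lists)."""
    return math.sqrt(sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref))

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    e = rmse(ref, test)
    return float("inf") if e == 0 else 20 * math.log10(max_val / e)

ref = [100, 120, 130, 140]
rec = [101, 119, 131, 139]   # reconstructed pixels after lossy compression
quality_db = psnr(ref, rec)  # ≈ 48.1 dB
```

At higher compression ratios the reconstruction error grows, RMSE rises, and PSNR falls, which is how such metrics track the 10:1 to 100:1 sweep described above.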
[Teaching performance assessment in Public Health employing three different strategies].
Martínez-González, Adrián; Moreno-Altamirano, Laura; Ponce-Rosas, Efrén Raúl; Martínez-Franco, Adrián Israel; Urrutia-Aguilar, María Esther
2011-01-01
The educational system depends upon the quality and performance of its faculty and should therefore undergo continuous improvement. The aim was to assess the teaching performance of the Public Health professors at the Faculty of Medicine, UNAM, through three strategies. Justification study. The evaluation was conducted under a mediational model through three strategies: students' opinion assessment, self-assessment and students' academic achievement. We applied descriptive statistics, Student's t test, ANOVA and Pearson correlation. Twenty professors from the Public Health department were evaluated, representing 57% of all those who teach the subject. Professors rated their own performance more highly in self-assessment than students did in the opinion assessment; statistical analysis confirmed that this difference was significant. The difference among the three evaluation strategies became most evident between self-assessment and the scores obtained by students in their academic achievement. The integration of these three strategies offers a more complete view of the quality of a teacher's performance. Academic achievement appears to be a more objective strategy for teaching performance assessment than students' opinion and self-assessment.
Quality Control Statistics. CIDR is dedicated to producing the highest quality data for our investigators. These cumulative quality control statistics are based on data from 419 released CIDR Program
Li, Da; Liang, Li; Zhang, Jing; Kang, Tingguo
2015-01-01
Background: Quality control is one of the bottleneck problems limiting the application and development of traditional Chinese medicine (TCM). In recent years, microscopy and high-performance liquid chromatography (HPLC) techniques have frequently been applied in the quality control of TCM. However, studies combining conventional microscopy and HPLC techniques for the quality control of the flower bud of Tussilago farfara L. (Kuandonghua) have not been reported. Objective: This study was undertaken to evaluate the quality of the flower bud of T. farfara L. and to establish the relationships between the quantity of pollen grains and four main bioactive constituents: tussilagone, chlorogenic acid, rutin and isoquercitrin. Materials and Methods: Microscopic examination was used to quantify microscopic characteristics of the flower bud of T. farfara L., and the chemical components were determined by HPLC. The data were analyzed with SPSS (Statistical Package for the Social Sciences) statistics software. Results: The analysis showed that tussilagone, chlorogenic acid, rutin and isoquercitrin were significantly and positively correlated with the quantity of pollen grains in the flower bud of T. farfara L. From these results, it can be deduced that flower buds of T. farfara L. with a greater quantity of pollen grains should be of better quality. Conclusion: The study showed that the established method can be helpful for evaluating the quality of the flower bud of T. farfara L. based on microscopic characteristic constants and chemical quantitation. PMID:26246737
No-reference multiscale blur detection tool for content based image retrieval
NASA Astrophysics Data System (ADS)
Ezekiel, Soundararajan; Stocker, Russell; Harrity, Kyle; Alford, Mark; Ferris, David; Blasch, Erik; Gorniak, Mark
2014-06-01
In recent years, digital cameras have been widely used for image capturing. These devices are built into cell phones, laptops, tablets, webcams, etc. Image quality is an important component of digital image analysis. To assess image quality for these mobile products, a standard image is normally required as a reference image, in which case Root Mean Square Error and Peak Signal to Noise Ratio can be used to measure the quality of the images. However, these methods are not possible if there is no reference image. In our approach, a discrete wavelet transformation is applied to the blurred image, decomposing it into an approximation image and three detail sub-images: horizontal, vertical, and diagonal. We then assess image quality by noise-measuring the detail images and blur-measuring the approximation image, computing a noise mean and noise ratio from the detail images and a blur mean and blur ratio from the approximation image. The Multi-scale Blur Detection (MBD) metric provides an assessment of both the noise and the blur content. These values are weighted based on a linear regression against full-reference quality values. From these statistics, we can estimate image quality without needing a reference image. We then test the validity of the obtained weights by R2 analysis, as well as by using them to estimate the quality of an image with a known quality measure. The results show that our method provides acceptable estimates for images containing low to mid noise levels and blur content.
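The single-level wavelet split described above can be illustrated with a Haar transform: the approximation (LL) band carries the blur-related content, while the detail bands (especially the diagonal HH) carry the noise-related content. A minimal pure-Python sketch using an unnormalized averaging variant, not the authors' implementation; a real pipeline would use an orthonormal wavelet over multiple scales:

```python
def haar2d(img):
    """One-level 2-D Haar split: returns the approximation (LL) and the
    horizontal (LH), vertical (HL), and diagonal (HH) detail sub-images.
    img is a list of rows with even dimensions."""
    rows, cols = len(img), len(img[0])
    LL, LH, HL, HH = [], [], [], []
    for i in range(0, rows, 2):
        ll_r, lh_r, hl_r, hh_r = [], [], [], []
        for j in range(0, cols, 2):
            a, b = img[i][j], img[i][j + 1]
            c, d = img[i + 1][j], img[i + 1][j + 1]
            ll_r.append((a + b + c + d) / 4)  # local average -> blur content
            lh_r.append((a + b - c - d) / 4)  # horizontal detail
            hl_r.append((a - b + c - d) / 4)  # vertical detail
            hh_r.append((a - b - c + d) / 4)  # diagonal detail -> noise content
        LL.append(ll_r); LH.append(lh_r); HL.append(hl_r); HH.append(hh_r)
    return LL, LH, HL, HH

def mean_abs(sub):
    """Mean absolute value of a sub-image: a simple noise/blur statistic."""
    vals = [abs(v) for row in sub for v in row]
    return sum(vals) / len(vals)

# A smooth toy image: all detail bands should be empty of energy
img = [[10, 10, 20, 20],
       [10, 10, 20, 20],
       [10, 10, 20, 20],
       [10, 10, 20, 20]]
LL, LH, HL, HH = haar2d(img)
```

Statistics such as mean_abs(HH) then feed the regression-based weighting described in the abstract.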
STRengthening analytical thinking for observational studies: the STRATOS initiative.
Sauerbrei, Willi; Abrahamowicz, Michal; Altman, Douglas G; le Cessie, Saskia; Carpenter, James
2014-12-30
The validity and practical utility of observational medical research depends critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, many of these methodological developments are ignored in practice. Consequently, design and analysis of observational studies often exhibit serious weaknesses. The lack of guidance on vital practical issues discourages many applied researchers from using more sophisticated and possibly more appropriate methods when analyzing observational studies. Furthermore, many analyses are conducted by researchers with a relatively weak statistical background and limited experience in using statistical methodology and software. Consequently, even 'standard' analyses reported in the medical literature are often flawed, casting doubt on their results and conclusions. An efficient way to help researchers to keep up with recent methodological developments is to develop guidance documents that are spread to the research community at large. These observations led to the initiation of the strengthening analytical thinking for observational studies (STRATOS) initiative, a large collaboration of experts in many different areas of biostatistical research. The objective of STRATOS is to provide accessible and accurate guidance in the design and analysis of observational studies. The guidance is intended for applied statisticians and other data analysts with varying levels of statistical education, experience and interests. In this article, we introduce the STRATOS initiative and its main aims, present the need for guidance documents and outline the planned approach and progress so far. We encourage other biostatisticians to become involved. © 2014 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd.
Data Acquisition and Preprocessing in Studies on Humans: What Is Not Taught in Statistics Classes?
Zhu, Yeyi; Hernandez, Ladia M; Mueller, Peter; Dong, Yongquan; Forman, Michele R
2013-01-01
The aim of this paper is to address issues in research that may be missing from statistics classes and important for (bio-)statistics students. In the context of a case study, we discuss data acquisition and preprocessing steps that fill the gap between research questions posed by subject matter scientists and statistical methodology for formal inference. Issues include participant recruitment, data collection training and standardization, variable coding, data review and verification, data cleaning and editing, and documentation. Despite the critical importance of these details in research, most of these issues are rarely discussed in an applied statistics program. One reason for the lack of more formal training is the difficulty in addressing the many challenges that can possibly arise in the course of a study in a systematic way. This article can help to bridge this gap between research questions and formal statistical inference by using an illustrative case study for a discussion. We hope that reading and discussing this paper and practicing data preprocessing exercises will sensitize statistics students to these important issues and achieve optimal conduct, quality control, analysis, and interpretation of a study.
Ashrafi, Parivash; Sun, Yi; Davey, Neil; Adams, Roderick G; Wilkinson, Simon C; Moss, Gary Patrick
2018-03-01
The aim of this study was to investigate how to improve predictions from Gaussian Process models by optimising the model hyperparameters. Optimisation methods, including Grid Search, Conjugate Gradient, Random Search, Evolutionary Algorithm and Hyper-prior, were evaluated and applied to previously published data. Data sets were also altered in a structured manner to reduce their size while retaining the range, or 'chemical space', of the key descriptors, to assess the effect of the data range on model quality. The Hyper-prior Smoothbox kernel resulted in the best models for the majority of data sets, and they exhibited significantly better performance than benchmark quantitative structure-permeability relationship (QSPR) models. When the data sets were systematically reduced in size, the different optimisation methods generally retained their statistical quality, whereas benchmark QSPR models performed poorly. The design of the data set, and possibly also the approach to model validation, is critical in the development of improved models. The size of the data set, if carefully controlled, was generally not a significant factor for these models, and models of excellent statistical quality could be produced from substantially smaller data sets. © 2018 Royal Pharmaceutical Society.
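Grid Search and Random Search, two of the hyperparameter optimisation methods evaluated here, share a simple shape: propose candidate values and keep whichever minimises a loss. A generic sketch over a hypothetical quadratic surrogate loss, not a fitted Gaussian Process marginal likelihood:

```python
import random

def grid_search(loss, grid):
    """Evaluate the loss at every grid point; return the best candidate."""
    return min(grid, key=loss)

def random_search(loss, low, high, n_trials, seed=0):
    """Sample candidates uniformly from [low, high]; return the best one."""
    rng = random.Random(seed)
    candidates = [rng.uniform(low, high) for _ in range(n_trials)]
    return min(candidates, key=loss)

# Hypothetical 1-D hyperparameter whose "loss" is minimised at 1.3
loss = lambda h: (h - 1.3) ** 2
best_grid = grid_search(loss, [0.5 * i for i in range(7)])  # grid 0.0..3.0
best_rand = random_search(loss, 0.0, 3.0, 50)
```

For a real GP the loss would be the negative log marginal likelihood over kernel hyperparameters, and gradient-based methods like Conjugate Gradient would exploit its smoothness.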
Helsel, Dennis R.; Gilliom, Robert J.
1986-01-01
Estimates of distributional parameters (mean, standard deviation, median, interquartile range) are often desired for data sets containing censored observations. Eight methods for estimating these parameters have been evaluated by R. J. Gilliom and D. R. Helsel (this issue) using Monte Carlo simulations. To verify those findings, the same methods are now applied to actual water quality data. The best method (lowest root-mean-squared error (rmse)) over all parameters, sample sizes, and censoring levels is log probability regression (LR), the method found best in the Monte Carlo simulations. Best methods for estimating moment or percentile parameters separately are also identical to the simulations. Reliability of these estimates can be expressed as confidence intervals using rmse and bias values taken from the simulation results. Finally, a new simulation study shows that best methods for estimating uncensored sample statistics from censored data sets are identical to those for estimating population parameters. Thus this study and the companion study by Gilliom and Helsel form the basis for making the best possible estimates of either population parameters or sample statistics from censored water quality data, and for assessments of their reliability.
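Log probability regression (also called regression on order statistics) fits the logs of the detected values against normal scores of their plotting positions, then imputes the censored observations from the fitted line. A simplified sketch under a log-normal assumption with Weibull plotting positions; the paper's exact procedure may differ:

```python
import math
from statistics import NormalDist, mean

def lr_estimates(detected, n_censored):
    """Log probability regression (ROS) sketch for singly left-censored data:
    regress log(concentration) on normal scores for the detected values,
    impute the censored values from the fitted line, and return the
    estimated mean and standard deviation of the full sample."""
    n = len(detected) + n_censored
    detected = sorted(detected)
    # Weibull plotting positions i/(n+1); censored values occupy ranks 1..n_censored
    z = [NormalDist().inv_cdf((n_censored + i + 1) / (n + 1))
         for i in range(len(detected))]
    y = [math.log(v) for v in detected]
    zb, yb = mean(z), mean(y)
    slope = (sum((zi - zb) * (yi - yb) for zi, yi in zip(z, y))
             / sum((zi - zb) ** 2 for zi in z))
    intercept = yb - slope * zb
    # Impute the censored observations from the fitted log-normal line
    imputed = [math.exp(intercept + slope * NormalDist().inv_cdf((i + 1) / (n + 1)))
               for i in range(n_censored)]
    full = imputed + detected
    m = mean(full)
    sd = math.sqrt(sum((v - m) ** 2 for v in full) / (n - 1))
    return m, sd

# Hypothetical data: five detected values, three below the detection limit
est_mean, est_sd = lr_estimates([2, 3, 5, 8, 13], 3)
```

Because the imputed values fall below the detection limit, the estimated mean sits below the naive mean of the detected values alone, which is the bias LR is designed to remove.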
The Mathematics of Four or More N-Localizers for Stereotactic Neurosurgery.
Brown, Russell A
2015-10-13
The mathematics that were originally developed for the N-localizer apply to three N-localizers that produce three sets of fiducials in a tomographic image. Some applications of the N-localizer use four N-localizers that produce four sets of fiducials; however, the mathematics that apply to three sets of fiducials do not apply to four sets of fiducials. This article presents mathematics that apply to four or more sets of fiducials that all lie within one planar tomographic image. In addition, these mathematics are extended to apply to four or more fiducials that do not all lie within one planar tomographic image, as may be the case with magnetic resonance (MR) imaging where a volume is imaged instead of a series of planar tomographic images. Whether applied to a planar image or a volume image, the mathematics of four or more N-localizers provide a statistical measure of the quality of the image data that may be influenced by factors, such as the nonlinear distortion of MR images.
ERIC Educational Resources Information Center
Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain
2004-01-01
Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…
López-Pérez, Patricia; Miranda-Novales, Guadalupe; Segura-Méndez, Nora Hilda; Del Rivero-Hernández, Leonel; Cambray-Gutiérrez, Cesar; Chávez-García, Aurora
2014-01-01
Quality of life is a multidimensional concept that includes physical, emotional and social components associated with disease. The use of tools to assess health-related quality of life (HRQOL) has increased in recent decades. Common variable immunodeficiency (CVID) is the most commonly diagnosed primary immunodeficiency. The aim was to evaluate the quality of life in patients with CVID using the SF-36 questionnaire. A descriptive cross-sectional survey included 23 patients diagnosed with CVID, belonging to the Immunodeficiency Clinic of the Service of Allergology and Clinical Immunology in CMN Siglo XXI, IMSS. The SF-36 questionnaire validated in Spanish was applied. Descriptive statistics used simple frequencies and percentages; inferential statistics included Fisher's exact test and ANOVA to compare means. The study involved 23 patients, 14 women (60%) and 9 men (40%), with a mean age of 38.6 ± 14.7 years. The highest score, 83%, was obtained in emotional role. The dimensions with the greatest deterioration in both genders were general health (54%), vitality (59%) and physical performance (72%). No differences were found regarding gender. The only item in which statistically significant differences were found in patients with more than 3 comorbidities was change in health status in the past year (p=0.007). Patients with severe comorbidities, such as haematological-oncological (leukemias, lymphomas, neoplasms) and pulmonary (severe bronchiectasis) conditions, showed greater deterioration in physical performance (73%) and emotional role (64%). 65% of patients reported an improvement in health status in the last year. Adult patients with CVID show deterioration in different dimensions, particularly in the areas of general health, vitality and physical performance. Patients with severe comorbidities such as leukemia, lymphomas, malignancies and severe bronchiectasis show greater deterioration in some aspects of quality of life, especially in physical performance and emotional role. A higher number of comorbidities was significantly associated with a lower score for change in health status. The SF-36 questionnaire is useful for evaluating the quality of life of our patients with CVID.
Using creative problem solving (TRIZ) in improving the quality of hospital services.
LariSemnani, Behrouz; Mohebbi Far, Rafat; Shalipoor, Elham; Mohseni, Mohammad
2014-08-14
TRIZ is an inventive problem-solving methodology and SERVQUAL is a structured methodology for quality improvement. Using these tools, inventive problem solving can be applied to quality improvement, and the highest quality can be reached using creative quality improvement methodology. The present study seeks to determine the priority of quality aspects of services provided for patients in the hospital, as well as how TRIZ can help in improving the quality of those services. This study is applied research that used a dynamic qualitative descriptive survey method during 2011. The statistical population includes every patient who visited one of the University Hospitals from March 2011. There was a big gap between patients' expectations regarding what is seen (the design of the hospital) and the timely provision of services, and their perceptions. Quality aspects of services were prioritized as follows: the appearance of the hospital (the design), accountability, assurance, credibility and empathy. Thus, what mattered most to all staff and managers of the studied hospital was the appearance of the hospital and of its staff, which can account for a high percentage of patients' satisfaction. By referring to the contradiction matrix, the most important principles of the TRIZ model related to tangible factors were principles No. 13 (discarding and recovering), 25 (self-service), 35 (parameter changes), and 2 (taking out). In addition to these four principles, principle No. 24 (intermediary) was repeated most often among the others. By utilizing TRIZ, hospital problems can be examined with a more open view, going beyond the conceptual framework of the organization and responding more quickly to patients' needs.
The relationship of sleep problems to life quality and depression
Sarıarslan, Hacı A.; Gulhan, Yıldırım B.; Unalan, Demet; Basturk, Mustafa; Delibas, Senol
2015-01-01
Objective: To identify the level of depression, the level of life quality, and the relationship between these, in patients applying to sleep centers for various sleep problems. Methods: This cross-sectional study included 229 patients who applied for polysomnography at sleep centers under supervision of the Neurology and Chest Diseases Clinics of Kayseri Education and Research Hospital, Kayseri, Turkey between June and August 2013. The data collection tools were a socio-demographic data form, the Beck Depression Inventory (BDI), the Pittsburgh Sleep Quality Index (PSQI), and the World Health Organization Quality of Life Scale (WHOQOL-BREF). For statistical analyses, the Student t-test, Kruskal-Wallis variance analysis, and chi-square tests were used. The significance level was set at p<0.05. Results: In our study, patients who were older, married, not working, or who had a chronic disease or a severe depressive symptom were observed to have significantly poorer sleep quality. While patients with any chronic disease had significantly higher scores for total PSQI and depression, their physical, mental, and social WHOQOL-BREF scores were significantly lower. The PSQI total scores and depression scores of the smoking patients were significantly higher, and their physical, mental, and social WHOQOL-BREF scores lower. There was a positive correlation between PSQI scores and BDI scores, while there was a negative correlation among BDI, PSQI, and WHOQOL-BREF life quality sub-scale scores. Conclusions: Sleep quality was significantly poorer in patients who were older, married, not working, or who had a chronic disease or a severe depressive symptom. Depression and poor sleep quality correlated negatively with life quality, while poor sleep quality correlated positively with depression. PMID:26166591
Saad, Ahmed S; Attia, Ali K; Alaraki, Manal S; Elzanfaly, Eman S
2015-11-05
Five different spectrophotometric methods were applied for simultaneous determination of fenbendazole and rafoxanide in their binary mixture; namely first derivative, derivative ratio, ratio difference, dual wavelength and H-point standard addition spectrophotometric methods. Different factors affecting each of the applied spectrophotometric methods were studied and the selectivity of the applied methods was compared. The applied methods were validated as per the ICH guidelines and good accuracy; specificity and precision were proven within the concentration range of 5-50 μg/mL for both drugs. Statistical analysis using one-way ANOVA proved no significant differences among the proposed methods for the determination of the two drugs. The proposed methods successfully determined both drugs in laboratory prepared and commercially available binary mixtures, and were found applicable for the routine analysis in quality control laboratories. Copyright © 2015 Elsevier B.V. All rights reserved.
Heggen, Kristin Livelten; Pedersen, Hans Kristian; Andersen, Hilde Kjernlie; Martinsen, Anne Catrine T
2016-01-01
Background Iterative reconstruction can reduce image noise and thereby facilitate dose reduction. Purpose To evaluate qualitative and quantitative image quality for full dose and dose reduced head computed tomography (CT) protocols reconstructed using filtered back projection (FBP) and adaptive statistical iterative reconstruction (ASIR). Material and Methods Fourteen patients undergoing follow-up head CT were included. All patients underwent full dose (FD) exam and subsequent 15% dose reduced (DR) exam, reconstructed using FBP and 30% ASIR. Qualitative image quality was assessed using visual grading characteristics. Quantitative image quality was assessed using ROI measurements in cerebrospinal fluid (CSF), white matter, peripheral and central gray matter. Additionally, quantitative image quality was measured in Catphan and vendor’s water phantom. Results There was no significant difference in qualitative image quality between FD FBP and DR ASIR. Comparing same scan FBP versus ASIR, a noise reduction of 28.6% in CSF and between −3.7 and 3.5% in brain parenchyma was observed. Comparing FD FBP versus DR ASIR, a noise reduction of 25.7% in CSF, and −7.5 and 6.3% in brain parenchyma was observed. Image contrast increased in ASIR reconstructions. Contrast-to-noise ratio was improved in DR ASIR compared to FD FBP. In phantoms, noise reduction was in the range of 3 to 28% with image content. Conclusion There was no significant difference in qualitative image quality between full dose FBP and dose reduced ASIR. CNR improved in DR ASIR compared to FD FBP mostly due to increased contrast, not reduced noise. Therefore, we recommend using caution if reducing dose and applying ASIR to maintain image quality. PMID:27583169
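The contrast-to-noise ratio comparison in the head-CT abstract above follows directly from ROI statistics. A minimal sketch of how CNR is typically computed from ROI means and image noise; the Hounsfield-unit values below are hypothetical, not the study's data:

```python
import statistics

def cnr(roi_a, roi_b, background_sd):
    """Contrast-to-noise ratio: contrast between the mean values of two
    tissue ROIs, divided by the image noise (SD in a uniform region)."""
    return abs(statistics.fmean(roi_a) - statistics.fmean(roi_b)) / background_sd

# Hypothetical ROI samples (HU): gray matter vs. white matter, noise SD 4.0
gray = [38, 40, 39]
white = [30, 31, 29]
print(cnr(gray, white, 4.0))  # contrast 9 HU over noise 4 HU -> 2.25
```

As the abstract notes, CNR can improve either because contrast rises or because noise falls; computing the numerator and denominator separately, as here, makes it possible to tell the two apart.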
1H NMR-based metabolic profiling for evaluating poppy seed rancidity and brewing.
Jawień, Ewa; Ząbek, Adam; Deja, Stanisław; Łukaszewicz, Marcin; Młynarz, Piotr
2015-12-01
Poppy seeds are widely used in household and commercial confectionery. The aim of this study was to demonstrate the application of metabolic profiling for industrial monitoring of the molecular changes which occur during minced poppy seed rancidity and during brewing processes performed on raw seeds. Both forms of poppy seeds were obtained from a confectionery company. Proton nuclear magnetic resonance (1H NMR) was applied as the analytical method of choice, together with multivariate statistical data analysis. Metabolic fingerprinting was applied as a bioprocess control tool to monitor the trajectory of change during rancidity and the progression of brewing. Low molecular weight compounds were found to be statistically significant biomarkers of these bioprocesses. Changes in the concentrations of chemical compounds were explained relative to the biochemical processes and external conditions. The results provide valuable and comprehensive information for a better understanding of the biology of rancidity and brewing, and demonstrate the potential of NMR spectroscopy combined with multivariate data analysis as a quality control tool for food industries involved in the processing of oilseeds.
NASA Technical Reports Server (NTRS)
Woodward, W. A.; Gray, H. L.
1983-01-01
Efforts in support of the development of multicrop production monitoring capability are reported. In particular, segment level proportion estimation techniques based upon a mixture model were investigated. Efforts have dealt primarily with evaluation of current techniques and development of alternative ones. A comparison of techniques is provided on both simulated and LANDSAT data along with an analysis of the quality of profile variables obtained from LANDSAT data.
Chang, Ching-Sheng; Chen, Su-Yueh; Lan, Yi-Ting
2012-11-21
No previous studies have addressed the integrated relationships among system quality, service quality, job satisfaction, and system performance; this study attempts to bridge that gap with an evidence-based practice study. The convenience sampling method was applied to the information system users of three hospitals in southern Taiwan. A total of 500 copies of questionnaires were distributed, and 283 returned copies were valid, a valid response rate of 56.6%. SPSS 17.0 and AMOS 17.0 (structural equation modeling) statistical software packages were used for data analysis and processing. The findings are as follows: system quality has a positive influence on service quality (γ11 = 0.55), job satisfaction (γ21 = 0.32), and system performance (γ31 = 0.47); service quality (β31 = 0.38) and job satisfaction (β32 = 0.46) positively influence system performance. It is thus recommended that hospital information offices and developers take enhancement of service quality and user satisfaction into consideration, in addition to placing emphasis on system quality and information quality, when designing, developing, or purchasing an information system, in order to improve the benefits generated by hospital information systems.
Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Shiyu, E-mail: shiyu.xu@gmail.com; Chen, Ying, E-mail: adachen@siu.edu; Lu, Jianping
2015-09-15
Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors' goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications.
Singular spectrum analysis in nonlinear dynamics, with applications to paleoclimatic time series
NASA Technical Reports Server (NTRS)
Vautard, R.; Ghil, M.
1989-01-01
Two dimensions of a dynamical system given by experimental time series are distinguished. Statistical dimension gives a theoretical upper bound for the minimal number of degrees of freedom required to describe the attractor up to the accuracy of the data, taking into account sampling and noise problems. The dynamical dimension is the intrinsic dimension of the attractor and does not depend on the quality of the data. Singular Spectrum Analysis (SSA) provides estimates of the statistical dimension. SSA also describes the main physical phenomena reflected by the data. It gives adaptive spectral filters associated with the dominant oscillations of the system and clarifies the noise characteristics of the data. SSA is applied to four paleoclimatic records. The principal climatic oscillations and the regime changes in their amplitude are detected. About 10 degrees of freedom are statistically significant in the data. Large noise and insufficient sample length do not allow reliable estimates of the dynamical dimension.
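The core of SSA described above, embedding the series in a trajectory matrix and examining its singular spectrum, can be sketched in a few lines. This is an illustrative minimal version on synthetic data, not the authors' implementation:

```python
import numpy as np

def ssa_spectrum(x, window):
    """Embed a series into its trajectory (Hankel) matrix and return the
    normalized singular-value spectrum; a dominant pair of nearly equal
    values indicates an oscillatory component."""
    k = len(x) - window + 1
    # Columns are lagged windows of the series
    X = np.column_stack([x[i:i + window] for i in range(k)])
    s = np.linalg.svd(X, compute_uv=False)
    return s / s.sum()

rng = np.random.default_rng(0)
t = np.arange(500)
x = np.sin(2 * np.pi * t / 40) + 0.05 * rng.standard_normal(500)
spec = ssa_spectrum(x, window=60)
```

For this noisy sine, the leading pair of modes carries most of the spectrum, while the noise floor spreads thinly over the remaining modes; this separation of significant modes from a noise floor is how SSA estimates the statistically significant degrees of freedom in a record.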
Determination of pasture quality using airborne hyperspectral imaging
NASA Astrophysics Data System (ADS)
Pullanagari, R. R.; Kereszturi, G.; Yule, Ian J.; Irwin, M. E.
2015-10-01
Pasture quality is a critical determinant which influences animal performance (live weight gain, milk and meat production) and animal health. Assessment of pasture quality is therefore required to assist farmers with grazing planning and management, benchmarking between seasons and years. Traditionally, pasture quality is determined by field sampling which is laborious, expensive and time consuming, and the information is not available in real-time. Hyperspectral remote sensing has potential to accurately quantify biochemical composition of pasture over wide areas in great spatial detail. In this study an airborne imaging spectrometer (AisaFENIX, Specim) was used with a spectral range of 380-2500 nm with 448 spectral bands. A case study of a 600 ha hill country farm in New Zealand is used to illustrate the use of the system. Radiometric and atmospheric corrections, along with automatized georectification of the imagery using Digital Elevation Model (DEM), were applied to the raw images to convert into geocoded reflectance images. Then a multivariate statistical method, partial least squares (PLS), was applied to estimate pasture quality such as crude protein (CP) and metabolisable energy (ME) from canopy reflectance. The results from this study revealed that estimates of CP and ME had a R2 of 0.77 and 0.79, and RMSECV of 2.97 and 0.81 respectively. By utilizing these regression models, spatial maps were created over the imaged area. These pasture quality maps can be used for adopting precision agriculture practices which improves farm profitability and environmental sustainability.
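Partial least squares, the regression method used above, can be illustrated with a minimal single-response (PLS1) NIPALS implementation on synthetic "reflectance" data. The band count, coefficients, and data below are invented for illustration and are unrelated to the AisaFENIX study:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal PLS1 via NIPALS with deflation; returns regression
    coefficients B in the original predictor space plus centring terms."""
    Xc, yc = X - X.mean(axis=0), y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)            # weight: direction of max covariance with y
        t = Xc @ w                        # scores
        p = Xc.T @ t / (t @ t)            # X loadings
        qk = (yc @ t) / (t @ t)           # y loading
        Xc = Xc - np.outer(t, p)          # deflate X
        yc = yc - qk * t                  # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)
    return B, X.mean(axis=0), y.mean()

rng = np.random.default_rng(1)
X = rng.standard_normal((80, 12))                      # stand-in for 12 spectral bands
y = 2.0 * X[:, 0] - X[:, 3] + 0.1 * rng.standard_normal(80)  # stand-in for crude protein
B, x_mean, y_mean = pls1_fit(X, y, n_components=3)
y_hat = (X - x_mean) @ B + y_mean
r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

In practice the number of components is chosen by cross-validation, which is exactly what the RMSECV figures in the abstract summarize.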
Deep learning and non-negative matrix factorization in recognition of mammograms
NASA Astrophysics Data System (ADS)
Swiderski, Bartosz; Kurek, Jaroslaw; Osowski, Stanislaw; Kruk, Michal; Barhoumi, Walid
2017-02-01
This paper presents a novel approach to the recognition of mammograms. The analyzed mammograms represent normal and breast cancer (benign and malignant) cases. The solution applies the deep learning technique to image recognition. To increase the classification accuracy, non-negative matrix factorization and statistical self-similarity of images are applied. The images reconstructed using these two approaches enrich the database and thereby improve the quality measures of mammogram recognition (increased accuracy, sensitivity, and specificity). The results of numerical experiments performed on the large DDSM database, containing more than 10,000 mammograms, confirmed good class-recognition accuracy, exceeding the best results reported for this database in recent publications.
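Non-negative matrix factorization, one of the two enrichment techniques mentioned above, is commonly computed with Lee-Seung multiplicative updates. A minimal sketch on synthetic non-negative data, not the DDSM pipeline:

```python
import numpy as np

def nmf(V, rank, n_iter=500, seed=0):
    """Lee-Seung multiplicative updates minimizing the Frobenius loss
    ||V - W @ H||; nonnegativity of W and H is preserved at every step."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + 0.1
    H = rng.random((rank, n)) + 0.1
    eps = 1e-9  # guards against division by zero
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

rng = np.random.default_rng(2)
V = rng.random((40, 4)) @ rng.random((4, 60))   # exact nonnegative rank-4 data
W, H = nmf(V, rank=4)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

In the mammogram setting, the rows of V would be vectorized image patches and W @ H a parts-based reconstruction used to augment the training data.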
CerebroMatic: A Versatile Toolbox for Spline-Based MRI Template Creation
Wilke, Marko; Altaye, Mekibib; Holland, Scott K.
2017-01-01
Brain image spatial normalization and tissue segmentation rely on prior tissue probability maps. Appropriately selecting these tissue maps becomes particularly important when investigating “unusual” populations, such as young children or elderly subjects. When creating such priors, the disadvantage of applying more deformation must be weighed against the benefit of achieving a crisper image. We have previously suggested that statistically modeling demographic variables, instead of simply averaging images, is advantageous. Both aspects (more vs. less deformation and modeling vs. averaging) were explored here. We used imaging data from 1914 subjects, aged 13 months to 75 years, and employed multivariate adaptive regression splines to model the effects of age, field strength, gender, and data quality. Within the spm/cat12 framework, we compared an affine-only with a low- and a high-dimensional warping approach. As expected, more deformation on the individual level results in lower group dissimilarity. Consequently, effects of age in particular are less apparent in the resulting tissue maps when using a more extensive deformation scheme. Using statistically-described parameters, high-quality tissue probability maps could be generated for the whole age range; they are consistently closer to a gold standard than conventionally-generated priors based on 25, 50, or 100 subjects. Distinct effects of field strength, gender, and data quality were seen. We conclude that an extensive matching for generating tissue priors may model much of the variability inherent in the dataset which is then not contained in the resulting priors. Further, the statistical description of relevant parameters (using regression splines) allows for the generation of high-quality tissue probability maps while controlling for known confounds. The resulting CerebroMatic toolbox is available for download at http://irc.cchmc.org/software/cerebromatic.php. PMID:28275348
Mindfulness for palliative care patients. Systematic review.
Latorraca, Carolina de Oliveira Cruz; Martimbianco, Ana Luiza Cabrera; Pachito, Daniela Vianna; Pacheco, Rafael Leite; Riera, Rachel
2017-12-01
Nineteen million adults worldwide are in need of palliative care. Of those who have access to it, 80% fail to receive efficient management of symptoms. The aim was to assess the effectiveness and safety of mindfulness meditation for palliative care patients. We searched CENTRAL, MEDLINE, Embase, LILACS, PEDro, CINAHL, PsycINFO, Opengrey, ClinicalTrials.gov and WHO-ICTRP. No restriction of language, status or date of publication was applied. We considered randomised clinical trials (RCTs) comparing any mindfulness meditation scheme vs any comparator for palliative care. The Cochrane Risk of Bias (RoB) table was used for assessing the methodological quality of RCTs. Screening, data extraction and methodological assessments were performed by two reviewers. Mean differences (MD) with 95% confidence intervals (95% CI) were considered for estimating effect size. Quality of evidence was appraised by GRADE. Four RCTs, with 234 participants, were included. All studies presented high risk of bias on at least one RoB criterion. We assessed 4 comparisons, but only 2 studies showed a statistically significant difference for at least one outcome. 1. Mindfulness meditation (eight weeks, one session/week, daily individual practice) vs control: statistically significant difference in favour of control for quality of life - physical aspects. 2. Mindfulness meditation (single 5-minute session) vs control: benefit in favour of mindfulness for the stress outcome at both time-points. None of the included studies analysed safety and harms outcomes. Although two studies showed a statistically significant difference, only one showed effectiveness of mindfulness meditation, in improving perceived stress. This study focused on a single 5-minute mindfulness session for adult cancer patients in palliative care, but was considered at high risk of bias. Other schemes of mindfulness meditation did not show benefit for any outcome evaluated (low and very low quality evidence).
© 2017 John Wiley & Sons Ltd.
Margitai, Barnabás; Dózsa, Csaba; Bárdos-Csenteri, Orsolya Karola; Sándor, János; Gáll, Tibor; Gődény, Sándor
2018-01-01
Objective Quantitative studies have shown various benefits of accreditation for hospitals. However, none of these explored the general conditions before applying for accreditation. To close this gap, this study aimed to investigate the possible association between joining an accreditation programme and various hospital characteristics. Design A cross-sectional study was implemented using the databases of the 2013 Hungarian hospital survey and of the Hungarian State Treasury. Setting Public general hospitals in Hungary. Participants The analysis involved 44 public general hospitals, 14 of which joined the preparatory project for a newly developed accreditation programme. Main outcome measures The outcomes included the percentage of compliance in quality management, patient information and identification, internal professional regulation, safe surgery, pressure sore prevention, and infection control; the opinions of the heads of quality management regarding the usefulness of quality management and clinical audits; and, finally, the total debt of the hospital per bed and per discharged patient. Results According to our findings, the general hospitals joining the preparatory project of the accreditation programme performed better in four of the six investigated activities, their heads of quality management had a better opinion on the usefulness of quality management, and both the debt per bed and the debt per discharged patient were lower than in those which did not join. However, no statistically significant differences between the two groups were found in any of the examined outcomes. Conclusions The findings suggest that hospitals applying for an accreditation programme do not differ significantly in characteristics from those which did not apply. This means that if in the future the accredited hospitals become better than other hospitals, the improvement could be attributed solely to the accreditation. PMID:29391381
Hay, Peter D; Smith, Julie; O'Connor, Richard A
2016-02-01
The aim of this study was to evaluate the benefits to SPECT bone scan image quality when applying resolution recovery (RR) during image reconstruction using software provided by a third-party supplier. Bone SPECT data from 90 clinical studies were reconstructed retrospectively using software supplied independent of the gamma camera manufacturer. The current clinical datasets contain 120×10 s projections and are reconstructed using an iterative method with a Butterworth postfilter. Five further reconstructions were created with the following characteristics: 10 s projections with a Butterworth postfilter (to assess intraobserver variation); 10 s projections with a Gaussian postfilter with and without RR; and 5 s projections with a Gaussian postfilter with and without RR. Two expert observers were asked to rate image quality on a five-point scale relative to our current clinical reconstruction. Datasets were anonymized and presented in random order. The benefits of RR on image scores were evaluated using ordinal logistic regression (visual grading regression). The application of RR during reconstruction increased the probability of both observers of scoring image quality as better than the current clinical reconstruction even where the dataset contained half the normal counts. Type of reconstruction and observer were both statistically significant variables in the ordinal logistic regression model. Visual grading regression was found to be a useful method for validating the local introduction of technological developments in nuclear medicine imaging. RR, as implemented by the independent software supplier, improved bone SPECT image quality when applied during image reconstruction. In the majority of clinical cases, acquisition times for bone SPECT intended for the purposes of localization can safely be halved (from 10 s projections to 5 s) when RR is applied.
Tabrizi, Jafar S; Askari, Samira; Fardiazar, Zahra; Koshavar, Hossein; Gholipour, Kamal
2014-01-01
Our aim was to determine the service quality of delivered care for women undergoing Caesarean section and normal delivery. A cross-sectional study was conducted among 200 women who had a caesarean section or normal delivery in Al-Zahra Teaching Hospital in Tabriz, north-western Iran. Service quality was calculated as: Service Quality = 10 - (Importance × Performance), based on the importance and performance of service quality aspects from the postpartum women's perspective. A hierarchical regression analysis was applied in two steps using the enter method to examine the associations between demographics and SQ scores. Data were analysed using the SPSS-17 software. "Confidentiality", "autonomy", "choice of care provider" and "communication" achieved scores at the highest level of quality, while "support group", "prompt attention", "prevention and early detection", "continuity of care", "dignity", "safety", "accessibility" and "basic amenities" received service quality scores of less than eight. A statistically significant relationship was found between service quality score and continuity of care (P=0.008). There was a notable gap between the participants' expectations and what they actually received in most aspects of provided care, so there is an opportunity to improve the quality of delivered care.
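The scoring formula quoted above, Service Quality = 10 - (Importance × Performance), is straightforward to apply once the two inputs are on compatible scales. The scales and aspect values below are hypothetical assumptions, chosen only so that the product stays within [0, 10]:

```python
def service_quality(importance, performance):
    """SQ = 10 - (importance x performance), per the formula in the study.
    Assumed scaling (hypothetical): importance as a 0-1 weight and
    performance as a 0-10 shortfall measure, so a larger product means
    worse delivered care and a lower SQ score."""
    return 10 - importance * performance

# Hypothetical aspect ratings: (importance weight, performance shortfall)
aspects = {
    "confidentiality":    (0.9, 0.5),
    "continuity of care": (0.8, 3.0),
    "prompt attention":   (0.7, 4.0),
}
scores = {name: service_quality(i, p) for name, (i, p) in aspects.items()}
below_target = [name for name, s in scores.items() if s < 8]
```

The cut-off of eight mirrors the abstract's grouping of aspects that "received service quality scores of less than eight".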
NASA Astrophysics Data System (ADS)
Jacak, Monika; Melniczuk, Damian; Jacak, Janusz; Jóźwiak, Ireneusz; Gruber, Jacek; Jóźwiak, Piotr
2015-02-01
In order to assess the susceptibility of the quantum key distribution (QKD) systems to the hacking attack including simultaneous and frequent system self-decalibrations, we analyze the stability of the QKD transmission organized in two commercially available systems. The first one employs non-entangled photons as flying qubits in the dark quantum channel for communication whereas the second one utilizes the entangled photon pairs to secretly share the cryptographic key. Applying standard methods of the statistical data analysis to the characteristic indicators of the quality of the QKD communication (the raw key exchange rate [RKER] and the quantum bit error rate [QBER]), we have estimated the pace of the self-decalibration of both systems and the repeatability rate in the case of controlled worsening of the dark channel quality.
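One of the two quality indicators named above, the quantum bit error rate, is simply the mismatch fraction between the sifted keys held by the two parties. A minimal sketch with invented key bits:

```python
def qber(sent_bits, received_bits):
    """Quantum bit error rate: fraction of mismatched bits in the sifted key."""
    if len(sent_bits) != len(received_bits) or not sent_bits:
        raise ValueError("keys must be non-empty and of equal length")
    errors = sum(a != b for a, b in zip(sent_bits, received_bits))
    return errors / len(sent_bits)

# Hypothetical sifted keys after basis reconciliation
sifted_alice = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0]
sifted_bob   = [0, 1, 0, 0, 1, 0, 0, 1, 1, 1]
print(qber(sifted_alice, sifted_bob))  # 2 errors in 10 bits -> 0.2
```

The raw key exchange rate, the other indicator, is the companion throughput measure (raw key bits per unit time); tracking both over time is what makes the self-decalibration of a QKD link measurable.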
Bench to bedside: the quest for quality in experimental stroke research.
Dirnagl, Ulrich
2006-12-01
Over the past decades, great progress has been made in clinical as well as experimental stroke research. Disappointingly, however, hundreds of clinical trials testing neuroprotective agents have failed despite efficacy in experimental models. Recently, several systematic reviews have exposed a number of important deficits in the quality of preclinical stroke research. Many of the issues raised in these reviews are not specific to experimental stroke research, but apply to studies of animal models of disease in general. It is the aim of this article to review some quality-related sources of bias with a particular focus on experimental stroke research. Weaknesses discussed include, among others, low statistical power and hence reproducibility, defects in statistical analysis, lack of blinding and randomization, lack of quality-control mechanisms, deficiencies in reporting, and negative publication bias. Although quantitative evidence for quality problems at present is restricted to preclinical stroke research, to spur discussion and in the hope that they will be exposed to meta-analysis in the near future, I have also included some quality-related sources of bias, which have not been systematically studied. Importantly, these may be also relevant to mechanism-driven basic stroke research. I propose that by a number of rather simple measures reproducibility of experimental results, as well as the step from bench to bedside in stroke research may be made more successful. However, the ultimate proof for this has to await successful phase III stroke trials, which were built on basic research conforming to the criteria as put forward in this article.
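Low statistical power, the first weakness listed above, is easy to quantify prospectively. A back-of-envelope sample-size calculation for a two-group comparison using the standard normal approximation, stdlib only; the effect sizes are illustrative, not drawn from the review:

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.8):
    """Normal-approximation sample size per group for a two-sample test of
    means, with the effect expressed as Cohen's d."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_b = NormalDist().inv_cdf(power)          # desired power
    return math.ceil(2 * ((z_a + z_b) / effect_size) ** 2)

print(n_per_group(0.5))  # medium effect: 63 animals per group
print(n_per_group(1.0))  # large effect: 16 per group
```

Figures like these make plain why typical small experimental cohorts are underpowered for anything but very large treatment effects, and hence why results fail to reproduce.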
Conformity and statistical tolerancing
NASA Astrophysics Data System (ADS)
Leblond, Laurent; Pillet, Maurice
2018-02-01
Statistical tolerancing was first proposed by Shewhart (Economic Control of Quality of Manufactured Product, 1931; reprinted 1980 by ASQC). In spite of this long history, its use remains moderate. One probable reason for this low utilization is the difficulty for designers of anticipating the risks of this approach. Arithmetic (worst-case) tolerancing allows a simple interpretation: conformity is defined by the presence of the characteristic in an interval. Statistical tolerancing is more complex in its definition; an interval is not sufficient to define conformance. To justify the statistical tolerancing formula used by designers, a tolerance interval should be interpreted as the interval where most of the parts produced should probably be located. This tolerance is justified by considering a conformity criterion for the parts that guarantees low offsets on the final characteristics. Unlike traditional arithmetic tolerancing, statistical tolerancing requires a sustained exchange of information between design and manufacturing to be used safely. This paper proposes a formal definition of conformity, which we apply successively to quadratic and arithmetic tolerancing. We introduce a concept of concavity, which helps us to demonstrate the link between the tolerancing approach and conformity. We use this concept to demonstrate the various acceptable propositions of statistical tolerancing (in the decentring/dispersion space).
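The contrast drawn above between arithmetic (worst-case) and quadratic (statistical) tolerancing comes down to how component tolerances are stacked. A minimal sketch with hypothetical component tolerances:

```python
import math

def arithmetic_stack(tols):
    """Worst-case (arithmetic) stack: component tolerances add linearly."""
    return sum(tols)

def statistical_stack(tols):
    """Quadratic (RSS) stack: independent, centred contributions add in
    quadrature, so the assembly tolerance is the root-sum-square."""
    return math.sqrt(sum(t ** 2 for t in tols))

tols = [0.10, 0.05, 0.08, 0.12]     # hypothetical component tolerances, mm
worst = arithmetic_stack(tols)       # 0.35 mm
rss = statistical_stack(tols)        # about 0.18 mm
```

The RSS stack is tighter than the worst case, which lets designers widen component tolerances; but it is only valid under the centring and dispersion assumptions the paper formalizes, hence the sustained design-manufacturing exchange the abstract calls for.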
Zaki, Rafdzah; Bulgiba, Awang; Ismail, Roshidi; Ismail, Noor Azina
2012-01-01
Accurate values are a must in medicine. An important parameter in determining the quality of a medical instrument is agreement with a gold standard. Various statistical methods have been used to test for agreement. Some of these methods have been shown to be inappropriate. This can result in misleading conclusions about the validity of an instrument. The Bland-Altman method is the most popular method judging by the many citations of the article proposing this method. However, the number of citations does not necessarily mean that this method has been applied in agreement research. No previous study has been conducted to look into this. This is the first systematic review to identify statistical methods used to test for agreement of medical instruments. The proportion of various statistical methods found in this review will also reflect the proportion of medical instruments that have been validated using those particular methods in current clinical practice. Five electronic databases were searched between 2007 and 2009 to look for agreement studies. A total of 3,260 titles were initially identified. Only 412 titles were potentially related, and finally 210 fitted the inclusion criteria. The Bland-Altman method is the most popular method with 178 (85%) studies having used this method, followed by the correlation coefficient (27%) and means comparison (18%). Some of the inappropriate methods highlighted by Altman and Bland since the 1980s are still in use. This study finds that the Bland-Altman method is the most popular method used in agreement research. There are still inappropriate applications of statistical methods in some studies. It is important for a clinician or medical researcher to be aware of this issue because misleading conclusions from inappropriate analyses will jeopardize the quality of the evidence, which in turn will influence quality of care given to patients in the future.
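The Bland-Altman method that dominates the review above is computationally simple: the bias is the mean of the paired differences, and the 95% limits of agreement sit 1.96 SD either side of it. A sketch with invented paired readings:

```python
import statistics

def bland_altman(a, b):
    """Mean bias and 95% limits of agreement between two instruments."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired readings: a new device vs. a gold standard
new_device = [120, 131, 118, 140, 125, 133, 122, 128]
gold_std   = [122, 129, 119, 142, 124, 131, 124, 126]
bias, (lo, hi) = bland_altman(new_device, gold_std)
```

Unlike a correlation coefficient, which Bland and Altman long argued is unsuited to agreement questions, the limits of agreement quantify how far the two instruments may plausibly disagree for an individual measurement.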
Zaki, Rafdzah; Bulgiba, Awang; Ismail, Roshidi; Ismail, Noor Azina
2012-01-01
Background Accurate values are a must in medicine. An important parameter in determining the quality of a medical instrument is agreement with a gold standard. Various statistical methods have been used to test for agreement. Some of these methods have been shown to be inappropriate. This can result in misleading conclusions about the validity of an instrument. The Bland-Altman method is the most popular method judging by the many citations of the article proposing this method. However, the number of citations does not necessarily mean that this method has been applied in agreement research. No previous study has been conducted to look into this. This is the first systematic review to identify statistical methods used to test for agreement of medical instruments. The proportion of various statistical methods found in this review will also reflect the proportion of medical instruments that have been validated using those particular methods in current clinical practice. Methodology/Findings Five electronic databases were searched between 2007 and 2009 to look for agreement studies. A total of 3,260 titles were initially identified. Only 412 titles were potentially related, and finally 210 fitted the inclusion criteria. The Bland-Altman method is the most popular method with 178 (85%) studies having used this method, followed by the correlation coefficient (27%) and means comparison (18%). Some of the inappropriate methods highlighted by Altman and Bland since the 1980s are still in use. Conclusions This study finds that the Bland-Altman method is the most popular method used in agreement research. There are still inappropriate applications of statistical methods in some studies. It is important for a clinician or medical researcher to be aware of this issue because misleading conclusions from inappropriate analyses will jeopardize the quality of the evidence, which in turn will influence quality of care given to patients in the future. PMID:22662248
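The Bland-Altman method discussed above is straightforward to compute: the bias is the mean of the paired differences and the 95% limits of agreement are the bias plus or minus 1.96 standard deviations. A minimal sketch, using illustrative paired readings (not data from the review):

```python
import statistics

def bland_altman_limits(a, b):
    """Mean difference (bias) and 95% limits of agreement for paired
    measurements from two instruments."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired readings from a new device and a gold standard
new = [100, 102, 98, 101, 99, 103]
gold = [101, 101, 97, 103, 98, 102]
bias, lo, hi = bland_altman_limits(new, gold)
```

In practice the limits are then judged against a clinically acceptable difference, which is why this approach avoids the pitfalls of the correlation coefficient the review warns about.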
Chen, Qing; Xu, Pengfei; Liu, Wenzhong
2016-01-01
Computer vision, as a fast, low-cost, noncontact, and online monitoring technology, has been an important tool for inspecting product quality, particularly on large-scale assembly production lines. However, current industrial vision systems are far from satisfactory in the intelligent perception of complex grain images, which comprise a large number of local homogeneous fragmentations or patches without distinct foreground and background. We attempt to solve this problem based on statistical modeling of the spatial structures of grain images. We first present a physical explanation indicating that the spatial structures of complex grain images follow a representative Weibull distribution according to the theory of sequential fragmentation, which is well known in the continued comminution of ore grinding. To delineate the spatial structure of the grain image, we present a method of multiscale and omnidirectional Gaussian derivative filtering. Then, a product quality classifier based on a sparse multikernel least squares support vector machine is proposed to solve the low-confidence classification problem of imbalanced data distributions. The proposed method is applied on the assembly line of a food-processing enterprise to automatically classify the production quality of rice. Experiments on this real application case, compared with commonly used methods, demonstrate the validity of our method. PMID:26986726
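The Weibull modeling step above can be illustrated with a small sketch: one common way to fit the shape parameter is to linearize the Weibull CDF, since ln(-ln(1-p)) = k·ln(x) - k·ln(λ), and regress. This is a generic estimation sketch on synthetic quantiles, not the authors' filtering pipeline:

```python
import math

def weibull_shape_scale(points):
    """Estimate Weibull shape k and scale lam by least squares on the
    linearized CDF: ln(-ln(1-p)) = k*ln(x) - k*ln(lam)."""
    xs = [math.log(x) for x, _ in points]
    ys = [math.log(-math.log(1 - p)) for _, p in points]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    k = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    lam = math.exp(mx - my / k)
    return k, lam

# Exact quantiles of Weibull(k=2, lam=1) stand in for measured image statistics
probs = [0.1, 0.3, 0.5, 0.7, 0.9]
pts = [((-math.log(1 - p)) ** 0.5, p) for p in probs]
k, lam = weibull_shape_scale(pts)
```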
NASA Astrophysics Data System (ADS)
Shirzaei, M.; Walter, T. R.
2009-10-01
Modern geodetic techniques provide valuable and near real-time observations of volcanic activity. Characterizing the source of deformation based on these observations has become of major importance in related monitoring efforts. We investigate two random search approaches, simulated annealing (SA) and genetic algorithm (GA), and utilize them in an iterated manner. The iterated approach helps to prevent GA in general and SA in particular from getting trapped in local minima, and it also increases redundancy for exploring the search space. We apply a statistical competency test for estimating the confidence interval of the inversion source parameters, considering their internal interaction through the model, the effect of the model deficiency, and the observational error. Here, we present and test this new randomly iterated search and statistical competency (RISC) optimization method together with GA and SA for the modeling of data associated with volcanic deformations. Following synthetic and sensitivity tests, we apply the improved inversion techniques to two episodes of activity in the Campi Flegrei volcanic region in Italy, observed by the interferometric synthetic aperture radar technique. Inversion of these data allows derivation of deformation source parameters and their associated quality so that we can compare the two inversion methods. The RISC approach was found to be an efficient method in terms of computation time and search results and may be applied to other optimization problems in volcanic and tectonic environments.
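The simulated annealing component of the search can be sketched generically: candidate moves are always accepted downhill and accepted uphill with probability exp(-Δ/T) as the temperature cools. This is a textbook SA loop on a toy one-parameter misfit (a hypothetical "source depth"), not the authors' RISC implementation:

```python
import math
import random

def simulated_annealing(misfit, x0, step=0.5, t0=1.0, cooling=0.95,
                        iters=2000, seed=42):
    """Generic simulated annealing: accept uphill moves with probability
    exp(-delta/T), lowering T geometrically each iteration."""
    rng = random.Random(seed)
    x, fx = x0, misfit(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = misfit(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# Toy misfit with a minimum at depth = 3.0 (illustrative only)
best, fbest = simulated_annealing(lambda d: (d - 3.0) ** 2, x0=0.0)
```

The iterated restarts described in the abstract would wrap this loop, re-seeding from previous solutions to reduce the chance of being trapped in local minima.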
Assessment Methodology for Process Validation Lifecycle Stage 3A.
Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Chen, Shu; Ingram, Marzena; Spes, Jana
2017-07-01
The paper introduces evaluation methodologies and associated statistical approaches for process validation lifecycle Stage 3A. The assessment tools proposed can be applied to newly developed and launched small-molecule as well as bio-pharma products for which substantial process and product knowledge has been gathered. The following elements may be included in Stage 3A: determination of the number of Stage 3A batches; evaluation of critical material attributes, critical process parameters, and critical quality attributes; in vivo-in vitro correlation; estimation of inherent process variability (IPV) and the PaCS index; a process capability and quality dashboard (PCQd); and an enhanced control strategy. US FDA guidance on Process Validation: General Principles and Practices, January 2011, encourages applying previous credible experience with suitably similar products and processes. A complete Stage 3A evaluation is a valuable resource for product development and future risk mitigation of similar products and processes. The elements of the 3A assessment were developed to address industry and regulatory guidance requirements. The conclusions made provide sufficient information to make a scientific and risk-based decision on product robustness.
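The process capability element mentioned above is conventionally summarized by the Cp and Cpk indices, where Cpk penalizes a mean that drifts off-center between the specification limits. A minimal sketch of the standard formulas (not the paper's PCQd dashboard), with hypothetical batch assay values and limits:

```python
import statistics

def process_capability(data, lsl, usl):
    """Cp = spec width / (6 sigma); Cpk = distance from mean to the
    nearer limit / (3 sigma), so off-center processes score lower."""
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Illustrative batch results against hypothetical limits of 90-110
batches = [99, 101, 100, 98, 102, 100, 99, 101]
cp, cpk = process_capability(batches, lsl=90, usl=110)
```

For a perfectly centered process, as here, Cp and Cpk coincide.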
Choudhry, Shahid A.; Li, Jing; Davis, Darcy; Erdmann, Cole; Sikka, Rishi; Sutariya, Bharat
2013-01-01
Introduction: Preventing the occurrence of hospital readmissions is needed to improve quality of care and foster population health across the care continuum. Hospitals are being held accountable for improving transitions of care to avert unnecessary readmissions. Advocate Health Care in Chicago and Cerner (ACC) collaborated to develop all-cause, 30-day hospital readmission risk prediction models to identify patients that need interventional resources. Ideally, prediction models should encompass several qualities: they should have high predictive ability; use reliable and clinically relevant data; use vigorous performance metrics to assess the models; be validated in populations where they are applied; and be scalable in heterogeneous populations. However, a systematic review of prediction models for hospital readmission risk determined that most performed poorly (average C-statistic of 0.66) and efforts to improve their performance are needed for widespread usage. Methods: The ACC team incorporated electronic health record data, utilized a mixed-method approach to evaluate risk factors, and externally validated their prediction models for generalizability. Inclusion and exclusion criteria were applied on the patient cohort and then split for derivation and internal validation. Stepwise logistic regression was performed to develop two predictive models: one for admission and one for discharge. The prediction models were assessed for discrimination ability, calibration, overall performance, and then externally validated. Results: The ACC Admission and Discharge Models demonstrated modest discrimination ability during derivation, internal and external validation post-recalibration (C-statistic of 0.76 and 0.78, respectively), and reasonable model fit during external validation for utility in heterogeneous populations. Conclusions: The ACC Admission and Discharge Models embody the design qualities of ideal prediction models. 
The ACC plans to continue its partnership to further improve and develop valuable clinical models. PMID:24224068
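The C-statistic reported above has a simple interpretation: the probability that a randomly chosen readmitted patient receives a higher risk score than a randomly chosen non-readmitted patient, with ties counting one half. A minimal sketch on hypothetical scores and outcomes:

```python
def c_statistic(scores, labels):
    """Concordance (C-statistic / AUC) over all event vs. non-event pairs;
    ties in score count 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    concordant = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return concordant / (len(pos) * len(neg))

# Hypothetical readmission risk scores and observed 30-day outcomes
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.5, 0.4, 0.3, 0.2, 0.1]
labels = [1,   1,   0,   1,   0,    0,   1,   0,   0,   0]
auc = c_statistic(scores, labels)
```

On this scale, the 0.66 average the systematic review criticizes means only a 66% chance of ranking an event above a non-event.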
Parallelizable 3D statistical reconstruction for C-arm tomosynthesis system
NASA Astrophysics Data System (ADS)
Wang, Beilei; Barner, Kenneth; Lee, Denny
2005-04-01
Clinical diagnosis and security detection tasks increasingly require 3D information which is difficult or impossible to obtain from 2D (two-dimensional) radiographs. As a 3D (three-dimensional) radiographic and non-destructive imaging technique, digital tomosynthesis is especially suited to cases where 3D information is required but complete projection data are not available. Nowadays, FBP (filtered back projection) is extensively used in industry for its speed and simplicity. However, it is hard to deal with situations where only a limited number of projections from constrained directions are available, or the SNR (signal-to-noise ratio) of the projections is low. In order to deal with noise and take into account a priori information about the object, a statistical image reconstruction method is described based on the acquisition model of X-ray projections. We formulate an ML (maximum likelihood) function for this model and develop an ordered-subsets iterative algorithm to estimate the unknown attenuation of the object. Simulations show that satisfactory results can be obtained after 1 to 2 iterations, after which there is no significant improvement in image quality. An adaptive Wiener filter is also applied to the reconstructed image to remove noise. Some approximations to speed up the reconstruction computation are also considered. Applying this method to computer-generated projections of a revised Shepp phantom and to true projections from diagnostic radiographs of a patient's hand and from mammography images yields reconstructions of impressive quality. Parallel programming is also implemented and tested. The quality of the reconstructed object is preserved, while the computation time is reduced by almost a factor of the number of threads used.
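The ML iteration underlying such ordered-subsets algorithms is the classical ML-EM multiplicative update for Poisson data. A toy sketch on a hypothetical two-pixel, two-ray system (ordered subsets would simply apply this update to subsets of the rays in turn):

```python
def mlem_step(x, A, y):
    """One ML-EM update: x_j *= sum_i A_ij * y_i/(Ax)_i / sum_i A_ij."""
    proj = [sum(A[i][j] * x[j] for j in range(len(x)))
            for i in range(len(y))]
    new = []
    for j in range(len(x)):
        num = sum(A[i][j] * y[i] / proj[i] for i in range(len(y)))
        den = sum(A[i][j] for i in range(len(y)))
        new.append(x[j] * num / den)
    return new

# Hypothetical geometry: two pixels seen by two rays
A = [[1.0, 0.5],
     [0.5, 1.0]]
truth = [2.0, 4.0]
y = [sum(A[i][j] * truth[j] for j in range(2)) for i in range(2)]  # noise-free
x = [1.0, 1.0]
for _ in range(50):
    x = mlem_step(x, A, y)
```

With noise-free data the iterates converge to the true attenuation values, illustrating why a statistical model can outperform FBP when projections are few or noisy.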
Govender, Indira; Ehrlich, Rodney; Van Vuuren, Unita; De Vries, Elma; Namane, Mosedi; De Sa, Angela; Murie, Katy; Schlemmer, Arina; Govender, Strini; Isaacs, Abdul; Martell, Rob
2012-12-01
To determine whether clinical audit improved the performance of diabetic clinical processes in the health district in which it was implemented. Patient folders were systematically sampled annually for review. Primary health-care facilities in the Metro health district of the Western Cape Province in South Africa. Health-care workers involved in diabetes management. Clinical audit and feedback. The Skillings-Mack test was applied to median values of pooled audit results for nine diabetic clinical processes to measure whether there were statistically significant differences between annual audits performed in 2005, 2007, 2008 and 2009. Descriptive statistics were used to illustrate the order of values per process. A total of 40 community health centres participated in the baseline audit of 2005 that decreased to 30 in 2009. Except for two routine processes, baseline medians for six out of nine processes were below 50%. Pooled audit results showed statistically significant improvements in seven out of nine clinical processes. The findings indicate an association between the application of clinical audit and quality improvement in resource-limited settings. Co-interventions introduced after the baseline audit are likely to have contributed to improved outcomes. In addition, support from the relevant government health programmes and commitment of managers and frontline staff contributed to the audit's success.
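The Skillings-Mack test used above is a generalization of the Friedman test to unbalanced data (facilities missing from some audit years). As a sketch of the underlying idea, here is the complete-block Friedman chi-square on hypothetical audit scores, ranked within each facility across years:

```python
def friedman_statistic(blocks):
    """Friedman chi-square for k related samples ranked within each block
    (Skillings-Mack reduces to this when no data are missing)."""
    k = len(blocks[0])
    n = len(blocks)
    rank_sums = [0.0] * k
    for row in blocks:
        order = sorted(range(k), key=lambda j: row[j])
        ranks = [0.0] * k
        for r, j in enumerate(order, start=1):
            ranks[j] = r  # no ties handled in this sketch
        for j in range(k):
            rank_sums[j] += ranks[j]
    return (12.0 / (n * k * (k + 1))) * sum(R * R for R in rank_sums) \
        - 3.0 * n * (k + 1)

# Hypothetical audit scores (%) for 4 facilities over 3 audit years
scores = [[40, 55, 70],
          [35, 50, 65],
          [45, 60, 58],
          [30, 52, 68]]
chi2 = friedman_statistic(scores)
```

The statistic is compared against a chi-square distribution with k-1 degrees of freedom to decide whether scores changed significantly across audit rounds.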
NASA Astrophysics Data System (ADS)
Farmer, W. H.; Archfield, S. A.; Over, T. M.; Kiang, J. E.
2015-12-01
In the United States and across the globe, the majority of stream reaches and rivers are substantially impacted by water use or remain ungaged. The result is large gaps in the availability of natural streamflow records from which to infer hydrologic understanding and inform water resources management. From basin-specific to continent-wide scales, many efforts have been undertaken to develop methods to estimate ungaged streamflow. This work applies and contrasts several statistical models of daily streamflow at more than 1,700 reference-quality streamgages across the conterminous United States using a cross-validation methodology. The variability of streamflow simulation performance across the country exhibits a pattern familiar from other continental-scale modeling efforts for the United States. For portions of the West Coast and the dense, relatively homogeneous and humid regions of the eastern United States, models produce reliable estimates of daily streamflow using many different prediction methods. Model performance for the middle portion of the United States, marked by more heterogeneous and arid conditions, larger contributing areas, and sparser networks of streamgages, is consistently poor. A discussion of the difficulty of statistical interpolation and regionalization in these regions raises additional questions of data availability and quality, hydrologic process representation and dominance, and intrinsic variability.
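In cross-validation studies like this, each gage is held out in turn and the simulated series is scored against observations; the Nash-Sutcliffe efficiency is one common score (illustrative here, not necessarily the authors' exact metric). A minimal sketch on hypothetical flows:

```python
import statistics

def nash_sutcliffe(obs, sim):
    """NSE = 1 - SSE/SST; 1 is a perfect fit, and values <= 0 mean the
    model is no better than simply predicting the observed mean."""
    mean_obs = statistics.mean(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - sse / sst

# Hypothetical daily flows at a held-out gage vs. regionalized estimates
observed = [10.0, 12.0, 9.0, 15.0, 11.0]
estimated = [9.5, 12.5, 8.0, 14.0, 11.5]
nse = nash_sutcliffe(observed, estimated)
```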
Chandrasekaran, A; Ravisankar, R; Harikrishnan, N; Satapathy, K K; Prasad, M V R; Kanagasabapathy, K V
2015-02-25
Anthropogenic activities increase the accumulation of heavy metals in the soil environment. Soil pollution significantly reduces environmental quality and affects human health. In the present study, soil samples were collected at different locations of Yelagiri Hills, Tamilnadu, India for heavy metal analysis. The samples were analyzed for twelve selected heavy metals (Mg, Al, K, Ca, Ti, Fe, V, Cr, Mn, Co, Ni and Zn) using energy dispersive X-ray fluorescence (EDXRF) spectroscopy. Heavy metal concentrations in soil were investigated using the enrichment factor (EF), geo-accumulation index (Igeo), contamination factor (CF) and pollution load index (PLI) to determine metal accumulation, distribution and pollution status. Heavy metal toxicity risk was assessed using the soil quality guidelines (SQGs) given by the target and intervention values of the Dutch soil standards. The concentrations of Ni, Co, Zn, Cr, Mn, Fe, Ti, K, Al and Mg were mainly controlled by natural sources. Multivariate statistical methods such as the correlation matrix, principal component analysis and cluster analysis were applied for the identification of heavy metal sources (anthropogenic/natural origin). Geo-statistical methods such as kriging identified hot spots of metal contamination in road areas, influenced mainly by the presence of natural rock. Copyright © 2014 Elsevier B.V. All rights reserved.
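The pollution indices named above follow standard formulas: CF is the sample-to-background ratio, Igeo = log2(C / 1.5B), and PLI is the geometric mean of the contamination factors. A minimal sketch with hypothetical concentrations and background values (not the study's measurements):

```python
import math

def pollution_indices(sample, background):
    """Contamination factor, geo-accumulation index, and pollution load
    index per the standard soil-pollution formulas."""
    cf = {m: sample[m] / background[m] for m in sample}
    igeo = {m: math.log2(sample[m] / (1.5 * background[m])) for m in sample}
    pli = math.prod(cf.values()) ** (1 / len(cf))  # geometric mean of CFs
    return cf, igeo, pli

# Hypothetical concentrations (mg/kg) against assumed background values
sample = {"Cr": 90.0, "Ni": 40.0, "Zn": 140.0}
background = {"Cr": 60.0, "Ni": 50.0, "Zn": 70.0}
cf, igeo, pli = pollution_indices(sample, background)
```

A PLI above 1 indicates overall deterioration relative to background, which is how such indices translate raw EDXRF concentrations into a pollution status.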
Clustering and Flow Conservation Monitoring Tool for Software Defined Networks.
Puente Fernández, Jesús Antonio; García Villalba, Luis Javier; Kim, Tai-Hoon
2018-04-03
Prediction systems face challenges on two fronts: the relation between video quality and observed session features, and dynamic changes in video quality. Software Defined Networks (SDN) is a new concept of network architecture that provides the separation of the control plane (controller) and the data plane (switches) in network devices. Through the southbound interface, it is possible to deploy monitoring tools that obtain the network status and retrieve a collection of statistics. Therefore, achieving the most accurate statistics depends on the strategy for monitoring and requesting information from network devices. In this paper, we propose an enhanced algorithm for requesting statistics to measure traffic flow in SDN networks. The algorithm groups network switches into clusters based on their number of ports and applies different monitoring techniques to each cluster. This grouping avoids monitoring queries to network switches with common characteristics and thereby omits redundant information. In this way, the present proposal decreases the number of monitoring queries to switches, improving network traffic and preventing switch overload. We have tested our optimization in a video streaming simulation using different types of videos. The experiments and a comparison with traditional monitoring techniques demonstrate the feasibility of our proposal, which maintains similar measurement quality while decreasing the number of queries to the switches.
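The clustering idea above can be sketched very simply: group switches by port count and poll one representative per cluster instead of every member. This is an illustrative reduction of the strategy, with a hypothetical topology, not the paper's full algorithm:

```python
from collections import defaultdict

def cluster_switches(switches):
    """Group switches by port count; polling one representative per
    cluster stands in for querying every similar switch."""
    clusters = defaultdict(list)
    for name, ports in switches.items():
        clusters[ports].append(name)
    representatives = {ports: members[0]
                       for ports, members in clusters.items()}
    return clusters, representatives

# Hypothetical topology: switch name -> number of ports
topo = {"s1": 4, "s2": 4, "s3": 8, "s4": 4, "s5": 8}
clusters, reps = cluster_switches(topo)
queries_saved = len(topo) - len(reps)
```

Each avoided query is one less statistics request crossing the southbound interface, which is the source of the overhead reduction the abstract reports.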
Machine Learning Methods for Production Cases Analysis
NASA Astrophysics Data System (ADS)
Mokrova, Nataliya V.; Mokrov, Alexander M.; Safonova, Alexandra V.; Vishnyakov, Igor V.
2018-03-01
An approach to the analysis of events occurring during the production process is proposed. The described machine learning system is able to solve classification tasks related to production control and hazard identification at an early stage. Descriptors of internal production network data were used for training and testing the applied models. The k-Nearest Neighbors and Random Forest methods were used to illustrate and analyze the proposed solution. The quality of the developed classifiers was estimated using standard statistical metrics, such as precision, recall and accuracy.
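The evaluation metrics named above reduce to counts from the confusion matrix: precision = TP/(TP+FP), recall = TP/(TP+FN), and accuracy is the fraction of correct predictions. A minimal sketch on hypothetical hazard labels:

```python
def classification_metrics(y_true, y_pred):
    """Precision, recall, and accuracy for binary predictions."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    return precision, recall, accuracy

# Hypothetical hazard labels (1 = hazardous event) and classifier output
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
precision, recall, accuracy = classification_metrics(y_true, y_pred)
```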
Manterola, Carlos; Torres, Rodrigo; Burgos, Luis; Vial, Manuel; Pineda, Viviana
2006-07-01
Surgery is a curative treatment for gastric cancer (GC). As relapse is frequent, adjuvant therapies such as postoperative chemoradiotherapy have been tried. In Chile, some hospitals adopted Macdonald's study as a protocol for the treatment of GC. To determine the methodological quality and the internal and external validity of the Macdonald study, three instruments that assess methodological quality were applied. A critical appraisal was done and the internal and external validity of the methodological quality were analyzed with two scales: MINCIR (Methodology and Research in Surgery), valid for therapy studies, and CONSORT (Consolidated Standards of Reporting Trials), valid for randomized controlled trials (RCT). Guides and scales were applied by 5 researchers with training in clinical epidemiology. The reader's guide verified that the Macdonald study was not directed at answering a clearly defined question. There was random assignment, but the method used is not described and the patients were not followed until the end of the study (36% of the group with surgery plus chemoradiotherapy did not complete treatment). The MINCIR scale identified a multicentric RCT, not blinded, with an unclear randomization sequence, erroneous sample size estimation, vague objectives and no exclusion criteria. The CONSORT system showed the lack of a working hypothesis and specific objectives, the absence of exclusion criteria and of identification of the primary variable, an imprecise estimation of sample size, ambiguities in the randomization process, no blinding, an absence of statistical adjustment and the omission of a subgroup analysis. The instruments applied demonstrated methodological shortcomings that compromise the internal and external validity of the study.
Tasker, Gary D.; Granato, Gregory E.
2000-01-01
Decision makers need viable methods for the interpretation of local, regional, and national highway-runoff and urban-stormwater data, including flows, concentrations, and loads of chemical constituents and sediment; potential effects on receiving waters; and the potential effectiveness of various best management practices (BMPs). Valid (useful for intended purposes), current, and technically defensible stormwater-runoff models are needed to interpret data collected in field studies, to support existing highway and urban-runoff planning processes, to meet National Pollutant Discharge Elimination System (NPDES) requirements, and to provide methods for computing Total Maximum Daily Loads (TMDLs) systematically and economically. Historically, conceptual, simulation, empirical, and statistical models of varying levels of detail, complexity, and uncertainty have been used to meet various data-quality objectives in the decision-making processes necessary for the planning, design, construction, and maintenance of highways and for other land-use applications. Water-quality simulation models attempt a detailed representation of the physical processes and mechanisms at a given site. Empirical and statistical regional water-quality assessment models provide a more general picture of water quality or changes in water quality over a region. All these modeling techniques share one common aspect: their predictive ability is poor without suitable site-specific data for calibration. To apply the correct model properly, one must understand the classification of variables, the unique characteristics of water-resources data, and the concepts of population structure and analysis. Classifying the variables used to analyze data may determine which statistical methods are appropriate for data analysis.
An understanding of the characteristics of water-resources data is necessary to evaluate the applicability of different statistical methods, to interpret the results of these techniques, and to use tools and techniques that account for the unique nature of water-resources data sets. Populations of data on stormwater-runoff quantity and quality are often best modeled as logarithmic transformations. Therefore, these factors need to be considered to form valid, current, and technically defensible stormwater-runoff models. Regression analysis is an accepted method for the interpretation of water-resources data and for the prediction of current or future conditions at sites that fit the input data model. Regression analysis is designed to provide an estimate of the average response of a system as it relates to variation in one or more known variables. To produce valid models, however, regression analysis should include visual analysis of scatterplots, an examination of the regression equation, evaluation of the method's design assumptions, and regression diagnostics. A number of statistical techniques are described in the text and in the appendixes to provide the information necessary to interpret data using appropriate methods. Uncertainty is an important part of any decision-making process. To deal with uncertainty, the analyst needs to know the severity of the statistical uncertainty of the methods used to predict water quality. Statistical models need to be based on information that is meaningful, representative, complete, precise, accurate, and comparable to be deemed valid, up to date, and technically supportable. To assess uncertainty in the analytical tools, the modeling methods, and the underlying data set, all of these components need to be documented and communicated in an accessible format within project publications.
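Since stormwater quantity and quality are often modeled after logarithmic transformation, the workhorse regression is an ordinary least-squares fit of ln(load) on ln(flow), which corresponds to a power-law relation load = a * flow^b after retransformation. A minimal sketch with hypothetical, scatter-free data:

```python
import math

def fit_log_linear(x, y):
    """OLS slope b and retransformed coefficient a for ln(y) = ln(a) + b*ln(x)."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((u - mx) * (v - my) for u, v in zip(lx, ly)) / \
        sum((u - mx) ** 2 for u in lx)
    a = math.exp(my - b * mx)
    return a, b

# Hypothetical power-law data: load = 0.5 * flow^1.2, no scatter
flows = [1.0, 2.0, 5.0, 10.0, 20.0]
loads = [0.5 * q ** 1.2 for q in flows]
a, b = fit_log_linear(flows, loads)
```

Real data would of course carry scatter, which is where the scatterplot inspection, assumption checks, and retransformation-bias corrections discussed above come in.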
Nicolaisen, Marianne; Müller, Stig; Patel, Hitendra R H; Hanssen, Tove Aminda
2014-12-01
To assess patients' symptoms, quality of life and satisfaction with information three to four years after radical prostatectomy, radical external beam radiotherapy and postoperative radiotherapy, and to analyse differences between treatment groups and the relationship between disease-specific, health-related and overall quality of life and satisfaction with information. Radical prostate cancer treatments are associated with changes in quality of life. Differences in symptoms and quality of life between patients undergoing different treatments have been reported, but there are limited long-term data comparing radical prostatectomy with radical external beam radiotherapy and postoperative radiotherapy. A cross-sectional survey design was used. The study sample included 143 men treated with radical prostatectomy and/or radical external beam radiotherapy. Quality of life was measured using the 12-item Short Form Health Survey and the 50-item Expanded Prostate Cancer Index Composite instrument. Questions assessing overall quality of life and satisfaction with information were included. Descriptive statistics and inferential statistical methods were applied to analyse the data. Radical external beam radiotherapy was associated with less urinary incontinence and better urinary function. There were no differences between the groups in disease-specific quality-of-life sum scores. Sexual quality of life was reported as very low in all groups. Disease-specific quality of life and health-related quality of life were associated with overall quality of life. Patients who had undergone surgery were more satisfied with the information received, and there was a positive correlation between quality of life and patient satisfaction. Pretreatment information and patient education lead to better quality of life and satisfaction. This study indicates a need for structured pretreatment information and follow-up for all men going through radical prostate cancer treatment.
Long-term quality of life effects should be considered when planning follow-up and information for men after radical prostate cancer treatment. Structured and organised information/education may increase preparedness for symptoms and bother after the treatment, improve symptom management strategies and result in improved quality of life. © 2014 John Wiley & Sons Ltd.
The Effect of Aromatherapy on Sleep Quality of Elderly People Residing in a Nursing Home.
Faydalı, Saide; Çetinkaya, Funda
Sleep is important for health and quality of life in the elderly, and sleep disturbances are reported to be associated with many adverse medical conditions. This research was carried out to evaluate the effect of inhalation of lavender oil on the sleep quality of nursing home residents. A questionnaire was used to evaluate the sociodemographic characteristics and sleeping properties of the 30 enrolled volunteers. The Pittsburgh Sleep Quality Index was applied as a pre- and posttest to measure the sleep quality of individuals who inhaled lavender oil dropped on their pillows every evening for a week before sleeping. Before and after aromatherapy, the mean Pittsburgh Sleep Quality Index scores of the nursing home residents were 6.0 ± 5.1 and 2.6 ± 3.4, respectively, whereas no statistically significant differences were observed for the independent variables. The Cronbach α reliability coefficient of the Pittsburgh Sleep Quality Index scale was found to be 0.816. The results indicated an improvement in the sleep quality of nursing home residents after the application of aromatherapy with lavender oil.
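The Cronbach α reported above measures internal consistency: α = k/(k-1) * (1 - Σ item variances / variance of totals). A minimal sketch on hypothetical component scores (not the study's data):

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha; `items` is a list of per-item score lists
    recorded across the same respondents."""
    k = len(items)
    totals = [sum(col) for col in zip(*items)]
    item_var = sum(statistics.variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / statistics.variance(totals))

# Hypothetical scores for 5 respondents on 3 scale components
items = [[1, 2, 2, 3, 1],
         [1, 3, 2, 3, 2],
         [0, 2, 1, 3, 1]]
alpha = cronbach_alpha(items)
```

Values around 0.8, like the 0.816 reported, are conventionally taken to indicate good reliability of the scale.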
A data-driven approach to quality risk management
Alemayehu, Demissie; Alvir, Jose; Levenstein, Marcia; Nickerson, David
2013-01-01
Aim: An effective clinical trial strategy to ensure patient safety as well as trial quality and efficiency involves an integrated approach, including prospective identification of risk factors, mitigation of the risks through proper study design and execution, and assessment of quality metrics in real time. Such an integrated quality management plan may also be enhanced by using data-driven techniques to identify the risk factors that are most relevant in predicting quality issues associated with a trial. In this paper, we illustrate such an approach using data collected from actual clinical trials. Materials and Methods: Several statistical methods were employed, including the Wilcoxon rank-sum test and logistic regression, to identify the presence of association between risk factors and the occurrence of quality issues, applied to data on the quality of clinical trials sponsored by Pfizer. Results: Only a subset of the risk factors had a significant association with quality issues; these included: whether the study used a placebo, whether the agent was a biologic, an unusual packaging label, complex dosing, and more than 25 planned procedures. Conclusion: Proper implementation of the strategy can help to optimize resource utilization without compromising trial integrity and patient safety. PMID:24312890
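The Wilcoxon rank-sum test named above compares two groups by ranking the pooled observations; the equivalent Mann-Whitney U statistic is the rank sum of one group minus its minimum possible value. A minimal sketch on hypothetical quality-issue counts (the grouping variable "complex dosing" is illustrative):

```python
def rank_sum_u(group_a, group_b):
    """Mann-Whitney U (equivalent to the Wilcoxon rank-sum statistic),
    using midranks for tied values."""
    combined = sorted(group_a + group_b)
    ranks = {}
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1
        ranks[combined[i]] = (i + 1 + j) / 2  # midrank for ties
        i = j
    ra = sum(ranks[v] for v in group_a)
    na = len(group_a)
    return ra - na * (na + 1) / 2

# Hypothetical quality-issue counts for trials with vs. without complex dosing
complex_dosing = [5, 7, 9, 6]
simple_dosing = [2, 3, 4, 6]
u = rank_sum_u(complex_dosing, simple_dosing)
```

U near its maximum (here 16) indicates the first group tends to have more quality issues; a significance test would compare U against its null distribution.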
Applications of spatial statistical network models to stream data
Isaak, Daniel J.; Peterson, Erin E.; Ver Hoef, Jay M.; Wenger, Seth J.; Falke, Jeffrey A.; Torgersen, Christian E.; Sowder, Colin; Steel, E. Ashley; Fortin, Marie-Josée; Jordan, Chris E.; Ruesch, Aaron S.; Som, Nicholas; Monestiez, Pascal
2014-01-01
Streams and rivers host a significant portion of Earth's biodiversity and provide important ecosystem services for human populations. Accurate information regarding the status and trends of stream resources is vital for their effective conservation and management. Most statistical techniques applied to data measured on stream networks were developed for terrestrial applications and are not optimized for streams. A new class of spatial statistical model, based on valid covariance structures for stream networks, can be used with many common types of stream data (e.g., water quality attributes, habitat conditions, biological surveys) through application of appropriate distributions (e.g., Gaussian, binomial, Poisson). The spatial statistical network models account for spatial autocorrelation (i.e., nonindependence) among measurements, which allows their application to databases with clustered measurement locations. Large amounts of stream data exist in many areas where spatial statistical analyses could be used to develop novel insights, improve predictions at unsampled sites, and aid in the design of efficient monitoring strategies at relatively low cost. We review the topic of spatial autocorrelation and its effects on statistical inference, demonstrate the use of spatial statistics with stream datasets relevant to common research and management questions, and discuss additional applications and development potential for spatial statistics on stream networks. Free software for implementing the spatial statistical network models has been developed that enables custom applications with many stream databases.
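One valid covariance structure for stream networks described in this literature is the "tail-up" exponential model, in which covariance between flow-connected sites decays with hydrologic (along-network) distance and is scaled by a spatial weight derived from flow connectivity; unconnected sites get zero tail-up covariance. A hedged sketch with assumed parameter values:

```python
import math

def tail_up_cov(dist, weight, sigma2=1.0, theta=10.0):
    """Tail-up exponential covariance sketch: flow-connectivity weight
    times an exponential decay in hydrologic distance."""
    return weight * sigma2 * math.exp(-dist / theta)

# Flow-connected sites share weight > 0; unconnected sites get weight 0
near = tail_up_cov(dist=2.0, weight=0.8)
far = tail_up_cov(dist=30.0, weight=0.8)
unconnected = tail_up_cov(dist=2.0, weight=0.0)
```

Plugging such a covariance function into a kriging or mixed-model framework is what lets these models respect the branching topology of a stream network instead of straight-line distance.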
NASA Astrophysics Data System (ADS)
Chitrakar, S.; Miller, S. N.; Liu, T.; Caffrey, P. A.
2015-12-01
Water quality data have been collected from three representative stream reaches in a coalbed methane (CBM) development area for over five years to improve the understanding of salt loading in the system. These streams are located within the Atlantic Rim development area of Muddy Creek in south-central Wyoming, where significant development of CBM wells is ongoing. The sampling reaches comprised Duck Pond Draw and Cow Creek, which receive co-produced water, and South Fork Creek and upstream Cow Creek, which do not. Water samples were assayed for sodium, calcium, magnesium, fluoride, chloride, nitrate, O-phosphate, sulfate, carbonate and bicarbonate, along with other water quality parameters such as pH, conductivity, and TDS. Based on these water quality parameters we have investigated various hydrochemical and geochemical processes responsible for the high variability in water quality in the region. However, effective interpretation of complex databases to understand the aforementioned processes has been a challenging task due to the system's complexity. In this work we applied multivariate statistical techniques, including cluster analysis (CA), principal component analysis (PCA) and discriminant analysis (DA), to analyze the water quality data and identify similarities and differences among our locations. First, CA was applied to group the monitoring sites based on multivariate similarities. Second, PCA was applied to identify the prevalent parameters responsible for the variation of water quality in each group. Third, DA was used to identify the most important factors responsible for the variation of water quality between the low-flow and high-flow seasons. The purpose of this study is to improve the understanding of the factors and sources influencing the spatial and temporal variation of water quality.
The ultimate goal of this research is to develop a coupled salt loading and GIS-based hydrological modelling tool able to simulate salt loadings under various user-defined scenarios in regions undergoing CBM development. The findings from this study will therefore be used to formulate the predominant processes responsible for solute loading.
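The standardize-then-decompose core of the PCA step described above can be sketched as follows. This is a minimal illustration on synthetic data (the array `X` and the group shift are invented stand-ins, not the study's measurements):

```python
import numpy as np

# Synthetic stand-in for a water-quality matrix: rows = samples,
# columns = parameters (e.g. Na, Ca, Mg, SO4, conductivity).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))
X[:20, 0] += 3.0  # pretend half the samples receive co-produced water

# Standardize each parameter, then extract principal components via SVD.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)  # fraction of variance per component
scores = Z @ Vt.T                # sample coordinates in PC space

# A crude stand-in for the CA step: split sites by the sign of the first
# PC score, which here picks up the simulated group difference.
groups = scores[:, 0] > 0
```

In practice the CA step would use a proper clustering algorithm (e.g. hierarchical clustering on the standardized parameters) rather than a one-component split.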
Spectral signature verification using statistical analysis and text mining
NASA Astrophysics Data System (ADS)
DeCoster, Mallory E.; Firpi, Alexe H.; Jacobs, Samantha K.; Cone, Shelli R.; Tzeng, Nigel H.; Rodriguez, Benjamin M.
2016-05-01
In the spectral science community, numerous spectral signatures are stored in databases representing many sample materials collected from a variety of spectrometers and spectroscopists. Due to the variety and variability of the spectra that comprise many spectral databases, it is necessary to establish a metric for validating the quality of spectral signatures. This has been an area of great discussion and debate in the spectral science community. This paper discusses a method that independently validates two different aspects of a spectral signature to arrive at a final qualitative assessment: the textual meta-data and the numerical spectral data. Results associated with the spectral data stored in the Signature Database (SigDB) are presented. The numerical data comprising a sample material's spectrum are validated based on statistical properties derived from an ideal population set. The quality of the test spectrum is ranked based on a spectral angle mapper (SAM) comparison to the mean spectrum derived from the population set. Additionally, the contextual data of a test spectrum are qualitatively analyzed using lexical-analysis text mining. This technique analyzes the syntax of the meta-data to reveal local learning patterns and trends within the spectral data, indicative of the test spectrum's quality. Text mining applications have been implemented successfully for security (text encryption/decryption), biomedical, and marketing applications. The text mining lexical analysis algorithm is trained on the meta-data patterns of a subset of high- and low-quality spectra, in order to have a model to apply to the entire SigDB data set. The statistical and textual methods combine to assess the quality of a test spectrum in a database without the need for an expert user.
This method has been compared to other validation methods accepted by the spectral science community, and has provided promising results when a baseline spectral signature is present for comparison. The spectral validation method proposed is described from a practical application and analytical perspective.
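The spectral angle mapper (SAM) comparison at the heart of the numerical validation can be sketched as below; the Gaussian "absorption feature" and the population of scaled copies are hypothetical, not SigDB data:

```python
import numpy as np

def spectral_angle(test, reference):
    """Spectral angle (radians) between two spectra; 0 means identical shape."""
    cos = np.dot(test, reference) / (np.linalg.norm(test) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Hypothetical population set for one material: the same absorption feature
# with slightly different scaling; the mean spectrum serves as the reference.
wavelengths = np.linspace(400, 2500, 200)
population = np.array([np.exp(-((wavelengths - 1000) / 300) ** 2) * (1 + 0.01 * i)
                       for i in range(10)])
mean_spectrum = population.mean(axis=0)

good = spectral_angle(population[0], mean_spectrum)          # near zero
noisy_spectrum = mean_spectrum + 0.5 * np.random.default_rng(1).normal(size=200)
bad = spectral_angle(noisy_spectrum, mean_spectrum)          # much larger
```

A small angle to the population mean suggests a high-quality spectrum; larger angles flag candidates for review. SAM is insensitive to overall scaling, which is why the rescaled population member scores nearly zero.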
Using Creative Problem Solving (TRIZ) in Improving the Quality of Hospital Services
LariSemnani, Behrouz; Far, Rafat Mohebbi; Shalipoor, Elham; Mohseni, Mohammad
2015-01-01
TRIZ is an inventive problem-solving methodology and SERVQUAL is a structured methodology for quality improvement. Using these tools, inventive problem solving can be applied to quality improvement, and the highest quality can be reached using creative quality improvement methodology. The present study seeks to determine the priority of the quality aspects of services provided to patients in the hospital, as well as how TRIZ can help improve the quality of those services. This study is applied research that used a dynamic qualitative descriptive survey method during 2011. The statistical population included every patient who visited one of the university hospitals from March 2011. A large gap existed between patients' expectations and their perceptions regarding tangibles (the design of the hospital) and the timely provision of services. The quality aspects of services were prioritized as follows: the appearance of the hospital (the design), accountability, assurance, credibility, and empathy. Thus, what mattered most to the staff and managers of the studied hospital was the appearance of the hospital and of its staff, which can capture a high percentage of patients' satisfaction. By referring to the contradiction matrix, the most important principles of the TRIZ model related to tangible factors were principles No. 13 (discarding and recovering), 25 (self-service), 35 (parameter changes), and 2 (taking out). Furthermore, in addition to these four principles, principle No. 24 (intermediary) was repeated most often among the others. By utilizing TRIZ, hospital problems can be examined with a more open view, going beyond the conceptual framework of the organization and responding more quickly to patients' needs. PMID:25560360
Ghafari, Somayeh; Ahmadi, Fazlolah; Nabavi, Masoud; Anoshirvan, Kazemnejad; Memarian, Robabe; Rafatbakhsh, Mohamad
2009-08-01
To identify the effects of applying the Progressive Muscle Relaxation Technique (PMRT) on the quality of life of patients with multiple sclerosis. In view of the growing caring options in multiple sclerosis, improvement of quality of life has become increasingly relevant as a caring intervention. Complementary therapies are widely used by multiple sclerosis patients, and the Progressive Muscle Relaxation Technique is one form of complementary therapy. Quasi-experimental study. Multiple sclerosis patients (n = 66) were selected by non-probability sampling and assigned to experimental and control groups (33 patients in each group). Means of data collection included an Individual Information Questionnaire, the SF-8 Health Survey, and a self-reported checklist. PMRT was performed by the experimental group for 63 sessions over two months, while no intervention was given to the control group. Statistical analysis was done with SPSS software. Student's t-test showed no significant difference between the two groups in mean scores of health-related quality of life before the study, but showed a significant difference between the two groups one and two months after the intervention (p < 0.05). An ANOVA test with repeated measurements showed a significant difference in the mean scores of overall health-related quality of life and its dimensions between the two groups across the three time points (p < 0.05). Although this study provides modest support for the effectiveness of the Progressive Muscle Relaxation Technique on the quality of life of multiple sclerosis patients, further research is required to determine better methods to promote the quality of life of patients suffering from multiple sclerosis and other chronic diseases. The Progressive Muscle Relaxation Technique is practically feasible and is associated with an increase in the quality of life of multiple sclerosis patients; health professionals therefore need to update their knowledge about complementary therapies.
Quality assessment of butter cookies applying multispectral imaging
Andresen, Mette S; Dissing, Bjørn S; Løje, Hanne
2013-01-01
A method for characterization of butter cookie quality by assessing surface browning and water content using multispectral images is presented. Based on evaluations of browning, the cookies were manually divided into groups. From this categorization, reference values were calculated for a statistical prediction model correlating multispectral images with a browning score. The browning score is calculated as a function of oven temperature and baking time and is presented as a quadratic response surface. The investigated process window covered baking times of 4–16 min and oven temperatures of 160–200°C in a forced-convection electrically heated oven. In addition to the browning score, a model for predicting the average water content from the same images is presented. This shows how multispectral images of butter cookies may be used for the assessment of different quality parameters. Statistical analysis showed that the most significant wavelengths for browning prediction were in the interval 400–700 nm, while the wavelengths significant for water prediction were primarily located in the near-infrared spectrum. The water prediction model was found to estimate the average water content with an absolute error of 0.22%. From the images it was also possible to follow the browning and drying propagation from the cookie edge toward the center. PMID:24804036
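A quadratic response surface like the one described can be fitted by ordinary least squares on the monomials of baking time and temperature. The data below are simulated over the stated 4–16 min / 160–200°C window; the coefficients are invented for illustration:

```python
import numpy as np

# Simulated browning scores over the stated process window; the true
# coefficients and noise level here are illustrative, not the study's.
rng = np.random.default_rng(2)
t = rng.uniform(4, 16, 60)       # baking time, min
T = rng.uniform(160, 200, 60)    # oven temperature, deg C
score = (0.02 * t**2 + 0.001 * T**2 + 0.01 * t * T
         - 0.5 * t - 0.3 * T + 40 + rng.normal(0, 0.1, 60))

# Fit browning = b0 + b1*t + b2*T + b3*t^2 + b4*T^2 + b5*t*T by least squares.
A = np.column_stack([np.ones_like(t), t, T, t**2, T**2, t * T])
coef, *_ = np.linalg.lstsq(A, score, rcond=None)
pred = A @ coef
```

The same design matrix evaluated on a grid of (time, temperature) pairs yields the response surface for plotting or for choosing a target browning score.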
Chounlamany, Vanseng; Tanchuling, Maria Antonia; Inoue, Takanobu
2017-09-01
Payatas landfill in Quezon City, Philippines, releases leachate to the Marikina River through a creek. Multivariate statistical techniques were applied to study temporal and spatial variations in water quality of a segment of the Marikina River. The data set included 12 physico-chemical parameters for five monitoring stations over a year. Cluster analysis grouped the monitoring stations into four clusters and identified January-May as the dry season and June-September as the wet season. Principal components analysis showed that three latent factors are responsible for the data set, explaining 83% of its total variance. The chemical oxygen demand, biochemical oxygen demand, total dissolved solids, Cl⁻ and PO₄³⁻ are influenced by anthropogenic impact/eutrophication pollution from point sources. Total suspended solids, turbidity and SO₄²⁻ are influenced by rain and soil erosion. The highest state of pollution is at the Payatas creek outfall from March to May, whereas at downstream stations it is in May. The current study indicates that river monitoring requires only four stations, nine water quality parameters and testing over three specific months of the year. The findings of this study imply that the Payatas landfill requires a proper leachate collection and treatment system to reduce its impact on the Marikina River.
Muz, Gamze; Taşcı, Sultan
2017-10-01
The most common problems in hemodialysis patients are sleep disorders and fatigue. This randomized controlled experimental study was conducted to determine the effect of aromatherapy applied by inhalation on sleep quality and fatigue level in hemodialysis patients. The study was completed in five hemodialysis centers located in two provinces, with 27 intervention group patients and 35 controls (62 patients in total) recruited by simple randomization. Ethical approval, informed consent from the individuals, and institutional permission were obtained. Data were collected with a questionnaire form, a Visual Analogue Scale (VAS) for fatigue, the Piper Fatigue Scale, the Pittsburgh Sleep Quality Index (PSQI), and follow-up forms for the patient and the researcher. Aromatherapy inhalation (sweet orange and lavender oil) was performed by the intervention group patients before going to bed every day for one month. The control group patients received no application other than standard hemodialysis treatment. All forms were administered at baseline and at the end of the four weeks (baseline and last follow-up); the VAS and the Piper Fatigue Scale were also administered at the end of every week (the first, second and third follow-ups). Data were statistically analyzed with the independent-samples t-test, one-way analysis of variance, Pearson correlation analysis, the chi-square test, Friedman and Mann-Whitney U tests, and the Bonferroni test; p < 0.05 was set as statistically significant in comparisons. Mean total and sub-dimension scores of the VAS, Piper Fatigue Scale and PSQI (except for the daytime dysfunction sub-dimension) of the intervention and control groups at baseline were not significantly different (p > 0.05). Mean total and sub-dimension scores of the VAS, Piper Fatigue Scale and PSQI of the intervention group decreased significantly at the subsequent follow-ups compared with the control group (p < 0.05).
Consequently, it was determined that aromatherapy applied by inhalation improved sleep quality, decreased fatigue level and severity in hemodialysis patients. Accordingly, aromatherapy prepared with sweet orange and lavender oil may be recommended to increase sleep quality and to decrease fatigue level of the hemodialysis patients. Copyright © 2017 Elsevier Inc. All rights reserved.
Importance of implementing an analytical quality control system in a core laboratory.
Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T
2015-01-01
The aim of the clinical laboratory is to provide useful information for the screening, diagnosis, and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control designed to detect errors, and compares its data with those of other laboratories through external quality control. In this way it has a tool to check fulfillment of the objectives set and, in case of errors, to take corrective actions and ensure the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodic assessment (at 6-month intervals) to determine compliance with pre-determined specifications (Stockholm Consensus). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically depicted as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. It should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random, and total error) at regular intervals to ensure that they meet pre-determined specifications and, if not, apply the appropriate corrective actions.
Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
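A minimal sketch of the statistics such a protocol computes (mean, SD, CV, bias, total error, and a simple Westgard-style 1-3s rule), assuming a hypothetical control material with a target of 100 units:

```python
import numpy as np

# Hypothetical daily control-material results for one analyte.
rng = np.random.default_rng(3)
qc = rng.normal(loc=101.0, scale=2.0, size=30)
target = 100.0

mean, sd = qc.mean(), qc.std(ddof=1)
cv = 100 * sd / mean                   # random error, % (coefficient of variation)
bias = 100 * (mean - target) / target  # systematic error, %
total_error = abs(bias) + 1.65 * cv    # a common total-error estimate (95 %)

# Simple Westgard 1-3s rule: flag any result beyond mean +/- 3 SD.
violations = np.abs(qc - mean) > 3 * sd
```

The computed total error would then be compared against the quality specification chosen for the analyte (e.g. one derived from biological variation), triggering corrective action when it is exceeded.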
Novel Data Reduction Based on Statistical Similarity
Lee, Dongeun; Sim, Alex; Choi, Jaesik; ...
2016-07-18
Applications such as scientific simulations and power grid monitoring are generating so much data so quickly that compression is essential to reduce storage requirements or transmission capacity. To achieve better compression, one is often willing to discard some repeated information. These lossy compression methods are primarily designed to minimize the Euclidean distance between the original data and the compressed data. But this measure of distance severely limits either reconstruction quality or compression performance. In this paper, we propose a new class of compression method by redefining the distance measure with a statistical concept known as exchangeability. This approach captures essential features of the data while reducing the storage requirement. We report our design and implementation of such a compression method named IDEALEM. To demonstrate its effectiveness, we apply it to a set of power grid monitoring data, and show that it can reduce the volume of data much more than the best known compression methods while maintaining the quality of the compressed data. Finally, in these tests, IDEALEM captures extraordinary events in the data, while its compression ratios can far exceed 100.
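IDEALEM's actual distance measure and algorithm are more elaborate, but the core idea (store a data window only when no already-stored window is statistically similar to it) can be loosely sketched with a two-sample Kolmogorov-Smirnov distance as the similarity test, on synthetic windows:

```python
import numpy as np

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov distance between empirical CDFs."""
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))

# Hypothetical stream of measurement windows: a window that looks
# exchangeable with an already-stored window would be replaced by a
# reference to it; only genuinely new distributions are stored.
rng = np.random.default_rng(4)
windows = [rng.normal(0.0, 1.0, 64) for _ in range(8)] + [rng.normal(5.0, 1.0, 64)]
stored = []
for w in windows:
    if all(ks_statistic(w, s) > 0.3 for s in stored):  # no similar window yet
        stored.append(w)
ratio = len(windows) / len(stored)  # crude stand-in for a compression ratio
```

The shifted final window (an "extraordinary event") is always stored, while the statistically similar windows collapse onto a single representative.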
Mashburn, Andrew J; Downer, Jason T; Rivers, Susan E; Brackett, Marc A; Martinez, Andres
2014-04-01
Social and emotional learning programs are designed to improve the quality of social interactions in schools and classrooms in order to positively affect students' social, emotional, and academic development. The statistical power of group randomized trials to detect effects of social and emotional learning programs and other preventive interventions on setting-level outcomes is influenced by the reliability of the outcome measure. In this paper, we apply generalizability theory to an observational measure of the quality of classroom interactions that is an outcome in a study of the efficacy of a social and emotional learning program called The Recognizing, Understanding, Labeling, Expressing, and Regulating emotions Approach. We estimate multiple sources of error variance in the setting-level outcome and identify observation procedures to use in the efficacy study that most efficiently reduce these sources of error. We then discuss the implications of using different observation procedures on both the statistical power and the monetary costs of conducting the efficacy study.
Objective assessment of image quality. IV. Application to adaptive optics
Barrett, Harrison H.; Myers, Kyle J.; Devaney, Nicholas; Dainty, Christopher
2008-01-01
The methodology of objective assessment, which defines image quality in terms of the performance of specific observers on specific tasks of interest, is extended to temporal sequences of images with random point spread functions and applied to adaptive imaging in astronomy. The tasks considered include both detection and estimation, and the observers are the optimal linear discriminant (Hotelling observer) and the optimal linear estimator (Wiener). A general theory of first- and second-order spatiotemporal statistics in adaptive optics is developed. It is shown that the covariance matrix can be rigorously decomposed into three terms representing the effect of measurement noise, random point spread function, and random nature of the astronomical scene. Figures of merit are developed, and computational methods are discussed. PMID:17106464
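The Hotelling (optimal linear) observer named above can be sketched for a simple detection task; the image statistics below are synthetic and illustrative only:

```python
import numpy as np

# Synthetic two-class detection task: signal-absent vs signal-present image
# vectors (16 "pixels").
rng = np.random.default_rng(5)
n, d = 200, 16
signal = np.zeros(d)
signal[5] = 1.0
absent = rng.normal(size=(n, d))
present = rng.normal(size=(n, d)) + signal

# Hotelling observer: template w = S^-1 (mean difference), with S the
# average of the two class covariance matrices.
delta = present.mean(axis=0) - absent.mean(axis=0)
S = 0.5 * (np.cov(absent.T) + np.cov(present.T))
w = np.linalg.solve(S, delta)

snr2 = float(delta @ w)  # Hotelling detectability, SNR^2
t_absent = absent @ w    # linear test statistics for each class
t_present = present @ w
```

In the adaptive optics setting of the paper, the covariance S would include the three decomposed terms (measurement noise, random point spread function, and random scene) rather than being estimated from samples as here.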
Ren, Anna N; Neher, Robert E; Bell, Tyler; Grimm, James
2018-06-01
Preoperative planning is important to achieve successful implantation in primary total knee arthroplasty (TKA). However, traditional TKA templating techniques are not accurate enough to predict the component size within a close range. With the goal of developing a general predictive statistical model using patient demographic information, ordinal logistic regression was applied to build a proportional odds model to predict the tibia component size. The study retrospectively collected data from 1992 primary Persona Knee System TKA procedures. Of these, 199 procedures were randomly selected as testing data, and the rest of the data were randomly partitioned between model training data and model evaluation data at a ratio of 7:3. Different models were trained and evaluated on the training and validation data sets after data exploration. The final model had patient gender, age, weight, and height as independent variables and predicted the tibia size to within one size 96% of the time on the validation data, 94% of the time on the testing data, and 92% of the time on a prospective cadaver data set. The study results indicated that the statistical model built by ordinal logistic regression can increase the accuracy of tibia sizing information for Persona Knee preoperative templating. This research shows that statistical modeling may be used with radiographs to dramatically enhance templating accuracy, efficiency, and quality. In general, this methodology can be applied to other TKA products when the data are applicable. Copyright © 2018 Elsevier Inc. All rights reserved.
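A proportional odds model like the one described predicts cumulative size probabilities via P(Y <= k | x) = sigmoid(theta_k - x·beta). The cutpoints, coefficients, and patient below are invented for illustration and are not the paper's fitted values:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical fitted proportional-odds model for K = 5 tibia sizes.
theta = np.array([-2.0, -0.5, 1.0, 2.5])  # K-1 ordered cutpoints
beta = np.array([0.8, 0.03, 0.02, 0.01])  # gender, age, weight, height

def size_probabilities(x):
    cum = sigmoid(theta - x @ beta)            # P(Y <= k), k = 1..K-1
    cum = np.concatenate([[0.0], cum, [1.0]])  # pad with P(Y <= 0), P(Y <= K)
    return np.diff(cum)                        # P(Y = k), k = 1..K

x = np.array([1.0, 65.0, 80.0, 1.75])  # one hypothetical patient
p = size_probabilities(x)              # probability of each component size
```

The "within one size" accuracy reported in the abstract would be evaluated by summing the probability mass on the true size and its two neighbours, or by comparing `p.argmax()` against the implanted size over a held-out set.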
2012-01-01
Background No previous studies have addressed the integrated relationships among system quality, service quality, job satisfaction, and system performance; this study attempts to bridge this gap with an evidence-based practice study. Methods The convenience sampling method was applied to the information system users of three hospitals in southern Taiwan. A total of 500 copies of the questionnaire were distributed, and 283 returned copies were valid, giving a valid response rate of 56.6%. SPSS 17.0 and AMOS 17.0 (structural equation modeling) statistical software packages were used for data analysis and processing. Results The findings are as follows: system quality has a positive influence on service quality (γ11 = 0.55), job satisfaction (γ21 = 0.32), and system performance (γ31 = 0.47). Service quality (β31 = 0.38) and job satisfaction (β32 = 0.46) positively influence system performance. Conclusions It is thus recommended that hospital information offices and developers take enhancement of service quality and user satisfaction into consideration, in addition to placing emphasis on system quality and information quality, when designing, developing, or purchasing an information system, in order to improve the benefits and achievements generated by hospital information systems. PMID:23171394
Kim, Dohun; Chang, Sun Ju; Lee, Hyun Ok; Lee, Seung Hee
2018-01-01
This study aimed to develop a culturally tailored, patient-centered psychosocial intervention program and to investigate the effects of the program on health-related quality of life, sleep disturbance, and depression in cancer survivors. This was a one-group pretest and posttest design. A total of 19 cancer survivors participated in the program. The program was designed to have an 8-week duration with one class per week. Every class was composed of a 90-min education session and a 90-min exercise session. Among the health-related quality of life subscales, the scores for global health status/quality of life, physical functioning, and emotional functioning were significantly higher at posttest than at pretest. Fatigue scores significantly decreased, whereas no changes were observed in sleep disturbance or depression scores. The findings of this study suggest that a culturally tailored, patient-centered psychosocial intervention could be applied in clinical settings to improve health-related quality of life in cancer survivors.
Field data analysis of boar semen quality.
Broekhuijse, M L W J; Feitsma, H; Gadella, B M
2011-09-01
This contribution provides an overview of approaches to correlating sow fertility data with boar semen quality characteristics. Large data sets of fertility and ejaculate data are better suited to analysing the effects of semen quality characteristics on field fertility. Variation in fertility among sows is large, while the effect of semen factors is relatively small and therefore impossible to find in smaller data sets. Large data sets allow statistical corrections for both sow- and boar-related parameters. The remaining variation in sow fertility can then be assigned to semen quality parameters, which is of great interest to AI (artificial insemination) companies. Previous studies by Varkens KI Nederland on the contributions to field fertility of (i) the number of sperm cells in an insemination dose, (ii) sperm motility and morphological defects, and (iii) the age of semen at the moment of insemination are discussed in the context of applying such knowledge to select boars for AI on the basis of their sperm parameters. © 2011 Blackwell Verlag GmbH.
The Taguchi Method Application to Improve the Quality of a Sustainable Process
NASA Astrophysics Data System (ADS)
Titu, A. M.; Sandu, A. V.; Pop, A. B.; Titu, S.; Ciungu, T. C.
2018-06-01
Taguchi’s method has long been used to improve the quality of analyzed processes and products. This research addresses an unusual situation, namely the modeling of certain technical parameters in a process that is intended to be sustainable, improving process quality and ensuring quality by means of an experimental research method. Modern experimental techniques can be applied in any field, and this study reflects the benefits of the interaction between agricultural sustainability principles and the application of Taguchi’s method. The experimental method used in this practical study combines engineering techniques with statistical experimental modeling to achieve rapid improvement of quality costs, in effect seeking optimization of existing processes and of the main technical parameters. The paper is a purely technical study that promotes an experiment using the Taguchi method, considered effective because it allows rapid achievement of 70 to 90% of the desired optimization of the technical parameters. The missing 10 to 30% can be obtained with one or two complementary experiments, limited to the 2 to 4 technical parameters considered the most influential. Applying Taguchi’s method allowed the simultaneous study, in the same experiment, of the influence factors considered most important in different combinations, while at the same time determining each factor’s contribution.
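The signal-to-noise analysis typical of a Taguchi experiment can be sketched with a small L4 orthogonal array; the factors, levels, and responses below are synthetic, not the paper's data:

```python
import numpy as np

# L4 orthogonal array: 4 runs, 3 two-level factors (levels coded 0/1).
L4 = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])
# Two replicate measurements of a larger-is-better response per run.
y = np.array([[20.0, 21.0], [23.0, 22.5], [30.0, 31.0], [33.0, 32.0]])

# Larger-is-better signal-to-noise ratio: -10 log10(mean(1/y^2)).
sn = -10 * np.log10(np.mean(1.0 / y**2, axis=1))

# Factor effect = |mean S/N at level 1 - mean S/N at level 0|.
effects = np.array([abs(sn[L4[:, f] == 1].mean() - sn[L4[:, f] == 0].mean())
                    for f in range(3)])
contribution = 100 * effects / effects.sum()  # rough % contribution per factor
```

The level of each factor with the higher mean S/N would be selected for the confirmed optimum, and the dominant factors (largest contributions) are the candidates for the one or two complementary experiments mentioned above.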
NASA Technical Reports Server (NTRS)
Johnson, R. W.; Bahn, G. S.
1977-01-01
Statistical analysis techniques were applied to develop quantitative relationships between in situ river measurements and the remotely sensed data that were obtained over the James River in Virginia on 28 May 1974. The remotely sensed data were collected with a multispectral scanner and with photographs taken from an aircraft platform. Concentration differences among water quality parameters such as suspended sediment, chlorophyll a, and nutrients indicated significant spectral variations. Calibrated equations from the multiple regression analysis were used to develop maps that indicated the quantitative distributions of water quality parameters and the dispersion characteristics of a pollutant plume entering the turbid river system. Results from further analyses that use only three preselected multispectral scanner bands of data indicated that regression coefficients and standard errors of estimate were not appreciably degraded compared with results from the 10-band analysis.
Monte Carlo simulation of PET/MR scanner and assessment of motion correction strategies
NASA Astrophysics Data System (ADS)
Işın, A.; Uzun Ozsahin, D.; Dutta, J.; Haddani, S.; El-Fakhri, G.
2017-03-01
Positron Emission Tomography (PET) is widely used in three-dimensional imaging of metabolic body function and in tumor detection. Important research efforts are being made to improve this imaging modality, and powerful simulators such as GATE are used to test and develop methods for this purpose. PET requires an acquisition time on the order of a few minutes. Because of natural patient movements such as respiration, image quality can therefore be adversely affected, which drives scientists to develop motion compensation methods. The goal of this study is to evaluate various image reconstruction methods with a GATE simulation of a PET acquisition of the torso area. The results show the need to compensate for natural respiratory movements in order to obtain an image of similar quality to the reference image. Improvements are still possible in the applied motion-field extraction algorithms. Finally, a statistical analysis should confirm the obtained results.
Post-processing method for wind speed ensemble forecast using wind speed and direction
NASA Astrophysics Data System (ADS)
Sofie Eide, Siri; Bjørnar Bremnes, John; Steinsland, Ingelin
2017-04-01
Statistical methods are widely applied to enhance the quality of both deterministic and ensemble NWP forecasts. In many situations, such as wind speed forecasting, most of the predictive information is contained in one variable of the NWP model. However, in statistical calibration of deterministic forecasts it is often seen that including more variables can further improve forecast skill. For ensembles this is rarely taken advantage of, mainly because it is generally not straightforward to include multiple variables. In this study, it is demonstrated how multiple variables can be included in Bayesian model averaging (BMA) by using a flexible regression method for estimating the conditional means. The method is applied to wind speed forecasting at 204 Norwegian stations based on wind speed and direction forecasts from the ECMWF ensemble system. At about 85% of the sites, the ensemble forecasts were improved in terms of CRPS by adding wind direction as a predictor compared with using wind speed only. On average the improvements were about 5%, mainly for moderate to strong wind situations. For weak wind speeds, adding wind direction had a more or less neutral impact.
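Ensemble improvements of the kind reported are typically scored with the CRPS, which for a finite ensemble can be computed from the sample formula E|X - y| - 0.5 E|X - X'|. The two toy ensembles below are invented for illustration:

```python
import numpy as np

def crps_ensemble(members, obs):
    """Sample CRPS of an ensemble: E|X - y| - 0.5 E|X - X'|."""
    members = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return float(term1 - term2)

# Hypothetical 10 m wind speed ensembles (m/s) before and after calibration,
# verified against one observation.
raw = np.array([2.1, 3.4, 5.0, 6.2, 7.8, 9.1])
calibrated = np.array([4.0, 4.5, 5.0, 5.5, 6.0, 6.5])
obs = 5.2

improvement = 100 * (1 - crps_ensemble(calibrated, obs) / crps_ensemble(raw, obs))
```

In a verification study the CRPS would be averaged over many forecast cases and stations before computing the percentage improvement, rather than over a single case as here.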
Gennaro, G; Ballaminut, A; Contento, G
2017-09-01
This study aims to illustrate a multiparametric automatic method for monitoring the long-term reproducibility of digital mammography systems, and its application on a large scale. Twenty-five digital mammography systems employed within a regional screening programme were controlled weekly using the same type of phantom, whose images were analysed by an automatic software tool. To assess system reproducibility levels, 15 image quality indices (IQIs) were extracted and compared with the corresponding indices previously determined by a baseline procedure. The coefficients of variation (COVs) of the IQIs were used to assess the overall variability. A total of 2553 phantom images were collected from the 25 digital mammography systems from March 2013 to December 2014. Most of the systems showed excellent image quality reproducibility over the surveillance interval, with mean variability below 5%. The variability of each IQI was below 5%, with the exception of one index associated with the smallest phantom objects (0.25 mm), which was below 10%. The method applied for the reproducibility tests (multi-detail phantoms, a cloud-based automatic software tool measuring multiple image quality indices, and statistical process control) was proven to be effective, applicable on a large scale, and suitable for any type of digital mammography system. • Reproducibility of mammography image quality should be monitored by appropriate quality controls. • Use of automatic software tools allows image quality evaluation by multiple indices. • System reproducibility can be assessed by comparing current index values with baseline data. • Overall system reproducibility of modern digital mammography systems is excellent. • The method proposed and applied is cost-effective and easily scalable.
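The statistical-process-control component can be sketched as follows, using an invented weekly IQI series checked against hypothetical baseline values:

```python
import numpy as np

# Hypothetical weekly values of one image-quality index (IQI) for one system,
# compared against a baseline mean and SD established at commissioning.
rng = np.random.default_rng(6)
baseline_mean, baseline_sd = 50.0, 1.0
weekly = rng.normal(baseline_mean, baseline_sd, 52)  # one year of weekly QC

# Coefficient of variation (%) summarizes the system's reproducibility.
cov = 100 * weekly.std(ddof=1) / weekly.mean()

# Control-chart check: flag weeks outside baseline mean +/- 3 SD.
out_of_control = np.abs(weekly - baseline_mean) > 3 * baseline_sd
```

In the study this computation would be repeated per IQI and per system, with flagged weeks triggering investigation of the mammography unit before clinical use.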
Influence of water quality on the embodied energy of drinking water treatment.
Santana, Mark V E; Zhang, Qiong; Mihelcic, James R
2014-01-01
Urban water treatment plants rely on energy intensive processes to provide safe, reliable water to users. Changes in influent water quality may alter the operation of a water treatment plant and its associated energy use or embodied energy. Therefore the objective of this study is to estimate the effect of influent water quality on the operational embodied energy of drinking water, using the city of Tampa, Florida as a case study. Water quality and water treatment data were obtained from the David L Tippin Water Treatment Facility (Tippin WTF). Life cycle energy analysis (LCEA) was conducted to calculate treatment chemical embodied energy values. Statistical methods including Pearson's correlation, linear regression, and relative importance were used to determine the influence of water quality on treatment plant operation and subsequently, embodied energy. Results showed that influent water quality was responsible for about 14.5% of the total operational embodied energy, mainly due to changes in treatment chemical dosages. The method used in this study can be applied to other urban drinking water contexts to determine if drinking water source quality control or modification of treatment processes will significantly minimize drinking water treatment embodied energy.
NASA Astrophysics Data System (ADS)
Jean, Pierre-Olivier; Bradley, Robert; Tremblay, Jean-Pierre
2015-04-01
An important asset for the management of wild ungulates is the ability to recognize the spatial distribution of forage quality across heterogeneous landscapes. Doing so typically requires knowledge of which plant species are eaten, in what abundance they are eaten, and what their nutritional quality might be. Acquiring such data may, however, be difficult and time consuming. Here, we propose a rapid and cost-effective forage quality monitoring tool that combines near infrared (NIR) spectra of fecal samples with easily obtained data on plant community composition. Our approach rests on the premise that NIR spectra of fecal samples collected within low population density exclosures reflect the optimal forage quality of a given landscape. Forage quality can thus be based on the Mahalanobis distance of fecal spectral scans across the landscape relative to fecal spectral scans inside exclosures (referred to as DISTEX). The Gi* spatial autocorrelation statistic can then be applied among neighbouring DISTEX values to detect and map 'hot-spots' and 'cold-spots' of nutritional quality over the landscape. We tested our approach in a heterogeneous boreal landscape on Anticosti Island (Québec, Canada).
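The DISTEX idea (Mahalanobis distance of a landscape scan to the exclosure reference distribution) can be sketched on synthetic spectral features:

```python
import numpy as np

# Hypothetical fecal NIR spectra compressed to 4 spectral features per sample.
# Exclosure scans define the optimal-forage reference distribution.
rng = np.random.default_rng(7)
exclosure = rng.normal(0.0, 1.0, size=(60, 4))   # reference scans
landscape = rng.normal(0.0, 1.0, size=(30, 4))   # scans across the landscape
landscape[-5:] += 4.0                            # a few nutritionally poor sites

mu = exclosure.mean(axis=0)
S_inv = np.linalg.inv(np.cov(exclosure.T))

def distex(x):
    """Mahalanobis distance of one scan to the exclosure reference."""
    d = x - mu
    return float(np.sqrt(d @ S_inv @ d))

scores = np.array([distex(x) for x in landscape])
```

Large DISTEX values mark sites whose fecal spectra depart from the optimal-forage reference; feeding the georeferenced scores into a Gi* hot-spot analysis would then map the cold-spots of nutritional quality.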
Artificial Intelligence Approach to Support Statistical Quality Control Teaching
ERIC Educational Resources Information Center
Reis, Marcelo Menezes; Paladini, Edson Pacheco; Khator, Suresh; Sommer, Willy Arno
2006-01-01
Statistical quality control--SQC (consisting of Statistical Process Control, Process Capability Studies, Acceptance Sampling and Design of Experiments) is a very important tool to obtain, maintain and improve the Quality level of goods and services produced by an organization. Despite its importance, and the fact that it is taught in technical and…
Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona
2012-01-01
Statistical process control is the application of statistical methods to the measurement and analysis of variation in a process. Various regulatory authorities such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, the forecasting of critical quality attributes, sigma process capability, and process stability were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to areas where the application of quality improvement and quality risk assessment principles can achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determining the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment enables selection of critical quality attributes among the quality control parameters.
Sequential application of normal probability distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable processes, which are candidates for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
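The chart-and-capability sequence described above can be illustrated with a small sketch. The tablet weights and specification limits below are invented for illustration, and the sigma level is taken as 3·Cpk following the usual short-term convention:

```python
from statistics import mean, stdev

def control_limits(samples):
    """3-sigma limits for an individuals (X) control chart."""
    m, s = mean(samples), stdev(samples)
    return m - 3 * s, m + 3 * s

def cpk(samples, lsl, usl):
    """Process capability index against lower/upper specification limits."""
    m, s = mean(samples), stdev(samples)
    return min((usl - m) / (3 * s), (m - lsl) / (3 * s))

# Hypothetical tablet weights (mg) with specification limits 245-255 mg
weights = [249.8, 250.3, 250.1, 249.6, 250.5, 250.0, 249.9, 250.2]
lcl, ucl = control_limits(weights)
capability = cpk(weights, lsl=245.0, usl=255.0)
sigma_level = 3 * capability   # conventional short-term sigma level implied by Cpk
```

Points falling outside (lcl, ucl) would signal statistical instability; a Cpk above roughly 2 corresponds to the six sigma-capable target the abstract mentions.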
Urresti-Estala, Begoña; Carrasco-Cantos, Francisco; Vadillo-Pérez, Iñaki; Jiménez-Gavilán, Pablo
2013-03-15
Determining background levels is a key element in the further characterisation of groundwater bodies, according to Water Framework Directive 2000/60/EC and, more specifically, Groundwater Directive 2006/118/EC. In many cases, these levels present very high values for some parameters and types of groundwater, which makes their correct estimation significant as a prior step to establishing thresholds, assessing the status of water bodies and subsequently identifying contaminant patterns. The Guadalhorce River basin presents widely varying hydrogeological and hydrochemical conditions. Therefore, its background levels are the result of the many factors represented in the natural chemical composition of water bodies in this basin. The question of determining background levels under objective criteria is generally addressed as a statistical problem, arising from the many aspects involved in its calculation. In the present study, we outline the advantages of applying two statistical techniques developed specifically for this purpose: (1) the iterative 2σ technique and (2) the distribution function, and examine whether the conclusions reached by these techniques are similar or whether they differ considerably. In addition, we identify the specific characteristics of each approach and the circumstances under which they should be used. Copyright © 2012 Elsevier Ltd. All rights reserved.
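The iterative 2σ technique lends itself to a compact sketch: repeatedly discard values outside mean ± 2·sd until no more are removed, then report the surviving range as the natural background. The concentrations below are invented, with two values representing contamination rather than background:

```python
from statistics import mean, stdev

def iterative_2sigma(values, max_iter=50):
    """Iterative 2-sigma background estimation: discard values outside
    mean +/- 2*sd until the data set stabilizes, then return the range
    of the surviving values as the natural background."""
    data = list(values)
    for _ in range(max_iter):
        m, s = mean(data), stdev(data)
        kept = [v for v in data if m - 2 * s <= v <= m + 2 * s]
        if len(kept) == len(data):
            break
        data = kept
    return min(data), max(data)

# Hypothetical chloride concentrations (mg/L); the two high values
# stand in for anthropogenic contamination
chloride = [12, 14, 15, 13, 16, 14, 15, 13, 12, 95, 120]
lo, hi = iterative_2sigma(chloride)
# lo, hi -> (12, 16): the contaminated samples are excluded iteratively
```

The distribution-function approach mentioned as technique (2) would instead locate the break in the cumulative distribution; comparing the two is exactly the point of the study.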
Stocker, Elena; Becker, Karin; Hate, Siddhi; Hohl, Roland; Schiemenz, Wolfgang; Sacher, Stephan; Zimmer, Andreas; Salar-Behzadi, Sharareh
2017-01-01
This study aimed to apply quality risk management based on the The International Conference on Harmonisation guideline Q9 for the early development stage of hot melt coated multiparticulate systems for oral administration. N-acetylcysteine crystals were coated with a formulation composing tripalmitin and polysorbate 65. The critical quality attributes (CQAs) were initially prioritized using failure mode and effects analysis. The CQAs of the coated material were defined as particle size, taste-masking efficiency, and immediate release profile. The hot melt coated process was characterized via a flowchart, based on the identified potential critical process parameters (CPPs) and their impact on the CQAs. These CPPs were prioritized using a process failure mode, effects, and criticality analysis and their critical impact on the CQAs was experimentally confirmed using a statistical design of experiments. Spray rate, atomization air pressure, and air flow rate were identified as CPPs. Coating amount and content of polysorbate 65 in the coating formulation were identified as critical material attributes. A hazard and critical control points analysis was applied to define control strategies at the critical process points. A fault tree analysis evaluated causes for potential process failures. We successfully demonstrated that a standardized quality risk management approach optimizes the product development sustainability and supports the regulatory aspects. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Efficacy of a perceptual and visual-motor skill intervention program for students with dyslexia.
Fusco, Natália; Germano, Giseli Donadon; Capellini, Simone Aparecida
2015-01-01
To verify the efficacy of a perceptual and visual-motor skill intervention program for students with dyslexia. The participants were 20 students from third to fifth grade of a public elementary school in Marília, São Paulo, aged from 8 years to 11 years and 11 months, distributed into the following groups: Group I (GI; 10 students with developmental dyslexia) and Group II (GII; 10 students with good academic performance). A perceptual and visual-motor intervention program was applied, which comprised exercises for visual-motor coordination, visual discrimination, visual memory, visual-spatial relationship, shape constancy, sequential memory, visual figure-ground coordination, and visual closure. In pre- and post-testing situations, both groups were submitted to the Test of Visual-Perceptual Skills (TVPS-3), and the quality of handwriting was analyzed using the Dysgraphia Scale. The statistical analysis showed that both groups of students had dysgraphia in the pretest. GI performed worse than GII in visual perceptual skills, as well as in the quality of writing. After undergoing the intervention program, GI increased its average number of correct answers in the TVPS-3 and improved its quality of handwriting. The intervention program proved appropriate for students with dyslexia and showed positive effects, because it improved visual perception skills and quality of writing for students with developmental dyslexia.
Quality Control of Meteorological Observations
NASA Technical Reports Server (NTRS)
Collins, William; Dee, Dick; Rukhovets, Leonid
1999-01-01
The problem of meteorological observation quality control (QC) was first formulated by L.S. Gandin at the Main Geophysical Observatory in the 1970s. Later, in 1988, L.S. Gandin began adapting his ideas on complex quality control (CQC) to the operational environment at the National Centers for Environmental Prediction. The CQC was first applied by L.S. Gandin and his colleagues to the detection and correction of errors in rawinsonde heights and temperatures using a complex of hydrostatic residuals. Later, a full complex of residuals, vertical and horizontal optimal interpolations, and baseline checks were added for the checking and correction of a wide range of meteorological variables. Some other of Gandin's ideas were applied and substantially developed at other meteorological centers. A new statistical QC was recently implemented in the Goddard Data Assimilation System. The central component of any quality control is a buddy check, which tests individual suspect observations against available nearby non-suspect observations. A novel feature of this test is that the error variances used for QC decisions are re-estimated on-line. As a result, the allowed tolerances for suspect observations can depend on local atmospheric conditions. The system is then better able to accept extreme values observed in deep cyclones, jet streams, and so on. The basic statements of this adaptive buddy check are described. Some results of the on-line QC, including moisture QC, are presented.
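A much-simplified version of the adaptive buddy check can be sketched as follows. The station values are invented, and a real system would test optimal-interpolation residuals rather than raw neighbour values; the point is only that the tolerance is re-estimated from the local data, so it widens in disturbed conditions:

```python
from statistics import mean, stdev

def buddy_check(suspect, neighbours, k=3.0):
    """Accept a suspect observation if it lies within k standard
    deviations of nearby non-suspect observations.  The tolerance is
    estimated from the neighbours themselves, so it adapts to local
    atmospheric variability."""
    m = mean(neighbours)
    s = stdev(neighbours)
    return abs(suspect - m) <= k * s

# Hypothetical 500 hPa heights (m) from surrounding stations
calm = [5520, 5523, 5519, 5521, 5522]      # quiet conditions, tight spread
stormy = [5380, 5440, 5410, 5470, 5350]    # deep cyclone, large spread
```

With the tight `calm` neighbourhood, a suspect value of 5560 m is rejected; in the `stormy` neighbourhood the same absolute departure is tolerated, which is how the adaptive check avoids flagging genuine extremes in deep cyclones.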
Quantitative metrics for assessment of chemical image quality and spatial resolution
Kertesz, Vilmos; Cahill, John F.; Van Berkel, Gary J.
2016-02-28
Rationale: Currently, objective/quantitative descriptions of the quality and spatial resolution of mass spectrometry derived chemical images are not standardized. Development of these standardized metrics is required to objectively describe the chemical imaging capabilities of existing and/or new mass spectrometry imaging technologies. Such metrics would allow unbiased judgment of intra-laboratory advancement and/or inter-laboratory comparison for these technologies if used together with standardized surfaces. Methods: We developed two image metrics, viz., chemical image contrast (ChemIC), based on signal-to-noise related statistical measures on chemical image pixels, and corrected resolving power factor (cRPF), constructed from statistical analysis of mass-to-charge chronograms across features of interest in an image. These metrics, quantifying chemical image quality and spatial resolution, respectively, were used to evaluate chemical images of a model photoresist patterned surface collected using a laser ablation/liquid vortex capture mass spectrometry imaging system under different instrument operational parameters. Results: The calculated ChemIC and cRPF metrics determined in an unbiased fashion the relative ranking of chemical image quality obtained with the laser ablation/liquid vortex capture mass spectrometry imaging system. These rankings were used to show that both chemical image contrast and spatial resolution deteriorated with increasing surface scan speed, increased lane spacing, and decreasing size of surface features. Conclusions: ChemIC and cRPF were developed and successfully applied for the objective description of chemical image quality and spatial resolution of chemical images collected from model surfaces using a laser ablation/liquid vortex capture mass spectrometry imaging system.
Potential errors and misuse of statistics in studies on leakage in endodontics.
Lucena, C; Lopez, J M; Pulgar, R; Abalos, C; Valderrama, M J
2013-04-01
To assess the quality of the statistical methodology used in studies of leakage in Endodontics, and to compare the results found using appropriate versus inappropriate inferential statistical methods. The search strategy used the descriptors 'root filling' 'microleakage', 'dye penetration', 'dye leakage', 'polymicrobial leakage' and 'fluid filtration' for the time interval 2001-2010 in journals within the categories 'Dentistry, Oral Surgery and Medicine' and 'Materials Science, Biomaterials' of the Journal Citation Report. All retrieved articles were reviewed to find potential pitfalls in statistical methodology that may be encountered during study design, data management or data analysis. The database included 209 papers. In all the studies reviewed, the statistical methods used were appropriate for the category attributed to the outcome variable, but in 41% of the cases, the chi-square test or parametric methods were inappropriately selected subsequently. In 2% of the papers, no statistical test was used. In 99% of cases, a statistically 'significant' or 'not significant' effect was reported as a main finding, whilst only 1% also presented an estimation of the magnitude of the effect. When the appropriate statistical methods were applied in the studies with originally inappropriate data analysis, the conclusions changed in 19% of the cases. Statistical deficiencies in leakage studies may affect their results and interpretation and might be one of the reasons for the poor agreement amongst the reported findings. Therefore, more effort should be made to standardize statistical methodology. © 2012 International Endodontic Journal.
Neutron/Gamma-ray discrimination through measures of fit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amiri, Moslem; Prenosil, Vaclav; Cvachovec, Frantisek
2015-07-01
Statistical tests and their underlying measures of fit can be utilized to separate neutron/gamma-ray pulses in a mixed radiation field. In this article, the application of a sample statistical test is first explained. Fit measurement-based methods require true pulse shapes to be used as references for discrimination. This requirement makes practical implementation of these methods difficult; typically, another discrimination approach must be employed to capture samples of neutrons and gamma rays before running the fit-based technique. In this article, we also propose a technique to eliminate this requirement. These approaches are applied to several sets of mixed neutron and gamma-ray pulses obtained through different digitizers using a stilbene scintillator in order to analyze them and measure their discrimination quality.
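The fit-measure idea (label each pulse by the reference shape it fits best) can be sketched as follows. The normalized pulse tails are invented stand-ins for digitized stilbene pulses, chosen only to reflect the known fact that neutron pulses in organic scintillators carry more delayed light than gamma-ray pulses:

```python
def sum_sq_residuals(pulse, template):
    """Measure of fit between a normalized pulse and a reference shape."""
    return sum((p - t) ** 2 for p, t in zip(pulse, template))

def classify(pulse, neutron_ref, gamma_ref):
    """Label a pulse by whichever reference shape it fits better."""
    return ("neutron" if sum_sq_residuals(pulse, neutron_ref)
            < sum_sq_residuals(pulse, gamma_ref) else "gamma")

# Hypothetical normalized pulse tails (peak-normalized samples)
neutron_ref = [1.00, 0.60, 0.40, 0.28, 0.20, 0.15]
gamma_ref = [1.00, 0.45, 0.22, 0.12, 0.07, 0.04]

unknown = [1.00, 0.58, 0.38, 0.26, 0.19, 0.14]
label = classify(unknown, neutron_ref, gamma_ref)   # -> "neutron"
```

The article's point is precisely that obtaining good `neutron_ref`/`gamma_ref` templates is the hard part, which is why it proposes a way to remove that requirement.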
The Effect of Hydration on the Voice Quality of Future Professional Vocal Performers.
van Wyk, Liezl; Cloete, Mariaan; Hattingh, Danel; van der Linde, Jeannie; Geertsema, Salome
2017-01-01
The application of systemic hydration as an instrument for optimal voice quality has been common practice among professional voice users for years. Although the physiological action has been determined, the benefits for acoustic and perceptual characteristics are relatively unknown. The present study aimed to determine whether systemic hydration has beneficial outcomes on the voice quality of future professional voice users. A within-subject, pretest-posttest design was applied to obtain quantitative results for female singing students between 18 and 32 years of age without a history of voice pathology. Acoustic and perceptual data were collected before and after a 2-hour singing rehearsal. The difference between the hypohydrated condition (control) and the hydrated condition (experimental), and the relationship between adequate hydration and acoustic and perceptual parameters of voice, were then investigated. A statistically significant (P = 0.041) increase in jitter values was obtained for the hypohydrated condition. Increased maximum phonation time (MPT /z/) and higher maximum frequency under hydration indicated further statistically significant changes in voice quality (P = 0.028 and P = 0.015, respectively). Systemic hydration has positive outcomes on perceptual and acoustic parameters of voice quality for future professional singers. The singer's ability to sustain notes for longer and reach higher frequencies may reflect well in performances. Any positive change in voice quality may benefit the singer's occupational success and subsequently their social, emotional, and vocational well-being. More research evidence is needed to determine the parameters for implementing adequate hydration in vocal hygiene programs. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
Sociodemographic aspects and quality of life of patients with sickle cell anemia
dos Santos, Juliana Pereira; Gomes Neto, Mansueto
2013-01-01
Background Sickle cell anemia is a chronic inherited disease, widespread in the Brazilian population due to the high degree of miscegenation in the country. Despite the high prevalence, there are few studies describing the characteristics of patients and the impact of the disease on quality of life. Objective To describe the sociodemographic profile and the impact of the disease on the quality of life of sickle cell anemia patients. Methods Patients over 18 years old with sickle cell anemia who attended meetings held by the Associação Baiana de Portadores de Doenças Falciformes, an association for sickle cell anemia patients in Bahia, were interviewed. Sociodemographic data were collected and the generic Medical Outcomes Study 36-Item Short-Form Health Survey (SF-36) questionnaire, which is used to assess quality of life, was applied. Descriptive statistical analysis was performed using the Statistical Package for the Social Sciences (SPSS) software. Results Thirty-two mostly female (65.6%) patients were interviewed. The mean age was 31.9 ± 12.67 years; 50.0% considered themselves black, 68.8% did not work and 87.5% had per capita income below the poverty line (up to one and a half minimum wages). The SF-36 scores were: limitation by physical aspects 26.56, functional capacity 28.9, emotional aspects 30.20, social aspects 50.0, pain 50.31, mental health 54.62, general health status 56.09 and vitality 56.71. This shows that the disease has a huge impact on the patients' quality of life. Conclusion The disease interferes with the working capacity of individuals, who mostly have low incomes and impaired access to healthcare services, and significantly impacts their quality of life. PMID:24106440
Control by quality: proposition of a typology.
Pujo, P; Pillet, M
The application of Quality tools and methods in industrial management has always had a fundamental impact on the control of production. It influences the behavior of the actors concerned, while introducing the necessary notions and formalizations, especially for production systems with little or no automation, which constitute a large part of the industrial activity. Several quality approaches are applied in the workshop and are implemented at the level of the control. In this paper, the authors present a typology of the various approaches that have successively influenced control, such as statistical process control, quality assurance, and continuous improvement. First the authors present a parallel between production control and quality organizational structure. They note the duality between control, which is aimed at increasing productivity, and quality, which aims to satisfy the needs of the customer. They also note the hierarchical organizational structure of these two systems of management with, at each level, the notion of a feedback loop. This notion is fundamental to any kind of decision making. The paper is organized around the operational, tactical, and strategic levels, by describing for each level the main methods and tools for control by quality. The overview of these tools and methods starts at the operational level, with the Statistical Process Control, the Taguchi technique, and the "six sigma" approach. On the tactical level, we find a quality system approach, with a documented description of the procedures introduced in the firm. The management system can refer here to Quality Assurance, Total Productive Maintenance, or Management by Total Quality. The formalization through procedures of the rules of decision governing the process control enhances the validity of these rules. This leads to the enhancement of their reliability and to their consolidation. 
All this counterbalances the human, intrinsically fluctuating, behavior of the control operators. Strategic control by quality is then detailed, and the two main approaches, the continuous improvement approach and the proactive improvement approach, are introduced. Finally, the authors observe that at each of the three levels, the continuous process improvement, which is a component of Total Quality, becomes an essential preoccupation for the control. Ultimately, the recursive utilization of the Deming cycle remains the best practice for the control by quality.
What Is the Effect of Strength Training on Pain and Sleep in Patients With Fibromyalgia?
Andrade, Alexandro; Vilarino, Guilherme Torres; Bevilacqua, Guilherme Guimarães
2017-12-01
The study aimed to investigate the effect of an 8-wk structured strength training program on pain and sleep quality in patients with fibromyalgia. Fifty-two patients with fibromyalgia were evaluated; 31 submitted to strength training and 21 comprised the control group. The instruments used were the Fibromyalgia Impact Questionnaire and The Pittsburgh Sleep Quality Index. The questionnaires were applied before the first training session, at 12 sessions, and after 24 sessions. Descriptive statistics (mean, SD, and frequency) and inferential tests were used. After 8 wks of intervention, significant differences were found between groups in subjective quality of sleep (P = 0.03), sleep disturbance (P = 0.02), daytime dysfunction (P = 0.04), and total sleep score (P < 0.01). The correlation analysis using Spearman's test indicated a positive relationship between the variables of pain intensity and sleep quality (P < 0.01); when pain intensity increased in patients with fibromyalgia, sleep quality worsened. Strength training is safe and effective in treating people with fibromyalgia, and a significant decrease in sleep disturbances occurs after 8 wks of intervention.
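Spearman's test used in the study is simply the Pearson correlation computed on ranks. A standard-library sketch follows; the pain and PSQI scores are invented for illustration (higher PSQI means worse sleep):

```python
from statistics import mean

def rank(values):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average rank of the tied run
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    mx, my = mean(rx), mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical per-patient scores: pain intensity (0-10) vs. PSQI total
pain = [2, 4, 5, 6, 7, 8, 9]
psqi = [4, 6, 5, 9, 10, 12, 15]
rho = spearman(pain, psqi)
```

A rho near +1, as in this invented example, mirrors the study's finding that higher pain intensity goes with worse sleep quality.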
Niesink, A; Trappenburg, J C A; de Weert-van Oene, G H; Lammers, J W J; Verheij, T J M; Schrijvers, A J P
2007-11-01
Chronic disease management for patients with chronic obstructive pulmonary disease (COPD) may improve quality, outcomes and access to care. To investigate the effectiveness of chronic disease management programmes on the quality of life of people with COPD, Medline and Embase (1995-2005) were searched for relevant articles, and reference lists and abstracts were searched for controlled trials of chronic disease management programmes for patients with COPD. Quality of life was assessed as an outcome parameter. Two reviewers independently reviewed each paper for methodological quality and extracted the data. We found 10 randomized controlled trials comparing chronic disease management with routine care. Patient populations, health-care professionals, and the intensity and content of the interventions were heterogeneous. Different instruments were used to assess quality of life. Five out of 10 studies showed statistically significant positive outcomes on one or more domains of the quality of life instruments. Three studies, partly located in primary care, showed positive results. All chronic disease management projects for people with COPD involving primary care improved quality of life. In most of the studies, aspects of chronic disease management were applied only to a limited extent. The quality of the randomized controlled trials was not optimal. More research is needed on chronic disease management programmes in patients with COPD across primary and secondary care.
Grossman, R A
1995-09-01
The purpose of this study was to determine whether women can discriminate better from less effective paracervical block techniques applied to opposite sides of the cervix. If this discrimination could be made, it would be possible to compare different techniques and thus improve the quality of paracervical anesthesia. Two milliliters of local anesthetic was applied to one side and 6 ml to the other side of volunteers' cervices before cervical dilation. Statistical examination was by sequential analysis. The study was stopped after 47 subjects had entered, when sequential analysis found that there was no significant difference in women's perception of pain. Nine women reported more pain on the side with more anesthesia and eight reported more pain on the side with less anesthesia. Because the amount of anesthesia did not make a difference, the null hypothesis (that women cannot discriminate between different anesthetic techniques) was accepted. Women are not able to discriminate different doses of local anesthetic when applied to opposite sides of the cervix.
NASA Astrophysics Data System (ADS)
Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.
2014-12-01
An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in the literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the result of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data that can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.
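Separating uncertainty from variability, as mentioned above, is commonly done with a two-loop (nested) Monte Carlo: the outer loop samples true differences across units, the inner loop samples imperfect knowledge of each unit. A toy sketch with entirely invented distributions and arbitrary units:

```python
import random
from statistics import mean

random.seed(7)   # reproducible illustration

def emission_factor(plant_efficiency, fuel_carbon):
    """Toy LCA model (arbitrary units): emissions per kWh as
    fuel carbon content divided by plant efficiency."""
    return fuel_carbon / plant_efficiency

plant_means = []
for _ in range(200):                          # outer loop: plant-to-plant variability
    eff = random.uniform(0.30, 0.45)          # efficiency genuinely differs across plants
    draws = [emission_factor(eff, random.gauss(0.09, 0.005))
             for _ in range(100)]             # inner loop: measurement uncertainty
    plant_means.append(mean(draws))

fleet_low, fleet_high = min(plant_means), max(plant_means)
```

The spread of `plant_means` reflects real variability across the fleet, while the inner-loop spread around each mean reflects reducible uncertainty; reporting only a single fleet average would hide both, which is the limitation the abstract criticizes.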
Wyrzykowska, N; Czarnecka-Operacz, M; Adamski, Z
2015-01-01
Atopic dermatitis (AD) is an inflammatory, chronically relapsing and highly pruritic skin disorder that considerably affects patients' lives. The Dermatology Life Quality Index (DLQI) is often applied in clinical research to evaluate the impact of AD on the daily functioning of patients. The aim of the study was to evaluate the long-term effect of allergen-specific immunotherapy (ASIT) on the quality of life of AD patients. Fifteen patients suffering from AD, allergic to house dust mites or grass pollen allergens, who had previously been treated with ASIT, participated in the study. Treatment with allergy vaccinations was performed between 1995 and 2001. DLQI questionnaires were completed by the patients before the treatment, after termination of ASIT, and after 2-12 years of the observational period. Statistical tests revealed a significant difference between the DLQI before ASIT was introduced and after termination of ASIT; every individual answer except two (those describing the influence of the skin condition on work or study and on sexual life) also showed a statistically significant difference between these periods. Comparing the DLQI after ASIT with the current DLQI, the tests revealed no significant difference, overall or for any single answer of the questionnaire. With respect to improvement of quality of life in AD patients, this study confirms the effectiveness of ASIT and demonstrates that its results persist in the long term.
2013-01-01
Background We describe the setup of a neonatal quality improvement tool and list which peer-reviewed requirements it fulfils and which it does not. We report on the effects observed so far, how the units can identify quality improvement potential, and how they can measure the effect of changes made to improve quality. Methods Application of a prospective longitudinal national cohort data collection that uses algorithms to ensure high data quality (i.e. checks for completeness, plausibility and reliability), and to perform data imaging (Plsek's p-charts and standardized mortality or morbidity ratio SMR charts). The collected data allow monitoring of a study collective of very low birth-weight infants born from 2009 to 2011 by applying a quality cycle following the steps 'guideline - perform - falsify - reform'. Results 2025 VLBW live-births from 2009 to 2011, representing 96.1% of all VLBW live-births in Switzerland, display a similar mortality rate but better morbidity rates when compared with other networks. Data quality in general is high but subject to improvement in some units. Seven measurements display quality improvement potential in individual units. The methods used fulfil several international recommendations. Conclusions The Quality Cycle of the Swiss Neonatal Network is a helpful instrument to monitor and gradually help improve the quality of care in a region with high quality standards and low statistical discrimination capacity. PMID:24074151
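Plsek's p-charts referenced in the Methods track a proportion (e.g. infants with a given morbidity among admissions) against control limits that widen for small units, which matters when statistical discrimination capacity is low. A sketch with invented monthly counts:

```python
def p_chart_limits(event_counts, sample_sizes):
    """Centre line and per-sample 3-sigma limits for a p-chart.

    Limits vary with sample size, so small units get appropriately
    wider bands before a point is flagged as special-cause variation."""
    p_bar = sum(event_counts) / sum(sample_sizes)
    limits = []
    for n in sample_sizes:
        sigma = (p_bar * (1 - p_bar) / n) ** 0.5
        limits.append((max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)))
    return p_bar, limits

# Hypothetical monthly counts: infants with a morbidity / infants admitted
cases = [3, 5, 2, 8, 4, 6]
admitted = [40, 55, 38, 60, 45, 50]
p_bar, limits = p_chart_limits(cases, admitted)
```

A monthly proportion falling outside its own (lower, upper) band would prompt the 'falsify - reform' step of the quality cycle; proportions inside the band are treated as common-cause variation.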
Wu, Robert; Glen, Peter; Ramsay, Tim; Martel, Guillaume
2014-06-28
Observational studies dominate the surgical literature. Statistical adjustment is an important strategy to account for confounders in observational studies. Research has shown that published articles are often poor in statistical quality, which may jeopardize their conclusions. The Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines have been published to help establish standards for statistical reporting. This study will seek to determine whether the quality of statistical adjustment and the reporting of these methods are adequate in surgical observational studies. We hypothesize that incomplete reporting will be found in all surgical observational studies, and that the quality and reporting of these methods will be of lower quality in surgical journals when compared with medical journals. Finally, this work will seek to identify predictors of high-quality reporting. This work will examine the top five general surgical and medical journals, based on a 5-year impact factor (2007-2012). All observational studies investigating an intervention related to an essential component area of general surgery (defined by the American Board of Surgery), with an exposure, outcome, and comparator, will be included in this systematic review. Essential elements related to statistical reporting and quality were extracted from the SAMPL guidelines and include domains such as intent of analysis, primary analysis, multiple comparisons, numbers and descriptive statistics, association and correlation analyses, linear regression, logistic regression, Cox proportional hazard analysis, analysis of variance, survival analysis, propensity analysis, and independent and correlated analyses. Each article will be scored as a proportion based on fulfilling criteria in relevant analyses used in the study. A logistic regression model will be built to identify variables associated with high-quality reporting.
A comparison will be made between the scores of surgical observational studies published in medical versus surgical journals. Secondary outcomes will pertain to individual domains of analysis. Sensitivity analyses will be conducted. This study will explore the reporting and quality of statistical analyses in surgical observational studies published in the most referenced surgical and medical journals in 2013 and examine whether variables (including the type of journal) can predict high-quality reporting.
Statistical efficiency of adaptive algorithms.
Widrow, Bernard; Kamenetsky, Max
2003-01-01
The statistical efficiency of a learning algorithm applied to the adaptation of a given set of variable weights is defined as the ratio of the quality of the converged solution to the amount of data used in training the weights. Statistical efficiency is computed by averaging over an ensemble of learning experiences. A high quality solution is very close to optimal, while a low quality solution corresponds to noisy weights and less than optimal performance. In this work, two gradient descent adaptive algorithms are compared, the LMS algorithm and the LMS/Newton algorithm. LMS is simple and practical, and is used in many applications worldwide. LMS/Newton is based on Newton's method and the LMS algorithm. LMS/Newton is optimal in the least squares sense. It maximizes the quality of its adaptive solution while minimizing the use of training data. Many least squares adaptive algorithms have been devised over the years, but no other least squares algorithm can give better performance, on average, than LMS/Newton. LMS is easily implemented, but LMS/Newton, although of great mathematical interest, cannot be implemented in most practical applications. Because of its optimality, LMS/Newton serves as a benchmark for all least squares adaptive algorithms. The performances of LMS and LMS/Newton are compared, and it is found that under many circumstances, both algorithms provide equal performance. For example, when both algorithms are tested with statistically nonstationary input signals, their average performances are equal. When adapting with stationary input signals and with random initial conditions, their respective learning times are on average equal. However, under worst-case initial conditions, the learning time of LMS can be much greater than that of LMS/Newton, and this is the principal disadvantage of the LMS algorithm. But the strong points of LMS are ease of implementation and optimal performance under important practical conditions. 
For these reasons, the LMS algorithm has enjoyed very widespread application. It is used in almost every modem for channel equalization and echo cancelling. Furthermore, it is related to the famous backpropagation algorithm used for training neural networks.
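The LMS update described above is simple enough to sketch. The following toy system identification (the two-tap plant, step size mu, and signal lengths are illustrative assumptions, not the authors' experiments) shows the weights converging toward the unknown system:

```python
# Minimal LMS adaptive-filter sketch; plant and step size are made up.
import random

def lms_identify(x, d, n_taps, mu):
    """Adapt FIR weights w so that w . x[k] tracks the desired signal d[k]."""
    w = [0.0] * n_taps
    for k in range(n_taps, len(x)):
        tap_input = x[k - n_taps:k][::-1]          # most recent sample first
        y = sum(wi * xi for wi, xi in zip(w, tap_input))
        e = d[k] - y                               # instantaneous error
        w = [wi + 2 * mu * e * xi for wi, xi in zip(w, tap_input)]
    return w

random.seed(0)
x = [random.gauss(0, 1) for _ in range(5000)]
# Unknown noise-free plant: d[k] = 0.7*x[k-1] - 0.3*x[k-2]
d = [0.0, 0.0] + [0.7 * x[k - 1] - 0.3 * x[k - 2] for k in range(2, len(x))]
w = lms_identify(x, d, n_taps=2, mu=0.01)
```

With stationary white input and no measurement noise, the weights settle close to the true plant taps, illustrating the trade-off the abstract discusses: LMS is trivially simple yet statistically efficient under benign conditions.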
Blind image quality assessment based on aesthetic and statistical quality-aware features
NASA Astrophysics Data System (ADS)
Jenadeleh, Mohsen; Masaeli, Mohammad Masood; Moghaddam, Mohsen Ebrahimi
2017-07-01
The main goal of image quality assessment (IQA) methods is the emulation of human perceptual image quality judgments. Therefore, the correlation of the objective scores of these methods with human perceptual scores is considered their performance metric. Human judgment of image quality implicitly includes many factors, such as aesthetics, semantics, context, and various types of visual distortions. The main idea of this paper is to use a host of features commonly employed in image aesthetics assessment in order to improve the accuracy of blind image quality assessment (BIQA) methods. We propose an approach that enriches the features of BIQA methods by integrating a host of aesthetics image features with natural image statistics features derived from multiple domains. The proposed features have been used to augment five different state-of-the-art BIQA methods that use natural scene statistics features. Experiments were performed on seven benchmark image quality databases. The experimental results showed significant improvement in the accuracy of the methods.
Evaluation of different types of enamel conditioning before application of a fissure sealant.
Ciucchi, Philip; Neuhaus, Klaus W; Emerich, Marta; Peutzfeldt, Anne; Lussi, Adrian
2015-01-01
The aim of the study was to compare fissure sealant quality after mechanical conditioning with an erbium-doped yttrium aluminium garnet (Er:YAG) laser or air abrasion prior to chemical conditioning with phosphoric acid etching or a self-etch adhesive. Twenty-five permanent molars were initially divided into three groups: control group (n = 5), phosphoric acid etching; test group 1 (n = 10), air abrasion; and test group 2 (n = 10), Er:YAG laser. After mechanical conditioning, the test group teeth were sectioned buccolingually and the occlusal surface of one half tooth (equal to one sample) was acid etched, while a self-etch adhesive was applied to the other half. The fissure system of each sample was sealed, thermocycled and immersed in 5% methylene blue dye for 24 h. Each sample was sectioned buccolingually, and one slice was analysed microscopically. Using specialized software, the proportions of microleakage, unfilled margin, sealant failure and unfilled area were calculated. A nonparametric ANOVA model was applied to compare the Er:YAG treatment with air abrasion and the self-etch adhesive with phosphoric acid (α = 0.05). Test groups were compared with the control group using Wilcoxon rank sum tests (α = 0.05). The control group displayed significantly lower microleakage but higher unfilled area proportions than the Er:YAG laser + self-etch adhesive group, and displayed significantly higher unfilled margin and unfilled area proportions than the air-abrasion + self-etch adhesive group. There was no statistically significant difference in the quality of sealants applied in fissures treated with either Er:YAG laser or air abrasion prior to phosphoric acid etching, nor in the quality of sealants applied in fissures treated with either self-etch adhesive or phosphoric acid following Er:YAG or air-abrasion treatment.
Wagner, Richard J.; Frans, Lonna M.; Huffman, Raegan L.
2006-01-01
Water-quality samples were collected from sites in four irrigation return-flow drainage basins in the Columbia Basin Project from July 2002 through October 2004. Ten samples were collected throughout the irrigation season (generally April through October) and two samples were collected during the non-irrigation season. Samples were analyzed for temperature, pH, specific conductance, dissolved oxygen, major ions, trace elements, nutrients, and a suite of 107 pesticides and pesticide metabolites (pesticide transformation products) to document the occurrence, distribution, and transport of pesticides and pesticide metabolites. The four drainage basins vary in size from 19 to 710 square miles. The percentage of agricultural cropland ranges from about 35 percent in the Crab Creek drainage basin to a maximum of 75 percent in the Lind Coulee drainage basin. More than 95 percent of cropland in the Red Rock Coulee, Crab Creek, and Sand Hollow drainage basins is irrigated, whereas only 30 percent of cropland in Lind Coulee is irrigated. Forty-two pesticides and five metabolites were detected in samples from the four irrigation return-flow drainage basins. The most compounds were detected in samples from Sand Hollow (37), followed by Lind Coulee (33), Red Rock Coulee (30), and Crab Creek (28). Herbicides were the most frequently detected pesticides, followed by insecticides, metabolites, and fungicides. Atrazine, bentazon, diuron, and 2,4-D were the most frequently detected herbicides, and chlorpyrifos and azinphos-methyl were the most frequently detected insecticides. A statistical comparison of pesticide concentrations in surface-water samples collected in the mid-1990s at Crab Creek and Sand Hollow with those collected in this study showed a statistically significant increase in concentrations of diuron and statistically significant decreases in concentrations of ethoprophos and atrazine in Crab Creek.
Statistically significant increases were found in concentrations of bromacil, diuron, and pendimethalin at Sand Hollow, and statistically significant decreases were found in concentrations of 2,6-diethylaniline, alachlor, atrazine, DCPA, and EPTC. A seasonal Kendall trend test on data from Lind Coulee indicated no statistically significant trends for any pesticide for 1994 through 2004. A comparison of pesticide concentrations detected in this study with those detected in previous U.S. Geological Survey National Water-Quality Assessment studies of the Central Columbia Plateau, Yakima River basin, and national agricultural studies indicated that concentrations in this study generally were in the middle to lower end of the concentration spectrum for the most frequently detected herbicides and insecticides, but that the overall rate of detection was near the high end. Thirty-one of the 42 herbicides, insecticides, and fungicides detected in surface-water samples were applied to the major agricultural crops in the drainage basins, and 11 of the detected pesticides are sold for residential application. Eight of the pesticides detected in surface-water samples were not reported as having any agricultural or residential use. The overall pattern of pesticide use depends on which crops are grown in each drainage basin. Drainage basins with predominantly more orchards have higher amounts of insecticides applied, whereas basins with larger percentages of field crops tend to have more herbicides applied. Pesticide usage was most similar in Crab Creek and Sand Hollow, where the largest total amounts applied were of the insecticides azinphos-methyl, carbaryl, and chlorpyrifos and the herbicide EPTC. In the Red Rock Coulee basin, DCPA was the most heavily applied herbicide, followed by the fungicide chlorothalonil, the herbicide EPTC, and the insecticides chlorpyrifos and azinphos-methyl.
In Lind Coulee, which has a large percentage of dryland agricultural area, the herbicides 2,4-D and EPTC were applied in the largest amounts, followed by the fungicide chlorothalonil.
A Pragmatic Smoothing Method for Improving the Quality of the Results in Atomic Spectroscopy
NASA Astrophysics Data System (ADS)
Bennun, Leonardo
2017-07-01
A new smoothing method is presented that improves the identification and quantification of spectral functions based on prior knowledge of the signals that are expected to be quantified. These signals are used as weighted coefficients in the smoothing algorithm. This smoothing method was conceived to be applied in atomic and nuclear spectroscopies, preferably in techniques where net counts are proportional to acquisition time, such as particle-induced X-ray emission (PIXE) and other X-ray fluorescence spectroscopic methods. This algorithm, when properly applied, distorts neither the form nor the intensity of the signal, so it is well suited for all kinds of spectroscopic techniques. The method is extremely effective at reducing high-frequency noise in the signal, much more so than a single rectangular smooth of the same width. As with all smoothing techniques, the proposed method improves the precision of the results, but in this case we also found a systematic improvement in the accuracy of the results. We still have to evaluate the improvement in the quality of the results when this method is applied to real experimental results. We expect better characterization of the net area quantification of the peaks, and smaller detection and quantification limits. We have applied this method to signals that obey Poisson statistics, but with the same ideas and criteria it could be applied to time series. In the general case, when this algorithm is applied to experimental results, it would also be required that the sought characteristic functions, needed for this weighted smoothing method, be obtained from a system with strong stability. If the sought signals are not perfectly clean, this method should be applied carefully.
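The core idea, using the expected signal shape as the smoothing weights, can be sketched as follows. The Gaussian peak profile, kernel width, and counts below are illustrative assumptions, not the authors' exact algorithm:

```python
# Hedged sketch of signal-weighted smoothing: a known expected peak shape
# supplies the kernel weights. Shapes and data are illustrative.
import math

def expected_peak(width, sigma):
    """Discrete Gaussian profile used as smoothing weights."""
    half = width // 2
    w = [math.exp(-0.5 * (i / sigma) ** 2) for i in range(-half, half + 1)]
    s = sum(w)
    return [wi / s for wi in w]          # normalise so total counts are preserved

def weighted_smooth(counts, weights):
    half = len(weights) // 2
    out = []
    for i in range(len(counts)):
        acc = 0.0
        for j, wj in enumerate(weights):
            k = min(max(i + j - half, 0), len(counts) - 1)   # clamp at edges
            acc += wj * counts[k]
        out.append(acc)
    return out

kernel = expected_peak(width=5, sigma=1.0)
noisy = [10, 12, 9, 40, 80, 42, 11, 10, 9, 11]   # Poisson-like counts with a peak
smoothed = weighted_smooth(noisy, kernel)
```

Because the weights mirror the expected peak, the smoothed spectrum keeps the peak position while suppressing channel-to-channel noise, unlike a flat rectangular smooth of the same width, which flattens the peak more aggressively.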
[The effect of vegetarian diet on selected biochemical and blood morphology parameters].
Nazarewicz, Rafał
2007-01-01
The objective was to examine whether a vegetarian diet influences biochemical parameters of blood and plasma urea in a selected vegetarian group. The investigation covered 41 subjects, 22 of whom had been following a vegetarian diet and 19 of whom were omnivorous. The study shows statistically significantly lower values of white blood cells and of the percentage and count of neutrophils, and insignificantly lower levels of red blood cells, hemoglobin, hematocrit and platelets in the vegetarian group. A significantly lower plasma urea level was also observed in that group. These changes indicate a deficiency of high-quality protein due to the vegetarian diet.
Statistical auditing of toxicology reports.
Deaton, R R; Obenchain, R L
1994-06-01
Statistical auditing is a new report review process used by the quality assurance unit at Eli Lilly and Co. Statistical auditing allows the auditor to review the process by which the report was generated, as opposed to the process by which the data were generated. We have the flexibility to use different sampling techniques and still obtain thorough coverage of the report data. By properly implementing our auditing process, we can work smarter rather than harder and continue to help our customers increase the quality of their products (reports). Statistical auditing is helping our quality assurance unit meet our customers' needs, while maintaining or increasing the quality of our regulatory obligations.
The Computer Student Worksheet Based Mathematical Literacy for Statistics
NASA Astrophysics Data System (ADS)
Manoy, J. T.; Indarasati, N. A.
2018-01-01
The student worksheet is a teaching medium that can improve teaching activity in the classroom. Mathematical literacy indicators included in a student worksheet can help students apply concepts in daily life, and the use of computers in learning makes learning environmentally friendly. This research used the Thiagarajan Four-D developmental design, which consists of four stages: define, design, develop, and disseminate; this research was completed through the third stage, develop. A student worksheet achieves good quality if it fulfils three aspects: validity, practicality, and effectiveness. The subjects in this research were grade-eleven students of the 5th Mathematics and Natural Sciences class at The 1st State Senior High School of Driyorejo, Gresik. The computer-based student worksheet for mathematical literacy in statistics achieved good quality on all three aspects: validity with an average of 3.79 (94.72%), practicality with an average of 2.85 (71.43%), and effectiveness with 94.74% classical student completeness and a 75% positive student response.
Clustering and Flow Conservation Monitoring Tool for Software Defined Networks
Puente Fernández, Jesús Antonio
2018-01-01
Prediction systems face challenges on two fronts: the relation between video quality and observed session features, and dynamic changes in video quality. Software Defined Networking (SDN) is a new network architecture concept that separates the control plane (controller) from the data plane (switches) in network devices. Thanks to the southbound interface, it is possible to deploy monitoring tools that obtain the network status and retrieve a collection of statistics. Obtaining the most accurate statistics therefore depends on the strategy for monitoring and requesting information from network devices. In this paper, we propose an enhanced algorithm for requesting statistics to measure traffic flow in SDN networks. The algorithm groups network switches into clusters according to their number of ports and applies different monitoring techniques to each cluster. This grouping avoids monitoring queries to network switches with common characteristics and thus omits redundant information. In this way, the proposal decreases the number of monitoring queries to switches, improving network traffic and preventing switch overload. We have tested our optimization in a video streaming simulation using different types of videos. The experiments and a comparison with traditional monitoring techniques demonstrate the feasibility of our proposal, which maintains similar accuracy while decreasing the number of queries to the switches. PMID:29614049
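The grouping idea can be sketched minimally: cluster switches by port count and poll one representative per cluster instead of every switch. The switch inventory below is a hard-coded hypothetical; a real deployment would pull port counts from the SDN controller's API:

```python
# Toy sketch of clustering switches by port count; data are hypothetical.
from collections import defaultdict

def cluster_by_ports(switches):
    """switches: dict of switch id -> number of ports."""
    clusters = defaultdict(list)
    for sw, n_ports in switches.items():
        clusters[n_ports].append(sw)
    return dict(clusters)

def representatives(clusters):
    """One monitoring query per cluster instead of one per switch."""
    return [sorted(members)[0] for members in clusters.values()]

switches = {"s1": 4, "s2": 4, "s3": 8, "s4": 4, "s5": 8}
clusters = cluster_by_ports(switches)
reps = representatives(clusters)
# Five switches collapse to two monitoring queries, one per cluster.
```

This is only the grouping step; the paper's algorithm additionally varies the monitoring technique per cluster, which is not modeled here.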
Context dependent anti-aliasing image reconstruction
NASA Technical Reports Server (NTRS)
Beaudet, Paul R.; Hunt, A.; Arlia, N.
1989-01-01
Image reconstruction has been mostly confined to context-free linear processes; the traditional continuum interpretation of digital array data uses a linear interpolator with or without an enhancement filter. Here, anti-aliasing context-dependent interpretation techniques are investigated for image reconstruction. Pattern classification is applied to each neighborhood to assign it a context class; a different interpolation/filter is applied to neighborhoods of differing context. It is shown how the context-dependent interpolation is computed through ensemble average statistics using high-resolution training imagery from which the lower resolution image array data are obtained (by simulation). A quadratic least squares (LS) context-free image quality model is described from which the context-dependent interpolation coefficients are derived. It is shown how ensembles of high-resolution images can be used to capture the a priori spatial character of different context classes. As a consequence, a priori information such as the translational invariance of edges along the edge direction, edge discontinuity, and the character of corners is captured and can be used to interpret image array data with greater spatial resolution than would be expected from the Nyquist limit. A Gibbs-like artifact associated with this super-resolution is discussed. More realistic context-dependent image quality models are needed, and a suggestion is made for using a quality model that is now finding application in data compression.
Demir Zencirci, Ayten; Arslan, Sümeyye
2011-01-01
Aim To assess the relationship between sleep quality and demographic variables, morning-evening type, and burnout in nurses who work shifts. Methods We carried out a cross-sectional self-administered study with forced-choice and open-ended structured questionnaires – the Pittsburgh Sleep Quality Index, Morningness-eveningness Questionnaire, and Maslach Burnout Inventory. The study was carried out at Gazi University Medicine Faculty Hospital of Ankara on 524 invited nurses from July to September 2008, with a response rate of 89.94% (n = 483). Descriptive and inferential statistics were applied to determine the risk factors for poor sleep quality. Results Most socio-demographic variables did not affect sleep quality. Participants with poor sleep quality had quite high burnout levels. Most nurses who belonged to a type that is neither morning nor evening had poor sleep quality. Nurses who experienced an incident worsening their sleep patterns (P < 0.001) or needlestick or sharp-object injuries (P = 0.010) in the last month had poor sleep quality. In the models created for the effects of the burnout dimensions, the subjective sleep quality and sleep latency scores of evening types were high. Conclusions Nurses working consistently either in the morning or at night had better sleep quality than those working rotating shifts. Further studies are still needed to develop interventions that improve sleep quality and decrease burnout in nurses working shifts. PMID:21853548
Qin, Kunming; Wang, Bin; Li, Weidong; Cai, Hao; Chen, Danni; Liu, Xiao; Yin, Fangzhou; Cai, Baochang
2015-05-01
In traditional Chinese medicine, raw and processed herbs are used to treat different diseases. Suitable quality assessment methods are crucial for the discrimination between raw and processed herbs. The dried fruits of Arctium lappa L. and their processed products are widely used in traditional Chinese medicine, yet their therapeutic effects are different. In this study, a novel strategy using high-performance liquid chromatography and diode array detection coupled with multivariate statistical analysis to rapidly discriminate between raw and processed Arctium lappa L. was proposed and validated. Four main components in a total of 30 batches of raw and processed Fructus Arctii samples were analyzed, and ten characteristic peaks were identified in the fingerprint common pattern. Furthermore, similarity evaluation, principal component analysis, and hierarchical cluster analysis were applied to demonstrate the distinction. The results suggested that the relative amounts of the chemical components of raw and processed Fructus Arctii samples are different. This new method has been successfully applied to detect raw and processed Fructus Arctii in marketed herbal medicinal products. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Bladder cancer among hairdressers: a meta-analysis
Schablon, Anja; Schedlbauer, Grita; Dulon, Madeleine; Nienhaus, Albert
2010-01-01
Background Occupational risks for bladder cancer in hairdressers by using hair products have been examined in many epidemiological studies. But owing to small sample sizes of the studies and the resulting lack of statistical power, the results of these studies have been inconsistent and significant associations have rarely been found. Methods We conducted a meta-analysis to determine summary risk ratios (SRRs) for the risk of bladder cancer among hairdressers. Studies were identified by a MEDLINE, EMBASE, CENTRAL search and by the reference lists of articles/relevant reviews. Statistical tests for publication bias and for heterogeneity as well as sensitivity analysis were applied. In addition, the study quality and the risk of bias were assessed using six criteria. Results 42 studies were included and statistically significantly increased risks around 1.3–1.7 were found for all but one analysis. The SRR increased with duration of employment from 1.30 (95% CI 1.15 to 1.48) for ‘ever registered as hairdresser’ to 1.70 (95% CI 1.01 to 2.88) for ‘job held ≥10 years’. No difference was found between the risk for smoking-adjusted data (SRR 1.35, 95% CI 1.13 to 1.61) and no adjustment (SRR 1.33, 95% CI 1.18 to 1.50). Studies assessed as being of high quality (n=11) and of moderate quality (n=31) showed similar SRRs. There was no evidence of publication bias or heterogeneity in all analyses. Conclusion In summary, our results showed an increased and statistically significant risk for bladder cancer among hairdressers, in particular for hairdressers in jobs held ≥10 years. Residual confounding by smoking cannot be totally ruled out. Because of the long latency times of bladder cancer it remains an open question whether hairdressers working prior to 1980 and after 1980, when some aromatic amines were banned as hair dye ingredients, have the same risk for bladder cancer. PMID:20447989
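Summary risk ratios of the kind reported above typically come from inverse-variance pooling of log risk ratios. A hedged sketch with invented inputs (not the study's data), including a DerSimonian-Laird random-effects step:

```python
# Illustrative inverse-variance pooling of risk ratios; numbers are made up.
import math

def pooled_srr(rrs, ci_los, ci_his):
    """Pool risk ratios given their 95% CIs; return (fixed, random-effects) SRR."""
    logs = [math.log(r) for r in rrs]
    # Standard errors recovered from the 95% CI widths on the log scale
    ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96)
           for lo, hi in zip(ci_los, ci_his)]
    w = [1 / se ** 2 for se in ses]
    fixed = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    # DerSimonian-Laird between-study variance tau^2
    q = sum(wi * (li - fixed) ** 2 for wi, li in zip(w, logs))
    df = len(rrs) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    w_re = [1 / (se ** 2 + tau2) for se in ses]
    random_eff = sum(wi * li for wi, li in zip(w_re, logs)) / sum(w_re)
    return math.exp(fixed), math.exp(random_eff)

fixed, random_eff = pooled_srr([1.2, 1.5, 1.4],
                               [0.9, 1.1, 1.0],
                               [1.6, 2.05, 1.96])
```

When the heterogeneity statistic Q does not exceed its degrees of freedom, tau^2 is truncated to zero and the random-effects estimate coincides with the fixed-effect one, which is why homogeneous meta-analyses like the one described report a single SRR.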
Rushton, Paul R P; Grevitt, Michael P
2013-04-20
Review and statistical analysis of studies evaluating the effect of surgery on the health-related quality of life of adolescents with adolescent idiopathic scoliosis, using Scoliosis Research Society (SRS) outcomes. Published minimum clinically important difference (MCID) values for the SRS22r questionnaire were applied to the literature to identify which areas of health-related quality of life are consistently affected by surgery and whether changes are clinically meaningful. The interpretation of published studies using the SRS outcomes has been limited by the lack of MCID values for the questionnaire domains. The recent publication of these data allows the clinical importance of any changes in these studies to be examined for the first time. A literature search was undertaken to locate suitable studies, which were then analyzed. Statistically significant differences from baseline to 2 years postoperatively were ascertained by narratively reporting the analyses within included studies. When possible, clinically significant changes were assessed using 95% confidence intervals for the change in mean domain score. If the lower bound of the confidence interval for the change exceeded the MCID for that domain, the change was considered clinically significant. The number of cohorts available for the different analyses varied (5-16). Eighty-one percent and 94% of included cohorts experienced statistically significant improvements in the pain and self-image domains, respectively. In terms of clinical significance, only self-image regularly improved by more than the MCID, doing so in 4 of 5 included cohorts (80%) compared with 1 of 12 cohorts (8%) for pain. No clinically relevant changes occurred in the mental health or activity domains. Evidence suggests that surgery can lead to clinically important improvement in patient self-image. Surgeons and patients should be aware of the limited evidence for improvements in domains other than self-image after surgery.
Surgical decision-making will also be influenced by the natural history of adolescent idiopathic scoliosis.
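The decision rule described, namely that a change is clinically significant when the lower 95% CI bound of the mean change exceeds the domain MCID, can be sketched directly. The mean changes, SD, sample size, and MCID value below are illustrative assumptions, not values from the review:

```python
# Sketch of the lower-CI-bound-versus-MCID rule; all numbers are hypothetical.
import math

def clinically_significant(mean_change, sd_change, n, mcid):
    """True if the lower bound of the 95% CI for the mean change exceeds the MCID."""
    se = sd_change / math.sqrt(n)
    lower = mean_change - 1.96 * se
    return lower > mcid

# Hypothetical self-image domain cohort, with an assumed MCID of 0.98
large_change = clinically_significant(mean_change=1.4, sd_change=0.9, n=50, mcid=0.98)
small_change = clinically_significant(mean_change=1.1, sd_change=0.9, n=50, mcid=0.98)
```

Note how a change can be statistically significant (CI excludes zero) yet fail this test, which is exactly the distinction the review draws between the pain and self-image domains.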
A full year evaluation of the CALIOPE-EU air quality modeling system over Europe for 2004
NASA Astrophysics Data System (ADS)
Pay, M. T.; Piot, M.; Jorba, O.; Gassó, S.; Gonçalves, M.; Basart, S.; Dabdub, D.; Jiménez-Guerrero, P.; Baldasano, J. M.
The CALIOPE-EU high-resolution air quality modeling system, namely WRF-ARW/HERMES-EMEP/CMAQ/BSC-DREAM8b, is developed and applied to Europe (12 km × 12 km, 1 h). The model performances are tested in terms of air quality levels and dynamics reproducibility on a yearly basis. The present work describes a quantitative evaluation of gas phase species (O3, NO2 and SO2) and particulate matter (PM2.5 and PM10) against ground-based measurements from the EMEP (European Monitoring and Evaluation Programme) network for the year 2004. The evaluation is based on statistics. Simulated O3 achieves satisfactory performances for both daily mean and daily maximum concentrations, especially in summer, with annual mean correlations of 0.66 and 0.69, respectively. Mean normalized errors are comprised within the recommendations proposed by the United States Environmental Protection Agency (US-EPA). The general trends and daily variations of primary pollutants (NO2 and SO2) are satisfactory. Daily mean concentrations of NO2 correlate well with observations (annual correlation r = 0.67) but tend to be underestimated. For SO2, mean concentrations are well simulated (mean bias = 0.5 μg m-3) with relatively high annual mean correlation (r = 0.60), although peaks are generally overestimated. The dynamics of PM2.5 and PM10 is well reproduced (0.49 < r < 0.62), but mean concentrations remain systematically underestimated. Deficiencies in particulate matter source characterization are discussed. Also, the spatially distributed statistics and the general patterns for each pollutant over Europe are examined. The model performances are compared with other European studies. While O3 statistics generally remain lower than those obtained by the other considered studies, statistics for NO2, SO2, PM2.5 and PM10 present higher scores than most models.
Statistical Literacy among Applied Linguists and Second Language Acquisition Researchers
ERIC Educational Resources Information Center
Loewen, Shawn; Lavolette, Elizabeth; Spino, Le Anne; Papi, Mostafa; Schmidtke, Jens; Sterling, Scott; Wolff, Dominik
2014-01-01
The importance of statistical knowledge in applied linguistics and second language acquisition (SLA) research has been emphasized in recent publications. However, the last investigation of the statistical literacy of applied linguists occurred more than 25 years ago (Lazaraton, Riggenbach, & Ediger, 1987). The current study undertook a partial…
Using statistical process control to make data-based clinical decisions.
Pfadt, A; Wheeler, D J
1995-01-01
Applied behavior analysis is based on an investigation of variability due to interrelationships among antecedents, behavior, and consequences. This permits testable hypotheses about the causes of behavior and allows the course of treatment to be evaluated empirically. Such information provides corrective feedback for making data-based clinical decisions. This paper considers how a different approach to the analysis of variability, based on the writings of Walter Shewhart and W. Edwards Deming in the area of industrial quality control, helps to achieve similar objectives. Statistical process control (SPC) was developed to implement a process of continual product improvement while achieving compliance with production standards and other requirements for promoting customer satisfaction. SPC involves the use of simple statistical tools, such as histograms and control charts, as well as problem-solving techniques, such as flow charts, cause-and-effect diagrams, and Pareto charts, to implement Deming's management philosophy. These data-analytic procedures can be incorporated into a human service organization to help achieve its stated objectives in a manner that leads to continuous improvement in the functioning of the clients who are its customers. Examples are provided to illustrate how SPC procedures can be used to analyze behavioral data. Issues related to the application of these tools for making data-based clinical decisions and for creating an organizational climate that promotes their routine use in applied settings are also considered.
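One of the simplest control charts mentioned, the individuals (XmR) chart of the Shewhart type, can be sketched on a hypothetical series of daily behavior counts (the data below are invented for illustration):

```python
# Sketch of an individuals (XmR) control chart; the counts are hypothetical.
def xmr_limits(values):
    """Natural process limits from the average moving range (XmR chart)."""
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    x_bar = sum(values) / len(values)
    # 2.66 = 3 / d2 for subgroups of size 2 (standard XmR constant)
    return x_bar - 2.66 * mr_bar, x_bar, x_bar + 2.66 * mr_bar

counts = [7, 9, 8, 10, 7, 9, 8, 21, 9, 8]   # day 8 looks like a special cause
lcl, center, ucl = xmr_limits(counts)
signals = [i for i, v in enumerate(counts) if v < lcl or v > ucl]
```

A point outside the limits (here the spike on day 8) signals special-cause variation worth a clinical investigation, while points within the limits reflect the common-cause variability of the process, exactly the distinction the paper borrows from industrial quality control.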
NASA Astrophysics Data System (ADS)
Jin, Zhenyu; Lin, Jing; Liu, Zhong
2008-07-01
Based on a study of the classical testing techniques (such as the Shack-Hartmann wave-front sensor) adopted for testing the aberrations of ground-based astronomical optical telescopes, we put forward two testing methods founded on high-resolution image reconstruction technology: one based on the averaged short-exposure OTF, and the other on the speckle interferometric transfer function (SITF) of Antoine Labeyrie. Research by J. Ohtsubo, F. Roddier, Richard Barakat, and J.-Y. Zhang indicated that SITF statistics are affected by the telescope's optical aberrations; that is, the SITF statistics are a function of the optical system aberrations and the atmospheric Fried parameter (seeing). Diffraction-limited information about the telescope can be obtained through two statistical treatments of abundant speckle images: with the first method, we can extract low-frequency information, such as the full width at half maximum (FWHM) of the telescope PSF, to estimate the optical quality; with the second, we can obtain a more precise description of the telescope PSF that includes high-frequency information. We will apply the two testing methods to the 2.4 m optical telescope of the GMG Observatory in China to validate their repeatability and correctness, and will compare the results with those obtained with the Shack-Hartmann wave-front sensor. This is described in detail in our paper.
Vavken, P; Culen, G; Dorotka, R
2008-01-01
The demand to routinely apply evidence-based methods in orthopedic surgery increases steadily. In order to do so, however, the validity and reliability of the "evidence" have to be scrutinized. The objective of this study was to assess the quality of the most recent orthopedic evidence and to determine variables that influence quality. All 2006 controlled trials from orthopedic journals with high impact factors were analysed in a cross-sectional study. A score based on the CONSORT statement was used to assess study quality. Selected variables were tested for their influence on study quality. Two independent blinded observers reviewed 126 studies. The overall quality was moderate to high. The most neglected parameters were power analysis, intention-to-treat analysis, and concealment. The participation of a methodologically trained investigator increased study quality significantly. There was no difference in study quality regardless of whether or not there was a statistically significant result. Using our quality score, we were able to show fairly good results for recent orthopedic studies. The most frequently neglected issues in orthopedic research are blinding, power analysis, and intention-to-treat analysis. This may distort the results of clinical investigations considerably; in particular, lack of concealment causes false-positive findings. Our data furthermore show that participation of a methodologist significantly increases the quality of a study and consequently strengthens the reliability of its results.
The Quality of Rare Disease Registries: Evaluation and Characterization.
Coi, Alessio; Santoro, Michele; Villaverde-Hueso, Ana; Lipucci Di Paola, Michele; Gainotti, Sabina; Taruscio, Domenica; Posada de la Paz, Manuel; Bianchi, Fabrizio
2016-01-01
The focus on the quality of the procedures for data collection, storing, and analysis in the definition and implementation of a rare disease registry (RDR) is the basis for developing a valid and long-term sustainable tool. The aim of this study was to provide useful information for characterizing a quality profile for RDRs using an analytical approach applied to RDRs participating in the European Platform for Rare Disease Registries 2011-2014 (EPIRARE) survey. An indicator of quality was defined by choosing a small set of quality-related variables derived from the survey. The random forest method was used to identify the variables best defining a quality profile for RDRs. Fisher's exact test was employed to assess the association with the indicator of quality, and the Cochran-Armitage test was used to check the presence of a linear trend along different levels of quality. The set of variables found to characterize high-quality RDRs focused on ethical and legal issues, governance, communication of activities and results, established procedures to regulate access to data and security, and established plans to ensure long-term sustainability. The quality of RDRs is usually associated with a good oversight and governance mechanism and with durable funding. The results suggest that RDRs would benefit from support in management, information technology, epidemiology, and statistics. © 2016 S. Karger AG, Basel.
Identification of Water Bodies in a Landsat 8 OLI Image Using a J48 Decision Tree.
Acharya, Tri Dev; Lee, Dong Ha; Yang, In Tae; Lee, Jae Kang
2016-07-12
Water bodies are essential to humans and other forms of life. Identification of water bodies can be useful in various ways, including estimation of water availability, demarcation of flooded regions, change detection, and so on. In past decades, Landsat satellite sensors have been used for land use classification and water body identification. With the introduction of the new Operational Land Imager (OLI) sensor on Landsat 8, which offers higher spectral resolution and an improved signal-to-noise ratio, the quality of imagery sensed by Landsat 8 has improved, enabling better characterization of land cover, though at an increased data size. Therefore, it is necessary to explore the most appropriate and practical water identification methods that take advantage of the improved image quality and use the fewest inputs based on the original OLI bands. The objective of the study is to explore the potential of a J48 decision tree (JDT) in identifying water bodies using reflectance bands from Landsat 8 OLI imagery. J48 is an open-source decision tree algorithm. The test site for the study is in the Northern Han River Basin, which is located in Gangwon province, Korea. Training data with individual bands were used to develop the JDT model, which was later applied to the whole study area. The performance of the model was statistically analysed using the kappa statistic and the area under the curve (AUC). The results were compared with five other known water identification methods using a confusion matrix and related statistics. Almost all the methods showed high accuracy, and the JDT was successfully applied to the OLI image using only four bands, among which the new additional deep-blue band of OLI was found to have the third-highest information gain. Thus, the JDT can be a good method for water body identification based on images with improved resolution and increased size.
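The kappa statistic used to evaluate the classifier can be computed directly from a confusion matrix. A minimal pure-Python version follows; the water/non-water counts are hypothetical, not the study's results.

```python
def cohen_kappa(confusion):
    """Cohen's kappa from a square confusion matrix (list of lists);
    rows are reference classes, columns are predicted classes."""
    classes = range(len(confusion))
    n = sum(sum(row) for row in confusion)
    observed = sum(confusion[i][i] for i in classes) / n
    expected = sum(
        (sum(confusion[i]) / n) * (sum(row[i] for row in confusion) / n)
        for i in classes
    )
    return (observed - expected) / (1 - expected)

# Hypothetical water / non-water confusion matrix for an OLI scene.
cm = [[90, 10],    # reference water:     90 detected, 10 missed
      [5, 895]]    # reference non-water:  5 false alarms, 895 correct
print(round(cohen_kappa(cm), 3))
```

Kappa discounts chance agreement, which matters here because non-water pixels dominate a typical scene and would inflate raw accuracy.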
Workplace stress in nursing workers from an emergency hospital: Job Stress Scale analysis.
Urbanetto, Janete de Souza; da Silva, Priscila Costa; Hoffmeister, Eveline; de Negri, Bianca Souza; da Costa, Bartira Ercília Pinheiro; Poli de Figueiredo, Carlos Eduardo
2011-01-01
This study identifies workplace stress according to the Job Stress Scale and associates it with socio-demographic and occupational variables of nursing workers from an emergency hospital. This is a cross-sectional study; data were collected through a questionnaire applied to 388 nursing professionals. Descriptive statistics were applied, and univariate and multivariate analyses were performed. The results indicate significant associations with being a nursing technician or auxiliary, with working in the position for more than 15 years, and with having low social support, corresponding respectively to 3.84, 2.25 and 4.79 times greater odds of being placed in the 'high strain job' quadrant. The study reveals that aspects related to the workplace should be monitored by competent agencies in order to improve the quality of life of nursing workers.
Fast neutron-gamma discrimination on neutron emission profile measurement on JT-60U.
Ishii, K; Shinohara, K; Ishikawa, M; Baba, M; Isobe, M; Okamoto, A; Kitajima, S; Sasao, M
2010-10-01
A digital signal processing (DSP) system is applied to stilbene scintillation detectors of the multichannel neutron emission profile monitor in JT-60U. Automatic analysis of the neutron-γ pulse shape discrimination is a key issue to diminish the processing time in the DSP system, and it has been applied using the two-dimensional (2D) map. Linear discriminant function is used to determine the dividing line between neutron events and γ-ray events on a 2D map. In order to verify the validity of the dividing line determination, the pulse shape discrimination quality is evaluated. As a result, the γ-ray contamination in most of the beam heating phase was negligible compared with the statistical error with 10 ms time resolution.
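Determining the dividing line with a linear discriminant function, as described above, can be sketched with Fisher's linear discriminant on a 2D pulse-shape map. The two features, class centers, and noise level below are hypothetical stand-ins, not JT-60U calibration values.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 2D pulse-shape features (total charge, tail fraction)
# for neutron and gamma-ray events.
neutrons = rng.normal([1.0, 0.40], 0.05, size=(500, 2))
gammas = rng.normal([1.0, 0.20], 0.05, size=(500, 2))

# Fisher's linear discriminant: w = Sw^-1 (m1 - m2); the dividing line
# w . x = c passes through the midpoint of the projected class means.
m1, m2 = neutrons.mean(axis=0), gammas.mean(axis=0)
Sw = np.cov(neutrons.T) + np.cov(gammas.T)
w = np.linalg.solve(Sw, m1 - m2)
c = w @ (m1 + m2) / 2

def is_neutron(event):
    """Classify an event by which side of the dividing line it falls on."""
    return bool(event @ w > c)

print(is_neutron(np.array([1.0, 0.42])), is_neutron(np.array([1.0, 0.18])))
```

Once w and c are fixed from calibration data, classifying each event is a single dot product and comparison, which is what makes the approach fast enough for automated DSP analysis.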
Fast neutron-gamma discrimination on neutron emission profile measurement on JT-60U
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ishii, K.; Okamoto, A.; Kitajima, S.
2010-10-15
A digital signal processing (DSP) system is applied to stilbene scintillation detectors of the multichannel neutron emission profile monitor in JT-60U. Automatic analysis of the neutron-γ pulse shape discrimination is a key issue to diminish the processing time in the DSP system, and it has been applied using the two-dimensional (2D) map. Linear discriminant function is used to determine the dividing line between neutron events and γ-ray events on a 2D map. In order to verify the validity of the dividing line determination, the pulse shape discrimination quality is evaluated. As a result, the γ-ray contamination in most of the beam heating phase was negligible compared with the statistical error with 10 ms time resolution.
Yinon, Lital; Thurston, George
2018-01-01
Background The statistical association between increased exposure to air pollution and increased risk of morbidity and mortality is well established. However, documentation of the health benefits of lowering air pollution levels, which would support the biological plausibility of those past statistical associations, is not as well developed. A better understanding of the aftereffects of interventions to reduce air pollution is needed in order to: 1) better document the benefits of lowered air pollution; and 2) identify the types of reductions that most effectively provide health benefits. Methods This study analyzes daily health and pollution data from three major cities in Israel that have undergone pollution control interventions to reduce sulfur emissions from combustion sources. In this work, the hypothesis tested is that transitions to cleaner fuels are accompanied by a decreased risk of daily cardiovascular and respiratory mortality. Interrupted time series regression models are applied in order to test whether the cleaner air interventions are associated with a statistically significant reduction in mortality. Results In the multi-city meta-analysis we found statistically significant reductions of 13.3% [CI −21.9%, −3.8%] in cardiovascular mortality, and a borderline significant (p=0.06) reduction of 19.0% [CI −35.1%, 1.1%] in total mortality. Conclusions Overall, new experiential evidence is provided consistent with human health benefits being associated with interventions to reduce air pollution. The methods employed also provide an approach that may be applied elsewhere in the future to better document and optimize the health benefits of clean air interventions. PMID:28237065
Yinon, Lital; Thurston, George
2017-05-01
The statistical association between increased exposure to air pollution and increased risk of morbidity and mortality is well established. However, documentation of the health benefits of lowering air pollution levels, which would support the biological plausibility of those past statistical associations, is not as well developed. A better understanding of the aftereffects of interventions to reduce air pollution is needed in order to: 1) better document the benefits of lowered air pollution; and 2) identify the types of reductions that most effectively provide health benefits. This study analyzes daily health and pollution data from three major cities in Israel that have undergone pollution control interventions to reduce sulfur emissions from combustion sources. In this work, the hypothesis tested is that transitions to cleaner fuels are accompanied by a decreased risk of daily cardiovascular and respiratory mortality. Interrupted time series regression models are applied in order to test whether the cleaner air interventions are associated with a statistically significant reduction in mortality. In the multi-city meta-analysis we found statistically significant reductions of 13.3% [CI -21.9%, -3.8%] in cardiovascular mortality, and a borderline significant (p=0.06) reduction of 19.0% [CI -35.1%, 1.1%] in total mortality. Overall, new experiential evidence is provided consistent with human health benefits being associated with interventions to reduce air pollution. The methods employed also provide an approach that may be applied elsewhere in the future to better document and optimize the health benefits of clean air interventions. Copyright © 2017 Elsevier Ltd. All rights reserved.
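An interrupted time series regression of the kind applied in the study can be sketched as an ordinary least squares fit with a post-intervention step term. The daily mortality series, intervention day, and effect size below are synthetic illustration values, not the Israeli data.

```python
import numpy as np

rng = np.random.default_rng(1)
n, t0 = 200, 100                      # days in series, intervention day
t = np.arange(n)
step = (t >= t0).astype(float)        # 1.0 after the clean-fuel switch

# Synthetic daily deaths: baseline 30, true post-intervention drop of 4.
deaths = 30.0 - 4.0 * step + rng.normal(0.0, 1.0, n)

# Segmented regression: intercept, underlying time trend, level change.
X = np.column_stack([np.ones(n), t, step])
beta, *_ = np.linalg.lstsq(X, deaths, rcond=None)
print(f"estimated level change: {beta[2]:.2f}")
```

The coefficient on the step term estimates the mortality change attributable to the intervention, net of any underlying trend; a full analysis would also model seasonality and autocorrelation.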
Estimating liver cancer deaths in Thailand based on verbal autopsy study.
Waeto, Salwa; Pipatjaturon, Nattakit; Tongkumchum, Phattrawan; Choonpradub, Chamnein; Saelim, Rattikan; Makaje, Nifatamah
2014-01-01
Liver cancer mortality is high in Thailand, but the utility of related vital statistics is limited because national vital registration (VR) data are under-reported for specific causes of death. Accurate methodologies and reliable supplementary data are needed to provide sound national vital statistics. This study aimed to model liver cancer deaths based on a verbal autopsy (VA) study conducted in 2005, in order to provide more accurate estimates of liver cancer deaths than those reported. The results were used to estimate the number of liver cancer deaths during 2000-2009. The VA study, carried out in 2005 on a sample of 9,644 deaths from nine provinces, provided reliable information on causes of death by gender, age group, location of death in or outside hospital, and cause of death in the VR database. Logistic regression was used to model liver cancer deaths in terms of these variables. The estimated probabilities from the model were applied to liver cancer deaths in the VR database for 2000-2009, yielding more accurate VA-based estimates of the numbers of liver cancer deaths. The model fits the data quite well, with sensitivity 0.64. Confidence intervals from the statistical model quantify the estimates and their precision. The VA-estimated numbers of liver cancer deaths were higher than the corresponding VR counts, with inflation factors of 1.56 for males and 1.64 for females. The statistical methods used in this study can be applied to available mortality data in developing countries where national vital registration data are of low quality and reliable supplementary data are available.
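The adjustment step, applying model-estimated probabilities to registered deaths and comparing the result with the raw VR count, can be sketched as below. All counts and probabilities are hypothetical, not the study's data.

```python
def inflation_factor(vr_count, candidate_deaths, p_liver):
    """VA-adjusted estimate divided by the raw VR count.

    vr_count:         liver cancer deaths recorded in the VR database
    candidate_deaths: deaths coded to other/ill-defined causes that the
                      model screens for misclassified liver cancer
    p_liver:          model-estimated probability that such a death was
                      truly liver cancer
    """
    estimated = vr_count + candidate_deaths * p_liver
    return estimated / vr_count

# Hypothetical male and female strata.
print(round(inflation_factor(9000, 4000, 0.32), 2))  # males
print(round(inflation_factor(5000, 3000, 0.28), 2))  # females
```

An inflation factor above 1 quantifies how much the VR database undercounts the cause of death, which is how the study arrives at its 1.56 and 1.64 figures.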
NASA Astrophysics Data System (ADS)
Curci, Gabriele; Falasca, Serena
2017-04-01
Deterministic air quality forecasts are routinely carried out at many local environmental agencies in Europe and throughout the world by means of Eulerian chemistry-transport models. The skill of these models in predicting the ground-level concentrations of relevant pollutants (ozone, nitrogen dioxide, particulate matter) a few days ahead has greatly improved in recent years, but it is not yet always compliant with the quality level required for decision making (e.g. the European Commission has set a maximum uncertainty of 50% on daily values of relevant pollutants). Post-processing of deterministic model output is thus still regarded as a useful tool for making forecasts more reliable. In this work, we test several bias correction techniques applied to a long-term dataset of air quality forecasts over Europe and Italy. We used the WRF-CHIMERE modelling system, which provides operational experimental chemical weather forecasts at CETEMPS (http://pumpkin.aquila.infn.it/forechem/), to simulate the years 2008-2012 at low resolution over Europe (0.5° x 0.5°) and moderate resolution over Italy (0.15° x 0.15°). We compared the simulated dataset with available observations from the European Environment Agency database (AirBase) and characterized model skill and compliance with EU legislation using the Delta tool from the FAIRMODE project (http://fairmode.jrc.ec.europa.eu/). The bias correction techniques adopted are, in order of complexity: (1) application of multiplicative factors calculated as the ratio of model-to-observed concentrations averaged over the previous days; (2) correction of the statistical distribution of model forecasts, in order to make it similar to that of the observations; and (3) development and application of Model Output Statistics (MOS) regression equations. We illustrate the differences and advantages/disadvantages of the three approaches. All the methods are relatively easy to implement for other modelling systems.
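The first and simplest bias-correction technique, a multiplicative factor computed from the previous days, can be sketched in a few lines. The factor is written here as the observed-to-modelled ratio so it can multiply the forecast directly, and the ozone concentrations are invented illustration values.

```python
def ratio_correction(past_obs, past_fcst, today_fcst):
    """Scale today's forecast by mean(observed) / mean(forecast)
    over a trailing window of paired values."""
    factor = sum(past_obs) / sum(past_fcst)
    return factor * today_fcst

obs = [80.0, 95.0, 110.0]     # observed daily ozone, last 3 days (ug/m3)
fcst = [100.0, 110.0, 135.0]  # corresponding model forecasts
print(round(ratio_correction(obs, fcst, today_fcst=120.0), 1))
```

Here the model overpredicts, so the factor (about 0.83) pulls the 120 ug/m3 forecast down toward the recent observations.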
NASA Technical Reports Server (NTRS)
Raiman, Laura B.
1992-01-01
Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization.
Sleep quality and communication aspects in children.
de Castro Corrêa, Camila; José, Maria Renata; Andrade, Eduardo Carvalho; Feniman, Mariza Ribeiro; Fukushiro, Ana Paula; Berretin-Felix, Giédre; Maximino, Luciana Paula
2017-09-01
To correlate children's sleep-related quality of life with their oral language skills, auditory processing and orofacial myofunctional aspects. Nineteen children (12 males and seven females, mean age 9.26 years) undergoing otorhinolaryngological and speech evaluations participated in this study. The OSA-18 questionnaire was applied, followed by verbal and nonverbal sequential memory tests, the dichotic digit test, the nonverbal dichotic test and the Sustained Auditory Attention Ability Test, related to auditory processing. The Phonological Awareness Profile test, Rapid Automatized Naming and Phonological Working Memory were used for assessment of phonological processing. Language was assessed by the ABFW Child Language Test, analyzing the phonological and lexical levels. Orofacial myofunctional aspects were evaluated through the MBGR Protocol. The statistical tests used were the Mann-Whitney test, Fisher's exact test and the Spearman correlation. Relating the performance of the children in all evaluations to the results obtained on the OSA-18, there was a statistically significant correlation in phonological working memory for backward digits (p = 0.04), as well as in the breathing item (p = 0.03), posture of the mandible (p = 0.03) and mobility of the lips (p = 0.04). A correlation was seen between sleep-related quality of life and the skills related to phonological processing, specifically phonological working memory for backward digits, and orofacial myofunctional aspects. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Raiman, Laura B.
1992-12-01
Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization.
VISSR Atmospheric Sounder (VAS) simulation experiment for a severe storm environment
NASA Technical Reports Server (NTRS)
Chesters, D.; Uccellini, L. W.; Mostek, A.
1981-01-01
Radiance fields were simulated for prethunderstorm environments in Oklahoma to demonstrate three points: (1) significant moisture gradients can be seen directly in images of the VISSR Atmospheric Sounder (VAS) channels; (2) temperature and moisture profiles can be retrieved from VAS radiances with sufficient accuracy to be useful for mesoscale analysis of a severe storm environment; and (3) the quality of VAS mesoscale soundings improves with conditioning by local weather statistics. The results represent the optimum retrievability of mesoscale information from VAS radiances without the use of ancillary data. The simulations suggest that VAS data will yield the best soundings when a human being classifies the scene, picks relatively clear areas for retrieval, and applies a "local" statistical data base to resolve the ambiguities of satellite observations in favor of the most probable atmospheric structure.
Chukmaitov, Askar; Harless, David W; Bazzoli, Gloria J; Carretta, Henry J; Siangphoe, Umaporn
2015-01-01
Implementation of accountable care organizations (ACOs) is currently underway, but there is limited empirical evidence on the merits of the ACO model. The aim was to study the associations between delivery system characteristics and ACO competencies, including centralization strategies to manage organizations, hospital integration with physicians and outpatient facilities, health information technology, infrastructure to monitor community health and report quality, and risk-adjusted 30-day all-cause mortality and case-mix-adjusted inpatient costs for the Medicare population. Panel data (2006-2009) were assembled from Florida and multiple sources: inpatient hospital discharge, vital statistics, the American Hospital Association, the Healthcare Information and Management Systems Society, and other databases. We applied a panel study design, controlling for hospital and market characteristics. Hospitals that were in centralized health systems or became more centralized over the study period had significantly larger reductions in mortality compared with hospitals that remained freestanding. Surprisingly, tightly integrated hospital-physician arrangements were associated with increased mortality; as such, hospitals may wish to proceed cautiously when developing specific types of alignment with local physician organizations. We observed no statistically significant differences in the growth rate of costs across hospitals in any of the health systems studied relative to freestanding hospitals. Although we observed quality improvement in some organizational types, these outcome improvements were not coupled with the additional desired objective of lower cost growth. This implies that additional changes not present during our study period, potentially changes in provider payment approaches, are essential for achieving the ACO objectives of higher quality of care at lower costs.
Provider organizations implementing ACOs should consider centralizing service delivery as a viable strategy to improve quality of care, although the strategy did not result in lower cost growth.
Network analysis applications in hydrology
NASA Astrophysics Data System (ADS)
Price, Katie
2017-04-01
Applied network theory has seen pronounced expansion in recent years, in fields such as epidemiology, computer science, and sociology. Concurrent development of analytical methods and frameworks has increased the possibilities and tools available to researchers seeking to apply network theory to a variety of problems. While water and nutrient fluxes through stream systems clearly demonstrate a directional network structure, the hydrological applications of network theory remain underexplored. This presentation covers a review of network applications in hydrology, followed by an overview of promising network analytical tools that potentially offer new insights into conceptual modeling of hydrologic systems, identifying behavioral transition zones in stream networks, and thresholds of dynamical system response. Network applications were tested along an urbanization gradient in Atlanta, Georgia, USA, in two watersheds: Peachtree Creek and Proctor Creek. Peachtree Creek contains a nest of five long-term USGS streamflow and water quality gages, allowing network application of long-term flow statistics. The watershed spans a range of suburban and heavily urbanized conditions. Summary flow statistics and water quality metrics were analyzed using a suite of network analysis techniques to test the conceptual modeling and predictive potential of the methodologies. Storm events and low-flow dynamics during Summer 2016 were analyzed using multiple network approaches, with an emphasis on tomogravity methods. Results indicate that network theory approaches offer novel perspectives for understanding long-term and event-based hydrological data. Key future directions for network applications include 1) optimizing data collection, 2) identifying "hotspots" of contaminant and overland flow influx to stream systems, 3) defining process domains, and 4) analyzing dynamic connectivity of various system components, including groundwater-surface water interactions.
The social media index: measuring the impact of emergency medicine and critical care websites.
Thoma, Brent; Sanders, Jason L; Lin, Michelle; Paterson, Quinten S; Steeg, Jordon; Chan, Teresa M
2015-03-01
The number of educational resources created for emergency medicine and critical care (EMCC) that incorporate social media has increased dramatically. With no way to assess their impact or quality, it is challenging for educators to receive scholarly credit and for learners to identify respected resources. The Social Media index (SMi) was developed to help address this. We used data from social media platforms (Google PageRanks, Alexa Ranks, Facebook Likes, Twitter Followers, and Google+ Followers) for EMCC blogs and podcasts to derive three normalized (ordinal, logarithmic, and raw) formulas. The most statistically robust formula was assessed for 1) temporal stability using repeated measures and website age, and 2) correlation with impact by applying it to EMCC journals and measuring the correlation with known journal impact metrics. The logarithmic version of the SMi containing four metrics was the most statistically robust. It correlated significantly with website age (Spearman r=0.372; p<0.001) and repeated measures through seven months (r=0.929; p<0.001). When applied to EMCC journals, it correlated significantly with all impact metrics except number of articles published. The strongest correlations were seen with the Immediacy Index (r=0.609; p<0.001) and Article Influence Score (r=0.608; p<0.001). The SMi's temporal stability and correlation with journal impact factors suggests that it may be a stable indicator of impact for medical education websites. Further study is needed to determine whether impact correlates with quality and how learners and educators can best utilize this tool.
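The Spearman rank correlations used to validate the SMi against journal metrics can be reproduced with a short pure-Python implementation (no handling of tied values). The SMi and Immediacy Index figures below are invented for illustration, not the study's data.

```python
def rank(xs):
    """1-based ranks of the values in xs (assumes no ties)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0] * len(xs)
    for r, i in enumerate(order):
        ranks[i] = r + 1
    return ranks

def spearman(x, y):
    """Spearman rho via 1 - 6*sum(d^2) / (n*(n^2 - 1)), no tie correction."""
    rx, ry = rank(x), rank(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    n = len(x)
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical SMi scores vs. Immediacy Index for five journals.
smi = [6.1, 4.3, 5.0, 3.2, 7.4]
immediacy = [0.9, 0.4, 0.6, 0.5, 1.2]
print(round(spearman(smi, immediacy), 2))
```

Rank-based correlation is the natural choice here because metrics such as follower counts are heavily skewed, and only the ordering of websites or journals is of interest.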
Sever, Ivan; Klaric, Eva; Tarle, Zrinka
2016-07-01
Dental microhardness experiments are influenced by unobserved factors related to varying tooth characteristics that affect measurement reproducibility. This paper explores appropriate analytical tools for modeling different sources of unobserved variability, in order to reduce the biases encountered and increase the validity of microhardness studies. The enamel microhardness of human third molars was measured with a Vickers diamond. The effects of five bleaching agents (10, 16, and 30% carbamide peroxide, and 25 and 38% hydrogen peroxide) were examined, as well as the effects of artificial saliva and amorphous calcium phosphate. To account for both between- and within-tooth heterogeneity in evaluating treatment effects, the statistical analysis was performed in the mixed-effects framework, which also included an appropriate weighting procedure to adjust for confounding. The results were compared with those of the standard ANOVA model usually applied. The weighted mixed-effects model produced parameter estimates that differed in magnitude and significance from those of the standard ANOVA model. The results of the former model were more intuitive, with more precise estimates and better fit. Confounding could seriously bias study outcomes, highlighting the need for more robust statistical procedures in dental research that account for measurement reliability. The presented framework is more flexible and informative than existing analytical techniques and may improve the quality of inference in dental research. Reported results could be misleading if the underlying heterogeneity of microhardness measurements is not taken into account. Confidence in treatment outcomes could be increased by applying the framework presented.
Statistical process control for electron beam monitoring.
López-Tarjuelo, Juan; Luquero-Llopis, Naika; García-Mollá, Rafael; Quirós-Higueras, Juan David; Bouché-Babiloni, Ana; Juan-Senabre, Xavier Jordi; de Marco-Blancas, Noelia; Ferrer-Albiach, Carlos; Santos-Serra, Agustín
2015-07-01
To assess statistical process control (SPC) of electron beam monitoring in daily linear accelerator (linac) quality control. We present a long-term record of our measurements and evaluate which SPC-led conditions are feasible for maintaining control. We retrieved our linac beam calibration, symmetry, and flatness daily records for all electron beam energies from January 2008 to December 2013, and retrospectively studied how SPC could have been applied and which of its features could be used in the future. A set of adjustment interventions designed to maintain these parameters under control was also simulated. All phase I data were under control. The dose plots were characterized by rising trends followed by steep drops caused by our attempts to re-center the linac beam calibration. Where flatness and symmetry trends were detected, they were less well defined. The process capability ratios ranged from 1.6 to 9.3 at a 2% specification level. Simulated interventions ranged from 2% to 34% of the total number of measurement sessions. We also noted that if prospective SPC had been applied it would have met quality control specifications. SPC can be used to assess the inherent variability of our electron beam monitoring system. It can also indicate whether a process is capable of maintaining electron parameters under control with respect to established specifications by using a daily checking device, but this is not practical unless a method to establish direct feedback from the device to the linac can be devised. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
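Process capability ratios of the kind reported at a 2% specification level follow the usual Cp/Cpk definitions. A minimal sketch, with synthetic daily dose readings around a nominal value of 1.00 and ±2% specification limits (not the paper's measurements), is:

```python
def capability(values, lsl, usl):
    """Cp and Cpk for a series against lower/upper specification limits."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    cp = (usl - lsl) / (6 * sd)                       # potential capability
    cpk = min(usl - mean, mean - lsl) / (3 * sd)      # penalizes off-center
    return cp, cpk

# Synthetic relative dose readings; specs are nominal 1.00 +/- 2%.
doses = [1.002, 0.998, 1.001, 0.999, 1.000, 1.003, 0.997, 1.001]
cp, cpk = capability(doses, lsl=0.98, usl=1.02)
print(f"Cp = {cp:.1f}, Cpk = {cpk:.1f}")
```

Values well above 1 mean the process variation fits comfortably inside the specification band, which is why ratios between 1.6 and 9.3 indicate a capable monitoring process.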
[Quality assessment in anesthesia].
Kupperwasser, B
1996-01-01
Quality assessment (assurance/improvement) is the set of methods used to measure and improve the delivered care and the department's performance against pre-established criteria or standards. The four stages of the self-maintained quality assessment cycle are: problem identification, problem analysis, problem correction and evaluation of corrective actions. Quality assessment is a measurable entity for which it is necessary to define and calibrate measurement parameters (indicators) from available data gathered from the hospital anaesthesia environment. Problem identification comes from the accumulation of indicators. There are four types of quality indicators: structure, process, outcome and sentinel indicators. The latter signal a quality defect, are independent of outcomes, are easier to analyse by statistical methods and are closely related to processes and the main targets of quality improvement. The three types of methods to analyse the problems (indicators) are: peer review, quantitative methods and risk management techniques. Peer review is performed by qualified anaesthesiologists. To improve its validity, the review process should be made explicit and its conclusions based on standards of practice and literature references. The quantitative methods are statistical analyses applied to the collected data and presented in a graphic format (histogram, Pareto diagram, control charts). The risk management techniques include: a) critical incident analysis, establishing an objective relationship between a 'critical' event and the associated human behaviours; b) system accident analysis, based on the fact that accidents continue to occur despite safety systems and sophisticated technologies, which checks all the process components leading to the unpredictable outcome and not just the human factors; c) cause-effect diagrams, which facilitate problem analysis by reducing its causes to four fundamental components (persons, regulations, equipment, process).
Definition and implementation of corrective measures, based on the findings of the two previous stages, constitute the third step of the evaluation cycle. The Hawthorne effect is an improvement in outcomes that occurs before any corrective actions are implemented. Verification of the implemented actions is the final and mandatory step, closing the evaluation cycle.
Fuangrod, Todsaporn; Greer, Peter B; Simpson, John; Zwan, Benjamin J; Middleton, Richard H
2017-03-13
Purpose: Due to increasing complexity, modern radiotherapy techniques require comprehensive quality assurance (QA) programmes that, to date, generally focus on the pre-treatment stage. The purpose of this paper is to provide a method for individual patient treatment QA evaluation and identification of a "quality gap" for continuous quality improvement. Design/methodology/approach: Statistical process control (SPC) was applied to evaluate treatment delivery using in vivo electronic portal imaging device (EPID) dosimetry. A moving range control chart was constructed to monitor individual patient treatment performance based on a control limit generated from initial data of 90 intensity-modulated radiotherapy (IMRT) and ten volumetric-modulated arc therapy (VMAT) patient deliveries. A process capability index was used to evaluate continuing treatment quality based on three quality classes: treatment type-specific, treatment linac-specific, and body site-specific. Findings: The determined control limits were 62.5 and 70.0 per cent of the χ pass-rate for IMRT and VMAT deliveries, respectively. In total, 14 patients were selected for a pilot study, the results of which showed that about 1 per cent of all treatments contained errors relating to unexpected anatomical changes between treatment fractions. Both rectum and pelvis cancer treatments demonstrated process capability indices less than 1, indicating the potential for quality improvement, and hence may benefit from further assessment. Research limitations/implications: The study relied on the application of in vivo EPID dosimetry for patients treated at the specific centre. The sample of patients used to generate the control limits was limited to 100. Whilst the quantitative results are specific to the clinical techniques and equipment used, the described method is generally applicable to IMRT and VMAT treatment QA.
Whilst more work is required to determine the level of clinical significance, the authors have demonstrated the capability of the method for both treatment-specific QA and continuing quality improvement. Practical implications: The proposed method is a valuable tool for assessing the accuracy of treatment delivery whilst also improving treatment quality and patient safety. Originality/value: Assessing in vivo EPID dosimetry with SPC can be used to improve the quality of radiation treatment for cancer patients.
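A one-sided capability index is the natural fit for a pass-rate metric, which has a lower limit only (pass rates cannot exceed 100%). The sketch below uses the standard Cpl formula; it is not the authors' exact computation, and the pass-rate values in the test are invented:

```python
def cpl(pass_rates, lower_spec):
    """Lower one-sided process capability index: Cpl = (mean - LSL) / (3*s).

    pass_rates: per-fraction pass-rates (%); lower_spec: lower limit (%).
    Cpl < 1 indicates the process cannot reliably stay above the limit,
    flagging the treatment class for further assessment.
    """
    n = len(pass_rates)
    mean = sum(pass_rates) / n
    var = sum((v - mean) ** 2 for v in pass_rates) / (n - 1)  # sample variance
    return (mean - lower_spec) / (3 * var ** 0.5)
```

In the study's terms, the rectum and pelvis classes falling below an index of 1 is exactly this kind of signal: the delivery distribution sits too close to its control limit.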
Regojo Zapata, O; Lamata Hernández, F; Sánchez Zalabardo, J M; Elizalde Benito, A; Navarro Gil, J; Valdivia Uría, J G
2004-09-01
Studies on the quality of doctoral theses and research projects in the biomedical sciences are unusual, but they are very important in university teaching because the quality of thesis preparation needs to improve. The objectives of the study were to determine the quality of the thesis projects in our department, according to their fulfilment of scientific methodology, and to establish whether a relationship exists between the overall quality of a project and the statistical resources used. Descriptive study of 273 thesis projects carried out between 1995 and 2002 in the surgery department of the University of Zaragoza. The review was performed by 15 observers, who analysed 28 indicators for every project. By assigning a value to each indicator, the projects were rated on a scale from 1 to 10 according to how well they fulfilled scientific methodology. The mean project quality was 5.53 (SD: 1.77). Only 13.9% of the thesis projects concluded with the defence of the work. The three indicators of statistical resources used differed significantly with project quality. The quality of the statistical resources is very important when a thesis project is to be carried out with sound methodology, because it ensures that firm conclusions can be reached. In our study, more than a third of the variability in thesis project quality was explained by the three statistical indicators mentioned above.
Potential impacts of climate change on water quality in a shallow reservoir in China.
Zhang, Chen; Lai, Shiyu; Gao, Xueping; Xu, Liping
2015-10-01
To study the potential effects of climate change on water quality in a shallow reservoir in China, a field data analysis method is applied to data collected over a given monitoring period. Nine water quality parameters (water temperature, ammonia nitrogen, nitrate nitrogen, nitrite nitrogen, total nitrogen, total phosphorus, chemical oxygen demand, biochemical oxygen demand and dissolved oxygen) and three climate indicators over 20 years (1992-2011) are considered. The annual series exhibit significant trends for certain water quality and climate parameters. Five parameters exhibit significant seasonal differences in monthly means between the two decades (1992-2001 and 2002-2011) of the monitoring period. Non-parametric regression is performed to explore potential key climate drivers of water quality in the reservoir. The results indicate that seasonal changes in temperature and rainfall may have positive impacts on water quality. However, an extremely cold spring and high wind speed are likely to affect the self-stabilising equilibrium states of the reservoir, which requires attention in the future. The results also suggest that land use changes have an important impact on nitrogen load. This study provides useful information regarding the potential effects of climate change on water quality in developing countries.
[Quality of life in patients with psoriasis].
García-Sánchez, Liliana; Montiel-Jarquín, Álvaro José; Vázquez-Cruz, Eduardo; May-Salazar, Adriana; Gutiérrez-Gabriel, Itzel; Loría-Castellanoso, Jorge
Psoriasis is a chronic inflammatory skin disease in which an autoimmune mechanism participates, triggering accelerated keratopoiesis. Its etiology is unknown; environmental factors, trauma, and infections are involved. The aim of this paper is to present the correlation between the severity index of psoriasis and quality of life in patients with psoriasis. This was a cross-sectional study in 72 patients with psoriasis, older than 15 years, who agreed to participate in the study. We applied the Dermatology Life Quality Index and the Psoriasis Severity Index; descriptive statistics, measures of central tendency, dispersion, and correlation measures were used. Patients (n = 72) were 43% male and 57% female, with a mean age of 51.22 (15-77) ± 14.05 years. Education: bachelor's degree 23.6%; housework occupation 26.4%; duration of the disease 12.25 (1-50) ± 10.58 years. Psoriasis plaques occurred in 88.9%, and the Psoriasis Severity Index was mild in 70.8%. The impact on quality of life was moderate in 33.3%; the difference between the degree of involvement of the disease and the impact on quality of life gave p = 0.104, and the correlation between quality of life and degree of psoriasis gave p = 0.463. Quality of life is independent of the degree of disease in patients with psoriasis.
Böning, G; Schäfer, M; Grupp, U; Kaul, D; Kahn, J; Pavel, M; Maurer, M; Denecke, T; Hamm, B; Streitparth, F
2015-08-01
To investigate whether dose reduction via adaptive statistical iterative reconstruction (ASIR) affects image quality and diagnostic accuracy in neuroendocrine tumor (NET) staging. A total of 28 NET patients were enrolled in the study. Inclusion criteria were histologically proven NET and visible tumor in abdominal computed tomography (CT). In an intraindividual study design, the patients underwent a baseline CT (filtered back projection, FBP) and follow-up CT (ASIR 40%) using matched scan parameters. Image quality was assessed subjectively using a 5-grade scoring system and objectively by determining signal-to-noise ratio (SNR) and contrast-to-noise ratios (CNRs). The applied volume computed tomography dose index (CTDIvol) of each scan was taken from the dose report. ASIR 40% significantly reduced CTDIvol by 37.6% (10.17±3.06 mGy [FBP] vs. 6.34±2.25 mGy [ASIR]; p<0.001) and significantly increased CNRs compared to FBP (complete tumor-to-liver: 2.76±1.87 [FBP] vs. 3.2±2.32 [ASIR], p<0.05; complete tumor-to-muscle: 2.74±2.67 [FBP] vs. 4.31±4.61 [ASIR], p<0.05). Subjective scoring revealed no significant changes for diagnostic confidence (5.0±0 [FBP], 5.0±0 [ASIR]), visibility of suspicious lesions (4.8±0.5 [FBP], 4.8±0.5 [ASIR]) and artifacts (5.0±0 [FBP], 5.0±0 [ASIR]). ASIR 40% significantly decreased scores for noise (4.3±0.6 [FBP], 4.0±0.8 [ASIR]) (p<0.05), contrast (4.4±0.6 [FBP], 4.1±0.8 [ASIR]) (p<0.001) and visibility of small structures (4.5±0.7 [FBP], 4.3±0.8 [ASIR]) (p<0.001). In clinical practice, ASIR can be used to reduce radiation dose without sacrificing image quality and diagnostic confidence in staging CT of NET patients. This may be beneficial for patients with frequent follow-up and significant cumulative radiation exposure. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Using knowledge for indexing health web resources in a quality-controlled gateway.
Joubert, Michel; Darmoni, Stefan J; Avillach, Paul; Dahamna, Badisse; Fieschi, Marius
2008-01-01
The aim of this study is to provide indexers with MeSH terms to be considered as major ones in a list of terms automatically extracted from a document. We propose a method combining symbolic knowledge (the UMLS Metathesaurus and Semantic Network) and statistical knowledge drawn from co-occurrences of terms in the CISMeF database (a French-language quality-controlled health gateway) using data mining measures. The method was tested on a CISMeF corpus of 293 resources. The proportion of major terms in the processed records was 0.37±0.26. The method produced lists of terms in which the proportion of terms initially marked as major was 0.54±0.31. The method we propose reduces the number of terms that are not useful for describing the content of resources, such as "check tags", but retains the most descriptive ones. Discarding these terms is accounted for by: 1) the removal, using semantic knowledge, of associations of concepts bearing no real medical significance, and 2) the removal, using statistical knowledge, of non-statistically significant associations of terms. This method can effectively assist indexers in their daily work and will soon be applied in the CISMeF system.
[A development and evaluation of nursing KMS using QFD in outpatient departments].
Lee, Han Na; Yun, Eun Kyoung
2014-02-01
This study was done to develop and implement the Nursing KMS (knowledge management system) in order to improve knowledge sharing and creation among clinical nurses in outpatient departments. This was methodological research using the 'System Development Life Cycle', consisting of planning, analysis, design, implementation, and evaluation. Quality Function Deployment (QFD) was applied to establish nurse requirements and to identify important design requirements. Participants were 32 nurses, and evaluation data were collected pre- and post-intervention at K Hospital in Seoul, a tertiary hospital with over 1,000 beds. The Nursing KMS was built using a Linux-based operating system, Oracle DBMS, and Java 1.6 web programming tools. The system was implemented as a sub-system of the hospital information system. There were statistically significant differences in knowledge sharing, but no statistically significant difference was observed in knowledge creation. In terms of satisfaction with the system, system efficiency ranked first, followed by system convenience, information suitability and information usefulness. The results indicate that the use of the Nursing KMS increases nurses' knowledge sharing and can contribute to increased quality of nursing knowledge and provide more opportunities for nurses to gain expertise from knowledge shared among nurses.
Utturkar, Sagar M.; Klingeman, Dawn Marie; Land, Miriam L.; ...
2014-06-14
Our motivation in this work was to assess the potential of different types of sequence data, combined with de novo and hybrid assembly approaches, to improve existing draft genome sequences. Illumina, 454 and PacBio sequencing technologies were used to generate de novo and hybrid genome assemblies for four different bacteria, which were assessed for quality using summary statistics (e.g. number of contigs, N50) and in silico evaluation tools. Differences in predictions of multiple copies of rDNA operons for each respective bacterium were evaluated by PCR and Sanger sequencing, and the validated results were applied as an additional criterion to rank assemblies. In general, assemblies using longer PacBio reads were better able to resolve repetitive regions. In this study, the combination of Illumina and PacBio sequence data assembled through the ALLPATHS-LG algorithm gave the best summary statistics and most accurate rDNA operon number predictions. This study will aid others looking to improve existing draft genome assemblies. All assembly tools except CLC Genomics Workbench are freely available under the GNU General Public License.
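The N50 summary statistic mentioned here has a simple definition: walking down the contigs from longest to shortest, N50 is the length of the contig at which at least half the total assembly length is covered. A minimal sketch (contig lengths in the test are invented):

```python
def n50(contig_lengths):
    """N50: the contig length L at which contigs of length >= L
    together cover at least half of the total assembly length."""
    total = sum(contig_lengths)
    covered = 0
    for length in sorted(contig_lengths, reverse=True):
        covered += length
        if 2 * covered >= total:
            return length
```

A higher N50 at a comparable total length and contig count generally indicates a less fragmented assembly, which is why it appears alongside the number of contigs in the quality comparison above.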
Laboratory animal science: a resource to improve the quality of science.
Forni, M
2007-08-01
The contribution of animal experimentation to biomedical research is of undoubted value; nevertheless, the real usefulness of animal models is still being hotly debated. Laboratory Animal Science is a multidisciplinary approach to humane animal experimentation that allows the choice of the correct animal model and the collection of unbiased data. Refinement, Reduction and Replacement, the "3Rs rule", are now widely accepted and have a major influence on animal experimentation procedures. Refinement, namely any decrease in the incidence or severity of inhumane procedures applied to animals, has today been extended to the entire lives of the experimental animals. Reduction of the number of animals used to obtain statistically significant data may be achieved by improving experimental design and the statistical analysis of data. Replacement refers to the development of validated alternative methods. A Laboratory Animal Science training program in biomedical degrees can promote the 3Rs and improve the welfare of laboratory animals as well as the quality of science, with ethical, scientific and economic advantages, complying with the European requirement that "persons who carry out, take part in, or supervise procedures on animals, or take care of animals used in procedures, shall have had appropriate education and training".
Quantitative Measures for Software Independent Verification and Validation
NASA Technical Reports Server (NTRS)
Lee, Alice
1996-01-01
As software is maintained or reused, it undergoes an evolution which tends to increase the overall complexity of the code. To understand the effects of this, we brought in statistics experts and leading researchers in software complexity, reliability, and their interrelationships. These experts' project has resulted in our ability to statistically correlate specific code complexity attributes, in orthogonal domains, to errors found over time in the HAL/S flight software which flies in the Space Shuttle. Although only a prototype-tools experiment, the result of this research appears to be extendable to all other NASA software, given appropriate data similar to that logged for the Shuttle onboard software. Our research has demonstrated that a more complete domain coverage can be mathematically demonstrated with the approach we have applied, thereby ensuring full insight into the cause-and-effect relationship between the complexity of a software system and the fault density of that system. By applying the operational profile we can characterize the dynamic effects of software path complexity under this same approach. We now have the ability to measure specific attributes which have been statistically demonstrated to correlate to increased error probability, and to know which actions to take for each complexity domain. Shuttle software verifiers can now monitor the changes in software complexity, assess the added or decreased risk of software faults in modified code, and determine necessary corrections. The reports, tool documentation, user's guides, and new approach that have resulted from this research effort represent advances in the state of the art of software quality and reliability assurance. Details describing how to apply this technique to other NASA code are contained in this document.
da Costa Lobato, Tarcísio; Hauser-Davis, Rachel Ann; de Oliveira, Terezinha Ferreira; Maciel, Marinalva Cardoso; Tavares, Maria Regina Madruga; da Silveira, Antônio Morais; Saraiva, Augusto Cesar Fonseca
2015-02-15
The Amazon area has been increasingly suffering from anthropogenic impacts, especially due to the construction of hydroelectric power plant reservoirs. The analysis and categorization of the trophic status of these reservoirs are of interest to indicate man-made changes in the environment. In this context, the present study aimed to categorize the trophic status of a hydroelectric power plant reservoir located in the Brazilian Amazon by constructing a novel Water Quality Index (WQI) and Trophic State Index (TSI) for the reservoir using major ion concentrations and physico-chemical water parameters determined in the area and taking into account the sampling locations and the local hydrological regimes. After applying statistical analyses (factor analysis and cluster analysis) and establishing a rule base of a fuzzy system to these indicators, the results obtained by the proposed method were then compared to the generally applied Carlson and a modified Lamparelli trophic state index (TSI), specific for trophic regions. The categorization of the trophic status by the proposed fuzzy method was shown to be more reliable, since it takes into account the specificities of the study area, while the Carlson and Lamparelli TSI do not, and, thus, tend to over or underestimate the trophic status of these ecosystems. The statistical techniques proposed and applied in the present study, are, therefore, relevant in cases of environmental management and policy decision-making processes, aiding in the identification of the ecological status of water bodies. With this, it is possible to identify which factors should be further investigated and/or adjusted in order to attempt the recovery of degraded water bodies. Copyright © 2014 Elsevier B.V. All rights reserved.
Air quality assessment in Portugal and the special case of the Tâmega e Sousa region
NASA Astrophysics Data System (ADS)
de Almeida, Fátima; Correia, Aldina; Silva, Eliana Costa e.
2017-06-01
Air pollution is a major environmental problem which can present a significant risk to human health. This paper presents the evaluation of air quality in several regions of Portugal. Special focus is given to the region of Tâmega e Sousa, where ESTG/P. Porto is located. ANOVA and MANOVA techniques are applied to study the differences in air quality between 2009 and 2012 in several regions of Portugal. The data include altitude, area, expenditure of environmental measures on protection of air quality and climate, expenditure on protection of biodiversity and landscape, burned area, number of forest fires, and extractive and manufacturing industries, per municipality and per year. Using information gathered by the QualAr project about concentrations of the pollutants CO, NO2, O3, PM10 and SO2, an air quality indicator with five levels is considered. The results point to significant differences in air quality across the regions and years considered. Additionally, a multivariate regression model was used to identify the factors that influence air quality in 2012. The results show statistical evidence that air quality in 2011, the number of forest fires in 2010 and 2012, and the number of manufacturing industries per km2 in 2012 are the variables that contribute most to air quality in 2012.
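The one-way ANOVA underlying such regional comparisons reduces to a ratio of between-group to within-group mean squares. A minimal sketch, with invented readings for two hypothetical regions (a real analysis would compare the resulting F statistic to the F distribution with k-1 and N-k degrees of freedom):

```python
def one_way_f(groups):
    """F statistic for one-way ANOVA: between-group mean square
    over within-group mean square."""
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand = sum(x for g in groups for x in g) / n_total
    means = [sum(g) / len(g) for g in groups]
    # Between-group sum of squares, weighted by group size
    ssb = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    # Within-group sum of squares
    ssw = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ssb / (k - 1)) / (ssw / (n_total - k))
```

MANOVA extends the same idea to several response variables at once, comparing between- and within-group covariance matrices instead of scalar sums of squares.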
Hasani Sangani, Mohammad; Jabbarian Amiri, Bahman; Alizadeh Shabani, Afshin; Sakieh, Yousef; Ashrafi, Sohrab
2015-04-01
Increasing land utilization through diverse forms of human activity, such as agriculture, forestry, urban growth, and industrial development, has led to negative impacts on the water quality of rivers. To find out how catchment attributes, such as land use, hydrologic soil groups, and lithology, can affect water quality variables (Ca(2+), Mg(2+), Na(+), Cl(-), HCO 3 (-) , pH, TDS, EC, SAR), a spatio-statistical approach was applied to 23 catchments in the southern basins of the Caspian Sea. All input data layers (digital maps of land use, soil, and lithology) were prepared using a geographic information system (GIS) and spatial analysis. Relationships between water quality variables and catchment attributes were then examined by Spearman rank correlation tests and multiple linear regression. Stepwise multiple linear regressions were developed to examine the relationship between catchment attributes and water quality variables. The areas (%) of marl, tuff, or diorite, as well as those of good-quality rangeland and bare land, had negative effects on all water quality variables, while those of basalt and forest land cover were found to contribute to improved river water quality. Moreover, lithological variables showed the greatest potential for predicting the mean concentration values of water quality variables, and the measures of EC and TDS were inversely associated with the area (%) of urban land use.
NASA Astrophysics Data System (ADS)
Murali, Swetha; Ponmalar, V.
2017-07-01
To make innovation and continuous improvement the norm, some traditional practices must be unlearnt. Change for growth and competitiveness is required for the sustainability of any profitable business, including the construction industry. Leading companies are willing to implement Total Quality Management (TQM) principles to realise potential advantages and improve growth and efficiency. Research has repeatedly identified quality as the most significant contributor to competitive advantage and industry leadership. The two objectives of this paper are: 1) to identify TQM effectiveness in residential projects, and 2) to identify the areas of client satisfaction/dissatisfaction using the Analytical Hierarchy Process (AHP) and suggest effective mitigation measures. Using statistical survey techniques such as questionnaires, it was observed that total quality management is applied to some extent in leading, successful organisations. The main attributes for quality achievement can be defined as teamwork and better communication, with a single agreed goal between client and contractor. On-site safety is a paramount attribute in identifying quality within residential projects. Process-based quality methods, such as safe on-site working conditions, safety management systems and modern engineering process safety controls, were found to act as interlinked functions. Training and effective communication with all stakeholders on quality management principles are essential for effective quality work. Only through effective TQM principles can companies avoid contract litigation and increase the client satisfaction index.
Lakdizaji, Sima; Hassankhni, Hadi; Mohajjel Agdam, Alireza; Khajegodary, Mohammad; Salehi, Rezvanieh
2013-03-01
Heart failure is one of the most common cardiovascular diseases and decreases quality of life. Most of the factors influencing quality of life can be modified with educational interventions. Therefore, this study examined the impact of a continuous training program on the quality of life of patients with heart failure. This randomized clinical trial was conducted from May to August 2011. Forty-four participants with heart failure referred to Shahid Madani's polyclinics of Tabriz were selected through a convenience sampling method and were randomly allocated to two groups. The intervention group (n = 22) received ongoing training, including one-to-one teaching, counseling sessions and phone calls, over 3 months. The control group (n = 22) received a routine care program. Data on quality of life were collected using the Minnesota Living with Heart Failure Questionnaire at baseline and three months later. The statistical tests showed significant differences in the physical and emotional dimensions and in total quality of life in the intervention group, but no significant differences in the control group. There was no significant association between demographic characteristics and quality of life. Ongoing training programs can be effective in improving the quality of life of patients with heart failure. Hence, applying an ongoing educational program as a non-pharmacological intervention can help to improve the quality of life of these patients.
A method to estimate spatiotemporal air quality in an urban traffic corridor.
Singh, Nongthombam Premananda; Gokhale, Sharad
2015-12-15
Air quality exposure assessment using personal exposure sampling or direct measurement of spatiotemporal air pollutant concentrations is difficult and has limitations. Most statistical methods used for estimating spatiotemporal air quality do not account for source characteristics (e.g. emissions). In this study, a prediction method, based on the lognormal probability distribution of hourly-average spatial concentrations of carbon monoxide (CO) obtained by a CALINE4 model, has been developed and validated in an urban traffic corridor. Data on CO concentrations, traffic, and meteorology were collected at three locations within the urban traffic corridor. The method was developed with the data of one location and validated at the other two. The method estimated the CO concentrations reasonably well (correlation coefficient, r ≥ 0.96). The method was then applied to estimate the probability of occurrence [P(C ≥ Cstd)] of the spatial CO concentrations in the corridor. The results have been promising and may therefore be useful for quantifying spatiotemporal air quality within an urban area. Copyright © 2015 Elsevier B.V. All rights reserved.
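If hourly-average concentrations follow a lognormal distribution, the exceedance probability P(C ≥ Cstd) has a closed form via the complementary error function. A minimal sketch; in practice the log-scale parameters mu_log and sigma_log would be fitted to the corridor concentration data, and the values in the test below are arbitrary:

```python
from math import erfc, log, sqrt

def exceedance_prob(c_std, mu_log, sigma_log):
    """P(C >= c_std) for lognormal C, where mu_log and sigma_log are
    the mean and standard deviation of ln(C)."""
    z = (log(c_std) - mu_log) / (sigma_log * sqrt(2))
    return 0.5 * erfc(z)
```

By construction, the probability of exceeding the median concentration exp(mu_log) is exactly 0.5, and it decreases monotonically as the standard Cstd is raised.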
Impact of artifact removal on ChIP quality metrics in ChIP-seq and ChIP-exo data
Carroll, Thomas S.; Liang, Ziwei; Salama, Rafik; Stark, Rory; de Santiago, Ines
2014-01-01
With the advent of ChIP-seq multiplexing technologies and the subsequent increase in ChIP-seq throughput, the development of working standards for the quality assessment of ChIP-seq studies has received significant attention. The ENCODE consortium's large scale analysis of transcription factor binding and epigenetic marks as well as concordant work on ChIP-seq by other laboratories has established a new generation of ChIP-seq quality control measures. The use of these metrics alongside common processing steps has however not been evaluated. In this study, we investigate the effects of blacklisting and removal of duplicated reads on established metrics of ChIP-seq quality and show that the interpretation of these metrics is highly dependent on the ChIP-seq preprocessing steps applied. Further to this we perform the first investigation of the use of these metrics for ChIP-exo data and make recommendations for the adaptation of the NSC statistic to allow for the assessment of ChIP-exo efficiency. PMID:24782889
Stochastic performance modeling and evaluation of obstacle detectability with imaging range sensors
NASA Technical Reports Server (NTRS)
Matthies, Larry; Grandjean, Pierrick
1993-01-01
Statistical modeling and evaluation of the performance of obstacle detection systems for Unmanned Ground Vehicles (UGVs) is essential for the design, evaluation, and comparison of sensor systems. In this report, we address this issue for imaging range sensors by dividing the evaluation problem into two levels: quality of the range data itself and quality of the obstacle detection algorithms applied to the range data. We review existing models of the quality of range data from stereo vision and AM-CW LADAR, then use these to derive a new model for the quality of a simple obstacle detection algorithm. This model predicts the probability of detecting obstacles and the probability of false alarms, as a function of the size and distance of the obstacle, the resolution of the sensor, and the level of noise in the range data. We evaluate these models experimentally using range data from stereo image pairs of a gravel road with known obstacles at several distances. The results show that the approach is a promising tool for predicting and evaluating the performance of obstacle detection with imaging range sensors.
NASA Astrophysics Data System (ADS)
Squizzato, Stefania; Masiol, Mauro
2015-10-01
Air quality is influenced by meteorology at meso- and synoptic scales. While local weather and mixing layer dynamics mainly drive the dispersion of sources at small scales, long-range transport affects the movement of air masses over regional, transboundary and even continental scales. Long-range transport may advect polluted air masses from hot-spots, increasing the levels of pollution at nearby or remote locations, or may further raise air pollution levels where external air masses originate from other hot-spots. Therefore, knowledge of ground-wind circulation and potential long-range transport is fundamental not only to evaluate how local or external sources may affect the air quality at a receptor site but also to quantify it. This review is focussed on establishing the relationships among PM2.5 sources, meteorological conditions and air mass origin in the Po Valley, which is one of the most polluted areas in Europe. We have chosen the results from a recent study carried out in Venice (Eastern Po Valley) and have analysed them using different statistical approaches to understand the influence of external and local contributions of PM2.5 sources. External contributions were evaluated by applying Trajectory Statistical Methods (TSMs) based on back-trajectory analysis, including (i) back-trajectory cluster analysis, (ii) the potential source contribution function (PSCF) and (iii) the concentration weighted trajectory (CWT). Furthermore, the relationships between the source contributions and ground-wind circulation patterns were investigated using (iv) cluster analysis on wind data and (v) the conditional probability function (CPF). Finally, local source contributions have been estimated by applying the Lenschow approach. In summary, the integrated approach of different techniques has successfully identified both local and external sources of particulate matter pollution in a European hot-spot affected by the worst air quality.
Utility of Modified Ultrafast Papanicolaou Stain in Cytological Diagnosis
Sinkar, Prachi; Arakeri, Surekha Ulhas
2017-01-01
Introduction: The need for minimal turnaround time in assessing Fine Needle Aspiration Cytology (FNAC) has encouraged innovations in staining techniques that require less staining time while preserving unequivocal cell morphology. The standard protocol for the conventional Papanicolaou (PAP) stain requires about 40 minutes. To overcome this, the Ultrafast Papanicolaou (UFP) stain was introduced, which reduces staining time to 90 seconds and also enhances quality. However, the reagents required for it were not easily available; hence, the Modified Ultrafast Papanicolaou (MUFP) stain was introduced subsequently. Aim: To assess the efficacy of MUFP staining by comparing the quality of the MUFP stain with the conventional PAP stain. Materials and Methods: The FNAC procedure was performed using a 10 ml disposable syringe and a 22-23 G needle. A total of 131 FNAC cases were studied: lymph node (30), thyroid (38), breast (22), skin and soft tissue (24), salivary gland (11) and visceral organs (6). Two smears were prepared per case and stained by MUFP and conventional PAP. Scores were given on four parameters: background of smears, overall staining pattern, cell morphology and nuclear staining. A Quality Index (QI) was calculated as the ratio of the total score achieved to the maximum score possible. The chi-square test was applied to each of the four parameters before obtaining the QI for both stains, and Student's t-test was applied to evaluate the efficacy of MUFP in comparison with the conventional PAP stain. Results: The QI of MUFP for thyroid, breast, lymph node, skin and soft tissue, salivary gland and visceral organs was 0.89, 0.85, 0.89, 0.83, 0.92, and 0.78, respectively. Compared with the conventional PAP stain, the QI of MUFP smears was better in all except visceral organ cases, and the difference was statistically significant. MUFP showed a clear red blood cell background, transparent cytoplasm and crisp nuclear features.
Conclusion: MUFP is fast, reliable, and can be performed with locally available reagents while yielding unequivocal morphology, which is the need of the hour for a cytopathology set-up. PMID:28511391
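The Quality Index and the paired comparison described above are simple to reproduce. A minimal sketch (the per-parameter maximum score of 3 is an assumption for illustration; the abstract does not state the scoring scale):

```python
import numpy as np

def quality_index(scores, max_per_param=3):
    """Quality Index: total score achieved over maximum score possible.

    scores: array of shape (n_cases, 4) holding per-case scores for the four
    parameters (background, staining pattern, cell morphology, nuclear
    staining). max_per_param (assumed) is the maximum score per parameter.
    """
    scores = np.asarray(scores, dtype=float)
    return scores.sum() / (max_per_param * scores.size)

def paired_t_statistic(a, b):
    """Paired t statistic for comparing MUFP vs PAP scores case by case."""
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    return d.mean() / (d.std(ddof=1) / np.sqrt(d.size))
```

The t statistic would then be compared against the t distribution with n - 1 degrees of freedom to judge significance.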
Development and application of a statistical quality assessment method for dense-graded mixes.
DOT National Transportation Integrated Search
2004-08-01
This report describes the development of the statistical quality assessment method and the procedure for mapping the measures obtained from the quality assessment method to a composite pay factor. The application to dense-graded mixes is demonstrated...
Using the Benford's Law as a First Step to Assess the Quality of the Cancer Registry Data.
Crocetti, Emanuele; Randi, Giorgia
2016-01-01
Benford's law states that the distribution of the first digit different from 0 [the first significant digit (FSD)] in many collections of numbers is not uniform. The aim of this study is to evaluate whether population-based cancer incidence rates follow Benford's law, and whether this can be used in their data-quality check process. We sampled 43 population-based cancer registry populations (CRPs) from Cancer Incidence in Five Continents, Volume X (CI5-X). The distribution of the cancer incidence rate FSD was evaluated overall, by sex, and by CRP. Several statistics, including Pearson's coefficient of correlation and distance measures, were applied to check adherence to Benford's law. In the whole dataset (146,590 incidence rates) and for each sex (70,722 male and 75,868 female incidence rates), the FSD distributions were Benford-like. The coefficient of correlation between observed and expected FSD distributions was extremely high (0.999), and the distance measures were low. Considering single CRPs (from 933 to 7,222 incidence rates), the results agreed with Benford's law, and only a few CRPs showed possible discrepancies from it. This study demonstrated for the first time that cancer incidence rates follow Benford's law. This characteristic can be used as a new, simple, and objective tool in data-quality evaluation. The analysed data had already been checked for publication in CI5-X, so their quality was expected to be good; indeed, only for a few CRPs were several statistics consistent with possible violations.
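The FSD check described above is straightforward to reproduce: extract the first significant digit of each rate, tabulate frequencies, and compare them with the Benford expectation log10(1 + 1/d). A minimal sketch (not the authors' code):

```python
import numpy as np

def first_significant_digit(values):
    """First digit different from 0, extracted via scientific-notation
    formatting (robust to floating-point edge cases such as 0.3)."""
    return [int(f"{abs(v):e}"[0]) for v in values if v != 0]

def benford_check(rates):
    """Observed vs expected FSD frequencies and their Pearson correlation."""
    digits = np.array(first_significant_digit(rates))
    observed = np.array([(digits == d).mean() for d in range(1, 10)])
    expected = np.log10(1 + 1 / np.arange(1, 10))   # Benford proportions
    return observed, expected, np.corrcoef(observed, expected)[0, 1]
```

A correlation near 1 (the paper reports 0.999 overall), together with small distance measures, indicates Benford-like behaviour; a registry whose digits deviate markedly would be flagged for closer inspection.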
Generalized watermarking attack based on watermark estimation and perceptual remodulation
NASA Astrophysics Data System (ADS)
Voloshynovskiy, Sviatoslav V.; Pereira, Shelby; Herrigel, Alexander; Baumgartner, Nazanin; Pun, Thierry
2000-05-01
Digital image watermarking has become a popular technique for authentication and copyright protection. To verify the security and robustness of watermarking algorithms, specific attacks have to be applied to test them. In contrast to the known Stirmark attack, which degrades the quality of the image while destroying the watermark, this paper presents a new approach based on the estimation of the watermark and the exploitation of the properties of the Human Visual System (HVS). The new attack satisfies two important requirements. First, the image quality after the attack, as perceived by the HVS, is not worse than the quality of the stego image. Second, the attack uses all available prior information about the watermark and cover-image statistics to perform the best watermark removal or damage. The proposed attack is based on a stochastic formulation of the watermark removal problem, considering the embedded watermark as additive noise with some probability distribution. The attack scheme consists of two main stages: (1) watermark estimation and partial removal by filtering based on a Maximum a Posteriori (MAP) approach; (2) watermark alteration and hiding through addition of noise to the filtered image, taking into account the statistics of the embedded watermark and exploiting HVS characteristics. Experiments on a number of real-world and computer-generated images show the high efficiency of the proposed attack against known academic and commercial methods: the watermark is completely destroyed in all tested images without altering the image quality. The approach can be used against watermark embedding schemes that operate either in the coordinate (spatial) domain or in transform domains such as Fourier, DCT or wavelet.
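The two-stage structure of the attack can be illustrated with heavily simplified stand-ins: a box filter replaces the MAP denoiser, and white noise scaled to the energy of the estimated watermark replaces the HVS-shaped remodulation. This is a conceptual sketch only, not the paper's algorithm:

```python
import numpy as np

def local_mean(img, k=3):
    """k x k box filter used as a crude denoiser (watermark estimator)."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def remodulation_attack(stego, strength=1.0, rng=None):
    """Estimate-and-remodulate sketch:
    1) estimate the watermark as stego minus a denoised version
       (the box filter stands in for the paper's MAP filter);
    2) subtract the estimate, then add noise with the same energy as
       the estimate (standing in for HVS-masked remodulation)."""
    rng = np.random.default_rng(rng)
    w_est = stego - local_mean(stego)          # additive-noise watermark estimate
    noise = rng.standard_normal(stego.shape) * w_est.std()
    return stego - strength * w_est + strength * noise
```

In the actual attack the filtering is derived from a stochastic image model and the re-added noise is shaped by perceptual masks, so the visible quality is preserved while the detector's correlation with the true watermark collapses.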
NASA Astrophysics Data System (ADS)
Fernandez, Carlos; Platero, Carlos; Campoy, Pascual; Aracil, Rafael
1994-11-01
This paper describes some texture-based techniques that can be applied to quality assessment of continuously produced flat products (metal strips, wooden surfaces, cork, textile products, ...). Since the most difficult task is inspecting for product appearance, human-like inspection ability is required. A feature common to all these products is the presence of non-deterministic texture on their surfaces. Two main subjects are discussed: statistical techniques for both surface-finish determination and surface-defect analysis, and real-time implementation for on-line inspection in high-speed applications. For surface-finish determination, a Gray Level Difference technique is presented that operates on low-resolution (non-zoomed) images. Defect analysis is performed by means of statistical texture analysis over defective portions of the surface. On-line implementation is accomplished by means of neural networks. When a defect arises, textural analysis is applied, which results in a data vector acting as the input of a neural net previously trained in a supervised way. This approach aims at on-line performance in automated visual inspection applications when texture is present on flat product surfaces.
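The Gray Level Difference technique mentioned above histograms the absolute intensity differences between pixel pairs at a fixed displacement and derives texture statistics from that histogram. A minimal sketch for non-negative displacements (the feature names follow common GLD usage; the displacement choice in any application is an assumption):

```python
import numpy as np

def gld_features(img, dx=1, dy=0, levels=256):
    """Gray Level Difference statistics for one displacement (dx, dy >= 0)."""
    img = np.asarray(img, dtype=int)
    a = img[: img.shape[0] - dy, : img.shape[1] - dx]
    b = img[dy:, dx:]
    diff = np.abs(a - b).ravel()
    hist = np.bincount(diff, minlength=levels).astype(float)
    p = hist / hist.sum()                      # difference histogram -> pmf
    nz = p[p > 0]
    d = np.arange(levels)
    return {
        "mean": (d * p).sum(),                 # average local contrast
        "contrast": (d ** 2 * p).sum(),        # second moment of differences
        "asm": (p ** 2).sum(),                 # angular second moment (uniformity)
        "entropy": -(nz * np.log2(nz)).sum(),  # randomness of the texture
    }
```

A smooth surface finish concentrates the difference histogram near zero (low contrast, high uniformity), while a defective region spreads it out; feature vectors like this one could feed the supervised neural net the paper describes.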
Optimization of Premix Powders for Tableting Use.
Todo, Hiroaki; Sato, Kazuki; Takayama, Kozo; Sugibayashi, Kenji
2018-05-08
Direct compression is a popular choice because it provides the simplest way to prepare a tablet, and it can easily be adopted when the active pharmaceutical ingredient (API) is unstable in water or under thermal drying. An optimal formulation of preliminarily mixed powders (premix powders), prepared in advance for tableting use, is therefore beneficial. The aim of this study was to find the optimal formulation of premix powders composed of lactose (LAC), cornstarch (CS), and microcrystalline cellulose (MCC) using statistical techniques. Based on the "Quality by Design" concept, a (3,3)-simplex lattice design consisting of the three components LAC, CS, and MCC was employed to prepare the model premix powders. A response surface method incorporating thin-plate spline interpolation (RSM-S) was applied to estimate the optimum premix powders for tableting use. The effect of tablet shape, characterized by the surface curvature, on the optimization was investigated. The optimum premix powder was effective when applied with a small quantity of API, although its function was limited for formulations containing a large amount of API. Statistical techniques are valuable for exploiting new functions of well-known materials such as LAC, CS, and MCC.
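A (3,3)-simplex lattice design enumerates every blend whose three component proportions take the values 0, 1/3, 2/3, 1 and sum to one, giving ten candidate premix blends. A sketch of the enumeration (not the authors' software):

```python
from fractions import Fraction
from itertools import product

def simplex_lattice(q=3, m=3):
    """All blend points of a (q, m)-simplex lattice design: each of the q
    proportions takes values 0, 1/m, ..., 1, and the proportions sum to 1."""
    levels = [Fraction(i, m) for i in range(m + 1)]
    return [pt for pt in product(levels, repeat=q) if sum(pt) == 1]

# Ten (LAC, CS, MCC) blends for the {3,3} design, exact fractions throughout
points = simplex_lattice()
```

Responses measured at these ten blends are what the RSM-S surface is then fitted over; exact `Fraction` arithmetic avoids the floating-point near-misses that `sum(pt) == 1` would otherwise suffer.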
Two statistical approaches, weighted regression on time, discharge, and season and generalized additive models, have recently been used to evaluate water quality trends in estuaries. Both models have been used in similar contexts despite differences in statistical foundations and...
Holman, Benjamin W B; Coombs, Cassius E O; Morris, Stephen; Kerr, Matthew J; Hopkins, David L
2017-11-01
Beef loins (LL) stored under different chilled-then-frozen storage combinations (up to 5 and 52 weeks, respectively) and two frozen holding temperatures were evaluated for microbial load and meat quality parameters. We found holding temperature effects to be negligible, which suggests that -12°C could deliver LL of comparable quality to -18°C across these storage periods. Meat quality parameters varied significantly, but compared with existing consumer thresholds these differences may not be perceptible; colour was the exception, becoming unacceptable earlier in retail display when either the chilled or the subsequent frozen storage period was increased. Detection of key spoilage microbes was insufficient to allow statistical analysis, potentially because of the hygienic and commercially representative LL source, although variations in water activity, glycogen content, pH and other moisture parameters conducive to microbial proliferation were influenced by chilled-then-frozen storage. These outcomes could be applied to defining storage thresholds that assure beef quality within export networks, leveraging market access, and improving product management. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
Locks, R B; Dos Santos, K; da Silva, J
2017-04-01
The aim of this study was to determine whether there were differences in health-related quality of life between patients with allergic rhinitis treated with bilastine 20 mg and those treated with loratadine 10 mg. This was a prospective randomised double-blinded study conducted in otolaryngology outpatient clinics in Criciúma, state of Santa Catarina, Brazil. Seventy-three patients aged between 18 and 63 years participated; 36 were treated with loratadine 10 mg and 37 with bilastine 20 mg, with medication administered once a day for 10 days. The outcome was quality of life as assessed by the modified Rhinoconjunctivitis Quality of Life Questionnaire (RQLQm), which was applied at baseline and after 10 days of treatment. The use of bilastine 20 mg or loratadine 10 mg significantly reduced RQLQm scores after 10 days of treatment (P < 0.001); however, there was no statistically significant difference between the two treatment groups (P > 0.05). Health-related quality of life in patients with allergic rhinitis improved significantly after 10 days of treatment with loratadine and bilastine, and the effectiveness of both was equivalent. © 2016 John Wiley & Sons Ltd.
Does playing blindfold chess reduce the quality of game: comments on Chabris and Hearst (2003).
Jeremic, Veljko; Vukmirovic, Dragan; Radojicic, Zoran
2010-01-01
Blindfold chess is a special type of chess game in which neither the board nor the pieces are visible to the players. This paper aims to determine whether the quality of a game played blindfolded is lower than that of a game played under normal conditions. The best available chess program was used to analyze games played by the world's top Grandmasters under both conditions. We analyzed the Monaco 1993-1998 data set introduced by Chabris and Hearst (2003). The results showed that although a larger number of mistakes occurred while playing blindfolded, no statistically significant difference between the rapid and blindfold games was found. Nevertheless, applying the same methodology to the Monaco 2002-2007 data set revealed a substantial difference between the blindfold and rapid games. In this paper, we address the possible improvement in chess game quality and the advances in chess programs that may be responsible for detecting more blunders. Copyright © 2009 Cognitive Science Society, Inc.
Approach to design space from retrospective quality data.
Puñal Peces, Daniel; García-Montoya, Encarna; Manich, Albert; Suñé-Negre, Josep Maria; Pérez-Lozano, Pilar; Miñarro, Montse; Ticó, Josep Ramon
2016-01-01
Nowadays, the entire manufacturing process is based on the current GMPs, which emphasize the reproducibility of the process, and companies hold large amounts of recorded data about their processes. This work establishes the design space (DS) from retrospective data for a wet compression process. A design of experiments (DoE) with historical data from 4 years of industrial production was carried out, using as experimental factors the results of the prior risk analysis and eight key parameters (quality specifications) that encompassed process and quality-control data. The software Statgraphics 5.0 was applied, and the data were processed to obtain eight DS as well as their safe and working ranges. Experience shows that it is possible to determine the DS retrospectively, the greatest difficulty being the handling and processing of large amounts of data; however, the practicality of this approach is very attractive, since it yields the DS with minimal investment in experiments because actual production batch data are processed statistically.
2011-01-01
Background Psychometric properties include validity, reliability and sensitivity to change. Establishing the psychometric properties of an instrument that measures three-dimensional human posture is essential prior to applying it in clinical practice or research. Methods This paper reports the findings of a systematic literature review which aimed to 1) identify non-invasive three-dimensional (3D) human posture-measuring instruments; and 2) assess the quality of reporting of the methodological procedures undertaken to establish their psychometric properties, using a purpose-built critical appraisal tool. Results Seventeen instruments were identified, of which nine were supported by research into psychometric properties. Eleven and six papers, respectively, reported on validity and reliability testing. Rater qualification and reference standards were generally poorly addressed, and there was variable-quality reporting of rater blinding and statistical analysis. Conclusions There is a lack of current research to establish the psychometric properties of non-invasive 3D human posture-measuring instruments. PMID:21569486
Mueller, David S.
2016-05-12
The software program QRev computes the discharge from moving-boat acoustic Doppler current profiler (ADCP) measurements using data collected with any of the Teledyne RD Instruments or SonTek bottom-tracking ADCPs. The computation of discharge is independent of the manufacturer of the ADCP because QRev applies consistent algorithms regardless of the data source. In addition, QRev automates filtering and quality checking of the collected data and provides feedback to the user on potential quality issues with the measurement. Various statistics and characteristics of the measurement, in addition to a simple uncertainty assessment, are provided to assist the user in properly rating the measurement. QRev saves an extensible markup language (XML) file that can be imported into databases or electronic field-notes software. The user interacts with QRev through a tablet-friendly graphical user interface. This report is the manual for version 2.8 of QRev.
Weighted analysis of paired microarray experiments.
Kristiansson, Erik; Sjögren, Anders; Rudemo, Mats; Nerman, Olle
2005-01-01
In microarray experiments quality often varies, for example between samples and between arrays, so the need for quality control is strong. A statistical model and a corresponding analysis method are suggested for experiments with pairing, including designs with individuals observed before and after treatment and many experiments with two-colour spotted arrays. The model is of mixed type, with some parameters estimated by an empirical Bayes method. Differences in quality are modelled by individual variances and correlations between repetitions. The method is applied to three real and several simulated datasets. Two of the real datasets are of Affymetrix type with patients profiled before and after treatment, and the third dataset is of two-colour spotted cDNA type. In all cases, the patients or arrays had different estimated variances, leading to distinctly unequal weights in the analysis. We also suggest plots which illustrate the variances and correlations that affect the weights computed by our analysis method. For simulated data the improvement relative to previously published methods without weighting is shown to be substantial.
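The weighting idea can be illustrated with a crude moment-based stand-in for the paper's empirical-Bayes fit: estimate a variance per patient (array) from its own paired differences and weight patients inversely, so low-quality samples contribute less. The function name and data layout are hypothetical:

```python
import numpy as np

def weighted_paired_scores(diffs):
    """diffs: genes x patients matrix of paired log-ratios (after - before).

    Each patient column receives an inverse-variance weight estimated from
    the spread of its own differences, so noisy (low-quality) patients are
    down-weighted in the per-gene effect estimate. This simple moment
    estimator stands in for the empirical-Bayes fit used in the paper.
    """
    diffs = np.asarray(diffs, dtype=float)
    col_var = diffs.var(axis=0, ddof=1)     # per-patient variance estimate
    w = 1.0 / col_var
    w /= w.sum()                            # normalised weights
    return diffs @ w, w                     # per-gene weighted effect, weights
```

With equal-quality patients this reduces to the ordinary mean of the paired differences; a patient with twice the variance receives half the weight.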
Multilingual Twitter Sentiment Classification: The Role of Human Annotators
Mozetič, Igor; Grčar, Miha; Smailović, Jasmina
2016-01-01
What are the limits of automated Twitter sentiment classification? We analyze a large set of manually labeled tweets in different languages, use them as training data, and construct automated classification models. It turns out that the quality of classification models depends much more on the quality and size of training data than on the type of the model trained. Experimental results indicate that there is no statistically significant difference between the performance of the top classification models. We quantify the quality of training data by applying various annotator agreement measures, and identify the weakest points of different datasets. We show that the model performance approaches the inter-annotator agreement when the size of the training set is sufficiently large. However, it is crucial to regularly monitor the self- and inter-annotator agreements since this improves the training datasets and consequently the model performance. Finally, we show that there is strong evidence that humans perceive the sentiment classes (negative, neutral, and positive) as ordered. PMID:27149621
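One common annotator-agreement measure of the kind applied above is Cohen's kappa, which corrects observed agreement for agreement expected by chance. A minimal sketch for two annotators over the same tweets (the paper quantifies agreement with several measures; kappa here is illustrative):

```python
import numpy as np

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators assigning nominal labels
    (e.g. negative / neutral / positive) to the same items."""
    a = np.asarray(labels_a)
    b = np.asarray(labels_b)
    cats = np.unique(np.concatenate([a, b]))
    po = (a == b).mean()                                        # observed agreement
    pe = sum((a == c).mean() * (b == c).mean() for c in cats)   # chance agreement
    return (po - pe) / (1 - pe)
```

Kappa is 1 for perfect agreement and about 0 for chance-level agreement; in the setting above, a trained model whose kappa against human labels approaches the human inter-annotator kappa has reached the practical ceiling set by the training data.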
NASA Astrophysics Data System (ADS)
Awi; Ahmar, A. S.; Rahman, A.; Minggi, I.; Mulbar, U.; Asdar; Ruslan; Upu, H.; Alimuddin; Hamda; Rosidah; Sutamrin; Tiro, M. A.; Rusli
2018-01-01
This research aims to profile the level of creativity and the statistical problem-posing ability of students in the 2014 batch of Mathematics Education at the State University of Makassar in terms of their cognitive style. The research uses an explorative qualitative method, providing meta-cognitive scaffolding during the study. The research hypothesis is that students with a field independent (FI) cognitive style, when posing statistics problems from the provided information, are already able to propose solvable statistical problems that introduce new data, and that such problems qualify as high-quality statistical problems, whereas students with a field dependent (FD) cognitive style are generally still limited to posing solvable statistics problems that do not introduce new data, which qualify as medium-quality statistical problems.
NASA Astrophysics Data System (ADS)
Bonfante, Antonello; Gambuti, Angelita; Monaco, Eugenia; Langella, Giuliano; Manna, Piero; Orefice, Nadia; Albrizio, Rossella; Basile, Angelo; Terribile, Fabio
2016-04-01
Water deficits limit yields, and this is one of the negative aspects of climate change. However, this applies mainly when the emphasis is on biomass production (e.g., for crops like maize and wheat), not for plants where quality, rather than quantity, is most relevant. For example, water stress occurring during specific phenological phases of grapevine development is an important factor in producing good-quality wines: it induces, among other things, the production of anthocyanins and aroma precursors. Water stress due to future increases in temperature and decreases in rainfall under climate change can therefore represent an opportunity to increase winegrowers' incomes. This study was carried out in the Campania region (southern Italy), an area well known for high-quality wine production. Growth of the Aglianico grapevine cultivar, with a standard clone population on 1103 Paulsen rootstocks, was studied on two different soil types, Calcisols and Cambisols, occurring along a slope 90 m long with an 11% gradient. The agro-hydrological model SWAP was calibrated and applied to estimate soil-plant water status at the various crop phenological phases for three vintages (2011-2013). The crop water stress index (CWSI) estimated by the model was then related to physiological measurements (e.g., leaf water potential), grape bunch measurements (e.g., sugar content) and wine quality (e.g., tannins). For both soils, the correlations between measurements and CWSI were high (e.g., -0.97** with sugar; 0.895* with anthocyanins in the skins). Next, the model was applied to future climate conditions (2021-2051) obtained from statistical downscaling of global circulation models (AOGCMs) in order to estimate the effect of climate on CWSI and hence on vine quality. Results show that the effects of climate change on grape and wine quality are not expected to be significant for this grape variety when grown on these Calcisols and Cambisols.
However, significant differences are found between the two soils in terms of ultra, standard and low quality grapes, which confirms the reliability of the terroir concept for the Calcisol. CWSI values above 15 for the Calcisol indicate the potential benefits of drip irrigation, which is, however, not allowed under current regulations.
Erhart, M; Wetzel, R; Krügel, A; Ravens-Sieberer, U
2005-12-01
Within a comprehensive comparison of telephone and postal survey methods, the SF-8 was applied to assess adults' health-related quality of life. The 1690 subjects were randomly assigned to a telephone survey or a postal survey. Comparisons across the modes of administration addressed the response rates; the central tendency, deviation, and ceiling and floor effects observed in the SF-8 scores; and the inter-item correlations. The importance of age and gender as moderating factors was investigated. Results indicate no or only small statistically significant differences in the responses to the SF-8 depending on the actual mode of administration and the health aspect questioned. It was concluded that further investigations should focus on the exact nature of these deviations and try to generate correction factors.
Application of Taguchi methods to infrared window design
NASA Astrophysics Data System (ADS)
Osmer, Kurt A.; Pruszynski, Charles J.
1990-10-01
Dr. Genichi Taguchi, a prominent quality consultant, reduced a branch of statistics known as "Design of Experiments" to a cookbook methodology that can be employed by any competent engineer. This technique has been employed extensively by Japanese manufacturers and is widely credited with helping them attain their current level of success in low-cost, high-quality product design and fabrication. Although the technique was originally put forth as a tool to streamline the determination of improved production processes, it can also be applied to a wide range of engineering problems. As part of an internal research project, this method of experimental design has been adapted to window trade studies and materials research. Two of these analyses are presented herein, chosen to illustrate the breadth of applications to which the Taguchi method can be applied.
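As a concrete illustration of the "cookbook" flavour of the method, the sketch below builds the standard L4 orthogonal array (4 runs covering 3 two-level factors) and Taguchi's "larger is better" signal-to-noise ratio; the response values and factor assignments are hypothetical, not taken from the paper's window trade studies:

```python
import numpy as np

# L4 (2^3) orthogonal array: every pair of columns contains all four
# level combinations exactly once, so main effects are estimable from 4 runs.
L4 = np.array([
    [0, 0, 0],
    [0, 1, 1],
    [1, 0, 1],
    [1, 1, 0],
])

def sn_larger_is_better(y):
    """Taguchi signal-to-noise ratio for a 'larger is better' response."""
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(1.0 / y ** 2))

def main_effects(results):
    """Average S/N at each level of each factor (the Taguchi effect table).

    results: one list of repeated response measurements per L4 run."""
    sn = np.array([sn_larger_is_better(r) for r in results])
    return np.array([[sn[L4[:, f] == lv].mean() for lv in (0, 1)]
                     for f in range(L4.shape[1])])
```

The recommended setting for each factor is the level with the higher average S/N; larger arrays (L8, L9, L18, ...) extend the same recipe to more factors and levels.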
Current application of chemometrics in traditional Chinese herbal medicine research.
Huang, Yipeng; Wu, Zhenwei; Su, Rihui; Ruan, Guihua; Du, Fuyou; Li, Gongke
2016-07-15
Traditional Chinese herbal medicines (TCHMs) are a promising approach for the treatment of various diseases and have attracted increasing attention all over the world. Chemometric methods in the quality control of TCHMs are useful tools that harness mathematics, statistics and other methods to extract the maximum information from data obtained by various analytical approaches. This feature article focuses on recent studies that evaluate the pharmacological efficacy and quality of TCHMs by determining, identifying and discriminating the bioactive or marker components in different samples with the help of chemometric techniques. The application of chemometric techniques to the classification of TCHMs based on their efficacy and usage is introduced, and recent advances in chemometrics applied to the chemical analysis of TCHMs are reviewed in detail. Copyright © 2015 Elsevier B.V. All rights reserved.
Principles of continuous quality improvement applied to intravenous therapy.
Dunavin, M K; Lane, C; Parker, P E
1994-01-01
Documentation of the application of the principles of continuous quality improvement (CQI) to the health care setting is crucial for understanding the transition from traditional management models to CQI models. A CQI project was designed and implemented by the IV Therapy Department at Lawrence Memorial Hospital to test the application of these principles to intravenous therapy and as a learning tool for the entire organization. Through a prototype inventory project, significant savings in cost and time were demonstrated using check sheets, flow diagrams, control charts, and other statistical tools, together with the Plan-Do-Check-Act cycle. As a result, a primary goal, increased time for direct patient care, was achieved: eight hours per week of nursing time were saved, relationships between two work areas were improved, and $6,000 in personnel costs, storage space, and inventory was saved.
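Control-chart limits of the kind the department plotted can be computed in a few lines. A minimal X-bar/R sketch (the subgroup size of 5 and the tabulated SPC constants A2, D3, D4 are illustrative choices, not details from the paper):

```python
import numpy as np

def xbar_r_limits(subgroups):
    """X-bar and R control-chart limits for rational subgroups of size 5.

    A2, D3, D4 are the standard SPC constants for subgroup size n = 5;
    other sizes require the corresponding table entries.
    """
    A2, D3, D4 = 0.577, 0.0, 2.114
    sub = np.asarray(subgroups, dtype=float)
    xbar = sub.mean(axis=1)                   # subgroup means
    r = sub.max(axis=1) - sub.min(axis=1)     # subgroup ranges
    xbarbar, rbar = xbar.mean(), r.mean()     # grand mean and mean range
    return {
        "xbar": (xbarbar - A2 * rbar, xbarbar, xbarbar + A2 * rbar),  # LCL, CL, UCL
        "range": (D3 * rbar, rbar, D4 * rbar),
    }
```

Points falling outside these limits signal special-cause variation, which is exactly the trigger for the "Check" step of the Plan-Do-Check-Act cycle.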
PV cells electrical parameters measurement
NASA Astrophysics Data System (ADS)
Cibira, Gabriel
2017-12-01
When measuring the optical parameters of a photovoltaic silicon cell, precise results enable good estimation of the electrical parameters through well-known physical-mathematical models. Nevertheless, considerable recombination phenomena may occur in both surface and intrinsic thin layers of novel materials. Moreover, rear-contact surface parameters may also influence recombination phenomena in nearby regions. Therefore, the only precise approach is to verify the assumed cell electrical parameters by direct electrical measurement. Taking a theoretical approach supported by experiments, this paper analyses, as a case study, problems in the measurement procedures and equipment used to acquire the electrical parameters of a photovoltaic silicon cell. A statistical appraisal of measurement quality is also contributed.
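The "well-known physical-mathematical models" referred to above are typically variants of the single-diode equation. A minimal sketch of the ideal form, with series and shunt resistances omitted for brevity; any parameter values used with it would be assumptions to be verified by measurement, which is the paper's point:

```python
import math

def diode_current(v, i_ph, i_0, n=1.2, t=298.15):
    """Ideal single-diode PV cell model: I = Iph - I0 * (exp(V / (n*Vt)) - 1).

    v    : terminal voltage [V]
    i_ph : photo-generated current [A]
    i_0  : diode saturation current [A]
    n    : ideality factor (recombination raises it above 1)
    t    : cell temperature [K]
    """
    vt = 1.380649e-23 * t / 1.602176634e-19   # thermal voltage kT/q
    return i_ph - i_0 * (math.exp(v / (n * vt)) - 1.0)
```

Fitting measured I-V points to this model (or its series/shunt-resistance extensions) yields the electrical parameters whose measurement uncertainty the paper appraises statistically.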
Documentation of the U.S. Geological Survey Oceanographic Time-Series Measurement Database
Montgomery, Ellyn T.; Martini, Marinna A.; Lightsom, Frances L.; Butman, Bradford
2008-01-02
This report describes the instrumentation and platforms used to make the measurements; the methods used to process the data, apply quality-control criteria, and archive them; the data storage format; and how the data are released and distributed. The report also includes instructions on how to access the data from the online database at http://stellwagen.er.usgs.gov/. As of 2016, the database contains about 5,000 files, which may include observations of current velocity, wave statistics, ocean temperature, conductivity, pressure, and light transmission at one or more depths over some duration of time.
DOE Office of Scientific and Technical Information (OSTI.GOV)
KL Gaustad; DD Turner
2007-09-30
This report provides a short description of the Atmospheric Radiation Measurement (ARM) microwave radiometer (MWR) RETrieval (MWRRET) Value-Added Product (VAP) algorithm. This algorithm utilizes complementary physical and statistical retrieval methods and applies brightness temperature offsets to reduce spurious liquid water path (LWP) bias in clear skies, resulting in significantly improved precipitable water vapor (PWV) and LWP retrievals. We present a general overview of the technique, its input parameters and output products, and describe the data-quality checks. A more complete discussion of the theory and results is given in Turner et al. (2007b).
VizieR Online Data Catalog: Fundamental parameters of Kepler stars (Silva Aguirre+, 2015)
NASA Astrophysics Data System (ADS)
Silva Aguirre, V.; Davies, G. R.; Basu, S.; Christensen-Dalsgaard, J.; Creevey, O.; Metcalfe, T. S.; Bedding, T. R.; Casagrande, L.; Handberg, R.; Lund, M. N.; Nissen, P. E.; Chaplin, W. J.; Huber, D.; Serenelli, A. M.; Stello, D.; van Eylen, V.; Campante, T. L.; Elsworth, Y.; Gilliland, R. L.; Hekker, S.; Karoff, C.; Kawaler, S. D.; Kjeldsen, H.; Lundkvist, M. S.
2016-02-01
Our sample has been extracted from the 77 exoplanet host stars presented in Huber et al. (2013, Cat. J/ApJ/767/127). We have made use of the full time-base of observations from the Kepler satellite to uniformly determine precise fundamental stellar parameters, including ages, for a sample of exoplanet host stars where high-quality asteroseismic data were available. We devised a Bayesian procedure flexible in its input and applied it to different grids of models to study systematics from input physics and extract statistically robust properties for all stars. (4 data files).
Funderburk, F R; Pathak, D S; Pleil, A M
1998-01-01
Quality of life is a fascinating field to researchers and practitioners alike. To some researchers, quality of life is of interest because it offers untold challenges in constructing instruments and capturing data necessary to answer key questions about health, disease, and treatment. For such researchers, quality of life is about statistical relationships among questions and about using questions to define the physical, social, and emotional domains of health. To other researchers, this field is about finding practical applications in policy and treatment decision making for the information provided by quality of life assessments. To these researchers, the focus of quality of life is on ways to apply knowledge of quality of life differences between groups with and without specific diseases or ways to use knowledge about how treatments affect the quality of life of various patient populations. To practitioners, quality of life is about treatment outcomes that impact individual patients' daily lives. It is the practitioner that Funderburk, Pleil, and Pathak are considering in their paper in this issue of Pharmacy Practice Management Quarterly. These authors give several important messages to practitioners seeking to serve their patients by incorporating quality of life into their practices. The key message in the paper is that to better understand and determine the impact of treatment on a patient's quality of life, it is critical to start with a baseline or reference point relevant to that patient. From that baseline or reference point, treatment decisions can be made and progress, in quality of life terms, can be evaluated. Critical questions in their framework, which is called the IN*COMPASS (Individualized Client Oriented Method for Preferred Alleviation of Sickness States) Approach, are "How are you now?" and "How would you like to be?" 
The authors do not endorse particular quality of life tools in their approach; rather they prescribe certain critical questions that must be answered if information captured by any quality of life tool is to be useful at the patient level. Readers should not be put off by the fancy acronym used in this paper; nor must readers be keen students of quality of life to appreciate its message. The IN*COMPASS approach is fundamental to good patient care and can be applied by practitioners with any level of understanding of and appreciation for quality of life assessments.
2013-01-01
Background Most of the institutional and research information in the biomedical domain is available in the form of English text. Even in countries where English is an official language, such as the United States, language can be a barrier for accessing biomedical information for non-native speakers. Recent progress in machine translation suggests that this technique could help make English texts accessible to speakers of other languages. However, the lack of adequate specialized corpora needed to train statistical models currently limits the quality of automatic translations in the biomedical domain. Results We show how a large-sized parallel corpus can automatically be obtained for the biomedical domain, using the MEDLINE database. The corpus generated in this work comprises article titles obtained from MEDLINE and abstract text automatically retrieved from journal websites, which substantially extends the corpora used in previous work. After assessing the quality of the corpus for two language pairs (English/French and English/Spanish) we use the Moses package to train a statistical machine translation model that outperforms previous models for automatic translation of biomedical text. Conclusions We have built translation data sets in the biomedical domain that can easily be extended to other languages available in MEDLINE. These sets can successfully be applied to train statistical machine translation models. While further progress should be made by incorporating out-of-domain corpora and domain-specific lexicons, we believe that this work improves the automatic translation of biomedical texts. PMID:23631733
Karimi, Mohammad H; Asemani, Davud
2014-05-01
Ceramic and tile industries should indispensably include a grading stage to quantify the quality of products. In practice, grading is often performed by human inspectors, so an automatic grading system is essential to enhance the quality control and marketing of the products. Since there generally exist six different types of defects, originating from various stages of tile manufacturing lines and having distinct textures and morphologies, many image processing techniques have been proposed for defect detection. In this paper, a survey has been made of the pattern recognition and image processing algorithms that have been used to detect surface defects. Each method appears to be limited to detecting some subgroup of defects. The detection techniques may be divided into three main groups: statistical pattern recognition, feature vector extraction, and texture/image classification. Methods such as wavelet transform, filtering, morphology, and contourlet transform are more effective for pre-processing tasks. Others, including statistical methods, neural networks, and model-based algorithms, can be applied to extract the surface defects. Statistical methods are often appropriate for identification of large defects such as Spots, whereas techniques such as wavelet processing provide an acceptable response for detection of small defects such as Pinhole. A thorough survey is made in this paper of the existing algorithms in each subgroup. Also, the evaluation parameters are discussed, including supervised and unsupervised parameters. Using various performance parameters, different defect detection algorithms are compared and evaluated. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
Methods for processing microarray data.
Ares, Manuel
2014-02-01
Quality control must be maintained at every step of a microarray experiment, from RNA isolation through statistical evaluation. Here we provide suggestions for analyzing microarray data. Because the utility of the results depends directly on the design of the experiment, the first critical step is to ensure that the experiment can be properly analyzed and interpreted. What is the biological question? What is the best way to perform the experiment? How many replicates will be required to obtain the desired statistical resolution? Next, the samples must be prepared, pass quality controls for integrity and representation, and be hybridized and scanned. Also, slides with defects, missing data, high background, or weak signal must be rejected. Data from individual slides must be normalized and combined so that the data are as free of systematic bias as possible. The third phase is to apply statistical filters and tests to the data to determine genes (1) expressed above background, (2) whose expression level changes in different samples, and (3) whose RNA-processing patterns or protein associations change. Next, a subset of the data should be validated by an alternative method, such as reverse transcription-polymerase chain reaction (RT-PCR). Provided that this endorses the general conclusions of the array analysis, gene sets whose expression, splicing, polyadenylation, protein binding, etc. change in different samples can be classified with respect to function, sequence motif properties, as well as other categories to extract hypotheses for their biological roles and regulatory logic.
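The "statistical filters and tests" phase described above can be illustrated with a minimal sketch: a per-gene Welch t statistic used to flag genes whose expression level changes between samples. The intensities and cutoff below are hypothetical, and a real analysis would convert the statistic to a p-value and correct for multiple testing:

```python
from statistics import mean, variance

def t_statistic(a, b):
    """Welch t statistic comparing two sets of replicate log2 intensities."""
    se = (variance(a) / len(a) + variance(b) / len(b)) ** 0.5
    return (mean(a) - mean(b)) / se

# Hypothetical log2 intensities: three control vs. three treated replicates.
genes = {
    "geneA": ([8.1, 8.3, 8.0], [10.2, 10.5, 10.1]),  # clearly induced
    "geneB": ([6.0, 6.4, 6.2], [6.1, 6.3, 6.2]),     # unchanged
}
changed = [g for g, (ctl, trt) in genes.items()
           if abs(t_statistic(ctl, trt)) > 4.0]
print(changed)  # → ['geneA']
```

With only three replicates per condition, the cutoff of 4 is arbitrary; in practice the replicate count should come from the power calculation done at the design stage.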
Chang, Howard H.; Hu, Xuefei; Liu, Yang
2014-01-01
There has been a growing interest in the use of satellite-retrieved aerosol optical depth (AOD) to estimate ambient concentrations of PM2.5 (particulate matter <2.5 μm in aerodynamic diameter). With their broad spatial coverage, satellite data can increase the spatial–temporal availability of air quality data beyond ground monitoring measurements and potentially improve exposure assessment for population-based health studies. This paper describes a statistical downscaling approach that brings together (1) recent advances in PM2.5 land use regression models utilizing AOD and (2) statistical data fusion techniques for combining air quality data sets that have different spatial resolutions. Statistical downscaling assumes the associations between AOD and PM2.5 concentrations to be spatially and temporally dependent and offers two key advantages. First, it enables us to use gridded AOD data to predict PM2.5 concentrations at spatial point locations. Second, the unified hierarchical framework provides straightforward uncertainty quantification in the predicted PM2.5 concentrations. The proposed methodology is applied to a data set of daily AOD values in southeastern United States during the period 2003–2005. Via cross-validation experiments, our model had an out-of-sample prediction R2 of 0.78 and a root mean-squared error (RMSE) of 3.61 μg/m3 between observed and predicted daily PM2.5 concentrations. This corresponds to a 10% decrease in RMSE compared with the same land use regression model without AOD as a predictor. Prediction performances of spatial–temporal interpolations to locations and on days without monitoring PM2.5 measurements were also examined. PMID:24368510
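The core calibration idea behind predicting PM2.5 from gridded AOD can be sketched as a regression fit at monitor locations and then applied at unmonitored point locations. This is only a stand-in for the paper's hierarchical downscaler (which allows the AOD-PM2.5 association to vary in space and time and quantifies prediction uncertainty); the AOD and PM2.5 values below are hypothetical:

```python
def ols(x, y):
    """Ordinary least squares fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Hypothetical data: gridded AOD values sampled at five monitor locations,
# paired with observed daily PM2.5 (ug/m3) at those monitors.
aod = [0.10, 0.25, 0.40, 0.55, 0.70]
pm25 = [6.0, 10.5, 14.8, 19.6, 24.1]
a, b = ols(aod, pm25)
# Predict PM2.5 at a point location whose grid cell has AOD = 0.33.
print(round(a + b * 0.33, 1))  # → 12.9
```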
2012-01-01
Background Health status is one of the basic factors of a high quality of life and the problem of the acceptance of illness is important for adaptation to the limitations imposed by it. The purpose of the study was the evaluation of the quality of life, satisfaction with life and the acceptance of illness by malaria patients, as well as the discovery of a relationship between studied parameters. Methods The study was undertaken in August 2010, on 120 Nigerian patients with confirmed malaria. A method of diagnostic survey, based on standardized scales - Acceptance of Illness Scale, The Satisfaction With Life Scale and a standardized survey questionnaire World Health Organization Quality of Life/BREF - was used in this study. Descriptive statistics, variability range, 95% confidence interval, correlation analysis, Spearman’s non-parametric correlation coefficient, Mann–Whitney test and Kruskal-Wallis test were applied and the, so called, test statistics was calculated, followed by the calculation of the test probability p. Results of analyses were presented in a box graph, and a graph of dispersion. Results A dominating share in the adjective scale of the AIS scale was the category of “no acceptance”, given by 71.7% of respondents. The average level of a “somatic domain” was 41.7, and of a “social domain” was 62.8. The mean satisfaction of life evaluation in the SWLS scale was 18 points. The correlation between acceptance of the disease and quality of life for the psychological domain was 0.39***, and between acceptance of the disease and satisfaction with life was 0.40***. The correlation between satisfaction with life and quality of life for the psychological domain was 0.65***, and between satisfaction with life and quality of life for the environment domain was 0.60***. The mean level of AIS for the studied population of men was 16.5, and test probability: p = 0.0014**, and for the environment domain the level was 50, and the test probability: p = 0.0073**. 
For quality of life in the social domain, the test probability was p = 0.0013** in relatively older individuals. Conclusion The majority of patients do not accept their condition. Evaluation of the quality of life was highest in the social domain and lowest in the somatic domain. There is a statistically significant correlation between the level of acceptance of illness and both quality of life and satisfaction with life. The strongest correlations are between satisfaction with life and the evaluation of quality of life in the psychological and environmental domains. Men evaluate their quality of life in the environmental domain higher and demonstrate a higher acceptance of their disease. Relatively older people report a significantly higher quality of life in the social domain. PMID:22616635
Analysing attitude data through ridit schemes.
El-rouby, M G
1994-12-02
The attitudes of individuals and populations on various issues are usually assessed through sample surveys. Responses to survey questions are then scaled and combined into a meaningful whole which defines the measured attitude. The applied scales may be of nominal, ordinal, interval, or ratio nature, depending upon the degree of sophistication the researcher wants to introduce into the measurement. This paper discusses methods of analysis for categorical variables of the type used in attitude and human behavior research, and recommends adoption of ridit analysis, a technique which has been successfully applied to epidemiological, clinical, laboratory, and microbiological data. The ridit methodology is described after reviewing some general attitude scaling methods and problems of analysis related to them. The ridit method is then applied to a recent study conducted to assess health care service quality in North Carolina. This technique is conceptually and computationally simpler than other conventional statistical methods, and is also distribution-free. Basic requirements and limitations on its use are indicated.
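The ridit computation the paper recommends can be sketched in a few lines of Python. The rating counts below are hypothetical; the convention is standard (ridits are computed from a reference distribution, and a comparison group's mean ridit is read against 0.5):

```python
def ridits(reference_counts):
    """Ridit for each ordered category: the proportion of the reference
    group below the category plus half the proportion within it."""
    total = float(sum(reference_counts))
    out, below = [], 0.0
    for c in reference_counts:
        p = c / total
        out.append(below + 0.5 * p)
        below += p
    return out

def mean_ridit(group_counts, reference_counts):
    """Mean ridit of a comparison group: 0.5 means no difference from the
    reference; values above 0.5 mean a shift toward higher categories."""
    r = ridits(reference_counts)
    total = float(sum(group_counts))
    return sum((c / total) * ri for c, ri in zip(group_counts, r))

# Hypothetical 5-point service-quality ratings (counts per category,
# from "very poor" to "very good").
reference = [10, 20, 40, 20, 10]   # e.g. a statewide sample
clinic = [5, 10, 30, 35, 20]       # e.g. one clinic's respondents
print(mean_ridit(clinic, reference))
```

Here the clinic's mean ridit is above 0.5, indicating its respondents tend toward higher ratings than the reference population, with no distributional assumption required.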
A psychometric evaluation of the digital logic concept inventory
NASA Astrophysics Data System (ADS)
Herman, Geoffrey L.; Zilles, Craig; Loui, Michael C.
2014-10-01
Concept inventories hold tremendous promise for promoting the rigorous evaluation of teaching methods that might remedy common student misconceptions and promote deep learning. The measurements from concept inventories can be trusted only if the concept inventories are evaluated both by expert feedback and statistical scrutiny (psychometric evaluation). Classical Test Theory and Item Response Theory provide two psychometric frameworks for evaluating the quality of assessment tools. We discuss how these theories can be applied to assessment tools generally and then apply them to the Digital Logic Concept Inventory (DLCI). We demonstrate that the DLCI is sufficiently reliable for research purposes when used in its entirety and as a post-course assessment of students' conceptual understanding of digital logic. The DLCI can also discriminate between students across a wide range of ability levels, providing the most information about weaker students' ability levels.
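As an illustration of the Classical Test Theory side of such a psychometric evaluation, here is a minimal sketch of Cronbach's alpha, a standard internal-consistency reliability coefficient. The response matrix is hypothetical, and the abstract does not state which specific reliability statistic the DLCI authors used:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha internal-consistency reliability (Classical Test
    Theory). item_scores[i][j] is respondent i's score on item j."""
    k = len(item_scores[0])
    item_var = sum(pvariance(col) for col in zip(*item_scores))
    total_var = pvariance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical responses of five students to a four-item inventory
# (1 = correct, 0 = incorrect).
scores = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
]
print(round(cronbach_alpha(scores), 3))  # → 0.696
```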
NASA Astrophysics Data System (ADS)
Attia, Khalid A. M.; El-Abasawi, Nasr M.; El-Olemy, Ahmed; Serag, Ahmed
2018-02-01
Five simple spectrophotometric methods were developed for the determination of simeprevir in the presence of its oxidative degradation product, namely ratio difference, mean centering, derivative ratio using the Savitzky-Golay filters, second derivative, and continuous wavelet transform. These methods are linear in the range of 2.5-40 μg/mL and were validated according to the ICH guidelines. The obtained results of accuracy, repeatability, and precision were found to be within the acceptable limits. The specificity of the proposed methods was tested using laboratory-prepared mixtures and assessed by applying the standard addition technique. Furthermore, these methods were statistically comparable to an RP-HPLC method, and good results were obtained. They can therefore be used for the routine analysis of simeprevir in quality-control laboratories.
Assessment of CT image quality using a Bayesian approach
NASA Astrophysics Data System (ADS)
Reginatto, M.; Anton, M.; Elster, C.
2017-08-01
One of the most promising approaches for evaluating CT image quality is task-specific quality assessment. This involves a simplified version of a clinical task, e.g. deciding whether an image belongs to the class of images that contain the signature of a lesion or not. Task-specific quality assessment can be done by model observers, which are mathematical procedures that carry out the classification task. The most widely used figure of merit for CT image quality is the area under the ROC curve (AUC), a quantity which characterizes the performance of a given model observer. In order to estimate AUC from a finite sample of images, different approaches from classical statistics have been suggested. The goal of this paper is to introduce task-specific quality assessment of CT images to metrology and to propose a novel Bayesian estimation of AUC for the channelized Hotelling observer (CHO) applied to the task of detecting a lesion at a known image location. It is assumed that signal-present and signal-absent images follow multivariate normal distributions with the same covariance matrix. The Bayesian approach results in a posterior distribution for the AUC of the CHO which provides in addition a complete characterization of the uncertainty of this figure of merit. The approach is illustrated by its application to both simulated and experimental data.
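For comparison with the Bayesian estimate described above, the standard empirical AUC of a model observer's test statistics can be computed directly as a Mann-Whitney statistic; the scores below are hypothetical, not taken from the paper:

```python
def empirical_auc(signal_scores, noise_scores):
    """Empirical AUC: the probability that a signal-present score exceeds
    a signal-absent score (the Mann-Whitney statistic); ties count half."""
    wins = 0.0
    for s in signal_scores:
        for n in noise_scores:
            if s > n:
                wins += 1.0
            elif s == n:
                wins += 0.5
    return wins / (len(signal_scores) * len(noise_scores))

# Hypothetical model-observer test statistics for four images per class.
present = [1.9, 2.4, 0.8, 3.1]   # signal-present images
absent = [0.5, 1.2, 2.0, 0.9]    # signal-absent images
print(empirical_auc(present, absent))  # → 0.75
```

Unlike this point estimate, the Bayesian posterior for AUC described in the abstract also characterizes the uncertainty arising from the finite image sample.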
Hua, Ang Kean
2017-01-01
Malacca River water quality has been affected by rapid urbanization. The present study applied land use/land cover (LULC) change analysis to water quality detection in the Malacca River, using LULC, PCA, CCA, HCA, NHCA, and ANOVA. PCA confirmed DS, EC, salinity, turbidity, TSS, DO, BOD, COD, As, Hg, Zn, Fe, E. coli, and total coliform. CCA confirmed 14 variables in two variates: the first variate involves residential and industrial activities, and the second involves agriculture, sewage treatment plants, and animal husbandry. HCA and NHCA show that cluster 1 occurs in the urban area, with Hg, Fe, total coliform, and DO pollution; cluster 3 occurs in the suburban area, with salinity, EC, and DS; and cluster 2 occurs in the rural area, with salinity and EC. ANOVA between LULC and water quality data indicates that built-up areas significantly pollute the water through E. coli, total coliform, EC, BOD, COD, TSS, Hg, Zn, and Fe; agricultural activities cause EC, TSS, salinity, E. coli, total coliform, arsenic, and iron pollution; and open space causes contamination by turbidity, salinity, EC, and TSS. The findings provide useful information for identifying pollution sources and understanding the relationship between LULC and river water quality, as a reference for policy makers in the proper management of land use.
Effects of Rhinophototherapy on Quality of Life in Persistant Allergic Rhinitis
Korkmaz, Hakan; Sürenoğlu, Ünzile Akpinar; Saylam, Güleser; Özdek, Ali
2013-01-01
Objectives To investigate the effect of rhinophototherapy combined with medical therapy on quality of life in persistent allergic rhinitis. Methods A prospective, randomized study was performed between December 2009 and March 2010. The study included 65 patients with persistent allergic rhinitis; the diagnosis was confirmed with positive skin tests, and all patients had house dust mite allergies. The patients were divided into two groups. The first group (n=33) was given topical mometasone furoate 200 mcg/day and levocetirizine 5 mg/day for a month. The second group (n=32) received the same medical therapy plus rhinophototherapy, applied twice a week for three continuous weeks. Rhinophototherapy included visible light, ultraviolet A, and ultraviolet B. Patients were evaluated before treatment and at the first and third months after treatment with the rhinoconjunctivitis quality of life questionnaire, nasal symptom scores, and visual analogue scale (VAS) scores. Results Improvements in all variables of the quality of life questionnaire, nasal symptom scores, and VAS were statistically significant in the second group at both the first and third months when compared with the first group. Conclusion Allergic rhinitis is a social problem that impairs quality of life. Rhinophototherapy with medical therapy improves quality of life in allergic rhinitis. PMID:23799163
How to Perform a Systematic Review and Meta-analysis of Diagnostic Imaging Studies.
Cronin, Paul; Kelly, Aine Marie; Altaee, Duaa; Foerster, Bradley; Petrou, Myria; Dwamena, Ben A
2018-05-01
A systematic review is a comprehensive search, critical evaluation, and synthesis of all the relevant studies on a specific (clinical) topic that can be applied to the evaluation of diagnostic and screening imaging studies. It can be a qualitative or a quantitative (meta-analysis) review of available literature. A meta-analysis uses statistical methods to combine and summarize the results of several studies. In this review, a 12-step approach to performing a systematic review (and meta-analysis) is outlined under the four domains: (1) Problem Formulation and Data Acquisition, (2) Quality Appraisal of Eligible Studies, (3) Statistical Analysis of Quantitative Data, and (4) Clinical Interpretation of the Evidence. This review is specifically geared toward the performance of a systematic review and meta-analysis of diagnostic test accuracy (imaging) studies. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Surzhikov, V D; Surzhikov, D V
2014-01-01
The search for, and measurement of, causal relationships between exposure to air pollution and the health of the population is based on systems analysis and risk assessment, which improve the quality of research. For this purpose, modern statistical analysis is applied, using tests of independence, principal component analysis, and discriminant function analysis. The analysis separated four main components from the set of atmospheric pollutants: for diseases of the circulatory system, the main principal component is associated with concentrations of suspended solids, nitrogen dioxide, carbon monoxide, and hydrogen fluoride; for respiratory diseases, the main principal component is closely associated with suspended solids, sulfur dioxide, nitrogen dioxide, and carbon black. The discriminant function was shown to be usable as a measure of the level of air pollution.
[Nursing care time in a teaching hospital].
Rogenski, Karin Emília; Fugulin, Fernanda Maria Togeiro; Gaidzinski, Raquel Rapone; Rogenski, Noemi Marisa Brunet
2011-03-01
This quantitative, exploratory, descriptive study aimed to identify and analyze the average time of nursing care delivered to patients in the inpatient units of the University Hospital at the University of São Paulo (UH-USP) from 2001 to 2005. The average nursing care time was identified by applying a mathematical equation proposed in the literature, after surveying data from the Medical and Statistical Service and the monthly working shifts of the nursing professionals. Data analysis was performed using descriptive statistics. The average nursing care time observed in most units, despite some variations, remained stable during the analyzed period. Based on this stability, it is concluded that the nursing staff in the UH-USP units has been continuously evaluated with the purpose of maintaining the average time of assistance and, thus, the quality of the care delivered.
NASA Technical Reports Server (NTRS)
Kogut, J.; Larduinat, E.; Fitzgerald, M.
1983-01-01
The utility of methods for generating TM RLUTs that can improve the quality of the resultant images was investigated. The TM-CCT-ADDS tape was changed to account for a different collection window for the calibration data. Several scenes of Terrebonne Bay, Louisiana, and the Grand Bahamas were analyzed to evaluate the radiometric corrections operationally applied to the image data and to investigate several techniques for reducing striping in the images. Printer plots of the TM shutter data were produced, and detector statistics were compiled and plotted. These statistics included various combinations of the average shutter counts for each scan, before and after DC restore, for forward and reverse scans. Results show that striping is caused by the detectors becoming saturated when they view a bright cloud, which depresses the DC restore level.
Adaptive distributed source coding.
Varodayan, David; Lin, Yao-Chung; Girod, Bernd
2012-05-01
We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.
A Method for Retrieving Ground Flash Fraction from Satellite Lightning Imager Data
NASA Technical Reports Server (NTRS)
Koshak, William J.
2009-01-01
A general theory for retrieving the fraction of ground flashes in N lightning observed by a satellite-based lightning imager is provided. An "exponential model" is applied as a physically reasonable constraint to describe the measured optical parameter distributions, and population statistics (i.e., mean, variance) are invoked to add additional constraints to the retrieval process. The retrieval itself is expressed in terms of a Bayesian inference, and the Maximum A Posteriori (MAP) solution is obtained. The approach is tested by performing simulated retrievals, and retrieval error statistics are provided. The ability to retrieve ground flash fraction has important benefits to the atmospheric chemistry community. For example, using the method to partition the existing satellite global lightning climatology into separate ground and cloud flash climatologies will improve estimates of lightning nitrogen oxides (NOx) production; this in turn will improve both regional air quality and global chemistry/climate model predictions.
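A minimal sketch of the flavor of this retrieval, under strong simplifying assumptions (a one-dimensional optical measurement whose ground-flash and cloud-flash components are exponential with known means, and a flat prior so the MAP estimate coincides with maximum likelihood), is given below; the means, seed, and sample sizes are hypothetical and the paper's actual constraints are richer:

```python
import math
import random

def ml_fraction(samples, mean_ground, mean_cloud, steps=200):
    """Maximum-likelihood mixing fraction for a two-component exponential
    mixture with known component means, by grid search over (0, 1)."""
    # Precompute the two component densities for each sample.
    dg = [math.exp(-x / mean_ground) / mean_ground for x in samples]
    dc = [math.exp(-x / mean_cloud) / mean_cloud for x in samples]
    def loglik(a):
        return sum(math.log(a * g + (1 - a) * c) for g, c in zip(dg, dc))
    return max((i / steps for i in range(1, steps)), key=loglik)

random.seed(7)
# Hypothetical optical amplitudes: 30% "ground" flashes (mean 2.0),
# 70% "cloud" flashes (mean 8.0).
data = [random.expovariate(1 / 2.0) if random.random() < 0.3
        else random.expovariate(1 / 8.0) for _ in range(3000)]
print(ml_fraction(data, mean_ground=2.0, mean_cloud=8.0))
```

The estimate recovered from the mixed sample should land near the true 0.3 fraction; the retrieval in the paper additionally constrains the fit with population statistics and reports error statistics from simulated retrievals.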
Analytic score distributions for a spatially continuous tridirectional Monte Carlo transport problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Booth, T.E.
1996-01-01
The interpretation of the statistical error estimates produced by Monte Carlo transport codes is still somewhat of an art. Empirically, there are variance reduction techniques whose error estimates are almost always reliable, and there are variance reduction techniques whose error estimates are often unreliable. Unreliable error estimates usually result from inadequate sampling of large scores from the tail of the score distribution. Statisticians believe that more accurate confidence interval statements are possible if the general nature of the score distribution can be characterized. Here, the analytic score distribution for the exponential transform applied to a simple, spatially continuous Monte Carlo transport problem is provided. Anisotropic scattering and implicit capture are included in the theory. In large part, the analytic score distributions that are derived provide the basis for the ten new statistical quality checks in MCNP.
Aquatic effects assessment: needs and tools.
Marchini, Silvia
2002-01-01
In the assessment of the adverse effects pollutants can produce on exposed ecosystems, different approaches can be followed depending on the quality and quantity of the available information; their advantages and limits are discussed with reference to the aquatic compartment. When experimental data are lacking, a predictive approach can be pursued by making use of validated quantitative structure-activity relationships (QSARs), which provide reliable ecotoxicity estimates only if appropriate models are applied. The experimental approach is central to any environmental hazard assessment procedure, although many uncertainties underlying the extrapolation from a limited set of single-species laboratory data to the complexity of the ecosystem (e.g., the limitations of common summary statistics, the variability of species sensitivity, the need to consider alterations at higher levels of integration) make the task difficult. When adequate toxicity information is available, the statistical extrapolation approach can be used to predict environmentally compatible concentrations.
Air pollution exposure prediction approaches used in air pollution epidemiology studies.
Özkaynak, Halûk; Baxter, Lisa K; Dionisio, Kathie L; Burke, Janet
2013-01-01
Epidemiological studies of the health effects of outdoor air pollution have traditionally relied upon surrogates of personal exposures, most commonly ambient concentration measurements from central-site monitors. However, this approach may introduce exposure prediction errors and misclassification of exposures for pollutants that are spatially heterogeneous, such as those associated with traffic emissions (e.g., carbon monoxide, elemental carbon, nitrogen oxides, and particulate matter). We review alternative air quality and human exposure metrics applied in recent air pollution health effect studies discussed during the International Society of Exposure Science 2011 conference in Baltimore, MD. Symposium presenters considered various alternative exposure metrics, including: central site or interpolated monitoring data, regional pollution levels predicted using the national scale Community Multiscale Air Quality model or from measurements combined with local-scale (AERMOD) air quality models, hybrid models that include satellite data, statistically blended modeling and measurement data, concentrations adjusted by home infiltration rates, and population-based human exposure model (Stochastic Human Exposure and Dose Simulation, and Air Pollutants Exposure models) predictions. These alternative exposure metrics were applied in epidemiological applications to health outcomes, including daily mortality and respiratory hospital admissions, daily hospital emergency department visits, daily myocardial infarctions, and daily adverse birth outcomes. This paper summarizes the research projects presented during the symposium, with full details of the work presented in individual papers in this journal issue.
Effects of Urbanization on Stream Water Quality in the City of Atlanta, Georgia, USA
NASA Astrophysics Data System (ADS)
Peters, N. E.
2009-05-01
A long-term stream water-quality monitoring network was established in the City of Atlanta (COA) during 2003 to assess baseline water-quality conditions and the effects of urbanization on stream water quality. Routine hydrologically-based manual stream sampling, including several concurrent manual point and equal width increment sampling, was conducted approximately 12 times per year at 21 stations, with drainage areas ranging from 3.7 to 232 km2. Eleven of the stations are real-time (RT) water-quality stations having continuous measures of stream stage/discharge, pH, dissolved oxygen, specific conductance, water temperature, and turbidity, and automatic samplers for stormwater collection. Samples were analyzed for field parameters, and a broad suite of water-quality and sediment-related constituents. This paper summarizes an evaluation of field parameters and concentrations of major ions, minor and trace metals, nutrient species (nitrogen and phosphorus), and coliform bacteria among stations and with respect to watershed characteristics and plausible sources from 2003 through September 2007. The concentrations of most constituents in the COA streams are statistically higher than those of two nearby reference streams. Concentrations are statistically different among stations for several constituents, despite high variability both within and among stations. The combination of routine manual sampling, automatic sampling during stormflows, and real-time water-quality monitoring provided sufficient information about the variability of urban stream water quality to develop hypotheses for causes of water-quality differences among COA streams. Fecal coliform bacteria concentrations of most individual samples at each station exceeded Georgia's water-quality standard for any water-usage class. 
High chloride concentrations occur at three stations and are hypothesized to be associated with discharges of chlorinated combined sewer overflows, drainage of swimming pool(s), and dissolution and transport during rainstorms of CaCl2, a deicing salt applied to roads during winter storms. Water quality of one stream was highly affected by the dissolution and transport of ammonium alum [NH4Al(SO4)2] from an alum manufacturing plant in the watershed; streamwater has low pH (<5), low alkalinity and high concentrations of minor and trace metals. Several trace metals (Cu, Pb and Zn) exceed acute and chronic water-quality standards and the high concentrations are attributed to washoff from impervious surfaces.
Lee, Michael T.; Asquith, William H.; Oden, Timothy D.
2012-01-01
In December 2005, the U.S. Geological Survey (USGS), in cooperation with the City of Houston, Texas, began collecting discrete water-quality samples for nutrients, total organic carbon, bacteria (Escherichia coli and total coliform), atrazine, and suspended sediment at two USGS streamflow-gaging stations that represent watersheds contributing to Lake Houston (08068500 Spring Creek near Spring, Tex., and 08070200 East Fork San Jacinto River near New Caney, Tex.). Data from the discrete water-quality samples collected during 2005–9, in conjunction with continuously monitored real-time data that included streamflow and other physical water-quality properties (specific conductance, pH, water temperature, turbidity, and dissolved oxygen), were used to develop regression models for the estimation of concentrations of water-quality constituents of substantial source watersheds to Lake Houston. The potential explanatory variables included discharge (streamflow), specific conductance, pH, water temperature, turbidity, dissolved oxygen, and time (to account for seasonal variations inherent in some water-quality data). The response variables (the selected constituents) at each site were nitrite plus nitrate nitrogen, total phosphorus, total organic carbon, E. coli, atrazine, and suspended sediment. The explanatory variables provide easily measured quantities to serve as potential surrogate variables to estimate concentrations of the selected constituents through statistical regression. Statistical regression also facilitates accompanying estimates of uncertainty in the form of prediction intervals. Each regression model potentially can be used to estimate concentrations of a given constituent in real time. Among other regression diagnostics, the diagnostics used as indicators of general model reliability and reported herein include the adjusted R-squared, the residual standard error, residual plots, and p-values. 
Adjusted R-squared values for the Spring Creek models ranged from 0.582 to 0.922 (dimensionless). The residual standard errors ranged from 0.073 to 0.447 (base-10 logarithm). Adjusted R-squared values for the East Fork San Jacinto River models ranged from 0.253 to 0.853 (dimensionless). The residual standard errors ranged from 0.076 to 0.388 (base-10 logarithm). In conjunction with estimated concentrations, constituent loads can be estimated by multiplying the estimated concentration by the corresponding streamflow and by applying the appropriate conversion factor. The regression models presented in this report are site specific, that is, they are specific to the Spring Creek and East Fork San Jacinto River streamflow-gaging stations; however, the general methods that were developed and documented could be applied to most perennial streams for the purpose of estimating real-time water-quality data.
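The surrogate-regression idea described above can be sketched roughly as follows: fit a base-10 log-linear model of constituent concentration against an easily measured surrogate, then convert an estimated concentration to a load. The turbidity values, coefficients, and units below are invented for illustration and are not the report's models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical surrogate data: log10 constituent concentration (mg/L)
# modeled from log10 turbidity (values invented, not the USGS data).
log_turb = rng.uniform(0.5, 2.5, 60)
log_conc = 0.3 + 0.9 * log_turb + rng.normal(0.0, 0.1, 60)

# Ordinary least-squares fit of the surrogate regression.
X = np.column_stack([np.ones_like(log_turb), log_turb])
beta, *_ = np.linalg.lstsq(X, log_conc, rcond=None)
resid = log_conc - X @ beta
rse = np.sqrt(resid @ resid / (len(log_conc) - 2))  # residual standard error, log10 units

# Estimate a concentration for a new turbidity reading, then a load:
# mg/L equals g/m^3, so load (g/s) = concentration (g/m^3) x streamflow (m^3/s).
conc_mg_per_L = 10 ** (beta[0] + beta[1] * 1.8)
streamflow_m3_per_s = 5.0
load_g_per_s = conc_mg_per_L * streamflow_m3_per_s
```

Prediction intervals on the log-scale fit, as used in the report, would quantify the uncertainty of each estimated concentration.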
Using internal marketing to improve organizational commitment and service quality.
Tsai, Yafang; Wu, Shih-Wang
2011-12-01
The purpose of this article was to explore the structural relationships among internal marketing, organizational commitment and service quality and to practically apply the findings. Internal marketing is a way to assist hospitals in improving the quality of the services that they provide while executing highly labour-intensive tasks. Through internal marketing, a hospital can enhance the organizational commitment of its employees to attain higher service quality. This research uses a cross-sectional study to survey nursing staff perceptions about internal marketing, organizational commitment and service quality. The results of the survey are evaluated using structural equation models. The sample includes three regional hospitals in Taiwan. Three hundred and fifty questionnaires were distributed and 288 valid questionnaires were returned, yielding a response rate of 82.3%. The survey process lasted from 1 February to 9 March 2007. The data were analysed with SPSS 12.0, including descriptive statistics based on demographics. In addition, the influence of demographics on internal marketing, organizational commitment and service quality is examined using one-way ANOVA. The findings reveal that internal marketing plays a critical role in explaining employee perceptions of organizational commitment and service quality. Organizational commitment is the mediator between internal marketing and service quality. The results indicate that internal marketing has an impact on both organizational commitment and service quality. Internal marketing should be emphasized to influence frontline nursing staff, thereby helping to create better organizational commitment and service quality. © 2011 The Authors. Journal of Advanced Nursing © 2011 Blackwell Publishing Ltd.
Statistics Report on TEQSA Registered Higher Education Providers, 2016
ERIC Educational Resources Information Center
Australian Government Tertiary Education Quality and Standards Agency, 2016
2016-01-01
This Statistics Report is the third release of selected higher education sector data held by the Australian Government Tertiary Education Quality and Standards Agency (TEQSA) for its quality assurance activities. It provides a snapshot of national statistics on all parts of the sector by bringing together data collected directly by TEQSA with data…
Jenkins, Kathy J; Koch Kupiec, Jennifer; Owens, Pamela L; Romano, Patrick S; Geppert, Jeffrey J; Gauvreau, Kimberlee
2016-05-20
The National Quality Forum previously approved a quality indicator for mortality after congenital heart surgery developed by the Agency for Healthcare Research and Quality (AHRQ). Several parameters of the validated Risk Adjustment for Congenital Heart Surgery (RACHS-1) method were included, but others differed. As part of the National Quality Forum endorsement maintenance process, developers were asked to harmonize the 2 methodologies. Parameters that were identical between the 2 methods were retained. AHRQ's Healthcare Cost and Utilization Project State Inpatient Databases (SID) 2008 were used to select optimal parameters where differences existed, with a goal to maximize model performance and face validity. Inclusion criteria were not changed and included all discharges for patients <18 years with International Classification of Diseases, Ninth Revision, Clinical Modification procedure codes for congenital heart surgery or nonspecific heart surgery combined with congenital heart disease diagnosis codes. The final model includes procedure risk group, age (0-28 days, 29-90 days, 91-364 days, 1-17 years), low birth weight (500-2499 g), other congenital anomalies (Clinical Classifications Software 217, except for 758.xx), multiple procedures, and transfer-in status. Among 17 945 eligible cases in the SID 2008, the c statistic for model performance was 0.82. In the SID 2013 validation data set, the c statistic was 0.82. Risk-adjusted mortality rates by center ranged from 0.9% to 4.1% (5th-95th percentile). Congenital heart surgery programs can now obtain national benchmarking reports by applying AHRQ Quality Indicator software to hospital administrative data, based on the harmonized RACHS-1 method, with high discrimination and face validity. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
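The c statistic reported above is the probability that a randomly chosen case with the outcome is assigned a higher predicted risk than a randomly chosen case without it (ties counted as half). A minimal sketch on toy data, not the SID records:

```python
import numpy as np

def c_statistic(y_true, y_score):
    """Concordance (c) statistic: probability a random event case is
    ranked above a random non-event case; ties count as 0.5."""
    y_true = np.asarray(y_true)
    y_score = np.asarray(y_score)
    pos = y_score[y_true == 1]
    neg = y_score[y_true == 0]
    diff = pos[:, None] - neg[None, :]  # all event/non-event score pairs
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / (len(pos) * len(neg))

# Toy example: perfectly separated risks give c = 1.0.
print(c_statistic([0, 0, 1, 1], [0.1, 0.2, 0.8, 0.9]))  # -> 1.0
```

A c statistic of 0.82, as in the harmonized model, means an event case outranks a non-event case 82% of the time.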
Fontes, Tânia; Li, Peilin; Barros, Nelson; Zhao, Pengjun
2018-08-01
Air quality traffic-related measures have been implemented worldwide to control the pollution levels of urban areas. Although some of those measures claim environmental improvements, few studies have checked their real impact. In fact, quantitative estimates are often focused on reducing emissions, rather than on evaluating the actual measures' effect on air quality. Even when air quality studies are conducted, results are frequently unclear. In order to properly assess the real impact on air quality of traffic-related measures, a statistical method is proposed. The method compares the pollutant concentration levels observed after the implementation of a measure with the concentration values of the previous year. Short- and long-term impact is assessed considering not only the influence on the average pollutant concentration, but also on its maximum level. To control the effect of the main confounding factors, only days with similar environmental conditions are analysed. The changeability of the key meteorological variables that affect the transport and dispersion of the pollutant studied is used to identify and group the days categorized as similar. Resemblance of the pollutant concentration of the previous day is also taken into account. The impact of the road traffic measures on air pollutant concentrations is then checked for those similar days using specific statistical functions. To evaluate the proposed method, the impact on PM2.5 concentrations of two air quality traffic-related measures (M1 and M2) implemented in the city of Beijing is considered: M1 was implemented in 2009, restricting the circulation of yellow-labelled vehicles, while M2 was implemented in 2014, restricting the circulation of heavy-duty vehicles. To compare the results of each measure, a time period when these measures were not applied is used as a control. Copyright © 2018 Elsevier Ltd. All rights reserved.
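The matched "similar days" comparison can be sketched as below: restrict both periods to days with comparable meteorology, then test the difference in mean concentration. The wind-speed binning, concentration values, and Welch statistic are illustrative stand-ins for the paper's specific statistical functions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_days = 200

# Hypothetical daily PM2.5 (ug/m3): 'before' = matched days of the previous
# year, 'after' = days following the measure (values invented for illustration).
before = rng.normal(90.0, 15.0, n_days)
after = rng.normal(75.0, 15.0, n_days)
wind_before = rng.uniform(0.0, 6.0, n_days)  # m/s, a confounding variable
wind_after = rng.uniform(0.0, 6.0, n_days)

# Keep only "similar days": here, low-dispersion days with wind < 3 m/s.
calm_before = before[wind_before < 3.0]
calm_after = after[wind_after < 3.0]

def welch_t(x, y):
    """Welch's t statistic for a difference in means (unequal variances)."""
    vx = x.var(ddof=1) / len(x)
    vy = y.var(ddof=1) / len(y)
    return (x.mean() - y.mean()) / np.sqrt(vx + vy)

t_stat = welch_t(calm_before, calm_after)  # positive if concentrations fell
```

Conditioning on similar days before computing the test statistic is what separates the measure's effect from weather-driven variation.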
Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.
Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias
2016-01-01
To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
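For readers unfamiliar with the second method, a minimal sketch of an interrupted-time-series fit via segmented regression on simulated data (not the article's decision-support study): the design matrix carries a baseline trend plus level-change and slope-change terms at the intervention point.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(48)   # 48 monthly observations
t0 = 24             # intervention month
# Simulated indicator: slow upward trend, then an abrupt level drop of 8 units.
y = 50.0 + 0.1 * t - 8.0 * (t >= t0) + rng.normal(0.0, 1.0, t.size)

# Segmented-regression design matrix.
X = np.column_stack([
    np.ones_like(t, dtype=float),
    t.astype(float),                          # pre-intervention trend
    (t >= t0).astype(float),                  # level change at the intervention
    np.clip(t - t0, 0, None).astype(float),   # post-intervention slope change
])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
level_change = beta[2]  # should recover roughly the simulated drop of -8
```

A statistical process control analysis of the same series would instead chart the points against control limits and look for special-cause signals after t0.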
Experimental Study of Quantum Graphs With and Without Time-Reversal Invariance
NASA Astrophysics Data System (ADS)
Anlage, Steven Mark; Fu, Ziyuan; Koch, Trystan; Antonsen, Thomas; Ott, Edward
An experimental setup consisting of a microwave network is used to simulate quantum graphs. The random coupling model (RCM) is applied to describe the universal statistical properties of the system with and without time-reversal invariance. The networks, which are large compared to the wavelength, are constructed from coaxial cables connected by T junctions; by forming nodes with circulators, time-reversal invariance for microwave propagation in the networks can be broken. The results of an experimental study of microwave networks with and without time-reversal invariance are presented in both the frequency domain and the time domain. With the measured S-parameter data of two-port networks, the impedance statistics and the nearest-neighbor spacing statistics are examined. Moreover, time-reversal-mirror experiments on the networks demonstrate that the reconstruction quality can be used to quantify the degree of time-reversal invariance for wave propagation. Numerical models of the networks are also presented to verify the time-domain experiments. We acknowledge support under contract AFOSR COE Grant FA9550-15-1-0171 and the ONR Grant N000141512134.
Bellenguez, Céline; Strange, Amy; Freeman, Colin; Donnelly, Peter; Spencer, Chris C A
2012-01-01
High-throughput genotyping arrays provide an efficient way to survey single nucleotide polymorphisms (SNPs) across the genome in large numbers of individuals. Downstream analysis of the data, for example in genome-wide association studies (GWAS), often involves statistical models of genotype frequencies across individuals. The complexities of the sample collection process and the potential for errors in the experimental assay can lead to biases and artefacts in an individual's inferred genotypes. Rather than attempting to model these complications, it has become a standard practice to remove individuals whose genome-wide data differ from the sample at large. Here we describe a simple, but robust, statistical algorithm to identify samples with atypical summaries of genome-wide variation. Its use as a semi-automated quality control tool is demonstrated using several summary statistics, selected to identify different potential problems, and it is applied to two different genotyping platforms and sample collections. The algorithm is written in R and is freely available at www.well.ox.ac.uk/chris-spencer (contact: chris.spencer@well.ox.ac.uk). Supplementary data are available at Bioinformatics online.
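The flavor of such a robust outlier rule can be sketched with a median/MAD cutoff on a per-sample summary statistic. This is an illustrative stand-in, not the published R algorithm; the heterozygosity values and the cutoff k are invented.

```python
import numpy as np

def flag_outliers(stats, k=6.0):
    """Flag samples whose summary statistic lies more than k robust
    standard deviations (MAD-based) from the sample median."""
    stats = np.asarray(stats, dtype=float)
    med = np.median(stats)
    mad = np.median(np.abs(stats - med))
    robust_sd = 1.4826 * mad  # scales MAD to a standard deviation under normality
    return np.abs(stats - med) > k * robust_sd

# Toy per-sample heterozygosity values: one contaminated sample stands out.
het = np.array([0.32, 0.31, 0.33, 0.32, 0.30, 0.55])
print(flag_outliers(het))  # only the last sample is flagged
```

Using the median and MAD rather than the mean and standard deviation keeps the cutoff itself from being dragged around by the very outliers it is meant to detect.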
Statistical analysis of the calibration procedure for personnel radiation measurement instruments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bush, W.J.; Bengston, S.J.; Kalbeitzer, F.L.
1980-11-01
Thermoluminescent analyzer (TLA) calibration procedures were used to estimate personnel radiation exposure levels at the Idaho National Engineering Laboratory (INEL). A statistical analysis is presented herein based on data collected over a six-month period in 1979 on four TLA's located in the Department of Energy (DOE) Radiological and Environmental Sciences Laboratory at the INEL. The data were collected according to the day-to-day procedure in effect at that time. Both gamma and beta radiation models are developed. Observed TLA readings of thermoluminescent dosimeters are correlated with known radiation levels. This correlation is then used to predict unknown radiation doses from future analyzer readings of personnel thermoluminescent dosimeters. The statistical techniques applied in this analysis include weighted linear regression, estimation of systematic and random error variances, prediction interval estimation using Scheffe's theory of calibration, the estimation of the ratio of the means of two normal bivariate distributed random variables and their corresponding confidence limits according to Kendall and Stuart, tests of normality, experimental design, a comparison between instruments, and quality control.
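A weighted-linear-regression calibration of this kind can be sketched as follows: fit the calibration line on known doses, then invert it to predict an unknown dose from a new reading. The readings and the weighting scheme are invented for illustration, and the report's Scheffe prediction intervals are not reproduced.

```python
import numpy as np

# Hypothetical calibration pairs: known delivered dose (mrem) vs. TLA reading
# (these numbers are invented; the report's data are not reproduced here).
dose = np.array([10.0, 20.0, 50.0, 100.0, 200.0])
reading = np.array([12.1, 23.8, 58.9, 121.5, 239.0])
w = 1.0 / dose  # assumed weighting: down-weight the noisier high-dose readings

# Weighted least squares for the calibration line: reading = a + b * dose.
W = np.diag(w)
X = np.column_stack([np.ones_like(dose), dose])
a, b = np.linalg.solve(X.T @ W @ X, X.T @ W @ reading)

# Invert the calibration line to predict an unknown dose from a new reading.
new_reading = 60.0
predicted_dose = (new_reading - a) / b
```

In practice the inverse prediction also carries an interval, which is where Scheffe's calibration theory enters the original analysis.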
NASA Astrophysics Data System (ADS)
Friedel, M. J.; Daughney, C.
2016-12-01
The development of a successful surface-groundwater management strategy depends on the quality of data provided for analysis. This study evaluates the statistical robustness when using a modified self-organizing map (MSOM) technique to estimate missing values for three hypersurface models: synoptic groundwater-surface water hydrochemistry, time-series of groundwater-surface water hydrochemistry, and mixed-survey (combination of groundwater-surface water hydrochemistry and lithologies) hydrostratigraphic unit data. These models of increasing complexity are developed and validated based on observations from the Southland region of New Zealand. In each case, the estimation method is sufficiently robust to cope with groundwater-surface water hydrochemistry vagaries due to sample size and extreme data insufficiency, even when >80% of the data are missing. The estimation of surface water hydrochemistry time series values enabled the evaluation of seasonal variation, and the imputation of lithologies facilitated the evaluation of hydrostratigraphic controls on groundwater-surface water interaction. The robust statistical results for groundwater-surface water models of increasing data complexity provide justification to apply the MSOM technique in other regions of New Zealand and abroad.
Hanifi, S.M.A.; Roy, Nikhil; Streatfield, P. Kim
2007-01-01
This paper compared the performance of the lot quality assurance sampling (LQAS) method in identifying inadequately-performing health work-areas with that of using health and demographic surveillance system (HDSS) data and examined the feasibility of applying the method by field-level programme supervisors. The study was carried out in Matlab, the field site of ICDDR,B, where a HDSS has been in place for over 30 years. The LQAS method was applied in 57 work-areas of community health workers in ICDDR,B-served areas in Matlab during July-September 2002. The performance of the LQAS method in identifying work-areas with adequate and inadequate coverage of various health services was compared with those of the HDSS. The health service-coverage indicators included coverage of DPT, measles, BCG vaccination, and contraceptive use. It was observed that the difference in the proportion of work-areas identified to be inadequately performing using the LQAS method with less than 30 respondents, and the HDSS was not statistically significant. The consistency between the LQAS method and the HDSS in identifying work-areas was greater for adequately-performing areas than inadequately-performing areas. It was also observed that the field managers could be trained to apply the LQAS method in monitoring their performance in reaching the target population. PMID:17615902
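The LQAS decision logic referred to above can be sketched with binomial tail probabilities: sample n people per work-area and classify the area as adequately performing if at least d are covered. The sample size, threshold, and coverage targets below are illustrative choices, not the Matlab study's exact parameters.

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k + 1))

# Hypothetical LQAS rule: sample n people; pass the work-area if >= d are covered.
n, d = 19, 13
p_good, p_bad = 0.80, 0.50  # coverage defining "adequate" and "inadequate" areas

alpha = binom_cdf(d - 1, n, p_good)       # risk of failing a truly good area
beta = 1.0 - binom_cdf(d - 1, n, p_bad)   # risk of passing a truly bad area
```

Choosing n and d so that both misclassification risks stay acceptably small is what makes such small samples (here under 30 respondents) usable by field supervisors.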
Fabron, Eliana Maria Gradim; Petrini, Andressa Schweitzer; Cardoso, Vanessa de Moraes; Batista, João Carlos Torgal; Motonaga, Suely Mayumi; Marino, Viviane Cristina de Castro
2017-06-08
To investigate vocal quality variability after applying tongue trills associated with transcutaneous electrical nerve stimulation (TENS) on the larynx of women with normal laryngeal function, and to verify the effect of this technique over time on voice quality. Participants were 40 women (average age 23.4 years) without vocal complaints. The procedure involved tongue trills with or without TENS for 3 minutes, rest, and repetition of the technique for another 2 minutes. The participants' voices were recorded before (Pre), after three minutes (Post 3min) and after two additional minutes (Post 5min) of applying the technique. TENS with two electrodes was applied on the thyroid cartilage. Self-assessment, acoustic and perceptual analyses were performed. When comparing tongue trills in isolation and associated with TENS, a greater sense of stability in phonation (self-assessment) and improvement in voice quality (perceptual evaluation) were observed with the combined technique. There was no statistical difference in acoustic findings between tongue trills in isolation and associated with TENS. When comparing the time effect of tongue trills with TENS, the self-assessment showed a perception of less muscle tension (3 min) and greater comfort during phonation (5 min); in the acoustic analysis, there was an increase of F0 (3 and 5 min) and intensity (5 min) compared to the Pre moment; in the perceptual evaluation, better voice quality (3 min). Comparing tongue trills in isolation and associated with TENS, there were changes in comfort and perceived muscle tension, as well as in vocal quality. On the other hand, tongue trills associated with TENS performed for 3 or 5 minutes resulted in beneficial effects on the voice identified in the assessments.
Löfvander, Monica; Rosenblad, Andreas; Wiklund, Tony; Bennström, Halina; Leppert, Jerzy
2014-12-01
To examine whether new immigrants had inferior quality-of-life, well-being and general functioning compared with Swedish age- and sex-matched controls. A prospective case-control study was designed including immigrants from non-European countries, 18-65 years of age, with recent Permanent Permits to Stay (PPS) in Sweden, and age- and sex-matched Swedish-born (SB) persons from the general population in Västmanland County, Sweden. The General Health Questionnaire (GHQ-12), the brief version of the World Health Organization Quality-of-Life (WHOQOL-BREF) Scale and the General Activity Functioning Assessment Scale (GAF) from DSM-IV were posted (SB), or applied in personal interviews (PPS) with interpreters. Differences between the PPS and SB groups were measured using McNemar's test and Wilcoxon signed-rank test conducted separately for observations at baseline, 6- and 12-month follow-up. There were 93 pairs (mean age 36 years). Persons from Somalia (67%) and Iraq (27%) dominated the PPS group. The differences between the groups were statistically significant at all time points for the Psychological health and Social relationship domains of WHOQOL-BREF, and at the baseline and 6-month follow-up time points of GHQ-12, where the PPS group had a higher degree of well-being, health and quality-of-life than the SB group. This tendency applied to both sexes in the immigrant group. These new immigrants did not have inferior physical or psychological health, quality-of-life, well-being or social functioning compared with their age- and sex-matched Swedish-born pairs during a 1-year follow-up. Thus, there is reason to advocate immigrants' fast integration into society. © 2014 the Nordic Societies of Public Health.
Head CT: Image quality improvement with ASIR-V using a reduced radiation dose protocol for children.
Kim, Hyun Gi; Lee, Ho-Joon; Lee, Seung-Koo; Kim, Hyun Ji; Kim, Myung-Joon
2017-09-01
To investigate the quality of images reconstructed with adaptive statistical iterative reconstruction V (ASIR-V), using pediatric head CT protocols. A phantom was scanned at decreasing 20% mA intervals using our standard pediatric head CT protocols. Each study was then reconstructed at 10% ASIR-V intervals. After the phantom study, we reduced mA by 10% in the protocol for <3-year-old patients and applied 30% ASIR-V, and by 30% in the protocol for 3- to 15-year-old patients and applied 40% ASIR-V. Increasing the percentage of ASIR-V resulted in lower noise and higher contrast-to-noise ratio (CNR) and preserved spatial resolution in the phantom study. Compared to the conventional protocol, the reduced-dose protocol with ASIR-V achieved a 12.8% to 34.0% dose reduction and showed images of lower noise (9.22 vs. 10.73, P = 0.043) and higher CNR at different levels (centrum semiovale, 2.14 vs. 1.52, P = 0.003; basal ganglia, 1.46 vs. 1.07, P = 0.001; and cerebellum, 2.18 vs. 1.33, P < 0.001). Qualitative analysis showed higher gray-white matter differentiation and sharpness and preserved overall diagnostic quality in the images with ASIR-V. Use of ASIR-V allowed a 12.8% to 34.0% dose reduction in each age group with potential to improve image quality. • It is possible to reduce radiation dose and improve image quality with ASIR-V. • We improved noise and CNR and decreased radiation dose. • Sharpness improved with ASIR-V. • Total radiation dose was decreased by 12.8% to 34.0%.
Dittmar, John C.; Pierce, Steven; Rothstein, Rodney; Reid, Robert J. D.
2013-01-01
Genome-wide experiments often measure quantitative differences between treated and untreated cells to identify affected strains. For these studies, statistical models are typically used to determine significance cutoffs. We developed a method termed “CLIK” (Cutoff Linked to Interaction Knowledge) that overlays biological knowledge from the interactome on screen results to derive a cutoff. The method takes advantage of the fact that groups of functionally related interacting genes often respond similarly to experimental conditions and, thus, cluster in a ranked list of screen results. We applied CLIK analysis to five screens of the yeast gene disruption library and found that it defined a significance cutoff that differed from traditional statistics. Importantly, verification experiments revealed that the CLIK cutoff correlated with the position in the rank order where the rate of true positives drops off significantly. In addition, the gene sets defined by CLIK analysis often provide further biological perspectives. For example, applying CLIK analysis retrospectively to a screen for cisplatin sensitivity allowed us to identify the importance of the Hrq1 helicase in DNA crosslink repair. Furthermore, we demonstrate the utility of CLIK to determine optimal treatment conditions by analyzing genome-wide screens at multiple rapamycin concentrations. We show that CLIK is an extremely useful tool for evaluating screen quality, determining screen cutoffs, and comparing results between screens. Furthermore, because CLIK uses previously annotated interaction data to determine biologically informed cutoffs, it provides additional insights into screen results, which supplement traditional statistical approaches. PMID:23589890
Quality of life and self-esteem in children with chronic tic disorder.
Hesapçıoğlu, Selma Tural; Tural, Mustafa Kemal; Kandil, Sema
2014-12-01
In this study, it was aimed to evaluate the quality of life and self-esteem in children and adolescents with Tourette syndrome (TS) and other chronic motor or vocal tic disorders in comparison with the control group. This is the first study examining the effects of quality of life and self-esteem on each other in chronic tic disorders. Among 62 patients aged between 6 and 16 years who were diagnosed with chronic tic disorder according to the Diagnostic and Statistical Manual of Mental Disorders-IV, 57 patients who met the study inclusion criteria constituted the study group and 57 age- and gender-matched individuals constituted the control group (Ethics committee file number: 2009/69; ethics committee meeting number: 2009/14 (11.06.2009); ethics committee decision number: 16). The Rosenberg self-esteem scale, Pediatric Quality of Life Inventory, Children's Depression Inventory, Screen for Child Anxiety Related Disorders, Maudsley Obsessional Compulsive Inventory and the Schedule for Affective Disorders and Schizophrenia-Present and Lifetime version were applied to the children and adolescents. In the study group, all quality of life subtests were found to be lower compared to the control group both in children and adolescents except for self-reported emotional functionality and social functionality. Being below the age of 12 years and female gender were found to be predictors of low self-esteem in tic disorder. In the reports obtained from the children and adolescents, low self-esteem was related with decreased quality of life in all areas except for academic functionality. Children and adolescents with tic disorder experience functional disruption with a higher rate compared to the group without a psychiatric disorder or severe medical condition. Applying holistic approaches considering other clinical psychiatric symptoms as a part of chronic tic disorder will be useful in increasing the quality of life and self-esteem of these children.
NASA Astrophysics Data System (ADS)
Bonfante, Antonello; Basile, Angelo; Dragonetti, Giovanna; De Lorenzi, Francesca; De Mascellis, Roberto; Gambuti, Angelita; Giorio, Pasquale; Guida, Giampiero; Manna, Piero; Minieri, Luciana; Oliva, Marco; Orefice, Nadia; Terribile, Fabio
2015-04-01
Water deficit is a limiting factor for yield and for crop adaptation to future climate conditions. This holds for crops grown mainly for biomass (e.g. maize, wheat), but not for those where quality is paramount. Specifically, in grapevine, mild or limited water stress occurring during specific phenological phases is a factor in producing good quality wines: it induces, for example, the production of anthocyanins and aroma precursors. Water stress driven by the future increase of temperature and decrease of rainfall could therefore represent an opportunity to increase winegrowers' incomes. The study was carried out in the Campania region (Southern Italy), in an area devoted to high quality wine production (ZOVISA project: Viticultural zoning at farm scale). It was conducted on two different soils (a calcisol and a cambisol), under the same climate, on the Aglianico cultivar, standard clone population on 1103 Paulsen rootstocks, placed along a slope 90 m long with an 11% gradient. The agro-hydrological model SWAP was calibrated and applied to estimate soil-plant water status at the various crop phenological phases for three vintages (2011-2013). The crop water stress index (CWSI) estimated by the model was related to physiological measurements (e.g. leaf water potential), grape bunch measurements (e.g. sugar content) and wine quality (e.g. tannins). For both soils, the correlations between measurements and CWSI were high (e.g. -0.97** with sugar; 0.895* with anthocyanins in the skins). The model was then applied to future climate conditions (2021-2051), obtained from statistical downscaling of a GCM, to estimate the effect of climate on CWSI and hence on wine quality. The results show that the effect of climate change on wine quality depends on the soil: it is relevant in the cambisol and less pronounced in the calcisol, with an expected improvement of wine quality in the cambisol.
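One common definition of a seasonal crop water stress index is the relative transpiration deficit, 1 - Ta/Tp; the paper's exact CWSI formulation may differ, and the daily values below are hypothetical:

```python
def crop_water_stress_index(actual_transp, potential_transp):
    """Seasonal CWSI as the relative transpiration deficit, 1 - Ta/Tp
    (one common definition; 0 = no stress, 1 = full stress)."""
    ta = sum(actual_transp)       # cumulative actual transpiration
    tp = sum(potential_transp)    # cumulative potential transpiration
    return 1.0 - ta / tp

# Hypothetical daily transpiration (mm) over one phenological phase
ta_daily = [2.1, 2.3, 1.8, 1.5, 1.2]
tp_daily = [2.5, 2.6, 2.4, 2.3, 2.2]
cwsi = crop_water_stress_index(ta_daily, tp_daily)
```

A CWSI computed this way per phenological phase can then be correlated with quality measurements such as sugar content, as the study does.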
Alikari, Victoria; Sachlas, Athanasios; Giatrakou, Stavroula; Stathoulis, John; Fradelos, Evagelos; Theofilou, Paraskevi; Lavdaniti, Maria; Zyga, Sofia
2017-01-01
An important factor influencing the quality of life of patients with arthritis is the fatigue they experience. The purpose of this study was to assess the relationship between fatigue and quality of life among patients with osteoarthritis and rheumatoid arthritis. Between January 2015 and March 2015, 179 patients with osteoarthritis and rheumatoid arthritis completed the Fatigue Assessment Scale and the Missoula-VITAS Quality of Life Index-15 (MVQoLI-15). The study was conducted in rehabilitation centers located in the area of Peloponnese, Greece. Data on sociodemographic characteristics and individual medical histories were recorded. Statistical analysis was performed using IBM SPSS Statistics version 19. The analysis did not reveal a statistically significant correlation between fatigue and quality of life either in the total sample or among patients with osteoarthritis (r = -0.159; p = 0.126) or rheumatoid arthritis. However, there was a statistically significant relationship between some aspects of fatigue and dimensions of quality of life. Osteoarthritis patients had a statistically significantly lower MVQoLI-15 score than rheumatoid arthritis patients (13.73 ± 1.811 vs 14.61 ± 1.734) and a lower FAS score (26.14 ± 3.668 vs 29.94 ± 3.377) (p-value < 0.001). The finding that different aspects of fatigue may affect dimensions of quality of life may help health care professionals by suggesting early treatment of fatigue in order to gain benefits for quality of life.
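Coefficients of the kind reported above (r values) are typically Pearson product-moment correlations; a minimal sketch of how r is computed, with illustrative data that are not the study's:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))  # cross-products
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Toy data: fatigue scores vs. quality-of-life scores (illustrative only)
fatigue = [26, 28, 30, 25, 29, 31]
qol = [14, 15, 13, 16, 14, 12]
r = pearson_r(fatigue, qol)  # negative here: higher fatigue, lower QOL
```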
[Quality assurance and quality improvement. Personal experiences and intentions].
Roche, B G; Sommer, C
1995-01-01
In May 1994 we were selected by the Swiss surgical association to carry out a study on quality in the USA. During our travels we visited three types of institutions: hospitals, the National Institute of Standards and Technology, and industry (Johnson & Johnson). We were able to compare two types of quality programs: Quality Assurance (QA) and Continuous Quality Improvement (CQI). In traditional healthcare circles, QA is the process established to meet external regulatory requirements and to ensure that patient care is consistent with established standards. In modern quality terms, QA outside of healthcare means designing a product or service, as well as controlling its production, so well that quality is inevitable. W. Edwards Deming's idea is that there is never improvement just by inspection; he developed a theory based on 14 principles. Productive work is accomplished through processes, and understanding the variability of processes is a key to improving quality. Quality management sees each person in an organisation as part of one or more processes. The job of every worker is to receive the work of others, add value to that work, and supply it to the next person in the process. This is called the triple role: the worker as customer, processor, and supplier. The main source of quality defects is problems in the process. The old assumption is that quality fails when people do the right thing wrong; the new assumption is that, more often, quality failures arise when people do the wrong thing right. Exhortation, incentives and discipline of workers are unlikely to improve quality. If quality is failing when people do their jobs as designed, then exhorting them to do better is managerial nonsense. Modern quality theory is customer focused, and customers are identified internally and externally. The modern approach to quality is thoroughly grounded in scientific and statistical thinking: as in medicine, the symptom is a defect in quality.
The therapist of the process must perform diagnostic tests, formulate hypotheses of cause, test those hypotheses, apply remedies, and assess the effect of the remedies. Total employee involvement is critical: power comes from enabling all employees to become involved in quality improvement. A great advantage of CQI is the prevention orientation of the concept. CQI promotes a collegial approach in which people learn how to work together to improve, although it is a time-consuming procedure. During our travels we learned the definition of quality as customer satisfaction. Building a CQI concept takes time, but all employees are involved in quality improvement. By applying CQI we may be able to dispense with quality control programs.
A DMAIC approach for process capability improvement in an engine crankshaft manufacturing process
NASA Astrophysics Data System (ADS)
Sharma, G. V. S. S.; Rao, P. Srinivasa
2014-05-01
The define-measure-analyze-improve-control (DMAIC) approach is a five-stratum scientific approach for reducing deviations and improving the capability levels of manufacturing processes. The present work elaborates on the DMAIC approach applied to reducing process variation in the stub-end-hole boring operation in crankshaft manufacture. This statistical process control study starts with selection of the critical-to-quality (CTQ) characteristic in the define stratum. The next stratum constitutes the collection of dimensional measurement data for the identified CTQ characteristic. This is followed by the analysis and improvement strata, where quality control tools such as the Ishikawa diagram, physical mechanism analysis, failure modes and effects analysis, and analysis of variance are applied. Finally, process monitoring charts are deployed at the workplace for regular monitoring and control of the concerned CTQ characteristic. By adopting the DMAIC approach, the standard deviation was reduced from 0.003 to 0.002, the process potential capability index (CP) improved from 1.29 to 2.02, and the process performance capability index (CPK) improved from 0.32 to 1.45.
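As a hedged illustration of the capability indices reported above (not the authors' computation; the spec limits and measurements below are hypothetical), Cp and Cpk follow the standard definitions Cp = (USL - LSL) / 6σ and Cpk = min(USL - μ, μ - LSL) / 3σ:

```python
import statistics

def process_capability(data, lsl, usl):
    """Return (Cp, Cpk) for a sample against lower/upper specification limits."""
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)               # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)               # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # actual capability (accounts for centering)
    return cp, cpk

# Hypothetical stub-end-hole bore diameters (mm) against assumed spec limits
bores = [9.999, 10.001, 10.000, 10.002, 9.998]
cp, cpk = process_capability(bores, lsl=9.990, usl=10.010)
```

For a perfectly centered process Cp equals Cpk; the gap between the two reported values (2.02 vs 1.45) reflects off-center running.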
NASA Technical Reports Server (NTRS)
Prive, N. C.; Errico, R. M.; Tai, K.-S.
2013-01-01
The Global Modeling and Assimilation Office (GMAO) observing system simulation experiment (OSSE) framework is used to explore the response of analysis error and forecast skill to observation quality. In an OSSE, synthetic observations may be created that have much smaller error than real observations, and precisely quantified error may be applied to these synthetic observations. Three experiments are performed in which synthetic observations, with magnitudes of applied observation error varying from zero to twice the estimated realistic error, are ingested into the Goddard Earth Observing System Model (GEOS-5) with Gridpoint Statistical Interpolation (GSI) data assimilation for a one-month period representing July. The analysis increment and observation innovation are strongly impacted by observation error, with much larger variances for increased observation error. The analysis quality is degraded by increased observation error, but the change in root-mean-square error of the analysis state is small relative to the total analysis error. Surprisingly, in the 120-hour forecast, increased observation error yields only a slight decline in forecast skill in the extratropics, and no discernible degradation of forecast skill in the tropics.
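The scaling of applied observation error described above can be sketched as follows; this is a toy illustration, not the GMAO code, and the "nature run" values are hypothetical:

```python
import random

def perturb_observations(truth_values, realistic_sigma, scale):
    """Add Gaussian observation error scaled relative to the estimated
    realistic error (scale = 0 gives perfect obs, 2 gives doubled error)."""
    rng = random.Random(42)  # fixed seed for reproducibility
    return [v + rng.gauss(0.0, realistic_sigma * scale) for v in truth_values]

truth = [288.1, 290.4, 286.7]  # hypothetical nature-run temperatures (K)
perfect = perturb_observations(truth, realistic_sigma=1.2, scale=0.0)
doubled = perturb_observations(truth, realistic_sigma=1.2, scale=2.0)
```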
Development of Super-Ensemble techniques for ocean analyses: the Mediterranean Sea case
NASA Astrophysics Data System (ADS)
Pistoia, Jenny; Pinardi, Nadia; Oddo, Paolo; Collins, Matthew; Korres, Gerasimos; Drillet, Yann
2017-04-01
Short-term ocean analyses of sea surface temperature (SST) in the Mediterranean Sea can be improved by a statistical post-processing technique called super-ensemble. This technique consists of a multi-linear regression algorithm applied to a Multi-Physics Multi-Model Super-Ensemble (MMSE) dataset: a collection of different operational forecasting analyses together with ad-hoc simulations produced by modifying selected numerical model parameterizations. A new linear regression algorithm based on Empirical Orthogonal Function filtering techniques is capable of preventing overfitting problems, although the best performance is achieved when correlation is added to the super-ensemble structure using a simple spatial filter applied after the linear regression. Our outcomes show that super-ensemble performance depends on the selection of an unbiased operator and on the length of the learning period, but that the quality of the generating MMSE dataset has the largest impact on the MMSE analysis Root Mean Square Error (RMSE), evaluated with respect to observed satellite SST. The lowest RMSE analysis estimates result from the following choices: a 15-day training period, an overconfident MMSE dataset (a subset with the higher-quality ensemble members), and the least-squares algorithm filtered a posteriori.
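The multi-linear regression at the heart of the super-ensemble can be sketched for a two-member case; this is a simplified illustration without the EOF filtering, and the training series are hypothetical:

```python
def superensemble_weights(member1, member2, obs):
    """Least-squares weights for a two-member super-ensemble (no intercept):
    minimise sum((a*m1 + b*m2 - obs)^2) over the training period."""
    s11 = sum(x * x for x in member1)
    s22 = sum(x * x for x in member2)
    s12 = sum(x * y for x, y in zip(member1, member2))
    s1y = sum(x * y for x, y in zip(member1, obs))
    s2y = sum(x * y for x, y in zip(member2, obs))
    det = s11 * s22 - s12 * s12          # normal-equations determinant
    a = (s1y * s22 - s2y * s12) / det    # Cramer's rule, 2x2 system
    b = (s11 * s2y - s12 * s1y) / det
    return a, b

# Hypothetical SST training series (degC) from two model members vs. satellite obs
m1 = [20.1, 20.5, 21.0, 20.8]
m2 = [19.8, 20.2, 20.9, 20.6]
sst = [20.0, 20.3, 20.95, 20.7]
a, b = superensemble_weights(m1, m2, sst)
analysis = [a * x + b * y for x, y in zip(m1, m2)]
```

In practice the regression is fitted gridpoint by gridpoint over the training period, which is why the 15-day training length matters.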
Rapisarda, Paolo; Camin, Federica; Fabroni, Simona; Perini, Matteo; Torrisi, Biagio; Intrigliolo, Francesco
2010-03-24
To investigate the influence of different types of fertilizers on quality parameters, N-containing compounds, and the delta(15)N, delta(13)C, delta(2)H, delta(34)S, and delta(18)O values of citrus fruit, a study was performed on the orange cv. 'Valencia late' (Citrus sinensis L. Osbeck), harvested in four plots (three organic and one conventional) located on the same farm. The results demonstrated that different types of organic fertilizers containing the same amount of nitrogen did not produce important changes in orange fruit quality parameters. The levels of total N and of N-containing compounds such as synephrine in fruit juice were not statistically different among the treatments. The delta(15)N values of orange fruit grown with fertilizer of animal origin, as well as with vegetable compost, were statistically higher than those of fruit grown with mineral fertilizer. Therefore, delta(15)N values can be used as an indicator of citrus fertilization management (organic or conventional), because even when the applied organic fertilizers are of different origins, the natural abundance of (15)N in organic citrus fruit remains higher than in conventional fruit. The treatments also did not produce differences in the delta(13)C, delta(2)H, delta(34)S, and delta(18)O values of the fruit.
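For context, isotope delta values such as delta(15)N are conventionally expressed in per mil relative to a standard ratio; a minimal sketch, with hypothetical ratios rather than the study's measurements:

```python
def delta_per_mil(r_sample, r_standard):
    """Isotope delta value in per mil: (Rsample/Rstandard - 1) * 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Hypothetical 15N/14N isotope ratios; the standard for delta(15)N is atmospheric N2
delta15n = delta_per_mil(0.0036782, 0.0036765)
```

Organically fertilized fruit tends toward higher (more positive) delta(15)N because animal-derived fertilizers are enriched in 15N, which is the discrimination basis the abstract describes.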
Chang, Kao-Ping; Lai, Chung-Sheng; Hsieh, Tung-Ying; Wu, Yi-Chia; Chang, Chih-Hau
2012-07-13
This study describes the 2-year impact on quality of life (QOL) in relation to anatomical discrepancy among T4a oral cancer patients after free flap reconstruction in Taiwan. Thirty-two patients who underwent tumor ablation with simultaneous microvascular free flap transfer and completed 2-year follow-up were recruited. They were divided into six subgroups according to the resected area: (1) buccal/retromolar trigone; (2) cheek; (3) commissure; (4) lip; (5) mandible; and (6) tongue. Functional disturbances and daily activity were analyzed using the Version-1 UW QOL Questionnaire with one additional category, 'Drooling'. Kruskal-Wallis rank-sum analysis was used to test differences in average QOL scores between these subgroups, and post-hoc analysis was applied to assess the influence of dominant categories between subgroups. The category 'Pain' had the highest average score and reached statistical significance (P = 0.019) among all the categories, whereas the category 'Employment' averaged the lowest score. For 'Pain', there was a statistically significant difference (P = 0.0032) between the commissure- and cheek-involved groups, indicating that the former had poorer pain-related quality of life. The commissure-involved group had the lowest average score, which may imply the worst QOL in our study, especially for the categories 'Pain' and 'Drooling'. This study of T4a patients was the first carried out in Taiwan implementing the QOL questionnaire, and its results may serve for future reference.
Quality evaluation of Persian nutrition and diet therapy websites
Gholizadeh, Zahra; Papi, Ahmad; Ashrafi-rizi, Hasan; Shahrzadi, Leila; Hasanzadeh, Akbar
2017-01-01
INTRODUCTION: Nowadays, websites are among the most important information sources used by most people. With the spread of websites, especially those related to health issues, the number of their visitors also increases, and more than half of health-related searches concern nutritional information. Therefore, quality analysis of nutrition and diet therapy websites is of utmost importance. This study aims to evaluate the quality of Persian nutrition and diet therapy websites. METHODS: The current work is an applied survey study. The statistical population consists of 51 Persian websites about nutrition and diet therapy, studied by the census method. Data gathering was done using a checklist and a direct visit to each website. Descriptive and analytical statistics were used to analyse the gathered data with the help of SPSS 21 software. RESULTS: Content (66.7%), organization (82.4%), user-friendly interface (52.9%) and total quality (70.6%) scores of most websites were mediocre, while the design score of most websites (70.6%) was acceptable; organizational websites also had better design, organization and quality than private websites. The three websites with the highest general quality scores were “Novel Diet Therapy,” “Behsite” and “Dr. BehdadiPour” (jointly), and “Dr. Kermani,” respectively. Within content, the factors of goal, relevance and credibility scored highest; within design, color, text and sound, and pictures and videos; within organization, stability and indexing; and within user friendliness, confidentiality, credibility and personalization. CONCLUSION: The design score was higher than the other scores, while the general quality score of the websites was mediocre and not desirable, and the websites did not have suitable scores on every factor.
Since most people search the internet for nutritional and diet therapy information, the creators of these websites should endeavor to fix the shortcomings of their websites and increase the quality of their websites in several different areas. PMID:28616415
The difficulty in assessing errors in numerical models of air quality is a major obstacle to improving their ability to predict and retrospectively map air quality. In this paper, using simulation outputs from the Community Multiscale Air Quality (CMAQ) model, the statistic...
Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.
ERIC Educational Resources Information Center
Dunlap, Dale
This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…
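A minimal illustration of the control charting such an SPC course covers; the constants A2, D3 and D4 are the standard tabulated values for subgroups of size 5, and the measurements below are hypothetical:

```python
def xbar_r_limits(subgroups):
    """Control limits for X-bar and R charts, subgroup size 5.
    A2, D3, D4 are the standard published constants for n = 5."""
    A2, D3, D4 = 0.577, 0.0, 2.114
    xbars = [sum(g) / len(g) for g in subgroups]
    ranges = [max(g) - min(g) for g in subgroups]
    xbarbar = sum(xbars) / len(xbars)  # grand mean
    rbar = sum(ranges) / len(ranges)   # mean range
    return {
        "xbar": (xbarbar - A2 * rbar, xbarbar + A2 * rbar),  # (LCL, UCL)
        "r": (D3 * rbar, D4 * rbar),
    }

# Hypothetical subgroups of 5 consecutive measurements
subgroups = [[10.1, 10.3, 9.9, 10.0, 10.2],
             [10.0, 10.4, 10.1, 9.8, 10.2],
             [9.9, 10.1, 10.0, 10.3, 10.2]]
limits = xbar_r_limits(subgroups)
```

Points falling outside these limits, or systematic runs inside them, signal that the process may be out of statistical control.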
Shelton, Jennifer L.; Fram, Miranda S.; Munday, Cathy M.; Belitz, Kenneth
2010-01-01
Groundwater quality in the approximately 25,500-square-mile Sierra Nevada study unit was investigated in June through October 2008, as part of the Priority Basin Project of the Groundwater Ambient Monitoring and Assessment (GAMA) Program. The GAMA Priority Basin Project is being conducted by the U.S. Geological Survey (USGS) in cooperation with the California State Water Resources Control Board (SWRCB). The Sierra Nevada study was designed to provide statistically robust assessments of untreated groundwater quality within the primary aquifer systems in the study unit, and to facilitate statistically consistent comparisons of groundwater quality throughout California. The primary aquifer systems (hereinafter, primary aquifers) are defined by the depth of the screened or open intervals of the wells listed in the California Department of Public Health (CDPH) database of wells used for public and community drinking-water supplies. The quality of groundwater in shallower or deeper water-bearing zones may differ from that in the primary aquifers; shallow groundwater may be more vulnerable to contamination from the surface. In the Sierra Nevada study unit, groundwater samples were collected from 84 wells (and springs) in Lassen, Plumas, Butte, Sierra, Yuba, Nevada, Placer, El Dorado, Amador, Alpine, Calaveras, Tuolumne, Madera, Mariposa, Fresno, Inyo, Tulare, and Kern Counties. The wells were selected on two overlapping networks by using a spatially-distributed, randomized, grid-based approach. The primary grid-well network consisted of 30 wells, one well per grid cell in the study unit, and was designed to provide statistical representation of groundwater quality throughout the entire study unit. 
The lithologic grid-well network is a secondary grid that consisted of the wells in the primary grid-well network plus 53 additional wells and was designed to provide statistical representation of groundwater quality in each of the four major lithologic units in the Sierra Nevada study unit: granitic, metamorphic, sedimentary, and volcanic rocks. One natural spring that is not used for drinking water was sampled for comparison with a nearby primary grid well in the same cell. Groundwater samples were analyzed for organic constituents (volatile organic compounds [VOC], pesticides and pesticide degradates, and pharmaceutical compounds), constituents of special interest (N-nitrosodimethylamine [NDMA] and perchlorate), naturally occurring inorganic constituents (nutrients, major ions, total dissolved solids, and trace elements), and radioactive constituents (radium isotopes, radon-222, gross alpha and gross beta particle activities, and uranium isotopes). Naturally occurring isotopes and geochemical tracers (stable isotopes of hydrogen and oxygen in water, stable isotopes of carbon, carbon-14, strontium isotopes, and tritium), and dissolved noble gases also were measured to help identify the sources and ages of the sampled groundwater. Three types of quality-control samples (blanks, replicates, and samples for matrix spikes) each were collected at approximately 10 percent of the wells sampled for each analysis, and the results for these samples were used to evaluate the quality of the data for the groundwater samples. Field blanks rarely contained detectable concentrations of any constituent, suggesting that contamination from sample collection, handling, and analytical procedures was not a significant source of bias in the data for the groundwater samples. Differences between replicate samples were within acceptable ranges, with few exceptions. Matrix-spike recoveries were within acceptable ranges for most compounds. 
This study did not attempt to evaluate the quality of water delivered to consumers; after withdrawal from the ground, groundwater typically is treated, disinfected, or blended with other waters to maintain water quality. Regulatory benchmarks apply to finished drinking water that is served to the consumer, not to untreated groundwater.
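Replicate agreement of the kind described above is often summarized as a relative percent difference (RPD); a minimal sketch, with hypothetical concentrations and not the USGS acceptance criteria:

```python
def relative_percent_difference(a, b):
    """RPD between a sample and its replicate, in percent:
    |a - b| divided by the pair mean, times 100."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

# Hypothetical replicate pair of trace-element concentrations (ug/L)
rpd = relative_percent_difference(10.2, 9.8)
```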
Classification of Malaysia aromatic rice using multivariate statistical analysis
NASA Astrophysics Data System (ADS)
Abdullah, A. H.; Adom, A. H.; Shakaff, A. Y. Md; Masnan, M. J.; Zakaria, A.; Rahim, N. A.; Omar, O.
2015-05-01
Aromatic rice (Oryza sativa L.) is considered the best quality premium rice. The varieties are preferred by consumers because of criteria such as shape, colour, distinctive aroma and flavour. The price of aromatic rice is higher than that of ordinary rice because of the special growth conditions it needs, for instance a specific climate and soil. Presently, aromatic rice quality is identified using its key elements and isotopic variables. The rice can also be classified via Gas Chromatography-Mass Spectrometry (GC-MS) or human sensory panels. However, the use of human sensory panels has significant drawbacks: lengthy training time, proneness to fatigue as the number of samples increases, and inconsistency. GC-MS analysis, on the other hand, requires detailed procedures and lengthy analysis and is quite costly. This paper presents the application of an in-house-developed electronic nose (e-nose) to classify new aromatic rice varieties based on the samples' odour. The instrument utilizes multivariate statistical data analysis, including Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA) and K-Nearest Neighbours (KNN), to classify unknown rice samples. The Leave-One-Out (LOO) validation approach is applied to evaluate the ability of KNN to recognize and classify the unspecified samples. Visual observation of the PCA and LDA plots shows that the instrument was able to separate the samples into distinct clusters. The LDA and KNN results, with low misclassification error, support these findings, and we may conclude that the e-nose was successfully applied to the classification of the aromatic rice varieties.
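A minimal sketch of the leave-one-out validation used with the KNN classifier; this is a pure-Python 1-NN on hypothetical sensor features, not the study's actual pipeline:

```python
def nearest_neighbour(train, labels, query):
    """Classify query by the label of its closest training point (1-NN, squared Euclidean)."""
    dists = [sum((a - b) ** 2 for a, b in zip(row, query)) for row in train]
    return labels[dists.index(min(dists))]

def leave_one_out_accuracy(samples, labels):
    """Hold out each sample in turn and classify it with the rest (LOO validation)."""
    correct = 0
    for i in range(len(samples)):
        train = samples[:i] + samples[i + 1:]
        lab = labels[:i] + labels[i + 1:]
        if nearest_neighbour(train, lab, samples[i]) == labels[i]:
            correct += 1
    return correct / len(samples)

# Hypothetical e-nose sensor responses for two rice classes
samples = [[0.9, 0.1], [1.0, 0.2], [0.95, 0.15],
           [0.1, 0.9], [0.2, 1.0], [0.15, 0.95]]
labels = ["aromatic", "aromatic", "aromatic",
          "ordinary", "ordinary", "ordinary"]
acc = leave_one_out_accuracy(samples, labels)
```

LOO is attractive for small sample sets like e-nose panels because every sample serves once as the test case.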
[Pre- and post-pubertal rhythmic gymnastics athletes' physical qualities].
Menezes, Luciana de Souza; Novaes, Jefferson; Fernandes-Filho, José
2012-01-01
To identify and compare the physical qualities of Brazilian rhythmic gymnastics athletes and practitioners, 125 Brazilian athletes and practitioners of rhythmic gymnastics aged 7 to 25 years were evaluated. They were divided into the following categories: competitive level (international, national and regional) and pre- and post-menarche practitioners. The protocols used were: Burpee test (coordination), Sargent jump test, and goniometry (flexibility). This was a cross-sectional, comparative study. Descriptive statistics and inferential analyses were estimated; ANOVA was applied, followed by Tukey's post hoc test. The results were: Burpee test, international level = 20.0±0.8; national level = 18.3±2.7; regional level = 18.9±1.9; pre-menarche = 13.7±3.2 and post-menarche = 16.2±3.8; vertical jump, international level = 40.1±2.7 cm; national level = 38.0±4.3 cm; regional level = 35.1±3.5 cm; pre-menarche = 25.2±7.4 cm and post-menarche = 35.4±6.6 cm; leg goniometry: international level = 180.0±0.0; national level = 146.9±13.93; regional level = 147.1±10.75; pre-menarche = 135.80±22.62 and post-menarche = 141.0±23.09; and back goniometry: international level = 33.3±5.69; national level = 38.3±13.82; regional level = 36.5±11.84; pre-menarche = 48.7±12.80 and post-menarche = 48.8±12.30. Statistically significant differences were found between the categories in all the variables.
Melvin, Steven D; Petit, Marie A; Duvignacq, Marion C; Sumpter, John P
2017-08-01
The quality and reproducibility of science has recently come under scrutiny, with criticisms spanning disciplines. In aquatic toxicology, behavioural tests are currently an area of controversy since inconsistent findings have been highlighted and attributed to poor quality science. The problem likely relates to limitations to our understanding of basic behavioural patterns, which can influence our ability to design statistically robust experiments yielding ecologically relevant data. The present study takes a first step towards understanding baseline behaviours in fish, including how basic choices in experimental design might influence behavioural outcomes and interpretations in aquatic toxicology. Specifically, we explored how fish acclimate to behavioural arenas and how different lengths of observation time impact estimates of basic swimming parameters (i.e., average, maximum and angular velocity). We performed a semi-quantitative literature review to place our findings in the context of the published literature describing behavioural tests with fish. Our results demonstrate that fish fundamentally change their swimming behaviour over time, and that acclimation and observational timeframes may therefore have implications for influencing both the ecological relevance and statistical robustness of behavioural toxicity tests. Our review identified 165 studies describing behavioural responses in fish exposed to various stressors, and revealed that the majority of publications documenting fish behavioural responses report extremely brief acclimation times and observational durations, which helps explain inconsistencies identified across studies. We recommend that researchers applying behavioural tests with fish, and other species, apply a similar framework to better understand baseline behaviours and the implications of design choices for influencing study outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.
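The basic swimming parameters named above (average, maximum and angular velocity) can be derived from tracking points; a sketch under the assumption that tracking software exports (t, x, y) coordinates, with hypothetical data:

```python
import math

def swimming_parameters(track):
    """Average speed, maximum speed, and mean absolute turn per step
    from a list of (t, x, y) tracking points."""
    speeds, headings, turns = [], [], []
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        dt = t1 - t0
        speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
        headings.append(math.atan2(y1 - y0, x1 - x0))
    for h0, h1 in zip(headings, headings[1:]):
        turn = (h1 - h0 + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
        turns.append(abs(turn))
    return {
        "avg_speed": sum(speeds) / len(speeds),
        "max_speed": max(speeds),
        "mean_turn": sum(turns) / len(turns),
    }

# Hypothetical tracking data: (time s, x cm, y cm)
track = [(0.0, 0.0, 0.0), (0.5, 1.0, 0.0), (1.0, 2.0, 0.5), (1.5, 2.5, 1.5)]
params = swimming_parameters(track)
```

Because such parameters drift as fish acclimate, the observation window chosen for computing them directly shapes the estimates, which is the study's central design point.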
Glynn, Carolyn; Weslien, Jan
2004-12-01
We investigated the effects of Bacillus thuringiensis variety kurstaki x aizawai (Bt) on infestation levels of two lepidopteran insects as well as on seed quality in Norway spruce, Picea abies L. (Karst.) in central Sweden. Spruce flowers (female strobili) were sprayed with a 0.2% suspension (wt:wt) of the Bt preparation Turex 50 WP, 25,000 IU/mg in water. To expose even those lepidopteran larvae that feed exclusively embedded within the cone tissue, the Bt treatment was applied to open flowers, before they closed and developed into cones. The experimental design included three main factors: treatment (untreated control, water, or Bt), spruce genotype (three clones), and spraying time (spraying before, during, and after the phase of highest pollen receptivity). The Bt treatment reduced the proportion of cones infested by the cone worm Dioryctria abietella Den. et Schiff. (Lepidoptera: Pyralidae) from approximately 30 to 15%. There was no statistically significant treatment effect on the infestation rate of Cydia strobilella (L.) (Lepidoptera: Tortricidae). The Bt variety kurstaki x aizawai treatment caused no reduction in seed quality as measured by seed weight or percentage of nonfilled seeds. There was no difference in number of seeds per cone between the Bt-treated and untreated control cones. There was a significant effect of genotype on insect infestation rates, as well as on number of seeds per cone and seed weight. Neither level of insect damage nor any seed quality parameters were affected by time of application of the treatments.
Islam, Abu Reza Md Towfiqul; Ahmed, Nasir; Bodrud-Doza, Md; Chu, Ronghao
2017-12-01
Drinking water drawn from contaminated sources affects human health, so it is essential to investigate the factors affecting groundwater quality and its suitability for drinking. In this paper, entropy theory, multivariate statistics, the spatial autocorrelation index, and geostatistics are applied to characterize groundwater quality and its spatial variability in the Sylhet district of Bangladesh. A total of 91 samples were collected from wells (shallow, intermediate, and deep tube wells at 15-300-m depth) in the study area. The results show that NO3-, followed by SO42- and As, are the parameters contributing most to groundwater quality according to entropy theory. Principal component analysis (PCA) and correlation coefficients also confirm the results of the entropy theory. However, Na+ has the highest spatial autocorrelation and the most entropy, thus affecting the groundwater quality. Based on the entropy-weighted water quality index (EWQI) and groundwater quality index (GWQI) classifications, 60.45 and 53.86% of water samples, respectively, are classified as having excellent to good quality, while the remaining samples vary from medium to extremely poor quality for drinking purposes. Furthermore, the EWQI classification provides more reasonable results than the GWQI owing to its simplicity, accuracy, and avoidance of artificial weights. A Gaussian semivariogram was chosen as the best-fit model, and the groundwater quality indices show weak spatial dependence, suggesting that both geogenic and anthropogenic factors play a pivotal role in the spatial heterogeneity of groundwater quality.
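The entropy weights underlying an EWQI can be sketched as follows; this is one common formulation of the entropy weight method, the input normalization is simplified, and the concentrations are hypothetical:

```python
import math

def entropy_weights(matrix):
    """Entropy weights for water-quality parameters.
    matrix[i][j]: (normalised, positive) value of parameter j in sample i.
    Parameters whose values vary more across samples get larger weights."""
    n = len(matrix)
    raw = []
    for j in range(len(matrix[0])):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        e = -sum(v * math.log(v) for v in p if v > 0) / math.log(n)  # entropy in [0, 1]
        raw.append(1 - e)  # degree of diversification
    s = sum(raw)
    return [w / s for w in raw]

# Hypothetical normalised concentrations for three parameters in four samples
samples = [[0.2, 0.9, 0.1],
           [0.4, 0.8, 0.5],
           [0.1, 0.9, 0.9],
           [0.3, 1.0, 0.2]]
weights = entropy_weights(samples)  # sums to 1
```

The EWQI for a sample is then a weighted sum of its per-parameter quality ratings using these weights, which is why the method avoids the artificial weights the abstract mentions.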
Cultural Adaptation Quality of Family Life Scale for the Brazilian Portuguese.
Jorge, Bianca Miguel; Levy, Cilmara Cristina Alves da Costa; Granato, Lídio
2015-01-01
To culturally adapt the Family Quality of Life Scale to Brazilian Portuguese and to evaluate the instrument's reliability and the family quality of life of those who have children with hearing loss. The cultural adaptation of the scale followed the steps of the Guidelines for the Process of Cross-Cultural Adaptation of Self-Report Measures and was conducted in three stages: translation, back translation, and application in a pilot sample to check for difficulties in comprehending the items. The adapted scale was then administered to 41 families who have children with hearing loss, and quality of life and reliability were analyzed from their results using Cronbach's alpha. In the first version (translation), the translators differed on only 4 of the 25 items; after corrections, the second version (back translation) produced four further differences. After the final corrections, the last version was developed and used in the pilot sample without differences. It was then applied to families with deaf children, who reported being satisfied with their quality of life. Cronbach's alpha showed that the scale has satisfactory reliability. The Brazilian Portuguese version of the Family Quality of Life Scale is easy to use and satisfactorily reliable, and the families are satisfied with their family quality of life.
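Cronbach's alpha, used here to assess reliability, is straightforward to compute from an item-score matrix. A minimal sketch of the generic formula, not tied to this scale's data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of row totals
    return (k / (k - 1)) * (1.0 - item_vars / total_var)
```

Values around 0.7 or above are conventionally read as satisfactory internal consistency.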
Leadership in nursing and patient satisfaction in hospital context.
Nunes, Elisabete Maria Garcia Teles; Gaspar, Maria Filomena Mendes
2016-06-01
Objectives: to assess the quality of the leadership relationship from the perspectives of chief nurses and nurses, patient satisfaction, and the relationship between the quality of the relationship as perceived by both and patient satisfaction. Methods: a quantitative, cross-sectional, correlational approach. The non-probabilistic convenience sample consisted of 15 chief nurses, 342 nurses, and 273 patients. Data were collected at the Central Lisbon Hospital Center, between January and March 2013, through the LMX-7, CLMX-7, and SUCEH21 scales. Statistical analysis was performed with SPSS® Statistics 19. Results: the chief nurses consider the quality of the leadership relationship good, the nurses consider it satisfactory, and patients report being satisfied with nursing care; there is a statistically significant correlation between the quality of the leadership relationship from the perspective of chief nurses and patient satisfaction, but no statistically significant correlation between the quality of the leadership relationship from the nurses' perspective and satisfaction. Conclusion: the chief nurse has a major role in patient satisfaction.
Enhanced data validation strategy of air quality monitoring network.
Harkat, Mohamed-Faouzi; Mansouri, Majdi; Nounou, Mohamed; Nounou, Hazem
2018-01-01
Quick validation and detection of faults in measured air quality data is a crucial step towards achieving the objectives of air quality networks. The objectives of this paper are therefore threefold: (i) to develop a modeling technique that can predict the normal behavior of air quality variables and provide an accurate reference for monitoring purposes; (ii) to develop a fault detection method that can effectively and quickly detect anomalies in measured air quality data. For this purpose, a new fault detection method based on the combination of the generalized likelihood ratio test (GLRT) and the exponentially weighted moving average (EWMA) is developed. GLRT is a well-known statistical fault detection method that maximizes the detection probability for a given false alarm rate; the proposed GLRT-based EWMA method detects changes in the values of selected air quality variables; and (iii) to develop a fault isolation and identification method that locates the fault source(s) so that appropriate corrective actions can be applied. Here, a reconstruction approach based on a Midpoint-Radii Principal Component Analysis (MRPCA) model is developed to handle the types of data and models associated with air quality monitoring networks. All air quality modeling, fault detection, fault isolation and reconstruction methods developed in this paper are validated using real air quality data (such as particulate matter, ozone, and nitrogen and carbon oxide measurements). Copyright © 2017 Elsevier Inc. All rights reserved.
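A plain EWMA chart on model residuals, the simpler half of the GLRT-EWMA combination described above, can be sketched as follows. The smoothing constant, control-limit multiplier, and unit residual standard deviation are illustrative assumptions, not values from the paper:

```python
import numpy as np

def ewma_detect(residuals, lam=0.2, L=3.0):
    """Flag anomalies in model residuals with an EWMA chart.

    z_t = lam * r_t + (1 - lam) * z_{t-1}, started at z_0 = 0.
    The control limit uses the asymptotic EWMA standard deviation
    sigma * sqrt(lam / (2 - lam)); residuals are assumed to have
    mean 0 and standard deviation sigma under normal operation.
    """
    sigma = 1.0  # assumed known from fault-free training data
    limit = L * sigma * np.sqrt(lam / (2.0 - lam))
    z, flags = 0.0, []
    for x in np.asarray(residuals, dtype=float):
        z = lam * x + (1.0 - lam) * z
        flags.append(abs(z) > limit)
    return np.array(flags)
```

Because the EWMA accumulates evidence over time, small sustained shifts that a pointwise test would miss are eventually flagged.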
Load-based approaches for modelling visual clarity in streams at regional scale.
Elliott, A H; Davies-Colley, R J; Parshotam, A; Ballantine, D
2013-01-01
Reduction of visual clarity in streams by diffuse sources of fine sediment is a cause of water quality impairment in New Zealand and internationally. In this paper we introduce the concept of a load of optical cross section (LOCS), which can be used for load-based management of light-attenuating substances and for water quality models that are based on mass accounting. In this approach, the beam attenuation coefficient (units of m(-1)) is estimated from the inverse of the visual clarity (units of m) measured with a black disc. This beam attenuation coefficient can also be considered as an optical cross section (OCS) per volume of water, analogous to a concentration. The instantaneous 'flux' of cross section is obtained from the attenuation coefficient multiplied by the water discharge, and this can be accumulated over time to give an accumulated 'load' of cross section (LOCS). Moreover, OCS is a conservative quantity, in the sense that the OCS of two combined water volumes is the sum of the OCS of the individual water volumes (barring effects such as coagulation, settling, or sorption). The LOCS can be calculated for a water quality station using rating curve methods applied to measured time series of visual clarity and flow. This approach was applied to the sites in New Zealand's National Rivers Water Quality Network (NRWQN). Although the attenuation coefficient follows roughly a power relation with flow at some sites, more flexible loess rating curves are required at other sites. The hybrid mechanistic-statistical catchment model SPARROW (SPAtially Referenced Regressions On Watershed attributes), which is based on a mass balance for mean annual load, was then applied to the NRWQN dataset. Preliminary results from this model are presented, highlighting the importance of factors related to erosion, such as rainfall, slope, hardness of catchment rock types, and the influence of pastoral development on the load of optical cross section.
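The LOCS bookkeeping described above reduces to a short calculation. This sketch takes the beam attenuation coefficient as the simple inverse of black-disc visual clarity, as in the abstract, and assumes regularly spaced observations; in practice a rating curve interpolates between sampling dates:

```python
import numpy as np

def locs(clarity_m, discharge_m3s, dt_s):
    """Accumulated load of optical cross section (LOCS).

    Beam attenuation c (m^-1) is estimated as the inverse of
    black-disc visual clarity (m); the instantaneous flux of cross
    section is c * Q (m^2 s^-1), integrated over the sampling
    interval dt to give an accumulated load in m^2.
    """
    c = 1.0 / np.asarray(clarity_m, dtype=float)       # m^-1, an OCS per volume
    flux = c * np.asarray(discharge_m3s, dtype=float)  # m^2 per second
    return float(np.sum(flux * dt_s))                  # accumulated m^2
```

Because OCS is conservative under mixing (barring coagulation, settling, or sorption), such loads can be summed across tributaries the same way sediment mass loads are.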
Survey statistics of automated segmentations applied to optical imaging of mammalian cells.
Bajcsy, Peter; Cardone, Antonio; Chalfoun, Joe; Halter, Michael; Juba, Derek; Kociolek, Marcin; Majurski, Michael; Peskin, Adele; Simon, Carl; Simon, Mylene; Vandecreme, Antoine; Brady, Mary
2015-10-15
The goal of this survey paper is to overview cellular measurements using optical microscopy imaging followed by automated image segmentation. The cellular measurements of primary interest are taken from mammalian cells and their components. They are denoted as two- or three-dimensional (2D or 3D) image objects of biological interest. In our applications, such cellular measurements are important for understanding cell phenomena, such as cell counts, cell-scaffold interactions, cell colony growth rates, or cell pluripotency stability, as well as for establishing quality metrics for stem cell therapies. In this context, this survey paper is focused on automated segmentation as a software-based measurement leading to quantitative cellular measurements. We define the scope of this survey and a classification schema first. Next, all found and manually filtered publications are classified according to the main categories: (1) objects of interests (or objects to be segmented), (2) imaging modalities, (3) digital data axes, (4) segmentation algorithms, (5) segmentation evaluations, (6) computational hardware platforms used for segmentation acceleration, and (7) object (cellular) measurements. Finally, all classified papers are converted programmatically into a set of hyperlinked web pages with occurrence and co-occurrence statistics of assigned categories. The survey paper presents to a reader: (a) the state-of-the-art overview of published papers about automated segmentation applied to optical microscopy imaging of mammalian cells, (b) a classification of segmentation aspects in the context of cell optical imaging, (c) histogram and co-occurrence summary statistics about cellular measurements, segmentations, segmented objects, segmentation evaluations, and the use of computational platforms for accelerating segmentation execution, and (d) open research problems to pursue.
The novel contributions of this survey paper are: (1) a new type of classification of cellular measurements and automated segmentation, (2) statistics about the published literature, and (3) a web hyperlinked interface to classification statistics of the surveyed papers at https://isg.nist.gov/deepzoomweb/resources/survey/index.html.
Mjørud, Marit; Kirkevold, Marit; Røsvik, Janne; Engedal, Knut
2014-01-01
To investigate which factors the Quality of Life in Late-Stage Dementia (QUALID) scale holds when used among people with dementia (pwd) in nursing homes and to find out how the symptom load varies across the different severity levels of dementia. We included 661 pwd [mean age ± SD, 85.3 ± 8.6 years; 71.4% women]. The QUALID and the Clinical Dementia Rating (CDR) scale were applied. A principal component analysis (PCA) with varimax rotation and Kaiser normalization was applied to test the factor structure. Nonparametric analyses were applied to examine differences in symptom load across the three CDR groups. The mean QUALID score was 21.5 (±7.1), and the CDR scores of the three groups were 1 in 22.5%, 2 in 33.6% and 3 in 43.9%. The results of the statistical measures employed were the following: Cronbach's α of QUALID, 0.74; Bartlett's test of sphericity, p < 0.001; the Kaiser-Meyer-Olkin measure, 0.77. The PCA resulted in three components accounting for 53% of the variance. The first component was 'tension' ('facial expression of discomfort', 'appears physically uncomfortable', 'verbalization suggests discomfort', 'being irritable and aggressive', 'appears calm', Cronbach's α = 0.69), the second was 'well-being' ('smiles', 'enjoys eating', 'enjoys touching/being touched', 'enjoys social interaction', Cronbach's α = 0.62) and the third was 'sadness' ('appears sad', 'cries', 'facial expression of discomfort', Cronbach's α = 0.65). The mean scores on the components 'tension' and 'well-being' increased significantly with increasing severity levels of dementia. Three components of quality of life (qol) were identified. Qol decreased with increasing severity of dementia. © 2013 S. Karger AG, Basel.
Gatti, Marco; Marchisio, Filippo; Fronda, Marco; Rampado, Osvaldo; Faletti, Riccardo; Bergamasco, Laura; Ropolo, Roberto; Fonio, Paolo
The aim of this study was to evaluate the impact on dose reduction and image quality of the new iterative reconstruction technique adaptive statistical iterative reconstruction V (ASIR-V). Fifty consecutive oncologic patients acted as their own controls, undergoing during follow-up computed tomography scans with both ASIR and ASIR-V. Each study was analyzed in a double-blinded fashion by 2 radiologists. Both quantitative and qualitative analyses of image quality were conducted. Computed tomography scanner radiation output was 38% (29%-45%) lower (P < 0.0001) for the ASIR-V examinations than for the ASIR ones. The quantitative image noise was significantly lower (P < 0.0001) for ASIR-V. ASIR-V performed better on subjective image noise (P = 0.01 for 5 mm and P = 0.009 for 1.25 mm), while the other parameters (image sharpness, diagnostic acceptability, and overall image quality) were similar (P > 0.05). ASIR-V is a new iterative reconstruction technique that has the potential to provide image quality equal to or greater than ASIR, with a dose reduction of around 40%.
Industry guidelines, laws and regulations ignored: quality of drug advertising in medical journals.
Lankinen, Kari S; Levola, Tero; Marttinen, Kati; Puumalainen, Inka; Helin-Salmivaara, Arja
2004-11-01
To document the quality of the evidence base for marketing claims in prescription drug advertisements, to facilitate identification of potential targets for quality improvement. A sample of 1036 advertisements from four major Finnish medical journals published in 2002. Marketing claims were classified in four groups: unambiguous clinical outcome, vague clinical outcome, emotive or immeasurable outcome and non-clinical outcome. Medline references were traced and classified according to the level of evidence available. The statistical variables used in the advertisements were also documented. The sample included 245 distinct advertisements with 883 marketing claims, 1-10 claims per advertisement. Three hundred thirty-seven (38%) of the claims were referenced. Each claim could be supported by one reference or more, so the number of references analysed totalled 381, 1-9 references per advertisement. Nine percent of the claims implied unambiguous clinical outcomes; 68% included vague or emotive statements. Twenty-one percent of the references were irrelevant to the claim. There was a fair amount of non-scientific and scientific support for the 73 unambiguous claims, but not a single claim was supported by strong scientific evidence. Vague, emotive and non-clinical claims were significantly more often supported by non-Medline or irrelevant references than unambiguous claims. Statistical parameters were stated only 34 times. Referenced marketing claims may appear more scientific, but the use of references does not guarantee the quality of the claims. For the benefit of all stakeholders, both the regulatory control and the industry's self-control of drug marketing should adopt more active monitoring roles, and apply sanctions when appropriate. Concerted efforts by several stakeholders might be more effective. Copyright 2004 John Wiley & Sons, Ltd.
The effectiveness of progressive muscle relaxation on the postpartum quality of life.
Gökşin, İlknur; Ayaz-Alkaya, Sultan
2018-04-05
This study aimed to determine the effectiveness of progressive muscle relaxation (PMR) on the quality of life of women during the postpartum period. A quasi-experimental design was used. The participants were primiparous women who had experienced a vaginal birth in the obstetrics department of a hospital; 30 women in the intervention group and 30 women in the control group were included. Data were collected by questionnaire and the Maternal Postpartum Quality of Life Questionnaire (MAPP-QoL) between June 2016 and April 2017. PMR was applied to the intervention group and performed by contracting a muscle group, then relaxing it, moving (or progressing) from one muscle group to another. The mean pre-test and post-test scores of the MAPP-QoL in the intervention group were 24.43 ± 4.58 and 26.07 ± 4.58, respectively (t = -2.73, p < .05). The mean pre-test and post-test scores of the MAPP-QoL in the control group were 23.29 ± 4.37 and 21.99 ± 5.58, respectively (t = 2.23, p < .05). The difference between the mean scores of the women in the intervention and control groups before PMR was not statistically significant (t = 0.99, p > .05), whereas the difference between the groups after PMR was statistically significant (t = 3.09, p < .05). The postpartum quality of life of the women increased after PMR. PMR should be taught to women admitted to obstetrics departments and outpatient clinics, and home visits should be completed in order to expand its use. Copyright © 2018. Published by Elsevier B.V.
Tepavcevic, Darija Kisic; Pekmezovic, Tatjana; Stojsavljevic, Nebojsa; Kostic, Jelena; Basuroski, Irena Dujmovic; Mesaros, Sarlota; Drulovic, Jelena
2014-04-01
The aim of this study was to determine the changes in the health-related quality of life (HRQoL) and predictors of change among patients with multiple sclerosis (MS) at 3 and 6 years during the follow-up period. A group of 109 consecutive MS patients (McDonald's criteria) referred to the Clinic of Neurology, Belgrade, were enrolled in the study. At three time points during the study (baseline, and at 3 and 6 years during the follow-up period), the HRQoL (measured by MSQoL-54), Expanded Disability Status Scale, and Hamilton Rating Scale for Depression and Fatigue Severity Scale were assessed. During the study period, 93 patients provided both follow-up assessments. Statistically significant deterioration in the HRQoL at each subsequent time point was detected for all scales of the MSQoL-54 except for the pain and change in health scales. A higher level of education was a significant prognostic factor for a better HRQoL on the cognitive function scale throughout the entire period of observation, while marital status (single, including divorced and widowed) and increased age at the onset of MS had significant predictive values of poorer quality-of-life scores on the overall quality-of-life scale at 6-year follow-up. Higher levels of physical disability and depression at baseline were statistically significant prognostic markers for deterioration in HRQoL for the majority of MSQoL-54 scales during the entire follow-up period. Our study suggests that baseline demographic and clinical characteristics could be applied as prognostic markers of the HRQOL for patients diagnosed with MS.
Zhang, Jun; Yang, Ke-hu; Tian, Jin-hui; Wang, Chun-mei
2012-11-01
The aim of this meta-analysis was to evaluate the effects of yoga on psychologic function and quality of life (QoL) in women with breast cancer. A systematic search of PubMed, EMBASE, the Cochrane Library, the Chinese Biomedical Literature Database, and the Chinese Digital Journals Full-text Database was carried out. Randomized controlled trials (RCTs) examining the effects of yoga, versus a control group receiving no intervention, on psychologic functioning and QoL in women with breast cancer were included. Methodological quality of the included RCTs was assessed according to the Cochrane Handbook for Systematic Reviews of Interventions 5.0.1, and data were analyzed using the Cochrane Collaboration's Review Manager 5.1. Six (6) studies involving 382 patients were included. The meta-analysis showed that yoga can improve QoL for women with breast cancer: a statistically significant effect favoring yoga was found for the outcome of QoL (standard mean difference = 0.27, 95% confidence interval [0.02, 0.52], p = 0.03). Although the effects of yoga on psychologic function outcomes, such as anxiety, depression, distress and sleep, were in the expected direction, they were not statistically significant (p > 0.05). Fatigue showed no significant difference (p > 0.05). The present data thus provide little indication of how effective yoga might be for women with breast cancer, apart from a mild improvement in QoL. The findings were based on a small body of evidence of limited methodological quality. Further well-designed RCTs with large sample sizes are needed to clarify the utility of yoga practice for this population.
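The pooled effect above is an inverse-variance combination of standardized mean differences. A generic fixed-effect sketch follows; the review itself used Review Manager, whose model choice (fixed or random effects) may differ, so this illustrates the arithmetic only:

```python
import math

def pooled_smd(effects, variances):
    """Fixed-effect inverse-variance pooling of standardized mean
    differences (SMDs), with a 95% confidence interval.

    Each study is weighted by the reciprocal of its SMD variance;
    the pooled standard error is sqrt(1 / sum of weights).
    """
    w = [1.0 / v for v in variances]
    smd = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    return smd, (smd - 1.96 * se, smd + 1.96 * se)
```

A confidence interval excluding 0 corresponds to a statistically significant pooled effect, as for the QoL outcome above.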
Kubsik-Gidlewska, Anna; Klimkiewicz, Robert; Klimkiewicz, Paulina; Janczewska, Katarzyna; Jankowska, Agnieszka; Nowakowski, Tomasz; Woldańska-Okońska, Marta
2017-01-01
Multiple sclerosis is a chronic demyelinating disease of the central nervous system that results in progressive disability. The disease reduces patients' quality of life, changes general health perceptions, and limits the performance of social roles because of emotional problems. The aim was to evaluate the impact of rehabilitation methods on the mental health of patients with multiple sclerosis, including changes in the individual parameters of the overall mental health assessment. The study was conducted in 2010-2014 at the Department of Physical Medicine and Rehabilitation in Lodz and included 120 patients with multiple sclerosis. Patients were classified into 4 test groups: in the first, laser therapy was used; in the second, laser therapy and magnetostimulation; in the third, kinesiotherapy; and in the fourth, magnetostimulation. The tests were carried out three times. Quality of life was evaluated with the Quality of Life Questionnaire (MSQOL-54), and the overall assessment of mental health was analyzed. The improvement in a range of parameters of the overall assessment of mental health led to better overall psychological well-being. A statistically significant difference at the level of p < 0.001 was observed between groups in 4 of the 5 investigated parameters; no statistically significant differences were observed in the evaluation of cognitive functions. The greatest improvement was observed in Groups II and IV. The examination confirmed the effectiveness of physical treatments such as laser radiation and magnetostimulation. The synergism of the two methods in their biological activity evokes a hysteresis phenomenon, maintaining the treatment effects after the cessation of rehabilitation. Applying classical kinesiotherapy alone does not produce long-term effects.
[Male sterility and its association with genital disease and environmental factors].
Merino Ruiz, M C; De León Cervantes, M G; García Flores, R F
1995-10-01
Semen quality may be affected by many factors: there is evidence that conditions such as varicocele, cryptorchidism, orchitis and bacterial infections, as well as exposure to physical agents such as heat, exposure to chemical substances, and ingestion of alcohol and drugs, may affect semen quality. The objective of this study was to investigate the risk that exposure to these factors poses for semen quality. The study was carried out prospectively in a group of males at the Clínica de Infertilidad, Unidad de Biología de la Reproducción del Hospital Universitario Dr. J.E. González. Ninety-nine males were studied; they answered a targeted questionnaire about antecedents of exposure to environmental factors and previously resolved urologic pathology. Semen analysis (spermatobioscopy) was performed and classified according to WHO criteria. Two groups were formed: individuals with a normal semen analysis (n = 25) and those with abnormal results (n = 74). Incidence ratios, chi-square tests and attributable risk were applied in order to determine the impact that different factors may have on semen quality. The alterations found in semen were asthenozoospermia (n = 58), hypospermia (n = 22), oligozoospermia (n = 18), teratozoospermia (n = 7), polyzoospermia (n = 7) and azoospermia (n = 6). The results of these statistical tests show that these alterations carry a risk associated with tobacco use, exposure to chemical substances and physical agents, and previously corrected anatomic anomalies. This information is considered of great help because, once the unfavorable factors are eliminated, the environment is improved so that spermatogenesis can occur under optimal conditions.
NASA Astrophysics Data System (ADS)
Hong, Chaopeng; Zhang, Qiang; Zhang, Yang; Tang, Youhua; Tong, Daniel; He, Kebin
2017-06-01
In this study, a regional coupled climate-chemistry modeling system using the dynamical downscaling technique was established by linking the global Community Earth System Model (CESM) and the regional two-way coupled Weather Research and Forecasting - Community Multi-scale Air Quality (WRF-CMAQ) model for the purpose of comprehensive assessments of regional climate change and air quality and their interactions within one modeling framework. The modeling system was applied over east Asia for a multi-year climatological application during 2006-2010, driven with CESM downscaling data under Representative Concentration Pathways 4.5 (RCP4.5), along with a short-term air quality application in representative months in 2013 that was driven with a reanalysis dataset. A comprehensive model evaluation was conducted against observations from surface networks and satellite observations to assess the model's performance. This study presents the first application and evaluation of the two-way coupled WRF-CMAQ model for climatological simulations using the dynamical downscaling technique. The model was able to satisfactorily predict major meteorological variables. The improved statistical performance for the 2 m temperature (T2) in this study (with a mean bias of -0.6 °C) compared with the Coupled Model Intercomparison Project Phase 5 (CMIP5) multi-models might be related to the use of the regional model WRF and the bias-correction technique applied for CESM downscaling. The model showed good ability to predict PM2.5 in winter (with a normalized mean bias (NMB) of 6.4 % in 2013) and O3 in summer (with an NMB of 18.2 % in 2013) in terms of statistical performance and spatial distributions. Compared with global models that tend to underpredict PM2.5 concentrations in China, WRF-CMAQ was able to capture the high PM2.5 concentrations in urban areas. In general, the two-way coupled WRF-CMAQ model performed well for both climatological and air quality applications.
The coupled modeling system with direct aerosol feedbacks predicted aerosol optical depth relatively well and significantly reduced the overprediction in downward shortwave radiation at the surface (SWDOWN) over polluted regions in China. The performance of cloud variables was not as good as other meteorological variables, and underpredictions of cloud fraction resulted in overpredictions of SWDOWN and underpredictions of shortwave and longwave cloud forcing. The importance of climate-chemistry interactions was demonstrated via the impacts of aerosol direct effects on climate and air quality. The aerosol effects on climate and air quality in east Asia (e.g., SWDOWN and T2 decreased by 21.8 W m-2 and 0.45 °C, respectively, and most pollutant concentrations increased by 4.8-9.5 % in January over China's major cities) were more significant than in other regions because of higher aerosol loadings that resulted from severe regional pollution, which indicates the need for applying online-coupled models over east Asia for regional climate and air quality modeling and to study the important climate-chemistry interactions. This work established a baseline for WRF-CMAQ simulations for a future period under the RCP4.5 climate scenario, which will be presented in a future paper.
Statistical methods used in articles published by the Journal of Periodontal and Implant Science.
Choi, Eunsil; Lyu, Jiyoung; Park, Jinyoung; Kim, Hae-Young
2014-12-01
The purposes of this study were to assess the trend of use of statistical methods including parametric and nonparametric methods and to evaluate the use of complex statistical methodology in recent periodontal studies. This study analyzed 123 articles published in the Journal of Periodontal & Implant Science (JPIS) between 2010 and 2014. Frequencies and percentages were calculated according to the number of statistical methods used, the type of statistical method applied, and the type of statistical software used. Most of the published articles considered (64.4%) used statistical methods. Since 2011, the percentage of JPIS articles using statistics has increased. On the basis of multiple counting, we found that the percentage of studies in JPIS using parametric methods was 61.1%. Further, complex statistical methods were applied in only 6 of the published studies (5.0%), and nonparametric statistical methods were applied in 77 of the published studies (38.9% of a total of 198 studies considered). We found an increasing trend towards the application of statistical methods and nonparametric methods in recent periodontal studies and thus, concluded that increased use of complex statistical methodology might be preferred by the researchers in the fields of study covered by JPIS.
Rural-urban disparity in oral health-related quality of life.
Gaber, Amal; Galarneau, Chantal; Feine, Jocelyne S; Emami, Elham
2018-04-01
The objective of this population-based cross-sectional study was to estimate rural-urban disparity in the oral health-related quality of life (OHRQoL) of the Quebec adult population. A 2-stage sampling design was used to collect data from the 1788 parents/caregivers of schoolchildren living in the 8 regions of the province of Quebec in Canada. Andersen's behavioural model for health services utilization was used as a conceptual framework. Place of residency was defined according to the Statistics Canada Census Metropolitan Area and Census Agglomeration Influenced Zone classification. The outcome of interest was OHRQoL measured using the Oral Health Impact Profile (OHIP)-14 validated questionnaire. Data weighting was applied, and the prevalence, extent and severity of negative oral health impacts were calculated. Statistical analyses included descriptive statistics, bivariate analyses and binary logistic regression. The prevalence of poor oral health-related quality of life (OHRQoL) was statistically higher in rural areas than in urban zones (P = .02). Rural residents reported a significantly higher prevalence of negative daily-life impacts in the pain, psychological discomfort and social disability OHIP domains (P < .05). Additionally, the rural population showed a greater number of negative oral health impacts (P = .03). There was no significant rural-urban difference in the severity of poor oral health. Logistic regression indicated that the prevalence of poor OHRQoL was significantly related to place of residency (OR = 1.6; 95% CI = 1.1-2.5; P = .022), perceived oral health (OR = 9.4; 95% CI = 5.7-15.5; P < .001), dental treatment needs factors (perceived need for dental treatment, pain, dental care seeking) (OR = 8.7; 95% CI = 4.8-15.6; P < .001) and education (OR = 2.7; 95% CI = 1.8-3.9; P < .001).
The results of this study suggest a potential difference in OHRQoL of Quebec rural and urban populations, and a need to develop strategies to promote oral health outcomes, specifically for rural residents. Further studies are needed to confirm these results. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Parvin, C A
1993-03-01
The error detection characteristics of quality-control (QC) rules that use control observations within a single analytical run are investigated. Unlike the evaluation of QC rules that span multiple analytical runs, most of the fundamental results regarding the performance of QC rules applied within a single analytical run can be obtained from statistical theory, without the need for simulation studies. The case of two control observations per run is investigated for ease of graphical display, but the conclusions can be extended to more than two control observations per run. Results are summarized in a graphical format that offers many interesting insights into the relations among the various QC rules. The graphs provide heuristic support to the theoretical conclusions that no QC rule is best under all error conditions, but the multirule that combines the mean rule and a within-run standard deviation rule offers an attractive compromise.
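The within-run multirule the paper favors, a mean rule combined with a within-run standard deviation rule, can be illustrated for control observations expressed as z-scores (observed value minus target, in SD units). The two limits below are illustrative placeholders, not the optimized values derived in the paper:

```python
import statistics

def multirule(controls, mean_limit=2.33, sd_limit=2.5):
    """Within-run multirule QC check for one analytical run.

    Rejects the run if either the z-score of the run mean exceeds
    mean_limit (sensitive to systematic error) or the within-run
    standard deviation of the z-scores exceeds sd_limit (sensitive
    to random error). Requires at least two control observations.
    """
    n = len(controls)
    mean_z = abs(sum(controls) / n) * n ** 0.5  # z-score of the run mean
    sd_z = statistics.stdev(controls)           # within-run dispersion
    return mean_z > mean_limit or sd_z > sd_limit
```

The combination mirrors the paper's conclusion: neither rule alone is best under all error conditions, so the multirule trades a little power against each error type for coverage of both.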
[Effect of thermal treatments on the chemical characteristics of mora crab meat (Homalaspis plana)].
Quitral Robles, Vilma; Abugoch, Lilian; Vinagre, Julia; Guarda, Abel; Larraín, M Angélica; Santana, Gabriela
2003-03-01
Marine species muscle contains non-protein nitrogenous compounds that are used as quality indices: total volatile basic nitrogen (TVB-N), trimethylamine oxide (TMAO) and trimethylamine (TMA). pH is also considered a quality index. The aim of this work was to evaluate these parameters in a fresh and a canned marine product from the V Region, the mora crab (Homalaspis plana). Fresh pincer meat was extracted from mora crab and kept on ice until analysis and until the thermal processing of the canned product. A 3(2) statistical design was applied, crossing two variables at 3 levels each: time (15, 30 and 45 minutes) and temperature (80, 100 and 121 degrees C), giving nine time-temperature conditions. The thermal treatment caused an increase in pH and TVB-N, and TMA increased as a result of the reduction of TMAO.
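The 3(2) design described above crosses two factors at three levels each; a quick sketch of generating the nine time-temperature conditions:

```python
from itertools import product

times = [15, 30, 45]     # minutes
temps = [80, 100, 121]   # degrees Celsius

# Full 3^2 factorial: every time level crossed with every temperature level
conditions = list(product(times, temps))

for t, T in conditions:
    print(f"{t} min at {T} degrees C")
print(f"{len(conditions)} conditions in total")
```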
Social support network and quality of life in multiple sclerosis patients.
Costa, David Castro; Sá, Maria José; Calheiros, José Manuel
2017-05-01
To analyse the relationship between the social support network (SSN) and health related quality of life (HRQOL) in multiple sclerosis (MS) patients. The sample comprised 150 consecutive MS patients attending our MS clinic. To assess the socio-demographic data, a specifically designed questionnaire was applied. The HRQOL dimensions were measured with the Short-Form Health Survey Questionnaire-SF36 and the SSN with the Medical Outcomes Study Social Support Survey. Spearman's correlation was used to assess the magnitude of the relationship between the SSN and HRQOL. The mean patient age was 41.7 years (± 10.4; range: 18-70 yr); the mean Expanded Disability Status Score was 2.5 (±2.4; range: 0-9). There was a statistically significant correlation between the structure of the SSN and the HRQOL. The composition of the SSN, social group membership and participation in voluntary work have an important role in the HRQOL of patients with MS.
Illinois' Forests, 2005: Statistics, Methods, and Quality Assurance
Susan J. Crocker; Charles J. Barnett; Mark A. Hatfield
2013-01-01
The first full annual inventory of Illinois' forests was completed in 2005. This report contains 1) descriptive information on methods, statistics, and quality assurance of data collection, 2) a glossary of terms, 3) tables that summarize quality assurance, and 4) a core set of tabular estimates for a variety of forest resources. A detailed analysis of inventory...
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
The Statistical Interpretation of Classical Thermodynamic Heating and Expansion Processes
ERIC Educational Resources Information Center
Cartier, Stephen F.
2011-01-01
A statistical model has been developed and applied to interpret thermodynamic processes typically presented from the macroscopic, classical perspective. Through this model, students learn and apply the concepts of statistical mechanics, quantum mechanics, and classical thermodynamics in the analysis of the (i) constant volume heating, (ii)…
NASA Astrophysics Data System (ADS)
Robichaud, A.; Ménard, R.
2013-05-01
We present multi-year objective analyses (OA) on a high spatio-temporal resolution (15 or 21 km, every hour) for the warm season period (1 May-31 October) for ground-level ozone (2002-2012) and for fine particulate matter (diameter less than 2.5 microns (PM2.5)) (2004-2012). The OA used here combines the Canadian Air Quality forecast suite with US and Canadian surface air quality monitoring sites. The analysis is based on an optimal interpolation with capabilities for adaptive error statistics for ozone and PM2.5 and an explicit bias correction scheme for the PM2.5 analyses. The estimation of error statistics has been computed using a modified version of the Hollingsworth-Lönnberg (H-L) method. Various quality controls (gross error check, sudden jump test and background check) have been applied to the observations to remove outliers. An additional quality control is applied to check the consistency of the error statistics estimation model at each observing station and for each hour. The error statistics are further tuned "on the fly" using a χ2 (chi-square) diagnostic, a procedure which yields significantly better verification scores than the untuned analysis. Successful cross-validation experiments were performed with an OA set-up using 90% of observations to build the objective analysis and with the remainder left out as an independent set of data for verification purposes. Furthermore, comparisons with other external sources of information (global models and PM2.5 satellite surface derived measurements) show reasonable agreement. The multi-year analyses obtained provide relatively high precision with an absolute yearly averaged systematic error of less than 0.6 ppbv (parts per billion by volume) and 0.7 μg m-3 (micrograms per cubic meter) for ozone and PM2.5 respectively and a random error generally less than 9 ppbv for ozone and under 12 μg m-3 for PM2.5.
In this paper, we focus on two applications: (1) presenting long term averages of objective analysis and analysis increments as a form of summer climatology and (2) analyzing long term (decadal) trends and inter-annual fluctuations using OA outputs. Our results show that high percentiles of ozone and PM2.5 are both following a decreasing trend overall in North America, with the eastern part of the United States (US) showing the largest decrease, likely due to more effective pollution controls. Some locations, however, exhibited an increasing trend in mean ozone and PM2.5, such as the northwestern part of North America (northwest US and Alberta). The low percentiles are generally rising for ozone, which may be linked to increasing emissions from emerging countries and the resulting pollution brought by intercontinental transport. After removing the decadal trend, we demonstrate that the inter-annual fluctuations of the high percentiles are significantly correlated with temperature fluctuations for ozone and precipitation fluctuations for PM2.5. We also show that there was a moderately significant correlation between the inter-annual fluctuations of the high percentiles of ozone and PM2.5 and economic indices such as the Industrial Dow Jones and/or the US gross domestic product growth rate.
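The χ2 diagnostic used above to tune the error statistics rests on a simple idea: scale the assumed innovation variance until the mean chi-square statistic matches its expected value. A scalar sketch of that idea (the single-variance simplification and variable names are mine, not the paper's formulation):

```python
def chi_square_tune(innovations, total_var):
    """Scale an assumed innovation variance (background plus
    observation error) so that the mean chi-square statistic,
    mean(d^2) / total_var, matches its expected value of 1."""
    n = len(innovations)
    chi2_mean = sum(d * d for d in innovations) / (n * total_var)
    # If chi2_mean > 1 the assumed variance is too optimistic; inflate it.
    return total_var * chi2_mean

# Innovations (observation minus background) with true variance near 4,
# but an assumed variance of 1: the diagnostic inflates the estimate.
innovations = [2.0, -1.8, 2.2, -2.1, 1.9]
tuned = chi_square_tune(innovations, total_var=1.0)
print(tuned)  # close to the true variance of ~4
```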
Wire Roughness Assessment of 0.016'' × 0.022'' Archwires Used in the Lingual Orthodontics Technique.
Facchini, Fátima Mm; Filho, Mario Vedovello; Vedovello, Silvia As; Cotrim, Flávio A; Cotrim-Ferreira, Andréa; Tubel, Carlos Am
2017-04-01
To evaluate the difference in surface roughness of stainless steel archwires of different commercial brands used in lingual orthodontics. Precontoured arches measuring 0.016'' × 0.022'' were selected from the following brands: Tecnident, Adenta, G&H, Highland Metals Inc., Ormco, Incognito, and Ebraces. Quantitative evaluation of the surface roughness of archwires was performed by means of an atomic force microscope in contact mode. Three surface readouts were taken of each sample, analyzing areas of 20 × 20 μm. Each scan of the samples produced a readout of 512 lines, generating three-dimensional images of the wires. Analysis of variance was applied to identify significant variables (p < 0.05), with H0 being rejected and H1 accepted. The Incognito brand showed the lowest surface roughness. The archwires of brands Adenta, Tecnident, Highland, and Ormco showed similar values among them, all close to those obtained by the Incognito brand. The archwires of the Ebraces brand showed the highest surface roughness, with values close to those of the G&H brand. There was a statistical difference in surface roughness of orthodontic archwires among the brands studied. Companies should pay attention to the quality control of their materials, as these may directly affect the quality of orthodontic treatment.
Ryu, Jihye; Lee, Chaeyoung
2014-12-01
Positive selection not only increases beneficial allele frequency but also causes augmentation of allele frequencies of sequence variants in close proximity. Signals for positive selection were detected by the statistical differences in subsequent allele frequencies. To identify selection signatures in Korean cattle, we applied a composite log-likelihood (CLL)-based method, which calculates a composite likelihood of the allelic frequencies observed across sliding windows of five adjacent loci and compares the value with the critical statistic estimated by 50,000 permutations. Data for a total of 11,799 nucleotide polymorphisms were used with 71 Korean cattle and 209 foreign beef cattle. As a result, 147 signals were identified for Korean cattle based on CLL estimates (P < 0.01). The signals might be candidate genetic factors for meat quality by which the Korean cattle have been selected. Further genetic association analysis with 41 intragenic variants in the selection signatures with the greatest CLL for each chromosome revealed that marbling score was associated with five variants. Intensive association studies with all the selection signatures identified in this study are required to exclude signals associated with other phenotypes or signals falsely detected and thus to identify genetic markers for meat quality. © 2014 Stichting International Foundation for Animal Genetics.
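The sliding-window scan over five adjacent loci can be sketched as follows; the per-locus term here is a plain binomial log-likelihood on toy allele counts, a simplified stand-in for the paper's composite-likelihood statistic and its permutation-based critical values:

```python
import math

def locus_loglik(alt_count, n_alleles, p):
    """Binomial log-likelihood of observing alt_count alternate
    alleles out of n_alleles given allele frequency p."""
    p = min(max(p, 1e-9), 1 - 1e-9)
    return (alt_count * math.log(p)
            + (n_alleles - alt_count) * math.log(1 - p))

def sliding_cll(alt_counts, n_alleles, freqs, window=5):
    """Composite log-likelihood summed over sliding windows of
    `window` adjacent loci."""
    per_locus = [locus_loglik(c, n_alleles, p)
                 for c, p in zip(alt_counts, freqs)]
    return [sum(per_locus[i:i + window])
            for i in range(len(per_locus) - window + 1)]

# Toy data: 8 loci, 20 alleles sampled per locus
alt_counts = [5, 6, 4, 18, 19, 17, 18, 5]
freqs = [c / 20 for c in alt_counts]  # observed allele frequencies
cll = sliding_cll(alt_counts, 20, freqs)
print(len(cll))  # number of 5-locus windows over 8 loci
```

In the paper's procedure, each window's statistic would be compared with a critical value estimated from 50,000 permutations; that step is omitted here.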
Dantan, Etienne; Foucher, Yohann; Lorent, Marine; Giral, Magali; Tessier, Philippe
2018-06-01
Defining thresholds of prognostic markers is essential for stratified medicine. Such thresholds are mostly estimated from purely statistical measures regardless of patient preferences, potentially leading to unacceptable medical decisions. Quality-Adjusted Life-Years are a widely used preference-based measure of health outcomes. We develop a time-dependent Quality-Adjusted Life-Years-based expected utility function for censored data that should be maximized to estimate an optimal threshold. We performed a simulation study to compare estimated thresholds when using the proposed expected utility approach and purely statistical estimators. Two applications illustrate the usefulness of the proposed methodology, which was implemented in the R package ROCt ( www.divat.fr ). First, by reanalysing data of a randomized clinical trial comparing the efficacy of prednisone vs. placebo in patients with chronic liver cirrhosis, we demonstrate the utility of treating patients with a prothrombin level higher than 89%. Second, we reanalyse the data of an observational cohort of kidney transplant recipients: we conclude that the Kidney Transplant Failure Score is not useful for adapting the frequency of clinical visits. Applying such a patient-centered methodology may improve the future transfer of novel prognostic scoring systems or markers into clinical practice.
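The core idea of choosing a marker threshold by maximizing an expected utility rather than a purely statistical criterion can be sketched as follows (the toy marker values and utilities are hypothetical, and the paper's estimator additionally handles censored follow-up, which this sketch ignores):

```python
def best_threshold(markers, utilities_treated, utilities_untreated):
    """Scan candidate thresholds; for each, treat patients whose
    marker exceeds it and compute the mean utility of that policy."""
    best_c, best_eu = None, float("-inf")
    for c in sorted(set(markers)):
        eu = sum(ut if m > c else uu
                 for m, ut, uu in zip(markers,
                                      utilities_treated,
                                      utilities_untreated))
        eu /= len(markers)
        if eu > best_eu:
            best_c, best_eu = c, eu
    return best_c, best_eu

# Toy cohort: treatment helps only patients with a high marker value
markers             = [10, 20, 30, 40, 50]
utilities_treated   = [1.0, 1.0, 3.0, 4.0, 5.0]
utilities_untreated = [2.0, 2.0, 2.0, 2.0, 2.0]
print(best_threshold(markers, utilities_treated, utilities_untreated))
```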
Comparison of semen parameters in samples collected by masturbation at a clinic and at home.
Elzanaty, Saad; Malm, Johan
2008-06-01
To investigate differences in semen quality between samples collected by masturbation at a clinic and at home. Cross-sectional study. Fertility center. Three hundred seventy-nine men assessed for infertility. None. Semen was analyzed according to World Health Organization guidelines. Seminal markers of epididymal (neutral alpha-glucosidase), prostatic (prostate-specific antigen and zinc), and seminal vesicle (fructose) function were measured. Two patient groups were defined according to sample collection location: at a clinic (n = 273) or at home (n = 106). Compared with clinic-collected semen, home-collected samples had statistically significantly higher values for sperm concentration, total sperm count, rapid progressive motility, and total count of progressive motility. Semen volume, proportion of normal sperm morphology, neutral alpha-glucosidase, prostate-specific antigen, zinc, and fructose did not differ significantly between groups. An abnormal sperm concentration (<20 x 10(6)/mL) was seen in statistically significantly fewer of the samples obtained at home (19/106, 18%) than at the clinic (81/273, 30%), and the same applied to proportions of samples with abnormal (< 25%) rapid progressive motility (68/106 [64%] and 205/273 [75%], respectively). The present results demonstrate superior semen quality in samples collected by masturbation at home compared with at a clinic. This should be taken into consideration in infertility investigations.
Hybrid modeling as a QbD/PAT tool in process development: an industrial E. coli case study.
von Stosch, Moritz; Hamelink, Jan-Martijn; Oliveira, Rui
2016-05-01
Process understanding is emphasized in the process analytical technology initiative and the quality by design paradigm as essential for manufacturing biopharmaceutical products with consistently high quality. A typical approach to developing process understanding is to apply a combination of design of experiments and statistical data analysis. Hybrid semi-parametric modeling is investigated as an alternative method to pure statistical data analysis. The hybrid model framework provides flexibility to select model complexity based on available data and knowledge. Here, a parametric dynamic bioreactor model is integrated with a nonparametric artificial neural network that describes biomass and product formation rates as a function of varied fed-batch fermentation conditions for high cell density heterologous protein production with E. coli. Our model can accurately describe biomass growth and product formation across variations in induction temperature, pH and feed rates. The model indicates that while product expression rate is a function of early induction phase conditions, it is negatively impacted as productivity increases. This could correspond with physiological changes due to cytoplasmic product accumulation. Due to the dynamic nature of the model, rational process timing decisions can be made, and the impact of temporal variations in process parameters on product formation and process performance can be assessed, which is central for process understanding.
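The hybrid structure combines a mechanistic mass balance with a data-driven rate term; a minimal sketch of that composition (the tiny network and its weights are arbitrary stand-ins for a trained model, and the dynamics are reduced to a single biomass balance):

```python
import math

def nn_rate(temp, ph, feed, w_hidden, w_out):
    """Nonparametric part: a one-hidden-layer network mapping process
    conditions to a specific growth rate (1/h)."""
    hidden = [math.tanh(w0 + w1 * temp + w2 * ph + w3 * feed)
              for w0, w1, w2, w3 in w_hidden]
    return max(0.0, sum(w * h for w, h in zip(w_out, hidden)))

def simulate(x0, conditions, w_hidden, w_out, dt=0.1):
    """Parametric part: Euler integration of dX/dt = mu * X, with mu
    supplied by the network at each time step."""
    x = x0
    for temp, ph, feed in conditions:
        mu = nn_rate(temp, ph, feed, w_hidden, w_out)
        x += mu * x * dt
    return x

# Arbitrary illustrative weights and a constant-condition fed batch
w_hidden = [(0.1, 0.02, -0.05, 0.3), (-0.2, 0.01, 0.1, 0.2)]
w_out = [0.5, 0.4]
conditions = [(30.0, 7.0, 0.5)] * 50  # 5 h at fixed T, pH, feed rate
print(simulate(1.0, conditions, w_hidden, w_out))
```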
Effect of topical honey on limitation of radiation-induced oral mucositis: an intervention study.
Khanal, B; Baliga, M; Uppal, N
2010-12-01
Radiation therapy for oral carcinoma is therapeutically useful at doses of at least 6000 cGy but causes mucositis that severely interferes with oral function. The literature indicates that honey appears to promote wound healing, so the authors investigated whether its anti-inflammatory properties might limit the severity of radiation-induced oral mucositis. A single-blinded, randomized, controlled clinical trial was carried out to compare the mucositis-limiting qualities of honey with lignocaine. A visual assessment scale permitted scoring of degrees of mucositis, and statistical evaluation of the results was performed using the χ(2) test. Only 1 of 20 patients in the honey group developed intolerable oral mucositis compared with the lignocaine group, indicating that honey is strongly protective (RR=0.067) against the development of mucositis. The proportion of patients with intolerable oral mucositis was lower in the honey group, and this difference was statistically significant (p < 0.001). Honey applied topically to the oral mucosa of patients undergoing radiation therapy appears to provide a distinct benefit by limiting the severity of mucositis. Honey is readily available, affordable and well accepted by patients, making it useful for improving the quality of life in irradiated patients. Copyright © 2010 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Investigation of improving MEMS-type VOA reliability
NASA Astrophysics Data System (ADS)
Hong, Seok K.; Lee, Yeong G.; Park, Moo Y.
2003-12-01
MEMS technologies have been applied in many areas, such as optical communications, gyroscopes and bio-medical components. In the optical communication field, MEMS technologies are essential, especially in multi-dimensional optical switches and Variable Optical Attenuators (VOAs). This paper describes the process for the development of MEMS-type VOAs with good optical performance and improved reliability. Generally, MEMS VOAs have been fabricated by a silicon micro-machining process, precise fibre alignment and a sophisticated packaging process. Because such a device is composed of many structures with various materials, it is difficult to make it reliable. We developed MEMS-type VOAs with many failure mode considerations (FMEA: Failure Mode Effect Analysis) in the initial design step, predicted critical failure factors, revised the design accordingly, and confirmed the reliability by preliminary testing. The predicted failure factors were moisture, the bonding strength of the wire bonded between the MEMS chip and the TO-CAN, and instability of the supplied signals. Statistical quality control tools (ANOVA, t-test and so on) were used to control these potential failure factors and derive optimum manufacturing conditions. In summary, we successfully developed reliable MEMS-type VOAs with good optical performance by controlling potential failure factors and using statistical quality control tools. The developed VOAs passed the international reliability standard (Telcordia GR-1221-CORE).
Feature maps driven no-reference image quality prediction of authentically distorted images
NASA Astrophysics Data System (ADS)
Ghadiyaram, Deepti; Bovik, Alan C.
2015-03-01
Current blind image quality prediction models rely on benchmark databases comprised of singly and synthetically distorted images, thereby learning image features that are only adequate to predict human perceived visual quality on such inauthentic distortions. However, real world images often contain complex mixtures of multiple distortions. Rather than a) discounting the effect of these mixtures of distortions on an image's perceptual quality and considering only the dominant distortion or b) using features that are only proven to be efficient for singly distorted images, we deeply study the natural scene statistics of authentically distorted images, in different color spaces and transform domains. We propose a feature-maps-driven statistical approach which avoids any latent assumptions about the type of distortion(s) contained in an image, and focuses instead on modeling the remarkable consistencies in the scene statistics of real world images in the absence of distortions. We design a deep belief network that takes model-based statistical image features derived from a very large database of authentically distorted images as input and discovers good feature representations by generalizing over different distortion types, mixtures, and severities, which are later used to learn a regressor for quality prediction. We demonstrate the remarkable competence of our features for improving automatic perceptual quality prediction on a benchmark database and on the newly designed LIVE Authentic Image Quality Challenge Database and show that our approach of combining robust statistical features and the deep belief network dramatically outperforms the state-of-the-art.
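A common building block of such natural-scene-statistics features is the mean-subtracted, contrast-normalized (MSCN) transform of the luminance image; a minimal pure-Python sketch using a simple 3×3 local window (real models typically use a Gaussian-weighted window and then fit parametric distributions to the resulting coefficients):

```python
def mscn(image, c=1.0):
    """Mean-subtracted contrast-normalized coefficients:
    (I - local_mean) / (local_std + c), over a 3x3 neighborhood."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            patch = [image[y][x]
                     for y in range(max(0, i - 1), min(h, i + 2))
                     for x in range(max(0, j - 1), min(w, j + 2))]
            mu = sum(patch) / len(patch)
            var = sum((p - mu) ** 2 for p in patch) / len(patch)
            out[i][j] = (image[i][j] - mu) / (var ** 0.5 + c)
    return out

# A perfectly flat patch has zero MSCN coefficients everywhere
flat = [[100.0] * 4 for _ in range(4)]
print(mscn(flat)[0][0])  # 0.0: no local structure, no deviation
```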
ERIC Educational Resources Information Center
Vaughn, Brandon K.; Wang, Pei-Yu
2009-01-01
The emergence of technology has led to numerous changes in mathematical and statistical teaching and learning which has improved the quality of instruction and teacher/student interactions. The teaching of statistics, for example, has shifted from mathematical calculations to higher level cognitive abilities such as reasoning, interpretation, and…
The Development of Official Social Statistics in Italy with a Life Quality Approach
ERIC Educational Resources Information Center
Sabbadini, Linda Laura
2011-01-01
The article covers the main steps of official statistics in the second half of the Nineties through the illustration of the transition from economic oriented official statistics to the quality of life approach. The system of the Multipurpose Surveys introduced in 1993 to give an answer to questions at social level and to provide indicators for…
Statistics of Statisticians: Critical Mass of Statistics and Operational Research Groups
NASA Astrophysics Data System (ADS)
Kenna, Ralph; Berche, Bertrand
Using a recently developed model, inspired by mean field theory in statistical physics, and data from the UK's Research Assessment Exercise, we analyse the relationship between the qualities of statistics and operational research groups and the quantities of researchers in them. Similar to other academic disciplines, we provide evidence for a linear dependency of quality on quantity up to an upper critical mass, which is interpreted as the average maximum number of colleagues with whom a researcher can communicate meaningfully within a research group. The model also predicts a lower critical mass, which research groups should strive to achieve to avoid extinction. For statistics and operational research, the lower critical mass is estimated to be 9 ± 3. The upper critical mass, beyond which research quality does not significantly depend on group size, is 17 ± 6.
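The linear-up-to-an-upper-critical-mass relationship can be estimated with a simple breakpoint scan: fit a line below each candidate breakpoint, a plateau above it, and keep the breakpoint with the smallest squared error. A sketch on synthetic data generated to flatten at a known group size (these are not RAE data, and the paper's mean-field model is richer than this piecewise fit):

```python
def sse_for_break(sizes, quality, brk):
    """Sum of squared errors for a model that is linear in group size
    up to the breakpoint and flat (at the breakpoint's level) above it."""
    below = [(n, q) for n, q in zip(sizes, quality) if n <= brk]
    above = [q for n, q in zip(sizes, quality) if n > brk]
    # Least-squares line through the points below the breakpoint
    nb = len(below)
    mx = sum(n for n, _ in below) / nb
    my = sum(q for _, q in below) / nb
    sxx = sum((n - mx) ** 2 for n, _ in below) or 1e-9
    slope = sum((n - mx) * (q - my) for n, q in below) / sxx
    plateau = my + slope * (brk - mx)
    err = sum((q - (my + slope * (n - mx))) ** 2 for n, q in below)
    err += sum((q - plateau) ** 2 for q in above)
    return err

def fit_breakpoint(sizes, quality):
    candidates = range(min(sizes) + 1, max(sizes))
    return min(candidates, key=lambda b: sse_for_break(sizes, quality, b))

# Synthetic groups: quality rises with size up to 17 members, then plateaus
sizes = list(range(5, 30))
quality = [min(n, 17) * 2.0 for n in sizes]
print(fit_breakpoint(sizes, quality))
```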
Petzold, Thomas; Steinwitz, Adrienne; Schmitt, Jochen; Eberlein-Gonska, Maria
2013-01-01
Obligatory external quality assurance is an established method used to ensure the quality of inpatient care in Germany. The comprehensive approach is unique in international comparison. In addition to the statutory requirement, the health insurance funds require this form of external quality control in order to foster quality-based competition between hospitals. Ever since its introduction, healthcare providers have scrutinised the effects of the mandatory use of this survey. The study was based on all patients at the University Hospital Dresden for whom a quality assurance sheet had to be recorded between 2003 and 2011 (n = 45,639). The documentation of these sheets was carried out by specially trained personnel. For each performance area, the time required to document the quality sheets was assessed, and a descriptive analysis of all quality assurance sheets was conducted. Where results were statistically significant, the so-called "Structured Dialogues" were analysed. Over the whole period, 167 statistically noticeable problems occurred. Nine of these were rated as noticeable problems in medical quality by the specialised working groups of the project office quality assurance (PGSQS) at the Saxon State Medical Association (SLÄK). The remaining 158 statistical anomalies included 25 documentation errors; 96 were classified as statistically significant, and only 37 were marked to indicate that re-observation by the PGSQS was required. The total effort estimated for the documentation of quality assurance sheets was approximately 1,420 working days in the observation period. As far as the quality of patient care is concerned, the results can be considered positive because only a small number of quality indicators point to noticeable qualitative problems. This statement is based primarily on the comparison with the Saxony and Germany groups, which are included in the quality report of external quality assurance in accordance with sect. 137 SGB V.
The majority of noticeable statistical problems were due to documentation errors. Other statistically noticeable problems that are medically explainable, but have no effect on the care of patients, recur with the respective quality indicators. Examples include the postoperative mobility indicators for endoprosthesis implantation, which cannot be used to draw conclusions about patient outcomes. Information on quality of life as well as the post-hospital course of disease would be important in this context, but is still lacking. The use of external quality assurance data in accordance with sect. 137 SGB V for evaluation research has so far been handled quite restrictively. Thus, in-depth analyses of the quality of treatment cannot be derived. Copyright © 2013. Published by Elsevier GmbH.
PRECISE: PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare
Chen, Feng; Wang, Shuang; Mohammed, Noman; Cheng, Samuel; Jiang, Xiaoqian
2015-01-01
Quality improvement (QI) requires systematic and continuous efforts to enhance healthcare services. A healthcare provider might wish to compare local statistics with those from other institutions in order to identify problems and develop intervention to improve the quality of care. However, the sharing of institution information may be deterred by institutional privacy as publicizing such statistics could lead to embarrassment and even financial damage. In this article, we propose a PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare (PRECISE), which aims at enabling cross-institution comparison of healthcare statistics while protecting privacy. The proposed framework relies on a set of state-of-the-art cryptographic protocols including homomorphic encryption and Yao’s garbled circuit schemes. By securely pooling data from different institutions, PRECISE can rank the encrypted statistics to facilitate QI among participating institutes. We conducted experiments using MIMIC II database and demonstrated the feasibility of the proposed PRECISE framework. PMID:26146645
NASA Astrophysics Data System (ADS)
Cameron, Enrico; Pilla, Giorgio; Stella, Fabio A.
2018-06-01
The application of statistical classification methods is investigated, in comparison also with spatial interpolation methods, for predicting the acceptability of well-water quality in a situation where an effective quantitative model of the hydrogeological system under consideration cannot be developed. In the example area in northern Italy, in particular, the aquifer is locally affected by saline water, and the concentration of chloride is the main indicator of both saltwater occurrence and groundwater quality. The goal is to predict whether the chloride concentration in a water well will exceed the allowable concentration, rendering the water unfit for the intended use. A statistical classification algorithm achieved the best predictive performance, and the results of the study show that statistical classification methods provide further tools for dealing with groundwater quality problems concerning hydrogeological systems that are too difficult to describe analytically or to simulate effectively.
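The exceedance-prediction task described above is a binary classification problem; a minimal logistic-regression sketch trained by gradient descent on toy data (the single predictor, its values and the labels are invented for illustration; the paper does not specify this particular algorithm):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(x, y, lr=0.1, epochs=2000):
    """Logistic regression with one predictor, fit by gradient descent:
    P(exceed) = sigmoid(w * distance + b)."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for xi, yi in zip(x, y):
            err = sigmoid(w * xi + b) - yi
            w -= lr * err * xi
            b -= lr * err
    return w, b

def exceeds_limit(w, b, distance):
    """Predict whether chloride will exceed the allowable limit."""
    return sigmoid(w * distance + b) > 0.5

# Toy wells: distance from the saline zone (km) vs. exceedance (1 = unfit)
distance = [0.5, 1.0, 1.5, 6.0, 7.0, 8.0]
exceed   = [1,   1,   1,   0,   0,   0]
w, b = train(distance, exceed)
print(exceeds_limit(w, b, 0.8), exceeds_limit(w, b, 7.5))
```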
Evaluating national cause-of-death statistics: principles and application to the case of China.
Rao, Chalapati; Lopez, Alan D.; Yang, Gonghuan; Begg, Stephen; Ma, Jiemin
2005-01-01
Mortality statistics systems provide basic information on the levels and causes of mortality in populations. Only a third of the world's countries have complete civil registration systems that yield adequate cause-specific mortality data for health policy-making and monitoring. This paper describes the development of a set of criteria for evaluating the quality of national mortality statistics and applies them to China as an example. The criteria cover a range of structural, statistical and technical aspects of national mortality data. Little is known about cause-of-death data in China, which is home to roughly one-fifth of the world's population. These criteria were used to evaluate the utility of data from two mortality statistics systems in use in China, namely the Ministry of Health-Vital Registration (MOH-VR) system and the Disease Surveillance Point (DSP) system. We concluded that mortality registration was incomplete in both. No statistics were available for geographical subdivisions of the country to inform resource allocation or for the monitoring of health programmes. Compilation and publication of statistics is irregular in the case of the DSP, and they are not made publicly available at all by the MOH-VR. More research is required to measure the content validity of cause-of-death attribution in the two systems, especially due to the use of verbal autopsy methods in rural areas. This framework of criteria-based evaluation is recommended for the evaluation of national mortality data in developing countries to determine their utility and to guide efforts to improve their value for guiding policy. PMID:16184281
Santori, G; Andorno, E; Morelli, N; Casaccia, M; Bottino, G; Di Domenico, S; Valente, U
2009-05-01
In many Western countries a "minimum volume rule" policy has been adopted as a quality measure for complex surgical procedures. In Italy, the National Transplant Centre set the minimum number of orthotopic liver transplantation (OLT) procedures per year at 25 per center. OLT procedures performed in a single center over a reasonably long period may be treated as a time series to evaluate trend, seasonal cycles, and nonsystematic fluctuations. Between January 1, 1987 and December 31, 2006, we performed 563 cadaveric-donor OLTs in adult recipients. During 2007, there were another 28 procedures. The greatest numbers of OLTs per year were performed in 2001 (n = 51), 2005 (n = 50), and 2004 (n = 49). A time series analysis performed with R (R Foundation for Statistical Computing, Vienna, Austria), a free software environment for statistical computing and graphics, showed an incremental trend after exponential smoothing as well as after seasonal decomposition. The predicted OLTs per month for 2007, calculated with Holt-Winters exponential smoothing applied to the period 1987-2006, helped to identify the months in which there was a major difference between predicted and performed procedures. The time series approach may be helpful for establishing a minimum volume per year at the single-center level.
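The forecasting step can be illustrated with simple exponential smoothing, the non-seasonal core of the Holt-Winters family used in the paper (the monthly counts below are invented, not the center's data, and the full Holt-Winters method adds trend and seasonal components omitted here):

```python
def exp_smooth_forecast(series, alpha=0.3):
    """Simple exponential smoothing: the one-step-ahead forecast is a
    weighted average that geometrically discounts older observations."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

# Invented monthly OLT counts for one year
monthly_olts = [3, 4, 2, 5, 4, 3, 4, 5, 6, 4, 5, 5]
print(round(exp_smooth_forecast(monthly_olts), 2))
```

Comparing such forecasts against the counts actually observed each month is what flags the months with a major predicted-versus-performed gap.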
Managed care quality of care and plan choice in New York SCHIP.
Liu, Hangsheng; Phelps, Charles E; Veazie, Peter J; Dick, Andrew W; Klein, Jonathan D; Shone, Laura P; Noyes, Katia; Szilagyi, Peter G
2009-06-01
To examine whether low-income parents of children enrolled in the New York State Children's Health Insurance Program (SCHIP) choose managed care plans with better quality of care. 2001 New York SCHIP evaluation data; 2001 New York State Managed Care Plan Performance Report; 2000 New York State Managed Care Enrollment Report. Each market was defined as a county. A final sample of 2,325 new enrollees was analyzed after excluding those in markets with only one SCHIP plan. Plan quality was measured using seven Consumer Assessment of Health Plans Survey (CAHPS) and three Health Plan Employer Data and Information Set (HEDIS) scores. A conditional logit model was applied with plan and individual/family characteristics as covariates. There were 30 plans in the 45 defined markets. The choice probability increased 2.5 percentage points for each unit increase in the average CAHPS score, and the association was significantly larger in children with special health care needs. However, HEDIS did not show any statistically significant association with plan choice. Low-income parents do choose managed care plans with higher CAHPS scores for their newly enrolled children, suggesting that overall quality could improve over time because of the dynamics of enrollment.
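The conditional logit model evaluates each enrollee's choice against the attributes of every plan in their market. A minimal sketch of the choice-probability calculation follows; the attribute values and coefficient are hypothetical, not the study's estimates.

```python
import numpy as np

def choice_probabilities(X, beta):
    """Conditional logit: P(plan j) = exp(x_j . beta) / sum_k exp(x_k . beta).

    X has one row of attributes per plan in the market; beta is the
    taste vector shared across plans.
    """
    u = X @ beta                 # systematic utility of each plan
    e = np.exp(u - u.max())      # subtract max for numerical stability
    return e / e.sum()

# Three plans in one market, one attribute: average CAHPS score (hypothetical).
X = np.array([[70.0], [75.0], [80.0]])
beta = np.array([0.1])           # positive taste for quality (hypothetical)
probs = choice_probabilities(X, beta)
```

With a positive coefficient on the CAHPS score, the highest-scoring plan receives the largest choice probability, mirroring the association the study reports.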
Lee, Jih-Chin; Lai, Wen-Sen; Ju, Da-Tong; Chu, Yueng-Hsiang; Yang, Jinn-Moon
2015-03-01
During endoscopic sinus surgery (ESS), intra-operative bleeding can significantly compromise visualization of the surgical field. The diode laser, which provides good hemostatic and vaporization effects and excellent photocoagulation, has been successfully applied in endoscopic surgery with several advantages. The current retrospective study demonstrates the feasibility of diode laser-combined endoscopic sinus surgery for sphenoidotomy. Patients who underwent endoscopic transsphenoidal pituitary surgery were enrolled. During the operation, the quality of the surgical field was assessed and graded by the operating surgeon using the scale proposed by Boezaart. The mean operation time was 37.80 ± 10.90 minutes. The mean score on the quality of the surgical field was 1.95. A statistically significant positive correlation was found between lower (better) surgical field quality scores and shorter surgical times (P < 0.0001). No infections, hemorrhages, or other complications occurred intra- or post-operatively. Diode laser-assisted sphenoidotomy is a reliable and safe approach to pituitary gland surgery with minimal invasiveness. Application of the diode laser significantly improved the quality of the surgical field and shortened operation time. © 2015 Wiley Periodicals, Inc.
Sedlack, Jeffrey D
2010-01-01
Surgeons have been slow to incorporate industrial reliability techniques. Process control methods were applied to surgeon waiting time between cases, and to length of stay (LOS) after colon surgery. Waiting times between surgeries were evaluated by auditing the operating room records of a single hospital over a 1-month period. The medical records of 628 patients undergoing colon surgery over a 5-year period were reviewed. The average surgeon wait time between cases was 53 min, and the busiest surgeon spent 29.5 hours in one month waiting between surgeries. Process control charting demonstrated poor overall control of the room turnover process. Average LOS after colon resection also demonstrated very poor control. Mean LOS was 10 days. Weibull's conditional analysis revealed a conditional LOS of 9.83 days. Serious process management problems were identified in both analyses. These process issues are expensive and adversely affect the quality of service offered by the institution. Process control mechanisms were suggested or implemented to improve these surgical processes. Industrial reliability and quality management tools can easily and effectively identify process control problems that occur on surgical services. © 2010 National Association for Healthcare Quality.
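Process control charting of individual measurements such as room turnover time typically uses an individuals/moving-range (XmR) chart. The sketch below computes the standard control limits using the conventional 2.66 constant (3 divided by d2 = 1.128 for a moving range of two); the example data are made up, not the hospital's audit figures.

```python
def xmr_limits(xs):
    """Control limits for an individuals (XmR) chart.

    Limits are the mean plus/minus 2.66 times the average moving range.
    A point outside [lcl, ucl] signals that the process is out of control.
    """
    mrs = [abs(b - a) for a, b in zip(xs, xs[1:])]   # moving ranges
    mr_bar = sum(mrs) / len(mrs)
    x_bar = sum(xs) / len(xs)
    ucl = x_bar + 2.66 * mr_bar
    lcl = x_bar - 2.66 * mr_bar
    return x_bar, lcl, ucl

# Hypothetical turnover times (minutes) between consecutive cases.
x_bar, lcl, ucl = xmr_limits([10, 12, 11, 13, 12])
```

In practice each new turnover time would be plotted against these limits; the abstract's finding of "poor overall control" corresponds to many points falling outside them.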
Workflow for Criticality Assessment Applied in Biopharmaceutical Process Validation Stage 1.
Zahel, Thomas; Marschall, Lukas; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Mueller, Eric M; Murphy, Patrick; Natschläger, Thomas; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph
2017-10-12
Identification of critical process parameters that impact product quality is a central task during regulatory requested process validation. Commonly, this is done via design of experiments and identification of parameters significantly impacting product quality (rejection of the null hypothesis that the effect equals 0). However, parameters whose effect estimates carry large uncertainty, and which might therefore push product quality beyond a critical limit, may be missed. This can occur when the residual (un-modelled) variance in the experiments is larger than expected a priori. Estimating this risk is the task of the novel retrospective power analysis presented here, which is based on a permutation test. It is evaluated using a data set for two unit operations established during characterization of a biopharmaceutical process in industry. The results show that, for one unit operation, the observed variance in the experiments is much larger than expected a priori, resulting in low power levels for all non-significant parameters. Moreover, we present a workflow for mitigating the risk associated with overlooked parameter effects. This enables a statistically sound identification of critical process parameters. The developed workflow will substantially support industry in delivering constant product quality, reducing process variance and increasing patient safety.
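The idea of retrospective power analysis by simulation can be sketched as follows: given the residual variance actually observed in the experiments, estimate by Monte Carlo how often a permutation test would detect an effect of a given size. This is an illustrative reconstruction of the general technique, not the authors' published algorithm; all names and parameter values are my own.

```python
import random

def perm_pvalue(a, b, n_perm=200, rng=random):
    """Two-sample permutation test on the difference of means."""
    obs = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        x, y = pooled[:len(a)], pooled[len(a):]
        if abs(sum(x) / len(x) - sum(y) / len(y)) >= obs:
            hits += 1
    return (hits + 1) / (n_perm + 1)   # add-one correction

def simulated_power(effect, sd=1.0, n=10, alpha=0.05, n_sim=100, seed=1):
    """Fraction of simulated experiments in which the effect is detected."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(n_sim):
        a = [rng.gauss(0.0, sd) for _ in range(n)]
        b = [rng.gauss(effect, sd) for _ in range(n)]
        if perm_pvalue(a, b, rng=rng) <= alpha:
            rejections += 1
    return rejections / n_sim
```

A parameter that is non-significant *and* has low simulated power at the critical effect size is exactly the kind of case the workflow flags for further investigation rather than dismissal.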
Applying Probabilistic Decision Models to Clinical Trial Design
Smith, Wade P; Phillips, Mark H
2018-01-01
Clinical trial design most often focuses on a single or several related outcomes with corresponding calculations of statistical power. We consider a clinical trial to be a decision problem, often with competing outcomes. Using a current controversy in the treatment of HPV-positive head and neck cancer, we apply several different probabilistic methods to help define the range of outcomes given different possible trial designs. Our model incorporates the uncertainties in the disease process and treatment response and the inhomogeneities in the patient population. Instead of expected utility, we have used a Markov model to calculate quality adjusted life expectancy as a maximization objective. Monte Carlo simulations over realistic ranges of parameters are used to explore different trial scenarios given the possible ranges of parameters. This modeling approach can be used to better inform the initial trial design so that it will more likely achieve clinical relevance. PMID:29888075
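A Markov cohort model for quality-adjusted life expectancy can be sketched with a simple three-state chain (well, sick, dead). The transition probabilities and utilities below are invented for illustration and bear no relation to the HPV trial parameters in the paper.

```python
def markov_qale(P, utility, horizon=50):
    """Expected quality-adjusted life-years of a cohort starting in state 0.

    P is a row-stochastic transition matrix over health states; utility
    gives the quality weight accrued per cycle spent in each state.
    """
    dist = [1.0] + [0.0] * (len(P) - 1)   # everyone starts 'well'
    total = 0.0
    for _ in range(horizon):
        total += sum(d * u for d, u in zip(dist, utility))
        dist = [sum(dist[i] * P[i][j] for i in range(len(P)))
                for j in range(len(P))]
    return total

# Hypothetical annual transition matrix: well, sick, dead (absorbing).
P = [[0.90, 0.08, 0.02],
     [0.00, 0.80, 0.20],
     [0.00, 0.00, 1.00]]
qale = markov_qale(P, [1.0, 0.7, 0.0])
```

In the paper's approach, a Monte Carlo outer loop would draw the transition and utility parameters from realistic ranges and recompute this objective for each candidate trial design.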
Extending LMS to Support IRT-Based Assessment Test Calibration
NASA Astrophysics Data System (ADS)
Fotaris, Panagiotis; Mastoras, Theodoros; Mavridis, Ioannis; Manitsaris, Athanasios
Developing unambiguous and challenging assessment material for measuring educational attainment is a time-consuming, labor-intensive process. As a result Computer Aided Assessment (CAA) tools are becoming widely adopted in academic environments in an effort to improve the assessment quality and deliver reliable results of examinee performance. This paper introduces a methodological and architectural framework which embeds a CAA tool in a Learning Management System (LMS) so as to assist test developers in refining items to constitute assessment tests. An Item Response Theory (IRT) based analysis is applied to a dynamic assessment profile provided by the LMS. Test developers define a set of validity rules for the statistical indices given by the IRT analysis. By applying those rules, the LMS can detect items with various discrepancies which are then flagged for review of their content. Repeatedly executing the aforementioned procedure can improve the overall efficiency of the testing process.
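The kind of validity rule described, flagging items whose IRT indices fall outside acceptable ranges, can be sketched with the 2-parameter logistic (2PL) model. The thresholds below are hypothetical examples of rules a test developer might define, not values from the paper.

```python
import math

def p_correct(theta, a, b):
    """2PL item characteristic curve: P(correct | ability theta)
    for an item with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def flag_items(items, min_a=0.5, b_range=(-3.0, 3.0)):
    """Flag items violating validity rules on the IRT indices:
    too little discrimination, or difficulty outside a plausible range."""
    flags = []
    for name, a, b in items:
        if a < min_a or not (b_range[0] <= b <= b_range[1]):
            flags.append(name)
    return flags

# Hypothetical calibrated items: (id, discrimination a, difficulty b).
flagged = flag_items([("q1", 0.2, 0.0),   # poor discrimination
                      ("q2", 1.1, 4.0),   # implausibly difficult
                      ("q3", 1.0, 0.5)])  # acceptable
```

Flagged items would then be routed back to the test developers for content review, as the framework describes.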
Gifford, Wendy; Lefebre, Nancy; Davies, Barbara
2014-01-01
The aims of this study were to field test and evaluate a series of organizational strategies to promote evidence-informed decision making (EIDM) by nurse managers and clinical leaders in home healthcare. EIDM is central to delivering high-quality and effective healthcare. Barriers exist and organizational strategies are needed to support EIDM. Management and clinical leaders from 4 units participated in a 20-week organization-focused intervention. Preintervention (n = 32) and postintervention (n = 17) surveys and semistructured interviews (n = 15) were completed. Statistically significant increases were found on 4 of 31 survey items reflecting an increased organizational capacity for participants to acquire and apply research evidence in decision making. Support from designated facilitators with advanced skills in finding, appraising, and applying research was the highest rated intervention strategy. Results are useful to inform the development of organizational infrastructures to increase EIDM capacity in community-based healthcare organizations.
Molinos-Senante, María; Farías, Rodrigo
2018-06-04
The privatization of water and sewerage services (WSS) has led to the foundation of water economic groups, which integrate several water companies and have gained notable importance at the global level. In the framework of benchmarking studies, there are no prior studies exploring the impact that economic groups have on the efficiency and quality of service provided by water companies. This study investigates, for the first time, whether the membership of water companies in an economic group influences their performance. Quantity- and quality-adjusted efficiency scores were computed using data envelopment analysis models. An empirical application was developed for the Chilean water industry since most of their water companies are private and belong to an economic group. The results show that independent water companies provide WSS with better quality than do water companies that belong to an economic group. From a statistical point of view, it was evident that membership in an economic group impacts both the quantity- and quality-adjusted efficiency scores of water companies. The results of this study illustrate that applying the model-firm regulation to the Chilean water industry has significant drawbacks that should be addressed by the water regulator to promote the long-term sustainability of the water industry.
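In the simplest single-input, single-output case, constant-returns (CCR-style) DEA efficiency reduces to each unit's output/input ratio relative to the best observed ratio. The study used full multi-input, quality-adjusted DEA models; this sketch shows only that special case, with made-up numbers.

```python
def dea_ccr_single(inputs, outputs):
    """CCR DEA efficiency for the one-input, one-output special case:
    each unit's productivity ratio divided by the best ratio (frontier)."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical water companies: operating cost (input) vs. volume served (output).
effs = dea_ccr_single([2.0, 4.0, 5.0], [4.0, 4.0, 10.0])
```

Quality-adjusted scores like those in the study would add quality-of-service variables as extra outputs, which requires solving one linear program per company rather than this closed-form ratio.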
Kim, Hyun Gi; Lee, Young Han; Choi, Jin-Young; Park, Mi-Suk; Kim, Myeong-Jin; Kim, Ki Whang
2015-01-01
Purpose To investigate the optimal blending percentage of adaptive statistical iterative reconstruction (ASIR) in a reduced radiation dose while preserving a degree of image quality and texture that is similar to that of standard-dose computed tomography (CT). Materials and Methods The CT performance phantom was scanned with standard and dose reduction protocols including reduced mAs or kVp. Image quality parameters including noise, spatial, and low-contrast resolution, as well as image texture, were quantitatively evaluated after applying various blending percentages of ASIR. The optimal blending percentage of ASIR that preserved image quality and texture compared to standard-dose CT was investigated in each radiation dose reduction protocol. Results As the percentage of ASIR increased, noise and spatial resolution decreased, whereas low-contrast resolution increased. In the texture analysis, an increasing percentage of ASIR resulted in an increase of angular second moment, inverse difference moment, and correlation and in a decrease of contrast and entropy. The 20% and 40% dose reduction protocols with 20% and 40% ASIR blending, respectively, resulted in an optimal quality of images with preservation of the image texture. Conclusion Blending 40% ASIR into the 40%-reduced tube-current protocol can maximize radiation dose reduction and preserve adequate image quality and texture. PMID:25510772
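The texture measures cited (angular second moment, contrast, entropy) are derived from a gray-level co-occurrence matrix (GLCM). The sketch below computes a horizontal-neighbor GLCM on a tiny integer image; quantization, offsets, and normalization are simplified relative to a real CT texture pipeline, and the function is my own illustration.

```python
import math
from collections import Counter

def glcm_features(img):
    """ASM, contrast and entropy from a horizontal-neighbor GLCM.

    img is a 2-D list of small integer gray levels. Each horizontally
    adjacent pixel pair contributes one co-occurrence count.
    """
    counts = Counter()
    for row in img:
        for a, b in zip(row, row[1:]):
            counts[(a, b)] += 1
    total = sum(counts.values())
    p = {k: v / total for k, v in counts.items()}      # joint probabilities
    asm = sum(v * v for v in p.values())               # angular second moment
    contrast = sum(v * (i - j) ** 2 for (i, j), v in p.items())
    entropy = -sum(v * math.log(v) for v in p.values())
    return asm, contrast, entropy

# A perfectly uniform image is maximally ordered: ASM 1, contrast and entropy 0.
asm, contrast, entropy = glcm_features([[1, 1], [1, 1]])
```

Rising ASM with falling contrast and entropy, as reported for increasing ASIR percentages, indicates progressively smoother, more homogeneous image texture.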
Groene, Oliver; Klazinga, Niek; Wagner, Cordula; Arah, Onyebuchi A; Thompson, Andrew; Bruneau, Charles; Suñol, Rosa
2010-09-24
Hospitals in European countries apply a wide range of quality improvement strategies. Knowledge of the effectiveness of these strategies, implemented as part of an overall hospital quality improvement system, is limited. We propose to study the relationships among organisational quality improvement systems, patient empowerment, organisational culture, professionals' involvement with the quality of hospital care, including clinical effectiveness, patient safety and patient involvement. We will employ a cross-sectional, multi-level study design in which patient-level measurements are nested in hospital departments, which are in turn nested in hospitals in different EU countries. Mixed methods will be used for data collection, measurement and analysis. Hospital/care pathway level constructs that will be assessed include external pressure, hospital governance, quality improvement system, patient empowerment in quality improvement, organisational culture and professional involvement. These constructs will be assessed using questionnaires. Patient-level constructs include clinical effectiveness, patient safety and patient involvement, and will be assessed using audit of patient records, routine data and patient surveys. For the assessment of hospital and pathway level constructs we will collect data from randomly selected hospitals in eight countries. For a sample of hospitals in each country we will carry out additional data collection at patient level related to four conditions (stroke, acute myocardial infarction, hip fracture and delivery). In addition, structural components of quality improvement systems will be assessed using visits by experienced external assessors. Data analysis will include descriptive statistics and graphical representations and methods for data reduction, classification techniques and psychometric analysis, before moving to bi-variate and multivariate analysis. The latter will be conducted at the hospital level and using multilevel models.
In addition, we will apply sophisticated methodological elements such as the use of causal diagrams, outcome modelling, double robust estimation and detailed sensitivity analysis or multiple bias analyses to assess the impact of the various sources of bias. Products of the project will include a catalogue of instruments and tools that can be used to build departmental or hospital quality and safety programme and an appraisal scheme to assess the maturity of the quality improvement system for use by hospitals and by purchasers to contract hospitals.
[Influence of demographic and socioeconomic characteristics on the quality of life].
Grbić, Gordana; Djokić, Dragoljub; Kocić, Sanja; Mitrašinović, Dejan; Rakić, Ljiljana; Prelević, Rade; Krivokapić, Žarko; Miljković, Snežana
2011-01-01
The quality of life is a multidimensional concept, which is best expressed by subjective well-being. Evaluation of the quality of life is the basis for measuring well-being, and the determination of factors that determine the quality of life is the basis for its improvement. The aim was to evaluate and assess the determinants of the perceived quality of life with respect to demographic and socioeconomic characteristics. This was a cross-sectional study of a representative sample of the population in Serbia aged over 20 years (9,479 examinees). The quality of life was expressed by the perception of well-being (pleasure in life). Data on the examinees (demographic and socioeconomic characteristics) were collected using a questionnaire for adults in each household. To process, analyze and present the data, we used the methods of parametric descriptive statistics (mean value, standard deviation, coefficient of variation), analysis of variance and factor analysis. Although men evaluated the quality of life with a slightly higher grade, there was no statistically significant difference in the evaluation of the quality of life in relation to the examinee's gender (p > 0.005). Among the examinees there was a highly statistically significant difference in grading the quality of life depending on age, level of education, marital status and type of job (p < 0.001). In relation to the number of children, there was no statistically significant difference in the grading of the quality of life (p > 0.005). The quality of life is influenced by numerous factors that characterize each person (the demographic and socioeconomic characteristics of the individual). The determining factors of the quality of life are numerous and diverse, and the manner and strength of their influence are variable.
Comparison of Data Quality of NOAA's ISIS and SURFRAD Networks to NREL's SRRL-BMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderberg, M.; Sengupta, M.
2014-11-01
This report provides analyses of broadband solar radiometric data quality for the National Oceanic and Atmospheric Administration's Integrated Surface Irradiance Study and Surface Radiation Budget Network (SURFRAD) solar measurement networks. The data quality of these networks is compared to that of the National Renewable Energy Laboratory's Solar Radiation Research Laboratory Baseline Measurement System (SRRL-BMS), using native data resolutions and hourly averages of the data from the years 2002 through 2013. This report describes the solar radiometric data quality testing and flagging procedures and the method used to determine and tabulate data quality statistics. Monthly data quality statistics for each network were plotted by year against the statistics for the SRRL-BMS. Some of the plots are presented in the body of the report, but most are in the appendix. These plots indicate that the overall solar radiometric data quality of the SURFRAD network is superior to that of the Integrated Surface Irradiance Study network and can be comparable to that of the SRRL-BMS.
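Radiometric quality testing of this kind typically includes simple physical-limits flagging. The sketch below flags irradiance values outside plausible bounds and tallies a pass rate; the limits are placeholders for illustration, not the QC limits actually used by NREL.

```python
def flag_out_of_range(values, lo=-4.0, hi=1500.0):
    """Flag each measurement outside the physically plausible range [lo, hi].

    Returns (flags, pass_rate): flags[i] is True where values[i] fails,
    and pass_rate is the fraction of measurements that pass.
    """
    flags = [not (lo <= v <= hi) for v in values]
    pass_rate = 1.0 - sum(flags) / len(values)
    return flags, pass_rate

# Hypothetical global horizontal irradiance readings (W/m^2).
flags, rate = flag_out_of_range([0.0, 800.0, 2000.0, -50.0])
```

Monthly pass rates of this sort, aggregated per station and per instrument, are the kind of statistics that can then be tabulated and plotted against a reference station.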
NASA Astrophysics Data System (ADS)
Äijälä, Mikko; Heikkinen, Liine; Fröhlich, Roman; Canonaco, Francesco; Prévôt, André S. H.; Junninen, Heikki; Petäjä, Tuukka; Kulmala, Markku; Worsnop, Douglas; Ehn, Mikael
2017-03-01
Mass spectrometric measurements commonly yield data on hundreds of variables over thousands of points in time. Refining and synthesizing this raw data into chemical information necessitates the use of advanced, statistics-based data analytical techniques. In the field of analytical aerosol chemistry, statistical, dimensionality reductive methods have become widespread in the last decade, yet comparable advanced chemometric techniques for data classification and identification remain marginal. Here we present an example of combining data dimensionality reduction (factorization) with exploratory classification (clustering), and show that the results can not only reproduce and corroborate earlier findings, but also complement and broaden our current perspectives on aerosol chemical classification. We find that applying positive matrix factorization to extract spectral characteristics of the organic component of air pollution plumes, together with an unsupervised clustering algorithm, k-means++, for classification, reproduces classical organic aerosol speciation schemes. By applying appropriately chosen metrics for spectral dissimilarity along with optimized data weighting, source-specific pollution characteristics can be statistically resolved even for spectrally very similar aerosol types, such as different combustion-related anthropogenic aerosol species and atmospheric aerosols with a similar degree of oxidation. In addition to the typical oxidation level and source-driven aerosol classification, we were also able to classify and characterize outlier groups that would likely be disregarded in a more conventional analysis. Evaluating solution quality for the classification also provides means to assess the performance of mass spectral similarity metrics and optimize weighting for mass spectral variables.
This facilitates algorithm-based evaluation of aerosol spectra, which may prove invaluable for future development of automatic methods for spectra identification and classification. Robust, statistics-based results and data visualizations also provide important clues to a human analyst on the existence and chemical interpretation of data structures. Applying these methods to a test set of data, aerosol mass spectrometric data of organic aerosol from a boreal forest site, yielded five to seven different recurring pollution types from various sources, including traffic, cooking, biomass burning and nearby sawmills. Additionally, three distinct, minor pollution types were discovered and identified as amine-dominated aerosols.
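The k-means++ classification step can be sketched generically: distance-weighted seeding followed by Lloyd iterations. This is a plain Python/NumPy illustration of the algorithm itself; the study's actual pipeline also involves PMF factorization and custom spectral dissimilarity metrics that are not reproduced here.

```python
import numpy as np

def kmeans_pp_init(X, k, rng):
    """k-means++ seeding: later centers drawn with probability
    proportional to squared distance from the nearest existing center."""
    centers = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        d2 = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[rng.choice(len(X), p=d2 / d2.sum())])
    return np.array(centers, dtype=float)

def kmeans(X, k, n_iter=50, seed=0):
    """Lloyd's algorithm with k-means++ initialization."""
    rng = np.random.default_rng(seed)
    centers = kmeans_pp_init(X, k, rng)
    for _ in range(n_iter):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)                 # assign to nearest center
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)   # recenter
    return labels, centers

# Two well-separated synthetic 'spectral' clusters.
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [10.0, 10.0], [10.1, 10.0], [10.0, 10.1]])
labels, centers = kmeans(X, 2)
```

In the aerosol application, each row of `X` would be a (factorized, weighted) mass spectrum, and the squared Euclidean distance would be replaced by the chosen spectral dissimilarity metric.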
Manterola, Carlos; Grande, Luís
2010-04-01
To determine the methodological quality of therapy articles published in Cirugía Española and to study its association with publication year, centre of origin and subject area. A literature study which included all therapy articles published between 2005 and 2008. All kinds of clinical designs were considered, excluding editorials, review articles, letters to the editor and experimental studies. Variables analysed included: year of publication, centre of origin, design, and methodological quality of articles. A valid and reliable scale was applied to determine methodological quality. A total of 243 articles [206 case series (84.8%), 27 cohort studies (11.1%), 9 clinical trials (3.7%) and 1 case-control study (0.4%)] were found. Studies came preferentially from Catalonia and Valencia (22.3% and 12.3%, respectively). The thematic areas most frequently found were hepato-bilio-pancreatic and colorectal surgery (20.0% and 16.6%, respectively). The mean and median of the methodological quality score for the entire series were 9.5+/-4.3 points and 8 points, respectively. Associations between methodological quality and geographical area (p=0.0101), subject area (p=0.0267), and university origin (p=0.0369) were found. A statistically significant increase in methodological quality by publication year was observed (p=0.0004). The methodological quality of therapy articles published in Cirugía Española between 2005 and 2008 is low, but a statistically significant increasing trend was observed.
Abstracting scientific and applied meteorological-climatological data for the public
NASA Astrophysics Data System (ADS)
Trajanoska, L.
2010-09-01
Mathematical and statistical processing of meteorological-climatological data, which includes assessment of exactness, confidence levels of average and extreme values, and frequencies (probabilities) of occurrence of each meteorological phenomenon and element, helps to describe the impacts climate may have on different social and economic activities (transportation, heat and power generation), as well as on human health. Given new technology and the commercial world, work with meteorological-climatological data faces many different challenges. The first priority is the quality of the meteorological-climatological data set: compatible, modern, sophisticated measurement and informatics solutions are needed. The applied processing and analysis of the resulting measurements is a second, equally important branch. Today there are many unpleasant weather-related events and long-unanswered questions, and the real, basic issues must be addressed. This paper deals with the data issue: we have a great deal of data but make little real, high-quality applied use of it. Why? There are data for: public application; jurisdictional needs; fast decision making (dangerous meteorological phenomena); decisions on long-term plans; and research in different spheres of human life. It is therefore very important what kind of data we are talking about: data of public or of scientific-applied character. Two groups can be distinguished. The first works with data directly from the measurement site and instruments; it maintains a quality database and provides support to journalists, medical workers, civil engineers, electromechanical engineers, and agrometeorological and forestry engineers.
The second group works with scientific methods for the needed purposes: hours, days, years and periods with characteristic meanings are separated for comprehensive analysis and application.
[Algorithms of artificial neural networks--practical application in medical science].
Stefaniak, Bogusław; Cholewiński, Witold; Tarkowska, Anna
2005-12-01
Artificial neural networks (ANN) may be an alternative and complementary tool to typical statistical analysis. However, despite the many ready-to-use computer implementations of various ANN algorithms, artificial intelligence is relatively rarely applied to data processing. This paper presents practical aspects of the scientific application of ANN in medicine using widely available algorithms. Several main steps of analysis with ANN are discussed, from material selection and its division into groups to the quality assessment of the obtained results. The most frequent, typical sources of error are also described, as is a comparison of the ANN method with modeling by regression analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takahashi, Y.
This report describes the research work performed under the support of the DOE research grant E-FG02-97ER4108. The work is composed of three parts: (1) Visual analysis and quality control of the Micro Vertex Detector (MVD) of the PHENIX experiments carried out of Brookhaven National Laboratory. (2) Continuation of the data analysis of the EMU05/09/16 experiments for the study of the inclusive particle production spectra and multi-particle correlation. (3) Exploration of a new statistical means to study very high-multiplicity of nuclear-particle ensembles and its perspectives to apply to the higher energy experiments.
Nasajpour, Mohammad Reza; Ashrafi-rizi, Hasan; Soleymani, Mohammad Reza; Shahrzadi, Leila; Hassanzadeh, Akbar
2014-01-01
Introduction: Today, the websites of college and university libraries play an important role in providing the necessary services for clients. These websites not only allow the users to access different collections of library resources, but also provide them with the necessary guidance in order to use the information. The goal of this study is the quality evaluation of the college library websites in Iranian Medical Universities based on the Stover model. Material and Methods: This study uses an analytical survey method and is an applied study. The data gathering tool is the standard checklist provided by Stover, which was modified by the researchers for this study. The statistical population is the college library websites of the Iranian Medical Universities (146 websites) and census method was used for investigation. The data gathering method was a direct access to each website and filling of the checklist was based on the researchers’ observations. Descriptive and analytical statistics (Analysis of Variance (ANOVA)) were used for data analysis with the help of the SPSS software. Findings: The findings showed that in the dimension of the quality of contents, the highest average belonged to type one universities (46.2%) and the lowest average belonged to type three universities (24.8%). In the search and research capabilities, the highest average belonged to type one universities (48.2%) and the lowest average belonged to type three universities. In the dimension of facilities provided for the users, type one universities again had the highest average (37.2%), while type three universities had the lowest average (15%). In general the library websites of type one universities had the highest quality (44.2%), while type three universities had the lowest quality (21.1%). Also the library websites of the College of Rehabilitation and the College of Paramedics, of the Shiraz University of Medical Science, had the highest quality scores. 
Discussion: The results showed that there was a meaningful difference between the quality of the college library websites and the university types, resulting in college libraries of type one universities having the highest average score and the college libraries of type three universities having the lowest score. PMID:25540794
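The ANOVA used to compare website quality across university types reduces to the one-way F statistic. A minimal sketch with toy data (not the study's scores, which were analyzed in SPSS):

```python
def anova_f(groups):
    """One-way ANOVA F statistic: between-group mean square divided
    by within-group mean square, over a list of sample lists."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical quality scores for two university types.
f_stat = anova_f([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0]])
```

A large F relative to the F distribution with (k-1, n-k) degrees of freedom corresponds to the "meaningful difference" between university types reported in the study.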
Applied Behavior Analysis and Statistical Process Control?
ERIC Educational Resources Information Center
Hopkins, B. L.
1995-01-01
Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…
The Application of Statistics Education Research in My Classroom
ERIC Educational Resources Information Center
Jordan, Joy
2007-01-01
A collaborative, statistics education research project (Lovett, 2001) is discussed. Some results of the project were applied in the computer lab sessions of my elementary statistics course. I detail the process of applying these research results, as well as the use of knowledge surveys. Furthermore, I give general suggestions to teachers who want…
Applying Statistical Process Control to Clinical Data: An Illustration.
ERIC Educational Resources Information Center
Pfadt, Al; And Others
1992-01-01
Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…
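The control-chart logic summarized in this abstract can be illustrated with a minimal individuals (X) chart, in which 3-sigma limits are derived from the average moving range. This is a generic sketch of the standard SPC technique, not code from the cited study, and the data in the usage note are invented.

```python
# Sketch of an individuals (X) control chart as used in statistical
# process control; function names and data are illustrative.
def control_limits(observations):
    """Compute the center line and 3-sigma limits from moving ranges."""
    n = len(observations)
    center = sum(observations) / n
    # Average moving range between consecutive observations.
    mr_bar = sum(abs(observations[i] - observations[i - 1])
                 for i in range(1, n)) / (n - 1)
    # d2 = 1.128 is the standard bias-correction constant for
    # subgroups of size 2 (consecutive pairs).
    sigma = mr_bar / 1.128
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(observations):
    """Return the observations falling outside the 3-sigma limits."""
    lcl, _, ucl = control_limits(observations)
    return [x for x in observations if x < lcl or x > ucl]
```

For a series such as [10, 11, 10, 12, 11, 10, 30], the final observation falls above the upper limit and would be flagged for review.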
A Realistic Experimental Design and Statistical Analysis Project
ERIC Educational Resources Information Center
Muske, Kenneth R.; Myers, John A.
2007-01-01
A realistic applied chemical engineering experimental design and statistical analysis project is documented in this article. This project has been implemented as part of the professional development and applied statistics courses at Villanova University over the past five years. The novel aspects of this project are that the students are given a…
Statistical approaches used to assess and redesign surface water-quality-monitoring networks.
Khalil, B; Ouarda, T B M J
2009-11-01
An up-to-date review of the statistical approaches utilized for the assessment and redesign of surface water quality monitoring (WQM) networks is presented. The main technical aspects of network design are covered in four sections, addressing monitoring objectives, water quality variables, sampling frequency and spatial distribution of sampling locations. This paper discusses various monitoring objectives and related procedures used for the assessment and redesign of long-term surface WQM networks. The appropriateness of each approach for the design, contraction or expansion of monitoring networks is also discussed. For each statistical approach, its advantages and disadvantages are examined from a network design perspective. Possible methods to overcome disadvantages and deficiencies in the statistical approaches that are currently in use are recommended.
Gates, Allison; Gates, Michelle; Duarte, Gonçalo; Cary, Maria; Becker, Monika; Prediger, Barbara; Vandermeer, Ben; Fernandes, Ricardo M; Pieper, Dawid; Hartling, Lisa
2018-06-13
Systematic reviews (SRs) of randomised controlled trials (RCTs) can provide the best evidence to inform decision-making, but their methodological and reporting quality varies. Tools exist to guide the critical appraisal of quality and risk of bias in SRs, but evaluations of their measurement properties are limited. We will investigate the interrater reliability (IRR), usability, and applicability of A MeaSurement Tool to Assess systematic Reviews (AMSTAR), AMSTAR 2, and Risk Of Bias In Systematic reviews (ROBIS) for SRs in the fields of biomedicine and public health. An international team of researchers at three collaborating centres will undertake the study. We will use a random sample of 30 SRs of RCTs investigating therapeutic interventions indexed in MEDLINE in February 2014. Two reviewers at each centre will appraise the quality and risk of bias in each SR using AMSTAR, AMSTAR 2, and ROBIS. We will record the time to complete each assessment and for the two reviewers to reach consensus for each SR. We will extract the descriptive characteristics of each SR, the included studies, participants, interventions, and comparators. We will also extract the direction and strength of the results and conclusions for the primary outcome. We will summarise the descriptive characteristics of the SRs using means and standard deviations, or frequencies and proportions. To test for interrater reliability between reviewers and between the consensus agreements of reviewer pairs, we will use Gwet's AC1 statistic. For comparability to previous evaluations, we will also calculate weighted Cohen's kappa and Fleiss' kappa statistics. To estimate usability, we will calculate the mean time to complete the appraisal and to reach consensus for each tool. To inform applications of the tools, we will test for statistical associations between quality scores and risk of bias judgments, and the results and conclusions of the SRs.
Appraising the methodological and reporting quality of SRs is necessary to determine the trustworthiness of their conclusions. Which tool may be most reliably applied and how the appraisals should be used is uncertain; the usability of newly developed tools is unknown. This investigation of common (AMSTAR) and newly developed (AMSTAR 2, ROBIS) tools will provide empirical data to inform their application, interpretation, and refinement.
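For the two-rater, binary-rating case, the chance-corrected agreement statistic named in this protocol (Gwet's AC1) can be sketched as follows. The ratings are hypothetical; the formula follows the standard binary-case definition, pairing observed agreement with a chance-agreement term based on overall prevalence.

```python
# Sketch of Gwet's AC1 for two raters and binary (0/1) ratings;
# the rating vectors in the usage note are invented examples.
def gwet_ac1(rater_a, rater_b):
    """Chance-corrected agreement: (Pa - Pe) / (1 - Pe)."""
    n = len(rater_a)
    # Observed proportion of agreement between the two raters.
    agree = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Overall prevalence of "positive" ratings across both raters.
    pi = (sum(rater_a) + sum(rater_b)) / (2 * n)
    # Chance-agreement term for the binary case.
    chance = 2 * pi * (1 - pi)
    return (agree - chance) / (1 - chance)
```

With identical ratings from both raters, AC1 equals 1; disagreements and extreme prevalence of one category lower it.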
Empirical Tryout of a New Statistic for Detecting Temporally Inconsistent Responders.
Kerry, Matthew J
2018-01-01
Statistical screening of self-report data is often advised to support the quality of analyzed responses, for example, to reduce insufficient effort responding (IER). One recently introduced index for detecting outliers in cross-sectional designs, based on Mahalanobis's D, replaces centered scores with difference scores between repeated-measure items and is termed person temporal consistency (D2ptc). Although the adapted D2ptc index demonstrated usefulness in simulation datasets, it has not been applied to empirical data. The current study addresses D2ptc's low uptake by critically appraising its performance across three empirical applications. Independent samples were selected to represent a range of scenarios commonly encountered by organizational researchers. First, in Sample 1, a repeated measure of future time perspective (FTP) in experienced working adults (age > 40 years; n = 620) indicated that temporal inconsistency was significantly related to respondent age and item reverse-scoring. Second, in repeated measures of team efficacy aggregations, D2ptc successfully detected team-level inconsistency across repeated performance cycles. Third, in an experimental study dataset of subjective life expectancy, D2ptc indicated significantly more stable responding in experimental conditions compared to controls. The empirical findings support D2ptc's flexible and useful application to distinct study designs. Discussion centers on current limitations and further extensions that may be of value to psychologists screening self-report data to strengthen response quality and the meaningfulness of inferences from repeated-measures self-reports. Taken together, the findings support the usefulness of the newly devised statistic for detecting IER and other extreme response patterns.
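The difference-score adaptation of Mahalanobis's D described above can be sketched roughly as follows. This illustrates the general idea (a Mahalanobis-type distance computed on time-2 minus time-1 item scores) rather than the exact formula from Kerry (2018).

```python
import numpy as np

# Illustrative Mahalanobis-type index on repeated-measure difference
# scores, in the spirit of D2ptc; not the published formula.
def temporal_inconsistency(time1, time2):
    """Squared Mahalanobis distance of each respondent's item-level
    difference scores from the sample-mean difference vector."""
    d = np.asarray(time2, float) - np.asarray(time1, float)
    mu = d.mean(axis=0)
    cov = np.cov(d, rowvar=False)
    # Pseudo-inverse tolerates (near-)singular covariance matrices.
    cov_inv = np.linalg.pinv(np.atleast_2d(cov))
    centered = d - mu
    # Quadratic form c @ cov_inv @ c for every respondent.
    return np.einsum('ij,jk,ik->i', centered, cov_inv, centered)
```

Respondents whose answers shift far more than the sample-typical shift receive large values and can be screened as temporally inconsistent.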
Martínez, Marina; Arantzamendi, María; Belar, Alazne; Carrasco, José Miguel; Carvajal, Ana; Rullán, María; Centeno, Carlos
2017-06-01
Dignity therapy is psychotherapy to relieve psychological and existential distress in patients at the end of life. Little is known about its effect. To analyse the outcomes of dignity therapy in patients with advanced life-threatening diseases, a systematic review was conducted. Three authors extracted data from the articles and evaluated quality using the Critical Appraisal Skills Programme. Data were synthesized, considering study objectives. PubMed, CINAHL, Cochrane Library and PsycINFO were searched. The years searched were 2002 (year of dignity therapy development) to January 2016. 'Dignity therapy' was used as the search term. Studies with patients with advanced life-threatening diseases were included. Of 121 studies, 28 were included. The quality of the included studies is high. Results were grouped into effectiveness, satisfaction, suitability and feasibility, and adaptability to different diseases and cultures. Two of five randomized controlled trials applied dignity therapy to patients with high levels of baseline psychological distress. One showed a statistically significant decrease in patients' anxiety and depression scores over time. The other showed a statistically significant decrease in anxiety scores pre-post dignity therapy, but not in depression scores. Nonrandomized studies suggested statistically significant improvements in existential and psychosocial measurements. Patients, relatives and professionals perceived that it improved the end-of-life experience. Evidence suggests that dignity therapy is beneficial. One randomized controlled trial with patients with high levels of psychological distress shows dignity therapy's efficacy in anxiety and depression scores. Other study designs report beneficial outcomes in terms of end-of-life experience. Further research should seek to understand how dignity therapy functions, to establish a means for measuring its impact, and to assess whether patients with high levels of distress can benefit most from this therapy.
Verhagen, Simone J. W.; Simons, Claudia J. P.; van Zelst, Catherine; Delespaul, Philippe A. E. G.
2017-01-01
Background: Mental healthcare needs person-tailored interventions. Experience Sampling Method (ESM) can provide daily life monitoring of personal experiences. This study aims to operationalize and test a measure of momentary reward-related Quality of Life (rQoL). Intuitively, quality of life improves by spending more time on rewarding experiences. ESM clinical interventions can use this information to coach patients to find a realistic, optimal balance of positive experiences (maximize reward) in daily life. rQoL combines the frequency of engaging in a relevant context (a ‘behavior setting’) with concurrent (positive) affect. High rQoL occurs when the most frequent behavior settings are combined with positive affect or infrequent behavior settings co-occur with low positive affect. Methods: Resampling procedures (Monte Carlo experiments) were applied to assess the reliability of rQoL using various behavior setting definitions under different sampling circumstances, for real or virtual subjects with low-, average- and high contextual variability. Furthermore, resampling was used to assess whether rQoL is a distinct concept from positive affect. Virtual ESM beep datasets were extracted from 1,058 valid ESM observations for virtual and real subjects. Results: Behavior settings defined by Who-What contextual information were most informative. Simulations of at least 100 ESM observations are needed for reliable assessment. Virtual ESM beep datasets of a real subject can be defined by Who-What-Where behavior setting combinations. Large sample sizes are necessary for reliable rQoL assessments, except for subjects with low contextual variability. rQoL is distinct from positive affect. Conclusion: rQoL is a feasible concept. Monte Carlo experiments should be used to assess the reliable implementation of an ESM statistic. Future research in ESM should assess the behavior of summary statistics under different sampling situations.
This exploration is especially relevant in clinical implementation, where often only small datasets are available. PMID:29163294
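The frequency-by-affect idea behind rQoL can be sketched as follows. The super-linear (squared-frequency) weighting below is an illustrative assumption chosen so that frequent behavior settings dominate the score, as the abstract describes; the published operationalization may differ in detail.

```python
from collections import Counter

# Hedged sketch of a frequency-weighted rQoL-style score: each
# behavior setting's mean positive affect is weighted by its squared
# frequency, so frequent settings dominate. The weighting scheme is
# an illustrative assumption, not the published formula.
def rqol(beeps):
    """beeps: list of (behavior_setting, positive_affect) observations."""
    affect = {}
    for setting, value in beeps:
        affect.setdefault(setting, []).append(value)
    counts = Counter(setting for setting, _ in beeps)
    num = sum(counts[s] ** 2 * (sum(a) / len(a)) for s, a in affect.items())
    den = sum(c ** 2 for c in counts.values())
    return num / den
```

With three low-affect beeps at "work" and one high-affect beep at "home", the score falls below the grand mean affect, reflecting that the most frequent setting carries low positive affect.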
Feio, M J; Ferreira, J; Buffagni, A; Erba, S; Dörflinger, G; Ferréol, M; Munné, A; Prat, N; Tziortzis, I; Urbanič, G
2014-04-01
Within the Mediterranean region each country has its own assessment method based on aquatic macroinvertebrates. However, independently of the classification system, quality assessments should be comparable across members of the European Commission, which means, among others, that the boundaries between classes should not deviate significantly. Here we check for comparability between High-Good and Good-Moderate classifications, through the use of a common metric. Additionally, we discuss the influence of the conceptual and statistical approaches used to calculate a common boundary within the Mediterranean countries participating in the Intercalibration Exercise (e.g., using individual national type-boundaries, one value for each common type or an average boundary by country; weighted average, median) in the overall outcome. All methods, except for the IBMWP (the Iberian BMWP) when applied to temporary rivers, were highly correlated (0.82
Inpatient preanalytic process improvements.
Wagar, Elizabeth A; Phipps, Ron; Del Guidice, Robert; Middleton, Lavinia P; Bingham, John; Prejean, Cheryl; Johnson-Hamilton, Martha; Philip, Pheba; Le, Ngoc Han; Muses, Waheed
2013-12-01
Phlebotomy services are a common target for preanalytic improvements. Many new quality engineering tools have recently been applied in clinical laboratories. However, data on relatively few projects have been published. This example describes a complete application of current quality engineering tools to improve preanalytic phlebotomy services. The objectives were to decrease the response time in the preanalytic inpatient laboratory by 25%, to reduce the number of incident reports related to preanalytic phlebotomy, and to make systematic process changes that satisfied the stakeholders. The Department of Laboratory Medicine, General Services Section, at the University of Texas MD Anderson Cancer Center (Houston) is responsible for inpatient phlebotomy in a 24-hour operation, which serves 689 inpatient beds. The study director was project director of the Division of Pathology and Laboratory Medicine's Quality Improvement Section and was assisted by 2 quality technologists and an industrial engineer from the MD Anderson Office of Performance Improvement. After implementing each solution, using well-recognized quality tools and metrics, the response time for blood collection decreased by 23%, which was close to meeting the original responsiveness goal of 25%. The response time between collection and arrival in the laboratory decreased by 8%. Applicable laboratory-related incident reports were reduced by 43%. Comprehensive application of quality tools, such as statistical control charts, Pareto diagrams, value-stream maps, process failure modes and effects analyses, fishbone diagrams, solution prioritization matrices, and customer satisfaction surveys, can significantly improve preset goals for inpatient phlebotomy.
Nimptsch, Ulrike; Peschke, Dirk; Mansky, Thomas
2016-10-01
In 2008 the 'Initiative Qualitätsmedizin' (initiative for quality in medical care, IQM) was established as a voluntary non-profit association of hospital providers of all kinds of ownership. Currently, about 350 hospitals from Germany and Switzerland participate in IQM. Member hospitals are committed to a quality strategy based on measuring outcome indicators using administrative data, peer review procedures to improve medical quality, and transparency by public reporting. This study aims to investigate whether voluntary implementation of this approach is associated with improvements in medical outcome. Within a retrospective before-after study 63 hospitals, which started to participate in IQM between 2009 and 2011, were monitored. In-hospital mortality in these hospitals was studied for 14 selected inpatient services in comparison to the German national average. The analyses examine whether in-hospital mortality declined after participation of the studied hospitals in IQM, independently of secular trends or deviations in case mix when compared to the national average, and whether such findings were associated with initial hospital performance or peer review procedures. Declining in-hospital mortality was observed in hospitals with initially subpar performance. These declines were statistically significant for treatment of myocardial infarction, heart failure, pneumonia, and septicemia. Similar, but statistically non-significant trends were observed for nine further treatments. Following peer-review procedures significant declines in in-hospital mortality were observed for treatments of myocardial infarction, heart failure, and pneumonia. Mortality declines after peer reviews regarding stroke, hip fracture and colorectal resection were not significant, and after peer reviews regarding mechanically ventilated patients no changes were observed. The results point to a positive impact of the quality approach applied by IQM on clinical outcomes. 
A more targeted selection of hospitals to be peer-reviewed might further enhance the impact of this approach. Copyright © 2016. Published by Elsevier GmbH.
Cui, Ming; Xu, Lili; Wang, Huimin; Ju, Shaoqing; Xu, Shuizhu; Jing, Rongrong
2017-12-01
Measurement uncertainty (MU) is a metrological concept, which can be used for objectively estimating the quality of test results in medical laboratories. The Nordtest guide recommends an approach that uses both internal quality control (IQC) and external quality assessment (EQA) data to evaluate the MU. Bootstrap resampling is employed to simulate the unknown distribution based on the mathematical statistics method using an existing small sample of data, where the aim is to transform the small sample into a large sample. However, there have been no reports of the utilization of this method in medical laboratories. Thus, this study applied the Nordtest guide approach based on bootstrap resampling for estimating the MU. We estimated the MU for the white blood cell (WBC) count, red blood cell (RBC) count, hemoglobin (Hb), and platelets (Plt). First, we used 6 months of IQC data and 12 months of EQA data to calculate the MU according to the Nordtest method. Second, we combined the Nordtest method and bootstrap resampling with the quality control data and calculated the MU using MATLAB software. We then compared the MU results obtained using the two approaches. The expanded uncertainty results determined for WBC, RBC, Hb, and Plt using the bootstrap resampling method were 4.39%, 2.43%, 3.04%, and 5.92%, respectively, and 4.38%, 2.42%, 3.02%, and 6.00% with the existing quality control data (U [k=2]). For WBC, RBC, Hb, and Plt, the differences between the results obtained using the two methods were lower than 1.33%. The expanded uncertainty values were all less than the target uncertainties. The bootstrap resampling method allows the statistical analysis of the MU. Combining the Nordtest method and bootstrap resampling is considered a suitable alternative method for estimating the MU. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
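The combination of an IQC-based within-lab component with an EQA-based bias component, plus a bootstrap over the IQC data, can be sketched as follows. The helper names and input values are illustrative, and the Nordtest combination is reduced to its standard root-sum-of-squares form with coverage factor k.

```python
import math
import random

# Hedged sketch of a Nordtest-style MU estimate with bootstrap
# resampling of IQC data; inputs and helper names are illustrative.
def nordtest_expanded_u(u_rw, u_bias, k=2):
    """Combine within-lab reproducibility and bias components,
    then expand with coverage factor k (k=2 gives ~95% coverage)."""
    return k * math.sqrt(u_rw ** 2 + u_bias ** 2)

def bootstrap_u_rw(iqc_cvs, n_boot=2000, seed=1):
    """Bootstrap the within-lab component: resample the observed IQC
    coefficients of variation (in %) and average the RMS estimates."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_boot):
        sample = [rng.choice(iqc_cvs) for _ in iqc_cvs]
        estimates.append(math.sqrt(sum(cv ** 2 for cv in sample) / len(sample)))
    return sum(estimates) / n_boot
```

Under the Nordtest guide the bias component would be derived from EQA deviations; here it is passed in directly as a single illustrative value.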
Jeong, Hyeonsoo; Song, Ki-Duk; Seo, Minseok; Caetano-Anollés, Kelsey; Kim, Jaemin; Kwak, Woori; Oh, Jae-Don; Kim, EuiSoo; Jeong, Dong Kee; Cho, Seoae; Kim, Heebal; Lee, Hak-Kyo
2015-08-20
Natural and artificial selection following domestication has led to the existence of more than a hundred pig breeds, as well as incredible variation in phenotypic traits. Berkshire pigs are regarded as having superior meat quality compared to other breeds. As the meat production industry seeks selective breeding approaches to improve profitable traits such as meat quality, information about genetic determinants of these traits is in high demand. However, most of the studies have been performed using trained sensory panel analysis without investigating the underlying genetic factors. Here we investigate the relationship between genomic composition and this phenotypic trait by scanning for signatures of positive selection in whole-genome sequencing data. We generated genomes of 10 Berkshire pigs at a total of 100.6× coverage depth, using the Illumina Hiseq2000 platform. Along with the genomes of 11 Landrace and 13 Yorkshire pigs, we identified genomic variants of 18.9 million SNVs and 3.4 million Indels in the mapped regions. We identified several associated genes related to lipid metabolism, intramuscular fatty acid deposition, and muscle fiber type which contribute to pork quality (TG, FABP1, AKIRIN2, GLP2R, TGFBR3, JPH3, ICAM2, and ERN1) by applying between-population statistical tests (XP-EHH and XP-CLR). A statistical enrichment test was also conducted to detect breed-specific genetic variation. In addition, a de novo short sequence read assembly strategy identified several candidate genes (SLC25A14, IGF1, PI4KA, CACNA1A) as also contributing to lipid metabolism. Results revealed several candidate genes involved in Berkshire meat quality; most of these genes are involved in lipid metabolism and intramuscular fat deposition. These results can provide a basis for future research on the genomic characteristics of Berkshire pigs.
Obsessive Compulsive Symptoms and Quality of Life in mothers of Children With Atopic Dermatitis.
Gunduz, S; Usak, E; Ozen, S; Gorpelioglu, C
2017-06-01
Atopic dermatitis is one of the most common skin disorders in children and it can negatively affect both children and their families. The purpose of this study was to investigate the effect of atopic dermatitis on quality of life related to maternal health and on maternal obsessive compulsive symptoms. A cross-sectional study was conducted in the pediatric and dermatology polyclinics. The SCORAD index was used for determining the severity of disease, and the Maudsley Obsessive Compulsive Inventory (MOCI) and SF-36 form were applied to the participants' mothers. A total of 120 children and their mothers participated in the study. Comparing the atopic dermatitis group and the healthy control group, no statistically significant differences were seen in terms of MOCI and SF-36 scores, except for the physical functioning subscore. The results showed that having a child with atopic dermatitis and the severity of the disease do not influence mothers in terms of obsessive-compulsive symptoms and health-related quality of life, except for physical functioning scores. Copyright © 2017 AEDV. Publicado por Elsevier España, S.L.U. All rights reserved.
Egg quality in laying hens exposed to Mycoplasma gallisepticum F-strain attenuated vaccine.
Machado, L D S; Santos, F F D; Togashi, C K; Abreu, D L D C; Pimentel, J C; Sesti, L; Pereira, V L D A; Nascimento, E R D
2017-04-01
Mycoplasma gallisepticum causes coughing, ocular and nasal discharge, reduction in feed intake, lower and uneven growth, decline in egg production and quality, and increase in mortality. Among the attenuated vaccination strains, MGF can reduce clinical signs and lesions in layer hens, stimulate immune responses of cellular and humoral basis, act as an instrument of competitive exclusion in relation to field strains, and reduce the use of antimicrobials. This study aimed to investigate the effects of attenuated MG F-strain vaccination on egg quality in 3 groups of 30 hens each: one control and 2 vaccinated. Vaccination was applied by the ocular route at 8 and 12 wk of age. Comparisons were made among unvaccinated hens, hens vaccinated at 8 wk of age, and hens vaccinated at 8 and 12 wk of age. There were no statistical differences in eggshell thickness and weight among groups. Eggs from twice-vaccinated birds yielded a Haugh unit significantly lower than the other groups without affecting egg classification. There was no significant difference in ELISA results between the vaccinated groups. © 2016 Poultry Science Association Inc.
NASA Astrophysics Data System (ADS)
Deligiorgi, Despina; Philippopoulos, Kostas; Thanou, Lelouda; Karvounis, Georgios
2010-01-01
Spatial interpolation in air pollution modeling is the procedure for estimating ambient air pollution concentrations at unmonitored locations based on available observations. The selection of the appropriate methodology is based on the nature and the quality of the interpolated data. In this paper, an assessment of three widely used interpolation methodologies is undertaken in order to estimate the errors involved. For this purpose, air quality data from January 2001 to December 2005, from a network of seventeen monitoring stations operating at the greater area of Athens in Greece, are used. The Nearest Neighbor and Linear schemes were applied to the mean hourly observations, while the Inverse Distance Weighted (IDW) method was applied to the mean monthly concentrations. The discrepancies between the estimated and measured values are assessed for every station and pollutant, using the correlation coefficient, scatter diagrams and the statistical residuals. The capability of the methods to estimate air quality data in an area with multiple land-use types and pollution sources, such as Athens, is discussed.
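Of the three schemes compared above, Inverse Distance Weighted interpolation is the simplest to sketch: a target location's concentration is a weighted average of station values, with weights falling off with distance. Station coordinates, values, and the power parameter below are illustrative, not from the Athens network.

```python
import math

# Minimal inverse-distance-weighting (IDW) sketch for estimating a
# pollutant concentration at an unmonitored point; data illustrative.
def idw(stations, target, power=2):
    """stations: list of ((x, y), value); target: (x, y) point."""
    num = den = 0.0
    for (x, y), value in stations:
        d = math.hypot(x - target[0], y - target[1])
        if d == 0:
            return value  # exactly at a station: use its measurement
        w = 1.0 / d ** power  # closer stations weigh more
        num += w * value
        den += w
    return num / den
```

A point equidistant between two stations receives the average of their values; with the default power of 2, weights fall off with the square of distance.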
Semen quality and sex hormones with reference to metal welding.
Hjollund, N H; Bonde, J P; Jensen, T K; Ernst, E; Henriksen, T B; Kolstad, H A; Giwercman, A; Skakkebaek, N E; Olsen, J
1998-01-01
Welding may involve hazards to the male reproductive system, but previous studies of semen quality have produced inconsistent results. We studied the effects of welding on markers of semen quality in a Danish nationwide sample of 430 first-time pregnancy planners without earlier reproductive experience. Couples were recruited among members of the union of metal workers and three other trade unions and were followed from termination of birth control until pregnancy for a maximum of six menstrual cycles. The males provided semen samples in each cycle. Median sperm density for welders was 56 x 10(6)/mL (52.5 x 10(6)/mL and 50.0 x 10(6)/mL in two reference groups). No statistically significant differences attributable to welding were found in proportions of morphologically normal sperm, sperm motility assessed by computer-aided sperm analysis, or sex hormones (testosterone, follicle-stimulating hormone, and luteinizing hormone). These negative findings may not apply to populations with high-level exposure to welding fume or to welders exposed to other putative hazards, e.g., heat.
Do HMO penetration and hospital competition impact quality of hospital care?
Rivers, P A; Fottler, M D
2004-11-01
This study examines the impact of HMO penetration and competition on hospital markets. A modified structure-conduct-performance paradigm was applied to the health care industry in order to investigate the impact of HMO penetration and competition on risk-adjusted hospital mortality rates (i.e. quality of hospital care). Secondary data for 1957 acute care hospitals in the USA from the 1991 American Hospital Association's Annual Survey of Hospitals were used. The outcome variables were risk-adjusted mortality rates in 1991. Predictor variables were market characteristics (i.e. managed care penetration and hospital competition). Control variables were environmental, patient, and institutional characteristics. Associations between predictor and outcome variables were investigated using statistical regression techniques. Hospital competition had a negative relationship with risk-adjusted mortality rates (a negative indicator of quality of care). HMO penetration, hospital competition, and an interaction effect of HMO penetration and competition were not found to have significant effects on risk-adjusted mortality rates. These findings suggest that when faced with intense competition, hospitals may respond in ways associated with reducing their mortality rates.
34 CFR 611.1 - What definitions apply to the Teacher Quality Enhancement Grants Program?
Code of Federal Regulations, 2012 CFR
2012-07-01
... 34 Education 3 2012-07-01 2012-07-01 false What definitions apply to the Teacher Quality... Education (Continued) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION TEACHER QUALITY ENHANCEMENT GRANTS PROGRAM General Provisions § 611.1 What definitions apply to the Teacher Quality Enhancement...
34 CFR 611.1 - What definitions apply to the Teacher Quality Enhancement Grants Program?
Code of Federal Regulations, 2014 CFR
2014-07-01
... 34 Education 3 2014-07-01 2014-07-01 false What definitions apply to the Teacher Quality... Education (Continued) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION TEACHER QUALITY ENHANCEMENT GRANTS PROGRAM General Provisions § 611.1 What definitions apply to the Teacher Quality Enhancement...
34 CFR 611.1 - What definitions apply to the Teacher Quality Enhancement Grants Program?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 34 Education 3 2011-07-01 2011-07-01 false What definitions apply to the Teacher Quality... Education (Continued) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION TEACHER QUALITY ENHANCEMENT GRANTS PROGRAM General Provisions § 611.1 What definitions apply to the Teacher Quality Enhancement...
21 CFR 211.165 - Testing and release for distribution.
Code of Federal Regulations, 2010 CFR
2010-04-01
... products meet each appropriate specification and appropriate statistical quality control criteria as a condition for their approval and release. The statistical quality control criteria shall include appropriate acceptance levels and/or appropriate rejection levels. (e) The accuracy, sensitivity, specificity, and...
Some Dimensions of Data Quality in Statistical Systems
DOT National Transportation Integrated Search
1997-07-01
An important objective of a statistical data system is to enable users of the data to recommend (and organizations to take) rational action for solving problems or for improving quality of service or manufactured product. With this view in mind, this ...
Distribution of water quality parameters in Dhemaji district, Assam (India).
Buragohain, Mridul; Bhuyan, Bhabajit; Sarma, H P
2010-07-01
The primary objective of this study is to present a statistically significant water quality database of Dhemaji district, Assam (India), with special reference to pH, fluoride, nitrate, arsenic, iron, sodium and potassium. 25 water samples collected from different locations of five development blocks in Dhemaji district have been studied separately. The implications presented are based on statistical analyses of the raw data. Normal distribution statistics and reliability analysis (correlation and covariance matrix) have been employed to find out the distribution pattern, localisation of data, and other related information. Statistical observations show that all the parameters under investigation exhibit non-uniform distribution with a long asymmetric tail either on the right or left side of the median. The width of the third quartile was consistently found to be more than that of the second quartile for each parameter. Differences among mean, mode and median, and significant skewness and kurtosis values indicate that the distribution of various water quality parameters in the study area is widely off normal. Thus, the intrinsic water quality is not encouraging due to the unsymmetrical distribution of various water quality parameters in the study area.
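The quartile and shape checks described above can be sketched with plain-Python helpers; the sample values in the usage note are illustrative, not from the Dhemaji dataset.

```python
# Sketch of the distribution diagnostics described: quartiles (with
# linear interpolation) and moment-based sample skewness.
def quartiles(values):
    """Return (Q1, median, Q3) using linear interpolation."""
    s = sorted(values)
    def q(p):
        idx = p * (len(s) - 1)
        lo, frac = int(idx), idx - int(idx)
        return s[lo] + frac * (s[min(lo + 1, len(s) - 1)] - s[lo])
    return q(0.25), q(0.5), q(0.75)

def skewness(values):
    """Moment-based sample skewness: m3 / m2**1.5."""
    n = len(values)
    mean = sum(values) / n
    m2 = sum((v - mean) ** 2 for v in values) / n
    m3 = sum((v - mean) ** 3 for v in values) / n
    return m3 / m2 ** 1.5
```

A symmetric series gives zero skewness; the long right tails reported in the study would show up as positive skewness and a wider third quartile.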
Pierre-Olivier, Jean; Bradley, Robert L; Tremblay, Jean-Pierre; Côté, Steeve D
2015-09-01
An important asset for the management of wild ungulates is recognizing the spatial distribution of forage quality across heterogeneous landscapes. To do so typically requires knowledge of which plant species are eaten, in what abundance they are eaten, and what their nutritional quality might be. Acquiring such data, however, may be difficult and time consuming. Here, we are proposing a rapid and cost-effective forage quality monitoring tool that combines near infrared (NIR) spectra of fecal samples and easily obtained data on plant community composition. Our approach rests on the premise that NIR spectra of fecal samples collected within low population density exclosures reflect the optimal forage quality of a given landscape. Forage quality can thus be based on the Mahalanobis distance of fecal spectral scans across the landscape relative to fecal spectral scans inside exclosures (referred to as DISTEX). The Gi* spatial autocorrelation statistic can then be applied among neighboring DISTEX values to detect and map "hot spots" and "cold spots" of nutritional quality over the landscape. We tested our approach in a heterogeneous boreal landscape on Anticosti Island (Québec, Canada), where white-tailed deer (Odocoileus virginianus) populations over the landscape have ranged from 20 to 50 individuals/km2 for at least 80 years, resulting in a loss of most palatable and nutritious plant species. Our results suggest that hot spots of forage quality occur when old-growth balsam fir stands comprise >39.8% of 300 ha neighborhoods, whereas cold spots occur in laggs (i.e., transition zones from forest to peatland). In terms of ground-level indicator plant species, the presence of Canada bunchberry (Cornus canadensis) was highly correlated with hot spots, whereas tamarack (Larix laricina) was highly correlated with cold spots. Mean DISTEX values were positively and significantly correlated with the neutral detergent fiber and acid detergent lignin contents of feces. 
While our approach would need more independent field trials before it is fully validated, its low cost and ease of execution should make it a valuable tool for advancing both the basic and applied ecology of large herbivores.
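The core of the DISTEX metric described above is the Mahalanobis distance of a fecal spectral scan from the cloud of reference scans collected inside exclosures. A sketch of that computation; function and variable names are illustrative, not taken from the study:

```python
import numpy as np

def distex(reference_spectra, sample_spectrum):
    """Mahalanobis distance of one fecal NIR spectrum from a reference
    set of spectra (e.g. scans collected inside low-density exclosures)."""
    mu = reference_spectra.mean(axis=0)
    cov = np.cov(reference_spectra, rowvar=False)
    cov_inv = np.linalg.pinv(cov)      # pseudo-inverse guards against singularity
    d = sample_spectrum - mu
    return float(np.sqrt(d @ cov_inv @ d))

rng = np.random.default_rng(1)
reference = rng.normal(size=(50, 5))   # 50 reference scans, 5 wavelengths
inside = reference.mean(axis=0)        # a scan identical to the reference mean
outside = inside + 5.0                 # a scan far from optimal forage quality
print(distex(reference, inside), distex(reference, outside))
```

Large DISTEX values mark landscape cells whose fecal spectra depart from the optimal-forage reference; the Gi* statistic is then applied to these values to map hot and cold spots.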
NASA Astrophysics Data System (ADS)
Suthama, N.; Pramono, Y. B.; Sukamto, B.
2018-01-01
Dietary inclusion of antibiotic growth promoters (AGPs) in poultry production has been applied for decades worldwide, but AGPs have recently been banned due to negative consequences for health and food safety. Soybean oligosaccharide (SOS), derived from soybean meal extract, is a natural compound that leaves no residue in the product and is friendly to consumer health. The purpose of the present study was to evaluate the effect of dietary SOS inclusion on broiler meat quality. A total of 120 seven-day-old broilers were allocated to 3 treatments with 4 replications (10 birds each) in a completely randomized design. The treatments were D1: diet without SOS; D2: D1 plus 0.15% SOS; and D3: D1 plus 0.30% SOS. Intestinal lactic acid bacteria (LAB), protein digestibility, meat protein and fat deposition, and meat cholesterol were the parameters observed. Data were statistically tested using analysis of variance and Duncan's test. Dietary SOS inclusion at 0.30% (D3) significantly (P<0.05) increased LAB population (7.21 × 10^4 cfu/g), protein digestibility (72.80%), and meat protein deposition (90.83 g/bird), but decreased meat fat (8.27 g/bird) and meat cholesterol (37.28 mg/100 g). In conclusion, dietary SOS inclusion at 0.30% improves broiler meat quality, as shown by increased meat protein deposition with lower fat and cholesterol.
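The analysis-of-variance step used above can be sketched with SciPy. The digestibility numbers below are invented for illustration, and Tukey's HSD stands in for Duncan's multiple range test, which SciPy does not provide:

```python
from scipy import stats

# Illustrative protein-digestibility values (%) for the three diets; the
# numbers are made up for this sketch, not taken from the study.
d1 = [65.1, 64.8, 66.0, 65.5]   # no SOS
d2 = [68.2, 67.9, 69.1, 68.5]   # 0.15% SOS
d3 = [72.5, 73.1, 72.4, 72.9]   # 0.30% SOS

# One-way ANOVA across the three treatment groups.
f_stat, p_value = stats.f_oneway(d1, d2, d3)
print(p_value < 0.05)  # at least one diet mean differs

# Duncan's test is not in SciPy; Tukey's HSD is a common substitute
# for the pairwise follow-up comparisons.
res = stats.tukey_hsd(d1, d2, d3)
```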
Weber, Benjamin; Lee, Sau L; Delvadia, Renishkumar; Lionberger, Robert; Li, Bing V; Tsong, Yi; Hochhaus, Guenther
2015-03-01
Equivalence testing of aerodynamic particle size distribution (APSD) through multi-stage cascade impactors (CIs) is important for establishing bioequivalence of orally inhaled drug products. Recent work demonstrated that the median of the modified chi-square ratio statistic (MmCSRS) is a promising metric for APSD equivalence testing of test (T) and reference (R) products as it can be applied to a reduced number of CI sites that are more relevant for lung deposition. This metric is also less sensitive to the increased variability often observed for low-deposition sites. A method to establish critical values for the MmCSRS is described here. This method considers the variability of the R product by employing a reference variance scaling approach that allows definition of critical values as a function of the observed variability of the R product. A stepwise CI equivalence test is proposed that integrates the MmCSRS as a method for comparing the relative shapes of CI profiles and incorporates statistical tests for assessing equivalence of single actuation content and impactor sized mass. This stepwise CI equivalence test was applied to 55 published CI profile scenarios, which were classified as equivalent or inequivalent by members of the Product Quality Research Institute working group (PQRI WG). The results of the stepwise CI equivalence test using a 25% difference in MmCSRS as an acceptance criterion provided the best matching with those of the PQRI WG as decisions of both methods agreed in 75% of the 55 CI profile scenarios.
Petruseviciene, Daiva; Surmaitiene, Deive; Baltaduoniene, Daiva; Lendraitiene, Egle
2018-01-01
We aimed to evaluate the short-term effects of community-based occupational therapy on health-related quality of life and engagement in meaningful activities among women with breast cancer. An open-label randomized controlled trial design was applied. The participants were members of various societies of women with cancer. In total, 22 women participated in the study. Participants in the experimental group (n = 11) took part in a 6-week community-based occupational therapy program and the usual activities of the societies, whereas the control group (n = 11) took part in the usual activities of the societies only. One participant withdrew during the course; therefore, 21 completed the study. Participants of both groups were assessed for health-related quality of life, and the participants of the experimental group were also assessed for engagement in meaningful activities. The evaluation was carried out during the non-acute period of the disease, at the beginning of the study and after 6 weeks. Women in the experimental group demonstrated statistically significantly better scores in global quality of life; role, physical, emotional, cognitive, and social functions; fatigue; insomnia; financial impact; systemic therapy side effects; and breast symptoms scales compared with the control group (p < 0.05) after the 6 weeks, as measured by the EORTC QLQ-C30 questionnaire and its breast cancer module QLQ-BR23. Furthermore, women in the experimental group demonstrated significantly greater engagement in meaningful activities when community-based occupational therapy was applied (p < 0.05), as measured by the Engagement in Meaningful Activities Survey (EMAS).
The evaluation of associations between the women's engagement in meaningful activities and changes in health-related quality of life showed that greater engagement in meaningful activities was associated with better emotional function and a lower level of insomnia (p < 0.05). Based on the results of our study, we recommend applying occupational therapy in community healthcare to maintain or improve breast cancer patients' health-related quality of life, and we suggest involving women in meaningful activities during community-based occupational therapy after clarifying which activities are important to them.
Three Experts on Quality Management: Philip B. Crosby, W. Edwards Deming, Joseph M. Juran
1992-07-01
Department of the Navy, Office of the Under Secretary of the Navy, Total Quality Leadership Office. THREE EXPERTS ON QUALITY MANAGEMENT: PHILIP B. CROSBY, W. EDWARDS DEMING, JOSEPH M. JURAN. ...research, as the "price of nonconformance." To aid managers in tracking the cost of doing... statistical theory, statistical thinking, and the application... Quality Management emphasizes that the process must become a way of life in the organization. Continuance is... Theory of Systems: "A system is a series of..."
ERIC Educational Resources Information Center
Huizingh, Eelko K. R. E.
2007-01-01
Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…
Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.
Westgard, James O; Westgard, Sten A
2017-03-01
Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
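The sigma-metric referred to above is conventionally computed from the allowable total error (TEa), the observed bias, and the imprecision (CV) at a medical decision concentration, all in percent:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma-metric for an analytical examination process:
    (allowable total error - |bias|) / CV, all in percent at a
    medical decision concentration."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Example: TEa = 10%, bias = 1%, CV = 1.5%  ->  sigma = 6.0
print(sigma_metric(10.0, 1.0, 1.5))
```

A process at six sigma needs only simple SQC rules to detect medically important errors, whereas lower sigma values call for more stringent control procedures.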
Modeling the Dispersion of Inert Particles Using the SAQM Model
NASA Astrophysics Data System (ADS)
Pearson, R.; Fitzgerald, R. M.
2005-12-01
Cities throughout the U.S. are subject to the emission of particulate matter (PM) into the atmosphere from a variety of sources. The impact of these emissions has been studied extensively for regulatory compliance in the areas of health effects, air quality and visibility, but little work has been done on the fate and transport of inert particulate matter within the El Paso-Juarez Airshed. The Environmental Physics Group at The University of Texas at El Paso has recently applied the SARMAP Air Quality Model (SAQM) to model the dispersion of inert particulate matter in the El Paso-Juarez Airshed. The meteorological data for the SAQM were created with the Penn State/NCAR meteorological modeling system, version 5 (MM5). The SAQM was used to simulate two common scenarios for large particulate emission and concentration. The first was periods of heavy traffic volume at the international bridges, which cause large numbers of cars to sit, with engines running, for extended periods of time. The second was moderate to high wind events that cause large amounts of coarse particulate matter to become entrained in the atmosphere and transported into and around the region. Output from the MM5 was used as the meteorological driver for the SAQM. The MM5 was initialized with data from the NCAR reanalysis project. Meteorological data collected in the region by the Texas Commission on Environmental Quality (TCEQ) and by the EPA were used for four-dimensional data assimilation, and the MM5 was nudged with gridded, surface and observational data. Statistical analysis was done on the MM5 output for the variables wind speed, wind direction, temperature and mixing ratio; the statistics included RMSE, RMSEs, RMSEu and the index of agreement. The SAQM was applied to the domain with grid cells of 1.3 km per side. Temporal comparisons were made against the EPA's PM2.5 measurements to identify similarities in the evolution of the SAQM with observation.
The experience gained in this work will facilitate further studies of the dispersion of inert particles in other U.S. Southwest cities.
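Two of the evaluation statistics named above, RMSE and the index of agreement, can be sketched as follows; the systematic/unsystematic split (RMSEs, RMSEu) additionally requires a linear fit of model on observations and is omitted here. The observation and model values are illustrative:

```python
import numpy as np

def rmse(obs, mod):
    """Root-mean-square error between observed and modeled values."""
    obs, mod = np.asarray(obs, float), np.asarray(mod, float)
    return float(np.sqrt(np.mean((mod - obs) ** 2)))

def index_of_agreement(obs, mod):
    """Willmott's index of agreement d (0..1, 1 = perfect), commonly
    reported alongside RMSE in meteorological model evaluation."""
    obs, mod = np.asarray(obs, float), np.asarray(mod, float)
    obar = obs.mean()
    denom = np.sum((np.abs(mod - obar) + np.abs(obs - obar)) ** 2)
    return float(1.0 - np.sum((mod - obs) ** 2) / denom)

obs = [2.0, 3.0, 4.0, 5.0]   # e.g. observed wind speed (m/s)
mod = [2.1, 2.9, 4.2, 4.8]   # modeled values
print(round(rmse(obs, mod), 3), round(index_of_agreement(obs, mod), 3))
```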
Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril
2014-07-01
Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement. The aim was to use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. The design was an observational descriptive study in a French maternity unit in the Rhône-Alpes region, with quarterly clinical audit meetings to analyse all cases of severe postpartum haemorrhage after vaginal delivery and provide feedback on quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries (p<0.001). Since 2010, the quarterly rate of severe PPH has not exceeded the upper control limit, that is, has not been out of statistical control. The proportion of cases managed consistently with the guidelines increased for all of the main components of care. Continuous quality improvement efforts, begun seven years ago, used statistical process control charts among other tools; during this period, the prevalence of severe postpartum haemorrhage after vaginal delivery was reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
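A control chart for a quarterly event proportion of the kind described is typically a p-chart with 3-sigma limits. A minimal sketch, with an assumed centre line and denominator since the abstract gives neither:

```python
import math

def p_chart_limits(p_bar, n):
    """3-sigma control limits for a proportion (p-chart), e.g. the
    quarterly rate of severe PPH among n vaginal deliveries.
    The lower limit is clamped at zero."""
    sigma = math.sqrt(p_bar * (1.0 - p_bar) / n)
    return max(0.0, p_bar - 3 * sigma), p_bar + 3 * sigma

# Assumed values for illustration: centre line 1% severe PPH,
# 800 deliveries per quarter.
lcl, ucl = p_chart_limits(0.01, 800)
print(round(lcl, 4), round(ucl, 4))
```

A quarter whose observed rate exceeds the upper control limit signals special-cause variation; staying within the limits, as the unit did after 2010, indicates a process in statistical control.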
Real, Jordi; Forné, Carles; Roso-Llorach, Albert; Martínez-Sánchez, Jose M
2016-05-01
Controlling for confounders is a crucial step in analytical observational studies, and multivariable models are widely used as statistical adjustment techniques. However, the validation of the assumptions of multivariable regression models (MRMs) should be made clear in scientific reporting. The objective of this study is to review the quality of statistical reporting of the most commonly used MRMs (logistic, linear, and Cox regression) applied in analytical observational studies published between 2003 and 2014 in journals indexed in MEDLINE. We reviewed a representative sample of articles indexed in MEDLINE (n = 428) with an observational design and use of MRMs (logistic, linear, and Cox regression). We assessed the quality of reporting of: model assumptions and goodness-of-fit, interactions, sensitivity analysis, crude and adjusted effect estimates, and specification of more than 1 adjusted model. The tests of underlying assumptions or goodness-of-fit of the MRMs used were described in 26.2% (95% CI: 22.0-30.3) of the articles, and 18.5% (95% CI: 14.8-22.1) reported the interaction analysis. Reporting of all items assessed was higher in articles published in journals with a higher impact factor. A low percentage of articles indexed in MEDLINE that used multivariable techniques provided information demonstrating rigorous application of the model selected as an adjustment method. Given the importance of these methods to the final results and conclusions of observational studies, greater rigor is required in reporting the use of MRMs in the scientific literature.
Process control charts in infection prevention: Make it simple to make it happen.
Wiemken, Timothy L; Furmanek, Stephen P; Carrico, Ruth M; Mattingly, William A; Persaud, Annuradha K; Guinn, Brian E; Kelley, Robert R; Ramirez, Julio A
2017-03-01
Quality improvement is central to Infection Prevention and Control (IPC) programs. Challenges may occur when applying quality improvement methodologies such as process control charts, often because typical infection preventionists have had limited exposure to them. Because of this, our team created an open-source database with a process control chart generator for IPC programs. The objectives of this report are to outline the development of the application and to demonstrate its use with simulated data. We used Research Electronic Data Capture (REDCap Consortium, Vanderbilt University, Nashville, TN), R (R Foundation for Statistical Computing, Vienna, Austria), and RStudio Shiny (R Foundation for Statistical Computing) to create an open-source data collection system with automated process control chart generation. We used simulated data to test and visualize both in-control and out-of-control processes for metrics commonly used in IPC programs. The R code implementing the control charts and the Shiny application can be found on our Web site (https://github.com/ul-research-support/spcapp). Screen captures of the workflow and simulated data indicating both common cause and special cause variation are provided. Process control charts can be easily developed to meet individual facility needs using freely available software. By providing our work free to all interested parties, we hope that others will be able to harness the power and ease of use of the application to improve the quality of care and patient safety in their facilities. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
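For readers without the authors' Shiny application at hand, one of the simplest process control charts for individual monthly values is the XmR (individuals/moving-range) chart. A sketch with simulated infection counts (the data are invented, not from the report):

```python
import numpy as np

def xmr_limits(values):
    """Individuals/moving-range (XmR) control limits, among the simplest
    process-control charts for monthly infection-prevention metrics.
    Uses the standard constant 2.66 = 3/d2 with d2 = 1.128 for n = 2."""
    x = np.asarray(values, float)
    mr_bar = np.mean(np.abs(np.diff(x)))  # mean moving range
    centre = x.mean()
    return centre - 2.66 * mr_bar, centre, centre + 2.66 * mr_bar

# Simulated monthly device-associated infection counts; the final month spikes.
counts = [4, 5, 3, 6, 4, 5, 4, 12]
lcl, centre, ucl = xmr_limits(counts[:-1])   # limits from the stable baseline
print(counts[-1] > ucl)  # the spike exceeds the upper limit: a special-cause signal
```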
Kuokkanen, Liisa; Leino-Kilpi, Helena; Numminen, Olivia; Isoaho, Hannu; Flinkman, Mervi; Meretoja, Riitta
2016-01-01
Although both nurse empowerment and competence are fundamental concepts for describing newly graduated nurses' professional development and job satisfaction, only a few studies exist on the relationship between these concepts. Therefore, the purpose of this study was to determine how newly graduated nurses assess their empowerment and to clarify the role of professional competence relative to other work-related factors. A descriptive, cross-sectional and correlational design was applied. The sample comprised newly graduated nurses (n = 318) in Finland. Empowerment was measured using the 19-item Qualities of an Empowered Nurse scale, and the Nurse Competence Scale measured nurses' self-assessed generic competence. In addition to demographic data, the background data included employment sector (public/private), job satisfaction, intent to change/leave job, work schedule (shifts/business hours) and assessments of the quality of care in the workplace. The data were analysed statistically using Spearman's correlation coefficient as well as one-way and multivariate analysis of variance. Cronbach's alpha coefficient was used to estimate internal consistency. Newly graduated nurses perceived their level of empowerment and competence as fairly high. The association between nurse empowerment and professional competence was statistically significant. Other variables correlating positively with empowerment included employment sector, age, job satisfaction, intent to change job, work schedule, and satisfaction with the quality of care in the work unit. The study indicates that competence had the strongest effect on newly graduated nurses' empowerment. New graduates need support and career opportunities. In the future, nurses' further education and nurse managers' resources for supporting and empowering nurses should respond to newly graduated nurses' requisites for attractive and meaningful work.
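Cronbach's alpha, used above to estimate internal consistency, follows directly from the item variances and the variance of the total score. A sketch with made-up item scores (not the study's data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency of a scale.
    `items` is a (respondents x items) array, e.g. responses to the
    19 empowerment items. alpha = k/(k-1) * (1 - sum(item variances)
    / variance of the total score)."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return float(k / (k - 1) * (1.0 - item_vars.sum() / total_var))

# Three highly consistent items -> alpha close to 1.
scores = np.array([[4, 4, 5],
                   [2, 2, 2],
                   [5, 4, 5],
                   [3, 3, 3],
                   [1, 2, 1]])
print(round(cronbach_alpha(scores), 2))
```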
2012-01-01
Background: This study describes the 2-year impact on quality of life (QOL), in relation to anatomical discrepancy, among T4a oral cancer patients after free flap reconstruction in Taiwan. Methods: Thirty-two patients who underwent tumor ablation with simultaneous microvascular free flap transfer and reached 2-year follow-up were recruited. They were divided into six subgroups according to the resected area: (1) buccal/retromolar trigone; (2) cheek; (3) commissure; (4) lip; (5) mandible; and (6) tongue. Functional disturbances and daily activity were analyzed using the Version-1 UW QOL Questionnaire with one additional category, 'Drooling'. Kruskal-Wallis rank sums analysis was used to test differences in average QOL scores between the subgroups, and post-hoc analysis was applied to assess the influence of dominant categories between subgroups. Results: The category 'Pain' had the highest average score and reached statistical significance (P = 0.019) among all categories, whereas the category 'Employment' averaged the lowest score. For 'Pain', a statistically significant difference (P = 0.0032) existed between the commissure- and cheek-involved groups, the former showing poorer pain-related quality of life. Conclusions: The commissure-involved group had the lowest average score, which might imply the worst QOL in our study, especially for the categories 'Pain' and 'Drooling'. This study of T4a patients was the first carried out in Taiwan implementing the QOL questionnaire, and its results may serve as a future reference. PMID:22789070
Anota, Amélie; Hamidou, Zeinab; Paget-Bailly, Sophie; Chibaudel, Benoist; Bascoul-Mollevi, Caroline; Auquier, Pascal; Westeel, Virginie; Fiteni, Frederic; Borg, Christophe; Bonnetain, Franck
2015-01-01
Longitudinal analysis of health-related quality of life (HRQoL) remains unstandardized and compromises comparison of results between trials. In oncology, despite available statistical approaches, results are poorly used to change standards of care, mainly due to a lack of standardization and of the ability to propose clinically meaningful results. In this context, the time to deterioration (TTD) has been proposed as a modality of longitudinal HRQoL analysis for cancer patients. As for tumor response and progression, we propose to develop RECIST criteria for HRQoL. Several definitions of TTD are investigated in this paper. We applied this approach in early breast cancer and metastatic pancreatic cancer with a 5-point minimal clinically important difference. In breast cancer, TTD was defined relative to either the baseline score or the best previous score. In pancreatic cancer (arm 1: gemcitabine with FOLFIRI.3; arm 2: gemcitabine alone), the time until definitive deterioration (TUDD) was investigated with or without death as an event. In the breast cancer study, 381 women were included. The median TTD was influenced by the choice of the reference score. In the pancreatic cancer study, 98 patients were enrolled. Patients in arm 1 presented longer TUDD than those in arm 2 for most HRQoL scores. Results for TUDD differed slightly according to the definition of deterioration applied. Currently, the international ARCAD group supports the idea of developing RECIST for HRQoL in pancreatic and colorectal cancer with liver metastasis, with a view to using HRQoL as a co-primary endpoint along with a tumor parameter.
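One of the TTD definitions discussed above (deterioration relative to the baseline score, with a 5-point minimal clinically important difference) can be sketched as follows; the scores are illustrative, and other variants use the best previous score as the reference:

```python
def time_to_deterioration(times, scores, baseline, mcid=5.0):
    """Time until the HRQoL score first drops by at least `mcid` points
    below the reference score. Returns (time, event); if no deterioration
    is observed, the patient is censored at the last assessment."""
    for t, s in zip(times, scores):
        if s <= baseline - mcid:
            return t, True          # deterioration observed at time t
    return times[-1], False         # censored

# Global QoL scores at months 0, 3, 6, 9 (invented for illustration):
t, event = time_to_deterioration([0, 3, 6, 9], [70, 68, 63, 60], baseline=70)
print(t, event)  # the first drop of >= 5 points occurs at month 6
```

The resulting (time, event) pairs feed standard survival methods (Kaplan-Meier curves, log-rank tests) for comparing arms.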
Mędrzycki, Piotr; Jarzyna, Ingeborga; Obidziński, Artur; Tokarska-Guzik, Barbara; Sotek, Zofia; Pabjanek, Piotr; Pytlarczyk, Adam; Sachajdakiewicz, Izabela
2017-01-01
Species distribution models are scarcely applicable to invasive species because such species break the models' assumptions. So far, few mechanistic, semi-mechanistic or statistical solutions, such as dispersal constraints or propagule limitation, have been applied. We evaluated a novel quasi-semi-mechanistic approach for regional-scale models, using historical proximity variables (HPV) representing the state of the population at a given moment in the past. Our aim was to test the effects of adding HPV sets of different minimal recentness, information capacity and total number of variables on the quality of the species distribution model for Heracleum mantegazzianum over 116,000 km2 in Poland. As environmental predictors, we used fragments of 103 worldwide, free-access 1×1 km rasters from WorldGrids.org. Single and ensemble models were computed using the BIOMOD2 package 3.1.47 in the R environment 3.1.0. The addition of HPV improved the quality of single and ensemble models from poor to good and excellent. Quality was highest for the variants with HPVs based on the distance from the most recent past occurrences. It was mostly affected by the algorithm type, but all HPV traits (minimal recentness, information capacity, model type and the number of time periods) were significant determinants. The addition of HPVs improved the quality of current projections, raising the occurrence probability in regions where the species had occurred before. We conclude that HPV addition enables semi-realistic estimation of the rate of spread and can be applied to the short-term forecasting of invasive or declining species, which also break equal-dispersal probability assumptions. PMID:28926580
Zhang, X Y; Li, H; Zhao, Y J; Wang, Y; Sun, Y C
2016-07-01
To quantitatively evaluate the quality and accuracy of three-dimensional (3D) data acquired by two kinds of structured light intra-oral scanner when scanning typical tooth crown preparations. Eight typical tooth crown preparation models were each scanned 3 times with two kinds of structured light intra-oral scanner (A, B), forming the test groups. A high-precision model scanner was used to scan the models as the true value group. The data above the cervical margin were extracted. The quality indexes, including non-manifold edges, self-intersections, highly-creased edges, spikes, small components, small tunnels, small holes and the number of triangles, were measured with the Mesh Doctor tool in Geomagic Studio 2012. The scanned data of the test groups were aligned to the data of the true value group, and 3D deviations of the test groups from the true value group were measured for each scanned point, each preparation and each group. The independent-samples Mann-Whitney U test was applied to analyze the 3D deviations of groups A and B, and correlation analysis was applied to the index values and 3D deviation values. The total number of spikes in group A was 96, whereas those in group B and the true value group were 5 and 0, respectively. Trueness: group A 8.0 (8.3) μm, group B 9.5 (11.5) μm (P>0.05). The correlation of the number of spikes with data precision in group A was r=0.46. In this study, the quality of the data from scanner B was better than that from scanner A, while the difference in accuracy was not statistically significant. There is a correlation between the quality and precision of the data scanned with scanner A.
Signal quality and Bayesian signal processing in neurofeedback based on real-time fMRI.
Koush, Yury; Zvyagintsev, Mikhail; Dyck, Miriam; Mathiak, Krystyna A; Mathiak, Klaus
2012-01-02
Real-time fMRI allows analysis and visualization of brain activity online, i.e. within one repetition time. It can be used in neurofeedback applications where subjects attempt to control the activation level in a specified region of interest (ROI) of their brain. The signal derived from the ROI is contaminated with noise and artifacts, namely physiological noise from breathing and heartbeat, scanner drift, motion-related artifacts and measurement noise. We developed a Bayesian approach to reduce noise and remove artifacts in real time using a modified Kalman filter. The system performs several signal processing operations: subtraction of constant and low-frequency signal components, spike removal and signal smoothing. Quantitative feedback signal quality analysis was used to estimate the quality of the neurofeedback time series and the performance of the applied signal processing on different ROIs. The signal-to-noise ratio (SNR) across the entire time series and the group event-related SNR (eSNR) were significantly higher for the processed time series than for the raw data. The applied signal processing improved the t-statistic, increasing the significance of blood oxygen level-dependent (BOLD) signal changes. Accordingly, the contrast-to-noise ratio (CNR) of the feedback time series was improved as well. In addition, the data revealed an increase of localized self-control across feedback sessions. The new signal processing approach provided reliable neurofeedback, performed precise artifact removal, reduced noise, and required minimal manual adjustment of parameters. Advanced and fast online signal processing algorithms considerably increased the quality as well as the information content of the control signal, which in turn resulted in higher contingency in the neurofeedback loop. Copyright © 2011 Elsevier Inc. All rights reserved.
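The recursive structure of a Kalman filter of the kind adapted in this work can be illustrated with the scalar random-walk case. This is a generic sketch, not the authors' modified filter (which adds spike removal and drift subtraction), and the noise parameters are assumptions:

```python
import numpy as np

def kalman_smooth(y, q=1e-3, r=1e-1):
    """Scalar Kalman filter for a random-walk state observed in noise --
    a minimal sketch of the recursive filtering used to denoise
    real-time fMRI ROI signals. q: process-noise variance,
    r: measurement-noise variance."""
    x, p = y[0], 1.0
    out = []
    for z in y:
        p = p + q                   # predict: state uncertainty grows
        k = p / (p + r)             # Kalman gain
        x = x + k * (z - x)         # update with measurement z
        p = (1.0 - k) * p
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(2)
truth = np.sin(np.linspace(0, 3, 200))          # slow "BOLD-like" signal
noisy = truth + rng.normal(scale=0.3, size=200)  # added measurement noise
smooth = kalman_smooth(noisy)
# The filtered series tracks the truth more closely than the raw data:
print(np.mean((smooth - truth) ** 2) < np.mean((noisy - truth) ** 2))
```

Because each update uses only the current sample, the filter runs within one repetition time, which is what makes it suitable for online neurofeedback.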
Autorino, Riccardo; Borges, Claudio; White, Michael A; Altunrende, Fatih; Perdoná, Sisto; Haber, Georges-Pascal; De Sio, Marco; Khanna, Rakesh; Stein, Robert J; Kaouk, Jihad H
2010-12-01
To assess the quality of reporting of randomized controlled trials (RCTs) presented in abstract form at the annual World Congress of Endourology (WCE) and evaluate their course of subsequent publication. All RCTs presented in abstract form at the 2004, 2005, and 2006 WCE annual meetings were identified for review. Quality of reporting was assessed by applying a standardized 14-item evaluation tool based on the Consolidated Standards for the Reporting of Trials (CONSORT) statement. The subsequent publication rate for the corresponding studies by scanning Medline was also evaluated. Appropriate statistical analysis was performed. A total of 94 RCTs (3.5% of 2669) were identified for review: 21 in 2004, 36 in 2005, and 37 in 2006. Overall, 45 (47.3% of the total) were subsequently published as a full length indexed manuscript with a mean time to publication of 16.4 ± 13.2 months. Approximately 61 (60%) identified the study design as RCT in the abstract title. None reported the method of randomization. In studies that reported blinding (seven, 11% of 62), five were double blinded and two single blinded. Adverse events were reported in 38% of cases. Only 10% of the abstracts complied fully with more than 10 items according to our CONSORT-based checklist, whereas the majority of them failed to comply with most of the CONSORT requirements. Although representing a small portion of the overall number of abstracts, there has been a steady increase of presentation of RCTs at the WCE over the assessed 3-year period. Most of the time they are recognized as RCTs in the abstract title. When applying the CONSORT criteria, necessary information to assess their methodologic quality is incomplete in some cases.