Antweiler, Ronald C.; Taylor, Howard E.
2008-01-01
The main classes of statistical treatment of below-detection limit (left-censored) environmental data for the determination of basic statistics that have been used in the literature are substitution methods, maximum likelihood, regression on order statistics (ROS), and nonparametric techniques. These treatments, along with using all instrument-generated data (even those below detection), were evaluated by examining data sets in which the true values of the censored data were known. It was found that for data sets with less than 70% censored data, the best technique overall for determination of summary statistics was the nonparametric Kaplan-Meier technique. ROS and the two substitution methods of assigning one-half the detection limit value to censored data or assigning a random number between zero and the detection limit to censored data were adequate alternatives. The use of these two substitution methods, however, requires a thorough understanding of how the laboratory censored the data. The technique of employing all instrument-generated data - including numbers below the detection limit - was found to be less adequate than the above techniques. At high degrees of censoring (greater than 70% censored data), no technique provided good estimates of summary statistics. Maximum likelihood techniques were found to be far inferior to all other treatments except the substitution of zero or of the detection limit value for censored data.
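To make the substitution treatments concrete, the following sketch (not the authors' code; the detection limit and data are assumed purely for illustration) computes summary statistics for a left-censored sample under the one-half detection limit and uniform random substitution rules discussed above.

```python
# Minimal sketch (not the authors' code): summary statistics for a left-censored
# sample under two common substitution rules. Values below the detection limit
# (DL) are reported only as "censored".
import numpy as np

rng = np.random.default_rng(0)
true = rng.lognormal(mean=0.0, sigma=1.0, size=200)   # hypothetical concentrations
dl = 1.0                                              # hypothetical detection limit
observed = np.where(true >= dl, true, np.nan)         # NaN marks a censored value
censored = np.isnan(observed)

# Substitution with one-half the detection limit
half_dl = np.where(censored, dl / 2.0, observed)

# Substitution with a uniform random number between zero and the detection limit
uniform_sub = np.where(censored, rng.uniform(0.0, dl, size=true.size), observed)

for name, x in [("DL/2", half_dl), ("U(0, DL)", uniform_sub)]:
    print(f"{name}: mean={x.mean():.3f}, sd={x.std(ddof=1):.3f}")
```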
I. Arismendi; S. L. Johnson; J. B. Dunham
2015-01-01
Statistics of central tendency and dispersion may not capture relevant or desired characteristics of the distribution of continuous phenomena and, thus, they may not adequately describe temporal patterns of change. Here, we present two methodological approaches that can help to identify temporal changes in environmental regimes. First, we use higher-order statistical...
USDA-ARS's Scientific Manuscript database
Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...
42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.
Code of Federal Regulations, 2010 CFR
2010-10-01
… Health Maintenance Organizations, Competitive Medical Plans, and Health Care Prepayment Plans — Medicare Payment: Cost Basis — § 417.568 Adequate financial records, statistical data, and cost finding: requires definitions and accounting, statistics, and reporting practices that are widely accepted in the health care industry.
42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.
Code of Federal Regulations, 2011 CFR
2011-10-01
… Health Maintenance Organizations, Competitive Medical Plans, and Health Care Prepayment Plans — Medicare Payment: Cost Basis — § 417.568 Adequate financial records, statistical data, and cost finding: requires definitions and accounting, statistics, and reporting practices that are widely accepted in the health care industry.
Barber, Julie A; Thompson, Simon G
1998-01-01
Objective To review critically the statistical methods used for health economic evaluations in randomised controlled trials where an estimate of cost is available for each patient in the study. Design Survey of published randomised trials including an economic evaluation with cost values suitable for statistical analysis; 45 such trials published in 1995 were identified from Medline. Main outcome measures The use of statistical methods for cost data was assessed in terms of the descriptive statistics reported, use of statistical inference, and whether the reported conclusions were justified. Results Although all 45 trials reviewed apparently had cost data for each patient, only 9 (20%) reported adequate measures of variability for these data and only 25 (56%) gave results of statistical tests or a measure of precision for the comparison of costs between the randomised groups. Only 16 (36%) of the articles gave conclusions which were justified on the basis of results presented in the paper. No paper reported sample size calculations for costs. Conclusions The analysis and interpretation of cost data from published trials reveal a lack of statistical awareness. Strong and potentially misleading conclusions about the relative costs of alternative therapies have often been reported in the absence of supporting statistical evidence. Improvements in the analysis and reporting of health economic assessments are urgently required. Health economic guidelines need to be revised to incorporate more detailed statistical advice. Key messages: Health economic evaluations required for important healthcare policy decisions are often carried out in randomised controlled trials. A review of such published economic evaluations assessed whether statistical methods for cost outcomes have been appropriately used and interpreted. Few publications presented adequate descriptive information for costs or performed appropriate statistical analyses. In at least two thirds of the papers, the main conclusions regarding costs were not justified. The analysis and reporting of health economic assessments within randomised controlled trials urgently need improving. PMID:9794854
The Checkered History of American Psychiatric Epidemiology
Horwitz, Allan V; Grob, Gerald N
2011-01-01
Context American psychiatry has been fascinated with statistics ever since the specialty was created in the early nineteenth century. Initially, psychiatrists hoped that statistics would reveal the benefits of institutional care. Nevertheless, their fascination with statistics was far removed from the growing importance of epidemiology generally. The impetus to create an epidemiology of mental disorders came from the emerging social sciences, whose members were concerned with developing a scientific understanding of individual and social behavior and applying it to a series of pressing social problems. Beginning in the 1920s, the interest of psychiatric epidemiologists shifted to the ways that social environments contributed to the development of mental disorders. This emphasis dramatically changed after 1980 when the policy focus of psychiatric epidemiology became the early identification and prevention of mental illness in individuals. Methods This article reviews the major developments in psychiatric epidemiology over the past century and a half. Findings The lack of an adequate classification system for mental illness has precluded the field of psychiatric epidemiology from providing causal understandings that could contribute to more adequate policies to remediate psychiatric disorders. Because of this gap, the policy influence of psychiatric epidemiology has stemmed more from institutional and ideological concerns than from knowledge about the causes of mental disorders. Conclusion Most of the problems that have bedeviled psychiatric epidemiology since its inception remain unresolved. In particular, until epidemiologists develop adequate methods to measure mental illnesses in community populations, the policy contributions of this field will not be fully realized. PMID:22188350
Sampling methods for amphibians in streams in the Pacific Northwest.
R. Bruce Bury; Paul Stephen Corn
1991-01-01
Methods describing how to sample aquatic and semiaquatic amphibians in small streams and headwater habitats in the Pacific Northwest are presented. We developed a technique that samples 10-meter stretches of selected streams, which was adequate to detect presence or absence of amphibian species and provided sample sizes statistically sufficient to compare abundance of...
42 CFR 417.568 - Adequate financial records, statistical data, and cost finding.
Code of Federal Regulations, 2012 CFR
2012-10-01
… Health Maintenance Organizations, Competitive Medical Plans, and Health Care Prepayment Plans — Medicare Payment: Cost Basis — § 417.568 Adequate financial records, statistical data, and cost finding: requires practices that are widely accepted in the health care industry. (b) Provision of data. (1) The HMO or CMP must provide adequate cost and …
Chan, Robin F.; Shabalin, Andrey A.; Xie, Lin Y.; Adkins, Daniel E.; Zhao, Min; Turecki, Gustavo; Clark, Shaunna L.; Aberg, Karolina A.
2017-01-01
Abstract Methylome-wide association studies are typically performed using microarray technologies that only assay a very small fraction of the CG methylome and entirely miss two forms of methylation that are common in brain and likely of particular relevance for neuroscience and psychiatric disorders. The alternative is to use whole genome bisulfite (WGB) sequencing but this approach is not yet practically feasible with sample sizes required for adequate statistical power. We argue for revisiting methylation enrichment methods that, provided optimal protocols are used, enable comprehensive, adequately powered and cost-effective genome-wide investigations of the brain methylome. To support our claim we use data showing that enrichment methods approximate the sensitivity obtained with WGB methods and with slightly better specificity. However, this performance is achieved at <5% of the reagent costs. Furthermore, because many more samples can be sequenced simultaneously, projects can be completed about 15 times faster. Currently the only viable option available for comprehensive brain methylome studies, enrichment methods may be critical for moving the field forward. PMID:28334972
Strength and life criteria for corrugated fiberboard by three methods
Thomas J. Urbanik
1997-01-01
The conventional test method for determining the stacking life of corrugated containers at a fixed load level does not adequately predict a safe load when storage time is fixed. This study introduced multiple load levels and related the probability of time at failure to load. A statistical analysis of logarithm-of-time failure data varying with load level predicts the...
Quasi-Experimental Analysis: A Mixture of Methods and Judgment.
ERIC Educational Resources Information Center
Cordray, David S.
1986-01-01
The role of human judgment in the development and synthesis of evidence has not been adequately developed or acknowledged within quasi-experimental analysis. Corrective solutions need to confront the fact that causal analysis within complex environments will require a more active assessment that entails reasoning and statistical modeling.…
Sustaining School Achievement in California's Elementary Schools after State Monitoring
ERIC Educational Resources Information Center
McCabe, Molly
2010-01-01
This study examined the Academic Performance Index (API) and Adequate Yearly Progress (AYP) achievement trends between 2004 and 2006 of 58 California public elementary schools after exiting state monitoring and investigated practices for sustaining consistent achievement growth. Statistical methods were used to analyze statewide achievement trends…
Tan, Ming T; Liu, Jian-ping; Lao, Lixing
2012-08-01
Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while the between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences were discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.
NASA Technical Reports Server (NTRS)
Krantz, Timothy L.
2002-01-01
The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type 1 censoring. The software was verified by reproducing results published by others.
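As a hedged illustration of the approach described (a two-parameter Weibull fit by maximum likelihood with type 1 censoring), the sketch below uses simulated fatigue lives and scipy; it is not the NASA software itself, and all numerical values are assumptions.

```python
# Hedged sketch (not the NASA software): two-parameter Weibull fit by maximum
# likelihood for fatigue lives with type 1 (time-truncated) censoring.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
shape_true, scale_true, t_censor = 2.5, 1.0e7, 1.2e7     # assumed values for the demo
life = scale_true * rng.weibull(shape_true, size=30)     # simulated cycles to failure
time = np.minimum(life, t_censor)                        # observed time
failed = life <= t_censor                                 # False = suspended (censored)

def neg_log_lik(params):
    log_k, log_lam = params                               # optimize on log scale for positivity
    k, lam = np.exp(log_k), np.exp(log_lam)
    z = time / lam
    ll = np.sum(np.where(failed, np.log(k / lam) + (k - 1) * np.log(z), 0.0))
    ll -= np.sum(z ** k)                                  # survivor terms for all observations
    return -ll

fit = minimize(neg_log_lik, x0=[0.0, np.log(time.mean())], method="Nelder-Mead")
print("estimated shape, scale =", np.exp(fit.x))
```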
NASA Astrophysics Data System (ADS)
Wimmer, G.
2008-01-01
In this paper we introduce two confidence regions and two prediction regions for the statistical characterization of concentration measurements of product ions, in order to discriminate various groups of persons and thereby support better detection of primary lung cancer. Two MATLAB algorithms have been created to describe more adequately the concentration measurements of volatile organic compounds in human breath gas for potential detection of primary lung cancer and to evaluate the appropriate confidence and prediction regions.
Soares, Micaela A R; Andrade, Sandra R; Martins, Rui C; Quina, Margarida J; Quinta-Ferreira, Rosa M
2012-01-01
Composting is one of the technologies recommended for pre-treating industrial eggshells (ES) before its application in soils, for calcium recycling. However, due to the high inorganic content of ES, a mixture of biodegradable materials is required to assure a successful procedure. In this study, an adequate organic blend composition containing potato peel (PP), grass clippings (GC) and wheat straw (WS) was determined by applying the simplex-centroid mixture design method to achieve a desired moisture content, carbon: nitrogen ratio and free air space for effective composting of ES. A blend of 56% PP, 37% GC and 7% WS was selected and tested in a self heating reactor, where 10% (w/w) of ES was incorporated. After 29 days of reactor operation, a dry matter reduction of 46% was achieved and thermophilic temperatures were maintained during 15 days, indicating that the blend selected by statistical approach was adequate for composting of ES.
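For readers unfamiliar with the design, the short sketch below (an illustrative assumption, not the study's code) generates the blend proportions of a three-component simplex-centroid mixture design for the PP, GC and WS fractions.

```python
# Illustrative sketch (assumed, not the study's code): design points of a
# three-component simplex-centroid mixture design for potato peel (PP),
# grass clippings (GC) and wheat straw (WS) proportions.
from itertools import combinations

components = ["PP", "GC", "WS"]
q = len(components)
points = []
for r in range(1, q + 1):                       # subsets of size r share the blend equally
    for subset in combinations(range(q), r):
        points.append([1.0 / r if i in subset else 0.0 for i in range(q)])

for p in points:                                # 7 blends: 3 pure, 3 binary, 1 centroid
    print({c: round(x, 3) for c, x in zip(components, p)})
```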
Use of statistical and neural net approaches in predicting toxicity of chemicals.
Basak, S C; Grunwald, G D; Gute, B D; Balasubramanian, K; Opitz, D
2000-01-01
Hierarchical quantitative structure-activity relationships (H-QSAR) have been developed as a new approach in constructing models for estimating physicochemical, biomedicinal, and toxicological properties of interest. This approach uses increasingly more complex molecular descriptors in a graduated approach to model building. In this study, statistical and neural network methods have been applied to the development of H-QSAR models for estimating the acute aquatic toxicity (LC50) of 69 benzene derivatives to Pimephales promelas (fathead minnow). Topostructural, topochemical, geometrical, and quantum chemical indices were used as the four levels of the hierarchical method. It is clear from both the statistical and neural network models that topostructural indices alone cannot adequately model this set of congeneric chemicals. Not surprisingly, topochemical indices greatly increase the predictive power of both statistical and neural network models. Quantum chemical indices also add significantly to the modeling of this set of acute aquatic toxicity data.
Austin, Peter C; Schuster, Tibor; Platt, Robert W
2015-10-15
Estimating statistical power is an important component of the design of both randomized controlled trials (RCTs) and observational studies. Methods for estimating statistical power in RCTs have been well described and can be implemented simply. In observational studies, statistical methods must be used to remove the effects of confounding that can occur due to non-random treatment assignment. Inverse probability of treatment weighting (IPTW) using the propensity score is an attractive method for estimating the effects of treatment using observational data. However, sample size and power calculations have not been adequately described for these methods. We used an extensive series of Monte Carlo simulations to compare the statistical power of an IPTW analysis of an observational study with time-to-event outcomes with that of an analysis of a similarly-structured RCT. We examined the impact of four factors on the statistical power function: number of observed events, prevalence of treatment, the marginal hazard ratio, and the strength of the treatment-selection process. We found that, on average, an IPTW analysis had lower statistical power compared to an analysis of a similarly-structured RCT. The difference in statistical power increased as the magnitude of the treatment-selection model increased. The statistical power of an IPTW analysis tended to be lower than the statistical power of a similarly-structured RCT.
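A minimal sketch of the weighting step at the core of an IPTW analysis is given below; it only illustrates how inverse probability of treatment weights are formed from an estimated propensity score on simulated data, not the authors' full time-to-event power simulation.

```python
# Simplified sketch (simulated data, logistic propensity model assumed): the
# IPTW weight construction step discussed in the abstract, not the full
# time-to-event power study.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 5000
x = rng.normal(size=(n, 3))                                        # hypothetical confounders
p_treat = 1 / (1 + np.exp(-(0.4 * x[:, 0] - 0.3 * x[:, 1])))       # treatment-selection model
treated = rng.uniform(size=n) < p_treat

ps = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]   # estimated propensity score
weights = np.where(treated, 1.0 / ps, 1.0 / (1.0 - ps))            # inverse probability weights

print("effective sample size:", weights.sum() ** 2 / (weights ** 2).sum())
```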
NASA Technical Reports Server (NTRS)
Darzi, Michael; Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor)
1992-01-01
Methods for detecting and screening cloud contamination from satellite derived visible and infrared data are reviewed in this document. The methods are applicable to past, present, and future polar orbiting satellite radiometers. Such instruments include the Coastal Zone Color Scanner (CZCS), operational from 1978 through 1986; the Advanced Very High Resolution Radiometer (AVHRR); the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), scheduled for launch in August 1993; and the Moderate Resolution Imaging Spectrometer (MODIS). Constant threshold methods are the least demanding computationally and often provide adequate results. An improvement to these methods is to determine the thresholds dynamically by adjusting them according to the areal and temporal distributions of the surrounding pixels. Spatial coherence methods set thresholds based on the expected spatial variability of the data. Other statistically derived methods and various combinations of basic methods are also reviewed. The complexity of the methods is ultimately limited by the computing resources. Finally, some criteria for evaluating cloud screening methods are discussed.
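The difference between constant and dynamically determined thresholds can be illustrated with a toy example; the sketch below uses a simulated radiance field and arbitrary threshold choices, not actual CZCS or AVHRR data.

```python
# Hedged illustration (not from the document): a constant-threshold cloud mask
# versus a simple dynamic threshold derived from the scene's own pixel distribution.
import numpy as np

rng = np.random.default_rng(3)
radiance = rng.normal(10.0, 1.0, size=(200, 200))        # hypothetical clear-sky field
radiance[50:80, 60:120] += 8.0                           # bright patch standing in for cloud

constant_mask = radiance > 14.0                          # fixed, pre-chosen threshold

# Dynamic threshold: mean + 3 sigma of the (assumed mostly clear) surrounding scene,
# recomputed for each image or region rather than fixed in advance.
threshold = radiance.mean() + 3.0 * radiance.std()
dynamic_mask = radiance > threshold

print("cloudy pixels (constant):", constant_mask.sum())
print("cloudy pixels (dynamic): ", dynamic_mask.sum())
```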
Statistical Methods for Rapid Aerothermal Analysis and Design Technology: Validation
NASA Technical Reports Server (NTRS)
DePriest, Douglas; Morgan, Carolyn
2003-01-01
The cost and safety goals for NASA's next generation of reusable launch vehicle (RLV) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to identify adequate statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The initial research work focused on establishing suitable candidate models for these purposes. The second phase is focused on assessing the performance of these models to accurately predict the heat rate for a given candidate data set. This validation work compared models and methods that may be useful in predicting the heat rate.
NASA Astrophysics Data System (ADS)
Samanta, Gaurab; Beris, Antony; Handler, Robert; Housiadas, Kostas
2009-03-01
Karhunen-Loeve (KL) analysis of DNS data of viscoelastic turbulent channel flows helps us to reveal more information on the time-dependent dynamics of viscoelastic modification of turbulence [Samanta et al., J. Turbulence (in press), 2008]. A selected set of KL modes can be used for data reduction modeling of these flows. However, it is pertinent that verification be done against established DNS results. For this purpose, we compared velocity and conformation statistics and probability density functions (PDFs) of relevant quantities obtained from DNS and from fields reconstructed using selected KL modes and time-dependent coefficients. While the velocity statistics show good agreement between results from DNS and KL reconstructions even with just hundreds of KL modes, tens of thousands of KL modes are required to adequately capture the trace of polymer conformation resulting from DNS. New modifications to the KL method have therefore been attempted to account for the differences in conformation statistics. The applicability and impact of these new modified KL methods will be discussed from the perspective of data reduction modeling.
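The reconstruction step described above can be sketched with a generic snapshot matrix; the code below (random data standing in for DNS snapshots) extracts KL/POD modes via the SVD and rebuilds the field from a truncated mode set.

```python
# Minimal sketch (random data, not the DNS fields): Karhunen-Loeve / POD modes
# from snapshots via the SVD, and reconstruction from a truncated set of modes.
import numpy as np

rng = np.random.default_rng(4)
n_points, n_snapshots = 500, 100
snapshots = rng.normal(size=(n_points, n_snapshots))       # columns = flow-field snapshots

mean_field = snapshots.mean(axis=1, keepdims=True)
fluct = snapshots - mean_field
modes, sing_vals, coeffs = np.linalg.svd(fluct, full_matrices=False)

k = 10                                                     # number of retained KL modes
recon = mean_field + modes[:, :k] @ np.diag(sing_vals[:k]) @ coeffs[:k, :]

energy = (sing_vals[:k] ** 2).sum() / (sing_vals ** 2).sum()
print(f"{k} modes capture {energy:.1%} of the fluctuation energy")
print("reconstruction RMS error:", np.sqrt(((snapshots - recon) ** 2).mean()))
```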
Oil and Gas on Indian Reservations: Statistical Methods Help to Establish Value for Royalty Purposes
ERIC Educational Resources Information Center
Fowler, Mary S.; Kadane, Joseph B.
2006-01-01
Part of the history of oil and gas development on Indian reservations concerns potential underpayment of royalties due to under-valuation of production by oil companies. This paper discusses a model used by the Shoshone and Arapaho tribes in a lawsuit against the Federal government, claiming the Government failed to collect adequate royalties.…
Young, Robin L; Weinberg, Janice; Vieira, Verónica; Ozonoff, Al; Webster, Thomas F
2010-07-19
A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing logodds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. The GAM permutation testing methods provide a regression-based alternative to the spatial scan statistic. Across all hypotheses examined in this research, the GAM methods had competing or greater power estimates and sensitivities exceeding that of the spatial scan statistic.
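The permutation-testing idea can be illustrated independently of the GAM/LOESS machinery; the sketch below shuffles case-control labels to obtain the null distribution of a deliberately simple spatial statistic (mean case-to-source distance) on simulated point data, and is not the deviance-based statistic used in the study.

```python
# Simplified permutation-test sketch (not the study's GAM/LOESS statistic):
# case/control labels are shuffled to build the null distribution of a spatial
# statistic; here the statistic is the mean distance from cases to an assumed
# point source, purely for illustration.
import numpy as np

rng = np.random.default_rng(5)
n = 400
xy = rng.uniform(-1, 1, size=(n, 2))                        # residential locations
source = np.array([0.0, 0.0])                               # hypothetical exposure source
dist = np.linalg.norm(xy - source, axis=1)
p_case = 1 / (1 + np.exp(3.0 * (dist - 0.5)))               # risk declines with distance
case = rng.uniform(size=n) < p_case

def stat(labels):
    return dist[labels].mean()                              # mean case-to-source distance

observed = stat(case)
null = np.array([stat(rng.permutation(case)) for _ in range(999)])
p_value = (np.sum(null <= observed) + 1) / (null.size + 1)  # one-sided: cases closer than expected
print("permutation p-value:", p_value)
```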
Lifshits, A M
1979-01-01
A general characterization of multivariate statistical analysis (MSA) is given. Methodological premises and criteria for selecting an MSA method adequate for pathoanatomical investigations of the epidemiology of multicausal diseases are presented. The experience of using MSA with computers and standard computing programs in studies of coronary artery atherosclerosis, based on material from 2060 autopsies, is described. The combined use of four MSA methods (sequential, correlation, regression, and discriminant analysis) made it possible to quantify the contribution of each of the eight examined risk factors to the development of atherosclerosis. The most important factors were found to be age, arterial hypertension, and heredity. Occupational hypodynamia and increased fatness were more important in men, whereas diabetes mellitus was more important in women. Accounting for this combination of risk factors by MSA methods provides a more reliable prognosis of the likelihood of coronary heart disease with a fatal outcome than prognosis based on the degree of coronary atherosclerosis alone.
Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L
2018-01-01
Aims A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. Methods We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing assays (NGS). NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Results Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. Conclusions The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. PMID:28747393
Austin, Peter C
2007-11-01
I conducted a systematic review of the use of propensity score matching in the cardiovascular surgery literature. I examined the adequacy of reporting and whether appropriate statistical methods were used. I examined 60 articles published in the Annals of Thoracic Surgery, European Journal of Cardio-thoracic Surgery, Journal of Cardiovascular Surgery, and the Journal of Thoracic and Cardiovascular Surgery between January 1, 2004, and December 31, 2006. Thirty-one of the 60 studies did not provide adequate information on how the propensity score-matched pairs were formed. Eleven (18%) of the studies did not report on whether matching on the propensity score balanced baseline characteristics between treated and untreated subjects in the matched sample. No studies used appropriate methods to compare baseline characteristics between treated and untreated subjects in the propensity score-matched sample. Eight (13%) of the 60 studies explicitly used statistical methods appropriate for the analysis of matched data when estimating the effect of treatment on the outcomes. Two studies used appropriate methods for some outcomes, but not for all outcomes. Thirty-nine (65%) studies explicitly used statistical methods that were inappropriate for matched-pairs data when estimating the effect of treatment on outcomes. Eleven studies did not report the statistical tests that were used to assess the statistical significance of the treatment effect. Analysis of propensity score-matched samples tended to be poor in the cardiovascular surgery literature. Most statistical analyses ignored the matched nature of the sample. I provide suggestions for improving the reporting and analysis of studies that use propensity score matching.
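By way of contrast with the inappropriate unpaired analyses noted above, the sketch below (simulated matched pairs, not data from the reviewed studies) shows tests that do respect the matched structure: a paired t-test for a continuous outcome and an exact McNemar-type test on discordant pairs for a binary outcome.

```python
# Illustrative sketch (simulated data): analyses appropriate for propensity-score-
# matched pairs, i.e. paired rather than independent-sample tests.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_pairs = 150
treated_outcome = rng.normal(1.2, 1.0, size=n_pairs)         # continuous outcome, treated member
control_outcome = rng.normal(1.0, 1.0, size=n_pairs)         # matched control member

# Continuous outcome: paired t-test (not the two-sample t-test)
t_stat, p_paired = stats.ttest_rel(treated_outcome, control_outcome)

# Binary outcome: McNemar-style exact binomial test on discordant pairs
treated_event = rng.uniform(size=n_pairs) < 0.30
control_event = rng.uniform(size=n_pairs) < 0.20
b = np.sum(treated_event & ~control_event)                   # discordant: treated yes / control no
c = np.sum(~treated_event & control_event)                   # discordant: treated no / control yes
p_mcnemar = stats.binomtest(int(b), int(b + c), p=0.5).pvalue

print(f"paired t-test p = {p_paired:.3f}, McNemar (exact) p = {p_mcnemar:.3f}")
```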
Communicating Patient Status: Comparison of Teaching Strategies in Prelicensure Nursing Education.
Lanz, Amelia S; Wood, Felecia G
Research indicates that nurses lack adequate preparation for reporting patient status. This study compared 2 instructional methods focused on patient status reporting in the clinical setting using a randomized posttest-only comparison group design. Reporting performance using a standardized communication framework and student perceptions of satisfaction and confidence with learning were measured in a simulated event that followed the instruction. Between the instructional methods, there was no statistical difference in student reporting performance or perceptions of learning. Performance evaluations provided helpful insights for the nurse educator.
DISTMIX: direct imputation of summary statistics for unmeasured SNPs from mixed ethnicity cohorts.
Lee, Donghyung; Bigdeli, T Bernard; Williamson, Vernell S; Vladimirov, Vladimir I; Riley, Brien P; Fanous, Ayman H; Bacanu, Silviu-Alin
2015-10-01
To increase the signal resolution for large-scale meta-analyses of genome-wide association studies, genotypes at unmeasured single nucleotide polymorphisms (SNPs) are commonly imputed using large multi-ethnic reference panels. However, the ever-increasing size and ethnic diversity of both reference panels and cohorts makes genotype imputation computationally challenging for moderately sized computer clusters. Moreover, genotype imputation requires subject-level genetic data, which, unlike the summary statistics provided by virtually all studies, are not publicly available. While there are much less demanding methods that avoid the genotype imputation step by directly imputing SNP statistics, e.g. Directly Imputing summary STatistics (DIST) proposed by our group, their implicit assumptions make them applicable only to ethnically homogeneous cohorts. To decrease computational and access requirements for the analysis of cosmopolitan cohorts, we propose DISTMIX, which extends DIST capabilities to the analysis of mixed ethnicity cohorts. The method uses a relevant reference panel to directly impute unmeasured SNP statistics based only on statistics at measured SNPs and estimated/user-specified ethnic proportions. Simulations show that the proposed method adequately controls the Type I error rates. The 1000 Genomes panel imputation of summary statistics from the ethnically diverse Psychiatric Genetic Consortium Schizophrenia Phase 2 suggests that, when compared to genotype imputation methods, DISTMIX offers comparable imputation accuracy for only a fraction of computational resources. DISTMIX software, its reference population data, and usage examples are publicly available at http://code.google.com/p/distmix. Contact: dlee4@vcu.edu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
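Conceptually, direct imputation of summary statistics rests on the conditional expectation of an unmeasured SNP's z-score given measured z-scores under a multivariate normal model with a reference-panel LD matrix; the sketch below illustrates only that core idea with made-up correlations and z-scores, and omits the ethnic-mixture and regularization machinery that DISTMIX adds.

```python
# Conceptual sketch only (not the DISTMIX implementation): the conditional
# expectation of an unmeasured SNP's z-score given measured z-scores, using an
# assumed reference-panel correlation (LD) matrix.
import numpy as np

# Assumed LD among 3 measured SNPs and between them and 1 unmeasured SNP
r_tt = np.array([[1.00, 0.60, 0.30],
                 [0.60, 1.00, 0.45],
                 [0.30, 0.45, 1.00]])        # measured-measured correlations
r_ut = np.array([0.70, 0.55, 0.25])          # unmeasured-measured correlations
z_typed = np.array([3.1, 2.4, 1.2])          # observed association z-scores (made up)

weights = r_ut @ np.linalg.inv(r_tt)
z_imputed = weights @ z_typed
info = weights @ r_ut                        # imputation quality, analogous to an r^2 measure
print(f"imputed z = {z_imputed:.2f}, info = {info:.2f}")
```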
Egorov, A D; Stepantsov, V I; Nosovskiĭ, A M; Shipov, A A
2009-01-01
Cluster analysis was applied to evaluate locomotion training (running and running intermingled with walking) of 13 cosmonauts on long-term ISS missions by the parameters of duration (min), distance (m) and intensity (km/h). Based on the results of analyses, the cosmonauts were distributed into three steady groups of 2, 5 and 6 persons. Distance and speed showed a statistical rise (p < 0.03) from group 1 to group 3. Duration of physical locomotion training was not statistically different in the groups (p = 0.125). Therefore, cluster analysis is an adequate method of evaluating fitness of cosmonauts on long-term missions.
An instrument to assess the statistical intensity of medical research papers.
Nieminen, Pentti; Virtanen, Jorma I; Vähänikkilä, Hannu
2017-01-01
There is widespread evidence that statistical methods play an important role in original research articles, especially in medical research. The evaluation of statistical methods and reporting in journals suffers from a lack of standardized methods for assessing the use of statistics. The objective of this study was to develop and evaluate an instrument to assess the statistical intensity in research articles in a standardized way. A checklist-type measure scale was developed by selecting and refining items from previous reports about the statistical contents of medical journal articles and from published guidelines for statistical reporting. A total of 840 original medical research articles that were published between 2007-2015 in 16 journals were evaluated to test the scoring instrument. The total sum of all items was used to assess the intensity between sub-fields and journals. Inter-rater agreement was examined using a random sample of 40 articles. Four raters read and evaluated the selected articles using the developed instrument. The scale consisted of 66 items. The total summary score adequately discriminated between research articles according to their study design characteristics. The new instrument could also discriminate between journals according to their statistical intensity. The inter-observer agreement measured by the ICC was 0.88 between all four raters. Individual item analysis showed very high agreement between the rater pairs, the percentage agreement ranged from 91.7% to 95.2%. A reliable and applicable instrument for evaluating the statistical intensity in research papers was developed. It is a helpful tool for comparing the statistical intensity between sub-fields and journals. The novel instrument may be applied in manuscript peer review to identify papers in need of additional statistical review.
Assessing groundwater vulnerability to agrichemical contamination in the Midwest US
Burkart, M.R.; Kolpin, D.W.; James, D.E.
1999-01-01
Agrichemicals (herbicides and nitrate) are significant sources of diffuse pollution to groundwater. Indirect methods are needed to assess the potential for groundwater contamination by diffuse sources because groundwater monitoring is too costly to adequately define the geographic extent of contamination at a regional or national scale. This paper presents examples of the application of statistical, overlay and index, and process-based modeling methods for groundwater vulnerability assessments to a variety of data from the Midwest U.S. The principles for vulnerability assessment include both intrinsic (pedologic, climatologic, and hydrogeologic factors) and specific (contaminant and other anthropogenic factors) vulnerability of a location. Statistical methods use the frequency of contaminant occurrence, contaminant concentration, or contamination probability as a response variable. Statistical assessments are useful for defining the relations among explanatory and response variables whether they define intrinsic or specific vulnerability. Multivariate statistical analyses are useful for ranking variables critical to estimating water quality responses of interest. Overlay and index methods involve intersecting maps of intrinsic and specific vulnerability properties and indexing the variables by applying appropriate weights. Deterministic models use process-based equations to simulate contaminant transport and are distinguished from the other methods in their potential to predict contaminant transport in both space and time. An example of a one-dimensional leaching model linked to a geographic information system (GIS) to define a regional metamodel for contamination in the Midwest is included.
Austin, Peter C
2008-09-01
Propensity-score matching is frequently used in the cardiology literature. Recent systematic reviews have found that this method is, in general, poorly implemented in the medical literature. The study objective was to examine the quality of the implementation of propensity-score matching in the general cardiology literature. A total of 44 articles published in the American Heart Journal, the American Journal of Cardiology, Circulation, the European Heart Journal, Heart, the International Journal of Cardiology, and the Journal of the American College of Cardiology between January 1, 2004, and December 31, 2006, were examined. Twenty of the 44 studies did not provide adequate information on how the propensity-score-matched pairs were formed. Fourteen studies did not report whether matching on the propensity score balanced baseline characteristics between treated and untreated subjects in the matched sample. Only 4 studies explicitly used statistical methods appropriate for matched studies to compare baseline characteristics between treated and untreated subjects. Only 11 (25%) of the 44 studies explicitly used statistical methods appropriate for the analysis of matched data when estimating the effect of treatment on the outcomes. Only 2 studies described the matching method used, assessed balance in baseline covariates by appropriate methods, and used appropriate statistical methods to estimate the treatment effect and its significance. Application of propensity-score matching was poor in the cardiology literature. Suggestions for improving the reporting and analysis of studies that use propensity-score matching are provided.
Heskes, Tom; Eisinga, Rob; Breitling, Rainer
2014-11-21
The rank product method is a powerful statistical technique for identifying differentially expressed molecules in replicated experiments. A critical issue in molecule selection is accurate calculation of the p-value of the rank product statistic to adequately address multiple testing. Exact calculation and both permutation and gamma approximations have been proposed to determine molecule-level significance. These current approaches have serious drawbacks as they are either computationally burdensome or provide inaccurate estimates in the tail of the p-value distribution. We derive strict lower and upper bounds to the exact p-value along with an accurate approximation that can be used to assess the significance of the rank product statistic in a computationally fast manner. The bounds and the proposed approximation are shown to provide far better accuracy over existing approximate methods in determining tail probabilities, with the slightly conservative upper bound protecting against false positives. We illustrate the proposed method in the context of a recently published analysis on transcriptomic profiling performed in blood. We provide a method to determine upper bounds and accurate approximate p-values of the rank product statistic. The proposed algorithm provides an order of magnitude increase in throughput as compared with current approaches and offers the opportunity to explore new application domains with even larger multiple testing issues. The R code is published in one of the Additional files and is available at http://www.ru.nl/publish/pages/726696/rankprodbounds.zip.
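For orientation, the rank product statistic itself is simply the geometric mean of a molecule's ranks across replicates; the sketch below computes it on simulated data and estimates a per-molecule p-value by brute-force Monte Carlo, i.e. the slow baseline that the bounds and approximation in the paper are designed to replace.

```python
# Hedged sketch (not the authors' bounds or approximation): the rank product
# statistic and a plain Monte Carlo estimate of its per-molecule p-value.
import numpy as np

rng = np.random.default_rng(8)
n_genes, n_reps = 1000, 4
expr = rng.normal(size=(n_genes, n_reps))
expr[0] += 1.5                                               # one hypothetical up-regulated gene

# Rank 1 = strongest up-regulation within each replicate
ranks = (-expr).argsort(axis=0).argsort(axis=0) + 1
rank_product = np.exp(np.log(ranks).mean(axis=1))            # geometric mean of ranks

# Null distribution for a single gene: independent uniform ranks in each replicate
n_draws = 100000
null_rp = np.exp(np.log(rng.integers(1, n_genes + 1, size=(n_draws, n_reps))).mean(axis=1))
p_gene0 = (np.sum(null_rp <= rank_product[0]) + 1) / (n_draws + 1)

print("gene 0 rank product:", round(rank_product[0], 2), " Monte Carlo p:", p_gene0)
```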
Li, Huanjie; Nickerson, Lisa D; Nichols, Thomas E; Gao, Jia-Hong
2017-03-01
Two powerful methods for statistical inference on MRI brain images have been proposed recently: a non-stationary voxelation-corrected cluster-size test (CST) based on random field theory, and threshold-free cluster enhancement (TFCE), which calculates the level of local support for a cluster and then uses permutation testing for inference. Unlike other statistical approaches, these two methods do not rest on the assumptions of a uniform and high degree of spatial smoothness of the statistic image. Thus, they are strongly recommended for group-level fMRI analysis compared to other statistical methods. In this work, the non-stationary voxelation-corrected CST and TFCE methods for group-level analysis were evaluated for both stationary and non-stationary images under varying smoothness levels, degrees of freedom and signal to noise ratios. Our results suggest that both methods provide adequate control for the number of voxel-wise statistical tests being performed during inference on fMRI data, and both are superior to current CSTs implemented in popular MRI data analysis software packages. However, TFCE is more sensitive and stable for group-level analysis of VBM data. Thus, the voxelation-corrected CST approach may confer some advantages by being computationally less demanding for fMRI data analysis than TFCE with permutation testing and by also being applicable for single-subject fMRI analyses, while the TFCE approach is advantageous for VBM data. Hum Brain Mapp 38:1269-1280, 2017. © 2016 Wiley Periodicals, Inc.
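A one-dimensional toy version of TFCE conveys the idea of integrating cluster extent over thresholds; the sketch below assumes the standard exponents (E = 0.5, H = 2) and a small made-up statistic map, whereas real applications operate on 3-D images and pair the enhanced map with permutation inference.

```python
# Simplified 1-D sketch of threshold-free cluster enhancement (TFCE); the standard
# exponents E = 0.5 and H = 2 are assumed, and real use is on 3-D statistic images.
import numpy as np

def tfce_1d(stat, dh=0.1, E=0.5, H=2.0):
    out = np.zeros_like(stat, dtype=float)
    for h in np.arange(dh, stat.max() + dh, dh):
        above = (stat >= h).astype(int)
        # boundaries of contiguous runs (clusters) above the current threshold
        edges = np.flatnonzero(np.diff(np.concatenate(([0], above, [0]))))
        for start, stop in zip(edges[::2], edges[1::2]):
            extent = stop - start
            out[start:stop] += (extent ** E) * (h ** H) * dh
    return out

t_map = np.array([0.2, 0.5, 2.8, 3.1, 2.9, 0.4, 0.1, 1.9, 2.2, 0.3])
print(np.round(tfce_1d(t_map), 2))
```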
Guler, Hasan; Kilic, Ugur
2018-03-01
Weaning is important for patients and for clinicians, who have to determine the correct weaning time so that patients do not become dependent on the ventilator. Several predictors have already been developed, such as the rapid shallow breathing index (RSBI), the pressure time index (PTI), and the Jabour weaning index. Many important dimensions of weaning are sometimes ignored by these predictors. This study is an attempt to develop a knowledge-based weaning process via fuzzy logic that eliminates the disadvantages of the present predictors. Sixteen vital parameters listed in the published literature were used to determine the weaning decisions in the developed system. Because too many individual parameters would otherwise have to be considered separately, related parameters were grouped together to characterize acid-base balance, adequate oxygenation, adequate pulmonary function, hemodynamic stability, and the psychological status of the patients. To test the performance of the developed algorithm, 20 clinical scenarios were generated using Monte Carlo simulations and the Gaussian distribution method. The developed knowledge-based algorithm and the RSBI predictor were applied to the generated scenarios. Finally, a clinician evaluated each clinical scenario independently. Student's t test was used to assess the statistical differences between the developed weaning algorithm, RSBI, and the clinician's evaluation. According to the results obtained, there were no statistical differences between the proposed methods and the clinician evaluations.
ERIC Educational Resources Information Center
Hilton, Sterling C.; Schau, Candace; Olsen, Joseph A.
2004-01-01
In addition to student learning, positive student attitudes have become an important course outcome for many introductory statistics instructors. To adequately assess changes in mean attitudes across introductory statistics courses, the attitude instruments used should be invariant by administration time. Attitudes toward statistics from 4,910…
NASA Astrophysics Data System (ADS)
Vinh, T.
1980-08-01
There is a need for better and more effective lightning protection for transmission and switching substations. In the past, a number of empirical methods were used to design systems to protect substations and transmission lines from direct lightning strokes. Convenient analytical lightning models adequate for engineering use are needed. In this study, analytical lightning models were developed, along with a method for improved analysis of the physical properties of lightning through their use. This method of analysis is based upon the most recent statistical field data. The result is an improved method for predicting the occurrence of shielding failure and for designing more effective protection of high and extra high voltage substations from direct strokes.
Schäffer, Beat; Pieren, Reto; Mendolia, Franco; Basner, Mathias; Brink, Mark
2017-05-01
Noise exposure-response relationships are used to estimate the effects of noise on individuals or a population. Such relationships may be derived from independent or repeated binary observations, and modeled by different statistical methods. Depending on the method by which they were established, their application in population risk assessment or estimation of individual responses may yield different results, i.e., predict "weaker" or "stronger" effects. As far as the present body of literature on noise effect studies is concerned, however, the underlying statistical methodology used to establish exposure-response relationships has not always been paid sufficient attention. This paper gives an overview of two statistical approaches (subject-specific and population-averaged logistic regression analysis) for establishing noise exposure-response relationships from repeated binary observations, and of their appropriate applications. The considerations are illustrated with data from three noise effect studies, also estimating the magnitude of differences in results when applying exposure-response relationships derived from the two statistical approaches. Depending on the underlying data set and the probability range of the binary variable it covers, the two approaches yield similar to very different results. The adequate choice of a specific statistical approach and its application in subsequent studies, both depending on the research question, are therefore crucial.
Interpreting carnivore scent-station surveys
Sargeant, G.A.; Johnson, D.H.; Berg, W.E.
1998-01-01
The scent-station survey method has been widely used to estimate trends in carnivore abundance. However, statistical properties of scent-station data are poorly understood, and the relation between scent-station indices and carnivore abundance has not been adequately evaluated. We assessed properties of scent-station indices by analyzing data collected in Minnesota during 1986-93. Visits to stations separated by <2 km were correlated for all species because individual carnivores sometimes visited several stations in succession. Thus, visits to stations had an intractable statistical distribution. Dichotomizing results for lines of 10 stations (0 or ≥1 visits) produced binomially distributed data that were robust to multiple visits by individuals. We abandoned 2-way comparisons among years in favor of tests for population trend, which are less susceptible to bias, and analyzed results separately for biogeographic sections of Minnesota because trends differed among sections. Before drawing inferences about carnivore population trends, we reevaluated published validation experiments. Results implicated low statistical power and confounding as possible explanations for equivocal or conflicting results of validation efforts. Long-term trends in visitation rates probably reflect real changes in populations, but poor spatial and temporal resolution, susceptibility to confounding, and low statistical power limit the usefulness of this survey method.
How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?
West, Brady T; Sakshaug, Joseph W; Aurelien, Guy Alain S
2016-01-01
Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data.
Jackson, Dan; Kirkbride, James; Croudace, Tim; Morgan, Craig; Boydell, Jane; Errazuriz, Antonia; Murray, Robin M; Jones, Peter B
2013-03-01
A recent systematic review and meta-analysis of the incidence and prevalence of schizophrenia and other psychoses in England investigated the variation in the rates of psychotic disorders. However, some of the questions of interest, and the data collected to answer these, could not be adequately addressed using established meta-analysis techniques. We developed a novel statistical method, which makes combined use of fractional polynomials and meta-regression. This was used to quantify the evidence of gender differences and a secondary peak onset in women, where the outcome of interest is the incidence of schizophrenia. Statistically significant and epidemiologically important effects were obtained using our methods. Our analysis is based on data from four studies that provide 50 incidence rates, stratified by age and gender. We describe several variations of our method, in particular those that might be used where more data is available, and provide guidance for assessing the model fit. Copyright © 2013 John Wiley & Sons, Ltd.
5 CFR 297.401 - Conditions of disclosure.
Code of Federal Regulations, 2010 CFR
2010-01-01
... with advance adequate written assurance that the record will be used solely as a statistical research... records; and (ii) Certification that the records will be used only for statistical purposes. (2) These... information from records released for statistical purposes, the system manager will reasonably ensure that the...
Chou, C P; Bentler, P M; Satorra, A
1991-11-01
Research studying robustness of maximum likelihood (ML) statistics in covariance structure analysis has concluded that test statistics and standard errors are biased under severe non-normality. An estimation procedure known as asymptotic distribution free (ADF), making no distributional assumption, has been suggested to avoid these biases. Corrections to the normal theory statistics to yield more adequate performance have also been proposed. This study compares the performance of a scaled test statistic and robust standard errors for two models under several non-normal conditions and also compares these with the results from ML and ADF methods. Both ML and ADF test statistics performed rather well in one model and considerably worse in the other. In general, the scaled test statistic seemed to behave better than the ML test statistic and the ADF statistic performed the worst. The robust and ADF standard errors yielded more appropriate estimates of sampling variability than the ML standard errors, which were usually downward biased, in both models under most of the non-normal conditions. ML test statistics and standard errors were found to be quite robust to the violation of the normality assumption when data had either symmetric and platykurtic distributions, or non-symmetric and zero kurtotic distributions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dierauf, Timothy; Kurtz, Sarah; Riley, Evan
This paper provides a recommended method for evaluating the AC capacity of a photovoltaic (PV) generating station. It also presents companion guidance on setting the facility's capacity guarantee value. This is a principles-based approach that incorporates fundamental plant design parameters such as loss factors, module coefficients, and inverter constraints. This method has been used to prove contract guarantees for over 700 MW of installed projects. The method is transparent, and the results are deterministic. In contrast, current industry practices incorporate statistical regression, where the empirical coefficients may only characterize the collected data. Though these methods may work well when extrapolation is not required, there are other situations where the empirical coefficients may not adequately model actual performance. This proposed Fundamentals Approach method provides consistent results even where regression methods start to lose fidelity.
Frazier, Melanie; Miller, A. Whitman; Lee, Henry; Reusser, Deborah A.
2013-01-01
Discharge from the ballast tanks of ships is one of the primary vectors of nonindigenous species in marine environments. To mitigate this environmental and economic threat, international, national, and state entities are establishing regulations to limit the concentration of living organisms that may be discharged from the ballast tanks of ships. The proposed discharge standards have ranged from zero detectable organisms to 3. If standard sampling methods are used, verifying whether ballast discharge complies with these stringent standards will be challenging due to the inherent stochasticity of sampling. Furthermore, at low concentrations, very large volumes of water must be sampled to find enough organisms to accurately estimate concentration. Despite these challenges, adequate sampling protocols comprise a critical aspect of establishing standards because they help define the actual risk level associated with a standard. A standard that appears very stringent may be effectively lax if it is paired with an inadequate sampling protocol. We describe some of the statistical issues associated with sampling at low concentrations to help regulators understand the uncertainties of sampling as well as to inform the development of sampling protocols that ensure discharge standards are adequately implemented.
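The sampling stochasticity discussed above can be made concrete with a small Poisson sketch: assuming randomly dispersed organisms, the probability that a sample of a given volume contains no organisms, and the volume needed to keep that probability low, follow directly (the concentrations below are illustrative, not regulatory values).

```python
# Sketch (assumes organisms are randomly dispersed, i.e., Poisson counts):
# probability that a sample of volume v contains zero organisms when the true
# concentration is c organisms per cubic metre, and the volume needed to keep
# that "false pass" probability below 5%.
from scipy.stats import poisson
import numpy as np

def p_zero(conc_per_m3: float, volume_m3: float) -> float:
    """P(count = 0) for a Poisson sample of the given volume."""
    return poisson.pmf(0, mu=conc_per_m3 * volume_m3)

def volume_for_detection(conc_per_m3: float, alpha: float = 0.05) -> float:
    """Smallest volume giving P(count = 0) <= alpha."""
    return -np.log(alpha) / conc_per_m3

for c in (10.0, 1.0, 0.1):   # organisms per cubic metre
    print(c, p_zero(c, volume_m3=1.0), volume_for_detection(c))
```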
[Hungarian health resource allocation from the viewpoint of the English methodology].
Fadgyas-Freyler, Petra
2018-02-01
This paper describes both the English health resource allocation methodology and an attempt at its Hungarian adaptation. We describe calculations for a Hungarian regression model using the English 'weighted capitation formula'. The model proved statistically sound. New independent variables and homogeneous regional units still need to be identified for Hungary. The English method can be used, provided adequate variables are available. Hungarian patient-level health data can support a much more sophisticated model, and further research is needed. Orv Hetil. 2018; 159(5): 183-191.
Hanifi, S.M.A.; Roy, Nikhil; Streatfield, P. Kim
2007-01-01
This paper compared the performance of the lot quality assurance sampling (LQAS) method in identifying inadequately-performing health work-areas with that of using health and demographic surveillance system (HDSS) data and examined the feasibility of applying the method by field-level programme supervisors. The study was carried out in Matlab, the field site of ICDDR,B, where a HDSS has been in place for over 30 years. The LQAS method was applied in 57 work-areas of community health workers in ICDDR,B-served areas in Matlab during July-September 2002. The performance of the LQAS method in identifying work-areas with adequate and inadequate coverage of various health services was compared with those of the HDSS. The health service-coverage indicators included coverage of DPT, measles, BCG vaccination, and contraceptive use. It was observed that the difference in the proportion of work-areas identified to be inadequately performing using the LQAS method with less than 30 respondents, and the HDSS was not statistically significant. The consistency between the LQAS method and the HDSS in identifying work-areas was greater for adequately-performing areas than inadequately-performing areas. It was also observed that the field managers could be trained to apply the LQAS method in monitoring their performance in reaching the target population. PMID:17615902
Bhuiya, Abbas; Hanifi, S M A; Roy, Nikhil; Streatfield, P Kim
2007-03-01
This paper compared the performance of the lot quality assurance sampling (LQAS) method in identifying inadequately-performing health work-areas with that of using health and demographic surveillance system (HDSS) data and examined the feasibility of applying the method by field-level programme supervisors. The study was carried out in Matlab, the field site of ICDDR,B, where a HDSS has been in place for over 30 years. The LQAS method was applied in 57 work-areas of community health workers in ICDDR,B-served areas in Matlab during July-September 2002. The performance of the LQAS method in identifying work-areas with adequate and inadequate coverage of various health services was compared with those of the HDSS. The health service-coverage indicators included coverage of DPT, measles, BCG vaccination, and contraceptive use. It was observed that the difference in the proportion of work-areas identified to be inadequately performing using the LQAS method with less than 30 respondents, and the HDSS was not statistically significant. The consistency between the LQAS method and the HDSS in identifying work-areas was greater for adequately-performing areas than inadequately-performing areas. It was also observed that the field managers could be trained to apply the LQAS method in monitoring their performance in reaching the target population.
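As an illustration of the LQAS logic described above, the following sketch computes the classification risks of a lot-sampling rule from the binomial distribution; the sample size, decision rule, and coverage benchmarks are hypothetical and are not those used in the Matlab study.

```python
# Illustrative LQAS sketch (thresholds are hypothetical): sample n individuals
# per work-area and flag the area as "inadequate" if the number of covered
# individuals is at or below a decision value d.  The binomial distribution
# gives the classification errors for chosen coverage benchmarks.
from scipy.stats import binom

n, d = 19, 12                  # sample size and decision rule (hypothetical)
p_upper, p_lower = 0.80, 0.50  # acceptable vs unacceptable coverage benchmarks

# Risk of flagging a truly adequate area (provider risk)
alpha = binom.cdf(d, n, p_upper)
# Risk of passing a truly inadequate area (consumer risk)
beta = 1 - binom.cdf(d, n, p_lower)
print(f"alpha = {alpha:.3f}, beta = {beta:.3f}")
```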
ERIC Educational Resources Information Center
Bargagliotti, Anna E.
2012-01-01
Statistics and probability have become an integral part of mathematics education. Therefore it is important to understand whether curricular materials adequately represent statistical ideas. The "Guidelines for Assessment and Instruction in Statistics Education" (GAISE) report (Franklin, Kader, Mewborn, Moreno, Peck, Perry, & Scheaffer, 2007),…
NASA Astrophysics Data System (ADS)
Ben Torkia, Yosra; Ben Yahia, Manel; Khalfaoui, Mohamed; Al-Muhtaseb, Shaheen A.; Ben Lamine, Abdelmottaleb
2014-01-01
The adsorption energy distribution (AED) function of a commercial activated carbon (BDH-activated carbon) was investigated. For this purpose, the integral equation is derived by using a purely analytical statistical physics treatment. The description of the heterogeneity of the adsorbent is significantly clarified by defining the parameter N(E). This parameter represents the energetic density of the spatial density of the effectively occupied sites. To solve the integral equation, a numerical method was used based on an adequate algorithm. The Langmuir model was adopted as a local adsorption isotherm. This model is developed by using the grand canonical ensemble, which allows defining the physico-chemical parameters involved in the adsorption process. The AED function is estimated by a normal Gaussian function. This method is applied to the adsorption isotherms of nitrogen, methane and ethane at different temperatures. The development of the AED using a statistical physics treatment provides an explanation of the gas molecules behaviour during the adsorption process and gives new physical interpretations at microscopic levels.
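The integral-equation formulation with a Langmuir local isotherm that the abstract describes is conventionally written as below; the notation is generic and may differ from the authors'.

```latex
% Generic form of the adsorption integral equation with a Langmuir kernel;
% notation is illustrative, not necessarily the authors'.
\[
  Q(p,T) \;=\; \int_{E_{\min}}^{E_{\max}} N(E)\,\theta(p,T;E)\,\mathrm{d}E,
  \qquad
  \theta(p,T;E) \;=\; \frac{1}{1 + \dfrac{p_{0}(T)}{p}\,e^{-E/k_{B}T}} .
\]
% Q(p,T): measured adsorbed quantity; N(E): adsorption energy distribution
% (approximated by a Gaussian in the paper); \theta: Langmuir local isotherm.
```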
Baqué, Michèle; Amendt, Jens
2013-01-01
Developmental data of juvenile blow flies (Diptera: Calliphoridae) are typically used to calculate the age of immature stages found on or around a corpse and thus to estimate a minimum post-mortem interval (PMI(min)). However, many of those data sets don't take into account that immature blow flies grow in a non-linear fashion. Linear models do not supply a sufficient reliability on age estimates and may even lead to an erroneous determination of the PMI(min). According to the Daubert standard and the need for improvements in forensic science, new statistic tools like smoothing methods and mixed models allow the modelling of non-linear relationships and expand the field of statistical analyses. The present study introduces into the background and application of these statistical techniques by analysing a model which describes the development of the forensically important blow fly Calliphora vicina at different temperatures. The comparison of three statistical methods (linear regression, generalised additive modelling and generalised additive mixed modelling) clearly demonstrates that only the latter provided regression parameters that reflect the data adequately. We focus explicitly on both the exploration of the data--to assure their quality and to show the importance of checking it carefully prior to conducting the statistical tests--and the validation of the resulting models. Hence, we present a common method for evaluating and testing forensic entomological data sets by using for the first time generalised additive mixed models.
Proper Image Subtraction—Optimal Transient Detection, Photometry, and Hypothesis Testing
NASA Astrophysics Data System (ADS)
Zackay, Barak; Ofek, Eran O.; Gal-Yam, Avishay
2016-10-01
Transient detection and flux measurement via image subtraction stand at the base of time domain astronomy. Due to the varying seeing conditions, the image subtraction process is non-trivial, and existing solutions suffer from a variety of problems. Starting from basic statistical principles, we develop the optimal statistic for transient detection, flux measurement, and any image-difference hypothesis testing. We derive a closed-form statistic that: (1) is mathematically proven to be the optimal transient detection statistic in the limit of background-dominated noise, (2) is numerically stable, (3) for accurately registered, adequately sampled images, does not leave subtraction or deconvolution artifacts, (4) allows automatic transient detection to the theoretical sensitivity limit by providing credible detection significance, (5) has uncorrelated white noise, (6) is a sufficient statistic for any further statistical test on the difference image, and, in particular, allows us to distinguish particle hits and other image artifacts from real transients, (7) is symmetric to the exchange of the new and reference images, (8) is at least an order of magnitude faster to compute than some popular methods, and (9) is straightforward to implement. Furthermore, we present extensions of this method that make it resilient to registration errors, color-refraction errors, and any noise source that can be modeled. In addition, we show that the optimal way to prepare a reference image is the proper image coaddition presented in Zackay & Ofek. We demonstrate this method on simulated data and real observations from the PTF data release 2. We provide an implementation of this algorithm in MATLAB and Python.
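A rough sketch of the kind of closed-form, Fourier-space difference statistic described above, following the published formulae as commonly quoted (flux zero-points set to unity, perfect registration, and background-dominated Gaussian noise are assumed); it is not a substitute for the authors' MATLAB/Python implementation.

```python
# Rough sketch of a Fourier-space "proper" image difference of the kind the
# paper derives (zero-points F_r = F_n = 1, perfect registration, and
# background-dominated Gaussian noise assumed).
import numpy as np

def proper_difference(N, R, Pn, Pr, sigma_n, sigma_r):
    """N, R: new/reference images; Pn, Pr: their PSFs (same shape, centered);
    sigma_n, sigma_r: background noise standard deviations."""
    N_hat, R_hat = np.fft.fft2(N), np.fft.fft2(R)
    Pn_hat = np.fft.fft2(np.fft.ifftshift(Pn))
    Pr_hat = np.fft.fft2(np.fft.ifftshift(Pr))
    denom = np.sqrt(sigma_n**2 * np.abs(Pr_hat)**2 +
                    sigma_r**2 * np.abs(Pn_hat)**2)
    D_hat = (Pr_hat * N_hat - Pn_hat * R_hat) / denom
    return np.real(np.fft.ifft2(D_hat))   # difference image with ~white noise
```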
An issue of literacy on pediatric arterial hypertension
NASA Astrophysics Data System (ADS)
Teodoro, M. Filomena; Romana, Andreia; Simão, Carla
2017-11-01
Arterial hypertension in pediatric age is a public health problem whose prevalence has increased significantly over time. Pediatric arterial hypertension (PAH) is under-diagnosed in most cases; it is a highly prevalent disease that appears without warning and has multiple consequences for children's health and for the adults they will become. Children's caregivers and close family must know that PAH exists, know the negative consequences associated with it and its risk factors, and, finally, practice prevention. A statistical analysis of a simpler questionnaire, introduced in [4] as part of a preliminary study of caregivers' familiarity with PAH, can be found in [12, 13]; a continuation of that analysis is detailed in [14]. An extended questionnaire was built, applied to a distinct population, and completed online. The statistical approach is partially reproduced in the present work. Several statistical models were estimated using different approaches, namely multivariate analysis (factor analysis) and other methods adequate for the kind of data under study.
Harris, Keith M; Thandrayen, Joanne; Samphoas, Chien; Se, Pros; Lewchalermwongse, Boontriga; Ratanashevorn, Rattanakorn; Perry, Megan L; Britts, Choloe
2016-04-01
This study tested a low-cost method for estimating suicide rates in developing nations that lack adequate statistics. Data comprised reported suicides from Cambodia's 2 largest newspapers. Capture-recapture modeling estimated a suicide rate of 3.8/100 000 (95% CI = 2.5-6.7) for 2012. That compares to World Health Organization estimates of 1.3 to 9.4/100 000 and a Cambodian government estimate of 3.5/100 000. Suicide rates of males were twice that of females, and rates of those <40 years were twice that of those ≥40 years. Capture-recapture modeling with newspaper reports proved a reasonable method for estimating suicide rates for countries with inadequate official data. These methods are low-cost and can be applied to regions with at least 2 newspapers with overlapping reports. Means to further improve this approach are discussed. These methods are applicable to both recent and historical data, which can benefit epidemiological work, and may also be applicable to homicides and other statistics. © 2016 APJPH.
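For two overlapping newspaper lists, the simplest capture-recapture estimate is the Chapman-corrected Lincoln-Petersen estimator sketched below; the counts are invented, and the study itself used fuller capture-recapture modeling.

```python
# Two-source capture-recapture sketch (Chapman-corrected Lincoln-Petersen);
# the counts below are made up for illustration, not the Cambodian data.
import math

def chapman_estimate(n1: int, n2: int, m: int):
    """n1, n2: suicides reported by paper 1 and paper 2; m: reported by both."""
    N = (n1 + 1) * (n2 + 1) / (m + 1) - 1
    var = ((n1 + 1) * (n2 + 1) * (n1 - m) * (n2 - m)) / ((m + 1) ** 2 * (m + 2))
    se = math.sqrt(var)
    return N, (N - 1.96 * se, N + 1.96 * se)

print(chapman_estimate(n1=120, n2=95, m=40))   # estimated total and ~95% CI
```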
2013-01-01
Background The theoretical basis of genome-wide association studies (GWAS) is statistical inference of linkage disequilibrium (LD) between any polymorphic marker and a putative disease locus. Most methods widely implemented for such analyses are vulnerable to several key demographic factors, deliver poor statistical power for detecting genuine associations, and also yield a high false positive rate. Here, we present a likelihood-based statistical approach that accounts properly for the non-random nature of case-control samples with regard to the genotypic distribution at the loci in the populations under study, and that confers flexibility to test for genetic association in the presence of different confounding factors such as population structure and non-randomness of samples. Results We implemented this novel method, together with several popular methods from the GWAS literature, to re-analyze recently published Parkinson's disease (PD) case-control samples. The real data analysis and computer simulation show that the new method confers not only significantly improved statistical power for detecting associations but also robustness to the difficulties stemming from non-random sampling and genetic structure when compared to its rivals. In particular, the new method detected 44 significant SNPs within 25 chromosomal regions of size < 1 Mb, but only 6 SNPs in two of these regions were previously detected by trend-test-based methods. It discovered two SNPs located 1.18 Mb and 0.18 Mb from the PD candidates FGF20 and PARK8 without invoking false positive risk. Conclusions We developed a novel likelihood-based method which provides adequate estimation of LD and other population model parameters using case and control samples, eases the integration of such samples from multiple genetically divergent populations, and thus confers statistically robust and powerful analyses of GWAS. On the basis of simulation studies and analysis of real datasets, we demonstrated significant improvement of the new method over the non-parametric trend test, which is the most widely used in the GWAS literature. PMID:23394771
Outcome-based self-assessment on a team-teaching subject in the medical school
Cho, Sa Sun
2014-01-01
We attempted to investigate why students received worse grades in gross anatomy and how the teaching method could be improved, given the gaps between teaching and learning under a recently changed integrated curriculum. General characteristics of students and exploratory factors used to test validity were compared between 2011 and 2012. Students were asked to complete a short survey with a Likert scale. The results were as follows: although the percentage of acceptable items was similar between professors, professor C preferred questions with adequate item discrimination but inappropriate item difficulty, whereas professor Y preferred questions with adequate item discrimination and appropriate item difficulty, a statistically significant difference (P<0.01). The survey revealed that 26.5% of all students gave up on professor Y's gross anatomy examination, irrespective of year. These results suggest that students' academic achievement was affected by the corrected item difficulty rather than by item discrimination. Therefore, professors in a team-teaching subject should reach a consensus on item difficulty together with proper teaching methods. PMID:25548724
The Relationship between Adequate Yearly Progress and the Quality of Professional Development
ERIC Educational Resources Information Center
Wolff, Lori A.; McClelland, Susan S.; Stewart, Stephanie E.
2010-01-01
Based on publicly available data, the study examined the relationship between adequate yearly progress status and teachers' perceptions of the quality of their professional development. The sample included responses of 5,558 teachers who completed the questionnaire in the 2005-2006 school year. Results of the statistical analysis show a…
NASA Technical Reports Server (NTRS)
Falls, L. W.
1973-01-01
This document replaces Cape Kennedy empirical wind component statistics which are presently being used for aerospace engineering applications that require component wind probabilities for various flight azimuths and selected altitudes. The normal (Gaussian) distribution is presented as an adequate statistical model to represent component winds at Cape Kennedy. Head-, tail-, and crosswind components are tabulated for all flight azimuths for altitudes from 0 to 70 km by monthly reference periods. Wind components are given for 11 selected percentiles ranging from 0.135 percent to 99.865 percent for each month. Results of statistical goodness-of-fit tests are presented to verify the use of the Gaussian distribution as an adequate model to represent component winds at Cape Kennedy, Florida.
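Under the Gaussian model adopted in the report, component wind percentiles for a given month and altitude follow directly from the mean and standard deviation, as in this sketch (the moments shown are invented, not values from the report).

```python
# Sketch: percentiles of a wind component under the Gaussian model, for one
# month/altitude (mean and standard deviation below are invented numbers).
from scipy.stats import norm

mean_ms, sd_ms = 5.0, 12.0          # hypothetical component mean and sd (m/s)
percentiles = [0.135, 2.28, 15.9, 50.0, 84.1, 97.72, 99.865]

for p in percentiles:
    value = norm.ppf(p / 100.0, loc=mean_ms, scale=sd_ms)
    print(f"{p:7.3f}th percentile: {value:6.1f} m/s")
```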
Recurrence Density Enhanced Complex Networks for Nonlinear Time Series Analysis
NASA Astrophysics Data System (ADS)
Costa, Diego G. De B.; Reis, Barbara M. Da F.; Zou, Yong; Quiles, Marcos G.; Macau, Elbert E. N.
We introduce a new method, entitled Recurrence Density Enhanced Complex Network (RDE-CN), to properly analyze nonlinear time series. Our method first transforms a recurrence plot into a figure with a reduced number of points that nevertheless preserves the main and fundamental recurrence properties of the original plot. This resulting figure is then reinterpreted as a complex network, which is further characterized by network statistical measures. We illustrate the computational power of the RDE-CN approach with time series from both the logistic map and experimental fluid flows, which show that our method distinguishes different dynamics as well as traditional recurrence analysis does. Therefore, the proposed methodology characterizes the recurrence matrix adequately, while using a reduced set of points from the original recurrence plots.
Benner, Christian; Havulinna, Aki S; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ripatti, Samuli; Pirinen, Matti
2017-10-05
During the past few years, various novel statistical methods have been developed for fine-mapping with the use of summary statistics from genome-wide association studies (GWASs). Although these approaches require information about the linkage disequilibrium (LD) between variants, there has not been a comprehensive evaluation of how estimation of the LD structure from reference genotype panels performs in comparison with that from the original individual-level GWAS data. Using population genotype data from Finland and the UK Biobank, we show here that a reference panel of 1,000 individuals from the target population is adequate for a GWAS cohort of up to 10,000 individuals, whereas smaller panels, such as those from the 1000 Genomes Project, should be avoided. We also show, both theoretically and empirically, that the size of the reference panel needs to scale with the GWAS sample size; this has important consequences for the application of these methods in ongoing GWAS meta-analyses and large biobank studies. We conclude by providing software tools and by recommending practices for sharing LD information to more efficiently exploit summary statistics in genetics research. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
Statistical approaches to lifetime measurements with restricted observation times
NASA Astrophysics Data System (ADS)
Chen, X. C.; Zeng, Q.; Litvinov, Yu. A.; Tu, X. L.; Walker, P. M.; Wang, M.; Wang, Q.; Yue, K.; Zhang, Y. H.
2017-09-01
Two generic methods based on frequentism and Bayesianism are presented in this work aiming to adequately estimate decay lifetimes from measured data, while accounting for restricted observation times in the measurements. All the experimental scenarios that can possibly arise from the observation constraints are treated systematically and formulas are derived. The methods are then tested against the decay data of bare isomeric 94mRu44+ ions, which were measured using isochronous mass spectrometry with a timing detector at the CSRe in Lanzhou, China. Applying both methods in three distinct scenarios yields six different but consistent lifetime estimates. The deduced values are all in good agreement with a prediction based on the neutral-atom value modified to take the absence of internal conversion into account. Potential applications of such methods are discussed.
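A generic maximum-likelihood sketch for the restricted-observation-time problem, assuming exponentially distributed decay times truncated to a window [0, T]; this is not the authors' exact derivation, which covers further observation scenarios and a Bayesian treatment.

```python
# Generic sketch: maximum-likelihood lifetime estimate when each decay can only
# be observed inside a window [0, T] (not the authors' exact formulas).
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
tau_true, T = 2.0, 3.0                  # lifetime and observation window (s)
t = rng.exponential(tau_true, 2000)
t_obs = t[t < T]                        # decays outside the window are unseen

def neg_log_lik(tau):
    # log density of an exponential truncated to [0, T]
    return -np.sum(-t_obs / tau - np.log(tau) - np.log1p(-np.exp(-T / tau)))

res = minimize_scalar(neg_log_lik, bounds=(0.1, 10.0), method="bounded")
print(res.x)   # close to tau_true despite losing the long-lived decays
```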
Mieth, Bettina; Kloft, Marius; Rodríguez, Juan Antonio; Sonnenburg, Sören; Vobruba, Robin; Morcillo-Suárez, Carlos; Farré, Xavier; Marigorta, Urko M.; Fehr, Ernst; Dickhaus, Thorsten; Blanchard, Gilles; Schunk, Daniel; Navarro, Arcadi; Müller, Klaus-Robert
2016-01-01
The standard approach to the analysis of genome-wide association studies (GWAS) is based on testing each position in the genome individually for statistical significance of its association with the phenotype under investigation. To improve the analysis of GWAS, we propose a combination of machine learning and statistical testing that takes correlation structures within the set of SNPs under investigation in a mathematically well-controlled manner into account. The novel two-step algorithm, COMBI, first trains a support vector machine to determine a subset of candidate SNPs and then performs hypothesis tests for these SNPs together with an adequate threshold correction. Applying COMBI to data from a WTCCC study (2007) and measuring performance as replication by independent GWAS published within the 2008–2015 period, we show that our method outperforms ordinary raw p-value thresholding as well as other state-of-the-art methods. COMBI presents higher power and precision than the examined alternatives while yielding fewer false (i.e. non-replicated) and more true (i.e. replicated) discoveries when its results are validated on later GWAS studies. More than 80% of the discoveries made by COMBI upon WTCCC data have been validated by independent studies. Implementations of the COMBI method are available as a part of the GWASpi toolbox 2.0. PMID:27892471
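A simplified sketch of the two-step idea behind COMBI (shortlist SNPs with a linear SVM, then test only the shortlist with a corrected threshold); the official implementation is part of the GWASpi toolbox, and the shortlist size, threshold correction, and data below are placeholders rather than the published calibration.

```python
# Simplified two-step sketch in the spirit of COMBI: rank SNPs with a linear
# SVM, then run association tests only on the shortlist.
import numpy as np
from sklearn.svm import LinearSVC
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)
n, p, k = 400, 5000, 100                 # samples, SNPs, shortlist size
X = rng.integers(0, 3, size=(n, p)).astype(float)   # genotype dosages 0/1/2
y = rng.integers(0, 2, size=n)                       # case/control labels

# Step 1: rank SNPs by the magnitude of the SVM weights
svm = LinearSVC(C=0.1, max_iter=5000).fit(X, y)
shortlist = np.argsort(-np.abs(svm.coef_[0]))[:k]

# Step 2: association tests on the shortlist, Bonferroni-corrected for k tests
for j in shortlist[:5]:
    table = np.array([[np.sum((X[:, j] == g) & (y == c)) for g in (0, 1, 2)]
                      for c in (0, 1)]) + 1           # +1 avoids empty cells
    _, pval, _, _ = chi2_contingency(table)
    print(j, pval < 0.05 / k)
```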
Mieth, Bettina; Kloft, Marius; Rodríguez, Juan Antonio; Sonnenburg, Sören; Vobruba, Robin; Morcillo-Suárez, Carlos; Farré, Xavier; Marigorta, Urko M; Fehr, Ernst; Dickhaus, Thorsten; Blanchard, Gilles; Schunk, Daniel; Navarro, Arcadi; Müller, Klaus-Robert
2016-11-28
The standard approach to the analysis of genome-wide association studies (GWAS) is based on testing each position in the genome individually for statistical significance of its association with the phenotype under investigation. To improve the analysis of GWAS, we propose a combination of machine learning and statistical testing that takes correlation structures within the set of SNPs under investigation in a mathematically well-controlled manner into account. The novel two-step algorithm, COMBI, first trains a support vector machine to determine a subset of candidate SNPs and then performs hypothesis tests for these SNPs together with an adequate threshold correction. Applying COMBI to data from a WTCCC study (2007) and measuring performance as replication by independent GWAS published within the 2008-2015 period, we show that our method outperforms ordinary raw p-value thresholding as well as other state-of-the-art methods. COMBI presents higher power and precision than the examined alternatives while yielding fewer false (i.e. non-replicated) and more true (i.e. replicated) discoveries when its results are validated on later GWAS studies. More than 80% of the discoveries made by COMBI upon WTCCC data have been validated by independent studies. Implementations of the COMBI method are available as a part of the GWASpi toolbox 2.0.
Data series embedding and scale invariant statistics.
Michieli, I; Medved, B; Ristov, S
2010-06-01
Data sequences acquired from bio-systems such as human gait data, heart rate interbeat data, or DNA sequences exhibit complex dynamics that is frequently described by a long-memory or power-law decay of autocorrelation function. One way of characterizing that dynamics is through scale invariant statistics or "fractal-like" behavior. For quantifying scale invariant parameters of physiological signals several methods have been proposed. Among them the most common are detrended fluctuation analysis, sample mean variance analyses, power spectral density analysis, R/S analysis, and recently in the realm of the multifractal approach, wavelet analysis. In this paper it is demonstrated that embedding the time series data in the high-dimensional pseudo-phase space reveals scale invariant statistics in a simple fashion. The procedure is applied to different stride interval data sets from human gait measurements time series (Physio-Bank data library). Results show that the introduced mapping adequately separates long-memory from random behavior. Smaller gait data sets were analyzed and scale-free trends for limited scale intervals were successfully detected. The method was verified on artificially produced time series with known scaling behavior and with the varying content of noise. The possibility for the method to falsely detect long-range dependence in the artificially generated short range dependence series was investigated. (c) 2009 Elsevier B.V. All rights reserved.
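Of the standard scale-invariance estimators listed above, detrended fluctuation analysis is easy to sketch; note that this is one of the comparison methods, not the pseudo-phase-space embedding procedure proposed in the paper itself.

```python
# Compact detrended fluctuation analysis (DFA) sketch.
import numpy as np

def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
    y = np.cumsum(x - np.mean(x))                 # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        rms = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # linear detrending
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        F.append(np.mean(rms))
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha    # ~0.5 for white noise, ~1.0 for 1/f-like series

print(dfa_exponent(np.random.default_rng(0).standard_normal(4096)))
```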
NASA Astrophysics Data System (ADS)
Mieth, Bettina; Kloft, Marius; Rodríguez, Juan Antonio; Sonnenburg, Sören; Vobruba, Robin; Morcillo-Suárez, Carlos; Farré, Xavier; Marigorta, Urko M.; Fehr, Ernst; Dickhaus, Thorsten; Blanchard, Gilles; Schunk, Daniel; Navarro, Arcadi; Müller, Klaus-Robert
2016-11-01
The standard approach to the analysis of genome-wide association studies (GWAS) is based on testing each position in the genome individually for statistical significance of its association with the phenotype under investigation. To improve the analysis of GWAS, we propose a combination of machine learning and statistical testing that takes correlation structures within the set of SNPs under investigation in a mathematically well-controlled manner into account. The novel two-step algorithm, COMBI, first trains a support vector machine to determine a subset of candidate SNPs and then performs hypothesis tests for these SNPs together with an adequate threshold correction. Applying COMBI to data from a WTCCC study (2007) and measuring performance as replication by independent GWAS published within the 2008-2015 period, we show that our method outperforms ordinary raw p-value thresholding as well as other state-of-the-art methods. COMBI presents higher power and precision than the examined alternatives while yielding fewer false (i.e. non-replicated) and more true (i.e. replicated) discoveries when its results are validated on later GWAS studies. More than 80% of the discoveries made by COMBI upon WTCCC data have been validated by independent studies. Implementations of the COMBI method are available as a part of the GWASpi toolbox 2.0.
High prevalence of iodine deficiency in pregnant women living in adequate iodine area
Mioto, Verônica Carneiro Borges; Monteiro, Ana Carolina de Castro Nassif Gomes; de Camargo, Rosalinda Yossie Asato; Borel, Andréia Rodrigues; Catarino, Regina Maria; Kobayashi, Sergio; Chammas, Maria Cristina; Marui, Suemi
2018-01-01
Objectives Iodine deficiency during pregnancy is associated with obstetric and neonatal adverse outcomes. Serum thyroglobulin (sTg) and thyroid volume (TV) are optional tools to urinary iodine concentration (UIC) for defining iodine status. This cross-sectional study aims to evaluate the iodine status of pregnant women living in iodine-adequate area by spot UIC and correlation with sTg, TV and thyroid function. Methods Two hundred and seventy-three pregnant women were evaluated at three trimesters. All had no previous thyroid disease, no iodine supplementation and negative thyroperoxidase and thyroglobulin antibodies. Thyroid function and sTg were measured using electrochemiluminescence immunoassays. TV was determined by ultrasonography; UIC was determined using a modified Sandell–Kolthoff method. Results Median UIC was 146 µg/L, being 52% iodine deficient and only 4% excessive. TSH values were 1.50 ± 0.92, 1.50 ± 0.92 and 1.91 ± 0.96 mIU/L, respectively, in each trimester (P = 0.001). sTg did not change significantly during trimesters with median 11.2 ng/mL and only 3.3% had above 40 ng/mL. Mean TV was 9.3 ± 3.4 mL, which positively correlated with body mass index, but not with sTg. Only 4.5% presented with goitre. When pregnant women were categorized as iodine deficient (UIC < 150 µg/L), adequate (≥150 and <250 µg/L) and excessive (≥250 µg/L), sTg, thyroid hormones and TV at each trimester showed no statistical differences. Conclusions Iodine deficiency was detected frequently in pregnant women living in iodine-adequate area. sTg concentration and TV did not correlate to UIC. Our observation also demonstrated that the Brazilian salt-iodization programme prevents deficiency, but does not maintain iodine status within adequate and recommended ranges for pregnant women. PMID:29700098
The application of satellite data in monitoring strip mines
NASA Technical Reports Server (NTRS)
Sharber, L. A.; Shahrokhi, F.
1977-01-01
Strip mines in the New River Drainage Basin of Tennessee were studied through use of Landsat-1 imagery and aircraft photography. A multilevel analysis, involving conventional photo interpretation techniques, densitometric methods, multispectral analysis and statistical testing was applied to the data. The Landsat imagery proved adequate for monitoring large-scale change resulting from active mining and land-reclamation projects. However, the spatial resolution of the satellite imagery rendered it inadequate for assessment of many smaller strip mines in the region, which may be as small as a few hectares.
Statistical mechanics of complex neural systems and high dimensional data
NASA Astrophysics Data System (ADS)
Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya
2013-03-01
Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.
Wu, Robert; Glen, Peter; Ramsay, Tim; Martel, Guillaume
2014-06-28
Observational studies dominate the surgical literature. Statistical adjustment is an important strategy to account for confounders in observational studies. Research has shown that published articles are often poor in statistical quality, which may jeopardize their conclusions. The Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines have been published to help establish standards for statistical reporting. This study will seek to determine whether the quality of statistical adjustment and the reporting of these methods are adequate in surgical observational studies. We hypothesize that incomplete reporting will be found in all surgical observational studies, and that the quality and reporting of these methods will be of lower quality in surgical journals when compared with medical journals. Finally, this work will seek to identify predictors of high-quality reporting. This work will examine the top five general surgical and medical journals, based on a 5-year impact factor (2007-2012). All observational studies investigating an intervention related to an essential component area of general surgery (defined by the American Board of Surgery), with an exposure, outcome, and comparator, will be included in this systematic review. Essential elements related to statistical reporting and quality were extracted from the SAMPL guidelines and include domains such as intent of analysis, primary analysis, multiple comparisons, numbers and descriptive statistics, association and correlation analyses, linear regression, logistic regression, Cox proportional hazard analysis, analysis of variance, survival analysis, propensity analysis, and independent and correlated analyses. Each article will be scored as a proportion based on fulfilling criteria in relevant analyses used in the study. A logistic regression model will be built to identify variables associated with high-quality reporting. A comparison will be made between the scores of surgical observational studies published in medical versus surgical journals. Secondary outcomes will pertain to individual domains of analysis. Sensitivity analyses will be conducted. This study will explore the reporting and quality of statistical analyses in surgical observational studies published in the most referenced surgical and medical journals in 2013 and examine whether variables (including the type of journal) can predict high-quality reporting.
Size Matters: What Are the Characteristic Source Areas for Urban Planning Strategies?
Fan, Chao; Myint, Soe W.; Wang, Chenghao
2016-01-01
Urban environmental measurements and observational statistics should reflect the properties generated over an adjacent area of adequate length where homogeneity is usually assumed. The determination of this characteristic source area that gives sufficient representation of the horizontal coverage of a sensing instrument or the fetch of transported quantities is of critical importance to guide the design and implementation of urban landscape planning strategies. In this study, we aim to unify two different methods for estimating source areas, viz. the statistical correlation method commonly used by geographers for landscape fragmentation and the mechanistic footprint model by meteorologists for atmospheric measurements. Good agreement was found in the intercomparison of source-area estimates by the two methods, based on 2-m air temperature measurements collected using a network of weather stations. The results can be extended to shed new light on urban planning strategies, such as the use of urban vegetation for heat mitigation. In general, a sizable patch of landscape is required in order to play an effective role in regulating the local environment, proportional to the height at which stakeholders' interest is mainly concerned. PMID:27832111
Kılıç, D; Göksu, E; Kılıç, T; Buyurgan, C S
2018-05-01
The aim of this randomized cross-over study was to compare one-minute and two-minute continuous chest compressions in terms of chest-compression-only CPR quality metrics on a mannequin model in the ED. Thirty-six emergency medicine residents participated in this study. In the 1-minute group, there was no statistically significant difference in the mean compression rate (p=0.83), mean compression depth (p=0.61), good compressions (p=0.31), the percentage of complete release (p=0.07), adequate compression depth (p=0.11) or the percentage of good rate (p=0.51) over the four-minute time period. Only flow time was statistically significant among the 1-minute intervals (p<0.001). In the 2-minute group, the mean compression depth (p=0.19), good compression (p=0.92), the percentage of complete release (p=0.28), adequate compression depth (p=0.96), and the percentage of good rate (p=0.09) were not statistically significant over time. In this group, the number of compressions (248±31 vs 253±33, p=0.01), mean compression rates (123±15 vs 126±17, p=0.01) and flow time (p=0.001) were statistically significant across the two-minute intervals. There was no statistically significant difference in the mean number of chest compressions per minute, mean chest compression depth, the percentage of good compressions, complete release, adequate chest compression depth or percentage of good compressions between the 1-minute and 2-minute groups. In short, there was no statistically significant difference in the quality metrics of chest compressions between the 1- and 2-minute chest-compression-only groups. Copyright © 2017 Elsevier Inc. All rights reserved.
Lisle, John T.; Hamilton, Martin A.; Willse, Alan R.; McFeters, Gordon A.
2004-01-01
Total direct counts of bacterial abundance are central in assessing the biomass and bacteriological quality of water in ecological and industrial applications. Several factors have been identified that contribute to the variability in bacterial abundance counts when using fluorescent microscopy, the most significant of which is retaining an adequate number of cells per filter to ensure an acceptable level of statistical confidence in the resulting data. Previous studies that have assessed the components of total-direct-count methods that contribute to this variance have attempted to maintain a bacterial cell abundance value per filter of approximately 10^6 cells filter^-1. In this study we have established the lower limit for the number of bacterial cells per filter at which the statistical reliability of the abundance estimate is no longer acceptable. Our results indicate that when the numbers of bacterial cells per filter were progressively reduced below 10^5, the microscopic methods increasingly overestimated the true bacterial abundance (range, 15.0 to 99.3%). The solid-phase cytometer only slightly overestimated the true bacterial abundances and was more consistent over the same range of bacterial abundances per filter (range, 8.9 to 12.5%). The solid-phase cytometer method for conducting total direct counts of bacteria was less biased and performed significantly better than any of the microscope methods. It was also found that microscopic count data from counting 5 fields on three separate filters were statistically equivalent to data from counting 20 fields on a single filter.
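The precision side of the cells-per-filter requirement follows from counting statistics: the relative standard error of a direct count scales roughly as one over the square root of the number of cells counted (the overestimation bias at sparse filters reported above is a separate effect).

```python
# Sketch: under Poisson counting statistics the relative standard error of a
# total direct count is roughly 1/sqrt(cells counted), which is why sparse
# filters give unstable abundance estimates.
import numpy as np

for cells_counted in (10, 100, 1000, 10_000):
    rel_se = 1.0 / np.sqrt(cells_counted)
    print(f"{cells_counted:>6} cells counted -> ~{100 * rel_se:.1f}% relative SE")
```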
Teodoro, P E; Torres, F E; Santos, A D; Corrêa, A M; Nascimento, M; Barroso, L M A; Ceccon, G
2016-05-09
The aim of this study was to evaluate the suitability of statistics as experimental precision degree measures for trials with cowpea (Vigna unguiculata L. Walp.) genotypes. Cowpea genotype yields were evaluated in 29 trials conducted in Brazil between 2005 and 2012. The genotypes were evaluated with a randomized block design with four replications. Ten statistics that were estimated for each trial were compared using descriptive statistics, Pearson correlations, and path analysis. According to the class limits established, selective accuracy and F-test values for genotype, heritability, and the coefficient of determination adequately estimated the degree of experimental precision. Using these statistics, 86.21% of the trials had adequate experimental precision. Selective accuracy and the F-test values for genotype, heritability, and the coefficient of determination were directly related to each other, and were more suitable than the coefficient of variation and the least significant difference (by the Tukey test) to evaluate experimental precision in trials with cowpea genotypes.
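The links between the genotype F-test, heritability, and selective accuracy can be sketched with the expressions commonly used in this literature (e.g., selective accuracy as the square root of 1 − 1/F on an entry-mean basis); these formulas are assumed here and should be checked against the paper before relying on them.

```python
# Sketch of the usual entry-mean relations between the genotype F-value,
# heritability, and selective accuracy (formulas assumed, not quoted from the paper).
import math

def precision_stats(F_genotype: float):
    if F_genotype <= 1.0:
        return 0.0, 0.0
    h2 = 1.0 - 1.0 / F_genotype   # entry-mean heritability
    sa = math.sqrt(h2)            # selective accuracy
    return h2, sa

print(precision_stats(4.0))   # F = 4 -> h2 = 0.75, SA ~ 0.87
```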
Regional Earthquake Likelihood Models: A realm on shaky grounds?
NASA Astrophysics Data System (ADS)
Kossobokov, V.
2005-12-01
Seismology is juvenile and its appropriate statistical tools to date may have a "medieval flavor" for those who hurry to apply the fuzzy language of a highly developed probability theory. To become "quantitatively probabilistic", earthquake forecasts/predictions must be defined with scientific accuracy. Following the most popular objectivists' viewpoint on probability, we cannot claim "probabilities" to be adequate without a long series of "yes/no" forecast/prediction outcomes. Without the "antiquated binary language" of "yes/no" certainty we cannot judge an outcome ("success/failure") and, therefore, cannot objectively quantify a forecast/prediction method's performance. Likelihood scoring is one of the delicate tools of statistics, which could be worthless or even misleading when inappropriate probability models are used. This is a basic loophole for the misuse of likelihood, as well as other statistical methods, in practice. The flaw could be avoided by an accurate verification of generic probability models against the empirical data. This is not an easy task within the framework of the Regional Earthquake Likelihood Models (RELM) methodology, which neither defines the forecast precision nor allows a means to judge the ultimate success or failure in specific cases. Hopefully, the RELM group realizes the problem and its members do their best to close the hole with an adequate, data-supported choice. Regretfully, this is not the case with the erroneous choice of Gerstenberger et al., who started the public web site with forecasts of expected ground shaking for `tomorrow' (Nature 435, 19 May 2005). Gerstenberger et al. have inverted the critical evidence of their study, i.e., the 15 years of recent seismic record accumulated in just one figure, which suggests rejecting with confidence above 97% "the generic California clustering model" used in the automatic calculations. As a result, since the date of publication in Nature, the United States Geological Survey website delivers to the public, emergency planners, and the media a forecast product that is based on wrong assumptions violating the best-documented earthquake statistics in California, whose accuracy was not investigated, and whose forecasts were not tested in a rigorous way.
Bruner, L H; Carr, G J; Harbell, J W; Curren, R D
2002-06-01
An approach commonly used to measure new toxicity test method (NTM) performance in validation studies is to divide toxicity results into positive and negative classifications, and then identify true positive (TP), true negative (TN), false positive (FP) and false negative (FN) results. After this step is completed, the contingent probability statistics (CPS), sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) are calculated. Although these statistics are widely used and often the only statistics used to assess the performance of toxicity test methods, there is little specific guidance in the validation literature on what values for these statistics indicate adequate performance. The purpose of this study was to begin developing data-based answers to this question by characterizing the CPS obtained from an NTM whose data have a completely random association with a reference test method (RTM). Determining the CPS of this worst-case scenario is useful because it provides a lower baseline from which the performance of an NTM can be judged in future validation studies. It also provides an indication of relationships in the CPS that help identify random or near-random relationships in the data. The results from this study of randomly associated tests show that the values obtained for the statistics vary significantly depending on the cut-offs chosen, that high values can be obtained for individual statistics, and that the different measures cannot be considered independently when evaluating the performance of an NTM. When the association between results of an NTM and RTM is random the sum of the complementary pairs of statistics (sensitivity + specificity, NPV + PPV) is approximately 1, and the prevalence (i.e., the proportion of toxic chemicals in the population of chemicals) and PPV are equal. Given that combinations of high sensitivity-low specificity or high specificity-low sensitivity (i.e., the sum of the sensitivity and specificity equal to approximately 1) indicate lack of predictive capacity, an NTM having these performance characteristics should be considered no better for predicting toxicity than by chance alone.
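The paper's baseline result is easy to reproduce by simulation: when an NTM's calls are independent of the reference classification, sensitivity + specificity is approximately 1 and PPV approximately equals prevalence (the prevalence and call rate below are arbitrary).

```python
# Quick simulation of the random-association baseline: sensitivity + specificity
# ~ 1 and PPV ~ prevalence when NTM calls are unrelated to the RTM classification.
import numpy as np

rng = np.random.default_rng(42)
n, prevalence, call_rate = 100_000, 0.3, 0.4
truth = rng.random(n) < prevalence   # RTM classification
call = rng.random(n) < call_rate     # NTM positive calls, independent of truth

tp = np.sum(call & truth);  fn = np.sum(~call & truth)
fp = np.sum(call & ~truth); tn = np.sum(~call & ~truth)
sens, spec = tp / (tp + fn), tn / (tn + fp)
ppv, npv = tp / (tp + fp), tn / (tn + fn)
print(sens + spec, ppv, prevalence)   # ~1.0, ~0.30, 0.30
```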
Graph-based inductive reasoning.
Boumans, Marcel
2016-10-01
This article discusses methods of inductive inference that are methods of visualization designed in such a way that the "eye" can be employed as a reliable tool for judgment. The term "eye" is used as a stand-in for visual cognition and perceptual processing. In this paper "meaningfulness" has a particular meaning, namely accuracy, which is closeness to truth. Accuracy consists of precision and unbiasedness. Precision is dealt with by statistical methods, but for unbiasedness one needs expert judgment. The common view at the beginning of the twentieth century was to make the most efficient use of this kind of judgment by representing the data in shapes and forms in such a way that the "eye" can function as a reliable judge to reduce bias. The need for the judgment of the "eye" is even greater when the background conditions of the observations are heterogeneous. Statistical procedures require a certain minimal level of homogeneity, but the "eye" does not. The "eye" is an adequate tool for assessing topological similarities when, due to heterogeneity of the data, metric assessment is not possible. In fact, graphical assessment precedes measurement, or to put it more forcefully, the graphic method is a necessary prerequisite for measurement. Copyright © 2016 Elsevier Ltd. All rights reserved.
Juang, Wang-Chuan; Huang, Sin-Jhih; Huang, Fong-Dee; Cheng, Pei-Wen; Wann, Shue-Ren
2017-01-01
Objective Emergency department (ED) overcrowding is acknowledged as an increasingly important issue worldwide. Hospital managers are increasingly paying attention to ED crowding in order to provide higher quality medical services to patients. One of the crucial elements for a good management strategy is demand forecasting. Our study sought to construct an adequate model and to forecast monthly ED visits. Methods We retrospectively gathered monthly ED visits from January 2009 to December 2016 to carry out a time series autoregressive integrated moving average (ARIMA) analysis. Initial development of the model was based on past ED visits from 2009 to 2016. A best-fit model was further employed to forecast the monthly data of ED visits for the next year (2016). Finally, we evaluated the predicted accuracy of the identified model with the mean absolute percentage error (MAPE). The software packages SAS/ETS V.9.4 and Office Excel 2016 were used for all statistical analyses. Results A series of statistical tests showed that six models, including ARIMA (0, 0, 1), ARIMA (1, 0, 0), ARIMA (1, 0, 1), ARIMA (2, 0, 1), ARIMA (3, 0, 1) and ARIMA (5, 0, 1), were candidate models. The model that gave the minimum Akaike information criterion and Schwartz Bayesian criterion and followed the assumptions of residual independence was selected as the adequate model. Finally, a suitable ARIMA (0, 0, 1) structure, yielding a MAPE of 8.91%, was identified and obtained as Visit_t = 7111.161 + (a_t + 0.37462 a_(t-1)). Conclusion The ARIMA (0, 0, 1) model can be considered adequate for predicting future ED visits, and its forecast results can be used to aid decision-making processes. PMID:29196487
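A minimal sketch of the workflow described (fit an ARIMA(0, 0, 1) on the training years, forecast the hold-out year, score with MAPE); the series below is synthetic, and the study itself used SAS/ETS rather than Python.

```python
# Minimal sketch of the ARIMA(0,0,1) forecast-and-score workflow on a synthetic
# monthly ED-visit series (not the study's data or software).
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
visits = 7000 + rng.normal(0, 300, 96)   # 8 years of synthetic monthly ED visits
train, test = visits[:84], visits[84:]   # hold out the final 12 months

fit = ARIMA(train, order=(0, 0, 1)).fit()
forecast = fit.forecast(steps=12)
mape = np.mean(np.abs((test - forecast) / test)) * 100
print(f"MAPE on hold-out year: {mape:.2f}%")
```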
ERIC Educational Resources Information Center
Oslund, Eric L.; Clemens, Nathan H.; Simmons, Deborah C.; Simmons, Leslie E.
2018-01-01
The current study examined statistically significant differences between struggling and adequate readers using a multicomponent model of reading comprehension in 796 sixth through eighth graders, with a primary focus on word reading and vocabulary. Path analyses and Wald tests were used to investigate the direct and indirect relations of word…
The art and science of choosing efficacy endpoints for rare disease clinical trials.
Cox, Gerald F
2018-04-01
An important challenge in rare disease clinical trials is to demonstrate a clinically meaningful and statistically significant response to treatment. Selecting the most appropriate and sensitive efficacy endpoints for a treatment trial is part art and part science. The types of endpoints should align with the stage of development (e.g., proof of concept vs. confirmation of clinical efficacy). The patient characteristics and disease stage should reflect the treatment goal of improving disease manifestations or preventing disease progression. For rare diseases, regulatory approval requires demonstration of clinical benefit, defined as how a patient feels, functions, or survives, in at least one adequate and well-controlled pivotal study conducted according to Good Clinical Practice. In some cases, full regulatory approval can occur using a validated surrogate biomarker, while accelerated, or provisional, approval can occur using a biomarker that is likely to predict clinical benefit. Rare disease studies are small by necessity and require the use of endpoints with large effect sizes to demonstrate statistical significance. Understanding the quantitative factors that determine effect size and its impact on powering the study with an adequate sample size is key to the successful choice of endpoints. Interpreting the clinical meaningfulness of an observed change in an efficacy endpoint can be justified by statistical methods, regulatory precedence, and clinical context. Heterogeneous diseases that affect multiple organ systems may be better accommodated by endpoints that assess mean change across multiple endpoints within the same patient rather than mean change in an individual endpoint across all patients. © 2018 Wiley Periodicals, Inc.
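The point that small trials are only feasible for endpoints with large effect sizes can be made concrete with the standard two-arm sample-size approximation (two-sided alpha of 0.05, normal approximation); the sketch below is generic, not a regulatory calculation.

```python
# Sketch of the standard two-arm sample-size formula, showing how quickly the
# required n per arm grows as the standardized effect size shrinks.
from scipy.stats import norm

def n_per_arm(effect_size: float, power: float = 0.8, alpha: float = 0.05) -> float:
    z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)
    return 2 * (z_a + z_b) ** 2 / effect_size ** 2

for d in (0.3, 0.5, 0.8, 1.2):
    print(f"effect size {d}: ~{n_per_arm(d):.0f} patients per arm")
```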
41 CFR 51-9.201 - Conditions of disclosure.
Code of Federal Regulations, 2010 CFR
2010-07-01
... provided the agency with advance adequate written assurance that the record will be used solely as a statistical research or reporting record and the record is to be transferred in a form that is not... for requesting the records, and (2) Certification that the records will be used only for statistical...
What Is Missing in Counseling Research? Reporting Missing Data
ERIC Educational Resources Information Center
Sterner, William R.
2011-01-01
Missing data have long been problematic in quantitative research. Despite the statistical and methodological advances made over the past 3 decades, counseling researchers fail to provide adequate information on this phenomenon. Interpreting the complex statistical procedures and esoteric language seems to be a contributing factor. An overview of…
Arismendi, Ivan; Johnson, Sherri L.; Dunham, Jason B.
2015-01-01
Statistics of central tendency and dispersion may not capture relevant or desired characteristics of the distribution of continuous phenomena and, thus, they may not adequately describe temporal patterns of change. Here, we present two methodological approaches that can help to identify temporal changes in environmental regimes. First, we use higher-order statistical moments (skewness and kurtosis) to examine potential changes of empirical distributions at decadal extents. Second, we adapt a statistical procedure combining a non-metric multidimensional scaling technique and higher density region plots to detect potentially anomalous years. We illustrate the use of these approaches by examining long-term stream temperature data from minimally and highly human-influenced streams. In particular, we contrast predictions about thermal regime responses to changing climates and human-related water uses. Using these methods, we effectively diagnose years with unusual thermal variability and patterns in variability through time, as well as spatial variability linked to regional and local factors that influence stream temperature. Our findings highlight the complexity of responses of thermal regimes of streams and reveal their differential vulnerability to climate warming and human-related water uses. The two approaches presented here can be applied with a variety of other continuous phenomena to address historical changes, extreme events, and their associated ecological responses.
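A sketch of the first approach described, comparing higher-order moments of a temperature distribution between decades; the data below are synthetic, not the stream-temperature records analyzed in the study.

```python
# Sketch: compare skewness and (excess) kurtosis of daily temperatures between
# two decades; synthetic data stand in for the stream-temperature records.
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(7)
decade_1990s = rng.normal(10.0, 2.0, 3650)                          # deg C
decade_2000s = rng.normal(10.5, 2.0, 3650) + rng.exponential(0.5, 3650)

for name, x in [("1990s", decade_1990s), ("2000s", decade_2000s)]:
    print(name, round(skew(x), 2), round(kurtosis(x), 2))   # Fisher kurtosis
```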
[Mass anomalies of the extremities in anurans].
Kovalenko, E E
2000-01-01
The author analyses literature data on limb anomalies in Anura. It is shown that the published data are usually insufficient to discuss either the conditions under which anomalies appear or their causes. Traditional statistical methods do not adequately characterise the frequency of anomalies. The author suggests a new criterion for establishing that mass anomalies have occurred. A number of experimental findings do not correspond to current theoretical ideas about the nature of anomalies. A distinction between "background" and "mass" anomalies is proposed; "background" anomalies cannot serve as a good indicator of unfavourable developmental conditions.
Crop identification and area estimation over large geographic areas using LANDSAT MSS data
NASA Technical Reports Server (NTRS)
Bauer, M. E. (Principal Investigator)
1977-01-01
The author has identified the following significant results. LANDSAT MSS data was adequate to accurately identify wheat in Kansas; corn and soybean estimates in Indiana were less accurate. Computer-aided analysis techniques were effectively used to extract crop identification information from LANDSAT data. Systematic sampling of entire counties made possible by computer classification methods resulted in very precise area estimates at county, district, and state levels. Training statistics were successfully extended from one county to other counties having similar crops and soils if the training areas sampled the total variation of the area to be classified.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kogalovskii, M.R.
This paper presents a review of problems related to statistical database systems, which are widespread in various fields of activity. Statistical databases (SDBs) are databases whose contents are used for statistical analysis. Topics under consideration are: SDB peculiarities, properties of data models adequate for SDB requirements, metadata functions, null-value problems, SDB compromise protection problems, stored-data compression techniques, and statistical data representation means. Whether present database management systems (DBMSs) satisfy SDB requirements is also examined. Current research directions in SDB systems are considered.
Higher-order correlations for fluctuations in the presence of fields.
Boer, A; Dumitru, S
2002-10-01
The higher-order moments of the fluctuations for thermodynamic systems in the presence of fields are investigated in the framework of a theoretical method. The method uses a generalized statistical ensemble consistent with an adequate expression for the internal energy. The applications refer to the case of a system in a magnetoquasistatic field. In the case of linear magnetic media, one finds that, for the description of the magnetic induction fluctuations, the Gaussian approximation is satisfactory. For nonlinear media, the corresponding fluctuations are non-Gaussian, having a non-null asymmetry. Furthermore, the respective fluctuations have characteristics of leptokurtic, mesokurtic and platykurtic type, depending on the value of the magnetic field strength as compared with a scaling factor of the magnetization curve.
Bonn, Bernadine A.
2008-01-01
A long-term method detection level (LT-MDL) and laboratory reporting level (LRL) are used by the U.S. Geological Survey's National Water Quality Laboratory (NWQL) when reporting results from most chemical analyses of water samples. Changing to this method provided data users with additional information about their data and often resulted in more reported values in the low concentration range. Before this method was implemented, many of these values would have been censored. The use of the LT-MDL and LRL presents some challenges for the data user. Interpreting data in the low concentration range increases the need for adequate quality assurance because even small contamination or recovery problems can be relatively large compared to concentrations near the LT-MDL and LRL. In addition, the definition of the LT-MDL, as well as the inclusion of low values, can result in complex data sets with multiple censoring levels and reported values that are less than a censoring level. Improper interpretation or statistical manipulation of low-range results in these data sets can result in bias and incorrect conclusions. This document is designed to help data users use and interpret data reported with the LT-MDL/LRL method. The calculation and application of the LT-MDL and LRL are described. This document shows how to extract statistical information from the LT-MDL and LRL and how to use that information in USGS investigations, such as assessing the quality of field data, interpreting field data, and planning data collection for new projects. A set of 19 detailed examples is included in this document to help data users think about their data and properly interpret low-range data without introducing bias. Although this document is not meant to be a comprehensive resource of statistical methods, several useful methods of analyzing censored data are demonstrated, including regression on order statistics and Kaplan-Meier estimation. These two statistical methods handle complex censored data sets without resorting to substitution, thereby avoiding a common source of bias and inaccuracy.
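The Kaplan-Meier approach referenced above avoids substitution when summarizing censored results. The sketch below illustrates one common way to apply Kaplan-Meier estimation to left-censored concentrations, using the standard "flipping" transformation; it is a simplified illustration with made-up values, not the NWQL or report procedure.

```python
import numpy as np

def km_mean_left_censored(values, censored):
    """Kaplan-Meier estimate of the mean of left-censored data.

    values   : reported concentration, or the reporting level for censored results
    censored : True where the result is a "less than" (below the reporting level)
    Left-censored data are converted to right-censored data with the transform
    x -> M - x, a Kaplan-Meier survival curve is built on the flipped scale,
    and the mean is back-transformed.
    """
    values = np.asarray(values, float)
    censored = np.asarray(censored, bool)
    M = values.max() + 1.0             # any constant larger than every value
    t = M - values                     # flipped "survival times"
    event = ~censored                  # detections are the "events"

    order = np.lexsort((censored, t))  # at ties, events enter the risk set first
    t, event = t[order], event[order]
    n = len(t)

    surv = 1.0
    step_t, step_s = [0.0], [1.0]      # right-continuous survival step function
    for i in range(n):
        if event[i]:
            surv *= 1.0 - 1.0 / (n - i)   # n - i observations still at risk
            step_t.append(t[i])
            step_s.append(surv)

    # Mean on the flipped scale = area under the survival curve up to max(t).
    step_t.append(t[-1])
    area = sum(step_s[j - 1] * (step_t[j] - step_t[j - 1])
               for j in range(1, len(step_t)))
    return M - area

# Example with two censoring levels (0.2 and 0.5); "<" results entered at the level.
conc = [0.2, 0.5, 0.31, 0.42, 0.5, 0.66, 0.9, 1.4]
cens = [True, True, False, False, True, False, False, False]
print(round(km_mean_left_censored(conc, cens), 3))
```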
[The main directions of reforming the service of medical statistics in Ukraine].
Golubchykov, Mykhailo V; Orlova, Nataliia M; Bielikova, Inna V
2018-01-01
Introduction: Implementation of new methods of information support for managerial decision-making should ensure effective health system reform and create conditions for improving the quality of operational management, reasonable planning of medical care and increasing the efficiency of the use of system resources. Reform of the Medical Statistics Service of Ukraine should be considered only in the context of the reform of the entire health system. The aim: This work analyses the current situation and justifies the main directions for reforming the Medical Statistics Service of Ukraine. Material and methods: The work uses a range of methods: content analysis, bibliosemantic analysis, and a systematic approach. The information base of the research comprised WHO strategic and program documents and data of the Medical Statistics Center of the Ministry of Health of Ukraine. Review: The Medical Statistics Service of Ukraine has a complete and effective structure, headed by the State Institution "Medical Statistics Center of the Ministry of Health of Ukraine." This institution reports on behalf of the Ministry of Health of Ukraine to the State Statistical Service of Ukraine, the WHO European Office and other international organizations. An analysis of the current situation showed that to achieve this goal it is necessary: to improve the system of statistical indicators for an adequate assessment of the performance of health institutions, including in the economic aspect; to create a developed medical and statistical base for the administrative territories; to change the existing technologies for forming information resources; to strengthen the material and technical base of the structural units of the Medical Statistics Service; to improve the system of training and retraining of personnel for the medical statistics service; to develop international cooperation in the methodology and practice of medical statistics and to implement internationally accepted methods for collecting, processing, analyzing and disseminating medical and statistical information; and to create a medical and statistical service that is adapted to the specifics of market relations in health care and is flexible and sensitive to changes in international methodologies and standards. Conclusions: The data of medical statistics are the basis for managerial decisions by managers at all levels of health care. Reform of the Medical Statistics Service of Ukraine should be considered only in the context of the reform of the entire health system. The main directions of the reform of the medical statistics service in Ukraine are: the introduction of information technologies, improved training of personnel for the service, improved material and technical equipment, and the maximum reuse of the data obtained, which requires the unification of primary data and of the system of indicators. The most difficult area is the formation of information funds and the introduction of modern information technologies.
Singh, M; Ranjan, R; Das, B; Gupta, K
2014-01-01
Background: Cervical cancer is a major cause of morbidity and mortality among women in developing countries, so awareness of it is essential. Aim: The aim of this study is to assess the knowledge, attitude and practices of women regarding the basic screening test for detection of cancer of the cervix. Settings and Design: Population-based cross-sectional study. Materials and Methods: A cross-sectional prospective study was conducted. Information from consenting participants (450) was collected using a structured questionnaire. Answers were described in terms of knowledge, attitude and practice and their respective adequacy with respect to the Papanicolaou (Pap) test, the most common test used for early detection of cervical cancer. Adequacy was compared between categories of sociodemographic and clinical variables. Statistical Analysis: The data collected were analyzed using a statistical package (SPSS version 18.0). Adequacy was compared between the categories of the control variables by the χ2 test with a 5% significance level. Results: Knowledge, attitude and practices regarding the Pap test were adequate in 32.7%, 18.2% and 7.3% of women, respectively. The major impediment to adequate practice was the lack of a request by the physician. Knowledge, attitudes and practices were found to increase significantly with increasing age and education. Conclusion: Effective information, education and communication strategies are required to improve the level of public awareness. Health-care professionals should be proactive in imparting knowledge at every opportunity.
[What Surgeons Should Know about Risk Management].
Strametz, R; Tannheimer, M; Rall, M
2017-02-01
Background: The fact that medical treatment is associated with errors has long been recognized. Based on the principle of "first do no harm", numerous efforts have since been made to prevent such errors or limit their impact. However, recent statistics show that these measures do not sufficiently prevent grave mistakes with serious consequences. Preventable mistakes such as wrong-patient or wrong-site surgery still appear frequently in error statistics. Methods: Based on insights from research on human error and in due consideration of recent legislative regulations in Germany, the authors give an overview of the clinical risk management tools needed to identify risks in surgery, analyse their causes, and determine adequate measures to manage those risks depending on their relevance. The use and limitations of critical incident reporting systems (CIRS), safety checklists and crisis resource management (CRM) are highlighted. The rationale for IT systems to support the risk management process is also addressed. Results/Conclusion: No single risk management tool is effective as a standalone instrument; each unfolds its effect only when embedded in a superordinate risk management system that integrates tailor-made elements for increasing patient safety into the workflows of each organisation. Competence in choosing adequate tools, effective IT systems to support the risk management process, and leadership and commitment to the constructive handling of human error are crucial components in establishing a safety culture in surgery. Georg Thieme Verlag KG Stuttgart · New York.
Poppe, L.J.; Eliason, A.H.; Hastings, M.E.
2004-01-01
Measures that describe and summarize sediment grain-size distributions are important to geologists because of the large amount of information contained in textural data sets. Statistical methods are usually employed to simplify the necessary comparisons among samples and quantify the observed differences. The two statistical methods most commonly used by sedimentologists to describe particle distributions are mathematical moments (Krumbein and Pettijohn, 1938) and inclusive graphics (Folk, 1974). The choice of which of these statistical measures to use is typically governed by the amount of data available (Royse, 1970). If the entire distribution is known, the method of moments may be used; if the next to last accumulated percent is greater than 95, inclusive graphics statistics can be generated. Unfortunately, earlier programs designed to describe sediment grain-size distributions statistically do not run in a Windows environment, do not allow extrapolation of the distribution's tails, or do not generate both moment and graphic statistics (Kane and Hubert, 1963; Collias et al., 1963; Schlee and Webster, 1967; Poppe et al., 2000). Owing to analytical limitations, electro-resistance multichannel particle-size analyzers, such as Coulter Counters, commonly truncate the tails of the fine-fraction part of grain-size distributions. These devices do not detect fine clay in the 0.6–0.1 μm range (part of the 11-phi and all of the 12-phi and 13-phi fractions). Although size analyses performed down to 0.6 μm are adequate for most freshwater and near-shore marine sediments, samples from many deeper water marine environments (e.g. rise and abyssal plain) may contain significant material in the fine clay fraction, and these analyses benefit from extrapolation. The program (GSSTAT) described herein generates statistics to characterize sediment grain-size distributions and can extrapolate the fine-grained end of the particle distribution. It is written in Microsoft Visual Basic 6.0 and provides a window to facilitate program execution. The input for the sediment fractions is weight percentages in whole-phi notation (Krumbein, 1934; Inman, 1952), and the program permits the user to select output in either method of moments or inclusive graphics statistics (Fig. 1). Users select options primarily with mouse-click events, or through interactive dialogue boxes.
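The method-of-moments portion of such a calculation is compact enough to show directly. The sketch below is a generic illustration (not the GSSTAT source, which is Visual Basic 6.0, and it performs no tail extrapolation): weight percentages assigned to phi-class midpoints are reduced to mean, sorting, skewness, and kurtosis.

```python
import numpy as np

def moment_statistics(phi_midpoints, weight_percent):
    """Method-of-moments grain-size statistics from a phi-class histogram.

    phi_midpoints  : midpoint of each phi class
    weight_percent : weight percentage in each class (should sum to ~100)
    Returns mean, sorting (standard deviation), skewness and kurtosis in phi units.
    """
    m = np.asarray(phi_midpoints, float)
    f = np.asarray(weight_percent, float)
    f = f / f.sum()                 # normalise in case the sum is not exactly 100

    mean = np.sum(f * m)
    sd = np.sqrt(np.sum(f * (m - mean) ** 2))
    skew = np.sum(f * (m - mean) ** 3) / sd ** 3
    kurt = np.sum(f * (m - mean) ** 4) / sd ** 4
    return mean, sd, skew, kurt

# Hypothetical silty sample, whole-phi classes from 1 phi to 11 phi.
mids = np.arange(1.5, 11.5)                  # class midpoints
wts = [2, 5, 10, 18, 25, 20, 10, 6, 3, 1]    # weight percentages
print(moment_statistics(mids, wts))
```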
Multifractal analysis of mobile social networks
NASA Astrophysics Data System (ADS)
Zheng, Wei; Zhang, Zifeng; Deng, Yufan
2017-09-01
As Wireless Fidelity (Wi-Fi)-enabled handheld devices have become widely used, mobile social networks (MSNs) have been attracting extensive attention. Fractal approaches have also been widely applied to characterize natural networks, as useful tools to depict their spatial distribution and scaling properties. Moreover, when the complexity of the spatial distribution of MSNs cannot be properly characterized by a single fractal dimension, multifractal analysis is required. For further research, we introduce a multifractal analysis method based on a box-covering algorithm to describe the structure of MSNs. Using this method, we find that the networks are multifractal at different time intervals. The simulation results demonstrate that the proposed method is efficient for analyzing the multifractal characteristics of MSNs, providing a distribution of singularities that adequately describes both the heterogeneity of fractal patterns and the statistics of measurements across spatial scales in MSNs.
Optoelectronic scanning system upgrade by energy center localization methods
NASA Astrophysics Data System (ADS)
Flores-Fuentes, W.; Sergiyenko, O.; Rodriguez-Quiñonez, J. C.; Rivas-López, M.; Hernández-Balbuena, D.; Básaca-Preciado, L. C.; Lindner, L.; González-Navarro, F. F.
2016-11-01
A problem of upgrading an optoelectronic scanning system with digital post-processing of the signal based on adequate methods of energy center localization is considered. An improved dynamic triangulation analysis technique is proposed by an example of industrial infrastructure damage detection. A modification of our previously published method aimed at searching for the energy center of an optoelectronic signal is described. Application of the artificial intelligence algorithm of compensation for the error of determining the angular coordinate in calculating the spatial coordinate through dynamic triangulation is demonstrated. Five energy center localization methods are developed and tested to select the best method. After implementation of these methods, digital compensation for the measurement error, and statistical data analysis, a non-parametric behavior of the data is identified. The Wilcoxon signed rank test is applied to improve the result further. For optical scanning systems, it is necessary to detect a light emitter mounted on the infrastructure being investigated to calculate its spatial coordinate by the energy center localization method.
Richter, R; Hartmann, A; Meyer, A E; Rüger, U
1994-01-01
Thomas and Schmitz claim that they "deliver a proof for the effectiveness of humanistic methods" (p. 25) with their study. However, they did not, or could not, verify this claim, for several reasons: The authors did not state whether, and if so to what extent, the treatments carried out within the framework of the TK-regulation were treatments using humanistic methods. The validity of the only criterion used by the authors, the average duration of the inability to work, must be questioned. The inferential statistical treatment of the data is insufficient; a non-parametric evaluation is necessary. Especially missing are personal details concerning the treatment groups (age, sex, occupation, method, duration and frequency of therapy), which are indispensable for a differentiated interpretation. In addition there are numerous formal faults (wrong quotations, mistakes in tables, unclear terms, etc.). In view of this criticism we come to the conclusion that the results are to a large degree worthless, at least until several of our objections have been refuted by further information and adequate inferential statistical methods. This study is especially unsuitable to prove a however-defined "effectiveness of out-patient psychotherapies", and is therefore also not suitable to prove the effectiveness of those treatments conducted within the framework of the TK-regulation, and especially not suitable to prove the superiority of humanistic methods over psychoanalytic methods and behavioural therapy.
Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Grassmann, Mariel; Ceulemans, Eva
2017-06-01
Change point detection in multivariate time series is a complex task since, next to the mean, the correlation structure of the monitored variables may also alter when change occurs. DeCon was recently developed to detect such changes in mean and/or correlation by combining a moving windows approach and robust PCA. However, in the literature, several other methods have been proposed that employ other non-parametric tools: E-divisive, Multirank, and KCP. Since these methods use different statistical approaches, two issues need to be tackled. First, applied researchers may find it hard to appraise the differences between the methods. Second, a direct comparison of the relative performance of all these methods for capturing change points signaling correlation changes is still lacking. Therefore, we present the basic principles behind DeCon, E-divisive, Multirank, and KCP and the corresponding algorithms, to make them more accessible to readers. We further compared their performance through extensive simulations using the settings of Bulteel et al. (Biological Psychology, 98 (1), 29-42, 2014), implying changes in mean and in correlation structure, and those of Matteson and James (Journal of the American Statistical Association, 109 (505), 334-345, 2014), implying different numbers of (noise) variables. KCP emerged as the best method in almost all settings. However, in the case of more than two noise variables, only DeCon performed adequately in detecting correlation changes.
Anomalous heat transfer modes of nanofluids: a review based on statistical analysis
NASA Astrophysics Data System (ADS)
Sergis, Antonis; Hardalupas, Yannis
2011-05-01
This paper contains the results of a concise statistical review analysis of a large amount of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practise with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids.
Anomalous heat transfer modes of nanofluids: a review based on statistical analysis.
Sergis, Antonis; Hardalupas, Yannis
2011-05-19
This paper contains the results of a concise statistical review analysis of a large amount of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practise with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids.
Statistical variances of diffusional properties from ab initio molecular dynamics simulations
NASA Astrophysics Data System (ADS)
He, Xingfeng; Zhu, Yizhou; Epstein, Alexander; Mo, Yifei
2018-12-01
Ab initio molecular dynamics (AIMD) simulation is widely employed in studying diffusion mechanisms and in quantifying diffusional properties of materials. However, AIMD simulations are often limited to a few hundred atoms and a short, sub-nanosecond physical timescale, which leads to models that include only a limited number of diffusion events. As a result, the diffusional properties obtained from AIMD simulations are often plagued by poor statistics. In this paper, we re-examine the process to estimate diffusivity and ionic conductivity from the AIMD simulations and establish the procedure to minimize the fitting errors. In addition, we propose methods for quantifying the statistical variance of the diffusivity and ionic conductivity from the number of diffusion events observed during the AIMD simulation. Since an adequate number of diffusion events must be sampled, AIMD simulations should be sufficiently long and can only be performed on materials with reasonably fast diffusion. We chart the ranges of materials and physical conditions that can be accessible by AIMD simulations in studying diffusional properties. Our work provides the foundation for quantifying the statistical confidence levels of diffusion results from AIMD simulations and for correctly employing this powerful technique.
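For readers unfamiliar with the quantities involved, the sketch below shows how diffusivity is typically obtained from the slope of the mean-squared displacement (MSD) and converted to an ionic conductivity via the Nernst-Einstein relation. The numbers, the crude hop-count indicator, and the assumption of uncorrelated carriers are illustrative assumptions, not the authors' procedure.

```python
import numpy as np

K_B = 1.380649e-23       # Boltzmann constant, J/K
E_CHARGE = 1.602177e-19  # elementary charge, C

def diffusivity_from_msd(times_s, msd_m2, dim=3):
    """Tracer diffusivity from a linear fit of MSD(t); slope = 2 * dim * D."""
    slope, _ = np.polyfit(times_s, msd_m2, 1)
    return slope / (2 * dim)

def nernst_einstein_conductivity(D, n_carriers_m3, T, charge=1):
    """Ionic conductivity (S/m) from the Nernst-Einstein relation."""
    q = charge * E_CHARGE
    return n_carriers_m3 * q ** 2 * D / (K_B * T)

# Illustrative numbers for a fast ionic conductor at an elevated AIMD temperature.
t = np.linspace(0, 200e-12, 100)          # 0-200 ps
msd = 6 * 1e-9 * t + 1e-21                # synthetic MSD consistent with D = 1e-9 m^2/s
D = diffusivity_from_msd(t, msd)
sigma = nernst_einstein_conductivity(D, n_carriers_m3=2e28, T=900)
print(f"D = {D:.2e} m^2/s, sigma = {sigma:.2f} S/m")

# Crude indicator of statistical quality: effective hops per mobile ion,
# N ~ MSD / a^2 with a the jump distance; larger N means tighter estimates.
n_hops = msd[-1] / (2.5e-10) ** 2
print(f"approx. {n_hops:.0f} effective hops per ion sampled")
```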
Anomalous heat transfer modes of nanofluids: a review based on statistical analysis
2011-01-01
This paper contains the results of a concise statistical review analysis of a large amount of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practise with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids. PMID:21711932
E-Learning in Croatian Higher Education: An Analysis of Students' Perceptions
NASA Astrophysics Data System (ADS)
Dukić, Darko; Andrijanić, Goran
2010-06-01
Over the last years, e-learning has taken an important role in Croatian higher education as a result of strategies defined and measures undertaken. Nonetheless, in comparison to the developed countries, the achievements in e-learning implementation are still unsatisfactory. Therefore, the efforts to advance e-learning within Croatian higher education need to be intensified. It is further necessary to undertake ongoing activities in order to solve possible problems in e-learning system functioning, which requires the development of adequate evaluation instruments and methods. One of the key steps in this process would be examining and analyzing users' attitudes. This paper presents a study of Croatian students' perceptions with regard to certain aspects of e-learning usage. Given the character of this research, adequate statistical methods were required for the data processing. The results of the analysis indicate that, for the most part, Croatian students have positive perceptions of e-learning, particularly as support to time-honored forms of teaching. However, they are not prepared to completely give up the traditional classroom. Using factor analysis, we identified four underlying factors of a collection of variables related to students' perceptions of e-learning. Furthermore, a certain number of statistically significant differences in student attitudes have been confirmed, in terms of gender and year of study. In our study we used discriminant analysis to determine discriminant functions that distinguished defined groups of students. With this research we managed to a certain degree to alleviate the current data insufficiency in the area of e-learning evaluation among Croatian students. Since this type of learning is gaining in importance within higher education, such analyses have to be conducted continuously.
Edjabou, Maklawe Essonanawe; Martín-Fernández, Josep Antoni; Scheutz, Charlotte; Astrup, Thomas Fruergaard
2017-11-01
Data for fractional solid waste composition provide relative magnitudes of individual waste fractions, the percentages of which always sum to 100, thereby connecting them intrinsically. Due to this sum constraint, waste composition data represent closed data, and their interpretation and analysis require statistical methods, other than classical statistics that are suitable only for non-constrained data such as absolute values. However, the closed characteristics of waste composition data are often ignored when analysed. The results of this study showed, for example, that unavoidable animal-derived food waste amounted to 2.21±3.12% with a confidence interval of (-4.03; 8.45), which highlights the problem of the biased negative proportions. A Pearson's correlation test, applied to waste fraction generation (kg mass), indicated a positive correlation between avoidable vegetable food waste and plastic packaging. However, correlation tests applied to waste fraction compositions (percentage values) showed a negative association in this regard, thus demonstrating that statistical analyses applied to compositional waste fraction data, without addressing the closed characteristics of these data, have the potential to generate spurious or misleading results. Therefore, compositional data should be transformed adequately prior to any statistical analysis, such as computing mean, standard deviation and correlation coefficients. Copyright © 2017 Elsevier Ltd. All rights reserved.
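A standard way to transform closed compositional data before computing means, standard deviations, or correlations is Aitchison's centred log-ratio (clr) transform. The sketch below is a generic illustration with invented waste-fraction percentages; it is not the authors' dataset or code.

```python
import numpy as np

def clr(composition):
    """Centred log-ratio transform of a composition (parts must be positive)."""
    x = np.asarray(composition, float)
    x = x / x.sum(axis=-1, keepdims=True)                     # close to proportions
    g = np.exp(np.mean(np.log(x), axis=-1, keepdims=True))    # geometric mean per row
    return np.log(x / g)

# Hypothetical waste-sorting samples: percentages of three fractions per sample.
samples = np.array([
    [55.0, 30.0, 15.0],   # food, paper, plastic (illustrative only)
    [48.0, 35.0, 17.0],
    [60.0, 25.0, 15.0],
])
z = clr(samples)
# Means and correlations computed on z respect the closed nature of the data,
# avoiding the spurious negative associations described in the abstract.
print(z.mean(axis=0))
```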
Program for Weibull Analysis of Fatigue Data
NASA Technical Reports Server (NTRS)
Krantz, Timothy L.
2005-01-01
A Fortran computer program has been written for performing statistical analyses of fatigue-test data that are assumed to be adequately represented by a two-parameter Weibull distribution. This program calculates the following: (1) Maximum-likelihood estimates of the Weibull distribution; (2) Data for contour plots of relative likelihood for two parameters; (3) Data for contour plots of joint confidence regions; (4) Data for the profile likelihood of the Weibull-distribution parameters; (5) Data for the profile likelihood of any percentile of the distribution; and (6) Likelihood-based confidence intervals for parameters and/or percentiles of the distribution. The program can account for tests that are suspended without failure (the statistical term for such suspension of tests is "censoring"). The analytical approach followed in this program for the software is valid for type-I censoring, which is the removal of unfailed units at pre-specified times. Confidence regions and intervals are calculated by use of the likelihood-ratio method.
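Item (1), maximum-likelihood estimation of the two-parameter Weibull with type-I censoring, can be illustrated compactly. The sketch below is a Python analogue with simulated fatigue lives, not the Fortran program itself.

```python
import numpy as np
from scipy.optimize import minimize

def weibull_mle(lives, failed):
    """MLE of Weibull shape k and scale lam with right (type-I) censoring.

    lives  : cycles at failure, or at suspension for unfailed units
    failed : 1 where the unit failed, 0 where the test was suspended
    """
    t = np.asarray(lives, float)
    d = np.asarray(failed, float)

    def neg_log_lik(params):
        log_k, log_lam = params              # optimise on the log scale for positivity
        k, lam = np.exp(log_k), np.exp(log_lam)
        z = (t / lam) ** k
        log_f = np.log(k / lam) + (k - 1) * np.log(t / lam) - z   # log density
        log_s = -z                                                # log survival
        return -np.sum(d * log_f + (1 - d) * log_s)

    res = minimize(neg_log_lik, x0=[0.0, np.log(t.mean())], method="Nelder-Mead")
    return np.exp(res.x)

# Simulated fatigue test: 20 units, tests suspended (censored) at 2e6 cycles.
rng = np.random.default_rng(1)
true_lives = 1.2e6 * rng.weibull(2.0, size=20)
suspend_at = 2.0e6
lives = np.minimum(true_lives, suspend_at)
failed = (true_lives <= suspend_at).astype(int)
print("shape k, scale lambda =", weibull_mle(lives, failed))
```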
Fritscher, Karl; Grunerbl, Agnes; Hanni, Markus; Suhm, Norbert; Hengg, Clemens; Schubert, Rainer
2009-10-01
Currently, conventional X-ray and CT images, as well as invasive methods performed during the surgical intervention, are used to judge the local quality of a fractured proximal femur. However, these approaches are either dependent on the surgeon's experience or cannot assist diagnostic and planning tasks preoperatively. Therefore, in this work a method for the individual analysis of local bone quality in the proximal femur, based on model-based analysis of CT and X-ray images of femur specimens, is proposed. A combined representation of the shape and spatial intensity distribution of an object and different statistical approaches for dimensionality reduction are used to create a statistical appearance model in order to assess local bone quality in CT and X-ray images. The developed algorithms are tested and evaluated on 28 femur specimens. It is shown that the tools and algorithms presented herein are highly adequate to automatically and objectively predict bone mineral density values as well as a biomechanical parameter of the bone that can be measured intraoperatively.
Modeling and optimization of dough recipe for breadsticks
NASA Astrophysics Data System (ADS)
Krivosheev, A. Yu; Ponomareva, E. I.; Zhuravlev, A. A.; Lukina, S. I.; Alekhina, N. N.
2018-05-01
During the work, the authors studied the combined effect of non-traditional raw materials on quality indicators of breadsticks, applying mathematical methods of experiment planning. The main factors chosen were the dosages of flaxseed flour and grape seed oil. The output parameters were the swelling factor of the products and their strength. Optimization of the formulation composition of the dough for breadsticks was carried out by experimental-statistical methods. As a result of the experiment, mathematical models were constructed in the form of regression equations that adequately describe the process under study. The statistical processing of the experimental data was carried out using the Student, Cochran and Fisher criteria (with a confidence probability of 0.95). A mathematical interpretation of the regression equations was given. Optimization of the formulation of the dough for breadsticks was carried out by the method of undetermined Lagrange multipliers. The rational values of the factors were determined: a dosage of flaxseed flour of 14.22% and of grape seed oil of 7.8%, ensuring the production of products with the best combination of swelling ratio and strength. On the basis of the data obtained, a recipe and a method for the production of breadsticks "Idea" were proposed (TU (Russian Technical Specifications) 9117-443-02068106-2017).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Werth, D.; Chen, K. F.
2013-08-22
The ability of water managers to maintain adequate supplies in coming decades depends, in part, on future weather conditions, as climate change has the potential to alter river flows from their current values, possibly rendering them unable to meet demand. Reliable climate projections are therefore critical to predicting the future water supply for the United States. These projections cannot be provided solely by global climate models (GCMs), however, as their resolution is too coarse to resolve the small-scale climate changes that can affect hydrology, and hence water supply, at regional to local scales. A process is needed to 'downscale' the GCM results to the smaller scales and feed this into a surface hydrology model to help determine the ability of rivers to provide adequate flow to meet future needs. We apply a statistical downscaling to GCM projections of precipitation and temperature through the use of a scaling method. This technique involves the correction of the cumulative distribution functions (CDFs) of the GCM-derived temperature and precipitation results for the 20th century, and the application of the same correction to 21st century GCM projections. This is done for three meteorological stations located within the Coosa River basin in northern Georgia, and is used to calculate future river flow statistics for the upper Coosa River. Results are compared to the historical Coosa River flow upstream from Georgia Power Company's Hammond coal-fired power plant and to flows calculated with the original, unscaled GCM results to determine the impact of potential changes in meteorology on future flows.
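The CDF-correction step ("quantile mapping") can be sketched generically as follows. This is an illustration with synthetic temperatures, not the authors' implementation; real applications usually work month by month or season by season.

```python
import numpy as np

def quantile_map(obs_hist, gcm_hist, gcm_future):
    """Empirical quantile mapping of GCM output onto station observations.

    Each future GCM value is assigned its non-exceedance probability within the
    historical GCM distribution and replaced by the observed value having the
    same probability in the historical station record.
    """
    obs_hist = np.sort(np.asarray(obs_hist, float))
    gcm_hist = np.sort(np.asarray(gcm_hist, float))
    n = len(gcm_hist)
    prob = np.searchsorted(gcm_hist, gcm_future, side="right") / n
    prob = np.clip(prob, 1.0 / n, 1.0)   # keep probabilities inside the sampled range
    return np.quantile(obs_hist, prob)

# Illustrative: a GCM that runs 2 degrees too warm with exaggerated variability.
rng = np.random.default_rng(2)
obs = rng.normal(15.0, 4.0, 5000)        # historical station temperatures
gcm20 = rng.normal(17.0, 6.0, 5000)      # 20th-century GCM output
gcm21 = rng.normal(19.0, 6.0, 5000)      # 21st-century GCM projection
corrected = quantile_map(obs, gcm20, gcm21)
print(round(corrected.mean(), 2), round(corrected.std(), 2))
```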
Pitfalls in statistical landslide susceptibility modelling
NASA Astrophysics Data System (ADS)
Schröder, Boris; Vorpahl, Peter; Märker, Michael; Elsenbeer, Helmut
2010-05-01
The use of statistical methods is a well-established approach to predict landslide occurrence probabilities and to assess landslide susceptibility. This is achieved by applying statistical methods that relate historical landslide inventories to topographic indices as predictor variables. In our contribution, we compare several new and powerful methods developed in machine learning and well-established in landscape ecology and macroecology for predicting the distribution of shallow landslides in tropical mountain rainforests in southern Ecuador (among others: boosted regression trees, multivariate adaptive regression splines, maximum entropy). Although these methods are powerful, we think it is necessary to follow a basic set of guidelines to avoid some pitfalls regarding data sampling, predictor selection, and model quality assessment, especially if a comparison of different models is contemplated. We therefore suggest applying a novel toolbox to evaluate approaches to the statistical modelling of landslide susceptibility. Additionally, we propose some methods to open the "black box" that is an inherent part of machine learning methods, in order to achieve further explanatory insights into the preparatory factors that control landslides. Sampling of training data should be guided by hypotheses regarding the processes that lead to slope failure, taking into account their respective spatial scales. This approach leads to the selection of a set of candidate predictor variables considered on adequate spatial scales. This set should be checked for multicollinearity in order to facilitate interpretation of model response curves. Model quality assessment evaluates how well a model is able to reproduce independent observations of its response variable. This includes criteria to evaluate different aspects of model performance, i.e. model discrimination, model calibration, and model refinement. In order to assess a possible violation of the assumption of independence in the training samples, or a possible lack of explanatory information in the chosen set of predictor variables, the model residuals need to be checked for spatial autocorrelation; we therefore calculate spline correlograms. In addition, we investigate partial dependency plots and bivariate interaction plots, considering possible interactions between predictors, to improve model interpretation. In presenting this toolbox for model quality assessment, we investigate the influence of strategies for constructing training datasets on the quality of statistical landslide susceptibility models.
Vinaya, Kundapur; Rakshith, Hegde; Prasad D, Krishna; Manoj, Shetty; Sunil, Mankar; Naresh, Shetty
2015-06-01
To evaluate the retention of complete cast crowns on teeth with adequate and inadequate crown height, and to evaluate the effects of auxiliary retentive features on the retention form of complete cast crowns. Sixty freshly extracted human premolars were used. They were divided into 2 major groups depending upon the height of the teeth after preparation: Group 1 (H1), prepared teeth with a constant height of 3.5 mm, and Group 2 (H2), prepared teeth with a constant height of 2.5 mm. Each group was further subdivided into 3 subgroups depending upon the retentive features incorporated: the first subgroup was prepared conventionally, the second with proximal grooves, and the third with proximal box preparations. Castings produced in nickel-chromium alloy were cemented with glass ionomer cement, and the cemented castings were subjected to the tensional forces required to dislodge each casting from its preparation, which were used for comparison of retentive quality. The data obtained were statistically analyzed using a one-way ANOVA test. The results showed a statistically significant difference between the adequate (H1) and inadequate (H2) groups and an increase in retention when retentive features were incorporated compared with conventional preparations. The increase in retention from grooves was statistically significant compared with that obtained from boxes. Results also showed no statistically significant difference between the long conventional and short groove preparations. Complete cast crowns on teeth with adequate crown height exhibited greater retention than those with inadequate crown height. Proximal grooves provided a greater amount of retention when compared with proximal boxes.
2013-01-01
Background Relative validity (RV), a ratio of ANOVA F-statistics, is often used to compare the validity of patient-reported outcome (PRO) measures. We used the bootstrap to establish the statistical significance of the RV and to identify key factors affecting its significance. Methods Based on responses from 453 chronic kidney disease (CKD) patients to 16 CKD-specific and generic PRO measures, RVs were computed to determine how well each measure discriminated across clinically-defined groups of patients compared to the most discriminating (reference) measure. Statistical significance of RV was quantified by the 95% bootstrap confidence interval. Simulations examined the effects of sample size, denominator F-statistic, correlation between comparator and reference measures, and number of bootstrap replicates. Results The statistical significance of the RV increased as the magnitude of denominator F-statistic increased or as the correlation between comparator and reference measures increased. A denominator F-statistic of 57 conveyed sufficient power (80%) to detect an RV of 0.6 for two measures correlated at r = 0.7. Larger denominator F-statistics or higher correlations provided greater power. Larger sample size with a fixed denominator F-statistic or more bootstrap replicates (beyond 500) had minimal impact. Conclusions The bootstrap is valuable for establishing the statistical significance of RV estimates. A reasonably large denominator F-statistic (F > 57) is required for adequate power when using the RV to compare the validity of measures with small or moderate correlations (r < 0.7). Substantially greater power can be achieved when comparing measures of a very high correlation (r > 0.9). PMID:23721463
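The RV bootstrap described above can be sketched generically. In the code below, the scores, group labels, and correlation are invented; the structure (ratio of one-way ANOVA F-statistics, patients resampled with replacement, percentile interval from 500 replicates) follows the description in the abstract.

```python
import numpy as np
from scipy.stats import f_oneway

def relative_validity(scores_cmp, scores_ref, groups):
    """RV = F(comparator) / F(reference) across clinically defined groups."""
    f_cmp = f_oneway(*(scores_cmp[groups == g] for g in np.unique(groups))).statistic
    f_ref = f_oneway(*(scores_ref[groups == g] for g in np.unique(groups))).statistic
    return f_cmp / f_ref

def bootstrap_rv_ci(scores_cmp, scores_ref, groups, n_boot=500, seed=0):
    """Percentile 95% CI of the RV, resampling patients with replacement."""
    rng = np.random.default_rng(seed)
    n = len(groups)
    rvs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        rvs.append(relative_validity(scores_cmp[idx], scores_ref[idx], groups[idx]))
    return np.percentile(rvs, [2.5, 97.5])

# Hypothetical example: 3 severity groups, comparator correlated ~0.7 with reference.
rng = np.random.default_rng(3)
groups = np.repeat([0, 1, 2], 150)
ref = groups * 1.0 + rng.normal(0, 1, groups.size)
cmp_ = 0.7 * ref + rng.normal(0, 0.7, groups.size)
print("RV:", round(relative_validity(cmp_, ref, groups), 2))
print("95% bootstrap CI:", bootstrap_rv_ci(cmp_, ref, groups))
```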
ERIC Educational Resources Information Center
Martin, Tammy Faith
2012-01-01
The purpose of this study was to examine principal leadership styles and their influence on school performance as measured by adequate yearly progress at selected Title I schools in South Carolina. The main focus of the research study was to complete descriptive statistics on principal leadership styles in schools that met or did not meet adequate…
Environmental Validation of Legionella Control in a VHA Facility Water System.
Jinadatha, Chetan; Stock, Eileen M; Miller, Steve E; McCoy, William F
2018-03-01
OBJECTIVES We conducted this study to determine what sample volume, concentration, and limit of detection (LOD) are adequate for environmental validation of Legionella control. We also sought to determine whether time required to obtain culture results can be reduced compared to spread-plate culture method. We also assessed whether polymerase chain reaction (PCR) and in-field total heterotrophic aerobic bacteria (THAB) counts are reliable indicators of Legionella in water samples from buildings. DESIGN Comparative Legionella screening and diagnostics study for environmental validation of a healthcare building water system. SETTING Veterans Health Administration (VHA) facility water system in central Texas. METHODS We analyzed 50 water samples (26 hot, 24 cold) from 40 sinks and 10 showers using spread-plate cultures (International Standards Organization [ISO] 11731) on samples shipped overnight to the analytical lab. In-field, on-site cultures were obtained using the PVT (Phigenics Validation Test) culture dipslide-format sampler. A PCR assay for genus-level Legionella was performed on every sample. RESULTS No practical differences regardless of sample volume filtered were observed. Larger sample volumes yielded more detections of Legionella. No statistically significant differences at the 1 colony-forming unit (CFU)/mL or 10 CFU/mL LOD were observed. Approximately 75% less time was required when cultures were started in the field. The PCR results provided an early warning, which was confirmed by spread-plate cultures. The THAB results did not correlate with Legionella status. CONCLUSIONS For environmental validation at this facility, we confirmed that (1) 100 mL sample volumes were adequate, (2) 10× concentrations were adequate, (3) 10 CFU/mL LOD was adequate, (4) in-field cultures reliably reduced time to get results by 75%, (5) PCR provided a reliable early warning, and (6) THAB was not predictive of Legionella results. Infect Control Hosp Epidemiol 2018;39:259-266.
Masoumi, Hamid Reza Fard; Basri, Mahiran; Kassim, Anuar; Abdullah, Dzulkefly Kuang; Abdollahi, Yadollah; Abd Gani, Siti Salwa; Rezaee, Malahat
2013-01-01
Lipase-catalyzed production of triethanolamine-based esterquat by esterification of oleic acid (OA) with triethanolamine (TEA) in n-hexane was performed in a 2 L stirred-tank reactor. A set of experiments was designed by central composite design for process modeling and statistical evaluation of the findings. Five independent process variables, including enzyme amount, reaction time, reaction temperature, substrate molar ratio of OA to TEA, and agitation speed, were studied under the conditions designed by Design Expert software. Experimental data were examined for normality before the data-processing stage, and skewness and kurtosis indices were determined. The mathematical model developed was found to be adequate and statistically accurate in predicting the optimum conversion of product. Response surface methodology with central composite design gave the best performance in this study, and the methodology as a whole has proven to be adequate for the design and optimization of the enzymatic process.
Uniting statistical and individual-based approaches for animal movement modelling.
Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel
2014-01-01
The dynamic nature of their internal states and the environment directly shape animals' spatial behaviours and give rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, owing to the practical impossibility of accessing internal states through field work and the inability of current statistical models to produce dynamic outputs. To address these issues, we developed a robust method which combines statistical and individual-based modelling. Using a statistical technique for forward modelling of the IBM has the advantage of being faster to parameterize than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, caribou movements were modelled based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM was validated using both k-fold cross-validation and validation against emergent patterns, and was tested for two different scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provide projections of future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems.
Uniting Statistical and Individual-Based Approaches for Animal Movement Modelling
Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel
2014-01-01
The dynamic nature of their internal states and the environment directly shape animals' spatial behaviours and give rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, owing to the practical impossibility of accessing internal states through field work and the inability of current statistical models to produce dynamic outputs. To address these issues, we developed a robust method which combines statistical and individual-based modelling. Using a statistical technique for forward modelling of the IBM has the advantage of being faster to parameterize than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, caribou movements were modelled based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM was validated using both k-fold cross-validation and validation against emergent patterns, and was tested for two different scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provide projections of future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems. PMID:24979047
Experimental toxicology: Issues of statistics, experimental design, and replication.
Briner, Wayne; Kirwan, Jeral
2017-01-01
The difficulty of replicating experiments has drawn considerable attention. Issues with replication occur for a variety of reasons ranging from experimental design to laboratory errors to inappropriate statistical analysis. Here we review a variety of guidelines for statistical analysis, design, and execution of experiments in toxicology. In general, replication can be improved by using hypothesis driven experiments with adequate sample sizes, randomization, and blind data collection techniques. Copyright © 2016 Elsevier B.V. All rights reserved.
CAVALCANTI, Andrea Nóbrega; MARCHI, Giselle Maria; AMBROSANO, Gláucia Maria Bovi
2010-01-01
Statistical analysis interpretation is a critical issue in scientific research. When more than one main variable is studied in a piece of research, the effect of the interaction between those variables is fundamental to the discussion of the experiments. However, doubts can arise when the p-value of the interaction is greater than the significance level. Objective To determine the most adequate interpretation for factorial experiments with p-values of the interaction slightly higher than the significance level. Materials and methods The p-values of the interactions found in two restorative dentistry experiments (0.053 and 0.068) were interpreted in two distinct ways: considering the interaction as not significant and as significant. Results Different findings were observed between the two analyses, and the study results became more coherent when the significant interaction was used. Conclusion The p-value of the interaction between main variables must be analyzed with caution because it can change the outcomes of research studies. Researchers are strongly advised to interpret the results of their statistical analyses carefully in order to discuss the findings of their experiments properly. PMID:20857003
Neyeloff, Jeruza L; Fuchs, Sandra C; Moreira, Leila B
2012-01-20
Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General-purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but no previous guide was available. We constructed a step-by-step guide to performing a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software.
2012-01-01
Background Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General-purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but no previous guide was available. Findings We constructed a step-by-step guide to performing a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. Conclusions It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software. PMID:22264277
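For readers working outside Excel, the inverse-variance fixed-effect and DerSimonian-Laird random-effects calculations that such a spreadsheet performs can be written in a few lines. The sketch below is a generic implementation with made-up prevalence estimates, not the authors' spreadsheet.

```python
import numpy as np

def meta_analysis(effects, variances):
    """Inverse-variance fixed-effect and DerSimonian-Laird random-effects pooling."""
    y = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v

    fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - fixed) ** 2)                 # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                    # between-study variance
    w_re = 1.0 / (v + tau2)
    random = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    ci = (random - 1.96 * se_re, random + 1.96 * se_re)
    return fixed, random, ci, tau2

# Hypothetical prevalence estimates (proportions) with variances p(1-p)/n.
p = np.array([0.12, 0.18, 0.09, 0.15])
n = np.array([200, 150, 300, 120])
fixed, random, ci, tau2 = meta_analysis(p, p * (1 - p) / n)
print(f"fixed = {fixed:.3f}, random = {random:.3f}, 95% CI = {ci}, tau2 = {tau2:.4f}")
```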
Area estimation of crops by digital analysis of Landsat data
NASA Technical Reports Server (NTRS)
Bauer, M. E.; Hixson, M. M.; Davis, B. J.
1978-01-01
The study for which the results are presented had these objectives: (1) to use Landsat data and computer-implemented pattern recognition to classify the major crops from regions encompassing different climates, soils, and crops; (2) to estimate crop areas for counties and states by using crop identification data obtained from the Landsat identifications; and (3) to evaluate the accuracy, precision, and timeliness of crop area estimates obtained from Landsat data. The paper describes the method of developing the training statistics and evaluating the classification accuracy. Landsat MSS data were adequate to accurately identify wheat in Kansas; corn and soybean estimates for Indiana were less accurate. Systematic sampling of entire counties made possible by computer classification methods resulted in very precise area estimates at county, district, and state levels.
McLachlan, G J; Bean, R W; Jones, L Ben-Tovim
2006-07-01
An important problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. We provide a straightforward and easily implemented method for estimating the posterior probability that an individual gene is null. The problem can be expressed in a two-component mixture framework, using an empirical Bayes approach. Current methods of implementing this approach either have limitations due to the minimal assumptions made or, with more specific assumptions, are computationally intensive. By converting the value of the test statistic used to test the significance of each gene to a z-score, we propose a simple two-component normal mixture that adequately models the distribution of this score. The usefulness of our approach is demonstrated on three real datasets.
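The two-component mixture idea above can be sketched with a standard Gaussian mixture fit: z-scores are modelled as a mixture of a null and a non-null normal component, and the posterior probability of the null component serves as a local false discovery estimate. The simulated data and the use of scikit-learn's GaussianMixture are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Simulated z-scores: 90% null genes centred at 0, 10% differentially expressed around 2.5.
z = np.concatenate([rng.normal(0.0, 1.0, 900), rng.normal(2.5, 1.0, 100)]).reshape(-1, 1)

# Fit a two-component normal mixture to the z-scores.
gm = GaussianMixture(n_components=2, random_state=0).fit(z)

# Treat the component whose mean is closest to zero as the null component.
null_comp = int(np.argmin(np.abs(gm.means_.ravel())))
posterior_null = gm.predict_proba(z)[:, null_comp]

# Flag genes whose posterior probability of being null is small.
flagged = np.sum(posterior_null < 0.2)
print(f"null component mean: {gm.means_.ravel()[null_comp]:.2f}")
print(f"genes flagged as likely differentially expressed: {flagged}")
```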
Evaluating the Implementation of an Olympic Education Program in Greece
NASA Astrophysics Data System (ADS)
Grammatikopoulos, Vasilios; Tsigilis, Nikolaos; Koustelios, Athanasios; Theodorakis, Yannis
2005-11-01
The aim of this study was to develop an instrument for evaluating how an education program has been implemented. Such evaluation can provide insight into the effectiveness of a program. Examined here was the Olympic Education Program used in Greek schools since 2000. In it, students learn the history of the Olympic games and the importance of exercise for health along with the principles and values of sports and volunteerism. The evaluation instrument underlying this study addressed the following six factors: `facilities', `administration', `educational material', `student-teacher relationships', `educational procedures', and `training'. Results indicate that the instrument, while adequate for assessing effectiveness, should be combined with advanced statistical methods.
NONPARAMETRIC MANOVA APPROACHES FOR NON-NORMAL MULTIVARIATE OUTCOMES WITH MISSING VALUES
He, Fanyin; Mazumdar, Sati; Tang, Gong; Bhatia, Triptish; Anderson, Stewart J.; Dew, Mary Amanda; Krafty, Robert; Nimgaonkar, Vishwajit; Deshpande, Smita; Hall, Martica; Reynolds, Charles F.
2017-01-01
Between-group comparisons often entail many correlated response variables. The multivariate linear model, with its assumption of multivariate normality, is the accepted standard tool for these tests. When this assumption is violated, the nonparametric multivariate Kruskal-Wallis (MKW) test is frequently used. However, this test requires complete cases with no missing values in response variables. Deletion of cases with missing values likely leads to inefficient statistical inference. Here we extend the MKW test to retain information from partially-observed cases. Results of simulated studies and analysis of real data show that the proposed method provides adequate coverage and superior power to complete-case analyses. PMID:29416225
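For context, the complete-case analysis that the extension above improves on can be illustrated with an ordinary rank-based test after listwise deletion; the simulated data below are hypothetical, and the sketch uses a univariate Kruskal-Wallis test from SciPy rather than the multivariate MKW statistic or the proposed method for partially observed cases.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical skewed outcome for two groups, with some values missing at random.
group_a = rng.exponential(1.0, 40)
group_b = rng.exponential(1.5, 40)
group_a[rng.random(40) < 0.2] = np.nan
group_b[rng.random(40) < 0.2] = np.nan

# Complete-case analysis: drop missing values before the rank-based test,
# discarding the information carried by partially observed cases.
a = group_a[~np.isnan(group_a)]
b = group_b[~np.isnan(group_b)]
h, p = stats.kruskal(a, b)
print(f"complete cases: {len(a) + len(b)} of 80, H = {h:.2f}, p = {p:.3f}")
```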
NASA Astrophysics Data System (ADS)
Sergeenko, N. P.
2017-11-01
An adequate statistical method should be developed in order to predict probabilistically the range of ionospheric parameters. This problem is solved in this paper. The time series of the critical frequency of the F2 layer, foF2(t), were subjected to statistical processing. For the obtained samples {δfoF2}, statistical distributions and invariants up to the fourth order are calculated. The analysis shows that the distributions differ from the Gaussian law during disturbances. At sufficiently small probability levels, the distributions show arbitrarily large deviations from the model of a normal process. Therefore, an attempt is made to describe the statistical samples {δfoF2} with a Poisson model. For the studied samples, an exponential characteristic function is selected under the assumption that the time series are a superposition of deterministic and random processes. Using the Fourier transform, the characteristic function is transformed into a nonholomorphic, excessive-asymmetric probability density function. The statistical distributions of the samples {δfoF2} calculated for the disturbed periods are compared with the obtained model distribution function. According to the Kolmogorov criterion, the probabilities of agreement between the a posteriori distributions and the theoretical ones are P ≈ 0.7-0.9. The analysis supports the applicability of a model based on a Poisson random process for the statistical description and probabilistic estimation of the variations {δfoF2} during heliogeophysical disturbances.
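The final step, comparing an empirical distribution with a theoretical model under the Kolmogorov criterion, can be sketched with a standard Kolmogorov-Smirnov test; the synthetic sample standing in for {δfoF2} and the exponential reference distribution are assumptions for illustration only, not the paper's model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical stand-in for a sample of |delta foF2| deviations during disturbances.
sample = rng.exponential(scale=0.8, size=200)

# Kolmogorov-Smirnov comparison of the empirical distribution with a fitted
# exponential model (the p-value is approximate because the scale parameter
# was estimated from the same data).
scale_hat = sample.mean()
statistic, p_value = stats.kstest(sample, "expon", args=(0.0, scale_hat))
print(f"KS statistic = {statistic:.3f}, probability of agreement p = {p_value:.2f}")
```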
36 CFR 79.7 - Methods to fund curatorial services.
Code of Federal Regulations, 2014 CFR
2014-07-01
... available for adequate, long-term care and maintenance of collections. Those methods include, but are not..., expanding, operating, and maintaining a repository that has the capability to provide adequate long-term... with a repository that has the capability to provide adequate long-term curatorial services as set...
36 CFR 79.7 - Methods to fund curatorial services.
Code of Federal Regulations, 2012 CFR
2012-07-01
... available for adequate, long-term care and maintenance of collections. Those methods include, but are not..., expanding, operating, and maintaining a repository that has the capability to provide adequate long-term... with a repository that has the capability to provide adequate long-term curatorial services as set...
36 CFR 79.7 - Methods to fund curatorial services.
Code of Federal Regulations, 2010 CFR
2010-07-01
... available for adequate, long-term care and maintenance of collections. Those methods include, but are not..., expanding, operating, and maintaining a repository that has the capability to provide adequate long-term... with a repository that has the capability to provide adequate long-term curatorial services as set...
36 CFR 79.7 - Methods to fund curatorial services.
Code of Federal Regulations, 2011 CFR
2011-07-01
... available for adequate, long-term care and maintenance of collections. Those methods include, but are not..., expanding, operating, and maintaining a repository that has the capability to provide adequate long-term... with a repository that has the capability to provide adequate long-term curatorial services as set...
36 CFR 79.7 - Methods to fund curatorial services.
Code of Federal Regulations, 2013 CFR
2013-07-01
... available for adequate, long-term care and maintenance of collections. Those methods include, but are not..., expanding, operating, and maintaining a repository that has the capability to provide adequate long-term... with a repository that has the capability to provide adequate long-term curatorial services as set...
Guetterman, Timothy C; Creswell, John W; Wittink, Marsha; Barg, Fran K; Castro, Felipe G; Dahlberg, Britt; Watkins, Daphne C; Deutsch, Charles; Gallo, Joseph J
2017-01-01
Demand for training in mixed methods is high, with little research on faculty development or assessment in mixed methods. We describe the development of a self-rated mixed methods skills assessment and provide validity evidence. The instrument taps six research domains: "Research question," "Design/approach," "Sampling," "Data collection," "Analysis," and "Dissemination." Respondents are asked to rate their ability to define or explain concepts of mixed methods under each domain, their ability to apply the concepts to problems, and the extent to which they need to improve. We administered the questionnaire to 145 faculty and students using an internet survey. We analyzed descriptive statistics and performance characteristics of the questionnaire using the Cronbach alpha to assess reliability and an analysis of variance that compared a mixed methods experience index with assessment scores to assess criterion relatedness. Internal consistency reliability was high for the total set of items (0.95) and adequate (≥0.71) for all but one subscale. Consistent with establishing criterion validity, respondents who had more professional experiences with mixed methods (eg, published a mixed methods article) rated themselves as more skilled, which was statistically significant across the research domains. This self-rated mixed methods assessment instrument may be a useful tool to assess skills in mixed methods for training programs. It can be applied widely at the graduate and faculty level. For the learner, assessment may lead to enhanced motivation to learn and training focused on self-identified needs. For faculty, the assessment may improve curriculum and course content planning.
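Cronbach's alpha, used above to assess internal consistency, follows directly from item and total-score variances. A minimal sketch with hypothetical Likert-type item responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Hypothetical responses: 8 respondents rating 4 items on a 1-5 scale.
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
    [3, 2, 3, 3],
    [5, 4, 4, 5],
    [2, 2, 3, 2],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```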
Extending the Peak Bandwidth of Parameters for Softmax Selection in Reinforcement Learning.
Iwata, Kazunori
2016-05-11
Softmax selection is one of the most popular methods for action selection in reinforcement learning. Although various recently proposed methods may be more effective with full parameter tuning, implementing a complicated method that requires the tuning of many parameters can be difficult. Thus, softmax selection is still worth revisiting, considering the cost savings of its implementation and tuning. In fact, this method works adequately in practice with only one parameter appropriately set for the environment. The aim of this paper is to improve the variable setting of this method to extend the bandwidth of good parameters, thereby reducing the cost of implementation and parameter tuning. To achieve this, we take advantage of the asymptotic equipartition property in a Markov decision process to extend the peak bandwidth of softmax selection. Using a variety of episodic tasks, we show that our setting is effective in extending the bandwidth and that it yields a better policy in terms of stability. The bandwidth is quantitatively assessed in a series of statistical tests.
Rezende, Patrícia Sueli; Carmo, Geraldo Paulo do; Esteves, Eduardo Gonçalves
2015-06-01
We report the use of a method to determine the refractive index of copper(II) serum (RICS) in milk as a tool to detect the fraudulent addition of water. This practice is highly profitable, unlawful, and difficult to deter. The method was optimized and validated and is simple, fast and robust. The optimized method yielded statistically equivalent results compared to the reference method with an accuracy of 0.4% and quadrupled analytical throughput. Trueness, precision (repeatability and intermediate precision) and ruggedness are determined to be satisfactory at a 95.45% confidence level. The expanded uncertainty of the measurement was ±0.38°Zeiss at the 95.45% confidence level (k=3.30), corresponding to 1.03% of the minimum measurement expected in adequate samples (>37.00°Zeiss). Copyright © 2015 Elsevier B.V. All rights reserved.
Disposable screen-printed sensors for determination of duloxetine hydrochloride
2012-01-01
A screen-printed disposable electrode system for the determination of duloxetine hydrochloride (DL) was developed using screen-printing technology. Homemade printing has been characterized and optimized on the basis of the effects of the modifier and plasticizers. The fabricated bi-electrode potentiometric strip containing both working and reference electrodes was used as a duloxetine hydrochloride sensor. The proposed sensors worked satisfactorily in the concentration range from 1.0 × 10⁻⁶ to 1.0 × 10⁻² mol L⁻¹, with a detection limit reaching 5.0 × 10⁻⁷ mol L⁻¹ and an adequate shelf life of 6 months. The method is accurate, precise and economical. The proposed method has been applied successfully for the analysis of the drug in pure form and in its dosage forms. In this method, there is no interference from any common pharmaceutical additives and diluents. Results of the analysis were validated statistically by recovery studies. PMID:22264225
Analysis of cigarette purchase task instrument data with a left-censored mixed effects model.
Liao, Wenjie; Luo, Xianghua; Le, Chap T; Chu, Haitao; Epstein, Leonard H; Yu, Jihnhee; Ahluwalia, Jasjit S; Thomas, Janet L
2013-04-01
The drug purchase task is a frequently used instrument for measuring the relative reinforcing efficacy (RRE) of a substance, a central concept in psychopharmacological research. Although a purchase task instrument, such as the cigarette purchase task (CPT), provides a comprehensive and inexpensive way to assess various aspects of a drug's RRE, the application of conventional statistical methods to data generated from such an instrument may not be adequate by simply ignoring or replacing the extra zeros or missing values in the data with arbitrary small consumption values, for example, 0.001. We applied the left-censored mixed effects model to CPT data from a smoking cessation study of college students and demonstrated its superiority over the existing methods with simulation studies. Theoretical implications of the findings, limitations of the proposed method, and future directions of research are also discussed.
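A hedged, simplified illustration of left-censoring: instead of replacing values at or below a lower bound with an arbitrary small number, a censored-normal (Tobit-type) likelihood treats them as "at or below the bound". The sketch below fits a single-level model by maximum likelihood with SciPy; it omits the random effects of the authors' mixed model, and the data are simulated.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(3)
lower = 0.0                                   # censoring bound (e.g., zero consumption)
latent = rng.normal(1.0, 2.0, 300)            # hypothetical latent consumption values
y = np.maximum(latent, lower)                 # observed data, left-censored at the bound
censored = latent <= lower                    # in practice this indicator is known from the data

def neg_log_lik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    # Observed values contribute the normal density; censored values contribute
    # the probability of falling at or below the bound.
    ll = stats.norm.logpdf(y[~censored], mu, sigma).sum()
    ll += censored.sum() * stats.norm.logcdf(lower, mu, sigma)
    return -ll

result = optimize.minimize(neg_log_lik, x0=np.array([y.mean(), 0.0]))
mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
print(f"censored fraction: {censored.mean():.2f}")
print(f"ML estimates: mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f}  (true values 1.0 and 2.0)")
```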
National Trends in Trace Metals Concentrations in Ambient Particulate Matter
NASA Astrophysics Data System (ADS)
McCarthy, M. C.; Hafner, H. R.; Charrier, J. G.
2007-12-01
Ambient measurements of trace metals identified as hazardous air pollutants (HAPs, air toxics) collected in the United States from 1990 to 2006 were analyzed for long-term trends. Trace metals analyzed include lead, manganese, arsenic, chromium, nickel, cadmium, and selenium. Visual and statistical analyses were used to identify and quantify temporal variations in air toxics at national and regional levels. Trend periods were required to be at least five years. Lead particles decreased in concentration at most monitoring sites, but trends in other metals were not consistent over time or spatially. In addition, routine ambient monitoring methods had method detection limits (MDLs) too high to adequately measure concentrations for trends analysis. Differences between measurement methods at urban and rural sites also confound trends analyses. Improvements in MDLs, and a better understanding of comparability between networks, are needed to better quantify trends in trace metal concentrations in the future.
Analysis of Cigarette Purchase Task Instrument Data with a Left-Censored Mixed Effects Model
Liao, Wenjie; Luo, Xianghua; Le, Chap; Chu, Haitao; Epstein, Leonard H.; Yu, Jihnhee; Ahluwalia, Jasjit S.; Thomas, Janet L.
2015-01-01
The drug purchase task is a frequently used instrument for measuring the relative reinforcing efficacy (RRE) of a substance, a central concept in psychopharmacological research. While a purchase task instrument, such as the cigarette purchase task (CPT), provides a comprehensive and inexpensive way to assess various aspects of a drug’s RRE, the application of conventional statistical methods to data generated from such an instrument may not be adequate by simply ignoring or replacing the extra zeros or missing values in the data with arbitrary small consumption values, e.g. 0.001. We applied the left-censored mixed effects model to CPT data from a smoking cessation study of college students and demonstrated its superiority over the existing methods with simulation studies. Theoretical implications of the findings, limitations of the proposed method and future directions of research are also discussed. PMID:23356731
Protecting Privacy of Shared Epidemiologic Data without Compromising Analysis Potential
Cologne, John; Grant, Eric J.; Nakashima, Eiji; ...
2012-01-01
Objective. Ensuring privacy of research subjects when epidemiologic data are shared with outside collaborators involves masking (modifying) the data, but overmasking can compromise utility (analysis potential). Methods of statistical disclosure control for protecting privacy may be impractical for individual researchers involved in small-scale collaborations. Methods. We investigated a simple approach based on measures of disclosure risk and analytical utility that are straightforward for epidemiologic researchers to derive. The method is illustrated using data from the Japanese Atomic-bomb Survivor population. Results. Masking by modest rounding did not adequately enhance security but rounding to remove several digits of relative accuracy effectively reduced the risk of identification without substantially reducing utility. Grouping or adding random noise led to noticeable bias. Conclusions. When sharing epidemiologic data, it is recommended that masking be performed using rounding. Specific treatment should be determined separately in individual situations after consideration of the disclosure risks and analysis needs.
Protecting Privacy of Shared Epidemiologic Data without Compromising Analysis Potential
Cologne, John; Grant, Eric J.; Nakashima, Eiji; Chen, Yun; Funamoto, Sachiyo; Katayama, Hiroaki
2012-01-01
Objective. Ensuring privacy of research subjects when epidemiologic data are shared with outside collaborators involves masking (modifying) the data, but overmasking can compromise utility (analysis potential). Methods of statistical disclosure control for protecting privacy may be impractical for individual researchers involved in small-scale collaborations. Methods. We investigated a simple approach based on measures of disclosure risk and analytical utility that are straightforward for epidemiologic researchers to derive. The method is illustrated using data from the Japanese Atomic-bomb Survivor population. Results. Masking by modest rounding did not adequately enhance security but rounding to remove several digits of relative accuracy effectively reduced the risk of identification without substantially reducing utility. Grouping or adding random noise led to noticeable bias. Conclusions. When sharing epidemiologic data, it is recommended that masking be performed using rounding. Specific treatment should be determined separately in individual situations after consideration of the disclosure risks and analysis needs. PMID:22505949
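A hypothetical sketch of the recommended masking by rounding: reducing each value to a fixed number of significant digits removes several digits of relative accuracy across magnitudes. The helper below illustrates the general idea only and is not the study's procedure.

```python
import numpy as np

def round_to_sig_digits(x: np.ndarray, digits: int) -> np.ndarray:
    """Round each value to `digits` significant digits (zeros are left unchanged)."""
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    nonzero = x != 0
    magnitude = np.floor(np.log10(np.abs(x[nonzero])))
    factor = 10.0 ** (digits - 1 - magnitude)
    out[nonzero] = np.round(x[nonzero] * factor) / factor
    return out

# Hypothetical dose estimates (arbitrary units): rounding to 2 significant digits
# yields 0.012, 0.46, 3.2, 12.0 and 250.0, discarding finer relative accuracy.
doses = np.array([0.01234, 0.4567, 3.21987, 12.3456, 250.77])
print(round_to_sig_digits(doses, 2))
```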
NASA Astrophysics Data System (ADS)
Prikner, K.
A statistical method for interpreting data from experimental investigations of vertically-propagating electromagnetic ULF waves in the inhomogeneous magnetoactive ionosphere is considered theoretically. Values are obtained for the transmission, reflection and absorption characteristics of ULF waves in a limited ionospheric layer, in order to describe the relation between the frequency of a wave generated at the earth surface and that of a total wave propagating above the ionospheric layer. This relation is used to express the frequency-selective amplitude filtration of ULF waves in the layer. The method is applied to a model of the night ionosphere of mid-geomagnetic latitudes in the form of a plate 1000 km thick. It is found that the relative characteristics of transmission and amplitude loss in the wave adequately describe the frequency selectiveness and wave filtration capacity of the ionosphere. The method is recommended for studies of the structural changes of wave parameters in ionospheric models.
Kids Count: The State of the Child in Tennessee, 1996. A County-by-County Statistical Report.
ERIC Educational Resources Information Center
Tennessee State Commission on Children and Youth, Nashville.
This Kids Count report examines statewide trends from 1992 to 1996 in the well being of Tennessee's children. The statistical portrait is based on trends in 16 indicators of child well being: (1) enrollment in state health insurance program; (2) births lacking adequate prenatal care; (3) low-birthweight births; (4) infant mortality rate; (5) child…
Coronary Flow Velocity Reserve during Dobutamine Stress Echocardiography
de Abreu, José Sebastião; Lima, José Wellington Oliveira; Diógenes, Tereza Cristina Pinheiro; Siqueira, Jordana Magalhães; Pimentel, Nayara Lima; Gomes, Pedro Sabino; de Abreu, Marília Esther Benevides; Paes, José Nogueira
2014-01-01
Background A coronary flow velocity reserve (CFVR) ≥ 2 is adequate to infer a favorable prognosis or the absence of significant coronary artery disease. Objective To identify parameters which are relevant to obtain CFVR (adequate or inadequate) in the left anterior descending coronary artery (LAD) during dobutamine stress echocardiography (DSE). Methods 100 patients referred for detection of myocardial ischemia by DSE were evaluated; they were instructed to discontinue the use of β-blockers 72 hours prior to the test. CFVR was calculated as the ratio of the diastolic peak velocity (cm/s) (DPV) on DSE (DPV-DSE) to baseline DPV at rest (DPV-Rest). In group I, CFVR was < 2 and, in group II, CFVR was ≥ 2. The Fisher's exact test and Student's t test were used for the statistical analyses. P values < 0.05 were considered statistically significant. Results At rest, the time (in seconds) to obtain Doppler in the LAD in groups I and II was not different (53±31 vs. 45±32; p=0.23). During DSE, the LAD was recorded in 92 patients. Group I patients were older (65.9±9.3 vs. 61.2±10.8 years; p=0.04), had lower ejection fraction (61±10 vs. 66±6%; p=0.005), higher DPV-Rest (36.81±08 vs. 25.63±06 cm/s; p<0.0001) and lower CFVR (1.67±0.24 vs. 2.53±0.57; p<0.0001), but no difference was observed regarding DPV-DSE (61.40±16 vs. 64.23±16 cm/s; p=0.42). β-blocker discontinuation was associated with a 4-fold higher chance of a CFVR < 2 (OR= 4; 95% CI [1.171-13.63], p=0.027). Conclusion DPV-Rest was the main parameter determining an adequate CFVR. β-blocker discontinuation was significantly associated with inadequate CFVR. The high feasibility and the time to record the LAD corroborate the use of this methodology. PMID:24676368
Narayanan, Sarath Kumar; Cohen, Ralph Clinton; Shun, Albert
2014-06-01
Minimal access techniques have transformed the way pediatric surgery is practiced. Due to various constraints, surgical residency programs have not been able to provide adequate skills training in the routine setting. The advent of new technology and methods in minimally invasive surgery (MIS) has similarly contributed to the need for systematic skills training in a safe, simulated environment. To enable pediatric surgery trainees to learn proper technique, we have developed a porcine non-survival model for endoscopic surgery. The technical advancements over the past 3 years and a subjective validation of the porcine model by 114 participating trainees, using a standard questionnaire and a 5-point Likert scale, are described here. Mean attitude scores and analysis of variance (ANOVA) were used for statistical analysis of the data. Almost all trainees agreed or strongly agreed that the animal-based model was appropriate (98.35%) and also acknowledged that such workshops provided adequate practical experience before attempting procedures on human subjects (96.6%). The mean attitude score for respondents was 19.08 (SD 3.4, range 4-20). Attitude scores showed no statistical association with years of experience or level of seniority, indicating a positive attitude among all groups of respondents. Structured porcine-based MIS training should be an integral part of skill acquisition for pediatric surgery trainees, and the experience gained can be transferred into clinical practice. We advocate that laparoscopic training should begin in a controlled workshop setting before procedures are attempted on human patients.
Statistical Approaches for Spatiotemporal Prediction of Low Flows
NASA Astrophysics Data System (ADS)
Fangmann, A.; Haberlandt, U.
2017-12-01
An adequate assessment of regional climate change impacts on streamflow requires the integration of various sources of information and modeling approaches. This study proposes simple statistical tools for inclusion into model ensembles, which are fast and straightforward in their application, yet able to yield accurate streamflow predictions in time and space. Target variables for all approaches are annual low flow indices derived from a data set of 51 records of average daily discharge for northwestern Germany. The models require input of climatic data in the form of meteorological drought indices, derived from observed daily climatic variables, averaged over the streamflow gauges' catchment areas. Four different modeling approaches are analyzed. The basis for all of them is multiple linear regression models that estimate low flows as a function of a set of meteorological indices and/or physiographic and climatic catchment descriptors. For the first method, individual regression models are fitted at each station, predicting annual low flow values from a set of annual meteorological indices, which are subsequently regionalized using a set of catchment characteristics. The second method combines temporal and spatial prediction within a single panel data regression model, allowing estimation of annual low flow values from input of both annual meteorological indices and catchment descriptors. The third and fourth methods represent non-stationary low flow frequency analyses and require fitting of regional distribution functions. Method three is based on a spatiotemporal prediction of an index value, method four on estimation of L-moments that adapt the regional frequency distribution to the at-site conditions. The results show that method two outperforms successive prediction in time and space. Method three also shows a high performance in the near future period, but since it relies on a stationary distribution, its application for prediction of far-future changes may be problematic. Spatiotemporal prediction of L-moments appeared highly uncertain for higher-order moments, resulting in unrealistic future low flow values. All in all, the results support the inclusion of simple statistical methods in climate change impact assessment.
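A minimal sketch of the second approach: pooling temporal and spatial information in a single regression of annual low flow indices on a meteorological drought index and a catchment descriptor. The simulated panel, the variable names and the use of statsmodels OLS are assumptions for illustration and simplify the paper's panel data model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
# Hypothetical panel: 10 gauges x 30 years of annual low flow values.
gauges, years = 10, 30
catchment_area = rng.uniform(50, 500, gauges)          # static catchment descriptor
drought_index = rng.normal(0, 1, (gauges, years))      # annual meteorological drought index

low_flow = (0.02 * catchment_area[:, None]             # larger catchments -> higher base flow
            + 1.5 * drought_index                      # wetter years -> higher low flows
            + rng.normal(0, 0.5, (gauges, years)))

panel = pd.DataFrame({
    "gauge": np.repeat(np.arange(gauges), years),
    "low_flow": low_flow.ravel(),
    "drought_index": drought_index.ravel(),
    "area": np.repeat(catchment_area, years),
})

# Pooled regression: annual low flow as a function of an annual meteorological
# index and a static catchment descriptor (a simplified stand-in for a panel model).
model = smf.ols("low_flow ~ drought_index + area", data=panel).fit()
print(model.params)
print(f"R^2 = {model.rsquared:.2f}")
```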
Olsen, L R; Jensen, D V; Noerholm, V; Martiny, K; Bech, P
2003-02-01
We have developed the Major Depression Inventory (MDI), consisting of 10 items, covering the DSM-IV as well as the ICD-10 symptoms of depressive illness. We aimed to evaluate this as a scale measuring severity of depressive states with reference to both internal and external validity. Patients representing the score range from no depression to marked depression on the Hamilton Depression Scale (HAM-D) completed the MDI. Both classical and modern psychometric methods were applied for the evaluation of validity, including the Rasch analysis. In total, 91 patients were included. The results showed that the MDI had an adequate internal validity in being a unidimensional scale (the total score an appropriate or sufficient statistic). The external validity of the MDI was also confirmed as the total score of the MDI correlated significantly with the HAM-D (Pearson's coefficient 0.86, P < or = 0.01, Spearman 0.80, P < or = 0.01). When used in a sample of patients with different states of depression the MDI has an adequate internal and external validity.
A segmentation editing framework based on shape change statistics
NASA Astrophysics Data System (ADS)
Mostapha, Mahmoud; Vicory, Jared; Styner, Martin; Pizer, Stephen
2017-02-01
Segmentation is a key task in medical image analysis because its accuracy significantly affects successive steps. Automatic segmentation methods often produce inadequate segmentations, which require the user to manually edit the produced segmentation slice by slice. Because editing is time-consuming, an editing tool that enables the user to produce accurate segmentations by only drawing a sparse set of contours would be needed. This paper describes such a framework as applied to a single object. Constrained by the additional information enabled by the manually segmented contours, the proposed framework utilizes object shape statistics to transform the failed automatic segmentation to a more accurate version. Instead of modeling the object shape, the proposed framework utilizes shape change statistics that were generated to capture the object deformation from the failed automatic segmentation to its corresponding correct segmentation. An optimization procedure was used to minimize an energy function that consists of two terms, an external contour match term and an internal shape change regularity term. The high accuracy of the proposed segmentation editing approach was confirmed by testing it on a simulated data set based on 10 in-vivo infant magnetic resonance brain data sets using four similarity metrics. Segmentation results indicated that our method can provide efficient and adequately accurate segmentations (Dice segmentation accuracy increase of 10%), with very sparse contours (only 10%), which is promising in greatly decreasing the work expected from the user.
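The Dice coefficient used above to quantify segmentation accuracy is simple to compute from binary masks; the sketch below assumes two NumPy arrays as masks and is not tied to the authors' editing framework.

```python
import numpy as np

def dice_coefficient(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice overlap between two binary segmentation masks of the same shape."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    return 2.0 * intersection / total if total > 0 else 1.0

# Hypothetical 2D masks: an automatic segmentation and an edited/reference segmentation.
auto = np.zeros((64, 64), dtype=bool)
auto[20:40, 20:40] = True
reference = np.zeros((64, 64), dtype=bool)
reference[22:42, 22:42] = True
print(f"Dice = {dice_coefficient(auto, reference):.3f}")
```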
[Flavouring estimation of quality of grape wines with use of methods of mathematical statistics].
Yakuba, Yu F; Khalaphyan, A A; Temerdashev, Z A; Bessonov, V V; Malinkin, A D
2016-01-01
The formation of an integral estimate of wine flavour during tasting is discussed, and the advantages and disadvantages of the procedures are outlined. The materials investigated were natural white and red wines from Russian producers, made by traditional technologies from Vitis vinifera and direct hybrids, as well as blended and experimental wines (more than 300 different samples). The aim of the research was to establish, by methods of mathematical statistics, the correlation between the content of non-volatile wine components and the wine's tasting quality rating. The contents of organic acids, amino acids and cations in wine were considered the main factors influencing flavour; they largely define the quality of the beverage. These components were determined in the wine samples by capillary electrophoresis («CAPEL»). In parallel with the analytical characterization of the samples, a representative group of specialists carried out a tasting evaluation of the wines using a 100-point system. The possibility of statistically modelling the correlation between the tasting score and the analytical data on amino acids and cations, which reasonably describe the wine's flavour, was examined. Statistical modelling of the correlation between the tasting score and the content of the major cations (ammonium, potassium, sodium, magnesium, calcium) and free amino acids (proline, threonine, arginine), taking into account their level of influence on flavour and the analytical evaluation within fixed limits of quality conformity, was performed with Statistica. Adequate statistical models were constructed that can predict the tasting score, that is, determine wine quality, from the content of the components that form its flavour properties. It is emphasized that, along with aromatic (volatile) substances, non-volatile components (mineral substances and amino acids such as proline, threonine and arginine) influence the flavour properties of wine and contribute to the organoleptic and flavour quality assessment of wines as much as the aromatic volatile substances do.
Abbey, Mercy; Chinbuah, Margaret A; Gyapong, Margaret; Bartholomew, L Kay; van den Borne, Bart
2016-08-22
The World Health Organization recommends community case management of malaria and pneumonia for reduction of under-five mortality in developing countries. Caregivers' perception and understanding of the illness influence the care a sick child receives. Studies in Ghana and elsewhere have routinely shown adequate recognition of malaria by caregivers. Similarly, evidence from Asia and some African countries has shown adequate knowledge of pneumonia. However, in Ghana, little has been documented about community awareness, knowledge, perceptions and management of childhood pneumonia, particularly in the Dangme West district. Therefore, this formative study was conducted to determine community perceptions of pneumonia for the purpose of informing the design and implementation of context-specific health communication strategies to promote early and appropriate care-seeking behaviour for childhood pneumonia. A mixed methods approach was adopted. Data were obtained from structured interviews (N = 501) and eight focus group discussions made up of 56 caregivers of under-fives and eight community key informants. Descriptive and inferential statistics were used for the quantitative data, and grounded theory guided the analysis of the qualitative data. Two-thirds of the respondents had never heard the name pneumonia. Most respondents did not know the signs and symptoms of pneumonia. For the few who had heard about pneumonia, causes were largely attributed to coming into contact with cold temperatures in various forms. Management practices were mostly self-treatment with home remedies and allopathic care. The low awareness and inadequate recognition of pneumonia imply that affected children may not receive prompt and appropriate treatment, as their caregivers may misdiagnose the illness. Adequate measures need to be taken to create the needed awareness to improve care-seeking behaviour.
Evaluating the One-in-Five Statistic: Women's Risk of Sexual Assault While in College.
Muehlenhard, Charlene L; Peterson, Zoë D; Humphreys, Terry P; Jozkowski, Kristen N
In 2014, U.S. president Barack Obama announced a White House Task Force to Protect Students From Sexual Assault, noting that "1 in 5 women on college campuses has been sexually assaulted during their time there." Since then, this one-in-five statistic has permeated public discourse. It is frequently reported, but some commentators have criticized it as exaggerated. Here, we address the question, "What percentage of women are sexually assaulted while in college?" After discussing definitions of sexual assault, we systematically review available data, focusing on studies that used large, representative samples of female undergraduates and multiple behaviorally specific questions. We conclude that one in five is a reasonably accurate average across women and campuses. We also review studies that are inappropriately cited as either supporting or debunking the one-in-five statistic; we explain why they do not adequately address this question. We identify and evaluate several assumptions implicit in the public discourse (e.g., the assumption that college students are at greater risk than nonstudents). Given the empirical support for the one-in-five statistic, we suggest that the controversy occurs because of misunderstandings about studies' methods and results and because this topic has implications for gender relations, power, and sexuality; this controversy is ultimately about values.
Evaluation Studies of Robotic Rollators by the User Perspective: A Systematic Review.
Werner, Christian; Ullrich, Phoebe; Geravand, Milad; Peer, Angelika; Hauer, Klaus
2016-01-01
Robotic rollators enhance the basic functions of established devices by technically advanced physical, cognitive, or sensory support to increase autonomy in persons with severe impairment. In the evaluation of such ambient assisted living solutions, both the technical and user perspectives are important to prove usability, effectiveness and safety, and to ensure adequate device application. The aim of this systematic review is to summarize the methodology of studies evaluating robotic rollators with focus on the user perspective and to give recommendations for future evaluation studies. A systematic literature search up to December 31, 2014, was conducted based on the Cochrane Review methodology using the electronic databases PubMed and IEEE Xplore. Articles were selected according to the following inclusion criteria: evaluation studies of robotic rollators documenting human-robot interaction, no case reports, published in English language. Twenty-eight studies were identified that met the predefined inclusion criteria. Large heterogeneity in the definitions of the target user group, study populations, study designs and assessment methods was found across the included studies. No generic methodology to evaluate robotic rollators could be identified. We found major methodological shortcomings related to insufficient sample descriptions and sample sizes, and lack of appropriate, standardized and validated assessment methods. Long-term use in habitual environment was also not evaluated. Apart from the heterogeneity, methodological deficits in most of the identified studies became apparent. Recommendations for future evaluation studies include: clear definition of target user group, adequate selection of subjects, inclusion of other assistive mobility devices for comparison, evaluation of the habitual use of advanced prototypes, adequate assessment strategy with established, standardized and validated methods, and statistical analysis of study results. Assessment strategies may additionally focus on specific functionalities of the robotic rollators allowing an individually tailored assessment of innovative features to document their added value. © 2016 S. Karger AG, Basel.
Comments on the present state and future directions of PDF methods
NASA Technical Reports Server (NTRS)
Obrien, E. E.
1992-01-01
The one point probability density function (PDF) method is examined in light of its use in actual engineering problems. The PDF method, although relatively complicated, appears to be the only format available to handle the nonlinear stochastic difficulties caused by typical reaction kinetics. Turbulence modeling, if it is to play a central role in combustion modeling, has to be integrated with the chemistry in a way which produces accurate numerical solutions to combustion problems. It is questionable whether the development of turbulent models in isolation from the peculiar statistics of reactant concentrations is a fruitful line of development as far as propulsion is concerned. There are three issues for which additional viewgraphs are prepared: the one point pdf method; the amplitude mapping closure; and a hybrid strategy for replacing a full two point pdf treatment of reacting flows by a single point pdf and correlation functions. An appeal is made for the establishment of an adequate data base for compressible flow with reactions for Mach numbers of unity or higher.
Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods.
Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J Sunil
2014-08-01
We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called "Patient Recursive Survival Peeling" is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called "combined" cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication.
Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods
Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil
2015-01-01
We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called “Patient Recursive Survival Peeling” is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called “combined” cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication. PMID:26997922
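A rough sketch of the "combined" cross-validation idea described above: held-out samples are assigned to a high- or low-risk group in each fold, the test-set assignments are pooled across folds, and a single log-rank statistic is computed on the pooled data. The risk rule (a median split on a linear score), the simulated data and the use of the lifelines package are illustrative assumptions, not the authors' recursive peeling method.

```python
import numpy as np
from sklearn.model_selection import KFold
from lifelines.statistics import logrank_test

rng = np.random.default_rng(5)
n = 200
x = rng.normal(size=(n, 3))                            # hypothetical covariates
risk = x[:, 0]                                         # latent risk driver
time = rng.exponential(scale=np.exp(-0.8 * risk))      # higher risk -> shorter survival
event = rng.random(n) < 0.7                            # roughly 30% right-censoring

pooled_high = np.zeros(n, dtype=bool)
for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(x):
    # Toy "rule induction" on the training fold: a median split on one covariate.
    threshold = np.median(x[train_idx, 0])
    pooled_high[test_idx] = x[test_idx, 0] > threshold  # assign held-out samples only

# One log-rank test on the pooled held-out assignments (the "combined" design).
res = logrank_test(time[pooled_high], time[~pooled_high],
                   event_observed_A=event[pooled_high],
                   event_observed_B=event[~pooled_high])
print(f"log-rank statistic = {res.test_statistic:.2f}, p = {res.p_value:.4f}")
```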
Cysique, Lucette A; Waters, Edward K; Brew, Bruce J
2011-11-22
There is conflicting information as to whether antiretroviral drugs with better central nervous system (CNS) penetration (neuroHAART) assist in improving neurocognitive function and suppressing cerebrospinal fluid (CSF) HIV RNA. The current review aims to better synthesise existing literature by using an innovative two-phase review approach (qualitative and quantitative) to overcome methodological differences between studies. Sixteen studies, all observational, were identified using a standard citation search. They fulfilled the following inclusion criteria: conducted in the HAART era; sample size > 10; treatment effect involved more than one antiretroviral and none had a retrospective design. The qualitative phase of review of these studies consisted of (i) a blind assessment rating studies on features such as sample size, statistical methods and definitions of neuroHAART, and (ii) a non-blind assessment of the sensitivity of the neuropsychological methods to HIV-associated neurocognitive disorder (HAND). During quantitative evaluation we assessed the statistical power of studies, which achieved a high rating in the qualitative analysis. The objective of the power analysis was to determine the studies ability to assess their proposed research aims. After studies with at least three limitations were excluded in the qualitative phase, six studies remained. All six found a positive effect of neuroHAART on neurocognitive function or CSF HIV suppression. Of these six studies, only two had statistical power of at least 80%. Studies assessed as using more rigorous methods found that neuroHAART was effective in improving neurocognitive function and decreasing CSF viral load, but only two of those studies were adequately statistically powered. Because all of these studies were observational, they represent a less compelling evidence base than randomised control trials for assessing treatment effect. Therefore, large randomised trials are needed to determine the robustness of any neuroHAART effect. However, such trials must be longitudinal, include the full spectrum of HAND, ideally carefully control for co-morbidities, and be based on optimal neuropsychology methods.
Jahn, I; Foraita, R
2008-01-01
In Germany gender-sensitive approaches are part of guidelines for good epidemiological practice as well as health reporting. They are increasingly claimed to realize the gender mainstreaming strategy in research funding by the federation and federal states. This paper focuses on methodological aspects of data analysis, as an empirical data example of which serves the health report of Bremen, a population-based cross-sectional study. Health reporting requires analysis and reporting methods that are able to discover sex/gender issues of questions, on the one hand, and consider how results can adequately be communicated, on the other hand. The core question is: Which consequences do a different inclusion of the category sex in different statistical analyses for identification of potential target groups have on the results? As evaluation methods logistic regressions as well as a two-stage procedure were exploratively conducted. This procedure combines graphical models with CHAID decision trees and allows for visualising complex results. Both methods are analysed by stratification as well as adjusted by sex/gender and compared with each other. As a result, only stratified analyses are able to detect differences between the sexes and within the sex/gender groups as long as one cannot resort to previous knowledge. Adjusted analyses can detect sex/gender differences only if interaction terms have been included in the model. Results are discussed from a statistical-epidemiological perspective as well as in the context of health reporting. As a conclusion, the question, if a statistical method is gender-sensitive, can only be answered by having concrete research questions and known conditions. Often, an appropriate statistic procedure can be chosen after conducting a separate analysis for women and men. Future gender studies deserve innovative study designs as well as conceptual distinctiveness with regard to the biological and the sociocultural elements of the category sex/gender.
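The methodological point about stratification versus adjustment can be illustrated with a logistic regression that either adjusts for sex, includes an exposure-by-sex interaction, or is fitted separately for women and men; the simulated data, variable names and use of statsmodels are hypothetical and do not reproduce the Bremen analyses.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 2000
sex = rng.integers(0, 2, n)                       # hypothetical coding: 0 = men, 1 = women
exposure = rng.integers(0, 2, n)
# Simulated outcome with a sex-specific exposure effect (effect present only in women).
logit = -1.0 + 1.2 * exposure * sex
outcome = rng.random(n) < 1 / (1 + np.exp(-logit))
df = pd.DataFrame({"outcome": outcome.astype(int), "exposure": exposure, "sex": sex})

# Sex-adjusted model without interaction: the sex-specific effect is averaged away.
adjusted = smf.logit("outcome ~ exposure + sex", data=df).fit(disp=False)
# Model with an interaction term: can recover the differential effect.
interaction = smf.logit("outcome ~ exposure * sex", data=df).fit(disp=False)
# Stratified analyses: one model per sex group.
by_sex = {s: smf.logit("outcome ~ exposure", data=df[df.sex == s]).fit(disp=False)
          for s in (0, 1)}

print("adjusted exposure OR:", np.exp(adjusted.params["exposure"]).round(2))
print("exposure OR in men / women:",
      np.exp(by_sex[0].params["exposure"]).round(2),
      np.exp(by_sex[1].params["exposure"]).round(2))
```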
Review of surface steam sterilization for validation purposes.
van Doornmalen, Joost; Kopinga, Klaas
2008-03-01
Sterilization is an essential step in the process of producing sterile medical devices. To guarantee sterility, the process of sterilization must be validated. Because there is no direct way to measure sterility, the techniques applied to validate the sterilization process are based on statistical principles. Steam sterilization is the most frequently applied sterilization method worldwide and can be validated either by indicators (chemical or biological) or physical measurements. The steam sterilization conditions are described in the literature. Starting from these conditions, criteria for the validation of steam sterilization are derived and can be described in terms of physical parameters. Physical validation of steam sterilization appears to be an adequate and efficient validation method that could be considered as an alternative for indicator validation. Moreover, physical validation can be used for effective troubleshooting in steam sterilizing processes.
Systematic Review of Plant-Based Homeopathic Basic Research: An Update.
Ücker, Annekathrin; Baumgartner, Stephan; Sokol, Anezka; Huber, Roman; Doesburg, Paul; Jäger, Tim
2018-05-01
Plant-based test systems have been described as a useful tool for investigating possible effects of homeopathic preparations. The last reviews of this research field were published in 2009/2011. Due to recent developments in the field, an update is warranted. Publications on plant-based test systems were analysed with regard to publication quality, reproducibility and potential for further research. A literature search was conducted in online databases and specific journals, including publications from 2008 to 2017 dealing with plant-based test systems in homeopathic basic research. To be included, they had to contain statistical analysis and fulfil quality criteria according to a pre-defined manuscript information score (MIS). Publications scoring at least 5 points (maximum 10 points) were assumed to be adequate. They were analysed for the use of adequate controls, outcome and reproducibility. Seventy-four publications on plant-based test systems were found. Thirty-nine publications were either abstracts or proceedings of conferences and were excluded. From the remaining 35 publications, 26 reached a score of 5 or higher in the MIS. Adequate controls were used in 13 of these publications. All of them described specific effects of homeopathic preparations. The publication quality still varied: a substantial number of publications (23%) did not adequately document the methods used. Four reported on replication trials. One replication trial found effects of homeopathic preparations comparable to the original study. Three replication trials failed to confirm the original study but identified possible external influencing factors. Five publications described novel plant-based test systems. Eight trials used systematic negative control experiments to document test system stability. Regarding research design, future trials should implement adequate controls to identify specific effects of homeopathic preparations and include systematic negative control experiments. Further external and internal replication trials, and control of influencing factors, are needed to verify results. Standardised test systems should be developed. The Faculty of Homeopathy.
High prevalence of iodine deficiency in pregnant women living in adequate iodine area.
Mioto, Verônica Carneiro Borges; Monteiro, Ana Carolina de Castro Nassif Gomes; de Camargo, Rosalinda Yossie Asato; Borel, Andréia Rodrigues; Catarino, Regina Maria; Kobayashi, Sergio; Chammas, Maria Cristina; Marui, Suemi
2018-05-01
Iodine deficiency during pregnancy is associated with obstetric and neonatal adverse outcomes. Serum thyroglobulin (sTg) and thyroid volume (TV) are optional tools to urinary iodine concentration (UIC) for defining iodine status. This cross-sectional study aims to evaluate the iodine status of pregnant women living in an iodine-adequate area by spot UIC and its correlation with sTg, TV and thyroid function. Two hundred and seventy-three pregnant women were evaluated at three trimesters. All had no previous thyroid disease, no iodine supplementation and negative thyroperoxidase and thyroglobulin antibodies. Thyroid function and sTg were measured using electrochemiluminescence immunoassays. TV was determined by ultrasonography; UIC was determined using a modified Sandell-Kolthoff method. Median UIC was 146 µg/L, with 52% iodine deficient and only 4% excessive. TSH values were 1.50 ± 0.92, 1.50 ± 0.92 and 1.91 ± 0.96 mIU/L, respectively, in each trimester (P = 0.001). sTg did not change significantly across trimesters, with a median of 11.2 ng/mL, and only 3.3% had values above 40 ng/mL. Mean TV was 9.3 ± 3.4 mL, which positively correlated with body mass index, but not with sTg. Only 4.5% presented with goitre. When pregnant women were categorized as iodine deficient (UIC < 150 µg/L), adequate (≥150 and <250 µg/L) and excessive (≥250 µg/L), sTg, thyroid hormones and TV at each trimester showed no statistical differences. Iodine deficiency was detected frequently in pregnant women living in an iodine-adequate area. sTg concentration and TV did not correlate with UIC. Our observation also demonstrated that the Brazilian salt-iodization programme prevents deficiency, but does not maintain iodine status within adequate and recommended ranges for pregnant women. © 2018 The authors.
Guetterman, Timothy C.; Creswell, John W.; Wittink, Marsha; Barg, Fran K.; Castro, Felipe G.; Dahlberg, Britt; Watkins, Daphne C.; Deutsch, Charles; Gallo, Joseph J.
2017-01-01
Introduction Demand for training in mixed methods is high, with little research on faculty development or assessment in mixed methods. We describe the development of a Self-Rated Mixed Methods Skills Assessment and provide validity evidence. The instrument taps six research domains: “Research question,” “Design/approach,” “Sampling,” “Data collection,” “Analysis,” and “Dissemination.” Respondents are asked to rate their ability to define or explain concepts of mixed methods under each domain, their ability to apply the concepts to problems, and the extent to which they need to improve. Methods We administered the questionnaire to 145 faculty and students using an internet survey. We analyzed descriptive statistics and performance characteristics of the questionnaire using Cronbach’s alpha to assess reliability and an ANOVA that compared a mixed methods experience index with assessment scores to assess criterion-relatedness. Results Internal consistency reliability was high for the total set of items (.95) and adequate (>=.71) for all but one subscale. Consistent with establishing criterion validity, respondents who had more professional experiences with mixed methods (e.g., published a mixed methods paper) rated themselves as more skilled, which was statistically significant across the research domains. Discussion This Self-Rated Mixed Methods Assessment instrument may be a useful tool to assess skills in mixed methods for training programs. It can be applied widely at the graduate and faculty level. For the learner, assessment may lead to enhanced motivation to learn and training focused on self-identified needs. For faculty, the assessment may improve curriculum and course content planning. PMID:28562495
Case-mix groups for VA hospital-based home care.
Smith, M E; Baker, C R; Branch, L G; Walls, R C; Grimes, R M; Karklins, J M; Kashner, M; Burrage, R; Parks, A; Rogers, P
1992-01-01
The purpose of this study is to group hospital-based home care (HBHC) patients homogeneously by their characteristics with respect to cost of care to develop alternative case mix methods for management and reimbursement (allocation) purposes. Six Veterans Affairs (VA) HBHC programs in Fiscal Year (FY) 1986 that maximized patient, program, and regional variation were selected, all of which agreed to participate. All HBHC patients active in each program on October 1, 1987, in addition to all new admissions through September 30, 1988 (FY88), comprised the sample of 874 unique patients. Statistical methods include the use of classification and regression trees (CART software: Statistical Software; Lafayette, CA), analysis of variance, and multiple linear regression techniques. The resulting algorithm is a three-factor model that explains 20% of the cost variance (R2 = 20%, with a cross validation R2 of 12%). Similar classifications such as the RUG-II, which is utilized for VA nursing home and intermediate care, the VA outpatient resource allocation model, and the RUG-HHC, utilized in some states for reimbursing home health care in the private sector, explained less of the cost variance and, therefore, are less adequate for VA home care resource allocation.
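The classification-and-regression-tree step behind the case-mix grouping can be sketched with a modern equivalent: a shallow regression tree for per-patient cost grown with scikit-learn. The patient characteristics and cost model below are hypothetical, and the sketch stands in for, rather than reproduces, the CART analysis cited in the abstract.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(7)
n = 874
# Hypothetical patient characteristics: age, number of dependencies in activities
# of daily living (ADL), and number of skilled visits per month.
X = np.column_stack([
    rng.integers(55, 95, n),          # age
    rng.integers(0, 7, n),            # ADL dependencies
    rng.poisson(4, n),                # visits per month
])
cost = 500 + 300 * X[:, 1] + 150 * X[:, 2] + rng.normal(0, 400, n)

# A shallow tree yields a small number of relatively homogeneous cost groups
# (candidate case-mix classes) defined by simple splitting rules.
tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=50, random_state=0).fit(X, cost)
print(export_text(tree, feature_names=["age", "adl", "visits"]))
print(f"R^2 on the training data: {tree.score(X, cost):.2f}")
```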
Tabelow, Karsten; König, Reinhard; Polzehl, Jörg
2016-01-01
Estimation of learning curves is ubiquitously based on proportions of correct responses within moving trial windows. Thereby, it is tacitly assumed that learning performance is constant within the moving windows, which, however, is often not the case. In the present study we demonstrate that violations of this assumption lead to systematic errors in the analysis of learning curves, and we explored the dependency of these errors on window size, different statistical models, and learning phase. To reduce these errors in the analysis of single-subject data as well as on the population level, we propose adequate statistical methods for the estimation of learning curves and the construction of confidence intervals, trial by trial. Applied to data from an avoidance learning experiment with rodents, these methods revealed performance changes occurring at multiple time scales within and across training sessions which were otherwise obscured in the conventional analysis. Our work shows that the proper assessment of the behavioral dynamics of learning at high temporal resolution can shed new light on specific learning processes, and, thus, allows to refine existing learning concepts. It further disambiguates the interpretation of neurophysiological signal changes recorded during training in relation to learning. PMID:27303809
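The conventional moving-window estimate that the study above criticizes, together with a simple binomial confidence interval per window, can be sketched as follows; the simulated trial-by-trial responses, the window size and the Wilson interval from statsmodels are illustrative assumptions, not the authors' proposed estimator.

```python
import numpy as np
from statsmodels.stats.proportion import proportion_confint

rng = np.random.default_rng(8)
n_trials = 200
# Simulated learning: the probability of a correct response rises across trials.
p_correct = 1 / (1 + np.exp(-(np.arange(n_trials) - 80) / 20))
correct = rng.random(n_trials) < p_correct

window = 20
for start in range(0, n_trials - window + 1, 40):
    block = correct[start:start + window]
    k = block.sum()
    # The window estimate assumes constant performance within the window,
    # which is exactly the assumption questioned in the abstract above.
    lo, hi = proportion_confint(k, window, alpha=0.05, method="wilson")
    print(f"trials {start + 1:3d}-{start + window:3d}: "
          f"proportion correct = {k / window:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```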
Evaluating national cause-of-death statistics: principles and application to the case of China.
Rao, Chalapati; Lopez, Alan D.; Yang, Gonghuan; Begg, Stephen; Ma, Jiemin
2005-01-01
Mortality statistics systems provide basic information on the levels and causes of mortality in populations. Only a third of the world's countries have complete civil registration systems that yield adequate cause-specific mortality data for health policy-making and monitoring. This paper describes the development of a set of criteria for evaluating the quality of national mortality statistics and applies them to China as an example. The criteria cover a range of structural, statistical and technical aspects of national mortality data. Little is known about cause-of-death data in China, which is home to roughly one-fifth of the world's population. These criteria were used to evaluate the utility of data from two mortality statistics systems in use in China, namely the Ministry of Health-Vital Registration (MOH-VR) system and the Disease Surveillance Point (DSP) system. We concluded that mortality registration was incomplete in both. No statistics were available for geographical subdivisions of the country to inform resource allocation or for the monitoring of health programmes. Compilation and publication of statistics is irregular in the case of the DSP, and they are not made publicly available at all by the MOH-VR. More research is required to measure the content validity of cause-of-death attribution in the two systems, especially due to the use of verbal autopsy methods in rural areas. This framework of criteria-based evaluation is recommended for the evaluation of national mortality data in developing countries to determine their utility and to guide efforts to improve their value for guiding policy. PMID:16184281
25 CFR 700.267 - Disclosure of records.
Code of Federal Regulations, 2010 CFR
2010-04-01
... system in which the record is maintained with advance adequate written assurance that the record will be used solely as a statistical research or reporting record, and the record is to be transferred in a...
28 CFR 512.15 - Access to Bureau of Prisons records.
Code of Federal Regulations, 2010 CFR
2010-07-01
... may receive records in a form not individually identifiable when advance adequate written assurance that the record will be used solely as a statistical research or reporting record is provided to the...
43 CFR 2.56 - Disclosure of records.
Code of Federal Regulations, 2010 CFR
2010-10-01
... responsible for the system in which the record is maintained with advance adequate written assurance that the record will be used solely as a statistical research or reporting record, and the record is to be...
4 CFR 83.4 - Conditions of disclosure.
Code of Federal Regulations, 2010 CFR
2010-01-01
...); or (d) To a recipient who has provided GAO with advance adequate written assurance that the record will be used solely as a statistical research or reporting record, and the record is to be transferred...
Afonso, P. Diana; Vinson, Emily N.; Turnbull, James D.; Morris, Karla K.; Foye, Adam; Madden, John F.; Roy Choudhury, Kingshuk; Febbo, Phillip G.; George, Daniel J.
2013-01-01
Purpose To determine the rate at which computed tomographically guided pelvic percutaneous bone biopsy in men with metastatic castration-resistant prostate cancer (mCRPC) yields adequate tissue for genomic profiling and to identify issues likely to affect diagnostic yields. Materials and Methods This study was institutional review board approved, and written informed consent was obtained. In a phase II trial assessing response to everolimus, 31 men with mCRPC underwent 54 biopsy procedures (eight men before and 23 men both before and during treatment). Variables assessed were lesion location (iliac wing adjacent to sacroiliac joint, iliac wing anterior and/or superior to sacroiliac joint, sacrum, and remainder of pelvis), mean lesion attenuation, subjective lesion attenuation (purely sclerotic vs mixed), central versus peripheral lesion sampling, lesion size, core number, and use of zoledronic acid for more than 1 year. Results Of 54 biopsy procedures, 21 (39%) yielded adequate tissue for RNA isolation and genomic profiling. Three of four sacral biopsies were adequate. Biopsies of the ilium adjacent to the sacroiliac joints were more likely adequate than those from elsewhere in the ilium (48% vs 28%, respectively). All five biopsies performed in other pelvic locations yielded inadequate tissue for RNA isolation. Mean attenuation of lesions with inadequate tissue was 172 HU greater than those with adequate tissue (621.1 HU ± 166 vs 449 HU ± 221, respectively; P = .002). Use of zoledronic acid, peripheral sampling, core number, and lesion size affected yields, but the differences were not statistically significant. Histologic examination with hematoxylin-eosin staining showed that results of 36 (67%) biopsies were positive for cancer; only mean attenuation differences were significant (707 HU ± 144 vs 473 HU ± 191, negative vs positive, respectively; P < .001). Conclusion In men with mCRPC, percutaneous sampling of osseous metastases for genomic profiling is possible, but use of zoledronic acid for more than 1 year may reduce the yield of adequate tissue for RNA isolation. Sampling large low-attenuating lesions at their periphery maximizes yield. © RSNA, 2013 PMID:23925271
Sociodemographic factors associated with pregnant women's level of knowledge about oral health
Barbieri, Wander; Peres, Stela Verzinhasse; Pereira, Carla de Britto; Peres, João; de Sousa, Maria da Luz Rosário; Cortellazzi, Karine Laura
2018-01-01
ABSTRACT Objective To evaluate knowledge on oral health and associated sociodemographic factors in pregnant women. Methods A cross-sectional study with a sample of 195 pregnant women seen at the Primary Care Unit Paraisópolis I, in São Paulo (SP), Brazil. For statistical analysis, χ2 or Fisher's exact test and multiple logistic regression were used. A significance level of 5% was used in all analyses. Results Schooling level equal to or greater than 8 years and having one or two children were associated with an adequate knowledge about oral health. Conclusion Oral health promotion strategies during prenatal care should take into account sociodemographic aspects. PMID:29694612
Horsch, Alexander; Hapfelmeier, Alexander; Elter, Matthias
2011-11-01
Breast cancer is globally a major threat for women's health. Screening and adequate follow-up can significantly reduce the mortality from breast cancer. Human second reading of screening mammograms can increase breast cancer detection rates, whereas this has not been proven for current computer-aided detection systems as "second reader". Critical factors include the detection accuracy of the systems and the screening experience and training of the radiologist with the system. When assessing the performance of systems and system components, the choice of evaluation methods is particularly critical. Core assets herein are reference image databases and statistical methods. We have analyzed characteristics and usage of the currently largest publicly available mammography database, the Digital Database for Screening Mammography (DDSM) from the University of South Florida, in literature indexed in Medline, IEEE Xplore, SpringerLink, and SPIE, with respect to type of computer-aided diagnosis (CAD) (detection, CADe, or diagnostics, CADx), selection of database subsets, choice of evaluation method, and quality of descriptions. 59 publications presenting 106 evaluation studies met our selection criteria. In 54 studies (50.9%), the selection of test items (cases, images, regions of interest) extracted from the DDSM was not reproducible. Only 2 CADx studies, not any CADe studies, used the entire DDSM. The number of test items varies from 100 to 6000. Different statistical evaluation methods are chosen. Most common are train/test (34.9% of the studies), leave-one-out (23.6%), and N-fold cross-validation (18.9%). Database-related terminology tends to be imprecise or ambiguous, especially regarding the term "case". Overall, both the use of the DDSM as data source for evaluation of mammography CAD systems, and the application of statistical evaluation methods were found highly diverse. Results reported from different studies are therefore hardly comparable. Drawbacks of the DDSM (e.g. varying quality of lesion annotations) may contribute to the reasons. But larger bias seems to be caused by authors' own decisions upon study design. RECOMMENDATIONS/CONCLUSION: For future evaluation studies, we derive a set of 13 recommendations concerning the construction and usage of a test database, as well as the application of statistical evaluation methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singh, Gyanender P.; Gonczy, Steve T.; Deck, Christian P.
An interlaboratory round robin study was conducted on the tensile strength of SiC–SiC ceramic matrix composite (CMC) tubular test specimens at room temperature with the objective of expanding the database of mechanical properties of nuclear grade SiC–SiC and establishing the precision and bias statement for standard test method ASTM C1773. The mechanical properties statistics from the round robin study and the precision statistics and precision statement are presented herein. The data show reasonable consistency across the laboratories, indicating that the current C1773–13 ASTM standard is adequate for testing ceramic fiber reinforced ceramic matrix composite tubular test specimens. Furthermore, it was found that the distribution of ultimate tensile strength data was best described with a two–parameter Weibull distribution, while a lognormal distribution provided a good description of the distribution of proportional limit stress data.
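The abstract reports that a two-parameter Weibull distribution best described the ultimate tensile strength data, with a lognormal fit for proportional limit stress. A minimal sketch of such distribution fitting, using made-up strength values and SciPy in place of whatever software the study actually used, could be:

```python
# Hedged sketch: fitting a two-parameter Weibull and a lognormal to
# ultimate tensile strength data (values below are made up for illustration).
import numpy as np
from scipy import stats

uts = np.array([212., 225., 231., 240., 244., 250., 255., 263., 270., 281.])  # MPa, hypothetical

# floc=0 constrains each fit to two parameters (no shift), as in the study.
shape, loc, scale = stats.weibull_min.fit(uts, floc=0)
print("Weibull shape (modulus):", shape, "scale:", scale)

sigma, loc_ln, scale_ln = stats.lognorm.fit(uts, floc=0)
print("lognormal sigma:", sigma, "scale:", scale_ln)

# One simple way to compare the candidate distributions is the KS statistic.
print("KS Weibull:  ", stats.kstest(uts, "weibull_min", args=(shape, 0, scale)).statistic)
print("KS lognormal:", stats.kstest(uts, "lognorm", args=(sigma, 0, scale_ln)).statistic)
```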
Tukiendorf, Andrzej; Mansournia, Mohammad Ali; Wydmański, Jerzy; Wolny-Rokicka, Edyta
2017-04-01
Background: Clinical datasets for epithelial ovarian cancer brain metastatic patients are usually small in size. When adequate case numbers are lacking, resulting estimates of regression coefficients may demonstrate bias. One of the direct approaches to reduce such sparse-data bias is based on penalized estimation. Methods: A re-analysis of formerly reported hazard ratios in diagnosed patients was performed using penalized Cox regression with a popular SAS package, with additional software code provided for the statistical computation. Results: It was found that the penalized approach can readily diminish sparse-data artefacts and radically reduce the magnitude of estimated regression coefficients. Conclusions: It was confirmed that classical statistical approaches may exaggerate regression estimates or distort study interpretations and conclusions. The results support the thesis that penalization via weakly informative priors and data augmentation are the safest approaches to shrink sparse-data artefacts frequently occurring in epidemiological research. Creative Commons Attribution License
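The study above applies penalized Cox regression to shrink sparse-data artefacts. The original work used a SAS package; a roughly analogous ridge-penalized Cox fit in Python (lifelines), with hypothetical column names, might look like:

```python
# Hedged sketch: ridge-penalized Cox regression to shrink sparse-data artefacts.
# The original analysis used SAS; this is a rough Python analogue with lifelines.
# Column names (time, event, plus covariates) are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ovarian_brain_mets.csv")   # one row per patient

unpenalized = CoxPHFitter().fit(df, duration_col="time", event_col="event")
penalized = CoxPHFitter(penalizer=0.5).fit(df, duration_col="time", event_col="event")

# With few events, the penalized hazard ratios are pulled toward 1,
# which is the shrinkage effect described in the abstract.
print(unpenalized.hazard_ratios_)
print(penalized.hazard_ratios_)
```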
Williams, Donald R; Carlsson, Rickard; Bürkner, Paul-Christian
2017-10-01
Developmental studies of hormones and behavior often include littermates, i.e., rodent siblings that share early-life experiences and genes. Due to between-litter variation (i.e., litter effects), the statistical assumption of independent observations is untenable. In two literatures (natural variation in maternal care and prenatal stress), entire litters are categorized based on maternal behavior or experimental condition. Here, we (1) review both literatures; (2) simulate false positive rates for commonly used statistical methods in each literature; and (3) characterize small sample performance of multilevel models (MLM) and generalized estimating equations (GEE). We found that the assumption of independence was routinely violated (>85%), false positives (α=0.05) exceeded nominal levels (up to 0.70), and power (1-β) rarely surpassed 0.80 (even for optimistic sample and effect sizes). Additionally, we show that MLMs and GEEs have adequate performance for common research designs. We discuss implications for the extant literature and the field of behavioral neuroendocrinology, and provide recommendations. Copyright © 2017 Elsevier Inc. All rights reserved.
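The review above simulates false positive rates when litter effects are ignored and reports that multilevel models perform adequately. A minimal sketch of that idea, with simulated data and a random intercept per litter (all values illustrative), could be:

```python
# Hedged sketch: why litter effects inflate false positives, and a multilevel fix.
# Simulated data; treatment is assigned at the litter level, as in the reviewed designs.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_litters, pups_per_litter = 10, 6
litter = np.repeat(np.arange(n_litters), pups_per_litter)
treatment = np.repeat(rng.integers(0, 2, n_litters), pups_per_litter)
litter_effect = np.repeat(rng.normal(0, 1.0, n_litters), pups_per_litter)  # shared within litter
y = 0.0 * treatment + litter_effect + rng.normal(0, 1.0, n_litters * pups_per_litter)

df = pd.DataFrame({"y": y, "treatment": treatment, "litter": litter})

# Multilevel model: a random intercept per litter restores valid inference
# that a pup-level t-test (ignoring clustering) would not provide.
mlm = smf.mixedlm("y ~ treatment", df, groups=df["litter"]).fit()
print(mlm.summary())
```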
Optimized design and analysis of preclinical intervention studies in vivo
Laajala, Teemu D.; Jumppanen, Mikael; Huhtaniemi, Riikka; Fey, Vidal; Kaur, Amanpreet; Knuuttila, Matias; Aho, Eija; Oksala, Riikka; Westermarck, Jukka; Mäkelä, Sari; Poutanen, Matti; Aittokallio, Tero
2016-01-01
Recent reports have called into question the reproducibility, validity and translatability of the preclinical animal studies due to limitations in their experimental design and statistical analysis. To this end, we implemented a matching-based modelling approach for optimal intervention group allocation, randomization and power calculations, which takes full account of the complex animal characteristics at baseline prior to interventions. In prostate cancer xenograft studies, the method effectively normalized the confounding baseline variability, and resulted in animal allocations which were supported by RNA-seq profiling of the individual tumours. The matching information increased the statistical power to detect true treatment effects at smaller sample sizes in two castration-resistant prostate cancer models, thereby leading to saving of both animal lives and research costs. The novel modelling approach and its open-source and web-based software implementations enable the researchers to conduct adequately-powered and fully-blinded preclinical intervention studies, with the aim to accelerate the discovery of new therapeutic interventions. PMID:27480578
Optimized design and analysis of preclinical intervention studies in vivo.
Laajala, Teemu D; Jumppanen, Mikael; Huhtaniemi, Riikka; Fey, Vidal; Kaur, Amanpreet; Knuuttila, Matias; Aho, Eija; Oksala, Riikka; Westermarck, Jukka; Mäkelä, Sari; Poutanen, Matti; Aittokallio, Tero
2016-08-02
Recent reports have called into question the reproducibility, validity and translatability of the preclinical animal studies due to limitations in their experimental design and statistical analysis. To this end, we implemented a matching-based modelling approach for optimal intervention group allocation, randomization and power calculations, which takes full account of the complex animal characteristics at baseline prior to interventions. In prostate cancer xenograft studies, the method effectively normalized the confounding baseline variability, and resulted in animal allocations which were supported by RNA-seq profiling of the individual tumours. The matching information increased the statistical power to detect true treatment effects at smaller sample sizes in two castration-resistant prostate cancer models, thereby leading to saving of both animal lives and research costs. The novel modelling approach and its open-source and web-based software implementations enable the researchers to conduct adequately-powered and fully-blinded preclinical intervention studies, with the aim to accelerate the discovery of new therapeutic interventions.
Nurses' foot care activities in home health care.
Stolt, Minna; Suhonen, Riitta; Puukka, Pauli; Viitanen, Matti; Voutilainen, Päivi; Leino-Kilpi, Helena
2013-01-01
This study described the basic foot care activities performed by nurses and factors associated with these in the home care of older people. Data were collected from nurses (n=322) working in nine public home care agencies in Finland using the Nurses' Foot Care Activities Questionnaire (NFAQ). Data were analyzed statistically using descriptive statistics and multivariate linear models. Although some of the basic foot care activities nurses reported using were outdated, the majority of foot care activities were consistent with recommendations in the foot care literature. Longer working experience, referring patients with foot problems to a podiatrist and physiotherapist, and patient education in wart and nail care were associated with a high score for adequate foot care activities. Continuing education should focus on updating basic foot care activities and increasing the use of evidence-based foot care methods. Also, geriatric nursing research should focus on intervention research to improve the use of evidence-based basic foot care activities. Copyright © 2013 Mosby, Inc. All rights reserved.
Singh, Gyanender P.; Gonczy, Steve T.; Deck, Christian P.; ...
2018-04-19
An interlaboratory round robin study was conducted on the tensile strength of SiC–SiC ceramic matrix composite (CMC) tubular test specimens at room temperature with the objective of expanding the database of mechanical properties of nuclear grade SiC–SiC and establishing the precision and bias statement for standard test method ASTM C1773. The mechanical properties statistics from the round robin study and the precision statistics and precision statement are presented herein. The data show reasonable consistency across the laboratories, indicating that the current C1773–13 ASTM standard is adequate for testing ceramic fiber reinforced ceramic matrix composite tubular test specimens. Furthermore, it was found that the distribution of ultimate tensile strength data was best described with a two–parameter Weibull distribution, while a lognormal distribution provided a good description of the distribution of proportional limit stress data.
Goudouri, Ourania-Menti; Kontonasaki, Eleana; Papadopoulou, Lambrini; Manda, Marianthi; Kavouras, Panagiotis; Triantafyllidis, Konstantinos S; Stefanidou, Maria; Koidis, Petros; Paraskevopoulos, Konstantinos M
2017-02-01
The aim of this study was the evaluation of the textural characteristics of an experimental sol-gel derived feldspathic dental ceramic, which has already been proven bioactive, and the investigation of its flexural strength through Weibull statistical analysis. The null hypothesis was that the flexural strength of the experimental and the commercial dental ceramic would be of the same order, resulting in a dental ceramic with apatite-forming ability and adequate mechanical integrity. Although the flexural strength of the experimental ceramic was not statistically significantly different from that of the commercial one, the amount of blind pores due to processing was greater. The textural characteristics of the experimental ceramic were in accordance with the standard low porosity levels reported for dental ceramics used for fixed prosthetic restorations. Feldspathic dental ceramics with typical textural characteristics and advanced mechanical properties as well as enhanced apatite-forming ability can be synthesized through the sol-gel method. Copyright © 2016 Elsevier Ltd. All rights reserved.
Ward, P. J.
1990-01-01
Recent developments have related quantitative trait expression to metabolic flux. The present paper investigates some implications of this for statistical aspects of polygenic inheritance. Expressions are derived for the within-sibship genetic mean and genetic variance of metabolic flux given a pair of parental, diploid, n-locus genotypes. These are exact and hold for arbitrary numbers of gene loci, arbitrary allelic values at each locus, and for arbitrary recombination fractions between adjacent gene loci. The within-sibship genetic variance is seen to be simply a measure of parental heterozygosity plus a measure of the degree of linkage coupling within the parental genotypes. Approximations are given for the within-sibship phenotypic mean and variance of metabolic flux. These results are applied to the problem of attaining adequate statistical power in a test of association between allozymic variation and inter-individual variation in metabolic flux. Simulations indicate that statistical power can be greatly increased by augmenting the data with predictions and observations on progeny statistics in relation to parental allozyme genotypes. Adequate power may thus be attainable at small sample sizes, and when allozymic variation is scored at only a small fraction of the total set of loci whose catalytic products determine the flux. PMID:2379825
Multivariate model of female black bear habitat use for a Geographic Information System
Clark, Joseph D.; Dunn, James E.; Smith, Kimberly G.
1993-01-01
Simple univariate statistical techniques may not adequately assess the multidimensional nature of habitats used by wildlife. Thus, we developed a multivariate method to model habitat-use potential using a set of female black bear (Ursus americanus) radio locations and habitat data consisting of forest cover type, elevation, slope, aspect, distance to roads, distance to streams, and forest cover type diversity score in the Ozark Mountains of Arkansas. The model is based on the Mahalanobis distance statistic coupled with Geographic Information System (GIS) technology. That statistic is a measure of dissimilarity and represents a standardized squared distance between a set of sample variates and an ideal based on the mean of variates associated with animal observations. Calculations were made with the GIS to produce a map containing Mahalanobis distance values within each cell on a 60- × 60-m grid. The model identified areas of high habitat use potential that could not otherwise be identified by independent perusal of any single map layer. This technique avoids many pitfalls that commonly affect typical multivariate analyses of habitat use and is a useful tool for habitat manipulation or mitigation to favor terrestrial vertebrates that use habitats on a landscape scale.
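The model above scores each grid cell by its Mahalanobis distance from the mean habitat conditions at animal locations. A minimal sketch of that calculation, with hypothetical arrays standing in for the GIS layers, might be:

```python
# Hedged sketch: Mahalanobis-distance habitat suitability surface.
# `used` holds habitat variables at animal locations; `grid` holds the same
# variables for every raster cell (variable names and values are hypothetical).
import numpy as np

def mahalanobis_map(used, grid):
    mu = used.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(used, rowvar=False))
    diff = grid - mu
    # Squared Mahalanobis distance per cell; smaller = more similar to used habitat.
    return np.einsum("ij,jk,ik->i", diff, cov_inv, diff)

used = np.random.default_rng(1).normal(size=(200, 7))   # e.g. elevation, slope, aspect, ...
grid = np.random.default_rng(2).normal(size=(5000, 7))  # one row per 60 m x 60 m cell
d2 = mahalanobis_map(used, grid)
```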
Accounting for measurement error: a critical but often overlooked process.
Harris, Edward F; Smith, Richard N
2009-12-01
Due to instrument imprecision and human inconsistencies, measurements are not free of error. Technical error of measurement (TEM) is the variability encountered between dimensions when the same specimens are measured at multiple sessions. A goal of a data collection regimen is to minimise TEM. The few studies that actually quantify TEM, regardless of discipline, report that it is substantial and can affect results and inferences. This paper reviews some statistical approaches for identifying and controlling TEM. Statistically, TEM is part of the residual ('unexplained') variance in a statistical test, so accounting for TEM, which requires repeated measurements, enhances the chances of finding a statistically significant difference if one exists. The aim of this paper was to review and discuss common statistical designs relating to types of error and statistical approaches to error accountability. This paper addresses issues of landmark location, validity, technical and systematic error, analysis of variance, scaled measures and correlation coefficients in order to guide the reader towards correct identification of true experimental differences. Researchers commonly infer characteristics about populations from comparatively restricted study samples. Most inferences are statistical and, aside from concerns about adequate accounting for known sources of variation with the research design, an important source of variability is measurement error. Variability in locating landmarks that define variables is obvious in odontometrics, cephalometrics and anthropometry, but the same concerns about measurement accuracy and precision extend to all disciplines. With increasing accessibility to computer-assisted methods of data collection, the ease of incorporating repeated measures into statistical designs has improved. Accounting for this technical source of variation increases the chance of finding biologically true differences when they exist.
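The paper above discusses technical error of measurement (TEM) from repeated measurements. A small worked sketch using the standard duplicate-measurement (Dahlberg-type) formula, with hypothetical values, could be:

```python
# Hedged sketch: technical error of measurement (TEM) from duplicate measurements.
# TEM = sqrt( sum(d_i^2) / (2n) ) for n specimens each measured twice;
# relative TEM expresses it as a percentage of the mean dimension.
import numpy as np

session1 = np.array([10.2, 11.5, 9.8, 12.1, 10.9])   # first measurement, mm (hypothetical)
session2 = np.array([10.4, 11.3, 9.9, 12.4, 10.7])   # repeat measurement, mm

d = session1 - session2
tem = np.sqrt(np.sum(d**2) / (2 * len(d)))
rel_tem = 100 * tem / np.mean(np.concatenate([session1, session2]))
print(f"TEM = {tem:.3f} mm, relative TEM = {rel_tem:.1f}%")
```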
46 CFR 503.61 - Conditions of disclosure.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 13 U.S.C.; (5) To a recipient who has provided the Commission with adequate advance written assurance that the record will be used solely as a statistical research or reporting record, and the record is to...
42 CFR 412.22 - Excluded hospitals and hospital units: General rules.
Code of Federal Regulations, 2010 CFR
2010-10-01
... must meet the governance and control requirements at paragraphs (e)(1)(i) through (e)(1)(iv) of this... allocates costs and maintains adequate statistical data to support the basis of allocation. (G) It reports...
Radioactivity measurement of radioactive contaminated soil by using a fiber-optic radiation sensor
NASA Astrophysics Data System (ADS)
Joo, Hanyoung; Kim, Rinah; Moon, Joo Hyun
2016-06-01
A fiber-optic radiation sensor (FORS) was developed to measure the gamma radiation from radioactive contaminated soil. The FORS was fabricated using an inorganic scintillator (Lu,Y)2SiO5:Ce (LYSO:Ce), a mixture of epoxy resin and hardener, aluminum foil, and a plastic optical fiber. Before its real application, the FORS was tested to determine whether it performed adequately. The test results showed that the measurements by the FORS adequately followed the theoretically estimated values. The FORS was then applied to measure the gamma radiation from radioactive contaminated soil. For comparison, a commercial radiation detector was also applied to measure the same soil samples. The measurement data were analyzed using a statistical parameter, the critical level, to determine whether net radioactivity statistically different from background was present in the soil sample. The analysis showed that the soil sample had radioactivity distinguishable from background.
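The abstract mentions comparing measurements against a critical level to decide whether net radioactivity above background is present. One possible formulation of such a decision threshold (a Currie-style critical level; the paper does not state which expression was used) is sketched below with hypothetical counts:

```python
# Hedged sketch: Currie-style critical level for deciding whether a net count
# is statistically distinguishable from background. This is one common
# formulation, assumed here for illustration only.
import math

def critical_level(background_counts, k=1.645):
    # For gross-minus-background counting with equal counting times, the
    # standard deviation of the net count under H0 is sqrt(2*B), so L_C = k*sqrt(2*B).
    return k * math.sqrt(2.0 * background_counts)

B = 400                          # hypothetical background counts
net = 560 - B                    # gross counts minus background
print(net > critical_level(B))   # True -> radioactivity above background
```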
SU-F-T-212: A Comparison of Treatment Strategies for Intracranial Stereotactic Radiosurgery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lamberton, T; Slater, J; Wroe, A
2016-06-15
Purpose: Stereotactic radiosurgery is an effective and noninvasive treatment for intracranial lesions that uses highly focused radiation beams in a single treatment fraction. The purpose of this study is to investigate the dosimetric differences between treating brain metastases with a proton beam versus intensity-modulated radiation therapy (IMRT). Methods: Ten separate brain metastasis targets were chosen and treatment plans were created for each, using three different strategies: custom proton beam shaping devices, standardized proton beam shaping devices, and IMRT. Each plan was required to satisfy set parameters for providing adequate coverage and minimizing risk to adjacent tissues. The effectiveness of each plan was calculated by comparing the homogeneity index, conformity index, and V12 for each target using a paired one-tailed t-test (α=0.05). Specific comparison of the conformity indices was also made using a subcategory containing targets with volume>1cc. Results: There was no significant difference between the homogeneity indices of the three plans (p>0.05), showing that each plan has the capability of adequately covering the targets. There was a statistically significant difference (p<0.01) between the conformity indices of the custom and the standard proton plan, as with the custom proton and IMRT (p<0.01), with custom proton showing stronger conformity to the target in both cases. There was also a statistical difference between the V12 of all three plans (Custom v. Standardized: p=0.02, Custom v. IMRT: p<0.01, Standardized v. IMRT: p<0.01) with custom proton supplying the lowest dose to surrounding tissues. For large targets (volume>1cc) there was no statistical difference between the proton plans and the IMRT treatment for the conformity index. Conclusion: A custom proton plan is the recommended treatment explored in this study as it is the most reliable way of effectively treating the target while sparing the maximum amount of normal tissue.
Cervical vertebral maturation as a biologic indicator of skeletal maturity.
Santiago, Rodrigo César; de Miranda Costa, Luiz Felipe; Vitral, Robert Willer Farinazzo; Fraga, Marcelo Reis; Bolognese, Ana Maria; Maia, Lucianne Cople
2012-11-01
To identify and review the literature regarding the reliability of cervical vertebrae maturation (CVM) staging to predict the pubertal spurt. The selection criteria included cross-sectional and longitudinal descriptive studies in humans that evaluated qualitatively or quantitatively the accuracy and reproducibility of the CVM method on lateral cephalometric radiographs, as well as the correlation with a standard method established by hand-wrist radiographs. The searches retrieved 343 unique citations. Twenty-three studies met the inclusion criteria. Six articles had moderate to high scores, while 17 of 23 had low scores. Analysis also showed a moderate to high statistically significant correlation between CVM and hand-wrist maturation methods. There was a moderate to high reproducibility of the CVM method, and only one specific study investigated the accuracy of the CVM index in detecting peak pubertal growth. This systematic review has shown that the studies on CVM method for radiographic assessment of skeletal maturation stages suffer from serious methodological failures. Better-designed studies with adequate accuracy, reproducibility, and correlation analysis, including studies with appropriate sensitivity-specificity analysis, should be performed.
Bala, D V; Vyas, S; Shukla, A; Tiwari, H; Bhatt, G; Gupta, K
2012-07-01
This study compared the validity of the haemoglobin colour scale (HCS) and clinical signs in diagnosing anaemia against Sahli's haemoglobinometer method as the gold standard, and assessed the reliability of HCS. The sample comprised 129 pregnant women recruited from 6 urban health centres in Ahmedabad. The prevalence of anaemia was 69.8% by Sahli's method, 78.3% by HCS and 89.9% by clinical signs; there was no statistically significant difference between Sahli's method and HCS, whereas there was between Sahli's method and clinical signs. The mean haemoglobin level by Sahli's method and HCS differed significantly. The sensitivity, specificity, positive predictive value and negative predictive value of HCS were 83.3%, 33.3%, 74.3% and 46.4% respectively, and those of clinical signs were 91.1%, 12.8%, 70.7% and 38.5% respectively. Interobserver agreement for HCS was moderate (kappa = 0.43). Clinical signs are better than HCS for diagnosing anaemia. HCS can be used in the field provided assessors are adequately trained.
ERIC Educational Resources Information Center
Cairney, John; Streiner, David L.
2011-01-01
Although statistics such as kappa and phi are commonly used to assess agreement between tests, in situations where the base rate of a disorder in a population is low or high, these statistics tend to underestimate actual agreement. This can occur even if the tests are good and the classification of subjects is adequate. Relative improvement over…
Song, Fujian; Xiong, Tengbin; Parekh-Bhurke, Sheetal; Loke, Yoon K; Sutton, Alex J; Eastwood, Alison J; Holland, Richard; Chen, Yen-Fu; Glenny, Anne-Marie; Deeks, Jonathan J; Altman, Doug G
2011-08-16
To investigate the agreement between direct and indirect comparisons of competing healthcare interventions. Meta-epidemiological study based on sample of meta-analyses of randomised controlled trials. Data sources Cochrane Database of Systematic Reviews and PubMed. Inclusion criteria Systematic reviews that provided sufficient data for both direct comparison and independent indirect comparisons of two interventions on the basis of a common comparator and in which the odds ratio could be used as the outcome statistic. Inconsistency measured by the difference in the log odds ratio between the direct and indirect methods. The study included 112 independent trial networks (including 1552 trials with 478,775 patients in total) that allowed both direct and indirect comparison of two interventions. Indirect comparison had already been explicitly done in only 13 of the 85 Cochrane reviews included. The inconsistency between the direct and indirect comparison was statistically significant in 16 cases (14%, 95% confidence interval 9% to 22%). The statistically significant inconsistency was associated with fewer trials, subjectively assessed outcomes, and statistically significant effects of treatment in either direct or indirect comparisons. Owing to considerable inconsistency, many (14/39) of the statistically significant effects by direct comparison became non-significant when the direct and indirect estimates were combined. Significant inconsistency between direct and indirect comparisons may be more prevalent than previously observed. Direct and indirect estimates should be combined in mixed treatment comparisons only after adequate assessment of the consistency of the evidence.
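The study above quantifies inconsistency as the difference between direct and indirect log odds ratios obtained through a common comparator. A small worked sketch of that calculation, using hypothetical meta-analytic estimates, might be:

```python
# Hedged sketch: inconsistency between direct and indirect comparisons.
# With a common comparator C, the indirect estimate of A vs B is
# log OR_ind = log OR(A vs C) - log OR(B vs C), with variances adding.
# All numbers below are hypothetical.
import numpy as np
from scipy import stats

log_or_AC, se_AC = 0.40, 0.15
log_or_BC, se_BC = 0.10, 0.20
log_or_AB_direct, se_AB_direct = 0.55, 0.18

log_or_AB_indirect = log_or_AC - log_or_BC
se_indirect = np.sqrt(se_AC**2 + se_BC**2)

# Inconsistency = difference between the direct and indirect log odds ratios.
diff = log_or_AB_direct - log_or_AB_indirect
se_diff = np.sqrt(se_AB_direct**2 + se_indirect**2)
z = diff / se_diff
p = 2 * (1 - stats.norm.cdf(abs(z)))
print(f"inconsistency = {diff:.2f} (z = {z:.2f}, p = {p:.3f})")
```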
Data on the migration of health-care workers: sources, uses, and challenges.
Diallo, Khassoum
2004-01-01
The migration of health workers within and between countries is a growing concern worldwide because of its impact on health systems in developing and developed countries alike. Policy decisions need to be made at the national, regional and international levels to manage more effectively this phenomenon, but those decisions will be effective and correctly implemented and evaluated only if they are based on adequate statistical data. Most statistics on the migration of health-care workers are neither complete nor fully comparable, and they are often underused, limited (because they often give only a broad description of the phenomena) and not as timely as required. There is also a conflict between the wide range of potential sources of data and the poor statistical evidence on the migration of health personnel. There are two major problems facing researchers who wish to provide evidence on this migration: the problems commonly faced when studying migration in general, such as definitional and comparability problems of "worker migrations" and those related to the specific movements of the health workforce. This paper presents information on the uses of statistics and those who use them, the strengths and limitations of the main data sources, and other challenges that need to be met to obtain good evidence on the migration of health workers. This paper also proposes methods to improve the collection, analysis, sharing, and use of statistics on the migration of health workers. PMID:15375450
Xiong, Tengbin; Parekh-Bhurke, Sheetal; Loke, Yoon K; Sutton, Alex J; Eastwood, Alison J; Holland, Richard; Chen, Yen-Fu; Glenny, Anne-Marie; Deeks, Jonathan J; Altman, Doug G
2011-01-01
Objective To investigate the agreement between direct and indirect comparisons of competing healthcare interventions. Design Meta-epidemiological study based on sample of meta-analyses of randomised controlled trials. Data sources Cochrane Database of Systematic Reviews and PubMed. Inclusion criteria Systematic reviews that provided sufficient data for both direct comparison and independent indirect comparisons of two interventions on the basis of a common comparator and in which the odds ratio could be used as the outcome statistic. Main outcome measure Inconsistency measured by the difference in the log odds ratio between the direct and indirect methods. Results The study included 112 independent trial networks (including 1552 trials with 478 775 patients in total) that allowed both direct and indirect comparison of two interventions. Indirect comparison had already been explicitly done in only 13 of the 85 Cochrane reviews included. The inconsistency between the direct and indirect comparison was statistically significant in 16 cases (14%, 95% confidence interval 9% to 22%). The statistically significant inconsistency was associated with fewer trials, subjectively assessed outcomes, and statistically significant effects of treatment in either direct or indirect comparisons. Owing to considerable inconsistency, many (14/39) of the statistically significant effects by direct comparison became non-significant when the direct and indirect estimates were combined. Conclusions Significant inconsistency between direct and indirect comparisons may be more prevalent than previously observed. Direct and indirect estimates should be combined in mixed treatment comparisons only after adequate assessment of the consistency of the evidence. PMID:21846695
Optimal sample sizes for the design of reliability studies: power consideration.
Shieh, Gwowen
2014-09-01
Intraclass correlation coefficients are used extensively to measure the reliability or degree of resemblance among group members in multilevel research. This study concerns the problem of the necessary sample size to ensure adequate statistical power for hypothesis tests concerning the intraclass correlation coefficient in the one-way random-effects model. In view of the incomplete and problematic numerical results in the literature, the approximate sample size formula constructed from Fisher's transformation is reevaluated and compared with an exact approach across a wide range of model configurations. These comprehensive examinations showed that the Fisher transformation method is appropriate only under limited circumstances, and therefore it is not recommended as a general method in practice. For advance design planning of reliability studies, the exact sample size procedures are fully described and illustrated for various allocation and cost schemes. Corresponding computer programs are also developed to implement the suggested algorithms.
Sampling studies to estimate the HIV prevalence rate in female commercial sex workers.
Pascom, Ana Roberta Pati; Szwarcwald, Célia Landmann; Barbosa Júnior, Aristides
2010-01-01
We investigated sampling methods used to estimate the HIV prevalence rate among female commercial sex workers. The studies were classified according to whether the sample size was adequate to estimate the HIV prevalence rate and according to the sampling method (probabilistic or convenience). We identified 75 studies that estimated the HIV prevalence rate among female sex workers. Most of the studies employed convenience samples. The sample size was not adequate to estimate the HIV prevalence rate in 35 studies. The use of convenience samples limits statistical inference for the whole group. There has been an increase in the number of published studies since 2005, as well as in the number of studies that used probabilistic samples. This represents a large advance in the monitoring of risk behavior practices and the HIV prevalence rate in this group.
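The review above classifies studies by whether their sample size was adequate for estimating a prevalence. One standard planning formula for a prevalence estimate with a given absolute precision (before any design-effect or finite-population correction; the review does not state which criterion it applied) is sketched below:

```python
# Hedged sketch: minimum sample size to estimate a prevalence p with
# absolute precision d, n = z^2 * p*(1-p) / d^2. Example values are hypothetical.
import math

def sample_size(p_expected, d, z=1.96):
    return math.ceil(z**2 * p_expected * (1 - p_expected) / d**2)

# e.g. expected prevalence 5%, desired precision +/- 2 percentage points
print(sample_size(0.05, 0.02))   # 457
```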
Streamflow characteristics related to channel geometry of streams in western United States
Hedman, E.R.; Osterkamp, W.R.
1982-01-01
Assessment of surface-mining and reclamation activities generally requires extensive hydrologic data. Adequate streamflow data from instrumented gaging stations rarely are available, and estimates of surface-water discharge based on rainfall-runoff models, drainage area, and basin characteristics sometimes have proven unreliable. Channel-geometry measurements offer an alternative method of quickly and inexpensively estimating streamflow characteristics for ungaged streams. The method uses the empirical development of equations to yield a discharge value from channel-geometry and channel-material data. The equations are developed by collecting data at numerous streamflow-gaging sites and statistically relating those data to selected discharge characteristics. Mean annual runoff and flood discharges with selected recurrence intervals can be estimated for perennial, intermittent, and ephemeral streams. The equations were developed from data collected in the western one-half of the conterminous United States. The effects of channel-material and runoff characteristics are accounted for in the equations.
Simulation Analysis of Computer-Controlled Pressurization for Mixture Ratio Control
NASA Technical Reports Server (NTRS)
Alexander, Leslie A.; Bishop-Behel, Karen; Benfield, Michael P. J.; Kelley, Anthony; Woodcock, Gordon R.
2005-01-01
A procedural code (C++) simulation was developed to investigate the potential for mixture ratio control of pressure-fed spacecraft rocket propulsion systems by measuring propellant flows, tank liquid quantities, or both, and using feedback from these measurements to adjust propellant tank pressures to set the correct operating mixture ratio for minimum propellant residuals. The pressurization system eliminated mechanical regulators in favor of a computer-controlled, servo-driven throttling valve. We found that a quasi-steady-state simulation (pressure and flow transients in the pressurization systems resulting from changes in flow control valve position are ignored) is adequate for this purpose. Monte-Carlo methods are used to obtain simulated statistics on propellant depletion. Mixture ratio control algorithms based on proportional-integral-derivative (PID) controller methods were developed. These algorithms actually set target tank pressures; the tank pressures are controlled by another PID controller. Simulation indicates this approach can provide reductions in residual propellants.
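The abstract describes PID controllers that adjust target tank pressures to hold the operating mixture ratio. A minimal discrete PID sketch of the kind described (gains, time step, and signal names are illustrative only; the original simulation was written in C++) might be:

```python
# Hedged sketch: a minimal discrete PID controller that nudges a tank-pressure
# setpoint toward the mixture-ratio target. All gains and values are illustrative.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

controller = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.1)
# Target mixture ratio 2.2 vs measured 2.05 -> positive pressure adjustment.
pressure_adjustment = controller.update(setpoint=2.2, measured=2.05)
```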
Protecting privacy of shared epidemiologic data without compromising analysis potential.
Cologne, John; Grant, Eric J; Nakashima, Eiji; Chen, Yun; Funamoto, Sachiyo; Katayama, Hiroaki
2012-01-01
Ensuring privacy of research subjects when epidemiologic data are shared with outside collaborators involves masking (modifying) the data, but overmasking can compromise utility (analysis potential). Methods of statistical disclosure control for protecting privacy may be impractical for individual researchers involved in small-scale collaborations. We investigated a simple approach based on measures of disclosure risk and analytical utility that are straightforward for epidemiologic researchers to derive. The method is illustrated using data from the Japanese Atomic-bomb Survivor population. Masking by modest rounding did not adequately enhance security but rounding to remove several digits of relative accuracy effectively reduced the risk of identification without substantially reducing utility. Grouping or adding random noise led to noticeable bias. When sharing epidemiologic data, it is recommended that masking be performed using rounding. Specific treatment should be determined separately in individual situations after consideration of the disclosure risks and analysis needs.
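The paper recommends masking shared data by rounding away several digits of relative accuracy. A minimal sketch of such significant-digit rounding (function name and example values are hypothetical) could be:

```python
# Hedged sketch: masking a continuous variable by rounding to limited relative
# accuracy. Keeping only a few significant digits removes record-level detail
# while largely preserving the variable's distribution for analysis.
import numpy as np

def round_significant(x, digits=2):
    x = np.asarray(x, dtype=float)
    mag = np.floor(np.log10(np.abs(np.where(x == 0, 1, x))))
    factor = 10.0 ** (digits - 1 - mag)
    return np.round(x * factor) / factor

dose = np.array([0.0123, 0.4567, 1.2345, 12.678])   # hypothetical exposure values
print(round_significant(dose, digits=2))            # approx [0.012, 0.46, 1.2, 13.0]
```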
Correction of Dual-PRF Doppler Velocity Outliers in the Presence of Aliasing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Altube, Patricia; Bech, Joan; Argemí, Oriol
In Doppler weather radars, the presence of unfolding errors or outliers is a well-known quality issue for radial velocity fields estimated using the dual–pulse repetition frequency (PRF) technique. Postprocessing methods have been developed to correct dual-PRF outliers, but these need prior application of a dealiasing algorithm for an adequate correction. Our paper presents an alternative procedure based on circular statistics that corrects dual-PRF errors in the presence of extended Nyquist aliasing. The correction potential of the proposed method is quantitatively tested by means of velocity field simulations and is exemplified in the application to real cases, including severe storm events. The comparison with two other existing correction methods indicates an improved performance in the correction of clustered outliers. The technique we propose is well suited for real-time applications requiring high-quality Doppler radar velocity fields, such as wind shear and mesocyclone detection algorithms, or assimilation in numerical weather prediction models.
Correction of Dual-PRF Doppler Velocity Outliers in the Presence of Aliasing
Altube, Patricia; Bech, Joan; Argemí, Oriol; ...
2017-07-18
In Doppler weather radars, the presence of unfolding errors or outliers is a well-known quality issue for radial velocity fields estimated using the dual–pulse repetition frequency (PRF) technique. Postprocessing methods have been developed to correct dual-PRF outliers, but these need prior application of a dealiasing algorithm for an adequate correction. Our paper presents an alternative procedure based on circular statistics that corrects dual-PRF errors in the presence of extended Nyquist aliasing. The correction potential of the proposed method is quantitatively tested by means of velocity field simulations and is exemplified in the application to real cases, including severe storm events. The comparison with two other existing correction methods indicates an improved performance in the correction of clustered outliers. The technique we propose is well suited for real-time applications requiring high-quality Doppler radar velocity fields, such as wind shear and mesocyclone detection algorithms, or assimilation in numerical weather prediction models.
NASA Astrophysics Data System (ADS)
Mel, Riccardo; Viero, Daniele Pietro; Carniello, Luca; Defina, Andrea; D'Alpaos, Luigi
2014-09-01
Providing reliable and accurate storm surge forecasts is important for a wide range of problems related to coastal environments. In order to adequately support decision-making processes, it has also become increasingly important to be able to estimate the uncertainty associated with the storm surge forecast. The procedure commonly adopted to do this uses the results of a hydrodynamic model forced by a set of different meteorological forecasts; however, this approach requires a considerable, if not prohibitive, computational cost for real-time application. In the present paper we present two simplified methods for estimating the uncertainty affecting storm surge prediction with moderate computational effort. In the first approach we use a computationally fast, statistical tidal model instead of a hydrodynamic numerical model to estimate storm surge uncertainty. The second approach is based on the observation that the uncertainty in the sea level forecast mainly stems from the uncertainty affecting the meteorological fields; this has led to the idea of estimating forecast uncertainty via a linear combination of suitable meteorological variances, directly extracted from the meteorological fields. The proposed methods were applied to estimate the uncertainty in the storm surge forecast in the Venice Lagoon. The results clearly show that the uncertainty estimated through a linear combination of suitable meteorological variances closely matches the one obtained using the deterministic approach and overcomes some intrinsic limitations in the use of a statistical tidal model.
Mechanistic analysis of challenge-response experiments.
Shotwell, M S; Drake, K J; Sidorov, V Y; Wikswo, J P
2013-09-01
We present an application of mechanistic modeling and nonlinear longitudinal regression in the context of biomedical response-to-challenge experiments, a field where these methods are underutilized. In this type of experiment, a system is studied by imposing an experimental challenge, and then observing its response. The combination of mechanistic modeling and nonlinear longitudinal regression has brought new insight, and revealed an unexpected opportunity for optimal design. Specifically, the mechanistic aspect of our approach enables the optimal design of experimental challenge characteristics (e.g., intensity, duration). This article lays some groundwork for this approach. We consider a series of experiments wherein an isolated rabbit heart is challenged with intermittent anoxia. The heart responds to the challenge onset, and recovers when the challenge ends. The mean response is modeled by a system of differential equations that describe a candidate mechanism for cardiac response to anoxia challenge. The cardiac system behaves more variably when challenged than when at rest. Hence, observations arising from this experiment exhibit complex heteroscedasticity and sharp changes in central tendency. We present evidence that an asymptotic statistical inference strategy may fail to adequately account for statistical uncertainty. Two alternative methods are critiqued qualitatively (i.e., for utility in the current context), and quantitatively using an innovative Monte-Carlo method. We conclude with a discussion of the exciting opportunities in optimal design of response-to-challenge experiments. © 2013, The International Biometric Society.
Potential Mediators in Parenting and Family Intervention: Quality of Mediation Analyses
Patel, Chandni C.; Fairchild, Amanda J.; Prinz, Ronald J.
2017-01-01
Parenting and family interventions have repeatedly shown effectiveness in preventing and treating a range of youth outcomes. Accordingly, investigators in this area have conducted a number of studies using statistical mediation to examine some of the potential mechanisms of action by which these interventions work. This review examined from a methodological perspective in what ways and how well the family-based intervention studies tested statistical mediation. A systematic search identified 73 published outcome studies that tested mediation for family-based interventions across a wide range of child and adolescent outcomes (i.e., externalizing, internalizing, and substance-abuse problems; high-risk sexual activity; and academic achievement), for putative mediators pertaining to positive and negative parenting, family functioning, youth beliefs and coping skills, and peer relationships. Taken as a whole, the studies used designs that adequately addressed temporal precedence. The majority of studies used the product of coefficients approach to mediation, which is preferred, and less limiting than the causal steps approach. Statistical significance testing did not always make use of the most recently developed approaches, which would better accommodate small sample sizes and more complex functions. Specific recommendations are offered for future mediation studies in this area with respect to full longitudinal design, mediation approach, significance testing method, documentation and reporting of statistics, testing of multiple mediators, and control for Type I error. PMID:28028654
Improving Evaluation of Dental Hygiene Students' Cultural Competence with a Mixed-Methods Approach.
Flynn, Priscilla; Sarkarati, Nassim
2018-02-01
Most dental hygiene educational programs include cultural competence education, but may not evaluate student outcomes. The aim of this study was to design and implement a mixed-methods evaluation to measure dental hygiene students' progression toward cultural competence. Two cohorts consisting of consecutive classes in one U.S. dental hygiene program participated in the study. A total of 47 dental hygiene students (100% response rate) completed self-assessments to measure their attitudes and knowledge at three time points between 2014 and 2016. Mean scores were calculated for three domains: Physical Environment, Communication, and Values. Qualitative analysis of the students' cultural diversity papers was also conducted to further evaluate students' knowledge and skills. Bennett's five-level conceptual framework was used to code phrases or sentences to place students in the general categories of ethnocentric or ethno-relative. The quantitative and qualitative results yielded different outcomes for Cohort 1, but not for Cohort 2. The Cohort 1 students assessed themselves statistically significantly lower over time in one of the three measured domains. However, the Cohort 2 students assessed themselves as statistically significantly more culturally competent in all three domains. Qualitative results placed 72% of Cohort 1 students and 83% of Cohort 2 students in the more desirable ethno-relative category. Since quantitative methods consisting of student self-assessments may not adequately measure students' cultural competence, adding qualitative methods to measure skills specific to patient care in this study added a robust dimension to evaluating this complex dental hygiene student competence.
Popescu, M D; Draghici, L; Secheli, I; Secheli, M; Codrescu, M; Draghici, I
2015-01-01
Infantile Hemangiomas (IH) are the most frequent tumors of vascular origin, and the differential diagnosis from vascular malformations is difficult to establish. Specific types of IH, owing to their location, dimensions and fast evolution, can cause important functional and esthetic sequelae. To avoid these consequences it is necessary to establish the appropriate moment to begin treatment and to decide which therapeutic procedure is most adequate. Based on clinical data collected through serial clinical observations correlated with imaging data, and processed by a computer-aided diagnosis (CAD) system, the study intended to develop a treatment algorithm to accurately predict the best final result, from the esthetic and functional point of view, for a given type of lesion. The preliminary database was composed of 75 patients divided into 4 groups according to the treatment they received: medical therapy, sclerotherapy, surgical excision and no treatment. The serial clinical observation was performed each month and all the data were processed using the CAD system. The project goal was to create software that incorporated advanced methods to accurately measure the specific IH lesions and that integrated medical information, statistical methods and computational methods to correlate this information with that obtained from image processing. Based on these correlations, a mechanism for predicting the evolution of the hemangioma was established, which helped determine the best method of therapeutic intervention to minimize further complications.
Imaging of neural oscillations with embedded inferential and group prevalence statistics.
Donhauser, Peter W; Florin, Esther; Baillet, Sylvain
2018-02-01
Magnetoencephalography and electroencephalography (MEG, EEG) are essential techniques for studying distributed signal dynamics in the human brain. In particular, the functional role of neural oscillations remains to be clarified. For that reason, imaging methods need to identify distinct brain regions that concurrently generate oscillatory activity, with adequate separation in space and time. Yet, spatial smearing and inhomogeneous signal-to-noise are challenging factors for source reconstruction from external sensor data. The detection of weak sources in the presence of stronger regional activity nearby is a typical complication of MEG/EEG source imaging. We propose a novel, hypothesis-driven source reconstruction approach to address these methodological challenges. The imaging with embedded statistics (iES) method is a subspace scanning technique that constrains the mapping problem to the actual experimental design. A major benefit is that, regardless of signal strength, the contributions from all oscillatory sources whose activity is consistent with the tested hypothesis are equalized in the statistical maps produced. We present extensive evaluations of iES on group MEG data, for mapping 1) induced oscillations using experimental contrasts, 2) ongoing narrow-band oscillations in the resting-state, 3) co-modulation of brain-wide oscillatory power with a seed region, and 4) co-modulation of oscillatory power with peripheral signals (pupil dilation). Along the way, we demonstrate several advantages of iES over standard source imaging approaches. These include the detection of oscillatory coupling without rejection of zero-phase coupling, and detection of ongoing oscillations in deeper brain regions, where signal-to-noise conditions are unfavorable. We also show that iES provides a separate evaluation of oscillatory synchronization and desynchronization in experimental contrasts, which has important statistical advantages. The flexibility of iES allows it to be adjusted to many experimental questions in systems neuroscience.
Imaging of neural oscillations with embedded inferential and group prevalence statistics
2018-01-01
Magnetoencephalography and electroencephalography (MEG, EEG) are essential techniques for studying distributed signal dynamics in the human brain. In particular, the functional role of neural oscillations remains to be clarified. For that reason, imaging methods need to identify distinct brain regions that concurrently generate oscillatory activity, with adequate separation in space and time. Yet, spatial smearing and inhomogeneous signal-to-noise are challenging factors for source reconstruction from external sensor data. The detection of weak sources in the presence of stronger regional activity nearby is a typical complication of MEG/EEG source imaging. We propose a novel, hypothesis-driven source reconstruction approach to address these methodological challenges. The imaging with embedded statistics (iES) method is a subspace scanning technique that constrains the mapping problem to the actual experimental design. A major benefit is that, regardless of signal strength, the contributions from all oscillatory sources whose activity is consistent with the tested hypothesis are equalized in the statistical maps produced. We present extensive evaluations of iES on group MEG data, for mapping 1) induced oscillations using experimental contrasts, 2) ongoing narrow-band oscillations in the resting-state, 3) co-modulation of brain-wide oscillatory power with a seed region, and 4) co-modulation of oscillatory power with peripheral signals (pupil dilation). Along the way, we demonstrate several advantages of iES over standard source imaging approaches. These include the detection of oscillatory coupling without rejection of zero-phase coupling, and detection of ongoing oscillations in deeper brain regions, where signal-to-noise conditions are unfavorable. We also show that iES provides a separate evaluation of oscillatory synchronization and desynchronization in experimental contrasts, which has important statistical advantages. The flexibility of iES allows it to be adjusted to many experimental questions in systems neuroscience. PMID:29408902
Huang, Y F; Chang, Z; Bai, J; Zhu, M; Zhang, M X; Wang, M; Zhang, G; Li, X Y; Tong, Y G; Wang, J L; Lu, X X
2017-08-08
Objective: To establish and evaluate the feasibility of a pretreatment method, developed by the laboratory, for matrix-assisted laser desorption ionization-time of flight mass spectrometry identification of filamentous fungi. Methods: Three hundred and eighty strains of filamentous fungi collected from January 2014 to December 2016 were recovered and cultured on sabouraud dextrose agar (SDA) plates at 28 ℃ until mature. Meanwhile, the fungi were cultured in liquid sabouraud medium with the vertical rotation method recommended by Bruker and with a horizontal vibration method developed by the laboratory until an adequate number of colonies was observed. For the strains cultured with the three methods, protein was extracted with a modified magnetic bead-based extraction method for mass spectrometric identification. Results: For the 380 fungal strains, culture with the SDA method took 3-10 d, and the species- and genus-level identification rates were 47% and 81%, respectively; culture with the vertical rotation method took 5-7 d, with identification rates of 76% and 94%, respectively; culture with the horizontal vibration method took 1-2 d, with identification rates of 96% and 99%, respectively. The difference between the horizontal vibration method and the SDA culture method was statistically significant (χ²=39.026, P<0.01), as was the difference between the horizontal vibration method and the vertical rotation method recommended by Bruker (χ²=11.310, P<0.01). Conclusion: The horizontal vibration method and the modified magnetic bead-based extraction method developed by the laboratory are superior to the method recommended by Bruker and the SDA culture method in terms of identification capacity for filamentous fungi, and can be applied clinically.
Cumulative risk assessment for combined health effects from chemical and nonchemical stressors.
Sexton, Ken; Linder, Stephen H
2011-12-01
Cumulative risk assessment is a science policy tool for organizing and analyzing information to examine, characterize, and possibly quantify combined threats from multiple environmental stressors. We briefly survey the state of the art regarding cumulative risk assessment, emphasizing challenges and complexities of moving beyond the current focus on chemical mixtures to incorporate nonchemical stressors, such as poverty and discrimination, into the assessment paradigm. Theoretical frameworks for integrating nonchemical stressors into cumulative risk assessments are discussed, the impact of geospatial issues on interpreting results of statistical analyses is described, and four assessment methods are used to illustrate the diversity of current approaches. Prospects for future progress depend on adequate research support as well as development and verification of appropriate analytic frameworks.
Cumulative Risk Assessment for Combined Health Effects From Chemical and Nonchemical Stressors
Linder, Stephen H.
2011-01-01
Cumulative risk assessment is a science policy tool for organizing and analyzing information to examine, characterize, and possibly quantify combined threats from multiple environmental stressors. We briefly survey the state of the art regarding cumulative risk assessment, emphasizing challenges and complexities of moving beyond the current focus on chemical mixtures to incorporate nonchemical stressors, such as poverty and discrimination, into the assessment paradigm. Theoretical frameworks for integrating nonchemical stressors into cumulative risk assessments are discussed, the impact of geospatial issues on interpreting results of statistical analyses is described, and four assessment methods are used to illustrate the diversity of current approaches. Prospects for future progress depend on adequate research support as well as development and verification of appropriate analytic frameworks. PMID:21551386
NASA Astrophysics Data System (ADS)
Matsuda, Takashi S.; Nakamura, Takuji; Shiokawa, Kazuo; Tsutsumi, Masaki; Suzuki, Hidehiko; Ejiri, Mitsumu K.; Taguchi, Makoto
Atmospheric gravity waves (AGWs), which are generated in the lower atmosphere, transport a significant amount of energy and momentum into the mesosphere and lower thermosphere and cause mean wind accelerations in the mesosphere. This momentum deposit drives the general circulation and affects the temperature structure. Among the many parameters that characterize AGWs, the horizontal phase velocity is particularly important for discussing vertical propagation. Airglow imaging is a useful technique for investigating the horizontal structures of AGWs at around 90 km altitude. Recently, there have been many reports on the statistical characteristics of AGWs observed by airglow imaging. However, comparison of these results obtained at various locations is difficult because each research group uses its own method for extracting and analyzing AGW events. We have developed a new statistical analysis method for obtaining the power spectrum in the horizontal phase velocity domain from airglow image data, so as to deal with huge amounts of imaging data obtained in different years and at various observation sites, without bias caused by observer-dependent event extraction criteria. This method was applied to the data obtained at Syowa Station, Antarctica, in 2011 and compared with a conventional event analysis in which the phase fronts were traced manually in order to estimate horizontal characteristics. This comparison shows that our new method is adequate for deriving the horizontal phase velocity characteristics of AGWs observed by the airglow imaging technique. We plan to apply this method to airglow imaging data observed at Syowa Station in 2002 and between 2008 and 2013, and also to the data observed at other stations in Antarctica (e.g. Rothera Station (67S, 68W) and Halley Station (75S, 26W)), in order to investigate the behavior of AGW propagation direction and source distribution in the MLT region over Antarctica. In this presentation, we will report interim analysis results of the data at Syowa Station.
Statistics of multiply scattered broadband terahertz pulses.
Pearce, Jeremy; Jian, Zhongping; Mittleman, Daniel M
2003-07-25
We describe the first measurements of the diffusion of broadband single-cycle optical pulses through a highly scattering medium. Using terahertz time-domain spectroscopy, we measure the electric field of a multiply scattered wave with a time resolution shorter than one optical cycle. This time-domain measurement provides information on the statistics of both the amplitude and phase distributions of the diffusive wave. We develop a theoretical description, suitable for broadband radiation, which adequately describes the experimental results.
1980-12-01
career retention rates, and to predict future career retention rates in the Navy. The statistical model utilizes economic variables as predictors... The model developed has a high correlation with Navy career retention rates. The problem of Navy career retention has not been adequately studied... findings indicate Navy policymakers must be cognizant of the relationships of economic factors to Navy career retention rates.
Gutierrez-Corea, Federico-Vladimir; Manso-Callejo, Miguel-Angel; Moreno-Regidor, María-Pilar; Velasco-Gómez, Jesús
2014-01-01
This study was motivated by the need to improve densification of Global Horizontal Irradiance (GHI) observations, increasing the number of surface weather stations that observe it, using sensors with a sub-hour periodicity and examining the methods of spatial GHI estimation (by interpolation) with that periodicity in other locations. The aim of the present research project is to analyze the goodness of 15-minute GHI spatial estimations for five methods in the territory of Spain (three geo-statistical interpolation methods, one deterministic method and the HelioSat2 method, which is based on satellite images). The research concludes that, when the work area has adequate station density, the best method for estimating GHI every 15 min is Regression Kriging interpolation using GHI estimated from satellite images as one of the input variables. On the contrary, when station density is low, the best method is estimating GHI directly from satellite images. A comparison between the GHI observed by volunteer stations and the estimation model applied concludes that 67% of the volunteer stations analyzed present values within the margin of error (average of ±2 standard deviations). PMID:24732102
Gutierrez-Corea, Federico-Vladimir; Manso-Callejo, Miguel-Angel; Moreno-Regidor, María-Pilar; Velasco-Gómez, Jesús
2014-04-11
This study was motivated by the need to improve densification of Global Horizontal Irradiance (GHI) observations, increasing the number of surface weather stations that observe it, using sensors with a sub-hour periodicity and examining the methods of spatial GHI estimation (by interpolation) with that periodicity in other locations. The aim of the present research project is to analyze the goodness of 15-minute GHI spatial estimations for five methods in the territory of Spain (three geo-statistical interpolation methods, one deterministic method and the HelioSat2 method, which is based on satellite images). The research concludes that, when the work area has adequate station density, the best method for estimating GHI every 15 min is Regression Kriging interpolation using GHI estimated from satellite images as one of the input variables. On the contrary, when station density is low, the best method is estimating GHI directly from satellite images. A comparison between the GHI observed by volunteer stations and the estimation model applied concludes that 67% of the volunteer stations analyzed present values within the margin of error (average of ±2 standard deviations).
Pandey, Pinki; Dixit, Alok; Tanwar, Aparna; Sharma, Anuradha; Mittal, Sanjeev
2014-01-01
Introduction: Our study presents a new deparaffinizing and hematoxylin and eosin (H and E) staining method that involves the use of easily available, nontoxic and eco-friendly diluted liquid dish washing soap (DWS), completely eliminating expensive and hazardous xylene and alcohol from deparaffinization and rehydration prior to staining, from staining itself, and from dehydration prior to mounting. The aim was to evaluate and compare the quality of liquid DWS treated xylene and alcohol free (XAF) sections with that of the conventional H and E sections. Materials and Methods: A total of 100 paraffin embedded tissue blocks from different tissues were included. From each tissue block, one section was stained with the conventional H and E (normal sections) and the other with the XAF H and E (soapy sections) staining method. Slides were scored using five parameters: nuclear, cytoplasmic, clarity, uniformity, and crispness of staining. A Z-test was used for statistical analysis. Results: Soapy sections scored better for cytoplasmic (90%) and crisp staining (95%) with a statistically significant difference, whereas for uniformity of staining, normal sections (88%) scored over soapy sections (72%) (Z = 2.82, P < 0.05). For nuclear staining (90%) and clarity of staining (90%), total scores favored soapy sections, but the difference was not statistically significant. About 84% of normal sections stained adequately for diagnosis compared with 86% of soapy sections (Z = 0.396, P > 0.05). Conclusion: Liquid DWS is a safe and efficient alternative to xylene and alcohol in deparaffinization and the routine H and E staining procedure. We are documenting this project so that it can be used as a model for other histology laboratories. PMID:25328332
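The adequacy comparison quoted above (84% of 100 conventional versus 86% of 100 soapy sections) maps naturally onto a two-proportion z-test; the sketch below reproduces that calculation with statsmodels, assuming the reported percentages correspond directly to counts out of 100 sections per protocol.

```python
# A minimal sketch of the two-proportion z-test, assuming the reported
# percentages are counts out of 100 sections per staining protocol.
from statsmodels.stats.proportion import proportions_ztest

count = [86, 84]        # sections adequate for diagnosis: soapy, conventional
nobs  = [100, 100]      # sections examined per protocol
z, p = proportions_ztest(count, nobs)
print(f"z = {z:.3f}, p = {p:.3f}")   # z is close to the reported 0.396, P > 0.05
```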
2014-01-01
Background The DerSimonian and Laird approach (DL) is widely used for random effects meta-analysis, but this often results in inappropriate type I error rates. The method described by Hartung, Knapp, Sidik and Jonkman (HKSJ) is known to perform better when trials of similar size are combined. However, evidence in realistic situations, where one trial might be much larger than the other trials, is lacking. We aimed to evaluate the relative performance of the DL and HKSJ methods when studies of different sizes are combined and to develop a simple method to convert DL results to HKSJ results. Methods We evaluated the performance of the HKSJ versus DL approach in simulated meta-analyses of 2–20 trials with varying sample sizes and between-study heterogeneity, and allowing trials to have various sizes, e.g. 25% of the trials being 10 times larger than the smaller trials. We also compared the number of “positive” (statistically significant at p < 0.05) findings using empirical data of recent meta-analyses with ≥3 studies of interventions from the Cochrane Database of Systematic Reviews. Results The simulations showed that the HKSJ method consistently resulted in more adequate error rates than the DL method. When the significance level was 5%, the HKSJ error rates at most doubled, whereas for DL they could be over 30%. DL, and, far less so, HKSJ had more inflated error rates when the combined studies had unequal sizes and between-study heterogeneity. The empirical data from 689 meta-analyses showed that 25.1% of the significant findings for the DL method were non-significant with the HKSJ method. DL results can be easily converted into HKSJ results. Conclusions Our simulations showed that the HKSJ method consistently results in more adequate error rates than the DL method, especially when the number of studies is small, and can easily be applied routinely in meta-analyses. Even with the HKSJ method, extra caution is needed when there are ≤5 studies of very unequal sizes. PMID:24548571
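For readers unfamiliar with the two estimators, the sketch below computes a DerSimonian-Laird (DL) pooled effect and then the Hartung-Knapp-Sidik-Jonkman (HKSJ) variance adjustment for a small set of invented trial effects; it is a schematic of the calculations compared in the paper, not the authors' simulation code.

```python
# A minimal sketch with invented effect sizes: DL pooling and the HKSJ
# variance adjustment (t-based confidence interval with k-1 degrees of freedom).
import numpy as np
from scipy import stats

y  = np.array([0.30, 0.10, 0.25, 0.60])   # hypothetical trial effects
se = np.array([0.10, 0.12, 0.15, 0.05])   # hypothetical standard errors
k  = len(y)

w0 = 1 / se**2                                        # fixed-effect weights
q  = np.sum(w0 * (y - np.sum(w0 * y) / w0.sum())**2)  # Cochran's Q
c  = w0.sum() - np.sum(w0**2) / w0.sum()
tau2 = max(0.0, (q - (k - 1)) / c)                    # DL between-study variance

w  = 1 / (se**2 + tau2)                               # random-effects weights
mu = np.sum(w * y) / w.sum()                          # pooled estimate (same point estimate for DL and HKSJ)

se_dl   = np.sqrt(1 / w.sum())                                    # DL standard error
se_hksj = np.sqrt(np.sum(w * (y - mu)**2) / ((k - 1) * w.sum()))  # HKSJ adjustment

ci_dl   = mu + np.array([-1, 1]) * stats.norm.ppf(0.975) * se_dl
ci_hksj = mu + np.array([-1, 1]) * stats.t.ppf(0.975, k - 1) * se_hksj
print(f"pooled = {mu:.3f}, DL CI = {ci_dl.round(3)}, HKSJ CI = {ci_hksj.round(3)}")
```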
The non-equilibrium statistical mechanics of a simple geophysical fluid dynamics model
NASA Astrophysics Data System (ADS)
Verkley, Wim; Severijns, Camiel
2014-05-01
Lorenz [1] has devised a dynamical system that has proved to be very useful as a benchmark system in geophysical fluid dynamics. The system in its simplest form consists of a periodic array of variables that can be associated with an atmospheric field on a latitude circle. The system is driven by a constant forcing, is damped by linear friction and has a simple advection term that causes the model to behave chaotically if the forcing is large enough. Our aim is to predict the statistics of Lorenz' model on the basis of a given average value of its total energy - obtained from a numerical integration - and the assumption of statistical stationarity. Our method is the principle of maximum entropy [2] which in this case reads: the information entropy of the system's probability density function shall be maximal under the constraints of normalization, a given value of the average total energy and statistical stationarity. Statistical stationarity is incorporated approximately by using `stationarity constraints', i.e., by requiring that the average first and possibly higher-order time-derivatives of the energy are zero in the maximization of entropy. The analysis [3] reveals that, if the first stationarity constraint is used, the resulting probability density function rather accurately reproduces the statistics of the individual variables. If the second stationarity constraint is used as well, the correlations between the variables are also reproduced quite adequately. The method can be generalized straightforwardly and holds the promise of a viable non-equilibrium statistical mechanics of the forced-dissipative systems of geophysical fluid dynamics. [1] E.N. Lorenz, 1996: Predictability - A problem partly solved, in Proc. Seminar on Predictability (ECMWF, Reading, Berkshire, UK), Vol. 1, pp. 1-18. [2] E.T. Jaynes, 2003: Probability Theory - The Logic of Science (Cambridge University Press, Cambridge). [3] W.T.M. Verkley and C.A. Severijns, 2014: The maximum entropy principle applied to a dynamical system proposed by Lorenz, Eur. Phys. J. B, 87:7, http://dx.doi.org/10.1140/epjb/e2013-40681-2 (open access).
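For orientation, the dynamical system referred to above is the Lorenz-96 model. The sketch below (my own implementation, not the authors' code) integrates it with a fourth-order Runge-Kutta scheme and estimates the time-averaged total energy that serves as the constraint in the maximum-entropy calculation; the forcing, dimension and step size are illustrative choices.

```python
# A minimal sketch of the Lorenz-96 model, dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F,
# with periodic indices, integrated by RK4 to estimate the mean total energy.
import numpy as np

def lorenz96(x, forcing):
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def mean_energy(n=36, forcing=8.0, dt=0.01, steps=50_000, seed=0):
    rng = np.random.default_rng(seed)
    x = forcing + 0.01 * rng.standard_normal(n)   # small perturbation of the fixed point
    energies = []
    for _ in range(steps):
        k1 = lorenz96(x, forcing)
        k2 = lorenz96(x + 0.5 * dt * k1, forcing)
        k3 = lorenz96(x + 0.5 * dt * k2, forcing)
        k4 = lorenz96(x + dt * k3, forcing)
        x = x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        energies.append(0.5 * np.sum(x**2))       # total energy E = (1/2) sum_i x_i^2
    return np.mean(energies[steps // 5:])         # discard the initial transient

print("average total energy:", round(mean_energy(), 2))
```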
Effectiveness of touch and feel (TAF) technique on first aid measures for visually challenged.
Mary, Helen; Sasikalaz, D; Venkatesan, Latha
2013-01-01
There is a common perception that a blind person cannot even help his own self. In order to challenge that view, a workshop for visually-impaired people to develop the skills to be independent and productive members of society was conceived. An experimental study was conducted at the National Institute of Visually Handicapped, Chennai, with the objective to assess the effectiveness of the Touch and Feel (TAF) technique on first aid measures for the visually challenged. A total of 25 visually challenged people were selected by a non-probability purposive sampling technique and data were collected using demographic variables and a structured knowledge questionnaire. The scores obtained were categorised into three levels: inadequate (0-8), moderately adequate (8-17), adequate (17-25). The study revealed that 40% of the visually challenged had inadequate knowledge, 56 percent had moderately adequate knowledge and only a few (4%) had adequate knowledge in the pre-test, whereas most (68%) had adequate knowledge in the post-test, which is statistically significant at p < 0.001 with t-value 6.779. This proves that the TAF technique was effective for the visually challenged. There was no association between the demographic variables and their level of knowledge regarding first aid.
Biau, D J; Meziane, M; Bhumbra, R S; Dumaine, V; Babinet, A; Anract, P
2011-09-01
The purpose of this study was to define immediate post-operative 'quality' in total hip replacements and to study prospectively the occurrence of failure based on these definitions of quality. The evaluation and assessment of failure were based on ten radiological and clinical criteria. The cumulative summation (CUSUM) test was used to study 200 procedures over a one-year period. Technical criteria defined failure in 17 cases (8.5%), those related to the femoral component in nine (4.5%), the acetabular component in 32 (16%) and those relating to discharge from hospital in five (2.5%). Overall, the procedure was considered to have failed in 57 of the 200 total hip replacements (28.5%). The use of a new design of acetabular component was associated with more failures. For the CUSUM test, the level of adequate performance was set at a rate of failure of 20% and the level of inadequate performance set at a failure rate of 40%; no alarm was raised by the test, indicating that there was no evidence of inadequate performance. The use of a continuous monitoring statistical method is useful to ensure that the quality of total hip replacement is maintained, especially as newer implants are introduced.
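The CUSUM chart mentioned above can be sketched as a Bernoulli CUSUM on binary failure outcomes with acceptable rate p0 = 0.20 and inadequate rate p1 = 0.40, as in the study. The decision limit and the simulated outcome stream below are illustrative assumptions, not the authors' monitoring parameters.

```python
# A minimal sketch of a Bernoulli CUSUM for monitoring surgical failures.
# The decision limit h is an assumed illustrative value; in practice it is
# chosen from the desired average run length / false-alarm properties.
import math
import random

p0, p1, h = 0.20, 0.40, 4.5
w_fail = math.log(p1 / p0)                 # log-likelihood-ratio score after a failure
w_ok   = math.log((1 - p1) / (1 - p0))     # (negative) score after a success

random.seed(1)
s, alarms = 0.0, []
for procedure in range(1, 201):            # 200 procedures, as in the study
    failure = random.random() < 0.285      # simulate the observed overall failure rate
    s = max(0.0, s + (w_fail if failure else w_ok))   # reset-at-zero CUSUM
    if s > h:
        alarms.append(procedure)           # evidence of inadequate performance
        s = 0.0                            # restart monitoring after an alarm
print("alarms raised at procedures:", alarms or "none")
```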
Impact of specimen adequacy on the assessment of renal allograft biopsy specimens.
Cimen, S; Geldenhuys, L; Guler, S; Imamoglu, A; Molinari, M
2016-01-01
The Banff classification was introduced to achieve uniformity in the assessment of renal allograft biopsies. The primary aim of this study was to evaluate the impact of specimen adequacy on the Banff classification. All renal allograft biopsies obtained between July 2010 and June 2012 for suspicion of acute rejection were included. Pre-biopsy clinical data on suspected diagnosis and time from renal transplantation were provided to a nephropathologist who was blinded to the original pathological report. Second pathological readings were compared with the original to assess agreement stratified by specimen adequacy. Cohen's kappa test and Fisher's exact test were used for statistical analyses. Forty-nine specimens were reviewed. Among these specimens, 81.6% were classified as adequate, 6.12% as minimal, and 12.24% as unsatisfactory. The agreement analysis among the first and second readings revealed a kappa value of 0.97. Full agreement between readings was found in 75% of the adequate specimens, 66.7 and 50% for minimal and unsatisfactory specimens, respectively. There was no agreement between readings in 5% of the adequate specimens and 16.7% of the unsatisfactory specimens. For the entire sample full agreement was found in 71.4%, partial agreement in 20.4% and no agreement in 8.2% of the specimens. Statistical analysis using Fisher's exact test yielded a P value above 0.25 showing that - probably due to small sample size - the results were not statistically significant. Specimen adequacy may be a determinant of a diagnostic agreement in renal allograft specimen assessment. While additional studies including larger case numbers are required to further delineate the impact of specimen adequacy on the reliability of histopathological assessments, specimen quality must be considered during clinical decision making while dealing with biopsy reports based on minimal or unsatisfactory specimens.
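The two statistics named above are both available in standard Python libraries; the sketch below applies Cohen's kappa to a pair of hypothetical pathology readings and Fisher's exact test to an invented 2×2 adequacy-by-agreement table. None of the counts or labels come from the study itself.

```python
# A minimal sketch with invented data: inter-reader agreement (Cohen's kappa)
# and a 2x2 Fisher's exact test of specimen adequacy versus full agreement.
from sklearn.metrics import cohen_kappa_score
from scipy.stats import fisher_exact

reading_1 = ["rejection", "no rejection", "rejection", "borderline", "rejection"]
reading_2 = ["rejection", "no rejection", "rejection", "rejection",  "rejection"]
print("kappa:", round(cohen_kappa_score(reading_1, reading_2), 2))

# rows: adequate / not adequate specimens; columns: full agreement / any disagreement
table = [[30, 10],
         [ 4,  5]]
odds_ratio, p_value = fisher_exact(table)
print("Fisher exact p:", round(p_value, 3))
```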
NASA Astrophysics Data System (ADS)
Dennison, Andrew G.
Classification of the seafloor substrate can be done with a variety of methods. These methods include visual (dives, drop cameras), mechanical (cores, grab samples), and acoustic (statistical analysis of echosounder returns) approaches. Acoustic methods offer a more powerful and efficient means of collecting useful information about the bottom type. Due to the nature of an acoustic survey, larger areas can be sampled, and combining the collected data with visual and mechanical survey methods provides greater confidence in the classification of a mapped region. During a multibeam sonar survey, both bathymetric and backscatter data are collected. It is well documented that the statistical characteristics of a sonar backscatter mosaic depend on bottom type. While classifying the bottom type on the basis of backscatter alone can accurately predict and map bottom type, i.e. distinguish a muddy area from a rocky area, it lacks the ability to resolve and capture fine textural details, an important factor in many habitat mapping studies. Statistical processing of high-resolution multibeam data can capture the pertinent details about the bottom type that are rich in textural information. Further multivariate statistical processing can then isolate characteristic features and provide the basis for an accurate classification scheme. The development of a new classification method is described here. It is based upon the analysis of textural features in conjunction with ground truth sampling. The processing and classification results of two geologically distinct areas in nearshore regions of Lake Superior, off the Lester River, MN, and the Amnicon River, WI, are presented here, using the Minnesota Supercomputer Institute's Mesabi computing cluster for initial processing. Processed data are then calibrated using ground truth samples to conduct an accuracy assessment of the surveyed areas. From analysis of the high-resolution bathymetry data collected at both survey sites it was possible to successfully calculate a series of measures that describe textural information about the lake floor. Further processing suggests that the calculated features also capture a significant amount of statistical information about the lake floor terrain. Two sources of error, an anomalous heave and a refraction error, significantly deteriorated the quality of the processed data and the resulting validation results. Ground truth samples used to validate the classification methods for both survey sites, however, resulted in accuracy values ranging from 5-30 percent at the Amnicon River, and between 60-70 percent for the Lester River. The final results suggest that this new processing methodology does adequately capture textural information about the lake floor and does provide an acceptable classification in the absence of significant data quality issues.
Attitude of teaching faculty towards statistics at a medical university in Karachi, Pakistan.
Khan, Nazeer; Mumtaz, Yasmin
2009-01-01
Statistics is mainly used in biological research to verify clinicians' and researchers' findings and impressions, and gives scientific validity to their inferences. In Pakistan, the educational curriculum is developed in such a way that students who are interested in entering the field of biological sciences do not study mathematics after grade 10. Therefore, due to their fragile background in mathematical skills, Pakistani medical professionals feel that they do not have an adequate base to understand the basic concepts of statistical techniques when they try to use them in their research or read a scientific article. The aim of the study was to assess the attitude of medical faculty towards statistics. A questionnaire containing 42 close-ended and 4 open-ended questions, related to the attitude and knowledge of statistics, was distributed among the teaching faculty of Dow University of Health Sciences (DUHS). One hundred and sixty-seven filled questionnaires were returned from 374 faculty members (response rate 44.7%). Forty-three percent of the respondents claimed that they had taken 'introductory'-level statistics courses, 63% of the respondents strongly agreed that a good researcher must have some training in statistics, and 82% of the faculty were in favour (strongly agreed or agreed) of the view that statistics is really useful for research. Only 17% correctly stated that statistics is the science of uncertainty. Half of the respondents admitted that they have problems writing the statistical section of an article. 64% of the subjects indicated that statistical teaching methods were the main reason for the impression that statistics is difficult. 53% of the faculty indicated that the co-authorship of the statistician should depend upon his/her contribution to the study. Gender did not show any significant difference among the responses. However, senior faculty reported a higher level of importance for the use of statistics and greater difficulty in writing the results section of articles as compared to junior faculty. The study showed a low level of knowledge, but a high level of awareness of the use of statistical techniques in research, and exhibited a good level of motivation for further training.
NASA Technical Reports Server (NTRS)
Howell, L. W.
2001-01-01
A simple power law model consisting of a single spectral index alpha-1 is believed to be an adequate description of the galactic cosmic-ray (GCR) proton flux at energies below 10(exp 13) eV. Two procedures for estimating alpha-1, the method of moments and maximum likelihood (ML), are developed and their statistical performance compared. It is concluded that the ML procedure attains the most desirable statistical properties and is hence the recommended statistical estimation procedure for estimating alpha-1. The ML procedure is then generalized for application to a set of real cosmic-ray data and thereby makes this approach applicable to existing cosmic-ray data sets. Several other important results, such as the relationship between collecting power and detector energy resolution, as well as inclusion of a non-Gaussian detector response function, are presented. These results have many practical benefits in the design phase of a cosmic-ray detector as they permit instrument developers to make important trade studies in design parameters as a function of one of the science objectives. This is particularly important for space-based detectors where physical parameters, such as dimension and weight, impose rigorous practical limits to the design envelope.
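To make the estimation problem concrete, the sketch below simulates a pure power-law flux above a threshold energy and recovers the spectral index with the closed-form maximum-likelihood estimator; it is a generic illustration of ML power-law fitting, not the instrument-specific procedure developed in the paper, and all numerical values are assumed.

```python
# A minimal sketch: draw energies from N(E) ~ E^(-alpha) above E_min by
# inverse-CDF sampling and recover alpha with the maximum-likelihood estimator.
import numpy as np

rng = np.random.default_rng(42)
alpha_true, e_min, n = 2.7, 1.0e12, 10_000        # illustrative index, threshold (eV), sample size

u = rng.uniform(size=n)
energies = e_min * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))   # power-law sampling

alpha_ml = 1.0 + n / np.sum(np.log(energies / e_min))         # ML estimate of the index
alpha_se = (alpha_ml - 1.0) / np.sqrt(n)                      # asymptotic standard error
print(f"alpha_ML = {alpha_ml:.3f} +/- {alpha_se:.3f}")
```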
Seuba, Jordi; Deville, Sylvain; Guizard, Christian; Stevenson, Adam J
2016-01-01
Macroporous ceramics exhibit an intrinsic strength variability caused by the random distribution of defects in their structure. However, the precise role of microstructural features, other than pore volume, on reliability is still unknown. Here, we analyze the applicability of the Weibull analysis to unidirectional macroporous yttria-stabilized-zirconia (YSZ) prepared by ice-templating. First, we performed crush tests on samples with controlled microstructural features with the loading direction parallel to the porosity. The compressive strength data were fitted using two different fitting techniques, ordinary least squares and Bayesian Markov Chain Monte Carlo, to evaluate whether Weibull statistics are an adequate descriptor of the strength distribution. The statistical descriptors indicated that the strength data are well described by the Weibull statistical approach, for both fitting methods used. Furthermore, we assess the effect of different microstructural features (volume, size, densification of the walls, and morphology) on Weibull modulus and strength. We found that the key microstructural parameter controlling reliability is wall thickness. In contrast, pore volume is the main parameter controlling the strength. The highest Weibull modulus ([Formula: see text]) and mean strength (198.2 MPa) were obtained for the samples with the smallest and narrowest wall thickness distribution (3.1 μm) and lower pore volume (54.5%).
NASA Astrophysics Data System (ADS)
Seuba, Jordi; Deville, Sylvain; Guizard, Christian; Stevenson, Adam J.
2016-01-01
Macroporous ceramics exhibit an intrinsic strength variability caused by the random distribution of defects in their structure. However, the precise role of microstructural features, other than pore volume, on reliability is still unknown. Here, we analyze the applicability of the Weibull analysis to unidirectional macroporous yttria-stabilized-zirconia (YSZ) prepared by ice-templating. First, we performed crush tests on samples with controlled microstructural features with the loading direction parallel to the porosity. The compressive strength data were fitted using two different fitting techniques, ordinary least squares and Bayesian Markov Chain Monte Carlo, to evaluate whether Weibull statistics are an adequate descriptor of the strength distribution. The statistical descriptors indicated that the strength data are well described by the Weibull statistical approach, for both fitting methods used. Furthermore, we assess the effect of different microstructural features (volume, size, densification of the walls, and morphology) on Weibull modulus and strength. We found that the key microstructural parameter controlling reliability is wall thickness. In contrast, pore volume is the main parameter controlling the strength. The highest Weibull modulus (?) and mean strength (198.2 MPa) were obtained for the samples with the smallest and narrowest wall thickness distribution (3.1 μm) and lower pore volume (54.5%).
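A compact way to see what the Weibull analysis above involves is to fit synthetic strength data two ways: an ordinary-least-squares fit of the linearized Weibull CDF using median-rank plotting positions, and a maximum-likelihood fit with SciPy (used here in place of the Bayesian MCMC fit reported in the paper). The strengths below are simulated, not the published measurements.

```python
# A minimal sketch with synthetic strengths: estimate the Weibull modulus m and
# characteristic strength sigma0 by OLS on the linearized CDF and by maximum
# likelihood. The MCMC fit used in the paper is replaced by ML for brevity.
import numpy as np
from scipy.stats import weibull_min

strength = weibull_min.rvs(c=12, scale=210, size=30, random_state=0)   # MPa, synthetic

s = np.sort(strength)
n = len(s)
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)          # median-rank plotting positions
x, y = np.log(s), np.log(-np.log(1.0 - F))           # ln(ln(1/(1-F))) = m ln(sigma) - m ln(sigma0)
m_ols, intercept = np.polyfit(x, y, 1)
sigma0_ols = np.exp(-intercept / m_ols)

m_ml, _, sigma0_ml = weibull_min.fit(s, floc=0)      # ML shape and scale (location fixed at 0)
print(f"OLS: m = {m_ols:.1f}, sigma0 = {sigma0_ols:.0f} MPa")
print(f"ML : m = {m_ml:.1f}, sigma0 = {sigma0_ml:.0f} MPa")
```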
31 CFR 1.24 - Disclosure of records to person other than the individual to whom they pertain.
Code of Federal Regulations, 2010 CFR
2010-07-01
... provided the component with advance adequate written assurance that the record will be used solely as a statistical research or reporting record, and the record is to be transferred in a form that is not...
49 CFR 802.5 - Procedures for requests pertaining to individual records in a record system.
Code of Federal Regulations, 2010 CFR
2010-10-01
...) To a recipient who has provided the NTSB with advance adequate assurance that the record will be used solely as a statistical research or reporting record and that it is to be transferred in a form not...
Considerable concern has been raised regarding research reproducibility both within and outside the scientific community. Several factors possibly contribute to a lack of reproducibility, including a failure to adequately employ statistical considerations during study design, bia...
Evaluation of a newly developed infant chest compression technique
Smereka, Jacek; Bielski, Karol; Ladny, Jerzy R.; Ruetzler, Kurt; Szarpak, Lukasz
2017-01-01
Abstract Background: Providing adequate chest compression is essential during infant cardio-pulmonary resuscitation (CPR) but has been reported to be performed poorly. The “new 2-thumb technique” (nTTT), which consists in using 2 thumbs directed at an angle of 90° to the chest while closing the fingers of both hands in a fist, was recently introduced. Therefore, the aim of this study was to compare 3 chest compression techniques, namely the 2-finger technique (TFT), the 2-thumb technique (TTHT), and the nTTT, in a randomized infant-CPR manikin setting. Methods: A total of 73 paramedics with at least 1 year of clinical experience performed 3 CPR settings with a chest compression:ventilation ratio of 15:2, according to current guidelines. Chest compression was performed with 1 of the 3 chest compression techniques in a randomized sequence. Chest compression rate and depth, chest decompression, and adequate ventilation after chest compression served as outcome parameters. Results: The chest compression depth was 29 (IQR, 28–29) mm in the TFT group, 42 (40–43) mm in the TTHT group, and 40 (39–40) mm in the nTTT group (TFT vs TTHT, P < 0.001; TFT vs nTTT, P < 0.001; TTHT vs nTTT, P < 0.01). The median compression rate with TFT, TTHT, and nTTT varied and amounted to 136 (IQR, 133–144) min⁻¹ versus 117 (115–121) min⁻¹ versus 111 (109–113) min⁻¹. There was a statistically significant difference in the compression rate between TFT and TTHT (P < 0.001), TFT and nTTT (P < 0.001), as well as TTHT and nTTT (P < 0.001). Incorrect decompressions after chest compression were significantly increased in the TTHT group compared with the TFT (P < 0.001) and the nTTT (P < 0.001) groups. Conclusions: The nTTT provides adequate chest compression depth and rate and was associated with adequate chest decompression and the possibility to adequately ventilate the infant manikin. Further clinical studies are necessary to confirm these initial findings. PMID:28383397
Cardiorespiratory Fitness, Waist Circumference and Alanine Aminotransferase in Youth
Trilk, Jennifer L.; Ortaglia, Andrew; Blair, Steven N.; Bottai, Matteo; Church, Timothy S.; Pate, Russell R.
2012-01-01
Non-alcoholic fatty liver disease (NAFLD) is considered the liver component of the metabolic syndrome and is strongly associated with cardiometabolic diseases. In adults, cardiorespiratory fitness (CRF) is inversely associated with alanine aminotransferase (ALT), a blood biomarker for NAFLD. However, information regarding these associations is scarce for youth. Purpose To examine associations between CRF, waist circumference (WC) and ALT in youth. Methods Data were obtained from youth (n=2844, 12-19 years) in the National Health and Nutrition Examination Survey (NHANES) 2001-2004. CRF was dichotomized into youth FITNESSGRAM® categories of “low” and “adequate” CRF. Logistic and quantile regression were used for a comprehensive analysis of associations, and variables with previously-reported associations with ALT were a priori included in the models. Results Results from logistic regression suggested that youth with low CRF had 1.5 times the odds of having an ALT>30 than youth with adequate CRF, although the association was not statistically significant (P=0.09). However, quantile regression demonstrated that youth with low CRF had statistically significantly higher ALT (+1.04, +1.05, and +2.57 U/L) at the upper end of the ALT distribution (80th, 85th, and 90th percentiles, respectively) than youth with adequate CRF. For every 1-cm increase in WC, the odds of having an ALT>30 increased by 1.06 (P<0.001), and the strength of this association increased across the ALT distribution. Conclusions Future studies should examine whether interventions to improve CRF can decrease hepatic fat and liver enzyme concentrations in youth with ALT ≥80th percentile or in youth diagnosed with NAFLD. PMID:23190589
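Since the key result above comes from quantile regression, the sketch below shows the corresponding statsmodels call on synthetic data: ALT is regressed on a low-fitness indicator and waist circumference at several upper quantiles. The variable names and the data-generating model are assumptions for illustration, not the NHANES data used in the study.

```python
# A minimal sketch with synthetic data: conditional-quantile regression of ALT
# on a low-fitness indicator and waist circumference using statsmodels QuantReg.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "low_crf": rng.integers(0, 2, n),        # 1 = low cardiorespiratory fitness (hypothetical)
    "waist":   rng.normal(80, 12, n),        # waist circumference, cm (hypothetical)
})
# assumed model: low fitness and larger waist thicken the right tail of ALT
df["alt"] = 15 + 0.15 * df["waist"] + 2.0 * df["low_crf"] + rng.gamma(2.0, 3.0, n)

for q in (0.50, 0.80, 0.90):
    fit = smf.quantreg("alt ~ low_crf + waist", df).fit(q=q)
    print(f"quantile {q:.2f}: low-CRF coefficient = {fit.params['low_crf']:.2f} U/L")
```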
Operator-related aspects in endodontic malpractice claims in Finland.
Vehkalahti, Miira M; Swanljung, Outi
2017-04-01
We analyzed operator-related differences in endodontic malpractice claims in Finland. Data comprised the endodontic malpractice claims handled at the Patient Insurance Centre (PIC) in 2002-2006 and 2011-2013. Two dental advisors at the PIC scrutinized the original documents of the cases (n = 1271). The case-related information included the patient's age and gender, type of tooth, presence of radiographs, and the methods of instrumentation and apex location. As injuries, we recorded broken instrument, perforation, injuries due to root canal irrigants/medicaments, and miscellaneous injuries. We categorized the injuries according to the PIC decisions as avoidable, unavoidable, or no injury. Operator-related information included the dentist's age, gender, specialization, and service sector. We assessed the level of patient documentation as adequate, moderate, or poor. Chi-squared tests, t-tests, and logistic regression modelling were used for the statistical analyses. Patients' mean age was 44.7 (range 8-85) years, and 71% were women. The private sector accounted for 54% of claim cases. Younger patients, female dentists, and general practitioners predominated in the public sector. We found no sector differences in patients' gender, dentists' age, or type of injured tooth. PIC advisors confirmed no injury in 24% of claim cases; the advisors considered 65% of injury cases (n = 970) as avoidable and 35% as unavoidable. We found no operator-related differences in these figures. Working methods differed by operator's age and gender. Adequate patient documentation predominated in the public sector and among female, younger, or specialized dentists. Operator-related factors had no impact on endodontic malpractice claims.
75 FR 5518 - Dithianon; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-03
... adequate methodology LC/MS/MS method (BASF 244882) is available for enforcing the proposed tolerance on grapes. Adequate multi-residue method testing data are available for dithianon and these data have been... docket index available at http://www.regulations.gov . Although listed in the index, some information is...
NASA Technical Reports Server (NTRS)
Semler, T. T.
1973-01-01
The method of pseudo-resonance cross sections is used to analyze published temperature-dependent neutron transmission and self-indication measurements on tantalum in the unresolved region. In the energy region analyzed, 1825.0 to 2017.0 eV, a direct application of the pseudo-resonance approach using a customary average strength function will not provide effective cross sections which fit the measured cross section behavior. Rather, a local value of the strength function is required, and a set of resonances which model the measured behavior of the effective cross sections is derived. This derived set of resonance parameters adequately represents the observed resonance behavior in this local energy region. Similar analyses of the measurements in other unresolved energy regions are necessary to obtain local resonance parameters for improved reactor calculations. This study suggests that Doppler coefficients calculated by sampling from grand average statistical distributions over the entire unresolved resonance region can be in error, since significant local variations in the statistical distributions are not taken into consideration.
Muñoz-Lasa, Susana; López de Silanes, Carlos; Atín-Arratibel, M Ángeles; Bravo-Llatas, Carmen; Pastor-Jimeno, Salvador; Máximo-Bocanegra, Nuria
2018-04-19
Hippotherapy is being used as a promising method in the physical treatment of multiple sclerosis (MS). A comparative open clinical pre-post study of a 6-month hippotherapy intervention was conducted in patients with MS (n=6); the study was not randomised and included a control group (n=4). The study was performed by the MHG Foundation. A statistically significant pre-post improvement was observed in the therapy group in spasticity measured by the modified Ashworth scale (P=.01). Statistically significant improvements were also observed in fatigue impact (P<.0001) measured with the FIS, in general perception of health outcome on the urinary quality of life scale KHQ (P=.033), and in subscales 2, 3 and 4 of the MSQOL-54 (P=.011). The control group showed no improvement on any scale. This study reinforces the current literature that supports hippotherapy as an adequate intervention for MS patients. Further studies with more participants, control groups and blinded research would be logical steps for future research in this field. Copyright © 2018 Elsevier España, S.L.U. All rights reserved.
Handique, Bijoy K; Khan, Siraj A; Mahanta, J; Sudhakar, S
2014-09-01
Japanese encephalitis (JE) is one of the dreaded mosquito-borne viral diseases mostly prevalent in south Asian countries including India. Early warning of the disease in terms of disease intensity is crucial for taking adequate and appropriate intervention measures. The present study was carried out in Dibrugarh district in the state of Assam, located in the northeastern region of India, to assess the accuracy of selected forecasting methods based on historical morbidity patterns of JE incidence during the past 22 years (1985-2006). Four selected forecasting methods, viz. seasonal average (SA), seasonal adjustment with last three observations (SAT), a modified method adjusting long-term and cyclic trend (MSAT), and autoregressive integrated moving average (ARIMA), were employed to assess the accuracy of each of the forecasting methods. The forecasting methods were validated for five consecutive years from 2007-2012 and the accuracy of each method was assessed. The forecasting method utilising seasonal adjustment with long-term and cyclic trend emerged as the best among the four selected forecasting methods and outperformed even the statistically more advanced ARIMA method. The peak of the disease incidence could effectively be predicted with all the methods, but there are significant variations in the magnitude of forecast errors among the selected methods. As expected, variation in forecasts at the primary health centre (PHC) level is wide as compared to that of district level forecasts. The study showed that the adopted forecasting techniques could reasonably forecast the intensity of JE cases at PHC level without considering external variables. The results indicate that understanding the long-term and cyclic trend of the disease intensity will improve the accuracy of the forecasts, but there is a need for making the forecast models more robust to explain sudden variation in the disease intensity with detailed analysis of parasite and host population dynamics.
Ilesanmi, O S; Alele, F O
2015-01-01
The role of Medical Audit in patient care needs to be explored. This study aimed to determine doctors' knowledge and practice of Medical Audit in a tertiary health facility in South West Nigeria. A cross-sectional study of 115 consenting doctors at Federal Medical Centre Owo was conducted. A semi-structured, self-administered questionnaire was used. Data were analyzed using SPSS version 21. Descriptive statistics were presented using frequency tables and a bar chart; age and years of practice were summarized as mean and standard deviation. The chi-square test was used to compare sociodemographic variables with doctors' knowledge of Medical Audit. The level of statistical significance was 5%. The mean age of the respondents was 32.5 ± 5.8 years. Males were 78%, and 61.7% were married. The mean duration of practice was 3.3 ± 2.2 years. Adequate knowledge of Medical Audit was found in 79% of the respondents while only 53% had practiced it. Formal training on Medical Audit had not been received by 91.3% of the respondents, and 80.9% requested training on Medical Audit. In all, 88.0% of those who had ≥3 years of practice had adequate knowledge compared with only 72.3% of those who had less than three years of practice (p = 0.040). Practice of Medical Audit is low though adequate knowledge exists. Training of doctors on Medical Audit is required.
29 CFR Section 1607.16 - Definitions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... action are open to users. T. Skill. A present, observable competence to perform a learned psychomotor act... criterion-related validity studies. These conditions include: (1) An adequate sample of persons available for the study to achieve findings of statistical significance; (2) having or being able to obtain a...
A downscaling method for the assessment of local climate change
NASA Astrophysics Data System (ADS)
Bruno, E.; Portoghese, I.; Vurro, M.
2009-04-01
The use of complementary models is necessary to study the impact of climate change scenarios on the hydrological response at different space-time scales. However, the structure of GCMs is such that their space resolution (hundreds of kilometres) is too coarse and not adequate to describe the variability of extreme events at basin scale (Burlando and Rosso, 2002). Bridging the space-time gap between the climate scenarios and the usual scale of the inputs for hydrological prediction models is a fundamental requisite for the evaluation of climate change impacts on water resources. Since models operate a simplification of a complex reality, their results cannot be expected to fit the climate observations. Identifying local climate scenarios for impact analysis implies the definition of more detailed local scenarios by downscaling GCM or RCM results. Among the output correction methods we consider the statistical approach by Déqué (2007), reported as a 'variable correction method', in which the correction of model outputs is obtained by a function built with the observation dataset and operating a quantile-quantile transformation (Q-Q transform). However, in the case of daily precipitation fields the Q-Q transform is not able to correct the temporal properties of the model output concerning the dry-wet lacunarity process. An alternative correction method is proposed based on a stochastic description of the arrival-duration-intensity processes in coherence with the Poissonian Rectangular Pulse scheme (PRP) (Eagleson, 1972). In this proposed approach, the Q-Q transform is applied to the PRP variables derived from the daily rainfall datasets. Consequently, the corrected PRP parameters are used for the synthetic generation of statistically homogeneous rainfall time series that mimic the persistency of daily observations for the reference period. The PRP parameters are then forced with the GCM scenarios to generate local-scale rainfall records for the 21st century. The statistical parameters characterizing daily storm occurrence, storm intensity and duration needed to apply the PRP scheme are considered among the STARDEX collection of extreme indices.
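For context, a generic empirical quantile-quantile correction of the kind referred to above can be written in a few lines: each scenario value is mapped through the model-to-observation quantile relation established in the reference period. This is plain quantile mapping on synthetic gamma-distributed rainfall, not Déqué's exact implementation nor the PRP-based variant proposed here; all distribution parameters are illustrative assumptions.

```python
# A minimal sketch of empirical quantile-quantile (Q-Q) correction with
# synthetic daily rainfall; the gamma parameters are illustrative assumptions.
import numpy as np

def qq_correct(model_ref, obs_ref, model_scen, n_q=100):
    """Map scenario values through the model->observation quantile relation."""
    probs = np.linspace(0.0, 1.0, n_q)
    model_q = np.quantile(model_ref, probs)   # model quantiles, reference period
    obs_q   = np.quantile(obs_ref, probs)     # observed quantiles, reference period
    return np.interp(model_scen, model_q, obs_q)

rng = np.random.default_rng(0)
obs_ref    = rng.gamma(shape=0.7, scale=9.0, size=3650)   # observed daily rainfall (mm)
model_ref  = rng.gamma(shape=0.9, scale=6.0, size=3650)   # biased model output, same period
model_scen = rng.gamma(shape=0.9, scale=6.6, size=3650)   # future model output

corrected = qq_correct(model_ref, obs_ref, model_scen)
print("scenario mean before/after correction:",
      round(model_scen.mean(), 2), round(corrected.mean(), 2))
```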
The Influence of Roughness on Gear Surface Fatigue
NASA Technical Reports Server (NTRS)
Krantz, Timothy
2005-01-01
Gear working surfaces are subjected to repeated rolling and sliding contacts, and often designs require loads sufficient to cause eventual fatigue of the surface. This research provides experimental data and analytical tools to further the understanding of the causal relationship of gear surface roughness to surface fatigue. The research included evaluations and developments of statistical tools for gear fatigue data, experimental evaluation of the surface fatigue lives of superfinished gears with a near-mirror quality, and evaluations of the experiments by analytical methods and surface inspections. Alternative statistical methods were evaluated using Monte Carlo studies, leading to a final recommendation to describe gear fatigue data using a Weibull distribution, maximum likelihood estimates of shape and scale parameters, and a presumed zero-valued location parameter. A new method was developed for comparing two datasets by extending the current methods of likelihood-ratio based statistics. The surface fatigue lives of superfinished gears were evaluated by carefully controlled experiments, and it is shown conclusively that superfinishing of gears can provide for significantly greater lives relative to ground gears. The measured life improvement was approximately a factor of five. To assist with application of this finding to products, the experimental condition was evaluated. The fatigue life results were expressed in terms of specific film thickness and shown to be consistent with bearing data. Elastohydrodynamic and stress analyses were completed to relate the stress condition to fatigue. Smooth-surface models do not adequately explain the improved fatigue lives. Based on analyses using a rough surface model, it is concluded that the improved fatigue lives of superfinished gears are due to a reduced rate of near-surface micropitting fatigue processes, not due to any reduced rate of spalling (sub-surface) fatigue processes. Finally, surface inspections were performed. The surface topographies of the ground gears changed substantially due to running, but the topographies of the superfinished gears were essentially unchanged with running.
Reboiras-López, M D; Pérez-Sayáns, M; Somoza-Martín, J M; Gayoso-Diz, P; Barros-Angueira, F; Gándara-Rey, J M; García-García, A
2012-06-01
Interest in oral exfoliative cytology has increased with the availability of molecular markers that may lead to the earlier diagnosis of oral squamous cell carcinoma. This research aims to compare the efficacy of three different instruments (Cytobrush, curette and Oral CDx brush) in providing adequate material for molecular analysis. One hundred and four cytological samples obtained from volunteer healthy subjects were analysed using all three instruments. The clinical and demographical variables under study were age, sex and smoking habits. The three instruments were compared for their ability to obtain adequate samples and for the amount of RNA obtained using quantitative real-time polymerase chain reaction (PCR-qRT) analysis of the Abelson (ABL) housekeeping gene. RNA of the ABL gene has been quantified by number of copies. Adequate samples were more likely to be obtained with a curette (90.6%) or Oral CDx (80.0%) than a Cytobrush (48.6%); P < 0.001. Similarly, the RNA quantification was 17.64 ± 21.10 with a curette, 16.04 ± 15.81 with Oral CDx and 6.82 ± 6.71 with a Cytobrush. There were statistically significant differences between the Cytobrush and curette (P = 0.008) and between the Cytobrush and OralCDx (P = 0.034). There was no difference according to the demographical variables. Oral exfoliative cytology is a simple, non-invasive technique that provides sufficient RNA to perform studies on gene expression. Although material was obtained with all three instruments, adequate samples were more likely to be obtained with the curette or Oral CDx than with a Cytobrush. The Oral CDx is a less aggressive instrument than the curette, so could be a useful tool in a clinical setting. © 2011 Blackwell Publishing Ltd.
Succession Planning in State Health Agencies in the United States: A Brief Report.
Harper, Elizabeth; Leider, Jonathon P; Coronado, Fatima; Beck, Angela J
2017-11-02
Approximately 25% of the public health workforce plans to retire by 2020. Succession planning is a core capability of the governmental public health enterprise; however, limited data are available regarding these efforts in state health agencies (SHAs). We analyzed 2016 Workforce Gaps Survey data regarding succession planning in SHAs using the US Office of Personnel Management's (OPM's) succession planning model, including 6 domains and 27 activities. Descriptive statistics were calculated for all 41 responding SHAs. On average, SHAs self-reported adequately addressing 11 of 27 succession planning activities, with 93% of SHAs adequately addressing 1 or more activities and 61% adequately addressing 1 or more activities in each domain. The majority of OPM-recommended succession planning activities are not being addressed, and limited succession planning occurs across SHAs. Greater activity in the OPM-identified succession planning domains may help SHAs contend with significant turnover and better preserve institutional knowledge.
Methods of Reverberation Mapping. I. Time-lag Determination by Measures of Randomness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chelouche, Doron; Pozo-Nuñez, Francisco; Zucker, Shay, E-mail: doron@sci.haifa.ac.il, E-mail: francisco.pozon@gmail.com, E-mail: shayz@post.tau.ac.il
A class of methods for measuring time delays between astronomical time series is introduced in the context of quasar reverberation mapping, which is based on measures of randomness or complexity of the data. Several distinct statistical estimators are considered that do not rely on polynomial interpolations of the light curves nor on their stochastic modeling, and do not require binning in correlation space. Methods based on von Neumann's mean-square successive-difference estimator are found to be superior to those using other estimators. An optimized von Neumann scheme is formulated, which better handles sparsely sampled data and outperforms current implementations of discrete correlation function methods. This scheme is applied to existing reverberation data of varying quality, and consistency with previously reported time delays is found. In particular, the size–luminosity relation of the broad-line region in quasars is recovered with a scatter comparable to that obtained by other works, yet with fewer assumptions made concerning the process underlying the variability. The proposed method for time-lag determination is particularly relevant for irregularly sampled time series, and in cases where the process underlying the variability cannot be adequately modeled.
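The core idea of the randomness-based lag estimators can be sketched quickly: shift one light curve by a trial lag, merge the two curves into a single time-ordered series, and choose the lag that minimizes von Neumann's mean-square successive difference. The simplified version below assumes both curves share a common mean and scale and uses a synthetic sinusoidal signal; it is not the authors' optimized scheme.

```python
# A minimal sketch of a von Neumann-style time-lag scan on two irregularly
# sampled, synthetic light curves (not the authors' optimized estimator).
import numpy as np

def von_neumann_score(t1, f1, t2, f2, lag):
    t = np.concatenate([t1, t2 - lag])      # shift the echo light curve back by the trial lag
    f = np.concatenate([f1, f2])
    f = f[np.argsort(t)]                    # merge into one time-ordered series
    return np.mean(np.diff(f) ** 2)         # small when the merged series is smooth

rng = np.random.default_rng(3)
signal = lambda t: np.sin(2.0 * np.pi * t / 60.0)
t1 = np.sort(rng.uniform(0.0, 200.0, 120))
f1 = signal(t1) + 0.05 * rng.standard_normal(t1.size)
t2 = np.sort(rng.uniform(0.0, 200.0, 120))
true_lag = 12.0
f2 = signal(t2 - true_lag) + 0.05 * rng.standard_normal(t2.size)

lags = np.arange(0.0, 30.0, 0.5)
scores = [von_neumann_score(t1, f1, t2, f2, lag) for lag in lags]
print("estimated lag (days):", lags[int(np.argmin(scores))])   # should land near the true lag of 12
```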
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winkler, David A., E-mail: dave.winkler@csiro.au
2016-05-15
Nanomaterials research is one of the fastest growing contemporary research areas. The unprecedented properties of these materials have meant that they are being incorporated into products very quickly. Regulatory agencies are concerned they cannot assess the potential hazards of these materials adequately, as data on the biological properties of nanomaterials are still relatively limited and expensive to acquire. Computational modelling methods have much to offer in helping understand the mechanisms by which toxicity may occur, and in predicting the likelihood of adverse biological impacts of materials not yet tested experimentally. This paper reviews the progress these methods, particularly those QSAR-based,more » have made in understanding and predicting potentially adverse biological effects of nanomaterials, and also the limitations and pitfalls of these methods. - Highlights: • Nanomaterials regulators need good information to make good decisions. • Nanomaterials and their interactions with biology are very complex. • Computational methods use existing data to predict properties of new nanomaterials. • Statistical, data driven modelling methods have been successfully applied to this task. • Much more must be learnt before robust toolkits will be widely usable by regulators.« less
Review of: Methods to complete watershed analysis on Pacific Lumber lands in Northern California
L. M. Reid
1999-01-01
The three questions of primary concern for this review are: 1) are the WDNR modules adequately and validly modified to suit local conditions, as required by the HCP/SYP? 2) is there an adequate "distinct cumulative effects assessment" method, as required by the HCP/SYP? 3) will the cumulative effects assessment method and the modified WDNR modules be...
NASA Astrophysics Data System (ADS)
Farkas, I.; Helbing, D.; Vicsek, T.
2003-12-01
The Mexican wave, first widely broadcast during the 1986 World Cup held in Mexico, is a human wave moving along the stands of stadiums as one section of spectators stands up, arms lifted, and then sits down as the next section does the same. Here we use variants of models originally developed for the description of excitable media to demonstrate that this collective human behaviour can be quantitatively interpreted by methods of statistical physics. Adequate modelling of reactions to triggering attempts provides a deeper insight into the mechanisms by which a crowd can be stimulated to execute a particular pattern of behaviour and represents a possible tool of control during events involving excited groups of people. Interactive simulations, video recordings and further images are available at the webpage dedicated to this work: http://angel.elte.hu/wave.
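To make the excitable-media analogy concrete, the toy model below (my own caricature, not the authors' detailed model) treats each spectator on a ring as resting, active (standing) or refractory; a resting spectator stands with some probability when spectators just behind the advancing front are active. All parameter values are illustrative assumptions.

```python
# A minimal sketch of a three-state excitable medium on a ring of spectators.
import numpy as np

REST, ACTIVE, REFRACT = 0, 1, 2
n_spectators, p_follow, t_active, t_refract = 400, 0.9, 3, 8   # assumed parameters
rng = np.random.default_rng(0)

state = np.full(n_spectators, REST)
clock = np.zeros(n_spectators, dtype=int)
state[:5] = ACTIVE                                  # a small triggering group starts the wave

for step in range(90):
    new_state = state.copy()
    for i in range(n_spectators):
        if state[i] == REST:
            behind = state[(i - np.arange(1, 4)) % n_spectators]   # neighbours behind the front
            if np.any(behind == ACTIVE) and rng.random() < p_follow:
                new_state[i], clock[i] = ACTIVE, 0
        else:
            clock[i] += 1
            if state[i] == ACTIVE and clock[i] >= t_active:
                new_state[i], clock[i] = REFRACT, 0
            elif state[i] == REFRACT and clock[i] >= t_refract:
                new_state[i] = REST
    state = new_state
    if step % 30 == 0:
        active = np.flatnonzero(state == ACTIVE)
        print(f"step {step:3d}: active seats {active.min()}-{active.max()}"
              if active.size else f"step {step:3d}: wave died out")
```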
Cardiovascular Disease and Associated Risk Factors in Cuba: Prospects for Prevention and Control
Cooper, Richard S.; Orduñez, Pedro; Iraola Ferrer, Marcos D.; Munoz, Jose Luis Bernal; Espinosa-Brito, Alfredo
2006-01-01
Objectives. An adequate description of the trends in cardiovascular disease (CVD) is not available for most of the developing world. Cuba provides an important exception, and we sought to use available data to offer insights into the changing patterns of CVD there. Methods. We reviewed Cuban public health statistics, surveys, and reports of health services. Results. CVD has been the leading cause of death since 1970. A 45% reduction in heart disease deaths was observed from 1970 to 2002; the decline in stroke was more limited. There are moderate prevalences of all major risk factors. Conclusions. The Cuban medical care system has responded vigorously to the challenge of CVD; levels of control of hypertension are the highest in the world. Nonindustrialized countries can decisively control CVD. PMID:16317211
Fast and accurate automated cell boundary determination for fluorescence microscopy
NASA Astrophysics Data System (ADS)
Arce, Stephen Hugo; Wu, Pei-Hsun; Tseng, Yiider
2013-07-01
Detailed measurement of cell phenotype information from digital fluorescence images has the potential to greatly advance biomedicine in various disciplines such as patient diagnostics or drug screening. Yet, the complexity of cell conformations presents a major barrier preventing effective determination of cell boundaries, and introduces measurement error that propagates throughout subsequent assessment of cellular parameters and statistical analysis. State-of-the-art image segmentation techniques that require user-interaction, prolonged computation time and specialized training cannot adequately provide the support for high content platforms, which often sacrifice resolution to foster the speedy collection of massive amounts of cellular data. This work introduces a strategy that allows us to rapidly obtain accurate cell boundaries from digital fluorescent images in an automated format. Hence, this new method has broad applicability to promote biotechnology.
NASA Technical Reports Server (NTRS)
Huyse, Luc; Bushnell, Dennis M. (Technical Monitor)
2001-01-01
Free-form shape optimization of airfoils poses unexpected difficulties. Practical experience has indicated that a deterministic optimization for discrete operating conditions can result in dramatically inferior performance when the actual operating conditions are different from the - somewhat arbitrary - design values used for the optimization. Extensions to multi-point optimization have proven unable to adequately remedy this problem of "localized optimization" near the sampled operating conditions. This paper presents an intrinsically statistical approach and demonstrates how the shortcomings of multi-point optimization with respect to "localized optimization" can be overcome. The practical examples also reveal how the relative likelihood of each of the operating conditions is automatically taken into consideration during the optimization process. This is a key advantage over the use of multipoint methods.
Quantitative topographic differentiation of the neonatal EEG.
Paul, Karel; Krajca, Vladimír; Roth, Zdenek; Melichar, Jan; Petránek, Svojmil
2006-09-01
To test the discriminatory topographic potential of a new method of automatic EEG analysis in neonates. A quantitative description of the neonatal EEG can contribute to the objective assessment of the functional state of the brain, and may improve the precision of diagnosing cerebral dysfunctions manifested by 'disorganization', 'dysrhythmia' or 'dysmaturity'. Twenty-one healthy, full-term newborns were examined polygraphically during sleep (EEG-8 referential derivations, respiration, ECG, EOG, EMG). From each EEG record, two 5-min samples (one from the middle of quiet sleep, the other from the middle of active sleep) were subjected to automatic analysis and were described by 13 variables: spectral features and features describing the shape and variability of the signal. The data from individual infants were averaged and the number of variables was reduced by factor analysis. All factors identified by factor analysis were statistically significantly influenced by the location of derivation. A large number of statistically significant differences were also established when comparing the effects of individual derivations on each of the 13 measured variables. Both spectral features and features describing the shape and variability of the signal largely account for the topographic differentiation of the neonatal EEG. The presented method of automatic EEG analysis is capable of assessing the topographic characteristics of the neonatal EEG; it is adequately sensitive and describes the neonatal electroencephalogram with sufficient precision. The discriminatory capability of the method is promising for its application in clinical practice.
Van Nuffel, A; Tuyttens, F A M; Van Dongen, S; Talloen, W; Van Poucke, E; Sonck, B; Lens, L
2007-12-01
Nonidentical development of bilateral traits due to disturbing genetic or developmental factors is called fluctuating asymmetry (FA) if such deviations are continuously distributed. Fluctuating asymmetry is believed to be a reliable indicator of the fitness and welfare of an animal. Despite an increasing body of research, the link between FA and animal performance or welfare is reported to be inconsistent, possibly, among other reasons, due to inaccurate measuring protocols or incorrect statistical analyses. This paper reviews problems of interpreting FA results in poultry and provides guidelines for the measurement and analysis of FA, applied to broilers. A wide range of morphological traits were measured by 7 different techniques (ranging from measurements on living broilers or intact carcasses to X-rays, bones, and digital images) and evaluated for their applicability to estimate FA. Following 4 selection criteria (significant FA, absence of directional asymmetry or antisymmetry, absence of between-trait correlation in signed FA values, and high signal-to-noise ratio), from 3 to 14 measurements per method were found suitable for estimating the degree of FA. The accuracy of FA estimates was positively related to the complexity and time investment of the measuring method. In addition, our study clearly shows the importance of securing adequate statistical power when designing FA studies. Repeatability analyses of FA estimates indicated the need for larger sample sizes, more repeated measurements, or both, than are commonly used in FA studies.
Statistics of Data Fitting: Flaws and Fixes of Polynomial Analysis of Channeled Spectra
NASA Astrophysics Data System (ADS)
Karstens, William; Smith, David
2013-03-01
Starting from general statistical principles, we have critically examined Baumeister's procedure* for determining the refractive index of thin films from channeled spectra. Briefly, the method assumes that the index and interference fringe order may be approximated by polynomials quadratic and cubic in photon energy, respectively. The coefficients of the polynomials are related by differentiation, which is equivalent to comparing energy differences between fringes. However, we find that when the fringe order is calculated from the published IR index for silicon* and then analyzed with Baumeister's procedure, the results do not reproduce the original index. This problem has been traced to 1. Use of unphysical powers in the polynomials (e.g., time-reversal invariance requires that the index is an even function of photon energy), and 2. Use of insufficient terms of the correct parity. Exclusion of unphysical terms and addition of quartic and quintic terms to the index and order polynomials yields significantly better fits with fewer parameters. This represents a specific example of using statistics to determine if the assumed fitting model adequately captures the physics contained in experimental data. The use of analysis of variance (ANOVA) and the Durbin-Watson statistic to test criteria for the validity of least-squares fitting will be discussed. *D.F. Edwards and E. Ochoa, Appl. Opt. 19, 4130 (1980). Supported in part by the US Department of Energy, Office of Nuclear Physics under contract DE-AC02-06CH11357.
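For illustration, the parity constraint described above can be imposed directly in a least-squares fit. The following sketch is not the authors' code; the photon energies, index values and maximum polynomial order are invented, and it simply shows a fit of a refractive index restricted to even powers of photon energy.

```python
import numpy as np

def fit_even_polynomial(energy, index, max_power=4):
    """Least-squares fit of n(E) = c0 + c2*E**2 + c4*E**4 (even powers only)."""
    powers = np.arange(0, max_power + 1, 2)
    design = np.column_stack([energy ** p for p in powers])
    coeffs, *_ = np.linalg.lstsq(design, index, rcond=None)
    return dict(zip(powers.tolist(), coeffs))

# Hypothetical dispersion data: a constant index plus a small even-order term.
E = np.linspace(0.1, 1.0, 50)          # photon energy (arbitrary units)
n = 3.42 + 0.05 * E ** 2
print(fit_even_polynomial(E, n))
```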
10 CFR 1304.110 - Disclosure of records to third parties.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Board with adequate advance written assurance that the record will be used solely as a statistical... under the control of the United States for a civil or criminal law enforcement activity, if the activity... record is disclosed under such compulsory legal process, the Board shall make reasonable efforts to...
The Evaluation and Selection of Adequate Causal Models: A Compensatory Education Example.
ERIC Educational Resources Information Center
Tanaka, Jeffrey S.
1982-01-01
Implications of model evaluation (using traditional chi square goodness of fit statistics, incremental fit indices for covariance structure models, and latent variable coefficients of determination) on substantive conclusions are illustrated with an example examining the effects of participation in a compensatory education program on posttreatment…
Signal Detection Theory as a Tool for Successful Student Selection
ERIC Educational Resources Information Center
van Ooijen-van der Linden, Linda; van der Smagt, Maarten J.; Woertman, Liesbeth; te Pas, Susan F.
2017-01-01
Prediction accuracy of academic achievement for admission purposes requires adequate "sensitivity" and "specificity" of admission tools, yet the available information on the validity and predictive power of admission tools is largely based on studies using correlational and regression statistics. The goal of this study was to…
Is Neurofeedback an Efficacious Treatment for ADHD? A Randomised Controlled Clinical Trial
ERIC Educational Resources Information Center
Gevensleben, Holger; Holl, Birgit; Albrecht, Bjorn; Vogel, Claudia; Schlamp, Dieter; Kratz, Oliver; Studer, Petra; Rothenberger, Aribert; Moll, Gunther H.; Heinrich, Hartmut
2009-01-01
Background: For children with attention deficit/hyperactivity disorder (ADHD), a reduction of inattention, impulsivity and hyperactivity by neurofeedback (NF) has been reported in several studies. But so far, unspecific training effects have not been adequately controlled for and/or studies do not provide sufficient statistical power. To overcome…
3D Self-Localisation From Angle of Arrival Measurements
2009-04-01
systems can provide precise position information. However, there are situations where GPS is not adequate, such as indoor, underwater, extraterrestrial or...
Tai, Patricia; Yu, Edward; Cserni, Gábor; Vlastos, Georges; Royce, Melanie; Kunkler, Ian; Vinh-Hung, Vincent
2005-01-01
Background The commonly used five-year survival rates are not adequate to represent statistical cure. In the present study, we established the minimum number of years of follow-up required to estimate the statistical cure rate, by using a lognormal distribution of the survival time of those who died of their cancer. We introduced the term 'threshold year': the follow-up time by which the survival data of patients dying from the specific cancer are almost entirely covered, leaving less than 2.25% uncovered. This is close enough to cure from that specific cancer. Methods Data from the Surveillance, Epidemiology and End Results (SEER) database were used to test whether the survival times of cancer patients who died of their disease followed a lognormal distribution, using a minimum chi-square method. Patients diagnosed from 1973–1992 in the registries of Connecticut and Detroit were chosen so that a maximum of 27 years was allowed for follow-up to 1999. A total of 49 specific organ sites were tested. The parameters of those lognormal distributions were found for each cancer site. The cancer-specific survival rates at the threshold years were compared with the longest available Kaplan-Meier survival estimates. Results The cancer-specific survival times of patients who died of their disease were verified to follow different lognormal distributions for 42 of the 49 cancer sites. The threshold years validated for statistical cure varied across cancer sites, from 2.6 years for pancreas cancer to 25.2 years for cancer of the salivary gland. At the threshold year, the statistical cure rates estimated for 40 cancer sites were found to match the actuarial long-term survival rates estimated by the Kaplan-Meier method within six percentage points. For two cancer sites, breast and thyroid, the threshold years were so long that the cancer-specific survival rates could not yet be obtained because the SEER data do not provide sufficiently long follow-up. Conclusion The present study suggests that a certain threshold year must be reached before the statistical cure rate can be estimated for each cancer site. For some cancers, such as breast and thyroid, the 5- or 10-year survival rates inadequately reflect statistical cure rates, highlighting the need for long-term follow-up of these patients. PMID:15904508
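As a rough illustration of the threshold-year calculation, the sketch below fits a lognormal distribution to synthetic survival times and reads off the 97.75th percentile. It uses maximum-likelihood fitting from SciPy rather than the minimum chi-square procedure of the study, and the survival times are invented, not SEER data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical survival times (years) of patients who died of a given cancer.
survival_years = rng.lognormal(mean=1.0, sigma=0.8, size=500)

# Fit a lognormal distribution with the location fixed at zero.
shape, loc, scale = stats.lognorm.fit(survival_years, floc=0)

# "Threshold year": follow-up time covering all but 2.25% of these deaths.
threshold_year = stats.lognorm.ppf(0.9775, shape, loc=loc, scale=scale)
print(f"threshold year ~ {threshold_year:.1f} years of follow-up")
```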
HSQC-1,n-ADEQUATE: a new approach to long-range 13C-13C correlation by covariance processing.
Martin, Gary E; Hilton, Bruce D; Willcott, M Robert; Blinov, Kirill A
2011-10-01
Long-range, two-dimensional heteronuclear shift correlation NMR methods play a pivotal role in the assembly of novel molecular structures. The well-established GHMBC method is a high-sensitivity mainstay technique, affording connectivity information via (n)J(CH) coupling pathways. Unfortunately, there is no simple way of determining the value of n and hence no way of differentiating two-bond from three- and occasionally four-bond correlations. Three-bond correlations, however, generally predominate. Recent work has shown that the unsymmetrical indirect covariance or generalized indirect covariance processing of multiplicity edited GHSQC and 1,1-ADEQUATE spectra provides high-sensitivity access to a (13)C-(13) C connectivity map in the form of an HSQC-1,1-ADEQUATE spectrum. Covariance processing of these data allows the 1,1-ADEQUATE connectivity information to be exploited with the inherent sensitivity of the GHSQC spectrum rather than the intrinsically lower sensitivity of the 1,1-ADEQUATE spectrum itself. Data acquisition times and/or sample size can be substantially reduced when covariance processing is to be employed. In an extension of that work, 1,n-ADEQUATE spectra can likewise be subjected to covariance processing to afford high-sensitivity access to the equivalent of (4)J(CH) GHMBC connectivity information. The method is illustrated using strychnine as a model compound. Copyright © 2011 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Rings, Joerg; Vrugt, Jasper A.; Schoups, Gerrit; Huisman, Johan A.; Vereecken, Harry
2012-05-01
Bayesian model averaging (BMA) is a standard method for combining predictive distributions from different models. In recent years, this method has enjoyed widespread application and use in many fields of study to improve the spread-skill relationship of forecast ensembles. The BMA predictive probability density function (pdf) of any quantity of interest is a weighted average of pdfs centered around the individual (possibly bias-corrected) forecasts, where the weights are equal to the posterior probabilities of the models generating the forecasts and reflect the individual models' skill over a training (calibration) period. The original BMA approach presented by Raftery et al. (2005) assumes that the conditional pdf of each individual model is adequately described with a rather standard Gaussian or Gamma statistical distribution, possibly with a heteroscedastic variance. Here we analyze the advantages of using BMA with a flexible representation of the conditional pdf. A joint particle filtering and Gaussian mixture modeling framework is presented to derive analytically, as closely and consistently as possible, the evolving forecast density (conditional pdf) of each constituent ensemble member. The median forecasts and evolving conditional pdfs of the constituent models are subsequently combined using BMA to derive one overall predictive distribution. This paper introduces the theory and concepts of this new ensemble postprocessing method, and demonstrates its usefulness and applicability by numerical simulation of the rainfall-runoff transformation using discharge data from three different catchments in the contiguous United States. The revised BMA method yields significantly lower prediction errors than the original default BMA method (due to filtering), with predictive uncertainty intervals that are substantially smaller but still statistically coherent (due to the use of a time-variant conditional pdf).
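For orientation, the default BMA predictive density discussed above is simply a weighted mixture of the member densities. The sketch below shows that baseline form with Gaussian conditional pdfs; the forecasts, weights and spreads are invented, and it does not reproduce the paper's particle-filter/Gaussian-mixture refinement.

```python
import numpy as np
from scipy import stats

forecasts = np.array([2.1, 2.6, 1.8])   # bias-corrected member forecasts (invented)
weights   = np.array([0.5, 0.3, 0.2])   # posterior model weights, summing to one
sigmas    = np.array([0.4, 0.5, 0.3])   # per-member predictive spreads

def bma_pdf(y):
    """BMA predictive density: weighted mixture of the member densities."""
    return float(np.sum(weights * stats.norm.pdf(y, loc=forecasts, scale=sigmas)))

print([round(bma_pdf(y), 3) for y in np.linspace(0.5, 4.0, 8)])
```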
Moon, Andres; Smith, Geoffrey H; Kong, Jun; Rogers, Thomas E; Ellis, Carla L; Farris, Alton B Brad
2018-02-01
Renal allograft rejection diagnosis depends on assessment of parameters such as interstitial inflammation; however, studies have shown interobserver variability regarding interstitial inflammation assessment. Since automated image analysis quantitation can be reproducible, we devised customized analysis methods for CD3+ T-cell staining density as a measure of rejection severity and compared them with established commercial methods along with visual assessment. Renal biopsy CD3 immunohistochemistry slides (n = 45), including renal allografts with various degrees of acute cellular rejection (ACR) were scanned for whole slide images (WSIs). Inflammation was quantitated in the WSIs using pathologist visual assessment, commercial algorithms (Aperio nuclear algorithm for CD3+ cells/mm 2 and Aperio positive pixel count algorithm), and customized open source algorithms developed in ImageJ with thresholding/positive pixel counting (custom CD3+%) and identification of pixels fulfilling "maxima" criteria for CD3 expression (custom CD3+ cells/mm 2 ). Based on visual inspections of "markup" images, CD3 quantitation algorithms produced adequate accuracy. Additionally, CD3 quantitation algorithms correlated between each other and also with visual assessment in a statistically significant manner (r = 0.44 to 0.94, p = 0.003 to < 0.0001). Methods for assessing inflammation suggested a progression through the tubulointerstitial ACR grades, with statistically different results in borderline versus other ACR types, in all but the custom methods. Assessment of CD3-stained slides using various open source image analysis algorithms presents salient correlations with established methods of CD3 quantitation. These analysis techniques are promising and highly customizable, providing a form of on-slide "flow cytometry" that can facilitate additional diagnostic accuracy in tissue-based assessments.
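A minimal sketch of the two kinds of custom measures described above: a thresholded positive-pixel percentage and a cell density per unit area. This is illustrative Python, not the authors' ImageJ macros, and the threshold, image and cell counts are placeholders.

```python
import numpy as np

def cd3_positive_percent(stain_intensity, threshold=120):
    """Percent of analyzed pixels whose stain intensity exceeds a threshold."""
    positive = stain_intensity > threshold
    return 100.0 * positive.sum() / stain_intensity.size

def cd3_cells_per_mm2(n_detected_cells, tissue_area_mm2):
    """CD3+ cell density from a detected-cell count and the analyzed area."""
    return n_detected_cells / tissue_area_mm2

img = np.random.default_rng(1).integers(0, 256, size=(512, 512))  # placeholder image
print(round(cd3_positive_percent(img), 1), cd3_cells_per_mm2(340, 2.5))
```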
Monitoring of bone regeneration process by means of texture analysis
NASA Astrophysics Data System (ADS)
Kokkinou, E.; Boniatis, I.; Costaridou, L.; Saridis, A.; Panagiotopoulos, E.; Panayiotakis, G.
2009-09-01
An image analysis method is proposed for the monitoring of the regeneration of the tibial bone. For this purpose, 130 digitized radiographs of 13 patients, who had undergone tibial lengthening by the Ilizarov method, were studied. For each patient, 10 radiographs, taken at an equal number of postoperative successive time moments, were available. Employing available software, 3 Regions Of Interest (ROIs), corresponding to the: (a) upper, (b) central, and (c) lower aspect of the gap, where bone regeneration was expected to occur, were determined on each radiograph. Employing custom developed algorithms: (i) a number of textural features were generated from each of the ROIs, and (ii) a texture-feature based regression model was designed for the quantitative monitoring of the bone regeneration process. Statistically significant differences (p < 0.05) were derived for the initial and the final textural features values, generated from the first and the last postoperatively obtained radiographs, respectively. A quadratic polynomial regression equation fitted data adequately (r2 = 0.9, p < 0.001). The suggested method may contribute to the monitoring of the tibial bone regeneration process.
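The quadratic regression step can be illustrated as follows; the texture-feature values and time points are invented, not the study's radiographic data.

```python
import numpy as np

time_points = np.arange(1, 11)                                  # 10 postoperative radiographs
feature = 0.9 - 0.12 * time_points + 0.008 * time_points ** 2   # assumed texture-feature values
feature = feature + np.random.default_rng(2).normal(0, 0.01, size=10)

coeffs = np.polyfit(time_points, feature, deg=2)                # quadratic regression
fitted = np.polyval(coeffs, time_points)
r2 = 1 - np.sum((feature - fitted) ** 2) / np.sum((feature - feature.mean()) ** 2)
print("coefficients:", np.round(coeffs, 4), "r2 =", round(r2, 3))
```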
Phytoplankton productivity in relation to light intensity: A simple equation
Peterson, D.H.; Perry, M.J.; Bencala, K.E.; Talbot, M.C.
1987-01-01
A simple exponential equation is used to describe photosynthetic rate as a function of light intensity for a variety of unicellular algae and higher plants, where photosynthesis is proportional to (1 - e^(-αI)). The parameter α (= 1/Ik) is derived by a simultaneous curve-fitting method, where I is incident quantum-flux density. The exponential equation is tested against a wide range of data and is found to adequately describe P vs. I curves. The errors associated with photosynthetic parameters are calculated. A simplified statistical model (Poisson) of photon capture provides a biophysical basis for the equation and for its ability to fit a range of light intensities. The exponential equation provides a non-subjective simultaneous curve-fitting estimate for photosynthetic efficiency (a), which is less ambiguous than subjective methods: subjective methods assume that a linear region of the P vs. I curve is readily identifiable. Photosynthetic parameters α and a are used widely in aquatic studies to define photosynthesis at low quantum flux. These parameters are particularly important in estuarine environments where high suspended-material concentrations and high diffuse-light extinction coefficients are commonly encountered. © 1987.
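Under one common parameterization of this model, the simultaneous curve fit can be sketched with SciPy as follows. The light intensities, noise level and parameter values are assumed, and the derived initial-slope efficiency a = Pmax/Ik follows that convention rather than the paper's exact definitions.

```python
import numpy as np
from scipy.optimize import curve_fit

def p_vs_i(I, Pmax, Ik):
    """Exponential light-saturation model: P = Pmax * (1 - exp(-I/Ik))."""
    return Pmax * (1.0 - np.exp(-I / Ik))

I = np.linspace(5, 800, 40)                                # quantum-flux density (assumed units)
P = p_vs_i(I, Pmax=12.0, Ik=150.0)
P = P + np.random.default_rng(3).normal(0, 0.3, I.size)    # measurement noise

popt, pcov = curve_fit(p_vs_i, I, P, p0=[10.0, 100.0])
Pmax_hat, Ik_hat = popt
print(f"Pmax={Pmax_hat:.2f}  Ik={Ik_hat:.1f}  alpha=1/Ik={1/Ik_hat:.4f}  efficiency a={Pmax_hat/Ik_hat:.4f}")
```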
Food Insecurity and Chronic Diseases Among American Indians in Rural Oklahoma: The THRIVE Study
Wetherill, Marianna S.; Hearod, Jordan; Jacob, Tvli; Salvatore, Alicia L.; Cannady, Tamela; Grammar, Mandy; Standridge, Joy; Fox, Jill; Spiegel, Jennifer; Wiley, AnDina; Noonan, Carolyn; Buchwald, Dedra
2017-01-01
Objectives. To examine food insecurity and cardiovascular disease–related health outcomes among American Indians (AIs) in rural Oklahoma. Methods. We surveyed a cross-sectional sample of 513 AI adults to assess food insecurity domains (i.e., food quality and quantity) and obesity, diabetes, and hypertension. Results. Among AIs surveyed, 56% reported inadequate food quantity and 62% reported inadequate food quality. The unadjusted prevalence of diabetes (28.4% vs 18.4%), obesity (60.0% vs 48.3%), and hypertension (54.1% vs 41.6%) was higher among participants with inadequate food quantity than among those with adequate food quantity. These associations did not reach statistical significance after adjustment for age, gender, study site, education, and income. The unadjusted prevalence of obesity (60.7% vs 45.8%), diabetes (27.3% vs 18.8%), and hypertension (52.5% vs 42.5%) was higher among those with inadequate food quality than among those with adequate food quality, even after adjustment for age, gender, study site, education, and income. Conclusions. Tribal, federal, and state policymakers, as well as businesses and nonprofit organizations, must collaboratively take aggressive action to address food insecurity and its underlying causes, including improving tribal food environments, reducing barriers to healthy foods, and increasing living wages. PMID:28103070
Leineweber, Constanze; Westerlund, Hugo; Chungkham, Holendro Singh; Lindqvist, Rikard; Runesdotter, Sara; Tishelman, Carol
2014-01-01
Objectives To investigate associations between the nurse work practice environment measured at department level and individual-level work-family conflict on burnout, measured as emotional exhaustion, depersonalization and personal accomplishment among Swedish RNs. Methods A multilevel model was fit with the individual RN at the 1st level and the hospital department at the 2nd level, using cross-sectional RN survey data from the Swedish part of RN4CAST, an EU 7th framework project. The data analysed here are based on a national sample of 8,620 RNs from 369 departments in 53 hospitals. Results Generally, RNs reported high values of personal accomplishment and lower values of emotional exhaustion and depersonalization. High work-family conflict increased the risk for emotional exhaustion, but for neither depersonalization nor personal accomplishment. At the department level, adequate staffing and good leadership and support for nurses reduced the risk for emotional exhaustion and depersonalization. Personal accomplishment was statistically significantly related to staff adequacy. Conclusions The findings suggest that adequate staffing, good leadership, and support for nurses are crucial for RNs' mental health. Our findings also highlight the importance of hospital managers developing policies and practices to facilitate the successful combination of work with private life for employees. PMID:24820972
Gaussian membership functions are most adequate in representing uncertainty in measurements
NASA Technical Reports Server (NTRS)
Kreinovich, V.; Quintana, C.; Reznik, L.
1992-01-01
In rare situations, like fundamental physics, we perform experiments without knowing what their results will be. In the majority of real-life measurement situations, we more or less know beforehand what kind of results we will get. Of course, this is not the precise knowledge of the type 'the result will be between alpha - beta and alpha + beta,' because in this case, we would not need any measurements at all. This is usually knowledge that is best represented in uncertain terms, like 'perhaps (or 'most likely', etc.) the measured value x is between alpha - beta and alpha + beta.' Traditional statistical methods neglect this additional knowledge and process only the measurement results. So it is desirable to be able to process this uncertain knowledge as well. A natural way to process it is by using fuzzy logic. But there is a problem: we can use different membership functions to represent the same uncertain statements, and different functions lead to different results. What membership function do we choose? In the present paper, we show that under some reasonable assumptions, Gaussian functions mu(x) = exp(-beta x^2) are the most adequate choice of membership function for representing uncertainty in measurements. This representation was efficiently used in testing jet engines for airplanes and spaceships.
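A minimal sketch of the Gaussian membership function mu(x) = exp(-beta x^2) discussed above, here centered at an assumed nominal value alpha with an assumed beta; both values are placeholders.

```python
import numpy as np

def gaussian_membership(x, alpha=0.0, beta=4.0):
    """Gaussian membership function mu(x) = exp(-beta * (x - alpha)**2)."""
    return np.exp(-beta * (x - alpha) ** 2)

print([round(gaussian_membership(x), 3) for x in (-1.0, -0.5, 0.0, 0.5, 1.0)])
```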
SpatialEpiApp: A Shiny web application for the analysis of spatial and spatio-temporal disease data.
Moraga, Paula
2017-11-01
In recent years, public health surveillance has been facilitated by the existence of several packages implementing statistical methods for the analysis of spatial and spatio-temporal disease data. However, these methods are still inaccessible to many researchers who lack the programming skills to effectively use the required software. In this paper we present SpatialEpiApp, a Shiny web application that integrates two of the most common approaches in health surveillance: disease mapping and detection of clusters. SpatialEpiApp is easy to use and does not require any programming knowledge. Given information about the cases, population and optionally covariates for each of the areas and dates of study, the application allows users to fit Bayesian models to obtain disease risk estimates and their uncertainty using R-INLA, and to detect disease clusters using SaTScan. The application allows user interaction and the creation of interactive data visualizations and reports showing the analyses performed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Using geostatistical methods to estimate snow water equivalence distribution in a mountain watershed
Balk, B.; Elder, K.; Baron, Jill S.
1998-01-01
Knowledge of the spatial distribution of snow water equivalence (SWE) is necessary to adequately forecast the volume and timing of snowmelt runoff. In April 1997, peak accumulation snow depth and density measurements were independently taken in the Loch Vale watershed (6.6 km2), Rocky Mountain National Park, Colorado. Geostatistics and classical statistics were used to estimate SWE distribution across the watershed. Snow depths were spatially distributed across the watershed through kriging interpolation methods, which provide unbiased estimates that have minimum variances. Snow densities were spatially modeled through regression analysis. Combining the modeled depth and density with snow-covered area (SCA) produced an estimate of the spatial distribution of SWE. The kriged estimates of snow depth explained 37-68% of the observed variance in the measured depths. Steep slopes, variably strong winds, and complex energy balance in the watershed contribute to a large degree of heterogeneity in snow depth.
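The final combination step (SWE from modeled depth, density and snow-covered area) reduces to a cell-by-cell product; the sketch below uses a hypothetical 2 x 2 grid, not the Loch Vale data.

```python
import numpy as np

depth_m = np.array([[1.2, 0.8], [0.0, 2.1]])         # kriged snow depth (m), hypothetical grid
density = np.array([[350.0, 300.0], [0.0, 400.0]])   # modeled snow density (kg/m^3)
snow_covered = np.array([[1, 1], [0, 1]])            # snow-covered area mask

swe_mm = depth_m * density * snow_covered             # SWE in mm of water (kg/m^2)
print(swe_mm)
print("mean SWE over the grid (mm):", swe_mm.mean())
```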
NASA Astrophysics Data System (ADS)
Theunissen, Raf; Kadosh, Jesse S.; Allen, Christian B.
2015-06-01
Spatially varying signals are typically sampled by collecting uniformly spaced samples irrespective of the signal content. For signals with inhomogeneous information content, this leads to unnecessarily dense sampling in regions of low interest or insufficient sample density at important features, or both. A new adaptive sampling technique is presented directing sample collection in proportion to local information content, capturing adequately the short-period features while sparsely sampling less dynamic regions. The proposed method incorporates a data-adapted sampling strategy on the basis of signal curvature, sample space-filling, variable experimental uncertainty and iterative improvement. Numerical assessment has indicated a reduction in the number of samples required to achieve a predefined uncertainty level overall while improving local accuracy for important features. The potential of the proposed method has been further demonstrated on the basis of Laser Doppler Anemometry experiments examining the wake behind a NACA0012 airfoil and the boundary layer characterisation of a flat plate.
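The core idea of directing samples toward local information content can be sketched as follows (this is not the authors' algorithm): a curvature estimate of a provisional signal is turned into selection weights for new sample locations; the signal and sample counts are invented.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 201)
signal = np.exp(-((x - 0.3) / 0.02) ** 2) + 0.2 * x          # sharp feature plus a slow trend
curvature = np.abs(np.gradient(np.gradient(signal, x), x))   # local information proxy
weights = curvature / curvature.sum()

rng = np.random.default_rng(7)
new_sites = np.sort(rng.choice(x, size=25, replace=False, p=weights))
print(new_sites)   # most of the new samples cluster around the sharp feature
```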
NASA Astrophysics Data System (ADS)
Potapova, E. V.; Dremin, V. V.; Zherebtsov, E. A.; Makovik, I. N.; Zharkikh, E. V.; Dunaev, A. V.; Pilipenko, O. V.; Sidorov, V. V.; Krupatkin, A. I.
2017-12-01
The possibility of a complex approach for studying changes in the system of blood microcirculation and metabolic processes in the biotissue of lower extremities using optical noninvasive methods of laser doppler flowmetry (LDF), fluorescence spectroscopy, and diffuse reflectance spectroscopy in combination with different modes of heating tests has been assessed. Seventy-six patients with type 2 diabetes mellitus, with 14 patients having visible trophic foot impairments, and 48 healthy volunteers have been examined. The parameters of LDF signals and spectra of fluorescence intensity and diffuse reflectance for foot skin have been analyzed. Statistically significant differences in the recorded parameters between the groups under study have been found. It has been concluded that combined application of noninvasive methods of spectroscopy could be used for diagnostics of complications both upon the occurrence of preliminary symptoms of diabetes, when pathological changes are still reversible, and in the presence of impairments to prevent aggravation of the disease and select an adequate correction of the treatment.
Unsteady Probabilistic Analysis of a Gas Turbine System
NASA Technical Reports Server (NTRS)
Brown, Marilyn
2003-01-01
In this work, we have considered an annular cascade configuration subjected to unsteady inflow conditions. The unsteady response calculation has been implemented into the time marching CFD code, MSUTURBO. The computed steady state results for the pressure distribution demonstrated good agreement with experimental data. We have computed results for the amplitudes of the unsteady pressure over the blade surfaces. With the increase in gas turbine engine structural complexity and performance over the past 50 years, structural engineers have created an array of safety nets to ensure against component failures in turbine engines. In order to reduce what is now considered to be excessive conservatism and yet maintain the same adequate margins of safety, there is a pressing need to explore methods of incorporating probabilistic design procedures into engine development. Probabilistic methods combine and prioritize the statistical distributions of each design variable, generate an interactive distribution and offer the designer a quantified relationship between robustness, endurance and performance. The designer can therefore iterate between weight reduction, life increase, engine size reduction, speed increase etc.
Simulations for designing and interpreting intervention trials in infectious diseases.
Halloran, M Elizabeth; Auranen, Kari; Baird, Sarah; Basta, Nicole E; Bellan, Steven E; Brookmeyer, Ron; Cooper, Ben S; DeGruttola, Victor; Hughes, James P; Lessler, Justin; Lofgren, Eric T; Longini, Ira M; Onnela, Jukka-Pekka; Özler, Berk; Seage, George R; Smith, Thomas A; Vespignani, Alessandro; Vynnycky, Emilia; Lipsitch, Marc
2017-12-29
Interventions in infectious diseases can have both direct effects on individuals who receive the intervention as well as indirect effects in the population. In addition, intervention combinations can have complex interactions at the population level, which are often difficult to adequately assess with standard study designs and analytical methods. Herein, we urge the adoption of a new paradigm for the design and interpretation of intervention trials in infectious diseases, particularly with regard to emerging infectious diseases, one that more accurately reflects the dynamics of the transmission process. In an increasingly complex world, simulations can explicitly represent transmission dynamics, which are critical for proper trial design and interpretation. Certain ethical aspects of a trial can also be quantified using simulations. Further, after a trial has been conducted, simulations can be used to explore the possible explanations for the observed effects. Much is to be gained through a multidisciplinary approach that builds collaborations among experts in infectious disease dynamics, epidemiology, statistical science, economics, simulation methods, and the conduct of clinical trials.
NASA Astrophysics Data System (ADS)
de Santana, Felipe Bachion; de Souza, André Marcelo; Poppi, Ronei Jesus
2018-02-01
This study evaluates the use of visible and near infrared spectroscopy (Vis-NIRS) combined with multivariate regression based on random forest to quantify several soil quality parameters. The parameters analyzed were soil cation exchange capacity (CEC), sum of exchange bases (SB), organic matter (OM), clay and sand present in the soils of several regions of Brazil. Current methods for evaluating these parameters are laborious and time-consuming and require various wet analytical methods that are not adequate for use in precision agriculture, where faster and automatic responses are required. The random forest regression models were statistically better than PLS regression models for CEC, OM, clay and sand, demonstrating resistance to overfitting, attenuating the effect of outlier samples and indicating the most important variables for the model. The methodology demonstrates the potential of the Vis-NIR as an alternative for determination of CEC, SB, OM, sand and clay, making it possible to develop a fast and automatic analytical procedure.
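A minimal scikit-learn sketch of random forest regression of a soil property from spectra; the spectra and target values are synthetic stand-ins rather than the Brazilian soil data set, and the PLS comparison is omitted.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 200))                                   # 300 samples x 200 spectral bands
y = 2.0 * X[:, 10] + X[:, 50] + rng.normal(scale=0.5, size=300)   # stand-in for e.g. clay content

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)
print("R2 =", round(r2_score(y_te, model.predict(X_te)), 3))
# model.feature_importances_ ranks the spectral bands, mirroring the
# variable-importance output highlighted in the abstract.
```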
Multidrug-resistant tuberculosis.
Zager, Ellen M; McNerney, Ruth
2008-01-25
With almost 9 million new cases each year, tuberculosis remains one of the most feared diseases on the planet. Led by the STOP-TB Partnership and WHO, recent efforts to combat the disease have made considerable progress in a number of countries. However, the emergence of mutated strains of Mycobacterium tuberculosis that are resistant to the major anti-tuberculosis drugs poses a deadly threat to control efforts. Multidrug-resistant tuberculosis (MDR-TB) has been reported in all regions of the world. More recently, extensively drug resistant-tuberculosis (XDR-TB) that is also resistant to second line drugs has emerged in a number of countries. To ensure that adequate resources are allocated to prevent the emergence and spread of drug resistance it is important to understand the scale of the problem. In this article we propose that current methods of describing the epidemiology of drug resistant tuberculosis are not adequate for this purpose and argue for the inclusion of population based statistics in global surveillance data. Whereas the prevalence of tuberculosis is presented as the proportion of individuals within a defined population having disease, the prevalence of drug resistant tuberculosis is usually presented as the proportion of tuberculosis cases exhibiting resistance to anti-tuberculosis drugs. Global surveillance activities have identified countries in Eastern Europe, the former Soviet Union and regions of China as having a high proportion of MDR-TB cases and international commentary has focused primarily on the urgent need to improve control in these settings. Other regions, such as sub-Saharan Africa have been observed as having a low proportion of drug resistant cases. However, if one considers the incidence of new tuberculosis cases with drug resistant disease in terms of the population then countries of sub-Saharan Africa have amongst the highest rates of transmitted MDR-TB in the world. We propose that inclusion of population based statistics in global surveillance data is necessary to better inform debate on the control of drug resistant tuberculosis. Re-appraisal of global MDR-TB data to include population based statistics suggests that the problem of drug resistant tuberculosis in sub-Saharan Africa is more critical than previously perceived.
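The distinction argued for above (the proportion of TB cases that are drug resistant versus a population-based rate) can be made concrete with a small worked example; the numbers are invented for illustration only.

```python
def mdr_statistics(mdr_cases, tb_cases, population):
    proportion_of_cases = 100.0 * mdr_cases / tb_cases    # % of TB cases that are MDR
    rate_per_100k = 100000.0 * mdr_cases / population     # MDR-TB cases per 100,000 people
    return proportion_of_cases, rate_per_100k

# A setting with few TB cases but a high MDR proportion...
print(mdr_statistics(mdr_cases=400, tb_cases=2000, population=5_000_000))       # (20.0, 8.0)
# ...versus a high-burden setting with a low MDR proportion but a higher population rate.
print(mdr_statistics(mdr_cases=2000, tb_cases=100_000, population=20_000_000))  # (2.0, 10.0)
```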
Bartsch, L.A.; Richardson, W.B.; Naimo, T.J.
1998-01-01
Estimation of benthic macroinvertebrate populations over large spatial scales is difficult due to the high variability in abundance and the cost of sample processing and taxonomic analysis. To determine a cost-effective, statistically powerful sample design, we conducted an exploratory study of the spatial variation of benthic macroinvertebrates in a 37 km reach of the Upper Mississippi River. We sampled benthos at 36 sites within each of two strata, contiguous backwater and channel border. Three standard ponar (525 cm(2)) grab samples were obtained at each site ('Original Design'). Analysis of variance and sampling cost of strata-wide estimates for abundance of Oligochaeta, Chironomidae, and total invertebrates showed that only one ponar sample per site ('Reduced Design') yielded essentially the same abundance estimates as the Original Design, while reducing the overall cost by 63%. A posteriori statistical power analysis (alpha = 0.05, beta = 0.20) on the Reduced Design estimated that at least 18 sites per stratum were needed to detect differences in mean abundance between contiguous backwater and channel border areas for Oligochaeta, Chironomidae, and total invertebrates. Statistical power was nearly identical for the three taxonomic groups. The abundances of several taxa of concern (e.g., Hexagenia mayflies and Musculium fingernail clams) were too spatially variable to estimate power with our method. Resampling simulations indicated that to achieve adequate sampling precision for Oligochaeta, at least 36 sample sites per stratum would be required, whereas a sampling precision of 0.2 would not be attained with any sample size for Hexagenia in channel border areas, or Chironomidae and Musculium in both strata given the variance structure of the original samples. Community-wide diversity indices (Brillouin and 1-Simpsons) increased as sample area per site increased. The backwater area had higher diversity than the channel border area. The number of sampling sites required to sample benthic macroinvertebrates during our sampling period depended on the study objective and ranged from 18 to more than 40 sites per stratum. No single sampling regime would efficiently and adequately sample all components of the macroinvertebrate community.
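A sketch of the kind of a-priori power calculation described, using statsmodels with an assumed standardized effect size; it is not intended to reproduce the study's estimate of 18 sites per stratum, which was based on the observed variance structure.

```python
from statsmodels.stats.power import TTestIndPower

# Sites per stratum needed to detect an assumed standardized effect size of 0.8
# between the backwater and channel-border strata at alpha = 0.05, power = 0.80.
n_per_stratum = TTestIndPower().solve_power(effect_size=0.8, alpha=0.05, power=0.80,
                                            alternative='two-sided')
print(round(n_per_stratum))   # roughly 26 sites per stratum under these assumptions
```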
Developing a model for the adequate description of electronic communication in hospitals.
Saboor, Samrend; Ammenwerth, Elske
2011-01-01
Adequate information and communication systems (ICT) can help to improve communication in hospitals. Changes to the ICT infrastructure of hospitals must be planned carefully. To support comprehensive planning, we presented a classification of 81 common errors of electronic communication at the MIE 2008 congress. Our objective now was to develop a data model that defines specific requirements for an adequate description of electronic communication processes. We first applied explicating qualitative content analysis to the error categorization in order to determine the essential process details. After this, we applied subsuming qualitative content analysis to the results of the first step. The result is a data model for the adequate description of electronic communication, comprising 61 entities and 91 relationships. The data model comprises and organizes all details that are necessary for the detection of the respective errors. It can either be used to extend the capabilities of existing modeling methods or serve as a basis for the development of a new approach.
Hastings, K L
2001-02-02
Immune-based systemic hypersensitivities account for a significant number of adverse drug reactions. There appear to be no adequate nonclinical models to predict systemic hypersensitivity to small molecular weight drugs. Although there are very good methods for detecting drugs that can induce contact sensitization, these have not been successfully adapted for prediction of systemic hypersensitivity. Several factors have made the development of adequate models difficult. The term systemic hypersensitivity encompasses many discrete immunopathologies. Each type of immunopathology presumably is the result of a specific cluster of immunologic and biochemical phenomena. Certainly other factors, such as genetic predisposition, metabolic idiosyncrasies, and concomitant diseases, further complicate the problem. Therefore, it may be difficult to find common mechanisms upon which to construct adequate models to predict specific types of systemic hypersensitivity reactions. There is some reason to hope, however, that adequate methods could be developed for at least identifying drugs that have the potential to produce signs indicative of a general hazard for immune-based reactions.
NASA Astrophysics Data System (ADS)
Lautze, N. C.; Ito, G.; Thomas, D. M.; Hinz, N.; Frazer, L. N.; Waller, D.
2015-12-01
Hawaii offers the opportunity to gain knowledge and develop geothermal energy on the only oceanic hotspot in the U.S. As a remote island state, Hawaii is more dependent on imported fossil fuel than any other state in the U.S., and energy prices are 3 to 4 times higher than the national average. The only proven resource, located on Hawaii Island's active Kilauea volcano, is a region of high geologic risk; other regions of probable resource exist but lack adequate assessment. The last comprehensive statewide geothermal assessment occurred in 1983 and found a potential resource on all islands (Hawaii Institute of Geophysics, 1983). Phase 1 of a Department of Energy funded project to assess the probability of geothermal resource potential statewide in Hawaii was recently completed. The execution of this project was divided into three main tasks: (1) compile all historical and current data for Hawaii that is relevant to geothermal resources into a single Geographic Information System (GIS) project; (2) analyze and rank these datasets in terms of their relevance to the three primary properties of a viable geothermal resource: heat (H), fluid (F), and permeability (P); and (3) develop and apply a Bayesian statistical method to incorporate the ranks and produce probability models that map out Hawaii's geothermal resource potential. Here, we summarize the project methodology and present maps that highlight both high prospect areas as well as areas that lack enough data to make an adequate assessment. We suggest a path for future exploration activities in Hawaii, and discuss how this method of analysis can be adapted to other regions and other types of resources. The figure below shows multiple layers of GIS data for Hawaii Island. Color shades indicate crustal density anomalies produced from inversions of gravity (Flinders et al. 2013). Superimposed on this are mapped calderas, rift zones, volcanic cones, and faults (following Sherrod et al., 2007). These features were used to identify probable locations of intrusive rock (heat) and permeability.
NASA Technical Reports Server (NTRS)
1996-01-01
Preliminary design guidelines necessary to assure electromagnetic compatibility (EMC) of spacecraft using composite materials are presented. A database of electrical properties of composite materials which may have an effect on EMC is established. The guidelines concentrate on the composites that are conductive but may require enhancement to be adequate for EMC purposes. These composites are represented by graphite reinforced polymers. Methods for determining adequate conductivity levels for various EMC purposes are defined, along with design methods that increase the conductivity of composite materials and joints to adequate levels.
Reliability of unstable periodic orbit based control strategies in biological systems.
Mishra, Nagender; Hasse, Maria; Biswal, B; Singh, Harinder P
2015-04-01
Presence of recurrent and statistically significant unstable periodic orbits (UPOs) in time series obtained from biological systems is now routinely used as evidence for low dimensional chaos. Extracting accurate dynamical information from the detected UPO trajectories is vital for successful control strategies that either aim to stabilize the system near the fixed point or steer the system away from the periodic orbits. A hybrid UPO detection method from return maps that combines topological recurrence criterion, matrix fit algorithm, and stringent criterion for fixed point location gives accurate and statistically significant UPOs even in the presence of significant noise. Geometry of the return map, frequency of UPOs visiting the same trajectory, length of the data set, strength of the noise, and degree of nonstationarity affect the efficacy of the proposed method. Results suggest that establishing determinism from unambiguous UPO detection is often possible in short data sets with significant noise, but derived dynamical properties are rarely accurate and adequate for controlling the dynamics around these UPOs. A repeat chaos control experiment on epileptic hippocampal slices through more stringent control strategy and adaptive UPO tracking is reinterpreted in this context through simulation of similar control experiments on an analogous but stochastic computer model of epileptic brain slices. Reproduction of equivalent results suggests that far more stringent criteria are needed for linking apparent success of control in such experiments with possible determinism in the underlying dynamics.
Al-Dubai, SAR; Ganasegeran, K; Barua, A; Rizal, AM; Rampal, KG
2014-01-01
Background: The 10-item version of the Perceived Stress Scale (PSS-10) is a widely used tool to measure stress. The Malay version of the PSS-10 has been validated among Malaysian medical students. However, studies have not been conducted to assess its validity in occupational settings. Aim: The aim of this study is to assess the psychometric properties of the Malay version of the PSS-10 in two occupational settings in Malaysia. Subjects and Methods: This study was conducted among 191 medical residents and 513 railway workers. An exploratory factor analysis was performed using the principal component method with varimax rotation. Correlation analyses, Kaiser-Meyer-Olkin, Bartlett's test of sphericity and Cronbach's alpha were obtained. Statistical analysis was carried out using Statistical Package for the Social Sciences version 16 (SPSS, Chicago, IL, USA) software. Results: Analysis yielded a two-factor structure of the Malay version of the PSS-10 in both occupational groups. The two factors accounted for 59.2% and 64.8% of the variance in the medical residents and the railway workers, respectively. Factor loadings were greater than 0.59 in both occupational groups. Cronbach's alpha coefficient was 0.70 for medical residents and 0.71 for railway workers. Conclusion: The Malay version of the PSS-10 had adequate psychometric properties and can be used to measure stress in occupational settings in Malaysia. PMID:25184074
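Cronbach's alpha, reported above for both occupational groups, can be computed directly from an item-score matrix; the sketch below uses random placeholder responses, not the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents-by-items score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_var_sum / total_var)

responses = np.random.default_rng(5).integers(0, 5, size=(200, 10))  # placeholder PSS-10 scores
print(round(cronbach_alpha(responses), 2))  # random answers give a low alpha, as expected
```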
[Big data in official statistics].
Zwick, Markus
2015-08-01
The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.
Hori, Seiichi; Kawada, Tsubasa; Kogure, Sanae; Yabu, Shinako; Mori, Kenji; Akimoto, Masayuki
2017-02-01
The release characteristics of lipophilic suppositories containing acetaminophen (AAP) were examined using four types of dissolution methods: the basket, paddle, dialysis tubing (DT) and flow-through cell (FTC) methods. The suitability of each apparatus for quality control of AAP compounded suppositories was evaluated using statistical procedures. More than 80% of the drug was released over 60 min in all the release methods studied, with the exception of the basket method. Reproducible and faster release was achieved using the paddle method at 100 and 200 rpm, whereas poor release occurred with the basket method. The mean dissolution time (MDT), maximum dissolved quantity of AAP at the end of the sampling time (Q) and dissolution efficiency (DE) were calculated by model-independent methods. The FTC method with a single chamber used in this study was also suitable for AAP suppositories (Q of 100%, MDT of 71-91 min and DE of 75-80%). The DT apparatus is considered similar to the FTC apparatus from a quality control perspective for judging the release properties of lipophilic base suppositories containing AAP. However, even the single chamber FTC used in this study has potential as an in vitro drug release test for suppositories. The comparative dissolution method is expected to become a valuable tool for selecting an adequate dissolution test.
Wang, Zhifang; Zhu, Wenming; Mo, Zhe; Wang, Yuanyang; Mao, Guangming; Wang, Xiaofeng; Lou, Xiaoming
2017-01-01
Universal salt iodization (USI) has been implemented for two decades in China. It is crucial to periodically monitor iodine status in the most vulnerable population, such as pregnant women. A cross-sectional study was carried out in an evidence-proved iodine-sufficient province to evaluate iodine intake in pregnancy. According to the WHO/UNICEF/ICCIDD recommendation criteria of adequate iodine intake in pregnancy (150–249 µg/L), the median urinary iodine concentration (UIC) of the total 8159 recruited pregnant women was 147.5 µg/L, which indicated pregnant women had iodine deficiency at the province level. Overall, 51.0% of the total study participants had iodine deficiency with a UIC < 150 µg/L and only 32.9% of them had adequate iodine. Participants living in coastal areas had iodine deficiency with a median UIC of 130.1 µg/L, while those in inland areas had marginally adequate iodine intake with a median UIC of 158.1 µg/L (p < 0.001). Among the total study participants, 450 pregnant women consuming non-iodized salt had mild-moderate iodine deficiency with a median UIC of 99.6 µg/L; 7363 pregnant women consuming adequately iodized salt had a lightly statistically higher median UIC of 151.9 µg/L, compared with the recommended adequate level by the WHO/UNICEF/ICCIDD (p < 0.001). Consuming adequately iodized salt seemed to lightly increase the median UIC level, but it may not be enough to correct iodine nutrition status to an optimum level as recommended by the WHO/UNICEF/ICCIDD. We therefore suggest that, besides strengthening USI policy, additional interventive measure may be needed to improve iodine intake in pregnancy. PMID:28230748
A Comparison of Latent Growth Models for Constructs Measured by Multiple Items
ERIC Educational Resources Information Center
Leite, Walter L.
2007-01-01
Univariate latent growth modeling (LGM) of composites of multiple items (e.g., item means or sums) has been frequently used to analyze the growth of latent constructs. This study evaluated whether LGM of composites yields unbiased parameter estimates, standard errors, chi-square statistics, and adequate fit indexes. Furthermore, LGM was compared…
Predicting fire spread in Arizona's oak chaparral
A. W. Lindenmuth; James R. Davis
1973-01-01
Five existing fire models, both experimental and theoretical, did not adequately predict rate-of-spread (ROS) when tested on single- and multiclump fires in oak chaparral in Arizona. A statistical model developed using essentially the same input variables, but weighted differently, accounted for 81 percent of the variation in ROS. A chemical coefficient that accounts for...
Height and Weight of Southeast Asian Preschool Children in Northern California.
ERIC Educational Resources Information Center
Dewey, Kathryn G.; And Others
1986-01-01
Anthropometric data were obtained from 526 Southeast Asian preschool children during 1980-84. Mean weights and heights were substantially below the National Center for Health Statistics (NCHS) 50th percentile, but rates of weight and height gain were similar to reference values, indicating adequate growth after arrival in the United States.…
A Call for a New National Norming Methodology.
ERIC Educational Resources Information Center
Ligon, Glynn; Mangino, Evangelina
Issues related to achieving adequate national norms are reviewed, and a new methodology is proposed that would work to provide a true measure of national achievement levels on an annual basis and would enable reporting results in current-year norms. Statistical methodology and technology could combine to create a national norming process that…
Are Academic Programs Adequate for the Software Profession?
ERIC Educational Resources Information Center
Koster, Alexis
2010-01-01
According to the Bureau of Labor Statistics, close to 1.8 million people, or 77% of all computer professionals, were working in the design, development, deployment, maintenance, and management of software in 2006. The ACM [Association for Computing Machinery] model curriculum for the BS in computer science proposes that about 42% of the core body…
What Response Rates Are Needed to Make Reliable Inferences from Student Evaluations of Teaching?
ERIC Educational Resources Information Center
Zumrawi, Abdel Azim; Bates, Simon P.; Schroeder, Marianne
2014-01-01
This paper addresses the determination of statistically desirable response rates in students' surveys, with emphasis on assessing the effect of underlying variability in the student evaluation of teaching (SET). We discuss factors affecting the determination of adequate response rates and highlight challenges caused by non-response and lack of…
ERIC Educational Resources Information Center
Benton-Borghi, Beatrice Hope; Chang, Young Mi
2011-01-01
The National Center for Educational Statistics (NCES, 2010) continues to report substantial underachievement of diverse student populations in the nation's schools. After decades of focus on diversity and multicultural education, with integrating field and clinical practice, candidates continue to graduate without adequate knowledge, skills and…
Progressive statistics for studies in sports medicine and exercise science.
Hopkins, William G; Marshall, Stephen W; Batterham, Alan M; Hanin, Juri
2009-01-01
Statistical guidelines and expert statements are now available to assist in the analysis and reporting of studies in some biomedical disciplines. We present here a more progressive resource for sample-based studies, meta-analyses, and case studies in sports medicine and exercise science. We offer forthright advice on the following controversial or novel issues: using precision of estimation for inferences about population effects in preference to null-hypothesis testing, which is inadequate for assessing clinical or practical importance; justifying sample size via acceptable precision or confidence for clinical decisions rather than via adequate power for statistical significance; showing SD rather than SEM, to better communicate the magnitude of differences in means and nonuniformity of error; avoiding purely nonparametric analyses, which cannot provide inferences about magnitude and are unnecessary; using regression statistics in validity studies, in preference to the impractical and biased limits of agreement; making greater use of qualitative methods to enrich sample-based quantitative projects; and seeking ethics approval for public access to the depersonalized raw data of a study, to address the need for more scrutiny of research and better meta-analyses. Advice on less contentious issues includes the following: using covariates in linear models to adjust for confounders, to account for individual differences, and to identify potential mechanisms of an effect; using log transformation to deal with nonuniformity of effects and error; identifying and deleting outliers; presenting descriptive, effect, and inferential statistics in appropriate formats; and contending with bias arising from problems with sampling, assignment, blinding, measurement error, and researchers' prejudices. This article should advance the field by stimulating debate, promoting innovative approaches, and serving as a useful checklist for authors, reviewers, and editors.
Charan, J; Saxena, D
2014-01-01
Biased negative studies not only reflect poor research effort but also have an impact on patient care, as they prevent further research with similar objectives and leave potential research areas unexplored. Hence, published negative studies should be methodologically strong. All parameters that may help a reader judge the validity of the results and conclusions should be reported in published negative studies. There is a paucity of data on the reporting of statistical and methodological parameters in negative studies published in Indian medical journals. The present systematic review was designed to critically evaluate negative studies published in prominent Indian medical journals for the reporting of statistical and methodological parameters. The design was a systematic review. All negative studies published in 15 Science Citation Indexed (SCI) medical journals published from India were included. Investigators involved in the study evaluated all negative studies for the reporting of various parameters. Primary endpoints were the reporting of power and confidence intervals. Power was reported in 11.8% of studies; confidence intervals were reported in 15.7% of studies. The majority of parameters, such as sample size calculation (13.2%), type of sampling method (50.8%), names of statistical tests (49.1%), adjustment for multiple endpoints (1%), and post hoc power calculation (2.1%), were reported poorly. Reporting was more frequent in clinical trials than in other study designs, and in journals with an impact factor above 1 than in journals with an impact factor below 1. Negative studies published in prominent Indian medical journals do not report statistical and methodological parameters adequately, and this may create problems in the critical appraisal of their findings by readers.
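For context on the review's two primary endpoints, an a priori power or sample-size calculation of the kind the authors looked for can be done in a few lines. The sketch below uses statsmodels with hypothetical planning values (effect size, alpha, power); it is purely illustrative.

```python
from statsmodels.stats.power import TTestIndPower

# Hypothetical planning values: the standardized effect size (Cohen's d) the
# study should be able to detect, two-sided alpha, and the desired power.
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8,
                                   alternative="two-sided")
print(f"required sample size per group: {n_per_group:.1f}")

# Conversely, the achieved power of a completed "negative" study with
# 30 participants per group, for the same assumed effect size:
achieved = analysis.solve_power(effect_size=0.5, nobs1=30, alpha=0.05,
                                alternative="two-sided")
print(f"achieved power with n = 30 per group: {achieved:.2f}")
```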
2007-01-01
Background The US Food and Drug Administration approved the Charité artificial disc on October 26, 2004. This approval was based on an extensive analysis and review process; 20 years of disc usage worldwide; and the results of a prospective, randomized, controlled clinical trial that compared lumbar artificial disc replacement to fusion. The results of the investigational device exemption (IDE) study led to a conclusion that clinical outcomes following lumbar arthroplasty were at least as good as outcomes from fusion. Methods The author performed a new analysis of the Visual Analog Scale pain scores and the Oswestry Disability Index scores from the Charité artificial disc IDE study and used a nonparametric statistical test, because observed data distributions were not normal. The analysis included all of the enrolled subjects in both the nonrandomized and randomized phases of the study. Results Subjects from both the treatment and control groups improved from the baseline situation (P < .001) at all follow-up times (6 weeks to 24 months). Additionally, these pain and disability levels with artificial disc replacement were superior (P < .05) to the fusion treatment at all follow-up times including 2 years. Conclusions The a priori statistical plan for an IDE study may not adequately address the final distribution of the data. Therefore, statistical analyses more appropriate to the distribution may be necessary to develop meaningful statistical conclusions from the study. A nonparametric statistical analysis of the Charité artificial disc IDE outcomes scores demonstrates superiority for lumbar arthroplasty versus fusion at all follow-up time points to 24 months. PMID:25802574
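The reanalysis above follows a common pattern: check the normality assumption and, if it fails, compare groups with a rank-based test. A minimal sketch with simulated outcome scores (not the IDE study data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical disability-score improvements (baseline minus 24-month score)
# for two treatment arms; bounded, skewed scores are rarely normally distributed.
arthroplasty = rng.beta(a=2, b=5, size=200) * 60
fusion = rng.beta(a=2, b=6, size=100) * 60

# Step 1: check the normality assumption before reaching for a t-test.
for name, x in (("arthroplasty", arthroplasty), ("fusion", fusion)):
    w, p = stats.shapiro(x)
    print(f"Shapiro-Wilk for {name}: W = {w:.3f}, p = {p:.3g}")

# Step 2: if normality is doubtful, compare the groups with a rank-based test.
u, p = stats.mannwhitneyu(arthroplasty, fusion, alternative="two-sided")
print(f"Mann-Whitney U = {u:.0f}, p = {p:.3g}")
```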
[Digital administrative maps - a tool for visualization of epidemiological data].
Niewiadomska, Ewa; Kowalska, Małgorzata; Czech, Elżbieta; Skrzypek, Michał
2013-01-01
The aim of the study is to present methods for the visualization of epidemiological data using digital contour maps that take into account the administrative division of Poland. The possibilities of visualizing epidemiological data geographically, at the administrative levels of the country, voivodeships and poviats (counties), are presented. They are crucial for identifying and undertaking adequate prophylactic activities directed towards decreasing risk and improving the population's health. This paper presents tools and techniques available in the Geographic Information System ArcGIS and the statistical software package R. The work includes our own data reflecting: 1) the values of specific mortality rates due to respiratory diseases in Poland in 2010, based on Central Statistical Office data, using the R statistical software package; 2) the averaged registered incidence rates of sarcoidosis in 2006-2010 for the population aged 19+ in the Silesian voivodeship, using the Geographic Information System ArcGIS; and 3) the number of children with diagnosed respiratory diseases in the city of Legnica in 2009, taking into account their place of residence, using layered maps in the Geographic Information System ArcGIS. The tools presented and described in this paper make it possible to visualize research results, to increase the attractiveness of courses for students, and to enhance the skills and competence of students and course participants.
Slob, Wout
2006-07-01
Probabilistic dietary exposure assessments that are fully based on Monte Carlo sampling from the raw intake data may not be appropriate. This paper shows that the data should first be analysed by using a statistical model that is able to take the various dimensions of food consumption patterns into account. A (parametric) model is discussed that takes into account the interindividual variation in (daily) consumption frequencies, as well as in amounts consumed. Further, the model can be used to include covariates, such as age, sex, or other individual attributes. Some illustrative examples show how this model may be used to estimate the probability of exceeding an (acute or chronic) exposure limit. These results are compared with the results based on directly counting the fraction of observed intakes exceeding the limit value. This comparison shows that the latter method is not adequate, in particular for the acute exposure situation. A two-step approach for probabilistic (acute) exposure assessment is proposed: first analyse the consumption data by a (parametric) statistical model as discussed in this paper, and then use Monte Carlo techniques for combining the variation in concentrations with the variation in consumption (by sampling from the statistical model). This approach results in an estimate of the fraction of the population as a function of the fraction of days at which the exposure limit is exceeded by the individual.
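The proposed two-step approach can be sketched as follows: assume a parametric model for interindividual variation in consumption frequency and in amounts consumed, then combine it with concentration variation by Monte Carlo. All distributions, parameter values, and the exposure limit below are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_individuals = 2000
n_days = 365
exposure_limit = 2.0           # hypothetical acute limit, µg per kg body weight per day
body_weight = 70.0             # kg, illustrative

# Step 1: parametric consumption model with interindividual variation in the
# daily consumption frequency and in the (log) mean amount consumed (g/day).
freq = rng.beta(a=2, b=8, size=n_individuals)
log_mu_amount = rng.normal(loc=np.log(100), scale=0.3, size=n_individuals)

# Step 2: Monte Carlo over days, combining consumption with variable
# concentrations (µg per g) to obtain daily intakes per kg body weight.
frac_days_exceeding = np.empty(n_individuals)
for i in range(n_individuals):
    consumed = rng.random(n_days) < freq[i]
    amounts = np.where(consumed,
                       rng.lognormal(mean=log_mu_amount[i], sigma=0.4, size=n_days),
                       0.0)
    conc = rng.lognormal(mean=np.log(0.5), sigma=0.8, size=n_days)
    intake = amounts * conc / body_weight
    frac_days_exceeding[i] = np.mean(intake > exposure_limit)

# Fraction of the population exceeding the limit on more than x% of days,
# i.e. the kind of output the two-step approach is aimed at.
for x in (0.0, 0.01, 0.05):
    frac_pop = np.mean(frac_days_exceeding > x)
    print(f"fraction of population exceeding the limit on >{x:.0%} of days: {frac_pop:.3f}")
```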
Gambarini, Gianluca; Piasecki, Lucila; Miccoli, Gabriele; Gaimari, Gianfranco; Nardo, Dario Di; Testarelli, Luca
2018-01-01
Objective: This study aimed to evaluate the relationship between the quality of the coronal restoration and the root canal filling on the periapical status of endodontically treated teeth using CBCT. Materials and Methods: CBCT data were obtained from the records of patients who reported no dental treatment in the 2 years prior to the CBCT examination. CBCT images (90 kVp and 7 mA, exposure time of 23 s, and a voxel size of 0.2 mm, with a field of view of 13 cm × 13 cm) of 1011 endodontically treated teeth were examined. A score was given to the quality of the root filling and the quality of the coronal restoration. Statistical Analysis Used: Data were statistically analyzed to correlate the periapical status with gender, dental group, and quality of endodontic treatment and restoration (Chi-square test with a significance level of P < 0.001). Results: Absence of periapical periodontitis was found in 54.9% of the cases. The periapical outcome was not related to gender or dental group (P > 0.05). A statistically significant difference (Chi-square test, P < 0.0001) was found when different qualities of sealing were compared. Conclusions: CBCT showed that high-quality root canal treatment followed by an adequate coronal restoration prevents the development of periapical periodontitis over time. PMID:29657539
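The reported association between sealing quality and periapical status reduces to a chi-square test on a contingency table. A minimal sketch with scipy, using invented counts rather than the study data:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows: quality of root filling plus coronal restoration (both adequate,
# one adequate, neither adequate). Columns: periapical periodontitis absent / present.
# All counts are hypothetical, for illustration only.
table = np.array([
    [320,  80],   # both adequate
    [150, 170],   # one adequate
    [ 85, 206],   # neither adequate
])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, dof = {dof}, p = {p:.2e}")
```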
NASA Astrophysics Data System (ADS)
Lazar, Dora; Ihasz, Istvan
2013-04-01
Short- and medium-range operational forecasts, warnings and alarms for severe weather are among the most important activities of the Hungarian Meteorological Service. Our study provides a comprehensive summary of newly developed methods based on ECMWF ensemble forecasts to assist successful prediction of convective weather situations. In the first part of the study, a brief overview is given of the components of atmospheric convection: the atmospheric lifting force, convergence and vertical wind shear. Atmospheric instability is often characterized by so-called instability indices; one of the most popular and frequently used is the convective available potential energy. Heavy convective events, such as intense storms, supercells and tornadoes, require vertical instability, adequate moisture and vertical wind shear. As a first step, various statistical analyses of these three parameters were performed on a nine-year time series of the 51-member ensemble forecasting model for the convective summer period. The relationship between the ratio of convective to total precipitation and the above three parameters was studied by different statistical methods. Four visualization methods were applied to support successful forecasts of severe weather. Two of the four, the ensemble meteogram and the ensemble vertical profiles, were already available at the beginning of our work; both show probabilities of meteorological parameters for a selected location. Two new methods were additionally developed. The first provides a probability map of an event exceeding predefined values, so that the spatial uncertainty of the event is well characterized; because convective events often occur erratically in space, delineating the expected event area in this way means the ensemble forecasts give very good support. The second new visualization tool shows the time evolution of multiple predefined thresholds in graphical form for any selected location; by applying this tool, the severity of dangerous weather conditions can be well estimated, and intense convective periods are clearly marked during the forecast period. The developments were done with MAGICS++ software under the UNIX operating system. In the third part of the study, the usefulness of these tools is demonstrated in three interesting case studies from last summer.
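The probability-map visualization described above amounts to computing, at every grid point, the fraction of ensemble members exceeding a threshold. A minimal numpy sketch with a synthetic 51-member CAPE ensemble (field values and threshold are illustrative, not ECMWF data):

```python
import numpy as np

rng = np.random.default_rng(1)
n_members, ny, nx = 51, 60, 80

# Synthetic ensemble of CAPE forecasts (J/kg) on a small grid; the values and
# field structure are invented purely for illustration.
cape = rng.gamma(shape=2.0, scale=400.0, size=(n_members, ny, nx))

threshold = 1000.0   # hypothetical CAPE threshold for significant convection
prob_exceed = (cape > threshold).mean(axis=0)   # fraction of members, per grid point

print("probability field shape:", prob_exceed.shape)
print(f"max probability of exceeding {threshold:.0f} J/kg: {prob_exceed.max():.2f}")
```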
NASA Astrophysics Data System (ADS)
Scheuerer, Michael; Hamill, Thomas M.; Whitin, Brett; He, Minxue; Henkel, Arthur
2017-04-01
Hydrological forecasts strongly rely on predictions of precipitation amounts and temperature as meteorological inputs to hydrological models. Ensemble weather predictions provide a number of different scenarios that reflect the uncertainty about these meteorological inputs, but are often biased and underdispersive, and therefore require statistical postprocessing. In hydrological applications it is crucial that spatial and temporal (i.e. between different forecast lead times) dependencies, as well as dependence between the two weather variables, are adequately represented by the recalibrated forecasts. We present a study with temperature and precipitation forecasts over four river basins in California that are postprocessed with a variant of the nonhomogeneous Gaussian regression method (Gneiting et al., 2005) and the censored, shifted gamma distribution approach (Scheuerer and Hamill, 2015), respectively. For modelling spatial, temporal and inter-variable dependence we propose a variant of the Schaake Shuffle (Clark et al., 2004) that uses spatio-temporal trajectories of observed temperature and precipitation as a dependence template, and chooses the historic dates in such a way that the divergence between the marginal distributions of these trajectories and the univariate forecast distributions is minimized. For the four river basins considered in our study, this new multivariate modelling technique consistently improves upon the Schaake Shuffle and yields reliable spatio-temporal forecast trajectories of temperature and precipitation that can be used to force hydrological forecast systems. References: Clark, M., Gangopadhyay, S., Hay, L., Rajagopalan, B., Wilby, R., 2004. The Schaake Shuffle: A method for reconstructing space-time variability in forecasted precipitation and temperature fields. Journal of Hydrometeorology, 5, pp.243-262. Gneiting, T., Raftery, A.E., Westveld, A.H., Goldman, T., 2005. Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS. Monthly Weather Review, 133, pp.1098-1118. Scheuerer, M., Hamill, T.M., 2015. Statistical postprocessing of ensemble precipitation forecasts by fitting censored, shifted gamma distributions. Monthly Weather Review, 143, pp.4578-4596. Scheuerer, M., Hamill, T.M., Whitin, B., He, M., and Henkel, A., 2016: A method for preferential selection of dates in the Schaake shuffle approach to constructing spatio-temporal forecast fields of temperature and precipitation. Water Resources Research, submitted.
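At its core, the Schaake Shuffle reorders each marginal ensemble so that its rank structure across space-time "slots" matches that of a set of historical observed trajectories, preserving the calibrated marginal distributions while imposing a realistic dependence template. A toy sketch of that reordering step with synthetic numbers (the preferential date selection proposed in the abstract is not shown):

```python
import numpy as np

rng = np.random.default_rng(3)
n_members, n_slots = 20, 6   # ensemble size, number of space-time "slots"

# Calibrated but dependence-free ensemble forecasts, and an equally sized set
# of historical observed trajectories acting as the dependence template.
# Both arrays are synthetic here.
forecast = rng.gamma(2.0, 3.0, size=(n_members, n_slots))
template = rng.gamma(2.0, 3.0, size=(n_members, n_slots))

shuffled = np.empty_like(forecast)
for j in range(n_slots):
    # Rank of each historical trajectory at this slot ...
    template_ranks = template[:, j].argsort().argsort()
    # ... then member i receives the sorted forecast value whose rank matches
    # the rank of historical trajectory i at this slot.
    shuffled[:, j] = np.sort(forecast[:, j])[template_ranks]

# Marginals are unchanged (same sorted values per slot) ...
assert np.allclose(np.sort(shuffled, axis=0), np.sort(forecast, axis=0))
# ... while the between-slot rank correlation now mimics the historical template.
print(np.corrcoef(shuffled[:, 0], shuffled[:, 1])[0, 1])
```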
Wyka, Joanna; Biernat, Jadwiga; Mikołajczak, Jolanta; Piotrowska, Ewa
2012-01-01
The proportion of elderly people in the global population is rapidly increasing. Their nutritional status often shows deficiencies that pose risks to health. The aim of this paper was to assess nutrition and nutritional status in elderly individuals above 60 years of age living in their family homes in rural areas. Dietary intake and nutritional status were measured in 174 elderly women and 64 men living in the rural areas of Oleśnica (near Wrocław, SW Poland). Energy intake, consumption of nutrients, and selected anthropometric and biochemical indicators were measured in two groups: one at risk of malnutrition and one with adequate nutrition. Using the mini nutritional assessment (MNA) questionnaire, 238 persons over 60 years of age were classified according to their nutritional status. Anthropometric and biochemical parameters were measured. The group of women at risk of malnutrition (n=30) showed a statistically significantly lower energy intake in their diet (1,127 kcal) compared to women with adequate nutrition (1,351 kcal). The entire group of examined individuals showed too low a consumption of fiber, calcium, vitamins C and D, and folates. Most of the examined women had too high a body mass index (BMI) (on average 28.8), a waist circumference of 96.3 cm, and a triceps skinfold (TSF) of 25.2 mm. Women at risk of malnutrition had statistically significantly lower lipid parameters than those with adequate nutrition (respectively: TC 191.1 vs. 219.1 mg/dl, p<0.001; LDL-cholesterol 107.1 vs. 125.1 mg/dl, p<0.008; TG 129 vs. 143 mg/dl). Men at risk of malnutrition had a statistically significantly lower BMI (26.0 vs. 28.7, p<0.04), and also lower waist and arm circumferences, compared to men with adequate nutrition. According to the Charlson comorbidity index (CCI), 8.2% of persons with adequate nutrition had a poor prognostic indicator for overall survival. All the examined individuals showed many significant nutritional deficiencies, and these were more pronounced in the group at nutritional risk. Despite the low energy value of the diets among individuals with adequate nutrition, their anthropometric parameters paradoxically showed the presence of excessive fatty tissue. The most frequent diseases in the examined group were coronary artery disease and congestive heart failure. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Statistical inference on censored data for targeted clinical trials under enrichment design.
Chen, Chen-Fang; Lin, Jr-Rung; Liu, Jen-Pei
2013-01-01
For the traditional clinical trials, inclusion and exclusion criteria are usually based on some clinical endpoints; the genetic or genomic variability of the trial participants are not totally utilized in the criteria. After completion of the human genome project, the disease targets at the molecular level can be identified and can be utilized for the treatment of diseases. However, the accuracy of diagnostic devices for identification of such molecular targets is usually not perfect. Some of the patients enrolled in targeted clinical trials with a positive result for the molecular target might not have the specific molecular targets. As a result, the treatment effect may be underestimated in the patient population truly with the molecular target. To resolve this issue, under the exponential distribution, we develop inferential procedures for the treatment effects of the targeted drug based on the censored endpoints in the patients truly with the molecular targets. Under an enrichment design, we propose using the expectation-maximization algorithm in conjunction with the bootstrap technique to incorporate the inaccuracy of the diagnostic device for detection of the molecular targets on the inference of the treatment effects. A simulation study was conducted to empirically investigate the performance of the proposed methods. Simulation results demonstrate that under the exponential distribution, the proposed estimator is nearly unbiased with adequate precision, and the confidence interval can provide adequate coverage probability. In addition, the proposed testing procedure can adequately control the size with sufficient power. On the other hand, when the proportional hazard assumption is violated, additional simulation studies show that the type I error rate is not controlled at the nominal level and is an increasing function of the positive predictive value. A numerical example illustrates the proposed procedures. Copyright © 2013 John Wiley & Sons, Ltd.
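As a much-simplified illustration of the bootstrap component of the approach above (leaving out the EM step and the diagnostic misclassification handled in the paper), the sketch below estimates an exponential hazard from right-censored times and bootstraps a percentile confidence interval on simulated data:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
true_hazard = 0.10

# Simulated exponential event times with administrative censoring at t = 12.
event_time = rng.exponential(1.0 / true_hazard, size=n)
censor_time = 12.0
observed = np.minimum(event_time, censor_time)
event = event_time <= censor_time          # True = event observed, False = censored

def exp_hazard_mle(times, events):
    # Exponential MLE: number of events divided by total observed time at risk.
    return events.sum() / times.sum()

estimate = exp_hazard_mle(observed, event)

# Nonparametric bootstrap over subjects for a percentile confidence interval.
boot = np.empty(2000)
for b in range(boot.size):
    idx = rng.integers(0, n, size=n)
    boot[b] = exp_hazard_mle(observed[idx], event[idx])
lo, hi = np.percentile(boot, [2.5, 97.5])

print(f"hazard estimate {estimate:.3f}, 95% bootstrap CI ({lo:.3f}, {hi:.3f})")
```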
Characterization of oily sludge from a Tehran oil refinery.
Heidarzadeh, Nima; Gitipour, Saeid; Abdoli, Mohammad Ali
2010-10-01
In this study, oily sludge samples generated from a Tehran oil refinery (Pond I) were evaluated for their contamination levels and to propose an adequate remediation technique for the wastes. A simple, random, sampling method was used to collect the samples. The samples were analyzed to measure Total petroleum hydrocarbon (TPH), polyaromatic hydrocarbon (PAH) and heavy metal concentrations in the sludge. Statistical analysis showed that seven samples were adequate to assess the sludge with respect to TPH analyses. The mean concentration of TPHs in the samples was 265,600 mg kg⁻¹. A composite sample prepared from a mix of the seven samples was used to determine the sludge's additional characteristics. Composite sample analysis showed that there were no detectable amounts of PAHs in the sludge. In addition, mean concentrations of the selected heavy metals Ni, Pb, Cd and Zn were 2700, 850, 100, 6100 mg kg⁻¹, respectively. To assess the sludge contamination level, the results from the analysis above were compared with soil clean-up levels. Due to a lack of national standards for soil clean-up levels in Iran, sludge pollutant concentrations were compared with standards set in developed countries. According to these standards, the sludge was highly polluted with petroleum hydrocarbons. The results indicated that incineration, biological treatment and solidification/stabilization treatments would be the most appropriate methods for treatment of the sludges. In the case of solidification/stabilization, due to the high organic content of the sludge, it is recommended to use organophilic clays prior to treatment of the wastes.
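The finding that seven samples were adequate for the TPH assessment reflects the usual sample-size logic for estimating a mean to within a chosen tolerance. A generic sketch of that calculation (the pilot standard deviation and tolerance below are hypothetical, not the study's values):

```python
import math
from scipy import stats

def n_for_mean(pilot_sd: float, tolerance: float, confidence: float = 0.95) -> int:
    """Samples needed so the CI half-width for the mean is at most `tolerance`.

    Iterates because the t quantile depends on the sample size itself.
    """
    n = 2
    while True:
        t = stats.t.ppf(1 - (1 - confidence) / 2, df=n - 1)
        if t * pilot_sd / math.sqrt(n) <= tolerance:
            return n
        n += 1

# Hypothetical pilot values: SD of 60,000 mg/kg TPH, target half-width 50,000 mg/kg.
print(n_for_mean(pilot_sd=60_000, tolerance=50_000))
```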
Harris, Alex; Reeder, Rachelle; Hyun, Jenny
2011-01-01
The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved.
Hoyle, R H
1991-02-01
Indirect measures of psychological constructs are vital to clinical research. On occasion, however, the meaning of indirect measures of psychological constructs is obfuscated by statistical procedures that do not account for the complex relations between items and latent variables and among latent variables. Covariance structure analysis (CSA) is a statistical procedure for testing hypotheses about the relations among items that indirectly measure a psychological construct and relations among psychological constructs. This article introduces clinical researchers to the strengths and limitations of CSA as a statistical procedure for conceiving and testing structural hypotheses that are not tested adequately with other statistical procedures. The article is organized around two empirical examples that illustrate the use of CSA for evaluating measurement models with correlated error terms, higher-order factors, and measured and latent variables.
Delivery of cardiopulmonary resuscitation in the microgravity environment
NASA Technical Reports Server (NTRS)
Barratt, M. R.; Billica, R. D.
1992-01-01
The microgravity environment presents several challenges for delivering effective cardiopulmonary resuscitation (CPR). Chest compressions must be driven by muscular force rather than by the weight of the rescuer's upper torso. Airway stabilization is influenced by the neutral body posture. Rescuers will consist of crew members of varying sizes and degrees of physical deconditioning from space flight. Several methods of CPR designed to accommodate these factors were tested in the one-G environment, in parabolic flight, and on a recent shuttle flight. Methods: Utilizing study participants of varying sizes, different techniques of CPR delivery were evaluated using a recording CPR manikin to assess adequacy of compressive force and frequency. Under conditions of parabolic flight, methods tested included conventional positioning of rescuer and victim, free-floating 'Heimlich-type' compressions, straddling the patient with active and passive restraints, and utilizing a mechanical cardiac compression assist device (CCAD). Multiple restraint systems and ventilation methods were also assessed. Results: Delivery of effective CPR was possible in all configurations tested. Reliance on muscular force alone was quickly fatiguing to the rescuer. Effectiveness of CPR was dependent on technique, adequate restraint of the rescuer and patient, and rescuer size and preference. Free-floating CPR was adequate but rapidly fatiguing. The CCAD was able to provide adequate compressive force, but positioning was problematic. Conclusions: Delivery of effective CPR in microgravity will be dependent on adequate rescuer and patient restraint, technique, and rescuer size and preference. Free-floating CPR may be employed as a stopgap method until patient restraint is available. Development of an adequate CCAD would be desirable to compensate for the effects of deconditioning.
Abdurahmen, Junayde
2018-01-01
Background Universal use of iodized salt is a simple and inexpensive method to prevent and eliminate iodine deficiency disorders like mental retardation. However, little is known about the level of adequately iodized salt consumption in the study area. Therefore, the study was aimed at assessing the proportion of households having adequately iodized salt and associated factors in Wolaita Sodo town and its peripheries, Southern Ethiopia. Methods A cross-sectional study was conducted from May 10 to 20, 2016, in 441 households in Sodo town and its peripheries. Samples were selected using the systematic sampling technique. An iodometric titration method (AOAC, 2000) was used to analyze the iodine content of the salt samples. Data entry and analysis were done using Epi Info version 3.5.1 and SPSS version 16, respectively. Result The female to male ratio of the respondents was 219. The mean age of the respondents was 30.2 (±7.3 SD). The proportion of households having adequately iodized salt was 37.7%, with 95% CI of 33.2% to 42.2%. Not exposing salt to sunlight with [OR: 3.75; 95% CI: 2.14, 6.57], higher monthly income [OR: 3.71; 95% CI: 1.97–7.01], and formal education of respondents with [OR: 1.75; 95% CI: 1.14, 2.70] were found associated with the presence of adequately iodized salt at home. Conclusion This study revealed low levels of households having adequately iodized salt in Wolaita Sodo town and its peripheries. The evidence here shows that there is a need to increase the supply of adequately iodized salt to meet the goal for monitoring progress towards sustainable elimination of IDD. PMID:29765978
Rajadhyaksha, Milind
2012-01-01
Abstract. Coherent speckle influences the resulting image when narrow spectral line-width and single spatial mode illumination are used, though these are the same light-source properties that provide the best radiance-to-cost ratio. However, a suitable size of the detection pinhole can be chosen to maintain adequate optical sectioning while making the probability density of the speckle noise more normal and reducing its effect. The result is a qualitatively better image with improved contrast, which is easier to read. With theoretical statistics and experimental results, we show that the detection pinhole size is a fundamental parameter for designing imaging systems for use in turbid media. PMID:23224184
Inference of reaction rate parameters based on summary statistics from experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khalil, Mohammad; Chowdhary, Kamaljit Singh; Safta, Cosmin
Here, we present the results of an application of Bayesian inference and maximum entropy methods for the estimation of the joint probability density of the Arrhenius rate parameters of the rate coefficient of the H2/O2-mechanism chain branching reaction H + O2 → OH + O. Available published data is in the form of summary statistics, in terms of nominal values and error bars of the rate coefficient of this reaction at a number of temperature values obtained from shock-tube experiments. Our approach relies on generating data, in this case OH concentration profiles, consistent with the given summary statistics, using Approximate Bayesian Computation methods and a Markov Chain Monte Carlo procedure. The approach permits the forward propagation of parametric uncertainty through the computational model in a manner that is consistent with the published statistics. A consensus joint posterior on the parameters is obtained by pooling the posterior parameter densities given each consistent data set. To expedite this process, we construct efficient surrogates for the OH concentration using a combination of Padé and polynomial approximants. These surrogate models adequately represent forward model observables and their dependence on input parameters and are computationally efficient enough to allow their use in the Bayesian inference procedure. We also utilize Gauss-Hermite quadrature with Gaussian proposal probability density functions for moment computation, resulting in orders of magnitude speedup in data likelihood evaluation. Despite the strong non-linearity in the model, the consistent data sets all result in nearly Gaussian conditional parameter probability density functions. The technique also accounts for nuisance parameters in the form of Arrhenius parameters of other rate coefficients with prescribed uncertainty. The resulting pooled parameter probability density function is propagated through stoichiometric hydrogen-air auto-ignition computations to illustrate the need to account for correlation among the Arrhenius rate parameters of one reaction and across rate parameters of different reactions.
Inference of reaction rate parameters based on summary statistics from experiments
Khalil, Mohammad; Chowdhary, Kamaljit Singh; Safta, Cosmin; ...
2016-10-15
Here, we present the results of an application of Bayesian inference and maximum entropy methods for the estimation of the joint probability density of the Arrhenius rate parameters of the rate coefficient of the H2/O2-mechanism chain branching reaction H + O2 → OH + O. Available published data is in the form of summary statistics, in terms of nominal values and error bars of the rate coefficient of this reaction at a number of temperature values obtained from shock-tube experiments. Our approach relies on generating data, in this case OH concentration profiles, consistent with the given summary statistics, using Approximate Bayesian Computation methods and a Markov Chain Monte Carlo procedure. The approach permits the forward propagation of parametric uncertainty through the computational model in a manner that is consistent with the published statistics. A consensus joint posterior on the parameters is obtained by pooling the posterior parameter densities given each consistent data set. To expedite this process, we construct efficient surrogates for the OH concentration using a combination of Padé and polynomial approximants. These surrogate models adequately represent forward model observables and their dependence on input parameters and are computationally efficient enough to allow their use in the Bayesian inference procedure. We also utilize Gauss-Hermite quadrature with Gaussian proposal probability density functions for moment computation, resulting in orders of magnitude speedup in data likelihood evaluation. Despite the strong non-linearity in the model, the consistent data sets all result in nearly Gaussian conditional parameter probability density functions. The technique also accounts for nuisance parameters in the form of Arrhenius parameters of other rate coefficients with prescribed uncertainty. The resulting pooled parameter probability density function is propagated through stoichiometric hydrogen-air auto-ignition computations to illustrate the need to account for correlation among the Arrhenius rate parameters of one reaction and across rate parameters of different reactions.
Silvestri, Erin E; Yund, Cynthia; Taft, Sarah; Bowling, Charlena Yoder; Chappie, Daniel; Garrahan, Kevin; Brady-Roberts, Eletha; Stone, Harry; Nichols, Tonya L
2017-01-01
In the event of an indoor release of an environmentally persistent microbial pathogen such as Bacillus anthracis, the potential for human exposure will be considered when remedial decisions are made. Microbial site characterization and clearance sampling data collected in the field might be used to estimate exposure. However, there are many challenges associated with estimating environmental concentrations of B. anthracis or other spore-forming organisms after such an event before being able to estimate exposure. These challenges include: (1) collecting environmental field samples that are adequate for the intended purpose, (2) conducting laboratory analyses and selecting the reporting format needed for the laboratory data, and (3) analyzing and interpreting the data using appropriate statistical techniques. This paper summarizes some key challenges faced in collecting, analyzing, and interpreting microbial field data from a contaminated site. Although the paper was written with considerations for B. anthracis contamination, it may also be applicable to other bacterial agents. It explores the implications and limitations of using field data for determining environmental concentrations both before and after decontamination. Several findings were of interest. First, to date, the only validated surface/sampling device combinations are swabs and sponge-sticks on stainless steel surfaces, thus limiting availability of quantitative analytical results which could be used for statistical analysis. Second, agreement needs to be reached with the analytical laboratory on the definition of the countable range and on reporting of data below the limit of quantitation. Finally, the distribution of the microbial field data and statistical methods needed for a particular data set could vary depending on these data that were collected, and guidance is needed on appropriate statistical software for handling microbial data. Further, research is needed to develop better methods to estimate human exposure from pathogens using environmental data collected from a field setting. PMID:26883476
Bredbenner, Todd L.; Eliason, Travis D.; Francis, W. Loren; McFarland, John M.; Merkle, Andrew C.; Nicolella, Daniel P.
2014-01-01
Cervical spinal injuries are a significant concern in all trauma injuries. Recent military conflicts have demonstrated the substantial risk of spinal injury for the modern warfighter. Finite element models used to investigate injury mechanisms often fail to examine the effects of variation in geometry or material properties on mechanical behavior. The goals of this study were to model geometric variation for a set of cervical spines, to extend this model to a parametric finite element model, and, as a first step, to validate the parametric model against experimental data for low-loading conditions. Individual finite element models were created using cervical spine (C3–T1) computed tomography data for five male cadavers. Statistical shape modeling (SSM) was used to generate a parametric finite element model incorporating variability of spine geometry, and soft-tissue material property variation was also included. The probabilistic loading response of the parametric model was determined under flexion-extension, axial rotation, and lateral bending and validated by comparison to experimental data. Based on qualitative and quantitative comparison of the experimental loading response and model simulations, we suggest that the model performs adequately under relatively low-level loading conditions in multiple loading directions. In conclusion, SSM methods coupled with finite element analyses within a probabilistic framework, along with the ability to statistically validate the overall model performance, provide innovative and important steps toward describing the differences in vertebral morphology, spinal curvature, and variation in material properties. We suggest that these methods, with additional investigation and validation under injurious loading conditions, will lead to understanding and mitigating the risks of injury in the spine and other musculoskeletal structures. PMID:25506051
Pandey, Pinki; Dixit, Alok; Tanwar, Aparna; Sharma, Anuradha; Mittal, Sanjeev
2014-07-01
Our study presents a new deparaffinizing and hematoxylin and eosin (H and E) staining method that uses easily available, nontoxic and eco-friendly diluted liquid dish washing soap (DWS), completely eliminating expensive and hazardous xylene and alcohol from deparaffinization and rehydration prior to staining, from staining itself, and from dehydration prior to mounting. The aim was to evaluate and compare the quality of liquid DWS-treated xylene- and alcohol-free (XAF) sections with that of conventional H and E sections. A total of 100 paraffin-embedded tissue blocks from different tissues were included. From each tissue block, one section was stained with conventional H and E (normal sections) and the other with the XAF H and E (soapy sections) staining method. Slides were scored using five parameters: nuclear, cytoplasmic, clarity, uniformity, and crispness of staining. The Z-test was used for statistical analysis. Soapy sections scored better for cytoplasmic (90%) and crisp staining (95%), with a statistically significant difference, whereas for uniformity of staining, normal sections (88%) scored better than soapy sections (72%) (Z = 2.82, P < 0.05). For nuclear staining (90%) and clarity of staining (90%), total scores favored soapy sections, but the difference was not statistically significant. About 84% of normal sections stained adequately for diagnosis, compared with 86% of soapy sections (Z = 0.396, P > 0.05). Liquid DWS is a safe and efficient alternative to xylene and alcohol in deparaffinization and the routine H and E staining procedure. We document this project as a model that can be used by other histology laboratories.
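The comparisons above are two-proportion z-tests on counts of adequately scored sections. A minimal sketch with statsmodels, reconstructing counts from the reported 84% and 86% of 100 sections (treated as 84 and 86 successes out of 100, for illustration only):

```python
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

# Sections scored as adequate for diagnosis, out of 100 sections per method
# (counts reconstructed from the reported percentages; illustrative).
count = np.array([86, 84])     # soapy, conventional
nobs = np.array([100, 100])

z, p = proportions_ztest(count, nobs)
print(f"z = {z:.3f}, p = {p:.3f}")   # pooled two-proportion z-test
```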
Péron, Julien; Pond, Gregory R; Gan, Hui K; Chen, Eric X; Almufti, Roula; Maillet, Denis; You, Benoit
2012-07-03
The Consolidated Standards of Reporting Trials (CONSORT) guidelines were developed in the mid-1990s for the explicit purpose of improving clinical trial reporting. However, there is little information regarding the adherence to CONSORT guidelines of recent publications of randomized controlled trials (RCTs) in oncology. All phase III RCTs published between 2005 and 2009 were reviewed using an 18-point overall quality score for reporting based on the 2001 CONSORT statement. Multivariable linear regression was used to identify features associated with improved reporting quality. To provide baseline data for future evaluations of reporting quality, RCTs were also assessed according to the 2010 revised CONSORT statement. All statistical tests were two-sided. A total of 357 RCTs were reviewed. The mean 2001 overall quality score was 13.4 on a scale of 0-18, whereas the mean 2010 overall quality score was 19.3 on a scale of 0-27. The overall RCT reporting quality score improved by 0.21 points per year from 2005 to 2009. Poorly reported items included method used to generate the random allocation (adequately reported in 29% of trials), whether and how blinding was applied (41%), method of allocation concealment (51%), and participant flow (59%). High impact factor (IF, P = .003), recent publication date (P = .008), and geographic origin of RCTs (P = .003) were independent factors statistically significantly associated with higher reporting quality in a multivariable regression model. Sample size, tumor type, and positivity of trial results were not associated with higher reporting quality, whereas funding source and treatment type had a borderline statistically significant impact. The results show that numerous items remained unreported for many trials. Thus, given the potential impact of poorly reported trials, oncology journals should require even stricter adherence to the CONSORT guidelines.
NASA Astrophysics Data System (ADS)
Moura, Ricardo; Sinha, Bimal; Coelho, Carlos A.
2017-06-01
The recent popularity of the use of synthetic data as a Statistical Disclosure Control technique has enabled the development of several methods for generating and analyzing such data, but these almost always rely on asymptotic distributions and are consequently not adequate for small-sample datasets. Thus, a likelihood-based exact inference procedure is derived for the matrix of regression coefficients of the multivariate regression model, for multiply imputed synthetic data generated via Posterior Predictive Sampling. Since it is based on exact distributions, this procedure may even be used for small-sample datasets. Simulation studies compare the results obtained from the proposed exact inferential procedure with the results obtained from an adaptation of Reiter's combination rule to multiply imputed synthetic datasets, and an application to the 2000 Current Population Survey is discussed.
NASA Astrophysics Data System (ADS)
Tariq, Imran; Humbert-Vidan, Laia; Chen, Tao; South, Christopher P.; Ezhil, Veni; Kirkby, Norman F.; Jena, Rajesh; Nisbet, Andrew
2015-05-01
This paper reports a modelling study of tumour volume dynamics in response to stereotactic ablative radiotherapy (SABR). The main objective was to develop a model that adequately describes the tumour volume changes measured during SABR while not being so complex that it lacks support from the clinical data. To this end, various modelling options were explored, and a rigorous statistical method, the Akaike information criterion, was used to help determine a trade-off between model accuracy and complexity. The models were calibrated to data from 11 non-small cell lung cancer patients treated with SABR. The results showed that it is feasible to model tumour volume dynamics during SABR, opening up the potential for using such models in a clinical environment in the future.
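The accuracy-versus-complexity trade-off via the Akaike information criterion can be illustrated with a toy comparison of two candidate volume-change models fitted by least squares; for Gaussian errors, AIC is (up to an additive constant) n·ln(RSS/n) + 2k. The data and models below are synthetic and are not the models or patient data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)

# Synthetic relative tumour volume measurements during a course of SABR.
t = np.linspace(0, 40, 9)                         # days since start of treatment
v_true = 0.3 + 0.7 * np.exp(-0.08 * t)            # underlying decay to a plateau
v_obs = v_true + rng.normal(scale=0.03, size=t.size)

def exp_decay(t, a, k):                           # 2-parameter candidate model
    return a * np.exp(-k * t)

def exp_plateau(t, a, k, c):                      # 3-parameter candidate model
    return c + a * np.exp(-k * t)

def aic(rss, n, k_params):
    return n * np.log(rss / n) + 2 * k_params

for name, model, p0 in (("exp decay", exp_decay, (1.0, 0.05)),
                        ("exp + plateau", exp_plateau, (0.7, 0.05, 0.3))):
    popt, _ = curve_fit(model, t, v_obs, p0=p0)
    rss = np.sum((v_obs - model(t, *popt)) ** 2)
    print(f"{name:>14}: AIC = {aic(rss, t.size, len(popt)):.1f}")
```

The model with the lower AIC is preferred; the extra parameter is only retained if it reduces the residual sum of squares by enough to pay its complexity penalty.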
Pharmacy Students' Knowledge Assessment of Naegleria fowleri Infection
Shakeel, Sadia; Iffat, Wajiha; Khan, Madeeha
2016-01-01
A cross-sectional study was conducted from April to August 2015 to assess the knowledge of pharmacy students regarding Naegleria fowleri infection. A questionnaire was distributed to senior pharmacy students in different private and public sector universities of Karachi. Descriptive statistics were used to summarize students' demographic information and their responses to the questionnaire. Pearson's chi-square test was used to assess the relationship between independent variables and students' responses. The study revealed that pharmacy students had adequate awareness of Naegleria fowleri infection and considered it a serious health issue that necessitates immediate steps by the government to protect the general public from this fatal neurological infection. The students recommended that appropriate awareness measures be promoted in the community from time to time to increase public awareness of the associated risk factors. PMID:26981318
NASA Technical Reports Server (NTRS)
Smith, R. F.; Stanton, K.; Stoop, D.; Brown, D.; Janusz, W.; King, P.
1977-01-01
The objectives of Skylab Experiment M093 were to measure electrocardiographic signals during space flight, to elucidate the electrophysiological basis for the changes observed, and to assess the effect of the change on the human cardiovascular system. Vectorcardiographic methods were used to quantitate changes, standardize data collection, and to facilitate reduction and statistical analysis of data. Since the Skylab missions provided a unique opportunity to study the effects of prolonged weightlessness on human subjects, an effort was made to construct a data base that contained measurements taken with precision and in adequate number to enable conclusions to be made with a high degree of confidence. Standardized exercise loads were incorporated into the experiment protocol to increase the sensitivity of the electrocardiogram for effects of deconditioning and to detect susceptability for arrhythmias.
Age-Period-Cohort approaches to back-calculation of cancer incidence rate
Oh, Cheongeun; Holford, Theodore R.
2016-01-01
A compartment model for cancer incidence and mortality is developed in which healthy subjects may develop cancer, and subsequently die of cancer or another cause. In order to adequately represent the experience of a defined population, it is also necessary to allow for subjects who are diagnosed at death, as well as subjects who migrate and are subsequently lost to follow-up. Expressions are derived for the number of cancer deaths as a function of the number of incidence cases and vice versa, which allows for the use of mortality statistics to obtain estimates of incidence using survival information. In addition, the model can be used to obtain estimates of cancer prevalence, which is useful for health care planning. The method is illustrated using data on lung cancer among males in Connecticut. PMID:25715831
Reliability-Based Electronics Shielding Design Tools
NASA Technical Reports Server (NTRS)
Wilson, J. W.; O'Neill, P. J.; Zang, T. A.; Pandolf, J. E.; Tripathi, R. K.; Koontz, Steven L.; Boeder, P.; Reddell, B.; Pankop, C.
2007-01-01
Shielding design on large human-rated systems allows minimization of radiation impact on electronic systems. Shielding design tools require adequate methods for evaluation of design layouts, guiding qualification testing, and adequate follow-up on final design evaluation.
Total body nitrogen analysis. [neutron activation analysis
NASA Technical Reports Server (NTRS)
Palmer, H. E.
1975-01-01
Studies of two potential in vivo neutron activation methods for determining total and partial body nitrogen in animals and humans are described. A method using the CO-11 in the expired air as a measure of nitrogen content was found to be adequate for small animals such as rats, but inadequate for human measurements due to a slow excretion rate. Studies on the method of measuring the induced N-13 in the body show that with further development, this method should be adequate for measuring muscle mass changes occurring in animals or humans during space flight.
Biometrical issues in the analysis of adverse events within the benefit assessment of drugs.
Bender, Ralf; Beckmann, Lars; Lange, Stefan
2016-07-01
The analysis of adverse events plays an important role in the benefit assessment of drugs. Consequently, results on adverse events are an integral part of reimbursement dossiers submitted by pharmaceutical companies to health policy decision-makers. Methods applied in the analysis of adverse events commonly include simple standard methods for contingency tables. However, the results produced may be misleading if observations are censored at the time of discontinuation due to treatment switching or noncompliance, resulting in unequal follow-up periods. In this paper, we present examples to show that the application of inadequate methods for the analysis of adverse events in the reimbursement dossier can lead to a downgrading of the evidence on a drug's benefit in the subsequent assessment, as greater harm from the drug cannot be excluded with sufficient certainty. Legal regulations on the benefit assessment of drugs in Germany are presented, in particular, with regard to the analysis of adverse events. Differences in safety considerations between the drug approval process and the benefit assessment are discussed. We show that the naive application of simple proportions in reimbursement dossiers frequently leads to uninterpretable results if observations are censored and the average follow-up periods differ between treatment groups. Likewise, the application of incidence rates may be misleading in the case of recurrent events and unequal follow-up periods. To allow for an appropriate benefit assessment of drugs, adequate survival time methods accounting for time dependencies and duration of follow-up are required, not only for time-to-event efficacy endpoints but also for adverse events. © 2016 The Authors. Pharmaceutical Statistics published by John Wiley & Sons Ltd. © 2016 The Authors. Pharmaceutical Statistics published by John Wiley & Sons Ltd.
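The pitfall described above (comparing crude event proportions when follow-up differs between arms) can be made concrete with a small simulation: the true adverse-event hazard is identical in both arms, yet the crude proportions differ while the exposure-adjusted incidence rates agree. All numbers are invented; for recurrent events or non-constant hazards, full survival-time methods as discussed in the paper would still be needed.

```python
import numpy as np

rng = np.random.default_rng(11)

def simulate_arm(n, hazard, followup_years):
    """Constant-hazard adverse-event times, censored at end of follow-up."""
    t = rng.exponential(1.0 / hazard, size=n)
    event = t <= followup_years
    time_at_risk = np.minimum(t, followup_years)
    return event, time_at_risk

# Same true adverse-event hazard in both arms, but arm B is followed twice as
# long (e.g. many arm-A patients switch treatment and are censored early).
event_a, time_a = simulate_arm(n=500, hazard=0.15, followup_years=1.0)
event_b, time_b = simulate_arm(n=500, hazard=0.15, followup_years=2.0)

for name, event, time in (("A", event_a, time_a), ("B", event_b, time_b)):
    naive = event.mean()                      # crude proportion with an event
    rate = event.sum() / time.sum()           # events per person-year at risk
    print(f"arm {name}: crude proportion {naive:.2f}, "
          f"incidence rate {rate:.2f} per person-year")
```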
Perser, Karen; Godfrey, David; Bisson, Leslie
2011-01-01
Context: Double-row rotator cuff repair methods have improved biomechanical performance when compared with single-row repairs. Objective: To review clinical outcomes of single-row versus double-row rotator cuff repair with the hypothesis that double-row rotator cuff repair will result in better clinical and radiographic outcomes. Data Sources: Published literature from January 1980 to April 2010. Key terms included rotator cuff, prospective studies, outcomes, and suture techniques. Study Selection: The literature was systematically searched, and 5 level I and II studies were found comparing clinical outcomes of single-row and double-row rotator cuff repair. Coleman methodology scores were calculated for each article. Data Extraction: Meta-analysis was performed, with treatment effect between single row and double row for clinical outcomes and with odds ratios for radiographic results. The sample size necessary to detect a given difference in clinical outcome between the 2 methods was calculated. Results: Three level I studies had Coleman scores of 80, 74, and 81, and two level II studies had scores of 78 and 73. There were 156 patients with single-row repairs and 147 patients with double-row repairs, both with an average follow-up of 23 months (range, 12-40 months). Double-row repairs resulted in a greater treatment effect for each validated outcome measure in 4 studies, but the differences were not clinically or statistically significant (range, 0.4-2.2 points; 95% confidence interval, –0.19, 4.68 points). Double-row repairs had better radiographic results, but the differences were also not statistically significant (P = 0.13). Two studies had adequate power to detect a 10-point difference between repair methods using the Constant score, and 1 study had power to detect a 5-point difference using the UCLA (University of California, Los Angeles) score. Conclusions: Double-row rotator cuff repair does not show a statistically significant improvement in clinical outcome or radiographic healing with short-term follow-up. PMID:23016017
Perser, Karen; Godfrey, David; Bisson, Leslie
2011-05-01
Double-row rotator cuff repair methods have improved biomechanical performance when compared with single-row repairs. To review clinical outcomes of single-row versus double-row rotator cuff repair with the hypothesis that double-row rotator cuff repair will result in better clinical and radiographic outcomes. Published literature from January 1980 to April 2010. Key terms included rotator cuff, prospective studies, outcomes, and suture techniques. The literature was systematically searched, and 5 level I and II studies were found comparing clinical outcomes of single-row and double-row rotator cuff repair. Coleman methodology scores were calculated for each article. Meta-analysis was performed, with treatment effect between single row and double row for clinical outcomes and with odds ratios for radiographic results. The sample size necessary to detect a given difference in clinical outcome between the 2 methods was calculated. Three level I studies had Coleman scores of 80, 74, and 81, and two level II studies had scores of 78 and 73. There were 156 patients with single-row repairs and 147 patients with double-row repairs, both with an average follow-up of 23 months (range, 12-40 months). Double-row repairs resulted in a greater treatment effect for each validated outcome measure in 4 studies, but the differences were not clinically or statistically significant (range, 0.4-2.2 points; 95% confidence interval, -0.19, 4.68 points). Double-row repairs had better radiographic results, but the differences were also not statistically significant (P = 0.13). Two studies had adequate power to detect a 10-point difference between repair methods using the Constant score, and 1 study had power to detect a 5-point difference using the UCLA (University of California, Los Angeles) score. Double-row rotator cuff repair does not show a statistically significant improvement in clinical outcome or radiographic healing with short-term follow-up.
Guyot, Patricia; Ades, A E; Ouwens, Mario J N M; Welton, Nicky J
2012-02-01
The results of randomized controlled trials (RCTs) on time-to-event outcomes are usually reported as median time to event and the Cox hazard ratio. These do not constitute the sufficient statistics required for meta-analysis or cost-effectiveness analysis, and their use in secondary analyses requires strong assumptions that may not have been adequately tested. In order to enhance the quality of secondary data analyses, we propose a method which derives from published Kaplan-Meier survival curves a close approximation to the original individual patient time-to-event data from which they were generated. We develop an algorithm that maps from digitised curves back to Kaplan-Meier data by finding numerical solutions to the inverted Kaplan-Meier equations, using, where available, information on numbers of events and numbers at risk. The reproducibility and accuracy of survival probabilities, median survival times and hazard ratios based on reconstructed Kaplan-Meier data were assessed by comparing published statistics (survival probabilities, medians and hazard ratios) with statistics based on repeated reconstructions by multiple observers. The validation exercise established that there was no material systematic error and that there was a high degree of reproducibility for all statistics. Accuracy was excellent for survival probabilities and medians; for hazard ratios, reasonable accuracy can only be obtained if at least the numbers at risk or the total number of events are reported. The algorithm is a reliable tool for meta-analyses and cost-effectiveness analyses of RCTs reporting time-to-event data. It is recommended that all RCTs report information on numbers at risk and total number of events alongside Kaplan-Meier curves.
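A heavily simplified sketch of the curve-inversion idea: if censoring between digitised points is ignored, the Kaplan-Meier recursion S_i = S_{i-1}(1 - d_i/n_i) can be inverted step by step to recover event counts; the full algorithm additionally distributes censored observations within intervals using the published numbers at risk. The survival values and starting number at risk below are hypothetical.

```python
import numpy as np

def reconstruct_events(surv, n_at_risk_start):
    """Back out per-step event counts from digitised KM survival probabilities.

    Simplifying assumption (not made by the full algorithm): no censoring
    between the digitised points, so S_i = S_{i-1} * (1 - d_i / n_i) can be
    inverted step by step.
    """
    n = n_at_risk_start
    prev_s = 1.0
    events = []
    for s in surv:
        d = int(round(n * (1.0 - s / prev_s)))   # events at this step
        events.append(d)
        n -= d                                    # still at risk afterwards
        prev_s = s
    return np.array(events), n

# Hypothetical survival probabilities digitised at successive event times.
surv = [0.95, 0.88, 0.80, 0.71]
events, n_remaining = reconstruct_events(surv, n_at_risk_start=100)
print("events per step:", events)         # expected: [5 7 8 9]
print("remaining at risk:", n_remaining)  # expected: 71
```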
Modeling envelope statistics of blood and myocardium for segmentation of echocardiographic images.
Nillesen, Maartje M; Lopata, Richard G P; Gerrits, Inge H; Kapusta, Livia; Thijssen, Johan M; de Korte, Chris L
2008-04-01
The objective of this study was to investigate the use of speckle statistics as a preprocessing step for segmentation of the myocardium in echocardiographic images. Three-dimensional (3D) and biplane image sequences of the left ventricle of two healthy children and one dog (beagle) were acquired. Pixel-based speckle statistics of manually segmented blood and myocardial regions were investigated by fitting various probability density functions (pdf). The statistics of heart muscle and blood could both be optimally modeled by a K-pdf or Gamma-pdf (Kolmogorov-Smirnov goodness-of-fit test). Scale and shape parameters of both distributions could differentiate between blood and myocardium. Local estimation of these parameters was used to obtain parametric images, where window size was related to speckle size (5 x 2 speckles). Moment-based and maximum-likelihood estimators were used. Scale parameters were still able to differentiate blood from myocardium; however, smoothing of edges of anatomical structures occurred. Estimation of the shape parameter required a larger window size, leading to unacceptable blurring. Using these parameters as an input for segmentation resulted in unreliable segmentation. Adaptive mean squares filtering was then introduced using the moment-based scale parameter (sigma(2)/mu) of the Gamma-pdf to automatically steer the two-dimensional (2D) local filtering process. This method adequately preserved sharpness of the edges. In conclusion, a trade-off between preservation of sharpness of edges and goodness-of-fit when estimating local shape and scale parameters is evident for parametric images. For this reason, adaptive filtering outperforms parametric imaging for the segmentation of echocardiographic images.
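The moment-based scale parameter used to steer the adaptive filtering is the local ratio of variance to mean, computed over a window matched to the speckle size. A minimal sketch on a synthetic gamma-speckle image (shape and scale values are illustrative, not fitted to echocardiographic data):

```python
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(8)

# Synthetic envelope image: left half mimics blood, right half myocardium,
# both gamma-distributed speckle but with different shape/scale (illustrative).
blood = rng.gamma(shape=1.5, scale=4.0, size=(128, 64))
myocardium = rng.gamma(shape=4.0, scale=8.0, size=(128, 64))
image = np.hstack([blood, myocardium])

# Local first and second moments over a window roughly matched to speckle size.
win = (5, 9)
local_mean = uniform_filter(image, size=win)
local_mean_sq = uniform_filter(image ** 2, size=win)
local_var = local_mean_sq - local_mean ** 2

# Moment-based estimate of the Gamma scale parameter: sigma^2 / mu.
scale_param = local_var / np.maximum(local_mean, 1e-12)

print("median scale in 'blood' region:     ", np.median(scale_param[:, :64]))
print("median scale in 'myocardium' region:", np.median(scale_param[:, 64:]))
```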
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fox, T.; Tietjen, G.L.; McInroy, J.F.
The Autopsy Tissue Program was begun in 1960. To date, tissues on 900 or more persons in 7 geographic regions have been collected and analyzed for plutonium content. The tissues generally consist of lung, liver, kidney, lymph, bone, and gonadal tissues for each individual. The original objective of the program was to determine the level of plutonium in human tissues due solely to fallout from weapons testing. The baseline thus established was to be used to evaluate future changes. From the first, this program was beset with chemical and statistical difficulties. Many factors whose effects were not recognized and not planned for were found later to be important. Privacy and ethical considerations hindered the gathering of adequate data. Since the chemists were looking for amounts of plutonium very close to background, possible contamination was a very real problem. Widely used chemical techniques introduced a host of statistical problems. The difficulties encountered touch on areas common to large data sets, unusual outlier detection methods, minimum detection limits, problems with aliquot sizes, and time-trends in the data. The conclusions point out areas to which the biologists will have to devote much more careful attention than previously believed.
Understanding sexuality among Indian urban school adolescents
Ramadugu, Shashikumar; Ryali, VSSR; Srivastava, K.; Bhat, P. S.; Prakash, J.
2011-01-01
Context: Adolescence is a very exciting phase of life fraught with many challenges, such as sexuality. Understanding them is important in helping adolescents grow up healthily. Aims: To ascertain the attitudes and knowledge about sexuality among school-going adolescents. Settings and Design: Students in two urban schools of an Indian city from class IX to XII were administered a self-reporting questionnaire on matters related to sexuality. Materials and Methods: Requisite ethical clearances were obtained, as was the consent of the parents and students, before administration of the questionnaire. The authors clarified the adolescents' doubts. Statistical analysis: Statistical Package for the Social Sciences. Results: The incidence of sexual contact was 30.08% for boys and 17.18% for girls. 6.31% of boys and 1.31% of girls reported having experienced sexual intercourse. Friends constituted the main sexual partners for both boys and girls. Sexual abuse was reported by both girls and boys. These and other findings are discussed in the article. Conclusions: Adolescent school students are involved in sexual activity, but lack adequate knowledge in this regard. Students, teachers, and parents need to understand various aspects of sexuality to be able to help adolescents’ healthy sexual development. PMID:22969181
The Acute Effects of Upper Extremity Stretching on Throwing Velocity in Baseball Throwers
Melton, Jason; Delobel, Ashley; Puentedura, Emilio J.
2013-01-01
Purpose. To examine the effects of static and proprioceptive neuromuscular facilitation (PNF) stretching of the shoulder internal rotators on throwing velocity. Subjects. 27 male throwers (mean age = 25.1 years old, SD = 2.4) with adequate knowledge of demonstrable throwing mechanics. Study Design. Randomized crossover trial with repeated measures. Methods. Subjects warmed up, threw 10 pitches at their maximum velocity, were randomly assigned to 1 of 3 stretching protocols (static, PNF, or no stretch), and then repeated their 10 pitches. Velocities were recorded after each pitch and average and peak velocities were recorded after each session. Results. Data were analyzed using a 3 × 2 repeated measures ANOVA. No significant interaction between stretching and throwing velocity was observed. Main effects for time were not statistically significant. Main effects for the stretching groups were statistically significant. Discussion. Results suggest that stretching of the shoulder internal rotators did not significantly affect throwing velocity immediately after stretching. This may be due to the complexity of the throwing task. Conclusions. Stretching may be included in a thrower's warm-up without any effects on throwing velocity. Further research should be performed using a population with more throwing experience and skill. PMID:26464880
Akboğa, Özge; Baradan, Selim
2017-02-07
The ready mixed concrete (RMC) industry, one of the backbones of the construction sector, has its own distinctive occupational safety and health (OSH) risks. Employees face risks that emerge during the fabrication of concrete as well as during its delivery to the construction site. Statistics show that usage of and demand for RMC have been increasing along with the number of producers and workers. Unfortunately, adequate OSH measures to meet this rapid growth are not in place even in top RMC-producing countries such as Turkey. Moreover, the lack of statistical data and academic research in this sector exacerbates the problem. This study aims to fill this gap by mining Turkish Social Security Institution archives and performing univariate frequency and cross-tabulation analyses on 71 incidents in which RMC truck drivers were involved. In addition, investigations and interviews were conducted in seven RMC plants in Turkey and the Netherlands from an OSH point of view. Based on the results, problem areas were identified: cleaning the truck mixer/pump is a hazardous activity in which operators are frequently injured, and being struck by falling objects is a major hazard in the RMC industry. Finally, Job Safety Analyses were performed on these areas to suggest mitigation methods.
Golbaz, Isabelle; Ahlers, Christian; Goesseringer, Nina; Stock, Geraldine; Geitzenauer, Wolfgang; Prünte, Christian; Schmidt-Erfurth, Ursula Margarethe
2011-03-01
This study compared automatic and manual segmentation modalities in the retina of healthy eyes using high-definition optical coherence tomography (HD-OCT). Twenty retinas in 20 healthy individuals were examined using an HD-OCT system (Carl Zeiss Meditec, Inc.). Three-dimensional imaging was performed with an axial resolution of 6 μm at a maximum scanning speed of 25,000 A-scans/second. Volumes of 6 × 6 × 2 mm were scanned. Scans were analysed using a MATLAB-based algorithm and a manual segmentation software system (3D-Doctor). The volume values calculated by the two methods were compared. Statistical analysis revealed a high correlation between automatic and manual modes of segmentation. The automatic mode of measuring retinal volume and the corresponding three-dimensional images provided similar results to the manual segmentation procedure. Both methods were able to visualize retinal and subretinal features accurately. This study compared two methods of assessing retinal volume using HD-OCT scans in healthy retinas. Both methods were able to provide realistic volumetric data when applied to raster scan sets. Manual segmentation methods represent an adequate tool with which to control automated processes and to identify clinically relevant structures, whereas automatic procedures will be needed to obtain data in larger patient populations. © 2009 The Authors. Journal compilation © 2009 Acta Ophthalmol.
Differentially Private Histogram Publication For Dynamic Datasets: An Adaptive Sampling Approach
Li, Haoran; Jiang, Xiaoqian; Xiong, Li; Liu, Jinfei
2016-01-01
Differential privacy has recently become a de facto standard for private statistical data release. Many algorithms have been proposed to generate differentially private histograms or synthetic data. However, most of them focus on “one-time” release of a static dataset and do not adequately address the increasing need of releasing series of dynamic datasets in real time. A straightforward application of existing histogram methods on each snapshot of such dynamic datasets will incur high accumulated error due to the composability of differential privacy and correlations or overlapping users between the snapshots. In this paper, we address the problem of releasing series of dynamic datasets in real time with differential privacy, using a novel adaptive distance-based sampling approach. Our first method, DSFT, uses a fixed distance threshold and releases a differentially private histogram only when the current snapshot is sufficiently different from the previous one, i.e., with a distance greater than a predefined threshold. Our second method, DSAT, further improves DSFT and uses a dynamic threshold adaptively adjusted by a feedback control mechanism to capture the data dynamics. Extensive experiments on real and synthetic datasets demonstrate that our approach achieves better utility than baseline methods and existing state-of-the-art methods. PMID:26973795
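A minimal sketch of the fixed-threshold idea behind DSFT is given below. It omits the private distance computation and the budget allocation of the published method; the parameter names and noise scale are illustrative assumptions only.

```python
import numpy as np

def dsft_release(snapshots, epsilon_per_release, threshold):
    """Simplified distance-based sampling with a fixed threshold: publish a
    Laplace-noised histogram only when the current snapshot differs enough
    (L1 distance) from the last released one; otherwise repeat the previous
    release and save privacy budget."""
    released, last = [], None
    for hist in snapshots:
        hist = np.asarray(hist, dtype=float)
        if last is None or np.abs(hist - last).sum() > threshold:
            noisy = hist + np.random.laplace(scale=1.0 / epsilon_per_release,
                                             size=hist.shape)
            released.append(noisy)
            last = hist
        else:
            released.append(released[-1])  # skip: reuse the previous output
    return released

# Toy stream of three histogram snapshots
snaps = [[10, 5, 2], [10, 5, 3], [20, 9, 4]]
print(len(dsft_release(snaps, epsilon_per_release=0.5, threshold=5)))
```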
Raicu, Valerică
2018-06-15
Investigations of static or dynamic interactions between proteins or other biological macromolecules in living cells often rely on the use of fluorescent tags with two different colors in conjunction with adequate theoretical descriptions of Förster Resonance Energy Transfer (FRET) and molecular-level micro-spectroscopic technology. One such method based on these general principles is FRET spectrometry, which allows determination of the quaternary structure of biomolecules from cell-level images of the distributions, or spectra of occurrence frequency of FRET efficiencies. Subsequent refinements allowed combining FRET frequency spectra with molecular concentration information, thereby providing the proportion of molecular complexes with various quaternary structures as well as their binding/dissociation energies. In this paper, we build on the mathematical principles underlying FRET spectrometry to propose two new spectrometric methods, which have distinct advantages compared to other methods. One of these methods relies on statistical analysis of color mixing in subpopulations of fluorescently tagged molecules to probe molecular association stoichiometry, while the other exploits the color shift induced by FRET to also derive geometric information in addition to stoichiometry. The appeal of the first method stems from its sheer simplicity, while the strength of the second consists in its ability to provide structural information. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Raicu, Valerică
2018-06-01
Investigations of static or dynamic interactions between proteins or other biological macromolecules in living cells often rely on the use of fluorescent tags with two different colors in conjunction with adequate theoretical descriptions of Förster Resonance Energy Transfer (FRET) and molecular-level micro-spectroscopic technology. One such method based on these general principles is FRET spectrometry, which allows determination of the quaternary structure of biomolecules from cell-level images of the distributions, or spectra of occurrence frequency of FRET efficiencies. Subsequent refinements allowed combining FRET frequency spectra with molecular concentration information, thereby providing the proportion of molecular complexes with various quaternary structures as well as their binding/dissociation energies. In this paper, we build on the mathematical principles underlying FRET spectrometry to propose two new spectrometric methods, which have distinct advantages compared to other methods. One of these methods relies on statistical analysis of color mixing in subpopulations of fluorescently tagged molecules to probe molecular association stoichiometry, while the other exploits the color shift induced by FRET to also derive geometric information in addition to stoichiometry. The appeal of the first method stems from its sheer simplicity, while the strength of the second consists in its ability to provide structural information.
Aerobic conditioning for team sport athletes.
Stone, Nicholas M; Kilding, Andrew E
2009-01-01
Team sport athletes require a high level of aerobic fitness in order to generate and maintain power output during repeated high-intensity efforts and to recover. Research to date suggests that these components can be increased by regularly performing aerobic conditioning. Traditional aerobic conditioning, with minimal changes of direction and no skill component, has been demonstrated to effectively increase aerobic function within a 4- to 10-week period in team sport players. More importantly, traditional aerobic conditioning methods have been shown to increase team sport performance substantially. Many team sports require the upkeep of both aerobic fitness and sport-specific skills during a lengthy competitive season. Classic team sport training has been shown to evoke only marginal increases or decreases in aerobic fitness. In recent years, aerobic conditioning methods have been designed to allow adequate intensities to be achieved to induce improvements in aerobic fitness whilst incorporating movement-specific and skill-specific tasks, e.g. small-sided games and dribbling circuits. Such 'sport-specific' conditioning methods have been demonstrated to promote increases in aerobic fitness, though careful consideration of player skill levels, current fitness, player numbers, field dimensions, game rules and availability of player encouragement is required. Whilst different conditioning methods appear equivalent in their ability to improve fitness, whether sport-specific conditioning is superior to other methods at improving actual game performance statistics requires further research.
Turewicz, Michael; Kohl, Michael; Ahrens, Maike; Mayer, Gerhard; Uszkoreit, Julian; Naboulsi, Wael; Bracht, Thilo; Megger, Dominik A; Sitek, Barbara; Marcus, Katrin; Eisenacher, Martin
2017-11-10
The analysis of high-throughput mass spectrometry-based proteomics data must address the specific challenges of this technology. To this end, the comprehensive proteomics workflow offered by the de.NBI service center BioInfra.Prot provides indispensable components for the computational and statistical analysis of this kind of data. These components include tools and methods for spectrum identification and protein inference, protein quantification, expression analysis as well as data standardization and data publication. All particular methods of the workflow which address these tasks are state-of-the-art or cutting edge. As has been shown in previous publications, each of these methods is adequate to solve its specific task and gives competitive results. However, the methods included in the workflow are continuously reviewed, updated and improved to adapt to new scientific developments. All of these particular components and methods are available as stand-alone BioInfra.Prot services or as a complete workflow. Since BioInfra.Prot provides manifold fast communication channels to get access to all components of the workflow (e.g., via the BioInfra.Prot ticket system: bioinfraprot@rub.de) users can easily benefit from this service and get support by experts. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
THE RAPID GROWTH OF COMMUNITY COLLEGES AND THEIR ACCESSIBILITY IN RURAL AREAS.
ERIC Educational Resources Information Center
ELDRIDGE, DONALD A.
THE COURSE OFFERINGS IN SOME JUNIOR COLLEGES FAIL TO MEET ADEQUATELY THE UNIQUE NEEDS OF RURAL YOUTH. A STUDY IN 1964 REVEALED THAT ONLY TWENTY OF THE SEVENTY JUNIOR COLLEGES IN CALIFORNIA OFFERED TRAINING IN AGRICULTURE, ALTHOUGH THE RECENTLY PUBLISHED "DIRECTORY OF JUNIOR COLLEGES" SHOWS AN INCREASE TO SIXTY. FURTHER STATISTICS REVEAL THAT 253…
10 CFR 9.80 - Disclosure of record to persons other than the individual to whom it pertains.
Code of Federal Regulations, 2010 CFR
2010-01-01
... has provided the agency with advance adequate written assurance that the record will be used solely as a statistical research or reporting record and the record is transferred in a form that is not individually identifiable. The advance written statement of assurance shall (i) state the purpose for which the...
?Cuan buenas son nuestras viviendas?: Los hispanos [How Good Is Our Housing? Hispanics].
ERIC Educational Resources Information Center
Yezer, Anthony; Limmer, Ruth
This report provides statistical information regarding the quality and cost of housing occupied by Hispanic Americans throughout the United States. Some of the findings include: (1) Hispanos occupy older and worse dwellings than the general U.S. population, with a significant number of dwellings lacking heat and adequate electricity and plumbing…
Closing the Gender Gap: Girls and Computers.
ERIC Educational Resources Information Center
Fuchs, Lucy
While 15 years ago only a few schools had microcomputers, today a majority of public schools have some computers, although an adequate number of computers for students to use is still in the future. Unfortunately, statistics show that, in many states, a higher percentage of male students are enrolled in computer classes than female; boys seem to…
ERIC Educational Resources Information Center
Bullis, Michael; Reiman, John
1992-01-01
The Transition Competence Battery for Deaf Adolescents and Young Adults (TCB) measures employment and independent living skills. The TCB was standardized on students (N from 180 to 230 for the different subtests) from both mainstreamed and residential settings. Item statistics and subtest reliabilities were adequate; evidence of construct validity…
ERIC Educational Resources Information Center
Yang, Dazhi
2017-01-01
Background: Teaching online is a different experience from that of teaching in a face-to-face setting. Knowledge and skills developed for teaching face-to-face classes are not adequate preparation for teaching online. It is even more challenging to teach science, technology, engineering and math (STEM) courses completely online because these…
Idaho Kids Count Data Book, 1996: Profiles of Child Well-Being.
ERIC Educational Resources Information Center
Idaho KIDS COUNT Project, Boise.
This Kids Count report examines statewide trends in the well-being of Idaho's children. The statistical portrait is based on 15 indicators of child and family well-being: (1) poverty; (2) single parent families; (3) infant mortality; (4) low birth weight babies; (5) percent of all mothers not receiving adequate prenatal care; (6) mothers ages…
NASA Astrophysics Data System (ADS)
Kbaier Ben Ismail, Dhouha; Lazure, Pascal; Puillat, Ingrid
2016-10-01
In marine sciences, many fields display high variability over a large range of spatial and temporal scales, from seconds to thousands of years. The long time series now recorded in this field, at increasing sampling frequencies, are often nonlinear, nonstationary, multiscale and noisy. Their analysis faces new challenges and thus requires the implementation of adequate and specific methods. The objective of this paper is to bring time series analysis methods already applied in econometrics, signal processing, health, etc. to the environmental marine domain, to assess their advantages and drawbacks, and to compare classical techniques with more recent ones. Temperature, turbidity and salinity are important quantities for ecosystem studies. The authors here consider the fluctuations of sea level, salinity, turbidity and temperature recorded by the MAREL Carnot system of Boulogne-sur-Mer (France), a moored buoy equipped with physico-chemical measuring devices working in continuous and autonomous conditions. In order to perform adequate statistical and spectral analyses, it is necessary to know the nature of the time series considered. For this purpose, the stationarity of the series and the occurrence of unit roots are addressed with Augmented Dickey-Fuller tests. As an example, harmonic analysis is not relevant for temperature, turbidity and salinity because of their nonstationarity, in contrast to the nearly stationary sea level datasets. To identify the dominant frequencies associated with the dynamics, the large number of data points provided by the sensors should enable Fourier spectral analysis. The different power spectra show a complex variability and reveal the influence of environmental factors such as tides. However, classical spectral analysis, namely the Blackman-Tukey method, requires not only linear and stationary data but also evenly spaced data, and interpolating the time series introduces numerous artifacts. The Lomb-Scargle algorithm, which is adapted to unevenly spaced data, is used as an alternative, and its limits are also set out. It was found that beyond 50% missing measurements, few significant frequencies are detected, several seasonalities are no longer visible, and even a whole range of high frequencies disappears progressively. Furthermore, two time-frequency decomposition methods, namely wavelets and the Hilbert-Huang Transform (HHT), are applied to the entire dataset. Using the Continuous Wavelet Transform (CWT), some properties of the time series are determined. Then, the inertial wave and several low-frequency tidal waves are identified by applying Empirical Mode Decomposition (EMD). Finally, EMD-based Time Dependent Intrinsic Correlation (TDIC) analysis is applied to consider the correlation between two nonstationary time series.
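To illustrate the Lomb-Scargle alternative for unevenly spaced records mentioned above, the following sketch applies SciPy's implementation to an invented gappy series; the synthetic period and sampling scheme are assumptions, not MAREL Carnot data.

```python
import numpy as np
from scipy.signal import lombscargle

# Unevenly sampled toy series standing in for a gappy buoy record
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 30, 400))                       # days, irregular spacing
y = np.sin(2 * np.pi * t / 0.517) + 0.3 * rng.standard_normal(t.size)
y -= y.mean()                                              # remove the mean before the periodogram

periods = np.linspace(0.3, 2.0, 2000)                      # candidate periods (days)
ang_freqs = 2 * np.pi / periods                            # lombscargle expects angular frequencies
power = lombscargle(t, y, ang_freqs)

print(periods[np.argmax(power)])                           # dominant period, close to 0.517 days
```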
External cooling methods for treatment of fever in adults: a systematic review.
Chan, E Y; Chen, W T; Assam, P N
It is unclear whether the use of external cooling to treat fever contributes to better patient outcomes. Despite this, it is common practice to treat febrile patients with external cooling methods, alone or in combination with pharmacological antipyretics. The objective of this systematic review was to evaluate the effectiveness and complications of external cooling methods in febrile adults in acute care settings. We included adults admitted to acute care settings who developed elevated body temperature. We considered any external cooling method compared to no cooling. We considered randomised controlled trials (RCTs), quasi-randomised trials and controlled trials with concurrent control groups. Search strategy: We searched relevant published or unpublished studies up to October 2009 regardless of language. We searched major databases, reference lists and bibliographies of all relevant articles, and contacted experts in the field for additional studies. Two reviewers independently screened titles and abstracts and retrieved all potentially relevant studies. Two reviewers independently assessed the methodological quality of included studies. Where appropriate, the results of studies were quantitatively summarised. Relative risks or weighted mean differences and their 95% confidence intervals were calculated using the random effects model in Review Manager 5. For each pooled comparison, heterogeneity was assessed using the chi-squared test at the 5% level of statistical significance, with the I2 statistic used to assess the impact of statistical heterogeneity on study results. Where statistical summary was not appropriate or possible, the findings were summarised in narrative form. We found six RCTs that compared the effectiveness and complications of external cooling methods against no external cooling. There was wide variation in the outcome measures between the included trials. We performed meta-analyses on data from two RCTs totalling 356 patients testing external cooling combined with antipyretics versus antipyretics alone, for the resolution of fever. The results did not show a statistically significant reduction in fever (relative risk 1.12, 95% CI 0.95 to 1.31; P=0.35; I2=0%). The evidence from four trials suggested that there was no difference between the external cooling and no cooling groups in the mean drop in body temperature after treatment initiation. The results for most other outcomes also did not demonstrate statistically significant differences. However, summarising the results of five trials comprising 371 patients showed that the external cooling group was more likely to shiver than the no cooling group (relative risk 6.37, 95% CI 2.01 to 20.11; P=0.61; I2=0%). Overall, this review suggests that external cooling methods (whether used alone or in combination with pharmacologic methods) were not effective in treating fever among adults admitted to acute care settings, yet they were associated with a higher incidence of shivering. These results should be interpreted in light of the methodological limitations of the available trials. Given the currently available evidence, the routine use of external cooling methods to treat fever in adults may not be warranted until further evidence is available. They could be considered for patients whose conditions cannot tolerate even a slight increase in temperature or who request them. Whenever they are used, shivering should be prevented.
Well-designed, adequately powered, randomised trials comparing external cooling methods against no cooling are needed.
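As an illustration of the random-effects pooling described in the review above, the sketch below implements a generic DerSimonian-Laird combination of log relative risks. It is not Review Manager's code, and the numerical inputs are placeholders.

```python
import numpy as np

def dersimonian_laird_rr(log_rr, var_log_rr):
    """Pool study-level log relative risks with DerSimonian-Laird
    random-effects weights; returns pooled RR, 95% CI and I^2 (%)."""
    log_rr = np.asarray(log_rr, float)
    v = np.asarray(var_log_rr, float)
    w = 1.0 / v                                    # fixed-effect (inverse-variance) weights
    fixed = np.sum(w * log_rr) / np.sum(w)
    q = np.sum(w * (log_rr - fixed) ** 2)          # Cochran's Q
    df = len(log_rr) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                  # between-study variance
    w_star = 1.0 / (v + tau2)                      # random-effects weights
    pooled = np.sum(w_star * log_rr) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return (np.exp(pooled),
            np.exp(pooled - 1.96 * se),
            np.exp(pooled + 1.96 * se),
            i2)

# Placeholder inputs: two studies' log RRs and their variances
print(dersimonian_laird_rr([0.11, 0.05], [0.006, 0.009]))
```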
Rukundo, Peter M; Iversen, Per O; Andreassen, Bård A; Oshaug, Arne; Kikafunda, Joyce; Rukooko, Byaruhanga
2015-04-25
Despite the instruments on the right to adequate food adopted by the United Nations, there exists limited information on how this right is perceived. Following a major 2010 landslide disaster in the Bududa district of Eastern Uganda and the resettlement of some affected households into the Kiryandongo district in Western Uganda, we surveyed both districts to explore perceptions about the right to adequate food among households with different experiences; disaster-affected and controls. We deployed qualitative and quantitative techniques to a cross-sectional survey. The index respondent was the head of each randomly selected household from the landslide affected communities and controls from a bordering sub-county. Data was collected by interviews and focus group discussions (FGDs). Structured entries were tested statistically to report associations using Pearson's Chi-square at the 95% CI. Information from FGDs was transcribed, coded, sequenced and patterned. Findings from both techniques were triangulated to facilitate interpretations. Analysis included 1,078 interview entries and 12 FGDs. Significant differences between the affected and control households (P < 0.05) were observed with: age; education level; religious affiliation; existence of assets that complement food source; and having received relief food. Analysis between groups showed differences in responses on: whether everyone has a right to adequate food; who was supposed to supply relief food; whether relief food was adequate; and preferred choice on the means to ensure the right to adequate food. FGDs emphasized that access to land was the most important means to food and income. Affected households desired remedial interventions especially alternative land for livelihood. Despite the provision of adequate relief food being a state's obligation, there was no opportunity to exercise choice and preference. Comprehension and awareness of accountability and transparency issues was also low. Though a significant proportion of participants affirmed they have a right to adequate food, relief food was largely perceived as insufficient. Given the high regard for land as a preferred remedy, a resettlement policy is of the essence to streamline post-landslide displacement and resettlement. Information materials need to be assembled and disseminated to stimulate awareness and debate on the right to adequate food.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arutyunyan, R.V.; Bol`shov, L.A.; Vasil`ev, S.K.
1994-06-01
The objective of this study was to clarify a number of issues related to the spatial distribution of contaminants from the Chernobyl accident. The effects of local statistics were addressed by collecting and analyzing (for cesium-137) soil samples from a number of regions, and it was found that sample activity differed by a factor of 3-5. The effect of local non-uniformity was estimated by modeling the distribution of the average activity of a set of five samples for each of the regions, with the spread in the activities over a ±2 range being equal to 25%. The statistical characteristics of the distribution of contamination were then analyzed and found to follow a log-normal distribution, with the standard deviation being a function of test area. All data for the Bryanskaya Oblast area were analyzed statistically and were adequately described by a log-normal function.
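A minimal sketch of checking a log-normal description for one test area might look like the following; the activity values are invented, and the Shapiro-Wilk test on the log scale is an assumed choice of adequacy check, not the authors' procedure.

```python
import numpy as np
from scipy import stats

# Hypothetical Cs-137 sample activities for one test area (arbitrary units)
activity = np.array([12.1, 18.4, 9.7, 25.3, 14.8, 40.2, 11.5, 22.0, 16.9, 30.1])

# Fit the log-normal by working with log-activities, then test normality of the logs
log_a = np.log(activity)
mu, sigma = log_a.mean(), log_a.std(ddof=1)
stat, p = stats.shapiro(log_a)   # high p-value: no evidence against log-normality

print(f"geometric mean = {np.exp(mu):.1f}, geometric SD = {np.exp(sigma):.2f}, p = {p:.2f}")
```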
The Problem of Auto-Correlation in Parasitology
Pollitt, Laura C.; Reece, Sarah E.; Mideo, Nicole; Nussey, Daniel H.; Colegrave, Nick
2012-01-01
Explaining the contribution of host and pathogen factors in driving infection dynamics is a major ambition in parasitology. There is increasing recognition that analyses based on single summary measures of an infection (e.g., peak parasitaemia) do not adequately capture infection dynamics and so, the appropriate use of statistical techniques to analyse dynamics is necessary to understand infections and, ultimately, control parasites. However, the complexities of within-host environments mean that tracking and analysing pathogen dynamics within infections and among hosts poses considerable statistical challenges. Simple statistical models make assumptions that will rarely be satisfied in data collected on host and parasite parameters. In particular, model residuals (unexplained variance in the data) should not be correlated in time or space. Here we demonstrate how failure to account for such correlations can result in incorrect biological inference from statistical analysis. We then show how mixed effects models can be used as a powerful tool to analyse such repeated measures data in the hope that this will encourage better statistical practices in parasitology. PMID:22511865
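As a concrete illustration of the mixed-effects approach advocated above, the sketch below fits a random-intercept model to invented repeated parasitaemia measurements with statsmodels; the data and formula are illustrative only, not from the cited study.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical repeated-measures data: parasitaemia measured daily in four hosts
df = pd.DataFrame({
    "host": ["h1"] * 5 + ["h2"] * 5 + ["h3"] * 5 + ["h4"] * 5,
    "day": list(range(5)) * 4,
    "parasitaemia": [2.1, 3.4, 5.0, 4.2, 3.1,
                     1.8, 2.9, 4.4, 5.1, 4.0,
                     2.5, 3.8, 5.6, 4.9, 3.5,
                     2.0, 3.1, 4.7, 4.4, 3.2],
})

# A random intercept per host absorbs the within-host correlation of repeated measures
model = smf.mixedlm("parasitaemia ~ day", df, groups=df["host"])
result = model.fit()
print(result.summary())
```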
Mikkonen, Hannah G; Clarke, Bradley O; Dasika, Raghava; Wallis, Christian J; Reichman, Suzie M
2017-02-15
Understanding ambient background concentrations in soil, at a local scale, is an essential part of environmental risk assessment. Where high resolution geochemical soil surveys have not been undertaken, soil data from alternative sources, such as environmental site assessment reports, can be used to support an understanding of ambient background conditions. Concentrations of metals/metalloids (As, Mn, Ni, Pb and Zn) were extracted from open-source environmental site assessment reports, for soils derived from the Newer Volcanics basalt, of Melbourne, Victoria, Australia. A manual screening method was applied to remove samples that were indicated to be contaminated by point sources and hence not representative of ambient background conditions. The manual screening approach was validated by comparison to data from a targeted background soil survey. Statistical methods for exclusion of contaminated samples from background soil datasets were compared to the manual screening method. The statistical methods tested included the Median plus Two Median Absolute Deviations, the upper whisker of a normal and log transformed Tukey boxplot, the point of inflection on a cumulative frequency plot and the 95th percentile. We have demonstrated that where anomalous sample results cannot be screened using site information, the Median plus Two Median Absolute Deviations is a conservative method for derivation of ambient background upper concentration limits (i.e. expected maximums). The upper whisker of a boxplot and the point of inflection on a cumulative frequency plot, were also considered adequate methods for deriving ambient background upper concentration limits, where the percentage of contaminated samples is <25%. Median ambient background concentrations of metals/metalloids in the Newer Volcanic soils of Melbourne were comparable to ambient background concentrations in Europe and the United States, except for Ni, which was naturally enriched in the basalt-derived soils of Melbourne. Copyright © 2016 Elsevier B.V. All rights reserved.
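The Median plus Two Median Absolute Deviations screen can be sketched in a few lines; the concentrations below are invented, and no MAD scaling factor is applied, matching the rule as stated in the abstract.

```python
import numpy as np

def mad_upper_limit(conc):
    """Median plus two median absolute deviations: a conservative ambient
    background upper concentration limit for a pooled soil dataset."""
    conc = np.asarray(conc, dtype=float)
    med = np.median(conc)
    mad = np.median(np.abs(conc - med))
    return med + 2.0 * mad

# Hypothetical arsenic results (mg/kg) pooled from site-assessment reports;
# the 22.0 value would fall above the derived background limit
print(mad_upper_limit([4.2, 5.1, 3.8, 6.0, 4.9, 5.5, 22.0, 4.4]))
```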
Hu, Dayi; Sun, Yihong; Liao, Yuhua; Huang, Jing; Zhao, Ruiping; Yang, Kan
2016-01-01
To assess the blood pressure-lowering efficacy and tolerability of perindopril/amlodipine fixed-dose combinations in Chinese patients with mild-to-moderate essential hypertension not adequately controlled with monotherapy alone. In 2 separate double-blind studies, patients received a 4-week run-in monotherapy of amlodipine 5 mg or perindopril 4 mg, respectively. Those whose blood pressure was uncontrolled were then randomized to receive the fixed-dose combination of perindopril 5 mg/amlodipine 5 mg (Per/Amlo group) or remain on the monotherapy for 8 weeks. Patients who were uncontrolled at the week 8 (W8) visit were up-titrated for the Per/Amlo combination, or received additional treatment if on monotherapy, for a further 4 weeks. The main efficacy assessment was at 8 weeks. After 8 weeks, systolic blood pressure (SBP; primary criterion) was statistically significantly lower in the Per/Amlo group (vs. Amlo 5 mg, p = 0.0095; vs. Per 4 mg, p < 0.0001). Uncontrolled patients at W8 who received an up-titration of the Per/Amlo combination showed a further SBP reduction. These changes were mirrored by reassuring reductions in diastolic blood pressure. The fixed-dose combinations were well tolerated. Single-pill combinations of perindopril and amlodipine provide hypertensive patients with a convenient and effective method of reducing blood pressure. © 2016 S. Karger AG, Basel.
Child Care Providers' Knowledge About Dental Injury First Aid in Preschool-age Children.
Sienkiewicz, Kristine L; Rainchuso, Lori; Boyd, Linda D; Giblin, Lori
2017-06-01
Purpose: The aim of this study was to assess child care providers' level of knowledge of first aid management and attitudes towards dental injuries among preschool-age children within Fairfield County, Connecticut and Boston, Massachusetts. Methods: This descriptive cross-sectional study used a web-based, validated questionnaire adapted from several studies with permission from the authors. A panel of 5 dental experts determined the relevance of the questions and overall content (I-CVI range 0.8-1; S-CVI = 0.95). The 28-question survey included demographics, level of knowledge, attitudes about traumatic dental injuries, emergency management, and 2 case study questions on management of luxation and tooth fracture. Survey data were coded and analyzed for associations and trends using STATA® statistics/data analysis software v. 11.2. Results: A total of 100 child care providers completed the online questionnaire. Eighty-four percent self-reported little to no knowledge about dental injury management. Sixty percent of child care providers agreed that they are responsible for managing dental injuries. Approximately two-thirds of child care providers reported not feeling adequately informed about dental injuries, with 77% expressing interest in receiving more information. Conclusions: The majority of child care providers do not have the knowledge to perform adequate first aid following a dental injury. Professional development on first aid for dental injuries is recommended among this workforce population. Copyright © 2017 The American Dental Hygienists’ Association.
Shakeri, Mohammad-Taghi; Taghipour, Ali; Sadeghi, Masoumeh; Nezami, Hossein; Amirabadizadeh, Ali-Reza; Bonakchi, Hossein
2017-01-01
Background: Writing, designing, and conducting a clinical trial research proposal has an important role in achieving valid and reliable findings. Thus, this study aimed at critically appraising fundamental information in approved clinical trial research proposals in Mashhad University of Medical Sciences (MUMS) from 2008 to 2014. Methods: This cross-sectional study was conducted on all 935 approved clinical trial research proposals in MUMS from 2008 to 2014. A valid, reliable, comprehensive, simple, and usable checklist consisting of 11 main items, developed in sessions with biostatisticians and methodologists, was used as the research tool. The agreement rate between the reviewers of the proposals, who were responsible for data collection, was assessed during 3 sessions, and the kappa statistic calculated at the last session was 97%. Results: More than 60% of the research proposals had a methodologist consultant, and the type of study or study design had been specified in almost all of them (98%). Appropriateness of study aims with hypotheses was not observed in a significant number of research proposals (585 proposals, 62.6%). The required sample size for 66.8% of the approved proposals was based on a sample size formula; however, in 25% of the proposals, the sample size formula was not in accordance with the study design. The data collection tool was not selected appropriately in 55.2% of the approved research proposals. The type and method of randomization were unknown in 21% of the proposals, and dealing with missing data had not been described in most of them (98%). Inclusion and exclusion criteria were fully and adequately explained in 92% of proposals. Moreover, 44% and 31% of the research proposals were moderate and weak in rank, respectively, with respect to the correctness of the statistical analysis methods. Conclusion: Findings of the present study revealed that a large portion of the approved proposals were highly biased or ambiguous with respect to randomization, blinding, dealing with missing data, data collection tool, sampling methods, and statistical analysis. Thus, it is essential to consult and collaborate with a methodologist in all parts of a proposal to control the possible and specific biases in clinical trials. PMID:29445703
Examining school-based hygiene facilities: a quantitative assessment in a Ghanaian municipality.
Appiah-Brempong, Emmanuel; Harris, Muriel J; Newton, Samuel; Gulis, Gabriel
2018-05-02
The crucial role of adequate water, sanitation and hygiene (WASH) facilities in influencing children's handwashing behaviour is widely reported. A report from UNICEF indicates a dearth of adequate data on WASH facilities in schools, especially in the developing world. This study sought to contribute to building the evidence base on school hygiene facilities in Ghana. The study further explored possible associations and differences between key variables within the context of school water, sanitation and hygiene. Data were collected from 37 junior high schools using an observational checklist. Methods of data analysis included a Scalogram model, Fisher's exact test, and a Student's t-test. Results of the study showed a facility deficiency in many schools: 33% of schools had students washing their hands in a shared receptacle (bowl), 24% had students using a single cotton towel to dry hands after handwashing, and only 16% of schools had a functional water facility. Furthermore, results of a proportion test indicated that 83% of schools which had functional water facilities also had functional handwashing stations. By contrast, only 3% of schools which did not have functional water facilities had a functional handwashing station. A test of difference in the proportions of the two sets of schools showed a statistically significant difference (p < 0.001). In addition, 40% of schools which had financial provisions for water supply also had functional handwashing stations, whereas only 7% of schools which did not have financial provisions for water supply had functional handwashing stations. There was a statistically significant difference in the proportions of the two sets of schools (p = 0.02). We conclude that it is essential to have a financial provision for water supply in schools, as this can potentially influence the existence of a handwashing station in a school. An intervention by government, educational authorities and civil society organisations towards enabling schools in low-resource areas to have a sustainable budgetary allocation for WASH facilities would be timely.
Janetzki, Sylvia; Panageas, Katherine S; Ben-Porat, Leah; Boyer, Jean; Britten, Cedrik M; Clay, Timothy M; Kalos, Michael; Maecker, Holden T; Romero, Pedro; Yuan, Jianda; Kast, W Martin; Hoos, Axel
2008-03-01
The Cancer Vaccine Consortium of the Sabin Vaccine Institute (CVC/SVI) is conducting an ongoing large-scale immune monitoring harmonization program through its members and affiliated associations. This effort was brought to life as an external validation program by conducting an international Elispot proficiency panel with 36 laboratories in 2005, and was followed by a second panel with 29 participating laboratories in 2006, allowing for application of learnings from the first panel. Critical protocol choices, as well as standardization and validation practices among laboratories, were assessed through detailed surveys. Although panel participants had to follow general guidelines in order to allow comparison of results, each laboratory was able to use its own protocols, materials and reagents. The second panel recorded an overall significantly improved performance, as measured by the ability to detect all predefined responses correctly. Protocol choices and laboratory practices, which can have a dramatic effect on the overall assay outcome, were identified and led to the following recommendations: (A) Establish a laboratory SOP for Elispot testing procedures including (A1) a counting method for apoptotic cells for determining adequate cell dilution for plating, and (A2) overnight rest of cells prior to plating and incubation, (B) Use only pre-tested serum optimized for a low background to high signal ratio, (C) Establish a laboratory SOP for plate reading including (C1) human auditing during the reading process and (C2) adequate adjustments for technical artifacts, and (D) Only allow trained personnel, certified per laboratory SOPs, to conduct assays. Recommendations described under (A) were found to make a statistically significant difference in assay performance, while the remaining recommendations are based on practical experiences confirmed by the panel results, which could not be statistically tested. These results provide initial harmonization guidelines to optimize Elispot assay performance for the immunotherapy community. Further optimization is in process with ongoing panels.
Optimization of the trienzyme extraction for the microbiological assay of folate in vegetables.
Chen, Liwen; Eitenmiller, Ronald R
2007-05-16
Response surface methodology was applied to optimize the trienzyme digestion for the extraction of folate from vegetables. Trienzyme extraction is a combined enzymatic digestion by protease, alpha-amylase, and conjugase (gamma-glutamyl hydrolase) to liberate the carbohydrate and protein-bound folates from food matrices for total folate analysis. It is the extraction method used in AOAC Official Method 2004.05 for assay of total folate in cereal grain products. Certified reference material (CRM) 485 mixed vegetables was used to represent the matrix of vegetables. Regression and ridge analysis were performed by statistical analysis software. The predicted second-order polynomial model was adequate (R2 = 0.947), without significant lack of fit (p > 0.1). Both protease and alpha-amylase have significant effects on the extraction. Ridge analysis gave an optimum trienzyme digestion time: Pronase, 1.5 h; alpha-amylase, 1.5 h; and conjugase, 3 h. The experimental value for CRM 485 under this optimum digestion was close to the predicted value from the model, confirming the validity and adequacy of the model. The optimized trienzyme digestion condition was applied to five vegetables and yielded higher folate levels than the trienzyme digestion parameters employed in AOAC Official Method 2004.05.
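A hedged sketch of fitting a second-order polynomial response surface to two of the digestion factors is shown below; the runs and folate yields are invented, and the conjugase factor is omitted for brevity, so this is only an illustration of the modelling step, not the published optimization.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical digestion runs: enzyme times (h) versus measured folate yield
df = pd.DataFrame({
    "protease": [0.5, 0.5, 2.5, 2.5, 1.5, 1.5, 1.5, 0.1, 2.9, 1.5],
    "amylase":  [0.5, 2.5, 0.5, 2.5, 1.5, 0.1, 2.9, 1.5, 1.5, 1.5],
    "folate":   [310, 335, 340, 352, 360, 330, 345, 318, 349, 362],
})

# Second-order polynomial response surface in the two significant factors
rsm = smf.ols("folate ~ protease + amylase + I(protease**2) + I(amylase**2) "
              "+ protease:amylase", data=df).fit()
print(rsm.rsquared)   # model adequacy, analogous to the reported R^2
```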
Trypanosoma cruzi infectivity assessment in "in vitro" culture systems by automated cell counting.
Liempi, Ana; Castillo, Christian; Cerda, Mauricio; Droguett, Daniel; Duaso, Juan; Barahona, Katherine; Hernández, Ariane; Díaz-Luján, Cintia; Fretes, Ricardo; Härtel, Steffen; Kemmerling, Ulrike
2015-03-01
Chagas disease is an endemic, neglected tropical disease in Latin America that is caused by the protozoan parasite Trypanosoma cruzi. In vitro models constitute the first experimental approach for studying the physiopathology of the disease and for assaying potential new trypanocidal agents. Here, we report and clearly describe the use of commercial software (MATLAB®) to quantify T. cruzi amastigotes and infected mammalian cells (BeWo) and compare this analysis with the manual one. There was no statistically significant difference between the manual and the automatic quantification of the parasite; the two methods showed a correlation analysis r² value of 0.9159. The most significant advantage of the automatic quantification was the efficiency of the analysis. The drawback of this automated cell counting method was that some parasites were assigned to the wrong BeWo cell; however, these errors did not exceed 5% when adequate experimental conditions were chosen. We conclude that this quantification method constitutes an excellent tool for evaluating the parasite load in cells and therefore an easy and reliable way to study parasite infectivity. Copyright © 2014 Elsevier B.V. All rights reserved.
Sari, Elif; Ayyildiz, Nusret
2012-12-01
Honey is a sweet food made by bees using nectar from flowers. Its quality depends on a number of factors, such as floral type, pH, moisture, free acidity, diastase activity, invert sugar and sucrose. The aim of the study was to examine the quality of 50 sunflower honey (Helianthus annuus L.) samples collected from the Thrace region of Turkey in terms of melissopalynological analysis, important chemical parameters and antioxidant activities. The total phenolic content of the honey samples was determined by the Folin-Ciocalteu method with spectrophotometry. The 2,2-diphenyl-1-picrylhydrazyl (DPPH) method was used to determine anti-radical activity, and the phosphomolybdenum method was utilized for antioxidant activity. Correlations between the analysed parameters were found to be statistically significant (p < 0.05). The results obtained for the physicochemical characteristics of the sunflower honey indicate a good quality level, adequate processing, good maturity and freshness, and the sunflower honey samples studied proved to be a good source of natural dietary antioxidants. This is the first report of the total phenolic content, antioxidant and antiradical activities of sunflower honeys collected from the Thrace region of Turkey.
NASA Technical Reports Server (NTRS)
Strub, P. Ted
1991-01-01
The overall goal of this project was to increase our understanding of processes which determine the temporally varying distributions of surface chlorophyll pigment concentration and surface temperature in the California Current System (CCS) on the time-scale of 'events', i.e., several days to several weeks. We also proposed to investigate seasonal and regional differences in these events. Additionally, we proposed to evaluate methods of estimating surface velocities and horizontal transport of pigment and heat from sequences of AVHRR and CZCS images. The four specific objectives stated in the original proposal were to: (1) test surface current estimates made from sequences of both SST and color images using variations of the statistical method of Emery et al. (1986) and estimate the uncertainties in these satellite-derived surface currents; (2) characterize the spatial and temporal relationships of chlorophyll and temperature in rapidly evolving features for which adequate imagery exist and evaluate the contribution of these events to monthly and seasonal averages; (3) use the methods tested in (1) to determine the nature of the velocity fields in the CCS; and (4) compare the pigment concentrations, temperatures, and currents in different seasons and in different geographic regions.
New technique for ensemble dressing combining Multimodel SuperEnsemble and precipitation PDF
NASA Astrophysics Data System (ADS)
Cane, D.; Milelli, M.
2009-09-01
The Multimodel SuperEnsemble technique (Krishnamurti et al., Science 285, 1548-1550, 1999) is a post-processing method for the estimation of weather forecast parameters that reduces direct model output errors. It differs from other ensemble analysis techniques in its use of an adequate weighting of the input forecast models to obtain a combined estimate of meteorological parameters. Weights are calculated by least-squares minimization of the difference between the model and the observed field during a so-called training period. Although it can be applied successfully to continuous parameters such as temperature, humidity, wind speed and mean sea level pressure (Cane and Milelli, Meteorologische Zeitschrift, 15, 2, 2006), the Multimodel SuperEnsemble also gives good results when applied to precipitation, a parameter that is quite difficult to handle with standard post-processing methods. Here we present our methodology for Multimodel precipitation forecasts, applied to a wide spectrum of results over the very dense non-GTS weather station network of Piemonte. We focus in particular on an accurate statistical method for bias correction and on ensemble dressing in agreement with the observed, forecast-conditioned precipitation PDF. Acknowledgement: this work is supported by the Italian Civil Defence Department.
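A minimal sketch of the least-squares weighting step, under the usual SuperEnsemble formulation (forecast = observed training mean plus weighted model anomalies), is given below; it ignores the operational details of the authors' bias correction and precipitation dressing, and the toy data are invented.

```python
import numpy as np

def superensemble_fit(model_forecasts, observations):
    """Least-squares SuperEnsemble weights over a training period:
    minimise ||obs anomalies - model anomalies @ w||^2."""
    F = np.asarray(model_forecasts, float)   # shape (n_times, n_models)
    o = np.asarray(observations, float)      # shape (n_times,)
    f_bar, o_bar = F.mean(axis=0), o.mean()
    w, *_ = np.linalg.lstsq(F - f_bar, o - o_bar, rcond=None)
    return w, f_bar, o_bar

def superensemble_predict(new_forecasts, w, f_bar, o_bar):
    """Combined forecast = observed training mean + weighted model anomalies."""
    return o_bar + (np.asarray(new_forecasts, float) - f_bar) @ w

# Toy training period: 3 models, 10 forecast times of a temperature-like field
rng = np.random.default_rng(1)
obs = 15 + rng.standard_normal(10)
fcst = np.column_stack([obs + rng.normal(1.0, 0.5, 10),    # warm-biased model
                        obs + rng.normal(-0.5, 0.8, 10),
                        obs + rng.normal(0.0, 1.2, 10)])
w, f_bar, o_bar = superensemble_fit(fcst, obs)
print(superensemble_predict(fcst[-1], w, f_bar, o_bar))
```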
Chromatography methods and chemometrics for determination of milk fat adulterants
NASA Astrophysics Data System (ADS)
Trbović, D.; Petronijević, R.; Đorđević, V.
2017-09-01
Milk and milk-based products are among the leading food categories according to reported cases of food adulteration. Although many authentication problems exist in all areas of the food industry, adequate control methods are required to evaluate the authenticity of milk and milk products in the dairy industry. Moreover, gas chromatography (GC) analysis of triacylglycerols (TAGs) or fatty acid (FA) profiles of milk fat (MF) in combination with multivariate statistical data processing have been used to detect adulterations of milk and dairy products with foreign fats. The adulteration of milk and butter is a major issue for the dairy industry. The major adulterants of MF are vegetable oils (soybean, sunflower, groundnut, coconut, palm and peanut oil) and animal fat (cow tallow and pork lard). Multivariate analysis enables adulterated MF to be distinguished from authentic MF, while taking into account many analytical factors. Various multivariate analysis methods have been proposed to quantitatively detect levels of adulterant non-MFs, with multiple linear regression (MLR) seemingly the most suitable. There is a need for increased use of chemometric data analyses to detect adulterated MF in foods and for their expanded use in routine quality assurance testing.
Update of Standard Practices for New Method Validation in Forensic Toxicology.
Wille, Sarah M R; Coucke, Wim; De Baere, Thierry; Peters, Frank T
2017-01-01
International agreement concerning validation guidelines is important to obtain quality forensic bioanalytical research and routine applications, as it all starts with the reporting of reliable analytical data. Standards for fundamental validation parameters are provided in guidelines such as those from the US Food and Drug Administration (FDA), the European Medicines Agency (EMA), the German-speaking Gesellschaft für Toxikologie und Forensische Chemie (GTFCH) and the Scientific Working Group for Forensic Toxicology (SWGTOX). These validation parameters include selectivity, matrix effects, method limits, calibration, accuracy and stability, as well as other parameters such as carryover, dilution integrity and incurred sample reanalysis. It is, however, not easy for laboratories to implement these guidelines in practice, as these international guidelines remain nonbinding protocols that depend on the analytical technique applied and that need to be updated according to the analyst's method requirements and the application type. In this manuscript, a review of the current guidelines and literature concerning bioanalytical validation parameters in a forensic context is given and discussed. In addition, suggestions for the experimental set-up, the pros and cons of statistical approaches, and adequate acceptance criteria for the validation of bioanalytical applications are given. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
NASA Astrophysics Data System (ADS)
Mallast, U.; Gloaguen, R.; Geyer, S.; Rödiger, T.; Siebert, C.
2011-08-01
In this paper we present a semi-automatic method to infer groundwater flow-paths based on the extraction of lineaments from digital elevation models. This method is especially adequate in remote and inaccessible areas where in-situ data are scarce. The combined method of linear filtering and object-based classification provides a lineament map with a high degree of accuracy. Subsequently, lineaments are differentiated into geological and morphological lineaments using auxiliary information and finally evaluated in terms of hydro-geological significance. Using the example of the western catchment of the Dead Sea (Israel/Palestine), the orientation and location of the differentiated lineaments are compared to characteristics of known structural features. We demonstrate that a strong correlation between lineaments and structural features exists. Using Euclidean distances between lineaments and wells provides an assessment criterion to evaluate the hydraulic significance of detected lineaments. Based on this analysis, we suggest that the statistical analysis of lineaments allows a delineation of flow-paths and thus significant information on groundwater movements. To validate the flow-paths we compare them to existing results of groundwater models that are based on well data.
Villagómez-Ornelas, Paloma; Hernández-López, Pedro; Carrasco-Enríquez, Brenda; Barrios-Sánchez, Karina; Pérez-Escamilla, Rafael; Melgar-Quiñónez, Hugo
2014-01-01
This article validates the statistical consistency of two food security scales: the Mexican Food Security Scale (EMSA) and the Latin American and Caribbean Food Security Scale (ELCSA). Validity tests were conducted in order to verify that both scales were consistent instruments, conformed by independent, properly calibrated and adequately sorted items, arranged in a continuum of severity. The following tests were developed: sorting of items; Cronbach's alpha analysis; parallelism of prevalence curves; Rasch models; sensitivity analysis through mean differences' hypothesis test. The tests showed that both scales meet the required attributes and are robust statistical instruments for food security measurement. This is relevant given that the lack of access to food indicator, included in multidimensional poverty measurement in Mexico, is calculated with EMSA.
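One of the consistency checks named above, Cronbach's alpha, can be sketched directly from its definition; the response matrix below (respondents by items, 1 = affirmative) is invented for illustration and is not EMSA/ELCSA data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents-by-items matrix of scored answers."""
    X = np.asarray(items, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1).sum()      # sum of item variances
    total_var = X.sum(axis=1).var(ddof=1)        # variance of respondents' total scores
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

responses = np.array([[1, 1, 1, 0, 0, 0],
                      [1, 1, 0, 0, 0, 0],
                      [1, 1, 1, 1, 1, 0],
                      [1, 0, 0, 0, 0, 0],
                      [1, 1, 1, 1, 0, 0]])
print(round(cronbach_alpha(responses), 2))
```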
NASA Technical Reports Server (NTRS)
Falls, L. W.
1975-01-01
Vandenberg Air Force Base (AFB), California, wind component statistics are presented for use in aerospace engineering applications that require component wind probabilities for various flight azimuths and selected altitudes. The normal (Gaussian) distribution is presented as a statistical model to represent component winds at Vandenberg AFB. Head, tail, and crosswind components are tabulated for all flight azimuths for altitudes from 0 to 70 km by monthly reference periods. Wind components are given for 11 selected percentiles ranging from 0.135 percent to 99.865 percent for each month. The results of statistical goodness-of-fit tests are presented to verify the use of the Gaussian distribution as an adequate model to represent component winds at Vandenberg AFB.
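The two computations involved, resolving a wind vector into head/tail and crosswind components for a flight azimuth and reading component percentiles off a fitted normal distribution, are sketched below. The wind values and monthly mean/standard deviation are invented for illustration.

```python
# Wind components for a flight azimuth, and Gaussian component percentiles.
import numpy as np
from scipy.stats import norm

def wind_components(speed, wind_from_deg, flight_azimuth_deg):
    """Positive headwind opposes the flight direction; crosswind is signed."""
    rel = np.deg2rad(wind_from_deg - flight_azimuth_deg)
    return speed * np.cos(rel), speed * np.sin(rel)

h, c = wind_components(speed=15.0, wind_from_deg=350.0, flight_azimuth_deg=30.0)
print(f"headwind {h:.1f} m/s, crosswind {c:.1f} m/s")

# Percentiles of a component wind modelled as Gaussian (hypothetical monthly stats)
mean, sd = 2.0, 6.5                                   # m/s
for p in (0.00135, 0.05, 0.50, 0.95, 0.99865):
    print(f"{100*p:7.3f}th percentile: {norm.ppf(p, loc=mean, scale=sd):6.1f} m/s")
```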
Fold assessment for comparative protein structure modeling.
Melo, Francisco; Sali, Andrej
2007-11-01
Accurate and automated assessment of both geometrical errors and incompleteness of comparative protein structure models is necessary for an adequate use of the models. Here, we describe a composite score for discriminating between models with the correct and incorrect fold. To find an accurate composite score, we designed and applied a genetic algorithm method that searched for a most informative subset of 21 input model features as well as their optimized nonlinear transformation into the composite score. The 21 input features included various statistical potential scores, stereochemistry quality descriptors, sequence alignment scores, geometrical descriptors, and measures of protein packing. The optimized composite score was found to depend on (1) a statistical potential z-score for residue accessibilities and distances, (2) model compactness, and (3) percentage sequence identity of the alignment used to build the model. The accuracy of the composite score was compared with the accuracy of assessment by single and combined features as well as by other commonly used assessment methods. The testing set was representative of models produced by automated comparative modeling on a genomic scale. The composite score performed better than any other tested score in terms of the maximum correct classification rate (i.e., 3.3% false positives and 2.5% false negatives) as well as the sensitivity and specificity across the whole range of thresholds. The composite score was implemented in our program MODELLER-8 and was used to assess models in the MODBASE database that contains comparative models for domains in approximately 1.3 million protein sequences.
Identifying predictors of time-inhomogeneous viral evolutionary processes.
Bielejec, Filip; Baele, Guy; Rodrigo, Allen G; Suchard, Marc A; Lemey, Philippe
2016-07-01
Various factors determine the rate at which mutations are generated and fixed in viral genomes. Viral evolutionary rates may vary over the course of a single persistent infection and can reflect changes in replication rates and selective dynamics. Dedicated statistical inference approaches are required to understand how the complex interplay of these processes shapes the genetic diversity and divergence in viral populations. Although evolutionary models accommodating a high degree of complexity can now be formalized, adequately informing these models by potentially sparse data, and assessing the association of the resulting estimates with external predictors, remains a major challenge. In this article, we present a novel Bayesian evolutionary inference method, which integrates multiple potential predictors and tests their association with variation in the absolute rates of synonymous and non-synonymous substitutions along the evolutionary history. We consider clinical and virological measures as predictors, but also changes in population size trajectories that are simultaneously inferred using coalescent modelling. We demonstrate the potential of our method in an application to within-host HIV-1 sequence data sampled throughout the infection of multiple patients. While analyses of individual patient populations lack statistical power, we detect significant evidence for an abrupt drop in non-synonymous rates in late stage infection and a more gradual increase in synonymous rates over the course of infection in a joint analysis across all patients. The former is predicted by the immune relaxation hypothesis while the latter may be in line with increasing replicative fitness during the asymptomatic stage.
do Nascimento, Ticiano Gomes; de Jesus Oliveira, Eduardo; Basílio Júnior, Irinaldo Diniz; de Araújo-Júnior, João Xavier; Macêdo, Rui Oliveira
2013-01-25
A limited number of studies applying the Arrhenius equation to drugs and biopharmaceuticals in biological fluids at frozen temperatures have been reported. This paper describes stability studies of ampicillin and cephalexin in aqueous solution and human plasma applying the Arrhenius law for determination of adequate temperature and time of storage of these drugs using appropriate statistical analysis. Stability studies of the beta-lactams in human plasma were conducted at temperatures of 20°C, 2°C, -20°C and also during four cycles of freeze-thawing. Chromatographic separation was achieved using a Shimpak C18 column, acetonitrile as organic modifier and detection at 215 nm. LC-UV-MS/MS was used to demonstrate the conversion of ampicillin into two diastereomeric forms of ampicilloic acid. Stability studies demonstrated degradation greater than 10% for ampicillin in human plasma at 20°C, 2°C and -20°C after 15 h, 2.7 days and 11 days, and for cephalexin at the same temperatures after 14 h, 3.4 days and 19 days, respectively, and after the fourth cycle of freezing-thawing. The Arrhenius plot showed good prediction for the ideal temperature and time of storage for ampicillin (52 days) and cephalexin (151 days) at a temperature of -40°C, but statistical analysis (least squares method) must be applied to avoid incorrect extrapolations and estimated values outside uncertainty limits. Copyright © 2012 Elsevier B.V. All rights reserved.
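The Arrhenius extrapolation itself is a least-squares fit of ln(k) against 1/T followed by prediction at the storage temperature. The sketch below assumes first-order degradation and uses invented rate constants, so it does not reproduce the paper's 52- and 151-day figures; it only illustrates the calculation.

```python
# Arrhenius fit and shelf-life extrapolation (invented rate constants).
import numpy as np

T_celsius = np.array([20.0, 2.0, -20.0])
k_obs = np.array([0.17, 0.039, 0.0096])            # hypothetical first-order k, 1/day

T_kelvin = T_celsius + 273.15
slope, intercept = np.polyfit(1.0 / T_kelvin, np.log(k_obs), 1)   # ln k = ln A - Ea/(R*T)
Ea = -slope * 8.314                                               # activation energy, J/mol

T_store = -40.0 + 273.15
k_store = np.exp(intercept + slope / T_store)
t_10pct = np.log(100.0 / 90.0) / k_store           # days until 10% loss (first order)
print(f"Ea ~ {Ea/1000:.0f} kJ/mol, predicted shelf life at -40 C ~ {t_10pct:.0f} days")
```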
Reliability of unstable periodic orbit based control strategies in biological systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mishra, Nagender; Singh, Harinder P.; Hasse, Maria
2015-04-15
Presence of recurrent and statistically significant unstable periodic orbits (UPOs) in time series obtained from biological systems is now routinely used as evidence for low dimensional chaos. Extracting accurate dynamical information from the detected UPO trajectories is vital for successful control strategies that either aim to stabilize the system near the fixed point or steer the system away from the periodic orbits. A hybrid UPO detection method from return maps that combines topological recurrence criterion, matrix fit algorithm, and stringent criterion for fixed point location gives accurate and statistically significant UPOs even in the presence of significant noise. Geometry of the return map, frequency of UPOs visiting the same trajectory, length of the data set, strength of the noise, and degree of nonstationarity affect the efficacy of the proposed method. Results suggest that establishing determinism from unambiguous UPO detection is often possible in short data sets with significant noise, but derived dynamical properties are rarely accurate and adequate for controlling the dynamics around these UPOs. A repeat chaos control experiment on epileptic hippocampal slices through more stringent control strategy and adaptive UPO tracking is reinterpreted in this context through simulation of similar control experiments on an analogous but stochastic computer model of epileptic brain slices. Reproduction of equivalent results suggests that far more stringent criteria are needed for linking apparent success of control in such experiments with possible determinism in the underlying dynamics.
Álvarez-Díaz, N; Amador-García, I; Fuentes-Hernández, M; Dorta-Guerra, R
2015-01-01
To compare the ability of lung ultrasound and a clinical method in the confirmation of a selective bronchial intubation by left double-lumen tube in elective thoracic surgery. A prospective and blind, observational study was conducted in the setting of a university hospital operating room assigned for thoracic surgery. A single group of 105 consecutive patients, from a total of 130, was included. After blind intubation, the position of the tube was confirmed by clinical and ultrasound assessment. Finally, fiberoptic bronchoscopy was used as the reference standard to confirm the position of the tube. Under manual ventilation, by sequentially clamping the tracheal and bronchial limbs of the tube, clinical confirmation was made by auscultation, capnography, visualizing the chest wall expansion, and perceiving the lung compliance in the reservoir bag. Ultrasound confirmation was obtained by visualizing lung sliding, diaphragmatic movements, and the appearance of lung pulse sign. The sensitivity of the clinical method was 84.5%, with a specificity of 41.1%. The positive and negative likelihood ratio was 1.44 and 0.38, respectively. The sensitivity of the ultrasound method was 98.6%, specificity was 52.9%, with a positive likelihood ratio of 2.10 and a negative likelihood ratio of 0.03. Comparisons between the diagnostic performance of the 2 methods were calculated with McNemar's test. There was a significant difference in sensitivity between the ultrasound method and the clinical method (P=.002). Nevertheless, there was no statistically significant difference in specificity between both methods (P=.34). A p value <.01 was considered statistically significant. Lung ultrasound was superior to the clinical method in confirming the adequate position of the left double-lumen tube. On the other hand, in confirming the misplacement of the tube, differences between both methods could not be demonstrated. Copyright © 2014 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Published by Elsevier España, S.L.U. All rights reserved.
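The statistics reported above can be reproduced from 2x2 counts. The sketch below uses invented counts chosen only to roughly match the ultrasound figures, and applies the exact (binomial) form of McNemar's test to invented discordant-pair counts; it is illustrative, not the study's analysis.

```python
# Sensitivity, specificity, likelihood ratios, and McNemar's exact test.
import numpy as np
from scipy.stats import binomtest

def diagnostics(tp, fn, fp, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, spec, sens / (1 - spec), (1 - sens) / spec

sens, spec, lr_pos, lr_neg = diagnostics(tp=87, fn=1, fp=8, tn=9)   # hypothetical counts
print(f"sensitivity {sens:.3f}, specificity {spec:.3f}, LR+ {lr_pos:.2f}, LR- {lr_neg:.2f}")

# McNemar's exact test: b = pairs correct by ultrasound only, c = by clinical only
b, c = 13, 2
p_value = binomtest(b, n=b + c, p=0.5).pvalue
print(f"McNemar exact p = {p_value:.4f}")
```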
Time-lapse microscopy and image processing for stem cell research: modeling cell migration
NASA Astrophysics Data System (ADS)
Gustavsson, Tomas; Althoff, Karin; Degerman, Johan; Olsson, Torsten; Thoreson, Ann-Catrin; Thorlin, Thorleif; Eriksson, Peter
2003-05-01
This paper presents hardware and software procedures for automated cell tracking and migration modeling. A time-lapse microscopy system equipped with a computer controllable motorized stage was developed. The performance of this stage was improved by incorporating software algorithms for stage motion displacement compensation and auto focus. The microscope is suitable for in-vitro stem cell studies and allows for multiple cell culture image sequence acquisition. This enables comparative studies concerning rate of cell splits, average cell motion velocity, cell motion as a function of cell sample density and many more. Several cell segmentation procedures are described as well as a cell tracking algorithm. Statistical methods for describing cell migration patterns are presented. In particular, the Hidden Markov Model (HMM) was investigated. Results indicate that if the cell motion can be described as a non-stationary stochastic process, then the HMM can adequately model aspects of its dynamic behavior.
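A minimal, self-contained sketch of the HMM idea is given below: a two-state model ("resting" versus "migrating") with Gaussian emissions on observed cell step lengths, evaluated with the scaled forward algorithm. All parameters and observations are invented and this is not the authors' implementation.

```python
# Two-state HMM on cell step lengths, evaluated with the forward algorithm.
import numpy as np
from scipy.stats import norm

pi = np.array([0.6, 0.4])                      # initial state probabilities
A = np.array([[0.9, 0.1],                      # state transition matrix
              [0.2, 0.8]])
emit = [norm(loc=0.5, scale=0.3),              # step-length pdf, resting state
        norm(loc=3.0, scale=1.0)]              # step-length pdf, migrating state

steps = np.array([0.4, 0.6, 2.8, 3.5, 3.1, 0.7])   # observed step lengths (um/frame)

def forward_loglik(obs):
    """Log-likelihood of an observation sequence under the 2-state HMM."""
    alpha = pi * np.array([d.pdf(obs[0]) for d in emit])
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * np.array([d.pdf(o) for d in emit])
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()                    # rescale to avoid underflow
    return loglik

print(f"log-likelihood of track: {forward_loglik(steps):.2f}")
```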
Culicov, Otilia A; Zinicovscaia, Inga; Duliu, O G
2016-05-01
The moss-bag transplant technique was used to investigate the kinetics of the accumulation of 38 elements in Sphagnum girgensohni moss samples in the highly polluted municipality of Baia Mare, Romania. The moss samples collected from the unpolluted Vitosha Mountain Natural Reserve, Bulgaria, were analyzed after 1, 2, 3, and 4 months of exposure, respectively. The ANOVA method was used to assay the statistical significance of the observed changes in elemental content, as determined by neutron activation analysis. The content of Zn, Se, As, Ag, Cd, and Sb increased steadily, while that of physiologically active K and Cl, as well as Rb and Cs, decreased exponentially. The study showed that an adequate application of the moss transplant technique in an urban environment should consider the exposure time as a critical parameter, since particular elements are depleted in the moss at sites with high atmospheric loading of metals.
Rehm, Jürgen
2008-06-01
In summarizing the key themes and results of the second meeting of the German Addiction Research Network 'Understanding Addiction: Mediators and Moderators of Behaviour Change Process', the following concrete steps forward were laid out to improve knowledge. The steps included pleas to (1) redefine substance abuse disorders, especially redefine the concept of abuse and harmful use; (2) increase the use of longitudinal and life-course studies with more adequate statistical methods such as latent growth modelling; (3) empirically test more specific and theoretically derived common factors and mechanisms of behavioural change processes; (4) better exploit cross-regional and cross-cultural differences. Funding agencies are urged to support these developments by specifically supporting interdisciplinary research along the lines specified above. This may include improved forms of international funding of groups of researchers from different countries, where each national group conducts a specific part of an integrated proposal. 2008 John Wiley & Sons, Ltd
Gravel resources, urbanization, and future land use, Front Range Urban Corridor, Colorado
Soule, James M.; Fitch, Harold R.
1974-01-01
An assessment of gravel needs in Front Range Urban Corridor markets to 2000 A.D., based on forecast population increases and urbanization, indicates that adequate resources to meet anticipated needs are potentially available, if future land use does not preclude their extraction. Because of urban encroachment onto gravel-bearing lands, this basic construction material is in short supply nationally and in the Front Range Urban Corridor. Longer hauls, increased prices, and use of alternatives, especially crushed rock aggregate, have resulted. An analysis of possible sequential land uses following gravel mining indicates that a desirable use is for 'real estate' ponds and small lakes. A method for computing gravel reserves, based on planimeter measurement of area of resource-bearing lands and statistical analysis of reliability of thickness and size distribution data, was developed to compute reserves in individual markets. A discussion of the qualitative 'usability' of these reserves is then made for the individual markets.
Mode-dependent templates and scan order for H.264/AVC-based intra lossless coding.
Gu, Zhouye; Lin, Weisi; Lee, Bu-Sung; Lau, Chiew Tong; Sun, Ming-Ting
2012-09-01
In H.264/advanced video coding (AVC), lossless coding and lossy coding share the same entropy coding module. However, the entropy coders in the H.264/AVC standard were originally designed for lossy video coding and do not yield adequate performance for lossless video coding. In this paper, we analyze the problem with the current lossless coding scheme and propose a mode-dependent template (MD-template) based method for intra lossless coding. By exploring the statistical redundancy of the prediction residual in the H.264/AVC intra prediction modes, more zero coefficients are generated. By designing a new scan order for each MD-template, the scanned coefficient sequence fits the H.264/AVC entropy coders better. A fast implementation algorithm is also designed. With little increase in computation, experimental results confirm that the proposed fast algorithm achieves about 7.2% bit saving compared with the current H.264/AVC fidelity range extensions high profile.
Love, Margaret M; Pearce, Kevin A; Williamson, M Ann; Barron, Mary A; Shelton, Brent J
2006-01-01
The Cardiovascular Risk Education and Social Support (CaRESS) study is a randomized controlled trial that evaluates a social support intervention toward reducing cardiovascular risk in type 2 diabetic patients. It involves multiple community-based practice sites from the Kentucky Ambulatory Network (KAN), which is a regional primary care practice-based research network (PBRN). CaRESS also implements multiple modes of data collection. The purpose of this methods article is to share lessons learned that might be useful to others developing or implementing complex studies that consent patients in PBRNs. Key points include building long-term relationships with the clinicians, adaptability when integrating into practice sites, adequate funding to support consistent data management and statistical support during all phases of the study, and creativity and perseverance for recruiting patients and practices while maintaining the integrity of the protocol.
Magnitude and frequency of summer floods in western New Mexico and eastern Arizona
Kennon, F.W.
1955-01-01
Numerous small reservoirs and occasional water-spreading structures are being built on the ephemeral streams draining the public and Indian lands of the Southwest as part of the Soil and Moisture Conservation Program of the Bureau of Land Management and Bureau of Indian Affairs. Economic design of these structures requires some knowledge of the flood rates and volumes. Information concerning flood frequencies on areas less than 100 square miles is deficient throughout the country, particularly on intermittent streams of the Southwest. Design engineers require a knowledge of the frequency and magnitude of flood volumes for the planning of adequate reservoir capacities and a knowledge of frequency and magnitude of flood peaks for spillway design. Hence, this study deals with both flood volumes and peaks, the same statistical methods being used to develop frequency curves for each.
Resolving microstructures in Z pinches with intensity interferometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apruzese, J. P.; Kroupp, E.; Maron, Y.
2014-03-15
Nearly 60 years ago, Hanbury Brown and Twiss [R. Hanbury Brown and R. Q. Twiss, Nature 178, 1046 (1956)] succeeded in measuring the 30 nrad angular diameter of Sirius using a new type of interferometry that exploited the interference of photons independently emitted from different regions of the stellar disk. Its basis was the measurement of intensity correlations as a function of detector spacing, with no beam splitting or preservation of phase information needed. Applied to Z pinches, X pinches, or laser-produced plasmas, this method could potentially provide spatial resolution under one micron. A quantitative analysis based on the work of Purcell [E. M. Purcell, Nature 178, 1449 (1956)] reveals that obtaining adequate statistics from x-ray interferometry of a Z-pinch microstructure would require using the highest-current generators available. However, using visible light interferometry would reduce the needed photon count and could enable its use on sub-MA machines.
Measurement of surface microtopography
NASA Technical Reports Server (NTRS)
Wall, S. D.; Farr, T. G.; Muller, J.-P.; Lewis, P.; Leberl, F. W.
1991-01-01
Acquisition of ground truth data for use in microwave interaction modeling requires measurement of surface roughness sampled at intervals comparable to a fraction of the microwave wavelength and extensive enough to adequately represent the statistics of a surface unit. Sub-centimetric measurement accuracy is thus required over large areas, and existing techniques are usually inadequate. A technique is discussed for acquiring the necessary photogrammetric data using twin film cameras mounted on a helicopter. In an attempt to eliminate tedious data reduction, an automated technique was applied to the helicopter photographs, and results were compared to those produced by conventional stereogrammetry. Derived root-mean-square (RMS) roughness for the same stereo-pair was 7.5 cm for the automated technique versus 6.5 cm for the manual method. The principal source of error is probably due to vegetation in the scene, which affects the automated technique but is ignored by a human operator.
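The RMS roughness statistic derived from such profiles is straightforward to compute once the elevation data are detrended. The sketch below, on a synthetic profile, is illustrative only and is not the photogrammetric reduction used in the study.

```python
# RMS roughness of a detrended surface height profile.
import numpy as np

def rms_roughness(x, z):
    """RMS of heights z (same units as z) after removing a linear trend along x."""
    slope, intercept = np.polyfit(x, z, 1)
    residual = z - (slope * x + intercept)
    return np.sqrt(np.mean(residual ** 2))

x = np.linspace(0.0, 10.0, 200)                                    # metres along profile
z = 0.02 * x + 0.06 * np.random.default_rng(0).standard_normal(x.size)   # heights, metres
print(f"RMS roughness ~ {100 * rms_roughness(x, z):.1f} cm")
```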
GATA: A graphic alignment tool for comparative sequenceanalysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nix, David A.; Eisen, Michael B.
2005-01-01
Several problems exist with current methods used to align DNA sequences for comparative sequence analysis. Most dynamic programming algorithms assume that conserved sequence elements are collinear. This assumption appears valid when comparing orthologous protein coding sequences. Functional constraints on proteins provide strong selective pressure against sequence inversions, and minimize sequence duplications and feature shuffling. For non-coding sequences this collinearity assumption is often invalid. For example, enhancers contain clusters of transcription factor binding sites that change in number, orientation, and spacing during evolution yet the enhancer retains its activity. Dotplot analysis is often used to estimate non-coding sequence relatedness. Yet dot plots do not actually align sequences and thus cannot account well for base insertions or deletions. Moreover, they lack an adequate statistical framework for comparing sequence relatedness and are limited to pairwise comparisons. Lastly, dot plots and dynamic programming text outputs fail to provide an intuitive means for visualizing DNA alignments.
Statistical tests and measures for the presence and influence of digit preference
Jay Beaman; Grenier Michel
1998-01-01
Digit preference, which really reflects a preference for certain numbers, has often been described as the heaping or rounding of responses to numbers ending in zero or five. Number preference, NP, has been a topic in the social science literature for some years. However, until recently concepts were not specified rigorously enough to allow, for example, the estimation of...
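One common way to test for digit preference (not necessarily the statistic developed by the authors) is a chi-square goodness-of-fit test of terminal digits against a uniform distribution. The reported values below are invented.

```python
# Chi-square test for heaping of terminal digits (uniformity under H0).
import numpy as np
from scipy.stats import chisquare

reported = np.array([20, 25, 30, 35, 40, 45, 50, 52, 55, 60, 63, 65, 70, 75, 80])
last_digits = reported % 10
counts = np.bincount(last_digits, minlength=10)

stat, p = chisquare(counts)          # H0: each terminal digit equally likely
print("terminal digit counts:", counts)
print(f"chi-square = {stat:.2f}, p = {p:.4f}")
```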
ERIC Educational Resources Information Center
Bernard, Robert M.; Borokhovski, Eugene; Schmid, Richard F.; Tamim, Rana M.
2014-01-01
This article contains a second-order meta-analysis and an exploration of bias in the technology integration literature in higher education. Thirteen meta-analyses, dated from 2000 to 2014 were selected to be included based on the questions asked and the presence of adequate statistical information to conduct a quantitative synthesis. The weighted…
Forest statistics for Georgia, 1982
John E. Tansey
1983-01-01
Since the fourth inventory of the forest resources of Georgia in 1972, the area of commercial forest land decreased over 4 percent, or by almost 1.1 million acres. Commercial forests now cover approximately 23.7 million acres, 64 percent of the land area in the State. Nearly 5.1 million acres were harvested, while about 2.9 million acres were adequately regenerated...
Idaho Kids Count Data Book, 1994: Profiles of Child Well-Being.
ERIC Educational Resources Information Center
Idaho KIDS COUNT Project, Boise.
This Kids Count data book examines county and statewide trends of Idaho children's well-being. The statistical profile is based on 14 indicators: (1) children under age 18 in poverty; (2) children in single parent families; (3) births with adequate prenatal care; (4) infant mortality rate; (5) births to mothers age 10 to 19 without prenatal care;…
de Almeida, Geane Silva; de Oliveira, Iara Brandão
2018-03-07
This work applied the Water Quality Index developed by the Canadian Council of Ministers of the Environment (WQI-CCME) to communicate the water quality per section of the Joanes River basin, State of Bahia, Brazil. WQI-CCME is a statistical procedure that originally requires the execution of at least four monitoring campaigns per monitoring location and the measurement of at least four parameters. This paper presents a new aggregation method to calculate the WQI-CCME because applying the original method to the Joanes River would cause a large loss of information, as the number of analyzed parameters varied between the monitoring campaigns developed by the Government Monitoring Program. This work modified the original aggregation method, replacing it with data aggregation for a single monitoring campaign, over a minimum of four monitoring locations per section of the river and a minimum of four parameters per monitoring location. Comparison of the WQI-CCME calculated for river sections with the WQI-CETESB index, developed by the Brazilian Environmental Sanitation and Technology Company (CETESB), proved the applicability of the new aggregation method. The WQI-CETESB is based on the WQI from the National Sanitation Foundation and uses nine fixed parameters. As WQI-CCME uses the totality of the analyzed parameters without restrictions, it is more flexible, and the results seem more adequate to indicate the real river water quality. However, the WQI-CCME has a more stringent water quality scale in comparison with the WQI-CETESB, resulting in lower reported water quality. In conclusion, the WQI-CCME with a new aggregation method is adequate for communicating the water quality at a given time, per section of a river, respecting the minimum number of four analyses and four monitoring points. As a result, without a need to wait for other campaigns, it reduces the cost of a monitoring program and the period to communicate the water quality. The adequacy of the WQI-CCME was similar to the findings of others.
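For orientation, the CCME index that the new aggregation feeds can be sketched as below. This is a simplified reading of the published CCME formulation (scope F1, frequency F2, amplitude F3), and the guideline values and measurements are invented; the inputs are all tests pooled over at least four sites and four parameters for one campaign and one river section, as proposed above.

```python
# Simplified CCME WQI calculation on pooled tests for one campaign and river section.
import math

# (parameter, measured value, guideline, True if the value must not exceed the guideline)
tests = [
    ("DO", 6.5, 5.0, False), ("pH", 8.9, 8.5, True),
    ("turbidity", 12.0, 5.0, True), ("P_total", 0.02, 0.03, True),
    ("DO", 4.2, 5.0, False), ("pH", 7.8, 8.5, True),
    ("turbidity", 3.0, 5.0, True), ("P_total", 0.05, 0.03, True),
]

failed = [(v, obj, upper) for _, v, obj, upper in tests
          if (upper and v > obj) or (not upper and v < obj)]
failed_vars = {name for name, v, obj, upper in tests
               if (upper and v > obj) or (not upper and v < obj)}
n_vars = len({name for name, *_ in tests})

F1 = 100.0 * len(failed_vars) / n_vars                 # scope: % of variables failing
F2 = 100.0 * len(failed) / len(tests)                  # frequency: % of tests failing
excursions = [(v / obj - 1.0) if upper else (obj / v - 1.0) for v, obj, upper in failed]
nse = sum(excursions) / len(tests)                     # normalized sum of excursions
F3 = nse / (0.01 * nse + 0.01)                         # amplitude
wqi = 100.0 - math.sqrt(F1**2 + F2**2 + F3**2) / 1.732
print(f"F1={F1:.1f}  F2={F2:.1f}  F3={F3:.1f}  WQI={wqi:.1f}")
```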
Emery, Carolyn A; Cassidy, J David; Klassen, Terry P; Rosychuk, Rhonda J; Rowe, Brian B
2005-06-01
There is a need in sports medicine for a static and dynamic standing balance measure to quantify balance ability in adolescents. The purposes of this study were to determine the test-retest reliability of timed static (eyes open) and dynamic (eyes open and eyes closed) unipedal balance measurements and to examine factors associated with balance. Adolescents (n=123) were randomly selected from 10 Calgary high schools. This study used a repeated-measures design. One rater measured unipedal standing balance, including timed eyes-closed static (ECS), eyes-open dynamic (EOD), and eyes-closed dynamic (ECD) balance at baseline and 1 week later. Dynamic balance was measured on a foam surface. Reliability was examined using both intraclass correlation coefficients (ICCs) and Bland and Altman statistical techniques. Multiple linear regressions were used to examine other potentially influencing factors. Based on ICCs, test-retest reliability was adequate for ECS, EOD, and ECD balance (ICC=.69, .59, and .46, respectively). The results of Bland and Altman methods, however, suggest that caution is required in interpreting reliability based on ICCs alone. Although both ECS balance and ECD balance appear to demonstrate adequate test-retest reliability by ICC, Bland and Altman methods of agreement demonstrate sufficient reliability for ECD balance only. Thirty percent of the subjects reached the 180-second maximum on EOD balance, suggesting that this test is not appropriate for use in this population. Balance ability (ECS and ECD) was better in adolescents with no past history of lower-extremity injury. Timed ECD balance is an appropriate and reliable clinical measurement for use in adolescents and is influenced by previous injury.
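The two reliability analyses named above, an intraclass correlation coefficient and Bland-Altman limits of agreement, are sketched below for test-retest balance times. The paired timings are invented, and a one-way random-effects ICC(1,1) is used here purely for illustration; the study's exact ICC form is not specified in the abstract.

```python
# ICC(1,1) and Bland-Altman limits of agreement for test-retest data.
import numpy as np

test = np.array([32.0, 54.0, 18.0, 60.0, 25.0, 41.0, 12.0, 48.0])     # seconds, day 1
retest = np.array([28.0, 60.0, 22.0, 55.0, 30.0, 37.0, 15.0, 52.0])   # seconds, day 8

# ICC(1,1): between-subject vs within-subject mean squares from one-way ANOVA
scores = np.column_stack([test, retest])
n, k = scores.shape
grand = scores.mean()
ms_between = k * ((scores.mean(axis=1) - grand) ** 2).sum() / (n - 1)
ms_within = ((scores - scores.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Bland-Altman bias and 95% limits of agreement
diff = retest - test
loa = diff.mean() + np.array([-1.96, 1.96]) * diff.std(ddof=1)
print(f"ICC(1,1) = {icc:.2f}; bias = {diff.mean():.1f} s; 95% LoA = [{loa[0]:.1f}, {loa[1]:.1f}] s")
```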
Somogyi, O; Meskó, A; Csorba, L; Szabó, P; Zelkó, R
2017-08-30
The division of tablets and adequate methods of splitting them are a complex problem in all sectors of health care. Although tablet-splitting is often required, this procedure can be difficult for patients. Four tablets were investigated with different external features (shape, score-line, film-coat and size). The influencing effect of these features and the splitting methods was investigated according to the precision and "weight loss" of splitting techniques. All four types of tablets were halved by four methods: by hand, with a kitchen knife, with an original manufactured splitting device and with a modified tablet splitter based on a self-developed mechanical model. The mechanical parameters (hardness and friability) of the products were measured during the study. The "weight loss" and precision of splitting methods were determined and compared by statistical analysis. On the basis of the results, the external features (geometry), the mechanical parameters of tablets and the mechanical structure of splitting devices can influence the "weight loss" and precision of tablet-splitting. Accordingly, a new decision-making scheme was developed for the selection of splitting methods. In addition, the skills of patients and the specialties of therapy should be considered so that pharmaceutical counselling can be more effective regarding tablet-splitting. Copyright © 2017 Elsevier B.V. All rights reserved.
Preoperative diagnosis of orbital cavernous hemangioma: a 99mTc-RBC SPECT study.
Burroni, Luca; Borsari, Giulia; Pichierri, Patrizia; Polito, Ennio; Toscano, Olga; Grassetto, Gaia; Al-Nahhas, Adil; Rubello, Domenico; Vattimo, Angelo Giuseppe
2012-11-01
This study aimed to describe 99mTc-labeled RBC scintigraphy as a diagnostic method for orbital cavernous hemangiomas and to evaluate this diagnostic tool according to surgical outcomes. Fifty-five patients with clinical and radiological (US, CT, and/or MRI) suspicion of unilateral cavernous hemangioma of the orbit underwent 99mTc-RBC SPECT study. Qualitative and semiquantitative evaluations were performed, and results were statistically analyzed. SPECT images showed focal uptake in the orbital mass in 36 of 55 patients. Nineteen patients had a negative scintigraphic pattern, with concordance of early and late absence of uptake of 99mTc-RBC. Our procedure showed 100% sensitivity and 88.9% specificity for the diagnosis of orbital cavernous hemangioma, with a positive predictive value of 90.9% and a negative predictive value of 100%. 99mTc-RBC imaging is safe, easy to perform, and highly accurate in providing adequate clinical and surgical management. As a noninvasive and highly specific method for diagnosing orbital hemangioma, 99mTc-RBC scintigraphy can avoid more invasive imaging or biopsy.
Normalization methods in time series of platelet function assays
Van Poucke, Sven; Zhang, Zhongheng; Roest, Mark; Vukicevic, Milan; Beran, Maud; Lauwereins, Bart; Zheng, Ming-Hua; Henskens, Yvonne; Lancé, Marcus; Marcus, Abraham
2016-01-01
Platelet function can be quantitatively assessed by specific assays such as light-transmission aggregometry, multiple-electrode aggregometry measuring the response to adenosine diphosphate (ADP), arachidonic acid, collagen, and thrombin-receptor activating peptide and viscoelastic tests such as rotational thromboelastometry (ROTEM). The task of extracting meaningful statistical and clinical information from high-dimensional data spaces in temporal multivariate clinical data represented in multivariate time series is complex. Building insightful visualizations for multivariate time series demands adequate usage of normalization techniques. In this article, various methods for data normalization (z-transformation, range transformation, proportion transformation, and interquartile range) are presented and visualized discussing the most suited approach for platelet function data series. Normalization was calculated per assay (test) for all time points and per time point for all tests. Interquartile range, range transformation, and z-transformation demonstrated the correlation as calculated by the Spearman correlation test, when normalized per assay (test) for all time points. When normalizing per time point for all tests, no correlation could be abstracted from the charts as was the case when using all data as 1 dataset for normalization. PMID:27428217
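The four normalizations compared above can be written so that each is applied either per assay (over all time points) or per time point (over all assays) simply by choosing the axis. The data matrix below is invented and the functions are an illustrative sketch, not the authors' code.

```python
# z, range, proportion, and IQR normalizations for an assay-by-time-point matrix.
import numpy as np

def z_transform(x, axis):
    return (x - x.mean(axis=axis, keepdims=True)) / x.std(axis=axis, ddof=1, keepdims=True)

def range_transform(x, axis):
    lo, hi = x.min(axis=axis, keepdims=True), x.max(axis=axis, keepdims=True)
    return (x - lo) / (hi - lo)

def proportion_transform(x, axis):
    return x / x.sum(axis=axis, keepdims=True)

def iqr_transform(x, axis):
    q1, q2, q3 = np.percentile(x, [25, 50, 75], axis=axis, keepdims=True)
    return (x - q2) / (q3 - q1)

# rows = platelet function assays, columns = time points (hypothetical values)
data = np.array([[55.0, 48.0, 30.0, 42.0],
                 [1200., 900., 600., 1100.],
                 [65.0, 60.0, 40.0, 58.0]])
print(np.round(z_transform(data, axis=1), 2))     # per assay, across time points
print(np.round(iqr_transform(data, axis=0), 2))   # per time point, across assays
```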
Rational use and interpretation of urine drug testing in chronic opioid therapy.
Reisfield, Gary M; Salazar, Elaine; Bertholf, Roger L
2007-01-01
Urine drug testing (UDT) has become an essential feature of pain management, as physicians seek to verify adherence to prescribed opioid regimens and to detect the use of illicit or unauthorized licit drugs. Results of urine drug tests have important consequences in regard to therapeutic decisions and the trust between physician and patient. However, reliance on UDT to confirm adherence can be problematic if the results are not interpreted correctly, and evidence suggests that many physicians lack an adequate understanding of the complexities of UDT and the factors that can affect test results. These factors include metabolic conversion between drugs, genetic variations in drug metabolism, the sensitivity and specificity of the analytical method for a particular drug or metabolite, and the effects of intentional and unintentional interferants. In this review, we focus on the technical features and limitations of analytical methods used for detecting drugs or their metabolites in urine, the statistical constructs that are pertinent to ordering UDT and interpreting test results, and the application of these concepts to the clinical monitoring of patients maintained on chronic opioid therapy.
Congruency of scapula locking plates: implications for implant design.
Park, Andrew Y; DiStefano, James G; Nguyen, Thuc-Quyen; Buckley, Jenni M; Montgomery, William H; Grimsrud, Chris D
2012-04-01
We conducted a study to evaluate the congruency of fit of current scapular plate designs. Three-dimensional image-processing and -analysis software, and computed tomography scans of 12 cadaveric scapulae were used to generate 3 measurements: mean distance from plate to bone, maximum distance, and percentage of plate surface within 2 mm of bone. These measurements were used to quantify congruency. The scapular spine plate had the most congruent fit in all 3 measured variables. The lateral border and glenoid plates performed statistically as well as the scapular spine plate in at least 1 of the measured variables. The medial border plate had the least optimal measurements in all 3 variables. With locking-plate technology used in a wide variety of anatomical locations, the locking scapula plate system can allow for a fixed-angle construct in this region. Our study results showed that the scapular spine, glenoid, and lateral border plates are adequate in terms of congruency. However, design improvements may be necessary for the medial border plate. In addition, we describe a novel method for quantifying hardware congruency, a method that can be applied to any anatomical location.
Health and morbidity survey, Seychelles, 1956-57
Spitz, A. J. W.
1960-01-01
Adequate knowledge of existing health and morbidity conditions is the basis for all planning of future health services. For this reason, a health and morbidity survey of the population of the Seychelles was carried out in 1956-57 under the joint auspices of the Seychelles Government and the World Health Organization. Statistical sampling methods were used and the information was obtained by the household interview method. Health, morbidity and relevant demographic data were thus disclosed for the first time for Seychelles. Basic information was obtained on: general morbidity of the population, including dental and nutritional status, malnutrition, incidence of intestinal diseases and other easily diagnosable conditions; growth and weight curves of children up to the age of 16; haemoglobin levels; erythrocyte sedimentation rates; general living conditions such as housing and overcrowding, social status and latrine arrangements; the connexion of soil pollution with the incidence of amoebiasis and helminthiasis; and lastly, the incidence of the sickle cell trait, eosinophilia and positive serological reactions to the Chediak test (for manifest or latent syphilis). The findings are presented with a minimum of remarks and interpretation. PMID:13833401
NASA Astrophysics Data System (ADS)
Kivalov, Sergey N.; Fitzjarrald, David R.
2018-02-01
Cloud shadows lead to alternating light and dark periods at the surface, with the most abrupt changes occurring in the presence of low-level forced cumulus clouds. We examine multiyear irradiance time series observed at a research tower in a midlatitude mixed deciduous forest (Harvard Forest, Massachusetts, USA: 42.53°N, 72.17°W) and one made at a similar tower in a tropical rain forest (Tapajós National Forest, Pará, Brazil: 2.86°S, 54.96°W). We link the durations of these periods statistically to conventional meteorological reports of sky type and cloud height at the two forests and present a method to synthesize the surface irradiance time series from sky-type information. Four classes of events describing distinct sequential irradiance changes at the transition between cloud shadow and direct sunlight are identified: sharp-to-sharp, slow-to-slow, sharp-to-slow, and slow-to-sharp. Lognormal and Weibull statistical distributions distinguish among cloudy-sky types. Observers' qualitative reports of 'scattered' and 'broken' clouds are quantitatively distinguished by a threshold value of the ratio of mean clear to cloudy period durations. Generated synthetic time series based on these statistics adequately simulate the temporal "radiative forcing" linked to sky type. Our results offer a quantitative way to connect the conventional meteorological sky type to the time series of irradiance experienced at the surface.
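The synthesis idea can be sketched as drawing alternating clear and shadowed durations from fitted distributions and building a two-level irradiance series. In the sketch below the assignment of the Weibull distribution to shadowed periods and the lognormal to clear periods, and all parameter values, are assumptions for illustration only.

```python
# Synthetic two-level irradiance series from alternating duration draws.
import numpy as np

rng = np.random.default_rng(1)
dt, length_s = 1.0, 3600.0                   # 1-s samples over one hour
irr_clear, irr_shadow = 850.0, 250.0         # W m-2 in direct sun / cloud shadow

t, series, clear = 0.0, [], True
while t < length_s:
    dur = rng.lognormal(mean=4.0, sigma=0.8) if clear else 60.0 * rng.weibull(1.3)
    n = max(1, int(dur / dt))
    series.extend([irr_clear if clear else irr_shadow] * n)
    t += n * dt
    clear = not clear

irradiance = np.array(series[: int(length_s / dt)])
print(f"mean irradiance {irradiance.mean():.0f} W m-2, "
      f"fraction of time in shadow {np.mean(irradiance == irr_shadow):.2f}")
```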
A Bifactor Approach to Model Multifaceted Constructs in Statistical Mediation Analysis.
Gonzalez, Oscar; MacKinnon, David P
Statistical mediation analysis allows researchers to identify the most important mediating constructs in the causal process studied. Identifying specific mediators is especially relevant when the hypothesized mediating construct consists of multiple related facets. The general definition of the construct and its facets might relate differently to an outcome. However, current methods do not allow researchers to study the relationships between general and specific aspects of a construct to an outcome simultaneously. This study proposes a bifactor measurement model for the mediating construct as a way to parse variance and represent the general aspect and specific facets of a construct simultaneously. Monte Carlo simulation results are presented to help determine the properties of mediated effect estimation when the mediator has a bifactor structure and a specific facet of a construct is the true mediator. This study also investigates the conditions when researchers can detect the mediated effect when the multidimensionality of the mediator is ignored and treated as unidimensional. Simulation results indicated that the mediation model with a bifactor mediator measurement model yielded unbiased estimates and adequate power to detect the mediated effect with a sample size greater than 500 and medium a- and b-paths. Also, results indicate that parameter bias and detection of the mediated effect in both the data-generating model and the misspecified model vary as a function of the amount of facet variance represented in the mediation model. This study contributes to the largely unexplored area of measurement issues in statistical mediation analysis.
Analysis of entropy extraction efficiencies in random number generation systems
NASA Astrophysics Data System (ADS)
Wang, Chao; Wang, Shuang; Chen, Wei; Yin, Zhen-Qiang; Han, Zheng-Fu
2016-05-01
Random numbers (RNs) have applications in many areas: lottery games, gambling, computer simulation, and, most importantly, cryptography [N. Gisin et al., Rev. Mod. Phys. 74 (2002) 145]. In cryptography theory, the theoretical security of the system calls for high quality RNs. Therefore, developing methods for producing unpredictable RNs with adequate speed is an attractive topic. Early on, despite the lack of theoretical support, pseudo RNs generated by algorithmic methods performed well and satisfied reasonable statistical requirements. However, as implemented, those pseudorandom sequences were completely determined by mathematical formulas and initial seeds, which cannot introduce extra entropy or information. In these cases, “random” bits are generated that are not at all random. Physical random number generators (RNGs), which, in contrast to algorithmic methods, are based on unpredictable physical random phenomena, have attracted considerable research interest. However, the way that we extract random bits from those physical entropy sources has a large influence on the efficiency and performance of the system. In this manuscript, we will review and discuss several randomness extraction schemes that are based on radiation or photon arrival times. We analyze the robustness, post-processing requirements and, in particular, the extraction efficiency of those methods to aid in the construction of efficient, compact and robust physical RNG systems.
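One arrival-time extraction scheme of the kind reviewed above can be illustrated with a von Neumann-style debiasing of comparisons between successive inter-arrival intervals. The sketch below simulates arrivals as a Poisson process and is an illustration of the idea, not any specific published extractor; its printed "extraction efficiency" is simply the fraction of raw comparisons that survive debiasing.

```python
# Von Neumann debiasing of interval-comparison bits from simulated photon arrivals.
import numpy as np

rng = np.random.default_rng(7)
arrivals = np.cumsum(rng.exponential(scale=1.0, size=20000))   # photon arrival times
intervals = np.diff(arrivals)

# Raw bits from non-overlapping interval pairs: bit = 1 if the second interval is longer
m = len(intervals) // 2
raw = (intervals[0:2*m:2] < intervals[1:2*m:2]).astype(np.uint8)

# Von Neumann debiasing: take non-overlapping bit pairs, keep (0,1) -> 0 and (1,0) -> 1
pairs = raw[: len(raw) // 2 * 2].reshape(-1, 2)
keep = pairs[:, 0] != pairs[:, 1]
bits = pairs[keep, 0]

print(f"{len(raw)} raw comparisons -> {len(bits)} debiased bits "
      f"(extraction efficiency {len(bits) / len(raw):.2f}); mean = {bits.mean():.3f}")
```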
Reverse control for humanoid robot task recognition.
Hak, Sovannara; Mansard, Nicolas; Stasse, Olivier; Laumond, Jean Paul
2012-12-01
Efficient methods to perform motion recognition have been developed using statistical tools. Those methods rely on primitive learning in a suitable space, for example, the latent space of the joint angle and/or adequate task spaces. Learned primitives are often sequential: A motion is segmented according to the time axis. When working with a humanoid robot, a motion can be decomposed into parallel subtasks. For example, in a waiter scenario, the robot has to keep some plates horizontal with one of its arms while placing a plate on the table with its free hand. Recognition can thus not be limited to one task per consecutive segment of time. The method presented in this paper takes advantage of the knowledge of what tasks the robot is able to do and how the motion is generated from this set of known controllers, to perform a reverse engineering of an observed motion. This analysis is intended to recognize parallel tasks that have been used to generate a motion. The method relies on the task-function formalism and the projection operation into the null space of a task to decouple the controllers. The approach is successfully applied on a real robot to disambiguate motion in different scenarios where two motions look similar but have different purposes.
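The task-function composition that the method inverts can be shown in a toy numerical sketch: a primary task is executed through the Jacobian pseudoinverse and a secondary task is projected into its null space, so it cannot disturb the primary one. The Jacobians and task errors below are random stand-ins (for example, "keep the tray level" and "place the plate"); this is not the authors' controller.

```python
# Prioritized two-task resolution with a null-space projector.
import numpy as np

rng = np.random.default_rng(0)
n_dof = 7
J1, e1 = rng.standard_normal((3, n_dof)), np.array([0.02, -0.01, 0.03])  # primary task
J2, e2 = rng.standard_normal((3, n_dof)), np.array([0.10, 0.00, -0.05])  # secondary task

J1_pinv = np.linalg.pinv(J1)
P1 = np.eye(n_dof) - J1_pinv @ J1                 # projector onto null space of task 1
qdot = J1_pinv @ e1 + P1 @ (np.linalg.pinv(J2 @ P1) @ (e2 - J2 @ J1_pinv @ e1))

print("primary task rate reproduced exactly:", np.allclose(J1 @ qdot, e1))
```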
Haile, Zelalem T; Teweldeberhan, Asli K; Chertok, Ilana R A
2016-01-01
Studies that explored women's knowledge on mother-to-child transmission (MTCT) of HIV and its prevention (PMTCT) in the general population are currently lacking. This paper examined factors associated with having adequate knowledge of MTCT of HIV and PMTCT among a nationally representative sample of women in Tanzania. We conducted a cross-sectional analysis including 10,299 women from the 2011-2012 Tanzania HIV/AIDS and Malaria Indicator Survey. The outcome of interest was the presence of adequate knowledge on MTCT and PMTCT of HIV. We used multivariable logistic regression to identify factors associated with having adequate knowledge on MTCT and PMTCT of HIV. Results revealed that the overall prevalence of having adequate knowledge on MTCT and PMTCT of HIV was low (46%). We found a statistically significant difference in the proportions of having adequate knowledge between HIV-negative and HIV-positive women (45% vs. 56%; p < .0001), although knowledge of the transplacental route of transmission did not differ by HIV serostatus. Overall, having adequate knowledge on MTCT and PMTCT of HIV was positively associated with experiencing at least one pregnancy, having some education, having higher household wealth, residing in urban area, being exposed to HIV education, having tested for HIV, knowing a place to get HIV test, and having comprehensive knowledge on HIV and AIDS. Among HIV-seropositive women, experiencing at least one pregnancy and having comprehensive knowledge on HIV and AIDS were strongly associated with having adequate knowledge on MTCT and PMTCT of HIV (Adjusted odds ratio: aOR 2.78, 95% CI 1.21, 6.37 and aOR 1.71, 95% CI 1.15, 2.73, respectively). Further efforts are needed to enhance HIV/AIDS education among women of childbearing age and strengthen PMTCT services in Tanzania.
Adolescents: Contraceptive Knowledge and Use, a Brazilian Study
Correia, Divanise S.; Pontes, Ana C. P.; Cavalcante, Jairo C.; Egito, E. Sócrates T.; Maia, Eulália M.C.
2009-01-01
The purpose of this study was to identify the knowledge and use of contraceptive methods by female adolescent students. The study was cross-sectional and quantitative, using a semi-structured questionnaire that was administered to 12- to 19-year-old female students in Maceió, Brazil. A representative and randomized sample was calculated, taking into account the number of hospital admissions for curettage. This study was approved by the Human Research Ethics Committee, and Epi Info™ software was used for data and result evaluation using the mean and chi-square statistical test. Our results show that the majority of students know of some contraceptive methods (95.5%), with the barrier/hormonal methods being the most mentioned (72.4%). Abortion and aborting drugs were inaccurately described as contraceptives, and 37.9% of the sexually active girls did not make use of any method. The barrier methods were the most used (35.85%). A significant association was found in the total sample (2,592) between pregnancy and the use of any contraceptive method. This association was not found, however, in the group having an active sexual life (559). The study points to a knowledge of contraceptive methods, especially by teenagers who have already been pregnant, but contraceptives were not adequately used. The low use of chemical methods of contraception brings the risk of pregnancy. Since abortion and aborting drugs were incorrectly cited as contraceptive methods, this implies a nonpreventive attitude towards pregnancy. PMID:19151897
SDN solutions for switching dedicated long-haul connections: Measurements and comparative analysis
Rao, Nageswara S. V.
2016-01-01
We consider a scenario of two sites connected over a dedicated, long-haul connection that must quickly fail-over in response to degradations in host-to-host application performance. The traditional layer-2/3 hot stand-by fail-over solutions do not adequately address the variety of application degradations, and more recent single controller Software Defined Networks (SDN) solutions are not effective for long-haul connections. We present two methods for such a path fail-over using OpenFlow enabled switches: (a) a light-weight method that utilizes host scripts to monitor application performance and dpctl API for switching, and (b) a generic method that uses two OpenDaylight (ODL) controllers and REST interfaces. For both methods, the restoration dynamics of applications contain significant statistical variations due to the complexities of controllers, north bound interfaces and switches; they, together with the wide variety of vendor implementations, complicate the choice among such solutions. We develop the impulse-response method based on regression functions of performance parameters to provide a rigorous and objective comparison of different solutions. We describe testing results of the two proposed methods, using TCP throughput and connection rtt as main parameters, over a testbed consisting of HP and Cisco switches connected over long-haul connections emulated in hardware by ANUE devices. Lastly, the combination of analytical and experimental results demonstrate that the dpctl method responds seconds faster than the ODL method on average, even though both methods eventually restore original TCP throughput.
Machtiger, N A; Fischler, G E; Adams, M C; Spielmaker, R; Graf, J F
2001-01-01
A collaborative study was conducted to test a method developed to distinguish between adequately and inadequately preserved cosmetic formulations. Nineteen laboratories participated in the study. Samples tested included shampoos, hair conditioners, oil-in-water emulsions, and water-in-oil-emulsions. Triplicate samples of 4 adequately preserved and 4 inadequately preserved cosmetic products were tested by each collaborative laboratory. Results showed that all inadequately preserved shampoo and conditioner samples failed to meet the acceptance criteria for adequately preserved formulations. Of the 51 preserved samples, 49 shampoos and 48 conditioners met the criteria for adequate preservation. All samples of inadequately preserved water-in-oil emulsions and oil-in-water emulsions failed to meet the acceptance criteria, whereas all adequately preserved emulsion formulations met the acceptance criteria.
A simple statistical model for geomagnetic reversals
NASA Technical Reports Server (NTRS)
Constable, Catherine
1990-01-01
The diversity of paleomagnetic records of geomagnetic reversals now available indicate that the field configuration during transitions cannot be adequately described by simple zonal or standing field models. A new model described here is based on statistical properties inferred from the present field and is capable of simulating field transitions like those observed. Some insight is obtained into what one can hope to learn from paleomagnetic records. In particular, it is crucial that the effects of smoothing in the remanence acquisition process be separated from true geomagnetic field behavior. This might enable us to determine the time constants associated with the dominant field configuration during a reversal.
Common inputs in subthreshold membrane potential: The role of quiescent states in neuronal activity
NASA Astrophysics Data System (ADS)
Montangie, Lisandro; Montani, Fernando
2018-06-01
Experiments in certain regions of the cerebral cortex suggest that the spiking activity of neuronal populations is regulated by common non-Gaussian inputs across neurons. We model these deviations from random-walk processes with q-Gaussian distributions into simple threshold neurons, and investigate the scaling properties in large neural populations. We show that deviations from the Gaussian statistics provide a natural framework to regulate population statistics such as sparsity, entropy, and specific heat. This type of description allows us to provide an adequate strategy to explain the information encoding in the case of low neuronal activity and its possible implications on information transmission.
NASA Astrophysics Data System (ADS)
di Luca, Alejandro; de Elía, Ramón; Laprise, René
2012-03-01
Regional Climate Models (RCMs) constitute the most often used method to perform affordable high-resolution regional climate simulations. The key issue in the evaluation of nested regional models is to determine whether RCM simulations improve the representation of climatic statistics compared to the driving data, that is, whether RCMs add value. In this study we examine a necessary condition that some climate statistics derived from the precipitation field must satisfy in order that the RCM technique can generate some added value: we focus on whether the climate statistics of interest contain some fine spatial-scale variability that would be absent on a coarser grid. The presence and magnitude of fine-scale precipitation variance required to adequately describe a given climate statistics will then be used to quantify the potential added value (PAV) of RCMs. Our results show that the PAV of RCMs is much higher for short temporal scales (e.g., 3-hourly data) than for long temporal scales (16-day average data) due to the filtering resulting from the time-averaging process. PAV is higher in warm season compared to cold season due to the higher proportion of precipitation falling from small-scale weather systems in the warm season. In regions of complex topography, the orographic forcing induces an extra component of PAV, no matter the season or the temporal scale considered. The PAV is also estimated using high-resolution datasets based on observations allowing the evaluation of the sensitivity of changing resolution in the real climate system. The results show that RCMs tend to reproduce relatively well the PAV compared to observations although showing an overestimation of the PAV in warm season and mountainous regions.
NASA Astrophysics Data System (ADS)
Feyen, Luc; Caers, Jef
2006-06-01
In this work, we address the problem of characterizing the heterogeneity and uncertainty of hydraulic properties for complex geological settings. Hereby, we distinguish between two scales of heterogeneity, namely the hydrofacies structure and the intrafacies variability of the hydraulic properties. We employ multiple-point geostatistics to characterize the hydrofacies architecture. The multiple-point statistics are borrowed from a training image that is designed to reflect the prior geological conceptualization. The intrafacies variability of the hydraulic properties is represented using conventional two-point correlation methods, more precisely, spatial covariance models under a multi-Gaussian spatial law. We address the different levels and sources of uncertainty in characterizing the subsurface heterogeneity, and explore their effect on groundwater flow and transport predictions. Typically, uncertainty is assessed by way of many images, termed realizations, of a fixed statistical model. However, in many cases, sampling from a fixed stochastic model does not adequately represent the space of uncertainty. It neglects the uncertainty related to the selection of the stochastic model and the estimation of its input parameters. We acknowledge the uncertainty inherent in the definition of the prior conceptual model of aquifer architecture and in the estimation of global statistics, anisotropy, and correlation scales. Spatial bootstrap is used to assess the uncertainty of the unknown statistical parameters. As an illustrative example, we employ a synthetic field that represents a fluvial setting consisting of an interconnected network of channel sands embedded within finer-grained floodplain material. For this highly non-stationary setting we quantify the groundwater flow and transport model prediction uncertainty for various levels of hydrogeological uncertainty. Results indicate the importance of accurately describing the facies geometry, especially for transport predictions.
Sghaireen, Mohd G; Albhiran, Heyam Mobark; Alzoubi, Ibrahim A; Lynch, Edward; Al-Omiri, Mahmoud K
2015-01-01
This study aimed to clinically quantify the apicoincisal height of the upper interproximal areas directly in patients' mouths compared to measurements on stone models. One hundred and fifty participants (75 females and 75 males, age range 20-45 years) were recruited for this study. A digital caliper was used to measure the anterior maxillary interproximal contact areas directly in patients' mouths and on stone models. The digital caliper accuracy was up to 0.01 mm. The Statistical Package for Social Sciences software (SPSS, version 19.0, Chicago, Ill., USA) was used for statistical analysis. Statistical significance was based on probability values <0.05. The intraoral measurement of proximal contacts as well as the measurement on stone models showed that the dimensions of interproximal contacts on both sides of each tooth were significantly different (p < 0.001) and that the dimension of the mesial contact point was larger than that of the distal contact point of each tooth. The largest contact point was the one between the central incisors (direct intraoral measurement = 2.9-6.49 mm; model measurement = 3.31-6.91 mm). On the other hand, the contact point between the canine and first premolar was the smallest on both sides of the arch (0.63-2.52 mm intraorally, 0.98-2.88 mm on models). The intraoral measurement of contact points was more accurate than model measurements, and the differences were statistically significant (p < 0.001). The clinical evaluation of contact point dimensions using a digital caliper was more precise than measuring contact points on stone models; hence, it is a viable, quick and adequate method to be used routinely. © 2015 S. Karger AG, Basel.
Family planning in conflict: results of cross-sectional baseline surveys in three African countries
2011-01-01
Background Despite the serious consequences of conflict for reproductive health, populations affected by conflict and its aftermath face tremendous barriers to accessing reproductive health services, due to insecurity, inadequate numbers of trained personnel and lack of supplies. Family planning is often particularly neglected. Methods In six conflict-affected areas in Sudan, northern Uganda and the Democratic Republic of Congo, household surveys of married or in-union women of reproductive age were conducted to determine baseline measures of family planning knowledge, attitudes and behaviors regarding contraception. Health facility assessments were carried out to assess baseline measures of family planning services availability. Data were double-entered into CSPro 3.2 and exported to SAS 9.2, which was used to calculate descriptive statistics. The studies' purposes were to guide program activities and to serve as a baseline against which program accomplishments could be measured. Results Knowledge of modern contraceptive methods was low relative to other sub-Saharan African countries, and use of modern methods was under 4% in four sites; in two sites with prior family planning services it was 12% and 16.2%. From 30% to 40% of women reported they did not want a child within two years, however, and an additional 12% to 35% wanted no additional children, suggesting a clear need for family planning services. The health facilities assessment showed that at most only one-third of the facilities mandated to provide family planning had the necessary staff, equipment and supplies to do so adequately; in some areas, none of the facilities were prepared to offer such services. Conclusions Family planning services are desired by women living in crisis situations when offered in a manner appropriate to their needs, yet services are rarely adequate to meet these needs. Refugee and internally displaced women must be included in national and donors' plans to improve family planning in Africa. PMID:21752241
NASA Astrophysics Data System (ADS)
Pregowski, Piotr; Owadowska, Edyta; Pietrzak, Jan; Zwolenik, Slawomir
2005-09-01
This paper presents a method for acquiring a new form of statistical information about changes in a scene monitored by a thermal imaging camera in a static configuration. Imagers of this type are uniquely effective for nighttime surveillance and targeting. The technical problem we solved arose from the question of how to verify the hypothesis that small nocturnal rodents, such as bank voles, use common paths within their range and that these paths form a shared, relatively stable system. Such research is especially difficult because these mammals are secretive, move at varying speeds and, owing to their low contrast against natural surroundings such as leaves or grass, are nearly impossible to observe by other means from a distance of a few meters. The main advantages of the method proved to be adequately filtered long thermal movies suitable for manual analysis, as well as the automatic creation of synthetic images that map otherwise invisible paths and the intensity of their use. An additional log file describing objects and their displacements, saved as ".txt" files, allows more detailed studies of animal behavior. The results showed that this method offers a new, non-invasive, powerful and dynamic approach to a variety of ecological problems. Networks of uncooled thermal imagers - now far more widely available - transmitting data to digital processing centers make it possible to investigate moving, particularly heat-generating, objects in complete darkness far more broadly and efficiently than before. Thus, although our system was developed for ecological studies, a similar one could serve selected tasks in optical security applications.
Cognitive Correlates of Inadequate Response to Reading Intervention
Fletcher, Jack M.; Stuebing, Karla K.; Barth, Amy E.; Denton, Carolyn A.; Cirino, Paul T.; Francis, David J.; Vaughn, Sharon
2012-01-01
The cognitive attributes of Grade 1 students who responded adequately and inadequately to a Tier 2 reading intervention were evaluated. The groups included inadequate responders based on decoding and fluency criteria (n = 29), only fluency criteria (n = 75), adequate responders (n = 85), and typically achieving students (n = 69). The cognitive measures included assessments of phonological awareness, rapid letter naming, oral language skills, processing speed, vocabulary, and nonverbal problem solving. Comparisons of all four groups identified phonological awareness as the most significant contributor to group differentiation. Measures of rapid letter naming, syntactic comprehension/working memory, and vocabulary also contributed uniquely to some comparisons of adequate and inadequate responders. In a series of regression analyses designed to evaluate the contributions of responder status to cognitive skills independently of variability in reading skills, only the model for rapid letter naming achieved statistical significance, accounting for a small (1%) increment in explained variance beyond that explained by models based only on reading levels. Altogether, these results do not suggest qualitative differences among the groups, but are consistent with a continuum of severity associated with the level of reading skills across the four groups. PMID:23125475
Mahdavi, Abdollah; Sedghi, Shahram; Sadoghi, Farahnaz; Azar, Farbod Ebadi Fard
2015-01-01
Introduction: In the death registration system, the issuance of death certificates, as a binding rule, is one of the major prerequisites for the preparation of death statistics. In order to prepare death statistics that are adequately valid for subsequent applications, it is necessary to properly code death certificates and to fully follow the rules on underlying causes of death. This study aimed to assess the awareness and performance of agents involved in the issuance of death certificates in the national death records system. Methods: It was a descriptive cross-sectional study, performed from September 2013 to March 2014 on 96 agents involved in the issuance of death certificates at the Imam Khomeini, Alavi, Fatemi and BuAli education and treatment centers of Ardebil University of Medical Sciences. The population included faculty staff physicians, residents and health information management staff. The research instrument was a researcher-made questionnaire covering demographic information as well as participants' awareness and performance regarding death certificate coding rules. Research data were analyzed using descriptive statistics and the chi-square test in the SPSS software at a confidence level of 95%. Findings: A total of 34.42% of participants were aware of the general rules on issuance of death certificates, while faculty staff higher specialists (41.67%) and clinical coders (38.34%) with five years of experience demonstrated the highest awareness levels. Only 23 participants (24.6%) were trained to issue death certificates. A total of 76 participants (79.3%) announced their need for learning how to complete death certificate forms on a constant basis. The awareness of participants about the general principle was assessed to be low (30.25%). Moreover, their awareness of selection rules and modification rules was low (27.75%) and moderate (45.25%), respectively. The chi-square test revealed a significant relationship between work experience and awareness of participants about coding rules (P=0.001), but no significant relationship was observed between education and awareness of coding rules (P=0.497). Conclusion: The awareness of participants about rules on coding death causes and their performance in this field was not satisfactory. That is to say, the awareness of faculty staff and health information management staff was unexpectedly low. Seemingly, lack of adequate training is an international issue that causes mistakes in the recording of information on mortality. Hence, a short-term solution is to train faculty staff and residents and also revise the training provided to health information management staff. As a long-term solution, it is possible to provide related courses to general practitioner students. PMID:26156914
ERIC Educational Resources Information Center
Holtzapple, Carol K.
2011-01-01
Character education programs support the development of positive character traits in children and adults. Effective violence prevention programs improve pro-social competencies and reduce negative behaviors in students by enhancing protective factors (strong bonds with teachers; clear rules of conduct that are consistently enforced) and targeting…
Azodo, Clement Chinedu; Umoh, Agnes O
2015-09-15
The few existing studies on herpes labialis among health care workers have been predominantly among non-dental health care workers. The purpose of this study was to determine Nigerian dental health care providers' knowledge of, attitudes toward, preventive behaviors for, and refusal to treat patients with herpes labialis. This cross-sectional study was conducted among final-year dental students at the University of Benin, dental house officers, and residents at the University of Benin Teaching Hospital, Benin City, Nigeria. Data collection was via a self-administered questionnaire. Bivariate statistics and logistic regression were used to relate the dependent and independent variables. Of the 120 questionnaires distributed, 110 were completed and returned, giving a 91.7% retrieval rate. However, 15 of the returned questionnaires were discarded because they were improperly completed, leaving a total of 95 questionnaires for final analysis in this study. The majority of participants were over 28 years old (54.7%), male (67.4%), unmarried (66.3%), and postgraduate dental health care providers (51.6%). Less than half (43.2%) of participants demonstrated adequate overall knowledge of herpes labialis. About one-tenth (10.5%) and more than three-quarters (87.4%) of participants reported a positive attitude and performance of adequate preventive behaviors, respectively. A total of 16.8% of participants reported a high tendency to refuse treatment to patients with herpes labialis. Although not statistically significant, young, unmarried, male undergraduate participants reported a greater likelihood to refuse treatment to herpes labialis patients. We found a statistically significant positive correlation between attitude and refusal to treat patients with herpes labialis. However, marital status and the attitude of participants toward these patients emerged as the determinants for refusal to treat patients with herpes labialis. Data from this study revealed a high level of inadequate knowledge, negative attitudes, and reasonably adequate preventive behaviors with respect to herpes labialis. One out of every six dental health care workers studied reported having refused to treat patients with herpes labialis. Unmarried dental health care providers and those with negative attitudes toward herpes labialis patients were more prone to refuse treatment to these patients.
Bernardo, L M; Gardner, M J; Lucke, J; Ford, H
2001-04-01
Injured children are at risk for thermoregulatory compromise, where temperature maintenance mechanisms are overwhelmed by severe injury, environmental exposure, and resuscitation measures. Adequate thermoregulation can be maintained, and heat loss can be prevented, by core (administration of warmed intravenous fluid) and peripheral (application of convective air warming) methods. It is not known which warming method is better to maintain thermoregulation and prevent heat loss in injured children during their trauma resuscitations. The purpose of this feasibility study was to compare the effects of core and peripheral warming measures on body temperature and physiologic changes in a small sample of injured children during their initial emergency department (ED) treatment. A prospective, randomized experimental design was used. Eight injured children aged 3 to 14 years (mean = 6.87, SD = 3.44) treated in the ED of Children's Hospital of Pittsburgh were enrolled. Physiologic responses (eg, heart rate, blood pressure, respiratory rate, arterial oxygen saturation, core and peripheral temperatures) and level of consciousness were continuously measured and recorded every 5 minutes to detect early thermoregulatory compromise and to determine the child's response to warming. Data were collected throughout the resuscitation period, including transport to CT scan, the inpatient nursing unit, intensive care unit, operating room or discharge to home. Core warming was accomplished with the Hotline Fluid Warmer (Sims Level 1, Inc., Rockland, MA), and peripheral warming was accomplished with the Snuggle Warm Convective Warming System (Sims Level 1, Inc., Rockland, MA). Data were analyzed using descriptive and inferential statistics. There were no statistically significant differences between the two groups on age (t = -0.485, P = 0.645); weight (t = -0.005, P = 0.996); amount of prehospital intravenous (IV) fluid (t = 0.314, P = 0.766); temperature on ED arrival (t = 0.287, P = 0.784); total amount of infused IV fluid (t = -0.21, P = 0.8); and length of time from ED admission to hospital admission (t = -0.613, P = 0.56). There were no statistically significant differences between the two groups on RTS (t = -0.516, P = 0.633). When comparing the mean differences in temperature upon hospital admission, no statistically significant differences were found (t = -1.572, P = 0.167). There were no statistically significant differences between the two groups in tympanic [F(15) = 0.71, P = 0.44] and skin [F(15) = 0.06, P = 0.81] temperature measurements over time. Core and peripheral warming methods appeared to be effective in preventing heat loss in this stable patient population. A reasonable next step would be to continue this trial in a larger sample of patients who are at greater risk for heat loss and subsequent hypothermia and to use a control group.
Nilsson, Erik; De Deco, Pietro; Trevisan, Marco; Bellocco, Rino; Lindholm, Bengt; Lund, Lars H; Coresh, Josef; Carrero, Juan J
2018-05-02
Clinical heart failure (HF) guidelines recommend monitoring of creatinine and potassium throughout the initial weeks of mineralocorticoid receptor antagonist (MRA) therapy. Here we assessed the extent to which this occurs in our healthcare system. Observational study of HF patients starting MRA therapy in Stockholm, Sweden, during 2007-2010. Outcomes included potassium and creatinine laboratory testing before MRA initiation and in the early (days 1-10) and extended (days 11-90) post-initiation periods. Patients who died or were hospitalized within 90 days, or who lacked a second MRA dispensation, were excluded. Of 4,036 HF patients starting on MRA, 45% were initiated from a hospital, 24% from a primary care center and 30% from other private centers. Overall, 89% underwent pre-initiation testing, which was more common among hospital (97%) than primary care (74%) initiations. Only 24% were adequately monitored in all three recommended intervals, again more frequently following hospital (33%) than private (21%) or primary care (17%) initiations. In multivariable analyses, adequate monitoring was more likely for hospital [odds ratio (OR), 95% confidence interval; 2.85, 2.34-3.56] initiations, and for patients with chronic kidney disease (OR 1.79, 1.30-2.43) and concomitant use of ACE inhibitors (OR 1.27, 1.05-1.52), ARBs (OR 1.19, 1.01-1.40) or beta blockers (OR 1.65, 1.22-2.26). Age, sex and prescribing center explained a small portion of adequate monitoring (c-statistic, 0.63). Addition of comorbidities and medications improved prediction marginally (c-statistic, 0.65). Although serum potassium and creatinine monitoring before MRA initiation for HF is frequent, rates of post-initiation monitoring remain suboptimal, especially among primary care centers.
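The multivariable analysis described (odds ratios plus a c-statistic) can be sketched as follows; this is an illustrative reconstruction on simulated data, and the variable names are assumptions, not the study's dataset.

```python
# Sketch of a multivariable logistic model with odds ratios and a c-statistic.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "hospital_init": rng.integers(0, 2, n),   # initiated from a hospital
    "ckd": rng.integers(0, 2, n),             # chronic kidney disease
    "age": rng.normal(72, 10, n),
})
logit = 0.8 * df.hospital_init + 0.5 * df.ckd - 1.0   # toy generating process
df["adequate"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["hospital_init", "ckd", "age"]])
fit = sm.Logit(df["adequate"], X).fit(disp=0)
print(np.exp(fit.params))                                            # odds ratios
print("c-statistic:", round(roc_auc_score(df["adequate"], fit.predict(X)), 3))
```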
23 CFR 645.117 - Cost development and reimbursement.
Code of Federal Regulations, 2010 CFR
2010-04-01
... of any alternate costing method should consider the factors listed in paragraphs (b) through (g) of... worked on the project are reimbursable when supported by adequate records. This includes labor associated... the utility may be reimbursed for the time worked directly on the project when supported by adequate...
Evaluating the Reliability, Validity, and Usefulness of Education Cost Studies
ERIC Educational Resources Information Center
Baker, Bruce D.
2006-01-01
Recent studies that purport to estimate the costs of constitutionally adequate education have been described as either a "gold standard" that should guide legislative school finance policy design and judicial evaluation, or as pure "alchemy." Methods for estimating the cost of constitutionally adequate education can be roughly…
Is the Stock of VET Skills Adequate? Assessment Methodologies.
ERIC Educational Resources Information Center
Blandy, Richard; Freeland, Brett
In Australia and elsewhere, four approaches have been used to determine whether stocks of vocational education and training (VET) skills are adequate to meet industry needs. The four methods are as follows: (1) the manpower requirements approach; (2) the international, national, and industry comparisons approach; (3) the labor market analysis…
Wicks, Shawna; Taylor, Christopher M.; Luo, Meng; Blanchard, Eugene IV; Ribnicky, David; Cefalu, William T.; Mynatt, Randall L.; Welsh, David A.
2014-01-01
Objective The gut microbiome has been implicated in obesity and metabolic syndrome; however, most studies have focused on fecal or colonic samples. Several species of Artemisia have been reported to ameliorate insulin signaling both in vitro and in vivo. The aim of this study was to characterize the mucosal and luminal bacterial populations in the terminal ileum with or without supplementation with Artemisia extracts. Materials/Methods Following 4 weeks of supplementation with different Artemisia extracts (PMI 5011, Santa or Scopa), diet-induced obese mice were sacrificed and luminal and mucosal samples of terminal ileum were used to evaluate microbial community composition by pyrosequencing of 16S rDNA hypervariable regions. Results Significant differences in community structure and membership were observed between luminal and mucosal samples, irrespective of diet group. All Artemisia extracts increased the Bacteroidetes:Firmicutes ratio in mucosal samples. This effect was not observed in the luminal compartment. There was high inter-individual variability in the phylogenetic assessments of the ileal microbiota, limiting the statistical power of this pilot investigation. Conclusions Marked differences in bacterial communities exist dependent upon the biogeographic compartment in the terminal ileum. Future studies testing the effects of Artemisia or other botanical supplements require larger sample sizes for adequate statistical power. PMID:24985102
Raupp, Ludimila; Fávaro, Thatiana Regina; Cunha, Geraldo Marcelo; Santos, Ricardo Ventura
2017-01-01
The aims of this study were to analyze and describe the presence and infrastructure of basic sanitation in the urban areas of Brazil, contrasting indigenous with non-indigenous households. Methods: A cross-sectional study based on microdata from the 2010 Census was conducted. The analyses were based on descriptive statistics (prevalence) and the construction of multiple logistic regression models (adjusted by socioeconomic and demographic covariates). The odds ratios were estimated for the association between the explanatory variables (covariates) and the outcome variables (water supply, sewage, garbage collection, and adequate sanitation). The statistical significance level established was 5%. Among the analyzed services, sewage proved to be the most precarious. Regarding race or color, indigenous households presented the lowest rate of sanitary infrastructure in Urban Brazil. The adjusted regression showed that, in general, indigenous households were at a disadvantage when compared to other categories of race or color, especially in terms of the presence of garbage collection services. These inequalities were much more pronounced in the South and Southeastern regions. The analyses of this study not only confirm the profile of poor conditions and infrastructure of the basic sanitation of indigenous households in urban areas, but also demonstrate the persistence of inequalities associated with race or color in the country.
Validation of a Survey Questionnaire on Organ Donation: An Arabic World Scenario
Agarwal, Tulika Mehta; Al-Thani, Hassan; Al Maslamani, Yousuf
2018-01-01
Objective To validate a questionnaire for measuring factors influencing organ donation and transplant. Methods The constructed questionnaire was based on the theory of planned behavior by Icek Ajzen and had 45 questions including general inquiry and demographic information. Four experts on the topic, Arabic culture, and the Arabic and English languages established content validity through review. It was quantified by the content validity index (CVI). Construct validity was established by principal component analysis (PCA), whereas internal consistency was checked by Cronbach's Alpha and the intraclass correlation coefficient (ICC). Statistical analysis was performed with the SPSS 22.0 statistical package. Results Content validity in the form of S-CVI/Average and S-CVI/UA was 0.95 and 0.82, respectively, suggesting adequate content relevance of the questionnaire. Factor analysis indicated that the construct validity for each domain (knowledge, attitudes, beliefs, and intention) was 65%, 71%, 77%, and 70%, respectively. Cronbach's Alpha and ICC coefficients were 0.90, 0.67, 0.75, and 0.74 and 0.82, 0.58, 0.61, and 0.74, respectively, for the domains. Conclusion The final questionnaire consists of 39 items across the knowledge, attitude, belief, and intention domains and is a valid and reliable tool for organ donation and transplant surveys. PMID:29593894
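As an illustration of the internal-consistency statistic mentioned, a short, self-contained Cronbach's alpha computation on simulated item scores (not the study's data) could look like this:

```python
# Minimal sketch of Cronbach's alpha for one questionnaire domain.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(2)
latent = rng.normal(size=(120, 1))                        # simulated respondents
scores = latent + rng.normal(scale=0.7, size=(120, 10))   # 10 correlated items
print(f"alpha = {cronbach_alpha(scores):.2f}")
```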
Cekic-Nagas, Isil; Egilmez, Ferhan; Ergun, Gulfem
2010-01-01
Objectives: The aim of this study was to compare the microhardness of five different resin composites at different irradiation distances (2 mm and 9 mm) by using three light curing units (quartz tungsten halogen, light emitting diode and plasma arc). Methods: A total of 210 disc-shaped samples (2 mm height and 6 mm diameter) were prepared from different resin composites (Simile, Aelite Aesthetic Enamel, Clearfil AP-X, Grandio caps and Filtek Z250). Photoactivation was performed by using quartz tungsten halogen, light emitting diode and plasma arc curing units at two irradiation distances (2 mm and 9 mm). The samples (n=7 per group) were then stored dry in the dark at 37°C for 24 h. The Vickers hardness test was performed on the resin composite layer with a microhardness tester (Shimadzu HMV). Data were statistically analyzed using nonparametric Kruskal Wallis and Mann-Whitney U tests. Results: Statistical analysis revealed that the resin composite group, the type of light curing unit and the irradiation distance had significant effects on the microhardness values (P<.05). Conclusions: Light curing unit and irradiation distance are important factors to be considered for obtaining adequate microhardness of different resin composite groups. PMID:20922164
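The nonparametric tests named in the abstract are straightforward to reproduce; the sketch below uses invented hardness values purely to show the calls involved.

```python
# Hedged sketch of the Kruskal-Wallis and Mann-Whitney U comparisons.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
halogen = rng.normal(60, 5, 7)   # simulated Vickers hardness, n = 7 per group
led     = rng.normal(65, 5, 7)
plasma  = rng.normal(55, 5, 7)

H, p_kw = stats.kruskal(halogen, led, plasma)                        # across units
U, p_mw = stats.mannwhitneyu(halogen, led, alternative="two-sided")  # one pair
print(f"Kruskal-Wallis H = {H:.2f}, p = {p_kw:.3f}; Mann-Whitney U = {U:.1f}, p = {p_mw:.3f}")
```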
Acar, Buket; Kamburoğlu, Kıvanç; Tatar, İlkan; Arıkan, Volkan; Çelik, Hakan Hamdi; Yüksel, Selcen; Özen, Tuncer
2015-12-01
This study was performed to compare the accuracy of micro-computed tomography (CT) and cone-beam computed tomography (CBCT) in detecting accessory canals in primary molars. Forty-one extracted human primary first and second molars were embedded in wax blocks and scanned using micro-CT and CBCT. After the images were taken, the samples were processed using a clearing technique and examined under a stereomicroscope in order to establish the gold standard for this study. The specimens were classified into three groups: maxillary molars, mandibular molars with three canals, and mandibular molars with four canals. Differences between the gold standard and the observations made using the imaging methods were calculated using Spearman's rho correlation coefficient test. The presence of accessory canals in micro-CT images of maxillary and mandibular root canals showed a statistically significant correlation with the stereomicroscopic images used as a gold standard. No statistically significant correlation was found between the CBCT findings and the stereomicroscopic images. Although micro-CT is not suitable for clinical use, it provides more detailed information about minor anatomical structures. However, CBCT is convenient for clinical use but may not be capable of adequately analyzing the internal anatomy of primary teeth.
AKBOĞA, Özge; BARADAN, Selim
2016-01-01
Ready mixed concrete (RMC) industry, one of the backbones of the construction sector, has its own distinctive occupational safety and health (OSH) risks. Employees experience risks that emerge during the fabrication of concrete, as well as during its delivery to the construction site. Statistics show that usage and demand of RMC have been increasing along with the number of producers and workers. Unfortunately, adequate OSH measures to meet this rapid growth are not in place even in top RMC producing countries, such as Turkey. Moreover, the lack of statistical data and academic research in this sector exacerbates the problem. This study aims to fill this gap by mining data in the Turkish Social Security Institution archives and performing univariate frequency and cross tabulation analyses on 71 incidents in which RMC truck drivers were involved. Also, investigations and interviews were conducted in seven RMC plants in Turkey and the Netherlands from an OSH point of view. Based on the results of this research, problem areas were identified: cleaning the truck mixer/pump is a hazardous activity in which operators are frequently injured, and being struck by falling objects is a major hazard in the RMC industry. Finally, Job Safety Analyses were performed on these areas to suggest mitigation methods. PMID:27524105
Liel, Y
1999-01-01
Relatively little is known about the cytological characteristics of hyperfunctioning (hot) thyroid nodules. Concern has been expressed that fine-needle aspiration (FNA) identifies hot nodules as follicular tumors or indeterminate, and as a consequence patients could be unnecessarily referred for surgery. Between 1979 and 1996, thyroid FNA was performed on 829 patients. Results of thyroid scans were available for 326 patients; 69 (21%) had hot nodules, and 257 (79%) had warm or cold thyroid nodules. Nodules in each of these major groups were divided into 2 subgroups: clinically solitary nodules and dominant nodules in multinodular goiters (MNG). The frequencies of adequate versus inadequate FNA samples, and of conclusive versus indeterminate FNA results, were determined separately for each of the groups and subgroups. In addition, patients with hot nodules and overt hyperthyroidism were identified and evaluated separately. Bivariate analyses were performed for the frequency of adequate versus inadequate smears and conclusive versus indeterminate results between hot, toxic, and cold-warm nodules, and between solitary nodules and MNG. The frequency of adequate aspirations and conclusive results in the various groups and subgroups was found to be statistically indistinguishable. In conclusion, the yield of adequate samples and the rate of conclusive results of FNA in thyroid nodules are similar, irrespective of functional state or goiter presentation. Hot thyroid nodules do not seem to produce an increase in the rate of inadequate or indeterminate FNA results and, therefore, do not affect the overall performance of thyroid FNA.
Penaranda, Eribeth; Diaz, Marco; Noriega, Oscar; Shokar, Navkiran
2012-07-01
Health literacy (HL) is a measure of the communication skills that are needed by an individual to effectively navigate the healthcare system. Hispanic adults have lower average levels of HL than any other racial/ethnic group; however, the prevalence of adequate HL among Hispanics along the US-Mexico border is unknown. We performed a cross-sectional survey of 200 adult primary care patients who attended four low-income community clinics along the US-Mexico border. Patients were included in the study if they were self-described Hispanics whose first language was Spanish or bilingual patients who reported that they were primarily Spanish speakers. Adequate HL was defined as having a score of ≥38 on the Short Assessment of Health Literacy for Spanish Adults-50. Three patients (1.5%) had inadequate HL. Because of the high proportion of patients having adequate HL, we found no statistical differences between patients with adequate HL versus inadequate HL by age, sex, educational attainment, health coverage, or self-reported health status; however, all three patients with inadequate HL were found to be 60 years old or older and had less than a high school education. The results of HL assessment varied according to the tool and setting used in measuring Spanish-speaking Hispanics. In certain clinical scenarios, current tools may underestimate the actual prevalence of adequate HL. Further development and assessment of HL tools appropriate for Spanish-speaking Hispanics is needed as a first step in developing interventions to limit disparities in health care among all Americans.
Andujar-Vazquez, Gabriela M; Gardiner, Bradley; Magro, Francis; Beaulac, Kirthana R; Doron, Shira; Snydman, David R
2017-01-01
Abstract Background Bloodstream infections are a major cause of morbidity and mortality worldwide, with favorable clinical outcomes associated with early optimal antibiotic selection. Rapid diagnostics have become a key part in achieving this. Biofire Filmarray® was introduced at our institution for rapid blood culture (BC) identification, coupled with antimicrobial stewardship (AS) interventions. We aimed to assess the impact of this test on time to adequate antimicrobial therapy in a setting with pre-existing effective AS interventions. Methods An observational retrospective pre-post chart review was performed. We reviewed adult positive BC before and after implementation of Biofire. Outcomes were: (1) time from BC result reported to health care provider to start of adequate antimicrobial therapy, (2) time to stopping antimicrobial therapy in BC thought to be contaminants, (3) time to any change in antimicrobial therapy and (4) a composite outcome of outcomes 1 and 2. A univariate Cox proportional hazards model was performed. Results 326 positive BC were analyzed, 173 before and 153 after Biofire implementation. At the time of healthcare provider notification, 77 were not on adequate antimicrobials, with a median time to adequate therapy of 6.98 hours (IQR 3.93–23.96) before and 6.1 hours (IQR 1.84–20.95) after implementation, P = 0.48. There were 75 BC classified as contaminants, and the median time to stopping antimicrobials was 48.28 hours (IQR 18.56–89.36) vs. 45.25 hours (IQR 15.12–100.60), P = 0.61. Time to any change in antimicrobial therapy was similar, with a median of 13.05 hours (IQR 4.00–36.77) vs. 10.90 hours (IQR 2.97–31.10), P = 0.87. Analysis of the composite outcome revealed a median of 23.95 hours (IQR 6.29–58.50) vs. 14.82 hours (IQR 4.07–44.79) (hazard ratio 1.33, 95% confidence interval 0.96–1.84, P = 0.09). Conclusion Implementation of the Biofire Filmarray® did not have a statistically significant effect on our composite outcome of time to adequate therapy and time to discontinuation in the case of contaminants. Our findings suggest that when added to other effective AS surveillance and interventions, the magnitude of the clinical impact of rapid PCR diagnostics for BC identification is minimal. Disclosures D. R. Snydman, Merck: Scientific Advisor, Consulting fee; Shire: Scientific Advisor, Consulting fee
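The univariate Cox model for time-to-event outcomes reported here can be sketched as follows; the lifelines-based example uses simulated times, and the column names and effect size are assumptions rather than the study's data.

```python
# Rough sketch of a univariate Cox model for time to adequate therapy.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(9)
n = 300
post_biofire = rng.integers(0, 2, n)
# shorter times assumed in the post-implementation period (toy generating process)
hours = rng.exponential(scale=np.where(post_biofire == 1, 15, 20))
df = pd.DataFrame({
    "hours": hours,
    "event": 1,                      # all episodes assumed to reach adequate therapy
    "post_biofire": post_biofire,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="hours", event_col="event")
print(np.exp(cph.params_))           # hazard ratio for post_biofire
```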
Content-Based VLE Designs Improve Learning Efficiency in Constructivist Statistics Education
Wessa, Patrick; De Rycker, Antoon; Holliday, Ian Edward
2011-01-01
Background We introduced a series of computer-supported workshops in our undergraduate statistics courses, in the hope that it would help students to gain a deeper understanding of statistical concepts. This raised questions about the appropriate design of the Virtual Learning Environment (VLE) in which such an approach had to be implemented. Therefore, we investigated two competing software design models for VLEs. In the first system, all learning features were a function of the classical VLE. The second system was designed from the perspective that learning features should be a function of the course's core content (statistical analyses), which required us to develop a specific–purpose Statistical Learning Environment (SLE) based on Reproducible Computing and newly developed Peer Review (PR) technology. Objectives The main research question is whether the second VLE design improved learning efficiency as compared to the standard type of VLE design that is commonly used in education. As a secondary objective we provide empirical evidence about the usefulness of PR as a constructivist learning activity which supports non-rote learning. Finally, this paper illustrates that it is possible to introduce a constructivist learning approach in large student populations, based on adequately designed educational technology, without subsuming educational content to technological convenience. Methods Both VLE systems were tested within a two-year quasi-experiment based on a Reliable Nonequivalent Group Design. This approach allowed us to draw valid conclusions about the treatment effect of the changed VLE design, even though the systems were implemented in successive years. The methodological aspects about the experiment's internal validity are explained extensively. Results The effect of the design change is shown to have substantially increased the efficiency of constructivist, computer-assisted learning activities for all cohorts of the student population under investigation. The findings demonstrate that a content–based design outperforms the traditional VLE–based design. PMID:21998652
Ahmed, K S
1979-01-01
In Bangladesh the Population Control and Family Planning Division of the Ministry of Health and Population Control has decided to delegate increased financial and administrative powers to the officers of the family planning program at the district level and below. Currently, about 20,000 family planning workers and officials are at work in rural areas. The government believes that the success of the entire family planning program depends on the performance of workers in rural areas, because that is where about 90% of the population lives. Awareness of the need to improve statistical data in Bangladesh has been increasing, particularly in regard to the development of rural areas. An accurate statistical profile of rural Bangladesh is crucial to the formation, implementation and evaluation of rural development programs. A Seminar on Statistics for Rural Development will be held from June 18-20, 1980. The primary objectives of the Seminar are to make an exhaustive analysis of the current availability of statistics required for rural development programs and to consider methodological and operational improvements toward building up an adequate data base.
Metrology of vibration measurements by laser techniques
NASA Astrophysics Data System (ADS)
von Martens, Hans-Jürgen
2008-06-01
Metrology as the art of careful measurement has been understood as uniform methodology for measurements in natural sciences, covering methods for the consistent assessment of experimental data and a corpus of rules regulating application in technology and in trade and industry. The knowledge, methods and tools available for precision measurements can be exploited for measurements at any level of uncertainty in any field of science and technology. A metrological approach to the preparation, execution and evaluation (including expression of uncertainty) of measurements of translational and rotational motion quantities using laser interferometer methods and techniques will be presented. The realization and dissemination of the SI units of motion quantities (vibration and shock) have been based on laser interferometer methods specified in international documentary standards. New and upgraded ISO standards are reviewed with respect to their suitability for ensuring traceable vibration measurements and calibrations in an extended frequency range of 0.4 Hz to higher than 100 kHz. Using adequate vibration exciters to generate sufficient displacement or velocity amplitudes, the upper frequency limits of the laser interferometer methods specified in ISO 16063-11 for frequencies <= 10 kHz can be expanded to 100 kHz and beyond. A comparison of different methods simultaneously used for vibration measurements at 100 kHz will be demonstrated. A statistical analysis of numerous experimental results proves the highest accuracy achievable currently in vibration measurements by specific laser methods, techniques and procedures (i.e. measurement uncertainty 0.05 % at frequencies <= 10 kHz, <= 1 % up to 100 kHz).
Applications of statistical physics to technology price evolution
NASA Astrophysics Data System (ADS)
McNerney, James
Understanding how changing technology affects the prices of goods is a problem with both rich phenomenology and important policy consequences. Using methods from statistical physics, I model technology-driven price evolution. First, I examine a model for the price evolution of individual technologies. The price of a good often follows a power law equation when plotted against its cumulative production. This observation turns out to have significant consequences for technology policy aimed at mitigating climate change, where technologies are needed that achieve low carbon emissions at low cost. However, no theory adequately explains why technology prices follow power laws. To understand this behavior, I simplify an existing model that treats technologies as machines composed of interacting components. I find that the power law exponent of the price trajectory is inversely related to the number of interactions per component. I extend the model to allow for more realistic component interactions and make a testable prediction. Next, I conduct a case-study on the cost evolution of coal-fired electricity. I derive the cost in terms of various physical and economic components. The results suggest that commodities and technologies fall into distinct classes of price models, with commodities following martingales, and technologies following exponentials in time or power laws in cumulative production. I then examine the network of money flows between industries. This work is a precursor to studying the simultaneous evolution of multiple technologies. Economies resemble large machines, with different industries acting as interacting components with specialized functions. To begin studying the structure of these machines, I examine 20 economies with an emphasis on finding common features to serve as targets for statistical physics models. I find they share the same money flow and industry size distributions. I apply methods from statistical physics to show that industries cluster the same way according to industry type. Finally, I use these industry money flows to model the price evolution of many goods simultaneously, where network effects become important. I derive a prediction for which goods tend to improve most rapidly. The fastest-improving goods are those with the highest mean path lengths in the money flow network.
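The power-law (learning-curve) relationship between price and cumulative production discussed above is usually estimated as a straight line in log-log space; here is a minimal sketch on synthetic data, with all numbers invented.

```python
# Fit a power law price ~ cumulative_production**(-b) by OLS on logs.
import numpy as np

rng = np.random.default_rng(4)
cum_prod = np.logspace(0, 4, 50)
price = 100 * cum_prod ** -0.3 * np.exp(rng.normal(0, 0.05, 50))  # true b = 0.3

slope, intercept = np.polyfit(np.log(cum_prod), np.log(price), 1)
print(f"estimated learning exponent b = {-slope:.2f}")
```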
Kumma, Wondimagegn Paulos; Haji, Yusuf; Abdurahmen, Junayde; Mehretie Adinew, Yohannes
2018-01-01
Universal use of iodized salt is a simple and inexpensive method to prevent and eliminate iodine deficiency disorders like mental retardation. However, little is known about the level of adequately iodized salt consumption in the study area. Therefore, the study was aimed at assessing the proportion of households having adequately iodized salt and associated factors in Wolaita Sodo town and its peripheries, Southern Ethiopia. A cross-sectional study was conducted from May 10 to 20, 2016, in 441 households in Sodo town and its peripheries. Samples were selected using the systematic sampling technique. An iodometric titration method (AOAC, 2000) was used to analyze the iodine content of the salt samples. Data entry and analysis were done using Epi Info version 3.5.1 and SPSS version 16, respectively. The female to male ratio of the respondents was 219. The mean age of the respondents was 30.2 (±7.3 SD). The proportion of households having adequately iodized salt was 37.7%, with 95% CI of 33.2% to 42.2%. Not exposing salt to sunlight [OR: 3.75; 95% CI: 2.14, 6.57], higher monthly income [OR: 3.71; 95% CI: 1.97-7.01], and formal education of respondents [OR: 1.75; 95% CI: 1.14, 2.70] were found to be associated with the presence of adequately iodized salt at home. This study revealed low levels of households having adequately iodized salt in Wolaita Sodo town and its peripheries. The evidence shows a need to increase the supply of adequately iodized salt and to monitor progress towards the sustainable elimination of IDD.
Identifying the Source of Misfit in Item Response Theory Models.
Liu, Yang; Maydeu-Olivares, Alberto
2014-01-01
When an item response theory model fails to fit adequately, the items for which the model provides a good fit and those for which it does not must be determined. To this end, we compare the performance of several fit statistics for item pairs with known asymptotic distributions under maximum likelihood estimation of the item parameters: (a) a mean and variance adjustment to bivariate Pearson's X², (b) a bivariate subtable analog to Reiser's (1996) overall goodness-of-fit test, (c) a z statistic for the bivariate residual cross product, and (d) Maydeu-Olivares and Joe's (2006) M2 statistic applied to bivariate subtables. The unadjusted Pearson's X² with heuristically determined degrees of freedom is also included in the comparison. For binary and ordinal data, our simulation results suggest that the z statistic has the best Type I error and power behavior among all the statistics under investigation when the observed information matrix is used in its computation. However, if one has to use the cross-product information, the mean and variance adjusted X² is recommended. We illustrate the use of pairwise fit statistics in 2 real-data examples and discuss possible extensions of the current research in various directions.
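For readers unfamiliar with the pairwise statistics compared, a toy computation of the unadjusted bivariate Pearson's X² for a single item pair (observed vs. model-implied 2x2 table, both invented, with a heuristic 1 df) is sketched below; the mean and variance adjustment studied in the paper is not shown.

```python
# Unadjusted bivariate Pearson X^2 for one item pair (toy frequencies).
import numpy as np
from scipy.stats import chi2

observed = np.array([[220., 80.],
                     [ 60., 140.]])
expected = np.array([[200., 100.],
                     [ 80., 120.]])   # model-implied frequencies (assumed)

X2 = ((observed - expected) ** 2 / expected).sum()
print(f"X2 = {X2:.2f}, p (1 df, heuristic) = {chi2.sf(X2, df=1):.4f}")
```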
2011-01-01
Background Pregnancy is a good time to develop healthy lifestyle habits including regular exercise and good nutrition. Programs of physical exercise for pregnant women have been recommended; however, there are few references on this subject in the literature. The objective of this study was to evaluate the knowledge, attitude and practice of pregnant women with respect to appropriate physical exercise during pregnancy, and also to investigate why some women do not exercise during pregnancy. Methods A descriptive study was conducted in which 161 women of 18 to 45 years of age were interviewed in the third trimester of pregnancy. These women were receiving prenatal care at National Health Service (SUS) primary healthcare units and had no pathologies for which physical exercise would constitute a risk. The women were selected at an ultrasonography clinic accredited to the SUS in Campinas, São Paulo. A previously elaborated knowledge, attitude and practice (KAP) questionnaire was used to collect data, which were then stored in an Epinfo database. Statistical analysis was conducted using Pearson's chi-square test and Fisher's exact test to evaluate the association between the study variables (p < 0.05). Results Almost two-thirds (65.6%) of the women were sufficiently informed about the practice of physical exercise during pregnancy and the vast majority (93.8%) was in favor of it. Nevertheless, only just over 20% of the women in this sample exercised adequately. Significant associations were found between an adequate knowledge of physical exercise during pregnancy and education level (p = 0.0014) and between the adequate practice of physical exercise during pregnancy and having had fewer pregnancies (p = 0.0001). Lack of time and feeling tired and uncomfortable were the principal reasons given by the women for not exercising. Conclusion These results suggest that women's knowledge concerning the practice of physical exercise during pregnancy is reasonable and their attitude is favorable; however, relatively few actually exercise during pregnancy. PMID:22051371
Adequacy of Prenatal Care and Gestational Weight Gain
Crandell, Jamie L.; Jones-Vessey, Kathleen
2016-01-01
Abstract Background: The goal of prenatal care is to maximize health outcomes for a woman and her fetus. We examined how prenatal care is associated with meeting the 2009 Institute of Medicine (IOM) guidelines for gestational weight gain. Sample: The study used deidentified birth certificate data supplied by the North Carolina State Center for Health Statistics. The sample included 197,354 women (≥18 years) who delivered singleton full-term infants in 2011 and 2012. Methods: A generalized multinomial model was used to identify how adequate prenatal care was associated with the odds of gaining excessive or insufficient weight during pregnancy according to the 2009 IOM guidelines. The model adjusted for prepregnancy body size, sociodemographic factors, and birth weight. Results: A total of 197,354 women (≥18 years) delivered singleton full-term infants. The odds ratio (OR) for excessive weight gain was 2.44 (95% CI 2.37–2.50) in overweight and 2.33 (95% CI 2.27–2.40) in obese women compared with normal weight women. The OR for insufficient weight gain was 1.15 (95% CI 1.09–1.22) for underweight and 1.34 (95% CI 1.30–1.39) for obese women compared with normal weight women. Prenatal care at the inadequate or intermediate levels was associated with insufficient weight gain (OR: 1.32, 95% CI 1.27–1.38; OR: 1.15, 95% CI 1.09–1.21, respectively) compared with adequate prenatal care. Women with inadequate care were less likely to gain excessive weight (OR: 0.88, 95% CI 0.86–0.91). Conclusions: Whereas prenatal care was effective for preventing insufficient weight gain regardless of prepregnancy body size, educational background, and racial/ethnic group, there were no indications that adequate prenatal care was associated with reduced risk for excessive gestational weight gain. Further research is needed to improve prenatal care programs for preventing excess weight gain. PMID:26741198
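A generalized multinomial model of the kind described can be approximated with a standard multinomial logit; the sketch below uses simulated data and assumed variable names, not the birth-certificate file.

```python
# Multinomial logit for weight-gain category (0 = adequate, 1 = insufficient,
# 2 = excessive) as a function of care adequacy and obesity; data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 2000
df = pd.DataFrame({
    "inadequate_care": rng.integers(0, 2, n),
    "obese": rng.integers(0, 2, n),
})
probs = np.column_stack([
    np.full(n, 0.4),
    0.2 + 0.1 * df.inadequate_care,
    0.4 + 0.1 * df.obese - 0.1 * df.inadequate_care,
])
probs /= probs.sum(axis=1, keepdims=True)          # toy generating process
df["gain"] = [rng.choice(3, p=p) for p in probs]

X = sm.add_constant(df[["inadequate_care", "obese"]])
fit = sm.MNLogit(df["gain"], X).fit(disp=0)
print(np.exp(fit.params))   # odds ratios vs. the reference category
```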
Azar, Marleine; Riehm, Kira E.; McKay, Dean; Thombs, Brett D.
2015-01-01
Background Confidence that randomized controlled trial (RCT) results accurately reflect intervention effectiveness depends on proper trial conduct and the accuracy and completeness of published trial reports. The Journal of Consulting and Clinical Psychology (JCCP) is the primary trials journal amongst American Psychological Association (APA) journals. The objectives of this study were to review RCTs recently published in JCCP to evaluate (1) adequacy of primary outcome analysis definitions; (2) registration status; and, (3) among registered trials, adequacy of outcome registrations. Additionally, we compared results from JCCP to findings from a recent study of top psychosomatic and behavioral medicine journals. Methods Eligible RCTs were published in JCCP in 2013–2014. For each RCT, two investigators independently extracted data on (1) adequacy of outcome analysis definitions in the published report, (2) whether the RCT was registered prior to enrolling patients, and (3) adequacy of outcome registration. Results Of 70 RCTs reviewed, 12 (17.1%) adequately defined primary or secondary outcome analyses, whereas 58 (82.3%) had multiple primary outcome analyses without statistical adjustment or undefined outcome analyses. There were 39 (55.7%) registered trials. Only two trials registered prior to patient enrollment with a single primary outcome variable and time point of assessment. However, in one of the two trials, registered and published outcomes were discrepant. No studies were adequately registered as per Standard Protocol Items: Recommendation for Interventional Trials guidelines. Compared to psychosomatic and behavioral medicine journals, the proportion of published trials with adequate outcome analysis declarations was significantly lower in JCCP (17.1% versus 32.9%; p = 0.029). The proportion of registered trials in JCCP (55.7%) was comparable to behavioral medicine journals (52.6%; p = 0.709). Conclusions The quality of published outcome analysis definitions and trial registrations in JCCP is suboptimal. Greater attention to proper trial registration and outcome analysis definition in published reports is needed. PMID:26581079
Estimating effects of improved drinking water and sanitation on cholera.
Leidner, Andrew J; Adusumilli, Naveen C
2013-12-01
Demand for adequate provision of drinking-water and sanitation facilities to promote public health and economic growth is increasing in the rapidly urbanizing countries of the developing world. With a panel of data on Asia and Africa from 1990 to 2008, associations are estimated between the occurrence of cholera outbreaks, the case rates in given outbreaks, the mortality rates associated with cholera and two disease control mechanisms, drinking-water and sanitation services. A statistically significant and negative effect is found between drinking-water services and both cholera case rates as well as cholera-related mortality rates. A relatively weak statistical relationship is found between the occurrence of cholera outbreaks and sanitation services.
RhinAsthma patient perspective: A Rasch validation study.
Molinengo, Giorgia; Baiardini, Ilaria; Braido, Fulvio; Loera, Barbara
2018-02-01
In daily practice, Health-Related Quality of Life (HRQoL) tools are useful for supplementing clinical data with the patient's perspective. To encourage their use by clinicians, the availability of tools that can quickly provide valid results is crucial. A new HRQoL tool has been proposed for patients with asthma and rhinitis: the RhinAsthma Patient Perspective-RAPP. The aim of this study was to evaluate the psychometric robustness of the RAPP using the Item Response Theory (IRT) approach, to evaluate the scalability of items and test whether or not patients use the items response scale correctly. 155 patients (53.5% women, mean age 39.1, range 16-76) were recruited during a multicenter study. RAPP metric properties were investigated using IRT models. Differential item functioning (DIF) was used for gender, age, and asthma control test (ACT). The RAPP adequately fitted the Rating Scale model, demonstrating the equality of the rating scale structure for all items. All statistics on items were satisfactory. The RAPP had adequate internal reliability and showed good ability to discriminate among different groups of participants. DIF analysis indicated that there were no differential item functioning issues for gender. One item showed a DIF by age and four items by ACT. The psychometric evaluation performed using IRT models demonstrated that the RAPP met all the criteria to be considered a reliable and valid method of measurement. From a clinical perspective, this will allow physicians to confidently interpret scores as good indicators of Quality of Life of patients with asthma.
Shah, Shagun Bhatia; Chowdhury, Itee; Bhargava, Ajay Kumar; Sabbharwal, Bhawnish
2015-01-01
Background and Aims: This study aimed to compare the hemodynamic responses during induction and intubation between propofol and etomidate using entropy guided hypnosis. Material and Methods: Sixty ASA I & II patients aged 20-60 years, scheduled for modified radical mastectomy, were randomly allocated to two groups based on the induction agent, etomidate or propofol. Both groups received intravenous midazolam 0.03 mg kg⁻¹ and fentanyl 2 μg kg⁻¹ as premedication. After induction with the desired agent titrated to entropy 40, vecuronium 0.1 mg kg⁻¹ was administered for neuromuscular blockade. Heart rate, systolic, diastolic and mean arterial pressures, response entropy [RE] and state entropy [SE] were recorded at baseline, at induction and up to three minutes post-intubation. Data were analyzed in SPSS (version 12.0) using paired and unpaired Student's t-tests for equality of means. Results: Etomidate provided hemodynamic stability without the requirement of any rescue drug in 96.6% of patients, whereas the rescue drug ephedrine was required in 36.6% of patients in the propofol group. Reduced induction doses, 0.15 mg kg⁻¹ for etomidate and 0.98 mg kg⁻¹ for propofol, sufficed to give an adequate anaesthetic depth based on entropy. Conclusion: Etomidate provides more hemodynamic stability than propofol during induction and intubation. Reduced induction doses of etomidate and propofol titrated to entropy translated into increased hemodynamic stability for both drugs and sufficed to give an adequate anaesthetic depth. PMID:25948897
Farzi, Sedigheh; Moladoost, Azam; Bahrami, Masoud; Farzi, Saba; Etminani, Reza
2017-01-01
One of the goals of nursing is to provide safe care, prevent injury, and promote patients' health. Patient safety in intensive care units is threatened for various reasons. This study aimed to survey patient safety culture from the perspective of nurses in intensive care units. This cross-sectional study was conducted in 2016. Sampling was done using the convenience method. The sample consisted of 367 nurses working in intensive care units of teaching hospitals affiliated to Isfahan University of Medical Sciences. Data were collected using a two-part questionnaire comprising demographic items and the Hospital Survey on Patient Safety Culture (HSOPSC) questionnaire. Data analysis was done using descriptive statistics (mean and standard deviation). Among the 12 dimensions of safety culture, the nurses assigned the highest scores to "teamwork within units" (97.3%) and "organizational learning-continuous improvement" (84%). They assigned the lowest scores to "handoffs and transitions" (21.1%), "non-punitive response to errors" (24.7%), "staffing" (35.6%), "communication openness" (47.5%), and "teamwork across units" (49.4%). Several patient safety culture dimensions received low scores and require attention and concrete measures from health care centers, including facilitating teamwork, providing adequate staffing, and developing checklists for handoffs and transitions. Furthermore, to increase error reporting and to promote a patient safety culture in intensive care units, strategies should be adopted that include a system-based approach to dealing with errors.
USDA-ARS?s Scientific Manuscript database
Population genetic studies on a global scale may be hampered by the ability to acquire quality samples from distant countries. Preservation methods must be adequate to prevent the samples from decay during shipping, so an adequate quantity of quality DNA can be extracted for analysis, and materials...
Park, Myung Sook; Kang, Kyung Ja; Jang, Sun Joo; Lee, Joo Yun; Chang, Sun Ju
2018-03-01
This study aimed to evaluate the components of test-retest reliability, including time interval, sample size, and statistical methods, used in patient-reported outcome measures in older people and to provide suggestions on the methodology for calculating test-retest reliability for patient-reported outcomes in older people. This was a systematic literature review. MEDLINE, Embase, CINAHL, and PsycINFO were searched from January 1, 2000 to August 10, 2017 by an information specialist. This systematic review was guided by both the Preferred Reporting Items for Systematic Reviews and Meta-Analyses checklist and the guideline for systematic review published by the National Evidence-based Healthcare Collaborating Agency in Korea. The methodological quality was assessed by the Consensus-based Standards for the selection of health Measurement Instruments checklist box B. Ninety-five out of 12,641 studies were selected for the analysis. The median time interval for test-retest reliability was 14 days, and the ratio of sample size for test-retest reliability to the number of items in each measure ranged from 1:1 to 1:4. The most frequently used statistical method for continuous scores was the intraclass correlation coefficient (ICC). Among the 63 studies that used ICCs, 21 studies presented models for ICC calculations and 30 studies reported 95% confidence intervals of the ICCs. Additional analyses of the 17 studies that reported a strong ICC (>0.9) showed that the mean time interval was 12.88 days and the mean ratio of the number of items to sample size was 1:5.37. When researchers plan to assess the test-retest reliability of patient-reported outcome measures for older people, they need to consider an adequate time interval of approximately 13 days and a sample size of about 5 times the number of items. In particular, statistical methods should not only be selected based on the types of scores of the patient-reported outcome measures, but should also be described clearly in the studies that report the results of test-retest reliability. Copyright © 2017 Elsevier Ltd. All rights reserved.
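As a companion to the review's recommendations, here is a minimal sketch of a two-way random-effects, absolute-agreement ICC(2,1) for test-retest scores, using the standard ANOVA-based formula on simulated data; the 13-day interval is a design note, not an input to the calculation.

```python
# ICC(2,1) for test-retest data (rows = subjects, columns = occasions).
import numpy as np

def icc_2_1(x: np.ndarray) -> float:
    n, k = x.shape
    grand = x.mean()
    ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)
    ss_err = ((x - x.mean(axis=1, keepdims=True)
                 - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

rng = np.random.default_rng(6)
true_score = rng.normal(50, 10, size=(100, 1))
scores = true_score + rng.normal(0, 3, size=(100, 2))   # test and retest
print(f"ICC(2,1) = {icc_2_1(scores):.2f}")
```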
[Sem: a suitable statistical software adapted for research in oncology].
Kwiatkowski, F; Girard, M; Hacene, K; Berlie, J
2000-10-01
Many software packages have been adapted for medical use, but they rarely combine convenient data management with statistics. A recent cooperative effort produced a new package, Sem (Statistics Epidemiology Medicine), which allows both data management for trials and statistical analysis of the data. Sem is convenient enough to be used by non-statisticians (biologists, doctors, researchers, data managers), since in most cases (except for multivariate models) the software itself selects the most appropriate test, after which complementary tests can be requested if needed. The Sem database manager (DBM) is not compatible with standard DBMs, which constitutes a first protection against loss of privacy. Other safeguards (passwords, encryption, etc.) strengthen data security, all the more necessary now that Sem can be run on computer networks. The data organization supports multiplicity: forms can be duplicated for each patient. Dates are handled in a special but transparent manner (sorting, date and delay calculations, etc.). Sem communicates with common desktop software, often with a simple copy/paste, so statistics can easily be performed on data stored in external spreadsheets, and slides can be produced by pasting graphs (survival curves, etc.) with a single mouse click. Already in daily use at over fifty sites in different hospitals, this product, combining data management and statistics, appears to be a convenient and innovative solution.
Armed Services Pricing Manual (ASPM). Volume 2: Price Analysis
1987-01-01
able to conclude that competition is adequate and the lowest price is reasonable. You may compare it with the most recent prices paid or the Government ...estimates. When price comparisons are not possible, the offered price may be compared with the purchase request estimate (if there is one) or other Government ...both Government and public libraries. There are highly specialized governmental statistical publications besides those listed, and there are many
Prediction of the dollar to the ruble rate. A system-theoretic approach
NASA Astrophysics Data System (ADS)
Borodachev, Sergey M.
2017-07-01
A simple state-space model of dollar exchange-rate formation is proposed, based on changes in oil prices and on mechanisms of money transfer between the monetary and stock markets. Predictions from an input-output model and from the state-space model are compared. It is concluded that, with proper use of the statistical data (via a Kalman filter), the second approach provides more adequate predictions of the dollar rate.
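The abstract does not specify the state-space model, so the sketch below only illustrates the general mechanics of the Kalman-filter approach it refers to: a scalar latent state (the "fair" exchange rate) driven by an exogenous oil-price change, observed with noise. The model structure, coefficients, and variable names are assumptions for illustration, not the author's formulation.

```python
import numpy as np

def kalman_filter_1d(observations, oil_changes, a=1.0, b=-0.5, q=0.04, r=0.25, x0=60.0, p0=1.0):
    """Minimal scalar Kalman filter for an assumed exchange-rate model:
        state:       x_t = a * x_{t-1} + b * u_t + w_t,   w_t ~ N(0, q)
        observation: y_t = x_t + v_t,                     v_t ~ N(0, r)
    where u_t is the change in oil price. Returns one-step-ahead predictions.
    """
    x, p = x0, p0
    predictions = []
    for y, u in zip(observations, oil_changes):
        # Predict.
        x_pred = a * x + b * u
        p_pred = a * a * p + q
        predictions.append(x_pred)
        # Update with the observed rate.
        k_gain = p_pred / (p_pred + r)
        x = x_pred + k_gain * (y - x_pred)
        p = (1.0 - k_gain) * p_pred
    return np.array(predictions)

# Hypothetical data: observed dollar/ruble rates and oil-price changes.
rates = np.array([60.1, 60.8, 62.0, 61.5, 63.2])
oil = np.array([0.0, -1.2, -2.0, 0.5, -2.5])
print(kalman_filter_1d(rates, oil))
```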
Three Dimensional CFD Analysis of the GTX Combustor
NASA Technical Reports Server (NTRS)
Steffen, C. J., Jr.; Bond, R. B.; Edwards, J. R.
2002-01-01
The annular combustor geometry of a combined-cycle engine has been analyzed with three-dimensional computational fluid dynamics. Both subsonic combustion and supersonic combustion flowfields have been simulated. The subsonic combustion analysis was executed in conjunction with a direct-connect test rig. Results from two cold-flow simulations and one hot-flow simulation are presented. The simulations compare favorably with the test data for the two cold-flow calculations; the hot-flow data were not yet available. The hot-flow simulation indicates that the conventional ejector-ramjet cycle would not provide adequate mixing at the conditions tested. The supersonic combustion ramjet flowfield was simulated with a frozen chemistry model. A five-parameter test matrix was specified according to statistical design-of-experiments theory. Twenty-seven separate simulations were used to assemble surrogate models for combustor mixing efficiency and total pressure recovery. Scramjet injector design parameters (injector angle, location, and fuel split) as well as mission variables (total fuel massflow and freestream Mach number) were included in the analysis. A promising injector design has been identified that provides good mixing characteristics with low total pressure losses. The surrogate models can be used to develop performance maps of different injector designs. Several complex three-way variable interactions appear within the dataset that are not adequately resolved with the current statistical analysis.
Zeisel, Steven H.
2013-01-01
Nutrigenetics/nutrigenomics (the study of the bidirectional interactions between genes and diet) is a rapidly developing field that is changing research and practice in human nutrition. Though eventually nutrition clinicians may be able to provide personalized nutrition recommendations, in the immediate future they are most likely to use this knowledge to improve dietary recommendations for populations. Currently, estimated average requirements are used to set dietary reference intakes because scientists cannot adequately identify subsets of the population that differ in requirement for a nutrient. Recommended intake levels must exceed the actual required intake for most of the population in order to assure that individuals with the highest requirement ingest adequate amounts of the nutrient. As a result, dietary reference intake levels often are set so high that diet guidelines suggest almost unattainable intakes of some foods. Once it is possible to identify common subgroups that differ in nutrient requirements using nutrigenetic/nutrigenomic profiling, targeted interventions and recommendations can be refined. In addition, when a large variance exists in response to a nutrient, statistical analyses often argue for a null effect. If responders could be differentiated from nonresponders based on nutrigenetic/nutrigenomic profiling, this statistical noise could be eliminated and the sensitivity of nutrition research greatly increased. PMID:20436254
Assessing the performance of sewer rehabilitation on the reduction of infiltration and inflow.
Staufer, P; Scheidegger, A; Rieckermann, J
2012-10-15
Inflow and Infiltration (I/I) into sewer systems is generally unwanted, because, among other things, it decreases the performance of wastewater treatment plants and increases combined sewage overflows. As sewer rehabilitation to reduce I/I is very expensive, water managers not only need methods to accurately measure I/I, but also they need sound approaches to assess the actual performance of implemented rehabilitation measures. However, such performance assessment is rarely performed. On the one hand, it is challenging to adequately take into account the variability of influential factors, such as hydro-meteorological conditions. On the other hand, it is currently not clear how experimental data can indeed support robust evidence for reduced I/I. In this paper, we therefore statistically assess the performance of rehabilitation measures to reduce I/I. This is possible by using observations in a suitable reference catchment as a control group and assessing the significance of the observed effect by regression analysis, which is well established in other disciplines. We successfully demonstrate the usefulness of the approach in a case study, where rehabilitation reduced groundwater infiltration by 23.9%. A reduction of stormwater inflow of 35.7%, however, was not statistically significant. Investigations into the experimental design of monitoring campaigns confirmed that the variability of the data as well as the number of observations collected before the rehabilitation impact the detection limit of the effect. This implies that it is difficult to improve the data quality after the rehabilitation has been implemented. Therefore, future practical applications should consider a careful experimental design. Further developments could employ more sophisticated monitoring methods, such as stable environmental isotopes, to directly observe the individual infiltration components. In addition, water managers should develop strategies to effectively communicate statistically not significant I/I reduction ratios to decision makers. Copyright © 2012 Elsevier Ltd. All rights reserved.
Spatially adapted augmentation of age-specific atlas-based segmentation using patch-based priors
NASA Astrophysics Data System (ADS)
Liu, Mengyuan; Seshamani, Sharmishtaa; Harrylock, Lisa; Kitsch, Averi; Miller, Steven; Chau, Van; Poskitt, Kenneth; Rousseau, Francois; Studholme, Colin
2014-03-01
One of the most common approaches to MRI brain tissue segmentation is to employ an atlas prior to initialize an Expectation-Maximization (EM) image labeling scheme using a statistical model of MRI intensities. This prior is commonly derived from a set of manually segmented training data from the population of interest. However, in cases where subject anatomy varies significantly from the prior anatomical average model (for example in the case where extreme developmental abnormalities or brain injuries occur), the prior tissue map does not provide adequate information about the observed MRI intensities to ensure the EM algorithm converges to an anatomically accurate labeling of the MRI. In this paper, we present a novel approach for automatic segmentation of such cases. This approach augments the atlas-based EM segmentation by exploring methods to build a hybrid tissue segmentation scheme that seeks to learn where an atlas prior fails (due to inadequate representation of anatomical variation in the statistical atlas) and utilize an alternative prior derived from a patch driven search of the atlas data. We describe a framework for incorporating this patch-based augmentation of EM (PBAEM) into a 4D age-specific atlas-based segmentation of developing brain anatomy. The proposed approach was evaluated on a set of MRI brain scans of premature neonates with ages ranging from 27.29 to 46.43 gestational weeks (GWs). Results indicated superior performance compared to the conventional atlas-based segmentation method, providing improved segmentation accuracy for gray matter, white matter, ventricles and sulcal CSF regions.
Cypriot nurses' knowledge and attitudes towards alternative medicine.
Zoe, Roupa; Charalambous, Charalambos; Popi, Sotiropoulou; Maria, Rekleiti; Aris, Vasilopoulos; Agoritsa, Koulouri; Evangelia, Kotrotsiou
2014-02-01
To investigate Cypriot nurses' knowledge and attitudes towards alternative treatments. Two hundred randomly selected registered nurses from public hospitals in Cyprus were administered an anonymous self-report questionnaire with closed-type questions. The particular questionnaire has previously been used in similar surveys. Six questions referred to demographic data and 14 questions to attitudes and knowledge towards alternative medicine. One hundred and thirty-eight questionnaires were adequately completed and evaluated. Descriptive and inferential statistics were performed using SPSS 17.0. Statistical significance was set at p < 0.05. Over one-third of the nurses in our sample reported that they had turned to some form of alternative treatment at some point in their lives in order to deal with a certain medical situation. Most of the nurses who reported some knowledge of specific alternative treatment methods (75.9%) also reported using such methods within their clinical practice. The nurses who had received some form of alternative treatment reported using it more often in their clinical practice, in comparison to those who had never received such treatment (Mann-Whitney U = 1137, p = 0.006). The more frequently nurses used alternative treatment in their clinical practice, the more interested they became in expanding their knowledge on the subject (Pearson's r = 0.250, p = 0.006). Most nurses are familiar with alternative medicine and interested in expanding their knowledge on the subject, despite the fact that they do not usually practice it. Special education and training as well as legislative actions are necessary for alternative medicine to be broadly accepted. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Winiarek, Victor; Vira, Julius; Bocquet, Marc; Sofiev, Mikhail; Saunier, Olivier
2011-06-01
In the event of an accidental atmospheric release of radionuclides from a nuclear power plant, accurate real-time forecasting of the activity concentrations of radionuclides is required by decision makers for the preparation of adequate countermeasures. The accuracy of the forecast plume is highly dependent on the source term estimation. On several academic test cases, including real data, inverse modelling and data assimilation techniques were proven to help in the assessment of the source term. In this paper, a semi-automatic method is proposed for the sequential reconstruction of the plume, by implementing a sequential data assimilation algorithm based on inverse modelling, with care taken to develop realistic methods for operational risk agencies. The performance of the assimilation scheme has been assessed through an intercomparison between the French and Finnish frameworks. Two dispersion models have been used: Polair3D and Silam, developed in two different research centres. Different release locations, as well as different meteorological situations, are tested. The existing and newly planned surveillance networks are used and realistically large multiplicative observational errors are assumed. The inverse modelling scheme accounts for the strong error bias encountered with such errors. The efficiency of the data assimilation system is tested via statistical indicators. For France and Finland, the average performance of the data assimilation system is strong. However, there are outlying situations where the inversion fails because the observability is too poor. In addition, in the case where the power plant responsible for the accidental release is not known, robust statistical tools are developed and tested to discriminate candidate release sites.
Maia, Adriana M; Baby, André Rolim; Pinto, Claudinéia A S O; Yasaka, Wilson J; Suenaga, Eunice; Kaneko, Telma M; Velasco, Maria Valéria Robles
2006-09-28
Vitamin C exerts several functions in the skin, such as collagen synthesis and depigmenting and antioxidant activity. Vitamin C is unstable in the presence of oxygen, luminosity, humidity, high temperatures and heavy metals, which presents a significant challenge to the development of cosmetic formulations. Therefore, the utilization of an effective antioxidant system is required to maintain vitamin C stability. The purpose of this research work was to develop prototypes of cosmetic formulations, an O/W emulsion and an extemporaneous aqueous gel, containing vitamin C and to evaluate the influence of sodium metabisulfite (SMB) and glutathione (GLT), as antioxidants, on the stability of the active substance. An HPLC stability-indicating method was developed and validated for this study; stability assays were performed over 90 and 26 days under storage conditions of 5.0+/-0.5, 24+/-2 and 40.0+/-0.5 degrees C. The HPLC stability-indicating method showed linearity (r(2)>0.99), specificity, R.S.D.<1.22% and accuracy/recovery ranging from 95.46 to 101.54%. Preparations with SMB or GLT and the antioxidant-free preparation gave statistically distinct results, demonstrating the necessity of adding an antioxidant system. O/W emulsions with SMB or GLT retained a vitamin C content >90.38% when stored at 5.0+/-0.5 and 24+/-2 degrees C. For the aqueous gel with SMB or GLT, the active substance concentration was maintained >94.03%. Considering vitamin C stability, SMB and GLT were shown to be statistically adequate antioxidants for the cosmetic formulations.
Njue, P Mwaniki; Cheboi, K Solomon; Shadrak, Oiye
2015-10-01
Despite the set guidelines on healthcare waste management in Kenya, mixing of different categories of waste, crude dumping and poor incineration are still common in public health facilities in Thika Subcounty, Kenya. Thika Subcounty generates 560 kilograms of healthcare waste daily, which poses a risk to the many patients (admission rate of 26%). This may pose a potential environmental risk and be a source of disease diffusion. This research explored adherence to healthcare waste management guidelines among nurses and waste handlers in health care facilities. This was a cross-sectional survey in which mixed methods were applied. A census and a proportionate random sampling method were used. Quantitative data were analyzed using the Statistical Package for Social Science (SPSS) version 20.0, while qualitative data were analyzed manually into themes. Full adherence to the seven waste disposal guidelines was low (16.3%). Knowledge of waste segregation, waste separation before disposal, and means of transport was statistically significantly related to adherence. The type of incinerator and its burning status, protection maintenance and the supply of adequate waste bins were also important to the adherence level. The adherence level was low (16.3%) and did not differ significantly between nurses and waste handlers. From this finding, compliance remains a key challenge. Strategies targeted at contextualizing waste regulations and guidelines into local settings are necessary and important. Policy makers may design and implement standard incinerators across all the health facilities. This study is not exhaustive; therefore, it is necessary to carry out a study linking poor treatment and disposal of clinical waste to purported health outcomes in Kenya.
Pérez-Báez, Wendy; García-Latorre, Ethel A; Maldonado-Martínez, Héctor Aquiles; Coronado-Martínez, Iris; Flores-García, Leonardo; Taja-Chayeb, Lucía
2017-10-01
Treatment in metastatic colorectal cancer (mCRC) has expanded with monoclonal antibodies targeting the epidermal growth factor receptor, but is restricted to patients with a wild-type (WT) KRAS mutational status. The most sensitive assays for KRAS mutation detection in formalin-fixed paraffin-embedded (FFPE) tissues are based on real-time PCR. Among them, high resolution melting analysis (HRMA) is a simple, fast, highly sensitive, specific and cost-effective method, proposed as an adjunct for KRAS mutation detection. However, the method used to categorize WT vs mutant sequences in HRMA is not clearly specified in the available studies, and the impact of FFPE artifacts on HRMA performance has not been addressed either. Samples deemed adequate from 104 consecutive mCRC patients were tested for KRAS mutations by Therascreen™ (FDA-validated test), HRMA, and HRMA with UDG pre-treatment to reverse FFPE fixation artifacts. Comparisons of KRAS status allocation among the three methods were done. Focusing on HRMA as a screening test, ROC curve analyses were performed for HRMA and HRMA-UDG against Therascreen™, in order to evaluate their discriminative power and to determine the threshold of profile concordance between the WT control and the sample for KRAS status determination. Comparing HRMA and HRMA-UDG against Therascreen™ as a surrogate gold standard, sensitivity was 1 for both HRMA and HRMA-UDG; specificity and positive predictive values were, respectively, 0.838 and 0.939, and 0.777 and 0.913. As evaluated by the McNemar test, HRMA-UDG allocated samples to a WT/mutated genotype in a significantly different way from HRMA (p < 0.001). On the other hand, HRMA-UDG did not differ from Therascreen™ (p = 0.125). ROC-curve analysis showed a significant discriminative power for both HRMA and HRMA-UDG against Therascreen™ (respectively, AUC of 0.978, p < 0.0001, 95% CI 0.957-0.999; and AUC of 0.98, p < 0.0001, 95% CI 0.000-1.0). For HRMA as a screening tool, the best threshold (degree of concordance between sample curves and the WT control) was attained at 92.14% for HRMA (specificity of 0.887) and at 92.55% for HRMA-UDG (specificity of 0.952). HRMA is a highly sensitive method for KRAS mutation detection, with apparently adequate and statistically significant discriminative power. FFPE sample fixation artifacts have an impact on HRMA results, so pre-treatment with UDG is strongly suggested for HRMA on FFPE samples. The choice of the threshold for melting curve concordance also has a great impact on HRMA performance. A threshold of 93% or greater might be adequate when using HRMA as a screening tool. Further validation of this threshold is required. Copyright © 2017 Elsevier Ltd. All rights reserved.
Pistorio, M L; Veroux, M; Corona, D; Sinagra, N; Giaquinta, A; Zerbo, D; Giacchi, F; Gagliano, M; Tallarita, T; Veroux, P; De Pasquale, C
2013-09-01
This study explored the personality characteristics within a sample of renal transplant patients, seeking to obtain a predictive index of likely clinical impact. The personality study was performed using the Structured Clinical Interview for Axis II Personality Disorders of the Diagnostic and Statistical Manual of Mental Disorders, fourth edition, text revision, in 60 recipients of kidney transplants from deceased donors. The prevailing personality trait among women was borderline, while among men it was predominantly obsessive-compulsive. The personality study proved to be a good index for predicting effects on the level of social adjustment. In this way, patients who show pathologic personality traits can be identified early in order to provide adequate psychologic-psychiatric support and follow-up. Copyright © 2013 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Zagouras, Athanassios; Argiriou, Athanassios A.; Flocas, Helena A.; Economou, George; Fotopoulos, Spiros
2012-11-01
Classification of weather maps at various isobaric levels has been used for many years as a methodological tool in several problems related to meteorology, climatology, atmospheric pollution and other fields. Initially the classification was performed manually. The criteria used by the person performing the classification are features of isobars or isopleths of geopotential height, depending on the type of maps to be classified. Although manual classifications integrate the perceptual experience and other unquantifiable qualities of the meteorology specialists involved, they are typically subjective and time consuming. Furthermore, during the last years different approaches to automated atmospheric circulation classification have been proposed, presenting automated, so-called objective classifications. In this paper a new method of atmospheric circulation classification of isobaric maps is presented. The method is based on graph theory. It starts with an intelligent prototype selection using an over-partitioning mode of the fuzzy c-means (FCM) algorithm, proceeds to a graph formulation for the entire dataset and produces the clusters based on the contemporary dominant sets clustering method. Graph theory is a novel mathematical approach that allows a more efficient representation of spatially correlated data, compared to the classical Euclidean space representation approaches used in conventional classification methods. The method has been applied to the classification of 850 hPa atmospheric circulation over the Eastern Mediterranean. The evaluation of the automated methods is performed with statistical indexes; results indicate that the classification is adequately comparable with other state-of-the-art automated map classification methods, for a variable number of clusters.
Qualitative evaluations and comparisons of six night-vision colorization methods
NASA Astrophysics Data System (ADS)
Zheng, Yufeng; Reese, Kristopher; Blasch, Erik; McManamon, Paul
2013-05-01
Current multispectral night vision (NV) colorization techniques can manipulate images to produce colorized images that closely resemble natural scenes. The colorized NV images can enhance human perception by improving observer object classification and reaction times, especially for low light conditions. This paper focuses on the qualitative (subjective) evaluations and comparisons of six NV colorization methods. The multispectral images include visible (Red-Green-Blue), near infrared (NIR), and long wave infrared (LWIR) images. The six colorization methods are channel-based color fusion (CBCF), statistic matching (SM), histogram matching (HM), joint-histogram matching (JHM), statistic matching then joint-histogram matching (SM-JHM), and the lookup table (LUT). Four categories of quality measurements are used for the qualitative evaluations: contrast, detail, colorfulness, and overall quality. The score of each measurement is rated on a 1 to 3 scale to represent low, average, and high quality, respectively. Specifically, high contrast (rated score 3) means an adequate level of brightness and contrast. High detail represents high clarity of detailed contents while maintaining low artifacts. High colorfulness preserves more natural colors (i.e., closely resembles the daylight image). Overall quality is determined from the NV image compared to the reference image. Nine sets of multispectral NV images were used in our experiments. For each set, the six colorized NV images (produced from NIR and LWIR images) were concurrently presented to users along with the reference color (RGB) image (taken at daytime). A total of 67 subjects passed a screening test ("Ishihara Color Blindness Test") and were asked to evaluate the 9 sets of colorized images. The experimental results showed the quality order of colorization methods from the best to the worst: CBCF < SM < SM-JHM < LUT < JHM < HM. It is anticipated that this work will provide a benchmark for NV colorization and for quantitative evaluation using an objective metric such as the objective evaluation index (OEI).
Souto Bayarri, M; Masip Capdevila, L; Remuiñan Pereira, C; Suárez-Cuenca, J J; Martínez Monzonís, A; Couto Pérez, M I; Carreira Villamor, J M
2015-01-01
To compare the methods of right ventricle segmentation in the short-axis and 4-chamber planes in cardiac magnetic resonance imaging and to correlate the findings with those of the tricuspid annular plane systolic excursion (TAPSE) method in echocardiography. We used a 1.5 T MRI scanner to study 26 patients with diverse cardiovascular diseases. In all MRI studies, we obtained cine-mode images from the base to the apex in both the short-axis and 4-chamber planes using steady-state free precession sequences and 6 mm thick slices. In all patients, we quantified the end-diastolic volume, end-systolic volume, and the ejection fraction of the right ventricle. On the same day as the cardiac magnetic resonance imaging study, 14 patients also underwent echocardiography with TAPSE calculation of right ventricular function. No statistically significant differences were found in the volumes and function of the right ventricle calculated using the 2 segmentation methods. The correlation between the volume estimations by the two segmentation methods was excellent (r=0.95); the correlation for the ejection fraction was slightly lower (r=0.8). The correlation between the cardiac magnetic resonance imaging estimate of right ventricular ejection fraction and TAPSE was very low (r=0.2, P<.01). Both ventricular segmentation methods quantify right ventricular function adequately. The correlation with the echocardiographic method is low. Copyright © 2012 SERAM. Published by Elsevier España, S.L.U. All rights reserved.
Fletcher, Jack M.; Stuebing, Karla K.; Barth, Amy E.; Miciak, Jeremy; Francis, David J.; Denton, Carolyn A.
2013-01-01
Purpose Agreement across methods for identifying students as inadequate responders or as learning disabled is often poor. We report (1) an empirical examination of final status (post-intervention benchmark) methods and dual-discrepancy methods based on growth during the intervention plus final status for assessing response to intervention; and (2) a statistical simulation of psychometric issues that may explain low agreement. Methods After a Tier 2 intervention, final status benchmark criteria were used to identify 104 inadequate and 85 adequate responders to intervention, with comparisons of agreement and coverage for these methods and a dual-discrepancy method. Factors affecting agreement were investigated using computer simulation to manipulate reliability, the intercorrelation between measures, cut points, normative samples, and sample size. Results Identification of inadequate responders based on individual measures showed that single measures tended not to identify many members of the pool of 104 inadequate responders. Poor to fair levels of agreement for identifying inadequate responders were apparent between pairs of measures. In the simulation, comparisons across two simulated measures generated indices of agreement (kappa) that were generally low because of multiple psychometric issues inherent in any test. Conclusions Expecting excellent agreement between two correlated tests with even small amounts of unreliability may not be realistic. Assessing outcomes based on multiple measures, such as level of CBM performance and short norm-referenced assessments of fluency, may improve the reliability of diagnostic decisions. PMID:25364090
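The abstract summarizes a simulation of how measurement unreliability and cut points drive down agreement (kappa) between two correlated measures; the sketch below reproduces that general idea under assumed values (true correlation, reliability, a 25th-percentile cut point), not the authors' actual simulation parameters.

```python
import numpy as np

def simulate_kappa(n=100000, true_r=0.75, reliability=0.85, cut_pct=25, seed=1):
    """Simulate agreement (Cohen's kappa) between two fallible measures of
    correlated true scores, each dichotomized at a percentile cut point."""
    rng = np.random.default_rng(seed)
    # Correlated true scores for the two constructs.
    cov = [[1.0, true_r], [true_r, 1.0]]
    true_scores = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    # Add measurement error so each observed score has the stated reliability.
    err_sd = np.sqrt(1.0 / reliability - 1.0)
    observed = true_scores + rng.normal(0.0, err_sd, size=(n, 2))
    # Classify "inadequate responder" as scoring below the cut point on each measure.
    cuts = np.percentile(observed, cut_pct, axis=0)
    flags = observed < cuts
    # Cohen's kappa from observed vs chance agreement.
    p_obs = np.mean(flags[:, 0] == flags[:, 1])
    p1, p2 = flags.mean(axis=0)
    p_chance = p1 * p2 + (1 - p1) * (1 - p2)
    return (p_obs - p_chance) / (1 - p_chance)

print(round(simulate_kappa(), 3))
```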
Advances in borehole geophysics for hydrology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, P.H.
1982-01-01
Borehole geophysical methods provide vital subsurface information on rock properties, fluid movement, and the condition of engineered borehole structures. Within the first category, salient advances include the continuing improvement of the borehole televiewer, refinement of the electrical conductivity dipmeter for fracture characterization, and the development of a gigahertz-frequency electromagnetic propagation tool for water saturation measurements. The exploration of the rock mass between boreholes remains a challenging problem with high potential; promising methods are now incorporating high-density spatial sampling and sophisticated data processing. Flow-rate measurement methods appear adequate for all but low-flow situations. At low rates the tagging method seems the most attractive. The current exploitation of neutron-activation techniques for tagging means that the wellbore fluid itself is tagged, thereby eliminating the mixing of an alien fluid into the wellbore. Another method uses the acoustic noise generated by flow through constrictions and in and behind casing to detect and locate flaws in the production system. With the advent of field-recorded digital data, the interpretation of logs from sedimentary sequences is now reaching a sophisticated level with the aid of computer processing and the application of statistical methods. Lagging behind are interpretive schemes for the low-porosity, fracture-controlled igneous and metamorphic rocks encountered in geothermal reservoirs and in potential waste-storage sites. Progress is being made on the general problem of fracture detection by use of electrical and acoustical techniques, but the reliable definition of permeability continues to be an elusive goal.
Gaudin, Valerie; Juhel-Gaugain, Murielle; Morétain, Jean-Pierre; Sanders, Pascal
2008-12-01
Premi Test contains viable spores of a strain of Bacillus stearothermophilus which is sensitive to antimicrobial residues, such as beta-lactams, tetracyclines, macrolides and sulphonamides. The growth of the strain is inhibited by the presence of antimicrobial residues in muscle tissue samples. Premi Test was validated according to AFNOR rules (French Association for Normalisation). The AFNOR validation was based on the comparison of reference methods (French Official method, i.e. four plate test (FPT) and the STAR protocol (five plate test)) with the alternative method (Premi Test). A preliminary study was conducted in an expert laboratory (Community Reference Laboratory, CRL) on both spiked and incurred samples (field samples). Several method performance criteria (sensitivity, specificity, relative accuracy) were estimated and are discussed, in addition to detection capabilities. Adequate agreement was found between the alternative method and the reference methods. However, Premi Test was more sensitive to beta-lactams and sulphonamides than the FPT. Subsequently, a collaborative study with 11 laboratories was organised by the CRL. Blank and spiked meat juice samples were sent to participants. The expert laboratory (CRL) statistically analysed the results. It was concluded that Premi Test could be used for the routine determination of antimicrobial residues in muscle of different animal origin with acceptable analytical performance. The detection capabilities of Premi Test for beta-lactams (amoxicillin, ceftiofur), one macrolide (tylosin) and tetracycline were at the level of the respective maximum residue limits (MRL) in muscle samples or even lower.
Model-Based Normalization of a Fractional-Crystal Collimator for Small-Animal PET Imaging
Li, Yusheng; Matej, Samuel; Karp, Joel S.; Metzler, Scott D.
2017-01-01
Previously, we proposed to use a coincidence collimator to achieve fractional-crystal resolution in PET imaging. We have designed and fabricated a collimator prototype for a small-animal PET scanner, A-PET. To compensate for imperfections in the fabricated collimator prototype, collimator normalization, as well as scanner normalization, is required to reconstruct quantitative and artifact-free images. In this study, we develop a normalization method for the collimator prototype based on the A-PET normalization using a uniform cylinder phantom. We performed data acquisition without the collimator for scanner normalization first, and then with the collimator from eight different rotation views for collimator normalization. After a reconstruction without correction, we extracted the cylinder parameters from which we generated expected emission sinograms. Single scatter simulation was used to generate the scattered sinograms. We used the least-squares method to generate the normalization coefficient for each LOR based on measured, expected and scattered sinograms. The scanner and collimator normalization coefficients were factorized by performing two normalizations separately. The normalization methods were also verified using experimental data acquired from A-PET with and without the collimator. In summary, we developed a model-based collimator normalization that can significantly reduce variance and produce collimator normalization with adequate statistical quality within feasible scan time. PMID:29270539
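The abstract states that normalization coefficients were generated for each line of response (LOR) by least squares from measured, expected, and scattered sinograms, without giving the exact objective; the sketch below shows one plausible closed-form per-LOR least-squares fit across the eight rotation views, with the array names and the form of the fit assumed for illustration rather than taken from the paper.

```python
import numpy as np

def per_lor_normalization(measured, expected, scattered, eps=1e-12):
    """Assumed per-LOR least-squares normalization.

    measured, expected, scattered: arrays of shape (n_views, n_lors) holding
    the measured, expected (true) and simulated scattered sinograms for each
    rotation view of the collimator.

    For each LOR we fit a single coefficient c minimizing
        sum_views (c * measured - (expected + scattered))**2,
    i.e. a scale factor mapping the measurement onto the model for that LOR.
    """
    target = expected + scattered
    num = np.sum(measured * target, axis=0)
    den = np.sum(measured * measured, axis=0) + eps
    return num / den

# Hypothetical toy sinograms: 8 views, 1000 LORs.
rng = np.random.default_rng(2)
expected = rng.uniform(50, 100, size=(8, 1000))
scattered = 0.1 * expected
true_coeff = rng.uniform(0.8, 1.2, size=1000)
measured = rng.poisson((expected + scattered) / true_coeff).astype(float)
print(per_lor_normalization(measured, expected, scattered)[:5])
```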
Anderson, Samantha F; Maxwell, Scott E
2017-01-01
Psychology is undergoing a replication crisis. The discussion surrounding this crisis has centered on mistrust of previous findings. Researchers planning replication studies often use the original study sample effect size as the basis for sample size planning. However, this strategy ignores uncertainty and publication bias in estimated effect sizes, resulting in overly optimistic calculations. A psychologist who intends to obtain power of .80 in the replication study, and performs calculations accordingly, may have an actual power lower than .80. We performed simulations to reveal the magnitude of the difference between actual and intended power based on common sample size planning strategies and assessed the performance of methods that aim to correct for effect size uncertainty and/or bias. Our results imply that even if original studies reflect actual phenomena and were conducted in the absence of questionable research practices, popular approaches to designing replication studies may result in a low success rate, especially if the original study is underpowered. Methods correcting for bias and/or uncertainty generally had higher actual power, but were not a panacea for an underpowered original study. Thus, it becomes imperative that 1) original studies are adequately powered and 2) replication studies are designed with methods that are more likely to yield the intended level of power.
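The abstract describes simulations of the gap between intended and actual power when a replication's sample size is planned from the original study's observed effect size; the short sketch below illustrates that logic for a two-sample t-test, with the true effect, original sample size, and target power chosen arbitrarily rather than taken from the paper.

```python
import numpy as np
from statsmodels.stats.power import TTestIndPower

def replication_power_gap(true_d=0.3, n_orig=30, target_power=0.8, n_sims=2000, seed=3):
    """Average actual power of replications whose sample size is planned
    from the original study's estimated effect size (ignoring uncertainty)."""
    rng = np.random.default_rng(seed)
    solver = TTestIndPower()
    actual_powers = []
    for _ in range(n_sims):
        # Original two-group study with true standardized effect true_d.
        g1 = rng.normal(0.0, 1.0, n_orig)
        g2 = rng.normal(true_d, 1.0, n_orig)
        d_hat = (g2.mean() - g1.mean()) / np.sqrt((g1.var(ddof=1) + g2.var(ddof=1)) / 2)
        if d_hat <= 0.01:
            continue  # planning is impossible from a null/negative estimate
        # Plan the replication sample size from the (noisy) estimate d_hat ...
        n_rep = solver.solve_power(effect_size=d_hat, power=target_power, alpha=0.05)
        # ... but the power actually achieved depends on the true effect.
        actual_powers.append(solver.power(effect_size=true_d, nobs1=np.ceil(n_rep), alpha=0.05))
    return np.mean(actual_powers)

print(round(replication_power_gap(), 3))  # typically well below the intended 0.80
```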
A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events
NASA Astrophysics Data System (ADS)
Kholodovsky, V.
2017-12-01
Extreme weather and climate events such as heavy precipitation, heat waves and strong winds can cause extensive damage to society in terms of human lives and financial losses. As the climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often used independently to model those phenomena. To better assess the performance of climate models, a variety of spatial forecast verification methods have been developed. However, spatial verification metrics that are widely used in comparing mean states, in most cases, do not have an adequate theoretical justification for benchmarking extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) threshold clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying the threshold distribution seasonally, monthly and annually. The method offers the user the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high-threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.
Wei, Yaqiang; Dong, Yanhui; Yeh, Tian-Chyi J; Li, Xiao; Wang, Liheng; Zha, Yuanyuan
2017-11-01
There have been widespread concerns about solute transport problems in fractured media, e.g. the disposal of high-level radioactive waste in geological fractured rocks. Numerical simulation of particle tracking is gradually being employed to address these issues. Traditional predictions of radioactive waste transport using discrete fracture network (DFN) models often consider one particular realization of the fracture distribution based on fracture statistics. This significantly underestimates the uncertainty in the risk evaluation of a radioactive waste repository. To adequately assess the uncertainty of DFN modeling at a potential site for the disposal of high-level radioactive waste, this paper utilized the probabilistic distribution method (PDM). The method was applied to evaluate the risk of nuclear waste disposal in Beishan, China. Moreover, the impact of the number of realizations on the simulation results was analyzed. In particular, the differences between the modeling results of one realization and multiple realizations were demonstrated. Probabilistic distributions of 20 realizations at different times were also obtained. The results showed that the employed PDM can be used to describe the ranges of contaminant particle transport. After 5E6 days, the high-probability contaminated areas near the release point were more concentrated than the farther areas, covering 25,400 m².
Allele-specific copy-number discovery from whole-genome and whole-exome sequencing
Wang, WeiBo; Wang, Wei; Sun, Wei; Crowley, James J.; Szatkiewicz, Jin P.
2015-01-01
Copy-number variants (CNVs) are a major form of genetic variation and a risk factor for various human diseases, so it is crucial to accurately detect and characterize them. It is conceivable that allele-specific reads from high-throughput sequencing data could be leveraged to both enhance CNV detection and produce allele-specific copy number (ASCN) calls. Although statistical methods have been developed to detect CNVs using whole-genome sequence (WGS) and/or whole-exome sequence (WES) data, information from allele-specific read counts has not yet been adequately exploited. In this paper, we develop an integrated method, called AS-GENSENG, which incorporates allele-specific read counts in CNV detection and estimates ASCN using either WGS or WES data. To evaluate the performance of AS-GENSENG, we conducted extensive simulations, generated empirical data using existing WGS and WES data sets and validated predicted CNVs using an independent methodology. We conclude that AS-GENSENG not only predicts accurate ASCN calls but also improves the accuracy of total copy number calls, owing to its unique ability to exploit information from both total and allele-specific read counts while accounting for various experimental biases in sequence data. Our novel, user-friendly and computationally efficient method and a complete analytic protocol are freely available at https://sourceforge.net/projects/asgenseng/. PMID:25883151
Statistical distribution of mechanical properties for three graphite-epoxy material systems
NASA Technical Reports Server (NTRS)
Reese, C.; Sorem, J., Jr.
1981-01-01
Graphite-epoxy composites are playing an increasing role as viable alternative materials in structural applications, necessitating thorough investigation into the predictability and reproducibility of their material strength properties. This investigation was concerned with tension, compression, and short beam shear coupon testing of large samples from three different material suppliers to determine their statistical strength behavior. Statistical results indicate that a two-parameter Weibull distribution model provides better overall characterization of material behavior for the graphite-epoxy systems tested than does the standard Normal distribution model that is employed for most design work. While either a Weibull or Normal distribution model provides adequate predictions for average strength values, the Weibull model provides better characterization in the lower tail region, where the predictions are of maximum design interest. The two sets of the same material were found to have essentially the same material properties, indicating that repeatability can be achieved.
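The abstract compares two-parameter Weibull and Normal models for coupon strength data, with emphasis on the lower tail; as a rough illustration, the sketch below fits both models to a synthetic strength sample and compares a low design quantile (an arbitrary 1st percentile) under each model. The data and quantile level are assumptions, not the tested graphite-epoxy results.

```python
import numpy as np
from scipy import stats

# Hypothetical tensile-strength sample (ksi) standing in for coupon test data.
rng = np.random.default_rng(4)
strength = rng.weibull(a=20.0, size=200) * 220.0   # shape 20, scale 220

# Two-parameter Weibull fit (location fixed at zero) and Normal fit.
shape, _, scale = stats.weibull_min.fit(strength, floc=0.0)
mu, sigma = stats.norm.fit(strength)

# Compare a lower-tail design quantile under the two fitted models.
q = 0.01
weibull_q = stats.weibull_min.ppf(q, shape, loc=0.0, scale=scale)
normal_q = stats.norm.ppf(q, mu, sigma)
print(f"Weibull 1st percentile: {weibull_q:.1f}  Normal 1st percentile: {normal_q:.1f}")
```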
Statistical inference involving binomial and negative binomial parameters.
García-Pérez, Miguel A; Núñez-Antón, Vicente
2009-05-01
Statistical inference about two binomial parameters implies that they are both estimated by binomial sampling. There are occasions in which one aims at testing the equality of two binomial parameters before and after the occurrence of the first success along a sequence of Bernoulli trials. In these cases, the binomial parameter before the first success is estimated by negative binomial sampling whereas that after the first success is estimated by binomial sampling, and both estimates are related. This paper derives statistical tools to test two hypotheses, namely, that both binomial parameters equal some specified value and that both parameters are equal though unknown. Simulation studies are used to show that in small samples both tests are accurate in keeping the nominal Type-I error rates, and also to determine sample size requirements to detect large, medium, and small effects with adequate power. Additional simulations also show that the tests are sufficiently robust to certain violations of their assumptions.
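The abstract does not reproduce the derived tests, so the sketch below only illustrates the setting it describes with a generic asymptotic likelihood-ratio test: the parameter before the first success is estimated from the (negative binomial/geometric) waiting time, the parameter after it from binomial counts, and equality of the two is tested. This is a stand-in illustration under assumed data, not the authors' procedure.

```python
import numpy as np
from scipy import stats

def lrt_equal_p(t_first_success, x_after, n_after):
    """Generic likelihood-ratio test that the success probability is the same
    before and after the first success in a Bernoulli sequence.

    t_first_success: number of trials needed to obtain the first success (geometric).
    x_after, n_after: successes and trials observed after the first success (binomial).
    """
    def loglik(p1, p2):
        return (np.log(p1) + (t_first_success - 1) * np.log(1 - p1)
                + x_after * np.log(p2) + (n_after - x_after) * np.log(1 - p2))

    # Unrestricted MLEs (clipped away from 0/1 to keep the log-likelihood finite).
    p1_hat = np.clip(1.0 / t_first_success, 1e-9, 1 - 1e-9)
    p2_hat = np.clip(x_after / n_after, 1e-9, 1 - 1e-9)
    # Restricted MLE under H0: common p from the pooled counts.
    p0_hat = np.clip((1 + x_after) / (t_first_success + n_after), 1e-9, 1 - 1e-9)

    lr_stat = 2.0 * (loglik(p1_hat, p2_hat) - loglik(p0_hat, p0_hat))
    return lr_stat, stats.chi2.sf(lr_stat, df=1)

# Empirical Type-I error under H0 (p = 0.3 before and after the first success).
rng = np.random.default_rng(5)
rejections = 0
for _ in range(5000):
    t = rng.geometric(0.3)             # trials up to and including the first success
    x = rng.binomial(40, 0.3)          # successes in 40 subsequent trials
    _, pval = lrt_equal_p(t, x, 40)
    rejections += pval < 0.05
print(rejections / 5000)
```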
NASA Technical Reports Server (NTRS)
Moin, Parviz; Reynolds, William C.
1988-01-01
Lagrangian techniques have found widespread application in the prediction and understanding of turbulent transport phenomena and have yielded satisfactory results for different cases of shear flow problems. However, it must be kept in mind that in most experiments what is really available are Eulerian statistics, and it is far from obvious how to extract from them the information relevant to the Lagrangian behavior of the flow; in consequence, Lagrangian models still include some hypotheses for which no adequate supporting evidence has been available until now. Direct numerical simulation of turbulence offers a new way to obtain Lagrangian statistics and thus verify the validity of the current predictive models and the accuracy of their results. After the pioneering work of Riley (Riley and Patterson, 1974) in the 1970s, some such results have recently appeared in the literature (Lee et al.; Yeung and Pope). The present contribution follows in part similar lines, but focuses on two-particle statistics and comparison with existing models.
Townsend, E; Hawton, K; Altman, D G; Arensman, E; Gunnell, D; Hazell, P; House, A; Van Heeringen, K
2001-08-01
Brief problem-solving therapy is regarded as a pragmatic treatment for deliberate self-harm (DSH) patients. A recent meta-analysis of randomized controlled trials (RCTs) evaluating this approach indicated a trend towards reduced repetition of DSH, but the pooled odds ratio was not statistically significant. We have now examined other important outcomes using this procedure, namely depression, hopelessness and improvement in problems. Six trials in which problem-solving therapy was compared with control treatment were identified from an extensive literature review of RCTs of treatments for DSH patients. Data concerning depression, hopelessness and improvement in problems were extracted. Where relevant statistical data (e.g. standard deviations) were missing, they were imputed using various statistical methods. Results were pooled using meta-analytical procedures. At follow-up, patients who were offered problem-solving therapy had significantly greater improvement in scores for depression (standardized mean difference = -0.36; 95% CI -0.61 to -0.11) and hopelessness (weighted mean difference = -3.2; 95% CI -4.0 to -2.41), and significantly more reported improvement in their problems (odds ratio = 2.31; 95% CI 1.29 to 4.13), than patients who were in the control treatment groups. Problem-solving therapy for DSH patients appears to produce better results than control treatment with regard to improvement in depression, hopelessness and problems. It is desirable that this finding be confirmed in a large trial, which would also allow adequate testing of the impact of this treatment on repetition of DSH.
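The abstract reports pooled standardized mean differences with 95% confidence intervals but not the pooling computation; the sketch below shows a generic fixed-effect inverse-variance pooling of per-trial standardized mean differences, using made-up trial summaries rather than the six reviewed trials.

```python
import numpy as np

def pooled_smd(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Fixed-effect inverse-variance pooling of standardized mean differences
    (Cohen's d per trial with the usual large-sample variance approximation)."""
    mean_t, sd_t, n_t = map(np.asarray, (mean_t, sd_t, n_t))
    mean_c, sd_c, n_c = map(np.asarray, (mean_c, sd_c, n_c))
    sd_pooled = np.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sd_pooled
    var_d = (n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c))
    w = 1.0 / var_d
    d_pooled = np.sum(w * d) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return d_pooled, (d_pooled - 1.96 * se, d_pooled + 1.96 * se)

# Made-up depression-score summaries (therapy vs control) for three trials.
print(pooled_smd(mean_t=[10.2, 11.5, 9.8], sd_t=[4.0, 5.1, 3.8], n_t=[30, 45, 25],
                 mean_c=[12.0, 12.9, 11.0], sd_c=[4.2, 5.0, 4.1], n_c=[28, 47, 26]))
```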
Generalized theory of semiflexible polymers.
Wiggins, Paul A; Nelson, Philip C
2006-03-01
DNA bending on length scales shorter than a persistence length plays an integral role in the translation of genetic information from DNA to cellular function. Quantitative experimental studies of these biological systems have led to a renewed interest in the polymer mechanics relevant for describing the conformational free energy of DNA bending induced by protein-DNA complexes. Recent experimental results from DNA cyclization studies have cast doubt on the applicability of the canonical semiflexible polymer theory, the wormlike chain (WLC) model, to DNA bending on biologically relevant length scales. This paper develops a theory of the chain statistics of a class of generalized semiflexible polymer models. Our focus is on the theoretical development of these models and the calculation of experimental observables. To illustrate our methods, we focus on a specific, illustrative model of DNA bending. We show that the WLC model generically describes the long-length-scale chain statistics of semiflexible polymers, as predicted by renormalization group arguments. In particular, we show that either the WLC or our present model adequately describes force-extension, solution scattering, and long-contour-length cyclization experiments, regardless of the details of DNA bend elasticity. In contrast, experiments sensitive to short-length-scale chain behavior can in principle reveal dramatic departures from the linear elastic behavior assumed in the WLC model. We demonstrate this explicitly by showing that our toy model can reproduce the anomalously large short-contour-length cyclization factors recently measured by Cloutier and Widom. Finally, we discuss the applicability of these models to DNA chain statistics in the context of future experiments.
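For readers unfamiliar with the canonical model being generalized, the following standard relations summarize the wormlike chain (WLC) referred to in the abstract: its bending energy is quadratic in curvature, and the tangent-tangent correlation decays exponentially with the persistence length. These are textbook WLC expressions, not the paper's generalized model.

```latex
% Standard WLC bending energy and tangent correlation (3D), persistence length \ell_p:
\frac{E_{\mathrm{WLC}}}{k_B T} \;=\; \frac{\ell_p}{2}\int_0^L \left|\frac{\partial \hat{t}(s)}{\partial s}\right|^2 \mathrm{d}s,
\qquad
\langle \hat{t}(s)\cdot\hat{t}(0)\rangle \;=\; e^{-s/\ell_p}.
```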
2013-01-01
Background As a result of changes in climatic conditions and greater resistance to insecticides, many regions across the globe, including Colombia, have been facing a resurgence of vector-borne diseases, and dengue fever in particular. Timely information on both (1) the spatial distribution of the disease, and (2) prevailing vulnerabilities of the population are needed to adequately plan targeted preventive intervention. We propose a methodology for the spatial assessment of current socioeconomic vulnerabilities to dengue fever in Cali, a tropical urban environment of Colombia. Methods Based on a set of socioeconomic and demographic indicators derived from census data and ancillary geospatial datasets, we develop a spatial approach for both expert-based and purely statistical-based modeling of current vulnerability levels across 340 neighborhoods of the city using a Geographic Information System (GIS). The results of both approaches are comparatively evaluated by means of spatial statistics. A web-based approach is proposed to facilitate the visualization and the dissemination of the output vulnerability index to the community. Results The statistical and the expert-based modeling approach exhibit a high concordance, globally, and spatially. The expert-based approach indicates a slightly higher vulnerability mean (0.53) and vulnerability median (0.56) across all neighborhoods, compared to the purely statistical approach (mean = 0.48; median = 0.49). Both approaches reveal that high values of vulnerability tend to cluster in the eastern, north-eastern, and western part of the city. These are poor neighborhoods with high percentages of young (i.e., < 15 years) and illiterate residents, as well as a high proportion of individuals being either unemployed or doing housework. Conclusions Both modeling approaches reveal similar outputs, indicating that in the absence of local expertise, statistical approaches could be used, with caution. By decomposing identified vulnerability “hotspots” into their underlying factors, our approach provides valuable information on both (1) the location of neighborhoods, and (2) vulnerability factors that should be given priority in the context of targeted intervention strategies. The results support decision makers to allocate resources in a manner that may reduce existing susceptibilities and strengthen resilience, and thus help to reduce the burden of vector-borne diseases. PMID:23945265
Han, Kyunghwa; Jung, Inkyung
2018-05-01
This review article presents an assessment of trends in statistical methods and an evaluation of their appropriateness in articles published in the Archives of Plastic Surgery (APS) from 2012 to 2017. We reviewed 388 original articles published in APS between 2012 and 2017. We categorized the articles that used statistical methods according to the type of statistical method, the number of statistical methods, and the type of statistical software used. We checked whether there were errors in the description of statistical methods and results. A total of 230 articles (59.3%) published in APS between 2012 and 2017 used one or more statistical method. Within these articles, there were 261 applications of statistical methods with continuous or ordinal outcomes, and 139 applications of statistical methods with categorical outcome. The Pearson chi-square test (17.4%) and the Mann-Whitney U test (14.4%) were the most frequently used methods. Errors in describing statistical methods and results were found in 133 of the 230 articles (57.8%). Inadequate description of P-values was the most common error (39.1%). Among the 230 articles that used statistical methods, 71.7% provided details about the statistical software programs used for the analyses. SPSS was predominantly used in the articles that presented statistical analyses. We found that the use of statistical methods in APS has increased over the last 6 years. It seems that researchers have been paying more attention to the proper use of statistics in recent years. It is expected that these positive trends will continue in APS.
First "glass" education: telementored cardiac ultrasonography using Google Glass- a pilot study.
Russell, Patrick M; Mallin, Michael; Youngquist, Scott T; Cotton, Jennifer; Aboul-Hosn, Nael; Dawson, Matt
2014-11-01
The objective of this study was to determine the feasibility of telementored instruction in bedside ultrasonography (US) using Google Glass. The authors sought to examine whether first-time US users could obtain adequate parasternal long axis (PSLA) views to approximate ejection fraction (EF) using Google Glass telementoring. This was a prospective, randomized, single-blinded study. Eighteen second-year medical students were randomized into three groups and tasked with obtaining PSLA cardiac imaging. Group A received real-time telementored education through Google Glass via Google Hangout from a remotely located expert. Group B received bedside education from the same expert. Group C represented the control and received no instruction. Each subject was given 3 minutes to obtain the best possible PSLA cardiac image using a portable GE Vscan. Image clips obtained by each subject were stored. A second expert, blinded to the instructional mode, evaluated the images for adequacy and assigned an image quality rating on a 0 to 10 scale. Group A was able to obtain adequate images six out of six times (100%), with a median image quality rating of 7.5 (interquartile range [IQR] = 6 to 10) out of 10. Group B was also able to obtain adequate views six out of six times (100%), with a median image quality rating of 8 (IQR = 7 to 9). Group C was able to obtain adequate views one out of six times (17%), with a median image quality of 0 (IQR = 0 to 2). There were no statistically significant differences between Group A and Group B in the achievement of adequate images for E-point septal separation measurement or in image quality. In this pilot/feasibility study, novice US users were able to obtain adequate imaging to determine a healthy patient's EF through telementored education using Google Glass. These preliminary data suggest telementoring is an adequate means of medical education in bedside US. This conclusion will need to be validated with larger, more powerful studies, including evaluation of pathologic findings and varying body habitus among models. © 2014 by the Society for Academic Emergency Medicine.
Design of a sediment data-collection program in Kansas as affected by time trends
Jordan, P.R.
1985-01-01
Data collection programs need to be re-examined periodically in order to ensure their usefulness, efficiency, and applicability. The possibility of time trends in sediment concentration, in particular, makes examination with new statistical techniques desirable. After adjusting sediment concentrations for their relation to streamflow rates and by using a seasonal adaptation of Kendall's nonparametric statistical test, time trends of flow-adjusted concentrations were detected for 11 of the 38 sediment records tested that were not affected by large reservoirs. Ten of the 11 trends were toward smaller concentrations; only 1 was toward larger concentrations. Of the apparent trends that were not statistically significant (0.05 level) using available data, nearly all were toward smaller concentrations. Because the reason for the lack of statistical significance of an apparent trend may be inadequacy of data rather than absence of trend, and because of the prevalence of apparent trends in one direction, the assumption was made that a time trend may be present at any station. This assumption can significantly affect the design of a sediment data collection program. Sudden decreases (step trends) in flow-adjusted sediment concentrations were found at all stations that were short distances downstream from large reservoirs and that had adequate data for a seasonal adaptation of Wilcoxon's nonparametric statistical test. Examination of sediment records in the 1984 data collection program of the Kansas Water Office indicated 13 stations that can be discontinued temporarily because data are now adequate. Data collection could be resumed in 1992, when new data may be needed because of possible time trends. New data are needed at eight previously operated stations where existing data may be inadequate or misleading because of time trends. Operational changes may be needed at some stations, such as hiring contract observers or installing automatic pumping samplers. Implementing the changes in the program can provide a substantial increase in the quantity of useful information on stream sediment for the same funding as the 1984 level. (Author's abstract)
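The abstract relies on a seasonal adaptation of Kendall's nonparametric trend test applied to flow-adjusted concentrations; the sketch below shows a bare-bones seasonal Mann-Kendall statistic (no tie or serial-correlation corrections) on a made-up monthly series, as an illustration of the approach rather than the survey's implementation.

```python
import numpy as np
from scipy import stats

def seasonal_kendall(values, seasons):
    """Bare-bones seasonal Mann-Kendall trend test (no tie correction).

    values: sequence of flow-adjusted concentrations in time order.
    seasons: season label (e.g. month 1-12) for each value.
    Returns the combined S statistic, z score, and two-sided p-value.
    """
    values, seasons = np.asarray(values, float), np.asarray(seasons)
    s_total, var_total = 0.0, 0.0
    for season in np.unique(seasons):
        x = values[seasons == season]
        n = len(x)
        if n < 2:
            continue
        # Kendall's S within the season: sign of all pairwise later-minus-earlier differences.
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        s_total += s
        var_total += n * (n - 1) * (2 * n + 5) / 18.0
    z = (s_total - np.sign(s_total)) / np.sqrt(var_total) if var_total > 0 else 0.0
    return s_total, z, 2 * stats.norm.sf(abs(z))

# Made-up 10 years of monthly flow-adjusted concentrations with a slight downward trend.
rng = np.random.default_rng(6)
years = np.repeat(np.arange(10), 12)
months = np.tile(np.arange(1, 13), 10)
conc = 100 - 1.5 * years + rng.normal(0, 5, size=120)
print(seasonal_kendall(conc, months))
```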
A simple and effective solution to the constrained QM/MM simulations
NASA Astrophysics Data System (ADS)
Takahashi, Hideaki; Kambe, Hiroyuki; Morita, Akihiro
2018-04-01
It is a promising extension of the quantum mechanical/molecular mechanical (QM/MM) approach to incorporate the solvent molecules surrounding the QM solute into the QM region to ensure an adequate description of the electronic polarization of the solute. However, the solvent molecules in the QM region inevitably diffuse into the MM bulk during the QM/MM simulation. In this article, we developed a simple and efficient method, referred to as the "boundary constraint with correction (BCC)," to prevent the diffusion of the solvent water molecules by means of a constraint potential. The point of the BCC method is to compensate for the error in a statistical property due to the bias potential by adding a correction term obtained through a set of QM/MM simulations. The BCC method is designed so that the effect of the bias potential completely vanishes when the QM solvent is identical with the MM solvent. Furthermore, the desirable conditions, that is, the continuities of energy and force and the conservations of energy and momentum, are fulfilled in principle. We applied the QM/MM-BCC method to a hydronium ion (H3O+) in aqueous solution to construct the radial distribution function (RDF) of the solvent around the solute. It was demonstrated that the correction term compensated fairly well for the error and brought the RDF into good agreement with the result given by an ab initio molecular dynamics simulation.
El-Damanhoury, Hatem M; Gaintantzopoulou, Marianna
2016-01-01
To evaluate the effect of immediate dentin sealing and of the optical powder removal method on the fracture resistance of CAD/CAM-fabricated ceramic endocrowns. Seventy-eight extracted premolars were endodontically treated. Standardized endocrown preparations were done in 60 teeth. Teeth were divided equally (n = 10) according to the dentin treatment (delayed sealing [DS] or immediate sealing [IS]) and the method of optical powder removal (air-water spray washing [AW]; microabrasion [MA]; or aqueous suspension of pumice [PB], followed by air-water spray washing). After cementation, specimens were thermocycled (5,000 cycles, 5°C/50°C) and stored in distilled water for 1 week. Specimens were loaded in compression using a universal testing machine until failure. Failure load was recorded, and modes of failure were examined under a stereomicroscope. Micromorphological evaluation of the different dentin treatments was done under SEM (n = 3). Results were analyzed using two-way ANOVA and Bonferroni post hoc multiple comparison tests (α = 0.05). Fracture resistance of all IS groups was significantly lower than that of the DS groups, except for AW. There was no statistically significant difference between powder removal methods. Immediate dentin sealing does not improve the fracture resistance of endocrown restorations. Air-water spray washing is adequate to remove the optical powder after taking the optical scanning impression.
Sundararajan, S R; Rajagopalakrishnan, Ramakanth; Rajasekaran, S
2016-05-01
To predict the adequacy of semitendinosus (ST) graft dimensions for anterior cruciate ligament reconstruction (ACLR) from anthropometric measures. Single-tendon harvest for autograft hamstring ACLR could be beneficial in limiting donor site morbidity; however, concerns about reconstruction failure based upon inadequate graft size may limit this surgical technique. To assess this prospectively, 108 patients who underwent ACLR with a hamstring (STG) graft were enrolled in the study. Mean age was 33.028 ± 9.539 (SD) years (range 14-59), with 88 males and 20 females. Anthropometric measurements (height, weight, BMI, thigh and total limb length) and intraoperative data (graft dimensions and bone tunnel measurements) were collected for analysis. A semitendinosus graft can be used as a 3-strand (ST3) or 4-strand (ST4) graft. Adequacy criteria for ST3 and ST4 graft dimensions were determined from the data analysis. Pearson's correlation coefficient and ROC curves (SPSS v. 17) were used for statistical analyses. A total of 74 out of 108 patients (68.52%) had adequate graft dimensions for ST3 reconstruction; height equal to or greater than 158 cm was predictive of an adequate graft for ST3 reconstruction. Only 23 patients (21.3%) had adequate graft dimensions for ST4 reconstruction; height equal to or greater than 170 cm was predictive of an adequate graft for ST4 reconstruction. Height had the largest area under the ROC curve, 0.840 for the ST3 graft and 0.910 for the ST4 graft, and was therefore used as the best predictor of graft adequacy. Height can be predictive of an adequate graft for single-tendon ACL reconstruction.
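The ROC-based choice of a height cut-off can be illustrated with a brief sketch on synthetic data; the heights, the adequacy rule, and the logistic relation below are invented for demonstration and are not taken from the study.

```python
# A minimal sketch (synthetic data, not the study's records) of using an ROC
# curve to pick a height cut-off that predicts adequate graft dimensions.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(2)
n = 108
height = rng.normal(165, 8, n)                       # cm, illustrative
# Assume (for illustration only) that taller patients more often yield an
# adequate 3-strand graft.
p_adequate = 1 / (1 + np.exp(-(height - 158) / 4))
adequate = rng.random(n) < p_adequate                # True = adequate ST3 graft

auc = roc_auc_score(adequate, height)
fpr, tpr, thr = roc_curve(adequate, height)
best = np.argmax(tpr - fpr)                          # Youden's J statistic
print(f"AUC = {auc:.2f}, suggested height cut-off ~ {thr[best]:.0f} cm")
```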
Suitability assessment of health education brochures in Qassim province, Kingdom of Saudi Arabia.
Jahan, Saulat; Al-Saigul, Abdullah M; Alharbi, Ali M; Abdelgadir, Muzamil H
2014-09-01
Health education is the cornerstone of primary health care. Health education materials distributed to the community should, therefore, be suitable and effective. The purpose of this study was to evaluate the health education brochures designed and disseminated by Ministry of Health institutions in the Qassim province. The study was a cross-sectional review of health education brochures. We used a structured evaluation form comprising general information on the brochures and a modified Suitability Assessment of Materials (SAM) score sheet. The SAM, consisting of 22 criteria in six groups, covers content, literacy demands, graphics, layout/typography, learning stimulation/motivation, and cultural appropriateness. SAM criteria categorize written material as "superior," "adequate" or "not suitable." Two qualified consultant family physicians evaluated the brochures. Data were analyzed using the Epi Info version 3.4 statistical package. We evaluated 110 brochures, the majority of which addressed chronic health conditions such as mental health, diabetes mellitus and hypertension. Seventy-four (67.3%) brochures were evaluated as "adequate," 34 (30.9%) as "not suitable" and 2 (1.8%) as "superior." "Cultural appropriateness" was the highest-scoring factor, with 92 (83.6%) brochures falling into either the "superior" or "adequate" category. With regard to "content," 88 (80.0%) brochures fell into either the "superior" or "adequate" category, making it the second highest-scoring factor. Graphics was the lowest-scoring factor: 75 (68.2%) brochures were rated "not suitable" on this factor. Although two-thirds of our brochures were considered "adequate," the majority needed improvements to their graphics and learning-stimulation factors. We recommend that guidelines for designing health education brochures be formulated to improve their quality.
Air-Q intubating laryngeal airway: A study of the second generation supraglottic airway device.
Attarde, Viren Bhaskar; Kotekar, Nalini; Shetty, Sarika M
2016-05-01
Air-Q intubating laryngeal mask airway (ILA) is used as a supraglottic airway device and as a conduit for endotracheal intubation. This study aims to assess the efficacy of the Air-Q ILA regarding ease of insertion, adequacy of ventilation, rate of successful intubation, haemodynamic response and airway morbidity. Sixty patients presenting for elective surgery at our Medical College Hospital were selected. Following adequate premedication, baseline vital parameters, pulse rate and blood pressure were recorded. An Air-Q of size 3.5 was selected for patients weighing 50-70 kg and size 4.5 for those weighing 70-100 kg. After achieving adequate intubating conditions, the Air-Q ILA was introduced. After confirming adequate ventilation, an appropriately sized endotracheal tube was advanced blindly through the Air-Q to intubate the trachea. Placement of the endotracheal tube in the trachea was confirmed. The Air-Q ILA was successfully inserted at the first attempt in 88.3% of patients and at the second attempt in 11.7%. Ventilation was adequate in 100% of patients. Intubation through the Air-Q ILA was successful in 76.7% of patients; the remaining 23.3% were intubated by direct laryngoscopy after two failed attempts with the Air-Q ILA. The post-intubation change in heart rate was statistically significant (P < 0.0001). Sore throat was noted in 10% of patients and mild airway trauma in 5%. The Air-Q ILA is a reliable device as a supraglottic airway ensuring adequate ventilation as well as a conduit for endotracheal intubation. It benefits the patient by avoiding the stress of direct laryngoscopy and is also a superior alternative device for use in a difficult airway.
Quick, Jacob A; MacIntyre, Allan D; Barnes, Stephen L
2014-02-01
Surgical airway creation has a high potential for disaster. Conventional methods can be cumbersome and require special instruments. A simple method utilizing three steps and readily available equipment exists, but has yet to be adequately tested. Our objective was to compare conventional cricothyroidotomy with the three-step method utilizing high-fidelity simulation. Utilizing a high-fidelity simulator, 12 experienced flight nurses and paramedics performed both methods after a didactic lecture, simulator briefing, and demonstration of each technique. Six participants performed the three-step method first, and the remaining 6 performed the conventional method first. Each participant was filmed and timed. We analyzed videos with respect to the number of hand repositions, number of airway instrumentations, and technical complications. Times to successful completion were measured from incision to balloon inflation. The three-step method was completed faster (52.1 s vs. 87.3 s; p = 0.007) as compared with conventional surgical cricothyroidotomy. The two methods did not differ statistically regarding number of hand movements (3.75 vs. 5.25; p = 0.12) or instrumentations of the airway (1.08 vs. 1.33; p = 0.07). The three-step method resulted in 100% successful airway placement on the first attempt, compared with 75% of the conventional method (p = 0.11). Technical complications occurred more with the conventional method (33% vs. 0%; p = 0.05). The three-step method, using an elastic bougie with an endotracheal tube, was shown to require fewer total hand movements, took less time to complete, resulted in more successful airway placement, and had fewer complications compared with traditional cricothyroidotomy. Published by Elsevier Inc.
ERIC Educational Resources Information Center
DeSantis, Josh
2012-01-01
The adoption of interactive whiteboards (IWB) in many schools outpaced the delivery of adequate professional development on their use. Many teachers receive IWBs without adequate training on methods to use the technology to improve their instruction. Consequently, IWBs remain an underutilized resource in many classrooms. Teachers who are given…
Diagnostic games: from adequate formalization of clinical experience to structure discovery.
Shifrin, Michael A; Kasparova, Eva I
2008-01-01
A method of obtaining well-founded and reproducible results in clinical decision making is presented. It is based on "diagnostic games", a procedure for eliciting and formalizing experts' knowledge and experience. This procedure allows decision rules to be formulated in an adequate language, making them both unambiguous and clinically clear.
Statistical methods used in articles published by the Journal of Periodontal and Implant Science.
Choi, Eunsil; Lyu, Jiyoung; Park, Jinyoung; Kim, Hae-Young
2014-12-01
The purposes of this study were to assess the trend of use of statistical methods including parametric and nonparametric methods and to evaluate the use of complex statistical methodology in recent periodontal studies. This study analyzed 123 articles published in the Journal of Periodontal & Implant Science (JPIS) between 2010 and 2014. Frequencies and percentages were calculated according to the number of statistical methods used, the type of statistical method applied, and the type of statistical software used. Most of the published articles considered (64.4%) used statistical methods. Since 2011, the percentage of JPIS articles using statistics has increased. On the basis of multiple counting, we found that the percentage of studies in JPIS using parametric methods was 61.1%. Further, complex statistical methods were applied in only 6 of the published studies (5.0%), and nonparametric statistical methods were applied in 77 of the published studies (38.9% of a total of 198 studies considered). We found an increasing trend towards the application of statistical methods and nonparametric methods in recent periodontal studies and thus, concluded that increased use of complex statistical methodology might be preferred by the researchers in the fields of study covered by JPIS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ashenafi, M; McDonald, D; Peng, J
Purpose: Improved patient imaging used for planning the treatment of cervical cancer with tandem and ovoid (T&O) intracavitary high-dose-rate (HDR) brachytherapy now allows for 3D delineation of target volumes and organs-at-risk. However, historical data rely on the conventional point A-based planning technique. A comparative dosimetric study was performed by generating both target-based (TBP) and point-based (PBP) plans for ten clinical patients. Methods: Treatment plans created using Elekta Oncentra v. 4.3 for ten consecutive cervical cancer patients were analyzed. All patients were treated with HDR using the Utrecht T&O applicator. Both CT and MRI imaging modalities were utilized to delineate the clinical target volume (CTV) and organs-at-risk (rectum, sigmoid, bladder, and small bowel). Point A (left and right), vaginal mucosa, and ICRU rectum and bladder points were defined on CT. Two plans were generated for each patient using two prescription methods (PBP and TBP). 7 Gy was prescribed to each point A for each PBP plan and to the target D90% for each TBP plan. Target V90%, V100%, and V200% were evaluated. In addition, D0.1cc and D2cc were analyzed for each organ-at-risk. Differences were assessed for statistical significance (p<0.05) by use of Student's t-test. Results: Target coverage was comparable for both planning methods, with each method providing adequate target coverage. TBP showed lower absolute dose to the target volume than PBP (D90% = 7.0 Gy vs. 7.4 Gy, p=0.028; V200% = 10.9 cc vs. 12.8 cc, p=0.014; A left = 6.4 Gy vs. 7 Gy, p=0.009; A right = 6.4 Gy vs. 7 Gy, p=0.013). TBP also showed a statistically significant reduction in bladder, rectum, small bowel, and sigmoid doses compared to PBP. There was no statistically significant difference in vaginal mucosa or ICRU-defined rectum and bladder dose. Conclusion: Target-based prescription resulted in substantially lower dose to delineated organs-at-risk compared to point-based prescription, while maintaining similar target coverage.
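The statistical comparison described here is a paired design, since each patient is planned both ways; a minimal sketch with invented dose values (not the study's data) is shown below.

```python
# A small sketch (invented numbers, not the study's data) of the paired
# comparison used above: the same ten patients planned two ways, with a
# paired Student's t-test on an organ-at-risk metric such as bladder D2cc.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_patients = 10
bladder_d2cc_pbp = rng.normal(5.8, 0.6, n_patients)        # Gy, point-based plans
bladder_d2cc_tbp = bladder_d2cc_pbp - rng.normal(0.5, 0.3, n_patients)  # target-based plans

t, p = stats.ttest_rel(bladder_d2cc_pbp, bladder_d2cc_tbp)
diff = np.mean(bladder_d2cc_pbp - bladder_d2cc_tbp)
print(f"mean difference = {diff:.2f} Gy, t = {t:.2f}, p = {p:.3f} (significant if p < 0.05)")
```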
The politics of assessment: water and sanitation MDGs in the Middle East.
Zawahri, Neda; Sowers, Jeannie; Weinthal, Erika
2011-01-01
The Middle East and North Africa (MENA) is generally considered to be making adequate progress towards meeting Target 10 of the Millennium Development Goals (MDGs), which calls for halving the proportion of the population with inadequate access to drinking water and sanitation. Progress towards achieving Target 10 is evaluated by the Joint Monitoring Programme (JMP), run by UNICEF and WHO. This article shows that the assessment methodologies employed by the JMP significantly overstate coverage rates in the drinking water and sanitation sectors, by overlooking and ‘not counting’ problems of access, affordability, quality of service and pollution. The authors show that states in MENA often fail to provide safe drinking water and adequate sanitation services, particularly in densely populated informal settlements, and that many centralized water and sanitation infrastructures contribute to water pollution and contamination. Despite the glaring gap between the MDG statistics and the evidence available from national and local reports, exclusionary political regimes in the region have had few incentives to adopt more accurate assessments and improve the quality of service. While international organizations have proposed some reforms, they too lack incentives to employ adequate measures that gauge access, quality and affordability of drinking water and sanitation services.
Analysis and Evaluation of Parameters Determining Maximum Efficiency of Fish Protection
NASA Astrophysics Data System (ADS)
Khetsuriani, E. D.; Kostyukov, V. P.; Khetsuriani, T. E.
2017-11-01
The article is concerned with experimental research findings. The efficiency of fish fry protection from entering water inlets is the main criterion of any fish protection facility or device. The research aimed to determine an adequate mathematical model E = f(PCT, Vp, α), where PCT, Vp, and α are controlled factors influencing the process of fish fry protection. The result of processing the experimental data was an adequate regression model. We determined the maximum fish protection efficiency Emax = 94.21 and the minimum of the optimization function Emin = 44.41. As a result of the statistical processing of the experimental data, we obtained adequate dependences for determining an optimal rotational speed of the tip and the fish protection efficiency. The analysis of the fish protection efficiency dependence E% = f(PCT, Vp, α) allowed the authors to recommend the following optimized operating modes: the maximum fish protection efficiency is achieved at a process pressure PCT = 3 atm, stream velocity Vp = 0.42 m/s and nozzle inclination angle α = 47°49'. The stream velocity Vp has the most critical influence on fish protection efficiency. The maximum efficiency of fish protection is obtained at a tip rotational speed of 70.92 rpm.
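The regression-and-optimization workflow implied by E = f(PCT, Vp, α) can be sketched as a second-order response-surface fit followed by a grid search for the maximum; the factor levels, the synthetic response surface, and the noise level below are assumptions for illustration only.

```python
# An illustrative sketch (synthetic response values, not the experimental
# data) of fitting a second-order regression model E = f(P, V, alpha) and
# locating its maximum on a grid, as is done in response-surface studies.
import numpy as np
from itertools import product

rng = np.random.default_rng(4)

# Factor levels: pressure (atm), stream velocity (m/s), nozzle angle (deg).
P = np.array([2.0, 3.0, 4.0])
V = np.array([0.3, 0.45, 0.6])
A = np.array([30.0, 45.0, 60.0])
X = np.array(list(product(P, V, A)))

# Hypothetical "true" efficiency surface with a maximum inside the region.
E = 94 - 3*(X[:, 0]-3)**2 - 250*(X[:, 1]-0.42)**2 - 0.01*(X[:, 2]-48)**2
E += rng.normal(0, 0.5, E.size)

def design(x):
    p, v, a = x[:, 0], x[:, 1], x[:, 2]
    return np.column_stack([np.ones_like(p), p, v, a, p*v, p*a, v*a, p**2, v**2, a**2])

beta, *_ = np.linalg.lstsq(design(X), E, rcond=None)

# Evaluate the fitted surface on a fine grid and report its maximum.
grid = np.array(list(product(np.linspace(2, 4, 41),
                             np.linspace(0.3, 0.6, 61),
                             np.linspace(30, 60, 61))))
E_hat = design(grid) @ beta
i = np.argmax(E_hat)
print(f"E_max ~ {E_hat[i]:.1f} at P = {grid[i, 0]:.1f} atm, "
      f"V = {grid[i, 1]:.2f} m/s, alpha = {grid[i, 2]:.0f} deg")
```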
Usage and Design Evaluation by Family Caregivers of a Stroke Intervention Website
Pierce, Linda L.; Steiner, Victoria
2013-01-01
Background: Four out of 5 families are affected by stroke. Many caregivers access the Internet and gather healthcare information from web-based sources. Design: The purpose of this descriptive evaluation was to assess the usage and design of the Caring~Web© site, which provides education/support for family caregivers of persons with stroke residing in home settings. Sample and Setting: Thirty-six caregivers from two Midwest states accessed this intervention in a 1-year study. The average participant was fifty-four years of age, white, female, and the spouse of the care recipient. Methods: In a telephone interview, four website questions were asked twice monthly, and a 33-item survey at the conclusion of the study evaluated the usage of the website and the design of its components. Descriptive analysis methods were used, and statistics were collected on the number of visits to the website. Results: On average, participants logged on to the website one to two hours per week, although usage declined after several months for some participants. Participants rated the website's appearance and usability positively, and found the training to be adequate. Conclusion: Website designers can replicate this intervention for other health conditions. PMID:24025464
High-resolution detection of Brownian motion for quantitative optical tweezers experiments.
Grimm, Matthias; Franosch, Thomas; Jeney, Sylvia
2012-08-01
We have developed an in situ method to calibrate optical tweezers experiments and simultaneously measure the size of the trapped particle or the viscosity of the surrounding fluid. The positional fluctuations of the trapped particle are recorded with a high-bandwidth photodetector. We compute the mean-square displacement, as well as the velocity autocorrelation function of the sphere, and compare it to the theory of Brownian motion including hydrodynamic memory effects. A careful measurement and analysis of the time scales characterizing the dynamics of the harmonically bound sphere fluctuating in a viscous medium directly yields all relevant parameters. Finally, we test the method for different optical trap strengths, with different bead sizes and in different fluids, and we find excellent agreement with the values provided by the manufacturers. The proposed approach overcomes the most commonly encountered limitations in precision when analyzing the power spectrum of position fluctuations in the region around the corner frequency. These low frequencies are usually prone to errors due to drift, limitations in the detection, and trap linearity as well as short acquisition times resulting in poor statistics. Furthermore, the strategy can be generalized to Brownian motion in more complex environments, provided the adequate theories are available.
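As background for the quantities analysed above, the sketch below computes a mean-square displacement and a normalized velocity autocorrelation from a recorded position trace; it assumes simple free diffusion (no trap, no hydrodynamic memory), so it illustrates the estimators rather than the full calibration theory.

```python
# A simplified sketch (free diffusion only) of the two quantities analysed
# in the calibration: the mean-square displacement (MSD) and the velocity
# autocorrelation of a recorded bead trajectory.
import numpy as np

rng = np.random.default_rng(5)
dt = 1e-5                       # sampling interval (s), illustrative
n = 100_000
D = 0.4e-12                     # diffusion coefficient (m^2/s), illustrative
x = np.cumsum(rng.normal(0, np.sqrt(2 * D * dt), n))   # 1-D Brownian trace

lags = np.arange(1, 200)
msd = np.array([np.mean((x[l:] - x[:-l])**2) for l in lags])

v = np.diff(x) / dt
vacf = np.array([np.mean(v[l:] * v[:-l]) for l in lags]) / np.mean(v * v)

# For free diffusion, MSD(t) ~ 2*D*t; a harmonic trap and hydrodynamic
# memory modify both curves, which is what the in situ calibration exploits.
D_est = np.polyfit(lags * dt, msd, 1)[0] / 2
print(f"estimated D = {D_est:.2e} m^2/s (true D = {D:.2e})")
```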
Exercise Training in Treatment and Rehabilitation of Hip Osteoarthritis: A 12-Week Pilot Trial
Patil, Radhika; Karinkanta, Saija; Tokola, Kari; Kannus, Pekka
2017-01-01
Introduction. Osteoarthritis (OA) of the hip is one of the major causes of pain and disability in the older population. Although exercise is an effective treatment for knee OA, there is a lack of evidence regarding hip OA. The aim of this trial was to test the safety and feasibility of a specifically designed exercise program in relieving hip pain and improving function in participants with hip OA, and to evaluate various methods of measuring changes in their physical functioning. Materials and Methods. Thirteen women aged ≥65 years with hip OA were recruited in this 12-week pilot study. Results. Pain declined significantly, by over 30% from baseline, and joint function and health-related quality of life improved slightly. Objective assessment of physical functioning showed statistically significant improvement in maximal isometric leg extensor strength, by 20%, and in hip extension range of motion, by 30%. Conclusions. The exercise program was found to be safe and feasible. The present evidence indicates that the exercise program is effective in the short term. However, adequately powered RCTs are needed to determine the effects of long-term exercise therapy on pain and the progression of hip OA. PMID:28116214
Parmar, Suresh K; Rathinam, Bertha A D
2011-01-01
The purpose of the present pilot study was to evaluate the benefits of innovative teaching methodologies introduced to final-year occupational and physical therapy students at Christian Medical College in India. Students' satisfaction, along with long-term retention of knowledge and clinical application of respiratory anatomy, was assessed. The final-year undergraduate physical therapy and occupational therapy students received respiratory anatomy teaching over two sessions. The teaching involved case-based learning and anatomy lectures integrated with the Anatomy department (vertical integration). A pretest and immediate and follow-up post-tests were conducted to assess the effectiveness of the innovative methods. A feedback questionnaire was marked to grade case-based learning. The method of integrated and case-based teaching was appreciated and found to be useful in imparting knowledge to the students. Students retained the gained knowledge adequately, as inferred from the statistically significant improvement in both post-test scores. Vertical integration of anatomy in the final year reinforces the students' existing knowledge of anatomy. Case-based learning may facilitate the development of effective and clinically sound therapists. Copyright © 2011 American Association of Anatomists.
Goodness of fit of probability distributions for sightings as species approach extinction.
Vogel, Richard M; Hosking, Jonathan R M; Elphick, Chris S; Roberts, David L; Reed, J Michael
2009-04-01
Estimating the probability that a species is extinct and the timing of extinctions is useful in biological fields ranging from paleoecology to conservation biology. Various statistical methods have been introduced to infer the time of extinction and extinction probability from a series of individual sightings. There is little evidence, however, as to which of these models provide adequate fit to actual sighting records. We use L-moment diagrams and probability plot correlation coefficient (PPCC) hypothesis tests to evaluate the goodness of fit of various probabilistic models to sighting data collected for a set of North American and Hawaiian bird populations that have either gone extinct, or are suspected of having gone extinct, during the past 150 years. For our data, the uniform, truncated exponential, and generalized Pareto models performed moderately well, but the Weibull model performed poorly. Of the acceptable models, the uniform distribution performed best based on PPCC goodness of fit comparisons and sequential Bonferroni-type tests. Further analyses using field significance tests suggest that although the uniform distribution is the best of those considered, additional work remains to evaluate the truncated exponential model more fully. The methods we present here provide a framework for evaluating subsequent models.
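A PPCC goodness-of-fit check of the kind used here can be sketched for the uniform model with a Monte Carlo critical value; the simulated sighting record below is invented and is not the bird data analysed in the study.

```python
# A compact sketch (simulated sighting years, not the bird data) of a
# probability plot correlation coefficient (PPCC) goodness-of-fit check
# for the uniform model, with a Monte Carlo null distribution.
import numpy as np

rng = np.random.default_rng(6)

def ppcc_uniform(x):
    """Correlation between sorted data and uniform plotting positions."""
    x = np.sort(x)
    n = len(x)
    q = (np.arange(1, n + 1) - 0.5) / n      # Hazen plotting positions
    return np.corrcoef(x, q)[0, 1]

# Simulated sighting record: 25 sightings uniform over a 150-year window.
sightings = rng.uniform(1850, 2000, 25)
r_obs = ppcc_uniform(sightings)

# Monte Carlo null distribution of the PPCC under the uniform hypothesis.
r_null = np.array([ppcc_uniform(rng.uniform(0, 1, 25)) for _ in range(5000)])
p_value = np.mean(r_null <= r_obs)           # small PPCC indicates poor fit
print(f"PPCC = {r_obs:.3f}, Monte Carlo p = {p_value:.3f}")
```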
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobs, G.B.; Kiraly, R.J.; Nose, Y.
The objective of the study is to define the human thorax in a quantitative statistical manner such that the information will be useful to the designers of cardiac prostheses, both total replacement and assist devices. This report pertains specifically to anatomical parameters relevant to the total cardiac prosthesis. This information will also be clinically useful in that the proposed recipient of a cardiac prosthesis can by simple radiography be assured of an adequate fit with the prosthesis prior to the implantation.
Dorazio, R.M.; Johnson, F.A.
2003-01-01
Bayesian inference and decision theory may be used in the solution of relatively complex problems of natural resource management, owing to recent advances in statistical theory and computing. In particular, Markov chain Monte Carlo algorithms provide a computational framework for fitting models of adequate complexity and for evaluating the expected consequences of alternative management actions. We illustrate these features using an example based on management of waterfowl habitat.
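A toy example of the MCMC fitting mentioned above is a random-walk Metropolis sampler for a single detection/occupancy probability; the survey counts and the flat prior are illustrative assumptions, not the waterfowl-habitat model of the paper.

```python
# A toy sketch (invented counts, not the waterfowl example) of Markov chain
# Monte Carlo fitting: a random-walk Metropolis sampler for an occupancy
# probability with a binomial likelihood and a flat prior on (0, 1).
import numpy as np

rng = np.random.default_rng(7)
n_sites, n_occupied = 50, 32          # illustrative survey outcome

def log_post(p):
    if not 0.0 < p < 1.0:
        return -np.inf
    return n_occupied * np.log(p) + (n_sites - n_occupied) * np.log(1 - p)

p, samples = 0.5, []
for _ in range(20_000):
    prop = p + rng.normal(0, 0.05)                    # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(p):
        p = prop                                      # accept
    samples.append(p)

post = np.array(samples[5_000:])                      # drop burn-in
print(f"posterior mean = {post.mean():.3f}, 95% CI = "
      f"({np.quantile(post, 0.025):.3f}, {np.quantile(post, 0.975):.3f})")
```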
Analysis of cost regression and post-accident absence
NASA Astrophysics Data System (ADS)
Wojciech, Drozd
2017-07-01
The article presents issues related to the costs of work safety. It proves the thesis that economic aspects cannot be overlooked in the effective management of occupational health and safety and that adequate expenditures on safety can bring tangible benefits to the company. A reliable analysis of this problem is essential for describing work safety; the article attempts such an analysis using the procedures of mathematical statistics [1, 2, 3].
Peoples, M D; Grimson, R C; Daughtry, G L
1984-01-01
This study was designed to assess the effects of the North Carolina Improved Pregnancy Outcome (IPO) Project on use of prenatal care and incidence of low birthweight among its primarily Black registrants. Weighted least squares and stratified analysis procedures were used to scrutinize vital statistics data for subpopulation effects. IPO services were received by 51.7 per cent of Black women in the counties served by the project. For all Black registrants, the risk of receiving less than adequate prenatal care was 55.1 per cent of that of the comparison group. For Black teenage registrants, the risk was even less: 37.2 per cent of that of the comparison group. Nevertheless, no corresponding effects on the incidence of low birthweight could be detected. The evaluation methods used in this study can be applied to programs for mothers and infants in other locales to generate useful and practical information for state-level decision-making. PMID:6721010
Anchorage in Orthodontics: Three-dimensional Scanner Input
Nabbout, Fidele; Baron, Pascal
2018-01-01
Aims and Objectives: The aim of this article is to re-evaluate anchorage coefficient values in orthodontics and their influence on the treatment decision through the use of a three-dimensional (3D) scanner. Materials and Methods: A sample of 80 patients was analyzed with the 3D scanner using the C2000 and Cepha 3DT software packages (CIRAD Montpellier, France). Tooth anatomy parameters (linear measurements, root and crown volumes) were then calculated to determine new anchorage coefficients based on root volume. Data were collected and statistically evaluated with the StatView software (version 5.0). Results: The anchorage coefficient values found in this study are compared to those established in previous studies. These new values affect and modify our approach to orthodontic treatment from the standpoint of anchorage. Conclusion: The use of new anchorage coefficient values has significant clinical implications in conventional and in microimplant-assisted orthodontic mechanics through the selection and delivery of the optimal force system (magnitude and moment) for an adequate biological response. PMID:29629323
Supervised target detection in hyperspectral images using one-class Fukunaga-Koontz Transform
NASA Astrophysics Data System (ADS)
Binol, Hamidullah; Bal, Abdullah
2016-05-01
A novel hyperspectral target detection technique based on the Fukunaga-Koontz transform (FKT) is presented. FKT offers significant properties for feature selection and ordering. However, it can only be used to solve multi-pattern classification problems. Target detection may be considered a two-class classification problem, i.e., target versus background clutter. Nevertheless, background clutter typically contains different types of materials, which is why target detection techniques differ from classification methods in the way the clutter is modeled. To avoid modeling the background clutter, we have improved the one-class FKT (OC-FKT) for target detection. The statistical properties of the target training samples are used to define a tunnel-like boundary of the target class. Non-target samples are then created synthetically so as to lie outside this boundary, so that even a limited set of target samples is adequate for training the FKT. The hyperspectral image experiments confirm that the proposed OC-FKT technique provides an effective means for target detection.
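The core FKT step can be sketched as follows: whiten the summed class covariance, then rank the eigenvectors of the whitened target covariance. The synthetic spectra and the crude way of generating "non-target" samples below are assumptions for illustration and are not the paper's tunnel-boundary construction.

```python
# A simplified sketch (synthetic spectra, not the paper's OC-FKT) of the
# Fukunaga-Koontz transform: whiten the summed covariance of the two
# classes, then rank eigenvectors of the whitened target covariance.
import numpy as np

rng = np.random.default_rng(8)
bands = 30
target = rng.normal(1.0, 0.2, (40, bands))           # target training spectra
# Synthetic "non-target" samples placed away from the target cloud; a crude
# stand-in for the boundary idea described above, not the paper's rule.
nontarget = target + rng.normal(0, 1.0, (40, bands)) + 2.0

S1 = np.cov(target, rowvar=False)
S2 = np.cov(nontarget, rowvar=False)

# Whitening transform for S1 + S2 (symmetric inverse square root).
w, V = np.linalg.eigh(S1 + S2)
P = V @ np.diag(1.0 / np.sqrt(np.maximum(w, 1e-12))) @ V.T

# In the whitened space the two class covariances share eigenvectors and
# their eigenvalues sum to one; large values favour the target class.
vals, vecs = np.linalg.eigh(P @ S1 @ P.T)
order = np.argsort(vals)[::-1]
print("leading eigenvalues (target-dominant features):", np.round(vals[order][:5], 3))
```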
Mujic, Fedza; Cairns, Ruth; Mak, Vivienne; Squire, Clare; Wells, Andrew; Al-Harrasi, Ahmed; Prince, Martin
2018-02-01
Aims and method: This study used data collected to describe the activity, case-load characteristics and outcome measures for all patients seen during a 6-year period. The service reviewed 2153 patients over 6 years, with referral rates and case-load characteristics comparable to those described in a previous study period. The team saw 82% of patients on the day they were referred. Data and outcome measures collected showed significant complexity in the cases seen and a statistically significant improvement in Health of the Nation Outcome Scales (HoNOS) scores following service input. Clinical implications: The outcome measures used were limited, but the study supports the need for specialist liaison psychiatry for older adults (LPOA) services in the general hospital. The Framework of Outcome Measures - Liaison Psychiatry has now been introduced, but it remains unclear how valid this is in LPOA. It is of note that cost-effectiveness secondary to service input and training activities are not adequately monitored. Declaration of interest: None.
SSSFD manipulator engineering using statistical experiment design techniques
NASA Technical Reports Server (NTRS)
Barnes, John
1991-01-01
The Satellite Servicer System Flight Demonstration (SSSFD) program is a series of Shuttle flights designed to verify major on-orbit satellite servicing capabilities, such as rendezvous and docking of free flyers, Orbital Replacement Unit (ORU) exchange, and fluid transfer. A major part of this system is the manipulator system that will perform the ORU exchange. The manipulator must possess adequate toolplate dexterity to maneuver a variety of EVA-type tools into position to interface with ORU fasteners, connectors, latches, and handles on the satellite, and to move workpieces and ORUs through 6-degree-of-freedom (dof) space from the Target Vehicle (TV) to the Support Module (SM) and back. Two cost-efficient tools were combined to perform a study of robot manipulator design parameters: graphical computer simulations and Taguchi design-of-experiments methods. Using a graphics platform, an off-the-shelf robot simulation software package, and an experiment designed with Taguchi's approach, the sensitivities of various manipulator kinematic design parameters to performance characteristics are determined with minimal cost.
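A Taguchi-style screening analysis of the sort combined with the simulations can be sketched with an L4 orthogonal array and a larger-is-better signal-to-noise ratio; the factor names (link_length, joint_offset, wrist_config) and response values below are hypothetical and are not the SSSFD design parameters.

```python
# A toy sketch of a Taguchi-style screening study: an L4 orthogonal array
# for three two-level design parameters and a larger-is-better signal-to-
# noise analysis of the main effects.  Factor names and responses are
# invented for illustration.
import numpy as np

# L4 orthogonal array (rows = runs, columns = factors, levels coded 0/1).
L4 = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])
factors = ["link_length", "joint_offset", "wrist_config"]   # hypothetical

# Simulated dexterity scores, 3 repeats per run (stand-in for sim output).
rng = np.random.default_rng(9)
y = np.array([[62, 64, 63],
              [71, 70, 73],
              [58, 60, 57],
              [69, 68, 70]], dtype=float) + rng.normal(0, 0.5, (4, 3))

# Larger-is-better signal-to-noise ratio per run.
sn = -10.0 * np.log10(np.mean(1.0 / y**2, axis=1))

# Main effect of each factor = mean S/N at level 1 minus mean S/N at level 0.
for j, name in enumerate(factors):
    effect = sn[L4[:, j] == 1].mean() - sn[L4[:, j] == 0].mean()
    print(f"{name:>13}: main effect on S/N = {effect:+.2f} dB")
```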