Secondary Analysis of National Longitudinal Transition Study 2 Data
ERIC Educational Resources Information Center
Hicks, Tyler A.; Knollman, Greg A.
2015-01-01
This review examines published secondary analyses of National Longitudinal Transition Study 2 (NLTS2) data, with a primary focus on statistical objectives, paradigms, inferences, and methods. Its purpose was to determine which statistical techniques have been common in secondary analyses of NLTS2 data. The review begins with an…
ERIC Educational Resources Information Center
Ellis, Barbara G.; Dick, Steven J.
1996-01-01
Employs the statistics-documentation portion of a word-processing program's grammar-check feature together with qualitative analyses to determine that Henry Watterson, long-time editor of the "Louisville Courier-Journal," was probably the South's famed Civil War correspondent "Shadow." (TB)
Human Deception Detection from Whole Body Motion Analysis
2015-12-01
9.3.2. Prediction Probability The output reports from SPSS detail the stepwise procedures for each series of analyses using Wald statistic values for... statistical significance in determining replication, but instead used a combination of significance and direction of means to determine partial or...and the independents need not be unbound. All data were analyzed utilizing the Statistical Package for the Social Sciences (SPSS, v.19.0, Chicago, IL
Thompson, Ronald E.; Hoffman, Scott A.
2006-01-01
A suite of 28 streamflow statistics, ranging from extreme low to high flows, was computed for 17 continuous-record streamflow-gaging stations and predicted for 20 partial-record stations in Monroe County and contiguous counties in northeastern Pennsylvania. The predicted statistics for the partial-record stations were based on regression analyses relating intermittent flow measurements made at the partial-record stations, indexed to concurrent daily mean flows at continuous-record stations during base-flow conditions. The same statistics also were predicted for 134 ungaged stream locations in Monroe County on the basis of regression analyses relating the statistics to GIS-determined basin characteristics for the continuous-record station drainage areas. The prediction methodology for developing the regression equations was originally devised for estimating low-flow frequencies; this study and a companion study found that it also has potential for predicting intermediate- and high-flow statistics. The statistics included mean monthly flows, mean annual flow, 7-day low flows for three recurrence intervals, nine flow durations, mean annual base flow, and annual mean base flows for two recurrence intervals. Low standard errors of prediction and high coefficients of determination (R2) indicated good results in using the regression equations to predict the statistics. Regression equations for the larger flow statistics tended to have lower standard errors of prediction and higher coefficients of determination than equations for the smaller flow statistics. The report discusses the methodologies used in determining the statistics and the limitations of the statistics and of the equations used to predict them. Caution is advised when using the predicted statistics for small drainage areas.
Study results constitute input needed by water-resource managers in Monroe County for planning purposes and evaluation of water-resources availability.
Determinants of antiretroviral therapy coverage in Sub-Saharan Africa
Hoque, Mohammad Zahirul
2015-01-01
Among 35 million people living with the human immunodeficiency virus (HIV) in 2013, only 37% had access to antiretroviral therapy (ART). Despite concerted global efforts to provide universal access to ART, coverage varies among countries and regions. At present, there is a lack of systematic empirical analyses of the factors that determine ART coverage. Therefore, the current study aimed to identify the determinants of ART coverage in 41 countries in Sub-Saharan Africa, employing statistical analyses for this purpose. Four factors, namely the HIV prevalence, the level of national income, the level of medical expenditure, and the number of nurses, were hypothesised to determine ART coverage. The findings revealed that among the four proposed determinants only the HIV prevalence had a statistically significant impact on ART coverage. In other words, the HIV prevalence was the sole determinant of ART coverage in Sub-Saharan Africa. PMID:26664812
Facilitating the Transition from Bright to Dim Environments
2016-03-04
For the parametric data, a multivariate ANOVA was used in determining the systematic presence of any statistically significant performance differences...performed. All significance levels were p < 0.05, and statistical analyses were performed with the Statistical Package for the Social Sciences (SPSS...1950. Age changes in rate and level of visual dark adaptation. Journal of Applied Physiology, 2, 407–411. Field, A. 2009. Discovering statistics
Electric Field Magnitude and Radar Reflectivity as a Function of Distance from Cloud Edge
NASA Technical Reports Server (NTRS)
Ward, Jennifer G.; Merceret, Francis J.
2004-01-01
The results of analyses of data collected during a field investigation of thunderstorm anvil and debris clouds are reported. Statistics of the magnitude of the electric field are determined as a function of distance from cloud edge. Statistics of radar reflectivity near cloud edge are also determined. Both analyses use in-situ airborne field mill and cloud physics data coupled with ground-based radar measurements obtained in east-central Florida during the summer convective season. Electric fields outside of anvil and debris clouds averaged less than 3 kV/m. The average radar reflectivity at the cloud edge ranged between 0 and 5 dBZ.
Zhang, Harrison G; Ying, Gui-Shuang
2018-02-09
The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in the British Journal of Ophthalmology (BJO) and to determine whether the practice of statistical analysis has improved in the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for the statistical approaches used to analyse the primary ocular measure. We compared our findings to the results from a previous paper that reviewed BJO papers in 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at an individual level because of the nature of observation, 16 (14%) studies analysed data from one eye only, 36 (32%) studies analysed data from both eyes at the ocular level, one study (1%) analysed an overall summary of ocular findings per individual, and three (3%) studies used paired comparisons. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at an ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available. The practice of statistical analysis did not improve in the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Field management of asphalt concrete mixes.
DOT National Transportation Integrated Search
1988-01-01
Marshall properties were determined on mixes from three different contractors each producing a 1/2-in and a 3/4-in top-sized aggregate mix. From these data, statistical analyses were made to determine differences among contractors and between mix typ...
Mercer, Theresa G; Frostick, Lynne E; Walmsley, Anthony D
2011-10-15
This paper presents a statistical technique that can be applied to environmental chemistry data where missing values and limit-of-detection levels prevent the application of standard statistical analyses. A working example is taken from an environmental leaching study that was set up to determine whether there were significant differences in levels of leached arsenic (As), chromium (Cr) and copper (Cu) between lysimeters containing preservative-treated wood waste and those containing untreated wood. Fourteen lysimeters were set up and left in natural conditions for 21 weeks. The resultant leachate was analysed by ICP-OES to determine the As, Cr and Cu concentrations. However, due to the variation inherent in each lysimeter combined with the limits of detection offered by ICP-OES, the collected quantitative data were somewhat incomplete. Initial data analysis was hampered by the number of 'missing values' in the data. To recover the dataset, the statistical tool of Statistical Multiple Imputation (SMI) was applied, and the data were re-analysed successfully. It was demonstrated that using SMI did not affect the variance in the data, but facilitated analysis of the complete dataset. Copyright © 2011 Elsevier B.V. All rights reserved.
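The multiple-imputation idea can be sketched generically. This is not the authors' SMI procedure; the resampling scheme, the pooling by simple averaging (Rubin's rules for the point estimate), and the concentration values below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def multiple_impute_mean(x, m=5):
    """Estimate the mean of a 1-D sample containing missing values (NaN)
    via a simple multiple-imputation scheme: fill each missing entry by
    resampling from the observed values, run the analysis (here, the mean)
    on each completed dataset, and pool the m estimates by averaging."""
    x = np.asarray(x, dtype=float)
    observed = x[~np.isnan(x)]
    estimates = []
    for _ in range(m):
        filled = x.copy()
        n_missing = int(np.isnan(filled).sum())
        # Draw replacements from the empirical distribution of observed values:
        filled[np.isnan(filled)] = rng.choice(observed, size=n_missing)
        estimates.append(filled.mean())
    return float(np.mean(estimates))

# Hypothetical leachate concentrations with two missing values:
sample = [4.1, np.nan, 3.8, 4.4, np.nan, 4.0]
print(multiple_impute_mean(sample, m=20))
```

Because the imputed values are resampled from the observed data rather than set to a constant, the scheme preserves the sample's spread, which is the property the abstract highlights.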
Statistical Analyses of Raw Material Data for MTM45-1/CF7442A-36% RW: CMH Cure Cycle
NASA Technical Reports Server (NTRS)
Coroneos, Rula; Pai, Shantaram S.; Murthy, Pappu
2013-01-01
This report describes the statistical characterization of physical properties of the composite material system MTM45-1/CF7442A, which has been tested and is currently being considered for use on spacecraft structures. This composite system is made of 6K plain-weave graphite fibers in a highly toughened resin system. This report summarizes the distribution types and statistical details of the tests and the conditions for the experimental data generated. These distributions will be used in multivariate regression analyses to help determine material and design allowables for similar material systems and to establish a procedure for other material systems. Additionally, these distributions will be used in future probabilistic analyses of spacecraft structures. The specific properties characterized are the ultimate strength, modulus, and Poisson's ratio, using a commercially available statistical package. Results are displayed using graphical and semigraphical methods and are included in the accompanying appendixes.
Coordinate based random effect size meta-analysis of neuroimaging studies.
Tench, C R; Tanasescu, Radu; Constantinescu, C S; Auer, D P; Cottam, W J
2017-06-01
Low power in neuroimaging studies can make them difficult to interpret, and coordinate-based meta-analysis (CBMA) may go some way to mitigating this issue. CBMA has been used in many analyses to detect where published functional MRI or voxel-based morphometry studies testing similar hypotheses report significant summary results (coordinates) consistently. Only the reported coordinates and possibly t statistics are analysed, and statistical significance of clusters is determined by coordinate density. Here a method of performing coordinate-based random effect size meta-analysis and meta-regression is introduced. The algorithm (ClusterZ) analyses both coordinates and the reported t statistic or Z score, standardised by the number of subjects. Statistical significance is determined not by coordinate density, but by random-effects meta-analyses of reported effects performed cluster-wise using standard statistical methods and taking account of the censoring inherent in the published summary results. Type 1 error control is achieved using the false cluster discovery rate (FCDR), which is based on the false discovery rate. This controls both the family-wise error rate under the null hypothesis that coordinates are randomly drawn from a standard stereotaxic space, and the proportion of significant clusters that are expected under the null. Such control is necessary to avoid propagating, and even amplifying, the very issues motivating the meta-analysis in the first place. ClusterZ is demonstrated on both numerically simulated data and on real data from reports of grey matter loss in multiple sclerosis (MS) and syndromes suggestive of MS, and of painful stimulus in healthy controls. The software implementation is available to download and use freely. Copyright © 2017 Elsevier Inc. All rights reserved.
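The false cluster discovery rate described above builds on the standard false discovery rate. A minimal sketch of the underlying Benjamini-Hochberg step-up procedure follows (the p-values are hypothetical; ClusterZ itself is not reproduced here):

```python
def benjamini_hochberg(p_values, q=0.05):
    """Benjamini-Hochberg step-up procedure: sort the m p-values, find the
    largest rank k with p_(k) <= (k/m)*q, and reject the hypotheses with
    the k smallest p-values. Controls the false discovery rate at level q
    for independent tests."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank / m * q:
            k_max = rank  # keep the largest qualifying rank
    return {order[r] for r in range(k_max)}  # indices declared significant

ps = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205]
print(sorted(benjamini_hochberg(ps)))  # [0, 1]
```

Note that 0.039 is below the unadjusted 0.05 threshold but is not rejected; that gap is exactly the multiple-comparisons inflation FDR-style procedures guard against.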
Mali, Matilda; Dell'Anna, Maria Michela; Mastrorilli, Piero; Damiani, Leonardo; Ungaro, Nicola; Belviso, Claudia; Fiore, Saverio
2015-11-01
Sediment contamination by metals poses significant risks to coastal ecosystems and is considered to be problematic for dredging operations. The determination of background values of metal and metalloid distribution based on site-specific variability is fundamental in assessing pollution levels in harbour sediments. The novelty of the present work consists of addressing the scope and limitations of analysing port sediments through the use of conventional statistical techniques (such as linear regression analysis, construction of cumulative frequency curves and the iterative 2σ technique) that are commonly employed for assessing Regional Geochemical Background (RGB) values in coastal sediments. This study ascertained that although the tout court use of such techniques in determining RGB values in harbour sediments seems appropriate (the chemical-physical parameters of port sediments fit the statistical equations well), it should nevertheless be avoided because it may be misleading and can mask key aspects of the study area that can only be revealed by further investigations, such as mineralogical and multivariate statistical analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
Krystkowiak, Izabella; Manguy, Jean; Davey, Norman E
2018-06-05
There is a pressing need for in silico tools that can aid in the identification of the complete repertoire of protein binding (SLiMs, MoRFs, miniMotifs) and modification (moiety attachment/removal, isomerization, cleavage) motifs. We have created PSSMSearch, an interactive web-based tool for rapid statistical modeling, visualization, discovery and annotation of protein motif specificity determinants to discover novel motifs in a proteome-wide manner. PSSMSearch analyses proteomes for regions with significant similarity to a motif specificity determinant model built from a set of aligned motif-containing peptides. Multiple scoring methods are available to build a position-specific scoring matrix (PSSM) describing the motif specificity determinant model. This model can then be modified by a user to add prior knowledge of specificity determinants through an interactive PSSM heatmap. PSSMSearch includes a statistical framework to calculate the significance of specificity determinant model matches against a proteome of interest. PSSMSearch also includes the SLiMSearch framework's annotation, motif functional analysis and filtering tools to highlight relevant discriminatory information. Additional tools to annotate statistically significant shared keywords and GO terms, or experimental evidence of interaction with a motif-recognizing protein have been added. Finally, PSSM-based conservation metrics have been created for taxonomic range analyses. The PSSMSearch web server is available at http://slim.ucd.ie/pssmsearch/.
Wu, Robert; Glen, Peter; Ramsay, Tim; Martel, Guillaume
2014-06-28
Observational studies dominate the surgical literature. Statistical adjustment is an important strategy to account for confounders in observational studies. Research has shown that published articles are often poor in statistical quality, which may jeopardize their conclusions. The Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines have been published to help establish standards for statistical reporting. This study will seek to determine whether the quality of statistical adjustment and the reporting of these methods are adequate in surgical observational studies. We hypothesize that incomplete reporting will be found in all surgical observational studies, and that the quality and reporting of these methods will be of lower quality in surgical journals when compared with medical journals. Finally, this work will seek to identify predictors of high-quality reporting. This work will examine the top five general surgical and medical journals, based on a 5-year impact factor (2007-2012). All observational studies investigating an intervention related to an essential component area of general surgery (defined by the American Board of Surgery), with an exposure, outcome, and comparator, will be included in this systematic review. Essential elements related to statistical reporting and quality were extracted from the SAMPL guidelines and include domains such as intent of analysis, primary analysis, multiple comparisons, numbers and descriptive statistics, association and correlation analyses, linear regression, logistic regression, Cox proportional hazard analysis, analysis of variance, survival analysis, propensity analysis, and independent and correlated analyses. Each article will be scored as a proportion based on fulfilling criteria in relevant analyses used in the study. A logistic regression model will be built to identify variables associated with high-quality reporting.
A comparison will be made between the scores of surgical observational studies published in medical versus surgical journals. Secondary outcomes will pertain to individual domains of analysis. Sensitivity analyses will be conducted. This study will explore the reporting and quality of statistical analyses in surgical observational studies published in the most referenced surgical and medical journals in 2013 and examine whether variables (including the type of journal) can predict high-quality reporting.
CADDIS Volume 4. Data Analysis: Selecting an Analysis Approach
An approach for selecting statistical analyses to inform causal analysis. Describes methods for determining whether test site conditions differ from reference expectations. Describes an approach for estimating stressor-response relationships.
Catto, James W F; Abbod, Maysam F; Wild, Peter J; Linkens, Derek A; Pilarsky, Christian; Rehman, Ishtiaq; Rosario, Derek J; Denzinger, Stefan; Burger, Maximilian; Stoehr, Robert; Knuechel, Ruth; Hartmann, Arndt; Hamdy, Freddie C
2010-03-01
New methods for identifying bladder cancer (BCa) progression are required. Gene expression microarrays can reveal insights into disease biology and identify novel biomarkers. However, these experiments produce large datasets that are difficult to interpret. To develop a novel method of microarray analysis combining two forms of artificial intelligence (AI), neurofuzzy modelling (NFM) and artificial neural networks (ANN), and validate it in a BCa cohort, we used AI and statistical analyses to identify progression-related genes in a microarray dataset (n=66 tumours, n=2800 genes). The AI-selected genes were then investigated in a second cohort (n=262 tumours) using immunohistochemistry. We compared the accuracy of AI and statistical approaches to identify tumour progression. AI identified 11 progression-associated genes (odds ratio [OR]: 0.70; 95% confidence interval [CI], 0.56-0.87; p=0.0004), and these were more discriminating than genes chosen using statistical analyses (OR: 1.24; 95% CI, 0.96-1.60; p=0.09). The expression of six AI-selected genes (LIG3, FAS, KRT18, ICAM1, DSG2, and BRCA2) was determined using commercial antibodies and successfully identified tumour progression (concordance index: 0.66; log-rank test: p=0.01). AI-selected genes were more discriminating than pathologic criteria at determining progression (Cox multivariate analysis: p=0.01). Limitations include the use of statistical correlation to identify 200 genes for AI analysis and that we did not compare regression-identified genes with immunohistochemistry. AI and statistical analyses use different techniques of inference to determine gene-phenotype associations and identify distinct prognostic gene signatures that are equally valid. We have identified a prognostic gene signature whose members reflect a variety of carcinogenic pathways that could identify progression in non-muscle-invasive BCa. 2009 European Association of Urology. Published by Elsevier B.V. All rights reserved.
Patton, Charles J.; Kryskalla, Jennifer R.
2011-01-01
In addition to operational details and performance benchmarks for these new DA-AtNaR2 nitrate + nitrite assays, this report also provides results of interference studies for common inorganic and organic matrix constituents at 1, 10, and 100 times their median concentrations in surface-water and groundwater samples submitted annually to the NWQL for nitrate + nitrite analyses. Paired t-test and Wilcoxon signed-rank statistical analyses of results determined by CFA-CdR methods and DA-AtNaR2 methods indicate that nitrate concentration differences between population means or sign ranks were either statistically equivalent to zero at the 95 percent confidence level (p ≥ 0.05) or analytically equivalent to zero; that is, when p < 0.05, concentration differences between population means or medians were less than MDLs.
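The paired-comparison logic can be illustrated with SciPy's implementations of the two tests. The concentration values below are invented for illustration (not NWQL data), and SciPy is assumed to be available:

```python
from scipy import stats

# Hypothetical paired nitrate + nitrite results from the two methods
# on the same eight samples (units: mg/L as N):
cfa_cdr  = [0.52, 1.10, 0.33, 2.40, 0.75, 1.90, 0.61, 1.25]
da_atnar = [0.50, 1.12, 0.35, 2.38, 0.74, 1.93, 0.60, 1.27]

# Parametric test on the mean of the paired differences:
t_stat, t_p = stats.ttest_rel(cfa_cdr, da_atnar)
# Nonparametric test on the signed ranks of the paired differences:
w_stat, w_p = stats.wilcoxon(cfa_cdr, da_atnar)

# p >= 0.05 in both tests: the difference between methods is
# statistically indistinguishable from zero, which is the report's
# criterion for method equivalence.
print(f"paired t-test p = {t_p:.3f}, Wilcoxon signed-rank p = {w_p:.3f}")
```

Running both tests, as the report does, guards against the paired differences being non-normal, which would undermine the t-test alone.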
An evaluation of GTAW-P versus GTA welding of alloy 718
NASA Technical Reports Server (NTRS)
Gamwell, W. R.; Kurgan, C.; Malone, T. W.
1991-01-01
Mechanical properties were evaluated to determine statistically whether the pulsed current gas tungsten arc welding (GTAW-P) process produces welds in alloy 718 with room temperature structural performance equivalent to current Space Shuttle Main Engine (SSME) welds manufactured by the constant current gas tungsten arc welding (GTAW) process. Evaluations were conducted on two base metal lots, two filler metal lots, two heat input levels, and two welding processes. The material form was 0.125-inch (3.175-mm) alloy 718 sheet. Prior to welding, sheets were treated to either the ST or STA-1 condition. After welding, panels were left as welded or heat treated to the STA-1 condition, and weld beads were left intact or machined flush. Statistical analyses were performed on yield strength, ultimate tensile strength (UTS), and high cycle fatigue (HCF) properties for all the post-welded material conditions. Analyses of variance were performed on the data to determine whether there were any significant effects on UTS or HCF life due to variations in base metal, filler metal, heat input level, or welding process. Statistical analyses showed that the GTAW-P process does produce welds with room temperature structural performance equivalent to current SSME welds manufactured by the GTAW process, regardless of prior material condition or post-welding condition.
Ethnic/racial misidentification in death: a problem which may distort suicide statistics.
Andres, V R
1977-01-01
Since the majority of suicide studies are ex post facto studies of demographic data collected by pathologists and coroners' investigators, the role of the forensic scientist in determining the accuracy of statistical analyses of death is extremely important. This paper discusses how two salient features of a decedent, surname and residence location, can be misleading in determining the ethnic/racial classification of the deceased. Because many Southern California Indians have Spanish surnames and most do not reside on an Indian reservation, it is shown that the suicide statistics may represent an over-estimation of actual Mexican-American suicidal deaths while simultaneously representing an under-estimation of suicides among American Indians of the region.
Use of Statistical Analyses in the Ophthalmic Literature
Lisboa, Renato; Meira-Freitas, Daniel; Tatham, Andrew J.; Marvasti, Amir H.; Sharpsten, Lucie; Medeiros, Felipe A.
2014-01-01
Purpose: To identify the most commonly used statistical analyses in the ophthalmic literature and to determine the likely gain in comprehension of the literature that readers could expect if they were to sequentially add knowledge of more advanced techniques to their statistical repertoire. Design: Cross-sectional study. Methods: All articles published from January 2012 to December 2012 in Ophthalmology, American Journal of Ophthalmology and Archives of Ophthalmology were reviewed. A total of 780 peer-reviewed articles were included. Two reviewers examined each article and assigned categories to each one depending on the type of statistical analyses used. Discrepancies between reviewers were resolved by consensus. Main Outcome Measures: Total number and percentage of articles containing each category of statistical analysis were obtained. Additionally, we estimated the accumulated number and percentage of articles that a reader would be expected to be able to interpret depending on their statistical repertoire. Results: Readers with little or no statistical knowledge would be expected to be able to interpret the statistical methods presented in only 20.8% of articles. In order to understand more than half (51.4%) of the articles published, readers were expected to be familiar with at least 15 different statistical methods. Knowledge of 21 categories of statistical methods was necessary to comprehend 70.9% of articles, while knowledge of more than 29 categories was necessary to comprehend more than 90% of articles. Articles in the retina and glaucoma subspecialties showed a tendency to use more complex analyses when compared with cornea. Conclusions: Readers of clinical journals in ophthalmology need to have substantial knowledge of statistical methodology to understand the results of published studies in the literature.
The frequency of use of complex statistical analyses also indicates that those involved in the editorial peer-review process must have sound statistical knowledge in order to critically appraise articles submitted for publication. The results of this study could provide guidance to direct the statistical learning of clinical ophthalmologists, researchers and educators involved in the design of courses for residents and medical students. PMID:24612977
Krawczyk, Christopher; Gradziel, Pat; Geraghty, Estella M.
2014-01-01
Objectives. We used a geographic information system and cluster analyses to determine locations in need of enhanced Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) Program services. Methods. We linked documented births in the 2010 California Birth Statistical Master File with the 2010 data from the WIC Integrated Statewide Information System. Analyses focused on the density of pregnant women who were eligible for but not receiving WIC services in California’s 7049 census tracts. We used incremental spatial autocorrelation and hot spot analyses to identify clusters of WIC-eligible nonparticipants. Results. We detected clusters of census tracts with higher-than-expected densities, compared with the state mean density of WIC-eligible nonparticipants, in 21 of 58 (36.2%) California counties (P < .05). In subsequent county-level analyses, we located neighborhood-level clusters of higher-than-expected densities of eligible nonparticipants in Sacramento, San Francisco, Fresno, and Los Angeles Counties (P < .05). Conclusions. Hot spot analyses provided a rigorous and objective approach to determine the locations of statistically significant clusters of WIC-eligible nonparticipants. Results helped inform WIC program and funding decisions, including the opening of new WIC centers, and offered a novel approach for targeting public health services. PMID:24354821
Quasi-Static Probabilistic Structural Analyses Process and Criteria
NASA Technical Reports Server (NTRS)
Goldberg, B.; Verderaime, V.
1999-01-01
Current deterministic structural methods are easily applied to substructures and components, and analysts have built great design insight and confidence in them over the years. However, deterministic methods cannot support systems risk analyses, and it was recently reported that deterministic treatment of statistical data is inconsistent with error propagation laws and can result in unevenly conservative structural predictions. Assuming normal distributions and using statistical data formats throughout prevailing deterministic stress processes leads to a safety factor in statistical format, which, integrated into the safety index, provides a safety factor and first-order reliability relationship. The embedded safety factor in the safety index expression allows a historically based risk to be determined and verified over a variety of quasi-static metallic substructures, consistent with traditional safety factor methods and NASA Std. 5001 criteria.
NASA Technical Reports Server (NTRS)
Salstein, D. A.; Rosen, R. D.
1982-01-01
A study was undertaken using the analyses produced from the assimilation cycle of parallel model runs that both include and withhold satellite data. Analyses of the state of the atmosphere were performed using data from a test period during the first Special Observing Period (SOP) of the Global Weather Experiment (FGGE).
The concept and use of elasticity in population viability models [Exercise 13
Carolyn Hull Sieg; Rudy M. King; Fred Van Dyke
2003-01-01
As you have seen in exercise 12, plants, such as the western prairie fringed orchid, typically have distinct life stages and complex life cycles that require the matrix analyses associated with a stage-based population model. Some statistics that can be generated from such matrix analyses can be very informative in determining which variables in the model have the...
A Meta-Analysis: The Relationship between Father Involvement and Student Academic Achievement
ERIC Educational Resources Information Center
Jeynes, William H.
2015-01-01
A meta-analysis was undertaken, including 66 studies, to determine the relationship between father involvement and the educational outcomes of urban school children. Statistical analyses were done to determine the overall impact and specific components of father involvement. The possible differing effects of paternal involvement by race were also…
CADDIS Volume 4. Data Analysis: Basic Analyses
Use of statistical tests to determine if an observation is outside the normal range of expected values. Details of CART, regression analysis, use of quantile regression analysis, CART in causal analysis, simplifying or pruning resulting trees.
Low-flow statistics of selected streams in Chester County, Pennsylvania
Schreffler, Curtis L.
1998-01-01
Low-flow statistics for many streams in Chester County, Pa., were determined on the basis of data from 14 continuous-record streamflow stations in Chester County and data from 1 station in Maryland and 1 station in Delaware. The stations in Maryland and Delaware are on streams that drain large areas within Chester County. Streamflow data through the 1994 water year were used in the analyses. The low-flow statistics summarized are the 1Q10, 7Q10, 30Q10, and harmonic mean. Low-flow statistics were estimated at 34 partial-record stream sites throughout Chester County.
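The 7Q10 statistic summarized above is the 7-day low flow with a 10-year recurrence interval; its per-year building block is each year's minimum 7-day mean flow, which can be sketched as follows (the flow series is synthetic and the cfs units are an assumption for illustration):

```python
import numpy as np

def annual_min_7day_mean(daily_flows):
    """Annual minimum 7-day mean flow: the smallest 7-day moving average
    within one year of daily mean flows. The 10-year recurrence value of
    this quantity across many years is the 7Q10 low-flow statistic."""
    q = np.asarray(daily_flows, dtype=float)
    # 7-day moving average via a length-7 convolution, valid windows only:
    seven_day_means = np.convolve(q, np.ones(7) / 7.0, mode="valid")
    return float(seven_day_means.min())

# A synthetic year: steady flow, then a recession to a ~2 cfs base flow.
year = np.concatenate([np.full(200, 10.0),
                       np.linspace(10.0, 2.0, 80),
                       np.full(85, 2.0)])
print(annual_min_7day_mean(year))
```

The frequency-analysis step (fitting a distribution to the annual minima to obtain the 10-year recurrence value) is a separate computation not shown here.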
Multiple statistical tests: Lessons from a d20.
Madan, Christopher R
2016-01-01
Statistical analyses are often conducted with α = .05. When multiple statistical tests are conducted, this procedure needs to be adjusted to compensate for the otherwise inflated Type I error. In tabletop gaming, it is sometimes desired to roll a 20-sided die (or 'd20') twice and take the greater outcome. Here I draw from probability theory and the case of a d20, where the probability of obtaining any specific outcome is 1/20, to determine the probability of obtaining a specific outcome (a Type I error) at least once across repeated, independent statistical tests.
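The at-least-once probability the abstract describes reduces to one line of arithmetic; a minimal sketch (the function name is mine):

```python
def p_at_least_once(n, p_single=1/20):
    """Probability of a specific outcome occurring at least once in n
    independent trials, each with probability p_single: the complement
    of it never occurring. With p_single = 1/20 this models a d20; with
    p_single = 0.05 it is the familywise Type I error of n tests."""
    return 1 - (1 - p_single) ** n

# One roll: 5%. Roll twice and take the greater: the chance of seeing a
# specific face (say, a natural 20) at least once rises to 9.75%.
print(round(p_at_least_once(1), 4))  # 0.05
print(round(p_at_least_once(2), 4))  # 0.0975
```

Because 1/20 equals the conventional α = .05, the d20 arithmetic transfers directly: running two independent tests at α = .05 nearly doubles the chance of at least one false positive.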
Pejtersen, Jan Hyld; Burr, Hermann; Hannerz, Harald; Fishta, Alba; Hurwitz Eller, Nanna
2015-01-01
The present review deals with the relationship between occupational psychosocial factors and the incidence of ischemic heart disease (IHD) with special regard to the statistical power of the findings. This review with 4 inclusion criteria is an update of a 2009 review of which the first 3 criteria were included in the original review: (1) STUDY: a prospective or case-control study if exposure was not self-reported (prognostic studies excluded); (2) OUTCOME: definite IHD determined externally; (3) EXPOSURE: psychosocial factors at work (excluding shift work, trauma, violence or accidents, and social capital); and (4) Statistical power: acceptable to detect a 20% increased risk in IHD. Eleven new papers met the inclusion criteria 1-3; a total of 44 papers were evaluated regarding inclusion criteria 4. Of 169 statistical analyses, only 10 analyses in 2 papers had acceptable statistical power. The results of the 2 papers pointed in the same direction, namely that only the control dimension of job strain explained the excess risk for myocardial infarction for job strain. The large number of underpowered studies and the focus on psychosocial models, such as the job strain models, make it difficult to determine to what extent psychosocial factors at work are risk factors of IHD. There is a need for considering statistical power when planning studies.
NASA Technical Reports Server (NTRS)
Foster, Winfred A., Jr.; Crowder, Winston; Steadman, Todd E.
2014-01-01
This paper presents the results of statistical analyses performed to predict the thrust imbalance between two solid rocket motor boosters to be used on the Space Launch System (SLS) vehicle. Two legacy internal ballistics codes developed for the Space Shuttle program were coupled with a Monte Carlo analysis code to determine a thrust imbalance envelope for the SLS vehicle based on the performance of 1000 motor pairs. Thirty-three variables that could impact the performance of the motors during the ignition transient and thirty-eight variables that could impact the performance of the motors during steady-state operation were identified and treated as statistical variables for the analyses. The effects of motor-to-motor variation as well as variations between motors of a single pair were included in the analyses. The statistical variations of the variables were defined based on data provided by NASA's Marshall Space Flight Center for the upgraded five-segment booster and from the Space Shuttle booster when appropriate. The results obtained for the statistical envelope are compared with the design specification thrust imbalance limits for the SLS launch vehicle.
Statistical analysis of field data for aircraft warranties
NASA Astrophysics Data System (ADS)
Lakey, Mary J.
Air Force and Navy maintenance data collection systems were researched to determine their scientific applicability to the warranty process. New and unique algorithms were developed to extract failure distributions, which were then used to characterize how selected families of equipment typically fail. Families of similar equipment were identified in terms of function, technology, and failure patterns. Statistical analyses and applications such as goodness-of-fit tests, maximum likelihood estimation, and derivation of confidence intervals for the probability density function parameters were applied to characterize the distributions and their failure patterns. Statistical and reliability theory, with relevance to equipment design and operational failures, were also determining factors in characterizing the failure patterns of the equipment families. Inferences about the families with relevance to warranty needs were then made.
Mediation analysis in nursing research: a methodological review.
Liu, Jianghong; Ulrich, Connie
2016-12-01
Mediation statistical models help clarify the relationship between independent predictor variables and dependent outcomes of interest by assessing the impact of third variables. This type of statistical analysis is applicable for many clinical nursing research questions, yet its use within nursing remains low. Indeed, mediational analyses may help nurse researchers develop more effective and accurate prevention and treatment programs as well as help bridge the gap between scientific knowledge and clinical practice. In addition, this statistical approach allows nurse researchers to ask - and answer - more meaningful and nuanced questions that extend beyond merely determining whether an outcome occurs. Therefore, the goal of this paper is to provide a brief tutorial on the use of mediational analyses in clinical nursing research by briefly introducing the technique and, through selected empirical examples from the nursing literature, demonstrating its applicability in advancing nursing science.
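A toy numerical illustration of the single-mediator model the review describes: the indirect effect is the product of the X→M path (a) and the M→Y path controlling for X (b), and for OLS estimates the decomposition total = direct + indirect holds exactly. All data below are simulated; the variable names are illustrative, not from any cited study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000
x = rng.standard_normal(n)
m = 0.5 * x + rng.standard_normal(n)             # true a = 0.5
y = 0.4 * m + 0.3 * x + rng.standard_normal(n)   # true b = 0.4, direct c' = 0.3

def ols_slope(X, y):
    """Least-squares slope coefficients for y ~ X (intercept included)."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

a = ols_slope(x, m)[0]                              # X -> M
b, c_prime = ols_slope(np.column_stack([m, x]), y)  # M -> Y and X -> Y given M
c_total = ols_slope(x, y)[0]                        # total effect of X on Y

indirect = a * b   # estimated indirect effect, close to 0.5 * 0.4 = 0.2
print(indirect, c_total - (c_prime + indirect))
```

In practice the indirect effect's uncertainty is usually assessed by bootstrap rather than read off directly.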
ERIC Educational Resources Information Center
Jeynes, William H.
2007-01-01
A meta-analysis is undertaken, including 52 studies, to determine the influence of parental involvement on the educational outcomes of urban secondary school children. Statistical analyses are done to determine the overall impact of parental involvement as well as specific components of parental involvement. Four different measures of educational…
Mineral discrimination using a portable ratio-determining radiometer.
Whitney, G.; Abrams, M.J.; Goetz, A.F.H.
1983-01-01
A portable ratio-determining radiometer has been tested in the laboratory to evaluate the use of narrow band filters for separating geologically important minerals. The instrument has 10 bands in the visible and near-infrared portion of the spectrum (0.5-2.4 µm), positioned to sample spectral regions having absorption bands characteristic of minerals in this wavelength region. Measurements and statistical analyses were performed on 66 samples, which were characterized by microscopic and X-ray diffraction analyses. Comparison with high-resolution laboratory spectral reflectance curves indicated that the radiometer's raw values faithfully reproduced the shapes of the spectra. -from Authors
A catalogue of /Fe/H/ determinations
NASA Astrophysics Data System (ADS)
Cayrel de Strobel, G.; Bentolila, C.; Hauck, B.; Curchod, A.
1980-09-01
A catalog of iron/hydrogen abundance ratios for 628 stars is compiled based on 1109 published values. The catalog consists of (1) a table of absolute iron abundance determinations in the solar photosphere as compiled by Blackwell (1974); (2) the iron/hydrogen abundances of 628 stars in the form of logarithmic differences between iron abundances in the given star and a standard star, obtained from analyses of high-dispersion spectra as well as useful stellar spectroscopic and photometric parameters; and (3) indications of the mean dispersion and wavelength interval used in the analyses. In addition, statistics on the distributions of the number of determinations per star and the apparent magnitudes and spectral types of the stars are presented.
Damiani, Lucas Petri; Berwanger, Otavio; Paisani, Denise; Laranjeira, Ligia Nasi; Suzumura, Erica Aranha; Amato, Marcelo Britto Passos; Carvalho, Carlos Roberto Ribeiro; Cavalcanti, Alexandre Biasi
2017-01-01
Background The Alveolar Recruitment for Acute Respiratory Distress Syndrome Trial (ART) is an international multicenter randomized pragmatic controlled trial with allocation concealment involving 120 intensive care units in Brazil, Argentina, Colombia, Italy, Poland, Portugal, Malaysia, Spain, and Uruguay. The primary objective of ART is to determine whether maximum stepwise alveolar recruitment associated with PEEP titration, adjusted according to the static compliance of the respiratory system (ART strategy), is able to increase 28-day survival in patients with acute respiratory distress syndrome compared to conventional treatment (ARDSNet strategy). Objective To describe the data management process and statistical analysis plan. Methods The statistical analysis plan was designed by the trial executive committee and reviewed and approved by the trial steering committee. We provide an overview of the trial design with a special focus on describing the primary (28-day survival) and secondary outcomes. We describe our data management process, data monitoring committee, interim analyses, and sample size calculation. We describe our planned statistical analyses for primary and secondary outcomes as well as pre-specified subgroup analyses. We also provide details for presenting results, including mock tables for baseline characteristics, adherence to the protocol and effect on clinical outcomes. Conclusion According to best trial practice, we report our statistical analysis plan and data management plan prior to locking the database and beginning analyses. We anticipate that this document will prevent analysis bias and enhance the utility of the reported results. Trial registration ClinicalTrials.gov number, NCT01374022. PMID:28977255
Brenčič, Mihael
2016-01-01
Northern hemisphere elementary circulation mechanisms, defined with the Dzerdzeevski classification and published on a daily basis from 1899–2012, are analysed with statistical methods as continuous categorical time series. The classification consists of 41 elementary circulation mechanisms (ECM), which are assigned to calendar days. Empirical marginal probabilities of each ECM were determined. Seasonality and the periodicity effect were investigated with moving dispersion filters and a randomisation procedure on the ECM categories as well as with time analyses of the ECM mode. The time series were determined to be non-stationary with strong time-dependent trends. During the investigated period, periodicity interchanges with periods when no seasonality is present. In the time series structure, the strongest division is visible at the milestone of 1986, showing that the atmospheric circulation pattern reflected in the ECM has significantly changed. This change is a result of a change in the frequency of ECM categories; before 1986, the appearance of ECM was more diverse, and afterwards fewer ECMs appear. The statistical approach applied to the categorical climatic time series opens up new potential insight into climate variability and change studies that have to be performed in the future. PMID:27116375
Systems and methods for detection of blowout precursors in combustors
Lieuwen, Tim C.; Nair, Suraj
2006-08-15
The present invention comprises systems and methods for detecting flame blowout precursors in combustors. The blowout precursor detection system comprises a combustor, a pressure measuring device, and a blowout precursor detection unit. A combustion controller may also be used to control combustor parameters. The methods of the present invention comprise receiving pressure data measured by an acoustic pressure measuring device; performing one or a combination of spectral analysis, statistical analysis, and wavelet analysis on the received pressure data; and determining the existence of a blowout precursor based on such analyses. The spectral analysis, statistical analysis, and wavelet analysis further comprise their respective sub-methods to determine the existence of blowout precursors.
Chapter C. The Loma Prieta, California, Earthquake of October 17, 1989 - Building Structures
Çelebi, Mehmet
1998-01-01
Several approaches are used to assess the performance of the built environment following an earthquake -- preliminary damage surveys conducted by professionals, detailed studies of individual structures, and statistical analyses of groups of structures. Reports of damage that are issued by many organizations immediately following an earthquake play a key role in directing subsequent detailed investigations. Detailed studies of individual structures and statistical analyses of groups of structures may be motivated by particularly good or bad performance during an earthquake. Beyond this, practicing engineers typically perform stress analyses to assess the performance of a particular structure to vibrational levels experienced during an earthquake. The levels may be determined from recorded or estimated ground motions; actual levels usually differ from design levels. If a structure has seismic instrumentation to record response data, the estimated and recorded response and behavior of the structure can be compared.
Detection of semi-volatile organic compounds in permeable ...
Abstract The Edison Environmental Center (EEC) has a research and demonstration permeable parking lot comprising three different permeable systems: permeable asphalt, porous concrete, and interlocking concrete permeable pavers. Water quality and quantity analysis has been ongoing since January 2010. This paper describes a subset of the water quality analysis: analysis of semivolatile organic compounds (SVOCs) to determine if hydrocarbons were in water infiltrated through the permeable surfaces. SVOCs were analyzed in samples collected on 11 dates over a 3-year period, from 2/8/2010 to 4/1/2013. Results are broadly divided into three categories: 42 chemicals were never detected; 12 chemicals (11 chemical tests) were detected at a frequency of less than 10%; and 22 chemicals were detected at a frequency of 10% or greater (ranging from 10% to 66.5% detections). Fundamental and exploratory statistical analyses were performed on these latter results by grouping results by surface type. The statistical analyses were limited due to the low frequency of detections and dilutions of samples, which impacted detection limits. The infiltrate data through three permeable surfaces were analyzed as non-parametric data by the Kaplan-Meier estimation method for fundamental statistics; there were some statistically observable differences in concentration between pavement types when using the Tarone-Ware Comparison Hypothesis Test. Additionally, Spearman Rank order non-parame
Accounting for Multiple Births in Neonatal and Perinatal Trials: Systematic Review and Case Study
Hibbs, Anna Maria; Black, Dennis; Palermo, Lisa; Cnaan, Avital; Luan, Xianqun; Truog, William E; Walsh, Michele C; Ballard, Roberta A
2010-01-01
Objectives To determine the prevalence in the neonatal literature of statistical approaches accounting for the unique clustering patterns of multiple births. To explore the sensitivity of an actual trial to several analytic approaches to multiples. Methods A systematic review of recent perinatal trials assessed the prevalence of studies accounting for clustering of multiples. The NO CLD trial served as a case study of the sensitivity of the outcome to several statistical strategies. We calculated odds ratios using non-clustered (logistic regression) and clustered (generalized estimating equations, multiple outputation) analyses. Results In the systematic review, most studies did not describe the randomization of twins and did not account for clustering. Of those studies that did, exclusion of multiples and generalized estimating equations were the most common strategies. The NO CLD study included 84 infants with a sibling enrolled in the study. Multiples were more likely than singletons to be white and were born to older mothers (p<0.01). Analyses that accounted for clustering were statistically significant; analyses assuming independence were not. Conclusions The statistical approach to multiples can influence the odds ratio and width of confidence intervals, thereby affecting the interpretation of a study outcome. A minority of perinatal studies address this issue. PMID:19969305
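The clustering problem described above can be illustrated with a small simulation (a toy setup, not the NO CLD data): when siblings share a pair-level effect and randomization is by pair, an analysis that treats every infant as independent understates the standard error of the treatment effect.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulated_trial_diffs(n_trials=2000, n_pairs=100):
    """Null trials randomized at the twin-pair level. Siblings share a
    pair effect (variance 1) plus individual noise (variance 1), so the
    within-pair intraclass correlation is 0.5."""
    diffs = np.empty(n_trials)
    for i in range(n_trials):
        assign = rng.integers(0, 2, n_pairs).astype(bool)  # pair-level arm
        pair_effect = rng.normal(0.0, 1.0, n_pairs)
        y = np.repeat(pair_effect, 2) + rng.normal(0.0, 1.0, 2 * n_pairs)
        t = np.repeat(assign, 2)
        diffs[i] = y[t].mean() - y[~t].mean()
    return diffs

diffs = simulated_trial_diffs()
empirical_sd = diffs.std()
# Naive SE treats all 200 infants as independent: sqrt(Var/n1 + Var/n2),
# with Var(y) = 2 and ~100 infants per arm.
naive_sd = np.sqrt(2.0 / 100 + 2.0 / 100)
print(round(empirical_sd / naive_sd, 2))  # close to sqrt(1.5) for ICC = 0.5
```

Clustered methods such as generalized estimating equations recover the larger, correct standard error; the naive analysis produces overconfident confidence intervals.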
An operational definition of a statistically meaningful trend.
Bryhn, Andreas C; Dimberg, Peter H
2011-04-28
Linear trend analysis of time series is standard procedure in many scientific disciplines. If the number of data is large, a trend may be statistically significant even if data are scattered far from the trend line. This study introduces and tests a quality criterion for time trends referred to as statistical meaningfulness, which is a stricter quality criterion for trends than high statistical significance. The time series is divided into intervals and interval mean values are calculated. Thereafter, r² and p values are calculated from regressions concerning time and interval mean values. If r² ≥ 0.65 at p ≤ 0.05 in any of these regressions, then the trend is regarded as statistically meaningful. Out of ten investigated time series from different scientific disciplines, five displayed statistically meaningful trends. A Microsoft Excel application (add-in) was developed which can perform statistical meaningfulness tests and which may increase the operationality of the test. The presented method for distinguishing statistically meaningful trends should be reasonably uncomplicated for researchers with basic statistics skills and may thus be useful for determining which trends are worth analysing further, for instance with respect to causal factors. The method can also be used for determining which segments of a time trend may be particularly worthwhile to focus on.
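The criterion described above is straightforward to sketch: split the series into intervals, correlate the interval means with time, and require r² ≥ 0.65 with p ≤ 0.05. The version below substitutes an exact permutation p-value over the interval means for the regression p-value used in the paper (an assumption of this sketch, chosen to stay dependency-free).

```python
import itertools
import numpy as np

def is_statistically_meaningful(y, n_intervals=5, r2_min=0.65, p_max=0.05):
    """Divide y into interval means, correlate them with time, and apply
    the r^2 >= 0.65, p <= 0.05 rule. The p-value is an exact permutation
    p over all orderings of the interval means."""
    chunks = np.array_split(np.asarray(y, dtype=float), n_intervals)
    means = np.array([c.mean() for c in chunks])
    t = np.arange(n_intervals)

    def corr(v):
        u, w = t - t.mean(), v - v.mean()
        denom = np.sqrt((u * u).sum() * (w * w).sum())
        return 0.0 if denom == 0 else float((u * w).sum() / denom)

    r_obs = corr(means)
    perms = list(itertools.permutations(means))
    p = sum(abs(corr(np.array(pm))) >= abs(r_obs) - 1e-12 for pm in perms) / len(perms)
    return r_obs ** 2 >= r2_min and p <= p_max

print(is_statistically_meaningful(np.arange(100.0)))                      # True
print(is_statistically_meaningful(np.repeat([0., 5., 0., 5., 0.], 20)))  # False
```

The second series is significant under an ordinary point-by-point regression lens but has no monotone structure at the interval scale, which is exactly the distinction the criterion is meant to draw.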
Multispectral determination of soil moisture-2. [Guymon, Oklahoma and Dalhart, Texas
NASA Technical Reports Server (NTRS)
Estes, J. E.; Simonett, D. S. (Principal Investigator); Hajic, E. J.; Hilton, B. M.; Lees, R. D.
1982-01-01
Soil moisture data obtained using scatterometers, modular multispectral scanners and passive microwave radiometers were revised and grouped into four field cover types for statistical analysis. Guymon data are grouped as alfalfa, bare, milo with rows perpendicular to the field of view, and milo viewed parallel to the field of view. Dalhart data are grouped as bare combo, stubble, disked stubble, and corn field. Summary graphs combine selected analyses to compare the effects of field cover. The analysis for each of the cover types is presented in tables and graphs. Other tables show elementary statistics, correlation matrices, and single variable regressions. Selected eigenvectors and factor analyses are included and the highest correlating sensor types for each location are summarized.
Wess, Bernard P.; Jacobson, Gary
1987-01-01
In the process of forming a new medical malpractice reinsurance company, the authors analyzed thousands of medical malpractice cases, settlements, and verdicts. The evidence of those analyses indicated that the medical malpractice crisis is (1) emerging nation- and world-wide, (2) exacerbated by but not primarily a result of “predatory” legal action, (3) statistically determined by a small percentage of physicians and procedures, (4) overburdened with data but poor on information, (5) subject to classic forms of quality control and automation. The management information system developed to address this problem features a tiered data base architecture to accommodate the medical, administrative, procedural, statistical, and actuarial analyses necessary to predict claims from untoward events, not merely to report them.
Emotional Intelligence Profiles and Learning Strategies in Secondary School Students
ERIC Educational Resources Information Center
Inglés, Cándido J.; Martínez-Monteagudo, María C.; Pérez Fuentes, Maria C.; García-Fernández, José M.; Molero, María del Mar; Suriá-Martinez, Raquel; Gázquez, José J.
2017-01-01
The aim of this study was to analyse the relationship between emotional intelligence (EI) and learning strategies, identifying different emotional intelligence profiles and determining possible statistically significant differences in learning strategies across the identified profiles. One thousand and seventy-one Spanish secondary school students…
NASA Technical Reports Server (NTRS)
Vangelder, B. H. W.
1978-01-01
Non-Bayesian statistics were used in simulation studies centered around laser range observations to LAGEOS. The capabilities of satellite laser ranging, especially in connection with relative station positioning, are evaluated. The satellite measurement system under investigation may fall short in precise determinations of the earth's orientation (precession and nutation) and earth's rotation as opposed to systems such as very long baseline interferometry (VLBI) and lunar laser ranging (LLR). Relative station positioning, determination of (differential) polar motion, positioning of stations with respect to the earth's center of mass and determination of the earth's gravity field should be easily realized by satellite laser ranging (SLR). The last two features should be considered as best (or solely) determinable by SLR in contrast to VLBI and LLR.
Using Meta-analyses for Comparative Effectiveness Research
Ruppar, Todd M.; Phillips, Lorraine J.; Chase, Jo-Ana D.
2012-01-01
Comparative effectiveness research seeks to identify the most effective interventions for particular patient populations. Meta-analysis is an especially valuable form of comparative effectiveness research because it emphasizes the magnitude of intervention effects rather than relying on tests of statistical significance among primary studies. Overall effects can be calculated for diverse clinical and patient-centered variables to determine the outcome patterns. Moderator analyses compare intervention characteristics among primary studies by determining if effect sizes vary among studies with different intervention characteristics. Intervention effectiveness can be linked to patient characteristics to provide evidence for patient-centered care. Moderator analyses often answer questions never posed by primary studies because neither multiple intervention characteristics nor populations are compared in single primary studies. Thus meta-analyses provide unique contributions to knowledge. Although meta-analysis is a powerful comparative effectiveness strategy, methodological challenges and limitations in primary research must be acknowledged to interpret findings. PMID:22789450
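The "overall effects" described above are typically computed by inverse-variance weighting of the primary-study effect sizes. A minimal fixed-effect sketch; the effect sizes and variances below are illustrative, not drawn from any cited meta-analysis:

```python
import numpy as np

def pooled_effect(effects, variances):
    """Fixed-effect pooled estimate: inverse-variance weighted mean,
    with standard error 1 / sqrt(sum of weights)."""
    w = 1.0 / np.asarray(variances, dtype=float)
    est = np.sum(w * np.asarray(effects, dtype=float)) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return est, se

effects = np.array([0.30, 0.45, 0.10, 0.25])    # per-study standardized effects
variances = np.array([0.02, 0.05, 0.01, 0.03])  # per-study sampling variances
est, se = pooled_effect(effects, variances)
print(round(est, 3), round(se, 3))
```

Moderator analyses then compare such pooled estimates across subgroups of studies sharing an intervention characteristic; random-effects models add a between-study variance term to the weights.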
Maximizing the significance in Higgs boson pair analyses [Mad-Maximized Higgs Pair Analyses
Kling, Felix; Plehn, Tilman; Schichtel, Peter
2017-02-22
Here, we study Higgs pair production with a subsequent decay to a pair of photons and a pair of bottoms at the LHC. We use the log-likelihood ratio to identify the kinematic regions which either allow us to separate the di-Higgs signal from backgrounds or to determine the Higgs self-coupling. We find that both regions are separate enough to ensure that details of the background modeling will not affect the determination of the self-coupling. Assuming dominant statistical uncertainties we determine the best precision with which the Higgs self-coupling can be probed in this channel. We finally comment on the same questions at a future 100 TeV collider.
Determination of quality parameters from statistical analysis of routine TLD dosimetry data.
German, U; Weinstein, M; Pelled, O
2006-01-01
Following the as low as reasonably achievable (ALARA) practice, there is a need to measure very low doses, of the same order of magnitude as the natural background, and the limits of detection of the dosimetry systems. The different contributions of the background signals to the total zero dose reading of thermoluminescence dosemeter (TLD) cards were analysed by using the common basic definitions of statistical indicators: the critical level (L(C)), the detection limit (L(D)) and the determination limit (L(Q)). These key statistical parameters for the system operated at NRC-Negev were quantified, based on the history of readings of the calibration cards in use. The electronic noise seems to play a minor role, but the reading of the Teflon coating (without the presence of a TLD crystal) gave a significant contribution.
O'Connor, B P
2000-08-01
Popular statistical software packages do not have the proper procedures for determining the number of components in factor and principal components analyses. Parallel analysis and Velicer's minimum average partial (MAP) test are validated procedures, recommended widely by statisticians. However, many researchers continue to use alternative, simpler, but flawed procedures, such as the eigenvalues-greater-than-one rule. Use of the proper procedures might be increased if these procedures could be conducted within familiar software environments. This paper describes brief and efficient programs for using SPSS and SAS to conduct parallel analyses and the MAP test.
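Horn's parallel analysis, one of the two validated procedures the paper implements for SPSS and SAS, can be sketched outside those packages: retain leading components only while their observed eigenvalues exceed the mean eigenvalues obtained from random data of the same shape. A minimal sketch on simulated two-factor data (the data-generating setup is illustrative):

```python
import numpy as np

def parallel_analysis(data, n_sims=100, seed=0):
    """Number of components to retain by Horn's parallel analysis:
    observed correlation-matrix eigenvalues are compared against the
    mean eigenvalues from random normal data of the same n x p shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand = np.zeros((n_sims, p))
    for i in range(n_sims):
        sim = rng.standard_normal((n, p))
        rand[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False)))[::-1]
    threshold = rand.mean(axis=0)
    n_keep = 0
    for o, t in zip(obs, threshold):
        if o > t:
            n_keep += 1
        else:
            break
    return n_keep

# Six variables loading on two independent latent factors.
rng = np.random.default_rng(1)
n = 500
f1, f2 = rng.standard_normal(n), rng.standard_normal(n)
X = np.column_stack(
    [f1 + 0.5 * rng.standard_normal(n) for _ in range(3)]
    + [f2 + 0.5 * rng.standard_normal(n) for _ in range(3)]
)
print(parallel_analysis(X))
```

Unlike the eigenvalues-greater-than-one rule, the threshold here adapts to sample size and the number of variables, which is why the procedure is the recommended default.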
Medical statistics and hospital medicine: the case of the smallpox vaccination.
Rusnock, Andrea
2007-01-01
Between 1799 and 1806, trials of vaccination to determine its safety and efficacy were undertaken in hospitals in London, Paris, Vienna, and Boston. These trials were among the first instances of formal hospital evaluations of a medical procedure and signal a growing acceptance of a relatively new approach to medical practice. These early evaluations of smallpox vaccination also relied on descriptive and quantitative accounts, as well as probabilistic analyses, and thus occupy a significant, yet hitherto unexamined, place in the history of medical statistics.
Type I and type II residual stress in iron meteorites determined by neutron diffraction measurements
NASA Astrophysics Data System (ADS)
Caporali, Stefano; Pratesi, Giovanni; Kabra, Saurabh; Grazzi, Francesco
2018-04-01
In this work we present a preliminary investigation, by means of neutron diffraction experiments, to determine the residual stress state in three different iron meteorites (Chinga, Sikhote Alin and Nantan). Because of the very peculiar microstructural characteristics of this class of samples, all the systematic effects related to the measuring procedure - such as crystallite size and composition - were taken into account, and a clear differentiation in the statistical distribution of residual stress between coarse- and fine-grained meteorites was highlighted. Moreover, the residual stress state was statistically analysed in three orthogonal directions, finding evidence of the existence of both type I and type II residual stress components. Finally, the application of the von Mises approach allowed us to determine the distribution of type II stress.
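For reference, the von Mises approach invoked above collapses three principal (orthogonal) stress components into a single equivalent stress. A minimal sketch with illustrative values:

```python
import math

def von_mises_stress(s1: float, s2: float, s3: float) -> float:
    """Equivalent (von Mises) stress from three principal stresses:
    sqrt( ((s1-s2)^2 + (s2-s3)^2 + (s3-s1)^2) / 2 )."""
    return math.sqrt(((s1 - s2) ** 2 + (s2 - s3) ** 2 + (s3 - s1) ** 2) / 2.0)

print(von_mises_stress(120.0, -40.0, 10.0))  # illustrative triaxial state, MPa
print(von_mises_stress(50.0, 50.0, 50.0))    # hydrostatic state -> 0.0
```

Because the measure vanishes for hydrostatic states, it isolates the shear-driven (deviatoric) part of the residual stress, which is what distinguishes type II intergranular stress.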
Stopka, Thomas J; Goulart, Michael A; Meyers, David J; Hutcheson, Marga; Barton, Kerri; Onofrey, Shauna; Church, Daniel; Donahue, Ashley; Chui, Kenneth K H
2017-04-20
Hepatitis C virus (HCV) infections have increased during the past decade but little is known about geographic clustering patterns. We used a unique analytical approach, combining geographic information systems (GIS), spatial epidemiology, and statistical modeling to identify and characterize HCV hotspots, statistically significant clusters of census tracts with elevated HCV counts and rates. We compiled sociodemographic and HCV surveillance data (n = 99,780 cases) for Massachusetts census tracts (n = 1464) from 2002 to 2013. We used a five-step spatial epidemiological approach, calculating incremental spatial autocorrelations and Getis-Ord Gi* statistics to identify clusters. We conducted logistic regression analyses to determine factors associated with the HCV hotspots. We identified nine HCV clusters, with the largest in Boston, New Bedford/Fall River, Worcester, and Springfield (p < 0.05). In multivariable analyses, we found that HCV hotspots were independently and positively associated with the percent of the population that was Hispanic (adjusted odds ratio [AOR]: 1.07; 95% confidence interval [CI]: 1.04, 1.09) and the percent of households receiving food stamps (AOR: 1.83; 95% CI: 1.22, 2.74). HCV hotspots were independently and negatively associated with the percent of the population that were high school graduates or higher (AOR: 0.91; 95% CI: 0.89, 0.93) and the percent of the population in the "other" race/ethnicity category (AOR: 0.88; 95% CI: 0.85, 0.91). We identified locations where HCV clusters were a concern, and where enhanced HCV prevention, treatment, and care can help combat the HCV epidemic in Massachusetts. GIS, spatial epidemiological and statistical analyses provided a rigorous approach to identify hotspot clusters of disease, which can inform public health policy and intervention targeting. 
Further studies that incorporate spatiotemporal cluster analyses, Bayesian spatial and geostatistical models, spatially weighted regression analyses, and assessment of associations between HCV clustering and the built environment are needed to expand upon our combined spatial epidemiological and statistical methods.
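The Getis-Ord Gi* statistic used above to flag hotspot tracts can be computed directly from its textbook definition. This is a minimal NumPy sketch on a toy one-dimensional "tract" layout; the values and binary neighbour weights are illustrative, not the Massachusetts surveillance data:

```python
import numpy as np

def getis_ord_gi_star(x, w):
    """Getis-Ord Gi* z-scores for values x under a spatial weights
    matrix w (self-weight w[i, i] included, as Gi* requires)."""
    x = np.asarray(x, float)
    n = x.size
    xbar = x.mean()
    s = np.sqrt((x ** 2).mean() - xbar ** 2)      # population SD
    wsum = w.sum(axis=1)                          # sum of weights per site
    s1 = (w ** 2).sum(axis=1)                     # sum of squared weights
    num = w @ x - xbar * wsum
    den = s * np.sqrt((n * s1 - wsum ** 2) / (n - 1))
    return num / den

# Nine "tracts" on a line, neighbours = adjacent tracts plus self
vals = np.array([1., 1., 1., 9., 9., 9., 1., 1., 1.])
n = vals.size
w = np.eye(n)
for i in range(n - 1):
    w[i, i + 1] = w[i + 1, i] = 1.0
z = getis_ord_gi_star(vals, w)
```

The centre of the high-value run gets the largest positive z-score, which is how clusters of elevated counts are flagged as hotspots.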
Field and laboratory analyses of water from the Columbia aquifer in Eastern Maryland
Bachman, L.J.
1984-01-01
Field and laboratory analyses of pH, alkalinity, and specific conductance from water samples collected from the Columbia aquifer on the Delmarva Peninsula in eastern Maryland were compared to determine if laboratory analyses could be used for making regional water-quality interpretations. Kruskal-Wallis tests of field and laboratory data indicate that the difference between field and laboratory values is usually not enough to affect the outcome of the statistical tests. Thus, laboratory measurements of these constituents may be adequate for making certain regional water-quality interpretations, although they may result in errors if used for geochemical interpretations.
Shitara, Kohei; Matsuo, Keitaro; Oze, Isao; Mizota, Ayako; Kondo, Chihiro; Nomura, Motoo; Yokota, Tomoya; Takahari, Daisuke; Ura, Takashi; Muro, Kei
2011-08-01
We performed a systematic review and meta-analysis to determine the impact of neutropenia or leukopenia experienced during chemotherapy on survival. Eligible studies included prospective or retrospective analyses that evaluated neutropenia or leukopenia as a prognostic factor for overall survival or disease-free survival. Statistical analyses were conducted to calculate a summary hazard ratio and 95% confidence interval (CI) using random-effects or fixed-effects models based on the heterogeneity of the included studies. Thirteen trials were selected for the meta-analysis, with a total of 9,528 patients. The hazard ratio of death was 0.69 (95% CI, 0.64-0.75) for patients with higher-grade neutropenia or leukopenia compared to patients with lower-grade or lack of cytopenia. Our analysis was also stratified by statistical method (any statistical method to decrease lead-time bias; time-varying analysis or landmark analysis), but no differences were observed. Our results indicate that neutropenia or leukopenia experienced during chemotherapy is associated with improved survival in patients with advanced cancer or hematological malignancies undergoing chemotherapy. Future prospective analyses designed to investigate the potential impact of chemotherapy dose adjustment coupled with monitoring of neutropenia or leukopenia on survival are warranted.
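The summary hazard ratio described above is conventionally obtained by inverse-variance pooling of log hazard ratios. A minimal fixed-effects sketch follows; the trial numbers are hypothetical illustrations, not the meta-analysis data:

```python
import math

def pooled_hazard_ratio(hrs, ci_los, ci_his):
    """Inverse-variance fixed-effects pooling of log hazard ratios.
    The SE of each log-HR is recovered from its 95% CI half-width."""
    z95 = 1.959964
    log_hrs = [math.log(h) for h in hrs]
    ses = [(math.log(hi) - math.log(lo)) / (2 * z95)
           for lo, hi in zip(ci_los, ci_his)]
    weights = [1 / se ** 2 for se in ses]
    pooled = sum(w * l for w, l in zip(weights, log_hrs)) / sum(weights)
    se_pooled = 1 / math.sqrt(sum(weights))
    return (math.exp(pooled),
            (math.exp(pooled - z95 * se_pooled),
             math.exp(pooled + z95 * se_pooled)))

# Hypothetical trials reporting HR with 95% CI
hr, (lo, hi) = pooled_hazard_ratio(
    hrs=[0.70, 0.65, 0.75],
    ci_los=[0.55, 0.50, 0.60],
    ci_his=[0.89, 0.85, 0.94])
```

A random-effects model, as used for heterogeneous study sets, additionally inflates each weight by a between-study variance term.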
NASA Astrophysics Data System (ADS)
Nyarko, B. J. B.; Bredwa-Mensah, Y.; Serfor-Armah, Y.; Dampare, S. B.; Akaho, E. H. K.; Osae, S.; Perbi, A.; Chatt, A.
2007-10-01
Concentrations of trace elements in ancient pottery excavated from Jenini in the Brong Ahafo region of Ghana were determined using instrumental neutron activation analysis (INAA) in conjunction with both conventional and Compton suppression counting. Jenini was a slave camp of Samory Toure during the indigenous slavery and the Trans-Atlantic slave trade. Pottery fragments found during the excavation of the grave tombs of the slaves who died in the slave camps were analysed. In all, 26 trace elements were determined in 40 pottery fragments. The elemental concentrations were processed using multivariate statistical methods (cluster, factor and discriminant analyses) in order to determine similarities and correlations between the various samples. The suitability of the two counting systems for the determination of trace elements in pottery objects has been evaluated.
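Grouping sherds by elemental profile, as described above, follows a standard workflow: standardize the concentrations, then cluster. This SciPy sketch uses synthetic two-group data with made-up element means, not the Jenini measurements:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

rng = np.random.default_rng(0)
# Hypothetical trace-element concentrations (ppm) for two pottery sources
group_a = rng.normal([50, 5, 120], [5, 0.5, 10], size=(20, 3))
group_b = rng.normal([80, 2, 60], [5, 0.5, 10], size=(20, 3))
conc = np.vstack([group_a, group_b])

z = zscore(conc, axis=0)            # standardize each element
tree = linkage(z, method="ward")    # Ward hierarchical clustering
labels = fcluster(tree, t=2, criterion="maxclust")
```

With well-separated compositions the two source groups fall cleanly into two clusters; factor and discriminant analyses would then characterize which elements drive the separation.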
ERIC Educational Resources Information Center
Salamy, A.
1981-01-01
Determines the frequency distribution of Brainstem Auditory Evoked Potential variables (BAEP) for premature babies at different stages of development--normal newborns, infants, young children, and adults. The author concludes that the assumption of normality underlying most "standard" statistical analyses can be met for many BAEP…
Determining Sample Sizes for Precise Contrast Analysis with Heterogeneous Variances
ERIC Educational Resources Information Center
Jan, Show-Li; Shieh, Gwowen
2014-01-01
The analysis of variance (ANOVA) is one of the most frequently used statistical analyses in practical applications. Accordingly, the single and multiple comparison procedures are frequently applied to assess the differences among mean effects. However, the underlying assumption of homogeneous variances may not always be tenable. This study…
A Measurement of Alienation in College Student Marihuana Users and Non-Users.
ERIC Educational Resources Information Center
Harris, Eileen M.
A three part questionnaire was administered to 1380 Southern Illinois University students to: (1) elicit demographic data; (2) determine the extent of experience with marihuana; and (3) measure alienation utilizing Dean's scale. In addition, the Minnesota Multiphasic Personality Lie Inventory was given. Statistical analyses were performed to…
Content and Citation Analyses of "Public Relations Review."
ERIC Educational Resources Information Center
Morton, Linda P.; Lin, Li-Yun
1995-01-01
Analyzes 161 cited and 177 uncited articles published in "Public Relations Review" (1975-93) to determine if 3 independent variables--research methods, type of statistics, and topics--influenced whether or not articles were cited in other research articles. Finds significant differences between quantitative and qualitative research methods but not…
2013-09-30
developed for the Victorian Marine Habitat Mapping Program (Ierodiaconou et al. 2007). These analyses will enable statistical comparisons of prey...determine whether patterns of specialisation observed in the videos reflected long-term trophic niche. The distribution of prey types encountered
The influence of control group reproduction on the statistical ...
Because of various Congressional mandates to protect the environment from endocrine disrupting chemicals (EDCs), the United States Environmental Protection Agency (USEPA) initiated the Endocrine Disruptor Screening Program. In the context of this framework, the Office of Research and Development within the USEPA developed the Medaka Extended One Generation Reproduction Test (MEOGRT) to characterize the endocrine action of a suspected EDC. One important endpoint of the MEOGRT is fecundity of breeding pairs of medaka. Power analyses were conducted to determine the number of replicates needed in proposed test designs and to determine the effects that varying reproductive parameters (e.g. mean fecundity, variance, and days with no egg production) will have on the statistical power of the test. A software tool, the MEOGRT Reproduction Power Analysis Tool, was developed to expedite these power analyses by both calculating estimates of the needed reproductive parameters (e.g. population mean and variance) and performing the power analysis under user specified scenarios. The manuscript illustrates how the reproductive performance of the control medaka that are used in a MEOGRT influences statistical power, and therefore the successful implementation of the protocol. Example scenarios, based upon medaka reproduction data collected at MED, are discussed that bolster the recommendation that facilities planning to implement the MEOGRT should have a culture of medaka with hi
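The kind of power analysis described above can be approximated by Monte Carlo simulation. The sketch below is a simplification under stated assumptions (Poisson daily egg counts, an independent zero-egg-day probability, and a t-test on pair totals, all with hypothetical parameter values); the actual MEOGRT tool may model fecundity differently:

```python
import numpy as np
from scipy import stats

def fecundity_power(n_reps, ctrl_mean, effect, zero_day_p=0.1,
                    n_days=21, alpha=0.05, n_sim=500, seed=0):
    """Monte Carlo power: probability of detecting a proportional
    reduction `effect` in mean daily egg counts, with some days
    producing no eggs, using a two-sample t-test on pair totals."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        lay_c = rng.random((n_reps, n_days)) > zero_day_p
        lay_t = rng.random((n_reps, n_days)) > zero_day_p
        ctrl = (rng.poisson(ctrl_mean, (n_reps, n_days)) * lay_c).sum(axis=1)
        trt = (rng.poisson(ctrl_mean * (1 - effect),
                           (n_reps, n_days)) * lay_t).sum(axis=1)
        if stats.ttest_ind(ctrl, trt).pvalue < alpha:
            hits += 1
    return hits / n_sim

# 12 breeding pairs per group, 20 eggs/day control mean, 20% reduction
power = fecundity_power(n_reps=12, ctrl_mean=20, effect=0.2)
```

Rerunning with fewer replicates or a smaller effect shows how quickly power degrades, which is the design question the tool answers.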
Statistical analysis of the determinations of the Sun's Galactocentric distance
NASA Astrophysics Data System (ADS)
Malkin, Zinovy
2013-02-01
Based on several tens of R0 measurements made during the past two decades, several studies have been performed to derive the best estimate of R0. Some used just simple averaging to derive a result, whereas others provided comprehensive analyses of possible errors in published results. In either case, detailed statistical analyses of data used were not performed. However, a computation of the best estimates of the Galactic rotation constants is not only an astronomical but also a metrological task. Here we perform an analysis of 53 R0 measurements (published in the past 20 years) to assess the consistency of the data. Our analysis shows that they are internally consistent. It is also shown that any trend in the R0 estimates from the last 20 years is statistically negligible, which renders the presence of a bandwagon effect doubtful. On the other hand, the formal errors in the published R0 estimates improve significantly with time.
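A metrological treatment of published R0 values, as described above, typically begins with an inverse-variance weighted mean and a weighted test for a time trend. A minimal NumPy sketch with hypothetical estimates (not the 53 measurements analysed in the paper):

```python
import numpy as np

# Hypothetical R0 estimates (kpc) with formal errors and publication years
years = np.array([1995.0, 1999.0, 2003.0, 2007.0, 2011.0])
r0 = np.array([8.1, 7.9, 8.3, 8.0, 8.2])
err = np.array([0.6, 0.5, 0.4, 0.35, 0.3])

w = 1 / err ** 2
r0_wmean = (w * r0).sum() / w.sum()          # inverse-variance weighted mean

# Weighted least-squares slope about the weighted mean epoch:
# a slope consistent with zero argues against a bandwagon-style drift
t = years - (w * years).sum() / w.sum()
slope = (w * t * r0).sum() / (w * t ** 2).sum()
slope_err = 1 / np.sqrt((w * t ** 2).sum())
```

If |slope| is small compared with its standard error, any apparent trend in the published estimates is statistically negligible, which is the paper's conclusion.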
Bassereau, Maud; Chaintreau, Alain; Duperrex, Stéphanie; Joulain, Daniel; Leijs, Hans; Loesing, Gerd; Owen, Neil; Sherlock, Alan; Schippa, Christine; Thorel, Pierre-Jean; Vey, Matthias
2007-01-10
The performance of the GC-MS determination of suspected allergens in fragrance concentrates has been investigated. The limit of quantification was experimentally determined (10 mg/L), and the variability was investigated for three different data treatment strategies: (1) two columns and three quantification ions; (2) two columns and one quantification ion; and (3) one column and three quantification ions. The first strategy best minimizes the risk of determination bias due to coelutions. This risk was evaluated by calculating the probability of coeluting a suspected allergen with perfume constituents exhibiting ions in common. For hydroxycitronellal, when using a two-column strategy, this may statistically occur more than once every 36 analyses for one ion or once every 144 analyses for three ions in common.
Immobilization/remobilization and the regulation of muscle mass
NASA Technical Reports Server (NTRS)
Almon, R. R.
1983-01-01
The relationship between animal body weight and the wet and dry weights of the soleus and EDL muscles was derived. Procedures were examined for tissue homogenization, fractionation, protein determination and DNA determination. A sequence of procedures and buffers were developed to carry out all analyses on one small muscle. This would yield a considerable increase in analytical strength associated with paired statistics. The proposed casting procedure which was to be used for immobilization was reexamined.
Statistical issues in quality control of proteomic analyses: good experimental design and planning.
Cairns, David A
2011-03-01
Quality control is becoming increasingly important in proteomic investigations as experiments become more multivariate and quantitative. Quality control applies to all stages of an investigation and statistics can play a key role. In this review, the role of statistical ideas in the design and planning of an investigation is described. This involves the design of unbiased experiments using key concepts from statistical experimental design, the understanding of the biological and analytical variation in a system using variance components analysis and the determination of a required sample size to perform a statistically powerful investigation. These concepts are described through simple examples and an example data set from a 2-D DIGE pilot experiment. Each of these concepts can prove useful in producing better and more reproducible data. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
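The sample-size determination mentioned above can be sketched with the standard normal-approximation formula for comparing two group means; this is a simplification (designs built on variance-components estimates need more care), with illustrative effect sizes:

```python
from math import ceil
from scipy.stats import norm

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sample
    comparison of means; effect_size = mean difference / SD."""
    z_a = norm.ppf(1 - alpha / 2)   # two-sided significance quantile
    z_b = norm.ppf(power)           # power quantile
    return ceil(2 * (z_a + z_b) ** 2 / effect_size ** 2)

n_large = n_per_group(0.8)   # "large" standardized effect
n_medium = n_per_group(0.5)  # "medium" standardized effect
```

Halving the detectable effect size roughly quadruples the required sample, which is why a variance-components pilot (such as the 2-D DIGE example in the review) is so valuable before committing to a design.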
Statistics for NAEG: past efforts, new results, and future plans
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, R.O.; Simpson, J.C.; Kinnison, R.R.
A brief review of Nevada Applied Ecology Group (NAEG) objectives is followed by a summary of past statistical analyses conducted by Pacific Northwest Laboratory for the NAEG. Estimates of spatial pattern of radionuclides and other statistical analyses at NS's 201, 219 and 221 are reviewed as background for new analyses presented in this paper. Suggested NAEG activities and statistical analyses needed for the projected termination date of NAEG studies in March 1986 are given.
Hu, Ting; Pan, Qinxin; Andrew, Angeline S; Langer, Jillian M; Cole, Michael D; Tomlinson, Craig R; Karagas, Margaret R; Moore, Jason H
2014-04-11
Several different genetic and environmental factors have been identified as independent risk factors for bladder cancer in population-based studies. Recent studies have turned to understanding the role of gene-gene and gene-environment interactions in determining risk. We previously developed the bioinformatics framework of statistical epistasis networks (SEN) to characterize the global structure of interacting genetic factors associated with a particular disease or clinical outcome. By applying SEN to a population-based study of bladder cancer among Caucasians in New Hampshire, we were able to identify a set of connected genetic factors with strong and significant interaction effects on bladder cancer susceptibility. To support our statistical findings using networks, in the present study, we performed pathway enrichment analyses on the set of genes identified using SEN, and found that they are associated with the carcinogen benzo[a]pyrene, a component of tobacco smoke. We further carried out an mRNA expression microarray experiment to validate statistical genetic interactions, and to determine if the set of genes identified in the SEN were differentially expressed in a normal bladder cell line and a bladder cancer cell line in the presence or absence of benzo[a]pyrene. Significant nonrandom sets of genes from the SEN were found to be differentially expressed in response to benzo[a]pyrene in both the normal bladder cells and the bladder cancer cells. In addition, the patterns of gene expression were significantly different between these two cell types. The enrichment analyses and the gene expression microarray results support the idea that SEN analysis of bladder cancer in population-based studies is able to identify biologically meaningful statistical patterns. These results bring us a step closer to a systems genetic approach to understanding cancer susceptibility that integrates population and laboratory-based studies.
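Pathway enrichment of a gene set, as used above, commonly reduces to a one-sided hypergeometric test. A minimal SciPy sketch with hypothetical counts (not the study's actual gene numbers):

```python
from scipy.stats import hypergeom

def enrichment_p(n_genome, n_pathway, n_selected, n_overlap):
    """One-sided hypergeometric test: probability of observing at
    least n_overlap pathway genes among n_selected genes drawn
    from a genome of n_genome genes, n_pathway of which belong
    to the pathway."""
    return hypergeom.sf(n_overlap - 1, n_genome, n_pathway, n_selected)

# Hypothetical: 20,000 genes, a 150-gene pathway, 40 network genes,
# 8 of which fall in the pathway (expected overlap is only 0.3)
p = enrichment_p(20000, 150, 40, 8)
```

In practice such p-values are computed for every candidate pathway and corrected for multiple testing (e.g. Benjamini-Hochberg) before an association is claimed.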
ERIC Educational Resources Information Center
Fulmer, Gavin W.; Polikoff, Morgan S.
2014-01-01
An essential component in school accountability efforts is for assessments to be well-aligned with the standards or curriculum they are intended to measure. However, relatively little prior research has explored methods to determine statistical significance of alignment or misalignment. This study explores analyses of alignment as a special case…
Sex Differences in the Accuracy of Intonation Perception: A Study of Oral Interpretation Students.
ERIC Educational Resources Information Center
Jensen, Marvin D.; Carlin, Phyllis Scott
1981-01-01
To determine whether gender differences exist in paralinguistic perception, 31 male and 59 female college students listened to an audiotape on which emotions were conveyed through ambiguous verbal statements and then indicated their perceptions of the conveyed emotions. Statistical analyses of the student responses in relation to students'…
Investigation of Primary Mathematics Student Teachers' Concept Images: Cylinder and Cone
ERIC Educational Resources Information Center
Ertekin, Erhan; Yazici, Ersen; Delice, Ali
2014-01-01
The aim of the present study is to determine the influence of concept definitions of cylinder and cone on primary mathematics student teachers' construction of relevant concept images. The study had a relational survey design and the participants were 238 primary mathematics student teachers. Statistical analyses implied the following: mathematics…
Kaur, S; Nieuwenhuijsen, M J
2009-07-01
Short-term human exposure concentrations to PM2.5, ultrafine particle counts (particle range: 0.02-1 µm), and carbon monoxide (CO) were investigated at and around a street canyon intersection in Central London, UK. During a four week field campaign, groups of four volunteers collected samples at three timings (morning, lunch, and afternoon), along two different routes (a heavily trafficked route and a backstreet route) via five modes of transport (walking, cycling, bus, car, and taxi). This was followed by an investigation into the determinants of exposure using a regression technique which incorporated the site-specific traffic counts, meteorological variables (wind speed and temperature) and the mode of transport used. The analyses explained 9, 62, and 43% of the variability observed in the exposure concentrations to PM2.5, ultrafine particle counts, and CO in this study, respectively. The mode of transport was a statistically significant determinant of personal exposure to PM2.5, ultrafine particle counts, and CO, and for PM2.5 and ultrafine particle counts it was the most important determinant. Traffic count explained little of the variability in the PM2.5 concentrations, but it had a greater influence on ultrafine particle count and CO concentrations. The analyses showed that temperature had a statistically significant impact on ultrafine particle count and CO concentrations. Wind speed also had a statistically significant, though smaller, effect. The small proportion of variability explained in PM2.5 by the model, compared with the larger proportions for ultrafine particle counts and CO, may be due to the effect of long-range transboundary sources, whereas for ultrafine particle counts and CO, local traffic is the main source.
Correlation between Na/K ratio and electron densities in blood samples of breast cancer patients.
Topdağı, Ömer; Toker, Ozan; Bakırdere, Sezgin; Bursalıoğlu, Ertuğrul Osman; Öz, Ersoy; Eyecioğlu, Önder; Demir, Mustafa; İçelli, Orhan
2018-05-31
The main purpose of this study was to investigate the relationship between electron densities and the Na/K ratio, which has an important role in breast cancer disease. Determinations of sodium and potassium concentrations in blood samples were performed with inductively coupled plasma-atomic emission spectrometry. Electron density values of blood samples were determined via ZXCOM. Statistical analyses were performed for the electron densities and Na/K ratios, including Kolmogorov-Smirnov normality tests, Spearman's rank correlation test and the Mann-Whitney U test. It was found that the electron densities differed significantly between the control and breast cancer groups. In addition, a statistically significant positive correlation was found between the electron density and Na/K ratios in the breast cancer group.
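The nonparametric comparisons named above (Mann-Whitney U between groups, Spearman correlation within a group) are straightforward with SciPy. The values below are synthetic illustrations, not the study's blood-sample data:

```python
import numpy as np
from scipy.stats import mannwhitneyu, spearmanr

rng = np.random.default_rng(0)
# Hypothetical electron-density values (arbitrary units) for two groups
control = rng.normal(3.30, 0.05, 40)
cancer = rng.normal(3.45, 0.05, 40)

# Mann-Whitney U: do the two groups differ in distribution?
u_stat, p_value = mannwhitneyu(control, cancer, alternative="two-sided")

# Spearman rank correlation: monotone association between two
# measurements within one group (second series is synthetic)
rho, p_rho = spearmanr(cancer, cancer * 1.2 + rng.normal(0, 0.01, 40))
```

Both tests are rank-based, which is why the abstract pairs them with a Kolmogorov-Smirnov normality check: they remain valid when normality fails.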
Bremberg, Sven G
2016-08-01
Studies of country-level determinants of health have produced conflicting results even when the analyses have been restricted to high-income countries. Yet, most of these studies have not taken historical, country-specific developments into account. Thus, it is appropriate to separate the influence of current exposures from historical aspects. Determinants of the infant mortality rate (IMR) were studied in 28 OECD countries over the period 1990-2012. Twelve determinants were selected. They refer to the level of general resources, resources that specifically address child health and characteristics that affect knowledge dissemination, including level of trust, and a health-related behaviour: the rate of female smoking. Bivariate analyses with the IMR in year 2000 as outcome and the 12 determinants produced six statistically significant models. In multivariate analyses, the rate of decrease in the IMR was investigated as outcome and a history variable (IMR in 1990) was included in the models. The history variable alone explained 95% of the variation. None of the multivariate models, with the 12 determinants included, explained significantly more variation. Taking into account the historical development of the IMR will critically affect correlations between country-level determinants and the IMR. © The Author 2016. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casar, B; Carot, I Mendez; Peterlin, P
2016-06-15
Purpose: The aim of this multi-centre study was to analyse the beam hardening effect of the Integral Quality Monitor (IQM) for high energy photon beams used in radiotherapy with linear accelerators. Generic values for the attenuation coefficient k(IQM) of the IQM system were additionally investigated. Methods: The beam hardening effect of the IQM system was studied for a set of standard nominal photon energies (6 MV-18 MV) and two flattening filter free (FFF) energies (6 MV FFF and 10 MV FFF). PDD curves were measured and analysed for various square radiation fields, with and without the IQM in place. Differences between PDD curves were statistically analysed through comparison of the respective PDD20,10 values. Attenuation coefficients k(IQM) were determined for the same range of photon energies. Results: Statistically significant differences in beam quality were found for all evaluated high energy photon beams when comparing PDD20,10 values derived from PDD curves with and without the IQM in place. The beam hardening effect was statistically significant with high confidence (p < 0.01) for all analysed photon beams except 15 MV (p = 0.078), although the relative differences in beam quality were minimal, ranging from 0.1% to 0.5%. Attenuation of the IQM system showed negligible dependence on radiation field size. However, a clinically important dependence of k(IQM) on TPR20,10 was found: from 0.941 for 6 MV photon beams to 0.959 for 18 MV photon beams, with the largest uncertainty below 0.006. k(IQM) values versus TPR20,10 were tabulated, and a polynomial equation for the determination of k(IQM) is suggested for clinical use. Conclusion: There was no clinically relevant beam hardening when the IQM system was mounted on linear accelerators. Consequently, no additional commissioning is needed for the IQM system regarding the determination of beam qualities. Generic values for k(IQM) are proposed and can be used as tray factors for the complete range of examined photon beam energies.
Thomsen, Sarah; Ng, Nawi; Biao, Xu; Bondjers, Göran; Kusnanto, Hari; Liem, Nguyen Tanh; Mavalankar, Dileep; Målqvist, Mats; Diwan, Vinod
2013-03-13
The Millennium Development Goals (MDGs) are monitored using national-level statistics, which have shown substantial improvements in many countries. These statistics may be misleading, however, and may divert resources from disadvantaged populations within the same countries that are showing progress. The purpose of this article is to set out the relevance and design of the "Evidence for Policy and Implementation project (EPI-4)". EPI-4 aims to contribute to the reduction of inequities in the achievement of health-related MDGs in China, India, Indonesia and Vietnam through the promotion of research-informed policymaking. Using a framework provided by the Commission on the Social Determinants of Health (CSDH), we compare national-level MDG targets and results, as well as their social and structural determinants, in China, India, Indonesia and Vietnam. To understand country-level MDG achievements it is useful to analyze their social and structural determinants. This analysis is not sufficient, however, to understand within-country inequities. Specialized analyses are required for this purpose, as is discussion and debate of the results with policymakers, which is the aim of the EPI-4 project. Reducing health inequities requires sophisticated analyses to identify disadvantaged populations within and between countries, and to determine evidence-based solutions that will make a difference. The EPI-4 project hopes to contribute to this goal.
Quantifying Safety Margin Using the Risk-Informed Safety Margin Characterization (RISMC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, David; Bucknor, Matthew; Brunett, Acacia
2015-04-26
The Risk-Informed Safety Margin Characterization (RISMC), developed by Idaho National Laboratory as part of the Light-Water Reactor Sustainability Project, utilizes a probabilistic safety margin comparison between a load and capacity distribution, rather than a deterministic comparison between two values, as is usually done in best-estimate plus uncertainty analyses. The goal is to determine the failure probability, or in other words, the probability of the system load equaling or exceeding the system capacity. While this method has been used in pilot studies, there has been little work conducted investigating the statistical significance of the resulting failure probability. In particular, it is difficult to determine how many simulations are necessary to properly characterize the failure probability. This work uses classical (frequentist) statistics and confidence intervals to examine the impact in statistical accuracy when the number of simulations is varied. Two methods are proposed to establish confidence intervals related to the failure probability established using a RISMC analysis. The confidence interval provides information about the statistical accuracy of the method utilized to explore the uncertainty space, and offers a quantitative method to gauge the increase in statistical accuracy due to performing additional simulations.
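One standard way to attach a confidence interval to a failure probability estimated from a finite number of simulations, in the spirit of the frequentist approach above, is the exact (Clopper-Pearson) binomial interval. A minimal SciPy sketch with illustrative counts:

```python
from scipy.stats import beta

def failure_prob_ci(n_fail, n_sim, conf=0.95):
    """Clopper-Pearson (exact) confidence interval for a failure
    probability estimated from n_fail failures in n_sim simulations."""
    a = (1 - conf) / 2
    lo = 0.0 if n_fail == 0 else beta.ppf(a, n_fail, n_sim - n_fail + 1)
    hi = 1.0 if n_fail == n_sim else beta.ppf(1 - a, n_fail + 1,
                                              n_sim - n_fail)
    return lo, hi

# Example: 3 load-exceeds-capacity events observed in 1,000 simulations
lo, hi = failure_prob_ci(3, 1000)
```

The interval width shrinks roughly as one over the square root of the simulation count, which quantifies the value of running additional simulations.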
Pugh, Aaron L.
2014-01-01
Users of streamflow information often require streamflow statistics and basin characteristics at various locations along a stream. The USGS periodically calculates and publishes streamflow statistics and basin characteristics for streamflow-gaging stations and partial-record stations, but these data commonly are scattered among many reports that may or may not be readily available to the public. The USGS also provides and periodically updates regional analyses of streamflow statistics that include regression equations and other prediction methods for estimating statistics for ungaged and unregulated streams across the State. Use of these regional predictions for a stream can be complex and often requires the user to determine a number of basin characteristics that may require interpretation. Basin characteristics may include drainage area, classifiers for physical properties, climatic characteristics, and other inputs. Obtaining these input values for gaged and ungaged locations traditionally has been time consuming, subjective, and can lead to inconsistent results.
Benetz, B A; Diaconu, E; Bowlin, S J; Oak, S S; Laing, R A; Lass, J H
1999-01-01
To compare corneal endothelial image analysis by the Konan SP8000 and Bio-Optics Bambi image-analysis systems. Corneal endothelial images from 98 individuals (191 eyes), ranging in age from 4 to 87 years, with a normal slit-lamp examination and no history of ocular trauma, intraocular surgery, or intraocular inflammation were obtained by the Konan SP8000 noncontact specular microscope. One observer analyzed these images by using the Konan system and a second observer by using the Bio-Optics Bambi system. Three methods of analyses were used: a fixed-frame method to obtain cell density (for both Konan and Bio-Optics Bambi) and a "dot" (Konan) or "corners" (Bio-Optics Bambi) method to determine morphometric parameters. The cell density determined by the Konan fixed-frame method was significantly higher (157 cells/mm2) than the Bio-Optics Bambi fixed-frame method determination (p<0.0001). However, the difference in cell density, although still statistically significant, was smaller and reversed when comparing the Konan fixed-frame method with both the Konan dot and Bio-Optics Bambi corners methods (-74 cells/mm2, p<0.0001; -55 cells/mm2, p<0.0001, respectively). Small but statistically significant morphometric differences between Konan and Bio-Optics Bambi were seen: cell density, +19 cells/mm2 (p = 0.03); cell area, -3.0 µm2 (p = 0.008); and coefficient of variation, +1.0 (p = 0.003). There was no statistically significant difference between these two methods in the percentage of six-sided cells detected (p = 0.55). Cell densities measured by the Konan fixed-frame method were comparable with the Konan and Bio-Optics Bambi morphometric analyses, but not with the Bio-Optics Bambi fixed-frame method. The two morphometric analyses were comparable with minimal or no differences for the parameters that were studied.
The Konan SP8000 endothelial image-analysis system may be useful for large-scale clinical trials determining cell loss; its noncontact system has many clinical benefits (including patient comfort, safety, ease of use, and short procedure time) and provides reliable cell-density calculations.
Hill, Shannon B; Faradzhev, Nadir S; Powell, Cedric J
2017-12-01
We discuss the problem of quantifying common sources of statistical uncertainties for analyses of trace levels of surface contamination using X-ray photoelectron spectroscopy. We examine the propagation of error for peak-area measurements using common forms of linear and polynomial background subtraction including the correlation of points used to determine both background and peak areas. This correlation has been neglected in previous analyses, but we show that it contributes significantly to the peak-area uncertainty near the detection limit. We introduce the concept of relative background subtraction variance (RBSV) which quantifies the uncertainty introduced by the method of background determination relative to the uncertainty of the background area itself. The uncertainties of the peak area and atomic concentration and of the detection limit are expressed using the RBSV, which separates the contributions from the acquisition parameters, the background-determination method, and the properties of the measured spectrum. These results are then combined to find acquisition strategies that minimize the total measurement time needed to achieve a desired detection limit or atomic-percentage uncertainty for a particular trace element. Minimization of data-acquisition time is important for samples that are sensitive to x-ray dose and also for laboratories that need to optimize throughput.
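The propagation of counting statistics through a linear background subtraction, discussed above, can be illustrated for the simplified case where the background windows lie outside the peak region, so their counts are independent of the peak channels (the correlated case analysed in the paper is more involved). The synthetic spectrum and window positions below are hypothetical:

```python
import numpy as np

def peak_area_with_uncertainty(y, lo_win, peak_win, hi_win):
    """Net peak area with a linear background fixed by the mean
    counts in two side windows; Poisson counting variance
    (var = counts) is propagated through the background estimate."""
    y = np.asarray(y, float)
    x = np.arange(y.size)
    lo_mean, hi_mean = y[lo_win].mean(), y[hi_win].mean()
    x_lo, x_hi = x[lo_win].mean(), x[hi_win].mean()
    c = (x[peak_win] - x_lo) / (x_hi - x_lo)   # interpolation weights
    bg = lo_mean + (hi_mean - lo_mean) * c
    area = (y[peak_win] - bg).sum()
    # Background variance enters every peak channel through the two
    # window means, scaled by the squared sums of the weights
    var_lo = y[lo_win].sum() / y[lo_win].size ** 2
    var_hi = y[hi_win].sum() / y[hi_win].size ** 2
    var = (y[peak_win].sum()
           + (1 - c).sum() ** 2 * var_lo
           + c.sum() ** 2 * var_hi)
    return area, np.sqrt(var)

# Synthetic spectrum: flat 100-count background plus a 50-count peak
y = np.full(60, 100.0)
y[25:35] += 50.0
area, sigma = peak_area_with_uncertainty(
    y, slice(0, 10), slice(20, 40), slice(50, 60))
```

Note how the background terms dominate the variance as the peak shrinks toward the detection limit; widening the side windows reduces var_lo and var_hi at the cost of acquisition time, the trade-off the paper optimizes.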
Radulović, Niko S; Genčić, Marija S; Stojanović, Nikola M; Randjelović, Pavle J; Stojanović-Radić, Zorica Z; Stojiljković, Nenad I
2017-07-01
Neurotoxic thujones (α- and β-diastereoisomers) are common constituents of plant essential oils. In this study, we employed a statistical approach to determine the contribution of thujones to the overall observed behaviour-modulating and toxic effects of essential oils (Salvia officinalis L., Artemisia absinthium L., Thuja occidentalis L. and Tanacetum vulgare L.) containing these monoterpene ketones. The data from three in vivo neuropharmacological tests on rats (open field, light-dark, and diazepam-induced sleep), and toxicity assays (brine shrimp, and antimicrobial activity against a panel of microorganisms), together with the data from detailed chemical analyses, were subjected to a multivariate statistical treatment to reveal the possible correlation(s) between the content of essential-oil constituents and the observed effects. The results strongly imply that the toxic and behaviour-modulating activity of the oils (hundreds of constituents) should not be associated exclusively with thujones. The statistical analyses pinpointed a number of essential-oil constituents other than thujones that demonstrated a clear correlation with either the toxicity, the antimicrobial effect or the activity on the CNS. Thus, in addition to the thujone content, the amount and toxicity of other constituents should be taken into consideration when making risk assessments and determining the regulatory status of plants in food and medicines. Copyright © 2017 Elsevier Ltd. All rights reserved.
Improta, Roberto; Vitagliano, Luigi; Esposito, Luciana
2015-11-01
The elucidation of the mutual influence between peptide bond geometry and local conformation has important implications for protein structure refinement, validation, and prediction. To gain insights into the structural determinants and the energetic contributions associated with protein/peptide backbone plasticity, we here report an extensive analysis of the variability of the peptide bond angles by combining statistical analyses of protein structures and quantum mechanics calculations on small model peptide systems. Our analyses demonstrate that all the backbone bond angles strongly depend on the peptide conformation and unveil the existence of regular trends as a function of ψ and/or φ. The excellent agreement of the quantum mechanics calculations with the statistical surveys of protein structures validates the computational scheme here employed and demonstrates that the valence geometry of the protein/peptide backbone is primarily dictated by local interactions. Notably, for the first time we show that the position of the H(α) hydrogen atom, which is an important parameter in NMR structural studies, is also dependent on the local conformation. Most of the trends observed may be satisfactorily explained by invoking steric repulsive interactions; in some specific cases the valence bond variability is also influenced by hydrogen-bond-like interactions. Moreover, we can provide a reliable estimate of the energies involved in the interplay between geometry and conformations. © 2015 Wiley Periodicals, Inc.
msap: a tool for the statistical analysis of methylation-sensitive amplified polymorphism data.
Pérez-Figueroa, A
2013-05-01
In this study, msap, an R package which analyses methylation-sensitive amplified polymorphism (MSAP or MS-AFLP) data, is presented. The program provides a deep analysis of epigenetic variation starting from a binary data matrix indicating the banding pattern between the isoschizomeric endonucleases HpaII and MspI, which have differential sensitivity to cytosine methylation. After comparing the restriction fragments, the program determines whether each fragment is susceptible to methylation (representative of epigenetic variation) or shows no evidence of methylation (representative of genetic variation). The package provides, in a user-friendly command line interface, a pipeline of different analyses of the variation (genetic and epigenetic) among user-defined groups of samples, as well as the classification of the methylation occurrences in those groups. Statistical testing provides support to the analyses. A comprehensive report of the analyses and several useful plots can help researchers to assess the epigenetic and genetic variation in their MSAP experiments. msap is downloadable from CRAN (http://cran.r-project.org/) and its own webpage (http://msap.r-forge.R-project.org/). The package is intended to be easy to use even for those people unfamiliar with the R command line environment. Advanced users may take advantage of the available source code to adapt msap to more complex analyses. © 2013 Blackwell Publishing Ltd.
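msap itself is an R package; as a language-neutral illustration of the scoring idea it implements, the sketch below classifies one fragment's HpaII/MspI banding state and flags methylation-susceptible fragments. The function names and the simplified "digests ever disagree" susceptibility rule are assumptions for this example, not the package's API:

```python
def score_msap(hpa, msp):
    """Classify one fragment in one sample from its HpaII/MspI banding
    pattern (1 = band present, 0 = absent).  A simplified restatement of
    MSAP scoring; msap itself applies further per-fragment polymorphism
    tests across the user-defined groups of samples."""
    if hpa and msp:
        return "unmethylated"            # both isoschizomers cut CCGG
    if hpa and not msp:
        return "hemimethylated"          # methylation blocks MspI only
    if msp and not hpa:
        return "internal methylation"    # methylation blocks HpaII only
    return "full methylation or absent"  # uninformative state

def fragment_is_methylation_susceptible(patterns):
    """A fragment shows evidence of methylation (epigenetic variation)
    if the two digests ever disagree for any sample; otherwise it is
    treated as purely genetic variation."""
    return any(h != m for h, m in patterns)
```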
First Monte Carlo analysis of fragmentation functions from single-inclusive e+e- annihilation
Sato, Nobuo; Ethier, J. J.; Melnitchouk, W.; ...
2016-12-02
Here, we perform the first iterative Monte Carlo (IMC) analysis of fragmentation functions constrained by all available data from single-inclusive $e^+ e^-$ annihilation into pions and kaons. The IMC method eliminates potential bias in traditional analyses based on single fits introduced by fixing parameters not well constrained by the data, and provides a statistically rigorous determination of uncertainties. Our analysis reveals specific differences between the fragmentation functions obtained using the new IMC methodology and those obtained from previous analyses, especially for light quarks and for strange quark fragmentation to kaons.
Visualizing statistical significance of disease clusters using cartograms.
Kronenfeld, Barry J; Wong, David W S
2017-05-15
Health officials and epidemiological researchers often use maps of disease rates to identify potential disease clusters. Because these maps exaggerate the prominence of low-density districts and hide potential clusters in urban (high-density) areas, many researchers have used density-equalizing maps (cartograms) as a basis for epidemiological mapping. However, there are no existing guidelines for the visual assessment of statistical uncertainty on such maps. To address this shortcoming, we develop techniques for visual determination of the statistical significance of clusters spanning one or more districts on a cartogram. We developed the techniques within a geovisual analytics framework that does not rely on automated significance testing, and can therefore facilitate visual analysis to detect clusters that automated techniques might miss. On a cartogram of the at-risk population, the statistical significance of a disease cluster can be determined from the rate, area and shape of the cluster under standard hypothesis-testing scenarios. We develop formulae to determine, for a given rate, the area required for statistical significance of a priori and a posteriori designated regions under certain test assumptions. Uniquely, our approach enables dynamic inference on aggregate regions formed by combining individual districts. The method is implemented in interactive tools that provide choropleth mapping, automated legend construction and dynamic search tools to facilitate cluster detection and assessment of the validity of tested assumptions. A case study of leukemia incidence in California demonstrates the ability to visually distinguish between statistically significant and insignificant regions. The proposed geovisual analytics approach enables intuitive visual assessment of the statistical significance of arbitrarily defined regions on a cartogram. 
Our research prompts a broader discussion of the role of geovisual exploratory analyses in disease mapping and the appropriate framework for visually assessing the statistical significance of spatial clusters.
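The rate-versus-area trade-off underlying this approach can be illustrated with a one-sided Poisson test, where at-risk population plays the role of area on a density-equalizing cartogram. This is a hedged sketch of the general idea under a simple null hypothesis; `cluster_p_value` and the example rates are invented for illustration and are not the authors' formulae:

```python
import math

def poisson_sf(k, mu):
    """P(X >= k) for X ~ Poisson(mu), by direct summation of the pmf."""
    p = math.exp(-mu)            # pmf at 0
    cdf = 0.0
    for i in range(k):
        cdf += p
        p *= mu / (i + 1)
    return max(0.0, 1.0 - cdf)

def cluster_p_value(cases, population, base_rate):
    """One-sided a-priori test of a candidate cluster: probability of
    observing `cases` or more in an at-risk `population` if the true
    rate were the background `base_rate`.  On a cartogram of the
    at-risk population, `population` is proportional to area."""
    return poisson_sf(cases, base_rate * population)
```

With the same doubled rate, a cluster covering a small population is indistinguishable from chance, while an identical rate over a ten-times larger population is significant — which is why, on a population cartogram, a cluster's area carries significance information.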
Bynum, T E; Koch, G G
1991-08-08
We sought to compare the efficacy of sucralfate to placebo for the prevention of duodenal ulcer recurrence and to determine whether the efficacy of sucralfate was due to a true reduction in ulcer prevalence and not due to secondary effects such as analgesic activity or accelerated healing. This was a double-blind, randomized, placebo-controlled, parallel-group, multicenter clinical study with 254 patients. All patients had a past history of at least two duodenal ulcers, with at least one ulcer diagnosed by endoscopic examination 3 months or less before the start of the study. Complete ulcer healing without erosions was required to enter the study. Sucralfate or placebo was dosed as a 1-g tablet twice a day for 4 months, or until ulcer recurrence. The presence or absence of duodenal ulcers was determined by endoscopic examinations performed once a month and whenever symptoms developed. If a patient developed an ulcer between monthly scheduled visits, the patient was dosed with a 1-g sucralfate tablet twice a day until the next scheduled visit. Statistical analyses of the results determined the efficacy of sucralfate compared with placebo for preventing duodenal ulcer recurrence. Comparisons of therapeutic agents for preventing duodenal ulcers have usually been made by testing for statistical differences in the cumulative rates for all ulcers developed during a follow-up period, regardless of the time of detection. Statistical experts at the United States Food and Drug Administration (FDA) and on the FDA Advisory Panel expressed doubts about clinical study results based on this type of analysis. They suggested three possible mechanisms for reducing the number of observed ulcers: (a) analgesic effects, (b) accelerated healing, and (c) true ulcer prevention. Traditional ulcer analysis could miss recurring ulcers due to an analgesic effect or accelerated healing. Point-prevalence analysis could miss recurring ulcers due to accelerated healing between endoscopic examinations. 
Maximum ulcer analysis, a novel statistical method, eliminated analgesic effects by means of regularly scheduled endoscopies and eliminated accelerated healing of recurring ulcers by means of frequent endoscopies and an open-label phase. Maximum ulcer analysis therefore reflects true ulcer recurrence and prevention. Sucralfate was significantly superior to placebo in reducing ulcer prevalence by all analyses. Significance (p less than 0.05) was found at months 3 and 4 for all analyses: all months were significant in the traditional analysis, months 2-4 in the point-prevalence analysis, and months 3-4 in the maximum ulcer analysis. Sucralfate was shown to be effective for the prevention of duodenal ulcer recurrence through a true reduction in new ulcer development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, W.G.; Spaletto, M.I.; Lewis, K.
The method of plutonium (Pu) determination at the New Brunswick Laboratory (NBL) consists of a combination of ion-exchange purification followed by controlled-potential coulometric analysis (IE/CPC). The present report's purpose is to quantify any detectable Pu loss occurring in the ion-exchange (IE) purification step, which would cause a negative bias in the NBL method for Pu analysis. The magnitude of any such loss would be contained within the reproducibility (0.05%) of the IE/CPC method, which utilizes a state-of-the-art autocoulometer developed at NBL. When the NBL IE/CPC method is used for Pu analysis, any loss in ion-exchange purification (<0.05%) is confounded with the repeatability of the ion exchange and the precision of the CPC analysis technique (<0.05%). Consequently, detecting a bias in the IE/CPC method due to the IE alone, using the IE/CPC method itself, requires that many randomized analyses on a single material be performed over time and that statistical analysis of the data be performed. The initial approach described in this report to quantify any IE loss was an independent method, isotope dilution mass spectrometry; however, the number of analyses performed was insufficient to assign a statistically significant value to the IE loss (<0.02% of 10 mg samples of Pu). The second method used for quantifying any IE loss of Pu was multiple ion exchanges of the same Pu aliquant; the small number of analyses possible per individual IE, together with the column-to-column variability over multiple ion exchanges, prevented statistical detection of any loss of <0.05%. 12 refs.
Determinants of Demand for Private Supplementary Tutoring in China: Findings from a National Survey
ERIC Educational Resources Information Center
Liu, Junyan; Bray, Mark
2017-01-01
Private tutoring has expanded and intensified in China. However, no government statistical data or other empirical studies fully capture its extent and characteristics. This paper analyses private tutoring received by students in Grades 1-12 as indicated by a nationwide representative survey entitled China Family Panel Studies. The paper employs a…
A Content Analysis of Dissertations in the Field of Educational Technology: The Case of Turkey
ERIC Educational Resources Information Center
Durak, Gurhan; Cankaya, Serkan; Yunkul, Eyup; Misirli, Zeynel Abidin
2018-01-01
The present study aimed at conducting content analysis on dissertations carried out so far in the field of Educational Technology in Turkey. A total of 137 dissertations were examined to determine the key words, academic discipline, research areas, theoretical frameworks, research designs and models, statistical analyses, data collection tools,…
Proliferative changes in the bronchial epithelium of former smokers treated with retinoids.
Hittelman, Walter N; Liu, Diane D; Kurie, Jonathan M; Lotan, Reuben; Lee, Jin Soo; Khuri, Fadlo; Ibarguen, Heladio; Morice, Rodolfo C; Walsh, Garrett; Roth, Jack A; Minna, John; Ro, Jae Y; Broxson, Anita; Hong, Waun Ki; Lee, J Jack
2007-11-07
Retinoids have shown antiproliferative and chemopreventive activity. We analyzed data from a randomized, placebo-controlled chemoprevention trial to determine whether a 3-month treatment with either 9-cis-retinoic acid (RA) or 13-cis-RA and alpha-tocopherol reduced Ki-67, a proliferation biomarker, in the bronchial epithelium. Former smokers (n = 225) were randomly assigned to receive 3 months of daily oral 9-cis-RA (100 mg), 13-cis-RA (1 mg/kg) and alpha-tocopherol (1200 IU), or placebo. Bronchoscopic biopsy specimens obtained before and after treatment were immunohistochemically assessed for changes in the Ki-67 proliferative index (i.e., percentage of cells with Ki-67-positive nuclear staining) in the basal and parabasal layers of the bronchial epithelium. Per-subject and per-biopsy site analyses were conducted. Multicovariable analyses, including a mixed-effects model and a generalized estimating equations model, were used to investigate the treatment effect (Ki-67 labeling index and percentage of bronchial epithelial biopsy sites with a Ki-67 index > or = 5%) with adjustment for multiple covariates, such as smoking history and metaplasia. Coefficient estimates and 95% confidence intervals (CIs) were obtained from the models. All statistical tests were two-sided. In per-subject analyses, Ki-67 labeling in the basal layer was not changed by any treatment; the percentage of subjects with a high Ki-67 labeling in the parabasal layer dropped statistically significantly after treatment with 13-cis-RA and alpha-tocopherol (P = .04) compared with placebo, but the drop was not statistically significant after 9-cis-RA treatment (P = .17). 
A similar effect was observed in the parabasal layer in a per-site analysis; the percentage of sites with high Ki-67 labeling dropped statistically significantly after 9-cis-RA treatment (coefficient estimate = -0.72, 95% CI = -1.24 to -0.20; P = .007) compared with placebo, and after 13-cis-RA and alpha-tocopherol treatment (coefficient estimate = -0.66, 95% CI = -1.15 to -0.17; P = .008). In per-subject analyses, treatment with 13-cis-RA and alpha-tocopherol, compared with placebo, was statistically significantly associated with reduced bronchial epithelial cell proliferation; treatment with 9-cis-RA was not. In per-site analyses, statistically significant associations were obtained with both treatments.
Proliferative Changes in the Bronchial Epithelium of Former Smokers Treated With Retinoids
Hittelman, Walter N.; Liu, Diane D.; Kurie, Jonathan M.; Lotan, Reuben; Lee, Jin Soo; Khuri, Fadlo; Ibarguen, Heladio; Morice, Rodolfo C.; Walsh, Garrett; Roth, Jack A.; Minna, John; Ro, Jae Y.; Broxson, Anita; Hong, Waun Ki; Lee, J. Jack
2012-01-01
Background Retinoids have shown antiproliferative and chemopreventive activity. We analyzed data from a randomized, placebo-controlled chemoprevention trial to determine whether a 3-month treatment with either 9-cis-retinoic acid (RA) or 13-cis-RA and α-tocopherol reduced Ki-67, a proliferation biomarker, in the bronchial epithelium. Methods Former smokers (n = 225) were randomly assigned to receive 3 months of daily oral 9-cis-RA (100 mg), 13-cis-RA (1 mg/kg) and α-tocopherol (1200 IU), or placebo. Bronchoscopic biopsy specimens obtained before and after treatment were immunohistochemically assessed for changes in the Ki-67 proliferative index (i.e., percentage of cells with Ki-67–positive nuclear staining) in the basal and parabasal layers of the bronchial epithelium. Per-subject and per–biopsy site analyses were conducted. Multicovariable analyses, including a mixed-effects model and a generalized estimating equations model, were used to investigate the treatment effect (Ki-67 labeling index and percentage of bronchial epithelial biopsy sites with a Ki-67 index ≥ 5%) with adjustment for multiple covariates, such as smoking history and metaplasia. Coefficient estimates and 95% confidence intervals (CIs) were obtained from the models. All statistical tests were two-sided. Results In per-subject analyses, Ki-67 labeling in the basal layer was not changed by any treatment; the percentage of subjects with a high Ki-67 labeling in the parabasal layer dropped statistically significantly after treatment with 13-cis-RA and α-tocopherol (P = .04) compared with placebo, but the drop was not statistically significant after 9-cis-RA treatment (P = .17). 
A similar effect was observed in the parabasal layer in a per-site analysis; the percentage of sites with high Ki-67 labeling dropped statistically significantly after 9-cis-RA treatment (coefficient estimate = −0.72, 95% CI = −1.24 to −0.20; P = .007) compared with placebo, and after 13-cis-RA and α-tocopherol treatment (coefficient estimate = −0.66, 95% CI = −1.15 to −0.17; P = .008). Conclusions In per-subject analyses, treatment with 13-cis-RA and α-tocopherol, compared with placebo, was statistically significantly associated with reduced bronchial epithelial cell proliferation; treatment with 9-cis-RA was not. In per-site analyses, statistically significant associations were obtained with both treatments. PMID:17971525
Radon-222 concentrations in ground water and soil gas on Indian reservations in Wisconsin
DeWild, John F.; Krohelski, James T.
1995-01-01
For sites with wells finished in the sand and gravel aquifer, the coefficient of determination (R2) of the regression of concentration of radon-222 in ground water as a function of well depth is 0.003 and the significance level is 0.32, which indicates that there is not a statistically significant relation between radon-222 concentrations in ground water and well depth. The coefficient of determination of the regression of radon-222 in ground water and soil gas is 0.19 and the root mean square error of the regression line is 271 picocuries per liter. Even though the significance level (0.036) indicates a statistical relation, the root mean square error of the regression is so large that the regression equation would not give reliable predictions. Because of an inadequate number of samples, similar statistical analyses could not be performed for sites with wells finished in the crystalline and sedimentary bedrock aquifers.
Mallette, Jennifer R; Casale, John F; Jordan, James; Morello, David R; Beyer, Paul M
2016-03-23
Previously, geo-sourcing to five major coca growing regions within South America was accomplished. However, the expansion of coca cultivation throughout South America made sub-regional origin determinations increasingly difficult. The former methodology was recently enhanced with additional stable isotope analyses ((2)H and (18)O) to fully characterize cocaine due to the varying environmental conditions in which the coca was grown. An improved data analysis method was implemented with the combination of machine learning and multivariate statistical analysis methods to provide further partitioning between growing regions. Here, we show how the combination of trace cocaine alkaloids, stable isotopes, and multivariate statistical analyses can be used to classify illicit cocaine as originating from one of 19 growing regions within South America. The data obtained through this approach can be used to describe current coca cultivation and production trends, highlight trafficking routes, as well as identify new coca growing regions.
NASA Astrophysics Data System (ADS)
Mallette, Jennifer R.; Casale, John F.; Jordan, James; Morello, David R.; Beyer, Paul M.
2016-03-01
Previously, geo-sourcing to five major coca growing regions within South America was accomplished. However, the expansion of coca cultivation throughout South America made sub-regional origin determinations increasingly difficult. The former methodology was recently enhanced with additional stable isotope analyses (2H and 18O) to fully characterize cocaine due to the varying environmental conditions in which the coca was grown. An improved data analysis method was implemented with the combination of machine learning and multivariate statistical analysis methods to provide further partitioning between growing regions. Here, we show how the combination of trace cocaine alkaloids, stable isotopes, and multivariate statistical analyses can be used to classify illicit cocaine as originating from one of 19 growing regions within South America. The data obtained through this approach can be used to describe current coca cultivation and production trends, highlight trafficking routes, as well as identify new coca growing regions.
The sumLINK statistic for genetic linkage analysis in the presence of heterogeneity.
Christensen, G B; Knight, S; Camp, N J
2009-11-01
We present the "sumLINK" statistic--the sum of multipoint LOD scores for the subset of pedigrees with nominally significant linkage evidence at a given locus--as an alternative to common methods to identify susceptibility loci in the presence of heterogeneity. We also suggest the "sumLOD" statistic (the sum of positive multipoint LOD scores) as a companion to the sumLINK. sumLINK analysis identifies genetic regions of extreme consistency across pedigrees without regard to negative evidence from unlinked or uninformative pedigrees. Significance is determined by an innovative permutation procedure based on genome shuffling that randomizes linkage information across pedigrees. This procedure for generating the empirical null distribution may be useful for other linkage-based statistics as well. Using 500 genome-wide analyses of simulated null data, we show that the genome shuffling procedure results in the correct type 1 error rates for both the sumLINK and sumLOD. The power of the statistics was tested using 100 sets of simulated genome-wide data from the alternative hypothesis from GAW13. Finally, we illustrate the statistics in an analysis of 190 aggressive prostate cancer pedigrees from the International Consortium for Prostate Cancer Genetics, where we identified a new susceptibility locus. We propose that the sumLINK and sumLOD are ideal for collaborative projects and meta-analyses, as they do not require any sharing of identifiable data between contributing institutions. Further, loci identified with the sumLINK have good potential for gene localization via statistical recombinant mapping, as, by definition, several linked pedigrees contribute to each peak.
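A minimal sketch of the two statistics and a genome-shuffling null follows. The rotation-based shuffle and the 0.588 nominal LOD threshold (roughly p = 0.05 for a single pedigree) are plausible readings of the abstract, labeled here as assumptions rather than the authors' verified implementation:

```python
import random

LOD_NOMINAL = 0.588  # assumed nominal per-pedigree significance threshold

def sum_link(lods, threshold=LOD_NOMINAL):
    """sumLINK at one locus: sum of multipoint LOD scores over the
    subset of pedigrees with nominally significant linkage evidence."""
    return sum(l for l in lods if l >= threshold)

def sum_lod(lods):
    """Companion sumLOD statistic: sum of the positive LOD scores."""
    return sum(l for l in lods if l > 0)

def genome_shuffle_null(lod_matrix, stat, n_perm=1000, seed=0):
    """Empirical null by 'genome shuffling': rotate each pedigree's
    genome-wide LOD curve by an independent random offset, decoupling
    linkage information across pedigrees while preserving each curve's
    own autocorrelation.  Returns the genome-wide maximum of `stat`
    for each permutation.  A sketch of the idea only; see the article
    for the exact permutation scheme."""
    rng = random.Random(seed)
    n_loci = len(lod_matrix[0])
    null_max = []
    for _ in range(n_perm):
        rotated = []
        for row in lod_matrix:
            k = rng.randrange(n_loci)
            rotated.append(row[k:] + row[:k])
        null_max.append(max(stat([row[j] for row in rotated])
                            for j in range(n_loci)))
    return null_max
```

Because only per-pedigree LOD curves are needed, contributing institutions can share these summaries without exchanging identifiable data, which is the collaborative advantage the abstract emphasizes.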
Beretzky, Zsuzsanna; Péntek, Márta
2017-12-01
Informal care plays an important role in ageing societies. Our aim was to analyse informal care use and its determinants among patients with chronic diseases in Hungary. Patient-level data from previous studies in 14 diagnoses were analysed, including patients' EQ-5D-3L health status. Descriptive statistics were performed and a linear regression model was built to analyse determinants of informal care time. 2047 patients (female: 58%) with a mean age of 58.9 (SD = 16.3) years and an EQ-5D-3L index score of 0.64 (SD = 0.33) were involved. 27% received informal care; the average time of care was 7.54 (SD = 26.36) hours/week. Both the rate of informal care use and its duration differed significantly between the diagnoses (p<0.05); the highest were in dementia, Parkinson's disease and chronic inflammatory immunological diseases. Significant determinants were age, EQ-5D-3L scores, gender and certain diagnosis dummies (R2 = 0.111). Informal care use is substantial in chronic debilitating conditions. Future studies are encouraged to reveal unmet needs, preferences and further explanatory factors. Orv Hetil. 2017; 158(52): 2068-2078.
Association between sleep difficulties as well as duration and hypertension: is BMI a mediator?
Carrillo-Larco, R M; Bernabe-Ortiz, A; Sacksteder, K A; Diez-Canseco, F; Cárdenas, M K; Gilman, R H; Miranda, J J
2017-01-01
Sleep difficulties and short sleep duration have been associated with hypertension. Though body mass index (BMI) may be a mediator variable, the mediation effect has not been defined. We aimed to assess the association between sleep duration and sleep difficulties with hypertension, to determine if BMI is a mediator variable, and to quantify the mediation effect. We conducted a mediation analysis and calculated prevalence ratios with 95% confidence intervals. The exposure variables were sleep duration and sleep difficulties, and the outcome was hypertension. Sleep difficulties were statistically significantly associated with a 43% higher prevalence of hypertension in multivariable analyses; results were not statistically significant for sleep duration. In these analyses, and in sex-specific subgroup analyses, we found no strong evidence that BMI mediated the association between sleep indices and risk of hypertension. Our findings suggest that BMI does not appear to mediate the association between sleep patterns and hypertension. These results highlight the need to further study the mechanisms underlying the relationship between sleep patterns and cardiovascular risk factors.
ERIC Educational Resources Information Center
Dastjerdi, Negin Barat
2016-01-01
The research aims to evaluate ICT use in the teaching-learning process among students of Isfahan elementary schools. The method of this research is descriptive survey. The statistical population of the study comprised all teachers of Isfahan elementary schools. The sample size was determined to be 350 persons, selected through cluster sampling…
Community Design Impacts on Health Habits in Low-income Southern Nevadans.
Coughenour, Courtney; Burns, Mackenzie S
2016-07-01
The purposes of this exploratory study were to: (1) characterize selected community design features; and (2) determine the relationship between select features and physical activity (PA) levels and nutrition habits for a small sample of low-income southern Nevadans. Secondary analysis was conducted on data from selected participants of the Nevada Healthy Homes Partnership program; self-report data on PA and diet habits were compared to national guidelines. Community design features were identified via GIS within a one-mile radius of participants' homes. Descriptive statistics characterized these features and chi-square analyses were conducted to determine the relationship between select features and habits. Data from 71 participants were analyzed; the majority failed to reach either PA or fruit and vegetable guidelines (81.7% and 93.0%, respectively). Many neighborhoods were absent of parks (71.8%), trailheads (36.6%), or pay-for-use PA facilities (47.9%). The mean number of grocery stores was 3.4 ± 2.3 per neighborhood. Chi-square analyses were not statistically significant. Findings were insufficient to support meaningful conclusions, but they underscore the need for health promotion to help this population meet guidelines. More research is needed to assess the impact of health-promoting community design on healthy behaviors, particularly in vulnerable populations.
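The chi-square analyses mentioned can be reproduced in miniature for a 2x2 table (for example, park within one mile yes/no against meeting PA guidelines yes/no). The helper below is a generic Pearson statistic written for illustration; the study's actual tables are not reproduced here:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]].
    Compare against the 3.84 critical value (df = 1, alpha = 0.05);
    values below it, as in the study, indicate no significant
    association between the design feature and the health habit."""
    n = a + b + c + d
    denom = (a + b) * (c + d) * (a + c) * (b + d)
    if denom == 0:
        raise ValueError("a marginal total is zero; test is undefined")
    return n * (a * d - b * c) ** 2 / denom
```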
"What If" Analyses: Ways to Interpret Statistical Significance Test Results Using EXCEL or "R"
ERIC Educational Resources Information Center
Ozturk, Elif
2012-01-01
The present paper aims to review two motivations to conduct "what if" analyses using Excel and "R" to understand the statistical significance tests through the sample size context. "What if" analyses can be used to teach students what statistical significance tests really do and in applied research either prospectively to estimate what sample size…
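A "what if" analysis of the kind the abstract describes can be as small as the normal approximation below: hold the standardized effect size fixed and vary the sample size to see when "statistical significance" appears. The function names and the z approximation to the two-sample t test are choices made for this sketch, not the article's Excel/R worksheets:

```python
import math

def z_for_mean_difference(effect_size, n_per_group):
    """Approximate test statistic for a two-sample mean comparison with
    standardized effect size d and n subjects per group: z = d*sqrt(n/2)
    (normal approximation to the t test)."""
    return effect_size * math.sqrt(n_per_group / 2.0)

def significant(effect_size, n_per_group, z_crit=1.96):
    """Would this fixed effect reach two-sided p < .05 at this n?"""
    return abs(z_for_mean_difference(effect_size, n_per_group)) >= z_crit
```

The same d = 0.3 effect is nonsignificant with 40 subjects per group but significant with 200, which is precisely the lesson such what-if analyses are meant to teach: significance tests respond to sample size, not only to effect magnitude.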
Thomsen, Sarah; Ng, Nawi; Biao, Xu; Bondjers, Göran; Kusnanto, Hari; Liem, Nguyen Tanh; Mavalankar, Dileep; Målqvist, Mats; Diwan, Vinod
2013-01-01
Background The Millennium Development Goals (MDGs) are monitored using national-level statistics, which have shown substantial improvements in many countries. These statistics may be misleading, however, and may divert resources from disadvantaged populations within the same countries that are showing progress. The purpose of this article is to set out the relevance and design of the “Evidence for Policy and Implementation project (EPI-4)”. EPI-4 aims to contribute to the reduction of inequities in the achievement of health-related MDGs in China, India, Indonesia and Vietnam through the promotion of research-informed policymaking. Methods Using a framework provided by the Commission on the Social Determinants of Health (CSDH), we compare national-level MDG targets and results, as well as their social and structural determinants, in China, India, Indonesia and Vietnam. Results To understand country-level MDG achievements it is useful to analyze their social and structural determinants. This analysis is not sufficient, however, to understand within-country inequities. Specialized analyses are required for this purpose, as is discussion and debate of the results with policymakers, which is the aim of the EPI-4 project. Conclusion Reducing health inequities requires sophisticated analyses to identify disadvantaged populations within and between countries, and to determine evidence-based solutions that will make a difference. The EPI-4 project hopes to contribute to this goal. PMID:23490302
NASA Astrophysics Data System (ADS)
Pieczara, Łukasz
2015-09-01
The paper presents the results of an analysis of surface roughness parameters in the Krosno Sandstones of Mucharz, southern Poland. It was aimed at determining whether these parameters are influenced by structural features (mainly the laminar distribution of mineral components and the directional distribution of non-isometric grains) and fracture processes. The tests applied in the analysis enabled us to determine and describe the primary statistical parameters used in the quantitative description of surface roughness, as well as to assess the usefulness of contact profilometry as a method of visualizing the spatial differentiation of fracture processes in rocks. These aims were achieved by selecting a model material (Krosno Sandstones from the Górka-Mucharz Quarry) and an appropriate research methodology. The schedule of laboratory analyses included: identification analyses connected with non-destructive ultrasonic tests, aimed at the preliminary determination of rock anisotropy; point-load strength tests (cleaved surfaces were obtained through the destruction of rock samples); microscopic analysis (observation of thin sections in order to determine the mechanism inducing fracture processes); and measurement of surface roughness (two- and three-dimensional diagrams, topographic and contour maps, and statistical parameters of surface roughness). The highest values of the roughness indicators were achieved for surfaces formed under the influence of intragranular fracture processes (cracks propagating directly through grains). This is related to the structural features of the Krosno Sandstones (distribution of lamination and bedding).
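The primary statistical roughness parameters mentioned can be computed directly from a measured profile. The sketch below uses common profilometry definitions (Ra, Rq, Rt); the exact parameter set analysed in the paper may differ:

```python
import math

def roughness_params(z):
    """Amplitude roughness statistics of a profile z (heights along a
    trace), measured relative to the mean line: Ra (arithmetic mean
    deviation), Rq (root-mean-square deviation) and Rt (total
    peak-to-valley height).  Standard profilometry definitions, used
    here for illustration."""
    mean = sum(z) / len(z)                 # mean line of the profile
    dev = [zi - mean for zi in z]
    ra = sum(abs(d) for d in dev) / len(dev)
    rq = math.sqrt(sum(d * d for d in dev) / len(dev))
    rt = max(z) - min(z)
    return ra, rq, rt
```

Intragranular fracture surfaces, with cracks cutting straight through grains, would register as higher values of such amplitude parameters than smoother intergranular surfaces, which is the pattern the paper reports.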
Pace, M.N.; Rosentreter, J.J.; Bartholomay, R.C.
2001-01-01
Idaho State University and the US Geological Survey, in cooperation with the US Department of Energy, conducted a study to determine and evaluate strontium distribution coefficients (Kds) of subsurface materials at the Idaho National Engineering and Environmental Laboratory (INEEL). The Kds were determined to aid in assessing the variability of strontium Kds and their effects on chemical transport of strontium-90 in the Snake River Plain aquifer system. Data from batch experiments done to determine strontium Kds of five sediment-infill samples and six standard reference material samples were analyzed by using multiple linear regression analysis and the stepwise variable-selection method in the statistical program Statistical Product and Service Solutions to derive an equation of variables that can be used to predict strontium Kds of sediment-infill samples. The sediment-infill samples were from basalt vesicles and fractures from a selected core at the INEEL; strontium Kds ranged from about 201 to 356 ml/g. The standard material samples consisted of clay minerals and calcite. The statistical analyses of the batch-experiment results showed that the amount of strontium in the initial solution, the amount of manganese oxide in the sample material, and the amount of potassium in the initial solution are the most important variables in predicting strontium Kds of sediment-infill samples.
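The distribution coefficient measured in such batch experiments follows from a simple mass balance. The sketch below assumes the standard batch-sorption definition of Kd; the concentrations, volume, and mass are invented for illustration and are not the study's measurements.

```python
def batch_kd(c_initial, c_equilibrium, solution_volume_ml, sample_mass_g):
    """Distribution coefficient Kd (ml/g) from a batch sorption experiment.

    Kd = (mass sorbed per gram of solid) / (equilibrium solution concentration)
       = ((C0 - Ce) / Ce) * (V / m), with C0 and Ce in the same units.
    """
    return ((c_initial - c_equilibrium) / c_equilibrium) * (
        solution_volume_ml / sample_mass_g)

# Invented illustration: 100 ml of solution on 1 g of sediment,
# with 60% of the strontium sorbed at equilibrium.
kd = batch_kd(c_initial=1.0, c_equilibrium=0.4,
              solution_volume_ml=100.0, sample_mass_g=1.0)
```

A predictive regression of the kind described in the abstract would then treat Kd values like these as the response variable and solution and sediment chemistry as candidate predictors.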
Perry, Charles A.; Wolock, David M.; Artman, Joshua C.
2004-01-01
Streamflow statistics of flow duration and peak-discharge frequency were estimated for 4,771 individual locations on streams listed on the 1999 Kansas Surface Water Register. These statistics included the flow-duration values of 90, 75, 50, 25, and 10 percent, as well as the mean flow value. Peak-discharge frequency values were estimated for the 2-, 5-, 10-, 25-, 50-, and 100-year floods. Least-squares multiple regression techniques, along with Tobit analyses, were used to develop equations for estimating flow-duration values of 90, 75, 50, 25, and 10 percent and the mean flow for uncontrolled flow stream locations. The 149 U.S. Geological Survey streamflow-gaging stations in Kansas and parts of surrounding States used in the regression analyses had flow uncontrolled by Federal reservoirs and contributing-drainage areas ranging from 2.06 to 12,004 square miles. Logarithmic transformations of climatic and basin data were performed to yield the best linear relation for developing equations to compute flow durations and mean flow. In the regression analyses, the significant climatic and basin characteristics, in order of importance, were contributing-drainage area, mean annual precipitation, mean basin permeability, and mean basin slope. The analyses yielded a model standard error of prediction ranging from 0.43 logarithmic units for the 90-percent duration analysis to 0.15 logarithmic units for the 10-percent duration analysis. The model standard error of prediction was 0.14 logarithmic units for the mean flow. Regression equations used to estimate peak-discharge frequency values were obtained from a previous report, and estimates for the 2-, 5-, 10-, 25-, 50-, and 100-year floods were determined for this report. The regression equations and an interpolation procedure were used to compute flow durations, mean flow, and estimates of peak-discharge frequency for locations along uncontrolled flow streams on the 1999 Kansas Surface Water Register.
Flow durations, mean flow, and peak-discharge frequency values determined at available gaging stations were used to interpolate the regression-estimated flows for the stream locations where available. Streamflow statistics for locations that had uncontrolled flow were interpolated using data from gaging stations weighted according to the drainage area and the bias between the regression-estimated and gaged flow information. On controlled reaches of Kansas streams, the streamflow statistics were interpolated between gaging stations using only gaged data weighted by drainage area.
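At a gaged site, flow-duration values such as those estimated in this report are exceedance percentiles of the daily flow record (Q90 is exceeded 90 percent of the time). A minimal sketch, assuming a simple ranked-interpolation definition rather than the report's exact computation; the flow values are invented.

```python
def flow_duration(daily_flows, exceedance_percent):
    """Flow equalled or exceeded `exceedance_percent` of the time.

    Q90 is a low-flow statistic (exceeded 90% of days); Q10 is a high-flow
    statistic. Linear interpolation between ranked values.
    """
    ranked = sorted(daily_flows, reverse=True)      # largest flow first
    n = len(ranked)
    pos = (exceedance_percent / 100.0) * (n - 1)    # position in ranked series
    lower = int(pos)
    frac = pos - lower
    if lower + 1 >= n:
        return ranked[-1]
    return ranked[lower] * (1 - frac) + ranked[lower + 1] * frac

# Invented daily mean flows (cubic feet per second), for illustration:
flows = [12.0, 3.0, 45.0, 7.0, 1.5, 22.0, 9.0, 5.0, 2.0, 30.0]
q90 = flow_duration(flows, 90)   # low-flow statistic
q10 = flow_duration(flows, 10)   # high-flow statistic
```

The regression equations in the report predict statistics like these from basin characteristics at ungaged sites, which is why the gaged values above serve as the calibration targets.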
Ndwandwe, Duduzile; Uthman, Olalekan A; Adamu, Abdu A; Sambala, Evanson Z; Wiyeh, Alison B; Olukade, Tawa; Bishwajit, Ghose; Yaya, Sanni; Okwo-Bele, Jean-Marie; Wiysonge, Charles S
2018-04-24
Understanding the gaps in missed opportunities for vaccination (MOV) in sub-Saharan Africa (SSA) would inform interventions for improving immunisation coverage toward achieving universal childhood immunisation. We aimed to conduct a multicountry analysis to decompose the gap in MOV between poor and non-poor populations in SSA. We used cross-sectional data from 35 Demographic and Health Surveys in SSA conducted between 2007 and 2016. Descriptive statistics were used to understand the gap in MOV between the urban poor and non-poor, and across the selected covariates. Of the 35 countries included in this analysis, 19 showed pro-poor inequality, 5 showed pro-non-poor inequality, and the remaining 11 showed no statistically significant inequality. Among the countries with statistically significant pro-illiterate inequality, the risk difference ranged from 4.2% in DR Congo to 20.1% in Kenya. Important factors responsible for the inequality varied across countries. In Madagascar, the largest contributors to inequality in MOV were media access, number of under-five children, and maternal education. In Liberia, however, media access narrowed inequality in MOV between poor and non-poor households. The findings indicate that in most SSA countries, children belonging to poor households are most likely to have MOV and that socio-economic inequality in MOV is determined not only by health system functions but also by factors beyond the scope of health authorities and the care delivery system. The findings suggest the need to address social determinants of health.
Glaser, Robert; Kurimo, Robert; Shulman, Stanley
2007-08-01
A performance test of NIOSH Method 5524/ASTM Method D-7049-04 for analysis of metalworking fluids (MWF) was conducted. These methods involve determination of the total and extractable weights of MWF samples; extractions are performed using a ternary blend of toluene:dichloromethane:methanol and a binary blend of methanol:water. Six laboratories participated in this study. A preliminary analysis of 20 blank samples was made to familiarize the laboratories with the procedure(s) and to estimate the methods' limits of detection/quantitation (LODs/LOQs). Synthetically generated samples of a semisynthetic MWF aerosol were then collected on tared polytetrafluoroethylene (PTFE) filters and analyzed according to the methods by all participants. Sample masses deposited (approximately 400-500 µg) corresponded to amounts expected in an 8-hr shift at the NIOSH recommended exposure levels (REL) of 0.4 mg/m³ (thoracic) and 0.5 mg/m³ (total particulate). The generator output was monitored with a calibrated laser particle counter. One laboratory significantly underreported the sampled masses relative to the other five labs. A follow-up study compared only gravimetric results of this laboratory with those of two other labs. In the preliminary analysis of blanks, the average LOQs were 0.094 mg for the total weight analysis and 0.136 mg for the extracted weight analyses. For the six-lab study, the average LOQs were 0.064 mg for the total weight analyses and 0.067 mg for the extracted weight analyses. Using ASTM conventions, h and k statistics were computed to determine the degree of consistency of each laboratory with the others. One laboratory experienced problems with precision but not bias. The precision estimates for the remaining five labs were not different statistically (alpha = 0.005) for either the total or extractable weights. For all six labs, the average fraction extracted was ≥0.94 (CV = 0.025).
Pooled estimates of the total coefficients of variation of analysis were 0.13 for the total weight samples and 0.13 for the extracted weight samples. An overall method bias of -5% was determined by comparing the overall mean concentration reported by the participants to that determined by the particle counter. In the three-lab follow-up study, the nonconsistent lab reported results that were unbiased but statistically less precise than the others; the average LOQ was 0.133 mg for the total weight analyses. It is concluded that aerosolized MWF sampled at concentrations corresponding to either of the NIOSH RELs can generally be shipped unrefrigerated, stored refrigerated up to 7 days, and then analyzed quantitatively and precisely for MWF using the NIOSH/ASTM procedures.
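The h and k statistics mentioned above follow ASTM interlaboratory conventions: h compares each laboratory's mean with the grand mean, and k compares each laboratory's repeatability with the pooled value. A minimal sketch with invented three-laboratory gravimetric results; the ASTM critical-value tables and cell structure are omitted.

```python
import math
import statistics

def consistency_statistics(lab_results):
    """ASTM-style between-laboratory (h) and within-laboratory (k) statistics.

    lab_results: one list of replicate measurements per laboratory.
    Large |h| flags a laboratory whose mean departs from the grand mean;
    large k flags a laboratory whose scatter exceeds the pooled value.
    """
    means = [statistics.mean(lab) for lab in lab_results]
    stdevs = [statistics.stdev(lab) for lab in lab_results]
    grand_mean = statistics.mean(means)
    s_means = statistics.stdev(means)                     # spread of lab means
    s_pooled = math.sqrt(sum(s * s for s in stdevs) / len(stdevs))
    h = [(m - grand_mean) / s_means for m in means]
    k = [s / s_pooled for s in stdevs]
    return h, k

# Invented gravimetric results (mg) from three laboratories:
labs = [[0.41, 0.43, 0.42], [0.44, 0.45, 0.46], [0.39, 0.40, 0.38]]
h_stats, k_stats = consistency_statistics(labs)
```

By construction the h values sum to zero; a laboratory like the underreporting one in the study would show up as a persistently negative h.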
Recent evaluations of crack-opening-area in circumferentially cracked pipes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rahman, S.; Brust, F.; Ghadiali, N.
1997-04-01
Leak-before-break (LBB) analyses for circumferentially cracked pipes are currently being conducted in the nuclear industry to justify elimination of pipe whip restraints and jet shields, which are present because of the expected dynamic effects from pipe rupture. The application of the LBB methodology frequently requires calculation of leak rates. The leak rates depend on the crack-opening area of the through-wall crack in the pipe. In addition to LBB analyses, which assume a hypothetical flaw size, there is also interest in the integrity of actual leaking cracks corresponding to current leakage detection requirements in NRC Regulatory Guide 1.45, or for assessing temporary repair of Class 2 and 3 pipes that have leaks, as are being evaluated in ASME Section XI. The objectives of this study were to review, evaluate, and refine current predictive models for performing crack-opening-area analyses of circumferentially cracked pipes. The results from twenty-five full-scale pipe fracture experiments, conducted in the Degraded Piping Program, the International Piping Integrity Research Group Program, and the Short Cracks in Piping and Piping Welds Program, were used to verify the analytical models. Standard statistical analyses were performed to assess quantitatively the accuracy of the predictive models. The evaluation also involved finite element analyses for determining the crack-opening profile often needed to perform leak-rate calculations.
Machisa, Mercilene; Wichmann, Janine; Nyasulu, Peter S
2013-09-01
This study is the second to investigate the association between the use of biomass fuels (BMF) for household cooking and anaemia and stunting in children. Such fuels include coal, charcoal, wood, dung and crop residues. Data from the 2006-2007 Swaziland Demographic and Health Survey (a cross-sectional study design) were analysed. Childhood stunting was ascertained through age and height, and anaemia through haemoglobin measurement. The association between BMF use and health outcomes was determined in multinomial logistic regression analyses. Various confounders were considered in the analyses. A total of 1150 children aged 6-36 months were included in the statistical analyses, of these 596 (51.8%) and 317 (27.6%) were anaemic and stunted, respectively. BMF use was not significantly associated with childhood anaemia in univariate analysis. Independent risk factors for childhood anaemia were child's age, history of childhood diarrhoea and mother's anaemia status. No statistically significant association was observed between BMF use and childhood stunting, after adjusting for child's gender, age, birth weight and preceding birth interval. This study identified the need to prioritize childhood anaemia and stunting as health outcomes and the introduction of public health interventions in Swaziland. Further research is needed globally on the potential effects of BMF use on childhood anaemia and stunting.
Cohen, Robert L; Murray, John; Jack, Susan; Arscott-Mills, Sharon; Verardi, Vincenzo
2017-01-01
Some health determinants require relatively stronger health system capacity and socioeconomic development than others to impact child mortality. Few quantitative analyses have examined how the impact of health determinants varies by mortality level. 149 low- and middle-income countries were stratified into high, moderate, low, and very low baseline levels of child mortality in 1990. Data for 52 health determinants were collected for these countries for 1980-2010. To quantify how changes in health determinants were associated with mortality decline, univariable and multivariable regression models were constructed. An advanced statistical technique that is new for child mortality analyses, MM-estimation with first differences and country clustering, controlled for outliers, fixed effects, and variation across decades. Some health determinants (immunizations, education) were consistently associated with child mortality reduction across all mortality levels. Others (staff availability, skilled birth attendance, fertility, water and sanitation) were associated with child mortality reduction mainly in low or very low mortality settings. The findings indicate that the impact of some health determinants on child mortality was only apparent with stronger health systems, public infrastructure and levels of socioeconomic development, whereas the impact of other determinants was apparent at all stages of development. Multisectoral progress was essential to mortality reduction at all baseline mortality levels. Policy-makers can use such analyses to direct investments in health and non-health sectors and to set five-year child mortality targets appropriate for their baseline mortality levels and local context.
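The first-differencing step of the technique can be sketched simply. The example below uses plain least squares on differenced series as a stand-in for MM-estimation, which additionally downweights outlying country-years and requires a robust-regression library; the panel numbers are invented for illustration.

```python
def first_differences(series):
    """Period-over-period changes; differencing removes country fixed effects."""
    return [b - a for a, b in zip(series, series[1:])]

def ols_slope(x, y):
    """Slope of y on x by ordinary least squares (a non-robust stand-in
    for the MM-estimator used in the study)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

# Invented panel for one country: immunisation coverage (%) and
# under-5 mortality (per 1000 live births) by decade.
coverage = [40.0, 60.0, 75.0, 85.0]
mortality = [150.0, 120.0, 95.0, 80.0]
slope = ols_slope(first_differences(coverage), first_differences(mortality))
# A negative slope: decades with larger coverage gains saw larger declines.
```

In the actual analysis each observation is a country-decade change, and the robust estimator keeps a few extreme country-years (wars, epidemics) from dominating the fit.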
Gupta, Ajay; Kesavabhotla, Kartik; Baradaran, Hediyeh; Kamel, Hooman; Pandya, Ankur; Giambrone, Ashley E.; Wright, Drew; Pain, Kevin J.; Mtui, Edward E.; Suri, Jasjit S.; Sanelli, Pina C.; Mushlin, Alvin I.
2014-01-01
Background and Purpose Ultrasonographic plaque echolucency has been studied as a stroke risk marker in carotid atherosclerotic disease. We performed a systematic review and meta-analysis to summarize the association between ultrasound-determined carotid plaque echolucency and future ipsilateral stroke risk. Methods We searched the medical literature for studies evaluating the association between carotid plaque echolucency and future stroke in asymptomatic patients. We included prospective observational studies with stroke outcome ascertainment after baseline carotid plaque echolucency assessment. We performed a meta-analysis and assessed study heterogeneity and publication bias. We also performed subgroup analyses limited to patients with stenosis ≥50%, studies in which plaque echolucency was determined via subjective visual interpretation, studies with a relatively lower risk of bias, and studies published after the year 2000. Results We analyzed data from 7 studies on 7557 subjects with a mean follow-up of 37.2 months. We found a significant positive relationship between predominantly echolucent (compared to predominantly echogenic) plaques and the risk of future ipsilateral stroke across all stenosis severities (0-99%) (relative risk [RR], 2.31; 95% CI, 1.58-3.39; P<.001) and in subjects with ≥50% stenosis (RR, 2.61; 95% CI, 1.47-4.63; P=.001). A statistically significant increased RR for future stroke was preserved in all additional subgroup analyses. No statistically significant heterogeneity or publication bias was present in any of the meta-analyses. Conclusions The presence of ultrasound-determined carotid plaque echolucency provides predictive information in asymptomatic carotid artery stenosis beyond luminal stenosis. However, the magnitude of the increased risk is not sufficient on its own to identify patients likely to benefit from surgical revascularization. PMID:25406150
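A relative risk with a 95% confidence interval, of the kind pooled in this meta-analysis, is conventionally computed on the log scale. A minimal sketch with invented counts, not the meta-analysis data:

```python
import math

def relative_risk(events_exposed, n_exposed, events_control, n_control):
    """Relative risk with a Wald-type 95% CI computed on the log scale."""
    rr = (events_exposed / n_exposed) / (events_control / n_control)
    se_log = math.sqrt(1 / events_exposed - 1 / n_exposed
                       + 1 / events_control - 1 / n_control)
    lower = math.exp(math.log(rr) - 1.96 * se_log)
    upper = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lower, upper

# Invented illustration: 30 strokes among 1000 subjects with echolucent
# plaques versus 13 strokes among 1000 with echogenic plaques.
rr, lo, hi = relative_risk(30, 1000, 13, 1000)
```

A meta-analysis then pools the log RRs across studies, weighting each by the inverse of its variance, which is why the log-scale standard error above is the quantity that matters.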
Mammographic Breast Density in a Cohort of Medically Underserved Women
2012-10-01
training, faculty from MMC and VUMC will conduct a case-control study of mammographic breast density to investigate its association with obesity and...hormones and growth factors, 4) to perform statistical analyses to determine the associations between obesity and insulin resistance and mammographic...on obesity and insulin resistance as they relate to mammographic breast density. We hypothesize that: 1) obesity and insulin resistance, defined
A Fatigue Management System for Sustained Military Operations
2008-03-31
Bradford, Brenda Jones, Margaret Campbell, Heather McCrory, Amy Campbell, Linda Mendez, Juan Cardenas, Beckie Moise, Samuel Cardenas, Fernando...always under the direct observation of research personnel or knowingly monitored from a central control station by closed circuit television, excluding...blocks 1, 5, 9, and 13). Statistical Analyses To determine the appropriate sample size for this study, a power analysis was based on the post
Statistical analysis of lightning electric field measured under Malaysian condition
NASA Astrophysics Data System (ADS)
Salimi, Behnam; Mehranzamir, Kamyar; Abdul-Malek, Zulkurnain
2014-02-01
Lightning is an electrical discharge during thunderstorms that can be either within clouds (Inter-Cloud) or between clouds and ground (Cloud-Ground). Lightning characteristics and their statistical information are the foundation for the design of lightning protection systems as well as for the calculation of lightning radiated fields. Nowadays, there are various techniques to detect lightning signals and to determine various parameters produced by a lightning flash, each with its own claimed performance. In this paper, the characteristics of captured broadband electric fields generated by cloud-to-ground lightning discharges in southern Malaysia are analyzed. A total of 130 cloud-to-ground lightning flashes from 3 separate thunderstorm events (each lasting about 4-5 hours) were examined. Statistical analyses of the following signal parameters are presented: preliminary breakdown pulse train time duration, time interval between preliminary breakdown and return stroke, stroke multiplicity, and percentage of single-stroke flashes. The BIL model is also introduced to characterize lightning signature patterns. The statistical analyses show that about 79% of lightning signals fit well with the BIL model. The maximum and minimum preliminary breakdown time durations of the observed lightning signals are 84 ms and 560 µs, respectively. The statistical results show that 7.6% of the flashes were single-stroke flashes, and the maximum number of strokes recorded was 14 strokes per flash. A preliminary breakdown signature could be identified in more than 95% of the flashes.
Baqué, Michèle; Amendt, Jens
2013-01-01
Developmental data for juvenile blow flies (Diptera: Calliphoridae) are typically used to calculate the age of immature stages found on or around a corpse and thus to estimate a minimum post-mortem interval (PMI(min)). However, many of these data sets do not take into account that immature blow flies grow in a non-linear fashion. Linear models do not provide sufficiently reliable age estimates and may even lead to an erroneous determination of the PMI(min). In line with the Daubert standard and the need for improvements in forensic science, newer statistical tools such as smoothing methods and mixed models allow the modelling of non-linear relationships and expand the field of statistical analyses. The present study introduces the background and application of these statistical techniques by analysing a model that describes the development of the forensically important blow fly Calliphora vicina at different temperatures. The comparison of three statistical methods (linear regression, generalised additive modelling and generalised additive mixed modelling) clearly demonstrates that only the latter provided regression parameters that reflect the data adequately. We focus explicitly on both the exploration of the data--to assure their quality and to show the importance of checking it carefully prior to conducting the statistical tests--and the validation of the resulting models. Hence, we present a common method for evaluating and testing forensic entomological data sets by using, for the first time, generalised additive mixed models.
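The paper's central point, that linear models misrepresent non-linear growth, shows up directly in the residuals of a straight-line fit. A minimal sketch with simulated decelerating growth values (not Calliphora vicina data); a full generalised additive mixed model would require a dedicated statistics package, so only the diagnostic step is shown.

```python
def fit_line(x, y):
    """Ordinary least-squares intercept and slope."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    return intercept, slope

# Simulated larval length (mm) versus age (h): growth that decelerates
# toward a plateau; illustrative values only.
age = [12, 24, 36, 48, 60, 72, 84]
length = [3.5, 7.0, 10.0, 12.0, 13.2, 13.9, 14.2]

a, b = fit_line(age, length)
residuals = [yi - (a + b * xi) for xi, yi in zip(age, length)]
# A straight line leaves a systematic arch in the residuals (negative at
# the extremes, positive in the middle): the signature of unmodelled
# curvature that smoothing methods such as GAMs are designed to absorb.
```

Checking residual plots like this before accepting a model is exactly the kind of data exploration the authors advocate.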
Lenehan, Claire E; Tobe, Shanan S; Smith, Renee J; Popelka-Filcoff, Rachel S
2017-01-01
Many archaeological science studies use the concept of "provenance", where the origins of cultural material can be determined through physical or chemical properties that relate back to the origins of the material. Recent studies using DNA profiling of bacteria have been used for the forensic determination of soils, towards determination of geographic origin. This manuscript presents a novel approach to the provenance of archaeological minerals and related materials through the use of 16S rRNA sequencing analysis of microbial DNA. Through the microbial DNA characterization from ochre and multivariate statistics, we have demonstrated the clear discrimination between four distinct Australian cultural ochre sites.
Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona
2012-01-01
Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory documents, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles can achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters.
Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study on process. Interpretation of such a study provides information about stability, process variability, changing of trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts provides the least capable and most variable process that is liable for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
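The capability indices at the heart of such a study reduce to two standard formulas. A minimal sketch with invented tablet weights and specification limits (not the study's data):

```python
import statistics

def process_capability(samples, lsl, usl):
    """Process capability indices from in-control samples.

    Cp  = (USL - LSL) / (6 * sigma)            # potential capability
    Cpk = min(USL - mean, mean - LSL) / (3 * sigma)  # penalises off-centring
    A process with Cp = 2.0 is commonly described as six-sigma capable.
    """
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)
    return cp, cpk

# Invented tablet weights (mg) against 295-305 mg specification limits:
weights = [299.8, 300.4, 300.1, 299.6, 300.3, 299.9, 300.2, 300.0]
cp, cpk = process_capability(weights, lsl=295.0, usl=305.0)
```

Cpk can never exceed Cp; the gap between them measures how far the process mean sits from the centre of the specification window, which is exactly what a Pareto comparison of attributes would expose.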
Rainfall Results of the Florida Area Cumulus Experiment, 1970-76.
NASA Astrophysics Data System (ADS)
Woodley, William L.; Jordan, Jill; Barnston, Anthony; Simpson, Joanne; Biondini, Ron; Flueck, John
1982-02-01
The Florida Area Cumulus Experiment of 1970-76 (FACE-1) is a single-area, randomized, exploratory experiment to determine whether seeding cumuli for dynamic effects (dynamic seeding) can be used to augment convective rainfall over a substantial target area (1.3 × 10⁴ km²) in south Florida. Rainfall is estimated using S-band radar observations after adjustment by raingages. The two primary response variables are rain volumes in the total target (TT) and in the floating target (FT), the most intensely treated portion of the target. The experimental unit is the day and the main observational period is the 6 h after initiation of treatment (silver iodide flares on seed days and either no flares or placebos on control days). Analyses without predictors suggest apparent increases in both the location (means and medians) and the dispersion (standard deviation and interquartile range) characteristics of rainfall due to seeding in the FT and TT variables with substantial statistical support for the FT results and lesser statistical support for the TT results. Analyses of covariance using meteorologically meaningful predictor variables suggest a somewhat larger effect of seeding with stronger statistical support. These results are interpreted in terms of the FACE conceptual model.
Statistical Data Analyses of Trace Chemical, Biochemical, and Physical Analytical Signatures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Udey, Ruth Norma
Analytical and bioanalytical chemistry measurement results are most meaningful when interpreted using rigorous statistical treatments of the data. The same data set may provide many dimensions of information depending on the questions asked through the applied statistical methods. Three principal projects illustrated the wealth of information gained through the application of statistical data analyses to diverse problems.
Compressor seal rub energetics study
NASA Technical Reports Server (NTRS)
Laverty, W. F.
1978-01-01
The rub mechanics of compressor abradable blade tip seals at simulated engine conditions were investigated. Twelve statistically planned, instrumented rub tests were conducted with titanium blades and Feltmetal fibermetal rubstrips. The tests were conducted with single stationary blades rubbing against seal material bonded to rotating test disks. The instantaneous rub torque, speed, incursion rate and blade temperatures were continuously measured and recorded. Basic rub parameters (incursion rate, rub depth, abradable density, blade thickness and rub velocity) were varied to determine the effects on rub energy and heat split between the blade, rubstrip surface and rub debris. The test data was reduced, energies were determined and statistical analyses were completed to determine the primary and interactive effects. Wear surface morphology, profile measurements and metallographic analysis were used to determine wear, glazing, melting and material transfer. The rub energies for these tests were most significantly affected by the incursion rate while rub velocity and blade thickness were of secondary importance. The ratios of blade wear to seal wear were representative of those experienced in engine operation of these seal system materials.
Hayhoe, Richard P G; Lentjes, Marleen A H; Luben, Robert N; Khaw, Kay-Tee; Welch, Ailsa A
2015-08-01
In our aging population, maintenance of bone health is critical to reduce the risk of osteoporosis and potentially debilitating consequences of fractures in older individuals. Among modifiable lifestyle and dietary factors, dietary magnesium and potassium intakes are postulated to influence bone quality and osteoporosis, principally via calcium-dependent alteration of bone structure and turnover. We investigated the influence of dietary magnesium and potassium intakes, as well as circulating magnesium, on bone density status and fracture risk in an adult population in the United Kingdom. A random subset of 4000 individuals from the European Prospective Investigation into Cancer and Nutrition-Norfolk cohort of 25,639 men and women with baseline data was used for bone density cross-sectional analyses and combined with fracture cases (n = 1502) for fracture case-cohort longitudinal analyses (mean follow-up 13.4 y). Relevant biological, lifestyle, and dietary covariates were used in multivariate regression analyses to determine associations between dietary magnesium and potassium intakes and calcaneal broadband ultrasound attenuation (BUA), as well as in Prentice-weighted Cox regression to determine associated risk of fracture. Separate analyses, excluding dietary covariates, investigated associations of BUA and fractures with serum magnesium concentration. Statistically significant positive trends in calcaneal BUA for women (n = 1360) but not men (n = 968) were apparent across increasing quintiles of magnesium plus potassium (Mg+K) z score intake (P = 0.03) or potassium intake alone (P = 0.04). Reduced hip fracture risk in both men (n = 1958) and women (n = 2755) was evident for individuals in specific Mg+K z score intake quintiles compared with the lowest. Statistically significant trends in fracture risk in men across serum magnesium concentration groups were apparent for spine fractures (P = 0.02) and total hip, spine, and wrist fractures (P = 0.02). 
None of these individual statistically significant associations remained after adjustment for multiple testing. These findings enhance the limited literature studying the association of magnesium and potassium with bone density and demonstrate that further investigation is warranted into the mechanisms involved and the potential protective role against osteoporosis. © 2015 American Society for Nutrition.
Triković-Janjić, Olivera; Apostolović, Mirjana; Janosević, Mirjana; Filipović, Gordana
2008-02-01
Anthropometric methods of measuring the whole body and body parts are the most commonly applied methods of analysing the growth and development of children. Anthropometric measures are interconnected, so that with growth and development the change of one parameter causes the change of another. The aim of the paper was to analyse whether dental development follows overall growth and development and what the ratio of this interdependence is. The research involved a sample of 134 participants, aged between 6 and 8 years. Dental age was determined as the average of the sum of existing permanent teeth of the participants aged 6, 7 and 8. With the aim of analysing physical growth and development, commonly accepted anthropometric indexes were applied: height, weight, circumference of the head, the chest cavity at its widest point, the upper arm, the abdomen and the thigh, and thickness of the epidermis. The dimensions were measured according to the methodology of the International Biological Programme. The influence of the pertinent variables on the analysed variable was determined by the statistical method of multivariable regression. The middle values of all the anthropometric parameters, except for the thickness of the epidermis, were slightly larger in male participants, and the circumference of the chest cavity was statistically significantly larger (p < 0.05). The results of anthropometric measurement showed in general a distinct homogeneity not only of the sample group but also within gender, in relation to all the dimensions, except for the thickness of the epidermis. The average dental age of the participants was 10.36 (10.42 and 10.31 for females and males, respectively). Considerable correlation (R = 0.59) with high statistical significance (p < 0.001) was determined between dental age and the set of anthropometric parameters of general growth and development.
There is a considerable positive correlation (R = 0.59) between dental age and anthropometric parameters of general growth and development, which confirms that dental development follows the overall growth and development of children, aged between 6 and 8 years.
Gerry, Christopher J
2012-07-01
Cross-national statistical analyses based on country-level panel data are increasingly popular in social epidemiology. To provide reliable results on the societal determinants of health, analysts must give very careful consideration to conceptual and methodological issues: aggregate (historical) data are typically compatible with multiple alternative stories of the data-generating process. Studies in this field which fail to relate their empirical approach to the true underlying data-generating process are likely to produce misleading results if, for example, they misspecify their models by failing to explore the statistical properties of the longitudinal aspect of their data or by ignoring endogeneity issues. We illustrate the importance of this extra need for care with reference to a recent debate on whether rapid mass privatisation can explain post-communist mortality fluctuations. We demonstrate that the finding that rapid mass privatisation was a "crucial determinant" of male mortality fluctuations in the post-communist world is rejected once better consideration is given to the way in which the data are generated. Copyright © 2012 Elsevier Ltd. All rights reserved.
McLawhorn, Alexander S; Steinhaus, Michael E; Southren, Daniel L; Lee, Yuo-Yu; Dodwell, Emily R; Figgie, Mark P
2017-01-01
The purpose of this study was to compare the health-related quality of life (HRQoL) of patients across World Health Organization (WHO) body mass index (BMI) classes before and after total hip arthroplasty (THA). Patients with end-stage hip osteoarthritis who received elective primary unilateral THA were identified through an institutional registry and categorized based on the WHO BMI classification. Age, sex, laterality, year of surgery, and Charlson-Deyo comorbidity index were recorded. The primary outcome was the EQ-5D-3L index and visual analog scale (EQ-VAS) scores at 2 years postoperatively. Inferential statistics and regression analyses were performed to determine associations between BMI classes and HRQoL. EQ-5D-3L scores at baseline and at 2 years were statistically different across BMI classes, with higher EQ-VAS and index scores in patients with lower BMI. There was no difference observed for the 2-year change in EQ-VAS scores, but there was a statistically greater increase in index scores for more obese patients. In the regression analyses, there were statistically significant negative effect estimates for EQ-VAS and index scores associated with increasing BMI class. BMI class is independently associated with lower HRQoL scores 2 years after primary THA. While absolute scores in obese patients were lower than in nonobese patients, obese patients enjoyed more positive changes in EQ-5D index scores after THA. These results may provide the most detailed information on how BMI influences HRQoL before and after THA, and they are relevant to future economic decision analyses on the topic. Copyright © 2016 Elsevier Inc. All rights reserved.
Cohen, Alan A; Milot, Emmanuel; Li, Qing; Legault, Véronique; Fried, Linda P; Ferrucci, Luigi
2014-09-01
Measuring physiological dysregulation during aging could be a key tool both to understand underlying aging mechanisms and to predict clinical outcomes in patients. However, most existing indices are either circular or hard to interpret biologically. Recently, we showed that statistical distance of 14 common blood biomarkers (a measure of how strange an individual's biomarker profile is) was associated with age and mortality in the WHAS II data set, validating its use as a measure of physiological dysregulation. Here, we extend the analyses to other data sets (WHAS I and InCHIANTI) to assess the stability of the measure across populations. We found that the statistical criteria used to determine the original 14 biomarkers produced diverging results across populations; in other words, had we started with a different data set, we would have chosen a different set of markers. Nonetheless, the same 14 markers (or the subset of 12 available for InCHIANTI) produced highly similar predictions of age and mortality. We include analyses of all combinatorial subsets of the markers and show that results do not depend much on biomarker choice or data set, but that more markers produce a stronger signal. We conclude that statistical distance as a measure of physiological dysregulation is stable across populations in Europe and North America. Copyright © 2014 Elsevier Inc. All rights reserved.
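The "statistical distance" of a biomarker profile described above is commonly implemented as a Mahalanobis distance from the population centroid; the abstract does not name the exact metric, so the following is a minimal sketch under that assumption, using randomly generated stand-in biomarker data:

```python
import numpy as np

def mahalanobis_distance(x, population):
    """Distance of a biomarker profile x from the population centroid,
    scaled by the population covariance matrix."""
    mu = population.mean(axis=0)
    cov = np.cov(population, rowvar=False)
    diff = np.asarray(x) - mu
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

rng = np.random.default_rng(0)
pop = rng.normal(size=(500, 3))   # 500 individuals, 3 hypothetical biomarkers
typical = pop.mean(axis=0)        # a profile at the population centroid
unusual = typical + 5.0           # a strongly deviant profile
```

A profile at the centroid scores near zero, while deviant profiles score higher, which is the sense in which the measure captures "how strange an individual's biomarker profile is."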
Descriptive and inferential statistical methods used in burns research.
Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars
2010-05-01
Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The six most common tests used (Student's t-test (53%), analysis of variance/covariance (33%), chi-squared test (27%), Wilcoxon and Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and the exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data.
Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.
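The common tests tallied above all have standard implementations in scipy.stats; the following sketch applies them to hypothetical group data (the variable names, group sizes, and counts are invented for illustration only):

```python
import numpy as np
from scipy import stats

# Hypothetical outcome data for two treatment groups (not from any article)
rng = np.random.default_rng(0)
a = rng.normal(50, 10, 30)               # e.g. outcome scores, group A
b = rng.normal(58, 10, 30)               # group B
table = np.array([[20, 10], [12, 18]])   # hypothetical 2x2 outcome counts

t_stat, p_t = stats.ttest_ind(a, b)                # Student's t-test
f_stat, p_f = stats.f_oneway(a, b)                 # one-way ANOVA
chi2, p_c, dof, expected = stats.chi2_contingency(table)  # chi-squared test
u_stat, p_u = stats.mannwhitneyu(a, b)             # Mann-Whitney U test
odds, p_e = stats.fisher_exact(table)              # Fisher's exact test
```

With two groups, the one-way ANOVA F statistic equals the squared t statistic, a useful sanity check when reporting both.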
Early Warning Signs of Suicide in Service Members Who Engage in Unauthorized Acts of Violence
2016-06-01
observable to military law enforcement personnel. Statistical analyses tested for differences in warning signs between cases of suicide, violence, or...indicators, (2) Behavioral Change indicators, (3) Social indicators, and (4) Occupational indicators. Statistical analyses were conducted to test for...
A software platform for statistical evaluation of patient respiratory patterns in radiation therapy.
Dunn, Leon; Kenny, John
2017-10-01
The aim of this work was to design and evaluate a software tool for analysis of a patient's respiration, with the goal of optimizing the effectiveness of motion management techniques during radiotherapy imaging and treatment. A software tool, called RespAnalysis, was developed in MATLAB to analyse patient respiratory data files (.vxp files) created by the Varian Real-Time Position Management (RPM) system. The software provides four modules, one each for determining respiration characteristics, providing breathing coaching (biofeedback training), comparing pre- and post-training characteristics and performing a fraction-by-fraction assessment. The modules analyse respiratory traces to determine signal characteristics and specifically use a Sample Entropy algorithm as the key means to quantify breathing irregularity. Simulated respiratory signals, as well as 91 patient RPM traces, were analysed with RespAnalysis to test the viability of using Sample Entropy for predicting breathing regularity. Retrospective assessment of patient data demonstrated that the Sample Entropy metric was a predictor of periodic irregularity in respiration data; however, it was found to be insensitive to amplitude variation. Additional waveform statistics assessing the distribution of signal amplitudes over time, coupled with the Sample Entropy method, were found to be useful in assessing breathing regularity. The RespAnalysis software tool presented in this work uses the Sample Entropy method to analyse patient respiratory data recorded for motion management purposes in radiation therapy. This is applicable during treatment simulation and during subsequent treatment fractions, providing a way to quantify breathing irregularity, as well as assess the need for breathing coaching.
It was demonstrated that the Sample Entropy metric was correlated to the irregularity of the patient's respiratory motion in terms of periodicity, whilst other metrics, such as percentage deviation of inhale/exhale peak positions provided insight into respiratory amplitude regularity. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
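Sample Entropy as used above (lower values for more regular traces) can be sketched as follows; the parameters m = 2 and r = 0.2 x SD are conventional defaults, not necessarily those used by RespAnalysis, and the signals are synthetic stand-ins for breathing traces:

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample Entropy: -ln(A/B), where B counts template pairs of length m
    and A pairs of length m+1 that match within tolerance r (self-matches
    excluded). Lower values indicate a more regular signal."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def count(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        n = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            n += int(np.sum(d <= r))
        return n
    B, A = count(m), count(m + 1)
    return -np.log(A / B)

t = np.arange(1000)
regular = np.sin(2 * np.pi * t / 40)                   # periodic "breathing"
rng = np.random.default_rng(1)
irregular = regular + rng.normal(0, 0.5, size=t.size)  # noisy breathing

se_regular = sample_entropy(regular)
se_irregular = sample_entropy(irregular)
```

A noisy trace yields a higher Sample Entropy than a clean periodic one, matching the abstract's use of the metric as a predictor of periodic irregularity (and its noted blindness to pure amplitude variation, since r scales with the signal's SD).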
Watson, Kara M.; McHugh, Amy R.
2014-01-01
Regional regression equations were developed for estimating monthly flow-duration and monthly low-flow frequency statistics for ungaged streams in Coastal Plain and non-coastal regions of New Jersey for baseline and current land- and water-use conditions. The equations were developed to estimate 87 different streamflow statistics, which include the monthly 99-, 90-, 85-, 75-, 50-, and 25-percentile flow-durations of the minimum 1-day daily flow; the August–September 99-, 90-, and 75-percentile minimum 1-day daily flow; and the monthly 7-day, 10-year (M7D10Y) low-flow frequency. These 87 streamflow statistics were computed for 41 continuous-record streamflow-gaging stations (streamgages) with 20 or more years of record and 167 low-flow partial-record stations in New Jersey with 10 or more streamflow measurements. The regression analyses used to develop equations to estimate selected streamflow statistics were performed by testing the relation of the flow-duration and low-flow frequency statistics to 32 basin characteristics (physical characteristics, land use, surficial geology, and climate) at the 41 streamgages and 167 low-flow partial-record stations. The regression analyses determined that drainage area, soil permeability, average April precipitation, average June precipitation, and percent storage (water bodies and wetlands) were the significant explanatory variables for estimating the selected flow-duration and low-flow frequency statistics. Streamflow estimates were computed for each selected station, using data collected through water year 2008, for two land- and water-use conditions in New Jersey: land and water use during the baseline period of record (defined as the years a streamgage had little to no change in development and water use) and current land- and water-use conditions (1989–2008). The baseline period of record is representative of a period when the basin was unaffected by change in development.
The current period is representative of the increased development of the last 20 years (1989–2008). The two different land- and water-use conditions were used as surrogates for development to determine whether there have been changes in low-flow statistics as a result of changes in development over time. The State was divided into two low-flow regression regions, the Coastal Plain and the non-coastal region, in order to improve the accuracy of the regression equations. The left-censored parametric survival regression method was used for the analyses to account for streamgages and partial-record stations that had zero flow values for some of the statistics. The average standard error of estimate for the 348 regression equations ranged from 16 to 340 percent. These regression equations and basin characteristics are presented in the U.S. Geological Survey (USGS) StreamStats Web-based geographic information system application. This tool allows users to click on an ungaged site on a stream in New Jersey and get the estimated flow-duration and low-flow frequency statistics. Additionally, the user can click on a streamgage or partial-record station and get the “at-site” streamflow statistics. The low-flow characteristics of a stream ultimately affect the use of the stream by humans. Specific information on the low-flow characteristics of streams is essential to water managers who deal with problems related to municipal and industrial water supply, fish and wildlife conservation, and dilution of wastewater.
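Regional regression equations of this kind typically take a log-linear form, log10(Q) = b0 + sum(bi * Xi), applied to the basin characteristics named above. The coefficients, variable choices, and signs below are purely illustrative, not the report's fitted equations:

```python
import math

def estimate_low_flow(drainage_area, soil_perm, storage_pct,
                      b0=-1.2, b_a=1.05, b_p=0.6, b_s=-0.02):
    """Hypothetical log-linear regional regression:
    log10(Q) = b0 + b_a*log10(A) + b_p*log10(P) + b_s*S,
    where A is drainage area, P soil permeability, S percent storage.
    All coefficients are invented for illustration."""
    logq = (b0 + b_a * math.log10(drainage_area)
            + b_p * math.log10(soil_perm) + b_s * storage_pct)
    return 10 ** logq
```

Because the model is fit in log space, estimates are multiplicative in the basin characteristics, which is why such reports quote standard errors of prediction in percent.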
[Statistical analysis using freely-available "EZR (Easy R)" software].
Kanda, Yoshinobu
2015-10-01
Clinicians must often perform statistical analyses for purposes such as evaluating preexisting evidence and designing or executing clinical studies. R is a free software environment for statistical computing. R supports many statistical analysis functions, but does not incorporate a statistical graphical user interface (GUI). The R commander provides an easy-to-use basic-statistics GUI for R. However, the statistical functionality of the R commander is limited, especially in the field of biostatistics. Therefore, the author added several important statistical functions to the R commander and named the result "EZR (Easy R)", which is now being distributed on the following website: http://www.jichi.ac.jp/saitama-sct/. EZR allows point-and-click application of statistical functions that are frequently used in clinical studies, such as survival analyses, including competing risk analyses and the use of time-dependent covariates. In addition, by saving the script automatically created by EZR, users can learn R script writing, maintain the traceability of the analysis, and ensure that the statistical process can be overseen by a supervisor.
Rivkin, Michael J; Davis, Peter E; Lemaster, Jennifer L; Cabral, Howard J; Warfield, Simon K; Mulkern, Robert V; Robson, Caroline D; Rose-Jacobs, Ruth; Frank, Deborah A
2008-04-01
The objective of this study was to use volumetric MRI to study brain volumes in 10- to 14-year-old children with and without intrauterine exposure to cocaine, alcohol, cigarettes, or marijuana. Volumetric MRI was performed on 35 children (mean age: 12.3 years; 14 with intrauterine exposure to cocaine, 21 with no intrauterine exposure to cocaine) to determine the effect of prenatal drug exposure on volumes of cortical gray matter, white matter, subcortical gray matter, cerebrospinal fluid, and total parenchymal volume. Head circumference was also obtained. Analyses of each individual substance were adjusted for demographic characteristics and the remaining 3 prenatal substance exposures. Regression analyses adjusted for demographic characteristics showed that children with intrauterine exposure to cocaine had lower mean cortical gray matter and total parenchymal volumes and smaller mean head circumference than comparison children. After adjustment for other prenatal exposures, these volumes remained smaller but lost statistical significance. Similar analyses conducted for prenatal ethanol exposure adjusted for demographics showed significant reductions in mean cortical gray matter, total parenchymal volumes, and head circumference, which remained smaller but lost statistical significance after adjustment for the remaining 3 exposures. Notably, prenatal cigarette exposure was associated with significant reductions in cortical gray matter and total parenchymal volumes and head circumference after adjustment for demographics, which retained marginal significance after adjustment for the other 3 exposures. Finally, as the number of prenatal substance exposures grew, cortical gray matter and total parenchymal volumes and head circumference declined significantly, with the smallest measures found among children exposed to all 4.
CONCLUSIONS: These data suggest that intrauterine exposures to cocaine, alcohol, and cigarettes are individually related to reduced head circumference, cortical gray matter, and total parenchymal volumes as measured by MRI at school age. Adjustment for other substance exposures precludes determination of a statistically significant individual substance effect on brain volume in this small sample; however, these substances may act cumulatively during gestation to exert lasting effects on brain size and volume.
Analysis of the sleep quality of elderly people using biomedical signals.
Moreno-Alsasua, L; Garcia-Zapirain, B; Mendez-Zorrilla, A
2015-01-01
This paper presents a technical solution that analyses sleep signals captured by biomedical sensors to find possible disorders during rest. Specifically, the method evaluates electrooculogram (EOG) signals, skin conductance (GSR), air flow (AS), and body temperature. Next, a quantitative sleep quality analysis determines significant changes in the biological signals and any similarities between them in a given time period. Filtering techniques such as the Fourier transform method and IIR filters process the signal and identify significant variations. Once these changes have been identified, all significant data are compared and a quantitative and statistical analysis is carried out to determine the level of a person's rest. To evaluate the correlation and significant differences, a statistical analysis was performed showing correlation between the EOG and AS signals (p=0.005), the EOG and GSR signals (p=0.037) and, finally, the EOG and body temperature (p=0.04). Doctors could use this information to monitor changes within a patient.
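The IIR filtering step described above can be sketched with a Butterworth low-pass filter; the sampling rate, cutoff frequency, and signal composition below are assumptions for illustration, not the paper's actual processing chain:

```python
import numpy as np
from scipy import signal

fs = 100.0                                     # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
slow = np.sin(2 * np.pi * 0.3 * t)             # slow EOG-like component
eog = slow + 0.5 * np.sin(2 * np.pi * 20 * t)  # plus high-frequency artefact

# 4th-order Butterworth low-pass IIR filter with a 2 Hz cutoff, applied
# forwards and backwards (zero phase) to preserve the slow component
b, a = signal.butter(4, 2.0, btype="low", fs=fs)
filtered = signal.filtfilt(b, a, eog)
```

After filtering, the high-frequency artefact is suppressed while the slow physiological component is retained, which is the precondition for the significance analysis across signals.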
Pinheiro, Rubiane C; Soares, Cleide M F; de Castro, Heizir F; Moraes, Flavio F; Zanin, Gisella M
2008-03-01
The conditions for maximization of the enzymatic activity of lipase entrapped in a sol-gel matrix were determined for different vegetable oils using an experimental design. The effects of pH, temperature, and biocatalyst loading on lipase activity were verified using a central composite experimental design leading to a set of 13 assays and response surface analysis. For canola oil and entrapped lipase, statistical analyses showed significant effects for pH and temperature, as well as the interactions between pH and temperature and between temperature and biocatalyst loading. For olive oil and entrapped lipase, pH was the only statistically significant variable. This study demonstrated that response surface analysis is an appropriate methodology for maximization of the percentage of hydrolysis as a function of pH, temperature, and lipase loading.
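A central composite design with response surface analysis amounts to fitting a second-order polynomial in the coded factors and searching the fitted surface for its optimum. The sketch below uses two factors and hypothetical design points and responses, not the study's three-factor design or data:

```python
import numpy as np

# Hypothetical 2-factor central composite design: coded pH (x1) and
# temperature (x2), with a measured hydrolysis (%) response. Illustration only.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [0, 0], [0, 0],
              [-1.41, 0], [1.41, 0], [0, -1.41], [0, 1.41]])
y = np.array([60., 65., 64., 80., 90., 89., 55., 70., 58., 72.])

def design_matrix(X):
    """Second-order model terms: 1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

def predict(point):
    return float((design_matrix(np.atleast_2d(point)) @ beta).item())
```

Negative fitted quadratic coefficients indicate an interior maximum, so the surface can then be searched for the factor settings that maximize the predicted percentage of hydrolysis.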
Improving DHH students' grammar through an individualized software program.
Cannon, Joanna E; Easterbrooks, Susan R; Gagné, Phill; Beal-Alvarez, Jennifer
2011-01-01
The purpose of this study was to determine if the frequent use of a targeted, computer software grammar instruction program, used as an individualized classroom activity, would influence the comprehension of morphosyntax structures (determiners, tense, and complementizers) in deaf/hard-of-hearing (DHH) participants who use American Sign Language (ASL). Twenty-six students from an urban day school for the deaf participated in this study. Two hierarchical linear modeling growth curve analyses showed that the influence of LanguageLinks: Syntax Assessment and Intervention (LL) resulted in statistically significant gains in participants' comprehension of morphosyntax structures. Two dependent t tests revealed statistically significant results between the pre- and postintervention assessments on the Diagnostic Evaluation of Language Variation-Norm Referenced. The daily use of LL increased the morphosyntax comprehension of the participants in this study and may be a promising practice for DHH students who use ASL.
Mallette, Jennifer R.; Casale, John F.; Jordan, James; Morello, David R.; Beyer, Paul M.
2016-01-01
Previously, geo-sourcing to five major coca growing regions within South America was accomplished. However, the expansion of coca cultivation throughout South America made sub-regional origin determinations increasingly difficult. The former methodology was recently enhanced with additional stable isotope analyses (2H and 18O) to fully characterize cocaine due to the varying environmental conditions in which the coca was grown. An improved data analysis method was implemented with the combination of machine learning and multivariate statistical analysis methods to provide further partitioning between growing regions. Here, we show how the combination of trace cocaine alkaloids, stable isotopes, and multivariate statistical analyses can be used to classify illicit cocaine as originating from one of 19 growing regions within South America. The data obtained through this approach can be used to describe current coca cultivation and production trends, highlight trafficking routes, as well as identify new coca growing regions. PMID:27006288
McConkey, Roy; Bunting, Brendan; Keogh, Fiona; Garcia Iriarte, Edurne
2017-01-01
A natural experiment contrasted the social relationships of people with intellectual disabilities ( n = 110) before and after they moved from congregated settings to either personalized accommodation or group homes. Contrasts could also be drawn with individuals who had enduring mental health problems ( n = 46) and who experienced similar moves. Face-to-face interviews were conducted in each person's residence on two occasions approximately 24 months apart. Multivariate statistical analyses were used to determine significant effects. Greater proportions of people living in personalized settings scored higher on the five chosen indicators of social relationships than did persons living in grouped accommodation. However, multivariate statistical analyses identified that only one in five persons increased their social relationships as a result of changes in their accommodation, particularly persons with an intellectual disability and high support needs. These findings reinforce the extent of social isolation experienced by people with disabilities and mental health problems that changes in their accommodation only partially counter.
Parasites as valuable stock markers for fisheries in Australasia, East Asia and the Pacific Islands.
Lester, R J G; Moore, B R
2015-01-01
Over 30 studies in Australasia, East Asia and the Pacific Islands region have collected and analysed parasite data to determine the ranges of individual fish, many leading to conclusions about stock delineation. Parasites used as biological tags have included both those known to have long residence times in the fish and those thought to be relatively transient. In many cases the parasitological conclusions have been supported by other methods especially analysis of the chemical constituents of otoliths, and to a lesser extent, genetic data. In analysing parasite data, authors have applied multiple different statistical methodologies, including summary statistics, and univariate and multivariate approaches. Recently, a growing number of researchers have found non-parametric methods, such as analysis of similarities and cluster analysis, to be valuable. Future studies into the residence times, life cycles and geographical distributions of parasites together with more robust analytical methods will yield much important information to clarify stock structures in the area.
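The cluster-analysis approach mentioned above can be sketched with hierarchical clustering of a site-by-parasite prevalence matrix; the data below are hypothetical, invented purely to illustrate the workflow:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical parasite prevalence (rows: sampling sites, cols: parasite taxa)
prev = np.array([[0.8, 0.1, 0.0],
                 [0.7, 0.2, 0.1],
                 [0.1, 0.6, 0.7],
                 [0.0, 0.7, 0.8]])

# Agglomerative clustering of sites by their parasite communities
Z = linkage(prev, method="average", metric="euclidean")
groups = fcluster(Z, t=2, criterion="maxclust")
```

Here the first two sites and the last two sites fall into separate clusters, the kind of pattern that, in a real study, would be read as evidence for two stocks (and checked against otolith chemistry or genetic data, as the review recommends).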
Angstman, Nicholas B; Frank, Hans-Georg; Schmitz, Christoph
2016-01-01
As a widely used and studied model organism, Caenorhabditis elegans offers the ability to investigate the implications of behavioral change. Although investigation of C. elegans behavioral traits has been demonstrated, analysis is often narrowed to measurements based on a single point, and thus cannot pick up subtle behavioral and morphological changes. In the present study, videos were captured of four different C. elegans strains grown in liquid culture and transferred to NGM-agar plates with an E. coli lawn or with no lawn. Using advanced software, WormLab, the full skeleton and outline of worms were tracked to determine whether the presence of food affects behavioral traits. In all seven investigated parameters, statistically significant differences were found in worm behavior between those moving on NGM-agar plates with an E. coli lawn and those on NGM-agar plates with no lawn. Furthermore, multiple test groups showed differences in interaction between variables, as the parameters that correlated statistically significantly with speed of locomotion varied. In the present study, we demonstrate the validity of a model to analyze C. elegans behavior beyond simple speed of locomotion. The need to account for a nested design while performing statistical analyses in similar studies is also demonstrated. With extended analyses, C. elegans behavioral change can be investigated with greater sensitivity, which could have wide utility in fields such as, but not limited to, toxicology, drug discovery, and RNAi screening.
NASA Astrophysics Data System (ADS)
El Sharawy, Mohamed S.; Gaafar, Gamal R.
2016-12-01
Both reservoir engineers and petrophysicists have been concerned with dividing a reservoir into zones for engineering and petrophysics purposes. Over the decades, several techniques and approaches have been introduced. Of these, statistical reservoir zonation, the stratigraphic modified Lorenz (SML) plot, and principal component and clustering analyses were chosen and applied to the Nubian sandstone reservoir of Palaeozoic to Lower Cretaceous age, Gulf of Suez, Egypt, using five adjacent wells. The studied reservoir consists mainly of sandstone with some intercalation of shale layers of varying thickness from one well to another. The permeability ranged from less than 1 md to more than 1000 md. The statistical reservoir zonation technique, based on core permeability, indicated that the cored interval of the studied reservoir can be divided into two zones. Using reservoir properties such as porosity, bulk density, acoustic impedance and interval transit time also indicated two zones, with an obvious variation in separation depth and zone continuity. The stratigraphic modified Lorenz (SML) plot indicated the presence of more than 9 flow units in the cored interval as well as a high degree of microscopic heterogeneity. On the other hand, principal component and cluster analyses, based on well logging data (gamma ray, sonic, density and neutron), indicated that the whole reservoir can be divided into at least four electrofacies with a noticeable variation in reservoir quality, as correlated with the measured permeability. Furthermore, continuity or discontinuity of the reservoir zones can be determined using this analysis.
Webster, R J; Williams, A; Marchetti, F; Yauk, C L
2018-07-01
Mutations in germ cells pose potential genetic risks to offspring. However, de novo mutations are rare events that are spread across the genome and are difficult to detect. Thus, studies in this area have generally been under-powered, and no human germ cell mutagen has been identified. Whole Genome Sequencing (WGS) of human pedigrees has been proposed as an approach to overcome these technical and statistical challenges. WGS enables analysis of a much wider breadth of the genome than traditional approaches. Here, we performed power analyses to determine the feasibility of using WGS in human families to identify germ cell mutagens. Different statistical models were compared in the power analyses (ANOVA and multiple regression for one-child families, and mixed effect models sampling two to four siblings per family). Assumptions were made based on parameters from the existing literature, such as the mutation-by-paternal-age effect. We explored two scenarios: a constant effect due to an exposure that occurred in the past, and an accumulating effect where the exposure is continuing. Our analysis revealed the importance of modeling inter-family variability of the mutation-by-paternal-age effect. Statistical power was improved by models accounting for the family-to-family variability. Our power analyses suggest that sufficient statistical power can be attained with 4 to 28 four-sibling families per treatment group when the increase in mutations ranges from 40% down to 10%, respectively. Modeling family variability using mixed effect models provided a reduction in sample size compared to a multiple regression approach. Much larger sample sizes were required to detect an interaction effect between environmental exposures and paternal age. These findings inform study design and statistical modeling approaches to improve power and reduce sequencing costs for future studies in this area. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.
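The mixed-model power analysis itself is involved, but the underlying logic, simulating mutation counts under an assumed exposure effect and counting how often a study of a given size detects it, can be sketched with a simplified two-group Monte Carlo. The Poisson count model, baseline rate, effect size, and two-sample z-test here are simplifying assumptions, not the paper's models:

```python
import numpy as np

def simulate_power(n_per_group, effect=1.1, base=60, n_sims=400, seed=0):
    """Monte Carlo power estimate: fraction of simulated studies in which
    a Poisson-distributed de novo mutation count (mean `base` in controls,
    `base * effect` in exposed families) differs significantly between
    groups by a two-sample z-test at alpha = 0.05."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        control = rng.poisson(base, n_per_group)
        exposed = rng.poisson(base * effect, n_per_group)
        se = np.sqrt(control.var(ddof=1) / n_per_group
                     + exposed.var(ddof=1) / n_per_group)
        z = (exposed.mean() - control.mean()) / se
        hits += abs(z) > 1.96
    return hits / n_sims
```

With a 10% increase in mutations (effect=1.1), estimated power rises steeply with the number of families per group, mirroring the qualitative trade-off between effect size and sample size reported above.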
1993-12-01
graduate education required for Ocean Facilities Program (OFP) officers in the Civil Engineer Corps (CEC) of the United States Navy. For the purpose...determined by distributing questionnaires to all officers in the OFP. Statistical analyses of numerical data and judgmental analysis of professional...
ERIC Educational Resources Information Center
Whittington, David H.
2012-01-01
This study included a literature review of juried research studies of student achievement factors that affect African American achievement tracked under the No Child Left Behind Act. Statistical correlation analyses were performed to determine if the absence or presence of one or two parents in the household affected student achievement…
NASA Technical Reports Server (NTRS)
Greenbauer-Seng, L. A.
1983-01-01
The accurate determination of trace metals in fuels is an important requirement in much of the research into and development of alternative fuels for aerospace applications. Recognizing the detrimental effects of certain metals on fuel performance and fuel systems at the part-per-million, and in some cases part-per-billion, levels requires improved accuracy in determining these low-concentration elements. Accurate analyses are also required to ensure interchangeability of analysis results between vendor, researcher, and end user for purposes of quality control. Previous interlaboratory studies have demonstrated the inability of different laboratories to agree on the results of metal analysis, particularly at low concentration levels, yet typically good precisions are reported within a laboratory. An interlaboratory study was designed to gain statistical information about the sources of variation in the reported concentrations. Five participant laboratories were used on a fee basis and were not informed of the purpose of the analyses. The effects of laboratory, analytical technique, concentration level, and ashing additive were studied in four fuel types for 20 elements of interest. The prescribed sample preparation schemes (variations of dry ashing) were used by all of the laboratories. The analytical data were statistically evaluated using a computer program for the analysis of variance technique.
Finke, Erinn H; Hickerson, Benjamin; McLaughlin, Eileen
2015-04-01
The purpose of this study was to determine parental attitudes regarding engagement with video games by their children with autism spectrum disorder (ASD) and whether attitudes vary based on ASD symptom severity. Online survey methodology was used to gather information from parents of children with ASD between the ages of 8 and 12 years. The finalized data set included 152 cases. Descriptive statistics and frequency analyses were used to examine participant demographics and video game play. Descriptive and inferential statistics were used to evaluate questions on the theory of planned behavior. Regression analyses determined the predictive ability of the theory of planned behavior constructs, and t tests provided additional descriptive information about between-group differences. Children with ASD play video games. There are no significant differences in the time, intensity, or types of games played based on severity of ASD symptoms (mild vs. moderate). Parents of children with ASD had positive attitudes about video game play. Parents of children with ASD appear to support video game play. On average, parents indicated video game play was positive for their children with ASD, particularly if they believed the games were having a positive impact on their child's development.
Hydrothermal contamination of public supply wells in Napa and Sonoma Valleys, California
Forrest, Matthew J.; Kulongoski, Justin T.; Edwards, Matthew S.; Farrar, Christopher D.; Belitz, Kenneth; Norris, Richard D.
2013-01-01
Groundwater chemistry and isotope data from 44 public supply wells in the Napa and Sonoma Valleys, California were determined to investigate mixing of relatively shallow groundwater with deeper hydrothermal fluids. Multivariate analyses including Cluster Analyses, Multidimensional Scaling (MDS), Principal Components Analyses (PCA), Analysis of Similarities (ANOSIM), and Similarity Percentage Analyses (SIMPER) were used to elucidate constituent distribution patterns, determine which constituents are significantly associated with these hydrothermal systems, and investigate hydrothermal contamination of local groundwater used for drinking water. Multivariate statistical analyses were essential to this study because traditional methods, such as mixing tests involving single species (e.g. Cl or SiO2) were incapable of quantifying component proportions due to mixing of multiple water types. Based on these analyses, water samples collected from the wells were broadly classified as fresh groundwater, saline waters, hydrothermal fluids, or mixed hydrothermal fluids/meteoric water wells. The Multivariate Mixing and Mass-balance (M3) model was applied in order to determine the proportion of hydrothermal fluids, saline water, and fresh groundwater in each sample. Major ions, isotopes, and physical parameters of the waters were used to characterize the hydrothermal fluids as Na–Cl type, with significant enrichment in the trace elements As, B, F and Li. Five of the wells from this study were classified as hydrothermal, 28 as fresh groundwater, two as saline water, and nine as mixed hydrothermal fluids/meteoric water wells. The M3 mixing-model results indicated that the nine mixed wells contained between 14% and 30% hydrothermal fluids. Further, the chemical analyses show that several of these mixed-water wells have concentrations of As, F and B that exceed drinking-water standards or notification levels due to contamination by hydrothermal fluids.
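The end-member mixing idea behind the M3 approach can be illustrated with a small least-squares sketch. This is a simplified stand-in, not the published M3 implementation, and the end-member tracer concentrations below are hypothetical values chosen only to mimic fresh groundwater, saline water, and a Na-Cl-type hydrothermal fluid:

```python
import numpy as np

# Hypothetical end-member tracer concentrations (rows: Cl, B, Li in mg/L;
# columns: fresh groundwater, saline water, hydrothermal fluid).
E = np.array([[10.0, 6000.0, 900.0],
              [0.05,    1.0,   8.0],
              [0.01,    0.2,   2.0]])

def mixing_fractions(sample):
    """Least-squares end-member fractions constrained to sum to 1
    (a simplified sketch of the mixing-model idea)."""
    w = 1e6  # heavy weight enforces the sum-to-one constraint row
    A = np.vstack([E, w * np.ones(3)])
    b = np.append(sample, w * 1.0)
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    return f

# A sample that is 70% fresh groundwater and 30% hydrothermal fluid:
true_f = np.array([0.7, 0.0, 0.3])
sample = E @ true_f
f = mixing_fractions(sample)
```

With consistent data the solver recovers the mixing proportions exactly; with real analyses the residual measures how well the end-member model explains the sample.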
Hrdlickova Kuckova, Stepanka; Rambouskova, Gabriela; Hynek, Radovan; Cejnar, Pavel; Oltrogge, Doris; Fuchs, Robert
2015-11-01
Matrix-assisted laser desorption/ionisation-time of flight (MALDI-TOF) mass spectrometry is commonly used for the identification of proteinaceous binders and their mixtures in artworks. The determination of protein binders is based on a comparison between the m/z values of tryptic peptides in the unknown sample and those of reference samples (egg, casein, animal glues, etc.), but the method has greater potential for studying changes due to ageing and the influence of organic/inorganic components on protein identification. However, this requires statistical evaluation of the obtained data. Until now, it has been complicated to routinely import mass spectrometric data into a statistical programme and to extract and match the appropriate peaks; only a few 'homemade' computer programmes without user-friendly interfaces have been available for these purposes. In this paper, we present our completely new, publicly available, non-commercial software, ms-alone and multiMS-toolbox, for principal component analysis of MALDI-TOF MS data in the R environment, and their application to the study of the influence of heterogeneous matrices (organic lakes) on protein identification. Using this new software, we determined the main factors that influence the protein analyses of artificially aged model mixtures of organic lakes and fish glue, prepared according to historical recipes that were used for book illumination, using MALDI-TOF peptide mass mapping. Copyright © 2015 John Wiley & Sons, Ltd.
Using R-Project for Free Statistical Analysis in Extension Research
ERIC Educational Resources Information Center
Mangiafico, Salvatore S.
2013-01-01
One option for Extension professionals wishing to use free statistical software is to use online calculators, which are useful for common, simple analyses. A second option is to use a free computing environment capable of performing statistical analyses, like R-project. R-project is free, cross-platform, powerful, and respected, but may be…
Matsuoka, Masanari; Sugita, Masatake; Kikuchi, Takeshi
2014-09-18
Proteins that share a high sequence homology while exhibiting drastically different 3D structures are investigated in this study. Recently, artificial proteins related to the sequences of the GA and IgG-binding GB domains of human serum albumin have been designed. These artificial proteins, referred to as GA and GB, share 98% amino acid sequence identity but exhibit different 3D structures, namely, a 3α bundle versus a 4β + α structure. Discriminating between their 3D structures based on their amino acid sequences is a very difficult problem. In the present work, in addition to using bioinformatics techniques, an analysis based on inter-residue average distance statistics is used to address this problem. Ordinary analyses such as BLAST searches and conservation analyses alone could not determine which structure a given sequence would adopt. However, by supplementing these analyses with inter-residue average distance statistics and our sequence tendency analysis, we could infer which regions play an important role in structural formation. The results suggest possible determinants of the different 3D structures for sequences with high sequence identity. The possibility of discriminating between the 3D structures based on the given sequences is also discussed.
Plant selection for ethnobotanical uses on the Amalfi Coast (Southern Italy).
Savo, V; Joy, R; Caneva, G; McClatchey, W C
2015-07-15
Many ethnobotanical studies have investigated selection criteria for medicinal and non-medicinal plants. In this paper we test several statistical methods on different ethnobotanical datasets in order to 1) define to what extent the nature of the datasets can affect the interpretation of results, and 2) determine whether the selection of plants for different uses is based on phylogeny or on other selection criteria. We considered three ethnobotanical datasets: two datasets of medicinal plants and a dataset of non-medicinal plants (handicraft production, domestic and agro-pastoral practices), together with two floras of the Amalfi Coast. We performed residual analysis from linear regression, the binomial test, and a Bayesian approach for identifying under-used and over-used plant families within the ethnobotanical datasets. Percentages of agreement were calculated to compare the results of the analyses. We also analyzed the relationship between plant selection and phylogeny, chorology, life form and habitat using the chi-square test. Pearson's residuals for each of the significant chi-square analyses were examined to investigate alternative hypotheses about plant selection criteria. The results of the three statistical methods differed within the same dataset, and between different datasets and floras, but with some similarities. In the two medicinal datasets, only Lamiaceae was identified in both floras as an over-used family by all three statistical methods. All statistical methods in one flora agreed that Malvaceae was over-used and Poaceae under-used, but this was not consistent with the results for the second flora, in which one statistical result was non-significant. All other families showed some discrepancy in significance across methods or floras. Significant over- or under-use was observed in only a minority of cases. The chi-square analyses were significant for phylogeny, life form and habitat. 
Pearson's residuals indicated a non-random selection of woody species for non-medicinal uses and an under-use of plants of temperate forests for medicinal uses. Our study showed that selection criteria for plant uses (including medicinal) are not always based on phylogeny. The comparison of different statistical methods (regression, binomial and Bayesian) under different conditions led to the conclusion that the most conservative results are obtained using regression analysis.
Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L
2018-01-01
Aims: A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify the constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to compare data from quantitative molecular assays more thoroughly. Methods: We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Results: Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. Conclusions: The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of the performance characteristics of quantitative molecular assays prior to implementation in the clinical molecular laboratory. PMID:28747393
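As a rough illustration of the two approaches discussed here, the sketch below computes Bland-Altman bias and 95% limits of agreement, and a Deming regression slope and intercept, on synthetic assay data seeded with deliberate constant (+2) and proportional (x1.05) error. The data and error magnitudes are invented for illustration; this is not the authors' validation data:

```python
import numpy as np

rng = np.random.default_rng(0)
truth = rng.uniform(5, 50, 40)                        # e.g. variant allele fractions (%)
ref = truth + rng.normal(0, 0.5, 40)                  # previously validated assay
new = 2.0 + 1.05 * truth + rng.normal(0, 0.5, 40)     # assay with constant + proportional error

# Bland-Altman: mean difference (bias) and 95% limits of agreement
diff = new - ref
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))

def deming(x, y, delta=1.0):
    """Deming regression (measurement error in both variables,
    delta = ratio of error variances)."""
    mx, my = x.mean(), y.mean()
    sxx = ((x - mx) ** 2).sum()
    syy = ((y - my) ** 2).sum()
    sxy = ((x - mx) * (y - my)).sum()
    slope = (syy - delta * sxx + np.sqrt((syy - delta * sxx) ** 2
             + 4 * delta * sxy ** 2)) / (2 * sxy)
    return slope, my - slope * mx

slope, intercept = deming(ref, new)
```

The fitted slope near 1.05 flags the proportional error and the intercept near 2 flags the constant error, exactly the two components an R2 value alone cannot separate.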
Brushing force of manual and sonic toothbrushes affects dental hard tissue abrasion.
Wiegand, Annette; Burkhard, John Patrik Matthias; Eggmann, Florin; Attin, Thomas
2013-04-01
This study aimed to determine the brushing forces applied during in vivo toothbrushing with manual and sonic toothbrushes and to analyse the effect of these brushing forces on the abrasion of sound and eroded enamel and dentin in vitro. Brushing forces of a manual and two sonic toothbrushes (low- and high-frequency modes) were measured in 27 adults before and after instruction in the respective brushing technique and statistically analysed by repeated-measures analysis of variance (ANOVA). In the in vitro experiment, sound and eroded enamel and dentin specimens (each subgroup n = 12) were brushed in an automatic brushing machine with the respective brushing forces using a fluoridated toothpaste slurry. Abrasion was determined by profilometry and statistically analysed by one-way ANOVA. The average brushing force of the manual toothbrush (1.6 ± 0.3 N) was significantly higher than that of the sonic toothbrushes (0.9 ± 0.2 N), which did not differ significantly from each other. Brushing force before and after instruction in the brushing technique did not differ significantly. The manual toothbrush caused the highest abrasion of sound and eroded dentin, but the lowest of sound enamel. No significant differences were detected on eroded enamel. Brushing forces of manual and sonic toothbrushes differ and affect their abrasive capacity. Patients with severe tooth wear and exposed and/or eroded dentin surfaces should use sonic toothbrushes to reduce abrasion, while patients without tooth wear or with erosive lesions confined to enamel do not benefit from sonic toothbrushes with regard to abrasion.
The Problem of Auto-Correlation in Parasitology
Pollitt, Laura C.; Reece, Sarah E.; Mideo, Nicole; Nussey, Daniel H.; Colegrave, Nick
2012-01-01
Explaining the contribution of host and pathogen factors in driving infection dynamics is a major ambition in parasitology. There is increasing recognition that analyses based on single summary measures of an infection (e.g., peak parasitaemia) do not adequately capture infection dynamics and so, the appropriate use of statistical techniques to analyse dynamics is necessary to understand infections and, ultimately, control parasites. However, the complexities of within-host environments mean that tracking and analysing pathogen dynamics within infections and among hosts poses considerable statistical challenges. Simple statistical models make assumptions that will rarely be satisfied in data collected on host and parasite parameters. In particular, model residuals (unexplained variance in the data) should not be correlated in time or space. Here we demonstrate how failure to account for such correlations can result in incorrect biological inference from statistical analysis. We then show how mixed effects models can be used as a powerful tool to analyse such repeated measures data in the hope that this will encourage better statistical practices in parasitology. PMID:22511865
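The core point of this entry, that ignoring temporal correlation in residuals inflates false-positive rates, can be demonstrated with a short simulation. The sketch below regresses pure AR(1) noise on time and counts how often a "significant" trend is declared; the series length, autocorrelation strength, and simulation count are arbitrary choices for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def false_positive_rate(rho, n_sims=2000, n=50):
    """Fraction of simulations in which OLS on trend-free AR(1) noise
    declares a 'significant' time trend at alpha = 0.05."""
    t = np.arange(n)
    hits = 0
    for _ in range(n_sims):
        e = np.empty(n)
        e[0] = rng.normal()
        for i in range(1, n):  # AR(1) recursion with stationary variance 1
            e[i] = rho * e[i - 1] + rng.normal() * np.sqrt(1 - rho ** 2)
        res = stats.linregress(t, e)
        hits += res.pvalue < 0.05
    return hits / n_sims

fpr_indep = false_positive_rate(0.0)  # independent residuals: near nominal 5%
fpr_ar = false_positive_rate(0.8)     # autocorrelated residuals: far above 5%
```

With independent residuals the error rate sits near the nominal 5%; with strong autocorrelation it is several times larger, which is the incorrect-inference problem that mixed-effects (or other correlation-aware) models address.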
Agriculture, population growth, and statistical analysis of the radiocarbon record.
Zahid, H Jabran; Robinson, Erick; Kelly, Robert L
2016-01-26
The human population has grown significantly since the onset of the Holocene about 12,000 y ago. Despite decades of research, the factors determining prehistoric population growth remain uncertain. Here, we examine measurements of the rate of growth of the prehistoric human population based on statistical analysis of the radiocarbon record. We find that, during most of the Holocene, human populations worldwide grew at a long-term annual rate of 0.04%. Statistical analysis of the radiocarbon record shows that transitioning farming societies experienced the same rate of growth as contemporaneous foraging societies. The same rate of growth measured for populations dwelling in a range of environments and practicing a variety of subsistence strategies suggests that the global climate and/or endogenous biological factors, not adaptability to local environment or subsistence practices, regulated the long-term growth of the human population during most of the Holocene. Our results demonstrate that statistical analyses of large ensembles of radiocarbon dates are robust and valuable for quantitatively investigating the demography of prehistoric human populations worldwide.
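A long-term annual growth rate of the kind reported here is, in essence, the slope of log(date density) against calendar time. The toy sketch below recovers an assumed 0.04% annual rate from synthetic binned date counts; the data are invented, and this is a simplification of the authors' statistical analysis, not a reproduction of it:

```python
import numpy as np
from scipy import stats

# Synthetic "summed radiocarbon record": expected date counts in 200-year bins
# for a population growing at 0.04% per year (hypothetical, for illustration).
r_true = 0.0004
ages = np.arange(12000, 2000, -200)        # years before present
counts = 100.0 * np.exp(-r_true * ages)    # date counts assumed proportional to population

# Growth rate = slope of log(counts) against calendar time (-age)
slope, intercept, *_ = stats.linregress(-ages, np.log(counts))
annual_rate = slope  # per-year rate; multiply by 100 for percent
```

Real radiocarbon records add taphonomic loss and calibration effects on top of this log-linear core, which is why the published analyses are considerably more involved.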
Fernández-San-Martín, Maria Isabel; Martín-López, Luis Miguel; Masa-Font, Roser; Olona-Tabueña, Noemí; Roman, Yuani; Martin-Royo, Jaume; Oller-Canet, Silvia; González-Tejón, Susana; San-Emeterio, Luisa; Barroso-Garcia, Albert; Viñas-Cabrera, Lidia; Flores-Mateo, Gemma
2014-01-01
Patients with severe mental illness have a higher prevalence of cardiovascular risk factors (CRF). The objective is to determine whether interventions to modify lifestyle in these patients reduce anthropometric and analytical parameters related to CRF in comparison with routine clinical practice. Systematic review of controlled clinical trials with lifestyle interventions in Medline, Cochrane Library, Embase, PsycINFO and CINAHL. Outcomes: change in body mass index, waist circumference, cholesterol, triglycerides and blood sugar. Meta-analyses were performed using random-effects models to estimate the weighted mean difference. Heterogeneity was assessed using the I2 statistic and subgroup analyses. 26 studies were selected. Lifestyle interventions decreased anthropometric and analytical parameters at 3 months of follow-up. At 6 and 12 months, the differences between the intervention and control groups were maintained, although with less precision. More studies with larger samples and long-term follow-up are needed.
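The random-effects pooling described here is commonly done with the DerSimonian-Laird estimator; a minimal sketch follows. The six trial effects and variances are hypothetical numbers standing in for BMI changes, not values from the review:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate and 95% CI."""
    effects = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v
    fixed = (w * effects).sum() / w.sum()          # fixed-effect estimate
    Q = (w * (effects - fixed) ** 2).sum()         # Cochran's Q
    df = len(effects) - 1
    C = w.sum() - (w ** 2).sum() / w.sum()
    tau2 = max(0.0, (Q - df) / C)                  # between-trial variance
    w_star = 1.0 / (v + tau2)
    pooled = (w_star * effects).sum() / w_star.sum()
    se = np.sqrt(1.0 / w_star.sum())
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), tau2

# Hypothetical BMI changes (kg/m^2) from six lifestyle-intervention trials
pooled, ci, tau2 = dersimonian_laird([-0.8, -0.5, -1.2, -0.3, -0.9, -0.6],
                                     [0.04, 0.09, 0.06, 0.05, 0.08, 0.03])
```

When tau2 is 0 the random-effects weights reduce to the fixed-effect inverse-variance weights; a positive tau2 widens the confidence interval, matching the "less precision" seen at longer follow-up.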
Popov, Stanko Ilić; Stafilov, Trajče; Šajn, Robert; Tănăselia, Claudiu; Bačeva, Katerina
2014-01-01
A systematic study was carried out to investigate the distribution of fifty-six elements in the water samples from river Vardar (Republic of Macedonia and Greece) and its major tributaries. The samples were collected from 27 sampling sites. Analyses were performed by mass spectrometry with inductively coupled plasma (ICP-MS) and atomic emission spectrometry with inductively coupled plasma (ICP-AES). Cluster and R mode factor analysis (FA) was used to identify and characterise element associations and four associations of elements were determined by the method of multivariate statistics. Three factors represent the associations of elements that occur in the river water naturally while Factor 3 represents an anthropogenic association of the elements (Cd, Ga, In, Pb, Re, Tl, Cu, and Zn) introduced in the river waters from the waste waters from the mining and metallurgical activities in the country. PMID:24587756
Immunosequencing now readily generates 10^3 to 10^5 sequences per sample; however, statistical analysis of these repertoires is challenging because of the high genetic diversity of BCRs and the elaborate clonal relationships among them. To date, most immunosequencing analyses have focused on reporting qualitative … repertoire differences, (2) identifying how two repertoires differ, and (3) determining appropriate confidence intervals for assessing the size of the differences and their potential biological relevance.
Multifactorial modelling of high-temperature treatment of timber in the saturated water steam medium
NASA Astrophysics Data System (ADS)
Prosvirnikov, D. B.; Safin, R. G.; Ziatdinova, D. F.; Timerbaev, N. F.; Lashkov, V. A.
2016-04-01
The paper analyses experimental data obtained in studies of high-temperature treatment of softwood and hardwood in a saturated-water-steam environment. Data were processed in the CurveExpert software for the purpose of statistically modelling the processes and phenomena occurring during treatment. The multifactorial modelling yielded empirical dependences that allow the main parameters of this type of hydrothermal treatment to be determined with high accuracy.
Methods for estimating low-flow statistics for Massachusetts streams
Ries, Kernell G.; Friesz, Paul J.
2000-01-01
Methods and computer software are described in this report for determining flow duration, low-flow frequency statistics, and August median flows. These low-flow statistics can be estimated for unregulated streams in Massachusetts using different methods depending on whether the location of interest is at a streamgaging station, a low-flow partial-record station, or an ungaged site where no data are available. Low-flow statistics for streamgaging stations can be estimated using standard U.S. Geological Survey methods described in the report. The MOVE.1 mathematical method and a graphical correlation method can be used to estimate low-flow statistics for low-flow partial-record stations. The MOVE.1 method is recommended when the relation between measured flows at a partial-record station and daily mean flows at a nearby, hydrologically similar streamgaging station is linear, and the graphical method is recommended when the relation is curved. Equations are presented for computing the variance and equivalent years of record for estimates of low-flow statistics for low-flow partial-record stations when either a single or multiple index stations are used to determine the estimates. The drainage-area ratio method or regression equations can be used to estimate low-flow statistics for ungaged sites where no data are available. The drainage-area ratio method is generally as accurate as or more accurate than regression estimates when the drainage-area ratio for an ungaged site is between 0.3 and 1.5 times the drainage area of the index data-collection site. Regression equations were developed to estimate the natural, long-term 99-, 98-, 95-, 90-, 85-, 80-, 75-, 70-, 60-, and 50-percent duration flows; the 7-day, 2-year and the 7-day, 10-year low flows; and the August median flow for ungaged sites in Massachusetts. Streamflow statistics and basin characteristics for 87 to 133 streamgaging stations and low-flow partial-record stations were used to develop the equations. 
The streamgaging stations had from 2 to 81 years of record, with a mean record length of 37 years. The low-flow partial-record stations had from 8 to 36 streamflow measurements, with a median of 14 measurements. All basin characteristics were determined from digital map data. The basin characteristics that were statistically significant in most of the final regression equations were drainage area, the area of stratified-drift deposits per unit of stream length plus 0.1, mean basin slope, and an indicator variable that was 0 in the eastern region and 1 in the western region of Massachusetts. The equations were developed by use of weighted-least-squares regression analyses, with weights assigned proportional to the years of record and inversely proportional to the variances of the streamflow statistics for the stations. Standard errors of prediction ranged from 70.7 to 17.5 percent for the equations to predict the 7-day, 10-year low flow and 50-percent duration flow, respectively. The equations are not applicable for use in the Southeast Coastal region of the State, or where basin characteristics for the selected ungaged site are outside the ranges of those for the stations used in the regression analyses. A World Wide Web application was developed that provides streamflow statistics for data collection stations from a data base and for ungaged sites by measuring the necessary basin characteristics for the site and solving the regression equations. Output provided by the Web application for ungaged sites includes a map of the drainage-basin boundary determined for the site, the measured basin characteristics, the estimated streamflow statistics, and 90-percent prediction intervals for the estimates. An equation is provided for combining regression and correlation estimates to obtain improved estimates of the streamflow statistics for low-flow partial-record stations. 
An equation is also provided for combining regression and drainage-area ratio estimates to obtain improved estimates.
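The MOVE.1 method recommended in this report fits a line that preserves the variance of the estimated series (rather than ordinary least squares), conventionally in log space for streamflow. A minimal sketch with made-up concurrent base-flow measurements follows; the numbers are illustrative, not data from the study:

```python
import numpy as np

def move1(x_concurrent, y_concurrent, x_new):
    """MOVE.1 (Maintenance Of Variance Extension, type 1) estimate of flows
    at a partial-record station (y) from an index station (x), in log space."""
    lx, ly = np.log(x_concurrent), np.log(y_concurrent)
    mx, my = lx.mean(), ly.mean()
    sx, sy = lx.std(ddof=1), ly.std(ddof=1)
    # Slope sy/sx (not the OLS slope) preserves the variance of the estimates
    return np.exp(my + (sy / sx) * (np.log(x_new) - mx))

# Hypothetical concurrent base-flow measurements (cfs)
index = np.array([12.0, 20.0, 35.0, 50.0, 80.0])      # continuous-record station
partial = np.array([3.0, 5.5, 8.0, 13.0, 21.0])       # partial-record station
est = move1(index, partial, np.array([25.0, 60.0]))   # estimates at unmeasured days
```

Because the slope is the ratio of standard deviations rather than a regression slope, extended records keep the spread of the measured data, which matters when the extended series is used to compute low-flow frequency statistics.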
Lee, Yangchool; Jeoung, Bogja
2016-12-01
The purpose of this study was to determine the relationship between the motor skills and the behavior problems of students with intellectual disabilities. The study participants were 117 students with intellectual disabilities who were between 7 and 25 years old (male, n=79; female, n=38) and attending special education schools in South Korea. Motor skill abilities were assessed using the second version of the Bruininks-Oseretsky test of motor proficiency, which includes subtests in fine motor control, manual coordination, body coordination, strength, and agility. Data were analyzed with IBM SPSS 21 using correlation and regression analyses, and the significance level was set at P<0.05. The results showed that fine motor precision and integration had a statistically significant influence on aggressive behavior. Manual dexterity had a statistically significant influence on somatic complaints and anxiety/depression, and bilateral coordination had a statistically significant influence on social problems, attention problems, and aggressive behavior. Balance had a statistically significant influence on social problems and aggressive behavior, as did speed and agility. Upper limb coordination and strength had a statistically significant influence on social problems.
Cundell, A M; Bean, R; Massimore, L; Maier, C
1998-01-01
To determine the relationship between the sampling time of environmental monitoring (i.e., viable counts) in aseptic filling areas and the microbial count and frequency of alerts for air, surface, and personnel microbial monitoring, statistical analyses were conducted on 1) the frequency of alerts versus the time of day for routine environmental sampling conducted in calendar year 1994, and 2) environmental monitoring data collected at 30-minute intervals during routine aseptic filling operations over two separate days in four different clean rooms with multiple shifts and equipment set-ups at a parenteral manufacturing facility. Statistical analyses showed that, except for one floor location that had a significantly higher number of counts (but no alert- or action-level samples) in the first two hours of operation, there was no relationship between the number of counts and the time of sampling. Further studies over a 30-day period at that floor location showed no relationship between time of sampling and microbial counts. The conclusion reached in the study was that there is no worst-case time for environmental monitoring at that facility and that sampling at any time during the aseptic filling operation will give a satisfactory measure of the microbial cleanliness in the clean room during set-up and aseptic filling.
Bayesian approach for counting experiment statistics applied to a neutrino point source analysis
NASA Astrophysics Data System (ADS)
Bose, D.; Brayeur, L.; Casier, M.; de Vries, K. D.; Golup, G.; van Eijndhoven, N.
2013-12-01
In this paper we present a model-independent analysis method following Bayesian statistics to analyse data from a generic counting experiment, and apply it to the search for neutrinos from point sources. We discuss a test statistic defined within a Bayesian framework that will be used in the search for a signal. In case no signal is found, we derive an upper limit without the introduction of approximations. The Bayesian approach allows us to obtain the full probability density function for both the background and the signal rate; as such, we have direct access to any signal upper limit. The upper-limit derivation compares directly with a frequentist approach and is robust in the case of low-counting observations. Furthermore, it also allows previous upper limits obtained by other analyses to be accounted for via the concept of prior information, without the need for the ad hoc application of trial factors. To investigate the validity of the presented Bayesian approach, we have applied this method to the public IceCube 40-string configuration data for 10 nearby blazars and we have obtained a flux upper limit, which is in agreement with the upper limits determined via a frequentist approach. Furthermore, the upper limit obtained compares well with the previously published result of IceCube, using the same data set.
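For the simplest version of a Bayesian counting-experiment limit, the posterior for a Poisson signal rate s over a known background b with a flat prior on s >= 0 can be integrated numerically. This generic textbook sketch is not the paper's full method (which also treats the background rate as unknown), and the observed count and background below are invented:

```python
import numpy as np

def bayesian_upper_limit(n_obs, b, cl=0.90):
    """Credible upper limit on a Poisson signal rate s, with known
    background b and a flat prior on s >= 0 (numerical posterior)."""
    s = np.linspace(0.0, n_obs + 10.0 * np.sqrt(n_obs + b) + 20.0, 20001)
    log_post = n_obs * np.log(s + b) - (s + b)   # log p(n|s) + flat prior
    post = np.exp(log_post - log_post.max())     # unnormalised posterior
    cdf = np.cumsum(post)
    cdf /= cdf[-1]
    return s[np.searchsorted(cdf, cl)]           # smallest s with CDF >= cl

ul = bayesian_upper_limit(n_obs=5, b=3.2)        # hypothetical observation
```

Because the full posterior is available, any credibility level (or a different prior encoding earlier limits) drops in without rederiving the procedure, which is the flexibility the abstract highlights.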
1992-10-01
[Appendix listing, report front matter: Summary Statistics (N=8) and Results of Statistical Analyses for Impact Tests Performed on the Forefoot of Unworn Footwear; Summary Statistics (N=8) and Results of Statistical Analyses for Impact Tests Performed on the Forefoot of Worn Footwear; Summary Statistics (N=4) and Results of Statistical Analyses for Impact Tests.] ...used tests to assess heel and forefoot shock absorption, upper and sole durability, and flexibility (Cavanagh, 1978). Later, the number of tests was
Statistical issues in the design, conduct and analysis of two large safety studies.
Gaffney, Michael
2016-10-01
The emergence, post approval, of serious medical events, which may be associated with the use of a particular drug or class of drugs, is an important public health and regulatory issue. The best method to address this issue is through a large, rigorously designed safety study. Therefore, it is important to elucidate the statistical issues involved in these large safety studies. Two such studies are PRECISION and EAGLES. PRECISION is the primary focus of this article. PRECISION is a non-inferiority design with a clinically relevant non-inferiority margin. Statistical issues in the design, conduct and analysis of PRECISION are discussed. Quantitative and clinical aspects of the selection of the composite primary endpoint, the determination and role of the non-inferiority margin in a large safety study and the intent-to-treat and modified intent-to-treat analyses in a non-inferiority safety study are shown. Protocol changes that were necessary during the conduct of PRECISION are discussed from a statistical perspective. Issues regarding the complex analysis and interpretation of the results of PRECISION are outlined. EAGLES is presented as a large, rigorously designed safety study when a non-inferiority margin was not able to be determined by a strong clinical/scientific method. In general, when a non-inferiority margin is not able to be determined, the width of the 95% confidence interval is a way to size the study and to assess the cost-benefit of relative trial size. A non-inferiority margin, when able to be determined by a strong scientific method, should be included in a large safety study. Although these studies could not be called "pragmatic," they are examples of best real-world designs to address safety and regulatory concerns. © The Author(s) 2016.
2011-01-01
Background: Clinical researchers have often preferred to use a fixed effects model for the primary interpretation of a meta-analysis. Heterogeneity is usually assessed via the well known Q and I2 statistics, along with the random effects estimate they imply. In recent years, alternative methods for quantifying heterogeneity have been proposed, that are based on a 'generalised' Q statistic. Methods: We review 18 IPD meta-analyses of RCTs into treatments for cancer, in order to quantify the amount of heterogeneity present and also to discuss practical methods for explaining heterogeneity. Results: Differing results were obtained when the standard Q and I2 statistics were used to test for the presence of heterogeneity. The two meta-analyses with the largest amount of heterogeneity were investigated further, and on inspection the straightforward application of a random effects model was not deemed appropriate. Compared to the standard Q statistic, the generalised Q statistic provided a more accurate platform for estimating the amount of heterogeneity in the 18 meta-analyses. Conclusions: Explaining heterogeneity via the pre-specification of trial subgroups, graphical diagnostic tools and sensitivity analyses produced a more desirable outcome than an automatic application of the random effects model. Generalised Q statistic methods for quantifying and adjusting for heterogeneity should be incorporated as standard into statistical software. Software is provided to help achieve this aim. PMID:21473747
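The standard Q and I2 statistics this entry starts from are straightforward to compute; a sketch with hypothetical log hazard ratios from five trials (the generalised Q variants the paper proposes modify the weights, which is not shown here):

```python
import numpy as np

def heterogeneity(effects, variances):
    """Cochran's Q and the I^2 statistic for a set of trial effects."""
    effects = np.asarray(effects, float)
    w = 1.0 / np.asarray(variances, float)      # inverse-variance weights
    pooled = (w * effects).sum() / w.sum()      # fixed-effect pooled estimate
    Q = (w * (effects - pooled) ** 2).sum()
    df = len(effects) - 1
    I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0
    return Q, I2

# Hypothetical log hazard ratios and their variances from five cancer trials
Q, I2 = heterogeneity([-0.22, -0.10, -0.35, 0.05, -0.18],
                      [0.01, 0.02, 0.015, 0.012, 0.02])
```

Under homogeneity Q is approximately chi-squared with df degrees of freedom, so values well above df (equivalently, I2 well above 0%) signal heterogeneity worth explaining rather than automatically absorbing into a random-effects model.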
Gaskin, Cadeyrn J; Happell, Brenda
2014-05-01
To (a) assess the statistical power of nursing research to detect small, medium, and large effect sizes; (b) estimate the experiment-wise Type I error rate in these studies; and (c) assess the extent to which (i) a priori power analyses, (ii) effect sizes (and interpretations thereof), and (iii) confidence intervals were reported. Statistical review. Papers published in the 2011 volumes of the 10 highest ranked nursing journals, based on their 5-year impact factors. Papers were assessed for statistical power, control of experiment-wise Type I error, reporting of a priori power analyses, reporting and interpretation of effect sizes, and reporting of confidence intervals. The analyses were based on 333 papers, from which 10,337 inferential statistics were identified. The median power to detect small, medium, and large effect sizes was .40 (interquartile range [IQR]=.24-.71), .98 (IQR=.85-1.00), and 1.00 (IQR=1.00-1.00), respectively. The median experiment-wise Type I error rate was .54 (IQR=.26-.80). A priori power analyses were reported in 28% of papers. Effect sizes were routinely reported for Spearman's rank correlations (100% of papers in which this test was used), Poisson regressions (100%), odds ratios (100%), Kendall's tau correlations (100%), Pearson's correlations (99%), logistic regressions (98%), structural equation modelling/confirmatory factor analyses/path analyses (97%), and linear regressions (83%), but were reported less often for two-proportion z tests (50%), analyses of variance/analyses of covariance/multivariate analyses of variance (18%), t tests (8%), Wilcoxon's tests (8%), Chi-squared tests (8%), and Fisher's exact tests (7%), and not reported for sign tests, Friedman's tests, McNemar's tests, multi-level models, and Kruskal-Wallis tests. Effect sizes were infrequently interpreted. Confidence intervals were reported in 28% of papers. 
The use, reporting, and interpretation of inferential statistics in nursing research need substantial improvement. Most importantly, researchers should abandon the misleading practice of interpreting the results from inferential tests based solely on whether they are statistically significant (or not) and, instead, focus on reporting and interpreting effect sizes, confidence intervals, and significance levels. Nursing researchers also need to conduct and report a priori power analyses, and to address the issue of experiment-wise Type I error inflation in their studies. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
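The experiment-wise Type I error rates reported above follow directly from the number of tests performed. A small sketch of the familywise rate under independence, with an illustrative test count:

```python
def familywise_error(alpha, n_tests):
    """P(at least one false positive) across n independent tests at level alpha."""
    return 1 - (1 - alpha) ** n_tests

def bonferroni_alpha(alpha, n_tests):
    """Per-test alpha that caps the familywise rate at roughly alpha."""
    return alpha / n_tests

# With 15 independent tests at alpha = .05, the familywise rate already
# exceeds .50, in the region of the median .54 reported in the review.
rate = familywise_error(0.05, 15)
```

This is why a paper reporting dozens of unadjusted tests has a high probability of at least one spurious "significant" result.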
NASA Technical Reports Server (NTRS)
Lambert, Winifred C.
2003-01-01
This report describes the results from Phase II of the AMU's Short-Range Statistical Forecasting task for peak winds at the Shuttle Landing Facility (SLF). The peak wind speeds are an important forecast element for the Space Shuttle and Expendable Launch Vehicle programs. The 45th Weather Squadron and the Spaceflight Meteorology Group indicate that peak winds are challenging to forecast. The Applied Meteorology Unit was tasked to develop tools that aid in short-range forecasts of peak winds at tower sites of operational interest. A seven-year record of wind tower data was used in the analysis. Hourly and directional climatologies by tower and month were developed to determine the seasonal behavior of the average and peak winds. Probability density functions (PDFs) of peak wind speed were calculated to determine the distribution of peak speed with average speed. These provide forecasters with a means of determining the probability of meeting or exceeding a certain peak wind speed given an observed or forecast average speed. A PC-based Graphical User Interface (GUI) tool was created to display the data quickly.
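The PDF-based guidance described, conditioning peak speed on the observed average speed, can be illustrated with an empirical conditional-probability sketch. The wind distributions below are synthetic stand-ins, not SLF tower data:

```python
import numpy as np

rng = np.random.default_rng(0)
avg = rng.gamma(4.0, 2.0, size=5000)               # synthetic average speeds (kt)
peak = avg * rng.lognormal(0.35, 0.15, size=5000)  # synthetic gust factors applied to each

def p_peak_exceeds(avg_obs, threshold, width=1.0):
    """P(peak >= threshold | average near avg_obs), estimated from the sample."""
    mask = np.abs(avg - avg_obs) < width           # bin around the observed average
    return np.mean(peak[mask] >= threshold) if mask.any() else float("nan")

# Probability of a 15 kt peak given an observed ~10 kt average
prob = p_peak_exceeds(avg_obs=10.0, threshold=15.0)
```

In operations the same lookup would be driven by the climatological PDFs for the relevant tower, month, and hour rather than a pooled sample.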
NASA Astrophysics Data System (ADS)
Glavanović, Siniša; Glavanović, Marija; Tomišić, Vladislav
2016-03-01
The UV spectrophotometric methods for simultaneous quantitative determination of paracetamol and tramadol in paracetamol-tramadol tablets were developed. The spectrophotometric data obtained were processed by means of partial least squares (PLS) and genetic algorithm coupled with PLS (GA-PLS) methods in order to determine the content of active substances in the tablets. The results gained by chemometric processing of the spectroscopic data were statistically compared with those obtained by means of validated ultra-high performance liquid chromatographic (UHPLC) method. The accuracy and precision of data obtained by the developed chemometric models were verified by analysing the synthetic mixture of drugs, and by calculating recovery as well as relative standard error (RSE). A statistically good agreement was found between the amounts of paracetamol determined using PLS and GA-PLS algorithms, and that obtained by UHPLC analysis, whereas for tramadol GA-PLS results were proven to be more reliable compared to those of PLS. The simplest and the most accurate and precise models were constructed by using the PLS method for paracetamol (mean recovery 99.5%, RSE 0.89%) and the GA-PLS method for tramadol (mean recovery 99.4%, RSE 1.69%).
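PLS calibration of the kind used here can be sketched with a minimal single-response NIPALS implementation. The Gaussian bands, concentration ranges, and noise level below are invented for illustration and do not represent the paracetamol-tramadol spectra:

```python
import numpy as np

def pls1_fit(X, y, n_components=2):
    """Minimal NIPALS PLS1; returns regression coefficients for centred data."""
    Xk = X - X.mean(axis=0)
    yk = y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)              # weight vector
        t = Xk @ w                          # scores
        p = Xk.T @ t / (t @ t)              # X loadings
        q.append((yk @ t) / (t @ t))        # inner regression coefficient
        Xk = Xk - np.outer(t, p)            # deflate X
        W.append(w); P.append(p)
    W, P = np.array(W).T, np.array(P).T
    return W @ np.linalg.solve(P.T @ W, np.array(q))

# Synthetic two-analyte mixture spectra: two overlapping Gaussian bands
rng = np.random.default_rng(5)
wl = np.linspace(0.0, 1.0, 60)
band1 = np.exp(-((wl - 0.3) ** 2) / 0.01)
band2 = np.exp(-((wl - 0.6) ** 2) / 0.02)
c1, c2 = rng.uniform(0.5, 1.5, 40), rng.uniform(0.5, 1.5, 40)
X = np.outer(c1, band1) + np.outer(c2, band2) + rng.normal(0, 0.01, (40, 60))
coef = pls1_fit(X, c1, n_components=2)                 # calibrate for analyte 1
pred = (X - X.mean(axis=0)) @ coef + c1.mean()         # predicted concentrations
```

A GA-PLS variant would wrap a genetic algorithm around this fit to select the wavelength subset minimising cross-validated prediction error.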
Incidence of phlebitis and post-infusion phlebitis in hospitalised adults.
Urbanetto, Janete de Souza; Muniz, Franciele de Oliveira Minuto; Silva, Renata Martins da; Freitas, Ana Paula Christo de; Oliveira, Ana Paula Ribeiro de; Santos, Jessica de Cassia Ramos Dos
2017-06-29
To determine the incidence of phlebitis during and after the use of a peripheral intravenous catheter (PIC), and to analyse the association of this complication with risk factors. A cohort study was conducted with 165 adult patients admitted to a university hospital in Porto Alegre, totalling 447 accesses, from December 2014 to February 2015. Data were collected on a daily basis and analysed by means of descriptive and analytical statistics. The incidence of phlebitis during PIC use was 7.15% and the incidence of post-infusion phlebitis was 22.9%. Phlebitis during catheter use was associated with the use of Amoxicillin + Clavulanic Acid. The grade of post-infusion phlebitis was associated with age and with the use of Amoxicillin + Clavulanic Acid, Tramadol Hydrochloride, and Amphotericin. The incidence of post-infusion phlebitis proved to be an important indicator for analysing the quality of the healthcare setting.
Summary of hydrologic conditions in Kansas, water year 2014
Robison, Andrew L.
2015-01-01
The U.S. Geological Survey Kansas Water Science Center, in cooperation with Federal, State, and local agencies, maintains a long-term network of hydrologic monitoring gages in the State of Kansas. These include 206 real-time streamgages, 12 real-time reservoir-level monitoring stations, and 32 groundwater monitoring wells. These data and associated analyses, accumulated over time, provide a unique overview of hydrologic conditions and help improve our understanding of Kansas’s water resources. Yearly hydrologic conditions are determined by comparing statistical analyses of current and historical water year data for the period of record. These data are used in protecting life and property, and managing water resources for agricultural, industrial, public supply, ecological, and recreational purposes.
Van Bockstaele, Femke; Janssens, Ann; Piette, Anne; Callewaert, Filip; Pede, Valerie; Offner, Fritz; Verhasselt, Bruno; Philippé, Jan
2006-07-15
ZAP-70 has been proposed as a surrogate marker for immunoglobulin heavy-chain variable region (IgV(H)) mutation status, which is known as a prognostic marker in B-cell chronic lymphocytic leukemia (CLL). The flow cytometric analysis of ZAP-70 suffers from difficulties in standardization and interpretation. We applied the Kolmogorov-Smirnov (KS) statistical test to make analysis more straightforward. We examined ZAP-70 expression by flow cytometry in 53 patients with CLL. Analysis was performed as initially described by Crespo et al. (New England J Med 2003; 348:1764-1775) and alternatively by application of the KS statistical test comparing T cells with B cells. Receiver-operating-characteristics (ROC)-curve analyses were performed to determine the optimal cut-off values for ZAP-70 measured by the two approaches. ZAP-70 protein expression was compared with ZAP-70 mRNA expression measured by a quantitative PCR (qPCR) and with the IgV(H) mutation status. Both flow cytometric analyses correlated well with the molecular technique and proved to be of equal value in predicting the IgV(H) mutation status. Applying the KS test is reproducible, simple, straightforward, and overcomes a number of difficulties encountered in the Crespo-method. The KS statistical test is an essential part of the software delivered with modern routine analytical flow cytometers and is well suited for analysis of ZAP-70 expression in CLL. (c) 2006 International Society for Analytical Cytology.
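The two-sample Kolmogorov-Smirnov comparison of B-cell and T-cell fluorescence distributions can be sketched with SciPy's `ks_2samp`; the intensity samples below are simulated, not patient data:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
# Simulated fluorescence intensities (arbitrary units); a real analysis would
# use gated B-cell and T-cell events exported from the cytometer
t_cells = rng.lognormal(mean=2.0, sigma=0.4, size=500)   # ZAP-70 bright
b_cells = rng.lognormal(mean=1.2, sigma=0.4, size=500)   # ZAP-70 dim

# D is the maximum distance between the two empirical CDFs: a small D
# (B cells resembling T cells) would indicate ZAP-70-positive CLL
stat, p = ks_2samp(b_cells, t_cells)
```

Because D depends only on the empirical CDFs, it avoids the cursor-placement and isotype-control issues of percent-positive gating.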
Redmond, Tony; O'Leary, Neil; Hutchison, Donna M; Nicolela, Marcelo T; Artes, Paul H; Chauhan, Balwantray C
2013-12-01
A new analysis method called permutation of pointwise linear regression measures the significance of deterioration over time at each visual field location, combines the significance values into an overall statistic, and then determines the likelihood of change in the visual field. Because the outcome is a single P value, individualized to that specific visual field and independent of the scale of the original measurement, the method is well suited for comparing techniques with different stimuli and scales. To test the hypothesis that frequency-doubling matrix perimetry (FDT2) is more sensitive than standard automated perimetry (SAP) in identifying visual field progression in glaucoma. Patients with open-angle glaucoma and healthy controls were examined by FDT2 and SAP, both with the 24-2 test pattern, on the same day at 6-month intervals in a longitudinal prospective study conducted in a hospital-based setting. Only participants with at least 5 examinations were included. Data were analyzed with permutation of pointwise linear regression. Permutation of pointwise linear regression is individualized to each participant, in contrast to current analyses in which the statistical significance is inferred from population-based approaches. Analyses were performed with both total deviation and pattern deviation. Sixty-four patients and 36 controls were included in the study. The median age, SAP mean deviation, and follow-up period were 65 years, -2.6 dB, and 5.4 years, respectively, in patients and 62 years, +0.4 dB, and 5.2 years, respectively, in controls. Using total deviation analyses, statistically significant deterioration was identified in 17% of patients with FDT2, in 34% of patients with SAP, and in 14% of patients with both techniques; in controls these percentages were 8% with FDT2, 31% with SAP, and 8% with both. 
Using pattern deviation analyses, statistically significant deterioration was identified in 16% of patients with FDT2, in 17% of patients with SAP, and in 3% of patients with both techniques; in controls these values were 3% with FDT2 and none with SAP. No evidence was found that FDT2 is more sensitive than SAP in identifying visual field deterioration. In about one-third of healthy controls, age-related deterioration with SAP reached statistical significance.
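The permutation idea behind pointwise linear regression analysis can be sketched as follows. This is a deliberately simplified illustration (summing per-location OLS slopes rather than combining pointwise significance values as the published method does), and the visual-field series are synthetic:

```python
import numpy as np

def permutation_trend_p(times, series, n_perm=1000, seed=0):
    """Overall one-sided P for deterioration: per-location OLS slopes are
    summed, then the visit order is permuted to build a null distribution."""
    rng = np.random.default_rng(seed)
    series = np.asarray(series, float)          # shape (locations, visits)

    def combined(t):
        tc = np.asarray(t, float) - np.mean(t)  # centred times
        return (series @ tc / np.sum(tc ** 2)).sum()  # sum of OLS slopes

    observed = combined(times)
    worse = sum(combined(rng.permutation(times)) <= observed
                for _ in range(n_perm))
    return (1 + worse) / (1 + n_perm)           # permutation P value

# Synthetic field: 54 locations, 10 six-monthly visits, slow uniform decline
rng = np.random.default_rng(42)
t = np.arange(10) * 0.5                         # years
y = 30.0 - 0.8 * t + rng.normal(0.0, 1.0, (54, 10))
p_value = permutation_trend_p(t, y)
```

The resulting P value is individualized to the series at hand, which is what makes the approach comparable across instruments with different stimuli and scales.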
Bradley, Michael T; Brand, Andrew
2016-10-01
Accurate measurement and a cutoff probability with inferential statistics are not wholly compatible. Fisher understood this when he developed the F test to deal with measurement variability and to make judgments on manipulations that may be worth further study. Neyman and Pearson focused on modeled distributions whose parameters were highly determined and concluded that inferential judgments following an F test could be made with accuracy because the distribution parameters were determined. Neyman and Pearson's approach to the application of statistical analyses using alpha and beta error rates has played a dominant role in guiding inferential judgments, appropriately in highly determined situations and inappropriately in scientific exploration. Fisher tried to explain the different situations but, in part due to some obscure wording, generated a long-standing dispute that has left the importance of Fisher's p < .05 criterion not fully understood and led to a general endorsement of the Neyman and Pearson error rate approach. Problems were compounded when power calculations based on effect sizes following significant results entered into exploratory science. To understand in a practical sense when each approach should be used, a dimension reflecting varying levels of certainty or knowledge of population distributions is presented. The dimension provides a taxonomy of statistical situations and appropriate approaches by delineating four zones that represent how well the underlying population of interest is defined, ranging from exploratory situations to highly determined populations. © The Author(s) 2016.
Chasing the peak: optimal statistics for weak shear analyses
NASA Astrophysics Data System (ADS)
Smit, Merijn; Kuijken, Konrad
2018-01-01
Context. Weak gravitational lensing analyses are fundamentally limited by the intrinsic distribution of galaxy shapes. It is well known that this distribution of galaxy ellipticity is non-Gaussian, and the traditional estimation methods, explicitly or implicitly assuming Gaussianity, are not necessarily optimal. Aims: We aim to explore alternative statistics for samples of ellipticity measurements. An optimal estimator needs to be asymptotically unbiased, efficient, and robust in retaining these properties for various possible sample distributions. We take the non-linear mapping of gravitational shear and the effect of noise into account. We then discuss how the distribution of individual galaxy shapes in the observed field of view can be modeled by fitting Fourier modes to the shear pattern directly. This allows scientific analyses using statistical information of the whole field of view, instead of locally sparse and poorly constrained estimates. Methods: We simulated samples of galaxy ellipticities, using both theoretical distributions and data for ellipticities and noise. We determined the possible bias Δe, the efficiency η and the robustness of the least absolute deviations, the biweight, and the convex hull peeling (CHP) estimators, compared to the canonical weighted mean. Using these statistics for regression, we have shown the applicability of direct Fourier mode fitting. Results: We find an improved performance of all estimators, when iteratively reducing the residuals after de-shearing the ellipticity samples by the estimated shear, which removes the asymmetry in the ellipticity distributions. We show that these estimators are then unbiased in the absence of noise, and decrease noise bias by more than 30%. Our results show that the CHP estimator distribution is skewed, but still centered around the underlying shear, and its bias least affected by noise. 
We find the least absolute deviations estimator to be the most efficient estimator in almost all cases, except in the Gaussian case, where it is still competitive (0.83 < η < 5.1) and therefore robust. These results hold when fitting Fourier modes, where amplitudes of variation in ellipticity are determined to the order of 10⁻³. Conclusions: The peak of the ellipticity distribution is a direct tracer of the underlying shear and unaffected by noise, and we have shown that estimators that are sensitive to a central cusp perform more efficiently, potentially reducing uncertainties further and significantly decreasing noise bias. These results become increasingly important as survey sizes increase and systematic issues in shape measurements decrease.
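Robust location estimators of the kind compared above can be sketched on a heavy-tailed sample: in one dimension the least-absolute-deviations estimate is the median, and the biweight downweights the tails. The distribution and the "true shear" value below are invented:

```python
import numpy as np

def biweight_location(x, c=6.0, n_iter=10):
    """Tukey's biweight location: robust to outliers, efficient near the centre."""
    x = np.asarray(x, float)
    m = np.median(x)                                   # robust starting point
    for _ in range(n_iter):
        mad = max(np.median(np.abs(x - m)), 1e-12)     # robust scale
        u = (x - m) / (c * mad)
        w = np.where(np.abs(u) < 1.0, (1.0 - u ** 2) ** 2, 0.0)  # redescending weights
        m = np.sum(w * x) / np.sum(w)
    return m

# Heavy-tailed sample centred on a hypothetical 'true shear' of 0.03
rng = np.random.default_rng(7)
sample = 0.03 + 0.2 * rng.standard_t(df=3, size=4000)
estimates = {
    "mean": sample.mean(),           # optimal only under Gaussianity
    "median": np.median(sample),     # the 1-D least-absolute-deviations estimate
    "biweight": biweight_location(sample),
}
```

On distributions with a sharp central cusp, estimators weighted toward the centre track the underlying location with smaller variance than the mean.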
Coronado-Montoya, Stephanie; Levis, Alexander W; Kwakkenbos, Linda; Steele, Russell J; Turner, Erick H; Thombs, Brett D
2016-01-01
A large proportion of mindfulness-based therapy trials report statistically significant results, even in the context of very low statistical power. The objective of the present study was to characterize the reporting of "positive" results in randomized controlled trials of mindfulness-based therapy. We also assessed mindfulness-based therapy trial registrations for indications of possible reporting bias and reviewed recent systematic reviews and meta-analyses to determine whether reporting biases were identified. CINAHL, Cochrane CENTRAL, EMBASE, ISI, MEDLINE, PsycInfo, and SCOPUS databases were searched for randomized controlled trials of mindfulness-based therapy. The number of positive trials was described and compared to the number that might be expected if mindfulness-based therapy were similarly effective compared to individual therapy for depression. Trial registries were searched for mindfulness-based therapy registrations. CINAHL, Cochrane CENTRAL, EMBASE, ISI, MEDLINE, PsycInfo, and SCOPUS were also searched for mindfulness-based therapy systematic reviews and meta-analyses. 108 (87%) of 124 published trials reported ≥1 positive outcome in the abstract, and 109 (88%) concluded that mindfulness-based therapy was effective; this is 1.6 times the number of positive trials expected under an effect size of d = 0.55 (expected number of positive trials = 65.7). Of 21 trial registrations, 13 (62%) remained unpublished 30 months post-trial completion. No trial registrations adequately specified a single primary outcome measure with time of assessment. None of 36 systematic reviews and meta-analyses concluded that effect estimates were overestimated due to reporting biases. The proportion of mindfulness-based therapy trials with statistically significant results may overstate what would occur in practice.
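The comparison of observed and expected positive-trial counts can be sketched with a normal-approximation power calculation. The per-arm sample size below is hypothetical, so the expected count it yields differs from the 65.7 derived from the actual trial sizes:

```python
import math
from scipy.stats import norm

def approx_power(d, n_per_group, alpha=0.05):
    """Normal-approximation power of a two-sided two-sample comparison."""
    ncp = d * math.sqrt(n_per_group / 2.0)       # noncentrality parameter
    z_crit = norm.ppf(1 - alpha / 2)
    return norm.sf(z_crit - ncp) + norm.cdf(-z_crit - ncp)

# Expected 'positive' trials among 124, if each (hypothetically) had
# 30 participants per arm and the true effect were d = 0.55
power = approx_power(0.55, 30)
expected_positive = 124 * power
```

If the observed count of positive trials substantially exceeds this expectation, selective reporting is one plausible explanation.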
Barry, Samantha J; Pham, Tran N; Borman, Phil J; Edwards, Andrew J; Watson, Simon A
2012-01-27
The DMAIC (Define, Measure, Analyse, Improve and Control) framework and associated statistical tools have been applied to both identify and reduce variability observed in a quantitative ¹⁹F solid-state NMR (SSNMR) analytical method. The method had been developed to quantify levels of an additional polymorph (Form 3) in batches of an active pharmaceutical ingredient (API), where Form 1 is the predominant polymorph. In order to validate analyses of the polymorphic form, a single batch of API was used as a standard each time the method was used. The level of Form 3 in this standard was observed to gradually increase over time, the effect not being immediately apparent due to method variability. In order to determine the cause of this unexpected increase and to reduce method variability, a risk-based statistical investigation was performed to identify potential factors which could be responsible for these effects. Factors identified by the risk assessment were investigated using a series of designed experiments to gain a greater understanding of the method. The increase in the level of Form 3 in the standard was primarily found to correlate with the number of repeat analyses, an effect not previously reported in the SSNMR literature. Differences in data processing (phasing and linewidth) were found to be responsible for the variability in the method. After implementing corrective actions, the variability was reduced such that the level of Form 3 was within an acceptable range of ±1% w/w in fresh samples of API. Copyright © 2011. Published by Elsevier B.V.
2015-01-01
Changes in glycosylation have been shown to have a profound correlation with development/malignancy in many cancer types. Currently, two major enrichment techniques have been widely applied in glycoproteomics, namely, lectin affinity chromatography (LAC)-based and hydrazide chemistry (HC)-based enrichments. Here we report the LC–MS/MS quantitative analyses of human blood serum glycoproteins and glycopeptides associated with esophageal diseases by LAC- and HC-based enrichment. The separate and complementary qualitative and quantitative data analyses of protein glycosylation were performed using both enrichment techniques. Chemometric and statistical evaluations (PCA plots and ANOVA tests, respectively) were employed to determine and confirm candidate cancer-associated glycoprotein/glycopeptide biomarkers. Of 139 glycoproteins, 59 (a 42% overlap) were observed in both enrichment techniques. This overlap is very similar to previously published studies. The quantitation and evaluation of significantly changed glycoproteins/glycopeptides are complementary between LAC and HC enrichments. LC–ESI–MS/MS analyses indicated that 7 glycoproteins enriched by LAC and 11 glycoproteins enriched by HC showed significantly different abundances between disease-free and disease cohorts. Multiple reaction monitoring quantitation resulted in 13 glycopeptides by LAC enrichment and 10 glycosylation sites by HC enrichment being statistically different among disease cohorts. PMID:25134008
Khan, Asaduzzaman; Chien, Chi-Wen; Bagraith, Karl S
2015-04-01
To investigate whether using a parametric statistic in comparing groups leads to different conclusions when using summative scores from rating scales compared with using their corresponding Rasch-based measures. A Monte Carlo simulation study was designed to examine between-group differences in the change scores derived from summative scores from rating scales, and those derived from their corresponding Rasch-based measures, using 1-way analysis of variance. The degree of inconsistency between the 2 scoring approaches (i.e. summative and Rasch-based) was examined, using varying sample sizes, scale difficulties and person ability conditions. This simulation study revealed scaling artefacts that could arise from using summative scores rather than Rasch-based measures for determining the changes between groups. The group differences in the change scores were statistically significant for summative scores under all test conditions and sample size scenarios. However, none of the group differences in the change scores were significant when using the corresponding Rasch-based measures. This study raises questions about the validity of the inference on group differences of summative score changes in parametric analyses. Moreover, it provides a rationale for the use of Rasch-based measures, which can allow valid parametric analyses of rating scale data.
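The simulation machinery for comparing groups on summative scores can be sketched as one replicate of such a Monte Carlo study. The ordinal scale, category thresholds, and group shifts below are invented, and the Rasch-based arm of the comparison is not reproduced:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(3)

def simulate_sum_scores(n, shift, n_items=10, n_cats=5):
    """Summative scores: sum of ordinal item responses (0..n_cats-1)."""
    latent = rng.normal(shift, 1.0, size=(n, n_items))       # latent trait + item noise
    # Equal-width thresholds squash the latent values into categories
    items = np.clip(np.floor(latent + 2.5), 0, n_cats - 1)
    return items.sum(axis=1)

# Three groups whose latent means differ; 1-way ANOVA on the sum scores
groups = [simulate_sum_scores(50, shift) for shift in (0.0, 0.3, 0.6)]
f_stat, p = f_oneway(*groups)
```

A full study repeats this over many replicates and conditions, and runs the parallel analysis on Rasch-based measures estimated from the same item responses, counting how often the two approaches disagree.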
Hussain, Bilal; Sultana, Tayyaba; Sultana, Salma; Al-Ghanim, Khalid Abdullah; Masoud, Muhammad Shahreef; Mahboob, Shahid
2018-04-01
Cirrhinus mrigala, Labeo rohita, and Catla catla are economically important fish for human consumption in Pakistan, but industrial and sewage pollution has drastically reduced their population in the River Chenab. Statistics are an important tool to analyze and interpret comet assay results. The specific aims of the study were to determine the DNA damage in Cirrhinus mrigala, Labeo rohita, and Catla catla due to chemical pollution and to assess the validity of statistical analyses to determine the viability of the comet assay for a possible use with these freshwater fish species as a good indicator of pollution load and habitat degradation. Comet assay results indicated a significant (P < 0.05) degree of DNA fragmentation in Cirrhinus mrigala followed by Labeo rohita and Catla catla in respect to comet head diameter, comet tail length, and % DNA damage. Regression analysis and correlation matrices conducted among the parameters of the comet assay affirmed the precision and the legitimacy of the results. The present study, therefore, strongly recommends that genotoxicological studies conduct appropriate analysis of the various components of comet assays to offer better interpretation of the assay data.
Maric, Marija; de Haan, Else; Hogendoorn, Sanne M; Wolters, Lidewij H; Huizenga, Hilde M
2015-03-01
Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a data-analytic method to analyze univariate (i.e., one symptom) single-case data using the common package SPSS. This method can help the clinical researcher to investigate whether an intervention works as compared with a baseline period or another intervention type, and to determine whether symptom improvement is clinically significant. First, we describe the statistical method in a conceptual way and show how it can be implemented in SPSS. Simulation studies were performed to determine the number of observation points required per intervention phase. Second, to illustrate this method and its implications, we present a case study of an adolescent with anxiety disorders treated with cognitive-behavioral therapy techniques in an outpatient psychotherapy clinic, whose symptoms were regularly assessed before each session. We provide a description of the data analyses and results of this case study. Finally, we discuss the advantages and shortcomings of the proposed method. Copyright © 2014. Published by Elsevier Ltd.
Across the Great Divide: The Effects of Technology in Secondary Biology Classrooms
NASA Astrophysics Data System (ADS)
Worley, Johnny Howard, II
This study investigates the relationship between technology use and student achievement in public high schools across North Carolina. The purpose of this study was to determine whether a digital divide (differences in technology utilization based on student demographics of race/ethnicity, gender, socioeconomic status, and municipality) exists among schools and whether those differences relate to student achievement in high school biology classrooms. The study uses North Carolina end-of-course (EOC) data for biology to analyze student demographic data and assessment results from the 2010-2011 school year from the North Carolina Department of Public Instruction. The data analyses use descriptive and factorial univariate statistics to determine the existence of digital divides and their effects on biology achievement. Analysis of these data described patterns of technology use to determine whether potential variances resulted in a digital divide. Specific technology uses were identified in the data and then their impact on biology achievement scores within various demographic groups was examined. Research findings revealed statistically significant variations of use within different population groups. Despite being statistically significant, the relevance of the association in the variations was minimal at best, based on the effect scale established by Cohen (1988). Additional factorial univariate analyses were employed to determine potential relationships between technology use and student achievement. The data revealed that technology use did not influence the variation of student achievement scale scores as much as race/ethnicity and socioeconomic status. White students outperformed Hispanic students by an average of three scale score points and Black students by an average of six scale score points.
Technology use alone averaged less than a one point difference in mean scale scores, and only when interacting with race, gender, and/or SES did the mean difference increase. However, this increase within the context of the biology scale score range was negligible. This study contributes to the existing body of research on the effects of technology use on student achievement and its influence within various student demographic groups and municipalities. The study also provides additional research information for effective technology utilization, implementation, and instruction in educational environments.
40 CFR 91.512 - Request for public hearing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis for... will be made available to the public during Agency business hours. ...
Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia
2010-05-25
High-quality clinical research requires not only advanced professional knowledge, but also sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China during the first decade of the new millennium. Ten leading Chinese medical journals were selected and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportions in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008, the error/defect proportion in statistical analyses decreased significantly (χ² = 12.03, p < 0.001), from 59.8% (545/1,335) in 1998 to 52.2% (664/1,578) in 2008. The overall error/defect proportion in study design also decreased (χ² = 21.22, p < 0.001), from 50.9% (680/1,335) to 42.4% (669/1,578). In 2008, the proportion of randomized clinical trials remained in the single digits (3.8%, 60/1,578), with two-thirds showing poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature, 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Decreases in defect proportions were also observed in results presentation (χ² = 93.26, p < 0.001), from 92.7% (945/1,019) to 78.2% (1,023/1,309), and in interpretation (χ² = 27.26, p < 0.001), from 9.7% (99/1,019) to 4.3% (56/1,309), although some serious defects persisted.
Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative.
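Year-to-year comparisons of defect proportions like those above are two-proportion chi-squared tests. A sketch with illustrative round counts chosen near the reported percentages, not the paper's exact table:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: papers with vs without statistical errors, by year
table = np.array([[800, 535],     # 1998: ~59.9% with errors (n = 1,335)
                  [820, 758]])    # 2008: ~52.0% with errors (n = 1,578)
chi2, p, dof, expected = chi2_contingency(table)
```

With samples this large, even a difference of a few percentage points between years is detectable at conventional significance levels.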
ERIC Educational Resources Information Center
Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.
2010-01-01
This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…
TU-FG-201-05: Varian MPC as a Statistical Process Control Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carver, A; Rowbottom, C
Purpose: Quality assurance in radiotherapy requires the measurement of various machine parameters to ensure they remain within permitted values over time. In Truebeam release 2.0 the Machine Performance Check (MPC) was released, allowing beam output and machine axis movements to be assessed in a single test. We aim to evaluate the Varian Machine Performance Check (MPC) as a tool for Statistical Process Control (SPC). Methods: Varian's MPC tool was used on three Truebeam and one EDGE linac for a period of approximately one year. MPC was commissioned against independent systems. After this period the data were reviewed to determine whether or not the MPC was useful as a process control tool. Individual tests were analysed using Shewhart control charts, with Matlab used for the analysis. Principal component analysis was used to determine if a multivariate model was of any benefit in analysing the data. Results: Control charts were found to be useful to detect beam output changes, worn T-nuts and jaw calibration issues. Upper and lower control limits were defined at the 95% level. Multivariate SPC was performed using principal component analysis. We found little evidence of clustering beyond that which might be naively expected, such as beam uniformity and beam output. Whilst this makes multivariate analysis of little use, it suggests that each test is giving independent information. Conclusion: The variety of independent parameters tested in MPC makes it a sensitive tool for routine machine QA. We have determined that using control charts in our QA programme would rapidly detect changes in machine performance. The use of control charts allows large quantities of tests to be performed on all linacs without visual inspection of all results. The use of control limits alerts users when data are inconsistent with previous measurements, before they become out of specification. A. Carver has received a speaker's honorarium from Varian.
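A Shewhart chart with 95% control limits of the kind described in the Results can be sketched as follows; the beam-output readings are invented:

```python
import numpy as np

def shewhart_limits(baseline, z=1.96):
    """Centre line and 95% control limits from a baseline run of a QA metric."""
    mu, sigma = np.mean(baseline), np.std(baseline, ddof=1)
    return mu, mu - z * sigma, mu + z * sigma

def out_of_control(values, baseline, z=1.96):
    """Flag points that fall outside the control limits."""
    _, lcl, ucl = shewhart_limits(baseline, z)
    values = np.asarray(values, float)
    return (values < lcl) | (values > ucl)

# Hypothetical daily beam-output readings (% of nominal)
baseline = [100.1, 99.8, 100.0, 99.9, 100.2, 100.0, 99.9, 100.1]
new_days = [100.0, 100.1, 101.5]   # the last point: a sudden output drift
flags = out_of_control(new_days, baseline)
```

The same pattern applies to each MPC parameter independently, which is what lets large numbers of tests run daily without visual inspection of every trace.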
Henríquez-Hernández, Luis Alberto; Valenciano, Almudena; Foro-Arnalot, Palmira; Álvarez-Cubero, María Jesús; Cozar, José Manuel; Suárez-Novo, José Francisco; Castells-Esteve, Manel; Fernández-Gonzalo, Pablo; De-Paula-Carranza, Belén; Ferrer, Montse; Guedea, Ferrán; Sancho-Pardo, Gemma; Craven-Bartle, Jordi; Ortiz-Gordillo, María José; Cabrera-Roldán, Patricia; Rodríguez-Melcón, Juan Ignacio; Herrera-Ramos, Estefanía; Rodríguez-Gallego, Carlos; Lara, Pedro C
2015-07-01
Prostate cancer (PCa) is an androgen-dependent disease. Nonetheless, the role of single nucleotide polymorphisms (SNPs) in genes encoding androgen metabolism remains an unexplored area. To investigate the role of germline variations in cytochrome P450 17A1 (CYP17A1) and steroid-5α-reductase, α-polypeptides 1 and 2 (SRD5A1 and SRD5A2) genes in PCa. In total, 494 consecutive Spanish patients diagnosed with nonmetastatic localized PCa were included in this multicenter study and were genotyped for 32 SNPs in SRD5A1, SRD5A2, and CYP17A1 genes using a Biotrove OpenArray NT Cycler. Clinical data were available. Genotypic and allelic frequencies, as well as haplotype analyses, were determined using the web-based environment SNPator. All additional statistical analyses comparing clinical data and SNPs were performed using PASW Statistics 15. The call rate obtained (determined as the percentage of successful determinations) was 97.3% of detection. A total of 2 SNPs in SRD5A1-rs3822430 and rs1691053-were associated with prostate-specific antigen level at diagnosis. Moreover, G carriers for both SNPs were at higher risk of presenting initial prostate-specific antigen levels>20ng/ml (Exp(B) = 2.812, 95% CI: 1.397-5.657, P = 0.004) than those who are AA-AA carriers. Haplotype analyses showed that patients with PCa nonhomozygous for the haplotype GCTTGTAGTA were at an elevated risk of presenting bigger clinical tumor size (Exp(B) = 3.823, 95% CI: 1.280-11.416, P = 0.016), and higher Gleason score (Exp(B) = 2.808, 95% CI: 1.134-6.953, P = 0.026). SNPs in SRD5A1 seem to affect the clinical characteristics of Spanish patients with PCa. Copyright © 2015 Elsevier Inc. All rights reserved.
Flynn, Kevin; Swintek, Joe; Johnson, Rodney
2017-02-01
Because of various Congressional mandates to protect the environment from endocrine disrupting chemicals (EDCs), the United States Environmental Protection Agency (USEPA) initiated the Endocrine Disruptor Screening Program. In the context of this framework, the Office of Research and Development within the USEPA developed the Medaka Extended One Generation Reproduction Test (MEOGRT) to characterize the endocrine action of a suspected EDC. One important endpoint of the MEOGRT is fecundity of medaka breeding pairs. Power analyses were conducted to determine the number of replicates needed in proposed test designs and to determine the effects that varying reproductive parameters (e.g. mean fecundity, variance, and days with no egg production) would have on the statistical power of the test. The MEOGRT Reproduction Power Analysis Tool (MRPAT) is a software tool developed to expedite these power analyses by both calculating estimates of the needed reproductive parameters (e.g. population mean and variance) and performing the power analysis under user-specified scenarios. Example scenarios are detailed that highlight the importance of the reproductive parameters on statistical power. When control fecundity is increased from 21 to 38 eggs per pair per day and the variance decreased from 49 to 20, the gain in power is equivalent to increasing replication by 2.5 times. On the other hand, if 10% of the breeding pairs, including controls, do not spawn, the power to detect a 40% decrease in fecundity drops to 0.54 from nearly 0.98 when all pairs have some level of egg production. Perhaps most importantly, MRPAT was used to inform the decision-making process that led to the final recommendation of the MEOGRT to have 24 control breeding pairs and 12 breeding pairs in each exposure group. Published by Elsevier Inc.
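The replicate-count tradeoffs described above can be illustrated with a simulation-based power analysis. This is not MRPAT itself; it is a minimal sketch assuming per-pair daily fecundity is approximately normal, with means and variances borrowed from the abstract's example (the function name and test choice are illustrative assumptions):

```python
import numpy as np
from scipy import stats

def power_sim(n_pairs, mean_c, var_c, effect, n_sims=2000, alpha=0.05, seed=1):
    """Estimate power to detect a fractional decrease in mean fecundity
    via a two-sample t-test on per-pair mean eggs/day (normal approximation)."""
    rng = np.random.default_rng(seed)
    sd = np.sqrt(var_c)
    hits = 0
    for _ in range(n_sims):
        control = rng.normal(mean_c, sd, n_pairs)
        treated = rng.normal(mean_c * (1 - effect), sd, n_pairs)
        if stats.ttest_ind(control, treated).pvalue < alpha:
            hits += 1
    return hits / n_sims

# Higher control mean and lower variance both raise power, as in the abstract.
print(power_sim(12, 38, 20, 0.40))
```

With 12 pairs per group, a mean of 38 eggs/day, and variance 20, a 40% decrease is detected almost every time; shrinking the groups or inflating the variance drives the power down, mirroring the scenarios the abstract describes.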
Algorithm for Identifying Erroneous Rain-Gauge Readings
NASA Technical Reports Server (NTRS)
Rickman, Doug
2005-01-01
An algorithm analyzes rain-gauge data to identify statistical outliers that could be deemed to be erroneous readings. Heretofore, analyses of this type have been performed in burdensome manual procedures that have involved subjective judgements. Sometimes, the analyses have included computational assistance for detecting values falling outside of arbitrary limits. The analyses have been performed without statistically valid knowledge of the spatial and temporal variations of precipitation within rain events. In contrast, the present algorithm makes it possible to automate such an analysis, makes the analysis objective, takes account of the spatial distribution of rain gauges in conjunction with the statistical nature of spatial variations in rainfall readings, and minimizes the use of arbitrary criteria. The algorithm implements an iterative process that involves nonparametric statistics.
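A minimal sketch of the kind of iterative, nonparametric screening the abstract describes, assuming a modified z-score based on the median and MAD of the gauge readings (the actual algorithm also accounts for the spatial distribution of gauges, which is omitted here):

```python
import numpy as np

def flag_outliers(readings, z_thresh=3.5):
    """Flag gauge readings far from the robust consensus of the network,
    using the median and MAD (nonparametric, resistant to the very
    outliers being screened). Iterates until no new flags appear."""
    readings = np.asarray(readings, dtype=float)
    keep = np.ones(readings.size, dtype=bool)
    while True:
        med = np.median(readings[keep])
        mad = np.median(np.abs(readings[keep] - med)) or 1e-9
        score = 0.6745 * np.abs(readings - med) / mad  # modified z-score
        new_keep = score <= z_thresh
        if np.array_equal(new_keep, keep):
            return ~keep
        keep = new_keep

gauges = [4.1, 3.8, 4.4, 3.9, 60.0, 4.2]  # one implausible reading
print(flag_outliers(gauges))  # only the 60.0 reading is flagged
```

Because the median and MAD are recomputed only over readings that survive each pass, a gross outlier cannot inflate the spread statistic that is used to judge it.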
NASA Astrophysics Data System (ADS)
Syahrom, N. S.; Nasrudin, N. S.; Mohamad Yasin, N.; Azlan, N.; Manap, N.
2017-08-01
It has been reported that students are already accumulating substantial debt from higher education fees and that their spending habits are at a critical stage. These conditions can lead youngsters into bankruptcy if they do not improve their saving habits. Many studies have acknowledged that the determinants of saving habit include financial management, parental socialization, peer influence, and self-control. However, there remains a need to investigate the relationships between these determinants in order to help youngsters avoid bankruptcy. The objectives of this study are to investigate the relationships between saving habit determinants and to generate a statistical model based on the determinants of saving habit. Data were collected by questionnaire from undergraduate students of UiTM Negeri Sembilan, Kampus Seremban. Samples were chosen using a stratified probability sampling technique, and a cross-sectional method was used as the research design. The results were then analysed using descriptive analysis, a reliability test, Pearson correlation, and multiple linear regression analysis. The analysis shows a relationship between saving habit and self-control: students with less self-control tend to save less. A statistical model incorporating this relationship was generated. This study is significant in helping students save more by exercising greater self-control.
Citation of previous meta-analyses on the same topic: a clue to perpetuation of incorrect methods?
Li, Tianjing; Dickersin, Kay
2013-06-01
Systematic reviews and meta-analyses serve as a basis for decision-making and clinical practice guidelines and should be carried out using appropriate methodology to avoid incorrect inferences. We describe the characteristics, statistical methods used for meta-analyses, and citation patterns of all 21 glaucoma systematic reviews we identified pertaining to the effectiveness of prostaglandin analog eye drops in treating primary open-angle glaucoma, published between December 2000 and February 2012. We abstracted data, assessed whether appropriate statistical methods were applied in meta-analyses, and examined citation patterns of included reviews. We identified two forms of problematic statistical analyses in 9 of the 21 systematic reviews examined. Only 1 of the 9 reviews that used incorrect statistical methods cited a previously published review that used appropriate methods. Reviews that used incorrect methods were cited 2.6 times more often than reviews that used appropriate statistical methods. We speculate that by emulating the statistical methodology of previous systematic reviews, systematic review authors may have perpetuated incorrect approaches to meta-analysis. The use of incorrect statistical methods, perhaps through emulating methods described in previous research, calls conclusions of systematic reviews into question and may lead to inappropriate patient care. We urge systematic review authors and journal editors to seek the advice of experienced statisticians before undertaking or accepting for publication a systematic review and meta-analysis. The author(s) have no proprietary or commercial interest in any materials discussed in this article. Copyright © 2013 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
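For context, the standard way to pool study estimates in a meta-analysis is inverse-variance weighting. A minimal fixed-effect sketch follows; all numbers are hypothetical and are not drawn from the reviewed trials:

```python
import math

def fixed_effect_meta(estimates, variances):
    """Inverse-variance fixed-effect pooling: each study's weight is the
    reciprocal of its variance, so precise studies count for more."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, estimates)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Three hypothetical mean IOP reductions (mmHg) with their variances.
est, se = fixed_effect_meta([-1.2, -0.8, -1.0], [0.04, 0.09, 0.16])
print(round(est, 3), round(se, 3))
```

A common class of error in published meta-analyses is pooling raw, unweighted means across studies; the weighted estimate above differs from the naive average whenever study precisions differ.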
Statistical analyses of commercial vehicle accident factors. Volume 1 Part 1
DOT National Transportation Integrated Search
1978-02-01
Procedures for conducting statistical analyses of commercial vehicle accidents have been established and initially applied. A file of some 3,000 California Highway Patrol accident reports from two areas of California during a period of about one year...
40 CFR 90.712 - Request for public hearing.
Code of Federal Regulations, 2010 CFR
2010-07-01
... sampling plans and statistical analyses have been properly applied (specifically, whether sampling procedures and statistical analyses specified in this subpart were followed and whether there exists a basis... Clerk and will be made available to the public during Agency business hours. ...
Prevalence, Trend and Determining Factors of Gestational Diabetes in Germany.
Huy, C; Loerbroks, A; Hornemann, A; Röhrig, S; Schneider, S
2012-04-01
Purpose: The true prevalence of gestational diabetes in Germany is unknown. Thus, the study's purposes were to estimate the prevalence of gestational diabetes as well as to describe the temporal prevalence trend and to identify determinants. Material and Methods: We calculated prevalence estimates based on two datasets: the register-based German perinatal statistic (n = 650 232) and the maternal self-reports from the German children and youth health survey (KiGGS; n = 15 429). Differences between prevalence estimates were analysed using χ² and trend tests, and determinants were identified using logistic regression. Results: According to the perinatal statistic, gestational diabetes was present in 3.7 % of pregnant women in Germany in 2010. The prevalence across the years 2001 to 2006 was estimated at 1.9 % which differed significantly from the prevalence estimate derived from the KiGGS dataset for the same period of time (5.3 %; 95 % confidence interval: 4.6-6.1 %). Both datasets show an increasing trend of gestational diabetes (p < 0.001). The risk for gestational diabetes was mainly associated with age, BMI and social class of pregnant women as well as with multiple pregnancies. Conclusion: The lack of significant screening studies among representative samples hampers a sound estimation of the true prevalence of gestational diabetes in Germany. The increasing trend in gestational diabetes might continue due to the projected increase of important risk factors (e.g., maternal age, obesity). Our analyses support the current consensus recommendations regarding standardised gestational diabetes screening.
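The comparison of prevalence estimates between two datasets can be illustrated with a χ² test on case counts. The counts below are hypothetical, scaled only to mirror the abstract's 1.9% vs. 5.3% figures, and do not reproduce the study's actual samples:

```python
from scipy.stats import chi2_contingency

# Hypothetical (cases, non-cases) counts for two data sources.
registry = [1900, 98100]   # ~1.9% prevalence in n = 100,000
survey = [530, 9470]       # ~5.3% prevalence in n = 10,000

# chi2_contingency builds the expected table under independence and
# applies Yates continuity correction for a 2x2 table by default.
chi2, p, dof, _ = chi2_contingency([registry, survey])
print(p < 0.001)  # the two prevalence estimates differ significantly
```

With samples this large, even a difference of a few percentage points yields an extremely small p-value, which is consistent with the significant difference the abstract reports.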
Statistical Analyses of Femur Parameters for Designing Anatomical Plates.
Wang, Lin; He, Kunjin; Chen, Zhengming
2016-01-01
Femur parameters are key prerequisites for scientifically designing anatomical plates. Meanwhile, individual differences in femurs present a challenge to designing well-fitting anatomical plates. Therefore, to design anatomical plates more scientifically, analyses of femur parameters with statistical methods were performed in this study. The specific steps were as follows. First, taking eight anatomical femur parameters as variables, 100 femur samples were classified into three classes with factor analysis and Q-type cluster analysis. Second, based on the mean parameter values of the three classes of femurs, three sizes of average anatomical plates corresponding to the three classes of femurs were designed. Finally, based on Bayes discriminant analysis, a new femur could be assigned to the proper class. Thereafter, the average anatomical plate suitable for that new femur was selected from the three available sizes of plates. Experimental results showed that the classification of femurs was quite reasonable based on the anatomical aspects of the femurs. For instance, three sizes of condylar buttress plates were designed. Meanwhile, 20 new femurs were assigned to their proper classes, and suitable condylar buttress plates were then selected for them.
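The final assignment step can be sketched as a nearest-centroid rule, a simplified stand-in for the Bayes discriminant analysis the authors use (the parameters and centroid values below are hypothetical, not measurements from the study):

```python
import numpy as np

def assign_class(x, centroids):
    """Assign a new parameter vector to the class with the nearest
    centroid (Euclidean distance); a simplified stand-in for a Bayes
    discriminant rule with equal priors and spherical covariance."""
    d = [np.linalg.norm(np.asarray(x) - np.asarray(c)) for c in centroids]
    return int(np.argmin(d))

# Hypothetical class means for (femur length mm, neck-shaft angle deg).
centroids = [(400.0, 125.0), (430.0, 128.0), (460.0, 131.0)]
new_femur = (433.0, 127.0)
print(assign_class(new_femur, centroids))  # class 1: nearest mean size
```

In practice the parameters should be standardized (or a pooled covariance used, as in linear discriminant analysis) so that a large-scale variable such as length does not dominate the distance.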
Practice-based evidence study design for comparative effectiveness research.
Horn, Susan D; Gassaway, Julie
2007-10-01
To describe a new, rigorous, comprehensive practice-based evidence for clinical practice improvement (PBE-CPI) study methodology, and compare its features, advantages, and disadvantages to those of randomized controlled trials and sophisticated statistical methods for comparative effectiveness research. PBE-CPI incorporates natural variation within data from routine clinical practice to determine what works, for whom, when, and at what cost. It uses the knowledge of front-line caregivers, who develop study questions and define variables as part of a transdisciplinary team. Its comprehensive measurement framework provides a basis for analyses of significant bivariate and multivariate associations between treatments and outcomes, controlling for patient differences, such as severity of illness. PBE-CPI studies can uncover better practices more quickly than randomized controlled trials or sophisticated statistical methods, while achieving many of the same advantages. We present examples of actionable findings from PBE-CPI studies in postacute care settings related to comparative effectiveness of medications, nutritional support approaches, incontinence products, physical therapy activities, and other services. Outcomes improved when practices associated with better outcomes in PBE-CPI analyses were adopted in practice.
Sieve analysis in HIV-1 vaccine efficacy trials
Edlefsen, Paul T.; Gilbert, Peter B.; Rolland, Morgane
2013-01-01
Purpose of review: The genetic characterization of HIV-1 breakthrough infections in vaccine and placebo recipients offers new ways to assess vaccine efficacy trials. Statistical and sequence analysis methods provide opportunities to mine the mechanisms behind the effect of an HIV vaccine. Recent findings: The release of results from two HIV-1 vaccine efficacy trials, Step/HVTN-502 and RV144, led to numerous studies in the last five years, including efforts to sequence HIV-1 breakthrough infections and compare viral characteristics between the vaccine and placebo groups. Novel genetic and statistical analysis methods uncovered features that distinguished founder viruses isolated from vaccinees from those isolated from placebo recipients, and identified HIV-1 genetic targets of vaccine-induced immune responses. Summary: Studies of HIV-1 breakthrough infections in vaccine efficacy trials can provide an independent confirmation to correlates of risk studies, as they take advantage of vaccine/placebo comparisons while correlates of risk analyses are limited to vaccine recipients. Through the identification of viral determinants impacted by vaccine-mediated host immune responses, sieve analyses can shed light on potential mechanisms of vaccine protection. PMID:23719202
Sieve analysis in HIV-1 vaccine efficacy trials.
Edlefsen, Paul T; Gilbert, Peter B; Rolland, Morgane
2013-09-01
The genetic characterization of HIV-1 breakthrough infections in vaccine and placebo recipients offers new ways to assess vaccine efficacy trials. Statistical and sequence analysis methods provide opportunities to mine the mechanisms behind the effect of an HIV vaccine. The release of results from two HIV-1 vaccine efficacy trials, Step/HVTN-502 (HIV Vaccine Trials Network-502) and RV144, led to numerous studies in the last 5 years, including efforts to sequence HIV-1 breakthrough infections and compare viral characteristics between the vaccine and placebo groups. Novel genetic and statistical analysis methods uncovered features that distinguished founder viruses isolated from vaccinees from those isolated from placebo recipients, and identified HIV-1 genetic targets of vaccine-induced immune responses. Studies of HIV-1 breakthrough infections in vaccine efficacy trials can provide an independent confirmation to correlates of risk studies, as they take advantage of vaccine/placebo comparisons, whereas correlates of risk analyses are limited to vaccine recipients. Through the identification of viral determinants impacted by vaccine-mediated host immune responses, sieve analyses can shed light on potential mechanisms of vaccine protection.
Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L
2018-02-01
A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R²), using R² as the primary metric of assay agreement. However, the use of R² alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
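A minimal sketch of the Bland-Altman computation the abstract refers to, assuming paired quantitative outputs from two assays (the example values are hypothetical, not the study's validation data):

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman agreement statistics for two paired measurement sets:
    mean difference (bias) and 95% limits of agreement (bias ± 1.96 SD).
    A nonzero bias indicates constant error between the methods."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical variant allele fractions (%) from a reference assay vs. NGS.
ref = [5.1, 10.2, 24.8, 39.5, 50.3]
new = [5.9, 11.0, 25.1, 41.0, 51.2]
bias, lo, hi = bland_altman(ref, new)
print(round(bias, 2))  # negative bias: the new assay reads higher
```

Unlike R² alone, the bias and limits of agreement expose a systematic offset directly; a proportional error would additionally show up as a trend of the differences against the pairwise means.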
NASA Astrophysics Data System (ADS)
Karuppiah, R.; Faldi, A.; Laurenzi, I.; Usadi, A.; Venkatesh, A.
2014-12-01
An increasing number of studies are focused on assessing the environmental footprint of different products and processes, especially using life cycle assessment (LCA). This work shows how combining statistical methods and Geographic Information Systems (GIS) with environmental analyses can help improve the quality of results and their interpretation. Most environmental assessments in literature yield single numbers that characterize the environmental impact of a process/product - typically global or country averages, often unchanging in time. In this work, we show how statistical analysis and GIS can help address these limitations. For example, we demonstrate a method to separately quantify uncertainty and variability in the result of LCA models using a power generation case study. This is important for rigorous comparisons between the impacts of different processes. Another challenge is lack of data that can affect the rigor of LCAs. We have developed an approach to estimate environmental impacts of incompletely characterized processes using predictive statistical models. This method is applied to estimate unreported coal power plant emissions in several world regions. There is also a general lack of spatio-temporal characterization of the results in environmental analyses. For instance, studies that focus on water usage do not put in context where and when water is withdrawn. Through the use of hydrological modeling combined with GIS, we quantify water stress on a regional and seasonal basis to understand water supply and demand risks for multiple users. Another example where it is important to consider regional dependency of impacts is when characterizing how agricultural land occupation affects biodiversity in a region. We developed a data-driven methodology used in conjunction with GIS to determine if there is a statistically significant difference between the impacts of growing different crops on different species in various biomes of the world.
Information processing requirements for on-board monitoring of automatic landing
NASA Technical Reports Server (NTRS)
Sorensen, J. A.; Karmarkar, J. S.
1977-01-01
A systematic procedure is presented for determining the information processing requirements for on-board monitoring of automatic landing systems. The monitoring system detects landing anomalies through use of appropriate statistical tests. The time-to-correct aircraft perturbations is determined from covariance analyses using a sequence of suitable aircraft/autoland/pilot models. The covariance results are used to establish landing safety and a fault recovery operating envelope via an event outcome tree. This procedure is demonstrated with examples using the NASA Terminal Configured Vehicle (B-737 aircraft). The procedure can also be used to define decision height, assess monitoring implementation requirements, and evaluate alternate autoland configurations.
Period variations of Algol-type eclipsing binaries AD And, TWCas and IV Cas
NASA Astrophysics Data System (ADS)
Parimucha, Štefan; Gajdoš, Pavol; Kudak, Viktor; Fedurco, Miroslav; Vaňko, Martin
2018-04-01
We present new analyses of variations in O – C diagrams of three Algol-type eclipsing binary stars: AD And, TW Cas and IV Cas. We have used all published minima times (including visual and photographic) as well as newly determined ones from our own and SuperWASP observations. We determined the orbital parameters of third bodies in these systems, with statistically determined errors, using our code based on genetic algorithms and Markov chain Monte Carlo simulations. We confirmed the multiple nature of AD And and the triple-star model of TW Cas, and we proposed a quadruple-star model of IV Cas.
Statistical analysis of radiation dose derived from ingestion of foods
NASA Astrophysics Data System (ADS)
Dougherty, Ward L.
2001-09-01
This analysis designed and implemented a methodology to determine an individual's probabilistic radiation dose from ingestion of foods, using Crystal Ball. A dietary intake model was selected by comparing previously existing models. Two principal radionuclides were considered: Lead-210 (Pb-210) and Radium-226 (Ra-226). Samples from three local grocery stores (Publix, Winn Dixie, and Albertsons) were counted on a gamma spectroscopy system with a GeLi detector. The same food samples were considered as those in the original FIPR database. A statistical analysis, using the Crystal Ball program, was performed on the data to identify the most accurate distribution for these data. This allowed a determination of an individual's radiation dose based on the information collected. Based on the analyses performed, the Radium-226 dose for grocery store samples was lower than for FIPR debris analyses (2.7 vs. 5.91 mrem/yr), whereas the Lead-210 dose was higher in the grocery store samples than in the FIPR debris analyses (518 vs. 21.4 mrem/yr). The output radiation dose was higher for all evaluations when an accurate estimation of distributions for each value was considered: the Radium-226 doses for FIPR and grocery samples rose to 9.56 and 4.38 mrem/yr, and the Lead-210 doses rose to 34.7 and 854 mrem/yr for FIPR and grocery data, respectively. Lead-210 doses were higher than initial doses for several reasons: a different peak was examined, the lower edge of the detection limit was used, and the minimum detectable concentration was considered. FIPR did not use grocery samples as a control because the calculated radiation doses appeared unreasonably high. Considering distributions along with the initial values allowed a re-evaluation of radiation dose and showed a significant difference from the original deterministic values. This work shows the value and importance of considering distributions to ensure that a person's radiation dose is accurately calculated.
Probabilistic dose methodology proved to be a more accurate and realistic method of radiation dose determination. This type of methodology provides a visual presentation of the dose distribution that can be a vital aid in risk methodology.
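The probabilistic approach can be sketched without Crystal Ball by sampling the input quantities from assumed distributions in a Monte Carlo loop. Everything below (the lognormal parameters and the dose factor) is a hypothetical placeholder, not a FIPR value:

```python
import random

def mc_dose(n=20000, seed=7):
    """Monte Carlo ingestion-dose sketch: sample annual intake (kg/yr)
    and activity concentration (pCi/kg) from distributions instead of
    single point values, then convert with a fixed dose factor."""
    random.seed(seed)
    dose_factor = 0.005  # hypothetical mrem per pCi ingested
    doses = []
    for _ in range(n):
        intake = random.lognormvariate(3.0, 0.3)  # ~20 kg/yr median
        conc = random.lognormvariate(0.5, 0.5)    # ~1.6 pCi/kg median
        doses.append(intake * conc * dose_factor)
    doses.sort()
    return sum(doses) / n, doses[int(0.95 * n)]   # mean, 95th percentile

mean_dose, p95 = mc_dose()
print(mean_dose < p95)  # the upper tail exceeds the mean, as expected
```

The point of the distributional treatment is visible in the output: for right-skewed inputs the upper-percentile dose is well above the mean, so a deterministic point estimate can understate the dose to highly exposed individuals.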
Schmitt, M; Groß, K; Grub, J; Heib, F
2015-06-01
Contact angle determination by the sessile drop technique is essential to characterise surface properties in science and in industry. Different specific angles can be observed on every solid, correlated with the advancing or the receding of the triple line. Various procedures and definitions exist for determining specific angles, which are often neither comprehensible nor reproducible. Therefore one of the most important tasks in this area is to establish standard, reproducible and valid methods for determining advancing/receding contact angles. This contribution introduces novel techniques for analysing dynamic contact angle measurements (sessile drop) in detail, applicable to both axisymmetric and non-axisymmetric drops. Not only the recently presented fit solution by sigmoid function and the independent analysis of the different parameters (inclination, contact angle, velocity of the triple point), but also the dependent analysis, is explained in detail for the first time. These approaches yield contact angle data, and different access to specific contact angles, that are independent of user skill and the subjectivity of the operator. As an example, the motion behaviour of droplets on flat silicon-oxide surfaces after different surface treatments is measured dynamically by the sessile drop technique while inclining the sample plate. The triple points, inclination angles, downhill angles (advancing motion) and uphill angles (receding motion) obtained by high-precision drop shape analysis are statistically analysed, both independently and dependently. Because of the small distance covered in the dependent analysis (<0.4 mm) and the dominance of counted events with small velocity, the measurements are little influenced by motion dynamics, and the procedure can be called a "slow moving" analysis. The presented procedures are especially sensitive to the range from static to "slow moving" dynamic contact angle determination.
They are characterised by small deviations of the computed values. In addition to the detailed introduction of these novel analytical approaches and the fit solution, special motion relations for drops on inclined surfaces, and detailed relations concerning the reactivity of the freshly cleaned silicon wafer surface resulting in acceleration behaviour (reactive de-wetting), are presented. Copyright © 2014 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Alshipli, Marwan; Kabir, Norlaili A.
2017-05-01
Computed tomography (CT) employs X-ray radiation to create cross-sectional images. Dual-energy CT acquisition includes images acquired at an alternating X-ray tube voltage: a low and a high peak kilovoltage. The main objective of this study is to determine the slice thickness that best reduces image noise while retaining adequate diagnostic information in the dual-energy CT head protocol. The study used the ImageJ software and statistical analyses to aid the medical image analysis of dual-energy CT. ImageJ and the F-test were used in combination to analyse DICOM CT images and to investigate the effect of slice thickness on noise and visibility in dual-energy CT head protocol images. A Catphan-600 phantom was scanned at slice thickness values of 0.6, 1, 2, 3, 4, 5, and 6 mm, and quantitative analyses were carried out. The DECT operated in helical mode with all other scan parameters fixed. Based on the F-test statistical analyses, image noise at 0.6, 1, and 2 mm was significantly different from that of images acquired at slice thicknesses of 3, 4, 5, and 6 mm, whereas no significant differences in image noise were observed among 3, 4, 5, and 6 mm. As a result, the best diagnostic image value, image visibility, and lowest image noise in the dual-energy CT head protocol were observed at a slice thickness of 3 mm.
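The F-test comparison of noise between slice thicknesses can be sketched as a two-tailed test on ROI pixel-noise variances; the variances and sample sizes below are hypothetical, chosen only to illustrate the procedure:

```python
from scipy import stats

def f_test_var(s1_sq, n1, s2_sq, n2):
    """Two-tailed F-test for equality of two sample variances, e.g.
    pixel-noise variance measured in ROIs at two slice thicknesses.
    Returns the F statistic and the two-tailed p-value."""
    f = s1_sq / s2_sq
    p = 2 * min(stats.f.cdf(f, n1 - 1, n2 - 1),
                stats.f.sf(f, n1 - 1, n2 - 1))
    return f, min(p, 1.0)

# Hypothetical ROI noise variances (HU^2) from 100-pixel regions.
f, p = f_test_var(64.0, 100, 25.0, 100)  # thin slice vs. 3 mm slice
print(p < 0.05)  # the noise variances differ significantly
```

Thinner slices collect fewer photons per voxel and so show higher noise variance; the test formalizes the abstract's finding that thin slices (0.6-2 mm) differ significantly from the 3-6 mm group.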
Supply Chain Collaboration: Information Sharing in a Tactical Operating Environment
2013-06-01
architecture, there are four tiers: Client (Web Application Clients ), Presentation (Web-Server), Processing (Application-Server), Data (Database...organization in each period. This data will be collected to analyze. i) Analyses and Validation: We will do a statistics test in this data, Pareto ...notes, outstanding deliveries, and inventory. i) Analyses and Validation: We will do a statistics test in this data, Pareto analyses and confirmation
Analysis of the Einstein sample of early-type galaxies
NASA Technical Reports Server (NTRS)
Eskridge, Paul B.; Fabbiano, Giuseppina
1993-01-01
The EINSTEIN galaxy catalog contains x-ray data for 148 early-type (E and S0) galaxies. A detailed analysis of the global properties of this sample is presented. By comparing the x-ray properties with other tracers of the ISM, as well as with observables related to the stellar dynamics and populations of the sample, we expect to determine more clearly the physical relationships that govern the evolution of early-type galaxies. Previous studies with smaller samples have explored the relationships between x-ray luminosity (L(sub X)) and luminosities in other bands. Using our larger sample and the statistical techniques of survival analysis, a number of these earlier analyses were repeated. For our full sample, a strong statistical correlation is found between L(sub X) and L(sub B) (the probability that the null hypothesis is upheld is P less than 10(exp -4) from a variety of rank correlation tests). Regressions with several algorithms yield consistent results.
Muir, Bob; Quick, Suzanne; Slater, Ben J; Cooper, David B; Moran, Mary C; Timperley, Christopher M; Carrick, Wendy A; Burnell, Christopher K
2005-03-18
Thermal desorption with gas chromatography-mass spectrometry (TD-GC-MS) remains the technique of choice for analysis of trace concentrations of analytes in air samples. This paper describes the development and application of a method for analysing the vesicant compounds sulfur mustard and Lewisites I-III. 3,4-Dimercaptotoluene and butanethiol were used to spike sorbent tubes and vesicant vapours were sampled; Lewisites I and II reacted with the thiols while sulfur mustard and Lewisite III did not. Statistical experimental design was used to optimise the thermal desorption parameters, and the optimum method was used to determine vesicant compounds in headspace samples taken from a decontamination trial. 3,4-Dimercaptotoluene reacted with Lewisites I and II to give a common derivative with a limit of detection (LOD) of 260 µg m⁻³, while butanethiol gave distinct derivatives with limits of detection around 30 µg m⁻³.
Neighborhood archetypes for population health research: is there no place like home?
Weden, Margaret M; Bird, Chloe E; Escarce, José J; Lurie, Nicole
2011-01-01
This study presents a new, latent archetype approach for studying place in population health. Latent class analysis is used to show how the number, defining attributes, and change/stability of neighborhood archetypes can be characterized and tested for statistical significance. The approach is demonstrated using data on contextual determinants of health for US neighborhoods defined by census tracts in 1990 and 2000. Six archetypes (prevalence 13-20%) characterize the statistically significant combinations of contextual determinants of health from the social environment, built environment, commuting and migration patterns, and demographics and household composition of US neighborhoods. Longitudinal analyses based on the findings demonstrate notable stability (76.4% of neighborhoods categorized as the same archetype ten years later), with exceptions reflecting trends in (ex)urbanization, gentrification/downgrading, and racial/ethnic reconfiguration. The findings and approach are applicable to both research and practice (e.g. surveillance) and can be scaled up or down to study health and place in other geographical contexts or historical periods. Copyright © 2010 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Leake, M. A.
1982-01-01
The intercrater plains of Mercury and the Moon are defined, in part, by their high densities of small craters. The crater size frequency statistics presented in this chapter may help constrain the relative ages and origins of these surfaces. To this end, the effects of common geologic processes on crater frequency statistics are compared with the diameter frequency distributions of the intercrater regions of the Moon and Mercury. Such analyses may determine whether secondary craters dominate the distribution at small diameters, and whether volcanic plains or ballistic deposits form the intercrater surface. Determining the mass frequency distribution and flux of the impacting population is a more difficult problem. The necessary information such as scaling relationships between projectile energy and crater diameter, the relative fluxes of solar system objects, and the absolute ages of surface units is model dependent and poorly constrained, especially for Mercury.
On vital aid: the why, what and how of validation
Kleywegt, Gerard J.
2009-01-01
Limitations to the data and subjectivity in the structure-determination process may cause errors in macromolecular crystal structures. Appropriate validation techniques may be used to reveal problems in structures, ideally before they are analysed, published or deposited. Additionally, such techniques may be used a posteriori to assess the (relative) merits of a model by potential users. Weak validation methods and statistics assess how well a model reproduces the information that was used in its construction (i.e. experimental data and prior knowledge). Strong methods and statistics, on the other hand, test how well a model predicts data or information that were not used in the structure-determination process. These may be data that were excluded from the process on purpose, general knowledge about macromolecular structure, information about the biological role and biochemical activity of the molecule under study or its mutants or complexes and predictions that are based on the model and that can be tested experimentally. PMID:19171968
Millennium development goals and oral health in cities in Southern Brazil.
Bueno, Roberto Eduardo; Moysés, Samuel Jorge; Moysés, Simone Tetu
2010-06-01
To investigate social determinants of oral health, analysing associations between millennium development goals (MDG) indicators and oral health (OH) indicators. An ecological study was performed in two distinct phases. In Phase 1, MDG indicators and related covariates were obtained from the demographic census of the Brazilian Institute of Geography and Statistics, the Ministry of Health database and the 2000 Human Development Atlas, making up the whole set of independent variables. Principal component analysis was carried out for the independent variables, showing the correlations among the variables comprising the main components and generating a synthetic index (MDG index) that summarizes each city's performance with regard to the MDG. In Phase 2, the DMFT index (mean number of decayed, missing or filled permanent teeth) and the CF index (prevalence of caries-free individuals) in 12-year-olds were obtained from the epidemiological survey undertaken in 2002-2003 in 49 cities in southern Brazil, and were analysed in relation to the MDG index using Spearman's correlation. A statistically significant correlation was found for the DMFT and CF indices, respectively, with: the MDG index (R² = 0.49 and 0.48; P = 0.00); the socioeconomic status of the population (R² = 0.12 and 0.12; P = 0.02); and the socioenvironmental characteristics (R² = 0.41 and 0.46; P = 0.00). The MDG synthetic index of the cities analysed, and the respective components relating to their socioeconomic and socioenvironmental status, demonstrated a positive correlation with OH indicators. As such, intersectoral public policies based on population strategies that act on social determinants of general and oral health need to be integrated so as to impact on the MDG and OH outcomes. © 2010 John Wiley & Sons A/S.
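The Phase 2 analysis rests on Spearman's rank correlation, which can be sketched in pure Python: rank both variables (ties sharing the mean rank) and compute Pearson's correlation on the ranks. The city-level values below are hypothetical illustrations, not the study's data:

```python
def rank(values):
    """Average ranks (1-based); tied values share the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = rank(xs), rank(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

mdg_index = [0.71, 0.64, 0.58, 0.80, 0.49, 0.66]   # hypothetical city MDG scores
dmft      = [1.9, 2.4, 3.1, 1.2, 3.8, 2.2]         # hypothetical DMFT values
rho = spearman(mdg_index, dmft)                    # -1 here: higher MDG, lower caries
```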
Nour-Eldein, Hebatallah
2016-01-01
Given the limited statistical knowledge of most physicians, it is not uncommon to find statistical errors in research articles. The aim was to determine the statistical methods used in, and to assess the statistical errors of, family medicine (FM) research articles published between 2010 and 2014. This was a cross-sectional study. All 66 FM research articles published over 5 years by FM authors affiliated with Suez Canal University were screened by the researcher between May and August 2015. Types and frequencies of statistical methods were reviewed in all 66 FM articles. All 60 articles with identified inferential statistics were examined for statistical errors and deficiencies. A comprehensive 58-item checklist based on statistical guidelines was used to evaluate the statistical quality of the FM articles. Inferential methods were recorded in 62/66 (93.9%) of FM articles, and advanced analyses were used in 29/66 (43.9%). Contingency tables (38/66; 57.6%), regression (logistic, linear; 26/66; 39.4%), and t-tests (17/66; 25.8%) were the most commonly used inferential tests. Among the 60 FM articles with identified inferential statistics, the most frequent deficiencies were: no prior sample size calculation, 19/60 (31.7%); application of the wrong statistical test, 17/60 (28.3%); incomplete documentation of statistics, 59/60 (98.3%); reporting a P value without the test statistic, 32/60 (53.3%); no confidence interval reported with effect size measures, 12/60 (20.0%); and use of the mean (standard deviation) to describe ordinal or non-normal data, 8/60 (13.3%). Errors of interpretation were mainly conclusions unsupported by the study data, 5/60 (8.3%). Inferential statistics were used in the majority of FM articles. Data analysis and the reporting of statistics are areas for improvement in FM research articles.
Research of Extension of the Life Cycle of Helicopter Rotor Blade in Hungary
2003-02-01
Radiography (DXR), and (iii) Vibration Diagnostics (VD) with Statistical Energy Analysis (SEA) were semi-simultaneously applied [1]. The used three...2.2. Vibration Diagnostics (VD) Parallel to the NDT measurements, Statistical Energy Analysis (SEA) was applied as a vibration-diagnostic tool...noises were analysed with a dual-channel real-time frequency analyser (BK2035). In addition to the Statistical Energy Analysis measurement a small
Hamel, Jean-Francois; Saulnier, Patrick; Pe, Madeline; Zikos, Efstathios; Musoro, Jammbe; Coens, Corneel; Bottomley, Andrew
2017-09-01
Over recent decades, health-related quality of life (HRQoL) end-points have become an important outcome of randomised controlled trials (RCTs). HRQoL methodology in RCTs has improved following international consensus recommendations. However, no international recommendations exist concerning the statistical analysis of such data. The aim of our study was to identify and characterise the quality of the statistical methods commonly used for analysing HRQoL data in cancer RCTs. Building on our recently published systematic review, we analysed the HRQoL methods reported in a total of 33 RCTs published since 1991. We focussed on the ability of the methods to deal with the three major problems commonly encountered when analysing HRQoL data: its multidimensional and longitudinal structure and the commonly high rate of missing data. All studies reported HRQoL being assessed repeatedly over time, for periods ranging from 2 to 36 months. Missing data were common, with compliance rates ranging from 45% to 90%. From the 33 studies considered, 12 different statistical methods were identified. Twenty-nine studies analysed each of the questionnaire sub-dimensions without type I error adjustment. Thirteen studies repeated the HRQoL analysis at each assessment time, again without type I error adjustment. Only 8 studies used methods suitable for repeated measurements. Our findings show a lack of consistency in the statistical methods used for analysing HRQoL data. Problems related to multiple comparisons were rarely considered, leading to a high risk of false-positive results. It is therefore critical that international recommendations for improving such statistical practices are developed. Copyright © 2017. Published by Elsevier Ltd.
Sunspot activity and influenza pandemics: a statistical assessment of the purported association.
Towers, S
2017-10-01
Since 1978, a series of papers in the literature have claimed to find a significant association between sunspot activity and the timing of influenza pandemics. This paper examines these analyses and attempts to recreate the three most recent statistical analyses, by Ertel (1994), Tapping et al. (2001), and Yeung (2006), all of which purported to find a significant relationship between sunspot numbers and pandemic influenza. As will be discussed, each analysis contained errors in the data. In addition, each made arbitrary selections or assumptions, and the authors did not assess the robustness of their analyses to changes in those arbitrary assumptions. Varying the arbitrary assumptions to other, equally valid, assumptions negates the claims of significance. Indeed, an arbitrary selection made in one of the analyses appears to have resulted in almost maximal apparent significance; changing it only slightly yields a null result. This analysis applies statistically rigorous methodology to examine the purported sunspot/pandemic link, using more statistically powerful un-binned analysis methods rather than relying on arbitrarily binned data. The analyses are repeated using both the Wolf and Group sunspot numbers. In all cases, no statistically significant evidence of any association was found. While the focus of this particular analysis was the purported relationship of influenza pandemics to sunspot activity, the faults found in the past analyses are common pitfalls: inattention to analysis reproducibility and robustness assessment are common problems in the sciences that unfortunately are not noted often enough in review.
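The pitfall described, scanning over arbitrary analysis choices and keeping the most significant result, can be demonstrated with a small simulation. This sketch (a generic illustration, not a reconstruction of the cited analyses) places pandemic years at random within an 11-year cycle and compares one pre-registered "near-maximum" window against the best of all possible window offsets:

```python
import random
from math import comb

def binom_p_at_least(k, n, p):
    """Exact P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k, n + 1))

random.seed(7)
CYCLE = 11             # nominal sunspot-cycle length in years
WINDOW = 3             # width of the "near-maximum" window: an arbitrary choice
N_PANDEMICS = 8
P_WINDOW = WINDOW / CYCLE
TRIALS = 1000

hits_fixed = hits_scanned = 0
for _ in range(TRIALS):
    # pandemic years placed at random: the null hypothesis of no association
    years = [random.randrange(CYCLE) for _ in range(N_PANDEMICS)]
    k_fixed = sum(1 for y in years if y < WINDOW)      # one pre-registered window
    k_best = max(sum(1 for y in years if (y - off) % CYCLE < WINDOW)
                 for off in range(CYCLE))              # best of all 11 offsets
    hits_fixed += binom_p_at_least(k_fixed, N_PANDEMICS, P_WINDOW) < 0.05
    hits_scanned += binom_p_at_least(k_best, N_PANDEMICS, P_WINDOW) < 0.05

fixed_rate = hits_fixed / TRIALS      # stays near the nominal level
scanned_rate = hits_scanned / TRIALS  # inflated by scanning the arbitrary choice
```

Because the best offset is always at least as extreme as the fixed one, the scanned false-positive rate can only exceed the pre-registered rate, which is the mechanism behind the "almost maximal apparent significance" noted above.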
Gingerich, Stephen B.
2005-01-01
Flow-duration statistics under natural (undiverted) and diverted flow conditions were estimated for gaged and ungaged sites on 21 streams in northeast Maui, Hawaii. The estimates were made using the optimal combination of continuous-record gaging-station data, low-flow measurements, and values determined from regression equations developed as part of this study. Estimated 50- and 95-percent flow duration statistics for streams are presented and the analyses done to develop and evaluate the methods used in estimating the statistics are described. Estimated streamflow statistics are presented for sites where various amounts of streamflow data are available as well as for locations where no data are available. Daily mean flows were used to determine flow-duration statistics for continuous-record stream-gaging stations in the study area following U.S. Geological Survey established standard methods. Duration discharges of 50- and 95-percent were determined from total flow and base flow for each continuous-record station. The index-station method was used to adjust all of the streamflow records to a common, long-term period. The gaging station on West Wailuaiki Stream (16518000) was chosen as the index station because of its record length (1914-2003) and favorable geographic location. Adjustments based on the index-station method resulted in decreases to the 50-percent duration total flow, 50-percent duration base flow, 95-percent duration total flow, and 95-percent duration base flow computed on the basis of short-term records that averaged 7, 3, 4, and 1 percent, respectively. For the drainage basin of each continuous-record gaged site and selected ungaged sites, morphometric, geologic, soil, and rainfall characteristics were quantified using Geographic Information System techniques. Regression equations relating the non-diverted streamflow statistics to basin characteristics of the gaged basins were developed using ordinary-least-squares regression analyses. 
Rainfall rate, maximum basin elevation, and the elongation ratio of the basin were the basin characteristics used in the final regression equations for 50-percent duration total flow and base flow. Rainfall rate and maximum basin elevation were used in the final regression equations for the 95-percent duration total flow and base flow. The relative errors between observed and estimated flows ranged from 10 to 20 percent for the 50-percent duration total flow and base flow, and from 29 to 56 percent for the 95-percent duration total flow and base flow. The regression equations developed for this study were used to determine the 50-percent duration total flow, 50-percent duration base flow, 95-percent duration total flow, and 95-percent duration base flow at selected ungaged diverted and undiverted sites. Estimated streamflow, prediction intervals, and standard errors were determined for 48 ungaged sites in the study area and for three gaged sites west of the study area. Relative errors were determined for sites where measured values of 95-percent duration discharge of total flow were available. East of Keanae Valley, the 95-percent duration discharge equation generally underestimated flow, and within and west of Keanae Valley, the equation generally overestimated flow. Reduction in 50- and 95-percent flow-duration values in stream reaches affected by diversions throughout the study area average 58 to 60 percent.
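A regression of a flow statistic on a basin characteristic, as used in the study, can be sketched with a one-predictor log-log least-squares fit. The rainfall and discharge values below are hypothetical, and the report's actual equations used several basin characteristics at once:

```python
import math

def ols_log(xs, ys):
    """Fit log10(y) = a + b*log10(x) by least squares (one-predictor sketch)."""
    lx = [math.log10(x) for x in xs]
    ly = [math.log10(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    a = my - b * mx
    resid = [v - (a + b * u) for u, v in zip(lx, ly)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2))  # standard error of estimate
    return a, b, se

# hypothetical gaged basins: mean rainfall rate (in/yr) vs 50%-duration flow (ft3/s)
rain = [40, 55, 70, 90, 120, 150]
q50 = [3.1, 5.2, 7.8, 12.0, 19.5, 28.0]
a, b, se = ols_log(rain, q50)
# apply the fitted equation to an ungaged basin with 100 in/yr rainfall
predicted = 10 ** (a + b * math.log10(100))
```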
Makarewicz, Roman; Kopczyńska, Ewa; Marszałek, Andrzej; Goralewska, Alina; Kardymowicz, Hanna
2012-01-01
Aim of the study: This retrospective study attempts to evaluate the influence of serum vascular endothelial growth factor C (VEGF-C), microvessel density (MVD) and lymphatic vessel density (LMVD) on the result of tumour treatment in women with cervical cancer. Material and methods: The research was carried out in a group of 58 patients scheduled for brachytherapy for cervical cancer. All women were patients of the Department and University Hospital of Oncology and Brachytherapy, Collegium Medicum in Bydgoszcz of Nicolaus Copernicus University in Toruń. VEGF-C was determined by means of a quantitative sandwich enzyme immunoassay using a human VEGF-C ELISA produced by Bender MedSystem, an enzyme-linked immunosorbent assay detecting the activity of human VEGF-C in body fluids. The measure of the intensity of angiogenesis and lymphangiogenesis in immunohistochemical reactions is the number of blood vessels within the tumour. Statistical analysis was done using Statistica 6.0 software (StatSoft, Inc. 2001). The Cox proportional hazards model was used for univariate and multivariate analyses. Univariate analysis of overall survival was performed as outlined by Kaplan and Meier. In all statistical analyses p < 0.05 was taken as significant. Results: In the 51 patients who showed up for follow-up examination, the influence of the factors of angiogenesis and lymphangiogenesis, patients' age and the level of haemoglobin at the end of treatment was assessed. Selected variables, such as patients' age, lymph vessel density (LMVD), microvessel density (MVD) and the level of haemoglobin (Hb) before treatment, were analysed by means of logistic regression as potential prognostic factors for lymph node invasion. The observed differences were statistically significant for haemoglobin level before treatment and platelet number after treatment. The study revealed the following prognostic factors: lymph node status, FIGO stage, and kind of treatment.
No statistically significant influence of angiogenic and lymphangiogenic factors on the prognosis was found. Conclusion: Angiogenic and lymphangiogenic factors have no value in predicting response to radiotherapy in cervical cancer patients. PMID:23788848
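The Kaplan-Meier estimator used for the survival analysis can be sketched in a few lines of Python. The follow-up times below are hypothetical, not the study's patient data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.

    times: follow-up durations; events: 1 = event observed, 0 = censored.
    Returns (time, survival) steps at each time with at least one event."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = censored = 0
        while i < len(order) and times[order[i]] == t:
            if events[order[i]]:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            surv *= (at_risk - deaths) / at_risk   # product-limit update
            curve.append((t, surv))
        at_risk -= deaths + censored
    return curve

# hypothetical follow-up of 6 patients (months)
times = [5, 8, 8, 12, 15, 20]
events = [1, 1, 0, 1, 0, 1]
curve = kaplan_meier(times, events)
```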
Lightfoot, Emma; O’Connell, Tamsin C.
2016-01-01
Oxygen isotope analysis of archaeological skeletal remains is an increasingly popular tool for studying past human migrations. It is based on the assumption that human body chemistry preserves the δ18O of precipitation in such a way as to be a useful technique for identifying migrants and, potentially, their homelands. In this study, the first such global survey, we draw on published human tooth enamel and bone bioapatite data to explore the validity of using oxygen isotope analyses to identify migrants in the archaeological record. We use human δ18O results to show that there are large variations in human oxygen isotope values within a population sample. This may relate to physiological factors influencing the preservation of the primary isotope signal, or to human activities (such as brewing, boiling, stewing, differential access to water sources and so on) that cause variation in ingested water and food isotope values. We compare the number of outliers identified using various statistical methods. We determine that the most appropriate method for identifying migrants depends on the data, but is likely to be the interquartile range (IQR) or the median absolute deviation from the median under most archaeological circumstances. Finally, through a spatial assessment of the dataset, we show that the degree of overlap in human isotope values from different locations across Europe is such that identifying individuals' homelands on the basis of oxygen isotope analysis alone is not possible for the regions analysed to date. Oxygen isotope analysis is a valid method for identifying first-generation migrants from an archaeological site when used appropriately; however, it is difficult to identify migrants using statistical methods for a sample size of less than c. 25 individuals.
In the absence of local previous analyses, each sample should be treated as an individual dataset and statistical techniques can be used to identify migrants, but in most cases pinpointing a specific homeland should not be attempted. PMID:27124001
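The IQR and MAD outlier rules recommended above can be sketched directly. The δ18O values below are hypothetical, with one deliberately planted "migrant":

```python
import statistics

def iqr_outliers(values, k=1.5):
    """Tukey fences: flag points beyond k * IQR outside the quartiles."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]

def mad_outliers(values, k=3.0):
    """Flag points more than k scaled-MADs from the median."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    scaled = 1.4826 * mad    # consistent with sigma for normal data
    return [v for v in values if abs(v - med) > k * scaled]

# hypothetical enamel d18O values; the last individual is the planted outlier
d18o = [26.1, 26.4, 25.9, 26.3, 26.0, 26.2, 25.8, 26.5, 23.9]
iqr_out = iqr_outliers(d18o)
mad_out = mad_outliers(d18o)
```

Both rules flag the same individual here; on real samples they can disagree, which is why the choice is said to depend on the data.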
Surface pressure maps from scatterometer data
NASA Technical Reports Server (NTRS)
Brown, R. A.; Levy, Gad
1991-01-01
The ability to determine surface pressure fields from satellite scatterometer data was shown by Brown and Levy (1986). The surface winds are used to calculate the gradient winds above the planetary boundary layer, and these are directly related to the pressure gradients. There are corrections for variable stratification, variable surface roughness, horizontal inhomogeneity, humidity and baroclinity. The Seasat-A Satellite Scatterometer (SASS) data have been used in a systematic study of 50 synoptic weather events (regions of approximately 1000 X 1000 km). The preliminary statistics of agreement with national weather service surface pressure maps are calculated. The resulting surface pressure maps can be used together with SASS winds and Scanning Multichannel Microwave Radiometer (SMMR) water vapor and liquid water analyses to provide good front and storm system analyses.
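The link from winds to pressure gradients rests on geostrophic balance, dp/dn = ρ f V_g. A minimal sketch follows; the wind speed, latitude, and density values are illustrative assumptions, and the paper's method includes further corrections (stratification, roughness, baroclinity) not modeled here:

```python
import math

RHO = 1.25           # assumed near-surface air density, kg/m^3
OMEGA = 7.292e-5     # Earth's rotation rate, rad/s

def pressure_gradient(v_geostrophic, lat_deg):
    """Geostrophic balance: dp/dn = rho * f * Vg, in Pa per metre."""
    f = 2 * OMEGA * math.sin(math.radians(lat_deg))   # Coriolis parameter
    return RHO * f * v_geostrophic

# hypothetical case: a scatterometer surface wind scaled up through the
# boundary layer to a 20 m/s geostrophic wind at 45 degrees N
dpdn = pressure_gradient(20.0, 45.0)
dp_per_100km = dpdn * 1e5 / 100.0   # express as hPa per 100 km
```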
Walker, Brian K
2012-01-01
Marine organism diversity typically attenuates latitudinally from tropical to colder climate regimes. Since the distribution of many marine species relates to certain habitats and depth regimes, mapping data provide valuable information in the absence of detailed ecological data that can be used to identify and spatially quantify smaller-scale (10s of km) coral reef ecosystem regions and potential physical biogeographic barriers. This study focused on the southeast Florida coast due to a recognized, but understudied, tropical to subtropical biogeographic gradient. GIS spatial analyses were conducted on recent, accurate, shallow-water (0-30 m) benthic habitat maps to identify and quantify specific regions along the coast that were statistically distinct in the number and amount of major benthic habitat types. Habitat type and width were measured for 209 evenly-spaced cross-shelf transects. Evaluation of groupings from a cluster analysis at 75% similarity yielded five distinct regions. The number of benthic habitats and their area, width, distance from shore, distance from each other, and LIDAR depths were calculated in GIS and examined to determine regional statistical differences. The number of benthic habitats decreased with increasing latitude from 9 in the south to 4 in the north, and many of the habitat metrics statistically differed between regions. Three potential biogeographic barriers were found at the Boca, Hillsboro, and Biscayne boundaries, where specific shallow-water habitats were absent further north: Middle Reef, Inner Reef, and oceanic seagrass beds, respectively. The Bahamas Fault Zone boundary was also noted, where changes in coastal morphologies occurred that could relate to subtle ecological changes. The analyses defined regions on a smaller scale more appropriate to regional management decisions, hence strengthening marine conservation planning with an objective, scientific foundation for decision making.
They provide a framework for similar regional analyses elsewhere.
Karacan, C Özgen; Olea, Ricardo A
2018-03-01
Chemical properties of coal largely determine coal handling, processing, and beneficiation methods, as well as the design of coal-fired power plants. Furthermore, these properties impact coal strength, coal blending during mining, and coal's gas content, which is important for mining safety. For these processes and quantitative predictions to be successful, safe, and economically feasible, it is important to determine and map the chemical properties of coals accurately so that they can be inferred prior to mining. Ultimate analysis quantifies the principal chemical elements in coal. These elements are C, H, N, S, O, and, depending on the basis, ash and/or moisture. The basis for the data is determined by the condition of the sample at the time of analysis, with an "as-received" basis being the closest to sampling conditions and thus to the in-situ conditions of the coal. The parts determined or calculated as the result of ultimate analyses are compositions, reported in weight percent, and they pose the challenges of statistical analyses of compositional data. The treatment of parts using proper compositional methods may be even more important in mapping them, as most mapping methods carry uncertainty due to partial sampling as well. In this work, we map the ultimate-analysis parts of the Springfield coal from an Indiana section of the Illinois basin, USA, using sequential Gaussian simulation of isometric log-ratio transformed compositions. We compare the results with those of direct simulations of the compositional parts. We also compare the implications of these approaches in calculating other properties using correlations to identify the differences and consequences. Although the study here is for coal, the methods described in the paper are applicable to any situation involving compositional data and its mapping.
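Log-ratio transforms are the key device for treating compositional parts. The sketch below uses the centred log-ratio (clr), a simpler relative of the isometric log-ratio (ilr) used in the paper, on a hypothetical four-part coal composition:

```python
import math

def clr(parts):
    """Centred log-ratio transform of a composition whose parts sum to 1."""
    logs = [math.log(p) for p in parts]
    g = sum(logs) / len(logs)       # log of the geometric mean
    return [l - g for l in logs]    # coordinates sum to zero by construction

def clr_inverse(coords):
    """Map clr coordinates back to a composition on the unit simplex."""
    ex = [math.exp(c) for c in coords]
    total = sum(ex)
    return [e / total for e in ex]

# hypothetical as-received ultimate analysis: C, H, O+N+S, ash+moisture
coal = [0.65, 0.05, 0.12, 0.18]
z = clr(coal)            # unconstrained coordinates, safe for statistics
back = clr_inverse(z)    # recovers the original composition
```

Working in log-ratio coordinates avoids the spurious correlations that the constant-sum constraint induces when parts are simulated or mapped directly.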
Space shuttle solid rocket booster recovery system definition, volume 1
NASA Technical Reports Server (NTRS)
1973-01-01
The performance requirements, preliminary designs, and development program plans for an airborne recovery system for the space shuttle solid rocket booster are discussed. The analyses performed during the study phase of the program are presented. The basic considerations which established the system configuration are defined. A Monte Carlo statistical technique using random sampling of the probability distribution for the critical water impact parameters was used to determine the failure probability of each solid rocket booster component as functions of impact velocity and component strength capability.
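The Monte Carlo failure-probability technique can be sketched by sampling impact velocity and component strength from assumed distributions. All distributions and the kinetic-energy load proxy below are hypothetical placeholders, not the study's engineering data:

```python
import random

random.seed(42)

def failure_probability(n_trials=100_000):
    """Estimate P(impact load exceeds component strength) by random sampling."""
    failures = 0
    for _ in range(n_trials):
        velocity = random.gauss(23.0, 3.0)      # water-impact velocity, m/s
        load = 0.5 * velocity ** 2              # load proxy ~ kinetic energy per unit mass
        strength = random.gauss(320.0, 40.0)    # component capability, same units
        if load > strength:
            failures += 1
    return failures / n_trials

p_fail = failure_probability()
```

Repeating this per component, as a function of impact velocity and strength capability, yields the failure-probability curves the study describes.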
2013-07-01
as a statistical graphic, and Pearson product moment correlation coefficients as measures of the strength of linear association; 4) performing SNP ...determine if there are differences in single nucleotide polymorphisms ( SNPs ) in selected candidate genes implicated in metabolic syndrome, obesity, chronic...samples for the serum and SNP analyses. We have reached a target of 500 patients at the end of year 2; however, some of the patients turned out to be
Evaluation of the ecological relevance of mysid toxicity tests using population modeling techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhn-Hines, A.; Munns, W.R. Jr.; Lussier, S.
1995-12-31
A number of acute and chronic bioassay statistics are used to evaluate the toxicity and risks of chemical stressors to the mysid shrimp, Mysidopsis bahia. These include LC50s from acute tests, NOECs from 7-day and life-cycle tests, and the US EPA Water Quality Criteria Criterion Continuous Concentrations (CCC). Because these statistics are generated from endpoints which focus upon the responses of individual organisms, their relationships to significant effects at higher levels of ecological organization are unknown. This study was conducted to evaluate the quantitative relationships between toxicity test statistics and a concentration-based statistic derived from exposure-response models relating population growth rate (λ) to stressor concentration. This statistic, C* (the concentration where λ = 1, i.e. zero population growth), describes the concentration above which mysid populations are projected to decline in abundance, as determined using population modeling techniques. An analysis of M. bahia responses to 9 metals and 9 organic contaminants indicated the NOEC from life-cycle tests to be the best predictor of C*, although the acute LC50 predicted population-level response surprisingly well. These analyses provide useful information regarding uncertainties of extrapolation among test statistics in assessments of ecological risk.
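Solving for the concentration at which the population growth rate λ equals 1 (zero population growth) can be sketched with a hypothetical linear exposure-response model and bisection; the model parameters below are illustrative, not fitted mysid data:

```python
def growth_rate(conc, lambda0=1.25, slope=0.004):
    """Hypothetical exposure-response model: lambda declines with concentration."""
    return lambda0 - slope * conc

def c_star(lo=0.0, hi=1000.0, tol=1e-6):
    """Bisection for the concentration where lambda = 1 (zero population growth)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if growth_rate(mid) > 1.0:   # population still growing: move lower bound up
            lo = mid
        else:                        # population declining: move upper bound down
            hi = mid
    return (lo + hi) / 2

cstar = c_star()   # analytically (1.25 - 1) / 0.004 = 62.5 for this toy model
```

Concentrations above this threshold are those at which the modeled population is projected to decline in abundance.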
NASA Astrophysics Data System (ADS)
Ridgley, James Alexander, Jr.
This dissertation is an exploratory quantitative analysis of various independent variables to determine their effect on the professional longevity (years of service) of high school science teachers in the state of Florida for the academic years 2011-2012 to 2013-2014. Data are collected from the Florida Department of Education, National Center for Education Statistics, and the National Assessment of Educational Progress databases. The following research hypotheses are examined: H1 - There are statistically significant differences in Level 1 (teacher variables) that influence the professional longevity of a high school science teacher in Florida. H2 - There are statistically significant differences in Level 2 (school variables) that influence the professional longevity of a high school science teacher in Florida. H3 - There are statistically significant differences in Level 3 (district variables) that influence the professional longevity of a high school science teacher in Florida. H4 - When tested in a hierarchical multiple regression, there are statistically significant differences in Level 1, Level 2, or Level 3 that influence the professional longevity of a high school science teacher in Florida. The professional longevity of a Floridian high school science teacher is the dependent variable. The independent variables are: (Level 1) a teacher's sex, age, ethnicity, earned degree, salary, number of schools taught in, migration count, and various years of service in different areas of education; (Level 2) a school's geographic location, residential population density, average class size, charter status, and SES; and (Level 3) a school district's average SES and average spending per pupil. Statistical analyses of exploratory MLRs and a HMR are used to support the research hypotheses. 
The final results of the HMR analysis show a teacher's age, salary, earned degree (unknown, associate, and doctorate), and ethnicity (Hispanic and Native Hawaiian/Pacific Islander); a school's charter status; and a school district's average SES are all significant predictors of a Florida high school science teacher's professional longevity. Although statistically significant in the initial exploratory MLR analyses, a teacher's ethnicity (Asian and Black), a school's geographic location (city and rural), and a school's SES are not statistically significant in the final HMR model.
Jin, Zhichao; Yu, Danghui; Zhang, Luoman; Meng, Hong; Lu, Jian; Gao, Qingbin; Cao, Yang; Ma, Xiuqiang; Wu, Cheng; He, Qian; Wang, Rui; He, Jia
2010-01-01
Background: High-quality clinical research requires not only advanced professional knowledge but also sound study design and correct statistical analyses. The number of clinical research articles published in Chinese medical journals has increased immensely in the past decade, but study design quality and statistical analyses have remained suboptimal. The aim of this investigation was to gather evidence on the quality of study design and statistical analyses in clinical research conducted in China during the first decade of the new millennium. Methodology/Principal Findings: Ten (10) leading Chinese medical journals were selected and all original articles published in 1998 (N = 1,335) and 2008 (N = 1,578) were thoroughly categorized and reviewed. A well-defined and validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation. Main outcomes were the frequencies of different types of study design, error/defect proportions in design and statistical analyses, and implementation of CONSORT in randomized clinical trials. From 1998 to 2008, the error/defect proportion in statistical analyses decreased significantly (χ2 = 12.03, p<0.001), from 59.8% (545/1,335) in 1998 to 52.2% (664/1,578) in 2008. The overall error/defect proportion in study design also decreased (χ2 = 21.22, p<0.001), from 50.9% (680/1,335) to 42.4% (669/1,578). In 2008, randomized clinical trials remained in the single digits (3.8%, 60/1,578), with two-thirds showing poor results reporting (defects in 44 papers, 73.3%). Nearly half of the published studies were retrospective in nature, 49.3% (658/1,335) in 1998 compared to 48.2% (761/1,578) in 2008. Decreases in defect proportions were observed in both results presentation (χ2 = 93.26, p<0.001), 92.7% (945/1,019) compared to 78.2% (1023/1,309), and interpretation (χ2 = 27.26, p<0.001), 9.7% (99/1,019) compared to 4.3% (56/1,309), although some serious defects persisted.
Conclusions/Significance Chinese medical research seems to have made significant progress regarding statistical analyses, but there remains ample room for improvement regarding study designs. Retrospective clinical studies are the most often used design, whereas randomized clinical trials are rare and often show methodological weaknesses. Urgent implementation of the CONSORT statement is imperative. PMID:20520824
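The year-over-year comparisons reported above are chi-square tests on 2×2 tables of defect counts. A minimal, stdlib-only sketch of such a two-proportion test, using hypothetical counts rather than the paper's data:

```python
import math

def chi2_two_proportions(err1, n1, err2, n2):
    """Pearson chi-square test (1 df, no continuity correction) comparing
    two defect proportions via a 2x2 contingency table."""
    table = [[err1, n1 - err1], [err2, n2 - err2]]
    rows = [n1, n2]
    cols = [err1 + err2, (n1 - err1) + (n2 - err2)]
    total = n1 + n2
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = rows[i] * cols[j] / total
            chi2 += (table[i][j] - expected) ** 2 / expected
    p = math.erfc(math.sqrt(chi2 / 2))  # chi-square(1 df) survival function
    return chi2, p

# hypothetical counts: 300/500 papers flagged in year 1 vs 260/500 in year 2
chi2, p = chi2_two_proportions(300, 500, 260, 500)
```

For a single degree of freedom the survival function reduces to `erfc(sqrt(x/2))`, so no external statistics library is needed.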
Arceo-Carranza, D; Vega-Cendejas, M E; Hernández de Santillana, M
2013-01-01
The aim of this study was to determine the trophic structure and nycthemeral variations in the diet of dominant fish species (Ariopsis felis, Bairdiella chrysoura, Micropogonias undulatus, Eucinostomus gula, Eucinostomus argenteus, Lagodon rhomboides and Sphoeroides testudineus) in Celestun Lagoon, a biosphere reserve located in the southern Gulf of Mexico, and influenced by freshwater seeps. A total of 1473 stomachs were analysed and nine trophic groups were recorded. Bray-Curtis analyses with analyses of similarity (ANOSIM) statistical tests were used to determine two groups of feeding guilds: zoobenthivores and omnivores, with significant differences between time and habitat. The relationships between fish feeding habits, size class and environmental variables were investigated using canonical correspondence analysis (CCA). Most of the species showed a low niche breadth with high specialization towards amphipod consumption, with the exception of L. rhomboides (0·60), which indicated generalist feeding. This study in a protected area is an important source of information for drawing up conservation policies in relation to the management of aquatic resources, and will aid in the establishment of priority areas for conservation. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.
Validation of endogenous internal real-time PCR controls in renal tissues.
Cui, Xiangqin; Zhou, Juling; Qiu, Jing; Johnson, Martin R; Mrug, Michal
2009-01-01
Endogenous internal controls ('reference' or 'housekeeping' genes) are widely used in real-time PCR (RT-PCR) analyses. Their use relies on the premise of consistently stable expression across studied experimental conditions. Unfortunately, none of these controls fulfills this premise across a wide range of experimental conditions; consequently, none of them can be recommended for universal use. To determine which endogenous RT-PCR controls are suitable for analyses of renal tissues altered by kidney disease, we studied the expression of 16 commonly used 'reference genes' in 7 mildly and 7 severely affected whole kidney tissues from a well-characterized cystic kidney disease model. Expression levels of these 16 genes, determined by TaqMan RT-PCR analyses and Affymetrix GeneChip arrays, were normalized and tested for overall variance and equivalence of the means. Both statistical approaches and both TaqMan- and GeneChip-based methods converged on 3 out of the 4 top-ranked genes (Ppia, Gapdh and Pgk1) that had the most constant expression levels across the studied phenotypes. A combination of the top-ranked genes will provide a suitable endogenous internal control for similar studies of kidney tissues across a wide range of disease severity. Copyright 2009 S. Karger AG, Basel.
NASA Astrophysics Data System (ADS)
Estrada, Francisco; Perron, Pierre; Martínez-López, Benjamín
2013-12-01
The warming of the climate system is unequivocal as evidenced by an increase in global temperatures by 0.8°C over the past century. However, the attribution of the observed warming to human activities remains less clear, particularly because of the apparent slow-down in warming since the late 1990s. Here we analyse radiative forcing and temperature time series with state-of-the-art statistical methods to address this question without climate model simulations. We show that long-term trends in total radiative forcing and temperatures have largely been determined by atmospheric greenhouse gas concentrations, and modulated by other radiative factors. We identify a pronounced increase in the growth rates of both temperatures and radiative forcing around 1960, which marks the onset of sustained global warming. Our analyses also reveal a contribution of human interventions to two periods when global warming slowed down. Our statistical analysis suggests that the reduction in the emissions of ozone-depleting substances under the Montreal Protocol, as well as a reduction in methane emissions, contributed to the lower rate of warming since the 1990s. Furthermore, we identify a contribution from the two world wars and the Great Depression to the documented cooling in the mid-twentieth century, through lower carbon dioxide emissions. We conclude that reductions in greenhouse gas emissions are effective in slowing the rate of warming in the short term.
Preventing homicide: an evaluation of the efficacy of a Detroit gun ordinance.
O'Carroll, P W; Loftin, C; Waller, J B; McDowall, D; Bukoff, A; Scott, R O; Mercy, J A; Wiersema, B
1991-01-01
BACKGROUND: In November 1986, a Detroit, Michigan city ordinance requiring mandatory jail sentences for illegally carrying a firearm in public was passed to preserve "the public peace, health, safety, and welfare of the people." METHODS: We conducted a set of interrupted time-series analyses to evaluate the impact of the law on the incidence of homicides, hypothesizing that the ordinance, by its nature, would affect only firearm homicides and homicides committed outside (e.g., on the street). RESULTS: The incidence of homicide in general increased after the law was passed, but the increases in non-firearm homicides and homicides committed inside (e.g., in a home) were either statistically significant or approached statistical significance (p = .006 and p = .070, respectively), whereas changes in the incidence of firearm homicides and homicides committed outside were not statistically significant (p = .238 and p = .418, respectively). We also determined that the ordinance was essentially unenforced, apparently because of a critical shortage of jail space. CONCLUSIONS: Our findings are consistent with a model in which the ordinance had a dampening effect on firearm homicides occurring in public in Detroit. The apparent preventive effect evident in the time series analyses may have been due to publicity about the ordinance, whereas the small nature of the effect may have been due to the lack of enforcement. PMID:2014857
Overweight, but not obesity, paradox on mortality following coronary artery bypass grafting.
Takagi, Hisato; Umemoto, Takuya
2016-09-01
To determine whether an "obesity paradox" on post-coronary artery bypass grafting (CABG) mortality exists, we abstracted exclusively adjusted odds ratios (ORs) and/or hazard ratios (HRs) for mortality from each study, and then combined them in a meta-analysis. MEDLINE and EMBASE were searched through April 2015 using PubMed and OVID, to identify comparative studies, of overweight or obese versus normal weight patients undergoing CABG, reporting adjusted relative risk estimates for short-term (30-day or in-hospital) and/or mid-to-long-term all-cause mortality. Our search identified 14 eligible studies. In total our meta-analysis included data on 79,140 patients undergoing CABG. Pooled analyses in short-term mortality demonstrated that overweight was associated with a statistically significant 15% reduction relative to normal weight (OR, 0.85; 95% confidence interval [CI], 0.74-0.98; p=0.03) and no statistically significant differences between mild obesity, moderate/severe obesity, or overall obesity and normal weight. Pooled analyses in mid-to-long-term mortality demonstrated that overweight was associated with a statistically significant 10% reduction relative to normal weight (HR, 0.90; 95% CI, 0.84 to 0.96; p=0.001); and no statistically significant differences between mild obesity, moderate/severe obesity, or overall obesity and normal weight. Overweight, but not obesity, may be associated with better short-term and mid-to-long-term post-CABG survival relative to normal weight. An overweight, but not obesity, paradox on post-CABG mortality appears to exist. Copyright © 2015 Japanese College of Cardiology. Published by Elsevier Ltd. All rights reserved.
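The pooling step described above, combining adjusted ORs/HRs across studies, is conventionally done by inverse-variance weighting on the log scale. A hedged sketch with hypothetical study estimates (not the 14 reviewed studies):

```python
import math

def pool_fixed_effect(estimates):
    """Inverse-variance fixed-effect pooling of relative risk estimates.
    Input: (ratio, ci_low, ci_high) per study, 95% CIs; work on the log scale."""
    z = 1.959964  # two-sided 95% normal quantile
    logs, weights = [], []
    for ratio, lo, hi in estimates:
        se = (math.log(hi) - math.log(lo)) / (2 * z)  # back out the SE from the CI
        logs.append(math.log(ratio))
        weights.append(1.0 / se ** 2)
    pooled = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return (math.exp(pooled),
            math.exp(pooled - z * se_pooled),
            math.exp(pooled + z * se_pooled))

# hypothetical adjusted HRs (overweight vs normal weight) from three studies
hr, lo, hi = pool_fixed_effect([(0.88, 0.78, 0.99),
                                (0.92, 0.80, 1.06),
                                (0.85, 0.70, 1.03)])
```

A random-effects model would add a between-study variance term to the weights; the fixed-effect version shown here is the simplest case.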
Global atmospheric circulation statistics, 1000-1 mb
NASA Technical Reports Server (NTRS)
Randel, William J.
1992-01-01
The atlas presents atmospheric general circulation statistics derived from twelve years (1979-90) of daily National Meteorological Center (NMC) operational geopotential height analyses; it is an update of a prior atlas using data over 1979-1986. These global analyses are available on pressure levels covering 1000-1 mb (approximately 0-50 km). The geopotential grids are a combined product of the Climate Analysis Center (which produces analyses over 70-1 mb) and operational NMC analyses (over 1000-100 mb). Balance horizontal winds and hydrostatic temperatures are derived from the geopotential fields.
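Deriving hydrostatic temperatures from geopotential fields, as the atlas does, rests on the hypsometric relation: the layer-mean temperature is proportional to the geopotential thickness between two pressure levels. An illustrative calculation (layer heights hypothetical):

```python
import math

R_DRY = 287.05  # gas constant for dry air, J kg^-1 K^-1
G0 = 9.80665    # standard gravity, m s^-2

def layer_mean_temperature(phi_upper, phi_lower, p_upper, p_lower):
    """Hypsometric relation: layer-mean temperature T = dPhi / (R * ln(p_lower/p_upper)),
    with geopotential in m^2 s^-2 and pressures in any consistent unit."""
    return (phi_upper - phi_lower) / (R_DRY * math.log(p_lower / p_upper))

# hypothetical 1000-500 hPa layer: heights of 100 m and 5500 m (thickness 5400 m)
t_mean = layer_mean_temperature(G0 * 5500.0, G0 * 100.0, 500.0, 1000.0)
```

A 5400 m thickness for the 1000-500 hPa layer corresponds to a layer-mean temperature of roughly 266 K, consistent with a cool mid-latitude column.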
Kelechi, Teresa J; Mueller, Martina; Zapka, Jane G; King, Dana E
2011-11-01
The aim of this randomized clinical trial was to investigate a cryotherapy (cooling) gel wrap applied to lower leg skin affected by chronic venous disorders to determine whether therapeutic cooling improves skin microcirculation. Chronic venous disorders are under-recognized vascular health problems that result in severe skin damage and ulcerations of the lower legs. Impaired skin microcirculation contributes to venous leg ulcer development, thus new prevention therapies should address the microcirculation to prevent venous leg ulcers. Sixty participants (n = 30 per group) were randomized to receive one of two daily 30-minute interventions for four weeks. The treatment group applied the cryotherapy gel wrap around the affected lower leg skin, wore compression, and elevated the legs on a special pillow each evening at bedtime. The standard care group wore compression and elevated the legs only. Laboratory pre- and post-measures included microcirculation measures of skin temperature with a thermistor, blood flow with a laser Doppler flowmeter, and venous refill time with a photoplethysmograph. Data were collected between 2008 and 2009 and analysed using descriptive statistics, paired t-tests or Wilcoxon signed ranks tests, logistic regression analyses, and mixed model analyses. Fifty-seven participants (treatment = 28; standard care = 29) completed the study. The mean age was 62 years; 70% were female and 50% African American. In the final adjusted model, there was a statistically significant decrease in blood flow between the two groups (-6.2 [-11.8; -0.6], P = 0.03). No statistically significant differences were noted in temperature or venous refill time. Study findings suggest that cryotherapy improves blood flow by slowing movement within the microcirculation and thus might potentially provide a therapeutic benefit to prevent leg ulcers. © 2011 Blackwell Publishing Ltd.
Hardell, Elin; Carlberg, Michael; Nordström, Marie; van Bavel, Bert
2010-09-15
Persistent organic pollutants (POPs) are lipophilic chemicals that bioaccumulate. Most of them were restricted or banned in the 1970s and 1980s to protect human health and the environment. The main source for humans is dietary intake of dairy products, meat and fish. Little data exist on changes in the concentrations of POPs in the Swedish population over time. The aims were to study whether the concentrations of polychlorinated biphenyls (PCBs), DDE, hexachlorobenzene (HCB) and chlordanes changed in the Swedish population during 1993-2007, and to identify factors that may influence the concentrations. During 1993-2007, samples from 537 controls in different human cancer studies were collected and analysed. Background information such as body mass index, breast-feeding and parity was assessed by questionnaires. The Wilcoxon rank-sum test was used to analyse the explanatory factors specimen (blood or adipose tissue), gender, BMI, total breast-feeding and parity in relation to POPs. Time trends for POPs were analysed using linear regression analysis, adjusted for specimen, gender, BMI and age. The concentrations decreased for all POPs during 1993-2007. The annual change was statistically significant for the sum of PCBs (-7.2%), HCB (-8.8%), DDE (-13.5%) and the sum of chlordanes (-10.3%). BMI and age were determinants of the concentrations. Cumulative breast-feeding >8 months gave statistically significantly lower concentrations of the sum of PCBs, DDE and the sum of chlordanes. Parity with >2 children yielded a statistically significantly lower sum of PCBs. All the studied POPs decreased during the time period, probably due to restrictions on their use. Copyright 2010 Elsevier B.V. All rights reserved.
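An annual percent change like those reported above falls out of a log-linear regression: regress the log concentration on calendar year and transform the slope. A sketch on a synthetic decaying series (not the Swedish data):

```python
import math

def annual_percent_change(years, concentrations):
    """APC from a log-linear trend: regress ln(concentration) on year and
    transform the slope, APC = (exp(slope) - 1) * 100."""
    logs = [math.log(c) for c in concentrations]
    n = len(years)
    mx = sum(years) / n
    my = sum(logs) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(years, logs))
             / sum((x - mx) ** 2 for x in years))
    return (math.exp(slope) - 1.0) * 100.0

# synthetic series decaying at a constant exponential rate (hypothetical values)
years = list(range(1993, 2008))
conc = [100.0 * math.exp(-0.1 * (y - 1993)) for y in years]
apc = annual_percent_change(years, conc)  # about -9.5% per year
```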
Mukaka, Mavuto; White, Sarah A; Terlouw, Dianne J; Mwapasa, Victor; Kalilani-Phiri, Linda; Faragher, E Brian
2016-07-22
Missing outcomes can seriously impair the ability to make correct inferences from randomized controlled trials (RCTs). Complete case (CC) analysis is commonly used, but it reduces sample size and is perceived to lead to reduced statistical efficiency of estimates while increasing the potential for bias. As multiple imputation (MI) methods preserve sample size, they are generally viewed as the preferred analytical approach. We examined this assumption, comparing the performance of CC and MI methods to determine risk difference (RD) estimates in the presence of missing binary outcomes. We conducted simulation studies of 5000 simulated data sets with 50 imputations of RCTs with one primary follow-up endpoint at different underlying levels of RD (3-25 %) and missing outcomes (5-30 %). For missing at random (MAR) or missing completely at random (MCAR) outcomes, CC method estimates generally remained unbiased and achieved precision similar to or better than MI methods, and high statistical coverage. Missing not at random (MNAR) scenarios yielded invalid inferences with both methods. Effect size estimate bias was reduced in MI methods by always including group membership even if this was unrelated to missingness. Surprisingly, under MAR and MCAR conditions in the assessed scenarios, MI offered no statistical advantage over CC methods. While MI must inherently accompany CC methods for intention-to-treat analyses, these findings endorse CC methods for per protocol risk difference analyses in these conditions. These findings provide an argument for the use of the CC approach to always complement MI analyses, with the usual caveat that the validity of the mechanism for missingness be thoroughly discussed. More importantly, researchers should strive to collect as much data as possible.
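The core claim, that complete-case estimates of a risk difference remain unbiased when outcomes are missing completely at random, can be checked with a small simulation. A stdlib-only sketch with hypothetical trial parameters:

```python
import random

def cc_risk_difference(n_per_arm, p_treat, p_control, p_missing, rng):
    """One simulated two-arm trial with binary outcomes: impose MCAR
    missingness, then estimate the risk difference from complete cases."""
    def observed_rate(p):
        outcomes = [rng.random() < p for _ in range(n_per_arm)]
        kept = [y for y in outcomes if rng.random() >= p_missing]  # MCAR drop
        return sum(kept) / len(kept)
    return observed_rate(p_treat) - observed_rate(p_control)

# hypothetical trial: true risk difference 0.15, 20% of outcomes missing
rng = random.Random(42)
estimates = [cc_risk_difference(200, 0.40, 0.25, 0.20, rng) for _ in range(2000)]
mean_rd = sum(estimates) / len(estimates)  # stays near 0.15 under MCAR
```

Under MNAR, where missingness depends on the unobserved outcome itself, the same estimator would drift away from 0.15, mirroring the invalid inferences the study reports for both methods.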
Mueller, Martina; Zapka, Jane G.; King, Dana E.
2011-01-01
Aim This randomized clinical trial was conducted in 2008 – 2009 to investigate a cryotherapy (cooling) gel wrap applied to lower leg skin affected by chronic venous disorders to determine whether therapeutic cooling improves skin microcirculation. Impaired skin microcirculation contributes to venous leg ulcer development, thus new prevention therapies should address the microcirculation to prevent venous leg ulcers. Data Sources Sixty participants (n = 30 per group) were randomized to receive one of two daily 30-minute interventions for four weeks. The treatment group applied the cryotherapy gel wrap around the affected lower leg skin, wore compression, and elevated the legs on a special pillow each evening at bedtime. The standard care group wore compression and elevated the legs only. Laboratory pre- and post-measures included microcirculation measures of skin temperature with a thermistor, blood flow with a laser Doppler flowmeter, and venous refill time with a photoplethysmograph. Review methods Data were analysed using descriptive statistics, paired t-tests or Wilcoxon signed ranks tests, logistic regression analyses, and mixed model analyses. Results Fifty-seven participants (treatment = 28; standard care = 29) completed the study. The mean age was 62 years; 70% were female and 50% African American. In the final adjusted model, there was a statistically significant decrease in blood flow between the two groups (−6.2[−11.8; −0.6], P = 0.03). No statistically significant differences were noted in temperature or venous refill time. Conclusion Study findings suggest that cryotherapy improves blood flow by slowing movement within the microcirculation and thus might potentially provide a therapeutic benefit to prevent leg ulcers. PMID:21592186
Statistical contact angle analyses; "slow moving" drops on a horizontal silicon-oxide surface.
Schmitt, M; Grub, J; Heib, F
2015-06-01
Sessile drop experiments on horizontal surfaces are commonly used to characterise surface properties in science and in industry. The advancing and receding contact angles are measurable on every solid, although on horizontal surfaces even the notions themselves are critically questioned by some authors. Establishing a standard, reproducible and valid method of measuring and defining specific (advancing/receding) contact angles is an important challenge of surface science. Recently we developed three approaches, by sigmoid fitting, by independent statistical analyses and by dependent statistical analyses, which are practicable for determining specific angles/slopes when the sample surface is inclined. These approaches yield contact angle data that are independent of user skill and operator subjectivity, which is also urgently needed for evaluating dynamic contact angle measurements. In this contribution we show that slightly modified procedures are also applicable for finding specific angles in experiments on horizontal surfaces. As an example, droplets on a flat, freshly cleaned silicon-oxide surface (wafer) are measured dynamically by the sessile drop technique while the volume of the liquid is increased or decreased. The triple points, the time, and the contact angles during the advancing and receding of the drop, obtained by high-precision drop shape analysis, are statistically analysed. As stated in the previous contribution, the procedure is called "slow movement" analysis because of the small distance covered and the dominance of data points with low velocity. Even the smallest variations in velocity, such as the minimal advancing motion during withdrawal of the liquid, are identifiable, which confirms the flatness and chemical homogeneity of the sample surface and the high sensitivity of the presented approaches. Copyright © 2014 Elsevier Inc. All rights reserved.
Gaskin, Cadeyrn J; Happell, Brenda
2013-02-01
Having sufficient power to detect effect sizes of an expected magnitude is a core consideration when designing studies in which inferential statistics will be used. The main aim of this study was to investigate the statistical power in studies published in the International Journal of Mental Health Nursing. From volumes 19 (2010) and 20 (2011) of the journal, studies were analysed for their power to detect small, medium, and large effect sizes, according to Cohen's guidelines. The power of the 23 studies included in this review to detect small, medium, and large effects was 0.34, 0.79, and 0.94, respectively. In 90% of papers, no adjustments for experiment-wise error were reported. With a median of nine inferential tests per paper, the mean experiment-wise error rate was 0.51. A priori power analyses were only reported in 17% of studies. Although effect sizes for correlations and regressions were routinely reported, effect sizes for other tests (χ²-tests, t-tests, ANOVA/MANOVA) were largely absent from the papers. All types of effect sizes were infrequently interpreted. Researchers are strongly encouraged to conduct power analyses when designing studies, and to avoid scattergun approaches to data analysis (i.e. undertaking large numbers of tests in the hope of finding 'significant' results). Because reviewing effect sizes is essential for determining the clinical significance of study findings, researchers would better serve the field of mental health nursing if they reported and interpreted effect sizes. © 2012 The Authors. International Journal of Mental Health Nursing © 2012 Australian College of Mental Health Nurses Inc.
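Power figures like those above come from standard formulas relating sample size, effect size, and alpha. A normal-approximation sketch for a two-sided, two-sample comparison under Cohen's d (the sample sizes below are illustrative):

```python
from statistics import NormalDist

def power_two_sample(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample test for standardized
    effect size d (Cohen's d), using the normal approximation to the t-test."""
    nd = NormalDist()
    z_alpha = nd.inv_cdf(1 - alpha / 2)
    ncp = d * (n_per_group / 2) ** 0.5  # noncentrality for equal group sizes
    return nd.cdf(ncp - z_alpha)

power_medium = power_two_sample(0.5, 64)  # medium effect, n = 64 per group
power_small = power_two_sample(0.2, 64)   # small effect, same n
```

With 64 participants per group, power is about 0.80 for a medium effect but only about 0.20 for a small one, which is the kind of gap the review documents.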
Microplate-based filter paper assay to measure total cellulase activity.
Xiao, Zhizhuang; Storms, Reginald; Tsang, Adrian
2004-12-30
The standard filter paper assay (FPA) published by the International Union of Pure and Applied Chemistry (IUPAC) is widely used to determine total cellulase activity. However, the IUPAC method is not suitable for the parallel analyses of large sample numbers. We describe here a microplate-based method for assaying large sample numbers. To achieve this, we reduced the enzymatic reaction volume to 60 microl from the 1.5 ml used in the IUPAC method. The modified 60-microl format FPA can be carried out in 96-well assay plates. Statistical analyses showed that the cellulase activities of commercial cellulases from Trichoderma reesei and Aspergillus species determined with our 60-microl format FPA were not significantly different from the activities measured with the standard FPA. Our results also indicate that the 60-microl format FPA is quantitative and highly reproducible. Moreover, the addition of excess beta-glucosidase increased the sensitivity of the assay by up to 60%. 2004 Wiley Periodicals, Inc.
Rowell, Candace; Kuiper, Nora; Preud'Homme, Hugues
2016-07-01
The knowledge-base of bottled water leachate is highly contradictory due to varying methodologies and limited multi-elemental and/or molecular analyses; understanding the range of contaminants and their pathways is required. This study determined the leaching potential and leaching kinetics of trace elements, using consistent comprehensive quantitative and semi-quantitative (79 elements total) analyses, and BPA, using isotopic dilution and MEPS pre-concentration with UHPLC-ESI-QTOF. Statistical methods were used to determine confounders and predictors of leaching and human health risk throughout 12 days of UV exposure and after exposure to elevated temperature. Various types of water were used to assess the impact of water quality. Results suggest Sb leaching is primarily dependent upon water quality, not container type. Bottle type is a predictor of elemental leaching for Pb, Ba, Cr, Cu, Mn and Sr; BPA was detected in samples from polycarbonate containers. Health risks from the consumption of bottled water increase after UV exposure. Copyright © 2016 Elsevier Ltd. All rights reserved.
Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI)
Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur
2016-01-01
We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non–expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI’s robustness and sensitivity in capturing useful data relating to the students’ conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. PMID:26903497
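The one-parameter Rasch model used in these analyses gives the probability of a correct response as a logistic function of the gap between respondent ability and item difficulty. A minimal sketch:

```python
import math

def rasch_probability(theta, b):
    """One-parameter (Rasch) IRT model: probability that a respondent of
    ability theta answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

p_match = rasch_probability(0.0, 0.0)  # ability equals difficulty: 0.5
p_hard = rasch_probability(0.0, 2.0)   # much harder item: low probability
```

Fitting the model means estimating the `b` values (and abilities) from response data; items that "vary widely in difficulty", as reported for SRBCI, correspond to a wide spread of `b`.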
Racial disparities in diabetes mortality in the 50 most populous US cities.
Rosenstock, Summer; Whitman, Steve; West, Joseph F; Balkin, Michael
2014-10-01
While studies have consistently shown that in the USA, non-Hispanic Blacks (Blacks) have higher diabetes prevalence, complication and death rates than non-Hispanic Whites (Whites), there are no studies that compare disparities in diabetes mortality across the largest US cities. This study presents and compares Black/White age-adjusted diabetes mortality rate ratios (RRs), calculated using national death files and census data, for the 50 most populous US cities. Relationships between city-level diabetes mortality RRs and 12 ecological variables were explored using bivariate correlation analyses. Multivariate analyses were conducted using negative binomial regression to examine how much of the disparity could be explained by these variables. Blacks had statistically significantly higher mortality rates compared to Whites in 39 of the 41 cities included in analyses, with statistically significant rate ratios ranging from 1.57 (95 % CI: 1.33-1.86) in Baltimore to 3.78 (95 % CI: 2.84-5.02) in Washington, DC. Analyses showed that economic inequality was strongly correlated with the diabetes mortality disparity, driven by differences in White poverty levels. This was followed by segregation. Multivariate analyses showed that adjusting for Black/White poverty alone explained 58.5 % of the disparity. Adjusting for Black/White poverty and segregation explained 72.6 % of the disparity. This study emphasizes the role that inequalities in social and economic determinants, rather than for example poverty on its own, play in Black/White diabetes mortality disparities. It also highlights how the magnitude of the disparity and the factors that influence it can vary greatly across cities, underscoring the importance of using local data to identify context specific barriers and develop effective interventions to eliminate health disparities.
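Age-adjusted rates and rate ratios of the kind compared above are typically computed by direct standardization: weight each group's age-specific rates by a common standard population. A sketch with hypothetical strata (not the city data):

```python
def age_adjusted_rate(deaths, populations, std_pop):
    """Directly standardized rate per 100,000: weight age-specific rates
    by a shared standard population."""
    weighted = sum(d / p * s for d, p, s in zip(deaths, populations, std_pop))
    return 1e5 * weighted / sum(std_pop)

# hypothetical three age strata for two groups, one shared standard population
std = [60000, 30000, 10000]
rate_a = age_adjusted_rate([10, 40, 90], [50000, 40000, 20000], std)
rate_b = age_adjusted_rate([5, 25, 50], [80000, 60000, 30000], std)
rate_ratio = rate_a / rate_b  # age-adjusted rate ratio, group A vs group B
```

Because both groups are weighted by the same standard population, the ratio is not distorted by differing age structures.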
Predictors of persistent pain after total knee arthroplasty: a systematic review and meta-analysis.
Lewis, G N; Rice, D A; McNair, P J; Kluger, M
2015-04-01
Several studies have identified clinical, psychosocial, patient characteristic, and perioperative variables that are associated with persistent postsurgical pain; however, the relative effect of these variables has yet to be quantified. The aim of the study was to provide a systematic review and meta-analysis of predictor variables associated with persistent pain after total knee arthroplasty (TKA). Included studies were required to measure predictor variables prior to or at the time of surgery, include a pain outcome measure at least 3 months post-TKA, and include a statistical analysis of the effect of the predictor variable(s) on the outcome measure. Counts were undertaken of the number of times each predictor was analysed and the number of times it was found to have a significant relationship with persistent pain. Separate meta-analyses were performed to determine the effect size of each predictor on persistent pain. Outcomes from studies implementing uni- and multivariable statistical models were analysed separately. Thirty-two studies involving almost 30 000 patients were included in the review. Preoperative pain was the predictor that most commonly demonstrated a significant relationship with persistent pain across uni- and multivariable analyses. In the meta-analyses of data from univariate models, the largest effect sizes were found for: other pain sites, catastrophizing, and depression. For data from multivariate models, significant effects were evident for: catastrophizing, preoperative pain, mental health, and comorbidities. Catastrophizing, mental health, preoperative knee pain, and pain at other sites are the strongest independent predictors of persistent pain after TKA. © The Author 2014. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Predicting clinical trial results based on announcements of interim analyses
2014-01-01
Background Announcements of interim analyses of a clinical trial convey information about the results beyond the trial’s Data Safety Monitoring Board (DSMB). The amount of information conveyed may be minimal, but the fact that none of the trial’s stopping boundaries has been crossed implies that the experimental therapy is neither extremely effective nor hopeless. Predicting success of the ongoing trial is of interest to the trial’s sponsor, the medical community, pharmaceutical companies, and investors. We determine the probability of trial success by quantifying only the publicly available information from interim analyses of an ongoing trial. We illustrate our method in the context of the National Surgical Adjuvant Breast and Bowel (NSABP) trial, C-08. Methods We simulated trials based on the specifics of the NSABP C-08 protocol that were publicly available. We quantified the uncertainty around the treatment effect using prior weights for the various possibilities in light of other colon cancer studies and other studies of the investigational agent, bevacizumab. We considered alternative prior distributions. Results Subsequent to the trial’s third interim analysis, our predictive probabilities were: that the trial would eventually be successful, 48.0%; would stop for futility, 7.4%; and would continue to completion without statistical significance, 44.5%. The actual trial continued to completion without statistical significance. Conclusions Announcements of interim analyses provide information outside the DSMB’s sphere of confidentiality. This information is potentially helpful to clinical trial prognosticators. ‘Information leakage’ from standard interim analyses such as in NSABP C-08 is conventionally viewed as acceptable even though it may be quite revealing. Whether leakage from more aggressive types of adaptations is acceptable should be assessed at the design stage. PMID:24607270
A Nonparametric Geostatistical Method For Estimating Species Importance
Andrew J. Lister; Rachel Riemann; Michael Hoppus
2001-01-01
Parametric statistical methods are not always appropriate for conducting spatial analyses of forest inventory data. Parametric geostatistical methods such as variography and kriging are essentially averaging procedures, and thus can be affected by extreme values. Furthermore, non-normal distributions violate the assumptions of analyses in which test statistics are...
Hanssen, Denise J C; Naarding, Paul; Collard, Rose M; Comijs, Hannie C; Oude Voshaar, Richard C
2014-10-01
Late-life depression and pain more often co-occur than can be explained by chance. Determinants of pain in late-life depression are unknown, even though knowledge on possible determinants of pain in depression is important for clinical practice. Therefore, the objectives of the present study were 1) to describe pain characteristics of depressed older adults and a nondepressed comparison group, and 2) to explore physical, lifestyle, psychological, and social determinants of acute and chronic pain intensity, disability, and multisite pain in depressed older adults. Data from the Netherlands Study of Depression in Older Persons cohort, consisting of 378 depressed persons, diagnosed according to Diagnostic and Statistical Manual of Mental Disorders, 4th edition criteria, and 132 nondepressed persons aged 60 years and older, were used in a cross-sectional design. Pain characteristics were measured by the Chronic Graded Pain Scale. Multiple linear regression analyses were performed to explore the contribution of physical, lifestyle, psychological, and social determinants to outcomes pain intensity, disability, and the number of pain locations. Depressed older adults more often reported chronic pain and experienced their pain as more intense and disabling compared to nondepressed older adults. Adjusted for demographic, physical, and lifestyle characteristics, multinomial logistic regression analyses showed increased odds ratios (OR) for depression in acute pain (OR 3.010; P=0.005) and chronic pain (OR 4.544, P<0.001). In addition, linear regression analyses showed that acute and chronic pain intensity, disability, and multisite pain were associated with several biopsychosocial determinants, of which anxiety was most pronounced. Further research could focus on the temporal relationship between anxiety, late-life depression, and pain. Copyright © 2014 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.
Determination Of The Activity Space By The Stereometric Method
NASA Astrophysics Data System (ADS)
Deloison, Y.; Crete, N.; Mollard, R.
1980-07-01
To determine the activity space of a sitting subject, it is necessary to go beyond the mere statistical description of morphology and the knowledge of the displacement volume. An analysis of the positions, or variations of the positions, of the diverse segmental elements (arms, hands, lower limbs, etc.) in the course of a given activity is required. Of the various methods used to locate quickly and accurately the spatial positions of anatomical points, stereometry makes it possible to plot the three-dimensional coordinates of any point in space in relation to a fixed trirectangular frame of reference determined by the stereometric measuring device. Thus, regardless of the orientation and posture of the subject, his segmental elements can be easily pinpointed, throughout the experiment, within the space they occupy. Using this method, for a sample of operators seated at an operation station, applying either manual controls or pedals, and belonging to a population statistically defined from the data collected and the analyses produced by the anthropometric study, it is possible to determine a contour line of reach capability marking out the usable working space and, within this working space, a contour line of preferential activity limited in space by the whole range of optimal reach capability of all the subjects.
Effect of the menstrual cycle on voice quality.
Silverman, E M; Zimmer, C H
1978-01-01
The question addressed was whether most young women with no vocal training exhibit premenstrual hoarseness. Spectral (acoustical) analyses of the sustained productions of three vowels produced by 20 undergraduates at ovulation and at premenstruation were rated for degree of hoarseness. Statistical analysis of the data indicated that the typical subject was no more hoarse at premenstruation than at ovulation. To determine whether this finding represented a genuine characteristic of women's voices or a type II statistical error, a systematic replication was undertaken with another sample of 27 undergraduates. The finding replicated that of the original investigation, suggesting that premenstrual hoarseness is a rarely occurring condition among young women with no vocal training. The apparent differential effect of the menstrual cycle on trained as opposed to untrained voices deserves systematic investigation.
Differentiation of chocolates according to the cocoa's geographical origin using chemometrics.
Cambrai, Amandine; Marcic, Christophe; Morville, Stéphane; Sae Houer, Pierre; Bindler, Françoise; Marchioni, Eric
2010-02-10
The determination of the geographical origin of cocoa used to produce chocolate has been assessed through the analysis of the volatile compounds of chocolate samples. The analysis of the volatile content and its statistical processing by multivariate analyses tended to form independent groups for both Africa and Madagascar, even if some of the chocolate samples analyzed appeared in a mixed zone together with those from America. This analysis also allowed a clear separation between Caribbean chocolates and those from other origins. Eight compounds (such as linalool or (E,E)-2,4-decadienal) characteristic of chocolate's different geographical origins were also identified. The method described in this work (hydrodistillation, GC analysis, and statistical treatment) may improve the control of the geographical origin of chocolate during its long production process.
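The multivariate grouping step this abstract describes can be sketched as a principal component analysis of volatile-compound profiles. The peak areas, sample labels, and the choice of PCA itself (the abstract says only "multivariate analyses") are illustrative assumptions, not the study's data or method:

```python
import numpy as np

# Hypothetical volatile-compound peak areas (rows: chocolate samples,
# columns: compounds such as linalool, (E,E)-2,4-decadienal, ...).
# Values are invented for illustration only.
X = np.array([
    [5.1, 0.2, 1.3],   # Africa
    [4.9, 0.3, 1.1],   # Africa
    [1.2, 3.8, 0.4],   # Caribbean
    [1.0, 4.1, 0.5],   # Caribbean
    [2.9, 2.0, 2.8],   # America ("mixed zone")
], dtype=float)

# Autoscale (mean-centre, unit variance) -- a standard chemometric pre-treatment.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA via SVD: scores are the sample coordinates in principal-component space.
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = U * s
explained = s**2 / np.sum(s**2)   # fraction of variance captured per component
```

Plotting the first two columns of `scores` would reveal the origin clusters the abstract describes.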
Use of model calibration to achieve high accuracy in analysis of computer networks
Frogner, Bjorn; Guarro, Sergio; Scharf, Guy
2004-05-11
A system and method are provided for creating a network performance prediction model, and calibrating the prediction model, through application of network load statistical analyses. The method includes characterizing the measured load on the network, which may include background load data obtained over time, and may further include directed load data representative of a transaction-level event. Probabilistic representations of load data are derived to characterize the statistical persistence of the network performance variability and to determine delays throughout the network. The probabilistic representations are applied to the network performance prediction model to adapt the model for accurate prediction of network performance. Certain embodiments of the method and system may be used for analysis of the performance of a distributed application characterized as data packet streams.
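The probabilistic characterization of measured load can be illustrated with a toy sketch: empirical quantiles of simulated background-load samples feed a simple utilisation-to-delay curve. The lognormal load model, the M/M/1-style delay formula, and all numbers are assumptions for illustration, not the patented method:

```python
import random

random.seed(42)
# Hypothetical background-load samples (e.g., link utilisation fractions)
# standing in for load measured on the network over time.
load = sorted(random.lognormvariate(-1.0, 0.5) for _ in range(10_000))

def quantile(p):
    """Empirical quantile of the sorted load sample -- one simple
    'probabilistic representation' of load variability."""
    return load[int(p * (len(load) - 1))]

p50, p95 = quantile(0.50), quantile(0.95)

def delay_ms(rho):
    """Toy M/M/1-style delay at utilisation rho (unit service time 1 ms)."""
    return 1.0 / (1.0 - min(rho, 0.99))

typical_delay = delay_ms(p50)   # delay under median load
worst_delay = delay_ms(p95)     # delay under 95th-percentile load
```

Feeding such quantiles through the delay model is one way a prediction model can be "adapted" to the measured statistical persistence of load.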
1993-08-01
subtitled "Simulation Data," consists of detailed information on the design parameter variations tested, subsequent statistical analyses conducted...used with confidence during the design process. The data quality can be examined in various forms such as statistical analyses of measure of merit data...merit, such as time to capture or maximum pitch rate, can be calculated from the simulation time history data. Statistical techniques are then used
Statistical Modelling of the Soil Dielectric Constant
NASA Astrophysics Data System (ADS)
Usowicz, Boguslaw; Marczewski, Wojciech; Bogdan Usowicz, Jerzy; Lipiec, Jerzy
2010-05-01
The dielectric constant of soil is a physical property that is very sensitive to water content. It underlies several electrical techniques for determining water content, both direct (TDR, FDR, and others based on electrical conductance and/or capacitance) and indirect RS (Remote Sensing) methods. This work is devoted to a statistical approach to modelling the dielectric constant as a property that accounts for a wide range of soil compositions, porosities, and mass densities across the unsaturated water-content range. Usually, such models are fitted to a few particular soil types, so changing the soil type requires switching to another model or re-parametrizing the soil compounds; this makes it difficult to compare and transfer results between models. The model presented here was developed as a generic representation of soil: a hypothetical mixture of spheres, each representing a soil fraction in its proper phase state. The model generates a serial-parallel mesh of conductive and capacitive paths, which is analysed for its total conductive or capacitive property. The model was first developed to determine thermal conductivity and is now extended to the dielectric constant by analysing the capacitive mesh. The analysis proceeds by statistical means obeying the physical laws governing the serial-parallel branching of the representative electrical mesh. The physical relevance of the analysis is established electrically, but the definition of the electrical mesh is controlled statistically: by the parametrization of the compound fractions, by the number of representative spheres per unitary volume per fraction, and by the number of fractions. In this way the model can cover the properties of nearly all possible soil types and all phase states, within recognition of the Lorenz and Knudsen conditions.
In effect, the model can generate a hypothetical representative of any soil type, which enables clear comparison with results from other soil-type-dependent models. The paper focuses on properly representing the range of porosity found in common soils. This work aims at implementing the statistical-physical model of the dielectric constant in CMEM (Community Microwave Emission Model), applicable to SMOS (Soil Moisture and Ocean Salinity ESA Mission) data. The model's input accepts soil-fraction definitions in common physical measures and, unlike other empirical models, does not need calibrating. It does not depend on recognizing the soil by type; instead it offers control of accuracy through proper determination of the soil compound fractions. SMOS employs CMEM driven only by the sand-clay-silt composition, whereas common soil databases are split into tens or even hundreds of soil types depending on the region. We hope that determining the three-element sand-clay-silt composition in a few fractions may help resolve the question of the relevance of soil data as input to CMEM for SMOS. At present, traditionally employed soil types are converted into sand-clay-silt compounds but hardly capture other specific properties such as porosity. The model should bring advantageous effects in validating SMOS observation data, and this is the aim pursued in Cal/Val project 3275, in the campaigns of the SVRT (SMOS Validation and Retrieval Team). Acknowledgements. This work was funded in part by the PECS - Programme for European Cooperating States, No. 98084 "SWEX/R - Soil Water and Energy Exchange/Research".
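A minimal sketch of the serial-parallel capacitive mesh idea, under strong simplifying assumptions (three phases with approximate handbook permittivities, a fixed path length, and uniform random sphere assignment); the real model's statistical parametrization is far richer:

```python
import random

random.seed(0)
# Approximate relative permittivities of the soil phases.
EPS = {"solid": 5.0, "water": 80.0, "air": 1.0}

def sample_phase(theta, porosity):
    """Pick the phase of one representative sphere from volumetric fractions.
    theta: volumetric water content; porosity: total pore fraction."""
    r = random.random()
    if r < 1.0 - porosity:
        return "solid"
    return "water" if r < 1.0 - porosity + theta else "air"

def effective_eps(theta, porosity, n_paths=2000, n_series=5):
    """Statistical serial-parallel mesh: each path is a series chain of
    capacitive spheres (reciprocals add); paths combine in parallel
    (averaged), giving a bulk effective permittivity."""
    total = 0.0
    for _ in range(n_paths):
        inv = sum(1.0 / EPS[sample_phase(theta, porosity)] for _ in range(n_series))
        total += n_series / inv   # effective permittivity of one series path
    return total / n_paths        # parallel combination over all paths

eps_dry = effective_eps(theta=0.05, porosity=0.4)
eps_wet = effective_eps(theta=0.35, porosity=0.4)
```

Even this crude mesh reproduces the qualitative behaviour the abstract relies on: the effective dielectric constant rises steeply with water content.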
Chung, Sang M; Lee, David J; Hand, Austin; Young, Philip; Vaidyanathan, Jayabharathi; Sahajwalla, Chandrahas
2015-12-01
The study evaluated whether the renal function decline rate per year with age in adults varies based on two primary statistical analyses: cross-section (CS), using one observation per subject, and longitudinal (LT), using multiple observations per subject over time. A total of 16628 records (3946 subjects; age range 30-92 years) of creatinine clearance and relevant demographic data were used. On average, four samples per subject were collected for up to 2364 days (mean: 793 days). A simple linear regression and random coefficient models were selected for CS and LT analyses, respectively. The renal function decline rates per year were 1.33 and 0.95 ml/min/year for CS and LT analyses, respectively, and were slower when the repeated individual measurements were considered. The study confirms that rates are different based on statistical analyses, and that a statistically robust longitudinal model with a proper sampling design provides reliable individual as well as population estimates of the renal function decline rates per year with age in adults. In conclusion, our findings indicated that one should be cautious in interpreting the renal function decline rate with aging information because its estimation was highly dependent on the statistical analyses. From our analyses, a population longitudinal analysis (e.g. random coefficient model) is recommended if individualization is critical, such as a dose adjustment based on renal function during a chronic therapy. Copyright © 2015 John Wiley & Sons, Ltd.
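The cross-sectional versus longitudinal contrast the abstract reports can be reproduced on simulated data: a pooled cross-sectional fit picks up a between-cohort age gradient, while per-subject slopes (a crude stand-in for a random-coefficient model) recover the within-person decline. All numbers are invented for illustration, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
# 200 simulated subjects, 4 creatinine-clearance observations each.
# The cross-sectional age gradient (cohort effect, -1.5/yr) is steeper
# than the true within-person decline (-1.0/yr), as in the study.
n_subj, n_obs = 200, 4
age0 = rng.uniform(30, 80, n_subj)
baseline = 130 - 1.5 * age0 + rng.normal(0, 15, n_subj)
true_slope = -1.0

ages = age0[:, None] + np.arange(n_obs) * 2.0          # visits 2 years apart
clcr = (baseline[:, None] + true_slope * (ages - age0[:, None])
        + rng.normal(0, 3, (n_subj, n_obs)))

# Cross-sectional (CS): one observation per subject, pooled regression on age.
cs_slope = np.polyfit(ages[:, 0], clcr[:, 0], 1)[0]

# Longitudinal (LT): slope within each subject, then averaged.
lt_slope = np.mean([np.polyfit(ages[i], clcr[i], 1)[0] for i in range(n_subj)])
```

The CS slope is biased toward the cohort gradient; only the LT analysis recovers the individual decline rate, which is the abstract's central point.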
Ganasegeran, Kurubaran; Selvaraj, Kamaraj; Rashid, Abdul
2017-08-01
The six-item Confusion, Hubbub and Order Scale (CHAOS-6) has been validated as a reliable tool to measure levels of household disorder. We aimed to investigate the goodness of fit and reliability of a new Malay version of the CHAOS-6. The original English version of the CHAOS-6 underwent forward-backward translation into the Malay language. The finalised Malay version was administered to 105 myocardial infarction survivors in a Malaysian cardiac health facility. We performed confirmatory factor analyses (CFAs) using structural equation modelling. A path diagram and fit statistics were yielded to determine the Malay version's validity. Composite reliability was tested to determine the scale's reliability. All 105 myocardial infarction survivors participated in the study. The CFA yielded a six-item, one-factor model with excellent fit statistics. Composite reliability for the single-factor CHAOS-6 was 0.65, confirming that the scale is reliable for Malay speakers. The Malay version of the CHAOS-6 was reliable and showed the best fit statistics for our study sample. We thus offer a simple, brief, validated, reliable and novel instrument to measure chaos, the Skala Kecelaruan, Keriuhan & Tertib Terubahsuai (CHAOS-6), for the Malaysian population.
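Composite reliability for a one-factor model can be computed from standardised loadings with Raykov's formula. The loadings below are hypothetical (chosen so the result lands near the reported 0.65); the paper reports only the composite figure:

```python
# Hypothetical standardised factor loadings for the six CHAOS-6 items.
loadings = [0.55, 0.48, 0.50, 0.42, 0.46, 0.52]

def composite_reliability(lams):
    """Composite reliability for a one-factor congeneric model:
    CR = (sum lambda)^2 / ((sum lambda)^2 + sum error variances),
    with error variance 1 - lambda^2 for standardised items."""
    s = sum(lams)
    errors = sum(1.0 - l * l for l in lams)
    return s * s / (s * s + errors)

cr = composite_reliability(loadings)
```

Higher, more homogeneous loadings push CR upward, which is why six modest loadings can still yield an acceptable composite figure.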
Risk factors for persistent gestational trophoblastic neoplasia.
Kuyumcuoglu, Umur; Guzel, Ali Irfan; Erdemoglu, Mahmut; Celik, Yusuf
2011-01-01
This retrospective study evaluated the risk factors for persistent gestational trophoblastic disease (GTN) and determined their odds ratios. This study included 100 cases with GTN admitted to our clinic. Possible risk factors recorded were age, gravidity, parity, size of the neoplasia, and beta-human chorionic gonadotropin levels (beta-hCG) before and after the procedure. Statistical analyses consisted of the independent sample t-test and logistic regression using the statistical package SPSS ver. 15.0 for Windows (SPSS, Chicago, IL, USA). Twenty of the cases had persistent GTN, and the differences between these and the others cases were evaluated. The size of the neoplasia and histopathological type of GTN had no statistical relationship with persistence, whereas age, gravidity, and beta-hCG levels were significant risk factors for persistent GTN (p < 0.05). The odds ratios (95% confidence interval (CI)) for age, gravidity, and pre- and post-evacuation beta-hCG levels determined using logistic regression were 4.678 (0.97-22.44), 7.315 (1.16-46.16), 2.637 (1.41-4.94), and 2.339 (1.52-3.60), respectively. Patient age, gravidity, and beta-hCG levels were risk factors for persistent GTN, whereas the size of the neoplasia and histopathological type of GTN were not significant risk factors.
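Odds ratios and their confidence intervals, as reported in this abstract, come from exponentiating logistic-regression coefficients. A self-contained sketch on simulated data (the coefficients and sample size are invented, and a hand-rolled Newton-Raphson fit stands in for SPSS):

```python
import numpy as np

rng = np.random.default_rng(7)
# Simulated cohort: a standardised risk factor (e.g., age) raises the
# odds of persistent GTN. True coefficients are illustrative only.
n = 500
x = rng.normal(0, 1, n)
logit = -1.5 + 0.8 * x
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # binary outcome

X = np.column_stack([np.ones(n), x])

# Newton-Raphson maximum-likelihood fit of the logistic model.
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    H = X.T @ (X * W[:, None])                  # observed information
    beta += np.linalg.solve(H, X.T @ (y - p))

se = np.sqrt(np.diag(np.linalg.inv(H)))
odds_ratio = np.exp(beta[1])                    # OR per 1-SD increase
ci_low = np.exp(beta[1] - 1.96 * se[1])         # 95% Wald interval
ci_high = np.exp(beta[1] + 1.96 * se[1])
```

An interval like the paper's 4.678 (0.97-22.44) that straddles 1 would fail this Wald test of significance despite a large point estimate.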
Baek, Hye Jin; Lee, Jeong Hyun; Lim, Hyun Kyung; Lee, Ha Young; Baek, Jung Hwan
2014-11-01
To determine the optimal clinical and CT findings for differentiating Kikuchi's disease (KD) and tuberculous lymphadenitis (TB) in patients presenting with cervical lymphadenopathy. From 2006 to 2010, 87 consecutive patients who were finally diagnosed with KD or TB were enrolled. Two radiologists performed independent analysis of contrast-enhanced neck CT images with regard to the involvement pattern, nodal or perinodal changes, and evidence of previous infection. Significant clinical and CT findings of KD were determined by statistical analyses. Of the 87 patients, 27 (31%) were classified as having KD and 60 (69%) as having TB. Statistically significant findings in KD patients were younger age, presence of fever, involvement of ≥5 nodal levels or the bilateral neck, no or minimal nodal necrosis, marked perinodal infiltration, and no evidence of upper lung lesion or mediastinal lymphadenopathy. The presence of four or more statistically significant clinical and CT findings of KD had the largest area under the receiver-operating characteristic curve (Az = 0.861; 95% confidence interval 0.801, 0.909), with a sensitivity of 89% and specificity of 83%. CT can be a helpful tool for differentiating KD from TB, especially when it is combined with the clinical findings.
Myers, E A; Rodríguez-Robles, J A; Denardo, D F; Staub, R E; Stropoli, A; Ruane, S; Burbrink, F T
2013-11-01
Phylogeographic inference can determine the timing of population divergence, historical demographic processes, patterns of migration, and when extended to multiple species, the history of communities. Single-locus analyses can mislead interpretations of the evolutionary history of taxa and comparative analyses. It is therefore important to revisit previous single-locus phylogeographic studies, particularly those that have been used to propose general patterns for regional biotas and the processes responsible for generating inferred patterns. Here, we employ a multilocus statistical approach to re-examine the phylogeography of Lampropeltis zonata. Using nonparametric and Bayesian species delimitation, we determined that there are two well-supported species within L. zonata. Ecological niche modelling supports the delimitation of these taxa, suggesting that the two species inhabit distinct climatic environments. Gene flow between the two taxa is low and appears to occur unidirectionally. Further, our data suggest that gene flow was mediated by females, a rare pattern in snakes. In contrast to previous analyses, we determined that the divergence between the two lineages occurred in the late Pliocene (c. 2.07 Ma). Spatially and temporally, the divergence of these lineages is associated with the inundation of central California by the Monterey Bay. The effective population sizes of the two species appear to have been unaffected by Pleistocene glaciation. Our increased sampling of loci for L. zonata, combined with previously published multilocus analyses of other sympatric species, suggests that previous conclusions reached by comparative phylogeographic studies conducted within the California Floristic Province should be reassessed. © 2013 John Wiley & Sons Ltd.
Estimation versus falsification approaches in sport and exercise science.
Wilkinson, Michael; Winter, Edward M
2018-05-22
There has been a recent resurgence in debate about methods for statistical inference in science. The debate addresses statistical concepts and their impact on the value and meaning of analyses' outcomes. In contrast, the philosophical underpinnings of these approaches, and the extent to which analytical tools match the philosophical goals of the scientific method, have received less attention. This short piece considers application of the scientific method to "what-is-the-influence-of-x-on-y" type questions characteristic of sport and exercise science. We consider applications and interpretations of estimation-based versus falsification-based statistical approaches and their value in addressing how much x influences y, and in measurement-error and method-agreement settings. We compare estimation using magnitude-based inference (MBI) with falsification using null hypothesis significance testing (NHST), and highlight the limited value of both falsification and NHST for addressing problems in sport and exercise science. We recommend adopting an estimation approach, expressing the uncertainty of effects of x on y and their practical/clinical value against pre-determined effect magnitudes using MBI.
Comparison of Data Quality of NOAA's ISIS and SURFRAD Networks to NREL's SRRL-BMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderberg, M.; Sengupta, M.
2014-11-01
This report provides analyses of broadband solar radiometric data quality for the National Oceanic and Atmospheric Administration's Integrated Surface Irradiance Study and Surface Radiation Budget Network (SURFRAD) solar measurement networks. The data quality of these networks is compared to that of the National Renewable Energy Laboratory's Solar Radiation Research Laboratory Baseline Measurement System (SRRL-BMS), at native data resolutions and as hourly averages, for the years 2002 through 2013. This report describes the solar radiometric data quality testing and flagging procedures and the method used to determine and tabulate data quality statistics. Monthly data quality statistics for each network were plotted by year against the statistics for the SRRL-BMS. Some of the plots are presented in the body of the report, but most are in the appendix. These plots indicate that the overall solar radiometric data quality of the SURFRAD network is superior to that of the Integrated Surface Irradiance Study network and can be comparable to that of the SRRL-BMS.
Using conventional F-statistics to study unconventional sex-chromosome differentiation.
Rodrigues, Nicolas; Dufresnes, Christophe
2017-01-01
Species with undifferentiated sex chromosomes emerge as key organisms to understand the astonishing diversity of sex-determination systems. Whereas new genomic methods are widening opportunities to study these systems, the difficulty of separately characterizing their X and Y homologous chromosomes poses limitations. Here we demonstrate that two simple F-statistics calculated from sex-linked genotypes, namely the genetic distance (Fst) between sexes and the inbreeding coefficient (Fis) in the heterogametic sex, can be used as reliable proxies to compare sex-chromosome differentiation between populations. We correlated these metrics using published microsatellite data from two frog species (Hyla arborea and Rana temporaria), and show that they relate closely to the overall amount of X-Y differentiation in populations. However, the fits for individual loci appear highly variable, suggesting that a dense genetic coverage will be needed for inferring fine-scale patterns of differentiation along sex chromosomes. These F-statistics, which require little sampling, significantly facilitate population analyses of sex chromosomes.
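Both F-statistics can be computed directly from sex-linked genotypes. A toy sketch with invented microsatellite data: in an XY system, X-Y differentiation shows up as allele-frequency divergence between the sexes (positive Fst) and a heterozygote excess, i.e. negative Fis, in males:

```python
# Invented genotypes at one sex-linked microsatellite (alleles as integers).
# Allele 1 segregates on the X; allele 2 rides the Y, so XY males are
# mostly heterozygous -- the pattern these F-statistics pick up.
females = [(1, 1), (1, 1), (1, 3), (1, 1), (3, 3), (1, 1)]
males = [(1, 2), (1, 2), (3, 2), (1, 2), (1, 2), (1, 1)]

def allele_freqs(genos):
    alleles = [a for g in genos for a in g]
    return {a: alleles.count(a) / len(alleles) for a in set(alleles)}

def expected_het(freqs):
    return 1.0 - sum(p * p for p in freqs.values())

pf, pm = allele_freqs(females), allele_freqs(males)
pbar = {a: (pf.get(a, 0) + pm.get(a, 0)) / 2 for a in set(pf) | set(pm)}

# Fst between the sexes: deficit of within-sex expected heterozygosity
# relative to the pooled population.
hs = (expected_het(pf) + expected_het(pm)) / 2
ht = expected_het(pbar)
fst = (ht - hs) / ht

# Fis in the heterogametic sex: observed vs expected heterozygosity in males.
hobs_m = sum(a != b for a, b in males) / len(males)
fis_males = 1.0 - hobs_m / expected_het(pm)
```

With no X-Y differentiation, Fst tends toward 0 and male Fis toward 0; the signatures above grow as the sex chromosomes diverge.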
Chae, Su Jin; Jeong, So Mi; Chung, Yoon-Sok
2017-09-01
This study is aimed at identifying the relationships between medical school students' academic burnout, empathy, and calling, and determining whether their calling has a mediating effect on the relationship between academic burnout and empathy. A mixed method study was conducted. One hundred twenty-seven medical students completed a survey. Scales measuring academic burnout, medical students' empathy, and calling were utilized. For statistical analysis, correlation analysis, descriptive statistics analysis, and hierarchical multiple regression analyses were conducted. For qualitative approach, eight medical students participated in a focus group interview. The study found that empathy has a statistically significant, negative correlation with academic burnout, while having a significant, positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. This result demonstrates that calling is a key variable that mediates the relationship between medical students' academic burnout and empathy. As such, this study provides baseline data for an education that could improve medical students' empathy skills.
Cascioli, Vincenzo; Liu, Zhuofu; Heusch, Andrew; McCarthy, Peter W
2016-05-01
This study presents a method for objectively measuring in-chair movement (ICM) that shows correlation with subjective ratings of comfort and discomfort. Employing a cross-over controlled, single blind design, healthy young subjects (n = 21) sat for 18 min on each of the following surfaces: contoured foam, straight foam and wood. Force sensitive resistors attached to the sitting interface measured the relative movements of the subjects during sitting. The purpose of this study was to determine whether ICM could statistically distinguish between each seat material, including two with subtle design differences. In addition, this study investigated methodological considerations, in particular appropriate threshold selection and sitting duration, when analysing objective movement data. ICM appears to be able to statistically distinguish between similar foam surfaces, as long as appropriate ICM thresholds and sufficient sitting durations are present. A relationship between greater ICM and increased discomfort, and lesser ICM and increased comfort was also found. Copyright © 2016. Published by Elsevier Ltd.
Lin, Yuning; Chen, Ziqian; Yang, Xizhang; Zhong, Qun; Zhang, Hongwen; Yang, Li; Xu, Shangwen; Li, Hui
2013-12-01
The aim of this study is to evaluate the diagnostic performance of multidetector CT angiography (CTA) in depicting bronchial and non-bronchial systemic arteries in patients with haemoptysis and to assess whether this modality helps determine the feasibility of angiographic embolisation. Fifty-two patients with haemoptysis between January 2010 and July 2011 underwent both preoperative multidetector CTA and digital subtraction angiography (DSA) imaging. Diagnostic performance of CTA in depicting arteries causing haemoptysis was assessed on a per-patient and a per-artery basis. The feasibility of the endovascular treatment evaluated by CTA was analysed. Sensitivity, specificity, and positive and negative predictive values for those analyses were determined. Fifty patients were included in the artery-presence-number analysis. In the per-patient analysis, neither CTA (P = 0.25) nor DSA (P = 1.00) showed a statistically significant difference in the detection of arteries causing haemoptysis. The sensitivity, specificity, and positive and negative predictive values were 94%, 100%, 100%, and 40%, respectively, for the presence of pathologic arteries evaluated by CTA, and 98%, 100%, 100%, and 67%, respectively, for DSA. On the per-artery basis, CTA correctly identified 97% (107/110). Fifty-two patients were included in the feasibility analysis. The performance of CTA in predicting the feasibility of angiographic embolisation was not statistically different from the treatment performed (P = 1.00). The sensitivity, specificity, and positive and negative predictive values were 96%, 80%, 98% and 67%, respectively, for CTA. Multidetector CTA is an accurate imaging method in depicting the presence and number of arteries causing haemoptysis. This modality is also useful for determining the feasibility of angiographic embolisation for haemoptysis. © 2013 The Authors. Journal of Medical Imaging and Radiation Oncology © 2013 The Royal Australian and New Zealand College of Radiologists.
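The four reported diagnostic metrics follow from a 2x2 table. The counts below are back-calculated to be consistent with the per-patient CTA figures (sensitivity 94%, specificity 100%, PPV 100%, NPV 40%); the paper does not publish this raw table, so they are illustrative:

```python
# Hypothetical 2x2 table consistent with the reported per-patient CTA results.
tp, fp, fn, tn = 47, 0, 3, 2

sensitivity = tp / (tp + fn)   # detected among patients with pathologic arteries
specificity = tn / (tn + fp)   # correctly negative among patients without
ppv = tp / (tp + fp)           # reliability of a positive CTA call
npv = tn / (tn + fn)           # reliability of a negative CTA call
```

The table makes the asymmetry visible: with only two true negatives, a handful of false negatives is enough to drag the NPV down to 40% while specificity and PPV stay at 100%.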
Energy requirements in preschool-age children with cerebral palsy.
Walker, Jacqueline L; Bell, Kristie L; Boyd, Roslyn N; Davies, Peter S W
2012-12-01
There is a paucity of data concerning the energy requirements (ERs) of preschool-age children with cerebral palsy (CP), the knowledge of which is essential for early nutritional management. We aimed to determine the ERs for preschool-age children with CP in relation to functional ability, motor type, and distribution and compared with typically developing children (TDC) and published estimation equations. Thirty-two children with CP (63% male) of all functional abilities, motor types, and distributions and 16 TDC (63% male) aged 2.9-4.4 y participated in this study. The doubly labeled water method was used to determine ERs. Statistical analyses were conducted by 1-factor ANOVA and post hoc Tukey honestly significant difference tests, independent and paired t tests, Bland and Altman analyses, correlations, and multivariable regressions. As a population, children with CP had significantly lower ERs than did TDC (P < 0.05). No significant difference in ERs was found between ambulant children and TDC. Marginally ambulant and nonambulant children had ERs that were ∼18% lower than those of ambulant children and 31% lower than those of TDC. A trend toward lower ERs with greater numbers of limbs involved was observed. The influence of motor type could not be determined statistically. Published equations substantially underestimated ERs in the nonambulant children by ∼22%. In preschool-age children with CP, ERs decreased as ambulatory status declined and more limbs were involved. The greatest predictor of ERs was fat-free mass, then ambulatory status. Future research should build on the information presented to expand the knowledge base regarding ERs in children with CP. This trial was registered with the Australian New Zealand Clinical Trials Registry as ACTRN 12612000686808.
Geographic variation of stable isotopes in African elephant ivory
NASA Astrophysics Data System (ADS)
Ziegler, S.; Merker, S.; Jacob, D.
2012-04-01
In 1989, the international community listed the African elephant in Appendix I of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) thus prohibiting commercial ivory trade. Recent surveillance data show that the illegal trade in ivory has been growing worldwide. Long-term preservation of many of the African elephant populations can be supported with a control mechanism that helps with the implementation of remedial conservation action. Therefore, setting up a reference database that predicts the origin of ivory specimens can assist in determining smuggling routes and the provenance of illegal ivory. Our research builds on earlier work to seek an appropriate method for determining the area of origin for individual tusks. Several researchers have shown that the provenance of elephant ivory can be traced by its isotopic composition, but this is the first attempt to produce an integrated isotopic reference database of elephant ivory provenance. We applied a combination of various routine geochemical analyses to measure the stable isotope ratios of hydrogen, carbon, nitrogen, oxygen, and sulphur. Up to now, we analysed 606 ivory samples of known geographical origin from African range states, museums and private collections, comprising 22 African elephant range states. The isotopic measurements were superimposed with data layers from vegetation, geology and climate. A regression function for the isotope composition of the water isotopes in precipitation and collagen in ivory was developed to overcome the problem of imprecise origin of some of the sampled material. Multivariate statistics, such as nearest-neighbour and discriminant analyses, were applied to eventually allow a statistical determination of the provenance of ivory of unknown origin. Our results suggest that the combination of isotopic parameters has the potential to provide predictable and complementary markers for estimating the origin of seized elephant ivory.
Isotopic analysis of uranium in natural waters by alpha spectrometry
Edwards, K.W.
1968-01-01
A method is described for the determination of U234/U238 activity ratios for uranium present in natural waters. The uranium is coprecipitated from solution with aluminum phosphate, extracted into ethyl acetate, further purified by ion exchange, and finally electroplated on a titanium disc for counting. The individual isotopes are determined by measurement of the alpha-particle energy spectrum using a high resolution low-background alpha spectrometer. Overall chemical recovery of about 90 percent and a counting efficiency of 25 percent allow analyses of water samples containing as little as 0.10 µg/l of uranium. The accuracy of the method is limited, on most samples, primarily by counting statistics.
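The closing claim, that accuracy is limited primarily by counting statistics, can be made concrete: for Poisson-distributed peak counts, relative variances add in a ratio. The peak counts below are hypothetical, not from the paper:

```python
import math

# Hypothetical net counts in the U-234 and U-238 alpha peaks.
n234, n238 = 850, 900

ratio = n234 / n238   # U234/U238 activity ratio (equal counting efficiency)

# Poisson counting statistics: the relative variance of a count N is 1/N,
# and relative variances add for a ratio of independent counts.
rel_sigma = math.sqrt(1.0 / n234 + 1.0 / n238)
sigma = ratio * rel_sigma
```

With fewer than a thousand counts per peak, the ratio carries a relative uncertainty of several percent; longer counting times shrink it only as the square root of the counts.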
Parton distributions in the LHC era
NASA Astrophysics Data System (ADS)
Del Debbio, Luigi
2018-03-01
Analyses of LHC (and other!) experiments require robust and statistically accurate determinations of the structure of the proton, encoded in the parton distribution functions (PDFs). The standard description of hadronic processes relies on factorization theorems, which allow a separation of process-dependent short-distance physics from the universal long-distance structure of the proton. Traditionally the PDFs are obtained from fits to experimental data. However, understanding the long-distance properties of hadrons is a nonperturbative problem, and lattice QCD can play a role in providing useful results from first principles. In this talk we compare the different approaches used to determine PDFs, and try to assess the impact of existing, and future, lattice calculations.
Behr, Guilherme A; Patel, Jay P; Coote, Marg; Moreira, Jose C F; Gelain, Daniel P; Steiner, Meir; Frey, Benicio N
2017-05-01
Previous studies have reported that salivary concentrations of certain hormones correlate with their respective serum levels. However, most of these studies did not control for potential blood contamination in saliva. In the present study, we developed a statistical method to test the amount of blood contamination that needs to be avoided in saliva samples for the following hormones: cortisol, estradiol, progesterone, testosterone and oxytocin. Saliva and serum samples were collected from 38 healthy, medication-free women (mean age=33.8±7.3yr.; range=19-45). Serum and salivary hormonal levels and the amount of transferrin in saliva samples were determined using enzyme immunoassays. Salivary transferrin levels did not correlate with salivary cortisol or estradiol (up to 3mg/dl), but they were positively correlated with salivary testosterone, progesterone and oxytocin (p<0.05). After controlling for blood contamination, only cortisol (r=0.65, P<0.001) and progesterone levels (r=0.57, P=0.002) displayed a positive correlation between saliva and serum. Our analyses suggest that transferrin levels higher than 0.80, 0.92 and 0.64mg/dl should be avoided for testosterone, progesterone and oxytocin salivary analyses, respectively. We recommend that salivary transferrin be measured in research involving salivary hormones in order to determine the level of blood contamination that might affect specific hormonal salivary concentrations. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
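The saliva-serum correlations reported above (e.g. r=0.65 for cortisol) are Pearson coefficients. A minimal stdlib-only sketch of that computation follows; the function name is ours, and no claim is made about the authors' actual software:

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired samples,
    e.g. salivary vs. serum hormone concentrations."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den
```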
DOE Office of Scientific and Technical Information (OSTI.GOV)
Josse, Florent; Lefebvre, Yannick; Todeschini, Patrick
2006-07-01
Assessing the structural integrity of a nuclear Reactor Pressure Vessel (RPV) subjected to pressurized-thermal-shock (PTS) transients is extremely important to safety. In addition to conventional deterministic calculations to confirm RPV integrity, Electricite de France (EDF) carries out probabilistic analyses. Probabilistic analyses are interesting because some key variables, albeit conventionally taken at conservative values, can be modeled more accurately through statistical variability. One variable which significantly affects RPV structural integrity assessment is cleavage fracture initiation toughness. The reference fracture toughness method currently in use at EDF is the RCCM and ASME Code lower-bound K{sub IC} based on the indexing parameter RT{sub NDT}. However, in order to quantify the toughness scatter for probabilistic analyses, the master curve method is being analyzed at present. Furthermore, the master curve method is a direct means of evaluating fracture toughness based on K{sub JC} data. In the framework of the master curve investigation undertaken by EDF, this article deals with the following two statistical items: building a master curve from an extract of a fracture toughness dataset (from the European project 'Unified Reference Fracture Toughness Design curves for RPV Steels') and controlling statistical uncertainty for both mono-temperature and multi-temperature tests. Concerning the first point, master curve temperature dependence is empirical in nature. To determine the 'original' master curve, Wallin postulated that a unified description of fracture toughness temperature dependence for ferritic steels is possible, and used a large number of data corresponding to nuclear-grade pressure vessel steels and welds. Our working hypothesis is that some ferritic steels may behave in slightly different ways.
Therefore we focused exclusively on the basic French reactor vessel metals of types A508 Class 3 and A533 Grade B Class 1, taking the sampling level and direction into account as well as the test specimen type. As for the second point, the emphasis is placed on the uncertainties in applying the master curve approach. For a toughness dataset based on different specimens of a single product, application of the master curve methodology requires the statistical estimation of one parameter: the reference temperature T{sub 0}. Because of the limited number of specimens, estimation of this temperature is uncertain. The ASTM standard provides a rough evaluation of this statistical uncertainty through an approximate confidence interval. In this paper, a thorough study is carried out to build more meaningful confidence intervals (for both mono-temperature and multi-temperature tests). These results ensure better control over uncertainty, and allow rigorous analysis of the impact of its influencing factors: the number of specimens and the temperatures at which they have been tested. (authors)
Inferential Statistics in "Language Teaching Research": A Review and Ways Forward
ERIC Educational Resources Information Center
Lindstromberg, Seth
2016-01-01
This article reviews all (quasi)experimental studies appearing in the first 19 volumes (1997-2015) of "Language Teaching Research" (LTR). Specifically, it provides an overview of how statistical analyses were conducted in these studies and of how the analyses were reported. The overall conclusion is that there has been a tight adherence…
Signal Processing Methods for Liquid Rocket Engine Combustion Stability Assessments
NASA Technical Reports Server (NTRS)
Kenny, R. Jeremy; Lee, Erik; Hulka, James R.; Casiano, Matthew
2011-01-01
The J2X Gas Generator engine design specifications include dynamic, spontaneous, and broadband combustion stability requirements. These requirements are verified empirically based on high-frequency chamber pressure measurements and analyses. Dynamic stability is determined with the dynamic pressure response due to an artificial perturbation of the combustion chamber pressure (bomb testing), and spontaneous and broadband stability are determined from the dynamic pressure responses during steady operation starting at specified power levels. J2X Workhorse Gas Generator testing included bomb tests with multiple hardware configurations and operating conditions, including a configuration used explicitly for the engine verification test series. This work covers signal processing techniques developed at Marshall Space Flight Center (MSFC) to help assess engine design stability requirements. Dynamic stability assessments were performed following both the CPIA 655 guidelines and an MSFC in-house developed, statistically based approach. The statistical approach was developed to better verify when the dynamic pressure amplitudes corresponding to a particular frequency returned to pre-bomb characteristics. This was accomplished by first determining the statistical characteristics of the pre-bomb dynamic levels. The pre-bomb statistical characterization provided 95% coverage bounds; these bounds were used as a quantitative measure to determine when the post-bomb signal returned to pre-bomb conditions. The time for post-bomb levels to acceptably return to pre-bomb levels was compared to the dominant frequency-dependent time recommended by CPIA 655. Results for multiple test configurations, including stable and unstable configurations, were reviewed.
Spontaneous stability was assessed using two processes: 1) characterization of the ratio of the peak response amplitudes to the excited chamber acoustic mode amplitudes and 2) characterization of the variability of the peak response's frequency over the test duration. This characterization process assists in evaluating the discreteness of a signal as well as the stability of the chamber response. Broadband stability was assessed using a running root-mean-square evaluation. These techniques were also employed, in a comparative analysis, on available Fastrac data, and these results are presented here.
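The dynamic-stability check described above (characterize pre-bomb amplitudes, then flag when post-bomb amplitudes re-enter the 95% coverage band) can be sketched roughly as follows. This is a simplification of the MSFC procedure, which is frequency-resolved and more involved; variable names are ours:

```python
from statistics import mean, stdev

def recovery_index(pre_bomb, post_bomb, coverage=1.96):
    """Return the index of the first post-bomb amplitude sample that falls
    back inside the pre-bomb mean +/- coverage*sigma band, or None if the
    signal never recovers within the record."""
    m, s = mean(pre_bomb), stdev(pre_bomb)
    lo, hi = m - coverage * s, m + coverage * s
    for i, amp in enumerate(post_bomb):
        if lo <= amp <= hi:
            return i
    return None
```

A production version would require the signal to stay inside the band for a sustained interval rather than a single sample.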
SOCR Analyses - an Instructional Java Web-based Statistical Analysis Toolkit.
Chu, Annie; Cui, Jenny; Dinov, Ivo D
2009-03-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include models commonly used in undergraduate statistics courses, such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons, such as the t-test in the parametric category, and the Wilcoxon rank sum test, Kruskal-Wallis test and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis test models, such as contingency tables, Friedman's test and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), in the hope of contributing to the efforts of the statistical computing community. The code includes functionality for each specific analysis model, and it has general utilities that can be applied in various statistical computing tasks. For example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least-squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is ongoing and more functions and tools are being added to it, these resources are constantly improved. The reader is strongly encouraged to check the SOCR site for the most up-to-date information and newly added models.
Genetic polymorphisms of pharmacogenomic VIP variants in the Yi population from China.
Yan, Mengdan; Li, Dianzhen; Zhao, Guige; Li, Jing; Niu, Fanglin; Li, Bin; Chen, Peng; Jin, Tianbo
2018-03-30
Drug response and target therapeutic dosage differ among individuals, and this variability is largely genetically determined. With the development of pharmacogenetics and pharmacogenomics, widespread research has provided a wealth of information on drug-related genetic polymorphisms, and very important pharmacogenetic (VIP) variants have been identified for the major populations around the world, whereas less is known regarding minorities in China, including the Yi ethnic group. Our research aims to screen potential pharmacogenomic variants in the Yi population and to provide a theoretical basis for future medication guidance. In the present study, 80 VIP variants (selected from the PharmGKB database) were genotyped in 100 unrelated, healthy Yi adults recruited for our research. Through statistical analysis, we compared the Yi with the 11 other populations listed in the HapMap database to detect significant SNPs. Two specific SNPs were subsequently examined for their global allele distributions using frequencies downloaded from the ALlele FREquency Database. Moreover, F-statistics (Fst), genetic structure and phylogenetic tree analyses were conducted to determine the genetic similarity among the 12 ethnic groups. Using χ2 tests, rs1128503 (ABCB1), rs7294 (VKORC1), rs9934438 (VKORC1), rs1540339 (VDR) and rs689466 (PTGS2) were identified as significantly different loci for further analysis. The global allele distributions revealed that the allele "A" of rs1540339 and of rs9934438 was more frequent in Yi people, consistent with most populations in East Asia. F-statistics (Fst), genetic structure and phylogenetic tree analyses demonstrated that the Yi and the CHD shared the closest relationship in their genetic backgrounds. Additionally, among the domestic ethnic populations in China, the Yi were most similar to the Han people from Shaanxi province.
Our results demonstrate significant differences at several polymorphic SNPs and supplement the pharmacogenomic information available for the Yi population, which could provide new strategies for optimizing clinical medication in accordance with the genetic determinants of drug toxicity and efficacy. Copyright © 2018 Elsevier B.V. All rights reserved.
Advanced Structural Analyses by Third Generation Synchrotron Radiation Powder Diffraction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sakata, M.; Aoyagi, S.; Ogura, T.
2007-01-19
Since the advent of the 3rd generation Synchrotron Radiation (SR) sources, such as SPring-8, the capabilities of SR powder diffraction have increased greatly, not only for accurate structure refinement but also for ab initio structure determination. In this study, advanced structural analyses by 3rd generation SR powder diffraction, based on the Large Debye-Scherrer camera installed at BL02B2, SPring-8, are described. Because of the high angular resolution and high counting statistics of powder data collected at BL02B2, SPring-8, ab initio structure determination can cope with molecular crystals of 65 atoms, including H atoms. For the structure refinements, it is found that a kind of Maximum Entropy Method, in which several atoms are omitted in the phase calculation, becomes very important for refining the structural details of fairly large molecules in a crystal. It should be emphasized that until the structure obtained by a Genetic Algorithm (GA) or some other ab initio structure determination method using real-space structural knowledge is refined very precisely, it is not possible to tell whether that structure is correct or not. In order to determine and/or refine the crystal structures of rather complicated molecules, we cannot overemphasize the importance of the 3rd generation SR sources.
Size and shape measurement in contemporary cephalometrics.
McIntyre, Grant T; Mossey, Peter A
2003-06-01
The traditional method of analysing cephalograms--conventional cephalometric analysis (CCA)--involves the calculation of linear distance measurements, angular measurements, area measurements, and ratios. Because shape information cannot be determined from these 'size-based' measurements, an increasing number of studies employ geometric morphometric tools in the cephalometric analysis of craniofacial morphology. Most of the discussions surrounding the appropriateness of CCA, Procrustes superimposition, Euclidean distance matrix analysis (EDMA), thin-plate spline analysis (TPS), finite element morphometry (FEM), elliptical Fourier functions (EFF), and medial axis analysis (MAA) have centred upon mathematical and statistical arguments. Surprisingly, little information is available to assist the orthodontist in judging the clinical relevance of each technique. This article evaluates the advantages and limitations of the above methods currently used to analyse craniofacial morphology on cephalograms and investigates their clinical relevance and possible applications.
Bos, Judith T; Donders, Nathalie C G M; Bouwman-Brouwer, Karin M; Van der Gulden, Joost W J
2009-11-01
To investigate (a) differences in work characteristics and (b) determinants of job satisfaction among employees in different age groups. A cross-sectional questionnaire was filled in by 1,112 university employees, classified into four age groups. (a) Work characteristics were analysed with ANOVA while adjusting for sex and job classification. (b) Job satisfaction was regressed against job demands and job resources adapted from the Job Demands-Resources model. Statistically significant differences in work characteristics between age groups are present, but rather small. Regression analyses revealed that the negative associations of the job demands (workload and conflicts at work) with job satisfaction faded when job resources were added. Job resources were most strongly associated with greater job satisfaction, especially more skill discretion and more relations with colleagues. Skill discretion and relations with colleagues are major determinants of job satisfaction. However, attention should also be given to conflicts at work, support from the supervisor and opportunities for further education, because the mean scores for these work characteristics were disappointing in almost all age groups. The latter two characteristics were found to be significantly associated with job satisfaction in older workers.
[Projection of prisoner numbers].
Metz, Rainer; Sohn, Werner
2015-01-01
The past and future development of occupancy rates in prisons is of crucial importance for the judicial administration of every country. Basic factors for planning the required penal facilities are seasonal fluctuations, minimum, maximum and average occupancy, as well as the present situation and potential development of certain imprisonment categories. As the prisoner numbers of a country are determined by a complex set of interdependent conditions, it has turned out to be difficult to provide any theoretical explanations. The idea, long accepted in criminology, that prisoner numbers are interdependent with criminal policy must be regarded as having failed. Statistical and time series analyses may help, however, to identify the factors that have influenced the development of prisoner numbers in the past. The analyses presented here first describe such influencing factors from a criminological perspective and then deal with their statistical identification and modelling. Using the development of prisoner numbers in Hesse as an example, it has been found that modelling methods in which the independent variables predict the dependent variable with a time lag are particularly helpful. A potential complication, however, is that the different dynamics of German and foreign prisoner numbers require the development of further models for prediction.
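The time-lagged modelling idea described here, past values of an explanatory series predicting current prisoner numbers, can be illustrated with a simple lag-selection helper. This is our own illustrative sketch, not the study's actual time-series models:

```python
from statistics import mean

def corr(x, y):
    """Pearson correlation between two equal-length series."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def best_lag(x, y, max_lag=5):
    """Pick the lag k at which past values of x (shifted back k periods)
    correlate most strongly, in absolute value, with current values of y."""
    return max(range(1, max_lag + 1),
               key=lambda k: abs(corr(x[:-k], y[k:])))
```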
Spatial and Alignment Analyses for a field of Small Volcanic Vents South of Pavonis Mons Mars
NASA Technical Reports Server (NTRS)
Bleacher, J. E.; Glaze, L. S.; Greeley, R.; Hauber, E.; Baloga, S. M.; Sakimoto, S. E. H.; Williams, D. A.; Glotch, T. D.
2008-01-01
The Tharsis province of Mars displays a variety of small volcanic vent (tens of km in diameter) morphologies. These features were identified in Mariner and Viking images [1-4], and Mars Orbiter Laser Altimeter (MOLA) data show them to be more abundant than originally observed [5,6]. Recent studies are classifying their diverse morphologies [7-9]. Building on this work, we are mapping the location of small volcanic vents (small-vents) in the Tharsis province using MOLA, Thermal Emission Imaging System, and High Resolution Stereo Camera data [10]. Here we report on a preliminary study of the spatial and alignment relationships between small-vents south of Pavonis Mons, as determined by nearest neighbor and two-point azimuth statistical analyses. Terrestrial monogenetic volcanic fields display four fundamental characteristics: 1) recurrence rates of eruptions, 2) vent abundance, 3) vent distribution, and 4) tectonic relationships [11]. While understanding recurrence rates typically requires field measurements, insight into vent abundance, distribution, and tectonic relationships can be established by mapping of remotely sensed data and subsequent application of spatial statistical studies [11,12], the goal of which is to link the distribution of vents to causal processes.
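A common form of the nearest-neighbor analysis mentioned above is the Clark-Evans ratio, which compares the observed mean nearest-neighbor distance to that expected for a random (Poisson) point field. A stdlib-only sketch, ignoring the edge corrections a real vent-field analysis would include:

```python
import math

def clark_evans_R(points, area):
    """Clark-Evans nearest-neighbor ratio for a 2-D point field:
    R < 1 suggests clustering, R ~ 1 spatial randomness, and R > 1
    ordering toward a regular grid."""
    n = len(points)
    nn_dists = [
        min(math.dist(p, q) for j, q in enumerate(points) if j != i)
        for i, p in enumerate(points)
    ]
    observed = sum(nn_dists) / n
    expected = 0.5 / math.sqrt(n / area)  # mean NN distance for CSR
    return observed / expected
```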
Prognostic value of inflammation-based scores in patients with osteosarcoma
Liu, Bangjian; Huang, Yujing; Sun, Yuanjue; Zhang, Jianjun; Yao, Yang; Shen, Zan; Xiang, Dongxi; He, Aina
2016-01-01
Systemic inflammation responses have been associated with cancer development and progression. C-reactive protein (CRP), Glasgow prognostic score (GPS), neutrophil-lymphocyte ratio (NLR), platelet-lymphocyte ratio (PLR), lymphocyte-monocyte ratio (LMR), and neutrophil-platelet score (NPS) have been shown to be independent risk factors in various types of malignant tumors. This retrospective analysis of 162 osteosarcoma cases was performed to estimate the predictive value of these scores for survival in osteosarcoma. All statistical analyses were performed with SPSS statistical software. Receiver operating characteristic (ROC) analysis was used to set optimal thresholds; the area under the curve (AUC) was used to show the discriminatory abilities of the inflammation-based scores; Kaplan-Meier analysis was performed to plot the survival curves; Cox regression models were employed to determine the independent prognostic factors. The optimal cut-off points of NLR, PLR, and LMR were 2.57, 123.5 and 4.73, respectively. GPS and NLR had a markedly larger AUC than CRP, PLR and LMR. High levels of CRP, GPS, NLR, and PLR, and a low level of LMR, were significantly associated with adverse prognosis (P < 0.05). Multivariate Cox regression analyses revealed that GPS, NLR, and occurrence of metastasis were the top risk factors associated with death of osteosarcoma patients. PMID:28008988
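ROC-derived cut-off points like those above are typically chosen by maximizing Youden's J (sensitivity + specificity - 1). A minimal sketch of that selection; variable names are ours, and the paper does not state which criterion SPSS was driven with:

```python
def youden_cutoff(scores, labels):
    """Return the candidate threshold maximizing Youden's J, scanning each
    observed score value as a '>= threshold is positive' rule."""
    pos = sum(labels)
    neg = len(labels) - pos
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and not y)
        j = tp / pos - fp / neg  # sensitivity - (1 - specificity)
        if j > best_j:
            best_j, best_t = j, t
    return best_t
```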
A d-statistic for single-case designs that is equivalent to the usual between-groups d-statistic.
Shadish, William R; Hedges, Larry V; Pustejovsky, James E; Boyajian, Jonathan G; Sullivan, Kristynn J; Andrade, Alma; Barrientos, Jeannette L
2014-01-01
We describe a standardised mean difference statistic (d) for single-case designs that is equivalent to the usual d in between-groups experiments. We show how it can be used to summarise treatment effects over cases within a study, to do power analyses in planning new studies and grant proposals, and to meta-analyse effects across studies of the same question. We discuss limitations of this d-statistic, and possible remedies to them. Even so, this d-statistic is better founded statistically than other effect size measures for single-case design, and unlike many general linear model approaches such as multilevel modelling or generalised additive models, it produces a standardised effect size that can be integrated over studies with different outcome measures. SPSS macros for both effect size computation and power analysis are available.
NASA Astrophysics Data System (ADS)
Gbedo, Yémalin Gabin; Mangin-Brinet, Mariane
2017-07-01
We present a new procedure to determine parton distribution functions (PDFs), based on Markov chain Monte Carlo (MCMC) methods. The aim of this paper is to show that we can replace the standard χ2 minimization by procedures grounded on statistical methods, and on Bayesian inference in particular, thus offering additional insight into the rich field of PDFs determination. After a basic introduction to these techniques, we introduce the algorithm we have chosen to implement—namely Hybrid (or Hamiltonian) Monte Carlo. This algorithm, initially developed for Lattice QCD, turns out to be very interesting when applied to PDFs determination by global analyses; we show that it allows us to circumvent the difficulties due to the high dimensionality of the problem, in particular concerning the acceptance. A first feasibility study is performed and presented, which indicates that Markov chain Monte Carlo can successfully be applied to the extraction of PDFs and of their uncertainties.
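The flavor of the Hybrid (Hamiltonian) Monte Carlo update can be conveyed on a toy one-dimensional target. The real application samples a high-dimensional PDF parameter space; this sketch only illustrates the leapfrog-plus-Metropolis structure the abstract refers to:

```python
import math
import random

def hmc_step(log_prob_and_grad, x0, step=0.1, n_leapfrog=20):
    """One Hybrid/Hamiltonian Monte Carlo update for a 1-D target.
    log_prob_and_grad(x) must return (log p(x), d log p(x) / dx)."""
    logp, grad = log_prob_and_grad(x0)
    p = random.gauss(0.0, 1.0)       # resample auxiliary momentum
    h0 = -logp + 0.5 * p * p         # initial Hamiltonian
    x = x0
    # Leapfrog integration: half momentum step, alternating full steps,
    # closing half momentum step.
    p += 0.5 * step * grad
    for i in range(n_leapfrog):
        x += step * p
        logp, grad = log_prob_and_grad(x)
        if i < n_leapfrog - 1:
            p += step * grad
    p += 0.5 * step * grad
    h1 = -logp + 0.5 * p * p
    # Metropolis accept/reject corrects the integration error exactly.
    return x if math.log(random.random()) < h0 - h1 else x0
```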
Pounds, Stan; Cheng, Cheng; Cao, Xueyuan; Crews, Kristine R; Plunkett, William; Gandhi, Varsha; Rubnitz, Jeffrey; Ribeiro, Raul C; Downing, James R; Lamba, Jatinder
2009-08-15
In some applications, prior biological knowledge can be used to define a specific pattern of association of multiple endpoint variables with a genomic variable that is biologically most interesting. However, to our knowledge, there is no statistical procedure designed to detect specific patterns of association with multiple endpoint variables. Projection onto the most interesting statistical evidence (PROMISE) is proposed as a general procedure to identify genomic variables that exhibit a specific biologically interesting pattern of association with multiple endpoint variables. Biological knowledge of the endpoint variables is used to define a vector that represents the biologically most interesting values for statistics that characterize the associations of the endpoint variables with a genomic variable. A test statistic is defined as the dot-product of the vector of the observed association statistics and the vector of the most interesting values of the association statistics. By definition, this test statistic is proportional to the length of the projection of the observed vector of correlations onto the vector of most interesting associations. Statistical significance is determined via permutation. In simulation studies and an example application, PROMISE shows greater statistical power to identify genes with the interesting pattern of associations than classical multivariate procedures, individual endpoint analyses or listing genes that have the pattern of interest and are significant in more than one individual endpoint analysis. Documented R routines are freely available from www.stjuderesearch.org/depts/biostats and will soon be available as a Bioconductor package from www.bioconductor.org.
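The PROMISE statistic described above (a dot product of observed association statistics with the "most interesting" pattern vector, with significance by permutation) can be sketched compactly. This uses Pearson correlation as the per-endpoint association statistic, which is one natural choice but an assumption of the sketch:

```python
import random
from statistics import mean

def corr(x, y):
    """Pearson correlation between two equal-length series."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def promise(genomic, endpoints, interesting, n_perm=500, seed=1):
    """PROMISE-style test: project the observed association statistics onto
    the biologically 'most interesting' pattern vector, then assess
    significance by permuting the genomic variable."""
    def stat(g):
        return sum(w * corr(g, e) for w, e in zip(interesting, endpoints))
    observed = stat(genomic)
    rng = random.Random(seed)
    g, hits = list(genomic), 0
    for _ in range(n_perm):
        rng.shuffle(g)
        if abs(stat(g)) >= abs(observed):
            hits += 1
    return observed, (hits + 1) / (n_perm + 1)  # permutation p-value
```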
Dietary determinants for Hb-acrylamide and Hb-glycidamide adducts in Danish non-smoking women.
Outzen, Malene; Egeberg, Rikke; Dragsted, Lars; Christensen, Jane; Olesen, Pelle T; Frandsen, Henrik; Overvad, Kim; Tjønneland, Anne; Olsen, Anja
2011-05-01
Acrylamide (AA) is a probable human carcinogen that is formed in heat-treated carbohydrate-rich foods. The validity of FFQ to assess AA exposure has been questioned. The aim of the present cross-sectional study was to investigate dietary determinants of Hb-AA and Hb-glycidamide (GA) adducts. The study included 537 non-smoking women aged 50-65 years who participated in the Diet, Cancer and Health cohort (1993-97). At study baseline, blood samples and information on dietary and lifestyle variables obtained from self-administered questionnaires were collected. From blood samples, Hb-AA and Hb-GA in erythrocytes were analysed by liquid chromatography/MS/MS. Dietary determinants were evaluated by multiple linear regression analyses adjusted for age and smoking behaviour among ex-smokers. The median for Hb-AA was 35 pmol/g globin (5th percentile 17, 95th percentile 89) and for Hb-GA 21 pmol/g globin (5th percentile 8, 95th percentile 49). Of the dietary factors studied, intakes of coffee and chips were statistically significantly associated with a 4 % per 200 g/d (95 % CI 2, 7; P < 0·0001) and an 18 % per 5 g/d (95 % CI 6, 31; P = 0·002) higher Hb-AA, respectively. This model explained 17 % of the variation in Hb-AA. Intakes of coffee and biscuits/crackers were statistically significantly associated with a 3 % per 200 g/d (95 % CI 1, 6; P = 0·005) and 12 % per 10 g/d (95 % CI 3, 23; P = 0·01) higher Hb-GA, respectively. This model explained 12 % of the variation in Hb-GA. In conclusion, only a few dietary determinants of Hb-AA and Hb-GA were identified. Thus, the present study implies that dietary intake measured by an FFQ explains only to a limited extent the variation in Hb-AA and Hb-GA concentrations.
ERIC Educational Resources Information Center
Thompson, Bruce; Melancon, Janet G.
Effect sizes have been increasingly emphasized in research as more researchers have recognized that: (1) all parametric analyses (t-tests, analyses of variance, etc.) are correlational; (2) effect sizes have played an important role in meta-analytic work; and (3) statistical significance testing is limited in its capacity to inform scientific…
Comments on `A Cautionary Note on the Interpretation of EOFs'.
NASA Astrophysics Data System (ADS)
Behera, Swadhin K.; Rao, Suryachandra A.; Saji, Hameed N.; Yamagata, Toshio
2003-04-01
The misleading aspect of the statistical analyses used in Dommenget and Latif, which raises concerns on some of the reported climate modes, is demonstrated. Adopting simple statistical techniques, the physical existence of the Indian Ocean dipole mode is shown and then the limitations of varimax and regression analyses in capturing the climate mode are discussed.
Barbie, Dana L.; Wehmeyer, Loren L.
2012-01-01
Trends in selected streamflow statistics during 1922-2009 were evaluated at 19 long-term streamflow-gaging stations considered indicative of outflows from Texas to Arkansas, Louisiana, Galveston Bay, and the Gulf of Mexico. The U.S. Geological Survey, in cooperation with the Texas Water Development Board, evaluated streamflow data from streamflow-gaging stations with more than 50 years of record that were active as of 2009. The outflows into Arkansas and Louisiana were represented by 3 streamflow-gaging stations, and outflows into the Gulf of Mexico, including Galveston Bay, were represented by 16 streamflow-gaging stations. Monotonic trend analyses were done using the following three streamflow statistics generated from daily mean values of streamflow: (1) annual mean daily discharge, (2) annual maximum daily discharge, and (3) annual minimum daily discharge. The trend analyses were based on the nonparametric Kendall's Tau test, which is useful for the detection of monotonic upward or downward trends with time. A total of 69 trend analyses by Kendall's Tau were computed - 19 periods of streamflow multiplied by the 3 streamflow statistics plus 12 additional trend analyses because the periods of record for 2 streamflow-gaging stations were divided into periods representing pre- and post-reservoir impoundment. Unless otherwise described, each trend analysis used the entire period of record for each streamflow-gaging station. The monotonic trend analysis detected 11 statistically significant downward trends, 37 instances of no trend, and 21 statistically significant upward trends. One general region studied, which seemingly has relatively more upward trends for many of the streamflow statistics analyzed, includes the rivers and associated creeks and bayous to Galveston Bay in the Houston metropolitan area. 
Lastly, the most western river basins considered (the Nueces and Rio Grande) had statistically significant downward trends for many of the streamflow statistics analyzed.
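Kendall's Tau, the test behind these trend analyses, counts concordant versus discordant pairs of (time, flow) observations. A minimal sketch without the significance test; a real analysis would also compute a p-value and handle tied values more carefully:

```python
def kendall_tau(t, y):
    """Kendall's tau between time t and a streamflow statistic y:
    tau > 0 indicates an upward monotonic trend, tau < 0 downward."""
    n = len(y)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (t[j] - t[i]) * (y[j] - y[i])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)
```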
Fukuda, Haruhisa; Kuroki, Manabu
2016-03-01
To develop and internally validate a surgical site infection (SSI) prediction model for Japan. Retrospective observational cohort study. We analyzed surveillance data submitted to the Japan Nosocomial Infections Surveillance system for patients who had undergone target surgical procedures from January 1, 2010, through December 31, 2012. Logistic regression analyses were used to develop statistical models for predicting SSIs. An SSI prediction model was constructed for each of the procedure categories by statistically selecting the appropriate risk factors from among the collected surveillance data and determining their optimal categorization. Standard bootstrapping techniques were applied to assess potential overfitting. The C-index was used to compare the predictive performances of the new statistical models with those of models based on conventional risk index variables. The study sample comprised 349,987 cases from 428 participant hospitals throughout Japan, and the overall SSI incidence was 7.0%. The C-indices of the new statistical models were significantly higher than those of the conventional risk index models in 21 (67.7%) of the 31 procedure categories (P<.05). No significant overfitting was detected. Japan-specific SSI prediction models were shown to generally have higher accuracy than conventional risk index models. These new models may have applications in assessing hospital performance and identifying high-risk patients in specific procedure categories.
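The C-index used to compare the models is the probability that a randomly chosen case with an SSI receives a higher predicted risk than a randomly chosen case without one. A direct O(n²) sketch (our own illustration, not the study's code):

```python
def c_index(probs, outcomes):
    """Concordance (C) index for binary outcomes: fraction of
    (event, non-event) pairs ranked correctly, counting ties as 0.5."""
    pairs = concordant = ties = 0
    for pi, yi in zip(probs, outcomes):
        for pj, yj in zip(probs, outcomes):
            if yi == 1 and yj == 0:
                pairs += 1
                if pi > pj:
                    concordant += 1
                elif pi == pj:
                    ties += 1
    return (concordant + 0.5 * ties) / pairs
```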
NASA Astrophysics Data System (ADS)
Hendricks, Lorin; Spencer Guthrie, W.; Mazzeo, Brian
2018-04-01
An automated acoustic impact-echo testing device with seven channels has been developed for faster surveying of bridge decks. Due to potential variations in bridge deck overlay thickness, varying conditions between testing passes, and occasional imprecise equipment calibrations, a method that can account for variations in deck properties and testing conditions was necessary to correctly interpret the acoustic data. A new methodology involving statistical analyses was therefore developed. After acoustic impact-echo data are collected and analyzed, the results are normalized by the median for each channel, a Gaussian distribution is fit to the histogram of the data, and the Kullback-Leibler divergence test or Otsu's method is then used to determine the optimum threshold for differentiating between intact and delaminated concrete. The new methodology was successfully applied to individual channels of previously unusable acoustic impact-echo data obtained from a three-lane interstate bridge deck surfaced with a polymer overlay, and the resulting delamination map compared very favorably with the results of a manual deck sounding survey.
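Otsu's method, one of the two thresholding options mentioned, picks the cut that maximizes between-class variance of a histogram. A stdlib-only sketch; the deployed pipeline additionally median-normalizes each channel and fits a Gaussian first, which this illustration omits:

```python
def otsu_threshold(values, n_bins=64):
    """Otsu's method on a list of amplitudes: histogram the data, then
    choose the bin boundary maximizing between-class variance, separating
    'intact' from 'delaminated' responses."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins
    hist = [0] * n_bins
    for v in values:
        hist[min(int((v - lo) / width), n_bins - 1)] += 1
    centers = [lo + (k + 0.5) * width for k in range(n_bins)]
    total = len(values)
    best_t, best_var = lo, -1.0
    for t in range(1, n_bins):
        w0 = sum(hist[:t])
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        m0 = sum(h * c for h, c in zip(hist[:t], centers[:t])) / w0
        m1 = sum(h * c for h, c in zip(hist[t:], centers[t:])) / w1
        between = w0 * w1 * (m0 - m1) ** 2
        if between > best_var:
            best_var, best_t = between, lo + t * width
    return best_t
```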
Image Processing Diagnostics: Emphysema
NASA Astrophysics Data System (ADS)
McKenzie, Alex
2009-10-01
Currently the computerized tomography (CT) scan can detect emphysema sooner than traditional x-rays, but other tests are required to measure the amount of affected lung more accurately. CT images clearly show whether a patient has emphysema, but visual inspection alone cannot quantify the degree of the disease, which appears merely as subtle, barely distinct dark spots on the lung. Our goal is to create a software plug-in that interfaces with existing open-source medical imaging software to automate the process of accurately diagnosing and determining emphysema severity levels in patients. This will be accomplished by performing a number of statistical calculations on data taken from CT scan images of several patients representing a wide range of disease severity. These analyses include an examination of the deviation from a normal distribution curve to determine skewness, a commonly used statistical parameter. Our preliminary results show that this method of assessment appears to be more accurate and robust than currently used methods, which involve looking at percentages of radiodensities in the air passages of the lung.
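The skewness statistic the abstract relies on is the third standardized moment of the voxel-intensity distribution. The sketch below is a generic illustration, not the authors' plug-in; the Hounsfield-unit values are synthetic, and the idea shown is only that extra low-density (emphysematous) voxels pull the skewness negative:

```python
import numpy as np

def skewness(x):
    """Third standardized moment: deviation of a distribution from symmetry."""
    x = np.asarray(x, dtype=float)
    m = x.mean()
    s = x.std()  # population standard deviation
    return np.mean((x - m) ** 3) / s ** 3

rng = np.random.default_rng(1)
# synthetic lung-voxel radiodensities (Hounsfield units, hypothetical values)
healthy = rng.normal(-860, 40, 10_000)  # roughly symmetric distribution
# emphysema adds a cluster of abnormally low-density voxels
emphysema = np.concatenate([healthy, rng.normal(-980, 15, 3_000)])
print(skewness(emphysema) < skewness(healthy))  # → True
```

A healthy lung's intensity histogram is roughly symmetric (skewness near zero), while the low-density cluster skews the diseased histogram leftward, which is the deviation from normality the proposed method measures.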
Accuracy of Physical Self-Description Among Chronic Exercisers and Non-Exercisers.
Berning, Joseph M; DeBeliso, Mark; Sevene, Trish G; Adams, Kent J; Salmon, Paul; Stamford, Bryant A
2014-11-06
This study addressed the role of chronic exercise in enhancing physical self-description as measured by self-estimated percent body fat. Accuracy of physical self-description was determined in normal-weight, regularly exercising and non-exercising males with similar body mass indices (BMIs) and females with similar BMIs (n=42 males and 45 females, of which 23 males and 23 females met the criteria to be considered chronic exercisers). Statistical analyses were conducted to determine the degree of agreement between self-estimated percent body fat and actual laboratory measurements (hydrostatic weighing). Three statistical techniques were employed: Pearson correlation coefficients, Bland-Altman plots, and regression analysis. Agreement between measured and self-estimated percent body fat was superior for males and females who exercised chronically, compared to non-exercisers. The clinical implications are as follows. Satisfaction with one's body can be influenced by several factors, including self-perceived body composition. Dissatisfaction can contribute to maladaptive and destructive weight management behaviors. The present study suggests that regular exercise provides a basis for more positive weight management behaviors by enhancing the accuracy of self-assessed body composition.
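The Bland-Altman analysis named above quantifies agreement as a mean bias plus 95% limits of agreement on the paired differences. This sketch uses hypothetical percent-body-fat pairs, not the study's data, and the `bland_altman_limits` helper is an illustrative name:

```python
import numpy as np

def bland_altman_limits(measured, estimated):
    """Bland-Altman agreement: mean bias and 95% limits of agreement
    between a reference measurement and a self-estimate."""
    d = np.asarray(estimated, dtype=float) - np.asarray(measured, dtype=float)
    bias = d.mean()
    spread = 1.96 * d.std(ddof=1)
    return bias, bias - spread, bias + spread

# hypothetical percent body fat: hydrostatic weighing vs. self-estimate
measured = [12.0, 15.5, 18.0, 21.0, 24.5, 27.0, 30.5]
estimated = [13.0, 15.0, 19.0, 20.5, 26.0, 26.5, 32.0]
bias, lo, hi = bland_altman_limits(measured, estimated)
print(round(bias, 2))  # → 0.5 (mean over-estimate, in percent body fat)
```

Narrower limits of agreement for the chronic exercisers would correspond to the superior agreement the study reports; a correlation coefficient alone cannot show this, which is why Bland-Altman plots are used alongside it.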
Badran, M; Morsy, R; Soliman, H; Elnimr, T
2016-01-01
Trace element metabolism has been reported to play specific roles in the pathogenesis and progression of diabetes mellitus. Given the continuous increase in the population of patients with Type 2 diabetes (T2D), this study aimed to assess the levels and inter-relationships of fasting blood glucose (FBG) and serum trace elements in Type 2 diabetic patients. This study was conducted on 40 Egyptian Type 2 diabetic patients and 36 healthy volunteers (Hospital of Tanta University, Tanta, Egypt). The blood serum was digested and then used to determine the levels of 24 trace elements using inductively coupled plasma mass spectrometry (ICP-MS). Multivariate statistical analyses, based on correlation coefficients, cluster analysis (CA) and principal component analysis (PCA), were used to analyse the data. The results exhibited significant changes in FBG and in the levels of eight trace elements (Zn, Cu, Se, Fe, Mn, Cr, Mg, and As) in the blood serum of Type 2 diabetic patients relative to healthy controls. The multivariate statistical techniques were effective in reducing the experimental variables and grouped the trace elements in patients into three clusters. The application of PCA revealed a distinct difference in the associations of trace elements and their clustering patterns between the control and patient groups, in particular for Mg, Fe, Cu, and Zn, which appeared to be the most crucial factors related to Type 2 diabetes. On the basis of this study, the contributors to trace element content in Type 2 diabetic patients can therefore be determined and specified through correlation relationships and multivariate statistical analysis, confirming that the alteration of some essential trace metals may play a role in the development of diabetes mellitus. Copyright © 2015 Elsevier GmbH. All rights reserved.
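The variable-reduction step that PCA performs can be sketched on synthetic data. This is not the study's serum dataset; the element names, the shared-factor structure, and the `pca` helper are assumptions used only to show how correlated trace elements collapse onto one component:

```python
import numpy as np

def pca(X):
    """PCA on standardized variables (i.e. on the correlation matrix):
    returns explained-variance ratios and loadings, largest first."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    R = np.corrcoef(Z, rowvar=False)
    evals, evecs = np.linalg.eigh(R)
    order = np.argsort(evals)[::-1]
    evals, evecs = evals[order], evecs[:, order]
    return evals / evals.sum(), evecs

rng = np.random.default_rng(2)
n = 76  # 40 patients + 36 controls, as in the study
common = rng.normal(size=n)
# hypothetical serum levels: Mg, Fe, Cu, Zn share a common factor; Cr independent
X = np.column_stack(
    [common + 0.3 * rng.normal(size=n) for _ in range(4)]
    + [rng.normal(size=n)]
)
ratios, loadings = pca(X)
print(ratios[0] > 0.5)  # → True: one component captures the shared factor
```

Elements that load heavily on the same component cluster together, which is the kind of grouping the abstract reports for Mg, Fe, Cu, and Zn.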
Marciano, Marina Angélica; Estrela, Carlos; Mondelli, Rafael Francisco Lia; Ordinola-Zapata, Ronald; Duarte, Marco Antonio Hungaro
2013-01-01
The aim of the study was to determine if the increase in radiopacity provided by bismuth oxide is related to the color alteration of calcium silicate-based cement. Calcium silicate cement (CSC) was mixed with 0%, 15%, 20%, 30% and 50% of bismuth oxide (BO), determined by weight. Mineral trioxide aggregate (MTA) was the control group. The radiopacity test was performed according to ISO 6876/2001. The color was evaluated using the CIE system. The assessments were performed after 24 hours, 7 and 30 days of setting time, using a spectrophotometer to obtain the ΔE, Δa, Δb and ΔL values. The statistical analyses were performed using the Kruskal-Wallis/Dunn and ANOVA/Tukey tests (p<0.05). The cements in which bismuth oxide was added showed radiopacity corresponding to the ISO recommendations (>3 mm equivalent of Al). The MTA group was statistically similar to the CSC/30% BO group (p>0.05). In regard to color, the increase of bismuth oxide resulted in a decrease in the ΔE value of the calcium silicate cement. The CSC group presented statistically higher ΔE values than the CSC/50% BO group (p<0.05). The comparison between 24 hours and 7 days showed higher ΔE for the MTA group, with statistical differences for the CSC/15% BO and CSC/50% BO groups (p<0.05). After 30 days, CSC showed statistically higher ΔE values than CSC/30% BO and CSC/50% BO (p<0.05). In conclusion, the increase in radiopacity provided by bismuth oxide has no relation to the color alteration of calcium silicate-based cements.
Statistics provide guidance for indigenous organic carbon detection on Mars missions.
Sephton, Mark A; Carter, Jonathan N
2014-08-01
Data from the Viking and Mars Science Laboratory missions indicate the presence of organic compounds that are not definitively martian in origin. Both contamination and confounding mineralogies have been suggested as alternatives to indigenous organic carbon. Intuitive thought suggests that we are repeatedly obtaining data that confirms the same level of uncertainty. Bayesian statistics may suggest otherwise. If an organic detection method has a true positive to false positive ratio greater than one, then repeated organic matter detection progressively increases the probability of indigeneity. Bayesian statistics also reveal that methods with higher ratios of true positives to false positives give higher overall probabilities and that detection of organic matter in a sample with a higher prior probability of indigenous organic carbon produces greater confidence. Bayesian statistics, therefore, provide guidance for the planning and operation of organic carbon detection activities on Mars. Suggestions for future organic carbon detection missions and instruments are as follows: (i) On Earth, instruments should be tested with analog samples of known organic content to determine their true positive to false positive ratios. (ii) On the mission, for an instrument with a true positive to false positive ratio above one, it should be recognized that each positive detection of organic carbon will result in a progressive increase in the probability of indigenous organic carbon being present; repeated measurements, therefore, can overcome some of the deficiencies of a less-than-definitive test. (iii) For a fixed number of analyses, the highest true positive to false positive ratio method or instrument will provide the greatest probability that indigenous organic carbon is present. 
(iv) On Mars, analyses should concentrate on samples with highest prior probability of indigenous organic carbon; intuitive desires to contrast samples of high prior probability and low prior probability of indigenous organic carbon should be resisted.
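The iterated Bayesian update behind recommendations (ii) and (iii) can be written out directly. The numbers below (a sceptical prior of 0.1 and a mediocre instrument with true-positive rate 0.6 and false-positive rate 0.2, i.e. likelihood ratio 3) are hypothetical, chosen only to show how repeated positives compound:

```python
def posterior_after_detections(prior, tpr, fpr, k):
    """Iterated Bayes: posterior probability that organic carbon is
    indigenous after k positive detections, for a detection method with
    true-positive rate tpr and false-positive rate fpr."""
    p = prior
    for _ in range(k):
        p = (tpr * p) / (tpr * p + fpr * (1 - p))
    return p

for k in (1, 2, 5):
    print(k, round(posterior_after_detections(0.1, 0.6, 0.2, k), 3))
# → 1 0.25
#   2 0.5
#   5 0.964
```

Each positive multiplies the prior odds by the likelihood ratio tpr/fpr, so any method with that ratio above one drives the posterior toward certainty, exactly the point made in recommendation (ii).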
Guidelines for the design and statistical analysis of experiments in papers submitted to ATLA.
Festing, M F
2001-01-01
In vitro experiments need to be well designed and correctly analysed if they are to achieve their full potential to replace the use of animals in research. An "experiment" is a procedure for collecting scientific data in order to answer a hypothesis, or to provide material for generating new hypotheses, and differs from a survey because the scientist has control over the treatments that can be applied. Most experiments can be classified into one of a few formal designs, the most common being completely randomised, and randomised block designs. These are quite common with in vitro experiments, which are often replicated in time. Some experiments involve a single independent (treatment) variable, while other "factorial" designs simultaneously vary two or more independent variables, such as drug treatment and cell line. Factorial designs often provide additional information at little extra cost. Experiments need to be carefully planned to avoid bias, be powerful yet simple, provide for a valid statistical analysis and, in some cases, have a wide range of applicability. Virtually all experiments need some sort of statistical analysis in order to take account of biological variation among the experimental subjects. Parametric methods using the t test or analysis of variance are usually more powerful than non-parametric methods, provided the underlying assumptions of normality of the residuals and equal variances are approximately valid. The statistical analyses of data from a completely randomised design, and from a randomised-block design are demonstrated in Appendices 1 and 2, and methods of determining sample size are discussed in Appendix 3. Appendix 4 gives a checklist for authors submitting papers to ATLA.
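The sample-size determination discussed in Appendix 3 is commonly done with a normal-approximation formula for comparing two group means. The sketch below is a generic version of that calculation, not the guideline's own worked example; `delta` is the difference worth detecting and `sigma` the expected standard deviation:

```python
from math import ceil

from scipy.stats import norm

def n_per_group(delta, sigma, alpha=0.05, power=0.8):
    """Normal-approximation sample size per group for a two-sided
    two-sample comparison of means."""
    z_a = norm.ppf(1 - alpha / 2)  # critical value for the significance level
    z_b = norm.ppf(power)          # critical value for the desired power
    return ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)

# detect a one-standard-deviation effect with 80% power at alpha = 0.05
print(n_per_group(delta=1.0, sigma=1.0))  # → 16 replicates per group
```

Halving the detectable effect roughly quadruples the required sample size, which is why the guidelines stress planning the design and power before collecting data.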
Satellite disintegration dynamics
NASA Technical Reports Server (NTRS)
Dasenbrock, R. R.; Kaufman, B.; Heard, W. B.
1975-01-01
The subject of satellite disintegration is examined in detail. Elements of the orbits of individual fragments, determined by DOD space surveillance systems, are used to accurately predict the time and place of fragmentation. Dual time independent and time dependent analyses are performed for simulated and real breakups. Methods of statistical mechanics are used to study the evolution of the fragment clouds. The fragments are treated as an ensemble of non-interacting particles. A solution of Liouville's equation is obtained which enables the spatial density to be calculated as a function of position, time and initial velocity distribution.
A Novel Statistical Analysis and Interpretation of Flow Cytometry Data
2013-07-05
ing parameters of the mathematical model are determined by using least squares to fit the data in Figures 1 and 2 (see Section 4). The second method is ... variance (for all k and j). Then the AIC, which is the expected value of the relative Kullback-Leibler distance for a given model [16], is Kn = n log( ... emphasized that the fit of the model is quite good for both donors and cell types. As such, we proceed to analyse the dynamic responsiveness of the
Have the temperature time series a structural change after 1998?
NASA Astrophysics Data System (ADS)
Werner, Rolf; Valev, Dimitare; Danov, Dimitar
2012-07-01
The global and hemispheric GISS and HadCRUT3 temperature time series were analysed for structural changes. We postulate continuity of the temperature as a function of time across each change point. Slopes are calculated for a sequence of segments delimited by time thresholds, using a standard method, restricted linear regression with dummy variables. The calculations and tests were performed for different numbers of thresholds, with the thresholds searched continuously within predetermined time intervals. The F-statistic is used to determine the time points of the structural changes.
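The continuity-constrained segmented regression described above is equivalent to adding a "hinge" term at each threshold. The sketch below fits one threshold on synthetic, noise-free anomalies; the trend values and the 1998 breakpoint are illustrative, not the paper's GISS/HadCRUT3 results:

```python
import numpy as np

def two_segment_fit(t, y, tau):
    """Continuous two-segment ('broken stick') regression: slope b before
    the threshold tau, slope b + c after it, with the segments joined at tau."""
    t = np.asarray(t, dtype=float)
    y = np.asarray(y, dtype=float)
    hinge = np.where(t > tau, t - tau, 0.0)  # dummy-variable hinge term
    X = np.column_stack([np.ones_like(t), t, hinge])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    a, b, c = beta
    return b, b + c  # slope before and slope after tau

t = np.arange(1970, 2011, dtype=float)
# synthetic anomaly: trend 0.01 K/yr before 1998, 0.03 K/yr afterwards
y = np.where(t <= 1998, 0.01 * (t - 1970), 0.28 + 0.03 * (t - 1998))
before, after = two_segment_fit(t, y, 1998.0)
print(round(before, 3), round(after, 3))  # → 0.01 0.03
```

In the full method, the threshold itself is scanned over a time interval and an F-test compares the restricted (no-break) and unrestricted fits to decide whether the change in slope is significant.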
Preliminary analyses of SIR-B radar data for recent Hawaii lava flows
NASA Technical Reports Server (NTRS)
Kaupp, V. H.; Derryberry, B. A.; Macdonald, H. C.; Gaddis, L. R.; Mouginis-Mark, P. J.
1986-01-01
The Shuttle Imaging Radar (SIR-B) experiment acquired two L-band (23 cm wavelength) radar images (at about 28 and 48 deg incidence angles) over the Kilauea Volcano area of southeastern Hawaii. Geologic analysis of these data indicates that, although aa lava flows and pyroclastic deposits can be discriminated, pahoehoe lava flows are not readily distinguished from surrounding low return materials. Preliminary analysis of data extracted from isolated flows indicates that flow type (i.e., aa or pahoehoe) and relative age can be determined from their basic statistics and illumination angle.
Comparison of methods for estimating flood magnitudes on small streams in Georgia
Hess, Glen W.; Price, McGlone
1989-01-01
The U.S. Geological Survey has collected flood data for small, natural streams at many sites throughout Georgia during the past 20 years. Flood-frequency relations were developed for these data using four methods: (1) observed (log-Pearson Type III analysis) data, (2) rainfall-runoff model, (3) regional regression equations, and (4) map-model combination. The results of the latter three methods were compared to the analyses of the observed data in order to quantify the differences in the methods and determine if the differences are statistically significant.
A likelihood ratio test for evolutionary rate shifts and functional divergence among proteins
Knudsen, Bjarne; Miyamoto, Michael M.
2001-01-01
Changes in protein function can lead to changes in the selection acting on specific residues. This can often be detected as evolutionary rate changes at the sites in question. A maximum-likelihood method for detecting evolutionary rate shifts at specific protein positions is presented. The method determines significance values of the rate differences to give a sound statistical foundation for the conclusions drawn from the analyses. A statistical test for detecting slowly evolving sites is also described. The methods are applied to a set of Myc proteins for the identification of both conserved sites and those with changing evolutionary rates. Those positions with conserved and changing rates are related to the structures and functions of their proteins. The results are compared with an earlier Bayesian method, thereby highlighting the advantages of the new likelihood ratio tests. PMID:11734650
[Organizational climate and burnout syndrome].
Lubrańska, Anna
2011-01-01
The paper addresses the issue of organizational climate and burnout syndrome. It has been assumed that burnout syndrome is dependent on work climate (organizational climate); therefore, two concepts were analyzed: that of D. Kolb (organizational climate) and that of Ch. Maslach (burnout syndrome). The research involved 239 persons (122 women, 117 men), aged 21-66. In the study, the Maslach Burnout Inventory (MBI) and the Inventory of Organizational Climate were used. The results of statistical methods (correlation analysis, one-way analysis of variance and regression analysis) evidenced a strong relationship between organizational climate and burnout dimensions. As depicted by the results, there are important differences in the level of burnout between study participants who work in different types of organizational climate. The results of the statistical analyses indicate that the organizational climate determines burnout syndrome. Therefore, creating supportive conditions at the workplace might reduce the risk of burnout.
Marzulli, F; Maguire, H C
1982-02-01
Several guinea-pig predictive test methods were evaluated by comparison of results with those obtained with human predictive tests, using ten compounds that have been used in cosmetics. The method involves the statistical analysis of the frequency with which guinea-pig tests agree with the findings of tests in humans. In addition, the frequencies of false positive and false negative predictive findings are considered and statistically analysed. The results clearly demonstrate the superiority of adjuvant tests (complete Freund's adjuvant) in determining skin sensitizers and the overall superiority of the guinea-pig maximization test in providing results similar to those obtained by human testing. A procedure is suggested for utilizing adjuvant and non-adjuvant test methods for characterizing compounds as of weak, moderate or strong sensitizing potential.
An evaluation of the Goddard Space Flight Center Library
NASA Technical Reports Server (NTRS)
Herner, S.; Lancaster, F. W.; Wright, N.; Ockerman, L.; Shearer, B.; Greenspan, S.; Mccartney, J.; Vellucci, M.
1979-01-01
The character and degree of coincidence between the current and future missions, programs, and projects of the Goddard Space Flight Center and the current and future collection, services, and facilities of its library were determined from structured interviews and discussions with various classes of facility personnel. In addition to the tabulation and interpretation of the data from the structured interview survey, five types of statistical analyses were performed to corroborate (or contradict) the survey results and to produce useful information not readily attainable through survey material. Conclusions reached regarding compatibility between needs and holdings, services and buildings, library hours of operation, methods of early detection and anticipation of changing holdings requirements, and the impact of near-future programs are presented along with a list of statistics needing collection, organization, and interpretation on a continuing or longitudinal basis.
A toolbox for determining subdiffusive mechanisms
NASA Astrophysics Data System (ADS)
Meroz, Yasmine; Sokolov, Igor M.
2015-04-01
Subdiffusive processes have become a field of great interest in recent decades, due to mounting experimental evidence of subdiffusive behavior in complex systems, especially biological ones. Different physical scenarios leading to subdiffusion differ in the details of their dynamics. These differences are what allow one to reconstruct the underlying physics theoretically from the results of observations, and they are the topic of this review. We review the main statistical analyses available today to distinguish between these scenarios, categorizing them according to the relevant characteristics. We collect the available tools and statistical tests, presenting them within a broader perspective. We also consider possible complications such as the subordination of subdiffusive mechanisms. Owing to the advances in single-particle tracking experiments in recent years, we focus on the relevant case where the available experimental data are scant, at the level of single trajectories.
The GMO Sum Rule and the πNN Coupling Constant
NASA Astrophysics Data System (ADS)
Ericson, T. E. O.; Loiseau, B.; Thomas, A. W.
The isovector GMO sum rule for forward πN scattering is critically evaluated using the precise π-p and π-d scattering lengths obtained recently from pionic atom measurements. The charged πNN coupling constant is then deduced with careful analysis of systematic and statistical sources of uncertainty. This determination gives, directly from data, gc2(GMO)/4π = 14.17±0.09 (statistical) ±0.17 (systematic), or fc2/4π = 0.078(11). This value is half-way between that of indirect methods (phase-shift analyses) and the direct evaluation from backward np differential scattering cross sections (extrapolation to the pion pole). From the π-p and π-d scattering lengths, our analysis also leads to accurate values for (1/2)(aπ-p+aπ-n) and (1/2)(aπ-p-aπ-n).
Application of one-way ANOVA in completely randomized experiments
NASA Astrophysics Data System (ADS)
Wahid, Zaharah; Izwan Latiff, Ahmad; Ahmad, Kartini
2017-12-01
This paper describes an application of the statistical technique of one-way ANOVA in completely randomized experiments with three replicates. The technique was applied to a single factor with four levels and multiple observations at each level. The aim of this study is to investigate the relationship between the chemical oxygen demand index and on-site location. Two different approaches are employed for the analyses: the critical-value approach and the p-value approach. The paper also presents the key assumptions of the technique that must be satisfied by the data in order to obtain valid results. Pairwise comparisons by the Tukey method are also considered and discussed to determine where the significant differences among the means lie after the ANOVA has been performed. The results revealed a statistically significant relationship between the chemical oxygen demand index and on-site location.
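The design above (one factor, four levels, three replicates) can be sketched with SciPy. The chemical-oxygen-demand readings below are hypothetical, not the paper's data; `f_oneway` carries out the one-way ANOVA F-test:

```python
from scipy import stats

# hypothetical COD index readings at four locations, three replicates each
locations = {
    "A": [12.1, 12.5, 11.9],
    "B": [14.0, 14.4, 13.8],
    "C": [12.2, 12.0, 12.6],
    "D": [16.1, 15.7, 16.3],
}
f, p = stats.f_oneway(*locations.values())
alpha = 0.05
# p-value approach: reject H0 (all location means equal) if p < alpha;
# the critical-value approach compares f against the F(3, 8) critical value.
print(p < alpha)  # → True
```

Rejecting H0 only says that at least one mean differs; a pairwise follow-up such as Tukey's HSD (available as `scipy.stats.tukey_hsd` in recent SciPy versions) then locates which location pairs differ.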
Qu, Shu-Gen; Gao, Jin; Tang, Bo; Yu, Bo; Shen, Yue-Ping; Tu, Yu
2018-05-01
Low-dose ionizing radiation (LDIR) may increase the mortality of solid cancers in nuclear industry workers, but only a few individual cohort studies exist, and the available reports have low statistical power. The aim of the present study was to assess solid cancer mortality risk from LDIR in the nuclear industry using standardized mortality ratios (SMRs) and 95% confidence intervals. A systematic literature search of the PubMed and Embase databases identified 27 studies relevant to this meta-analysis. There was statistical significance for total, solid and lung cancers, with meta-SMR values of 0.88, 0.80, and 0.89, respectively. There was evidence of stochastic effects of IR, but more definitive conclusions require additional analyses using standardized protocols to determine whether LDIR increases the risk of solid cancer-related mortality.
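The SMR-with-confidence-interval machinery the meta-analysis builds on can be sketched for a single cohort. The observed and expected death counts below are hypothetical, and the log-scale standard error is a standard approximation, not the meta-analytic model the authors used:

```python
import math

def smr_ci(observed, expected, z=1.96):
    """SMR = observed/expected deaths, with an approximate 95% CI computed
    on the log scale (SE of log SMR ≈ 1/sqrt(observed))."""
    smr = observed / expected
    se = 1.0 / math.sqrt(observed)
    return smr, smr * math.exp(-z * se), smr * math.exp(z * se)

# hypothetical cohort: 80 solid-cancer deaths observed, 100.0 expected
smr, lo, hi = smr_ci(80, 100.0)
print(round(smr, 2), hi < 1.0)  # → 0.8 True
```

A confidence interval that excludes 1.0, as here, is what makes an SMR "statistically significant"; the meta-SMRs of 0.88, 0.80, and 0.89 reported above are pooled versions of such ratios across the 27 studies.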
Papageorgiou, Spyridon N; Kloukos, Dimitrios; Petridis, Haralampos; Pandis, Nikolaos
2015-10-01
To assess the hypothesis that there is excessive reporting of statistically significant studies published in prosthodontic and implantology journals, which could indicate selective publication. The last 30 issues of 9 journals in prosthodontics and implant dentistry were hand-searched for articles with statistical analyses. The percentages of significant and non-significant results were tabulated by parameter of interest. Univariable/multivariable logistic regression analyses were applied to identify possible predictors of reporting statistically significant findings. The results of this study were compared with similar studies in dentistry using random-effects meta-analyses. Of the 2323 included studies, 71% reported statistically significant results, with the significant results ranging from 47% to 86%. Multivariable modeling identified geographical area and involvement of a statistician as predictors of statistically significant results. Compared to interventional studies, the odds that in vitro and observational studies would report statistically significant results were increased by 1.20 times (OR: 2.20, 95% CI: 1.66-2.92) and 0.35 times (OR: 1.35, 95% CI: 1.05-1.73), respectively. The probability of statistically significant results from randomized controlled trials was significantly lower compared to various study designs (difference: 30%, 95% CI: 11-49%). Likewise, the probability of statistically significant results in prosthodontics and implant dentistry was lower compared to other dental specialties, but this result did not reach statistical significance (P>0.05). The majority of studies identified in the fields of prosthodontics and implant dentistry presented statistically significant results. The same trend existed in publications of other specialties in dentistry. Copyright © 2015 Elsevier Ltd. All rights reserved.
Saliu, Abdulsalam; Adebayo, Onajole; Kofoworola, Odeyemi; Babatunde, Ogunowo; Ismail, Abdussalam
2015-01-01
Occupational exposure to lead is common among automobile technicians and constitutes 0.9% of the total global health burden, with a majority of cases in developing countries. The aim of this study was to determine and compare the blood lead levels of automobile technicians in roadside and organised garages in Lagos State, Nigeria. This was a comparative cross-sectional study. Data were collected using interviewer-administered questionnaires. Physical examinations were conducted and blood was analysed for lead using atomic spectrophotometry. Statistical analyses were performed to compare the median blood lead levels of each group using the independent-sample (Mann-Whitney U) test. Seventy-three (40.3%) of the organised group compared to 59 (34.3%) of the roadside group had high blood lead levels. The organised group had a statistically significantly higher median blood lead level (66.0 µg/dL) than the roadside group (43.5 µg/dL) (P < 0.05). There was also a statistically significant association between high blood lead levels and abnormal discolouration of the mucosa of the mouth in the organised group. Automobile technicians in organised garages in Lagos have a higher prevalence of elevated blood lead levels and a higher median level than the roadside group. Preventive strategies against lead exposure should be instituted by employers, and further actions should be taken to minimize exposures, improve work practices, implement engineering controls (e.g., proper ventilation), and ensure the use of personal protective equipment.
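The Mann-Whitney U comparison of medians used above is appropriate for skewed exposure data like blood lead levels. The sketch below generates hypothetical lognormal samples centered on the reported medians (66.0 and 43.5 µg/dL); the group sizes and spread are assumptions, not the study's raw data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# hypothetical blood lead levels (µg/dL); lognormal, so medians, not means,
# are the natural summary and a rank-based test is used
organised = rng.lognormal(mean=np.log(66.0), sigma=0.4, size=181)
roadside = rng.lognormal(mean=np.log(43.5), sigma=0.4, size=172)
u, p = stats.mannwhitneyu(organised, roadside, alternative="two-sided")
print(p < 0.05)  # → True: median levels differ between the groups
```

Because the test compares ranks rather than means, it needs no normality assumption, which is why it is preferred over a t-test for this kind of right-skewed biomarker distribution.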
Jezberová, Jitka; Jezbera, Jan; Brandt, Ulrike; Lindström, Eva S.; Langenheder, Silke; Hahn, Martin W.
2010-01-01
Summary: We present a survey on the distribution and habitat range of P. necessarius subspecies asymbioticus (PnecC), an important taxon in the water column of freshwater systems. We systematically sampled stagnant freshwater habitats in a heterogeneous 2000 km2 area, together with ecologically different habitats outside this area. In total, 137 lakes, ponds and puddles were investigated, representing an enormous diversity of habitats differing, e.g., in depth (<10 cm - 171 m) and pH (3.9 - 8.5). PnecC was detected by cultivation-independent methods in all investigated habitats, and its presence was confirmed by cultivation of strains from selected habitats, including the most extreme ones. The determined relative abundance of the subspecies ranged from slightly above 0% to 67% (average 14.5% ± 14.3%), and the highest observed absolute abundance was 5.3×106 cells mL−1. Statistical analyses revealed that the abundance of PnecC is partially controlled by low conductivity and pH and by factors linked to concentrations of humic substances, which might support the hypothesis that these bacteria utilize photodegradation products of humic substances. Based on the revealed statistical relationships, an average relative abundance of this subspecies of 20% in global freshwater habitats was extrapolated. Our study provides important implications for the current debate on ubiquity and biogeography in microorganisms. PMID:20041938
Andrus, J Malia; Porter, Matthew D; Rodríguez, Luis F; Kuehlhorn, Timothy; Cooke, Richard A C; Zhang, Yuanhui; Kent, Angela D; Zilles, Julie L
2014-02-01
Denitrifying biofilters can remove agricultural nitrates from subsurface drainage, reducing nitrate pollution that contributes to coastal hypoxic zones. The performance and reliability of natural and engineered systems dependent upon microbially mediated processes, such as denitrifying biofilters, can be affected by the spatial structure of their microbial communities. Furthermore, our understanding of the relationship between microbial community composition and function is influenced by the spatial distribution of samples. In this study we characterized the spatial structure of bacterial communities in a denitrifying biofilter in central Illinois. Bacterial communities were assessed using automated ribosomal intergenic spacer analysis for bacteria and terminal restriction fragment length polymorphism of nosZ for denitrifying bacteria. Non-metric multidimensional scaling and analysis of similarity (ANOSIM) analyses indicated that bacteria showed statistically significant spatial structure by depth and transect, while denitrifying bacteria did not exhibit significant spatial structure. For determination of spatial patterns, we developed a package of automated functions for the R statistical environment that allows directional analysis of microbial community composition data using either ANOSIM or Mantel statistics. Applying this package to the biofilter data, the flow-path correlation range for the bacterial community was 6.4 m at the shallower, periodically inundated depth and 10.7 m at the deeper, continually submerged depth. These spatial structures suggest a strong influence of hydrology on the microbial community composition in these denitrifying biofilters. Understanding such spatial structure can also guide optimal sample-collection strategies for microbial community analyses.
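The Mantel statistic underlying the spatial analysis correlates two distance matrices, here spatial distance and community dissimilarity. The sketch below is a plain (non-directional) Mantel test in Python, not the authors' R package; the transect positions and dissimilarities are synthetic:

```python
import numpy as np

def mantel(D1, D2, n_perm=999, seed=0):
    """Mantel test: Pearson correlation between the upper triangles of two
    distance matrices, with a one-sided permutation p-value."""
    iu = np.triu_indices_from(D1, k=1)
    x = D1[iu]
    r_obs = np.corrcoef(x, D2[iu])[0, 1]
    rng = np.random.default_rng(seed)
    n = D1.shape[0]
    exceed = 0
    for _ in range(n_perm):
        perm = rng.permutation(n)  # relabel samples in the second matrix
        if np.corrcoef(x, D2[np.ix_(perm, perm)][iu])[0, 1] >= r_obs:
            exceed += 1
    return r_obs, (exceed + 1) / (n_perm + 1)

rng = np.random.default_rng(4)
pos = np.linspace(0.0, 12.0, 15)  # hypothetical sample positions along a flow path (m)
D_space = np.abs(pos[:, None] - pos[None, :])
# community dissimilarity rises with spatial distance, plus noise
D_comm = D_space + rng.uniform(0, 2, (15, 15))
D_comm = (D_comm + D_comm.T) / 2.0  # symmetrize
np.fill_diagonal(D_comm, 0.0)
r, p = mantel(D_space, D_comm)
print(r > 0.5, p < 0.05)  # → True True
```

A significant positive Mantel correlation like this one indicates spatial autocorrelation in community composition; the directional variant the authors developed restricts the comparison to pairs oriented along a chosen axis, such as the flow path.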
Statistical Characterization of School Bus Drive Cycles Collected via Onboard Logging Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duran, A.; Walkowicz, K.
In an effort to characterize the dynamics typical of school bus operation, National Renewable Energy Laboratory (NREL) researchers set out to gather in-use duty cycle data from school bus fleets operating across the country. Employing a combination of Isaac Instruments GPS/CAN data loggers in conjunction with existing onboard telemetric systems resulted in the capture of operating information for more than 200 individual vehicles in three geographically unique domestic locations. In total, over 1,500 individual operational route shifts from Washington, New York, and Colorado were collected. Upon completing the collection of in-use field data using either NREL-installed data acquisition devices or existing onboard telemetry systems, large-scale duty-cycle statistical analyses were performed to examine underlying vehicle dynamics trends within the data and to explore vehicle operation variations between fleet locations. Based on the results of these analyses, high, low, and average vehicle dynamics requirements were determined, resulting in the selection of representative standard chassis dynamometer test cycles for each condition. In this paper, the methodology and accompanying results of the large-scale duty-cycle statistical analysis are presented, including graphical and tabular representations of a number of relationships between key duty-cycle metrics observed within the larger data set. In addition to presenting the results of this analysis, conclusions are drawn and presented regarding potential applications of advanced vehicle technology as it relates specifically to school buses.
Gai, Liping; Liu, Hui; Cui, Jing-Hui; Yu, Weijian; Ding, Xiao-Dong
2017-03-20
The purpose of this study was to examine the specific allele combinations of three loci associated with liver cancer, stomach cancer, hematencephalon, and chronic obstructive pulmonary disease (COPD), and to explore the feasibility of the research methods. We explored different mathematical methods for statistical analyses to assess the association between genotype and phenotype. We also analysed the statistical results for allele combinations of three loci by the difference-value method and the ratio method. DNA blood samples were collected from 50 patients with liver cancer, 75 with stomach cancer, 50 with hematencephalon, 72 with COPD, and 200 normal controls; all samples were from Chinese individuals. Alleles from short tandem repeat (STR) loci were determined using the STR Profiler Plus PCR amplification kit (15 STR loci). Previous research was based on combinations of single-locus alleles and combinations of cross-locus (two-locus) alleles. Allele combinations of three loci were obtained by computer counting, and a stronger genetic signal was obtained. The method of three-locus allele combinations can help to identify statistically significant differences in allele combinations between patients with liver cancer, stomach cancer, hematencephalon, or COPD and the normal population. The probability of illness followed different rules and had apparent specificity. This method can be extended to other diseases and provide a reference for early clinical diagnosis. Copyright © 2016. Published by Elsevier B.V.
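The computer counting of three-locus allele combinations described above can be sketched with itertools; the difference-value and ratio comparisons then operate on per-group combination frequencies. Data structures, locus names, and function names here are illustrative, not the study's actual code:

```python
from collections import Counter
from itertools import combinations

def three_locus_combos(genotype):
    """genotype: locus -> observed allele for one sample; yields every
    combination of alleles across three distinct loci."""
    loci = sorted(genotype)
    for trio in combinations(loci, 3):
        yield tuple((locus, genotype[locus]) for locus in trio)

def combo_frequencies(samples):
    counts = Counter()
    for genotype in samples:
        counts.update(three_locus_combos(genotype))
    return counts

def compare_groups(cases, controls):
    """Difference-value and ratio of combination frequencies, case vs control."""
    case_counts = combo_frequencies(cases)
    ctrl_counts = combo_frequencies(controls)
    results = {}
    for combo in set(case_counts) | set(ctrl_counts):
        f_case = case_counts[combo] / len(cases)
        f_ctrl = ctrl_counts[combo] / len(controls)
        ratio = f_case / f_ctrl if f_ctrl else float('inf')
        results[combo] = (f_case - f_ctrl, ratio)
    return results
```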
Wright, David K.; MacEachern, Scott; Lee, Jaeyong
2014-01-01
The locations of diy-geδ-bay (DGB) sites in the Mandara Mountains, northern Cameroon are hypothesized to occur as a function of their ability to see and be seen from points on the surrounding landscape. A series of geostatistical, two-way and Bayesian logistic regression analyses were performed to test two hypotheses related to the intervisibility of the sites to one another and their visual prominence on the landscape. We determine that the intervisibility of the sites to one another is highly statistically significant when compared to 10 stratified-random permutations of DGB sites. Bayesian logistic regression additionally demonstrates that the visibility of the sites to points on the surrounding landscape is statistically significant. The location of sites appears to have also been selected on the basis of lower slope than random permutations of sites. Using statistical measures, many of which are not commonly employed in archaeological research, to evaluate aspects of visibility on the landscape, we conclude that the placement of DGB sites improved their conspicuousness for enhanced ritual, social cooperation and/or competition purposes. PMID:25383883
Cognitive predictors of balance in Parkinson's disease.
Fernandes, Ângela; Mendes, Andreia; Rocha, Nuno; Tavares, João Manuel R S
2016-06-01
Postural instability is one of the most incapacitating symptoms of Parkinson's disease (PD) and appears to be related to cognitive deficits. This study aims to determine the cognitive factors that can predict deficits in static and dynamic balance in individuals with PD. A sociodemographic questionnaire was used to characterize the 52 individuals with PD in this work. The Trail Making Test, Rule Shift Cards Test, and Digit Span Test assessed the executive functions. Static balance was assessed using a plantar pressure platform, and dynamic balance was based on the Timed Up and Go Test. The results were statistically analysed using SPSS Statistics software through linear regression analysis. The results show that a statistically significant model based on cognitive outcomes was able to explain the variance of the motor variables. Also, the explanatory value of the model tended to increase with the addition of individual and clinical variables, although the resulting model was not statistically significant. The model explained 25-29% of the variability of the Timed Up and Go Test, while for the anteroposterior displacement it was 23-34%, and for the mediolateral displacement it was 24-39%. From these findings, we conclude that cognitive performance, especially the executive functions, is a predictor of balance deficits in individuals with PD.
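The variance-explained percentages above are R-squared values from least-squares regression. A single-predictor sketch (the predictor and outcome values are made up, e.g. an executive-function score against Timed Up and Go time; the study's multi-predictor SPSS models are not reproduced):

```python
def ols_fit(x, y):
    """Simple linear regression; returns (slope, intercept, R^2),
    where R^2 is the proportion of outcome variance explained."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((b - (intercept + slope * a)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return slope, intercept, 1.0 - ss_res / ss_tot
```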
Tips and Tricks for Successful Application of Statistical Methods to Biological Data.
Schlenker, Evelyn
2016-01-01
This chapter discusses experimental design and the use of statistics to describe characteristics of data (descriptive statistics) and inferential statistics that test the hypothesis posed by the investigator. Inferential statistics, based on probability distributions, depend upon the type and distribution of the data. For data that are continuous, randomly and independently selected, and normally distributed, more powerful parametric tests such as Student's t test and analysis of variance (ANOVA) can be used. For non-normally distributed or skewed data, transformation of the data (using logarithms) may normalize the data, allowing use of parametric tests. Alternatively, with skewed data nonparametric tests can be utilized, some of which rely on data that are ranked prior to statistical analysis. Experimental designs and analyses need to balance between committing type 1 errors (false positives) and type 2 errors (false negatives). For a variety of clinical studies that determine risk or benefit, relative risk ratios (randomized clinical trials and cohort studies) or odds ratios (case-control studies) are utilized. Although both use 2 × 2 tables, their premise and calculations differ. Finally, special statistical methods are applied to microarray and proteomics data, since the large number of genes or proteins evaluated increases the likelihood of false discoveries. Additional studies in separate samples are used to verify microarray and proteomic data. Examples in this chapter and the references are available to help continued investigation of experimental designs and appropriate data analysis.
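The distinction drawn above between relative risk and odds ratios can be made concrete from a single 2 × 2 table; a minimal sketch (cell counts are illustrative):

```python
def risk_and_odds(a, b, c, d):
    """2 x 2 table: a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without.
    Returns (relative risk, odds ratio). The two differ because relative
    risk compares risks a/(a+b) vs c/(c+d), while the odds ratio
    compares odds a/b vs c/d."""
    relative_risk = (a / (a + b)) / (c / (c + d))
    odds_ratio = (a * d) / (b * c)
    return relative_risk, odds_ratio
```

For rare outcomes the two converge, which is why the odds ratio from a case-control study is often read as an approximation to relative risk.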
Jones, Jaime R; Neff, Linda J; Ely, Elizabeth K; Parker, Andrew M
2012-12-01
The Cities Readiness Initiative is a federally funded program designed to assist 72 metropolitan statistical areas (MSAs) in preparing to dispense life-saving medical countermeasures within 48 hours of a public health emergency. Beginning in 2008, the 72 MSAs were required to conduct 3 drills related to the distribution and dispensing of emergency medical countermeasures. The report describes the results of the first year of pilot data for medical countermeasure drills conducted by the MSAs. The MSAs were provided templates with key metrics for 5 functional elements critical for a successful dispensing campaign: personnel call down, site activation, facility setup, pick-list generation, and dispensing throughput. Drill submissions were compiled into single data sets for each of the 5 drills. Analyses were conducted to determine whether the measures were comparable across business and non-business hours. Descriptive statistics were computed for each of the key metrics identified in the 5 drills. Most drills were conducted on Mondays and Wednesdays during business hours (8:00 am-5:00 pm). The median completion time for the personnel call-down drill was 1 hour during business hours (n = 287) and 55 minutes during non-business hours (n = 136). Site-activation drills were completed in a median of 30 minutes during business hours and 5 minutes during non-business hours. Facility setup drills were completed more rapidly during business hours (75 minutes) compared with non-business hours (96 minutes). During business hours, pick lists were generated in a median of 3 minutes compared with 5 minutes during non-business hours. Aggregate results from the dispensing throughput drills demonstrated that the median observed throughput during business hours (60 people/h) was higher than that during non-business hours (43 people/h). 
The results of the analyses from this pilot sample of drill submissions provide a baseline for the determination of a national standard in operational capabilities for local jurisdictions to achieve in their planning efforts for a mass dispensing campaign during an emergency.
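The per-drill medians reported above are ordinary descriptive statistics grouped by business vs. non-business hours; a minimal sketch with made-up records (the pilot's actual dataset and field names are not reproduced):

```python
from statistics import median

def medians_by_hours(records):
    """records: (completion_minutes, is_business_hours) pairs.
    Returns the median completion time for business and non-business hours."""
    business = [m for m, biz in records if biz]
    off_hours = [m for m, biz in records if not biz]
    return median(business), median(off_hours)
```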
Burkhart, Timothy A; Asa, Benjamin; Payne, Michael W C; Johnson, Marjorie; Dunning, Cynthia E; Wilson, Timothy D
2015-02-01
A result of below-knee amputations (BKAs) is abnormal motion that occurs about the proximal tibiofibular joint (PTFJ). While it is known that joint morphology may play a role in joint kinematics, this is not well understood with respect to the PTFJ. Therefore, the purposes of this study were: (i) to characterize the anatomy of the PTFJ and statistically analyze the relationships within the joint; and (ii) to determine the relationships between the PTFJ characteristics and the degree of movement of the fibula in BKAs. The PTFJ was characterized in 40 embalmed specimens disarticulated at the knee, and amputated through the mid-tibia and fibula. Four metrics were measured: inclination angle (angle at which the fibula articulates with the tibia); tibial and fibular articular surface areas; articular surface concavity and shape. The specimens were mechanically tested by applying a load through the biceps femoris tendon, and the degree of motion about the tibiofibular joint was measured. Regression analyses were performed to determine the relationships between the different PTFJ characteristics and the magnitude of fibular abduction. Finally, Pearson correlation analyses were performed on inclination angle and surface area vs. fibular kinematics. The inclination angle measured on the fibula was significantly greater than that measured on the tibia. This difference may be attributed to differences in concavity of the tibial and fibular surfaces. Surface area measured on the tibia and fibula was not statistically different. The inclination angle was not statistically correlated to surface area. However, when correlating fibular kinematics in BKAs, inclination angle was positively correlated to the degree of fibular abduction, whereas surface area was negatively correlated. The characteristics of the PTFJ dictate the amount of fibular movement, specifically, fibular abduction in BKAs. 
Predicting BKA complications based on PTFJ characteristics can lead to recommendations in treatment. © 2014 Anatomical Society.
Bagshaw, Clarence; Isdell, Allen E; Thiruvaiyaru, Dharma S; Brisbin, I Lehr; Sanchez, Susan
2014-06-01
More than thirty years have passed since canine parvovirus (CPV) emerged as a significant pathogen and it continues to pose a severe threat to world canine populations. Published information suggests that flies (Diptera) may play a role in spreading this virus; however, they have not been studied extensively and the degree of their involvement is not known. This investigation was directed toward evaluating the vector capacity of such flies and determining their potential role in the transmission and ecology of CPV. Molecular diagnostic methods were used in this cross-sectional study to detect the presence of CPV in flies trapped at thirty-eight canine facilities. The flies involved were identified as belonging to the house fly (Muscidae), flesh fly (Sarcophagidae) and blow/bottle fly (Calliphoridae) families. A primary surveillance location (PSL) was established at a canine facility in south-central South Carolina, USA, to identify fly-virus interaction within the canine facility environment. Flies trapped at this location were pooled monthly and assayed for CPV using polymerase chain reaction (PCR) methods. These insects were found to be positive for CPV every month from February through the end of November 2011. Fly vector behavior and seasonality were documented and potential environmental risk factors were evaluated. Statistical analyses were conducted to compare the mean numbers of each of the three fly families captured, and after determining fly CPV status (positive or negative), it was determined whether there were significant relationships between numbers of flies captured, seasonal numbers of CPV cases, temperature and rainfall. Flies were also sampled at thirty-seven additional canine facility surveillance locations (ASL) and at four non-canine animal industry locations serving as negative field controls. Canine facility risk factors were identified and evaluated.
Statistical analyses were conducted on the number of CPV cases reported within the past year to determine the correlation of fly CPV status (positive or negative) for each facility, facility design (open or closed), mean number of dogs present monthly, and number of flies captured. Significant differences occurred between fly CPV-positive and CPV-negative sites with regard to their CPV case numbers, fly numbers captured, and number of dogs present. At the ASL, a statistically significant relationship was found between PCR-determined fly CPV status (positive or negative) and facility design (open vs. closed). Open-design facilities were likely to have more CPV outbreaks and were more likely to have flies testing positive for CPV DNA. Copyright © 2014 Elsevier B.V. All rights reserved.
River water quality and pollution sources in the Pearl River Delta, China.
Ouyang, Tingping; Zhu, Zhaoyu; Kuang, Yaoqiu
2005-07-01
Some physicochemical parameters were determined for thirty field water samples collected from different water channels in the Pearl River Delta Economic Zone river system. The analytical results were compared with the environmental quality standards for surface water. Using the SPSS software, statistical analyses were performed to determine the main pollutants of the river water. The main purpose of the present research is to investigate the river water quality and to determine the main pollutants and pollution sources. Furthermore, the research provides some approaches for protecting and improving river water quality. The results indicate that the predominant pollutants are ammonium, phosphorus, and organic compounds. The wastewater discharged from households in urban and rural areas, industrial facilities, and non-point sources from agricultural areas are the main sources of pollution in river water in the Pearl River Delta Economic Zone.
Validation of Endogenous Internal Real-Time PCR Controls in Renal Tissues
Cui, Xiangqin; Zhou, Juling; Qiu, Jing; Johnson, Martin R.; Mrug, Michal
2009-01-01
Background Endogenous internal controls (‘reference’ or ‘housekeeping’ genes) are widely used in real-time PCR (RT-PCR) analyses. Their use relies on the premise of consistently stable expression across studied experimental conditions. Unfortunately, none of these controls fulfills this premise across a wide range of experimental conditions; consequently, none of them can be recommended for universal use. Methods To determine which endogenous RT-PCR controls are suitable for analyses of renal tissues altered by kidney disease, we studied the expression of 16 commonly used ‘reference genes’ in 7 mildly and 7 severely affected whole kidney tissues from a well-characterized cystic kidney disease model. Expression levels of these 16 genes, determined by TaqMan® RT-PCR analyses and Affymetrix GeneChip® arrays, were normalized and tested for overall variance and equivalence of the means. Results Both statistical approaches and both TaqMan- and GeneChip-based methods converged on 3 out of the 4 top-ranked genes (Ppia, Gapdh and Pgk1) that had the most constant expression levels across the studied phenotypes. Conclusion A combination of the top-ranked genes will provide a suitable endogenous internal control for similar studies of kidney tissues across a wide range of disease severity. PMID:19729889
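One simple stability screen consistent with the variance-based ranking described above is to order candidate reference genes by their coefficient of variation across samples. This is an illustrative proxy, not the exact TaqMan/GeneChip equivalence tests used in the study, and the expression values below are made up:

```python
from statistics import mean, stdev

def rank_by_stability(expression):
    """expression: gene -> normalized expression values across samples.
    Returns genes ordered from most to least stable (lowest CV first)."""
    cv = {gene: stdev(v) / mean(v) for gene, v in expression.items()}
    return sorted(cv, key=cv.get)
```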
Neal, B; MacMahon, S
1999-01-01
Overviews (meta-analyses) of the major ongoing randomized trials of blood pressure lowering drugs will be conducted to determine the effects of: first, newer versus older classes of blood pressure lowering drugs in patients with hypertension; and second, blood pressure lowering treatments versus untreated or less treated control conditions in patient groups at high risk of cardiovascular events. The principal study outcomes are stroke, coronary heart disease, total cardiovascular events and total cardiovascular deaths. The overviews have been prospectively designed and will be conducted on individual patient data. The analyses will be conducted as a collaboration between the principal investigators of participating trials involving about 270,000 patients. Full data should be available in 2003, with the first round of analyses performed in 1999-2000. The combination of trial results should provide good statistical power to detect even modest differences between the effects on the main study outcomes.
A preliminary comparison between TOVS and GOME level 2 ozone data
NASA Astrophysics Data System (ADS)
Rathman, William; Monks, Paul S.; Llewellyn-Jones, David; Burrows, John P.
1997-09-01
A preliminary comparison between total column ozone concentration values derived from the TIROS Operational Vertical Sounder (TOVS) and the Global Ozone Monitoring Experiment (GOME) has been carried out. Two comparisons of ozone datasets have been made: (a) TOVS ozone analysis maps vs. GOME level 2 data; (b) TOVS data located at Northern Hemisphere Ground Ozone Stations (NHGOS) vs. GOME data. Both analyses consistently showed an offset in the value of the total column ozone between the datasets [35 Dobson Units (DU) for analysis (a); 10 DU for analysis (b)], despite a good correlation between the spatial and temporal features of the datasets. A noticeably poor correlation was observed in the latitudinal bands 10°/20° North and 10°/20° South; the reasons for this are discussed. The smallest region that was statistically representative of the ozone value correlation dataset of TOVS data at NHGOS and GOME level 2 data was determined to be a region enclosed by an effective radius of 0.75 arc-degrees (83.5 km).
Disentangling diatom species complexes: does morphometry suffice?
Borrego-Ramos, María; Olenici, Adriana
2017-01-01
Accurate taxonomic resolution in light microscopy analyses of microalgae is essential to achieve high quality, comparable results in both floristic analyses and biomonitoring studies. A number of closely related diatom taxa have been detected to date co-occurring within benthic diatom assemblages, sharing many morphological, morphometrical and ecological characteristics. In this contribution, we analysed the hypothesis that, where a large sample size (number of individuals) is available, common morphometrical parameters (valve length, width and stria density) are sufficient to achieve a correct identification to the species level. We focused on some common diatom taxa belonging to the genus Gomphonema. More than 400 valves and frustules were photographed in valve view and measured using Fiji software. Several statistical tools (mixture and discriminant analysis, k-means clustering, classification trees, etc.) were explored to test whether mere morphometry, independently of other valve features, leads to correct identifications, when compared to identifications made by experts. In view of the results obtained, morphometry-based determination in diatom taxonomy is discouraged. PMID:29250472
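Of the statistical tools listed, k-means clustering on the three morphometric parameters is straightforward to sketch. The pure-Python version below uses made-up (valve length, width, stria density) values and a deliberately simple deterministic initialization; it is an illustration of the technique, not the authors' analysis:

```python
def dist2(p, q):
    """Squared Euclidean distance between two feature vectors."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans(points, k, iters=20):
    """Cluster morphometric feature vectors into k groups; returns the
    cluster label of each point. Initialization: first k points (simple
    and deterministic, adequate for a sketch)."""
    centers = [list(p) for p in points[:k]]
    assign = [0] * len(points)
    for _ in range(iters):
        assign = [min(range(k), key=lambda j: dist2(p, centers[j]))
                  for p in points]
        for j in range(k):
            members = [p for p, a in zip(points, assign) if a == j]
            if members:
                centers[j] = [sum(c) / len(members) for c in zip(*members)]
    return assign
```

As the abstract's conclusion suggests, clusters recovered this way need not coincide with expert identifications when taxa overlap morphometrically.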
NASA Astrophysics Data System (ADS)
Azila Che Musa, Nor; Mahmud, Zamalia; Baharun, Norhayati
2017-09-01
One of the important skills required of any student learning statistics is knowing how to solve statistical problems correctly using appropriate statistical methods. This enables them to arrive at a conclusion and to make a significant contribution and decision for society. In this study, a group of 22 students majoring in statistics at UiTM Shah Alam were given problems relating to topics on hypothesis testing which required them to solve the problems using the confidence interval, traditional, and p-value approaches. Hypothesis testing is one of the techniques used in solving real problems and is listed as one of the concepts students find difficult to grasp. The objectives of this study are to explore students’ perceived and actual ability in solving statistical problems and to determine which items in statistical problem solving students find difficult to grasp. Students’ perceived and actual ability were measured using instruments developed from the respective topics. Rasch measurement tools such as the Wright map and item measures for fit statistics were used to accomplish the objectives. Data were collected and analysed using the Winsteps 3.90 software, which is developed based on the Rasch measurement model. The results showed that students perceived themselves as moderately competent in solving the statistical problems using the confidence interval and p-value approaches even though their actual performance showed otherwise. Item measures for fit statistics also showed that the maximum estimated measures were found on two problems. These measures indicate that none of the students attempted these problems correctly, for reasons that include their lack of understanding of confidence intervals and probability values.
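Winsteps estimates are based on the dichotomous Rasch model, in which the probability of a correct response depends only on the difference between person ability and item difficulty (both in logits). A minimal sketch, with illustrative parameter values:

```python
from math import exp

def rasch_probability(theta, difficulty):
    """Dichotomous Rasch model: P(correct) for a person of ability theta
    on an item of the given difficulty (both in logits)."""
    return 1.0 / (1.0 + exp(-(theta - difficulty)))
```

An item whose difficulty measure sits far above every student's ability, like the two maximum-measure problems reported above, yields near-zero success probabilities for the whole group.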
DESIGNING ENVIRONMENTAL MONITORING DATABASES FOR STATISTICAL ASSESSMENT
Databases designed for statistical analyses have characteristics that distinguish them from databases intended for general use. EMAP uses a probabilistic sampling design to collect data to produce statistical assessments of environmental conditions. In addition to supporting the ...
Comparing Visual and Statistical Analysis of Multiple Baseline Design Graphs.
Wolfe, Katie; Dickenson, Tammiee S; Miller, Bridget; McGrath, Kathleen V
2018-04-01
A growing number of statistical analyses are being developed for single-case research. One important factor in evaluating these methods is the extent to which each corresponds to visual analysis. Few studies have compared statistical and visual analysis, and information about more recently developed statistics is scarce. Therefore, our purpose was to evaluate the agreement between visual analysis and four statistical analyses: improvement rate difference (IRD); Tau-U; Hedges, Pustejovsky, Shadish (HPS) effect size; and between-case standardized mean difference (BC-SMD). Results indicate that IRD and BC-SMD had the strongest overall agreement with visual analysis. Although Tau-U had strong agreement with visual analysis on raw values, it had poorer agreement when those values were dichotomized to represent the presence or absence of a functional relation. Overall, visual analysis appeared to be more conservative than statistical analysis, but further research is needed to evaluate the nature of these disagreements.
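As a concrete illustration of one of the nonoverlap statistics compared here, the core of Tau (the A-vs-B nonoverlap component of Tau-U, without the baseline-trend correction) can be computed directly from phase data; a minimal sketch with illustrative data:

```python
def tau_ab(baseline, treatment):
    """Signed proportion of all baseline-treatment pairs in which the
    treatment observation exceeds the baseline observation (ties count 0).
    1.0 = complete nonoverlap in the improving direction."""
    pos = sum(1 for a in baseline for b in treatment if b > a)
    neg = sum(1 for a in baseline for b in treatment if b < a)
    return (pos - neg) / (len(baseline) * len(treatment))
```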
USDA-ARS?s Scientific Manuscript database
Agronomic and Environmental research experiments result in data that are analyzed using statistical methods. These data are unavoidably accompanied by uncertainty. Decisions about hypotheses, based on statistical analyses of these data are therefore subject to error. This error is of three types,...
The Large-Scale Structure of Semantic Networks: Statistical Analyses and a Model of Semantic Growth
ERIC Educational Resources Information Center
Steyvers, Mark; Tenenbaum, Joshua B.
2005-01-01
We present statistical analyses of the large-scale structure of 3 types of semantic networks: word associations, WordNet, and Roget's Thesaurus. We show that they have a small-world structure, characterized by sparse connectivity, short average path lengths between words, and strong local clustering. In addition, the distributions of the number of…
Determining Spatio-Temporal Cadastral Data Requirements for the Infrastructure of LADM for Turkey
NASA Astrophysics Data System (ADS)
Alkan, M.; Polat, Z. A.
2016-06-01
Nowadays, the nature of land title and cadastral (LTC) data in Turkey is dynamic from a temporal perspective, depending on LTC operations. Functional requirements with respect to these characteristics were investigated based upon interviews with professionals in the public and private sectors: legal authorities, Land Registry and Cadastre offices, highway departments, foundations, the Ministries of Budget, Transportation, Justice, Public Works and Settlement, Environment and Forestry, Agriculture and Rural Affairs, Culture, and Internal Affairs, the State Institute of Statistics (SIS), execution offices, tax offices, real estate offices, the private sector, local governments, and banks. Spatio-temporal LTC data are also a very important component in creating the infrastructure of the Land Administration Domain Model (LADM). For this reason, the spatio-temporal LTC data needed for the LADM must be not only up to date but also temporal. The investigation concluded by determining the temporal analyses of LTC data, the traditional LTC system, and the tracing of temporal analyses in the traditional LTC system. In the traditional system, the temporal analyses needed by all these users cannot be performed in a rapid and reliable way, because the traditional LTC system is a manual archiving system. The aims and general contents of this paper are: (1) to define the traditional LTC system of Turkey; and (2) to determine the need for spatio-temporal LTC data and analyses for the core domain model of the LADM. Given the temporal and spatio-temporal LTC data needs identified in the analyses, a new system design is important for the Turkish LADM model. Designing and realizing an efficient and functional temporal geographic information system (TGIS) is inevitable for the Turkish LADM core infrastructure. Finally, the outcome of this paper is an infrastructure for the design and development of an LADM for Turkey.
Physical Uncertainty Bounds (PUB)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vaughan, Diane Elizabeth; Preston, Dean L.
2015-03-19
This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.
Differences in Performance Among Test Statistics for Assessing Phylogenomic Model Adequacy.
Duchêne, David A; Duchêne, Sebastian; Ho, Simon Y W
2018-05-18
Statistical phylogenetic analyses of genomic data depend on models of nucleotide or amino acid substitution. The adequacy of these substitution models can be assessed using a number of test statistics, allowing the model to be rejected when it is found to provide a poor description of the evolutionary process. A potentially valuable use of model-adequacy test statistics is to identify when data sets are likely to produce unreliable phylogenetic estimates, but their differences in performance are rarely explored. We performed a comprehensive simulation study to identify test statistics that are sensitive to some of the most commonly cited sources of phylogenetic estimation error. Our results show that, for many test statistics, traditional thresholds for assessing model adequacy can fail to reject the model when the phylogenetic inferences are inaccurate and imprecise. This is particularly problematic when analysing loci that have few variable informative sites. We propose new thresholds for assessing substitution model adequacy and demonstrate their effectiveness in analyses of three phylogenomic data sets. These thresholds lead to frequent rejection of the model for loci that yield topological inferences that are imprecise and are likely to be inaccurate. We also propose the use of a summary statistic that provides a practical assessment of overall model adequacy. Our approach offers a promising means of enhancing model choice in genome-scale data sets, potentially leading to improvements in the reliability of phylogenomic inference.
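Adequacy thresholds of the kind proposed here are typically applied to a null distribution of the test statistic simulated under the fitted model. A generic sketch of that parametric-bootstrap step follows; the paper's actual test statistics and thresholds are not reproduced, and all names are illustrative:

```python
import random

def adequacy_p_value(observed, simulate_stat, n_sims=1000, seed=0):
    """One-sided p-value for a model-adequacy test statistic: the fraction
    of statistics simulated under the fitted model that reach the observed
    value. The model is rejected when the p-value falls below a chosen
    threshold."""
    rng = random.Random(seed)
    sims = [simulate_stat(rng) for _ in range(n_sims)]
    tail = sum(1 for s in sims if s >= observed)
    return (tail + 1) / (n_sims + 1)
```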
Systematic review of wireless phone use and brain cancer and other head tumors.
Repacholi, Michael H; Lerchl, Alexander; Röösli, Martin; Sienkiewicz, Zenon; Auvinen, Anssi; Breckenkamp, Jürgen; d'Inzeo, Guglielmo; Elliott, Paul; Frei, Patrizia; Heinrich, Sabine; Lagroye, Isabelle; Lahkola, Anna; McCormick, David L; Thomas, Silke; Vecchia, Paolo
2012-04-01
We conducted a systematic review of scientific studies to evaluate whether the use of wireless phones is linked to an increased incidence of the brain cancer glioma or other tumors of the head (meningioma, acoustic neuroma, and parotid gland), originating in the areas of the head that most absorb radiofrequency (RF) energy from wireless phones. Epidemiology and in vivo studies were evaluated according to an agreed protocol; quality criteria were used to evaluate the studies for narrative synthesis but not for meta-analyses or pooling of results. The epidemiology study results were heterogeneous, with sparse data on long-term use (≥ 10 years). Meta-analyses of the epidemiology studies showed no statistically significant increase in risk (defined as P < 0.05) for adult brain cancer or other head tumors from wireless phone use. Analyses of the in vivo oncogenicity, tumor promotion, and genotoxicity studies also showed no statistically significant relationship between exposure to RF fields and genotoxic damage to brain cells, or the incidence of brain cancers or other tumors of the head. Assessment of the review results using the Hill criteria did not support a causal relationship between wireless phone use and the incidence of adult cancers in the areas of the head that most absorb RF energy from the use of wireless phones. There are insufficient data to make any determinations about longer-term use (≥ 10 years). © 2011 Wiley Periodicals, Inc.
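Meta-analytic pooling of epidemiology results like those above can be illustrated with a fixed-effect, inverse-variance combination of study log odds ratios. This is a hedged sketch: the study values in the test are made up, and the review's actual pooling model is not specified here:

```python
from math import exp, sqrt

def fixed_effect_pool(log_odds_ratios, std_errors):
    """Inverse-variance pooled odds ratio with a 95% confidence interval.
    A CI that includes 1.0 corresponds to no statistically significant
    increase in risk at P < 0.05."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * y for w, y in zip(weights, log_odds_ratios)) / sum(weights)
    se = 1.0 / sqrt(sum(weights))
    return exp(pooled), (exp(pooled - 1.96 * se), exp(pooled + 1.96 * se))
```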
Full in-vitro analyses of new-generation bulk fill dental composites cured by halogen light.
Tekin, Tuçe Hazal; Kantürk Figen, Aysel; Yılmaz Atalı, Pınar; Coşkuner Filiz, Bilge; Pişkin, Mehmet Burçin
2017-08-01
The objective of this study was to perform full in-vitro analyses of new-generation bulk-fill dental composites cured by halogen light (HLG). Four composites of two types were studied: Surefill SDR (SDR) and Xtra Base (XB) as bulk-fill flowable materials; QuixFill (QF) and XtraFill (XF) as packable bulk-fill materials. Samples were prepared for each analysis and test by applying the same procedure, but with different diameters and thicknesses appropriate to the analysis and test requirements. Thermal properties were determined by thermogravimetric analysis (TG/DTG) and differential scanning calorimetry (DSC); the Vickers microhardness (VHN) was measured after 1, 7, 15 and 30 days of storage in water. The degree of conversion values for the materials (DC, %) were measured immediately using near-infrared spectroscopy (FT-IR). The surface morphology of the composites was investigated by scanning electron microscopy (SEM) and atomic-force microscopy (AFM). The sorption and solubility measurements were also performed after 1, 7, 15 and 30 days of storage in water. In addition, the data were statistically analyzed using one-way analysis of variance and both the Newman-Keuls and Tukey multiple comparison tests. The statistical significance level was established at p<0.05. According to the ISO 4049 standards, all the tested materials showed acceptable water sorption and solubility, and a halogen light source was an option to polymerize bulk-fill, resin-based dental composites. Copyright © 2017 Elsevier B.V. All rights reserved.
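The one-way ANOVA named above reduces to an F statistic comparing between-group to within-group variance; a minimal sketch (the measurement values in the test are illustrative, and the post hoc Newman-Keuls/Tukey steps are not reproduced):

```python
from statistics import mean

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over lists of measurements
    (e.g. microhardness values for each composite material)."""
    values = [v for g in groups for v in g]
    grand = mean(values)
    k, n = len(groups), len(values)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```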
Mulroy, Sara J; Winstein, Carolee J; Kulig, Kornelia; Beneck, George J; Fowler, Eileen G; DeMuth, Sharon K; Sullivan, Katherine J; Brown, David A; Lane, Christianne J
2011-12-01
Each of the 4 randomized clinical trials (RCTs) hosted by the Physical Therapy Clinical Research Network (PTClinResNet) targeted a different disability group (low back disorder in the Muscle-Specific Strength Training Effectiveness After Lumbar Microdiskectomy [MUSSEL] trial, chronic spinal cord injury in the Strengthening and Optimal Movements for Painful Shoulders in Chronic Spinal Cord Injury [STOMPS] trial, adult stroke in the Strength Training Effectiveness Post-Stroke [STEPS] trial, and pediatric cerebral palsy in the Pediatric Endurance and Limb Strengthening [PEDALS] trial for children with spastic diplegic cerebral palsy) and tested the effectiveness of a muscle-specific or functional activity-based intervention on primary outcomes that captured pain (STOMPS, MUSSEL) or locomotor function (STEPS, PEDALS). The focus of these secondary analyses was to determine causal relationships among outcomes across levels of the International Classification of Functioning, Disability and Health (ICF) framework for the 4 RCTs. With the database from PTClinResNet, we used 2 separate secondary statistical approaches-mediation analysis for the MUSSEL and STOMPS trials and regression analysis for the STEPS and PEDALS trials-to test relationships among muscle performance, primary outcomes (pain related and locomotor related), activity and participation measures, and overall quality of life. Predictive models were stronger for the 2 studies with pain-related primary outcomes. Change in muscle performance mediated or predicted reductions in pain for the MUSSEL and STOMPS trials and, to some extent, walking speed for the STEPS trial. Changes in primary outcome variables were significantly related to changes in activity and participation variables for all 4 trials. Improvement in activity and participation outcomes mediated or predicted increases in overall quality of life for the 3 trials with adult populations. 
Variables included in the statistical models were limited to those measured in the 4 RCTs. It is possible that other variables also mediated or predicted the changes in outcomes. The relatively small sample size in the PEDALS trial limited statistical power for those analyses. Evaluating the mediators or predictors of change between each ICF level and for 2 fundamentally different outcome variables (pain versus walking) provided insights into the complexities inherent across 4 prevalent disability groups.
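The mediation logic used in these secondary analyses can be illustrated numerically. The sketch below is a hypothetical toy example (not the PTClinResNet data or code): it decomposes a total effect into direct and indirect (mediated) components with ordinary least squares, exploiting the identity total = direct + indirect, which holds exactly for linear models fitted on the same sample. All variable names and values are invented.

```python
# Toy mediation analysis (product-of-coefficients): does a mediator M
# (e.g., change in muscle performance) carry the effect of X (e.g., dose)
# on an outcome Y (e.g., pain reduction)?  All data are invented.

def mean(v):
    return sum(v) / len(v)

def cov(x, y):
    mx, my = mean(x), mean(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)

def slope(x, y):
    # simple OLS slope of y on x
    return cov(x, y) / cov(x, x)

def two_predictor_slopes(x, m, y):
    # OLS slopes for y = b0 + b1*x + b2*m, via the normal equations
    sxx, smm, sxm = cov(x, x), cov(m, m), cov(x, m)
    sxy, smy = cov(x, y), cov(m, y)
    det = sxx * smm - sxm ** 2
    b1 = (sxy * smm - smy * sxm) / det   # direct effect of x
    b2 = (smy * sxx - sxy * sxm) / det   # effect of the mediator
    return b1, b2

# invented data: the mediator tracks x with a small deterministic wiggle
x = [float(i) for i in range(1, 21)]
m = [0.5 * xi + (0.3 if i % 2 else -0.3) for i, xi in enumerate(x)]
y = [2.0 * mi + (0.1 if i % 3 else -0.2) for i, mi in enumerate(m)]

a = slope(x, m)                      # path X -> M
direct, b = two_predictor_slopes(x, m, y)
total = slope(x, y)
indirect = a * b                     # mediated (indirect) effect

# for OLS the decomposition is exact: total = direct + indirect
print(f"total={total:.4f} direct={direct:.4f} indirect={indirect:.4f}")
```

In this toy dataset the indirect path carries nearly all of the total effect, i.e., essentially full mediation; a real mediation analysis would also attach standard errors (e.g., bootstrap) to the indirect effect.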
Hamed, Moath; Schraml, Frank; Wilson, Jeffrey; Galvin, James; Sabbagh, Marwan N
2018-01-01
To determine whether occipital and cingulate hypometabolism is being under-reported or missed on 18-fluorodeoxyglucose positron emission tomography (FDG-PET) CT scans in patients with Dementia with Lewy Bodies (DLB). Recent studies have reported higher sensitivity and specificity for occipital and cingulate hypometabolism on FDG-PET of DLB patients. This retrospective chart review examined regions of interest (ROIs) in FDG-PET CT scan reports for 35 consecutive patients with a clinical diagnosis of probable, possible, or definite DLB as defined by the latest DLB Consortium Report. ROIs consisting of glucose hypometabolism in frontal, parietal, temporal, occipital, and cingulate areas were tabulated and charted separately by the authors from the reports. A blinded nuclear medicine physician read the images independently and marked ROIs separately. Cohen's kappa coefficient was calculated to determine agreement between the reports and the blinded reads. On the radiology reports, 25.71% and 17.14% of patients had occipital and cingulate hypometabolism, respectively. The independent reads disagreed substantially with the proportions reported on the initial reads, identifying occipital and cingulate hypometabolism in 91.43% and 85.71% of patients, respectively. Cohen's kappa determinations demonstrated significant agreement only for parietal hypometabolism (p<0.05). Occipital and cingulate hypometabolism is under-reported and frequently missed on clinical interpretations of FDG-PET scans of patients with DLB, and the frequency of hypometabolism is even higher than previously reported. Further studies with more statistical power and receiver operating characteristic analyses are needed to delineate the sensitivity and specificity of these in vivo biomarkers.
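Cohen's kappa, the agreement statistic used in the study above, corrects raw percent agreement for the agreement expected by chance alone. A minimal stdlib sketch with hypothetical rater labels (not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical labels of the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    # chance agreement: sum over categories k of p_a(k) * p_b(k)
    expected = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)
    return (observed - expected) / (1 - expected)

# hypothetical labels: 1 = hypometabolism noted in an ROI, 0 = absent
report  = [0, 0, 1, 0, 1, 0, 0, 0]   # clinical report
blinded = [1, 1, 1, 0, 1, 1, 1, 0]   # blinded re-read
print(round(cohens_kappa(report, blinded), 3))
```

For these invented labels raw agreement is 50% but kappa is only 0.2, illustrating why chance-corrected agreement is reported rather than raw concordance.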
Slonim, Anthony D; Marcin, James P; Turenne, Wendy; Hall, Matt; Joseph, Jill G
2007-12-01
To determine the rates and the patient and institutional characteristics associated with the occurrence of patient safety indicators (PSIs) in hospitalized children, and the degree of statistical difference derived from using three approaches to controlling for institution-level effects. Pediatric Health Information System dataset consisting of all pediatric discharges (<21 years of age) from 34 academic, freestanding children's hospitals for calendar year 2003. The rates of PSIs were computed for all discharges, and the patient and institutional characteristics associated with these PSIs were calculated. The analyses sequentially applied three increasingly conservative methods to control for institution-level effects: robust standard error estimation, a fixed effects model, and a random effects model. The degree of difference from a "base state," which excluded institution-level variables, and between the models was calculated, and the effects of these analyses on the interpretation of the PSIs are presented. PSIs are relatively infrequent events in hospitalized children, ranging from 0 per 10,000 discharges (postoperative hip fracture) to 87 per 10,000 (postoperative respiratory failure). Variables significantly associated with PSIs included age (neonates), race (Caucasians), payor status (public insurance), severity of illness (extreme), and hospital size (>300 beds), all of which had higher rates of PSIs than their reference groups in the bivariable logistic regression results. The three approaches to adjusting for institution-level effects demonstrated similarities in both clinical and statistical significance across the models. Institution-level effects can be appropriately controlled for by a variety of methods in analyses of administrative data. Whenever possible, resource-conservative methods should be used, especially if the clinical implications are minimal.
Lin, Shuai-Chun; Pasquale, Louis R; Singh, Kuldev; Lin, Shan C
2018-03-01
The purpose of this article is to investigate the association between body mass index (BMI) and open-angle glaucoma (OAG) in a sample of the South Korean population. The sample consisted of a cross-sectional, population-based sample of 10,978 participants, 40 years of age and older, enrolled in the 2008 to 2011 Korean National Health and Nutrition Examination Survey. All participants had measured intraocular pressure <22 mm Hg and open anterior chamber angles. OAG was defined using disc and visual field criteria established by the International Society for Geographical and Epidemiological Ophthalmology. Multivariable analyses were performed to determine the association between BMI and OAG. These analyses were also performed in a sex-stratified and age-stratified manner. After adjusting for potential confounding variables, lower BMI (<19 kg/m²) was associated with greater risk of OAG compared with normal BMI (19 to 24.9 kg/m²) [odds ratio (OR), 2.28; 95% confidence interval (CI), 1.22-4.26]. In sex-stratified analyses, low BMI remained adversely related to glaucoma in women (OR, 3.45; 95% CI, 1.42-8.38) but not in men (OR, 1.72; 95% CI, 0.71-4.20). In age-stratified analyses, lower BMI was adversely related to glaucoma among subjects 40 to 49 years old (OR, 5.16; 95% CI, 1.86-14.36), but differences in glaucoma prevalence between those with low versus normal BMI were not statistically significant in the other age strata. Lower BMI was associated with increased odds of OAG in a sample of the South Korean population. Multivariable analysis revealed the association to be statistically significant in women and in the youngest age stratum.
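The odds ratios and confidence intervals quoted above come from multivariable logistic models, but the basic computation can be shown on a 2x2 table. A sketch with invented counts (not the KNHANES data), using the standard Woolf (log-odds) interval:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and ~95% CI for a 2x2 table:
    exposed cases=a, exposed controls=b, unexposed cases=c, unexposed controls=d."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of log(OR), Woolf method
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# invented counts: glaucoma vs no glaucoma, low BMI vs normal BMI
or_, lo, hi = odds_ratio_ci(a=10, b=90, c=40, d=860)
print(f"OR={or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

The published ORs additionally adjust for confounders, which a raw 2x2 table cannot do; this sketch only illustrates the unadjusted calculation and its interval.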
Measurement issues in research on social support and health.
Dean, K; Holst, E; Kreiner, S; Schoenborn, C; Wilson, R
1994-01-01
STUDY OBJECTIVE--The aims were: (1) to identify methodological problems that may explain the inconsistencies and contradictions in the research evidence on social support and health, and (2) to validate a frequently used measure of social support in order to determine whether or not it could be used in multivariate analyses of population data in research on social support and health. DESIGN AND METHODS--Secondary analysis of data collected in a cross sectional survey of a multistage cluster sample of the population of the United States, designed to study relationships among behavioural, social support, and health variables. Statistical models based on item response theory and graph theory were used to validate the measure of social support to be used in subsequent analyses. PARTICIPANTS--Data on 1755 men and women aged 20 to 64 years were available for the scale validation. RESULTS--Massive evidence of item bias was found for all items of a group membership subscale. The most serious problems were found in relation to an item measuring membership in work related groups. Using that item in the social network scale in multivariate analyses would distort findings on the statistical effects of education, employment status, and household income. Evidence of item bias was also found for a sociability subscale. When marital status was included to create what is called an intimate contacts subscale, the confounding grew worse. CONCLUSIONS--The composite measure of social network is not valid and would seriously distort the findings of analyses attempting to study relationships between the index and other variables. The findings show that valid measurement is a methodological issue that must be addressed in scientific research on population health. PMID:8189179
Post Hoc Analyses of ApoE Genotype-Defined Subgroups in Clinical Trials.
Kennedy, Richard E; Cutter, Gary R; Wang, Guoqiao; Schneider, Lon S
2016-01-01
Many post hoc analyses of clinical trials in Alzheimer's disease (AD) and mild cognitive impairment (MCI) are in small Phase 2 trials. Subject heterogeneity may lead to statistically significant post hoc results that cannot be replicated in larger follow-up studies. We investigated the extent of this problem using simulation studies mimicking current trial methods with post hoc analyses based on ApoE4 carrier status. We used a meta-database of 24 studies, including 3,574 subjects with mild AD and 1,171 subjects with MCI/prodromal AD, to simulate clinical trial scenarios. Post hoc analyses examined if rates of progression on the Alzheimer's Disease Assessment Scale-cognitive (ADAS-cog) differed between ApoE4 carriers and non-carriers. Across studies, ApoE4 carriers were younger and had lower baseline scores, greater rates of progression, and greater variability on the ADAS-cog. Up to 18% of post hoc analyses for 18-month trials in AD showed greater rates of progression for ApoE4 non-carriers that were statistically significant but unlikely to be confirmed in follow-up studies. The frequency of erroneous conclusions dropped below 3% with trials of 100 subjects per arm. In MCI, rates of statistically significant differences with greater progression in ApoE4 non-carriers remained below 3% unless sample sizes were below 25 subjects per arm. Statistically significant differences for ApoE4 in post hoc analyses often reflect heterogeneity among small samples rather than true differential effect among ApoE4 subtypes. Such analyses must be viewed cautiously. ApoE genotype should be incorporated into the design stage to minimize erroneous conclusions.
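The simulation logic behind this study can be sketched in a few lines: draw two subgroups from the same progression distribution (i.e., no true ApoE4 effect), test for a difference, and count how often a "significant" subgroup finding appears by chance. A toy version using a normal z-test approximation, with invented parameters rather than the authors' meta-database:

```python
import math
import random
from statistics import NormalDist

def false_positive_rate(n_per_arm, n_trials, rng):
    """Fraction of simulated trials in which two subgroups drawn from the SAME
    distribution nevertheless differ 'significantly' (two-sided z-test, alpha=.05)."""
    hits = 0
    for _ in range(n_trials):
        # invented ADAS-cog change scores; identical mean/SD for both subgroups
        g1 = [rng.gauss(5.0, 8.0) for _ in range(n_per_arm)]
        g2 = [rng.gauss(5.0, 8.0) for _ in range(n_per_arm)]
        m1, m2 = sum(g1) / n_per_arm, sum(g2) / n_per_arm
        v1 = sum((x - m1) ** 2 for x in g1) / (n_per_arm - 1)
        v2 = sum((x - m2) ** 2 for x in g2) / (n_per_arm - 1)
        z = (m1 - m2) / math.sqrt(v1 / n_per_arm + v2 / n_per_arm)
        p = 2 * (1 - NormalDist().cdf(abs(z)))
        hits += p < 0.05
    return hits / n_trials

rate = false_positive_rate(n_per_arm=25, n_trials=2000, rng=random.Random(1))
print(f"chance 'subgroup effects': {rate:.1%}")
```

With small arms the z approximation pushes the rate slightly above the nominal 5%, but the qualitative point matches the abstract: small subgroups readily produce spurious significant differences, which is why subgroup analyses belong in the design stage.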
Rao, Goutham; Lopez-Jimenez, Francisco; Boyd, Jack; D'Amico, Frank; Durant, Nefertiti H; Hlatky, Mark A; Howard, George; Kirley, Katherine; Masi, Christopher; Powell-Wiley, Tiffany M; Solomonides, Anthony E; West, Colin P; Wessel, Jennifer
2017-09-05
Meta-analyses are becoming increasingly popular, especially in the fields of cardiovascular disease prevention and treatment. They are often considered to be a reliable source of evidence for making healthcare decisions. Unfortunately, problems among meta-analyses such as the misapplication and misinterpretation of statistical methods and tests are long-standing and widespread. The purposes of this statement are to review key steps in the development of a meta-analysis and to provide recommendations that will be useful for carrying out meta-analyses and for readers and journal editors, who must interpret the findings and gauge methodological quality. To make the statement practical and accessible, detailed descriptions of statistical methods have been omitted. Based on a survey of cardiovascular meta-analyses, published literature on methodology, expert consultation, and consensus among the writing group, key recommendations are provided. Recommendations reinforce several current practices, including protocol registration; comprehensive search strategies; methods for data extraction and abstraction; methods for identifying, measuring, and dealing with heterogeneity; and statistical methods for pooling results. Other practices should be discontinued, including the use of levels of evidence and evidence hierarchies to gauge the value and impact of different study designs (including meta-analyses) and the use of structured tools to assess the quality of studies to be included in a meta-analysis. We also recommend choosing a pooling model for conventional meta-analyses (fixed effect or random effects) on the basis of clinical and methodological similarities among studies to be included, rather than the results of a test for statistical heterogeneity. © 2017 American Heart Association, Inc.
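The statement's recommendation to choose between fixed-effect and random-effects pooling can be made concrete with the standard inverse-variance machinery. A minimal sketch with illustrative effect sizes (not from any cited meta-analysis), using the DerSimonian-Laird estimate of between-study variance:

```python
def pool(effects, ses):
    """Inverse-variance fixed-effect and DerSimonian-Laird random-effects pooling.
    effects: per-study effect sizes (e.g., log risk ratios); ses: their SEs."""
    w = [1 / se ** 2 for se in ses]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q quantifies heterogeneity around the fixed-effect estimate
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    k = len(effects)
    tau2 = max(0.0, (q - (k - 1)) / (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))
    wr = [1 / (se ** 2 + tau2) for se in ses]     # random-effects weights
    random_ = sum(wi * yi for wi, yi in zip(wr, effects)) / sum(wr)
    return fixed, random_, q, tau2

# hypothetical log risk ratios and standard errors from five studies
effects = [-0.20, -0.05, -0.30, 0.10, -0.15]
ses     = [ 0.10,  0.12,  0.15, 0.08,  0.20]
fixed, rand_, q, tau2 = pool(effects, ses)
print(f"fixed={fixed:.3f} random={rand_:.3f} Q={q:.2f} tau2={tau2:.4f}")
```

When between-study variance is zero the two models coincide; as tau² grows, random-effects weights flatten and small studies gain influence, which is exactly why the statement advises choosing the model on clinical and methodological grounds rather than on a heterogeneity test alone.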
NASA Astrophysics Data System (ADS)
Tomkowska, Anna; Chmielewski, Krzysztof; Skrzyczanowski, Wojciech; Mularczyk-Oliwa, Monika; Ostrowski, Roman; Strzelec, Marek
2017-07-01
The aim of this research was to determine the composition and nature of the superficial deposits accumulated on selected mosaic tesserae from Lebanon. Three series of objects from different locations were selected, namely from seaside and mountain archaeological sites as well as from mosaics exposed in a city center. Stone and ceramic tesserae were analyzed. The selection of objects was dictated by the wide diversity of factors influencing the state of preservation and the composition of deposits at a given location. Investigations were performed using LIBS, FT-IR, Raman spectroscopy, and optical 3D microscopy. The experimental results included the composition and kind of deposit on the tesserae surfaces, as well as the composition of the tesserae themselves. Compounds in the superficial deposits were identified, and the occurrence of different encrustations was confirmed to depend on the geographic location of a given sample. The interpretation of results was supported by multivariate statistical techniques, especially factor analysis. These analyses constitute the first determination of the composition of deposits on the surface of mosaics from Lebanese territory.
Fujioka, Rumi; Mochizuki, Nobuo; Ikeda, Masafumi; Sato, Akihiro; Nomura, Shogo; Owada, Satoshi; Yomoda, Satoshi; Tsuchihara, Katsuya; Kishino, Satoshi
2018-01-01
Arctigenin is being evaluated for antitumor efficacy in patients with pancreatic cancer. It has an inhibitory activity on mitochondrial complex I. Therefore, the plasma lactate level of patients after arctigenin administration was evaluated as a biomarker of clinical response and/or adverse effects. Plasma lactate levels in 15 patients enrolled in a Phase I clinical trial of GBS-01, rich in arctigenin, were analyzed by colorimetric assay. Statistical analyses for associations among plasma lactate, clinical responses, the pharmacokinetics of arctigenin, and the background factors of each patient were performed by multivariate and univariate analyses. In about half of the patients, a transient increase in lactate was observed. No correlation was detected between plasma lactate level and the pharmacokinetic parameters of arctigenin and its glucuronide conjugate, or clinical outcome. Regarding the determinants of lactate level, only a slight association with liver function tests was detected. Plasma lactate level is primarily determined by reutilization rather than production for antitumor effect and does not serve as a biomarker. Arctigenin, inhibition of mitochondrial complex I, plasma lactate concentration, phase I clinical trial of GBS-01, Cori cycle. PMID:29856804
Predictors of mammography screening among ethnically diverse low-income women.
Cronan, Terry A; Villalta, Ian; Gottfried, Emily; Vaden, Yavette; Ribas, Mabel; Conway, Terry L
2008-05-01
Breast cancer is the second leading cause of cancer deaths among women in the United States. Minority women are less likely to be screened and more likely to die from breast cancer than are Caucasian women. Although some studies have examined ethnic disparities in mammography screening, no study has examined whether there are ethnic disparities among low-income, ethnically diverse women. The present study was designed to determine whether there are ethnic disparities in mammography screening and predictors of screening among low-income African American, Mexican American, and Caucasian women, and to determine whether the disparities and predictors vary across ethnic groups. The participants were 146 low-income women who were Mexican American (32%), African American (31%), or Caucasian (37%). Statistical analyses were performed to assess the relationships between mammography screening during the past 2 years and potential predictors of screening, both within ethnic groups and for the combined sample. The results varied depending on whether analyses combined ethnic groups or were performed within each of the three ethnic groups. It is, therefore, important to examine within-group differences when examining ethnic disparities in predictors of mammography.
Automated and assisted RNA resonance assignment using NMR chemical shift statistics
Aeschbacher, Thomas; Schmidt, Elena; Blatter, Markus; Maris, Christophe; Duss, Olivier; Allain, Frédéric H.-T.; Güntert, Peter; Schubert, Mario
2013-01-01
The three-dimensional structure determination of RNAs by NMR spectroscopy relies on chemical shift assignment, which still constitutes a bottleneck. In order to develop more efficient assignment strategies, we analysed relationships between sequence and 1H and 13C chemical shifts. Statistics of resonances from regularly Watson–Crick base-paired RNA revealed highly characteristic chemical shift clusters. We developed two approaches using these statistics for chemical shift assignment of double-stranded RNA (dsRNA): a manual approach that yields starting points for resonance assignment and simplifies decision trees and an automated approach based on the recently introduced automated resonance assignment algorithm FLYA. Both strategies require only unlabeled RNAs and three 2D spectra for assigning the H2/C2, H5/C5, H6/C6, H8/C8 and H1′/C1′ chemical shifts. The manual approach proved to be efficient and robust when applied to the experimental data of RNAs with a size between 20 nt and 42 nt. The more advanced automated assignment approach was successfully applied to four stem-loop RNAs and a 42 nt siRNA, assigning 92–100% of the resonances from dsRNA regions correctly. This is the first automated approach for chemical shift assignment of non-exchangeable protons of RNA and their corresponding 13C resonances, which provides an important step toward automated structure determination of RNAs. PMID:23921634
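The idea of exploiting characteristic chemical shift clusters for assignment can be illustrated with a nearest-centroid classifier for 2D (1H, 13C) peaks. The centroid values below are rough illustrative placeholders, not the statistics derived in the paper, and the scaling factors are likewise assumptions:

```python
import math

# Hypothetical (1H, 13C) cluster centroids in ppm for dsRNA resonance types.
# Illustrative placeholders only, NOT the published chemical shift statistics.
centroids = {
    "H5/C5":   (5.5, 97.0),
    "H1'/C1'": (5.6, 92.0),
    "H6/C6":   (7.7, 140.0),
    "H8/C8":   (7.9, 137.0),
    "H2/C2":   (7.6, 153.0),
}

def assign_peak(h_ppm, c_ppm, scale_h=1.0, scale_c=10.0):
    """Assign a 2D peak to the nearest centroid; the carbon axis is down-weighted
    because 13C shifts span a roughly 10x larger range than 1H shifts."""
    def dist(centroid):
        ch, cc = centroid
        return math.hypot((h_ppm - ch) / scale_h, (c_ppm - cc) / scale_c)
    return min(centroids, key=lambda k: dist(centroids[k]))

print(assign_peak(5.52, 96.5))   # a peak near the H5/C5 cluster
```

A real assignment strategy (manual or FLYA-based) must also resolve overlaps and enforce sequence consistency; the point here is only that well-separated clusters make the nearest-cluster decision cheap and reliable.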
Maulik, Ujjwal; Mallik, Saurav; Mukhopadhyay, Anirban; Bandyopadhyay, Sanghamitra
2015-01-01
Microarray and beadchip are two of the most efficient techniques for measuring gene expression and methylation data in bioinformatics. Biclustering deals with the simultaneous clustering of genes and samples. In this article, we propose a computational rule mining framework, StatBicRM (statistical biclustering-based rule mining), to identify special types of rules and potential biomarkers from biological datasets using integrated statistical and binary inclusion-maximal biclustering techniques. First, a novel statistical strategy is used to eliminate insignificant, low-significance, or redundant genes in such a way that the significance level satisfies the data distribution property (viz., either normal or non-normal distribution). The data are then discretized and post-discretized, consecutively. Thereafter, the biclustering technique is applied to identify maximal frequent closed homogeneous itemsets, and the corresponding special types of rules are extracted from the selected itemsets. Our proposed rule mining method performs better than other rule mining algorithms because it generates maximal frequent closed homogeneous itemsets instead of frequent itemsets; it therefore saves elapsed time and can work on large datasets. Pathway and Gene Ontology analyses are conducted on the genes of the evolved rules using the DAVID database, and frequency analysis of the genes appearing in the evolved rules is performed to determine potential biomarkers. Furthermore, we also classify the data to determine how accurately the evolved rules describe the remaining test (unknown) data. Subsequently, we compare the average classification accuracy and other related factors with those of other rule-based classifiers. Statistical significance tests are also performed to verify the statistical relevance of the comparative results. 
Here, each of the other rule mining methods or rule-based classifiers also starts from the same post-discretized data matrix. Finally, we include an integrated analysis of gene expression and methylation to determine the epigenetic effect (viz., the effect of methylation) on gene expression level. PMID:25830807
Effect Size Analyses of Souvenaid in Patients with Alzheimer's Disease.
Cummings, Jeffrey; Scheltens, Philip; McKeith, Ian; Blesa, Rafael; Harrison, John E; Bertolucci, Paulo H F; Rockwood, Kenneth; Wilkinson, David; Wijker, Wouter; Bennett, David A; Shah, Raj C
2017-01-01
Souvenaid® (uridine monophosphate, docosahexaenoic acid, eicosapentaenoic acid, choline, phospholipids, folic acid, vitamins B12, B6, C, and E, and selenium), was developed to support the formation and function of neuronal membranes. To determine effect sizes observed in clinical trials of Souvenaid and to calculate the number needed to treat to show benefit or harm. Data from all three reported randomized controlled trials of Souvenaid in Alzheimer's disease (AD) dementia (Souvenir I, Souvenir II, and S-Connect) and an open-label extension study were included in analyses of effect size for cognitive, functional, and behavioral outcomes. Effect size was determined by calculating Cohen's d statistic (or Cramér's V method for nominal data), number needed to treat and number needed to harm. Statistical calculations were performed for the intent-to-treat populations. In patients with mild AD, effect sizes were 0.21 (95% confidence intervals: -0.06, 0.49) for the primary outcome in Souvenir II (neuropsychological test battery memory z-score) and 0.20 (0.10, 0.34) for the co-primary outcome of Souvenir I (Wechsler memory scale delayed recall). No effect was shown on cognition in patients with mild-to-moderate AD (S-Connect). The number needed to treat (6 and 21 for Souvenir I and II, respectively) and high number needed to harm values indicate a favorable harm:benefit ratio for Souvenaid versus control in patients with mild AD. The favorable safety profile and impact on outcome measures converge to corroborate the putative mode of action and demonstrate that Souvenaid can achieve clinically detectable effects in patients with early AD.
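Cohen's d and the number needed to treat, the two effect-size metrics central to these analyses, are simple to compute from summary data. A sketch with invented numbers (not the Souvenir trial statistics):

```python
import math

def cohens_d(mean1, mean2, sd1, sd2, n1, n2):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                          / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

def nnt(p_treat, p_control):
    """Number needed to treat = 1 / absolute risk reduction (responder rates)."""
    return 1 / (p_treat - p_control)

# invented summary statistics for a memory z-score and responder rates
d = cohens_d(mean1=0.6, mean2=0.4, sd1=1.0, sd2=1.0, n1=100, n2=100)
print(f"Cohen's d = {d:.2f}, NNT = {nnt(0.40, 0.20):.0f}")
```

Here d = 0.20, a small effect by Cohen's conventional benchmarks, and the NNT of 5 means one additional responder per five patients treated; the abstract's NNT values of 6 and 21 can be read on the same scale.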
NASA Astrophysics Data System (ADS)
Kubala-Kukuś, A.; Banaś, D.; Braziewicz, J.; Góźdź, S.; Majewska, U.; Pajek, M.
2007-07-01
The total reflection X-ray fluorescence method was applied to study trace element concentrations in human breast malignant and breast benign neoplasm tissues taken from women who were patients of the Holycross Cancer Centre in Kielce (Poland). These investigations were mainly focused on the development of new possibilities for cancer diagnosis and therapy monitoring. This systematic comparative study was based on a relatively large studied population (~100 samples): 26 samples of breast malignant and 68 samples of breast benign neoplasm tissue. The concentrations, ranging from a few ppb to 0.1%, were determined for thirteen elements (from P to Pb). The results were carefully analysed to investigate the concentration distributions of trace elements in the studied samples. The measurement of trace element concentrations by total reflection X-ray fluorescence was limited, however, by the detection limit of the method: for more than 50% of the elements determined, concentrations could not be measured in all samples. These incomplete measurements were treated within the statistical concept of left-random censoring, and the Kaplan-Meier estimator was used to estimate the mean and median of the censored concentration distributions. For comparison of concentrations in two populations, the log-rank test was applied, which allows comparison of censored total reflection X-ray fluorescence data. The statistically significant differences found are discussed in more detail. It is noted that the described data analysis procedures should become a standard tool for analysing censored concentrations of trace elements measured by X-ray fluorescence methods.
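The Kaplan-Meier estimator applied to the censored concentration data works like its survival-analysis original: at each observed value the estimate steps down by the fraction of at-risk samples "failing" there, and censored samples simply leave the risk set. A minimal right-censoring sketch with invented data (the paper's left-censored detection-limit case is handled by reversing or transforming the axis):

```python
def kaplan_meier(times, observed):
    """Product-limit estimate. times: observation values; observed: True for a
    measured value (event), False for a censored one. Returns [(t, S(t))] at
    each event time."""
    pairs = sorted(zip(times, observed))
    n_at_risk = len(pairs)
    s = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = sum(1 for tt, ev in pairs[i:] if tt == t and ev)
        at_this_t = sum(1 for tt, _ in pairs[i:] if tt == t)
        if deaths:
            s *= 1 - deaths / n_at_risk   # step down at each event time
            curve.append((t, s))
        n_at_risk -= at_this_t            # events and censorings leave the risk set
        i += at_this_t
    return curve

# invented values; False marks a censored (e.g., below-detection-limit) sample
times    = [1.0, 2.0, 2.0, 3.0, 4.0, 5.0]
observed = [True, True, False, True, False, True]
print(kaplan_meier(times, observed))
```

Unlike simply discarding censored samples or substituting the detection limit, this estimator uses the partial information that a censored value lies beyond the limit, which is why the paper recommends it for incomplete concentration data.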
gHRV: Heart rate variability analysis made easy.
Rodríguez-Liñares, L; Lado, M J; Vila, X A; Méndez, A J; Cuesta, P
2014-08-01
In this paper, the gHRV software tool is presented. It is a simple, free, and portable tool developed in Python for analysing heart rate variability. It includes a graphical user interface and can import files in multiple formats, analyse time intervals in the signal, test statistical significance, and export the results. This paper also contains, as an example of use, a clinical analysis performed with the gHRV tool, namely to determine whether heart rate variability indexes change across different stages of sleep. Results from tests completed by researchers who have tried gHRV are also presented: in general, the application was positively valued, and the results reflect a high level of satisfaction. gHRV is in continuous development, and new versions will include suggestions made by testers. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
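Time-domain HRV indexes of the kind gHRV reports can be computed directly from an RR-interval series. A stdlib-only sketch of two standard indexes, SDNN and RMSSD (this illustrates the general definitions, not gHRV's actual API, and the RR values are invented):

```python
import math
from statistics import stdev

def sdnn(rr_ms):
    """SDNN: standard deviation of the RR (NN) intervals, in ms."""
    return stdev(rr_ms)

def rmssd(rr_ms):
    """RMSSD: root mean square of successive RR-interval differences, in ms."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 805, 798, 840, 823, 817, 809, 831]   # invented RR intervals (ms)
print(f"SDNN={sdnn(rr):.1f} ms, RMSSD={rmssd(rr):.1f} ms")
```

SDNN reflects overall variability while RMSSD emphasizes beat-to-beat (largely parasympathetic) variation, which is why sleep-stage comparisons typically report both.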
A construct-driven investigation of gender differences in a leadership-role assessment center.
Anderson, Neil; Lievens, Filip; van Dam, Karen; Born, Marise
2006-05-01
This study examined gender differences in a large-scale assessment center for officer entry in the British Army. Subgroup differences were investigated for a sample of 1,857 candidates: 1,594 men and 263 women. A construct-driven approach was chosen (a) by examining gender differences at the construct level, (b) by formulating a priori hypotheses about which constructs would be susceptible to gender effects, and (c) by using both effect size statistics and latent mean analyses to investigate gender differences in assessment center ratings. Results showed that female candidates were rated notably higher on constructs reflecting an interpersonally oriented leadership style (i.e., oral communication and interaction) and on drive and determination. These results are discussed in light of role congruity theory and of the advantages of using latent mean analyses.
NASA Astrophysics Data System (ADS)
Arifin, Mukh; Ni'matullah Al-Baarri, Ahmad; Etza Setiani, Bhakti; Fazriyati Siregar, Risa
2018-02-01
This study analysed the texture profile and colour performance of local and imported meat in Semarang, Indonesia. The two types of available meat were compared for hardness, cohesiveness, springiness, and adhesiveness, and for colour (L*a*b*) performance. Five fresh beef round cuts each from local and imported sources were used in the experiments. Data were analysed statistically using a t-test. The results showed that local beef exhibited remarkably higher springiness than imported beef. The colour analysis showed that imported beef had a remarkably higher L* value than local beef, a significant difference between the two types of beef. In conclusion, these values highlight notable differences between local and imported meat and may inform user preferences for further application in meat processing.
Small, J R
1993-01-01
This paper is a study of the effects of experimental error on the estimated values of flux control coefficients obtained using specific inhibitors. Two possible techniques for analysing the experimental data are compared: a simple extrapolation method (the so-called graph method) and a non-linear function-fitting method. For these techniques, the sources of systematic error are identified, and the effects of systematic and random errors are quantified using both statistical analysis and numerical computation. It is shown that the graph method is very sensitive to random errors and that, under all conditions studied, the fitting method outperformed the graph method, even under conditions where the assumptions underlying the fitted function did not hold. Possible ways of designing experiments to minimize the effects of experimental errors are analysed and discussed. PMID:8257434
2014-01-01
Objective To offer a practical demonstration of receiver operating characteristic (ROC) analyses, diagnostic efficiency statistics, and their application to clinical decision making using a popular parent checklist to assess for potential mood disorder. Method Secondary analyses of data from 589 families seeking outpatient mental health services, completing the Child Behavior Checklist and semi-structured diagnostic interviews. Results Internalizing Problems raw scores discriminated mood disorders significantly better than did age- and gender-normed T scores, or an Affective Problems score. Internalizing scores <8 had a diagnostic likelihood ratio <0.3, and scores >30 had a diagnostic likelihood ratio of 7.4. Conclusions This study illustrates a series of steps in defining a clinical problem, operationalizing it, selecting a valid study design, and using ROC analyses to generate statistics that support clinical decisions. The ROC framework offers important advantages for clinical interpretation. Appendices include sample scripts using SPSS and R to check assumptions and conduct ROC analyses. PMID:23965298
Youngstrom, Eric A
2014-03-01
To offer a practical demonstration of receiver operating characteristic (ROC) analyses, diagnostic efficiency statistics, and their application to clinical decision making using a popular parent checklist to assess for potential mood disorder. Secondary analyses of data from 589 families seeking outpatient mental health services, completing the Child Behavior Checklist and semi-structured diagnostic interviews. Internalizing Problems raw scores discriminated mood disorders significantly better than did age- and gender-normed T scores, or an Affective Problems score. Internalizing scores <8 had a diagnostic likelihood ratio <0.3, and scores >30 had a diagnostic likelihood ratio of 7.4. This study illustrates a series of steps in defining a clinical problem, operationalizing it, selecting a valid study design, and using ROC analyses to generate statistics that support clinical decisions. The ROC framework offers important advantages for clinical interpretation. Appendices include sample scripts using SPSS and R to check assumptions and conduct ROC analyses.
Winer, E Samuel; Cervone, Daniel; Bryant, Jessica; McKinney, Cliff; Liu, Richard T; Nadorff, Michael R
2016-09-01
A popular way to attempt to discern causality in clinical psychology is through mediation analysis. However, mediation analysis is sometimes applied to research questions in clinical psychology when inferring causality is impossible. This practice may soon increase with new, readily available, and easy-to-use statistical advances. Thus, we here provide a heuristic to remind clinical psychological scientists of the assumptions of mediation analyses. We describe recent statistical advances and unpack assumptions of causality in mediation, underscoring the importance of time in understanding mediational hypotheses and analyses in clinical psychology. Example analyses demonstrate that statistical mediation can occur despite theoretical mediation being improbable. We propose a delineation of mediational effects derived from cross-sectional designs into the terms temporal and atemporal associations to emphasize time in conceptualizing process models in clinical psychology. The general implications for mediational hypotheses and the temporal frameworks from within which they may be drawn are discussed. © 2016 Wiley Periodicals, Inc.
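The product-of-coefficients logic behind statistical mediation can be made concrete with a small sketch (illustrative data only, assuming ordinary least squares for each path); note that the decomposition holds numerically for any cross-sectional data, which is exactly why statistical mediation can occur even when theoretical mediation is improbable:

```python
def _center(v):
    mu = sum(v) / len(v)
    return [x - mu for x in v]

def mediation(x, m, y):
    """Product-of-coefficients mediation: a (X->M), b (M->Y adjusting for X),
    direct effect c' (X->Y adjusting for M), indirect effect a*b, and the
    total effect c, which equals c' + a*b exactly for OLS fits."""
    xc, mc, yc = _center(x), _center(m), _center(y)
    sxx = sum(v * v for v in xc)
    smm = sum(v * v for v in mc)
    smx = sum(u * v for u, v in zip(mc, xc))
    sxy = sum(u * v for u, v in zip(xc, yc))
    smy = sum(u * v for u, v in zip(mc, yc))
    a = smx / sxx
    det = smm * sxx - smx * smx
    b = (smy * sxx - sxy * smx) / det
    direct = (sxy * smm - smy * smx) / det
    return {"a": a, "b": b, "direct": direct,
            "indirect": a * b, "total": sxy / sxx}

# Illustrative cross-sectional data: the arithmetic says nothing about time order
x = list(range(10))
m = [2 * xi + (-1) ** i for i, xi in enumerate(x)]
y = [3 * mi + xi for mi, xi in zip(m, x)]
res = mediation(x, m, y)
```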
This tool allows users to animate cancer trends over time by cancer site and cause of death, race, and sex. Provides access to incidence, mortality, and survival. Select the type of statistic, variables, format, and then extract the statistics in a delimited format for further analyses.
Pounds, Stan; Cheng, Cheng; Cao, Xueyuan; Crews, Kristine R.; Plunkett, William; Gandhi, Varsha; Rubnitz, Jeffrey; Ribeiro, Raul C.; Downing, James R.; Lamba, Jatinder
2009-01-01
Motivation: In some applications, prior biological knowledge can be used to define a specific pattern of association of multiple endpoint variables with a genomic variable that is biologically most interesting. However, to our knowledge, there is no statistical procedure designed to detect specific patterns of association with multiple endpoint variables. Results: Projection onto the most interesting statistical evidence (PROMISE) is proposed as a general procedure to identify genomic variables that exhibit a specific biologically interesting pattern of association with multiple endpoint variables. Biological knowledge of the endpoint variables is used to define a vector that represents the biologically most interesting values for statistics that characterize the associations of the endpoint variables with a genomic variable. A test statistic is defined as the dot-product of the vector of the observed association statistics and the vector of the most interesting values of the association statistics. By definition, this test statistic is proportional to the length of the projection of the observed vector of correlations onto the vector of most interesting associations. Statistical significance is determined via permutation. In simulation studies and an example application, PROMISE shows greater statistical power to identify genes with the interesting pattern of associations than classical multivariate procedures, individual endpoint analyses or listing genes that have the pattern of interest and are significant in more than one individual endpoint analysis. Availability: Documented R routines are freely available from www.stjuderesearch.org/depts/biostats and will soon be available as a Bioconductor package from www.bioconductor.org. Contact: stanley.pounds@stjude.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19528086
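A stripped-down sketch of the projection idea follows; it is illustrative only (the documented R routines cited above are the authoritative implementation), using Pearson correlations as the association statistics and a toy pattern vector:

```python
import random
from statistics import mean

def pearson(x, y):
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def promise(genomic, endpoints, pattern, n_perm=200, seed=1):
    """Dot-product of observed association statistics with the biologically
    'most interesting' pattern vector; significance by permutation."""
    obs = sum(pearson(genomic, e) * p for e, p in zip(endpoints, pattern))
    rng = random.Random(seed)
    g = list(genomic)
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(g)
        if sum(pearson(g, e) * p for e, p in zip(endpoints, pattern)) >= obs:
            exceed += 1
    return obs, (exceed + 1) / (n_perm + 1)

# Toy example: expression should correlate positively with endpoint 1
# and negatively with endpoint 2, so the pattern vector is (+1, -1).
expression = list(range(20))
endpoint1 = [2 * v + 1 for v in expression]
endpoint2 = [-3 * v for v in expression]
obs_stat, p_value = promise(expression, [endpoint1, endpoint2], (1, -1))
```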
SOCR Analyses – an Instructional Java Web-based Statistical Analysis Toolkit
Chu, Annie; Cui, Jenny; Dinov, Ivo D.
2011-01-01
The Statistical Online Computational Resource (SOCR) designs web-based tools for educational use in a variety of undergraduate courses (Dinov 2006). Several studies have demonstrated that these resources significantly improve students' motivation and learning experiences (Dinov et al. 2008). SOCR Analyses is a new component that concentrates on data modeling and analysis using parametric and non-parametric techniques supported with graphical model diagnostics. Currently implemented analyses include models commonly used in undergraduate statistics courses, such as linear models (Simple Linear Regression, Multiple Linear Regression, One-Way and Two-Way ANOVA). In addition, we implemented tests for sample comparisons: the t-test in the parametric category, and the Wilcoxon rank-sum test, Kruskal-Wallis test, and Friedman's test in the non-parametric category. SOCR Analyses also includes several hypothesis-test models, such as contingency tables and Fisher's exact test. The code itself is open source (http://socr.googlecode.com/), in the hope of contributing to the efforts of the statistical computing community. The code includes functionality for each specific analysis model, as well as general utilities that can be applied in various statistical computing tasks; for example, concrete methods with an API (Application Programming Interface) have been implemented for statistical summaries, least-squares solutions of general linear models, rank calculations, etc. HTML interfaces, tutorials, source code, activities, and data are freely available via the web (www.SOCR.ucla.edu). Code examples for developers and demos for educators are provided on the SOCR Wiki website. In this article, the pedagogical utilization of SOCR Analyses is discussed, as well as the underlying design framework. As the SOCR project is ongoing and more functions and tools are being added, these resources are constantly improved.
The reader is strongly encouraged to check the SOCR site for the most up-to-date information and newly added models. PMID:21546994
Browne, Richard W; Whitcomb, Brian W
2010-07-01
Problems in the analysis of laboratory data commonly arise in epidemiologic studies in which biomarkers subject to lower detection thresholds are used. Various thresholds exist including limit of detection (LOD), limit of quantification (LOQ), and limit of blank (LOB). Choosing appropriate strategies for dealing with data affected by such limits relies on proper understanding of the nature of the detection limit and its determination. In this paper, we demonstrate experimental and statistical procedures generally used for estimating different detection limits according to standard procedures in the context of analysis of fat-soluble vitamins and micronutrients in human serum. Fat-soluble vitamins and micronutrients were analyzed by high-performance liquid chromatography with diode array detection. A simulated serum matrix blank was repeatedly analyzed for determination of LOB parametrically by using the observed blank distribution as well as nonparametrically by using ranks. The LOD was determined by combining information regarding the LOB with data from repeated analysis of standard reference materials (SRMs), diluted to low levels; from LOB to 2-3 times LOB. The LOQ was determined experimentally by plotting the observed relative standard deviation (RSD) of SRM replicates compared with the concentration, where the LOQ is the concentration at an RSD of 20%. Experimental approaches and example statistical procedures are given for determination of LOB, LOD, and LOQ. These quantities are reported for each measured analyte. For many analyses, there is considerable information available below the LOQ. Epidemiologic studies must understand the nature of these detection limits and how they have been estimated for appropriate treatment of affected data.
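The parametric calculations the paper describes can be sketched as follows. The replicate values and the RSD profile below are hypothetical, and the 1.645 multiplier assumes normally distributed measurements (the standard 5% error rates of CLSI-style protocols):

```python
from statistics import mean, stdev

def limit_of_blank(blank_reps):
    """Parametric LOB: upper 95th percentile of the blank distribution,
    assuming normality (mean + 1.645 * SD)."""
    return mean(blank_reps) + 1.645 * stdev(blank_reps)

def limit_of_detection(lob, low_reps):
    """LOD: LOB plus 1.645 * SD of replicates of a low-concentration sample."""
    return lob + 1.645 * stdev(low_reps)

def limit_of_quantification(concs, rsds, target_rsd=20.0):
    """LOQ: concentration at which the observed relative standard deviation
    falls to the target, by linear interpolation of the RSD-vs-concentration
    profile."""
    pairs = sorted(zip(concs, rsds))
    for (c1, r1), (c2, r2) in zip(pairs, pairs[1:]):
        if r1 >= target_rsd >= r2:
            return c1 + (r1 - target_rsd) / (r1 - r2) * (c2 - c1)
    return None

# Hypothetical replicates (arbitrary concentration units)
blank = [0.10, 0.12, 0.09, 0.11, 0.08]
low_srm = [0.30, 0.35, 0.28, 0.33, 0.29]
lob = limit_of_blank(blank)
lod = limit_of_detection(lob, low_srm)
loq = limit_of_quantification([0.1, 0.2, 0.4, 0.8], [35.0, 25.0, 15.0, 8.0])
```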
Dissecting the genetics of complex traits using summary association statistics.
Pasaniuc, Bogdan; Price, Alkes L
2017-02-01
During the past decade, genome-wide association studies (GWAS) have been used to successfully identify tens of thousands of genetic variants associated with complex traits and diseases. These studies have produced extensive repositories of genetic variation and trait measurements across large numbers of individuals, providing tremendous opportunities for further analyses. However, privacy concerns and other logistical considerations often limit access to individual-level genetic data, motivating the development of methods that analyse summary association statistics. Here, we review recent progress on statistical methods that leverage summary association data to gain insights into the genetic basis of complex traits and diseases.
Statistical innovations in diagnostic device evaluation.
Yu, Tinghui; Li, Qin; Gray, Gerry; Yue, Lilly Q
2016-01-01
Due to rapid technological development, innovations in diagnostic devices are proceeding at an extremely fast pace. Accordingly, the needs for adopting innovative statistical methods have emerged in the evaluation of diagnostic devices. Statisticians in the Center for Devices and Radiological Health at the Food and Drug Administration have provided leadership in implementing statistical innovations. The innovations discussed in this article include: the adoption of bootstrap and Jackknife methods, the implementation of appropriate multiple reader multiple case study design, the application of robustness analyses for missing data, and the development of study designs and data analyses for companion diagnostics.
Using basic statistics on the individual patient's own numeric data.
Hart, John
2012-12-01
This theoretical report gives an example of how the coefficient of variation (CV) and quartile analysis (QA) for outlier assessment might be used to analyze numeric data for an individual patient in practice. A patient was examined over 8 visits using infrared instrumentation to measure mastoid fossa temperature differential (MFTD) readings. The CV and QA were applied to the readings. The participant also completed the Short Form-12 health perception survey at each visit, and these findings were correlated with the CV to determine whether the CV had outcomes support (clinical significance). According to QA, an outlier MFTD reading was observed on the eighth visit, coinciding with the largest CV value for the MFTDs. Correlations between the Short Form-12 and CV were low to negligible, positive, and statistically nonsignificant. This case provides an example of how basic statistical analyses could be applied to numerical data in chiropractic practice for an individual patient, potentially adding objectivity to the analysis of an individual patient's data, particularly when the clinical significance of a numerical finding is unknown.
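Both statistics are simple to compute. A minimal sketch, with hypothetical MFTD readings rather than the patient's actual values, and assuming Tukey's 1.5 × IQR fences for the quartile analysis:

```python
from statistics import mean, stdev, quantiles

def coefficient_of_variation(values):
    """CV as a percentage: relative dispersion of the readings."""
    return stdev(values) / mean(values) * 100

def quartile_outliers(values):
    """Tukey-style quartile analysis: flag values beyond 1.5 * IQR."""
    q1, _, q3 = quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if v < lo or v > hi]

# Hypothetical MFTD readings (degrees C) across 8 visits
mftd = [0.3, 0.2, 0.4, 0.3, 0.25, 0.35, 0.3, 1.2]
cv = coefficient_of_variation(mftd)
outliers = quartile_outliers(mftd)  # the eighth visit stands out
```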
CAVALCANTI, Andrea Nóbrega; MARCHI, Giselle Maria; AMBROSANO, Gláucia Maria Bovi
2010-01-01
Statistical analysis interpretation is a critical field in scientific research. When more than one main variable is being studied, the effect of the interaction between those variables is fundamental to the discussion of experiments. However, doubts can arise when the p-value of the interaction is greater than the significance level. Objective: To determine the most adequate interpretation for factorial experiments with interaction p-values slightly higher than the significance level. Materials and methods: The p-values of the interactions found in two restorative dentistry experiments (0.053 and 0.068) were interpreted in two distinct ways: treating the interaction as not significant and as significant. Results: Different findings were observed between the two analyses, and the studies' results became more coherent when the interaction was treated as significant. Conclusion: The p-value of the interaction between main variables must be analyzed with caution because it can change the outcomes of research studies. Researchers are strongly advised to interpret the results of their statistical analyses carefully in order to discuss the findings of their experiments properly. PMID:20857003
2017-01-01
Purpose This study is aimed at identifying the relationships between medical school students’ academic burnout, empathy, and calling, and determining whether their calling has a mediating effect on the relationship between academic burnout and empathy. Methods A mixed method study was conducted. One hundred twenty-seven medical students completed a survey. Scales measuring academic burnout, medical students’ empathy, and calling were utilized. For statistical analysis, correlation analysis, descriptive statistics analysis, and hierarchical multiple regression analyses were conducted. For qualitative approach, eight medical students participated in a focus group interview. Results The study found that empathy has a statistically significant, negative correlation with academic burnout, while having a significant, positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. Conclusion This result demonstrates that calling is a key variable that mediates the relationship between medical students’ academic burnout and empathy. As such, this study provides baseline data for an education that could improve medical students’ empathy skills. PMID:28870019
Hohn, M. Ed; Nuhfer, E.B.; Vinopal, R.J.; Klanderman, D.S.
1980-01-01
Classifying very fine-grained rocks through fabric elements provides information about depositional environments, but is subject to the biases of visual taxonomy. To evaluate the statistical significance of an empirical classification of very fine-grained rocks, samples from Devonian shales in four cored wells in West Virginia and Virginia were measured for 15 variables: quartz, illite, pyrite and expandable clays determined by X-ray diffraction; total sulfur, organic content, inorganic carbon, matrix density, bulk density, porosity, and silt; as well as density, sonic travel time, resistivity, and γ-ray response measured from well logs. The four lithologic types comprised: (1) sharply banded shale, (2) thinly laminated shale, (3) lenticularly laminated shale, and (4) nonbanded shale. Univariate and multivariate analyses of variance showed that the lithologic classification reflects significant differences in the variables measured, differences that can be detected independently of stratigraphic effects. Little-known statistical methods found useful in this work included: the multivariate analysis of variance with more than one effect, simultaneous plotting of samples and variables on canonical variates, and the use of parametric ANOVA and MANOVA on ranked data. © 1980 Plenum Publishing Corporation.
Defining window-boundaries for genomic analyses using smoothing spline techniques
Beissinger, Timothy M.; Rosa, Guilherme J.M.; Kaeppler, Shawn M.; ...
2015-04-17
High-density genomic data is often analyzed by combining information over windows of adjacent markers. Interpretation of data grouped in windows versus at individual locations may increase statistical power, simplify computation, reduce sampling noise, and reduce the total number of tests performed. However, use of adjacent marker information can result in over- or under-smoothing, undesirable window boundary specifications, or highly correlated test statistics. We introduce a method for defining windows based on statistically guided breakpoints in the data, as a foundation for the analysis of multiple adjacent data points. This method involves first fitting a cubic smoothing spline to the data and then identifying the inflection points of the fitted spline, which serve as the boundaries of adjacent windows. This technique does not require prior knowledge of linkage disequilibrium, and therefore can be applied to data collected from individual or pooled sequencing experiments. Moreover, in contrast to existing methods, an arbitrary choice of window size is not necessary, since these are determined empirically and allowed to vary along the genome.
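The authors fit a cubic smoothing spline; the sketch below substitutes a moving-average smoother (an assumption made to stay dependency-free) but keeps the key idea of taking sign changes of the second difference, i.e. inflection points, as window boundaries:

```python
import math
from statistics import mean

def window_boundaries(values, bandwidth=5):
    """Smooth a per-marker statistic, then return positions where the
    discrete second difference of the smoothed series changes sign
    (approximate inflection points)."""
    n = len(values)
    smooth = [mean(values[max(0, i - bandwidth):i + bandwidth + 1]) for i in range(n)]
    second = [smooth[i - 1] - 2 * smooth[i] + smooth[i + 1] for i in range(1, n - 1)]
    return [i for i in range(1, len(second)) if second[i - 1] * second[i] < 0]

# Synthetic per-marker statistic: inflection points of sin(x/5)
# sit near x = 15.7, 31.4, and 47.1.
signal = [math.sin(x / 5) for x in range(60)]
bounds = window_boundaries(signal)
```

Because the boundaries come from the data rather than a fixed window width, the resulting windows vary in size along the sequence, which is the property the abstract emphasizes.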
The problem of pseudoreplication in neuroscientific studies: is it affecting your analysis?
2010-01-01
Background Pseudoreplication occurs when observations are not statistically independent but are treated as if they are. This can occur when there are multiple observations on the same subjects, when samples are nested or hierarchically organised, or when measurements are correlated in time or space. Analysis of such data without taking these dependencies into account can lead to meaningless results, and examples can easily be found in the neuroscience literature. Results A single issue of Nature Neuroscience provided a number of examples and is used as a case study to highlight how pseudoreplication arises in neuroscientific studies and why the analyses in these papers are incorrect; appropriate analytical methods are provided. 12% of papers had pseudoreplication, and a further 36% were suspected of having it, but it was not possible to determine for certain because insufficient information was provided. Conclusions Pseudoreplication can undermine the conclusions of a statistical analysis, and it would be easier to detect if the sample size, degrees of freedom, test statistic, and precise p-values were reported. This information should be a requirement for all publications. PMID:20074371
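The core problem, and the standard first-line fix of collapsing to one value per independent unit, can be illustrated with a sketch. The measurements are fabricated, and a mixed-effects model would be the fuller remedy:

```python
from statistics import mean, stdev

def welch_t(x, y):
    """Welch's two-sample t statistic."""
    nx, ny = len(x), len(y)
    vx, vy = stdev(x) ** 2, stdev(y) ** 2
    return (mean(x) - mean(y)) / (vx / nx + vy / ny) ** 0.5

# Hypothetical data: 3 subjects per group, 4 repeated measurements each
group_a = [[5.1, 5.3, 5.2, 5.4], [4.8, 4.9, 5.0, 4.7], [5.5, 5.6, 5.4, 5.7]]
group_b = [[4.2, 4.3, 4.1, 4.4], [4.6, 4.5, 4.7, 4.6], [4.0, 4.1, 3.9, 4.2]]

# Pseudoreplicated analysis: treats all 12 measurements per group as
# independent, inflating the effective sample size and the t statistic.
t_pooled = welch_t([m for s in group_a for m in s],
                   [m for s in group_b for m in s])

# Correct unit of analysis: one mean per subject (n = 3 per group).
t_subject = welch_t([mean(s) for s in group_a],
                    [mean(s) for s in group_b])
```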
NASA DOE POD NDE Capabilities Data Book
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC, Nondestructive Evaluation (NDE) Capabilities Data Book, 3rd ed., NTIAC DB-97-02. DOEPOD is designed as a decision support system to validate inspection system, personnel, and protocol demonstrating 0.90 POD with 95% confidence at critical flaw sizes, a90/95. The test methodology used in DOEPOD is based on the field of statistical sequential analysis founded by Abraham Wald. Sequential analysis is a method of statistical inference whose characteristic feature is that the number of observations required by the procedure is not determined in advance of the experiment. The decision to terminate the experiment depends, at each stage, on the results of the observations previously made. A merit of the sequential method, as applied to testing statistical hypotheses, is that test procedures can be constructed which require, on average, a substantially smaller number of observations than equally reliable test procedures based on a predetermined number of observations.
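Wald's sequential probability ratio test, the foundation cited above, can be sketched for a binomial detection probability. The values of p0, p1, and the error rates below are illustrative choices, not DOEPOD's actual parameters:

```python
import math

def sprt(outcomes, p0=0.90, p1=0.70, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test on hit/miss inspection
    outcomes: H0: POD = p0 versus H1: POD = p1 (with p1 < p0). Returns the
    decision and the number of observations consumed; unlike a fixed-sample
    test, that number is not determined in advance."""
    upper = math.log((1 - beta) / alpha)  # cross above: decide for H1
    lower = math.log(beta / (1 - alpha))  # cross below: decide for H0
    llr = 0.0
    for n, hit in enumerate(outcomes, start=1):
        llr += math.log(p1 / p0) if hit else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return ("reject p0", n)
        if llr <= lower:
            return ("accept p0", n)
    return ("continue", len(outcomes))
```

A run of consecutive detections drives the log-likelihood ratio down toward the H0 boundary, while misses drive it up toward H1, so clear-cut inspection systems terminate after far fewer trials than a fixed-sample design would require.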
Huvane, Jacqueline; Komarow, Lauren; Hill, Carol; Tran, Thuy Tien T.; Pereira, Carol; Rosenkranz, Susan L.; Finnemeyer, Matt; Earley, Michelle; Jiang, Hongyu (Jeanne); Wang, Rui; Lok, Judith
2017-01-01
Abstract The Statistical and Data Management Center (SDMC) provides the Antibacterial Resistance Leadership Group (ARLG) with statistical and data management expertise to advance the ARLG research agenda. The SDMC is active at all stages of a study, including design; data collection and monitoring; data analyses and archival; and publication of study results. The SDMC enhances the scientific integrity of ARLG studies through the development and implementation of innovative and practical statistical methodologies and by educating research colleagues regarding the application of clinical trial fundamentals. This article summarizes the challenges and roles, as well as the innovative contributions in the design, monitoring, and analyses of clinical trials and diagnostic studies, of the ARLG SDMC. PMID:28350899
ISSUES IN THE STATISTICAL ANALYSIS OF SMALL-AREA HEALTH DATA. (R825173)
The availability of geographically indexed health and population data, together with advances in computing, geographical information systems, and statistical methodology, has opened the way for serious exploration of small-area health statistics based on routine data. Such analyses may be...
Barnes, Stephen; Benton, H. Paul; Casazza, Krista; Cooper, Sara; Cui, Xiangqin; Du, Xiuxia; Engler, Jeffrey; Kabarowski, Janusz H.; Li, Shuzhao; Pathmasiri, Wimal; Prasain, Jeevan K.; Renfrow, Matthew B.; Tiwari, Hemant K.
2017-01-01
Metabolomics, a systems biology discipline representing analysis of known and unknown pathways of metabolism, has grown tremendously over the past 20 years. Because of its comprehensive nature, metabolomics requires careful consideration of the question(s) being asked, the scale needed to answer the question(s), collection and storage of the sample specimens, methods for extraction of the metabolites from biological matrices, the analytical method(s) to be employed and the quality control of the analyses, how collected data are correlated, the statistical methods to determine metabolites undergoing significant change, putative identification of metabolites, and the use of stable isotopes to aid in verifying metabolite identity and establishing pathway connections and fluxes. This second part of a comprehensive description of the methods of metabolomics focuses on data analysis, emerging methods in metabolomics and the future of this discipline. PMID:28239968
Qu, Shu-Gen; Gao, Jin; Tang, Bo; Yu, Bo; Shen, Yue-Ping; Tu, Yu
2018-01-01
Low-dose ionizing radiation (LDIR) may increase the mortality of solid cancers in nuclear industry workers, but only a few individual cohort studies exist, and the available reports have low statistical power. The aim of the present study was to assess solid cancer mortality risk from LDIR in the nuclear industry using standardized mortality ratios (SMRs) and 95% confidence intervals. A systematic literature search of the PubMed and Embase databases identified 27 studies relevant to this meta-analysis. Statistical significance was found for total, solid, and lung cancers, with meta-SMR values of 0.88, 0.80, and 0.89, respectively. There was evidence of stochastic effects of ionizing radiation, but more definitive conclusions require additional analyses using standardized protocols to determine whether LDIR increases the risk of solid cancer-related mortality. PMID:29725540
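One common way to pool SMRs, sketched below with made-up cohort results (the abstract does not specify the pooling model, so this fixed-effect inverse-variance approach on the log scale is an assumption), weights each study by its observed death count, since var(log SMR) ≈ 1/O under a Poisson model for O observed deaths:

```python
import math

def pooled_smr(smrs, observed_deaths):
    """Fixed-effect pooling of standardized mortality ratios on the log
    scale, weighting each study by its observed death count (≈ 1/variance
    of log SMR), with a 95% confidence interval."""
    weights = list(observed_deaths)
    pooled_log = sum(w * math.log(s) for w, s in zip(weights, smrs)) / sum(weights)
    se = 1 / math.sqrt(sum(weights))
    ci = (math.exp(pooled_log - 1.96 * se), math.exp(pooled_log + 1.96 * se))
    return math.exp(pooled_log), ci

# Hypothetical cohort SMRs with their observed death counts
smr, (lo, hi) = pooled_smr([0.80, 0.90, 1.00], [100, 200, 50])
```

A meta-SMR whose confidence interval excludes 1.0, as in this toy example, is what the abstract means by statistical significance for a cancer grouping.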
The Influence of Cognitive Reserve on Recovery from Traumatic Brain Injury.
Donders, Jacobus; Stout, Jacob
2018-04-12
We sought to determine the degree to which cognitive reserve, as assessed by the Test of Premorbid Functioning in combination with demographic variables, could act as a buffer against the effect of traumatic brain injury (TBI) on cognitive test performance. We conducted a retrospective analysis of a cohort of 121 persons with TBI who completed the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV) within 1-12 months after injury. Regression analyses indicated that cognitive reserve was a statistically significant predictor of all postinjury WAIS-IV factor index scores, after controlling for various premorbid and comorbid confounding variables. Only for Processing Speed did injury severity make an additional statistically significant contribution to the prediction model. Cognitive reserve thus has a protective effect with regard to the impact of TBI on cognitive test performance, but this effect is imperfect and does not completely negate the effect of injury severity.
Nonindependence and sensitivity analyses in ecological and evolutionary meta-analyses.
Noble, Daniel W A; Lagisz, Malgorzata; O'dea, Rose E; Nakagawa, Shinichi
2017-05-01
Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to nonindependence. Nonindependence can affect two major interrelated components of a meta-analysis: (i) the calculation of effect size statistics and (ii) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to nonindependence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with nonindependent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g. inclusion of different quality data, choice of effect size) and statistical assumptions (e.g. assuming no phylogenetic covariance) using sensitivity analyses are extremely important in assessing the impact of nonindependence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g. impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of nonindependence. To encourage better practice for dealing with nonindependence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring nonindependence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with nonindependent study designs, and for analysing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of nonindependence in meta-analyses, leading to greater transparency and more robust conclusions. © 2017 John Wiley & Sons Ltd.
Dark Energy Survey Year 1 Results: Multi-Probe Methodology and Simulated Likelihood Analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krause, E.; et al.
We present the methodology for and detail the implementation of the Dark Energy Survey (DES) 3x2pt DES Year 1 (Y1) analysis, which combines configuration-space two-point statistics from three different cosmological probes: cosmic shear, galaxy-galaxy lensing, and galaxy clustering, using data from the first year of DES observations. We have developed two independent modeling pipelines and describe the code validation process. We derive expressions for analytical real-space multi-probe covariances, and describe their validation with numerical simulations. We stress-test the inference pipelines in simulated likelihood analyses that vary 6-7 cosmology parameters plus 20 nuisance parameters and precisely resemble the analysis to be presented in the DES 3x2pt analysis paper, using a variety of simulated input data vectors with varying assumptions. We find that any disagreement between pipelines leads to changes in assigned likelihood $\Delta \chi^2 \le 0.045$ with respect to the statistical error of the DES Y1 data vector. We also find that angular binning and survey mask do not impact our analytic covariance at a significant level. We determine lower bounds on scales used for analysis of galaxy clustering (8 Mpc $h^{-1}$) and galaxy-galaxy lensing (12 Mpc $h^{-1}$) such that the impact of modeling uncertainties in the non-linear regime is well below statistical errors, and show that our analysis choices are robust against a variety of systematics. These tests demonstrate that we have a robust analysis pipeline that yields unbiased cosmological parameter inferences for the flagship 3x2pt DES Y1 analysis. We emphasize that the level of independent code development and subsequent code comparison as demonstrated in this paper is necessary to produce credible constraints from increasingly complex multi-probe analyses of current data.
ViPAR: a software platform for the Virtual Pooling and Analysis of Research Data.
Carter, Kim W; Francis, Richard W; Carter, K W; Francis, R W; Bresnahan, M; Gissler, M; Grønborg, T K; Gross, R; Gunnes, N; Hammond, G; Hornig, M; Hultman, C M; Huttunen, J; Langridge, A; Leonard, H; Newman, S; Parner, E T; Petersson, G; Reichenberg, A; Sandin, S; Schendel, D E; Schalkwyk, L; Sourander, A; Steadman, C; Stoltenberg, C; Suominen, A; Surén, P; Susser, E; Sylvester Vethanayagam, A; Yusof, Z
2016-04-01
Research studies exploring the determinants of disease require sufficient statistical power to detect meaningful effects. Sample size is often increased through centralized pooling of disparately located datasets, though ethical, privacy and data ownership issues can often hamper this process. Methods that facilitate the sharing of research data that are sympathetic with these issues and which allow flexible and detailed statistical analyses are therefore in critical need. We have created a software platform for the Virtual Pooling and Analysis of Research data (ViPAR), which employs free and open source methods to provide researchers with a web-based platform to analyse datasets housed in disparate locations. Database federation permits controlled access to remotely located datasets from a central location. The Secure Shell protocol allows data to be securely exchanged between devices over an insecure network. ViPAR combines these free technologies into a solution that facilitates 'virtual pooling' where data can be temporarily pooled into computer memory and made available for analysis without the need for permanent central storage. Within the ViPAR infrastructure, remote sites manage their own harmonized research dataset in a database hosted at their site, while a central server hosts the data federation component and a secure analysis portal. When an analysis is initiated, requested data are retrieved from each remote site and virtually pooled at the central site. The data are then analysed by statistical software and, on completion, results of the analysis are returned to the user and the virtually pooled data are removed from memory. ViPAR is a secure, flexible and powerful analysis platform built on open source technology that is currently in use by large international consortia, and is made publicly available at [http://bioinformatics.childhealthresearch.org.au/software/vipar/]. © The Author 2015. 
Published by Oxford University Press on behalf of the International Epidemiological Association.
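The "virtual pooling" workflow described in the abstract can be sketched in a few lines. This is a minimal stand-in, not the ViPAR implementation: the mock site functions below replace ViPAR's database federation and SSH transport, and all records are invented.

```python
import statistics

def site_a_query():
    # mock remote site A: returns its harmonized rows on request
    return [{"age": 34, "outcome": 1}, {"age": 41, "outcome": 0}]

def site_b_query():
    # mock remote site B
    return [{"age": 29, "outcome": 0}, {"age": 52, "outcome": 1}]

def run_pooled_analysis(site_queries, analysis):
    # pool the remote rows in memory only, analyse, then discard the pool
    pooled = [row for query in site_queries for row in query()]
    try:
        return analysis(pooled)
    finally:
        del pooled  # nothing persists centrally after the analysis returns

mean_age = run_pooled_analysis(
    [site_a_query, site_b_query],
    lambda rows: statistics.fmean(r["age"] for r in rows),
)
print("pooled mean age:", mean_age)  # 39.0
```

The key design point mirrored here is that the central site only ever holds the pooled rows for the lifetime of one analysis call.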
DOE Office of Scientific and Technical Information (OSTI.GOV)
Metoyer, Candace N.; Walsh, Stephen J.; Tardiff, Mark F.
2008-10-30
The detection and identification of weak gaseous plumes using thermal imaging data is complicated by many factors. These include variability due to atmosphere, ground and plume temperature, and background clutter. This paper presents an analysis of one formulation of the physics-based model that describes the at-sensor observed radiance. The motivating question for the analyses performed in this paper is as follows. Given a set of backgrounds, is there a way to predict the background over which the probability of detecting a given chemical will be the highest? Two statistics were developed to address this question. These statistics incorporate data from the long-wave infrared band to predict the background over which chemical detectability will be the highest. These statistics can be computed prior to data collection. As a preliminary exploration into the predictive ability of these statistics, analyses were performed on synthetic hyperspectral images. Each image contained one chemical (either carbon tetrachloride or ammonia) spread across six distinct background types. The statistics were used to generate predictions for the background ranks. Then, the predicted ranks were compared to the empirical ranks obtained from the analyses of the synthetic images. For the simplified images under consideration, the predicted and empirical ranks showed a promising amount of agreement. One statistic accurately predicted the best and worst background for detection in all of the images. Future work may include explorations of more complicated plume ingredients, background types, and noise structures.
A systematic optimization for graphene-based supercapacitors
NASA Astrophysics Data System (ADS)
Deuk Lee, Sung; Lee, Han Sung; Kim, Jin Young; Jeong, Jaesik; Kahng, Yung Ho
2017-08-01
Increasing the energy-storage density of supercapacitors is critical for their applications. Many researchers have attempted to identify optimal candidate component materials to achieve this goal, but investigations into systematically optimizing the mixing ratio of those components to maximize the performance of each candidate material have been insufficient, which hinders technological progress. In this study, we employ a statistically systematic method to determine the optimum mixing ratio of the three components that constitute graphene-based supercapacitor electrodes: reduced graphene oxide (rGO), acetylene black (AB), and polyvinylidene fluoride (PVDF). By using the extreme-vertices design, the optimized proportion is determined to be (rGO: AB: PVDF = 0.95: 0.00: 0.05). The corresponding energy-storage density increases by a factor of 2 compared with that of non-optimized electrodes. Electrochemical and microscopic analyses are performed to determine the reason for the performance improvements.
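A brute-force grid search over constrained mixtures can illustrate the idea behind a mixture design like the extreme-vertices approach: candidate fractions must sum to one and respect component bounds. The response function and the binder bounds below are invented for illustration; the study's actual optimum came from designed experiments, not from a toy surface like this.

```python
def simulated_capacitance(rgo, ab, pvdf):
    # toy response surface (assumed): rewards rGO content, penalises excess binder
    return 200.0 * rgo + 40.0 * ab - 150.0 * (pvdf - 0.05) ** 2

best = None
grid = [round(i * 0.05, 2) for i in range(21)]  # candidate fractions 0.00..1.00
for rgo in grid:
    for ab in grid:
        pvdf = round(1.0 - rgo - ab, 2)         # components must sum to 1
        if not 0.05 <= pvdf <= 0.20:            # assumed binder (PVDF) bounds
            continue
        score = simulated_capacitance(rgo, ab, pvdf)
        if best is None or score > best[0]:
            best = (score, rgo, ab, pvdf)

print("best mix (rGO, AB, PVDF):", best[1:])
```

An extreme-vertices design would evaluate only the vertices and centroids of the constrained region rather than a full grid, which is what makes it practical when each "evaluation" is a physical electrode build.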
Application of a faith-based integration tool to assess mental and physical health interventions.
Saunders, Donna M; Leak, Jean; Carver, Monique E; Smith, Selina A
2017-01-01
To build on current research involving faith-based interventions (FBIs) for addressing mental and physical health, this study a) reviewed the extent to which relevant publications integrate faith concepts with health and b) initiated analysis of the degree of FBI integration with intervention outcomes. Derived from a systematic search of articles published between 2007 and 2017, 36 studies were assessed with a Faith-Based Integration Assessment Tool (FIAT) to quantify faith-health integration. Basic statistical procedures were employed to determine the association of faith-based integration with intervention outcomes. The assessed studies possessed (on average) moderate, inconsistent integration because of poor use of faith measures, and moderate, inconsistent use of faith practices. Analysis procedures for determining the effect of FBI integration on intervention outcomes were inadequate for formulating practical conclusions. Regardless of integration, interventions were associated with beneficial outcomes. To determine the link between FBI integration and intervention outcomes, additional analyses are needed.
NASA Astrophysics Data System (ADS)
Staroń, Waldemar; Herbowski, Leszek; Gurgul, Henryk
2007-04-01
The goal of this work was to determine the values of cumulative statistical parameters characterising cerebrospinal fluid obtained by puncture from patients examined for suspected normotensive hydrocephalus. Cerebrospinal fluid collected during routine examinations of these patients was analysed. The paper presents results for several dozen puncture samples of cerebrospinal fluid from various patients. Each sample was examined under the microscope and photographed in 20 randomly chosen places. From analysis of the images, each covering an area of 100 × 100 μm, selected cumulative parameters (count, numerical density, field area, and field perimeter) were determined for each sample, and the average values of the parameters were then calculated.
Tuuli, Methodius G; Odibo, Anthony O
2011-08-01
The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.
Using a Five-Step Procedure for Inferential Statistical Analyses
ERIC Educational Resources Information Center
Kamin, Lawrence F.
2010-01-01
Many statistics texts pose inferential statistical problems in a disjointed way. By using a simple five-step procedure as a template for statistical inference problems, the student can solve problems in an organized fashion. The problem and its solution will thus be a stand-by-itself organic whole and a single unit of thought and effort. The…
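The abstract does not spell out the five steps, but a common version of such a template is: state the hypotheses, choose a significance level, compute the test statistic, find the critical value, and decide. A minimal sketch with invented data (a one-sample t-test with df = 9; 2.262 is the standard two-sided critical value at alpha = 0.05):

```python
import math

sample = [5.1, 4.8, 5.3, 5.0, 4.9, 5.2, 5.4, 4.7, 5.1, 5.0]  # invented data

mu0 = 5.0     # step 1: H0: mu = 5.0 versus H1: mu != 5.0
alpha = 0.05  # step 2: significance level

n = len(sample)
mean = sum(sample) / n
s2 = sum((x - mean) ** 2 for x in sample) / (n - 1)  # sample variance
t_stat = (mean - mu0) / math.sqrt(s2 / n)            # step 3: test statistic

t_crit = 2.262                                       # step 4: t(0.975, df = 9)
reject = abs(t_stat) > t_crit                        # step 5: decision

print(f"t = {t_stat:.3f}; reject H0: {reject}")
```

Working every problem through the same numbered skeleton is the pedagogical point: the decision in step 5 is forced by choices already made in steps 1-4.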
NASA Technical Reports Server (NTRS)
Stefanski, Philip L.
2015-01-01
Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draw conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results not only derive from the inherent power of the software package, but also from the skill and understanding of the data analyst.
Harris, Michael; Radtke, Arthur S.
1976-01-01
Linear regression and discriminant analyses techniques were applied to gold, mercury, arsenic, antimony, barium, copper, molybdenum, lead, zinc, boron, tellurium, selenium, and tungsten analyses from drill holes into unoxidized gold ore at the Carlin gold mine near Carlin, Nev. The statistical treatments employed were used to judge proposed hypotheses on the origin and geochemical paragenesis of this disseminated gold deposit.
ERIC Educational Resources Information Center
Neumann, David L.; Hood, Michelle
2009-01-01
A wiki was used as part of a blended learning approach to promote collaborative learning among students in a first year university statistics class. One group of students analysed a data set and communicated the results by jointly writing a practice report using a wiki. A second group analysed the same data but communicated the results in a…
Extreme between-study homogeneity in meta-analyses could offer useful insights.
Ioannidis, John P A; Trikalinos, Thomas A; Zintzaras, Elias
2006-10-01
Meta-analyses are routinely evaluated for the presence of large between-study heterogeneity. We examined whether it is also important to probe whether there is extreme between-study homogeneity. We used heterogeneity tests with left-sided statistical significance for inference and developed a Monte Carlo simulation test for testing extreme homogeneity in risk ratios across studies, using the empiric distribution of the summary risk ratio and heterogeneity statistic. A left-sided P=0.01 threshold was set for claiming extreme homogeneity to minimize type I error. Among 11,803 meta-analyses with binary contrasts from the Cochrane Library, 143 (1.21%) had left-sided P-value <0.01 for the asymptotic Q statistic and 1,004 (8.50%) had left-sided P-value <0.10. The frequency of extreme between-study homogeneity did not depend on the number of studies in the meta-analyses. We identified examples where extreme between-study homogeneity (left-sided P-value <0.01) could result from various possibilities beyond chance. These included inappropriate statistical inference (asymptotic vs. Monte Carlo), use of a specific effect metric, correlated data or stratification using strong predictors of outcome, and biases and potential fraud. Extreme between-study homogeneity may provide useful insights about a meta-analysis and its constituent studies.
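The left-sided Monte Carlo test described in the abstract can be sketched directly: compute Cochran's Q for the observed studies, rebuild its null distribution by simulating homogeneous effects around the pooled estimate, and take the fraction of simulated Q values at or below the observed one. The study effects, variances, and counts below are invented for illustration.

```python
import math
import random

def q_stat(effects, variances):
    # Cochran's Q: weighted squared deviations from the pooled estimate
    w = [1.0 / v for v in variances]
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    return sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))

random.seed(42)
log_rr = [0.20, 0.21, 0.19, 0.20, 0.21, 0.20]  # suspiciously similar log risk ratios
var = [0.04, 0.05, 0.04, 0.06, 0.05, 0.04]     # within-study variances

q_obs = q_stat(log_rr, var)

# Monte Carlo null: homogeneous effects scattered around the pooled estimate
w = [1.0 / v for v in var]
pooled = sum(wi * e for wi, e in zip(w, log_rr)) / sum(w)
n_sim = 5000
q_null = [q_stat([random.gauss(pooled, math.sqrt(v)) for v in var], var)
          for _ in range(n_sim)]
p_left = sum(q <= q_obs for q in q_null) / n_sim  # left-sided P-value

print(f"Q = {q_obs:.4f}, left-sided P = {p_left:.4f}")
```

A left-sided P below the 0.01 threshold, as here, flags agreement between studies that is too good to be plausible under independent sampling.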
Cavalcante, Y L; Hauser-Davis, R A; Saraiva, A C F; Brandão, I L S; Oliveira, T F; Silveira, A M
2013-01-01
This paper compared and evaluated seasonal variations in physico-chemical parameters and metals at a hydroelectric power station reservoir by applying Multivariate Analyses and Artificial Neural Networks (ANN) statistical techniques. A Factor Analysis was used to reduce the number of variables: the first factor was composed of elements Ca, K, Mg and Na, and the second by Chemical Oxygen Demand. The ANN showed 100% correct classifications in training and validation samples. Physico-chemical analyses showed that water pH values were not statistically different between the dry and rainy seasons, while temperature, conductivity, alkalinity, ammonia and DO were higher in the dry period. TSS, hardness and COD, on the other hand, were higher during the rainy season. The statistical analyses showed that Ca, K, Mg and Na are directly connected to the Chemical Oxygen Demand, which indicates a possibility of their input into the reservoir system by domestic sewage and agricultural run-offs. These statistical applications, thus, are also relevant in cases of environmental management and policy decision-making processes, to identify which factors should be further studied and/or modified to recover degraded or contaminated water bodies. Copyright © 2012 Elsevier B.V. All rights reserved.
Cicatelli, Angela; Fortunati, Tancredi; De Feis, Italia; Castiglione, Stefano
2013-09-01
The present study is focused on determining the olive oil fatty acid composition of ancient and recent varieties of the Campania region (Italy), but also on molecularly characterizing the most common cultivated varieties in the same region, together with olive trees of the garden of the University Campus of Salerno and of three olive groves of south Italy. Fatty acid methyl esters in the extra virgin oil derived olive fruits were determined, during three consecutive harvests, by gas chromatography. The statistical analysis on fatty acid composition was performed with the ffmanova package. The genetic biodiversity of the olive collection was estimated by using eight highly polymorphic microsatellite loci and calculating the most commonly used indexes. "Dice index" was employed to estimate the similarity level of the analysed olive samples, while the Structure software to infer their genetic structure. The fatty acid content of extra virgin olive oils, produced from the two olive groves in Campania, suggests that the composition is mainly determined by genotype and not by cultural practices or climatic conditions. Furthermore, the analysis conducted on the molecular data revealed the presence of 100 distinct genotypes and seven homonymies out of the 136 analysed trees. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
From sexless to sexy: Why it is time for human genetics to consider and report analyses of sex.
Powers, Matthew S; Smith, Phillip H; McKee, Sherry A; Ehringer, Marissa A
2017-01-01
Science has come a long way with regard to the consideration of sex differences in clinical and preclinical research, but one field remains behind the curve: human statistical genetics. The goal of this commentary is to raise awareness and discussion about how to best consider and evaluate possible sex effects in the context of large-scale human genetic studies. Over the course of this commentary, we reinforce the importance of interpreting genetic results in the context of biological sex, establish evidence that sex differences are not being considered in human statistical genetics, and discuss how best to conduct and report such analyses. Our recommendation is to run stratified analyses by sex no matter the sample size or the result and report the findings. Summary statistics from stratified analyses are helpful for meta-analyses, and patterns of sex-dependent associations may be hidden in a combined dataset. In the age of declining sequencing costs, large consortia efforts, and a number of useful control samples, it is now time for the field of human genetics to appropriately include sex in the design, analysis, and reporting of results.
Pike, Katie; Nash, Rachel L; Murphy, Gavin J; Reeves, Barnaby C; Rogers, Chris A
2015-02-22
The Transfusion Indication Threshold Reduction (TITRe2) trial is the largest randomized controlled trial to date to compare red blood cell transfusion strategies following cardiac surgery. This update presents the statistical analysis plan, detailing how the study will be analyzed and presented. The statistical analysis plan has been written following recommendations from the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use, prior to database lock and the final analysis of trial data. Outlined analyses are in line with the Consolidated Standards of Reporting Trials (CONSORT). The study aims to randomize 2000 patients from 17 UK centres. Patients are randomized to either a restrictive (transfuse if haemoglobin concentration <7.5 g/dl) or liberal (transfuse if haemoglobin concentration <9 g/dl) transfusion strategy. The primary outcome is a binary composite outcome of any serious infectious or ischaemic event in the first 3 months following randomization. The statistical analysis plan details how non-adherence with the intervention, withdrawals from the study, and the study population will be derived and dealt with in the analysis. The planned analyses of the trial primary and secondary outcome measures are described in detail, including approaches taken to deal with multiple testing, model assumptions not being met and missing data. Details of planned subgroup and sensitivity analyses and pre-specified ancillary analyses are given, along with potential issues that have been identified with such analyses and possible approaches to overcome such issues. Trial registration: ISRCTN70923932.
Application of Statistically Derived CPAS Parachute Parameters
NASA Technical Reports Server (NTRS)
Romero, Leah M.; Ray, Eric S.
2013-01-01
The Capsule Parachute Assembly System (CPAS) Analysis Team is responsible for determining parachute inflation parameters and dispersions that are ultimately used in verifying system requirements. A model memo is internally released semi-annually documenting parachute inflation and other key parameters reconstructed from flight test data. Dispersion probability distributions published in previous versions of the model memo were uniform because insufficient data were available for determination of statistically based distributions. Uniform distributions do not accurately represent the expected distributions, since they treat extreme parameter values as just as likely to occur as the nominal value. CPAS has taken incremental steps to move away from uniform distributions. Model Memo version 9 (MMv9) made the first use of non-uniform dispersions, but only for the reefing cutter timing, for which a large number of samples was available. In order to maximize the utility of the available flight test data, clusters of parachutes were reconstructed individually starting with Model Memo version 10. This allowed for statistical assessment of steady-state drag area (CDS) and parachute inflation parameters such as the canopy fill distance (n), profile shape exponent (expopen), over-inflation factor (C(sub k)), and ramp-down time (t(sub k)) distributions. Built-in MATLAB distributions were applied to the histograms, and parameters such as scale (sigma) and location (mu) were output. Engineering judgment was used to determine the "best fit" distribution based on the test data. Results include normal, log normal, and uniform (where available data remain insufficient) fits of nominal and failure (loss of parachute and skipped stage) cases for all CPAS parachutes.
This paper discusses the uniform methodology that was previously used, the process and result of the statistical assessment, how the dispersions were incorporated into Monte Carlo analyses, and the application of the distributions in trajectory benchmark testing assessments with parachute inflation parameters, drag area, and reefing cutter timing used by CPAS.
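The distribution-fitting step can be illustrated in plain Python: fit normal and log-normal candidates by maximum likelihood and compare log-likelihoods as an aid to the "best fit" judgment the memo describes. The sample values below are invented, right-skewed stand-ins, not CPAS flight-test data.

```python
import math
import statistics

# invented, right-skewed "parameter" sample (e.g. a fill-distance-like quantity)
sample = [0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.2, 1.5, 2.0, 3.0, 5.0, 8.0]

def normal_loglik(data, mu, sigma):
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (x - mu) ** 2 / (2 * sigma ** 2) for x in data)

# normal fit (maximum likelihood: mean and population standard deviation)
ll_normal = normal_loglik(sample, statistics.fmean(sample), statistics.pstdev(sample))

# log-normal fit: normal fit on the log-data, plus the Jacobian of the log transform
logs = [math.log(x) for x in sample]
ll_lognormal = (normal_loglik(logs, statistics.fmean(logs), statistics.pstdev(logs))
                - sum(logs))

best = "log normal" if ll_lognormal > ll_normal else "normal"
print(f"normal LL = {ll_normal:.2f}, log-normal LL = {ll_lognormal:.2f} -> {best}")
```

For skewed data like this, the log-normal fit wins, which matches the intuition behind offering both families as candidates before applying engineering judgment.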
Dória, Maria Luisa; McKenzie, James S.; Mroz, Anna; Phelps, David L.; Speller, Abigail; Rosini, Francesca; Strittmatter, Nicole; Golf, Ottmar; Veselkov, Kirill; Brown, Robert; Ghaem-Maghami, Sadaf; Takats, Zoltan
2016-01-01
Ovarian cancer is highly prevalent among European women, and is the leading cause of gynaecological cancer death. Current histopathological diagnoses of tumour severity are based on interpretation of, for example, immunohistochemical staining. Desorption electrospray mass spectrometry imaging (DESI-MSI) generates spatially resolved metabolic profiles of tissues and supports an objective investigation of tumour biology. In this study, various ovarian tissue types were analysed by DESI-MSI and co-registered with their corresponding haematoxylin and eosin (H&E) stained images. The mass spectral data reveal tissue type-dependent lipid profiles which are consistent across the n = 110 samples (n = 107 patients) used in this study. Multivariate statistical methods were used to classify samples and identify molecular features discriminating between tissue types. Three main groups of samples (epithelial ovarian carcinoma, borderline ovarian tumours, normal ovarian stroma) were compared as were the carcinoma histotypes (serous, endometrioid, clear cell). Classification rates >84% were achieved for all analyses, and variables differing statistically between groups were determined and putatively identified. The changes noted in various lipid types help to provide a context in terms of tumour biochemistry. The classification of unseen samples demonstrates the capability of DESI-MSI to characterise ovarian samples and to overcome existing limitations in classical histopathology. PMID:27976698
Korus, Anna; Lisiewska, Zofia; Kmiecik, Waldemar
2002-08-01
Seeds of the grass pea (Lathyrus sativus L.) cultivars Derek and Krab, with a dry matter content of about 33%, were used for freezing and for canning. The content of vitamins C, B1, and B2 and of carotenoids, beta-carotene, and chlorophylls was determined in raw and blanched material, in frozen products after 6-month storage before and after cooking to consumption consistency, and in canned products after 6-month storage. In comparison with the cultivar Krab, raw seeds of Derek contained 45% more vitamin C, 14% more total chlorophylls, 13% less thiamine (vitamin B1), and 7% less riboflavin (vitamin B2). The level of carotenoids was similar. Blanching of seeds led to a statistically significant decrease only in the content of vitamin C. Freezing and frozen storage significantly lowered the level of vitamin C and chlorophylls. The cooking of frozen seeds and the production of canned products and their storage resulted in a statistically verified reduction in the content of components analysed in all the samples. Greater losses were found in products prepared from seeds of the cv. Krab. After cooking, frozen seeds contained more of all the analysed components than the canned products.
Ekins, Sean; Olechno, Joe; Williams, Antony J.
2013-01-01
Dispensing and dilution processes may profoundly influence estimates of biological activity of compounds. Published data show Ephrin type-B receptor 4 IC50 values obtained via tip-based serial dilution and dispensing versus acoustic dispensing with direct dilution differ by orders of magnitude with no correlation or ranking of datasets. We generated computational 3D pharmacophores based on data derived by both acoustic and tip-based transfer. The computed pharmacophores differ significantly depending upon dispensing and dilution methods. The acoustic dispensing-derived pharmacophore correctly identified active compounds in a subsequent test set where the tip-based method failed. Data from acoustic dispensing generates a pharmacophore containing two hydrophobic features, one hydrogen bond donor and one hydrogen bond acceptor. This is consistent with X-ray crystallography studies of ligand-protein interactions and automatically generated pharmacophores derived from this structural data. In contrast, the tip-based data suggest a pharmacophore with two hydrogen bond acceptors, one hydrogen bond donor and no hydrophobic features. This pharmacophore is inconsistent with the X-ray crystallographic studies and automatically generated pharmacophores. In short, traditional dispensing processes are another important source of error in high-throughput screening that impacts computational and statistical analyses. These findings have far-reaching implications in biological research. PMID:23658723
Bernard, Nicola K; Kashy, Deborah A; Levendosky, Alytia A; Bogat, G Anne; Lonstein, Joseph S
2017-03-01
Attunement between mothers and infants in their hypothalamic-pituitary-adrenal (HPA) axis responsiveness to acute stressors is thought to benefit the child's emerging physiological and behavioral self-regulation, as well as their socioemotional development. However, there is no universally accepted definition of attunement in the literature, which appears to have resulted in inconsistent statistical analyses for determining its presence or absence, and contributed to discrepant results. We used a series of data analytic approaches, some previously used in the attunement literature and others not, to evaluate the attunement between 182 women and their 1-year-old infants in their HPA axis responsivity to acute stress. Cortisol was measured in saliva samples taken from mothers and infants before and twice after a naturalistic laboratory stressor (infant arm restraint). The results of the data analytic approaches were mixed, with some analyses suggesting attunement while others did not. The strengths and weaknesses of each statistical approach are discussed, and an analysis using a cross-lagged model that considered both time and interactions between mother and infant appeared the most appropriate. Greater consensus in the field about the conceptualization and analysis of physiological attunement would be valuable in order to advance our understanding of this phenomenon. © 2016 Wiley Periodicals, Inc.
Characterising the disintegration properties of tablets in opaque media using texture analysis.
Scheuerle, Rebekah L; Gerrard, Stephen E; Kendall, Richard A; Tuleu, Catherine; Slater, Nigel K H; Mahbubani, Krishnaa T
2015-01-01
Tablet disintegration characterisation is used in pharmaceutical research, development, and quality control. Standard methods used to characterise tablet disintegration often depend on visual observation to measure disintegration times. This presents a challenge for disintegration studies of tablets in opaque, physiologically relevant media that could be useful for tablet formulation optimisation. This study explored an application of texture analysis disintegration testing, a non-visual, quantitative means of determining the tablet disintegration end point, by analysing the disintegration behaviour of two tablet formulations in opaque media. The disintegration behaviour of one tablet formulation manufactured in-house and of Sybedia Flashtab placebo tablets was characterised in water, bovine milk, and human milk. A novel method is presented to characterise the disintegration process and to quantify the disintegration end points of the tablets in various media using load data generated by a texture analyser probe. For both tablet formulations, one-way ANOVA showed the disintegration times in the different media to be statistically different (P<0.0001) from one another. Using the Tukey post-hoc test, the disintegration times of the Sybedia Flashtab placebo tablets in human versus bovine milk were found not to differ significantly (adjusted P value 0.1685). Copyright © 2015 Elsevier B.V. All rights reserved.
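The one-way ANOVA comparison in the abstract can be sketched directly from its sums of squares. The disintegration end points below are invented (the real study measured them from texture-analyser load curves), and the Tukey post-hoc step is omitted.

```python
# hypothetical disintegration end points (seconds) in three media
water = [35.0, 37.2, 33.8, 36.1, 34.9]
bovine_milk = [48.3, 50.1, 47.5, 49.0, 48.8]
human_milk = [55.2, 53.9, 56.4, 54.7, 55.8]

groups = [water, bovine_milk, human_milk]
k = len(groups)
n_total = sum(len(g) for g in groups)
grand_mean = sum(sum(g) for g in groups) / n_total

# between-group and within-group sums of squares
ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)

f_stat = (ss_between / (k - 1)) / (ss_within / (n_total - k))
print(f"F({k - 1}, {n_total - k}) = {f_stat:.1f}")  # a large F: media differ
```

A significant omnibus F like this is what licenses the pairwise Tukey comparisons reported in the abstract.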
NASA Technical Reports Server (NTRS)
Bhattacharyya, S.
1982-01-01
Four wrought alloys (A-286, IN 800H, N-155, and 19-9DL) and two cast alloys (CRM-6D and XF-818) were tested to determine their creep-rupture behavior. The wrought alloys were used in the form of sheets of 0.89 mm (0.035 in.) average thickness. The cast alloy specimens were investment cast and machined to 6.35 mm (0.250 in.) gage diameter. All specimens were tested to rupture in air at different times up to 3000 h over the temperature range of 650 C to 925 C (1200 F to 1700 F). Rupture life, minimum creep rate, and time to 1% creep strain were statistically analyzed as a function of stress at different temperatures. Temperature-compensated analysis was also performed to obtain the activation energies for rupture life, time to 1% creep strain, and the minimum creep rate. Microstructural and fracture analyses were also performed. Based on statistical analyses, estimates were made for stress levels at different temperatures to obtain 3500 h rupture life and time to 1% creep strain. Test results are to be compared with similar data being obtained for these alloys under 15 MPa (2175 psi) hydrogen.
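The temperature-compensated analysis mentioned in the abstract amounts to fitting an Arrhenius-type relation, t_r = A * exp(Q / (R * T)), so that the slope of log(rupture life) against 1/T gives an apparent activation energy. The rupture lives below are invented, not the report's alloy data.

```python
import math

R = 8.314                              # gas constant, J/(mol K)
temps_c = [650, 760, 870, 925]         # test temperatures, deg C
lives_h = [3000.0, 400.0, 60.0, 20.0]  # invented rupture lives, hours

x = [1.0 / (t + 273.15) for t in temps_c]  # 1/T in 1/K
y = [math.log(life) for life in lives_h]

# least-squares slope of ln(t_r) versus 1/T
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
q_kj = slope * R / 1000.0              # apparent activation energy, kJ/mol
print(f"apparent activation energy = {q_kj:.0f} kJ/mol")
```

The same regression machinery, run per stress level, is what supports extrapolated estimates such as the stress for a 3500 h rupture life.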
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fairey, R.; Roberts, C.; Jacobi, M.
1998-08-01
Sediment quality within San Diego Bay, Mission Bay, and the Tijuana River Estuary of California was investigated as part of an ongoing statewide monitoring effort (Bay Protection and Toxic Cleanup Program). Study objectives were to determine the incidence, spatial patterns, and spatial extent of toxicity in sediments and porewater; the concentration and distribution of potentially toxic anthropogenic chemicals; and the relationships between toxicity and chemical concentrations. Rhepoxynius abronius survival bioassays, grain size, and total organic carbon analyses were performed on 350 sediment samples. Strongylocentrotus purpuratus development bioassays were performed on 164 pore-water samples. Toxicity was demonstrated throughout the San Diego Bay region, with increased incidence and concordance occurring in areas of industrial and shipping activity. Trace metal and trace synthetic organic analyses were performed on 229 samples. Copper, zinc, mercury, polycyclic aromatic hydrocarbons, polychlorinated biphenyls, and chlordane were found to exceed ERM (effects range median) or PEL (probable effects level) sediment quality guidelines and were considered the six major chemicals or chemical groups of concern. Statistical analysis of the relationships between amphipod toxicity, bulk phase sediment chemistry, and physical parameters demonstrated few significant linear relationships. Significant differences in chemical levels were found between toxic and nontoxic responses using multivariate and univariate statistics. Potential sources of anthropogenic chemicals were discussed.
Immature granulocyte detection by the SE-9000 haematology analyser during pregnancy.
Fernández-Suárez, A; Pascual, V T; Gimenez, M T F; Hernández, J F S
2003-12-01
The objective of this study was to determine the nature of the alarm for immature granulocytes appearing in haemograms from pregnant women, as detected by the immature cell information channel (IMI) of the SE-9000 automated haematology analyser. Of all tests run on pregnant women in a 4-month period (n = 698), the first 100 haemograms with immature granulocyte alarms (14.33%) were collected. Each of these samples was then stained with Wright-Giemsa stain. The following variables were also analysed: age of the mother, trimester and days of gestation, type of delivery, weight and sex of the baby, and Apgar score. Most pregnant women were in the third trimester of gestation (82%) when an alarm was noted on the IMI channel. Of the patients, 62% had normal deliveries. The most frequent complication was obstructed delivery (23%). Mean percentages by microscopic counts of band cells, metamyelocytes, and myelocytes were 2.99, 0.45, and 0.19%, respectively. There was a statistically significant correlation for all cell types between the SE-9000 and the manual count method. No association was observed between the presence of immature granulocytes and the clinical variables analysed. The SE-9000 analyser shows high sensitivity in the IMI channel for detection of immature forms.
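The automated-versus-manual comparison in the abstract reduces to a Pearson correlation per cell type. A sketch with invented band-cell percentages (the study's actual counts are not given in the abstract):

```python
import math

imi_band = [2.1, 3.4, 2.8, 4.0, 1.9, 3.1, 2.5]     # analyser (IMI) band cells, %
manual_band = [2.3, 3.6, 2.6, 4.2, 2.0, 3.0, 2.7]  # manual microscopic count, %

n = len(imi_band)
mx = sum(imi_band) / n
my = sum(manual_band) / n
cov = sum((a - mx) * (b - my) for a, b in zip(imi_band, manual_band))
r = cov / math.sqrt(sum((a - mx) ** 2 for a in imi_band)
                    * sum((b - my) ** 2 for b in manual_band))
print(f"Pearson r = {r:.3f}")
```

A high r across cell types is what supports using the automated channel as a screen, with microscopy reserved for flagged samples.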
Ahearn, Elizabeth A.
2010-01-01
Multiple linear regression equations for determining flow-duration statistics were developed to estimate select flow exceedances ranging from 25- to 99-percent for six 'bioperiods'-Salmonid Spawning (November), Overwinter (December-February), Habitat Forming (March-April), Clupeid Spawning (May), Resident Spawning (June), and Rearing and Growth (July-October)-in Connecticut. Regression equations also were developed to estimate the 25- and 99-percent flow exceedances without reference to a bioperiod. In total, 32 equations were developed. The predictive equations were based on regression analyses relating flow statistics from streamgages to GIS-determined basin and climatic characteristics for the drainage areas of those streamgages. Thirty-nine streamgages (and an additional 6 short-term streamgages and 28 partial-record sites for the non-bioperiod 99-percent exceedance) in Connecticut and adjacent areas of neighboring States were used in the regression analysis. Weighted least squares regression analysis was used to determine the predictive equations; weights were assigned based on record length. The basin characteristics-drainage area, percentage of area with coarse-grained stratified deposits, percentage of area with wetlands, mean monthly precipitation (November), mean seasonal precipitation (December, January, and February), and mean basin elevation-are used as explanatory variables in the equations. Standard errors of estimate of the 32 equations ranged from 10.7 to 156 percent with medians of 19.2 and 55.4 percent to predict the 25- and 99-percent exceedances, respectively. Regression equations to estimate high and median flows (25- to 75-percent exceedances) are better predictors (smaller variability of the residual values around the regression line) than the equations to estimate low flows (greater than 75-percent exceedance). The Habitat Forming (March-April) bioperiod had the smallest standard errors of estimate, ranging from 10.7 to 20.9 percent.
In contrast, the Rearing and Growth (July-October) bioperiod had the largest standard errors, ranging from 30.9 to 156 percent. The adjusted coefficients of determination of the equations ranged from 77.5 to 99.4 percent, with medians of 98.5 and 90.6 percent for the 25- and 99-percent exceedances, respectively. Descriptive information on the streamgages used in the regression, the measured basin and climatic characteristics, and the estimated flow-duration statistics is provided in this report. Flow-duration statistics and the 32 regression equations for estimating flow-duration statistics in Connecticut are stored on the U.S. Geological Survey World Wide Web application StreamStats (http://water.usgs.gov/osw/streamstats/index.html). The regression equations developed in this report can be used to produce unbiased estimates of selected flow exceedances statewide.
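The weighted least squares step described above, with weights assigned by record length, can be illustrated with a minimal single-predictor sketch. The variable names and numbers below are invented for illustration and are not from the report; in practice the report's equations use multiple log-transformed basin characteristics.

```python
def weighted_least_squares(x, y, w):
    """Fit y = a + b*x by weighted least squares with weights w
    (here, w stands in for streamgage record length in years)."""
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    b = sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y)) \
        / sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))
    a = ybar - b * xbar
    return a, b

# Illustrative data: log10(flow statistic) vs. log10(drainage area),
# constructed to follow y = 0.4 + 1.0*x exactly.
log_area = [0.5, 1.0, 1.5, 2.0, 2.5]
log_flow = [0.9, 1.4, 1.9, 2.4, 2.9]
years = [10, 40, 25, 60, 15]   # hypothetical record lengths used as weights
a, b = weighted_least_squares(log_area, log_flow, years)
```

Longer-record streamgages pull the fitted line toward themselves; with the exact linear data above the weights do not change the answer, which makes the fit easy to verify.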
NASA Technical Reports Server (NTRS)
Botkin, Daniel B.
1987-01-01
The analysis of ground-truth data from the boreal forest plots in the Superior National Forest, Minnesota, was completed. Development of statistical methods was completed for dimension analysis (equations to estimate the biomass of trees from measurements of diameter and height). The dimension-analysis equations were applied to the data obtained from the ground-truth plots to estimate biomass. Classification and analyses of remote sensing images of the Superior National Forest were done as a test of the technique to determine forest biomass and ecological state by remote sensing. Data were archived on diskette and tape and transferred to UCSB for use in subsequent research.
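Dimension-analysis equations of the kind described are commonly fitted as allometric power laws, mass = a * diameter^b, estimated by ordinary least squares on log-transformed data. The form and coefficients below are an illustrative assumption, not the equations from this study:

```python
import math

def fit_allometric(diam, mass):
    """Fit mass = a * diam**b by OLS on the log-log transformed data."""
    lx = [math.log(d) for d in diam]
    ly = [math.log(m) for m in mass]
    n = len(lx)
    xbar, ybar = sum(lx) / n, sum(ly) / n
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(lx, ly)) \
        / sum((xi - xbar) ** 2 for xi in lx)
    a = math.exp(ybar - b * xbar)
    return a, b

# Hypothetical tree data constructed to follow mass = 0.1 * D**2.4 exactly
diameters = [5.0, 10.0, 20.0, 30.0, 40.0]
masses = [0.1 * d ** 2.4 for d in diameters]
a, b = fit_allometric(diameters, masses)
```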
NASA Technical Reports Server (NTRS)
1976-01-01
A representative set of payloads for both the science and applications disciplines was selected to ensure a realistic and statistically significant estimate of equipment utilization. The selected payloads were analyzed to determine the applicability of Nuclear Instrumentation Modular (NIM)/Computer Automated Measurement Control (CAMAC) equipment in satisfying their data acquisition and control requirements. The analysis results were combined with comparable results from related studies to arrive at an overall assessment of the applicability and commonality of NIM/CAMAC equipment usage across the spectrum of payloads.
Novel image encryption algorithm based on multiple-parameter discrete fractional random transform
NASA Astrophysics Data System (ADS)
Zhou, Nanrun; Dong, Taiji; Wu, Jianhua
2010-08-01
A new method of digital image encryption is presented by utilizing a new multiple-parameter discrete fractional random transform. Image encryption and decryption are performed based on the index additivity and multiple parameters of the multiple-parameter fractional random transform. The plaintext and ciphertext are respectively in the spatial domain and in the fractional domain determined by the encryption keys. The proposed algorithm can effectively resist statistical analyses. Computer simulation results show that the proposed encryption algorithm is sensitive to the multiple keys and has considerable robustness, noise immunity, and security.
NASA Astrophysics Data System (ADS)
Stück, H. L.; Siegesmund, S.
2012-04-01
Sandstones are a popular natural stone due to their wide occurrence and availability. The different applications for these stones have led to an increase in demand. From the viewpoint of conservation and the natural stone industry, an understanding of the material behaviour of this construction material is very important. Sandstones are a highly heterogeneous material. Based on statistical analyses with a sufficiently large dataset, a systematic approach to predicting the material behaviour should be possible. Since the literature already contains a large volume of data concerning the petrographical and petrophysical properties of sandstones, a large dataset could be compiled for the statistical analyses. The aim of this study is to develop constraints on the material behaviour, and especially the weathering behaviour, of sandstones. Approximately 300 samples from historical and presently mined natural sandstones in Germany, as well as sandstones described worldwide, were included in the statistical approach. The mineralogical composition and fabric characteristics were determined from detailed thin-section analyses and from descriptions in the literature. Particular attention was paid to evaluating the compositional and textural maturity, the grain contacts and contact thickness, the type of cement, the degree of alteration, and the intergranular volume. Statistical methods were used to test for normal distributions and to calculate linear regressions among the basic petrophysical properties of density, porosity, water uptake, and strength. The sandstones were classified into three different pore-size distributions and evaluated against the other petrophysical properties. Weathering behaviour, characterized by hygric swelling and salt-loading tests, was also included. To identify similarities between individual sandstones and to define groups of specific sandstone types, principal component analysis, cluster analysis, and factor analysis were applied.
Our results show that composition and porosity evolution during diagenesis are very important controls on the petrophysical properties of a building stone. The relationships between intergranular volume, cementation, and grain contacts can also provide valuable information for predicting the strength properties. Since the samples investigated mainly originate from the Triassic German epicontinental basin, arkoses and feldspar-arenites are underrepresented. In general, the sandstones can be grouped as follows: i) quartzites, highly mature, with a primary porosity of about 40%; ii) quartzites, highly mature, showing a primary porosity of 40% but with early clay infiltration; iii) sublitharenites to lithic arenites exhibiting a lower primary porosity and higher cementation with quartz and ferritic Fe-oxides; and iv) sublitharenites to lithic arenites with a higher content of pseudomatrix. However, in the last two groups the feldspar and lithoclasts can also show considerable alteration. All sandstone groups differ with respect to pore space and strength data, as well as water-uptake properties, which were obtained by linear regression analysis. Similar petrophysical properties are discernible for each type when using principal component analysis. Furthermore, both the strength and the porosity of sandstones show distinct differences with respect to stratigraphic age and composition. The relationships between porosity, strength, and salt resistance could also be verified. Hygric swelling shows an interrelation with pore-size type, porosity, and strength, but also with the degree of alteration (e.g. lithoclasts, pseudomatrix). To summarize, the different regression analyses and the calculated confidence regions provide a significant tool for classifying the petrographical and petrophysical parameters of sandstones. Based on this, the durability and the weathering behaviour of the sandstone groups can be constrained.
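The principal component analysis step can be sketched as follows; this is a generic PCA on synthetic, invented porosity-strength data (the hypothetical negative relation between the two variables is an assumption for illustration, not the study's dataset):

```python
import numpy as np

def pca(X):
    """Principal component analysis via eigen-decomposition of the
    correlation matrix of the standardized columns of X."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    C = np.cov(Z, rowvar=False)
    eigval, eigvec = np.linalg.eigh(C)           # returned in ascending order
    order = np.argsort(eigval)[::-1]
    eigval, eigvec = eigval[order], eigvec[:, order]
    explained = eigval / eigval.sum()            # variance explained per PC
    scores = Z @ eigvec                          # sample scores on the PCs
    return explained, scores

# Synthetic petrophysical-like data: porosity (%) and a hypothetical
# strength (MPa) that decreases with porosity, plus noise.
rng = np.random.default_rng(0)
porosity = rng.uniform(5, 25, 100)
strength = 120 - 3.5 * porosity + rng.normal(0, 2, 100)
explained, scores = pca(np.column_stack([porosity, strength]))
```

Because the two synthetic variables are strongly correlated, one component captures most of the variance, which is exactly the kind of grouping signal PCA is used for here.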
Keywords: sandstones, petrographical & petrophysical properties, predictive approach, statistical investigation
The future of the New Zealand plastic surgery workforce.
Adams, Brandon M; Klaassen, Michael F; Tan, Swee T
2013-04-05
The New Zealand (NZ) plastic and reconstructive surgery (PRS) workforce provides reconstructive plastic surgery (RPS) public services from six centres. There has been little analysis of whether the workforce is adequate to meet the needs of the NZ population currently or in the future. This study analysed the current workforce, its distribution, and future requirements. PRS manpower data, workforce activities, population statistics, and population modelling were analysed to determine current needs and to predict future needs for the PRS workforce. The NZ PRS workforce is compared with international benchmarks. Regional variation of the workforce was analysed with respect to the population's access to PRS services. The future supply of specialist plastic surgeons is also analysed. NZ has fewer plastic surgeons per capita than comparable countries. The current NZ PRS workforce is maldistributed. Areas of current and emerging future need are identified. The current workforce maldistribution will worsen with future population growth and distribution. Up to 60% of the NZ population will be at risk of inadequate access to PRS services by 2027. Development of PRS services must be coordinated to ensure that equitable and sustainable services are available throughout NZ. Strategies for ensuring a satisfactory future workforce are discussed.
The epidemiology of suicide in Jamaica 2002-2010: rates and patterns.
Abell, W D; James, K; Bridgelal-Nagassar, R; Holder-Nevins, D; Eldemire, H; Thompson, E; Sewell, C
2012-08-01
Suicide is increasingly recognized as a worldwide problem, yet there is a paucity of quality data pertaining to suicide in developing countries. Epidemiological analysis of suicide data elucidates prevailing patterns that facilitate risk-factor identification and the development of germane programmatic responses. This paper analyses temporal variations in suicide rates for the years 2002-2010 in Jamaica and describes the sociodemographic profile of cases and the method of suicide for the latter four years. Data pertaining to suicides were extracted from the records of the police (the Jamaica Constabulary Force). These were summarized and analysed with respect to person, place, and time. Population statistics for the computation of rates were obtained from publications of the Statistical Institute of Jamaica. Age-standardized rates were generated for comparison of trends over time. Poisson and binomial probabilities were used to determine statistically significant differences in rates. Suicide rates in Jamaica remained relatively stable over the period reviewed, with a mean overall annual incidence of 2.1 per 100 000 population. Rates for males were significantly higher than those for females, and the majority (90.4%) of suicide cases were males. A trend toward higher rates of suicide was generally noted in the 25-34-year and the 75-year-and-over age groups. Hanging was the main method used (77.5%). Age-adjusted rates indicate no significant changes in Jamaica over the period 2002 to 2010. Continued surveillance of suicide, as well as improved recording of the circumstances surrounding suicides, is recommended to promote greater understanding of suicide and ultimately to inform intervention strategies.
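Age-standardized rates of the kind reported above are usually computed by direct standardization: age-specific rates are weighted by a standard population's age distribution. The age groups and counts below are invented for illustration, not Jamaican data:

```python
def directly_standardized_rate(deaths, person_years, std_pop, per=100_000):
    """Direct age standardization: weight each age-specific rate
    (deaths/person_years) by the standard population's age share."""
    total_std = sum(std_pop)
    return per * sum((p / total_std) * d / n
                     for p, d, n in zip(std_pop, deaths, person_years))

# Hypothetical age groups (e.g. 15-24, 25-34, 35-54, 55-74, 75+)
deaths       = [2, 8, 5, 4, 3]
person_years = [400_000, 350_000, 300_000, 200_000, 50_000]
std_pop      = [180_000, 160_000, 150_000, 120_000, 40_000]  # assumed standard
rate = directly_standardized_rate(deaths, person_years, std_pop)
```

A sanity check on the formula: if every age group has the same rate, the standardized rate equals that common rate regardless of the weights.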
Analysis of Exhaled Breath Volatile Organic Compounds in Inflammatory Bowel Disease: A Pilot Study.
Hicks, Lucy C; Huang, Juzheng; Kumar, Sacheen; Powles, Sam T; Orchard, Timothy R; Hanna, George B; Williams, Horace R T
2015-09-01
Distinguishing between the inflammatory bowel diseases [IBD], Crohn's disease [CD] and ulcerative colitis [UC], is important for determining management and prognosis. Selected ion flow tube mass spectrometry [SIFT-MS] may be used to analyse volatile organic compounds [VOCs] in exhaled breath: these may be altered in disease states, and distinguishing breath VOC profiles can be identified. The aim of this pilot study was to identify, quantify, and analyse VOCs present in the breath of IBD patients and controls, potentially providing insights into disease pathogenesis and complementing current diagnostic algorithms. SIFT-MS breath profiling of 56 individuals [20 UC, 18 CD, and 18 healthy controls] was undertaken. Multivariate analysis included principal components analysis and partial least squares discriminant analysis with orthogonal signal correction [OSC-PLS-DA]. Receiver operating characteristic [ROC] analysis was performed for each comparative analysis using statistically significant VOCs. OSC-PLS-DA modelling was able to distinguish both CD and UC from healthy controls and from one another with good sensitivity and specificity. ROC analysis using combinations of statistically significant VOCs [dimethyl sulphide, hydrogen sulphide, hydrogen cyanide, ammonia, butanal, and nonanal] gave integrated areas under the curve of 0.86 [CD vs healthy controls], 0.74 [UC vs healthy controls], and 0.83 [CD vs UC]. Exhaled breath VOC profiling was able to distinguish IBD patients from controls, as well as to separate UC from CD, using both multivariate and univariate statistical techniques. Copyright © 2015 European Crohn's and Colitis Organisation (ECCO). Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
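The area under the ROC curve reported above can be computed without plotting a curve at all, via its rank interpretation: the probability that a randomly chosen case scores higher than a randomly chosen control. The discriminant scores below are invented, not the study's data:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney probability that a positive case
    outranks a negative one; ties count one half."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical combined-VOC discriminant scores for cases vs. controls
cd_scores = [0.9, 0.8, 0.75, 0.6, 0.55]
control_scores = [0.5, 0.4, 0.35, 0.3, 0.2]
auc = roc_auc(cd_scores, control_scores)
```

With perfectly separated groups, as in this toy example, the AUC is 1.0; identical score distributions give 0.5.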
Does climate directly influence NPP globally?
Chu, Chengjin; Bartlett, Megan; Wang, Youshi; He, Fangliang; Weiner, Jacob; Chave, Jérôme; Sack, Lawren
2016-01-01
The need for rigorous analyses of climate impacts has never been more crucial. Current textbooks state that climate directly influences ecosystem annual net primary productivity (NPP), emphasizing the urgent need to monitor the impacts of climate change. A recent paper challenged this consensus, arguing, based on an analysis of NPP for 1247 woody plant communities across global climate gradients, that temperature and precipitation have negligible direct effects on NPP and only perhaps have indirect effects by constraining total stand biomass (Mtot) and stand age (a). The authors of that study concluded that the length of the growing season (lgs) might have a minor influence on NPP, an effect they considered not to be directly related to climate. In this article, we describe flaws that affected that study's conclusions and present novel analyses to disentangle the effects of stand variables and climate in determining NPP. We re-analyzed the same database to partition the direct and indirect effects of climate on NPP, using three approaches: maximum-likelihood model selection, independent-effects analysis, and structural equation modeling. These new analyses showed that about half of the global variation in NPP could be explained by Mtot combined with climate variables and supported strong and direct influences of climate independently of Mtot, both for NPP and for net biomass change averaged across the known lifetime of the stands (ABC = average biomass change). We show that lgs is an important climate variable, intrinsically correlated with, and contributing to, mean annual temperature and precipitation (Tann and Pann), all important climatic drivers of NPP. Our analyses provide guidance for statistical and mechanistic analyses of climate drivers of ecosystem processes for predictive modeling and provide novel evidence supporting the strong, direct role of climate in determining vegetation productivity at the global scale. © 2015 John Wiley & Sons Ltd.
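Maximum-likelihood model selection among candidate stand-versus-climate models can be sketched with an information criterion for Gaussian linear models. Everything below is an illustrative assumption: the data are synthetic, and AIC is used as a stand-in selection criterion (the paper does not specify its exact criterion):

```python
import numpy as np

def fit_aic(X, y):
    """OLS fit; Gaussian-model AIC up to an additive constant:
    n*log(RSS/n) + 2k, with k coefficients including the intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    rss = float(np.sum((y - X1 @ beta) ** 2))
    n, k = len(y), X1.shape[1]
    return n * np.log(rss / n) + 2 * k

# Synthetic example in which NPP depends on both stand biomass and climate.
rng = np.random.default_rng(1)
mtot = rng.uniform(50, 400, 200)      # hypothetical stand biomass
temp = rng.uniform(-5, 25, 200)       # hypothetical mean annual temperature
npp = 0.8 * mtot + 12.0 * temp + rng.normal(0, 10, 200)
aic_stand_only = fit_aic(mtot.reshape(-1, 1), npp)
aic_full = fit_aic(np.column_stack([mtot, temp]), npp)
```

When the data truly contain a direct climate effect, the stand-plus-climate model attains the lower AIC, which is the logic behind partitioning direct and indirect effects by model comparison.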
The influence of aggregates type on W/C ratio on the strength and other properties of concrete
NASA Astrophysics Data System (ADS)
Malaiskiene, J.; Skripkiunas, G.; Vaiciene, M.; Karpova, E.
2017-10-01
The influence of different types of aggregates and of the W/C ratio on concrete properties is analysed. To this end, lightweight (expanded clay aggregate) and normal (gravel aggregate) concrete mixtures are prepared with different W/C ratios; the different W/C ratios are obtained by reducing the amount of cement while keeping the amount of water constant. The following properties of the concrete are determined: density, compressive strength, and water absorption. Additionally, a statistical analysis of the data is performed, and the influence of aggregate type and W/C ratio on the concrete properties is determined. Empirical equations relating concrete strength to the W/C ratio and to aggregate strength are obtained for both normal and lightweight concrete.
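Empirical strength-W/C equations of this kind are often fitted in the classical Abrams form f = A / B^(W/C), which becomes linear after a log transform. This is a generic sketch with invented data; the paper's own equation forms and coefficients are not given here:

```python
import math

def fit_abrams(wc, strength):
    """Fit Abrams' rule f = A / B**(W/C) via OLS on
    ln(f) = ln(A) - (W/C) * ln(B)."""
    y = [math.log(f) for f in strength]
    n = len(wc)
    xbar, ybar = sum(wc) / n, sum(y) / n
    slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(wc, y)) \
            / sum((xi - xbar) ** 2 for xi in wc)
    A = math.exp(ybar - slope * xbar)
    B = math.exp(-slope)
    return A, B

# Invented compressive strengths (MPa) following f = 100 / 7**(W/C) exactly
wc_ratios = [0.4, 0.5, 0.6, 0.7]
strengths = [100 / 7 ** r for r in wc_ratios]
A, B = fit_abrams(wc_ratios, strengths)
```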
Statistical Techniques to Analyze Pesticide Data Program Food Residue Observations.
Szarka, Arpad Z; Hayworth, Carol G; Ramanarayanan, Tharacad S; Joseph, Robert S I
2018-06-26
The U.S. EPA conducts dietary-risk assessments to ensure that levels of pesticides on food in the U.S. food supply are safe. Often these assessments utilize conservative residue estimates, maximum residue levels (MRLs), and a high-end estimate derived from registrant-generated field-trial data sets. A more realistic estimate of consumers' pesticide exposure from food may be obtained by utilizing residues from food-monitoring programs, such as the Pesticide Data Program (PDP) of the U.S. Department of Agriculture. A substantial portion of food-residue concentrations in PDP monitoring programs are below the limits of detection (left-censored), which makes the comparison of regulatory-field-trial and PDP residue levels difficult. In this paper, we present a novel adaptation of established statistical techniques, the Kaplan-Meier estimator (K-M), robust regression on order statistics (ROS), and the maximum-likelihood estimator (MLE), to quantify pesticide-residue concentrations in the presence of heavily censored data sets. The examined statistical approaches include the most commonly used parametric and nonparametric methods for handling left-censored data in the fields of medical and environmental sciences. This work presents a case study in which data of thiamethoxam residue on bell pepper generated from registrant field trials were compared with PDP-monitoring residue values. The results from the statistical techniques were evaluated and compared with commonly used simple substitution methods for the determination of summary statistics. It was found that the MLE is the most appropriate statistical method to analyze this residue data set. Using the MLE technique, the data analyses showed that the median and mean PDP bell pepper residue levels were approximately 19 and 7 times lower, respectively, than the corresponding statistics of the field-trial residues.
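The MLE approach for left-censored residue data can be sketched for a lognormal model: detected values contribute density terms to the likelihood, while each non-detect contributes the probability of falling below the limit of detection. The data here are simulated, and the lognormal assumption and LOD value are illustrative choices, not the paper's dataset:

```python
import numpy as np
from scipy import optimize, stats

def censored_lognormal_mle(detects, n_censored, lod):
    """MLE of (mu, sigma) for lognormal data with n_censored values
    reported only as '< lod' (left-censored at the detection limit)."""
    logs = np.log(detects)

    def negloglik(theta):
        mu, log_sigma = theta          # optimize log(sigma) to keep sigma > 0
        sigma = np.exp(log_sigma)
        ll = stats.norm.logpdf(logs, mu, sigma).sum()
        ll += n_censored * stats.norm.logcdf(np.log(lod), mu, sigma)
        return -ll

    res = optimize.minimize(negloglik, x0=[logs.mean(), 0.0],
                            method="Nelder-Mead")
    return res.x[0], float(np.exp(res.x[1]))

# Simulated residues: true mu=0, sigma=1; roughly 30% fall below the LOD
rng = np.random.default_rng(42)
x = rng.lognormal(mean=0.0, sigma=1.0, size=500)
lod = 0.6
mu_hat, sigma_hat = censored_lognormal_mle(x[x >= lod], int((x < lod).sum()), lod)
```

Unlike simple substitution (e.g. replacing non-detects with LOD/2), this likelihood uses the censored observations without inventing values for them.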
Conceptual and statistical problems associated with the use of diversity indices in ecology.
Barrantes, Gilbert; Sandoval, Luis
2009-09-01
Diversity indices, particularly the Shannon-Wiener index, have been used extensively in analyzing patterns of diversity at different geographic and ecological scales. These indices have serious conceptual and statistical problems which make comparisons of species richness or species abundances across communities nearly impossible. There is often no single statistical method that retains all the information needed to answer even a simple question. However, multivariate analyses, such as cluster analyses or multiple regressions, could be used instead of diversity indices. More complex multivariate analyses, such as Canonical Correspondence Analysis, provide very valuable information on the environmental variables associated with the presence and abundance of the species in a community. In addition, particular hypotheses associated with changes in species richness across localities, or with changes in the abundance of one or a group of species, can be tested using univariate, bivariate, and/or rarefaction statistical tests. The rarefaction method has proved to be robust for standardizing all samples to a common size. Even the simplest method, reporting the number of species per taxonomic category, possibly provides more information than a diversity index value.
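The two quantities contrasted above, the Shannon-Wiener index and rarefied species richness, are both short formulas. The community abundances below are hypothetical; the rarefaction formula is Hurlbert's expected richness for a subsample of n individuals:

```python
import math

def shannon_index(counts):
    """Shannon-Wiener diversity H' = -sum(p_i * ln p_i)."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def rarefied_richness(counts, n):
    """Expected species count in a random subsample of n individuals:
    E[S_n] = sum_i [1 - C(N - N_i, n) / C(N, n)]."""
    total = sum(counts)
    return sum(1 - math.comb(total - c, n) / math.comb(total, n)
               for c in counts)

community = [50, 30, 15, 4, 1]   # hypothetical species abundances
h = shannon_index(community)
s_rarefied = rarefied_richness(community, 20)
```

The rarefied value illustrates the paper's point: at a common subsample size the rare species contribute little, so comparing communities via E[S_n] standardizes for sampling effort in a way a single index value cannot.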
NASA Astrophysics Data System (ADS)
Toth-Tascau, Mirela; Balanean, Flavia; Krepelka, Mircea
2013-10-01
Musculoskeletal impairment of the upper limb can cause difficulties in performing basic daily activities. Three-dimensional motion analyses can provide valuable data for precisely determining arm movement and inter-joint coordination. The purpose of this study was to develop a method of evaluating the degree of impairment based on the influence of shoulder movements on the amplitude of elbow flexion and extension, on the assumption that a lack of motion of the elbow joint will be compensated by increased shoulder activity. In order to develop and validate a statistical model, one healthy young volunteer was enrolled in the study. The chosen activity simulated blowing the nose: starting from a slight flexion of the elbow, the hand is raised until the middle finger touches the tip of the nose and is then returned to the start position. Inter-joint coordination between the elbow and shoulder movements showed significant correlation. Statistical regression was used to fit an equation model describing the influence of shoulder movements on elbow mobility. The study provides a brief description of the kinematic analysis protocol and statistical models that may be useful in describing the relation between inter-joint movements in daily activities.
Gadomski, Adam; Ausloos, Marcel; Casey, Tahlia
2017-04-01
This article addresses a set of observations framed in both deterministic and statistical formal guidelines, operating within the framework of nonlinear dynamical systems (NDS) theory. It is argued that statistical approaches can manifest themselves ambiguously, creating practical discrepancies in psychological and cognitive data analyses, both quantitatively and qualitatively; this is sometimes termed in the literature 'questionable research practices.' This communication points to the demand for a deeper awareness of the data's initial conditions, allowing a focus on pertinent evolution constraints in such systems. It also considers whether the exponential (Malthus-type) or the algebraic (Pareto-type) statistical distribution ought to be considered in practical interpretations. The role of repetitive specific behaviors by patients seeking treatment is examined within the NDS frame. The significance of these behaviors, involving a certain memory effect, seems crucial in determining a patient's progression or regression. From this perspective, it is discussed how a sensitively applied hazardous or triggering factor can be helpful for well-controlled psychological strategic treatments; those attributable to obsessive-compulsive disorders or self-injurious behaviors are recalled in particular. There are both inherent criticality- and complexity-exploiting (reduced-variance-based) relations between a therapist and a patient that can be intrinsically included in NDS theory.
Taylor, Sandra L; Ruhaak, L Renee; Weiss, Robert H; Kelly, Karen; Kim, Kyoungmi
2017-01-01
High-throughput mass spectrometry (MS) is now being used to profile small molecular compounds across multiple biological sample types from the same subjects, with the goal of leveraging information across biospecimens. Multivariate statistical methods that combine information from all biospecimens could be more powerful than the usual univariate analyses. However, missing values are common in MS data, and imputation can affect between-biospecimen correlations and multivariate analysis results. We propose two multivariate two-part statistics that accommodate missing values and combine data from all biospecimens to identify differentially regulated compounds. Statistical significance is determined using a multivariate permutation null distribution. Relative to univariate tests, the multivariate procedures detected more significant compounds in three biological datasets. In a simulation study, we showed that multi-biospecimen testing procedures are more powerful than single-biospecimen methods when compounds are differentially regulated in multiple biospecimens, but univariate methods can be more powerful if compounds are differentially regulated in only one biospecimen. We provide R functions to implement and illustrate our method as supplementary information. Contact: sltaylor@ucdavis.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
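The permutation-null idea behind the significance step can be illustrated with a deliberately simplified univariate two-sample test (difference in group means under random relabelling); this is not the authors' multivariate two-part statistic, and the intensity values are invented:

```python
import random

def permutation_pvalue(group_a, group_b, n_perm=10_000, seed=0):
    """Two-sided permutation p-value for the difference in group means:
    the statistic is recomputed after randomly relabelling samples."""
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    n_a, hits = len(group_a), 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(sum(pooled[:n_a]) / n_a
                   - sum(pooled[n_a:]) / (len(pooled) - n_a))
        if diff >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)   # add-one correction avoids p = 0

# Hypothetical compound intensities in two phenotype groups
control = [1.1, 0.9, 1.0, 1.2, 0.8, 1.05, 0.95, 1.15]
disease = [2.0, 1.9, 2.2, 2.1, 1.8, 2.05, 2.15, 1.95]
p = permutation_pvalue(control, disease)
```

In the multivariate setting, the same relabelling is applied jointly across biospecimens so that between-biospecimen correlation is preserved under the null.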
Lee, Dong-Hyun; Kang, Bo-Sik; Park, Hyun-Jin
2011-11-09
The oxidation of Cabernet Sauvignon wines during secondary shelf life was studied by headspace solid-phase microextraction (HS-SPME) coupled to gas chromatography-quadrupole mass spectrometry (GC-qMS) and by sensory tests, with the support of multivariate statistical analyses such as OPLS-DA loading plots and PCA score plots. Four different oxidation conditions were established during a 1-week secondary shelf life. Samples collected on a regular basis were analyzed to determine the changes in volatile chemicals, with sensory characteristics evaluated through pattern-recognition models. During secondary shelf life, the separation among collected samples depended on the degree of oxidation of the wine. Isoamyl acetate, ethyl decanoate, nonanoic acid, n-decanoic acid, undecanoic acid, 2-furancarboxylic acid, dodecanoic acid, and phenylacetaldehyde were determined to be associated with the oxidation of the wine. PCA of the sensory evaluation revealed that the least oxidized wine and fresh wine were well separated from more oxidized wines, demonstrating that the sensory characteristics of less oxidized wines tend toward "fruity", "citrous", and "sweetness", while those of more oxidized wines are positively correlated with "animal", "bitterness", and "dairy". The study also demonstrates that OPLS-DA and PCA are very useful statistical tools for understanding wine oxidation.
Preoperative and post-operative sleep quality evaluation in rotator cuff tear patients.
Serbest, Sancar; Tiftikçi, Uğur; Askın, Aydogan; Yaman, Ferda; Alpua, Murat
2017-07-01
The aim of this study was to examine the potential relationship between subjective sleep quality and degree of pain in patients with rotator cuff repair. Thirty-one patients who underwent rotator cuff repair prospectively completed the Pittsburgh Sleep Quality Index, the Western Ontario Rotator Cuff Index, and the Constant and Murley shoulder scores before surgery and at 6 months after surgery. Preoperative demographic, clinical, and radiologic parameters were also evaluated. The study analysed 31 patients with a median age of 61 years. There was a significant difference between preoperative and post-operative values in terms of all PSQI global scores and subdivisions (p < 0.001). A statistically significant improvement was determined by the Western Ontario Rotator Cuff Scale and the Constant and Murley shoulder scores (p < 0.001). Sleep disorders are commonly seen in patients with rotator cuff tear, and after repair there is an increase in the quality of sleep with a parallel improvement in shoulder function. However, no statistically significant correlation was determined between arthroscopic procedures, the size of the tear, and sleep quality. It is suggested that rotator cuff tear repair improves the quality of sleep and the quality of life. Level of evidence: IV.
Athletic Engagement and Athletic Identity in Top Croatian Sprint Runners.
Babić, Vesna; Sarac, Jelena; Missoni, Sasa; Sindik, Josko
2015-09-01
The aim of the research was to determine the construct validity and reliability of two questionnaires (the Athlete Engagement Questionnaire-AEQ and the Athletic Identity Measurement Scale-AIMS) applied to elite Croatian athletes-sprinters, as well as the correlations among the dimensions in these measuring instruments. We then determined the differences in the dimensions of sport engagement and sport identity according to gender, education level, and winning medals at international competitions. A total of 71 elite athletes-sprinters (former and still active) were examined, of whom 27 (38%) were female and 44 (62%) male. The results of factor analyses revealed dimensions very similar to those of the original instruments, which showed moderate-to-high reliabilities. A small number of statistically significant correlations were found between the dimensions of sport engagement and sport identity, mainly in male sprint runners. A small number of statistically significant differences in the dimensions of sport engagement and sport identity were found according to gender, education level, and winning medals at international competitions. The most reasonable explanation of these differences could be given in terms of the very similar characteristics of elite athletes at the same level of sport excellence.
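Scale reliability of the kind reported for the AEQ and AIMS dimensions is commonly summarized with Cronbach's alpha; a minimal sketch on an invented 4-item, 6-respondent subscale (the study's actual item data are not reproduced here):

```python
def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances)/variance(totals)).
    `items` is a list of per-item score lists over the same respondents."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

# Hypothetical 4-item engagement subscale, 6 respondents, 1-5 Likert scores
items = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 4],
    [5, 5, 2, 4, 3, 5],
    [4, 5, 3, 4, 2, 4],
]
alpha = cronbach_alpha(items)
```

As a sanity check, perfectly redundant items yield alpha of exactly 1, and strongly correlated items (as above) yield a value close to it.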
NASA Technical Reports Server (NTRS)
Morrissey, L. A.; Weinstock, K. J.; Mouat, D. A.; Card, D. H.
1984-01-01
This research evaluates Thematic Mapper Simulator (TMS) data for the geobotanical discrimination of rock types based on vegetative-cover characteristics. A methodology for accomplishing this evaluation utilizing univariate and multivariate techniques is presented. TMS data acquired with a Daedalus DEI-1260 multispectral scanner were integrated with vegetation and geologic information for subsequent statistical analyses, which included a chi-square test, an analysis of variance, stepwise discriminant analysis, and Duncan's multiple range test. Results indicate that ultramafic rock types are spectrally separable from nonultramafics based on vegetative cover through the use of statistical analyses.
[Clinical research=design*measurements*statistical analyses].
Furukawa, Toshiaki
2012-06-01
A clinical study must address true endpoints that matter for the patients and the doctors. A good clinical study starts with a good clinical question, and formulating a clinical question in the form of PECO can sharpen one's original question. In order to perform a good clinical study one must have knowledge of study design, measurements, and statistical analyses: the first is taught by epidemiology, the second by psychometrics, and the third by biostatistics.
Reframing Serial Murder Within Empirical Research.
Gurian, Elizabeth A
2017-04-01
Empirical research on serial murder is limited due to the lack of consensus on a definition, the continued use of primarily descriptive statistics, and linkage to popular culture depictions. These limitations also inhibit our understanding of these offenders and affect credibility in the field of research. Therefore, this comprehensive overview of a sample of 508 cases (738 total offenders, including partnered groups of two or more offenders) provides analyses of solo male, solo female, and partnered serial killers to elucidate statistical differences and similarities in offending and adjudication patterns among the three groups. This analysis of serial homicide offenders not only supports previous research on offending patterns present in the serial homicide literature but also reveals that empirically based analyses can enhance our understanding beyond traditional case studies and descriptive statistics. Further research based on these empirical analyses can aid in the development of more accurate classifications and definitions of serial murderers.
Preisser, J. S.; Hammett-Stabler, C. A.; Renner, J. B.; Rubin, J.
2011-01-01
Summary: The association between follicle-stimulating hormone (FSH) and bone density was tested in 111 postmenopausal women aged 50–64 years. In the multivariable analysis, weight and race were important determinants of bone mineral density. FSH, bioavailable estradiol, and other hormonal variables did not show statistically significant associations with bone density at any site. Introduction: FSH has been associated with bone density loss in animal models and longitudinal studies of women. Most of these analyses have not considered the effect of weight or race. Methods: We tested the association between FSH and bone density in younger postmenopausal women, adjusting for patient-related factors. In 111 postmenopausal women aged 50–64 years, areal bone mineral density (BMD) was measured at the lumbar spine, femoral neck, total hip, and distal radius using dual-energy X-ray absorptiometry, and volumetric BMD was measured at the distal radius using peripheral quantitative computed tomography (pQCT). Height, weight, osteoporosis risk factors, and serum hormonal factors were assessed. Results: FSH inversely correlated with weight, bioavailable estradiol, areal BMD at the lumbar spine and hip, and volumetric BMD at the ultradistal radius. In the multivariable analysis, no hormonal variable showed a statistically significant association with areal BMD at any site. Weight was independently associated with BMD at all central sites (p<0.001), but not with BMD or pQCT measures at the distal radius. Race was independently associated with areal BMD at all sites (p≤0.008) and with cortical area at the 33% distal radius (p=0.004). Conclusions: Correlations between FSH and bioavailable estradiol and BMD did not persist after adjustment for weight and race in younger postmenopausal women. Weight and race were more important determinants of bone density and should be included in analyses of hormonal influences on bone. PMID:21125395
Nimptsch, Ulrike; Wengler, Annelene; Mansky, Thomas
2016-11-01
In Germany, nationwide hospital discharge data (DRG statistics provided by the research data centers of the Federal Statistical Office and the Statistical Offices of the 'Länder') are increasingly used as a data source for health services research. Within these data, hospitals can be distinguished by their hospital identifier ([Institutionskennzeichen] IK). However, this hospital identifier primarily designates the invoicing unit and is not necessarily equivalent to one hospital location. To investigate the direction and extent of possible bias in hospital-level analyses, this study examines the continuity of the hospital identifier in a cross-sectional and a longitudinal approach and compares the results to official hospital census statistics. Within the DRG statistics from 2005 to 2013, the annual number of hospitals as classified by hospital identifiers was counted for each year of observation. The annual number of hospitals derived from the DRG statistics was compared to the number of hospitals in the official census statistics 'Grunddaten der Krankenhäuser'. Subsequently, the temporal continuity of hospital identifiers in the DRG statistics was analyzed within cohorts of hospitals. Until 2013, the annual number of hospital identifiers in the DRG statistics fell by 175 (from 1,725 to 1,550). This decline affected only providers with small or medium case volumes. The number of hospitals identified in the DRG statistics was lower than the number given in the census statistics (e.g., 1,550 IK in 2013 vs. 1,668 hospitals in the census statistics). The longitudinal analyses revealed that the majority of hospital identifiers persisted over the years of observation, while one fifth of hospital identifiers changed. In cross-sectional studies of German hospital discharge data, separating hospitals by hospital identifier may lead to underestimation of the number of hospitals and consequent overestimation of the caseload per hospital. 
Discontinuities of hospital identifiers over time might impair the follow-up of hospital cohorts. These limitations must be taken into account in analyses of German hospital discharge data focusing on the hospital level. Copyright © 2016. Published by Elsevier GmbH.
Han, Kyunghwa; Jung, Inkyung
2018-05-01
This review article presents an assessment of trends in statistical methods and an evaluation of their appropriateness in articles published in the Archives of Plastic Surgery (APS) from 2012 to 2017. We reviewed 388 original articles published in APS between 2012 and 2017. We categorized the articles that used statistical methods according to the type of statistical method, the number of statistical methods, and the type of statistical software used. We checked whether there were errors in the description of statistical methods and results. A total of 230 articles (59.3%) published in APS between 2012 and 2017 used one or more statistical methods. Within these articles, there were 261 applications of statistical methods with continuous or ordinal outcomes and 139 applications of statistical methods with categorical outcomes. The Pearson chi-square test (17.4%) and the Mann-Whitney U test (14.4%) were the most frequently used methods. Errors in describing statistical methods and results were found in 133 of the 230 articles (57.8%). Inadequate description of P-values was the most common error (39.1%). Among the 230 articles that used statistical methods, 71.7% provided details about the statistical software programs used for the analyses. SPSS was predominantly used in the articles that presented statistical analyses. We found that the use of statistical methods in APS has increased over the last 6 years. It seems that researchers have been paying more attention to the proper use of statistics in recent years. It is expected that these positive trends will continue in APS.
Nansseu, Jobert Richie N; Ngo-Um, Suzanne S; Balti, Eric V
2016-11-10
In the absence of existing data, the present review intends to determine the incidence, prevalence and/or genetic determinants of neonatal diabetes mellitus (NDM), with an expected contribution to disease characterization. We will include cross-sectional, cohort or case-control studies which have reported the incidence, prevalence and/or genetic determinants of NDM between January 01, 2000 and May 31, 2016, published in English or French and without any geographical limitation. PubMed and EMBASE will be extensively screened to identify potentially eligible studies, complemented by manual searches. Two authors will independently screen and select studies, extract data, and assess the risk of bias; disagreements will be resolved by consensus. Clinical heterogeneity will be investigated by examining the design and setting (including geographic region), the procedure used for genetic testing, the calculation of incidence or prevalence, and the outcomes in each study. Studies found to be clinically homogeneous will be pooled through a random-effects meta-analysis. Statistical heterogeneity will be assessed using the chi-square test of homogeneity and quantified using the I² statistic. In case of substantial heterogeneity, subgroup analyses will be undertaken. Publication bias will be assessed with funnel plots, complemented by Egger's test of bias. This systematic review and meta-analysis is expected to draw a clear picture of the phenotypic and genotypic presentations of NDM in order to better understand the condition and adequately address challenges with respect to its management. PROSPERO CRD42016039765.
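The I² statistic planned above is derived from Cochran's Q under inverse-variance weighting. A minimal sketch (the function name and input layout are illustrative, not taken from the protocol):

```python
import numpy as np

def i_squared(effects, variances):
    """Cochran's Q and the I^2 heterogeneity statistic for a set of study effects."""
    effects = np.asarray(effects, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)    # inverse-variance weights
    pooled = np.sum(weights * effects) / np.sum(weights)  # fixed-effect pooled estimate
    q = np.sum(weights * (effects - pooled) ** 2)         # Cochran's Q
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0   # I^2 as a percentage
    return q, i2
```

By convention, I² near 0% suggests the studies are homogeneous enough to pool, while values above roughly 50% indicate the substantial heterogeneity that would trigger the subgroup analyses described in the protocol.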
Ecogeographic Genetic Epidemiology
Sloan, Chantel D.; Duell, Eric J.; Shi, Xun; Irwin, Rebecca; Andrew, Angeline S.; Williams, Scott M.; Moore, Jason H.
2009-01-01
Complex diseases such as cancer and heart disease result from interactions between an individual's genetics and environment, i.e. their human ecology. Rates of complex diseases have consistently demonstrated geographic patterns of incidence, or spatial “clusters” of increased incidence relative to the general population. Likewise, genetic subpopulations and environmental influences are not evenly distributed across space. Merging appropriate methods from genetic epidemiology, ecology and geography will provide a more complete understanding of the spatial interactions between genetics and environment that result in spatial patterning of disease rates. Geographic Information Systems (GIS), tools designed specifically for handling geographic data and performing spatial analyses to determine relationships among them, are key to this kind of data integration. Here the authors introduce a new interdisciplinary paradigm, ecogeographic genetic epidemiology, which uses GIS and spatial statistical analyses to layer genetic subpopulation and environmental data with disease rates and thereby discern the complex gene-environment interactions which result in spatial patterns of incidence. PMID:19025788
Authenticated DNA from Ancient Wood Remains
LIEPELT, SASCHA; SPERISEN, CHRISTOPH; DEGUILLOUX, MARIE-FRANCE; PETIT, REMY J.; KISSLING, ROY; SPENCER, MATTHEW; DE BEAULIEU, JACQUES-LOUIS; TABERLET, PIERRE; GIELLY, LUDOVIC; ZIEGENHAGEN, BIRGIT
2006-01-01
• Background The reconstruction of biological processes and human activities during the last glacial cycle relies mainly on data from biological remains. Highly abundant tissues, such as wood, are candidates for a genetic analysis of past populations. While well-authenticated DNA has now been recovered from various fossil remains, the final ‘proof’ is still missing for wood, despite some promising studies. • Scope The goal of this study was to determine if ancient wood can be analysed routinely in studies of archaeology and palaeogenetics. An experiment was designed which included blind testing, independent replicates, extensive contamination controls and rigorous statistical tests. Ten samples of ancient wood from major European forest tree genera were analysed with plastid DNA markers. • Conclusions Authentic DNA was retrieved from wood samples up to 1000 years of age. A new tool for real-time vegetation history and archaeology is ready to use. PMID:16987920
[The role of meta-analysis in assessing the treatment of advanced non-small cell lung cancer].
Pérol, M; Pérol, D
2004-02-01
Meta-analysis is a statistical method allowing an evaluation of the direction and quantitative importance of a treatment effect observed in randomized trials that have tested the treatment but have not provided a definitive conclusion. In the present review, we discuss the methodology and the contribution of meta-analyses to the treatment of advanced-stage or metastatic non-small-cell lung cancer. In this area of oncology, meta-analyses have provided decisive information demonstrating the impact of chemotherapy on patient survival. They have also helped define a cisplatin-based two-drug regimen as the gold standard treatment for patients with a satisfactory general status. Recently, the meta-analysis method was used to measure the influence of gemcitabine in combination with platinum salts and demonstrated a small but significant survival benefit, confirming gemcitabine, in combination with cisplatin, as a gold standard treatment.
Katsarov, Plamen; Gergov, Georgi; Alin, Aylin; Pilicheva, Bissera; Al-Degs, Yahya; Simeonov, Vasil; Kassarova, Margarita
2018-03-01
The predictive power of partial least squares (PLS) and multivariate curve resolution-alternating least squares (MCR-ALS) methods has been studied for the simultaneous quantitative analysis of the binary drug combination doxylamine succinate and pyridoxine hydrochloride. Analysis of first-order UV overlapped spectra was performed using different PLS models, classical PLS1 and PLS2 as well as partial robust M-regression (PRM). These linear models were compared to MCR-ALS with equality and correlation constraints (MCR-ALS-CC). All techniques operated within the full spectral region and extracted maximum information for the drugs analysed. The developed chemometric methods were validated on external sample sets and were applied to the analyses of pharmaceutical formulations. The obtained statistical parameters were satisfactory for calibration and validation sets. All developed methods can be successfully applied for simultaneous spectrophotometric determination of doxylamine and pyridoxine both in laboratory-prepared mixtures and commercial dosage forms.
Castro, Marcelo P; Pataky, Todd C; Sole, Gisela; Vilas-Boas, Joao Paulo
2015-07-16
Ground reaction force (GRF) data from men and women are commonly pooled for analyses. However, it may not be justifiable to pool sexes on the basis of discrete parameters extracted from continuous GRF gait waveforms, because this can miss continuous effects. Forty healthy participants (20 men and 20 women) walked at a cadence of 100 steps per minute across two force plates, recording GRFs. Two statistical methods were used to test the null hypothesis of no mean GRF differences between sexes: (i) Statistical Parametric Mapping, using the entire three-component GRF waveform; and (ii) a traditional approach, using the first and second vertical GRF peaks. Statistical Parametric Mapping results suggested large sex differences, which post-hoc analyses suggested were due predominantly to higher anterior-posterior and vertical GRFs in early stance in women compared to men. The traditional approach showed statistically significant differences for the first GRF peak but similar values for the second GRF peak. These contrasting results emphasise that different parts of the waveform have different signal strengths, and thus that choosing arbitrary metrics under the traditional approach can lead to arbitrary conclusions. We suggest that researchers and clinicians consider both the entire gait waveforms and sex-specificity when analysing GRF data. Copyright © 2015 Elsevier Ltd. All rights reserved.
Teenage pregnancy and long-term mental health outcomes among Indigenous women in Canada.
Xavier, Chloé G; Brown, Hilary K; Benoit, Anita C
2018-06-01
Our objectives were to (1) compare the risks for poor long-term mental health outcomes among Indigenous women with and without a teenage pregnancy and (2) determine if community and cultural factors modify this risk. We conducted a secondary analysis of the 2012 Aboriginal Peoples Survey. Respondents were women aged 25 to 49 years who had given birth to at least one child. Teenage mothers (age at first birth 13 to 19 years; n = 1330) were compared to adult mothers (age at first birth 20 years or older; n = 2630). Mental health outcomes were psychological distress, mental health status, suicide ideation/attempt, and alcohol consumption. To address objective 1, we used binary logistic regression analyses before and after controlling for covariates. To address objective 2, we tested the significance of interaction terms between teenage pregnancy status and effect measure modifiers. In unadjusted analyses, teenage pregnancy was associated with increased risk for poor/fair mental health [odds ratio (OR) 1.77, 95% confidence interval (CI) 1.24-2.53] and suicide attempt/ideation (OR 1.95, 95% CI 1.07-3.54). However, the associations were not statistically significant after adjusting for demographic, socioeconomic, environmental, and health covariates. Teenage pregnancy was not associated with increased risk for high psychological distress or heavy alcohol consumption in unadjusted or adjusted analyses. The interaction term for involvement in cultural activities was statistically significant for poor/fair mental health; however, after stratification, ORs were non-significant. Among Indigenous mothers, teenage pregnancy was less important than broader social and health circumstances in predicting long-term mental health.
NASA Astrophysics Data System (ADS)
Kessel, Kerstin A.; Jäger, Andreas; Bohn, Christian; Habermehl, Daniel; Zhang, Lanlan; Engelmann, Uwe; Bougatf, Nina; Bendl, Rolf; Debus, Jürgen; Combs, Stephanie E.
2013-03-01
To date, conducting retrospective clinical analyses is rather difficult and time consuming. Especially in radiation oncology, efficiently handling voluminous datasets from various information systems and different documentation styles is crucial for patient care and research. Using the example of patients with pancreatic cancer treated with radio-chemotherapy, we performed a therapy evaluation using analysis tools connected with a documentation system. A total of 783 patients were documented in a professional, web-based documentation system. Information about radiation therapy, diagnostic images and dose distributions was imported. For patients with disease progression after neoadjuvant chemoradiation, we designed and established an analysis workflow. After automatic registration of the radiation plans with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes, the DVH (dose-volume histogram) statistics are calculated, followed by determination of the dose applied to the region of recurrence. All results are stored in the database and included in statistical calculations. The main goal of using an automatic evaluation system is to reduce the time and effort of conducting clinical analyses, especially with large patient groups. We showed a first approach using some existing tools; however, manual interaction is still necessary. Further steps need to be taken to enhance automation. Already, it has become apparent that the benefits of digital data management and analysis lie in the central storage of data and the reusability of results. We therefore intend to adapt the evaluation system to other tumor types in radiation oncology.
ERIC Educational Resources Information Center
Merrill, Ray M.; Chatterley, Amanda; Shields, Eric C.
2005-01-01
This study explored the effectiveness of selected statistical measures at motivating or maintaining regular exercise among college students. The study also considered whether ease in understanding these statistical measures was associated with perceived effectiveness at motivating or maintaining regular exercise. Analyses were based on a…
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
The Empirical Nature and Statistical Treatment of Missing Data
ERIC Educational Resources Information Center
Tannenbaum, Christyn E.
2009-01-01
Introduction. Missing data is a common problem in research and can produce severely misleading analyses, including biased estimates of statistical parameters, and erroneous conclusions. In its 1999 report, the APA Task Force on Statistical Inference encouraged authors to report complications such as missing data and discouraged the use of…
ERIC Educational Resources Information Center
Norris, John M.
2015-01-01
Traditions of statistical significance testing in second language (L2) quantitative research are strongly entrenched in how researchers design studies, select analyses, and interpret results. However, statistical significance tests using "p" values are commonly misinterpreted by researchers, reviewers, readers, and others, leading to…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-05
...] Guidance for Industry on Documenting Statistical Analysis Programs and Data Files; Availability AGENCY... Programs and Data Files.'' This guidance is provided to inform study statisticians of recommendations for documenting statistical analyses and data files submitted to the Center for Veterinary Medicine (CVM) for the...
Color recovery effect of different bleaching systems on a discolored composite resin.
Gul, P; Harorlı, O T; Ocal, I B; Ergin, Z; Barutcigil, C
2017-10-01
Discoloration of resin-based composites is a commonly encountered problem, and bleaching agents may be used to treat existing discoloration. The purpose of this study was to investigate the in vitro color recovery effect of different bleaching systems on heavily discolored composite resin. Fifty disk-shaped dental composite specimens were prepared using an A2 shade nanohybrid universal composite resin (3M ESPE Filtek Z550, St. Paul, MN, USA). Composite samples were immersed in coffee and turnip juice for 1 week each. One laser-activated bleaching system (LB) (Biolase Laserwhite*20) and three conventional bleaching systems (Ultradent Opalescence Boost 40% (OB), Ultradent Opalescence PF 15% home bleaching (HB), and Crest 3D White [Whitening Mouthwash]) were tested in this study. Distilled water was used as the control group. The color of the samples was measured using a spectrophotometer (VITA Easyshade Compact, VITA Zahnfabrik, Bad Säckingen, Germany). Color changes (ΔE00) were calculated using the CIEDE2000 formula. Statistical analyses were conducted using the paired samples test, one-way analysis of variance, and Tukey's multiple comparison tests (α = 0.05). The staining beverages caused perceptible discoloration (ΔE00 > 2.25). The color recovery effect of all bleaching systems was statistically greater than that of the control group (P < 0.05). Although the OB group was found to be the most effective bleaching system, there was no statistically significant difference among the HB, OB, and LB groups (P > 0.05). Within the limitations of this in vitro study, the highest recovery effect among all systems was observed with the in-office bleaching system. However, the home and laser bleaching systems were as effective as the in-office bleaching system.
van de Glind, Esther M M; Willems, Hanna C; Eslami, Saeid; Abu-Hanna, Ameen; Lems, Willem F; Hooft, Lotty; de Rooij, Sophia E; Black, Dennis M; van Munster, Barbara C
2016-05-01
For physicians dealing with patients with a limited life expectancy, knowing the time to benefit (TTB) of preventive medication is essential to support treatment decisions. The aim of this study was to investigate the usefulness of statistical process control (SPC) for determining the TTB in relation to fracture risk with alendronate versus placebo in postmenopausal women. We performed a post hoc analysis of the Fracture Intervention Trial (FIT), a randomized, controlled trial that investigated the effect of alendronate versus placebo on fracture risk in postmenopausal women. We used SPC, a statistical method used for monitoring processes for quality control, to determine if and when the intervention group benefited significantly more than the control group. SPC discriminated between the normal variations over time in the numbers of fractures in both groups and the variations that were attributable to alendronate. The TTB was defined as the time point from which the cumulative difference in the number of clinical fractures remained greater than the upper control limit on the SPC chart. For the total group, the TTB was 11 months. For patients aged ≥70 years, the TTB was 8 months [absolute risk reduction (ARR) = 1.4%]; for patients aged <70 years, it was 19 months (ARR = 0.7%). SPC is a clear and understandable graphical method to determine the TTB. Its main advantage is that there is no need to define a prespecified time point, as is the case in traditional survival analyses. Prescribing alendronate to patients aged ≥70 years is useful because the TTB shows that they will benefit after 8 months. Investigators should report the TTB to simplify clinical decision making.
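The TTB rule described above, the time point from which the cumulative fracture-count difference stays above the upper control limit, can be sketched as a simple scan over the SPC chart values (a hypothetical helper; the control-limit computation from the FIT data is not reproduced):

```python
def time_to_benefit(cum_diff, ucl):
    """Return the 1-based month from which the cumulative difference in event
    counts (control minus treatment) stays above the upper control limit ucl,
    or None if sustained benefit is never reached."""
    ttb = None
    for month, value in enumerate(cum_diff, start=1):
        if value > ucl:
            if ttb is None:
                ttb = month   # candidate start of sustained benefit
        else:
            ttb = None        # dropped back below the limit: reset
    return ttb
```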
Development of the Statistical Reasoning in Biology Concept Inventory (SRBCI).
Deane, Thomas; Nomme, Kathy; Jeffery, Erica; Pollock, Carol; Birol, Gülnur
2016-01-01
We followed established best practices in concept inventory design and developed a 12-item inventory to assess student ability in statistical reasoning in biology (Statistical Reasoning in Biology Concept Inventory [SRBCI]). It is important to assess student thinking in this conceptual area, because it is a fundamental requirement of being statistically literate and associated skills are needed in almost all walks of life. Despite this, previous work shows that non-expert-like thinking in statistical reasoning is common, even after instruction. As science educators, our goal should be to move students along a novice-to-expert spectrum, which could be achieved with growing experience in statistical reasoning. We used item response theory analyses (the one-parameter Rasch model and associated analyses) to assess responses gathered from biology students in two populations at a large research university in Canada in order to test SRBCI's robustness and sensitivity in capturing useful data relating to the students' conceptual ability in statistical reasoning. Our analyses indicated that SRBCI is a unidimensional construct, with items that vary widely in difficulty and provide useful information about such student ability. SRBCI should be useful as a diagnostic tool in a variety of biology settings and as a means of measuring the success of teaching interventions designed to improve statistical reasoning skills. © 2016 T. Deane et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
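The one-parameter Rasch model underlying these analyses gives the probability that a respondent of ability theta answers an item of difficulty b correctly. A minimal sketch (the function name is illustrative):

```python
import math

def rasch_probability(theta, b):
    """One-parameter Rasch model: P(correct) = exp(theta - b) / (1 + exp(theta - b)),
    i.e. a logistic function of the gap between respondent ability and item difficulty."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))
```

When ability equals difficulty the probability is exactly 0.5, which is why widely varying item difficulties let an inventory discriminate across a novice-to-expert spectrum.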
Effects of foot strike on low back posture, shock attenuation, and comfort in running.
Delgado, Traci L; Kubera-Shelton, Emilia; Robb, Robert R; Hickman, Robbin; Wallmann, Harvey W; Dufek, Janet S
2013-03-01
Barefoot running (BF) is gaining popularity in the running community. Biomechanical changes occur with BF, especially when initial contact changes from a rearfoot strike (RFS) to a forefoot strike (FFS). Changes in lumbar spine range of motion (ROM), particularly involving lumbar lordosis, have been associated with increased low back pain. However, it is not known whether changing from RFS to FFS affects lumbar lordosis or low back pain. The purpose of this study was to determine whether a change from RFS to FFS would change lumbar lordosis, influence shock attenuation, or change comfort levels in healthy recreational/experienced runners. Forty-three subjects performed a warm-up on the treadmill, during which a self-selected foot strike pattern was determined. Instructions on running RFS/FFS were given, and two conditions were examined. Each condition consisted of 90 s of BF with RFS or FFS, with order randomly assigned. A comfort questionnaire was completed after both conditions. Fifteen consecutive strides from each condition were extracted for analyses. Statistically significant differences between FFS and RFS were found for shock attenuation (P < 0.001), peak leg acceleration (P < 0.001), and overall lumbar ROM (P = 0.045). There were no statistically significant differences between FFS and RFS in lumbar extension or lumbar flexion. There was a statistically significant difference between FFS and RFS for the comfort/discomfort item of the comfort questionnaire (P = 0.007). There were no statistically significant differences for the other questions or the average of all questions. Changing foot strike from RFS to FFS decreased overall lumbar spine ROM but did not change the flexion or extension position of the lumbar spine. Shock attenuation was greater in RFS, and RFS was perceived as the more comfortable running pattern.
Teenage pregnancy and mental health beyond the postpartum period: a systematic review.
Xavier, Chloé; Benoit, Anita; Brown, Hilary K
2018-06-01
Teenage mothers are at increased risk for adverse social outcomes and short-term health problems, but long-term impacts on mental health are poorly understood. The aims of our systematic review were to determine the association between teenage pregnancy and mental health beyond the postpartum period, critically appraise the literature's quality and guide future research. We systematically searched MEDLINE, Embase, PsycINFO, CINAHL, Scopus and Web of Science from inception to June 2017 for peer-reviewed articles written in English or French. Data were collected using a modified Cochrane Data Extraction Form. Study quality was assessed using the Effective Public Health Practice Project critical appraisal tool. Heterogeneity of studies permitted only a qualitative synthesis. Nine quantitative studies comprising the results from analyses of 11 cohorts met our criteria and were rated as strong (n=5), moderate (n=2) or weak (n=2). Three cohorts found a statistically significant association between teenage pregnancy and poor long-term mental health after adjustment, three found a statistically significant association before but not after adjustment and five did not find a statistically significant association. Studies observed varying degrees of attenuation after considering social context. Studies with statistically significant findings tended to comprise earlier cohorts, with outcomes measured at older ages. The association between teenage pregnancy and mental health beyond the postpartum period remains unclear. Future studies should employ age-period-cohort frameworks to disentangle effects of normative patterns and stress accumulation. Social factors are important in determining long-term mental health of teenage mothers and should be prioritised in prevention and intervention strategies. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. 
No commercial use is permitted unless otherwise expressly granted.
Evaluation and application of summary statistic imputation to discover new height-associated loci.
Rüeger, Sina; McDaid, Aaron; Kutalik, Zoltán
2018-05-01
As most of the heritability of complex traits is attributed to common and low frequency genetic variants, imputing them by combining genotyping chips and large sequenced reference panels is the most cost-effective approach to discover the genetic basis of these traits. Association summary statistics from genome-wide meta-analyses are available for hundreds of traits. Updating these to ever-increasing reference panels is very cumbersome, as it requires reimputation of the genetic data, rerunning the association scan, and meta-analysing the results. A much more efficient method is to directly impute the summary statistics, termed summary statistics imputation, which we improved to accommodate variable sample size across SNVs. Its performance relative to genotype imputation and its practical utility have not yet been fully investigated. To this end, we compared the two approaches on real (genotyped and imputed) data from 120K samples from the UK Biobank and show that genotype imputation boasts a 3- to 5-fold lower root-mean-square error and better distinguishes true associations from null ones: we observed the largest differences in power for variants with low minor allele frequency and low imputation quality. For fixed false positive rates of 0.001, 0.01, and 0.05, using summary statistics imputation yielded a decrease in statistical power of 9, 43 and 35%, respectively. To test its capacity to discover novel associations, we applied summary statistics imputation to the GIANT height meta-analysis summary statistics covering HapMap variants, and identified 34 novel loci, 19 of which replicated using data in the UK Biobank. Additionally, we successfully replicated 55 out of the 111 variants published in an exome chip study. Our study demonstrates that summary statistics imputation is a very efficient and cost-effective way to identify and fine-map trait-associated loci. 
Moreover, the ability to impute summary statistics is important for follow-up analyses, such as Mendelian randomisation or LD-score regression.
Evaluation and application of summary statistic imputation to discover new height-associated loci
2018-01-01
As most of the heritability of complex traits is attributed to common and low frequency genetic variants, imputing them by combining genotyping chips and large sequenced reference panels is the most cost-effective approach to discover the genetic basis of these traits. Association summary statistics from genome-wide meta-analyses are available for hundreds of traits. Updating these to ever-increasing reference panels is very cumbersome as it requires reimputation of the genetic data, rerunning the association scan, and meta-analysing the results. A much more efficient method is to directly impute the summary statistics, termed as summary statistics imputation, which we improved to accommodate variable sample size across SNVs. Its performance relative to genotype imputation and practical utility has not yet been fully investigated. To this end, we compared the two approaches on real (genotyped and imputed) data from 120K samples from the UK Biobank and show that, genotype imputation boasts a 3- to 5-fold lower root-mean-square error, and better distinguishes true associations from null ones: We observed the largest differences in power for variants with low minor allele frequency and low imputation quality. For fixed false positive rates of 0.001, 0.01, 0.05, using summary statistics imputation yielded a decrease in statistical power by 9, 43 and 35%, respectively. To test its capacity to discover novel associations, we applied summary statistics imputation to the GIANT height meta-analysis summary statistics covering HapMap variants, and identified 34 novel loci, 19 of which replicated using data in the UK Biobank. Additionally, we successfully replicated 55 out of the 111 variants published in an exome chip study. Our study demonstrates that summary statistics imputation is a very efficient and cost-effective way to identify and fine-map trait-associated loci. 
Moreover, the ability to impute summary statistics is important for follow-up analyses, such as Mendelian randomisation or LD-score regression. PMID:29782485
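The core operation this abstract describes, predicting an untyped variant's association z-score from typed variants via reference-panel LD, can be sketched in a few lines. This is a minimal toy illustration of the general summary-statistics-imputation formula (z_imp = c'C⁻¹z), not the authors' implementation; the LD matrix, z-scores, and ridge term below are invented for illustration.

```python
import numpy as np

# Toy sketch of summary statistics imputation: an untyped variant's
# z-score is predicted from typed variants' z-scores using the LD
# (correlation) structure of a reference panel:
#   z_imp = c^T (C + ridge*I)^{-1} z_typed
# where C is the LD matrix among typed variants and c holds the
# correlations between the untyped variant and the typed ones.

def impute_z(z_typed, C, c, ridge=1e-3):
    """Impute z-score of an untyped variant; returns (z_imp, r2_quality)."""
    C_reg = C + ridge * np.eye(C.shape[0])   # regularize for numerical stability
    w = np.linalg.solve(C_reg, c)            # weights c^T C^{-1}
    z_imp = float(w @ z_typed)
    r2 = float(w @ c)                        # expected imputation quality
    return z_imp, r2

# Hypothetical LD among three typed variants, plus their correlation
# with one untyped variant in strong LD with the first two
C = np.array([[1.0, 0.8, 0.3],
              [0.8, 1.0, 0.4],
              [0.3, 0.4, 1.0]])
c = np.array([0.9, 0.7, 0.3])
z_typed = np.array([4.1, 3.6, 1.2])
z_imp, r2 = impute_z(z_typed, C, c)
```

The quality term r2 plays the same role as an imputation INFO score: low values flag variants whose imputed statistics (and hence power, as the abstract notes) are unreliable.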
ParallABEL: an R library for generalized parallelization of genome-wide association studies.
Sangket, Unitsa; Mahasirimongkol, Surakameth; Chantratita, Wasun; Tandayya, Pichaya; Aulchenko, Yurii S
2010-04-29
Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. However, acquiring the necessary programming skills to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files is arduous. Most components of GWA analysis can be divided into four groups based on the types of input data and statistical outputs. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP) or trait, such as SNP characterization statistics or association test statistics; its input data are the SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example the summary statistics of genotype quality for each sample; its input data are the individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses; its input data are pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as linkage disequilibrium characterisation; its input data are pairs of SNPs. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. The ParallABEL library is not only aimed at GenABEL, but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), which includes 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL performance. Almost perfect speed-up was achieved for many types of analyses.
For example, the computing time for the identity-by-state matrix was linearly reduced from approximately eight hours to one hour when ParallABEL employed eight processors. Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance, and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL.
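All four computation groups described above reduce to one pattern: partition the input units (SNPs, individuals, or pairs), compute per-chunk statistics, and merge the results in order. The sketch below illustrates that pattern in Python for the first group (per-SNP statistics) with invented toy data; ParallABEL itself is an R library that dispatches the chunks to cluster processes via Rmpi, whereas here the "workers" are simulated serially.

```python
import numpy as np

# Partition/compute/merge pattern for per-SNP statistics (group 1).
# Toy data; real deployments dispatch each chunk to a separate process.

def chunk_indices(n_items, n_workers):
    """Split item indices into near-equal contiguous chunks."""
    bounds = np.linspace(0, n_items, n_workers + 1).astype(int)
    return [range(bounds[i], bounds[i + 1]) for i in range(n_workers)]

def snp_trend_stat(genotypes, phenotype):
    """Per-SNP association score: n * r^2 (Armitage-trend-like)."""
    g = genotypes - genotypes.mean()
    p = phenotype - phenotype.mean()
    denom = np.sqrt((g**2).sum() * (p**2).sum())
    r = (g @ p) / denom if denom > 0 else 0.0
    return len(phenotype) * r**2

rng = np.random.default_rng(0)
n_ind, n_snp = 200, 10
G = rng.integers(0, 3, size=(n_ind, n_snp)).astype(float)  # 0/1/2 genotypes
y = rng.normal(size=n_ind)                                 # quantitative trait

# Partition SNPs across 4 (simulated) workers, then merge in SNP order
results = {}
for chunk in chunk_indices(n_snp, 4):
    for j in chunk:                    # this loop would run on one worker
        results[j] = snp_trend_stat(G[:, j], y)
stats = [results[j] for j in range(n_snp)]
```

Because per-SNP statistics are independent, the speed-up is near-linear in the number of workers, which matches the eight-hours-to-one-hour result reported for eight processors.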
Formalizing the definition of meta-analysis in Molecular Ecology.
ArchMiller, Althea A; Bauer, Eric F; Koch, Rebecca E; Wijayawardena, Bhagya K; Anil, Ammu; Kottwitz, Jack J; Munsterman, Amelia S; Wilson, Alan E
2015-08-01
Meta-analysis, the statistical synthesis of pertinent literature to develop evidence-based conclusions, is relatively new to the field of molecular ecology, with the first meta-analysis published in the journal Molecular Ecology in 2003 (Slate & Phua 2003). The goal of this article is to formalize the definition of meta-analysis for the authors, editors, reviewers and readers of Molecular Ecology by completing a review of the meta-analyses previously published in this journal. We also provide a brief overview of the many components required for meta-analysis with a more specific discussion of the issues related to the field of molecular ecology, including the use and statistical considerations of Wright's FST and its related analogues as effect sizes in meta-analysis. We performed a literature review to identify articles published as 'meta-analyses' in Molecular Ecology, which were then evaluated by at least two reviewers. We specifically targeted Molecular Ecology publications because as a flagship journal in this field, meta-analyses published in Molecular Ecology have the potential to set the standard for meta-analyses in other journals. We found that while many of these reviewed articles were strong meta-analyses, others failed to follow standard meta-analytical techniques. One of these unsatisfactory meta-analyses was in fact a secondary analysis. Other studies attempted meta-analyses but lacked the fundamental statistics that are considered necessary for an effective and powerful meta-analysis. By drawing attention to the inconsistency of studies labelled as meta-analyses, we emphasize the importance of understanding the components of traditional meta-analyses to fully embrace the strengths of quantitative data synthesis in the field of molecular ecology. © 2015 John Wiley & Sons Ltd.
Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.
Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg
2009-11-01
G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
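For the simplest of the listed procedures, a test of a single correlation coefficient, power can be approximated in a few lines via the Fisher z-transformation. This is a normal-approximation sketch of the calculation, not G*Power's code (G*Power uses exact distributions where available).

```python
import math
from statistics import NormalDist

# Approximate power of a two-sided test of H0: rho = 0, using the
# Fisher z-transformation: atanh(r) is ~normal with SE 1/sqrt(n-3).

def correlation_power(rho, n, alpha=0.05):
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    effect = abs(0.5 * math.log((1 + rho) / (1 - rho))) * math.sqrt(n - 3)
    # P(|Z| > z_crit) under the alternative, both tails
    return NormalDist().cdf(effect - z_crit) + NormalDist().cdf(-effect - z_crit)

power = correlation_power(rho=0.3, n=84)   # classic textbook case, ~0.80
```

The same normal-approximation pattern underlies several of the other procedures listed (e.g. comparisons of dependent correlations), with only the effect-size transform and standard error changing.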
Kindu, Mengistie; Schneider, Thomas; Teketay, Demel; Knoke, Thomas
2015-07-01
Understanding drivers of changes in land use/land cover (LULC) is essential for modeling future dynamics or developing management strategies to ameliorate or prevent further decline of natural resources. In this study, an attempt has been made to identify the main drivers behind the LULC changes that occurred over the past four decades in the Munessa-Shashemene landscape of the south-central highlands of Ethiopia. The datasets required for the study were generated from both primary and secondary sources. A combination of techniques, including descriptive statistics, GIS-based processing, and regression analyses, was employed for data analyses. Changes were found to be triggered by the interplay of more than 12 drivers related to social, economic, environmental, policy/institutional, and technological factors. Specifically, population growth, expansion of cultivated lands, expansion of settlements, livestock ranching, cutting of woody species for fuelwood, and charcoal making were the top six important drivers of LULC change as viewed by the local people and confirmed by quantitative analyses. Differences in respondents' perceptions of the drivers related to environmental (i.e., location-specific) and socioeconomic determinants (e.g., age and literacy) were statistically significant (P = 0.001). LULC changes were also determined by distances to major drivers (e.g., the further a pixel is from the road, the lower the likelihood of change), as shown by the landscape-level analyses. Further studies are suggested targeting these drivers to explore the consequences and future options and to formulate intervention strategies for sustainable development in the studied landscape and elsewhere with similar geographic settings.
The Short-Term Effects of Lying, Sitting and Standing on Energy Expenditure in Women
POPP, COLLIN J.; BRIDGES, WILLIAM C.; JESCH, ELLIOT D.
2018-01-01
The deleterious health effects of too much sitting have been associated with an increased risk for overweight and obesity. Replacing sitting with standing is a proposed intervention to increase daily energy expenditure (EE). The purpose of this study was to determine the short-term effects of lying, sitting, and standing postures on EE, and to determine the magnitude of the effect each posture has on EE using indirect calorimetry (IC). Twenty-eight healthy females performed three separate positions (lying, sitting, standing) in random order. Inspired and expired gases were collected for 45 minutes (15 minutes per position) using breath-by-breath indirect calorimetry. Oxygen consumption (VO2) and carbon dioxide production (VCO2) were measured to estimate EE. Statistical analyses used repeated-measures ANOVA for all variables, followed by post hoc t-tests. In the ANOVA, the individual, time period, and order terms were not statistically significant. Lying EE and sitting EE did not differ from each other (P = 0.56). However, standing EE (kcal/min) was 9.0% greater than lying EE (P = 0.003) and 7.1% greater than sitting EE (P = 0.02). The energetic cost of standing was higher compared with lying and sitting. While this difference is statistically significant, the magnitude of the effect of standing compared with sitting was small (Cohen’s d = 0.31). Short-term standing does not offer an energetic advantage compared with sitting.
Valentine, Nicole Britt; Bonsel, Gouke J
2016-01-01
Intersectoral perspectives on health are present in the rhetoric of the sustainable development goals. Yet descriptions of systematic approaches for an intersectoral monitoring vision, joining determinants of health with barriers or facilitators to accessing healthcare services, are lacking. The aim was to explore models of associations between health outcomes and health service coverage on the one hand, and health determinants and health systems responsiveness on the other, and thereby to contribute to monitoring, analysis, and assessment approaches informed by an intersectoral vision of health. The study is designed as a series of ecological, cross-country regression analyses, covering between 23 and 57 countries, with dependent health variables concentrated on the years 2002-2003. Countries cover a range of development contexts. Health outcome and health service coverage dependent variables were derived from World Health Organization (WHO) information sources. Predictor variables representing determinants were derived from the WHO and World Bank databases; variables for health systems' responsiveness were derived from the WHO World Health Survey. Responsiveness is a measure of the acceptability of health services to the population, complementing financial health protection. Health determinants' indicators - access to improved drinking sources, accountability, and average years of schooling - were statistically significant in particular health outcome regressions. Statistically significant coefficients were more common for mortality rate regressions than for coverage rate regressions. Responsiveness was systematically associated with poorer health and health service coverage. With respect to levels of inequality in health, the indicator of responsiveness problems experienced by the unhealthy poor groups in the population was statistically significant for regressions on measles vaccination inequalities between rich and poor.
For the broader determinants, the Gini mattered most for inequalities in child mortality; education mattered more for inequalities in births attended by skilled personnel. This paper adds to the literature on comparative health systems research. National and international health monitoring frameworks need to incorporate indicators on trends in and impacts of other policy sectors on health. This will empower the health sector to carry out public health practices that promote health and health equity.
Schroeder, Krista; Jia, Haomiao; Smaldone, Arlene
Propensity score (PS) methods are increasingly employed by researchers to reduce bias arising from confounder imbalance when using observational data to examine intervention effects. The purpose of this study was to examine PS theory and methodology and to compare the application of three PS methods (matching, stratification, weighting) to determine which best improves confounder balance. Baseline characteristics of a sample of 20,518 school-aged children with severe obesity (of whom 1,054 received an obesity intervention) were assessed prior to PS application. Three PS methods were then applied to the data to determine which showed the greatest improvement in confounder balance between the intervention and control groups. The effect of each PS method on the outcome variable, body mass index percentile change at one year, was also examined. SAS 9.4 and Comprehensive Meta-Analysis statistical software were used for analyses. Prior to PS adjustment, the intervention and control groups differed significantly on seven of 11 potential confounders. PS matching removed all differences. PS stratification and weighting each removed one difference but created two new differences. Sensitivity analyses did not change these results. Body mass index percentile at 1 year decreased in both groups. The size of the decrease was smaller in the intervention group, and the estimate of the decrease varied by PS method. Selection of a PS method should be guided by insight from statistical theory and simulation experiments, in addition to observed improvement in confounder balance. For this data set, PS matching worked best to correct confounder imbalance. Because each method varied in correcting confounder imbalance, we recommend that multiple PS methods be compared for their ability to improve confounder balance before implementation in evaluating treatment effects in observational data.
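Two of the three strategies compared in this study, PS matching and PS weighting, can be demonstrated end to end on simulated data. The sketch below is a toy illustration of the general approach (not the study's SAS analysis): fit a propensity model, apply 1:1 nearest-neighbour matching and inverse-probability weighting, and judge balance by the standardized mean difference (SMD) of a confounder. All data and model choices here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
x = rng.normal(size=(n, 2))                      # two confounders
logit = 0.8 * x[:, 0] - 0.5 * x[:, 1] - 1.0      # confounded treatment assignment
treated = rng.random(n) < 1 / (1 + np.exp(-logit))

def fit_logistic(X, y, lr=0.1, iters=2000):
    """Plain gradient-ascent logistic regression; returns fitted probabilities."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)
    return 1 / (1 + np.exp(-Xb @ w))

def smd(v, t, w=None):
    """Standardized mean difference of one covariate (optionally weighted)."""
    w = np.ones(len(v)) if w is None else w
    m1 = np.average(v[t], weights=w[t])
    m0 = np.average(v[~t], weights=w[~t])
    s = np.sqrt((v[t].var() + v[~t].var()) / 2)   # pooled SD of raw data
    return abs(m1 - m0) / s

ps = fit_logistic(x, treated.astype(float))

# (a) greedy 1:1 nearest-neighbour matching on the PS, without replacement
controls, used, pairs = np.flatnonzero(~treated), set(), []
for i in np.flatnonzero(treated):
    j = min((c for c in controls if c not in used),
            key=lambda c: abs(ps[c] - ps[i]))
    used.add(j)
    pairs.append((i, j))
idx = np.array([k for pair in pairs for k in pair])

# (b) inverse-probability-of-treatment weights (ATE weights)
ipw = np.where(treated, 1 / ps, 1 / (1 - ps))

smd_raw   = smd(x[:, 0], treated)                 # imbalance before adjustment
smd_match = smd(x[idx, 0], treated[idx])          # after matching
smd_ipw   = smd(x[:, 0], treated, w=ipw)          # after weighting
```

Comparing smd_raw against smd_match and smd_ipw mirrors the study's recommendation: try multiple PS methods and keep the one that best improves balance for the data at hand.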
Maneerat, Sujira; Lehtinen, Markus J; Childs, Caroline E; Forssten, Sofia D; Alhoniemi, Esa; Tiphaine, Milin; Yaqoob, Parveen; Ouwehand, Arthur C; Rastall, Robert A
2013-01-01
Elderly adults have alterations in their gut microbiota and immune functions that are associated with higher susceptibility to infections and metabolic disorders. Probiotics and prebiotics, and their synbiotic combinations, are food supplements that have been shown to improve both gut and immune function. The objective of this randomised, double-blind, placebo-controlled, cross-over human clinical trial was to study immune function and the gut microbiota in healthy elderly adults. Volunteers (n = 37) consumed prebiotic galacto-oligosaccharides (GOS; 8 g/d), probiotic Bifidobacterium lactis Bi-07 (Bi-07; 10^9 colony-forming units/d), their combination (Bi-07 + GOS) and maltodextrin control (8 g/d) in four 3-week periods separated by 4-week wash-out periods. Immune function was analysed by determining the phagocytic and oxidative burst activity of monocytes and granulocytes, whole-blood response to lipopolysaccharide, plasma chemokine concentrations and salivary IgA levels. Gut microbiota composition and faecal SCFA content were determined using 16S ribosomal RNA fluorescence in situ hybridisation and HPLC, respectively. Primary statistical analyses indicated the presence of carry-over effects and thus measurements from only the first supplementation period were considered valid. Subsequent statistical analysis showed that consumption of Bi-07 improved the phagocytic activity of monocytes (P < 0·001) and granulocytes (P = 0·02). Other parameters were unchanged. We have for the first time shown that the probiotic Bi-07 may provide health benefits to elderly individuals by improving the phagocytic activity of monocytes and granulocytes. The present results also suggest that in the elderly, the effects of some probiotics and prebiotics may last longer than in adults.
Effect Size Analyses of Souvenaid in Patients with Alzheimer’s Disease
Cummings, Jeffrey; Scheltens, Philip; McKeith, Ian; Blesa, Rafael; Harrison, John E.; Bertolucci, Paulo H.F.; Rockwood, Kenneth; Wilkinson, David; Wijker, Wouter; Bennett, David A.; Shah, Raj C.
2016-01-01
Background: Souvenaid® (uridine monophosphate, docosahexaenoic acid, eicosapentaenoic acid, choline, phospholipids, folic acid, vitamins B12, B6, C, and E, and selenium), was developed to support the formation and function of neuronal membranes. Objective: To determine effect sizes observed in clinical trials of Souvenaid and to calculate the number needed to treat to show benefit or harm. Methods: Data from all three reported randomized controlled trials of Souvenaid in Alzheimer’s disease (AD) dementia (Souvenir I, Souvenir II, and S-Connect) and an open-label extension study were included in analyses of effect size for cognitive, functional, and behavioral outcomes. Effect size was determined by calculating Cohen’s d statistic (or Cramér’s V method for nominal data), number needed to treat and number needed to harm. Statistical calculations were performed for the intent-to-treat populations. Results: In patients with mild AD, effect sizes were 0.21 (95% confidence intervals: –0.06, 0.49) for the primary outcome in Souvenir II (neuropsychological test battery memory z-score) and 0.20 (0.10, 0.34) for the co-primary outcome of Souvenir I (Wechsler memory scale delayed recall). No effect was shown on cognition in patients with mild-to-moderate AD (S-Connect). The number needed to treat (6 and 21 for Souvenir I and II, respectively) and high number needed to harm values indicate a favorable harm:benefit ratio for Souvenaid versus control in patients with mild AD. Conclusions: The favorable safety profile and impact on outcome measures converge to corroborate the putative mode of action and demonstrate that Souvenaid can achieve clinically detectable effects in patients with early AD. PMID:27767993
Searching for forcing signatures in decadal patterns of shoreline change
NASA Astrophysics Data System (ADS)
Burningham, H.; French, J.
2016-12-01
Analysis of shoreline position at spatial scales of the order of 10-100 km and at multi-decadal time-scales has the potential to reveal regional coherence (or lack of it) in the primary controls on shoreline tendencies and trends. Such information is extremely valuable for the evaluation of climate forcing on coastal behaviour. Segmenting a coast into discrete behaviour units based on these types of analyses is often subjective, however, and in the context of pervasive human interventions and alongshore variability in ocean climate, determining the most important controls on shoreline dynamics can be challenging. Multivariate analyses provide one means to resolve common behaviours across shoreline position datasets, thereby underpinning a more objective evaluation of possible coupling between shorelines at different scales. In an analysis of the Suffolk coast (eastern England), we explore the use of multivariate statistics to understand and classify mesoscale coastal behaviour. Suffolk comprises a relatively linear shoreline that shifts from east-facing in the north to southeast-facing in the south. Although primarily formed of a beach foreshore backed by cliffs or a shingle barrier, the shoreline is punctuated at three locations by narrow tidal inlets with offset entrances that imply a persistent north-to-south sediment transport direction. The tidal regime decreases from south to north, from mesotidal (3.6 m spring tidal range) to microtidal (1.9 m), and the bimodal wave climate (northeast and southwest modes) presents complex local-scale variability in nearshore conditions. Shorelines exhibit a range of decadal behaviours, from rapid erosion (up to 4 m/yr) to quasi-stability, that cannot be directly explained by the spatial organisation of contemporary landforms or coastal defences. A multivariate statistical approach to shoreline change analysis helps to define the key modes of change and determine the most likely forcing factors.
SEER Cancer Query Systems (CanQues)
These applications provide access to cancer statistics including incidence, mortality, survival, prevalence, and probability of developing or dying from cancer. Users can display reports of the statistics or extract them for additional analyses.
Jasiewicz, Andrzej; Grzywacz, Anna; Jabłoński, Marcin; Bieńkowski, Przemysław; Samochowiec, Agnieszka; Samochowiec, Jerzy
2014-01-01
The purpose of this study was to determine the relationship between the sweet-liking phenotype and sequence variation in genes of the dopaminergic and serotonergic systems. The study recruited 100 probands. The participants were interviewed for addiction (SSAGA - Semi-Structured Assessment for the Genetics of Alcoholism) and assessed with the following questionnaires: MMSE, Beck Depression Inventory, Hamilton Anxiety Scale, and Snaith-Hamilton Pleasure Scale. Taste was analyzed with tests assessing sensitivity to sweet taste, and smell tests were also performed. Patients preferring the highest glucose concentrations were classified as sweet likers. Statistical analyses were performed with SPSS (Statistical Package for the Social Sciences). Links between the sweet-liking phenotype and a polymorphic variant of the DAT1 gene were determined. The presence of the DAT1 9/10 genotype increased the odds of the sweet-liking phenotype threefold (p = 0.015, odds ratio 3.00), while the presence of DAT1 10/10 roughly halved the chance of being a sweet liker (p = 0.051, odds ratio 0.43). Genotype 10/10 was significantly more common among sweet dislikers (68.18% vs 47.92%), as was genotype 9/9 (6.82% vs 2.08%). Conclusions: a significant association between the presence of the 9/10 DAT1 VNTR genotype and the sweet-liking phenotype in probands was determined.
Riley, Sean P; Covington, Kyle; Landry, Michel D; McCallum, Christine; Engelhard, Chalee; Cook, Chad E
2016-01-01
This study compared selectivity characteristics across institutions to determine whether they differ by institutional funding source (public vs. private) or research activity level (research vs. non-research). The study included information provided by the Commission on Accreditation in Physical Therapy Education (CAPTE) and the Federation of State Boards of Physical Therapy. Data were extracted for all students who graduated in 2011 from accredited physical therapy programs in the United States. The public and private designations of the institutions were extracted directly from the classifications in the CAPTE annual accreditation report, and high and low research activity was determined based on Carnegie classifications. The institutions were classified into four groups: public/research-intensive, public/non-research-intensive, private/research-intensive, and private/non-research-intensive. Descriptive and comparison analyses with post hoc testing were performed to determine whether there were statistically significant differences among the four groups. Although there were statistically significant baseline grade point average differences among the four groups, there were no significant differences in licensure pass rates or in any of the selectivity variables of interest. Selectivity characteristics did not differ by institutional funding source (public vs. private) or research activity level (research vs. non-research). This suggests that concerns about reduced selectivity among physiotherapy programs, specifically the types that are experiencing the largest proliferation, appear less warranted.
Huh, Iksoo; Wu, Xin; Park, Taesung; Yi, Soojin V
2017-07-21
DNA methylation is one of the most extensively studied epigenetic modifications of genomic DNA. In recent years, sequencing of bisulfite-converted DNA, particularly via next-generation sequencing technologies, has become a widely popular method to study DNA methylation. This method can be readily applied to a variety of species, dramatically expanding the scope of DNA methylation studies beyond the traditionally studied human and mouse systems. In parallel to the increasing wealth of genomic methylation profiles, many statistical tools have been developed to detect differentially methylated loci (DMLs) or differentially methylated regions (DMRs) between biological conditions. We discuss and summarize several key properties of currently available tools to detect DMLs and DMRs from sequencing of bisulfite-converted DNA. However, the majority of the statistical tools developed for DML/DMR analyses have been validated using only mammalian data sets, and less priority has been placed on the analyses of invertebrate or plant DNA methylation data. We demonstrate that genomic methylation profiles of non-mammalian species are often highly distinct from those of mammalian species using examples of honey bees and humans. We then discuss how such differences in data properties may affect statistical analyses. Based on these differences, we provide three specific recommendations to improve the power and accuracy of DML and DMR analyses of invertebrate data when using currently available statistical tools. These considerations should facilitate systematic and robust analyses of DNA methylation from diverse species, thus advancing our understanding of DNA methylation. © The Author 2017. Published by Oxford University Press.
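The basic per-locus comparison that DML tools build on can be illustrated with a simple count-based test. The sketch below is a plain two-proportion z-test on methylated/total read counts at one CpG, with invented toy counts; it deliberately omits the dispersion modelling and smoothing that real DML/DMR tools add, which is exactly the machinery whose mammalian-centric tuning the authors caution about for invertebrate and plant data.

```python
import math
from statistics import NormalDist

# Per-locus differential methylation: compare methylated/total read
# counts between two conditions with a two-proportion z-test.
# Real tools model biological dispersion across replicates; this
# approximation also degrades at the low coverages common in
# non-mammalian bisulfite data.

def dml_pvalue(meth_a, total_a, meth_b, total_b):
    p_a, p_b = meth_a / total_a, meth_b / total_b
    p = (meth_a + meth_b) / (total_a + total_b)        # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / total_a + 1 / total_b))
    if se == 0:
        return 1.0                                     # all reads agree
    z = (p_a - p_b) / se
    return 2 * NormalDist().cdf(-abs(z))               # two-sided p-value

p_diff = dml_pvalue(45, 50, 20, 50)   # 90% vs 40% methylated: strong signal
p_same = dml_pvalue(30, 50, 31, 50)   # 60% vs 62%: no evidence
```

A DMR caller would then aggregate such per-CpG signals over windows, which is where assumptions about methylation landscape (CpG-island-centric in mammals, mosaic in honey bees) strongly affect power.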
Imaging Depression in Adults with ASD
2017-10-01
collected temporally close enough to imaging data in Phase 2 to be confidently incorporated in the planned statistical analyses, and (b) not unduly risk attrition between Phase 1 and 2, we chose to hold... Supervision is ongoing (since 9/2014). Co-I Dr. Lerner’s 2nd-year Clinical Psychology PhD students have participated in ADOS-2 Introductory Clinical...
Schulz, Marcus; Neumann, Daniel; Fleet, David M; Matthies, Michael
2013-12-01
During the last decades, marine pollution with anthropogenic litter has become a major environmental concern worldwide. Standardized monitoring of litter since 2001 on 78 beaches selected within the framework of the Convention for the Protection of the Marine Environment of the North-East Atlantic (OSPAR) has been used to identify temporal trends of marine litter. Based on statistical analyses of this dataset, a two-part multi-criteria evaluation system for beach litter pollution of the North-East Atlantic and the North Sea is proposed. Canonical correlation analyses, linear regression analyses, and non-parametric analyses of variance were used to identify different temporal trends. A classification of beaches was derived from cluster analyses and served to define different states of beach quality according to the abundances of 17 input variables. The evaluation system is easily applicable and relies on the above-mentioned classification and on significant temporal trends implied by significant rank correlations. Copyright © 2013 Elsevier Ltd. All rights reserved.
Zheng, Jie; Harris, Marcelline R; Masci, Anna Maria; Lin, Yu; Hero, Alfred; Smith, Barry; He, Yongqun
2016-09-14
Statistics play a critical role in biological and clinical research. However, most reports of scientific results in the published literature make it difficult for the reader to reproduce the statistical analyses performed in achieving those results because they provide inadequate documentation of the statistical tests and algorithms applied. The Ontology of Biological and Clinical Statistics (OBCS) is put forward here as a step towards solving this problem. The terms in OBCS, including 'data collection', 'data transformation in statistics', 'data visualization', 'statistical data analysis', and 'drawing a conclusion based on data', cover the major types of statistical processes used in basic biological research and clinical outcome studies. OBCS is aligned with the Basic Formal Ontology (BFO) and extends the Ontology of Biomedical Investigations (OBI), an OBO (Open Biological and Biomedical Ontologies) Foundry ontology supported by over 20 research communities. Currently, OBCS comprises 878 terms, representing 20 BFO classes, 403 OBI classes, 229 OBCS-specific classes, and 122 classes imported from ten other OBO ontologies. We discuss two examples illustrating how the ontology is being applied. In the first (biological) use case, we describe how OBCS was applied to represent the high-throughput microarray data analysis of immunological transcriptional profiles in human subjects vaccinated with an influenza vaccine. In the second (clinical outcomes) use case, we applied OBCS to represent the processing of electronic health care data to determine the associations between hospital staffing levels and patient mortality. Our case studies were designed to show how OBCS can be used for the consistent representation of statistical analysis pipelines under two different research paradigms. Other ongoing projects using OBCS for statistical data processing are also discussed. The OBCS source code and documentation are available at: https://github.com/obcs/obcs .
The Ontology of Biological and Clinical Statistics (OBCS) is a community-based open source ontology in the domain of biological and clinical statistics. OBCS is a timely ontology that represents statistics-related terms and their relations in a rigorous fashion, facilitates standard data analysis and integration, and supports reproducible biological and clinical research.
2016-12-01
KS and AD Statistical Power via Monte Carlo Simulation. Statistical power is the probability of correctly rejecting the null hypothesis when the... Determining the Statistical Power... real-world data to test the accuracy of the simulation. Statistical comparison of these metrics can be necessary when making such a determination
A new statistical method for design and analyses of component tolerance
NASA Astrophysics Data System (ADS)
Movahedi, Mohammad Mehdi; Khounsiavash, Mohsen; Otadi, Mahmood; Mosleh, Maryam
2017-03-01
Tolerancing conducted by design engineers to meet customers' needs is a prerequisite for producing high-quality products. Engineers use handbooks to conduct tolerancing. While the use of statistical methods for tolerancing is not new, engineers often assume known distributions, such as the normal distribution. Yet, if the statistical distribution of the given variable is unknown, a new statistical method is needed to design tolerances. In this paper, we use the generalized lambda distribution for the design and analysis of component tolerance, using the percentile method (PM) to estimate the distribution parameters. The findings indicate that, when the distribution of the component data is unknown, the proposed method can be used to expedite the design of component tolerance. Moreover, in the case of assembled sets, a more extensive tolerance for each component with the same target performance can be utilized.
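Once the four lambda parameters have been estimated (by the percentile method in the paper), tolerance limits fall straight out of the GLD's closed-form quantile function. The sketch below uses the Ramberg-Schmeiser form of the GLD with illustrative lambda values that approximate a standard normal; the values are not taken from the paper, and the fitting step itself is assumed already done.

```python
# Generalized lambda distribution (Ramberg-Schmeiser form) quantile
# function, and tolerance limits read directly off its percentiles.

def gld_quantile(u, lam1, lam2, lam3, lam4):
    """Q(u) = lam1 + (u**lam3 - (1-u)**lam4) / lam2, for 0 < u < 1."""
    return lam1 + (u**lam3 - (1 - u)**lam4) / lam2

def tolerance_limits(lams, coverage=0.99):
    """Central tolerance interval covering `coverage` of the fitted GLD."""
    tail = (1 - coverage) / 2
    return gld_quantile(tail, *lams), gld_quantile(1 - tail, *lams)

# Illustrative lambdas approximating a standard normal distribution
lams = (0.0, 0.1975, 0.1349, 0.1349)
lo, hi = tolerance_limits(lams, coverage=0.99)   # close to +/-2.58 sigma
```

Because the GLD can take a wide variety of shapes (skewed, heavy- or light-tailed) with the same four parameters, the same two functions apply unchanged when the component data are clearly non-normal, which is the paper's motivating case.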
Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio
2013-03-01
To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a score of <6. Our findings document that only a few of the studies reviewed applied statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.
2015-08-01
the nine questions. The Statistical Package for the Social Sciences (SPSS) [11] was used to conduct statistical analysis on the sample. Two types...constructs. SPSS was again used to conduct statistical analysis on the sample. This time factor analysis was conducted. Factor analysis attempts to...Business Research Methods and Statistics using SPSS. P432. 11 IBM SPSS Statistics. (2012) 12 Burns, R.B., Burns, R.A. (2008) 'Business Research
Association factor analysis between osteoporosis with cerebral artery disease: The STROBE study.
Jin, Eun-Sun; Jeong, Je Hoon; Lee, Bora; Im, Soo Bin
2017-03-01
The purpose of this study was to determine the clinical factors linking osteoporosis and cerebral artery disease in a Korean population. Two hundred nineteen postmenopausal women and men undergoing cerebral computed tomography angiography were enrolled in this cross-sectional study to evaluate cerebral artery disease. Cerebral artery disease was diagnosed if there was narrowing of 50% or more of the diameter in one or more cerebral arteries or the presence of vascular calcification. History of osteoporotic fracture was assessed using medical records and radiographic data such as plain radiography, MRI, and bone scans. Bone mineral density was measured by dual-energy x-ray absorptiometry. We reviewed clinical characteristics in all patients and also retrospectively performed subgroup analyses for the total and the extracranial/intracranial cerebral artery disease groups. Statistical analysis was performed by means of the chi-square test or Fisher's exact test for categorical variables and Student's t-test or Wilcoxon's rank sum test for continuous variables. Univariate and multivariate logistic regression analyses were conducted to assess the factors associated with the prevalence of cerebral artery disease. A two-tailed p-value of less than 0.05 was considered statistically significant. All statistical analyses were performed using R (version 3.1.3; The R Foundation for Statistical Computing, Vienna, Austria) and SPSS (version 14.0; SPSS, Inc, Chicago, Ill, USA). Of the 219 patients, 142 had cerebral artery disease. Vertebral fractures were observed in 29 patients (13.24%). There was a significant difference in hip fracture according to the presence or absence of cerebral artery disease. In logistic regression analysis, osteoporotic hip fracture was significantly associated with extracranial cerebral artery disease after adjusting for multiple risk factors.
In females, osteoporotic hip fracture was associated with total calcified cerebral artery disease. Clinical factors such as age, hypertension, osteoporotic hip fracture, smoking history, and anti-osteoporosis drug use were associated with cerebral artery disease.
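For categorical comparisons with small cell counts, as in the subgroup analyses above, Fisher's exact test can be computed directly from the hypergeometric distribution. A self-contained sketch (the example table is illustrative, not from the study):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def pmf(x):  # hypergeometric probability of x row-1 subjects in column 1
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = pmf(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    # Two-sided p: sum over all tables no more probable than the observed one.
    return sum(pmf(x) for x in range(lo, hi + 1) if pmf(x) <= p_obs + 1e-12)

p = fisher_exact_two_sided(3, 1, 1, 3)  # classic "lady tasting tea" table
```

This enumeration approach is exact for small tables; for large samples the chi-square test the authors also used becomes the practical choice.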
Research Design and Statistical Methods in Indian Medical Journals: A Retrospective Survey
Hassan, Shabbeer; Yellur, Rajashree; Subramani, Pooventhan; Adiga, Poornima; Gokhale, Manoj; Iyer, Manasa S.; Mayya, Shreemathi S.
2015-01-01
Good quality medical research generally requires not only expertise in the chosen medical field of interest but also a sound knowledge of statistical methodology. The number of medical research articles published in Indian medical journals has increased substantially in the past decade. The aim of this study was to collate all evidence on study design quality and statistical analyses used in selected leading Indian medical journals. Ten leading Indian medical journals were selected based on impact factors, and all original research articles published in 2003 (N = 588) and 2013 (N = 774) were categorized and reviewed. A validated checklist on study design, statistical analyses, results presentation, and interpretation was used for review and evaluation of the articles. The main outcomes considered were study design types and their frequencies, the proportion of errors/defects in study design and statistical analyses, and implementation of the CONSORT checklist in randomized clinical trials (RCTs). From 2003 to 2013, the proportion of erroneous statistical analyses did not decrease (χ2=0.592, Φ=0.027, p=0.4418): 25% (80/320) in 2003 compared with 22.6% (111/490) in 2013. Compared with 2003, significant improvement was seen in 2013: the proportion of papers using statistical tests increased significantly (χ2=26.96, Φ=0.16, p<0.0001) from 42.5% (250/588) to 56.7% (439/774), and the overall proportion of errors in study design decreased significantly (χ2=16.783, Φ=0.12, p<0.0001), from 41.3% (243/588) to 30.6% (237/774). In 2013, the proportion of randomized clinical trial designs remained very low (7.3%, 43/588), with the majority showing errors (41 papers, 95.3%). The majority of the published studies were retrospective in nature in both 2003 [79.1% (465/588)] and 2013 [78.2% (605/774)].
Major decreases in error proportions were observed in both results presentation (χ2=24.477, Φ=0.17, p<0.0001), from 82.2% (263/320) to 66.3% (325/490), and interpretation (χ2=25.616, Φ=0.173, p<0.0001), from 32.5% (104/320) to 17.1% (84/490), though some serious errors were still present. Indian medical research seems to have made no major progress in the use of correct statistical analyses, but errors/defects in study designs have decreased significantly. Randomized clinical trials are published quite rarely and have a high proportion of methodological problems. PMID:25856194
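The reported chi-square for the rise in statistical-test use can be checked directly from the counts in the abstract. A quick sketch (note that φ computed as √(χ²/n) comes out near 0.14 for this table, a little below the reported 0.16, which may reflect a different convention or rounding):

```python
from math import sqrt

def chi2_phi(a, b, c, d):
    """Pearson chi-square (no continuity correction) and phi for a 2x2 table."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return chi2, sqrt(chi2 / n)

# Papers using statistical tests: 250/588 in 2003 vs. 439/774 in 2013.
chi2, phi = chi2_phi(250, 588 - 250, 439, 774 - 439)  # chi2 ~ 26.96
```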
Analysis of self-determined motivation in basketball players through goal orientations.
Gómez-López, Manuel; Granero-Gallegos, Antonio; Abraldes, J Arturo; Rodríguez-Suárez, Nuria
2013-09-01
The purpose of this study was twofold: first, to examine the relations between the different constructs that define Nicholls' achievement goal theory and Deci and Ryan's self-determination theory; and second, to analyse the differences between them with respect to the socio-demographic variables of gender and age. A sample of 292 federated basketball players from the Region of Murcia (Spain), aged between 14 and 18 years, was used. Castilian versions of the Perception of Success Questionnaire (POSQ) and the Sports Motivational Scale (SMS) were administered. Three statistical analyses were employed: a descriptive analysis, a correlation analysis, and a regression analysis. The results showed a positive relation between ego orientation, extrinsic motivation, and amotivation. The motivational relations between the two theories and the differences with respect to gender and age are discussed. We found that mainly gender, and also age, differences are strong predictors of ego orientation, extrinsic motivation of external regulation, and amotivation. We can also confirm that extrinsic motivation of external regulation positively predicts ego orientation and a decrease in task orientation. The results support the use of the Spanish version of the SMS to measure different types of motivation within the sports context.
The effectiveness of air bags.
Barry, S; Ginpil, S; O'Neill, T J
1999-11-01
Previous research has shown that the installation of air bags in vehicles significantly reduces crash-related deaths, but these analyses have used statistical techniques that could not control for other major determinants of crash survival. This study analysed data from the US FARS database of fatal crashes using conditional logistic regression, which can simultaneously estimate occupant protection effects for a range of variables. The analysis provided a comparative quantification of both the air bag effect and other well-known determinants of occupant crash survival (age, seat belt use, and gender). When potentially confounding variables were controlled, both driver- and passenger-side air bag devices were shown to significantly reduce the probability of death in direct frontal collisions, but the effect size was small compared with that of the seat belt, and it may also be very small in absolute terms depending on the severity of the crash involved. Given the limited benefit of the air bag, efforts to promote air bags seem particularly difficult to justify in countries such as the United States, where the vastly superior occupant protection of the seat belt is under-utilised.
Scale Matters: A Cost-Outcome Analysis of an m-Health Intervention in Malawi.
Larsen-Cooper, Erin; Bancroft, Emily; Rajagopal, Sharanya; O'Toole, Maggie; Levin, Ann
2016-04-01
The primary objectives of this study are to determine cost per user and cost per contact with users of a mobile health (m-health) intervention. The secondary objectives are to map costs to changes in maternal, newborn, and child health (MNCH) and to estimate costs of alternate implementation and usage scenarios. A base cost model, constructed from recurrent costs and selected capital costs, was used to estimate average cost per user and per contact of an m-health intervention. This model was mapped to statistically significant changes in MNCH intermediate outcomes to determine the cost of improvements in MNCH indicators. Sensitivity analyses were conducted to estimate costs in alternate scenarios. The m-health intervention cost $29.33 per user and $4.33 per successful contact. The average cost for each user experiencing a change in an MNCH indicator ranged from $67 to $355. The sensitivity analyses showed that cost per user could be reduced by 48% if the service were to operate at full capacity. We believe that the intervention, operating at scale, has potential to be a cost-effective method for improving maternal and child health indicators.
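The scale effect reported above follows from spreading fixed costs over more users. A toy sketch; the fixed/variable split below is hypothetical, chosen only so that the base case matches the reported $29.33 per user:

```python
def cost_per_user(fixed_cost, variable_per_user, users):
    """Average cost per user: fixed costs spread over users, plus per-user cost."""
    return fixed_cost / users + variable_per_user

# Hypothetical split, chosen only so the base case matches the reported $29.33:
FIXED, VARIABLE = 270_000.0, 2.33
base = cost_per_user(FIXED, VARIABLE, 10_000)         # $29.33 per user
at_capacity = cost_per_user(FIXED, VARIABLE, 20_895)  # fixed costs spread further
reduction = 1 - at_capacity / base                    # ~0.48, the reported 48%
```

Only the fixed component dilutes with scale; the variable (per-contact) component does not, which is why cost per user falls but does not vanish at full capacity.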
Path loss variation of on-body UWB channel in the frequency bands of IEEE 802.15.6 standard.
Goswami, Dayananda; Sarma, Kanak C; Mahanta, Anil
2016-06-01
The wireless body area network (WBAN) has gained considerable attention among researchers and academicians for its envisioned applications in healthcare services. Ultra-wideband (UWB) radio technology is considered an excellent air interface for communication among body area network devices. Characterisation and modelling of channel parameters are an essential prerequisite for the development of a reliable communication system. The path loss of the on-body UWB channel for each frequency band defined in the IEEE 802.15.6 standard was experimentally determined, and the parameters of the path loss model were statistically estimated from the measurement data. Both line-of-sight and non-line-of-sight channel conditions were considered in the measurements. Variations of the parameter values with the size of the human body and with the surrounding environment were analysed. It was observed that the parameters of the path loss model vary with the frequency band as well as with body size and surrounding environment. The derived parameter values are specific to the particular frequency bands of the IEEE 802.15.6 standard and will be useful for the development of efficient UWB WBAN systems.
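A common way to estimate path loss parameters from measurement data, as described above, is a least-squares fit of the log-distance model PL(d) = PL0 + 10·n·log10(d/d0). The sketch below uses synthetic "measurements" with assumed values (PL0 = 40 dB, exponent n = 3, 1 dB shadowing), not data from the study:

```python
import math
import random

def fit_path_loss(distances, losses, d0=0.1):
    """Least-squares fit of PL(d) = PL0 + 10*n*log10(d/d0) to measurements."""
    x = [10 * math.log10(d / d0) for d in distances]
    mx, my = sum(x) / len(x), sum(losses) / len(losses)
    n_exp = sum((xi - mx) * (yi - my) for xi, yi in zip(x, losses)) / \
            sum((xi - mx) ** 2 for xi in x)
    pl0 = my - n_exp * mx
    return pl0, n_exp

# Synthetic on-body "measurements": assumed PL0 = 40 dB at d0 = 0.1 m,
# path loss exponent n = 3, Gaussian shadowing with 1 dB spread.
rng = random.Random(1)
dists = [0.1 + 0.05 * i for i in range(1, 20)]  # 0.15 m .. 1.05 m
meas = [40 + 30 * math.log10(d / 0.1) + rng.gauss(0, 1) for d in dists]
pl0, n_exp = fit_path_loss(dists, meas)
```

The fitted exponent and intercept recover the assumed values to within the shadowing noise; repeating the fit per frequency band is how band-specific parameters like those in the paper are obtained.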
Statistical Literacy in the Data Science Workplace
ERIC Educational Resources Information Center
Grant, Robert
2017-01-01
Statistical literacy, the ability to understand and make use of statistical information including methods, has particular relevance in the age of data science, when complex analyses are undertaken by teams from diverse backgrounds. Not only is it essential to communicate to the consumers of information but also within the team. Writing from the…
Hankonen, Nelli; Heino, Matti T J; Kujala, Emilia; Hynynen, Sini-Tuuli; Absetz, Pilvikki; Araújo-Soares, Vera; Borodulin, Katja; Haukkala, Ari
2017-02-01
Designing evidence-based interventions to address socioeconomic disparities in health and health behaviours requires a better understanding of the specific explanatory mechanisms. We aimed to investigate a comprehensive range of potential theoretical mediators of physical activity (PA) and screen time in different socioeconomic status (SES) groups: a high-SES group of high school students and a low-SES group of vocational school students. The COM-B system, including the Theoretical Domains Framework (TDF), was used as a heuristic framework to synthesise different theoretical determinants in this exploratory study. Finnish vocational and high school students (N = 659) aged 16-19 responded to a survey assessing psychological, social and environmental determinants of activity (PA and screen time). These determinants are mappable onto the COM-B domains: capability, opportunity and motivation. The outcome measures were validated self-report measures of PA and screen time. The statistical analyses included a bootstrapping-based mediation procedure. Regarding PA, there were SES differences in all of the COM-B domains. For example, vocational school students reported using less self-monitoring of PA, weaker injunctive norms to engage in regular PA, and fewer intentions than high school students. Mediation analyses identified potential mediators of the SES-PA relationship in all three domains: the most important candidates included self-monitoring (CI95 for b: 0.19-0.47), identity (0.04-0.25) and material resources available (0.01-0.16). However, SES was not related to most determinants of screen time, where there were mainly gender differences. Most determinants were similarly related to both behaviours in both SES groups, indicating no major moderating effect of SES on these relationships. This study revealed that, already in the first years of educational differentiation, levels of key PA determinants differ, contributing to socioeconomic differences in PA.
The analyses identified the strongest mediators of the SES-PA association, but additional investigation utilising longitudinal and experimental designs is needed. This study demonstrates the usefulness of combining constructs from various theoretical approaches to better understand the role of the distinct mechanisms that underpin socioeconomic health behaviour disparities.
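The bootstrapping-based mediation procedure mentioned above can be sketched as follows: estimate the indirect effect as the product of the a-path and b-path slopes, then form a percentile confidence interval over bootstrap resamples. All data below are simulated, and the simplified b-path regression omits the direct path (which is zero by construction in this simulation):

```python
import random

def slope(x, y):
    """OLS slope of y regressed on x (closed form)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
           sum((xi - mx) ** 2 for xi in x)

def indirect_effect(ses, mediator, outcome):
    """a*b indirect effect: SES -> mediator (a) times mediator -> outcome (b)."""
    return slope(ses, mediator) * slope(mediator, outcome)

rng = random.Random(42)
n = 200
ses = [rng.random() for _ in range(n)]
med = [0.5 * s + rng.gauss(0, 0.1) for s in ses]  # true a = 0.5
pa = [0.6 * m + rng.gauss(0, 0.1) for m in med]   # true b = 0.6, so a*b = 0.3

boots = []
for _ in range(1000):
    idx = [rng.randrange(n) for _ in range(n)]
    boots.append(indirect_effect([ses[i] for i in idx],
                                 [med[i] for i in idx],
                                 [pa[i] for i in idx]))
boots.sort()
ci = (boots[24], boots[974])  # 95% percentile confidence interval
```

A confidence interval excluding zero, as here, is the usual criterion for declaring mediation in this bootstrap framework.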
Reporting Practices and Use of Quantitative Methods in Canadian Journal Articles in Psychology.
Counsell, Alyssa; Harlow, Lisa L
2017-05-01
With recent focus on the state of research in psychology, it is essential to assess the nature of the statistical methods and analyses used and reported by psychological researchers. To that end, we investigated the prevalence of different statistical procedures and the nature of statistical reporting practices in recent articles from the four major Canadian psychology journals. The majority of authors evaluated their research hypotheses through the use of analysis of variance (ANOVA), t-tests, and multiple regression. Multivariate approaches were less common. Null hypothesis significance testing remains a popular strategy, but the majority of authors reported a standardized or unstandardized effect size measure alongside their significance test results. Confidence intervals on effect sizes were infrequently employed. Many authors provided minimal details about their statistical analyses, and fewer than a third of the articles reported data complications such as missing data and violations of statistical assumptions. Strengths of, and areas needing improvement in, the reporting of quantitative results are highlighted. The paper concludes with recommendations for how researchers and reviewers can improve comprehension and transparency in statistical reporting.
Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review
Lamb, Karen E.; Thornton, Lukar E.; Cerin, Ester; Ball, Kylie
2015-01-01
Background Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods Searches were conducted for articles published from 2000 to 2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results. PMID:29546115
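Spatial autocorrelation, which the review notes is rarely considered, can be quantified with Moran's I. A minimal sketch on a toy one-dimensional layout of neighbourhoods (values and layout are illustrative only):

```python
def morans_i(values, weights):
    """Moran's I spatial autocorrelation; weights[i][j] = 1 for neighbours."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    w_sum = sum(map(sum, weights))
    return (n / w_sum) * (num / den)

def chain_weights(n):
    """Adjacency for n neighbourhoods along a line (a toy spatial layout)."""
    return [[1 if abs(i - j) == 1 else 0 for j in range(n)] for i in range(n)]

clustered = morans_i([1, 1, 1, 5, 5], chain_weights(5))    # similar values adjacent
alternating = morans_i([1, 5, 1, 5, 1], chain_weights(5))  # dissimilar adjacent
```

Positive I indicates spatial clustering (e.g. deprived neighbourhoods with similar outlet access sitting together), which violates the independence assumption of ANOVA and ordinary regression.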
Austin, Peter C; Reeves, Mathew J
2013-03-01
Hospital report cards, in which outcomes following the provision of medical or surgical care are compared across health care providers, are being published with increasing frequency. Essential to the production of these reports is risk-adjustment, which allows investigators to account for differences in the distribution of patient illness severity across different hospitals. Logistic regression models are frequently used for risk adjustment in hospital report cards. Many applied researchers use the c-statistic (equivalent to the area under the receiver operating characteristic curve) of the logistic regression model as a measure of the credibility and accuracy of hospital report cards. To determine the relationship between the c-statistic of a risk-adjustment model and the accuracy of hospital report cards. Monte Carlo simulations were used to examine this issue. We examined the influence of 3 factors on the accuracy of hospital report cards: the c-statistic of the logistic regression model used for risk adjustment, the number of hospitals, and the number of patients treated at each hospital. The parameters used to generate the simulated datasets came from analyses of patients hospitalized with a diagnosis of acute myocardial infarction in Ontario, Canada. The c-statistic of the risk-adjustment model had, at most, a very modest impact on the accuracy of hospital report cards, whereas the number of patients treated at each hospital had a much greater impact. The c-statistic of a risk-adjustment model should not be used to assess the accuracy of a hospital report card.
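The c-statistic discussed above is simply the probability that a randomly chosen patient who experienced the outcome received a higher predicted risk than a randomly chosen patient who did not. A sketch with illustrative risks (not data from the study):

```python
def c_statistic(event_scores, nonevent_scores):
    """Concordance: fraction of event/non-event pairs ranked correctly; ties = 1/2."""
    pairs = concordant = 0.0
    for e in event_scores:
        for s in nonevent_scores:
            pairs += 1
            if e > s:
                concordant += 1
            elif e == s:
                concordant += 0.5
    return concordant / pairs

# Illustrative predicted mortality risks (hypothetical, not from the study):
died = [0.9, 0.8, 0.6, 0.55]
survived = [0.7, 0.5, 0.4, 0.3, 0.2]
c = c_statistic(died, survived)  # 18 of 20 pairs concordant -> 0.9
```

Note that this measures patient-level discrimination only; as the simulations above show, it says little about whether hospital-level rankings in a report card are accurate.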
Henderson, G; Fahey, T; McGuire, W
2007-10-17
Preterm infants are often growth-restricted at hospital discharge. Feeding infants after hospital discharge with nutrient-enriched formula rather than standard term formula might facilitate "catch-up" growth and improve development. To determine the effect of feeding nutrient-enriched formula compared with standard term formula on growth and development for preterm infants following hospital discharge. The standard search strategy of the Cochrane Neonatal Review Group was used. This included searches of the Cochrane Central Register of Controlled Trials (CENTRAL, The Cochrane Library, Issue 2, 2007), MEDLINE (1966 - May 2007), EMBASE (1980 - May 2007), CINAHL (1982 - May 2007), conference proceedings, and previous reviews. Randomised or quasi-randomised controlled trials that compared the effect of feeding preterm infants following hospital discharge with nutrient-enriched formula versus standard term formula were eligible. Data were extracted using the standard methods of the Cochrane Neonatal Review Group, with separate evaluation of trial quality and data extraction by two authors, and data were synthesised using weighted mean differences and a fixed-effect model for meta-analysis. Seven eligible trials were found. These recruited a total of 631 infants and were generally of good methodological quality. The trials found little evidence that feeding with nutrient-enriched formula milk affected growth and development. Because of differences in the way individual trials measured and presented outcomes, data synthesis was limited. Growth data from two trials found that, at six months post-term, infants fed nutrient-enriched formula had statistically significantly lower weights [weighted mean difference: -601 (95% confidence interval -1028, -174) grams], lengths [-18.8 (-30.0, -7.6) millimetres], and head circumferences [-10.2 (-18.0, -2.4) millimetres] than infants fed standard term formula.
At 12 to 18 months post-term, meta-analyses of data from three trials did not find any statistically significant differences in growth parameters. However, examination of these meta-analyses demonstrated statistical heterogeneity. Meta-analyses of data from two trials did not reveal a statistically significant difference in Bayley Mental Development or Psychomotor Development Indices. There are not yet any data on growth or development through later childhood. The available data do not provide strong evidence that feeding preterm infants following hospital discharge with nutrient-enriched formula compared with standard term formula affects growth rates or development up to 18 months post-term.
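The fixed-effect, inverse-variance pooling behind the weighted mean differences above can be sketched as follows; the two trial results below are hypothetical, not the review's data:

```python
def fixed_effect_wmd(diffs, ses):
    """Inverse-variance fixed-effect pooled weighted mean difference and 95% CI."""
    w = [1 / se ** 2 for se in ses]
    pooled = sum(wi * d for wi, d in zip(w, diffs)) / sum(w)
    se_pooled = (1 / sum(w)) ** 0.5
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Two hypothetical trials reporting weight differences in grams with SEs:
pooled, ci = fixed_effect_wmd([-500.0, -700.0], [300.0, 250.0])
```

Each trial is weighted by the inverse of its variance, so the more precise trial pulls the pooled estimate toward itself; a fixed-effect model is only appropriate when, unlike the heterogeneous 12-18-month analyses above, the trials estimate a common effect.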
Wisniowski, Brendan; Barnes, Mary; Jenkins, Jason; Boyne, Nicholas; Kruger, Allan; Walker, Philip J
2011-09-01
Endovascular abdominal aortic aneurysm (AAA) repair (EVAR) has been associated with lower operative mortality and morbidity than open surgery but comparable long-term mortality and higher delayed complication and reintervention rates. Attention has therefore been directed to identifying preoperative and operative variables that influence outcomes after EVAR. Risk-prediction models, such as the EVAR Risk Assessment (ERA) model, have also been developed to help surgeons plan EVAR procedures. The aims of this study were (1) to describe outcomes of elective EVAR at the Royal Brisbane and Women's Hospital (RBWH), (2) to identify preoperative and operative variables predictive of outcomes after EVAR, and (3) to externally validate the ERA model. All elective EVAR procedures at the RBWH before July 1, 2009, were reviewed. Descriptive analyses were performed to determine the outcomes. Univariate and multivariate analyses were performed to identify preoperative and operative variables predictive of outcomes after EVAR. Binomial logistic regression analyses were used to externally validate the ERA model. Before July 1, 2009, 197 patients (172 men), who were a mean age of 72.8 years, underwent elective EVAR at the RBWH. Operative mortality was 1.0%. Survival was 81.1% at 3 years and 63.2% at 5 years. Multivariate analysis showed predictors of survival were age (P = .0126), American Society of Anesthesiologists (ASA) score (P = .0180), and chronic obstructive pulmonary disease (P = .0348) at 3 years and age (P = .0103), ASA score (P = .0006), renal failure (P = .0048), and serum creatinine (P = .0022) at 5 years. Aortic branch vessel score was predictive of initial (30-day) type II endoleak (P = .0015). AAA tortuosity was predictive of midterm type I endoleak (P = .0251). Female sex was associated with lower rates of initial clinical success (P = .0406). 
The ERA model fitted RBWH data well for early death (C statistic = .906), 3-year survival (C statistic = .735), 5-year survival (C statistic = .800), and initial type I endoleak (C statistic = .850). The outcomes of elective EVAR at the RBWH are broadly consistent with those of a nationwide Australian audit and recent randomized trials. Age and ASA score are independent predictors of midterm survival after elective EVAR. The ERA model predicts mortality-related outcomes and initial type I endoleak well for RBWH elective EVAR patients. Copyright © 2011 Society for Vascular Surgery. All rights reserved.
Reduction of Fasting Blood Glucose and Hemoglobin A1c Using Oral Aloe Vera: A Meta-Analysis.
Dick, William R; Fletcher, Emily A; Shah, Sachin A
2016-06-01
Diabetes mellitus is a global epidemic and one of the leading causes of morbidity and mortality. Additional medications that are novel, affordable, and efficacious are needed to treat this rampant disease. This meta-analysis was performed to ascertain the effectiveness of oral aloe vera consumption on the reduction of fasting blood glucose (FBG) and hemoglobin A1c (HbA1c). PubMed, CINAHL, Natural Medicines Comprehensive Database, and Natural Standard databases were searched. Studies of aloe vera's effect on FBG, HbA1c, homeostasis model assessment-estimated insulin resistance (HOMA-IR), fasting serum insulin, fructosamine, and oral glucose tolerance test (OGTT) in prediabetic and diabetic populations were examined. After data extraction, the parameters of FBG and HbA1c had appropriate data for meta-analyses. Extracted data were verified and then analyzed by StatsDirect Statistical Software. Reductions of FBG and HbA1c were reported as the weighted mean differences from baseline, calculated by a random-effects model with 95% confidence intervals. Subgroup analyses to determine clinical and statistical heterogeneity were also performed. Publication bias was assessed by using the Egger bias statistic. Nine studies were included in the FBG parameter (n = 283); 5 of these studies included HbA1c data (n = 89). Aloe vera decreased FBG by 46.6 mg/dL (p < 0.0001) and HbA1c by 1.05% (p = 0.004). Significant reductions of both endpoints were maintained in all subgroup analyses. Additionally, the data suggest that patients with an FBG ≥200 mg/dL may see a greater benefit. A mean FBG reduction of 109.9 mg/dL was observed in this population (p ≤ 0.0001). The Egger statistic showed publication bias with FBG but not with HbA1c (p = 0.010 and p = 0.602, respectively). These results support the use of oral aloe vera for significantly reducing FBG (46.6 mg/dL) and HbA1c (1.05%). 
Further clinical studies that are more robust and better controlled are warranted to further explore these findings.
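The pooling the abstract describes (weighted mean differences from baseline under a random-effects model with 95% confidence intervals) can be sketched with the standard DerSimonian–Laird estimator. This is a minimal illustration, not the StatsDirect implementation, and the four study effects and variances below are hypothetical:

```python
import math

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooling of per-study mean
    differences with known within-study variances."""
    w = [1.0 / v for v in variances]                  # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q and the between-study variance tau^2
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # random-effects weights add tau^2 to each study's variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical FBG mean differences (mg/dL) and within-study variances
pooled, ci = random_effects_pool([-40.0, -55.0, -35.0, -60.0],
                                 [25.0, 30.0, 20.0, 40.0])
```

When Q exceeds its degrees of freedom, tau² is positive and the confidence interval widens relative to a fixed-effect analysis, which is why heterogeneous studies like these are pooled under a random-effects model.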
The SPARC Intercomparison of Middle Atmosphere Climatologies
NASA Technical Reports Server (NTRS)
Randel, William; Fleming, Eric; Geller, Marvin; Gelman, Mel; Hamilton, Kevin; Karoly, David; Ortland, Dave; Pawson, Steve; Swinbank, Richard; Udelhofen, Petra
2003-01-01
Our current confidence in 'observed' climatological winds and temperatures in the middle atmosphere (over altitudes approx. 10-80 km) is assessed by detailed intercomparisons of contemporary and historic data sets. These data sets include global meteorological analyses and assimilations, climatologies derived from research satellite measurements, and historical reference atmosphere circulation statistics. We also include comparisons with historical rocketsonde wind and temperature data, and with more recent lidar temperature measurements. The comparisons focus on a few basic circulation statistics, such as temperature, zonal wind, and eddy flux statistics. Special attention is focused on tropical winds and temperatures, where large differences exist among separate analyses. Assimilated data sets provide the most realistic tropical variability, but substantial differences exist among current schemes.
NASA Technical Reports Server (NTRS)
1982-01-01
A FORTRAN-coded computer program and method to predict the reaction control fuel consumption statistics for a three-axis-stabilized rocket vehicle upper stage is described. A Monte Carlo approach is used, made more efficient by closed-form estimates of impulses. The effects of rocket motor thrust misalignment, static unbalance, aerodynamic disturbances, and deviations in trajectory, mass properties, and control system characteristics are included. This routine can be applied to many types of on-off reaction-controlled vehicles. The pseudorandom number generation and statistical-analysis subroutines, including the output histograms, can be used for other Monte Carlo analysis problems.
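The approach the abstract describes can be illustrated in miniature: sample the dispersions, convert each draw to propellant mass through a closed-form impulse estimate rather than a time-domain simulation, and collect statistics over the trials. Everything below (thrust level, sigmas, moment arm, Isp) is an assumed placeholder, not taken from the original program:

```python
import math
import random
import statistics

def fuel_trial(rng):
    """One Monte Carlo trial: sample dispersions, return RCS propellant
    used (kg) via a closed-form impulse estimate."""
    misalign = rng.gauss(0.0, 0.25)    # thrust misalignment, deg (assumed sigma)
    unbalance = rng.gauss(0.0, 0.01)   # static unbalance / cg offset, m (assumed)
    burn_time = rng.gauss(300.0, 5.0)  # main-engine burn, s (assumed)
    thrust = 60_000.0                  # nominal main-engine thrust, N (assumed)
    # small-angle disturbance torque from misalignment plus cg offset
    torque = thrust * (abs(math.radians(misalign)) * 1.5 + abs(unbalance))
    moment_arm, isp, g0 = 2.0, 220.0, 9.81
    impulse = torque / moment_arm * burn_time      # counteracting RCS impulse, N*s
    return impulse / (isp * g0)                    # propellant mass, kg

rng = random.Random(1)
fuel = [fuel_trial(rng) for _ in range(2000)]
mean_kg = statistics.mean(fuel)
p99 = sorted(fuel)[int(0.99 * len(fuel))]          # 99th-percentile loading case
```

Because each trial is a handful of closed-form expressions instead of an integrated trajectory, thousands of trials are cheap, which is the efficiency gain the abstract claims for this method.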
Fracture Networks from a deterministic physical model as 'forerunners' of Maze Caves
NASA Astrophysics Data System (ADS)
Ferer, M. V.; Smith, D. H.; Lace, M. J.
2013-12-01
"Fractures are the chief forerunners of caves because they transmit water much more rapidly than intergranular pores."[1] Thus, the cave networks can follow the fracture networks from which the Karst caves formed by a variety of processes. Traditional models of continental Karst define water flow through subsurface geologic formations, slowly dissolving the rock along the pathways (e.g., water saturated with respect to carbon dioxide flowing through fractured carbonate formations). We have developed a deterministic, physical model of fracturing in a model geologic layer of a given thickness, when that layer is strained in one direction and subsequently in a perpendicular direction. We observed that the connected fracture networks from our model visually resemble maps of maze caves. Since these detailed cave maps offer critical tools in modeling cave development patterns and conduit flow in Karst systems, we were able to test the qualitative resemblance by using statistical analyses to compare our model networks in geologic layers of four different thicknesses with the corresponding statistical analyses of four different maze caves, formed in a variety of geologic settings. The statistical studies performed were: i) standard box-counting to determine whether either the caves or the model networks are fractal. We found that both are fractal, with a fractal dimension Df ≈ 1.75. ii) For each section inside a closed path, we determined the area and perimeter length, enabling a study of the tortuosity of the networks. From the dependence of a section's area upon its perimeter length, we found a power-law behavior (for sufficiently large sections) characterized by a 'tortuosity' exponent. These exponents have similar values for both the model networks and the maze caves. The best agreement is between our thickest model layer and the maze-like part of Wind Cave in South Dakota, where the data from the model and the cave overlie each other.
For the present networks from the physical model, we assumed that the geologic layer was of uniform thickness and that the strain in both directions was the same. The latter may not be the case for the Brazilian, Toca de Boa Cave. These assumptions can be easily modified in our computer code to reflect different geologic histories. Even so, the quantitative agreement suggests that our model networks are statistically realistic both for the 'forerunners' of caves and for general fracture networks in geologic layers, which should assist the study of underground fluid flow in many applications for which fracture patterns and fluid flow are difficult to determine (e.g., hydrology, watershed management, oil recovery, carbon dioxide sequestration). Keywords: Fracture Networks, Karst, Caves, Structurally Variable Pathways, hydrogeological modeling. [1] Arthur N. Palmer, Cave Geology, Cave Books, Dayton, OH (2007).
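The box-counting analysis used above has a standard form: count the boxes N(s) of side s that the pattern occupies, and fit log N(s) ≈ -D log s for the dimension D. A minimal sketch (not the authors' code), sanity-checked on a straight line whose dimension should come out near 1:

```python
import math

def box_count_dimension(points, sizes):
    """Standard box-counting: count occupied boxes N(s) at each box
    size s, then fit log N(s) ~ -D log s by least squares."""
    xs, ys = [], []
    for s in sizes:
        boxes = {(int(x // s), int(y // s)) for x, y in points}
        xs.append(math.log(s))
        ys.append(math.log(len(boxes)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope          # D is minus the log-log slope

# Sanity check on a filled line segment: D should be close to 1
line = [(i / 4000, i / 4000) for i in range(4001)]
d = box_count_dimension(line, [0.02, 0.05, 0.1, 0.2])
```

Applied to a digitized cave map or model fracture network (a set of occupied x, y points), the same routine yields the Df ≈ 1.75 class of estimate reported above, with accuracy depending on the range of box sizes used in the fit.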
Martin, Lisa; Watanabe, Sharon; Fainsinger, Robin; Lau, Francis; Ghosh, Sunita; Quan, Hue; Atkins, Marlis; Fassbender, Konrad; Downing, G Michael; Baracos, Vickie
2010-10-01
To determine whether elements of a standard nutritional screening assessment are independently prognostic of survival in patients with advanced cancer. A prospective nested cohort of patients with metastatic cancer was accrued from different units of a Regional Palliative Care Program. Patients completed a nutritional screen on admission. Data included age, sex, cancer site, height, weight history, dietary intake, 13 nutrition impact symptoms, and patient- and physician-reported performance status (PS). Univariate and multivariate survival analyses were conducted. Concordance statistics (c-statistics) were used to test the predictive accuracy of models based on training and validation sets; a c-statistic of 0.5 indicates the model predicts the outcome as well as chance, whereas perfect prediction has a c-statistic of 1.0. A training set of patients in palliative home care (n = 1,164) was used to identify prognostic variables. Primary disease site, PS, short-term weight change (either gain or loss), dietary intake, and dysphagia predicted survival in multivariate analysis (P < .05). A model including only disease site and PS yielded high c-statistics between predicted and observed survival in the training set (0.90) and the validation set (0.88; n = 603). The addition of weight change, dietary intake, and dysphagia did not further improve the c-statistic of the model. The c-statistic was also not altered by substituting physician-rated palliative PS for patient-reported PS. We demonstrate a high probability of concordance between predicted and observed survival for patients in distinct palliative care settings (home care, tertiary inpatient, ambulatory outpatient) based on patient-reported information.
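The c-statistic anchoring this abstract has a simple pairwise definition in the binary-outcome case (the survival setting additionally handles censoring, e.g. Harrell's C, which this sketch omits): among all pairs with different outcomes, the fraction in which the model scores the case higher than the non-case. The risk scores below are hypothetical:

```python
def c_statistic(scores, events):
    """Concordance: over all (event, non-event) pairs, the fraction
    where the event receives the higher score; ties count half.
    0.5 = no better than chance, 1.0 = perfect discrimination."""
    pos = [s for s, e in zip(scores, events) if e]
    neg = [s for s, e in zip(scores, events) if not e]
    concordant = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return concordant / (len(pos) * len(neg))

# Hypothetical model risk scores and observed outcomes (1 = event)
c = c_statistic([0.9, 0.8, 0.6, 0.4, 0.2], [1, 1, 0, 1, 0])
```

This makes the abstract's benchmarks concrete: shuffling the scores at random drives the statistic toward 0.5, while the reported 0.88 to 0.90 means nearly nine of every ten discordant-outcome pairs are ranked correctly by disease site and PS alone.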
NASA Astrophysics Data System (ADS)
Miranda, Susan Jennifer
A mixed methods approach was used to identify factors that influence the underrepresentation of Latinos in the domain of science. The researcher investigated the role of family influences, academic preparation, and personal motivations to determine science-related career choices by Latinos. Binary logistic regression analyses were conducted using information from Latinos gathered from the National Education Longitudinal Study of 1988 (NELS:88) administered by the National Center for Education Statistics. For the present study, data were analyzed using participants' responses as high school seniors, college students, and post-baccalaureates. Students responded to questions on school, work, parental academic influences, personal aspirations, and self-perception. To provide more insight into the experiences of Latinos in science and support the statistical analyses, nine students majoring in science in a private, urban university located in the northeastern part of the country were interviewed. Eleven variables related to parents' academic support and students' perceptions of parental support were taken together as predictors for two separate criteria from the survey. These results identified parents' level of education and the importance of academics to parents in their teen's college choice as significant predictors in determining college major in science. When the criterion was degree in science, the significant predictor was the frequency with which parents contacted the high school as volunteers. Student interviews supported this information, demonstrating the importance of parental support in attaining a degree in science. Academic preparation was also analyzed. Students' reasons for taking science classes in high school were a significant predictor for science major; significant predictors for science degree were the emphasis placed on objectives in math and science classes and the number of courses in biology and physics. 
Student interviews supported this information and demonstrated the influence of the students' own motivation on their goals. Survey data were also obtained about the students' test scores and academic achievement. Data collected from the statistical and interview components of the study provided a greater understanding of the lack of Latinos in the sciences as influenced by personal and familial factors.
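The binary logistic regression underlying this study models the probability of a yes/no outcome (e.g., majoring in science) as a sigmoid of a linear predictor. A minimal single-predictor sketch fit by gradient descent; the data, the 0-to-4 coding of parents' education, and the variable names are all hypothetical, not drawn from NELS:88:

```python
import math

def fit_logistic(xs, ys, lr=0.5, steps=2000):
    """Binary logistic regression, one predictor:
    P(y=1 | x) = sigmoid(b0 + b1*x), fit by batch gradient descent
    on the negative log-likelihood."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (p - y) / n          # gradient w.r.t. intercept
            g1 += (p - y) * x / n      # gradient w.r.t. slope
        b0 -= lr * g0
        b1 -= lr * g1
    return b0, b1

# Hypothetical coding: parents' education level (0-4) vs. whether
# the student chose a science major (1 = yes)
xs = [0, 1, 1, 2, 2, 3, 3, 4, 4, 4]
ys = [0, 0, 0, 0, 1, 0, 1, 1, 1, 1]
b0, b1 = fit_logistic(xs, ys)
```

A positive fitted slope b1 means the odds of the outcome rise with the predictor; exp(b1) is the odds ratio per unit increase, which is the quantity typically reported for each of the eleven parental-support variables in an analysis like this one.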
Evidence-based pathology in its second decade: toward probabilistic cognitive computing.
Marchevsky, Alberto M; Walts, Ann E; Wick, Mark R
2017-03-01
Evidence-based pathology advocates using a combination of best available data ("evidence") from the literature and personal experience for the diagnosis, estimation of prognosis, and assessment of other variables that impact individual patient care. Evidence-based pathology relies on systematic reviews of the literature, evaluation of the quality of evidence as categorized by evidence levels and statistical tools such as meta-analyses, estimates of probabilities and odds, and others. However, it is well known that previously "statistically significant" information usually does not accurately forecast the future for individual patients. There is great interest in "cognitive computing" in which "data mining" is combined with "predictive analytics" designed to forecast future events and estimate the strength of those predictions. This study demonstrates the use of IBM Watson Analytics software to evaluate and predict the prognosis of 101 patients with typical and atypical pulmonary carcinoid tumors in which Ki-67 indices have been determined. The results obtained with this system are compared with those previously reported using "routine" statistical software and the help of a professional statistician. IBM Watson Analytics interactively provides statistical results that are comparable to those obtained with routine statistical tools but much more rapidly, with considerably less effort and with interactive graphics that are intuitively easy to apply. It also enables analysis of natural language variables and yields detailed survival predictions for patient subgroups selected by the user. Potential applications of this tool and basic concepts of cognitive computing are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Kadhi, Tau; Holley, D.
2010-01-01
The following report gives the statistical findings of the July 2010 TMSL Bar results. Procedures: Data are pre-existing and were given to the Evaluator by e-mail from the Registrar and Dean. Statistical analyses were run using SPSS 17 to address the following research questions: 1. What are the statistical descriptors of the July 2010 overall TMSL…
Evaluating the consistency of gene sets used in the analysis of bacterial gene expression data.
Tintle, Nathan L; Sitarik, Alexandra; Boerema, Benjamin; Young, Kylie; Best, Aaron A; Dejongh, Matthew
2012-08-08
Statistical analyses of whole genome expression data require functional information about genes in order to yield meaningful biological conclusions. The Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) are common sources of functionally grouped gene sets. For bacteria, the SEED and MicrobesOnline provide alternative, complementary sources of gene sets. To date, no comprehensive evaluation of the data obtained from these resources has been performed. We define a series of gene set consistency metrics directly related to the most common classes of statistical analyses for gene expression data, and then perform a comprehensive analysis of 3581 Affymetrix® gene expression arrays across 17 diverse bacteria. We find that gene sets obtained from GO and KEGG demonstrate lower consistency than those obtained from the SEED and MicrobesOnline, regardless of gene set size. Despite the widespread use of GO and KEGG gene sets in bacterial gene expression data analysis, the SEED and MicrobesOnline provide more consistent sets for a wide variety of statistical analyses. Increased use of the SEED and MicrobesOnline gene sets in the analysis of bacterial gene expression data may improve statistical power and utility of expression data.
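The study's "consistency" idea (genes in a meaningful set should behave coherently across arrays) can be illustrated with one plausible metric, the mean pairwise Pearson correlation of expression profiles within a set. This is an illustrative stand-in, not necessarily one of the paper's defined metrics, and the three five-array profiles are hypothetical:

```python
import math

def mean_pairwise_correlation(profiles):
    """One plausible gene-set consistency metric: the average Pearson
    correlation between the expression profiles of every pair of
    genes in the set, measured across the same arrays."""
    def pearson(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        sa = math.sqrt(sum((x - ma) ** 2 for x in a))
        sb = math.sqrt(sum((y - mb) ** 2 for y in b))
        return cov / (sa * sb)
    rs = [pearson(profiles[i], profiles[j])
          for i in range(len(profiles))
          for j in range(i + 1, len(profiles))]
    return sum(rs) / len(rs)

# Three hypothetical co-regulated genes measured on five arrays;
# tightly co-varying profiles push the score toward 1
coherent = mean_pairwise_correlation(
    [[1, 2, 3, 4, 5], [2, 4, 6, 8, 10], [1.1, 2.2, 2.9, 4.1, 5.0]])
```

Under a metric like this, the paper's finding amounts to SEED and MicrobesOnline sets scoring systematically higher than size-matched GO and KEGG sets across the 3,581 arrays, which is what makes them more powerful inputs for downstream gene-set statistics.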