Dymova, Natalya; Hanumara, R Choudary; Enander, Richard T; Gagnon, Ronald N
2009-10-01
Performance measurement is increasingly viewed as an essential component of environmental and public health protection programs. In characterizing program performance over time, investigators often observe multiple changes resulting from a single intervention across a range of categories. Although a variety of statistical tools allow evaluation of data one variable at a time, the global test statistic is uniquely suited for analyses of categories or groups of interrelated variables. Here we demonstrate how the global test statistic can be applied to environmental and occupational health data for the purpose of making overall statements on the success of targeted intervention strategies.
Ahlborn, W; Tuz, H J; Uberla, K
1990-03-01
In cohort studies the Mantel-Haenszel estimator OR_MH is computed from sample data and is used as a point estimator of relative risk. Test-based confidence intervals are estimated with the help of the asymptotically chi-squared distributed MH statistic χ²_MHS. The Mantel extension chi-squared is used as a test statistic for a dose-response relationship. Both test statistics -- the Mantel-Haenszel chi as well as the Mantel extension chi -- assume homogeneity of risk across strata, which is rarely present. An extended nonparametric statistic proposed by Terpstra, which is based on the Mann-Whitney statistics, likewise assumes homogeneity of risk across strata. We have earlier defined four risk measures RR_kj (k = 1, 2, ..., 4) in the population and considered their estimates and the corresponding asymptotic distributions. In order to overcome the homogeneity assumption we use the delta method to obtain "test-based" confidence intervals. Because the four risk measures RR_kj are presented as functions of four weights g_ik, we consequently give the asymptotic variances of these risk estimators, also as functions of the weights g_ik, in closed form. Approximations to these variances are given. For testing a dose-response relationship we propose a new class of χ²(1)-distributed global measures G_k and the corresponding global χ² test. In contrast to the Mantel extension chi, homogeneity of risk across strata need not be assumed. These global test statistics are of the Wald type for composite hypotheses. (ABSTRACT TRUNCATED AT 250 WORDS)
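For readers who want to reproduce the stratified baseline discussed above, the following is a minimal Python sketch of the Mantel-Haenszel pooled odds ratio OR_MH and the chi-squared statistic χ²_MHS; the 2x2 strata are hypothetical, and the paper's own heterogeneity-robust global measures G_k are not reproduced.

```python
import numpy as np
from scipy.stats import chi2

# Hypothetical 2x2 tables per stratum: rows = exposed/unexposed,
# columns = cases/non-cases; each stratum is (a, b, c, d).
strata = np.array([
    [12, 88, 5, 95],   # stratum 1
    [30, 70, 18, 82],  # stratum 2
    [8, 42, 4, 46],    # stratum 3
], dtype=float)

a, b, c, d = strata.T
n = a + b + c + d

# Mantel-Haenszel pooled odds-ratio estimate
or_mh = np.sum(a * d / n) / np.sum(b * c / n)

# Mantel-Haenszel chi-squared (no continuity correction): compare the
# observed exposed cases with their hypergeometric expectation.
row1, col1 = a + b, a + c
expect = row1 * col1 / n
var = row1 * (c + d) * col1 * (b + d) / (n**2 * (n - 1))
chi2_mhs = (a.sum() - expect.sum())**2 / var.sum()
p_value = chi2.sf(chi2_mhs, df=1)

print(f"OR_MH = {or_mh:.3f}, chi2_MHS = {chi2_mhs:.2f}, p = {p_value:.4f}")
```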
A global goodness-of-fit statistic for Cox regression models.
Parzen, M; Lipsitz, S R
1999-06-01
In this paper, a global goodness-of-fit test statistic for a Cox regression model, which has an approximate chi-squared distribution when the model has been correctly specified, is proposed. Our goodness-of-fit statistic is global and has power to detect if interactions or higher order powers of covariates in the model are needed. The proposed statistic is similar to the Hosmer and Lemeshow (1980, Communications in Statistics A10, 1043-1069) goodness-of-fit statistic for binary data as well as Schoenfeld's (1980, Biometrika 67, 145-153) statistic for the Cox model. The methods are illustrated using data from a Mayo Clinic trial in primary biliary cirrhosis of the liver (Fleming and Harrington, 1991, Counting Processes and Survival Analysis), in which the outcome is the time until liver transplantation or death. There are 17 possible covariates. Two Cox proportional hazards models are fit to the data, and the proposed goodness-of-fit statistic is applied to the fitted models.
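The grouping idea behind statistics of this kind can be sketched without any survival library, given the fitted quantities any Cox model provides. The decile grouping, the degrees of freedom, and all data below are illustrative assumptions in the spirit of the proposed statistic, not its exact definition.

```python
import numpy as np
from scipy.stats import chi2

def grouped_gof(event, expected, risk_score, n_groups=10):
    """Hosmer-Lemeshow-style global GOF check for a fitted Cox model:
    group subjects by fitted risk score and compare observed event
    counts with the model's expected counts (estimated cumulative
    hazards at end of follow-up). A simplified sketch."""
    order = np.argsort(risk_score)
    groups = np.array_split(order, n_groups)
    o = np.array([event[g].sum() for g in groups])
    e = np.array([expected[g].sum() for g in groups])
    stat = np.sum((o - e) ** 2 / e)
    return stat, chi2.sf(stat, df=n_groups - 1)

# Hypothetical fitted quantities for 500 subjects: event indicators,
# model-based expected events, and the linear predictor for grouping.
rng = np.random.default_rng(2)
lp = rng.normal(size=500)                        # risk scores
expected = np.clip(0.3 * np.exp(0.5 * lp), 0, 5)
event = rng.random(500) < 1 - np.exp(-expected)  # consistent with model
print(grouped_gof(event, expected, lp))
```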
An omnibus test for the global null hypothesis.
Futschik, Andreas; Taus, Thomas; Zehetmayer, Sonja
2018-01-01
Global hypothesis tests are a useful tool in the context of clinical trials, genetic studies, or meta-analyses, when researchers are not interested in testing individual hypotheses, but in testing whether none of the hypotheses is false. There are several possibilities how to test the global null hypothesis when the individual null hypotheses are independent. If it is assumed that many of the individual null hypotheses are false, combination tests have been recommended to maximize power. If, however, it is assumed that only one or a few null hypotheses are false, global tests based on individual test statistics are more powerful (e.g. Bonferroni or Simes test). However, usually there is no a priori knowledge on the number of false individual null hypotheses. We therefore propose an omnibus test based on cumulative sums of the transformed p-values. We show that this test yields an impressive overall performance. The proposed method is implemented in an R-package called omnibus.
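The abstract does not spell out the exact transformation used by the omnibus R-package, so the sketch below should be read as one plausible implementation of the idea: a maximum over normalized cumulative sums of the -log-transformed ordered p-values, calibrated by Monte Carlo simulation. The statistic's normalization is an assumption, not the published definition.

```python
import numpy as np

rng = np.random.default_rng(1)

def omnibus_stat(p):
    # Partial means of -log of the ordered p-values: small k behaves
    # like a min-p (Bonferroni-type) test, large k like a Fisher-type
    # combination test; taking the max adapts to few or many signals.
    s = np.cumsum(-np.log(np.sort(p)))
    k = np.arange(1, len(p) + 1)
    return (s / k).max()

def omnibus_test(p, n_sim=5000):
    obs = omnibus_stat(p)
    null = np.array([omnibus_stat(rng.uniform(size=len(p)))
                     for _ in range(n_sim)])
    return (1 + np.sum(null >= obs)) / (1 + n_sim)

# Example: 100 independent hypotheses, 3 of them false
p = rng.uniform(size=100)
p[:3] = rng.uniform(0, 1e-4, size=3)
print("global p-value:", omnibus_test(p))
```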
Decadal power in land air temperatures: Is it statistically significant?
NASA Astrophysics Data System (ADS)
Thejll, Peter A.
2001-12-01
The geographical distribution and properties of the well-known 10-11 year signal in terrestrial temperature records is investigated. By analyzing the Global Historical Climate Network data for surface air temperatures we verify that the signal is strongest in North America and is similar in nature to that reported earlier by R. G. Currie. The decadal signal is statistically significant for individual stations, but it is not possible to show that the signal is statistically significant globally, using strict tests. In North America, during the twentieth century, the decadal variability in the solar activity cycle is associated with the decadal part of the North Atlantic Oscillation index series in such a way that both of these signals correspond to the same spatial pattern of cooling and warming. A method for testing statistical results with Monte Carlo trials on data fields with specified temporal structure and specific spatial correlation retained is presented.
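The Monte Carlo significance-testing idea in the last sentence can be sketched for a single station series; the surrogates below preserve only the lag-1 autocorrelation, whereas the paper additionally retains the spatial correlation across the station field. All data and thresholds are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(13)

def ar1_surrogate(x, rng):
    """Surrogate series with roughly the same lag-1 autocorrelation
    and variance as x (temporal structure preserved)."""
    x = np.asarray(x, float) - np.mean(x)
    r1 = np.corrcoef(x[:-1], x[1:])[0, 1]
    noise = rng.normal(0, np.std(x) * np.sqrt(1 - r1**2), len(x))
    out = np.empty(len(x))
    out[0] = x[0]
    for t in range(1, len(x)):
        out[t] = r1 * out[t - 1] + noise[t]
    return out

def decadal_power(x):
    """Spectral power summed over periods of 10-11 years (annual data)."""
    freqs = np.fft.rfftfreq(len(x), d=1.0)
    power = np.abs(np.fft.rfft(x - np.mean(x)))**2
    band = (freqs >= 1 / 11) & (freqs <= 1 / 10)
    return power[band].sum()

# Hypothetical century-long annual temperature anomaly series
t = np.arange(100)
series = 0.1 * np.sin(2 * np.pi * t / 10.5) + rng.normal(0, 0.3, 100)

obs = decadal_power(series)
null = [decadal_power(ar1_surrogate(series, rng)) for _ in range(2000)]
p = (1 + np.sum(np.array(null) >= obs)) / (1 + len(null))
print(f"decadal-band power p-value: {p:.3f}")
```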
Evaluation of the Air Void Analyzer
2013-07-01
lack of measurement would help explain the difference in values shown. Brief descriptions of other unpublished testing (Wang et al. 2008) CTL Group...structure measurements taken from the controlled laboratory mixtures. A three-phase approach was used to evaluate the machine. First, a global ...method. Hypothesis testing using t-statistics was performed to increase understanding of the data collected globally in terms of the processes used for
BrightStat.com: free statistics online.
Stricker, Daniel
2008-10-01
Powerful software for statistical analysis is expensive. Here I present BrightStat, a statistical software package running on the Internet which is free of charge. BrightStat's goals, its main capabilities and functionalities are outlined. Three different sample runs, a Friedman test, a chi-square test, and a step-wise multiple regression are presented. The results obtained by BrightStat are compared with results computed by SPSS, one of the global leaders in providing statistical software, and by VassarStats, a collection of scripts for data analysis running on the Internet. Elementary statistics is an inherent part of academic education and BrightStat is an alternative to commercial products.
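For comparison, two of the three sample analyses can be run in a few lines of open-source Python; the data below are hypothetical, and the stepwise regression run is omitted for brevity.

```python
from scipy import stats

# Friedman test: three repeated measurements on the same 6 subjects
cond_a = [7.0, 9.9, 8.5, 5.1, 10.3, 8.6]
cond_b = [5.3, 5.7, 4.7, 3.5, 7.7, 6.1]
cond_c = [4.9, 7.6, 5.5, 2.8, 8.4, 5.9]
chi2_f, p_f = stats.friedmanchisquare(cond_a, cond_b, cond_c)
print(f"Friedman: chi2 = {chi2_f:.2f}, p = {p_f:.4f}")

# Chi-square goodness-of-fit test: observed vs. expected counts
observed = [18, 22, 20, 40]
expected = [25, 25, 25, 25]
chi2_g, p_g = stats.chisquare(observed, expected)
print(f"Chi-square: chi2 = {chi2_g:.2f}, p = {p_g:.4f}")
```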
Generalized functional linear models for gene-based case-control association studies.
Fan, Ruzong; Wang, Yifan; Mills, James L; Carter, Tonia C; Lobach, Iryna; Wilson, Alexander F; Bailey-Wilson, Joan E; Weeks, Daniel E; Xiong, Momiao
2014-11-01
By using functional data analysis techniques, we developed generalized functional linear models for testing association between a dichotomous trait and multiple genetic variants in a genetic region while adjusting for covariates. Both fixed and mixed effect models are developed and compared. Extensive simulations show that Rao's efficient score tests of the fixed effect models are very conservative since they generate lower type I errors than nominal levels, and global tests of the mixed effect models generate accurate type I errors. Furthermore, we found that the Rao's efficient score test statistics of the fixed effect models have higher power than the sequence kernel association test (SKAT) and its optimal unified version (SKAT-O) in most cases when the causal variants are both rare and common. When the causal variants are all rare (i.e., minor allele frequencies less than 0.03), the Rao's efficient score test statistics and the global tests have similar or slightly lower power than SKAT and SKAT-O. In practice, it is not known whether rare variants or common variants in a gene region are disease related. All we can assume is that a combination of rare and common variants influences disease susceptibility. Thus, the improved performance of our models when the causal variants are both rare and common shows that the proposed models can be very useful in dissecting complex traits. We compare the performance of our methods with SKAT and SKAT-O on real neural tube defects and Hirschsprung's disease datasets. The Rao's efficient score test statistics and the global tests are more sensitive than SKAT and SKAT-O in the real data analysis. Our methods can be used in either gene-disease genome-wide/exome-wide association studies or candidate gene analyses. © 2014 WILEY PERIODICALS, INC.
Statistical analysis of global horizontal solar irradiation GHI in Fez city, Morocco
NASA Astrophysics Data System (ADS)
Bounoua, Z.; Mechaqrane, A.
2018-05-01
An accurate knowledge of the solar energy reaching the ground is necessary for sizing and optimizing the performance of solar installations. This paper describes a statistical analysis of the global horizontal solar irradiation (GHI) at Fez city, Morocco. For better reliability, we first applied a set of quality-check procedures to the hourly GHI measurements and eliminated erroneous values, which are generally due to measurement errors or the cosine effect. The statistical analysis shows that the annual mean of the daily GHI values is approximately 5 kWh/m²/day. Monthly means of the daily values and other parameters are also calculated.
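A minimal pandas sketch of the processing chain described above, with synthetic hourly data and a crude stand-in for a clear-sky upper envelope; the thresholds are assumptions, not the paper's actual quality-control limits.

```python
import numpy as np
import pandas as pd

# Hypothetical hourly GHI series in W/m2, indexed by timestamp
idx = pd.date_range("2015-01-01", "2015-12-31 23:00", freq="h")
rng = np.random.default_rng(0)
ghi = pd.Series(rng.uniform(0, 900, len(idx)), index=idx)

# Simple quality-control checks in the spirit of the paper: discard
# negative values and values above a clear-sky-like upper envelope
# (the sine-of-hour proxy below is purely illustrative).
elevation = np.clip(np.sin(2 * np.pi * (idx.hour - 6) / 24), 0, None)
upper_envelope = 1100 * elevation
valid = (ghi >= 0) & (ghi <= upper_envelope)
ghi_qc = ghi.where(valid)

# Daily irradiation (kWh/m2/day) and monthly means of the daily values
daily = ghi_qc.resample("D").sum(min_count=1) / 1000.0
monthly_mean_daily = daily.resample("MS").mean()
print(f"annual mean daily GHI: {daily.mean():.2f} kWh/m2/day")
print(monthly_mean_daily.round(2))
```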
Significance levels for studies with correlated test statistics.
Shi, Jianxin; Levinson, Douglas F; Whittemore, Alice S
2008-07-01
When testing large numbers of null hypotheses, one needs to assess the evidence against the global null hypothesis that none of the hypotheses is false. Such evidence typically is based on the test statistic of the largest magnitude, whose statistical significance is evaluated by permuting the sample units to simulate its null distribution. Efron (2007) has noted that correlation among the test statistics can induce substantial interstudy variation in the shapes of their histograms, which may cause misleading tail counts. Here, we show that permutation-based estimates of the overall significance level also can be misleading when the test statistics are correlated. We propose that such estimates be conditioned on a simple measure of the spread of the observed histogram, and we provide a method for obtaining conditional significance levels. We justify this conditioning using the conditionality principle described by Cox and Hinkley (1974). Application of the method to gene expression data illustrates the circumstances when conditional significance levels are needed.
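The permutation baseline that the paper refines can be sketched as follows: the global null is assessed by comparing the largest observed |t| against its permutation distribution. The authors' conditioning on the histogram spread is not reproduced here; the data are synthetic, with a shared latent factor to induce correlation among test statistics.

```python
import numpy as np

rng = np.random.default_rng(7)

def max_abs_t(x, labels):
    # Two-sample t statistic for every feature; return largest magnitude
    g0, g1 = x[labels == 0], x[labels == 1]
    se = np.sqrt(g0.var(axis=0, ddof=1) / len(g0) +
                 g1.var(axis=0, ddof=1) / len(g1))
    return np.abs((g0.mean(axis=0) - g1.mean(axis=0)) / se).max()

# Hypothetical expression matrix: 20 samples x 500 correlated features
n, m = 20, 500
shared = rng.normal(size=(n, 1))
x = 0.6 * shared + rng.normal(size=(n, m))  # shared factor -> correlation
labels = np.repeat([0, 1], n // 2)
x[labels == 1, :5] += 1.5                   # a few differential features

obs = max_abs_t(x, labels)
null = np.array([max_abs_t(x, rng.permutation(labels))
                 for _ in range(2000)])
p_global = (1 + np.sum(null >= obs)) / (1 + len(null))
print(f"max |t| = {obs:.2f}, permutation p = {p_global:.4f}")
```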
NASA Astrophysics Data System (ADS)
Yu, Xiao-Ying; Barnett, J. Matthew; Amidan, Brett G.; Recknagle, Kurtis P.; Flaherty, Julia E.; Antonio, Ernest J.; Glissmeyer, John A.
2018-03-01
The ANSI/HPS N13.1-2011 standard requires gaseous tracer uniformity testing for sampling associated with stacks used in radioactive air emissions. Sulfur hexafluoride (SF6), a greenhouse gas with a high global warming potential, has long been the gas tracer used in such testing. To reduce the impact of gas tracer tests on the environment, nitrous oxide (N2O) was evaluated as a potential replacement for SF6. The physical evaluation included the development of a test plan to record the percent coefficient of variation and the percent maximum deviation between the two gases while considering variables such as fan configuration, injection position, and flow rate. Statistical power was calculated to determine how many sample sets were needed, and computational fluid dynamic modeling was utilized to estimate overall mixing in stacks. Results show there are no significant differences between the behaviors of the two gases, and SF6 modeling corroborated N2O test results. Although, in principle, all tracer gases should behave in an identical manner for measuring mixing within a stack, the series of physical tests guided by statistics was performed to demonstrate the equivalence of N2O testing to SF6 testing in the context of stack qualification tests. The results demonstrate that N2O is a viable choice, leading to a fourfold reduction in global warming impacts for future similar compliance-driven testing.
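The two uniformity metrics named above have standard forms; a sketch follows, with hypothetical traverse measurements (the acceptance criteria of the ANSI/HPS N13.1 standard are not reproduced here).

```python
import numpy as np

def uniformity_metrics(tracer):
    """Percent coefficient of variation and percent maximum deviation
    of tracer concentrations measured across a stack cross-section."""
    tracer = np.asarray(tracer, dtype=float)
    mean = tracer.mean()
    pct_cov = 100.0 * tracer.std(ddof=1) / mean
    pct_max_dev = 100.0 * np.max(np.abs(tracer - mean)) / mean
    return pct_cov, pct_max_dev

# Hypothetical SF6 and N2O traverse measurements (ppm) at the same ports
sf6 = [4.8, 5.1, 5.0, 4.9, 5.2, 5.0, 4.7, 5.1]
n2o = [9.7, 10.2, 10.0, 9.8, 10.3, 10.1, 9.6, 10.2]
for name, data in [("SF6", sf6), ("N2O", n2o)]:
    cov, mdev = uniformity_metrics(data)
    print(f"{name}: %COV = {cov:.1f}, %max deviation = {mdev:.1f}")
```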
Cluster Detection Tests in Spatial Epidemiology: A Global Indicator for Performance Assessment
Guttmann, Aline; Li, Xinran; Feschet, Fabien; Gaudart, Jean; Demongeot, Jacques; Boire, Jean-Yves; Ouchchane, Lemlih
2015-01-01
In cluster detection of disease, the use of local cluster detection tests (CDTs) is common practice. These methods aim both at locating likely clusters and at testing for their statistical significance. New or improved CDTs are regularly proposed to epidemiologists and must be subjected to performance assessment. Because location accuracy has to be considered, performance assessment goes beyond the raw estimation of type I or II errors. As no consensus exists for performance evaluations, heterogeneous methods are used, and therefore studies are rarely comparable. A global indicator of performance, which assesses both spatial accuracy and usual power, would facilitate the exploration of CDTs' behaviour and help between-study comparisons. The Tanimoto coefficient (TC) is a well-known measure of similarity that can assess location accuracy, but only for one detected cluster. In a simulation study, performance is measured over many tests. From the TC, we here propose two statistics, the averaged TC and the cumulated TC, as indicators able to provide a global overview of CDT performance for both usual power and location accuracy. We demonstrate the properties of these two indicators and the superiority of the cumulated TC for assessing performance. We tested these indicators to conduct a systematic spatial assessment displayed through performance maps. PMID:26086911
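The underlying TC is simple to compute. The precise definitions of the averaged and cumulated TC are given in the paper, so the aggregation below is only one plausible reading: an average of TCs over the simulated data sets, plus a cumulated (running-sum) curve over them.

```python
import numpy as np

def tanimoto(detected, true):
    """TC between two sets of spatial units (e.g., grid-cell indices):
    |intersection| / |union|; 1 = perfect match, 0 = disjoint."""
    detected, true = set(detected), set(true)
    union = detected | true
    return len(detected & true) / len(union) if union else 1.0

# Hypothetical simulation study: one true cluster, four CDT runs
true_cluster = {10, 11, 12, 20, 21}
detections = [{10, 11, 20},                 # partial hit
              set(),                        # no detection
              {10, 11, 12, 20, 21, 30},     # hit plus false cells
              {40, 41}]                     # false location

tcs = np.array([tanimoto(d, true_cluster) for d in detections])
averaged_tc = tcs.mean()                      # blends power and accuracy
cumulated_tc = np.cumsum(np.sort(tcs)[::-1])  # one plausible reading
print(f"TCs = {tcs.round(2)}, averaged TC = {averaged_tc:.2f}")
```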
Statistical fluctuations in pedestrian evacuation times and the effect of social contagion
NASA Astrophysics Data System (ADS)
Nicolas, Alexandre; Bouzat, Sebastián; Kuperman, Marcelo N.
2016-08-01
Mathematical models of pedestrian evacuation and the associated simulation software have become essential tools for the assessment of the safety of public facilities and buildings. While a variety of models is now available, their calibration and test against empirical data are generally restricted to global averaged quantities; the statistics compiled from the time series of individual escapes ("microscopic" statistics) measured in recent experiments are thus overlooked. In the same spirit, much research has primarily focused on the average global evacuation time, whereas the whole distribution of evacuation times over some set of realizations should matter. In the present paper we propose and discuss the validity of a simple relation between this distribution and the microscopic statistics, which is theoretically valid in the absence of correlations. To this purpose, we develop a minimal cellular automaton, with features that afford a semiquantitative reproduction of the experimental microscopic statistics. We then introduce a process of social contagion of impatient behavior in the model and show that the simple relation under test may dramatically fail at high contagion strengths, the latter being responsible for the emergence of strong correlations in the system. We conclude with comments on the potential practical relevance for safety science of calculations based on microscopic statistics.
Efficient Global Aerodynamic Modeling from Flight Data
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
2012-01-01
A method for identifying global aerodynamic models from flight data in an efficient manner is explained and demonstrated. A novel experiment design technique was used to obtain dynamic flight data over a range of flight conditions with a single flight maneuver. Multivariate polynomials and polynomial splines were used with orthogonalization techniques and statistical modeling metrics to synthesize global nonlinear aerodynamic models directly and completely from flight data alone. Simulation data and flight data from a subscale twin-engine jet transport aircraft were used to demonstrate the techniques. Results showed that global multivariate nonlinear aerodynamic dependencies could be accurately identified using flight data from a single maneuver. Flight-derived global aerodynamic model structures, model parameter estimates, and associated uncertainties were provided for all six nondimensional force and moment coefficients for the test aircraft. These models were combined with a propulsion model identified from engine ground test data to produce a high-fidelity nonlinear flight simulation very efficiently. Prediction testing using a multi-axis maneuver showed that the identified global model accurately predicted aircraft responses.
Hommel, Gerhard; Bretz, Frank; Maurer, Willi
2011-07-01
Global tests and multiple test procedures are often based on ordered p values. Such procedures are available for arbitrary dependence structures as well as for specific dependence assumptions of the test statistics. Most of these procedures have been considered as global tests. Multiple test procedures can be obtained by applying the closure principle in order to control the familywise error rate, or by using the false discovery rate as a criterion for type I error rate control. We provide an overview and present examples showing the importance of these procedures in medical research. Finally, we discuss modifications when different weights for the hypotheses of interest are chosen.
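Two of the ordered-p-value global tests mentioned above are easy to state precisely; a sketch with hypothetical p-values:

```python
import numpy as np

def bonferroni_global(p):
    """Global p-value: n times the smallest p-value, capped at 1."""
    p = np.asarray(p)
    return min(1.0, p.size * p.min())

def simes_global(p):
    """Simes global p-value: min over i of n * p_(i) / i."""
    p = np.sort(np.asarray(p))
    i = np.arange(1, p.size + 1)
    return min(1.0, np.min(p.size * p / i))

pvals = [0.011, 0.02, 0.04, 0.30, 0.55, 0.80]
print("Bonferroni:", bonferroni_global(pvals))  # driven by one signal
print("Simes:     ", simes_global(pvals))       # gains when several are small
```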
Multiple phenotype association tests using summary statistics in genome-wide association studies.
Liu, Zhonghua; Lin, Xihong
2018-03-01
We study in this article jointly testing the associations of a genetic variant with correlated multiple phenotypes using the summary statistics of individual phenotype analysis from Genome-Wide Association Studies (GWASs). We estimated the between-phenotype correlation matrix using the summary statistics of individual phenotype GWAS analyses, and developed genetic association tests for multiple phenotypes by accounting for between-phenotype correlation without the need to access individual-level data. Since genetic variants often affect multiple phenotypes differently across the genome and the between-phenotype correlation can be arbitrary, we proposed robust and powerful multiple phenotype testing procedures by jointly testing a common mean and a variance component in linear mixed models for summary statistics. We computed the p-values of the proposed tests analytically. This computational advantage makes our methods practically appealing in large-scale GWASs. We performed simulation studies to show that the proposed tests maintained correct type I error rates, and to compare their powers in various settings with the existing methods. We applied the proposed tests to a GWAS Global Lipids Genetics Consortium summary statistics data set and identified additional genetic variants that were missed by the original single-trait analysis. © 2017, The International Biometric Society.
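The simplest member of this family of summary-statistic tests is the omnibus Wald-type statistic z'R⁻¹z; the paper's procedures jointly test a common mean and a variance component in a mixed model, which this sketch does not reproduce. The z-scores and the correlation matrix R (in practice estimated from presumed-null SNPs genome-wide) are hypothetical.

```python
import numpy as np
from scipy.stats import chi2

# Hypothetical per-phenotype z-scores of one SNP from K = 4 GWASs
z = np.array([2.1, 1.8, 2.5, 0.4])

# Between-phenotype correlation, estimated in practice from the
# genome-wide summary statistics of presumed-null SNPs.
R = np.array([[1.0, 0.5, 0.3, 0.1],
              [0.5, 1.0, 0.4, 0.2],
              [0.3, 0.4, 1.0, 0.2],
              [0.1, 0.2, 0.2, 1.0]])

# Omnibus Wald-type test: z' R^{-1} z ~ chi2(K) under the global null
stat = z @ np.linalg.solve(R, z)
p_omnibus = chi2.sf(stat, df=len(z))
print(f"chi2({len(z)}) = {stat:.2f}, p = {p_omnibus:.4f}")
```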
Acceleration techniques in the univariate Lipschitz global optimization
NASA Astrophysics Data System (ADS)
Sergeyev, Yaroslav D.; Kvasov, Dmitri E.; Mukhametzhanov, Marat S.; De Franco, Angela
2016-10-01
Univariate box-constrained Lipschitz global optimization problems are considered in this contribution. Geometric and information statistical approaches are presented. Novel, powerful local tuning and local improvement techniques are described, together with traditional ways to estimate the Lipschitz constant. The advantages of the presented local tuning and local improvement techniques are demonstrated using the operational characteristics approach for comparing deterministic global optimization algorithms on a class of 100 widely used test functions.
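A classical geometric method in this family is the Piyavskii-Shubert algorithm; the sketch below implements it for a standard multiextremal test function, with an overestimated Lipschitz constant supplied by hand (the paper's acceleration techniques, local tuning and local improvement, are not reproduced).

```python
import numpy as np

def shubert(f, a, b, lipschitz, n_iter=80):
    """Piyavskii-Shubert global minimization on [a, b]: repeatedly
    evaluate f at the minimizer of the piecewise lower envelope built
    from the downward cones f(x_i) - L * |x - x_i|."""
    pts = sorted([(a, f(a)), (b, f(b))])
    for _ in range(n_iter):
        # For each adjacent pair, the envelope's minimum sits at the
        # intersection of the two cones.
        best_lb, best_x = np.inf, None
        for (x1, f1), (x2, f2) in zip(pts, pts[1:]):
            xm = 0.5 * (x1 + x2) + (f1 - f2) / (2 * lipschitz)
            lb = 0.5 * (f1 + f2) - 0.5 * lipschitz * (x2 - x1)
            if lb < best_lb:
                best_lb, best_x = lb, xm
        pts.append((best_x, f(best_x)))
        pts.sort()
    return min(pts, key=lambda t: t[1])

# Classic test function; |f'| <= 1 + 10/3 < 4.5 on [0, 10]
f = lambda x: np.sin(x) + np.sin(10.0 * x / 3.0)
x_star, f_star = shubert(f, 0.0, 10.0, lipschitz=4.5)
print(f"x* = {x_star:.4f}, f(x*) = {f_star:.4f}")
```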
Global Fire Trends from Satellite ATSR Instrument Series
NASA Astrophysics Data System (ADS)
Arino, Olivier; Casadio, Stefano; Serpe, Danilo
2010-12-01
Global night-time fire counts for the years from 1995 to 2009 have been obtained by using the latest version of Along Track Scanning Radiometer TOA radiance products (level 1), and related trends have been estimated. Possible biases due to cloud coverage variations have been assumed to be negligible. The sampling number (acquisition frequency) has also been analysed and proved not to influence our results. Global night-time fire trends have been evaluated by inspecting the time series of hot spots aggregated a) at 2°x2° scale; b) at district/country/region/continent scales, and c) globally. The statistical significance of the estimated trend parameters has been verified by means of the Mann-Kendall test. Results indicate that no trends in the absolute number of spots can be identified at the global scale, that there has been no appreciable shift in the fire season during the last fourteen years, and that statistically significant positive and negative trends are only found when data are aggregated at smaller scales.
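A minimal implementation of the Mann-Kendall test used for the trend significance assessment (the annual counts are hypothetical, and the tie and autocorrelation corrections often applied in practice are omitted):

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(y):
    """Mann-Kendall trend test (no tie/autocorrelation corrections)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    s = sum(np.sign(y[j] - y[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, z, 2 * norm.sf(abs(z))

# Hypothetical yearly fire counts for one 2x2 degree cell, 1995-2009
counts = [420, 455, 433, 470, 498, 460, 512, 505, 530, 495,
          540, 560, 552, 580, 575]
s, z, p = mann_kendall(counts)
print(f"S = {s}, z = {z:.2f}, p = {p:.4f}")
```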
NASA Astrophysics Data System (ADS)
Shih, A. L.; Liu, J. Y. G.
2015-12-01
A median-based method and a z test are employed to identify characteristics of seismo-ionospheric precursors (SIPs) in the total electron content (TEC) of the global ionosphere map (GIM) associated with 129 M≥5.5 earthquakes in Taiwan during 1999-2014. Results show that both negative and positive anomalies in the GIM TEC, statistically significant under the z test, appear a few days before the earthquakes. The receiver operating characteristic (ROC) curve is further applied to examine whether the SIPs exist in Taiwan.
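The median-based anomaly detection can be sketched as follows; the 15-day window and the 1.5 x IQR threshold are illustrative assumptions, not necessarily the paper's choices.

```python
import numpy as np

def tec_anomalies(tec, window=15, k=1.5):
    """Flag anomalous daily TEC values against the running median of
    the preceding `window` days, using an interquartile-based spread."""
    tec = np.asarray(tec, dtype=float)
    flags = np.zeros(len(tec), dtype=int)
    for t in range(window, len(tec)):
        past = tec[t - window:t]
        med = np.median(past)
        iqr = np.subtract(*np.percentile(past, [75, 25]))
        if tec[t] > med + k * iqr:
            flags[t] = 1      # positive anomaly
        elif tec[t] < med - k * iqr:
            flags[t] = -1     # negative anomaly
    return flags

rng = np.random.default_rng(3)
tec = 30 + 3 * rng.normal(size=120)   # hypothetical daily TEC (TECU)
tec[100] += 12                        # injected positive anomaly
print(np.nonzero(tec_anomalies(tec))[0])
```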
Tanglertsampan, Chuchai
2012-10-01
Topical minoxidil and oral finasteride have been used to treat men with androgenetic alopecia (AGA). There are concerns about side effects of oral finasteride, especially erectile dysfunction. The aim was to compare the efficacy and safety of 24 weeks of treatment with 3% minoxidil lotion (MNX) versus a combined 3% minoxidil and 0.1% finasteride lotion (MFX) in men with AGA. Forty men with AGA were randomly assigned to treatment with MNX or MFX. Efficacy was evaluated by hair counts and global photographic assessment; safety assessment was performed by history and physical examination. At week 24, hair counts were increased from baseline in both groups, but the paired t-test revealed a statistically significant increase only in the MFX group (p = 0.044). The unpaired t-test revealed no statistically significant difference between the two groups with respect to the change in hair counts at 24 weeks from baseline (p = 0.503). MFX showed significantly higher efficacy than MNX by global photographic assessment (p = 0.003). There was no significant difference in side effects between the groups, and no sexual side effects occurred. Although the change in hair counts was not statistically different between the two groups, global photographic assessment showed significantly greater improvement in the MFX group than in the MNX group. MFX may be a safe and effective treatment option.
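The two hypothesis tests reported above are reproducible in a few lines with SciPy; the hair-count values below are hypothetical, not the trial's data.

```python
from scipy import stats

# Hypothetical hair counts (hairs/cm2) at baseline and week 24
mnx_base, mnx_24 = [152, 148, 160, 155, 149], [158, 150, 166, 159, 152]
mfx_base, mfx_24 = [150, 147, 158, 154, 151], [162, 158, 171, 165, 163]

# Within-group change from baseline: paired t-test
t_mnx, p_mnx = stats.ttest_rel(mnx_24, mnx_base)
t_mfx, p_mfx = stats.ttest_rel(mfx_24, mfx_base)

# Between-group comparison of the changes: unpaired t-test
mnx_change = [a - b for a, b in zip(mnx_24, mnx_base)]
mfx_change = [a - b for a, b in zip(mfx_24, mfx_base)]
t_bg, p_bg = stats.ttest_ind(mfx_change, mnx_change)

print(f"MNX within-group: p = {p_mnx:.3f}")
print(f"MFX within-group: p = {p_mfx:.3f}")
print(f"Between groups:   p = {p_bg:.3f}")
```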
User Selection Criteria of Airspace Designs in Flexible Airspace Management
NASA Technical Reports Server (NTRS)
Lee, Hwasoo E.; Lee, Paul U.; Jung, Jaewoo; Lai, Chok Fung
2011-01-01
Gontscharuk, Veronika; Landwehr, Sandra; Finner, Helmut
2015-01-01
The higher criticism (HC) statistic, which can be seen as a normalized version of the famous Kolmogorov-Smirnov statistic, has a long history, dating back to the mid-seventies. Originally, HC statistics were used in connection with goodness of fit (GOF) tests but they recently gained some attention in the context of testing the global null hypothesis in high dimensional data. The continuing interest in HC seems to be inspired by a series of nice asymptotic properties related to this statistic. For example, unlike Kolmogorov-Smirnov tests, GOF tests based on the HC statistic are known to be asymptotically sensitive in the moderate tails; hence the HC statistic is favorably applied for detecting the presence of signals in sparse mixture models. However, some questions around the asymptotic behavior of the HC statistic are still open. We focus on two of them, namely, why a specific intermediate range is crucial for GOF tests based on the HC statistic and why the convergence of the HC distribution to the limiting one is extremely slow. Moreover, the inconsistency between the asymptotic and finite-sample behavior of the HC statistic prompts us to provide a new HC test that has better finite-sample properties than the original HC test while showing the same asymptotics. This test is motivated by the asymptotic behavior of the so-called local levels related to the original HC test. By means of numerical calculations and simulations we show that the new HC test is typically more powerful than the original HC test in normal mixture models. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
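For reference, the original HC statistic has a compact form; a sketch of the Donoho-Jin version with the usual intermediate-range restriction follows (the modified HC test proposed in the paper, based on local levels, is not reproduced).

```python
import numpy as np

def higher_criticism(p, alpha0=0.5):
    """Donoho-Jin HC statistic: maximize the standardized exceedance
    of the ordered p-values over an intermediate range i/n <= alpha0,
    with the common restriction p_(i) > 1/n."""
    p = np.sort(np.asarray(p))
    n = p.size
    i = np.arange(1, n + 1)
    hc = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1 - p))
    keep = (i / n <= alpha0) & (p > 1.0 / n)
    return hc[keep].max()

rng = np.random.default_rng(11)
null_p = rng.uniform(size=10_000)
sparse = null_p.copy()
sparse[:30] = rng.uniform(0, 1e-4, 30)   # a few sparse signals
print(f"HC under H0:     {higher_criticism(null_p):.2f}")
print(f"HC with signals: {higher_criticism(sparse):.2f}")
```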
NASA Astrophysics Data System (ADS)
Thomas, J. N.; Huard, J.; Masci, F.
2017-02-01
There are many reports on the occurrence of anomalous changes in the ionosphere prior to large earthquakes. However, whether or not these changes are reliable precursors that could be useful for earthquake prediction is controversial within the scientific community. To test a possible statistical relationship between ionospheric disturbances and earthquakes, we compare changes in the total electron content (TEC) of the ionosphere with occurrences of M ≥ 6.0 earthquakes globally for 2000-2014. We use TEC data from the global ionosphere map (GIM) and an earthquake list declustered for aftershocks. For each earthquake, we look for anomalous changes in GIM-TEC within 2.5° latitude and 5.0° longitude of the earthquake location (the spatial resolution of GIM-TEC). Our analysis has not found any statistically significant changes in GIM-TEC prior to earthquakes. Thus, we have found no evidence that would suggest that monitoring changes in GIM-TEC might be useful for predicting earthquakes.
A global approach to estimate irrigated areas - a comparison between different data and statistics
NASA Astrophysics Data System (ADS)
Meier, Jonas; Zabel, Florian; Mauser, Wolfram
2018-02-01
Agriculture is the largest global consumer of water. Irrigated areas constitute 40 % of the total area used for agricultural production (FAO, 2014a). Information on their spatial distribution is highly relevant for regional water management and food security, and for policy and decision makers facing the transition towards more efficient, sustainable agriculture. However, the mapping of irrigated areas still represents a challenge for land use classifications, and existing global data sets differ strongly in their results. The following study tests an existing irrigation map based on statistics and extends the irrigated area using ancillary data. The approach processes and analyzes multi-temporal normalized difference vegetation index (NDVI) SPOT-VGT data and agricultural suitability data - both at a spatial resolution of 30 arcsec - incrementally in a multiple decision tree. It covers the period from 1999 to 2012. The results globally show an 18 % larger irrigated area than existing approaches based on statistical data. The largest differences compared to the official national statistics are found in Asia, particularly in China and India. The additional areas are mainly identified within already known irrigated regions where irrigation is denser than previously estimated. The validation with global and regional products shows the large divergence of existing data sets with respect to the size and distribution of irrigated areas, caused by spatial resolution, the considered time period, and the input data and assumptions made.
Identifiability of PBPK Models with Applications to ...
Any statistical model should be identifiable in order for estimates and tests using it to be meaningful. We consider statistical analysis of physiologically-based pharmacokinetic (PBPK) models in which parameters cannot be estimated precisely from available data, and discuss different types of identifiability that occur in PBPK models and give reasons why they occur. We particularly focus on how the mathematical structure of a PBPK model and lack of appropriate data can lead to statistical models in which it is impossible to estimate at least some parameters precisely. Methods are reviewed which can determine whether a purely linear PBPK model is globally identifiable. We propose a theorem which determines when identifiability at a set of finite and specific values of the mathematical PBPK model (global discrete identifiability) implies identifiability of the statistical model. However, we are unable to establish conditions that imply global discrete identifiability, and conclude that the only safe approach to analysis of PBPK models involves Bayesian analysis with truncated priors. Finally, computational issues regarding posterior simulations of PBPK models are discussed. The methodology is very general and can be applied to numerous PBPK models which can be expressed as linear time-invariant systems. A real data set of a PBPK model for exposure to dimethyl arsinic acid (DMA(V)) is presented to illustrate the proposed methodology.
Gregori, Josep; Méndez, Olga; Katsila, Theodora; Pujals, Mireia; Salvans, Cándida; Villarreal, Laura; Arribas, Joaquin; Tabernero, Josep; Sánchez, Alex; Villanueva, Josep
2014-07-15
Secretome profiling has become a methodology of choice for the identification of tumor biomarkers. We hypothesized that, due to the dynamic nature of secretomes, cellular perturbations could affect not only their composition but also the global amount of protein secreted per cell. We confirmed our hypothesis by measuring the levels of secreted proteins taking into account the amount of proteome produced per cell. Then, we established a correlation between cell proliferation and protein secretion that explained the observed changes in global protein secretion. Next, we implemented, within a generalized linear model (GLM), a normalization that corrects the statistical results of secretome studies for the global protein secretion of the cells. The application of the normalization to two biological perturbations of tumor cells resulted in drastic changes in the list of statistically significant proteins. Furthermore, we found that known epithelial-to-mesenchymal transition (EMT) effectors were only statistically significant when the normalization was applied. Therefore, the normalization proposed here increases the sensitivity of statistical tests by increasing the number of true positives. From an oncology perspective, the correlation between protein secretion and cellular proliferation suggests that slow-growing tumors could have high protein secretion rates and consequently contribute strongly to tumor paracrine signaling.
Global, Local, and Graphical Person-Fit Analysis Using Person-Response Functions
ERIC Educational Resources Information Center
Emons, Wilco H. M.; Sijtsma, Klaas; Meijer, Rob R.
2005-01-01
Person-fit statistics test whether the likelihood of a respondent's complete vector of item scores on a test is low given the hypothesized item response theory model. This binary information may be insufficient for diagnosing the cause of a misfitting item-score vector. The authors propose a comprehensive methodology for person-fit analysis in the…
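A concrete example of a global person-fit statistic of this kind is the standardized log-likelihood statistic l_z; a minimal sketch under hypothetical item probabilities follows (the authors' person-response-function methodology goes beyond this single number).

```python
import numpy as np

def lz_person_fit(u, p):
    """Standardized log-likelihood person-fit statistic l_z for one
    respondent: item scores u (0/1) and model-implied success
    probabilities p under the fitted IRT model."""
    u, p = np.asarray(u, float), np.asarray(p, float)
    l0 = np.sum(u * np.log(p) + (1 - u) * np.log(1 - p))
    e = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
    v = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)
    return (l0 - e) / np.sqrt(v)

# Hypothetical 10-item test: model probabilities for one examinee
p = np.array([0.9, 0.85, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1])
typical = np.array([1, 1, 1, 1, 1, 0, 1, 0, 0, 0])   # fits the model
aberrant = np.array([0, 0, 0, 0, 0, 1, 0, 1, 1, 1])  # reversed pattern
print(f"l_z typical:  {lz_person_fit(typical, p):+.2f}")
print(f"l_z aberrant: {lz_person_fit(aberrant, p):+.2f}")  # large negative
```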
Global Active Stretching (SGA®) Practice for Judo Practitioners’ Physical Performance Enhancement
ALMEIDA, HELENO; DE SOUZA, RAPHAEL F.; AIDAR, FELIPE J.; DA SILVA, ALISSON G.; REGI, RICARDO P.; BASTOS, AFRÂNIO A.
2018-01-01
In order to analyze the effect of Global Active Stretching (SGA®) practice on physical performance in judo competitors, 12 male athletes from the Judo Federation of Sergipe (Federação Sergipana de Judô) were divided into two groups: an Experimental Group (EG) and a Control Group (CG). For 10 weeks, the EG practiced SGA® self-postures and the CG practiced assorted calisthenic exercises. All athletes were submitted to a battery of tests (before and after): handgrip strength, flexibility, upper limbs' muscle power, isometric pull-up force, lower limbs' muscle power (squat jump, SJ, and countermovement jump, CMJ), and the Tokui Waza test. Because of the small sample, the data were treated as non-parametric and the Wilcoxon test was applied using the software R version 3.3.2 (R Development Core Team, Austria). Effect sizes were calculated, and values of p ≤ 0.05 were considered statistically significant. The EG showed statistically significant before-after differences in flexibility, upper limbs' muscle power, and lower limbs' muscle power (CMJ), with gains of 3.00 ± 1.09 cm, 0.42 ± 0.51 m, and 2.49 ± 0.63 cm, respectively, whereas the CG presented a statistically significant difference only in the lower limbs' CMJ test, with a gain of 0.55 ± 2.28 cm. The regular 10-week practice of SGA® self-postures increased the judo practitioners' posterior chain flexibility and vertical jumping (CMJ) performance. PMID:29795746
Evaluation and Applications of Cloud Climatologies from CALIOP
NASA Technical Reports Server (NTRS)
Winker, David; Getzewitch, Brian; Vaughan, Mark
2008-01-01
Clouds have a major impact on the Earth radiation budget and differences in the representation of clouds in global climate models are responsible for much of the spread in predicted climate sensitivity. Existing cloud climatologies, against which these models can be tested, have many limitations. The CALIOP lidar, carried on the CALIPSO satellite, has now acquired over two years of nearly continuous cloud and aerosol observations. This dataset provides an improved basis for the characterization of 3-D global cloudiness. Global average cloud cover measured by CALIOP is about 75%, significantly higher than for existing cloud climatologies due to the sensitivity of CALIOP to optically thin cloud. Day/night biases in cloud detection appear to be small. This presentation will discuss detection sensitivity and other issues associated with producing a cloud climatology, characteristics of cloud cover statistics derived from CALIOP data, and applications of those statistics.
Characterizing and Addressing the Need for Statistical Adjustment of Global Climate Model Data
NASA Astrophysics Data System (ADS)
White, K. D.; Baker, B.; Mueller, C.; Villarini, G.; Foley, P.; Friedman, D.
2017-12-01
As part of its mission to research and measure the effects of the changing climate, the U. S. Army Corps of Engineers (USACE) regularly uses the World Climate Research Programme's Coupled Model Intercomparison Project Phase 5 (CMIP5) multi-model dataset. However, these data are generated at a global level and are not fine-tuned for specific watersheds. This often causes CMIP5 output to vary from locally observed patterns in the climate. Several downscaling methods have been developed to increase the resolution of the CMIP5 data and decrease systemic differences, in order to support decision-makers as they evaluate results at the watershed scale. Preliminary comparisons of observed and projected flow frequency curves over the US revealed a simple framework for water resources decision makers to plan and design water resources management measures under changing conditions using standard tools. Using this framework as a basis, USACE has begun to explore the use of statistical adjustment to alter global climate model data to better match locally observed patterns while preserving the general structure and behavior of the model data. When paired with careful measurement and hypothesis testing, statistical adjustment can be particularly effective at navigating the compromise between locally observed patterns and global climate model structures for decision makers.
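One widely used form of statistical adjustment consistent with this description is empirical quantile mapping; the sketch below is a generic illustration with synthetic data, not USACE's specific procedure.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping: build a transfer function so the
    corrected historical model run matches the observed distribution,
    then apply the same mapping to future projections."""
    quantiles = np.linspace(0.01, 0.99, 99)
    mq = np.quantile(model_hist, quantiles)
    oq = np.quantile(obs_hist, quantiles)
    # Interpolate each future value through the model -> obs mapping
    return np.interp(model_future, mq, oq)

rng = np.random.default_rng(5)
obs_hist = rng.gamma(shape=2.0, scale=3.0, size=5000)      # observed flows
model_hist = rng.gamma(shape=2.5, scale=2.0, size=5000)    # biased model
model_future = rng.gamma(shape=2.5, scale=2.4, size=5000)  # warmer run
corrected = quantile_map(model_hist, obs_hist, model_future)
print(f"raw future mean: {model_future.mean():.2f}, "
      f"corrected: {corrected.mean():.2f}, obs: {obs_hist.mean():.2f}")
```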
Dispositional optimism and sleep quality: a test of mediating pathways.
Uchino, Bert N; Cribbet, Matthew; de Grey, Robert G Kent; Cronan, Sierra; Trettevik, Ryan; Smith, Timothy W
2017-04-01
Dispositional optimism has been related to beneficial influences on physical health outcomes. However, its links to global sleep quality and the psychological mediators responsible for such associations are less studied. This study thus examined if trait optimism predicted global sleep quality, and if measures of subjective well-being were statistical mediators of such links. A community sample of 175 participants (93 men, 82 women) completed measures of trait optimism, depression, and life satisfaction. Global sleep quality was assessed using the Pittsburgh Sleep Quality Index. Results indicated that trait optimism was a strong predictor of better PSQI global sleep quality. Moreover, this association was mediated by depression and life satisfaction in both single and multiple mediator models. These results highlight the importance of optimism for the restorative process of sleep, as well as the utility of multiple mediator models in testing distinct psychological pathways.
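The single-mediator model described above can be sketched with a bootstrap confidence interval for the indirect effect; the simulated data below only mimic the reported direction of effects, and the variable names and coefficients are assumptions.

```python
import numpy as np

rng = np.random.default_rng(21)

def indirect_effect(x, m, y):
    """Simple mediation: a-path (x -> m) times b-path (m -> y | x)."""
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([np.ones_like(x), m, x])
    b = np.linalg.lstsq(X, y, rcond=None)[0][1]
    return a * b

# Hypothetical data mimicking the design: optimism -> depression -> sleep
n = 175
optimism = rng.normal(size=n)
depression = -0.5 * optimism + rng.normal(size=n)
sleep_quality = -0.6 * depression + 0.1 * optimism + rng.normal(size=n)

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(optimism[idx], depression[idx],
                                sleep_quality[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect 95% CI: [{lo:.3f}, {hi:.3f}]")
```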
NASA Astrophysics Data System (ADS)
Thomas, J. N.; Huard, J.; Masci, F.
2015-12-01
There are many published reports of anomalous changes in the ionosphere prior to large earthquakes. However, whether or not these ionospheric changes are reliable precursors that could be useful for earthquake prediction is controversial within the scientific community. To test a possible statistical relationship between the ionosphere and earthquakes, we compare changes in the total electron content (TEC) of the ionosphere with occurrences of M≥6.0 earthquakes globally for a multiyear period. We use TEC data from a global ionosphere map (GIM) and an earthquake list declustered for aftershocks. For each earthquake, we look for anomalous changes in TEC within ±30 days of the earthquake time and within 2.5° latitude and 5.0° longitude of the earthquake location (the spatial resolution of GIM). Our preliminary analysis, using global TEC and earthquake data for 2002-2010, has not found any statistically significant changes in TEC prior to earthquakes. Thus, we have found no evidence that would suggest that TEC changes are useful for earthquake prediction. Our results are discussed in the context of prior statistical and case studies. Namely, our results agree with Dautermann et al. (2007) who found no relationship between TEC changes and earthquakes in the San Andreas fault region. Whereas, our results disagree with Le et al. (2011) who found an increased rate in TEC anomalies within a few days before global earthquakes M≥6.0.
A scan statistic to extract causal gene clusters from case-control genome-wide rare CNV data.
Nishiyama, Takeshi; Takahashi, Kunihiko; Tango, Toshiro; Pinto, Dalila; Scherer, Stephen W; Takami, Satoshi; Kishino, Hirohisa
2011-05-26
Several statistical tests have been developed for analyzing genome-wide association data by incorporating gene pathway information in terms of gene sets. Using these methods, hundreds of gene sets are typically tested, and the tested gene sets often overlap. This overlapping greatly increases the probability of generating false positives, and the results obtained are difficult to interpret, particularly when many gene sets show statistical significance. We propose a flexible statistical framework to circumvent these problems. Inspired by spatial scan statistics for detecting clustering of disease occurrence in the field of epidemiology, we developed a scan statistic to extract disease-associated gene clusters from a whole gene pathway. Extracting one or a few significant gene clusters from a global pathway limits the overall false positive probability, which results in increased statistical power, and facilitates the interpretation of test results. In the present study, we applied our method to genome-wide association data for rare copy-number variations, which have been strongly implicated in common diseases. Application of our method to a simulated dataset demonstrated the high accuracy of this method in detecting disease-associated gene clusters in a whole gene pathway. The scan statistic approach proposed here shows a high level of accuracy in detecting gene clusters in a whole gene pathway. This study has provided a sound statistical framework for analyzing genome-wide rare CNV data by incorporating topological information on the gene pathway.
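The sketch below illustrates the scan-statistic idea on a 1-D ordering of genes (the paper scans clusters on the pathway topology itself): maximize a binomial likelihood-ratio over contiguous windows and calibrate by Monte Carlo redistribution of case counts. All counts are hypothetical.

```python
import numpy as np

def _ll(c, n):
    """Binomial log-likelihood at its MLE c/n (with 0 log 0 := 0)."""
    out = 0.0
    if c > 0:
        out += c * np.log(c / n)
    if n - c > 0:
        out += (n - c) * np.log((n - c) / n)
    return out

def scan_statistic(cases, controls, max_len=6):
    """Largest likelihood ratio over contiguous gene windows with an
    elevated case rate (1-D stand-in for a pathway scan)."""
    cases, controls = np.asarray(cases), np.asarray(controls)
    total = cases + controls
    C, N = cases.sum(), total.sum()
    best_llr, best_win = 0.0, None
    for i in range(len(cases)):
        for j in range(i + 1, min(i + max_len, len(cases)) + 1):
            c, n = cases[i:j].sum(), total[i:j].sum()
            if n in (0, N) or c / n <= (C - c) / (N - n):
                continue
            llr = _ll(c, n) + _ll(C - c, N - n) - _ll(C, N)
            if llr > best_llr:
                best_llr, best_win = llr, (i, j)
    return best_llr, best_win

def scan_p_value(cases, controls, n_sim=999, seed=0, **kw):
    """Monte Carlo p-value: redistribute the case total over genes."""
    rng = np.random.default_rng(seed)
    total = np.asarray(cases) + np.asarray(controls)
    obs, win = scan_statistic(cases, controls, **kw)
    hits = sum(scan_statistic(perm, total - perm, **kw)[0] >= obs
               for perm in (rng.multivariate_hypergeometric(total, sum(cases))
                            for _ in range(n_sim)))
    return win, obs, (1 + hits) / (1 + n_sim)

cases = [3, 2, 4, 9, 11, 8, 2, 1, 3, 2, 4, 3]
ctrls = [5, 4, 5, 3, 2, 3, 4, 5, 4, 5, 3, 4]
print(scan_p_value(cases, ctrls))
```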
NASA Astrophysics Data System (ADS)
Acharya, S.; Kaplan, D. A.; Casey, S.; Cohen, M. J.; Jawitz, J. W.
2015-05-01
Self-organized landscape patterning can arise in response to multiple processes. Discriminating among alternative patterning mechanisms, particularly where experimental manipulations are untenable, requires process-based models. Previous modeling studies have attributed patterning in the Everglades (Florida, USA) to sediment redistribution and anisotropic soil hydraulic properties. In this work, we tested an alternate theory, the self-organizing-canal (SOC) hypothesis, by developing a cellular automata model that simulates pattern evolution via local positive feedbacks (i.e., facilitation) coupled with a global negative feedback based on hydrology. The model is forced by global hydroperiod that drives stochastic transitions between two patch types: ridge (higher elevation) and slough (lower elevation). We evaluated model performance using multiple criteria based on six statistical and geostatistical properties observed in reference portions of the Everglades landscape: patch density, patch anisotropy, semivariogram ranges, power-law scaling of ridge areas, perimeter area fractal dimension, and characteristic pattern wavelength. Model results showed strong statistical agreement with reference landscapes, but only when anisotropically acting local facilitation was coupled with hydrologic global feedback, for which several plausible mechanisms exist. Critically, the model correctly generated fractal landscapes that had no characteristic pattern wavelength, supporting the invocation of global rather than scale-specific negative feedbacks.
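A stripped-down version of such a cellular automaton fits in a few lines; the kernel weights, rates, and the form of the hydrologic feedback below are illustrative assumptions, not the calibrated model of the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def run_soc_model(shape=(100, 100), steps=200, target_ridge=0.5):
    """Minimal sketch of the ridge/slough automaton described above:
    anisotropic local facilitation plus a global feedback that nudges
    total ridge cover toward a hydroperiod-set equilibrium."""
    ridge = rng.random(shape) < 0.3   # True = ridge, False = slough
    for _ in range(steps):
        # Anisotropic neighborhood: neighbors along the flow axis count
        # more than lateral neighbors (elongates patches downstream).
        up = np.roll(ridge, 1, axis=0); down = np.roll(ridge, -1, axis=0)
        left = np.roll(ridge, 1, axis=1); right = np.roll(ridge, -1, axis=1)
        facilitation = 0.4 * (up + down) + 0.1 * (left + right)
        # Global negative feedback: as ridge cover exceeds the target,
        # deeper water suppresses expansion and promotes ridge loss.
        cover = ridge.mean()
        p_gain = np.clip(facilitation * (target_ridge - cover + 0.5), 0, 1)
        p_loss = np.clip(0.1 * (cover - target_ridge + 0.1), 0, 1)
        u = rng.random(shape)
        ridge = np.where(ridge, u > p_loss, u < 0.05 * p_gain)
    return ridge

pattern = run_soc_model()
print(f"final ridge cover: {pattern.mean():.2f}")
```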
SERE: single-parameter quality control and sample comparison for RNA-Seq.
Schulze, Stefan K; Kanwar, Rahul; Gölzenleuchter, Meike; Therneau, Terry M; Beutler, Andreas S
2012-10-03
Assessing the reliability of experimental replicates (or global alterations corresponding to different experimental conditions) is a critical step in analyzing RNA-Seq data. Pearson's correlation coefficient r has been widely used in the RNA-Seq field even though its statistical characteristics may be poorly suited to the task. Here we present a single-parameter test procedure for count data, the Simple Error Ratio Estimate (SERE), that can determine whether two RNA-Seq libraries are faithful replicates or globally different. Benchmarking shows that the interpretation of SERE is unambiguous regardless of the total read count or the range of expression differences among bins (exons or genes), a score of 1 indicating faithful replication (i.e., samples are affected only by Poisson variation of individual counts), a score of 0 indicating data duplication, and scores >1 corresponding to true global differences between RNA-Seq libraries. On the contrary the interpretation of Pearson's r is generally ambiguous and highly dependent on sequencing depth and the range of expression levels inherent to the sample (difference between lowest and highest bin count). Cohen's simple Kappa results are also ambiguous and are highly dependent on the choice of bins. For quantifying global sample differences SERE performs similarly to a measure based on the negative binomial distribution yet is simpler to compute. SERE can therefore serve as a straightforward and reliable statistical procedure for the global assessment of pairs or large groups of RNA-Seq datasets by a single statistical parameter.
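From the description above, SERE can be read as the square root of the Poisson chi-square dispersion across libraries; the normalization details in the sketch below are a plausible reconstruction, not necessarily the paper's exact estimator.

```python
import numpy as np

def sere(counts):
    """Simple Error Ratio Estimate for an n_bins x n_samples count
    matrix: sqrt of the Poisson chi-square dispersion across samples
    (about 1 for faithful replicates, > 1 for global differences)."""
    y = np.asarray(counts, dtype=float)
    depth = y.sum(axis=0)                   # library sizes
    totals = y.sum(axis=1, keepdims=True)   # per-bin totals
    expected = totals * depth / depth.sum()
    keep = totals.ravel() > 0
    chi2 = ((y[keep] - expected[keep]) ** 2 / expected[keep]).sum()
    dof = keep.sum() * (y.shape[1] - 1)
    return np.sqrt(chi2 / dof)

rng = np.random.default_rng(9)
mu = rng.gamma(2.0, 50.0, size=2000)   # per-gene expression levels
replicates = rng.poisson(np.column_stack([mu, mu]))
different = rng.poisson(np.column_stack([mu, mu * rng.gamma(5, 0.2, 2000)]))
print(f"SERE replicates: {sere(replicates):.2f}")  # ~1
print(f"SERE different:  {sere(different):.2f}")   # > 1
```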
Analysis of statistical misconception in terms of statistical reasoning
NASA Astrophysics Data System (ADS)
Maryati, I.; Priatna, N.
2018-05-01
Reasoning skill is needed by everyone in the globalization era, because every person must be able to manage and use information that can now be obtained easily from all over the world. Statistical reasoning skill is the ability to collect, group, process, and interpret information and to draw conclusions from it. This skill can be developed at various levels of education. However, it remains low because many people, students included, assume that statistics is merely the ability to count and to use formulas. Students also still have a negative attitude toward coursework related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample consisted of 32 students of the mathematics education department who had taken the descriptive statistics course. The mean score on the misconception test was 49.7 (standard deviation 10.6), whereas the mean score on the statistical reasoning skill test was 51.8 (standard deviation 8.5). Against a minimum passing score of 65 for the course competence, the students' mean scores fall below the standard. The misconception results emphasize which subtopics should be given attention. Based on the assessment, students' misconceptions occur in: 1) writing mathematical sentences and symbols correctly, 2) understanding basic definitions, and 3) determining which concept to use in solving a problem. Statistical reasoning skill was assessed for reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.
Statistical classification of drug incidents due to look-alike sound-alike mix-ups.
Wong, Zoie Shui Yee
2016-06-01
It has been recognised that medication names that look or sound similar are a cause of medication errors. This study builds statistical classifiers for identifying medication incidents due to look-alike sound-alike mix-ups. A total of 227 patient safety incident advisories related to medication were obtained from the Canadian Patient Safety Institute's Global Patient Safety Alerts system. Eight feature selection strategies based on frequent terms, frequent drug terms and constituent terms were performed. Statistical text classifiers based on logistic regression, support vector machines with linear, polynomial, radial-basis and sigmoid kernels and decision tree were trained and tested. The models developed achieved an average accuracy of above 0.8 across all the model settings. The receiver operating characteristic curves indicated the classifiers performed reasonably well. The results obtained in this study suggest that statistical text classification can be a feasible method for identifying medication incidents due to look-alike sound-alike mix-ups based on a database of advisories from Global Patient Safety Alerts.
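A pipeline of the kind evaluated above (here, the logistic-regression variant) can be sketched with scikit-learn; the corpus and labels below are invented stand-ins for the 227 advisories, which are not reproduced here:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    reports = [  # hypothetical incident narratives
        "patient given hydroxyzine instead of hydralazine at evening dose",
        "celebrex dispensed in place of celexa due to similar packaging",
        "order for vinblastine entered as vincristine in pharmacy system",
        "zantac and zyrtec confused during verbal order",
        "iv line disconnected during patient transfer",
        "dose omitted because medication was out of stock",
        "allergy to penicillin not recorded before administration",
        "infusion pump programmed with wrong rate",
    ]
    labels = [1, 1, 1, 1, 0, 0, 0, 0]  # 1 = look-alike sound-alike mix-up

    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                        LogisticRegression(max_iter=1000))
    print(cross_val_score(clf, reports, labels, cv=3).mean())  # toy data, toy score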
Statistical considerations for harmonization of the global multicenter study on reference values.
Ichihara, Kiyoshi
2014-05-15
The global multicenter study on reference values coordinated by the Committee on Reference Intervals and Decision Limits (C-RIDL) of the IFCC was launched in December 2011, targeting 45 commonly tested analytes with the following objectives: 1) to derive reference intervals (RIs) country by country using a common protocol, and 2) to explore regionality/ethnicity of reference values by aligning test results among the countries. To achieve these objectives, it is crucial to harmonize 1) the protocol for recruitment and sampling, 2) statistical procedures for deriving the RI, and 3) test results through measurement of a panel of sera in common. For harmonized recruitment, very lenient inclusion/exclusion criteria were adopted in view of differences in interpretation of what constitutes healthiness by different cultures and investigators. This policy may require secondary exclusion of individuals according to the standard of each country at the time of deriving RIs. An iterative optimization procedure, called the latent abnormal values exclusion (LAVE) method, can be applied to automate the process of refining the choice of reference individuals. For global comparison of reference values, test results must be harmonized, based on the among-country, pair-wise linear relationships of test values for the panel. Traceability of reference values can be ensured based on values assigned indirectly to the panel through collaborative measurement of certified reference materials. The validity of the adopted strategies is discussed in this article, based on interim results obtained to date from five countries. Special considerations are made for dissociation of RIs by parametric and nonparametric methods and between-country difference in the effect of body mass index on reference values.
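For the nonparametric route mentioned above, a reference interval is simply the central 95% of values from the refined reference sample; the iterative LAVE refinement itself is not shown. A minimal sketch with hypothetical data:

    import numpy as np

    def reference_interval(values):
        # Nonparametric RI: 2.5th and 97.5th percentiles of the reference sample.
        return np.percentile(np.asarray(values, float), [2.5, 97.5])

    rng = np.random.default_rng(1)
    alt = rng.lognormal(mean=3.0, sigma=0.35, size=500)  # hypothetical ALT values (U/L)
    print(reference_interval(alt))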
Significance tests for functional data with complex dependence structure.
Staicu, Ana-Maria; Lahiri, Soumen N; Carroll, Raymond J
2015-01-01
We propose an L2-norm-based global testing procedure for the null hypothesis that multiple group mean functions are equal, for functional data with complex dependence structure. Specifically, we consider the setting of functional data with a multilevel structure of the form groups-clusters or subjects-units, where the unit-level profiles are spatially correlated within the cluster, and the cluster-level data are independent. Orthogonal series expansions are used to approximate the group mean functions and the test statistic is estimated using the basis coefficients. The asymptotic null distribution of the test statistic is developed, under mild regularity conditions. To our knowledge this is the first work that studies hypothesis testing, when data have such complex multilevel functional and spatial structure. Two small-sample alternatives, including a novel block bootstrap for functional data, are proposed, and their performance is examined in simulation studies. The paper concludes with an illustration of a motivating experiment.
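A permutation analogue of the L2-norm statistic conveys the idea; note this sketch permutes subject labels, which is valid only for independent subjects, whereas the paper derives an asymptotic null distribution (and a block bootstrap) for dependent multilevel data:

    import numpy as np

    def l2_group_test(curves, groups, n_perm=999, seed=0):
        # curves: (n_subjects, n_grid) functional observations on a common grid.
        rng = np.random.default_rng(seed)
        groups = np.asarray(groups)

        def stat(g):
            grand = curves.mean(axis=0)
            return sum(((curves[g == k].mean(axis=0) - grand) ** 2).sum()
                       for k in np.unique(g))  # discrete L2 distance to grand mean

        obs = stat(groups)
        null = [stat(rng.permutation(groups)) for _ in range(n_perm)]
        return obs, (1 + sum(s >= obs for s in null)) / (n_perm + 1)

    rng = np.random.default_rng(5)
    curves = rng.normal(size=(30, 50))
    print(l2_group_test(curves, np.repeat([0, 1, 2], 10)))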
The impact of Cenozoic cooling on assemblage diversity in planktonic foraminifera
Pearson, Paul N.; Dunkley Jones, Tom; Farnsworth, Alexander; Lunt, Daniel J.; Markwick, Paul; Purvis, Andy
2016-01-01
The Cenozoic planktonic foraminifera (PF) (calcareous zooplankton) have arguably the most detailed fossil record of any group. The quality of this record allows models of environmental controls on macroecology, developed for Recent assemblages, to be tested on intervals with profoundly different climatic conditions. These analyses shed light on the role of long-term global cooling in establishing the modern latitudinal diversity gradient (LDG)—one of the most powerful generalizations in biogeography and macroecology. Here, we test the transferability of environment-diversity models developed for modern PF assemblages to the Eocene epoch (approx. 56–34 Ma), a time of pronounced global warmth. Environmental variables from global climate models are combined with Recent environment–diversity models to predict Eocene richness gradients, which are then compared with observed patterns. The results indicate the modern LDG—lower richness towards the poles—developed through the Eocene. Three possible causes are suggested for the mismatch between statistical model predictions and data in the Early Eocene: the environmental estimates are inaccurate, the statistical model misses a relevant variable, or the intercorrelations among facets of diversity—e.g. richness, evenness, functional diversity—have changed over geological time. By the Late Eocene, environment–diversity relationships were much more similar to those found today. PMID:26977064
NASA Astrophysics Data System (ADS)
Halperin, D.; Hart, R. E.; Fuelberg, H. E.; Cossuth, J.
2013-12-01
Predicting tropical cyclone (TC) genesis has been a vexing problem for forecasters. While the literature describes environmental conditions that are necessary for TC genesis, predicting if and when a specific disturbance will organize and become a TC remains a challenge. As recently as 5-10 years ago, global models possessed little if any skill in forecasting TC genesis. However, due to increased resolution and more advanced model parameterizations, we have reached the point where global models can provide useful TC genesis guidance to operational forecasters. A recent study evaluated five global models' ability to predict TC genesis out to four days over the North Atlantic basin (Halperin et al. 2013). The results indicate that the models are indeed able to capture the genesis time and location correctly a fair percentage of the time. The study also uncovered model biases. For example, probability of detection and false alarm rate vary spatially within the basin. Also, as expected, the models' performance decreases with increasing lead time. In order to explain these and other biases, it is useful to analyze the model-indicated genesis events further to determine whether or not there are systematic differences between successful forecasts (hits), false alarms, and misses. This study will examine composites of a number of physically relevant environmental parameters (e.g., magnitude of vertical wind shear, areally averaged mid-level relative humidity) and disturbance-based parameters (e.g., 925 hPa maximum wind speed, vertical alignment of relative vorticity) among each TC genesis event classification (i.e., hit, false alarm, miss). We will use standard statistical tests (e.g., Student's t test, Mann-Whitney U test) to determine whether any differences are statistically significant. We also plan to discuss how these composite results apply to a few illustrative case studies. The results may help determine which aspects of the forecast are (in)correct and whether the incorrect aspects can be bias-corrected. This, in turn, may allow us to further enhance probabilistic forecasts of TC genesis.
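The planned two-sample comparisons can be sketched directly; the wind speeds below are invented for illustration:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    # Hypothetical 925 hPa maximum wind speeds (m/s) for hits vs. false alarms.
    vmax_hits = rng.normal(14.0, 3.0, 80)
    vmax_fa = rng.normal(12.0, 3.5, 60)

    t, p_t = stats.ttest_ind(vmax_hits, vmax_fa, equal_var=False)  # Welch's t test
    u, p_u = stats.mannwhitneyu(vmax_hits, vmax_fa)                # nonparametric check
    print(p_t, p_u)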
Staging Liver Fibrosis with Statistical Observers
NASA Astrophysics Data System (ADS)
Brand, Jonathan Frieman
Chronic liver disease is a worldwide health problem, and hepatic fibrosis (HF) is one of the hallmarks of the disease. Pathology diagnosis of HF is based on textural change in the liver as a lobular collagen network that develops within portal triads. The scale of collagen lobules is characteristically on the order of 1 mm, which is close to the resolution limit of in vivo Gd-enhanced MRI. In this work, the methods to collect training and testing images for a Hotelling observer are covered. An observer based on local texture analysis is trained and tested using wet-tissue phantoms. The technique is used to optimize the MRI sequence based on task performance. The final method developed is a two-stage model observer to classify fibrotic and healthy tissue in both phantoms and in vivo MRI images. The first-stage observer tests for the presence of local texture. Test statistics from the first observer are used to train the second-stage observer to globally sample the local observer results. A decision on the disease class is made for an entire MRI image slice using test statistics collected from the second observer. The techniques are tested on wet-tissue phantoms and in vivo clinical patient data.
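The linear Hotelling observer underlying the first stage has a standard closed form, w = S^-1 (mean_diseased - mean_healthy); a sketch on feature vectors (the imaging pipeline itself is not reproduced):

    import numpy as np

    def hotelling_template(healthy, diseased):
        # healthy, diseased: (n_samples, n_features) texture feature arrays.
        mu0, mu1 = healthy.mean(axis=0), diseased.mean(axis=0)
        S = 0.5 * (np.cov(healthy, rowvar=False) + np.cov(diseased, rowvar=False))
        w = np.linalg.solve(S, mu1 - mu0)       # observer template
        return w, (lambda x: x @ w)             # test statistic t(x) = w'x

    rng = np.random.default_rng(6)
    h = rng.normal(0.0, 1.0, (200, 10))
    d = rng.normal(0.3, 1.0, (200, 10))
    w, t = hotelling_template(h, d)
    print(t(d).mean() > t(h).mean())            # diseased samples should score higher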
NASA Technical Reports Server (NTRS)
Smith, Andrew; LaVerde, Bruce; Teague, David; Gardner, Bryce; Cotoni, Vincent
2010-01-01
This presentation further develops the orthogrid vehicle panel work, employing Hybrid Module capabilities to assess both low/mid-frequency and high-frequency models in the VA One simulation environment. The response estimates from three modeling approaches are compared to ground test measurements: (1) a detailed finite element model of the test article, expected to capture both the global panel modes and the local pocket-mode response, but at considerable analysis expense (time and resources); (2) a composite layered construction equivalent global stiffness approximation using SEA, expected to capture the response of the global panel modes only; and (3) an SEA approximation using the periodic subsystem formulation, in which a finite element model of a single periodic cell is used to derive the vibroacoustic properties of the entire periodic structure (modal density, radiation efficiency, etc.), expected to capture the response at various locations on the panel (on the skin and on the ribs) with less analysis expense.
NASA Astrophysics Data System (ADS)
Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.
2016-05-01
Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis to provide a framework for quantifying the strength, directionality, and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
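A precursor-type coincidence rate with a Poisson surrogate test can be sketched as follows; this is a simplified reading of the method (no lag parameter, one direction only):

    import numpy as np

    def coincidence_rate(a_times, b_times, delta=7.0):
        # Fraction of A-events with at least one B-event in the preceding delta window.
        a, b = np.asarray(a_times, float), np.asarray(b_times, float)
        return np.mean([np.any((t - b >= 0) & (t - b <= delta)) for t in a])

    def poisson_pvalue(a, b, t_max, delta=7.0, n_surr=999, seed=3):
        # Null: B-events occur as a homogeneous Poisson process on [0, t_max].
        rng = np.random.default_rng(seed)
        obs = coincidence_rate(a, b, delta)
        surr = [coincidence_rate(a, rng.uniform(0, t_max, len(b)), delta)
                for _ in range(n_surr)]
        return (1 + sum(s >= obs for s in surr)) / (n_surr + 1)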
Are secular correlations between sunspots, geomagnetic activity, and global temperature significant?
Love, J.J.; Mursula, K.; Tsai, V.C.; Perkins, D.M.
2011-01-01
Recent studies have led to speculation that solar-terrestrial interaction, measured by sunspot number and geomagnetic activity, has played an important role in global temperature change over the past century or so. We treat this possibility as an hypothesis for testing. We examine the statistical significance of cross-correlations between sunspot number, geomagnetic activity, and global surface temperature for the years 1868-2008, solar cycles 11-23. The data contain substantial autocorrelation and nonstationarity, properties that are incompatible with standard measures of cross-correlational significance, but which can be largely removed by averaging over solar cycles and first-difference detrending. Treated data show an expected statistically significant correlation between sunspot number and geomagnetic activity, Pearson p < 10^-4, but correlations between global temperature and sunspot number (geomagnetic activity) are not significant, p = 0.9954 (p = 0.8171). In other words, straightforward analysis does not support widely-cited suggestions that these data record a prominent role for solar-terrestrial interaction in global climate change. With respect to the sunspot-number, geomagnetic-activity, and global-temperature data, three alternative hypotheses remain difficult to reject: (1) the role of solar-terrestrial interaction in recent climate change is contained wholly in long-term trends and not in any shorter-term secular variation, or, (2) an anthropogenic signal is hiding correlation between solar-terrestrial variables and global temperature, or, (3) the null hypothesis, recent climate change has not been influenced by solar-terrestrial interaction.
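The detrending-then-correlating step is simple to reproduce; a sketch (cycle averaging, the other treatment step, is omitted):

    import numpy as np
    from scipy import stats

    def detrended_corr(x, y):
        # First-difference detrending before computing Pearson correlation.
        return stats.pearsonr(np.diff(np.asarray(x, float)),
                              np.diff(np.asarray(y, float)))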
Mansourian, Robert; Mutch, David M; Antille, Nicolas; Aubert, Jerome; Fogel, Paul; Le Goff, Jean-Marc; Moulin, Julie; Petrov, Anton; Rytz, Andreas; Voegel, Johannes J; Roberts, Matthew-Alan
2004-11-01
Microarray technology has become a powerful research tool in many fields of study; however, the cost of microarrays often results in the use of a low number of replicates (k). Under circumstances where k is low, it becomes difficult to perform standard statistical tests to extract the most biologically significant experimental results. Other more advanced statistical tests have been developed; however, their use and interpretation often remain difficult to implement in routine biological research. The present work outlines a method that achieves sufficient statistical power for selecting differentially expressed genes under conditions of low k, while remaining an intuitive and computationally efficient procedure. The present study describes a Global Error Assessment (GEA) methodology to select differentially expressed genes in microarray datasets, and was developed using an in vitro experiment that compared control and interferon-gamma treated skin cells. In this experiment, up to nine replicates were used to confidently estimate error, thereby enabling methods of different statistical power to be compared. Gene expression results of a similar absolute expression are binned, so as to enable a highly accurate local estimate of the mean squared error within conditions. The model then relates variability of gene expression in each bin to absolute expression levels and uses this in a test derived from the classical ANOVA. The GEA selection method is compared with both the classical and permutational ANOVA tests, and demonstrates increased stability, robustness, and confidence in gene selection. A subset of the selected genes was validated by real-time reverse transcription-polymerase chain reaction (RT-PCR). All these results suggest that GEA methodology is (i) suitable for selection of differentially expressed genes in microarray data, (ii) intuitive and computationally efficient and (iii) especially advantageous under conditions of low k. The GEA code for R software is freely available upon request to authors.
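The binned-error idea can be sketched as follows; this is a stripped-down reading of the abstract (pool the within-condition error locally in expression, then test each gene against the pooled local error), not the published GEA model:

    import numpy as np
    from scipy import stats

    def binned_error_test(ctrl, treat, n_bins=20):
        # ctrl, treat: (n_genes, k) expression matrices for two conditions.
        ctrl, treat = np.asarray(ctrl, float), np.asarray(treat, float)
        k1, k2 = ctrl.shape[1], treat.shape[1]
        mean_all = np.concatenate([ctrl, treat], axis=1).mean(axis=1)
        edges = np.quantile(mean_all, np.linspace(0, 1, n_bins + 1)[1:-1])
        bins = np.digitize(mean_all, edges)
        sse = (((ctrl - ctrl.mean(1, keepdims=True)) ** 2).sum(1)
               + ((treat - treat.mean(1, keepdims=True)) ** 2).sum(1))
        pvals = np.empty(len(mean_all))
        for b in np.unique(bins):
            idx = bins == b
            df_err = idx.sum() * (k1 + k2 - 2)        # pooled within-bin error df
            mse = sse[idx].sum() / df_err
            f = (ctrl[idx].mean(1) - treat[idx].mean(1)) ** 2 / (mse * (1/k1 + 1/k2))
            pvals[idx] = stats.f.sf(f, 1, df_err)
        return pvals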
Statistical analysis of modeling error in structural dynamic systems
NASA Technical Reports Server (NTRS)
Hasselman, T. K.; Chrostowski, J. D.
1990-01-01
The paper presents a generic statistical model of the (total) modeling error for conventional space structures in their launch configuration. Modeling error is defined as the difference between analytical prediction and experimental measurement. It is represented by the differences between predicted and measured real eigenvalues and eigenvectors. Comparisons are made between pre-test and post-test models. Total modeling error is then subdivided into measurement error, experimental error and 'pure' modeling error, and comparisons made between measurement error and total modeling error. The generic statistical model presented in this paper is based on the first four global (primary structure) modes of four different structures belonging to the generic category of Conventional Space Structures (specifically excluding large truss-type space structures). As such, it may be used to evaluate the uncertainty of predicted mode shapes and frequencies, sinusoidal response, or the transient response of other structures belonging to the same generic category.
A study of correlations between crude oil spot and futures markets: A rolling sample test
NASA Astrophysics Data System (ADS)
Liu, Li; Wan, Jieqiu
2011-10-01
In this article, we investigate the asymmetries of exceedance correlations and cross-correlations between West Texas Intermediate (WTI) spot and futures markets. First, employing the test statistic proposed by Hong et al. [Asymmetries in stock returns: statistical tests and economic evaluation, Review of Financial Studies 20 (2007) 1547-1581], we find that the exceedance correlations were overall symmetric. However, the results from rolling windows show that some occasional events could induce significant asymmetries in the exceedance correlations. Second, employing the test statistic proposed by Podobnik et al. [Quantifying cross-correlations using local and global detrending approaches, European Physical Journal B 71 (2009) 243-250], we find that the cross-correlations were significant even for large lagged orders. Using the detrended cross-correlation analysis proposed by Podobnik and Stanley [Detrended cross-correlation analysis: a new method for analyzing two nonstationary time series, Physical Review Letters 100 (2008) 084102], we find that the cross-correlations were weakly persistent and were stronger between spot and the futures contract with larger maturity. Our results from the rolling sample test also show the apparent effects of exogenous events. Additionally, we provide some relevant discussion of the obtained evidence.
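The lagged cross-correlation test credited to Podobnik et al. above has a Ljung-Box-like form; a sketch under my reading of that statistic:

    import numpy as np
    from scipy import stats

    def qcc_test(x, y, m=10):
        # Qcc(m) = N^2 * sum_{i=1..m} C_i^2 / (N - i), approx. chi2(m) under the null.
        x = np.asarray(x, float) - np.mean(x)
        y = np.asarray(y, float) - np.mean(y)
        N = len(x)
        denom = np.sqrt((x ** 2).sum() * (y ** 2).sum())
        q = sum(((x[i:] * y[:-i]).sum() / denom) ** 2 / (N - i)
                for i in range(1, m + 1)) * N ** 2
        return q, stats.chi2.sf(q, df=m)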
Brisson, Romain; Bianchi, Renzo
2015-11-01
The aim of this study is twofold: first, to assess the statistical significance of the data used by Pierre Bourdieu in Distinction; second, to test the hypothesis that the volume of capital (i.e., the global amount of capital) allows for a finer discrimination of dispositional differences than the composition of capital (i.e., the respective weight of the different types of capital in the global amount of capital). To these ends, five data samples were submitted to bilateral between-proportion comparison tests. The findings (1) reveal that about two-thirds of the differences reported by P. Bourdieu are significant and (2) support the view that the volume of capital prevails over its composition.
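The bilateral between-proportion comparison used above is the standard pooled two-proportion z-test; a sketch with invented counts:

    import math
    from scipy.stats import norm

    def two_proportion_test(x1, n1, x2, n2):
        # Pooled two-sided z-test for H0: p1 = p2.
        p1, p2, p = x1 / n1, x2 / n2, (x1 + x2) / (n1 + n2)
        z = (p1 - p2) / math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
        return z, 2 * norm.sf(abs(z))

    # e.g., 186/300 vs. 120/250 respondents expressing a given taste (hypothetical):
    print(two_proportion_test(186, 300, 120, 250))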
NASA Astrophysics Data System (ADS)
Strader, Anne; Schorlemmer, Danijel; Beutin, Thomas
2017-04-01
The Global Earthquake Activity Rate Model (GEAR1) is a hybrid seismicity model, constructed from a loglinear combination of smoothed seismicity from the Global Centroid Moment Tensor (CMT) earthquake catalog and geodetic strain rates (Global Strain Rate Map, version 2.1). For the 2005-2012 retrospective evaluation period, GEAR1 outperformed both parent strain rate and smoothed seismicity forecasts. Since 1 October 2015, GEAR1 has been prospectively evaluated by the Collaboratory for the Study of Earthquake Predictability (CSEP) testing center. Here, we present initial one-year test results for GEAR1, GSRM and GSRM2.1, as well as a localized evaluation of GEAR1 performance. The models were evaluated on the consistency in number (N-test), spatial (S-test) and magnitude (M-test) distribution of forecasted and observed earthquakes, as well as overall data consistency (CL-, L-tests). Performance at target earthquake locations was compared between models using the classical paired T-test and its non-parametric equivalent, the W-test, to determine if one model could be rejected in favor of another at the 0.05 significance level. For the evaluation period from 1 October 2015 to 1 October 2016, the GEAR1, GSRM and GSRM2.1 forecasts pass all CSEP likelihood tests. Comparative test results show statistically significant improvement of GEAR1 performance over both strain rate-based forecasts, both of which can be rejected in favor of GEAR1. Using point process residual analysis, we investigate the spatial distribution of differences in GEAR1, GSRM and GSRM2.1 model performance to identify regions where the GEAR1 model should be adjusted, which could not be inferred from CSEP test results. Furthermore, we investigate whether the optimal combination of smoothed seismicity and strain rates remains stable over space and time.
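The N-test mentioned above reduces, for a Poisson forecast, to two tail probabilities; a sketch following the usual CSEP formulation (the counts are invented):

    from scipy import stats

    def n_test(rate_forecast, n_observed):
        # delta1: P(N >= n_obs), small if the forecast rate is too low;
        # delta2: P(N <= n_obs), small if the forecast rate is too high.
        delta1 = 1 - stats.poisson.cdf(n_observed - 1, rate_forecast)
        delta2 = stats.poisson.cdf(n_observed, rate_forecast)
        return delta1, delta2

    print(n_test(21.4, 15))  # hypothetical one-year forecast rate vs. observed count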
NASA Astrophysics Data System (ADS)
Sundberg, R.; Moberg, A.; Hind, A.
2012-08-01
A statistical framework for comparing the output of ensemble simulations from global climate models with networks of climate proxy and instrumental records has been developed, focusing on near-surface temperatures for the last millennium. This framework includes the formulation of a joint statistical model for proxy data, instrumental data and simulation data, which is used to optimize a quadratic distance measure for ranking climate model simulations. An essential underlying assumption is that the simulations and the proxy/instrumental series have a shared component of variability that is due to temporal changes in external forcing, such as volcanic aerosol load, solar irradiance or greenhouse gas concentrations. Two statistical tests have been formulated. Firstly, a preliminary test establishes whether a significant temporal correlation exists between instrumental/proxy and simulation data. Secondly, the distance measure is expressed in the form of a test statistic of whether a forced simulation is closer to the instrumental/proxy series than unforced simulations. The proposed framework allows any number of proxy locations to be used jointly, with different seasons, record lengths and statistical precision. The goal is to objectively rank several competing climate model simulations (e.g. with alternative model parameterizations or alternative forcing histories) by means of their goodness of fit to the unobservable true past climate variations, as estimated from noisy proxy data and instrumental observations.
[Alterations of brain network efficiency in patients with post-concussion syndrome].
Peng, Nan; Qian, Ruobing; Fu, Xianming; Li, Shunli; Kang, Zhiqiang; Lin, Bin; Ji, Xuebing; Wei, Xiangpin; Niu, Chaoshi; Wang, Yehan
2015-07-07
To investigate the alterations of brain network efficiency in patients with post-concussion syndrome. A total of 23 patients from Anhui Provincial Hospital who had experienced concussion 3 months earlier were enrolled between June 2013 and March 2014, together with 23 healthy controls matched for sex, age, and education. Selective attention in both groups was compared using the Stroop Word-Color Test. Resting-state functional magnetic resonance imaging (fMRI) data were collected for both groups and processed with the Network Construction module of the GRETNA software to obtain brain network matrices. Network analysis yielded global and nodal efficiency, and independent t-tests were used for statistical comparison of these values. The between-group difference in global efficiency was not statistically significant at any threshold value. Compared with healthy controls, nodal efficiencies in patients with post-concussion syndrome differed significantly in the following brain regions: left orbital middle frontal gyrus, left posterior cingulate, left lingual gyrus, left thalamus, left superior temporal gyrus, right anterior cingulate, right posterior cingulate, and right supramarginal gyrus. Thus, there is no significant change in global efficiency in patients with post-concussion syndrome, and the brain function deficits in these patients may be caused by changes of nodal efficiency in their brain network.
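Global and nodal efficiency are shortest-path quantities; a sketch with networkx on a hypothetical thresholded connectivity matrix (not the study's data or pipeline):

    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(4)
    corr = np.abs(rng.normal(size=(90, 90)))      # hypothetical 90-region connectivity
    corr = (corr + corr.T) / 2
    np.fill_diagonal(corr, 0)
    G = nx.from_numpy_array((corr > 1.0).astype(int))   # binarize at a threshold

    print(nx.global_efficiency(G))
    # Nodal efficiency of node i: mean inverse shortest-path length to all others.
    nodal = {i: sum(1.0 / d
                    for j, d in nx.single_source_shortest_path_length(G, i).items()
                    if j != i) / (len(G) - 1)
             for i in G}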
Chiang, Austin W T; Liu, Wei-Chung; Charusanti, Pep; Hwang, Ming-Jing
2014-01-15
A major challenge in mathematical modeling of biological systems is to determine how model parameters contribute to systems dynamics. As biological processes are often complex in nature, it is desirable to address this issue using a systematic approach. Here, we propose a simple methodology that first performs an enrichment test to find patterns in the values of globally profiled kinetic parameters with which a model can produce the required system dynamics; this is then followed by a statistical test to elucidate the association between individual parameters and different parts of the system's dynamics. We demonstrate our methodology on a prototype biological system of perfect adaptation dynamics, namely the chemotaxis model for Escherichia coli. Our results agreed well with those derived from experimental data and theoretical studies in the literature. Using this model system, we showed that there are motifs in kinetic parameters and that these motifs are governed by constraints of the specified system dynamics. A systematic approach based on enrichment statistical tests has been developed to elucidate the relationships between model parameters and the roles they play in affecting system dynamics of a prototype biological network. The proposed approach is generally applicable and therefore can find wide use in systems biology modeling research.
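The enrichment step can be as simple as a Fisher exact test on a two-by-two table from the global parameter scan; the counts below are invented:

    from scipy.stats import fisher_exact

    #                adapts   fails
    # high k_on        120     380
    # low  k_on         30     470   (hypothetical scan outcomes)
    odds, p = fisher_exact([[120, 380], [30, 470]], alternative="greater")
    print(odds, p)  # is high k_on over-represented among adapting parameter sets?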
Statistics of the geomagnetic secular variation for the past 5 Ma
NASA Technical Reports Server (NTRS)
Constable, C. G.; Parker, R. L.
1986-01-01
A new statistical model is proposed for the geomagnetic secular variation over the past 5 Ma. Unlike previous models, the model makes use of statistical characteristics of the present day geomagnetic field. The spatial power spectrum of the non-dipole field is consistent with a white source near the core-mantle boundary with Gaussian distribution. After a suitable scaling, the spherical harmonic coefficients may be regarded as statistical samples from a single giant Gaussian process; this is the model of the non-dipole field. The model can be combined with an arbitrary statistical description of the dipole and probability density functions and cumulative distribution functions can be computed for declination and inclination that would be observed at any site on Earth's surface. Global paleomagnetic data spanning the past 5 Ma are used to constrain the statistics of the dipole part of the field. A simple model is found to be consistent with the available data. An advantage of specifying the model in terms of the spherical harmonic coefficients is that it is a complete statistical description of the geomagnetic field, enabling us to test specific properties for a general description. Both intensity and directional data distributions may be tested to see if they satisfy the expected model distributions.
Noël, G; Jeanmart, M; Reinhardt, B
1983-01-01
One hundred inpatients of both sexes, most of them older than 65 years and suffering from symptoms of the organic brain syndrome (OBS) primarily associated with aging, were included in a 6-week double-blind study. Patients were randomly assigned to two treatment groups of 50 patients each and received either a neurotropic drug (3 × 200 mg EMD 21657) or placebo coated tablets of identical appearance. Patients were evaluated at the beginning of the study and after 6 weeks of treatment using a physician's symptom rating, the Nurses' Observation Scale for Inpatient Evaluation (NOSIE), EEG, and a psychometric test battery to assess level of mental and memory functioning (Rey test, Benton visual retention test, Kohs block design test). At the final assessment, global response and overall tolerance were rated by the physician. The therapeutic effects of EMD 21657 were statistically significant compared to placebo in global response (p ≤ 0.05), in the factor 'cognitive disturbances' (p ≤ 0.05, doctor's symptom rating), and in the negative factors of the NOSIE (p ≤ 0.05). In the other parameters of the scales, the EEG, and the mental ability tests, no statistically significant changes could be demonstrated in the two groups after 6 weeks of treatment. The drugs were well tolerated. EMD 21657 treatment was interrupted because of side effects (increased aggressiveness, rash) in 2 cases.
Forecast Verification: Identification of small changes in weather forecasting skill
NASA Astrophysics Data System (ADS)
Weatherhead, E. C.; Jensen, T. L.
2017-12-01
Global and regional weather forecasts have improved over the past seven decades, most often because of small, incremental improvements. The identification and verification of forecast improvement due to proposed small changes in forecasting can be expensive and, if not carried out efficiently, can slow progress in forecasting development. This presentation will look at the skill of commonly used verification techniques and show how the ability to detect improvements can depend on the magnitude of the improvement, the number of runs used to test the improvement, the location on the Earth, and the statistical techniques used. For continuous variables, such as temperature, wind, and humidity, the skill of a forecast can be directly compared using a pair-wise statistical test that accommodates the natural autocorrelation and magnitude of variability. For discrete variables, such as tornado outbreaks or icing events, the challenge is to reduce the false alarm rate while improving the rate of correctly identifying the discrete event. For both continuous and discrete verification results, proper statistical approaches can reduce the number of runs needed to identify a small improvement in forecasting skill. Verification within the Next Generation Global Prediction System is an important component of the many small decisions needed to make state-of-the-art improvements to weather forecasting capabilities. The comparison of multiple skill scores with often conflicting results requires not only appropriate testing but also scientific judgment to assure that the choices are appropriate not only for improvements in today's forecasting capabilities but also allow for improvements that will come in the future.
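For the continuous-variable case, a pair-wise test that accommodates autocorrelation can be sketched with a lag-1 effective-sample-size correction; this is a common construction, not necessarily the presenter's exact procedure:

    import numpy as np
    from scipy import stats

    def paired_skill_test(err_a, err_b):
        # Paired comparison of two forecast error series.
        d = np.asarray(err_a, float) - np.asarray(err_b, float)
        r1 = np.corrcoef(d[:-1], d[1:])[0, 1]           # lag-1 autocorrelation
        n_eff = len(d) * (1 - r1) / (1 + r1)            # effective sample size
        t = d.mean() / (d.std(ddof=1) / np.sqrt(n_eff))
        return t, 2 * stats.t.sf(abs(t), df=max(n_eff - 1, 1))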
Spatial and Temporal Variability and Trends in 2001-2016 Global Fire Activity
NASA Astrophysics Data System (ADS)
Earl, Nick; Simmonds, Ian
2018-03-01
Fire regimes across the globe have great spatial and temporal variability, and these are influenced by many factors including anthropogenic management, climate, and vegetation types. Here we utilize the satellite-based "active fire" product from Moderate Resolution Imaging Spectroradiometer (MODIS) sensors to statistically analyze variability and trends in fire activity from the global to regional scales. We split up the regions by economic development, region/geographical land use, clusters of fire-abundant areas, or by religious/cultural influence. Weekly cycle tests are conducted to highlight and quantify part of the anthropogenic influence on fire regimes across the world. We find that there is a strong statistically significant decline in 2001-2016 active fires globally, linked to an increase in net primary productivity observed in northern Africa, along with global agricultural expansion and intensification, which generally reduces fire activity. There are high levels of variability, however. The large-scale regions exhibit either little change or decreasing fire activity, except for strong increasing trends in India and China, where rapid population increase is occurring, leading to agricultural intensification and increased crop residue burning. Variability in Canada has been linked to a warming global climate leading to a longer growing season and higher fuel loads. Areas with a strong weekly cycle give a good indication of where fire management is being applied most extensively, for example, the United States, where few areas retain a natural fire regime.
A two-component rain model for the prediction of attenuation and diversity improvement
NASA Technical Reports Server (NTRS)
Crane, R. K.
1982-01-01
A new model was developed to predict attenuation statistics for a single Earth-satellite or terrestrial propagation path. The model was extended to provide predictions of the joint occurrences of specified or higher attenuation values on two closely spaced Earth-satellite paths. The joint statistics provide the information required to obtain diversity gain or diversity advantage estimates. The new model is meteorologically based. It was tested against available Earth-satellite beacon observations and terrestrial path measurements. The model employs the rain climate region descriptions of the Global rain model. The rms deviation between the predicted and observed attenuation values for the terrestrial path data was 35 percent, a result consistent with the expectations of the Global model when the rain rate distribution for the path is not used in the calculation. Within the United States the rms deviation between measurement and prediction was 36 percent but worldwide it was 79 percent.
A Method for Retrieving Ground Flash Fraction from Satellite Lightning Imager Data
NASA Technical Reports Server (NTRS)
Koshak, William J.
2009-01-01
A general theory for retrieving the fraction of ground flashes in N lightning observed by a satellite-based lightning imager is provided. An "exponential model" is applied as a physically reasonable constraint to describe the measured optical parameter distributions, and population statistics (i.e., mean, variance) are invoked to add additional constraints to the retrieval process. The retrieval itself is expressed in terms of a Bayesian inference, and the Maximum A Posteriori (MAP) solution is obtained. The approach is tested by performing simulated retrievals, and retrieval error statistics are provided. The ability to retrieve ground flash fraction has important benefits to the atmospheric chemistry community. For example, using the method to partition the existing satellite global lightning climatology into separate ground and cloud flash climatologies will improve estimates of lightning nitrogen oxides (NOx) production; this in turn will improve both regional air quality and global chemistry/climate model predictions.
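Under the exponential-model constraint, the MAP step reduces to maximizing a posterior over the ground-flash fraction; a sketch with a stand-in two-component exponential mixture and known means (the paper's full constraint set and priors are not reproduced):

    import numpy as np

    def map_ground_fraction(x, mu_g, mu_c):
        # x: observed optical measures; mu_g, mu_c: assumed exponential means for
        # ground and cloud flashes. Flat prior; grid search over the fraction.
        alphas = np.linspace(0.0, 1.0, 501)
        x = np.asarray(x, float)[:, None]
        mix = (alphas / mu_g * np.exp(-x / mu_g)
               + (1 - alphas) / mu_c * np.exp(-x / mu_c))
        return alphas[np.argmax(np.log(mix).sum(axis=0))]

    rng = np.random.default_rng(7)
    x = np.concatenate([rng.exponential(2.0, 300),    # "ground" population
                        rng.exponential(5.0, 700)])   # "cloud" population
    print(map_ground_fraction(x, 2.0, 5.0))           # ~0.3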
Porter, K.A.; Jaiswal, K.S.; Wald, D.J.; Greene, M.; Comartin, Craig
2008-01-01
The U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) Project and the Earthquake Engineering Research Institute's World Housing Encyclopedia (WHE) are creating a global database of building stocks and their earthquake vulnerability. The WHE already represents a growing, community-developed public database of global housing and its detailed structural characteristics. It currently contains more than 135 reports on particular housing types in 40 countries. The WHE-PAGER effort extends the WHE in several ways: (1) by addressing non-residential construction; (2) by quantifying the prevalence of each building type in both rural and urban areas; (3) by addressing day and night occupancy patterns; (4) by adding quantitative vulnerability estimates from judgment or statistical observation; and (5) by analytically deriving alternative vulnerability estimates using, in part, laboratory testing.
NASA Astrophysics Data System (ADS)
Endreny, Theodore A.; Pashiardis, Stelios
2007-02-01
Robust and accurate estimates of rainfall frequencies are difficult to make with short, arid-climate rainfall records; here, new regional and global methods were used to supplement such a constrained 15-34 yr record in Cyprus. The impact of supplementing rainfall frequency analysis with the regional and global approaches was measured with relative bias and root mean square error (RMSE) values. The analysis considered 42 stations with 8 time intervals (5-360 min) in four regions delineated by proximity to the sea and elevation. Regional statistical algorithms found that the sites passed discordancy tests of the coefficient of variation, skewness, and kurtosis, while heterogeneity tests revealed the regions were homogeneous to mildly heterogeneous. Rainfall depths were simulated in the regional analysis method 500 times, and goodness-of-fit tests then identified the best candidate distribution as the generalized extreme value (GEV) Type II. In the regional analysis, the method of L-moments was used to estimate location, shape, and scale parameters. In the global analysis, the distribution was a priori prescribed as GEV Type II, the shape parameter was a priori set to 0.15, and a time interval term was constructed to use one set of parameters for all time intervals. Relative RMSE values were approximately equal at 10% for the regional and global methods when regions were compared, but when time intervals were compared the global method RMSE had a parabolic-shaped time interval trend. Relative bias values were also approximately equal for both methods when regions were compared, but again a parabolic-shaped time interval trend was found for the global method. The global method relative RMSE and bias trended with time interval, which may be caused by fitting a single scale value for all time intervals.
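The distribution-fitting step can be sketched with scipy; note this uses maximum likelihood, whereas the study used L-moments, and scipy's shape sign convention is opposite to the usual GEV kappa (c < 0 corresponds to Type II):

    from scipy import stats

    # Hypothetical annual-maximum 60-min rainfall depths (mm):
    amax = [18.2, 25.1, 30.7, 22.4, 40.9, 19.8, 27.5, 33.0,
            24.6, 51.2, 21.3, 29.9, 26.0, 35.4, 23.1]
    shape, loc, scale = stats.genextreme.fit(amax)
    rp50 = stats.genextreme.ppf(1 - 1.0 / 50, shape, loc, scale)  # 50-yr return level
    print(shape, rp50)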
Goldie, Fraser C; Fulton, Rachael L; Dawson, Jesse; Bluhmki, Erich; Lees, Kennedy R
2014-08-01
Clinical trials for acute ischemic stroke treatment require large numbers of participants and are expensive to conduct. Methods that enhance statistical power are therefore desirable. We explored whether this can be achieved by a measure incorporating both early and late measures of outcome (e.g. seven-day NIH Stroke Scale combined with 90-day modified Rankin scale). We analyzed sensitivity to treatment effect, using proportional odds logistic regression for ordinal scales and the generalized estimating equation method for global outcomes, with all analyses adjusted for baseline severity and age. We ran simulations to assess relations between sample size and power for ordinal scales and corresponding global outcomes. We used R version 2.12.1 (R Development Core Team. R Foundation for Statistical Computing, Vienna, Austria) for simulations and SAS 9.2 (SAS Institute Inc., Cary, NC, USA) for all other analyses. Each scale considered for combination was sensitive to treatment effect in isolation. The mRS90 and NIHSS90 had adjusted odds ratios of 1.56 and 1.62, respectively. Adjusted odds ratios for global outcomes of the combination of mRS90 with NIHSS7 and NIHSS90 with NIHSS7 were 1.69 and 1.73, respectively. The smallest sample sizes required to generate statistical power ≥80% for mRS90, NIHSS7, and global outcomes of mRS90 and NIHSS7 combined and NIHSS90 and NIHSS7 combined were 500, 490, 400, and 380, respectively. When data concerning both early and late outcomes are combined into a global measure, there is increased sensitivity to treatment effect compared with solitary ordinal scales. This delivers a 20% reduction in required sample size at 80% power. Combining early with late outcomes merits further consideration.
Kaminsky, Jessica A
2015-06-16
Case study research often claims that culture (variously defined) impacts infrastructure development. I test this claim using Hofstede's cultural dimensions and newly available data representing change in national coverage of sewer connections, sewerage treatment, and onsite sanitation between 1990 and 2010 for 21 developing nations. The results show that the cultural dimensions of uncertainty avoidance, masculinity-femininity, and individualism-collectivism have statistically significant relationships to sanitation technology choice. These data demonstrate the global impact of culture on infrastructure choice and reemphasize that local cultural preferences must be considered when constructing sanitation infrastructure.
Homeopathy for attention-deficit/hyperactivity disorder: a pilot randomized-controlled trial.
Jacobs, Jennifer; Williams, Anna-Leila; Girard, Christine; Njike, Valentine Yanchou; Katz, David
2005-10-01
The aim of this study was to carry out a preliminary trial evaluating the effectiveness of homeopathy in the treatment of attention-deficit/hyperactivity disorder (ADHD). This work was a randomized, double-blind, placebo-controlled trial. This study was conducted in a private homeopathic clinic in the Seattle metropolitan area. Subjects included children 6-12 years of age meeting Diagnostic and Statistical Manual of Mental Disorders 4th edition (DSM-IV) criteria for ADHD. Forty-three subjects were randomized to receive a homeopathic consultation and either an individualized homeopathic remedy or placebo. Patients were seen by homeopathic physicians every 6 weeks for 18 weeks. Outcome measures included the Conners' Global Index-Parent, Conners' Global Index-Teacher, Conners' Parent Rating Scale-Brief, Continuous Performance Test, and the Clinical Global Impression Scale. There were no statistically significant differences between homeopathic remedy and placebo groups on the primary or secondary outcome variables. However, there were statistically and clinically significant improvements in both groups on many of the outcome measures. This pilot study provides no evidence to support a therapeutic effect of individually selected homeopathic remedies in children with ADHD. A therapeutic effect of the homeopathic encounter is suggested and warrants further evaluation. Future studies should be carried out over a longer period of time and should include a control group that does not receive the homeopathic consultation. Comparison to conventional stimulant medication for ADHD also should be considered.
NASA Astrophysics Data System (ADS)
Ivanov, Martin; Warrach-Sagi, Kirsten; Wulfmeyer, Volker
2018-04-01
A new approach for rigorous spatial analysis of the downscaling performance of regional climate model (RCM) simulations is introduced. It is based on a multiple comparison of the local tests at the grid cells and is also known as "field" or "global" significance. The block length for the local resampling tests is precisely determined to adequately account for the time series structure. New performance measures for estimating the added value of downscaled data relative to the large-scale forcing fields are developed. As an illustration, the methodology is applied to a standard EURO-CORDEX hindcast simulation with the Weather Research and Forecasting (WRF) model coupled with the land surface model NOAH at 0.11° grid resolution. Daily precipitation climatology for the 1990-2009 period is analysed for Germany for winter and summer in comparison with high-resolution gridded observations from the German Weather Service. The field significance test controls the proportion of falsely rejected local tests in a meaningful way and is robust to spatial dependence. Hence, the spatial patterns of the statistically significant local tests are also meaningful. We interpret them from a process-oriented perspective. While the downscaled precipitation distributions are statistically indistinguishable from the observed ones in most regions in summer, the biases of some distribution characteristics are significant over large areas in winter. WRF-NOAH generates appropriate stationary fine-scale climate features in the daily precipitation field over regions of complex topography in both seasons and appropriate transient fine-scale features almost everywhere in summer. As the added value of global climate model (GCM)-driven simulations cannot be smaller than this perfect-boundary estimate, this work demonstrates in a rigorous manner the clear additional value of dynamical downscaling over global climate simulations. The evaluation methodology has a broad spectrum of applicability as it is distribution-free, robust to spatial dependence, and accounts for time series structure.
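One multiple-comparison core of a field significance test is false-discovery-rate control across grid cells; a Benjamini-Hochberg sketch (the paper's full procedure additionally tunes a block resampling scheme for the local tests):

    import numpy as np

    def field_significant(p_local, alpha=0.05):
        # Benjamini-Hochberg: largest p below the step-up line is the cutoff.
        p = np.asarray(p_local, float)
        order = np.sort(p)
        below = np.nonzero(order <= alpha * np.arange(1, len(p) + 1) / len(p))[0]
        cutoff = order[below[-1]] if below.size else -1.0
        return p <= cutoff      # boolean mask of field-significant grid cells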
Effect of Creatine Monohydrate on Clinical Progression in Patients With Parkinson Disease
Kieburtz, Karl; Tilley, Barbara C; Elm, Jordan J; Babcock, Debra; Hauser, Robert; Ross, G Webster; Augustine, Alicia H; Augustine, Erika U; Aminoff, Michael J; Bodis-Wollner, Ivan G; Boyd, James; Cambi, Franca; Chou, Kelvin; Christine, Chadwick W; Cines, Michelle; Dahodwala, Nabila; Derwent, Lorelei; Dewey, Richard B; Hawthorne, Katherine; Houghton, David J; Kamp, Cornelia; Leehey, Maureen; Lew, Mark F; Liang, Grace S Lin; Luo, Sheng T; Mari, Zoltan; Morgan, John C; Parashos, Sotirios; Pérez, Adriana; Petrovitch, Helen; Rajan, Suja; Reichwein, Sue; Roth, Jessie Tatsuno; Schneider, Jay S; Shannon, Kathleen M; Simon, David K; Simuni, Tanya; Singer, Carlos; Sudarsky, Lewis; Tanner, Caroline M; Umeh, Chizoba C; Williams, Karen; Wills, Anne-Marie
2015-01-01
IMPORTANCE There are no treatments available to slow or prevent the progression of Parkinson disease, despite its global prevalence and significant health care burden. The National Institute of Neurological Disorders and Stroke Exploratory Trials in Parkinson Disease program was established to promote discovery of potential therapies. OBJECTIVE To determine whether creatine monohydrate was more effective than placebo in slowing long-term clinical decline in participants with Parkinson disease. DESIGN, SETTING, AND PATIENTS The Long-term Study 1, a multicenter, double-blind, parallel-group, placebo-controlled, 1:1 randomized efficacy trial. Participants were recruited from 45 investigative sites in the United States and Canada and included 1741 men and women with early (within 5 years of diagnosis) and treated (receiving dopaminergic therapy) Parkinson disease. Participants were enrolled from March 2007 to May 2010 and followed up until September 2013. INTERVENTIONS Participants were randomized to placebo or creatine (10 g/d) monohydrate for a minimum of 5 years (maximum follow-up, 8 years). MAIN OUTCOMES AND MEASURES The primary outcome measure was a difference in clinical decline from baseline to 5-year follow-up, compared between the 2 treatment groups using a global statistical test. Clinical status was defined by 5 outcome measures: Modified Rankin Scale, Symbol Digit Modalities Test, PDQ-39 Summary Index, Schwab and England Activities of Daily Living scale, and ambulatory capacity. All outcomes were coded such that higher scores indicated worse outcomes and were analyzed by a global statistical test. Higher summed ranks (range, 5-4775) indicate worse outcomes. RESULTS The trial was terminated early for futility based on results of a planned interim analysis of participants enrolled at least 5 years prior to the date of the analysis (n = 955). The median follow-up time was 4 years. Of the 955 participants, the mean of the summed ranks for placebo was 2360 (95% CI, 2249-2470) and for creatine was 2414 (95% CI, 2304-2524). The global statistical test yielded t(1865.8) = -0.75 (2-sided P = .45). There were no detectable differences (P < .01 to partially adjust for multiple comparisons) in adverse and serious adverse events by body system. CONCLUSIONS AND RELEVANCE Among patients with early and treated Parkinson disease, treatment with creatine monohydrate for at least 5 years, compared with placebo, did not improve clinical outcomes. These findings do not support the use of creatine monohydrate in patients with Parkinson disease. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT00449865 PMID:25668262
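The summed-rank construction reported above (an O'Brien-type global statistical test) is easy to sketch; this illustrates the idea, not the trial's exact analysis code:

    import numpy as np
    from scipy import stats

    def summed_rank_test(group_a, group_b):
        # group_a, group_b: (n, k) arrays of k outcomes, coded so higher = worse.
        a, b = np.asarray(group_a, float), np.asarray(group_b, float)
        pooled = np.vstack([a, b])
        ranks = np.apply_along_axis(stats.rankdata, 0, pooled)  # rank each outcome
        summed = ranks.sum(axis=1)                              # per-participant rank sum
        return stats.ttest_ind(summed[:len(a)], summed[len(a):], equal_var=False)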
U.S. Poised to Sit Out TIMSS Test: Physics, Advanced Math Gauged in Global Study
ERIC Educational Resources Information Center
Viadero, Debra
2007-01-01
This article reports on reactions to the U.S. Department of Education's first-time decision to sit out an international study designed to show how advanced high school students around the world measure up in math and science. Mark S. Schneider, the commissioner of the department's National Center for Education Statistics, which normally takes the…
NASA Technical Reports Server (NTRS)
Salstein, D. A.; Rosen, R. D.
1982-01-01
A study was undertaken using the analyses produced from the assimilation cycle of parallel model runs that respectively include and withhold satellite data. The analysis of the state of the atmosphere is performed using data from a test period during the first Special Observing Period (SOP) of the Global Weather Experiment (FGGE).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zain, Zakiyah; Ahmad, Yuhaniz; Azwan, Zairul
Colorectal cancer is the third and the second most common cancer worldwide in men and women respectively, and the second in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, sets of responses concerned with survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated to the overall survival. In this study, global score test methodology is used in combining the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data of tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnicity, gender, age and stage at diagnosis are also reported.
NASA Astrophysics Data System (ADS)
Zain, Zakiyah; Aziz, Nazrina; Ahmad, Yuhaniz; Azwan, Zairul; Raduan, Farhana; Sagap, Ismail
2014-12-01
Colorectal cancer is the third and the second most common cancer worldwide in men and women respectively, and the second in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, sets of responses concerned with survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated to the overall survival. In this study, global score test methodology is used in combining the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data of tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnicity, gender, age and stage at diagnosis are also reported.
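As a hedged sketch of the global score test methodology described in the two records above: per-endpoint standardized score statistics are combined through their estimated correlation matrix into a single chi-square statistic. The z-values and correlation matrix below are invented; in practice both come from the fitted survival models (e.g., per-endpoint score/logrank tests).

```python
import numpy as np
from scipy import stats

def global_score_test(z, corr):
    """Combine K univariate score statistics into one global statistic.
    z    : length-K vector of standardized score statistics, one per
           survival endpoint (e.g., each recurrence time, death).
    corr : K x K working correlation matrix among the endpoints,
           estimated from the data (identity if independent)."""
    z = np.asarray(z, dtype=float)
    q = z @ np.linalg.solve(corr, z)       # quadratic form z' V^{-1} z
    return q, stats.chi2.sf(q, df=len(z))  # chi-square with K df

# Toy example: three correlated recurrence endpoints plus death.
z = [1.8, 1.2, 0.9, 2.1]
corr = np.array([[1.0, 0.5, 0.4, 0.3],
                 [0.5, 1.0, 0.5, 0.3],
                 [0.4, 0.5, 1.0, 0.3],
                 [0.3, 0.3, 0.3, 1.0]])
stat, p = global_score_test(z, corr)
print(f"global chi2 = {stat:.2f}, p = {p:.3f}")
```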
ECG Identification System Using Neural Network with Global and Local Features
ERIC Educational Resources Information Center
Tseng, Kuo-Kun; Lee, Dachao; Chen, Charles
2016-01-01
This paper proposes a human identification system via extracted electrocardiogram (ECG) signals. Two hierarchical classification structures based on a global shape feature and a local statistical feature are used to extract ECG signals. The global shape feature represents the outline information of ECG signals and the local statistical feature extracts the…
Latham, Garry; Long, Tony; Devitt, Patric
2013-12-01
Accidental chemical poisoning causes more than 35 000 child deaths every year across the world, and it leads to disease, disability, and suffering for many more children. Children's ignorance of dangers and their failure to interpret hazard warning signs as intended contribute significantly to this problem. A new Globally Harmonized System for Classification and Labeling (GHS) is being implemented internationally with a view to unifying the current multiple and disparate national systems. This study was designed to establish a productive, effective means of teaching the new GHS warning signs to primary school children (aged 7-11 years). A pre-test, post-test, follow-up test design was employed, with a teaching intervention informed by a Delphi survey of expert opinion. Children from one school formed the experimental group (n = 49) and a second school provided a control group (n = 23). Both groups showed a gain in knowledge from pre-test to post-test, with a larger gain in the experimental group that was not statistically significant. However, longer-term retention of knowledge, as shown by the follow-up test, was statistically significantly greater in the experimental group (p = 0.001). The employment of teaching matched to children's preferred learning styles and the use of active learning were found to be related to improved retention of knowledge. Part of the study involved eliciting children's interpretation of standard hazard warning symbols, and this provoked considerable concern over the potential for dangerous misinterpretation with disastrous consequences. This article focuses on the reasons for such misconception and the action required to address this successfully in testing the intervention.
Global risk of big earthquakes has not recently increased.
Shearer, Peter M; Stark, Philip B
2012-01-17
The recent elevated rate of large earthquakes has fueled concern that the underlying global rate of earthquake activity has increased, which would have important implications for assessments of seismic hazard and our understanding of how faults interact. We examine the timing of large (magnitude M≥7) earthquakes from 1900 to the present, after removing local clustering related to aftershocks. The global rate of M≥8 earthquakes has been at a record high roughly since 2004, but rates have been almost as high before, and the rate of smaller earthquakes is close to its historical average. Some features of the global catalog are improbable in retrospect, but so are some features of most random sequences--if the features are selected after looking at the data. For a variety of magnitude cutoffs and three statistical tests, the global catalog, with local clusters removed, is not distinguishable from a homogeneous Poisson process. Moreover, no plausible physical mechanism predicts real changes in the underlying global rate of large events. Together these facts suggest that the global risk of large earthquakes is no higher today than it has been in the past.
Global risk of big earthquakes has not recently increased
Shearer, Peter M.; Stark, Philip B.
2012-01-01
The recent elevated rate of large earthquakes has fueled concern that the underlying global rate of earthquake activity has increased, which would have important implications for assessments of seismic hazard and our understanding of how faults interact. We examine the timing of large (magnitude M≥7) earthquakes from 1900 to the present, after removing local clustering related to aftershocks. The global rate of M≥8 earthquakes has been at a record high roughly since 2004, but rates have been almost as high before, and the rate of smaller earthquakes is close to its historical average. Some features of the global catalog are improbable in retrospect, but so are some features of most random sequences—if the features are selected after looking at the data. For a variety of magnitude cutoffs and three statistical tests, the global catalog, with local clusters removed, is not distinguishable from a homogeneous Poisson process. Moreover, no plausible physical mechanism predicts real changes in the underlying global rate of large events. Together these facts suggest that the global risk of large earthquakes is no higher today than it has been in the past. PMID:22184228
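A minimal illustration of one of the statistical tests alluded to in the two records above: checking whether a declustered catalog is consistent with a homogeneous Poisson process by comparing inter-event times with an exponential distribution. The catalog here is synthetic, and because the rate is estimated from the same data, the KS p-value is only approximate (a Lilliefors-type or Monte Carlo calibration would be more rigorous).

```python
import numpy as np
from scipy import stats

def poisson_interevent_test(event_times):
    """KS test of inter-event times against an exponential law, i.e. a
    homogeneous Poisson process for the (declustered) catalog.
    Note: the rate is estimated from the same data, so the p-value is
    approximate; Monte Carlo calibration would sharpen it."""
    gaps = np.diff(np.sort(event_times))
    return stats.kstest(gaps, "expon", args=(0.0, gaps.mean()))

# Synthetic catalog: 500 events with a constant rate (times in days).
rng = np.random.default_rng(1)
times = np.cumsum(rng.exponential(scale=30.0, size=500))
print(poisson_interevent_test(times))
```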
Color constancy in natural scenes explained by global image statistics
Foster, David H.; Amano, Kinjiro; Nascimento, Sérgio M. C.
2007-01-01
To what extent do observers' judgments of surface color with natural scenes depend on global image statistics? To address this question, a psychophysical experiment was performed in which images of natural scenes under two successive daylights were presented on a computer-controlled high-resolution color monitor. Observers reported whether there was a change in reflectance of a test surface in the scene. The scenes were obtained with a hyperspectral imaging system and included variously trees, shrubs, grasses, ferns, flowers, rocks, and buildings. Discrimination performance, quantified on a scale of 0 to 1 with a color-constancy index, varied from 0.69 to 0.97 over 21 scenes and two illuminant changes, from a correlated color temperature of 25,000 K to 6700 K and from 4000 K to 6700 K. The best account of these effects was provided by receptor-based rather than colorimetric properties of the images. Thus, in a linear regression, 43% of the variance in constancy index was explained by the log of the mean relative deviation in spatial cone-excitation ratios evaluated globally across the two images of a scene. A further 20% was explained by including the mean chroma of the first image and its difference from that of the second image and a further 7% by the mean difference in hue. Together, all four global color properties accounted for 70% of the variance and provided a good fit to the effects of scene and of illuminant change on color constancy, and, additionally, of changing test-surface position. By contrast, a spatial-frequency analysis of the images showed that the gradient of the luminance amplitude spectrum accounted for only 5% of the variance. PMID:16961965
Color constancy in natural scenes explained by global image statistics.
Foster, David H; Amano, Kinjiro; Nascimento, Sérgio M C
2006-01-01
To what extent do observers' judgments of surface color with natural scenes depend on global image statistics? To address this question, a psychophysical experiment was performed in which images of natural scenes under two successive daylights were presented on a computer-controlled high-resolution color monitor. Observers reported whether there was a change in reflectance of a test surface in the scene. The scenes were obtained with a hyperspectral imaging system and included variously trees, shrubs, grasses, ferns, flowers, rocks, and buildings. Discrimination performance, quantified on a scale of 0 to 1 with a color-constancy index, varied from 0.69 to 0.97 over 21 scenes and two illuminant changes, from a correlated color temperature of 25,000 K to 6700 K and from 4000 K to 6700 K. The best account of these effects was provided by receptor-based rather than colorimetric properties of the images. Thus, in a linear regression, 43% of the variance in constancy index was explained by the log of the mean relative deviation in spatial cone-excitation ratios evaluated globally across the two images of a scene. A further 20% was explained by including the mean chroma of the first image and its difference from that of the second image and a further 7% by the mean difference in hue. Together, all four global color properties accounted for 70% of the variance and provided a good fit to the effects of scene and of illuminant change on color constancy, and, additionally, of changing test-surface position. By contrast, a spatial-frequency analysis of the images showed that the gradient of the luminance amplitude spectrum accounted for only 5% of the variance.
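A sketch of the incremental variance decomposition used in the two color-constancy records above: predictors are entered one at a time into a linear regression and the cumulative R² is recorded. The variable names and data are hypothetical stand-ins for the four global color properties, so the printed values do not reproduce the papers' 43%/63%/70% figures.

```python
import numpy as np

def incremental_r2(y, named_predictors):
    """Cumulative R^2 as predictors are added one at a time to an
    ordinary least-squares regression (with intercept)."""
    n, cols, out = len(y), [], []
    for name, col in named_predictors:
        cols.append(col)
        A = np.column_stack([np.ones(n)] + cols)
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        r2 = 1.0 - np.var(y - A @ beta) / np.var(y)
        out.append((name, r2))
    return out

# Hypothetical stand-ins for the four global color properties.
rng = np.random.default_rng(2)
n = 42                                  # e.g., 21 scenes x 2 illuminants
cone, chroma, d_chroma, d_hue = rng.normal(size=(4, n))
index = 0.7 * cone + 0.5 * chroma + 0.2 * d_chroma + 0.3 * rng.normal(size=n)
for name, r2 in incremental_r2(index, [("cone ratios", cone),
                                       ("chroma", chroma),
                                       ("chroma diff", d_chroma),
                                       ("hue diff", d_hue)]):
    print(f"{name:12s} cumulative R^2 = {r2:.2f}")
```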
A hierarchical fuzzy rule-based approach to aphasia diagnosis.
Akbarzadeh-T, Mohammad-R; Moshtagh-Khorasani, Majid
2007-10-01
Aphasia diagnosis is a particularly challenging medical diagnostic task due to the linguistic uncertainty and vagueness, inconsistencies in the definition of aphasic syndromes, the large number of measurements with imprecision, and the natural diversity and subjectivity in test objects as well as in the opinions of the experts who diagnose the disease. To efficiently address this diagnostic process, a hierarchical fuzzy rule-based structure is proposed here that considers the effect of different features of aphasia by statistical analysis in its construction. This approach can be efficient for diagnosis of aphasia and possibly other medical diagnostic applications due to its fuzzy and hierarchical reasoning construction. Initially, the symptoms of the disease, each of which consists of different features, are analyzed statistically. The measured statistical parameters from the training set are then used to define membership functions and the fuzzy rules. The resulting two-layered fuzzy rule-based system is then compared with a back-propagating feed-forward neural network for diagnosis of four aphasia types: anomic, Broca, global and Wernicke. In order to reduce the number of required inputs, the technique is applied and compared on both comprehensive and spontaneous speech tests. Statistical t-test analysis confirms that the proposed approach uses fewer aphasia features while also presenting a significant improvement in terms of accuracy.
Quality of Life and Nutritional Status Among Cancer Patients on Chemotherapy
Vergara, Nunilon; Montoya, Jose Enrique; Luna, Herdee Gloriane; Amparo, Jose Roberto; Cristal-Luna, Gloria
2013-01-01
Objectives Malnutrition is prevalent among cancer patients, and may be correlated with altered quality of life. The objective of this study is to determine whether quality of life among cancer patients on chemotherapy at the National Kidney and Transplant Institute-Cancer Unit differs from patients with normal nutrition based on the Subjective Global Assessment scale. Methods A cross-sectional study was conducted among cancer patients admitted for chemotherapy at the National Kidney and Transplant Institute-Cancer Unit from January to May 2011. Demographic profile, performance status by the Eastern Cooperative Oncology Group performance scale, nutritional status assessment by Subjective Global Assessment, and quality of life assessment by the European Organization for Research and Treatment of Cancer QoL-30 core module were obtained. Descriptive statistics and ANOVA were performed for analysis of quality of life parameters and nutritional status. Results A total of 97 subjects were included in this study; 66 subjects (68.04%) were females and 31 (31.96%) were males. Mean age was 54.55 ± 11.14 years, while mean performance status by the Eastern Cooperative Oncology Group classification was 0.88 ± 0.83 with a range of 0-3. According to the Subjective Global Assessment, 58 patients were classified SGA-A (adequate nutrition) and 39 patients (40.21%) were considered malnourished. Among these 39 patients, 32 were classified SGA-B (moderately malnourished) and 7 were classified SGA-C (severely malnourished). Mean global quality of life was 68.73 ± 19.05. Results from the ANOVA test revealed that patients were statistically different across the Subjective Global Assessment groups according to global quality of life (p<0.001), physical (p<0.001), role (p<0.001), emotional (p<0.001), and cognitive functioning (p<0.001); fatigue (p<0.001), nausea and vomiting (p<0.001), pain (p<0.001), insomnia (p<0.001), and appetite loss (p<0.001). Conclusion Global quality of life and its parameters: physical state, role, emotional state, cognitive functioning, cancer fatigue, nausea and vomiting, pain, insomnia, and loss of appetite were statistically different across all Subjective Global Assessment groups. Moreover, there was no difference in financial difficulties, social functioning, constipation and diarrhea among the Subjective Global Assessment groups. PMID:23904921
Quality of life and nutritional status among cancer patients on chemotherapy.
Vergara, Nunilon; Montoya, Jose Enrique; Luna, Herdee Gloriane; Amparo, Jose Roberto; Cristal-Luna, Gloria
2013-07-01
Malnutrition is prevalent among cancer patients, and may be correlated with altered quality of life. The objective of this study is to determine whether quality of life among cancer patients on chemotherapy at the National Kidney and Transplant Institute-Cancer Unit differs from patients with normal nutrition based on the Subjective Global Assessment scale. A cross-sectional study was conducted among cancer patients admitted for chemotherapy at the National Kidney and Transplant Institute-Cancer Unit from January to May 2011. Demographic profile, performance status by the Eastern Cooperative Oncology Group performance scale, nutritional status assessment by Subjective Global Assessment, and quality of life assessment by the European Organization for Research and Treatment of Cancer QoL-30 core module were obtained. Descriptive statistics and ANOVA were performed for analysis of quality of life parameters and nutritional status. A total of 97 subjects were included in this study; 66 subjects (68.04%) were females and 31 (31.96%) were males. Mean age was 54.55 ± 11.14 years, while mean performance status by the Eastern Cooperative Oncology Group classification was 0.88 ± 0.83 with a range of 0-3. According to the Subjective Global Assessment, 58 patients were classified SGA-A (adequate nutrition) and 39 patients (40.21%) were considered malnourished. Among these 39 patients, 32 were classified SGA-B (moderately malnourished) and 7 were classified SGA-C (severely malnourished). Mean global quality of life was 68.73 ± 19.05. Results from the ANOVA test revealed that patients were statistically different across the Subjective Global Assessment groups according to global quality of life (p<0.001), physical (p<0.001), role (p<0.001), emotional (p<0.001), and cognitive functioning (p<0.001); fatigue (p<0.001), nausea and vomiting (p<0.001), pain (p<0.001), insomnia (p<0.001), and appetite loss (p<0.001). Global quality of life and its parameters: physical state, role, emotional state, cognitive functioning, cancer fatigue, nausea and vomiting, pain, insomnia, and loss of appetite were statistically different across all Subjective Global Assessment groups. Moreover, there was no difference in financial difficulties, social functioning, constipation and diarrhea among the Subjective Global Assessment groups.
Structural texture similarity metrics for image analysis and retrieval.
Zujovic, Jana; Pappas, Thrasyvoulos N; Neuhoff, David L
2013-07-01
We develop new metrics for texture similarity that account for human visual perception and the stochastic nature of textures. The metrics rely entirely on local image statistics and allow substantial point-by-point deviations between textures that, according to human judgment, are essentially identical. The proposed metrics extend the ideas of structural similarity and are guided by research in texture analysis-synthesis. They are implemented using a steerable filter decomposition and incorporate a concise set of subband statistics, computed globally or in sliding windows. We conduct systematic tests to investigate metric performance in the context of "known-item search," the retrieval of textures that are "identical" to the query texture. This eliminates the need for cumbersome subjective tests, thus enabling comparisons with human performance on a large database. Our experimental results indicate that the proposed metrics outperform peak signal-to-noise ratio (PSNR), the structural similarity metric (SSIM) and its variations, as well as state-of-the-art texture classification metrics, using standard statistical measures.
NASA Astrophysics Data System (ADS)
Thompson, E. M.; Hewlett, J. B.; Baise, L. G.; Vogel, R. M.
2011-01-01
Annual maximum (AM) time series are incomplete (i.e., censored) when no events are included above the assumed censoring threshold (i.e., magnitude of completeness). We introduce a distributional hypothesis test for left-censored Gumbel observations based on the probability plot correlation coefficient (PPCC). Critical values of the PPCC hypothesis test statistic are computed from Monte Carlo simulations and are a function of sample size, censoring level, and significance level. When applied to a global catalog of earthquake observations, the left-censored Gumbel PPCC tests are unable to reject the Gumbel hypothesis for 45 of 46 seismic regions. We apply four different field significance tests for combining individual tests into a collective hypothesis test. None of the field significance tests are able to reject the global hypothesis that AM earthquake magnitudes arise from a Gumbel distribution. Because the field significance levels are not conclusive, we also compute the likelihood that these field significance tests are unable to reject the Gumbel model when the samples arise from a more complex distributional alternative. A power study documents that the censored Gumbel PPCC test is unable to reject some important and viable Generalized Extreme Value (GEV) alternatives. Thus, we cannot rule out the possibility that the global AM earthquake time series could arise from a GEV distribution with a finite upper bound, also known as a reverse Weibull distribution. Our power study also indicates that the binomial and uniform field significance tests are substantially more powerful than the more commonly used Bonferroni and false discovery rate multiple comparison procedures.
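A minimal sketch of a PPCC goodness-of-fit test for the (uncensored) Gumbel case, with Monte Carlo critical values as described above; the handling of left-censored observations, which is the paper's key extension, is omitted for brevity, and the sample size and parameters are illustrative.

```python
import numpy as np
from scipy import stats

def gumbel_ppcc(sample):
    """Correlation coefficient of the Gumbel probability plot (PPCC)."""
    (_osm, _osr), (_slope, _icept, r) = stats.probplot(sample,
                                                       dist=stats.gumbel_r)
    return r

def mc_critical_value(n, alpha=0.05, reps=2000, seed=3):
    """Monte Carlo critical value of the PPCC under the Gumbel null;
    reject the Gumbel hypothesis if the observed PPCC falls below it."""
    rng = np.random.default_rng(seed)
    sims = [gumbel_ppcc(stats.gumbel_r.rvs(size=n, random_state=rng))
            for _ in range(reps)]
    return float(np.quantile(sims, alpha))

n = 46                       # e.g., one AM sample per seismic region
sample = stats.gumbel_r.rvs(loc=6.0, scale=0.4, size=n, random_state=42)
print(gumbel_ppcc(sample), mc_critical_value(n))
```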
Random Forests for Global and Regional Crop Yield Predictions.
Jeong, Jig Han; Resop, Jonathan P; Mueller, Nathaniel D; Fleisher, David H; Yun, Kyungdahm; Butler, Ethan E; Timlin, Dennis J; Shim, Kyo-Moon; Gerber, James S; Reddy, Vangimalla R; Kim, Soo-Hyung
2016-01-01
Accurate predictions of crop yield are critical for developing effective agricultural and food policies at the regional and global scales. We evaluated a machine-learning method, Random Forests (RF), for its ability to predict crop yield responses to climate and biophysical variables at global and regional scales in wheat, maize, and potato in comparison with multiple linear regressions (MLR) serving as a benchmark. We used crop yield data from various sources and regions for model training and testing: 1) gridded global wheat grain yield, 2) maize grain yield from US counties over thirty years, and 3) potato tuber and maize silage yield from the northeastern seaboard region. RF was found highly capable of predicting crop yields and outperformed MLR benchmarks in all performance statistics that were compared. For example, the root mean square errors (RMSE) ranged between 6 and 14% of the average observed yield with RF models in all test cases whereas these values ranged from 14% to 49% for MLR models. Our results show that RF is an effective and versatile machine-learning method for crop yield predictions at regional and global scales for its high accuracy and precision, ease of use, and utility in data analysis. RF may result in a loss of accuracy when predicting the extreme ends or responses beyond the boundaries of the training data.
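A brief sketch of the RF-versus-MLR benchmark comparison on synthetic yield data, assuming scikit-learn is available; the predictors, sample size, and nonlinear response are invented, so the RMSE gap merely illustrates the comparison rather than reproducing the paper's 6-14% versus 14-49% figures.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic stand-in for yield vs. climate/biophysical predictors.
rng = np.random.default_rng(4)
X = rng.normal(size=(2000, 6))            # e.g., temp, precip, soil, ...
y = 5 + X[:, 0] * X[:, 1] + np.sin(X[:, 2]) + 0.3 * rng.normal(size=2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for name, model in [("RF", RandomForestRegressor(n_estimators=300,
                                                 random_state=0)),
                    ("MLR", LinearRegression())]:
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: RMSE = {100 * rmse / y_te.mean():.1f}% of mean yield")
```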
Zamani, Omid; Böttcher, Elke; Rieger, Jörg D; Mitterhuber, Johann; Hawel, Reinhold; Stallinger, Sylvia; Eller, Norbert
2014-06-01
In this observer-blinded, multicenter, non-inferiority study, 489 patients suffering from painful osteoarthritis of the hip or knee were included to investigate the safety and tolerability of Dexibuprofen vs. Ibuprofen powder for oral suspension. Only patients who had everyday joint pain for the past 3 months and "moderate" to "severe" global pain intensity in the involved hip/knee within the last 48 h were enrolled. The treatment period was up to 14 days, with a control visit after 3 days. The test product was Dexibuprofen 400 mg powder for oral suspension (daily dose 800 mg) compared to Ibuprofen 400 mg powder for oral suspension (daily dose 1,600 mg). Gastrointestinal adverse drug reactions were reported in 8 patients (3.3 %) in the Dexibuprofen group and in 19 patients (7.8 %) in the Ibuprofen group. Statistically significant non-inferiority was shown for Dexibuprofen. Comparing both groups by a chi-square test showed a statistically significant lower proportion of related gastrointestinal events in the Dexibuprofen group. All analyses of secondary tolerability parameters showed the same result of a significantly better safety profile in this therapy setting for Dexibuprofen compared to Ibuprofen. The sum of pain intensity, pain relief and global assessments showed no significant difference between treatment groups. In summary, analyses revealed at least non-inferiority in terms of efficacy and a statistically significantly better safety profile for the Dexibuprofen treatment.
Time series modelling of global mean temperature for managerial decision-making.
Romilly, Peter
2005-07-01
Climate change has important implications for business and economic activity. Effective management of climate change impacts will depend on the availability of accurate and cost-effective forecasts. This paper uses univariate time series techniques to model the properties of a global mean temperature dataset in order to develop a parsimonious forecasting model for managerial decision-making over the short-term horizon. Although the model is estimated on global temperature data, the methodology could also be applied to temperature data at more localised levels. The statistical techniques include seasonal and non-seasonal unit root testing with and without structural breaks, as well as ARIMA and GARCH modelling. A forecasting evaluation shows that the chosen model performs well against rival models. The estimation results confirm the findings of a number of previous studies, namely that global mean temperatures increased significantly throughout the 20th century. The use of GARCH modelling also shows the presence of volatility clustering in the temperature data, and a positive association between volatility and global mean temperature.
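A sketch of the modelling pipeline described above, assuming statsmodels for ARIMA and the third-party `arch` package for GARCH; the temperature series is synthetic and the orders (1,1,1) and GARCH(1,1) are placeholders rather than the paper's selected specification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model   # assumption: the `arch` package is installed

# Synthetic stand-in for an annual global mean temperature anomaly series.
rng = np.random.default_rng(5)
n = 150
trend = 0.007 * np.arange(n)                       # warming trend, deg C/yr
anom = pd.Series(trend + np.cumsum(rng.normal(0, 0.05, n)),
                 index=pd.date_range("1870", periods=n, freq="YS"))

# ARIMA(1,1,1) for the conditional mean; short-horizon forecast.
arima = ARIMA(anom, order=(1, 1, 1)).fit()
print(arima.forecast(steps=5))

# GARCH(1,1) on the differenced series to examine volatility clustering
# (scaled by 100, as the arch package recommends for small values).
garch = arch_model(anom.diff().dropna() * 100, vol="GARCH",
                   p=1, q=1).fit(disp="off")
print(garch.params)
```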
Random variability explains apparent global clustering of large earthquakes
Michael, A.J.
2011-01-01
The occurrence of 5 Mw ≥ 8.5 earthquakes since 2004 has created a debate over whether or not we are in a global cluster of large earthquakes, temporarily raising risks above long-term levels. I use three classes of statistical tests to determine if the record of M ≥ 7 earthquakes since 1900 can reject a null hypothesis of independent random events with a constant rate plus localized aftershock sequences. The data cannot reject this null hypothesis. Thus, the temporal distribution of large global earthquakes is well-described by a random process, plus localized aftershocks, and apparent clustering is due to random variability. Therefore the risk of future events has not increased, except within ongoing aftershock sequences, and should be estimated from the longest possible record of events.
Vasconcellos, Luiz Felipe; Pereira, João Santos; Adachi, Marcelo; Greca, Denise; Cruz, Manuela; Malak, Ana Lara; Charchat-Fichman, Helenice; Spitz, Mariana
2017-01-01
Few studies have evaluated magnetic resonance imaging (MRI) visual scales in Parkinson's disease-Mild Cognitive Impairment (PD-MCI). We selected 79 PD patients and 92 controls (CO) to perform neurologic and neuropsychological evaluation. Brain MRI was performed to evaluate the following scales: Global Cortical Atrophy (GCA), Fazekas, and medial temporal atrophy (MTA). The analysis revealed that both PD groups (amnestic and nonamnestic) showed worse performance on several tests when compared to CO. Memory, executive function, and attention impairment were more severe in the amnestic PD-MCI group. Overall analysis of the frequency of MRI visual scales by MCI subtype did not reveal any statistically significant result. A statistically significant inverse correlation was observed between the GCA scale and the Mini-Mental Status Examination (MMSE), Montreal Cognitive Assessment (MoCA), semantic verbal fluency, Stroop test, figure memory test, trail making test (TMT) B, and Rey Auditory Verbal Learning Test (RAVLT). The MTA scale correlated with the Stroop test, and the Fazekas scale with the figure memory test, digit span, and Stroop test, according to the subgroup evaluated. Visual scales by MRI in MCI should be evaluated by cognitive domain and might be more useful in more severely impaired MCI or dementia patients.
Heat balance statistics derived from four-dimensional assimilations with a global circulation model
NASA Technical Reports Server (NTRS)
Schubert, S. D.; Herman, G. F.
1981-01-01
The reported investigation was conducted to develop a reliable procedure for obtaining the diabatic and vertical terms required for atmospheric heat balance studies. The method developed employs a four-dimensional assimilation mode in connection with the general circulation model of NASA's Goddard Laboratory for Atmospheric Sciences. The initial analysis was conducted with data obtained in connection with the 1976 Data Systems Test. On the basis of the results of the investigation, it appears possible to use the model's observationally constrained diagnostics to provide estimates of the global distribution of virtually all of the quantities which are needed to compute the atmosphere's heat and energy balance.
Olimpo, Jeffrey T.; Pevey, Ryan S.; McCabe, Thomas M.
2018-01-01
Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students’ reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce. PMID:29904549
Olimpo, Jeffrey T; Pevey, Ryan S; McCabe, Thomas M
2018-01-01
Course-based undergraduate research experiences (CUREs) provide an avenue for student participation in authentic scientific opportunities. Within the context of such coursework, students are often expected to collect, analyze, and evaluate data obtained from their own investigations. Yet, limited research has been conducted that examines mechanisms for supporting students in these endeavors. In this article, we discuss the development and evaluation of an interactive statistics workshop that was expressly designed to provide students with an open platform for graduate teaching assistant (GTA)-mentored data processing, statistical testing, and synthesis of their own research findings. Mixed methods analyses of pre/post-intervention survey data indicated a statistically significant increase in students' reasoning and quantitative literacy abilities in the domain, as well as enhancement of student self-reported confidence in and knowledge of the application of various statistical metrics to real-world contexts. Collectively, these data reify an important role for scaffolded instruction in statistics in preparing emergent scientists to be data-savvy researchers in a globally expansive STEM workforce.
A graph-based watershed merging using fuzzy C-means and simulated annealing for image segmentation
NASA Astrophysics Data System (ADS)
Vadiveloo, Mogana; Abdullah, Rosni; Rajeswari, Mandava
2015-12-01
In this paper, we address the issue of over-segmented regions produced by watershed segmentation by merging the regions using a global feature. The global feature information is obtained by clustering the image in its feature space using Fuzzy C-Means (FCM) clustering. The over-segmented regions produced by performing watershed on the gradient of the image are then mapped to this global information in the feature space. Further, the global feature information is optimized using Simulated Annealing (SA). The optimal global feature information is used to derive the similarity criterion for merging the over-segmented watershed regions, which are represented by a region adjacency graph (RAG). The proposed method has been tested on a digital brain phantom simulated dataset to segment white matter (WM), gray matter (GM) and cerebrospinal fluid (CSF) soft-tissue regions. The experiments showed that the proposed method performs statistically better than immersion watershed, with an average of 95.242% of regions merged, and achieves an average accuracy improvement of 8.850% in comparison with RAG-based immersion watershed merging using global and local features.
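For concreteness, a minimal Fuzzy C-Means iteration of the kind used above to obtain the global feature information; the watershed, simulated annealing, and RAG-merging steps are omitted, and the toy one-dimensional "image" features are invented.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=6):
    """Minimal Fuzzy C-Means in feature space: returns cluster centers
    and the fuzzy membership matrix u (n_samples x c)."""
    rng = np.random.default_rng(seed)
    u = rng.dirichlet(np.ones(c), size=len(X))       # random memberships
    for _ in range(iters):
        w = u ** m
        centers = (w.T @ X) / w.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        # Standard update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        inv = d ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True)
    return centers, u

# Invented 1-D "image" features: two intensity populations plus noise.
rng = np.random.default_rng(7)
X = np.concatenate([rng.normal(0.2, 0.05, (500, 1)),
                    rng.normal(0.8, 0.05, (500, 1))])
centers, u = fuzzy_c_means(X, c=2)
print(centers.ravel())
```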
Geomagnetic South Atlantic Anomaly and global sea level rise: A direct connection?
NASA Astrophysics Data System (ADS)
de Santis, A.; Qamili, E.; Spada, G.; Gasperini, P.
2012-01-01
We highlight the existence of an intriguing and to date unreported relationship between the surface area of the South Atlantic Anomaly (SAA) of the geomagnetic field and the current trend in global sea level rise. These two geophysical variables have been growing coherently during the last three centuries, thus strongly suggesting a causal relationship supported by some statistical tests. The monotonic increase of the SAA surface area since 1600 may have been associated with an increased inflow of radiation energy through the inner Van Allen belt with a consequent warming of the Earth's atmosphere and finally global sea level rise. An alternative suggestive and original explanation is also offered, in which pressure changes at the core-mantle boundary cause surface deformations and relative sea level variations. Although we cannot establish a clear connection between SAA dynamics and global warming, the strong correlation between the former and global sea level supports the idea that global warming may be at least partly controlled by deep Earth processes triggering geomagnetic phenomena, such as the South Atlantic Anomaly, on a century time scale.
The statistical analysis of global climate change studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardin, J.W.
1992-01-01
The focus of this work is to contribute to the enhancement of the relationship between climatologists and statisticians. The analysis of global change data has been underway for many years by atmospheric scientists. Much of this analysis includes a heavy reliance on statistics and statistical inference. Some specific climatological analyses are presented and the dependence on statistics is documented before the analysis is undertaken. The first problem presented involves the fluctuation-dissipation theorem and its application to global climate models. This problem has a sound theoretical niche in the literature of both climate modeling and physics, but a statistical analysis in which the data are obtained from the model to show graphically the relationship has not been undertaken. It is under this motivation that the author presents this problem. A second problem concerning the standard errors in estimating global temperatures is purely statistical in nature, although very little material exists for sampling on such a frame. This problem has not only climatological and statistical ramifications, but political ones as well. It is planned to use these results in a further analysis of global warming using actual data collected on the earth. In order to simplify the analysis of these problems, the development of a computer program, MISHA, is presented. This interactive program contains many of the routines, functions, graphics, and map projections needed by the climatologist in order to effectively enter the arena of data visualization.
Human-modified temperatures induce species changes: Joint attribution.
Root, Terry L; MacMynowski, Dena P; Mastrandrea, Michael D; Schneider, Stephen H
2005-05-24
Average global surface-air temperature is increasing. Contention exists over relative contributions by natural and anthropogenic forcings. Ecological studies attribute plant and animal changes to observed warming. Until now, temperature-species connections have not been statistically attributed directly to anthropogenic climatic change. Using modeled climatic variables and observed species data, which are independent of thermometer records and paleoclimatic proxies, we demonstrate statistically significant "joint attribution," a two-step linkage: human activities contribute significantly to temperature changes and human-changed temperatures are associated with discernible changes in plant and animal traits. Additionally, our analyses provide independent testing of grid-box-scale temperature projections from a general circulation model (HadCM3).
NASA Astrophysics Data System (ADS)
Liu, Jann-Yenq; Chen, Koichi; Tsai, Ho-Fang; Hattori, Katsumi; Le, Huijun
2013-04-01
This paper reports statistical results of seismo-ionospheric precursors (SIPs) of the total electron content (TEC) in the global ionosphere map (GIM) over the epicenters of earthquakes with magnitude 6 and greater in China, Japan, and Taiwan during 1998-2012. To detect SIPs, a quartile-based (i.e. median-based) process is performed. The earthquakes are subdivided into various regions to give a better understanding of SIP characteristics, and are examined with and without being led by magnetic storms to confirm the existence of SIPs. Results show that the SIPs are mainly significant TEC increases in Japan and decreases in Taiwan and China, respectively, which suggests that the latitudinal effect plays an important role. Meanwhile, for a practical application of monitoring SIPs, the GIM TEC at a fixed point is tested. Results show that multiple monitoring points and/or a spatial observation are required to enhance SIP detection.
Earthquake number forecasts testing
NASA Astrophysics Data System (ADS)
Kagan, Yan Y.
2017-10-01
We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for the catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues. The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness and kurtosis both tend to zero for large earthquake rates: for the Gaussian law, these values are identically zero. A calculation of the NBD skewness and kurtosis levels based on the values of the first two statistical moments of the distribution, shows rapid increase of these upper moments levels. However, the observed catalogue values of skewness and kurtosis are rising even faster. This means that for small time intervals, the earthquake number distribution is even more heavy-tailed than the NBD predicts. Therefore for small time intervals, we propose using empirical number distributions appropriately smoothed for testing forecasted earthquake numbers.
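A small sketch of the Poisson-versus-NBD comparison of upper moments described above: fit the negative binomial by the method of moments and compare its theoretical skewness and excess kurtosis with the empirical values and with the Poisson prediction. The counts are simulated, not taken from the GCMT or PDE catalogues.

```python
import numpy as np
from scipy import stats

def nbd_fit_moments(counts):
    """Method-of-moments fit of the negative binomial (r, p) from
    event counts per time interval; requires var > mean
    (overdispersion), since p = mean/var must lie in (0, 1)."""
    mu, var = counts.mean(), counts.var(ddof=1)
    p = mu / var
    r = mu * p / (1 - p)
    return r, p

# Simulated overdispersed "earthquake numbers per month".
rng = np.random.default_rng(8)
counts = stats.nbinom.rvs(n=5, p=0.3, size=1200, random_state=rng)

r, p = nbd_fit_moments(counts)
print("observed skew/kurtosis:", stats.skew(counts), stats.kurtosis(counts))
print("NBD skew/kurtosis:     ", stats.nbinom.stats(r, p, moments="sk"))
print("Poisson skew/kurtosis: ", stats.poisson.stats(counts.mean(), moments="sk"))
```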
Seasonality of Kawasaki Disease: A Global Perspective
Burns, Jane C.; Herzog, Lauren; Fabri, Olivia; Tremoulet, Adriana H.; Rodó, Xavier; Uehara, Ritei; Burgner, David; Bainto, Emelia; Pierce, David; Tyree, Mary; Cayan, Daniel
2013-01-01
Background Understanding global seasonal patterns of Kawasaki disease (KD) may provide insight into the etiology of this vasculitis that is now the most common cause of acquired heart disease in children in developed countries worldwide. Methods Data from 1970-2012 from 25 countries distributed over the globe were analyzed for seasonality. The number of KD cases from each location was normalized to minimize the influence of greater numbers from certain locations. The presence of seasonal variation of KD at the individual locations was evaluated using three different tests: time series modeling, spectral analysis, and a Monte Carlo technique. Results A defined seasonal structure emerged demonstrating broad coherence in fluctuations in KD cases across the Northern Hemisphere extra-tropical latitudes. In the extra-tropical latitudes of the Northern Hemisphere, KD case numbers were highest in January through March and approximately 40% higher than in the months of lowest case numbers from August through October. Datasets were much sparser in the tropics and the Southern Hemisphere extra-tropics and statistical significance of the seasonality tests was weak, but suggested a maximum in May through June, with approximately 30% higher number of cases than in the least active months of February, March and October. The seasonal pattern in the Northern Hemisphere extra-tropics was consistent across the first and second halves of the sample period. Conclusion Using the first global KD time series, analysis of sites located in the Northern Hemisphere extra-tropics revealed statistically significant and consistent seasonal fluctuations in KD case numbers with high numbers in winter and low numbers in late summer and fall. Neither the tropics nor the Southern Hemisphere extra-tropics registered a statistically significant aggregate seasonal cycle. These data suggest a seasonal exposure to a KD agent that operates over large geographic regions and is concentrated during winter months in the Northern Hemisphere extra-tropics. PMID:24058585
Seasonality of Kawasaki disease: a global perspective.
Burns, Jane C; Herzog, Lauren; Fabri, Olivia; Tremoulet, Adriana H; Rodó, Xavier; Uehara, Ritei; Burgner, David; Bainto, Emelia; Pierce, David; Tyree, Mary; Cayan, Daniel
2013-01-01
Understanding global seasonal patterns of Kawasaki disease (KD) may provide insight into the etiology of this vasculitis that is now the most common cause of acquired heart disease in children in developed countries worldwide. Data from 1970-2012 from 25 countries distributed over the globe were analyzed for seasonality. The number of KD cases from each location was normalized to minimize the influence of greater numbers from certain locations. The presence of seasonal variation of KD at the individual locations was evaluated using three different tests: time series modeling, spectral analysis, and a Monte Carlo technique. A defined seasonal structure emerged demonstrating broad coherence in fluctuations in KD cases across the Northern Hemisphere extra-tropical latitudes. In the extra-tropical latitudes of the Northern Hemisphere, KD case numbers were highest in January through March and approximately 40% higher than in the months of lowest case numbers from August through October. Datasets were much sparser in the tropics and the Southern Hemisphere extra-tropics and statistical significance of the seasonality tests was weak, but suggested a maximum in May through June, with approximately 30% higher number of cases than in the least active months of February, March and October. The seasonal pattern in the Northern Hemisphere extra-tropics was consistent across the first and second halves of the sample period. Using the first global KD time series, analysis of sites located in the Northern Hemisphere extra-tropics revealed statistically significant and consistent seasonal fluctuations in KD case numbers with high numbers in winter and low numbers in late summer and fall. Neither the tropics nor the Southern Hemisphere extra-tropics registered a statistically significant aggregate seasonal cycle. These data suggest a seasonal exposure to a KD agent that operates over large geographic regions and is concentrated during winter months in the Northern Hemisphere extra-tropics.
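A hedged sketch of a Monte Carlo seasonality test in the spirit of the analysis in the two records above: the amplitude of the first annual harmonic of monthly case counts is compared against random reassignment of cases to months. A real analysis would adjust for month lengths, population at risk, and reporting effects; the case data here are simulated.

```python
import numpy as np

def seasonality_test(month_of_case, reps=10000, seed=9):
    """Monte Carlo seasonality test: amplitude of the first annual
    harmonic of monthly counts vs. a uniform-month null."""
    theta = 2 * np.pi * np.arange(12) / 12

    def amplitude(m):
        counts = np.bincount(m, minlength=12).astype(float)
        return np.hypot((counts * np.cos(theta)).sum(),
                        (counts * np.sin(theta)).sum())

    obs = amplitude(month_of_case)
    rng = np.random.default_rng(seed)
    null = [amplitude(rng.integers(0, 12, size=len(month_of_case)))
            for _ in range(reps)]
    p = (np.sum(np.array(null) >= obs) + 1) / (reps + 1)
    return obs, p

# Simulated winter-peaking cases (months 0-11, 0 = January).
rng = np.random.default_rng(10)
probs = 1 + 0.4 * np.cos(2 * np.pi * np.arange(12) / 12)
cases = rng.choice(12, size=800, p=probs / probs.sum())
print(seasonality_test(cases))
```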
Evidence for a Global Sampling Process in Extraction of Summary Statistics of Item Sizes in a Set.
Tokita, Midori; Ueda, Sachiyo; Ishiguchi, Akira
2016-01-01
Several studies have shown that our visual system may construct a "summary statistical representation" over groups of visual objects. Although there is a general understanding that human observers can accurately represent sets of a variety of features, many questions on how summary statistics, such as an average, are computed remain unanswered. This study investigated the sampling properties of visual information used by human observers to extract two types of summary statistics of item sets: average and variance. We presented three models of ideal observers to extract the summary statistics: a global sampling model without sampling noise, a global sampling model with sampling noise, and a limited sampling model. We compared the performance of an ideal observer of each model with that of human observers using statistical efficiency analysis. Results suggest that summary statistics of items in a set may be computed without representing individual items, which makes it possible to discard the limited sampling account. Moreover, the extraction of summary statistics may not necessarily require the representation of individual objects with focused attention when the sets of items are larger than 4.
Early onset obsessive-compulsive disorder with and without tics.
de Mathis, Maria Alice; Diniz, Juliana B; Shavitt, Roseli G; Torres, Albina R; Ferrão, Ygor A; Fossaluza, Victor; Pereira, Carlos; Miguel, Eurípedes; do Rosario, Maria Conceicão
2009-07-01
Research suggests that obsessive-compulsive disorder (OCD) is not a unitary entity, but rather a highly heterogeneous condition, with complex and variable clinical manifestations. The aims of this study were to compare clinical and demographic characteristics of OCD patients with early and late age of onset of obsessive-compulsive symptoms (OCS), and to compare the same features in early onset OCD with and without tics. The independent impact of age at onset and presence of tics on comorbidity patterns was investigated. Three hundred and thirty consecutive outpatients meeting Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition criteria for OCD were evaluated: 160 patients belonged to the "early onset" group (EOG; onset before 11 years of age), 75 patients had an "intermediate onset" (IOG), and 95 patients were from the "late onset" group (LOG; onset after 18 years of age). Of the 160 EOG patients, 60 had comorbid tic disorders. The diagnostic instruments used were the Yale-Brown Obsessive Compulsive Scale, the Dimensional Yale-Brown Obsessive Compulsive Scale (DY-BOCS), the Yale Global Tics Severity Scale, and the Structured Clinical Interview for DSM-IV Axis I Disorders-patient edition. Statistical tests used were the Mann-Whitney test, the full Bayesian significance test, and logistic regression. The EOG had a predominance of males, a higher frequency of family history of OCS, higher mean scores on the "aggression/violence" and "miscellaneous" dimensions, and higher mean global DY-BOCS scores. Patients in the EOG without tic disorders presented higher mean global DY-BOCS scores and higher mean scores in the "contamination/cleaning" dimension. The current results disentangle some of the clinical overlap between early onset OCD with and without tics.
Global motion perception is associated with motor function in 2-year-old children.
Thompson, Benjamin; McKinlay, Christopher J D; Chakraborty, Arijit; Anstice, Nicola S; Jacobs, Robert J; Paudel, Nabin; Yu, Tzu-Ying; Ansell, Judith M; Wouldes, Trecia A; Harding, Jane E
2017-09-29
The dorsal visual processing stream that includes V1, motion-sensitive area V5 and the posterior parietal lobe supports visually guided motor function. Two recent studies have reported associations between global motion perception, a behavioural measure of processing in V5, and motor function in pre-school and school aged children. This indicates a relationship between visual and motor development and also supports the use of global motion perception to assess overall dorsal stream function in studies of human neurodevelopment. We investigated whether associations between vision and motor function were present at 2 years of age, a substantially earlier stage of development. The Bayley III test of Infant and Toddler Development and measures of vision including visual acuity (Cardiff Acuity Cards), stereopsis (Lang stereotest) and global motion perception were attempted in 404 2-year-old children (±4 weeks). Global motion perception (quantified as a motion coherence threshold) was assessed by observing optokinetic nystagmus in response to random dot kinematograms of varying coherence. Linear regression revealed that global motion perception was modestly but statistically significantly associated with Bayley III composite motor (r² = 0.06, P < 0.001, n = 375) and gross motor scores (r² = 0.06, P < 0.001, n = 375). The associations remained significant when language score was included in the regression model. In addition, when language score was included in the model, stereopsis was significantly associated with composite motor and fine motor scores, but unaided visual acuity was not statistically significantly associated with any of the motor scores. These results demonstrate that global motion perception and binocular vision are associated with motor function at an early stage of development. Global motion perception can be used as a partial measure of dorsal stream function from early childhood.
How Well Has Global Ocean Heat Content Variability Been Measured?
NASA Astrophysics Data System (ADS)
Nelson, A.; Weiss, J.; Fox-Kemper, B.; Fabienne, G.
2016-12-01
We introduce a new strategy that uses synthetic observations of an ensemble of model simulations to test the fidelity of an observational strategy, quantifying how well it captures the statistics of variability. We apply this test to the 0-700m global ocean heat content anomaly (OHCA) as observed with in-situ measurements by the Coriolis Dataset for Reanalysis (CORA), using the Community Climate System Model (CCSM) version 3.5. One-year running mean OHCAs for the years 2005 onward are found to faithfully capture the variability. During these years, synthetic observations of the model are strongly correlated at 0.94±0.06 with the actual state of the model. Overall, sub-annual variability and data before 2005 are significantly affected by the variability of the observing system. In contrast, the sometimes-used weighted integral of observations is not a good indicator of OHCA as variability in the observing system contaminates dynamical variability.
Le Quellec, Sandra; Paris, Mickaël; Nougier, Christophe; Sobas, Frédéric; Rugeri, Lucia; Girard, Sandrine; Bordet, Jean-Claude; Négrier, Claude; Dargaud, Yesim
2017-05-01
Pneumatic tube system (PTS) in hospitals is commonly used for the transport of blood samples to clinical laboratories, as it is rapid and cost-effective. The aim was to compare the effects on haematology samples of a newly acquired ~2 km-long PTS that links 2 hospitals with usual transport (non-pneumatic tube system, NPTS). Complete blood cell count, routine coagulation assays, platelet function tests (PFT) with light-transmission aggregometry and global coagulation assays including ROTEM® and thrombin generation assay (TGA) were performed on blood samples from 30 healthy volunteers and 9 healthy volunteers who agreed to take aspirin prior to blood sampling. The turnaround time was reduced by 31% (p < 0.001) with the use of PTS. No statistically significant difference was observed for most routine haematology assays, including PFT and ROTEM® analysis. A statistically significant, but not clinically relevant, shortening of the APTT after sample transport by PTS was found (mean ± SD: 30 s ± 1.8 vs. 29.5 s ± 2.1 for NPTS). D-dimer levels were 7.4% higher after transport through PTS but were not discordant. A statistically significant increase of thrombin generation was found in both platelet-poor and platelet-rich plasma samples after PTS transport compared to NPTS transport. PTS is suitable for the transport of samples prior to routine haematology assays including PFT, but should not be used for samples intended for thrombin generation measurement.
Rodent Biocompatibility Test Using the NASA Foodbar and Epoxy EP21LV
NASA Technical Reports Server (NTRS)
Tillman, J.; Steele, M.; Dumars, P.; Vasques, M.; Girten, B.; Sun, S. (Technical Monitor)
2002-01-01
Epoxy has been used successfully to affix NASA foodbars to the inner walls of the Animal Enclosure Module for past space flight experiments utilizing rodents. The epoxy used on past missions was discontinued, making it necessary to identify a new epoxy for use on the STS-108 and STS-107 missions. This experiment was designed to test the basic biocompatibility of epoxy EP21LV with male rats (Sprague Dawley) and mice (Swiss Webster) when applied to NASA foodbars. For each species, the test was conducted with a control group fed untreated foodbars and an experimental group fed foodbars applied with EP21LV. For each species, there were no group differences in animal health and no statistical differences (P < 0.05) in body weights throughout the study. In mice, there was a 16% increase in heart weight in the epoxy group; this result was not found in rats. For both species, there were no statistical differences found in other organ weights measured. In rats, blood glucose levels were 15% higher and both total protein and globulin were 10% lower in the epoxy group. Statistical differences in these parameters were not found in mice. For both species, no statistical differences were found in other blood parameters tested. Food consumption was not different in rats, but water consumption was significantly decreased 10 to 15% in the epoxy group. The difference in water consumption is likely due to an increased water content of the epoxy-treated foodbars. Finally, both species avoided consumption of the epoxy material. Based on the global analysis of the results, the few parameters found to be statistically different do not appear to reflect a physiologically relevant effect of the epoxy material. We conclude that the EP21LV epoxy is biocompatible with rodents.
Sources of Error and the Statistical Formulation of M_S : m_b Seismic Event Screening Analysis
NASA Astrophysics Data System (ADS)
Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.
2014-03-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H_0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted m_b) computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with m_b greater than 3.5. The Rayleigh wave magnitude (denoted M_S) is a measure of later-arriving surface wave energy. Magnitudes are measurements of seismic energy that include adjustments (physical correction model) for path and distance effects between event and station. Relative to m_b, earthquakes generally have a larger M_S magnitude than explosions. This article proposes a hypothesis test (screening analysis) using M_S and m_b that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the 2009 Democratic People's Republic of Korea announced nuclear weapon test fails to reject the null hypothesis H_0: explosion characteristics.
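A schematic, in Python, of an M_S : m_b screening statistic of the Wald/z type in which the standard error explicitly carries a term for physical correction model inadequacy, as the abstract proposes. All numerical values below (the explosion-population offset and both error components) are hypothetical illustrations, not IDC operational parameters.

```python
import numpy as np
from scipy import stats

def ms_mb_screen(ms, mb, mu0=-1.2, sigma_path=0.2, sigma_meas=0.15):
    """One-sided z-type test of H_0: explosion characteristics.
    mu0        : hypothetical mean of M_S - m_b for explosions
                 (illustrative value only).
    sigma_path : extra standard error representing physical correction
                 model (path) inadequacy, per the formulation above.
    sigma_meas : measurement error of the magnitude difference."""
    d = ms - mb
    se = np.hypot(sigma_path, sigma_meas)   # combine error terms
    z = (d - mu0) / se
    # Earthquakes have larger M_S relative to m_b, so a large z
    # argues against the explosion hypothesis.
    return z, stats.norm.sf(z)

print(ms_mb_screen(ms=4.9, mb=4.5))   # large z -> earthquake-like
```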
Multifractal Approach to Time Clustering of Earthquakes. Application to Mt. Vesuvio Seismicity
NASA Astrophysics Data System (ADS)
Codano, C.; Alonzo, M. L.; Vilardo, G.
The clustering structure of Vesuvian earthquakes is investigated by means of three statistical tools: the inter-event time distribution, the running mean test, and multifractal analysis. The first cannot clearly discriminate a Poissonian process from a clustered one, owing to the difficulty of distinguishing an exponential distribution from a power-law one. The running mean test reveals the clustering of the earthquakes, but loses information about the structure of the distribution at global scales. The multifractal approach can reveal the clustering at small scales, while the global behaviour remains Poissonian. The clustering of the events is then interpreted in terms of diffusive processes of stress in the Earth's crust.
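As a concrete illustration of the first diagnostic, the sketch below compares the coefficient of variation (CV) of inter-event times on synthetic catalogs (not the Vesuvian data): for a Poisson process the CV is 1, while heavy-tailed waiting times produced by clustering push it well above 1.

```python
# Hedged sketch of an inter-event time diagnostic on synthetic catalogs.
# For a Poisson process inter-event times are exponential, so their
# coefficient of variation (CV) is 1; CV >> 1 suggests time clustering.
import numpy as np

rng = np.random.default_rng(0)
poisson_times = np.cumsum(rng.exponential(1.0, 1000))
clustered_times = np.cumsum(rng.pareto(1.5, 1000))  # heavy-tailed waits

def interevent_cv(event_times):
    dt = np.diff(np.sort(event_times))
    return dt.std() / dt.mean()

print(f"Poissonian CV ~ {interevent_cv(poisson_times):.2f}")   # close to 1
print(f"Clustered CV  ~ {interevent_cv(clustered_times):.2f}") # well above 1
```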
Learning Scene Categories from High Resolution Satellite Image for Aerial Video Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheriyadat, Anil M
2011-01-01
Automatic scene categorization can benefit various aerial video processing applications. This paper addresses the problem of predicting the scene category from aerial video frames using a prior model learned from satellite imagery. We show that local and global features, in the form of line statistics and 2-D power spectrum parameters respectively, can characterize the aerial scene well. The line feature statistics and spatial frequency parameters are useful cues to distinguish between different urban scene categories. We learn the scene prediction model from high-resolution satellite imagery and test the model on the Columbus Surrogate Unmanned Aerial Vehicle (CSUAV) dataset collected by a high-altitude wide-area UAV sensor platform. We compare the proposed features with the popular Scale Invariant Feature Transform (SIFT) features. Our experimental results show that the proposed approach outperforms the SIFT model when the training and testing are conducted on disparate data sources.
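A hedged sketch of one global feature of this kind, using synthetic image patches in place of aerial frames: the radially averaged 2-D power spectrum and its log-log slope, which separates smooth natural textures from flat-spectrum clutter.

```python
# Minimal sketch of a 2-D power spectrum feature (an assumption-level
# illustration, not the paper's exact parameterization): radially average
# the power spectrum and fit a log-log slope.
import numpy as np

def spectrum_slope(img):
    f = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    cy, cx = np.array(f.shape) // 2
    yy, xx = np.indices(f.shape)
    r = np.hypot(yy - cy, xx - cx).astype(int)
    radial = np.bincount(r.ravel(), weights=f.ravel()) / np.bincount(r.ravel())
    k = np.arange(1, min(cy, cx))          # drop the DC bin and corners
    return np.polyfit(np.log(k), np.log(radial[k]), 1)[0]

rng = np.random.default_rng(13)
natural = rng.normal(size=(128, 128)).cumsum(0).cumsum(1)  # smooth, 1/f-like
urban = rng.normal(size=(128, 128))                        # flat spectrum
print(f"slope natural ~ {spectrum_slope(natural):.2f}, "
      f"urban ~ {spectrum_slope(urban):.2f}")
```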
Global, local and focused geographic clustering for case-control data with residential histories
Jacquez, Geoffrey M; Kaufmann, Andy; Meliker, Jaymie; Goovaerts, Pierre; AvRuskin, Gillian; Nriagu, Jerome
2005-01-01
Background This paper introduces a new approach for evaluating clustering in case-control data that accounts for residential histories. Although many statistics have been proposed for assessing local, focused and global clustering in health outcomes, few, if any, exist for evaluating clusters when individuals are mobile. Methods Local, global and focused tests for residential histories are developed based on sets of matrices of nearest neighbor relationships that reflect the changing topology of cases and controls. Exposure traces are defined that account for the latency between exposure and disease manifestation, and that use exposure windows whose duration may vary. Several of the methods so derived are applied to evaluate clustering of residential histories in a case-control study of bladder cancer in south eastern Michigan. These data are still being collected and the analysis is conducted for demonstration purposes only. Results Statistically significant clustering of residential histories of cases was found but is likely due to delayed reporting of cases by one of the hospitals participating in the study. Conclusion Data with residential histories are preferable when causative exposures and disease latencies occur on a long enough time span that human mobility matters. To analyze such data, methods are needed that take residential histories into account. PMID:15784151
NASA Astrophysics Data System (ADS)
Guadagnini, A.; Riva, M.; Dell'Oca, A.
2017-12-01
We propose to base the sensitivity analysis of uncertain environmental model parameters on a set of indices derived from the main (statistical) moments, i.e., mean, variance, skewness and kurtosis, of the probability density function (pdf) of a target model output. This enables us to perform Global Sensitivity Analysis (GSA) of a model in terms of multiple statistical moments and yields a quantification of the impact of model parameters on the features driving the shape of the pdf of the model output. Our GSA approach can be coupled with the construction of a reduced-complexity model that approximates the full model response at a reduced computational cost. We demonstrate our approach through a variety of test cases. These include a commonly used analytical benchmark, a simplified model representing pumping in a coastal aquifer, a laboratory-scale tracer experiment, and the migration of fracturing fluid through a naturally fractured reservoir (source) to reach an overlying formation (target). Our strategy allows discriminating the relative importance of model parameters to the four statistical moments considered. We also provide an appraisal of the error associated with evaluating our sensitivity metrics when the original system model is replaced by the selected surrogate model. Our results suggest that one might need to construct a surrogate model with an increasing level of accuracy depending on the statistical moment considered in the GSA. The methodological framework we propose can assist the development of analysis techniques targeted at model calibration, design of experiments, uncertainty quantification and risk assessment.
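A minimal sketch of the idea, under stated assumptions (a toy analytic model and a simple binning estimator rather than the authors' exact index definitions): condition the output on each parameter in turn and measure how far the first four moments shift from their unconditional values.

```python
# Hedged sketch of moment-based GSA: bin each parameter's samples, compute
# conditional mean/variance/skewness/kurtosis of the output per bin, and
# report the mean absolute shift of each moment from its unconditional
# value. Model and normalization are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, nbins = 20000, 20
x = rng.uniform(-1, 1, size=(n, 3))
y = x[:, 0] + 0.5 * x[:, 1]**2 + 0.1 * rng.normal(size=n)  # toy model

def moments(v):
    return np.array([v.mean(), v.var(), stats.skew(v), stats.kurtosis(v)])

m0 = moments(y)  # unconditional moments
for j in range(3):
    edges = np.quantile(x[:, j], np.linspace(0, 1, nbins + 1))
    idx = np.clip(np.digitize(x[:, j], edges) - 1, 0, nbins - 1)
    cond = np.array([moments(y[idx == b]) for b in range(nbins)])
    sens = np.abs(cond - m0).mean(axis=0)  # mean shift of each moment
    print(f"x{j}: mean/var/skew/kurt sensitivity = {np.round(sens, 3)}")
```

On this toy model, x0 mostly moves the conditional mean while x1 mostly moves the conditional variance, which is exactly the kind of moment-specific ranking the abstract describes.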
Reese, Sarah E; Archer, Kellie J; Therneau, Terry M; Atkinson, Elizabeth J; Vachon, Celine M; de Andrade, Mariza; Kocher, Jean-Pierre A; Eckel-Passow, Jeanette E
2013-11-15
Batch effects are due to probe-specific systematic variation between groups of samples (batches) resulting from experimental features that are not of biological interest. Principal component analysis (PCA) is commonly used as a visual tool to determine whether batch effects exist after applying a global normalization method. However, PCA yields linear combinations of the variables that contribute maximum variance and thus will not necessarily detect batch effects if they are not the largest source of variability in the data. We present an extension of PCA to quantify the existence of batch effects, called guided PCA (gPCA). We describe a test statistic that uses gPCA to test whether a batch effect exists. We apply our proposed test statistic derived using gPCA to simulated data and to two copy number variation case studies: the first study consisted of 614 samples from a breast cancer family study using Illumina Human 660 bead-chip arrays, whereas the second case study consisted of 703 samples from a family blood pressure study that used Affymetrix SNP Array 6.0. We demonstrate that our statistic has good statistical properties and is able to identify significant batch effects in two copy number variation case studies. We developed a new statistic that uses gPCA to identify whether batch effects exist in high-throughput genomic data. Although our examples pertain to copy number data, gPCA is general and can be used on other data types as well. The gPCA R package (available via CRAN) provides functionality and data to perform the methods in this article. reesese@vcu.edu
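The statistic is straightforward to prototype. Below is a hedged sketch of the gPCA idea (the published gPCA R package is the reference implementation): delta compares the variance of the data projected on a batch-guided first principal component with that on the ordinary first principal component, and a permutation test on the batch labels supplies the p-value.

```python
# Hedged sketch of a gPCA-style batch-effect statistic; the gPCA R package
# on CRAN is the authors' reference implementation.
import numpy as np

def gpca_delta(X, batch):
    H = (batch[:, None] == np.unique(batch)[None, :]).astype(float)
    v_g = np.linalg.svd(H.T @ X, full_matrices=False)[2][0]  # guided PC1
    v_u = np.linalg.svd(X, full_matrices=False)[2][0]        # unguided PC1
    return np.var(X @ v_g) / np.var(X @ v_u)

def gpca_test(X, batch, n_perm=500, seed=0):
    rng = np.random.default_rng(seed)
    d0 = gpca_delta(X, batch)
    perm = np.array([gpca_delta(X, rng.permutation(batch))
                     for _ in range(n_perm)])
    return d0, (1 + np.sum(perm >= d0)) / (n_perm + 1)

rng = np.random.default_rng(2)
batch = np.repeat([0, 1], 50)
X = rng.normal(size=(100, 200)) + 0.8 * batch[:, None]  # additive batch shift
X = X - X.mean(axis=0)
delta, p = gpca_test(X, batch)
print(f"delta = {delta:.3f}, permutation p = {p:.4f}")
```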
NASA Astrophysics Data System (ADS)
Huang, C. H.; Chen, Y. I.; Liu, J. Y. G.; Huang, Y. H.
2014-12-01
Statistical evidence of Seismo-Ionospheric Precursors (SIPs) is reported by investigating the relationship between the Total Electron Content (TEC) of the Global Ionosphere Map (GIM) and 56 M≥6.0 earthquakes in China during 1998-2013. A median-based method and a z test are employed to detect the overall earthquake signatures. It is found that a reduction of positive signatures and an enhancement of negative signatures appear simultaneously 3-5 days prior to the earthquakes in China. Finally, receiver operating characteristic (ROC) curves are used to measure the power of TEC for predicting M≥6.0 earthquakes in China.
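A minimal sketch of a median-based detector of the kind described, with the window length and interquartile bounds as assumptions rather than the paper's exact settings:

```python
# Hedged sketch of median-based TEC anomaly detection: compare each day's
# TEC with the median M and IQR-based bounds of the preceding 15 days;
# days outside M +/- 1.5*IQR are flagged as positive/negative signatures.
# Window and multiplier are illustrative assumptions.
import numpy as np

def tec_anomalies(tec, window=15, k=1.5):
    flags = np.zeros(len(tec))  # +1 positive, -1 negative anomaly
    for t in range(window, len(tec)):
        past = tec[t - window:t]
        m = np.median(past)
        iqr = np.percentile(past, 75) - np.percentile(past, 25)
        if tec[t] > m + k * iqr:
            flags[t] = 1
        elif tec[t] < m - k * iqr:
            flags[t] = -1
    return flags

rng = np.random.default_rng(3)
tec = 20 + rng.normal(0, 1, 120)          # synthetic daily TEC series
tec[100] -= 6                             # inject a negative anomaly
print(np.nonzero(tec_anomalies(tec))[0])  # flagged days
```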
NASA Astrophysics Data System (ADS)
Li, Y.; Kirchengast, G.; Scherllin-Pirscher, B.; Norman, R.; Yuan, Y. B.; Fritzer, J.; Schwaerz, M.; Zhang, K.
2015-08-01
We introduce a new dynamic statistical optimization algorithm to initialize ionosphere-corrected bending angles of Global Navigation Satellite System (GNSS)-based radio occultation (RO) measurements. The new algorithm estimates background and observation error covariance matrices with geographically varying uncertainty profiles and realistic global-mean correlation matrices. The error covariance matrices estimated by the new approach are more accurate and realistic than in simplified existing approaches and can therefore be used in statistical optimization to provide optimal bending angle profiles for high-altitude initialization of the subsequent Abel transform retrieval of refractivity. The new algorithm is evaluated against the existing Wegener Center Occultation Processing System version 5.6 (OPSv5.6) algorithm, using simulated data on two test days from January and July 2008 and real observed CHAllenging Minisatellite Payload (CHAMP) and Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) measurements from the complete months of January and July 2008. The following is achieved for the new method's performance compared to OPSv5.6: (1) significant reduction of random errors (standard deviations) of optimized bending angles down to about half of their size or more; (2) reduction of the systematic differences in optimized bending angles for simulated MetOp data; (3) improved retrieval of refractivity and temperature profiles; and (4) realistically estimated global-mean correlation matrices and realistic uncertainty fields for the background and observations. Overall the results indicate high suitability for employing the new dynamic approach in the processing of long-term RO data into a reference climate record, leading to well-characterized and high-quality atmospheric profiles over the entire stratosphere.
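The optimization step itself can be illustrated compactly. The sketch below shows the standard statistical-optimization combination of background and observed profiles through their error covariance matrices B and O; the paper's contribution is how B and O are estimated, whereas the profile and covariances here are toy assumptions.

```python
# Hedged sketch of the statistical optimization step (not the new covariance
# estimation algorithm): a_opt = a_bg + B (B + O)^-1 (a_obs - a_bg).
import numpy as np

def optimize_bending_angle(a_obs, a_bg, B, O):
    K = B @ np.linalg.inv(B + O)      # weight matrix
    return a_bg + K @ (a_obs - a_bg)

# Toy profile: observation noise dominates at high altitude, so the
# optimized profile relaxes toward the background at the top -- the desired
# behavior for high-altitude initialization.
z = np.linspace(40, 80, 41)                      # km
a_bg = np.exp(-z / 7.0)                          # background bending angle
a_obs = 1.02 * a_bg + 1e-4 * np.random.default_rng(4).normal(size=z.size)
B = np.diag((0.05 * a_bg) ** 2)                  # 5% background uncertainty
O = np.diag((1e-4 + 0.01 * a_bg) ** 2)           # noise floor + 1% of signal
a_opt = optimize_bending_angle(a_obs, a_bg, B, O)
print(f"departure kept at 40 km: {a_opt[0] - a_bg[0]:.2e}; "
      f"damped at 80 km: {a_opt[-1] - a_bg[-1]:.2e}")
```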
What Fraction of Global Fire Activity Can Be Forecast Using Sea Surface Temperatures?
NASA Astrophysics Data System (ADS)
Chen, Y.; Randerson, J. T.; Morton, D. C.; Andela, N.; Giglio, L.
2015-12-01
Variations in sea surface temperatures (SSTs) can influence climate dynamics in local and remote land areas, and thus influence fire-climate interactions that govern burned area. SST information has recently been used in statistical models to create seasonal outlooks of fire season severity in South America and as the initial condition for dynamical model predictions of fire activity in Indonesia. However, the degree to which large-scale ocean-atmosphere interactions can influence burned area in other continental regions has not been systematically explored. Here we quantified the amount of global burned area that can be predicted using SSTs in 14 different ocean regions as statistical predictors. We first examined lagged correlations between GFED4s burned area and the 14 ocean climate indices (OCIs) individually. The maximum correlations from different OCIs were used to construct a global map of fire predictability. About half of the global burned area can be forecast by this approach 3 months before the peak burning month (with a Pearson's r of 0.5 or higher), with the highest levels of predictability in Central America and Equatorial Asia. Several hotspots of predictability were identified using k-means cluster analysis. Within these regions, we tested the improvements of the forecast by using two OCIs from different oceans. Our forecast models were based on near-real-time SST data and may therefore support the development of new seasonal outlooks for fire activity that can aid the sustainable management of these fire-prone ecosystems.
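A minimal sketch of the lagged-correlation screening step, with synthetic stand-ins for the OCIs and the GFED4s burned-area series:

```python
# Hedged sketch: for each ocean climate index, find the lead time that
# maximizes |r| against the burned-area series. Indices, lags and data are
# synthetic stand-ins.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
months = 240
oci = rng.normal(size=(3, months))                            # toy indices
burned = np.roll(oci[0], 3) + 0.5 * rng.normal(size=months)   # 3-month lag

for i in range(3):
    best = max(
        ((lag, stats.pearsonr(oci[i, :months - lag], burned[lag:])[0])
         for lag in range(0, 13)),
        key=lambda t: abs(t[1]),
    )
    print(f"OCI {i}: best lead = {best[0]} months, r = {best[1]:+.2f}")
```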
α-induced reactions on 115In: Cross section measurements and statistical model analysis
NASA Astrophysics Data System (ADS)
Kiss, G. G.; Szücs, T.; Mohr, P.; Török, Zs.; Huszánk, R.; Gyürky, Gy.; Fülöp, Zs.
2018-05-01
Background: α-nucleus optical potentials are basic ingredients of statistical model calculations used in nucleosynthesis simulations. While the nucleon+nucleus optical potential is fairly well known, for the α+nucleus optical potential several different parameter sets exist and large deviations, reaching sometimes even an order of magnitude, are found between the cross section predictions calculated using different parameter sets. Purpose: A measurement of the radiative α-capture and the α-induced reaction cross sections on the nucleus 115In at low energies allows a stringent test of statistical model predictions. Since experimental data are scarce in this mass region, this measurement can be an important input to test the global applicability of α+nucleus optical model potentials and further ingredients of the statistical model. Methods: The reaction cross sections were measured by means of the activation method. The produced activities were determined by off-line detection of the γ rays and characteristic x rays emitted during the electron capture decay of the produced Sb isotopes. The 115In(α,γ)119Sb and 115In(α,n)118mSb reaction cross sections were measured between E_c.m. = 8.83 and 15.58 MeV, and the 115In(α,n)118gSb reaction was studied between E_c.m. = 11.10 and 15.58 MeV. The theoretical analysis was performed within the statistical model. Results: The simultaneous measurement of the (α,γ) and (α,n) cross sections allowed us to determine a best-fit combination of all parameters for the statistical model. The α+nucleus optical potential is identified as the most important input for the statistical model. The best fit is obtained for the new Atomki-V1 potential, and good reproduction of the experimental data is also achieved for the first version of the Demetriou potentials and the simple McFadden-Satchler potential. The nucleon optical potential, the γ-ray strength function, and the level density parametrization are also constrained by the data, although there is no unique best-fit combination. Conclusions: The best-fit calculations allow us to extrapolate the low-energy (α,γ) cross section of 115In to the astrophysical Gamow window with reasonable uncertainties. However, still further improvements of the α-nucleus potential are required for a global description of elastic (α,α) scattering and α-induced reactions in a wide range of masses and energies.
Adapt-Mix: learning local genetic correlation structure improves summary statistics-based analyses
Park, Danny S.; Brown, Brielin; Eng, Celeste; Huntsman, Scott; Hu, Donglei; Torgerson, Dara G.; Burchard, Esteban G.; Zaitlen, Noah
2015-01-01
Motivation: Approaches to identifying new risk loci, training risk prediction models, imputing untyped variants and fine-mapping causal variants from summary statistics of genome-wide association studies are playing an increasingly important role in the human genetics community. Current summary statistics-based methods rely on global ‘best guess’ reference panels to model the genetic correlation structure of the dataset being studied. This approach, especially in admixed populations, has the potential to produce misleading results, ignores variation in local structure and is not feasible when appropriate reference panels are missing or small. Here, we develop a method, Adapt-Mix, that combines information across all available reference panels to produce estimates of local genetic correlation structure for summary statistics-based methods in arbitrary populations. Results: We applied Adapt-Mix to estimate the genetic correlation structure of both admixed and non-admixed individuals using simulated and real data. We evaluated our method by measuring the performance of two summary statistics-based methods: imputation and joint-testing. When using our method as opposed to the current standard of ‘best guess’ reference panels, we observed a 28% decrease in mean-squared error for imputation and a 73.7% decrease in mean-squared error for joint-testing. Availability and implementation: Our method is publicly available in a software package called ADAPT-Mix available at https://github.com/dpark27/adapt_mix. Contact: noah.zaitlen@ucsf.edu PMID:26072481
Moshtagh-Khorasani, Majid; Akbarzadeh-T, Mohammad-R; Jahangiri, Nader; Khoobdel, Mehdi
2009-01-01
BACKGROUND: Aphasia diagnosis is particularly challenging due to the linguistic uncertainty and vagueness, inconsistencies in the definition of aphasic syndromes, large number of measurements with imprecision, natural diversity and subjectivity in test objects as well as in opinions of experts who diagnose the disease. METHODS: Fuzzy probability is proposed here as the basic framework for handling the uncertainties in medical diagnosis and particularly aphasia diagnosis. To efficiently construct this fuzzy probabilistic mapping, statistical analysis is performed that constructs input membership functions as well as determines an effective set of input features. RESULTS: Considering the high sensitivity of performance measures to different distribution of testing/training sets, a statistical t-test of significance is applied to compare fuzzy approach results with NN results as well as author's earlier work using fuzzy logic. The proposed fuzzy probability estimator approach clearly provides better diagnosis for both classes of data sets. Specifically, for the first and second type of fuzzy probability classifiers, i.e. spontaneous speech and comprehensive model, P-values are 2.24E-08 and 0.0059, respectively, strongly rejecting the null hypothesis. CONCLUSIONS: The technique is applied and compared on both comprehensive and spontaneous speech test data for diagnosis of four Aphasia types: Anomic, Broca, Global and Wernicke. Statistical analysis confirms that the proposed approach can significantly improve accuracy using fewer Aphasia features. PMID:21772867
Maurício, Sílvia Fernandes; da Silva, Jacqueline Braga; Bering, Tatiana; Correia, Maria Isabel Toulson Davisson
2013-04-01
This study assessed the association between nutritional status and inflammation in patients with colorectal cancer and verified their association with complications during anticancer treatment. The agreement between the Subjective Global Assessment (SGA) and different nutritional assessment methods was also evaluated. A cross-sectional, prospective, and descriptive study was performed. Nutritional status was defined by the SGA and the severity of inflammation was defined by the Glasgow Prognostic Score (GPS). Complications were classified using the Common Toxicity Criteria, version 3. Anthropometric measurements such as body mass index, triceps skinfold, midarm circumference, midarm muscle area, and adductor pollicis muscle thickness were also performed, as were handgrip strength and phase angle. The chi-square test, Fisher exact test, Spearman correlation coefficient, independent t test, analysis of variance, Gabriel test, and κ index were used for the statistical analysis. P < 0.05 was considered statistically significant. Seventy patients with colorectal cancer (60.4 ± 14.3 y old) were included. Nutritional status according to the SGA was associated with the GPS (P < 0.05), but the SGA and GPS were not related to the presence of complications. When comparing the different nutritional assessment methods with the SGA, there were statistically significant differences. Malnutrition is highly prevalent in patients with colorectal cancer, and nutritional status was associated with the GPS. Copyright © 2013 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Wilson, Robert M.
2014-01-01
This Technical Publication (TP) is part 2 of a two-part study of the North Atlantic basin tropical cyclones that occurred during the weather satellite era, 1960-2013. In particular, this TP examines the inferred statistical relationships between 25 tropical cyclone parameters and 9 specific climate-related factors, including the (1) Oceanic Niño Index (ONI), (2) Southern Oscillation Index (SOI), (3) Atlantic Multidecadal Oscillation (AMO) index, (4) Quasi-Biennial Oscillation (QBO) index, (5) North Atlantic Oscillation (NAO) index of the Climate Prediction Center (CPC), (6) NAO index of the Climate Research Unit (CRU), (7) Armagh surface air temperature (ASAT), (8) Global Land-Ocean Temperature Index (GLOTI), and (9) Mauna Loa carbon dioxide (CO2) (MLCO2) index. Part 1 of this two-part study examined the statistical aspects of the 25 tropical cyclone parameters (e.g., frequencies, peak wind speed (PWS), accumulated cyclone energy (ACE), etc.) and provided the results of statistical testing (i.e., runs-testing, the t-statistic for independent samples, and Poisson distributions). Also, the study gave predictions for the frequencies of the number of tropical cyclones (NTC), number of hurricanes (NH), number of major hurricanes (NMH), and number of United States land-falling hurricanes (NUSLFH) expected for the 2014 season, based on the statistics of the overall interval 1960-2013, the subinterval 1995-2013, and whether the year 2014 would be either an El Niño year (ENY) or a non-El Niño year (NENY).
[Hydrologic variability and sensitivity based on Hurst coefficient and Bartels statistic].
Lei, Xu; Xie, Ping; Wu, Zi Yi; Sang, Yan Fang; Zhao, Jiang Yan; Li, Bin Bin
2018-04-01
Due to global climate change and frequent human activities in recent years, the purely stochastic component of hydrological sequences is often mixed with one or several variation ingredients, including jump, trend, period and dependency. There is an urgent need to clarify which indices should be used to quantify the degree of this variability. In this study, we defined hydrological variability based on the Hurst coefficient and the Bartels statistic, and used Monte Carlo statistical tests to analyze their sensitivity to different variants. When the hydrological sequence had a jump or trend variation, both the Hurst coefficient and the Bartels statistic could reflect the variation, with the Hurst coefficient being more sensitive to weak jump or trend variation. When the sequence had a periodic component, only the Bartels statistic could detect the mutation of the sequence. When the sequence had a dependency, both the Hurst coefficient and the Bartels statistic could reflect the variation, with the latter able to detect weaker dependent variations. For all four variation types, both the Hurst variability and the Bartels variability increased with the variation range, so they can be used to measure the variation intensity of a hydrological sequence. We analyzed the temperature series of different weather stations in the Lancang River basin. Results showed that the temperature of all stations exhibited an upward trend or jump, indicating that the entire basin has experienced warming in recent years, with much higher temperature variability in the upper and lower reaches. This case study demonstrates the practicability of the proposed method.
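Both statistics are easy to compute. The sketch below estimates the Hurst coefficient by a simple rescaled-range (R/S) regression and Bartels' statistic as the rank version of the von Neumann ratio; the exact estimators and test thresholds used by the authors may differ.

```python
# Hedged sketch of the two statistics named above. H ~ 0.5 for a random
# series and H -> 1 for persistent ones; the rank von Neumann ratio (RVN)
# is near 2 under randomness and falls below 2 under positive dependence.
import numpy as np
from scipy import stats

def hurst_rs(x, min_chunk=8):
    x = np.asarray(x, float)
    n = len(x)
    sizes, rs = [], []
    size = min_chunk
    while size <= n // 2:
        vals = []
        for start in range(0, n - size + 1, size):
            seg = x[start:start + size]
            dev = np.cumsum(seg - seg.mean())
            s = seg.std()
            if s > 0:
                vals.append((dev.max() - dev.min()) / s)
        sizes.append(size)
        rs.append(np.mean(vals))
        size *= 2
    return np.polyfit(np.log(sizes), np.log(rs), 1)[0]

def bartels_rvn(x):
    r = stats.rankdata(x)
    return np.sum(np.diff(r) ** 2) / np.sum((r - r.mean()) ** 2)

rng = np.random.default_rng(6)
noise = rng.normal(size=1024)
trend = noise + np.linspace(0, 3, 1024)  # weak trend variation
print(f"H(noise) = {hurst_rs(noise):.2f}, H(trend) = {hurst_rs(trend):.2f}")
print(f"RVN(noise) = {bartels_rvn(noise):.2f}, RVN(trend) = {bartels_rvn(trend):.2f}")
```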
Kang, Xiao-guang; Ma, Qing-Bin
2005-01-01
The statistical relationship between urban eco-environment (UE) and urban competitiveness (UC) within the global urban system is investigated. Data showed a statistically inverted-U relationship between UE and UC. An eco-environmental factor is introduced into the classification of industries, yielding six industrial types from two indices, namely industries' eco-environmental demand and eco-environmental pressure. The statistical results showed a strong relationship, under this new industrial classification, between changes in industrial structure and the evolution of UE. The driving mechanism of the evolution of the urban eco-environment, involving human demand and the global division of labor, was analyzed. The conclusion is that the development strategies, industrial policies, and environmental policies of cities must fit their ranks within the global urban system. In the era of globalization, the rationality of environmental policies cannot be assessed by their strictness alone; such policies enhance a city's competitiveness when they fit the city's capability to attract and control particular sections of industrial value chains. Only these kinds of environmental policies are likely to enhance UC.
NASA Astrophysics Data System (ADS)
Ribera, M.; Gopal, S.
2014-12-01
Productivity hotspots are traditionally defined as concentrations of relatively high biomass compared to global reference values. These hotspots often signal atypical processes occurring in a location, and identifying them is a great first step at understanding the complexity inherent in the system. However, identifying local hotspots can be difficult when an overarching global pattern (i.e. spatial autocorrelation) already exists. This problem is particularly apparent in marine ecosystems because values of productivity in near-shore areas are consistently higher than those of the open ocean due to oceanographic processes such as upwelling. In such cases, if the global reference layer used to detect hotspots is too wide, hotspots may be only identified near the coast while missing known concentrations of organisms in offshore waters. On the other hand, if the global reference layer is too small, every single location may be considered a hotspot. We applied spatial and traditional statistics to remote sensing data to determine the optimal reference global spatial scale for identifying marine productivity hotspots in the Gulf of Maine. Our iterative process measured Getis and Ord's local G* statistic at different global scales until the variance of each hotspot was maximized. We tested this process with different full resolution MERIS chlorophyll layers (300m spatial resolution) for the whole Gulf of Maine. We concluded that the optimal global scale depends on the time of the year the remote sensing data was collected, particularly when coinciding with known seasonal phytoplankton blooms. The hotspots found through this process were also spatially heterogeneous in size, with bigger hotspots in areas offshore than in locations inshore. These results may be instructive for both managers and fisheries researchers as they adapt their fisheries management policies and methods to an ecosystem based approach (EBM).
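A hedged sketch of the scale search on a synthetic chlorophyll grid, assuming binary distance-band weights for the local Getis-Ord G* statistic:

```python
# Minimal sketch (assumptions: gridded chlorophyll, binary distance-band
# weights) of the scale search described above: compute local G* z-scores
# at several neighborhood radii and inspect how hotspot z-scores respond
# to the choice of the global reference scale.
import numpy as np

def local_gstar(field, radius):
    n = field.size
    xbar, s = field.mean(), field.std()
    rows, cols = field.shape
    yy, xx = np.mgrid[0:rows, 0:cols]
    z = np.zeros_like(field, dtype=float)
    for i in range(rows):
        for j in range(cols):
            w = ((yy - i) ** 2 + (xx - j) ** 2) <= radius ** 2
            sw = w.sum()
            num = field[w].sum() - xbar * sw
            den = s * np.sqrt((n * sw - sw ** 2) / (n - 1))
            z[i, j] = num / den
    return z

rng = np.random.default_rng(7)
chl = rng.normal(1.0, 0.2, (40, 40))
chl[8:12, 8:12] += 1.5                    # a small offshore "bloom"
for r in (2, 5, 10):
    print(f"radius {r:2d}: max G* z = {local_gstar(chl, r).max():.1f}")
```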
ComprehensiveBench: a Benchmark for the Extensive Evaluation of Global Scheduling Algorithms
NASA Astrophysics Data System (ADS)
Pilla, Laércio L.; Bozzetti, Tiago C.; Castro, Márcio; Navaux, Philippe O. A.; Méhaut, Jean-François
2015-10-01
Parallel applications that present tasks with imbalanced loads or complex communication behavior usually do not exploit the underlying resources of parallel platforms to their full potential. In order to mitigate this issue, global scheduling algorithms are employed. As finding the optimal task distribution is an NP-Hard problem, identifying the most suitable algorithm for a specific scenario and comparing algorithms are not trivial tasks. In this context, this paper presents ComprehensiveBench, a benchmark for global scheduling algorithms that enables the variation of a vast range of parameters that affect performance. ComprehensiveBench can be used to assist in the development and evaluation of new scheduling algorithms, to help choose a specific algorithm for an arbitrary application, to emulate other applications, and to enable statistical tests. We illustrate its use in this paper with an evaluation of Charm++ periodic load balancers that stresses their characteristics.
Recognition of coarse-grained protein tertiary structure.
Lezon, Timothy; Banavar, Jayanth R; Maritan, Amos
2004-05-15
A model of the protein backbone is considered in which each residue is characterized by the location of its Cα atom and one of a discrete set of conformal (φ, ψ) states. We investigate the key differences between a description that offers a locally precise fit to known backbone structures and one that provides a globally accurate fit to protein structures. Using a statistical scoring scheme and threading, a protein's local best-fit conformation is highly recognizable, but its global structure cannot be directly determined from an amino acid sequence. The incorporation of information about the conformal states of neighboring residues along the chain allows one to accurately translate the local structure into a global structure. We present a two-step algorithm, which recognizes up to 95% of the tested protein native-state structures to within a 2.5 Å root mean square deviation. Copyright 2004 Wiley-Liss, Inc.
Consistency of extreme flood estimation approaches
NASA Astrophysics Data System (ADS)
Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf
2017-04-01
Estimations of low-probability flood events are frequently used for the planning of infrastructure as well as for dimensioning flood protection measures. There are several well-established methodical procedures to estimate low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, since the "true value" of an extreme flood is not observable. Nevertheless, a detailed comparison performed on a given case study brings useful information about the statistical and hydrological processes involved in the different methods. In this study, the following three approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested on two different Swiss catchments. The results and some intermediate variables are used for assessing potential strengths and weaknesses of each method, as well as for evaluating the consistency of these methods.
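The first of the three approaches can be illustrated in a few lines. The sketch below fits a Generalized Extreme Value (GEV) distribution to a synthetic annual-maxima series and reads off low-probability quantiles; it sketches ordinary extreme value statistics only, not SCHADEX or the PMF approach.

```python
# Minimal sketch of ordinary extreme value statistics: fit a GEV to annual
# maximum discharges and estimate return levels. Data are synthetic
# stand-ins for an observed annual-maxima series.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
annual_max = stats.genextreme.rvs(c=-0.1, loc=300, scale=80,
                                  size=60, random_state=rng)  # 60-yr record

shape, loc, scale = stats.genextreme.fit(annual_max)
for T in (100, 1000, 10000):  # return periods in years
    q = stats.genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
    print(f"{T:>6}-yr flood estimate: {q:7.1f} m^3/s")
```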
Erus, Guray; Zacharaki, Evangelia I; Davatzikos, Christos
2014-04-01
This paper presents a method for capturing statistical variation of normal imaging phenotypes, with emphasis on brain structure. The method aims to estimate the statistical variation of a normative set of images from healthy individuals, and identify abnormalities as deviations from normality. A direct estimation of the statistical variation of the entire volumetric image is challenged by the high-dimensionality of images relative to smaller sample sizes. To overcome this limitation, we iteratively sample a large number of lower dimensional subspaces that capture image characteristics ranging from fine and localized to coarser and more global. Within each subspace, a "target-specific" feature selection strategy is applied to further reduce the dimensionality, by considering only imaging characteristics present in a test subject's images. Marginal probability density functions of selected features are estimated through PCA models, in conjunction with an "estimability" criterion that limits the dimensionality of estimated probability densities according to available sample size and underlying anatomy variation. A test sample is iteratively projected to the subspaces of these marginals as determined by PCA models, and its trajectory delineates potential abnormalities. The method is applied to segmentation of various brain lesion types, and to simulated data on which superiority of the iterative method over straight PCA is demonstrated. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Wootten, A.; Dixon, K. W.; Lanzante, J. R.; Mcpherson, R. A.
2017-12-01
Empirical statistical downscaling (ESD) approaches attempt to refine global climate model (GCM) information via statistical relationships between observations and GCM simulations. The aim of such downscaling efforts is to create added-value climate projections by adding finer spatial detail and reducing biases. The results of statistical downscaling exercises are often used in impact assessments under the assumption that past performance provides an indicator of future results. Given prior research describing the danger of this assumption with regard to temperature, this study expands the perfect model experimental design from previous case studies to test the stationarity assumption with respect to precipitation. Assuming stationarity implies that the performance of ESD methods is similar for future projections and the historical training period. Case study results from four quantile-mapping based ESD methods demonstrate violations of the stationarity assumption for both the central tendency and extremes of precipitation. These violations vary geographically and seasonally. For the four ESD methods tested, the greatest challenges for downscaling daily total precipitation projections occur in regions with limited precipitation and for extremes of precipitation along Southeast coastal regions. We conclude with a discussion of future expansion of the perfect model experimental design and the implications for improving ESD methods and providing guidance on the use of ESD techniques for impact assessments and decision support.
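For orientation, here is a minimal sketch of the core of a quantile-mapping ESD method of the family evaluated above; real methods add wet-day handling, extrapolation rules, and seasonal stratification.

```python
# Hedged sketch of quantile mapping: map each GCM value through the
# empirical CDFs of model and observations from the training period, then
# apply that transfer function to the projection period.
import numpy as np

def quantile_map(gcm_train, obs_train, gcm_future):
    # Probability of each future value under the model's training CDF...
    p = np.searchsorted(np.sort(gcm_train), gcm_future) / len(gcm_train)
    p = np.clip(p, 0.001, 0.999)
    # ...mapped onto the observed training distribution.
    return np.quantile(obs_train, p)

rng = np.random.default_rng(9)
obs_train = rng.gamma(2.0, 4.0, 5000)        # "observed" daily precip
gcm_train = rng.gamma(2.0, 3.0, 5000)        # biased-dry model climate
gcm_future = rng.gamma(2.2, 3.0, 5000)       # model projection
downscaled = quantile_map(gcm_train, obs_train, gcm_future)
print(obs_train.mean(), gcm_future.mean(), downscaled.mean())
```

Note that the transfer function is estimated once from the training period and then frozen; that is precisely the stationarity assumption the perfect model experiment is designed to test.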
Sanchez, Ana M; Denny, Thomas N; O'Gorman, Maurice
2014-07-01
This Special Issue of the Journal of Immunological Methods includes 16 manuscripts describing quality assurance activities related to virologic and immunologic monitoring of six global laboratory resource programs that support international HIV/AIDS clinical trial studies: Collaboration for AIDS Vaccine Discovery (CAVD); Center for HIV/AIDS Vaccine Immunology (CHAVI); External Quality Assurance Program Oversight Laboratory (EQAPOL); HIV Vaccine Trial Network (HVTN); International AIDS Vaccine Initiative (IAVI); and Immunology Quality Assessment (IQA). The reports from these programs address the many components required to develop comprehensive quality control activities and subsequent quality assurance programs for immune monitoring in global clinical trials including: all aspects of processing, storing, and quality assessment of PBMC preparations used ubiquitously in HIV clinical trials, the development and optimization of assays for CD8 HIV responses and HIV neutralization, a comprehensive global HIV virus repository, and reports on the development and execution of novel external proficiency testing programs for immunophenotyping, intracellular cytokine staining, ELISPOT and luminex based cytokine measurements. In addition, there are articles describing the implementation of Good Clinical Laboratory Practices (GCLP) in a large quality assurance laboratory, the development of statistical methods specific for external proficiency testing assessment, a discussion on the ability to set objective thresholds for measuring rare events by flow cytometry, and finally, a manuscript which addresses a framework for the structured reporting of T cell immune function based assays. It is anticipated that this series of manuscripts covering a wide range of quality assurance activities associated with the conduct of global clinical trials will provide a resource for individuals and programs involved in improving the harmonization, standardization, accuracy, and sensitivity of virologic and immunologic testing. Copyright © 2014 Elsevier B.V. All rights reserved.
Evaluating attention in delirium: A comparison of bedside tests of attention.
Adamis, Dimitrios; Meagher, David; Murray, Orla; O'Neill, Donagh; O'Mahony, Edmond; Mulligan, Owen; McCarthy, Geraldine
2016-09-01
Impaired attention is a core diagnostic feature for delirium. The present study examined, for patients with delirium versus those with dementia and/or no neurocognitive disorder, the discriminating properties of four objective tests of attention: digit span, the vigilance "A" test, serial 7s subtraction and months of the year backwards, together with a global clinical subjective rating of attention. This was a prospective study of older patients admitted consecutively to a general hospital. Participants were assessed using the Confusion Assessment Method, the Delirium Rating Scale-98 Revised and Montreal Cognitive Assessment scales, and months of the year backwards. Pre-existing dementia was diagnosed according to the Diagnostic and Statistical Manual of Mental Disorders fourth edition criteria. The sample consisted of 200 participants (mean age 81.1 ± 6.5 years; 50% women; pre-existing cognitive impairment in 126 [63%]). A total of 34 (17%) were identified with delirium (Confusion Assessment Method +). The five approaches to assessing attention had statistically significant correlations (P < 0.05). Discriminant analysis showed that the clinical subjective rating of attention in conjunction with months of the year backwards had the best discriminatory ability to identify Confusion Assessment Method-defined delirium, and to discriminate patients with delirium from those with dementia and/or normal cognition. Both of these approaches had high sensitivity, but modest specificity. Objective tests are useful for prediction of non-delirium, but lack specificity for a delirium diagnosis. Global attentional deficits were more indicative of delirium than deficits in specific domains of attention. Geriatr Gerontol Int 2016; 16: 1028-1035. © 2015 The Authors. Geriatrics & Gerontology International published by Wiley Publishing Asia Pty Ltd on behalf of Japanese Geriatrics Society.
A novel modification of the Turing test for artificial intelligence and robotics in healthcare.
Ashrafian, Hutan; Darzi, Ara; Athanasiou, Thanos
2015-03-01
The increasing demands of delivering higher quality global healthcare has resulted in a corresponding expansion in the development of computer-based and robotic healthcare tools that rely on artificially intelligent technologies. The Turing test was designed to assess artificial intelligence (AI) in computer technology. It remains an important qualitative tool for testing the next generation of medical diagnostics and medical robotics. Development of quantifiable diagnostic accuracy meta-analytical evaluative techniques for the Turing test paradigm. Modification of the Turing test to offer quantifiable diagnostic precision and statistical effect-size robustness in the assessment of AI for computer-based and robotic healthcare technologies. Modification of the Turing test to offer robust diagnostic scores for AI can contribute to enhancing and refining the next generation of digital diagnostic technologies and healthcare robotics. Copyright © 2014 John Wiley & Sons, Ltd.
Koldaş Doğan, Şebnem; Ay, Saime; Evcik, Deniz
2017-01-01
The purpose of this study was to compare the effectiveness of two different laser therapy regimens on pain, lumbar range of motion (ROM) and functional capacity in patients with chronic low back pain (CLBP). Forty-nine patients with CLBP were randomly assigned to two groups. Group 1 (n = 20) received hot pack + laser therapy 1 (850 nm Gallium-Aluminum-Arsenide (Ga-Al-As) laser); group 2 (n = 29) received hot pack + laser therapy 2 (650 nm Helium-Neon (He-Ne) and 785 and 980 nm Ga-Al-As combined plaque laser) for 15 sessions. Pain severity and the patient's and physician's global assessments were evaluated with a visual analogue scale (VAS). The modified Schober test and right and left lateral flexion measurements were performed. The Modified Oswestry Disability Questionnaire (MODQ) was used for the evaluation of functional disability. Measurements were taken before and after treatment. After treatment there were statistically significant improvements in pain severity, patient's and physician's global assessment, ROM and MODQ scores in both groups (P < 0.05). After treatment there were statistically significant between-group differences in lateral flexion measurements and MODQ scores (P < 0.05), in favor of the patients who received combined plaque laser therapy (group 2), but not in pain severity, the modified Schober test, or the patient's and physician's global assessments (P > 0.05). Laser therapy applied with combined He-Ne and Ga-Al-As provides greater improvements in lateral flexion measurements and patient disability; however, neither laser device was superior to the other with respect to pain severity.
A simulation of orientation dependent, global changes in camera sensitivity in ECT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bieszk, J.A.; Hawman, E.G.; Malmin, R.E.
1984-01-01
ECT promises the abilities to: 1) observe radioisotope distributions in a patient without the contrast-reducing summation of overlying activity, and 2) measure these distributions quantitatively to assess organ function further and more accurately. Ideally, camera-based ECT systems should have a performance that is independent of camera orientation or gantry angle. This study is concerned with ECT quantitation errors that can arise from angle-dependent variations of camera sensitivity. Using simulated phantoms representative of heart and liver sections, the effects of sensitivity changes on reconstructed images were assessed both visually and quantitatively based on ROI sums. The sinogram for each test image was simulated with 128 linear digitizations and 180 angular views. The global orientation-dependent sensitivity was modelled by applying an angular sensitivity dependence to the sinograms of the test images. Four sensitivity variations were studied: amplitudes of 0% (as a reference), 5%, 10%, and 25% with a cos θ dependence, as well as a cos 2θ dependence with a 5% amplitude. Simulations were done with and without Poisson noise to: 1) determine trends in the quantitative effects as a function of the magnitude of the variation, and 2) see how these effects are manifested in studies having statistics comparable to clinical cases. For the most realistic sensitivity variation (cos θ, 5% amplitude), the ROIs chosen in the present work indicated changes of <0.5% in the noiseless case and <5% for the case with Poisson noise. The effects of statistics appear to dominate any effects due to global, sinusoidal, orientation-dependent sensitivity changes in the cases studied.
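The simulation design can be reproduced schematically. The sketch below, assuming a recent scikit-image and an illustrative phantom, modulates a sinogram with a cos θ orientation-dependent sensitivity and compares ROI sums after filtered back-projection:

```python
# Hedged sketch of the simulation described above: apply an angular
# sensitivity modulation to a phantom's sinogram and compare ROI sums
# after filtered back-projection. Amplitudes follow the study; the phantom
# is an illustrative stand-in. Requires scikit-image >= 0.19 (filter_name).
import numpy as np
from skimage.transform import radon, iradon

phantom = np.zeros((128, 128))
phantom[40:70, 50:90] = 1.0                      # uniform "organ" ROI
angles = np.linspace(0.0, 180.0, 180, endpoint=False)
sino = radon(phantom, theta=angles)

for amp in (0.0, 0.05, 0.10, 0.25):
    sens = 1.0 + amp * np.cos(np.deg2rad(angles))   # sensitivity vs angle
    recon = iradon(sino * sens[None, :], theta=angles, filter_name="ramp")
    roi = recon[45:65, 55:85].sum()
    print(f"amplitude {amp:.2f}: ROI sum = {roi:.1f}")
```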
Support vector regression methodology for estimating global solar radiation in Algeria
NASA Astrophysics Data System (ADS)
Guermoui, Mawloud; Rabehi, Abdelaziz; Gairaa, Kacem; Benkaciali, Said
2018-01-01
Accurate estimation of Daily Global Solar Radiation (DGSR) has been a major goal for solar energy applications. In this paper we show the possibility of developing a simple model based on Support Vector Regression (SVM-R) that could be used to estimate DGSR on a horizontal surface in Algeria from the sunshine ratio as the only input. The SVM-R model was developed and tested using a data set recorded over three years (2005-2007), collected at the Applied Research Unit for Renewable Energies (URAER) in Ghardaïa. The data collected in 2005-2006 were used to train the model, while the 2007 data were used to test its performance. The measured and estimated values of DGSR were compared statistically during the testing phase using the Root Mean Square Error (RMSE), the relative Root Mean Square Error (rRMSE), and the correlation coefficient (r²), which amount to 1.59 MJ/m², 8.46% and 97.4%, respectively. The obtained results show that the SVM-R model is well suited for DGSR estimation using only the sunshine ratio.
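A minimal sketch of such a single-input model, with synthetic data in place of the URAER measurements and illustrative (untuned) hyperparameters:

```python
# Hedged sketch of an SVM-R model with sunshine ratio as the only input;
# kernel and hyperparameters are illustrative assumptions, and the data
# are synthetic stand-ins for the Ghardaia measurements.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(10)
sunshine_ratio = rng.uniform(0.2, 1.0, 600)                 # S/S0
dgsr = 5 + 25 * sunshine_ratio + rng.normal(0, 1.5, 600)    # MJ/m^2, toy link

X_train, X_test = sunshine_ratio[:450, None], sunshine_ratio[450:, None]
y_train, y_test = dgsr[:450], dgsr[450:]

model = SVR(kernel="rbf", C=10.0, epsilon=0.5).fit(X_train, y_train)
pred = model.predict(X_test)
rmse = mean_squared_error(y_test, pred) ** 0.5
print(f"RMSE = {rmse:.2f} MJ/m^2, r2 = {r2_score(y_test, pred):.3f}")
```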
Contreras-Torres, Ernesto
2018-06-02
In this study, I introduce novel global and local 0D-protein descriptors based on a statistical quantity named the Total Sum of Squares (TSS). This quantity represents the sum of the squared differences of amino acid properties from the arithmetic mean property. As an extension, the amino acid-type and amino acid-group formalisms are used to describe zones of interest in proteins. To assess the effectiveness of the proposed descriptors, a Nearest Neighbor model for predicting the four major protein structural classes was built. This model has a success rate of 98.53% in the jackknife cross-validation test, a performance superior to other reported methods despite the simplicity of the predictor. Additionally, this predictor has an average success rate of 98.35% across the different cross-validation tests performed. A value of 0.98 for the Kappa statistic clearly discriminates this model from a random predictor. The results obtained by the Nearest Neighbor model demonstrate the ability of the proposed descriptors not only to reflect relevant biochemical information related to the structural classes of proteins but also to allow appropriate interpretability. It can thus be expected that the current method may play a supplementary role to other existing approaches for predicting protein structural classes and other protein attributes. Copyright © 2018 Elsevier Ltd. All rights reserved.
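The descriptor itself is simple to compute. A minimal sketch under stated assumptions (Kyte-Doolittle hydropathy as the residue property; the sequence and the hydrophobic grouping are illustrative):

```python
# Hedged sketch of a TSS-type 0D descriptor: sum of squared deviations of
# a per-residue property (Kyte-Doolittle hydropathy) from the sequence
# mean, globally and restricted to an amino acid group (local variant).
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
      'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
      'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
      'Y': -1.3, 'V': 4.2}

def tss(seq, group=None):
    vals = [KD[a] for a in seq]
    mean = sum(vals) / len(vals)
    kept = [v for a, v in zip(seq, vals) if group is None or a in group]
    return sum((v - mean) ** 2 for v in kept)

seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"  # illustrative sequence
print(f"global TSS = {tss(seq):.1f}")
print(f"TSS over hydrophobic group = {tss(seq, set('AVILMFWC')):.1f}")
```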
Gene set analysis using variance component tests.
Huang, Yen-Tsung; Lin, Xihong
2013-06-28
Gene set analyses have become increasingly important in genomic research, as many complex diseases are contributed jointly by alterations of numerous genes. Genes often coordinate together as a functional repertoire, e.g., a biological pathway/network and are highly correlated. However, most of the existing gene set analysis methods do not fully account for the correlation among the genes. Here we propose to tackle this important feature of a gene set to improve statistical power in gene set analyses. We propose to model the effects of an independent variable, e.g., exposure/biological status (yes/no), on multiple gene expression values in a gene set using a multivariate linear regression model, where the correlation among the genes is explicitly modeled using a working covariance matrix. We develop TEGS (Test for the Effect of a Gene Set), a variance component test for the gene set effects by assuming a common distribution for regression coefficients in multivariate linear regression models, and calculate the p-values using permutation and a scaled chi-square approximation. We show using simulations that type I error is protected under different choices of working covariance matrices and power is improved as the working covariance approaches the true covariance. The global test is a special case of TEGS when correlation among genes in a gene set is ignored. Using both simulation data and a published diabetes dataset, we show that our test outperforms the commonly used approaches, the global test and gene set enrichment analysis (GSEA). We develop a gene set analyses method (TEGS) under the multivariate regression framework, which directly models the interdependence of the expression values in a gene set using a working covariance. TEGS outperforms two widely used methods, GSEA and global test in both simulation and a diabetes microarray data.
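A hedged sketch in the spirit of TEGS, not the authors' exact formulation: a quadratic score statistic for the joint effect of a binary exposure on all genes in a set, with gene-gene correlation entering through a shrunken working covariance and significance assessed by permuting the exposure.

```python
# Hedged TEGS-style sketch: quadratic score statistic with a working
# covariance; permutation p-value. Shrinkage weight and toy data are
# illustrative assumptions.
import numpy as np

def gene_set_stat(expr, exposure, shrink=0.5):
    G = expr - expr.mean(axis=0)          # n x p centered expression
    x = exposure - exposure.mean()        # centered exposure
    S = np.cov(G, rowvar=False)           # sample gene-gene covariance
    W = shrink * np.diag(np.diag(S)) + (1 - shrink) * S  # working covariance
    u = G.T @ x                           # per-gene score contributions
    return u @ np.linalg.solve(W, u)

def tegs_perm_test(expr, exposure, n_perm=1000, seed=0):
    rng = np.random.default_rng(seed)
    q0 = gene_set_stat(expr, exposure)
    perm = np.array([gene_set_stat(expr, rng.permutation(exposure))
                     for _ in range(n_perm)])
    return q0, (1 + np.sum(perm >= q0)) / (n_perm + 1)

rng = np.random.default_rng(11)
n, p = 60, 25
exposure = np.repeat([0.0, 1.0], n // 2)
expr = rng.normal(size=(n, p))
expr[exposure == 1, :5] += 0.7            # exposure shifts 5 genes in the set
q, pval = tegs_perm_test(expr, exposure)
print(f"Q = {q:.1f}, permutation p = {pval:.4f}")
```

Setting the off-diagonal weight to zero recovers a global-test-like statistic that ignores gene-gene correlation, which mirrors the abstract's point that the global test is a special case.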
Global Tuberculosis Report 2016
GENOPT 2016: Design of a generalization-based challenge in global optimization
NASA Astrophysics Data System (ADS)
Battiti, Roberto; Sergeyev, Yaroslav; Brunato, Mauro; Kvasov, Dmitri
2016-10-01
While comparing results on benchmark functions is a widely used practice to demonstrate the competitiveness of global optimization algorithms, fixed benchmarks can lead to a negative data-mining process. To avoid this effect, the GENOPT contest benchmarks can be used; they are based on randomized function generators, designed for scientific experiments, with fixed statistical characteristics but individual variation of the generated instances. The generators are available to participants for off-line tests and online tuning schemes, but the final competition is based on random seeds communicated in the last phase through a cooperative process. A brief presentation and discussion of the methods and results obtained in the framework of the GENOPT contest are given in this contribution.
NASA Astrophysics Data System (ADS)
Ivanov, Martin; Warrach-Sagi, Kirsten; Wulfmeyer, Volker
2018-04-01
A new approach for rigorous spatial analysis of the downscaling performance of regional climate model (RCM) simulations is introduced. It is based on a multiple comparison of the local tests at the grid cells and is also known as "field" or "global" significance. New performance measures for estimating the added value of downscaled data relative to the large-scale forcing fields are developed. The methodology is exemplarily applied to a standard EURO-CORDEX hindcast simulation with the Weather Research and Forecasting (WRF) model coupled with the land surface model NOAH at 0.11° grid resolution. Monthly temperature climatology for the 1990-2009 period is analysed for Germany for winter and summer in comparison with high-resolution gridded observations from the German Weather Service. The field significance test controls the proportion of falsely rejected local tests in a meaningful way and is robust to spatial dependence. Hence, the spatial patterns of the statistically significant local tests are also meaningful. We interpret them from a process-oriented perspective. In winter and in most regions in summer, the downscaled distributions are statistically indistinguishable from the observed ones. A systematic cold summer bias occurs in deep river valleys due to overestimated elevations, in coastal areas due probably to enhanced sea breeze circulation, and over large lakes due to the interpolation of water temperatures. Urban areas in concave topography forms have a warm summer bias due to the strong heat islands, not reflected in the observations. WRF-NOAH generates appropriate fine-scale features in the monthly temperature field over regions of complex topography, but over spatially homogeneous areas even small biases can lead to significant deteriorations relative to the driving reanalysis. As the added value of global climate model (GCM)-driven simulations cannot be smaller than this perfect-boundary estimate, this work demonstrates in a rigorous manner the clear additional value of dynamical downscaling over global climate simulations. The evaluation methodology has a broad spectrum of applicability as it is distribution-free, robust to spatial dependence, and accounts for time series structure.
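The multiple-comparison step can be illustrated with a false-discovery-rate screen of the grid of local p-values, one common route to field significance (a sketch of the general logic, not the authors' exact procedure):

```python
# Hedged sketch of field significance via the Benjamini-Hochberg FDR
# procedure, which controls the proportion of falsely rejected local tests
# and is robust to spatial dependence.
import numpy as np

def fdr_field_significance(pvals, q=0.05):
    p = np.sort(np.ravel(pvals))
    m = p.size
    below = p <= q * np.arange(1, m + 1) / m
    if not below.any():
        return 0.0, np.zeros_like(pvals, dtype=bool)   # no field significance
    p_star = p[below].max()                            # largest admissible p
    return p_star, pvals <= p_star

rng = np.random.default_rng(12)
pvals = rng.uniform(size=(50, 60))             # null grid cells...
pvals[:5, :5] = rng.uniform(0, 1e-3, (5, 5))   # ...plus a truly significant patch
p_star, reject = fdr_field_significance(pvals)
print(f"local threshold p* = {p_star:.5f}, rejected cells = {reject.sum()}")
```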
[Randomized double-blind comparative study of minaprine (200 mg/day) versus placebo on memory loss].
Allain, H; Belliard, S; Lieury, A; Menard, G; Patat, A; Le Coz, F; Gandon, J M
1996-01-01
Thirty-five subjects (age 45-69 years) with subjective memory loss, without any other neuropsychiatric or somatic disease, were recruited into a phase II study. This double-blind, randomized, placebo-controlled study compared the effects of minaprine (200 mg/day) with placebo, in two parallel groups, over 2 months, on memory, attention and vigilance. The main psychometric assessment criteria were a standardized battery of memory tests (SM 5), the dual-coding test, the analysis of choice reaction times (CRT) and the critical flicker fusion point (CFF). A positive effect of minaprine was detected on delayed recall of words (p = 0.028) and immediate recognition of words (p = 0.049). The global clinical tests (CGI, MacNair scale) were not statistically modified. The tolerability of minaprine and placebo was comparable. A positive pharmacodynamic activity on mnemonic performance is thus demonstrated for minaprine (200 mg/day) in this specific population characterized by a memory complaint. These results support a phase III study in which the main criteria would be global scales, in order to confirm the clinical reliability of the present results.
Aiello, Francesco A; Gross, Erica R; Krajewski, Aleksandra; Fuller, Robert; Morgan, Anthony; Duffy, Andrew; Longo, Walter; Kozol, Robert; Chandawarkar, Rajiv
2010-09-01
Postoperative visits to the emergency department (ED) instead of the surgeon's office incur enormous costs, and many postoperative ED visits can be avoided. The setting was a fully accredited, single-institution, 617-bed hospital affiliated with the University of Connecticut School of Medicine. We performed a retrospective analysis of 597 consecutive appendectomy patients over a 4-year period. Demographic and medical data at initial presentation, surgery, and ED visit were recorded as categorical variables and statistically analyzed (Pearson χ² test, Fisher exact test, and linear-by-linear association). Costs were calculated from the hospital's billing department. Forty-six patients returned to the ED within the global period with pain (n = 22, 48%), wound-related issues (n = 6, 13%), weakness (n = 4, 9%), fever (13%), and nausea and vomiting (n = 3, 6%). Thirteen patients (28%) required readmission. Predictive factors for a postoperative ED visit were perforated appendicitis (2-fold increase over uncomplicated appendicitis) and comorbidities (cardiovascular disease or diabetes). The cost of investigations during ED visits was $55,000 plus physician services. ED visits during the postoperative global period are avoidable by identifying patients who may need additional care, improving patient education, optimizing pain control, and improving patients' access to the office. 2010 Elsevier Inc. All rights reserved.
Revisiting the climate impacts of cool roofs around the globe using an Earth system model
NASA Astrophysics Data System (ADS)
Zhang, Jiachen; Zhang, Kai; Liu, Junfeng; Ban-Weiss, George
2016-08-01
Solar reflective ‘cool roofs’ absorb less sunlight than traditional dark roofs, reducing solar heat gain, and decreasing the amount of heat transferred to the atmosphere. Widespread adoption of cool roofs could therefore reduce temperatures in urban areas, partially mitigating the urban heat island effect, and contributing to reversing the local impacts of global climate change. The impacts of cool roofs on global climate remain debated by past research and are uncertain. Using a sophisticated Earth system model, the impacts of cool roofs on climate are investigated at urban, continental, and global scales. We find that global adoption of cool roofs in urban areas reduces urban heat islands everywhere, with an annual- and global-mean decrease from 1.6 to 1.2 K. Decreases are statistically significant, except for some areas in Africa and Mexico where urban fraction is low, and some high-latitude areas during wintertime. Analysis of the surface and TOA energy budget in urban regions at continental-scale shows cool roofs causing increases in solar radiation leaving the Earth-atmosphere system in most regions around the globe, though the presence of aerosols and clouds are found to partially offset increases in upward radiation. Aerosols dampen cool roof-induced increases in upward solar radiation, ranging from 4% in the United States to 18% in more polluted China. Adoption of cool roofs also causes statistically significant reductions in surface air temperatures in urbanized regions of China (-0.11 ± 0.10 K) and the United States (-0.14 ± 0.12 K); India and Europe show statistically insignificant changes. Though past research has disagreed on whether widespread adoption of cool roofs would cool or warm global climate, these studies have lacked analysis on the statistical significance of global temperature changes. The research presented here indicates that adoption of cool roofs around the globe would lead to statistically insignificant reductions in global mean air temperature (-0.0021 ± 0.026 K). Thus, we suggest that while cool roofs are an effective tool for reducing building energy use in hot climates, urban heat islands, and regional air temperatures, their influence on global climate is likely negligible.
Global Statistical Learning in a Visual Search Task
ERIC Educational Resources Information Center
Jones, John L.; Kaschak, Michael P.
2012-01-01
Locating a target in a visual search task is facilitated when the target location is repeated on successive trials. Global statistical properties also influence visual search, but have often been confounded with local regularities (i.e., target location repetition). In two experiments, target locations were not repeated for four successive trials,…
ERIC Educational Resources Information Center
Komatsu, Hikaru; Rappleye, Jeremy
2017-01-01
Several recent, highly influential comparative studies have made strong statistical claims that improvements on global learning assessments such as PISA will lead to higher GDP growth rates. These claims have provided the primary source of legitimation for policy reforms championed by leading international organisations, most notably the World…
van Tilburg, C W J; Stronks, D L; Groeneweg, J G; Huygen, F J P M
2017-03-01
To investigate the effect of percutaneous radiofrequency treatment, compared with a sham procedure, applied to the ramus communicans for the treatment of lumbar disc pain. Randomized sham-controlled, double-blind, crossover, multicenter clinical trial. Multidisciplinary pain centres of two general hospitals. Sixty patients aged 18 or older with a medical history and physical examination suggestive of lumbar disc pain and a reduction of two or more on a numerical rating scale (NRS, 0-10) after a diagnostic ramus communicans test block. Treatment group: percutaneous radiofrequency treatment applied to the ramus communicans; sham group: the same procedure without radiofrequency treatment. Primary outcome measure: pain reduction. Secondary outcome measure: Global Perceived Effect. No statistically significant difference in pain level over time was found either between or within the groups; however, the factor period yielded a statistically significant result. In the crossover group, 11 out of 16 patients experienced a reduction in NRS of 2 or more at 1 month (no significant deviation from chance). No statistically significant difference in satisfaction over time between the groups was found. The independent factors group and period also showed no statistically significant effects. The same applies to recovery: no statistically significant effects were found. The null hypothesis of no difference in pain reduction and in Global Perceived Effect between the treatment and sham groups cannot be rejected. Post hoc analysis revealed that none of the investigated parameters contributed to the prediction of a significant pain reduction. Interrupting signalling through the ramus communicans may interfere with the transmission of painful information from the discs to the central nervous system. Methodological differences exist among studies evaluating the efficacy of radiofrequency treatment for lumbar disc pain. A randomized, sham-controlled, double-blind, multicenter clinical trial on the effect of radiofrequency at the ramus communicans for lumbar disc pain was conducted; the null hypothesis of no difference in pain reduction and in Global Perceived Effect between the treatment and sham groups cannot be rejected. © 2016 The Authors. European Journal of Pain published by John Wiley & Sons Ltd on behalf of European Pain Federation - EFIC®.
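As a side note, the "no significant deviation from chance" claim for 11 responders out of 16 can be checked with a simple binomial test; a minimal sketch (SciPy ≥ 1.7):

```python
# Two-sided binomial test of 11 successes in 16 trials against p = 0.5.
from scipy import stats

result = stats.binomtest(11, n=16, p=0.5, alternative='two-sided')
print(f"p = {result.pvalue:.3f}")  # ~0.21, consistent with no significant deviation from chance
```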
Super-delta: a new differential gene expression analysis procedure with robust data normalization.
Liu, Yuhang; Zhang, Jinfeng; Qiu, Xing
2017-12-21
Normalization is an important data preparation step in gene expression analyses, designed to remove various kinds of systematic noise. Sample variance is greatly reduced after normalization, so the power of subsequent statistical analyses is likely to increase. On the other hand, variance reduction is made possible by borrowing information across all genes, including differentially expressed genes (DEGs) and outliers, which inevitably introduces some bias. This bias typically inflates the type I error and can reduce statistical power in certain situations. In this study we propose a new differential expression analysis pipeline, dubbed super-delta, that consists of a multivariate extension of global normalization and a modified t-test. A robust procedure is designed to minimize the bias introduced by DEGs in the normalization step. The modified t-test is derived from asymptotic theory for hypothesis testing and pairs naturally with the proposed robust normalization. We first compared super-delta with four commonly used normalization methods in simulation studies: global, median-IQR, quantile, and cyclic loess normalization. Super-delta was shown to have better statistical power, with tighter control of the type I error rate, than its competitors. In many cases, its performance is close to that of an oracle test using datasets without technical noise. We then applied all methods to a collection of gene expression datasets from breast cancer patients who received neoadjuvant chemotherapy. While there is substantial overlap among the DEGs identified by all of them, super-delta was able to identify comparatively more DEGs than its competitors. Downstream gene set enrichment analysis confirmed that all these methods selected largely consistent pathways. Detailed investigation of the relatively small differences showed that the pathways identified by super-delta have better-established connections to breast cancer than those of the other methods. As a new pipeline, super-delta provides new insights into differential gene expression analysis. A solid theoretical foundation supports its asymptotic unbiasedness and technical noise-free properties. Applications to real and simulated datasets demonstrate its strong performance compared with state-of-the-art procedures. It can also potentially be extended to other data types and/or more general between-group comparison problems.
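A conceptual sketch of the idea behind pairing a robust global normalization with per-gene t-tests follows; it uses per-sample medians as robust offsets and illustrates the principle only, not the authors' super-delta implementation.

```python
# Illustration: robust (median-based) global normalization followed by
# per-gene two-sample t-tests. Data are simulated; this is not super-delta.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
expr = rng.normal(8.0, 1.0, size=(1000, 12))   # log-scale expression, genes x samples
expr += rng.normal(0.0, 0.3, size=12)          # per-sample technical shifts to remove

offsets = np.median(expr, axis=0)              # medians resist DEGs/outliers better than means
normalized = expr - offsets

groups = np.array([0] * 6 + [1] * 6)
t, p = stats.ttest_ind(normalized[:, groups == 0], normalized[:, groups == 1], axis=1)
print(f"{(p < 0.05).sum()} genes nominally significant at p < 0.05")
```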
Knudsen, Anders Dahl; Bennike, Tue; Kjeldal, Henrik; Birkelund, Svend; Otzen, Daniel Erik; Stensballe, Allan
2014-05-30
We describe Condenser, a freely available, comprehensive open-source tool for merging multidimensional quantitative proteomics data from the Matrix Science Mascot Distiller Quantitation Toolbox into a common format ready for subsequent bioinformatic analysis. A number of different relative quantitation technologies are supported, such as metabolic ¹⁵N and amino acid stable-isotope incorporation, label-free and chemical-label quantitation. The program features multiple options for curative filtering of the quantified peptides, allowing the user to choose data quality thresholds appropriate for the current dataset and to ensure the quality of the calculated relative protein abundances. Condenser also features optional global normalization, peptide outlier removal, multiple-testing correction, and calculation of t-test statistics for highlighting and evaluating proteins with significantly altered relative protein abundances. Condenser provides an attractive addition to the gold-standard quantitative workflow of Mascot Distiller, allowing easy handling of larger multi-dimensional experiments. Source code, binaries, a test data set and documentation are available at http://condenser.googlecode.com/. Copyright © 2014 Elsevier B.V. All rights reserved.
Alonso, Conchita; Pérez, Ricardo; Bazaga, Pilar; Herrera, Carlos M.
2015-01-01
DNA cytosine methylation is a widespread epigenetic mechanism in eukaryotes, and plant genomes commonly are densely methylated. Genomic methylation can be associated with functional consequences such as mutational events, genomic instability or altered gene expression, but little is known on interspecific variation in global cytosine methylation in plants. In this paper, we compare global cytosine methylation estimates obtained by HPLC and use a phylogenetically-informed analytical approach to test for significance of evolutionary signatures of this trait across 54 angiosperm species in 25 families. We evaluate whether interspecific variation in global cytosine methylation is statistically related to phylogenetic distance and also whether it is evolutionarily correlated with genome size (C-value). Global cytosine methylation varied widely between species, ranging between 5.3% (Arabidopsis) and 39.2% (Narcissus). Differences between species were related to their evolutionary trajectories, as denoted by the strong phylogenetic signal underlying interspecific variation. Global cytosine methylation and genome size were evolutionarily correlated, as revealed by the significant relationship between the corresponding phylogenetically independent contrasts. On average, a ten-fold increase in genome size entailed an increase of about 10% in global cytosine methylation. Results show that global cytosine methylation is an evolving trait in angiosperms whose evolutionary trajectory is significantly linked to changes in genome size, and suggest that the evolutionary implications of epigenetic mechanisms are likely to vary between plant lineages. PMID:25688257
Statistics in biomedical laboratory and clinical science: applications, issues and pitfalls.
Ludbrook, John
2008-01-01
This review is directed at biomedical scientists who want to gain a better understanding of statistics: what tests to use, when, and why. In my view, it is very important to seek the advice of a qualified biostatistician even during the planning stage of a study. When designing and analyzing a study, it is important to construct and test global hypotheses, rather than to make multiple tests on the data. If the latter cannot be avoided, it is essential to control the risk of making false-positive inferences by applying multiple comparison procedures. For comparing two means or two proportions, it is best to use exact permutation tests rather than the better-known classical ones. For comparing many means, analysis of variance, often of a complex type, is the most powerful approach. The correlation coefficient should never be used to compare the performances of two methods of measurement, or two measures, because it does not detect bias. Instead, the Altman-Bland method of differences or least-products linear regression analysis should be preferred. Finally, the educational value to investigators of interaction with a biostatistician before, during and after a study cannot be overemphasized. (c) 2007 S. Karger AG, Basel.
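A minimal sketch of the recommended permutation approach for comparing two means (a Monte Carlo approximation to the exact test; the data are invented):

```python
# Monte Carlo permutation test for a difference in means between two samples.
import numpy as np

rng = np.random.default_rng(2)
a = np.array([4.1, 5.3, 6.0, 5.5, 4.8])
b = np.array([6.2, 7.1, 6.8, 7.5, 6.4])

observed = a.mean() - b.mean()
pooled = np.concatenate([a, b])
n_perm = 100_000
hits = 0
for _ in range(n_perm):
    perm = rng.permutation(pooled)                      # relabel observations at random
    diff = perm[:a.size].mean() - perm[a.size:].mean()
    if abs(diff) >= abs(observed):
        hits += 1
print(f"two-sided permutation p = {hits / n_perm:.4f}")
```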
Memory and Trend of Precipitation in China during 1966-2013
NASA Astrophysics Data System (ADS)
Du, M.; Sun, F.; Liu, W.
2017-12-01
Because climate change has significantly affected the water cycle, the characteristics and variability of precipitation under climate change have become a major topic in hydrology. This study aims to analyze the trend and memory (both short-term and long-term) of precipitation in China. To do so, we apply statistical tests (including the Mann-Kendall test, the Ljung-Box test, and the Hurst exponent) to annual precipitation (P), the frequency of rainy days (λ), and the mean daily rainfall on days when precipitation occurs (α) in China over 1966-2013. We also use a resampling approach to determine field significance. From there, we evaluate the spatial distribution and the percentages of stations with significant memory or trends. We find that the percentages of significant downtrends for λ and significant uptrends for α are significantly larger than the critical values at the 95% field significance level, probably caused by global warming. From these results, we conclude that extra care is necessary when significant results are obtained using statistical tests: the null hypothesis can be rejected by chance, and this is more likely to occur when spatial correlation is ignored, as shown by the results of the resampling approach.
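A sketch of the Mann-Kendall trend test applied above, in its basic no-ties form (the series is synthetic, not the Chinese station data):

```python
# Mann-Kendall trend test: S counts concordant minus discordant pairs;
# the normal approximation (with continuity correction) gives z and p.
import numpy as np
from scipy import stats

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = x.size
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    var_s = n * (n - 1) * (2 * n + 5) / 18.0       # variance ignoring ties
    z = 0.0 if s == 0 else (s - np.sign(s)) / np.sqrt(var_s)
    p = 2.0 * (1.0 - stats.norm.cdf(abs(z)))
    return s, z, p

rng = np.random.default_rng(3)
annual_p = 600 + 0.8 * np.arange(48) + rng.normal(0, 40, 48)  # synthetic 48-year series
print(mann_kendall(annual_p))
```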
Magliocca, Nicholas R.; Brown, Daniel G.; Ellis, Erle C.
2014-01-01
Local changes in land use result from the decisions and actions of land-users within land systems, which are structured by local and global environmental, economic, political, and cultural contexts. Such cross-scale causation presents a major challenge for developing a general understanding of how local decision-making shapes land-use changes at the global scale. This paper implements a generalized agent-based model (ABM) as a virtual laboratory to explore how global and local processes influence the land-use and livelihood decisions of local land-users, operationalized as settlement-level agents, across the landscapes of six real-world test sites. Test sites were chosen in USA, Laos, and China to capture globally-significant variation in population density, market influence, and environmental conditions, with land systems ranging from swidden to commercial agriculture. Publicly available global data were integrated into the ABM to model cross-scale effects of economic globalization on local land-use decisions. A suite of statistics was developed to assess the accuracy of model-predicted land-use outcomes relative to observed and random (i.e. null model) landscapes. At four of six sites, where environmental and demographic forces were important constraints on land-use choices, modeled land-use outcomes were more similar to those observed across sites than the null model. At the two sites in which market forces significantly influenced land-use and livelihood decisions, the model was a poorer predictor of land-use outcomes than the null model. Model successes and failures in simulating real-world land-use patterns enabled the testing of hypotheses on land-use decision-making and yielded insights on the importance of missing mechanisms. The virtual laboratory approach provides a practical framework for systematic improvement of both theory and predictive skill in land change science based on a continual process of experimentation and model enhancement. PMID:24489696
A global reference model of Curie-point depths based on EMAG2
NASA Astrophysics Data System (ADS)
Li, Chun-Feng; Lu, Yu; Wang, Jian
2017-03-01
In this paper, we use a robust inversion algorithm, which we have tested in many regional studies, to obtain the first global model of Curie-point depth (GCDM) from magnetic anomaly inversion based on fractal magnetization. Statistically, the oceanic Curie depth mean is smaller than the continental one, but continental Curie depths are almost bimodal, showing shallow Curie points in some old cratons. Oceanic Curie depths show modifications by hydrothermal circulation in young oceanic lithosphere and thermal perturbations in old oceanic lithosphere. Oceanic Curie depths also show strong dependence on the spreading rate along active spreading centers. Curie depths and heat flow are correlated, following optimal theoretical curves with average thermal conductivities K ≈ 2.0 W (m °C)⁻¹ for the ocean and K ≈ 2.5 W (m °C)⁻¹ for the continent. The calculated heat flow from Curie depths and large-interval gridding of measured heat flow both indicate that the global heat flow average is about 70.0 mW/m², leading to a global heat loss ranging from ~34.6 to 36.6 TW.
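The link between Curie depth and heat flow used above can be sketched with a one-dimensional steady-state conduction model; the Curie temperature value below is an assumption for illustration:

```python
# One-layer conductive model: q = K * (T_curie - T_surface) / Z_curie.
# T_curie ~ 580 C (magnetite) is assumed here for illustration.
def heat_flow_mw_m2(curie_depth_km, k_w_per_m_c, t_curie_c=580.0, t_surface_c=0.0):
    q_w_m2 = k_w_per_m_c * (t_curie_c - t_surface_c) / (curie_depth_km * 1000.0)
    return q_w_m2 * 1000.0

# An oceanic-type example: K = 2.0 W/(m C), Curie depth 17 km -> ~68 mW/m^2,
# close to the ~70 mW/m^2 global average quoted above.
print(heat_flow_mw_m2(curie_depth_km=17.0, k_w_per_m_c=2.0))
```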
Statistical models of global Langmuir mixing
NASA Astrophysics Data System (ADS)
Li, Qing; Fox-Kemper, Baylor; Breivik, Øyvind; Webb, Adrean
2017-05-01
The effects of Langmuir mixing on surface ocean mixing may be parameterized by applying an enhancement factor, which depends on the wave, wind, and ocean state, to the turbulent velocity scale in the K-Profile Parameterization. Diagnosing the appropriate enhancement factor online in global climate simulations is readily achieved by coupling with a prognostic wave model, but at significant computational and code-development expense. In this paper, two alternatives that do not require a prognostic wave model, (i) a monthly mean enhancement factor climatology, and (ii) an approximation to the enhancement factor based on empirical wave spectra, are explored and tested in a global climate model. Both appear to reproduce the Langmuir mixing effects as estimated using a prognostic wave model, with nearly identical and substantial improvements in the simulated mixed layer depth and intermediate water ventilation over control simulations, but at significantly less computational cost. Simpler approaches, such as ignoring Langmuir mixing altogether or setting a globally constant Langmuir number, are found to be deficient. Thus, the consequences of Stokes depth and misaligned wind and waves are important.
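A hedged sketch of the kind of enhancement factor being parameterized: the turbulent Langmuir number La_t = sqrt(u*/u_s) is standard, but the polynomial coefficients below follow one published fit and should be read as assumptions, not this paper's exact formula.

```python
# Langmuir enhancement of the KPP turbulent velocity scale.
import numpy as np

def langmuir_number(u_star, u_stokes):
    """Turbulent Langmuir number La_t = sqrt(u*/u_s)."""
    return np.sqrt(u_star / u_stokes)

def enhancement_factor(la_t):
    """Velocity-scale multiplier; coefficients from one published fit (assumed)."""
    return np.sqrt(1.0 + (1.5 * la_t) ** -2.0 + (5.4 * la_t) ** -4.0)

la = langmuir_number(u_star=0.01, u_stokes=0.1)   # wave-dominated case, La_t ~ 0.32
print(f"La_t = {la:.2f}, enhancement = {enhancement_factor(la):.2f}")
```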
Evaluation of variability in high-resolution protein structures by global distance scoring.
Anzai, Risa; Asami, Yoshiki; Inoue, Waka; Ueno, Hina; Yamada, Koya; Okada, Tetsuji
2018-01-01
Systematic analysis of the statistical and dynamical properties of proteins is critical to understanding cellular events. Extraction of biologically relevant information from a set of high-resolution structures is important because it can provide mechanistic details behind the functional properties of protein families, enabling rational comparison between families. Most current structural comparisons are pairwise, which hampers global analysis of the growing content of the Protein Data Bank. Additionally, pairing of protein structures introduces uncertainty with respect to reproducibility because it frequently depends on additional superposition settings. This study introduces intramolecular distance scoring for the global analysis of proteins for each of which at least several high-resolution structures are available. As a pilot study, we tested 300 human proteins and showed that the method gives a comprehensive atomic-level overview of structural variation within each protein and protein family. This method, together with the interpretation of the model calculations, provides new criteria for understanding specific structural variation in a protein, enabling global comparison of the variability of proteins from different species.
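A minimal sketch of superposition-free variability scoring in the spirit described above: compare intramolecular Cα distance matrices across structures of the same protein. The metric shown is illustrative, not the authors' exact score.

```python
# Element-wise spread of intramolecular distance matrices across structures.
import numpy as np

def distance_matrix(coords):
    """Pairwise distances within one structure (N x 3 C-alpha coordinates)."""
    diff = coords[:, None, :] - coords[None, :, :]
    return np.linalg.norm(diff, axis=-1)

rng = np.random.default_rng(4)
base = rng.normal(size=(120, 3)) * 10.0                        # stand-in C-alpha trace
structures = [base + rng.normal(0, 0.2, base.shape) for _ in range(5)]

mats = np.stack([distance_matrix(c) for c in structures])
variability = mats.std(axis=0).mean()                          # mean per-pair standard deviation
print(f"global distance variability: {variability:.3f}")
```

Because only intramolecular distances are compared, no superposition settings enter the score, which addresses the reproducibility concern raised above.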
A SIGNIFICANCE TEST FOR THE LASSO
Lockhart, Richard; Taylor, Jonathan; Tibshirani, Ryan J.; Tibshirani, Robert
2014-01-01
In the sparse linear regression setting, we consider testing the significance of the predictor variable that enters the current lasso model, in the sequence of models visited along the lasso solution path. We propose a simple test statistic based on lasso fitted values, called the covariance test statistic, and show that when the true model is linear, this statistic has an Exp(1) asymptotic distribution under the null hypothesis (the null being that all truly active variables are contained in the current lasso model). Our proof of this result for the special case of the first predictor to enter the model (i.e., testing for a single significant predictor variable against the global null) requires only weak assumptions on the predictor matrix X. On the other hand, our proof for a general step in the lasso path places further technical assumptions on X and the generative model, but still allows for the important high-dimensional case p > n, and does not necessarily require that the current lasso model achieves perfect recovery of the truly active variables. Of course, for testing the significance of an additional variable between two nested linear models, one typically uses the chi-squared test, comparing the drop in residual sum of squares (RSS) to a χ2(1) distribution. But when this additional variable is not fixed, and has been chosen adaptively or greedily, this test is no longer appropriate: adaptivity makes the drop in RSS stochastically much larger than χ2(1) under the null hypothesis. Our analysis explicitly accounts for adaptivity, as it must, since the lasso builds an adaptive sequence of linear models as the tuning parameter λ decreases. In this analysis, shrinkage plays a key role: though additional variables are chosen adaptively, the coefficients of lasso active variables are shrunken due to the ℓ1 penalty. Therefore, the test statistic (which is based on lasso fitted values) is in a sense balanced by these two opposing properties, adaptivity and shrinkage, and its null distribution is tractable and asymptotically Exp(1). PMID:25574062
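In LaTeX, the covariance test statistic described above can be written as follows (notation reconstructed from the abstract's description; σ² is treated as known):

```latex
% A = active set just before the knot \lambda_{k+1}; \hat\beta(\lambda) is the
% lasso solution on all variables and \tilde\beta_A(\lambda) the lasso solution
% restricted to A. Under the null, T_k is asymptotically Exp(1).
T_k \;=\; \frac{\bigl\langle y,\; X\hat\beta(\lambda_{k+1})\bigr\rangle
       \;-\; \bigl\langle y,\; X_A\tilde\beta_A(\lambda_{k+1})\bigr\rangle}{\sigma^2}
```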
Global ensemble texture representations are critical to rapid scene perception.
Brady, Timothy F; Shafer-Skelton, Anna; Alvarez, George A
2017-06-01
Traditionally, recognizing the objects within a scene has been treated as a prerequisite to recognizing the scene itself. However, research now suggests that the ability to rapidly recognize visual scenes could be supported by global properties of the scene itself rather than the objects within the scene. Here, we argue for a particular instantiation of this view: that scenes are recognized by treating them as a global texture and processing the pattern of orientations and spatial frequencies across different areas of the scene without recognizing any objects. To test this model, we asked whether there is a link between how proficient individuals are at rapid scene perception and how proficiently they represent simple spatial patterns of orientation information (global ensemble texture). We find a significant and selective correlation between these tasks, suggesting a link between scene perception and spatial ensemble tasks but not nonspatial summary statistics. In a second and third experiment, we additionally show that global ensemble texture information is not only associated with scene recognition, but that preserving only global ensemble texture information from scenes is sufficient to support rapid scene perception; however, preserving the same information is not sufficient for object recognition. Thus, global ensemble texture alone is sufficient to allow activation of scene representations but not object representations. Together, these results provide evidence for a view of scene recognition based on global ensemble texture rather than a view based purely on objects or on nonspatially localized global properties. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Comparative shotgun proteomics using spectral count data and quasi-likelihood modeling.
Li, Ming; Gray, William; Zhang, Haixia; Chung, Christine H; Billheimer, Dean; Yarbrough, Wendell G; Liebler, Daniel C; Shyr, Yu; Slebos, Robbert J C
2010-08-06
Shotgun proteomics provides the most powerful analytical platform for global inventory of complex proteomes using liquid chromatography-tandem mass spectrometry (LC-MS/MS) and allows a global analysis of protein changes. Nevertheless, sampling of complex proteomes by current shotgun proteomics platforms is incomplete, and this contributes to variability in assessment of peptide and protein inventories by spectral counting approaches. Thus, shotgun proteomics data pose challenges in comparing proteomes from different biological states. We developed an analysis strategy using quasi-likelihood Generalized Linear Modeling (GLM), included in a graphical interface software package (QuasiTel) that reads standard output from protein assemblies created by IDPicker, an HTML-based user interface to query shotgun proteomic data sets. This approach was compared to four other statistical analysis strategies: Student t test, Wilcoxon rank test, Fisher's Exact test, and Poisson-based GLM. We analyzed the performance of these tests to identify differences in protein levels based on spectral counts in a shotgun data set in which equimolar amounts of 48 human proteins were spiked at different levels into whole yeast lysates. Both GLM approaches and the Fisher Exact test performed adequately, each with their unique limitations. We subsequently compared the proteomes of normal tonsil epithelium and head and neck squamous cell carcinoma (HNSCC) using this approach and identified 86 proteins with differential spectral counts between the two tissues. We selected 18 proteins from this comparison for verification of protein levels between the individual normal and tumor tissues using liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM-MS). This analysis confirmed the magnitude and direction of the protein expression differences in all 6 proteins for which reliable data could be obtained. Our analysis demonstrates that shotgun proteomic data sets from different tissue phenotypes are sufficiently rich in quantitative information and that statistically significant differences in protein spectral counts reflect the underlying biology of the samples.
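A generic sketch of quasi-Poisson modeling of spectral counts in statsmodels, in the spirit of QuasiTel (this is not QuasiTel's code; the counts and totals are invented):

```python
# Quasi-Poisson GLM for one protein's spectral counts across 8 samples,
# with group (normal vs tumor) as covariate and total spectra as exposure.
import numpy as np
import statsmodels.api as sm

counts = np.array([12, 15, 9, 14, 30, 28, 35, 26])
group = np.array([0, 0, 0, 0, 1, 1, 1, 1], dtype=float)
total_spectra = np.full(8, 50_000.0)

X = sm.add_constant(group)
model = sm.GLM(counts, X, family=sm.families.Poisson(), exposure=total_spectra)
fit = model.fit(scale='X2')   # Pearson-chi2 scale -> quasi-Poisson standard errors
print(fit.params, fit.bse)
```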
Price, Charlotte; Stallard, Nigel; Creton, Stuart; Indans, Ian; Guest, Robert; Griffiths, David; Edwards, Philippa
2010-01-01
Acute inhalation toxicity of chemicals has conventionally been assessed by the median lethal concentration (LC50) test (Organisation for Economic Co-operation and Development (OECD) TG 403). Two new methods, the recently adopted acute toxic class method (ATC; OECD TG 436) and a proposed fixed concentration procedure (FCP), have recently been considered, but statistical evaluations of these methods did not investigate the influence of differential sensitivity between male and female rats on the outcomes. This paper presents an analysis of data from the assessment of acute inhalation toxicity for 56 substances. Statistically significant differences between the LC50 for males and females were found for 16 substances, with greater than 10-fold differences in the LC50 for two substances. The paper also reports a statistical evaluation of the three test methods in the presence of unanticipated gender differences. With TG 403, a gender difference leads to a slightly greater chance of under-classification. This is also the case for the ATC method, but more pronounced than for TG 403, with misclassification of nearly all substances from Globally Harmonised System (GHS) class 3 into class 4. As the FCP uses females only, if females are more sensitive, the classification is unchanged; if males are more sensitive, the procedure may lead to under-classification. Additional research on modification of the FCP is thus proposed. PMID:20488841
Wehner, Michael F.; Bala, G.; Duffy, Phillip; ...
2010-01-01
We present a set of high-resolution global atmospheric general circulation model (AGCM) simulations focusing on the model's ability to represent tropical storms and their statistics. We find that the model produces storms of hurricane strength with realistic dynamical features. We also find that tropical storm statistics are reasonable, both globally and in the north Atlantic, when compared to recent observations. The sensitivity of simulated tropical storm statistics to increases in sea surface temperature (SST) is also investigated, revealing that a credible late 21st century SST increase produced increases in simulated tropical storm numbers and intensities in all ocean basins. While this paper supports previous high-resolution model and theoretical findings that the frequency of very intense storms will increase in a warmer climate, it differs notably from previous medium- and high-resolution model studies that show a global reduction in total tropical storm frequency. However, we are quick to point out that this particular model finding remains speculative due to a lack of radiative forcing changes in our time-slice experiments as well as a focus on the Northern hemisphere tropical storm seasons.
The GEOS Ozone Data Assimilation System: Specification of Error Statistics
NASA Technical Reports Server (NTRS)
Stajner, Ivanka; Riishojgaard, Lars Peter; Rood, Richard B.
2000-01-01
A global three-dimensional ozone data assimilation system has been developed at the Data Assimilation Office of the NASA/Goddard Space Flight Center. The Total Ozone Mapping Spectrometer (TOMS) total ozone and the Solar Backscatter Ultraviolet (SBUV or SBUV/2) partial ozone profile observations are assimilated. The assimilation, into an off-line ozone transport model, is done using the global Physical-space Statistical Analysis Scheme (PSAS). This system became operational in December 1999. A detailed description of the statistical analysis scheme, and in particular of the forecast and observation error covariance models, is given. A new global anisotropic horizontal forecast error correlation model accounts for the varying distribution of observations with latitude; correlations are largest in the zonal direction in the tropics, where data are sparse. The forecast error variance model is proportional to the ozone field. The forecast error covariance parameters were determined by maximum likelihood estimation. The error covariance models are validated using χ2 statistics. The analyzed ozone fields for winter 1992 are validated against independent observations from ozone sondes and the Halogen Occultation Experiment (HALOE). There is better than 10% agreement between mean HALOE and analysis fields between 70 and 0.2 hPa. The global root-mean-square (RMS) difference between TOMS observed and forecast values is less than 4%. The global RMS difference between SBUV observed and analyzed ozone between 50 and 3 hPa is less than 15%.
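The χ2 validation mentioned above can be sketched as an innovation consistency check: if the assumed covariances are right, the normalized innovation statistic averages to the number of observations (all matrices below are assumptions for illustration):

```python
# Innovation chi-squared check: E[d^T S^{-1} d] = m when S = H B H^T + R is correct.
import numpy as np

rng = np.random.default_rng(5)
m = 40
HBHt = 0.5 * np.eye(m)      # forecast error covariance in observation space (assumed)
R = 0.3 * np.eye(m)         # observation error covariance (assumed)
S = HBHt + R

chi2_vals = []
for _ in range(500):
    d = rng.multivariate_normal(np.zeros(m), S)    # simulated innovations
    chi2_vals.append(d @ np.linalg.solve(S, d))
print(np.mean(chi2_vals))   # should be close to m = 40 if the covariances are consistent
```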
Hydromechanical study of a fracture in shear under constant normal stress
NASA Astrophysics Data System (ADS)
Lamontagne, Eric
This research study deals with the effects of shear direction and injection flow rate on the directional anisotropy of flow under a given normal stress. It presents experimental work on the hydromechanical shear behaviour of a fracture under constant normal stress, permitting characterisation of the intrinsic hydraulic transmissivity in relation to the directional anisotropy of the roughness morphology of the fracture surfaces. Tests were performed on mortar replicas of a natural fracture so that the fracture roughness and void-space geometry were identical for each test. The experimental program consisted of direct shear tests on the fracture replicas in four shear directions under four constant normal stress levels. The application of the normal stress was followed by several injections of fluid at constant flow rate; then, at each defined shear displacement, several injections were performed at different, but individually constant, flow rates. The test results show that: (1) across all shear tests, the global intrinsic transmissivity falls within an envelope of about one order of magnitude; within this envelope, the transmissivity curves increase by about two orders of magnitude over the first millimetre of shear displacement and subsequently stabilise rapidly; (2) the highest dilatancy does not necessarily correspond to the highest intrinsic transmissivity, so the global intrinsic transmissivity is not directly proportional to fracture dilatancy during shear; (3) after the peak shear stress, the divergence between the global intrinsic transmissivity curves at various flow rates becomes more marked; (4) after peak shear strength and the onset of asperity degradation, the gradual transition to residual friction behaviour causes directional flow anisotropy and a reorientation of flow channelling to a direction sub-perpendicular to the shear direction; (5) the anisotropy does not develop equally in the two senses of the direction perpendicular to shear. To characterise the dynamics of the flow pattern in the fracture, a statistical analysis of the fracture surface morphology and a casting of the void-space geometry were performed before and after shear. A statistical analysis of asperity heights at the global scale of the fracture surfaces characterised the fracture morphology and revealed a large morphological structure on which smaller asperities of variable dimensions are superposed. This large structure generates a higher-level landing occupying more than half of the fracture area. The study of the fracture surface morphology, performed with directional geostatistical variograms of mean asperity heights before shearing, shows the presence of two interleaved families of morphological structures (28 and 15 mm). The same study performed after shearing shows that asperity degradation appears to be associated with the reduction of the global intrinsic transmissivity of the fracture. Finally, the void-space morphology evaluated by casting during the shear tests made it possible to follow the evolution of contacts with increasing shear displacement and to visualise flow channelling during fracture shearing.
Does ketorolac have a preemptive analgesic effect? A randomized, double-blind, control study.
Gutta, Rajesh; Koehn, Christopher R; James, Laura E
2013-12-01
To examine the effect of ketorolac used as preemptive analgesia on the intensity of pain and analgesic requirements in the postoperative period. The present study was a randomized, double-blind, controlled study involving human subjects who underwent extraction of the mandibular third molars under intravenous anesthesia. The study group received 30 mg of intravenous ketorolac preoperatively, and the control group received a placebo. The pain intensity was measured using a visual analog scale. The decrease in postoperative pain was measured as the primary outcome variable. The interval to the first dose of analgesic, total analgesic requirements, and the global assessment were measured as secondary outcomes. The data were analyzed using the Student t test, Wilcoxon rank sum test, and χ2 test. A total of 85 adult subjects, American Society of Anesthesiologists class I and II, participated in the present study. Randomization was effective, as shown by the absence of differences in the study variables between the 2 groups. Of the 85 patients, 29 were men and 56 were women. The average patient age was 22.6 years in the study group and 24 years in the control group. Those in the ketorolac group recorded lower visual analog scale pain scores at all intervals. However, the difference was statistically significant only at the 4-hour interval (P = .01). The median interval to the use of rescue medication in the ketorolac group was 9.5 hours compared with 7 hours in the control group. However, no statistically significant difference was found in the interval to the rescue analgesic between the 2 groups (P = .39). No statistically significant difference was noted in the total amount of postoperative analgesics required in the first 72 hours between the 2 groups (P = .54). Also, no difference was seen in the global assessment between the 2 groups (P = .22). Those who received 30 mg of intravenous ketorolac preoperatively had less pain in the early (8-hour) postoperative period. The median interval to rescue medication was 2 hours longer in the ketorolac group. However, the difference in the total narcotic consumption was clinically and statistically insignificant between the ketorolac and control groups. Copyright © 2013 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
Micromechanics Fatigue Damage Analysis Modeling for Fabric Reinforced Ceramic Matrix Composites
NASA Technical Reports Server (NTRS)
Min, J. B.; Xue, D.; Shi, Y.
2013-01-01
A micromechanics analysis modeling method was developed to analyze the damage progression and fatigue failure of fabric-reinforced composite structures, especially brittle ceramic matrix composites. A repeating unit cell concept of fabric-reinforced composites was used to represent the global composite structure, with the thermal and mechanical properties of the repeating unit cell taken to be the same as those of the global composite structure. Three-phase micromechanics, shear-lag, and continuum fracture mechanics models were integrated with a statistical model in the repeating unit cell to predict the progressive damage and fatigue life of the composite structures. Global structural failure was defined as the loss of loading capability of the repeating unit cell, which depends on the stiffness reduction due to material slice failures and nonlinear material properties in the repeating unit cell. The methodology is demonstrated by comparing analysis results against experimental tests performed on carbon fiber-reinforced silicon carbide matrix plain-weave composite specimens.
Estimating daily global solar radiation by day of the year in Algeria
NASA Astrophysics Data System (ADS)
Aoun, Nouar; Bouchouicha, Kada
2017-05-01
This study presents six empirical models based on the day-of-the-year number for estimating global solar radiation on a horizontal surface. For this case study, 21 years of experimental data sets for 21 cities over the whole Algerian territory are utilized to develop these models for each city and for all of Algeria. In this study, the territory of Algeria was divided into four different climatic zones, i.e., Arid, Semi-arid, Highlands and Mediterranean. The accuracy of the all-Algeria model was tested for each city and for each climate zone. To evaluate the accuracy of the models, the RMSE, rRMSE, MABE, MAPE, and R, which are the most commonly applied statistical parameters, were utilized. The results show that the six developed models provide excellent predictions for global solar radiation for each city and for all-Algeria. Furthermore, the model showing the greatest accuracy is the sine and cosine wave trigonometric model.
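A sketch of the simplest model family described above, fitted by linear least squares to synthetic data (the trigonometric form is generic; the coefficients and data are not Algeria's):

```python
# Day-of-year model H(n) = a + b*sin(2*pi*n/365) + c*cos(2*pi*n/365), scored by RMSE.
import numpy as np

n = np.arange(1, 366)
rng = np.random.default_rng(6)
h_obs = 18 + 7 * np.cos(2 * np.pi * (n - 172) / 365) + rng.normal(0, 1.5, n.size)

A = np.column_stack([np.ones(n.size),
                     np.sin(2 * np.pi * n / 365),
                     np.cos(2 * np.pi * n / 365)])
coef, *_ = np.linalg.lstsq(A, h_obs, rcond=None)
rmse = np.sqrt(np.mean((h_obs - A @ coef) ** 2))
print(coef, f"RMSE = {rmse:.2f}")
```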
Thinking Globally, Acting Locally: Using the Local Environment to Explore Global Issues.
ERIC Educational Resources Information Center
Simmons, Deborah
1994-01-01
Asserts that water pollution is a global problem and presents statistics indicating how much of the world's water is threatened. Presents three elementary school classroom activities on water quality and local water resources. Includes a figure describing the work of the Global Rivers Environmental Education Network. (CFR)
Optimizing human activity patterns using global sensitivity analysis.
Fairchild, Geoffrey; Hickmann, Kyle S; Mniszewski, Susan M; Del Valle, Sara Y; Hyman, James M
2014-12-01
Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule's regularity for a population. We show how to tune an activity's regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. We use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations.
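A compact sketch of the SampEn statistic used above (the straightforward O(N²) formulation; the tolerance and embedding defaults are common choices, not necessarily DASim's):

```python
# Sample entropy: -log of the ratio of (m+1)-length to m-length template matches.
import numpy as np

def sampen(x, m=2, r=0.2):
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def matches(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        total = 0
        for i in range(len(t)):
            dist = np.max(np.abs(t - t[i]), axis=1)   # Chebyshev distance
            total += np.sum(dist <= tol) - 1          # exclude self-match
        return total
    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(7)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))
print(sampen(regular), sampen(rng.normal(size=500)))  # low vs high irregularity
```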
The Power of Neuroimaging Biomarkers for Screening Frontotemporal Dementia
McMillan, Corey T.; Avants, Brian B.; Cook, Philip; Ungar, Lyle; Trojanowski, John Q.; Grossman, Murray
2014-01-01
Frontotemporal dementia (FTD) is a clinically and pathologically heterogeneous neurodegenerative disease that can result from either frontotemporal lobar degeneration (FTLD) or Alzheimer's disease (AD) pathology. It is critical to establish statistically powerful biomarkers that can achieve substantial cost savings and increase the feasibility of clinical trials. We assessed three broad categories of neuroimaging methods for screening underlying FTLD and AD pathology in a clinical FTD series: global measures (e.g., ventricular volume), anatomical volumes of interest (VOIs) (e.g., hippocampus) using a standard atlas, and data-driven VOIs using Eigenanatomy. We evaluated clinical FTD patients (N=93) with cerebrospinal fluid, gray matter (GM) MRI, and diffusion tensor imaging (DTI) to assess whether they had underlying FTLD or AD pathology. Linear regression was performed to identify the optimal VOIs for each method in a training dataset, and we then evaluated classification sensitivity and specificity in an independent test cohort. Power was evaluated by calculating the minimum sample size (mSS) required in the test classification analyses for each model. The data-driven VOI analysis using a multimodal combination of GM MRI and DTI achieved the greatest classification accuracy (89% sensitivity; 89% specificity) and required a lower minimum sample size (N=26) relative to anatomical VOIs and global measures. We conclude that a data-driven VOI approach employing Eigenanatomy provides more accurate classification, benefits from increased statistical power in unseen datasets, and therefore provides a robust method for screening underlying pathology in FTD patients for entry into clinical trials. PMID:24687814
New advances in the statistical parton distributions approach
NASA Astrophysics Data System (ADS)
Soffer, Jacques; Bourrely, Claude
2016-03-01
The quantum statistical parton distributions approach proposed more than a decade ago is revisited by considering a larger set of recent and accurate Deep Inelastic Scattering experimental results. It enables us to improve the description of the data by means of a new determination of the parton distributions. This global next-to-leading-order QCD analysis leads to a good description of several structure functions, involving unpolarized parton distributions and helicity distributions, in terms of a rather small number of free parameters. Many serious challenges remain. The predictions of this theoretical approach will be tested for single-jet production and charge asymmetry in W± production in p̄p and pp collisions up to LHC energies, using recent data and also forthcoming experimental results. Presented by J. Soffer at POETIC 2015.
Friedman, David B
2012-01-01
All quantitative proteomics experiments measure variation between samples. When performing large-scale experiments that involve multiple conditions or treatments, the experimental design should include the appropriate number of individual biological replicates from each condition to enable the distinction between a relevant biological signal from technical noise. Multivariate statistical analyses, such as principal component analysis (PCA), provide a global perspective on experimental variation, thereby enabling the assessment of whether the variation describes the expected biological signal or the unanticipated technical/biological noise inherent in the system. Examples will be shown from high-resolution multivariable DIGE experiments where PCA was instrumental in demonstrating biologically significant variation as well as sample outliers, fouled samples, and overriding technical variation that would not be readily observed using standard univariate tests.
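A generic illustration of the PCA screening described above, on a samples-by-proteins matrix (scikit-learn; the data are simulated, not DIGE measurements):

```python
# PCA of replicate samples: conditions should separate along leading PCs,
# and outlier or fouled samples stand apart from their group.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)
control = rng.normal(0.0, 1.0, size=(6, 200))   # 6 biological replicates x 200 features
treated = rng.normal(0.5, 1.0, size=(6, 200))
X = np.vstack([control, treated])

scores = PCA(n_components=2).fit_transform(X)
for label, (pc1, pc2) in zip(["ctrl"] * 6 + ["trt"] * 6, scores):
    print(f"{label}: PC1 = {pc1:+.2f}, PC2 = {pc2:+.2f}")
```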
Adaptive Error Estimation in Linearized Ocean General Circulation Models
NASA Technical Reports Server (NTRS)
Chechelnitsky, Michael Y.
1999-01-01
Data assimilation methods are routinely used in oceanography. The statistics of the model and measurement errors need to be specified a priori. This study addresses the problem of estimating model and measurement error statistics from observations. We start by testing innovation-based methods of adaptive error estimation, using low-dimensional models in the North Pacific (5-60 deg N, 132-252 deg E) with TOPEX/POSEIDON (T/P) sea level anomaly data, acoustic tomography data from the ATOC project, and the MIT General Circulation Model (GCM). A reduced-state linear model that describes large-scale internal (baroclinic) error dynamics is used. The methods are shown to be sensitive to the initial guess for the error statistics and to the type of observations. A new off-line approach is developed, the covariance matching approach (CMA), where covariance matrices of model-data residuals are "matched" to their theoretical expectations using familiar least squares methods. This method uses observations directly instead of the innovations sequence and is shown to be related to the MT method and the method of Fu et al. (1993). Twin experiments using the same linearized MIT GCM suggest that altimetric data are ill-suited to the estimation of internal GCM errors, but that such estimates can in theory be obtained using acoustic data. The CMA is then applied to T/P sea level anomaly data and a linearization of a global GFDL GCM which uses two vertical modes. We show that the CMA method can be used with a global model and a global data set, and that the estimates of the error statistics are robust. We show that the fraction of the GCM-T/P residual variance explained by the model error is larger than that derived in Fukumori et al. (1999) with the method of Fu et al. (1993). Most of the model error is explained by the barotropic mode. However, we find that the impact of the change in the error statistics on the data assimilation estimates is very small. This is explained by the large representation error, i.e. the dominance of the mesoscale eddies in the T/P signal, which are not part of the 2° by 1° GCM. Therefore, the impact of the observations on the assimilation is very small even after the adjustment of the error statistics. This work demonstrates that simultaneous estimation of the model and measurement error statistics for data assimilation with global ocean data sets and linearized GCMs is possible. However, the error covariance estimation problem is in general highly underdetermined, much more so than the state estimation problem. In other words, there exist a very large number of statistical models that can be made consistent with the available data. Therefore, methods for obtaining quantitative error estimates, powerful though they may be, cannot replace physical insight. Used in the right context, as a tool for guiding the choice of a small number of model error parameters, covariance matching can be a useful addition to the repertory of tools available to oceanographers.
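A toy sketch of the covariance matching idea: fit scalar weights on assumed covariance shapes so that the theoretical residual covariance matches the sample covariance of model-data residuals (all shapes below are assumptions for illustration):

```python
# Covariance matching: solve S_hat ~ alpha * (H P0 H^T) + beta * R0 by least squares.
import numpy as np

rng = np.random.default_rng(9)
m = 30
HP0Ht = np.diag(np.linspace(0.5, 1.5, m))   # assumed model-error shape in obs space
R0 = np.eye(m)                              # assumed measurement-error shape
true_cov = 0.8 * HP0Ht + 0.4 * R0

residuals = rng.multivariate_normal(np.zeros(m), true_cov, size=5000)
S_hat = np.cov(residuals, rowvar=False)

A = np.column_stack([HP0Ht.ravel(), R0.ravel()])
coef, *_ = np.linalg.lstsq(A, S_hat.ravel(), rcond=None)
print(coef)   # should recover approximately [0.8, 0.4]
```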
Multi-criteria evaluation of CMIP5 GCMs for climate change impact analysis
NASA Astrophysics Data System (ADS)
Ahmadalipour, Ali; Rana, Arun; Moradkhani, Hamid; Sharma, Ashish
2017-04-01
Climate change is expected to have severe impacts on the global hydrological cycle along with the food-water-energy nexus. Currently, there are many climate models used in predicting important climatic variables. Though there have been advances in the field, there are still many problems to be resolved related to reliability, uncertainty, and computing needs, among many others. In the present work, we have analyzed the performance of 20 different global climate models (GCMs) from the Climate Model Intercomparison Project Phase 5 (CMIP5) dataset over the Columbia River Basin (CRB) in the Pacific Northwest USA. We demonstrate a statistical multicriteria approach, using univariate and multivariate techniques, for selecting suitable GCMs to be used for climate change impact analysis in the region. Univariate methods include mean, standard deviation, coefficient of variation, relative change (variability), the Mann-Kendall test, and the Kolmogorov-Smirnov test (KS-test); the multivariate methods used were principal component analysis (PCA), singular value decomposition (SVD), canonical correlation analysis (CCA), and cluster analysis. The analysis is performed on raw GCM data, i.e., before bias correction, for the precipitation and temperature climatic variables for all 20 models to capture the reliability and nature of the particular model at regional scale. The analysis is based on spatially averaged datasets of GCMs and observations for the period 1970 to 2000. A rank is assigned to each GCM based on its performance evaluated against gridded observational data on various temporal scales (daily, monthly, and seasonal). Results provide insight into each of the methods and the various statistical properties they address when ranking GCMs. Further, the raw GCM simulations were also evaluated against different sets of gridded observational data in the area.
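A minimal sketch of the kind of univariate scoring that could feed such a ranking, in Python with scipy; the series, the two-term score, and the equal weighting are illustrative assumptions, not the paper's actual criteria:

import numpy as np
from scipy import stats

# Hypothetical monthly precipitation: observations and two GCMs, 1970-2000.
rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 50.0, size=372)
gcms = {"gcm_a": rng.gamma(2.1, 48.0, size=372),
        "gcm_b": rng.gamma(1.5, 70.0, size=372)}

# Score each model by KS distance to the observed distribution plus
# relative bias of the mean; smaller is better.
scores = {}
for name, sim in gcms.items():
    ks = stats.ks_2samp(obs, sim).statistic
    bias = abs(sim.mean() - obs.mean()) / obs.mean()
    scores[name] = ks + bias

print(sorted(scores, key=scores.get))  # rank: best model first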
Cross-cultural variation of memory colors of familiar objects.
Smet, Kevin A G; Lin, Yandan; Nagy, Balázs V; Németh, Zoltan; Duque-Chica, Gloria L; Quintero, Jesús M; Chen, Hung-Shing; Luo, Ronnier M; Safi, Mahdi; Hanselaer, Peter
2014-12-29
The effect of cross-regional or cross-cultural differences on color appearance ratings and memory colors of familiar objects was investigated in seven different countries/regions - Belgium, Hungary, Brazil, Colombia, Taiwan, China and Iran. In each region the familiar objects were presented on a calibrated monitor in over 100 different colors to a test panel of observers who were asked to rate the similarity of the presented object color with respect to what they thought the object looks like in reality (memory color). For each object and region the mean observer ratings were modeled by a bivariate Gaussian function. A statistical analysis showed significant (p < 0.001) differences between the region average observers and the global average observer obtained by pooling the data from all regions. However, the effect size of geographical region or culture was found to be small. In fact, the differences between the region average observers and the global average observer were found to be of the same magnitude as, or smaller than, the typical within-region inter-observer variability. Thus, although statistically significant differences in color appearance ratings and memory colors between regions were found, the regional impact is not likely to be of practical importance.
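A minimal sketch of fitting mean similarity ratings with a bivariate Gaussian, as described, using scipy; the chromaticity coordinates, rating model, and parameter values below are hypothetical:

import numpy as np
from scipy.optimize import curve_fit

# Rating surface model: a bivariate Gaussian over two color coordinates.
def bigauss(ab, amp, a0, b0, sa, sb, rho):
    a, b = ab
    za, zb = (a - a0) / sa, (b - b0) / sb
    return amp * np.exp(-(za**2 - 2*rho*za*zb + zb**2) / (2 * (1 - rho**2)))

# Hypothetical ratings of one familiar object shown at 120 colors.
rng = np.random.default_rng(2)
a = rng.uniform(-20, 40, 120)
b = rng.uniform(0, 60, 120)
ratings = bigauss((a, b), 5.0, 12.0, 30.0, 8.0, 10.0, 0.2) + rng.normal(0, 0.2, 120)

popt, _ = curve_fit(bigauss, (a, b), ratings, p0=[5, 10, 25, 10, 10, 0.0])
print(popt)  # fitted centre (a0, b0) = memory color; spreads and correlation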
Recent Achievements of the Collaboratory for the Study of Earthquake Predictability
NASA Astrophysics Data System (ADS)
Jackson, D. D.; Liukis, M.; Werner, M. J.; Schorlemmer, D.; Yu, J.; Maechling, P. J.; Zechar, J. D.; Jordan, T. H.
2015-12-01
Maria Liukis, SCEC, USC; Maximilian Werner, University of Bristol; Danijel Schorlemmer, GFZ Potsdam; John Yu, SCEC, USC; Philip Maechling, SCEC, USC; Jeremy Zechar, Swiss Seismological Service, ETH; Thomas H. Jordan, SCEC, USC, and the CSEP Working Group The Collaboratory for the Study of Earthquake Predictability (CSEP) supports a global program to conduct prospective earthquake forecasting experiments. CSEP testing centers are now operational in California, New Zealand, Japan, China, and Europe with 435 models under evaluation. The California testing center, operated by SCEC, has been operational since Sept 1, 2007, and currently hosts 30-minute, 1-day, 3-month, 1-year and 5-year forecasts, both alarm-based and probabilistic, for California, the Western Pacific, and worldwide. We have reduced testing latency, implemented prototype evaluation of M8 forecasts, and are currently developing formats and procedures to evaluate externally-hosted forecasts and predictions. These efforts are related to CSEP support of the USGS program in operational earthquake forecasting and a DHS project to register and test external forecast procedures from experts outside seismology. A retrospective experiment for the 2010-2012 Canterbury earthquake sequence has been completed, and the results indicate that some physics-based and hybrid models outperform purely statistical (e.g., ETAS) models. The experiment also demonstrates the power of the CSEP cyberinfrastructure for retrospective testing. Our current development includes evaluation strategies that increase computational efficiency for high-resolution global experiments, such as the evaluation of the Global Earthquake Activity Rate (GEAR) model. We describe the open-source CSEP software that is available to researchers as they develop their forecast models (http://northridge.usc.edu/trac/csep/wiki/MiniCSEP). We also discuss applications of CSEP infrastructure to geodetic transient detection and how CSEP procedures are being adapted to ground motion prediction experiments.
Similar Estimates of Temperature Impacts on Global Wheat Yield by Three Independent Methods
NASA Technical Reports Server (NTRS)
Liu, Bing; Asseng, Senthold; Müller, Christoph; Ewert, Frank; Elliott, Joshua; Lobell, David B.; Martre, Pierre; Ruane, Alex C.; Wallach, Daniel; Jones, James W.
2016-01-01
The potential impact of global temperature change on global crop yield has recently been assessed with different methods. Here we show that grid-based and point-based simulations and statistical regressions (from historic records), without deliberate adaptation or CO2 fertilization effects, produce similar estimates of temperature impact on wheat yields at global and national scales. With a 1 °C global temperature increase, global wheat yield is projected to decline between 4.1% and 6.4%. Projected relative temperature impacts from different methods were similar for major wheat-producing countries China, India, USA and France, but less so for Russia. Point-based and grid-based simulations, and to some extent the statistical regressions, were consistent in projecting that warmer regions are likely to suffer more yield loss with increasing temperature than cooler regions. By forming a multi-method ensemble, it was possible to quantify 'method uncertainty' in addition to model uncertainty. This significantly improves confidence in estimates of climate impacts on global food security.
Similar estimates of temperature impacts on global wheat yield by three independent methods
NASA Astrophysics Data System (ADS)
Liu, Bing; Asseng, Senthold; Müller, Christoph; Ewert, Frank; Elliott, Joshua; Lobell, David B.; Martre, Pierre; Ruane, Alex C.; Wallach, Daniel; Jones, James W.; Rosenzweig, Cynthia; Aggarwal, Pramod K.; Alderman, Phillip D.; Anothai, Jakarat; Basso, Bruno; Biernath, Christian; Cammarano, Davide; Challinor, Andy; Deryng, Delphine; Sanctis, Giacomo De; Doltra, Jordi; Fereres, Elias; Folberth, Christian; Garcia-Vila, Margarita; Gayler, Sebastian; Hoogenboom, Gerrit; Hunt, Leslie A.; Izaurralde, Roberto C.; Jabloun, Mohamed; Jones, Curtis D.; Kersebaum, Kurt C.; Kimball, Bruce A.; Koehler, Ann-Kristin; Kumar, Soora Naresh; Nendel, Claas; O'Leary, Garry J.; Olesen, Jørgen E.; Ottman, Michael J.; Palosuo, Taru; Prasad, P. V. Vara; Priesack, Eckart; Pugh, Thomas A. M.; Reynolds, Matthew; Rezaei, Ehsan E.; Rötter, Reimund P.; Schmid, Erwin; Semenov, Mikhail A.; Shcherbak, Iurii; Stehfest, Elke; Stöckle, Claudio O.; Stratonovitch, Pierre; Streck, Thilo; Supit, Iwan; Tao, Fulu; Thorburn, Peter; Waha, Katharina; Wall, Gerard W.; Wang, Enli; White, Jeffrey W.; Wolf, Joost; Zhao, Zhigan; Zhu, Yan
2016-12-01
The potential impact of global temperature change on global crop yield has recently been assessed with different methods. Here we show that grid-based and point-based simulations and statistical regressions (from historic records), without deliberate adaptation or CO2 fertilization effects, produce similar estimates of temperature impact on wheat yields at global and national scales. With a 1 °C global temperature increase, global wheat yield is projected to decline between 4.1% and 6.4%. Projected relative temperature impacts from different methods were similar for major wheat-producing countries China, India, USA and France, but less so for Russia. Point-based and grid-based simulations, and to some extent the statistical regressions, were consistent in projecting that warmer regions are likely to suffer more yield loss with increasing temperature than cooler regions. By forming a multi-method ensemble, it was possible to quantify 'method uncertainty' in addition to model uncertainty. This significantly improves confidence in estimates of climate impacts on global food security.
Cluster detection methods applied to the Upper Cape Cod cancer data.
Ozonoff, Al; Webster, Thomas; Vieira, Veronica; Weinberg, Janice; Ozonoff, David; Aschengrau, Ann
2005-09-15
A variety of statistical methods have been suggested to assess the degree and/or the location of spatial clustering of disease cases. However, there is relatively little in the literature devoted to comparison and critique of different methods. Most of the available comparative studies rely on simulated data rather than real data sets. We have chosen three methods currently used for examining spatial disease patterns: the M-statistic of Bonetti and Pagano; the Generalized Additive Model (GAM) method as applied by Webster; and Kulldorff's spatial scan statistic. We apply these statistics to analyze breast cancer data from the Upper Cape Cancer Incidence Study using three different latency assumptions. The three different latency assumptions produced three different spatial patterns of cases and controls. For the 20-year latency assumption, all three methods generally concur. However, for the 15-year latency and no-latency assumptions, the methods produce different results when testing for global clustering. The comparative analyses of real data sets by different statistical methods provide insight into directions for further research. We suggest a research program designed around examining real data sets to guide focused investigation of relevant features using simulated data, for the purpose of understanding how to interpret statistical methods applied to epidemiological data with a spatial component.
Testing an Earthquake Prediction Algorithm: The 2016 New Zealand and Chile Earthquakes
NASA Astrophysics Data System (ADS)
Kossobokov, Vladimir G.
2017-05-01
The 13 November 2016, M7.8, 54 km NNE of Amberley, New Zealand and the 25 December 2016, M7.6, 42 km SW of Puerto Quellon, Chile earthquakes happened outside the area of the on-going real-time global testing of the intermediate-term middle-range earthquake prediction algorithm M8, accepted in 1992 for the M7.5+ range. Naturally, over the past two decades, the level of registration of earthquakes worldwide has grown significantly and is by now sufficient for diagnosis of times of increased probability (TIPs) by the M8 algorithm across the entire territory of New Zealand and southern Chile, down to below 40°S. The mid-2016 update of the M8 predictions determines TIPs in the additional circles of investigation (CIs) where the two earthquakes happened. Thus, after 50 semiannual updates in the real-time prediction mode, we (1) confirm the statistically established high confidence of the M8-MSc predictions and (2) conclude that the territory of the Global Test of the M8 and MSc algorithms could be expanded in an apparently necessary revision of the 1992 settings.
An adaptive two-stage dose-response design method for establishing proof of concept.
Franchetti, Yoko; Anderson, Stewart J; Sampson, Allan R
2013-01-01
We propose an adaptive two-stage dose-response design where a prespecified adaptation rule is used to add and/or drop treatment arms between the stages. We extend the multiple comparison procedures-modeling (MCP-Mod) approach into a two-stage design. In each stage, we use the same set of candidate dose-response models and test for a dose-response relationship or proof of concept (PoC) via model-associated statistics. The stage-wise test results are then combined to establish "global" PoC using a conditional error function. Our simulation studies showed good and more robust power in our design method compared to conventional and fixed designs.
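One way to combine stage-wise PoC evidence is shown in the sketch below, with Fisher's product criterion as the combination function (a common choice; the stage-wise p-values and the one-sided 2.5% level are illustrative, and the sketch omits the adaptation rule itself):

import numpy as np
from scipy import stats

def fisher_combination(p1, p2):
    # Combine two independent stage-wise p-values into a global p-value.
    chi2 = -2.0 * (np.log(p1) + np.log(p2))
    return stats.chi2.sf(chi2, df=4)

p_stage1, p_stage2 = 0.08, 0.03      # illustrative stage-wise PoC p-values
p_global = fisher_combination(p_stage1, p_stage2)
print(p_global, p_global < 0.025)    # declare global PoC if below the level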
Development of a Novel Method for Determination of Residual Stresses in a Friction Stir Weld
NASA Technical Reports Server (NTRS)
Reynolds, Anthony P.
2001-01-01
Material constitutive properties, which describe the mechanical behavior of a material under loading, are vital to the design and implementation of engineering materials. For homogeneous materials, the standard process for determining these properties is the tensile test, which is used to measure the material stress-strain response. However, a majority of the applications for engineering materials involve the use of heterogeneous materials and structures (i.e., alloys, welded components) that exhibit heterogeneity on a global or local level. Regardless of the scale of heterogeneity, the overall response of the material or structure is dependent on the response of each of the constituents. Therefore, in order to produce materials and structures that perform in the best possible manner, the properties of the constituents that make up the heterogeneous material must be thoroughly examined. When materials exhibit heterogeneity on a local level, such as in alloys or particle/matrix composites, they are often treated as statistically homogeneous and the resulting 'effective' properties may be determined through homogenization techniques. In the case of globally heterogeneous materials, such as weldments, the standard tensile test provides the global response but no information on what is occurring locally within the different constituents. This information is necessary to improve the material processing as well as the end product.
Combining p-values in replicated single-case experiments with multivariate outcome.
Solmi, Francesca; Onghena, Patrick
2014-01-01
Interest in combining probabilities has a long history in the global statistical community. The first steps in this direction were taken by Ronald Fisher, who introduced the idea of combining the p-values of independent tests to provide a global decision rule when multiple aspects of a given problem were of interest. An interesting approach to this idea of combining p-values is the one based on permutation theory. The methods belonging to this particular approach exploit the permutation distributions of the tests to be combined and use a simple function to combine the probabilities. Combining p-values finds a very interesting application in the analysis of replicated single-case experiments. In this field the focus, when comparing different treatment effects, is more articulated than simply looking at the means of the different populations. Moreover, it is often of interest to combine the results obtained on the single patients in order to get more global information about the phenomenon under study. This paper gives an overview of how the concept of combining p-values was conceived and how it can be easily handled via permutation techniques. Finally, the method of combining p-values is applied to a simulated replicated single-case experiment, and a numerical illustration is presented.
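A minimal sketch of the permutation-based combination the paper describes, in Python; the per-case data, the one-sided mean-difference statistic, and the four-case design are hypothetical:

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def perm_p(a, b, n_perm=5000):
    # One-sided permutation p-value for a difference in means.
    obs = a.mean() - b.mean()
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        count += pooled[:len(a)].mean() - pooled[len(a):].mean() >= obs
    return (count + 1) / (n_perm + 1)

# Four hypothetical replicated single cases: treatment vs baseline scores.
cases = [(rng.normal(1.0, 1, 8), rng.normal(0.0, 1, 8)) for _ in range(4)]
pvals = np.array([perm_p(a, b) for a, b in cases])

# Fisher's product function combines the per-case p-values globally.
print(pvals, stats.combine_pvalues(pvals, method="fisher")[1])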
Mullan, Patricia B; Williams, Joy; Malani, Preeti N; Riba, Michelle; Haig, Andrew; Perry, Julie; Kolars, Joseph C; Mangrulkar, Rajesh; Williams, Brent
2014-05-03
The move to frame medical education in terms of competencies - the extent to which trainees "can do" a professional responsibility - is congruent with calls for accountability in medical education. However, the focus on competencies might be a poor fit with curricula intended to prepare students for responsibilities not emphasized in traditional medical education. This study examines an innovative approach to the use of potential competency expectations related to advancing global health equity to promote students' reflections and to inform curriculum development. In 2012, 32 medical students were admitted into a newly developed Global Health and Disparities (GHD) Path of Excellence. The GHD program takes the form of mentored co-curricular activities built around defined competencies related to professional development and leadership skills intended to ameliorate health disparities in medically underserved settings, both domestically and globally. Students reviewed the GHD competencies from two perspectives: a) their ability to perform the identified competencies that they perceived themselves as holding as they began the GHD program and b) the extent to which they perceived that their future career would require these responsibilities. For both sets of assessments the response scale ranged from "Strongly Disagree" to "Strongly Agree." Wilcoxon's paired T-tests compared individual students' ordinal rating of their current level of ability to their perceived need for competence that they anticipated their careers would require. Statistical significance was set at p < .01. Students' ratings ranged from "strongly disagree" to "strongly agree" that they could perform the defined GHD-related competencies. However, on most competencies, at least 50 % of students indicated that the stated competencies were beyond their present ability level. For each competency, the results of Wilcoxon paired T-tests indicate - at statistically significant levels - that students perceive more need in their careers for GHD-program defined competencies than they currently possess. This study suggests congruence between student and program perceptions of the scope of practice required for GHD. Students report the need for enhanced skill levels in the careers they anticipate. This approach to formulating and reflecting on competencies will guide the program's design of learning experiences aligned with students' career goals.
NASA Astrophysics Data System (ADS)
Gordeev, E.; Sergeev, V.; Honkonen, I.; Kuznetsova, M.; Rastätter, L.; Palmroth, M.; Janhunen, P.; Tóth, G.; Lyon, J.; Wiltberger, M.
2015-12-01
Global magnetohydrodynamic (MHD) modeling is a powerful tool in space weather research and predictions. There are several advanced and still developing global MHD (GMHD) models that are publicly available via the Community Coordinated Modeling Center's (CCMC) Run on Request system, which allows users to simulate the magnetospheric response to different solar wind conditions, including extraordinary events like geomagnetic storms. Systematic validation of GMHD models against observations still continues to be a challenge, as does comparative benchmarking of different models against each other. In this paper we describe and test a new approach in which (i) a set of critical large-scale system parameters is explored/tested, which are produced by (ii) a specially designed set of computer runs to simulate realistic statistical distributions of critical solar wind parameters and are compared to (iii) observation-based empirical relationships for these parameters. Being tested in approximately similar conditions (similar inputs, comparable grid resolution, etc.), the four models publicly available at the CCMC predict rather well the absolute values and variations of those key parameters (magnetospheric size, magnetic field, and pressure) which are directly related to the large-scale magnetospheric equilibrium in the outer magnetosphere, for which MHD is supposed to be a valid approach. At the same time, the models have systematic differences in other parameters, being especially different in predicting the global convection rate, total field-aligned current, and magnetic flux loading into the magnetotail after the north-south interplanetary magnetic field turning. According to the validation results, none of the models emerges as an absolute leader. The new approach suggested for evaluating model performance against reality may be used by model users while planning their investigations, as well as by model developers and those interested in quantitatively evaluating progress in magnetospheric modeling.
McFarlane, William R; Levin, Bruce; Travis, Lori; Lucas, F Lee; Lynch, Sarah; Verdi, Mary; Williams, Deanna; Adelsheim, Steven; Calkins, Roderick; Carter, Cameron S; Cornblatt, Barbara; Taylor, Stephan F; Auther, Andrea M; McFarland, Bentson; Melton, Ryan; Migliorati, Margaret; Niendam, Tara; Ragland, J Daniel; Sale, Tamara; Salvador, Melina; Spring, Elizabeth
2015-01-01
To test the effectiveness of the Early Detection, Intervention, and Prevention of Psychosis Program in preventing the onset of severe psychosis and improving functioning in a national sample of at-risk youth. In a risk-based allocation study design, 337 youth (age 12-25) at risk of psychosis were assigned to treatment groups based on severity of positive symptoms. Those at clinically higher risk (CHR) or having an early first episode of psychosis (EFEP) were assigned to receive Family-aided Assertive Community Treatment (FACT); those at clinically lower risk (CLR) were assigned to receive community care. Between-groups differences on outcome variables were adjusted statistically according to regression-discontinuity procedures and evaluated using the Global Test Procedure that combined all symptom and functional measures. A total of 337 young people (mean age: 16.6) were assigned to the treatment group (CHR + EFEP, n = 250) or comparison group (CLR, n = 87). On the primary variable, positive symptoms, FACT was superior to community care after 2 years (2 df, p < .0001) for both the CHR (p = .0034) and EFEP (p < .0001) subgroups. Rates of conversion (6.3% CHR vs 2.3% CLR) and first negative event (25% CHR vs 22% CLR) were low but did not differ. FACT was superior in the Global Test (p = .0007; p = .024 for CHR and p = .0002 for EFEP, vs CLR) and in improvement in participation in work and school (p = .025). FACT is effective in improving positive, negative, disorganized and general symptoms, Global Assessment of Functioning, work and school participation and global outcome in youth at risk for, or experiencing very early, psychosis. © The Author 2014. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center.
Global health business: the production and performativity of statistics in Sierra Leone and Germany.
Erikson, Susan L
2012-01-01
The global push for health statistics and electronic digital health information systems is about more than tracking health incidence and prevalence. It is also experienced on the ground as means to develop and maintain particular norms of health business, knowledge, and decision- and profit-making that are not innocent. Statistics make possible audit and accountability logics that undergird the management of health at a distance and that are increasingly necessary to the business of health. Health statistics are inextricable from their social milieus, yet as business artifacts they operate as if they are freely formed, objectively originated, and accurate. This article explicates health statistics as cultural forms and shows how they have been produced and performed in two very different countries: Sierra Leone and Germany. In both familiar and surprising ways, this article shows how statistics and their pursuit organize and discipline human behavior, constitute subject positions, and reify existing relations of power.
Blanco-Guillot, Francles; Castañeda-Cediel, M Lucía; Cruz-Hervert, Pablo; Ferreyra-Reyes, Leticia; Delgado-Sánchez, Guadalupe; Ferreira-Guerrero, Elizabeth; Montero-Campos, Rogelio; Bobadilla-Del-Valle, Miriam; Martínez-Gamboa, Rosa Areli; Torres-González, Pedro; Téllez-Vazquez, Norma; Canizales-Quintero, Sergio; Yanes-Lane, Mercedes; Mongua-Rodríguez, Norma; Ponce-de-León, Alfredo; Sifuentes-Osornio, José; García-García, Lourdes
2018-01-01
Genotyping and georeferencing in tuberculosis (TB) have been used to characterize the distribution of the disease and the occurrence of transmission within specific groups and communities. The objective of this study was to test the hypothesis that diabetes mellitus (DM) and pulmonary TB may occur in spatial and molecular aggregations. Retrospective cohort study of patients with pulmonary TB. The study area included 12 municipalities in the Sanitary Jurisdiction of Orizaba, Veracruz, México. Patients with acid-fast bacilli in sputum smears and/or Mycobacterium tuberculosis in sputum cultures were recruited from 1995 to 2010. Clinical (standardized questionnaire, physical examination, chest X-ray, blood glucose test and HIV test), microbiological, epidemiological, and molecular evaluations were carried out. Patients were considered "genotype-clustered" if two or more isolates from different patients were identified within 12 months of each other and had six or more IS6110 bands in an identical pattern, or < 6 bands with identical IS6110 RFLP patterns and spoligotypes with the same spacer oligonucleotides. Residential and health care center addresses were georeferenced with a Jeep hand-held GPS. The coordinates were transferred from the GPS files to ArcGIS using ArcMap 9.3. We evaluated global spatial aggregation of patients in IS6110-RFLP/spoligotype clusters using global Moran's I. Since the global distribution was not random, we evaluated "hotspots" using the Getis-Ord Gi* statistic. Using bivariate and multivariate analysis we analyzed sociodemographic, behavioral, clinical and bacteriological conditions associated with "hotspots". We used STATA® v13.1 for all statistical analyses. From 1995 to 2010, 1,370 patients >20 years were diagnosed with pulmonary TB; 33% had DM. The proportion of isolates that were genotyped was 80.7% (n = 1105), of which 31% (n = 342) were grouped in 91 genotype clusters with 2 to 23 patients each; 65.9% of total clusters were small (2 members), involving 35.08% of patients. Twenty-three percent (22.7%) of cases were classified as recent transmission. Moran's I indicated that the distribution of patients in IS6110-RFLP/spoligotype clusters was not random (Moran's I = 0.035468, Z value = 7.0, p = 0.00). Local spatial analysis showed statistically significant spatial aggregation of patients in IS6110-RFLP/spoligotype clusters, identifying "hotspots" and "coldspots". The Gi* statistic showed that the hotspot for spatial clustering was located in Camerino Z. Mendoza municipality; 14.6% (50/342) of patients in genotype clusters were located in a hotspot; of these, 60% (30/50) lived with DM. Using logistic regression, the statistically significant variables associated with hotspots were: DM [adjusted Odds Ratio (aOR) 7.04, 95% Confidence Interval (CI) 3.03-16.38] and attending the health center in Camerino Z. Mendoza (aOR 18.04, 95% CI 7.35-44.28). The combination of molecular and epidemiological information with geospatial data allowed us to identify the concurrence of molecular clustering and spatial aggregation of patients with DM and TB. This information may be highly useful for TB control programs.
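A minimal sketch of the global Moran's I computation with a permutation test, in Python; the coordinates, the cluster-membership indicator, the inverse-distance weights, and the 999 permutations are all illustrative assumptions:

import numpy as np

rng = np.random.default_rng(4)

# Hypothetical georeferenced patients: coordinates (km) and a 0/1 indicator
# of membership in an IS6110-RFLP/spoligotype cluster.
xy = rng.uniform(0, 50, size=(200, 2))
z = (rng.uniform(size=200) < 0.3).astype(float)

# Inverse-distance spatial weights, zero on the diagonal.
d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
w = np.where(d > 0, 1.0 / d, 0.0)

def morans_i(z, w):
    zc = z - z.mean()
    return len(z) / w.sum() * (zc @ w @ zc) / (zc @ zc)

obs = morans_i(z, w)
sims = np.array([morans_i(rng.permutation(z), w) for _ in range(999)])
p = (np.sum(sims >= obs) + 1) / 1000   # pseudo p-value under spatial randomness
print(obs, p)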
Blanco-Guillot, Francles; Ferreyra-Reyes, Leticia; Delgado-Sánchez, Guadalupe; Ferreira-Guerrero, Elizabeth; Montero-Campos, Rogelio; Bobadilla-del-Valle, Miriam; Martínez-Gamboa, Rosa Areli; Torres-González, Pedro; Téllez-Vazquez, Norma; Canizales-Quintero, Sergio; Yanes-Lane, Mercedes; Mongua-Rodríguez, Norma; Ponce-de-León, Alfredo; Sifuentes-Osornio, José
2018-01-01
Background Genotyping and georeferencing in tuberculosis (TB) have been used to characterize the distribution of the disease and the occurrence of transmission within specific groups and communities. Objective The objective of this study was to test the hypothesis that diabetes mellitus (DM) and pulmonary TB may occur in spatial and molecular aggregations. Material and methods Retrospective cohort study of patients with pulmonary TB. The study area included 12 municipalities in the Sanitary Jurisdiction of Orizaba, Veracruz, México. Patients with acid-fast bacilli in sputum smears and/or Mycobacterium tuberculosis in sputum cultures were recruited from 1995 to 2010. Clinical (standardized questionnaire, physical examination, chest X-ray, blood glucose test and HIV test), microbiological, epidemiological, and molecular evaluations were carried out. Patients were considered “genotype-clustered” if two or more isolates from different patients were identified within 12 months of each other and had six or more IS6110 bands in an identical pattern, or < 6 bands with identical IS6110 RFLP patterns and spoligotypes with the same spacer oligonucleotides. Residential and health care center addresses were georeferenced with a Jeep hand-held GPS. The coordinates were transferred from the GPS files to ArcGIS using ArcMap 9.3. We evaluated global spatial aggregation of patients in IS6110-RFLP/spoligotype clusters using global Moran's I. Since the global distribution was not random, we evaluated “hotspots” using the Getis-Ord Gi* statistic. Using bivariate and multivariate analysis we analyzed sociodemographic, behavioral, clinical and bacteriological conditions associated with “hotspots”. We used STATA® v13.1 for all statistical analyses. Results From 1995 to 2010, 1,370 patients >20 years were diagnosed with pulmonary TB; 33% had DM. The proportion of isolates that were genotyped was 80.7% (n = 1105), of which 31% (n = 342) were grouped in 91 genotype clusters with 2 to 23 patients each; 65.9% of total clusters were small (2 members), involving 35.08% of patients. Twenty-three percent (22.7%) of cases were classified as recent transmission. Moran's I indicated that the distribution of patients in IS6110-RFLP/spoligotype clusters was not random (Moran's I = 0.035468, Z value = 7.0, p = 0.00). Local spatial analysis showed statistically significant spatial aggregation of patients in IS6110-RFLP/spoligotype clusters, identifying “hotspots” and “coldspots”. The Gi* statistic showed that the hotspot for spatial clustering was located in Camerino Z. Mendoza municipality; 14.6% (50/342) of patients in genotype clusters were located in a hotspot; of these, 60% (30/50) lived with DM. Using logistic regression, the statistically significant variables associated with hotspots were: DM [adjusted Odds Ratio (aOR) 7.04, 95% Confidence Interval (CI) 3.03–16.38] and attending the health center in Camerino Z. Mendoza (aOR 18.04, 95% CI 7.35–44.28). Conclusions The combination of molecular and epidemiological information with geospatial data allowed us to identify the concurrence of molecular clustering and spatial aggregation of patients with DM and TB. This information may be highly useful for TB control programs. PMID:29534104
Further developments in cloud statistics for computer simulations
NASA Technical Reports Server (NTRS)
Chang, D. T.; Willand, J. H.
1972-01-01
This study is a part of NASA's continued program to provide global statistics of cloud parameters for computer simulation. The primary emphasis was on the development of the data bank of the global statistical distributions of cloud types and cloud layers and their applications in the simulation of the vertical distributions of in-cloud parameters such as liquid water content. These statistics were compiled from actual surface observations as recorded in Standard WBAN forms. Data for a total of 19 stations were obtained and reduced. These stations were selected to be representative of the 19 primary cloud climatological regions defined in previous studies of cloud statistics. Using the data compiled in this study, a limited study was conducted of the homogeneity of cloud regions, the latitudinal dependence of cloud-type distributions, the dependence of these statistics on sample size, and other factors in the statistics which are of significance to the problem of simulation. The application of the statistics in cloud simulation was investigated. In particular, the inclusion of the new statistics in an expanded multi-step Monte Carlo simulation scheme is suggested and briefly outlined.
Statistical functions and relevant correlation coefficients of clearness index
NASA Astrophysics Data System (ADS)
Pavanello, Diego; Zaaiman, Willem; Colli, Alessandra; Heiser, John; Smith, Scott
2015-08-01
This article presents a statistical analysis of the sky conditions, during the years 2010 to 2012, for three different locations: the Joint Research Centre site in Ispra (Italy, European Solar Test Installation - ESTI laboratories), the site of the National Renewable Energy Laboratory in Golden (Colorado, USA) and the site of Brookhaven National Laboratory in Upton (New York, USA). The key parameter is the clearness index kT, a dimensionless expression of the global irradiance impinging upon a horizontal surface at a given instant of time. In the first part, the sky conditions are characterized using daily averages, giving a general overview of the three sites. In the second part the analysis is performed using data sets with a short-term resolution of 1 sample per minute, demonstrating remarkable properties of the statistical distributions of the clearness index, reinforced by a proof using fuzzy logic methods. Subsequently, some time-dependent correlations between different meteorological variables are presented in terms of the Pearson and Spearman correlation coefficients, and a new coefficient is introduced.
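A minimal sketch of the Pearson versus Spearman comparison on 1-minute data, in Python with scipy; the simulated clearness index and the loosely coupled temperature series are hypothetical:

import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
kt = np.clip(rng.beta(5, 2, 1440), 0, 1)        # one day of per-minute kT
temp = 15 + 10 * kt + rng.normal(0, 1.5, 1440)  # loosely coupled variable

r_p, _ = stats.pearsonr(kt, temp)    # linear association
r_s, _ = stats.spearmanr(kt, temp)   # monotone (rank-based) association
print(r_p, r_s)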
Probabilistic Evaluation of Competing Climate Models
NASA Astrophysics Data System (ADS)
Braverman, A. J.; Chatterjee, S.; Heyman, M.; Cressie, N.
2017-12-01
A standard paradigm for assessing the quality of climate model simulations is to compare what these models produce for past and present time periods, to observations of the past and present. Many of these comparisons are based on simple summary statistics called metrics. Here, we propose an alternative: evaluation of competing climate models through probabilities derived from tests of the hypothesis that climate-model-simulated and observed time sequences share common climate-scale signals. The probabilities are based on the behavior of summary statistics of climate model output and observational data, over ensembles of pseudo-realizations. These are obtained by partitioning the original time sequences into signal and noise components, and using a parametric bootstrap to create pseudo-realizations of the noise sequences. The statistics we choose come from working in the space of decorrelated and dimension-reduced wavelet coefficients. We compare monthly sequences of CMIP5 model output of average global near-surface temperature anomalies to similar sequences obtained from the well-known HadCRUT4 data set, as an illustration.
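A minimal sketch of the signal/noise partition and the parametric bootstrap of the noise, in Python; the running-mean partition, the AR(1) noise model, and the simulated series are simplifying assumptions (the paper works with decorrelated wavelet coefficients):

import numpy as np

rng = np.random.default_rng(6)

# Hypothetical monthly temperature-anomaly sequence (model or observed).
y = np.cumsum(rng.normal(0, 0.02, 600)) + rng.normal(0, 0.1, 600)

# Crude partition: a 61-month running mean as the climate-scale signal.
signal = np.convolve(y, np.ones(61) / 61, mode="same")
noise = y - signal

# Fit AR(1) to the noise, then regenerate pseudo-noise from the fit.
phi = np.corrcoef(noise[:-1], noise[1:])[0, 1]
sigma = noise.std() * np.sqrt(1 - phi**2)

def pseudo_noise(n):
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = phi * e[t - 1] + rng.normal(0, sigma)
    return e

# Pseudo-realizations: common signal plus independent bootstrap noise.
ensemble = np.array([signal + pseudo_noise(len(y)) for _ in range(200)])
print(ensemble.shape)  # summary statistics are then computed over this ensemble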
An Adaptive Buddy Check for Observational Quality Control
NASA Technical Reports Server (NTRS)
Dee, Dick P.; Rukhovets, Leonid; Todling, Ricardo; DaSilva, Arlindo M.; Larson, Jay W.; Einaudi, Franco (Technical Monitor)
2000-01-01
An adaptive buddy check algorithm is presented that adjusts tolerances for outlier observations based on the variability of surrounding data. The algorithm derives from a statistical hypothesis test combined with maximum-likelihood covariance estimation. Its stability is shown to depend on the initial identification of outliers by a simple background check. The adaptive feature ensures that the final quality control decisions are not very sensitive to the prescribed statistics of first-guess and observation errors, nor to other approximations introduced into the algorithm. The implementation of the algorithm in a global atmospheric data assimilation system is described. Its performance is contrasted with that of a non-adaptive buddy check for the surface analysis of an extreme storm that took place in Europe on 27 December 1999. The adaptive algorithm allowed the inclusion of many important observations that differed greatly from the first guess and that would have been excluded on the basis of prescribed statistics. The analysis of the storm's development was much improved as a result of these additional observations.
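A heavily simplified buddy-check sketch in Python: the tolerance is inflated where nearby observations are more variable, which mimics the adaptive idea. The radius, base tolerance, floor value, and data are all hypothetical, and the real algorithm derives its tolerances from a hypothesis test with maximum-likelihood covariance estimation rather than this heuristic:

import numpy as np

def buddy_check(values, xy, radius=100.0, base_tol=3.0):
    # Flag observations that disagree with nearby "buddies".
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    flags = np.zeros(len(values), dtype=bool)
    for i in range(len(values)):
        buddies = (d[i] > 0) & (d[i] < radius)
        if buddies.sum() < 3:
            continue                      # too few buddies to decide
        mu = values[buddies].mean()
        sd = values[buddies].std(ddof=1)
        tol = base_tol * max(sd, 0.5)     # adaptive: scale with local spread
        flags[i] = abs(values[i] - mu) > tol
    return flags

rng = np.random.default_rng(7)
xy = rng.uniform(0, 500, size=(80, 2))    # hypothetical station locations (km)
obs = rng.normal(1010, 4, 80)             # surface pressure observations (hPa)
obs[0] += 40                              # inject one gross error
print(np.where(buddy_check(obs, xy))[0])  # index 0 should be flagged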
Peterson, Katie A; Savulich, George; Jackson, Dan; Killikelly, Clare; Pickard, John D; Sahakian, Barbara J
2016-08-01
We conducted a systematic review of the literature and used meta-analytic techniques to evaluate the impact of shunt surgery on neuropsychological performance in patients with normal pressure hydrocephalus (NPH). Twenty-three studies with 1059 patients were identified for review using PubMed, Web of Science, Google Scholar and manual searching. Inclusion criteria were prospective, within-subject investigations of cognitive outcome using neuropsychological assessment before and after shunt surgery in patients with NPH. There were statistically significant effects of shunt surgery on cognition (Mini-Mental State Examination; MMSE), learning and memory (Rey Auditory Verbal Learning Test; RAVLT, total and delayed subtests), executive function (backwards digit span, phonemic verbal fluency, trail making test B) and psychomotor speed (trail making test A), all in the direction of improvement following shunt surgery, but with considerable heterogeneity across all measures. A more detailed examination of the data suggested robust evidence for improved MMSE, RAVLT total, RAVLT delayed, phonemic verbal fluency and trail making test A only. Meta-regressions revealed no statistically significant effect of age, sex or follow-up interval on improvement in the MMSE. Our results suggest that shunt surgery is most sensitive for improving global cognition, learning and memory and psychomotor speed in patients with NPH.
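A minimal sketch of random-effects pooling of per-study effects, in Python using the DerSimonian-Laird estimator (one standard meta-analytic choice; the review does not state its exact model here, and the effect sizes and variances below are hypothetical):

import numpy as np

def dersimonian_laird(effects, variances):
    w = 1.0 / variances
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)          # Cochran's Q (heterogeneity)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)   # between-study variance
    w_star = 1.0 / (variances + tau2)
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Hypothetical pre/post shunt-surgery effect sizes from five studies.
effects = np.array([0.45, 0.30, 0.62, 0.15, 0.50])
variances = np.array([0.02, 0.04, 0.03, 0.05, 0.02])
print(dersimonian_laird(effects, variances))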
Long-term variability of global statistical properties of epileptic brain networks
NASA Astrophysics Data System (ADS)
Kuhnert, Marie-Therese; Elger, Christian E.; Lehnertz, Klaus
2010-12-01
We investigate the influence of various pathophysiologic and physiologic processes on global statistical properties of epileptic brain networks. We construct binary functional networks from long-term, multichannel electroencephalographic data recorded from 13 epilepsy patients, and the average shortest path length and the clustering coefficient serve as global statistical network characteristics. For time-resolved estimates of these characteristics we observe large fluctuations over time, however, with some periodic temporal structure. These fluctuations can—to a large extent—be attributed to daily rhythms while relevant aspects of the epileptic process contribute only marginally. Particularly, we could not observe clear cut changes in network states that can be regarded as predictive of an impending seizure. Our findings are of particular relevance for studies aiming at an improved understanding of the epileptic process with graph-theoretical approaches.
Identifying signatures of positive selection in pigmentation genes in two South Asian populations.
Jonnalagadda, Manjari; Bharti, Neeraj; Patil, Yatish; Ozarkar, Shantanu; K, Sunitha Manjari; Joshi, Rajendra; Norton, Heather
2017-09-10
Skin pigmentation is a polygenic trait showing wide phenotypic variation among global populations. While numerous pigmentation genes have been identified as being under positive selection in European and East Asian populations, genes contributing to phenotypic variation in skin pigmentation within and among South Asian populations are still poorly understood. The present study uses data from Phase 3 of the 1000 Genomes Project, focusing on two South Asian populations - GIH (Gujarati Indian from Houston, Texas) and ITU (Indian Telugu from the UK) - to decode the genetic architecture involved in adaptation to ultraviolet radiation in South Asian populations. The statistical tests included (1) tests to identify deviations of the Site Frequency Spectrum (SFS) from neutral expectations (Tajima's D, Fay and Wu's H, and Fu and Li's D* and F*), (2) tests focused on the identification of high-frequency haplotypes with extended linkage disequilibrium (iHS and Rsb), and (3) tests based on genetic differentiation between populations (LSBL). Twenty-two pigmentation genes fall in the top 1% for at least one statistic in the GIH population, 5 of which (LYST, OCA2, SLC24A5, SLC45A2, and TYR) have been previously associated with normal variation in skin, hair, or eye color. In comparison, 17 genes fall in the top 1% for at least one statistic in the ITU population. Twelve loci identified as outliers in the ITU scan were also identified in the GIH population. These results suggest that selection may have affected these loci broadly across the region. © 2017 Wiley Periodicals, Inc.
ERIC Educational Resources Information Center
Van Gundy, Karen; Morton, Beth A.; Liu, Hope Q.; Kline, Jennifer
2006-01-01
To explore the effects of web-based instruction (WBI) on math anxiety, the sense of mastery, and global self-esteem, we use quasi-experimental data from undergraduate statistics students in classes assigned to three study conditions, each with varied access to, and incentive for, the use of online technologies. Results suggest that when statistics…
Statistical emulators of maize, rice, soybean and wheat yields from global gridded crop models
Blanc, Élodie
2017-01-26
This study provides statistical emulators of crop yields based on global gridded crop model simulations from the Inter-Sectoral Impact Model Intercomparison Project Fast Track project. The ensemble of simulations is used to build a panel of annual crop yields from five crop models and corresponding monthly summer weather variables for over a century at the grid cell level globally. This dataset is then used to estimate, for each crop and gridded crop model, the statistical relationship between yields, temperature, precipitation and carbon dioxide. This study considers a new functional form to better capture the non-linear response of yields to weather, especially for extreme temperature and precipitation events, and now accounts for the effect of soil type. In- and out-of-sample validations show that the statistical emulators are able to replicate the spatial patterns of crop yield levels and their changes over time projected by crop models reasonably well, although the accuracy of the emulators varies by model and by region. This study therefore provides a reliable and accessible alternative to global gridded crop yield models. By emulating crop yields for several models using parsimonious equations, the tools provide a computationally efficient method to account for uncertainty in climate change impact assessments.
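A minimal sketch of the emulation idea, fitting a quadratic response surface of yield in temperature, precipitation and CO2 with scikit-learn; the panel below is simulated and the functional form is simpler than the paper's, so everything here is illustrative:

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical panel: grid-cell summer temperature (deg C), precipitation
# (mm), CO2 (ppm), and one crop model's simulated yield (t/ha).
rng = np.random.default_rng(8)
n = 2000
T = rng.normal(22, 4, n)
P = rng.gamma(3, 60, n)
C = rng.uniform(360, 700, n)
y = (8 - 0.02 * (T - 21)**2 + 0.004 * P - 1e-5 * (P - 180)**2
     + 0.002 * C + rng.normal(0, 0.4, n))

poly = PolynomialFeatures(2)
emulator = LinearRegression().fit(poly.fit_transform(np.column_stack([T, P, C])), y)

# The fitted polynomial now stands in for the full crop model.
print(emulator.predict(poly.transform([[25.0, 150.0, 550.0]])))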
Statistical emulators of maize, rice, soybean and wheat yields from global gridded crop models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blanc, Élodie
This study provides statistical emulators of crop yields based on global gridded crop model simulations from the Inter-Sectoral Impact Model Intercomparison Project Fast Track project. The ensemble of simulations is used to build a panel of annual crop yields from five crop models and corresponding monthly summer weather variables for over a century at the grid cell level globally. This dataset is then used to estimate, for each crop and gridded crop model, the statistical relationship between yields, temperature, precipitation and carbon dioxide. This study considers a new functional form to better capture the non-linear response of yields to weather, especially for extreme temperature and precipitation events, and now accounts for the effect of soil type. In- and out-of-sample validations show that the statistical emulators are able to replicate the spatial patterns of crop yield levels and their changes over time projected by crop models reasonably well, although the accuracy of the emulators varies by model and by region. This study therefore provides a reliable and accessible alternative to global gridded crop yield models. By emulating crop yields for several models using parsimonious equations, the tools provide a computationally efficient method to account for uncertainty in climate change impact assessments.
Have the temperature time series a structural change after 1998?
NASA Astrophysics Data System (ADS)
Werner, Rolf; Valev, Dimitare; Danov, Dimitar
2012-07-01
The global and hemispheric GISS and HadCRUT3 temperature time series were analysed for structural changes. We postulate that the fitted temperature function is continuous in time across the segment boundaries. The slopes are calculated for a sequence of segments delimited by time thresholds. We used a standard method, restricted linear regression with dummy variables. We performed the calculations and tests for different numbers of thresholds. The thresholds are searched for continuously within prescribed time intervals. The F-statistic is used to obtain the time points of the structural changes.
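A minimal sketch of such a test for one threshold: a continuous two-segment ("hinge") regression is compared against a single-trend fit with an F-statistic. The simulated series, the 1998 threshold, and the single-break restriction are illustrative:

import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
t = np.arange(1950, 2012).astype(float)
y = 0.005 * (t - 1950) + 0.015 * np.clip(t - 1980, 0, None) \
    + rng.normal(0, 0.08, len(t))          # hypothetical anomaly series

def sse_segmented(t, y, tau):
    # Dummy-variable (hinge) regressor enforces continuity at the threshold tau.
    X = np.column_stack([np.ones_like(t), t - t[0], np.clip(t - tau, 0, None)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2)

sse0 = np.sum((y - np.polyval(np.polyfit(t, y, 1), t)) ** 2)  # no break
sse1 = sse_segmented(t, y, 1998.0)                            # break at 1998
F = (sse0 - sse1) / (sse1 / (len(t) - 3))                     # 1 extra parameter
print(F, stats.f.sf(F, 1, len(t) - 3))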
Boal Carvalho, Pedro; Magalhães, Joana; Dias de Castro, Francisca; Rosa, Bruno; Cotter, José
2017-03-31
Helicobacter pylori eradication has become increasingly difficult as resistance to several antibiotics develops. We aimed to compare Helicobacter pylori eradication rates between triple therapy and sequential therapy in a naive Portuguese population. Prospective randomized trial including consecutive patients referred for first-line Helicobacter pylori eradication treatment. Exclusion criteria were previous gastric surgery/neoplasia, pregnancy/lactation, and allergy to any of the drugs. The compared eradication regimens were triple therapy (pantoprazole, amoxicillin and clarithromycin every 12 hours for 14 days) and sequential therapy (pantoprazole every 12 hours for 10 days, amoxicillin every 12 hours on days 1-5, and clarithromycin plus metronidazole every 12 hours on days 6-10). Eradication success was confirmed with the urea breath test. Statistical analysis was performed with SPSS v21.0 and a p-value < 0.05 was considered statistically significant. Sixty patients were included, 39 (65%) female, with a mean age of 52 years (SD ± 14.3). Treatment groups were homogeneous for gender, age, indication for treatment and smoking status. No statistical differences were found between sequential and triple therapy eradication rates (86.2% vs 77.4%, p = 0.379); the global eradication rate was 82%. Tobacco consumption was associated with significantly lower eradication success (54.5% vs 87.8%, p = 0.022). In this randomized controlled trial in a naive Portuguese population, we found a satisfactory global Helicobacter pylori eradication rate of 82%, with no statistical differences observed in the efficacy of the treatment between the triple and sequential regimens. These results support the use of either therapy for the first-line eradication of Helicobacter pylori.
A Temperature-Based Model for Estimating Monthly Average Daily Global Solar Radiation in China
Li, Huashan; Cao, Fei; Wang, Xianlong; Ma, Weibin
2014-01-01
Since air temperature records are readily available around the world, the models based on air temperature for estimating solar radiation have been widely accepted. In this paper, a new model based on Hargreaves and Samani (HS) method for estimating monthly average daily global solar radiation is proposed. With statistical error tests, the performance of the new model is validated by comparing with the HS model and its two modifications (Samani model and Chen model) against the measured data at 65 meteorological stations in China. Results show that the new model is more accurate and robust than the HS, Samani, and Chen models in all climatic regions, especially in the humid regions. Hence, the new model can be recommended for estimating solar radiation in areas where only air temperature data are available in China. PMID:24605046
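For reference, the HS estimate that the new model builds on takes the form Rs = k_rs * sqrt(Tmax - Tmin) * Ra. A minimal Python sketch follows; the coefficient and the station values are illustrative, and the paper's proposed model recalibrates this relationship rather than using it as-is:

import numpy as np

def hargreaves_samani(tmax, tmin, ra, k_rs=0.17):
    # Rs = k_rs * sqrt(Tmax - Tmin) * Ra, with Ra the extraterrestrial
    # radiation in the same units as the returned Rs.
    return k_rs * np.sqrt(tmax - tmin) * ra

tmax, tmin = 28.0, 16.0   # monthly mean daily max/min temperature (deg C)
ra = 35.0                 # extraterrestrial radiation (MJ m-2 day-1)
print(hargreaves_samani(tmax, tmin, ra))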
A global logrank test for adaptive treatment strategies based on observational studies.
Li, Zhiguo; Valenstein, Marcia; Pfeiffer, Paul; Ganoczy, Dara
2014-02-28
In studying adaptive treatment strategies, a natural question that is of paramount interest is whether there is any significant difference among all possible treatment strategies. When the outcome variable of interest is time-to-event, we propose an inverse probability weighted logrank test for testing the equivalence of a fixed set of pre-specified adaptive treatment strategies based on data from an observational study. The weights take into account both the possible selection bias in an observational study and the fact that the same subject may be consistent with more than one treatment strategy. The asymptotic distribution of the weighted logrank statistic under the null hypothesis is obtained. We show that, in an observational study where the treatment selection probabilities need to be estimated, the estimation of these probabilities does not affect the asymptotic distribution of the weighted logrank statistic, as long as the estimation of the parameters in the models for these probabilities is root-n-consistent. Finite sample performance of the test is assessed via a simulation study. We also show in the simulation that the test can be quite robust to misspecification of the models for the probabilities of treatment selection. The method is applied to analyze data on antidepressant adherence time from an observational database maintained at the Department of Veterans Affairs' Serious Mental Illness Treatment Research and Evaluation Center. Copyright © 2013 John Wiley & Sons, Ltd.
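A simplified sketch of a two-group inverse-probability-weighted logrank statistic, in Python; the data generation, the known selection probabilities, and the crude variance approximation are all simplifying assumptions, and the paper's test covers general sets of strategies with estimated probabilities:

import numpy as np

def weighted_logrank(time, event, group, w):
    # time: follow-up; event: 1 = event, 0 = censored; group: 0/1; w: IP weights.
    U = V = 0.0
    for t in np.unique(time[event == 1]):
        at_risk = time >= t
        died = (time == t) & (event == 1)
        n = w[at_risk].sum()
        n1 = w[at_risk & (group == 1)].sum()
        dN = w[died].sum()
        dN1 = w[died & (group == 1)].sum()
        U += dN1 - dN * n1 / n               # observed minus expected events
        V += dN * (n1 / n) * (1 - n1 / n)    # crude variance approximation
    return U / np.sqrt(V)                    # roughly standard normal under H0

rng = np.random.default_rng(10)
n = 300
g = rng.integers(0, 2, n)
tte = rng.exponential(1.0 + 0.3 * g)          # longer survival when g = 1
cens = rng.exponential(2.0, n)
time, event = np.minimum(tte, cens), (tte <= cens).astype(int)
ps = np.full(n, 0.5)                          # treatment-selection probabilities
w = 1.0 / np.where(g == 1, ps, 1 - ps)        # inverse-probability weights
print(weighted_logrank(time, event, g, w))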
Relation between arithmetic performance and phonological working memory in children.
Silva, Kelly da; Zuanetti, Patrícia Aparecida; Borcat, Vanessa Trombini Ribeiro; Guedes-Granzotti, Raphaela Barroso; Kuroishi, Rita Cristina Sadako; Domenis, Daniele Ramos; Fukuda, Marisa Tomoe Hebihara
2017-08-17
To compare the results of phonological working memory (PWM) assessment in children without global learning alterations who have lower versus average/higher arithmetic performance. The study was conducted with 30 children, between seven and nine years old, who attended the second or third grade of elementary school in the public network. Exclusion criteria were suggestive signs of hearing loss, neurological disorders, poor performance in the reading comprehension test, or current speech therapy. The children included in the study were given the arithmetic subtest of the Academic Achievement Test for division into two groups (G1 and G2). G1 was composed of children with low performance in arithmetic and G2 of children with average/higher performance in arithmetic. All children were submitted to PWM assessment through the repetition of pseudowords test. Statistical analysis was performed using the Mann-Whitney test and a p-value < 0.05 was considered significant. The study included 20 girls and 10 boys, with a mean age of 8.7 years. G1 was composed of 17 children and G2 of 13 children. There was a statistically significant difference between the groups for the repetition of pseudowords with three and four syllables. The results of this study support the hypothesis that changes in phonological working memory are related to difficulties in arithmetic tests.
A global reference model of Curie-point depths based on EMAG2
Li, Chun-Feng; Lu, Yu; Wang, Jian
2017-01-01
In this paper, we use a robust inversion algorithm, which we have tested in many regional studies, to obtain the first global model of Curie-point depth (GCDM) from magnetic anomaly inversion based on fractal magnetization. Statistically, the oceanic Curie depth mean is smaller than the continental one, but continental Curie depths are almost bimodal, showing shallow Curie points in some old cratons. Oceanic Curie depths show modifications by hydrothermal circulation in young oceanic lithosphere and thermal perturbations in old oceanic lithosphere. Oceanic Curie depths also show a strong dependence on the spreading rate along active spreading centers. Curie depths and heat flow are correlated, following optimal theoretical curves for average thermal conductivities K ≈ 2.0 W/(m·°C) for the ocean and K ≈ 2.5 W/(m·°C) for the continent. The calculated heat flow from Curie depths and large-interval gridding of measured heat flow both indicate that the global heat flow average is about 70.0 mW/m², leading to a global heat loss ranging from ~34.6 to 36.6 TW. PMID:28322332
FADTTS: functional analysis of diffusion tensor tract statistics.
Zhu, Hongtu; Kong, Linglong; Li, Runze; Styner, Martin; Gerig, Guido; Lin, Weili; Gilmore, John H
2011-06-01
The aim of this paper is to present a functional analysis of diffusion tensor tract statistics (FADTTS) pipeline for delineating the association between multiple diffusion properties along major white matter fiber bundles and a set of covariates of interest, such as age, diagnostic status and gender, and the structure of the variability of these white matter tract properties in various diffusion tensor imaging studies. FADTTS integrates five statistical tools: (i) a multivariate varying coefficient model that allows the varying coefficient functions in terms of arc length to characterize the varying associations between fiber bundle diffusion properties and a set of covariates, (ii) a weighted least squares estimation of the varying coefficient functions, (iii) a functional principal component analysis to delineate the structure of the variability in fiber bundle diffusion properties, (iv) a global test statistic to test hypotheses of interest, and (v) a simultaneous confidence band to quantify the uncertainty in the estimated coefficient functions. Simulated data are used to evaluate the finite sample performance of FADTTS. We apply FADTTS to investigate the development of white matter diffusivities along the splenium of the corpus callosum tract and the right internal capsule tract in a clinical study of neurodevelopment. FADTTS can be used to facilitate the understanding of normal brain development, the neural bases of neuropsychiatric disorders, and the joint effects of environmental and genetic factors on white matter fiber bundles. The advantages of FADTTS compared with other existing approaches are that it is capable of modeling the structured inter-subject variability, testing the joint effects, and constructing simultaneous confidence bands. However, FADTTS is not crucial for estimation and reduces to the functional analysis method for the single measure. Copyright © 2011 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Hopkins, J.; Balch, W. M.; Henson, S.; Poulton, A. J.; Drapeau, D.; Bowler, B.; Lubelczyk, L.
2016-02-01
Coccolithophores, the single celled phytoplankton that produce an outer covering of calcium carbonate coccoliths, are considered to be the greatest contributors to the global oceanic particulate inorganic carbon (PIC) pool. The reflective coccoliths scatter light back out from the ocean surface, enabling PIC concentration to be quantitatively estimated from ocean color satellites. Here we use datasets of AQUA MODIS PIC concentration from 2003-2014 (using the recently-revised PIC algorithm), as well as statistics on coccolithophore vertical distribution derived from cruises throughout the world ocean, to estimate the average global (surface and integrated) PIC standing stock and its associated inter-annual variability. In addition, we divide the global ocean into Longhurst biogeochemical provinces, update the PIC biomass statistics and identify those regions that have the greatest inter-annual variability and thus may exert the greatest influence on global PIC standing stock and the alkalinity pump.
The Impact of Arts Activity on Nursing Staff Well-Being: An Intervention in the Workplace
Karpavičiūtė, Simona; Macijauskienė, Jūratė
2016-01-01
Over 59 million workers are employed in the healthcare sector globally, with a daily risk of being exposed to a complex variety of health and safety hazards. The purpose of this study was to investigate the impact of arts activity on the well-being of nursing staff. During October-December 2014, 115 nursing staff working in a hospital took part in this study, which lasted for 10 weeks. The intervention group (n = 56) took part in silk painting activities once a week. Data were collected using socio-demographic questions, the Warwick-Edinburgh Mental Well-Being Scale, the Short Form-36 Health Survey questionnaire, the Reeder stress scale, and the Multidimensional Fatigue Inventory (before and after the art activities in both groups). Statistical data analysis included descriptive statistics (frequency, percentage, mean, standard deviation), non-parametric statistical analysis (Mann-Whitney U test; Wilcoxon signed-ranks test), Fisher's exact test and reliability analysis (Cronbach's alpha). The level of significance was set at p ≤ 0.05. In the intervention group, there was a tendency for participation in arts activity to have a positive impact on general health and mental well-being, reducing stress and fatigue, awakening creativity and increasing a sense of community at work. The control group did not show any improvements. Of the intervention group, 93% reported enjoyment, with 75% aspiring to continue arts activity in the future. This research suggests that arts activity, as a workplace intervention, can be used to promote nursing staff well-being at work. PMID:27104550
NASA Astrophysics Data System (ADS)
Levy, R. C.; Munchak, L. A.; Mattoo, S.; Patadia, F.; Remer, L. A.; Holz, R. E.
2015-10-01
To answer fundamental questions about aerosols in our changing climate, we must quantify both the current state of aerosols and how they are changing. Although NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) sensors have provided quantitative information about global aerosol optical depth (AOD) for more than a decade, this period is still too short to create an aerosol climate data record (CDR). The Visible Infrared Imaging Radiometer Suite (VIIRS) was launched on the Suomi-NPP satellite in late 2011, with additional copies planned for future satellites. Can the MODIS aerosol data record be continued with VIIRS to create a consistent CDR? When compared to ground-based AERONET data, the VIIRS Environmental Data Record (V_EDR) has validation statistics similar to those of the MODIS Collection 6 (M_C6) product. However, the V_EDR and M_C6 are offset in terms of global AOD magnitudes, and tend to provide different maps of 0.55 μm AOD and 0.55/0.86 μm-based Ångström Exponent (AE). One reason is that the retrieval algorithms are different. Using the Intermediate File Format (IFF) for both MODIS and VIIRS data, we have tested whether we can apply a single MODIS-like (ML) dark-target algorithm to both sensors that leads to product convergence. Except for tailoring the radiative transfer and aerosol lookup tables to each sensor's specific wavelength bands, the ML algorithm is the same for both. We run the ML algorithm on both sensors between March 2012 and May 2014, and compare monthly mean AOD time series with each other and with the M_C6 and V_EDR products. Focusing on the March-April-May (MAM) 2013 period, we compared additional statistics that include global and gridded 1° × 1° AOD and AE, histograms, sampling frequencies, and collocations with ground-based AERONET. Over land, use of the ML algorithm clearly reduces the differences between the MODIS and VIIRS-based AOD. However, although global offsets are near zero, some regional biases remain, especially in cloud fields and over brighter surface targets. Over ocean, use of the ML algorithm actually increases the offset between VIIRS and MODIS-based AOD (to ~ 0.025), while reducing the differences between AE. We characterize algorithm retrievability through statistics of retrieval fraction. In spite of differences between retrieved AOD magnitudes, the ML algorithm will lead to similar decisions about "whether to retrieve" on each sensor. Finally, we discuss how issues of calibration, as well as instrument spatial resolution, may be contributing to the statistics and the ability to create a consistent MODIS → VIIRS aerosol CDR.
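One recurring step in the comparison above is aggregating swath-level retrievals onto a 1° × 1° grid before computing offsets. The following sketch shows that binning step on synthetic data; it is not the operational MODIS/VIIRS code, and the AOD values and offsets are invented.

```python
# Toy 1-degree gridding of point retrievals from two sensors, then a
# global mean offset between the gridded fields.
import numpy as np

rng = np.random.default_rng(2)
lat = rng.uniform(-60, 60, 10000)
lon = rng.uniform(-180, 180, 10000)
aod_a = rng.gamma(2.0, 0.07, 10000)            # stand-in for sensor A retrievals
aod_b = aod_a + rng.normal(0.02, 0.03, 10000)  # stand-in for sensor B retrievals

def grid_mean(lat, lon, vals):
    # average all observations falling in each 1-degree cell
    iy = np.clip((lat + 90).astype(int), 0, 179)
    ix = np.clip((lon + 180).astype(int), 0, 359)
    s = np.zeros((180, 360)); n = np.zeros((180, 360))
    np.add.at(s, (iy, ix), vals); np.add.at(n, (iy, ix), 1)
    return np.where(n > 0, s / np.maximum(n, 1), np.nan)

ga, gb = grid_mean(lat, lon, aod_a), grid_mean(lat, lon, aod_b)
print("global mean AOD offset:", np.nanmean(gb - ga))
```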
Inferring gene regression networks with model trees
2010-01-01
Background Novel strategies are required in order to handle the huge amount of data produced by microarray technologies. To infer gene regulatory networks, the first step is to find direct regulatory relationships between genes by building the so-called gene co-expression networks. They are typically generated using correlation statistics as pairwise similarity measures. Correlation-based methods are very useful for determining whether two genes have a strong global similarity, but they do not detect local similarities. Results We propose model trees as a method to identify gene interaction networks. While correlation-based methods analyze each pair of genes, in our approach we generate a single regression tree for each gene from the remaining genes. Finally, a graph of all the relationships among output and input genes is built, taking into account whether each pair of genes is statistically significant. For this reason we apply a statistical procedure to control the false discovery rate. The performance of our approach, named REGNET, is experimentally tested on two well-known data sets: a Saccharomyces cerevisiae and an E. coli data set. First, the biological coherence of the results is tested. Second, the E. coli transcriptional network (in the RegulonDB database) is used as a control to compare the results to those of a correlation-based method. This experiment shows that REGNET performs more accurately at detecting true gene associations than the Pearson and Spearman zeroth- and first-order correlation-based methods. Conclusions REGNET generates gene association networks from gene expression data, and differs from correlation-based methods in that the relationship between one gene and the others is calculated simultaneously. Model trees are very useful techniques for estimating the numerical values of the target genes by linear regression functions. They are often more precise than linear regression models because they can fit different linear regressions to separate areas of the search space, favoring the inference of localized similarities over a more global similarity. Furthermore, experimental results show the good performance of REGNET. PMID:20950452
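A toy version of the per-gene tree idea can be sketched as follows. Note that REGNET proper uses model trees (linear models in the leaves) plus an FDR-controlled significance step, whereas this sketch substitutes a plain regression tree (scikit-learn has no model-tree class) and ranks candidate regulators by feature importance.

```python
# For each target gene, fit a tree on all other genes' expression and
# link the target to its most informative regulator (toy REGNET idea).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(3)
expr = rng.normal(size=(100, 8))          # 100 samples x 8 genes (synthetic)
expr[:, 0] = 0.8 * expr[:, 3] + 0.1 * rng.normal(size=100)  # gene 0 driven by gene 3

edges = []
for target in range(expr.shape[1]):
    X = np.delete(expr, target, axis=1)   # predict target from the remaining genes
    tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, expr[:, target])
    others = [g for g in range(expr.shape[1]) if g != target]
    best = others[int(np.argmax(tree.feature_importances_))]
    edges.append((best, target, float(tree.feature_importances_.max())))

print(sorted(edges, key=lambda e: -e[2])[:3])  # strongest inferred links
```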
Majumdar, Deepanjan; Rao, Padma; Maske, Nilam
2017-03-01
Ground-level concentrations of carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) were monitored over three seasons, i.e., post-monsoon (September-October), winter (January-February), and summer (May-June), for 1 year during 2013-2014 in Nagpur City in India. The selected gases had moderate to high variation both spatially (residential, commercial, traffic intersections, residential cum commercial sites) and temporally (at 7:00, 13:00, 18:00, and 23:00 hours in all three seasons). Concentrations of the gases were randomly distributed diurnally over the city in all seasons, and there was no specific increasing or decreasing trend with time of day. Average CO2 and N2O concentrations in winter were higher than in post-monsoon and summer, while CH4 had its highest average concentration in summer. Observed concentrations of CO2 were predominantly above the global average of 400 ppmv, while N2O and CH4 concentrations frequently dropped below the global averages of 327 ppbv and 1.8 ppmv, respectively. A two-tailed Student's t test indicated that post-monsoon CO2 concentrations were statistically different from summer but not from winter, while the difference between summer and winter concentrations was statistically significant (P < 0.05). CH4 concentrations in all seasons were statistically at par with each other. In the case of N2O, concentrations in post-monsoon were statistically different from summer but not from winter, while the difference between summer and winter concentrations was statistically significant (P < 0.05). Average ground-level concentrations of the gases calculated for the three seasons together were higher in commercial areas. Environmental management priorities vis-à-vis greenhouse gas emissions in the city are also discussed.
Emergence of patterns in random processes
NASA Astrophysics Data System (ADS)
Newman, William I.; Turcotte, Donald L.; Malamud, Bruce D.
2012-08-01
Sixty years ago, it was observed that any independent and identically distributed (i.i.d.) random variable would produce a pattern of peak-to-peak sequences with, on average, three events per sequence. This outcome was employed to show that randomness could yield, as a null hypothesis for animal populations, an explanation for their apparent 3-year cycles. We show how we can explicitly obtain a universal distribution of the lengths of peak-to-peak sequences in time series and that this can be employed for long data sets as a test of their i.i.d. character. We illustrate the validity of our analysis utilizing the peak-to-peak statistics of a Gaussian white noise. We also consider the nearest-neighbor cluster statistics of point processes in time. If the time intervals are random, we show that cluster size statistics are identical to the peak-to-peak sequence statistics of time series. In order to study the influence of correlations in a time series, we determine the peak-to-peak sequence statistics for the Langevin equation of kinetic theory leading to Brownian motion. To test our methodology, we consider a variety of applications. Using a global catalog of earthquakes, we obtain the peak-to-peak statistics of earthquake magnitudes and the nearest neighbor interoccurrence time statistics. In both cases, we find good agreement with the i.i.d. theory. We also consider the interval statistics of the Old Faithful geyser in Yellowstone National Park. In this case, we find a significant deviation from the i.i.d. theory which we attribute to antipersistence. We consider the interval statistics using the AL index of geomagnetic substorms. We again find a significant deviation from i.i.d. behavior that we attribute to mild persistence. Finally, we examine the behavior of Standard and Poor's 500 stock index's daily returns from 1928-2011 and show that, while it is close to being i.i.d., there is, again, significant persistence. We expect that there will be many other applications of our methodology both to interoccurrence statistics and to time series.
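The classical three-events-per-sequence result is easy to verify numerically. The snippet below generates Gaussian white noise, finds local peaks, and checks that the mean peak-to-peak spacing is close to 3, as the i.i.d. theory predicts.

```python
# Numerical check: in an i.i.d. series the probability that a point is a
# local peak is 1/3, so successive peaks are ~3 events apart on average.
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=1_000_000)               # Gaussian white noise
interior = (x[1:-1] > x[:-2]) & (x[1:-1] > x[2:])
peaks = np.flatnonzero(interior) + 1         # indices of local maxima
gaps = np.diff(peaks)                        # peak-to-peak sequence lengths
print("mean events per peak-to-peak sequence:", gaps.mean())  # ~3.0
```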
Machine Learning Predictions of a Multiresolution Climate Model Ensemble
NASA Astrophysics Data System (ADS)
Anderson, Gemma J.; Lucas, Donald D.
2018-05-01
Statistical models of high-resolution climate models are useful for many purposes, including sensitivity and uncertainty analyses, but building them can be computationally prohibitive. We generated a unique multiresolution perturbed parameter ensemble of a global climate model. We use a novel application of a machine learning technique known as random forests to train a statistical model on the ensemble to make high-resolution model predictions of two important quantities: global mean top-of-atmosphere energy flux and precipitation. The random forests leverage cheaper low-resolution simulations, greatly reducing the number of high-resolution simulations required to train the statistical model. We demonstrate that high-resolution predictions of these quantities can be obtained by training on an ensemble that includes only a small number of high-resolution simulations. We also find that global annually averaged precipitation is more sensitive to resolution changes than to any of the model parameters considered.
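A minimal sketch of the training strategy, with invented data and parameters: low-resolution runs dominate the training set, resolution enters as a feature, and the fitted forest is then queried at the high-resolution setting. This illustrates the idea only; the actual ensemble, parameters, and target quantities are those described above.

```python
# Random forest trained on many cheap low-res runs plus a few high-res
# runs, with resolution as an input feature (toy emulator sketch).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)
params_lo = rng.uniform(size=(300, 5))   # many low-resolution parameter samples
params_hi = rng.uniform(size=(20, 5))    # few high-resolution parameter samples
w = np.array([2.0, -1.0, 0.5, 0.0, 1.0])
f = lambda p, res: p @ w + 0.3 * res + rng.normal(0, 0.05, len(p))  # fake response

X = np.vstack([np.c_[params_lo, np.zeros(300)], np.c_[params_hi, np.ones(20)]])
y = np.concatenate([f(params_lo, 0), f(params_hi, 1)])

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
new = np.c_[rng.uniform(size=(3, 5)), np.ones(3)]  # request high-res predictions
print(rf.predict(new))
```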
Similar negative impacts of temperature on global wheat yield estimated by three independent methods
USDA-ARS?s Scientific Manuscript database
The potential impact of global temperature change on global wheat production has recently been assessed with different methods, scaling and aggregation approaches. Here we show that grid-based simulations, point-based simulations, and statistical regressions produce similar estimates of temperature ...
MAHDI, Alaa Abdul; BOLAÑOS-CARMONA, Victoria; GONZALEZ-LOPEZ, Santiago
2013-01-01
Objectives To investigate the bond strength and seal ability produced by the AH Plus/gutta-percha, EndoREZ and RealSeal systems to root canal dentin. Material and Methods Sixty extracted single-root human teeth, instrumented manually to size 40, were divided into three groups (n=20) according to the sealer used; G1: AH Plus, G2: EndoREZ, and G3: RealSeal sealers. After filling using the lateral condensation technique, each sealer group was randomly divided into two subgroups according to the tests applied (n=10 for the µPush-out test and n=10 for the fluid filtration test). A fluid filtration method was used for quantitative evaluation of apical leakage. Four 1-mm-thick slices (cervical and medium level) were obtained from each root sample and a µPush-out test was performed. Failure modes were examined under microscopy at 40x, and a one-way ANOVA was applied to analyze the permeability. Non-parametric statistics for related (Friedman's and Wilcoxon's rank tests) or unrelated samples (Kruskal-Wallis' and Mann-Whitney's tests) allowed for comparisons of µPush-out strength values among materials at the different levels. Statistical significance was accepted for p values <0.05. Results There are no significant differences among the fluid filtration results of the three sealers. The sealer/core material does not significantly influence the µPush-out bond strength values (F=2.49; p=0.10), although statistically significant differences were detected with regard to root level (Chi-square=23.93; p<0.001). AH Plus and RealSeal obtained higher bond strength to intraradicular dentin in the medium root slices. Conclusions There are no significant differences between the permeability and global µPush-out bond strength to root canal dentin achieved by the AH Plus/gutta-percha, EndoREZ and RealSeal systems. PMID:24037078
COP21 climate negotiators' responses to climate model forecasts
NASA Astrophysics Data System (ADS)
Bosetti, Valentina; Weber, Elke; Berger, Loïc; Budescu, David V.; Liu, Ning; Tavoni, Massimo
2017-02-01
Policymakers involved in climate change negotiations are key users of climate science. It is therefore vital to understand how to communicate scientific information most effectively to this group. We tested how a unique sample of policymakers and negotiators at the Paris COP21 conference update their beliefs on year 2100 global mean temperature increases in response to a statistical summary of climate models' forecasts. We randomized the way information was provided across participants using three different formats similar to those used in Intergovernmental Panel on Climate Change reports. In spite of having received all available relevant scientific information, policymakers adopted such information very conservatively, assigning it less weight than their own prior beliefs. However, providing individual model estimates in addition to the statistical range was more effective in mitigating such inertia. The experiment was repeated with a population of European MBA students who, despite starting from similar priors, reported conditional probabilities closer to the provided models' forecasts than policymakers. There was also no effect of presentation format in the MBA sample. These results highlight the importance of testing visualization tools directly on the population of interest.
Quantifying (dis)agreement between direct detection experiments in a halo-independent way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feldstein, Brian; Kahlhoefer, Felix, E-mail: brian.feldstein@physics.ox.ac.uk, E-mail: felix.kahlhoefer@physics.ox.ac.uk
We propose an improved method to study recent and near-future dark matter direct detection experiments with small numbers of observed events. Our method determines in a quantitative and halo-independent way whether the experiments point towards a consistent dark matter signal and identifies the best-fit dark matter parameters. To achieve true halo independence, we apply a recently developed method based on finding the velocity distribution that best describes a given set of data. For a quantitative global analysis we construct a likelihood function suitable for small numbers of events, which allows us to determine the best-fit particle physics properties of dark matter considering all experiments simultaneously. Based on this likelihood function we propose a new test statistic that quantifies how well the proposed model fits the data and how large the tension between different direct detection experiments is. We perform Monte Carlo simulations in order to determine the probability distribution function of this test statistic and to calculate the p-value for both the dark matter hypothesis and the background-only hypothesis.
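The flavor of such a Monte Carlo-calibrated test statistic can be shown with a one-bin Poisson toy model: a likelihood ratio between the background-only and best-fit signal hypotheses, with its p-value taken from pseudo-experiments rather than an asymptotic distribution. All rates below are invented, and the real analysis is far richer (halo marginalization, multiple experiments).

```python
# Toy likelihood-ratio test statistic for a small Poisson event count,
# with a Monte Carlo p-value under the background-only hypothesis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
b = 1.2          # hypothetical known background expectation (invented)
observed = 5     # hypothetical observed event count

def tstat(n, b):
    # -2 log LR between background-only and best-fit signal mu_hat = max(n - b, 0)
    mu_hat = max(n - b, 0.0)
    return 2 * (stats.poisson.logpmf(n, b + mu_hat) - stats.poisson.logpmf(n, b))

t_obs = tstat(observed, b)
sims = rng.poisson(b, 20_000)      # background-only pseudo-experiments
p_bkg = float(np.mean([tstat(n, b) >= t_obs for n in sims]))
print(f"t_obs = {t_obs:.2f}, background-only p-value = {p_bkg:.4f}")
```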
Studies in the use of cloud type statistics in mission simulation
NASA Technical Reports Server (NTRS)
Fowler, M. G.; Willand, J. H.; Chang, D. T.; Cogan, J. L.
1974-01-01
A study to further improve NASA's global cloud statistics for mission simulation is reported. Regional homogeneity in cloud types was examined; most of the original region boundaries defined for cloud cover amount in previous studies were supported by the statistics on cloud types and the number of cloud layers. Conditionality in cloud statistics was also examined with special emphasis on temporal and spatial dependencies, and cloud type interdependence. Temporal conditionality was found up to 12 hours, and spatial conditionality up to 200 miles; the diurnal cycle in convective cloudiness was clearly evident. As expected, the joint occurrence of different cloud types reflected the dynamic processes which form the clouds. Other phases of the study improved the cloud type statistics for several regions and proposed a mission simulation scheme combining the 4-dimensional atmospheric model, sponsored by MSFC, with the global cloud model.
Linear retrieval and global measurements of wind speed from the Seasat SMMR
NASA Technical Reports Server (NTRS)
Pandey, P. C.
1983-01-01
Retrievals of wind speed (WS) from the Seasat Scanning Multichannel Microwave Radiometer (SMMR) were performed using a two-step statistical technique. Nine subsets of two to five SMMR channels were examined for wind speed retrieval. These subsets were derived by applying a leaps-and-bounds procedure, based on the coefficient-of-determination selection criterion, to a statistical data base of brightness temperatures and geophysical parameters. Analysis of Monsoon Experiment and ocean station PAPA data showed a strong correlation between sea surface temperature and water vapor. This relation was used in generating the statistical data base. Global maps of WS were produced for one- and three-month periods.
Statistical assessment of crosstalk enrichment between gene groups in biological networks.
McCormack, Theodore; Frings, Oliver; Alexeyenko, Andrey; Sonnhammer, Erik L L
2013-01-01
Analyzing groups of functionally coupled genes or proteins in the context of global interaction networks has become an important aspect of bioinformatic investigations. Assessing the statistical significance of crosstalk enrichment between or within groups of genes can be a valuable tool for functional annotation of experimental gene sets. Here we present CrossTalkZ, a statistical method and software to assess the significance of crosstalk enrichment between pairs of gene or protein groups in large biological networks. We demonstrate that the standard z-score is generally an appropriate and unbiased statistic. We further evaluate the ability of four different methods to reliably recover crosstalk within known biological pathways. We conclude that the methods preserving the second-order topological network properties perform best. Finally, we show how CrossTalkZ can be used to annotate experimental gene sets using known pathway annotations and that its performance at this task is superior to gene enrichment analysis (GEA). CrossTalkZ (available at http://sonnhammer.sbc.su.se/download/software/CrossTalkZ/) is implemented in C++, easy to use, fast, accepts various input file formats, and produces a number of statistics. These include z-score, p-value, false discovery rate, and a test of normality for the null distributions.
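A stripped-down crosstalk z-score can be computed as below: count links between two node groups and compare with a null distribution from random relabeling. CrossTalkZ itself offers permutation schemes that preserve second-order network topology, which this naive relabeling does not.

```python
# Toy crosstalk z-score between two gene groups in a random network.
import numpy as np

rng = np.random.default_rng(7)
n = 200
adj = np.triu(rng.random((n, n)) < 0.03, k=1)
adj = adj | adj.T                                   # symmetric random network
group_a, group_b = np.arange(0, 20), np.arange(20, 40)

def crosstalk(adj, a, b):
    # number of links between disjoint groups a and b
    return int(adj[np.ix_(a, b)].sum())

obs = crosstalk(adj, group_a, group_b)
null = []
for _ in range(1000):
    relab = rng.permutation(n)                      # naive node relabeling null
    null.append(crosstalk(adj, relab[:20], relab[20:40]))
z = (obs - np.mean(null)) / np.std(null)
print(f"observed links = {obs}, z-score = {z:.2f}")
```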
NASA Astrophysics Data System (ADS)
Yazid, N. M.; Din, A. H. M.; Omar, K. M.; Som, Z. A. M.; Omar, A. H.; Yahaya, N. A. Z.; Tugi, A.
2016-09-01
Global geopotential models (GGMs) are vital in computing global geoid undulation heights. Based on the ellipsoidal height from Global Navigation Satellite System (GNSS) observations, an accurate orthometric height can be calculated by adding precise and accurate geoid undulation model information. GGMs also incorporate data from satellite gravity missions such as GRACE, GOCE and CHAMP, which helps enhance the global geoid undulation data. A statistical assessment has been made between geoid undulations derived from 4 GGMs and the airborne gravity data provided by the Department of Survey and Mapping Malaysia (DSMM). The goal of this study is the selection of the GGM that best matches statistically with the geoid undulations of the airborne gravity data under the Marine Geodetic Infrastructures in Malaysian Waters (MAGIC) Project over marine areas in Sabah. The correlation coefficients and the RMS values for the geoid undulations of each GGM and the airborne gravity data were computed. The correlation coefficient between EGM 2008 and the airborne gravity data is 1, while the RMS value is 0.1499; in this study, the RMS value of EGM 2008 is the lowest among the models compared. Based on the statistical analysis, EGM 2008 clearly provides the best fit for marine geoid undulations throughout the South China Sea.
White matter hyperintensity burden and disability in older adults: is chronic pain a contributor?
Buckalew, Neilly; Haut, Marc W; Aizenstein, Howard; Rosano, Caterina; Edelman, Kathryn Dunfee; Perera, Subashan; Marrow, Lisa; Tadic, Stasa; Venkatraman, Vijay; Weiner, Debra
2013-06-01
To primarily explore differences in global and regional white matter hyperintensities (WMH) in older adults with self-reported disabling and nondisabling chronic low back pain (CLBP), and to examine the association of WMH with gait speed in all participants with CLBP. To secondarily compare WMH of the participants with CLBP with that of pain-free controls. A cross-sectional, case-control study. University of Pittsburgh. Twenty-four community-dwelling older adults: 8 with self-reported disabling CLBP, 8 with nondisabling CLBP, and 8 pain-free. Exclusions were psychiatric or neurologic disorders (either central or peripheral), substance abuse, opioid use, or diabetes mellitus. All participants underwent structural brain magnetic resonance imaging, and all participants with CLBP underwent the 4-m walk test. All participants were assessed for both global and regional WMH by using an automated localization and segmentation method, and gait speed was measured in participants with CLBP. The disabled group demonstrated statistically significant regional WMH in a number of left hemispheric tracts: anterior thalamic radiation (P = .0391), lower cingulate (P = .0336), inferior longitudinal fasciculus (P = .0367), superior longitudinal fasciculus (P = .0011), and the superior longitudinal fasciculus branch to the temporal lobe (P = .0072). Also, there was a statistically significant negative association (rs = -0.57; P = .0225) between left lower cingulate WMH and gait speed in all participants with CLBP. There was a statistically significant difference in global WMH burden (P = .0014) and nearly all regional tracts (both left and right hemispheres) when comparing CLBP with pain-free participants. Our findings suggest that WMH is associated with, and hence may be accelerated by, chronic pain manifesting as perceived disability, given that the self-reported disabled CLBP patients had the greatest burden and the pain-free participants the least, and manifesting as measurable disability, given that increasing WMH was associated with decreasing gait speed in all chronic pain participants. Copyright © 2013 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.
Pandey, Anil Kumar; Sharma, Param Dev; Dheer, Pankaj; Parida, Girish Kumar; Goyal, Harish; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh
2017-01-01
Technetium-99m methylene diphosphonate (99mTc-MDP) bone scan images have a limited number of counts per pixel, and hence they have inferior image quality compared to X-rays. Theoretically, the global histogram equalization (GHE) technique can improve the contrast of a given image, though the practical benefits of doing so have gained only limited acceptance. In this study, we have investigated the effect of the GHE technique on 99mTc-MDP bone scan images. A set of 89 low contrast 99mTc-MDP whole-body bone scan images was included in this study. These images were acquired with parallel hole collimation on a Symbia E gamma camera. The images were then processed with the histogram equalization technique. The image quality of input and processed images was reviewed by two nuclear medicine physicians on a 5-point scale, where a score of 1 denotes very poor and 5 the best image quality. A statistical test was applied to find the significance of the difference between the mean scores assigned to input and processed images. This technique improves the contrast of the images; however, oversaturation was noticed in the processed images. Student's t-test was applied, and a statistically significant difference in the input and processed image quality was found at P < 0.001 (with α = 0.05). However, further improvement in image quality is needed as per the requirements of nuclear medicine physicians. GHE techniques can be used on low contrast bone scan images. In some cases, a histogram equalization technique in combination with some other postprocessing technique is useful.
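For reference, a generic global histogram equalization routine (the standard CDF-remapping technique the study applies) can be written in a few lines for 8-bit images; this is a textbook implementation, not the study's clinical processing chain.

```python
# Textbook global histogram equalization for 8-bit images: map grey
# levels through the normalized cumulative histogram.
import numpy as np

def equalize_hist(img):
    # img: 2-D uint8 array; returns a contrast-stretched uint8 image
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum().astype(float)
    cmin = cdf[cdf > 0].min()
    lut = np.round(255 * np.clip((cdf - cmin) / (cdf[-1] - cmin), 0, 1))
    return lut.astype(np.uint8)[img]

rng = np.random.default_rng(8)
low_contrast = (40 + 30 * rng.random((64, 64))).astype(np.uint8)  # narrow grey range
eq = equalize_hist(low_contrast)
print("input range:", low_contrast.min(), low_contrast.max(),
      "| equalized range:", eq.min(), eq.max())
```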
Mandy, William; Charman, Tony; Puura, Kaija; Skuse, David
2014-01-01
The recent Diagnostic and Statistical Manual of Mental Disorders-Fifth Edition (DSM-5) reformulation of autism spectrum disorder has received empirical support from North American and UK samples. Autism spectrum disorder is an increasingly global diagnosis, and research is needed to discover how well it generalises beyond North America and the United Kingdom. We tested the applicability of the DSM-5 model to a sample of Finnish young people with autism spectrum disorder (n = 130) or the broader autism phenotype (n = 110). Confirmatory factor analysis tested the DSM-5 model in Finland and compared the fit of this model between Finnish and UK participants (autism spectrum disorder, n = 488; broader autism phenotype, n = 220). In both countries, autistic symptoms were measured using the Developmental, Diagnostic and Dimensional Interview. Replicating findings from English-speaking samples, the DSM-5 model fitted well in Finnish autism spectrum disorder participants, outperforming a Diagnostic and Statistical Manual of Mental Disorders-Fourth Edition (DSM-IV) model. The DSM-5 model fitted equally well in Finnish and UK autism spectrum disorder samples. Among broader autism phenotype participants, this model fitted well in the United Kingdom but poorly in Finland, suggesting that cross-cultural variability may be greatest for milder autistic characteristics. We encourage researchers with data from other cultures to emulate our methodological approach, to map any cultural variability in the manifestation of autism spectrum disorder and the broader autism phenotype. This would be especially valuable given the ongoing revision of the International Classification of Diseases-11th Edition, the most global of the diagnostic manuals.
Visual field impairment captures disease burden in multiple sclerosis.
Ortiz-Perez, Santiago; Andorra, Magí; Sanchez-Dalmau, Bernardo; Torres-Torres, Rubén; Calbet, David; Lampert, Erika J; Alba-Arbalat, Salut; Guerrero-Zamora, Ana M; Zubizarreta, Irati; Sola-Valls, Nuria; Llufriu, Sara; Sepúlveda, María; Saiz, Albert; Villoslada, Pablo; Martinez-Lapiscina, Elena H
2016-04-01
Monitoring disease burden is an unmet need in multiple sclerosis (MS). Identifying patients at high risk of disability progression would be useful for improving clinical-therapeutic decisions in routine clinical practice. To evaluate the role of visual field (VF) testing in non-optic neuritis eyes (non-ON eyes) as a biomarker of disability progression in MS. In 109 patients of the MS-VisualPath cohort, we evaluated the association between visual field abnormalities and global and cognitive disability markers and brain and retinal imaging markers of neuroaxonal injury, using linear regression models adjusted for sex, age, disease duration and use of disease-modifying therapies. We evaluated the risk of disability progression associated with having an impaired visual field at baseline after 3 years of follow-up. Sixty-two percent of patients showed visual field defects in non-ON eyes. Visual field mean deviation was statistically associated with global disability and with brain (normalized brain parenchymal volume, gray matter volume and lesion load) and retinal (peripapillary retinal nerve fiber layer thickness and macular ganglion cell complex thickness) markers of neuroaxonal damage. Patients with an impaired visual field had statistically significantly greater disability, lower normalized brain parenchymal volume and higher lesion volume than patients with normal visual field testing. MS patients with an impaired VF at baseline had triple the risk of disability progression during follow-up [OR = 3.35; 95 % CI (1.10-10.19); p = 0.033]. The association of visual field impairment with greater disability, neuroaxonal injury and a higher risk of disability progression suggests that VF testing could be used to monitor MS disease burden.
Analysis of spatial and temporal rainfall trends in Sicily during the 1921-2012 period
NASA Astrophysics Data System (ADS)
Liuzzo, Lorena; Bono, Enrico; Sammartano, Vincenzo; Freni, Gabriele
2016-10-01
Precipitation patterns worldwide are changing under the effects of global warming. The impacts of these changes could dramatically affect the hydrological cycle and, consequently, the availability of water resources. In order to improve the quality and reliability of forecasting models, it is important to analyse historical precipitation data to account for possible future changes. For these reasons, a large number of studies have recently been carried out with the aim of investigating the existence of statistically significant trends in precipitation at different spatial and temporal scales. In this paper, the existence of statistically significant trends in rainfall from observational datasets, which were measured by 245 rain gauges over Sicily (Italy) during the 1921-2012 period, was investigated. Annual, seasonal and monthly time series were examined using the Mann-Kendall non-parametric statistical test to detect statistically significant trends at local and regional scales, and their significance levels were assessed. Prior to the application of the Mann-Kendall test, the historical dataset was completed using a geostatistical spatial interpolation technique, residual ordinary kriging, and then processed to remove the influence of serial correlation on the test results, applying the procedure of trend-free pre-whitening. Once the trends at each site were identified, the spatial patterns of the detected trends were examined using spatial interpolation techniques. Furthermore, focusing on the more recent period from 1981 to 2012, the trend analysis was repeated with the aim of detecting short-term trends or possible changes in the direction of the trends. Finally, the effect of climate change on the seasonal distribution of rainfall during the year was investigated by analysing the trend in the precipitation concentration index. The application of the Mann-Kendall test to the rainfall data provided evidence of a general decrease in precipitation in Sicily during the 1921-2012 period. Downward trends frequently occurred during the autumn and winter months. However, an increase in total annual precipitation was detected during the period from 1981 to 2012.
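For readers who want to reproduce the core step, a basic Mann-Kendall test (without the tie correction or the trend-free pre-whitening the study applies) looks like this; the rainfall series below is synthetic.

```python
# Basic Mann-Kendall trend test with the normal approximation.
import numpy as np
from scipy import stats

def mann_kendall(x):
    # S statistic: sum of signs over all ordered pairs; no tie correction
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0  # continuity correction
    p = 2 * stats.norm.sf(abs(z))          # two-sided p-value
    return s, z, p

rng = np.random.default_rng(9)
rain = 600 - 1.5 * np.arange(92) + rng.normal(0, 40, 92)  # synthetic 92-year series
print("S = %d, Z = %.2f, p = %.4f" % mann_kendall(rain))
```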
Conrad, Kendon J; Bezruczko, Nikolaus; Chan, Ya-Fen; Riley, Barth; Diamond, Guy; Dennis, Michael L
2010-01-15
Symptoms of internalizing disorders (depression, anxiety, somatic, trauma) are the major risk factors for suicide. Atypical suicide risk is characterized by people with few or no symptoms of internalizing disorders. In persons screened at intake to alcohol or other drug (AOD) treatment, this research examined whether person fit statistics would support an atypical subtype at high risk for suicide that did not present with typical depression and other internalizing disorders. Symptom profiles of the prototypical, typical, and atypical persons, as defined using fit statistics, were tested on 7408 persons entering AOD treatment using the Global Appraisal of Individual Needs (GAIN; Dennis et al., 2003a,b). Of those with suicide symptoms, the findings were as expected with the atypical group being higher on suicide and lower on symptoms of internalizing disorders. In addition, the atypical group was similar or lower on substance problems, symptoms of externalizing disorders, and crime and violence. Person fit statistics were useful in identifying persons with atypical suicide profiles and in enlightening aspects of existing theory concerning atypical suicidal ideation. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jiachen; Zhang, Kai; Liu, Junfeng
Solar reflective “cool roofs” absorb less sunlight than traditional dark roofs, reducing solar heat gain, and decreasing the amount of heat transferred to the atmosphere. Widespread adoption of cool roofs could therefore reduce temperatures in urban areas, partially mitigating the urban heat island effect, and contributing to reversing the local impacts of global climate change. The impacts of cool roofs on global climate remain debated by past research and are uncertain. Using a sophisticated Earth system model, the impacts of cool roofs on climate are investigated at urban, continental, and global scales. We find that global adoption of cool roofs in urban areas reduces urban heat islands everywhere, with an annual- and global-mean decrease from 1.6 to 1.2 K. Decreases are statistically significant, except for some areas in Africa and Mexico where urban fraction is low, and some high-latitude areas during wintertime. Analysis of the surface and TOA energy budget in urban regions at continental scale shows cool roofs causing increases in solar radiation leaving the Earth-atmosphere system in most regions around the globe, though the presence of aerosols and clouds is found to partially offset increases in upward radiation. Aerosols dampen cool roof-induced increases in upward solar radiation, ranging from 4% in the United States to 18% in more polluted China. Adoption of cool roofs also causes statistically significant reductions in surface air temperatures in urbanized regions of China (0.11 ± 0.10 K) and the United States (0.14 ± 0.12 K); India and Europe show statistically insignificant changes. The research presented here indicates that adoption of cool roofs around the globe would lead to statistically insignificant reductions in global mean air temperature (0.0021 ± 0.026 K). This counters past research suggesting that cool roofs can reduce, or even increase, global mean temperatures. Thus, we suggest that while cool roofs are an effective tool for reducing building energy use in hot climates, urban heat islands, and regional air temperatures, their influence on global climate is likely negligible.
Global rotation has high sensitivity in ACL lesions within stress MRI.
Espregueira-Mendes, João; Andrade, Renato; Leal, Ana; Pereira, Hélder; Skaf, Abdala; Rodrigues-Gomes, Sérgio; Oliveira, J Miguel; Reis, Rui L; Pereira, Rogério
2017-10-01
This study aims to objectively compare side-to-side differences of P-A laxity alone and coupled with rotatory laxity within magnetic resonance imaging, in patients with total anterior cruciate ligament (ACL) rupture. This prospective study enrolled sixty-one patients with signs and symptoms of unilateral total anterior cruciate ligament rupture, who were referred to magnetic resonance evaluation with simultaneous instrumented laxity measurements. Sixteen of those patients were randomly selected to also have the laxity profile of the contralateral healthy knee tested. Images were acquired for the medial and lateral tibial plateaus without pressure, with postero-anterior translation, and with postero-anterior translation coupled with maximum internal and external rotation, respectively. All parameters measured were significantly different between healthy and injured knees (P < 0.05), with the exception of the lateral plateau without stress. The difference between injured and healthy knees for medial and lateral tibial plateau anterior displacement (P < 0.05) and rotation (P < 0.001) was statistically significant. A significant correlation was found between the global rotation of the lateral tibial plateau (lateral plateau with internal + external rotation) and pivot-shift, and between the anterior global translation of both tibial plateaus (medial + lateral tibial plateau) and Lachman. The anterior global translation of both tibial plateaus was the most specific test, with a cut-off point of 11.1 mm (93.8 %), and the global rotation of the lateral tibial plateau was the most sensitive test, with a corresponding cut-off point of 15.1 mm (92.9 %). Objective laxity quantification of ACL-injured knees showed increased laxity in the sagittal plane, and simultaneously in the sagittal and transversal planes, when compared to the healthy contralateral knee. Moreover, when measuring instability from anterior cruciate ligament ruptures, the anterior global translation of both tibial plateaus and the global rotation of the lateral tibial plateau add diagnostic specificity and sensitivity. This work strengthens the evidence that the anterior cruciate ligament plays an important biomechanical role in controlling not only anterior translation but also both internal and external rotation. The high sensitivity and specificity of this device in objectively identifying and measuring multiplanar instability can clearly guide clinical procedures for restoring stability. Level of evidence Cross-sectional study, Level III.
World Hunger: Facts. Facts for Action #1.
ERIC Educational Resources Information Center
Phillips, Jim
Designed for global education at the high school level, this document presents statistics on malnutrition, infant mortality, and illiteracy in developing nations. The statistics are compared with private and government expenditures of wealthy nations. Examples of the statistical information for developing nations are: more than 500 million people…
The Role of Discrete Global Grid Systems in the Global Statistical Geospatial Framework
NASA Astrophysics Data System (ADS)
Purss, M. B. J.; Peterson, P.; Minchin, S. A.; Bermudez, L. E.
2016-12-01
The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has proposed the development of a Global Statistical Geospatial Framework (GSGF) as a mechanism for the establishment of common analytical systems that enable the integration of statistical and geospatial information. Conventional coordinate reference systems address the globe with a continuous field of points suitable for repeatable navigation and analytical geometry. While this continuous field is represented on a computer in a digitized and discrete fashion by tuples of fixed-precision floating point values, it is a non-trivial exercise to relate point observations spatially referenced in this way to areal coverages on the surface of the Earth. The GSGF states the need to move to gridded data delivery and the importance of using common geographies and geocoding. The challenges associated with meeting these goals are not new, and there has been a significant effort within the geospatial community over many years to develop nested gridding standards to tackle these issues. These efforts have recently culminated in the development of a Discrete Global Grid Systems (DGGS) standard, developed under the auspices of the Open Geospatial Consortium (OGC). DGGS provide a fixed, areal-based geospatial reference frame for the persistent location of measured Earth observations, feature interpretations, and modelled predictions. DGGS address the entire planet by partitioning it into a discrete hierarchical tessellation of progressively finer resolution cells, which are referenced by a unique index that facilitates rapid computation, query and analysis. The geometry and location of the cells are the principal aspects of a DGGS. Data integration, decomposition, and aggregation are optimised in the DGGS hierarchical structure and can be exploited for efficient multi-source data processing, storage, discovery, transmission, visualization, computation, analysis, and modelling. During the 6th Session of the UN-GGIM in August 2016, the role of DGGS in the context of the GSGF was formally acknowledged. This paper highlights the synergies and role of DGGS in the Global Statistical Geospatial Framework and shows examples of the use of DGGS to combine geospatial statistics with traditional geoscientific data.
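To make the indexing idea concrete, here is a toy hierarchical cell index: a lat/lon quadtree in which a cell's ID at a coarse level is a prefix of its children's IDs, so aggregation is a prefix operation. Real DGGS implementations use equal-area tessellations of the globe, which this simple quadtree deliberately is not.

```python
# Toy quadtree cell index over (lon, lat): each level splits a cell into
# four quadrants; the cell ID is the string of quadrant digits.
def cell_id(lon, lat, level):
    w, e, s, n = -180.0, 180.0, -90.0, 90.0
    digits = []
    for _ in range(level):
        mx, my = (w + e) / 2, (s + n) / 2
        q = (2 if lat >= my else 0) + (1 if lon >= mx else 0)
        digits.append(str(q))
        w, e = (mx, e) if lon >= mx else (w, mx)   # shrink to chosen quadrant
        s, n = (my, n) if lat >= my else (s, my)
    return "".join(digits)

fine = cell_id(151.2, -33.9, 8)     # a level-8 cell (Sydney-ish coordinates)
print(fine)
print(fine[:4])                     # its level-4 parent: aggregation by prefix
```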
Kaniewska, Malwina; Schuetz, Georg M; Willun, Steffen; Schlattmann, Peter; Dewey, Marc
2017-04-01
To compare the diagnostic accuracy of computed tomography (CT) in the assessment of global and regional left ventricular (LV) function with magnetic resonance imaging (MRI). MEDLINE, EMBASE and ISI Web of Science were systematically reviewed. Evaluation included: ejection fraction (EF), end-diastolic volume (EDV), end-systolic volume (ESV), stroke volume (SV) and left ventricular mass (LVM). Differences between modalities were analysed using limits of agreement (LoA). Publication bias was measured by Egger's regression test. Heterogeneity was evaluated using Cochran's Q test and Higgins' I² statistic. In the presence of heterogeneity, the DerSimonian-Laird method was used for estimation of heterogeneity variance. Fifty-three studies including 1,814 patients were identified. The mean difference between CT and MRI was -0.56 % (LoA, -11.6 to 10.5 %) for EF, 2.62 ml (-34.1 to 39.3 ml) for EDV, 1.61 ml (-22.4 to 25.7 ml) for ESV, 3.21 ml (-21.8 to 28.3 ml) for SV and 0.13 g (-28.2 to 28.4 g) for LVM. CT detected wall motion abnormalities on a per-segment basis with 90 % sensitivity and 97 % specificity. CT is accurate for assessing global LV function parameters, but the limits of agreement versus MRI are moderately wide, while wall motion deficits are detected with high accuracy. • CT helps to assess patients with coronary artery disease (CAD). • MRI is the reference standard for evaluation of left ventricular function. • CT provides accurate assessment of global left ventricular function.
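The limits of agreement quoted above are Bland-Altman bounds: mean difference ± 1.96 standard deviations of the paired differences. A small sketch with synthetic CT/MRI ejection fractions (scaled to roughly match the pooled EF result) shows the computation.

```python
# Bland-Altman limits of agreement on synthetic paired measurements.
import numpy as np

rng = np.random.default_rng(10)
mri_ef = rng.normal(55, 10, 200)                   # synthetic MRI ejection fractions
ct_ef = mri_ef - 0.56 + rng.normal(0, 5.6, 200)    # offset/scatter like the pooled result

d = ct_ef - mri_ef
loa = (d.mean() - 1.96 * d.std(ddof=1), d.mean() + 1.96 * d.std(ddof=1))
print(f"mean difference = {d.mean():.2f} %, LoA = ({loa[0]:.1f}, {loa[1]:.1f}) %")
```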
New axion and hidden photon constraints from a solar data global fit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vinyoles, N.; Serenelli, A.; Isern, J.
2015-10-01
We present a new statistical analysis that combines helioseismology (sound speed, surface helium and convective radius) and solar neutrino observations (the ⁸B and ⁷Be fluxes) to place upper limits on the properties of non-standard weakly interacting particles. Our analysis includes theoretical and observational errors, accounts for tensions between input parameters of solar models and can be easily extended to include other observational constraints. We present two applications to test the method: the well studied case of axions and axion-like particles and the more novel case of low mass hidden photons. For axions we obtain an upper limit at 3σ for the axion-photon coupling constant of g_aγ < 4.1 × 10⁻¹⁰ GeV⁻¹. For hidden photons we obtain the most restrictive upper limit available across a wide range of masses for the product of the kinetic mixing and mass, χm < 1.8 × 10⁻¹² eV at 3σ. Both cases improve on the previous solar constraints based on the Standard Solar Models, showing the power of using a global statistical approach.
Lee, Romeo B.; Baring, Rito V.; Sta. Maria, Madelene A.
2016-01-01
The study seeks to estimate gender variations in the direct effects of (a) number of organizational memberships, (b) number of social networking sites (SNS), and (c) grade-point average (GPA) on global social responsibility (GSR), and in the indirect effects of (a) and of (b) through (c) on GSR. Cross-sectional survey data were drawn from questionnaire interviews involving 3,173 Filipino university students. Based on a path model, the three factors were tested to determine their inter-relationships and their relationships with GSR. The direct and total effects of the exogenous factors on the dependent variable are statistically significant and robust. The indirect effects of organizational memberships on GSR through GPA are also statistically significant, but the indirect effects of SNS on GSR through GPA are marginal. Men and women differ significantly only in terms of the total effects of their organizational memberships on GSR. The lack of broad gender variations in the effects of SNS, organizational memberships and GPA on GSR may be linked to the relatively homogeneous characteristics and experiences of the university students interviewed. There is a need for more path models to better understand the predictors of GSR in local students. PMID:27247700
NASA Astrophysics Data System (ADS)
Roberts, B. M.; Blewitt, G.; Dailey, C.; Derevianko, A.
2018-04-01
We analyze the prospects of employing a distributed global network of precision measurement devices as a dark matter and exotic physics observatory. In particular, we consider the atomic clocks of the global positioning system (GPS), consisting of a constellation of 32 medium-Earth orbit satellites equipped with either Cs or Rb microwave clocks and a number of Earth-based receiver stations, some of which employ highly stable H-maser atomic clocks. High-accuracy timing data are available for almost two decades. By analyzing the satellite and terrestrial atomic clock data, it is possible to search for transient signatures of exotic physics, such as "clumpy" dark matter and dark energy, effectively transforming the GPS constellation into a 50 000 km aperture sensor array. Here we characterize the noise of the GPS satellite atomic clocks, describe the search method based on Bayesian statistics, and test the method using simulated clock data. We present the projected discovery reach of our method, and demonstrate that it can surpass the existing constraints by several orders of magnitude for certain models. Our method is not limited in scope to GPS or atomic clock networks, and can also be applied to other networks of precision measurement devices.
Lemey, Philippe; Rambaut, Andrew; Bedford, Trevor; Faria, Nuno; Bielejec, Filip; Baele, Guy; Russell, Colin A; Smith, Derek J; Pybus, Oliver G; Brockmann, Dirk; Suchard, Marc A
2014-02-01
Information on global human movement patterns is central to spatial epidemiological models used to predict the behavior of influenza and other infectious diseases. Yet it remains difficult to test which modes of dispersal drive pathogen spread at various geographic scales using standard epidemiological data alone. Evolutionary analyses of pathogen genome sequences increasingly provide insights into the spatial dynamics of influenza viruses, but to date they have largely neglected the wealth of information on human mobility, mainly because no statistical framework exists within which viral gene sequences and empirical data on host movement can be combined. Here, we address this problem by applying a phylogeographic approach to elucidate the global spread of human influenza subtype H3N2 and assess its ability to predict the spatial spread of human influenza A viruses worldwide. Using a framework that estimates the migration history of human influenza while simultaneously testing and quantifying a range of potential predictive variables of spatial spread, we show that the global dynamics of influenza H3N2 are driven by air passenger flows, whereas at more local scales spread is also determined by processes that correlate with geographic distance. Our analyses further confirm a central role for mainland China and Southeast Asia in maintaining a source population for global influenza diversity. By comparing model output with the known pandemic expansion of H1N1 during 2009, we demonstrate that predictions of influenza spatial spread are most accurate when data on human mobility and viral evolution are integrated. In conclusion, the global dynamics of influenza viruses are best explained by combining human mobility data with the spatial information inherent in sampled viral genomes. The integrated approach introduced here offers great potential for epidemiological surveillance through phylogeographic reconstructions and for improving predictive models of disease control.
Modelling 1-minute directional observations of the global irradiance.
NASA Astrophysics Data System (ADS)
Thejll, Peter; Pagh Nielsen, Kristian; Andersen, Elsa; Furbo, Simon
2016-04-01
Direct and diffuse irradiances from the sky have been collected at 1-minute intervals for about a year at the experimental station of the Technical University of Denmark for the IEA project "Solar Resource Assessment and Forecasting". These data were gathered by pyrheliometers tracking the Sun, as well as with apertured pyranometers gathering 1/8th and 1/16th of the light from the sky in 45 degree azimuthal ranges pointed around the compass. The data are gathered in order to develop detailed models of the potentially available solar energy and its variations at high temporal resolution, and thereby to gain a more detailed understanding of the solar resource. This is important for a better understanding of the sub-grid scale cloud variation that cannot be resolved with climate and weather models. It is also important for optimizing the operation of active solar energy systems such as photovoltaic plants and thermal solar collector arrays, and for passive solar energy and lighting in buildings. We present regression-based modelling of the observed data and focus, here, on the statistical properties of the model fits. Using models based on the one hand on what is found in the literature and on physical expectations, and on the other hand on purely statistical models, we find solutions that can explain up to 90% of the variance in global radiation. The models based on physical insight include terms for the direct solar radiation, a term for the circum-solar radiation, a diffuse term and a term for horizon brightening/darkening. The purely statistical model is found using data- and formula-validation approaches that pick model expressions from a general catalogue of possible formulae. The method allows nesting of expressions, and the results found are dependent on and heavily constrained by the cross-validation carried out on statistically independent testing and training data-sets. Slightly better fits -- in terms of variance explained -- are found using the purely statistical fitting/searching approach. We describe the methods applied and the results found, and discuss the different potentials of the physics-based and statistics-only model searches.
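The cross-validated model search described above can be sketched as follows: candidate regression formulas for the global irradiance are scored by held-out R², and the winner is the one explaining the most variance. The candidate terms and synthetic data are stand-ins, not the authors' formula catalogue.

```python
# Cross-validated comparison of candidate regression formulas for a
# synthetic global irradiance signal.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(11)
n = 2000
cos_zen = rng.uniform(0.05, 1.0, n)          # cosine of solar zenith angle
cloud = rng.uniform(0, 1, n)                 # fake cloud fraction
G = 1000 * cos_zen * (1 - 0.75 * cloud**3) + rng.normal(0, 30, n)

candidates = {
    "direct only": np.c_[cos_zen],
    "direct + cloud": np.c_[cos_zen, cloud],
    "direct + cos_zen*cloud^3": np.c_[cos_zen, cos_zen * cloud**3],
}
for name, X in candidates.items():
    r2 = cross_val_score(LinearRegression(), X, G, cv=5, scoring="r2").mean()
    print(f"{name}: cross-validated R^2 = {r2:.3f}")
```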
A new statistical approach to climate change detection and attribution
NASA Astrophysics Data System (ADS)
Ribes, Aurélien; Zwiers, Francis W.; Azaïs, Jean-Marc; Naveau, Philippe
2017-01-01
We propose here a new statistical approach to climate change detection and attribution that is based on additive decomposition and simple hypothesis testing. Most current statistical methods for detection and attribution rely on linear regression models where the observations are regressed onto expected response patterns to different external forcings. These methods do not use physical information provided by climate models regarding the expected response magnitudes to constrain the estimated responses to the forcings. Climate modelling uncertainty is difficult to take into account with regression based methods and is almost never treated explicitly. As an alternative to this approach, our statistical model is only based on the additivity assumption; the proposed method does not regress observations onto expected response patterns. We introduce estimation and testing procedures based on likelihood maximization, and show that climate modelling uncertainty can easily be accounted for. Some discussion is provided on how to practically estimate the climate modelling uncertainty based on an ensemble of opportunity. Our approach is based on the "models are statistically indistinguishable from the truth" paradigm, where the difference between any given model and the truth has the same distribution as the difference between any pair of models, but other choices might also be considered. The properties of this approach are illustrated and discussed based on synthetic data. Lastly, the method is applied to the linear trend in global mean temperature over the period 1951-2010. Consistent with the last IPCC assessment report, we find that most of the observed warming over this period (+0.65 K) is attributable to anthropogenic forcings (+0.67 ± 0.12 K, 90 % confidence range), with a very limited contribution from natural forcings (-0.01 ± 0.02 K).
NASA Astrophysics Data System (ADS)
Maulida, N. I.; Firman, H.; Rusyati, L.
2017-02-01
The aims of this study are: (1) to investigate the level of students’ critical thinking skill on the living things and environmental sustainability theme, for each of Inch’s critical thinking elements and overall, and (2) to investigate the level of students’ critical thinking skill on the living things characteristics, biodiversity, energy resources, ecosystem, environmental pollution, and global warming topics. The research was conducted because measuring critical thinking is important for obtaining a current description of the skill as a basis for further improvement of critical thinking in lower secondary science. The research method used was descriptive. 331 seventh grade students from five lower secondary schools in Cirebon were tested using the Science Virtual Test as the instrument to obtain the critical thinking skill data. Overall, the mean scores on the eight Inch’s critical thinking elements and the overall score from the descriptive statistics reveal a moderate attainment level. Students’ critical thinking skill on the biodiversity, energy resources, ecosystem, environmental pollution, and global warming topics is at a moderate level, while their critical thinking skill on living things characteristics is identified as high. Students’ experience in thinking critically during the science learning process and the characteristics of each topic emerge as the reasons behind the students’ critical thinking skill level on a given science topic.
Evaluation of annual, global seismicity forecasts, including ensemble models
NASA Astrophysics Data System (ADS)
Taroni, Matteo; Zechar, Jeremy; Marzocchi, Warner
2013-04-01
In 2009, the Collaboratory for the Study of Earthquake Predictability (CSEP) initiated a prototype global earthquake forecast experiment. Three models participated in this experiment for 2009, 2010 and 2011; each model forecast the number of earthquakes above magnitude 6 in 1° × 1° cells spanning the globe. Here we use likelihood-based metrics to evaluate the consistency of the forecasts with the observed seismicity. We compare model performance with statistical tests and a new method based on the peer-to-peer gambling score. The results of the comparisons are used to build ensemble models that are a weighted combination of the individual models. Notably, in these experiments the ensemble model always performs significantly better than the single best-performing model. Our results indicate the following: i) time-varying forecasts, if not updated after each major shock, may not provide significant advantages over time-invariant models in 1-year forecast experiments; ii) the spatial distribution seems to be the most important feature for characterizing the models' different forecasting performances; iii) the interpretation of consistency tests may be misleading, because some good models may be rejected while trivial models pass; iv) proper ensemble modeling seems to be a valuable procedure for obtaining the best-performing model for practical purposes.
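To make the likelihood-based evaluation concrete, here is a hedged sketch that scores two hypothetical gridded rate forecasts by their Poisson joint log-likelihood against observed counts and forms a likelihood-weighted ensemble; the synthetic rates and the weighting rule are assumptions for illustration, not CSEP's exact procedures or the gambling score.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
n_cells = 64800                            # 1-degree cells over the globe
model_a = rng.gamma(2.0, 0.0005, n_cells)  # expected M>=6 counts per cell
model_b = rng.gamma(2.0, 0.0006, n_cells)
observed = rng.poisson(model_a)            # synthetic "truth"

def joint_log_likelihood(forecast, obs):
    """Poisson joint log-likelihood of the observed counts."""
    return poisson.logpmf(obs, forecast).sum()

ll = np.array([joint_log_likelihood(model_a, observed),
               joint_log_likelihood(model_b, observed)])
w = np.exp(ll - ll.max())                  # relative-likelihood weights
w /= w.sum()
ensemble = w[0] * model_a + w[1] * model_b
print(ll, joint_log_likelihood(ensemble, observed))
```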
A Systematic Review of Global Drivers of Ant Elevational Diversity
Szewczyk, Tim; McCain, Christy M.
2016-01-01
Ant diversity shows a variety of patterns across elevational gradients, though the patterns and drivers have not been evaluated comprehensively. In this systematic review and reanalysis, we use published data on ant elevational diversity to detail the observed patterns and to test the predictions and interactions of four major diversity hypotheses: thermal energy, the mid-domain effect, area, and the elevational climate model. Of sixty-seven published datasets from the literature, only those with standardized, comprehensive sampling were used. Datasets included both local and regional ant diversity and spanned 80° in latitude across six biogeographical provinces. We used a combination of simulations, linear regressions, and non-parametric statistics to test multiple quantitative predictions of each hypothesis. We used an environmentally and geometrically constrained model as well as multiple regression to test their interactions. Ant diversity showed three distinct patterns across elevations: most common were hump-shaped mid-elevation peaks in diversity, followed by low-elevation plateaus and monotonic decreases in the number of ant species. The elevational climate model, which proposes that temperature and precipitation jointly drive diversity, and area were partially supported as independent drivers. Thermal energy and the mid-domain effect were not supported as primary drivers of ant diversity globally. The interaction models supported the influence of multiple drivers, though not a consistent set. In contrast to many vertebrate taxa, global ant elevational diversity patterns appear more complex, with the best environmental model contingent on precipitation levels. Differences in ecology and natural history among taxa may be crucial to the processes influencing broad-scale diversity patterns. PMID:27175999
Velocity Statistics and Spectra in Three-Stream Jets
NASA Technical Reports Server (NTRS)
Ecker, Tobias; Lowe, K. Todd; Ng, Wing F.; Henderson, Brenda; Leib, Stewart
2016-01-01
Velocimetry measurements were obtained in three-stream jets at the NASA Glenn Research Center Nozzle Acoustics Test Rig using the time-resolved Doppler global velocimetry technique. These measurements afford exceptional frequency response, with bandwidth up to 125 kHz, for studying the detailed dynamics of turbulence in developing shear flows. Mean stream-wise velocity is compared to measurements acquired using particle image velocimetry for validation. Detailed results for convective velocity distributions throughout an axisymmetric plume and the thick side of a plume with an offset third-stream duct are provided. The convective velocity results show that, as expected, eddy speeds are reduced on the thick side of the plume compared to the axisymmetric case. The results indicate that the time-resolved Doppler global velocimetry method holds promise for obtaining results valuable to the implementation and refinement of jet noise prediction methods being developed for three-stream jets.
Detecting Non-Gaussian and Lognormal Characteristics of Temperature and Water Vapor Mixing Ratio
NASA Astrophysics Data System (ADS)
Kliewer, A.; Fletcher, S. J.; Jones, A. S.; Forsythe, J. M.
2017-12-01
Many operational data assimilation and retrieval systems assume that the errors and variables come from a Gaussian distribution. This study builds upon previous results showing that positive definite variables, specifically water vapor mixing ratio and temperature, can follow a non-Gaussian, and moreover lognormal, distribution. Previously, statistical testing procedures, including the Jarque-Bera test, the Shapiro-Wilk test, the Chi-squared goodness-of-fit test, and a composite test that incorporated the results of the former tests, were employed to determine locations and time spans where atmospheric variables assume a non-Gaussian distribution. These tests are now applied in a "sliding window" fashion in order to extend the testing procedure toward near real-time use. The analyzed 1-degree resolution data come from the National Oceanic and Atmospheric Administration (NOAA) Global Forecast System (GFS) six-hour forecast from the 0Z analysis. These results indicate the necessity of a Data Assimilation (DA) system that can properly use the lognormally distributed variables in an appropriate Bayesian analysis that does not assume the variables are Gaussian.
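A small sketch of the "sliding window" testing idea using SciPy's Jarque-Bera and Shapiro-Wilk tests on a synthetic lognormal series; the window length, step and the both-tests-reject composite rule are assumptions, not the study's exact configuration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Stand-in for a mixing-ratio time series at one grid point; lognormal
# by construction, so most windows should be flagged non-Gaussian.
series = rng.lognormal(mean=0.0, sigma=0.5, size=2000)

window, step = 120, 24
for start in range(0, len(series) - window + 1, step):
    x = series[start:start + window]
    _, jb_p = stats.jarque_bera(x)
    _, sw_p = stats.shapiro(x)
    if jb_p < 0.05 and sw_p < 0.05:    # composite rule (assumed)
        print(f"window at {start}: non-Gaussian "
              f"(JB p={jb_p:.3g}, SW p={sw_p:.3g})")
```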
High Predictive Skill of Global Surface Temperature a Year Ahead
NASA Astrophysics Data System (ADS)
Folland, C. K.; Colman, A.; Kennedy, J. J.; Knight, J.; Parker, D. E.; Stott, P.; Smith, D. M.; Boucher, O.
2011-12-01
We discuss the high skill of real-time forecasts of global surface temperature a year ahead issued by the UK Met Office, and their scientific background. Although this is a forecasting and not a formal attribution study, we show that the main instrumental global annual surface temperature data sets since 1891 are structured consistently with a set of five physical forcing factors, except during and just after the Second World War. Reconstructions use a multiple application of cross-validated linear regression to minimise artificial skill, allowing time-varying uncertainties in the contribution of each forcing factor to global temperature to be assessed. Mean cross-validated reconstructions for the data sets have total correlations in the range 0.93-0.95, interannual correlations in the range 0.72-0.75, and root mean squared errors near 0.06 °C, consistent with observational uncertainties. Three transient runs of the HadCM3 coupled model for 1888-2002 demonstrate quite similar reconstruction skill from similar forcing factors defined appropriately for the model, showing that skilful use of our technique is not confined to observations. The observed reconstructions show that the Atlantic Multidecadal Oscillation (AMO) likely contributed to the re-commencement of global warming between 1976 and 2010 and to the global cooling observed immediately beforehand in 1965-1976. The slowing of global warming in the last decade is likely to be largely due to a phase-delayed response to the downturn in the solar cycle since 2001-02, with no net ENSO contribution. The much reduced trend in 2001-10 is similar in size to other weak decadal temperature trends observed since global warming resumed in the 1970s. The causes of variations in decadal trends can mostly be explained by variations in the strength of the forcing factors. Eleven real-time forecasts of global mean surface temperature for the year ahead for 2000-2010, based on broadly similar methods, provide an independent test of the ideas of this study; compared to observations, they achieved a correlation of 0.74 and a root mean squared error of 0.07 °C. Pseudo-forecasts for the same period, reconstructed from the somewhat improved forcing data used for this study, achieved a slightly better correlation of 0.80 and a root mean squared error of 0.05 °C. Finally, we compare the statistical forecasts with dynamical hindcasts and forecasts of global surface temperature a year ahead made by the Met Office DePreSys coupled model. The statistical and dynamical forecasts of global surface temperature for 2011 will be compared with preliminary verification data.
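The cross-validated regression step can be sketched as follows, with synthetic stand-ins for the five forcing-factor series; leave-one-out prediction is one simple way to minimise artificial skill, though the study's "multiple application" of cross-validation is more elaborate.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(2)
n_years = 120
X = rng.normal(size=(n_years, 5))       # hypothetical forcing factors
true_coef = np.array([0.5, -0.2, 0.1, -0.15, 0.1])
y = X @ true_coef + rng.normal(scale=0.06, size=n_years)  # "temperature"

# Each year is predicted from a regression fit that excludes that year,
# so the skill estimate is not inflated by in-sample fitting.
y_cv = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
r = np.corrcoef(y, y_cv)[0, 1]
rmse = np.sqrt(np.mean((y - y_cv) ** 2))
print(f"cross-validated correlation {r:.2f}, RMSE {rmse:.3f}")
```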
La Niña diversity and Northwest Indian Ocean Rim teleconnections
Hoell, Andrew; Funk, Christopher C.; Barlow, Mathew
2014-01-01
The differences in tropical Pacific sea surface temperature (SST) expressions of El Niño-Southern Oscillation (ENSO) events of the same phase have been linked with different global atmospheric circulation patterns. This study examines the dynamical forcing of precipitation during October–December (OND) and March–May (MAM) over East Africa and during December–March (DJFM) over Central-Southwest Asia for 1950–2010 associated with four tropical Pacific SST patterns characteristic of La Niña events, the cold phase of ENSO. The self-organizing map method along with a statistical distinguishability test was used to isolate La Niña events, and seasonal precipitation forcing was investigated in terms of the tropical overturning circulation and thermodynamic and moisture budgets. Recent La Niña events with strong opposing SST anomalies between the central and western Pacific Ocean (phases 3 and 4) force the strongest global circulation modifications and drought over the Northwest Indian Ocean Rim. Over East Africa during MAM and OND, subsidence is forced by an enhanced tropical overturning circulation and precipitation reductions are exacerbated by increases in moisture flux divergence. Over Central-Southwest Asia during DJFM, the thermodynamic forcing of subsidence is primarily responsible for precipitation reductions, with moisture flux divergence acting as a secondary mechanism to reduce precipitation. Eastern Pacific La Niña events in the absence of west Pacific SST anomalies (phases 1 and 2) are associated with weaker global teleconnections, particularly over the Indian Ocean Rim. The weak regional teleconnections result in statistically insignificant precipitation modifications over East Africa and Central-Southwest Asia.
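The event classification step could be sketched with a self-organizing map as below, assuming the third-party minisom package (not named in the study) and purely synthetic SST anomaly vectors; the 2 × 2 grid mirrors the four La Niña phases discussed above.

```python
import numpy as np
from minisom import MiniSom   # third-party package, assumed available

rng = np.random.default_rng(3)
# One row per La Nina season, one column per Pacific SST grid point.
events = rng.normal(size=(60, 200))

som = MiniSom(2, 2, input_len=200, sigma=0.8,
              learning_rate=0.5, random_seed=3)   # 4 nodes ~ 4 phases
som.train(events, num_iteration=5000, random_order=True)

# Each event is assigned to its best-matching node, i.e. its phase.
phases = [som.winner(e) for e in events]
print(phases[:5])
```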
Global health-related publications in otolaryngology are increasing.
Chambers, Kyle J; Creighton, Francis; Abdul-Aziz, Dunia; Cheney, Mack; Randolph, Gregory W
2015-04-01
Determine trends in global health-related publication in otolaryngology. A review of research databases. A search of publications available on PubMed and nine additional databases was undertaken, reviewing two time periods 10 years apart, 1998 to 2002 (early time period) and 2008 to 2012 (recent time period), using specific search terms to identify global health-related publications in otolaryngology. Publications were examined for region of origin, subspecialty, type of publication, and evidence of international collaboration. χ² and t test analyses were used to identify trends. In the 1998 to 2002 time period, a total of 26 publications met inclusion criteria for the study, with a mean of 5.2 ± 2.8 publications per year. In the 2008 to 2012 time period, a total of 61 publications met inclusion criteria, with a mean of 12.3 ± 5.6 publications per year. The 235% increase in global health-related publications identified between the two study periods was statistically significant (P = .02). The absolute number of publications in which collaboration occurred between countries increased from three in the early time period to nine in the recent time period. There has been a significant increase in the volume of global health-related publications in English language otolaryngology journals over the past decade, providing strong evidence of the increasing trend of global health as an academic pursuit within the field of otolaryngology. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.
Globalization and the price decline of illicit drugs.
Costa Storti, Cláudia; De Grauwe, Paul
2009-01-01
This study aims at understanding the mechanisms underlying the dramatic decline of the retail prices of major drugs like cocaine and heroin during the past two decades. It also aims at analysing the implications of this decline for drug policies. We use a theoretical model to identify the possible causes of this price decline. This allows us to formulate the hypothesis that the major driving force behind the price decline is a reduction of the intermediation margin (the difference between the retail and producer prices). We also develop the hypothesis that globalization has been an important factor behind the decline of the intermediation margin. We then analyse the statistical information to test these hypotheses. We find that the decline in the retail prices of drugs is related to the strong decline in the intermediation margin in the drug business, and that globalization is the main driving force behind this phenomenon. Globalization has done so by increasing the efficiency of the distribution of drugs, by reducing the risk premium involved in dealing with drugs, and by increasing the degree of competition in the drug markets. We conclude that the cocaine and heroin price declines were due to a sharp fall in the intermediation margin, which was probably influenced by globalization. This phenomenon might have a strong impact on the effectiveness of drug policies, increasing the relative effectiveness of policies aiming at reducing the demand of drugs.
NASA Astrophysics Data System (ADS)
Dingle Robertson, L.; Hosseini, M.; Davidson, A. M.; McNairn, H.
2017-12-01
The Joint Experiment for Crop Assessment and Monitoring (JECAM) is the research and development branch of GEOGLAM (Group on Earth Observations Global Agricultural Monitoring), a G20 initiative to improve the global monitoring of agriculture through the use of Earth Observation (EO) data and remote sensing. JECAM partners represent a diverse network of researchers collaborating towards a set of best practices and recommendations for global agricultural analysis using EO data, with well-monitored test sites covering a wide range of agriculture types, cropping systems and climate regimes. Synthetic Aperture Radar (SAR) for crop inventory and condition monitoring offers many advantages, particularly the ability to collect data under cloudy conditions. The JECAM SAR Inter-Comparison Experiment is a multi-year, multi-partner project that aims to compare global methods for (1) crop monitoring and inventory using operational SAR and optical data, multi-frequency SAR, and compact polarimetry, and (2) retrieval of Leaf Area Index (LAI) and biomass using models such as the Water Cloud Model (WCM) with single-frequency SAR, multi-frequency SAR, and compact polarimetry. The results from these activities will be discussed, along with an examination of the requirements of a global experiment, including best-date determination for SAR data acquisition, pre-processing techniques, in situ data sharing, model development and statistical inter-comparison of the results.
Yang, Xiaowei; Nie, Kun
2008-03-15
Longitudinal data sets in biomedical research often consist of large numbers of repeated measures. In many cases, the trajectories do not look globally linear or polynomial, making it difficult to summarize the data or test hypotheses using standard longitudinal data analysis based on various linear models. An alternative is functional data analysis, which directly targets the continuous nonlinear curves underlying discretely sampled repeated measures. For the purposes of data exploration, many functional data analysis strategies have been developed based on various schemes of smoothing, but fewer options are available for making causal inferences regarding predictor-outcome relationships, a common task in hypothesis-driven medical studies. To compare groups of curves, two testing strategies with good power have been proposed for high-dimensional analysis of variance: the Fourier-based adaptive Neyman test and the wavelet-based thresholding test. Using a smoking cessation clinical trial data set, this paper demonstrates how to extend these strategies for hypothesis testing into the framework of functional linear regression models (FLRMs) with continuous functional responses and categorical or continuous scalar predictors. The analysis procedure consists of three steps: first, apply the Fourier or wavelet transform to the original repeated measures; then fit a multivariate linear model in the transformed domain; and finally, test the regression coefficients using either adaptive Neyman or thresholding statistics. Since an FLRM can be viewed as a natural extension of the traditional multiple linear regression model, the development of this model and computational tools should enhance the capacity of medical statistics for longitudinal data.
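The three-step procedure can be illustrated in the two-sample special case with a Fourier transform and an adaptive Neyman statistic; the synthetic curves, the permutation calibration (in place of the asymptotic one) and the coefficient standardization below are simplifying assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 64)

def make_curves(n, bump):
    """Smooth curves plus noise; `bump` is the group-B signal."""
    base = np.sin(2 * np.pi * t) + bump * np.exp(-((t - 0.5) / 0.1) ** 2)
    return base + rng.normal(scale=0.5, size=(n, t.size))

def adaptive_neyman(z):
    """Fan's adaptive Neyman statistic on standardized coefficients."""
    m = np.arange(1, z.size + 1)
    return np.max(np.cumsum(z ** 2 - 1) / np.sqrt(2 * m))

def group_stat(Pa, Pb):
    # Step 1: Fourier-transform the repeated measures.
    ga, gb = np.fft.rfft(Pa, axis=1), np.fft.rfft(Pb, axis=1)
    # Step 2: contrast group means coefficient by coefficient.
    d = np.abs(ga.mean(axis=0) - gb.mean(axis=0))
    s = np.sqrt(ga.var(axis=0, ddof=1) / len(Pa)
                + gb.var(axis=0, ddof=1) / len(Pb))
    # Step 3: adaptive Neyman statistic on the standardized contrasts.
    return adaptive_neyman(d / np.maximum(s, 1e-12))

A, B = make_curves(30, 0.0), make_curves(30, 0.4)
obs = group_stat(A, B)

pooled = np.vstack([A, B])   # permutation null: shuffle group labels
null = []
for _ in range(500):
    idx = rng.permutation(len(pooled))
    null.append(group_stat(pooled[idx[:30]], pooled[idx[30:]]))
print("statistic", round(obs, 2),
      "permutation p-value", np.mean(np.array(null) >= obs))
```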
ERIC Educational Resources Information Center
Kim, Ki Su
2005-01-01
This article examines the relationship between globalization and national education reforms, especially those of educational systems. Instead of exploring the much debated issues of how globalization affects national educational systems and how the nations react by what kinds of systemic education reform, however, it focuses on what such a method…
NASA Astrophysics Data System (ADS)
Wang, Audrey; Price, David T.
2007-03-01
A simple integrated algorithm was developed to relate global climatology to distributions of tree plant functional types (PFT). Multivariate cluster analysis was performed to analyze the statistical homogeneity of the climate space occupied by individual tree PFTs. Forested regions identified from the satellite-based GLC2000 classification were separated into tropical, temperate, and boreal sub-PFTs for use in the Canadian Terrestrial Ecosystem Model (CTEM). Global data sets of monthly minimum temperature, growing degree days, an index of climatic moisture, and estimated PFT cover fractions were then used as variables in the cluster analysis. The statistical results for individual PFT clusters were found consistent with other global-scale classifications of dominant vegetation. As an improvement of the quantification of the climatic limitations on PFT distributions, the results also demonstrated overlapping of PFT cluster boundaries that reflected vegetation transitions, for example, between tropical and temperate biomes. The resulting global database should provide a better basis for simulating the interaction of climate change and terrestrial ecosystem dynamics using global vegetation models.
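A compact sketch of the clustering step, using k-means on standardized synthetic climate variables in place of the study's multivariate cluster analysis; the variable distributions and the choice of three clusters are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
# Hypothetical forested-cell climatology: minimum monthly temperature,
# growing degree-days and a climatic moisture index.
climate = np.column_stack([
    rng.normal(-5, 12, 5000),
    rng.gamma(4.0, 600.0, 5000),
    rng.beta(2, 2, 5000),
])

X = StandardScaler().fit_transform(climate)   # one common scale
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

# Each cluster approximates the climate envelope of one sub-PFT
# (e.g. tropical / temperate / boreal); centroids summarize it.
print(km.cluster_centers_)
```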
NASA Astrophysics Data System (ADS)
Hu, Y.; Vaughan, M.; McClain, C.; Behrenfeld, M.; Maring, H.; Anderson, D.; Sun-Mack, S.; Flittner, D.; Huang, J.; Wielicki, B.; Minnis, P.; Weimer, C.; Trepte, C.; Kuehn, R.
2007-03-01
This study presents an empirical relation that links layer integrated depolarization ratios, the extinction coefficients, and effective radii of water clouds, based on Monte Carlo simulations of CALIPSO lidar observations. Combined with cloud effective radius retrieved from MODIS, cloud liquid water content and effective number density of water clouds are estimated from CALIPSO lidar depolarization measurements in this study. Global statistics of the cloud liquid water content and effective number density are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Xingying; Rhoades, Alan M.; Ullrich, Paul A.
In this paper, the recently developed variable-resolution option within the Community Earth System Model (VR-CESM) is assessed for long-term regional climate modeling of California at 0.25° (~28 km) and 0.125° (~14 km) horizontal resolutions. The mean climatology of near-surface temperature and precipitation is analyzed and contrasted with reanalysis, gridded observational data sets, and a traditional regional climate model (RCM)—the Weather Research and Forecasting (WRF) model. Statistical metrics for model evaluation and tests for differential significance have been extensively applied. VR-CESM tended to produce a warmer summer (by about 1–3°C) and overestimated overall winter precipitation (about 25%–35%) compared to reference data sets when sea surface temperatures were prescribed. Increasing resolution from 0.25° to 0.125° did not produce a statistically significant improvement in the model results. By comparison, the analogous WRF climatology (constrained laterally and at the sea surface by ERA-Interim reanalysis) was ~1–3°C colder than the reference data sets, underestimated precipitation by ~20%–30% at 27 km resolution, and overestimated precipitation by ~65%–85% at 9 km. Overall, VR-CESM produced comparable statistical biases to WRF in key climatological quantities. Moreover, this assessment highlights the value of variable-resolution global climate models (VRGCMs) in capturing fine-scale atmospheric processes, projecting future regional climate, and addressing the computational expense of uniform-resolution global climate models.
Early-type galaxies in the Antlia cluster: catalogue and isophotal analysis
NASA Astrophysics Data System (ADS)
Calderón, Juan P.; Bassino, Lilia P.; Cellone, Sergio A.; Gómez, Matías
2018-06-01
We present a statistical isophotal analysis of 138 early-type galaxies in the Antlia cluster, located at a distance of ˜35 Mpc. The observational material consists of CCD images of four 36 × 36 arcmin² fields obtained with the MOSAIC II camera at the Blanco 4-m telescope at Cerro Tololo Interamerican Observatory. Our present work supersedes previous Antlia studies in the sense that the covered area is four times larger, the limiting magnitude is MB ˜ -9.6 mag, and the surface photometry parameters of each galaxy are derived from Sérsic model fits extrapolated to infinity. A previous companion study focused on the scaling relations obtained by means of surface photometry; here we present the data on which that paper is based, the parameters of the isophotal fits, and an isophotal analysis. For each galaxy, we derive isophotal shape parameters along the semimajor axis and search for correlations within different radial bins. Through extensive statistical tests, we also analyse the behaviour of these values against photometric and global parameters of the galaxies themselves. While some galaxies do display radial gradients in their ellipticity (ɛ) and/or their Fourier coefficients, differences in mean values between adjacent regions are not statistically significant. Regarding Fourier coefficients, dwarf galaxies usually display gradients between all adjacent regions, while non-dwarfs tend to show this behaviour just between the two outermost regions. Globally, there is no obvious correlation between Fourier coefficients and luminosity for the whole magnitude range (-12 ≳ MV ≳ -22); however, dwarfs display much higher dispersions at all radii.
Huang, Xingying; Rhoades, Alan M.; Ullrich, Paul A.; ...
2016-03-01
In this paper, the recently developed variable-resolution option within the Community Earth System Model (VR-CESM) is assessed for long-term regional climate modeling of California at 0.25° (~28 km) and 0.125° (~14 km) horizontal resolutions. The mean climatology of near-surface temperature and precipitation is analyzed and contrasted with reanalysis, gridded observational data sets, and a traditional regional climate model (RCM)—the Weather Research and Forecasting (WRF) model. Statistical metrics for model evaluation and tests for differential significance have been extensively applied. VR-CESM tended to produce a warmer summer (by about 1–3°C) and overestimated overall winter precipitation (about 25%–35%) compared to reference data sets when sea surface temperatures were prescribed. Increasing resolution from 0.25° to 0.125° did not produce a statistically significant improvement in the model results. By comparison, the analogous WRF climatology (constrained laterally and at the sea surface by ERA-Interim reanalysis) was ~1–3°C colder than the reference data sets, underestimated precipitation by ~20%–30% at 27 km resolution, and overestimated precipitation by ~65%–85% at 9 km. Overall, VR-CESM produced comparable statistical biases to WRF in key climatological quantities. Moreover, this assessment highlights the value of variable-resolution global climate models (VRGCMs) in capturing fine-scale atmospheric processes, projecting future regional climate, and addressing the computational expense of uniform-resolution global climate models.
Methods and apparatuses for information analysis on shared and distributed computing systems
Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA
2011-02-22
Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
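The local-to-global aggregation described above can be sketched with Python's standard library; the toy corpus, process count and "most frequent terms" reading of the major term set are assumptions for illustration, not the patented implementation.

```python
from collections import Counter
from multiprocessing import Pool

DOC_SETS = [                       # one distinct set per process
    ["global term statistics", "shared memory systems"],
    ["distributed computing systems", "term statistics again"],
]

def local_term_stats(docs):
    """Compute term counts for one distinct set of documents."""
    counts = Counter()
    for doc in docs:
        counts.update(doc.split())
    return counts

if __name__ == "__main__":
    with Pool(processes=len(DOC_SETS)) as pool:
        local_sets = pool.map(local_term_stats, DOC_SETS)
    global_stats = Counter()       # contribute local sets to global set
    for local in local_sets:
        global_stats.update(local)
    print(global_stats.most_common(5))
```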
Benson, Nsikak U.; Asuquo, Francis E.; Williams, Akan B.; Essien, Joseph P.; Ekong, Cyril I.; Akpabio, Otobong; Olajire, Abaas A.
2016-01-01
Trace metal (Cd, Cr, Cu, Ni and Pb) concentrations in benthic sediments were analyzed through a multi-step fractionation scheme to assess the levels and sources of contamination in estuarine, riverine and freshwater ecosystems in the Niger Delta (Nigeria). The degree of contamination was assessed using individual contamination factors (ICF) and the global contamination factor (GCF). Multivariate statistical approaches, including principal component analysis (PCA), cluster analysis and correlation tests, were employed to evaluate the interrelationships and associated sources of contamination. The spatial distribution of metal concentrations followed the pattern Pb>Cu>Cr>Cd>Ni. Ecological risk assessment by ICF showed significant potential mobility and bioavailability for Cu and Ni. The ICF contamination trend in the benthic sediments at all studied sites was Cu>Cr>Ni>Cd>Pb. The principal component and agglomerative clustering analyses indicate that trace metal contamination in the ecosystems was influenced by multiple pollution sources. PMID:27257934
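The multivariate step could look like the following PCA sketch on synthetic sediment data; loadings that group several metals on one component hint at a common source. Data shapes and values are invented for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
# Rows: sampling sites; columns: Cd, Cr, Cu, Ni, Pb totals (mg/kg).
metals = rng.lognormal(mean=1.0, sigma=0.6, size=(40, 5))

X = StandardScaler().fit_transform(metals)   # standardize before PCA
pca = PCA(n_components=2).fit(X)

# Metals loading strongly on the same component plausibly share a
# pollution source; inspect alongside cluster analysis as in the study.
print("explained variance:", pca.explained_variance_ratio_)
print("loadings:\n", pca.components_)
```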
New efficient optimizing techniques for Kalman filters and numerical weather prediction models
NASA Astrophysics Data System (ADS)
Famelis, Ioannis; Galanis, George; Liakatas, Aristotelis
2016-06-01
The need for accurate local environmental predictions and simulations beyond classical meteorological forecasts has been increasing in recent years, owing to the great number of applications that are directly or indirectly affected: renewable energy resource assessment, natural hazard early warning systems, and questions of global warming and climate change can be listed among them. Within this framework, the use of numerical weather and wave prediction systems in conjunction with advanced statistical techniques that support the elimination of model bias and the reduction of error variability may successfully address the above issues. In the present work, new optimization methods are studied and tested in selected areas of Greece where the use of renewable energy sources is of critical importance. The added value of the proposed work lies in the solid mathematical background adopted, making use of Information Geometry and statistical techniques, new versions of Kalman filters, and state-of-the-art numerical analysis tools.
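One standard way such a Kalman filter removes model bias is sketched below for a scalar forecast series; the random-walk bias model and the noise variances are tuning assumptions, not the study's configuration.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200
truth = 8 + 2 * np.sin(np.arange(n) / 20)          # synthetic wind speed
bias = 1.5 + 0.5 * np.sin(np.arange(n) / 50)       # slowly drifting bias
forecast = truth + bias + rng.normal(scale=0.4, size=n)

# State: bias x_t = x_{t-1} + w_t; observation: forecast - obs = x_t + v_t.
q, r = 0.01, 0.16          # process / observation noise (tuning choices)
x, p = 0.0, 1.0            # initial bias estimate and its variance
corrected = np.empty(n)
for i in range(n):
    p += q                                   # predict step
    corrected[i] = forecast[i] - x           # debias before obs arrives
    obs = truth[i] + rng.normal(scale=0.2)   # verifying observation
    k = p / (p + r)                          # Kalman gain
    x += k * ((forecast[i] - obs) - x)       # update bias estimate
    p *= 1 - k
print("raw RMSE", np.sqrt(np.mean((forecast - truth) ** 2)),
      "corrected RMSE", np.sqrt(np.mean((corrected - truth) ** 2)))
```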
Estimating global cropland production from 1961 to 2010
NASA Astrophysics Data System (ADS)
Han, Pengfei; Zeng, Ning; Zhao, Fang; Lin, Xiaohui
2017-09-01
Global cropland net primary production (NPP) has tripled over the last 50 years, contributing 17-45 % to the increase in global atmospheric CO2 seasonal amplitude. Although many regional-scale comparisons have been made between statistical data and modeling results, long-term national comparisons across global croplands are scarce due to the lack of detailed spatiotemporal management data. Here, we conducted a simulation study of global cropland NPP from 1961 to 2010 using a process-based model called Vegetation-Global Atmosphere-Soil (VEGAS) and compared the results with Food and Agriculture Organization of the United Nations (FAO) statistical data on both continental and country scales. According to the FAO data, the global cropland NPP was 1.3, 1.8, 2.2, 2.6, 3.0, and 3.6 PgC yr-1 in the 1960s, 1970s, 1980s, 1990s, 2000s, and 2010s, respectively. The VEGAS model captured these major trends on global and continental scales. The NPP increased most notably in the US Midwest, western Europe, and the North China Plain and increased modestly in Africa and Oceania. However, significant biases remained in some regions such as Africa and Oceania, especially in temporal evolution. This finding is not surprising as VEGAS is the first global carbon cycle model with full parameterization representing the Green Revolution. To improve model performance for different major regions, we modified the default values of management intensity associated with the agricultural Green Revolution differences across various regions to better match the FAO statistical data at the continental level and for selected countries. Across all the selected countries, the updated results reduced the RMSE from 19.0 to 10.5 TgC yr-1 (˜ 45 % decrease). The results suggest that these regional differences in model parameterization are due to differences in socioeconomic development. To better explain the past changes and predict the future trends, it is important to calibrate key parameters on regional scales and develop data sets for land management history.
Smith, W Brad; Cuenca Lara, Rubí Angélica; Delgado Caballero, Carina Edith; Godínez Valdivia, Carlos Isaías; Kapron, Joseph S; Leyva Reyes, Juan Carlos; Meneses Tovar, Carmen Lourdes; Miles, Patrick D; Oswalt, Sonja N; Ramírez Salgado, Mayra; Song, Xilong Alex; Stinson, Graham; Villela Gaytán, Sergio Armando
2018-05-21
Forests cannot be managed sustainably without reliable data to inform decisions. National Forest Inventories (NFI) tend to report national statistics, with sub-national stratification based on domestic ecological classification systems. It is becoming increasingly important to be able to report statistics on ecosystems that span international borders, as global change and globalization expand stakeholders' spheres of concern. The state of a transnational ecosystem can only be properly assessed by examining the entire ecosystem. In global forest resource assessments, it may be useful to break national statistics down by ecosystem, especially for large countries. The Inventory and Monitoring Working Group (IMWG) of the North American Forest Commission (NAFC) has begun developing a harmonized North American Forest Database (NAFD) for managing forest inventory data, enabling consistent, continental-scale forest assessment supporting ecosystem-level reporting and relational queries. The first iteration of the database contains data describing 1.9 billion ha, including 677.5 million ha of forest. Data harmonization is made challenging by the existence of definitions and methodologies tailored to suit national circumstances, emerging from each country's professional forestry development. This paper reports the methods used to synchronize three national forest inventories, starting with a small suite of variables and attributes.
Statistical evidence of seismo-ionospheric precursors of the GPS total electron content in China
NASA Astrophysics Data System (ADS)
Chen, Yuh-Ing; Huang, Chi-Shen; Liu, Jann-Yenq
2015-04-01
Evidence of the seismo-ionospheric precursor (SIP) is reported by statistically investigating the relationship between the total electron content (TEC) in global ionosphere map (GIM) and 56 M≥6.0 earthquakes during 1998-2013 in China. A median-based method together with the z test is employed to examine the TEC variations 30 days before and after the earthquake. It is found that the TEC significantly decreases 0600-1000 LT 1-6 days before the earthquake, and anomalously increases in 3 time periods of 1300-1700 LT 12-15 days; 0000-0500 LT 15-17 days; and 0500-0900 LT 22-28 days before the earthquake. The receiver operating characteristic (ROC) curve is then used to evaluate the efficiency of TEC for predicting M≥6.0 earthquakes in China during a specified time period. Statistical results suggest that the SIP is the significant TEC reduction in the morning period of 0600-1000 LT. The SIP is further confirmed since the area under the ROC curve is positively associated with the earthquake magnitude.
NASA Astrophysics Data System (ADS)
Chen, Yuh-Ing; Huang, Chi-Shen; Liu, Jann-Yenq
2015-12-01
Evidence of the seismo-ionospheric precursor (SIP) is reported by statistically investigating the relationship between the total electron content (TEC) in global ionosphere map (GIM) and 56 M ⩾ 6.0 earthquakes during 1998-2013 in China. A median-based method together with the z test is employed to examine the TEC variations 30 days before and after the earthquake. It is found that the TEC significantly decreases 0600-1000 LT 1-6 days before the earthquake, and anomalously increases in 3 time periods of 1300-1700 LT 12-15 days; 0000-0500 LT 15-17 days; and 0500-0900 LT 22-28 days before the earthquake. The receiver operating characteristic (ROC) curve is then used to evaluate the efficiency of TEC for predicting M ⩾ 6.0 earthquakes in China during a specified time period. Statistical results suggest that the SIP is the significant TEC reduction in the morning period of 0600-1000 LT. The SIP is further confirmed since the area under the ROC curve is positively associated with the earthquake magnitude.
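A hedged sketch of a median-based anomaly check on one local-time TEC bin; the 15-day reference window and the median ± 1.5 IQR bounds follow one common convention in this literature and may differ from the study's exact definition.

```python
import numpy as np

rng = np.random.default_rng(8)
tec = rng.normal(30, 2, size=45)   # synthetic TEC (TECU) at a fixed LT bin
tec[40:] -= 6                      # imposed pre-seismic-like decrease

for day in range(15, 45):
    window = tec[day - 15:day]               # preceding 15-day reference
    med = np.median(window)
    lq, uq = np.percentile(window, [25, 75])
    lo, hi = med - 1.5 * (uq - lq), med + 1.5 * (uq - lq)
    if tec[day] < lo or tec[day] > hi:
        print(f"day {day}: TEC {tec[day]:.1f} outside [{lo:.1f}, {hi:.1f}]")
```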
Skelly, Daniel A.; Johansson, Marnie; Madeoy, Jennifer; Wakefield, Jon; Akey, Joshua M.
2011-01-01
Variation in gene expression is thought to make a significant contribution to phenotypic diversity among individuals within populations. Although high-throughput cDNA sequencing offers a unique opportunity to delineate the genome-wide architecture of regulatory variation, new statistical methods need to be developed to capitalize on the wealth of information contained in RNA-seq data sets. To this end, we developed a powerful and flexible hierarchical Bayesian model that combines information across loci to allow both global and locus-specific inferences about allele-specific expression (ASE). We applied our methodology to a large RNA-seq data set obtained in a diploid hybrid of two diverse Saccharomyces cerevisiae strains, as well as to RNA-seq data from an individual human genome. Our statistical framework accurately quantifies levels of ASE with specified false-discovery rates, achieving high reproducibility between independent sequencing platforms. We pinpoint loci that show unusual and biologically interesting patterns of ASE, including allele-specific alternative splicing and transcription termination sites. Our methodology provides a rigorous, quantitative, and high-resolution tool for profiling ASE across whole genomes. PMID:21873452
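A much simpler baseline than the paper's hierarchical model is an independent per-locus binomial test with Benjamini-Hochberg FDR control, sketched here on synthetic counts; it shares no information across loci, which is precisely what the hierarchical Bayesian approach improves on.

```python
import numpy as np
from scipy.stats import binomtest
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(9)
totals = rng.poisson(60, size=1000) + 1                # reads per locus
p_true = np.where(rng.random(1000) < 0.05, 0.75, 0.5)  # 5% of loci show ASE
ref = rng.binomial(totals, p_true)                     # reference-allele reads

# Test each locus against the balanced-expression null p = 0.5.
pvals = np.array([binomtest(r, n, 0.5).pvalue
                  for r, n in zip(ref, totals)])
reject, _, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} loci called allele-specifically expressed at 5% FDR")
```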
Evaluation of different models to estimate the global solar radiation on inclined surface
NASA Astrophysics Data System (ADS)
Demain, C.; Journée, M.; Bertrand, C.
2012-04-01
Global and diffuse solar radiation intensities are, in general, measured on horizontal surfaces, whereas stationary solar conversion systems (both flat-plate solar collectors and solar photovoltaics) are mounted on inclined surfaces to maximize the amount of solar radiation incident on the collector surface. Consequently, the solar radiation incident on a tilted surface has to be determined by converting solar radiation measured on the horizontal surface to the tilted surface of interest. This study evaluates the performance of 14 models transposing 10-minute, hourly and daily diffuse solar irradiation from horizontal to inclined surfaces. Solar radiation data from 8 months (April to November 2011), which include diverse atmospheric conditions and solar altitudes, measured on the roof of the radiation tower of the Royal Meteorological Institute of Belgium in Uccle (longitude 4.35°, latitude 50.79°), were used for validation purposes. The individual model performance is assessed by an inter-comparison between the calculated and measured global solar radiation on the south-oriented surface tilted at 50.79°, using statistical methods. The relative performance of the different models under different sky conditions has been studied. Comparison of the statistical errors between the different radiation models as a function of the clearness index shows that some models perform better under one type of sky condition. Combining different models for different sky conditions can lead to a reduction of the statistical error between measured and estimated global solar radiation. As the models described in this paper were developed for hourly data inputs, statistical error indexes are lowest for hourly data and increase for 10-minute and one-day frequency data.
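As a baseline for the model family compared here, the isotropic-sky (Liu-Jordan) transposition can be written in a few lines; the tilt and albedo values are assumptions for illustration.

```python
import numpy as np

def global_on_tilt(ghi, dhi, sun_zenith, sun_azimuth,
                   tilt=50.79, surf_azimuth=180.0, albedo=0.2):
    """Isotropic-sky transposition of irradiance (W/m2) to a tilted plane.

    Angles in degrees; surf_azimuth=180 is a south-facing surface.
    One of the simplest transposition models, shown only as a sketch.
    """
    z, az = np.radians(sun_zenith), np.radians(sun_azimuth)
    b, saz = np.radians(tilt), np.radians(surf_azimuth)
    cos_inc = (np.cos(z) * np.cos(b)
               + np.sin(z) * np.sin(b) * np.cos(az - saz))
    bhi = ghi - dhi                                  # beam on horizontal
    beam = np.where(np.cos(z) > 0.01,
                    bhi * np.maximum(cos_inc, 0) / np.cos(z), 0.0)
    diffuse = dhi * (1 + np.cos(b)) / 2              # isotropic sky dome
    reflected = ghi * albedo * (1 - np.cos(b)) / 2   # ground reflection
    return beam + diffuse + reflected

print(global_on_tilt(ghi=600.0, dhi=200.0, sun_zenith=55.0, sun_azimuth=170.0))
```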
Cognitive functioning and insight in schizophrenia and in schizoaffective disorder.
Birindelli, Nadia; Montemagni, Cristiana; Crivelli, Barbara; Bava, Irene; Mancini, Irene; Rocca, Paola
2014-01-01
The aim of this study was to investigate cognitive functioning and insight of illness in two groups of patients during their stable phases, one with schizophrenia and one with schizoaffective disorder. We recruited 104 consecutive outpatients, 64 with schizophrenia and 40 with schizoaffective disorder, in the period between July 2010 and July 2011. They all fulfilled formal Diagnostic and Statistical Manual of Mental Disorders (DSM-IV-TR) diagnostic criteria for schizophrenia and schizoaffective disorder. Psychiatric assessment included the Clinical Global Impression Scale-Severity (CGI-S), the Positive and Negative Syndrome Scale (PANSS), the Calgary Depression Scale for Schizophrenia (CDSS) and the Global Assessment of Functioning (GAF). Insight of illness was evaluated using the SUMD. Neuropsychological assessment included the Wisconsin Card Sorting Test (WCST), California Verbal Learning Test (CVLT), Stroop Test and Trail Making Test (TMT). Differences between the groups were tested using the Chi-square test for categorical variables and one-way analysis of variance (ANOVA) for continuous variables. All variables significantly different between the two groups of subjects were subsequently analysed using logistic regression with a backward stepwise procedure, with diagnosis (schizophrenia/schizoaffective disorder) as the dependent variable. After backward selection of variables, four variables predicted a schizoaffective disorder diagnosis: marital status, a higher number of admissions, better attentive functions, and awareness of specific signs or symptoms of disease. The prediction model accounted for 55% of the variance of schizoaffective disorder diagnosis. With replication, our findings would allow higher diagnostic accuracy and have an impact on clinical decision making, in light of a potential amelioration of vocational functioning.
Global Change and Human Consumption of Freshwater Driven by Flow Regulation and Irrigation
NASA Astrophysics Data System (ADS)
Jaramillo, F.; Destouni, G.
2015-12-01
Recent studies show major uncertainties about the magnitude and key drivers of global freshwater change, historically and projected for the future. The tackling of these uncertainties should be a societal priority to understand: 1) the role of human change drivers for freshwater availability changes, 2) the global water footprint of humanity and 3) the relation of human freshwater consumption to a proposed planetary boundary. This study analyses worldwide hydroclimatic changes, as observed during 1900-2009 in 99 large hydrological basins across all continents. We test whether global freshwater change may be driven by major developments of flow regulation and irrigation (FRI) occurring over this period. Independent categorization of the variability of FRI-impact strength among the studied basins is used to identify statistical basin differences in occurrence and strength of characteristic hydroclimatic signals of FRI. Our results show dominant signals of increasing relative evapotranspiration in basins affected by flow regulation and/or irrigation, in conjunction with decreasing relative intra-annual variability of runoff in basins affected by flow regulation. The FRI-related increase in relative evapotranspiration implies an increase of 4,688 km3/yr in global annual average water flow from land to the atmosphere. This observation-based estimate extends considerably the upper quantification limits of both FRI-driven and total global human consumption of freshwater, as well as the global water footprint of humanity. Our worldwide analysis shows clear FRI-related change signals emerging directly from observations, in spite of large change variability among basins and many other coexisting change drivers in both the atmosphere and the landscape. These results highlight the importance of considering local water use as a key change driver in Earth system studies and modelling, of relevance for global change and human consumption of freshwater.
NASA Technical Reports Server (NTRS)
Hailperin, Max
1993-01-01
This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that our techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. Our method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive methods.
Faster Detection of Poliomyelitis Outbreaks to Support Polio Eradication
Chenoweth, Paul; Okayasu, Hiro; Donnelly, Christl A.; Aylward, R. Bruce; Grassly, Nicholas C.
2016-01-01
As the global eradication of poliomyelitis approaches the final stages, prompt detection of new outbreaks is critical to enable a fast and effective outbreak response. Surveillance relies on reporting of acute flaccid paralysis (AFP) cases and laboratory confirmation through isolation of poliovirus from stool. However, delayed sample collection and testing can delay outbreak detection. We investigated whether weekly testing for clusters of AFP by location and time, using the Kulldorff scan statistic, could provide an early warning for outbreaks in 20 countries. A mixed-effects regression model was used to predict background rates of nonpolio AFP at the district level. In Tajikistan and Congo, testing for AFP clusters would have resulted in an outbreak warning 39 and 11 days, respectively, before official confirmation of large outbreaks. This method has relatively high specificity and could be integrated into the current polio information system to support rapid outbreak response activities. PMID:26890053
Faster Detection of Poliomyelitis Outbreaks to Support Polio Eradication.
Blake, Isobel M; Chenoweth, Paul; Okayasu, Hiro; Donnelly, Christl A; Aylward, R Bruce; Grassly, Nicholas C
2016-03-01
As the global eradication of poliomyelitis approaches the final stages, prompt detection of new outbreaks is critical to enable a fast and effective outbreak response. Surveillance relies on reporting of acute flaccid paralysis (AFP) cases and laboratory confirmation through isolation of poliovirus from stool. However, delayed sample collection and testing can delay outbreak detection. We investigated whether weekly testing for clusters of AFP by location and time, using the Kulldorff scan statistic, could provide an early warning for outbreaks in 20 countries. A mixed-effects regression model was used to predict background rates of nonpolio AFP at the district level. In Tajikistan and Congo, testing for AFP clusters would have resulted in an outbreak warning 39 and 11 days, respectively, before official confirmation of large outbreaks. This method has relatively high specificity and could be integrated into the current polio information system to support rapid outbreak response activities.
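The core of the Kulldorff scan statistic for one candidate space-time window is a Poisson log-likelihood ratio, sketched below; in practice the statistic is maximized over many windows and calibrated by Monte Carlo simulation, and all counts here are invented.

```python
import numpy as np

def poisson_scan_llr(c_in, e_in, c_tot, e_tot):
    """Kulldorff log-likelihood ratio for one candidate cluster window.

    c_in/e_in: observed and expected AFP cases inside the window;
    c_tot/e_tot: totals over the whole region and study period.
    """
    c_out, e_out = c_tot - c_in, e_tot - e_in
    if c_in / e_in <= c_out / e_out:        # only elevated rates count
        return 0.0

    def term(c, e):
        return c * np.log(c / e) if c > 0 else 0.0

    return term(c_in, e_in) + term(c_out, e_out)

# Hypothetical weekly check: 9 AFP cases in one district window where
# the regression model expects 3, against 40 observed / 38 expected overall.
print(f"scan LLR = {poisson_scan_llr(9, 3.0, 40, 38.0):.2f}")
```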
2014-01-01
Background The move to frame medical education in terms of competencies – the extent to which trainees “can do” a professional responsibility - is congruent with calls for accountability in medical education. However, the focus on competencies might be a poor fit with curricula intended to prepare students for responsibilities not emphasized in traditional medical education. This study examines an innovative approach to the use of potential competency expectations related to advancing global health equity to promote students’ reflections and to inform curriculum development. Methods In 2012, 32 medical students were admitted into a newly developed Global Health and Disparities (GHD) Path of Excellence. The GHD program takes the form of mentored co-curricular activities built around defined competencies related to professional development and leadership skills intended to ameliorate health disparities in medically underserved settings, both domestically and globally. Students reviewed the GHD competencies from two perspectives: a) their ability to perform the identified competencies that they perceived themselves as holding as they began the GHD program and b) the extent to which they perceived that their future career would require these responsibilities. For both sets of assessments the response scale ranged from “Strongly Disagree” to “Strongly Agree.” Wilcoxon’s paired T-tests compared individual students’ ordinal rating of their current level of ability to their perceived need for competence that they anticipated their careers would require. Statistical significance was set at p < .01. Results Students’ ratings ranged from “strongly disagree” to “strongly agree” that they could perform the defined GHD-related competencies. However, on most competencies, at least 50 % of students indicated that the stated competencies were beyond their present ability level. For each competency, the results of Wilcoxon paired T-tests indicate – at statistically significant levels - that students perceive more need in their careers for GHD-program defined competencies than they currently possess. Conclusion This study suggests congruence between student and program perceptions of the scope of practice required for GHD. Students report the need for enhanced skill levels in the careers they anticipate. This approach to formulating and reflecting on competencies will guide the program’s design of learning experiences aligned with students’ career goals. PMID:24886229
NASA Astrophysics Data System (ADS)
Ionita, M.; Grosfeld, K.; Scholz, P.; Lohmann, G.
2016-12-01
Sea ice in both polar regions is an important indicator of global climate change and its polar amplification. Consequently, there is broad interest in information on sea ice: its coverage, variability and long-term change. Knowledge of sea ice requires high-quality data on ice extent, thickness and dynamics; however, its predictability depends on various climate parameters and conditions. In order to provide insight into the potential development of a monthly/seasonal signal, we developed a robust statistical model based on ocean heat content, sea surface temperature and atmospheric variables to estimate the September minimum sea ice extent for every year. Although previous statistical attempts at monthly/seasonal forecasts of the September sea ice minimum show relatively reduced skill, here it is shown that more than 97% (r = 0.98) of the September sea ice extent can be predicted three months in advance using the previous months' conditions via a multiple linear regression model based on global sea surface temperature (SST), mean sea level pressure (SLP), air temperature at 850 hPa (TT850), surface winds and sea ice extent persistence. The statistical model is based on the identification of regions with stable teleconnections between the predictors (climatological parameters) and the predictand (here, sea ice extent). The results based on our statistical model contribute to the sea ice prediction network for the sea ice outlook report (https://www.arcus.org/sipn) and could provide a tool for identifying regions and climate parameters that are important for sea ice development in the Arctic, and for detecting sensitive and critical regions in global coupled climate models with a focus on sea ice formation.
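A bare-bones version of the regression step, with synthetic placeholders for the predictor indices; the real model selects predictors from regions with stable teleconnections, which this sketch does not attempt.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(10)
n_years = 35
X = np.column_stack([
    rng.normal(size=(n_years, 3)),     # SST / SLP / TT850 indices
    rng.normal(6.0, 1.0, n_years),     # June ice extent (10^6 km2), persistence
])
y = (5.0 - 0.4 * X[:, 0] + 0.6 * (X[:, 3] - 6.0)
     + rng.normal(scale=0.2, size=n_years))   # September minimum extent

model = LinearRegression().fit(X[:-1], y[:-1])   # train on past years only
print("predicted September extent:", model.predict(X[-1:])[0])
```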
Oral azithromycin for treatment of posterior blepharitis.
Igami, Thais Zamudio; Holzchuh, Ricardo; Osaki, Tammy Hentona; Santo, Ruth Miyuki; Kara-Jose, Newton; Hida, Richard Y
2011-10-01
To evaluate the effects of oral azithromycin in patients with posterior blepharitis. Twenty-six eyes of 13 patients with posterior blepharitis diagnosed by a qualified ophthalmologist were enrolled in this study. Patients were instructed to use oral azithromycin 500 mg per day for 3 days in 3 cycles with 7-day intervals. Subjective clinical outcomes were graded and scored 1 day before and 30 days after the end of the treatment (53 days after initiating the treatment) based on severity scores of: (1) eyelid debris; (2) eyelid telangiectasia; (3) swelling of the eyelid margin; (4) redness of the eyelid margin; and (5) ocular mucus secretion. For the assessment of global efficacy, patients were asked by the investigator to rate the subjective symptoms (eyelid itching, ocular itching, eyelid hyperemia, ocular hyperemia, ocular mucus secretion, photophobia, foreign body sensation, and dry eye sensation) on a scale of 0 (no symptoms) to 5 (severe symptoms). Break-up time, Schirmer I test, corneal fluorescein staining score, and rose bengal staining score were also performed in all patients. All clinical outcomes scoring showed statistically significant improvement after oral azithromycin, except for eyelid swelling. Average subjective symptom grading improved statistically after treatment with oral azithromycin, except for eyelid hyperemia, photophobia, and foreign body sensation. Average tear film break-up time values showed statistically significant improvement after the treatment with oral azithromycin. No statistically significant improvement was observed on average values of Schirmer I test, corneal fluorescein staining score, and rose bengal staining score. The combination of multiple clinical parameters shown in this study supports the clinical efficacy of pulsed oral azithromycin therapy for the management of posterior blepharitis.
A statistical spatial power spectrum of the Earth's lithospheric magnetic field
NASA Astrophysics Data System (ADS)
Thébault, E.; Vervelidou, F.
2015-05-01
The magnetic field of the Earth's lithosphere arises from rock magnetization contrasts that were shaped over geological times. The field can be described mathematically in spherical harmonics or with distributions of magnetization. We exploit this dual representation and assume that the lithospheric field is induced by spatially varying susceptibility values within a shell of constant thickness. By introducing a statistical assumption about the power spectrum of the susceptibility, we then derive a statistical expression for the spatial power spectrum of the crustal magnetic field for spatial scales ranging from 60 to 2500 km. This expression depends on the mean induced magnetization, the thickness of the shell, and a power-law exponent for the power spectrum of the susceptibility. We test the relevance of this form with a misfit analysis against the observational NGDC-720 lithospheric magnetic field model power spectrum. This allows us to estimate, at the 95 per cent confidence level, a mean global apparent induced magnetization between 0.3 and 0.6 A m-1, a mean magnetic crustal thickness between 23 and 30 km, and a root mean square field value between 190 and 205 nT. These estimates are in good agreement with independent models of the crustal magnetization and of the seismic crustal thickness. We carry out the same analysis in the continental and oceanic domains separately. We complement the misfit analyses with a Kolmogorov-Smirnov goodness-of-fit test and conclude that the observed power spectrum can in each case be regarded as a sample of the statistical one.
Mura, Maria Chiara; De Felice, Marco; Morlino, Roberta; Fuselli, Sergio
2010-01-01
In step with the need to develop statistical procedures to manage small-size environmental samples, in this work we have used concentration values of benzene (C6H6), concurrently detected by seven outdoor and indoor monitoring stations over 12 000 minutes, in order to assess the representativeness of the collected data and the impact of the pollutant on the indoor environment. Clearly, the former issue is strictly connected to sampling-site geometry, which proves critical to correctly retrieving information from analysis of pollutants of sanitary interest. Therefore, according to current criteria for network planning, single stations have been interpreted as nodes of a set of adjoining triangles; then, a) node pairs have been taken into account in order to estimate pollutant stationarity on triangle sides, b) node triplets, to statistically associate data from air monitoring with the corresponding territory area, and c) node sextuplets, to assess the impact probability of the outdoor pollutant on the indoor environment for each area. Distributions from the various node combinations are all non-Gaussian; consequently, the Kruskal-Wallis (KW) non-parametric statistic has been exploited to test variability of the continuous density function from each pair, triplet and sextuplet. Results from the above-mentioned statistical analysis have shown randomness of site selection, which has not allowed a reliable generalization of the monitoring data to the entire selected territory, except for a single "forced" case (70%); most importantly, they suggest a possible procedure to optimize network design.
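The Kruskal-Wallis comparison described above maps directly onto scipy.stats.kruskal; the lognormal draws below are synthetic stand-ins for a triplet of stations.

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(11)
station_a = rng.lognormal(0.5, 0.4, 500)   # benzene (ug/m3), station A
station_b = rng.lognormal(0.5, 0.4, 500)
station_c = rng.lognormal(0.8, 0.4, 500)   # one station reads higher

h, p = kruskal(station_a, station_b, station_c)
# A small p-value means the distributions differ across the triplet, so
# its data should not be generalized to one homogeneous territory area.
print(f"H = {h:.1f}, p = {p:.3g}")
```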
Liljegren, Mats; Ekberg, Kerstin
2009-01-01
The aim of the present study was to examine the cross-sectional and 2-year longitudinal associations between perceived organizational justice, self-rated health and burnout. The study used questionnaire data from 428 Swedish employment officers, and the data were analyzed with Structural Equation Modeling (SEM). Two different models were tested: a global organizational justice model (with and without correlated measurement errors) and a differentiated (distributive, procedural and interactional organizational justice) justice model (with and without correlated measurement errors). The global justice model with autocorrelations had the most satisfactory goodness-of-fit indices. Global justice showed statistically significant (p < 0.01) cross-sectional (0.80-0.84) and longitudinal (0.76-0.82) positive associations between organizational justice and self-rated health, and significant (p < 0.01) negative associations between organizational justice and burnout (cross-sectional: -0.85; longitudinal: -0.83 to -0.84). The global justice construct showed better goodness-of-fit indices than the threefold justice construct, but a differentiated organizational justice concept could give valuable information about health-related risk factors: whether they are structural (distributive justice), procedural (procedural justice) or inter-personal (interactional justice). The two approaches to studying organizational justice should therefore be regarded as complementary rather than exclusive.
NASA Technical Reports Server (NTRS)
Adler, R. F.; Gu, G.; Curtis, S.; Huffman, G. J.; Bolvin, D. T.; Nelkin, E. J.
2005-01-01
The Global Precipitation Climatology Project (GPCP) 25-year precipitation data set is used to evaluate variability and extremes on global and regional scales. The year-to-year variability of precipitation is evaluated in relation to the overall lack of a significant global trend and to climate events such as ENSO and volcanic eruptions. The validity of conclusions and the limitations of the data set are checked by comparison with independent data sets (e.g., TRMM). The GPCP data set necessarily has a heterogeneous time series of input data sources, so part of the assessment described above is to test the initial results for potential influence by major data boundaries in the record. Regional trends, or inter-decadal changes, are also analyzed to determine validity and correlation with other long-term data sets related to the hydrological cycle (e.g., clouds and ocean surface fluxes). Statistics of extremes (both wet and dry) are analyzed at the monthly time scale for the 25 years. A preliminary result of increasing frequency of extreme monthly values will be a focus to determine validity. Daily values for an eight-year period are also examined for variation in extremes and compared to the longer monthly-based study.
Has the magnitude of floods across the USA changed with global CO2 levels?
Hirsch, Robert M.; Ryberg, Karen R.
2012-01-01
Statistical relationships between annual floods at 200 long-term (85–127 years of record) streamgauges in the coterminous United States and the global mean carbon dioxide concentration (GMCO2) record are explored. The streamgauge locations are limited to those with little or no regulation or urban development. The coterminous US is divided into four large regions and stationary bootstrapping is used to evaluate if the patterns of these statistical associations are significantly different from what would be expected under the null hypothesis that flood magnitudes are independent of GMCO2. In none of the four regions defined in this study is there strong statistical evidence for flood magnitudes increasing with increasing GMCO2. One region, the southwest, showed a statistically significant negative relationship between GMCO2 and flood magnitudes. The statistical methods applied compensate both for the inter-site correlation of flood magnitudes and the shorter-term (up to a few decades) serial correlation of floods.
Ratajczak, Karina; Płomiński, Janusz
2015-01-01
The most common fracture of the distal end of the radius is Colles' fracture. Treatment modalities available for use in hand rehabilitation after injury include massage. The aim of this study was to evaluate the effect of isometric massage on the recovery of hand function in patients with Colles' fractures. For this purpose, the strength of the finger flexors was assessed as an objective criterion for the evaluation of hand function. The study involved 40 patients, randomly divided into two groups of 20 patients each (Group A and Group B). All patients received physical therapy and exercised individually with a physiotherapist. Isometric massage was additionally used in Group A. Global grip strength was assessed using a pneumatic force meter on the first and last day of therapy. Statistical analysis was performed using STATISTICA, with statistical significance defined as a P value of less than 0.05. In both groups, global grip strength increased significantly after the therapy, with no statistically significant difference between the groups. Men and women in both groups improved grip strength equally. A statistically significant difference was demonstrated between younger and older patients, with younger patients achieving greater gains in global grip strength in both groups. The incorporation of isometric massage in the rehabilitation plan of patients after a distal radial fracture did not significantly contribute to faster recovery of hand function or improve their quality of life.
Prasarn, Mark L; Conrad, Bryan; Del Rossi, Gianluca; Horodyski, MaryBeth; Rechtine, Glenn R
2012-06-01
Many studies have compared the restriction of motion that immobilization collars provide to the injured victim. No previous investigation has assessed the amount of motion that is generated during the fitting and removal process. The purpose of this study was to compare the three-dimensional motion generated when one-piece and two-piece cervical collars are applied to and removed from cadavers, both intact and with unstable cervical spine injuries. Five fresh, lightly embalmed cadavers were tested three times each with either a one-piece or two-piece cervical collar in the supine position. Testing was performed in the intact state and following creation of a global ligamentous instability at C5-C6. The amount of angular motion resulting from collar application and removal was measured using a Fastrak three-dimensional electromagnetic motion analysis device (Polhemus Inc., Colchester, VT). The measurements recorded in this investigation included maximum values for flexion/extension, axial rotation, medial/lateral flexion, anterior/posterior displacement, axial distraction, and medial/lateral displacement at the level of instability. Significantly more motion was observed with application or removal of either collar following the creation of a global instability. During application, there was a statistically significant difference in flexion/extension between the one-piece (1.8 degrees) and two-piece (2.6 degrees) collars, p = 0.009. There was also a statistically significant difference in anterior/posterior translation between the one-piece (3.6 mm) and two-piece (3.4 mm) collars, p = 0.015. The maximum angulation and displacement during the application of either collar were 3.4 degrees and 4.4 mm. Statistical analysis revealed no significant differences between the one-piece and two-piece collars during the removal process. The maximum angulation and displacement during removal of either collar type were 1.6 degrees and 2.9 mm. There were statistically significant differences in motion between the one-piece and two-piece collars during the application process, but the differences were only 1.2 degrees in flexion/extension and 0.2 mm in anterior/posterior translation. Overall, the greatest angulation and displacement observed during collar application were 3.4 degrees and 4.4 mm. Although the exact amount of motion that could be deleterious to a cervical spine-injured patient is unknown, collars can be placed and removed with manual in-line stabilization without large displacements. Only trained practitioners should do so, and with great care, given that some motion in all planes does occur during the process. Copyright © 2012 by Lippincott Williams & Wilkins.
NASA Astrophysics Data System (ADS)
Del Rio Amador, Lenin; Lovejoy, Shaun
2017-04-01
Over the past ten years, a key advance in our understanding of atmospheric variability is the discovery that between the weather and climate regimes lies an intermediate "macroweather" regime, spanning the range of scales from ≈10 days to ≈30 years. Macroweather statistics are characterized by two fundamental symmetries: scaling and the factorization of the joint space-time statistics. In the time domain, the scaling has low intermittency, with the additional property that successive fluctuations tend to cancel. In space, on the contrary, the scaling has high (multifractal) intermittency, corresponding to the existence of different climate zones. These properties have fundamental implications for macroweather forecasting: a) the temporal scaling implies that the system has a long-range memory that can be exploited for forecasting; b) the low temporal intermittency implies that mathematically well-established (Gaussian) forecasting techniques can be used; and c) the statistical factorization property implies that although spatial correlations (including teleconnections) may be large, if long enough time series are available, they are not necessarily useful in improving forecasts. Theoretically, these conditions imply the existence of stochastic predictability limits; in our talk, we show that these limits apply to GCMs. Based on these statistical implications, we developed the Stochastic Seasonal and Interannual Prediction System (StocSIPS) for the prediction of temperature from regional to global scales and at horizons from one month to many years. One of the main components of StocSIPS is the separation and prediction of both the internal and externally forced variabilities. In order to test the theoretical assumptions and their consequences for predictability and predictions, we use 41 different CMIP5 model outputs from preindustrial control runs that have fixed external forcings, so that their variability is purely internally generated. We first show that these statistical assumptions hold with relatively good accuracy, and we then perform hindcasts at global and regional scales from monthly to annual time resolutions using StocSIPS. We obtained excellent agreement between the hindcast Mean Square Skill Score (MSSS) and the theoretical stochastic limits. We also show the application of StocSIPS to the prediction of average global temperature and compare our results with those obtained using multi-model ensemble approaches. StocSIPS has numerous advantages, including a) higher MSSS for large time horizons, b) convergence to the real - not model - climate, c) much higher computational speed, d) no need for data assimilation, e) no ad hoc post-processing and f) no need for downscaling.
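A hedged sketch of the Mean Square Skill Score (MSSS) used above to verify the hindcasts; the reference forecast here is climatology (zero anomaly), a common but assumed choice, and the anomaly series are synthetic.

```python
import numpy as np

def msss(forecast, observed, reference=None):
    """MSSS = 1 - MSE(forecast) / MSE(reference)."""
    observed = np.asarray(observed, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    if reference is None:
        reference = np.zeros_like(observed)  # climatological anomaly forecast
    mse_f = np.mean((forecast - observed) ** 2)
    mse_r = np.mean((reference - observed) ** 2)
    return 1.0 - mse_f / mse_r

rng = np.random.default_rng(1)
obs = rng.normal(0, 1, 120)                 # 120 monthly anomalies
fcst = 0.6 * obs + rng.normal(0, 0.5, 120)  # skillful but imperfect forecast
print(f"MSSS = {msss(fcst, obs):.2f}")      # > 0 means skill over climatology
```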
Roos, Malgorzata; Stawarczyk, Bogna
2012-07-01
This study evaluated and compared Weibull parameters of resin bond strength values using six different general-purpose statistical software packages for the two-parameter Weibull distribution. Two hundred human teeth were randomly divided into 4 groups (n=50), prepared and bonded on dentin according to the manufacturers' instructions using the following resin cements: (i) Variolink (VAN, conventional resin cement), (ii) Panavia21 (PAN, conventional resin cement), (iii) RelyX Unicem (RXU, self-adhesive resin cement) and (iv) G-Cem (GCM, self-adhesive resin cement). Subsequently, all specimens were stored in water for 24h at 37°C. Shear bond strength was measured and the data were analyzed using the Anderson-Darling goodness-of-fit test (MINITAB 16) and two-parameter Weibull statistics with the following statistical software packages: Excel 2011, SPSS 19, MINITAB 16, R 2.12.1, SAS 9.1.3 and STATA 11.2 (p≤0.05). Additionally, the three-parameter Weibull was fitted using MINITAB 16. Two-parameter Weibull estimates calculated with MINITAB and STATA can be compared using an omnibus test and 95% CIs. In SAS, only 95% CIs were directly obtained from the output. R provided no estimates of 95% CIs. In both SAS and R, the global comparison of the characteristic bond strength among groups is provided by means of Weibull regression. EXCEL and SPSS provided no default information about 95% CIs and no significance test for the comparison of Weibull parameters among the groups. In summary, the conventional resin cement VAN showed the highest Weibull modulus and characteristic bond strength. There are discrepancies in the Weibull statistics depending on the software package and the estimation method, and the information content of the default output provided by the software packages differs to a very high extent. Copyright © 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
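An illustrative two-parameter Weibull fit of the kind compared across packages above, written with scipy rather than any of the six packages tested; the bond strength values below are synthetic, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
bond_strength = stats.weibull_min.rvs(c=6.0, scale=12.0, size=50, random_state=rng)

# floc=0 fixes the location at zero, giving the two-parameter form:
# c is the Weibull modulus (shape), scale the characteristic strength.
shape, loc, scale = stats.weibull_min.fit(bond_strength, floc=0)
print(f"Weibull modulus m = {shape:.2f}, characteristic strength = {scale:.2f} MPa")
```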
NASA Astrophysics Data System (ADS)
Mayes, R.; Lyford, M. E.; Myers, J. D.
2009-12-01
The Quantitative Reasoning in STEM (QR STEM) project is a state-level Mathematics and Science Partnership (MSP) project with a focus on the mathematics and statistics that underlie the understanding of complex global scientific issues. This session is a companion to the QR STEM: The Science presentation; its focus is the quantitative reasoning aspects of the project. As students move from local to global perspectives on issues of energy and environment, there is a significant increase in the need for mathematical and statistical conceptual understanding. These understandings must be accessible to the students within the scientific context, requiring the special understandings that are endemic to quantitative reasoning. The QR STEM project brings together interdisciplinary teams of higher-education faculty and middle/high school teachers to explore complex problems in energy and environment. The disciplines include the life sciences, physics, chemistry, earth science, statistics, and mathematics. These interdisciplinary teams develop open-ended performance tasks to implement in the classroom, based on scientific concepts that underpin energy and environment. Quantitative reasoning is broken down into three components: Quantitative Literacy, Quantitative Interpretation, and Quantitative Modeling. Quantitative Literacy is composed of arithmetic concepts such as proportional reasoning, numeracy, and descriptive statistics. Quantitative Interpretation includes the algebraic and geometric concepts that underlie the ability to interpret a model of natural phenomena provided for the student. This model may be a table, graph, or equation from which the student is to make predictions or identify trends, or from which they would use statistics to explore correlations or patterns in data. Quantitative Modeling is the ability to develop the model from data, including the ability to test hypotheses using statistical procedures. We use the term model very broadly, so it includes visual models such as box models, as well as best-fit equation models and hypothesis testing. One of the powerful outcomes of the project is the conversation that takes place between science teachers and mathematics teachers. First, they realize that though they are teaching concepts that cross their disciplines, the barrier of scientific language within their subjects restricts students from applying the concepts across subjects. Second, the mathematics teachers discover the context of science as a means of providing real-world situations that engage students in the utility of mathematics as a tool for solving problems. Third, the science teachers discover the barrier to understanding science that is presented by poor quantitative reasoning ability. Finally, the students are engaged in exploring energy and environment in a manner that exposes the importance of seeing a problem from multiple interdisciplinary perspectives. The outcome is a democratic citizen capable of making informed decisions, and perhaps a future scientist.
Smooth quantile normalization.
Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada
2018-04-01
Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and is unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume that global differences in the distribution are induced only by technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, while allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
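A minimal sketch of standard quantile normalization, the method that qsmooth generalizes (this is not qsmooth itself): every sample is forced to share the same distribution by replacing each value with the mean of the values at the same rank across samples. The matrix below is synthetic.

```python
import numpy as np

def quantile_normalize(X):
    """X: features x samples matrix; returns the quantile-normalized matrix."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)     # rank of each value per sample
    mean_quantiles = np.mean(np.sort(X, axis=0), axis=1)  # reference distribution
    return mean_quantiles[ranks]

rng = np.random.default_rng(3)
X = rng.lognormal(mean=2.0, sigma=1.0, size=(1000, 4))    # synthetic expression values
Xn = quantile_normalize(X)
print(np.allclose(np.sort(Xn[:, 0]), np.sort(Xn[:, 1])))  # True: identical distributions
```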
Jennings, Jacky M.; Schumacher, Christina; Perin, Jamie; Myers, Tanya; Fields, Nathan; Greiner Safi, Amelia; Chaulk, Patrick
2018-01-01
Background: Eliminating HIV transmission in a population necessitates identifying population reservoirs of HIV infection and the subgroups most likely to transmit. HIV viral load is the single most important predictor of HIV transmission. The objective of this analysis was to evaluate whether a public health practice pilot project based on community viral load resulted in increases in the proportion of time spent testing in high viral load areas (process measure) and in 3 outcome measures - the number and percent of overall HIV diagnoses, new diagnoses, and high viral load positives - in one mid-Atlantic US city with a severe HIV epidemic. Methods: The evaluation was conducted during three 3-month periods over 3 years and included the use of community viral load, global positioning system tracking data, and statistical testing to evaluate the effectiveness of the pilot project. Results: The proportion of time spent outreach testing in high viral load areas (69%-84%, P < 0.001) and the overall number and percent of HIV positives (60 (3%) to 127 (6%), P < 0.001) increased significantly over the 3 years. The number and percent of new diagnoses (3 (0.1%) to 6 (0.2%)) and high viral load positives (5 (0.2%) to 9 (0.4%)) increased, but the numbers were too small for statistical testing. Discussion: These results suggest that using community viral load to increase the efficiency of HIV outreach testing is feasible and may be effective in identifying more HIV positives. The pilot project provides a model for other public health practice demonstration projects. PMID:29420450
Verwijk, Esmée; Comijs, Hannie C; Kok, Rob M; Spaans, Harm-Pieter; Tielkes, Caroline E M; Scherder, Erik J A; Stek, Max L
2014-02-01
It is generally assumed that elderly patients are more vulnerable to cognitive side effects after electroconvulsive therapy (ECT) than younger depressed patients. The current study aims to evaluate the nature and extent of changes across multiple domains of neurocognitive functioning in a group of elderly depressed patients after ECT. In this prospective naturalistic study, we included 42 depressed patients aged ≥55 years. Global cognitive function, memory, and executive function were assessed before ECT treatment, within one week after ECT (short-term), and six months after ECT (long-term). Associations between cognitive functioning and electrode placement, total number of treatment sessions, age, and the severity of depression at the time of cognitive measurement were studied. Our data offered no evidence of decline on any of the neurocognitive tests after ECT, given the study's power to detect such differences. Post-ECT improvement of neurocognitive functioning was statistically significant for the Mini-Mental State Examination, Visual Association Test, 10 Words Verbal Learning Test, and Expanded Mental Control Test, with medium to large effect sizes. After six months, compared with post-ECT performance, statistically significant improvement was found only for the Trail Making Test-A and the Letter Fluency Test, with small to medium effect sizes. In our severely depressed elderly patients, neurocognitive performance improved or did not change after ECT. Patients with poor cognitive function were not able to participate in neuropsychological assessment before ECT started; consequently, these results may not apply to patients with more severe cognitive impairment prior to the start of ECT.
Will Outer Tropical Cyclone Size Change due to Anthropogenic Warming?
NASA Astrophysics Data System (ADS)
Schenkel, B. A.; Lin, N.; Chavas, D. R.; Vecchi, G. A.; Knutson, T. R.; Oppenheimer, M.
2017-12-01
Prior research has shown significant interbasin and intrabasin variability in outer tropical cyclone (TC) size. Moreover, outer TC size has been shown to vary substantially over the lifetime of the majority of TCs. However, the factors responsible both for setting initial outer TC size and for determining its evolution throughout the TC lifetime remain uncertain. Given these gaps in our physical understanding, there remains uncertainty in how outer TC size will change, if at all, due to anthropogenic warming. The present study seeks to quantify whether outer TC size will change significantly in response to anthropogenic warming using data from a high-resolution global climate model and a regional hurricane model. Similar to prior work, the outer TC size metric used in this study is the radius at which the azimuthal-mean surface azimuthal wind equals 8 m/s. Initial results from the high-resolution global climate model suggest that the distribution of outer TC size shifts significantly towards larger values in each global TC basin in future climates, as revealed by 1) a statistically significant increase of the median outer TC size by 5-10% (p<0.05) according to a 1,000-sample bootstrap resampling approach with replacement, and 2) statistically significant differences between the distributions of outer TC size from current and future climate simulations, as shown by two-sample Kolmogorov-Smirnov testing (p<<0.01). Additional analysis of the high-resolution global climate model data reveals that outer TC size does not increase uniformly within each basin in future climates, but rather shows substantial locational dependence. Future work will incorporate the regional mesoscale hurricane model data to help identify the source of the spatial variability in outer TC size increases within each basin in future climates and, more importantly, why outer TC size changes in response to anthropogenic warming.
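A hedged sketch of the two significance checks described above: a bootstrap test on the shift in the median outer TC size and a two-sample Kolmogorov-Smirnov test on the full distributions. The size samples (km) are synthetic placeholders, not model output.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
current = rng.gamma(shape=9.0, scale=25.0, size=500)  # "current climate" sizes
future = rng.gamma(shape=9.0, scale=27.0, size=500)   # ~8% larger scale parameter

# 1) bootstrap the difference in medians with replacement (1,000 resamples)
diffs = np.array([
    np.median(rng.choice(future, future.size)) - np.median(rng.choice(current, current.size))
    for _ in range(1000)
])
ci_low, ci_high = np.percentile(diffs, [2.5, 97.5])
print(f"median shift 95% CI: [{ci_low:.1f}, {ci_high:.1f}] km")  # excludes 0 -> significant

# 2) two-sample Kolmogorov-Smirnov test on the whole distributions
ks_stat, p = stats.ks_2samp(current, future)
print(f"KS = {ks_stat:.3f}, p = {p:.3g}")
```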
Development of a Multiple Input Integrated Pole-to-Pole Global CMORPH
NASA Astrophysics Data System (ADS)
Joyce, R.; Xie, P.
2013-12-01
A test system is being developed at the NOAA Climate Prediction Center (CPC) to produce a passive microwave (PMW), IR-based, and model-integrated high-resolution precipitation estimate on a 0.05° lat/lon grid covering the entire globe from pole to pole. Experiments have been conducted for a summer Test Bed period using data for July and August of 2009. The pole-to-pole global CMORPH system is built upon the Kalman Filter based CMORPH algorithm of Joyce and Xie (2011). First, retrievals of instantaneous precipitation rates from PMW observations aboard nine low earth orbit (LEO) satellites are decoded and mapped pole-to-pole onto a 0.05° lat/lon grid over the globe. Precipitation estimates from LEO AVHRR retrievals are also derived using a PDF matching of LEO IR with calibrated microwave combined (MWCOMB) precipitation retrievals. The motion vectors for the precipitating cloud systems are defined using information from both satellite IR observations and precipitation fields generated by the NCEP Climate Forecast System Reanalysis (CFSR). To this end, motion vectors are first computed for the CFSR hourly precipitation fields through cross-correlation analysis of consecutive hourly precipitation fields on the global T382 (~35 km) grid. In a similar manner, separate processing is performed on satellite IR-based precipitation estimates to derive motion vectors from observations. A blended analysis of precipitating cloud motion vectors is then constructed through the combination of CFSR- and satellite-derived vectors utilizing a two-dimensional optimal interpolation (2D-OI) method, in which CFSR-derived motion vectors are used as the first guess and satellite-derived vectors subsequently modify the first guess. Weights used to generate the combinations are defined under the OI framework as a function of error statistics for the CFSR and satellite IR based motion vectors. The screened and calibrated PMW and AVHRR derived precipitation estimates are then separately propagated spatially forward and backward in time, using the precipitating cloud motion vectors, from their observation time to the next PMW observation. The PMW estimates propagated in both the forward and backward directions are then combined with propagated IR-based precipitation estimates under the Kalman Filter framework, with weights defined based on previously determined error statistics dependent on latitude, season, surface type, and temporal distance from observation time. Performance of the pole-to-pole global CMORPH and its key components, including combined PMW (MWCOMB), IR-based, and model precipitation, as well as model-derived, IR-based, and blended precipitation motion vectors, will be examined against NSSL Q2 radar observed precipitation estimates over CONUS, Finland FMI radar precipitation, and a daily gauge-based analysis including daily Canadian surface reports over global land. An initial investigation will also be performed over a January-February 2010 winter Test Bed period. Detailed results will be reported at the Fall 2013 AGU Meeting.
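A toy sketch of deriving a precipitation-motion vector by cross-correlating two consecutive fields, in the spirit of the CFSR cross-correlation analysis described above; scipy's correlation is an assumed stand-in for the operational implementation, and the fields are random.

```python
import numpy as np
from scipy.signal import correlate

rng = np.random.default_rng(5)
field_t0 = rng.random((64, 64))
field_t1 = np.roll(field_t0, shift=(3, -2), axis=(0, 1))  # true motion: +3 rows, -2 cols

# peak of the cross-correlation surface gives the displacement between fields
c = correlate(field_t1 - field_t1.mean(), field_t0 - field_t0.mean(), mode="same")
dy, dx = np.unravel_index(np.argmax(c), c.shape)
print("estimated motion:", dy - 32, dx - 32)              # ~ (3, -2); zero lag at (32, 32)
```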
Paramedic electrocardiogram and rhythm identification: a convenient training device.
Hale, Peggy; Lowe, Robert; Seamon, Jason P; Jenkins, James J
2011-10-01
A common reason for utilizing local paramedics and the emergency medical services is the recognition and immediate treatment of chest pain, a complaint that has multiple possible etiologies. While many of the disease processes responsible for chest pain are benign, some are life-threatening and require immediate identification and treatment. The ability of paramedics not only to perform field electrocardiograms (ECGs), but to accurately diagnose various unstable cardiac rhythms, has been shown to significantly reduce time to specific treatments. Increasing the overall accuracy of ECG interpretation by paramedics has the potential to facilitate early and appropriate treatment and decrease patient morbidity and mortality. A convenient training device (flip book) on ambulances and in common areas of the fire station could improve field interpretation of certain cardiac rhythms. This training device consists of illustrated sample ECG tracings and their associated diagnostic criteria. The goal was to enhance the recognition and interpretation of ECGs and, thereby, reduce delays in the initiation of treatment and the potential complications associated with misinterpretation. This study was a prospective, observational study using a matched pre-test/post-test design. The study period was from November 2008 to December 2008. A total of 136 paramedics were approached to participate in this study. A pre-test consisting of 15 12-lead ECGs was given to all paramedics who agreed to participate. Once the pre-tests were completed, the flip books were placed in common areas. Approximately one month after the flip books were made available to the paramedics, a post-test was administered. Statistical comparisons were made between the pre- and post-test scores, both for the global test and for each type of rhythm. There were no statistically significant improvements in global ECG interpretation or in individual rhythm interpretations. A flip book with multiple ECG rhythms and definitions, without the benefit of any outside support, was not effective in improving paramedic identification of ECG rhythms on a post-test. Suggestions for further research include repeating the study with a larger sample size, and utilizing a lecturer to explain how to use the flip book in the most efficient manner, reiterate how to read and interpret ECGs, and answer questions. Comparing the test scores of paramedic students and newly certified paramedics with those of veteran paramedics may also indicate whether the flip books are better suited to one group than another.
Motivational deficits and cognitive test performance in schizophrenia.
Fervaha, Gagan; Zakzanis, Konstantine K; Foussias, George; Graff-Guerrero, Ariel; Agid, Ofer; Remington, Gary
2014-09-01
Motivational and cognitive deficits are core features of schizophrenia, both closely linked with functional outcomes. Although poor effort and decreased motivation are known to affect performance on cognitive tests, the extent of this relationship is unclear in patients with schizophrenia. To evaluate the association between intrinsic motivation and cognitive test performance in patients with schizophrenia. Cross-sectional and 6-month prospective follow-up study performed at 57 sites in the United States, including academic and community medical treatment centers, participating in the Clinical Antipsychotic Trials of Intervention Effectiveness study. The primary sample included 431 stable patients with a DSM-IV diagnosis of schizophrenia currently receiving a stable medication regimen. Cognitive performance and intrinsic motivation were evaluated using a comprehensive neuropsychological test battery and a derived measure from the Heinrichs-Carpenter Quality of Life Scale, respectively. Symptom severity and functional status were also assessed. The primary outcome variable was global neurocognition. Individual domains of cognition were also evaluated for their association with motivation. Level of intrinsic motivation was significantly and positively correlated with global cognitive test performance, a relationship that held for each domain of cognition evaluated (correlation range, 0.20-0.34; P < .001). This association was found to be reliable after statistically accounting for positive, negative, depressive, and overall symptom severity (P < .05) and after accounting for community functioning (P < .001). The relationship between motivation and cognitive performance also remained significant after controlling for antipsychotic dose (P < .05). Prospective increase in motivation during the 6-month follow-up was also found to be significantly related to improvement in global cognitive performance (P < .05). The present findings provide strong support for a robust and reliable relationship between motivation and cognitive performance and suggest that test performance is not purely a measure of ability. Future studies assessing cognition in patients with schizophrenia should consider potential moderating variables such as effort and motivation. Implications for the assessment and interpretation of cognitive impairment based on neuropsychological test measures in schizophrenia are discussed, especially in the case of clinical trials for cognition-enhancing treatments. clinicaltrials.gov Identifier: NCT00014001.
Zotti, Alessandro; Banzato, Tommaso; Gelain, Maria Elena; Centelleghe, Cinzia; Vaccaro, Calogero; Aresu, Luca
2015-04-25
Increased cortical, or cortical and medullary, echogenicity is one of the most common signs of chronic or acute kidney disease in dogs and cats. Subjective evaluation of echogenicity is reported to be unreliable, and patient-related and technical factors affect in-vivo quantitative evaluation of the echogenicity of parenchymal organs. The aim of the present study was to investigate the relationship between histopathology and ex-vivo renal cortical echogenicity in dogs and cats, free of any patient-related and technical biases. Kidney samples were collected from 68 dog and 32 cat cadavers donated by their owners to the Veterinary Teaching Hospital of the University of Padua, and standardized ultrasonographic images of each sample were collected. The echogenicity of the renal cortex was quantitatively assessed by means of the mean gray value (MGV), and histopathological analysis was then performed. Statistical analysis was conducted to evaluate the influence of histological lesions on MGV, and the ability of MGV to detect pathological changes in the kidneys was calculated for dogs and cats. Statistical analysis revealed that only glomerulosclerosis was an independent determinant of echogenicity in dogs, whereas interstitial nephritis, interstitial necrosis and fibrosis were independent determinants of echogenicity in cats. The global influence of histological lesions on renal echogenicity was higher in cats (23%) than in dogs (12%). Different histopathological lesions influence the echogenicity of the kidneys in dogs and cats. Moreover, MGV is a poor test for distinguishing between normal and pathological kidneys in the dog, with a sensitivity of 58.3% and specificity of 59.8%. It seems to perform globally better in the cat, resulting in a fair test with a sensitivity of 80.6% and a specificity of 56%.
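An illustrative computation of the mean gray value (MGV) and of the sensitivity/specificity of an MGV threshold, mirroring the evaluation above; the image stacks, labels, and threshold are all synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
normal = rng.normal(loc=95, scale=15, size=(60, 32, 32)).clip(0, 255)    # ROI stacks
diseased = rng.normal(loc=110, scale=15, size=(40, 32, 32)).clip(0, 255)

mgv_normal = normal.reshape(60, -1).mean(axis=1)      # MGV = mean pixel value of the ROI
mgv_diseased = diseased.reshape(40, -1).mean(axis=1)

threshold = 100.0                                     # assumed cut-off
sensitivity = np.mean(mgv_diseased > threshold)       # diseased correctly flagged
specificity = np.mean(mgv_normal <= threshold)        # normal correctly passed
print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
```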
Brauchli Pernus, Yolanda; Nan, Cassandra; Verstraeten, Thomas; Pedenko, Mariia; Osokogu, Osemeke U; Weibel, Daniel; Sturkenboom, Miriam; Bonhoeffer, Jan
2016-12-12
Safety signal detection in spontaneous reporting system databases and electronic healthcare records is key to the detection of previously unknown adverse events following immunization. Various statistical methods for signal detection in these different data sources have been developed; however, none is geared to the pediatric population and none specifically to vaccines. A reference set comprising pediatric vaccine-adverse event pairs is required for reliable performance testing of statistical methods within and across data sources. The study was conducted within the context of the Global Research in Paediatrics (GRiP) project, part of the seventh framework programme (FP7) of the European Commission. The criteria for the selection of vaccines considered in the reference set were routine and global use in the pediatric population. Adverse events were primarily selected based on importance. Outcome-based systematic literature searches were performed for all identified vaccine-adverse event pairs and complemented by expert committee reports, evidence-based decision support systems (e.g. Micromedex), and summaries of product characteristics. Classification into positive (PC) and negative control (NC) pairs was performed by two independent reviewers according to a pre-defined algorithm and discussed for consensus in case of disagreement. We selected 13 vaccines and 14 adverse events to be included in the reference set. From a total of 182 vaccine-adverse event pairs, we classified 18 as PC, 113 as NC and 51 as unclassifiable. Most classifications (91) were based on literature review, 45 were based on expert committee reports, and for 46 vaccine-adverse event pairs an underlying pathomechanism was not plausible, classifying the association as NC. A reference set of vaccine-adverse event pairs was developed. We propose its use for comparing signal detection methods and systems in the pediatric population. Published by Elsevier Ltd.
Pandey, Anil Kumar; Sharma, Param Dev; Dheer, Pankaj; Parida, Girish Kumar; Goyal, Harish; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh
2017-01-01
Purpose of the Study: 99mTechnetium-methylene diphosphonate (99mTc-MDP) bone scan images have a limited number of counts per pixel and hence inferior image quality compared to X-rays. Theoretically, the global histogram equalization (GHE) technique can improve the contrast of a given image, though the practical benefits of doing so have found only limited acceptance. In this study, we have investigated the effect of the GHE technique on 99mTc-MDP bone scan images. Materials and Methods: A set of 89 low-contrast 99mTc-MDP whole-body bone scan images was included in this study. These images were acquired with parallel-hole collimation on a Symbia E gamma camera. The images were then processed with the histogram equalization technique. The image quality of the input and processed images was reviewed by two nuclear medicine physicians on a 5-point scale, where a score of 1 denotes very poor and 5 the best image quality. A statistical test was applied to assess the significance of the difference between the mean scores assigned to input and processed images. Results: The technique improves the contrast of the images; however, oversaturation was noticed in the processed images. Student's t-test was applied, and a statistically significant difference in input and processed image quality was found at P < 0.001 (with α = 0.05). However, further improvement in image quality is needed to meet the requirements of nuclear medicine physicians. Conclusion: GHE can be used on low-contrast bone scan images. In some cases, the histogram equalization technique is useful in combination with another postprocessing technique. PMID:29142344
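A minimal global histogram equalization sketch of the kind evaluated above, written with numpy only; the input is a synthetic low-contrast 8-bit image, not a bone scan.

```python
import numpy as np

def global_hist_eq(img):
    """Map 8-bit gray levels through the normalized cumulative histogram."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())   # normalize to [0, 1]
    return (cdf * 255).astype(np.uint8)[img]            # lookup-table remapping

rng = np.random.default_rng(7)
low_contrast = rng.integers(90, 140, size=(128, 128), dtype=np.uint8)
equalized = global_hist_eq(low_contrast)
print(low_contrast.min(), low_contrast.max(), "->", equalized.min(), equalized.max())
```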
Dogan, Nurettin Özgür; Corbacioglu, Seref Kerem; Bildik, Fikret; Kilicaslan, Isa; Günaydin, Gül Pamukcu; Cevik, Yunsur; Ülker, Volkan; Hakoglu, Onur; Gökcen, Emre
2014-09-01
To determine whether endogenous carbon monoxide levels in exacerbations of Chronic Obstructive Pulmonary Disease (COPD) patients were higher compared to healthy individuals, and to investigate the alteration of carbon monoxide levels across the three severity stages of the Global Initiative for Chronic Obstructive Lung Disease criteria for COPD exacerbations. The prospective study was conducted from January to March 2011 at two medical institutions in Ankara, Turkey, and comprised patients with acute COPD exacerbations. The severity of the exacerbations was graded according to the Global Initiative for Chronic Obstructive Lung Disease criteria. Patients with active tobacco smoking, suspected carbon monoxide poisoning or uncertain diagnosis were excluded. Healthy control subjects without comorbid disease or smoking habit were also enrolled to compare carboxyhaemoglobin levels. A two-tailed Mann-Whitney U test with Bonferroni correction, following a Kruskal-Wallis test, was used for statistical analysis. There were 90 patients and 81 controls in the study. Carboxyhaemoglobin levels were higher in the patients than in the controls (p < 0.001). As for the three severity stages, Group 1 had a median carboxyhaemoglobin of 1.6 (0.95-2.00). The corresponding levels in Group 2 (1.8 [1.38-2.20]) and Group 3 (1.9 [1.5-3.0]) were higher than in the controls (p < 0.001 and p < 0.005, respectively). No statistically significant difference between Group 1 and the controls (1.30 [1.10-1.55]) was observed (p = 0.434). Carboxyhaemoglobin levels were significantly higher in exacerbations compared with the normal population. Also, in more serious exacerbations, carboxyhaemoglobin levels were significantly increased compared with healthy individuals and mild exacerbations.
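A hedged sketch of the analysis pipeline described above: a Kruskal-Wallis test across groups, followed by pairwise two-tailed Mann-Whitney U tests with Bonferroni correction. The COHb values and group sizes for the three severity groups are synthetic, not the study's data.

```python
import itertools
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
groups = {
    "controls": rng.normal(1.3, 0.3, 81).clip(0.5),
    "group1":   rng.normal(1.6, 0.4, 30).clip(0.5),
    "group2":   rng.normal(1.8, 0.4, 30).clip(0.5),
    "group3":   rng.normal(1.9, 0.5, 30).clip(0.5),
}

h, p_kw = stats.kruskal(*groups.values())
print(f"Kruskal-Wallis: H = {h:.2f}, p = {p_kw:.4f}")

pairs = list(itertools.combinations(groups, 2))
for a, b in pairs:
    u, p = stats.mannwhitneyu(groups[a], groups[b], alternative="two-sided")
    p_adj = min(1.0, p * len(pairs))                # Bonferroni correction
    print(f"{a} vs {b}: U = {u:.0f}, adjusted p = {p_adj:.4f}")
```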
Advances in Landslide Hazard Forecasting: Evaluation of Global and Regional Modeling Approach
NASA Technical Reports Server (NTRS)
Kirschbaum, Dalia B.; Adler, Robert; Hong, Yang; Kumar, Sujay; Peters-Lidard, Christa; Lerner-Lam, Arthur
2010-01-01
A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that exhibit a high potential for landslide activity by combining a calculation of landslide susceptibility with satellite-derived rainfall estimates. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale landslide forecasting efforts, it requires several modifications before it can be fully realized as an operational tool. The evaluation finds that landslide forecasting may be more feasible at a regional scale. This study draws upon the prior work's recommendations to develop a new approach to landslide susceptibility and forecasting at the regional scale. The case study uses a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America: Guatemala, Honduras, El Salvador and Nicaragua. A regional susceptibility map is calculated from satellite and surface datasets using a statistical methodology. The susceptibility map is tested with a regional rainfall intensity-duration triggering relationship, and the results are compared to the global algorithm framework for the Hurricane Mitch event. The statistical results suggest that this regional investigation provides one plausible way to approach some of the data and resolution issues identified in the global assessment, providing more realistic landslide forecasts for this case study. Evaluation of landslide hazards for this extreme event helps to identify several potential improvements to the algorithm framework, but also highlights several remaining challenges for algorithm assessment, transferability and performance accuracy. Evaluation challenges include representation errors from comparing susceptibility maps of different spatial resolutions, biases in event-based landslide inventory data, and limited non-landslide event data for more comprehensive evaluation. Additional factors that may improve algorithm performance accuracy include incorporating additional triggering factors such as tectonic activity, anthropogenic impacts and soil moisture into the algorithm calculation. Despite these limitations, the methodology presented in this regional evaluation is both straightforward to calculate and easy to interpret, making results transferable between regions and allowing findings to be placed within an inter-comparison framework. The regional algorithm scenario represents an important step in advancing regional and global-scale landslide hazard assessment and forecasting.
[Visual field progression in glaucoma: cluster analysis].
Bresson-Dumont, H; Hatton, J; Foucher, J; Fonteneau, M
2012-11-01
Visual field progression analysis is one of the key points in glaucoma monitoring, but distinguishing true progression from random fluctuation is sometimes difficult. There are several different algorithms but no real consensus for detecting visual field progression. Trend analysis of global indices (MD, sLV) may miss localized deficits or be affected by media opacities. Conversely, point-by-point analysis makes progression difficult to differentiate from physiological variability, particularly when the sensitivity of a point is already low. The goal of our study was to analyse visual field progression with the EyeSuite™ Octopus Perimetry Clusters algorithm in patients with no significant change in global indices and no worsening on pointwise linear regression analysis. We analyzed the visual fields of 162 eyes (100 patients: 58 women, 42 men; average age 66.8 ± 10.91 years) with ocular hypertension or glaucoma. For inclusion, at least six reliable visual fields per eye were required, and the trend analysis (EyeSuite™ Perimetry) of the visual field global indices (MD, sLV) could show no significant progression. The analysis of changes in cluster mode was then performed. In a second step, eyes with statistically significant worsening of at least one of their clusters were analyzed point-by-point with the Octopus Field Analysis (OFA). Fifty-four eyes (33.33%) had significant worsening in some clusters while their global indices remained stable over time. This group of patients had more advanced glaucoma than the stable group (MD 6.41 dB vs. 2.87 dB); however, 64.82% (35/54) of the eyes in which the clusters progressed had no statistically significant change in the trend analysis by pointwise linear regression. Most software algorithms for analyzing visual field progression are essentially trend analyses of global indices or point-by-point linear regressions. This study shows the potential role of cluster trend analysis. However, for best results, it is preferable to compare the analyses of several tests in combination with a morphological examination. Copyright © 2012 Elsevier Masson SAS. All rights reserved.
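A sketch of pointwise linear regression for visual-field progression, the comparison method mentioned above: each test location's sensitivity (dB) is regressed on exam date and flagged if the slope is significantly negative. The series are simulated, and the flag rule (slope < 0, p < 0.05) is an assumed common criterion, not the paper's exact definition.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
n_exams, n_points = 8, 59                     # 8 reliable fields, 59 test points
years = np.linspace(0, 6, n_exams)
true_slopes = np.where(rng.random(n_points) < 0.1, -1.0, 0.0)   # 10% progressing
fields = 28 + true_slopes[:, None] * years + rng.normal(0, 1.2, (n_points, n_exams))

progressing = []
for i in range(n_points):
    slope, _, _, p, _ = stats.linregress(years, fields[i])
    if slope < 0 and p < 0.05:                # assumed pointwise criterion
        progressing.append(i)
print(f"{len(progressing)} of {n_points} points flagged as progressing")
```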
Evaluating collective significance of climatic trends: A comparison of methods on synthetic data
NASA Astrophysics Data System (ADS)
Huth, Radan; Dubrovský, Martin
2017-04-01
The common approach to determining whether climatic trends are significantly different from zero is to conduct individual (local) tests at each single site (station or gridpoint). Whether the number of sites where the trends are significantly non-zero could have occurred at random is almost never evaluated in trend studies; that is, the collective (global) significance of trends is ignored. We compare three approaches to evaluating the collective statistical significance of trends at a network of sites, using the following statistics: (i) the number of successful local tests (a successful test meaning here a test in which the null hypothesis of no trend is rejected); this is a standard way of assessing collective significance in various applications in atmospheric sciences; (ii) the smallest p-value among the local tests (Walker test); and (iii) the counts of positive and negative trends regardless of their magnitudes and local significance. The third approach is a new procedure that we propose; the rationale behind it is that the prevalence of one sign of trend at individual sites is indicative of high confidence in the trend not being zero, regardless of the (in)significance of the individual local trends. A potentially large amount of information contained in trends that are not locally significant, which are typically deemed irrelevant and neglected, is thus not lost and is retained in the analysis. In this contribution we examine the feasibility of the proposed way of significance testing on synthetic data, produced by a multi-site stochastic generator, and compare it with the two other, now well-established, ways of assessing collective significance. The synthetic dataset, mimicking annual mean temperature on an array of stations (or gridpoints), is constructed assuming a given statistical structure characterized by (i) spatial separation (density of the station network), (ii) local variance, (iii) temporal and spatial autocorrelations, and (iv) the trend magnitude. The probabilistic distributions of the three test statistics (null distributions) and the critical values of the tests are determined from multiple realizations of the synthetic dataset in which no trend is imposed at any site (that is, any trend is a result of random fluctuations only). The procedure is then evaluated by determining the type II error (the probability of failing to detect an existing trend) in the presence of a trend with a known magnitude, for which the synthetic dataset with an imposed spatially uniform non-zero trend is used. A sensitivity analysis is conducted for various combinations of trend magnitude and spatial autocorrelation.
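A hedged sketch of two of the collective-significance tests described above, applied to per-site p-values: (i) counting locally significant sites against a binomial null, and (ii) the Walker test on the smallest p-value. Spatial correlation is ignored here for simplicity, so in practice the null distributions would come from a stochastic generator as in the abstract; the p-values are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
K = 100                                   # number of sites
p_local = rng.uniform(0, 1, K)            # stand-in local trend-test p-values
p_local[:8] = rng.uniform(0, 0.01, 8)     # pretend 8 sites carry real trends

# (i) count test: is the number of rejections at alpha = 0.05 unusually high?
n_sig = np.sum(p_local < 0.05)
p_count = stats.binom.sf(n_sig - 1, K, 0.05)   # P(X >= n_sig) under the null
print(f"{n_sig} significant sites, count-test p = {p_count:.4f}")

# (ii) Walker test: global p-value for the minimum local p-value,
# valid under independence of the K local tests
p_walker = 1.0 - (1.0 - p_local.min()) ** K
print(f"Walker-test p = {p_walker:.6f}")
```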
Global atmospheric circulation statistics, 1000-1 mb
NASA Technical Reports Server (NTRS)
Randel, William J.
1992-01-01
The atlas presents atmospheric general circulation statistics derived from twelve years (1979-90) of daily National Meteorological Center (NMC) operational geopotential height analyses; it is an update of a prior atlas using data over 1979-1986. These global analyses are available on pressure levels covering 1000-1 mb (approximately 0-50 km). The geopotential grids are a combined product of the Climate Analysis Center (which produces analyses over 70-1 mb) and operational NMC analyses (over 1000-100 mb). Balance horizontal winds and hydrostatic temperatures are derived from the geopotential fields.
Superior diastolic function with KATP channel opener diazoxide in a novel mouse Langendorff model.
Makepeace, Carol M; Suarez-Pierre, Alejandro; Kanter, Evelyn M; Schuessler, Richard B; Nichols, Colin G; Lawton, Jennifer S
2018-07-01
Adenosine triphosphate-sensitive potassium (KATP) channel openers have been found to be cardioprotective in multiple animal models via an unknown mechanism. Mouse models allow genetic manipulation of KATP channel components for the investigation of this mechanism. Mouse Langendorff models using 30 min of global ischemia are known to induce measurable myocardial infarction and injury. Prolongation of global ischemia in a mouse Langendorff model could allow the determination of the mechanisms involved in KATP channel opener cardioprotection. Mouse hearts (C57BL/6) underwent baseline perfusion with Krebs-Henseleit buffer (30 min), assessment of function using a left ventricular balloon, delivery of test solution, and prolonged global ischemia (90 min). Hearts then underwent reperfusion (30 min) and functional assessment. Coronary flow was measured using an inline probe. Test solutions were as follows: hyperkalemic cardioplegia alone (CPG, n = 11) or with diazoxide (CPG + DZX, n = 12). Although the CPG + DZX group had greater percent recovery of developed pressure and coronary flow, this was not statistically significant. Following a mean of 74 min (CPG) and 77 min (CPG + DZX), an additional increase in end-diastolic pressure was noted (plateau), which was significantly higher in the CPG group. Similarly, the end-diastolic pressure (at reperfusion and at the end of the experiment) was significantly higher in the CPG group. Prolongation of global ischemia demonstrated an added benefit when DZX was added to traditional hyperkalemic CPG. This model will allow the investigation of the DZX mechanism of cardioprotection following manipulation of targeted KATP channel components. It will also allow translation to the prolonged ischemic episodes associated with cardiac surgery. Copyright © 2018 Elsevier Inc. All rights reserved.
Hursh, Andrew; Ballantyne, Ashley; Cooper, Leila; Maneta, Marco; Kimball, John; Watts, Jennifer
2017-05-01
Soil respiration (Rs) is a major pathway by which fixed carbon in the biosphere is returned to the atmosphere, yet there are limits to our ability to predict respiration rates using environmental drivers at the global scale. While temperature, moisture, carbon supply, and other site characteristics are known to regulate soil respiration rates at plot scales within certain biomes, quantitative frameworks for evaluating the relative importance of these factors across different biomes and at the global scale require tests of the relationships between field estimates and global climatic data. This study evaluates the factors driving Rs at the global scale by linking global datasets of soil moisture, soil temperature, primary productivity, and soil carbon estimates with observations of annual Rs from the Global Soil Respiration Database (SRDB). We find that calibrating models with parabolic soil moisture functions can improve predictive power over similar models with asymptotic functions of mean annual precipitation. Soil temperature is comparable with previously reported air temperature observations used in predicting Rs and is the dominant driver of Rs in global models; however, within certain biomes soil moisture and soil carbon emerge as dominant predictors of Rs. We identify regions where typical temperature-driven responses are further mediated by soil moisture, precipitation, and carbon supply and regions in which environmental controls on high Rs values are difficult to ascertain due to limited field data. Because soil moisture integrates temperature and precipitation dynamics, it can more directly constrain the heterotrophic component of Rs, but global-scale models tend to smooth its spatial heterogeneity by aggregating factors that increase moisture variability within and across biomes. We compare statistical and mechanistic models that provide independent estimates of global Rs ranging from 83 to 108 Pg yr-1, but also highlight regions of uncertainty where more observations are required or environmental controls are hard to constrain. © 2016 John Wiley & Sons Ltd.
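A hedged sketch of the kind of empirical Rs model discussed above: an exponential soil-temperature response multiplied by a parabolic soil-moisture function, fitted by least squares. The functional form, parameter names, and synthetic data are illustrative assumptions, not the study's calibrated model.

```python
import numpy as np
from scipy.optimize import curve_fit

def rs_model(X, r0, q, m_opt, width):
    temp, moisture = X
    temp_term = r0 * q ** (temp / 10.0)                    # Q10-style temperature response
    moist_term = 1.0 - ((moisture - m_opt) / width) ** 2   # parabolic moisture response
    return temp_term * np.clip(moist_term, 0.0, None)

rng = np.random.default_rng(11)
temp = rng.uniform(0, 30, 300)                  # deg C
moisture = rng.uniform(0.05, 0.55, 300)         # volumetric fraction
rs_obs = rs_model((temp, moisture), 1.2, 2.0, 0.30, 0.35) + rng.normal(0, 0.1, 300)

params, _ = curve_fit(rs_model, (temp, moisture), rs_obs, p0=[1.0, 2.0, 0.3, 0.3])
print("fitted (r0, Q10, optimal moisture, width):", np.round(params, 2))
```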
Accounting for Global Climate Model Projection Uncertainty in Modern Statistical Downscaling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johannesson, G
2010-03-17
Future climate change has emerged as a national and a global security threat. To carry out the needed adaptation and mitigation steps, a quantification of the expected level of climate change is needed, both at the global and the regional scale; in the end, the impact of climate change is felt at the local/regional level. An important part of such climate change assessment is uncertainty quantification. Decision and policy makers are not only interested in 'best guesses' of expected climate change, but rather in probabilistic quantification (e.g., Rougier, 2007). For example, consider the following question: What is the probability that the average summer temperature will increase by at least 4 °C in region R if global CO2 emission increases by P% from current levels by time T? It is a simple question, but one that remains very difficult to answer, and answering these kinds of questions is the focus of this effort. The uncertainty associated with future climate change can be attributed to three major factors: (1) uncertainty about future emission of greenhouse gasses (GHG); (2) given a future GHG emission scenario, what is its impact on the global climate? and (3) given a particular evolution of the global climate, what does it mean for a particular location/region? In what follows, we assume a particular GHG emission scenario has been selected. Given the GHG emission scenario, the current batch of state-of-the-art global climate models (GCMs) is used to simulate future climate under this scenario, yielding an ensemble of future climate projections (which reflect, to some degree, our uncertainty in being able to simulate future climate given a particular GHG scenario). Due to the coarse-resolution nature of the GCM projections, they need to be spatially downscaled for regional impact assessments. To downscale a given GCM projection, two methods have emerged: dynamical downscaling and statistical (empirical) downscaling (SDS). Dynamical downscaling involves configuring and running a regional climate model (RCM) nested within a given GCM projection (i.e., the GCM provides boundary conditions for the RCM). On the other hand, statistical downscaling aims at establishing a statistical relationship between observed local/regional climate variables of interest and synoptic (GCM-scale) climate predictors. The resulting empirical relationship is then applied to future GCM projections. A comparison of the pros and cons of dynamical versus statistical downscaling is outside the scope of this effort, but has been extensively studied, and the reader is referred to Wilby et al. (1998); Murphy (1999); Wood et al. (2004); Benestad et al. (2007); Fowler et al. (2007), and references within those. The scope of this effort is to study a methodology, a statistical framework, to propagate and account for GCM uncertainty in regional statistical downscaling assessment. In particular, we explore how to leverage an ensemble of GCM projections to quantify the impact of GCM uncertainty in such an assessment. There are three main components to this effort: (1) gather the necessary climate-related data for a regional SDS study, including multiple GCM projections, (2) carry out the SDS, and (3) assess the uncertainty. The first step is carried out using tools written in the Python programming language, while the analysis tools were developed in the statistical programming language R; see Figure 1.
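A schematic of statistical downscaling across a GCM ensemble, in the spirit described above: regress observed local temperature on a synoptic GCM-scale predictor, apply the fit to each ensemble member's projection, and let the spread across members express GCM uncertainty. Everything here (the data, the ensemble size, the single-predictor linear form) is an illustrative assumption, not the report's framework.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(12)

# calibration period: synoptic predictor vs. observed local temperature
gcm_hist = rng.normal(15.0, 1.0, 50)                 # 50 years of GCM-scale values
local_obs = 0.8 * gcm_hist + 3.0 + rng.normal(0, 0.5, 50)
slope, intercept, r, p, se = stats.linregress(gcm_hist, local_obs)

# future: one projection per ensemble member (warming + inter-model spread)
n_members = 41
gcm_future = 15.0 + rng.normal(3.0, 0.8, n_members)
local_future = slope * gcm_future + intercept

lo, hi = np.percentile(local_future, [5, 95])
print(f"downscaled local temperature, 5-95% across ensemble: {lo:.1f} to {hi:.1f} degC")
```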
MMPI for personality characteristics of patients with different diseases.
Pop-Jordanova, N
2015-01-01
In the field of psychosomatic medicine, the relationship between personality characteristics and diseases is considered an important issue. The aim of this article is to present group MMPI profiles obtained for patients with different chronic diseases and to discuss possible specific features of these groups. We summarized results obtained by psychological testing of the following groups of patients: adult patients treated with chronic maintenance dialysis, patients with diabetic retinopathy, a general anxiety group, a panic attack syndrome group, parents of children with rheumatoid arthritis, as well as adolescents with anorexia nervosa, cystic fibrosis, diabetes mellitus and leukemia. Two control groups comprised adults and adolescents, both without any health problems, selected randomly. The MMPI-201 was used as the psychometric test, and the Statistica 10 package was used for statistical analysis. Our results show some typical personality characteristics for patients with chronic conditions. These findings could be helpful for clinicians in treatment planning and follow-up. In general, the MMPI helps us to obtain a global, factual picture from the self-assessment of the patient, expressed in psycho-technical language. Group profiles could be used in clinical practice for planning treatment and estimating the prognosis of the illness.
Gao, Bin; Li, Xiaoqing; Woo, Wai Lok; Tian, Gui Yun
2018-05-01
Thermographic inspection has been widely applied to non-destructive testing and evaluation, with the capabilities of rapid, contactless, large-surface-area detection. Image segmentation is considered essential for identifying and sizing defects. To attain a high level of performance, specific physics-based models that describe defect generation and enable the precise extraction of the target region are of crucial importance. In this paper, an effective genetic first-order statistical image segmentation algorithm is proposed for quantitative crack detection. The proposed method automatically extracts valuable spatial-temporal patterns from an unsupervised feature extraction algorithm and avoids a range of issues associated with human intervention in the laborious manual selection of specific thermal video frames for processing. An internal genetic functionality is built into the proposed algorithm to automatically control the segmentation threshold and render enhanced accuracy in sizing the cracks. Eddy current pulsed thermography is implemented as a platform to demonstrate surface crack detection. Experimental tests and comparisons have been conducted to verify the efficacy of the proposed method. In addition, a global quantitative assessment index, the F-score, has been adopted to objectively evaluate the performance of different segmentation algorithms.
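As an illustration of the evaluation metric named above, a minimal F-score computation between a predicted crack mask and a ground-truth mask; the masks, sizes, and the beta = 1 choice are invented for the example, not taken from the paper.

```python
# F-score between binary segmentation masks: harmonic mean of precision
# and recall (beta = 1), a common global assessment index.
import numpy as np

def f_score(pred: np.ndarray, truth: np.ndarray, beta: float = 1.0) -> float:
    """F-measure between two boolean masks."""
    tp = np.logical_and(pred, truth).sum()
    precision = tp / max(pred.sum(), 1)
    recall = tp / max(truth.sum(), 1)
    if precision + recall == 0:
        return 0.0
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

truth = np.zeros((64, 64), dtype=bool); truth[20:40, 30:34] = True  # crack
pred = np.zeros_like(truth);            pred[22:40, 30:35] = True   # detection
print(f"F-score: {f_score(pred, truth):.3f}")
```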
Inocoterone and acne. The effect of a topical antiandrogen: results of a multicenter clinical trial.
Lookingbill, D P; Abrams, B B; Ellis, C N; Jegasothy, B V; Lucky, A W; Ortiz-Ferrer, L C; Savin, R C; Shupack, J L; Stiller, M J; Zone, J J
1992-09-01
Because acne is androgen dependent, antiandrogen therapy might improve the condition. Inocoterone acetate (RU 882) is a nonsteroidal antiandrogen that binds to the androgen receptor and has antiandrogenic activity in animal models. To test its topical effect on acne, 126 male subjects with facial acne completed a 16-week, multi-center, double-blind study in which the twice-daily application of a 10% solution of inocoterone was compared with vehicle solution. Baseline and monthly examinations included acne lesion counts and general and endocrine laboratory tests. Inflammatory papules and pustules showed greater reduction in the inocoterone-treated subjects than in the subjects treated with vehicle. This difference achieved statistical significance by week 12 (24% reduction vs 10%) and week 16 (26% reduction vs 13%) and, with longitudinal analysis, throughout the course of the study. Global assessments and changes in comedo counts and sebum excretion rates were not significantly different between the groups. No serious adverse reactions were encountered. In this double-blind study of 126 male subjects with acne, a topical solution of the antiandrogen inocoterone, compared with vehicle, produced a modest but statistically significant reduction in the number of inflammatory acne lesions.
High-Resolution Regional Reanalysis in China: Evaluation of 1 Year Period Experiments
NASA Astrophysics Data System (ADS)
Zhang, Qi; Pan, Yinong; Wang, Shuyu; Xu, Jianjun; Tang, Jianping
2017-10-01
Globally, reanalysis data sets are widely used in assessing climate change, validating numerical models, and understanding the interactions between the components of a climate system. However, due to their relatively coarse resolution, most global reanalysis data sets are not suitable for direct application at local and regional scales, with inadequate descriptions of mesoscale systems and climatic extreme incidents such as mesoscale convective systems, squall lines, tropical cyclones, regional droughts, and heat waves. In this study, using the Gridpoint Statistical Interpolation data assimilation system and the Weather Research and Forecasting mesoscale atmospheric model, we build a regional reanalysis system. This is a preliminary, first experimental attempt to construct a high-resolution reanalysis for mainland China. Four regional test bed data sets are generated for the year 2013 via three widely used methods (classical dynamical downscaling, spectral nudging, and data assimilation) and a hybrid method with data assimilation coupled with spectral nudging. Temperature at 2 m, precipitation, and upper-level atmospheric variables are evaluated by comparing against observations for the year-long tests. It can be concluded that the regional reanalysis with assimilation and nudging methods better reproduces the atmospheric variables from the surface to upper levels, and regional extreme events such as heat waves, than the classical dynamical downscaling. Compared to the ERA-Interim global reanalysis, the hybrid nudging method performs slightly better in reproducing upper-level temperature and low-level moisture over China, which improves regional reanalysis data quality.
NASA Technical Reports Server (NTRS)
Kirschbaum, Dalia; Adler, Robert; Adler, David; Peters-Lidard, Christa; Huffman, George
2012-01-01
It is well known that extreme or prolonged rainfall is the dominant trigger of landslides worldwide. While research has evaluated the spatiotemporal distribution of extreme rainfall and landslides at local or regional scales using in situ data, few studies have mapped rainfall-triggered landslide distribution globally due to the dearth of landslide data and consistent precipitation information. This study uses a newly developed Global Landslide Catalog (GLC) and a 13-year satellite-based precipitation record from TRMM data. For the first time, these two unique products provide the foundation to quantitatively evaluate the co-occurrence of precipitation and landslides globally. Evaluation of the GLC indicates that 2010 had a large number of high-impact landslide events relative to previous years. This study considers how variations in extreme and prolonged satellite-based rainfall are related to the distribution of landslides over the same time scales for three active landslide areas: Central America, the Himalayan Arc, and central-eastern China. Several test statistics confirm that TRMM rainfall generally scales with the observed increase in landslide reports and fatal events for 2010 and previous years over each region. These findings suggest that the co-occurrence of satellite precipitation and landslide reports may serve as a valuable indicator for characterizing the spatiotemporal distribution of landslide-prone areas in order to establish a global rainfall-triggered landslide climatology. This study characterizes the variability of satellite precipitation data and reported landslide activity at the global scale in order to improve landslide cataloging and forecasting and to quantify potential triggering sources at daily, monthly, and yearly time scales.
NASA Astrophysics Data System (ADS)
Holt, C. R.; Szunyogh, I.; Gyarmati, G.; Hoffman, R. N.; Leidner, M.
2011-12-01
Tropical cyclone (TC) track and intensity forecasts have improved in recent years due to increased model resolution, improved data assimilation, and the rapid increase in the number of routinely assimilated observations over oceans. The data assimilation approach that has received the most attention in recent years is Ensemble Kalman Filtering (EnKF). The most attractive feature of the EnKF is that it uses a fully flow-dependent estimate of the error statistics, which can have important benefits for the analysis of rapidly developing TCs. We implement the Local Ensemble Transform Kalman Filter algorithm, a variation of the EnKF, on a reduced-resolution version of the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) model and the NCEP Regional Spectral Model (RSM) to build a coupled global-limited-area analysis/forecast system. This is the first time, to our knowledge, that such a system is used for the analysis and forecast of tropical cyclones. We use data from summer 2004 to study eight tropical cyclones in the Northwest Pacific. The benchmark data sets that we use to assess the performance of our system are the NCEP Reanalysis and the NCEP Operational GFS analyses from 2004. These benchmark analyses were both obtained by the Spectral Statistical Interpolation, which was the operational data assimilation system of NCEP in 2004. The GFS operational analysis assimilated a large number of satellite radiance observations in addition to the observations assimilated in our system. All analyses are verified against the Joint Typhoon Warning Center Best Track data set. The errors are calculated for the position and intensity of the TCs. The global component of the ensemble-based system shows improvement in position analysis over the NCEP Reanalysis, but shows no significant difference from the NCEP operational analysis for most of the storm tracks. The regional component of our system improves position analysis over all the global analyses. The intensity analyses, measured by the minimum sea level pressure, are of similar quality in all of the analyses. Regional deterministic forecasts started from our analyses are generally not significantly different from those started from the GFS operational analysis. On average, the regional experiments performed better for sea level pressure forecasts beyond 48 h, while the global forecast performed better in predicting the position beyond 48 h.
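For readers unfamiliar with the method, a toy sketch of the ensemble transform Kalman filter analysis step that LETKF applies per local region, following the standard ETKF formulation; the state size, ensemble size, observation operator, and error levels are arbitrary assumptions, not the GFS/RSM configuration used in the study.

```python
# Toy ETKF analysis step: update an ensemble with observations using
# ensemble-space weights (mean update + perturbation transform).
import numpy as np
from scipy.linalg import sqrtm

def etkf_update(Xb, yo, H, R):
    """Xb: (n, k) background ensemble; yo: (p,) obs; H: (p, n); R: (p, p)."""
    n, k = Xb.shape
    xb = Xb.mean(axis=1, keepdims=True)
    Xp = Xb - xb                                     # background perturbations
    Yp = H @ Xp                                      # obs-space perturbations
    Rinv = np.linalg.inv(R)
    Pa = np.linalg.inv((k - 1) * np.eye(k) + Yp.T @ Rinv @ Yp)
    wa = Pa @ Yp.T @ Rinv @ (yo - (H @ xb).ravel())  # mean-update weights
    Wa = np.real(sqrtm((k - 1) * Pa))                # perturbation weights
    return xb + Xp @ (wa[:, None] + Wa)              # analysis ensemble (n, k)

rng = np.random.default_rng(1)
n, k, p = 40, 20, 10                      # state size, members, observations
truth = rng.normal(size=n)
Xb = truth[:, None] + 1.0 + rng.normal(size=(n, k))  # biased, spread ensemble
H = np.eye(p, n)                          # observe the first p variables
yo = H @ truth + rng.normal(scale=0.5, size=p)
Xa = etkf_update(Xb, yo, H, 0.25 * np.eye(p))
print("background RMSE:", np.sqrt(((Xb.mean(1) - truth) ** 2).mean()))
print("analysis RMSE:  ", np.sqrt(((Xa.mean(1) - truth) ** 2).mean()))
```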
A hybrid SVM-FFA method for prediction of monthly mean global solar radiation
NASA Astrophysics Data System (ADS)
Shamshirband, Shahaboddin; Mohammadi, Kasra; Tong, Chong Wen; Zamani, Mazdak; Motamedi, Shervin; Ch, Sudheer
2016-07-01
In this study, a hybrid support vector machine-firefly optimization algorithm (SVM-FFA) model is proposed to estimate monthly mean horizontal global solar radiation (HGSR). The merit of SVM-FFA is assessed statistically by comparing its performance with three previously used approaches. Using each approach and long-term measured HGSR, three models are calibrated by considering different sets of meteorological parameters measured for Bandar Abbass, situated in Iran. It is found that model (3), utilizing the combination of relative sunshine duration, difference between maximum and minimum temperatures, relative humidity, water vapor pressure, average temperature, and extraterrestrial solar radiation, shows superior performance with all approaches. Moreover, extraterrestrial radiation is introduced as a significant parameter for accurately estimating global solar radiation. The survey results reveal that the developed SVM-FFA approach is highly capable of providing favorable predictions with significantly higher precision than the other examined techniques. For SVM-FFA (3), the statistical indicators of mean absolute percentage error (MAPE), root mean square error (RMSE), relative root mean square error (RRMSE), and coefficient of determination (R2) are 3.3252%, 0.1859 kWh/m2, 3.7350%, and 0.9737, respectively, which according to the RRMSE criterion indicates excellent performance. As a further evaluation of SVM-FFA (3), the ratio of estimated to measured values is computed; 47 out of the 48 months considered as testing data fall between 0.90 and 1.10. Also, by performing a further verification, it is concluded that SVM-FFA (3) offers absolute superiority over the empirical models using relatively similar input parameters. In a nutshell, the hybrid SVM-FFA approach would be considered highly efficient for estimating the HGSR.
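A hedged sketch of the SVM regression core: the paper tunes SVM hyperparameters with a firefly algorithm (FFA), for which a randomized search is substituted here as a plain stand-in; the six predictors mirror model (3) in spirit only, and all data are synthetic.

```python
# SVR with randomized hyperparameter search standing in for the FFA tuner.
import numpy as np
from scipy.stats import loguniform
from sklearn.svm import SVR
from sklearn.model_selection import RandomizedSearchCV

rng = np.random.default_rng(2)
# Illustrative monthly inputs: sunshine fraction, temperature range,
# humidity, vapor pressure, mean temperature, extraterrestrial radiation.
X = rng.uniform(size=(120, 6))
y = 5.0 + 2.0 * X[:, 0] + 1.0 * X[:, 5] + 0.1 * rng.normal(size=120)  # toy HGSR

search = RandomizedSearchCV(
    SVR(kernel="rbf"),
    {"C": loguniform(1e-1, 1e3), "gamma": loguniform(1e-3, 1e1),
     "epsilon": loguniform(1e-3, 1e0)},
    n_iter=50, cv=5, scoring="neg_root_mean_squared_error", random_state=0,
).fit(X, y)

pred = search.predict(X)
rmse = np.sqrt(np.mean((pred - y) ** 2))
mape = 100 * np.mean(np.abs((pred - y) / y))  # y is safely bounded away from 0
print(search.best_params_, f"RMSE={rmse:.3f} MAPE={mape:.2f}%")
```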
Levy, Robert M; Saikovsky, Roman; Shmidt, Evgeniya; Khokhlov, Alexander; Burnett, Bruce P
2009-05-01
Flavocoxid (Limbrel), a proprietary mixture of flavonoid molecules (baicalin and catechin), was tested against a traditional nonsteroidal anti-inflammatory drug, naproxen, for the management of the signs and symptoms of moderate osteoarthritis (OA) in humans. Discomfort and global disease activity were used as the primary endpoints, and safety assessments were taken for both treatments as a secondary endpoint. In this double-blind study, 103 subjects were randomly assigned to receive either flavocoxid [500 mg twice daily (BID)] or naproxen (500 mg BID) in a 1-month onset-of-action trial. Outcome measures included the short Western Ontario and McMaster University Osteoarthritis Index, subject Visual Analogue Scale for discomfort and global response, and investigator Visual Analogue Scale for global response and fecal occult blood. Both flavocoxid and naproxen showed significant reduction in the signs and symptoms of knee OA (P < or = .001). There were no statistically detectable differences between the flavocoxid and naproxen groups with respect to any of the outcome variables. Similarly, there were no statistically detectable differences between the groups with respect to any adverse event, although there was a trend toward a higher incidence of edema and nonspecific musculoskeletal discomfort in the naproxen group. In this short-term pilot study, flavocoxid was as effective as naproxen in controlling the signs and symptoms of OA of the knee and would present a safe and effective option for those individuals on traditional nonsteroidal anti-inflammatory drugs or cyclooxygenase-2 inhibitors. A low incidence of adverse events was reported for both groups.
Walling, David; Marder, Stephen R.; Kane, John; Fleischhacker, W. Wolfgang; Keefe, Richard S. E.; Hosford, David A.; Dvergsten, Chris; Segreti, Anthony C.; Beaver, Jessica S.; Toler, Steven M.; Jett, John E.; Dunbar, Geoffrey C.
2016-01-01
Objectives: This trial was conducted to test the effects of an alpha7 nicotinic receptor full agonist, TC-5619, on negative and cognitive symptoms in subjects with schizophrenia. Methods: At 64 sites in the United States, Russia, Ukraine, Hungary, Romania, and Serbia, 477 outpatients (18–65 years; male 62%; 55% tobacco users) with schizophrenia, treated with a new-generation antipsychotic, were randomized to 24 weeks of placebo (n = 235), TC-5619, 5mg (n = 121), or TC-5619, 50mg (n = 121), administered orally once daily. The primary efficacy measure was the Scale for the Assessment of Negative Symptoms (SANS) composite score. Key secondary measures were the Cogstate Schizophrenia Battery (CSB) composite score and the University of California San Diego Performance-Based Skills Assessment-Brief Version (UPSA-B) total score. Secondary measures included: Positive and Negative Syndrome Scale in Schizophrenia (PANSS) total and subscale scores, SANS domain scores, CSB item scores, Clinical Global Impression-Global Improvement (CGI-I) score, CGI-Severity (CGI-S) score, and Subject Global Impression-Cognition (SGI-Cog) total score. Results: The SANS score showed no statistical benefit for TC-5619 vs placebo at week 24 (5mg, 2-tailed P = .159; 50mg, P = .689). Likewise, no scores of CSB, UPSA-B, PANSS, CGI-I, CGI-S, or SGI-Cog favored TC-5619 (P > .05). Sporadic statistical benefits favoring TC-5619 in some of these outcome measures were observed in tobacco users, but these benefits did not show concordance by dose, country, gender, or other relevant measures. TC-5619 was generally well tolerated. Conclusion: These results do not support a benefit of TC-5619 for negative or cognitive symptoms in schizophrenia. PMID:26071208
NASA Astrophysics Data System (ADS)
Nguyen, P.; Sorooshian, S.; Hsu, K. L.; Gao, X.; AghaKouchak, A.; Braithwaite, D.; Thorstensen, A. R.; Ashouri, H.; Tran, H.; Huynh, P.; Palacios, T.
2016-12-01
The Center for Hydrometeorology and Remote Sensing (CHRS) at the University of California, Irvine has recently developed CHRS RainSphere (hosted at http://rainsphere.eng.uci.edu) for scientific studies and applications using the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks - Climate Data Record (PERSIANN-CDR, Ashouri et al. 2015). PERSIANN-CDR is a long-term (33+ years), high-resolution (daily, 0.25 degree) global satellite precipitation dataset which is useful for climatological studies and water resources applications. CHRS RainSphere has functionalities allowing users to visualize and query spatiotemporal statistics of global daily satellite precipitation for the past three decades. With a couple of mouse clicks, users can easily obtain a report of time series, spatial plots, and basic trend analysis of rainfall for various spatial domains of interest, such as location, watershed, basin, political division, and country, at yearly, monthly, monthly-by-year, or daily resolution. The Mann-Kendall test is implemented in CHRS RainSphere for statistically investigating whether there is a significant increasing/decreasing rainfall trend at a location or over a specific spatial domain. CHRS RainSphere has a range of capabilities and should appeal to a broad spectrum of users, including climate scientists, water resources managers and planners, and engineers. CHRS RainSphere can also be a useful educational tool for the general public to investigate climate change and variability. The video tutorial on CHRS RainSphere is available at https://www.youtube.com/watch?v=eI2-f88iGlY&feature=youtu.be. A demonstration of CHRS RainSphere will be included in the presentation.
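A minimal sketch of the Mann-Kendall trend test that RainSphere applies to rainfall series; this version uses the normal approximation without a tie correction, and the annual series is synthetic.

```python
# Mann-Kendall trend test (normal approximation, no tie correction).
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0          # variance of S, no ties
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - norm.cdf(abs(z)))                    # two-sided p-value
    return s, z, p

rng = np.random.default_rng(3)
years = np.arange(1983, 2016)
rain = 800 + 2.5 * (years - years[0]) + rng.normal(scale=40, size=years.size)
s, z, p = mann_kendall(rain)
print(f"S={s}, Z={z:.2f}, p={p:.4f} -> {'trend' if p < 0.05 else 'no trend'}")
```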
The cognitive profile of myotonic dystrophy type 1: A systematic review and meta-analysis.
Okkersen, Kees; Buskes, Melanie; Groenewoud, Johannes; Kessels, Roy P C; Knoop, Hans; van Engelen, Baziel; Raaphorst, Joost
2017-10-01
To examine the cognitive profile of patients with myotonic dystrophy type 1 (DM1) on the basis of a systematic review and meta-analysis of the literature. Embase, Medline and PsycInfo were searched for studies reporting ≥1 neuropsychological test in both DM1 patients and healthy controls. Search, data extraction and risk-of-bias analysis were independently performed by two authors to minimize error. Neuropsychological tests were categorized into 12 cognitive domains, and effect sizes (Hedges' g) were calculated for each domain and for tests administered in ≥5 studies. DM1 participants demonstrated significantly worse performance compared to controls in all cognitive domains. Effect sizes ranged from -0.33 (small) for verbal memory to -1.01 (large) for visuospatial perception. Except for the domains global cognition, intelligence and social cognition, wide confidence intervals (CIs) were associated with moderate to marked statistical heterogeneity that necessitates careful interpretation of results. Of the individual tests, the Rey-Osterrieth complex figure-copy (both non-verbal memory and visuoconstruction) showed consistent impairment with acceptable heterogeneity. In DM1 patients, cognitive deficits may include a variable combination of global cognitive impairment with involvement across different domains, including social cognition, memory and visuospatial functioning. Although DM1 is a heterogeneous disorder, our study shows that meta-analysis is feasible, contributes to the understanding of brain involvement and may direct bedside testing. The protocol for this study has been registered in PROSPERO (International prospective register of systematic reviews) under ID: 42016037415.
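For concreteness, a small sketch of the effect-size computation underlying the meta-analysis: Hedges' g from group summary statistics, with the small-sample bias correction; the means, SDs, and group sizes below are invented.

```python
# Hedges' g: bias-corrected standardized mean difference.
import numpy as np

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference with small-sample bias correction."""
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                       # Cohen's d with pooled SD
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # Hedges' correction factor
    return j * d

# Patients vs. controls on a hypothetical visuospatial test (lower = worse).
print(f"g = {hedges_g(22.1, 5.0, 30, 27.0, 4.5, 28):.2f}")
```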
Aydemir, Koray; Tok, Fatih; Peker, Fatma; Safaz, Ismail; Taskaynatan, Mehmet Ali; Ozgul, Ahmet
2010-01-01
This study aimed to determine the effects of balneotherapy on disease activity, functional status, metrology index, pulmonary function and quality of life in patients with ankylosing spondylitis (AS). The study included 28 patients (27 male and 1 female) diagnosed with AS according to the modified New York criteria. The patients were treated with balneotherapy for 3 weeks (30 min/day, 5 days/week). The patients were evaluated using the global index, Bath ankylosing spondylitis disease activity index (BASDAI), disease functional index (BASFI), metrology index (BASMI), chest expansion measures, pulmonary function testing, and the medical outcomes study short form-36 Health Survey (SF-36) (a measure of quality of life) before balneotherapy and 1 month after treatment. Post balneotherapy, BASDAI and global index scores decreased, BASMI parameters improved, chest expansion increased, and some SF-36 parameters improved; however, none of these changes were statistically significant (P > 0.05), except for the decrease in BASMI total score (P < 0.05). Before balneotherapy, 6 patients had a restrictive pulmonary disorder according to pulmonary function test results. Pulmonary function test results in 3 (50%) of these patients normalized following balneotherapy; however, as with the other indices, balneotherapy did not significantly affect pulmonary function test results. The AS patients' symptoms, clinical findings, pulmonary function test results, and quality of life showed a trend toward improvement following balneotherapy, although without reaching statistical significance. Comprehensive randomized controlled spa intervention studies with longer follow-up periods may be helpful in further delineating the therapeutic efficacy of balneotherapy in AS patients.
'Disaster day': global health simulation teaching.
Mohamed-Ahmed, Rayan; Daniels, Alex; Goodall, Jack; O'Kelly, Emily; Fisher, James
2016-02-01
As society diversifies and globalisation quickens, the importance of teaching global health to medical undergraduates increases. For undergraduates, the majority of exposure to 'hands-on' teaching on global health occurs during optional elective periods. This article describes an innovative student-led initiative, 'Disaster Day', which used simulation to teach global health to undergraduates. The teaching day began with an introduction outlining the work of Médecins Sans Frontières and the basic principles of resuscitation. Students then undertook four interactive simulation scenarios: Infectious Diseases in a Refugee Camp, Natural Disaster and Crush Injury, Obstetric Emergency in a Low-Income Country, and Warzone Gunshot Wound. Sessions were facilitated by experienced doctors and fourth-year students who had been trained in the delivery of the scenarios. Students completed pre- and post-session evaluation forms that included the self-rating of confidence in eight learning domains (using a five-point Likert scale). Twenty-seven students voluntarily attended the session, and all provided written feedback. Analysis of the pre- and post-session evaluations demonstrated statistically significant improvements in confidence across all but one domain (Wilcoxon signed rank test). Free-text feedback was overwhelmingly positive, with students appreciating the practical aspect of the scenarios. Simulation-based teaching can provide students with 'hands-on' exposure to global health in a controlled, reproducible fashion and appears to help develop their confidence in a variety of learning domains. The more widespread use of such teaching methods is encouraged: helping tomorrow's doctors develop insight into global health challenges may produce more rounded clinicians capable of caring for more culturally diverse populations.
NASA Astrophysics Data System (ADS)
Lin, J.
2017-12-01
Recent studies have revealed the issue of globalizing air pollution through complex coupling of atmospheric transport (physical route) and economic trade (socioeconomic route). Recognition of such globalizing air pollution has important implications for understanding the impacts of regional and global consumption (of goods and services) on air quality, public health, climate and the ecosystems. Addressing these questions often requires improved modeling, measurements and economic-emission statistics. This talk will introduce the concept and mechanism of globalizing air pollution, with demonstrations based on recent work on modeling, satellite measurement and multi-disciplinary assessment.
Tankeu, Aurel T; Bigna, Jean Joël; Nansseu, Jobert Richie; Endomba, Francky Teddy A; Wafeu, Guy Sadeu; Kaze, Arnaud D; Noubiap, Jean Jacques
2017-06-09
Diabetes mellitus (DM) is an important risk factor for active tuberculosis (TB) and also adversely affects TB treatment outcomes. The escalating global DM epidemic is fuelling the burden of TB and should therefore be a major target in the strategy for ending TB. This review aims to estimate the global prevalence of DM in patients with TB. This systematic review will include cross-sectional, case-control or cohort studies of populations including patients diagnosed with TB that have reported the prevalence of DM using one of the four standard recommended methods for screening and diagnosis. This protocol is written in accordance with recommendations from the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols 2015 statement. Relevant abstracts published in English/French from inception to 31 December 2016 will be searched in PubMed, Excerpta Medica Database and online journals. Two investigators will independently screen and select studies, extract data and assess the risk of bias in each study. The study-specific estimates will be pooled through a random-effects meta-analysis model to obtain an overall summary estimate of the prevalence of diabetes across the studies. Heterogeneity will be assessed, and we will pool studies judged to be clinically homogeneous. Statistical heterogeneity will be evaluated by the χ² test on Cochran's Q statistic. Funnel-plot analysis and Egger's test will be used to investigate publication bias. Results will be presented by continent or geographic region. This study is based on published data; ethical approval is therefore not required. This systematic review and meta-analysis is expected to inform healthcare providers as well as the general population on the co-occurrence of DM and TB. The final report will be published as an original article in a peer-reviewed journal, presented at conferences, and submitted to relevant health authorities. We also plan to update the review every 5 years. PROSPERO International Prospective Register of Systematic Reviews (CRD42016049901).
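A sketch of the planned pooling step under simple assumptions: a DerSimonian-Laird random-effects model on raw proportions (real analyses often transform prevalences first), with Cochran's Q and I² as heterogeneity measures; the study data are invented.

```python
# DerSimonian-Laird random-effects pooling of prevalences with Q and I^2.
import numpy as np

p = np.array([0.12, 0.18, 0.09, 0.25, 0.15])   # study prevalences (toy)
n = np.array([300, 150, 500, 120, 400])        # study sample sizes (toy)

var = p * (1 - p) / n                          # within-study variance
w = 1 / var                                    # fixed-effect weights
p_fe = np.sum(w * p) / np.sum(w)
Q = np.sum(w * (p - p_fe) ** 2)                # Cochran's Q statistic
k = len(p)
I2 = max(0.0, (Q - (k - 1)) / Q) * 100         # heterogeneity, percent
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1 / (var + tau2)                        # random-effects weights
p_re = np.sum(w_re * p) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
print(f"pooled prevalence {p_re:.3f} "
      f"(95% CI {p_re - 1.96 * se:.3f}-{p_re + 1.96 * se:.3f}), "
      f"Q={Q:.1f}, I2={I2:.0f}%")
```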
Estimating the volume and age of water stored in global lakes using a geo-statistical approach
Messager, Mathis Loïc; Lehner, Bernhard; Grill, Günther; Nedeva, Irena; Schmitt, Oliver
2016-01-01
Lakes are key components of biogeochemical and ecological processes, thus knowledge about their distribution, volume and residence time is crucial in understanding their properties and interactions within the Earth system. However, global information is scarce and inconsistent across spatial scales and regions. Here we develop a geo-statistical model to estimate the volume of global lakes with a surface area of at least 10 ha based on the surrounding terrain information. Our spatially resolved database shows 1.42 million individual polygons of natural lakes with a total surface area of 2.67 × 106 km2 (1.8% of global land area), a total shoreline length of 7.2 × 106 km (about four times longer than the world's ocean coastline) and a total volume of 181.9 × 103 km3 (0.8% of total global non-frozen terrestrial water stocks). We also compute mean and median hydraulic residence times for all lakes to be 1,834 days and 456 days, respectively. PMID:27976671
NASA Astrophysics Data System (ADS)
Wu, Qing; Luu, Quang-Hung; Tkalich, Pavel; Chen, Ge
2018-04-01
Global warming and the associated sea level rise, which have great impacts on human lives, are believed to be strongly linked to anthropogenic causes. A statistical approach offers a simple and yet conceptually verifiable combination of remotely connected climate variables and indices, including sea level and surface temperature. We propose an improved statistical reconstruction model based on the empirical dynamic control system, taking into account climate variability and deriving parameters from Monte Carlo cross-validation random experiments. For the historic data from 1880 to 2001, our model yields higher correlations than other dynamic empirical models. The averaged root mean square errors are reduced in both reconstructed fields, namely, the global mean surface temperature (by 24-37%) and the global mean sea level (by 5-25%). Our model is also more robust, as it notably reduces the instability associated with varying initial values. These results suggest that the model not only significantly enhances the global mean reconstructions of temperature and sea level but also has the potential to improve future projections.
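A minimal illustration of Monte Carlo cross-validation as a way to derive model parameters: repeated random train/test splits score each candidate value, and the best average score wins. The ridge model and trend-plus-cycle data are stand-ins, not the authors' empirical dynamic control system.

```python
# Monte Carlo cross-validation: repeated random splits to pick a parameter.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import ShuffleSplit

rng = np.random.default_rng(4)
t = np.arange(1880, 2002)
X = np.column_stack([t - t[0], np.sin(2 * np.pi * (t - t[0]) / 60)])  # trend + cycle
y = 0.006 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(scale=0.1, size=t.size)

splitter = ShuffleSplit(n_splits=200, test_size=0.25, random_state=0)
alphas = [0.01, 0.1, 1.0, 10.0]
scores = []
for a in alphas:
    err = [np.mean((Ridge(alpha=a).fit(X[tr], y[tr]).predict(X[te]) - y[te]) ** 2)
           for tr, te in splitter.split(X)]
    scores.append(np.mean(err))
best = alphas[int(np.argmin(scores))]
print(f"selected alpha = {best} (mean MSE {min(scores):.4f})")
```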
Escalante, Agustín; Haas, Roy W; del Rincón, Inmaculada
2004-01-01
Outcome assessment in patients with rheumatoid arthritis (RA) includes measurement of physical function. We derived a scale to quantify global physical function in RA, using three performance-based rheumatology function tests (RFTs). We measured grip strength, walking velocity, and shirt button speed in consecutive RA patients attending scheduled appointments at six rheumatology clinics, repeating these measurements after a median interval of 1 year. We extracted the underlying latent variable using principal component factor analysis. We used the Bayesian information criterion to assess the global physical function scale's cross-sectional fit to criterion standards. The criteria were joint tenderness, swelling, and deformity, pain, physical disability, current work status, and vital status at 6 years after study enrolment. We computed Guyatt's responsiveness statistic for improvement according to the American College of Rheumatology (ACR) definition. Baseline functional performance data were available for 777 patients, and follow-up data were available for 681. Means ± standard deviations for each RFT at baseline were: grip strength, 14 ± 10 kg; walking velocity, 194 ± 82 ft/min; and shirt button speed, 7.1 ± 3.8 buttons/min. Grip strength and walking velocity departed significantly from normality. The three RFTs loaded strongly on a single factor that explained ≥70% of their combined variance. We rescaled the factor to vary from 0 to 100. Its mean ± standard deviation was 41 ± 20, with a normal distribution. The new global scale had a stronger fit than the primary RFTs to most of the criterion standards. It correlated more strongly with physical disability at follow-up and was more responsive to improvement defined according to the ACR20 and ACR50 definitions. We conclude that a performance-based physical function scale extracted from three RFTs has acceptable distributional and measurement properties and is responsive to clinically meaningful change. It provides a parsimonious scale to measure global physical function in RA. PMID:15225367
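A sketch of the scale construction described above: standardize the three performance tests, extract the first principal component, and rescale it to 0-100; the synthetic data only roughly mimic the reported means and SDs, and the loading structure is assumed.

```python
# First-principal-component composite scale from three performance tests.
import numpy as np

rng = np.random.default_rng(5)
latent = rng.normal(size=500)                       # true physical function
grip = 14 + 10 * (0.9 * latent + 0.4 * rng.normal(size=500))
walk = 194 + 82 * (0.85 * latent + 0.5 * rng.normal(size=500))
button = 7.1 + 3.8 * (0.8 * latent + 0.6 * rng.normal(size=500))

X = np.column_stack([grip, walk, button])
Z = (X - X.mean(0)) / X.std(0)                      # standardize each RFT
vals, vecs = np.linalg.eigh(np.cov(Z, rowvar=False))
pc1 = Z @ vecs[:, -1]                               # first principal component
explained = vals[-1] / vals.sum()
scale = 100 * (pc1 - pc1.min()) / (pc1.max() - pc1.min())  # rescale to 0-100
print(f"variance explained: {explained:.0%}; scale mean {scale.mean():.1f}")
```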
McLaren, Zoë M; Schnippel, Kathryn; Sharp, Alana
2016-01-01
Objective: Identifying those infected with tuberculosis (TB) is an important component of any strategy for reducing TB transmission and population prevalence. The Stop TB Global Partnership recently launched an initiative with a focus on key populations at greater risk for TB infection or poor clinical outcomes, due to housing and working conditions, incarceration, low household income, malnutrition, co-morbidities, exposure to tobacco and silica dust, or barriers to accessing medical care. To achieve operational targets, the global health community needs effective, low cost, and large-scale strategies for identifying key populations. Using South Africa as a test case, we assess the feasibility and effectiveness of targeting active case finding to populations with TB risk factors identified from regularly collected sources of data. Our approach is applicable to all countries with TB testing and census data. It allows countries to tailor their outreach activities to the particular risk factors of greatest significance in their national context. Methods: We use a national database of TB test results to estimate municipality-level TB infection prevalence, and link it to Census data to measure population risk factors for TB including rates of urban households, informal settlements, household income, unemployment, and mobile phone ownership. To examine the relationship between TB prevalence and risk factors, we perform linear regression analysis and plot the set of population characteristics against TB prevalence and TB testing rate by municipality. We overlay lines of best fit and smoothed curves of best fit from locally weighted scatter plot smoothing. Findings: Higher TB prevalence is statistically significantly associated with more urban municipalities (slope coefficient β1 = 0.129, p < 0.0001, R2 = 0.133), lower mobile phone access (β1 = -0.053, p < 0.001, R2 = 0.089), lower unemployment rates (β1 = -0.020, p = 0.003, R2 = 0.048), and a lower proportion of low-income households (β1 = -0.048, p < 0.0001, R2 = 0.084). Municipalities with more low-income households also have marginally higher TB testing rates, however, this association is not statistically significant (β1 = -0.025, p = 0.676, R2 = 0.001). There is no relationship between TB prevalence and the proportion of informal settlement households (β1 = 0.021, p = 0.136, R2 = 0.014). Conclusions: These analyses reveal that the set of characteristics identified by the Global Plan as defining key populations do not adequately predict populations with high TB burden. For example, we find that higher TB prevalence is correlated with more urbanized municipalities but not with informal settlements. We highlight several factors that are counter-intuitively those most associated with high TB burdens and which should therefore play a large role in any effective targeting strategy. Targeting active case finding to key populations at higher risk of infection or poor clinical outcomes may prove more cost effective than broad efforts. However, these results should increase caution in current targeting of active case finding interventions. PMID:27732606
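A hedged sketch of the municipality-level analysis for one risk factor: an OLS regression of TB prevalence on the share of urban households plus a lowess smooth, as in the paper's scatter plots; the 234 simulated municipalities and coefficients are illustrative only.

```python
# OLS slope/p-value/R^2 for one risk factor plus a lowess curve of best fit.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
urban = rng.uniform(0, 1, size=234)                      # share urban households
tb = 0.129 * urban + rng.normal(scale=0.08, size=234)    # TB test positivity

X = sm.add_constant(urban)
fit = sm.OLS(tb, X).fit()
print(f"slope={fit.params[1]:.3f}, p={fit.pvalues[1]:.4f}, R2={fit.rsquared:.3f}")

smooth = sm.nonparametric.lowess(tb, urban, frac=0.5)    # sorted (x, yhat) pairs
print("lowess endpoints:", smooth[0], smooth[-1])
```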
Atypical nucleus accumbens morphology in psychopathy: another limbic piece in the puzzle.
Boccardi, Marina; Bocchetta, Martina; Aronen, Hannu J; Repo-Tiihonen, Eila; Vaurio, Olli; Thompson, Paul M; Tiihonen, Jari; Frisoni, Giovanni B
2013-01-01
Psychopathy has been associated with increased putamen and striatum volumes. The nucleus accumbens - a key structure in reversal learning, which is less effective in psychopathy - has not yet received specific attention. Moreover, basal ganglia morphology has never been explored. We examined the morphology of the caudate, putamen and accumbens, manually segmented from magnetic resonance images of 26 offenders (age: 32.5 ± 8.4) with medium-high psychopathy (mean PCL-R = 30 ± 5) and 25 healthy controls (age: 34.6 ± 10.8). Local differences were statistically modeled using a surface-based radial distance mapping method (p < 0.05; multiple comparisons correction through permutation tests). In psychopathy, the caudate and putamen had normal global volume but different morphology, significant after correction for multiple comparisons for the right dorsal putamen (permutation test: p = 0.02). The volume of the nucleus accumbens was 13% smaller in psychopathy (p corrected for multiple comparisons < 0.006). The atypical morphology consisted of predominant anterior hypotrophy bilaterally (10-30%). Caudate and putamen local morphology displayed a negative correlation with the lifestyle factor of the PCL-R (permutation test: p = 0.05 and 0.03). From these data, psychopathy appears to be associated with an atypical striatal morphology, with highly significant global and local differences of the accumbens. This is consistent with the clinical syndrome and with theories of limbic involvement.
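A minimal sketch of the permutation logic used for significance testing in such morphometry studies, reduced to a single scalar measure (e.g., a normalized volume): the observed group difference is compared against differences under random relabeling. Group sizes follow the abstract; the data are synthetic.

```python
# Two-sided permutation test on a group difference in one scalar measure.
import numpy as np

rng = np.random.default_rng(7)
patients = rng.normal(loc=0.87, scale=0.1, size=26)   # e.g. accumbens measure
controls = rng.normal(loc=1.00, scale=0.1, size=25)

obs = patients.mean() - controls.mean()
pooled = np.concatenate([patients, controls])
n_perm, count = 10000, 0
for _ in range(n_perm):
    perm = rng.permutation(pooled)                    # random relabeling
    diff = perm[:26].mean() - perm[26:].mean()
    if abs(diff) >= abs(obs):
        count += 1
p = (count + 1) / (n_perm + 1)                        # two-sided permutation p
print(f"observed diff {obs:.3f}, permutation p = {p:.4f}")
```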
Oak, Sameer R; Strnad, Gregory J; Bena, James; Farrow, Lutul D; Parker, Richard D; Jones, Morgan H; Spindler, Kurt P
2016-12-01
The EuroQol 5 dimensions questionnaire (EQ-5D), Patient-Reported Outcomes Measurement Information System (PROMIS) 10 Global Health, and Veterans RAND 12-Item Health Survey (VR-12) are generic patient-reported outcome (PRO) questionnaires that assess a patient's general health. In choosing a PRO to track general health status, it is necessary to consider which measure will be the most responsive to change after treatment. To date, no studies exist comparing responsiveness among the EQ-5D, PROMIS 10 Global Health, and the VR-12. To determine which of the generic PROs are most responsive internally and externally in the setting of knee arthroscopy. Cohort study (diagnosis); Level of evidence, 3. Fifty patients who underwent knee arthroscopy were surveyed preoperatively and a mean 3.6 months postoperatively, with 90% follow-up. PROs included the EQ-5D, EQ-5D visual analog scale, PROMIS 10 Global Health (PROMIS 10) physical and mental components, VR-12 physical and mental components, and the Knee injury and Osteoarthritis Outcome Score (KOOS)-pain subscale. Internal responsiveness was evaluated by performing paired t tests on the changes in measures and calculating 2 measures of effect size: Cohen d and standardized response mean (SRM). External responsiveness was evaluated by comparing Pearson correlation measures between the disease-specific reference KOOS-pain and the generic PROs. For internal responsiveness, 3 PROs showed a statistically significant improvement in score after treatment (EQ-5D: +0.10 [95% CI, 0.06-0.15]; VR-12 physical: +7.2 [95% CI, 4.0-10.4]; and PROMIS 10 physical: +4.4 [95% CI, 2.6-6.3]) and effect size statistics with moderate change (Cohen d and SRM, 0.5-0.8). Assessing external responsiveness, a high correlation with the disease-specific reference (KOOS-pain score) was found for the EQ-5D (0.65), VR-12 physical (0.57), and PROMIS 10 physical (0.77). For both internal and external responsiveness, the EQ-5D, VR-12 physical, and PROMIS 10 physical showed significantly greater responsiveness compared with the other general PRO measures but no statistical differences among themselves. There is no statistical difference in internal or external responsiveness to change among the EQ-5D, VR-12 physical, and PROMIS 10 physical instruments. In tracking longitudinal patient health, researchers and administrators have the flexibility to choose any of the general PROs among the EQ-5D, VR-12 physical, and PROMIS 10 physical. We recommend that any study tracking PROs in knee arthroscopy include 1 of these generic instruments.
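A small sketch of the two internal-responsiveness statistics reported: Cohen's d computed as mean change over baseline SD, and the standardized response mean (SRM) as mean change over the SD of change; the pre/post scores below are simulated, not the study data.

```python
# Internal responsiveness: Cohen's d and SRM from paired pre/post scores.
import numpy as np

rng = np.random.default_rng(8)
pre = rng.normal(0.60, 0.20, size=45)                  # e.g. EQ-5D at baseline
post = pre + rng.normal(0.10, 0.15, size=45)           # ~3.6 months after

change = post - pre
cohen_d = change.mean() / pre.std(ddof=1)              # change vs. baseline SD
srm = change.mean() / change.std(ddof=1)               # change vs. SD of change
print(f"Cohen's d = {cohen_d:.2f}, SRM = {srm:.2f}")
```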
NASA Astrophysics Data System (ADS)
Karpudewan, Mageswary; Roth, Wolff-Michael; Abdullah, Mohd Nor Syahrir Bin
2015-01-01
Climate change generally and global warming specifically have become a common feature of the daily news. Due to widespread recognition of the adverse consequences of climate change on human lives, concerted societal effort has been taken to address it (e.g. by means of the science curriculum). This study was designed to test the effect that child-centred, 5E learning cycle-based climate change activities would have over more traditional teacher-centred activities on Malaysian Year 5 primary students (11 years). A quasi-experimental design involving a treatment group (n = 55) and a comparison group receiving typical teaching methods (n = 60) was used to measure the effectiveness of these activities in (a) increasing children's knowledge about global warming; (b) changing their attitudes to be more favourable towards the environment and (c) identifying the relationship between knowledge and attitude in this study. Statistically significant differences in favour of the treatment group were detected for both knowledge and environmental attitudes. A non-significant relationship was identified between knowledge and attitude. Interviews with randomly selected students from the treatment and comparison groups further underscore these findings. Implications are discussed.
NASA Technical Reports Server (NTRS)
Hailperin, M.
1993-01-01
This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that the author's techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. The author's method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive load-balancing methods. Results from a preliminary analysis of another system and from simulation with a synthetic load provide some evidence of more general applicability.
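A toy sketch in the spirit of the thesis: model the global average load as an AR(1) stochastic process observed through noisy sampled local averages and track it with a scalar Kalman filter; all parameters are illustrative assumptions, not the thesis's models.

```python
# Scalar Kalman filter tracking an AR(1) "global average load" signal.
import numpy as np

rng = np.random.default_rng(11)
phi, q, r = 0.95, 0.01, 0.25      # AR(1) coefficient, process/obs noise vars
true = 1.0
est, var = 1.0, 1.0               # filter state: estimate and its variance
errs = []
for t in range(500):
    true = phi * true + 0.05 + rng.normal(scale=np.sqrt(q))  # evolving load
    z = true + rng.normal(scale=np.sqrt(r))   # noisy sampled local average
    # Predict with the AR(1) model, then update with the new observation.
    est, var = phi * est + 0.05, phi**2 * var + q
    k = var / (var + r)
    est, var = est + k * (z - est), (1 - k) * var
    errs.append((est - true) ** 2)
print(f"RMSE of tracked average load: {np.sqrt(np.mean(errs)):.3f}")
```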
Horsch, Alexander; Hapfelmeier, Alexander; Elter, Matthias
2011-11-01
Breast cancer is globally a major threat to women's health. Screening and adequate follow-up can significantly reduce the mortality from breast cancer. Human second reading of screening mammograms can increase breast cancer detection rates, whereas this has not been proven for current computer-aided detection systems as "second reader". Critical factors include the detection accuracy of the systems and the screening experience and training of the radiologist with the system. When assessing the performance of systems and system components, the choice of evaluation methods is particularly critical. Core assets herein are reference image databases and statistical methods. We have analyzed characteristics and usage of the currently largest publicly available mammography database, the Digital Database for Screening Mammography (DDSM) from the University of South Florida, in literature indexed in Medline, IEEE Xplore, SpringerLink, and SPIE, with respect to the type of computer-aided diagnosis addressed (detection, CADe, or diagnostics, CADx), selection of database subsets, choice of evaluation method, and quality of descriptions. 59 publications presenting 106 evaluation studies met our selection criteria. In 54 studies (50.9%), the selection of test items (cases, images, regions of interest) extracted from the DDSM was not reproducible. Only 2 CADx studies, and no CADe studies, used the entire DDSM. The number of test items varies from 100 to 6000. Different statistical evaluation methods are chosen. Most common are train/test (34.9% of the studies), leave-one-out (23.6%), and N-fold cross-validation (18.9%). Database-related terminology tends to be imprecise or ambiguous, especially regarding the term "case". Overall, both the use of the DDSM as a data source for evaluation of mammography CAD systems and the application of statistical evaluation methods were found to be highly diverse. Results reported from different studies are therefore hardly comparable. Drawbacks of the DDSM (e.g. varying quality of lesion annotations) may contribute to the reasons, but a larger bias seems to be caused by authors' own study design decisions. Recommendations/conclusion: For future evaluation studies, we derive a set of 13 recommendations concerning the construction and usage of a test database, as well as the application of statistical evaluation methods.
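In the spirit of those recommendations, a sketch of a reproducible evaluation setup: a seeded selection of test items and a stratified N-fold cross-validation so the exact split can be rerun by others; the features and classifier are placeholders, not a CAD system.

```python
# Reproducible, seeded stratified k-fold evaluation of a toy classifier.
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)                 # fixed seed -> reproducible
X = rng.normal(size=(600, 12))                  # e.g. ROI texture features
y = (X[:, 0] + 0.5 * rng.normal(size=600)) > 0  # malignant vs benign (toy)

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=cv)
print(f"10-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```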
Changing Pattern of Indian Monsoon Extremes: Global and Local Factors
NASA Astrophysics Data System (ADS)
Ghosh, Subimal; Shastri, Hiteshri; Pathak, Amey; Paul, Supantha
2017-04-01
Indian Summer Monsoon Rainfall (ISMR) extremes have remained a major topic of discussion in the fields of global change and hydro-climatology over the last decade. This reflects multiple, partly conflicting conclusions on the changing pattern of extremes, along with poor understanding of the multiple processes at global and local scales associated with monsoon extremes. At a spatially aggregate scale, when the number of extremes in the grids is summed, a statistically significant increasing trend is observed for both Central India (Goswami et al., 2006) and all India (Rajeevan et al., 2008). However, such a result over Central India does not pass a field significance test of increase with no decrease (Krishnamurthy et al., 2009). Statistically rigorous extreme value analysis that deals with the tail of the distribution reveals a spatially non-uniform trend of extremes over India (Ghosh et al., 2012). This results in a statistically significant increasing trend in spatial variability. Such an increase in spatial variability points to the importance of local factors such as deforestation and urbanization. We hypothesize that the increase in the spatial average of extremes is associated with the increase of events occurring over large regions, while the increase in spatial variability is attributable to local factors. A Lagrangian-approach-based dynamic recycling model reveals that the major contributor of moisture to widespread extremes is the Western Indian Ocean, while the land surface also contributes around 25-30% of moisture during the extremes in Central India. We further test the impacts of local urbanization on extremes and find the impacts are more visible over West central, Southern and North East India. Regional atmospheric simulations coupled with an Urban Canopy Model (UCM) show that urbanization intensifies extremes in city areas, but not uniformly all over the city. The intensification occurs over specific pockets of the urban region, resulting in an increase in spatial variability even within the city. This also points to the need for setting up multiple weather stations over the city at a finer resolution for better understanding of urban extremes. We conclude that the conventional method of considering large-scale factors is not sufficient for analysing monsoon extremes, and their characterization needs a blending of both global and local factors. Ghosh, S., Das, D., Kao, S-C. & Ganguly, A. R. Lack of uniform trends but increasing spatial variability in observed Indian rainfall extremes. Nature Clim. Change 2, 86-91 (2012). Goswami, B. N., Venugopal, V., Sengupta, D., Madhusoodanan, M. S. & Xavier, P. K. Increasing trend of extreme rain events over India in a warming environment. Science 314, 1442-1445 (2006). Krishnamurthy, C. K. B., Lall, U. & Kwon, H-H. Changing frequency and intensity of rainfall extremes over India from 1951 to 2003. J. Clim. 22, 4737-4746 (2009). Rajeevan, M., Bhate, J. & Jaswal, A. K. Analysis of variability and trends of extreme rainfall events over India using 104 years of gridded daily rainfall data. Geophys. Res. Lett. 35, L18707 (2008).
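A minimal sketch of the tail-focused extreme value analysis mentioned above: fit a GEV distribution to an annual-maximum daily rainfall series and read off a return level; the 104-year series is synthetic, echoing only the record length analyzed by Rajeevan et al. (2008).

```python
# GEV fit to annual-maximum daily rainfall and a 100-year return level.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(9)
annual_max = genextreme.rvs(c=-0.1, loc=60, scale=15, size=104,
                            random_state=rng)        # mm/day, 104 "years"
c, loc, scale = genextreme.fit(annual_max)           # MLE of GEV parameters
rl_100 = genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)
print(f"shape={c:.2f}, 100-year daily rainfall ~ {rl_100:.0f} mm")
```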
Scene-based nonuniformity correction using local constant statistics.
Zhang, Chao; Zhao, Wenyi
2008-06-01
In scene-based nonuniformity correction, the statistical approach assumes all possible values of the true-scene pixel are seen at each pixel location. This global-constant-statistics assumption does not distinguish fixed pattern noise from spatial variations in the average image. This often causes "ghosting" artifacts in the corrected images, since existing spatial variations are treated as noise. We introduce a new statistical method to reduce the ghosting artifacts. Our method proposes local-constant statistics: the temporal signal distribution is assumed constant not across the whole image but only locally, i.e., the distribution is treated as constant in a local region around each pixel but as uneven at larger scales. Under the assumption that the fixed pattern noise concentrates in a higher spatial-frequency domain than the distribution variation, we apply a wavelet method to the gain and offset images of the noise and separate the pattern noise from the spatial variations in the temporal distribution of the scene. We compare the results to the global-constant-statistics method using a clean sequence with large artificial pattern noise. We also apply the method to a challenging CCD video sequence and a LWIR sequence to show how effective it is in reducing noise and the ghosting artifacts.
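A hedged sketch of the local-constant-statistics idea: estimate the per-pixel offset from the temporal mean, then keep only its high-spatial-frequency part as fixed pattern noise so that smooth scene variation survives. A Gaussian low-pass stands in for the paper's wavelet separation, and the sequence is synthetic.

```python
# Offset-only nonuniformity correction keeping high-frequency structure.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(10)
T, H, W = 200, 64, 64
scene = np.linspace(0, 1, W)[None, :] * np.ones((H, 1))   # smooth gradient
fpn = rng.normal(scale=0.1, size=(H, W))                   # fixed pattern noise
frames = scene + fpn + rng.normal(scale=0.05, size=(T, H, W))

temporal_mean = frames.mean(axis=0)
# Global-constant statistics would subtract temporal_mean entirely (ghosting);
# local-constant statistics removes only its high-spatial-frequency component.
offset_est = temporal_mean - gaussian_filter(temporal_mean, sigma=5)
corrected = frames - offset_est
print("mean |offset error| vs true FPN:", np.abs(offset_est - fpn).mean())
print("residual nonuniformity:", (corrected.mean(0) - scene).std())
```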
Evaluation of creative thinking in children with idiopathic epilepsy (absence epilepsy).
Di Filippo, T; Parisi, L; Roccella, M
2012-02-01
Creativity represents the silent character of human behaviour. In children with epilepsy, cognitive performance has mainly been investigated under the assumption that the disorder represents a risk factor for the development of intellectual function. In subjects with different forms of epilepsy, neuropsychologic disorders have been detected even when cognitive-global functioning is unimpaired. The cognitive functions of subjects with epilepsy have been widely studied, but their creativity has never been evaluated to date. The aim of this study was to describe the development of creative thinking in a group of children with absence epilepsy. The test battery included: the Torrance Test of Creative Thinking (TTCT), the Wechsler Intelligence Scale for Children-revised (WISC-R) and the Goodenough Human Figure Drawing Test. Statistical analysis (Mann-Whitney test) showed a statistically significant difference (P <0.05) in test scores between the two groups of subjects (children with epilepsy vs control group), with higher scores for figure originality, figure fluidity and figure elaboration in the control group. There were significant correlations (Spearman's rho) between verbal IQ and verbal fluidity and verbal flexibility subscale scores, between performance IQ and figure elaboration, and between total IQ and verbal fluidity and verbal flexibility subscales (P <0.05; r >0.30). Low scores on the figure originality subscales seem to confirm the hypothesis that adverse psychodynamic and relational factors impoverish autonomy, flexibility and manipulator interests. The communication channels between subjects with epilepsy and their family members were affected by the disorder, as were the type of emotional dynamics and affective flux.
Civil construction work: The unseen contributor to the occupational and global disease burden
Sitalakshmi, R.; Saikumar, P.; Jeyachandran, P.; Manoharan; Thangavel; Thomas, Jayakar
2016-01-01
Background: The construction industry is the second largest employment-providing industry in India, with many semi-skilled or unskilled workers taking up the occupation for their livelihood without any training or proper guidance. Aim: To evaluate the pathogenic association of cement exposure with occupational contact dermatoses as evidenced by immune markers, and to correlate workers' pulmonary functions with years of exposure to cement. Setting and Design: This was a cross-sectional study conducted among randomly selected cement workers. Methods and Material: Evaluation of socioeconomic status (SES) and years of exposure of cement workers was done using a questionnaire. Clinical examination of skin lesions and a strip patch test with application of potassium dichromate on unexposed skin were performed. Results were interpreted after 48 hours. Absolute eosinophil count (AEC) and IgE levels were measured, and spirometric evaluation was performed. Statistical Analysis: Analysis of variance and Pearson's correlation test were used for data analysis. P < 0.05 was considered to be statistically significant. Results: Clinically, skin lesions were noticed in 51%, elevated AEC in 47%, and raised IgE in 73%. Two participants developed positive reactions to the skin strip patch test. Duration of exposure to cement and SES were compared with clinical skin lesions. Spirometry results were normal in 81%, showed obstruction in 8%, restriction in 10%, and a mixed pattern in 1%. Forced expiratory volume in 1 second, forced expiratory flow (25–75%), and peak expiratory flow rate (PEFR) were markedly reduced with years of exposure. Workers with more extensive skin lesions and longer exposure had increased AEC and IgE levels, although this was statistically not significant. Conclusions: Exposure to cement and poor SES are strongly correlated with an increased prevalence of skin lesions and reduced pulmonary functions. PMID:28194084
NASA Technical Reports Server (NTRS)
Hoffman, Matthew J.; Eluszkiewicz, Janusz; Weisenstein, Deborah; Uymin, Gennady; Moncet, Jean-Luc
2012-01-01
Motivated by the needs of Mars data assimilation, particularly quantification of measurement errors and generation of averaging kernels, we have evaluated atmospheric temperature retrievals from Mars Global Surveyor (MGS) Thermal Emission Spectrometer (TES) radiances. Multiple sets of retrievals have been considered in this study: (1) retrievals available from the Planetary Data System (PDS), (2) retrievals based on variants of the retrieval algorithm used to generate the PDS retrievals, and (3) retrievals produced using the Mars 1-Dimensional Retrieval (M1R) algorithm based on the Optimal Spectral Sampling (OSS) forward model. The retrieved temperature profiles are compared to the MGS Radio Science (RS) temperature profiles. For the samples tested, the M1R temperature profiles can be made to agree within 2 K of the RS temperature profiles, but only after tuning the prior and error statistics. Use of a global prior that does not take into account the seasonal dependence leads to errors of up to 6 K. In polar samples, errors relative to the RS temperature profiles are even larger. In these samples, the PDS temperature profiles also exhibit a poor fit with RS temperatures. This fit is worse than reported in previous studies, indicating that the lack of fit is due to a bias correction to TES radiances implemented after 2004. To explain the differences between the PDS and M1R temperatures, the algorithms are compared directly, with the OSS forward model inserted into the PDS algorithm. Factors such as the filtering parameter, the use of linear versus nonlinear constrained inversion, and the choice of the forward model are found to contribute heavily to the differences in the temperature profiles retrieved in the polar regions, resulting in uncertainties of up to 6 K. Even outside the poles, changes in the a priori statistics result in different profile shapes which all fit the radiances within the specified error. The importance of the a priori statistics prevents reliable global retrievals based on a single a priori and strongly implies that a robust science analysis must instead rely on retrievals employing localized a priori information, for example from an ensemble-based data assimilation system such as the Local Ensemble Transform Kalman Filter (LETKF).
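The prior sensitivity discussed here is characteristic of linear optimal-estimation retrievals in general. Below is a hedged, generic sketch of one such retrieval step (not the actual M1R code); all matrices are invented toy values, and the averaging-kernel matrix mentioned in the text falls out of the same algebra.

```python
import numpy as np

def oe_retrieval(y, K, x_a, S_a, S_e):
    """One linear optimal-estimation (MAP) step:
    x_hat = x_a + (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 (y - K x_a).
    Also returns the posterior covariance and the averaging-kernel matrix."""
    S_e_inv = np.linalg.inv(S_e)
    S_hat = np.linalg.inv(K.T @ S_e_inv @ K + np.linalg.inv(S_a))
    x_hat = x_a + S_hat @ K.T @ S_e_inv @ (y - K @ x_a)
    A = S_hat @ K.T @ S_e_inv @ K        # averaging kernels
    return x_hat, S_hat, A

# Toy dimensions: 5 radiance channels, 3 temperature levels (all invented).
rng = np.random.default_rng(0)
K = rng.normal(size=(5, 3))              # forward-model Jacobian
x_a = np.array([210.0, 200.0, 190.0])    # a priori temperature profile (K)
S_a = np.diag([25.0, 25.0, 25.0])        # prior covariance -- the tunable part
S_e = np.eye(5) * 0.01                   # radiance error covariance
y = K @ np.array([212.0, 202.0, 188.0]) + rng.normal(0, 0.1, 5)

x_hat, S_hat, A = oe_retrieval(y, K, x_a, S_a, S_e)
print(x_hat)
```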
Simulation skill of APCC set of global climate models for Asian summer monsoon rainfall variability
NASA Astrophysics Data System (ADS)
Singh, U. K.; Singh, G. P.; Singh, Vikas
2015-04-01
The performance of 11 Asia-Pacific Economic Cooperation Climate Center (APCC) global climate models (both coupled and uncoupled) in simulating seasonal summer (June-August) monsoon rainfall variability over Asia (especially over India and East Asia) has been evaluated in detail using hindcast data (3 months in advance) generated by APCC, which provides regional climate information products and services based on multi-model ensemble dynamical seasonal prediction systems. The skill of each global climate model over Asia was tested separately in detail for a period of 21 years (1983-2003), and simulated Asian summer monsoon rainfall (ASMR) was verified using various statistical measures for the Indian and East Asian land masses separately. The analysis found a large variation in spatial ASMR simulated by the uncoupled models compared to the coupled models (such as the Predictive Ocean Atmosphere Model for Australia, National Centers for Environmental Prediction and Japan Meteorological Agency models). The simulated ASMR in the coupled models was closer to the Climate Prediction Center Merged Analysis of Precipitation (CMAP) than in the uncoupled models, although the amount of ASMR was underestimated in both. The analysis also found a high spread in simulated ASMR among the ensemble members, suggesting that model performance is highly dependent on initial conditions. The correlation analysis between sea surface temperature (SST) and ASMR shows that the coupled models are strongly associated with ASMR compared to the uncoupled models, suggesting that air-sea interaction is well captured in the coupled models. The analysis of rainfall using various statistical measures suggests that the multi-model ensemble (MME) performed better than the individual models, and that treating the Indian and East Asian land masses separately is more useful than considering Asian monsoon rainfall as a whole. The results of the various statistical measures (skill of the multi-model ensemble, large spread among the ensemble members of individual models, strong teleconnection with SST, coefficient of variation, inter-annual variability, Taylor diagram analysis, etc.) suggest that there is a need to improve the coupled models rather than the uncoupled models for the development of a better dynamical seasonal forecast system.
Theory of mind and functionality in bipolar patients with symptomatic remission.
Barrera, Angeles; Vázquez, Gustavo; Tannenhaus, Lucila; Lolich, María; Herbst, Luis
2013-01-01
Functional deficits are commonly observed in bipolar disorder after symptomatic remission. Social cognition deficits have also been reported, which could contribute to dysfunction in patients with bipolar disorder in remission. Twelve bipolar disorder patients in symptomatic remission (7 patients with bipolar disorder type I and 5 with bipolar disorder type II) and 12 healthy controls completed the Reading the Mind in the Eyes Test and the Faux Pas Test to evaluate theory of mind (ToM). Both groups also completed the Functional Assessment Short Test (FAST). The performance of the bipolar patients in the cognitive component of ToM was below normal, although the difference from the control group was not statistically significant (P=.078), with a trend toward worse performance associated with a higher number of depressive episodes (P=.082). There were no statistically significant differences between groups for the emotional component of ToM. Global functionality was significantly lower in bipolar patients compared to the control group (P=.001). Significant differences were also observed between the two groups in five of the six dimensions of functionality assessed. No significant correlation was found between functionality and theory of mind. Bipolar patients in symptomatic remission exhibit impairments in several areas of functioning. Cognitive ToM appears more affected than emotional ToM. Deficits in ToM were not related to functional impairment. Copyright © 2012 SEP y SEPB. Published by Elsevier España. All rights reserved.
Effects of weight training on cognitive functions in elderly with Alzheimer's disease
Vital, Thays Martins; Hernández, Salma S. Soleman; Pedroso, Renata Valle; Teixeira, Camila Vieira Ligo; Garuffi, Marcelo; Stein, Angelica Miki; Costa, José Luiz Riani; Stella, Florindo
2012-01-01
Deterioration in cognitive functions is characteristic of Alzheimer's disease (AD) and may be associated with decline in activities of daily living, with consequently reduced quality of life. Objective: To analyze the effects of weight training on cognitive functions in elderly patients with AD. Subjects: 34 elderly patients with AD were allocated to two groups: a Training Group (TG) and a Social Gathering Group (SGG). Methods: Global cognitive status was determined using the Mini-Mental State Exam. Specific cognitive functions were measured using the Brief Cognitive Battery, the Clock Drawing Test and the Verbal Fluency Test. The protocols were performed three times a week, one hour per session. The weight training protocol consisted of three sets of 20 repetitions, with two minutes of rest between sets and exercises. The activities proposed for the SGG were not systematized and aimed at promoting social interaction among patients. Statistical analyses were performed with the Mann-Whitney U and Wilcoxon tests for group comparisons. All analyses were considered statistically significant at a p-value of 0.05. Results: There were no significant differences associated with the effects of weight training on cognition in AD patients. Conclusion: In this study, no improvement in cognitive functions was evident in elderly patients with AD who followed a low-intensity resistance exercise protocol. Thus, future studies could evaluate the effect of more intense exercise programs. PMID:29213805
Absolute plate motions relative to deep mantle plumes
NASA Astrophysics Data System (ADS)
Wang, Shimin; Yu, Hongzheng; Zhang, Qiong; Zhao, Yonghong
2018-05-01
Advances in whole waveform seismic tomography have revealed the presence of broad mantle plumes rooted at the base of the Earth's mantle beneath major hotspots. Hotspot tracks associated with these deep mantle plumes provide ideal constraints for inverting absolute plate motions as well as for testing the fixed hotspot hypothesis. In this paper, 27 observed hotspot trends associated with 24 deep mantle plumes are used together with the MORVEL model for relative plate motions to determine an absolute plate motion model, in terms of a maximum likelihood optimization for angular data fitting, combined with an outlier detection procedure based on statistical tests. The resulting T25M model fits 25 observed trends of globally distributed hotspot tracks to the statistically required level, while the other two hotspot trends (Comores on Somalia and Iceland on Eurasia) are identified as outliers that are significantly incompatible with the other data. For most hotspots with rate data available, T25M predicts plate velocities significantly lower than the observed rates of hotspot volcanic migration, which cannot be fully explained by biased errors in the observed rate data. Instead, the apparent hotspot motions derived by subtracting the observed hotspot migration velocities from the T25M plate velocities exhibit a combined pattern of being opposite to plate velocities and moving towards mid-ocean ridges. The newly estimated net rotation of the lithosphere is statistically compatible with three recent estimates, but differs significantly from 30 of 33 prior estimates.
Bártová, Eva; Sedlák, Kamil; Kobédová, Kateřina; Budíková, Marie; Joel Atuman, Yakubu; Kamani, Joshua
2017-09-26
Neospora spp. and Toxoplasma gondii are considered to be globally distributed parasites affecting a wide range of warm-blooded animals. Neosporosis has caused clinical illness in horses, and consumption of horse meat has been epidemiologically linked to clinical toxoplasmosis in humans. This study was conducted to determine Neospora spp. and T. gondii antibodies and risk factors of infection in horses and donkeys from three states of Nigeria. A total of 144 samples were collected from clinically healthy animals (120 horses and 24 donkeys). The sera were tested for antibodies to Neospora spp. and T. gondii by the indirect fluorescence antibody test; a titer ≥ 50 was considered positive. Seroprevalence data were statistically analyzed, considering the variables of gender, age, use, state, origin of breed and type of management. Antibodies to Neospora spp. and T. gondii were detected in 8% of horses (titers of 50) and in 24% of horses (titers of 50-800), respectively. Co-infection with both parasites was confirmed in three horses (3%). Statistical differences were found only for T. gondii seroprevalence in horses of different use, locality, origin and management (p-value ≤ 0.05). Antibodies to T. gondii were detected in four (17%) of 24 donkeys, with a statistical difference (p-value ≤ 0.05) in animals of different use; antibodies to Neospora spp. were not detected in any of the donkeys. This is the first seroprevalence study of Neospora spp. and T. gondii in equids from Nigeria.
Statistical downscaling and future scenario generation of temperatures for Pakistan Region
NASA Astrophysics Data System (ADS)
Kazmi, Dildar Hussain; Li, Jianping; Rasul, Ghulam; Tong, Jiang; Ali, Gohar; Cheema, Sohail Babar; Liu, Luliu; Gemmer, Marco; Fischer, Thomas
2015-04-01
Impact studies require climate change information at a finer spatial scale than that presently provided by global or regional climate models. This is especially true for regions like South Asia, with complex topography, coastal or island locations, and areas of highly heterogeneous land cover. To deal with this situation, an inexpensive method, statistical downscaling, has been adopted. The Statistical DownScaling Model (SDSM) was employed for downscaling of daily minimum and maximum temperature data from 44 national stations for the base period (1961-1990), and future scenarios were then generated up to 2099. Observed data as well as predictors (a product of the National Oceanic and Atmospheric Administration) were calibrated and tested on an individual/multiple basis through linear regression. Future scenarios were generated based on HadCM3 daily data for the A2 and B2 storylines. The downscaled data have been tested and show a relatively strong relationship with observations in comparison to ECHAM5 data. Generally, the southern half of the country is considered vulnerable in terms of increasing temperatures, but the results of this study project that in the future the northern belt in particular faces a possible threat of an increasing tendency in air temperature. In particular, the northern areas (hosting the third largest ice reserves after the polar regions), an important feeding source for the Indus River, are projected to be vulnerable in terms of increasing temperatures. Consequently, not only the hydro-agricultural sector but also the environmental conditions in the area may be at risk in the future.
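A hedged sketch of the regression step underlying SDSM-style downscaling: calibrate a station predictand on large-scale predictors, then drive the fitted model with GCM-derived predictors. Arrays and coefficients below are placeholders, not the study's data or SDSM itself.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X_obs = rng.normal(size=(3650, 5))             # daily reanalysis-style predictors
beta = np.array([0.5, -0.2, 0.1, 0.3, 0.0])    # invented "true" coefficients
t_obs = X_obs @ beta + rng.normal(0, 1, 3650)  # observed station temperature

model = LinearRegression().fit(X_obs, t_obs)   # calibration on the base period
X_gcm = rng.normal(loc=0.3, size=(3650, 5))    # GCM predictors for a scenario
t_future = model.predict(X_gcm)                # downscaled scenario series
print(f"projected shift: {t_future.mean() - t_obs.mean():+.2f} K")
```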
Brain magnetic resonance imaging CO2 stress testing in adolescent postconcussion syndrome.
Mutch, W Alan C; Ellis, Michael J; Ryner, Lawrence N; Ruth Graham, M; Dufault, Brenden; Gregson, Brian; Hall, Thomas; Bunge, Martin; Essig, Marco; Fisher, Joseph A; Duffin, James; Mikulis, David J
2016-09-01
OBJECT A neuroimaging assessment tool to visualize global and regional impairments in cerebral blood flow (CBF) and cerebrovascular responsiveness in individual patients with concussion remains elusive. Here the authors summarize the safety, feasibility, and results of brain CO2 stress testing in adolescents with postconcussion syndrome (PCS) and healthy controls. METHODS This study was approved by the Biomedical Research Ethics Board at the University of Manitoba. Fifteen adolescents with PCS and 17 healthy control subjects underwent anatomical MRI, pseudo-continuous arterial spin labeling MRI, and brain stress testing using controlled CO2 challenge and blood oxygen level-dependent (BOLD) MRI. Post hoc processing was performed using statistical parametric mapping to determine voxel-by-voxel regional resting CBF and cerebrovascular responsiveness of the brain to the CO2 stimulus (increase in BOLD signal) or the inverse (decrease in BOLD signal). Receiver operating characteristic (ROC) curves were generated to compare voxel counts categorized by control (0) or PCS (1). RESULTS Studies were well tolerated without any serious adverse events. Anatomical MRI was normal in all study participants. No differences in CO2 stimuli were seen between the 2 participant groups. No group differences in global mean CBF were detected between PCS patients and healthy controls. Patient-specific differences in mean regional CBF and CO2 BOLD responsiveness were observed in all PCS patients. The ROC curve analysis for brain regions manifesting a voxel response greater than and less than the control atlas (that is, abnormal voxel counts) produced an area under the curve of 0.87 (p < 0.0001) and 0.80 (p = 0.0003), respectively, consistent with a clinically useful predictive model. CONCLUSIONS Adolescent PCS is associated with patient-specific abnormalities in regional mean CBF and BOLD cerebrovascular responsiveness that occur in the setting of normal global resting CBF. Future prospective studies are warranted to examine the utility of brain MRI CO2 stress testing in the longitudinal assessment of acute sports-related concussion and PCS.
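A minimal sketch of the ROC analysis described in the results, with invented labels and abnormal-voxel counts standing in for the study's data:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(2)
labels = np.r_[np.zeros(17, int), np.ones(15, int)]         # 0 = control, 1 = PCS
voxels = np.r_[rng.poisson(200, 17), rng.poisson(800, 15)]  # abnormal-voxel counts

fpr, tpr, thresholds = roc_curve(labels, voxels)
print("AUC =", round(roc_auc_score(labels, voxels), 2))
```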
Explorations in statistics: hypothesis tests and P values.
Curran-Everett, Douglas
2009-06-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This second installment of Explorations in Statistics delves into test statistics and P values, two concepts fundamental to the test of a scientific null hypothesis. The essence of a test statistic is that it compares what we observe in the experiment to what we expect to see if the null hypothesis is true. The P value associated with the magnitude of that test statistic answers this question: if the null hypothesis is true, what proportion of possible values of the test statistic are at least as extreme as the one I got? Although statisticians continue to stress the limitations of hypothesis tests, there are two realities we must acknowledge: hypothesis tests are ingrained within science, and the simple test of a null hypothesis can be useful. As a result, it behooves us to explore the notions of hypothesis tests, test statistics, and P values.
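The definition of a P value given here can be made concrete with a permutation test, in which the P value is literally the proportion of label-shuffled statistics at least as extreme as the observed one; the data below are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, 12)     # two invented samples
b = rng.normal(0.8, 1.0, 12)
observed = a.mean() - b.mean()   # the test statistic

pooled = np.concatenate([a, b])
null = []
for _ in range(10000):
    rng.shuffle(pooled)          # re-label under the null hypothesis
    null.append(pooled[:12].mean() - pooled[12:].mean())

# Proportion of null statistics at least as extreme as the one we got
p = np.mean(np.abs(null) >= abs(observed))
print(f"two-sided permutation P = {p:.4f}")
```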
Global Sensory Qualities and Aesthetic Experience in Music.
Brattico, Pauli; Brattico, Elvira; Vuust, Peter
2017-01-01
A well-known tradition in the study of visual aesthetics holds that the experience of visual beauty is grounded in global computational or statistical properties of the stimulus, for example, a scale-invariant Fourier spectrum or self-similarity. Some approaches rely on neural mechanisms, such as efficient computation, processing fluency, or the responsiveness of cells in the primary visual cortex. These proposals are united by the fact that the contributing factors are hypothesized to be global (i.e., they concern the percept as a whole), formal or non-conceptual (i.e., they concern form instead of content), computational and/or statistical, and based on relatively low-level sensory properties. Here we propose that the study of aesthetic responses to music could benefit from the same approach. Thus, along with local features such as pitch, tuning, consonance/dissonance, harmony, timbre, or beat, global sonic properties could also be viewed as contributing toward creating an aesthetic musical experience. Several such properties are discussed and their neural implementation is reviewed in the light of recent advances in neuroaesthetics.
Verification System: First System-Wide Performance Test
NASA Astrophysics Data System (ADS)
Chernobay, I.; Zerbo, L.
2006-05-01
System-wide performance tests are essential for the development, testing and evaluation of individual components of the verification system. In addition to evaluating global readiness, such testing helps establish the practical and financial requirements for eventual operations. The first system-wide performance test (SPT1) was conducted in three phases: a preparatory phase in May-June 2004, a performance testing phase in April-June 2005, and an evaluation phase in the last half of 2005. The preparatory phase was developmental in nature. The main objectives of the performance testing phase included establishment of a performance baseline under the current provisional mode of operation (CTBT/PC-19/1/Annex II, CTBT/WGB-21/1) and examination of established requirements and procedures for operation and maintenance. To establish a system-wide performance baseline, the system configuration was fixed for April-May 2005. The third month (June 2005) was used for implementation of 21 test-case scenarios to examine either particular operational procedures or the response of the system components to failures simulated under controlled conditions. A total of 163 stations and 5 certified radionuclide laboratories of the International Monitoring System (IMS) participated in the performance testing phase - about 50% of the eventual IMS network. 156 IMS facilities and 40 National Data Centres (NDCs) were connected to the International Data Centre (IDC) via Global Communications Infrastructure (GCI) links. In addition, 12 legacy stations in the auxiliary seismic network sent data to the IDC over the Internet. During the performance testing phase, the IDC produced all required products and analysed more than 6100 seismic events and 1700 radionuclide spectra. The performance of all system elements was documented and analysed, and IDC products were compared with the results of data processing at the NDCs. On the basis of the statistics and information collected during SPT1, a system-wide performance baseline under the current guidelines for provisional operation and maintenance was established. The test provided feedback for further development of the draft IMS and IDC Operational Manuals and identified priority areas for further system development.
A weighted generalized score statistic for comparison of predictive values of diagnostic tests.
Kosinski, Andrzej S
2013-03-15
Positive and negative predictive values are important measures of a medical diagnostic test performance. We consider testing equality of two positive or two negative predictive values within a paired design in which all patients receive two diagnostic tests. The existing statistical tests for testing equality of predictive values are either Wald tests based on the multinomial distribution or the empirical Wald and generalized score tests within the generalized estimating equations (GEE) framework. As presented in the literature, these test statistics have considerably complex formulas without clear intuitive insight. We propose their re-formulations that are mathematically equivalent but algebraically simple and intuitive. As is clearly seen with a new re-formulation we presented, the generalized score statistic does not always reduce to the commonly used score statistic in the independent samples case. To alleviate this, we introduce a weighted generalized score (WGS) test statistic that incorporates empirical covariance matrix with newly proposed weights. This statistic is simple to compute, always reduces to the score statistic in the independent samples situation, and preserves type I error better than the other statistics as demonstrated by simulations. Thus, we believe that the proposed WGS statistic is the preferred statistic for testing equality of two predictive values and for corresponding sample size computations. The new formulas of the Wald statistics may be useful for easy computation of confidence intervals for difference of predictive values. The introduced concepts have potential to lead to development of the WGS test statistic in a general GEE setting. Copyright © 2012 John Wiley & Sons, Ltd.
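The WGS statistic itself is not reproduced here; as a hedged stand-in, the sketch below only illustrates the quantity under test, the difference of two positive predictive values in a paired design, using a simple patient-level bootstrap on invented data.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500
disease = rng.integers(0, 2, n)                              # gold standard
test1 = (disease & rng.integers(0, 2, n)) | rng.binomial(1, 0.10, n)
test2 = (disease & rng.integers(0, 2, n)) | rng.binomial(1, 0.15, n)

def ppv(test, truth):
    """Positive predictive value: P(disease | test positive)."""
    return truth[test == 1].mean()

diffs = []
for _ in range(2000):
    idx = rng.integers(0, n, n)      # resample patients, keeping test pairs
    diffs.append(ppv(test1[idx], disease[idx]) - ppv(test2[idx], disease[idx]))

print("95% bootstrap CI for PPV1 - PPV2:", np.percentile(diffs, [2.5, 97.5]))
```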
Nursing genetics and genomics: The International Society of Nurses in Genetics (ISONG) survey.
Hickey, Kathleen T; Taylor, Jacquelyn Y; Barr, Taura L; Hauser, Nicole R; Jia, Haomiao; Riga, Teresa C; Katapodi, Maria
2018-04-01
The International Society of Nurses in Genetics (ISONG) fosters scientific and professional development in the discovery, interpretation, and application of genomic information in nursing research, education, and clinical practice. The objective was to assess the genomic-related activities of ISONG members in research, education and practice, and their competencies to serve as global leaders in genomics, via a cross-sectional survey (21 items) assessing genomic-related training, knowledge, and practice. An email invitation included a link to the anonymous online survey, and all ISONG members (n = 350 globally) were invited to participate. Descriptive statistics and the Wilcoxon rank-sum test were used for between-group comparisons. Respondents (n = 231, 66%) were mostly Caucasian and female, with a master's degree or higher. Approximately 70% wanted to incorporate genomics in research, teaching, and practice. More than half reported high genomic competency, and over 95% reported that genomics will be relevant over the next 5 years. The findings provide a foundation for developing additional educational programs for an international nursing workforce in genomics. Copyright © 2018. Published by Elsevier Ltd.
Swahn, Monica H; Ali, Bina; Palmier, Jane B; Sikazwe, George; Mayeya, John
2011-01-01
This study examines the associations of alcohol marketing strategies and alcohol education (including knowledge about the dangers of alcohol and refusal of alcohol) with drinking prevalence, problem drinking, and drunkenness. Analyses are based on the Global School-Based Student Health Survey (GSHS) conducted in Zambia (2004) among students primarily 11 to 16 years of age (N = 2257). Four statistical models were computed to test the associations between alcohol marketing and education and alcohol use, while controlling for possible confounding factors. Alcohol marketing, specifically the provision of free alcohol through a company representative, was associated with drunkenness (AOR = 1.49; 95% CI: 1.09-2.02) and problem drinking (AOR = 1.41; 95% CI: 1.06-1.87) among youth after controlling for demographic characteristics, risky behaviors, and alcohol education. However, alcohol education was not associated with drunkenness or problem drinking. These findings underscore the importance of restricting alcohol marketing practices as a policy strategy for reducing alcohol use and its dire consequences among vulnerable youth.
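A hedged sketch of the kind of adjusted-odds-ratio model reported above (logistic regression with covariate adjustment); the variable names and data are invented and this is not the study's actual model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(5)
df = pd.DataFrame({
    "drunkenness": rng.integers(0, 2, 2257),   # outcome
    "free_alcohol": rng.integers(0, 2, 2257),  # marketing exposure (invented)
    "age": rng.integers(11, 17, 2257),
    "male": rng.integers(0, 2, 2257),
})

X = sm.add_constant(df[["free_alcohol", "age", "male"]])
fit = sm.Logit(df["drunkenness"], X).fit(disp=0)
print(np.exp(fit.params))       # adjusted odds ratios
print(np.exp(fit.conf_int()))   # 95% confidence intervals
```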
Satellite orbit and data sampling requirements
NASA Technical Reports Server (NTRS)
Rossow, William
1993-01-01
Climate forcings and feedbacks vary over a wide range of time and space scales. The operation of non-linear feedbacks can couple variations at widely separated time and space scales and cause climatological phenomena to be intermittent. Consequently, monitoring of global, decadal changes in climate requires global observations that cover the whole range of space-time scales and are continuous over several decades. The sampling of smaller space-time scales must have sufficient statistical accuracy to measure the small changes in the forcings and feedbacks anticipated in the next few decades, while continuity of measurements is crucial for unambiguous interpretation of climate change. Shorter records of monthly and regional (500-1000 km) measurements with similar accuracies can also provide valuable information about climate processes, when 'natural experiments' such as large volcanic eruptions or El Ninos occur. In this section existing satellite datasets and climate model simulations are used to test the satellite orbits and sampling required to achieve accurate measurements of changes in forcings and feedbacks at monthly frequency and 1000 km (regional) scale.
NASA Astrophysics Data System (ADS)
Armal, S.; Devineni, N.; Khanbilvardi, R.
2017-12-01
This study presents a systematic analysis for identifying and attributing trends in the annual frequency of extreme rainfall events across the contiguous United States to climate change and climate variability modes. A Bayesian multilevel model is developed for 1,244 stations simultaneously to test the null hypothesis of no trend and to verify two alternate hypotheses: the trend can be attributed to changes in global surface temperature anomalies, or to a combination of cyclical climate modes with varying quasi-periodicities and global surface temperature anomalies. The Bayesian multilevel model provides the opportunity to pool information across stations and reduce parameter estimation uncertainty, hence identifying the trends better. The choice of the best alternate hypothesis is made based on the Watanabe-Akaike Information Criterion, a Bayesian pointwise predictive accuracy measure. Statistically significant time trends are observed in 742 of the 1,244 stations. Trends in 409 of these stations can be attributed to changes in global surface temperature anomalies. These stations are predominantly found in the Southeast and Northeast climate regions. The trends in 274 of these stations can be attributed to the El Niño-Southern Oscillation, North Atlantic Oscillation, Pacific Decadal Oscillation and Atlantic Multi-Decadal Oscillation, along with changes in global surface temperature anomalies. These stations are mainly found in the Northwest, West and Southwest climate regions.
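A deliberately simplified, single-station analogue of the attribution idea (the study itself pools 1,244 stations in a Bayesian multilevel model): regress annual extreme-event counts on global temperature anomalies with a Poisson GLM. All data below are synthetic.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
years = np.arange(1950, 2015)
gta = 0.01 * (years - 1950) + rng.normal(0, 0.1, years.size)  # temp anomalies
counts = rng.poisson(np.exp(0.5 + 0.8 * gta))                 # extreme events/yr

X = sm.add_constant(gta)
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
# A positive, significant slope on the anomaly term supports attribution
print(fit.params, fit.pvalues)
```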
Zhang, Honghua; Xia, Mingying; Qi, Lijie; Dong, Lei; Song, Shuang; Ma, Teng; Yang, Shuping; Jin, Li; Li, Liming; Li, Shilin
2016-05-01
Estimating the allele frequencies and forensic statistical parameters of commonly used short tandem repeat (STR) loci in the Uyghur population, the fifth largest group in China, provides a more precise reference database for forensic investigation. The 6-dye GlobalFiler™ Express PCR Amplification kit incorporates 21 autosomal STRs, which have been proven to provide reliable DNA typing results and to enhance the power of discrimination. Here we analyzed the GlobalFiler STR loci in 1962 unrelated individuals from the Chinese Uyghur population of Xinjiang, China. No significant deviations from Hardy-Weinberg equilibrium or linkage disequilibrium were detected within or between the GlobalFiler STR loci. SE33 showed the greatest power of discrimination in the Uyghur population, whereas TPOX showed the lowest. The combined power of discrimination was 99.999999999999999999999998746%. No significant difference was observed between the Uyghur population and the other two Uyghur populations at any of the tested STRs, nor between the Uyghur and the Dai and Mongolian populations. Significant differences were observed only between the Uyghur and other Chinese populations at TH01, Central-South Asians at D13S317, and East Asians at TH01 and VWA. Phylogenetic analysis showed that the Uyghur population is genetically close to the Chinese populations, as well as to East Asian and Central-South Asian populations. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
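For readers unfamiliar with the forensic parameters mentioned, here is a minimal sketch of the power of discrimination for a single locus, computed from genotype frequencies; the genotype list is an invented toy example, not population data.

```python
from collections import Counter

# Power of discrimination for one locus: PD = 1 - sum(genotype_freq^2),
# i.e. one minus the probability that two random individuals match.
genotypes = ["12/14", "12/12", "13/14", "12/14", "14/15", "12/13"]  # toy data
counts = Counter(genotypes)
n = sum(counts.values())
pm = sum((c / n) ** 2 for c in counts.values())  # probability of matching
print("PD =", 1 - pm)
```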
Global impacts of the 1980s regime shift.
Reid, Philip C; Hari, Renata E; Beaugrand, Grégory; Livingstone, David M; Marty, Christoph; Straile, Dietmar; Barichivich, Jonathan; Goberville, Eric; Adrian, Rita; Aono, Yasuyuki; Brown, Ross; Foster, James; Groisman, Pavel; Hélaouët, Pierre; Hsu, Huang-Hsiung; Kirby, Richard; Knight, Jeff; Kraberg, Alexandra; Li, Jianping; Lo, Tzu-Ting; Myneni, Ranga B; North, Ryan P; Pounds, J Alan; Sparks, Tim; Stübi, René; Tian, Yongjun; Wiltshire, Karen H; Xiao, Dong; Zhu, Zaichun
2016-02-01
Despite evidence from a number of Earth systems that abrupt temporal changes known as regime shifts are important, their nature, scale and mechanisms remain poorly documented and understood. Applying principal component analysis, change-point analysis and a sequential t-test analysis of regime shifts to 72 time series, we confirm that the 1980s regime shift represented a major change in the Earth's biophysical systems from the upper atmosphere to the depths of the ocean and from the Arctic to the Antarctic, and occurred at slightly different times around the world. Using historical climate model simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5) and statistical modelling of historical temperatures, we then demonstrate that this event was triggered by rapid global warming from anthropogenic plus natural forcing, the latter associated with the recovery from the El Chichón volcanic eruption. The shift in temperature that occurred at this time is hypothesized as the main forcing for a cascade of abrupt environmental changes. Within the context of the last century or more, the 1980s event was unique in terms of its global scope and scale; our observed consequences imply that if unavoidable natural events such as major volcanic eruptions interact with anthropogenic warming unforeseen multiplier effects may occur. © 2015 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
Mapping the global health employment market: an analysis of global health jobs.
Keralis, Jessica M; Riggin-Pathak, Brianne L; Majeski, Theresa; Pathak, Bogdan A; Foggia, Janine; Cullinen, Kathleen M; Rajagopal, Abbhirami; West, Heidi S
2018-02-27
The number of university global health training programs has grown in recent years. However, there is little research on the needs of the global health profession. We therefore set out to characterize the global health employment market by analyzing global health job vacancies. We collected data from advertised, paid positions posted to web-based job boards, email listservs, and global health organization websites from November 2015 to May 2016. Data on requirements for education, language proficiency, technical expertise, physical location, and experience level were analyzed for all vacancies. Descriptive statistics were calculated for the aforementioned job characteristics. Associations between technical specialty area and requirements for non-English language proficiency and overseas experience were calculated using Chi-square statistics. A qualitative thematic analysis was performed on a subset of vacancies. We analyzed the data from 1007 global health job vacancies from 127 employers. Among private and non-profit sector vacancies, 40% (n = 354) were for technical or subject matter experts, 20% (n = 177) for program directors, and 16% (n = 139) for managers, compared to 9.8% (n = 87) for entry-level and 13.6% (n = 120) for mid-level positions. The most common technical focus area was program or project management, followed by HIV/AIDS and quantitative analysis. Thematic analysis demonstrated a common emphasis on program operations, relations, design and planning, communication, and management. Our analysis shows a demand for candidates with several years of experience with global health programs, particularly program managers/directors and technical experts, with very few entry-level positions accessible to recent graduates of global health training programs. It is unlikely that global health training programs equip graduates to be competitive for the majority of positions that are currently available in this field.
NASA Astrophysics Data System (ADS)
Anishchenko, V. S.; Boev, Ya. I.; Semenova, N. I.; Strelkova, G. I.
2015-07-01
We review rigorous and numerical results on the statistics of Poincaré recurrences which are related to the modern development of the Poincaré recurrence problem. We analyze and describe the rigorous results which are achieved both in the classical (local) approach and in the recently developed global approach. These results are illustrated by numerical simulation data for simple chaotic and ergodic systems. It is shown that the basic theoretical laws can be applied to noisy systems if the probability measure is ergodic and stationary. Poincaré recurrences are studied numerically in nonautonomous systems. Statistical characteristics of recurrences are analyzed in the framework of the global approach for the cases of positive and zero topological entropy. We show that for positive entropy, there is a relationship between the Afraimovich-Pesin dimension, Lyapunov exponents and the Kolmogorov-Sinai entropy, both in the absence and in the presence of external noise. The case of zero topological entropy is exemplified by numerical results for the Poincaré recurrence statistics in the circle map. We show and prove that the dependence of minimal recurrence times on the return region size demonstrates universal properties for the golden and the silver ratio. The behavior of Poincaré recurrences is analyzed at the critical point of Feigenbaum attractor birth. We explore Poincaré recurrences for an ergodic set which is generated in the stroboscopic section of a nonautonomous oscillator and is similar to a circle shift. Based on the obtained results we show how the Poincaré recurrence statistics can be applied for solving a number of nonlinear dynamics issues. We propose and illustrate alternative methods for diagnosing effects of external and mutual synchronization of chaotic systems in the context of the local and global approaches. The properties of the recurrence time probability density can be used to detect the stochastic resonance phenomenon. We also discuss how the fractal dimension of chaotic attractors can be estimated using the Poincaré recurrence statistics.
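A minimal numerical experiment in the spirit of the local approach: record Poincaré return times of the chaotic logistic map to a small ball and compare the mean against the expectation from Kac's lemma. The parameters are arbitrary choices for the sketch.

```python
import numpy as np

x0, r, eps = 0.3, 4.0, 0.01     # reference point, map parameter, ball radius
x, last, times = x0, 0, []
for n in range(1, 200_000):
    x = r * x * (1 - x)          # chaotic logistic map
    if abs(x - x0) < eps:        # returned to the epsilon-neighbourhood of x0
        times.append(n - last)
        last = n

times = np.array(times)
# Kac's lemma: mean recurrence time ~ 1 / (invariant measure of the ball)
print("mean recurrence time:", times.mean(), "from", times.size, "returns")
```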
Texture analysis with statistical methods for wheat ear extraction
NASA Astrophysics Data System (ADS)
Bakhouche, M.; Cointault, F.; Gouton, P.
2007-01-01
In the agronomic domain, the simplification of crop counting, necessary for yield prediction and agronomic studies, is an important project for technical institutes such as Arvalis. Although the main objective of our global project is to design a mobile robot for natural image acquisition directly in the field, Arvalis first proposed that we detect the number of wheat ears in images by image processing before counting them, which will yield the first component of the yield estimate. In this paper we compare different texture-based image segmentation techniques, based on feature extraction by first- and higher-order statistical methods, which have been applied to our images. The extracted features are used for unsupervised pixel classification to obtain the different classes in the image. The K-means algorithm is thus applied before choosing a threshold to highlight the ears. Three methods have been tested in this feasibility study, with an average error of 6%. Although the quality of the detection is currently evaluated visually, automatic evaluation algorithms are being implemented. Moreover, other higher-order statistical methods will be implemented in the future, jointly with methods based on spatio-frequency transforms and specific filtering.
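A hedged miniature of the described pipeline: first-order statistical texture features on image patches followed by K-means into two classes. The image is synthetic, and the paper's actual feature set and window choices are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
img = rng.random((256, 256))                  # placeholder grey-level image

# Split into 8x8 patches: (32, 32, 64) flattened to 1024 patch vectors
patches = img.reshape(32, 8, 32, 8).transpose(0, 2, 1, 3).reshape(-1, 64)

# First-order statistical features per patch: mean, std, skew-like moment
centered = patches - patches.mean(axis=1, keepdims=True)
feats = np.c_[patches.mean(axis=1), patches.std(axis=1), (centered**3).mean(axis=1)]

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(feats)
ear_mask = labels.reshape(32, 32)             # candidate ear/background map
```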
Statistical downscaling of precipitation using long short-term memory recurrent neural networks
NASA Astrophysics Data System (ADS)
Misra, Saptarshi; Sarkar, Sudeshna; Mitra, Pabitra
2017-11-01
Hydrological impacts of global climate change on a regional scale are generally assessed by downscaling large-scale climatic variables, simulated by General Circulation Models (GCMs), to regional, small-scale hydrometeorological variables like precipitation and temperature. In this study, we propose a new statistical downscaling model based on a recurrent neural network with long short-term memory (LSTM) which captures the spatio-temporal dependencies in local rainfall. Previous studies have used several other methods, such as linear regression, quantile regression, kernel regression, beta regression, and artificial neural networks. Deep neural networks and recurrent neural networks have been shown to be highly promising in modeling complex and highly non-linear relationships between input and output variables in different domains, and hence we investigated their performance in the task of statistical downscaling. We have tested this model on two datasets: one on precipitation in the Mahanadi basin in India and the second on precipitation in the Campbell River basin in Canada. Our autoencoder-coupled long short-term memory recurrent neural network model performs best compared to other existing methods on both datasets with respect to temporal cross-correlation, mean squared error, and capturing of extremes.
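A minimal Keras sketch of an LSTM regressor for this kind of downscaling task; the window length, layer sizes, and data are placeholders, and the paper's autoencoder coupling is omitted for brevity.

```python
import numpy as np
import tensorflow as tf

T, F = 30, 8                                     # window length, GCM predictors
X = np.random.rand(1000, T, F).astype("float32") # predictor sequences (invented)
y = np.random.rand(1000, 1).astype("float32")    # local rainfall target

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(T, F)),   # temporal dependencies
    tf.keras.layers.Dense(1, activation="relu"),    # rainfall is non-negative
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```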
DOE Office of Scientific and Technical Information (OSTI.GOV)
Somerville, Richard
2013-08-22
The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the present project has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. The first is that it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. The second is that this work has brought powerful new tools to bear on the problem, because it has been a collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques that are well-suited for diagnosing model cloud simulations using ARM observations (Shen).
Long-term Trends and Variability of Eddy Activities in the South China Sea
NASA Astrophysics Data System (ADS)
Zhang, M.; von Storch, H.
2017-12-01
For constructing empirical downscaling models and projecting possible future states of eddy activity in the South China Sea (SCS), long-term statistical characteristics of SCS eddies are needed. We use a daily global eddy-resolving model product named STORM covering the period 1950-2010. This simulation employed the MPI-OM model with a mean horizontal resolution of 10 km and was driven by the NCEP Reanalysis-1 data set. An eddy detection and tracking algorithm operating on the gridded sea surface height anomaly (SSHA) fields was developed, and a set of parameters for the detection criteria in the SCS was determined through sensitivity tests. Our method detected more than 6000 eddy tracks in the South China Sea. For all of them, eddy diameters, track length, eddy intensity, eddy lifetime and eddy frequency were determined, and the long-term trends and variability of these properties were derived. Most of the eddies propagate westward. Nearly 100 eddies travel longer than 1000 km, and over 800 eddies have a lifespan of more than 2 months. Furthermore, for building the statistical empirical model, the relationship between SCS eddy statistics and large-scale atmospheric and oceanic phenomena has been investigated.
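A hedged sketch of the first stage such an SSHA-based detector typically performs: flagging local extrema above an amplitude threshold as candidate eddy cores. The field, window size, and threshold below are invented, not the study's settings.

```python
import numpy as np
from scipy import ndimage

ssha = np.random.default_rng(8).normal(0, 0.05, (200, 300))  # SSHA field (m)
amp_thresh = 0.08                                            # invented threshold

# A grid point is a local extremum if it equals the max/min in its window
local_max = ndimage.maximum_filter(ssha, size=9) == ssha
local_min = ndimage.minimum_filter(ssha, size=9) == ssha

anticyclonic = np.argwhere(local_max & (ssha > amp_thresh))  # warm-core cores
cyclonic = np.argwhere(local_min & (ssha < -amp_thresh))     # cold-core cores
print(len(anticyclonic), "anticyclonic and", len(cyclonic), "cyclonic candidates")
```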
Statistical polarization in greenhouse gas emissions: Theory and evidence.
Remuzgo, Lorena; Trueba, Carmen
2017-11-01
The current debate on climate change is over whether global warming can be limited in order to lessen its impacts. In this sense, evidence of a decrease in statistical polarization in greenhouse gas (GHG) emissions could encourage countries to establish a stronger multilateral climate change agreement. Based on the interregional and intraregional components of the multivariate generalised entropy measures (Maasoumi, 1986), Gigliarano and Mosler (2009) proposed studying the statistical polarization concept from a multivariate point of view. In this paper, we apply this approach to study the evolution of this phenomenon in the global distribution of the main GHGs. The empirical analysis was carried out for the period 1990-2011, considering an endogenous grouping of countries (Aghevli and Mehran, 1981; Davies and Shorrocks, 1989). Most of the statistical polarization indices showed a slightly increasing pattern that was similar regardless of the number of groups considered. Finally, some policy implications are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
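As a simplified univariate analogue of the entropy-based polarization idea (the paper's measures are multivariate), one can decompose a Theil-type index into between-group and within-group parts; a high between-group share indicates a more polarized distribution. All values below are invented.

```python
import numpy as np

emis = np.array([2.1, 1.8, 2.5, 9.0, 11.2, 10.4])  # per-capita GHG (invented)
group = np.array([0, 0, 0, 1, 1, 1])               # endogenous country groups

mu = emis.mean()
theil = np.mean(emis / mu * np.log(emis / mu))     # total Theil T index

# Between-group component: groups replaced by their means
between = sum(
    (emis[group == g].size / emis.size)
    * (emis[group == g].mean() / mu)
    * np.log(emis[group == g].mean() / mu)
    for g in (0, 1)
)
print(f"Theil = {theil:.3f}, between-group share = {between / theil:.2f}")
```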
[The research protocol VI: How to choose the appropriate statistical test. Inferential statistics].
Flores-Ruiz, Eric; Miranda-Novales, María Guadalupe; Villasís-Keever, Miguel Ángel
2017-01-01
Statistical analysis can be divided into two main components: descriptive analysis and inferential analysis. Inference consists of drawing conclusions from tests performed on data obtained from a sample of a population. Statistical tests are used to establish the probability that a conclusion obtained from a sample is applicable to the population from which the sample was drawn. However, choosing the appropriate statistical test generally poses a challenge for novice researchers. To choose the right statistical test, it is necessary to take into account three aspects: the research design, the number of measurements and the scale of measurement of the variables. Statistical tests are divided into two families, parametric and nonparametric. Parametric tests can only be used if the data show a normal distribution. Choosing the right statistical test will make it easier for readers to understand and apply the results.
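A toy sketch of the decision the article describes: check the distributional assumption first, then pick a parametric or nonparametric test accordingly. The data are invented.

```python
import numpy as np
from scipy.stats import shapiro, ttest_ind, mannwhitneyu

rng = np.random.default_rng(12)
a = rng.normal(5, 1, 30)        # roughly normal sample
b = rng.exponential(5, 30)      # clearly non-normal sample

# Shapiro-Wilk normality check on each group, then choice of test
if shapiro(a).pvalue > 0.05 and shapiro(b).pvalue > 0.05:
    print(ttest_ind(a, b))      # parametric: Student's t test
else:
    print(mannwhitneyu(a, b))   # nonparametric alternative
```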
Global vegetation distribution and terrestrial climate evolution at the Eocene-Oligocene transition
NASA Astrophysics Data System (ADS)
Pound, Matthew; Salzmann, Ulrich
2016-04-01
The Eocene-Oligocene transition (EOT; ca. 34-33.5 Ma) is widely considered to be the biggest step in Cenozoic climate evolution. Geochemical marine records show both surface and bottom water cooling, associated with the expansion of Antarctic glaciers and a reduction in the atmospheric CO2 concentration. However, the global response of the terrestrial biosphere to the EOT is less well understood and not uniform when comparing different regions. We present new global vegetation and terrestrial climate reconstructions of the Priabonian (late Eocene; 38-33.9 Ma) and Rupelian (early Oligocene; 33.9-28.45 Ma) by synthesising 215 pollen and spore localities. Using presence/absence data of pollen and spores with multivariate statistics has allowed the reconstruction of palaeo-biomes without relying on modern analogues. The reconstructed palaeo-biomes do not show the equator-ward shift at the EOT that would be expected from a global cooling. Reconstructions of mean annual temperature, cold month mean temperature and warm month mean temperature do not show a global cooling of terrestrial climate across the EOT. Our new reconstructions differ from previous global syntheses by being based on an internally consistent, statistically defined classification of palaeo-biomes, and our terrestrial-based climate reconstructions are in stark contrast to some marine-based climate estimates. Our results raise new questions on the nature and extent of terrestrial global climate change at the EOT.
New heterogeneous test statistics for the unbalanced fixed-effect nested design.
Guo, Jiin-Huarng; Billard, L; Luh, Wei-Ming
2011-05-01
When the underlying variances are unknown and/or unequal, using the conventional F test is problematic in the two-factor hierarchical data structure. Prompted by the approximate test statistics (Welch and Alexander-Govern methods), the authors develop four new heterogeneous test statistics to test factor A and factor B nested within A for the unbalanced fixed-effect two-stage nested design under variance heterogeneity. The actual significance levels and statistical power of the test statistics were compared in a simulation study. The results show that the proposed procedures maintain better Type I error rate control and have greater statistical power than the conventional F test under various conditions. Therefore, the proposed test statistics are recommended in terms of robustness and ease of implementation. ©2010 The British Psychological Society.
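The Alexander-Govern test the authors build on is available in recent SciPy (assumed >= 1.7); below is a minimal comparison with the classical F test on heteroscedastic toy data, not a reproduction of the paper's nested-design statistics.

```python
import numpy as np
from scipy.stats import f_oneway, alexandergovern

rng = np.random.default_rng(9)
g1 = rng.normal(10, 1, 8)       # small variance, small n
g2 = rng.normal(10, 5, 30)      # large variance, larger n
g3 = rng.normal(12, 3, 15)

print(f_oneway(g1, g2, g3))          # classical F: assumes equal variances
print(alexandergovern(g1, g2, g3))   # heterogeneity-robust alternative
```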
NASA Astrophysics Data System (ADS)
Lopez, S. R.; Hogue, T. S.
2011-12-01
Global climate models (GCMs) are primarily used to generate historical and future large-scale circulation patterns at a coarse resolution (typically on the order of 50,000 km2) and fail to capture climate variability at the ground level due to localized surface influences (i.e., topography, marine layer, land cover, etc.). Their inability to accurately resolve these processes has led to the development of numerous 'downscaling' techniques. The goal of this study is to enhance statistical downscaling of daily precipitation and temperature for regions with heterogeneous land cover and topography. Our analysis was divided into two periods, historical (1961-2000) and contemporary (1980-2000), and tested using sixteen predictand combinations from four GCMs (GFDL CM2.0, GFDL CM2.1, CNRM-CM3 and MRI-CGCM2 3.2a). The Southern California area was separated into five county regions: Santa Barbara, Ventura, Los Angeles, Orange and San Diego. Principal component analysis (PCA) was performed on ground-based observations in order to (1) reduce the number of redundant gauges and minimize dimensionality and (2) cluster gauges that behave statistically similarly for post-analysis. Post-PCA analysis included extensive testing of predictor-predictand relationships using an enhanced canonical correlation analysis (ECCA). The ECCA involves obtaining the optimal predictand sets for all models within each spatial domain (county), as governed by overall daily and monthly statistics. Results show that all models maintain mean annual and monthly behavior within each county and that daily statistics are improved. The level of improvement depends strongly on the vegetation extent within each county and the land-to-ocean ratio within the GCM spatial grid. Utilization of the entire historical period also leads to better statistical representation of observed daily precipitation. The validated ECCA technique is being applied to future climate scenarios distributed by the IPCC in order to provide forcing data for regional hydrologic models and to assess future water resources in the Southern California region.
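A minimal PCA-then-CCA pipeline of the kind described, with random placeholders for gauge records and GCM predictors; the study's ECCA enhancements are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(10)
gauges = rng.normal(size=(7300, 40))    # daily precipitation at 40 gauges
gcm = rng.normal(size=(7300, 12))       # coarse GCM predictors (invented)

pcs = PCA(n_components=5).fit_transform(gauges)  # de-redundant gauge modes
cca = CCA(n_components=3).fit(gcm, pcs)          # predictor-predictand link
u, v = cca.transform(gcm, pcs)                   # canonical variates
print(np.corrcoef(u[:, 0], v[:, 0])[0, 1])       # leading canonical correlation
```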
Paleomagnetism.org: An online multi-platform open source environment for paleomagnetic data analysis
NASA Astrophysics Data System (ADS)
Koymans, Mathijs R.; Langereis, Cor G.; Pastor-Galán, Daniel; van Hinsbergen, Douwe J. J.
2016-08-01
This contribution provides an overview of Paleomagnetism.org, an open-source, multi-platform online environment for paleomagnetic data analysis. Paleomagnetism.org provides an interactive environment where paleomagnetic data can be interpreted, evaluated, visualized, and exported. The application is split into an interpretation portal, a statistics portal, and a portal for miscellaneous paleomagnetic tools. In the interpretation portal, principal component analysis can be performed on visualized demagnetization diagrams. Interpreted directions and great circles can be combined to find great-circle solutions. These directions can be used in the statistics portal, or exported as data and figures. The tools in the statistics portal cover standard Fisher statistics for directions and VGPs, including other statistical parameters used as reliability criteria. Other available tools include an eigenvector-approach foldtest, two reversal tests including a Monte Carlo simulation on mean directions, and a coordinate bootstrap on the original data. An implementation is included for the detection and correction of inclination shallowing in sediments following TK03.GAD. Finally, we provide a module to visualize VGPs and expected paleolatitudes, declinations, and inclinations relative to widely used global apparent polar wander path models in the coordinates of major continent-bearing plates. The tools in the miscellaneous portal include a net tectonic rotation (NTR) analysis to restore a body to its paleo-vertical and a bootstrapped oroclinal test using linear regression techniques, including a modified foldtest around a vertical axis. Paleomagnetism.org provides an integrated approach for researchers to work with visualized (e.g. hemisphere projections, Zijderveld diagrams) paleomagnetic data. The application constructs a custom exportable file that can be shared freely and included in public databases. This exported file contains all data and can later be imported into the application by other researchers. The accessibility and simplicity with which paleomagnetic data can be interpreted, analyzed, visualized, and shared makes Paleomagnetism.org of interest to the community.
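A from-scratch sketch of the Fisher statistics at the core of the statistics portal (resultant length, precision parameter k, alpha95, and the mean direction) on a handful of invented directions; this is illustrative, not the site's implementation.

```python
import numpy as np

dec = np.radians([350.0, 2.0, 355.0, 8.0, 352.0])  # declinations (invented)
inc = np.radians([45.0, 50.0, 48.0, 52.0, 47.0])   # inclinations (invented)

# Unit vectors (N, E, down), resultant length R, Fisher k and alpha95
xyz = np.c_[np.cos(inc) * np.cos(dec), np.cos(inc) * np.sin(dec), np.sin(inc)]
R = np.linalg.norm(xyz.sum(axis=0))
n = len(dec)
k = (n - 1) / (n - R)
a95 = np.degrees(np.arccos(1 - (n - R) / R * ((1 / 0.05) ** (1 / (n - 1)) - 1)))

mean = xyz.sum(axis=0) / R
mdec = np.degrees(np.arctan2(mean[1], mean[0])) % 360
minc = np.degrees(np.arcsin(mean[2]))
print(f"mean dec/inc = {mdec:.1f}/{minc:.1f}, k = {k:.1f}, alpha95 = {a95:.1f}")
```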
A Statistical Multimodel Ensemble Approach to Improving Long-Range Forecasting in Pakistan
2012-03-01
Long-range forecasts are generated by multiple regression models that relate globally distributed oceanic and atmospheric predictors to local predictands.
What contribution can international relations make to the evolving global health agenda?
Davies, Sara E
2010-01-01
This article presents two approaches that have dominated International Relations thinking on the international politics of health. The statist approach, which is primarily security-focused, seeks to link health initiatives to a foreign or defence policy remit. The globalist approach, in contrast, seeks to advance health not because of its intrinsic security value but because it advances the well-being and rights of individuals. This article charts the evolution of these approaches and demonstrates why both have the potential to shape our understanding of the evolving global health agenda. It examines how the statist and globalist perspectives have helped shape contemporary initiatives in global health governance and suggests that there is evidence of an emerging convergence between the two perspectives. This convergence is particularly clear in the articulation of a number of UN initiatives in this area - especially the One World, One Health Strategic Framework and the Oslo Ministerial Declaration (2007), which inspired the first UN General Assembly resolution on global health and foreign policy in 2009 and the UN Secretary-General's note "Global health and foreign policy: strategic opportunities and challenges". What remains to be seen is whether this convergence will deliver on securing states' interest long enough to promote the interests of the individuals who require global efforts to deliver local health improvements.
Martin, Lisa; Watanabe, Sharon; Fainsinger, Robin; Lau, Francis; Ghosh, Sunita; Quan, Hue; Atkins, Marlis; Fassbender, Konrad; Downing, G Michael; Baracos, Vickie
2010-10-01
To determine whether elements of a standard nutritional screening assessment are independently prognostic of survival in patients with advanced cancer, a prospective nested cohort of patients with metastatic cancer was accrued from different units of a Regional Palliative Care Program. Patients completed a nutritional screen on admission. Data included age, sex, cancer site, height, weight history, dietary intake, 13 nutrition impact symptoms, and patient- and physician-reported performance status (PS). Univariate and multivariate survival analyses were conducted. Concordance statistics (c-statistics) were used to test the predictive accuracy of models based on training and validation sets; a c-statistic of 0.5 indicates that the model predicts the outcome no better than chance, while perfect prediction has a c-statistic of 1.0. A training set of patients in palliative home care (n = 1,164) was used to identify prognostic variables. Primary disease site, PS, short-term weight change (either gain or loss), dietary intake, and dysphagia predicted survival in multivariate analysis (P < .05). A model including only primary disease site and PS showed high concordance between predicted and observed survival in both the training set (c-statistic 0.90) and the validation set (0.88; n = 603). The addition of weight change, dietary intake, and dysphagia did not further improve the c-statistic of the model, nor was the c-statistic altered by substituting physician-rated palliative PS for patient-reported PS. We demonstrate a high probability of concordance between predicted and observed survival for patients in distinct palliative care settings (home care, tertiary inpatient, ambulatory outpatient) based on patient-reported information.
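A from-first-principles illustration of the c-statistic defined above, ignoring censoring and ties for brevity (real survival analyses use estimators that handle both); the data are invented.

```python
import numpy as np
from itertools import combinations

surv = np.array([3.0, 10.0, 7.0, 1.0, 12.0])   # survival times (months)
risk = np.array([0.9, 0.2, 0.5, 0.95, 0.1])    # higher = predicted worse

pairs = concordant = 0
for i, j in combinations(range(len(surv)), 2):
    if surv[i] != surv[j]:                      # usable (orderable) pair
        pairs += 1
        # Concordant if the higher-risk patient died sooner
        concordant += (surv[i] < surv[j]) == (risk[i] > risk[j])

print("c-statistic =", concordant / pairs)      # 0.5 = chance, 1.0 = perfect
```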
Data resource profile: United Nations Children's Fund (UNICEF).
Murray, Colleen; Newby, Holly
2012-12-01
The United Nations Children's Fund (UNICEF) plays a leading role in the collection, compilation, analysis and dissemination of data to inform sound policies, legislation and programmes for promoting children's rights and well-being, and for global monitoring of progress towards the Millennium Development Goals. UNICEF maintains a set of global databases representing nearly 200 countries and covering the areas of child mortality, child health, maternal health, nutrition, immunization, water and sanitation, HIV/AIDS, education and child protection. These databases consist of internationally comparable and statistically sound data, and are updated annually through a process that draws on a wealth of data provided by UNICEF's wide network of >150 field offices. The databases are composed primarily of estimates from household surveys, with data from censuses, administrative records, vital registration systems and statistical models contributing to some key indicators as well. The data are assessed for quality based on a set of objective criteria to ensure that only the most reliable nationally representative information is included. For most indicators, data are available at the global, regional and national levels, plus sub-national disaggregation by sex, urban/rural residence and household wealth. The global databases are featured in UNICEF's flagship publications, inter-agency reports, including the Secretary General's Millennium Development Goals Report and Countdown to 2015, sector-specific reports and statistical country profiles. They are also publicly available on www.childinfo.org, together with trend data and equity analyses.
Analysis of the Einstein sample of early-type galaxies
NASA Technical Reports Server (NTRS)
Eskridge, Paul B.; Fabbiano, Giuseppina
1993-01-01
The EINSTEIN galaxy catalog contains x-ray data for 148 early-type (E and S0) galaxies. We present a detailed analysis of the global properties of this sample. By comparing the x-ray properties with other tracers of the ISM, as well as with observables related to the stellar dynamics and populations of the sample, we expect to determine more clearly the physical relationships that determine the evolution of early-type galaxies. Previous studies with smaller samples have explored the relationships between x-ray luminosity (L(sub X)) and luminosities in other bands. Using our larger sample and the statistical techniques of survival analysis, a number of these earlier analyses were repeated. For our full sample, a strong statistical correlation is found between L(sub X) and L(sub B) (the probability that the null hypothesis is upheld is P less than 10(exp -4)) from a variety of rank correlation tests. Regressions with several algorithms yield consistent results.
Effects of Hydrological Parameters on Palm Oil Fresh Fruit Bunch Yield
NASA Astrophysics Data System (ADS)
Nda, M.; Adnan, M. S.; Suhadak, M. A.; Zakaria, M. S.; Lopa, R. T.
2018-04-01
Climate change effects and variability have been studied by many researchers in diverse geophysical fields. Malaysia produces a large volume of palm oil, and the effects of climate change on hydrological parameters (rainfall and temperature) could adversely affect palm oil fresh fruit bunch (FFB) production, with implications for both local and international markets. It is important to understand the effects of climate change on crop yield in order to adopt new cultivation techniques and guarantee food security globally. Against this background, the paper's objective is to investigate the effects of rainfall and temperature patterns on crop yield (FFB) over a five-year period (2013-2017) in the Batu Pahat District. The Mann-Kendall rank technique (trend test) and statistical analyses (correlation and regression) were applied to the dataset used for the study. The results reveal variability in rainfall and temperature from one month to the next, and the statistical analysis shows that the hydrological parameters have an insignificant effect on crop yield.
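As an aside, the Mann-Kendall trend test named above is simple to implement; the following is a generic sketch (using the no-ties variance approximation and a synthetic rainfall series), not the study's code.

```python
# Minimal Mann-Kendall trend test (no-ties variance approximation);
# a generic sketch of the method named above, applied to a synthetic series.
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    n = len(x)
    # S = number of increasing pairs minus number of decreasing pairs
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))   # two-sided p-value
    return z, p

monthly_rainfall = np.random.default_rng(1).gamma(2.0, 80.0, size=60)  # 5 years
z, p = mann_kendall(monthly_rainfall)
print(f"Z = {z:.2f}, p = {p:.3f}")  # p >= 0.05 -> no significant monotonic trend
```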
Bacci, Silvia; Seracini, Marco; Chiavarini, Manuela; Bartolucci, Francesco; Minelli, Liliana
2017-01-01
The aim of this study was to investigate the relationship between employment status (permanent employment, fixed-term employment, unemployment, other) and perceived health status in a sample of the Italian population. Data were obtained from the European Union Statistics on Income and Living Condition (EU-SILC) study during the period 2009-2012. The sample consists of 4,848 individuals, each with a complete record of observations during four years, for a total of 19,392 observations. The causal relationship between perceived/self-reported health status and employment status was tested using a global logit model (STATA). Our results confirm a significant association between employment status and perceived health, as well as between perceived health status and economic status. Unemployment that depended on an actual lack of work opportunities, and not on individual disability, was found to be the most significant determinant of perceived health status; a higher educational level produces a better perceived health status.
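To make the modelling step concrete, here is a hedged sketch of a pooled logit of poor self-rated health on employment status using statsmodels; the column names and data are hypothetical, and the sketch ignores the longitudinal structure the authors exploit.

```python
# Pooled logit of poor self-rated health on employment status; a sketch with
# synthetic data -- not the authors' longitudinal (global logit) specification.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
emp = rng.choice(["permanent", "fixed_term", "unemployed", "other"], size=500)
p_poor = {"permanent": 0.15, "fixed_term": 0.20, "unemployed": 0.35, "other": 0.25}
y = rng.random(500) < np.array([p_poor[e] for e in emp])
df = pd.DataFrame({"poor_health": y.astype(int), "employment": emp})

fit = smf.logit("poor_health ~ C(employment, Treatment('permanent'))", df).fit(disp=0)
print(fit.params)   # positive coefficients = higher odds of poor health vs permanent
```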
NASA Astrophysics Data System (ADS)
Sommer, Philipp S.; Kaplan, Jed O.
2017-10-01
While a wide range of Earth system processes occur at daily and even subdaily timescales, many global vegetation and other terrestrial dynamics models historically used monthly meteorological forcing both to reduce computational demand and because global datasets were lacking. Recently, dynamic land surface modeling has moved towards resolving daily and subdaily processes, and global datasets containing daily and subdaily meteorology have become available. These meteorological datasets, however, cover only the instrumental era of the last approximately 120 years at best, are subject to considerable uncertainty, and represent extremely large data files with associated computational costs of data input/output and file transfer. For periods before the recent past or in the future, global meteorological forcing can be provided by climate model output, but the quality of these data at high temporal resolution is low, particularly for daily precipitation frequency and amount. Here, we present GWGEN, a globally applicable statistical weather generator for the temporal downscaling of monthly climatology to daily meteorology. Our weather generator is parameterized using a global meteorological database and simulates daily values of five common variables: minimum and maximum temperature, precipitation, cloud cover, and wind speed. GWGEN is lightweight, modular, and requires a minimal set of monthly mean variables as input. The weather generator may be used in a range of applications, for example, in global vegetation, crop, soil erosion, or hydrological models. While GWGEN does not currently perform spatially autocorrelated multi-point downscaling of daily weather, this additional functionality could be implemented in future versions.
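The core of many such weather generators is a two-state Markov chain for daily precipitation occurrence combined with a distribution for wet-day amounts. The following toy sketch illustrates that idea only; it is not GWGEN's algorithm, and the transition probabilities and gamma parameters are invented for illustration.

```python
# Toy two-state Markov-chain precipitation generator: the classic core of
# monthly-to-daily weather generators. Not GWGEN's code; all parameters
# here are illustrative only.
import numpy as np

def generate_daily_precip(n_days, p_wet_given_dry, p_wet_given_wet,
                          gamma_shape, gamma_scale, seed=0):
    rng = np.random.default_rng(seed)
    precip = np.zeros(n_days)
    wet = False
    for d in range(n_days):
        p_wet = p_wet_given_wet if wet else p_wet_given_dry
        wet = rng.random() < p_wet
        if wet:
            precip[d] = rng.gamma(gamma_shape, gamma_scale)  # wet-day amount (mm)
    return precip

daily = generate_daily_precip(30, p_wet_given_dry=0.25, p_wet_given_wet=0.6,
                              gamma_shape=0.8, gamma_scale=6.0)
print(f"monthly total: {daily.sum():.1f} mm on {np.count_nonzero(daily)} wet days")
```

In a real generator the parameters would be fitted, month by month, so that the simulated daily series reproduce the prescribed monthly climatology.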
A simple, physically-based method for evaluating the economic costs of geo-engineering schemes
NASA Astrophysics Data System (ADS)
Garrett, T. J.
2009-04-01
The consumption of primary energy (e.g., coal, oil, uranium) by the global economy is done in expectation of a return on investment. For geo-engineering schemes, however, the relationship between the primary energy consumption required and the economic return is, at first glance, quite different. The energy costs of a given scheme represent a removal of economically productive available energy to do work in the normal global economy. What are the economic implications of the energy consumption associated with geo-engineering techniques? I will present a simple thermodynamic argument that, in general, real (inflation-adjusted) economic value has a fixed relationship to the rate of global primary energy consumption. This hypothesis will be shown to be supported by 36 years of available energy statistics and a two-millennium record of statistics for global economic production. What is found from this analysis is that the value in any given inflation-adjusted 1990 dollar is sustained by a constant 9.7 +/- 0.3 milliwatts of global primary energy consumption. Thus, insofar as geo-engineering is concerned, any scheme that requires some nominal fraction of continuous global primary energy output necessitates a corresponding inflationary loss of real global economic value. For example, if 1% of global energy output is required, at today's consumption rates of 15 TW this corresponds to an inflationary loss of 15 trillion 1990 dollars of real value. The loss will be less, however, if the geo-engineering scheme also enables a demonstrable enhancement to global economic production capacity through climate modification.
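The quoted example follows directly from the stated constant; a back-of-envelope check:

```python
# Back-of-envelope check of the figures quoted above.
value_per_watt = 1.0 / 9.7e-3      # 1990 dollars sustained per watt (1 $ per 9.7 mW)
scheme_power = 0.01 * 15e12        # 1% of ~15 TW global primary consumption, in W
loss = scheme_power * value_per_watt
print(f"{loss:.2e} 1990 dollars")  # ~1.5e13, i.e., ~15 trillion 1990 dollars
```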
Drakesmith, M; Caeyenberghs, K; Dutt, A; Lewis, G; David, A S; Jones, D K
2015-09-01
Graph theory (GT) is a powerful framework for quantifying topological features of neuroimaging-derived functional and structural networks. However, false positive (FP) connections arise frequently and influence the inferred topology of networks. Thresholding is often used to overcome this problem, but an appropriate threshold often relies on a priori assumptions, which will alter inferred network topologies. Four common network metrics (global efficiency, mean clustering coefficient, mean betweenness and smallworldness) were tested using a model tractography dataset. It was found that all four network metrics were significantly affected even by just one FP. Results also show that thresholding effectively dampens the impact of FPs, but at the expense of adding significant bias to network metrics. In a larger number (n=248) of tractography datasets, statistics were computed across random group permutations for a range of thresholds, revealing that statistics for network metrics varied significantly more than for non-network metrics (i.e., number of streamlines and number of edges). Varying degrees of network atrophy were introduced artificially to half the datasets, to test sensitivity to genuine group differences. For some network metrics, this atrophy was detected as significant (p<0.05, determined using permutation testing) only across a limited range of thresholds. We propose a multi-threshold permutation correction (MTPC) method, based on the cluster-enhanced permutation correction approach, to identify sustained significant effects across clusters of thresholds. This approach minimises requirements to determine a single threshold a priori. We demonstrate improved sensitivity of MTPC-corrected metrics to genuine group effects compared to an existing approach and demonstrate the use of MTPC on a previously published network analysis of tractography data derived from a clinical population. In conclusion, we show that there are large biases and instability induced by thresholding, making statistical comparisons of network metrics difficult. However, by testing for effects across multiple thresholds using MTPC, true group differences can be robustly identified. Copyright © 2015. Published by Elsevier Inc.
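As context for the thresholding problem described above, the following is a generic sketch of computing one network metric across a range of proportional thresholds (networkx, random connectivity matrix); it is not the authors' MTPC implementation.

```python
# Sketch of the multi-threshold setting described above: compute one network
# metric (global efficiency) across a range of proportional thresholds.
# Not the authors' MTPC code; the connectivity matrix is random.
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
n = 30
W = rng.random((n, n))
W = (W + W.T) / 2                 # symmetric connectivity weights
np.fill_diagonal(W, 0)

for keep_frac in (0.1, 0.2, 0.3, 0.4):
    cutoff = np.quantile(W[np.triu_indices(n, 1)], 1 - keep_frac)
    G = nx.from_numpy_array((W >= cutoff).astype(int))
    print(f"top {keep_frac:.0%} of edges: "
          f"global efficiency = {nx.global_efficiency(G):.3f}")
```

In the MTPC approach, a group statistic would be computed at each such threshold and compared against permutation-derived null distributions, looking for clusters of thresholds with sustained significant effects.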
Perneger, Thomas V; Combescure, Christophe
2017-07-01
Published P-values provide a window into the global enterprise of medical research. The aim of this study was to use the distribution of published P-values to estimate the relative frequencies of null and alternative hypotheses and to seek irregularities suggestive of publication bias. This cross-sectional study included P-values published in 120 medical research articles in 2016 (30 each from the BMJ, JAMA, Lancet, and New England Journal of Medicine). The observed distribution of P-values was compared with expected distributions under the null hypothesis (i.e., uniform between 0 and 1) and the alternative hypothesis (strictly decreasing from 0 to 1). P-values were categorized according to conventional levels of statistical significance and in one-percent intervals. Among 4,158 recorded P-values, 26.1% were highly significant (P < 0.001), 9.1% were moderately significant (P ≥ 0.001 to < 0.01), 11.7% were weakly significant (P ≥ 0.01 to < 0.05), and 53.2% were nonsignificant (P ≥ 0.05). We noted three irregularities: (1) high proportion of P-values <0.001, especially in observational studies, (2) excess of P-values equal to 1, and (3) about twice as many P-values less than 0.05 compared with those more than 0.05. The latter finding was seen in both randomized trials and observational studies, and in most types of analyses, excepting heterogeneity tests and interaction tests. Under plausible assumptions, we estimate that about half of the tested hypotheses were null and the other half were alternative. This analysis suggests that statistical tests published in medical journals are not a random sample of null and alternative hypotheses but that selective reporting is prevalent. In particular, significant results are about twice as likely to be reported as nonsignificant results. Copyright © 2017 Elsevier Inc. All rights reserved.
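The two reference shapes used above are easy to reproduce in a toy simulation (not the authors' analysis): p-values are uniform when the null is true and pile up near zero under the alternative.

```python
# Toy illustration of the reference distributions described above: p-values
# are uniform under the null and right-skewed toward 0 under the alternative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 100_000
p_null = rng.random(n)                     # H0 true: uniform on [0, 1]
z = rng.normal(loc=2.0, size=n)            # H1 true: shifted test statistic
p_alt = 1 - stats.norm.cdf(z)              # one-sided p-values, concentrated near 0

for name, p in [("null", p_null), ("alternative", p_alt)]:
    bins = [(p < 0.001).mean(), ((p >= 0.001) & (p < 0.01)).mean(),
            ((p >= 0.01) & (p < 0.05)).mean(), (p >= 0.05).mean()]
    print(name, [f"{b:.3f}" for b in bins])
```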
Benchmarking the mesoscale variability in global ocean eddy-permitting numerical systems
NASA Astrophysics Data System (ADS)
Cipollone, Andrea; Masina, Simona; Storto, Andrea; Iovino, Doroteaciro
2017-10-01
The role of data assimilation procedures on representing ocean mesoscale variability is assessed by applying eddy statistics to a state-of-the-art global ocean reanalysis (C-GLORS), a free global ocean simulation (performed with the NEMO system) and an observation-based dataset (ARMOR3D) used as an independent benchmark. Numerical results are computed on a 1/4 ∘ horizontal grid (ORCA025) and share the same resolution with the ARMOR3D dataset. This "eddy-permitting" resolution is sufficient to allow ocean eddies to form. Further to assessing the eddy statistics from three different datasets, a global three-dimensional eddy detection system is implemented in order to bypass the need for region-dependent threshold definitions, typical of commonly adopted eddy detection algorithms. It thus provides full three-dimensional eddy statistics, segmenting vertical profiles from local rotational velocities. This criterion is crucial for discerning real eddies from transient surface noise that inevitably affects any two-dimensional algorithm. Data assimilation enhances and corrects mesoscale variability on a wide range of features that cannot be well reproduced otherwise. The free simulation fairly reproduces eddies emerging from western boundary currents and deep baroclinic instabilities, while it underestimates the shallower vortices that populate the full basin. The ocean reanalysis recovers most of the missing turbulence, shown by satellite products, that is not generated by the model itself and consistently projects surface variability deep into the water column. The comparison with the statistically reconstructed vertical profiles from ARMOR3D shows that ocean data assimilation is able to embed variability into the model dynamics, constraining eddies with in situ and altimetry observations and generating them consistently with the local environment.
Revised Perturbation Statistics for the Global Scale Atmospheric Model
NASA Technical Reports Server (NTRS)
Justus, C. G.; Woodrum, A.
1975-01-01
Magnitudes and scales of atmospheric perturbations about the monthly mean for the thermodynamic variables and wind components are presented by month at various latitudes. These perturbation statistics are a revision of the random perturbation data required for the global scale atmospheric model program and are from meteorological rocket network statistical summaries in the 22 to 65 km height range and NASA grenade and pitot tube data summaries in the region up to 90 km. The observed perturbations in the thermodynamic variables were adjusted to make them consistent with constraints required by the perfect gas law and the hydrostatic equation. Vertical scales were evaluated by Buell's depth of pressure system equation and from vertical structure function analysis. Tables of magnitudes and vertical scales are presented for each month at latitude 10, 30, 50, 70, and 90 degrees.
Explorations in Statistics: Hypothesis Tests and P Values
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2009-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This second installment of "Explorations in Statistics" delves into test statistics and P values, two concepts fundamental to the test of a scientific null hypothesis. The essence of a test statistic is that it compares what…
Sustainable rangeland-based livestock production: A perspective on USA and global emerging trends
USDA-ARS?s Scientific Manuscript database
A recent review of statistics published by the United Nations Food and Agriculture Organization showed that global livestock numbers have increased steadily over the past 30 years. By 2030, livestock numbers in the developing world are expected to reach record highs that will surpass livestock popul...
Long-memory and the sea level-temperature relationship: a fractional cointegration approach.
Ventosa-Santaulària, Daniel; Heres, David R; Martínez-Hernández, L Catalina
2014-01-01
Through thermal expansion of oceans and melting of land-based ice, global warming is very likely contributing to the sea level rise observed during the 20th century. The amount by which further increases in global average temperature could affect sea level is only known with large uncertainties due to the limited capacity of physics-based models to predict sea levels from global surface temperatures. Semi-empirical approaches have been implemented to estimate the statistical relationship between these two variables providing an alternative measure on which to base potentially disrupting impacts on coastal communities and ecosystems. However, only a few of these semi-empirical applications had addressed the spurious inference that is likely to be drawn when one nonstationary process is regressed on another. Furthermore, it has been shown that spurious effects are not eliminated by stationary processes when these possess strong long memory. Our results indicate that both global temperature and sea level indeed present the characteristics of long memory processes. Nevertheless, we find that these variables are fractionally cointegrated when sea-ice extent is incorporated as an instrumental variable for temperature which in our estimations has a statistically significant positive impact on global sea level.
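A common way to quantify the long memory discussed above is the fractional-differencing parameter d; the sketch below implements the classic Geweke-Porter-Hudak log-periodogram estimator on a synthetic series. It is a generic illustration, not the authors' estimation procedure.

```python
# Geweke-Porter-Hudak (GPH) log-periodogram estimate of the fractional-
# differencing parameter d (d > 0 indicates long memory). Generic sketch.
import numpy as np

def gph_estimate(x, m=None):
    n = len(x)
    m = m or int(np.sqrt(n))                      # common bandwidth choice
    freqs = 2 * np.pi * np.arange(1, m + 1) / n   # low Fourier frequencies
    fft = np.fft.fft(x - np.mean(x))
    periodogram = np.abs(fft[1:m + 1]) ** 2 / (2 * np.pi * n)
    regressor = -np.log(4 * np.sin(freqs / 2) ** 2)
    slope, _ = np.polyfit(regressor, np.log(periodogram), 1)
    return slope                                  # estimate of d

x = np.random.default_rng(5).normal(size=2048)    # white noise: true d = 0
print(f"d estimate: {gph_estimate(x):.2f}")       # should be near 0
```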
Dependency of high coastal water level and river discharge at the global scale
NASA Astrophysics Data System (ADS)
Ward, P.; Couasnon, A.; Haigh, I. D.; Muis, S.; Veldkamp, T.; Winsemius, H.; Wahl, T.
2017-12-01
It is widely recognized that floods cause huge socioeconomic impacts. From 1980-2013, global flood losses exceeded $1 trillion, with 220,000 fatalities. These impacts are particularly hard felt in low-lying densely populated deltas and estuaries, whose location at the coast-land interface makes them naturally prone to flooding. When river and coastal floods coincide, their impacts in these deltas and estuaries are often worse than when they occur in isolation. Such floods are examples of so-called "compound events". In this contribution, we present the first global scale analysis of the statistical dependency of high coastal water levels (and the storm surge component alone) and river discharge. We show that there is statistical dependency between these components at more than half of the stations examined. We also show time-lags in the highest correlation between peak discharges and coastal water levels. Finally, we assess the probability of the simultaneous occurrence of design discharge and design coastal water levels, assuming both independence and statistical dependence. For those stations where we identified statistical dependency, the probability is between 1 and 5 times greater when the dependence structure is accounted for. This information is essential for understanding the likelihood of compound flood events occurring at locations around the world as well as for accurate flood risk assessments and effective flood risk management. The research was carried out by analysing the statistical dependency between observed coastal water levels (and the storm surge component) from GESLA-2 and river discharge using gauged data from GRDC stations all around the world. The dependence structure was examined using copula functions.
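The effect of dependence on joint exceedance probabilities, the quantity behind the "1 to 5 times greater" figure, can be sketched with a Gaussian copula (chosen here only for simplicity; the study fit copula functions to gauge data):

```python
# Sketch: how dependence inflates the joint exceedance probability of two
# "design" levels relative to independence. Gaussian copula for simplicity.
from scipy.stats import norm, multivariate_normal

def joint_exceedance(u, v, rho):
    """P(U > u, V > v) under a Gaussian copula with correlation rho."""
    c = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]])
    return 1 - u - v + c.cdf([norm.ppf(u), norm.ppf(v)])

u = v = 0.99                       # e.g., the 100-year level of each variable
indep = (1 - u) * (1 - v)          # joint exceedance assuming independence
for rho in (0.0, 0.3, 0.6):
    ratio = joint_exceedance(u, v, rho) / indep
    print(f"rho = {rho}: joint exceedance is {ratio:.1f}x the independent case")
```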
Statistical wave climate projections for coastal impact assessments
NASA Astrophysics Data System (ADS)
Camus, P.; Losada, I. J.; Izaguirre, C.; Espejo, A.; Menéndez, M.; Pérez, J.
2017-09-01
Global multimodel wave climate projections are obtained at 1.0° × 1.0° scale from 30 Coupled Model Intercomparison Project Phase 5 (CMIP5) global circulation model (GCM) realizations. A semi-supervised weather-typing approach based on a characterization of the ocean wave generation areas and the historical wave information from the recent GOW2 database are used to train the statistical model. This framework is also applied to obtain high resolution projections of coastal wave climate and coastal impacts such as port operability and coastal flooding. Regional projections are estimated using the collection of weather types at a spacing of 1.0°. This assumption is feasible because the predictor is defined based on the wave generation area and the classification is guided by the local wave climate. The assessment of future changes in coastal impacts is based on direct downscaling of indicators defined by empirical formulations (total water level for coastal flooding and number of hours per year with overtopping for port operability). Global multimodel projections of the significant wave height and peak period are consistent with changes obtained in previous studies. Statistical confidence in the expected changes is obtained due to the large number of GCMs used to construct the ensemble. The proposed methodology proves flexible for projecting wave climate at different spatial scales. Regional changes of additional variables such as wave direction or other statistics can be estimated from the future empirical distribution, with extreme values restricted to high percentiles (i.e., 95th, 99th percentiles). The statistical framework can also be applied to evaluate regional coastal impacts, integrating changes in storminess and sea level rise.
Global earthquake catalogs and long-range correlation of seismic activity (Invited)
NASA Astrophysics Data System (ADS)
Ogata, Y.
2009-12-01
In view of the long-term seismic activity in the world, homogeneity of a global catalog is indispensable. Lately, Engdahl and Villaseñor (2002) compiled a global earthquake catalog of magnitude (M) 7.0 or larger during the last century (1900-1999). This catalog is based on various existing catalogs such as the Abe catalog (Abe, 1981, 1984; Abe and Noguchi, 1983a, b) for world seismicity (1894-1980), its modified catalogs by Perez and Scholz (1984) and by Pacheco and Sykes (1992), and also the Harvard University catalog since 1975. However, the original surface wave magnitudes of the Abe catalog were systematically changed by Perez and Scholz (1984) and Pacheco and Sykes (1992). They suspected inhomogeneity of the Abe catalog and claimed that the two seeming changes in the occurrence rate around 1922 and 1948 resulted from magnitude shifts for some instrument-related reasons. They used a statistical test assuming that such a series of large earthquakes in the world should behave as a stationary Poisson process (uniform occurrences). It is obvious that their claim strongly depends on their a priori assumption of independent or short-range dependent earthquake occurrence. We question this assumption from the viewpoint of long-range dependence of seismicity. We carry out statistical analyses of the spectrum, dispersion-time diagrams and R/S for estimating and testing long-range correlations. We also attempt to show the possibility that the apparent rate change in global seismicity can be simulated by a certain long-range correlated process. Further, if we divide the globe into two regions of high and low latitudes, for example, the cumulative curves for the two regions have different shapes, and the above-mentioned apparent change-points disappear from both regions. This suggests that the Abe catalog shows genuine seismic activity rather than an artifact of the suspected magnitude shifts, which should appear in any wide enough region. We also use a local catalog for the wide regional area around Japan (Utsu, 1982a, b; Japan Meteorological Agency) which covers the period 1885-1999, is complete for M>=6.0, and accounts for about 10% of world seismicity. The synchronous variation of seismic frequency in the high latitude area of the world and in the regional area around Japan, obtained from independent catalogs, is suggestive of an external effect such as a large-scale motion of the earth rather than the presupposed inhomogeneity of the catalogs.
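The R/S (rescaled range) statistic mentioned above is straightforward to compute; below is a generic sketch on a synthetic uncorrelated series (not the authors' code). R/S grows roughly like n**H, and H substantially above 0.5 indicates long-range correlation.

```python
# Generic rescaled-range (R/S) analysis: estimate the Hurst exponent H from
# the scaling of R/S with window size. Synthetic data, not the authors' code.
import numpy as np

def rescaled_range(x):
    y = np.cumsum(x - np.mean(x))   # cumulative deviations from the mean
    return (np.max(y) - np.min(y)) / np.std(x)

rng = np.random.default_rng(6)
x = rng.normal(size=4096)           # uncorrelated noise: expect H near 0.5
                                    # (small-sample bias pushes it slightly higher)
sizes = [64, 256, 1024, 4096]
rs = [np.mean([rescaled_range(x[i:i + n]) for i in range(0, len(x) - n + 1, n)])
      for n in sizes]
hurst = np.polyfit(np.log(sizes), np.log(rs), 1)[0]
print(f"H estimate = {hurst:.2f}")
```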
[Empathy-related factors in Nursing students of the Cartagena University].
Madera-Anaya, Meisser; Tirado-Amador, Lesbia; González-Martínez, Farith
2016-01-01
To determine empathy levels and their relationship with sociodemographic, academic and family factors in nursing students. Cross-sectional study; 196 nursing students were randomly selected at the University of Cartagena, Colombia. A questionnaire asking about sociodemographic, family and academic factors and the Jefferson Scale of Physician Empathy (version S) were applied. The Shapiro-Wilk test was used to assess the normality assumption. Student's t-test, ANOVA, Pearson correlation and simple linear regression were used to establish relationships (p<0.05). The global empathy score was 108.6±14.6; statistically significant associations were found between global empathy and the training year (p=0.004) and grade point average (R(2)=0.058; p=0.001; r=0.240). Moreover, the "perspective taking" dimension was associated with provenance (rural/urban) (p=0.010) and family functioning (p=0.003); the "compassionate care" dimension with the training year (p=0.002); and the "putting themselves in the place of the patient" dimension with academic performance (p=0.034). Empathy levels in nursing students may vary depending on various personal and academic factors; these characteristics should be taken into account when implementing teaching strategies to promote higher empathy levels from the early training years. Copyright © 2016 Elsevier España, S.L.U. All rights reserved.
Vázquez de la Torre, Mayra Jezabel; Stein, Katja; Vásquez Garibay, Edgar Manuel; Kumazawa Ichikawa, Miguel Roberto; Troyo Sanromán, Rogelio; Salcedo Flores, Alicia Guadalupe; Sánchez Zubieta, Fernando Antonio
2017-10-24
The subjective global assessment (SGA) is a simple, sensitive tool used to identify nutritional risk. It is widely used in the adult population, but there is little evidence on its effectiveness in children with cancer. This cross-sectional study was undertaken to demonstrate a significant correlation between a simplified version of the Patient-Generated SGA (PG-SGA) and anthropometric assessment in identifying nutritional status in children recently diagnosed with cancer. The nutritional status of 70 pediatric cancer patients was assessed with the PG-SGA and anthropometric measurements. The relation between the assessments was tested with ANOVA, independent-samples t-test, the Kappa statistic, and non-parametric Spearman and Kendall correlation coefficients. The PG-SGA divided the patients into four groups: well nourished, and mildly, moderately and severely malnourished. The prevalence of malnutrition according to the PG-SGA was 21.4%. The correlations (r ≥ 0.300, p < 0.001) and the concordance (k ≥ 0.327, p < 0.001) between the PG-SGA and anthropometric indicators were moderate and significant. The results indicate that the PG-SGA is a valid tool for assessing nutritional status in hospitalized children recently diagnosed with cancer. It is important to emphasize that the subjective assessment does not detect growth retardation, overweight or obesity.
A prospective earthquake forecast experiment in the western Pacific
NASA Astrophysics Data System (ADS)
Eberhard, David A. J.; Zechar, J. Douglas; Wiemer, Stefan
2012-09-01
Since the beginning of 2009, the Collaboratory for the Study of Earthquake Predictability (CSEP) has been conducting an earthquake forecast experiment in the western Pacific. This experiment is an extension of the Kagan-Jackson experiments begun 15 years earlier and is a prototype for future global earthquake predictability experiments. At the beginning of each year, seismicity models make a spatially gridded forecast of the number of Mw≥ 5.8 earthquakes expected in the next year. For the three participating statistical models, we analyse the first two years of this experiment. We use likelihood-based metrics to evaluate the consistency of the forecasts with the observed target earthquakes and we apply measures based on Student's t-test and the Wilcoxon signed-rank test to compare the forecasts. Overall, a simple smoothed seismicity model (TripleS) performs the best, but there are some exceptions that indicate continued experiments are vital to fully understand the stability of these models, the robustness of model selection and, more generally, earthquake predictability in this region. We also estimate uncertainties in our results that are caused by uncertainties in earthquake location and seismic moment. Our uncertainty estimates are relatively small and suggest that the evaluation metrics are relatively robust. Finally, we consider the implications of our results for a global earthquake forecast experiment.
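One of the simplest likelihood-based consistency metrics used in CSEP-style experiments is the N-test, which asks whether the observed number of target earthquakes is plausible given the forecast total. A minimal sketch with made-up numbers (not the experiment's actual code or values):

```python
# Minimal sketch of a CSEP-style N-test: is the observed earthquake count
# consistent with the forecast total under a Poisson assumption?
# The numbers below are made up for illustration.
from scipy.stats import poisson

forecast_rate = 12.4    # total expected Mw >= 5.8 events in the test year
observed = 18           # events actually observed

# Two one-sided tail probabilities; small values flag an inconsistent forecast.
p_too_many = 1 - poisson.cdf(observed - 1, forecast_rate)
p_too_few = poisson.cdf(observed, forecast_rate)
print(f"P(N >= {observed}) = {p_too_many:.3f}, P(N <= {observed}) = {p_too_few:.3f}")
```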
Penolazzi, Barbara; Bergamaschi, Susanna; Pastore, Massimiliano; Villani, Daniele; Sartori, Giuseppe; Mondini, Sara
2015-01-01
In the present study we tested the cognitive effects of transcranial direct current stimulation (tDCS) in a case of probable Alzheimer disease (AD). The patient (male, 60 years, mild AD) underwent two cycles of treatments, separated by 2 months. In the first cycle, active stimulation (10 sessions, 2 mA for 20 min; anode over the left dorsolateral prefrontal cortex) was followed by computerised tasks (CTs) specifically chosen to engage the most impaired cognitive processes in the patient (tDCS+CT condition). In the second cycle, which was structured as the first, CTs were administered after placebo stimulation (sham+CT condition). Effects on cognitive performance were evaluated not only by the CTs, but also by neuropsychological tests assessing global cognitive functioning. Statistical analyses revealed that whereas the tDCS+CT condition had few effects on the CTs, it induced stability in the patient's global cognitive functioning lasting approximately 3 months, which was not achieved when the patient underwent the sham+CT condition. Therefore, the synergetic use of tDCS and CTs appeared to slow down the cognitive decline of our patient. This preliminary result, although in need of further confirmation, suggests the potential of tDCS as an adjuvant tool for cognitive rehabilitation in AD.
Jarnevich, Catherine S.; Young, Nicholas E; Sheffels, Trevor R.; Carter, Jacoby; Systma, Mark D.; Talbert, Colin
2017-01-01
Invasive species provide a unique opportunity to evaluate factors controlling biogeographic distributions; we can consider introduction success as an experiment testing suitability of environmental conditions. Predicting potential distributions of spreading species is not easy, and forecasting potential distributions under changing climate is even more difficult. Using the globally invasive coypu (Myocastor coypus [Molina, 1782]), we evaluate and compare the utility of a simplistic, ecophysiologically based model and a correlative model to predict current and future distribution. The ecophysiological model was based on winter temperature relationships with coypu (nutria) survival. We developed correlative statistical models using the Software for Assisted Habitat Modeling and biologically relevant climate data with a global extent. We applied the ecophysiologically based model to several global circulation model (GCM) predictions for mid-century. We used global coypu introduction data to evaluate these models and to explore a hypothesized physiological limitation, finding general agreement with the known coypu distribution locally and globally and support for an upper thermal tolerance threshold. GCM-based results showed variability in predicted coypu distribution among GCMs, but generally agreed on an increasing suitable area in the USA. Our methods highlighted the dynamic nature of the edges of the coypu distribution due to climate non-equilibrium, and the uncertainty associated with forecasting future distributions. Areas deemed suitable habitat, especially those on the edge of the current known range, could be used for early detection of the spread of coypu populations for management purposes. Combining approaches can be beneficial for predicting potential distributions of invasive species now and in the future and for exploring hypotheses about factors controlling distributions.
The polymyalgia rheumatica activity score in daily use: proposal for a definition of remission.
Leeb, Burkhard F; Rintelen, Bernhard; Sautner, Judith; Fassl, Christian; Bird, Howard A
2007-06-15
To confirm the reliability and applicability of the Polymyalgia Rheumatica Disease Activity Score (PMR-AS), and to establish a threshold for remission. First, 78 patients with PMR (50 women/28 men, mean age 65.97 years) were enrolled in a cross-sectional evaluation. The PMR-AS, patient's satisfaction with disease status (PATSAT; range 1-5), erythrocyte sedimentation rate (ESR; first hour), and a visual analog scale of patients' general health assessment (VAS patient global; range 0-100) were recorded. Subsequently, another 39 PMR patients (24 women/15 men, mean age 68.12 years) were followed longitudinally. Relationships between the PMR-AS, PATSAT, ESR, and VAS patient global were analyzed by the Kruskal-Wallis test, Spearman's rank correlation, and kappa statistics. PMR-AS values in patients with a PATSAT score of 1 and a VAS patient global <10 formed the basis to establish a remission threshold. PMR-AS values were significantly related to PATSAT (P < 0.001), VAS patient global (P < 0.001), and ESR (P < 0.01). PATSAT and VAS patient global were reasonably different (kappa = 0.226). The median PMR-AS score in patients with PATSAT score 1 and VAS patient global <10 was 0.7 (range 0-3.3), and the respective 75th percentile was 1.3. To enhance applicability, a range from 0 to 1.5 was proposed to define remission in PMR. The median ESR in these patients was 10 mm/hour (range 3-28), indicating external validity. We demonstrated the reliability, validity, and applicability of the PMR-AS in daily routine. Moreover, we proposed a remission threshold (0-1.5) founded on patient-dependent parameters.
NASA Astrophysics Data System (ADS)
Hennig, R. J.; Friedrich, J.; Malaguzzi Valeri, L.; McCormick, C.; Lebling, K.; Kressig, A.
2016-12-01
The Power Watch project will offer open data on the global electricity sector, starting with power plants and their impacts on climate and water systems; it will also offer visualizations and decision-making tools. Power Watch will create the first comprehensive, open database of power plants globally by compiling data from national governments, public and private utilities, transmission grid operators, and other data providers to create a core dataset that has information on over 80% of global installed capacity for electrical generation. Power plant data will at a minimum include latitude and longitude, capacity, fuel type, emissions, water usage, ownership, and annual generation. By providing data that is both comprehensive and publicly available, this project will support decision making and analysis by actors across the economy and in the research community. The Power Watch research effort focuses on creating a global standard for power plant information, gathering and standardizing data from multiple sources, matching information from multiple sources on a plant level, testing cross-validation approaches (regional statistics, crowdsourcing, satellite data, and others) and developing estimation methodologies for generation, emissions, and water usage. When not available from official reports, emissions, annual generation, and water usage will be estimated. Water use estimates of power plants will be based on capacity, fuel type and satellite imagery to identify cooling types. This analysis is being piloted in several states in India and will then be scaled up to a global level. Other planned applications of the Power Watch data include improving understanding of energy access, air pollution, emissions estimation, stranded asset analysis, life cycle analysis, tracking of proposed plants and curtailment analysis.
The Impact of Global Budgets on Pharmaceutical Spending and Utilization
Fendrick, A. Mark; Song, Zirui; Landon, Bruce E.; Safran, Dana Gelb; Mechanic, Robert E.; Chernew, Michael E.
2014-01-01
In 2009, Blue Cross Blue Shield of Massachusetts implemented a global budget-based payment system, the Alternative Quality Contract (AQC), in which provider groups assumed accountability for spending. We investigate the impact of global budgets on the utilization of prescription drugs and related expenditures. Our analyses indicate no statistically significant evidence that the AQC reduced the use of drugs. Although the impact may change over time, early evidence suggests that it is premature to conclude that global budget systems may reduce access to medications. PMID:25500751
Local conformity induced global oscillation
NASA Astrophysics Data System (ADS)
Li, Dong; Li, Wei; Hu, Gang; Zheng, Zhigang
2009-04-01
The game ‘rock-paper-scissors’ model is investigated with consideration of the effect of the psychology of conformity. The interaction between each pair of agents is global, but the conformity strategy is local to individuals. Statistically, each strategy appears with uniform probability. The dynamical analysis of this model indicates that the equilibrium state may lose its stability at a threshold and be replaced by a globally oscillating state. The global oscillation is induced by the local conformity, which originates from the synchronization of individual strategies.
Are nondrinkers missing from the picture?
Bakke, Øystein
2015-03-01
WHO statistics indicate that half the world's population does not drink alcohol. With a Western outlook this fact is often overlooked. The article explores global drinking patterns, focusing on non-drinking and on the global forces that contribute towards change. The large segment of non-drinking population is beneficial for public health, but it is also seen as a great potential market for the international alcoholic beverage industry. The forces of globalization towards conformity and a global mono-culture deprived of cultural diversity also affect non-drinking populations, to the detriment of public health.
EVALUATION OF A NEW MEAN SCALED AND MOMENT ADJUSTED TEST STATISTIC FOR SEM.
Tong, Xiaoxiao; Bentler, Peter M
2013-01-01
Recently a new mean scaled and skewness adjusted test statistic was developed for evaluating structural equation models in small samples and with potentially nonnormal data, but this statistic has received only limited evaluation. The performance of this statistic is compared to normal theory maximum likelihood and two well-known robust test statistics. A modification to the Satorra-Bentler scaled statistic is developed for the condition that sample size is smaller than degrees of freedom. The behavior of the four test statistics is evaluated with a Monte Carlo confirmatory factor analysis study that varies seven sample sizes and three distributional conditions obtained using Headrick's fifth-order transformation to nonnormality. The new statistic performs badly in most conditions except under the normal distribution. The goodness-of-fit χ(2) test based on maximum-likelihood estimation performed well under normal distributions as well as under a condition of asymptotic robustness. The Satorra-Bentler scaled test statistic performed best overall, while the mean scaled and variance adjusted test statistic outperformed the others at small and moderate sample sizes under certain distributional conditions.
Martin, Daniel R; Matyushov, Dmitry V
2012-08-30
We show that electrostatic fluctuations of the protein-water interface are globally non-Gaussian. The electrostatic component of the optical transition energy (energy gap) in a hydrated green fluorescent protein is studied here by classical molecular dynamics simulations. The distribution of the energy gap displays a high excess in the breadth of electrostatic fluctuations over the prediction of the Gaussian statistics. The energy gap dynamics include a nanosecond component. When simulations are repeated with frozen protein motions, the statistics shifts to the expectations of linear response and the slow dynamics disappear. We therefore suggest that both the non-Gaussian statistics and the nanosecond dynamics originate largely from global, low-frequency motions of the protein coupled to the interfacial water. The non-Gaussian statistics can be experimentally verified from the temperature dependence of the first two spectral moments measured at constant-volume conditions. Simulations at different temperatures are consistent with other indicators of the non-Gaussian statistics. In particular, the high-temperature part of the energy gap variance (second spectral moment) scales linearly with temperature and extrapolates to zero at a temperature characteristic of the protein glass transition. This result, violating the classical limit of the fluctuation-dissipation theorem, leads to a non-Boltzmann statistics of the energy gap and corresponding non-Arrhenius kinetics of radiationless electronic transitions, empirically described by the Vogel-Fulcher-Tammann law.
Policies of Global English Tests: Test-Takers' Perspectives on the IELTS Retake Policy
ERIC Educational Resources Information Center
Hamid, M. Obaidul
2016-01-01
Globalized English proficiency tests such as the International English Language Testing System (IELTS) are increasingly playing the role of gatekeepers in a globalizing world. Although the use of the IELTS as a "policy tool" for making decisions in the areas of study, work and migration impacts on test-takers' lives and life chances, not…
NASA Astrophysics Data System (ADS)
Jasper, Ahren W.; Dawes, Richard
2013-10-01
The lowest-energy singlet (1 1A') and two lowest-energy triplet (1 3A' and 1 3A″) electronic states of CO2 are characterized using dynamically weighted multireference configuration interaction (dw-MRCI+Q) electronic structure theory calculations extrapolated to the complete basis set (CBS) limit. Global analytic representations of the dw-MRCI+Q/CBS singlet and triplet surfaces and of their CASSCF/aug-cc-pVQZ spin-orbit coupling surfaces are obtained via the interpolated moving least squares (IMLS) semiautomated surface fitting method. The spin-forbidden kinetics of the title reaction is calculated using the coupled IMLS surfaces and coherent switches with decay of mixing non-Born-Oppenheimer molecular dynamics. The calculated spin-forbidden association rate coefficient (corresponding to the high pressure limit of the rate coefficient) is 7-35 times larger at 1000-5000 K than the rate coefficient used in many detailed chemical models of combustion. A dynamical analysis of the multistate trajectories is presented. The trajectory calculations reveal direct (nonstatistical) and indirect (statistical) spin-forbidden reaction mechanisms and may be used to test the suitability of transition-state-theory-like statistical methods for spin-forbidden kinetics. Specifically, we consider the appropriateness of the "double passage" approximation, of assuming statistical distributions of seam crossings, and of applications of the unified statistical model for spin-forbidden reactions.
Suner, Aslı; Karakülah, Gökhan; Dicle, Oğuz
2014-01-01
Statistical hypothesis testing is an essential component of biological and medical studies for making inferences and estimations from collected data; however, misuse of statistical tests is common. In order to prevent possible errors in statistical test selection, users can consult available test-selection algorithms developed for various purposes. However, the lack of an algorithm presenting the most common statistical tests used in biomedical research in a single flowchart causes several problems, such as shifting users among algorithms, poor decision support in test selection, and dissatisfaction among potential users. Herein, we demonstrate a unified flowchart covering the statistical tests most commonly used in the biomedical domain, to provide decision support to non-statistician users in choosing the appropriate statistical test for their hypotheses. We also discuss findings made while integrating the flowcharts into a single, more comprehensive decision algorithm.
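As a toy illustration of the kind of decision logic such a flowchart encodes (not the authors' algorithm), consider the two-independent-group case:

```python
# Toy version of the decision logic a test-selection flowchart encodes,
# for two independent groups; not the authors' algorithm.
from scipy import stats

def compare_two_groups(a, b, alpha=0.05):
    # Normality check on each group (Shapiro-Wilk)
    normal = (stats.shapiro(a).pvalue > alpha and
              stats.shapiro(b).pvalue > alpha)
    if normal:
        # Equal-variance check decides between Student's and Welch's t
        equal_var = stats.levene(a, b).pvalue > alpha
        test = "Student's t" if equal_var else "Welch's t"
        res = stats.ttest_ind(a, b, equal_var=equal_var)
    else:
        test = "Mann-Whitney U"
        res = stats.mannwhitneyu(a, b)
    return test, res.pvalue

a = [5.1, 4.9, 5.4, 5.0, 5.2, 4.8, 5.3]
b = [5.6, 5.8, 5.5, 5.9, 5.7, 6.0, 5.4]
print(compare_two_groups(a, b))
```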
The world's microbiology laboratories can be a global microbial sensor network.
O'Brien, Thomas F; Stelling, John
2014-04-01
The microbes that infect us spread in global and local epidemics, and the resistance genes that block their treatment spread within and between them. All we can know about where they are, in order to track and contain them, comes from the only places that can see them, the world's microbiology laboratories, but most report each patient's microbe only to that patient's caregiver. Sensors, ranging from instruments to birdwatchers, are now being linked in electronic networks to monitor and interpret algorithmically in real time ocean currents, atmospheric carbon, supply-chain inventory, bird migration, etc. To link the world's microbiology laboratories as exquisite sensors into a truly lifesaving real-time network, their data must be accessed and fully subtyped. Microbiology laboratories put individual reports into inaccessible paper or mutually incompatible electronic reporting systems, but those from more than 2,200 laboratories in more than 108 countries worldwide are now accessed and translated into compatible WHONET files. These increasingly web-based files could initiate a global microbial sensor network. Unused microbiology laboratory byproduct data, now from drug susceptibility and biochemical testing but increasingly from new technologies (genotyping, MALDI-TOF, etc.), can be reused to subtype microbes of each genus/species into sub-groupings that are discriminated and traced with greater sensitivity. Ongoing statistical delineation of subtypes from global sensor network data will improve detection of movement into any patient of a microbe or resistance gene from another patient, medical center or country. Growing data on clinical manifestations and global distributions of subtypes can automate comments for patients' reports, select microbes to genotype and alert responders.
The substorm cycle as reproduced by global MHD models
NASA Astrophysics Data System (ADS)
Gordeev, E.; Sergeev, V.; Tsyganenko, N.; Kuznetsova, M.; Rastäetter, L.; Raeder, J.; Tóth, G.; Lyon, J.; Merkin, V.; Wiltberger, M.
2017-01-01
Recently, Gordeev et al. (2015) suggested a method to test global MHD models against statistical empirical data. They showed that four community-available global MHD models supported by the Community Coordinated Modeling Center (CCMC) produce a reasonable agreement with reality for those key parameters (the magnetospheric size, magnetic field, and pressure) that are directly related to the large-scale equilibria in the outer magnetosphere. Based on the same set of simulation runs, here we investigate how the models reproduce the global loading-unloading cycle. We found that in terms of global magnetic flux transport, the three examined CCMC models display systematically different responses to an idealized 2 h north then 2 h south interplanetary magnetic field (IMF) Bz variation. The LFM model shows a depressed return convection and high loading rate during the growth phase as well as enhanced return convection and high unloading rate during the expansion phase, with the amount of loaded/unloaded magnetotail flux and the growth phase duration being the closest to their observed empirical values during isolated substorms. Two other models exhibit drastically different behavior. In the BATS-R-US model the plasma sheet convection shows a smooth transition to the steady convection regime after the IMF southward turning. In the Open GGCM a weak plasma sheet convection has comparable intensities during both the growth phase and the following slow unloading phase. We also demonstrate a potential technical problem in the publicly available simulations, related to postprocessing interpolation, which could affect the accuracy of magnetic field tracing and of other related procedures.
The Substorm Cycle as Reproduced by Global MHD Models
NASA Technical Reports Server (NTRS)
Gordeev, E.; Sergeev, V.; Tsyganenko, N.; Kuznetsova, M.; Rastaetter, Lutz; Raeder, J.; Toth, G.; Lyon, J.; Merkin, V.; Wiltberger, M.
2017-01-01
Recently, Gordeev et al. (2015) suggested a method to test global MHD models against statistical empirical data. They showed that four community-available global MHD models supported by the Community Coordinated Modeling Center (CCMC) produce a reasonable agreement with reality for those key parameters (the magnetospheric size, magnetic field, and pressure) that are directly related to the large-scale equilibria in the outer magnetosphere. Based on the same set of simulation runs, here we investigate how the models reproduce the global loading-unloading cycle. We found that in terms of global magnetic flux transport, the three examined CCMC models display systematically different responses to an idealized 2 h north then 2 h south interplanetary magnetic field (IMF) Bz variation. The LFM model shows a depressed return convection and high loading rate during the growth phase as well as enhanced return convection and high unloading rate during the expansion phase, with the amount of loaded/unloaded magnetotail flux and the growth phase duration being the closest to their observed empirical values during isolated substorms. Two other models exhibit drastically different behavior. In the BATS-R-US model the plasma sheet convection shows a smooth transition to the steady convection regime after the IMF southward turning. In the Open GGCM a weak plasma sheet convection has comparable intensities during both the growth phase and the following slow unloading phase. We also demonstrate a potential technical problem in the publicly available simulations, related to postprocessing interpolation, which could affect the accuracy of magnetic field tracing and of other related procedures.
Global Analysis of Empirical Relationships Between Annual Climate and Seasonality of NDVI
NASA Technical Reports Server (NTRS)
Potter, C. S.
1997-01-01
This study describes the use of satellite data to calibrate a new climate-vegetation greenness function for global change studies. We examined statistical relationships between annual climate indexes (temperature, precipitation, and surface radiation) and seasonal attributes of the AVHRR Normalized Difference Vegetation Index (NDVI) time series for the mid-1980s in order to refine our empirical understanding of intraannual patterns and global abiotic controls on natural vegetation dynamics. Multiple linear regression results using global 1° gridded data sets suggest that three climate indexes: growing degree days, annual precipitation total, and an annual moisture index together can account for 70-80 percent of the variation in the NDVI seasonal extremes (maximum and minimum values) for the calibration year 1984. Inclusion of the same climate index values from the previous year explained no significant additional portion of the global scale variation in NDVI seasonal extremes. The monthly timing of NDVI extremes was closely associated with seasonal patterns in maximum and minimum temperature and rainfall, with lag times of 1 to 2 months. We separated well-drained areas from 1° grid cells mapped as having greater than 25 percent inundated coverage for estimation of both the magnitude and timing of seasonal NDVI maximum values. Predicted monthly NDVI, derived from our climate-based regression equations and Fourier smoothing algorithms, shows good agreement with observed NDVI at a series of ecosystem test locations from around the globe. Regions in which NDVI seasonal extremes were not accurately predicted are mainly high latitude ecosystems and other remote locations where climate station data are sparse.
NASA Technical Reports Server (NTRS)
Tolson, R. H.
1981-01-01
A technique is described for evaluating the influence of spatial sampling on the determination of global mean total columnar ozone. First and second order statistics are derived for each term in a spherical harmonic expansion which represents the ozone field, and the statistics are used to estimate systematic and random errors in the estimates of total ozone. A finite number of coefficients in the expansion are determined, and the truncated part of the expansion is shown to contribute an error to the estimate which depends strongly on the spatial sampling and is relatively insensitive to data noise.
Multi-region statistical shape model for cochlear implantation
NASA Astrophysics Data System (ADS)
Romera, Jordi; Kjer, H. Martin; Piella, Gemma; Ceresa, Mario; González Ballester, Miguel A.
2016-03-01
Statistical shape models are commonly used to analyze the variability between similar anatomical structures, and their use is established as a tool for analysis and segmentation of medical images. However, using a global model to capture the variability of complex structures is not enough to achieve the best results. The complexity of a proper global model increases even more when the amount of data available is limited to a small number of datasets. Typically, the anatomical variability between structures is associated with the variability of their physiological regions. In this paper, a complete pipeline is proposed for building a multi-region statistical shape model to study the entire variability from locally identified physiological regions of the inner ear. The proposed model, which is based on an extension of the Point Distribution Model (PDM), is built from a training set of 17 high-resolution images (24.5 μm voxels) of the inner ear. The model is evaluated according to its generalization ability and specificity. The results are compared with those of a global model built directly using the standard PDM approach. The evaluation results suggest that better accuracy can be achieved using a regional modeling of the inner ear.
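For reference, the standard PDM mentioned above represents each shape as a mean plus a weighted sum of principal modes of variation; the following is a minimal generic sketch with synthetic 2-D landmarks (not the paper's inner-ear pipeline):

```python
# Minimal Point Distribution Model: mean shape plus PCA modes of variation.
# Synthetic 2-D landmarks; a generic sketch, not the inner-ear pipeline.
import numpy as np

rng = np.random.default_rng(7)
n_shapes, n_landmarks = 17, 40
t = np.linspace(0, 2 * np.pi, n_landmarks)
base = np.stack([np.cos(t), np.sin(t)], axis=1)      # a circular base contour

# Each training shape: the base contour, randomly scaled, plus landmark noise
shapes = np.array([(base * (1 + 0.1 * rng.normal())).ravel()
                   + 0.02 * rng.normal(size=2 * n_landmarks)
                   for _ in range(n_shapes)])

mean_shape = shapes.mean(axis=0)
u, s, vt = np.linalg.svd(shapes - mean_shape, full_matrices=False)
eigvals = s**2 / (n_shapes - 1)     # variance explained by each mode
modes = vt                          # principal modes of variation

# Synthesize a new shape: mean + b1 * first mode, with b1 within 3 sqrt(lambda1)
b1 = 2.0 * np.sqrt(eigvals[0])
new_shape = (mean_shape + b1 * modes[0]).reshape(n_landmarks, 2)
print(f"first mode explains {eigvals[0] / eigvals.sum():.0%} of variance; "
      f"synthesized shape has {new_shape.shape[0]} landmarks")
```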
Medial-based deformable models in nonconvex shape-spaces for medical image segmentation.
McIntosh, Chris; Hamarneh, Ghassan
2012-01-01
We explore the application of genetic algorithms (GA) to deformable models through the proposition of a novel method for medical image segmentation that combines GA with nonconvex, localized, medial-based shape statistics. We replace the more typical gradient descent optimizer used in deformable models with GA, and the convex, implicit, global shape statistics with nonconvex, explicit, localized ones. Specifically, we propose GA to reduce typical deformable model weaknesses pertaining to model initialization, pose estimation and local minima, through the simultaneous evolution of a large number of models. Furthermore, we constrain the evolution, and thus reduce the size of the search-space, by using statistically-based deformable models whose deformations are intuitive (stretch, bulge, bend) and are driven in terms of localized principal modes of variation, instead of modes of variation across the entire shape that often fail to capture localized shape changes. Although GA are not guaranteed to achieve the global optima, our method compares favorably to the prevalent optimization techniques, convex/nonconvex gradient-based optimizers and to globally optimal graph-theoretic combinatorial optimization techniques, when applied to the task of corpus callosum segmentation in 50 mid-sagittal brain magnetic resonance images.
Lee, Adrian Ys; Pridmore, Saxby
2014-04-01
Our aim was (1) to examine global and Australian data with a view to determining the presence of an inverse relationship between suicide and homicide rates, and (2) to examine global Human Development Index (HDI) values and suicide and homicide rates, with a view to determining any statistical relationship. Suicide and homicide rates and HDI values were available for 102 countries, and suicide and homicide rates were available for the states and territories of Australia. The three data sets had non-normal distributions, and the non-parametric Spearman's ρ was used for correlation statistics with α = 0.05. We found a weak, statistically significant inverse relationship between the suicide and homicide rates of 102 countries (ρ = -0.244, p = 0.014). No relationship was established for the Australian values, however. As anticipated, we found a significant negative correlation between homicide and HDI values. We unexpectedly demonstrated a positive correlation between suicide rates and HDI values. The notion that suicide and homicide have an inverse relationship now has some scientific support; but additional research is warranted to characterise and explain this relationship. The unexpected finding of a positive correlation between suicide rates and HDI values requires further examination.
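The kind of correlation test reported is straightforward to reproduce in outline with scipy; the rates below are randomly generated placeholders, not the study data.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
# Placeholder rates for 102 countries; the study used observed values.
suicide_rate = rng.gamma(2.0, 5.0, size=102)
homicide_rate = rng.gamma(2.0, 5.0, size=102)

rho, p_value = spearmanr(suicide_rate, homicide_rate)
if p_value < 0.05:
    print(f"significant: rho = {rho:.3f}, p = {p_value:.3f}")
else:
    print(f"not significant at alpha = 0.05 (p = {p_value:.3f})")
```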
Statistical Maps of Ground Magnetic Disturbance Derived from Global Geospace Models
NASA Astrophysics Data System (ADS)
Rigler, E. J.; Wiltberger, M. J.; Love, J. J.
2017-12-01
Electric currents in space are the principal driver of magnetic variations measured at Earth's surface. These in turn induce geoelectric fields that present a natural hazard for technological systems like high-voltage power distribution networks. Modern global geospace models can reasonably simulate large-scale geomagnetic response to solar wind variations, but they are less successful at deterministic predictions of intense localized geomagnetic activity that most impacts technological systems on the ground. Still, recent studies have shown that these models can accurately reproduce the spatial statistical distributions of geomagnetic activity, suggesting that their physics are largely correct. Since the magnetosphere is a largely externally driven system, most model-measurement discrepancies probably arise from uncertain boundary conditions. So, with realistic distributions of solar wind parameters to establish its boundary conditions, we use the Lyon-Fedder-Mobarry (LFM) geospace model to build a synthetic multivariate statistical model of gridded ground magnetic disturbance. From this, we analyze the spatial modes of geomagnetic response, regress on available measurements to fill in unsampled locations on the grid, and estimate the global probability distribution of extreme magnetic disturbance. The latter offers a prototype geomagnetic "hazard map", similar to those used to characterize better-known geophysical hazards like earthquakes and floods.
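One plausible reading of the mode-decomposition and regression steps, sketched with synthetic data: extract spatial modes of the simulated disturbance ensemble with an SVD, then fit mode amplitudes to the few grid cells that have measurements to estimate the unsampled cells. This is an illustrative reconstruction, not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic ensemble of simulated disturbance fields: (n_samples, n_grid).
sims = rng.normal(size=(5000, 400))
mean_field = sims.mean(axis=0)
_, _, vt = np.linalg.svd(sims - mean_field, full_matrices=False)
modes = vt[:10]                       # leading spatial modes, (10, n_grid)

# Suppose only 30 grid cells have magnetometer measurements.
obs_idx = rng.choice(400, size=30, replace=False)
obs = rng.normal(size=30)             # placeholder measurements

# Least-squares fit of mode amplitudes to the observed cells.
a, *_ = np.linalg.lstsq(modes[:, obs_idx].T, obs - mean_field[obs_idx],
                        rcond=None)
reconstructed = mean_field + a @ modes    # full-grid estimate
```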
Qiao, Xue; Lin, Xiong-hao; Ji, Shuai; Zhang, Zheng-xiang; Bo, Tao; Guo, De-an; Ye, Min
2016-01-05
To fully understand the chemical diversity of an herbal medicine is challenging. In this work, we describe a new approach to globally profile and discover novel compounds from an herbal extract using multiple neutral loss/precursor ion scanning combined with substructure recognition and statistical analysis. Turmeric (the rhizomes of Curcuma longa L.) was used as an example. This approach consists of three steps: (i) multiple neutral loss/precursor ion scanning to obtain substructure information; (ii) targeted identification of new compounds by extracted ion current and substructure recognition; and (iii) untargeted identification using total ion current and multivariate statistical analysis to discover novel structures. Using this approach, 846 terpecurcumins (terpene-conjugated curcuminoids) were discovered from turmeric, including a number of potentially novel compounds. Furthermore, two unprecedented compounds (terpecurcumins X and Y) were purified, and their structures were identified by NMR spectroscopy. This study extended the application of mass spectrometry to global profiling of natural products in herbal medicines and could help chemists to rapidly discover novel compounds from a complex matrix.
NASA Astrophysics Data System (ADS)
Mistry, Malcolm; De Cian, Enrica; Wing, Ian Sue
2015-04-01
There is widespread concern that trends and variability in weather induced by climate change will detrimentally affect global agricultural productivity and food supplies. Reliable quantification of the risks of negative impacts at regional and global scales is a critical research need, which has so far been met by forcing state-of-the-art global gridded crop models with outputs of global climate model (GCM) simulations in exercises such as the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP)-Fastrack. Notwithstanding such progress, it remains challenging to use these simulation-based projections to assess agricultural risk because their gridded fields of crop yields are fundamentally denominated as discrete combinations of warming scenarios, GCMs and crop models, and not as model-specific or model-averaged yield response functions of meteorological shifts, which may have their own independent probability of occurrence. By contrast, the empirical climate economics literature has adeptly represented agricultural responses to meteorological variables as reduced-form statistical response surfaces which identify the crop productivity impacts of additional exposure to different intervals of temperature and precipitation [cf Schlenker and Roberts, 2009]. This raises several important questions: (1) what do the equivalent reduced-form statistical response surfaces look like for crop model outputs, (2) do they exhibit systematic variation over space (e.g., crop suitability zones) or across crop models with different characteristics, (3) how do they compare to estimates based on historical observations, and (4) what are the implications for the characterization of climate risks? We address these questions by estimating statistical yield response functions for four major crops (maize, rice, wheat and soybeans) over the historical period (1971-2004) as well as future climate change scenarios (2005-2099) using ISIMIP-Fastrack data for five GCMs and seven crop models under rain-fed and irrigated management regimes. Our approach, which is patterned after Lobell and Burke [2010], is a novel application of cross-section/time-series statistical techniques from the climate economics literature to large, high-dimension, multi-model datasets, and holds considerable promise as a diagnostic methodology to elucidate uncertainties in the processes simulated by crop models, and to support the development of climate impact intercomparison exercises.
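The reduced-form response surface idea can be sketched as an OLS regression of log yield on exposure time in temperature intervals, after Schlenker and Roberts; everything below (bin structure, coefficients, data) is synthetic illustration, not the ISIMIP analysis.

```python
import numpy as np

rng = np.random.default_rng(4)

# Placeholder panel: exposure hours in temperature bins for each
# grid-cell-year, plus precipitation terms.
n_obs, n_bins = 2000, 8
exposure = rng.gamma(2.0, 50.0, size=(n_obs, n_bins))   # hours per bin
precip = rng.gamma(2.0, 200.0, size=n_obs)
X = np.column_stack([exposure, precip, precip ** 2,
                     np.ones(n_obs)])                   # intercept last

# Synthetic "true" response: mild benefit at cool bins, harm at hot bins.
true_beta = np.concatenate([np.linspace(0.002, -0.004, n_bins),
                            [1e-4, -5e-8, 1.0]])
log_yield = X @ true_beta + rng.normal(scale=0.1, size=n_obs)

# OLS fit; the fitted bin coefficients trace out the response surface.
beta_hat, *_ = np.linalg.lstsq(X, log_yield, rcond=None)
bin_effects = beta_hat[:n_bins]   # marginal effect of an hour in each bin
```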
The span of correlations in dolphin whistle sequences
NASA Astrophysics Data System (ADS)
Ferrer-i-Cancho, Ramon; McCowan, Brenda
2012-06-01
Long-range correlations are found in symbolic sequences from human language, music and DNA. Determining the span of correlations in dolphin whistle sequences is crucial for shedding light on their communicative complexity. Dolphin whistles share various statistical properties with human words, i.e. Zipf's law for word frequencies (namely that the probability of the ith most frequent word of a text is about i^(-α)) and a parallel of the tendency of more frequent words to have more meanings. The finding of Zipf's law for word frequencies in dolphin whistles has been the topic of an intense debate on its implications. One of the major arguments against the relevance of Zipf's law in dolphin whistles is that it is not possible to distinguish the outcome of a die-rolling experiment from that of a linguistic or communicative source producing Zipf's law for word frequencies. Here we show that statistically significant whistle-whistle correlations extend back to the second previous whistle in the sequence, using a global randomization test, and to the fourth previous whistle, using a local randomization test. None of these correlations are expected by a die-rolling experiment and other simple explanations of Zipf's law for word frequencies, such as Simon's model, that produce sequences of unpredictable elements.
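A global randomization test of the sort described can be sketched as follows: compute a lag-k dependence statistic on the observed sequence, then compare it against the same statistic on whole-sequence shuffles. The sequence and statistic here are illustrative stand-ins (whistle types coded as integers).

```python
import numpy as np

rng = np.random.default_rng(5)

def lag_dependence(seq, k):
    """Fraction of positions where the element repeats k steps later."""
    seq = np.asarray(seq)
    return np.mean(seq[:-k] == seq[k:])

# Synthetic whistle-type sequence with some short-range structure.
seq = rng.integers(0, 20, size=1000)
seq[1:] = np.where(rng.random(999) < 0.2, seq[:-1], seq[1:])  # lag-1 repeats

k = 1
observed = lag_dependence(seq, k)
null = np.array([lag_dependence(rng.permutation(seq), k)
                 for _ in range(2000)])
p_value = np.mean(null >= observed)   # one-sided: more repeats than chance
```

A local randomization test would instead shuffle only within a sliding window, so that long-range structure is preserved under the null.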
Alemu, Sisay Mulugeta; Habtewold, Tesfa Dejenie; Haile, Yohannes Gebreegziabhere
2017-01-01
Globally, 3 to 8% of women of reproductive age suffer from premenstrual dysphoric disorder (PMDD). Several mental and reproductive health-related factors cause low academic achievement during university education. However, limited data exist for Ethiopia. The aim of the study was to investigate mental and reproductive health correlates of academic performance. An institution-based cross-sectional study was conducted with 667 Debre Berhan University female students from April to June 2015. Academic performance was the outcome variable; mental and reproductive health characteristics were the explanatory variables. A two-way analysis of variance (ANOVA) test of association was applied to examine group differences in academic performance. Among the 529 students who participated, 49.3% reported mild premenstrual syndrome (PMS), 36.9% reported moderate/severe PMS, and 13.8% fulfilled PMDD diagnostic criteria. The ANOVA test revealed no significant difference in academic performance between students with different levels of PMS experience (F-statistic = 0.08, p-value = 0.93). Nevertheless, there was a significant difference in academic performance between students with different lengths of menses (F-statistic = 5.15, p-value = 0.006). In summary, PMS experience was not associated with academic performance, whereas the length of menses was significantly associated with it.
An Evaluation of the Sniffer Global Optimization Algorithm Using Standard Test Functions
NASA Astrophysics Data System (ADS)
Butler, Roger A. R.; Slaminka, Edward E.
1992-03-01
The performance of Sniffer—a new global optimization algorithm—is compared with that of Simulated Annealing. Using the number of function evaluations as a measure of efficiency, the new algorithm is shown to be significantly better at finding the global minimum of seven standard test functions. Several of the test functions used have many local minima and very steep walls surrounding the global minimum. Such functions are intended to thwart global minimization algorithms.
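The Sniffer algorithm itself is not reproduced here, but the evaluation protocol is easy to sketch: count function evaluations while a reference optimizer (simulated annealing, the comparison method in the paper) searches a standard multimodal test function such as Rastrigin's. The cooling schedule and step size below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(12)
evals = {"count": 0}

def rastrigin(x):
    """A standard multimodal test function; global minimum 0 at x = 0."""
    evals["count"] += 1
    return 10 * len(x) + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

def simulated_annealing(f, x0, n_iter=20000, t0=10.0, step=0.5):
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for i in range(n_iter):
        t = t0 * (1 - i / n_iter) + 1e-9       # linear cooling schedule
        cand = x + rng.normal(scale=step, size=len(x))
        fc = f(cand)
        # Accept improvements always, uphill moves with Boltzmann odds.
        if fc < fx or rng.random() < np.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

best, fbest = simulated_annealing(rastrigin, rng.uniform(-5, 5, size=2))
print(f"best value {fbest:.3f} after {evals['count']} evaluations")
```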
Global estimates of shark catches using trade records from commercial markets.
Clarke, Shelley C; McAllister, Murdoch K; Milner-Gulland, E J; Kirkwood, G P; Michielsens, Catherine G J; Agnew, David J; Pikitch, Ellen K; Nakano, Hideki; Shivji, Mahmood S
2006-10-01
Despite growing concerns about overexploitation of sharks, lack of accurate, species-specific harvest data often hampers quantitative stock assessment. In such cases, trade studies can provide insights into exploitation unavailable from traditional monitoring. We applied Bayesian statistical methods to trade data in combination with genetic identification to estimate by species, the annual number of globally traded shark fins, the most commercially valuable product from a group of species often unrecorded in harvest statistics. Our results provide the first fishery-independent estimate of the scale of shark catches worldwide and indicate that shark biomass in the fin trade is three to four times higher than shark catch figures reported in the only global data base. Comparison of our estimates to approximated stock assessment reference points for one of the most commonly traded species, blue shark, suggests that current trade volumes in numbers of sharks are close to or possibly exceeding the maximum sustainable yield levels.
Pseudochaotic dynamics near global periodicity
NASA Astrophysics Data System (ADS)
Fan, Rong; Zaslavsky, George M.
2007-09-01
In this paper, we study a piecewise linear version of the kicked oscillator model: the saw-tooth map. A special case of global periodicity, in which every phase point belongs to a periodic orbit, is presented. With few analytic results known for the corresponding map on the torus, we numerically investigate transport properties and the statistical behavior of the Poincaré recurrence time in two cases of deviation from global periodicity. Non-KAM behavior of the system, as well as subdiffusion and superdiffusion, is observed through numerical simulations. The statistics of Poincaré recurrences show that the Kac lemma holds for the system and that there is a relation between the transport exponent and the Poincaré recurrence exponent. We also perform careful numerical computation of the capacity, information, and correlation dimensions of the so-called exceptional set in both cases. Our results show that the fractal dimension of the exceptional set is strictly less than 2 and that the fractal structures are unifractal rather than multifractal.
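A hedged sketch of this kind of numerical experiment, using one common torus form of the saw-tooth map (the paper's exact map, parameters, and recurrence region may differ): iterate the map and record Poincaré recurrence times to a small cell around the initial condition, then compare the empirical mean with the Kac-lemma estimate.

```python
import numpy as np

def sawtooth_map(x, p, K):
    """One common form of the kicked saw-tooth map on the unit torus."""
    p = (p + K * (x - 0.5)) % 1.0   # saw-tooth kick, discontinuous in x
    x = (x + p) % 1.0
    return x, p

def recurrence_times(K, n_steps=500_000, cell=0.02):
    x, p = 0.1, 0.3
    x0, p0 = x, p
    times, last = [], 0
    for t in range(1, n_steps + 1):
        x, p = sawtooth_map(x, p, K)
        if abs(x - x0) < cell and abs(p - p0) < cell:
            times.append(t - last)
            last = t
    return np.array(times)

# K chosen as a small deviation from a regular-dynamics parameter value.
times = recurrence_times(K=0.01)
cell_area = (2 * 0.02) ** 2   # measure of the recurrence cell
print(f"mean recurrence {times.mean():.0f}, "
      f"Kac estimate {1 / cell_area:.0f}")
```

The tail of the recurrence-time distribution, rather than its mean, is what carries the transport exponent discussed in the paper.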
[Globalization and infectious diseases in Mexico's indigenous population].
Castro, Roberto; Erviti, Joaquina; Leyva, René
2007-01-01
This paper discusses the health status of indigenous populations in Mexico. The first section characterizes the concept of globalization and its links to the population's health. Based on available statistical data, the second section documents the current indigenous populations' health status in the country. The article then argues that the presupposition of equity, crucial to globalization theory, does not apply to this case. Using the Mexican National Health Survey (2000), the third section further analyzes the health status of indigenous populations and identifies important inconsistencies in the data. The discussion section contends that these inconsistencies derive from the fact that such health surveys fail to contemplate the cultural specificities of indigenous peoples, thus leading to erroneous interpretations of the data. The article concludes that statistics on indigenous peoples' health must be interpreted with extreme caution and always with the support of social science theories and research methods.
NASA Astrophysics Data System (ADS)
Tadić, Bosiljka; Thurner, Stefan; Rodgers, G. J.
2004-03-01
We study the microscopic time fluctuations of traffic load and the global statistical properties of a dense traffic of particles on scale-free cyclic graphs. For a wide range of driving rates R the traffic is stationary and the load time series exhibits antipersistence due to the regulatory role of the superstructure associated with two hub nodes in the network. We discuss how the superstructure affects the functioning of the network at high traffic density and at the jamming threshold. The degree of correlations systematically decreases with increasing traffic density and eventually disappears when approaching a jamming density Rc. Already before jamming we observe qualitative changes in the global network-load distributions and the particle queuing times. These changes are related to the occurrence of temporary crises in which the network-load increases dramatically, and then slowly falls back to a value characterizing free flow.
Rainfall statistics, stationarity, and climate change.
Sun, Fubao; Roderick, Michael L; Farquhar, Graham D
2018-03-06
There is a growing research interest in the detection of changes in hydrologic and climatic time series. Stationarity can be assessed using the autocorrelation function, but this is not yet common practice in hydrology and climate. Here, we use a global land-based gridded annual precipitation (hereafter P ) database (1940-2009) and find that the lag 1 autocorrelation coefficient is statistically significant at around 14% of the global land surface, implying nonstationary behavior (90% confidence). In contrast, around 76% of the global land surface shows little or no change, implying stationary behavior. We use these results to assess change in the observed P over the most recent decade of the database. We find that the changes for most (84%) grid boxes are within the plausible bounds of no significant change at the 90% CI. The results emphasize the importance of adequately accounting for natural variability when assessing change. Copyright © 2018 the Author(s). Published by PNAS.
Rainfall statistics, stationarity, and climate change
NASA Astrophysics Data System (ADS)
Sun, Fubao; Roderick, Michael L.; Farquhar, Graham D.
2018-03-01
There is a growing research interest in the detection of changes in hydrologic and climatic time series. Stationarity can be assessed using the autocorrelation function, but this is not yet common practice in hydrology and climate. Here, we use a global land-based gridded annual precipitation (hereafter P) database (1940–2009) and find that the lag 1 autocorrelation coefficient is statistically significant at around 14% of the global land surface, implying nonstationary behavior (90% confidence). In contrast, around 76% of the global land surface shows little or no change, implying stationary behavior. We use these results to assess change in the observed P over the most recent decade of the database. We find that the changes for most (84%) grid boxes are within the plausible bounds of no significant change at the 90% CI. The results emphasize the importance of adequately accounting for natural variability when assessing change.
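The per-grid-box test can be sketched as follows for a single 70-year annual series; the white-noise significance bound used here (±1.645/√N at 90% confidence) is a textbook approximation and may differ in detail from the authors' procedure. The series is synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)

def lag1_autocorr(x):
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

# Placeholder for one grid box: 70 annual precipitation values (1940-2009).
p_series = rng.gamma(4.0, 200.0, size=70)

r1 = lag1_autocorr(p_series)
# Approximate 90% bounds for white noise: +/- 1.645 / sqrt(N).
bound = 1.645 / np.sqrt(len(p_series))
nonstationary = abs(r1) > bound
print(f"r1 = {r1:.3f}, 90% bound = {bound:.3f}, flagged: {nonstationary}")
```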
Implications of MOLA Global Roughness, Statistics, and Topography
NASA Technical Reports Server (NTRS)
Aharonson, O.; Zuber, M. T.; Neumann, G. A.
1999-01-01
New insights are emerging as the ongoing high-quality measurements of the Martian surface topography by the Mars Orbiter Laser Altimeter (MOLA) on board the Mars Global Surveyor (MGS) spacecraft increase in coverage, resolution, and diversity. For the first time, a global characterization of the statistical properties of topography is possible. The data were collected during the aerobraking hiatus, science phasing, and mapping orbits of MGS, and have a resolution of 300-400 m along track, a range resolution of 37.5 cm, a range precision of 1-10 m for surface slopes up to 30°, and an absolute accuracy of topography of 13 m. The spacecraft's orbit inclination dictates that nadir observations have latitude coverage of about 87.1°S to 87.1°N; the addition of observations obtained during a period of off-nadir pointing over the north pole extended coverage to 90°N. Additional information is contained in the original extended abstract.
2014-01-01
Background: Assessing heterogeneity in lung images can be an important diagnostic tool. We present a novel and objective method for assessing lung damage in a rat model of emphysema. We combined a three-dimensional (3D) computer graphics method, octree decomposition, with a geostatistics-based approach for assessing spatial relationships, the variogram, to evaluate disease in 3D computed tomography (CT) image volumes. Methods: Male Sprague-Dawley rats were dosed intratracheally with saline (control), or with elastase dissolved in saline delivered either to the whole lung (for mild, global disease) or to a single lobe (for severe, local disease). Gated 3D micro-CT images were acquired of the lungs of all rats at end expiration. Images were masked, and octree decomposition was performed to reduce the lungs to homogeneous blocks of 2 × 2 × 2, 4 × 4 × 4, and 8 × 8 × 8 voxels. To focus on lung parenchyma, small blocks were ignored because they primarily defined boundaries and vascular features, and the spatial variance between all pairs of the 8 × 8 × 8 blocks was calculated as the square of the difference in signal intensity. Variograms (graphs of distance vs. variance) were constructed, and the results of least-squares fits were compared. The robustness of the approach was tested on images prepared with various filtering protocols. Statistical assessment of the similarity of the three control rats was made with a Kruskal-Wallis rank sum test. A Mann-Whitney-Wilcoxon rank sum test was used to measure statistical distinction between individuals. For comparison with the variogram results, the coefficient of variation and the emphysema index were also calculated for all rats. Results: Variogram analysis showed that the control rats were statistically indistinct (p = 0.12), but there were significant differences between the control, mild global disease, and severe local disease groups (p < 0.0001). A heterogeneity index was calculated to describe the difference of an individual variogram from the control average; this metric also showed clear separation between dose groups. The coefficient of variation and the emphysema index, on the other hand, did not separate the groups. Conclusion: These results suggest that the octree decomposition and variogram analysis approach may be a rapid, non-subjective, and sensitive imaging-based biomarker for characterizing lung disease. PMID:24393332
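The variogram step lends itself to a compact sketch: take the retained homogeneous blocks and bin the squared intensity differences of all block pairs by the distance between block centroids. Centroids and intensities below are synthetic placeholders for the octree output.

```python
import numpy as np

rng = np.random.default_rng(7)

# Placeholder octree output: centroid (in voxels) and mean intensity of
# each retained 8x8x8 block.
centroids = rng.uniform(0, 256, size=(300, 3))
intensity = rng.normal(size=300)

# Pairwise centroid distances and squared intensity differences.
diff = centroids[:, None, :] - centroids[None, :, :]
dist = np.sqrt((diff ** 2).sum(axis=-1))
sqdiff = (intensity[:, None] - intensity[None, :]) ** 2

iu = np.triu_indices(len(centroids), k=1)        # count each pair once
bins = np.linspace(0, dist[iu].max(), 20)
which = np.clip(np.digitize(dist[iu], bins), 1, len(bins) - 1)
variogram = np.array([sqdiff[iu][which == b].mean()
                      if np.any(which == b) else np.nan
                      for b in range(1, len(bins))])
# variogram[b] is the mean squared difference at lag bin b; a
# least-squares fit of a variogram model would follow here.
```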
Moment-based metrics for global sensitivity analysis of hydrological systems
NASA Astrophysics Data System (ADS)
Dell'Oca, Aronne; Riva, Monica; Guadagnini, Alberto
2017-12-01
We propose new metrics to assist global sensitivity analysis, GSA, of hydrological and Earth systems. Our approach allows assessing the impact of uncertain parameters on main features of the probability density function, pdf, of a target model output, y. These include the expected value of y, the spread around the mean and the degree of symmetry and tailedness of the pdf of y. Since reliable assessment of higher-order statistical moments can be computationally demanding, we couple our GSA approach with a surrogate model, approximating the full model response at a reduced computational cost. Here, we consider the generalized polynomial chaos expansion (gPCE), other model reduction techniques being fully compatible with our theoretical framework. We demonstrate our approach through three test cases, including an analytical benchmark, a simplified scenario mimicking pumping in a coastal aquifer and a laboratory-scale conservative transport experiment. Our results allow ascertaining which parameters can impact some moments of the model output pdf while being uninfluential to others. We also investigate the error associated with the evaluation of our sensitivity metrics by replacing the original system model through a gPCE. Our results indicate that the construction of a surrogate model with increasing level of accuracy might be required depending on the statistical moment considered in the GSA. The approach is fully compatible with (and can assist the development of) analysis techniques employed in the context of reduction of model complexity, model calibration, design of experiment, uncertainty quantification and risk assessment.
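The gist of moment-based sensitivity metrics can be illustrated by brute-force Monte Carlo: condition the output on each parameter (by binning its samples) and measure how far the conditional mean and variance move from their unconditional values. The model is a stand-in; the paper evaluates such metrics through a gPCE surrogate rather than raw sampling.

```python
import numpy as np

rng = np.random.default_rng(8)

def model(x):
    # Stand-in for the hydrological model output y(x1, x2, x3).
    return x[:, 0] ** 2 + 0.5 * x[:, 1] + 0.1 * x[:, 0] * x[:, 2]

n, n_params, n_bins = 100_000, 3, 20
x = rng.uniform(-1, 1, size=(n, n_params))
y = model(x)

for i in range(n_params):
    # Conditional moments of y given x_i, via quantile binning.
    edges = np.quantile(x[:, i], np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.digitize(x[:, i], edges) - 1, 0, n_bins - 1)
    cond_mean = np.array([y[idx == b].mean() for b in range(n_bins)])
    cond_var = np.array([y[idx == b].var() for b in range(n_bins)])
    # Sensitivity of the mean (first-order-Sobol-like) and of the variance.
    s_mean = cond_mean.var() / y.var()
    s_var = np.abs(cond_var - y.var()).mean() / y.var()
    print(f"x{i+1}: mean-based {s_mean:.3f}, variance-based {s_var:.3f}")
```

A parameter can score near zero on the mean-based metric while still reshaping the variance or the tails, which is exactly the distinction the proposed metrics are meant to expose.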
NASA Astrophysics Data System (ADS)
Kim, D.; Youn, J.; Kim, C.
2017-08-01
As a malfunctioning PV (photovoltaic) cell has a higher temperature than adjacent normal cells, it can be detected easily with a thermal infrared sensor. However, inspecting large-scale PV power plants with a hand-held thermal infrared sensor is time-consuming. This paper presents an algorithm for automatically detecting defective PV panels using images captured with a thermal imaging camera from a UAV (unmanned aerial vehicle). The proposed algorithm applies statistical analysis to the thermal intensity (surface temperature) characteristics of each PV module, using the mean intensity and standard deviation of each panel as parameters for fault diagnosis. One characteristic of thermal infrared imaging is that the larger the distance between sensor and target, the lower the measured temperature of the object. Consequently, a global detection rule using the mean intensity of all panels is not applicable. Therefore, a local detection rule based on the mean intensity and standard deviation range was developed to detect defective PV modules from each individual array automatically. The performance of the proposed algorithm was tested on three sample images; this verified a detection accuracy for defective panels of 97% or higher. In addition, as the proposed algorithm can adjust the range of threshold values for judging malfunction at the array level, the local detection rule is considered better suited for highly sensitive fault detection than a global detection rule.
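The local detection rule reduces to a few lines: within each array, flag panels whose mean intensity deviates from the array mean by more than a chosen multiple of the array's standard deviation. The threshold k and the panel statistics below are illustrative, not the paper's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(9)

def detect_faulty(panel_means, k=2.0):
    """Flag panels whose mean intensity deviates from the array mean
    by more than k times the array's standard deviation."""
    mu, sigma = panel_means.mean(), panel_means.std()
    return np.abs(panel_means - mu) > k * sigma

# Placeholder: mean thermal intensities of 60 panels in one array,
# with two hot (defective) modules.
panels = rng.normal(30.0, 0.8, size=60)
panels[[12, 41]] += 6.0
flagged = np.flatnonzero(detect_faulty(panels))
print("suspect panels:", flagged)
```

Because the statistics are computed per array, the distance-dependent temperature bias mentioned above cancels out within each array.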
Modeling radiation belt dynamics using a 3-D layer method code
NASA Astrophysics Data System (ADS)
Wang, C.; Ma, Q.; Tao, X.; Zhang, Y.; Teng, S.; Albert, J. M.; Chan, A. A.; Li, W.; Ni, B.; Lu, Q.; Wang, S.
2017-08-01
A new 3-D diffusion code using a recently published layer method has been developed to analyze radiation belt electron dynamics. The code guarantees the positivity of the solution even when mixed diffusion terms are included. Unlike most of the previous codes, our 3-D code is developed directly in equatorial pitch angle (α0), momentum (p), and L shell coordinates; this eliminates the need to transform back and forth between (α0,p) coordinates and adiabatic invariant coordinates. Using (α0,p,L) is also convenient for direct comparison with satellite data. The new code has been validated by various numerical tests, and we apply the 3-D code to model the rapid electron flux enhancement following the geomagnetic storm on 17 March 2013, which is one of the Geospace Environment Modeling Focus Group challenge events. An event-specific global chorus wave model, an AL-dependent statistical plasmaspheric hiss wave model, and a recently published radial diffusion coefficient formula from Time History of Events and Macroscale Interactions during Substorms (THEMIS) statistics are used. The simulation results show good agreement with satellite observations, in general, supporting the scenario that the rapid enhancement of radiation belt electron flux for this event results from an increased level of the seed population by radial diffusion, with subsequent acceleration by chorus waves. Our results prove that the layer method can be readily used to model global radiation belt dynamics in three dimensions.
The GODAE High Resolution Sea Surface Temperature Pilot Project (GHRSST-PP)
NASA Astrophysics Data System (ADS)
Donlon, C.; Ghrsst-Pp Science Team
2003-04-01
This paper summarises the Development and Implementation Plan of the GODAE High Resolution Sea Surface Temperature Pilot Project (GHRSST-PP). The aim of the GHRSST-PP is to coordinate a new generation of global, multi-sensor, high-resolution (better than 10 km and 12 hours) SST products for the benefit of the operational and scientific community and for those with a potential interest in the products of GODAE. The GHRSST-PP project will deliver a demonstration system that integrates data from existing international satellite and in situ data sources using state-of-the-art communications and analysis tools. Primary GHRSST-PP products will be generated by fusing infrared and microwave satellite data obtained from sensors in near-polar, geostationary, and low Earth orbits, constrained by in situ observations. Surface skin SST, sub-surface SST, and SST at depth will be produced as both merged and analysed data products. Merged data products share a common grid, with all input data retaining their own error statistics, whereas analysed data products combine all data to derive a best-estimate product with a single set of error statistics. Merged SST fields will not be interpolated, thereby preserving the integrity of the source data as much as possible. Products will first be produced and validated against in situ observations for regional areas by regional data assembly centres (RDAC) and sent to a global data analysis centre (GDAC) for integration with other data to provide global coverage. GDAC and RDAC will be connected together with other data sources using a virtual dynamic distributed database (DDD). The GDAC will merge and analyse RDAC data together with other data (from the GTS and space agencies) to provide global coverage every 12 hours in real time. In all cases data products will be accurate to better than 0.5 K, validated using data collected at globally distributed diagnostic data set (DDS) sites. A user information service (UIS) will work together with user applications and services (AUS) to ensure that the GHRSST-PP is able to respond appropriately to user demands. In addition, the GDAC will provide product validation and dissemination services, as well as the means for researchers to test and use the In situ and Satellite Data Integration Processing Model (ISDI-PM) operational demonstration code on a large supercomputer.
LaRiccia, Patrick J; Farrar, John T; Sammel, Mary D; Gallo, Joseph J
2008-07-01
To determine the efficacy of the food supplement OPC Factor to increase energy levels in healthy adults aged 45 to 65. Randomized, placebo-controlled, triple-blind crossover study. Twenty-five (25) healthy adults recruited from the University of Pennsylvania Health System. OPC Factor™ (AlivenLabs, Lebanon, TN), a food supplement that contains oligomeric proanthocyanidins from grape seeds and pine bark along with other nutrient supplements including vitamins and minerals, was in the form of an effervescent powder. The placebo was similar in appearance and taste. Five outcome measurements were performed: (1) Energy subscale scores of the Activation-Deactivation Adjective Check List (AD ACL); (2) one global question of percent energy change (Global Energy Percent Change); (3) one global question of energy change measured on a Likert scale (Global Energy Scale Change); (4) one global question of percent overall status change (Global Overall Status Percent Change); and (5) one global question of overall status change measured on a Likert scale (Global Overall Status Scale Change). There were no carryover/period effects in the groups randomized to the Placebo/Active Product sequence versus the Active Product/Placebo sequence. Examination of the AD ACL Energy subscale scores for the Active Product versus Placebo comparison revealed no significant difference in the intention-to-treat (IT) analysis or the treatment-received (TR) analysis. However, Global Energy Percent Change (p = 0.06) and Global Energy Scale Change (p = 0.09) both closely approached conventional levels of statistical significance for the active product in the IT analysis. Global Energy Percent Change (p = 0.05) and Global Energy Scale Change (p = 0.04) reached statistical significance in the TR analysis. A cumulative percent-responders analysis graph indicated greater response rates for the active product. OPC Factor may increase energy levels in healthy adults aged 45-65 years. A larger study is recommended. ClinicalTrials.gov identifier: NCT03318019.
Advancing the Use of Earth Observations to Benefit Global Food Security and Agriculture
USDA-ARS?s Scientific Manuscript database
USDA plays an important role as “fair broker” of information on the status and security of the United States and global food supply. USDA surveys and farmer relationships are the source of much of the “ground-truth” required for statistical assessments of crop area, yield, and production domestical...
Global Initiative on Out-of-School Children: All Children in School by 2015
ERIC Educational Resources Information Center
UNICEF, 2012
2012-01-01
The United Nations International Children's Emergency Fund (UNICEF) and the United Nations Educational, Scientific and Cultural Organization (UNESCO) Institute for Statistics (UIS) launched the joint Global Initiative on Out-of-School Children in 2010 to accelerate efforts towards the goal of universal primary education by 2015. The goal of the…
Global Document Delivery, User Studies, and Service Evaluation: The Gateway Experience
ERIC Educational Resources Information Center
Miller, Rush; Xu, Hong; Zou, Xiuying
2008-01-01
This study examines user and service data from 2002-2006 at the East Asian Gateway Service for Chinese and Korean Academic Journal Publications (Gateway Service), the University of Pittsburgh. Descriptive statistical analysis reveals that the Gateway Service has been consistently playing the leading role in global document delivery service as well…
ERIC Educational Resources Information Center
Conway-Gomez, Kristen; Palacios, Fabian Araya
2011-01-01
Students participated in online discussions about sustainable development using a lesson from the Center for Global Geography Education. Students showed statistically significant changes in attitudes about solutions for global problems. Results suggest Chilean students' attitudes toward grades in geography classes, and attitudes toward awareness…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Chidong
Motivated by the success of the AMIE/DYNAMO field campaign, which collected unprecedented observations of cloud and precipitation over the tropical Indian Ocean in October 2011 – March 2012, this project explored how such observations can be applied to assist the development of global cloud-permitting models through evaluating and correcting model biases in cloud statistics. The main accomplishments of this project fall into four categories: generating observational products for model evaluation, using AMIE/DYNAMO observations to validate global model simulations, using AMIE/DYNAMO observations in numerical studies with cloud-permitting models, and providing leadership in the field. Results from this project provide valuable information for building a seamless bridge between the DOE ASR program's component on process-level understanding of cloud processes in the tropics and the RGCM focus on global variability and regional extremes. In particular, experience gained from this project would be directly applicable to the evaluation and improvement of ACME, especially as it transitions to a non-hydrostatic variable-resolution model.
Colizza, Vittoria; Barrat, Alain; Barthélemy, Marc; Vespignani, Alessandro
2006-02-14
The systematic study of large-scale networks has unveiled the ubiquitous presence of connectivity patterns characterized by large-scale heterogeneities and unbounded statistical fluctuations. These features affect dramatically the behavior of the diffusion processes occurring on networks, determining the ensuing statistical properties of their evolution pattern and dynamics. In this article, we present a stochastic computational framework for the forecast of global epidemics that considers the complete worldwide air travel infrastructure complemented with census population data. We address two basic issues in global epidemic modeling: (i) we study the role of the large scale properties of the airline transportation network in determining the global diffusion pattern of emerging diseases; and (ii) we evaluate the reliability of forecasts and outbreak scenarios with respect to the intrinsic stochasticity of disease transmission and traffic flows. To address these issues we define a set of quantitative measures able to characterize the level of heterogeneity and predictability of the epidemic pattern. These measures may be used for the analysis of containment policies and epidemic risk assessment.
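The modeling framework can be caricatured by a stochastic metapopulation SIR sketch: chain-binomial epidemic steps inside each city, coupled by multinomial draws of travelling infectious individuals. Populations, travel flows, and rates below are toy placeholders for the census and airline data the authors use.

```python
import numpy as np

rng = np.random.default_rng(13)

# Toy metapopulation: n cities coupled by a daily travel matrix.
n = 5
pop = np.full(n, 1_000_000.0)
travel = rng.integers(200, 2000, size=(n, n)).astype(float)
np.fill_diagonal(travel, 0.0)
beta, gamma = 0.3, 0.1                        # infection / recovery per day

S, I, R = pop.copy(), np.zeros(n), np.zeros(n)
S[0] -= 10; I[0] += 10                        # seed an outbreak in city 0

for day in range(365):
    # Local stochastic SIR step (chain-binomial draws).
    new_inf = rng.binomial(S.astype(int), 1 - np.exp(-beta * I / pop))
    new_rec = rng.binomial(I.astype(int), 1 - np.exp(-gamma))
    S -= new_inf; I += new_inf - new_rec; R += new_rec
    # Stochastic travel of infectious individuals (multinomial split).
    moved = np.zeros(n)
    for i in range(n):
        p = travel[i] / pop[i]
        flows = rng.multinomial(int(I[i]), np.append(p, 1 - p.sum()))
        moved += flows[:n]
        moved[i] -= flows[:n].sum()
    I += moved

print("final attack rates:", (R / pop).round(3))
```

Repeating such runs many times is what yields the heterogeneity and predictability measures the article defines: the spread across stochastic realizations quantifies how reliable a single forecast can be.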
Global Sensory Qualities and Aesthetic Experience in Music
Brattico, Pauli; Brattico, Elvira; Vuust, Peter
2017-01-01
A well-known tradition in the study of visual aesthetics holds that the experience of visual beauty is grounded in global computational or statistical properties of the stimulus, for example, scale-invariant Fourier spectrum or self-similarity. Some approaches rely on neural mechanisms, such as efficient computation, processing fluency, or the responsiveness of the cells in the primary visual cortex. These proposals are united by the fact that the contributing factors are hypothesized to be global (i.e., they concern the percept as a whole), formal or non-conceptual (i.e., they concern form instead of content), computational and/or statistical, and based on relatively low-level sensory properties. Here we consider that the study of aesthetic responses to music could benefit from the same approach. Thus, along with local features such as pitch, tuning, consonance/dissonance, harmony, timbre, or beat, also global sonic properties could be viewed as contributing toward creating an aesthetic musical experience. Several such properties are discussed and their neural implementation is reviewed in the light of recent advances in neuroaesthetics. PMID:28424573
NASA Technical Reports Server (NTRS)
Wilson, Robert M.
2007-01-01
Statistical aspects of North Atlantic basin tropical cyclones for the interval 1945-2005 are examined, including the variation of the yearly frequency of occurrence for various subgroups of storms (all tropical cyclones, hurricanes, major hurricanes, U.S. landfalling hurricanes, and category 4/5 hurricanes); the yearly variation of the mean latitude and longitude (genesis location) of all tropical cyclones and hurricanes; and the yearly variation of the mean peak wind speeds, lowest pressures, and durations of all tropical cyclones, hurricanes, and major hurricanes. Also examined is the relationship between inferred trends in North Atlantic basin tropical cyclone activity and both natural variability and global warming, the latter described using surface air temperatures from the Armagh Observatory, Armagh, Northern Ireland. Lastly, a simple statistical technique is employed to ascertain the expected level of North Atlantic basin tropical cyclone activity for the upcoming 2007 season.
Load balancing for massively-parallel soft-real-time systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hailperin, M.
1988-09-01
Global load balancing, if practical, would allow the effective use of massively-parallel ensemble architectures for large soft-real-time problems. The challenge is to replace quick global communication, which is impractical in a massively-parallel system, with statistical techniques. In this vein, the author proposes a novel approach to decentralized load balancing based on statistical time-series analysis. Each site estimates the system-wide average load using information about the past loads of individual sites and attempts to equal that average. This estimation process is practical because the soft-real-time systems of interest naturally exhibit loads that are periodic, in a statistical sense akin to seasonality in econometrics. It is shown how this load-characterization technique can be the foundation for a load-balancing system in an architecture employing cut-through routing and an efficient multicast protocol.
Long-term sea level trends: Natural or anthropogenic?
NASA Astrophysics Data System (ADS)
Becker, M.; Karpytchev, M.; Lennartz-Sassinek, S.
2014-08-01
Detection and attribution of human influence on sea level rise are important topics that have not yet been explored in depth. We question whether the sea level changes (SLC) over the past century were natural in origin. SLC exhibit power-law long-term correlations. By estimating the Hurst exponent through Detrended Fluctuation Analysis and by applying the statistics of Lennartz and Bunde, we search for the lower bounds of statistically significant external sea level trends in the longest tidal records worldwide. We provide statistical evidence that the observed SLC, at global and regional scales, is beyond its natural internal variability. The minimum anthropogenic sea level trend (MASLT) contributes more than 50% of the observed sea level rise in New York, Baltimore, San Diego, Marseille, and Mumbai. The MASLT is about 1 mm/yr in global sea level reconstructions, which is more than half of the total observed sea level trend during the 20th century.
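Detrended Fluctuation Analysis, the estimator named above, fits in a short function: integrate the demeaned series, detrend it segment by segment, and read the scaling exponent off the log-log slope of fluctuation versus window size. The input below is white noise (expected exponent near 0.5), not a tide-gauge record.

```python
import numpy as np

def dfa_hurst(x, scales=(8, 16, 32, 64, 128)):
    """Estimate the Hurst exponent via DFA with linear detrending."""
    profile = np.cumsum(x - np.mean(x))
    flucts = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)          # local linear trend
            rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(rms)))
    # Slope of log F(s) vs log s is the DFA exponent (~Hurst for 0<H<1).
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(10)
print(dfa_hurst(rng.normal(size=4096)))   # ~0.5 for white noise
```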
Global Volcanism on Mercury at About 3.8 Ga
NASA Astrophysics Data System (ADS)
Byrne, P. K.; Ostrach, L. R.; Denevi, B. W.; Head, J. W., III; Hauck, S. A., II; Murchie, S. L.; Solomon, S. C.
2014-12-01
Smooth plains occupy c. 27% of the surface of Mercury. Embayment relations, spectral contrast with surroundings, and morphologic characteristics indicate that the majority of these plains are volcanic. The largest deposits are located in Mercury's northern hemisphere and include the extensive northern plains (NP) and the Caloris interior and exterior plains (with the latter likely including basin material). Both the NP and Caloris deposits are, within statistical error, the same age (~3.8-3.9 Ga). To test whether this age reflects a period of global volcanism on Mercury, we determined crater size-frequency distributions for four smooth plains units in the planet's southern hemisphere interpreted to be volcanic. Two deposits are situated within the Beethoven and Tolstoj impact basins; two are located close to the Debussy and the Alver and Disney basins, respectively. Each deposit hosts two populations of craters, one that postdates plains emplacement and one that consists of partially to nearly filled craters that predate the plains. This latter population indicates that some time elapsed between formation of the underlying basement and plains volcanism, though we cannot statistically resolve this interval at any of the four sites. Nonetheless, we find that the age given by the superposed crater population in each case is ~3.8 Ga, and crater density values are consistent with those for the NP and Caloris plains. This finding supports a global phase of volcanism near the end of the late heavy bombardment of Mercury and may indicate a period of widespread partial melting of Mercury's mantle. Notably, superposition relations between smooth plains, degraded impact structures, and contractional landforms suggest that by this time interior cooling had already placed Mercury's lithosphere in horizontal compression, tending to inhibit voluminous dike-fed volcanism such as that inferred responsible for the NP. Most smooth plains units, including the Caloris plains and our four study sites, are spatially associated with impact structures; even the NP lie in a regional depression that may be impact-related. Because impacts remove overburden, deposit subsurface heat, and relax pre-existing stress, basins and craters may represent preferential sites for volcanic resurfacing on a globally contracting planet.
US medical specialty global health training and the global burden of disease
Kerry, Vanessa B.; Walensky, Rochelle P.; Tsai, Alexander C.; Bergmark, Regan W.; Bergmark, Brian A.; Rouse, Chaturia; Bangsberg, David R.
2013-01-01
Background: Rapid growth in global health activity among US medical specialty education programs has led to heterogeneity in types of activities and global health training models. The breadth and scope of this activity is not well chronicled. Methods: Using a standardized search protocol, we examined the characteristics of US medical residency global health programs by number of programs, clinical specialty, nature of activity (elective, research, extended curriculum-based field training), and geographic location across seven different clinical medical residency education specialties. We tabulated programmatic activity by clinical discipline, region, and country. We calculated the Spearman's rank correlation coefficient to estimate the association between programmatic activity and country-level disease burden. Results: Of the 1856 programs assessed between January and June 2011, there were 380 global health residency training programs (20%) working in 141 countries. 529 individual programmatic activities (elective-based rotations, research programs, extended curriculum-based field training, or other) occurred at 1337 specific sites. The majority of the activities consisted of elective-based rotations. At the country level, disease burden had a statistically significant association with programmatic activity (Spearman's ρ = 0.17) but explained only 3% of the total variation between countries. Conclusions: There were a substantial number of US medical specialty global health programs, but a relative paucity of surgical and mental health programs. Elective-based programs were more common than programs offering longitudinal experiences. Despite heterogeneity, there was a small but statistically significant association between program location and the global burden of disease. Areas for further study include the degree to which US-based programs develop partnerships with their program sites, the significance of this activity for training, and the number and breadth of medical specialty global health education programs in other countries around the world. PMID:24363924
Observed decreases in the Canadian outdoor skating season due to recent winter warming
NASA Astrophysics Data System (ADS)
Damyanov, Nikolay N.; Damon Matthews, H.; Mysak, Lawrence A.
2012-03-01
Global warming has the potential to negatively affect one of Canada’s primary sources of winter recreation: hockey and ice skating on outdoor rinks. Observed changes in winter temperatures in Canada suggest changes in the meteorological conditions required to support the creation and maintenance of outdoor skating rinks; while there have been observed increases in the ice-free period of several natural water bodies, there has been no study of potential trends in the duration of the season supporting the construction of outdoor skating rinks. Here we show that the outdoor skating season (OSS) in Canada has significantly shortened in many regions of the country as a result of changing climate conditions. We first established a meteorological criterion for the beginning, and a proxy for the length of the OSS. We extracted this information from daily maximum temperature observations from 1951 to 2005, and tested it for significant changes over time due to global warming as well as due to changes in patterns of large-scale natural climate variability. We found that many locations have seen a statistically significant decrease in the OSS length, particularly in Southwest and Central Canada. This suggests that future global warming has the potential to significantly compromise the viability of outdoor skating in Canada.
Assessing compatibility of direct detection data: halo-independent global likelihood analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gelmini, Graciela B.; Huh, Ji-Haeng; Witte, Samuel J.
2016-10-18
We present two different halo-independent methods to assess the compatibility of several direct dark matter detection data sets for a given dark matter model using a global likelihood consisting of at least one extended likelihood and an arbitrary number of Gaussian or Poisson likelihoods. In the first method we find the global best fit halo function (we prove that it is a unique piecewise constant function with a number of down steps smaller than or equal to a maximum number that we compute) and construct a two-sided pointwise confidence band at any desired confidence level, which can then be compared with those derived from the extended likelihood alone to assess the joint compatibility of the data. In the second method we define a “constrained parameter goodness-of-fit” test statistic, whose p-value we then use to define a “plausibility region” (e.g. where p≥10%). For any halo function not entirely contained within the plausibility region, the level of compatibility of the data is very low (e.g. p<10%). We illustrate these methods by applying them to CDMS-II-Si and SuperCDMS data, assuming dark matter particles with elastic spin-independent isospin-conserving interactions or exothermic spin-independent isospin-violating interactions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Yuanshun; Baek, Seung H.; Garcia-Diza, Alberto
2012-01-01
This paper designs a comprehensive approach, based on the engineering machine/system concept, to model, analyze, and assess the level of CO2 exchange between the atmosphere and terrestrial ecosystems, which is an important factor in understanding changes in global climate. The focus of this article is on spatial patterns and on the correlation between levels of CO2 fluxes and a variety of influencing factors in eco-environments. The engineering/machine concept used is a system protocol that includes the sequential activities of design, test, observe, and model. This concept is applied to explicitly include the various influencing factors and interactions associated with CO2 fluxes. To formulate effective models of a large and complex climate system, this article introduces a modeling technique referred to as Stochastic Filtering Analysis of Variance (SF-ANOVA). CO2 flux data observed at several AmeriFlux sites are used to illustrate and validate the analysis, prediction, and globalization capabilities of the proposed engineering approach and the SF-ANOVA technique. The SF-ANOVA modeling approach was compared to stepwise regression, ridge regression, and neural networks; the comparison indicated that the proposed approach is a valid and effective tool with similar accuracy and less complexity than the other procedures.
Global Sensitivity Analysis with Small Sample Sizes: Ordinary Least Squares Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Michael J.; Liu, Wei; Sivaramakrishnan, Raghu
2016-12-21
A new version of global sensitivity analysis is developed in this paper. This new version, coupled with tools from statistics, machine learning, and optimization, can devise small sample sizes that allow for the accurate ordering of sensitivity coefficients for the first 10-30 most sensitive chemical reactions in complex chemical-kinetic mechanisms, and is particularly useful for studying the chemistry in realistic devices. A key part of the paper is the calibration of these small samples. Because these small sample sizes are developed for use in realistic combustion devices, the calibration is done over the ranges of conditions in such devices, with a test case being the operating conditions of a compression ignition engine studied earlier. Compression ignition engines operate under low-temperature combustion conditions with quite complicated chemistry, making this calibration difficult and leading to the possibility of false positives and false negatives in the ordering of the reactions. So an important aspect of the paper is showing how to handle the trade-off between false positives and false negatives using ideas from the multiobjective optimization literature. The combination of the new global sensitivity method and the calibration yields sample sizes approximately a factor of 10 smaller than were available with our previous algorithm.
Matis, Georgios K; Birbilis, Theodossios A; Chrysou, Olga I
2009-11-01
The scope of this research has been to investigate the satisfaction of Greek patients hospitalized in a tertiary care university public hospital in Alexandroupolis, Greece, in order to improve medical, nursing and organizational/administrative services. It is a cross-sectional study involving 200 patients hospitalized for at least 24 h. We administered a satisfaction questionnaire previously approved by the Greek Health Ministry. Four aspects of satisfaction were employed (medical, hotel facilities/organizational, nursing, global). Using principal component analysis, summated scales were formed and tested for internal consistency with the aid of Cronbach's alpha coefficient. The non-parametric Spearman rank correlation coefficient was also used. The results reveal a relatively high degree of global satisfaction (75.125%), yet satisfaction is higher for the medical (89.721%) and nursing (86.432%) services. Moreover, satisfaction derived from the hotel facilities and the general organization was found to be more limited (76.536%). Statistically significant differences in participant satisfaction were observed (depending on age, gender, citizenship, education, number of previous admissions and self-assessment of health status at the first and last day of patients' stay) for the medical, nursing and hotel facilities/organizational dimension, but not for global satisfaction. The present study confirms the results of previously published Greek surveys.
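Cronbach's alpha, used above to test the internal consistency of the summated scales, follows directly from the item variances; a minimal sketch with synthetic questionnaire responses:

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of scale responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(11)
latent = rng.normal(size=(200, 1))                       # shared factor
responses = latent + rng.normal(scale=0.7, size=(200, 6))  # 6 related items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```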
NASA Astrophysics Data System (ADS)
Halverson, G. H.; Fisher, J.; Jewell, L. A.; Moore, G.; Verma, M.; McDonald, T.; Kim, S.; Muniz, A.
2016-12-01
Water scarcity and its impact on agriculture is a pressing world concern. At the heart of this crisis is the balance of water exchange between the land and the atmosphere. The ability to monitor evapotranspiration provides a solution by enabling sustainable irrigation practices. The Priestley-Taylor Jet Propulsion Laboratory model of evapotranspiration has been implemented to meet this need as a daily MODIS product with 1 to 5 km resolution. An automated data pipeline for this model implementation provides daily data with global coverage and near real-time latency using the Geospatial Data Abstraction Library. An interactive map providing on-demand statistical analysis enables water resource managers to monitor rates of water loss. To demonstrate the application of remotely-sensed evapotranspiration to water resource management, a partnership has been arranged with the New Mexico Office of the State Engineer (NMOSE). The online water research management tool was developed to meet the specifications of NMOSE using the Leaflet, GeoServer, and Django frameworks. NMOSE will utilize this tool to monitor drought and fire risk and manage irrigation. Through this test-case, it is hoped that real-time, user-friendly remote sensing tools will be adopted globally to make resource management decisions informed by the NASA Earth Observation System.
Global disaster satellite communications system for disaster assessment and relief coordination
NASA Technical Reports Server (NTRS)
Leroy, B. E.
1979-01-01
The global communication requirements for disaster assistance are analyzed, and operationally feasible satellite system concepts and the associated system parameters are examined. Some potential problems associated with the current method of providing disaster assistance are described, along with a scenario for disaster assistance relying on satellite communications. Historical statistics are used with the scenario to assess service requirements. Both present and planned commercially available systems are considered. The associated yearly service costs for global disaster communication are estimated.
Zhu, Z.; Waller, E.
2003-01-01
Many countries periodically produce national reports on the status and changes of forest resources, using statistical surveys and spatial mapping of remotely sensed data. At the global level, the Food and Agriculture Organization (FAO) of the United Nations has conducted a Forest Resources Assessment (FRA) program every 10 yr since 1980, producing statistics and analysis that give a global synopsis of forest resources in the world. For the year 2000 of the FRA program (FRA2000), a global forest cover map was produced to provide spatial context to the extensive survey. The forest cover map, produced at the U.S. Geological Survey (USGS) EROS Data Center (EDC), has five classes: closed forest, open or fragmented forest, other wooded land, other land cover, and water. The first two forested classes at the global scale were delineated using combinations of temporal compositing, modified mixture analysis, geographic stratification, and other classification techniques. The remaining three FAO classes were derived primarily from the USGS global land cover characteristics database (Loveland et al. 1999). Validated on the basis of existing reference data sets, the map is estimated to be 77% accurate for the first four classes (no reference data were available for water), and 86% accurate for the forest and nonforest classification. The final map will be published as an insert to the FAO FRA2000 report.
On Using the Weimer Statistical Model for Real-Time Ionospheric Specifications and Forecasts
NASA Astrophysics Data System (ADS)
Bekerat, H. A.; Schunk, R. W.; Scherliess, L.
2002-12-01
The Weimer statistical model (Weimer, 2001) for the high-latitude convection pattern was tested with regard to its ability to produce real-time convection patterns. This work is being conducted under the polar section of GAIM (Global Assimilation of Ionospheric Measurements). The method adopted involves the comparison of the cross-track ion drift velocities measured by DMSP satellites with those calculated from the Weimer model. Starting with a Weimer pattern obtained using real-time IMF and solar wind data at the time of a DMSP satellite pass in the high-latitude ionosphere, the cross-track ion drift velocities along the DMSP track were calculated from the Weimer convection model and compared to those measured by the DMSP satellite. Then, in order to improve the agreement between the measurements and the model, two of the input parameters to the model, the IMF clock angle and the solar wind speed, were varied to obtain the pattern that gives the best agreement with the DMSP satellite measurements. Four months of data (March, July, September, and December 1998) were used to test the Weimer model. The results show that the agreement between the measurements and the Weimer model is improved by this procedure. The Weimer model is good in a statistical sense: it was able to reproduce the large-scale structure in most cases. However, it is not good enough to be used for real-time ionospheric specifications and forecasts, because it failed to reproduce much of the mesoscale structure measured along most DMSP satellite passes. Reference: Weimer, D. R., J. Geophys. Res., 106, 407, 2001.
VoroMQA: Assessment of protein structure quality using interatomic contact areas.
Olechnovič, Kliment; Venclovas, Česlovas
2017-06-01
In the absence of an experimentally determined protein structure, many biological questions can be addressed using computational structural models. However, the utility of protein structural models depends on their quality. Therefore, the estimation of the quality of predicted structures is an important problem. One of the approaches to this problem is the use of knowledge-based statistical potentials. Such methods typically rely on the statistics of distances and angles of residue-residue or atom-atom interactions collected from experimentally determined structures. Here, we present VoroMQA (Voronoi tessellation-based Model Quality Assessment), a new method for the estimation of protein structure quality. Our method combines the idea of statistical potentials with the use of interatomic contact areas instead of distances. Contact areas, derived using Voronoi tessellation of the protein structure, are used to describe and seamlessly integrate both explicit interactions between protein atoms and implicit interactions of protein atoms with solvent. VoroMQA produces scores at atomic, residue, and global levels, all in the fixed range from 0 to 1. The method was tested on CASP data and compared to several other single-model quality assessment methods. VoroMQA showed strong performance in the recognition of the native structure and in structural model selection tests, thus demonstrating the efficacy of interatomic contact areas in estimating protein structure quality. The software implementation of VoroMQA is freely available as a standalone application and as a web server at http://bioinformatics.lt/software/voromqa. Proteins 2017; 85:1131-1145. © 2017 Wiley Periodicals, Inc.
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
Lukashin, A V; Fuchs, R
2001-05-01
Cluster analysis of genome-wide expression data from DNA microarray hybridization studies has proved to be a useful tool for identifying biologically relevant groupings of genes and samples. In the present paper, we focus on several important issues related to clustering algorithms that have not yet been fully studied. We describe a simple and robust algorithm for the clustering of temporal gene expression profiles that is based on the simulated annealing procedure. In general, this algorithm is guaranteed eventually to find the globally optimal distribution of genes over clusters. We introduce an iterative scheme that serves to evaluate quantitatively the optimal number of clusters for each specific data set. The scheme is based on standard approaches used in regular statistical tests. The basic idea is to organize the search for the optimal number of clusters simultaneously with the optimization of the distribution of genes over clusters. The efficiency of the proposed algorithm has been evaluated by means of a reverse engineering experiment, that is, a situation in which the correct distribution of genes over clusters is known a priori. This statistically rigorous test has shown that our algorithm places more than 90% of genes into the correct clusters. Finally, the algorithm has been tested on real gene expression data (expression changes during the yeast cell cycle), for which the fundamental patterns of gene expression and the assignment of genes to clusters are well understood from numerous previous studies.
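As a rough illustration of the annealing idea (not the authors' implementation), the sketch below moves one profile at a time between clusters and accepts cost-increasing moves with Boltzmann probability; the within-cluster cost function, cooling schedule, and step count are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def within_cluster_cost(profiles, labels, k):
    """Sum of squared distances of profiles to their cluster centroids.
    profiles: (n_genes, n_timepoints) array of expression profiles."""
    cost = 0.0
    for c in range(k):
        members = profiles[labels == c]
        if len(members):
            cost += ((members - members.mean(axis=0)) ** 2).sum()
    return cost

def anneal_cluster(profiles, k, t0=1.0, cooling=0.999, steps=20000):
    """Simulated annealing: reassign one random gene to a random
    cluster; accept uphill moves with probability exp(-dCost / T)."""
    n = len(profiles)
    labels = rng.integers(0, k, n)
    cost = within_cluster_cost(profiles, labels, k)
    t = t0
    for _ in range(steps):
        i, new = rng.integers(n), rng.integers(k)
        old = labels[i]
        if new == old:
            continue
        labels[i] = new
        new_cost = within_cluster_cost(profiles, labels, k)
        if new_cost < cost or rng.random() < np.exp((cost - new_cost) / t):
            cost = new_cost       # accept the move
        else:
            labels[i] = old       # reject and restore
        t *= cooling              # geometric cooling schedule
    return labels, cost
```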
A global map of rainfed cropland areas (GMRCA) at the end of last millennium using remote sensing
Biradar, C.M.; Thenkabail, P.S.; Noojipady, P.; Li, Y.; Dheeravath, V.; Turral, H.; Velpuri, M.; Gumma, M.K.; Gangalakunta, O.R.P.; Cai, X.L.; Xiao, X.; Schull, M.A.; Alankara, R.D.; Gunasinghe, S.; Mohideen, S.
2009-01-01
The overarching goal of this study was to produce a global map of rainfed cropland areas (GMRCA) and calculate country-by-country rainfed area statistics using remote sensing data. A suite of spatial datasets, methods, and protocols for mapping GMRCA is described. These consist of: (a) data fusion and composition of a multi-resolution time-series mega-file data-cube (MFDC), (b) image segmentation based on precipitation, temperature, and elevation zones, (c) spectral correlation similarity (SCS), (d) protocols for class identification and labeling through the use of SCS R2-values, bi-spectral plots, space-time spiral curves (ST-SCs), a rich source of field-plot data, and zoom-in views of Google Earth (GE), and (e) techniques for resolving mixed classes by decision tree algorithms and spatial modeling. The outcome was a 9-class GMRCA from which country-by-country rainfed area statistics were computed for the end of the last millennium. The global rainfed cropland area estimate from the GMRCA 9-class map was 1.13 billion hectares (Bha). The total global cropland area (rainfed plus irrigated) was 1.53 Bha, which was close to the national statistics compiled by FAOSTAT (1.51 Bha). The accuracies and errors of GMRCA were assessed using field-plot and Google Earth data points. The accuracy varied between 92 and 98%, with a kappa value of about 0.76, errors of omission of 2-8%, and errors of commission of 19-36%. © 2008 Elsevier B.V.
Data Resource Profile: United Nations Children’s Fund (UNICEF)
Murray, Colleen; Newby, Holly
2012-01-01
The United Nations Children’s Fund (UNICEF) plays a leading role in the collection, compilation, analysis and dissemination of data to inform sound policies, legislation and programmes for promoting children’s rights and well-being, and for global monitoring of progress towards the Millennium Development Goals. UNICEF maintains a set of global databases representing nearly 200 countries and covering the areas of child mortality, child health, maternal health, nutrition, immunization, water and sanitation, HIV/AIDS, education and child protection. These databases consist of internationally comparable and statistically sound data, and are updated annually through a process that draws on a wealth of data provided by UNICEF’s wide network of >150 field offices. The databases are composed primarily of estimates from household surveys, with data from censuses, administrative records, vital registration systems and statistical models contributing to some key indicators as well. The data are assessed for quality based on a set of objective criteria to ensure that only the most reliable nationally representative information is included. For most indicators, data are available at the global, regional and national levels, plus sub-national disaggregation by sex, urban/rural residence and household wealth. The global databases are featured in UNICEF’s flagship publications, inter-agency reports, including the Secretary General’s Millennium Development Goals Report and Countdown to 2015, sector-specific reports and statistical country profiles. They are also publicly available on www.childinfo.org, together with trend data and equity analyses. PMID:23211414
A global × global test for testing associations between two large sets of variables.
Chaturvedi, Nimisha; de Menezes, Renée X; Goeman, Jelle J
2017-01-01
In high-dimensional omics studies where multiple molecular profiles are obtained for each set of patients, there is often interest in identifying complex multivariate associations, for example, copy-number-regulated expression levels in a certain pathway or in a genomic region. To detect such associations, we present a novel approach to test for association between two sets of variables. Our approach generalizes the global test, which tests for association between a group of covariates and a single univariate response, to allow a high-dimensional multivariate response. We apply the method to several simulated datasets as well as two publicly available datasets, where we compare the performance of the multivariate global test (G2) with the univariate global test. The method is implemented in R and will be available as part of the globaltest package. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril
2014-07-01
Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement. The aim was to use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. Observational descriptive study in a French maternity unit in the Rhône-Alpes region. Quarterly clinical audit meetings were held to analyse all cases of severe postpartum haemorrhage after vaginal delivery and to provide feedback on quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries (p<0.001). Since 2010, the quarterly rate of severe PPH has not exceeded the upper control limit, that is, it has not been out of statistical control. The proportion of cases that were managed consistently with the guidelines increased for all of their main components. Implementation of continuous quality improvement efforts began seven years ago and used, among other tools, statistical process control charts. During this period, the prevalence of severe postpartum haemorrhage after vaginal delivery was reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
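A control chart of the kind described can be built from quarterly counts alone. The sketch below computes a standard p-chart (centre line and three-sigma limits); the source does not state which chart variant the unit used, so this is an illustrative choice.

```python
import numpy as np

def p_chart_limits(cases, deliveries):
    """Three-sigma p-chart for quarterly severe-PPH proportions.
    cases, deliveries: per-quarter counts of severe PPH and of
    vaginal deliveries. Limits vary with the quarterly denominator."""
    cases, deliveries = np.asarray(cases), np.asarray(deliveries)
    p = cases / deliveries                           # observed proportions
    pbar = cases.sum() / deliveries.sum()            # centre line
    sigma = np.sqrt(pbar * (1 - pbar) / deliveries)  # binomial sigma per quarter
    ucl = pbar + 3 * sigma                           # upper control limit
    lcl = np.clip(pbar - 3 * sigma, 0, None)         # lower limit, floored at 0
    out_of_control = (p > ucl) | (p < lcl)
    return p, pbar, ucl, lcl, out_of_control
```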
Statistical Inference and Patterns of Inequality in the Global North
ERIC Educational Resources Information Center
Moran, Timothy Patrick
2006-01-01
Cross-national inequality trends have historically been a crucial field of inquiry across the social sciences, and new methodological techniques of statistical inference have recently improved the ability to analyze these trends over time. This paper applies Monte Carlo, bootstrap inference methods to the income surveys of the Luxembourg Income…
Seed Dispersal Near and Far: Patterns Across Temperate and Tropical Forests
James S. Clark; Miles Silman; Ruth Kern; Eric Macklin; Janneke HilleRisLambers
1999-01-01
Dispersal affects community dynamics and vegetation response to global change. Understanding these effects requires descriptions of dispersal at local and regional scales and statistical models that permit estimation. Classical models of dispersal describe local or long-distance dispersal, but not both. The lack of statistical methods means that models have rarely been...
Statistical description of tectonic motions
NASA Technical Reports Server (NTRS)
Agnew, Duncan Carr
1993-01-01
This report summarizes investigations regarding tectonic motions. The topics discussed include statistics of crustal deformation, Earth rotation studies, using multitaper spectrum analysis techniques applied to both space-geodetic data and conventional astrometric estimates of the Earth's polar motion, and the development, design, and installation of high-stability geodetic monuments for use with the global positioning system.
Weather extremes in very large, high-resolution ensembles: the weatherathome experiment
NASA Astrophysics Data System (ADS)
Allen, M. R.; Rosier, S.; Massey, N.; Rye, C.; Bowery, A.; Miller, J.; Otto, F.; Jones, R.; Wilson, S.; Mote, P.; Stone, D. A.; Yamazaki, Y. H.; Carrington, D.
2011-12-01
Resolution and ensemble size are often seen as alternatives in climate modelling. Models with sufficient resolution to simulate many classes of extreme weather cannot normally be run often enough to assess the statistics of rare events, still less how these statistics may be changing. As a result, assessments of the impact of external forcing on regional climate extremes must be based either on statistical downscaling from relatively coarse-resolution models, or on statistical extrapolation from 10-year to 100-year events. Under the weatherathome experiment, part of the climateprediction.net initiative, we have compiled the Met Office Regional Climate Model HadRM3P to run at 25 and 50 km resolution on personal computers volunteered by the general public, embedded within the HadAM3P global atmosphere model. With a global network of about 50,000 volunteers, this allows us to run time-slice ensembles of essentially unlimited size, exploring the statistics of extreme weather under a range of scenarios for surface forcing and atmospheric composition, allowing for uncertainty in both boundary conditions and model parameters. Current experiments, developed with the support of Microsoft Research, focus on three regions: the Western USA, Europe and Southern Africa. We initially simulate the period 1959-2010 to establish which variables are realistically simulated by the model and on what scales. Our next experiments focus on the Event Attribution problem, exploring how the probability of various types of extreme weather would have been different over the recent past in a world unaffected by human influence, following the design of Pall et al (2011) but extended to a longer period and higher spatial resolution. We will present the first results of this unique, global, participatory experiment and discuss the implications for the attribution of recent weather events to anthropogenic influence on climate.
Non-statistical effects in bond fission reactions of 1,2-difluoroethane
NASA Astrophysics Data System (ADS)
Schranz, Harold W.; Raff, Lionel M.; Thompson, Donald L.
1991-08-01
A microcanonical, classical variational transition-state theory based on the use of the efficient microcanonical sampling (EMS) procedure is applied to simple bond fission in 1,2-difluoroethane. Comparison is made with results of trajectory calculations performed on the same global potential-energy surface. Agreement between the statistical theory and trajectory results for C-C, C-F, and C-H bond fissions is poor, with differences as large as a factor of 125. Most importantly, at the lower energy studied, 6.0 eV, the statistical calculations predict considerably slower rates than those computed from trajectories. We conclude from these results that the statistical assumptions inherent in the transition-state theory method are not valid for 1,2-difluoroethane, in spite of the fact that the total intramolecular energy transfer rate out of C-H and C-C normal and local modes is large relative to the bond fission rates. The IVR rate is not globally rapid, and the trajectories do not access all of the energetically available phase space uniformly on the timescale of the reactions.
SDGs and Geospatial Frameworks: Data Integration in the United States
NASA Astrophysics Data System (ADS)
Trainor, T.
2016-12-01
Responding to the need to monitor a nation's progress towards meeting the Sustainable Development Goals (SDG) outlined in the 2030 U.N. Agenda requires the integration of earth observations with statistical information. The urban agenda proposed in SDG 11 challenges the global community to find a geospatial approach to monitor and measure inclusive, safe, resilient, and sustainable cities and communities. Target 11.7 identifies public safety, accessibility to green and public spaces, and the most vulnerable populations (i.e., women and children, older persons, and persons with disabilities) as the most important priorities of this goal. A challenge for both national statistical organizations and earth observation agencies in addressing SDG 11 is the requirement for detailed statistics at a sufficient spatial resolution to provide the basis for meaningful analysis of the urban population and city environments. Using an example for the city of Pittsburgh, this presentation proposes data and methods to illustrate how earth science and statistical data can be integrated to respond to Target 11.7. Finally, a preliminary series of data initiatives are proposed for extending this method to other global cities.
Zhang, Hanyuan; Tian, Xuemin; Deng, Xiaogang; Cao, Yuping
2018-05-16
As an attractive nonlinear dynamic data analysis tool, global preserving kernel slow feature analysis (GKSFA) has achieved great success in extracting the high nonlinearity and inherently time-varying dynamics of batch processes. However, GKSFA is an unsupervised feature extraction method and lacks the ability to utilize batch process class label information, which may not offer the most effective means for batch process monitoring. To overcome this problem, we propose a novel batch process monitoring method based on a modified GKSFA, referred to as discriminant global preserving kernel slow feature analysis (DGKSFA), which closely integrates discriminant analysis and GKSFA. The proposed DGKSFA method can extract discriminant features of a batch process as well as preserve the global and local geometrical structure information of the observed data. For the purpose of fault detection, a monitoring statistic is constructed based on the distance between the optimal kernel feature vectors of test data and normal data. To tackle the challenging issue of nonlinear fault variable identification, a new nonlinear contribution plot method is also developed to help identify the fault variable after a fault is detected, derived from the idea of variable pseudo-sample trajectory projection in the DGKSFA nonlinear biplot. Simulation results on a numerical nonlinear dynamic system and the benchmark fed-batch penicillin fermentation process demonstrate that the proposed process monitoring and fault diagnosis approach can effectively detect faults and distinguish fault variables from normal variables. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
Real-Time Global Nonlinear Aerodynamic Modeling for Learn-To-Fly
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.
2016-01-01
Flight testing and modeling techniques were developed to accurately identify global nonlinear aerodynamic models for aircraft in real time. The techniques were developed and demonstrated during flight testing of a remotely-piloted subscale propeller-driven fixed-wing aircraft using flight test maneuvers designed to simulate a Learn-To-Fly scenario. Prediction testing was used to evaluate the quality of the global models identified in real time. The real-time global nonlinear aerodynamic modeling algorithm will be integrated and further tested with learning adaptive control and guidance for NASA Learn-To-Fly concept flight demonstrations.
Realistic thermodynamic and statistical-mechanical measures for neural synchronization.
Kim, Sang-Yoon; Lim, Woochang
2014-04-15
Synchronized brain rhythms, associated with diverse cognitive functions, have been observed in electrical recordings of brain activity. Neural synchronization may be well described by using the population-averaged global potential VG in computational neuroscience. The time-averaged fluctuation of VG plays the role of a "thermodynamic" order parameter O used for describing the synchrony-asynchrony transition in neural systems. Population spike synchronization may be well visualized in the raster plot of neural spikes. The degree of neural synchronization seen in the raster plot is well measured in terms of a "statistical-mechanical" spike-based measure Ms, introduced by considering the occupation and pacing patterns of spikes. The global potential VG is also used to give a reference global cycle for the calculation of Ms. Hence, VG becomes an important collective quantity because it is associated with the calculation of both O and Ms. However, it is practically difficult to measure VG directly in real experiments. To overcome this difficulty, instead of VG, we employ the instantaneous population spike rate (IPSR), which can be obtained in experiments, and develop realistic thermodynamic and statistical-mechanical measures, based on the IPSR, to allow practical characterization of neural synchronization in both computational and experimental neuroscience. In particular, more accurate characterization of weak sparse spike synchronization can be achieved in terms of the realistic statistical-mechanical IPSR-based measure, in comparison with the conventional measure based on VG. Copyright © 2014. Published by Elsevier B.V.
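Since the IPSR is the experimentally accessible quantity here, a minimal kernel-smoothing estimate of it may help fix ideas; the Gaussian kernel and its width are assumptions, not the authors' stated choices.

```python
import numpy as np

def ipsr(spike_times_per_neuron, t_grid, sigma=0.005):
    """Instantaneous population spike rate: each spike contributes a
    Gaussian kernel of width sigma (seconds); the sum is averaged
    over the N neurons in the raster.
    spike_times_per_neuron: list of per-neuron spike-time arrays (s).
    t_grid: times (s) at which to evaluate the rate."""
    n = len(spike_times_per_neuron)
    rate = np.zeros_like(np.asarray(t_grid, dtype=float))
    for spikes in spike_times_per_neuron:
        for s in spikes:
            rate += np.exp(-0.5 * ((t_grid - s) / sigma) ** 2)
    rate /= n * sigma * np.sqrt(2 * np.pi)  # normalize kernels to unit area
    return rate  # spikes per second per neuron
```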
Random local temporal structure of category fluency responses.
Meyer, David J; Messer, Jason; Singh, Tanya; Thomas, Peter J; Woyczynski, Wojbor A; Kaye, Jeffrey; Lerner, Alan J
2012-04-01
The Category Fluency Test (CFT) provides a sensitive measurement of cognitive capabilities in humans related to retrieval from semantic memory. In particular, it is widely used to assess the progress of cognitive impairment in patients with dementia. Previous research shows that, to a first approximation, the intensity of tested individuals' responses within a standard 60-s test period decays exponentially with time, with faster decay rates for more cognitively impaired patients. Such a decay rate can then be viewed as a global (macro) diagnostic parameter of each test. In the present paper we focus on the statistical properties of the properly de-trended time intervals between consecutive responses (inter-call times) in the Category Fluency Test. In a sense, those properties reflect the local (micro) structure of the response generation process. We find that a good approximation for the distribution of the de-trended inter-call times is provided by the Weibull distribution, a probability distribution that appears naturally in this context as the distribution of a minimum of independent random quantities and is the standard tool in industrial reliability theory. This insight leads us to a new interpretation of the concept of "navigating a semantic space" via patient responses.
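A Weibull fit to de-trended intervals can be sketched as follows; the log-linear de-trending step is a simplification of the paper's procedure, strictly increasing response times are assumed, and all names are illustrative.

```python
import numpy as np
from scipy import stats

def fit_detrended_intervals(call_times):
    """De-trend inter-call times by a fitted exponential intensity
    trend (a simplification), then fit a Weibull distribution to the
    residual intervals. call_times: response times (s), assumed
    strictly increasing so that all gaps are positive."""
    t = np.sort(np.asarray(call_times, dtype=float))
    gaps = np.diff(t)
    mid = t[:-1] + gaps / 2
    # log-linear fit of gap size vs. time captures exponential slowing
    slope, intercept = np.polyfit(mid, np.log(gaps), 1)
    detrended = gaps / np.exp(intercept + slope * mid)
    shape, loc, scale = stats.weibull_min.fit(detrended, floc=0)
    return shape, scale  # Weibull shape and scale of de-trended gaps
```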
Improving GEFS Weather Forecasts for Indian Monsoon with Statistical Downscaling
NASA Astrophysics Data System (ADS)
Agrawal, Ankita; Salvi, Kaustubh; Ghosh, Subimal
2014-05-01
Weather forecasting has always been a challenging research problem, yet one of paramount importance, as it serves as a key input in formulating a modus operandi for the immediate future. Short-range rainfall forecasts influence a wide range of entities, from the agricultural industry to the common man. Accurate forecasts help minimize possible damage by allowing pre-decided plans of action to be implemented, and hence it is necessary to gauge the quality of forecasts, which may vary with the complexity of the weather state and regional parameters. Indian Summer Monsoon Rainfall (ISMR) is one such perfect arena in which to check the quality of weather forecasts, not only because of the level of intricacy in the spatial and temporal patterns associated with it, but also because of the amount of damage it can cause (through poor forecasts) to the Indian economy by affecting the agricultural industry. The present study was undertaken with the rationales of assessing the ability of the Global Ensemble Forecast System (GEFS) to predict ISMR over central India, and the skill of a statistical downscaling technique in adding value to the predictions by taking them closer to the observed target dataset. GEFS is a global numerical weather prediction system providing forecasts of different climate variables at fine resolution (0.5 degree and 1 degree). GEFS shows good skill in predicting different climatic variables but fails for Indian summer monsoon rainfall, as is evident from very low to negative correlation values between predicted and observed rainfall. Towards the second rationale, a statistical relationship is established between the reasonably well predicted climate variables (GEFS) and observed rainfall. The GEFS predictors are treated with multicollinearity and dimensionality reduction techniques, namely principal component analysis (PCA) and the least absolute shrinkage and selection operator (LASSO). A statistical relationship is established between the principal components and observed rainfall over a training period, and predictions are obtained for a testing period. The validations show large improvements in the correlation coefficient between observed and predicted data (from 0.25 to 0.55). The results speak in favour of the statistical downscaling methodology, which shows the capability to reduce the gap between observed data and predictions. A detailed study applying different downscaling techniques is required to quantify the improvements in predictions.
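The predictor-reduction-plus-regression pipeline can be sketched compactly; the component count and LASSO penalty below are placeholders, not values from the study, and the function names are illustrative.

```python
from sklearn.decomposition import PCA
from sklearn.linear_model import Lasso
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def downscale(gefs_train, rain_train, gefs_test):
    """Statistical downscaling sketch: standardize GEFS predictor
    fields, reduce dimension with PCA to tame multicollinearity,
    then regress observed rainfall on the leading components with
    LASSO. Inputs are (samples, predictors) arrays and a rainfall
    vector aligned with the training samples."""
    model = make_pipeline(StandardScaler(),
                          PCA(n_components=10),   # illustrative choice
                          Lasso(alpha=0.1))       # illustrative penalty
    model.fit(gefs_train, rain_train)
    return model.predict(gefs_test)
```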
Afendulis, Christopher C; Fendrick, A Mark; Song, Zirui; Landon, Bruce E; Safran, Dana Gelb; Mechanic, Robert E; Chernew, Michael E
2014-01-01
In 2009, Blue Cross Blue Shield of Massachusetts implemented a global budget-based payment system, the Alternative Quality Contract (AQC), in which provider groups assumed accountability for spending. We investigate the impact of global budgets on the utilization of prescription drugs and related expenditures. Our analyses indicate no statistically significant evidence that the AQC reduced the use of drugs. Although the impact may change over time, early evidence suggests that it is premature to conclude that global budget systems may reduce access to medications. © The Author(s) 2014.
Explaining patterns in the ratification of global environmental treaties
NASA Technical Reports Server (NTRS)
Cook, David W.
1991-01-01
A study was made of the ratification behavior of 160 countries with respect to 38 global environmental treaties. The study identifies and explains patterns in the ratification of treaties, providing two means of assessing the likelihood that any given country will support global environmental treaties. National ratification totals reveal a pattern of high ratification by countries in Western Europe, North America, Japan, Australia, and New Zealand. A country's standing within the range of high to low ratification rates can be explained by the statistical model developed in the study. This research allows one to identify countries likely to support global environmental treaties.
Quality of reporting statistics in two Indian pharmacology journals.
Jaykaran; Yadav, Preeti
2011-04-01
To evaluate the reporting of statistical methods in articles published in two Indian pharmacology journals. All original articles published since 2002 were downloaded from the websites of the two journals, the Indian Journal of Pharmacology (IJP) and the Indian Journal of Physiology and Pharmacology (IJPP). These articles were evaluated on the basis of the appropriateness of their descriptive and inferential statistics. Descriptive statistics were evaluated on the basis of the reporting of the method of description and of central tendencies. Inferential statistics were evaluated on the basis of whether the assumptions of the statistical methods were fulfilled and whether the statistical tests were appropriate. Values are described as frequencies, percentages, and 95% confidence intervals (CI) around the percentages. Inappropriate descriptive statistics were observed in 150 (78.1%, 95% CI 71.7-83.3%) articles. The most common reason was the use of mean ± SEM in place of "mean (SD)" or "mean ± SD." The most commonly used statistical method was one-way ANOVA (58.4%). Information regarding the checking of the assumptions of a statistical test was mentioned in only two articles. An inappropriate statistical test was observed in 61 (31.7%, 95% CI 25.6-38.6%) articles, most commonly the use of a two-group test for three or more groups. Articles published in these two Indian pharmacology journals are not devoid of statistical errors.
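The percentages with 95% CIs reported above are straightforward to reproduce. The sketch below uses the Wilson score interval; assuming a sample of 192 articles (inferred from 150 being 78.1%, not stated in the abstract), it gives limits close to the reported 71.7-83.3%.

```python
import math

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a proportion, one common way to
    compute the kind of CI reported around these percentages."""
    p = k / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

# e.g. 150 of 192 articles: wilson_ci(150, 192) ~ (0.718, 0.834)
```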
NASA Technical Reports Server (NTRS)
Ruane, Alex C.; McDermid, Sonali; Rosenzweig, Cynthia; Baigorria, Guillermo A.; Jones, James W.; Romero, Consuelo C.; Cecil, L. DeWayne
2014-01-01
Climate change is projected to push the limits of cropping systems and has the potential to disrupt the agricultural sector from local to global scales. This article introduces the Coordinated Climate-Crop Modeling Project (C3MP), an initiative of the Agricultural Model Intercomparison and Improvement Project (AgMIP) to engage a global network of crop modelers to explore the impacts of climate change via an investigation of crop responses to changes in carbon dioxide concentration ([CO2]), temperature, and water. As a demonstration of the C3MP protocols and enabled analyses, we apply the Decision Support System for Agrotechnology Transfer (DSSAT) CROPGRO-Peanut crop model for Henry County, Alabama, to evaluate responses to the range of plausible [CO2], temperature changes, and precipitation changes projected by climate models out to the end of the 21st century. These sensitivity tests are used to derive crop model emulators that estimate changes in mean yield and the coefficient of variation for seasonal yields across a broad range of climate conditions, reproducing mean yields from sensitivity test simulations with deviations of ca. 2% for rain-fed conditions. We apply these statistical emulators to investigate how peanuts respond to projections from various global climate models, time periods, and emissions scenarios, finding a robust projection of modest (<10%) median yield losses in the middle of the 21st century accelerating to more severe (>20%) losses and larger uncertainty at the end of the century under the more severe representative concentration pathway (RCP8.5). This projection is not substantially altered by the selection of the AgMERRA global gridded climate dataset rather than the local historical observations, differences between the Third and Fifth Coupled Model Intercomparison Project (CMIP3 and CMIP5), or the use of the delta method of climate impacts analysis rather than the C3MP impacts response surface and emulator approach.
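An emulator of the kind described can be approximated by a low-order response surface over the three perturbed inputs; the quadratic form below is an assumption for illustration, not C3MP's actual emulator specification.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

def build_emulator(co2, dtemp, dprecip, mean_yield):
    """Fit a quadratic response surface to crop-model sensitivity
    runs. Inputs are 1-D arrays: [CO2] (ppm), temperature change
    (deg C), precipitation change (fraction), and the simulated mean
    yield from each sensitivity run."""
    X = np.column_stack([co2, dtemp, dprecip])
    emulator = make_pipeline(PolynomialFeatures(degree=2),
                             LinearRegression())
    emulator.fit(X, mean_yield)
    return emulator  # emulator.predict(X_new) gives yields for new climates
```

The fitted object can then be evaluated at ([CO2], dT, dP) triples drawn from any climate-model projection, which is the essence of the impacts-response-surface approach described above.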
Apipattanavis, S.; McCabe, G.J.; Rajagopalan, B.; Gangopadhyay, S.
2009-01-01
Dominant modes of individual and joint variability in global sea surface temperatures (SST) and global Palmer drought severity index (PDSI) values for the twentieth century are identified through a multivariate frequency domain singular value decomposition. This analysis indicates that a secular trend and variability related to the El Niño–Southern Oscillation (ENSO) are the dominant modes of variance shared among the global datasets. For the SST data the secular trend corresponds to a positive trend in Indian Ocean and South Atlantic SSTs, and a negative trend in North Pacific and North Atlantic SSTs. The ENSO reconstruction shows a strong signal in the tropical Pacific, North Pacific, and Indian Ocean regions. For the PDSI data, the secular trend reconstruction shows high amplitudes over central Africa including the Sahel, whereas the regions with strong ENSO amplitudes in PDSI are the southwestern and northwestern United States, South Africa, northeastern Brazil, central Africa, the Indian subcontinent, and Australia. An additional significant frequency, multidecadal variability, is identified for the Northern Hemisphere. This multidecadal frequency appears to be related to the Atlantic multidecadal oscillation (AMO). The multidecadal frequency is statistically significant in the Northern Hemisphere SST data, but is statistically nonsignificant in the PDSI data.
Siddall, James; Huebner, E Scott; Jiang, Xu
2013-01-01
This study examined the cross-sectional and prospective relationships between three sources of school-related social support (parent involvement, peer support for learning, and teacher-student relationships) and early adolescents' global life satisfaction. The participants were 597 middle school students from 1 large school in the southeastern United States who completed measures of school social climate and life satisfaction on 2 occasions, 5 months apart. The results revealed that school-related experiences in terms of social support for learning contributed substantial amounts of variance to individual differences in adolescents' satisfaction with their lives as a whole. Cross-sectional multiple regression analyses of the differential contributions of the sources of support demonstrated that family and peer support for learning contributed statistically significant, unique variance to global life satisfaction reports. Prospective multiple regression analyses demonstrated that only family support for learning continued to contribute statistically significant, unique variance to the global life satisfaction reports at Time 2. The results suggest that school-related experiences, especially family-school interactions, spill over into adolescents' overall evaluations of their lives at a time when direct parental involvement in schooling and adolescents' global life satisfaction are generally declining. Recommendations for future research and educational policies and practices are discussed. © 2013 American Orthopsychiatric Association.
NASA Astrophysics Data System (ADS)
Maina, Fadji Zaouna; Guadagnini, Alberto
2018-01-01
We study the contribution of typically uncertain subsurface flow parameters to gravity changes that can be recorded during pumping tests in unconfined aquifers. We do so in the framework of a Global Sensitivity Analysis and quantify the effects of uncertainty of such parameters on the first four statistical moments of the probability distribution of gravimetric variations induced by the operation of the well. System parameters are grouped into two main categories, respectively, governing groundwater flow in the unsaturated and saturated portions of the domain. We ground our work on the three-dimensional analytical model proposed by Mishra and Neuman (2011), which fully takes into account the richness of the physical process taking place across the unsaturated and saturated zones and storage effects in a finite radius pumping well. The relative influence of model parameter uncertainties on drawdown, moisture content, and gravity changes are quantified through (a) the Sobol' indices, derived from a classical decomposition of variance and (b) recently developed indices quantifying the relative contribution of each uncertain model parameter to the (ensemble) mean, skewness, and kurtosis of the model output. Our results document (i) the importance of the effects of the parameters governing the unsaturated flow dynamics on the mean and variance of local drawdown and gravity changes; (ii) the marked sensitivity (as expressed in terms of the statistical moments analyzed) of gravity changes to the employed water retention curve model parameter, specific yield, and storage, and (iii) the influential role of hydraulic conductivity of the unsaturated and saturated zones to the skewness and kurtosis of gravimetric variation distributions. The observed temporal dynamics of the strength of the relative contribution of system parameters to gravimetric variations suggest that gravity data have a clear potential to provide useful information for estimating the key hydraulic parameters of the system.
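First-order Sobol' indices of the kind used here can be estimated by plain Monte Carlo. The sketch below uses the Saltelli (2010) pick-freeze estimator, with `model` and `sampler` as hypothetical stand-ins for the drawdown/gravity model and the joint parameter distribution (neither is specified in code form in the source).

```python
import numpy as np

def sobol_first_order(model, sampler, n=10000, seed=0):
    """Monte Carlo first-order Sobol' indices.
    model:   f(x) -> scalar output (e.g., gravity change at one time)
    sampler: sampler(rng, n) -> (n, d) array of independent draws
             from the d uncertain parameters."""
    rng = np.random.default_rng(seed)
    A, B = sampler(rng, n), sampler(rng, n)      # two independent matrices
    fA = np.apply_along_axis(model, 1, A)
    fB = np.apply_along_axis(model, 1, B)
    var = np.var(np.concatenate([fA, fB]), ddof=1)
    d = A.shape[1]
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                      # "pick-freeze" column swap
        fABi = np.apply_along_axis(model, 1, ABi)
        S[i] = np.mean(fB * (fABi - fA)) / var   # Saltelli (2010) estimator
    return S
```

The moment-based indices for mean, skewness, and kurtosis mentioned in the abstract require additional machinery beyond this variance decomposition.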
NASA Astrophysics Data System (ADS)
Scibek, J.; Gleeson, T. P.; Ingebritsen, S.; McKenzie, J. M.
2017-12-01
Fault zones are an important part of the hydraulic structure of the Earth's crust and influence a wide range of Earth processes, and a large amount of test data has been collected over the years. We conducted a global meta-analysis of fault zone permeabilities in the upper brittle continental crust, using about 10,000 published research items from a variety of geoscience and engineering disciplines. Using 460 datasets at 340 localities, the in-situ bulk permeabilities (scales greater than tens of meters, including macro-fractures) and matrix permeabilities (drilled core samples or outcrop spot tests) are separated, analyzed, and compared. The values have log-normal distributions, and we analyze the log-permeability values. In the fault damage zones of plutonic and metamorphic rocks the mean bulk permeability was 1x10^-14 m^2, compared to a matrix mean of 1x10^-16 m^2. In sedimentary siliciclastic rocks the mean value was the same for bulk and matrix permeability (4x10^-14 m^2). More useful insights come from the regression analysis of paired permeability data at all sites (fault damage zone vs. protolith). Much of the variation in fault permeability is explained by the permeability of the protolith: in relatively weak volcaniclastic and clay-rich rocks up to 70 to 88% of the variation is explained, but only 20-30% in plutonic and metamorphic rocks. We propose a revision at shallow depths of previously published upper-bound curves for the "fault-damaged crust" and the geothermal-metamorphic rock assemblage outside of major fault zones. Although the bounding curves describe the "fault-damaged crust" permeability parameter space adequately, the only statistically significant permeability-depth trend is for plutonic and metamorphic rocks (50% of variation explained). We find a depth-dependent systematic variation of the permeability ratio (fault damage zone / protolith) from the in-situ bulk permeability global data. A moving average of the log-permeability ratio is 2 to 2.5 (the global mean is 2.2). Although the data are unevenly distributed with depth, the present evidence is that the permeability ratio is at a maximum at depths of 1 to 2 kilometers, decreases with depth below 2 km, and is also lower near the ground surface.
Statistical Inference for Quality-Adjusted Survival Time
2003-08-01
survival functions of QAL. If an influence function for a test statistic exists for the complete-data case, denoted as φi, then a test statistic for ... the survival function for the censoring variable. Zhao and Tsiatis (2001) proposed a test statistic where φi is the influence function of the general ... to 1 everywhere until a subject's death. We have considered other forms of test statistics. One option is to use an influence function φi that is
Chou, C P; Bentler, P M; Satorra, A
1991-11-01
Research studying the robustness of maximum likelihood (ML) statistics in covariance structure analysis has concluded that test statistics and standard errors are biased under severe non-normality. An estimation procedure known as asymptotic distribution free (ADF), making no distributional assumption, has been suggested to avoid these biases. Corrections to the normal theory statistics to yield more adequate performance have also been proposed. This study compares the performance of a scaled test statistic and robust standard errors for two models under several non-normal conditions, and also compares these with the results from the ML and ADF methods. Both the ML and ADF test statistics performed rather well in one model and considerably worse in the other. In general, the scaled test statistic behaved better than the ML test statistic, and the ADF statistic performed the worst. The robust and ADF standard errors yielded more appropriate estimates of sampling variability than the ML standard errors, which were usually downward biased, in both models under most of the non-normal conditions. ML test statistics and standard errors were found to be quite robust to the violation of the normality assumption when the data had either symmetric, platykurtic distributions or non-symmetric distributions with zero kurtosis.
Ganju, Jitendra; Yu, Xinxin; Ma, Guoguang Julie
2013-01-01
Formal inference in randomized clinical trials is based on controlling the type I error rate associated with a single pre-specified statistic. The deficiency of using just one method of analysis is that it depends on assumptions that may not be met. For robust inference, we propose pre-specifying multiple test statistics and relying on the minimum p-value for testing the null hypothesis of no treatment effect. The null hypothesis associated with the various test statistics is that the treatment groups are indistinguishable. The critical value for hypothesis testing comes from permutation distributions. Rejection of the null hypothesis when the smallest p-value is less than the critical value controls the type I error rate at its designated value. Even if one of the candidate test statistics has low power, the adverse effect on the power of the minimum p-value statistic is modest. Its use is illustrated with examples. We conclude that it is better to rely on the minimum p-value rather than on a single statistic, particularly when that single statistic is the logrank test, because of the cost and complexity of many survival trials. Copyright © 2013 John Wiley & Sons, Ltd.
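The minimum p-value procedure is easy to state in code. The sketch below uses a t-test and a rank-sum test as the candidate statistics (the paper's actual candidates are not listed in this abstract) and calibrates the minimum p-value by permutation, as described.

```python
import numpy as np
from scipy import stats

def min_p_test(x, y, n_perm=5000, seed=0):
    """Permutation test based on the minimum p-value over several
    candidate two-sample statistics. Returns the permutation p-value
    for the null hypothesis that the groups are indistinguishable."""
    rng = np.random.default_rng(seed)

    def min_p(a, b):
        p1 = stats.ttest_ind(a, b).pvalue
        p2 = stats.mannwhitneyu(a, b, alternative="two-sided").pvalue
        return min(p1, p2)

    observed = min_p(x, y)
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                       # relabel under the null
        if min_p(pooled[:len(x)], pooled[len(x):]) <= observed:
            count += 1
    return (count + 1) / (n_perm + 1)             # permutation p-value
```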
Wilcoxon's signed-rank statistic: what null hypothesis and why it matters.
Li, Heng; Johnson, Terri
2014-01-01
In statistical literature, the term 'signed-rank test' (or 'Wilcoxon signed-rank test') has been used to refer to two distinct tests: a test for symmetry of distribution and a test for the median of a symmetric distribution, sharing a common test statistic. To avoid potential ambiguity, we propose to refer to those two tests by different names, as 'test for symmetry based on signed-rank statistic' and 'test for median based on signed-rank statistic', respectively. The utility of such terminological differentiation should become evident through our discussion of how those tests connect and contrast with sign test and one-sample t-test. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
Examining Students' Perceptions of Globalization and Study Abroad Programs at HBCUs
ERIC Educational Resources Information Center
Walker, Stevon; Bukenya, James O.; Thomas, Terrence
2011-01-01
The objective of this paper is to explore students' perceptions of globalization and the study abroad programs at HBCUs (historically black colleges and universities). Recent statistics reveal that in spite of the current growth in the number of US students receiving academic credit for their overseas academic experience, less than one percent of…
Characterization of Global Near-Nadir Backscatter for Remote Sensing Radar Design
NASA Technical Reports Server (NTRS)
Spencer, Michael W.; Long, David G.
2000-01-01
In order to evaluate side-lobe contamination from the near-nadir region for Ku-Band radars, a statistical characterization of global near-nadir backscatter is constructed. This characterization is performed for a variety of surface types using data from TRMM, Seasat, and Topex. An assessment of the relative calibration accuracy of these sensors is also presented.
ERIC Educational Resources Information Center
Taylor, John
2008-01-01
One of the major tasks of the United Nations Permanent Forum on Indigenous Issues (UNPFII) following its establishment in 2000 has been to establish statistical profiles of the world's Indigenous peoples. As part of this broad task, it has recommended that the Millennium Development Goals and other global reporting frameworks should be assessed…
NASA Technical Reports Server (NTRS)
Christensen, E. J.; Haines, B. J.; Mccoll, K. C.; Nerem, R. S.
1994-01-01
We have compared Global Positioning System (GPS)-based dynamic and reduced-dynamic TOPEX/Poseidon orbits over three 10-day repeat cycles of the ground track. The results suggest that the prelaunch joint gravity model (JGM-1) introduces geographically correlated errors (GCEs) which have a strong meridional dependence. The global distribution and magnitude of these GCEs are consistent with a prelaunch covariance analysis, with estimated and predicted global rms error statistics of 2.3 and 2.4 cm, respectively. Repeating the analysis with the post-launch joint gravity model (JGM-2) suggests that a portion of the meridional dependence observed in JGM-1 still remains, with a global rms error of 1.2 cm.
Gu, Hai Ting; Xie, Ping; Sang, Yan Fang; Wu, Zi Yi
2018-04-01
Abrupt change is an important manifestation of a hydrological process undergoing dramatic variation in the context of global climate change, and its accurate recognition has great significance for understanding hydrological process changes and for carrying out practical hydrological and water resources work. Traditional methods are not reliable near both ends of a sample, and the results of different methods are often inconsistent. To solve this problem, we proposed a comprehensive weighted recognition method for hydrological abrupt change, based on weights derived from a comparison of 12 commonly used change-point tests. The reliability of the method was verified by Monte Carlo statistical tests. The results showed that the efficiency of the 12 methods was influenced by factors including the coefficient of variation (Cv), the skewness coefficient (Cs) before the change point, the mean value difference coefficient, the Cv difference coefficient, and the Cs difference coefficient, but had no significant relationship with the mean value of the sequence. Based on the performance of each method, a weight was assigned to each test following the results of the statistical tests. The sliding rank-sum test and the sliding run test had the highest weights, whereas the RS test had the lowest weight. By this means, the change point with the largest comprehensive weight can be selected as the final result when the results of the different methods are inconsistent. This method was used to analyze the maximum-flow series of the Jiajiu station in the lower reaches of the Lancang River for durations of 1 day, 3 days, 5 days, 7 days, and 1 month. The results showed that each series had an obvious jump in 2004, in agreement with the physical causes of hydrological process change and with water conservancy construction. The rationality and reliability of the proposed method were thus verified.
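To illustrate the weighted-voting idea (not the authors' exact implementation), the sketch below combines two common change-point detectors; the Pettitt test stands in for the full panel of 12 methods, the window width is arbitrary, and in practice the weights would come from the Monte Carlo evaluation described above.

```python
import numpy as np
from scipy import stats

def pettitt(x):
    """Pettitt change-point statistic: the location maximizing |U_t|,
    with U_t = 2 * sum of ranks up to t - t*(n+1)."""
    n = len(x)
    r = stats.rankdata(x)
    U = 2 * np.cumsum(r[:-1]) - np.arange(1, n) * (n + 1)
    return int(np.argmax(np.abs(U))) + 1

def sliding_ranksum(x, w=10):
    """Change point where the rank-sum statistic between adjacent
    windows of width w is most extreme."""
    best, best_z = None, 0.0
    for t in range(w, len(x) - w):
        z = abs(stats.ranksums(x[t - w:t], x[t:t + w]).statistic)
        if z > best_z:
            best, best_z = t, z
    return best

def weighted_change_point(x, detectors, weights):
    """Each detector votes for its change point with its weight; the
    candidate with the largest total weight wins. (In practice votes
    would be pooled over nearby candidate positions.)"""
    votes = {}
    for det, wgt in zip(detectors, weights):
        t = det(x)
        votes[t] = votes.get(t, 0.0) + wgt
    return max(votes, key=votes.get)
```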
Validation Test Report for the 1/8 deg Global Navy Coastal Ocean Model Nowcast/Forecast System
2007-01-24
Validation Test Report for the 1/8° Global Navy Coastal Ocean Model Nowcast/Forecast System. Charlie N. Barron, A. Birol Kara, Robert C. Rhodes, Clark Rowley. [Only report front matter, including the list of acronyms, is available in this record.]
NASA Astrophysics Data System (ADS)
Natividad, Gina May R.; Cawiding, Olive R.; Addawe, Rizavel C.
2017-11-01
The increase in the merchandise exports of the country offers information about the Philippines' trading role within the global economy. Merchandise export statistics are used to monitor the country's overall production that is consumed overseas. This paper compares two models obtained by (a) clustering the commodity groups into two groups based on their proportional contribution to total exports, and (b) treating only the total exports. Different seasonal autoregressive integrated moving average (SARIMA) models were then developed for the clustered commodities and for the total exports, based on the monthly merchandise exports of the Philippines from 2011 to 2016. The dataset used in this study was retrieved from the Philippine Statistics Authority (PSA), the central statistical authority of the country responsible for primary data collection. A test for the significance of the difference between means, at the 0.05 level of significance, was then performed on the forecasts produced. The result indicates that there is a significant difference between the means of the forecasts of the two models. Moreover, upon comparison of the root mean square error (RMSE) and mean absolute error (MAE) of the models, it was found that the models for the clustered groups outperform the model for the total exports.
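A SARIMA fit of this kind takes a few lines with statsmodels; the orders below are illustrative placeholders, not those identified by the authors, and the error metrics mirror the RMSE/MAE comparison described above.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

def fit_monthly_sarima(series, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12)):
    """Fit a SARIMA model to a monthly export series and return a
    12-month-ahead forecast. Orders are illustrative, not the
    paper's identified models."""
    result = SARIMAX(series, order=order, seasonal_order=seasonal_order).fit(disp=False)
    return result.forecast(steps=12)

def rmse(actual, forecast):
    """Root mean square error between held-out values and forecasts."""
    a, f = np.asarray(actual), np.asarray(forecast)
    return float(np.sqrt(np.mean((a - f) ** 2)))

def mae(actual, forecast):
    """Mean absolute error between held-out values and forecasts."""
    a, f = np.asarray(actual), np.asarray(forecast)
    return float(np.mean(np.abs(a - f)))
```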
A Novel Strategy for Continuation ECT in Geriatric Depression: Phase 2 of the PRIDE Study.
Kellner, Charles H; Husain, Mustafa M; Knapp, Rebecca G; McCall, W Vaughn; Petrides, Georgios; Rudorfer, Matthew V; Young, Robert C; Sampson, Shirlene; McClintock, Shawn M; Mueller, Martina; Prudic, Joan; Greenberg, Robert M; Weiner, Richard D; Bailine, Samuel H; Rosenquist, Peter B; Raza, Ahmad; Kaliora, Styliani; Latoussakis, Vassilios; Tobias, Kristen G; Briggs, Mimi C; Liebman, Lauren S; Geduldig, Emma T; Teklehaimanot, Abeba A; Dooley, Mary; Lisanby, Sarah H
2016-11-01
The randomized phase (phase 2) of the Prolonging Remission in Depressed Elderly (PRIDE) study evaluated the efficacy and tolerability of continuation ECT plus medication compared with medication alone in depressed geriatric patients after a successful course of ECT (phase 1). PRIDE was a two-phase multisite study. Phase 1 was an acute course of right unilateral ultrabrief pulse ECT, augmented with venlafaxine. Phase 2 compared two randomized treatment arms: a medication only arm (venlafaxine plus lithium, over 24 weeks) and an ECT plus medication arm (four continuation ECT treatments over 1 month, plus additional ECT as needed, using the Symptom-Titrated, Algorithm-Based Longitudinal ECT [STABLE] algorithm, while continuing venlafaxine plus lithium). The intent-to-treat sample comprised 120 remitters from phase 1. The primary efficacy outcome measure was score on the 24-item Hamilton Depression Rating Scale (HAM-D), and the secondary efficacy outcome was score on the Clinical Global Impressions severity scale (CGI-S). Tolerability as measured by neurocognitive performance (reported elsewhere) was assessed using an extensive test battery; global cognitive functioning as assessed by the Mini-Mental State Examination (MMSE) is reported here. Longitudinal mixed-effects repeated-measures modeling was used to compare ECT plus medication and medication alone for efficacy and global cognitive function outcomes. At 24 weeks, the ECT plus medication group had statistically significantly lower HAM-D scores than the medication only group. The difference in adjusted mean HAM-D scores at study end was 4.2 (95% CI=1.6, 6.9). Significantly more patients in the ECT plus medication group were rated "not ill at all" on the CGI-S compared with the medication only group. There was no statistically significant difference between groups in MMSE score. Additional ECT after remission (here operationalized as four continuation ECT treatments followed by further ECT only as needed) was beneficial in sustaining mood improvement for most patients.
Walling, David; Marder, Stephen R; Kane, John; Fleischhacker, W Wolfgang; Keefe, Richard S E; Hosford, David A; Dvergsten, Chris; Segreti, Anthony C; Beaver, Jessica S; Toler, Steven M; Jett, John E; Dunbar, Geoffrey C
2016-03-01
This trial was conducted to test the effects of an alpha7 nicotinic receptor full agonist, TC-5619, on negative and cognitive symptoms in subjects with schizophrenia. In 64 sites in the United States, Russia, Ukraine, Hungary, Romania, and Serbia, 477 outpatients (18-65 years; male 62%; 55% tobacco users) with schizophrenia, treated with a new-generation antipsychotic, were randomized to 24 weeks of placebo (n = 235), TC-5619, 5 mg (n = 121), or TC-5619, 50 mg (n = 121), administered orally once daily. The primary efficacy measure was the Scale for the Assessment of Negative Symptoms (SANS) composite score. Key secondary measures were the Cogstate Schizophrenia Battery (CSB) composite score and the University of California San Diego Performance-Based Skills Assessment-Brief Version (UPSA-B) total score. Secondary measures included: Positive and Negative Syndrome Scale in Schizophrenia (PANSS) total and subscale scores, SANS domain scores, CSB item scores, Clinical Global Impression-Global Improvement (CGI-I) score, CGI-Severity (CGI-S) score, and Subject Global Impression-Cognition (SGI-Cog) total score. The SANS score showed no statistical benefit for TC-5619 vs placebo at week 24 (5 mg, 2-tailed P = .159; 50 mg, P = .689). Likewise, no scores of CSB, UPSA-B, PANSS, CGI-I, CGI-S, or SGI-Cog favored TC-5619 (P > .05). Sporadic statistical benefits favoring TC-5619 on some of these outcome measures were observed in tobacco users, but these benefits did not show concordance by dose, country, gender, or other relevant measures. TC-5619 was generally well tolerated. These results do not support a benefit of TC-5619 for negative or cognitive symptoms in schizophrenia. © The Author 2015. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Thomas, Diala; Bachy, Manon; Courvoisier, Aurélien; Dubory, Arnaud; Bouloussa, Houssam; Vialle, Raphaël
2015-03-01
Spinopelvic alignment is crucial in assessing an energy-efficient posture in both normal and disease states, such as high-displacement developmental spondylolisthesis (HDDS). The overall effect of local surgical correction of lumbosacral imbalance on global spinal balance in patients with HDDS remains unclear. This paper reports the progressive spontaneous improvement of global sagittal balance following surgical correction of lumbosacral imbalance in patients with HDDS. The records of 15 patients with HDDS who underwent surgery between 2005 and 2010 were reviewed. The treatment consisted of L4-sacrum reduction and fusion via a posterior approach, resulting in complete correction of lumbosacral kyphosis. Preoperative, 6-month postoperative, and final follow-up angular measurements were taken from full-spine lateral radiographs obtained with the patient in a standard standing position. Radiographic measurements included pelvic incidence, sacral slope, lumbar lordosis, and thoracic kyphosis. The degree of lumbosacral kyphosis was evaluated by the lumbosacral angle. Because of the small number of patients, nonparametric tests were used for data analysis. Preoperative lumbosacral kyphosis and L-5 anterior slip were corrected by instrumentation. Transient neurological complications were noted in 5 patients. Statistical analysis showed a significant increase of thoracic kyphosis on 6-month postoperative and final follow-up radiographs (p < 0.001). A statistically significant decrease of lumbar lordosis was noted between preoperative and 6-month control radiographs (p < 0.001) and between preoperative and final follow-up radiographs (p < 0.001). Based on the authors' observations, this technique resulted in an effective reduction of L-5 anterior slip and a significant reduction of lumbosacral kyphosis (the lumbosacral angle improved from 69.8° to 105.13°). Owing to the complete reduction of lumbosacral kyphosis and of the anterior trunk displacement associated with L-5 anterior slipping, lumbar lordosis progressively decreased and thoracic kyphosis progressively increased postoperatively. Adjusting the sagittal trunk balance produced not only pelvic anteversion, but also reciprocal adjustment of lumbar lordosis and thoracic kyphosis, creating a satisfactory level of compensated global sagittal balance.
NASA Astrophysics Data System (ADS)
Mushtak, V. C.; Williams, E. R.
2011-12-01
Among the palette of methods (satellite, VLF, ELF) for monitoring global lightning activity, observations of the background Schumann resonances (SR) provide a unique prospect for estimating the integrated intensity of global lightning activity in absolute units (C² km²/s). This prospect is ensured by the low attenuation of SR waves, whose wavelengths are commensurate with the dimensions of the dominant regional lightning "chimneys", and by the maturing methodology for background SR techniques. Another benefit is the reduction of SR measurements to a compact set of resonance characteristics (modal frequencies, intensities, and quality factors). Suggested and tested in numerical simulations by T.R. Madden in the 1960s, the idea of inverting the SR characteristics for the global lightning source has been further developed, statistically substantiated, and practically realized here, on the basis of computing power and a quantity of experimental material far beyond what the SR pioneers had at their disposal. The critical issue of the quality of the input SR parameters is addressed by implementing a statistically substantiated sanitizing procedure that discards fragments of the observed time series containing unrepresentative elements: local interference of various origins and strong ELF transients originating outside the major "chimneys" represented in the source model. As a result of preliminary research, a universal empirical sanitizing criterion has been established. Because the observations have been collected from a set of individually organized ELF stations with various equipment sets and calibration techniques, relative parameters in both the input (the intensities) and the output (the "chimney" activities) are used as far as possible in the inversion process to avoid instabilities caused by calibration inconsistencies. The absolute regional activities - and hence the sought-for global activity in absolute units - are determined in the final stage from the estimated positions and relative activities of the modeled "chimneys", using SR power spectra at the stations with the most reliable calibrations. Additional stabilization in the procedure has been achieved by exploiting the Le Come/Goltzman inversion algorithm, which uses the empirically estimated statistical characteristics of the input parameters. When applied to electric and/or magnetic observations collected simultaneously in January 2009 from six ELF stations in Poland (Belsk), Japan (Moshiri), Hungary (Nagycenk), USA (Rhode Island), India (Shillong), and Antarctica (Syowa), the inversion procedure reveals a general repeatability of diurnal lightning scenarios, with "chimney" centroid locations varying by a few megameters and estimated regional activity varying from day to day by up to several tens of percent. A combined empirical-theoretical analysis of the collected data, aimed at selecting the most reliably calibrated ELF stations, is in progress. Every effort is being made to express the relative lightning activity in absolute units by the time of this meeting. The authors are grateful to all the experimentalists who generously provided their observations and related information for this study.
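The inversion described above relies on modeled ELF propagation and on the specific stabilized algorithm the authors name, neither of which is reproduced here. As a heavily simplified sketch of the generic linear step, recovering relative "chimney" activities from station intensities, one could use non-negative least squares; the sensitivity matrix below is entirely hypothetical:

```python
# Illustrative sketch only: recovering relative "chimney" activities from
# station intensity observations via non-negative least squares. In the
# actual work, the sensitivity entries come from an ELF propagation model.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

# Hypothetical sensitivity matrix: rows = station/mode intensity
# observations, columns = three regional chimneys (Asia, Africa, America).
A = rng.uniform(0.2, 1.0, size=(12, 3))

true_activity = np.array([1.0, 1.6, 0.8])  # relative source strengths
y = A @ true_activity * (1 + 0.05 * rng.standard_normal(12))  # noisy data

# Non-negativity keeps the recovered source strengths physical.
est, residual = nnls(A, y)
est /= est.max()  # report activities relative to the strongest chimney
print("relative chimney activities:", np.round(est, 2))
```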
The effect of rare variants on inflation of the test statistics in case-control analyses.
Pirie, Ailith; Wood, Angela; Lush, Michael; Tyrer, Jonathan; Pharoah, Paul D P
2015-02-20
The detection of bias due to cryptic population structure is an important step in the evaluation of findings of genetic association studies. The standard method of measuring this bias in a genetic association study is to compare the observed median association test statistic to the expected median test statistic. This ratio is inflated in the presence of cryptic population structure. However, inflation may also be caused by the properties of the association test itself, particularly in the analysis of rare variants. Using simulated data, we compared the properties of the three most commonly used association tests (the likelihood ratio test, the Wald test, and the score test) when testing rare variants for association. We found evidence of inflation in the median test statistics of the likelihood ratio and score tests for tests of variants with fewer than 20 heterozygotes across the sample, regardless of the total sample size. The test statistics for the Wald test were under-inflated at the median for variants below the same threshold. In a genetic association study, if a substantial proportion of the genetic variants tested have rare minor allele frequencies, the properties of the association test may mask the presence or absence of bias due to population structure. The use of either the likelihood ratio test or the score test is likely to lead to inflation in the median test statistic in the absence of population structure. In contrast, the use of the Wald test is likely to result in under-inflation of the median test statistic, which may mask the presence of population structure.
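A minimal sketch of the inflation diagnostic discussed above: the ratio of the observed median chi-squared statistic to its null expectation (the genomic-control lambda), computed on variants simulated with no true association and no population structure. A simple allelic chi-squared stands in for the paper's likelihood ratio, Wald, and score tests, so this illustrates the diagnostic itself rather than the paper's exact comparisons:

```python
# Sketch of the genomic-control inflation factor on null-simulated rare
# variants. Any departure of lambda from 1 here reflects properties of
# the test at low allele counts, not population structure.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)
n_cases, n_controls, n_variants = 1000, 1000, 5000
maf = 0.005  # rare variants: roughly 20 minor alleles expected per variant

stats = []
for _ in range(n_variants):
    # Genotypes (0/1/2 minor-allele copies) under Hardy-Weinberg, no effect.
    g_case = rng.binomial(2, maf, n_cases)
    g_ctrl = rng.binomial(2, maf, n_controls)
    # 1-df Pearson chi-squared from the 2x2 allele-count table.
    a1, a0 = g_case.sum(), 2 * n_cases - g_case.sum()
    b1, b0 = g_ctrl.sum(), 2 * n_controls - g_ctrl.sum()
    n = a1 + a0 + b1 + b0
    expected = (a1 + b1) * (a1 + a0) / n
    var = (a1 + b1) * (a0 + b0) * (a1 + a0) * (b1 + b0) / n**3
    if var > 0:
        stats.append((a1 - expected) ** 2 / var)

# Expected null median of a 1-df chi-squared is about 0.455.
lam = np.median(stats) / chi2.ppf(0.5, df=1)
print(f"inflation factor lambda = {lam:.3f}")
```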