Sample records for global regression methods

  1. Resting-state functional magnetic resonance imaging: the impact of regression analysis.

    PubMed

    Yeh, Chia-Jung; Tseng, Yu-Sheng; Lin, Yi-Ru; Tsai, Shang-Yueh; Huang, Teng-Yi

    2015-01-01

    To investigate the impact of regression methods on resting-state functional magnetic resonance imaging (rsfMRI). During rsfMRI preprocessing, regression analysis is considered effective for reducing the interference of physiological noise on the signal time course. However, it is unclear whether the regression method benefits rsfMRI analysis. Twenty volunteers (10 men and 10 women; aged 23.4 ± 1.5 years) participated in the experiments. We used node analysis and functional connectivity mapping to assess the brain default mode network by using five combinations of regression methods. The results show that regressing the global mean plays a major role in the preprocessing steps. When a global regression method is applied, the values of functional connectivity are significantly lower (P ≤ .01) than those calculated without a global regression. This step increases inter-subject variation and produces anticorrelated brain areas. rsfMRI data processed using regression should be interpreted carefully. The significance of the anticorrelated brain areas produced by global signal removal is unclear. Copyright © 2014 by the American Society of Neuroimaging.
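The preprocessing step under study, regressing the mean time course over all voxels out of each voxel's series, can be sketched in a few lines (a minimal illustration with made-up data, not the authors' pipeline):

```python
def regress_out(series, regressor):
    """Residual of `series` after removing `regressor` by least squares."""
    beta = sum(s * g for s, g in zip(series, regressor)) / \
           sum(g * g for g in regressor)
    return [s - beta * g for s, g in zip(series, regressor)]

def global_signal_regression(voxels):
    """Remove the across-voxel mean time course from every voxel."""
    n = len(voxels)
    global_mean = [sum(col) / n for col in zip(*voxels)]
    return [regress_out(v, global_mean) for v in voxels]

# Three made-up "voxel" time courses sharing a common drift.
voxels = [[1.0, 2.0, 3.0, 4.0],
          [2.1, 4.2, 6.1, 8.3],
          [0.9, 2.2, 2.8, 4.1]]
cleaned = global_signal_regression(voxels)
# Each residual series is now orthogonal to the global mean time course.
```

After this step, any variance shared with the global mean is gone, which is exactly why connectivity values drop and anticorrelated regions can appear.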

  2. Similar Estimates of Temperature Impacts on Global Wheat Yield by Three Independent Methods

    NASA Technical Reports Server (NTRS)

    Liu, Bing; Asseng, Senthold; Müller, Christoph; Ewert, Frank; Elliott, Joshua; Lobell, David B.; Martre, Pierre; Ruane, Alex C.; Wallach, Daniel; Jones, James W.

    2016-01-01

    The potential impact of global temperature change on global crop yield has recently been assessed with different methods. Here we show that grid-based and point-based simulations and statistical regressions (from historic records), without deliberate adaptation or CO2 fertilization effects, produce similar estimates of temperature impact on wheat yields at global and national scales. With a 1 °C global temperature increase, global wheat yield is projected to decline between 4.1% and 6.4%. Projected relative temperature impacts from different methods were similar for major wheat-producing countries China, India, USA and France, but less so for Russia. Point-based and grid-based simulations, and to some extent the statistical regressions, were consistent in projecting that warmer regions are likely to suffer more yield loss with increasing temperature than cooler regions. By forming a multi-method ensemble, it was possible to quantify 'method uncertainty' in addition to model uncertainty. This significantly improves confidence in estimates of climate impacts on global food security.

  3. Similar estimates of temperature impacts on global wheat yield by three independent methods

    NASA Astrophysics Data System (ADS)

    Liu, Bing; Asseng, Senthold; Müller, Christoph; Ewert, Frank; Elliott, Joshua; Lobell, David B.; Martre, Pierre; Ruane, Alex C.; Wallach, Daniel; Jones, James W.; Rosenzweig, Cynthia; Aggarwal, Pramod K.; Alderman, Phillip D.; Anothai, Jakarat; Basso, Bruno; Biernath, Christian; Cammarano, Davide; Challinor, Andy; Deryng, Delphine; Sanctis, Giacomo De; Doltra, Jordi; Fereres, Elias; Folberth, Christian; Garcia-Vila, Margarita; Gayler, Sebastian; Hoogenboom, Gerrit; Hunt, Leslie A.; Izaurralde, Roberto C.; Jabloun, Mohamed; Jones, Curtis D.; Kersebaum, Kurt C.; Kimball, Bruce A.; Koehler, Ann-Kristin; Kumar, Soora Naresh; Nendel, Claas; O'Leary, Garry J.; Olesen, Jørgen E.; Ottman, Michael J.; Palosuo, Taru; Prasad, P. V. Vara; Priesack, Eckart; Pugh, Thomas A. M.; Reynolds, Matthew; Rezaei, Ehsan E.; Rötter, Reimund P.; Schmid, Erwin; Semenov, Mikhail A.; Shcherbak, Iurii; Stehfest, Elke; Stöckle, Claudio O.; Stratonovitch, Pierre; Streck, Thilo; Supit, Iwan; Tao, Fulu; Thorburn, Peter; Waha, Katharina; Wall, Gerard W.; Wang, Enli; White, Jeffrey W.; Wolf, Joost; Zhao, Zhigan; Zhu, Yan

    2016-12-01

    The potential impact of global temperature change on global crop yield has recently been assessed with different methods. Here we show that grid-based and point-based simulations and statistical regressions (from historic records), without deliberate adaptation or CO2 fertilization effects, produce similar estimates of temperature impact on wheat yields at global and national scales. With a 1 °C global temperature increase, global wheat yield is projected to decline between 4.1% and 6.4%. Projected relative temperature impacts from different methods were similar for major wheat-producing countries China, India, USA and France, but less so for Russia. Point-based and grid-based simulations, and to some extent the statistical regressions, were consistent in projecting that warmer regions are likely to suffer more yield loss with increasing temperature than cooler regions. By forming a multi-method ensemble, it was possible to quantify 'method uncertainty' in addition to model uncertainty. This significantly improves confidence in estimates of climate impacts on global food security.

  4. Partial F-tests with multiply imputed data in the linear regression framework via coefficient of determination.

    PubMed

    Chaurasia, Ashok; Harel, Ofer

    2015-02-10

    Tests for regression coefficients such as global, local, and partial F-tests are common in applied research. In the framework of multiple imputation, there are several papers addressing tests for regression coefficients. However, for simultaneous hypothesis testing, the existing methods are computationally intensive because they involve calculation with vectors and (inversion of) matrices. In this paper, we propose a simple method based on the scalar entity, coefficient of determination, to perform (global, local, and partial) F-tests with multiply imputed data. The proposed method is evaluated using simulated data and applied to suicide prevention data. Copyright © 2014 John Wiley & Sons, Ltd.
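For intuition, the scalar identity the method builds on can be written down directly; this is the standard complete-data partial F statistic expressed through the full and reduced models' coefficients of determination, not the authors' multiple-imputation procedure:

```python
def partial_f(r2_full, r2_reduced, n, p_full, q):
    """Partial F statistic for testing q coefficients of a full model with
    p_full predictors, fit to n observations, using only the two R2 values."""
    numerator = (r2_full - r2_reduced) / q
    denominator = (1.0 - r2_full) / (n - p_full - 1)
    return numerator / denominator

# Made-up example: n = 50 observations, 4 predictors, testing 2 of them.
F = partial_f(r2_full=0.60, r2_reduced=0.50, n=50, p_full=4, q=2)  # about 5.625
```

Because the statistic depends on the data only through two scalars, pooling across imputations avoids the vector and matrix computations the abstract mentions.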

  5. Similar negative impacts of temperature on global wheat yield estimated by three independent methods

    USDA-ARS's Scientific Manuscript database

    The potential impact of global temperature change on global wheat production has recently been assessed with different methods, scaling and aggregation approaches. Here we show that grid-based simulations, point-based simulations, and statistical regressions produce similar estimates of temperature ...

  6. Statistical downscaling modeling with quantile regression using lasso to estimate extreme rainfall

    NASA Astrophysics Data System (ADS)

    Santri, Dewi; Wigena, Aji Hamim; Djuraidah, Anik

    2016-02-01

    Rainfall is a highly variable climatic element, and extreme rainfall in particular has many negative impacts, so methods are needed to minimize the damage it may cause. So far, global circulation models (GCM) are the best method for forecasting global climate change, including extreme rainfall. Statistical downscaling (SD) is a technique for developing the relationship between GCM output as global-scale independent variables and rainfall as a local-scale response variable. Using GCM output directly is difficult when assessed against observations because GCM data are high dimensional and the variables are multicollinear. The common methods used to handle this problem are principal component analysis (PCA) and partial least squares regression; a newer alternative is the lasso. The lasso has the advantage of simultaneously controlling the variance of the fitted coefficients and performing automatic variable selection. Quantile regression is a method that can detect extreme rainfall at both the dry and wet extremes. The objective of this study is to model SD using quantile regression with the lasso to predict extreme rainfall in Indramayu. The results showed that extreme rainfall (extreme wet in January, February, and December) in Indramayu could be predicted properly by the model at the 90th quantile.
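The reason a high quantile targets wet extremes can be seen from the pinball (check) loss that quantile regression minimizes; the sketch below (a toy illustration, not the paper's lasso-penalized model) shows that the best constant fit at the 0.9 quantile lies near the top of the sample:

```python
def pinball(c, ys, tau):
    """Pinball loss of the constant prediction c at quantile level tau."""
    return sum(tau * (y - c) if y >= c else (1 - tau) * (c - y) for y in ys)

rain = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]   # made-up rainfall amounts
tau = 0.9
best = min(rain, key=lambda c: pinball(c, rain, tau))
# `best` sits at the upper end of the sample, near the 90th percentile.
```

Replacing the constant with a linear (or lasso-penalized) function of GCM predictors gives quantile regression proper.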

  7. Correcting for Blood Arrival Time in Global Mean Regression Enhances Functional Connectivity Analysis of Resting State fMRI-BOLD Signals.

    PubMed

    Erdoğan, Sinem B; Tong, Yunjie; Hocke, Lia M; Lindsey, Kimberly P; deB Frederick, Blaise

    2016-01-01

    Resting state functional connectivity analysis is a widely used method for mapping intrinsic functional organization of the brain. Global signal regression (GSR) is commonly employed for removing systemic global variance from resting state BOLD-fMRI data; however, recent studies have demonstrated that GSR may introduce spurious negative correlations within and between functional networks, calling into question the meaning of anticorrelations reported between some networks. In the present study, we propose that global signal from resting state fMRI is composed primarily of systemic low frequency oscillations (sLFOs) that propagate with cerebral blood circulation throughout the brain. We introduce a novel systemic noise removal strategy for resting state fMRI data, "dynamic global signal regression" (dGSR), which applies a voxel-specific optimal time delay to the global signal prior to regression from voxel-wise time series. We test our hypothesis on two functional systems that are suggested to be intrinsically organized into anticorrelated networks: the default mode network (DMN) and task positive network (TPN). We evaluate the efficacy of dGSR and compare its performance with the conventional "static" global regression (sGSR) method in terms of (i) explaining systemic variance in the data and (ii) enhancing specificity and sensitivity of functional connectivity measures. dGSR increases the amount of BOLD signal variance being modeled and removed relative to sGSR while reducing spurious negative correlations introduced in reference regions by sGSR, and attenuating inflated positive connectivity measures. We conclude that incorporating time delay information for sLFOs into global noise removal strategies is of crucial importance for optimal noise removal from resting state functional connectivity maps.
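The voxel-specific delay at the heart of dGSR can be illustrated with integer lags and made-up data; the real method estimates a per-voxel time shift of the global (sLFO) signal before regressing it out:

```python
import math

def corr(a, b):
    """Pearson correlation of two equal-length sequences."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = [x - ma for x in a]
    vb = [x - mb for x in b]
    return sum(x * y for x, y in zip(va, vb)) / math.sqrt(
        sum(x * x for x in va) * sum(y * y for y in vb))

def best_lag(voxel, global_sig, max_lag=3):
    """Integer lag (in samples) maximizing correlation between the voxel
    series and the lag-shifted global signal."""
    scores = {}
    for lag in range(0, max_lag + 1):
        shifted = global_sig[:len(global_sig) - lag]
        scores[lag] = corr(voxel[lag:], shifted)
    return max(scores, key=scores.get)

g = [0, 1, 0, -1, 0, 1, 0, -1, 0, 1]       # "global" oscillation (toy)
voxel = [9, 9] + [x + 9 for x in g[:-2]]   # same oscillation, arriving 2 samples late
lag = best_lag(voxel, g)
```

Once the delay is known, the shifted copy of the global signal (rather than the simultaneous one) is regressed out of that voxel.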

  8. [Regression on order statistics and its application in estimating nondetects for food exposure assessment].

    PubMed

    Yu, Xiaojin; Liu, Pei; Min, Jie; Chen, Qiguang

    2009-01-01

    To explore the application of regression on order statistics (ROS) in estimating nondetects for food exposure assessment. ROS was applied to a cadmium residual data set from global food contaminant monitoring; the mean residual was estimated using SAS programming and compared with the results from substitution methods. The results show that the ROS method clearly outperforms substitution methods, being robust and convenient for subsequent analysis. Regression on order statistics is worth adopting, but more effort should be devoted to the details of its application.
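A much-simplified sketch of ROS, assuming a normal distribution and a single detection limit (real implementations for contaminant data typically work on a log scale and handle multiple detection limits):

```python
from statistics import NormalDist

def ros_mean(detected, n_nondetect):
    """Estimate the full-sample mean when n_nondetect values fell below the
    detection limit, by regressing detected order statistics on normal
    quantiles and extrapolating to the censored ranks."""
    n = len(detected) + n_nondetect
    xs, ys = [], []
    for i, v in enumerate(sorted(detected)):
        rank = n_nondetect + i + 1            # nondetects occupy the lowest ranks
        p = (rank - 0.375) / (n + 0.25)       # Blom plotting position
        xs.append(NormalDist().inv_cdf(p))
        ys.append(v)
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    # Impute each nondetect from its own quantile, then average everything.
    imputed = [intercept + slope * NormalDist().inv_cdf((r - 0.375) / (n + 0.25))
               for r in range(1, n_nondetect + 1)]
    return (sum(imputed) + sum(detected)) / n

# Made-up data: 5 detected values, 3 censored below the detection limit.
est = ros_mean(detected=[3.0, 3.5, 4.0, 4.5, 5.0], n_nondetect=3)
```

The estimate falls below the mean of the detected values alone, which is the whole point: substitution methods (e.g., half the detection limit) fix the censored values arbitrarily, whereas ROS lets the fitted distribution place them.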

  9. Mental chronometry with simple linear regression.

    PubMed

    Chen, J Y

    1997-10-01

    Typically, mental chronometry is performed by introducing an independent variable postulated to selectively affect some stage of a presumed multistage process. However, the effect could be a global one that spreads proportionally over all stages of the process. Currently, there is no method to test this possibility, although simple linear regression might serve the purpose. In the present study, the regression approach was tested with tasks (memory scanning and mental rotation) that, according to the dominant theories, involve a selective effect, and with a task (the word superiority effect) that involves a global effect. The results indicate that (1) manipulating the size of a memory set or the angular disparity affects the intercept of the regression function that relates the times for memory scanning with different set sizes or for mental rotation with different angular disparities, and (2) manipulating context affects the slope of the regression function that relates the times for detecting a target character under word and nonword conditions. These results ratify the regression approach as a useful method for performing mental chronometry.
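The regression logic can be made concrete with fabricated reaction times: an inserted stage adds a constant (moving the intercept), while a proportional global effect multiplies every stage (moving the slope):

```python
def fit_line(xs, ys):
    """Ordinary least squares fit; returns (slope, intercept)."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

base = [400.0, 450.0, 500.0, 550.0, 600.0]   # made-up reaction times (ms)
additive = [t + 80 for t in base]            # a selective stage adds 80 ms
proportional = [1.3 * t for t in base]       # a global effect scales all stages

s1, i1 = fit_line(base, additive)      # slope stays 1, intercept moves to 80
s2, i2 = fit_line(base, proportional)  # slope moves to 1.3, intercept stays 0
```

Regressing condition times on baseline times thus separates the two hypotheses by whichever parameter departs from its null value.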

  10. The impact of global signal regression on resting state correlations: Are anti-correlated networks introduced?

    PubMed Central

    Murphy, Kevin; Birn, Rasmus M.; Handwerker, Daniel A.; Jones, Tyler B.; Bandettini, Peter A.

    2009-01-01

    Low-frequency fluctuations in fMRI signal have been used to map several consistent resting state networks in the brain. Using the posterior cingulate cortex as a seed region, functional connectivity analyses have found not only positive correlations in the default mode network but negative correlations in another resting state network related to attentional processes. The interpretation is that the human brain is intrinsically organized into dynamic, anti-correlated functional networks. Global variations of the BOLD signal are often considered nuisance effects and are commonly removed using a general linear model (GLM) technique. This global signal regression method has been shown to introduce negative activation measures in standard fMRI analyses. The topic of this paper is whether such a correction technique could be the cause of anti-correlated resting state networks in functional connectivity analyses. Here we show that, after global signal regression, correlation values to a seed voxel must sum to a negative value. Simulations also show that small phase differences between regions can lead to spurious negative correlation values. A combination breath holding and visual task demonstrates that the relative phase of global and local signals can affect connectivity measures and that, experimentally, global signal regression leads to bell-shaped correlation value distributions, centred on zero. Finally, analyses of negatively correlated networks in resting state data show that global signal regression is most likely the cause of anti-correlations. These results call into question the interpretation of negatively correlated regions in the brain when using global signal regression as an initial processing step. PMID:18976716
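The paper's central identity is easy to verify numerically: once the global mean is regressed out of every (demeaned) voxel, the residuals sum to zero across voxels, so covariances with any seed must sum to zero, forcing some negative values. A toy check:

```python
def demean(x):
    m = sum(x) / len(x)
    return [v - m for v in x]

def regress_out(x, g):
    """Least-squares removal of regressor g from series x."""
    beta = sum(a * b for a, b in zip(x, g)) / sum(b * b for b in g)
    return [a - beta * b for a, b in zip(x, g)]

voxels = [[3.0, 1.0, 4.0, 1.0, 5.0],
          [2.0, 7.0, 1.0, 8.0, 2.0],
          [8.0, 1.0, 8.0, 2.0, 8.0]]
g = demean([sum(col) / len(voxels) for col in zip(*voxels)])
resid = [regress_out(demean(v), g) for v in voxels]

seed = resid[0]
covs = [sum(a * b for a, b in zip(seed, r)) / len(seed) for r in resid]
# sum(covs) is ~0, and covs[0] = var(seed) > 0, so the rest must sum to < 0.
```

Since the regression coefficients sum to the number of voxels, the cleaned series cancel exactly, which is why anticorrelations are guaranteed rather than merely possible.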

  11. The impact of global signal regression on resting state correlations: are anti-correlated networks introduced?

    PubMed

    Murphy, Kevin; Birn, Rasmus M; Handwerker, Daniel A; Jones, Tyler B; Bandettini, Peter A

    2009-02-01

    Low-frequency fluctuations in fMRI signal have been used to map several consistent resting state networks in the brain. Using the posterior cingulate cortex as a seed region, functional connectivity analyses have found not only positive correlations in the default mode network but negative correlations in another resting state network related to attentional processes. The interpretation is that the human brain is intrinsically organized into dynamic, anti-correlated functional networks. Global variations of the BOLD signal are often considered nuisance effects and are commonly removed using a general linear model (GLM) technique. This global signal regression method has been shown to introduce negative activation measures in standard fMRI analyses. The topic of this paper is whether such a correction technique could be the cause of anti-correlated resting state networks in functional connectivity analyses. Here we show that, after global signal regression, correlation values to a seed voxel must sum to a negative value. Simulations also show that small phase differences between regions can lead to spurious negative correlation values. A combination breath holding and visual task demonstrates that the relative phase of global and local signals can affect connectivity measures and that, experimentally, global signal regression leads to bell-shaped correlation value distributions, centred on zero. Finally, analyses of negatively correlated networks in resting state data show that global signal regression is most likely the cause of anti-correlations. These results call into question the interpretation of negatively correlated regions in the brain when using global signal regression as an initial processing step.

  12. Using temporal ICA to selectively remove global noise while preserving global signal in functional MRI data.

    PubMed

    Glasser, Matthew F; Coalson, Timothy S; Bijsterbosch, Janine D; Harrison, Samuel J; Harms, Michael P; Anticevic, Alan; Van Essen, David C; Smith, Stephen M

    2018-06-02

    Temporal fluctuations in functional Magnetic Resonance Imaging (fMRI) have been profitably used to study brain activity and connectivity for over two decades. Unfortunately, fMRI data also contain structured temporal "noise" from a variety of sources, including subject motion, subject physiology, and the MRI equipment. Recently, methods have been developed to automatically and selectively remove spatially specific structured noise from fMRI data using spatial Independent Components Analysis (ICA) and machine learning classifiers. Spatial ICA is particularly effective at removing spatially specific structured noise from high temporal and spatial resolution fMRI data of the type acquired by the Human Connectome Project and similar studies. However, spatial ICA is mathematically, by design, unable to separate spatially widespread "global" structured noise from fMRI data (e.g., blood flow modulations from subject respiration). No methods currently exist to selectively and completely remove global structured noise while retaining the global signal from neural activity. This has left the field in a quandary: to do or not to do global signal regression, given that both choices have substantial downsides. Here we show that temporal ICA can selectively segregate and remove global structured noise while retaining global neural signal in both task-based and resting state fMRI data. We compare the results before and after temporal ICA cleanup to those from global signal regression and show that temporal ICA cleanup removes the global positive biases caused by global physiological noise without inducing the network-specific negative biases of global signal regression. We believe that temporal ICA cleanup provides a "best of both worlds" solution to the global signal and global noise dilemma and that temporal ICA itself unlocks interesting neurobiological insights from fMRI data. Copyright © 2018 Elsevier Inc. All rights reserved.

  13. Estimated global nitrogen deposition using NO2 column density

    USGS Publications Warehouse

    Lu, Xuehe; Jiang, Hong; Zhang, Xiuying; Liu, Jinxun; Zhang, Zhen; Jin, Jiaxin; Wang, Ying; Xu, Jianhui; Cheng, Miaomiao

    2013-01-01

    Global nitrogen deposition has increased over the past 100 years. Monitoring and simulation studies of nitrogen deposition have evaluated nitrogen deposition at both the global and regional scale. With the development of remote-sensing instruments, tropospheric NO2 column density retrieved from Global Ozone Monitoring Experiment (GOME) and Scanning Imaging Absorption Spectrometer for Atmospheric Chartography (SCIAMACHY) sensors now provides us with a new opportunity to understand changes in reactive nitrogen in the atmosphere. The concentration of NO2 in the atmosphere has a significant effect on atmospheric nitrogen deposition. According to the general nitrogen deposition calculation method, we use the principal component regression method to evaluate global nitrogen deposition based on global NO2 column density and meteorological data. From the accuracy of the simulation, about 70% of the land area of the Earth passed a significance test of regression. In addition, NO2 column density has a significant influence on regression results over 44% of global land. The simulated results show that global average nitrogen deposition was 0.34 g m−2 yr−1 from 1996 to 2009 and is increasing at about 1% per year. Our simulated results show that China, Europe, and the USA are the three hotspots of nitrogen deposition according to previous research findings. In this study, Southern Asia was found to be another hotspot of nitrogen deposition (about 1.58 g m−2 yr−1 and maintaining a high growth rate). As nitrogen deposition increases, the number of regions threatened by high nitrogen deposits is also increasing. With N emissions continuing to increase in the future, areas whose ecosystem is affected by high level nitrogen deposition will increase.

  14. Integrative eQTL analysis of tumor and host omics data in individuals with bladder cancer.

    PubMed

    Pineda, Silvia; Van Steen, Kristel; Malats, Núria

    2017-09-01

    Integrative analyses of several omics data are emerging. The data are usually generated from the same source material (i.e., tumor sample) representing one level of regulation. However, integrating different regulatory levels (i.e., blood) with those from tumor may also reveal important knowledge about the human genetic architecture. To model this multilevel structure, an integrative-expression quantitative trait loci (eQTL) analysis applying two-stage regression (2SR) was proposed. This approach first regressed tumor gene expression levels with tumor markers, and the adjusted residuals from the previous model were then regressed with the germline genotypes measured in blood. Previously, we demonstrated that penalized regression methods in combination with a permutation-based MaxT method (Global-LASSO) are a promising tool to fix some of the challenges that high-throughput omics data analysis imposes. Here, we assessed whether Global-LASSO can also be applied when tumor and blood omics data are integrated. We further compared our strategy with two 2SR approaches, one using multiple linear regression (2SR-MLR) and the other using LASSO (2SR-LASSO). We applied the three models to integrate genomic, epigenomic, and transcriptomic data from tumor tissue with blood germline genotypes from 181 individuals with bladder cancer included in the TCGA Consortium. Global-LASSO provided a larger list of eQTLs than the 2SR methods, identified a previously reported eQTL in prostate stem cell antigen (PSCA), and provided further clues on the complexity of the APOBEC3B locus, with a minimal false-positive rate not achieved by 2SR-MLR. It also represents an important contribution to omics integrative analysis because it is easy to apply and adaptable to any type of data. © 2017 WILEY PERIODICALS, INC.
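The two-stage regression (2SR) scheme the authors compare against can be sketched with made-up scalars; the marker and genotype below are deliberately constructed to be orthogonal so the genotype effect survives stage one exactly, and plain OLS stands in for the penalized models:

```python
def ols(xs, ys):
    """Simple least-squares fit; returns (slope, intercept)."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return b, my - b * mx

genotype = [0, 1, 2, 0, 1, 2]                 # toy allele counts (blood)
marker   = [1.0, 2.0, 1.0, 2.0, 1.0, 2.0]     # toy tumor marker, orthogonal to genotype
# Toy expression: marker effect (2.0) plus genotype effect (0.7), no noise.
expr = [2.0 * m + 0.7 * g for m, g in zip(marker, genotype)]

b1, a1 = ols(marker, expr)                    # stage 1: expression ~ tumor marker
resid = [y - (a1 + b1 * x) for x, y in zip(marker, expr)]
b2, a2 = ols(genotype, resid)                 # stage 2: residual ~ germline genotype
```

With correlated marker and genotype the stage-two coefficient would be attenuated, which is one reason joint penalized models can behave differently from 2SR.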

  15. Two-step superresolution approach for surveillance face image through radial basis function-partial least squares regression and locality-induced sparse representation

    NASA Astrophysics Data System (ADS)

    Jiang, Junjun; Hu, Ruimin; Han, Zhen; Wang, Zhongyuan; Chen, Jun

    2013-10-01

    Face superresolution (SR), or face hallucination, refers to the technique of generating a high-resolution (HR) face image from a low-resolution (LR) one with the help of a set of training examples. It aims at transcending the limitations of electronic imaging systems. Applications of face SR include video surveillance, in which the individual of interest is often far from cameras. A two-step method is proposed to infer a high-quality and HR face image from a low-quality and LR observation. First, we establish the nonlinear relationship between LR face images and HR ones, according to radial basis function and partial least squares (RBF-PLS) regression, to transform the LR face into the global face space. Then, a locality-induced sparse representation (LiSR) approach is presented to enhance the local facial details once all the global faces for each LR training face are constructed. A comparison of some state-of-the-art SR methods shows the superiority of the proposed two-step approach, RBF-PLS global face regression followed by LiSR-based local patch reconstruction. Experiments also demonstrate the effectiveness under both simulation conditions and some real conditions.

  16. [Local Regression Algorithm Based on Net Analyte Signal and Its Application in Near Infrared Spectral Analysis].

    PubMed

    Zhang, Hong-guang; Lu, Jian-gang

    2016-02-01

    To overcome the problems of significant differences among samples and nonlinearity between the property and spectra of samples in spectral quantitative analysis, a local regression algorithm is proposed in this paper. In this algorithm, the net analyte signal (NAS) method was first used to obtain the net analyte signal of the calibration samples and unknown samples; the Euclidean distance between the net analyte signal of an unknown sample and those of the calibration samples was then calculated and used as a similarity index. According to this similarity index, a local calibration set was selected individually for each unknown sample. Finally, a local PLS regression model was built on the local calibration set of each unknown sample. The proposed method was applied to a set of near infrared spectra of meat samples. The results demonstrate that the prediction precision and model complexity of the proposed method are superior to those of the global PLS regression method and a conventional local regression algorithm based on spectral Euclidean distance.
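The local-modeling loop can be sketched as follows, with made-up two-dimensional "net analyte signal" vectors; a simple neighbour average stands in for the local PLS model:

```python
import math

def nearest(unknown, calibration, k):
    """Return the k calibration samples closest to `unknown` in signal space."""
    return sorted(calibration, key=lambda s: math.dist(unknown, s[0]))[:k]

# (signal_vector, property_value) pairs, a toy calibration set.
calib = [((0.0, 0.0), 1.0), ((1.0, 0.0), 2.0), ((0.0, 1.0), 2.0),
         ((5.0, 5.0), 9.0), ((6.0, 5.0), 10.0), ((5.0, 6.0), 10.0)]

local = nearest((0.2, 0.1), calib, k=3)
pred = sum(p for _, p in local) / len(local)  # stand-in for the local PLS model
```

The point of the NAS step in the paper is that distances are computed on the analyte-relevant part of the spectrum rather than on the raw spectrum, so the selected neighbours are chemically, not just spectrally, similar.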

  17. Evaluation of Denoising Strategies to Address Motion-Correlated Artifacts in Resting-State Functional Magnetic Resonance Imaging Data from the Human Connectome Project

    PubMed Central

    Kandala, Sridhar; Nolan, Dan; Laumann, Timothy O.; Power, Jonathan D.; Adeyemo, Babatunde; Harms, Michael P.; Petersen, Steven E.; Barch, Deanna M.

    2016-01-01

    Like all resting-state functional connectivity data, the data from the Human Connectome Project (HCP) are adversely affected by structured noise artifacts arising from head motion and physiological processes. Functional connectivity estimates (Pearson's correlation coefficients) were inflated for high-motion time points and for high-motion participants. This inflation occurred across the brain, suggesting the presence of globally distributed artifacts. The degree of inflation was further increased for connections between nearby regions compared with distant regions, suggesting the presence of distance-dependent spatially specific artifacts. We evaluated several denoising methods: censoring high-motion time points, motion regression, the FMRIB independent component analysis-based X-noiseifier (FIX), and mean grayordinate time series regression (MGTR; as a proxy for global signal regression). The results suggest that FIX denoising reduced both types of artifacts, but left substantial global artifacts behind. MGTR significantly reduced global artifacts, but left substantial spatially specific artifacts behind. Censoring high-motion time points resulted in a small reduction of distance-dependent and global artifacts, eliminating neither type. All denoising strategies left differences between high- and low-motion participants, but only MGTR substantially reduced those differences. Ultimately, functional connectivity estimates from HCP data showed spatially specific and globally distributed artifacts, and the most effective approach to address both types of motion-correlated artifacts was a combination of FIX and MGTR. PMID:27571276

  18. Identification of Drivers of Liking for Bar-Type Snacks Based on Individual Consumer Preference.

    PubMed

    Kim, Mina K; Greve, Patrick; Lee, Youngseung

    2016-01-01

    Understanding consumer hedonic responses to food products is of great interest to the global food industry. Global partial least squares regression (GPLSR) has been a well-accepted method for understanding consumer preferences. Recently, individual partial least squares regression (IPLSR) was accepted as an alternative method for predicting consumer preferences for a given food product, because it utilizes individual differences in product acceptability. To improve the understanding of what constitutes bar-type snack preference, the relationship between sensory attributes and consumer overall liking for 12 bar-type snacks was determined. Sensory attributes that drive consumer product liking were analyzed from averaged consumer data by GPLSR. To facilitate the interpretation of individual consumer liking, a dummy matrix of the significant weighted regression coefficients of each consumer derived from IPLSR was created. Applying GPLSR and IPLSR, the current study revealed that chocolate and cereal-flavored bars were preferred over fruit-flavored bars. Attributes connected to chocolate flavor positively influenced consumer overall liking at both the global and individual consumer levels. Textural attributes affected liking only at the individual level. To fully capture the importance of sensory attributes for consumer preference, the use of GPLSR in conjunction with IPLSR is recommended. © 2015 Institute of Food Technologists®

  19. Global and system-specific resting-state fMRI fluctuations are uncorrelated: principal component analysis reveals anti-correlated networks.

    PubMed

    Carbonell, Felix; Bellec, Pierre; Shmuel, Amir

    2011-01-01

    The influence of the global average signal (GAS) on functional-magnetic resonance imaging (fMRI)-based resting-state functional connectivity is a matter of ongoing debate. The global average fluctuations increase the correlation between functional systems beyond the correlation that reflects their specific functional connectivity. Hence, removal of the GAS is a common practice for facilitating the observation of network-specific functional connectivity. This strategy relies on the implicit assumption of a linear-additive model according to which global fluctuations, irrespective of their origin, and network-specific fluctuations are super-positioned. However, removal of the GAS introduces spurious negative correlations between functional systems, bringing into question the validity of previous findings of negative correlations between fluctuations in the default-mode and the task-positive networks. Here we present an alternative method for estimating global fluctuations, immune to the complications associated with the GAS. Principal components analysis was applied to resting-state fMRI time-series. A global-signal effect estimator was defined as the principal component (PC) that correlated best with the GAS. The mean correlation coefficient between our proposed PC-based global effect estimator and the GAS was 0.97±0.05, demonstrating that our estimator successfully approximated the GAS. In 66 out of 68 runs, the PC that showed the highest correlation with the GAS was the first PC. Since PCs are orthogonal, our method provides an estimator of the global fluctuations, which is uncorrelated to the remaining, network-specific fluctuations. Moreover, unlike the regression of the GAS, the regression of the PC-based global effect estimator does not introduce spurious anti-correlations beyond the decrease in seed-based correlation values allowed by the assumed additive model. After regressing this PC-based estimator out of the original time-series, we observed robust anti-correlations between resting-state fluctuations in the default-mode and the task-positive networks. We conclude that resting-state global fluctuations and network-specific fluctuations are uncorrelated, supporting a Resting-State Linear-Additive Model. In addition, we conclude that the network-specific resting-state fluctuations of the default-mode and task-positive networks show artifact-free anti-correlations.

  20. Gene set analysis using variance component tests.

    PubMed

    Huang, Yen-Tsung; Lin, Xihong

    2013-06-28

    Gene set analyses have become increasingly important in genomic research, as many complex diseases arise from the joint contribution of alterations in numerous genes. Genes often coordinate as a functional repertoire, e.g., a biological pathway/network, and are highly correlated. However, most of the existing gene set analysis methods do not fully account for the correlation among the genes. Here we propose to tackle this important feature of a gene set to improve statistical power in gene set analyses. We propose to model the effects of an independent variable, e.g., exposure/biological status (yes/no), on multiple gene expression values in a gene set using a multivariate linear regression model, where the correlation among the genes is explicitly modeled using a working covariance matrix. We develop TEGS (Test for the Effect of a Gene Set), a variance component test for the gene set effects obtained by assuming a common distribution for the regression coefficients in the multivariate linear regression model, and calculate p-values using permutation and a scaled chi-square approximation. We show using simulations that the type I error is protected under different choices of working covariance matrices and power improves as the working covariance approaches the true covariance. The global test is a special case of TEGS in which the correlation among genes in a gene set is ignored. Using both simulated data and a published diabetes dataset, we show that our test outperforms two commonly used approaches, the global test and gene set enrichment analysis (GSEA). In summary, we develop a gene set analysis method (TEGS) in the multivariate regression framework, which directly models the interdependence of the expression values in a gene set using a working covariance. TEGS outperforms two widely used methods, GSEA and the global test, in both simulations and the diabetes microarray data.
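The permutation step that calibrates such a gene-set statistic can be sketched generically; the statistic below (sum of squared group-mean differences across the genes of a set) is a simple stand-in, not TEGS itself:

```python
import random

def set_stat(expr, labels):
    """Sum over genes of squared exposed-vs-unexposed mean differences."""
    stat = 0.0
    for gene in expr:  # one row of expression values per gene in the set
        g1 = [v for v, l in zip(gene, labels) if l == 1]
        g0 = [v for v, l in zip(gene, labels) if l == 0]
        stat += (sum(g1) / len(g1) - sum(g0) / len(g0)) ** 2
    return stat

random.seed(0)
labels = [1] * 5 + [0] * 5                  # exposure status (toy)
expr = [[1.0] * 5 + [0.0] * 5,              # two genes clearly shifted
        [2.0] * 5 + [0.5] * 5]              # by the exposure

observed = set_stat(expr, labels)
perms = []
for _ in range(999):
    shuffled = labels[:]
    random.shuffle(shuffled)                # break the exposure-expression link
    perms.append(set_stat(expr, shuffled))
pval = (1 + sum(s >= observed for s in perms)) / (1 + len(perms))
```

Because the genes are permuted jointly (the labels are shuffled, not the genes), the null distribution automatically preserves the correlation among genes, which is the feature the abstract emphasizes.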

  1. "Mad or bad?": burden on caregivers of patients with personality disorders.

    PubMed

    Bauer, Rita; Döring, Antje; Schmidt, Tanja; Spießl, Hermann

    2012-12-01

    The burden on caregivers of patients with personality disorders is often greatly underestimated or completely disregarded. Possibilities for caregiver support have rarely been assessed. Thirty interviews were conducted with caregivers of such patients to assess illness-related burden. Responses were analyzed with a mixed method of qualitative and quantitative analysis in a sequential design. Patient and caregiver data, including sociodemographic and disease-related variables, were evaluated with regression analysis and regression trees. Caregiver statements (n = 404) were summarized into 44 global statements. The most frequent global statements were worries about the burden on other family members (70.0%), poor cooperation with clinical centers and other institutions (60.0%), financial burden (56.7%), worry about the patient's future (53.3%), and dissatisfaction with the patient's treatment and rehabilitation (53.3%). Linear regression and regression tree analysis identified predictors for more burdened caregivers. Caregivers of patients with personality disorders experience a variety of burdens, some disorder specific. Yet these caregivers often receive little attention or support.

  2. Reduced rank regression via adaptive nuclear norm penalization

    PubMed Central

    Chen, Kun; Dong, Hongbo; Chan, Kung-Sik

    2014-01-01

    Summary We propose an adaptive nuclear norm penalization approach for low-rank matrix approximation, and use it to develop a new reduced rank estimation method for high-dimensional multivariate regression. The adaptive nuclear norm is defined as the weighted sum of the singular values of the matrix, and it is generally non-convex under the natural restriction that the weight decreases with the singular value. However, we show that the proposed non-convex penalized regression method has a global optimal solution obtained from an adaptively soft-thresholded singular value decomposition. The method is computationally efficient, and the resulting solution path is continuous. Rank consistency of, and prediction/estimation performance bounds for, the estimator are established for a high-dimensional asymptotic regime. Simulation studies and an application in genetics demonstrate its efficacy. PMID:25045172
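
    The key computational step, an adaptively soft-thresholded singular value decomposition whose penalty weight decreases with the singular value, takes only a few lines. The weight function (w ~ sigma^-gamma), the penalty level, and the toy data below are illustrative assumptions, not the paper's exact tuning:

```python
import numpy as np

def adaptive_svt(Y, lam, gamma=2.0):
    """Adaptively soft-threshold the singular values of Y: the weight on each
    singular value decreases as the singular value grows (w ~ sigma^-gamma),
    so large (signal) directions are barely shrunk while small (noise)
    directions are zeroed out."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    w = 1.0 / np.maximum(s, 1e-12) ** gamma
    return U @ np.diag(np.maximum(s - lam * w, 0.0)) @ Vt

rng = np.random.default_rng(1)
A = rng.normal(size=(30, 2)) @ rng.normal(size=(2, 20))   # rank-2 signal
Y = A + 0.05 * rng.normal(size=(30, 20))                  # noisy observation
est = adaptive_svt(Y, lam=2.0)
print(np.linalg.matrix_rank(est, tol=1e-8))
```

    Because the weights shrink as the singular value grows, dominant directions pass nearly untouched while noise-level singular values are set exactly to zero, which is what yields the low-rank solution.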

  3. Estimating top-of-atmosphere thermal infrared radiance using MERRA-2 atmospheric data

    NASA Astrophysics Data System (ADS)

    Kleynhans, Tania; Montanaro, Matthew; Gerace, Aaron; Kanan, Christopher

    2017-05-01

    Thermal infrared satellite images have been widely used in environmental studies. However, satellites have limited temporal resolution, e.g., 16-day Landsat or 1-to-2-day Terra MODIS. This paper investigates the use of the Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2) reanalysis data product, produced by NASA's Global Modeling and Assimilation Office (GMAO), to predict global top-of-atmosphere (TOA) thermal infrared radiance. The high temporal resolution of the MERRA-2 data product presents opportunities for novel research and applications. Various methods were applied to estimate TOA radiance from MERRA-2 variables, namely (1) a parameterized physics-based method, (2) linear regression models, and (3) non-linear support vector regression (SVR). Model prediction accuracy was evaluated using temporally and spatially coincident Moderate Resolution Imaging Spectroradiometer (MODIS) thermal infrared data as reference data. This research found that SVR with a radial basis function kernel produced the lowest error rates. Sources of errors are discussed and defined. Further research is currently being conducted to train deep learning models to predict TOA thermal radiance.
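
    A minimal sketch of the comparison between methods (2) and (3): since the MERRA-2 and MODIS data are not reproduced here, a toy nonlinear predictor-radiance relation stands in for the real data, and RBF kernel ridge regression is used as a stand-in for RBF-kernel SVR:

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy nonlinear relation between one atmospheric predictor and TOA radiance
x = np.linspace(0.0, 1.0, 80)
y = np.sin(2 * np.pi * x) + 0.05 * rng.normal(size=80)
xtr, ytr, xte, yte = x[::2], y[::2], x[1::2], y[1::2]

# (2) a linear regression baseline
A = np.column_stack([xtr, np.ones_like(xtr)])
coef, *_ = np.linalg.lstsq(A, ytr, rcond=None)
lin_pred = coef[0] * xte + coef[1]

# (3) RBF kernel ridge regression, a stand-in for RBF-kernel SVR
def rbf(a, b, gamma=30.0):
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

alpha = np.linalg.solve(rbf(xtr, xtr) + 1e-4 * np.eye(len(xtr)), ytr)
rbf_pred = rbf(xte, xtr) @ alpha

def rmse(pred):
    return float(np.sqrt(np.mean((pred - yte) ** 2)))

print(rmse(lin_pred), rmse(rbf_pred))
```

    On a nonlinear relation like this, the kernel method should beat the linear baseline, mirroring the paper's finding.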

  4. Inferring gene regression networks with model trees

    PubMed Central

    2010-01-01

    Background Novel strategies are required in order to handle the huge amount of data produced by microarray technologies. To infer gene regulatory networks, the first step is to find direct regulatory relationships between genes by building the so-called gene co-expression networks. They are typically generated using correlation statistics as pairwise similarity measures. Correlation-based methods are very useful for determining whether two genes have a strong global similarity, but they do not detect local similarities. Results We propose model trees as a method to identify gene interaction networks. While correlation-based methods analyze each pair of genes, in our approach we generate a single regression tree for each gene from the remaining genes. Finally, a graph of all the relationships among output and input genes is built, taking into account whether each pair of genes is statistically significant. For this reason we apply a statistical procedure to control the false discovery rate. The performance of our approach, named REGNET, is experimentally tested on two well-known data sets: the Saccharomyces cerevisiae and E. coli data sets. First, the biological coherence of the results is tested. Second, the E. coli transcriptional network (in the RegulonDB database) is used as a control to compare the results to those of a correlation-based method. This experiment shows that REGNET performs more accurately at detecting true gene associations than the Pearson and Spearman zeroth- and first-order correlation-based methods. Conclusions REGNET generates gene association networks from gene expression data, and differs from correlation-based methods in that the relationship between one gene and others is calculated simultaneously. Model trees are useful techniques for estimating the numerical values of target genes by linear regression functions. They are often more precise than a single linear regression model because they can fit different linear regressions to separate areas of the search space, favoring the inference of localized similarities over a single global one. Furthermore, experimental results show the good performance of REGNET. PMID:20950452
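
    The advantage of model trees over a single linear regression can be illustrated with a depth-1 model tree on a hypothetical regulator-target gene pair; REGNET itself grows full model trees and filters edges by statistical significance, which this sketch omits:

```python
import numpy as np

rng = np.random.default_rng(3)

def fit_line(x, y):
    """Least-squares line y ~ a*x + b; returns (coefficients, SSE)."""
    A = np.column_stack([x, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return coef, float(resid @ resid)

def depth1_model_tree(x, y, min_leaf=10):
    """Pick the split on x minimizing the summed SSE of one linear
    regression per side (a depth-1 model tree)."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best_split, best_models, best_sse = None, None, np.inf
    for i in range(min_leaf, len(xs) - min_leaf):
        (cl, sl), (cr, sr) = fit_line(xs[:i], ys[:i]), fit_line(xs[i:], ys[i:])
        if sl + sr < best_sse:
            best_split, best_models, best_sse = xs[i], (cl, cr), sl + sr
    return best_split, best_models, best_sse

# A target "gene" that depends piecewise-linearly on a regulator gene
x = rng.uniform(-1, 1, 200)
y = np.abs(x) + 0.05 * rng.normal(size=200)
split, models, sse = depth1_model_tree(x, y)
_, global_sse = fit_line(x, y)
print(round(float(split), 2), sse < 0.5 * global_sse)
```

    On piecewise-linear data the split model recovers the kink and cuts the error dramatically, which is exactly the localized similarity a single global regression misses.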

  6. Global Food Demand Scenarios for the 21st Century.

    PubMed

    Bodirsky, Benjamin Leon; Rolinski, Susanne; Biewald, Anne; Weindl, Isabelle; Popp, Alexander; Lotze-Campen, Hermann

    2015-01-01

    Long-term food demand scenarios are an important tool for studying global food security and for analysing the environmental impacts of agriculture. We provide a simple and transparent method to create scenarios for future plant-based and animal-based calorie demand, using time-dependent regression models between calorie demand and income. The scenarios can be customized to a specific storyline by using different input data for gross domestic product (GDP) and population projections and by assuming different functional forms of the regressions. Our results confirm that total calorie demand increases with income, but we also found a non-income-related positive time-trend. The share of animal-based calories is estimated to rise strongly with income for low-income groups. For high-income groups, two ambiguous relations between income and the share of animal-based products are consistent with historical data: first, a positive relation with a strong negative time-trend, and second, a negative relation with a slight negative time-trend. The fits of our regressions are highly significant and our results compare well to other food demand estimates. The method is used, by way of example, to construct four food demand scenarios until the year 2100 based on the storylines of the IPCC Special Report on Emissions Scenarios (SRES). We find in all scenarios a strong increase of global food demand until 2050 with an increasing share of animal-based products, especially in developing countries.
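
    The regression backbone of such scenarios, calorie demand as a function of log income plus a time trend, can be sketched on synthetic data. All coefficients below are invented for illustration and are not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic panel: per-capita calorie demand rises with log income and
# carries a small non-income time trend (all coefficients are made up)
t = rng.uniform(0, 50, 300)            # years since baseline
log_gdp = rng.uniform(6, 11, 300)      # log per-capita GDP
kcal = 1500 + 120 * log_gdp + 4.0 * t + 30 * rng.normal(size=300)

X = np.column_stack([np.ones_like(t), log_gdp, t])
coef, *_ = np.linalg.lstsq(X, kcal, rcond=None)
print(np.round(coef, 1))   # intercept, income slope, time trend
```

    Fitting separate regressions of this form per income group, or swapping the functional form, is what lets the scenarios be customized to a storyline.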

  7. Load forecast method of electric vehicle charging station using SVR based on GA-PSO

    NASA Astrophysics Data System (ADS)

    Lu, Kuan; Sun, Wenxue; Ma, Changhui; Yang, Shenquan; Zhu, Zijian; Zhao, Pengfei; Zhao, Xin; Xu, Nan

    2017-06-01

    This paper presents a Support Vector Regression (SVR) method for electric vehicle (EV) charging station load forecasting based on a genetic algorithm (GA) and particle swarm optimization (PSO). Fuzzy C-Means (FCM) clustering is used to establish similar-day samples. GA is used for global parameter searching and PSO for a more accurate local search. The load is then forecast by SVR. Practical load data from an EV charging station illustrate the proposed method. The results indicate an obvious improvement in forecasting accuracy compared with SVR tuned by PSO or GA alone.
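
    The swarm-search idea can be illustrated with a bare-bones PSO. In the paper PSO refines a GA's global search and the objective is SVR forecast error; the error surface below is a hypothetical stand-in over two SVR-style hyperparameters:

```python
import numpy as np

rng = np.random.default_rng(5)

def error_surface(p):
    """Hypothetical validation-error surface over two SVR hyperparameters
    (say, log C and log gamma); mildly multimodal."""
    return (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2 + 0.3 * np.sin(5.0 * p[0]) ** 2

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    lo, hi = np.array(bounds, dtype=float).T
    X = rng.uniform(lo, hi, (n_particles, len(lo)))
    V = np.zeros_like(X)
    P = X.copy()                                  # personal bests
    pbest = np.array([f(x) for x in X])
    g = P[np.argmin(pbest)].copy()                # global best
    for _ in range(iters):
        r1 = rng.uniform(size=X.shape)
        r2 = rng.uniform(size=X.shape)
        V = w * V + c1 * r1 * (P - X) + c2 * r2 * (g - X)
        X = np.clip(X + V, lo, hi)
        fx = np.array([f(x) for x in X])
        better = fx < pbest
        P[better], pbest[better] = X[better], fx[better]
        g = P[np.argmin(pbest)].copy()
    return g, float(f(g))

best, val = pso(error_surface, [(-5.0, 5.0), (-5.0, 5.0)])
print(np.round(best, 2), round(val, 3))
```

    In a full GA-PSO pipeline, the returned position would be the hyperparameter vector handed to the final SVR fit.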

  8. Parameterizing sorption isotherms using a hybrid global-local fitting procedure.

    PubMed

    Matott, L Shawn; Singh, Anshuman; Rabideau, Alan J

    2017-05-01

    Predictive modeling of the transport and remediation of groundwater contaminants requires an accurate description of the sorption process, which is usually provided by fitting an isotherm model to site-specific laboratory data. Commonly used calibration procedures, listed in order of increasing sophistication, include: trial-and-error, linearization, non-linear regression, global search, and hybrid global-local search. Given the considerable variability in fitting procedures applied in published isotherm studies, we investigated the importance of algorithm selection through a series of numerical experiments involving 13 previously published sorption datasets. These datasets, considered representative of the state of the art in isotherm experiments, had been previously analyzed using trial-and-error, linearization, or non-linear regression methods. The isotherm expressions were re-fit using a 3-stage hybrid global-local search procedure (i.e. global search using particle swarm optimization, followed by Powell's derivative-free local search method and Gauss-Marquardt-Levenberg non-linear regression). The re-fitted expressions were then compared to previously published fits in terms of the optimized weighted sum of squared residuals (WSSR) fitness function, the final estimated parameters, and the influence on contaminant transport predictions, where easily computed concentration-dependent contaminant retardation factors served as a surrogate measure of likely transport behavior. Results suggest that many of the previously published calibrated isotherm parameter sets were local minima. In some cases, the updated hybrid global-local search yielded order-of-magnitude reductions in the fitness function. In particular, of the candidate isotherms, the Polanyi-type models were most likely to benefit from the use of the hybrid fitting procedure.
In some cases, improvements in fitness function were associated with slight (<10%) changes in parameter values, but in other cases significant (>50%) changes in parameter values were noted. Despite these differences, the influence of isotherm misspecification on contaminant transport predictions was quite variable and difficult to predict from inspection of the isotherms. Copyright © 2017 Elsevier B.V. All rights reserved.
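
    The hybrid global-local idea can be sketched on a synthetic Freundlich isotherm. Random search stands in for particle swarm optimization, and a shrinking-neighborhood search stands in for the Powell and Gauss-Marquardt-Levenberg stages; the isotherm parameters below are invented:

```python
import numpy as np

rng = np.random.default_rng(6)
# Synthetic Freundlich isotherm data, q = Kf * C**n, with Kf = 2.5, n = 0.4
C = np.linspace(0.1, 10.0, 15)
q = 2.5 * C ** 0.4 * (1 + 0.02 * rng.normal(size=15))

def wssr(p):
    """Weighted sum of squared residuals (unit weights here)."""
    Kf, n = p
    if Kf <= 0 or not 0 < n < 1:
        return 1e12
    return float(np.sum((q - Kf * C ** n) ** 2))

# Stage 1: global search over a wide box (a crude stand-in for PSO)
candidates = rng.uniform([0.01, 0.01], [50.0, 0.99], size=(2000, 2))
p = min(candidates, key=wssr)
# Stage 2: derivative-free local polish via a shrinking random neighborhood
# (a stand-in for the Powell and Gauss-Marquardt-Levenberg stages)
step = np.array([2.0, 0.1])
for _ in range(60):
    trials = p + step * rng.normal(size=(40, 2))
    p = min(list(trials) + [p], key=wssr)
    step *= 0.9
print(np.round(p, 2))
```

    Starting the local stage from the global stage's incumbent is what protects against the local minima that trap purely local fits.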

  9. Global and System-Specific Resting-State fMRI Fluctuations Are Uncorrelated: Principal Component Analysis Reveals Anti-Correlated Networks

    PubMed Central

    Carbonell, Felix; Bellec, Pierre

    2011-01-01

    Abstract The influence of the global average signal (GAS) on functional-magnetic resonance imaging (fMRI)–based resting-state functional connectivity is a matter of ongoing debate. The global average fluctuations increase the correlation between functional systems beyond the correlation that reflects their specific functional connectivity. Hence, removal of the GAS is a common practice for facilitating the observation of network-specific functional connectivity. This strategy relies on the implicit assumption of a linear-additive model according to which global fluctuations, irrespective of their origin, and network-specific fluctuations are super-positioned. However, removal of the GAS introduces spurious negative correlations between functional systems, bringing into question the validity of previous findings of negative correlations between fluctuations in the default-mode and the task-positive networks. Here we present an alternative method for estimating global fluctuations, immune to the complications associated with the GAS. Principal components analysis was applied to resting-state fMRI time-series. A global-signal effect estimator was defined as the principal component (PC) that correlated best with the GAS. The mean correlation coefficient between our proposed PC-based global effect estimator and the GAS was 0.97±0.05, demonstrating that our estimator successfully approximated the GAS. In 66 out of 68 runs, the PC that showed the highest correlation with the GAS was the first PC. Since PCs are orthogonal, our method provides an estimator of the global fluctuations, which is uncorrelated to the remaining, network-specific fluctuations. Moreover, unlike the regression of the GAS, the regression of the PC-based global effect estimator does not introduce spurious anti-correlations beyond the decrease in seed-based correlation values allowed by the assumed additive model. 
After regressing this PC-based estimator out of the original time-series, we observed robust anti-correlations between resting-state fluctuations in the default-mode and the task-positive networks. We conclude that resting-state global fluctuations and network-specific fluctuations are uncorrelated, supporting a Resting-State Linear-Additive Model. In addition, we conclude that the network-specific resting-state fluctuations of the default-mode and task-positive networks show artifact-free anti-correlations. PMID:22444074
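
    The PC-based estimator is straightforward to reproduce on synthetic data that obey the assumed linear-additive model: take the principal component that correlates best with the GAS, regress it out, and check the network anti-correlation. Dimensions and amplitudes below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(7)
# Toy resting-state data: one global fluctuation shared by all voxels plus
# two anti-correlated network signals and voxel noise
T, V = 200, 60
g = rng.normal(size=T)                       # global fluctuation
net = rng.normal(size=T)                     # network-specific fluctuation
X = np.empty((T, V))
X[:, :30] = g[:, None] + 0.7 * net[:, None] + 0.3 * rng.normal(size=(T, 30))
X[:, 30:] = g[:, None] - 0.7 * net[:, None] + 0.3 * rng.normal(size=(T, 30))

gas = X.mean(axis=1)                          # global average signal
Xc = X - X.mean(axis=0)
U, s, _ = np.linalg.svd(Xc, full_matrices=False)
pcs = U * s                                   # principal component time courses
corrs = [abs(np.corrcoef(pc, gas)[0, 1]) for pc in pcs.T]
k = int(np.argmax(corrs))                     # PC that best tracks the GAS
print(k, round(corrs[k], 2))

# Regress the PC-based global estimator out of every voxel time course
beta = pcs[:, k] @ Xc / (pcs[:, k] @ pcs[:, k])
clean = Xc - np.outer(pcs[:, k], beta)
r = np.corrcoef(clean[:, :30].mean(axis=1), clean[:, 30:].mean(axis=1))[0, 1]
print(round(r, 2))                            # networks come out anti-correlated
```

    Because principal components are mutually orthogonal, removing the GAS-like component cannot introduce correlation with the remaining network-specific fluctuations, which is the paper's central argument.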

  10. Temperature Increase Reduces Global Yields of Major Crops in Four Independent Estimates

    NASA Technical Reports Server (NTRS)

    Zhao, Chuang; Liu, Bing; Piao, Shilong; Wang, Xuhui; Lobell, David B.; Huang, Yao; Huang, Mengtian; Yao, Yitong; Bassu, Simona; Ciais, Philippe

    2017-01-01

    Wheat, rice, maize, and soybean provide two-thirds of human caloric intake. Assessing the impact of global temperature increase on production of these crops is therefore critical to maintaining global food supply, but different studies have yielded different results. Here, we investigated the impacts of temperature on yields of the four crops by compiling extensive published results from four analytical methods: global grid-based and local point-based models, statistical regressions, and field-warming experiments. Results from the different methods consistently showed negative temperature impacts on crop yield at the global scale, generally underpinned by similar impacts at country and site scales. Without CO2 fertilization, effective adaptation, and genetic improvement, each degree-Celsius increase in global mean temperature would, on average, reduce global yields of wheat by 6.0%, rice by 3.2%, maize by 7.4%, and soybean by 3.1%. Results are highly heterogeneous across crops and geographical areas, with some positive impact estimates. Multi-method analyses improved the confidence in assessments of future climate impacts on global major crops and suggest crop- and region-specific adaptation strategies to ensure food security for an increasing world population.

  11. A frequency domain global parameter estimation method for multiple reference frequency response measurements

    NASA Astrophysics Data System (ADS)

    Shih, C. Y.; Tsuei, Y. G.; Allemang, R. J.; Brown, D. L.

    1988-10-01

    A method of using the matrix Auto-Regressive Moving Average (ARMA) model in the Laplace domain for multiple-reference global parameter identification is presented. This method is particularly applicable to the area of modal analysis where high modal density exists. The method is also applicable when multiple reference frequency response functions are used to characterise linear systems. In order to facilitate the mathematical solution, the Forsythe orthogonal polynomial is used to reduce the ill-conditioning of the formulated equations and to decouple the normal matrix into two reduced matrix blocks. A Complex Mode Indicator Function (CMIF) is introduced, which can be used to determine the proper order of the rational polynomials.

  12. A New Global Regression Analysis Method for the Prediction of Wind Tunnel Model Weight Corrections

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert Manfred; Bridge, Thomas M.; Amaya, Max A.

    2014-01-01

    A new global regression analysis method is discussed that predicts wind tunnel model weight corrections for strain-gage balance loads during a wind tunnel test. The method determines corrections by combining "wind-on" model attitude measurements with least squares estimates of the model weight and center of gravity coordinates that are obtained from "wind-off" data points. The method treats the least squares fit of the model weight separately from the fit of the center of gravity coordinates. Therefore, it performs two fits of "wind-off" data points and uses the least squares estimator of the model weight as an input for the fit of the center of gravity coordinates. Explicit equations for the least squares estimators of the weight and center of gravity coordinates are derived that simplify the implementation of the method in the data system software of a wind tunnel. In addition, recommendations for sets of "wind-off" data points are made that take typical model support system constraints into account. Explicit equations of the confidence intervals on the model weight and center of gravity coordinates and two different error analyses of the model weight prediction are also discussed in the appendices of the paper.
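
    A minimal sketch of the two-stage fit, using an assumed 2-D gravity model (pitch attitude only) rather than the paper's full balance equations: the model weight is estimated first from the force equations, then held fixed as an input while the center of gravity coordinates are fit to the moment equation:

```python
import numpy as np

rng = np.random.default_rng(8)
# Simplified "wind-off" points: pitch attitudes theta with a true model
# weight W and center of gravity offsets (x_cg, z_cg); the units and the
# 2-D gravity model are illustrative assumptions, not the paper's equations
theta = np.deg2rad(np.linspace(-10.0, 10.0, 9))
W_true, xcg_true, zcg_true = 50.0, 0.12, 0.03
NF = W_true * np.cos(theta) + 0.02 * rng.normal(size=9)   # normal force
AF = W_true * np.sin(theta) + 0.02 * rng.normal(size=9)   # axial force
PM = W_true * (xcg_true * np.cos(theta) + zcg_true * np.sin(theta)) \
     + 0.01 * rng.normal(size=9)                          # pitching moment

# Fit 1: model weight alone, from the stacked force equations
H = np.concatenate([np.cos(theta), np.sin(theta)])
W_hat = float(H @ np.concatenate([NF, AF]) / (H @ H))

# Fit 2: center of gravity coordinates, with W_hat held fixed as an input
G = W_hat * np.column_stack([np.cos(theta), np.sin(theta)])
cg, *_ = np.linalg.lstsq(G, PM, rcond=None)
print(round(W_hat, 2), np.round(cg, 3))
```

    Decoupling the two fits keeps the moment regression linear in the center of gravity coordinates once the weight estimate is available.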

  13. Regional vertical total electron content (VTEC) modeling together with satellite and receiver differential code biases (DCBs) using semi-parametric multivariate adaptive regression B-splines (SP-BMARS)

    NASA Astrophysics Data System (ADS)

    Durmaz, Murat; Karslioglu, Mahmut Onur

    2015-04-01

    There are various global and regional methods that have been proposed for the modeling of ionospheric vertical total electron content (VTEC). Global distribution of VTEC is usually modeled by spherical harmonic expansions, while tensor products of compactly supported univariate B-splines can be used for regional modeling. In these empirical parametric models, the coefficients of the basis functions as well as differential code biases (DCBs) of satellites and receivers can be treated as unknown parameters which can be estimated from geometry-free linear combinations of global positioning system observables. In this work we propose a new semi-parametric multivariate adaptive regression B-splines (SP-BMARS) method for the regional modeling of VTEC together with satellite and receiver DCBs, where the parametric part of the model is related to the DCBs as fixed parameters and the non-parametric part adaptively models the spatio-temporal distribution of VTEC. The latter is based on multivariate adaptive regression B-splines which is a non-parametric modeling technique making use of compactly supported B-spline basis functions that are generated from the observations automatically. This algorithm takes advantage of an adaptive scale-by-scale model building strategy that searches for best-fitting B-splines to the data at each scale. The VTEC maps generated from the proposed method are compared numerically and visually with the global ionosphere maps (GIMs) which are provided by the Center for Orbit Determination in Europe (CODE). The VTEC values from SP-BMARS and CODE GIMs are also compared with VTEC values obtained through calibration using local ionospheric model. The estimated satellite and receiver DCBs from the SP-BMARS model are compared with the CODE distributed DCBs. The results show that the SP-BMARS algorithm can be used to estimate satellite and receiver DCBs while adaptively and flexibly modeling the daily regional VTEC.

  14. A spatially explicit approach to the study of socio-demographic inequality in the spatial distribution of trees across Boston neighborhoods.

    PubMed

    Duncan, Dustin T; Kawachi, Ichiro; Kum, Susan; Aldstadt, Jared; Piras, Gianfranco; Matthews, Stephen A; Arbia, Giuseppe; Castro, Marcia C; White, Kellee; Williams, David R

    2014-04-01

    The racial/ethnic and income composition of neighborhoods often influences local amenities, including the potential spatial distribution of trees, which are important for population health and community wellbeing, particularly in urban areas. This ecological study used spatial analytical methods to assess the relationship between neighborhood socio-demographic characteristics (i.e. minority racial/ethnic composition and poverty) and tree density at the census tract level in Boston, Massachusetts (US). We examined spatial autocorrelation with the Global Moran's I for all study variables and in the ordinary least squares (OLS) regression residuals, and computed Spearman correlations between socio-demographic characteristics and tree density, both non-adjusted and adjusted for spatial autocorrelation. Next, we fit traditional regressions (i.e. OLS regression models) and spatial regressions (i.e. spatial simultaneous autoregressive models), as appropriate. We found significant positive spatial autocorrelation for all neighborhood socio-demographic characteristics (Global Moran's I ranging from 0.24 to 0.86, all P = 0.001), for tree density (Global Moran's I = 0.452, P = 0.001), and in the OLS regression residuals (Global Moran's I ranging from 0.32 to 0.38, all P < 0.001). Therefore, we fit the spatial simultaneous autoregressive models. There was a negative correlation between neighborhood percent non-Hispanic Black and tree density (r_S = -0.19; conventional P-value = 0.016; spatially adjusted P-value = 0.299) as well as a negative correlation between predominantly non-Hispanic Black (over 60% Black) neighborhoods and tree density (r_S = -0.18; conventional P-value = 0.019; spatially adjusted P-value = 0.180).
While the conventional OLS regression model found a marginally significant inverse relationship between Black neighborhoods and tree density, we found no statistically significant relationship between neighborhood socio-demographic composition and tree density in the spatial regression models. Methodologically, our study suggests the need to take into account spatial autocorrelation as findings/conclusions can change when the spatial autocorrelation is ignored. Substantively, our findings suggest no need for policy intervention vis-à-vis trees in Boston, though we hasten to add that replication studies, and more nuanced data on tree quality, age and diversity are needed.
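
    Global Moran's I, the clustering statistic used throughout the study, is easy to compute directly. The grid, rook adjacency, and test patterns below are illustrative, not the Boston data:

```python
import numpy as np

def morans_i(x, W):
    """Global Moran's I with spatial weight matrix W (row-standardized here)."""
    z = x - x.mean()
    Wr = W / W.sum(axis=1, keepdims=True)
    return float(len(x) / Wr.sum() * (z @ Wr @ z) / (z @ z))

# 10x10 grid with rook (edge-sharing) adjacency
n = 10
W = np.zeros((n * n, n * n))
for i in range(n):
    for j in range(n):
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if 0 <= i + di < n and 0 <= j + dj < n:
                W[i * n + j, (i + di) * n + (j + dj)] = 1.0

rng = np.random.default_rng(9)
gradient = np.repeat(np.arange(n, dtype=float), n)  # smooth north-south trend
noise = rng.normal(size=n * n)
print(round(morans_i(gradient, W), 2), round(morans_i(noise, W), 2))
```

    A smooth gradient scores near +1 (strong positive autocorrelation, like the tree density above), while spatially random values score near the null expectation of -1/(N-1), close to zero.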

  15. Mixed kernel function support vector regression for global sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Cheng, Kai; Lu, Zhenzhou; Wei, Yuhao; Shi, Yan; Zhou, Yicheng

    2017-11-01

    Global sensitivity analysis (GSA) plays an important role in exploring the respective effects of input variables on an assigned output response. Among the many sensitivity analysis methods in the literature, the Sobol indices have attracted much attention since they can provide accurate information for most models. In this paper, a mixed kernel function (MKF) based support vector regression (SVR) model is employed to evaluate the Sobol indices at low computational cost. Through the proposed derivation, the Sobol indices can be estimated by post-processing the coefficients of the SVR meta-model. The MKF is composed of an orthogonal polynomial kernel function and a Gaussian radial basis kernel function; thus the MKF possesses both the global characteristic advantage of the polynomial kernel and the local characteristic advantage of the Gaussian radial basis kernel. The proposed approach is suitable for high-dimensional and non-linear problems. Its performance is validated on various analytical functions and compared with the popular polynomial chaos expansion (PCE). Results demonstrate that the proposed approach is an efficient method for global sensitivity analysis.
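
    The quantity the SVR meta-model targets, the first-order Sobol indices, can also be estimated by brute-force Monte Carlo. The pick-freeze (Jansen) estimator below on the standard Ishigami benchmark gives a reference point; it is not the paper's coefficient post-processing, which avoids exactly this sampling cost:

```python
import numpy as np

rng = np.random.default_rng(10)

def ishigami(X):
    """Ishigami function, a standard GSA benchmark (a = 7, b = 0.1)."""
    x1, x2, x3 = X.T
    return np.sin(x1) + 7.0 * np.sin(x2) ** 2 + 0.1 * x3 ** 4 * np.sin(x1)

N = 200_000
A = rng.uniform(-np.pi, np.pi, (N, 3))
B = rng.uniform(-np.pi, np.pi, (N, 3))
yB, var = ishigami(B), ishigami(A).var()

S = []
for i in range(3):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                 # "freeze" input i at B's values
    # Jansen estimator of the first-order index S_i
    S.append(1.0 - 0.5 * np.mean((yB - ishigami(ABi)) ** 2) / var)
print(np.round(S, 2))   # analytic values are about 0.31, 0.44, 0.00
```

    A meta-model approach needs only a few hundred model runs to fit the surrogate, instead of the hundreds of thousands consumed here.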

  16. Monitoring heavy metal Cr in soil based on hyperspectral data using regression analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Ningyu; Xu, Fuyun; Zhuang, Shidong; He, Changwei

    2016-10-01

    Heavy metal pollution in soils is one of the most critical problems in global ecology and environmental safety today. Hyperspectral remote sensing offers high speed, low cost, low risk and little damage, and provides a good method for detecting heavy metals in soil. This paper proposes applying stepwise multiple regression between the spectral data and the measured content of heavy metal Cr at sample points in soil for environmental protection. In the measurement, a FieldSpec HandHeld spectroradiometer is used to collect reflectance spectra of sample points over the wavelength range of 325-1075 nm. The spectral data measured by the spectroradiometer are then preprocessed to reduce the influence of external factors; the preprocessing methods include the first-order derivative, the second-order derivative and the continuum removal method. Stepwise multiple regression equations are established for each preprocessing method, and the accuracy of each equation is tested. The results showed that the first-order derivative preprocessing performs best, which makes it feasible to predict the content of heavy metal Cr by using stepwise multiple regression.
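
    Stepwise multiple regression reduces, in its forward form, to greedily adding the band that most lowers the residual sum of squares. The synthetic "spectra" and informative band indices below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(11)
# Synthetic preprocessed spectra: 50 soil samples x 30 bands, where the Cr
# content depends on two of the bands (the band indices are invented)
bands = rng.normal(size=(50, 30))
cr = 3.0 * bands[:, 4] - 2.0 * bands[:, 17] + 0.2 * rng.normal(size=50)

def forward_stepwise(X, y, max_terms=3):
    """Greedy forward selection: repeatedly add the band that most reduces
    the residual sum of squares. (A full implementation would stop via an
    F-test instead of a fixed term count.)"""
    selected = []
    for _ in range(max_terms):
        best_j, best_sse = None, np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            A = np.column_stack([X[:, selected + [j]], np.ones(len(y))])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            sse = float(np.sum((y - A @ coef) ** 2))
            if sse < best_sse:
                best_j, best_sse = j, sse
        selected.append(best_j)
    return selected

sel = forward_stepwise(bands, cr)
print(sel[:2])
```

    The truly informative bands are picked first; on real spectra the same loop would run on the derivative- or continuum-removal-preprocessed reflectances.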

  17. What We Have Learned from the Recent Meta-analyses on Diagnostic Methods for Atherosclerotic Plaque Regression.

    PubMed

    Biondi-Zoccai, Giuseppe; Mastrangeli, Simona; Romagnoli, Enrico; Peruzzi, Mariangela; Frati, Giacomo; Roever, Leonardo; Giordano, Arturo

    2018-01-17

    Atherosclerosis has major morbidity and mortality implications globally. While it has often been considered an irreversible degenerative process, recent evidence provides compelling proof that atherosclerosis can be reversed. Plaque regression is however difficult to appraise and quantify, with competing diagnostic methods available. Given the potential of evidence synthesis to provide clinical guidance, we aimed to review recent meta-analyses on diagnostic methods for atherosclerotic plaque regression. We identified 8 meta-analyses published between 2015 and 2017, including 79 studies and 14,442 patients, followed for a median of 12 months. They reported on atherosclerotic plaque regression appraised with carotid duplex ultrasound, coronary computed tomography, carotid magnetic resonance, coronary intravascular ultrasound, and coronary optical coherence tomography. Overall, all meta-analyses showed significant atherosclerotic plaque regression with lipid-lowering therapy, with the most notable effects on echogenicity, lipid-rich necrotic core volume, wall/plaque volume, dense calcium volume, and fibrous cap thickness. Significant interactions were found with concomitant changes in low density lipoprotein cholesterol, high density lipoprotein cholesterol, and C-reactive protein levels, and with ethnicity. Atherosclerotic plaque regression and conversion to a stable phenotype is possible with intensive medical therapy and can be demonstrated in patients using a variety of non-invasive and invasive imaging modalities.

  18. Probabilistic Estimates of Global Mean Sea Level and its Underlying Processes

    NASA Astrophysics Data System (ADS)

    Hay, C.; Morrow, E.; Kopp, R. E.; Mitrovica, J. X.

    2015-12-01

    Local sea level can vary significantly from the global mean value due to a suite of processes that includes ongoing sea-level changes due to the last ice age, land water storage, ocean circulation changes, and non-uniform sea-level changes that arise when modern-day land ice rapidly melts. Understanding these sources of spatial and temporal variability is critical to estimating past and present sea-level change and projecting future sea-level rise. Using two probabilistic techniques, a multi-model Kalman smoother and Gaussian process regression, we have reanalyzed 20th century tide gauge observations to produce a new estimate of global mean sea level (GMSL). Our methods allow us to extract global information from the sparse tide gauge field by taking advantage of the physics-based and model-derived geometry of the contributing processes. Both methods provide constraints on the sea-level contribution of glacial isostatic adjustment (GIA). The Kalman smoother tests multiple discrete GIA models, probabilistically computing the most likely model given the observations, while the Gaussian process regression characterizes the prior covariance structure of a suite of GIA models and then uses this structure to estimate the posterior distribution of local rates of GIA-induced sea-level change. We present the two methodologies, the model-derived geometries of the underlying processes, and our new probabilistic estimates of GMSL and GIA.
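
    The Gaussian process regression step can be sketched on synthetic observations: given a prior covariance over time and a noise variance, the posterior mean at unobserved times is a single linear solve. The kernel, its hyperparameters, and the "tide gauge" data below are assumptions for illustration, not the study's model-derived covariances:

```python
import numpy as np

rng = np.random.default_rng(12)
# Sparse, noisy tide-gauge-style observations of a smooth sea-level signal
t_obs = np.sort(rng.uniform(1900.0, 2000.0, 25))
truth = lambda t: 0.15 * (t - 1900.0) + 20.0 * np.sin((t - 1900.0) / 30.0)
y = truth(t_obs) + 5.0 * rng.normal(size=25)

def rbf_cov(a, b, amp=30.0, ell=20.0):
    """Squared-exponential prior covariance (amplitude and length scale
    are assumed here; the study derives the covariance from GIA models)."""
    return amp ** 2 * np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

t_grid = np.linspace(1900.0, 2000.0, 101)
K = rbf_cov(t_obs, t_obs) + 5.0 ** 2 * np.eye(len(t_obs))  # prior + noise
post_mean = rbf_cov(t_grid, t_obs) @ np.linalg.solve(K, y)
err = float(np.sqrt(np.mean((post_mean - truth(t_grid)) ** 2)))
print(round(err, 1))
```

    Swapping this generic kernel for a covariance built from a suite of GIA model predictions is what lets the method spread information from sparse gauges across the whole field.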

  19. A Modeling Approach to Global Land Surface Monitoring with Low Resolution Satellite Imaging

    NASA Technical Reports Server (NTRS)

    Hlavka, Christine A.; Dungan, Jennifer; Livingston, Gerry P.; Gore, Warren J. (Technical Monitor)

    1998-01-01

    The effects of changing land use/land cover on global climate and ecosystems due to greenhouse gas emissions and changing energy and nutrient exchange rates are being addressed by federal programs such as NASA's Mission to Planet Earth (MTPE) and by international efforts such as the International Geosphere-Biosphere Program (IGBP). The quantification of these effects depends on accurate estimates of the global extent of critical land cover types such as fire scars in tropical savannas and ponds in Arctic tundra. To address the requirement for accurate areal estimates, methods for producing regional to global maps with satellite imagery are being developed. The only practical way to produce maps over large regions of the globe is with data of coarse spatial resolution, such as Advanced Very High Resolution Radiometer (AVHRR) weather satellite imagery at 1.1 km resolution or European Remote-Sensing Satellite (ERS) radar imagery at 100 m resolution. The accuracy of pixel counts as areal estimates is in doubt, especially for highly fragmented cover types such as fire scars and ponds. Efforts to improve areal estimates from coarse resolution maps have involved regression of apparent area from coarse data versus that from fine resolution in sample areas, but it has proven difficult to acquire sufficient fine scale data to develop the regression. A method for computing accurate estimates from coarse resolution maps using little or no fine data is therefore needed.

  20. Can We Use Regression Modeling to Quantify Mean Annual Streamflow at a Global-Scale?

    NASA Astrophysics Data System (ADS)

    Barbarossa, V.; Huijbregts, M. A. J.; Hendriks, J. A.; Beusen, A.; Clavreul, J.; King, H.; Schipper, A.

    2016-12-01

    Quantifying mean annual flow of rivers (MAF) at ungauged sites is essential for a number of applications, including assessments of global water supply, ecosystem integrity and water footprints. MAF can be quantified with spatially explicit process-based models, which might be overly time-consuming and data-intensive for this purpose, or with empirical regression models that predict MAF based on climate and catchment characteristics. Yet, regression models have mostly been developed at a regional scale and the extent to which they can be extrapolated to other regions is not known. In this study, we developed a global-scale regression model for MAF using observations of discharge and catchment characteristics from 1,885 catchments worldwide, ranging from 2 to 10⁶ km² in size. In addition, we compared the performance of the regression model with the predictive ability of the spatially explicit global hydrological model PCR-GLOBWB [van Beek et al., 2011] by comparing results from both models to independent measurements. We obtained a regression model explaining 89% of the variance in MAF based on catchment area, mean annual precipitation and air temperature, average slope and elevation. The regression model performed better than PCR-GLOBWB for the prediction of MAF, as root-mean-square error values were lower (0.29 - 0.38 compared to 0.49 - 0.57) and the modified index of agreement was higher (0.80 - 0.83 compared to 0.72 - 0.75). Our regression model can be applied globally at any point of the river network, provided that the input parameters are within the range of values employed in the calibration of the model. Performance is reduced in water-scarce regions, and further research should focus on improving regression-based global hydrological models in this respect.

  1. Developing and testing a global-scale regression model to quantify mean annual streamflow

    NASA Astrophysics Data System (ADS)

    Barbarossa, Valerio; Huijbregts, Mark A. J.; Hendriks, A. Jan; Beusen, Arthur H. W.; Clavreul, Julie; King, Henry; Schipper, Aafke M.

    2017-01-01

    Quantifying mean annual flow of rivers (MAF) at ungauged sites is essential for assessments of global water supply, ecosystem integrity and water footprints. MAF can be quantified with spatially explicit process-based models, which might be overly time-consuming and data-intensive for this purpose, or with empirical regression models that predict MAF based on climate and catchment characteristics. Yet, regression models have mostly been developed at a regional scale and the extent to which they can be extrapolated to other regions is not known. In this study, we developed a global-scale regression model for MAF based on a dataset unprecedented in size, using observations of discharge and catchment characteristics from 1885 catchments worldwide, measuring between 2 and 10⁶ km². In addition, we compared the performance of the regression model with the predictive ability of the spatially explicit global hydrological model PCR-GLOBWB by comparing results from both models to independent measurements. We obtained a regression model explaining 89% of the variance in MAF based on catchment area and catchment averaged mean annual precipitation and air temperature, slope and elevation. The regression model performed better than PCR-GLOBWB for the prediction of MAF, as root-mean-square error (RMSE) values were lower (0.29-0.38 compared to 0.49-0.57) and the modified index of agreement (d) was higher (0.80-0.83 compared to 0.72-0.75). Our regression model can be applied globally to estimate MAF at any point of the river network, thus providing a feasible alternative to spatially explicit process-based global hydrological models.
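The regression described above predicts MAF from catchment area, precipitation, temperature, slope and elevation. A common form for such models is log-linear; the sketch below fits one by ordinary least squares on synthetic data. The coefficients, predictor ranges, and functional form are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Synthetic catchment predictors (illustrative ranges, not the paper's data).
area = 10 ** rng.uniform(0.5, 6, n)   # km^2, spanning roughly 2 to 10^6
precip = rng.uniform(200, 3000, n)    # mm/yr
temp = rng.uniform(-5, 30, n)         # deg C
slope = rng.uniform(0.1, 30, n)       # degrees
elev = rng.uniform(10, 4000, n)       # m

# Generate log-MAF from an assumed log-linear relation plus noise.
log_maf = (1.0 * np.log10(area) + 1.2 * np.log10(precip)
           - 0.02 * temp + 0.1 * np.log10(slope)
           - 0.1 * np.log10(elev) + rng.normal(0, 0.1, n))

# Design matrix with intercept; ordinary least squares via lstsq.
X = np.column_stack([np.ones(n), np.log10(area), np.log10(precip),
                     temp, np.log10(slope), np.log10(elev)])
beta, *_ = np.linalg.lstsq(X, log_maf, rcond=None)

pred = X @ beta
r2 = 1 - np.sum((log_maf - pred) ** 2) / np.sum((log_maf - np.mean(log_maf)) ** 2)
```

The paper's caveat applies equally here: predictions are only trustworthy while the inputs stay inside the calibration ranges.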

  2. A global goodness-of-fit statistic for Cox regression models.

    PubMed

    Parzen, M; Lipsitz, S R

    1999-06-01

    In this paper, a global goodness-of-fit test statistic for a Cox regression model, which has an approximate chi-squared distribution when the model has been correctly specified, is proposed. Our goodness-of-fit statistic is global and has power to detect if interactions or higher order powers of covariates in the model are needed. The proposed statistic is similar to the Hosmer and Lemeshow (1980, Communications in Statistics A10, 1043-1069) goodness-of-fit statistic for binary data as well as Schoenfeld's (1980, Biometrika 67, 145-153) statistic for the Cox model. The methods are illustrated using data from a Mayo Clinic trial in primary biliary cirrhosis of the liver (Fleming and Harrington, 1991, Counting Processes and Survival Analysis), in which the outcome is the time until liver transplantation or death. There are 17 possible covariates. Two Cox proportional hazards models are fit to the data, and the proposed goodness-of-fit statistic is applied to the fitted models.
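The abstract notes that the proposed statistic parallels the Hosmer-Lemeshow statistic for binary data. A minimal sketch of that binary-data analogue is shown below (group subjects by predicted risk, then compare observed and expected event counts per group); the grouping scheme and variance form here are one common variant, not the paper's Cox-model statistic.

```python
import numpy as np

rng = np.random.default_rng(1)

def hosmer_lemeshow(y, p_hat, n_groups=10):
    """Hosmer-Lemeshow-style chi-squared statistic for binary outcomes:
    sort subjects by predicted risk, split into groups, and compare
    observed event counts with those expected under the model."""
    order = np.argsort(p_hat)
    groups = np.array_split(order, n_groups)
    stat = 0.0
    for g in groups:
        obs = y[g].sum()
        exp = p_hat[g].sum()
        var = (p_hat[g] * (1 - p_hat[g])).sum()
        if var > 0:
            stat += (obs - exp) ** 2 / var
    return stat

# Well-calibrated predictions: outcomes drawn from the stated probabilities,
# so the statistic should stay modest for a correctly specified model.
p_hat = rng.uniform(0.05, 0.95, 2000)
y = (rng.uniform(size=2000) < p_hat).astype(float)
stat = hosmer_lemeshow(y, p_hat)
```

A large value of the statistic relative to its approximate chi-squared reference distribution signals lack of fit, e.g. missing interactions or higher-order terms.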

  3. A Study of Global Health Elective Outcomes

    PubMed Central

    Russ, Christiana M.; Tran, Tony; Silverman, Melanie; Palfrey, Judith

    2017-01-01

    Background and Objectives: To identify the effects of global health electives over a decade in a pediatric residency program. Methods: This was an anonymous email survey of the Boston Combined Residency alumni funded for global health electives from 2002 to 2011. A test for trend in binomial proportions and logistic regression were used to document associations between elective and participant characteristics and the effects of the electives. Qualitative data were also analyzed. Results: Of the 104 alumni with available email addresses, 69 (66%) responded, describing 94 electives. Elective products included 27 curricula developed, 11 conference presentations, and 7 academic publications. Thirty-two (46%) alumni continued global health work. Previous experience, previous travel to the site, number of global electives, and cumulative global elective time were associated with postresidency work in global health or with the underserved. Conclusions: Resident global electives resulted in significant scholarship and teaching and contributed to long-term career trajectories. PMID:28229096

  4. A variable structure fuzzy neural network model of squamous dysplasia and esophageal squamous cell carcinoma based on a global chaotic optimization algorithm.

    PubMed

    Moghtadaei, Motahareh; Hashemi Golpayegani, Mohammad Reza; Malekzadeh, Reza

    2013-02-07

    Identification of squamous dysplasia and esophageal squamous cell carcinoma (ESCC) is of great importance in prevention of cancer incidence. Computer aided algorithms can be very useful for identification of people with higher risks of squamous dysplasia and ESCC. Such methods can limit clinical screening to people at higher risk. Different regression methods have been used to predict ESCC and dysplasia. In this paper, a Fuzzy Neural Network (FNN) model is selected for ESCC and dysplasia prediction. The inputs to the classifier are the risk factors. Because the relations among risk factors in the tumor system are complex and nonlinear compared with most ordinary data, the cost function of the model can have many local optima, which heightens the need for global optimization methods. The method proposed in this paper is a Chaotic Optimization Algorithm (COA) followed by the common Error Back Propagation (EBP) local method. Since the model has many parameters, we use a strategy to reduce the dependency among parameters caused by the chaotic series generator. This dependency was not considered in previous COA methods. The algorithm is compared with the logistic regression model, one of the most successful existing methods of ESCC and dysplasia prediction. The results represent a more precise prediction with lower mean and variance of error. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. Temperature increase reduces global yields of major crops in four independent estimates

    PubMed Central

    Zhao, Chuang; Piao, Shilong; Wang, Xuhui; Lobell, David B.; Huang, Yao; Huang, Mengtian; Yao, Yitong; Bassu, Simona; Ciais, Philippe; Durand, Jean-Louis; Elliott, Joshua; Ewert, Frank; Janssens, Ivan A.; Li, Tao; Lin, Erda; Liu, Qiang; Martre, Pierre; Peng, Shushi; Wallach, Daniel; Wang, Tao; Wu, Donghai; Liu, Zhuo; Zhu, Yan; Zhu, Zaichun; Asseng, Senthold

    2017-01-01

    Wheat, rice, maize, and soybean provide two-thirds of human caloric intake. Assessing the impact of global temperature increase on production of these crops is therefore critical to maintaining global food supply, but different studies have yielded different results. Here, we investigated the impacts of temperature on yields of the four crops by compiling extensive published results from four analytical methods: global grid-based and local point-based models, statistical regressions, and field-warming experiments. Results from the different methods consistently showed negative temperature impacts on crop yield at the global scale, generally underpinned by similar impacts at country and site scales. Without CO2 fertilization, effective adaptation, and genetic improvement, each degree-Celsius increase in global mean temperature would, on average, reduce global yields of wheat by 6.0%, rice by 3.2%, maize by 7.4%, and soybean by 3.1%. Results are highly heterogeneous across crops and geographical areas, with some positive impact estimates. Multimethod analyses improved the confidence in assessments of future climate impacts on global major crops and suggest crop- and region-specific adaptation strategies to ensure food security for an increasing world population. PMID:28811375

  6. Temperature increase reduces global yields of major crops in four independent estimates.

    PubMed

    Zhao, Chuang; Liu, Bing; Piao, Shilong; Wang, Xuhui; Lobell, David B; Huang, Yao; Huang, Mengtian; Yao, Yitong; Bassu, Simona; Ciais, Philippe; Durand, Jean-Louis; Elliott, Joshua; Ewert, Frank; Janssens, Ivan A; Li, Tao; Lin, Erda; Liu, Qiang; Martre, Pierre; Müller, Christoph; Peng, Shushi; Peñuelas, Josep; Ruane, Alex C; Wallach, Daniel; Wang, Tao; Wu, Donghai; Liu, Zhuo; Zhu, Yan; Zhu, Zaichun; Asseng, Senthold

    2017-08-29

    Wheat, rice, maize, and soybean provide two-thirds of human caloric intake. Assessing the impact of global temperature increase on production of these crops is therefore critical to maintaining global food supply, but different studies have yielded different results. Here, we investigated the impacts of temperature on yields of the four crops by compiling extensive published results from four analytical methods: global grid-based and local point-based models, statistical regressions, and field-warming experiments. Results from the different methods consistently showed negative temperature impacts on crop yield at the global scale, generally underpinned by similar impacts at country and site scales. Without CO2 fertilization, effective adaptation, and genetic improvement, each degree-Celsius increase in global mean temperature would, on average, reduce global yields of wheat by 6.0%, rice by 3.2%, maize by 7.4%, and soybean by 3.1%. Results are highly heterogeneous across crops and geographical areas, with some positive impact estimates. Multimethod analyses improved the confidence in assessments of future climate impacts on global major crops and suggest crop- and region-specific adaptation strategies to ensure food security for an increasing world population.

  7. Forecast and analysis of the ratio of electric energy to terminal energy consumption for global energy internet

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Zhong, Ming; Cheng, Ling; Jin, Lu; Shen, Si

    2018-02-01

    Against the background of building the global energy internet, forecasting and analysing the ratio of electric energy to terminal energy consumption has both theoretical and practical significance. This paper first analyses the factors influencing the ratio of electric energy to terminal energy and then uses a combination method to forecast and analyse the global proportion of electric energy. A cointegration model for the proportion of electric energy is constructed using influencing factors such as the electricity price index, GDP, economic structure, energy-use efficiency and total population. Finally, the paper derives a projection of the proportion of electric energy using a combination-forecasting model based on multiple linear regression, trend analysis, and variance-covariance methods. The projection describes the development trend of the proportion of electric energy over 2017-2050, and the proportion in 2050 is analysed in detail using scenario analysis.

  8. Single Image Super-Resolution Using Global Regression Based on Multiple Local Linear Mappings.

    PubMed

    Choi, Jae-Seok; Kim, Munchurl

    2017-03-01

    Super-resolution (SR) has become more vital, because of its capability to generate high-quality ultra-high definition (UHD) high-resolution (HR) images from low-resolution (LR) input images. Conventional SR methods entail high computational complexity, which makes them difficult to implement for up-scaling of full-high-definition input images into UHD-resolution images. Nevertheless, our previous super-interpolation (SI) method showed a good compromise between Peak-Signal-to-Noise Ratio (PSNR) performance and computational complexity. However, since SI only utilizes simple linear mappings, it may fail to precisely reconstruct HR patches with complex texture. In this paper, we present a novel SR method, which inherits the large-to-small patch conversion scheme from SI but uses global regression based on local linear mappings (GLM). Thus, our new SR method is called GLM-SI. In GLM-SI, each LR input patch is divided into 25 overlapped subpatches. Next, based on the local properties of these subpatches, 25 different local linear mappings are applied to the current LR input patch to generate 25 HR patch candidates, which are then regressed into one final HR patch using a global regressor. The local linear mappings are learned cluster-wise in our off-line training phase. The main contribution of this paper is as follows: previously, linear-mapping-based conventional SR methods, including SI, used only one simple yet coarse linear mapping per patch to reconstruct its HR version. In contrast, for each LR input patch, our GLM-SI is the first to apply a combination of multiple local linear mappings, where each local linear mapping is found according to local properties of the current LR patch. Therefore, it can better approximate nonlinear LR-to-HR mappings for HR patches with complex texture.
Experimental results show that the proposed GLM-SI method outperforms most of the state-of-the-art methods, and shows comparable PSNR performance with much lower computational complexity when compared with a super-resolution method based on convolutional neural nets (SRCNN15). Compared with the previous SI method, which is limited to a scale factor of 2, GLM-SI shows superior performance with an average PSNR gain of 0.79 dB, and can be used for scale factors of 3 or higher.
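The core GLM idea above (several local linear mappings produce candidate HR patches, which a global regressor blends into one output) can be sketched as follows. The patch sizes, number of mappings, and weights are illustrative placeholders; in GLM-SI both the local mappings and the global regressor are learned offline.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy dimensions: 16-dim LR patch mapped to a 64-dim HR patch via 5
# candidate local linear mappings (stand-ins for cluster-wise learned maps).
lr_dim, hr_dim, n_maps = 16, 64, 5

local_maps = [rng.normal(0, 0.1, (hr_dim, lr_dim)) for _ in range(n_maps)]
global_weights = rng.dirichlet(np.ones(n_maps))  # stand-in for the global regressor

def super_resolve(lr_patch):
    """Generate one HR candidate per local linear mapping, then combine
    the candidates into a single HR patch with the global weights."""
    candidates = np.stack([M @ lr_patch for M in local_maps])  # (n_maps, hr_dim)
    return np.tensordot(global_weights, candidates, axes=1)    # (hr_dim,)

lr_patch = rng.normal(size=lr_dim)
hr_patch = super_resolve(lr_patch)
```

The advantage claimed by the paper is that the mixture of locally selected mappings can approximate nonlinear LR-to-HR relations that a single linear map cannot.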

  9. Local Intrinsic Dimension Estimation by Generalized Linear Modeling.

    PubMed

    Hino, Hideitsu; Fujiki, Jun; Akaho, Shotaro; Murata, Noboru

    2017-07-01

    We propose a method for intrinsic dimension estimation. By fitting a regression model relating a power of the distance from an inspection point to the number of samples contained in a ball of that radius, we estimate the goodness of fit. Then, using the maximum likelihood method, we estimate the local intrinsic dimension around the inspection point. The proposed method is shown to be comparable to conventional methods in global intrinsic dimension estimation experiments. Furthermore, we experimentally show that the proposed method outperforms a conventional local dimension estimation method.
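The distance-versus-count relation underlying such estimators can be illustrated with a simple regression-slope version: for points spread over a d-dimensional set, the number of samples within radius r grows like r^d, so the slope of log(count) against log(radius) estimates d. This sketch uses that basic slope estimator, not the paper's maximum-likelihood refinement; the data and radii are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# Points on a 2-D plane embedded in 5-D space; the local intrinsic
# dimension around the origin should come out near 2.
coords = rng.uniform(-1, 1, (4000, 2))
X = np.zeros((4000, 5))
X[:, :2] = coords

def local_dimension(X, center, radii):
    """Regress log(count inside ball) on log(radius); the slope
    estimates the local intrinsic dimension around `center`."""
    d = np.linalg.norm(X - center, axis=1)
    counts = np.array([(d <= r).sum() for r in radii])
    slope, _ = np.polyfit(np.log(radii), np.log(counts), 1)
    return slope

radii = np.linspace(0.2, 0.5, 8)
dim = local_dimension(X, np.zeros(5), radii)
```

Because only distances from the inspection point enter, the estimate is local: repeating it at different inspection points maps how the intrinsic dimension varies across the data set.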

  10. Improving Global Models of Remotely Sensed Ocean Chlorophyll Content Using Partial Least Squares and Geographically Weighted Regression

    NASA Astrophysics Data System (ADS)

    Gholizadeh, H.; Robeson, S. M.

    2015-12-01

    Empirical models have been widely used to estimate global chlorophyll content from remotely sensed data. Here, we focus on the standard NASA empirical models that use blue-green band ratios. These band ratio ocean color (OC) algorithms are in the form of fourth-order polynomials and the parameters of these polynomials (i.e. coefficients) are estimated from the NASA bio-Optical Marine Algorithm Data set (NOMAD). Most of the points in this data set have been sampled from tropical and temperate regions. However, polynomial coefficients obtained from this data set are used to estimate chlorophyll content in all ocean regions with different properties such as sea-surface temperature, salinity, and downwelling/upwelling patterns. Further, the polynomial terms in these models are highly correlated. In sum, the limitations of these empirical models are as follows: 1) the independent variables within the empirical models, in their current form, are correlated (multicollinear), and 2) current algorithms are global approaches and are based on the spatial stationarity assumption, so they are independent of location. The multicollinearity problem is resolved by using partial least squares (PLS). PLS, which transforms the data into a set of independent components, can be considered as a combined form of principal component regression (PCR) and multiple regression. Geographically weighted regression (GWR) is also used to investigate the validity of the spatial stationarity assumption. GWR solves a regression model over each sample point by using the observations within its neighbourhood. Results show that the empirical method underestimates chlorophyll content in high latitudes, including the Southern Ocean region, when compared to PLS (see Figure 1). Cluster analysis of GWR coefficients also shows that the spatial stationarity assumption in empirical models is not likely a valid assumption.
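To see how PLS copes with the multicollinearity described above, here is a minimal one-component PLS1 sketch on synthetic data with two nearly identical predictors (playing the role of the correlated band-ratio polynomial terms). Real applications would use multiple components and a library implementation; everything here is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two highly correlated predictors driven by one latent factor, plus noise.
n = 300
t_latent = rng.normal(size=n)
X = np.column_stack([t_latent + 0.01 * rng.normal(size=n),
                     t_latent + 0.01 * rng.normal(size=n)])
y = 2.0 * t_latent + 0.1 * rng.normal(size=n)

# PLS works on centered variables.
Xc = X - X.mean(axis=0)
yc = y - y.mean()

# One-component PLS1: weight vector maximizing covariance with y,
# then a simple regression of y on the resulting score.
w = Xc.T @ yc
w /= np.linalg.norm(w)   # first PLS direction
t = Xc @ w               # scores (a single well-conditioned component)
b = (t @ yc) / (t @ t)   # regression coefficient on the score

y_hat = t * b + y.mean()
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

Unlike OLS on the raw correlated columns, whose coefficients are unstable, the single PLS component summarizes the shared signal and yields a well-conditioned fit.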

  11. A spatially explicit approach to the study of socio-demographic inequality in the spatial distribution of trees across Boston neighborhoods

    PubMed Central

    Duncan, Dustin T.; Kawachi, Ichiro; Kum, Susan; Aldstadt, Jared; Piras, Gianfranco; Matthews, Stephen A.; Arbia, Giuseppe; Castro, Marcia C.; White, Kellee; Williams, David R.

    2017-01-01

    The racial/ethnic and income composition of neighborhoods often influences local amenities, including the potential spatial distribution of trees, which are important for population health and community wellbeing, particularly in urban areas. This ecological study used spatial analytical methods to assess the relationship between neighborhood socio-demographic characteristics (i.e. minority racial/ethnic composition and poverty) and tree density at the census tract level in Boston, Massachusetts (US). We examined spatial autocorrelation with the Global Moran’s I for all study variables and in the ordinary least squares (OLS) regression residuals as well as computed Spearman correlations non-adjusted and adjusted for spatial autocorrelation between socio-demographic characteristics and tree density. Next, we fit traditional regressions (i.e. OLS regression models) and spatial regressions (i.e. spatial simultaneous autoregressive models), as appropriate. We found significant positive spatial autocorrelation for all neighborhood socio-demographic characteristics (Global Moran’s I range from 0.24 to 0.86, all P=0.001), for tree density (Global Moran’s I=0.452, P=0.001), and in the OLS regression residuals (Global Moran’s I range from 0.32 to 0.38, all P<0.001). Therefore, we fit the spatial simultaneous autoregressive models. There was a negative correlation between neighborhood percent non-Hispanic Black and tree density (rS=−0.19; conventional P-value=0.016; spatially adjusted P-value=0.299) as well as a negative correlation between predominantly non-Hispanic Black (over 60% Black) neighborhoods and tree density (rS=−0.18; conventional P-value=0.019; spatially adjusted P-value=0.180).
While the conventional OLS regression model found a marginally significant inverse relationship between Black neighborhoods and tree density, we found no statistically significant relationship between neighborhood socio-demographic composition and tree density in the spatial regression models. Methodologically, our study suggests the need to take into account spatial autocorrelation as findings/conclusions can change when the spatial autocorrelation is ignored. Substantively, our findings suggest no need for policy intervention vis-à-vis trees in Boston, though we hasten to add that replication studies, and more nuanced data on tree quality, age and diversity are needed. PMID:29354668
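The Global Moran's I used throughout the study is I = (n/S0) · (zᵀWz)/(zᵀz), where z are mean-centered values, W is a spatial weights matrix and S0 its sum. A minimal sketch on a toy grid with rook-adjacency weights is below; the grid, values, and weights are illustrative, not the Boston census-tract data.

```python
import numpy as np

# 6x6 grid with clustered values (left half low, right half high);
# clustering should give strong positive spatial autocorrelation.
side = 6
values = np.array([[0.0] * 3 + [1.0] * 3 for _ in range(side)]).ravel()

# Rook-adjacency (share an edge) binary spatial weights matrix.
n = side * side
W = np.zeros((n, n))
for i in range(side):
    for j in range(side):
        k = i * side + j
        for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            ni, nj = i + di, j + dj
            if 0 <= ni < side and 0 <= nj < side:
                W[k, ni * side + nj] = 1.0

z = values - values.mean()
moran_i = (n / W.sum()) * (z @ W @ z) / (z @ z)
```

Values of I near +1 indicate clustering, near 0 spatial randomness, and negative values dispersion; significant I in OLS residuals is exactly the symptom that led the authors to switch to spatial autoregressive models.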

  12. Locally linear regression for pose-invariant face recognition.

    PubMed

    Chai, Xiujuan; Shan, Shiguang; Chen, Xilin; Gao, Wen

    2007-07-01

    The variation of facial appearance due to the viewpoint (/pose) degrades face recognition systems considerably, which is one of the bottlenecks in face recognition. One of the possible solutions is generating virtual frontal view from any given nonfrontal view to obtain a virtual gallery/probe face. Following this idea, this paper proposes a simple, but efficient, novel locally linear regression (LLR) method, which generates the virtual frontal view from a given nonfrontal face image. We first justify the basic assumption of the paper that there exists an approximate linear mapping between a nonfrontal face image and its frontal counterpart. Then, by formulating the estimation of the linear mapping as a prediction problem, we present the regression-based solution, i.e., globally linear regression. To improve the prediction accuracy in the case of coarse alignment, LLR is further proposed. In LLR, we first perform dense sampling in the nonfrontal face image to obtain many overlapped local patches. Then, the linear regression technique is applied to each small patch for the prediction of its virtual frontal patch. Through the combination of all these patches, the virtual frontal view is generated. The experimental results on the CMU PIE database show distinct advantage of the proposed method over Eigen light-field method.

  13. Optimization of fixture layouts of glass laser optics using multiple kernel regression.

    PubMed

    Su, Jianhua; Cao, Enhua; Qiao, Hong

    2014-05-10

    We aim to build an integrated fixturing model to describe the structural properties and thermal properties of the support frame of glass laser optics. Therefore, (a) a near global optimal set of clamps can be computed to minimize the surface shape error of the glass laser optic based on the proposed model, and (b) a desired surface shape error can be obtained by adjusting the clamping forces under various environmental temperatures based on the model. To construct the model, we develop a new multiple kernel learning method and call it multiple kernel support vector functional regression. The proposed method uses two layer regressions to group and order the data sources by the weights of the kernels and the factors of the layers. Because of that, the influences of the clamps and the temperature can be evaluated by grouping them into different layers.

  14. Estimating the exceedance probability of rain rate by logistic regression

    NASA Technical Reports Server (NTRS)

    Chiu, Long S.; Kedem, Benjamin

    1990-01-01

    Recent studies have shown that the fraction of an area with rain intensity above a fixed threshold is highly correlated with the area-averaged rain rate. To estimate the fractional rainy area, a logistic regression model, which estimates the conditional probability that rain rate over an area exceeds a fixed threshold given the values of related covariates, is developed. The problem of dependency in the data in the estimation procedure is bypassed by the method of partial likelihood. Analyses of simulated scanning multichannel microwave radiometer and observed electrically scanning microwave radiometer data during the Global Atlantic Tropical Experiment period show that the use of logistic regression in pixel classification is superior to multiple regression in predicting whether rain rate at each pixel exceeds a given threshold, even in the presence of noisy data. The potential of the logistic regression technique in satellite rain rate estimation is discussed.
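The core modeling step above (the conditional probability that rain rate exceeds a threshold as a logistic function of covariates) can be sketched with a simple Newton-Raphson fit. The single synthetic covariate and coefficient values are illustrative assumptions, not the radiometer data or the paper's partial-likelihood machinery.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic covariate (e.g. a standardized radiometer channel value) and
# binary outcomes "rain rate exceeds threshold" drawn from an assumed
# logistic model.
n = 1000
x = rng.normal(size=n)
true_beta = np.array([-0.5, 2.0])   # intercept, slope (illustrative)
X = np.column_stack([np.ones(n), x])
p_true = 1 / (1 + np.exp(-X @ true_beta))
y = (rng.uniform(size=n) < p_true).astype(float)

# Fit logistic regression by Newton-Raphson on the log-likelihood.
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (y - p)                            # score vector
    hess = X.T @ (X * (p * (1 - p))[:, None])       # observed information
    beta = beta + np.linalg.solve(hess, grad)
```

The fitted model yields, for each pixel, an exceedance probability rather than a hard classification, which is what makes it useful for estimating the fractional rainy area.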

  15. Principal components and iterative regression analysis of geophysical series: Application to Sunspot number (1750-2004)

    NASA Astrophysics Data System (ADS)

    Nordemann, D. J. R.; Rigozo, N. R.; de Souza Echer, M. P.; Echer, E.

    2008-11-01

    We present here an implementation of a least squares iterative regression method applied to the sine functions embedded in the principal components extracted from geophysical time series. This method seems to represent a useful improvement for the non-stationary time series periodicity quantitative analysis. The principal components determination followed by the least squares iterative regression method was implemented in an algorithm written in the Scilab (2006) language. The main result of the method is to obtain the set of sine functions embedded in the series analyzed in decreasing order of significance, from the most important ones, likely to represent the physical processes involved in the generation of the series, to the less important ones that represent noise components. Taking into account the need of a deeper knowledge of the Sun's past history and its implication to global climate change, the method was applied to the Sunspot Number series (1750-2004). With the threshold and parameter values used here, the application of the method leads to a total of 441 explicit sine functions, among which 65 were considered as being significant and were used for a reconstruction that gave a normalized mean squared error of 0.146.

  16. Acoustic-articulatory mapping in vowels by locally weighted regression

    PubMed Central

    McGowan, Richard S.; Berger, Michael A.

    2009-01-01

    A method for mapping between simultaneously measured articulatory and acoustic data is proposed. The method uses principal components analysis on the articulatory and acoustic variables, and mapping between the domains by locally weighted linear regression, or loess [Cleveland, W. S. (1979). J. Am. Stat. Assoc. 74, 829–836]. The latter method permits local variation in the slopes of the linear regression, assuming that the function being approximated is smooth. The methodology is applied to vowels of four speakers in the Wisconsin X-ray Microbeam Speech Production Database, with formant analysis. Results are examined in terms of (1) examples of forward (articulation-to-acoustics) mappings and inverse mappings, (2) distributions of local slopes and constants, (3) examples of correlations among slopes and constants, (4) root-mean-square error, and (5) sensitivity of formant frequencies to articulatory change. It is shown that the results are qualitatively correct and that loess performs better than global regression. The forward mappings show different root-mean-square error properties than the inverse mappings indicating that this method is better suited for the forward mappings than the inverse mappings, at least for the data chosen for the current study. Some preliminary results on sensitivity of the first two formant frequencies to the two most important articulatory principal components are presented. PMID:19813812
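A minimal sketch of the loess step described above: at each query point, fit a weighted line to the nearest neighbours using tricube weights, so the local slope can vary along the curve. The smooth test function and neighbourhood fraction are illustrative, not the articulatory-acoustic data of the study.

```python
import numpy as np

rng = np.random.default_rng(9)

def loess_point(x0, x, y, frac=0.15):
    """Locally weighted linear regression (loess) evaluated at x0:
    tricube-weighted least squares over the frac*n nearest neighbours."""
    n = len(x)
    k = max(2, int(frac * n))
    d = np.abs(x - x0)
    idx = np.argsort(d)[:k]                         # k nearest neighbours
    w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3     # tricube weights
    A = np.column_stack([np.ones(k), x[idx]])
    WA = A * w[:, None]
    coef = np.linalg.solve(A.T @ WA, A.T @ (w * y[idx]))  # weighted LS
    return coef[0] + coef[1] * x0

# Smooth nonlinear curve (a stand-in for a formant-articulator relation).
x = np.sort(rng.uniform(0, 2 * np.pi, 400))
y = np.sin(x) + 0.05 * rng.normal(size=400)
fit_mid = loess_point(np.pi / 2, x, y)
```

Because a separate line is fit at every evaluation point, loess tracks curvature that a single global regression line would smooth away, which is the advantage the study reports.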

  17. Solar radiation over Egypt: Comparison of predicted and measured meteorological data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamel, M.A.; Shalaby, S.A.; Mostafa, S.S.

    1993-06-01

    Measurements of global solar irradiance on a horizontal surface at five meteorological stations in Egypt for three years 1987, 1988, and 1989 are compared with their corresponding values computed by two independent methods. The first method is based on the Angstrom formula, which correlates relative solar irradiance H/H[sub o] to corresponding relative duration of bright sunshine n/N. Regional regression coefficients are obtained and used for prediction of global solar irradiance. Good agreement with measurements is obtained. The second method employs an empirical relation that takes sunshine duration and the noon altitude of the sun as inputs, together with an appropriate choice of zone parameters. This also gives good agreement with the measurements. Comparison shows that the first method gives a better fit to the experimental data.
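The Angstrom formula above is a simple linear regression, H/H0 = a + b·(n/N), whose coefficients a and b are fit per region. The sketch below fits it on synthetic data; the coefficient values and sunshine-fraction range are illustrative, not the regional values derived in the study.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic monthly records: relative sunshine duration n/N and relative
# global irradiance H/H0 generated from assumed Angstrom coefficients.
rel_sunshine = rng.uniform(0.3, 1.0, 120)             # n/N
a_true, b_true = 0.25, 0.50                           # illustrative coefficients
rel_irradiance = (a_true + b_true * rel_sunshine
                  + 0.02 * rng.normal(size=120))      # H/H0 with noise

# Least-squares fit of the Angstrom line.
b_est, a_est = np.polyfit(rel_sunshine, rel_irradiance, 1)

# Predict relative irradiance for a day with 80% relative sunshine.
pred = a_est + b_est * 0.8
```

Given the local clear-sky irradiance H0, the predicted H follows directly, which is how the regional coefficients are used for prediction in the study.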

  18. An Analysis of San Diego's Housing Market Using a Geographically Weighted Regression Approach

    NASA Astrophysics Data System (ADS)

    Grant, Christina P.

    San Diego County real estate transaction data was evaluated with a set of linear models calibrated by ordinary least squares and geographically weighted regression (GWR). The goal of the analysis was to determine whether the spatial effects assumed to be in the data are best studied globally with no spatial terms, globally with a fixed effects submarket variable, or locally with GWR. 18,050 single-family residential sales which closed in the six months between April 2014 and September 2014 were used in the analysis. Diagnostic statistics including AICc, R2, Global Moran's I, and visual inspection of diagnostic plots and maps indicate superior model performance by GWR as compared to both global regressions.

  19. An Optimization-Based Method for Feature Ranking in Nonlinear Regression Problems.

    PubMed

    Bravi, Luca; Piccialli, Veronica; Sciandrone, Marco

    2017-04-01

    In this paper, we consider the feature ranking problem, where, given a set of training instances, the task is to associate a score with the features in order to assess their relevance. Feature ranking is a very important tool for decision support systems, and may be used as an auxiliary step of feature selection to reduce the high dimensionality of real-world data. We focus on regression problems by assuming that the process underlying the generated data can be approximated by a continuous function (for instance, a feedforward neural network). We formally state the notion of relevance of a feature by introducing a minimum zero-norm inversion problem of a neural network, which is a nonsmooth, constrained optimization problem. We employ a concave approximation of the zero-norm function, and we define a smooth, global optimization problem to be solved in order to assess the relevance of the features. We present the new feature ranking method based on the solution of instances of the global optimization problem depending on the available training data. Computational experiments on both artificial and real data sets are performed, and point out that the proposed feature ranking method is a valid alternative to existing methods in terms of effectiveness. The obtained results also show that the method is costly in terms of CPU time, and this may be a limitation in the solution of large-dimensional problems.

  20. Hierarchical Bayesian modelling of mobility metrics for hazard model input calibration

    NASA Astrophysics Data System (ADS)

    Calder, Eliza; Ogburn, Sarah; Spiller, Elaine; Rutarindwa, Regis; Berger, Jim

    2015-04-01

    In this work we present a method to constrain flow mobility input parameters for pyroclastic flow models using hierarchical Bayes modeling of standard mobility metrics such as H/L and flow volume. The advantage of hierarchical modeling is that it can leverage the information in a global dataset for a particular mobility metric in order to reduce the uncertainty in modeling of an individual volcano, especially important where individual volcanoes have only sparse datasets. We use compiled pyroclastic flow runout data from Colima, Merapi, Soufriere Hills, Unzen and Semeru volcanoes, presented in an open-source database FlowDat (https://vhub.org/groups/massflowdatabase). While the exact relationship between flow volume and friction varies somewhat between volcanoes, dome collapse flows originating from the same volcano exhibit similar mobility relationships. Instead of fitting separate regression models for each volcano dataset, we use a variation of the hierarchical linear model (Kass and Steffey, 1989). The model presents a hierarchical structure with two levels: all dome collapse flows, and dome collapse flows at specific volcanoes. The hierarchical model allows us to assume that the flows at specific volcanoes share a common distribution of regression slopes, then solves for that distribution. We present comparisons of the 95% confidence intervals on the individual regression lines for the data set from each volcano as well as those obtained from the hierarchical model. The results clearly demonstrate the advantage of considering global datasets using this technique. The technique developed is demonstrated here for mobility metrics, but can be applied to many other global datasets of volcanic parameters. In particular, such methods can provide a means to better constrain parameters for volcanoes for which we only have sparse data, a ubiquitous problem in volcanology.

  1. Discrete post-processing of total cloud cover ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Haiden, Thomas; Pappenberger, Florian

    2017-04-01

    This contribution presents an approach to post-process ensemble forecasts for the discrete and bounded weather variable of total cloud cover. Two methods for discrete statistical post-processing of ensemble predictions are tested. The first approach is based on multinomial logistic regression, the second involves a proportional odds logistic regression model. Applying them to total cloud cover raw ensemble forecasts from the European Centre for Medium-Range Weather Forecasts improves forecast skill significantly. Based on station-wise post-processing of raw ensemble total cloud cover forecasts for a global set of 3330 stations over the period from 2007 to early 2014, the more parsimonious proportional odds logistic regression model proved to slightly outperform the multinomial logistic regression model. Reference: Hemri, S., Haiden, T., & Pappenberger, F. (2016). Discrete post-processing of total cloud cover ensemble forecasts. Monthly Weather Review, 144, 2565-2577.
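
For intuition, the proportional odds model's category probabilities can be written down directly. A minimal sketch follows; the threshold and slope values are illustrative, and a real post-processing scheme would fit them by maximum likelihood against observed cloud-cover categories:

```python
import numpy as np

def proportional_odds_probs(x, thetas, beta):
    """Category probabilities under a proportional-odds logistic model.

    Cumulative probabilities are P(Y <= k | x) = sigmoid(theta_k - beta*x);
    a single slope beta is shared by all K-1 cumulative logits, which is
    what makes this model more parsimonious than multinomial logistic
    regression (one slope per category).  thetas must be increasing.
    """
    thetas = np.asarray(thetas, dtype=float)
    cum = 1.0 / (1.0 + np.exp(-(thetas - beta * x)))  # P(Y <= k | x)
    cum = np.concatenate([cum, [1.0]])                # top category closes at 1
    return np.diff(cum, prepend=0.0)                  # per-category probabilities
```

For total cloud cover reported in oktas (0-8), one would use eight increasing thresholds to obtain nine category probabilities.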

  2. Multiple imputation for cure rate quantile regression with censored data.

    PubMed

    Wu, Yuanshan; Yin, Guosheng

    2017-03-01

    The main challenge in the context of cure rate analysis is that one never knows whether censored subjects are cured or uncured, or whether they are susceptible or insusceptible to the event of interest. Considering the susceptible indicator as missing data, we propose a multiple imputation approach to cure rate quantile regression for censored data with a survival fraction. We develop an iterative algorithm to estimate the conditionally uncured probability for each subject. By utilizing this estimated probability and Bernoulli sample imputation, we can classify each subject as cured or uncured, and then employ the locally weighted method to estimate the quantile regression coefficients with only the uncured subjects. Repeating the imputation procedure multiple times and taking an average over the resultant estimators, we obtain consistent estimators for the quantile regression coefficients. Our approach relaxes the usual global linearity assumption, so that we can apply quantile regression to any particular quantile of interest. We establish asymptotic properties for the proposed estimators, including both consistency and asymptotic normality. We conduct simulation studies to assess the finite-sample performance of the proposed multiple imputation method and apply it to a lung cancer study as an illustration. © 2016, The International Biometric Society.
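
The imputation loop described above can be sketched in a few lines: draw a Bernoulli cure label for each censored subject from its estimated uncured probability, fit on the subjects classified as uncured, and average over imputations. In this sketch, ordinary least squares stands in for the locally weighted quantile regression of the paper, and all names are illustrative:

```python
import numpy as np

def mi_cure_slope(x, y, delta, p_uncured, m=50, seed=0):
    """Multiple-imputation sketch for regression with a cure fraction.

    For each censored subject (delta == 0) a cure label is drawn from its
    estimated uncured probability; the regression is fit on subjects
    classified as uncured, and the slope is averaged over m imputations.
    """
    rng = np.random.default_rng(seed)
    slopes = []
    for _ in range(m):
        uncured = np.where(delta == 1, True, rng.random(len(x)) < p_uncured)
        xs, ys = x[uncured], y[uncured]
        xc = xs - xs.mean()
        slopes.append(np.sum(xc * (ys - ys.mean())) / np.sum(xc ** 2))
    return float(np.mean(slopes))
```

Averaging the m estimators is what restores consistency despite the randomness of each single imputation.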

  3. Confidence limits for data mining models of options prices

    NASA Astrophysics Data System (ADS)

    Healy, J. V.; Dixon, M.; Read, B. J.; Cai, F. F.

    2004-12-01

    Non-parametric methods such as artificial neural nets can successfully model prices of financial options, outperforming the Black-Scholes analytic model (Eur. Phys. J. B 27 (2002) 219). However, the accuracy of such approaches is usually expressed only by a global fitting/error measure. This paper describes a robust method for determining prediction intervals for models derived by non-linear regression. We have demonstrated it by application to a standard synthetic example (29th Annual Conference of the IEEE Industrial Electronics Society, Special Session on Intelligent Systems, pp. 1926-1931). The method is used here to obtain prediction intervals for option prices using market data for LIFFE “ESX” FTSE 100 index options (http://www.liffe.com/liffedata/contracts/month_onmonth.xls). We avoid special neural net architectures and use standard regression procedures to determine local error bars. The method is appropriate for target data with non-constant variance (or volatility).
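
The local-error-bar idea can be sketched with two regressions: one for the mean and a second fit to the squared residuals, giving a variance that varies with the input. Both fits are plain straight lines here, standing in for the neural nets of the paper; the guard against negative fitted variance is an assumption of this sketch:

```python
import numpy as np

def local_prediction_interval(x, y, x_new, z=1.96):
    """Prediction interval with non-constant variance: fit a line to y,
    then a second line to the squared residuals, and use the locally
    fitted variance to set the error bar at x_new."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid2 = (y - X @ beta) ** 2
    gamma, *_ = np.linalg.lstsq(X, resid2, rcond=None)  # local variance model
    Xn = np.array([1.0, x_new])
    mu = Xn @ beta
    var = max(Xn @ gamma, 1e-12)   # guard against a negative fitted variance
    return mu - z * np.sqrt(var), mu + z * np.sqrt(var)
```

With heteroscedastic data the interval widens where the noise is larger, which is exactly the behavior a global error measure cannot express.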

  4. Penalized spline estimation for functional coefficient regression models.

    PubMed

    Cao, Yanrong; Lin, Haiqun; Wu, Tracy Z; Yu, Yan

    2010-04-01

    The functional coefficient regression models assume that the regression coefficients vary with some "threshold" variable, providing appreciable flexibility in capturing the underlying dynamics in data and avoiding the so-called "curse of dimensionality" in multivariate nonparametric estimation. We first investigate the estimation, inference, and forecasting for the functional coefficient regression models with dependent observations via penalized splines. The P-spline approach, as a direct ridge regression shrinkage type global smoothing method, is computationally efficient and stable. With established fixed-knot asymptotics, inference is readily available. Exact inference can be obtained for fixed smoothing parameter λ, which is most appealing for finite samples. Our penalized spline approach gives an explicit model expression, which also enables multi-step-ahead forecasting via simulations. Furthermore, we examine different methods of choosing the important smoothing parameter λ: modified multi-fold cross-validation (MCV), generalized cross-validation (GCV), and an extension of empirical bias bandwidth selection (EBBS) to P-splines. In addition, we implement smoothing parameter selection using mixed model framework through restricted maximum likelihood (REML) for P-spline functional coefficient regression models with independent observations. The P-spline approach also easily allows different smoothness for different functional coefficients, which is enabled by assigning different penalty λ accordingly. We demonstrate the proposed approach by both simulation examples and a real data application.
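
The ridge-type shrinkage character of the P-spline fit can be shown with a minimal sketch. A truncated-line basis stands in here for the B-spline basis with difference penalty that P-splines normally use, and only the knot coefficients are penalized, so a very large λ recovers a straight line:

```python
import numpy as np

def pspline_fit(x, y, knots, lam):
    """Penalized-spline fit as ridge-type shrinkage (illustrative sketch).

    Basis: intercept, slope, and one truncated line max(x - k, 0) per knot.
    The penalty matrix leaves intercept and slope unpenalized, so
    lam -> infinity shrinks the fit toward ordinary linear regression.
    """
    B = np.column_stack([np.ones_like(x), x] +
                        [np.clip(x - k, 0.0, None) for k in knots])
    P = np.eye(B.shape[1])
    P[0, 0] = P[1, 1] = 0.0   # do not penalize intercept and slope
    coef = np.linalg.solve(B.T @ B + lam * P, B.T @ y)
    return B @ coef
```

Because the solution is an explicit linear smoother, fixed-λ inference and multi-step-ahead simulation are straightforward, which is the practical appeal noted above.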

  5. Economic and Health Predictors of National Postpartum Depression Prevalence: A Systematic Review, Meta-analysis, and Meta-Regression of 291 Studies from 56 Countries.

    PubMed

    Hahn-Holbrook, Jennifer; Cornwell-Hinrichs, Taylor; Anaya, Itzel

    2017-01-01

    Postpartum depression (PPD) poses a major global public health challenge. PPD is the most common complication associated with childbirth and exerts harmful effects on children. Although hundreds of PPD studies have been published, we lack accurate global or national PPD prevalence estimates and have no clear account of why PPD appears to vary so dramatically between nations. Accordingly, we conducted a meta-analysis to estimate the global and national prevalence of PPD and a meta-regression to identify economic, health, social, or policy factors associated with national PPD prevalence. We conducted a systematic review of all papers reporting PPD prevalence using the Edinburgh Postnatal Depression Scale. PPD prevalence and methods were extracted from each study. Random effects meta-analysis was used to estimate global and national PPD prevalence. To test for country level predictors, we drew on data from UNICEF, WHO, and the World Bank. Random effects meta-regression was used to test national predictors of PPD prevalence. 291 studies of 296,284 women from 56 countries were identified. The global pooled prevalence of PPD was 17.7% (95% confidence interval: 16.6-18.8%), with significant heterogeneity across nations (Q = 16,823, p = 0.000, I² = 98%), ranging from 3% (2-5%) in Singapore to 38% (35-41%) in Chile. Nations with significantly higher rates of income inequality (R² = 41%), maternal mortality (R² = 19%), infant mortality (R² = 16%), or women of childbearing age working ≥40 h a week (R² = 31%) have higher rates of PPD. Together, these factors explain 73% of the national variation in PPD prevalence. The global prevalence of PPD is greater than previously thought and varies dramatically by nation. Disparities in wealth inequality and maternal-child-health factors explain much of the national variation in PPD prevalence.
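
The meta-regression step regresses study effect sizes on a country-level predictor with inverse-variance weights. The sketch below is a fixed-effect simplification (the study above uses random-effects models, which add a between-study variance term to the weights); all names are illustrative:

```python
import numpy as np

def meta_regression(effect, se, predictor):
    """Inverse-variance weighted meta-regression (fixed-effect sketch).

    Returns the slope of the study effects on a study-level predictor and
    an R^2-style share of weighted variance explained.
    """
    effect = np.asarray(effect, float)
    predictor = np.asarray(predictor, float)
    w = 1.0 / np.asarray(se, float) ** 2          # inverse-variance weights
    X = np.column_stack([np.ones_like(predictor), predictor])
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * effect))
    fitted = X @ beta
    ybar = np.average(effect, weights=w)
    r2 = 1.0 - np.sum(w * (effect - fitted) ** 2) / np.sum(w * (effect - ybar) ** 2)
    return beta[1], r2
```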

  6. The eMLR(C*) Method to Determine Decadal Changes in the Global Ocean Storage of Anthropogenic CO2

    NASA Astrophysics Data System (ADS)

    Clement, Dominic; Gruber, Nicolas

    2018-04-01

    The determination of the decadal change in anthropogenic CO2 in the global ocean from repeat hydrographic surveys represents a formidable challenge, which we address here by introducing a seamless new method. This method builds on the extended multiple linear regression (eMLR) approach to identify the anthropogenic CO2 signal, but in order to improve the robustness of this method, we fit C∗ rather than dissolved inorganic carbon and use a probabilistic method for the selection of the predictors. In order to account for the multiyear nature of the surveys, we adjust all C∗ observations of a particular observing period to a common reference year by assuming a transient steady state. We finally use the eMLR models together with global gridded climatological distributions of the predictors to map the estimated change in anthropogenic CO2 to the global ocean. Testing this method with synthetic data generated from a hindcast simulation with an ocean model reveals that the method is able to reconstruct the change in anthropogenic CO2 with only a small global bias (<5%). Within ocean basins, the errors can be larger, mostly driven by changes in ocean circulation. Overall, we conclude from the model that the method has an accuracy of retrieving the column integrated change in anthropogenic CO2 of about ±10% at the scale of whole ocean basins. We expect that this uncertainty needs to be doubled to about ±20% when the change in anthropogenic CO2 is reconstructed from observations.
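
The core eMLR trick is that one differences two fitted regression models rather than the noisy observations themselves. A minimal sketch, with predictor selection, the C* adjustment, and mapping details omitted:

```python
import numpy as np

def emlr_delta(pred1, c1, pred2, c2, clim):
    """Extended multiple linear regression (eMLR) sketch.

    Fit C* = X @ beta separately for two survey periods, then evaluate the
    *difference* of the fitted models on a common climatological predictor
    field: random errors in each fit largely cancel in the difference,
    leaving the anthropogenic change signal.
    """
    def fit(X, y):
        Xd = np.column_stack([np.ones(len(X)), X])
        beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
        return beta
    b1, b2 = fit(pred1, c1), fit(pred2, c2)
    Xc = np.column_stack([np.ones(len(clim)), clim])
    return Xc @ (b2 - b1)   # mapped change between the two periods
```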

  7. Applications of Monte Carlo method to nonlinear regression of rheological data

    NASA Astrophysics Data System (ADS)

    Kim, Sangmo; Lee, Junghaeng; Kim, Sihyun; Cho, Kwang Soo

    2018-02-01

    In rheological studies, it is often necessary to determine the parameters of rheological models from experimental data. Since both the rheological data and the values of the parameters vary on a logarithmic scale and the number of parameters is quite large, conventional methods of nonlinear regression such as the Levenberg-Marquardt (LM) method are usually ineffective. A gradient-based method such as LM is apt to be caught in local minima, which give unphysical values of the parameters whenever the initial guess is far from the global optimum. Although this problem could be solved by simulated annealing (SA), such Monte Carlo (MC) methods need adjustable parameters which must be determined in an ad hoc manner. We suggest a simplified version of SA, a kind of MC method, which yields effective values of the parameters of most complicated rheological models such as the Carreau-Yasuda model of steady shear viscosity, the discrete relaxation spectrum, and zero-shear viscosity as a function of concentration and molecular weight.
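
A simplified annealing scheme of this kind can be sketched as follows. Steps are taken in log10 of the parameters, since both data and parameters span decades; the cooling schedule, step size, and acceptance rule here are illustrative choices, not the paper's:

```python
import numpy as np

def log_anneal(residual, p0, n_iter=20000, t0=1.0, seed=0):
    """Simplified simulated annealing in log-parameter space.

    `residual(p)` returns an array of (log-scale) residuals; the objective
    is their sum of squares.  Downhill moves are always accepted; uphill
    moves are accepted with a Metropolis probability that shrinks as the
    temperature decays geometrically.
    """
    rng = np.random.default_rng(seed)
    lp = np.log10(np.asarray(p0, dtype=float))
    best = cur = np.sum(residual(10 ** lp) ** 2)
    best_lp = lp.copy()
    for i in range(n_iter):
        t = t0 * 0.999 ** i                        # geometric cooling
        cand = lp + rng.normal(0.0, 0.05, lp.size) # step in log space
        c = np.sum(residual(10 ** cand) ** 2)
        if c < cur or rng.random() < np.exp(-(c - cur) / t):
            lp, cur = cand, c
            if c < best:
                best, best_lp = c, lp.copy()
    return 10 ** best_lp
```

Because steps are taken in log space, parameters separated by several decades (e.g. relaxation times) are explored on an equal footing.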

  8. Global Monitoring of Water Supply and Sanitation: History, Methods and Future Challenges

    PubMed Central

    Bartram, Jamie; Brocklehurst, Clarissa; Fisher, Michael B.; Luyendijk, Rolf; Hossain, Rifat; Wardlaw, Tessa; Gordon, Bruce

    2014-01-01

    International monitoring of drinking water and sanitation shapes awareness of countries’ needs and informs policy, implementation and research efforts to extend and improve services. The Millennium Development Goals established global targets for drinking water and sanitation access; progress towards these targets, facilitated by international monitoring, has contributed to reducing the global disease burden and increasing quality of life. The experiences of the MDG period generated important lessons about the strengths and limitations of current approaches to defining and monitoring access to drinking water and sanitation. The methods by which the Joint Monitoring Programme (JMP) of WHO and UNICEF tracks access and progress are based on analysis of data from household surveys and linear regression modelling of these results over time. These methods provide nationally-representative and internationally-comparable insights into the drinking water and sanitation facilities used by populations worldwide, but also have substantial limitations: current methods do not address water quality, equity of access, or extra-household services. Improved statistical methods are needed to better model temporal trends. This article describes and critically reviews JMP methods in detail for the first time. It also explores the impact of, and future directions for, international monitoring of drinking water and sanitation. PMID:25116635
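
The linear-regression step of the JMP estimates can be sketched in a couple of lines: fit a straight line to the survey data points for a country and read off the reporting year, clipping to the valid 0-100% range. A minimal illustration, not the JMP's production code:

```python
import numpy as np

def coverage_estimate(years, coverage, target_year):
    """JMP-style estimate: linear regression of household-survey coverage
    points over time, evaluated at a reporting year and clipped to 0-100%."""
    slope, intercept = np.polyfit(years, coverage, 1)
    return float(np.clip(slope * target_year + intercept, 0.0, 100.0))
```

The clipping step matters in practice: extrapolating a fitted trend a few years past the last survey can otherwise exceed 100% coverage.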

  9. Vegetation Continuous Fields--Transitioning from MODIS to VIIRS

    NASA Astrophysics Data System (ADS)

    DiMiceli, C.; Townshend, J. R.; Sohlberg, R. A.; Kim, D. H.; Kelly, M.

    2015-12-01

    Measurements of fractional vegetation cover are critical for accurate and consistent monitoring of global deforestation rates. They also provide important parameters for land surface, climate and carbon models and vital background data for research into fire, hydrological and ecosystem processes. MODIS Vegetation Continuous Fields (VCF) products provide four complementary layers of fractional cover: tree cover, non-tree vegetation, bare ground, and surface water. MODIS VCF products are currently produced globally and annually at 250m resolution for 2000 to the present. Additionally, annual VCF products at 1/20° resolution derived from AVHRR and MODIS Long-Term Data Records are in development to provide Earth System Data Records of fractional vegetation cover for 1982 to the present. In order to provide continuity of these valuable products, we are extending the VCF algorithms to create Suomi NPP/VIIRS VCF products. This presentation will highlight the first VIIRS fractional cover product: global percent tree cover at 1 km resolution. To create this product, phenological and physiological metrics were derived from each complete year of VIIRS 8-day surface reflectance products. A supervised regression tree method was applied to the metrics, using training derived from Landsat data supplemented by high-resolution data from Ikonos, RapidEye and QuickBird. The regression tree model was then applied globally to produce fractional tree cover. In our presentation we will detail our methods for creating the VIIRS VCF product. We will compare the new VIIRS VCF product to our current MODIS VCF products and demonstrate continuity between instruments. Finally, we will outline future VIIRS VCF development plans.

  10. An open-access CMIP5 pattern library for temperature and precipitation: Description and methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lynch, Cary D.; Hartin, Corinne A.; Bond-Lamberty, Benjamin

    Pattern scaling is used to efficiently emulate general circulation models and explore uncertainty in climate projections under multiple forcing scenarios. Pattern scaling methods assume that local climate changes scale with a global mean temperature increase, allowing for spatial patterns to be generated for multiple models for any future emission scenario. For uncertainty quantification and probabilistic statistical analysis, a library of patterns with descriptive statistics for each file would be beneficial, but such a library does not presently exist. Of the possible techniques used to generate patterns, the two most prominent are the delta and least squared regression methods. We explore the differences and statistical significance between patterns generated by each method and assess performance of the generated patterns across methods and scenarios. Differences in patterns across seasons between methods and epochs were largest in high latitudes (60-90°N/S). Bias and mean errors between modeled and pattern predicted output from the linear regression method were smaller than patterns generated by the delta method. Across scenarios, differences in the linear regression method patterns were more statistically significant, especially at high latitudes. We found that pattern generation methodologies were able to approximate the forced signal of change to within ≤ 0.5°C, but choice of pattern generation methodology for pattern scaling purposes should be informed by user goals and criteria. As a result, this paper describes our library of least squared regression patterns from all CMIP5 models for temperature and precipitation on an annual and sub-annual basis, along with the code used to generate these patterns.
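
The least-squares pattern method reduces to one regression per grid cell of the local change on global-mean temperature. A minimal sketch (a vectorized slope computation; the paper's full workflow also handles seasons, epochs, and descriptive statistics):

```python
import numpy as np

def local_pattern(global_T, local_field):
    """Least-squares pattern scaling: regress each grid cell's local change
    on global-mean temperature.  `local_field` is (time, cells); returns
    the per-cell slope map, which can then be scaled to any scenario's
    global-mean warming."""
    g = global_T - global_T.mean()
    return (g @ (local_field - local_field.mean(axis=0))) / np.sum(g ** 2)
```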

  12. A Statistical Multimodel Ensemble Approach to Improving Long-Range Forecasting in Pakistan

    DTIC Science & Technology

    2012-03-01

    Impact of global warming on monsoon variability in Pakistan. J. Anim. Pl. Sci., 21, no. 1, 107–110. Gillies, S., T. Murphree, and D. Meyer, 2012. … are generated by multiple regression models that relate globally distributed oceanic and atmospheric predictors to local predictands. The predictands are …

  13. A suite of global reconstructed precipitation products and their error estimate by multivariate regression using empirical orthogonal functions: 1850-present

    NASA Astrophysics Data System (ADS)

    Shen, S. S.

    2014-12-01

    This presentation describes a suite of global precipitation products reconstructed by a multivariate regression method using an empirical orthogonal function (EOF) expansion. The sampling errors of the reconstruction are estimated for each product datum entry. The maximum temporal coverage is 1850-present and the spatial coverage is quasi-global (75°S-75°N). The temporal resolution ranges from 5-day and monthly to seasonal and annual. The Global Precipitation Climatology Project (GPCP) precipitation data from 1979-2008 are used to calculate the EOFs. The Global Historical Climatology Network (GHCN) gridded data are used to calculate the regression coefficients for reconstructions. The sampling errors of the reconstruction are analyzed in detail for different EOF modes. Our reconstructed 1900-2011 time series of the global average annual precipitation shows a 0.024 (mm/day)/100a trend, which is very close to the trend derived from the mean of 25 models of the CMIP5 (Coupled Model Intercomparison Project Phase 5). Our reconstruction examples of 1983 El Niño precipitation and 1917 La Niña precipitation (Figure 1) demonstrate that the El Niño and La Niña precipitation patterns are well reflected in the first two EOFs. The validation of our reconstruction results with GPCP makes it possible to use the reconstruction as benchmark data for climate models. This will help the climate modeling community to improve model precipitation mechanisms and reduce the systematic difference between observed global precipitation, which hovers at around 2.7 mm/day for reconstructions and GPCP, and model precipitation, which ranges from 2.6-3.3 mm/day for CMIP5. Our precipitation products are publicly available online, including digital data, precipitation animations, computer codes, readme files, and the user manual.
This work is a joint effort between San Diego State University (Sam Shen, Nancy Tafolla, Barbara Sperberg, and Melanie Thorn) and University of Maryland (Phil Arkin, Tom Smith, Li Ren, and Li Dai) and supported in part by the U.S. National Science Foundation (Awards No. AGS-1015926 and AGS-1015957).
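
The EOF-regression reconstruction described above can be sketched compactly: compute EOFs from a densely sampled modern field (the role GPCP plays), regress sparse historical anomalies (the GHCN role) onto those EOFs at the observed locations, and use the fitted coefficients to fill the full grid. The names and the least-squares fitting step are illustrative simplifications:

```python
import numpy as np

def eof_reconstruct(full_field, sparse_obs, mask, n_modes=3):
    """EOF-based reconstruction sketch.

    full_field: (time, cells) array used only to compute the EOFs;
    sparse_obs: anomaly values at the observed cells;
    mask: boolean array over cells marking where observations exist.
    """
    anom = full_field - full_field.mean(axis=0)
    _, _, vt = np.linalg.svd(anom, full_matrices=False)
    eofs = vt[:n_modes]                        # (n_modes, cells)
    A = eofs[:, mask].T                        # observed cells x modes
    coef, *_ = np.linalg.lstsq(A, sparse_obs, rcond=None)
    return coef @ eofs                         # reconstructed full map
```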

  14. A Global Study of GPP focusing on Light Use Efficiency in a Random Forest Regression Model

    NASA Astrophysics Data System (ADS)

    Fang, W.; Wei, S.; Yi, C.; Hendrey, G. R.

    2016-12-01

    Light use efficiency (LUE) is at the core of mechanistic modeling of global gross primary production (GPP). However, most LUE estimates in global models are satellite-based and coarsely measured with emphasis on environmental variables. Others are from eddy covariance towers with much greater spatial and temporal data quality and emphasis on mechanistic processes, but in a limited number of sites. In this paper, we conducted a comprehensive global study of tower-based LUE from 237 FLUXNET towers, and scaled up LUEs from the in-situ tower level to the global biome level. We integrated key environmental and biological variables into the tower-based LUE estimates, at 0.5° × 0.5° grid-cell resolution, using a random forest regression (RFR) approach. We then developed an RFR-LUE-GPP model using the grid-cell LUE data, and compared it to a tower-LUE-GPP model built the conventional way, treating LUE as a series of biome-specific constants. In order to calibrate the LUE models, we developed a data-driven RFR-GPP model using a random forest regression method. Our results showed that LUE varies largely with latitude. We estimated a global area-weighted average LUE of 1.21 gC m-2 MJ-1 APAR, which led to an estimated global GPP of 102.9 Gt C/year from 2000 to 2005. The tower-LUE-GPP model tended to overestimate forest GPP in tropical and boreal regions. Large uncertainties exist in GPP estimates over sparsely vegetated areas covered by savannas and woody savannas around the middle to low latitudes (e.g., 20°S to 40°S and 5°N to 15°N) due to lack of available data. Model results were improved by incorporating Köppen climate types to represent climate/meteorological information in the machine learning modeling. This sheds new light on the recognized issues of climate dependence of the spring onset of photosynthesis and the challenges in accurately modeling the biome GPP of evergreen broadleaf forests (EBF).
The divergent responses of GPP to temperature and precipitation at mid-high latitudes and at mid-low latitudes echoed the necessity of modeling GPP separately by latitudes. This work provided a global distribution of LUE estimate, and developed a comprehensive algorithm modeling global terrestrial carbon with high spatial and temporal resolutions.
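
The random-forest-regression idea used above can be shown in miniature: bootstrap-resample the training data, fit a one-split regression stump to each resample, and average the predictions. A real RFR grows deep trees and also subsamples features at each split; this sketch only illustrates the bagging principle, with all names and split candidates as assumptions:

```python
import numpy as np

def bagged_stumps_predict(x, y, x_new, n_trees=200, seed=0):
    """Random-forest-style regression in miniature: average the
    predictions of one-split regression stumps fit to bootstrap
    resamples of the training data."""
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(x), len(x))         # bootstrap resample
        xb, yb = x[idx], y[idx]
        best_sse, best_rule = np.inf, None
        for s in np.quantile(xb, [0.25, 0.5, 0.75]):  # candidate split points
            left, right = yb[xb <= s], yb[xb > s]
            if len(left) == 0 or len(right) == 0:
                continue
            sse = left.var() * len(left) + right.var() * len(right)
            if sse < best_sse:
                best_sse, best_rule = sse, (s, left.mean(), right.mean())
        s, left_mean, right_mean = best_rule
        preds.append(left_mean if x_new <= s else right_mean)
    return float(np.mean(preds))
```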

  15. Gradient descent for robust kernel-based regression

    NASA Astrophysics Data System (ADS)

    Guo, Zheng-Chu; Hu, Ting; Shi, Lei

    2018-06-01

    In this paper, we study the gradient descent algorithm generated by a robust loss function over a reproducing kernel Hilbert space (RKHS). The loss function is defined by a windowing function G and a scale parameter σ, which can include a wide range of commonly used robust losses for regression. There is still a gap between the theoretical analysis and the optimization process of empirical risk minimization based on this loss: the estimator needs to be globally optimal in the theoretical analysis, while the optimization method cannot ensure the global optimality of its solutions. In this paper, we aim to fill this gap by developing a novel theoretical analysis of the performance of estimators generated by the gradient descent algorithm. We demonstrate that with an appropriately chosen scale parameter σ, the gradient update with early stopping rules can approximate the regression function. Our error analysis leads to convergence in the standard L2 norm and the strong RKHS norm, both of which are optimal in the minimax sense. We show that the scale parameter σ plays an important role in providing robustness as well as fast convergence. Numerical experiments on synthetic examples and a real data set also support our theoretical results.

  16. Quantification of the impact of a confounding variable on functional connectivity confirms anti-correlated networks in the resting-state.

    PubMed

    Carbonell, F; Bellec, P; Shmuel, A

    2014-02-01

    The effect of regressing out the global average signal (GAS) in resting state fMRI data has become a concern for interpreting functional connectivity analyses. It is not clear whether the reported anti-correlations between the Default Mode and the Dorsal Attention Networks are intrinsic to the brain, or are artificially created by regressing out the GAS. Here we introduce a concept, Impact of the Global Average on Functional Connectivity (IGAFC), for quantifying the sensitivity of seed-based correlation analyses to the regression of the GAS. This voxel-wise IGAFC index is defined as the product of two correlation coefficients: the correlation between the GAS and the fMRI time course of a voxel, times the correlation between the GAS and the seed time course. This definition enables the calculation of a threshold at which the impact of regressing-out the GAS would be large enough to introduce spurious negative correlations. It also yields a post-hoc impact correction procedure via thresholding, which eliminates spurious correlations introduced by regressing out the GAS. In addition, we introduce an Artificial Negative Correlation Index (ANCI), defined as the absolute difference between the IGAFC index and the impact threshold. The ANCI allows a graded confidence scale for ranking voxels according to their likelihood of showing artificial correlations. By applying this method, we observed regions in the Default Mode and Dorsal Attention Networks that were anti-correlated. These findings confirm that the previously reported negative correlations between the Dorsal Attention and Default Mode Networks are intrinsic to the brain and not the result of statistical manipulations. Our proposed quantification of the impact that a confound may have on functional connectivity can be generalized to global effect estimators other than the GAS. 
It can be readily applied to other confounds, such as systemic physiological or head movement interferences, in order to quantify their impact on functional connectivity in the resting state. © 2013.
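
The IGAFC index as defined above is simply a product of two correlation coefficients, which makes it easy to compute per voxel:

```python
import numpy as np

def igafc(gas, voxel_ts, seed_ts):
    """IGAFC index: correlation of the global average signal (GAS) with a
    voxel's time course, times its correlation with the seed time course.
    Large positive values flag voxels whose seed correlation could turn
    spuriously negative after global signal regression."""
    r_voxel = np.corrcoef(gas, voxel_ts)[0, 1]
    r_seed = np.corrcoef(gas, seed_ts)[0, 1]
    return r_voxel * r_seed
```

Comparing this index against the impact threshold then ranks voxels by their likelihood of showing artificial anticorrelations, per the ANCI construction described above.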

  17. Improved model of the retardance in citric acid coated ferrofluids using stepwise regression

    NASA Astrophysics Data System (ADS)

    Lin, J. F.; Qiu, X. R.

    2017-06-01

    Citric acid (CA) coated Fe3O4 ferrofluids (FFs) have been prepared for biomedical applications. The magneto-optical retardance of CA coated FFs was measured by a Stokes polarimeter. Optimization and multiple regression of the retardance in FFs were previously carried out using the Taguchi method and Microsoft Excel, and the F value of the regression model was large enough; however, the model built in Excel was not systematic. Instead, we adopted stepwise regression to model the retardance of CA coated FFs. From the results of stepwise regression in MATLAB, the developed model had highly predictive ability, owing to an F value of 2.55897e+7 and a correlation coefficient of one. The average absolute error of the predicted retardances relative to the measured retardances was just 0.0044%. Using the genetic algorithm (GA) in MATLAB, the optimal parametric combination was determined as [4.709 0.12 39.998 70.006], corresponding to the pH of the suspension, the molar ratio of CA to Fe3O4, the CA volume, and the coating temperature. The maximum retardance was found to be 31.712°, close to that obtained by the evolutionary solver in Excel, with a relative error of -0.013%. Above all, the stepwise regression method was successfully used to model the retardance of CA coated FFs, and the maximum global retardance was determined by the use of the GA.
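
The forward half of stepwise regression can be sketched as greedy residual-sum-of-squares reduction. MATLAB's stepwise tools additionally apply F-test entry and exit thresholds, which this simplified sketch omits:

```python
import numpy as np

def forward_stepwise(X, y, n_select):
    """Forward stepwise selection sketch: at each step, add the predictor
    whose inclusion most reduces the residual sum of squares."""
    chosen = []
    remaining = list(range(X.shape[1]))
    while len(chosen) < n_select:
        best, best_rss = None, np.inf
        for j in remaining:
            A = np.column_stack([np.ones(len(y)), X[:, chosen + [j]]])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            rss = np.sum((y - A @ beta) ** 2)
            if rss < best_rss:
                best, best_rss = j, rss
        chosen.append(best)
        remaining.remove(best)
    return chosen
```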

  18. The Colorectal Cancer Mortality-to-Incidence Ratio as an Indicator of Global Cancer Screening and Care

    PubMed Central

    Sunkara, Vasu; Hébert, James R.

    2015-01-01

    BACKGROUND Disparities in cancer screening, incidence, treatment, and survival are worsening globally. The mortality-to-incidence ratio (MIR) has been used previously to evaluate such disparities. METHODS The MIR for colorectal cancer is calculated for all Organisation for Economic Cooperation and Development (OECD) countries using the 2012 GLOBOCAN incidence and mortality statistics. Health system rankings were obtained from the World Health Organization. Two linear regression models were fit with the MIR as the dependent variable and health system ranking as the independent variable; one included all countries and one model had the “divergents” removed. RESULTS The regression model for all countries explained 24% of the total variance in the MIR. Nine countries were found to have regression-calculated MIRs that differed from the actual MIR by >20%. Countries with lower-than-expected MIRs were found to have strong national health systems characterized by formal colorectal cancer screening programs. Conversely, countries with higher-than-expected MIRs lack screening programs. When these divergent points were removed from the data set, the recalculated regression model explained 60% of the total variance in the MIR. CONCLUSIONS The MIR proved useful for identifying disparities in cancer screening and treatment internationally. It has potential as an indicator of the long-term success of cancer surveillance programs and may be extended to other cancer types for these purposes. PMID:25572676
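
The "divergents" step described above — flagging countries whose observed MIR departs from the regression-predicted value by more than 20% — can be sketched directly (variable names illustrative):

```python
import numpy as np

def divergent_countries(mir, ranking, threshold=0.20):
    """Flag countries whose observed mortality-to-incidence ratio differs
    from the value predicted by a linear regression on health system
    ranking by more than the given relative threshold."""
    slope, intercept = np.polyfit(ranking, mir, 1)
    predicted = slope * np.asarray(ranking) + intercept
    rel = np.abs(np.asarray(mir) - predicted) / predicted
    return np.flatnonzero(rel > threshold)
```

Refitting after removing the flagged points is what raised the explained variance from 24% to 60% in the study above.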

  19. A reconnaissance method for delineation of tracts for regional-scale mineral-resource assessment based on geologic-map data

    USGS Publications Warehouse

    Raines, G.L.; Mihalasky, M.J.

    2002-01-01

    The U.S. Geological Survey (USGS) is proposing to conduct a global mineral-resource assessment using geologic maps, significant deposits, and exploration history as minimal data requirements. Using a geologic map and locations of significant pluton-related deposits, the pluton-related-deposit tract maps from the USGS national mineral-resource assessment have been reproduced with GIS-based analysis and modeling techniques. Agreement, kappa, and Jaccard's C correlation statistics between the expert USGS and calculated tract maps of 87%, 40%, and 28%, respectively, have been achieved using a combination of weights-of-evidence and weighted logistic regression methods. Between the experts' and calculated maps, the ranking of states measured by total permissive area correlates at 84%. The disagreement between the experts and calculated results can be explained primarily by tracts defined by geophysical evidence not considered in the calculations, generalization of tracts by the experts, differences in map scales, and the experts' inclusion of large tracts that are arguably not permissive. This analysis shows that tracts for regional mineral-resource assessment approximating those delineated by USGS experts can be calculated using weights of evidence and weighted logistic regression, a geologic map, and the location of significant deposits. Weights of evidence and weighted logistic regression applied to a global geologic map could quickly provide a useful reconnaissance definition of tracts for mineral assessment that is tied to the data and is reproducible. © 2002 International Association for Mathematical Geology.
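
The weights-of-evidence calculation for a single binary evidence layer (e.g. a permissive geologic unit) compares deposit odds inside and outside the pattern. A minimal sketch of the standard W+/W- weights:

```python
import numpy as np

def weights_of_evidence(evidence, deposit):
    """Weights of evidence for one binary evidence layer over a grid of
    cells: W+ = ln[P(pattern|deposit) / P(pattern|no deposit)], and W- is
    the same quantity for the complement of the pattern.  The contrast
    C = W+ - W- summarizes the layer's predictive strength."""
    e, d = np.asarray(evidence, bool), np.asarray(deposit, bool)
    def w(mask):
        p_d = (mask & d).sum() / d.sum()         # P(pattern | deposit)
        p_nd = (mask & ~d).sum() / (~d).sum()    # P(pattern | no deposit)
        return np.log(p_d / p_nd)
    return w(e), w(~e)
```

Summing the weights of several (conditionally independent) layers onto each cell's prior log-odds yields the posterior favorability map used to delineate permissive tracts.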

  20. Spatial Autocorrelation of Cancer Incidence in Saudi Arabia

    PubMed Central

    Al-Ahmadi, Khalid; Al-Zahrani, Ali

    2013-01-01

    Little is known about the geographic distribution of common cancers in Saudi Arabia. We explored the spatial incidence patterns of common cancers in Saudi Arabia using spatial autocorrelation analyses, employing the global Moran’s I and Anselin’s local Moran’s I statistics to detect nonrandom incidence patterns. Global ordinary least squares (OLS) regression and local geographically-weighted regression (GWR) were applied to examine the spatial correlation of cancer incidences at the city level. Population-based records of cancers diagnosed between 1998 and 2004 were used. Male lung cancer and female breast cancer exhibited positive statistically significant global Moran’s I index values, indicating a tendency toward clustering. The Anselin’s local Moran’s I analyses revealed small significant clusters of lung cancer, prostate cancer and Hodgkin’s disease among males in the Eastern region and significant clusters of thyroid cancers in females in the Eastern and Riyadh regions. Additionally, both regression methods found significant associations among various cancers. For example, OLS and GWR revealed significant spatial associations among NHL, leukemia and Hodgkin’s disease (r² = 0.49–0.67 using OLS and r² = 0.52–0.68 using GWR) and between breast and prostate cancer (r² = 0.53 OLS and 0.57 GWR) in Saudi Arabian cities. These findings may help to generate etiologic hypotheses of cancer causation and identify spatial anomalies in cancer incidence in Saudi Arabia. Our findings should stimulate further research on the possible causes underlying these clusters and associations. PMID:24351742
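The global Moran's I statistic used above to detect clustering has a compact definition that can be sketched directly. The weight matrix below is a hypothetical binary contiguity matrix for four units on a line, not the paper's Saudi city data:

```python
# Minimal sketch of the global Moran's I statistic used to test for spatial
# clustering; W is an invented binary contiguity matrix, not real data.
from statistics import mean

def morans_i(values, weights):
    """Global Moran's I for a list of values and a weight matrix
    (list of lists); weights[i][j] > 0 when units i and j are neighbours."""
    n = len(values)
    xbar = mean(values)
    dev = [v - xbar for v in values]
    s0 = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / s0) * (num / den)

# Four units on a line with 0-1-2-3 adjacency; neighbours have similar
# values (high-high, low-low), so positive autocorrelation is expected.
W = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
incidence = [10.0, 9.0, 2.0, 1.0]
print(morans_i(incidence, W) > 0)
```

Positive values indicate clustering (as found for male lung and female breast cancer), negative values a checkerboard-like dispersion.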

  1. Globalization and eating disorder risk: Peer influence, perceived social norms, and adolescent disordered eating in Fiji

    PubMed Central

    Gerbasi, Margaret E.; Richards, Lauren K.; Thomas, Jennifer J.; Agnew-Blais, Jessica C.; Thompson-Brenner, Heather; Gilman, Stephen E.; Becker, Anne E.

    2014-01-01

    Objective The increasing global health burden imposed by eating disorders warrants close examination of social exposures associated with globalization that potentially elevate risk during the critical developmental period of adolescence in low- and middle-income countries (LMICs). The study aim was to investigate the association of peer influence and perceived social norms with adolescent eating pathology in Fiji, an LMIC undergoing rapid social change. Method We measured peer influence on eating concerns (with the Inventory of Peer Influence on Eating Concerns; IPIEC), perceived peer norms associated with disordered eating and body concerns, perceived community cultural norms, and individual cultural orientations in a representative sample of school-going ethnic Fijian adolescent girls (n=523). We then developed a multivariable linear regression model to examine their relation to eating pathology (measured by the Eating Disorder Examination-Questionnaire; EDE-Q). Results We found independent and statistically significant associations between both IPIEC scores and our proxy for perceived social norms specific to disordered eating (both p <.001) and EDE-Q global scores in a fully adjusted linear regression model. Discussion Study findings support the possibility that peer influence as well as perceived social norms relevant to disordered eating may elevate risk for disordered eating in Fiji during the critical developmental period of adolescence. Replication and extension of these research findings in other populations undergoing rapid social transition—and where globalization is also influencing local social norms—may enrich etiologic models and inform strategies to mitigate risk. PMID:25139374

  2. Predicting Global Fund grant disbursements for procurement of artemisinin-based combination therapies

    PubMed Central

    Cohen, Justin M; Singh, Inder; O'Brien, Megan E

    2008-01-01

    Background An accurate forecast of global demand is essential to stabilize the market for artemisinin-based combination therapy (ACT) and to ensure access to high-quality, life-saving medications at the lowest sustainable prices by avoiding underproduction and excessive overproduction, each of which can have negative consequences for the availability of affordable drugs. A robust forecast requires an understanding of the resources available to support procurement of these relatively expensive antimalarials, in particular from the Global Fund, at present the single largest source of ACT funding. Methods Predictive regression models estimating the timing and rate of disbursements from the Global Fund to recipient countries for each malaria grant were derived using a repeated split-sample procedure intended to avoid over-fitting. Predictions were compared against actual disbursements in a group of validation grants, and forecasts of ACT procurement extrapolated from disbursement predictions were evaluated against actual procurement in two sub-Saharan countries. Results Quarterly forecasts were correlated highly with actual smoothed disbursement rates (r = 0.987, p < 0.0001). Additionally, predicted ACT procurement, extrapolated from forecasted disbursements, was correlated strongly with actual ACT procurement supported by two grants from the Global Fund's first (r = 0.945, p < 0.0001) and fourth (r = 0.938, p < 0.0001) funding rounds. Conclusion This analysis derived predictive regression models that successfully forecasted disbursement patterning for individual Global Fund malaria grants. These results indicate the utility of this approach for demand forecasting of ACT and, potentially, for other commodities procured using funding from the Global Fund. Further validation using data from other countries in different regions and environments will be necessary to confirm its generalizability. PMID:18831742

  3. A new method for the production of social fragility functions and the result of its use in worldwide fatality loss estimation for earthquakes

    NASA Astrophysics Data System (ADS)

    Daniell, James; Wenzel, Friedemann

    2014-05-01

    A review of over 200 fatality models for earthquake loss estimation from various authors over the past 50 years has identified key parameters that influence fatality estimation in each of these models. These are often very specific and cannot be readily adapted globally. In the doctoral dissertation of the author, a new method is used for regression of fatalities to intensity using loss functions based not only on fatalities, but also on population models and other socioeconomic parameters created through time for every country worldwide for the period 1900-2013. A calibration of functions was undertaken from 1900-2008, and each individual quake was analysed from 2009-2013 in real-time, in conjunction with www.earthquake-report.com. Using the CATDAT Damaging Earthquakes Database containing socioeconomic loss information for 7208 damaging earthquake events from 1900-2013, including disaggregation of secondary effects, fatality estimates for over 2035 events have been re-examined from 1900-2013. In addition, 99 of these events have detailed data for individual cities and towns or have been reconstructed to create a death rate as a percentage of population. Many historical isoseismal maps and macroseismic intensity datapoint surveys collected globally have been digitised and modelled, covering around 1353 of these 2035 fatal events, to include an estimate of population, occupancy and socioeconomic climate at the time of the event at each intensity bracket. In addition, 1651 events without fatalities but causing damage have also been examined in this way. The production of socioeconomic and engineering indices such as HDI and building vulnerability has been undertaken at the country and state/province level, leading to a dataset allowing regressions not only using a static view of risk, but also accounting for the change in the socioeconomic climate between the earthquake events. 
This means that a 1920 event in a country is not simply regressed against a 2000 event, but is normalised to it. A global human development index (HDI) (life expectancy, education and income) was developed and collected for the first time from 1900-2013 globally on a country and province level, providing a very useful parameter for the regression. In addition, the occupancy rate at the time of day that the event occurred, as well as population density and individual earthquake attributes like the existence of a foreshock, were also examined for the 3004 events in the regression analysis. Where an event has not occurred in a country previously, a regionalisation strategy based on building typologies, seismic code index, building practice, climate, earthquake history and socioeconomic climate is proposed. The result is a set of "social fragility functions" calculating fatalities for use in any country worldwide using the parameters of macroseismic intensity, population, HDI, time of day and occupancy, providing a robust, accurate method calibrated not only to country-level data but also to town and city data through time. The estimates will continue to be used in conjunction with Earthquake Report, a non-profit worldwide earthquake reporting website, and have shown very promising results from 2010-2013 for rapid estimates of fatalities globally.

  4. Statistical downscaling of precipitation using long short-term memory recurrent neural networks

    NASA Astrophysics Data System (ADS)

    Misra, Saptarshi; Sarkar, Sudeshna; Mitra, Pabitra

    2017-11-01

    Hydrological impacts of global climate change on regional scale are generally assessed by downscaling large-scale climatic variables, simulated by General Circulation Models (GCMs), to regional, small-scale hydrometeorological variables like precipitation, temperature, etc. In this study, we propose a new statistical downscaling model based on a Recurrent Neural Network with Long Short-Term Memory which captures the spatio-temporal dependencies in local rainfall. Previous studies have used several other methods such as linear regression, quantile regression, kernel regression, beta regression, and artificial neural networks. Deep neural networks and recurrent neural networks have been shown to be highly promising in modeling complex and highly non-linear relationships between input and output variables in different domains and hence we investigated their performance in the task of statistical downscaling. We have tested this model on two datasets—one on precipitation in Mahanadi basin in India and the second on precipitation in Campbell River basin in Canada. Our autoencoder coupled long short-term memory recurrent neural network model performs the best compared to other existing methods on both the datasets with respect to temporal cross-correlation, mean squared error, and capturing the extremes.

  5. An open-access CMIP5 pattern library for temperature and precipitation: description and methodology

    NASA Astrophysics Data System (ADS)

    Lynch, Cary; Hartin, Corinne; Bond-Lamberty, Ben; Kravitz, Ben

    2017-05-01

    Pattern scaling is used to efficiently emulate general circulation models and explore uncertainty in climate projections under multiple forcing scenarios. Pattern scaling methods assume that local climate changes scale with a global mean temperature increase, allowing for spatial patterns to be generated for multiple models for any future emission scenario. For uncertainty quantification and probabilistic statistical analysis, a library of patterns with descriptive statistics for each file would be beneficial, but such a library does not presently exist. Of the possible techniques used to generate patterns, the two most prominent are the delta and least squares regression methods. We explore the differences and statistical significance between patterns generated by each method and assess performance of the generated patterns across methods and scenarios. Differences in patterns across seasons between methods and epochs were largest in high latitudes (60-90° N/S). Bias and mean errors between modeled and pattern-predicted output from the linear regression method were smaller than patterns generated by the delta method. Across scenarios, differences in the linear regression method patterns were more statistically significant, especially at high latitudes. We found that pattern generation methodologies were able to approximate the forced signal of change to within ≤ 0.5 °C, but the choice of pattern generation methodology for pattern scaling purposes should be informed by user goals and criteria. This paper describes our library of least squares regression patterns from all CMIP5 models for temperature and precipitation on an annual and sub-annual basis, along with the code used to generate these patterns. The dataset and netCDF data generation code are available at doi:10.5281/zenodo.495632.
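The two pattern-generation approaches compared above can be sketched on toy data: for each grid cell, the regression method fits local change against global mean temperature change, while the delta method differences two epochs and normalises by the global change. All numbers below are illustrative, not CMIP5 output:

```python
# Sketch of the delta and least-squares-regression pattern methods on a
# single toy grid cell; real patterns are computed per cell over a map.
from statistics import mean

def regression_pattern(global_t, local_t):
    """Per-cell slope of local temperature on global mean temperature."""
    gbar, lbar = mean(global_t), mean(local_t)
    return sum((g - gbar) * (l - lbar) for g, l in zip(global_t, local_t)) / \
           sum((g - gbar) ** 2 for g in global_t)

def delta_pattern(global_t, local_t, base, future):
    """(local epoch difference) / (global epoch difference), given index
    slices for the base and future epochs."""
    dl = mean(local_t[future]) - mean(local_t[base])
    dg = mean(global_t[future]) - mean(global_t[base])
    return dl / dg

# Toy series: a high-latitude cell warming twice as fast as the global mean.
global_t = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
local_t = [2 * g for g in global_t]
p_reg = regression_pattern(global_t, local_t)
p_delta = delta_pattern(global_t, local_t, slice(0, 2), slice(4, 6))
print(p_reg, p_delta)
```

On noise-free data the two methods agree exactly; the differences the paper reports arise from internal variability and forcing-dependent responses, which is why the regression patterns behave differently at high latitudes.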

  6. Assessment of parametric uncertainty for groundwater reactive transport modeling

    USGS Publications Warehouse

    Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun

    2014-01-01

    The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, predictive performance of formal generalized likelihood function is superior to that of least squares regression and Bayesian methods with Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive metropolis (DREAM(zs)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and Morris- and DREAM(ZS)-based global sensitivity analysis yield almost identical ranking of parameter importance. 
The uncertainty analysis may help select appropriate likelihood functions, improve model calibration, and reduce predictive uncertainty in other groundwater reactive transport and environmental modeling.

  7. Anti-correlated Networks, Global Signal Regression, and the Effects of Caffeine in Resting-State Functional MRI

    PubMed Central

    Wong, Chi Wah; Olafsson, Valur; Tal, Omer; Liu, Thomas T.

    2012-01-01

    Resting-state functional connectivity magnetic resonance imaging is proving to be an essential tool for the characterization of functional networks in the brain. Two of the major networks that have been identified are the default mode network (DMN) and the task positive network (TPN). Although prior work indicates that these two networks are anti-correlated, the findings are controversial because the anti-correlations are often found only after the application of a pre-processing step, known as global signal regression, that can produce artifactual anti-correlations. In this paper, we show that, for subjects studied in an eyes-closed rest state, caffeine can significantly enhance the detection of anti-correlations between the DMN and TPN without the need for global signal regression. In line with these findings, we find that caffeine also leads to widespread decreases in connectivity and global signal amplitude. Using a recently introduced geometric model of global signal effects, we demonstrate that these decreases are consistent with the removal of an additive global signal confound. In contrast to the effects observed in the eyes-closed rest state, caffeine did not lead to significant changes in global functional connectivity in the eyes-open rest state. PMID:22743194

  8. Evaluation of fuzzy inference systems using fuzzy least squares

    NASA Technical Reports Server (NTRS)

    Barone, Joseph M.

    1992-01-01

    Efforts to develop evaluation methods for fuzzy inference systems which are not based on crisp, quantitative data or processes (i.e., where the phenomenon the system is built to describe or control is inherently fuzzy) are just beginning. This paper suggests that the method of fuzzy least squares can be used to perform such evaluations. Regressing the desired outputs onto the inferred outputs can provide both global and local measures of success. The global measures have some value in an absolute sense, but they are particularly useful when competing solutions (e.g., different numbers of rules, different fuzzy input partitions) are being compared. The local measure described here can be used to identify specific areas of poor fit where special measures (e.g., the use of emphatic or suppressive rules) can be applied. Several examples are discussed which illustrate the applicability of the method as an evaluation tool.

  9. Outlier identification in urban soils and its implications for identification of potential contaminated land

    NASA Astrophysics Data System (ADS)

    Zhang, Chaosheng

    2010-05-01

    Outliers in urban soil geochemical databases may imply potential contaminated land. Different methodologies which can be easily implemented for the identification of global and spatial outliers were applied for Pb concentrations in urban soils of Galway City in Ireland. Due to its strongly skewed probability feature, a Box-Cox transformation was performed prior to further analyses. The graphic methods of histogram and box-and-whisker plot were effective in identification of global outliers at the original scale of the dataset. Spatial outliers could be identified by a local indicator of spatial association of local Moran's I, cross-validation of kriging, and a geographically weighted regression. The spatial locations of outliers were visualised using a geographical information system. Different methods showed generally consistent results, but differences existed. It is suggested that outliers identified by statistical methods should be confirmed and justified using scientific knowledge before they are properly dealt with.
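The global-outlier step described above can be sketched with stdlib tools: a Box-Cox transform (shown here only for the lambda = 0 case, i.e. a log transform) to reduce skew, followed by the box-and-whisker 1.5 × IQR fence rule. The Pb values are invented for illustration, not the Galway data:

```python
# Sketch of global-outlier identification: Box-Cox (lambda = 0 case) to
# tame a right-skewed distribution, then the boxplot 1.5 * IQR fences.
import math
import statistics

def boxcox(x, lam):
    """One-parameter Box-Cox transform of a positive value."""
    return math.log(x) if lam == 0 else (x ** lam - 1) / lam

def iqr_outliers(values):
    """Indices of values outside the 1.5 * IQR whiskers."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [i for i, v in enumerate(values) if v < lo or v > hi]

pb = [20, 25, 30, 28, 22, 26, 24, 27, 23, 2500]  # mg/kg, one suspect site
transformed = [boxcox(v, 0) for v in pb]
print(iqr_outliers(transformed))
```

Spatial outliers (values unusual relative to their neighbours rather than to the whole dataset) require the local Moran's I, kriging cross-validation, or GWR approaches the abstract lists.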

  10. Application of two regression-based methods to estimate the effects of partial harvest on forest structure using Landsat data

    Treesearch

    Sean P. Healey; Zhiqiang Yang; Warren B. Cohen; D. John Pierce

    2006-01-01

    Although partial harvests are common in many forest types globally, there has been little assessment of the potential to map the intensity of these harvests using Landsat data. We modeled basal area removal and percent cover change in a study area in central Washington (northwestern USA) using biennial Landsat imagery and reference data from historical aerial photos...

  11. [Statistical prediction methods in violence risk assessment and its application].

    PubMed

    Liu, Yuan-Yuan; Hu, Jun-Mei; Yang, Min; Li, Xiao-Song

    2013-06-01

    How to improve violence risk assessment is an urgent global problem, and statistical methods, as a necessary part of such assessment, have a substantial impact on its results. This study reviews the prediction methods used in violence risk assessment from a statistical point of view: logistic regression as an example of a multivariate statistical model, decision trees as an example of a data mining technique, and neural networks as an example of artificial intelligence technology. The study thereby provides material to support further research on violence risk assessment.

  12. The Global Signal in fMRI: Nuisance or Information?

    PubMed Central

    Nalci, Alican; Falahpour, Maryam

    2017-01-01

    The global signal is widely used as a regressor or normalization factor for removing the effects of global variations in the analysis of functional magnetic resonance imaging (fMRI) studies. However, there is considerable controversy over its use because of the potential bias that can be introduced when it is applied to the analysis of both task-related and resting-state fMRI studies. In this paper we take a closer look at the global signal, examining in detail the various sources that can contribute to the signal. For the most part, the global signal has been treated as a nuisance term, but there is growing evidence that it may also contain valuable information. We also examine the various ways that the global signal has been used in the analysis of fMRI data, including global signal regression, global signal subtraction, and global signal normalization. Furthermore, we describe new ways for understanding the effects of global signal regression and its relation to the other approaches. PMID:28213118
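The global signal regression step reviewed above amounts to removing, from each voxel's time course, the component explained by the mean signal across voxels. Below is a toy sketch on synthetic series (real pipelines operate on 4-D fMRI volumes):

```python
# Toy sketch of global signal regression: regress the across-voxel mean
# time course out of each voxel's series, keeping the OLS residuals.
from statistics import mean

def regress_out(signal, regressor):
    """OLS residual of `signal` after removing `regressor` (plus intercept)."""
    rbar, sbar = mean(regressor), mean(signal)
    beta = sum((r - rbar) * (s - sbar) for r, s in zip(regressor, signal)) / \
           sum((r - rbar) ** 2 for r in regressor)
    return [s - sbar - beta * (r - rbar) for r, s in zip(regressor, signal)]

# Two synthetic voxels sharing an additive global confound g.
g = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0]
voxel_a = [gi + 0.5 for gi in g]                        # mostly confound
voxel_b = [gi + x for gi, x in zip(g, [0, 0.2, 0.4, 0.2, 0, -0.2])]
global_signal = [(a + b) / 2 for a, b in zip(voxel_a, voxel_b)]
cleaned_a = regress_out(voxel_a, global_signal)
print(max(abs(v) for v in cleaned_a) < 0.2)  # voxel a is mostly confound
```

Because the regressor is itself built from the voxels, the residuals are forced to sum toward zero across the brain, which is the mechanism behind the artifactual anti-correlations the abstract describes.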

  13. Hierarchical Matching and Regression with Application to Photometric Redshift Estimation

    NASA Astrophysics Data System (ADS)

    Murtagh, Fionn

    2017-06-01

    This work emphasizes that heterogeneity, diversity, discontinuity, and discreteness in data are to be exploited in classification and regression problems. A global a priori model may not be desirable. For data analytics in cosmology, this is motivated by the variety of cosmological objects such as elliptical, spiral, active, and merging galaxies at a wide range of redshifts. Our aim is matching and similarity-based analytics that takes account of discrete relationships in the data. The information structure of the data is represented by a hierarchy or tree where the branch structure, rather than just the proximity, is important. The representation is related to p-adic number theory. The clustering or binning of the data values, related to the precision of the measurements, has a central role in this methodology. If used for regression, our approach is a method of cluster-wise regression, generalizing nearest neighbour regression. Both to exemplify this analytics approach, and to demonstrate computational benefits, we address the well-known photometric redshift or `photo-z' problem, seeking to match Sloan Digital Sky Survey (SDSS) spectroscopic and photometric redshifts.
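The clusterwise approach above generalizes nearest neighbour regression, which itself is simple to state: predict the mean target of the k nearest training points. The 1-D "photometry" and redshifts below are invented to illustrate the baseline, not SDSS data:

```python
# Plain k-nearest-neighbour regression, the baseline that the paper's
# hierarchical matching generalizes; all numbers are hypothetical.
def knn_regress(train_x, train_y, query, k=3):
    """Mean target of the k training points nearest to `query`."""
    nearest = sorted(range(len(train_x)),
                     key=lambda i: abs(train_x[i] - query))[:k]
    return sum(train_y[i] for i in nearest) / k

# Hypothetical photometric feature -> spectroscopic redshift pairs.
mags = [18.0, 18.5, 19.0, 19.5, 20.0, 20.5]
zs   = [0.10, 0.15, 0.20, 0.25, 0.30, 0.35]
print(round(knn_regress(mags, zs, 19.2), 2))
```

The hierarchical version replaces the flat distance sort with matching down a tree, so that only points in the same branch (cluster) contribute to the prediction.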

  14. Anti-correlated networks, global signal regression, and the effects of caffeine in resting-state functional MRI.

    PubMed

    Wong, Chi Wah; Olafsson, Valur; Tal, Omer; Liu, Thomas T

    2012-10-15

    Resting-state functional connectivity magnetic resonance imaging is proving to be an essential tool for the characterization of functional networks in the brain. Two of the major networks that have been identified are the default mode network (DMN) and the task positive network (TPN). Although prior work indicates that these two networks are anti-correlated, the findings are controversial because the anti-correlations are often found only after the application of a pre-processing step, known as global signal regression, that can produce artifactual anti-correlations. In this paper, we show that, for subjects studied in an eyes-closed rest state, caffeine can significantly enhance the detection of anti-correlations between the DMN and TPN without the need for global signal regression. In line with these findings, we find that caffeine also leads to widespread decreases in connectivity and global signal amplitude. Using a recently introduced geometric model of global signal effects, we demonstrate that these decreases are consistent with the removal of an additive global signal confound. In contrast to the effects observed in the eyes-closed rest state, caffeine did not lead to significant changes in global functional connectivity in the eyes-open rest state. Copyright © 2012 Elsevier Inc. All rights reserved.

  15. SqueezePoseNet: Image Based Pose Regression with Small Convolutional Neural Networks for Real Time UAS Navigation

    NASA Astrophysics Data System (ADS)

    Müller, M. S.; Urban, S.; Jutzi, B.

    2017-08-01

    The number of unmanned aerial vehicles (UAVs) is increasing since low-cost airborne systems are available for a wide range of users. The outdoor navigation of such vehicles is mostly based on global navigation satellite system (GNSS) methods to obtain the vehicle's trajectory. The drawback of satellite-based navigation is its failure under occlusions and multi-path interference. Besides this, local image-based solutions like Simultaneous Localization and Mapping (SLAM) and Visual Odometry (VO) can, for example, be used to support the GNSS solution by closing trajectory gaps, but they are computationally expensive. However, if the trajectory estimation is interrupted or not available, a re-localization is mandatory. In this paper we provide a novel method for GNSS-free and fast image-based pose regression in a known area by utilizing a small convolutional neural network (CNN). With on-board processing in mind, we employ a lightweight CNN called SqueezeNet and use transfer learning to adapt the network to pose regression. Our experiments show promising results for GNSS-free and fast localization.

  16. GLOBALLY ADAPTIVE QUANTILE REGRESSION WITH ULTRA-HIGH DIMENSIONAL DATA

    PubMed Central

    Zheng, Qi; Peng, Limin; He, Xuming

    2015-01-01

    Quantile regression has become a valuable tool to analyze heterogeneous covariate-response associations that are often encountered in practice. The development of quantile regression methodology for high dimensional covariates primarily focuses on examination of model sparsity at a single or multiple quantile levels, which are typically prespecified ad hoc by the users. The resulting models may be sensitive to the specific choices of the quantile levels, leading to difficulties in interpretation and erosion of confidence in the results. In this article, we propose a new penalization framework for quantile regression in the high dimensional setting. We employ adaptive L1 penalties, and more importantly, propose a uniform selector of the tuning parameter for a set of quantile levels to avoid some of the potential problems with model selection at individual quantile levels. Our proposed approach achieves consistent shrinkage of regression quantile estimates across a continuous range of quantile levels, enhancing the flexibility and robustness of the existing penalized quantile regression methods. Our theoretical results include the oracle rate of uniform convergence and weak convergence of the parameter estimators. We also use numerical studies to confirm our theoretical findings and illustrate the practical utility of our proposal. PMID:26604424
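The building block underneath all of this is the check (pinball) loss at a single quantile level tau; the paper's penalized, uniform-over-tau machinery goes far beyond this sketch, but the loss itself can be illustrated directly on invented numbers:

```python
# The check (pinball) loss of quantile regression, illustrated by the fact
# that minimising it over a constant recovers a sample quantile.
def check_loss(residual, tau):
    """rho_tau(u) = u * (tau - 1{u < 0})."""
    return residual * (tau - (1 if residual < 0 else 0))

def sample_quantile(y, tau):
    """Minimise the empirical check loss over candidate constants drawn
    from the data; the minimiser is a tau-th sample quantile."""
    return min(y, key=lambda c: sum(check_loss(yi - c, tau) for yi in y))

y = [1.0, 2.0, 3.0, 4.0, 100.0]
print(sample_quantile(y, 0.5))   # the median, robust to the outlier
```

Replacing the constant with a linear predictor x'beta and adding an adaptive L1 penalty on beta gives the penalized quantile regression the abstract describes; the uniform tuning-parameter selector couples these problems across a range of tau values.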

  17. A modified temporal criterion to meta-optimize the extended Kalman filter for land cover classification of remotely sensed time series

    NASA Astrophysics Data System (ADS)

    Salmon, B. P.; Kleynhans, W.; Olivier, J. C.; van den Bergh, F.; Wessels, K. J.

    2018-05-01

    Humans are transforming land cover at an ever-increasing rate. Accurate geographical maps of land cover, especially rural and urban settlements, are essential to planning sustainable development. Time series extracted from MODerate resolution Imaging Spectroradiometer (MODIS) land surface reflectance products have been used to differentiate land cover classes by analyzing the seasonal patterns in reflectance values. The proper fitting of a parametric model to these time series usually requires several adjustments to the regression method. To reduce the workload, the regression method's parameters are typically set globally for an entire geographical area. In this work we have modified a meta-optimization approach so that the regression method's parameters are extracted on a per-time-series basis. The standard deviation of the model parameters and the magnitude of the residuals are used as the scoring function. We successfully fitted a triply modulated model to the seasonal patterns of our study area using a non-linear extended Kalman filter (EKF). The approach uses temporal information, which significantly reduces the processing time and storage requirements to process each time series. It also derives reliability metrics for each time series individually. The features extracted using the proposed method are classified with a support vector machine and the performance of the method is compared to the original approach on our ground truth data.
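As a heavily simplified illustration of the filtering idea (the paper uses a non-linear EKF with a triply modulated sinusoidal model, which this sketch does not attempt to reproduce), a scalar Kalman filter can track the slowly varying level of a reflectance-like series; all numbers and noise settings below are hypothetical:

```python
# Hypothetical illustration only: a scalar Kalman filter with a random-walk
# state model, tracking the level of a noisy time series.
def kalman_track(observations, q=1e-3, r=0.1):
    """Return filtered state estimates.

    q: process noise variance, r: measurement noise variance."""
    x, p = observations[0], 1.0   # initial state estimate and variance
    estimates = []
    for z in observations:
        p = p + q                 # predict step (random-walk state)
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update with measurement z
        p = (1 - k) * p
        estimates.append(x)
    return estimates

noisy = [0.30, 0.34, 0.28, 0.33, 0.29, 0.31, 0.32, 0.30]
smoothed = kalman_track(noisy)
print(round(smoothed[-1], 2))
```

In the EKF case the state holds the parameters of the seasonal model (mean, amplitude, phase) and the predict/update steps linearize the sinusoidal observation function; the per-time-series tuning of `q` and `r` is what the meta-optimization sets.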

  18. Global-scale high-resolution (˜ 1 km) modelling of mean, maximum and minimum annual streamflow

    NASA Astrophysics Data System (ADS)

    Barbarossa, Valerio; Huijbregts, Mark; Hendriks, Jan; Beusen, Arthur; Clavreul, Julie; King, Henry; Schipper, Aafke

    2017-04-01

    Quantifying mean, maximum and minimum annual flow (AF) of rivers at ungauged sites is essential for a number of applications, including assessments of global water supply, ecosystem integrity and water footprints. AF metrics can be quantified with spatially explicit process-based models, which might be overly time-consuming and data-intensive for this purpose, or with empirical regression models that predict AF metrics based on climate and catchment characteristics. Yet, so far, regression models have mostly been developed at a regional scale and the extent to which they can be extrapolated to other regions is not known. We developed global-scale regression models that quantify mean, maximum and minimum AF as function of catchment area and catchment-averaged slope, elevation, and mean, maximum and minimum annual precipitation and air temperature. We then used these models to obtain global 30 arc-seconds (˜ 1 km) maps of mean, maximum and minimum AF for each year from 1960 through 2015, based on a newly developed hydrologically conditioned digital elevation model. We calibrated our regression models based on observations of discharge and catchment characteristics from about 4,000 catchments worldwide, ranging from 10⁰ to 10⁶ km² in size, and validated them against independent measurements as well as the output of a number of process-based global hydrological models (GHMs). The variance explained by our regression models ranged up to 90% and the performance of the models compared well with the performance of existing GHMs. Yet, our AF maps provide a level of spatial detail that cannot yet be achieved by current GHMs.
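The regression idea can be sketched on invented numbers: annual flow often scales roughly as a power law of catchment area, so a linear fit in log-log space recovers the exponent (the actual models also include slope, elevation, precipitation and temperature predictors):

```python
# Illustrative log-log regression of annual flow on catchment area; the
# toy catchments are constructed to obey flow = 0.01 * area^0.9 exactly.
import math
from statistics import mean

def loglog_fit(area_km2, flow_m3s):
    """Fit log(flow) = a * log(area) + b; return (a, b)."""
    x = [math.log(v) for v in area_km2]
    y = [math.log(v) for v in flow_m3s]
    xbar, ybar = mean(x), mean(y)
    a = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
        sum((xi - xbar) ** 2 for xi in x)
    return a, ybar - a * xbar

def predict(area, a, b):
    """Back-transform the log-space prediction to flow units."""
    return math.exp(a * math.log(area) + b)

areas = [10, 100, 1000, 10000, 100000]
flows = [0.01 * ar ** 0.9 for ar in areas]
a, b = loglog_fit(areas, flows)
print(round(a, 3))
```

Fitting in log space keeps the six-orders-of-magnitude range of catchment sizes from being dominated by the largest basins.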

  19. Random Forests for Global and Regional Crop Yield Predictions.

    PubMed

    Jeong, Jig Han; Resop, Jonathan P; Mueller, Nathaniel D; Fleisher, David H; Yun, Kyungdahm; Butler, Ethan E; Timlin, Dennis J; Shim, Kyo-Moon; Gerber, James S; Reddy, Vangimalla R; Kim, Soo-Hyung

    2016-01-01

    Accurate predictions of crop yield are critical for developing effective agricultural and food policies at the regional and global scales. We evaluated a machine-learning method, Random Forests (RF), for its ability to predict crop yield responses to climate and biophysical variables at global and regional scales in wheat, maize, and potato in comparison with multiple linear regressions (MLR) serving as a benchmark. We used crop yield data from various sources and regions for model training and testing: 1) gridded global wheat grain yield, 2) maize grain yield from US counties over thirty years, and 3) potato tuber and maize silage yield from the northeastern seaboard region. RF was found highly capable of predicting crop yields and outperformed MLR benchmarks in all performance statistics that were compared. For example, the root mean square errors (RMSE) ranged between 6 and 14% of the average observed yield with RF models in all test cases whereas these values ranged from 14% to 49% for MLR models. Our results show that RF is an effective and versatile machine-learning method for crop yield predictions at regional and global scales for its high accuracy and precision, ease of use, and utility in data analysis. RF may result in a loss of accuracy when predicting the extreme ends or responses beyond the boundaries of the training data.
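The headline comparison metric above, RMSE expressed as a percentage of the mean observed yield, is easy to sketch; the yields and predictions below are invented, not the study's data:

```python
# RMSE as a percentage of mean observed yield, the statistic used to
# compare Random Forests against the MLR benchmark; numbers are invented.
import math

def rmse_pct(observed, predicted):
    """Root mean square error as a percentage of the mean observation."""
    mse = sum((o - p) ** 2 for o, p in zip(observed, predicted)) / len(observed)
    return 100 * math.sqrt(mse) / (sum(observed) / len(observed))

obs = [5.0, 6.0, 7.0, 8.0]          # hypothetical yields, t/ha
pred_rf = [5.2, 5.9, 7.3, 7.8]      # hypothetical RF predictions
pred_mlr = [4.0, 6.5, 8.0, 9.0]     # hypothetical MLR predictions
print(round(rmse_pct(obs, pred_rf), 1), round(rmse_pct(obs, pred_mlr), 1))
```

Normalising by the mean yield makes the error comparable across crops and regions with very different absolute yield levels, which is what allows the 6-14% (RF) versus 14-49% (MLR) comparison.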

  20. Application of two regression-based methods to estimate the effects of partial harvest on forest structure using Landsat data.

    Treesearch

    S.P. Healey; Z. Yang; W.B. Cohen; D.J. Pierce

    2006-01-01

    Although partial harvests are common in many forest types globally, there has been little assessment of the potential to map the intensity of these harvests using Landsat data. We modeled basal area removal and percentage cover change in a study area in central Washington (northwestern USA) using biennial Landsat imagery and reference data from historical aerial photos...

  1. Global estimation of long-term persistence in annual river runoff

    NASA Astrophysics Data System (ADS)

    Markonis, Y.; Moustakis, Y.; Nasika, C.; Sychova, P.; Dimitriadis, P.; Hanel, M.; Máca, P.; Papalexiou, S. M.

    2018-03-01

    Long-term persistence (LTP) of annual river runoff is a topic of ongoing hydrological research, due to its implications for water resources management. Here, we estimate its strength, measured by the Hurst coefficient H, in 696 annual, globally distributed, streamflow records with at least 80 years of data. We use three estimation methods (maximum likelihood estimator, Whittle estimator and least squares variance) resulting in similar mean values of H close to 0.65. Subsequently, we explore potential factors influencing H by two linear (Spearman's rank correlation, multiple linear regression) and two non-linear (self-organizing maps, random forests) techniques. Catchment area is found to be crucial for medium to larger watersheds, while climatic controls, such as the aridity index, have a higher impact on smaller ones. Our findings indicate that long-term persistence is weaker than found in other studies, suggesting that enhanced LTP is encountered in large-catchment rivers, where the effect of spatial aggregation is more intense. However, we also show that the estimated values of H can be reproduced by a short-term persistence stochastic model such as an autoregressive AR(1) process. A direct consequence is that some of the most common methods for the estimation of the H coefficient might not be suitable for discriminating short- and long-term persistence, even in long observational records.
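
A least-squares variance (aggregated-variance) estimator of H, one of the three methods named above, can be sketched as follows. The record is synthetic white noise, for which H should come out near 0.5:

```python
import numpy as np

def hurst_aggvar(x, scales=(1, 2, 4, 8, 16, 32)):
    """Estimate the Hurst coefficient H by the aggregated-variance method:
    the variance of block means scales as m^(2H-2) with block size m."""
    variances = []
    for m in scales:
        n_blocks = len(x) // m
        means = x[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        variances.append(means.var())
    slope = np.polyfit(np.log(scales), np.log(variances), 1)[0]
    return 1.0 + slope / 2.0

rng = np.random.default_rng(0)
white = rng.normal(size=20000)   # no persistence: expect H close to 0.5
h = hurst_aggvar(white)
```

For independent data the block-mean variance falls as 1/m (slope -1), giving H = 0.5; persistent series decay more slowly, pushing H above 0.5.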

  2. Solar energy distribution over Egypt using cloudiness from Meteosat photos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mosalam Shaltout, M.A.; Hassen, A.H.

    1990-01-01

    In Egypt, there are 10 ground stations for measuring the global solar radiation, and five stations for measuring the diffuse solar radiation. Every day at noon, the Meteorological Authority in Cairo receives three photographs of cloudiness over Egypt from the Meteosat satellite, one in the visible band and two in the infra-red bands (10.5-12.5 µm and 5.7-7.1 µm). The monthly average cloudiness for 24 sites over Egypt is measured and calculated from Meteosat observations during the period 1985-1986. Correlation analysis between the cloudiness observed by Meteosat and global solar radiation measured at the ground stations is carried out. It is found that the correlation coefficients are about 0.90 for the simple linear regression, and increase for the second- and third-degree regressions. Also, the correlation coefficients for the cloudiness with the diffuse solar radiation are about 0.80 for the simple linear regression, and increase for the second- and third-degree regressions. Models and empirical relations for estimating the global and diffuse solar radiation from Meteosat cloudiness data over Egypt are deduced and tested. Seasonal maps for the global and diffuse radiation over Egypt are produced.
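
The comparison of simple linear against second- and third-degree regressions of radiation on cloudiness can be sketched with numpy.polyfit. All numbers are synthetic and illustrative, not the Egyptian station values:

```python
import numpy as np

def r2(y, yhat):
    """Coefficient of determination."""
    ss_res = np.sum((y - yhat)**2)
    ss_tot = np.sum((y - y.mean())**2)
    return 1 - ss_res / ss_tot

rng = np.random.default_rng(0)
cloudiness = rng.uniform(0, 1, 120)   # monthly cloud fraction (synthetic)
# Global radiation falling nonlinearly with cloud cover (illustrative coefficients)
radiation = 28 - 10 * cloudiness - 8 * cloudiness**2 + rng.normal(0, 0.8, 120)

# In-sample fit quality for polynomial degrees 1-3
scores = {d: r2(radiation, np.polyval(np.polyfit(cloudiness, radiation, d), cloudiness))
          for d in (1, 2, 3)}
```

Because the true relation here is curved, the quadratic fit improves markedly on the linear one, echoing the abstract's finding that correlation increases for higher-degree regressions.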

  3. Quality optimization of H.264/AVC video transmission over noisy environments using a sparse regression framework

    NASA Astrophysics Data System (ADS)

    Pandremmenou, K.; Tziortziotis, N.; Paluri, S.; Zhang, W.; Blekas, K.; Kondi, L. P.; Kumar, S.

    2015-03-01

    We propose the use of the Least Absolute Shrinkage and Selection Operator (LASSO) regression method in order to predict the Cumulative Mean Squared Error (CMSE) incurred by the loss of individual slices in video transmission. We extract a number of quality-relevant features from the H.264/AVC video sequences, which are given as input to the LASSO. This method not only keeps the subset of features with the strongest effects on video quality, but also produces accurate CMSE predictions. In particular, we study LASSO regression through two different architectures: the Global LASSO (G.LASSO) and the Local LASSO (L.LASSO). In G.LASSO, a single regression model is trained for all slice types together, while in L.LASSO, motivated by the fact that the values of some features depend closely on the considered slice type, each slice type has its own regression model, in an effort to improve LASSO's prediction capability. Based on the predicted CMSE values, we group the video slices into four priority classes. Additionally, we consider a video transmission scenario over a noisy channel, where Unequal Error Protection (UEP) is applied to all prioritized slices. The provided results demonstrate the efficiency of LASSO in estimating CMSE with high accuracy, using only a few features.
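
A minimal LASSO via cyclic coordinate descent shows the feature-selection behaviour the authors exploit: irrelevant features are shrunk exactly to zero. The data below are synthetic, not the H.264 feature set:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """LASSO by cyclic coordinate descent with soft thresholding.
    Minimizes (1/2n)||y - Xb||^2 + lam * ||b||_1 (columns of X standardized)."""
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]                  # partial residual
            rho = X[:, j] @ r / n
            z = (X[:, j]**2).mean()
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z  # soft threshold
    return b

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
X = (X - X.mean(0)) / X.std(0)
true_b = np.array([3.0, 0.0, 0.0, 2.0, 0.0, 0.0])   # only features 0 and 3 matter
y = X @ true_b + rng.normal(0, 0.5, 200)
b = lasso_cd(X, y, lam=0.2)
```

With a moderate penalty the two informative coefficients survive (slightly shrunk) while the noise features are driven to zero, which is exactly the "keep a subset of the strongest features" property described above.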

  4. Data Mining for Efficient and Accurate Large Scale Retrieval of Geophysical Parameters

    NASA Astrophysics Data System (ADS)

    Obradovic, Z.; Vucetic, S.; Peng, K.; Han, B.

    2004-12-01

    Our effort is devoted to developing data mining technology for improving the efficiency and accuracy of geophysical parameter retrievals by learning a mapping from observation attributes to the corresponding parameters within the framework of classification and regression. We will describe a method for efficient learning of neural network-based classification and regression models from high-volume data streams. The proposed procedure automatically learns a series of neural networks of different complexities on smaller data stream chunks and then properly combines them into an ensemble predictor through averaging. Based on the idea of progressive sampling, the proposed approach starts with a very simple network trained on a very small chunk and then gradually increases the model complexity and the chunk size until the learning performance no longer improves. Our empirical study on aerosol retrievals from data obtained with the MISR instrument mounted on the Terra satellite suggests that the proposed method is successful in learning complex concepts from large data streams with near-optimal computational effort. We will also report on a method that complements deterministic retrievals by constructing accurate predictive algorithms and applying them on appropriately selected subsets of observed data. The method is based on developing more accurate predictors aimed at capturing global and local properties synthesized in a region. The procedure starts by learning the global properties of data sampled over the entire space, and continues by constructing specialized models on selected localized regions. The global and local models are integrated through an automated procedure that determines the optimal trade-off between the two components with the objective of minimizing the overall mean square errors over a specific region. Our experimental results on MISR data showed that the combined model can increase the retrieval accuracy significantly.
The preliminary results on various large heterogeneous spatial-temporal datasets provide evidence that the benefits of the proposed methodology for efficient and accurate learning exist beyond the area of retrieval of geophysical parameters.
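
The progressive-sampling idea, growing the chunk size and model complexity until validation error stops improving and then averaging the ensemble, can be sketched with polynomial models standing in for the neural networks (an assumption made for brevity; all data synthetic):

```python
import numpy as np

def progressive_ensemble(x, y, x_val, y_val, tol=1e-3, seed=0):
    """Progressive sampling sketch: train models of growing complexity on
    growing data chunks, stop once validation error stops improving,
    and average the ensemble members' predictions."""
    rng = np.random.default_rng(seed)
    models, best_err = [], np.inf
    chunk, degree = 20, 1
    while chunk <= len(x):
        idx = rng.choice(len(x), size=chunk, replace=False)
        models.append(np.polyfit(x[idx], y[idx], degree))
        pred = np.mean([np.polyval(m, x_val) for m in models], axis=0)
        err = np.mean((y_val - pred)**2)
        if best_err - err < tol:        # no meaningful improvement: stop
            break
        best_err = err
        chunk, degree = chunk * 2, degree + 1
    return models, err

rng = np.random.default_rng(1)
x = rng.uniform(-2, 2, 400)
y = x**3 - x + rng.normal(0, 0.2, 400)
x_val = rng.uniform(-2, 2, 200)
y_val = x_val**3 - x_val + rng.normal(0, 0.2, 200)
models, val_mse = progressive_ensemble(x, y, x_val, y_val)
```

The stopping rule trades accuracy for computation in the same spirit as the paper's near-optimal-effort claim, though real implementations would use held-out streams and neural networks rather than polynomials.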

  5. Measuring carbon in forests: current status and future challenges.

    PubMed

    Brown, Sandra

    2002-01-01

    Accurate and precise measurement of carbon in forests is gaining global attention as countries seek to comply with agreements under the UN Framework Convention on Climate Change. Established methods for measuring carbon in forests exist, and are best based on permanent sample plots laid out in a statistically sound design. Measurements on trees in these plots can be readily converted to aboveground biomass using either biomass expansion factors or allometric regression equations. A compilation of existing root biomass data for upland forests of the world generated a significant regression equation that can be used to predict root biomass based on aboveground biomass only. Methods for measuring coarse dead wood have been tested in many forest types, but they could be improved if a non-destructive tool for measuring the density of dead wood were developed. Future measurements of carbon storage in forests may rely more on remote sensing data, and new remote data collection technologies are in development.
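
Allometric regressions of the kind described above are typically power laws fitted on log-transformed data. A sketch with synthetic, illustrative coefficients (not the compiled equation from the paper):

```python
import numpy as np

# Power-law allometry: root = a * aboveground^b, linearized by taking logs
rng = np.random.default_rng(0)
agb = rng.uniform(20, 400, 150)   # aboveground biomass, t/ha (synthetic plots)
root = 0.26 * agb**0.89 * np.exp(rng.normal(0, 0.15, 150))   # illustrative a, b

# OLS on the log-log scale recovers the exponent and the multiplier
b_exp, ln_a = np.polyfit(np.log(agb), np.log(root), 1)
predict_root = lambda x: np.exp(ln_a) * x**b_exp
```

Fitting on the log scale keeps the multiplicative error structure that biomass data usually show; back-transformed predictions come from predict_root.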

  6. Estimating Pneumonia Deaths of Post-Neonatal Children in Countries of Low or No Death Certification in 2008

    PubMed Central

    Theodoratou, Evropi; Zhang, Jian Shayne F.; Kolcic, Ivana; Davis, Andrew M.; Bhopal, Sunil; Nair, Harish; Chan, Kit Yee; Liu, Li; Johnson, Hope; Rudan, Igor; Campbell, Harry

    2011-01-01

    Background Pneumonia is the leading cause of child deaths globally. The aims of this study were to: a) estimate the number and global distribution of pneumonia deaths for children 1–59 months for 2008 for countries with low (<85%) or no coverage of death certification using single-cause regression models and b) compare these country estimates with recently published ones based on multi-cause regression models. Methods and Findings For 35 low child-mortality countries with <85% coverage of death certification, a regression model based on vital registration data of low child-mortality and >85% coverage of death certification countries was used. For 87 high child-mortality countries, pneumonia death estimates were obtained by applying a regression model developed from published and unpublished verbal autopsy data from high child-mortality settings. The total number of 1–59 months pneumonia deaths for the year 2008 for these 122 countries was estimated to be 1.18 M (95% CI 0.77 M–1.80 M), which represented 23.27% (95% CI 17.15%–32.75%) of all 1–59 month child deaths. The country-level estimation correlation coefficient between these two methods was 0.40. Interpretation Although the overall number of post-neonatal pneumonia deaths was similar irrespective of the method of estimation used, the country estimate correlation coefficient was low, and therefore country-specific estimates should be interpreted with caution. Pneumonia remains the leading cause of child deaths, and its burden is greatest in regions of poverty and high child mortality. Despite concerns about gender inequity linked with childhood mortality, we could not estimate sex-specific pneumonia mortality rates due to inadequate data. Life-saving interventions effective in preventing and treating pneumonia exist, but few children in high pneumonia disease-burden regions are able to access them.
Achieving the United Nations Millennium Development Goal 4 target of reducing child deaths by two-thirds by 2015 will require scaling up access to these effective pneumonia interventions. PMID:21966425

  7. Emotional textile image classification based on cross-domain convolutional sparse autoencoders with feature selection

    NASA Astrophysics Data System (ADS)

    Li, Zuhe; Fan, Yangyu; Liu, Weihua; Yu, Zeqi; Wang, Fengqin

    2017-01-01

    We aim to apply sparse autoencoder-based unsupervised feature learning to emotional semantic analysis for textile images. To tackle the problem of limited training data, we present a cross-domain feature learning scheme for emotional textile image classification using convolutional autoencoders. We further propose a correlation-analysis-based feature selection method for the weights learned by sparse autoencoders to reduce the number of features extracted from large size images. First, we randomly collect image patches on an unlabeled image dataset in the source domain and learn local features with a sparse autoencoder. We then conduct feature selection according to the correlation between different weight vectors corresponding to the autoencoder's hidden units. We finally adopt a convolutional neural network including a pooling layer to obtain global feature activations of textile images in the target domain and send these global feature vectors into logistic regression models for emotional image classification. The cross-domain unsupervised feature learning method achieves 65% to 78% average accuracy in the cross-validation experiments corresponding to eight emotional categories and performs better than conventional methods. Feature selection can reduce the computational cost of global feature extraction by about 50% while improving classification performance.
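
The correlation-based selection over autoencoder weight vectors can be sketched as a greedy filter that drops near-duplicate hidden units. The weights below are synthetic and the 0.95 threshold is an assumed value, not the paper's setting:

```python
import numpy as np

def select_uncorrelated(weights, threshold=0.95):
    """Greedily keep weight vectors whose absolute correlation with every
    already-kept vector stays below the threshold."""
    kept = []
    for i, w in enumerate(weights):
        if all(abs(np.corrcoef(w, weights[k])[0, 1]) < threshold for k in kept):
            kept.append(i)
    return kept

rng = np.random.default_rng(0)
base = rng.normal(size=(4, 64))                   # 4 distinct hidden-unit filters
dupes = base + rng.normal(0, 0.01, base.shape)    # near-duplicates of the same filters
weights = np.vstack([base, dupes])                # 8 vectors, 4 of them redundant
kept = select_uncorrelated(weights)
```

Dropping redundant filters before the convolutional feature-extraction stage is what yields the roughly 50% cut in computational cost reported above.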

  8. Phobic Anxiety and Plasma Levels of Global Oxidative Stress in Women

    PubMed Central

    Hagan, Kaitlin A.; Wu, Tianying; Rimm, Eric B.; Eliassen, A. Heather; Okereke, Olivia I.

    2015-01-01

    Background and Objectives Psychological distress has been hypothesized to be associated with adverse biologic states such as higher oxidative stress and inflammation. Yet, little is known about associations between a common form of distress – phobic anxiety – and global oxidative stress. Thus, we related phobic anxiety to plasma fluorescent oxidation products (FlOPs), a global oxidative stress marker. Methods We conducted a cross-sectional analysis among 1,325 women (aged 43-70 years) from the Nurses’ Health Study. Phobic anxiety was measured using the Crown-Crisp Index (CCI). Adjusted least-squares mean log-transformed FlOPs were calculated across phobic categories. Logistic regression models were used to calculate odds ratios (OR) comparing the highest CCI category (≥6 points) vs. lower scores, across FlOPs quartiles. Results No association was found between phobic anxiety categories and mean FlOP levels in multivariable adjusted linear models. Similarly, in multivariable logistic regression models there were no associations between FlOPs quartiles and likelihood of being in the highest phobic category. Comparing women in the highest vs. lowest FlOPs quartiles: FlOP_360: OR=0.68 (95% CI: 0.40-1.15); FlOP_320: OR=0.99 (95% CI: 0.61-1.61); FlOP_400: OR=0.92 (95% CI: 0.52, 1.63). Conclusions No cross-sectional association was found between phobic anxiety and a plasma measure of global oxidative stress in this sample of middle-aged and older women. PMID:26635425
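
The quartile-wise odds ratios reported above compare FlOP-quartile membership against high phobic score in 2x2 tables. An unadjusted Wald-interval sketch (the counts are invented for illustration, not the Nurses' Health Study data):

```python
import numpy as np

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Wald 95% confidence interval:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = np.sqrt(1/a + 1/b + 1/c + 1/d)      # SE of log(OR)
    lo, hi = np.exp(np.log(or_) + np.array([-z, z]) * se)
    return or_, lo, hi

# e.g. top FlOP quartile vs lower quartiles against high vs low phobic score
or_, lo, hi = odds_ratio_ci(30, 300, 32, 290)
```

An interval straddling 1, as with the hypothetical counts here, corresponds to the paper's null finding; the published ORs were additionally adjusted via multivariable logistic regression.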

  9. Sources and implications of whole-brain fMRI signals in humans

    PubMed Central

    Power, Jonathan D; Plitt, Mark; Laumann, Timothy O; Martin, Alex

    2016-01-01

    Whole-brain fMRI signals are a subject of intense interest: variance in the global fMRI signal (the spatial mean of all signals in the brain) indexes subject arousal, and psychiatric conditions such as schizophrenia and autism have been characterized by differences in the global fMRI signal. Further, vigorous debates exist on whether global signals ought to be removed from fMRI data. However, surprisingly little research has focused on the empirical properties of whole-brain fMRI signals. Here we map the spatial and temporal properties of the global signal, individually, in 1000+ fMRI scans. Variance in the global fMRI signal is strongly linked to head motion, to hardware artifacts, and to respiratory patterns and their attendant physiologic changes. Many techniques used to prepare fMRI data for analysis fail to remove these uninteresting kinds of global signal fluctuations. Thus, many studies include, at the time of analysis, prominent global effects of yawns, breathing changes, and head motion, among other signals. Such artifacts will mimic dynamic neural activity and will spuriously alter signal covariance throughout the brain. Methods capable of isolating and removing global artifactual variance while preserving putative “neural” variance are needed; this paper adopts no position on the topic of global signal regression. PMID:27751941
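
Global signal regression itself, the operation under debate, is a per-voxel OLS on the spatial-mean time course. A sketch on synthetic data with a shared respiratory-like drift:

```python
import numpy as np

def regress_out_global(data):
    """Remove the global signal (spatial mean time course) from every voxel
    time series by OLS; rows are voxels, columns are time points."""
    g = data.mean(axis=0)
    design = np.column_stack([np.ones(data.shape[1]), g])
    beta, *_ = np.linalg.lstsq(design, data.T, rcond=None)
    return data - (design @ beta).T    # residuals, orthogonal to g

rng = np.random.default_rng(0)
t = np.arange(240)
artifact = np.sin(t / 8.0)                            # shared respiratory-like drift
data = rng.normal(size=(50, 240)) + 2.0 * artifact    # 50 voxels, 240 time points
clean = regress_out_global(data)
corr = [np.corrcoef(v, data.mean(axis=0))[0, 1] for v in clean]
```

The residuals are exactly orthogonal to the global signal, which is why the procedure removes shared artifactual variance and, as the abstract notes, potentially shared neural variance along with it.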

  10. A new statistical approach to climate change detection and attribution

    NASA Astrophysics Data System (ADS)

    Ribes, Aurélien; Zwiers, Francis W.; Azaïs, Jean-Marc; Naveau, Philippe

    2017-01-01

    We propose here a new statistical approach to climate change detection and attribution that is based on additive decomposition and simple hypothesis testing. Most current statistical methods for detection and attribution rely on linear regression models where the observations are regressed onto expected response patterns to different external forcings. These methods do not use physical information provided by climate models regarding the expected response magnitudes to constrain the estimated responses to the forcings. Climate modelling uncertainty is difficult to take into account with regression based methods and is almost never treated explicitly. As an alternative to this approach, our statistical model is only based on the additivity assumption; the proposed method does not regress observations onto expected response patterns. We introduce estimation and testing procedures based on likelihood maximization, and show that climate modelling uncertainty can easily be accounted for. Some discussion is provided on how to practically estimate the climate modelling uncertainty based on an ensemble of opportunity. Our approach is based on the " models are statistically indistinguishable from the truth" paradigm, where the difference between any given model and the truth has the same distribution as the difference between any pair of models, but other choices might also be considered. The properties of this approach are illustrated and discussed based on synthetic data. Lastly, the method is applied to the linear trend in global mean temperature over the period 1951-2010. Consistent with the last IPCC assessment report, we find that most of the observed warming over this period (+0.65 K) is attributable to anthropogenic forcings (+0.67 ± 0.12 K, 90 % confidence range), with a very limited contribution from natural forcings (-0.01± 0.02 K).

  11. Global Efficiency of Structural Networks Mediates Cognitive Control in Mild Cognitive Impairment

    PubMed Central

    Berlot, Rok; Metzler-Baddeley, Claudia; Ikram, M. Arfan; Jones, Derek K.; O’Sullivan, Michael J.

    2016-01-01

    Background: Cognitive control has been linked to both the microstructure of individual tracts and the structure of whole-brain networks, but their relative contributions in health and disease remain unclear. Objective: To determine the contribution of both localized white matter tract damage and disruption of global network architecture to cognitive control, in older age and Mild Cognitive Impairment (MCI). Materials and Methods: Twenty-five patients with MCI and 20 age, sex, and intelligence-matched healthy volunteers were investigated with 3 Tesla structural magnetic resonance imaging (MRI). Cognitive control and episodic memory were evaluated with established tests. Structural network graphs were constructed from diffusion MRI-based whole-brain tractography. Their global measures were calculated using graph theory. Regression models utilized both global network metrics and microstructure of specific connections, known to be critical for each domain, to predict cognitive scores. Results: Global efficiency and the mean clustering coefficient of networks were reduced in MCI. Cognitive control was associated with global network topology. Episodic memory, in contrast, correlated with individual temporal tracts only. Relationships between cognitive control and network topology were attenuated by addition of single tract measures to regression models, consistent with a partial mediation effect. The mediation effect was stronger in MCI than healthy volunteers, explaining 23-36% of the effect of cingulum microstructure on cognitive control performance. Network clustering was a significant mediator in the relationship between tract microstructure and cognitive control in both groups. Conclusion: The status of critical connections and large-scale network topology are both important for maintenance of cognitive control in MCI. Mediation via large-scale networks is more important in patients with MCI than healthy volunteers. 
This effect is domain-specific, and true for cognitive control but not for episodic memory. Interventions to improve cognitive control will need to address both dysfunction of local circuitry and global network architecture to be maximally effective. PMID:28018208
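
The partial-mediation logic, comparing the total effect of tract microstructure with its direct effect once the network metric is added to the model, can be sketched with two OLS fits. The synthetic data are fully mediated, so the proportion mediated should approach 1 (the variable names are illustrative, not the study's measures):

```python
import numpy as np

def slope(x, y):
    """OLS slope of y on x (with intercept)."""
    return np.polyfit(x, y, 1)[0]

def slope_partial(x, m, y):
    """Coefficient of x when y is regressed on [1, x, m]."""
    A = np.column_stack([np.ones(len(x)), x, m])
    return np.linalg.lstsq(A, y, rcond=None)[0][1]

rng = np.random.default_rng(0)
tract = rng.normal(size=2000)                          # tract microstructure
network = 0.8 * tract + rng.normal(0, 0.5, 2000)       # global network metric (mediator)
cognition = 0.5 * network + rng.normal(0, 0.5, 2000)   # fully mediated outcome

total = slope(tract, cognition)                        # c  (total effect)
direct = slope_partial(tract, network, cognition)      # c' (direct effect)
prop_mediated = 1 - direct / total
```

The attenuation of c to c' after adding the mediator is the same pattern reported above, where network topology explained 23-36% of the cingulum effect in MCI; formal inference would use a bootstrap or Sobel test rather than this point estimate.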

  12. DQM: Decentralized Quadratically Approximated Alternating Direction Method of Multipliers

    NASA Astrophysics Data System (ADS)

    Mokhtari, Aryan; Shi, Wei; Ling, Qing; Ribeiro, Alejandro

    2016-10-01

    This paper considers decentralized consensus optimization problems where nodes of a network have access to different summands of a global objective function. Nodes cooperate to minimize the global objective by exchanging information with neighbors only. A decentralized version of the alternating direction method of multipliers (DADMM) is a common method for solving this category of problems. DADMM exhibits a linear convergence rate to the optimal objective, but its implementation requires solving a convex optimization problem at each iteration. This can be computationally costly and may result in large overall convergence times. The decentralized quadratically approximated ADMM algorithm (DQM), which minimizes a quadratic approximation of the objective function that DADMM minimizes at each iteration, is proposed here. The consequent reduction in computational time is shown to have minimal effect on convergence properties. Convergence still proceeds at a linear rate with a guaranteed constant that is asymptotically equivalent to the DADMM linear convergence rate constant. Numerical results demonstrate the advantages of DQM relative to DADMM and other alternatives in a logistic regression problem.
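
As a toy instance of the decentralized consensus setting (consensus averaging, not DQM or DADMM themselves), nodes on a ring can minimize the sum of their private terms (x - a_i)^2 by repeatedly averaging with their two neighbours only; the iterates converge to the global minimizer, the mean of the a_i:

```python
import numpy as np

def consensus_ring(a, n_iter=400):
    """Each node holds a private target a_i; the global objective
    sum_i (x - a_i)^2 is minimized at mean(a). Nodes only average
    with their two ring neighbours each iteration."""
    x = a.astype(float)
    for _ in range(n_iter):
        x = (x + np.roll(x, 1) + np.roll(x, -1)) / 3.0   # neighbour exchange
    return x

a = np.array([1.0, 5.0, 3.0, 9.0, 2.0, 4.0])   # private summand parameters
x = consensus_ring(a)
```

Because the averaging matrix is doubly stochastic with second-largest eigenvalue below one, every node's value converges geometrically to the network-wide mean, the same "linear rate via neighbour exchange" flavour the abstract describes for DADMM/DQM.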

  13. Outlier identification and visualization for Pb concentrations in urban soils and its implications for identification of potential contaminated land.

    PubMed

    Zhang, Chaosheng; Tang, Ya; Luo, Lin; Xu, Weilin

    2009-11-01

    Outliers in urban soil geochemical databases may imply potential contaminated land. Different methodologies which can be easily implemented for the identification of global and spatial outliers were applied for Pb concentrations in urban soils of Galway City in Ireland. Due to its strongly skewed probability feature, a Box-Cox transformation was performed prior to further analyses. The graphic methods of histogram and box-and-whisker plot were effective in identification of global outliers at the original scale of the dataset. Spatial outliers could be identified by a local indicator of spatial association of local Moran's I, cross-validation of kriging, and a geographically weighted regression. The spatial locations of outliers were visualised using a geographical information system. Different methods showed generally consistent results, but differences existed. It is suggested that outliers identified by statistical methods should be confirmed and justified using scientific knowledge before they are properly dealt with.
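
The global (box-and-whisker) and spatial outlier checks can be sketched as follows; the neighbour z-score used here is a simple stand-in for local Moran's I, and the Pb concentrations are simulated, not the Galway data:

```python
import numpy as np

def global_outliers_iqr(x, k=1.5):
    """Box-and-whisker rule: flag values beyond k*IQR from the quartiles."""
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    return (x < q1 - k * iqr) | (x > q3 + k * iqr)

def spatial_outliers(values, coords, n_neighbors=5, z_thresh=3.0):
    """Flag sites that deviate strongly from their nearest neighbours
    (a simplified stand-in for local Moran's I)."""
    flags = np.zeros(len(values), dtype=bool)
    for i in range(len(values)):
        d = np.linalg.norm(coords - coords[i], axis=1)
        nn = np.argsort(d)[1:n_neighbors + 1]      # nearest sites, excluding self
        local = values[nn]
        spread = local.std() or 1.0
        flags[i] = abs(values[i] - local.mean()) > z_thresh * spread
    return flags

rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(100, 2))          # site locations, km
pb = rng.lognormal(mean=3.5, sigma=0.3, size=100)   # background Pb, mg/kg
pb[7] = 2000.0                                      # planted contamination hot spot

g_flags = global_outliers_iqr(np.log(pb))           # log transform tames the skew
s_flags = spatial_outliers(np.log(pb), coords)
```

The log transform plays the role of the paper's Box-Cox step for strongly skewed concentrations; the planted hot spot is flagged by both the global and the spatial criterion.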

  14. Predicting carbon dioxide and energy fluxes across global FLUXNET sites with regression algorithms

    DOE PAGES

    Tramontana, Gianluca; Jung, Martin; Schwalm, Christopher R.; ...

    2016-07-29

    Spatio-temporal fields of land–atmosphere fluxes derived from data-driven models can complement simulations by process-based land surface models. While a number of strategies for empirical models with eddy-covariance flux data have been applied, a systematic intercomparison of these methods has been missing so far. In this study, we performed a cross-validation experiment for predicting carbon dioxide, latent heat, sensible heat and net radiation fluxes across different ecosystem types with 11 machine learning (ML) methods from four different classes (kernel methods, neural networks, tree methods, and regression splines). We applied two complementary setups: (1) 8-day average fluxes based on remotely sensed data and (2) daily mean fluxes based on meteorological data and a mean seasonal cycle of remotely sensed variables. The patterns of predictions from different ML and experimental setups were highly consistent. There were systematic differences in performance among the fluxes, with the following ascending order: net ecosystem exchange (R² < 0.5), ecosystem respiration (R² > 0.6), gross primary production (R² > 0.7), latent heat (R² > 0.7), sensible heat (R² > 0.7), and net radiation (R² > 0.8). The ML methods predicted the across-site variability and the mean seasonal cycle of the observed fluxes very well (R² > 0.7), while the 8-day deviations from the mean seasonal cycle were not well predicted (R² < 0.5). Fluxes were better predicted at forested and temperate climate sites than at sites in extreme climates or less represented by training data (e.g., the tropics). Finally, the evaluated large ensemble of ML-based models will be the basis of new global flux products.

  16. Spatial Estimation of Sub-Hour Global Horizontal Irradiance Based on Official Observations and Remote Sensors

    PubMed Central

    Gutierrez-Corea, Federico-Vladimir; Manso-Callejo, Miguel-Angel; Moreno-Regidor, María-Pilar; Velasco-Gómez, Jesús

    2014-01-01

    This study was motivated by the need to improve densification of Global Horizontal Irradiance (GHI) observations, increasing the number of surface weather stations that observe it, using sensors with a sub-hour periodicity and examining the methods of spatial GHI estimation (by interpolation) with that periodicity in other locations. The aim of the present research project is to analyze the goodness of 15-minute GHI spatial estimations for five methods in the territory of Spain (three geo-statistical interpolation methods, one deterministic method and the HelioSat2 method, which is based on satellite images). The research concludes that, when the work area has adequate station density, the best method for estimating GHI every 15 min is Regression Kriging interpolation using GHI estimated from satellite images as one of the input variables. On the contrary, when station density is low, the best method is estimating GHI directly from satellite images. A comparison between the GHI observed by volunteer stations and the estimation model applied concludes that 67% of the volunteer stations analyzed present values within the margin of error (average of ±2 standard deviations). PMID:24732102

  18. Data-based estimates of the ocean carbon sink variability - results of the Surface Ocean pCO2 Mapping intercomparison (SOCOM)

    NASA Astrophysics Data System (ADS)

    Rödenbeck, Christian; Bakker, Dorothee; Gruber, Nicolas; Iida, Yosuke; Jacobson, Andy; Jones, Steve; Landschützer, Peter; Metzl, Nicolas; Nakaoka, Shin-ichiro; Olsen, Are; Park, Geun-Ha; Peylin, Philippe; Rodgers, Keith; Sasse, Tristan; Schuster, Ute; Shutler, James; Valsala, Vinu; Wanninkhof, Rik; Zeng, Jiye

    2016-04-01

    Using measurements of the surface-ocean CO2 partial pressure (pCO2) from the SOCAT and LDEO databases and 14 different pCO2 mapping methods recently collated by the Surface Ocean pCO2 Mapping intercomparison (SOCOM) initiative, variations in regional and global sea-air CO2 fluxes are investigated. Though the available mapping methods use widely different approaches, we find relatively consistent estimates of regional pCO2 seasonality, in line with previous estimates. In terms of interannual variability (IAV), all mapping methods estimate the largest variations to occur in the eastern equatorial Pacific. Despite considerable spread in the detailed variations, mapping methods that fit the data more closely also tend to agree more closely with each other in regional averages. Encouragingly, this includes mapping methods belonging to complementary types - taking variability either directly from the pCO2 data or indirectly from driver data via regression. From a weighted ensemble average, we find an IAV amplitude of the global sea-air CO2 flux of 0.31 PgC yr-1 (standard deviation over 1992-2009), which is larger than simulated by biogeochemical process models. From a decadal perspective, the global ocean CO2 uptake is estimated to have gradually increased since about 2000, with little decadal change prior to that. The weighted mean net global ocean CO2 sink estimated by the SOCOM ensemble is -1.75 PgC yr-1 (1992-2009), consistent within uncertainties with estimates from ocean-interior carbon data or atmospheric oxygen trends. Using data-based sea-air CO2 fluxes in atmospheric CO2 inversions also helps to better constrain land-atmosphere CO2 fluxes.

  19. Data-based estimates of the ocean carbon sink variability - first results of the Surface Ocean pCO2 Mapping intercomparison (SOCOM)

    NASA Astrophysics Data System (ADS)

    Rödenbeck, C.; Bakker, D. C. E.; Gruber, N.; Iida, Y.; Jacobson, A. R.; Jones, S.; Landschützer, P.; Metzl, N.; Nakaoka, S.; Olsen, A.; Park, G.-H.; Peylin, P.; Rodgers, K. B.; Sasse, T. P.; Schuster, U.; Shutler, J. D.; Valsala, V.; Wanninkhof, R.; Zeng, J.

    2015-12-01

    Using measurements of the surface-ocean CO2 partial pressure (pCO2) and 14 different pCO2 mapping methods recently collated by the Surface Ocean pCO2 Mapping intercomparison (SOCOM) initiative, variations in regional and global sea-air CO2 fluxes are investigated. Though the available mapping methods use widely different approaches, we find relatively consistent estimates of regional pCO2 seasonality, in line with previous estimates. In terms of interannual variability (IAV), all mapping methods estimate the largest variations to occur in the eastern equatorial Pacific. Despite considerable spread in the detailed variations, mapping methods that fit the data more closely also tend to agree more closely with each other in regional averages. Encouragingly, this includes mapping methods belonging to complementary types - taking variability either directly from the pCO2 data or indirectly from driver data via regression. From a weighted ensemble average, we find an IAV amplitude of the global sea-air CO2 flux of 0.31 PgC yr-1 (standard deviation over 1992-2009), which is larger than simulated by biogeochemical process models. From a decadal perspective, the global ocean CO2 uptake is estimated to have gradually increased since about 2000, with little decadal change prior to that. The weighted mean net global ocean CO2 sink estimated by the SOCOM ensemble is -1.75 PgC yr-1 (1992-2009), consistent within uncertainties with estimates from ocean-interior carbon data or atmospheric oxygen trends.
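The two ensemble statistics quoted above, a weighted mean sink and an IAV amplitude taken as the standard deviation of annual global fluxes, can be sketched in a few lines. The member estimates, weights, and annual fluxes below are made-up illustrative numbers, not SOCOM values:

```python
import math

def weighted_mean(values, weights):
    """Weighted ensemble average of member estimates."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def iav_amplitude(annual_fluxes):
    """IAV amplitude as the standard deviation of annual global fluxes."""
    n = len(annual_fluxes)
    mean = sum(annual_fluxes) / n
    return math.sqrt(sum((f - mean) ** 2 for f in annual_fluxes) / n)

# Hypothetical sea-air CO2 flux estimates (PgC/yr, negative = ocean uptake)
# from three mapping methods, weighted e.g. by how closely each fits the data.
member_sinks = [-1.6, -1.8, -1.9]
weights = [0.2, 0.5, 0.3]
ensemble_sink = weighted_mean(member_sinks, weights)

# Hypothetical ensemble-mean annual global fluxes over a short period.
annual = [-1.5, -1.9, -1.7, -2.0, -1.6]
amplitude = iav_amplitude(annual)
```

The same two functions cover both quoted numbers: the mean gives the net sink, the standard deviation of the annual series gives the IAV amplitude.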

  20. SoilGrids250m: Global gridded soil information based on machine learning.

    PubMed

    Hengl, Tomislav; Mendes de Jesus, Jorge; Heuvelink, Gerard B M; Ruiperez Gonzalez, Maria; Kilibarda, Milan; Blagotić, Aleksandar; Shangguan, Wei; Wright, Marvin N; Geng, Xiaoyuan; Bauer-Marschallinger, Bernhard; Guevara, Mario Antonio; Vargas, Rodrigo; MacMillan, Robert A; Batjes, Niels H; Leenaars, Johan G B; Ribeiro, Eloi; Wheeler, Ichsani; Mantel, Stephan; Kempen, Bas

    2017-01-01

    This paper describes the technical development and accuracy assessment of the most recent and improved version of the SoilGrids system at 250m resolution (June 2016 update). SoilGrids provides global predictions for standard numeric soil properties (organic carbon, bulk density, Cation Exchange Capacity (CEC), pH, soil texture fractions and coarse fragments) at seven standard depths (0, 5, 15, 30, 60, 100 and 200 cm), in addition to predictions of depth to bedrock and distribution of soil classes based on the World Reference Base (WRB) and USDA classification systems (ca. 280 raster layers in total). Predictions were based on ca. 150,000 soil profiles used for training and a stack of 158 remote sensing-based soil covariates (primarily derived from MODIS land products, SRTM DEM derivatives, climatic images and global landform and lithology maps), which were used to fit an ensemble of machine learning methods-random forest and gradient boosting and/or multinomial logistic regression-as implemented in the R packages ranger, xgboost, nnet and caret. The results of 10-fold cross-validation show that the ensemble models explain between 56% (coarse fragments) and 83% (pH) of variation with an overall average of 61%. Improvements in the relative accuracy considering the amount of variation explained, in comparison to the previous version of SoilGrids at 1 km spatial resolution, range from 60 to 230%. Improvements can be attributed to: (1) the use of machine learning instead of linear regression, (2) to considerable investments in preparing finer resolution covariate layers and (3) to insertion of additional soil profiles. Further development of SoilGrids could include refinement of methods to incorporate input uncertainties and derivation of posterior probability distributions (per pixel), and further automation of spatial modeling so that soil maps can be generated for potentially hundreds of soil variables. 
Another area of future research is the development of methods for multiscale merging of SoilGrids predictions with local and/or national gridded soil products (e.g. up to 50 m spatial resolution) so that increasingly more accurate, complete and consistent global soil information can be produced. SoilGrids is available under the Open Database License.
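The "variation explained" reported from the 10-fold cross-validation above is the familiar R-squared of predictions against held-out observations. A minimal sketch, using made-up soil pH values rather than SoilGrids data:

```python
def variation_explained(observed, predicted):
    """R^2 = 1 - SSE/SST: the share of variance in the observations
    captured by the model's predictions."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / sst

# Hypothetical held-out soil pH observations vs. ensemble predictions.
obs = [5.1, 6.4, 7.2, 4.8, 6.9]
pred = [5.3, 6.1, 7.0, 5.0, 6.6]
r2 = variation_explained(obs, pred)
```

In a 10-fold setting this statistic is computed on each held-out fold and averaged, which is how figures such as "83% of variation explained for pH" arise.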

  1. Developing global regression models for metabolite concentration prediction regardless of cell line.

    PubMed

    André, Silvère; Lagresle, Sylvain; Da Sliva, Anthony; Heimendinger, Pierre; Hannas, Zahia; Calvosa, Éric; Duponchel, Ludovic

    2017-11-01

    Following the Process Analytical Technology (PAT) initiative of the Food and Drug Administration (FDA), drug manufacturers are encouraged to develop innovative techniques in order to monitor and understand their processes better. Within this framework, it has been demonstrated that Raman spectroscopy coupled with chemometric tools makes it possible to predict critical parameters of mammalian cell cultures in-line and in real time. However, the development of robust and predictive regression models clearly requires many batches in order to take inter-batch variability into account and enhance model accuracy. Nevertheless, this heavy procedure has to be repeated for every new cell culture line, involving many resources. This is why we propose in this paper to develop global regression models taking different cell lines into account. Such models are finally transferred to any culture of the cells involved. This article first demonstrates the feasibility of developing regression models, not only for mammalian cell lines (CHO and HeLa cell cultures), but also for insect cell lines (Sf9 cell cultures). Then global regression models are generated, based on CHO cells, HeLa cells, and Sf9 cells. Finally, these models are evaluated on a fourth cell line (HEK cells). In addition to suitable predictions of glucose and lactate concentration in HEK cell cultures, we show that by adding a single HEK-cell culture to the calibration set, the predictive ability of the regression models is substantially increased. In this way, we demonstrate that using global models, it is not necessary to consider many cultures of a new cell line in order to obtain accurate models. Biotechnol. Bioeng. 2017;114: 2550-2559. © 2017 Wiley Periodicals, Inc.
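The idea of a global calibration, pooling several cell lines and then augmenting the set with a single culture of a new line, can be illustrated with a toy one-predictor least-squares model. The intensity/concentration pairs below are invented, not the paper's Raman data:

```python
def fit_line(x, y):
    """Ordinary least squares for one predictor: returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx
    return b, my - b * mx

# Hypothetical calibration pairs (Raman band intensity, glucose g/L)
# pooled across three cell lines (CHO, HeLa, Sf9 in the paper's setup).
pooled_x = [0.10, 0.25, 0.40, 0.12, 0.30, 0.45, 0.08, 0.22, 0.38]
pooled_y = [1.0, 2.4, 3.9, 1.2, 2.9, 4.4, 0.9, 2.2, 3.8]

# Augment the global calibration set with a single batch of a new line.
new_batch_x = [0.15, 0.35]
new_batch_y = [1.5, 3.5]
slope, intercept = fit_line(pooled_x + new_batch_x, pooled_y + new_batch_y)
predicted = slope * 0.20 + intercept   # predict glucose for a new spectrum
```

Real Raman calibrations are multivariate (e.g. PLS over full spectra); the point of the sketch is only that one pooled model plus a single new-line batch serves all lines.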

  2. Newer classification and regression tree techniques: Bagging and Random Forests for ecological prediction

    Treesearch

    Anantha M. Prasad; Louis R. Iverson; Andy Liaw

    2006-01-01

    We evaluated four statistical models - Regression Tree Analysis (RTA), Bagging Trees (BT), Random Forests (RF), and Multivariate Adaptive Regression Splines (MARS) - for predictive vegetation mapping under current and future climate scenarios according to the Canadian Climate Centre global circulation model.

  3. Time series modeling by a regression approach based on a latent process.

    PubMed

    Chamroukhi, Faicel; Samé, Allou; Govaert, Gérard; Aknin, Patrice

    2009-01-01

    Time series are used in many domains, including finance, engineering, economics and bioinformatics, generally to represent the change of a measurement over time. Modeling techniques may then be used to give a synthetic representation of such data. A new approach for time series modeling is proposed in this paper. It consists of a regression model incorporating a discrete hidden logistic process that allows different polynomial regression models to be activated smoothly or abruptly. The model parameters are estimated by the maximum likelihood method, performed by a dedicated Expectation Maximization (EM) algorithm. The M step of the EM algorithm uses a multi-class Iterative Reweighted Least-Squares (IRLS) algorithm to estimate the hidden process parameters. To evaluate the proposed approach, an experimental study on simulated data and real-world data was performed using two alternative approaches: a heteroskedastic piecewise regression model using a global optimization algorithm based on dynamic programming, and a Hidden Markov Regression Model whose parameters are estimated by the Baum-Welch algorithm. Finally, in the context of the remote monitoring of components of the French railway infrastructure, and more particularly the switch mechanism, the proposed approach has been applied to modeling and classifying time series representing the condition measurements acquired during switch operations.
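The core of such a model is a prediction that mixes several polynomial regimes through logistic (softmax) gates over time: steep gates give abrupt regime switches, shallow gates give smooth ones. A minimal sketch of the mixture prediction with two invented regimes (not the paper's estimation procedure, which fits all parameters by EM/IRLS):

```python
import math

def softmax(logits):
    """Numerically stable softmax."""
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    s = sum(exps)
    return [e / s for e in exps]

def gated_prediction(t, gates, polys):
    """Mixture prediction: logistic gates over time t weight the
    polynomial regimes; gate steepness controls how abrupt the switch is."""
    weights = softmax([a * t + b for a, b in gates])
    values = [sum(c * t ** i for i, c in enumerate(coeffs)) for coeffs in polys]
    return sum(w * v for w, v in zip(weights, values))

# Two hypothetical regimes: a flat segment, then a rising line, with a
# steep gate around t = 5 (large gate slopes give an abrupt transition).
gates = [(-10.0, 50.0), (10.0, -50.0)]   # gate logits: -10t+50 vs 10t-50
polys = [[1.0], [-4.0, 1.0]]             # p0(t) = 1,  p1(t) = t - 4

early = gated_prediction(1.0, gates, polys)   # dominated by regime 0
late = gated_prediction(9.0, gates, polys)    # dominated by regime 1
```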

  4. Factors influencing global antiretroviral procurement prices

    PubMed Central

    2009-01-01

    Background Antiretroviral medicines (ARVs) are one of the most costly parts of HIV/AIDS treatment. Many countries are struggling to provide universal access to ARVs for all people living with HIV and AIDS. Although substantial price reductions of ARVs have occurred, especially between 2002 and 2008, achieving sustainable access for the next several decades remains a major challenge for most low- and middle-income countries. The objectives of the present study were twofold: first, to analyze global ARV prices between 2005 and 2008 and associated factors, particularly procurement methods and key donor policies on ARV procurement efficiency; second, to discuss the options of procurement processes and policies that should be considered when implementing or reforming access to ARV programs. Methods A price analysis of ARV medicines was carried out using the Global Price Reporting Mechanism of the World Health Organization. For a selection of 12 ARVs, global median prices and price variation were calculated. Linear regression models for each ARV were used to identify factors that were associated with lower procurement prices. Logistic regression models were used to identify the characteristics of those countries which procure below the highest and lowest direct manufactured costs. Results Three key factors appear to have an influence on a country's ARV prices: (a) whether the product is generic or not; (b) the socioeconomic status of the country; (c) whether the country is a member of the Clinton HIV/AIDS Initiative. Factors which did not influence procurement below the highest direct manufactured costs were HIV prevalence, procurement volume, whether the country is among the least developed countries, and whether it is a focus country of the United States President's Emergency Plan for AIDS Relief. Conclusion One of the principal mechanisms that can help to lower prices for ARVs over the next several decades is increasing procurement efficiency. 
Benchmarking prices could be one useful tool to achieve this. PMID:19922690

  5. Role of Aedes aegypti (Linnaeus) and Aedes albopictus (Skuse) in local dengue epidemics in Taiwan.

    PubMed

    Tsai, Pui-Jen; Teng, Hwa-Jen

    2016-11-09

    Aedes mosquitoes in Taiwan mainly comprise Aedes albopictus and Ae. aegypti. However, the species contributing to autochthonous dengue spread and the extent to which it occurs remain unclear. Thus, in this study, we spatially analyzed real data to determine spatial features related to local dengue incidence and mosquito density, particularly that of Ae. albopictus and Ae. aegypti. We used the bivariate Moran's I statistic and geographically weighted regression (GWR) spatial methods to analyze the global spatial dependence and locally regressed relationship between (1) imported dengue incidences and Breteau indices (BIs) of Ae. albopictus, (2) imported dengue incidences and BI of Ae. aegypti, (3) autochthonous dengue incidences and BI of Ae. albopictus, (4) autochthonous dengue incidences and BI of Ae. aegypti, (5) all dengue incidences and BI of Ae. albopictus, (6) all dengue incidences and BI of Ae. aegypti, (7) BI of Ae. albopictus and human population density, and (8) BI of Ae. aegypti and human population density in 348 townships in Taiwan. In the GWR models, regression coefficients of spatially regressed relationships between the incidence of autochthonous dengue and vector density of Ae. aegypti were significant and positive in most townships in Taiwan. However, Ae. albopictus had significant but negative regression coefficients in clusters of dengue epidemics. In the global bivariate Moran's index, spatial dependence between the incidence of autochthonous dengue and vector density of Ae. aegypti was significant and exhibited positive correlation in Taiwan (bivariate Moran's index = 0.51). However, Ae. albopictus exhibited positively significant but low correlation (bivariate Moran's index = 0.06). Similar results were observed in the two spatial methods between all dengue incidences and Aedes mosquitoes (Ae. aegypti and Ae. albopictus). The regression coefficients of spatially regressed relationships between imported dengue cases and Aedes mosquitoes (Ae. aegypti and Ae. albopictus) were significant in 348 townships in Taiwan. The results indicated that local Aedes mosquitoes do not contribute to the dengue incidence of imported cases. The density of Ae. aegypti positively correlated with the density of human population. By contrast, the density of Ae. albopictus negatively correlated with the density of human population in the areas of southern Taiwan. The results indicated that Ae. aegypti has more opportunities for human-mosquito contact in dengue endemic areas in southern Taiwan. Ae. aegypti, but not Ae. albopictus, and human population density in southern Taiwan are closely associated with an increased risk of autochthonous dengue incidence.
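A global bivariate Moran's index like the ones quoted above (0.51 and 0.06) measures spatial dependence between one variable at a location and another variable at its neighbours. A minimal sketch of one common formulation (definitions vary across software packages), on a toy four-township chain rather than the 348-township data:

```python
import math

def bivariate_morans_i(x, y, w):
    """Global bivariate Moran's I between variables x and y given a
    spatial weights matrix w (one common formulation; definitions vary)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    zx = [v - mx for v in x]
    zy = [v - my for v in y]
    w_sum = sum(sum(row) for row in w)
    num = sum(w[i][j] * zx[i] * zy[j] for i in range(n) for j in range(n))
    den = math.sqrt(sum(v * v for v in zx) * sum(v * v for v in zy))
    return (n / w_sum) * num / den

# Four hypothetical townships on a line with binary adjacency weights.
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
incidence = [1.0, 2.0, 3.0, 4.0]
density = [1.0, 2.0, 3.0, 4.0]
moran_bv = bivariate_morans_i(incidence, density, w)
```

With identical inputs the bivariate index reduces to the univariate Moran's I of that variable, which is a handy sanity check.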

  6. 99mTc-d,l-HMPAO and SPECT of the brain in normal aging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waldemar, G.; Hasselbalch, S.G.; Andersen, A.R.

    1991-05-01

    Single photon emission computed tomography (SPECT) with 99mTc-d,l-hexamethylpropyleneamine oxime (99mTc-d,l-HMPAO) was used to determine global and regional CBF in 53 healthy subjects aged 21-83 years. For the whole group, global CBF normalized to the cerebellum was 86.4% +/- 8.4 (SD). The contribution of age, sex, and atrophy to variations in global CBF was studied using stepwise multiple regression analysis. There was a significant negative correlation of global CBF with subjective ratings of cortical atrophy, but not with ratings of ventricular size, Evans ratio, sex, or age. In a subgroup of 33 subjects, in whom volumetric measurements of atrophy were performed, cortical atrophy was the only significant determinant for global CBF, accounting for 27% of its variance. Mean global CBF as measured with the 133Xe inhalation technique and SPECT was 54 +/- 9 ml/100 g/min and did not correlate significantly with age. There was a preferential decline of CBF in the frontal cortex with advancing age. The side-to-side asymmetry of several regions of interest increased with age. A method was described for estimation of subcortical CBF, which decreased with advancing cortical atrophy. The relative area of the subcortical low-flow region increased with age. These results are useful in distinguishing the effects of age and simple atrophy from disease effects, when the 99mTc-d,l-HMPAO method is used.

  7. Assessment of the relative merits of a few methods to detect evolutionary trends.

    PubMed

    Laurin, Michel

    2010-12-01

    Some of the most basic questions about the history of life concern evolutionary trends. These include determining whether or not metazoans have become more complex over time, whether or not body size tends to increase over time (the Cope-Depéret rule), or whether or not brain size has increased over time in various taxa, such as mammals and birds. Despite the proliferation of studies on such topics, assessment of the reliability of results in this field is hampered by the variability of techniques used and the lack of statistical validation of these methods. To solve this problem, simulations are performed using a variety of evolutionary models (gradual Brownian motion, speciational Brownian motion, and Ornstein-Uhlenbeck), with or without a drift of variable amplitude, with variable variance of tips, and with bounds placed close or far from the starting values and final means of simulated characters. These are used to assess the relative merits (power, Type I error rate, bias, and mean absolute value of error on slope estimate) of several statistical methods that have recently been used to assess the presence of evolutionary trends in comparative data. Results show widely divergent performance of the methods. The simple, nonphylogenetic regression (SR) and variance partitioning using phylogenetic eigenvector regression (PVR) with a broken stick selection procedure have greatly inflated Type I error rate (0.123-0.180 at a 0.05 threshold), which invalidates their use in this context. However, they have the greatest power. Most variants of Felsenstein's independent contrasts (FIC; five of which are presented) have adequate Type I error rate, although two have a slightly inflated Type I error rate with at least one of the two reference trees (0.064-0.090 error rate at a 0.05 threshold). The power of all contrast-based methods is always much lower than that of SR and PVR, except under Brownian motion with a strong trend and distant bounds. 
Mean absolute value of error on slope of all FIC methods is slightly higher than that of phylogenetic generalized least squares (PGLS), SR, and PVR. PGLS performs well, with low Type I error rate, low error on regression coefficient, and power comparable with some FIC methods. Four variants of skewness analysis are examined, and a new method to assess significance of results is presented. However, all have consistently low power, except in rare combinations of trees, trend strength, and distance between final means and bounds. Globally, the results clearly show that FIC-based methods and PGLS are globally better than nonphylogenetic methods and variance partitioning with PVR. FIC methods and PGLS are sensitive to the model of evolution (and, hence, to branch length errors). Our results suggest that regressing raw character contrasts against raw geological age contrasts yields a good combination of power and Type I error rate. New software to facilitate batch analysis is presented.
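For the contrast-based methods discussed above, the regression of character contrasts on geological-age contrasts is conventionally forced through the origin (a contrast's sign is arbitrary, so no intercept is fitted). A minimal sketch with invented standardized contrasts:

```python
def origin_regression(x_contrasts, y_contrasts):
    """Regression through the origin, as is conventional for independent
    contrasts: slope = sum(x*y) / sum(x^2)."""
    sxy = sum(x * y for x, y in zip(x_contrasts, y_contrasts))
    sxx = sum(x * x for x in x_contrasts)
    return sxy / sxx

# Hypothetical standardized contrasts: geological-age contrasts (x)
# vs. body-size contrasts (y); a positive slope suggests a trend.
age_contrasts = [1.0, 2.0, -1.5, 0.5]
size_contrasts = [0.9, 2.2, -1.2, 0.6]
trend_slope = origin_regression(age_contrasts, size_contrasts)
```

Computing the contrasts themselves requires the phylogeny and branch lengths; this sketch only shows the final regression step that the simulations compare across methods.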

  8. Hospital response to a global budget program under universal health insurance in Taiwan.

    PubMed

    Cheng, Shou-Hsia; Chen, Chi-Chen; Chang, Wei-Ling

    2009-10-01

    Global budget programs are utilized in many countries to control soaring healthcare expenditures. The present study was designed to evaluate the responses of Taiwanese hospitals to a new global budget program implemented in 2002. Using data obtained from the Bureau of National Health Insurance (NHI) and two nationwide surveys conducted before and after the global budget program, changes in the length of stay, treatment intensity, insurance claims, and out-of-pocket fees were compared in 2002 and 2004. The analysis was conducted using the Generalized Estimating Equations (GEEs) method. Regression models revealed that implementation of the global budget was followed by a 7% increase in length of stay and a 15% increase in the number of prescribed procedures and medications per admission. The claim expenses increased by 14%, and out-of-pocket fees per admission increased by 6%. Among the hospitals, no coalition action was found during the study period. In the present study, it appears that hospitals attempted to increase per-case expense claims to protect their reimbursement from possible discounts under a global budget cap. How Taiwanese hospitals respond to this challenge in the future deserves continued, long-term observation.

  9. Meta-analysis of haplotype-association studies: comparison of methods and empirical evaluation of the literature

    PubMed Central

    2011-01-01

    Background Meta-analysis is a popular methodology in several fields of medical research, including genetic association studies. However, the methods used for meta-analysis of association studies that report haplotypes have not been studied in detail. In this work, methods for performing meta-analysis of haplotype association studies are summarized, compared and presented in a unified framework, along with an empirical evaluation of the literature. Results We present multivariate methods that use summary-based data as well as methods that use binary and count data in a generalized linear mixed model framework (logistic regression, multinomial regression and Poisson regression). The methods presented here avoid the inflation of the type I error rate that can result from the traditional approach of comparing each haplotype against the remaining ones, while they can be fitted using standard software. Moreover, formal global tests are presented for assessing the statistical significance of the overall association. Although the methods presented here assume that the haplotypes are directly observed, they can be easily extended to allow for such uncertainty by weighting the haplotypes by their probability. Conclusions An empirical evaluation of the published literature, and a comparison against meta-analyses that use single nucleotide polymorphisms, suggests that studies reporting meta-analyses of haplotypes include approximately half as many studies and produce statistically significant results twice as often. We show that this excess of statistically significant results stems from the sub-optimal method of analysis used and that, in approximately half of the cases, the statistical significance is refuted if the data are properly re-analyzed. Illustrative examples of code are given in Stata and it is anticipated that the methods developed in this work will be widely applied in the meta-analysis of haplotype association studies. PMID:21247440
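The basic building block underlying summary-based pooling is inverse-variance weighting of per-study effect estimates. A minimal fixed-effect sketch for one haplotype against a chosen reference haplotype, with invented log odds ratios (the paper's multivariate and mixed-model machinery goes well beyond this):

```python
import math

def pooled_log_odds_ratio(log_ors, variances):
    """Fixed-effect inverse-variance pooling of per-study log odds ratios:
    each study is weighted by the reciprocal of its variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, log_ors)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Hypothetical per-study log odds ratios for one haplotype vs. a chosen
# reference haplotype, with their sampling variances.
log_ors = [0.40, 0.25, 0.55]
variances = [0.04, 0.09, 0.16]
pooled, se = pooled_log_odds_ratio(log_ors, variances)
odds_ratio = math.exp(pooled)
```

The multivariate methods in the paper generalize this to vectors of haplotype effects with a covariance matrix per study, which is what keeps the global test's type I error in check.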

  10. The global prevalence and correlates of skin bleaching: a meta-analysis and meta-regression analysis.

    PubMed

    Sagoe, Dominic; Pallesen, Ståle; Dlova, Ncoza C; Lartey, Margaret; Ezzedine, Khaled; Dadzie, Ophelia

    2018-06-11

    To estimate and investigate the global lifetime prevalence and correlates of skin bleaching. A meta-analysis and meta-regression analysis was performed based on a systematic and comprehensive literature search conducted in Google Scholar, ISI Web of Science, ProQuest, PsycNET, PubMed, and other relevant websites and reference lists. A total of 68 studies (67,665 participants) providing original data on the lifetime prevalence of skin bleaching were included. Publication bias was corrected using the trim and fill procedure. The pooled (imputed) lifetime prevalence of skin bleaching was 27.7% (95% CI: 19.6-37.5, I 2  = 99.6, P < 0.01). The highest significant prevalences were associated with: males (28.0%), topical corticosteroid use (51.8%), Africa (27.1%), persons aged ≤30 years (55.9%), individuals with only primary school education (31.6%), urban or semiurban residents (74.9%), patients (21.3%), data from 2010-2017 (26.8%), dermatological evaluation and testing-based assessment (24.9%), random sampling methods (29.2%), and moderate quality studies (32.3%). The proportion of females in study samples was significantly related to skin bleaching prevalence. Despite some limitations, our results indicate that the practice of skin bleaching is a serious global public health issue that should be addressed through appropriate public health interventions. © 2018 The International Society of Dermatology.

  11. Clinical evaluation of a novel population-based regression analysis for detecting glaucomatous visual field progression.

    PubMed

    Kovalska, M P; Bürki, E; Schoetzau, A; Orguel, S F; Orguel, S; Grieshaber, M C

    2011-04-01

    The distinction of real progression from test variability in visual field (VF) series may be based on clinical judgment, on trend analysis based on follow-up of test parameters over time, or on identification of a significant change related to the mean of baseline exams (event analysis). The aim of this study was to compare a new population-based method (Octopus field analysis, OFA) with classic regression analyses and clinical judgment for detecting glaucomatous VF changes. 240 VF series of 240 patients with at least 9 consecutive examinations available were included into this study. They were independently classified by two experienced investigators. The results of such a classification served as a reference for comparison for the following statistical tests: (a) t-test global, (b) r-test global, (c) regression analysis of 10 VF clusters and (d) point-wise linear regression analysis. 32.5 % of the VF series were classified as progressive by the investigators. The sensitivity and specificity were 89.7 % and 92.0 % for r-test, and 73.1 % and 93.8 % for the t-test, respectively. In the point-wise linear regression analysis, the specificity was comparable (89.5 % versus 92 %), but the sensitivity was clearly lower than in the r-test (22.4 % versus 89.7 %) at a significance level of p = 0.01. A regression analysis for the 10 VF clusters showed a markedly higher sensitivity for the r-test (37.7 %) than the t-test (14.1 %) at a similar specificity (88.3 % versus 93.8 %) for a significant trend (p = 0.005). In regard to the cluster distribution, the paracentral clusters and the superior nasal hemifield progressed most frequently. The population-based regression analysis seems to be superior to the trend analysis in detecting VF progression in glaucoma, and may eliminate the drawbacks of the event analysis. 
Further, it may assist the clinician in the evaluation of VF series and may allow better visualization of the correlation between function and structure owing to VF clusters. © Georg Thieme Verlag KG Stuttgart · New York.
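Point-wise linear regression of the kind compared above fits a slope per visual-field test point over follow-up and flags locations with a strong negative trend. A minimal sketch with an invented 9-exam series and illustrative thresholds (not OFA's criteria):

```python
def pointwise_slope_and_r(sensitivities, years):
    """Least-squares slope (dB/year) and Pearson r of one test point's
    sensitivity over follow-up; a strongly negative slope with strong
    correlation flags possible progression."""
    n = len(years)
    mx = sum(years) / n
    my = sum(sensitivities) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(years, sensitivities))
    sxx = sum((x - mx) ** 2 for x in years)
    syy = sum((y - my) ** 2 for y in sensitivities)
    slope = sxy / sxx
    r = sxy / (sxx * syy) ** 0.5
    return slope, r

# Hypothetical 9-exam series for one VF location (dB), steadily worsening.
years = [0, 0.5, 1, 1.5, 2, 2.5, 3, 3.5, 4]
dbs = [30.0, 29.6, 29.1, 28.8, 28.2, 27.9, 27.3, 27.0, 26.5]
slope, r = pointwise_slope_and_r(dbs, years)
progressing = slope < -0.5 and r < -0.9  # illustrative thresholds only
```

A full implementation would test the slope's significance (e.g. at p = 0.01, as in the study) rather than thresholding r directly.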

  12. Robust, Adaptive Functional Regression in Functional Mixed Model Framework.

    PubMed

    Zhu, Hongxiao; Brown, Philip J; Morris, Jeffrey S

    2011-09-01

    Functional data are increasingly encountered in scientific studies, and their high dimensionality and complexity lead to many analytical challenges. Various methods for functional data analysis have been developed, including functional response regression methods that involve regression of a functional response on univariate/multivariate predictors with nonparametrically represented functional coefficients. In existing methods, however, the functional regression can be sensitive to outlying curves and outlying regions of curves, so is not robust. In this paper, we introduce a new Bayesian method, robust functional mixed models (R-FMM), for performing robust functional regression within the general functional mixed model framework, which includes multiple continuous or categorical predictors and random effect functions accommodating potential between-function correlation induced by the experimental design. The underlying model involves a hierarchical scale mixture model for the fixed effects, random effect and residual error functions. These modeling assumptions across curves result in robust nonparametric estimators of the fixed and random effect functions which down-weight outlying curves and regions of curves, and produce statistics that can be used to flag global and local outliers. These assumptions also lead to distributions across wavelet coefficients that have outstanding sparsity and adaptive shrinkage properties, with great flexibility for the data to determine the sparsity and the heaviness of the tails. Together with the down-weighting of outliers, these within-curve properties lead to fixed and random effect function estimates that appear in our simulations to be remarkably adaptive in their ability to remove spurious features yet retain true features of the functions. We have developed general code to implement this fully Bayesian method that is automatic, requiring the user to only provide the functional data and design matrices. 
It is efficient enough to handle large data sets, and yields posterior samples of all model parameters that can be used to perform desired Bayesian estimation and inference. Although we present details for a specific implementation of the R-FMM using specific distributional choices in the hierarchical model, 1D functions, and wavelet transforms, the method can be applied more generally using other heavy-tailed distributions, higher dimensional functions (e.g. images), and using other invertible transformations as alternatives to wavelets.
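The down-weighting of outlying curves that R-FMM achieves through hierarchical scale mixtures can be illustrated, in a much simpler setting, by robust univariate regression via iteratively reweighted least squares with Huber weights. This is a stand-in for the down-weighting idea only, not the R-FMM algorithm:

```python
def huber_irls_line(x, y, delta=1.0, iters=20):
    """Robust straight-line fit via iteratively reweighted least squares
    with Huber weights: points with |residual| > delta get weight
    delta/|residual|, so gross outliers barely influence the fit."""
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        resid = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
        w = [1.0 if abs(r) <= delta else delta / abs(r) for r in resid]
        sw = sum(w)
        mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
        my = sum(wi * yi for wi, yi in zip(w, y)) / sw
        sxy = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y))
        sxx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x))
        b1 = sxy / sxx
        b0 = my - b1 * mx
    return b0, b1

# Data on the line y = 2x with one gross outlier; the outlier is
# down-weighted and the fit recovers the underlying slope.
x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [0.0, 2.0, 4.0, 30.0, 8.0, 10.0, 12.0]   # y[3] should be 6
b0, b1 = huber_irls_line(x, y)
```

An ordinary least-squares fit of the same data is pulled far off the true slope; the reweighting recovers it, which is the one-dimensional analogue of flagging and down-weighting outlying curves.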

  13. Robust, Adaptive Functional Regression in Functional Mixed Model Framework

    PubMed Central

    Zhu, Hongxiao; Brown, Philip J.; Morris, Jeffrey S.

    2012-01-01

    Functional data are increasingly encountered in scientific studies, and their high dimensionality and complexity lead to many analytical challenges. Various methods for functional data analysis have been developed, including functional response regression methods that involve regression of a functional response on univariate/multivariate predictors with nonparametrically represented functional coefficients. In existing methods, however, the functional regression can be sensitive to outlying curves and outlying regions of curves, so is not robust. In this paper, we introduce a new Bayesian method, robust functional mixed models (R-FMM), for performing robust functional regression within the general functional mixed model framework, which includes multiple continuous or categorical predictors and random effect functions accommodating potential between-function correlation induced by the experimental design. The underlying model involves a hierarchical scale mixture model for the fixed effects, random effect and residual error functions. These modeling assumptions across curves result in robust nonparametric estimators of the fixed and random effect functions which down-weight outlying curves and regions of curves, and produce statistics that can be used to flag global and local outliers. These assumptions also lead to distributions across wavelet coefficients that have outstanding sparsity and adaptive shrinkage properties, with great flexibility for the data to determine the sparsity and the heaviness of the tails. Together with the down-weighting of outliers, these within-curve properties lead to fixed and random effect function estimates that appear in our simulations to be remarkably adaptive in their ability to remove spurious features yet retain true features of the functions. We have developed general code to implement this fully Bayesian method that is automatic, requiring the user to only provide the functional data and design matrices. 
It is efficient enough to handle large data sets, and yields posterior samples of all model parameters that can be used to perform desired Bayesian estimation and inference. Although we present details for a specific implementation of the R-FMM using specific distributional choices in the hierarchical model, 1D functions, and wavelet transforms, the method can be applied more generally using other heavy-tailed distributions, higher dimensional functions (e.g. images), and using other invertible transformations as alternatives to wavelets. PMID:22308015

  14. Retrieving Temperature Anomaly in the Global Subsurface and Deeper Ocean From Satellite Observations

    NASA Astrophysics Data System (ADS)

    Su, Hua; Li, Wene; Yan, Xiao-Hai

    2018-01-01

    Retrieving the subsurface and deeper ocean (SDO) dynamic parameters from satellite observations is crucial for effectively understanding ocean interior anomalies and dynamic processes, but it is challenging to accurately estimate the subsurface thermal structure over the global scale from sea surface parameters. This study proposes a new approach based on Random Forest (RF) machine learning to retrieve subsurface temperature anomaly (STA) in the global ocean from multisource satellite observations including sea surface height anomaly (SSHA), sea surface temperature anomaly (SSTA), sea surface salinity anomaly (SSSA), and sea surface wind anomaly (SSWA), with in situ Argo data used for RF training and testing. The Argo STA data were used to validate the accuracy and reliability of the results from the RF model. The results indicated that SSHA, SSTA, SSSA, and SSWA together are useful parameters for detecting SDO thermal information and obtaining accurate STA estimations. The proposed method also outperformed support vector regression (SVR) in global STA estimation. It will be a useful technique for studying SDO thermal variability and its role in the global climate system from global-scale satellite observations.
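As a stand-in for the RF regressor (which would normally come from an ML library), the mapping from the four surface predictors to STA can be illustrated with a plain k-nearest-neighbour regression on the same inputs. All numbers below are invented, not Argo or satellite values:

```python
def knn_predict(train_X, train_y, query, k=3):
    """k-nearest-neighbour regression: average the targets of the k
    training points closest (squared Euclidean distance) to the query."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(row, query)), y)
        for row, y in zip(train_X, train_y)
    )
    nearest = dists[:k]
    return sum(y for _, y in nearest) / k

# Hypothetical surface predictors per grid cell, ordered as
# (SSHA m, SSTA degC, SSSA psu, SSWA m/s), with the matching Argo-derived
# subsurface temperature anomaly (STA, degC) as the training target.
train_X = [
    (0.10, 0.5, 0.02, 1.0),
    (0.12, 0.6, 0.01, 1.1),
    (-0.08, -0.4, -0.02, -0.9),
    (-0.10, -0.5, -0.01, -1.0),
    (0.00, 0.0, 0.00, 0.0),
]
train_y = [0.30, 0.35, -0.25, -0.30, 0.00]
sta_estimate = knn_predict(train_X, train_y, (0.11, 0.55, 0.015, 1.05), k=3)
```

A real retrieval would train an ensemble model per depth level on millions of collocated Argo/satellite samples; the sketch only shows the surface-to-subsurface regression framing.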

  15. An Innovative Technique to Assess Spontaneous Baroreflex Sensitivity with Short Data Segments: Multiple Trigonometric Regressive Spectral Analysis.

    PubMed

    Li, Kai; Rüdiger, Heinz; Haase, Rocco; Ziemssen, Tjalf

    2018-01-01

    Objective: As the multiple trigonometric regressive spectral (MTRS) analysis is extraordinary in its ability to analyze short local data segments down to 12 s, we wanted to evaluate the impact of the data segment settings by applying the technique of MTRS analysis for baroreflex sensitivity (BRS) estimation using a standardized data pool. Methods: Spectral and baroreflex analyses were performed on the EuroBaVar dataset (42 recordings, including lying and standing positions). For this analysis, the technique of MTRS was used. We used different global and local data segment lengths, and chose the global data segments from different positions. Three global data segments of 1 and 2 min and three local data segments of 12, 20, and 30 s were used in MTRS analysis for BRS. Results: All the BRS-values calculated on the three global data segments were highly correlated, both in the supine and standing positions; the different global data segments provided similar BRS estimations. When using different local data segments, all the BRS-values were also highly correlated. However, in the supine position, using short local data segments of 12 s overestimated BRS compared with those using 20 and 30 s. In the standing position, the BRS estimations using different local data segments were comparable. There was no proportional bias for the comparisons between different BRS estimations. Conclusion: We demonstrate that BRS estimation by the MTRS technique is stable when using different global data segments, and MTRS is extraordinary in its ability to evaluate BRS in even short local data segments (20 and 30 s). Because of the non-stationary character of most biosignals, the MTRS technique would be preferable for BRS analysis especially in conditions when only short stationary data segments are available or when dynamic changes of BRS should be monitored.

  16. Soil Moisture Retrieval Using Convolutional Neural Networks: Application to Passive Microwave Remote Sensing

    NASA Astrophysics Data System (ADS)

    Hu, Z.; Xu, L.; Yu, B.

    2018-04-01

    An empirical model is established to analyse the daily retrieval of soil moisture from passive microwave remote sensing using convolutional neural networks (CNN). Soil moisture plays an important role in the water cycle. However, with the rapid growth of remotely sensed data acquisition, it is a hard task for remote sensing practitioners to find a fast and convenient model to deal with the massive data. In this paper, the AMSR-E brightness temperatures are used to train a CNN to predict soil moisture from the European Centre for Medium-Range Weather Forecasts (ECMWF) model. Compared with the classical inversion methods, the deep learning-based method is more suitable for global soil moisture retrieval. It is very well supported by graphics processing unit (GPU) acceleration, which can meet the demand of massive data inversion. Once the model is trained, a global soil moisture map can be predicted in less than 10 seconds. Moreover, the deep learning-based retrieval method can learn complex texture features from big remote sensing data. In this experiment, the results demonstrate that the CNN deployed to retrieve global soil moisture achieves a better performance than support vector regression (SVR).
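
A full CNN is beyond a sketch, but the core operation such networks stack — a 2-D convolution over a gridded field — is easy to show. The "brightness temperature" grid and kernel below are invented for illustration; a trained network would learn many such kernels.

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid'-mode 2-D cross-correlation, the basic op of a CNN layer."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Synthetic 6x6 "brightness temperature" grid with a vertical edge.
tb = np.zeros((6, 6))
tb[:, 3:] = 1.0

# A horizontal-gradient kernel; its response peaks along the edge,
# the kind of texture feature the abstract says the CNN learns.
kernel = np.array([[-1.0, 0.0, 1.0]] * 3)
feature_map = conv2d(tb, kernel)
```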

  17. “Polio Eradication” Game May Increase Public Interest in Global Health

    PubMed Central

    Barnabas, Ruanne V.; Rue, Tessa; Weisman, Jordan; Harris, Nathan A.; Orenstein, Walter A.

    2015-01-01

    Abstract Background: Interactive games that highlight global health challenges and solutions are a potential tool for increasing interest in global health. To test this hypothesis, we developed an interactive “Polio Eradication” (PE) game and evaluated whether playing or watching was associated with increased public interest in global health. Materials and Methods: The PE game is a life-size, human board game that simulates PE efforts. Four players—a researcher, a transportation expert, a local community coordinator, and a healthcare worker—collaborate as an interdisciplinary team to help limit ongoing and future polio outbreaks in Pakistan, represented on the game board. Participants who played or observed the game and those who did not participate in the game, but visited noninteractive global health exhibits, completed a survey on participation outcomes. We used relative risk regression to examine associations between cofactors and change in global health interest. Results: Three variables predicted increased global health interest among the game participants: Having little or no previous global health knowledge prior to playing the game (risk ratio [RR]=1.28; 95 percent confidence interval [CI], 1.13–1.45), not currently being involved in global health (RR=1.41; 95 percent CI, 1.07–1.85), and visiting Seattle (RR=1.25; 95 percent CI, 1.04–1.51). Conclusions: Our results suggest that a hands-on, interactive game may increase the public's interest in global health, particularly among those with minimal previous knowledge of or involvement in global health activities. PMID:26182064
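
The quantity the study reports, a risk ratio with a Wald 95% confidence interval, can be computed from a 2x2 table. The counts below are invented, not the study's data; only the formula is standard.

```python
import math

def risk_ratio(a, b, c, d):
    """RR of the outcome for exposed (a events, b non-events) vs
    unexposed (c events, d non-events), with a Wald 95% CI computed
    on the log scale."""
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts: 64/100 exposed vs 50/100 unexposed with the outcome.
rr, lo, hi = risk_ratio(64, 36, 50, 50)
```

A relative-risk *regression* generalizes this by modelling log(risk) as a linear function of several covariates at once.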

  18. Fast and robust segmentation of the striatum using deep convolutional neural networks.

    PubMed

    Choi, Hongyoon; Jin, Kyong Hwan

    2016-12-01

    Automated segmentation of brain structures is an important task in structural and functional image analysis. We developed a fast and accurate method for striatum segmentation using deep convolutional neural networks (CNN). T1 magnetic resonance (MR) images were used for our CNN-based segmentation, which requires neither image feature extraction nor nonlinear transformation. We employed two serial CNNs, a Global and a Local CNN: The Global CNN determined approximate locations of the striatum. It performed a regression of input MR images fitted to smoothed segmentation maps of the striatum. From the output volume of the Global CNN, cropped MR volumes that included the striatum were extracted. The cropped MR volumes and the output volumes of the Global CNN were used as inputs to the Local CNN, which predicted the accurate label of all voxels. Segmentation results were compared with a widely used segmentation method, FreeSurfer. Our method showed higher Dice Similarity Coefficient (DSC) (0.893±0.017 vs. 0.786±0.015) and precision score (0.905±0.018 vs. 0.690±0.022) than FreeSurfer-based striatum segmentation (p=0.06). Our approach was also tested using another independent dataset, which showed high DSC (0.826±0.038) comparable with that of FreeSurfer. Compared with the existing method, the segmentation performance of our proposed approach was comparable with that of FreeSurfer, and the running time of our approach was approximately three seconds. We suggest a fast and accurate deep CNN-based segmentation for small brain structures which can be widely applied to brain image analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
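
The Dice Similarity Coefficient quoted above is a standard overlap metric between two binary masks, 2|A∩B| / (|A| + |B|). A minimal implementation on toy masks (the masks here are invented, not the study's segmentations):

```python
import numpy as np

def dice(seg_a, seg_b):
    """Dice Similarity Coefficient between two binary masks."""
    a, b = seg_a.astype(bool), seg_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy 2-D "segmentations": two 4x4 squares offset by one row.
pred = np.zeros((8, 8), dtype=int); pred[2:6, 2:6] = 1   # 16 voxels
ref  = np.zeros((8, 8), dtype=int); ref[3:7, 2:6] = 1    # 16 voxels, shifted
```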

  19. Adaptive local linear regression with application to printer color management.

    PubMed

    Gupta, Maya R; Garcia, Eric K; Chin, Erika

    2008-06-01

    Local learning methods, such as local linear regression and nearest neighbor classifiers, base estimates on nearby training samples, or neighbors. Usually, the number of neighbors used in estimation is fixed to a global "optimal" value chosen by cross-validation. This paper proposes adapting the number of neighbors used for estimation to the local geometry of the data, without the need for cross-validation. The term enclosing neighborhood is introduced to describe a set of neighbors whose convex hull contains the test point when possible. It is proven that enclosing neighborhoods yield bounded estimation variance under some assumptions. Three such enclosing neighborhood definitions are presented: natural neighbors, natural neighbors inclusive, and enclosing k-NN. The effectiveness of these neighborhood definitions with local linear regression is tested for estimating lookup tables for color management. Significant improvements in error metrics are shown, indicating that enclosing neighborhoods may be a promising adaptive neighborhood definition for other local learning tasks as well, depending on the density of training samples.
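
In one dimension the convex-hull condition reduces to the neighbor set bracketing the query point, which makes enclosing k-NN easy to sketch. This is a 1-D toy version on synthetic data, not the paper's multi-dimensional algorithm: k grows until the query is enclosed, then a local line is fitted.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 10.0, 60))
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.05, 60)   # near-linear ground truth

def enclosing_knn_predict(x0, k_min=5):
    """Grow k until the neighbor set brackets x0 (the 1-D analogue of
    'convex hull contains the test point'), then fit a local line by
    least squares and evaluate it at x0."""
    order = np.argsort(np.abs(x - x0))
    for k in range(k_min, len(x) + 1):
        idx = order[:k]
        if x[idx].min() <= x0 <= x[idx].max() or k == len(x):
            break
    A = np.vstack([np.ones(len(idx)), x[idx]]).T
    coef, *_ = np.linalg.lstsq(A, y[idx], rcond=None)
    return coef[0] + coef[1] * x0
```

Because the neighborhood size adapts per query, no global k needs to be chosen by cross-validation, which is the point of the paper.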

  20. Efficient Determination of Free Energy Landscapes in Multiple Dimensions from Biased Umbrella Sampling Simulations Using Linear Regression.

    PubMed

    Meng, Yilin; Roux, Benoît

    2015-08-11

    The weighted histogram analysis method (WHAM) is a standard protocol for postprocessing the information from biased umbrella sampling simulations to construct the potential of mean force with respect to a set of order parameters. By virtue of the WHAM equations, the unbiased density of state is determined by satisfying a self-consistent condition through an iterative procedure. While the method works very effectively when the number of order parameters is small, its computational cost grows rapidly in higher dimension. Here, we present a simple and efficient alternative strategy, which avoids solving the self-consistent WHAM equations iteratively. An efficient multivariate linear regression framework is utilized to link the biased probability densities of individual umbrella windows and yield an unbiased global free energy landscape in the space of order parameters. It is demonstrated with practical examples that free energy landscapes that are comparable in accuracy to WHAM can be generated at a small fraction of the cost.
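
The linking idea can be illustrated in a stripped-down 1-D setting. Assumptions, all mine rather than the paper's: kT = 1, exact (noise-free) biased densities on a grid in place of sampled histograms, and only two harmonic umbrella windows, so the regression step collapses to a single least-squares offset on the overlap.

```python
import numpy as np

x = np.linspace(-2.0, 2.0, 81)
F_true = x**2                           # true free energy, kT units

def window(center, kspring=5.0):
    """Exact biased density of one umbrella window, returned as
    -ln p_i(x) together with the bias potential U_i(x)."""
    U = 0.5 * kspring * (x - center)**2
    p = np.exp(-(F_true + U))
    p /= p.sum()
    return -np.log(p), U

neglnp1, U1 = window(-0.8)
neglnp2, U2 = window(+0.8)

# Each window yields F(x) up to an unknown per-window constant f_i:
#   -ln p_i(x) = F(x) + U_i(x) - f_i
F1 = neglnp1 - U1
F2 = neglnp2 - U2
supp1 = neglnp1 < 15.0                  # bins with non-negligible probability
supp2 = neglnp2 < 15.0

# Least-squares estimate of the relative offset from the overlap region.
# With many windows this step becomes a genuine linear regression; with
# two it reduces to a mean difference.
overlap = supp1 & supp2
offset = np.mean((F1 - F2)[overlap])

F_global = np.where(supp1, F1, F2 + offset)   # stitched landscape
valid = supp1 | supp2
```

No self-consistent iteration is needed: the offsets come straight from a linear solve, which is the computational advantage the abstract describes.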

  2. Data-based estimates of the ocean carbon sink variability - first results of the Surface Ocean pCO2 Mapping intercomparison (SOCOM)

    NASA Astrophysics Data System (ADS)

    Rödenbeck, C.; Bakker, D. C. E.; Gruber, N.; Iida, Y.; Jacobson, A. R.; Jones, S.; Landschützer, P.; Metzl, N.; Nakaoka, S.; Olsen, A.; Park, G.-H.; Peylin, P.; Rodgers, K. B.; Sasse, T. P.; Schuster, U.; Shutler, J. D.; Valsala, V.; Wanninkhof, R.; Zeng, J.

    2015-08-01

    Using measurements of the surface-ocean CO2 partial pressure (pCO2) and 14 different pCO2 mapping methods recently collated by the Surface Ocean pCO2 Mapping intercomparison (SOCOM) initiative, variations in regional and global sea-air CO2 fluxes have been investigated. Though the available mapping methods use widely different approaches, we find relatively consistent estimates of regional pCO2 seasonality, in line with previous estimates. In terms of interannual variability (IAV), all mapping methods estimate the largest variations to occur in the Eastern equatorial Pacific. Despite considerable spread in the detailed variations, mapping methods with closer match to the data also tend to be more consistent with each other. Encouragingly, this includes mapping methods belonging to complementary types - taking variability either directly from the pCO2 data or indirectly from driver data via regression. From a weighted ensemble average, we find an IAV amplitude of the global sea-air CO2 flux of 0.31 PgC yr-1 (standard deviation over 1992-2009), which is larger than simulated by biogeochemical process models. On a decadal perspective, the global CO2 uptake is estimated to have gradually increased since about 2000, with little decadal change prior to 2000. The weighted mean total ocean CO2 sink estimated by the SOCOM ensemble is consistent within uncertainties with estimates from ocean-interior carbon data or atmospheric oxygen trends.

  3. Establishment of reference intervals of clinical chemistry analytes for the adult population in Saudi Arabia: a study conducted as a part of the IFCC global study on reference values.

    PubMed

    Borai, Anwar; Ichihara, Kiyoshi; Al Masaud, Abdulaziz; Tamimi, Waleed; Bahijri, Suhad; Armbuster, David; Bawazeer, Ali; Nawajha, Mustafa; Otaibi, Nawaf; Khalil, Haitham; Kawano, Reo; Kaddam, Ibrahim; Abdelaal, Mohamed

    2016-05-01

    This study is a part of the IFCC-global study to derive reference intervals (RIs) for 28 chemistry analytes in Saudis. Healthy individuals (n=826) aged ≥18 years were recruited using the global study protocol. All specimens were measured using an Architect analyzer. RIs were derived by both parametric and non-parametric methods for comparative purpose. The need for secondary exclusion of reference values based on the latent abnormal values exclusion (LAVE) method was examined. The magnitude of variation attributable to gender, age and region was calculated by the standard deviation ratio (SDR). Sources of variation (age, BMI, physical exercise and smoking levels) were investigated using multiple regression analysis. SDRs for gender, age and regional differences were significant for 14, 8 and 2 analytes, respectively. BMI-related changes in test results were noted conspicuously for CRP. For some metabolic related parameters the ranges of RIs by the non-parametric method were wider than by the parametric method, and RIs derived using the LAVE method were significantly different from those without it. RIs were derived with and without gender partition (BMI, drugs and supplements were considered). RIs applicable to Saudis were established for the majority of chemistry analytes, whereas gender, regional and age RI partitioning was required for some analytes. The elevated upper limits of metabolic analytes reflect the high prevalence of metabolic syndrome in the Saudi population.
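
The two derivation approaches the study compares can be sketched directly: a parametric RI assumes normality (mean ± 1.96 SD), while a non-parametric RI takes the empirical 2.5th and 97.5th percentiles. The values below are synthetic, and the percentile indexing is one simple convention among several.

```python
import random
import statistics

random.seed(7)
# Synthetic "healthy reference" results for one analyte (arbitrary units);
# real RI studies use hundreds of screened individuals per partition.
values = [random.gauss(100.0, 10.0) for _ in range(400)]

def ri_parametric(vals):
    """Central 95% interval assuming normality: mean ± 1.96 SD."""
    m, s = statistics.fmean(vals), statistics.stdev(vals)
    return m - 1.96 * s, m + 1.96 * s

def ri_nonparametric(vals):
    """Empirical 2.5th and 97.5th percentiles (simple index convention)."""
    v = sorted(vals)
    return v[int(0.025 * len(v))], v[int(0.975 * len(v)) - 1]

lo_p, hi_p = ri_parametric(values)
lo_n, hi_n = ri_nonparametric(values)
```

On near-Gaussian data the two agree closely; on skewed analytes (as the abstract notes for some metabolic parameters) the non-parametric interval can be noticeably wider.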

  4. Adaptive surrogate modeling by ANOVA and sparse polynomial dimensional decomposition for global sensitivity analysis in fluid simulation

    NASA Astrophysics Data System (ADS)

    Tang, Kunkun; Congedo, Pietro M.; Abgrall, Rémi

    2016-06-01

    The Polynomial Dimensional Decomposition (PDD) is employed in this work for the global sensitivity analysis and uncertainty quantification (UQ) of stochastic systems subject to a moderate to large number of input random variables. Due to the intimate connection between the PDD and the Analysis of Variance (ANOVA) approaches, PDD is able to provide a simpler and more direct evaluation of the Sobol' sensitivity indices, when compared to the Polynomial Chaos expansion (PC). Unfortunately, the number of PDD terms grows exponentially with respect to the size of the input random vector, which makes the computational cost of standard methods unaffordable for real engineering applications. In order to address the problem of the curse of dimensionality, this work proposes essentially variance-based adaptive strategies aiming to build a cheap meta-model (i.e. surrogate model) by employing the sparse PDD approach with its coefficients computed by regression. Three levels of adaptivity are carried out in this paper: 1) the truncated dimensionality for ANOVA component functions, 2) the active dimension technique especially for second- and higher-order parameter interactions, and 3) the stepwise regression approach designed to retain only the most influential polynomials in the PDD expansion. During this adaptive procedure featuring stepwise regressions, the surrogate model representation keeps containing few terms, so that the cost to resolve repeatedly the linear systems of the least-squares regression problem is negligible. The size of the finally obtained sparse PDD representation is much smaller than the one of the full expansion, since only significant terms are eventually retained. Consequently, a much smaller number of calls to the deterministic model is required to compute the final PDD coefficients.

  5. Spatiotemporal variability of urban growth factors: A global and local perspective on the megacity of Mumbai

    NASA Astrophysics Data System (ADS)

    Shafizadeh-Moghadam, Hossein; Helbich, Marco

    2015-03-01

    The rapid growth of megacities requires special attention among urban planners worldwide, and particularly in Mumbai, India, where growth is very pronounced. To cope with the planning challenges this will bring, developing a retrospective understanding of urban land-use dynamics and the underlying driving-forces behind urban growth is a key prerequisite. This research uses regression-based land-use change models - and in particular non-spatial logistic regression models (LR) and auto-logistic regression models (ALR) - for the Mumbai region over the period 1973-2010, in order to determine the drivers behind spatiotemporal urban expansion. Both global models are complemented by a local, spatial model, the so-called geographically weighted logistic regression (GWLR) model, one that explicitly permits variations in driving-forces across space. The study comes to two main conclusions. First, both global models suggest similar driving-forces behind urban growth over time, revealing that LRs and ALRs result in estimated coefficients with comparable magnitudes. Second, all the local coefficients show distinctive temporal and spatial variations. It is therefore concluded that GWLR aids our understanding of urban growth processes, and so can assist context-related planning and policymaking activities when seeking to secure a sustainable urban future.

  6. Global Quality of Life Among WHI Women Aged 80 Years and Older

    PubMed Central

    Brunner, Robert L.; Hogan, Patricia E.; Danhauer, Suzanne C.; Brenes, Gretchen A.; Bowen, Deborah J.; Snively, Beverly M.; Goveas, Joseph S.; Saquib, Nazmus; Zaslavsky, Oleg; Shumaker, Sally A.

    2016-01-01

    Abstract Background. The number of older adults living to age 80 and older is increasing rapidly, particularly among women. Correlates of quality of life (QOL) in very advanced ages are not known. We examined the association of demographic, social-psychological, lifestyle, and physical health variables with global QOL in a Women’s Health Initiative (WHI) cohort of women aged 80 and older. Methods. 26,299 WHI participants, who had completed a recent psychosocial and medical update, were included in these analyses. Global QOL was assessed by a single item, asking the women to rate their overall QOL on a scale from 0 to 10. Characteristics of the women were examined by the level of their transformed global QOL scores (≤50, 50–70, ≥70), and multiple regression was used to examine which demographic, social-psychological, lifestyle and health variables were independently associated with higher global QOL. Results. Social-psychological and current health variables were more strongly associated with global QOL than a history of selected comorbid conditions. In particular, higher self-rated health and fewer depressive symptoms were the most strongly associated with better global QOL in WHI women ≥80 years. Conclusions. Interventions to reduce depressive symptoms and improve health may lead to better self-reported health and global QOL among older women. Physical and mental health screenings followed by evidence-based interventions are imperative in geriatric care. PMID:26858327

  7. Estimating current and future global urban domestic material consumption

    NASA Astrophysics Data System (ADS)

    Baynes, Timothy Malcolm; Kaviti Musango, Josephine

    2018-06-01

    Urban material resource requirements are significant at the global level and these are expected to expand with future urban population growth. However, there are no global scale studies on the future material consumption of urban areas. This paper provides estimates of global urban domestic material consumption (DMC) in 2050 using three approaches based on: current gross statistics; a regression model; and a transition theoretic logistic model. All methods use UN urban population projections and assume a simple ‘business-as-usual’ scenario wherein historical aggregate trends in income and material flow continue into the future. A collation of data for 152 cities provided a year 2000 world average DMC/capita estimate, 12 tons/person/year (±22%), which we combined with UN population projections to produce a first-order estimation of urban DMC at 2050 of ~73 billion tons/year (±22%). Urban DMC/capita was found to be significantly correlated (R² > 0.9) to urban GDP/capita and area per person through a power law relation used to obtain a second estimate of 106 billion tons (±33%) in 2050. The inelastic exponent of the power law indicates a global tendency for relative decoupling of direct urban material consumption with increasing income. These estimates are global and influenced by the current proportion of developed-world cities in the global population of cities (and in our sample data). A third method employed a logistic model of transitions in urban DMC/capita with regional resolution. This method estimated global urban DMC to rise from approximately 40 billion tons/year in 2010 to ~90 billion tons/year in 2050 (modelled range: 66–111 billion tons/year). DMC/capita across different regions was estimated to converge from a range of 5–27 tons/person/year in the year 2000 to around 8–17 tons/person/year in 2050.
The urban population does not increase proportionally during this period and thus the global average DMC/capita increases from ~12 to ~14 tons/person/year, challenging resource decoupling targets.
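
The power-law regression behind the second estimate amounts to an ordinary least-squares fit in log-log space, with an exponent below one indicating relative decoupling. A sketch on invented city data (the sample size of 152 matches the abstract; everything else, including the exponent 0.35, is made up):

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic city sample: DMC/capita ~ a * (GDP/capita)^b with b < 1,
# i.e. relative decoupling with rising income. Numbers are invented.
gdp = rng.uniform(2_000, 60_000, 152)                   # USD/person/year
dmc = 0.5 * gdp**0.35 * rng.lognormal(0.0, 0.1, 152)    # tons/person/year

# Fit ln(dmc) = ln(a) + b * ln(gdp) by ordinary least squares.
b, ln_a = np.polyfit(np.log(gdp), np.log(dmc), 1)
```

An exponent b < 1 means DMC/capita grows more slowly than income, the "inelastic" behaviour the abstract describes; multiplying the fitted per-capita value by a projected urban population then gives a total-consumption estimate.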

  8. Estimating the global incidence of traumatic spinal cord injury.

    PubMed

    Fitzharris, M; Cripps, R A; Lee, B B

    2014-02-01

    Population modelling--forecasting. To estimate the global incidence of traumatic spinal cord injury (TSCI). An initiative of the International Spinal Cord Society (ISCoS) Prevention Committee. Regression techniques were used to derive regional and global estimates of TSCI incidence. Using the findings of 31 published studies, a regression model was fitted using a known number of TSCI cases as the dependent variable and the population at risk as the single independent variable. In the process of deriving TSCI incidence, an alternative TSCI model was specified in an attempt to arrive at an optimal way of estimating the global incidence of TSCI. The global incidence of TSCI was estimated to be 23 cases per 1,000,000 persons in 2007 (179,312 cases per annum). World Health Organization's regional results are provided. Understanding the incidence of TSCI is important for health service planning and for the determination of injury prevention priorities. In the absence of high-quality epidemiological studies of TSCI in each country, the estimation of TSCI obtained through population modelling can be used to overcome known deficits in global spinal cord injury (SCI) data. The incidence of TSCI is context specific, and an alternative regression model demonstrated how TSCI incidence estimates could be improved with additional data. The results highlight the need for data standardisation and comprehensive reporting of national level TSCI data. A step-wise approach from the collation of conventional epidemiological data through to population modelling is suggested.
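
The model described, cases as the dependent variable with population at risk as the single independent variable, is a regression through the origin, whose slope is the incidence rate. A stdlib sketch with invented study counts (not the 31 studies the paper used):

```python
# Regression through the origin: cases_i = rate * population_i.
# Study counts below are invented for illustration.
pops  = [3.2e6, 12.5e6, 0.9e6, 48.0e6, 7.1e6]   # population at risk
cases = [70, 290, 22, 1_100, 160]               # observed TSCI cases

# Least-squares slope with no intercept: sum(x*y) / sum(x*x).
rate = sum(p * c for p, c in zip(pops, cases)) / sum(p * p for p in pops)
per_million = rate * 1_000_000
```

Applying such a fitted rate to a region's population yields the regional case estimates the paper reports; the alternative model the abstract mentions would add further predictors beyond population alone.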

  9. Resilient Brain Aging: Characterization of Discordance between Alzheimer’s Disease Pathology and Cognition

    PubMed Central

    Negash, Selam; Wilson, Robert S.; Leurgans, Sue E.; Wolk, David A.; Schneider, Julie A.; Buchman, Aron S.; Bennett, David A.; Arnold, Steven. E.

    2014-01-01

    Background Although it is now evident that normal cognition can occur despite significant AD pathology, few studies have attempted to characterize this discordance, or examine factors that may contribute to resilient brain aging in the setting of AD pathology. Methods More than 2,000 older persons underwent annual evaluation as part of participation in the Religious Orders Study or Rush Memory Aging Project. A total of 966 subjects who had brain autopsy and comprehensive cognitive testing proximate to death were analyzed. Resilience was quantified as a continuous measure using linear regression modeling, where global cognition was entered as a dependent variable and global pathology was an independent variable. Studentized residuals generated from the model represented the discordance between cognition and pathology, and served as measure of resilience. The relation of resilience index to known risk factors for AD and related variables was examined. Results Multivariate regression models that adjusted for demographic variables revealed significant associations for early life socioeconomic status, reading ability, APOE-ε4 status, and past cognitive activity. A stepwise regression model retained reading level (estimate = 0.10, SE = 0.02; p < 0.0001) and past cognitive activity (estimate = 0.27, SE = 0.09; p = 0.002), suggesting the potential mediating role of these variables for resilience. Conclusions The construct of resilient brain aging can provide a framework for quantifying the discordance between cognition and pathology, and help identify factors that may mediate this relationship. PMID:23919768
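
The resilience index can be sketched in a few lines: regress cognition on pathology and treat each person's residual as how much better or worse they score than their pathology predicts. The data below are synthetic, and plain standardized residuals stand in for the paper's studentized residuals.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 300
pathology = rng.uniform(0.0, 3.0, n)                       # global AD burden
cognition = 1.5 - 0.6 * pathology + rng.normal(0.0, 0.3, n)

# OLS of cognition on pathology; residuals are the discordance measure.
A = np.vstack([np.ones(n), pathology]).T
coef, *_ = np.linalg.lstsq(A, cognition, rcond=None)
resid = cognition - A @ coef

# Crude standardization (the paper uses studentized residuals).
resilience = resid / resid.std(ddof=2)
```

A positive resilience value flags a person whose cognition exceeds what their pathology burden predicts; the paper then regresses this index on candidate protective factors.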

  10. Parametric Methods for Dynamic 11C-Phenytoin PET Studies.

    PubMed

    Mansor, Syahir; Yaqub, Maqsood; Boellaard, Ronald; Froklage, Femke E; de Vries, Anke; Bakker, Esther D M; Voskuyl, Rob A; Eriksson, Jonas; Schwarte, Lothar A; Verbeek, Joost; Windhorst, Albert D; Lammertsma, Adriaan A

    2017-03-01

    In this study, the performance of various methods for generating quantitative parametric images of dynamic ¹¹C-phenytoin PET studies was evaluated. Methods: Double-baseline 60-min dynamic ¹¹C-phenytoin PET studies, including online arterial sampling, were acquired for 6 healthy subjects. Parametric images were generated using Logan plot analysis, a basis function method, and spectral analysis. Parametric distribution volume (VT) and influx rate (K1) were compared with those obtained from nonlinear regression analysis of time-activity curves. In addition, global and regional test-retest (TRT) variability was determined for parametric K1 and VT values. Results: Biases in VT observed with all parametric methods were less than 5%. For K1, spectral analysis showed a negative bias of 16%. The mean TRT variabilities of VT and K1 were less than 10% for all methods. Shortening the scan duration to 45 min provided similar VT and K1 with comparable TRT performance compared with 60-min data. Conclusion: Among the various parametric methods tested, the basis function method provided parametric VT and K1 values with the least bias compared with nonlinear regression data and showed TRT variabilities lower than 5%, also for smaller volume-of-interest sizes (i.e., higher noise levels) and shorter scan duration. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.

  11. The Role of Interpersonal Personality Traits and Reassurance Seeking in Eating Disorder Symptoms and Depressive Symptoms among Women with Bulimia Nervosa

    PubMed Central

    Mason, Tyler B.; Lavender, Jason M.; Wonderlich, Stephen A.; Crosby, Ross D.; Joiner, Thomas E.; Mitchell, James E.; Crow, Scott J.; Klein, Marjorie H.; Le Grange, Daniel; Bardone-Cone, Anna M.; Peterson, Carol B.

    2017-01-01

    Introduction The role of interpersonal factors has been proposed in various models of eating disorder (ED) psychopathology and treatment. We examined the independent and interactive contributions of two interpersonal-focused personality traits (i.e., social avoidance and insecure attachment) and reassurance seeking in relation to global ED psychopathology and depressive symptoms among women with bulimia nervosa (BN). Method Participants were 204 adult women with full or subclinical BN who completed a battery of self-report questionnaires. Hierarchical multiple OLS regressions including main effects and interaction terms were used to analyze the data. Results Main effects were found for social avoidance and insecure attachment in association with global ED psychopathology and depressive symptoms. In addition, two-way interactions between social avoidance and reassurance seeking were observed for both global ED psychopathology and depressive symptoms. In general, reassurance seeking strengthened the association between social avoidance and global ED psychopathology and depressive symptoms. Conclusion These results demonstrate the importance of reassurance seeking in psychopathology among women with BN who display personality features characterized by social avoidance. PMID:27234198

  12. Building global models for fat and total protein content in raw milk based on historical spectroscopic data in the visible and short-wave near infrared range.

    PubMed

    Melenteva, Anastasiia; Galyanin, Vladislav; Savenkova, Elena; Bogomolov, Andrey

    2016-07-15

    A large set of fresh cow milk samples collected from many suppliers over a large geographical area in Russia during a year has been analyzed by optical spectroscopy in the range 400-1100 nm in accordance with a previously developed scatter-based technique. The global (i.e. resistant to seasonal, genetic, regional and other variations of the milk composition) models for fat and total protein content, which were built using partial least-squares (PLS) regression, exhibit satisfactory prediction performance enabling their practical application in the dairy. The root mean-square errors of prediction (RMSEP) were 0.09 and 0.10 for fat and total protein content, respectively. The issues of raw milk analysis and multivariate modelling based on historical spectroscopic data have been considered, and approaches to the creation of global models and their transfer between instruments have been proposed. Availability of global models should significantly facilitate the dissemination of optical spectroscopic methods for laboratory and in-line quantitative milk analysis. Copyright © 2016. Published by Elsevier Ltd.
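
The quoted figure of merit is straightforward to compute; a one-function sketch (the reference and predicted values below are invented, not the paper's milk data):

```python
import math

def rmsep(reference, predicted):
    """Root mean-square error of prediction, the figure of merit quoted
    for the global fat and total-protein models."""
    n = len(reference)
    return math.sqrt(sum((r - p) ** 2 for r, p in zip(reference, predicted)) / n)

# Hypothetical fat-content values (% by mass): lab reference vs PLS prediction.
error = rmsep([3.6, 4.1, 3.9], [3.5, 4.2, 4.0])
```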

  13. Detect signals of interdecadal climate variations from an enhanced suite of reconstructed precipitation products since 1850 using the historical station data from Global Historical Climatology Network and the dynamical patterns derived from Global Precipitation Climatology Project

    NASA Astrophysics Data System (ADS)

    Shen, S. S.

    2015-12-01

    This presentation describes the detection of interdecadal climate signals in a newly reconstructed precipitation dataset covering 1850-present. Examples include precipitation signatures of the East Asian Monsoon (EAM), Pacific Decadal Oscillation (PDO) and Atlantic Multidecadal Oscillation (AMO). The new reconstruction dataset is an enhanced edition of a suite of global precipitation products reconstructed by Spectral Optimal Gridding of Precipitation Version 1.0 (SOGP 1.0). The maximum temporal coverage is 1850-present and the spatial coverage is quasi-global (75°S-75°N). This enhanced version has three different temporal resolutions (5-day, monthly, and annual) and two different spatial resolutions (2.5 deg and 5.0 deg). It also has a friendly Graphical User Interface (GUI). SOGP uses a multivariate regression method based on an empirical orthogonal function (EOF) expansion. The Global Precipitation Climatology Project (GPCP) precipitation data from 1981-2010 are used to calculate the EOFs. The Global Historical Climatology Network (GHCN) gridded data are used to calculate the regression coefficients for reconstructions. The sampling errors of the reconstruction are analyzed according to the number of EOF modes used in the reconstruction. Our reconstructed 1900-2011 time series of the global average annual precipitation shows a 0.024 (mm/day)/100a trend, which is very close to the trend derived from the mean of 25 models of the CMIP5 (Coupled Model Intercomparison Project Phase 5). Our reconstruction has been validated by GPCP data after 1979. Our reconstruction successfully displays the 1877 El Nino (see the attached figure), which is considered a validation before 1900. Our precipitation products are publicly available online, including digital data, precipitation animations, computer codes, readme files, and the user manual. 
This work is a joint effort of San Diego State University (Sam Shen, Gregori Clarke, Christian Junjinger, Nancy Tafolla, Barbara Sperberg, and Melanie Thorn), UCLA (Yongkang Xue), and University of Maryland (Tom Smith and Phil Arkin) and supported in part by the U.S. National Science Foundation (Awards No. AGS-1419256 and AGS-1015957).
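The EOF-plus-regression reconstruction idea described above can be sketched in a few lines: derive spatial modes from a dense modern dataset, then regress sparse historical station values onto those modes to fill in the full field. This is a minimal numpy toy with synthetic data, not the SOGP code; all sizes, station locations, and noise levels are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "dense satellite-era" field: time x grid (stand-in for GPCP).
t, g = 120, 50
modes = rng.normal(size=(3, g))                    # three true spatial patterns
amps = rng.normal(size=(t, 3))
dense = amps @ modes + 0.1 * rng.normal(size=(t, g))

# EOFs of the dense period via SVD of the anomaly matrix.
anom = dense - dense.mean(axis=0)
_, _, vt = np.linalg.svd(anom, full_matrices=False)
eofs = vt[:3]                                      # leading spatial modes (3 x g)

# A "historical" time step observed only at a few station grid points
# (stand-in for GHCN stations).
stations = np.array([4, 11, 23, 37, 48])
truth = rng.normal(size=3) @ modes
obs = truth[stations]

# Regress the station values onto the EOFs sampled at the station locations,
# then reconstruct the full field from the fitted coefficients.
coef, *_ = np.linalg.lstsq(eofs[:, stations].T, obs, rcond=None)
recon = coef @ eofs

rmse = float(np.sqrt(np.mean((recon - truth) ** 2)))
```

With the signal confined to a few modes, a handful of stations suffices to recover the full field; the sampling-error analysis mentioned in the abstract corresponds to varying the number of retained EOF modes here.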

  14. Novel applications of the temporal kernel method: Historical and future radiative forcing

    NASA Astrophysics Data System (ADS)

    Portmann, R. W.; Larson, E.; Solomon, S.; Murphy, D. M.

    2017-12-01

We present a new estimate of the historical radiative forcing derived from the observed global mean surface temperature and a model-derived kernel function. Current estimates of historical radiative forcing are usually derived from climate models. Despite large variability in these models, the multi-model mean tends to do a reasonable job of representing the Earth system and climate. One method of diagnosing the transient radiative forcing in these models requires model output of the top-of-the-atmosphere (TOA) radiative imbalance and the global mean temperature anomaly. It is difficult to apply this method to historical observations due to the lack of TOA radiative measurements before CERES. We apply the temporal kernel method (TKM) of calculating radiative forcing to the historical global mean temperature anomaly. This novel approach is compared against the current regression-based methods using model outputs and shown to produce consistent forcing estimates, giving confidence in the forcing derived from the historical temperature record. The derived TKM radiative forcing provides an estimate of the forcing time series that the average climate model needs to produce the observed temperature record. This forcing time series is found to be in good overall agreement with previous estimates but includes significant differences that will be discussed. The historical anthropogenic aerosol forcing is estimated as a residual from the TKM and found to be consistent with earlier moderate forcing estimates. In addition, this method is applied to future temperature projections to estimate the radiative forcing required to achieve temperature goals such as those set in the Paris Agreement.

  15. Exploring precipitation pattern scaling methodologies and robustness among CMIP5 models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kravitz, Ben; Lynch, Cary; Hartin, Corinne

Pattern scaling is a well-established method for approximating modeled spatial distributions of changes in temperature by assuming a time-invariant pattern that scales with changes in global mean temperature. We compare two methods of pattern scaling for annual mean precipitation (regression and epoch difference) and evaluate which method is better in particular circumstances by quantifying their robustness to interpolation/extrapolation in time, inter-model variations, and inter-scenario variations. Both the regression and epoch-difference methods (the two most commonly used methods of pattern scaling) have good absolute performance in reconstructing the climate model output, measured as an area-weighted root mean square error. We decompose the precipitation response in the RCP8.5 scenario into a CO2 portion and a non-CO2 portion. Extrapolating RCP8.5 patterns to reconstruct precipitation change in the RCP2.6 scenario results in large errors due to violations of pattern scaling assumptions when this CO2/non-CO2 forcing decomposition is applied. As a result, the methodologies discussed in this paper can help provide precipitation fields to be utilized in other models (including integrated assessment models or impacts assessment models) for a wide variety of scenarios of future climate change.

  16. Exploring precipitation pattern scaling methodologies and robustness among CMIP5 models

    DOE PAGES

    Kravitz, Ben; Lynch, Cary; Hartin, Corinne; ...

    2017-05-12

Pattern scaling is a well-established method for approximating modeled spatial distributions of changes in temperature by assuming a time-invariant pattern that scales with changes in global mean temperature. We compare two methods of pattern scaling for annual mean precipitation (regression and epoch difference) and evaluate which method is better in particular circumstances by quantifying their robustness to interpolation/extrapolation in time, inter-model variations, and inter-scenario variations. Both the regression and epoch-difference methods (the two most commonly used methods of pattern scaling) have good absolute performance in reconstructing the climate model output, measured as an area-weighted root mean square error. We decompose the precipitation response in the RCP8.5 scenario into a CO2 portion and a non-CO2 portion. Extrapolating RCP8.5 patterns to reconstruct precipitation change in the RCP2.6 scenario results in large errors due to violations of pattern scaling assumptions when this CO2/non-CO2 forcing decomposition is applied. As a result, the methodologies discussed in this paper can help provide precipitation fields to be utilized in other models (including integrated assessment models or impacts assessment models) for a wide variety of scenarios of future climate change.
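The two pattern-scaling variants named in these records (regression and epoch difference) can be illustrated on synthetic data: given a field whose local change really is a fixed pattern times the global-mean change, both methods should recover that pattern. A minimal numpy sketch under those assumptions (all sizes and values invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic scenario: global-mean warming dT(t) and a fixed spatial pattern p,
# so the "true" local change is dT(t) * p plus noise.
years, ncell = 90, 40
dT = np.linspace(0.0, 3.0, years)                  # global mean temperature change (K)
p = rng.uniform(0.5, 1.5, size=ncell)              # time-invariant local amplification
field = np.outer(dT, p) + 0.05 * rng.normal(size=(years, ncell))

# Regression method: per-cell slope of local change against global-mean change.
slope = (field * dT[:, None]).sum(axis=0) / (dT ** 2).sum()

# Epoch-difference method: (late-epoch mean - early-epoch mean) scaled by the
# corresponding difference in global-mean change.
early, late = slice(0, 20), slice(70, 90)
epoch = (field[late].mean(axis=0) - field[early].mean(axis=0)) / (
    dT[late].mean() - dT[early].mean()
)

# Both estimates should recover the underlying pattern p.
err_reg = float(np.abs(slope - p).max())
err_epoch = float(np.abs(epoch - p).max())
```

The extrapolation failure the abstract reports corresponds to applying a `slope` or `epoch` pattern fitted under one forcing mix to a scenario where the pattern-times-global-mean assumption no longer holds.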

  17. Global image registration using a symmetric block-matching approach

    PubMed Central

    Modat, Marc; Cash, David M.; Daga, Pankaj; Winston, Gavin P.; Duncan, John S.; Ourselin, Sébastien

    2014-01-01

Most medical image registration algorithms suffer from a directionality bias that has been shown to substantially affect subsequent analyses. Several approaches have been proposed in the literature to address this bias in the context of nonlinear registration, but little work has been done for global registration. We propose a symmetric approach based on a block-matching technique and least-trimmed squares regression. The proposed method is suitable for multimodal registration and is robust to outliers in the input images. The symmetric framework is compared with the original asymmetric block-matching technique and is shown to outperform it in terms of accuracy and robustness. The methodology presented in this article has been made available to the community as part of the NiftyReg open-source package. PMID:26158035
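The role of least-trimmed-squares estimation in a block-matching pipeline is to fit a global transformation while discarding mismatched blocks. A minimal numpy sketch of the idea, reduced to estimating a 2-D translation from candidate block displacements via concentration steps (the data, contamination rate, and `keep` fraction are invented; the NiftyReg implementation estimates full global transformations):

```python
import numpy as np

rng = np.random.default_rng(8)

# Block matching yields candidate displacements; some are gross mismatches.
n = 100
true_shift = np.array([3.0, -2.0])
matches = true_shift + 0.1 * rng.normal(size=(n, 2))
matches[:20] = rng.uniform(-20, 20, size=(20, 2))  # 20% outlier matches

def lts_estimate(pts, keep=0.6, iters=20):
    """Trimmed estimate via concentration steps: repeatedly refit on the
    fraction of points with the smallest residuals."""
    h = int(keep * len(pts))
    est = pts.mean(axis=0)                         # non-robust initial fit
    for _ in range(iters):
        resid = np.linalg.norm(pts - est, axis=1)
        subset = pts[np.argsort(resid)[:h]]
        est = subset.mean(axis=0)
    return est

shift = lts_estimate(matches)
err = float(np.linalg.norm(shift - true_shift))
```

The plain mean is pulled toward the outliers, while the trimmed refit converges onto the inlier consensus, which is what makes the registration robust to structures present in only one image.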

  18. Automated robot-assisted surgical skill evaluation: Predictive analytics approach.

    PubMed

    Fard, Mahtab J; Ameri, Sattar; Darin Ellis, R; Chinnam, Ratna B; Pandya, Abhilash K; Klein, Michael D

    2018-02-01

    Surgical skill assessment has predominantly been a subjective task. Recently, technological advances such as robot-assisted surgery have created great opportunities for objective surgical evaluation. In this paper, we introduce a predictive framework for objective skill assessment based on movement trajectory data. Our aim is to build a classification framework to automatically evaluate the performance of surgeons with different levels of expertise. Eight global movement features are extracted from movement trajectory data captured by a da Vinci robot for surgeons with two levels of expertise - novice and expert. Three classification methods - k-nearest neighbours, logistic regression and support vector machines - are applied. The result shows that the proposed framework can classify surgeons' expertise as novice or expert with an accuracy of 82.3% for knot tying and 89.9% for a suturing task. This study demonstrates and evaluates the ability of machine learning methods to automatically classify expert and novice surgeons using global movement features. Copyright © 2017 John Wiley & Sons, Ltd.
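One of the three classifiers named above, k-nearest neighbours, is simple enough to sketch end to end with a leave-one-out evaluation like the paper's. This is an illustrative numpy toy on synthetic two-dimensional "movement features"; the real study uses eight features extracted from da Vinci trajectory data, and the feature values here are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic global movement features (e.g. path length, smoothness) for two
# skill levels, drawn as well-separated Gaussian clusters.
n_per = 30
novice = rng.normal(loc=[1.0, 1.0], scale=0.3, size=(n_per, 2))
expert = rng.normal(loc=[-1.0, -1.0], scale=0.3, size=(n_per, 2))
X = np.vstack([novice, expert])
y = np.array([0] * n_per + [1] * n_per)            # 0 = novice, 1 = expert

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(d)[:k]]
    return int(np.bincount(nearest).argmax())

# Leave-one-out accuracy: hold out each trial and classify it from the rest.
correct = sum(
    knn_predict(np.delete(X, i, axis=0), np.delete(y, i), X[i]) == y[i]
    for i in range(len(y))
)
accuracy = correct / len(y)
```

With well-separated clusters the leave-one-out accuracy is essentially perfect; the 82-90% figures in the paper reflect the much larger overlap of real novice and expert feature distributions.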

  19. Use of selected ambulatory dental services in Taiwan before and after global budgeting: a longitudinal study to identify trends in hospital and clinic-based services

    PubMed Central

    2012-01-01

    Background The Taiwan government adopted National Health Insurance (NHI) in 1995, providing universal health care to all citizens. It was financed by mandatory premium contributions made by employers, employees, and the government. Since then, the government has faced increasing challenges to control NHI expenditures. The aim of this study was to determine trends in the provision of dental services in Taiwan after the implementation of global budgeting in 1998 and to identify areas of possible concern. Methods This longitudinal before/after study was based on data from the National Health Insurance Research Database from 1996 to 2001. These data were subjected to logistic regression analysis. Linear regression analysis was used to examine changes in delivery of specific services after global budgeting implementation. Utilization of hospital and clinic services was compared. Results Reimbursement for dental services increased significantly while the number of visits per patient remained steady in both hospitals and clinics. In hospitals, visits for root canal procedures, ionomer restoration, tooth extraction and tooth scaling increased significantly. In dental clinics, visits for amalgam restoration decreased significantly while those for ionomer restoration, tooth extraction, and tooth scaling increased significantly. After the adoption of global budgeting, expenditures for dental services increased dramatically while the number of visits per patient did not, indicating a possible shift in patients to hospital facilities that received additional National Health Insurance funding. Conclusions The identified trends indicate increased utilization of dental services and uneven distribution of care and dentists. These trends may be compromising the quality of dental care delivered in Taiwan. PMID:23009095

  20. Factors influencing global antiretroviral procurement prices.

    PubMed

    Wirtz, Veronika J; Forsythe, Steven; Valencia-Mendoza, Atanacio; Bautista-Arredondo, Sergio

    2009-11-18

    Antiretroviral medicines (ARVs) are one of the most costly parts of HIV/AIDS treatment. Many countries are struggling to provide universal access to ARVs for all people living with HIV and AIDS. Although substantial price reductions of ARVs have occurred, especially between 2002 and 2008, achieving sustainable access for the next several decades remains a major challenge for most low- and middle-income countries. The objectives of the present study were twofold: first, to analyze global ARV prices between 2005 and 2008 and associated factors, particularly procurement methods and key donor policies on ARV procurement efficiency; second, to discuss the options of procurement processes and policies that should be considered when implementing or reforming access to ARV programs. An ARV-medicines price-analysis was carried out using the Global Price Reporting Mechanism from the World Health Organization. For a selection of 12 ARVs, global median prices and price variation were calculated. Linear regression models for each ARV were used to identify factors that were associated with lower procurement prices. Logistic regression models were used to identify the characteristics of those countries which procure below the highest and lowest direct manufactured costs. Three key factors appear to have an influence on a country's ARV prices: (a) whether the product is generic or not; (b) the socioeconomic status of the country; (c) whether the country is a member of the Clinton HIV/AIDS Initiative. Factors which did not influence procurement below the highest direct manufactured costs were HIV prevalence, procurement volume, whether the country belongs to the least developed countries or a focus country of the United States President's Emergency Plan For AIDS Relief. One of the principal mechanisms that can help to lower prices for ARV over the next several decades is increasing procurement efficiency. Benchmarking prices could be one useful tool to achieve this.

  1. Effects of land cover, topography, and built structure on seasonal water quality at multiple spatial scales.

    PubMed

    Pratt, Bethany; Chang, Heejun

    2012-03-30

The relationship among land cover, topography, built structure, and stream water quality in the Portland Metro region of Oregon and Clark County, Washington, USA, is analyzed using ordinary least squares (OLS) and geographically weighted regression (GWR) models. Two scales of analysis, a sectional watershed and a buffer, offered a local and a global investigation of the sources of stream pollutants. Model accuracy, measured by R2 values, fluctuated according to the scale, season, and regression method used. While most wet-season water quality parameters are associated with urban land covers, most dry-season water quality parameters are related to topographic features such as elevation and slope. GWR models, which take into consideration local relations of spatial autocorrelation, had stronger results than OLS regression models. In the multiple regression models, sectioned watershed results were consistently better than the sectioned buffer results, except for the dry-season pH and stream temperature parameters. This suggests that while riparian land cover does have an effect on water quality, a wider contributing area needs to be included in order to account for distant sources of pollutants. Copyright © 2012 Elsevier B.V. All rights reserved.
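The OLS-versus-GWR contrast in this record comes down to one change: GWR refits the regression at each location with distance-decaying weights, so the coefficient can vary across space. A minimal numpy sketch on synthetic data where the true effect of a predictor drifts along one spatial coordinate (all values and the Gaussian bandwidth are invented):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic watershed sections: predictor x (e.g. % urban cover) whose effect
# on a water quality measure varies along a spatial coordinate s.
n = 200
s = rng.uniform(0, 10, n)                          # spatial coordinate
x = rng.normal(size=n)
beta = 0.5 + 0.2 * s                               # spatially varying coefficient
yq = beta * x + 0.1 * rng.normal(size=n)

A = np.column_stack([np.ones(n), x])

# Global OLS: a single coefficient for the whole study area.
ols_coef = np.linalg.lstsq(A, yq, rcond=None)[0]
ols_slope = float(ols_coef[1])

def gwr_at(s0, bandwidth=1.0):
    """Weighted least squares centred at location s0 (Gaussian kernel)."""
    w = np.exp(-0.5 * ((s - s0) / bandwidth) ** 2)
    Aw = A * w[:, None]                            # rows scaled by weights
    return np.linalg.lstsq(Aw.T @ A, Aw.T @ yq, rcond=None)[0]

# The local slope tracks the spatially varying truth (0.5 + 0.2*8 = 2.1 at s=8),
# while OLS returns roughly the area-wide average effect.
local_slope_at_8 = float(gwr_at(8.0)[1])
```

This is why GWR outperformed OLS in the study: where the land-cover effect on water quality genuinely varies across the region, a single global slope is a biased compromise.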

  2. Who Adopts Improved Fuels and Cookstoves? A Systematic Review

    PubMed Central

    Lewis, Jessica J.

    2012-01-01

    Background: The global focus on improved cookstoves (ICSs) and clean fuels has increased because of their potential for delivering triple dividends: household health, local environmental quality, and regional climate benefits. However, ICS and clean fuel dissemination programs have met with low rates of adoption. Objectives: We reviewed empirical studies on ICSs and fuel choice to describe the literature, examine determinants of fuel and stove choice, and identify knowledge gaps. Methods: We conducted a systematic review of the literature on the adoption of ICSs or cleaner fuels by households in developing countries. Results are synthesized through a simple vote-counting meta-analysis. Results: We identified 32 research studies that reported 146 separate regression analyses of ICS adoption (11 analyses) or fuel choice (135 analyses) from Asia (60%), Africa (27%), and Latin America (19%). Most studies apply multivariate regression methods to consider 7–13 determinants of choice. Income, education, and urban location were positively associated with adoption in most but not all studies. However, the influence of fuel availability and prices, household size and composition, and sex is unclear. Potentially important drivers such as credit, supply-chain strengthening, and social marketing have been ignored. Conclusions: Adoption studies of ICSs or clean energy are scarce, scattered, and of differential quality, even though global distribution programs are quickly expanding. Future research should examine an expanded set of contextual variables to improve implementation of stove programs that can realize the “win-win-win” of health, local environmental quality, and climate associated with these technologies. PMID:22296719

  3. Half a billion surgical cases: Aligning surgical delivery with best-performing health systems

    PubMed Central

    Shrime, Mark G.; Daniels, Kimberly M.; Meara, John G.

    2015-01-01

    Background Surgical delivery varies 200-fold across countries. No direct correlation exists, however, between surgical delivery and health outcomes, making it difficult to pinpoint a goal for surgical scale-up. This report determines the amount of surgery that would be delivered worldwide if the world aligned itself with countries providing the best health outcomes. Methods Annual rates of surgical delivery have been published previously for 129 countries. Five health outcomes were plotted against reported surgical delivery. Univariate and multivariate polynomial regression curves were fit, and the optimal point on each regression curve was determined by solving for first-order conditions. The country closest to the optimum for each health outcome was taken as representative of the best-performing health system. Monetary inputs to and surgical procedures provided by these systems were scaled to the global population. Results For 3 of the 5 health outcomes, optima could be found. Globally, 315 million procedures currently are provided annually. If global delivery mirrored the 3 best-performing countries, between 360 million and 460 million cases would be provided annually. With population growth, this will increase to approximately half a billion cases by 2030. Health systems delivering these outcomes spend approximately 10% of their GDP on health. Conclusion This is the first study to provide empirical evidence for the surgical output that an ideal health system would provide. Our results project ideal delivery worldwide of approximately 550 million annual surgical cases by 2030. PMID:25934078

  4. Improved Regression Analysis of Temperature-Dependent Strain-Gage Balance Calibration Data

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.

    2015-01-01

An improved approach is discussed that may be used to directly include first and second order temperature effects in the load prediction algorithm of a wind tunnel strain-gage balance. The improved approach was designed for the Iterative Method that fits strain-gage outputs as a function of calibration loads and uses a load iteration scheme during the wind tunnel test to predict loads from measured gage outputs. The improved approach assumes that the strain-gage balance is at a constant uniform temperature when it is calibrated and used. First, the method introduces a new independent variable for the regression analysis of the balance calibration data. The new variable is defined as the difference between the uniform temperature of the balance and a global reference temperature. This reference temperature should be the primary calibration temperature of the balance so that, if needed, a tare load iteration can be performed. Then, two temperature-dependent terms are included in the regression models of the gage outputs. They are the temperature difference itself and the square of the temperature difference. Simulated temperature-dependent data obtained from Triumph Aerospace's 2013 calibration of NASA's ARC-30K five-component semi-span balance is used to illustrate the application of the improved approach.

  5. Source apportionment of soil heavy metals using robust absolute principal component scores-robust geographically weighted regression (RAPCS-RGWR) receptor model.

    PubMed

    Qu, Mingkai; Wang, Yan; Huang, Biao; Zhao, Yongcun

    2018-06-01

    The traditional source apportionment models, such as absolute principal component scores-multiple linear regression (APCS-MLR), are usually susceptible to outliers, which may be widely present in the regional geochemical dataset. Furthermore, the models are merely built on variable space instead of geographical space and thus cannot effectively capture the local spatial characteristics of each source contributions. To overcome the limitations, a new receptor model, robust absolute principal component scores-robust geographically weighted regression (RAPCS-RGWR), was proposed based on the traditional APCS-MLR model. Then, the new method was applied to the source apportionment of soil metal elements in a region of Wuhan City, China as a case study. Evaluations revealed that: (i) RAPCS-RGWR model had better performance than APCS-MLR model in the identification of the major sources of soil metal elements, and (ii) source contributions estimated by RAPCS-RGWR model were more close to the true soil metal concentrations than that estimated by APCS-MLR model. It is shown that the proposed RAPCS-RGWR model is a more effective source apportionment method than APCS-MLR (i.e., non-robust and global model) in dealing with the regional geochemical dataset. Copyright © 2018 Elsevier B.V. All rights reserved.
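The traditional APCS-MLR baseline that this record improves on pairs principal component scores with multiple linear regression: components stand in for sources, and regressing each metal on the scores apportions its concentration. A simplified numpy sketch on synthetic data (it omits the "absolute" score correction of full APCS and uses invented source profiles):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic soil samples: two pollution sources with fixed metal profiles.
n, nmetal = 150, 6
src_profiles = np.array([[1.0, 0.8, 0.1, 0.0, 0.3, 0.0],
                         [0.0, 0.1, 0.9, 1.0, 0.2, 0.5]])
contrib = rng.uniform(0, 1, size=(n, 2))           # per-sample source strengths
conc = contrib @ src_profiles + 0.02 * rng.normal(size=(n, nmetal))

# Principal component scores from standardised concentrations (via SVD).
z = (conc - conc.mean(axis=0)) / conc.std(axis=0)
u, sv, _ = np.linalg.svd(z, full_matrices=False)
scores = u[:, :2] * sv[:2]                         # two leading component scores

# Multiple linear regression of each metal on the scores; the fitted values
# apportion each metal's concentration among the components.
A = np.column_stack([np.ones(n), scores])
coefs, *_ = np.linalg.lstsq(A, conc, rcond=None)
fitted = A @ coefs
r2 = float(1 - ((conc - fitted) ** 2).sum()
           / ((conc - conc.mean(axis=0)) ** 2).sum())
```

Because every sample's least-squares fit here is global and unweighted, a cluster of outlying samples can distort all the coefficients, which is exactly the weakness the robust, geographically weighted RAPCS-RGWR variant targets.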

  6. Heidelberg Retina Tomograph 3 machine learning classifiers for glaucoma detection

    PubMed Central

    Townsend, K A; Wollstein, G; Danks, D; Sung, K R; Ishikawa, H; Kagemann, L; Gabriele, M L; Schuman, J S

    2010-01-01

Aims To assess the performance of classifiers trained on Heidelberg Retina Tomograph 3 (HRT3) parameters for discriminating between healthy and glaucomatous eyes. Methods Classifiers were trained using HRT3 parameters from 60 healthy subjects and 140 glaucomatous subjects. The classifiers were trained on all 95 variables and on smaller sets created with backward elimination. Seven types of classifiers, including Support Vector Machines with a radial basis kernel (SVM-radial) and Recursive Partitioning and Regression Trees (RPART), were trained on the parameters. The area under the ROC curve (AUC) was calculated for the classifiers, individual parameters, and HRT3 glaucoma probability scores (GPS). Classifier AUCs and leave-one-out accuracies were compared with the highest individual-parameter and GPS AUCs and accuracies. Results The highest AUC and accuracy for an individual parameter were 0.848 and 0.79, for vertical cup/disc ratio (vC/D). For GPS, global GPS performed best, with AUC 0.829 and accuracy 0.78. SVM-radial with all parameters showed significant improvement over global GPS and vC/D, with AUC 0.916 and accuracy 0.85. RPART with all parameters provided significant improvement over global GPS with AUC 0.899, and significant improvement over global GPS and vC/D with accuracy 0.875. Conclusions Machine learning classifiers of HRT3 data provide significant enhancement over current methods for detection of glaucoma. PMID:18523087

  7. Optimizing human activity patterns using global sensitivity analysis.

    PubMed

    Fairchild, Geoffrey; Hickmann, Kyle S; Mniszewski, Susan M; Del Valle, Sara Y; Hyman, James M

    2014-12-01

    Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule's regularity for a population. We show how to tune an activity's regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. We use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations.

  8. Optimizing human activity patterns using global sensitivity analysis

    PubMed Central

    Hickmann, Kyle S.; Mniszewski, Susan M.; Del Valle, Sara Y.; Hyman, James M.

    2014-01-01

    Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule’s regularity for a population. We show how to tune an activity’s regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. We use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations. PMID:25580080

  9. Optimizing human activity patterns using global sensitivity analysis

    DOE PAGES

    Fairchild, Geoffrey; Hickmann, Kyle S.; Mniszewski, Susan M.; ...

    2013-12-10

Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule's regularity for a population. We show how to tune an activity's regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. Here we use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Finally, though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations.
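The SampEn statistic targeted by this tuning can be computed directly from its definition: count template matches of length m and m+1 under a tolerance r, then take the negative log of their ratio. A minimal numpy sketch on toy "schedules" (the m=2, r=0.2 settings are common defaults, not values taken from the paper):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn = -ln(A/B), where B counts pairs of length-m templates within
    Chebyshev distance r, and A does the same for length m+1; self-matches
    are excluded (counting both orderings of each pair cancels in the ratio)."""
    x = np.asarray(x, dtype=float)

    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        d = np.abs(templ[:, None, :] - templ[None, :, :]).max(axis=2)
        return (d <= r).sum() - len(templ)         # drop diagonal self-matches

    B, A = count(m), count(m + 1)
    return float(-np.log(A / B))

# A perfectly periodic schedule has near-zero SampEn; a random one is higher.
rng = np.random.default_rng(5)
regular = np.tile([0.0, 1.0], 50)
noisy = rng.uniform(0, 1, 100)
se_regular = sample_entropy(regular)
se_noisy = sample_entropy(noisy)
```

Tuning a schedule's regularity, as in the paper, amounts to adjusting activity parameters until this scalar lands at a target value, which is what makes the sensitivity analysis and harmony search worthwhile.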

  10. Mapping 2000-2010 Impervious Surface Change in India Using Global Land Survey Landsat Data

    NASA Technical Reports Server (NTRS)

    Wang, Panshi; Huang, Chengquan; Brown De Colstoun, Eric C.

    2017-01-01

Understanding and monitoring the environmental impacts of global urbanization requires better urban datasets. Continuous-field impervious surface change (ISC) mapping using Landsat data is an effective way to quantify the spatiotemporal dynamics of urbanization. It is well acknowledged that Landsat-based estimation of impervious surface is subject to seasonal and phenological variations. The overall goal of this paper is to map 2000-2010 ISC for India using Global Land Survey datasets and training data available only for 2010. To this end, a method was developed that transfers the regression tree model developed for mapping 2010 impervious surface to 2000 using an iterative training and prediction (ITP) approach. An independent validation dataset was also developed using Google Earth imagery. Based on the reference ISC from the validation dataset, the RMSE of the predicted ISC was estimated to be 18.4%. At 95% confidence, the total estimated ISC for India between 2000 and 2010 is 2274.62 +/- 7.84 sq km.

  11. The Highly Adaptive Lasso Estimator

    PubMed Central

    Benkeser, David; van der Laan, Mark

    2017-01-01

Estimation of a regression function is a common goal of statistical learning. We propose a novel nonparametric regression estimator that, in contrast to many existing methods, does not rely on local smoothness assumptions nor is it constructed using local smoothing techniques. Instead, our estimator respects global smoothness constraints by virtue of falling in a class of right-hand continuous functions with left-hand limits that have variation norm bounded by a constant. Using empirical process theory, we establish a fast minimal rate of convergence of our proposed estimator and illustrate how such an estimator can be constructed using standard software. In simulations, we show that the finite-sample performance of our estimator is competitive with other popular machine learning techniques across a variety of data-generating mechanisms. We also illustrate competitive performance in real data examples using several publicly available data sets. PMID:29094111

  12. Optimal consistency in microRNA expression analysis using reference-gene-based normalization.

    PubMed

    Wang, Xi; Gardiner, Erin J; Cairns, Murray J

    2015-05-01

    Normalization of high-throughput molecular expression profiles secures differential expression analysis between samples of different phenotypes or biological conditions, and facilitates comparison between experimental batches. While the same general principles apply to microRNA (miRNA) normalization, there is mounting evidence that global shifts in their expression patterns occur in specific circumstances, which pose a challenge for normalizing miRNA expression data. As an alternative to global normalization, which has the propensity to flatten large trends, normalization against constitutively expressed reference genes presents an advantage through their relative independence. Here we investigated the performance of reference-gene-based (RGB) normalization for differential miRNA expression analysis of microarray expression data, and compared the results with other normalization methods, including: quantile, variance stabilization, robust spline, simple scaling, rank invariant, and Loess regression. The comparative analyses were executed using miRNA expression in tissue samples derived from subjects with schizophrenia and non-psychiatric controls. We proposed a consistency criterion for evaluating methods by examining the overlapping of differentially expressed miRNAs detected using different partitions of the whole data. Based on this criterion, we found that RGB normalization generally outperformed global normalization methods. Thus we recommend the application of RGB normalization for miRNA expression data sets, and believe that this will yield a more consistent and useful readout of differentially expressed miRNAs, particularly in biological conditions characterized by large shifts in miRNA expression.
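The core argument of this record, that global normalization flattens genuine global expression shifts while reference-gene-based (RGB) normalization preserves them, can be demonstrated numerically. A minimal numpy sketch with two synthetic log-scale samples (the counts, shift size, and number of reference genes are invented):

```python
import numpy as np

rng = np.random.default_rng(6)

# Two samples of miRNA expression (log scale): sample B carries a genuine
# global up-shift in the miRNAs, while reference genes (last 3 rows) are stable.
n_mirna, n_ref = 20, 3
base = rng.normal(5, 1, n_mirna + n_ref)
sample_a = base + 0.1 * rng.normal(size=base.size)
sample_b = base.copy()
sample_b[:n_mirna] += 1.0                          # true global shift in miRNAs
sample_b += 0.1 * rng.normal(size=base.size)

# RGB normalization: subtract each sample's mean reference-gene level.
rgb_a = sample_a - sample_a[n_mirna:].mean()
rgb_b = sample_b - sample_b[n_mirna:].mean()
rgb_shift = float((rgb_b - rgb_a)[:n_mirna].mean())   # ~1.0: shift preserved

# Global (mean-centring) normalization absorbs most of the same shift.
glob_a = sample_a - sample_a.mean()
glob_b = sample_b - sample_b.mean()
glob_shift = float((glob_b - glob_a)[:n_mirna].mean())
```

Because the global mean of sample B includes the shifted miRNAs, centring on it removes most of the real difference; the reference genes, being independent of the shift, leave it intact, which is the consistency advantage the study reports.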

  13. A bioavailable strontium isoscape for Western Europe: A machine learning approach

    PubMed Central

    von Holstein, Isabella C. C.; Laffoon, Jason E.; Willmes, Malte; Liu, Xiao-Ming; Davies, Gareth R.

    2018-01-01

    Strontium isotope ratios (87Sr/86Sr) are gaining considerable interest as a geolocation tool and are now widely applied in archaeology, ecology, and forensic research. However, their application for provenance requires the development of baseline models predicting surficial 87Sr/86Sr variations (“isoscapes”). A variety of empirically-based and process-based models have been proposed to build terrestrial 87Sr/86Sr isoscapes but, in their current forms, those models are not mature enough to be integrated with continuous-probability surface models used in geographic assignment. In this study, we aim to overcome those limitations and to predict 87Sr/86Sr variations across Western Europe by combining process-based models and a series of remote-sensing geospatial products into a regression framework. We find that random forest regression significantly outperforms other commonly used regression and interpolation methods, and efficiently predicts the multi-scale patterning of 87Sr/86Sr variations by accounting for geological, geomorphological and atmospheric controls. Random forest regression also provides an easily interpretable and flexible framework to integrate different types of environmental auxiliary variables required to model the multi-scale patterning of 87Sr/86Sr variability. The method is transferable to different scales and resolutions and can be applied to the large collection of geospatial data available at local and global levels. The isoscape generated in this study provides the most accurate 87Sr/86Sr predictions in bioavailable strontium for Western Europe (R2 = 0.58 and RMSE = 0.0023) to date, as well as a conservative estimate of spatial uncertainty by applying quantile regression forest. We anticipate that the method presented in this study combined with the growing numbers of bioavailable 87Sr/86Sr data and satellite geospatial products will extend the applicability of the 87Sr/86Sr geo-profiling tool in provenance applications. PMID:29847595

  14. Development and Testing of Building Energy Model Using Non-Linear Auto Regression Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Arida, Maya Ahmad

    The concept of sustainable development emerged in 1972 and has since become one of the most important approaches to conserving natural resources and energy; with rising energy costs and increasing awareness of the effects of global warming, building energy saving methods and models are ever more necessary for a sustainable future. According to the U.S. Energy Information Administration (EIA), buildings in the U.S. today consume 72 percent of the electricity produced and use 55 percent of U.S. natural gas. Buildings account for about 40 percent of the energy consumed in the United States, more than industry or transportation. Of this energy, heating and cooling systems use about 55 percent. If energy-use trends continue, buildings will become the largest consumer of global energy by 2025. This thesis proposes procedures and analysis techniques for building energy systems and optimization methods using time-series auto-regressive artificial neural networks. The model predicts whole-building energy consumption as a function of four input variables: dry-bulb and wet-bulb outdoor air temperatures, hour of day, and type of day. The proposed model and the optimization process are tested using data collected from an existing building located in Greensboro, NC. The testing results show that the model captures the system performance very well. An optimization method was also developed to automate the search for the model structure that yields the most accurate predictions against the actual data. The results show that the developed model is sufficiently accurate for use in various energy efficiency and saving estimation applications.
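
    A minimal sketch of the auto-regressive idea, assuming a small feed-forward network over lagged loads plus exogenous inputs (the thesis's actual NARX architecture, training data, and variables are not reproduced here; everything below is synthetic):

```python
# Sketch of a NARX-style model: predict hourly energy use from its own lags
# plus outdoor temperature and hour of day. Synthetic data; the small network
# is an assumption standing in for the thesis's architecture.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
hours = np.arange(2000)
temp = 15 + 10 * np.sin(2 * np.pi * hours / 24)   # dry-bulb proxy
load = 50 + 2.0 * temp + 5 * np.sin(2 * np.pi * hours / 24) \
    + rng.normal(0, 1, hours.size)

lags = 3                                   # feed back the last 3 load values
X = np.column_stack([load[i:-(lags - i)] for i in range(lags)]
                    + [temp[lags:], hours[lags:] % 24])
y = load[lags:]

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
).fit(X[:1500], y[:1500])
r2 = model.score(X[1500:], y[1500:])       # held-out fit quality
print(f"held-out R^2 = {r2:.3f}")
```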

  15. Chronic widespread pain prevalence in the general population: A systematic review.

    PubMed

    Andrews, P; Steultjens, M; Riskowski, J

    2018-01-01

    Chronic widespread pain (CWP) is a significant burden in communities. Understanding the impact that population-dependent (e.g. age, gender) and context-dependent (e.g. survey method, region, inequality level) factors have on CWP prevalence may provide a foundation for population-based strategies to address CWP. Therefore, the purpose of this study was to estimate the global prevalence of CWP and evaluate the population and contextual factors associated with CWP. A systematic review of CWP prevalence studies (1990-2017) in the general population was undertaken. Meta-analyses were conducted to determine CWP prevalence, and study population data and contextual factors were evaluated using a meta-regression. Thirty-nine manuscripts met the inclusion criteria. Study CWP prevalence ranged from 1.4% to 24.0%, with CWP prevalence ranging from 0.8% to 15.3% in men and from 1.7% to 22.1% in women. Estimated overall CWP prevalence was 9.6% (8.0-11.2%). Meta-regression analyses showed gender, United Nations country development status, and human development index (HDI) influenced CWP prevalence, while survey method, region, methodological and reporting quality, and inequality showed no significant effect on the CWP estimate. Globally, CWP affects one in ten individuals within the general population, with women more likely to experience CWP than men. HDI was the socioeconomic factor most related to CWP prevalence, with those in more developed countries having a lower CWP prevalence than those in less developed countries. Most CWP estimates were from developed countries, and CWP estimates from countries with a lower socioeconomic position are needed to further refine the global estimate of CWP. This systematic review and meta-analysis updates the current global CWP prevalence by examining the population-level (e.g. age, gender) and contextual (e.g. country development status, survey style, reporting and methodological quality) factors associated with CWP prevalence.
This analysis provides evidence to support higher levels of CWP in countries with a lower socioeconomic position relative to countries with a higher socioeconomic position. © 2017 European Pain Federation - EFIC®.
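
    The pooling behind such prevalence estimates can be illustrated with a DerSimonian-Laird random-effects meta-analysis; the study prevalences and sample sizes below are invented for the sketch, not the review's data:

```python
# DerSimonian-Laird random-effects pooling of study prevalences, the kind of
# meta-analysis behind an overall prevalence figure. All inputs are synthetic.
import numpy as np

p = np.array([0.04, 0.08, 0.10, 0.12, 0.15])   # hypothetical study prevalences
n = np.array([1200, 800, 1500, 600, 1000])     # hypothetical sample sizes
v = p * (1 - p) / n                            # within-study variances
w = 1 / v
fixed = np.sum(w * p) / np.sum(w)              # fixed-effect pooled estimate
Q = np.sum(w * (p - fixed) ** 2)               # heterogeneity statistic
k = len(p)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1 / (v + tau2)                          # random-effects weights
pooled = np.sum(w_re * p) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
print(f"pooled prevalence = {pooled:.3f} ± {1.96 * se:.3f}")
```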

  16. Globally efficient non-parametric inference of average treatment effects by empirical balancing calibration weighting

    PubMed Central

    Chan, Kwun Chuen Gary; Yam, Sheung Chi Phillip; Zhang, Zheng

    2015-01-01

    Summary The estimation of average treatment effects based on observational data is extremely important in practice and has been studied by generations of statisticians under different frameworks. Existing globally efficient estimators require non-parametric estimation of a propensity score function, an outcome regression function or both, but their performance can be poor in practical sample sizes. Without explicitly estimating either function, we consider a wide class of calibration weights constructed to attain an exact three-way balance of the moments of observed covariates among the treated, the control, and the combined group. The wide class includes exponential tilting, empirical likelihood and generalized regression as important special cases, and extends survey calibration estimators to different statistical problems and with important distinctions. Global semiparametric efficiency for the estimation of average treatment effects is established for this general class of calibration estimators. The results show that efficiency can be achieved by solely balancing the covariate distributions without resorting to direct estimation of the propensity score or outcome regression function. We also propose a consistent estimator for the efficient asymptotic variance, which does not involve additional functional estimation of either the propensity score or the outcome regression functions. The proposed variance estimator outperforms existing estimators that require a direct approximation of the efficient influence function. PMID:27346982

  17. Globally efficient non-parametric inference of average treatment effects by empirical balancing calibration weighting.

    PubMed

    Chan, Kwun Chuen Gary; Yam, Sheung Chi Phillip; Zhang, Zheng

    2016-06-01

    The estimation of average treatment effects based on observational data is extremely important in practice and has been studied by generations of statisticians under different frameworks. Existing globally efficient estimators require non-parametric estimation of a propensity score function, an outcome regression function or both, but their performance can be poor in practical sample sizes. Without explicitly estimating either function, we consider a wide class of calibration weights constructed to attain an exact three-way balance of the moments of observed covariates among the treated, the control, and the combined group. The wide class includes exponential tilting, empirical likelihood and generalized regression as important special cases, and extends survey calibration estimators to different statistical problems and with important distinctions. Global semiparametric efficiency for the estimation of average treatment effects is established for this general class of calibration estimators. The results show that efficiency can be achieved by solely balancing the covariate distributions without resorting to direct estimation of the propensity score or outcome regression function. We also propose a consistent estimator for the efficient asymptotic variance, which does not involve additional functional estimation of either the propensity score or the outcome regression functions. The proposed variance estimator outperforms existing estimators that require a direct approximation of the efficient influence function.
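
    One member of the paper's class of calibration weights, exponential tilting, can be sketched by solving a convex dual so that weighted covariate means exactly match a target; here the full three-way balance is reduced to a single margin (control group versus a combined-sample target), and the data are synthetic:

```python
# Exponential-tilting calibration weights: choose control-group weights so the
# weighted covariate means hit the target exactly. One margin only; synthetic.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
Xc = rng.normal(0.5, 1.0, size=(300, 2))   # control covariates (mean-shifted)
target = np.zeros(2)                       # combined-sample means to match

def dual(lam):
    # log-sum-exp dual; its gradient is (weighted mean - target)
    return np.log(np.exp(Xc @ lam).sum()) - lam @ target

lam = minimize(dual, np.zeros(2), method="BFGS").x
w = np.exp(Xc @ lam)
w /= w.sum()                               # normalized calibration weights
balanced = w @ Xc                          # weighted covariate means
print(balanced)                            # ≈ target up to solver tolerance
```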

  18. Age estimation standards for a Western Australian population using the coronal pulp cavity index.

    PubMed

    Karkhanis, Shalmira; Mack, Peter; Franklin, Daniel

    2013-09-10

    Age estimation is a vital aspect in creating a biological profile and aids investigators by narrowing down potentially matching identities from the available pool. In addition to routine casework, in the present global political scenario, age estimation in living individuals is required in cases of refugees, asylum seekers, human trafficking and to ascertain age of criminal responsibility. Thus robust methods that are simple, non-invasive and ethically viable are required. The aim of the present study is, therefore, to test the reliability and applicability of the coronal pulp cavity index method, for the purpose of developing age estimation standards for an adult Western Australian population. A total of 450 orthopantomograms (220 females and 230 males) of Australian individuals were analyzed. Crown and coronal pulp chamber heights were measured in the mandibular left and right premolars, and the first and second molars. These measurements were then used to calculate the tooth coronal index. Data was analyzed using paired sample t-tests to assess bilateral asymmetry followed by simple linear and multiple regressions to develop age estimation models. The most accurate age estimation based on simple linear regression model was with mandibular right first molar (SEE ±8.271 years). Multiple regression models improved age prediction accuracy considerably and the most accurate model was with bilateral first and second molars (SEE ±6.692 years). This study represents the first investigation of this method in a Western Australian population and our results indicate that the method is suitable for forensic application. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  19. TLE uncertainty estimation using robust weighted differencing

    NASA Astrophysics Data System (ADS)

    Geul, Jacco; Mooij, Erwin; Noomen, Ron

    2017-05-01

    Accurate knowledge of satellite orbit errors is essential for many types of analyses. Unfortunately, for two-line elements (TLEs) this is not available. This paper presents a weighted differencing method using robust least-squares regression for estimating many important error characteristics. The method is applied to both classic and enhanced TLEs, compared to previous implementations, and validated using Global Positioning System (GPS) solutions for the GOCE satellite in Low-Earth Orbit (LEO), prior to its re-entry. The method is found to be more accurate than previous TLE differencing efforts in estimating initial uncertainty, as well as error growth. The method also proves more reliable and requires no data filtering (such as outlier removal). Sensitivity analysis shows a strong relationship between argument of latitude and covariance (standard deviations and correlations), which the method is able to approximate. Overall, the method proves accurate, computationally fast, and robust, and is applicable to any object in the satellite catalogue (SATCAT).
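
    The robust step can be illustrated on synthetic data: fit error growth against propagation time with a Huber loss so that gross outliers need no pre-filtering. This is a generic robust-regression sketch, not the paper's exact estimator:

```python
# Toy robust fit of TLE-like error growth: Huber loss vs. ordinary least
# squares when some epochs carry gross outliers. All data synthetic.
import numpy as np
from sklearn.linear_model import HuberRegressor, LinearRegression

rng = np.random.default_rng(3)
t = np.linspace(0, 5, 200)[:, None]              # days since epoch
err = 0.1 + 0.4 * t.ravel() + rng.normal(0, 0.05, 200)  # km, linear growth
err[-20:] += 5.0                                 # gross outliers at late times

huber = HuberRegressor().fit(t, err)             # downweights the outliers
ols = LinearRegression().fit(t, err)             # pulled up by the outliers
print(f"growth rate: huber={huber.coef_[0]:.3f}, ols={ols.coef_[0]:.3f} km/day")
```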

  20. A systematic evaluation of normalization methods in quantitative label-free proteomics.

    PubMed

    Välikangas, Tommi; Suomi, Tomi; Elo, Laura L

    2018-01-01

    To date, mass spectrometry (MS) data remain inherently biased for reasons ranging from sample handling to differences caused by the instrumentation. Normalization is the process that aims to account for the bias and make samples more comparable. The selection of a proper normalization method is a pivotal task for the reliability of the downstream analysis and results. Many normalization methods commonly used in proteomics have been adapted from the DNA microarray techniques. Previous studies comparing normalization methods in proteomics have focused mainly on intragroup variation. In this study, several popular and widely used normalization methods representing different strategies in normalization are evaluated using three spike-in and one experimental mouse label-free proteomic data sets. The normalization methods are evaluated in terms of their ability to reduce variation between technical replicates, their effect on differential expression analysis and their effect on the estimation of logarithmic fold changes. Additionally, we examined whether normalizing the whole data globally or in segments for the differential expression analysis has an effect on the performance of the normalization methods. We found that variance stabilization normalization (Vsn) reduced variation the most between technical replicates in all examined data sets. Vsn also performed consistently well in the differential expression analysis. Linear regression normalization and local regression normalization performed also systematically well. Finally, we discuss the choice of a normalization method and some qualities of a suitable normalization method in the light of the results of our evaluation. © The Author 2016. Published by Oxford University Press.
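
    One of the simpler strategies evaluated in such comparisons, median normalization of log-intensities, can be shown in a few lines (the data matrix is synthetic; Vsn itself involves a variance-stabilizing transform and is more involved):

```python
# Median normalization: remove per-run intensity shifts by aligning the
# column medians of the log-intensity matrix. Synthetic 100-protein example.
import numpy as np

rng = np.random.default_rng(4)
log_int = rng.normal(20, 1, size=(100, 6))              # 100 proteins, 6 runs
log_int += np.array([0.0, 0.5, -0.3, 0.2, 0.8, -0.6])   # per-run bias

med = np.median(log_int, axis=0)
normalized = log_int - med + med.mean()                 # align run medians
print(np.round(np.median(normalized, axis=0), 3))       # equal after alignment
```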

  1. Revealing the underlying drivers of disaster risk: a global analysis

    NASA Astrophysics Data System (ADS)

    Peduzzi, Pascal

    2017-04-01

    Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used for extracting population and economic exposures based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using a multiple regression analysis on these individual events it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty, governance as well as other underlying factors (e.g. remoteness) can explain the magnitude of past disasters. Analysis was also performed to highlight the role of future trends in population and climate change and how these may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk taking into account its various components. The same methodology can be applied to various types of risk at local to global scale. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied to mortality risk in GAR 2009 and GAR 2011. New models ranging from global assets exposure to global flood hazard models were also recently developed to improve the resolution of the risk analysis and applied through CAPRA software to provide probabilistic economic risk assessments such as Average Annual Losses (AAL) and Probable Maximum Losses (PML) in GAR 2013 and GAR 2015. In parallel, similar methodologies were developed to highlight the role of ecosystems for Climate Change Adaptation (CCA) and Disaster Risk Reduction (DRR). New developments may include slow hazards (e.g. soil degradation and droughts) and natech hazards (by intersecting with georeferenced critical infrastructures). The various global hazard, exposure and risk models can be visualized and downloaded through the PREVIEW Global Risk Data Platform.

  2. Melamine detection by mid- and near-infrared (MIR/NIR) spectroscopy: a quick and sensitive method for dairy products analysis including liquid milk, infant formula, and milk powder.

    PubMed

    Balabin, Roman M; Smirnov, Sergey V

    2011-07-15

    Melamine (2,4,6-triamino-1,3,5-triazine) is a nitrogen-rich chemical implicated in the pet and human food recalls and in the global food safety scares involving milk products. Due to the serious health concerns associated with melamine consumption and the extensive scope of affected products, rapid and sensitive methods to detect melamine's presence are essential. We propose the use of spectroscopy data, produced in particular by near-infrared (near-IR/NIR) and mid-infrared (mid-IR/MIR) spectroscopies, for melamine detection in complex dairy matrixes. None of the IR-based methods for melamine detection reported to date has unambiguously shown wide applicability to different dairy products together with a limit of detection (LOD) below 1 ppm on an independent sample set. It was found that infrared spectroscopy is an effective tool to detect melamine in dairy products, such as infant formula, milk powder, or liquid milk. A LOD below 1 ppm (0.76 ± 0.11 ppm) can be reached if a correct spectrum preprocessing (pretreatment) technique and a correct multivariate data analysis (MDA) algorithm, such as partial least squares regression (PLS), polynomial PLS (Poly-PLS), artificial neural network (ANN), support vector regression (SVR), or least squares support vector machine (LS-SVM), are used for spectrum analysis. The relationship between the MIR/NIR spectrum of milk products and melamine content is nonlinear. Thus, nonlinear regression methods are needed to correctly predict the triazine-derivative content of milk products. It can be concluded that mid- and near-infrared spectroscopy can be regarded as a quick, sensitive, robust, and low-cost method for liquid milk, infant formula, and milk powder analysis. Copyright © 2011 Elsevier B.V. All rights reserved.

  3. Management of health care expenditure by soft computing methodology

    NASA Astrophysics Data System (ADS)

    Maksimović, Goran; Jović, Srđan; Jovanović, Radomir; Aničić, Obrad

    2017-01-01

    This study addressed the management of health care expenditure using soft computing methodology. The main goal was to predict gross domestic product (GDP) from several factors of health care expenditure. Soft computing methodologies were applied since GDP prediction is a very complex task. The performance of the proposed predictors was confirmed with simulation results. According to the results, support vector regression (SVR) has better prediction accuracy than the other soft computing methodologies. The methods benefit from soft computing's capabilities for global optimization, which avoid local-minimum issues.

  4. Have the temperature time series a structural change after 1998?

    NASA Astrophysics Data System (ADS)

    Werner, Rolf; Valev, Dimitare; Danov, Dimitar

    2012-07-01

    The global and hemispheric temperature GISS and HadCRUT3 time series were analysed for structural changes. We postulate continuity of the piecewise temperature function over time. Slopes are calculated for a sequence of segments delimited by time thresholds, using a standard method: restricted linear regression with dummy variables. We performed the calculations and tests for different numbers of thresholds, with the thresholds searched continuously within specified time intervals. The F-statistic is used to identify the time points of the structural changes.
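
    The segmented fit with a continuity constraint can be sketched as a line plus a hinge term max(t - tau, 0), with tau found by grid search and an F-statistic comparing residual sums of squares; the series below is synthetic, not GISS or HadCRUT3:

```python
# Continuous piecewise-linear fit with one breakpoint: compare a single line
# against line + hinge, F-test on the residual sums of squares. Synthetic data.
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(1950, 2011).astype(float)
y = 0.005 * (t - 1950) + np.where(t > 1980, 0.02 * (t - 1980), 0.0) \
    + rng.normal(0, 0.05, t.size)

def rss(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return r @ r

rss0 = rss(np.column_stack([np.ones_like(t), t]), y)   # restricted: one line
rss1, tau_hat = min(
    ((rss(np.column_stack([np.ones_like(t), t, np.maximum(t - tau, 0)]), y), tau)
     for tau in t[5:-5]),                              # grid over thresholds
    key=lambda pair: pair[0],
)
F = (rss0 - rss1) / (rss1 / (t.size - 3))              # 1 extra parameter
print(f"breakpoint ≈ {tau_hat:.0f}, F = {F:.1f}")
```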

  5. Forecasting Electric Power Generation of Photovoltaic Power System for Energy Network

    NASA Astrophysics Data System (ADS)

    Kudo, Mitsuru; Takeuchi, Akira; Nozaki, Yousuke; Endo, Hisahito; Sumita, Jiro

    Recently, there has been an increase in concern about the global environment. Interest is growing in developing an energy network in which new energy systems such as photovoltaics and fuel cells generate power locally, while electric power and heat are controlled through a communications network. We developed a power generation forecast method for photovoltaic power systems in an energy network. The method makes use of weather information and regression analysis. We forecast the power output of the photovoltaic power system installed at Expo 2005 in Aichi, Japan. Comparing predictions with measurements, the average prediction error per day was about 26% of the measured power.

  6. [Prediction and spatial distribution of recruitment trees of natural secondary forest based on geographically weighted Poisson model].

    PubMed

    Zhang, Ling Yu; Liu, Zhao Gang

    2017-12-01

    Based on data collected from 108 permanent plots of the forest resources survey in Maoershan Experimental Forest Farm during 2004-2016, this study investigated the spatial distribution of recruitment trees in natural secondary forest using global Poisson regression and geographically weighted Poisson regression (GWPR) with four bandwidths of 2.5, 5, 10 and 15 km. The simulation effects of the five regressions and the factors influencing recruitment trees in stands were analyzed, and the spatial autocorrelation of the regression residuals was described at global and local levels using Moran's I. The results showed that the spatial distribution of the number of natural secondary forest recruitment trees was significantly influenced by stand and topographic factors, especially average DBH. The GWPR model at small scale (2.5 km) had high model-fitting accuracy, generated a large range of model parameter estimates, and captured the localized spatial distribution effect of the model parameters. The GWPR models at small scales (2.5 and 5 km) produced a small range of model residuals, and the stability of the models was improved. The global spatial autocorrelation of the GWPR model residuals at the smallest scale (2.5 km) was the lowest, and the local spatial autocorrelation was significantly reduced, forming an ideal spatial distribution pattern of small clusters with different observations. The local model at small scale (2.5 km) was much better than the global model in simulating the spatial distribution of recruitment tree numbers.
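
    The local fitting idea can be sketched at a single focal plot: a Gaussian spatial kernel weights the plots, and a kernel-weighted Poisson GLM is fit by IRLS. The plot data and DBH effect below are simulated; the study's variables are only mimicked:

```python
# Geographically weighted Poisson regression at one focal location: Gaussian
# kernel weights times Poisson IRLS weights. All plot data are synthetic.
import numpy as np

rng = np.random.default_rng(6)
n = 200
coords = rng.uniform(0, 30, size=(n, 2))        # plot locations, km
dbh = rng.uniform(10, 30, n)                    # stand mean DBH, cm
counts = rng.poisson(np.exp(2.0 - 0.05 * dbh))  # recruits decline with DBH

focal = np.array([15.0, 15.0])                  # focal plot
bandwidth = 5.0                                 # cf. the 2.5-15 km bandwidths
d = np.linalg.norm(coords - focal, axis=1)
w = np.exp(-0.5 * (d / bandwidth) ** 2)         # Gaussian kernel weights

X = np.column_stack([np.ones(n), dbh])
beta = np.array([np.log(counts.mean() + 0.5), 0.0])
for _ in range(25):                             # kernel-weighted Poisson IRLS
    mu = np.exp(X @ beta)
    W = w * mu                                  # kernel times GLM weight
    z = X @ beta + (counts - mu) / mu           # working response
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
print(f"local intercept = {beta[0]:.2f}, local DBH effect = {beta[1]:.3f}")
```

    Refitting at every plot location, with the kernel recentred each time, yields the map of spatially varying coefficients that GWPR reports.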

  7. Women, Physical Activity, and Quality of Life: Self-concept as a Mediator.

    PubMed

    Gonzalo Silvestre, Tamara; Ubillos Landa, Silvia

    2016-02-22

    The objectives of this research are: (a) to analyze the incremental validity of physical activity's (PA) influence on perceived quality of life (PQL); (b) to determine whether PA's predictive power is mediated by self-concept; and (c) to study whether results vary according to a unidimensional or multidimensional approach to self-concept measurement. The sample comprised 160 women from Burgos, Spain, aged 18 to 45 years. Non-probability sampling was used. Two three-step hierarchical regression analyses were applied to forecast PQL. The hedonic quality-of-life indicators, self-concept, self-esteem, and PA were included as independent variables. The first regression analysis included global self-concept as a predictor variable, while the second included its five dimensions. Two mediation analyses were conducted to see if PA's ability to predict PQL was mediated by global and physical self-concept. Results from the first regression show that self-concept, satisfaction with life, and PA were significant predictors. PA slightly but significantly increased explained variance in PQL (2.1%). In the second regression, substituting global self-concept with its five constituent factors, only the physical dimension and satisfaction with life predicted PQL, while PA ceased to be a significant predictor. Mediation analysis revealed that only physical self-concept mediates the relationship between PA and PQL (z = 1.97, p < .050), not global self-concept. Physical self-concept was the strongest predictor, and approximately 32.45% of PA's effect on PQL was mediated by it. This study's findings support a multidimensional view of self-concept, and represent a more accurate image of the relationship between PQL, PA, and self-concept.
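
    The mediation step can be checked with a Sobel test: the indirect effect a·b divided by its delta-method standard error. The coefficients below are hypothetical placeholders, not the paper's estimates (which gave z = 1.97):

```python
# Sobel test for an indirect effect (PA -> mediator -> PQL).
# All four numbers are hypothetical illustrations.
import numpy as np

a, se_a = 0.30, 0.12   # PA -> physical self-concept (hypothetical)
b, se_b = 0.45, 0.10   # self-concept -> PQL, controlling PA (hypothetical)
sobel_z = (a * b) / np.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
print(f"Sobel z = {sobel_z:.2f}")   # |z| > 1.96 suggests mediation at p < .05
```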

  8. Geographically weighted poisson regression semiparametric on modeling of the number of tuberculosis cases (Case study: Bandung city)

    NASA Astrophysics Data System (ADS)

    Octavianty, Toharudin, Toni; Jaya, I. G. N. Mindra

    2017-03-01

    Tuberculosis (TB) is a disease caused by a bacterium called Mycobacterium tuberculosis, which typically attacks the lungs but can also affect the kidney, spine, and brain (Centers for Disease Control and Prevention). Indonesia had the largest number of TB cases after India (Global Tuberculosis Report 2015, WHO). The distribution of Mycobacterium tuberculosis genotypes in Indonesia shows high genetic diversity and tends to vary by geographic region. In Bandung city, for instance, the prevalence rate of TB morbidity is quite high, and the number of TB patients is count data. To determine the factors that significantly influence the number of tuberculosis patients at each observation location, semiparametric geographically weighted Poisson regression (GWPRS) can be used. GWPRS is an extension of Poisson regression and GWPR that accounts for geographical factors, allowing some variables to act globally and others locally. Using TB data for Bandung city in 2015, the results identify the global and local variables that influence the number of tuberculosis patients in each sub-district.

  9. Comment on "Cosmic-ray-driven reaction and greenhouse effect of halogenated molecules: Culprits for atmospheric ozone depletion and global climate change"

    NASA Astrophysics Data System (ADS)

    Nuccitelli, Dana; Cowtan, Kevin; Jacobs, Peter; Richardson, Mark; Way, Robert G.; Blackburn, Anne-Marie; Stolpe, Martin B.; Cook, John

    2014-04-01

    Lu (2013) (L13) argued that solar effects and anthropogenic halogenated gases can explain most of the observed warming of global mean surface air temperatures since 1850, with virtually no contribution from atmospheric carbon dioxide (CO2) concentrations. Here we show that this conclusion is based on assumptions about the saturation of the CO2-induced greenhouse effect that have been experimentally falsified. L13 also confuses equilibrium and transient response, and relies on data sources that have been superseded due to known inaccuracies. Furthermore, the statistical approach of sequential linear regression artificially shifts variance onto the first predictor. L13's artificial choice of regression order and neglect of other relevant data is the fundamental cause of the incorrect main conclusion. Consideration of more modern data and a more parsimonious multiple regression model leads to contradiction with L13's statistical results. Finally, the correlation arguments in L13 are falsified by considering either the more appropriate metric of global heat accumulation, or data on longer timescales.
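
    The variance-shifting point is easy to demonstrate: with correlated predictors, sequential regression credits the shared variance to whichever predictor is fit first, while multiple regression apportions it jointly. All data below are synthetic:

```python
# Sequential vs. multiple regression with correlated predictors: the first
# predictor in a sequential fit absorbs the shared variance. Synthetic data.
import numpy as np

rng = np.random.default_rng(7)
n = 5000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)       # correlated with x1
y = 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)   # true coefficients: 1 and 1

def slope(x, y):
    return np.cov(x, y)[0, 1] / np.var(x)

b1_seq = slope(x1, y)                          # x1 absorbs shared variance
b2_seq = slope(x2, y - b1_seq * x1)            # x2 fit to the leftovers

X = np.column_stack([np.ones(n), x1, x2])
_, b1_mult, b2_mult = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"sequential: ({b1_seq:.2f}, {b2_seq:.2f})  "
      f"multiple: ({b1_mult:.2f}, {b2_mult:.2f})")
```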

  10. A Simple Introduction to Moving Least Squares and Local Regression Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garimella, Rao Veerabhadra

    In this brief note, a highly simplified introduction to estimating functions over a set of particles is presented. The note starts from Global Least Squares fitting, going on to Moving Least Squares estimation (MLS) and, finally, Local Regression Estimation (LRE).
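
    The progression the note describes can be shown in one dimension: global least squares fits one polynomial to all particles, whereas moving least squares refits a local polynomial at each evaluation point with distance-decaying weights. The Gaussian weight, its width, and the data are illustrative choices:

```python
# 1-D moving least squares: a weighted polynomial fit recentred at each
# evaluation point. Weight function and bandwidth are illustrative.
import numpy as np

rng = np.random.default_rng(8)
x = np.sort(rng.uniform(0, 2 * np.pi, 80))     # scattered "particles"
f = np.sin(x) + rng.normal(0, 0.05, x.size)    # noisy samples of sin

def mls(x0, h=0.6, deg=2):
    w = np.exp(-((x - x0) / h) ** 2)           # moving weight centred at x0
    coef = np.polyfit(x, f, deg, w=np.sqrt(w)) # weighted local polynomial
    return np.polyval(coef, x0)                # evaluate only at the centre

est = np.array([mls(x0) for x0 in [1.0, np.pi / 2, 3.0]])
print(np.round(est, 3))                        # ≈ sin at those points
```

    Setting the weight to 1 everywhere recovers global least squares; shrinking h toward zero approaches local (pointwise) regression estimation.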

  11. Determination of total iron-reactive phenolics, anthocyanins and tannins in wine grapes of skins and seeds based on near-infrared hyperspectral imaging.

    PubMed

    Zhang, Ni; Liu, Xu; Jin, Xiaoduo; Li, Chen; Wu, Xuan; Yang, Shuqin; Ning, Jifeng; Yanne, Paul

    2017-12-15

    Phenolics contents in wine grapes are key indicators for assessing ripeness. Near-infrared hyperspectral images acquired during ripening were explored to develop an effective method for predicting phenolics contents. Principal component regression (PCR), partial least squares regression (PLSR) and support vector regression (SVR) models were built, respectively. The results show that SVR performs better overall than PLSR and PCR, except in predicting the tannins content of seeds. For the best prediction results, the squared correlation coefficient and root mean square error reached 0.8960 and 0.1069 g/L (+)-catechin equivalents (CE), respectively, for tannins in skins, 0.9065 and 0.1776 (g/L CE) for total iron-reactive phenolics (TIRP) in skins, 0.8789 and 0.1442 (g/L M3G) for anthocyanins in skins, 0.9243 and 0.2401 (g/L CE) for tannins in seeds, and 0.8790 and 0.5190 (g/L CE) for TIRP in seeds. Our results indicate that NIR hyperspectral imaging has good prospects for evaluation of phenolics in wine grapes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Lunar Silicon Abundance determined by Kaguya Gamma-ray Spectrometer and Chandrayaan-1 Moon Mineralogy Mapper

    NASA Astrophysics Data System (ADS)

    Kim, Kyeong; Berezhnoy, Alexey; Wöhler, Christian; Grumpe, Arne; Rodriguez, Alexis; Hasebe, Nobuyuki; Van Gasselt, Stephan

    2016-07-01

    Using Kaguya GRS data, we investigated the Si distribution on the Moon based on the 4934 keV Si gamma-ray peak caused by the interaction between thermal neutrons and lunar Si-28 atoms. A Si peak analysis on a grid of 10 degrees in longitude and latitude was accomplished with the IRAP Aquarius program, followed by a correction for altitude and thermal neutron density. A spectral-parameter-based regression model of the Si distribution was built for latitudes between 60°S and 60°N based on the continuum slopes, band depths, widths and minimum wavelengths of the absorption bands near 1 μm and 2 μm. Based on these regression models, a nearly global cpm (counts per minute) map of Si with a resolution of 20 pixels per degree was constructed. The construction of a nearly global map of lunar Si abundances was achieved by combining regression-based analysis of KGRS cpm data with M3 spectral reflectance data, calibrated with respect to returned-sample-based wt% values. The Si abundances estimated with our method systematically exceed those of the LP GRS Si data set but are consistent with typical Si abundances of lunar basalt samples (in the maria) and feldspathic mineral samples (in the highlands). Our Si map shows that Si abundance values on the Moon are typically between 17 and 28 wt%. The obtained Si map will provide an important aspect in understanding both the distribution of minerals and the evolution of the lunar surface since its formation.

  13. Comparison of anchor-based and distributional approaches in estimating important difference in common cold.

    PubMed

    Barrett, Bruce; Brown, Roger; Mundt, Marlon

    2008-02-01

    Evaluative health-related quality-of-life instruments used in clinical trials should be able to detect small but important changes in health status. Several approaches to minimal important difference (MID) and responsiveness have been developed. The aim was to compare anchor-based and distributional approaches to important difference and responsiveness for the Wisconsin Upper Respiratory Symptom Survey (WURSS), an illness-specific quality-of-life outcomes instrument. Participants with community-acquired colds self-reported daily using the WURSS-44. Distribution-based methods calculated standardized effect size (ES) and standard error of measurement (SEM). Anchor-based methods compared daily interval changes to global ratings of change, using: (1) standard MID methods based on correspondence to ratings of "a little better" or "somewhat better," and (2) two-level multivariate regression models. About 150 adults were monitored throughout their colds (1,681 sick days): 88% were white, 69% were women, and 50% had completed college. The mean age was 35.5 years (SD = 14.7). WURSS scores increased 2.2 points from the first to second day, and then dropped by an average of 8.2 points per day from days 2 to 7. The SEM averaged 9.1 during these 7 days. Standard methods yielded a between-day MID of 22 points. Regression models of MID projected 11.3-point daily changes. Dividing these estimates of small-but-important difference by pooled SDs yielded coefficients of .425 for standard MID, .218 for the regression model, .177 for SEM, and .157 for ES. These imply per-group sample sizes of 870 using ES, 616 for SEM, 302 for the regression model, and 89 for standard MID, assuming alpha = .05, beta = .20 (80% power), and two-tailed testing. Distribution- and anchor-based approaches provide somewhat different estimates of small but important difference, which in turn can have substantial impact on trial design.
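
    The distribution-based sample sizes follow a textbook approximation (Lehr's rule: n per group ≈ 16 / coefficient² for alpha = .05 and 80% power), which reproduces the ordering of the abstract's per-group figures; the exact published values differ slightly, likely because the coefficients are rounded:

```python
# Per-group sample size from standardized difference via Lehr's rule of thumb.
# Coefficients are the abstract's; the 16/c^2 formula is an approximation.
coefs = {"standard MID": 0.425, "regression model": 0.218,
         "SEM": 0.177, "ES": 0.157}
n_per_group = {name: 16 / c**2 for name, c in coefs.items()}
for name, n in n_per_group.items():
    print(f"{name}: coefficient {coefs[name]:.3f} -> ~{n:.0f} per group")
```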

  14. Research, policy, and programmatic considerations from the Biomarkers Reflecting Inflammation and Nutritional Determinants of Anemia (BRINDA) project

    PubMed Central

    Klemm, Rolf

    2017-01-01

    The Biomarkers Reflecting Inflammation and Nutritional Determinants of Anemia (BRINDA) project sought to inform the interpretation of iron and vitamin A biomarkers (ferritin, serum transferrin receptor, and retinol binding protein) in settings of prevalent inflammation, as well as strategies for the prevention and control of anemia. Our purpose is to comment on the contributions of the BRINDA project to advancing global knowledge of iron and vitamin A status assessment in women and preschool children, and to analyze the findings in terms of their rigor and usefulness for global nutrition research and programs. BRINDA investigators found that the acute-phase response is so prevalent that it must be assessed in surveys of iron and vitamin A status for valid interpretation of micronutrient biomarkers. Furthermore, they found that C-reactive protein and α-1-acid glycoprotein provide important and different information about these responses and that common survey variables cannot replace the information they provide. Developing a method for adjusting micronutrient biomarkers for the independent influence of inflammation is challenging and complex, and BRINDA has brought greater clarity to this challenge through the use of large and diverse data sets. When comparing approaches, the regression methods appear to perform best when sample sizes are sufficient and adequate statistical capacity is available. Further correction for malaria does not appear to materially alter regression-adjusted prevalence estimates. We suggest that researchers present both adjusted and unadjusted values for the micronutrient biomarkers. BRINDA findings confirm that iron deficiency is a common and consistent risk factor for anemia globally and that anemia control must combine iron interventions with control of infection and inflammation. Anemia control strategies must be informed by local data. By applying the knowledge in these studies, researchers, program planners, and evaluators working in populations with prevalent inflammation can use and interpret biomarkers with more confidence, tempered with necessary caution. PMID:28615252

  15. Large scale air pollution estimation method combining land use regression and chemical transport modeling in a geostatistical framework.

    PubMed

    Akita, Yasuyuki; Baldasano, Jose M; Beelen, Rob; Cirach, Marta; de Hoogh, Kees; Hoek, Gerard; Nieuwenhuijsen, Mark; Serre, Marc L; de Nazelle, Audrey

    2014-04-15

    In recognition that intraurban exposure gradients may be as large as between-city variations, recent air pollution epidemiologic studies have become increasingly interested in capturing within-city exposure gradients. In addition, because of the rapidly accumulating health data, recent studies also need to handle large study populations distributed over large geographic domains. Even though several modeling approaches have been introduced, a consistent modeling framework capturing within-city exposure variability and applicable to large geographic domains is still missing. To address these needs, we proposed a modeling framework based on the Bayesian Maximum Entropy method that integrates monitoring data and outputs from existing air quality models based on Land Use Regression (LUR) and Chemical Transport Models (CTM). The framework was applied to estimate the yearly average NO2 concentrations over the region of Catalunya in Spain. By jointly accounting for the global scale variability in the concentration from the output of CTM and the intraurban scale variability through LUR model output, the proposed framework outperformed more conventional approaches.

  16. Hospital ownership and drug utilization under a global budget: a quantile regression analysis.

    PubMed

    Zhang, Jing Hua; Chou, Shin-Yi; Deily, Mary E; Lien, Hsien-Ming

    2014-03-01

    A global budgeting system helps control the growth of healthcare spending by setting expenditure ceilings. However, the hospital global budget implemented in Taiwan in 2002 included a special provision: drug expenditures are reimbursed at face value, while other expenditures are subject to discounting. That gives hospitals, particularly those that are for-profit, an incentive to increase drug expenditures in treating patients. We calculated monthly drug expenditures by hospital departments from January 1997 to June 2006, using a sample of 348,193 patient claims to Taiwan National Health Insurance. To allow for variation among responses by departments with differing reliance on drugs and among hospitals under different ownership, we used quantile regression to identify the effect of the hospital global budget on drug expenditures. Although drug expenditure increased in all hospital departments after the enactment of the hospital global budget, departments in for-profit hospitals that rely more heavily on drug treatments increased drug spending more, relative to public hospitals. Our findings suggest that a global budgeting system with special reimbursement provisions for certain treatment categories may alter treatment decisions and may undermine cost-containment goals, particularly among for-profit hospitals.
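
Quantile regression minimizes the pinball (check) loss rather than squared error, which is what lets it characterize the whole expenditure distribution rather than just the mean. For an intercept-only model the minimizer is simply a sample quantile; a minimal sketch with hypothetical expenditure values (not the study's data):

```python
def pinball_loss(y, q, tau):
    """Average pinball (check) loss of predicting the constant q
    for quantile level tau."""
    return sum(tau * (v - q) if v >= q else (tau - 1) * (v - q) for v in y) / len(y)

def fit_quantile(y, tau, grid):
    """Intercept-only quantile fit: the grid value minimizing pinball loss."""
    return min(grid, key=lambda q: pinball_loss(y, q, tau))

# Hypothetical monthly drug expenditures for one hospital department
spend = [10, 12, 15, 20, 22, 30, 45, 80, 120, 300]
median = fit_quantile(spend, 0.5, spend)  # robust centre of the distribution
p90 = fit_quantile(spend, 0.9, spend)     # behaviour of the heaviest spenders
```

Fitting several values of tau is what allows a study like this one to detect that high-drug-reliance departments responded differently from low-reliance ones.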

  17. Combinations of Stressors in Midlife: Examining Role and Domain Stressors Using Regression Trees and Random Forests

    PubMed Central

    2013-01-01

    Objectives. Global perceptions of stress (GPS) have major implications for mental and physical health, and stress in midlife may influence adaptation in later life. Thus, it is important to determine the unique and interactive effects of diverse influences of role stress (at work or in personal relationships), loneliness, life events, time pressure, caregiving, finances, discrimination, and neighborhood circumstances on these GPS. Method. Exploratory regression trees and random forests were used to examine complex interactions among myriad events and chronic stressors in middle-aged participants’ (N = 410; mean age = 52.12) GPS. Results. Different role and domain stressors were influential at high and low levels of loneliness. Varied combinations of these stressors resulting in similar levels of perceived stress are also outlined as examples of equifinality. Loneliness emerged as an important predictor across trees. Discussion. Exploring multiple stressors simultaneously provides insights into the diversity of stressor combinations across individuals—even those with similar levels of global perceived stress—and answers theoretical mandates to better understand the influence of stress by sampling from many domain and role stressors. Further, the unique influences of each predictor relative to the others inform theory and applied work. Finally, examples of equifinality and multifinality call for targeted interventions. PMID:23341437
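
The core step of an exploratory regression tree is choosing the split that most reduces the outcome's variability. A minimal sketch with hypothetical loneliness and perceived-stress values (not the study's data):

```python
def best_split(xs, ys):
    """Threshold on one predictor that most reduces the outcome's
    sum of squared errors -- the core step of a regression tree."""
    def sse(vals):
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    best_t, best_sse = None, sse(ys)
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if right and sse(left) + sse(right) < best_sse:
            best_t, best_sse = t, sse(left) + sse(right)
    return best_t, best_sse

# Hypothetical loneliness scores vs. global perceived stress
lonely = [1, 2, 2, 3, 6, 7, 8, 9]
stress = [10, 12, 11, 13, 25, 27, 26, 28]
threshold, _ = best_split(lonely, stress)
```

Recursing this split on each resulting branch, over many candidate stressors, yields the tree; a random forest repeats the process on bootstrap samples with random predictor subsets.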

  18. Global Estimates of Fine Particulate Matter Using a Combined Geophysical-Statistical Method with Information from Satellites, Models, and Monitors

    NASA Technical Reports Server (NTRS)

    Van Donkelaar, Aaron; Martin, Randall V.; Brauer, Michael; Hsu, N. Christina; Kahn, Ralph A.; Levy, Robert C.; Lyapustin, Alexei; Sayer, Andrew M.; Winker, David M.

    2016-01-01

    We estimated global fine particulate matter (PM2.5) concentrations using information from satellite-, simulation-, and monitor-based sources by applying a Geographically Weighted Regression (GWR) to global geophysically based satellite-derived PM2.5 estimates. Aerosol optical depth from multiple satellite products (MISR, MODIS Dark Target, MODIS and SeaWiFS Deep Blue, and MODIS MAIAC) was combined with simulation (GEOS-Chem) based upon their relative uncertainties as determined using ground-based sun photometer (AERONET) observations for 1998-2014. The GWR predictors included simulated aerosol composition and land use information. The resultant PM2.5 estimates were highly consistent (R² = 0.81) with out-of-sample cross-validated PM2.5 concentrations from monitors. The global population-weighted annual average PM2.5 concentration was 3-fold higher than the 10 micrograms per cubic meter WHO guideline, driven by exposures in Asian and African regions. Estimates in regions with high contributions from mineral dust were associated with higher uncertainty, resulting from both sparse ground-based monitoring and challenging conditions for retrieval and simulation. This approach demonstrates that the addition of even sparse ground-based measurements to more globally continuous PM2.5 data sources can yield valuable improvements to PM2.5 characterization on a global scale.
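
The record combines multiple aerosol optical depth (AOD) products "based upon their relative uncertainties"; one common way to do this is inverse-variance weighting, sketched below (an illustrative assumption with hypothetical numbers, not necessarily the authors' exact scheme):

```python
def inverse_variance_combine(estimates, variances):
    """Combine independent estimates of one quantity, weighting each
    by the inverse of its error variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    combined = sum(w * e for w, e in zip(weights, estimates)) / total
    return combined, 1.0 / total  # combined estimate and its variance

# Hypothetical AOD retrievals from three products with differing uncertainty
aod, var = inverse_variance_combine([0.30, 0.36, 0.33], [0.01, 0.04, 0.02])
```

The combined variance is smaller than that of any single product, which is why folding in even relatively noisy additional sources can improve the characterization.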

  19. Real-time anomaly detection for very short-term load forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Jian; Hong, Tao; Yue, Meng

    Although the recent load information is critical to very short-term load forecasting (VSTLF), power companies often have difficulties in collecting the most recent load values accurately and in a timely manner for VSTLF applications. This paper tackles the problem of real-time anomaly detection in the most recent load information used by VSTLF. It proposes a model-based anomaly detection method that consists of two components, a dynamic regression model and an adaptive anomaly threshold. The case study is developed using data from ISO New England. This paper demonstrates that the proposed method significantly outperforms three other anomaly detection methods, including two methods commonly used in the field and one state-of-the-art method used by a winning team of the Global Energy Forecasting Competition 2014. Lastly, a general anomaly detection framework is proposed for future research.
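
The two-component idea, a forecast model plus an adaptive threshold on its residuals, can be sketched as follows (a simplified stand-in: the forecasts are taken as given, and a rolling residual standard deviation plays the role of the paper's adaptive threshold):

```python
import statistics

def detect_anomalies(actual, predicted, window=24, k=3.0):
    """Flag readings whose forecast residual exceeds k times the
    standard deviation of recent residuals (the adaptive threshold)."""
    residuals = [a - p for a, p in zip(actual, predicted)]
    flags = []
    for i, r in enumerate(residuals):
        past = residuals[max(0, i - window):i]
        if len(past) < 2:
            flags.append(False)  # not enough history yet
            continue
        sigma = statistics.pstdev(past)
        flags.append(abs(r) > k * sigma if sigma > 0 else abs(r) > 0)
    return flags

# Simulated hourly load with one corrupted reading at index 30
predicted = [100 + 0.5 * i for i in range(48)]
actual = [p + (0.1 if i % 2 else -0.1) for i, p in enumerate(predicted)]
actual[30] += 25  # the anomaly
flags = detect_anomalies(actual, predicted)
```

Because the threshold adapts to recent residual volatility, normal forecast error is tolerated while a gross collection error stands out immediately.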

  20. Real-time anomaly detection for very short-term load forecasting

    DOE PAGES

    Luo, Jian; Hong, Tao; Yue, Meng

    2018-01-06

    Although the recent load information is critical to very short-term load forecasting (VSTLF), power companies often have difficulties in collecting the most recent load values accurately and in a timely manner for VSTLF applications. This paper tackles the problem of real-time anomaly detection in the most recent load information used by VSTLF. It proposes a model-based anomaly detection method that consists of two components, a dynamic regression model and an adaptive anomaly threshold. The case study is developed using data from ISO New England. This paper demonstrates that the proposed method significantly outperforms three other anomaly detection methods, including two methods commonly used in the field and one state-of-the-art method used by a winning team of the Global Energy Forecasting Competition 2014. Lastly, a general anomaly detection framework is proposed for future research.

  1. Modeling Fire Occurrence at the City Scale: A Comparison between Geographically Weighted Regression and Global Linear Regression.

    PubMed

    Song, Chao; Kwan, Mei-Po; Zhu, Jiping

    2017-04-08

    An increasing number of fires are occurring with the rapid development of cities, resulting in increased risk for human beings and the environment. This study compares geographically weighted regression-based models, including geographically weighted regression (GWR) and geographically and temporally weighted regression (GTWR), which integrates spatial and temporal effects, with global linear regression models (LM) for modeling fire risk at the city scale. The results show that road density and the spatial distribution of enterprises have the strongest influences on fire risk, which implies that we should focus on areas where roads and enterprises are densely clustered. In addition, locations with a large number of enterprises have fewer fire ignition records, probably because of strict management and prevention measures. A changing number of significant variables across space indicates that heterogeneity mainly exists in the northern and eastern rural and suburban areas of Hefei city, where human-related facilities and road construction are clustered only in the city sub-centers. GTWR can capture small changes in the spatiotemporal heterogeneity of the variables while GWR and LM cannot. An approach that integrates space and time enables us to better understand the dynamic changes in fire risk. Thus, governments can use the results to manage fire safety at the city scale.
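
GWR fits a separate weighted least-squares regression at each location, with weights decaying with distance, so coefficients can vary across space where a global LM fits one coefficient everywhere. A minimal sketch on synthetic data (not the study's fire data):

```python
import numpy as np

def gwr_coefficients(coords, x, y, bandwidth):
    """Geographically weighted regression: a weighted least-squares fit
    at each location, with Gaussian distance-decay weights."""
    X = np.column_stack([np.ones(len(x)), x])  # intercept + one predictor
    betas = []
    for c in coords:
        d2 = np.sum((coords - c) ** 2, axis=1)
        w = np.exp(-d2 / (2 * bandwidth ** 2))
        W = np.diag(w)
        betas.append(np.linalg.solve(X.T @ W @ X, X.T @ W @ y))
    return np.array(betas)

# Synthetic city: the predictor's effect on risk is 1 in the west, 3 in the east
coords = np.array([[float(i), 0.0] for i in range(10)])
x = np.arange(10.0)
y = np.where(np.arange(10) < 5, x, 3 * x)
local = gwr_coefficients(coords, x, y, bandwidth=0.5)
```

A global LM would return a single compromise slope here; the local fits recover the west/east difference, which is exactly the spatial heterogeneity the study exploits.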

  2. Modeling Fire Occurrence at the City Scale: A Comparison between Geographically Weighted Regression and Global Linear Regression

    PubMed Central

    Song, Chao; Kwan, Mei-Po; Zhu, Jiping

    2017-01-01

    An increasing number of fires are occurring with the rapid development of cities, resulting in increased risk for human beings and the environment. This study compares geographically weighted regression-based models, including geographically weighted regression (GWR) and geographically and temporally weighted regression (GTWR), which integrates spatial and temporal effects, with global linear regression models (LM) for modeling fire risk at the city scale. The results show that road density and the spatial distribution of enterprises have the strongest influences on fire risk, which implies that we should focus on areas where roads and enterprises are densely clustered. In addition, locations with a large number of enterprises have fewer fire ignition records, probably because of strict management and prevention measures. A changing number of significant variables across space indicates that heterogeneity mainly exists in the northern and eastern rural and suburban areas of Hefei city, where human-related facilities and road construction are clustered only in the city sub-centers. GTWR can capture small changes in the spatiotemporal heterogeneity of the variables while GWR and LM cannot. An approach that integrates space and time enables us to better understand the dynamic changes in fire risk. Thus, governments can use the results to manage fire safety at the city scale. PMID:28397745

  3. Neighborhood Structural Similarity Mapping for the Classification of Masses in Mammograms.

    PubMed

    Rabidas, Rinku; Midya, Abhishek; Chakraborty, Jayasree

    2018-05-01

    In this paper, two novel feature extraction methods using neighborhood structural similarity (NSS) are proposed for the characterization of mammographic masses as benign or malignant. Since the gray-level distribution of pixels differs between benign and malignant masses, with more regular and homogeneous patterns visible in benign masses, the proposed method exploits the similarity between neighboring regions of masses by designing two new features, namely NSS-I and NSS-II, which capture global similarity at different scales. Complementary to these global features, uniform local binary patterns are computed to enhance classification efficiency when combined with the proposed features. The performance of the features is evaluated using images from the mini-mammographic image analysis society (mini-MIAS) and digital database for screening mammography (DDSM) databases, where a tenfold cross-validation technique is incorporated with Fisher linear discriminant analysis after selecting the optimal set of features using a stepwise logistic regression method. The best area under the receiver operating characteristic curve achieved with the mini-MIAS database is 0.98, while that for the DDSM database is 0.93.

  4. Trends and associated uncertainty in the global mean temperature record

    NASA Astrophysics Data System (ADS)

    Poppick, A. N.; Moyer, E. J.; Stein, M.

    2016-12-01

    Physical models suggest that the Earth's mean temperature warms in response to changing CO2 concentrations (and hence increased radiative forcing); given physical uncertainties in this relationship, the historical temperature record is a source of empirical information about global warming. A persistent thread in many analyses of the historical temperature record, however, is the reliance on methods that appear to deemphasize both physical and statistical assumptions. Examples include regression models that treat time rather than radiative forcing as the relevant covariate, and time series methods that account for natural variability in nonparametric rather than parametric ways. We show here that methods that deemphasize assumptions can limit the scope of analysis and can lead to misleading inferences, particularly in the setting considered here, where the data record is relatively short and the scale of temporal correlation is relatively long. A proposed model that is simple but physically informed provides a more reliable estimate of trends and allows a broader array of questions to be addressed. In accounting for uncertainty, we also illustrate how parametric statistical models that are attuned to the important characteristics of natural variability can be more reliable than ostensibly more flexible approaches.
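
The contrast the authors draw, regressing temperature on radiative forcing rather than on time, can be illustrated with a toy example (synthetic numbers; the slope on forcing is a physically interpretable sensitivity, while the slope on time is only a descriptive rate that depends on how forcing happened to evolve):

```python
import numpy as np

def fit_linear(y, covariate):
    """Ordinary least squares of y on one covariate, with intercept."""
    A = np.column_stack([np.ones(len(y)), covariate])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef  # (intercept, slope)

years = np.arange(50.0)
forcing = 0.04 * years + 0.3 * np.sin(years / 5.0)  # forcing is not linear in time
temp = 0.8 * forcing                                # noise-free response for clarity

_, sensitivity = fit_linear(temp, forcing)  # recovers the physical coefficient
_, rate = fit_linear(temp, years)           # a descriptive per-year rate only
```

With natural variability added, the short record and long correlation scale the abstract describes would make the time-covariate slope especially unstable, which is the authors' point.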

  5. Performance Evaluation of Machine Learning Methods for Leaf Area Index Retrieval from Time-Series MODIS Reflectance Data

    PubMed Central

    Wang, Tongtong; Xiao, Zhiqiang; Liu, Zhigang

    2017-01-01

    Leaf area index (LAI) is an important biophysical parameter and the retrieval of LAI from remote sensing data is the only feasible method for generating LAI products at regional and global scales. However, most LAI retrieval methods use satellite observations at a specific time to retrieve LAI. Because of the impacts of clouds and aerosols, the LAI products generated by these methods are spatially incomplete and temporally discontinuous, and thus they cannot meet the needs of practical applications. To generate high-quality LAI products, four machine learning algorithms, including back-propagation neural network (BPNN), radial basis function networks (RBFNs), general regression neural networks (GRNNs), and multi-output support vector regression (MSVR), are proposed to retrieve LAI from time-series Moderate Resolution Imaging Spectroradiometer (MODIS) reflectance data in this study, and the performance of these machine learning algorithms is evaluated. The results demonstrated that GRNNs, RBFNs, and MSVR exhibited low sensitivity to training sample size, whereas BPNN had high sensitivity. The four algorithms performed slightly better with red, near infrared (NIR), and short wave infrared (SWIR) bands than with red and NIR bands, and the results were significantly better than those obtained using single-band reflectance data (red or NIR). Regardless of band composition, GRNNs performed better than the other three methods. Among the four algorithms, BPNN required the least training time, whereas MSVR needed the most for any sample size. PMID:28045443
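
Of the four algorithms, the GRNN is the simplest to sketch: it is essentially a Gaussian-kernel-weighted average of training outputs (a Nadaraya-Watson estimator), which also explains its low sensitivity to training sample size. A minimal version with hypothetical reflectance-LAI pairs (not MODIS data):

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.1):
    """GRNN prediction: a Gaussian-kernel-weighted average of
    training outputs (the Nadaraya-Watson estimator)."""
    preds = []
    for q in np.atleast_2d(X_query):
        d2 = np.sum((X_train - q) ** 2, axis=1)
        w = np.exp(-d2 / (2 * sigma ** 2))
        preds.append(np.sum(w * y_train) / np.sum(w))
    return np.array(preds)

# Hypothetical (red, NIR) reflectance pairs and matching LAI values
X = np.array([[0.05, 0.30], [0.10, 0.40], [0.20, 0.50]])
lai = np.array([1.0, 2.5, 4.0])
at_sample = grnn_predict(X, lai, X[1], sigma=0.01)           # near a training point
interpolated = grnn_predict(X, lai, [[0.12, 0.42]], sigma=0.1)
```

There is no iterative training at all, only the choice of the smoothing width sigma, in contrast to the BPNN's weight optimization.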

  6. Performance Evaluation of Machine Learning Methods for Leaf Area Index Retrieval from Time-Series MODIS Reflectance Data.

    PubMed

    Wang, Tongtong; Xiao, Zhiqiang; Liu, Zhigang

    2017-01-01

    Leaf area index (LAI) is an important biophysical parameter and the retrieval of LAI from remote sensing data is the only feasible method for generating LAI products at regional and global scales. However, most LAI retrieval methods use satellite observations at a specific time to retrieve LAI. Because of the impacts of clouds and aerosols, the LAI products generated by these methods are spatially incomplete and temporally discontinuous, and thus they cannot meet the needs of practical applications. To generate high-quality LAI products, four machine learning algorithms, including back-propagation neural network (BPNN), radial basis function networks (RBFNs), general regression neural networks (GRNNs), and multi-output support vector regression (MSVR), are proposed to retrieve LAI from time-series Moderate Resolution Imaging Spectroradiometer (MODIS) reflectance data in this study, and the performance of these machine learning algorithms is evaluated. The results demonstrated that GRNNs, RBFNs, and MSVR exhibited low sensitivity to training sample size, whereas BPNN had high sensitivity. The four algorithms performed slightly better with red, near infrared (NIR), and short wave infrared (SWIR) bands than with red and NIR bands, and the results were significantly better than those obtained using single-band reflectance data (red or NIR). Regardless of band composition, GRNNs performed better than the other three methods. Among the four algorithms, BPNN required the least training time, whereas MSVR needed the most for any sample size.

  7. Austrian firearm legislation and its effects on suicide and homicide mortality: A natural quasi-experiment amidst the global economic crisis.

    PubMed

    König, Daniel; Swoboda, Patrick; Cramer, Robert J; Krall, Christoph; Postuvan, Vita; Kapusta, Nestor D

    2018-08-01

    Restriction of access to suicide methods has been shown to effectively reduce suicide mortality rates. We examine how the global economic crisis of 2008 and the firearm legislation reform of 1997 affected suicide and homicide mortality rates within Austria. Official data for the years 1985-2016 for firearm certificates, suicide, homicide, unemployment rates, and alcohol consumption were examined using autoregressive error and Poisson regression models. Firearm certificates, total suicide mortality rate, suicides and homicides by firearms, and the fraction of firearm suicides/homicides among all suicides/homicides decreased after the firearm legislation reform in 1997. However, significant trend changes can be observed after 2008. The availability of firearm certificates significantly increased and was accompanied by significant changes in trends of firearm suicide and homicide rates. Concurrently, the total suicide mortality rate in 2008, for the first time since 1985, stopped its decreasing trend. While the total homicide rate further decreased, the fraction of firearm homicides among all homicides significantly increased. The initially preventive effect of the firearm legislation reform in Austria in 1997 seems to have been counteracted by the global economic downturn of 2008. Increased firearm availability was associated with corresponding increases in both firearm suicide and firearm homicide mortality. Restrictive firearm legislation should be an imperative part of a country's suicide prevention programme. Although firearm legislation reform may have long-lasting effects, societal changes may facilitate compensatory firearm acquisitions and thus counteract preventive efforts, calling in turn for adapted counter-measures. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  8. Inferring general relations between network characteristics from specific network ensembles.

    PubMed

    Cardanobile, Stefano; Pernice, Volker; Deger, Moritz; Rotter, Stefan

    2012-01-01

    Different network models have been suggested for the topology underlying complex interactions in natural systems. These models are aimed at replicating specific statistical features encountered in real-world networks. However, it is rarely considered to which degree the results obtained for one particular network class can be extrapolated to real-world networks. We address this issue by comparing different classical and more recently developed network models with respect to their ability to generate networks with large structural variability. In particular, we consider the statistical constraints which the respective construction scheme imposes on the generated networks. After having identified the most variable networks, we address the issue of which constraints are common to all network classes and are thus suitable candidates for being generic statistical laws of complex networks. In fact, we find that generic, not model-related dependencies between different network characteristics do exist. This makes it possible to infer global features from local ones using regression models trained on networks with high generalization power. Our results confirm and extend previous findings regarding the synchronization properties of neural networks. Our method seems especially relevant for large networks, which are difficult to map completely, like the neural networks in the brain. The structure of such large networks cannot be fully sampled with the present technology. Our approach provides a method to estimate global properties of under-sampled networks in good approximation. Finally, we demonstrate on three different data sets (C. elegans neuronal network, R. prowazekii metabolic network, and a network of synonyms extracted from Roget's Thesaurus) that real-world networks have statistical relations compatible with those obtained using regression models.

  9. Contribution of milk production to global greenhouse gas emissions. An estimation based on typical farms.

    PubMed

    Hagemann, Martin; Ndambi, Asaah; Hemme, Torsten; Latacz-Lohmann, Uwe

    2012-02-01

    Studies on the contribution of milk production to global greenhouse gas (GHG) emissions are rare (FAO 2010) and often based on crude data which do not appropriately reflect the heterogeneity of farming systems. This article estimates GHG emissions from milk production in different dairy regions of the world based on harmonised farm data and assesses the contribution of milk production to global GHG emissions. The methodology comprises three elements: (1) the International Farm Comparison Network (IFCN) concept of typical farms and the related globally standardised dairy model farms representing 45 dairy regions in 38 countries; (2) a partial life cycle assessment model for estimating GHG emissions of the typical dairy farms; and (3) standard regression analysis to estimate GHG emissions from milk production in countries for which no typical farms are available in the IFCN database. Across the 117 typical farms in the 38 countries analysed, the average emission rate is 1.50 kg CO2 equivalents (CO2-eq.)/kg milk. The contribution of milk production to global anthropogenic emissions is estimated at 1.3 Gt CO2-eq./year, accounting for 2.65% of total global anthropogenic emissions (49 Gt; IPCC, Synthesis Report for Policy Makers, Valencia, Spain, 2007). We emphasise that our estimates of the contribution of milk production to global GHG emissions are subject to uncertainty. Part of the uncertainty stems from the choice of the appropriate methods for estimating emissions at the level of the individual animal.
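
The headline figures in this record are internally consistent, as a quick check of the arithmetic shows:

```python
milk_gt = 1.3     # Gt CO2-eq./year from milk production (this record)
global_gt = 49.0  # total anthropogenic emissions, Gt (IPCC 2007)

share = 100 * milk_gt / global_gt
print(f"milk production: {share:.2f}% of global emissions")
```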

  10. Global distribution of urban parameters derived from high-resolution global datasets for weather modelling

    NASA Astrophysics Data System (ADS)

    Kawano, N.; Varquez, A. C. G.; Dong, Y.; Kanda, M.

    2016-12-01

    Numerical models such as the Weather Research and Forecasting model coupled with a single-layer Urban Canopy Model (WRF-UCM) are powerful tools for investigating the urban heat island. Urban parameters such as average building height (Have), plan area index (λp), and frontal area index (λf) are necessary inputs for the model. In general, these parameters are assumed to be uniform in WRF-UCM, but this leads to an unrealistic urban representation. Distributed urban parameters can also be incorporated into WRF-UCM to represent urban effects in detail. The problem is that distributed building information is not readily available for most megacities, especially in developing countries. Furthermore, acquiring real building parameters often requires a huge amount of time and money. In this study, we investigated the potential of using globally available satellite-captured datasets for the estimation of the parameters Have, λp, and λf. The global datasets comprised a high-spatial-resolution population dataset (LandScan by Oak Ridge National Laboratory), nighttime lights (NOAA), and vegetation fraction (NASA). True samples of Have, λp, and λf were acquired from actual building footprints derived from satellite images and 3D building databases of Tokyo, New York, Paris, Melbourne, Istanbul, Jakarta, and other cities. Regression equations were then derived from the block-averaging of spatial pairs of real parameters and global datasets. Results show that two regression curves are necessary to estimate Have and λf from the combination of population and nightlight, depending on the city's level of development; an index that can be used to decide which equation to use for a city is the Gross Domestic Product (GDP). On the other hand, λp has less dependence on GDP but shows a negative relationship to vegetation fraction. Finally, a simplified but precise approximation of urban parameters through readily available, high-resolution global datasets and our derived regressions can be utilized to estimate a global distribution of urban parameters for later incorporation into a weather model, thus allowing us to acquire a global understanding of urban climate (Global Urban Climatology). Acknowledgment: This research was supported by the Environment Research and Technology Development Fund (S-14) of the Ministry of the Environment, Japan.

  11. An Automated Algorithm to Screen Massive Training Samples for a Global Impervious Surface Classification

    NASA Technical Reports Server (NTRS)

    Tan, Bin; Brown de Colstoun, Eric; Wolfe, Robert E.; Tilton, James C.; Huang, Chengquan; Smith, Sarah E.

    2012-01-01

    An algorithm is developed to automatically screen outliers from massive training samples for the Global Land Survey - Imperviousness Mapping Project (GLS-IMP). GLS-IMP will produce a global 30 m spatial resolution impervious cover data set for the years 2000 and 2010 based on the Landsat Global Land Survey (GLS) data set. This unprecedented high-resolution impervious cover data set is not only significant to urbanization studies but also desired by global carbon, hydrology, and energy balance research. A supervised classification method, the regression tree, is applied in this project, and a set of accurate training samples is the key to supervised classification. Here we developed the global-scale training samples from fine-resolution (approximately 1 m) satellite data (Quickbird and Worldview2) and then aggregated the fine-resolution impervious cover maps to 30 m resolution. To improve the classification accuracy, the training samples should be screened before being used to train the regression tree. It is impossible to manually screen 30 m resolution training samples collected globally; in Europe alone, there are 174 training sites, with site sizes ranging from 4.5 km by 4.5 km to 8.1 km by 3.6 km and over six million training samples in total. Therefore, we developed this automated, statistics-based algorithm to screen the training samples at two levels: the site level and the scene level. At the site level, all the training samples are divided into 10 groups according to the percentage of impervious surface within a sample pixel, with the samples falling in each 10% interval forming one group. For each group, both univariate and multivariate outliers are detected and removed. The screening process then escalates to the scene level, where a similar procedure with a looser threshold is applied to account for possible variance due to site differences. We do not screen across scenes because scenes may differ due to phenology, solar-view geometry, and atmospheric conditions rather than actual land cover differences. Finally, we will compare the classification results from screened and unscreened training samples to assess the improvement achieved by cleaning up the training samples.
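
The site-level screening step, binning samples by impervious fraction and removing outliers within each bin, can be sketched as follows (a simplified univariate version; the multivariate screening and the scene-level pass are omitted):

```python
import statistics

def screen_samples(fractions, features, z_thresh=3.0):
    """Bin training samples by impervious fraction (10% bins) and drop
    univariate outliers (|z| > z_thresh) within each bin."""
    bins = {}
    for frac, feat in zip(fractions, features):
        bins.setdefault(min(int(frac * 10), 9), []).append(feat)
    kept = []
    for vals in bins.values():
        if len(vals) < 3:
            kept.extend(vals)  # too few samples to estimate spread
            continue
        mu = statistics.mean(vals)
        sd = statistics.pstdev(vals)
        kept.extend(v for v in vals if sd == 0 or abs(v - mu) <= z_thresh * sd)
    return kept

# Hypothetical bin: ten consistent spectral values plus one gross outlier
kept = screen_samples([0.05] * 11, [0.2] * 10 + [5.0])
```

Binning first matters because a spectral value that is an outlier among nearly pervious pixels may be perfectly normal among fully impervious ones.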

  12. The Local Food Environment and Fruit and Vegetable Intake: A Geographically Weighted Regression Approach in the ORiEL Study.

    PubMed

    Clary, Christelle; Lewis, Daniel J; Flint, Ellen; Smith, Neil R; Kestens, Yan; Cummins, Steven

    2016-12-01

    Studies that explore associations between the local food environment and diet routinely use global regression models, which assume that relationships are invariant across space, yet such stationarity assumptions have been little tested. We used global and geographically weighted regression models to explore associations between the residential food environment and fruit and vegetable intake. Analyses were performed in 4 boroughs of London, United Kingdom, using data collected between April 2012 and July 2012 from 969 adults in the Olympic Regeneration in East London Study. Exposures were assessed both as absolute densities of healthy and unhealthy outlets, taken separately, and as a relative measure (proportion of total outlets classified as healthy). Overall, local models performed better than global models (lower Akaike information criterion). Locally estimated coefficients varied across space, regardless of the type of exposure measure, although changes of sign were observed only when absolute measures were used. Although global models showed a significant association with fruit and vegetable intake only for the relative measure (β = 0.022; P < 0.01), geographically weighted regression models using absolute measures outperformed models using relative measures. This study suggests that greater attention should be given to nonstationary relationships between the food environment and diet. It further challenges the idea that a single measure of exposure, whether relative or absolute, can reflect the many ways the food environment may shape health behaviors. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
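
    A minimal geographically weighted regression fit illustrates the nonstationarity being tested. This is a generic sketch, not the ORiEL analysis: the Gaussian kernel and fixed bandwidth are assumptions (applied work typically calibrates an adaptive bandwidth).

```python
import numpy as np

def gwr_fit(coords, X, y, bandwidth):
    """Geographically weighted regression: at each location, fit weighted
    least squares with Gaussian kernel weights that decay with distance,
    so coefficients are allowed to vary over space."""
    Xd = np.column_stack([np.ones(len(y)), X])
    betas = np.empty((len(y), Xd.shape[1]))
    for i, c in enumerate(coords):
        dist = np.linalg.norm(coords - c, axis=1)
        w = np.exp(-0.5 * (dist / bandwidth) ** 2)
        XtW = Xd.T * w                      # weight each observation
        betas[i] = np.linalg.solve(XtW @ Xd, XtW @ y)
    return betas  # one row of coefficients per location
```

    Mapping the per-location coefficients (rather than a single global slope) is what reveals spatial variation, including the sign changes the abstract reports.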

  13. Multi-linear regression of sea level in the south west Pacific as a first step towards local sea level projections

    NASA Astrophysics Data System (ADS)

    Kumar, Vandhna; Meyssignac, Benoit; Melet, Angélique; Ganachaud, Alexandre

    2017-04-01

    Rising sea levels are a critical concern in small island nations. The problem is especially serious in the western south Pacific, where the total sea level rise over the last 60 years is up to 3 times the global average. In this study, we attempt to reconstruct sea levels at selected sites in the region (Suva and Lautoka in Fiji, and Noumea in New Caledonia) as a multiple linear regression of atmospheric and oceanic variables. We focus on interannual-to-decadal scale variability and lower frequencies (including the global mean sea level rise) over the 1979-2014 period. Sea levels are taken from tide gauge records and the ORAS4 reanalysis dataset, and are expressed as a sum of steric and mass changes as a preliminary step. The key development in our methodology is the use of wind stress curl as a proxy for the thermosteric component, based on the knowledge that wind stress curl anomalies can modulate the thermocline depth and resultant sea levels via Rossby wave propagation. The analysis is primarily based on correlation between local sea level and selected predictors, the dominant one being wind stress curl. In the first step, proxy boxes for wind stress curl are determined via regions of highest correlation. The proportion of sea level explained via linear regression is then removed, leaving a residual. This residual is then correlated with other locally acting potential predictors: halosteric sea level, the zonal and meridional wind stress components, and sea surface temperature. The statistically significant predictors are used in a multiple linear regression function to simulate the observed sea level. The method is able to reproduce between 40 and 80% of the variance in observed sea level. Based on its skill, the model has high potential for sea level projection and downscaling studies.
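
    The two-step procedure (regress sea level on the wind stress curl proxy, then regress the residual on the remaining local predictors) can be sketched as follows; all series here are synthetic stand-ins, and the study's proxy-box selection from correlation maps is not reproduced.

```python
import numpy as np

def two_step_fit(sea_level, curl_proxy, other_predictors):
    """Step 1: linear regression of sea level on the wind stress curl proxy.
    Step 2: regression of the residual on the other local predictors.
    Returns the combined reconstruction and its explained variance."""
    A1 = np.column_stack([np.ones(len(sea_level)), curl_proxy])
    b1, *_ = np.linalg.lstsq(A1, sea_level, rcond=None)
    residual = sea_level - A1 @ b1
    A2 = np.column_stack([np.ones(len(sea_level)), other_predictors])
    b2, *_ = np.linalg.lstsq(A2, residual, rcond=None)
    reconstruction = A1 @ b1 + A2 @ b2
    explained = 1 - np.var(sea_level - reconstruction) / np.var(sea_level)
    return reconstruction, explained
```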

  14. Improving salt marsh digital elevation model accuracy with full-waveform lidar and nonparametric predictive modeling

    NASA Astrophysics Data System (ADS)

    Rogers, Jeffrey N.; Parrish, Christopher E.; Ward, Larry G.; Burdick, David M.

    2018-03-01

    Salt marsh vegetation tends to increase vertical uncertainty in light detection and ranging (lidar) derived elevation data, often causing the data to become ineffective for analysis of topographic features governing tidal inundation or vegetation zonation. Previous attempts at improving lidar data collected in salt marsh environments range from simply computing and subtracting the global elevation bias to more complex methods such as computing vegetation-specific, constant correction factors. The vegetation specific corrections can be used along with an existing habitat map to apply separate corrections to different areas within a study site. It is hypothesized here that correcting salt marsh lidar data by applying location-specific, point-by-point corrections, which are computed from lidar waveform-derived features, tidal-datum based elevation, distance from shoreline and other lidar digital elevation model based variables, using nonparametric regression will produce better results. The methods were developed and tested using full-waveform lidar and ground truth for three marshes in Cape Cod, Massachusetts, U.S.A. Five different model algorithms for nonparametric regression were evaluated, with TreeNet's stochastic gradient boosting algorithm consistently producing better regression and classification results. Additionally, models were constructed to predict the vegetative zone (high marsh and low marsh). The predictive modeling methods used in this study estimated ground elevation with a mean bias of 0.00 m and a standard deviation of 0.07 m (0.07 m root mean square error). These methods appear very promising for correction of salt marsh lidar data and, importantly, do not require an existing habitat map, biomass measurements, or image based remote sensing data such as multi/hyperspectral imagery.
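
    The point-by-point correction can be sketched with a from-scratch stochastic gradient boosting regressor built on stumps, a toy stand-in for TreeNet: each round fits a stump to the current residuals on a random subsample. The features, learning rate, and subsampling fraction below are illustrative assumptions.

```python
import numpy as np

def fit_stump(X, r):
    """Best single-split regression stump on residuals r (least squares)."""
    best = (np.inf, 0, 0.0, 0.0, 0.0)
    for j in range(X.shape[1]):
        for t in np.quantile(X[:, j], np.linspace(0.1, 0.9, 9)):
            left = X[:, j] <= t
            if left.all() or not left.any():
                continue
            lv, rv = r[left].mean(), r[~left].mean()
            sse = ((r[left] - lv) ** 2).sum() + ((r[~left] - rv) ** 2).sum()
            if sse < best[0]:
                best = (sse, j, t, lv, rv)
    return best[1:]

def boost_fit(X, y, n_rounds=150, lr=0.1, subsample=0.7, seed=0):
    """Stochastic gradient boosting for squared error: each round fits a
    stump to the current residuals on a random subsample of the data."""
    rng = np.random.default_rng(seed)
    base = y.mean()
    pred = np.full(len(y), base)
    stumps = []
    for _ in range(n_rounds):
        idx = rng.choice(len(y), int(subsample * len(y)), replace=False)
        j, t, lv, rv = fit_stump(X[idx], y[idx] - pred[idx])
        pred += lr * np.where(X[:, j] <= t, lv, rv)
        stumps.append((j, t, lv, rv))
    return base, stumps, lr

def boost_predict(model, X):
    base, stumps, lr = model
    p = np.full(len(X), base)
    for j, t, lv, rv in stumps:
        p += lr * np.where(X[:, j] <= t, lv, rv)
    return p
```

    In the correction setting, `y` would be the lidar elevation error at ground-truth points and `X` the waveform- and DEM-derived features; the fitted model then supplies a per-point correction.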

  15. Modeling ready biodegradability of fragrance materials.

    PubMed

    Ceriani, Lidia; Papa, Ester; Kovarich, Simona; Boethling, Robert; Gramatica, Paola

    2015-06-01

    In the present study, quantitative structure activity relationships were developed for predicting ready biodegradability of approximately 200 heterogeneous fragrance materials. Two classification methods, classification and regression tree (CART) and k-nearest neighbors (kNN), were applied to perform the modeling. The models were validated with multiple external prediction sets, and the structural applicability domain was verified by the leverage approach. The best models had good sensitivity (internal ≥80%; external ≥68%), specificity (internal ≥80%; external 73%), and overall accuracy (≥75%). Results from the comparison with BIOWIN global models, based on group contribution method, show that specific models developed in the present study perform better in prediction than BIOWIN6, in particular for the correct classification of not readily biodegradable fragrance materials. © 2015 SETAC.
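
    Of the two classifiers, k-nearest neighbours is compact enough to sketch end to end, together with the sensitivity and specificity metrics reported above; the descriptors and k below are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

def knn_classify(X_train, y_train, X_test, k=3):
    """k-nearest-neighbour classification by Euclidean distance and
    majority vote (labels: 1 = readily biodegradable, 0 = not)."""
    preds = []
    for x in X_test:
        nearest = y_train[np.argsort(np.linalg.norm(X_train - x, axis=1))[:k]]
        preds.append(np.bincount(nearest).argmax())
    return np.array(preds)

def sensitivity_specificity(y_true, y_pred):
    """Sensitivity: recall on the positive class; specificity: recall on
    the negative class, as reported in the abstract."""
    sens = np.mean(y_pred[y_true == 1] == 1)
    spec = np.mean(y_pred[y_true == 0] == 0)
    return sens, spec
```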

  16. Self-rated health: small area large area comparisons amongst older adults at the state, district and sub-district level in India.

    PubMed

    Hirve, Siddhivinayak; Vounatsou, Penelope; Juvekar, Sanjay; Blomstedt, Yulia; Wall, Stig; Chatterji, Somnath; Ng, Nawi

    2014-03-01

    We compared prevalence estimates of self-rated health (SRH) derived indirectly using four different small area estimation methods for the Vadu (small) area from the national Study on Global AGEing (SAGE) survey with estimates derived directly from the Vadu SAGE survey. The indirect synthetic estimate for Vadu was 24% whereas the model based estimates were 45.6% and 45.7% with smaller prediction errors and comparable to the direct survey estimate of 50%. The model based techniques were better suited to estimate the prevalence of SRH than the indirect synthetic method. We conclude that a simplified mixed effects regression model can produce valid small area estimates of SRH. © 2013 Published by Elsevier Ltd.

  17. An efficient surrogate-based simulation-optimization method for calibrating a regional MODFLOW model

    NASA Astrophysics Data System (ADS)

    Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.

    2017-01-01

    The simulation-optimization method entails a large number of model simulations, which is computationally intensive or even prohibitive if each model simulation is extremely time-consuming. Statistical models have been examined as surrogates of the high-fidelity physical model during the simulation-optimization process to tackle this problem. Among them, Multivariate Adaptive Regression Splines (MARS), a non-parametric adaptive regression method, is superior in overcoming problems of high dimensionality and discontinuities in the data. Furthermore, the stability and accuracy of a MARS model can be improved by bootstrap aggregating, namely, bagging. In this paper, the Bagging MARS (BMARS) method is integrated into a surrogate-based simulation-optimization framework to calibrate a three-dimensional MODFLOW model, which is developed to simulate the groundwater flow in an arid hardrock-alluvium region in northwestern Oman. The physical MODFLOW model is surrogated by a statistical model developed using the BMARS algorithm. The surrogate model, which is fitted and validated using a training dataset generated by the physical model, can approximate solutions rapidly. An efficient Sobol' method is employed to calculate global sensitivities of head outputs to input parameters, which are used to analyze their spatiotemporal importance for the model outputs. Only sensitive parameters are included in the calibration process to further improve the computational efficiency. The normalized root mean square error (NRMSE) between measured and simulated heads at observation wells is used as the objective function to be minimized during optimization. The reasonable history match between the simulated and observed heads demonstrates the feasibility of this highly efficient calibration framework.
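
    The calibration objective is simple to state concretely. A minimal sketch of NRMSE, normalizing the RMSE by the observed range (range-normalization is one common convention and an assumption here):

```python
import numpy as np

def nrmse(observed, simulated):
    """Normalized root mean square error used as the calibration objective:
    RMSE between observed and simulated heads, divided by the observed range."""
    rmse = np.sqrt(np.mean((observed - simulated) ** 2))
    return rmse / (observed.max() - observed.min())
```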

  18. Mapping the spatial pattern of temperate forest above ground biomass by integrating airborne lidar with Radarsat-2 imagery via geostatistical models

    NASA Astrophysics Data System (ADS)

    Li, Wang; Niu, Zheng; Gao, Shuai; Wang, Cheng

    2014-11-01

    Light Detection and Ranging (LiDAR) and Synthetic Aperture Radar (SAR) are two competitive active remote sensing techniques in forest above ground biomass estimation, which is important for forest management and global climate change study. This study aims to further explore their capabilities in temperate forest above ground biomass (AGB) estimation by emphasizing the spatial auto-correlation of variables obtained from these two remote sensing tools, an often overlooked aspect in remote sensing applications to vegetation studies. Remote sensing variables including airborne LiDAR metrics, backscattering coefficients for different SAR polarizations, and their ratio variables for Radarsat-2 imagery were calculated. First, simple linear regression (SLR) models were established between the field-estimated above ground biomass and the remote sensing variables. Pearson's correlation coefficient (R2) was used to find which LiDAR metric showed the most significant correlation with the regression residuals and could be selected as the co-variable in regression co-kriging (RCoKrig). Second, regression co-kriging was conducted with the regression residuals as the dependent variable and the LiDAR metric (Hmean) with the highest R2 as the co-variable. Third, above ground biomass over the study area was estimated using the SLR model and the RCoKrig model, respectively. The results of the two models were validated using the same ground points. Both methods achieved satisfactory prediction accuracy, while regression co-kriging showed the lower estimation error. This demonstrates that the regression co-kriging model is feasible and effective for mapping the spatial pattern of AGB in temperate forest using Radarsat-2 data calibrated by airborne LiDAR metrics.

  19. Regression discontinuity was a valid design for dichotomous outcomes in three randomized trials.

    PubMed

    van Leeuwen, Nikki; Lingsma, Hester F; Mooijaart, Simon P; Nieboer, Daan; Trompet, Stella; Steyerberg, Ewout W

    2018-06-01

    Regression discontinuity (RD) is a quasi-experimental design that may provide valid estimates of treatment effects in case of continuous outcomes. We aimed to evaluate validity and precision in the RD design for dichotomous outcomes. We performed validation studies in three large randomized controlled trials (RCTs) (Corticosteroid Randomization After Significant Head injury [CRASH], the Global Utilization of Streptokinase and Tissue Plasminogen Activator for Occluded Coronary Arteries [GUSTO], and PROspective Study of Pravastatin in elderly individuals at risk of vascular disease [PROSPER]). To mimic the RD design, we selected patients above and below a cutoff (e.g., age 75 years) randomized to treatment and control, respectively. Adjusted logistic regression models using restricted cubic splines (RCS) and polynomials and local logistic regression models estimated the odds ratio (OR) for treatment, with 95% confidence intervals (CIs) to indicate precision. In CRASH, treatment increased mortality with OR 1.22 [95% CI 1.06-1.40] in the RCT. The RD estimates were 1.42 (0.94-2.16) and 1.13 (0.90-1.40) with RCS adjustment and local regression, respectively. In GUSTO, treatment reduced mortality (OR 0.83 [0.72-0.95]), with more extreme estimates in the RD analysis (OR 0.57 [0.35; 0.92] and 0.67 [0.51; 0.86]). In PROSPER, similar RCT and RD estimates were found, again with less precision in RD designs. We conclude that the RD design provides similar but substantially less precise treatment effect estimates compared with an RCT, with local regression being the preferred method of analysis. Copyright © 2018 Elsevier Inc. All rights reserved.
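
    The RD estimate adjusts for the running variable while reading off the treatment odds ratio at the cutoff. A minimal sketch on simulated data, using a from-scratch Newton (IRLS) logistic fit with a linear-in-age adjustment (an assumed simplification; the paper's preferred analyses used splines and local regression):

```python
import numpy as np

def fit_logistic(X, y, iters=25):
    """Logistic regression by Newton's method (IRLS); intercept added here."""
    Xd = np.column_stack([np.ones(len(y)), X])
    w = np.zeros(Xd.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-Xd @ w))
        H = Xd.T @ (Xd * (p * (1 - p))[:, None])   # observed information
        w += np.linalg.solve(H, Xd.T @ (y - p))
    return w

def rd_treatment_or(age, treated, outcome, cutoff=75.0):
    """RD-style estimate: odds ratio for treatment, adjusting linearly for
    the running variable centred (and rescaled) at the cutoff."""
    X = np.column_stack([treated, (age - cutoff) / 10.0])
    w = fit_logistic(X, outcome)
    return np.exp(w[1])  # w[0] is the intercept
```

    Because treatment assignment is a deterministic function of the running variable, the two columns are highly collinear, which is exactly why RD estimates are much less precise than an RCT of the same size.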

  20. Efficient Actor-Critic Algorithm with Hierarchical Model Learning and Planning

    PubMed Central

    Fu, QiMing

    2016-01-01

    To improve the convergence rate and the sample efficiency, two efficient learning methods AC-HMLP and RAC-HMLP (AC-HMLP with ℓ2-regularization) are proposed by combining actor-critic algorithm with hierarchical model learning and planning. The hierarchical models consisting of the local and the global models, which are learned at the same time during learning of the value function and the policy, are approximated by local linear regression (LLR) and linear function approximation (LFA), respectively. Both the local model and the global model are applied to generate samples for planning; the former is used only if the state-prediction error does not surpass the threshold at each time step, while the latter is utilized at the end of each episode. The purpose of taking both models is to improve the sample efficiency and accelerate the convergence rate of the whole algorithm through fully utilizing the local and global information. Experimentally, AC-HMLP and RAC-HMLP are compared with three representative algorithms on two Reinforcement Learning (RL) benchmark problems. The results demonstrate that they perform best in terms of convergence rate and sample efficiency. PMID:27795704

  1. Efficient Actor-Critic Algorithm with Hierarchical Model Learning and Planning.

    PubMed

    Zhong, Shan; Liu, Quan; Fu, QiMing

    2016-01-01

    To improve the convergence rate and the sample efficiency, two efficient learning methods AC-HMLP and RAC-HMLP (AC-HMLP with ℓ2-regularization) are proposed by combining actor-critic algorithm with hierarchical model learning and planning. The hierarchical models consisting of the local and the global models, which are learned at the same time during learning of the value function and the policy, are approximated by local linear regression (LLR) and linear function approximation (LFA), respectively. Both the local model and the global model are applied to generate samples for planning; the former is used only if the state-prediction error does not surpass the threshold at each time step, while the latter is utilized at the end of each episode. The purpose of taking both models is to improve the sample efficiency and accelerate the convergence rate of the whole algorithm through fully utilizing the local and global information. Experimentally, AC-HMLP and RAC-HMLP are compared with three representative algorithms on two Reinforcement Learning (RL) benchmark problems. The results demonstrate that they perform best in terms of convergence rate and sample efficiency.

  2. Can we achieve Millennium Development Goal 4? New analysis of country trends and forecasts of under-5 mortality to 2015.

    PubMed

    Murray, Christopher J L; Laakso, Thomas; Shibuya, Kenji; Hill, Kenneth; Lopez, Alan D

    2007-09-22

    Global efforts have increased the accuracy and timeliness of estimates of under-5 mortality; however, these estimates fail to use all data available, do not use transparent and reproducible methods, do not distinguish predictions from measurements, and provide no indication of uncertainty around point estimates. We aimed to develop new reproducible methods and reanalyse existing data to elucidate detailed time trends. We merged available databases, added to them when possible, and then applied Loess regression to estimate past trends and forecast to 2015 for 172 countries. We developed uncertainty estimates based on different model specifications and estimated levels and trends in neonatal, post-neonatal, and childhood mortality. Global under-5 mortality has fallen from 110 (109-110) per 1000 in 1980 to 72 (70-74) per 1000 in 2005. Child deaths worldwide have decreased from 13.5 (13.4-13.6) million in 1980 to an estimated 9.7 (9.5-10.0) million in 2005. Global under-5 mortality is expected to decline by 27% from 1990 to 2015, substantially less than the target of Millennium Development Goal 4 (MDG4) of a 67% decrease. Several regions in Latin America, north Africa, the Middle East, Europe, and southeast Asia have had consistent annual rates of decline in excess of 4% over 35 years. Global progress on MDG4 is dominated by slow reductions in sub-Saharan Africa, which also has the slowest rates of decline in fertility. Globally, we are not doing a better job of reducing child mortality now than we were three decades ago. Further improvements in the quality and timeliness of child-mortality measurements should be possible by more fully using existing datasets and applying standard analytical strategies.
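
    The trend-fitting step can be illustrated with a minimal from-scratch Loess smoother (locally weighted linear regression with tricube weights); the span and the mortality-like series below are illustrative assumptions.

```python
import numpy as np

def loess(x, y, span=0.5):
    """Loess smoother: at each point, fit a weighted linear regression over
    the nearest `span` fraction of points using tricube weights."""
    n = len(x)
    k = max(int(span * n), 2)
    fitted = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        h = np.sort(d)[k - 1]                      # local bandwidth
        w = np.clip(1 - (d / h) ** 3, 0.0, None) ** 3
        A = np.column_stack([np.ones(n), x - x[i]])
        Aw = A * w[:, None]
        beta = np.linalg.solve(A.T @ Aw, Aw.T @ y)  # weighted normal equations
        fitted[i] = beta[0]                         # local fit at x[i]
    return fitted
```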

  3. WebGLORE: a web service for Grid LOgistic REgression.

    PubMed

    Jiang, Wenchao; Li, Pinghao; Wang, Shuang; Wu, Yuan; Xue, Meng; Ohno-Machado, Lucila; Jiang, Xiaoqian

    2013-12-15

    WebGLORE is a free web service that enables privacy-preserving construction of a global logistic regression model from distributed datasets that are sensitive. It only transfers aggregated local statistics (from participants) through Hypertext Transfer Protocol Secure to a trusted server, where the global model is synthesized. WebGLORE seamlessly integrates AJAX, JAVA Applet/Servlet and PHP technologies to provide an easy-to-use web service for biomedical researchers to break down policy barriers during information exchange. http://dbmi-engine.ucsd.edu/webglore3/. WebGLORE can be used under the terms of GNU general public license as published by the Free Software Foundation.
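
    The privacy-preserving idea, sites sharing only aggregate statistics while a server runs Newton iterations on their sums, can be sketched as follows. This is a conceptual stand-in for a GLORE-style protocol, not WebGLORE's actual code or API:

```python
import numpy as np

def site_statistics(X, y, w):
    """Computed locally at each site: gradient and Hessian contributions of
    the logistic log-likelihood. Raw records never leave the site."""
    p = 1 / (1 + np.exp(-X @ w))
    return X.T @ (y - p), X.T @ (X * (p * (1 - p))[:, None])

def grid_logistic(sites, dim, iters=25):
    """Server side: Newton updates on the summed site statistics, which is
    mathematically identical to fitting on the pooled data. Design matrices
    here are assumed to already include an intercept column."""
    w = np.zeros(dim)
    for _ in range(iters):
        stats = [site_statistics(X, y, w) for X, y in sites]
        grad = sum(g for g, _ in stats)
        hess = sum(h for _, h in stats)
        w = w + np.linalg.solve(hess, grad)
    return w
```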

  4. Mixed geographically weighted regression (MGWR) model with weighted adaptive bi-square for case of dengue hemorrhagic fever (DHF) in Surakarta

    NASA Astrophysics Data System (ADS)

    Astuti, H. N.; Saputro, D. R. S.; Susanti, Y.

    2017-06-01

    The MGWR model is a combination of the linear regression model and the geographically weighted regression (GWR) model; it can therefore produce parameter estimates in which some parameters are global while others are local, varying with the observation location. The linkage between observation locations is expressed through a specific weighting, here the adaptive bi-square. In this research, we applied the MGWR model with adaptive bi-square weighting to the case of DHF in Surakarta, based on 10 factors (variables) presumed to influence the number of people with DHF. The observation units are 51 urban villages, and the variables are the number of inhabitants, number of houses, house index, number of public places, number of healthy homes, number of Posyandu, area width, population density level, family welfare, and elevation. From this research we obtained 51 MGWR models, divided into 4 groups, with the house index significant as a global variable, area width as a local variable, and the remaining variables varying across groups. Global variables are variables that significantly affect all locations, while local variables significantly affect only specific locations.
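
    The adaptive bi-square weighting is compact enough to state exactly: the bandwidth at each regression point is the distance to its k-th nearest observation, and weights vanish beyond it (the choice of k below is illustrative):

```python
import numpy as np

def adaptive_bisquare_weights(distances, k):
    """Bi-square kernel with adaptive bandwidth: b is the distance to the
    k-th nearest observation, so dense areas get narrow kernels and sparse
    areas wide ones."""
    b = np.sort(distances)[k - 1]
    return np.where(distances < b, (1 - (distances / b) ** 2) ** 2, 0.0)
```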

  5. A Global Land Use Regression Model for Nitrogen Dioxide Air Pollution

    PubMed Central

    Larkin, Andrew; Geddes, Jeffrey A.; Martin, Randall V.; Xiao, Qingyang; Liu, Yang; Marshall, Julian D.; Brauer, Michael; Hystad, Perry

    2017-01-01

    Nitrogen dioxide is a common air pollutant with growing evidence of health impacts independent of other common pollutants such as ozone and particulate matter. However, the global distribution of NO2 exposure and associated impacts on global health is still largely uncertain. To advance global exposure estimates we created a global nitrogen dioxide (NO2) land use regression model for 2011 using annual measurements from 5,220 air monitors in 58 countries. The model captured 54% of global NO2 variation, with a mean absolute error of 3.7 ppb. Regional performance varied from R2 = 0.42 (Africa) to 0.67 (South America). Repeated 10% cross-validation using bootstrap sampling (n=10,000) demonstrated robust performance with respect to air monitor sampling in North America, Europe, and Asia (adjusted R2 within 2%) but not for Africa and Oceania (adjusted R2 within 11%) where NO2 monitoring data are sparse. The final model included 10 variables that captured both between and within-city spatial gradients in NO2 concentrations. Variable contributions differed between continental regions but major roads within 100m and satellite-derived NO2 were consistently the strongest predictors. The resulting model will be made available and can be used for global risk assessments and health studies, particularly in countries without existing NO2 monitoring data or models. PMID:28520422

  6. Determining association constants from titration experiments in supramolecular chemistry.

    PubMed

    Thordarson, Pall

    2011-03-01

    The most common approach for quantifying interactions in supramolecular chemistry is a titration of the guest into a solution of the host, noting the changes in some physical property through NMR, UV-Vis, fluorescence or other techniques. Despite the apparent simplicity of this approach, there are several issues that need to be carefully addressed to ensure that the final results are reliable. These include the use of non-linear rather than linear regression methods, careful choice of stoichiometric binding model, the choice of method (e.g., NMR vs. UV-Vis) and concentration of host, the application of advanced data analysis methods such as global analysis, and finally the estimation of uncertainties and confidence intervals for the results obtained. This tutorial review gives a systematic overview of all these issues, highlighting some of the key messages with simulated data analysis examples.
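
    The case for non-linear regression can be made concrete with a 1:1 host-guest titration fit: the complex concentration follows from mass balance, the association constant K is scanned on a log grid (a simple stand-in for a proper non-linear optimizer), and the limiting shift enters linearly. The concentrations and the true K below are invented for illustration.

```python
import numpy as np

def complex_conc(h0, g0, K):
    """Exact 1:1 complex concentration [HG] from mass balance:
    K = [HG] / ([H][G]) with [H] = h0 - [HG] and [G] = g0 - [HG]."""
    s = h0 + g0 + 1.0 / K
    return 0.5 * (s - np.sqrt(s * s - 4.0 * h0 * g0))

def fit_1to1(h0, g0, shift):
    """Non-linear fit of a titration curve: scan K on a log grid; for each
    K the limiting shift d_max is the linear least-squares solution."""
    best = (np.inf, 0.0, 0.0)
    for K in np.logspace(1, 6, 501):
        bound_frac = complex_conc(h0, g0, K) / h0   # fraction of bound host
        d_max = (bound_frac @ shift) / (bound_frac @ bound_frac)
        sse = np.sum((shift - d_max * bound_frac) ** 2)
        if sse < best[0]:
            best = (sse, K, d_max)
    return best[1], best[2]
```

    A linearization (e.g. a Benesi-Hildebrand plot) would distort the error structure of the same data, which is the review's point about preferring direct non-linear regression.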

  7. A Statistical Method for Reducing Sidelobe Clutter for the Ku-Band Precipitation Radar on Board the GPM Core Observatory

    NASA Technical Reports Server (NTRS)

    Kubota, Takuji; Iguchi, Toshio; Kojima, Masahiro; Liao, Liang; Masaki, Takeshi; Hanado, Hiroshi; Meneghini, Robert; Oki, Riko

    2016-01-01

    A statistical method to reduce the sidelobe clutter of the Ku-band precipitation radar (KuPR) of the Dual-Frequency Precipitation Radar (DPR) on board the Global Precipitation Measurement (GPM) Core Observatory is described and evaluated using DPR observations. The KuPR sidelobe clutter was much more severe than that of the Precipitation Radar on board the Tropical Rainfall Measuring Mission (TRMM), and it has caused the misidentification of precipitation. The statistical method to reduce sidelobe clutter was constructed by subtracting the estimated sidelobe power, based upon a multiple regression model with explanatory variables of the normalized radar cross section (NRCS) of surface, from the received power of the echo. The saturation of the NRCS at near-nadir angles, resulting from strong surface scattering, was considered in the calculation of the regression coefficients. The method was implemented in the KuPR algorithm and applied to KuPR-observed data. It was found that the received power from sidelobe clutter over the ocean was largely reduced by using the developed method, although some of the received power from the sidelobe clutter still remained. From the statistical results of the evaluations, it was shown that the number of KuPR precipitation events in the clutter region, after the method was applied, was comparable to that in the clutter-free region. This confirms the reasonable performance of the method in removing sidelobe clutter. For further improving the effectiveness of the method, it is necessary to improve the consideration of the NRCS saturation, which will be explored in future work.

  8. The repeatability of mean defect with size III and size V standard automated perimetry.

    PubMed

    Wall, Michael; Doyle, Carrie K; Zamba, K D; Artes, Paul; Johnson, Chris A

    2013-02-15

    The mean defect (MD) of the visual field is a global statistical index used to monitor overall visual field change over time. Our goal was to investigate the relationship of MD and its variability for two clinically used strategies (Swedish Interactive Threshold Algorithm [SITA] standard size III and full threshold size V) in glaucoma patients and controls. We tested one eye, at random, for 46 glaucoma patients and 28 ocularly healthy subjects with Humphrey program 24-2 SITA standard for size III and full threshold for size V each five times over a 5-week period. The standard deviation of MD was regressed against the MD for the five repeated tests, and quantile regression was used to show the relationship of variability and MD. A Wilcoxon test was used to compare the standard deviations of the two testing methods following quantile regression. Both types of regression analysis showed increasing variability with increasing visual field damage. Quantile regression showed modestly smaller MD confidence limits. There was a 15% decrease in SD with size V in glaucoma patients (P = 0.10) and a 12% decrease in ocularly healthy subjects (P = 0.08). The repeatability of size V MD appears to be slightly better than size III SITA testing. When using MD to determine visual field progression, a change of 1.5 to 4 decibels (dB) is needed to be outside the normal 95% confidence limits, depending on the size of the stimulus and the amount of visual field damage.

  9. A fully traits-based approach to modeling global vegetation distribution.

    PubMed

    van Bodegom, Peter M; Douma, Jacob C; Verheijen, Lieneke M

    2014-09-23

    Dynamic Global Vegetation Models (DGVMs) are indispensable for our understanding of climate change impacts. The application of traits in DGVMs is increasingly refined. However, a comprehensive analysis of the direct impacts of trait variation on global vegetation distribution does not yet exist. Here, we present such an analysis as a proof of principle. We run regressions of trait observations for leaf mass per area, stem-specific density, and seed mass from a global database against multiple environmental drivers, making use of findings of global trait convergence. This analysis explained up to 52% of the global variation of traits. Global trait maps, generated by coupling the regression equations to gridded soil and climate maps, showed up to orders of magnitude variation in trait values. Subsequently, nine vegetation types were characterized by the trait combinations that they possess using Gaussian mixture density functions. The trait maps were input to these functions to determine global occurrence probabilities for each vegetation type. We prepared vegetation maps, assuming that the most probable (and thus, most suited) vegetation type at each location will be realized. This fully traits-based vegetation map predicted 42% of the observed vegetation distribution correctly. Our results indicate that a major proportion of the predictive ability of DGVMs with respect to vegetation distribution can be attained by three traits alone if traits like stem-specific density and seed mass are included. We envision that our traits-based approach, our observation-driven trait maps, and our vegetation maps may inspire a new generation of powerful traits-based DGVMs.
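
    The assignment step, picking the vegetation type whose trait density is highest at a cell's trait values, can be sketched with one Gaussian per type; the paper used Gaussian mixture densities, so a single component per type and the trait values below are simplifying assumptions.

```python
import numpy as np

def gaussian_logpdf(x, mean, cov):
    """Log-density of a multivariate normal."""
    d = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ np.linalg.solve(cov, d) + logdet + len(x) * np.log(2 * np.pi))

def most_probable_type(traits, type_densities):
    """Return the index of the vegetation type with the highest trait
    density at this cell (traits e.g. leaf mass per area, stem-specific
    density, seed mass, suitably transformed)."""
    scores = [gaussian_logpdf(traits, m, c) for m, c in type_densities]
    return int(np.argmax(scores))
```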

  10. Global Inequalities in Cervical Cancer Incidence and Mortality are Linked to Deprivation, Low Socioeconomic Status, and Human Development

    PubMed Central

    Singh, Gopal K.; Azuine, Romuladus E.; Siahpush, Mohammad

    2012-01-01

    Objectives This study examined global inequalities in cervical cancer incidence and mortality rates as a function of cross-national variations in the Human Development Index (HDI), socioeconomic factors, Gender Inequality Index (GII), and healthcare expenditure. Methods Age-adjusted incidence and mortality rates were calculated for women in 184 countries using the 2008 GLOBOCAN database, and incidence and mortality trends were analyzed using the WHO cancer mortality database. Log-linear regression was used to model annual trends, while OLS and Poisson regression models were used to estimate the impact of socioeconomic and human development factors on incidence and mortality rates. Results Cervical cancer incidence and mortality rates varied widely, with many African countries such as Guinea, Zambia, Comoros, Tanzania, and Malawi having at least 10-to-20-fold higher rates than several West Asian, Middle East, and European countries, including Iran, Saudi Arabia, Syria, Egypt, and Switzerland. HDI, GII, poverty rate, health expenditure per capita, urbanization, and literacy rate were all significantly related to cervical cancer incidence and mortality, with HDI and poverty rate each explaining >52% of the global variance in mortality. Both incidence and mortality rates increased in relation to lower human development and higher gender inequality levels. A 0.2 unit increase in HDI was associated with a 20% decrease in cervical cancer risk and a 33% decrease in cervical cancer mortality risk. The risk of a cervical cancer diagnosis increased by 24% and of cervical cancer death by 42% for a 0.2 unit increase in GII. Higher health expenditure levels were independently associated with decreased incidence and mortality risks. Conclusions and Public Health Implications Global inequalities in cervical cancer are clearly linked to disparities in human development, social inequality, and living standards. 
Reductions in cervical cancer rates are achievable by reducing inequalities in socioeconomic conditions, availability of preventive health services, and women’s social status. PMID:27621956
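
    The conversion from log-linear (Poisson) coefficients to the percentage changes quoted above is a one-liner: a delta-unit covariate change multiplies the rate by exp(delta * beta). The beta below is back-derived from the reported 20% figure, purely for illustration.

```python
import numpy as np

def pct_rate_change(beta, delta):
    """Percent change in the outcome rate for a `delta`-unit increase in a
    covariate under a log-linear (Poisson) model."""
    return (np.exp(beta * delta) - 1.0) * 100.0
```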

  11. Surface daytime net radiation estimation using artificial neural networks

    DOE PAGES

    Jiang, Bo; Zhang, Yi; Liang, Shunlin; ...

    2014-11-11

    Net all-wave surface radiation (Rn) is one of the most important fundamental parameters in various applications. However, conventional Rn measurements are difficult to collect because of the high cost and ongoing maintenance of recording instruments. Therefore, various empirical Rn estimation models have been developed. This study presents the results of two artificial neural network (ANN) models (general regression neural networks (GRNN) and Neuroet) to estimate Rn globally from multi-source data, including remotely sensed products, surface measurements, and meteorological reanalysis products. Rn estimates provided by the two ANNs were tested against in-situ radiation measurements obtained from 251 global sites between 1991-2010, both in global mode (all data were used to fit the models) and in conditional mode (the data were divided into four subsets and the models were fitted separately). Based on the results obtained from extensive experiments, the two ANNs were superior to linear-based empirical models in both global and conditional modes, and the GRNN performed better and was more stable than Neuroet. The GRNN estimates had a determination coefficient (R2) of 0.92, a root mean square error (RMSE) of 34.27 W·m-2, and a bias of -0.61 W·m-2 in global mode based on the validation dataset. In conclusion, ANN methods are a potentially powerful tool for global Rn estimation.
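
    A GRNN is essentially a Gaussian-kernel-weighted average of training targets (the Nadaraya-Watson estimator), which makes a minimal sketch short; the smoothing parameter and the synthetic smooth target below are illustrative assumptions.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.1):
    """General regression neural network: each prediction is the average of
    training targets weighted by a Gaussian kernel on input distance."""
    preds = np.empty(len(X_query))
    for i, q in enumerate(X_query):
        d2 = np.sum((X_train - q) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        preds[i] = np.dot(w, y_train) / w.sum()
    return preds
```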

  12. Estimating the Counterfactual Impact of Conservation Programs on Land Cover Outcomes: The Role of Matching and Panel Regression Techniques.

    PubMed

    Jones, Kelly W; Lewis, David J

    2015-01-01

Deforestation and conversion of native habitats continue to be the leading drivers of biodiversity and ecosystem service loss. A number of conservation policies and programs are implemented--from protected areas to payments for ecosystem services (PES)--to deter these losses. Currently, empirical evidence on whether these approaches stop or slow land cover change is lacking, but there is increasing interest in conducting rigorous, counterfactual impact evaluations, especially for many new conservation approaches, such as PES and REDD, which emphasize additionality. In addition, several new, globally available and free high-resolution remote sensing datasets have increased the ease of carrying out an impact evaluation on land cover change outcomes. While the number of conservation evaluations utilizing 'matching' to construct a valid control group is increasing, the majority of these studies use simple differences in means or linear cross-sectional regression to estimate the impact of the conservation program using this matched sample, with relatively few utilizing fixed effects panel methods--an alternative estimation method that relies on temporal variation in the data. In this paper we compare the advantages and limitations of (1) matching to construct the control group combined with differences in means and cross-sectional regression, which control for observable forms of bias in program evaluation, and (2) fixed effects panel methods, which control for observable and time-invariant unobservable forms of bias, with and without matching to create the control group. We then use these four approaches to estimate forest cover outcomes for two conservation programs: a PES program in Northeastern Ecuador and strict protected areas in European Russia. In the Russia case we find statistically significant differences across estimators--due to the presence of unobservable bias--that lead to differences in conclusions about effectiveness.
The Ecuador case illustrates that if time-invariant unobservables are not present, matching combined with differences in means or cross-sectional regression leads to similar estimates of program effectiveness as matching combined with fixed effects panel regression. These results highlight the importance of considering observable and unobservable forms of bias and the methodological assumptions across estimators when designing an impact evaluation of conservation programs.
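The contrast between cross-sectional and fixed-effects estimators that drives the Russia result can be illustrated with simulated two-period panel data in which a time-invariant unobservable is correlated with treatment. Everything below is simulated for illustration; nothing uses the paper's data.

```python
import numpy as np

# (1) Cross-sectional difference in means vs. (2) a two-period fixed-effects
# (differencing) estimator, when selection into treatment depends on a
# time-invariant unobservable. All values are made up.

rng = np.random.default_rng(42)
n = 2000
unobs = rng.normal(size=n)                  # time-invariant unobservable
treated = (unobs + rng.normal(size=n)) > 0  # selection depends on unobservable
effect = -2.0                               # true program effect

y0 = 10 + 3 * unobs + rng.normal(size=n)                     # pre-period
y1 = 10 + 3 * unobs + effect * treated + rng.normal(size=n)  # post-period

# (1) Cross-sectional: biased because treatment correlates with the
#     unobservable that also raises the outcome.
cross_sec = y1[treated].mean() - y1[~treated].mean()

# (2) Fixed effects: differencing removes the unit-level unobservable.
dy = y1 - y0
fixed_eff = dy[treated].mean() - dy[~treated].mean()
```

Here `fixed_eff` recovers the true effect of -2 while `cross_sec` has the wrong sign, which mirrors how unobservable bias can flip conclusions about program effectiveness.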

  13. “His” and “Her” Marriage? The Role of Positive and Negative Marital Characteristics in Global Marital Satisfaction Among Older Adults

    PubMed Central

    Jopp, Daniela S.; Carr, Deborah; Sosinsky, Laura; Kim, Se-Kang

    2014-01-01

Objectives. We explore gender differences in older adults’ appraisals of positive and negative aspects of their marriages, examine how these appraisals relate to global marital satisfaction, and identify distinctive marital profiles associated with global satisfaction in men and women. Method. Data are from the Changing Lives of Older Couples Study (n = 1,110). We used a variant of principal components analysis to generate marital quality profiles, based on one’s endorsement of positive and negative marital characteristics. OLS regression was used to detect associations between marital profiles and global marital satisfaction. Results. Men offered more positive marital assessments than women, particularly on items reflecting positive treatment by one’s wife. Three marital quality profiles emerged: Positive, Positive–Negative, and Negative. Although marital satisfaction was best explained by positive appraisals in both genders, they were less important for men than for women. The Negative profile tended to be a stronger predictor in men. Discussion. Prior studies show small differences in men’s and women’s global marital satisfaction. Our work provides evidence that the presence and magnitude of such gender differences may vary based on the specific marital component considered. We discuss ways that gender shapes marital interactions, expectations, and perceptions, and the implications of our results for the well-being of married older adults. PMID:24742399

  14. A comparison of non-parametric techniques to estimate incident photosynthetically active radiation from MODIS for monitoring primary production

    NASA Astrophysics Data System (ADS)

    Brown, M. G. L.; He, T.; Liang, S.

    2016-12-01

    Satellite-derived estimates of incident photosynthetically active radiation (PAR) can be used to monitor global change, are required by most terrestrial ecosystem models, and can be used to estimate primary production according to the theory of light use efficiency. Compared with parametric approaches, non-parametric techniques that include an artificial neural network (ANN), support vector machine regression (SVM), an artificial bee colony (ABC), and a look-up table (LUT) do not require many ancillary data as inputs for the estimation of PAR from satellite data. In this study, a selection of machine learning methods to estimate PAR from MODIS top of atmosphere (TOA) radiances are compared to a LUT approach to determine which techniques might best handle the nonlinear relationship between TOA radiance and incident PAR. Evaluation of these methods (ANN, SVM, and LUT) is performed with ground measurements at seven SURFRAD sites. Due to the design of the ANN, it can handle the nonlinear relationship between TOA radiance and PAR better than linearly interpolating between the values in the LUT; however, training the ANN has to be carried out on an angular-bin basis, which results in a LUT of ANNs. The SVM model may be better for incorporating multiple viewing angles than the ANN; however, both techniques require a large amount of training data, which may introduce a regional bias based on where the most training and validation data are available. Based on the literature, the ABC is a promising alternative to an ANN, SVM regression and a LUT, but further development for this application is required before concrete conclusions can be drawn. For now, the LUT method outperforms the machine-learning techniques, but future work should be directed at developing and testing the ABC method. 
A simple, robust method to estimate direct and diffuse incident PAR, with minimal inputs and a priori knowledge, would be very useful for monitoring global change of primary production, particularly of pastures and rangeland, which have implications for livestock and food security. Future work will delve deeper into the utility of satellite-derived PAR estimation for monitoring primary production in pasture and rangelands.

  15. Regional Variation in the Prevalence of E. coli O157 in Cattle: A Meta-Analysis and Meta-Regression

    PubMed Central

    Islam, Md. Zohorul; Musekiwa, Alfred; Islam, Kamrul; Ahmed, Shahana; Chowdhury, Sharmin; Ahad, Abdul; Biswas, Paritosh Kumar

    2014-01-01

Background Escherichia coli O157 (EcO157) infection has been recognized as an important global public health concern, but information on the prevalence of EcO157 in cattle at the global and wider geographical levels is limited, if not absent. This is the first meta-analysis to investigate the point prevalence of EcO157 in cattle at the global level and to explore the factors contributing to variation in prevalence estimates. Methods Seven electronic databases (CAB Abstracts, PubMed, Biosis Citation Index, Medline, Web of Knowledge, Scirus and Scopus) were searched for relevant publications from 1980 to 2012. A random-effects meta-analysis model was used to produce the pooled estimates. The potential sources of between-study heterogeneity were identified using meta-regression. Principal findings A total of 140 studies comprising 220,427 cattle were included in the meta-analysis. The prevalence estimate of EcO157 in cattle at the global level was 5.68% (95% CI, 5.16–6.20). The random-effects pooled prevalence estimates in Africa, Northern America, Oceania, Europe, Asia and Latin America-Caribbean were 31.20% (95% CI, 12.35–50.04), 7.35% (95% CI, 6.44–8.26), 6.85% (95% CI, 2.41–11.29), 5.15% (95% CI, 4.21–6.09), 4.69% (95% CI, 3.05–6.33) and 1.65% (95% CI, 0.77–2.53), respectively. Between-study heterogeneity was evident in most regions. World region (p<0.001), type of cattle (p<0.001) and, to some extent, specimen (p = 0.074) as well as method of pre-enrichment (p = 0.110) were identified as factors contributing to variation in the prevalence estimates of EcO157 in cattle. Conclusion The prevalence of the organism seems to be higher in the African and Northern American regions. The important factors that might influence estimates of EcO157 are the type of cattle and the kind of screening specimen. Their roles need to be determined, and they should be properly handled in any survey to estimate the true prevalence of EcO157. PMID:24691253
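The random-effects pooling behind such prevalence estimates is typically the DerSimonian-Laird estimator. A sketch of the method with made-up prevalences and sample sizes (the technique, not the paper's data):

```python
import numpy as np

# DerSimonian-Laird random-effects pooling: estimate the between-study
# variance tau^2 from Cochran's Q, then re-weight. Data are invented.

def dersimonian_laird(theta, var):
    """Pooled estimate, its SE, and tau^2 from per-study estimates/variances."""
    w = 1.0 / var                                  # fixed-effect weights
    theta_fe = (w * theta).sum() / w.sum()
    q = (w * (theta - theta_fe) ** 2).sum()        # Cochran's Q
    df = len(theta) - 1
    c = w.sum() - (w ** 2).sum() / w.sum()
    tau2 = max(0.0, (q - df) / c)                  # between-study variance
    w_re = 1.0 / (var + tau2)                      # random-effects weights
    pooled = (w_re * theta).sum() / w_re.sum()
    se = np.sqrt(1.0 / w_re.sum())
    return pooled, se, tau2

# Toy data: per-study prevalences with binomial variances p(1-p)/n
p = np.array([0.08, 0.03, 0.12, 0.05, 0.02])
n = np.array([400, 900, 250, 600, 1200])
pooled, se, tau2 = dersimonian_laird(p, p * (1 - p) / n)
```

A nonzero `tau2` is exactly the between-study heterogeneity that the meta-regression in the paper then tries to explain with covariates such as world region and cattle type.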

  16. A Land System representation for global assessments and land-use modeling.

    PubMed

    van Asselen, Sanneke; Verburg, Peter H

    2012-10-01

Current global scale land-change models used for integrated assessments and climate modeling are based on classifications of land cover. However, land-use management intensity and livestock keeping are also important aspects of land use and are an integral part of land systems. This article aims to classify, map, and characterize Land Systems (LS) at a global scale and to analyze the spatial determinants of these systems. Besides proposing such a classification, the article tests whether global assessments can be based on globally uniform allocation rules. Land cover, livestock, and agricultural intensity data are used to map LS using a hierarchical classification method. Logistic regressions are used to analyze variation in the spatial determinants of LS. The analysis of the spatial determinants of LS indicates strong associations between LS and a range of socioeconomic and biophysical indicators of human-environment interactions. The set of identified spatial determinants of an LS differs among regions and scales, especially for (mosaic) cropland systems, grassland systems with livestock, and settlements. (Semi-)natural LS have more similar spatial determinants across regions and scales. Using LS in global models is expected to result in a more accurate representation of land use, capturing important aspects of land systems and land architecture: the variation in land cover and the link between land-use intensity and landscape composition. Because the set of most important spatial determinants of LS varies among regions and scales, land-change models that include the human drivers of land change are best parameterized at the sub-global level, where similar biophysical, socioeconomic and cultural conditions prevail in the specific regions. © 2012 Blackwell Publishing Ltd.

  17. Spatial distribution of soil organic carbon and total nitrogen based on GIS and geostatistics in a small watershed in a hilly area of northern China.

    PubMed

    Peng, Gao; Bing, Wang; Guangpo, Geng; Guangcan, Zhang

    2013-01-01

The spatial variability of soil organic carbon (SOC) and total nitrogen (STN) levels is important in both global carbon-nitrogen cycle and climate change research. There has been little research on the spatial distribution of SOC and STN at the watershed scale based on geographic information systems (GIS) and geostatistics. Ninety-seven soil samples taken at depths of 0-20 cm were collected during October 2010 and 2011 from the Matiyu small watershed (4.2 km²) of a hilly area in Shandong Province, northern China. The impacts of different land use types, elevation, vegetation coverage and other factors on SOC and STN spatial distributions were examined using GIS and a geostatistical method, regression-kriging. The results show that SOC and STN concentrations in the Matiyu small watershed exhibited moderate variation, based on the mean, median, minimum, maximum and coefficients of variation (CV). Residual values of SOC and STN had moderate spatial autocorrelations, with nugget/sill ratios of 0.2% and 0.1%, respectively. Distribution maps from regression-kriging revealed that both SOC and STN concentrations in the Matiyu watershed decreased from southeast to northwest. This trend was similar to that of the watershed DEM and significantly correlated with land use type, elevation and aspect. SOC and STN predictions with the regression-kriging method were more accurate than those obtained using ordinary kriging. This research indicates that the geostatistical characteristics of SOC and STN concentrations in the watershed were closely related to both land-use type and spatial topographic structure, and that regression-kriging is suitable for investigating the spatial distributions of SOC and STN in the complex topography of the watershed.
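Regression-kriging, as used above, combines a regression trend on covariates with kriging of the regression residuals. A minimal sketch under an assumed exponential covariance; all data, parameter values and the single covariate are illustrative, not the study's:

```python
import numpy as np

# Regression-kriging sketch: OLS trend on a covariate + simple kriging of
# the residuals with a fixed exponential covariance. Everything is invented.

rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, size=(60, 2))          # sample locations
elev = rng.uniform(100, 300, size=60)              # covariate (e.g. elevation)
soc = 20 - 0.05 * elev + rng.normal(0, 0.5, 60)    # observed SOC

# 1) Regression part: OLS trend of SOC on the covariate
Xd = np.column_stack([np.ones(60), elev])
beta, *_ = np.linalg.lstsq(Xd, soc, rcond=None)
resid = soc - Xd @ beta

# 2) Kriging part: simple kriging of residuals (assumed covariance params)
sill, range_par, nugget = 0.25, 3.0, 0.05

def cov(h):
    return sill * np.exp(-h / range_par)

D = np.linalg.norm(coords[:, None] - coords[None, :], axis=2)
C = cov(D) + nugget * np.eye(60)

target, elev_t = np.array([5.0, 5.0]), 200.0       # unsampled location
c0 = cov(np.linalg.norm(coords - target, axis=1))
weights = np.linalg.solve(C, c0)

prediction = np.array([1.0, elev_t]) @ beta + weights @ resid
```

In practice the covariance (variogram) parameters are fitted to the residuals rather than assumed, but the two-step structure, trend plus kriged residual, is the essence of the method.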

  18. Spatial Distribution of Soil Organic Carbon and Total Nitrogen Based on GIS and Geostatistics in a Small Watershed in a Hilly Area of Northern China

    PubMed Central

    Peng, Gao; Bing, Wang; Guangpo, Geng; Guangcan, Zhang

    2013-01-01

The spatial variability of soil organic carbon (SOC) and total nitrogen (STN) levels is important in both global carbon-nitrogen cycle and climate change research. There has been little research on the spatial distribution of SOC and STN at the watershed scale based on geographic information systems (GIS) and geostatistics. Ninety-seven soil samples taken at depths of 0–20 cm were collected during October 2010 and 2011 from the Matiyu small watershed (4.2 km²) of a hilly area in Shandong Province, northern China. The impacts of different land use types, elevation, vegetation coverage and other factors on SOC and STN spatial distributions were examined using GIS and a geostatistical method, regression-kriging. The results show that SOC and STN concentrations in the Matiyu small watershed exhibited moderate variation, based on the mean, median, minimum, maximum and coefficients of variation (CV). Residual values of SOC and STN had moderate spatial autocorrelations, with nugget/sill ratios of 0.2% and 0.1%, respectively. Distribution maps from regression-kriging revealed that both SOC and STN concentrations in the Matiyu watershed decreased from southeast to northwest. This trend was similar to that of the watershed DEM and significantly correlated with land use type, elevation and aspect. SOC and STN predictions with the regression-kriging method were more accurate than those obtained using ordinary kriging. This research indicates that the geostatistical characteristics of SOC and STN concentrations in the watershed were closely related to both land-use type and spatial topographic structure, and that regression-kriging is suitable for investigating the spatial distributions of SOC and STN in the complex topography of the watershed. PMID:24391791

  19. Integrated analysis of DNA-methylation and gene expression using high-dimensional penalized regression: a cohort study on bone mineral density in postmenopausal women.

    PubMed

    Lien, Tonje G; Borgan, Ørnulf; Reppe, Sjur; Gautvik, Kaare; Glad, Ingrid Kristine

    2018-03-07

Using high-dimensional penalized regression, we studied genome-wide DNA-methylation in bone biopsies of 80 postmenopausal women in relation to their bone mineral density (BMD). The women showed BMD varying from severely osteoporotic to normal. Global gene expression data from the same individuals were available, and since DNA-methylation often affects gene expression, the overall aim of this paper was to include both of these omics data sets in an integrated analysis. Classical penalized regression uses one penalty, but we incorporated individual penalties for each of the DNA-methylation sites. These individual penalties were guided by the strength of association between DNA-methylation and gene transcript levels. DNA-methylation sites that were highly associated with one or more transcripts received lower penalties and were therefore favored over DNA-methylation sites showing less association with expression. Because of the complex pathways and interactions among genes, we investigated both the association between DNA-methylation sites and their corresponding cis genes, and the association between DNA-methylation sites and trans-located genes. Two integrating penalized methods were used: first, an adaptive group-regularized ridge regression; and second, variable selection through a modified version of the weighted lasso. When information from gene expression was integrated, predictive performance was considerably improved, in terms of predictive mean square error, compared to classical penalized regression without data integration. We found a 14.7% improvement in the ridge regression case and a 17% improvement for the lasso case. Our version of the weighted lasso with data integration found a list of 22 interesting methylation sites. Several corresponded to genes that are known to be important in bone formation.
Using BMD as response and these 22 methylation sites as covariates, least-squares regression analysis resulted in R² = 0.726, compared with an average R² = 0.438 for 10000 randomly selected groups of 22 DNA-methylation sites. Two recent types of penalized regression methods were adapted to integrate DNA-methylation and its association with gene expression into the analysis of bone mineral density. In both cases predictions clearly benefit from including the additional information on gene expression.
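The individual-penalty idea can be sketched with a plain coordinate-descent lasso in which each feature carries its own penalty, so that lower penalties favor "expression-supported" sites. Everything below is simulated and illustrative, not the paper's weighted-lasso implementation:

```python
import numpy as np

# Lasso with per-feature penalties via coordinate descent (soft-thresholding).
# Objective: 0.5*||y - X b||^2 + n * sum_j penalties[j] * |b_j|.

def weighted_lasso(X, y, penalties, n_iter=200):
    n, p = X.shape
    beta = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual
            rho = X[:, j] @ r
            lam = penalties[j] * n
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_ss[j]
    return beta

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 10))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(size=100)

# Lower penalty for the first two features (assumed "expression-supported")
pen = np.full(10, 0.5)
pen[:2] = 0.05
beta = weighted_lasso(X, y, pen)
```

Features with small penalties survive selection while heavily penalized ones are shrunk to exactly zero, which is how external association evidence steers the selection.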

  20. Retrieval and Mapping of Heavy Metal Concentration in Soil Using Time Series Landsat 8 Imagery

    NASA Astrophysics Data System (ADS)

    Fang, Y.; Xu, L.; Peng, J.; Wang, H.; Wong, A.; Clausi, D. A.

    2018-04-01

Heavy metal pollution is a critical global environmental problem and a long-standing concern. The traditional approach to obtaining heavy metal concentrations, relying on field sampling and laboratory testing, is expensive and time consuming. Although many related studies use spectrometer data to build a relational model between heavy metal concentration and spectral information, and then apply the model to hyperspectral imagery, this approach can hardly map the soil metal concentration of an area quickly and accurately because of the discrepancies between spectrometer data and remote sensing imagery. Taking advantage of the easy accessibility of Landsat 8 data, this study uses Landsat 8 imagery to retrieve soil Cu concentration and map its distribution in the study area. To enlarge the spectral information for more accurate retrieval and mapping, 11 single-date Landsat 8 images from 2013–2017 are combined to form a time series. Three regression methods, partial least squares regression (PLSR), artificial neural network (ANN) and support vector regression (SVR), are used for model construction. After an unbiased comparison of these models, the best model is selected for mapping the Cu concentration distribution. The produced distribution map shows good spatial autocorrelation and consistency with the mining area locations.

  1. Regression-based adaptive sparse polynomial dimensional decomposition for sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Tang, Kunkun; Congedo, Pietro; Abgrall, Remi

    2014-11-01

Polynomial dimensional decomposition (PDD) is employed in this work for global sensitivity analysis and uncertainty quantification of stochastic systems subject to a large number of random input variables. Due to the intimate connection between PDD and analysis of variance, PDD is able to provide a simpler and more direct evaluation of the Sobol' sensitivity indices than polynomial chaos (PC). Unfortunately, the number of PDD terms grows exponentially with the size of the input random vector, which makes the computational cost of the standard method unaffordable for real engineering applications. To address this curse of dimensionality, this work proposes a variance-based adaptive strategy aiming to build a cheap meta-model by sparse PDD with the PDD coefficients computed by regression. During this adaptive procedure, the PDD representation contains only a few terms, so that the cost of repeatedly solving the linear system of the least-squares regression problem is negligible. The size of the final sparse-PDD representation is much smaller than the full PDD, since only significant terms are eventually retained. Consequently, far fewer calls to the deterministic model are required to compute the final PDD coefficients.
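The regression-based variance decomposition behind PDD (and PC) methods can be sketched on a toy two-variable model: fit an orthonormal polynomial basis by least squares, then read Sobol' indices off the squared coefficients. The basis size, toy model, and noise level below are illustrative assumptions:

```python
import numpy as np

# Least-squares fit of an orthonormal (Legendre) expansion, then Sobol'
# indices from the variance contributions of each basis term.

rng = np.random.default_rng(7)
n = 4000
x1, x2 = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
y = x1 + 0.5 * x2 ** 2 + 0.1 * rng.normal(size=n)   # toy model

# Orthonormal Legendre polynomials for U(-1, 1) inputs
def P1(x): return np.sqrt(3.0) * x
def P2(x): return np.sqrt(5.0) * 0.5 * (3.0 * x ** 2 - 1.0)

B = np.column_stack([np.ones(n), P1(x1), P2(x1), P1(x2), P2(x2)])
c, *_ = np.linalg.lstsq(B, y, rcond=None)

var_terms = c[1:] ** 2                        # variance of each basis term
S1 = var_terms[:2].sum() / var_terms.sum()    # Sobol' index of x1
S2 = var_terms[2:].sum() / var_terms.sum()    # Sobol' index of x2
```

Because the basis is orthonormal, each squared coefficient is that term's variance contribution, so the indices fall out of a single regression; a sparse-PDD strategy simply keeps only the terms whose contribution is significant.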

  2. Revisiting Gaussian Process Regression Modeling for Localization in Wireless Sensor Networks

    PubMed Central

    Richter, Philipp; Toledano-Ayala, Manuel

    2015-01-01

Signal strength-based positioning in wireless sensor networks is a key technology for seamless, ubiquitous localization, especially in areas where Global Navigation Satellite System (GNSS) signals propagate poorly. To enable wireless local area network (WLAN) location fingerprinting in larger areas while maintaining accuracy, methods to reduce the effort of radio map creation must be consolidated and automated. Gaussian process regression has been applied to overcome this issue, with promising results, but the fit of the model was never thoroughly assessed. Instead, most studies trained a readily available model, relying on the zero mean and squared exponential covariance function, without further scrutiny. This paper studies Gaussian process regression model selection for WLAN fingerprinting in indoor and outdoor environments. We train several models for indoor, outdoor, and combined areas; we evaluate them quantitatively and compare them by means of adequate model measures, hence assessing the fit of these models directly. To illuminate the quality of the model fit, the residuals of the proposed model are investigated as well. Comparative experiments on the positioning performance verify and conclude the model selection. In this way, we show that the standard model is not the most appropriate, discuss alternatives and present our best candidate. PMID:26370996
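The "readily available model" criticized above, a zero-mean prior with a squared exponential covariance, reduces to a few lines of linear algebra. A sketch with hand-fixed hyperparameters and an invented, already de-trended 1-D signal (real fingerprinting data would be centered or de-trended to be consistent with the zero-mean prior):

```python
import numpy as np

# Gaussian process regression posterior mean with a zero-mean prior and
# squared exponential covariance. Hyperparameters are assumed, not selected.

def sq_exp(a, b, length=1.0, sigma_f=1.0):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=2)
    return sigma_f ** 2 * np.exp(-0.5 * d2 / length ** 2)

def gp_posterior_mean(X, y, Xq, noise=0.1, length=1.0):
    K = sq_exp(X, X, length) + noise ** 2 * np.eye(len(X))
    Ks = sq_exp(Xq, X, length)
    return Ks @ np.linalg.solve(K, y)

# Toy: de-trended signal variation (dB) along a 10 m corridor
X = np.linspace(0, 10, 25)[:, None]
y = np.sin(X[:, 0])
mu = gp_posterior_mean(X, y, np.array([[5.0]]), noise=0.05, length=1.5)
```

The paper's point is that this default prior is rarely questioned; checking residuals and comparing alternative mean/covariance choices is exactly the model-selection step this sketch omits.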

  3. Modeling the spatio-temporal heterogeneity in the PM10-PM2.5 relationship

    NASA Astrophysics Data System (ADS)

    Chu, Hone-Jay; Huang, Bo; Lin, Chuan-Yao

    2015-02-01

This paper explores the spatio-temporal patterns of particulate matter (PM) in Taiwan using a series of methods. Using fuzzy c-means clustering first, the spatial heterogeneity (six clusters) in the PM data collected between 2005 and 2009 in Taiwan is identified, and the industrial and urban areas of Taiwan (southwestern, west central, northwestern, and northern Taiwan) are found to have high PM concentrations. The PM10-PM2.5 relationship is then modeled with global ordinary least squares regression, geographically weighted regression (GWR), and geographically and temporally weighted regression (GTWR). GTWR and GWR produce consistent results; however, GTWR provides more detailed information on the spatio-temporal variations of the PM10-PM2.5 relationship. The results also show that GTWR provides a relatively high goodness of fit and sufficient space-time explanatory power. In particular, PM2.5 or PM10 varies with time and space, depending on weather conditions and the spatial distribution of land use and emission patterns in local areas. Such information can be used to determine patterns of spatio-temporal heterogeneity in PM that will allow the control of pollutants and the reduction of public exposure.
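GWR replaces the single global OLS fit with one weighted regression per target location, the weights decaying with distance from that location. A sketch with a simulated, spatially varying PM10-PM2.5 slope (bandwidth, coordinates, and data are all illustrative):

```python
import numpy as np

# Geographically weighted regression: a weighted least-squares fit at each
# target location with a Gaussian spatial kernel. Everything is simulated.

def gwr_coefs(coords, x, y, target, bandwidth=1.0):
    d = np.linalg.norm(coords - target, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)        # Gaussian spatial kernel
    Xd = np.column_stack([np.ones(len(x)), x])
    W = np.diag(w)
    return np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)  # [intercept, slope]

# Toy: PM2.5 = a(u,v) + b(u,v) * PM10, slope rising eastward
rng = np.random.default_rng(5)
coords = rng.uniform(0, 10, size=(500, 2))
pm10 = rng.uniform(20, 120, size=500)
slope = 0.4 + 0.02 * coords[:, 0]
pm25 = 5 + slope * pm10 + rng.normal(0, 2, 500)

west = gwr_coefs(coords, pm10, pm25, np.array([1.0, 5.0]))
east = gwr_coefs(coords, pm10, pm25, np.array([9.0, 5.0]))
```

GTWR extends the same idea by letting the kernel decay in time as well as space, which is what yields the finer spatio-temporal detail reported above.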

  4. Constructive thinking, rational intelligence and irritable bowel syndrome

    PubMed Central

    Rey, Enrique; Ortega, Marta Moreno; Alonso, Monica Olga Garcia; Diaz-Rubio, Manuel

    2009-01-01

    AIM: To evaluate rational and experiential intelligence in irritable bowel syndrome (IBS) sufferers. METHODS: We recruited 100 subjects with IBS as per Rome II criteria (50 consulters and 50 non-consulters) and 100 healthy controls, matched by age, sex and educational level. Cases and controls completed a clinical questionnaire (including symptom characteristics and medical consultation) and the following tests: rational-intelligence (Wechsler Adult Intelligence Scale, 3rd edition); experiential-intelligence (Constructive Thinking Inventory); personality (NEO personality inventory); psychopathology (MMPI-2), anxiety (state-trait anxiety inventory) and life events (social readjustment rating scale). Analysis of variance was used to compare the test results of IBS-sufferers and controls, and a logistic regression model was then constructed and adjusted for age, sex and educational level to evaluate any possible association with IBS. RESULTS: No differences were found between IBS cases and controls in terms of IQ (102.0 ± 10.8 vs 102.8 ± 12.6), but IBS sufferers scored significantly lower in global constructive thinking (43.7 ± 9.4 vs 49.6 ± 9.7). In the logistic regression model, global constructive thinking score was independently linked to suffering from IBS [OR 0.92 (0.87-0.97)], without significant OR for total IQ. CONCLUSION: IBS subjects do not show lower rational intelligence than controls, but lower experiential intelligence is nevertheless associated with IBS. PMID:19575489

  5. Regional estimation of extreme suspended sediment concentrations using watershed characteristics

    NASA Astrophysics Data System (ADS)

    Tramblay, Yves; Ouarda, Taha B. M. J.; St-Hilaire, André; Poulin, Jimmy

    2010-01-01

The number of stations monitoring daily suspended sediment concentration (SSC) has been decreasing since the 1980s in North America, while suspended sediment is considered a key variable for water quality. The objective of this study is to test the feasibility of regionalising extreme SSC, i.e. estimating extreme SSC values for ungauged basins. Annual maximum SSC for 72 rivers in Canada and the USA were modelled with probability distributions in order to estimate quantiles corresponding to different return periods. Regionalisation techniques, originally developed for flood prediction in ungauged basins, were tested using the climatic, topographic, land cover and soil attributes of the watersheds. Two approaches were compared, using either physiographic characteristics or the seasonality of extreme SSC to delineate the regions. Multiple regression models to estimate SSC quantiles as a function of watershed characteristics were built in each region and compared to a global model including all sites. Regional estimates of SSC quantiles were compared with the local values. Results show that regional estimation of extreme SSC is more efficient than a global regression model including all sites. Groups/regions of stations were identified, using either the watershed characteristics or the seasonality of occurrence of extreme SSC values, providing a method to better describe extreme SSC events. The most important variables for predicting extreme SSC are the percentage of clay in the soils, precipitation intensity and forest cover.
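The at-site step of such a regionalisation, fitting a distribution to annual maxima and extracting return-period quantiles, can be sketched with a Gumbel (EV1) fit by the method of moments. The distribution choice, fitting method, and sample values below are illustrative assumptions, not the study's:

```python
import math

# Gumbel (EV1) fit by the method of moments, then the quantile for a
# given return period. Annual-maximum values are invented.

EULER_GAMMA = 0.5772156649

def gumbel_quantile(sample, return_period):
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi          # scale parameter
    mu = mean - EULER_GAMMA * beta                 # location parameter
    p = 1.0 - 1.0 / return_period                  # non-exceedance prob.
    return mu - beta * math.log(-math.log(p))

annual_max_ssc = [310, 420, 280, 515, 390, 610, 350, 470, 295, 540]  # mg/L
q20 = gumbel_quantile(annual_max_ssc, 20)          # 20-year quantile
```

The regional regression models described above then predict quantiles like `q20` from watershed characteristics (clay percentage, precipitation intensity, forest cover) at basins with no SSC record.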

  6. Evaluation of 41 Candidate Gene Variants for Obesity in the EPIC-Potsdam Cohort by Multi-Locus Stepwise Regression

    PubMed Central

    Knüppel, Sven; Rohde, Klaus; Meidtner, Karina; Drogan, Dagmar; Holzhütter, Hermann-Georg; Boeing, Heiner; Fisher, Eva

    2013-01-01

Objective Obesity has become a leading preventable cause of morbidity and mortality in many parts of the world. It is thought to originate from multiple genetic and environmental determinants. The aim of the current study was to introduce haplotype-based multi-locus stepwise regression (MSR) as a method to investigate combinations of unlinked single nucleotide polymorphisms (SNPs) for obesity phenotypes. Methods In 2,122 healthy randomly selected men and women of the EPIC-Potsdam cohort, the association between 41 SNPs from 18 obesity-candidate genes and either body mass index (BMI, mean = 25.9 kg/m², SD = 4.1) or waist circumference (WC, mean = 85.2 cm, SD = 12.6) was assessed. Single-SNP analyses were done using linear regression adjusted for age, sex, and other covariates. Subsequently, MSR was applied to search for the ‘best’ SNP combinations. Combinations were selected according to specific AICc and p-value criteria. Model uncertainty was accounted for by a permutation test. Results The strongest single-SNP effects on BMI were found for TBC1D1 rs637797 (β = −0.33, SE = 0.13), FTO rs9939609 (β = 0.28, SE = 0.13), MC4R rs17700144 (β = 0.41, SE = 0.15), and MC4R rs10871777 (β = 0.34, SE = 0.14). All these SNPs showed similar effects on waist circumference. The two ‘best’ six-SNP combinations for BMI (global p-values = 3.45×10⁻⁶ and 6.82×10⁻⁶) showed effects ranging from −1.70 (SE = 0.34) to 0.74 kg/m² (SE = 0.21) per allele combination. We selected two six-SNP combinations for waist circumference (global p-values = 7.80×10⁻⁶ and 9.76×10⁻⁶) with an allele combination effect of −2.96 cm (SE = 0.76) at maximum. Additional adjustment for BMI revealed 15 three-SNP combinations (global p-values ranged from 3.09×10⁻⁴ to 1.02×10⁻²). However, after carrying out the permutation test all SNP combinations lost significance, indicating that the statistical associations might have occurred by chance.
Conclusion MSR provides a tool to search for risk-related SNP combinations of common traits or diseases. However, the search process does not always find meaningful SNP combinations in a dataset. PMID:23874820
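The permutation test that overturned the SNP-combination findings works by recomputing the test statistic on phenotype-shuffled data to build a null distribution for the strongest association. A sketch on simulated null data (correlation is used as the statistic here for brevity, not the paper's MSR statistic):

```python
import numpy as np

# Permutation test for the maximum SNP-phenotype association. Shuffling the
# phenotype breaks any genotype link while keeping its distribution, so the
# shuffled maxima form the null distribution. All data are simulated.

rng = np.random.default_rng(11)
n, p = 500, 20
genotypes = rng.integers(0, 3, size=(n, p)).astype(float)  # dosages 0/1/2
bmi = 25 + rng.normal(0, 4, n)                             # null: no effect

def max_abs_corr(G, y):
    yc = y - y.mean()
    Gc = G - G.mean(axis=0)
    r = (Gc * yc[:, None]).sum(axis=0) / (
        np.sqrt((Gc ** 2).sum(axis=0)) * np.sqrt((yc ** 2).sum()))
    return np.abs(r).max()

observed = max_abs_corr(genotypes, bmi)
null = np.array([max_abs_corr(genotypes, rng.permutation(bmi))
                 for _ in range(500)])
p_value = (null >= observed).mean()
```

Because the statistic is the maximum over all candidates, the permutation null automatically accounts for the multiple testing done during the search, which is why selected combinations can lose significance under it.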

  7. Resting Heart Rate as Predictor for Left Ventricular Dysfunction and Heart Failure: The Multi-Ethnic Study of Atherosclerosis

    PubMed Central

    Opdahl, Anders; Venkatesh, Bharath Ambale; Fernandes, Veronica R. S.; Wu, Colin O.; Nasir, Khurram; Choi, Eui-Young; Almeida, Andre L. C.; Rosen, Boaz; Carvalho, Benilton; Edvardsen, Thor; Bluemke, David A.; Lima, Joao A. C.

    2014-01-01

OBJECTIVE To investigate the relationship between baseline resting heart rate and incidence of heart failure (HF) and global and regional left ventricular (LV) dysfunction. BACKGROUND The association of resting heart rate with HF and LV function is not well described in an asymptomatic multi-ethnic population. METHODS Participants in the Multi-Ethnic Study of Atherosclerosis had resting heart rate measured at inclusion. Incident HF was registered (n=176) during follow-up (median 7 years) in those who underwent cardiac MRI (n=5000). Changes in ejection fraction (ΔEF) and peak circumferential strain (Δεcc) were measured as markers of developing global and regional LV dysfunction in 1056 participants imaged at baseline and 5 years later. Time to HF (Cox model) and Δεcc and ΔEF (multiple linear regression models) were adjusted for demographics, traditional cardiovascular risk factors, calcium score, and LV end-diastolic volume and mass, in addition to resting heart rate. RESULTS Cox analysis demonstrated that for a 1 bpm increase in resting heart rate there was a 4% greater adjusted relative risk for incident HF (hazard ratio: 1.04; 95% CI: 1.02–1.06; P<0.001). Adjusted multiple regression models demonstrated that resting heart rate was positively associated with deteriorating εcc and decreasing EF, even in analyses where all coronary heart disease events were excluded from the model. CONCLUSION Elevated resting heart rate is associated with increased risk for incident HF in asymptomatic participants in MESA. Higher heart rate is related to development of regional and global LV dysfunction independent of subclinical atherosclerosis and coronary heart disease. PMID:24412444
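Because the Cox model is log-linear in the covariate, the reported per-bpm hazard ratio compounds multiplicatively over larger heart-rate differences. A two-line illustration of that rescaling (the 10 bpm contrast is an arbitrary example, not a figure from the paper):

```python
# Rescale the reported Cox hazard ratio of 1.04 per 1 bpm to a 10 bpm
# contrast: HRs compound multiplicatively under log-linearity.

hr_per_bpm = 1.04
hr_10bpm = hr_per_bpm ** 10                 # HR for a 10 bpm higher rate
ci_low, ci_high = 1.02 ** 10, 1.06 ** 10    # naively rescaled CI endpoints

print(round(hr_10bpm, 2))  # 1.48
```

So a participant with a resting heart rate 10 bpm above another's carries roughly a 48% higher adjusted hazard of incident HF under this model.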

  8. Change of sleep quality from pre- to 3 years post-solid organ transplantation: The Swiss Transplant Cohort Study

    PubMed Central

    Denhaerynck, Kris; Huynh-Do, Uyen; Binet, Isabelle; Hadaya, Karine; De Geest, Sabina

    2017-01-01

    Background Poor sleep quality (SQ) is common after solid organ transplantation; however, very little is known about its natural history. We assessed the changes in SQ from pre- to 3 years post-transplant in adult heart, kidney, liver and lung recipients included in the prospective nation-wide Swiss Transplant Cohort Study. We explored associations with selected variables in patients suffering persistent poor SQ compared to those with good or variable SQ. Methods Adult single organ transplant recipients enrolled in the Swiss Transplant Cohort Study with pre-transplant and at least 3 post-transplant SQ assessment data were included. SQ was self-reported pre-transplant (at listing), then at 6, 12, 24 and 36 months post-transplant. A single SQ item was used to identify poor (0–5) and good sleepers (6–10). Between organ groups, SQ was compared via logistic regression analysis with generalized estimating equations. Within the group reporting persistently poor SQ, we used logistic regression or Kaplan-Meier analysis as appropriate to check for differences in global quality of life and survival. Results In a sample of 1173 transplant patients (age: 52.1±13.2 years; 65% males; 66% kidney, 17% liver, 10% lung, 7% heart) transplanted between 2008 and 2012, pre-transplant poor SQ was highest in liver (50%) and heart (49%) recipients. Overall, poor SQ decreased significantly from pre-transplant (38%) to 24 months post-transplant (26%) and remained stable at 3 years (29%). Patients reporting persistently poor SQ had significantly more depressive symptomatology and lower global quality of life. Conclusion Because self-reported poor SQ is related to poorer global quality of life, these results emphasize the need for further studies to find suitable treatment options for poor SQ in transplant recipients. PMID:29020112

  9. Improving Global Forecast System of extreme precipitation events with regional statistical model: Application of quantile-based probabilistic forecasts

    NASA Astrophysics Data System (ADS)

    Shastri, Hiteshri; Ghosh, Subimal; Karmakar, Subhankar

    2017-02-01

    Forecasting of extreme precipitation events at a regional scale is of high importance due to their severe impacts on society. The impacts are stronger in urban regions due to high flood potential as well as high population density, leading to high vulnerability. Although significant scientific improvements have taken place in the global models for weather forecasting, they are still not adequate at a regional scale (e.g., for an urban region), with high false alarm and low detection rates. There has been a need to improve weather forecast skill at a local scale with a probabilistic outcome. Here we develop a methodology with quantile regression, where the reliably simulated variables from the Global Forecast System are used as predictors and different quantiles of rainfall are generated corresponding to that set of predictors. We apply this method to a flood-prone coastal city of India, Mumbai, which has experienced severe floods in recent years. We find significant improvements in the forecast with high detection and skill scores. We apply the methodology to 10 ensemble members of the Global Ensemble Forecast System and find a reduction in ensemble uncertainty of precipitation across realizations with respect to that of the original precipitation forecasts. We validate our model for the monsoon seasons of 2006 and 2007, which are independent of the training/calibration data set used in the study. We find promising results and emphasize the need to implement such data-driven methods for better probabilistic forecasts at an urban scale, primarily for early flood warning.
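    The quantile forecasts described above rest on a standard property of the pinball (check) loss: the value minimizing its mean over a sample is the corresponding empirical quantile. A minimal stand-alone sketch of that property (toy data, not the Mumbai model; the search is restricted to observed values, where a minimizer always lies):

```python
# Pinball loss for quantile level tau:
# L_tau(y, q) = tau*(y - q) if y >= q else (1 - tau)*(q - y).
def pinball(y, q, tau):
    return sum(tau * (v - q) if v >= q else (1 - tau) * (q - v) for v in y) / len(y)

y = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0]
tau = 0.75

# The observed value minimizing mean pinball loss is the 0.75 quantile.
best_q = min(y, key=lambda q: pinball(y, q, tau))
print(best_q)
```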

  10. Global Potential Net Primary Production Predicted from Vegetation Class, Precipitation, and Temperature

    USDA-ARS?s Scientific Manuscript database

    Net Primary Production (NPP), the difference between CO2 fixed by photosynthesis and CO2 lost to autotrophic respiration, is one of the most important components of the carbon cycle. Our goal was to develop a simple regression model to estimate global NPP using climate and land cover data. Approxima...

  11. Quantitative estimation of global patterns of surface ocean biological productivity and its seasonal variation on timescales from centuries to millennia

    NASA Astrophysics Data System (ADS)

    Loubere, Paul; Fariduddin, Mohammad

    1999-03-01

    We present a quantitative method, based on the relative abundances of benthic foraminifera in deep-sea sediments, for estimating surface ocean biological productivity over the timescale of centuries to millennia. We calibrate the method using a global data set composed of 207 samples from the Atlantic, Pacific, and Indian Oceans from a water depth range between 2300 and 3600 m. The sample set was developed so that other, potentially significant, environmental variables would be uncorrelated with overlying surface ocean productivity. A regression of assemblages against productivity yielded an r2 = 0.89, demonstrating a strong productivity signal in the faunal data. In addition, we examined assemblage response to annual variability in biological productivity (seasonality). Our data set included a range of seasonalities, which we quantified into a seasonality index using the pigment color bands from the coastal zone color scanner (CZCS). The response of benthic foraminiferal assemblage composition to our seasonality index was tested with regression analysis. We obtained a statistically highly significant r2 = 0.75. Further, discriminant function analysis revealed a clear separation among sample groups based on surface ocean productivity and our seasonality index. Finally, we tested the response of benthic foraminiferal assemblages to three different modes of seasonality. We observed a distinct separation of our samples into groups representing low seasonal variability, strong seasonality with a single main productivity event in the year, and strong seasonality with multiple productivity events in the year. Reconstructing surface ocean biological productivity with benthic foraminifera will aid in modeling marine biogeochemical cycles. Also, estimating mode and range of annual seasonality will provide insight into changing oceanic processes, allowing the examination of the mechanisms causing changes in the marine biotic system over time.
This article contains supplementary material.
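    The r2 values quoted above summarize how much response variance the regressions explain. A minimal sketch of that computation on toy calibration data (hypothetical numbers, not the foraminiferal data set):

```python
import numpy as np

# r^2 = 1 - SS_res / SS_tot: the share of response variance
# explained by the fitted values.
def r_squared(y, y_hat):
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Toy calibration: productivity (response) vs. a single faunal score.
x = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1, 5.9])
slope, intercept = np.polyfit(x, y, 1)
print(round(r_squared(y, slope * x + intercept), 3))
```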

  12. Global atmospheric emissions and transport of polycyclic aromatic hydrocarbons: Evaluation of modeling and transboundary pollution

    NASA Astrophysics Data System (ADS)

    Shen, Huizhong; Tao, Shu

    2014-05-01

    Global atmospheric emissions of 16 polycyclic aromatic hydrocarbons (PAHs) from 69 major sources were estimated for the period from 1960 to 2030. Regression models and a technology split method were used to estimate country- and time-specific emission factors, resulting in a new estimate of PAH emission factor variation among different countries and over time. PAH emissions in 2007 were spatially resolved to 0.1° × 0.1° grids based on a newly developed global high-resolution fuel combustion inventory (PKU-FUEL-2007). MOZART-4 (The Model for Ozone and Related Chemical Tracers, version 4) was applied to simulate the global tropospheric transport of benzo(a)pyrene (BaP), one of the high molecular weight carcinogenic PAHs, at a horizontal resolution of 1.875° (longitude) × 1.8947° (latitude). The reaction with the OH radical, gas/particle partitioning, wet deposition, dry deposition, and dynamic soil/ocean-air exchange of PAHs were considered. The simulation was validated against observations at both background and non-background sites, including the Alert site in the Canadian High Arctic, EMEP sites in Europe, and 254 other urban/rural sites reported in the literature. Key factors affecting the long-range transport of BaP were addressed, and transboundary pollution was discussed.

  13. Derivation of global vegetation biophysical parameters from EUMETSAT Polar System

    NASA Astrophysics Data System (ADS)

    García-Haro, Francisco Javier; Campos-Taberner, Manuel; Muñoz-Marí, Jordi; Laparra, Valero; Camacho, Fernando; Sánchez-Zapero, Jorge; Camps-Valls, Gustau

    2018-05-01

    This paper presents the algorithm developed in LSA-SAF (Satellite Application Facility for Land Surface Analysis) for the derivation of global vegetation parameters from the AVHRR (Advanced Very High Resolution Radiometer) sensor on board MetOp (Meteorological-Operational) satellites forming the EUMETSAT (European Organization for the Exploitation of Meteorological Satellites) Polar System (EPS). The suite of LSA-SAF EPS vegetation products includes the leaf area index (LAI), the fractional vegetation cover (FVC), and the fraction of absorbed photosynthetically active radiation (FAPAR). LAI, FAPAR, and FVC characterize the structure and the functioning of vegetation and are key parameters for a wide range of land-biosphere applications. The algorithm is based on a hybrid approach that blends the generalization capabilities offered by physical radiative transfer models with the accuracy and computational efficiency of machine learning methods. One major feature is the implementation of multi-output retrieval methods able to jointly and more consistently estimate all the biophysical parameters at the same time. We propose a multi-output Gaussian process regression (GPRmulti), which outperforms other considered methods over PROSAIL (coupling of PROSPECT and SAIL (Scattering by Arbitrary Inclined Leaves) radiative transfer models) EPS simulations. The global EPS products include uncertainty estimates taking into account the uncertainty captured by the retrieval method and input errors propagation. A sensitivity analysis is performed to assess several sources of uncertainties in retrievals and maximize the positive impact of modeling the noise in training simulations. The paper discusses initial validation studies and provides details about the characteristics and overall quality of the products, which can be of interest to assist the successful use of the data by a broad user community.
The consistent generation and distribution of the EPS vegetation products will constitute a valuable tool for monitoring of earth surface dynamic processes.
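    The multi-output retrieval idea can be illustrated with a bare-bones Gaussian process regression in which one shared kernel system is solved against a multi-column target, so all outputs are estimated jointly. This is a toy sketch with synthetic data and sin/cos stand-ins for the biophysical targets, not the GPRmulti algorithm trained on PROSAIL simulations:

```python
import numpy as np

# Squared-exponential (RBF) kernel between two point sets.
def rbf_kernel(A, B, length_scale=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

rng = np.random.default_rng(0)
X = rng.uniform(0, 5, size=(30, 1))
# Two outputs sharing one input (toy stand-ins for, e.g., LAI and FAPAR).
Y = np.column_stack([np.sin(X[:, 0]), np.cos(X[:, 0])])

K = rbf_kernel(X, X) + 1e-6 * np.eye(len(X))  # jitter for numerical stability
alpha = np.linalg.solve(K, Y)                 # one factorization, all outputs at once

X_test = np.array([[1.0], [2.5]])
Y_pred = rbf_kernel(X_test, X) @ alpha
print(Y_pred.shape)
```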

  14. Synthesizing Regression Results: A Factored Likelihood Method

    ERIC Educational Resources Information Center

    Wu, Meng-Jia; Becker, Betsy Jane

    2013-01-01

    Regression methods are widely used by researchers in many fields, yet methods for synthesizing regression results are scarce. This study proposes using a factored likelihood method, originally developed to handle missing data, to appropriately synthesize regression models involving different predictors. This method uses the correlations reported…

  15. Exploration of time-course combinations of outcome scales for use in a global test of stroke recovery.

    PubMed

    Goldie, Fraser C; Fulton, Rachael L; Dawson, Jesse; Bluhmki, Erich; Lees, Kennedy R

    2014-08-01

    Clinical trials for acute ischemic stroke treatment require large numbers of participants and are expensive to conduct. Methods that enhance statistical power are therefore desirable. We explored whether this can be achieved by a measure incorporating both early and late measures of outcome (e.g. seven-day NIH Stroke Scale combined with 90-day modified Rankin scale). We analyzed sensitivity to treatment effect, using proportional odds logistic regression for ordinal scales and the generalized estimating equation method for global outcomes, with all analyses adjusted for baseline severity and age. We ran simulations to assess relations between sample size and power for ordinal scales and corresponding global outcomes. We used R version 2.12.1 (R Development Core Team. R Foundation for Statistical Computing, Vienna, Austria) for simulations and SAS 9.2 (SAS Institute Inc., Cary, NC, USA) for all other analyses. Each scale considered for combination was sensitive to treatment effect in isolation. The mRS90 and NIHSS90 had adjusted odds ratios of 1.56 and 1.62, respectively. Adjusted odds ratios for global outcomes of the combination of mRS90 with NIHSS7 and NIHSS90 with NIHSS7 were 1.69 and 1.73, respectively. The smallest sample sizes required to generate statistical power ≥80% for mRS90, NIHSS7, and global outcomes of mRS90 and NIHSS7 combined and NIHSS90 and NIHSS7 combined were 500, 490, 400, and 380, respectively. When data concerning both early and late outcomes are combined into a global measure, there is increased sensitivity to treatment effect compared with solitary ordinal scales. This delivers a 20% reduction in required sample size at 80% power. Combining early with late outcomes merits further consideration. © 2013 The Authors. International Journal of Stroke © 2013 World Stroke Organization.
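    The sample-size figures above come from simulation. The general recipe is to simulate many trials of a given size, test each one, and report the fraction reaching significance. A sketch of that loop, with a two-proportion z-test standing in for the ordinal and global analyses in the abstract (illustrative event rates, not trial data):

```python
import random, math

def simulate_power(n_per_arm, p_control, p_treated, n_sims=2000, seed=1):
    """Monte Carlo power estimate for a two-sided two-proportion z-test."""
    rng = random.Random(seed)
    z_crit = 1.959963984540054  # two-sided 5% critical value
    hits = 0
    for _ in range(n_sims):
        a = sum(rng.random() < p_treated for _ in range(n_per_arm))
        b = sum(rng.random() < p_control for _ in range(n_per_arm))
        p1, p0 = a / n_per_arm, b / n_per_arm
        p_pool = (a + b) / (2 * n_per_arm)
        se = math.sqrt(2 * p_pool * (1 - p_pool) / n_per_arm)
        if se > 0 and abs(p1 - p0) / se > z_crit:
            hits += 1
    return hits / n_sims

# Illustrative rates: 40% vs. 52% good outcome, 250 per arm.
power = simulate_power(250, 0.40, 0.52)
print(power)
```

Repeating this over a grid of sample sizes and reading off where the curve crosses 80% gives the kind of smallest-sample-size figures quoted in the abstract.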

  16. Predictors of health-related and global quality of life among young adults with difficult-to-treat epilepsy and mild intellectual disability.

    PubMed

    Endermann, Michael

    2013-02-01

    This study evaluated predictors of health-related quality of life (HRQOL) and global quality of life (QOL) among young adults with difficult-to-treat epilepsy and mild intellectual disability. One hundred and forty-two persons with epilepsy and cognitive problems were routinely screened on HRQOL, global QOL, and psychological distress four weeks after admission to a time-limited residential rehabilitation unit. The PESOS scales (PE = PErformance, SO = SOciodemographic aspects, S = Subjective evaluation/estimation) on epilepsy-specific problems were administered as measures of HRQOL; a questionnaire on life satisfaction and an item on overall QOL were used as measures of global QOL. Psychological distress was captured with the Symptom Checklist 90-R. Further data were gained from medical files. Quality-of-life predictors were identified using univariate methods and stepwise regression analyses. Psychological distress was the only predictor of all HRQOL and global QOL parameters. Seizure frequency was a predictor of most HRQOL variables. Other epilepsy variables affected only some HRQOL variables but were not associated with global QOL. Health-related quality of life did not seem to be strongly impaired. Only low correlations were found between HRQOL and global QOL. The notion of psychological distress as the most influential predictor of all QOL measures is in line with most findings on QOL in epilepsy. Former observations of weak associations between HRQOL and global QOL among patients with epilepsy and mild intellectual disability are supported. Thus, interventions to reduce psychological distress, besides epilepsy treatment, seem to be of great importance to improve QOL. Copyright © 2012 Elsevier Inc. All rights reserved.
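    Stepwise regression, as used above to identify QOL predictors, can be sketched as greedy forward selection: repeatedly add the candidate predictor that most improves the fit. A minimal version on synthetic data (not the PESOS variables):

```python
import numpy as np

def forward_stepwise(X, y, n_select):
    """Greedy forward selection: add the column that most improves OLS R^2."""
    n = len(y)
    chosen, remaining = [], list(range(X.shape[1]))
    for _ in range(n_select):
        def r2_with(j):
            A = np.column_stack([np.ones(n), X[:, chosen + [j]]])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ beta
            return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
        best = max(remaining, key=r2_with)
        chosen.append(best)
        remaining.remove(best)
    return chosen

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
# Only columns 2 (strong) and 0 (weaker) actually drive the response.
y = 3.0 * X[:, 2] + 1.0 * X[:, 0] + rng.normal(scale=0.5, size=200)
print(forward_stepwise(X, y, 2))
```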

  17. Logistic Regression for Seismically Induced Landslide Predictions: Using Uniform Hazard and Geophysical Layers as Predictor Variables

    NASA Astrophysics Data System (ADS)

    Nowicki, M. A.; Hearne, M.; Thompson, E.; Wald, D. J.

    2012-12-01

    Seismically induced landslides present costly and often fatal threats in many mountainous regions. Substantial effort has been invested to understand where seismically induced landslides may occur in the future. Both slope-stability methods and, more recently, statistical approaches to the problem are described throughout the literature. Though some regional efforts have succeeded, no uniformly agreed-upon method is available for predicting the likelihood and spatial extent of seismically induced landslides. For use in the U. S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, we would like to routinely make such estimates, in near-real time, around the globe. Here we use the recently produced USGS ShakeMap Atlas of historic earthquakes to develop an empirical landslide probability model. We focus on recent events, yet include any digitally-mapped landslide inventories for which well-constrained ShakeMaps are also available. We combine these uniform estimates of the input shaking (e.g., peak acceleration and velocity) with broadly available susceptibility proxies, such as topographic slope and surface geology. The resulting database is used to build a predictive model of the probability of landslide occurrence with logistic regression. The landslide database includes observations from the Northridge, California (1994); Wenchuan, China (2008); Chi-Chi, Taiwan (1999); and Chuetsu, Japan (2004) earthquakes; we also provide ShakeMaps for moderate-sized events without landslides for proper model testing and training. The performance of the regression model is assessed with both statistical goodness-of-fit metrics and a qualitative review of whether or not the model is able to capture the spatial extent of landslides for each event.
Part of our goal is to determine which variables can be employed based on globally-available data or proxies, and whether or not modeling results from one region are transferable to geomorphologically-similar regions that lack proper calibration events. Combined with near-real time ShakeMaps, we anticipate using our model to make generalized predictions of whether or not (and if so, where) landslides are likely to occur for earthquakes around the globe; we also intend to incorporate this functionality into the USGS PAGER system.
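    The core of such an empirical model is a logistic regression of landslide occurrence on shaking and susceptibility covariates. A minimal sketch fitting one by gradient descent on synthetic covariates (not the USGS inventories or ShakeMap inputs):

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=5000):
    """Fit logistic regression by plain gradient descent on the log-loss."""
    Xb = np.column_stack([np.ones(len(y)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 2))  # e.g. standardized peak acceleration, slope
logits = -1.0 + 2.0 * X[:, 0] + 1.0 * X[:, 1]
y = (rng.random(500) < 1 / (1 + np.exp(-logits))).astype(float)

w = fit_logistic(X, y)  # recovers roughly [-1, 2, 1]
print(np.round(w, 2))
```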

  18. WebGLORE: a Web service for Grid LOgistic REgression

    PubMed Central

    Jiang, Wenchao; Li, Pinghao; Wang, Shuang; Wu, Yuan; Xue, Meng; Ohno-Machado, Lucila; Jiang, Xiaoqian

    2013-01-01

    WebGLORE is a free web service that enables privacy-preserving construction of a global logistic regression model from distributed datasets that are sensitive. It only transfers aggregated local statistics (from participants) through Hypertext Transfer Protocol Secure to a trusted server, where the global model is synthesized. WebGLORE seamlessly integrates AJAX, JAVA Applet/Servlet and PHP technologies to provide an easy-to-use web service for biomedical researchers to break down policy barriers during information exchange. Availability and implementation: http://dbmi-engine.ucsd.edu/webglore3/. WebGLORE can be used under the terms of GNU general public license as published by the Free Software Foundation. Contact: x1jiang@ucsd.edu PMID:24072732
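    The privacy-preserving construction is possible because logistic regression fitting by Newton's method needs only the gradient and Hessian of the log-likelihood, and those sum across sites. A toy sketch of that aggregation idea (synthetic data; a sketch of the concept, not the WebGLORE implementation):

```python
import numpy as np

def local_stats(Xb, y, w):
    """Gradient and Hessian of the local log-likelihood at coefficients w."""
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    grad = Xb.T @ (y - p)
    hess = -(Xb * (p * (1 - p))[:, None]).T @ Xb
    return grad, hess

rng = np.random.default_rng(3)
sites = []
for _ in range(3):  # three participating sites; row-level data stays local
    X = rng.normal(size=(200, 2))
    logits = 0.5 + 1.5 * X[:, 0] - 1.0 * X[:, 1]
    y = (rng.random(200) < 1 / (1 + np.exp(-logits))).astype(float)
    sites.append((np.column_stack([np.ones(200), X]), y))

w = np.zeros(3)
for _ in range(15):  # server: Newton steps on the summed statistics
    grads, hessians = zip(*(local_stats(Xb, y, w) for Xb, y in sites))
    w = w - np.linalg.solve(sum(hessians), sum(grads))
print(np.round(w, 2))  # close to the pooled-data estimate
```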

  19. Empirical Assessment of Spatial Prediction Methods for Location Cost Adjustment Factors

    PubMed Central

    Migliaccio, Giovanni C.; Guindani, Michele; D'Incognito, Maria; Zhang, Linlin

    2014-01-01

    In the feasibility stage, the correct prediction of construction costs ensures that budget requirements are met from the start of a project's lifecycle. A very common approach for performing quick order-of-magnitude estimates is based on using Location Cost Adjustment Factors (LCAFs) that compute historically based costs by project location. Nowadays, numerous LCAF datasets are commercially available in North America, but, obviously, they do not include all locations. Hence, LCAFs for un-sampled locations need to be inferred through spatial interpolation or prediction methods. Currently, practitioners tend to select the value for a location using only one variable, namely the linear distance between two sites. However, construction costs could be affected by socio-economic variables, as suggested by macroeconomic theories. Using a commonly used set of LCAFs, the City Cost Indexes (CCI) by RSMeans, and the socio-economic variables included in the ESRI Community Sourcebook, this article provides several contributions to the body of knowledge. First, the accuracy of various spatial prediction methods in estimating LCAF values for un-sampled locations was evaluated and compared with that of spatial interpolation methods. Two regression-based prediction models were selected: a global regression analysis and a geographically weighted regression analysis (GWR). Once these models were compared against interpolation methods, the results showed that GWR is the most appropriate way to model CCI as a function of multiple covariates. The outcome of GWR, for each covariate, was studied for all 48 states in the contiguous US. As a direct consequence of spatial non-stationarity, it was possible to discuss the influence of each covariate separately from state to state. In addition, the article includes a first attempt to determine whether the observed variability in cost index values could be, at least partially, explained by independent socio-economic variables.
PMID:25018582
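    Unlike a global regression, GWR fits a separate weighted least-squares model at each target location, with kernel weights decaying with distance so coefficients can vary over space. A minimal sketch on a toy one-dimensional geography (not the CCI/ESRI data):

```python
import numpy as np

def gwr_coefs(coords, X, y, target, bandwidth=1.0):
    """Local WLS fit at one target location with a Gaussian distance kernel."""
    d2 = np.sum((coords - target) ** 2, axis=1)
    w = np.exp(-0.5 * d2 / bandwidth**2)
    Xb = np.column_stack([np.ones(len(y)), X])
    W = np.diag(w)
    return np.linalg.solve(Xb.T @ W @ Xb, Xb.T @ W @ y)

rng = np.random.default_rng(7)
coords = rng.uniform(0, 10, size=(300, 1))  # 1-D "geography"
x = rng.normal(size=300)
# Spatially varying effect: slope drifts from ~1 in the west to ~3 in the east.
y = (1.0 + 0.2 * coords[:, 0]) * x + rng.normal(scale=0.1, size=300)

west = gwr_coefs(coords, x[:, None], y, target=np.array([1.0]))
east = gwr_coefs(coords, x[:, None], y, target=np.array([9.0]))
print(np.round([west[1], east[1]], 2))  # local slopes differ by location
```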

  20. Fruit fly optimization based least square support vector regression for blind image restoration

    NASA Astrophysics Data System (ADS)

    Zhang, Jiao; Wang, Rui; Li, Junshan; Yang, Yawei

    2014-11-01

    The goal of image restoration is to reconstruct the original scene from a degraded observation. It is a critical and challenging task in image processing. Classical restoration methods require explicit knowledge of the point spread function (PSF) and a description of the noise as priors. However, this is not practical in many real image processing applications, so the recovery must be handled as a blind image restoration scenario. Since blind deconvolution is an ill-posed problem, many blind restoration methods make additional assumptions to construct restrictions. Owing to differences in PSF and noise energy, blurred images can be quite different. It is difficult to achieve a good balance between proper assumptions and high restoration quality in blind deconvolution. Recently, machine learning techniques have been applied to blind image restoration. The least square support vector regression (LSSVR) has been proven to offer strong potential in estimation and forecasting problems. Therefore, this paper proposes an LSSVR-based image restoration method. However, selecting the optimal parameters for a support vector machine is essential to the training result. As a novel meta-heuristic algorithm, the fruit fly optimization algorithm (FOA) can be used to handle optimization problems and has the advantage of fast convergence to the global optimal solution. In the proposed method, the training samples are created from a neighborhood in the degraded image to the central pixel in the original image. The mapping between the degraded image and the original image is learned by training the LSSVR. The two parameters of the LSSVR are optimized through FOA. The fitness function of FOA is calculated from the restoration error function. With the acquired mapping, the degraded image can be recovered. Experimental results show that the proposed method achieves a satisfactory restoration effect.
Compared with BP neural network regression, the SVR method, and the Lucy-Richardson algorithm, it restores images faster and performs better. Both objective and subjective restoration performance are studied in the comparison experiments.
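    LSSVR training itself reduces to solving a single linear system in the dual variables: with kernel matrix K, the bias b and coefficients alpha solve a bordered system in which K is ridge-regularized by I/gamma. A minimal sketch with an RBF kernel and fixed hyperparameters on a toy 1-D signal (the FOA tuning step described above is omitted, and the hyperparameter values are illustrative):

```python
import numpy as np

def lssvr_fit(X, y, gamma=100.0, ls=0.5):
    """LSSVR via its dual linear system: [[0, 1^T], [1, K + I/gamma]] [b; a] = [0; y]."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-0.5 * d2 / ls**2)                 # RBF kernel matrix
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    b, alpha = sol[0], sol[1:]

    def predict(Xt):
        d2t = ((Xt[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2t / ls**2) @ alpha + b
    return predict

X = np.linspace(0, 2 * np.pi, 40)[:, None]
y = np.sin(X[:, 0])
predict = lssvr_fit(X, y)
pred = float(predict(np.array([[np.pi / 2]]))[0])
print(round(pred, 2))  # close to sin(pi/2) = 1
```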

  1. Estimating Achievable Accuracy for Global Imaging Spectroscopy Measurement of Non-Photosynthetic Vegetation Cover

    NASA Astrophysics Data System (ADS)

    Dennison, P. E.; Kokaly, R. F.; Daughtry, C. S. T.; Roberts, D. A.; Thompson, D. R.; Chambers, J. Q.; Nagler, P. L.; Okin, G. S.; Scarth, P.

    2016-12-01

    Terrestrial vegetation is dynamic, expressing seasonal, annual, and long-term changes in response to climate and disturbance. Phenology and disturbance (e.g. drought, insect attack, and wildfire) can result in a transition from photosynthesizing "green" vegetation to non-photosynthetic vegetation (NPV). NPV cover can include dead and senescent vegetation, plant litter, agricultural residues, and non-photosynthesizing stem tissue. NPV cover is poorly captured by conventional remote sensing vegetation indices, but it is readily separable from substrate cover based on spectral absorption features in the shortwave infrared. We will present past research motivating the need for global NPV measurements, establishing that mapping seasonal NPV cover is critical for improving our understanding of ecosystem function and carbon dynamics. We will also present new research that helps determine a best achievable accuracy for NPV cover estimation. To test the sensitivity of different NPV cover estimation methods, we simulated satellite imaging spectrometer data using field spectra collected over mixtures of NPV, green vegetation, and soil substrate. We incorporated atmospheric transmittance and modeled sensor noise to create simulated spectra with spectral resolutions ranging from 10 to 30 nm. We applied multiple methods of NPV estimation to the simulated spectra, including spectral indices, spectral feature analysis, multiple endmember spectral mixture analysis, and partial least squares regression, and compared the accuracy and bias of each method. These results prescribe sensor characteristics for an imaging spectrometer mission with NPV measurement capabilities, as well as a "Quantified Earth Science Objective" for global measurement of NPV cover. Copyright 2016, all rights reserved.
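    Spectral mixture analysis, one of the estimation methods compared above, models a measured spectrum as a fractional combination of endmember spectra and recovers cover fractions by least squares. A toy sketch with hypothetical endmember values (not the field spectra used in the study):

```python
import numpy as np

# Columns are endmember spectra over 4 bands: NPV, green vegetation, soil.
# Values are hypothetical reflectances for illustration only.
endmembers = np.array([
    [0.30, 0.45, 0.50, 0.40],  # NPV
    [0.05, 0.45, 0.08, 0.30],  # green vegetation
    [0.20, 0.25, 0.30, 0.35],  # soil
]).T  # shape: (bands, endmembers)

true_fractions = np.array([0.5, 0.3, 0.2])
spectrum = endmembers @ true_fractions  # noise-free linear mixture

# Unmix: least-squares solve for the fractions.
est, *_ = np.linalg.lstsq(endmembers, spectrum, rcond=None)
print(np.round(est, 2))
```

In practice the solve is usually constrained (non-negative fractions summing to one) and the measured spectrum is noisy; the unconstrained version above shows only the linear-mixing core.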

  2. Improving Global Gross Primary Productivity Estimates by Computing Optimum Light Use Efficiencies Using Flux Tower Data

    NASA Astrophysics Data System (ADS)

    Madani, Nima; Kimball, John S.; Running, Steven W.

    2017-11-01

    In the light use efficiency (LUE) approach of estimating the gross primary productivity (GPP), plant productivity is linearly related to absorbed photosynthetically active radiation assuming that plants absorb and convert solar energy into biomass within a maximum LUE (LUEmax) rate, which is assumed to vary conservatively within a given biome type. However, it has been shown that photosynthetic efficiency can vary within biomes. In this study, we used 149 global CO2 flux towers to derive the optimum LUE (LUEopt) under prevailing climate conditions for each tower location, stratified according to model training and test sites. Unlike LUEmax, LUEopt varies according to heterogeneous landscape characteristics and species traits. The LUEopt data showed large spatial variability within and between biome types, so that a simple biome classification explained only 29% of LUEopt variability over 95 global tower training sites. The use of explanatory variables in a mixed effect regression model explained 62.2% of the spatial variability in tower LUEopt data. The resulting regression model was used for global extrapolation of the LUEopt data and GPP estimation. The GPP estimated using the new LUEopt map showed significant improvement relative to global tower data, including a 15% R2 increase and 34% root-mean-square error reduction relative to baseline GPP calculations derived from biome-specific LUEmax constants. The new global LUEopt map is expected to improve the performance of LUE-based GPP algorithms for better assessment and monitoring of global terrestrial productivity and carbon dynamics.

  3. A Simulation-Based Comparison of Several Stochastic Linear Regression Methods in the Presence of Outliers.

    ERIC Educational Resources Information Center

    Rule, David L.

    Several regression methods were examined within the framework of weighted structural regression (WSR), comparing their regression weight stability and score estimation accuracy in the presence of outlier contamination. The methods compared are: (1) ordinary least squares; (2) WSR ridge regression; (3) minimum risk regression; (4) minimum risk 2;…

  4. The development of global motion discrimination in school aged children

    PubMed Central

    Bogfjellmo, Lotte-Guri; Bex, Peter J.; Falkenberg, Helle K.

    2014-01-01

    Global motion perception matures during childhood and involves the detection of local directional signals that are integrated across space. We examine the maturation of local directional selectivity and global motion integration with an equivalent noise paradigm applied to direction discrimination. One hundred and three observers (6–17 years) identified the global direction of motion in a 2AFC task. The 8° central stimuli consisted of 100 dots of 10% Michelson contrast moving 2.8°/s or 9.8°/s. Local directional selectivity and global sampling efficiency were estimated from direction discrimination thresholds as a function of external directional noise, speed, and age. Direction discrimination thresholds improved gradually until the age of 14 years (linear regression, p < 0.05) for both speeds. This improvement was associated with a gradual increase in sampling efficiency (linear regression, p < 0.05), with no significant change in internal noise. Direction sensitivity was lower for dots moving at 2.8°/s than at 9.8°/s for all ages (paired t test, p < 0.05) and is mainly due to lower sampling efficiency. Global motion perception improves gradually during development and matures by age 14. There was no change in internal noise after the age of 6, suggesting that local direction selectivity is mature by that age. The improvement in global motion perception is underpinned by a steady increase in the efficiency with which direction signals are pooled, suggesting that global motion pooling processes mature for longer and later than local motion processing. PMID:24569985
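    The equivalent noise paradigm described above interprets thresholds with the model sigma_obs^2 = (sigma_int^2 + sigma_ext^2) / n_eff, separating internal noise (sigma_int) from sampling efficiency (n_eff, the number of effectively pooled local signals). A small numerical sketch with illustrative parameter values (not the study's estimates):

```python
import math

def predicted_threshold(sigma_ext, sigma_int, n_eff):
    """Equivalent noise model: threshold^2 = (internal^2 + external^2) / efficiency."""
    return math.sqrt((sigma_int**2 + sigma_ext**2) / n_eff)

sigma_int, n_eff = 4.0, 20  # illustrative values
low = predicted_threshold(0.0, sigma_int, n_eff)    # internal noise dominates
high = predicted_threshold(40.0, sigma_int, n_eff)  # external noise dominates
print(round(low, 3), round(high, 3))
# Raising n_eff lowers thresholds at every external noise level, whereas
# lowering sigma_int helps only when external noise is small -- which is why
# the observed improvement with age, present at high noise too, points to
# increasing sampling efficiency rather than decreasing internal noise.
```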

  5. A resilient domain decomposition polynomial chaos solver for uncertain elliptic PDEs

    NASA Astrophysics Data System (ADS)

    Mycek, Paul; Contreras, Andres; Le Maître, Olivier; Sargsyan, Khachik; Rizzi, Francesco; Morris, Karla; Safta, Cosmin; Debusschere, Bert; Knio, Omar

    2017-07-01

    A resilient method is developed for the solution of uncertain elliptic PDEs on extreme scale platforms. The method is based on a hybrid domain decomposition, polynomial chaos (PC) framework that is designed to address soft faults. Specifically, parallel and independent solves of multiple deterministic local problems are used to define PC representations of local Dirichlet boundary-to-boundary maps that are used to reconstruct the global solution. A LAD-lasso type regression is developed for this purpose. The performance of the resulting algorithm is tested on an elliptic equation with an uncertain diffusivity field. Different test cases are considered in order to analyze the impacts of correlation structure of the uncertain diffusivity field, the stochastic resolution, as well as the probability of soft faults. In particular, the computations demonstrate that, provided sufficiently many samples are generated, the method effectively overcomes the occurrence of soft faults.

  6. Comparison of continuously acquired resting state and extracted analogues from active tasks.

    PubMed

    Ganger, Sebastian; Hahn, Andreas; Küblböck, Martin; Kranz, Georg S; Spies, Marie; Vanicek, Thomas; Seiger, René; Sladky, Ronald; Windischberger, Christian; Kasper, Siegfried; Lanzenberger, Rupert

    2015-10-01

    Functional connectivity analysis of brain networks has become an important tool for investigation of human brain function. Although functional connectivity computations are usually based on resting-state data, the application to task-specific fMRI has received growing attention. Three major methods for extraction of resting-state data from task-related signal have been proposed: (1) usage of unmanipulated task data for functional connectivity; (2) regression against task effects, subsequently using the residuals; and (3) concatenation of baseline blocks located in-between task blocks. Despite widespread application in current research, consensus on which method best resembles resting-state seems to be missing. We, therefore, evaluated these techniques in a sample of 26 healthy controls measured at 7 Tesla. In addition to continuous resting-state, two different task paradigms were assessed (emotion discrimination and right finger-tapping) and five well-described networks were analyzed (default mode, thalamus, cuneus, sensorimotor, and auditory). Investigating the similarity to continuous resting-state (Dice, Intraclass correlation coefficient (ICC), R(2) ) showed that regression against task effects yields functional connectivity networks most alike to resting-state. However, all methods exhibited significant differences when compared to continuous resting-state and similarity metrics were lower than test-retest of two resting-state scans. Omitting global signal regression did not change these findings. Visually, the networks are highly similar, but through further investigation marked differences can be found. Therefore, our data does not support referring to resting-state when extracting signals from task designs, although functional connectivity computed from task-specific data may indeed yield interesting information. © 2015 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
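    Method (2), regression against task effects, amounts to projecting the task regressor out of the signal and keeping the orthogonal residuals, which are then used for connectivity analysis. A minimal sketch on a toy block design (synthetic series, not fMRI data):

```python
import numpy as np

def regress_out(y, X):
    """Return OLS residuals of y after regressing on X (with intercept)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

rng = np.random.default_rng(5)
task = np.tile([0.0] * 10 + [1.0] * 10, 5)  # toy block-design regressor
signal = 2.0 * task + rng.normal(size=100)  # task effect + "rest" fluctuations

resid = regress_out(signal, task[:, None])
# By construction the residuals are orthogonal to the (centered) regressor.
print(resid.shape)
```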

  7. Comparison of continuously acquired resting state and extracted analogues from active tasks

    PubMed Central

    Ganger, Sebastian; Hahn, Andreas; Küblböck, Martin; Kranz, Georg S.; Spies, Marie; Vanicek, Thomas; Seiger, René; Sladky, Ronald; Windischberger, Christian; Kasper, Siegfried

    2015-01-01

Functional connectivity analysis of brain networks has become an important tool for investigation of human brain function. Although functional connectivity computations are usually based on resting-state data, the application to task-specific fMRI has received growing attention. Three major methods for extraction of resting-state data from task-related signal have been proposed: (1) usage of unmanipulated task data for functional connectivity; (2) regression against task effects, subsequently using the residuals; and (3) concatenation of baseline blocks located in-between task blocks. Despite widespread application in current research, consensus on which method best resembles resting-state seems to be missing. We therefore evaluated these techniques in a sample of 26 healthy controls measured at 7 Tesla. In addition to continuous resting-state, two different task paradigms were assessed (emotion discrimination and right finger-tapping) and five well-described networks were analyzed (default mode, thalamus, cuneus, sensorimotor, and auditory). Investigating the similarity to continuous resting-state (Dice, intraclass correlation coefficient (ICC), R2) showed that regression against task effects yields functional connectivity networks most alike to resting-state. However, all methods exhibited significant differences when compared to continuous resting-state, and similarity metrics were lower than test-retest of two resting-state scans. Omitting global signal regression did not change these findings. Visually the networks are highly similar, but further investigation reveals marked differences. Therefore, our data do not support referring to resting-state when extracting signals from task designs, although functional connectivity computed from task-specific data may indeed yield interesting information. Hum Brain Mapp 36:4053–4063, 2015. © 2015 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc. PMID:26178250

  8. Does transport time help explain the high trauma mortality rates in rural areas? New and traditional predictors assessed by new and traditional statistical methods

    PubMed Central

    Røislien, Jo; Lossius, Hans Morten; Kristiansen, Thomas

    2015-01-01

    Background Trauma is a leading global cause of death. Trauma mortality rates are higher in rural areas, constituting a challenge for quality and equality in trauma care. The aim of the study was to explore population density and transport time to hospital care as possible predictors of geographical differences in mortality rates, and to what extent choice of statistical method might affect the analytical results and accompanying clinical conclusions. Methods Using data from the Norwegian Cause of Death registry, deaths from external causes 1998–2007 were analysed. Norway consists of 434 municipalities, and municipality population density and travel time to hospital care were entered as predictors of municipality mortality rates in univariate and multiple regression models of increasing model complexity. We fitted linear regression models with continuous and categorised predictors, as well as piecewise linear and generalised additive models (GAMs). Models were compared using Akaike's information criterion (AIC). Results Population density was an independent predictor of trauma mortality rates, while the contribution of transport time to hospital care was highly dependent on choice of statistical model. A multiple GAM or piecewise linear model was superior, and similar, in terms of AIC. However, while transport time was statistically significant in multiple models with piecewise linear or categorised predictors, it was not in GAM or standard linear regression. Conclusions Population density is an independent predictor of trauma mortality rates. The added explanatory value of transport time to hospital care is marginal and model-dependent, highlighting the importance of exploring several statistical models when studying complex associations in observational data. PMID:25972600
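The model-comparison strategy described above (a standard linear fit against a piecewise linear fit, ranked by AIC) can be sketched as follows. The change-point data, the Gaussian form of the AIC, and all variable names are illustrative, not the Norwegian registry data:

```python
import numpy as np

def ols_aic(X, y):
    """Fit OLS and return Akaike's information criterion in the
    Gaussian log-likelihood form: n*log(RSS/n) + 2k."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    n, k = X.shape
    return n * np.log(rss / n) + 2 * k

# Synthetic predictor whose effect levels off at x = 5 (hypothetical).
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 200)
y = np.where(x < 5, 2 * x, 10.0) + rng.normal(0, 0.5, x.size)

aic_linear = ols_aic(x[:, None], y)
# Piecewise model: add a hinge term max(x - 5, 0) to allow a slope change.
aic_piecewise = ols_aic(np.column_stack([x, np.maximum(x - 5, 0)]), y)
print(aic_piecewise < aic_linear)
```

A lower AIC favors the piecewise model here because the extra hinge parameter buys a large drop in residual sum of squares, which mirrors the abstract's finding that conclusions about transport time depend on allowing non-linearity.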

  9. The Homogeneity of the Potsdam Solar Radiation Data

    NASA Astrophysics Data System (ADS)

    Behrens, K.

    2009-04-01

At the Meteorological Station in Potsdam (Germany), the measurement of sunshine duration started as early as 1893. Later, in 1937, the registration of global, diffuse and direct solar radiation was begun with pyranometers and a pyrheliometer. Since 1893 sunshine duration has been measured with the same method, the Campbell-Stokes sunshine recorder, at the same site, while the measurements of solar radiation changed in equipment, measurement methods and location. Furthermore, it was firstly necessary to supplement some missing data within the time series and secondly desirable to extend the series of global radiation by regression against sunshine duration backward to 1893. Because solar radiation, especially global radiation, is one of the most important quantities for climate research, it is necessary to investigate the homogeneity of these time series. At first the station history was studied and as much information as possible about all parameters that could influence the data was gathered. In a second step these metadata were reviewed critically, followed by a discussion of the potential effects of local factors on the homogeneity of the data. In a first step of data rehabilitation the so-called engineering corrections (data levelling to WRR and SI units) were made, followed by the supplementation of gaps. Finally, for every month and the year, the resulting time series of measured data (1937/2008) and the complete series, prolonged by regression and measurements (1893/2008), were tested for homogeneity with the following distribution-free tests: the Wilcoxon (U) test, the Mann-Kendall test and progressive analysis were used to examine the stability of the mean and the dispersion, while the Wald-Wolfowitz test checked the first-order autocorrelation. These non-parametric tests were used because radiation data frequently do not fulfil the assumption of a Gaussian (normal) distribution. 
The investigations showed that the discontinuities found are, in most cases, not related to metadata marking changes of site, equipment, etc. Also, the points of intersection where the calculated time series were connected to the measurements were not marked. This means that the time series are stable and that the measurements and the calculated part are in good agreement.
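The Mann-Kendall test named above is straightforward to implement; this sketch uses the no-ties normal approximation on a synthetic series with an artificial drift, not the Potsdam data:

```python
import numpy as np
from math import erfc, sqrt

def mann_kendall(x):
    """Distribution-free Mann-Kendall trend test (no ties assumed).
    Returns the S statistic and a two-sided p-value from the normal
    approximation with continuity correction."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / sqrt(var_s) if s != 0 else 0.0
    p = erfc(abs(z) / sqrt(2))
    return float(s), p

# Hypothetical inhomogeneous series: noise plus a linear drift.
rng = np.random.default_rng(2)
drift = rng.normal(size=60) + np.linspace(0, 3, 60)
s_drift, p_drift = mann_kendall(drift)
print(s_drift > 0 and p_drift < 0.05)  # the drift is detected
```

Being rank-based, the test makes no Gaussian assumption, which is exactly why the abstract prefers it for radiation data.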

  10. WAVELET-DOMAIN REGRESSION AND PREDICTIVE INFERENCE IN PSYCHIATRIC NEUROIMAGING

    PubMed Central

    Reiss, Philip T.; Huo, Lan; Zhao, Yihong; Kelly, Clare; Ogden, R. Todd

    2016-01-01

    An increasingly important goal of psychiatry is the use of brain imaging data to develop predictive models. Here we present two contributions to statistical methodology for this purpose. First, we propose and compare a set of wavelet-domain procedures for fitting generalized linear models with scalar responses and image predictors: sparse variants of principal component regression and of partial least squares, and the elastic net. Second, we consider assessing the contribution of image predictors over and above available scalar predictors, in particular via permutation tests and an extension of the idea of confounding to the case of functional or image predictors. Using the proposed methods, we assess whether maps of a spontaneous brain activity measure, derived from functional magnetic resonance imaging, can meaningfully predict presence or absence of attention deficit/hyperactivity disorder (ADHD). Our results shed light on the role of confounding in the surprising outcome of the recent ADHD-200 Global Competition, which challenged researchers to develop algorithms for automated image-based diagnosis of the disorder. PMID:27330652

  11. Hybrid Rocket Performance Prediction with Coupling Method of CFD and Thermal Conduction Calculation

    NASA Astrophysics Data System (ADS)

    Funami, Yuki; Shimada, Toru

The final purpose of this study is to develop a design tool for hybrid rocket engines. This tool is a computer code that will be used to investigate rocket performance characteristics and unsteady phenomena lasting through the burning time, such as fuel regression or combustion oscillation. When phenomena inside a combustion chamber, namely boundary-layer combustion, are described, it is difficult to use rigorous models for this target because the calculation cost may be too expensive. Therefore, simple models are required for this calculation. In this study, quasi-one-dimensional compressible Euler equations for flowfields inside a chamber and the equation for thermal conduction inside a solid fuel are numerically solved. The energy balance equation at the solid fuel surface is solved to estimate the fuel regression rate. The heat feedback model is Karabeyoglu's model, dependent on total mass flux. The combustion model is a global single-step reaction model for 4 chemical species or a chemical equilibrium model for 9 chemical species. As a first step, steady-state solutions are reported.

  12. Deep supervised dictionary learning for no-reference image quality assessment

    NASA Astrophysics Data System (ADS)

    Huang, Yuge; Liu, Xuesong; Tian, Xiang; Zhou, Fan; Chen, Yaowu; Jiang, Rongxin

    2018-03-01

We propose a deep convolutional neural network (CNN) for general no-reference image quality assessment (NR-IQA), i.e., accurate prediction of image quality without a reference image. The proposed model consists of three components: a local feature extractor that is a fully convolutional network, an encoding module with an inherent dictionary that aggregates local features to output a fixed-length global quality-aware image representation, and a regression module that maps the representation to an image quality score. Our model can be trained in an end-to-end manner, and all of the parameters, including the weights of the convolutional layers, the dictionary, and the regression weights, are simultaneously learned from the loss function. In addition, the model can predict quality scores for input images of arbitrary sizes in a single step. We tested our method on commonly used image quality databases and showed that its performance is comparable with that of state-of-the-art general-purpose NR-IQA algorithms.

  13. Changes of visual-field global indices after cataract surgery in primary open-angle glaucoma patients.

    PubMed

    Seol, Bo Ram; Jeoung, Jin Wook; Park, Ki Ho

    2016-11-01

    To determine changes of visual-field (VF) global indices after cataract surgery and the factors associated with the effect of cataracts on those indices in primary open-angle glaucoma (POAG) patients. A retrospective chart review of 60 POAG patients who had undergone phacoemulsification and intraocular lens insertion was conducted. All of the patients were evaluated with standard automated perimetry (SAP; 30-2 Swedish interactive threshold algorithm; Carl Zeiss Meditec Inc.) before and after surgery. VF global indices before surgery were compared with those after surgery. The best-corrected visual acuity, intraocular pressure (IOP), number of glaucoma medications before surgery, mean total deviation (TD) values, mean pattern deviation (PD) value, and mean TD-PD value were also compared with the corresponding postoperative values. Additionally, postoperative peak IOP and mean IOP were evaluated. Univariate and multivariate logistic regression analyses were performed to identify the factors associated with the effect of cataract on global indices. Mean deviation (MD) after cataract surgery was significantly improved compared with the preoperative MD. Pattern standard deviation (PSD) and visual-field index (VFI) after surgery were similar to those before surgery. Also, mean TD and mean TD-PD were significantly improved after surgery. The posterior subcapsular cataract (PSC) type showed greater MD changes than did the non-PSC type in both the univariate and multivariate logistic regression analyses. In the univariate logistic regression analysis, the preoperative TD-PD value and type of cataract were associated with MD change. However, in the multivariate logistic regression analysis, type of cataract was the only associated factor. None of the other factors was associated with MD change. MD was significantly affected by cataracts, whereas PSD and VFI were not. 
Most notably, the PSC type showed better MD improvement compared with the non-PSC type after cataract surgery. Clinicians therefore should carefully analyze VF examination results for POAG patients with the PSC type.

  14. Global Land Use Regression Model for Nitrogen Dioxide Air Pollution.

    PubMed

    Larkin, Andrew; Geddes, Jeffrey A; Martin, Randall V; Xiao, Qingyang; Liu, Yang; Marshall, Julian D; Brauer, Michael; Hystad, Perry

    2017-06-20

Nitrogen dioxide is a common air pollutant with growing evidence of health impacts independent of other common pollutants such as ozone and particulate matter. However, the worldwide distribution of NO2 exposure and associated impacts on health is still largely uncertain. To advance global exposure estimates, we created a global nitrogen dioxide (NO2) land use regression model for 2011 using annual measurements from 5,220 air monitors in 58 countries. The model captured 54% of global NO2 variation, with a mean absolute error of 3.7 ppb. Regional performance varied from R2 = 0.42 (Africa) to 0.67 (South America). Repeated 10% cross-validation using bootstrap sampling (n = 10,000) demonstrated robust performance with respect to air monitor sampling in North America, Europe, and Asia (adjusted R2 within 2%) but not for Africa and Oceania (adjusted R2 within 11%), where NO2 monitoring data are sparse. The final model included 10 variables that captured both between- and within-city spatial gradients in NO2 concentrations. Variable contributions differed between continental regions, but major roads within 100 m and satellite-derived NO2 were consistently the strongest predictors. The resulting model can be used for global risk assessments and health studies, particularly in countries without existing NO2 monitoring data or models.
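A land use regression is, at its core, an OLS model over monitor-level predictors, with bootstrap resampling of monitors used (as above) to gauge robustness. A toy sketch with two hypothetical predictors and synthetic monitors; the real model used 10 variables over 5,220 monitors:

```python
import numpy as np

def r_squared(X, y):
    """Variance explained by an OLS fit with intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return 1 - (y - X @ beta).var() / y.var()

rng = np.random.default_rng(3)
n = 500                                   # hypothetical air monitors
roads = rng.exponential(1.0, n)           # road length within 100 m (fake)
sat_no2 = rng.normal(5.0, 2.0, n)         # satellite-derived NO2 (fake)
no2 = 2.0 + 1.5 * roads + 0.8 * sat_no2 + rng.normal(0, 1.5, n)

X = np.column_stack([roads, sat_no2])
r2_full = r_squared(X, no2)

# Bootstrap resampling of monitors: how stable is R^2 to sampling?
boot = [r_squared(X[idx], no2[idx])
        for idx in (rng.integers(0, n, n) for _ in range(200))]
print(round(r2_full, 2), round(float(np.std(boot)), 3))
```

A small spread of the bootstrap R^2 values indicates robustness to which monitors happened to be sampled, the property the abstract reports for data-rich regions.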

  15. A prospective study of differential sources of school-related social support and adolescent global life satisfaction.

    PubMed

    Siddall, James; Huebner, E Scott; Jiang, Xu

    2013-01-01

    This study examined the cross-sectional and prospective relationships between three sources of school-related social support (parent involvement, peer support for learning, and teacher-student relationships) and early adolescents' global life satisfaction. The participants were 597 middle school students from 1 large school in the southeastern United States who completed measures of school social climate and life satisfaction on 2 occasions, 5 months apart. The results revealed that school-related experiences in terms of social support for learning contributed substantial amounts of variance to individual differences in adolescents' satisfaction with their lives as a whole. Cross-sectional multiple regression analyses of the differential contributions of the sources of support demonstrated that family and peer support for learning contributed statistically significant, unique variance to global life satisfaction reports. Prospective multiple regression analyses demonstrated that only family support for learning continued to contribute statistically significant, unique variance to the global life satisfaction reports at Time 2. The results suggest that school-related experiences, especially family-school interactions, spill over into adolescents' overall evaluations of their lives at a time when direct parental involvement in schooling and adolescents' global life satisfaction are generally declining. Recommendations for future research and educational policies and practices are discussed. © 2013 American Orthopsychiatric Association.

  16. PRIM versus CART in subgroup discovery: when patience is harmful.

    PubMed

    Abu-Hanna, Ameen; Nannings, Barry; Dongelmans, Dave; Hasman, Arie

    2010-10-01

We systematically compare the established algorithms CART (Classification and Regression Trees) and PRIM (Patient Rule Induction Method) in a subgroup discovery task on a large real-world high-dimensional clinical database. Contrary to current conjectures, PRIM's performance was generally inferior to CART's. PRIM often considered "peeling off" a large chunk of data at a value of a relevant discrete ordinal variable unattractive, ultimately missing an important subgroup. This finding has considerable significance in clinical medicine where ordinal scores are ubiquitous. PRIM's utility in clinical databases would increase when global information about (ordinal) variables is better put to use and when the search algorithm keeps track of alternative solutions.
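PRIM's peeling step, which the comparison above hinges on, can be sketched as a loop that repeatedly trims the alpha-fraction tail giving the best remaining mean outcome. This is a minimal, hypothetical version (single continuous target, no pasting step, invented data), not the authors' implementation:

```python
import numpy as np

def prim_peel(X, y, alpha=0.1, min_support=0.05):
    """Minimal PRIM-style peeling: repeatedly trim the alpha-fraction
    tail (lower or upper, over all columns) whose removal leaves the
    highest mean outcome, until box support drops below min_support."""
    mask = np.ones(len(y), dtype=bool)
    while mask.mean() > min_support:
        best = None
        for j in range(X.shape[1]):
            lo, hi = np.quantile(X[mask, j], [alpha, 1 - alpha])
            for keep in (X[:, j] >= lo, X[:, j] <= hi):
                m = mask & keep
                if m.sum() < len(y) * min_support:
                    continue
                if best is None or y[m].mean() > best[0]:
                    best = (y[m].mean(), m)
        if best is None or best[0] <= y[mask].mean():
            break  # no peel improves the box mean
        mask = best[1]
    return mask

# Hypothetical data: the outcome is elevated only for large x.
rng = np.random.default_rng(8)
x = rng.normal(size=2000)
y = (x > 1).astype(float) + rng.normal(0, 0.2, 2000)
box = prim_peel(x[:, None], y)
print(box.mean(), y[box].mean() > y.mean())
```

On a discrete ordinal score, the alpha-quantile can coincide with the current box edge, so a patient peel either removes nothing or must drop a whole large chunk at one value, which is the failure mode the abstract describes.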

  17. Regimen Difficulty and Medication Non-Adherence and the Interaction Effects of Gender and Age.

    PubMed

    Dalvi, Vidya; Mekoth, Nandakumar

    2017-12-08

Medication non-adherence is a global health issue, and numerous factors predict it. This study aimed to identify the association between regimen difficulty and medication non-adherence among patients with chronic conditions, and to test the interaction effects of gender and age on this relationship. It was a cross-sectional study conducted among 479 outpatients from India. A convenience sampling method was used. Multiple regression analyses were performed to find the predictors of non-adherence and to test interaction effects. Regimen difficulty predicted medication non-adherence. The patient's gender and age had interaction effects on the relationship between regimen difficulty and medication non-adherence.

  18. Learning Receptive Fields and Quality Lookups for Blind Quality Assessment of Stereoscopic Images.

    PubMed

    Shao, Feng; Lin, Weisi; Wang, Shanshan; Jiang, Gangyi; Yu, Mei; Dai, Qionghai

    2016-03-01

Blind quality assessment of 3D images encounters more new challenges than its 2D counterparts. In this paper, we propose a blind quality assessment for stereoscopic images by learning the characteristics of receptive fields (RFs) from the perspective of dictionary learning, and constructing quality lookups to replace human opinion scores without performance loss. The important feature of the proposed method is that we do not need a large set of samples of distorted stereoscopic images and the corresponding human opinion scores to learn a regression model. To be more specific, in the training phase, we learn local RFs (LRFs) and global RFs (GRFs) from the reference and distorted stereoscopic images, respectively, and construct their corresponding local quality lookups (LQLs) and global quality lookups (GQLs). In the testing phase, blind quality pooling can be easily achieved by searching optimal GRF and LRF indexes from the learnt LQLs and GQLs, and the quality score is obtained by combining the LRF and GRF indexes together. Experimental results on three publicly available 3D image quality assessment databases demonstrate that, in comparison with the existing methods, the devised algorithm achieves highly consistent alignment with subjective assessment.

  19. [Habitat suitability index of larval Japanese Halfbeak (Hyporhamphus sajori) in Bohai Sea based on geographically weighted regression].

    PubMed

    Zhao, Yang; Zhang, Xue Qing; Bian, Xiao Dong

    2018-01-01

To investigate the early supplementary processes of fishery resources in the Bohai Sea, geographically weighted regression (GWR) was introduced into the habitat suitability index (HSI) model. The Bohai Sea larval Japanese Halfbeak HSI GWR model was established with four environmental variables, including sea surface temperature (SST), sea surface salinity (SSS), water depth (DEP), and chlorophyll a concentration (Chl a). Results of the simulation showed that the four variables performed differently in August 2015. SST and Chl a were global variables and had little impact on HSI, with regression coefficients of -0.027 and 0.006, respectively. SSS and DEP were local variables and had larger impacts on HSI, with mean absolute regression coefficients of 0.075 and 0.129, respectively. In the central Bohai Sea, SSS showed a negative correlation with HSI, and the most negative correlation coefficient was -0.3. In contrast, SSS was correlated positively but weakly with HSI in the three bays of the Bohai Sea, and the largest correlation coefficient was 0.1. In particular, DEP and HSI were negatively correlated in the entire Bohai Sea, more strongly so in the three bays than in the central Bohai Sea, and the most negative correlation coefficient was -0.16 in the three bays. The Poisson regression coefficient of the HSI GWR model was 0.705, consistent with field measurements. Therefore, it could provide a new method for the research on fish habitats in the future.
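Geographically weighted regression fits a separate weighted least-squares model at each location, with kernel weights that decay with distance, so coefficients can vary over space (as SSS and DEP do above). A minimal sketch on synthetic data, not the Bohai Sea survey; the sign-flipping "depth" effect is invented for illustration:

```python
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """GWR: at each location, fit weighted least squares with Gaussian
    kernel weights based on distance to that location."""
    X1 = np.column_stack([np.ones(len(y)), X])
    betas = []
    for c in coords:
        d = np.linalg.norm(coords - c, axis=1)
        w = np.exp(-0.5 * (d / bandwidth) ** 2)
        A = X1 * w[:, None]
        betas.append(np.linalg.solve(X1.T @ A, A.T @ y))
    return np.array(betas)

# Hypothetical example: the effect of depth flips sign from west to east.
rng = np.random.default_rng(4)
coords = rng.uniform(0, 10, (150, 2))
depth = rng.normal(size=150)
slope = np.where(coords[:, 0] < 5, -1.0, 1.0)   # local, not global, effect
hsi = slope * depth + rng.normal(0, 0.1, 150)
betas = gwr_coefficients(coords, depth[:, None], hsi, bandwidth=1.0)
west = betas[coords[:, 0] < 2, 1].mean()
east = betas[coords[:, 0] > 8, 1].mean()
print(west < 0 < east)
```

A single global OLS fit would average the two regimes toward zero and call depth irrelevant; the locally varying coefficients recover the spatial structure, which is the motivation for using GWR here.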

  20. [International financial cooperation in the fight against AIDS in Latin America and the Caribbean].

    PubMed

    Leyva-Flores, René; Castillo, José Gabriel; Serván-Mori, Edson; Ballesteros, Maria Luisa Gontes; Rodríguez, Juan Francisco Molina

    2014-07-01

    This study analyzed the financial contribution by the Global Fund to Fight HIV/AIDS, Tuberculosis, and Malaria and its relationship to eligibility criteria for funding in Latin America and the Caribbean in 2002-2010. Descriptive analysis (linear regression) was conducted for the Global Fund financial contributions according to eligibility criteria (income level, burden of disease, governmental co-investment). Financial contributions totaled US$ 705 million. Lower-income countries received higher shares; there was no relationship between Global Fund contributions and burden of disease. The Global Fund's international financing complements governmental expenditure, with equity policies for financial allocation.

  1. Inequalities in global health inequalities research: A 50-year bibliometric analysis (1966-2015)

    PubMed Central

    Pericàs, Juan M.; Benach, Joan

    2018-01-01

Background Increasing evidence shows that health inequalities exist between and within countries, and emphasis has been placed on strengthening the production and use of global health inequalities research, so as to improve capacities to act. Yet, a comprehensive overview of this evidence base is still needed, to determine what is known about the global and historical scientific production on health inequalities to date, how it is distributed in terms of country income groups and world regions, how it has changed over time, and what international collaboration dynamics exist. Methods A comprehensive bibliometric analysis of the global scientific production on health inequalities, from 1966 to 2015, was conducted using the Scopus database. The historical and global evolution of the study of health inequalities was considered, and through joinpoint regression analysis and visualisation network maps, the preceding questions were examined. Findings 159 countries (via authorship affiliation) contributed to this scientific production, three times as many countries as previously found. Scientific output on health inequalities has grown exponentially over the last five decades, with several marked shift points, and a visible country-income group affiliation gradient in the initiation and consistent publication frequency. Higher income countries, especially Anglo-Saxon and European countries, disproportionately dominate first and co-authorship, and are at the core of the global collaborative research networks, with the Global South on the periphery. However, several country anomalies exist that suggest that the causes of these research inequalities, and potential underlying dependencies, run deeper than simply differences in country income and language. Conclusions Whilst the global evidence base has expanded, Global North-South research gaps exist, persist and, in some cases, are widening. 
Greater understanding of the structural determinants of these research inequalities and national research capacities is needed, to further strengthen the evidence base, and support the long term agenda for global health equity. PMID:29385197

  2. Lived experience of economic and political trends related to globalization.

    PubMed

    Cushon, Jennifer A; Muhajarine, Nazeem; Labonte, Ronald

    2010-01-01

A multi-method case study examined how the economic and political processes of globalization have influenced the determinants of health among low-income children in Saskatoon, Saskatchewan, Canada. This paper presents the results from the qualitative interview component of the case study. The purpose of the interviews was to uncover the lived experience of low-income families and their children in Saskatoon with regard to political and economic trends related to globalization, an important addition to the usual globalization and health research that relies primarily on cross-country regressions in which the personal impacts remain hidden. In-depth phenomenological interviews were conducted with 26 low-income parents of young children (aged zero to five) who were residents of Saskatoon. A combination of volunteer and criterion sampling was used. Interview questions were open-ended and based upon an analytical framework. Analysis proceeded through immersion in the data, a process of open coding, and finally a process of selective coding. The larger case study and interviews indicate that globalization has largely not been benefiting low-income parents with young children. Low-income families with young children were struggling to survive, despite the tremendous economic growth occurring in Saskatchewan and Saskatoon at the time of the interviews. This often led to participants expressing a sense of helplessness, despair, isolation, and/or anger. Respondents' experiences suggest that globalization-related changes in social conditions and public policies and programs have great potential to negatively affect family health through either psychosocial effects in individuals and/or decreased levels of social cohesion in the community.

  3. Multicollinearity in prognostic factor analyses using the EORTC QLQ-C30: identification and impact on model selection.

    PubMed

    Van Steen, Kristel; Curran, Desmond; Kramer, Jocelyn; Molenberghs, Geert; Van Vreckem, Ann; Bottomley, Andrew; Sylvester, Richard

    2002-12-30

Clinical and quality of life (QL) variables from an EORTC clinical trial of first line chemotherapy in advanced breast cancer were used in a prognostic factor analysis of survival and response to chemotherapy. For response, different final multivariate models were obtained from forward and backward selection methods, suggesting a disconcerting instability. Quality of life was measured using the EORTC QLQ-C30 questionnaire completed by patients. Subscales on the questionnaire are known to be highly correlated, and therefore it was hypothesized that multicollinearity contributed to model instability. A correlation matrix indicated that global QL was highly correlated with 7 out of 11 variables. In a first attempt to explore multicollinearity, we used global QL as the dependent variable in a regression model with the other QL subscales as predictors. Afterwards, standard diagnostic tests for multicollinearity were performed. An exploratory principal components analysis and factor analysis of the QL subscales identified at most three important components and indicated that inclusion of global QL made minimal difference to the loadings on each component, suggesting that it is redundant in the model. In a second approach, we advocate a bootstrap technique to assess the stability of the models. Based on these analyses, and because global QL exacerbates problems of multicollinearity, we recommend that global QL be excluded from prognostic factor analyses using the QLQ-C30. The prognostic factor analysis was rerun without global QL in the model, and selected the same significant prognostic factors as before. Copyright 2002 John Wiley & Sons, Ltd.
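A standard diagnostic for the multicollinearity described above is the variance inflation factor (VIF): 1/(1 - R^2) from regressing each variable on all the others. A sketch on synthetic subscales, where a "global" score is nearly the sum of the others; the names and sample are illustrative, not the EORTC data:

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X."""
    X = np.asarray(X, dtype=float)
    out = []
    for j in range(X.shape[1]):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(y)), others])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        r2 = 1 - ((y - A @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

# Hypothetical subscales: "global" nearly determined by the others.
rng = np.random.default_rng(5)
physical = rng.normal(size=300)
emotional = rng.normal(size=300)
global_ql = physical + emotional + rng.normal(0, 0.2, 300)
vifs = vif(np.column_stack([physical, emotional, global_ql]))
print(vifs[2] > 10)  # a VIF above ~10 is a common red flag
```

A very large VIF for the global score flags it as redundant given the other subscales, consistent with the paper's recommendation to drop it.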

  4. An Update of the Bodeker Scientific Vertically Resolved, Global, Gap-Free Ozone Database

    NASA Astrophysics Data System (ADS)

    Kremser, S.; Bodeker, G. E.; Lewis, J.; Hassler, B.

    2016-12-01

High vertical resolution ozone measurements from multiple satellite-based instruments have been merged with measurements from the global ozonesonde network to calculate monthly mean ozone values in 5° latitude zones. Ozone number densities and ozone mixing ratios are provided on 70 altitude levels (1 to 70 km) and on 70 pressure levels spaced approximately 1 km apart (878.4 hPa to 0.046 hPa). These data are sparse and do not cover the entire globe or altitude range. To provide a gap-free database, a least squares regression model is fitted to these data and then evaluated globally. By applying a single fit at each level, and using the approach of allowing the regression fits to change only slightly from one level to the next, the regression is less sensitive to measurement anomalies at individual stations or to individual satellite-based instruments. Particular attention is paid to ensuring that the low ozone abundances in the polar regions are captured. This presentation reports on updates to an earlier version of the vertically resolved ozone database, including the incorporation of new ozone measurements and new techniques for combining the data. Compared to previous versions of the database, particular attention is paid to avoiding spatial and temporal sampling biases and tracing uncertainties through to the final product. This updated database, developed within the New Zealand Deep South National Science Challenge, is suitable for assessing ozone fields from chemistry-climate model simulations or for providing the ozone boundary conditions for global climate model simulations that do not treat stratospheric chemistry interactively.

  5. Using Linear Equating to Map PROMIS(®) Global Health Items and the PROMIS-29 V2.0 Profile Measure to the Health Utilities Index Mark 3.

    PubMed

    Hays, Ron D; Revicki, Dennis A; Feeny, David; Fayers, Peter; Spritzer, Karen L; Cella, David

    2016-10-01

    Preference-based health-related quality of life (HR-QOL) scores are useful as outcome measures in clinical studies, for monitoring the health of populations, and for estimating quality-adjusted life-years. This was a secondary analysis of data collected in an internet survey as part of the Patient-Reported Outcomes Measurement Information System (PROMIS(®)) project. To estimate Health Utilities Index Mark 3 (HUI-3) preference scores, we used the ten PROMIS(®) global health items, the PROMIS-29 V2.0 single pain intensity item and seven multi-item scales (physical functioning, fatigue, pain interference, depressive symptoms, anxiety, ability to participate in social roles and activities, sleep disturbance), and the PROMIS-29 V2.0 items. Linear regression analyses were used to identify significant predictors, followed by simple linear equating to avoid regression to the mean. The regression models explained 48 % (global health items), 61 % (PROMIS-29 V2.0 scales), and 64 % (PROMIS-29 V2.0 items) of the variance in the HUI-3 preference score. Linear equated scores were similar to observed scores, although differences tended to be larger for older study participants. HUI-3 preference scores can be estimated from the PROMIS(®) global health items or PROMIS-29 V2.0. The estimated HUI-3 scores from the PROMIS(®) health measures can be used for economic applications and as a measure of overall HR-QOL in research.
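Simple linear equating, used above to avoid regression to the mean, rescales predicted scores so their mean and standard deviation match the target distribution. A schematic sketch on synthetic scores, not the fitted PROMIS-to-HUI-3 crosswalk:

```python
import numpy as np

def linear_equate(x, target_mean, target_sd):
    """Linear equating: rescale scores so their mean and SD match the
    target distribution, undoing the shrinkage toward the mean that
    raw regression predictions exhibit."""
    z = (x - x.mean()) / x.std()
    return target_mean + target_sd * z

# Hypothetical regression predictions of HUI-3 scores: right location,
# but compressed spread (regression to the mean).
rng = np.random.default_rng(6)
true_hui3 = np.clip(rng.normal(0.7, 0.25, 1000), -0.36, 1.0)
predicted = 0.7 + 0.5 * (true_hui3 - 0.7) + rng.normal(0, 0.05, 1000)
equated = linear_equate(predicted, true_hui3.mean(), true_hui3.std())
print(predicted.std() < true_hui3.std())  # raw predictions are compressed
```

After equating, the score distribution has the target spread by construction, so population-level summaries (means, SDs, distribution tails) are preserved even though individual-level prediction error remains.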

  6. Detecting outliers when fitting data with nonlinear regression – a new method based on robust nonlinear regression and the false discovery rate

    PubMed Central

    Motulsky, Harvey J; Brown, Ronald E

    2006-01-01

    Background Nonlinear regression, like linear regression, assumes that the scatter of data around the ideal curve follows a Gaussian or normal distribution. This assumption leads to the familiar goal of regression: to minimize the sum of the squares of the vertical or Y-value distances between the points and the curve. Outliers can dominate the sum-of-the-squares calculation, and lead to misleading results. However, we know of no practical method for routinely identifying outliers when fitting curves with nonlinear regression. Results We describe a new method for identifying outliers when fitting data with nonlinear regression. We first fit the data using a robust form of nonlinear regression, based on the assumption that scatter follows a Lorentzian distribution. We devised a new adaptive method that gradually becomes more robust as the method proceeds. To define outliers, we adapted the false discovery rate approach to handling multiple comparisons. We then remove the outliers, and analyze the data using ordinary least-squares regression. Because the method combines robust regression and outlier removal, we call it the ROUT method. When analyzing simulated data, where all scatter is Gaussian, our method detects (falsely) one or more outliers in only about 1–3% of experiments. When analyzing data contaminated with one or several outliers, the ROUT method performs well at outlier identification, with an average False Discovery Rate less than 1%. Conclusion Our method, which combines a new method of robust nonlinear regression with a new method of outlier identification, identifies outliers from nonlinear curve fits with reasonable power and few false positives. PMID:16526949
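A rough sketch of a ROUT-style workflow follows. This is not Motulsky and Brown's exact algorithm: scipy's Cauchy loss stands in for the Lorentzian robust fit, the robust scale is a simple MAD estimate rather than their adaptive RSDR, and the model and data are invented:

```python
import numpy as np
from scipy import optimize, stats

def rout_like_outliers(x, y, model, p0, q=0.01):
    """ROUT-style sketch: (1) robust fit with a Cauchy/Lorentzian-type loss,
    (2) flag outliers via a Benjamini-Hochberg FDR test on the residuals,
    (3) refit the cleaned data by ordinary least squares."""
    robust = optimize.least_squares(lambda p: model(x, *p) - y, p0, loss="cauchy")
    resid = y - model(x, *robust.x)
    scale = 1.4826 * np.median(np.abs(resid - np.median(resid)))  # robust SD via MAD
    pvals = 2 * stats.norm.sf(np.abs(resid) / scale)
    # Benjamini-Hochberg step-up procedure at FDR level q
    order = np.argsort(pvals)
    thresh = q * np.arange(1, len(y) + 1) / len(y)
    passed = pvals[order] <= thresh
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    outlier = np.zeros(len(y), bool)
    outlier[order[:k]] = True
    clean = ~outlier
    final = optimize.least_squares(lambda p: model(x[clean], *p) - y[clean], robust.x)
    return outlier, final.x

# Toy example: exponential decay with two gross outliers injected.
rng = np.random.default_rng(1)
x = np.linspace(0, 4, 40)
model = lambda x, a, k: a * np.exp(-k * x)
y = model(x, 5.0, 1.2) + rng.normal(0, 0.05, x.size)
y[[5, 20]] += 3.0
flags, params = rout_like_outliers(x, y, model, p0=[1.0, 1.0])
```

The robust first pass keeps the gross outliers from dragging the curve, so their residuals stand out clearly in the FDR test.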

  7. Bifunctional staining for ex vivo determination of area at risk in rabbits with reperfused myocardial infarction

    PubMed Central

    Feng, Yuanbo; Ma, Zhan-Long; Chen, Feng; Yu, Jie; Cona, Marlein Miranda; Xie, Yi; Li, Yue; Ni, Yicheng

    2013-01-01

    AIM: To develop a method for studying myocardial area at risk (AAR) in ischemic heart disease in correlation with cardiac magnetic resonance imaging (cMRI). METHODS: Nine rabbits were anesthetized, intubated and subjected to occlusion and reperfusion of the left circumflex coronary artery (LCx) to induce myocardial infarction (MI). ECG-triggered cMRI with delayed enhancement was performed at 3.0 T. After euthanasia, the heart was excised with the LCx re-ligated. Bifunctional staining was performed by perfusing the aorta with a homemade red-iodized-oil (RIO) dye. The heart was then agar-embedded for ex vivo magnetic resonance imaging and sliced into 3 mm-sections. The AAR was defined by RIO-staining and digital radiography (DR). The perfusion density rate (PDR) was derived from DR for the AAR and normal myocardium. The MI was measured by in vivo delayed enhancement (iDE) and ex vivo delayed enhancement (eDE) cMRI. The AAR and MI were compared to validate the bifunctional staining for cardiac imaging research. Linear regression with Bland-Altman agreement, one-way ANOVA with Bonferroni’s multiple comparison, and paired t tests were applied for statistics. RESULTS: All rabbits tolerated the surgical procedure and subsequent cMRI sessions well. The open-chest occlusion and close-chest reperfusion of the LCx, double suture method and bifunctional staining were successfully applied in all animals. The percentage MI volumes globally (n = 6) and by slice (n = 25) were 36.59% ± 13.68% and 32.88% ± 12.38% on iDE, and 35.41% ± 12.25% and 32.40% ± 12.34% on eDE. There were no significant differences for MI determination with excellent linear regression correspondence (rglobal = 0.89; rslice = 0.9) between iDE and eDE. The percentage AAR volumes globally (n = 6) and by slice (n = 25) were 44.82% ± 15.18% and 40.04% ± 13.64% with RIO-staining, and 44.74% ± 15.98% and 40.48% ± 13.26% by DR showing high correlation in linear regression analysis (rglobal = 0.99; rslice = 1.0). 
The mean differences of the two AAR measurements on Bland-Altman were almost zero, indicating RIO-staining and DR were essentially equivalent or inter-replaceable. The AAR was significantly larger than MI both globally and slice-by-slice (P < 0.01). After correction with the background and the blank heart without bifunctional staining (n = 3), the PDR for the AAR and normal myocardium was 32% ± 15% and 35.5% ± 35%, respectively, which is significantly different (P < 0.001), suggesting that blood perfusion to the AAR probably by collateral circulation was only less than 10% of that in the normal myocardium. CONCLUSION: The myocardial area at risk in ischemic heart disease could be accurately determined postmortem by this novel bifunctional staining, which may substantially contribute to translational cardiac imaging research. PMID:25237621

  8. Registration of ophthalmic images using control points

    NASA Astrophysics Data System (ADS)

    Heneghan, Conor; Maguire, Paul

    2003-03-01

    A method for registering pairs of digital ophthalmic images of the retina is presented, using anatomical features present in both images as control points. The anatomical features chosen are blood vessel crossings and bifurcations. These control points are identified by a combination of local contrast enhancement and morphological processing. In general, however, the correspondence between control points is unknown, so an automated algorithm is used to determine the matching pairs of control points in the two images as follows. Using two control points from each image, rigid global transform (RGT) coefficients are calculated for all possible combinations of control point pairs, and the most consistent set of RGT coefficients is identified. Once control point pairs are established, registration of two images can be achieved by using linear regression to optimize an RGT, bilinear, or second-order polynomial global transform. An example of cross-modal image registration using an optical image and a fluorescein angiogram of an eye is presented to illustrate the technique.
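The transform-fitting step, estimating a rotation and translation from matched control-point pairs by least squares, can be sketched with the standard Procrustes/Kabsch solution. This is an illustration with invented points; the paper's RGT may also include a scale term, and its pairwise estimation differs in detail:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid (rotation + translation) transform mapping src
    control points onto dst, via the Procrustes/Kabsch SVD solution."""
    sc, dc = src.mean(0), dst.mean(0)
    H = (src - sc).T @ (dst - dc)           # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dc - R @ sc
    return R, t

# Toy check: rotate invented retinal control points by 10 degrees and shift.
theta = np.deg2rad(10)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
src = np.array([[10.0, 5.0], [40.0, 8.0], [25.0, 30.0], [12.0, 22.0]])
dst = src @ R_true.T + np.array([3.0, -2.0])
R, t = rigid_transform(src, dst)
```

With noise-free correspondences the recovered rotation and translation are exact; with real vessel-crossing detections the same formula gives the least-squares fit.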

  9. Comparing Self-Concept Among Youth Currently Receiving Inpatient Versus Outpatient Mental Health Services

    PubMed Central

    Choi, Chris; Ferro, Mark A.

    2018-01-01

    Objective This study compared levels of self-concept among youth who were currently receiving inpatient versus outpatient mental health services. Method Forty-seven youth were recruited from the Child & Youth Mental Health Program at McMaster Children’s Hospital. Self-concept was measured using the Self-Perception Profile for Children and Adolescents. Results The mean age was 14.5 years and most participants were female (70.2%). ANOVAs comparing self-concept with population norms showed large significant effects (d = 0.77 to 1.93) indicating compromised self-concept among youth receiving mental health services. Regression analyses controlling for patient age, sex, family income, and diagnoses of major depressive disorder, generalized social phobia, and generalized anxiety showed that the inpatient setting was a significant predictor of lower global self-worth (β=−.26; p=.035). Conclusions Compared to outpatients, inpatients generally reported lower self-concept, but differences were significant only for global self-worth. Future research replicating this finding and assessing its clinical significance is encouraged. PMID:29375635

  11. Multiple Linear Regression for Reconstruction of Gene Regulatory Networks in Solving Cascade Error Problems

    PubMed Central

    Zainudin, Suhaila; Arif, Shereena M.

    2017-01-01

    Gene regulatory network (GRN) reconstruction is the process of identifying regulatory gene interactions from experimental data through computational analysis. One of the main reasons for the reduced performance of previous GRN methods had been inaccurate prediction of cascade motifs. Cascade error is defined as the wrong prediction of cascade motifs, where an indirect interaction is misinterpreted as a direct interaction. Despite the active research on various GRN prediction methods, the discussion on specific methods to solve problems related to cascade errors is still lacking. In fact, the experiments conducted in past studies were not specifically geared towards proving the ability of GRN prediction methods to avoid the occurrence of cascade errors. Hence, this research aims to propose Multiple Linear Regression (MLR) to infer GRN from gene expression data and to avoid wrongly inferring an indirect interaction (A → B → C) as a direct interaction (A → C). Since the number of observations in the real experiment datasets was far less than the number of predictors, some predictors were eliminated by extracting random subnetworks from global interaction networks via an established extraction method. In addition, the experiment was extended to assess the effectiveness of MLR in dealing with cascade error by using a novel experimental procedure proposed in this work. The experiment revealed that the number of cascade errors was very minimal. Apart from that, the Belsley collinearity test showed that multicollinearity strongly affected the datasets used in this experiment. All the tested subnetworks obtained satisfactory results, with AUROC values above 0.5. PMID:28250767
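The core idea, that regressing each gene on all candidate regulators jointly suppresses indirect A → C edges because B is already in the model, can be sketched as below. This is a toy illustration, not the paper's pipeline (no subnetwork extraction, and an arbitrary coefficient threshold):

```python
import numpy as np

def infer_grn_mlr(expr, threshold=0.3):
    """Toy multiple-linear-regression GRN inference: regress each gene on all
    other genes jointly; an indirect path A->B->C then tends to receive a
    small direct A->C coefficient once B is included as a predictor."""
    n_samples, n_genes = expr.shape
    z = (expr - expr.mean(0)) / expr.std(0)   # standardize each gene
    coef = np.zeros((n_genes, n_genes))        # coef[i, j]: effect of gene j on gene i
    for i in range(n_genes):
        others = [j for j in range(n_genes) if j != i]
        beta, *_ = np.linalg.lstsq(z[:, others], z[:, i], rcond=None)
        coef[i, others] = beta
    return np.abs(coef) > threshold

# Cascade A -> B -> C: MLR should keep A->B and B->C but drop the
# indirect A->C edge.
rng = np.random.default_rng(2)
a = rng.normal(size=200)
b = 0.9 * a + 0.1 * rng.normal(size=200)
c = 0.9 * b + 0.1 * rng.normal(size=200)
edges = infer_grn_mlr(np.column_stack([a, b, c]))
```

Note that the strong a-b collinearity in this cascade is exactly the multicollinearity issue the Belsley test in the paper is probing.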

  12. Mapping Regional Impervious Surface Distribution from Night Time Light: The Variability across Global Cities

    NASA Astrophysics Data System (ADS)

    Lin, M.; Yang, Z.; Park, H.; Qian, S.; Chen, J.; Fan, P.

    2017-12-01

    Impervious surface area (ISA) has become an important indicator for studying urban environments, but mapping ISA at the regional or global scale is still challenging due to the complexity of impervious surface features. The Defense Meteorological Satellite Program's Operational Linescan System (DMSP-OLS) nighttime light (NTL) data and Moderate Resolution Imaging Spectroradiometer (MODIS) data are the major remote sensing data sources for regional ISA mapping. Many previous studies established a single regression relationship between fractional ISA and NTL, or various indices derived from NTL and the MODIS vegetation index (NDVI), for regional ISA mapping. However, due to the varying geographical, climatic, and socio-economic characteristics of different cities, the same regression relationship may vary significantly across cities in the same region in terms of both fitting performance (i.e., R2) and the rate of change (slope). In this study, we examined the regression relationship between fractional ISA and the Vegetation Adjusted NTL Urban Index (VANUI) for 120 randomly selected cities around the world with a multilevel regression model. We found substantial variability in both the R2 (0.68±0.29) and slopes (0.64±0.40) of the individual regressions, which suggests that multilevel/hierarchical models are needed to improve the accuracy of future regional ISA mapping. Further analysis showed that this variability is affected by climate conditions, socio-economic status, and urban spatial structure. However, all of these effects are nonlinear rather than linear, and thus cannot be modeled explicitly in multilevel linear regression models.
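The reported spread in per-city fits can be illustrated by running the same ISA-versus-light-index regression separately for each city and collecting slope and R2. The data here are synthetic stand-ins for VANUI and fractional ISA, not the study's measurements:

```python
import numpy as np

def city_regressions(cities):
    """Fit fractional-ISA ~ light-index OLS separately per city and collect
    (slope, R^2), exposing the cross-city variability that motivates a
    multilevel/hierarchical model."""
    results = []
    for ntl, isa in cities:
        X = np.column_stack([np.ones_like(ntl), ntl])
        beta, *_ = np.linalg.lstsq(X, isa, rcond=None)
        resid = isa - X @ beta
        results.append((beta[1], 1 - resid.var() / isa.var()))
    return np.array(results)

# Three synthetic "cities" whose true slopes differ, mimicking the reported
# spread (slope 0.64 +/- 0.40) across 120 cities.
rng = np.random.default_rng(3)
cities = []
for true_slope in (0.3, 0.6, 1.0):
    ntl = rng.uniform(0, 1, 100)
    isa = true_slope * ntl + rng.normal(0, 0.05, 100)
    cities.append((ntl, isa))
stats = city_regressions(cities)
```

A single pooled regression would average these slopes away; a multilevel model instead treats each city's slope as a draw from a population distribution.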

  13. Analysis of task-evoked systemic interference in fNIRS measurements: insights from fMRI.

    PubMed

    Erdoğan, Sinem B; Yücel, Meryem A; Akın, Ata

    2014-02-15

    Functional near infrared spectroscopy (fNIRS) is a promising method for monitoring cerebral hemodynamics with a wide range of clinical applications. fNIRS signals are contaminated with systemic physiological interferences from both the brain and superficial tissues, resulting in a poor estimation of the task related neuronal activation. In this study, we use the anatomical resolution of functional magnetic resonance imaging (fMRI) to extract scalp and brain vascular signals separately and construct an optically weighted spatial average of the fMRI blood oxygen level-dependent (BOLD) signal for characterizing the scalp signal contribution to fNIRS measurements. We introduce an extended superficial signal regression (ESSR) method for canceling physiology-based systemic interference where the effects of cerebral and superficial systemic interference are treated separately. We apply and validate our method on the optically weighted BOLD signals, which are obtained by projecting the fMRI image onto optical measurement space by use of the optical forward problem. The performance of the ESSR method in removing physiological artifacts is compared to i) a global signal regression (GSR) method and ii) a superficial signal regression (SSR) method. The retrieved signals from each method are compared with the neural signals that represent the 'ground truth' brain activation cleaned from cerebral systemic fluctuations. We report significant improvements in the recovery of task induced neural activation with the ESSR method when compared to the other two methods as reflected in the Pearson R2 coefficient and mean square error (MSE) metrics (two tailed paired t-tests, p<0.05). The signal quality is enhanced most when the ESSR method is applied, with higher spatial localization, lower inter-trial variability, a clear canonical waveform and higher contrast-to-noise ratio (CNR) improvement (60%). 
Our findings suggest that, during a cognitive task i) superficial scalp signal contribution to fNIRS signals varies significantly among different regions on the forehead and ii) using an average scalp measurement together with a local measure of superficial hemodynamics better accounts for the systemic interference inherent in the brain as well as superficial scalp tissue. We conclude that maximizing the overlap between the optical pathlength of superficial and deeper penetration measurements is of crucial importance for accurate recovery of the evoked hemodynamic response in fNIRS recordings. © 2013 Elsevier Inc. All rights reserved.
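The common core of GSR, SSR, and ESSR is regressing nuisance time courses out of each channel by least squares. The sketch below uses an idealized known systemic reference (standing in for a scalp channel); the study's ESSR combines scalp and global regressors in a more elaborate way, and all signals here are synthetic:

```python
import numpy as np

def regress_out(signals, nuisance):
    """Project out a nuisance regressor (e.g. the global mean time course in
    GSR, or a short-separation scalp channel in SSR) from every measurement
    channel by least squares. Rows: time points; columns: channels."""
    X = np.column_stack([np.ones(len(nuisance)), nuisance])
    beta, *_ = np.linalg.lstsq(X, signals, rcond=None)
    return signals - X @ beta

# Toy fNIRS-like example: every channel mixes a block-design task response
# with a shared systemic oscillation; regressing out the systemic reference
# suppresses the oscillation while leaving the task response intact.
rng = np.random.default_rng(4)
t = np.arange(0, 60, 0.1)
systemic = np.sin(2 * np.pi * 0.1 * t)            # shared physiological wave
task = np.where((t % 20) < 10, 1.0, 0.0)          # block-design response
channels = (np.outer(systemic, rng.uniform(0.5, 1.5, 8))
            + np.outer(task, rng.uniform(0.2, 0.4, 8))
            + rng.normal(0, 0.01, (len(t), 8)))
cleaned = regress_out(channels, systemic)
```

The residual is exactly orthogonal to the regressor, which is also why GSR can delete genuine task signal whenever the task leaks into the global mean, the caveat this study quantifies.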

  14. An Analysis of Determinants of Under-5 Mortality across Countries: Defining Priorities to Achieve Targets in Sustainable Developmental Goals.

    PubMed

    Acheampong, Michael; Ejiofor, Chukwudi; Salinas-Miranda, Abraham

    2017-06-01

    Objectives The end of the era of millennium development goals (MDGs) ushered in the sustainable development goals (SDGs) with a new target for the reduction of under-five mortality rates (U5MR). Although U5MR decreased globally, the reduction was insufficient to meet MDGs targets because significant socioeconomic inequities remain unaddressed across and within countries. Thus, further progress in achieving the new SDGs target will be hindered if there is no adequate prioritization of important socioeconomic, healthcare, and environmental factors. The objective of this study was to assess the factors that account most for the differences in U5MR between countries around the globe. Methods We conducted an ordinary least squares (OLS) regression-based prioritization analysis of socioeconomic, healthcare, and environmental variables from 109 countries to understand which factors best explain the differences in U5MR. Results Each indicator, examined individually, was associated with differences in U5MR between countries. However, the results of multivariate OLS regression showed that the most important factors that accounted for the differences were, in order: fertility rate, total health expenditure per capita, access to improved water and sanitation, and female employment rate. Conclusions To achieve the new global target for U5MR, policymakers must focus on certain priority areas, such as interventions that address access to affordable maternal healthcare services, educational programs for mothers, especially those who are adolescents, and safe drinking water and sanitation.
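One simple way to rank determinants in a multivariate OLS, as a stand-in for the paper's prioritization analysis (whose exact procedure may differ), is to z-score all variables and order predictors by the absolute standardized coefficient. The country-level numbers below are invented:

```python
import numpy as np

def rank_determinants(X, y, names):
    """Rank predictors by |standardized OLS coefficient|: fit OLS on z-scored
    predictors and outcome, then sort by absolute slope."""
    Xz = (X - X.mean(0)) / X.std(0)
    yz = (y - y.mean()) / y.std()
    beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(yz)), Xz]), yz,
                               rcond=None)
    order = np.argsort(-np.abs(beta[1:]))
    return [(names[i], beta[1 + i]) for i in order]

# Synthetic data for 109 "countries" where fertility dominates, echoing the
# paper's reported ordering (illustrative numbers only).
rng = np.random.default_rng(5)
fert = rng.normal(size=109)
health = rng.normal(size=109)
water = rng.normal(size=109)
u5mr = 0.8 * fert - 0.4 * health - 0.2 * water + rng.normal(0, 0.3, 109)
ranking = rank_determinants(np.column_stack([fert, health, water]), u5mr,
                            ["fertility", "health_spend", "water_sanitation"])
```

Standardizing puts predictors measured in different units (births per woman, dollars, percentages) on a common scale before comparing their coefficients.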

  15. Shrinkage regression-based methods for microarray missing value imputation.

    PubMed

    Wang, Hsiuying; Chiu, Chia-Chun; Wu, Yi-Ching; Wu, Wei-Sheng

    2013-01-01

    Missing values commonly occur in microarray data, which usually contain more than 5% missing values with up to 90% of genes affected. Inaccurate missing value estimation reduces the power of downstream microarray data analyses. Many types of methods have been developed to estimate missing values. Among them, the regression-based methods are very popular and have been shown to perform better than the other types of methods in many testing microarray datasets. To further improve the performance of the regression-based methods, we propose shrinkage regression-based methods. Our methods take advantage of the correlation structure in the microarray data and select similar genes for the target gene by Pearson correlation coefficients. In addition, our methods incorporate the least squares principle, utilize a shrinkage estimation approach to adjust the coefficients of the regression model, and then use the new coefficients to estimate missing values. Simulation results show that the proposed methods provide more accurate missing value estimation in six testing microarray datasets than the existing regression-based methods do. Imputation of missing values is a very important aspect of microarray data analyses because most of the downstream analyses require a complete dataset. Therefore, exploring accurate and efficient methods for estimating missing values has become an essential issue. Since our proposed shrinkage regression-based methods can provide accurate missing value estimation, they are competitive alternatives to the existing regression-based methods.
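The pipeline described (select similar genes by Pearson correlation, fit least squares, shrink the coefficients, predict the missing value) can be sketched as follows. The fixed shrinkage factor and toy expression matrix are assumptions; the paper derives its shrinkage estimator more carefully:

```python
import numpy as np

def impute_missing(expr, gene, sample, k=5, shrink=0.9):
    """Shrinkage-regression imputation sketch: choose the k genes most
    correlated with the target gene over complete samples, fit least squares,
    damp the slopes toward zero, re-center the intercept, and predict."""
    obs = np.ones(expr.shape[1], bool)
    obs[sample] = False                        # column where the target is missing
    target = expr[gene, obs]
    others = np.delete(np.arange(expr.shape[0]), gene)
    corrs = [abs(np.corrcoef(expr[j, obs], target)[0, 1]) for j in others]
    sel = others[np.argsort(corrs)[-k:]]       # k most similar genes
    X = np.column_stack([np.ones(obs.sum()), expr[sel][:, obs].T])
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    beta[1:] *= shrink                         # shrink slopes toward zero
    beta[0] = target.mean() - expr[sel][:, obs].mean(axis=1) @ beta[1:]
    return beta[0] + expr[sel, sample] @ beta[1:]

# Toy microarray: gene 0 tracks genes 1-3; hide one value and recover it.
rng = np.random.default_rng(6)
base = rng.normal(size=(3, 50))
expr = np.vstack([base.mean(0) + rng.normal(0, 0.05, 50), base,
                  rng.normal(size=(6, 50))])
truth = expr[0, 10]
estimate = impute_missing(expr, gene=0, sample=10)
```

Shrinking the slopes trades a little bias for lower variance, which is what makes the estimator competitive when the similar-gene regression is noisy.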

  16. Application of spatial and non-spatial data analysis in determination of the factors that impact municipal solid waste generation rates in Turkey

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keser, Saniye; Duzgun, Sebnem; Department of Geodetic and Geographic Information Technologies, Middle East Technical University, 06800 Ankara

    Highlights: • Spatial autocorrelation exists in municipal solid waste generation rates for different provinces in Turkey. • Traditional non-spatial regression models may not provide sufficient information for better solid waste management. • Unemployment rate is a global variable that significantly impacts the waste generation rates in Turkey. • Significances of global parameters may diminish at local scale for some provinces. • GWR model can be used to create clusters of cities for solid waste management. - Abstract: In studies focusing on the factors that impact solid waste generation habits and rates, the potential spatial dependency in solid waste generation data is not considered in relating the waste generation rates to its determinants. In this study, spatial dependency is taken into account in determination of the significant socio-economic and climatic factors that may be of importance for the municipal solid waste (MSW) generation rates in different provinces of Turkey. Simultaneous spatial autoregression (SAR) and geographically weighted regression (GWR) models are used for the spatial data analyses. Similar to ordinary least squares regression (OLSR), regression coefficients are global in the SAR model. In other words, the effect of a given independent variable on a dependent variable is valid for the whole country. Unlike OLSR or SAR, GWR reveals the local impact of a given factor (or independent variable) on the waste generation rates of different provinces. Results show that provinces within closer neighborhoods have similar MSW generation rates. On the other hand, this spatial autocorrelation is not very high for the exploratory variables considered in the study. OLSR and SAR models have similar regression coefficients. GWR is useful to indicate the local determinants of MSW generation rates. 
The GWR model can be utilized to plan waste management activities at local scale, including waste minimization, collection, treatment, and disposal. At the global scale, the MSW generation rates in Turkey are significantly related to unemployment rate and asphalt-paved roads ratio. Yet, the significance of these variables may diminish at local scale for some provinces. At local scale, different factors may be important in affecting MSW generation rates.
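Geographically weighted regression fits a separate weighted least squares at every location, with weights that decay with distance, so the coefficients themselves become maps. A minimal sketch with a Gaussian kernel and invented province data follows (real GWR software also selects the bandwidth, e.g. by cross-validation):

```python
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """Minimal GWR: at each location, solve weighted least squares with
    Gaussian distance-decay weights, yielding local intercepts and slopes
    instead of a single global coefficient."""
    Xd = np.column_stack([np.ones(len(y)), X])
    betas = np.empty((len(y), Xd.shape[1]))
    for i, c in enumerate(coords):
        d2 = ((coords - c) ** 2).sum(axis=1)
        w = np.exp(-d2 / (2 * bandwidth ** 2))   # Gaussian kernel weights
        W = Xd * w[:, None]
        betas[i] = np.linalg.solve(Xd.T @ W, W.T @ y)
    return betas

# Synthetic provinces along a line: the effect of "unemployment" on waste
# generation strengthens from west to east, which GWR should recover.
rng = np.random.default_rng(7)
coords = np.column_stack([np.linspace(0, 10, 80), np.zeros(80)])
unemp = rng.uniform(0, 1, 80)
local_slope = 0.2 + 0.1 * coords[:, 0]           # slope varies in space
waste = local_slope * unemp + rng.normal(0, 0.02, 80)
betas = gwr_coefficients(coords, unemp[:, None], waste, bandwidth=1.5)
```

A global OLS or SAR fit would return one averaged slope; the local slopes are what allow clustering provinces for management planning.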

  17. Spacebased Estimation of Moisture Transport in Marine Atmosphere Using Support Vector Regression

    NASA Technical Reports Server (NTRS)

    Xie, Xiaosu; Liu, W. Timothy; Tang, Benyang

    2007-01-01

    An improved algorithm is developed based on support vector regression (SVR) to estimate horizontal water vapor transport integrated through the depth of the atmosphere (Θ) over the global ocean from observations of the surface wind-stress vector by QuikSCAT, cloud drift wind vectors derived from the Multi-angle Imaging SpectroRadiometer (MISR) and geostationary satellites, and precipitable water from the Special Sensor Microwave/Imager (SSM/I). The statistical relation is established between the input parameters (the surface wind stress, the 850 mb wind, the precipitable water, time and location) and the target data (Θ calculated from rawinsondes and reanalysis of numerical weather prediction models). The results are validated with independent daily rawinsonde observations, monthly mean reanalysis data, and through regional water balance. This study clearly demonstrates the improvement of Θ derived from satellite data using SVR over previous data sets based on linear regression and neural networks. The SVR methodology reduces both mean bias and standard deviation compared with rawinsonde observations. It agrees better with observations from synoptic to seasonal time scales, and compares more favorably with the reanalysis data on seasonal variations. Only the SVR result can achieve the water balance over South America. The rationale for the advantage of the SVR method and the impact of adding the upper-level wind will also be discussed.
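The defining ingredient of SVR is the epsilon-insensitive loss: errors inside a tube of half-width epsilon are ignored, which is what gives robustness to small observation noise. A tiny linear variant trained by subgradient descent is sketched below; the study would have used a full kernel SVR implementation, and the two "satellite" predictors here are synthetic stand-ins:

```python
import numpy as np

def linear_svr(X, y, C=10.0, eps=0.1, lr=0.01, epochs=500):
    """Minimal linear SVR: subgradient descent on
        0.5*||w||^2 + C * mean(max(0, |Xw + b - y| - eps))."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        err = X @ w + b - y
        # subgradient of the epsilon-insensitive loss
        g = np.where(np.abs(err) > eps, np.sign(err), 0.0)
        w -= lr * (w + C * X.T @ g / n)
        b -= lr * (C * g.mean())
    return w, b

# Toy moisture-transport-style regression: the target is a linear blend of
# two predictors plus noise smaller than the eps tube.
rng = np.random.default_rng(8)
X = rng.normal(size=(300, 2))
y = 1.5 * X[:, 0] - 0.7 * X[:, 1] + rng.normal(0, 0.05, 300)
w, b = linear_svr(X, y)
```

Points inside the tube contribute no gradient, so the fit is determined by the "support vectors" outside it, the property that distinguishes SVR from ordinary least squares.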

  18. Impact of global change on ground subsidence related to aquifer exploitation. The case of the Vega de Granada aquifer (SE Spain)

    NASA Astrophysics Data System (ADS)

    Pulido-Velazquez, David; María Mateos, Rosa; Rueda, Ramon; Pegalajar-Cuellar, Manuel; Ezquerro, Pablo; Béjar, Marta; Herrera, Gerardo; Collados-Lara, Antonio-Juan

    2017-04-01

    In this research, we intend to develop a methodology to assess the impact of potential global change scenarios on land subsidence. Subsidence rates in wide areas could be estimated by using remote sensing techniques, such as DInSAR, and specifically the new radar information obtained by the Sentinel set of satellites from the European Space Agency (ESA). A symbolic regression method will be developed to obtain an explicit quantitative relationship between subsidence, hydraulic head changes and other physical variables (e.g. percentage of clay and silt in the ground, load of buildings and constructions, fill-in works etc.). Different ensemble and downscaling techniques will be used to define potential future global change scenarios for the test-regions based on the data coming from simulations with different Regional Circulation Models (RCMs). Future drawdowns can be estimated from these global change scenarios under different management options. The regression approach will be employed to simulate the impacts of these drawdowns, in terms of land subsidence, taking into account the estimated hydraulic head changes. It will allow sustainable management of detrital aquifers to be assessed, taking subsidence issues into account. Classic regression analysis attempts to postulate a hypothesis function f, and the regression is reduced to the problem of finding the optimal parameters w of the hypothesis y=f(x, w), to explain a set of dependent variables y from the values of independent variables x, where x and y are known input/output data. Symbolic regression generalizes this process by assuming that f is also unknown in advance, so that the problem is formulated as finding the optimal analytical expression and its parameters that best approximate the data y considering the data in x. 
To achieve that purpose, in this work Straight Line Programs (SLP) will be used to represent analytical expressions, and a genetic programming approach will be used to find an optimal SLP that best explains the relationship between subsidence, hydraulic changes and the remaining independent variables. This methodology has been applied to the Vega de Granada aquifer system (Granada, SE Spain). The Vega de Granada detrital aquifer (with an extension of 200 km2) is one of the largest groundwater reservoirs in Andalusia and it is considered strategic for the economy of this semi-arid region. Ground motion was monitored by exploiting SAR images from ENVISAT (2003-2009), Cosmo-SkyMed (2011-2014) and Sentinel-1A (2015-2016). PSInSAR results show an inelastic deformation in the aquifer and land surface displacement values of up to -55 mm. The most widespread land subsidence is detected for the ENVISAT period (2003-2009), which coincided with a long dry period in the region. Surface displacement velocities of up to 10 mm/yr were detected in the central part of the aquifer, where many villages are located. For this period, a good correlation between groundwater level depletion and the increase in average subsidence velocity is obtained, and slight hydraulic head changes (< 2 m) produce a rapid ground motion response. This research will contribute to assess a sustainable management plan of this vital aquifer, taking into account critical levels of groundwater level depletion to avoid land subsidence on the identified vulnerable areas and during drought critical scenarios. This research has been supported by the CGL2013-48424-C2-2-R (MINECO) project.
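The difference between classic and symbolic regression can be illustrated with a toy stand-in: instead of evolving Straight Line Programs with genetic programming as the paper proposes, the sketch below just searches a small library of candidate functional forms and keeps the best least-squares fit. The square-root subsidence law and all data are invented for illustration:

```python
import numpy as np

def pick_functional_form(x, y):
    """Toy symbolic-regression stand-in: search a fixed library of candidate
    forms f, fit (a, b) in y = a*f(x) + b by least squares for each, and
    return the form with the smallest sum of squared errors."""
    candidates = {
        "a*x + b": lambda x: x,
        "a*log(1+x) + b": lambda x: np.log1p(x),
        "a*sqrt(x) + b": lambda x: np.sqrt(x),
        "a*x^2 + b": lambda x: x ** 2,
    }
    best, best_sse = None, np.inf
    for name, f in candidates.items():
        X = np.column_stack([f(x), np.ones_like(x)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = ((y - X @ beta) ** 2).sum()
        if sse < best_sse:
            best, best_sse = name, sse
    return best

# Synthetic subsidence-vs-drawdown data following a square-root law
# (hypothetical; real inputs would come from DInSAR and piezometers).
rng = np.random.default_rng(9)
drawdown = rng.uniform(0, 2, 120)
subsidence = 5.0 * np.sqrt(drawdown) + rng.normal(0, 0.2, 120)
form = pick_functional_form(drawdown, subsidence)
```

Genetic programming replaces this fixed library with an open-ended search over expression trees, which is what makes it able to discover forms nobody postulated in advance.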

  19. Timing of global regression and microbial bloom linked with the Permian-Triassic boundary mass extinction: implications for driving mechanisms

    PubMed Central

    Baresel, Björn; Bucher, Hugo; Bagherpour, Borhan; Brosse, Morgane; Guodun, Kuang; Schaltegger, Urs

    2017-01-01

    New high-resolution U-Pb dates indicate a duration of 89 ± 38 kyr for the Permian hiatus and of 14 ± 57 kyr for the overlying Triassic microbial limestone in shallow water settings of the Nanpanjiang Basin, South China. The age and duration of the hiatus coincides with the Permian-Triassic boundary (PTB) and the extinction interval in the Meishan Global Stratotype Section and Point, and strongly supports a glacio-eustatic regression, which best explains the genesis of the worldwide hiatus straddling the PTB in shallow water records. In adjacent deep marine troughs, rates of sediment accumulation display a six-fold decrease across the PTB compatible with a drier and cooler climate as indicated by terrestrial plants. Our model of the Permian-Triassic boundary mass extinction (PTBME) hinges on the synchronicity of the hiatus with the onset of the Siberian Traps volcanism. This early eruptive phase released sulfur-rich volatiles into the stratosphere, thus simultaneously eliciting a short-lived ice age responsible for the global regression and a brief but intense acidification. Abrupt cooling, shrunken habitats on shelves and acidification may all have synergistically triggered the PTBME. Subsequently, the build-up of volcanic CO2 induced a transient cool climate whose early phase saw the deposition of the microbial limestone. PMID:28262815

  1. Globalization and suicide: an ecological study across five regions of the world.

    PubMed

    Milner, Allison; McClure, Rod; De Leo, Diego

    2012-01-01

    The impact of globalization on health is recognized to be influenced by country- and regional-level factors. This study aimed to investigate the possible relationship between globalization and suicide in five world regions. An index measure of globalization was developed at the country level over 1980 to 2006. The association between the index and sex-specific suicide rates was tested using a fixed-effect regression model. Over time, the globalization index seemed to be associated with increased suicide rates in Asia and the Eastern European/Baltic region. In contrast, it was associated with decreased rates in Scandinavia. There was no significant relationship between globalization and suicide in Southern and Western Europe. The effects of globalization could be determined by specific regional (i.e., cultural and societal) factors. Identification of these mediators might provide opportunities to protect countries from the adverse impacts of globalization.
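A fixed-effect (within) regression of this kind demeans the outcome and predictor inside each country before pooling, so stable country-level differences cannot confound the slope. A minimal sketch on an invented three-country panel (all numbers illustrative, not the study's data):

```python
import numpy as np

def fixed_effects_slope(panel):
    """Within (fixed-effects) estimator: subtract each unit's own means from
    its outcome and predictor, then pool and run simple OLS on the demeaned
    data. Each panel entry is (x_series, y_series) for one unit."""
    xs, ys = [], []
    for x, y in panel:
        xs.append(x - x.mean())
        ys.append(y - y.mean())
    x, y = np.concatenate(xs), np.concatenate(ys)
    return (x @ y) / (x @ x)

# Toy panel: each "country" has its own baseline rate, but within countries
# the outcome rises 0.5 per unit of the globalization index.
rng = np.random.default_rng(10)
panel = []
for baseline in (5.0, 12.0, 20.0):
    g = np.linspace(0, 1, 27) + rng.normal(0, 0.05, 27)   # 1980-2006 index
    rate = baseline + 0.5 * g + rng.normal(0, 0.1, 27)
    panel.append((g, rate))
slope = fixed_effects_slope(panel)
```

A pooled OLS on the raw data would mix the large between-country baseline differences into the slope; the within transformation removes them by construction.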

  2. Using Time-Series Regression to Predict Academic Library Circulations.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.

    1984-01-01

    Four methods were used to forecast monthly circulation totals in 15 midwestern academic libraries: dummy time-series regression, lagged time-series regression, simple average (straight-line forecasting), and monthly average (naive forecasting). In tests of forecasting accuracy, the dummy regression method and the monthly mean method exhibited the smallest average…
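
    A dummy time-series regression of the kind compared here can be sketched as an ordinary least-squares fit of a linear trend plus 11 monthly indicator variables. The data below are synthetic and purely illustrative of the monthly-circulation setting, not the study's libraries:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly circulation counts: a linear trend plus a seasonal
# cycle and noise (illustrative data only).
n_months = 48
t = np.arange(n_months)
y = 5000 + 10 * t + 300 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 50, n_months)

# Design matrix: intercept, linear trend, and 11 monthly dummy variables
# (the 12th month is the baseline, absorbed by the intercept).
month = t % 12
X = np.column_stack([np.ones(n_months), t] +
                    [(month == m).astype(float) for m in range(11)])

# Ordinary least squares fit.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Forecast the next 12 months by extending the design matrix forward.
t_new = np.arange(n_months, n_months + 12)
month_new = t_new % 12
X_new = np.column_stack([np.ones(12), t_new] +
                        [(month_new == m).astype(float) for m in range(11)])
forecast = X_new @ beta
```

    The monthly dummies absorb any repeating 12-month pattern, which is why this kind of model competes well with a naive monthly mean.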

  3. Long-Term Efficacy of Psychosocial Treatments for Adults With Attention-Deficit/Hyperactivity Disorder: A Meta-Analytic Review

    PubMed Central

    López-Pinar, Carlos; Martínez-Sanchís, Sonia; Carbonell-Vayá, Enrique; Fenollar-Cortés, Javier; Sánchez-Meca, Julio

    2018-01-01

    Background: Recent evidence suggests that psychosocial treatments, particularly cognitive-behavioral therapy (CBT), are effective interventions for adult attention deficit hyperactivity disorder (ADHD). The objective of this review was to determine the long-term efficacy of psychosocial interventions in improving clinically relevant variables, including ADHD core symptoms, clinical global impression (CGI), and global functioning. Methods: In total, nine randomized controlled trials and three uncontrolled single-group pretest-posttest studies were included. The data from these studies were combined using the inverse variance method. Heterogeneity and risk of bias were assessed. Subgroup analyses and meta-regressions were performed, to determine the influence of different potential moderator variables (risk of bias, medication status, follow-up length, therapy type and setting, and control group type) on effect size (ES) estimates. Results: Up to 680 of a total of 1,073 participants assessed pre-treatment were retained at follow-up. Treatment groups showed greater improvement than control groups in self-reported total ADHD symptoms, inattention, and hyperactivity/impulsivity, in addition to CGI and global functioning. Blind assessors also reported a large ES in within-subject outcomes. Studies using dialectical behavioral therapy (DBT) in a group setting, with active control matching, and that were rated as having an unclear risk of bias, achieved significantly lower ES estimates for most outcomes. Treatment effectiveness, according to the CGI measure, and global functioning were significantly increased when the percentage of medicated participants was greater. Conclusions: Our results indicate that the post-treatment gains reported in previous reviews are sustained for at least 12 months. Nevertheless, these results must be interpreted with caution, because of a high level of heterogeneity among studies and the risk of bias observed in the majority of outcomes. 
Thus, these findings indicate that psychological interventions are a highly valuable and stable clinical tool for the treatment of core symptoms and global functioning in adults with ADHD. PMID:29780342

  5. A simple linear regression method for quantitative trait loci linkage analysis with censored observations.

    PubMed

    Anderson, Carl A; McRae, Allan F; Visscher, Peter M

    2006-07-01

    Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and unavailable in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and to a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and to the underlying distribution of the trait. Because it is based on linear regression methodology, the grouped linear regression model is computationally simple and fast, and can readily be implemented in freely available statistical software.

  6. A regularization corrected score method for nonlinear regression models with covariate error.

    PubMed

    Zucker, David M; Gorfine, Malka; Li, Yi; Tadesse, Mahlet G; Spiegelman, Donna

    2013-03-01

    Many regression analyses involve explanatory variables that are measured with error, and failing to account for this error is well known to lead to biased point and interval estimates of the regression coefficients. We present here a new general method for adjusting for covariate error. Our method consists of an approximate version of the Stefanski-Nakamura corrected score approach, using the method of regularization to obtain an approximate solution of the relevant integral equation. We develop the theory in the setting of classical likelihood models; this setting covers, for example, linear regression, nonlinear regression, logistic regression, and Poisson regression. The method is extremely general in terms of the types of measurement error models covered, and is a functional method in the sense of not involving assumptions on the distribution of the true covariate. We discuss the theoretical properties of the method and present simulation results in the logistic regression setting (univariate and multivariate). For illustration, we apply the method to data from the Harvard Nurses' Health Study concerning the relationship between physical activity and breast cancer mortality in the period following a diagnosis of breast cancer. Copyright © 2013, The International Biometric Society.

  7. Global patterns of current and future road infrastructure

    NASA Astrophysics Data System (ADS)

    Meijer, Johan R.; Huijbregts, Mark A. J.; Schotten, Kees C. G. J.; Schipper, Aafke M.

    2018-06-01

    Georeferenced information on road infrastructure is essential for spatial planning, socio-economic assessments and environmental impact analyses. Yet current global road maps are typically outdated or characterized by spatial bias in coverage. In the Global Roads Inventory Project we gathered, harmonized and integrated nearly 60 geospatial datasets on road infrastructure into a global roads dataset. The resulting dataset covers 222 countries and includes over 21 million km of roads, which is two to three times the total length in the currently best available country-based global roads datasets. We then related total road length per country to country area, population density, GDP and OECD membership, resulting in a regression model with an adjusted R² of 0.90, and found that the highest road densities are associated with densely populated and wealthier countries. Applying our regression model to future population densities and GDP estimates from the Shared Socioeconomic Pathway (SSP) scenarios, we obtained a tentative estimate of 3.0–4.7 million km additional road length for the year 2050. Large increases in road length were projected for developing nations in some of the world’s last remaining wilderness areas, such as the Amazon, the Congo basin and New Guinea. This highlights the need for accurate spatial road datasets to underpin strategic spatial planning in order to reduce the impacts of roads in remaining pristine ecosystems.

  8. Instantaneous global spatial interaction? Exploring the Gaussian inequality, distance and Internet pings in a global network

    NASA Astrophysics Data System (ADS)

    Baker, R. G. V.

    2005-12-01

    The Internet has been publicly portrayed as a new technological horizon yielding instantaneous interaction to a point where geography no longer matters. This research aims to dispel this impression by applying a dynamic form of trip modelling to investigate pings in a global computer network compiled by the Stanford Linear Accelerator Centre (SLAC) from 1998 to 2004. Internet flows have been predicted to have the same mathematical operators as trips to a supermarket, since they are both periodic and constrained by a distance metric. Both actual and virtual trips are part of a spectrum of origin-destination pairs in the time-space convergence of trip time-lines. Internet interaction is very near to the convergence of these time-lines (at a very small time scale in milliseconds, but with interactions over thousands of kilometres). There is a lag effect and this is formalised by the derivation of Gaussian and gravity inequalities between the time taken (Δt) and the partitioning of distance (Δx). This inequality seems to be robust for a regression of Δt to Δx in the SLAC data set for each year (1998 to 2004). There is a constant ‘forbidden zone’ in the interaction, underpinned by the fact that pings do not travel faster than the speed of light. Superimposed upon this zone is the network capacity where a linear regression of Δt to Δx is a proxy summarising global Internet connectivity for that year. The results suggest that there has been a substantial improvement in connectivity over the period with R² increasing steadily from 0.39 to 0.65 from less Gaussian spreading of the ping latencies. Further, the regression line shifts towards the inequality boundary from 1998 to 2004, where the increased slope shows a greater proportional rise in local connectivity over global connectivity. A conclusion is that national geography still does matter in spatial interaction modelling of the Internet.
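
    The ‘forbidden zone’ can be sketched numerically: a ping's round-trip time is bounded below by the round-trip distance travelled at the speed of light. The distances and latencies below are hypothetical, not SLAC measurements:

```python
# Illustrative check of the 'forbidden zone': a ping's round-trip time cannot
# be smaller than twice the one-way distance divided by the speed of light.
C_KM_PER_MS = 299.792458  # speed of light, km per millisecond

def min_rtt_ms(distance_km):
    """Physical lower bound on round-trip time for a one-way distance in km."""
    return 2 * distance_km / C_KM_PER_MS

# Hypothetical origin-destination distances (km) and measured RTTs (ms).
pairs = [(100, 4.1), (5000, 78.0), (12000, 190.5)]
violations = [rtt < min_rtt_ms(d) for d, rtt in pairs]  # all False physically
```

    Real fibre paths are longer than great-circle distance and light travels more slowly in glass, so observed latencies sit well above this bound, which is the gap the Δt-to-Δx regression summarises.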

  9. Revealing transient strain in geodetic data with Gaussian process regression

    NASA Astrophysics Data System (ADS)

    Hines, T. T.; Hetland, E. A.

    2018-03-01

    Transient strain derived from global navigation satellite system (GNSS) data can be used to detect and understand geophysical processes such as slow slip events and post-seismic deformation. Here we propose using Gaussian process regression (GPR) as a tool for estimating transient strain from GNSS data. GPR is a non-parametric, Bayesian method for interpolating scattered data. In our approach, we assume a stochastic prior model for transient displacements. The prior describes how much we expect transient displacements to covary spatially and temporally. A posterior estimate of transient strain is obtained by differentiating the posterior transient displacements, which are formed by conditioning the prior with the GNSS data. As a demonstration, we use GPR to detect transient strain resulting from slow slip events in the Pacific Northwest. Maximum likelihood methods are used to constrain a prior model for transient displacements in this region. The temporal covariance of our prior model is described by a compact Wendland covariance function, which significantly reduces the computational burden that can be associated with GPR. Our results reveal the spatial and temporal evolution of strain from slow slip events. We verify that the transient strain estimated with GPR is in fact geophysical signal by comparing it to the seismic tremor that is associated with Pacific Northwest slow slip events.
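
    The conditioning step of GPR can be sketched in a few lines of numpy, assuming a squared-exponential prior covariance rather than the compact Wendland function used in the paper (estimating strain would additionally require differentiating the posterior mean):

```python
import numpy as np

def rbf(x1, x2, amp=1.0, ls=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    return amp**2 * np.exp(-0.5 * (x1[:, None] - x2[None, :])**2 / ls**2)

def gpr_posterior(x_obs, y_obs, x_new, noise=0.1, amp=1.0, ls=1.0):
    """Posterior mean and pointwise variance of a zero-mean GP prior
    conditioned on noisy observations."""
    K = rbf(x_obs, x_obs, amp, ls) + noise**2 * np.eye(len(x_obs))
    Ks = rbf(x_new, x_obs, amp, ls)
    mean = Ks @ np.linalg.solve(K, y_obs)
    cov = rbf(x_new, x_new, amp, ls) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

# Hypothetical displacement time series: a smooth signal plus noise.
rng = np.random.default_rng(1)
x_obs = np.linspace(0, 10, 50)
y_obs = np.sin(x_obs) + rng.normal(0, 0.1, x_obs.size)
x_new = np.linspace(0, 10, 101)
mean, var = gpr_posterior(x_obs, y_obs, x_new)
```

    A compactly supported covariance such as Wendland's zeroes out distant pairs, making K sparse, which is how the paper reduces the cost of the dense solves shown here.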

  10. A short-term and high-resolution distribution system load forecasting approach using support vector regression with hybrid parameters optimization

    DOE PAGES

    Jiang, Huaiguang; Zhang, Yingchen; Muljadi, Eduard; ...

    2016-01-01

    This paper proposes an approach for distribution system load forecasting, which aims to provide highly accurate short-term load forecasting with high resolution utilizing a support vector regression (SVR)-based forecaster and a two-step hybrid parameter optimization method. Specifically, because the load profiles in distribution systems contain abrupt deviations, a data normalization is designed as the pretreatment for the collected historical load data. Then an SVR model is trained by the load data to forecast the future load. For better performance of SVR, a two-step hybrid optimization algorithm is proposed to determine the best parameters. In the first step of the hybrid optimization algorithm, a designed grid traverse algorithm (GTA) is used to narrow the parameter searching area from a global to a local space. In the second step, based on the result of the GTA, particle swarm optimization (PSO) is used to determine the best parameters in the local parameter space. After the best parameters are determined, the SVR model is used to forecast the short-term load deviation in the distribution system. The performance of the proposed approach is compared to some classic methods in later sections of the paper.
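
    The two-step idea can be sketched as a coarse grid traverse followed by a local refinement around the best coarse point. As assumptions for a self-contained example, kernel ridge regression stands in for SVR and a fine local grid stands in for PSO:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy periodic "load" series with noise, split into training and validation.
x = np.linspace(0, 4, 200)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=x.size)
x_tr, y_tr, x_va, y_va = x[::2], y[::2], x[1::2], y[1::2]

def val_error(gamma, lam):
    """Validation MSE of an RBF kernel ridge regression (stand-in for SVR)
    with kernel width gamma and regularization lam."""
    K = np.exp(-gamma * (x_tr[:, None] - x_tr[None, :])**2)
    alpha = np.linalg.solve(K + lam * np.eye(x_tr.size), y_tr)
    K_va = np.exp(-gamma * (x_va[:, None] - x_tr[None, :])**2)
    return np.mean((K_va @ alpha - y_va)**2)

# Step 1: a coarse grid traverse narrows the search from a global space...
coarse = [(g, l) for g in np.logspace(-2, 2, 5) for l in np.logspace(-6, 0, 4)]
g0, l0 = min(coarse, key=lambda p: val_error(*p))

# Step 2: ...then a local search refines around the best coarse point
# (a fine local grid here, where the paper uses particle swarm optimization).
local = [(g, l) for g in g0 * np.logspace(-0.5, 0.5, 5)
                for l in l0 * np.logspace(-0.5, 0.5, 5)]
g1, l1 = min(local, key=lambda p: val_error(*p))
```

    The coarse pass keeps the number of expensive model fits small, while the local pass recovers the precision a single fine global grid would have cost.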

  11. Testing Different Model Building Procedures Using Multiple Regression.

    ERIC Educational Resources Information Center

    Thayer, Jerome D.

    The stepwise regression method of selecting predictors for computer assisted multiple regression analysis was compared with forward, backward, and best subsets regression, using 16 data sets. The results indicated the stepwise method was preferred because of its practical nature, when the models chosen by different selection methods were similar…

  12. Assessment of LVEF using a new 16-segment wall motion score in echocardiography.

    PubMed

    Lebeau, Real; Serri, Karim; Lorenzo, Maria Di; Sauvé, Claude; Le, Van Hoai Viet; Soulières, Vicky; El-Rayes, Malak; Pagé, Maude; Zaïani, Chimène; Garot, Jérôme; Poulin, Frédéric

    2018-06-01

    The Simpson biplane method and 3D transthoracic echocardiography (TTE), radionuclide angiography (RNA) and cardiac magnetic resonance imaging (CMR) are the most accepted techniques for left ventricular ejection fraction (LVEF) assessment. Wall motion score index (WMSI) by TTE is an accepted complement. However, the conversion from WMSI to LVEF is obtained through a regression equation, which may limit its use. In this retrospective study, we aimed to validate a new method to derive LVEF from the wall motion score in 95 patients. The new score consisted of attributing a segmental EF to each LV segment based on the wall motion score and averaging all 16 segmental EFs into a global LVEF. This segmental EF score was calculated on TTE in 95 patients, and RNA was used as the reference LVEF method. LVEF using the new segmental EF 15-40-65 score on TTE was compared to the reference method using linear regression and Bland-Altman analyses. The median LVEF was 45% (interquartile range 32-53%; range 15 to 65%). Our new segmental EF 15-40-65 score derived on TTE correlated strongly with RNA-LVEF (r = 0.97). Overall, the new score resulted in good agreement of LVEF compared to RNA (mean bias 0.61%). The standard deviation of the distribution of inter-method differences between the new score and RNA was 6.2%, indicating good precision. LVEF assessment using segmental EF derived from the wall motion score applied to each of the 16 LV segments has excellent correlation and agreement with a reference method. © 2018 The authors.
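
    The averaging step can be sketched as follows, under the assumption (suggested by the score's name but not stated in the abstract) that wall motion grades map to segmental EFs of 65 (normal), 40 (hypokinetic) and 15 (akinetic/dyskinetic):

```python
# Illustrative sketch of the segmental EF 15-40-65 score: each of the 16 LV
# segments gets an EF from its wall motion grade, and the global LVEF is
# the average. The grade-to-EF mapping is an assumption for this example.
SEGMENT_EF = {"normal": 65, "hypokinetic": 40, "akinetic": 15, "dyskinetic": 15}

def global_lvef(segment_grades):
    """Average the 16 segmental EFs into a global LVEF (percent)."""
    if len(segment_grades) != 16:
        raise ValueError("expected wall motion grades for all 16 segments")
    return sum(SEGMENT_EF[g] for g in segment_grades) / 16

# A fully normal study averages to 65%; regional dysfunction pulls it down.
normal_study = ["normal"] * 16
mixed_study = ["normal"] * 8 + ["hypokinetic"] * 4 + ["akinetic"] * 4
```

    Unlike the WMSI, which needs a regression equation to reach an LVEF, this score lands directly on the LVEF scale by construction.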

  13. Regression without truth with Markov chain Monte-Carlo

    NASA Astrophysics Data System (ADS)

    Madan, Hennadii; Pernuš, Franjo; Likar, Boštjan; Špiclin, Žiga

    2017-03-01

    Regression without truth (RWT) is a statistical technique for estimating the error model parameters of each method in a group of methods used for measurement of a certain quantity. A very attractive aspect of RWT is that it does not rely on a reference method or "gold standard" data, which is otherwise difficult to obtain. RWT was used for a reference-free performance comparison of several methods for measuring left ventricular ejection fraction (EF), i.e. the percentage of blood leaving the ventricle each time the heart contracts, and has since been applied to various other quantitative imaging biomarkers (QIBs). Herein, we show how Markov chain Monte-Carlo (MCMC), a computational technique for drawing samples from a statistical distribution with probability density function known only up to a normalizing coefficient, can be used to augment RWT to gain a number of important benefits compared to the original approach based on iterative optimization. For instance, the proposed MCMC-based RWT enables the estimation of the joint posterior distribution of the error model parameters and straightforward quantification of the uncertainty of the estimates, provides an estimate of the true value of the measurand with corresponding credible intervals (CIs), does not require a finite support for the prior distribution of the measurand, and generally has much improved robustness against convergence to non-global maxima. The proposed approach is validated using synthetic data that emulate the EF data for 45 patients measured with 8 different methods. The obtained results show that the 90% CIs of the corresponding parameter estimates contain the true values of all error model parameters and the measurand. A potential real-world application is to take measurements of a certain QIB with several different methods and then use the proposed framework to compute estimates of the true values and their uncertainty, vital information for diagnosis based on QIBs.
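
    The computational core described here, drawing samples from a density known only up to a normalizing coefficient, can be sketched with a generic random-walk Metropolis sampler (illustrated on an unnormalized standard normal, not the RWT error model):

```python
import numpy as np

def metropolis(log_density, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis: samples a distribution whose density is
    known only up to a normalizing coefficient, via its log density."""
    rng = np.random.default_rng(seed)
    x, logp = x0, log_density(x0)
    samples = np.empty(n_samples)
    for i in range(n_samples):
        prop = x + step * rng.normal()
        logp_prop = log_density(prop)
        # Accept with probability min(1, p(prop)/p(x)).
        if np.log(rng.uniform()) < logp_prop - logp:
            x, logp = prop, logp_prop
        samples[i] = x
    return samples

# Unnormalized standard normal: the constant 1/sqrt(2*pi) is never needed,
# because only density ratios enter the acceptance test.
samples = metropolis(lambda x: -0.5 * x**2, x0=0.0, n_samples=20000)
```

    After discarding burn-in, the retained samples approximate the posterior, so credible intervals fall out as sample quantiles rather than requiring an optimizer.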

  14. Round Robin evaluation of soil moisture retrieval models for the MetOp-A ASCAT Instrument

    NASA Astrophysics Data System (ADS)

    Gruber, Alexander; Paloscia, Simonetta; Santi, Emanuele; Notarnicola, Claudia; Pasolli, Luca; Smolander, Tuomo; Pulliainen, Jouni; Mittelbach, Heidi; Dorigo, Wouter; Wagner, Wolfgang

    2014-05-01

    Global soil moisture observations are crucial to understand hydrologic processes, earth-atmosphere interactions and climate variability. ESA's Climate Change Initiative (CCI) project aims to create a globally consistent long-term soil moisture data set based on the merging of the best available active and passive satellite-based microwave sensors and retrieval algorithms. Within the CCI, a Round Robin evaluation of existing retrieval algorithms for both active and passive instruments was carried out. In this study we present the comparison of five different retrieval algorithms covering three different modelling principles applied to active MetOp-A ASCAT L1 backscatter data. These models include statistical models (Bayesian Regression and Support Vector Regression, provided by the Institute for Applied Remote Sensing, Eurac Research, Italy, and an Artificial Neural Network, provided by the Institute of Applied Physics, CNR-IFAC, Italy), a semi-empirical model (provided by the Finnish Meteorological Institute), and a change detection model (provided by the Vienna University of Technology). The algorithms were applied on L1 backscatter data within the period of 2007-2011, resampled to a 12.5 km grid. The evaluation was performed over 75 globally distributed, quality-controlled in situ stations drawn from the International Soil Moisture Network (ISMN), using surface soil moisture data from the Global Land Data Assimilation System (GLDAS) Noah land surface model as a second independent reference. The temporal correlation between the data sets was analyzed and the random errors of the different algorithms were estimated using the triple collocation method. Absolute soil moisture values as well as soil moisture anomalies were considered, including both long-term anomalies from the mean seasonal cycle and short-term anomalies from a five-week moving average window. Results show a very high agreement between all five algorithms for most stations. A slight vegetation dependency of the errors and a spatial decorrelation of the performance patterns of the different algorithms were found. We conclude that future research should focus on understanding, combining and exploiting the advantages of all available modelling approaches rather than trying to optimize one approach to fit every possible condition.
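
    The triple collocation method used above can be sketched for the classical case of three collocated estimates of one signal with mutually independent errors, where each product's error variance follows from the pairwise covariances. The data below are synthetic:

```python
import numpy as np

def triple_collocation_errors(x, y, z):
    """Random error variances of three collocated measurement series of the
    same signal, assuming mutually independent, signal-independent errors."""
    C = np.cov(np.vstack([x, y, z]))
    ex = C[0, 0] - C[0, 1] * C[0, 2] / C[1, 2]
    ey = C[1, 1] - C[0, 1] * C[1, 2] / C[0, 2]
    ez = C[2, 2] - C[0, 2] * C[1, 2] / C[0, 1]
    return ex, ey, ez

# Synthetic "soil moisture" truth observed by three products whose noise
# standard deviations (0.1, 0.2, 0.3) the method should recover as variances.
rng = np.random.default_rng(3)
truth = rng.normal(0, 1, 100000)
x = truth + rng.normal(0, 0.1, truth.size)
y = truth + rng.normal(0, 0.2, truth.size)
z = truth + rng.normal(0, 0.3, truth.size)
ex, ey, ez = triple_collocation_errors(x, y, z)
```

    The appeal for satellite validation is the same as in the Round Robin: no single product has to be treated as error-free truth.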

  15. Regional variation in the prevalence of E. coli O157 in cattle: a meta-analysis and meta-regression.

    PubMed

    Islam, Md Zohorul; Musekiwa, Alfred; Islam, Kamrul; Ahmed, Shahana; Chowdhury, Sharmin; Ahad, Abdul; Biswas, Paritosh Kumar

    2014-01-01

    Escherichia coli O157 (EcO157) infection has been recognized as an important global public health concern. But information on the prevalence of EcO157 in cattle at the global and at the wider geographical levels is limited, if not absent. This is the first meta-analysis to investigate the point prevalence of EcO157 in cattle at the global level and to explore the factors contributing to variation in prevalence estimates. Seven electronic databases (CAB Abstracts, PubMed, Biosis Citation Index, Medline, Web of Knowledge, Scirus and Scopus) were searched for relevant publications from 1980 to 2012. A random effects meta-analysis model was used to produce the pooled estimates. The potential sources of between-study heterogeneity were identified using meta-regression. A total of 140 studies comprising 220,427 cattle were included in the meta-analysis. The prevalence estimate of EcO157 in cattle at the global level was 5.68% (95% CI, 5.16-6.20). The random effects pooled prevalence estimates in Africa, Northern America, Oceania, Europe, Asia and Latin America-Caribbean were 31.20% (95% CI, 12.35-50.04), 7.35% (95% CI, 6.44-8.26), 6.85% (95% CI, 2.41-11.29), 5.15% (95% CI, 4.21-6.09), 4.69% (95% CI, 3.05-6.33) and 1.65% (95% CI, 0.77-2.53), respectively. Between-study heterogeneity was evident in most regions. World region (p<0.001), type of cattle (p<0.001) and, to some extent, specimens (p = 0.074) as well as method of pre-enrichment (p = 0.110), were identified as factors for variation in the prevalence estimates of EcO157 in cattle. The prevalence of the organism seems to be higher in the African and Northern American regions. The important factors that might influence the estimates of EcO157 are the type of cattle and the kind of screening specimen. Their roles need to be determined and they should be properly handled in any survey to estimate the true prevalence of EcO157.
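
    The random-effects pooling step can be sketched with the DerSimonian-Laird estimator, a standard choice for random-effects meta-analysis (the abstract does not name the estimator used, and the study data below are hypothetical):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate from per-study effect sizes and
    within-study variances (DerSimonian-Laird moment estimator)."""
    theta = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v                                  # fixed-effect weights
    fixed = np.sum(w * theta) / np.sum(w)
    Q = np.sum(w * (theta - fixed)**2)           # Cochran's Q heterogeneity
    df = theta.size - 1
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (v + tau2)                      # random-effects weights
    pooled = np.sum(w_re * theta) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2

# Hypothetical prevalence estimates (%) from five surveys, with variances.
pooled, se, tau2 = dersimonian_laird([4.2, 6.1, 5.0, 7.3, 3.8],
                                     [0.4, 0.9, 0.5, 1.2, 0.3])
```

    The between-study variance tau2 is what separates this from a fixed-effect pool: when studies disagree more than their within-study variances allow, tau2 grows and the weights even out.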

  16. Global, regional and national prevalence of overweight and obesity in children and adults 1980-2013: A systematic analysis

    PubMed Central

    Ng, Marie; Fleming, Tom; Robinson, Margaret; Thomson, Blake; Graetz, Nicholas; Margono, Christopher; Mullany, Erin C; Biryukov, Stan; Abbafati, Cristiana; Abera, Semaw Ferede; Abraham, Jerry P; Abu-Rmeileh, Niveen ME; Achoki, Tom; AlBuhairan, Fadia S; Alemu, Zewdie A; Alfonso, Rafael; Ali, Mohammed K; Ali, Raghib; Guzman, Nelson Alvis; Ammar, Walid; Anwari, Palwasha; Banerjee, Amitava; Barquera, Simon; Basu, Sanjay; Bennett, Derrick A; Bhutta, Zulfiqar; Blore, Jed; Cabral, Norberto; Nonato, Ismael Campos; Chang, Jung-Chen; Chowdhury, Rajiv; Courville, Karen J; Criqui, Michael H; Cundiff, David K; Dabhadkar, Kaustubh C; Dandona, Lalit; Davis, Adrian; Dayama, Anand; Dharmaratne, Samath D; Ding, Eric L; Durrani, Adnan M; Esteghamati, Alireza; Farzadfar, Farshad; Fay, Derek FJ; Feigin, Valery L; Flaxman, Abraham; Forouzanfar, Mohammad H; Goto, Atsushi; Green, Mark A; Gupta, Rajeev; Hafezi-Nejad, Nima; Hankey, Graeme J; Harewood, Heather C; Havmoeller, Rasmus; Hay, Simon; Hernandez, Lucia; Husseini, Abdullatif; Idrisov, Bulat T; Ikeda, Nayu; Islami, Farhad; Jahangir, Eiman; Jassal, Simerjot K; Jee, Sun Ha; Jeffreys, Mona; Jonas, Jost B; Kabagambe, Edmond K; Khalifa, Shams Eldin Ali Hassan; Kengne, Andre Pascal; Khader, Yousef Saleh; Khang, Young-Ho; Kim, Daniel; Kimokoti, Ruth W; Kinge, Jonas M; Kokubo, Yoshihiro; Kosen, Soewarta; Kwan, Gene; Lai, Taavi; Leinsalu, Mall; Li, Yichong; Liang, Xiaofeng; Liu, Shiwei; Logroscino, Giancarlo; Lotufo, Paulo A; Lu, Yuan; Ma, Jixiang; Mainoo, Nana Kwaku; Mensah, George A; Merriman, Tony R; Mokdad, Ali H; Moschandreas, Joanna; Naghavi, Mohsen; Naheed, Aliya; Nand, Devina; Narayan, KM Venkat; Nelson, Erica Leigh; Neuhouser, Marian L; Nisar, Muhammad Imran; Ohkubo, Takayoshi; Oti, Samuel O; Pedroza, Andrea; Prabhakaran, Dorairaj; Roy, Nobhojit; Sampson, Uchechukwu; Seo, Hyeyoung; Sepanlou, Sadaf G; Shibuya, Kenji; Shiri, Rahman; Shiue, Ivy; Singh, Gitanjali M; Singh, Jasvinder A; Skirbekk, Vegard; Stapelberg, Nicolas JC; Sturua, 
Lela; Sykes, Bryan L; Tobias, Martin; Tran, Bach X; Trasande, Leonardo; Toyoshima, Hideaki; van de Vijver, Steven; Vasankari, Tommi J; Veerman, J Lennert; Velasquez-Melendez, Gustavo; Vlassov, Vasiliy Victorovich; Vollset, Stein Emil; Vos, Theo; Wang, Claire; Wang, Sharon XiaoRong; Weiderpass, Elisabete; Werdecker, Andrea; Wright, Jonathan L; Yang, Y Claire; Yatsuya, Hiroshi; Yoon, Jihyun; Yoon, Seok-Jun; Zhao, Yong; Zhou, Maigeng; Zhu, Shankuan; Lopez, Alan D; Murray, Christopher JL

    2015-01-01

    Background In 2010, overweight and obesity were estimated to cause 3.4 million deaths, 3.9% of years of life lost, and 3.8% of DALYs globally. The rise in obesity has led to widespread calls for regular monitoring of changes in overweight and obesity prevalence in all populations. Comparative, up-to-date information on levels and trends is essential both to quantify population health effects and to prompt decision-makers to prioritize action. Methods We systematically identified surveys, reports, and published studies (n = 1,769) that included information on height and weight, both through physical measurements and self-reports. Mixed effects linear regression was used to correct for the bias in self-reports. Age-sex-country-year observations (n = 19,244) on prevalence of obesity and overweight were synthesized using a spatio-temporal Gaussian Process Regression model to estimate prevalence with 95% uncertainty intervals. Findings Globally, the proportion of adults with a body mass index (BMI) of 25 or greater increased from 28.8% (95% UI: 28.4-29.3) in 1980 to 36.9% (36.3-37.4) in 2013 for men and from 29.8% (29.3-30.2) to 38.0% (37.5-38.5) for women. Increases were observed in both developed and developing countries. There have been substantial increases in prevalence among children and adolescents in developed countries, with 23.8% (22.9-24.7) of boys and 22.6% (21.7-23.6) of girls being either overweight or obese in 2013. The prevalence of overweight and obesity has also risen among children and adolescents in developing countries, from 8.1% (7.7-8.6) to 12.9% (12.3-13.5) in 2013 for boys and from 8.4% (8.1-8.8) to 13.4% (13.0-13.9) for girls. Among adults, estimated prevalence of obesity exceeds 50% among men in Tonga and women in Kuwait, Kiribati, Federated States of Micronesia, Libya, Qatar, Tonga, and Samoa. Since 2006, the increase in adult obesity in developed countries has stabilized. 
Interpretation Because of the established health risks and substantial increases in prevalence, obesity has become a major global health challenge. Contrary to other major global risks, there is little evidence of successful population-level intervention strategies to reduce exposure. Not only is obesity increasing, but there are no national success stories over the past 33 years. Urgent global action and leadership is required to assist countries to more effectively intervene. PMID:24880830

  17. Reduced-order modelling of parameter-dependent, linear and nonlinear dynamic partial differential equation models.

    PubMed

    Shah, A A; Xing, W W; Triantafyllidis, V

    2017-04-01

    In this paper, we develop reduced-order models for dynamic, parameter-dependent, linear and nonlinear partial differential equations using proper orthogonal decomposition (POD). The main challenges are to accurately and efficiently approximate the POD bases for new parameter values and, in the case of nonlinear problems, to efficiently handle the nonlinear terms. We use a Bayesian nonlinear regression approach to learn the snapshots of the solutions and the nonlinearities for new parameter values. Computational efficiency is ensured by using manifold learning to perform the emulation in a low-dimensional space. The accuracy of the method is demonstrated on a linear and a nonlinear example, with comparisons with a global basis approach.
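
    The POD step itself can be sketched via the singular value decomposition of a snapshot matrix, with the basis size chosen by an energy criterion (the snapshots below are synthetic and exactly rank two, not the paper's examples):

```python
import numpy as np

# Snapshot matrix: each column is a "solution" built from two spatial
# modes with parameter-dependent coefficients.
x = np.linspace(0, 1, 200)
snapshots = np.column_stack([
    np.cos(0.1 * k) * np.sin(np.pi * x)
    + 0.5 * np.sin(0.1 * k) * np.sin(2 * np.pi * x)
    for k in range(50)
])

# POD basis = left singular vectors of the snapshot matrix.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)

# Keep the smallest basis capturing 99.9% of the snapshot energy.
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.999)) + 1
basis = U[:, :r]

# Reduced-order reconstruction of the snapshots from r modes.
recon = basis @ (basis.T @ snapshots)
err = np.linalg.norm(snapshots - recon) / np.linalg.norm(snapshots)
```

    The paper's harder problem, predicting a suitable basis at parameter values where no snapshots exist, is what the Bayesian regression and manifold learning layers add on top of this decomposition.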

  19. Student experiences of participating in five collaborative blended learning courses in Africa and Asia: a survey

    PubMed Central

    Atkins, Salla; Yan, Weirong; Meragia, Elnta; Mahomed, Hassan; Rosales-Klintz, Senia; Skinner, Donald; Zwarenstein, Merrick

    2016-01-01

    Background As blended learning (BL; a combination of face-to-face and e-learning methods) becomes more commonplace, it is important to assess whether students find it useful for their studies. ARCADE HSSR and ARCADE RSDH (African Regional Capacity Development for Health Systems and Services Research; Asian Regional Capacity Development for Research on Social Determinants of Health) were unique capacity-building projects, focusing on developing BL in Africa and Asia on issues related to global health. Objective We aimed to evaluate the student experience of participating in any of five ARCADE BL courses implemented collaboratively at institutions from Africa, Asia, and Europe. Design A post-course student survey with 118 students was conducted. The data were collected using email or through an e-learning platform. Data were analysed with SAS, using bivariate and multiple logistic regression. We focused on the associations between various demographic and experience variables and student-reported overall perceptions of the courses. Results In total, 82 students responded to the survey. In bivariate logistic regression, the course a student took [p=0.0067, odds ratio (OR)=0.192; 95% confidence interval (CI): 0.058–0.633], male gender of student (p=0.0474, OR=0.255; 95% CI: 0.066–0.985), not experiencing technical problems (p<0.001, OR=17.286; 95% CI: 4.629–64.554), and reporting the discussion forum as adequate for student needs (p=0.0036, OR=0.165; 95% CI: 0.049–0.555) were found to be associated with a more positive perception of BL, as measured by student rating of the overall helpfulness of the e-learning component to their studies. In contrast, perceiving the assessment as adequate was associated with a worse perception of overall usefulness. 
In a multiple regression, the course, experiencing no technical problems, and perceiving the discussion as adequate remained significantly associated with a more positively rated perception of the usefulness of the online component of the blended courses. Discussion The results suggest that lack of technical problems and functioning discussion forums are of importance during BL courses focusing on global health-related topics. Through paying attention to these aspects, global health education could be provided using BL approaches to student satisfaction. PMID:27725077
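The bivariate logistic regressions reported above can be sketched in plain NumPy: the toy below fits a logistic model by Newton-Raphson and reports an odds ratio. The data are synthetic (a hypothetical "technical problems" predictor and a satisfaction outcome), not the survey's, and the effect size is invented for illustration.

```python
import numpy as np

def fit_logistic(x, y, n_iter=25):
    """Fit a logistic regression (intercept + one predictor) by Newton-Raphson."""
    X = np.column_stack([np.ones(len(x)), x])
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))   # fitted probabilities
        W = p * (1.0 - p)                     # iteratively reweighted LS weights
        beta += np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (y - p))
    return beta

rng = np.random.default_rng(0)
problems = rng.integers(0, 2, 200).astype(float)   # 1 = had technical problems
logit = 1.5 - 2.0 * problems                       # problems lower satisfaction
satisfied = (rng.random(200) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

beta = fit_logistic(problems, satisfied)
odds_ratio = np.exp(beta[1])   # OR for having technical problems (below 1 here)
```

An OR below 1 for *having* problems mirrors, in reverse, the abstract's finding that *not* experiencing technical problems was associated with a better rating (OR=17.286).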

  20. Discussion on common errors in analyzing sea level accelerations, solar trends and global warming

    NASA Astrophysics Data System (ADS)

    Scafetta, N.

    2013-05-01

    Herein I discuss common errors in applying regression models and wavelet filters used to analyze geophysical signals. I demonstrate that: (1) multidecadal natural oscillations (e.g. the quasi-60 yr Atlantic Multidecadal Oscillation (AMO), North Atlantic Oscillation (NAO) and Pacific Decadal Oscillation (PDO)) need to be taken into account for properly quantifying anomalous background accelerations in tide gauge records such as in New York City; (2) uncertainties and multicollinearity among climate forcing functions also prevent a proper evaluation of the solar contribution to the 20th century global surface temperature warming using overloaded linear regression models during the 1900-2000 period alone; (3) when periodic wavelet filters, which require that a record be pre-processed with a reflection methodology, are improperly applied to decompose non-stationary solar and climatic time series, Gibbs boundary artifacts emerge, yielding misleading physical interpretations. By correcting these errors and using optimized regression models that reduce multicollinearity artifacts, I found the following results: (1) the relative sea level in New York City is not accelerating in an alarming way, and may increase by about 350 ± 30 mm from 2000 to 2100 instead of the previously projected values varying from 1130 ± 480 mm to 1550 ± 400 mm estimated using the methods proposed, e.g., by Sallenger Jr. et al. (2012) and Boon (2012), respectively; (2) the solar activity increase during the 20th century contributed at least about 50% of the 0.8 °C global warming observed during the 20th century instead of only 7-10% (e.g.: IPCC, 2007; Benestad and Schmidt, 2009; Lean and Rind, 2009; Rohde et al., 2013). The first result was obtained by using a quadratic polynomial function plus a 60 yr harmonic to fit a 110 yr-long sea level record. 
The second result was obtained by using solar, volcano, greenhouse gas and aerosol forcing functions to fit modern paleoclimatic temperature reconstructions (e.g.: Moberg et al., 2005; Mann et al., 2008; Christiansen and Ljungqvist, 2012) since the Medieval Warm Period, which show a large millennial cycle that is well correlated to the millennial solar cycle (e.g.: Kirkby, 2007; Scafetta and West, 2007; Scafetta, 2012c). These findings stress the importance of natural oscillations and of the sun for properly interpreting climatic changes.
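The "quadratic polynomial plus a 60 yr harmonic" fit is an ordinary least-squares problem. The sketch below applies it to a synthetic tide-gauge record (linear rise plus a 60-yr oscillation, values invented for illustration) and reads the acceleration off the quadratic coefficient:

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1900.0, 2011.0)      # a 111-point annual record
t = years - years[0]
# synthetic tide-gauge record (mm): linear rise + 60-yr oscillation + noise
sea = 3.0 * t + 25.0 * np.sin(2 * np.pi * t / 60.0) + rng.normal(0, 10.0, t.size)

# design matrix: quadratic polynomial plus a 60-yr harmonic (sin and cos terms)
A = np.column_stack([np.ones_like(t), t, t ** 2,
                     np.sin(2 * np.pi * t / 60.0), np.cos(2 * np.pi * t / 60.0)])
coef, *_ = np.linalg.lstsq(A, sea, rcond=None)
acceleration = 2.0 * coef[2]   # mm/yr^2; ~0 here, since the true rise is linear
```

Including the harmonic regressor prevents the 60-yr oscillation from being absorbed into a spurious quadratic (accelerating) term, which is the abstract's central point.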

  2. Estimating the Counterfactual Impact of Conservation Programs on Land Cover Outcomes: The Role of Matching and Panel Regression Techniques

    PubMed Central

    Jones, Kelly W.; Lewis, David J.

    2015-01-01

    Deforestation and conversion of native habitats continues to be the leading driver of biodiversity and ecosystem service loss. A number of conservation policies and programs are implemented—from protected areas to payments for ecosystem services (PES)—to deter these losses. Currently, empirical evidence on whether these approaches stop or slow land cover change is lacking, but there is increasing interest in conducting rigorous, counterfactual impact evaluations, especially for many new conservation approaches, such as PES and REDD, which emphasize additionality. In addition, several new, globally available and free high-resolution remote sensing datasets have increased the ease of carrying out an impact evaluation on land cover change outcomes. While the number of conservation evaluations utilizing ‘matching’ to construct a valid control group is increasing, the majority of these studies use simple differences in means or linear cross-sectional regression to estimate the impact of the conservation program using this matched sample, with relatively few utilizing fixed effects panel methods—an alternative estimation method that relies on temporal variation in the data. In this paper we compare the advantages and limitations of (1) matching to construct the control group combined with differences in means and cross-sectional regression, which control for observable forms of bias in program evaluation, to (2) fixed effects panel methods, which control for observable and time-invariant unobservable forms of bias, with and without matching to create the control group. We then use these four approaches to estimate forest cover outcomes for two conservation programs: a PES program in Northeastern Ecuador and strict protected areas in European Russia. In the Russia case we find statistically significant differences across estimators—due to the presence of unobservable bias—that lead to differences in conclusions about effectiveness. 
The Ecuador case illustrates that if time-invariant unobservables are not present, matching combined with differences in means or cross-sectional regression leads to similar estimates of program effectiveness as matching combined with fixed effects panel regression. These results highlight the importance of considering observable and unobservable forms of bias and the methodological assumptions across estimators when designing an impact evaluation of conservation programs. PMID:26501964
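The gap between a naive cross-sectional comparison and the fixed effects (within) estimator can be sketched on synthetic panel data. Everything below is invented for illustration: a program effect of +1.5, and a time-invariant unobservable (e.g. land quality) correlated with enrollment, which biases the naive estimate but is removed by demeaning.

```python
import numpy as np

rng = np.random.default_rng(2)
n_units, n_years = 50, 6
treated = rng.integers(0, 2, n_units)            # 1 = enrolled in the program
# time-invariant unobservable correlated with enrollment (selection bias)
alpha = rng.normal(0, 1.0, n_units) + 1.0 * treated

unit = np.repeat(np.arange(n_units), n_years)
year = np.tile(np.arange(n_years), n_units)
post = (year >= 3).astype(float)
d = treated[unit] * post                         # treatment switches on in year 3
y = alpha[unit] + 1.5 * d + rng.normal(0, 1.0, n_units * n_years)

# naive post-period difference in means: absorbs the alpha selection bias
post_mask = post == 1
naive = (y[post_mask & (treated[unit] == 1)].mean()
         - y[post_mask & (treated[unit] == 0)].mean())

def demean_by(group, x):
    """Within transformation: subtract each unit's own mean."""
    means = np.bincount(group, weights=x) / np.bincount(group)
    return x - means[group]

# fixed effects (within) estimator: unit demeaning removes time-invariant bias
y_w, d_w = demean_by(unit, y), demean_by(unit, d)
fe_effect = (d_w @ y_w) / (d_w @ d_w)
```

With the chosen parameters the naive estimate overstates the program effect by roughly the selection term, while the within estimator recovers it, matching the Russia-case argument above.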

  3. Use of Subjective Global Assessment, Patient-Generated Subjective Global Assessment and Nutritional Risk Screening 2002 to evaluate the nutritional status of non-critically ill patients on parenteral nutrition.

    PubMed

    Badia-Tahull, M B; Cobo-Sacristán, S; Leiva-Badosa, E; Miquel-Zurita, M E; Méndez-Cabalerio, N; Jódar-Masanés, R; Llop-Talaverón, J

    2014-02-01

    To evaluate the nutritional status of non-critically ill digestive surgery patients at the moment of parenteral nutrition initiation using three different nutritional test tools, and to study their correlation. To study the association between the tests and the clinical and laboratory parameters used in the follow-up of parenteral nutrition treatment. Prospective study over 4 months. Anthropometric and clinical variables were recorded. Results of the Subjective Global Assessment, Patient-Generated Subjective Global Assessment, and Nutritional Risk Screening 2002 were compared applying the kappa test. The relationship of the clinical and laboratory parameters with Subjective Global Assessment was studied by multinomial regression, and with the other two tests by multiple linear regression models. Age and sex were included as adjustment variables. Malnutrition in the 45 studied patients varied from 51% to 57%. Subjective Global Assessment correlated well with Patient-Generated Subjective Global Assessment and Nutritional Risk Screening 2002 (κ = 0.531; p = 0.000). The test with the greatest correlation with the clinical and analytical variables was the Nutritional Risk Screening 2002. A worse nutritional state in this test was associated with worse results for albumin (B = -0.087; CI = [-0.169/-0.005]), prealbumin (B = -0.005; CI = [-0.011/-0.001]), C-reactive protein (B = 0.006; CI = [0.001/0.011]) and leukocytes (B = 0.134; CI = [0.031/0.237]) at the end of parenteral nutrition treatment. Half of the digestive surgery patients were at risk of malnutrition at the moment of initiating parenteral nutrition. Nutritional Risk Screening 2002 was the test with the best association with the parameters used in the clinical follow-up of parenteral nutrition treated patients. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
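The kappa statistic used to compare the screening tools measures agreement beyond chance. A minimal implementation, applied to hypothetical malnutrition classifications from two tools (the values below are invented, not the study's data):

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa: agreement between two raters/tests beyond chance."""
    a, b = np.asarray(a), np.asarray(b)
    cats = np.unique(np.concatenate([a, b]))
    po = np.mean(a == b)                                        # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in cats)   # chance agreement
    return (po - pe) / (1.0 - pe)

# hypothetical classifications: 0 = well-nourished, 1 = malnourished
sga = np.array([1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0])
nrs = np.array([1, 1, 0, 1, 1, 0, 1, 0, 0, 0, 1, 0])
kappa = cohens_kappa(sga, nrs)   # 10/12 observed agreement -> kappa = 2/3
```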

  4. A trend analysis of surgical operations under a global payment system in Tehran, Iran (2005–2015)

    PubMed Central

    Goudari, Faranak Behzadi; Rashidian, Arash; Arab, Mohammad; Mahmoudi, Mahmood

    2018-01-01

    Background Global payment system is an example of a per-case payment system that contains 60 commonly used surgical operations for which payment is based on the average cost per case in Iran. Objective The aim of the study was to determine the amount of reduction, increase or no change in the trend of global operations. Methods In this retrospective longitudinal study, data on the 60 primary global surgery codes were gathered from Tehran Health Insurance Organization within the ten-year period of 2005–2015, separately for each month. Out of 60 surgery codes, acceptable data were available for only 46 codes, based on the insurance documents sent by medical centers. A quantitative analysis of time series through a regression analysis model using STATA software v.11 was performed. Results Some global surgery codes had an upward trend and some were downward. Of the N codes, N83, N20, N28, N63, and N93 had an upward trend (p<0.05) and N32, N43, N81 and N90 showed a significant downward trend (p<0.05). Similarly, all H codes except for H18 had a significant upward trend (p<0.001). As such, K codes including K45, K56 and K81 showed an increasing trend. S codes experienced both increasing and decreasing trends. However, none of the O codes changed over time. Other global surgical codes such as C61, E07, M51, L60 and J98 (p<0.001), I84 (p=0.031) and I86 (p<0.001) showed upward or downward trends. The trend of total global surgeries was significantly upward (B=24.26109, p<0.001). Conclusion The varying trend of global surgeries can partly reflect the behavior of service providers in order to increase their profits and minimize their costs. PMID:29765576

  5. Mapping Global Urban Extent and Intensity for Environmental Monitoring and Modeling

    NASA Astrophysics Data System (ADS)

    Schneider, A.; Friedl, M. A.

    2007-05-01

    The human dimensions of global environmental change have received increased attention in policy, decision-making, research, and even the media. However, the influence of urban areas in global change processes is still often assumed to be negligible. Although local environmental conditions such as the urban heat island effect are well-documented, little or no work has focused on cross-scale interactions, or the ways in which local urban processes cumulatively impact global changes. Given the rapid rates of rural-urban migration, economic development and urban spatial expansion, it is becoming increasingly clear that the 'ecological footprint' of cities may play a critical role in environmental changes at regional and global scales. Our understanding of the cumulative impacts of urban areas on natural systems has been limited foremost by a lack of reliable, accurate data on current urban form and extent at the global scale. The data sets that have emerged to fill this gap (LandScan, GRUMP, nighttime lights) suffer from a number of limitations that prevent widespread use. Building on our early efforts with MODIS data, our current work focuses on: (1) completing a new, validated map of global urban extent; and (2) developing methods to estimate the subpixel fraction of impervious surface, vegetation, and other land cover types within urbanized areas using coarse resolution satellite imagery. For the first task, a technique called boosting is used to improve classification accuracy and provides a means to integrate 500 m resolution MODIS data with ancillary data sources. For the second task, we present an approach for estimating percent cover that relies on continuous training data for a full range of city types. These exemplars are used as inputs to fuzzy neural network and regression tree algorithms to predict fractional amounts of land cover types with increased accuracy.
Preliminary results for a global sample of 100 cities (which vary in population size, level of economic development, and spatial extent) show good agreement with the expected morphology in each region.
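Boosting combined with tree models, as used above, can be sketched in a few lines: gradient boosting fits a sequence of depth-1 regression trees (stumps), each to the residuals of the current prediction. The data below are a synthetic one-dimensional stand-in for a percent-cover target, not MODIS imagery.

```python
import numpy as np

def fit_stump(x, residual):
    """Best single threshold split minimizing squared error (1-D feature)."""
    best = None
    for thr in np.unique(x)[:-1]:
        left, right = residual[x <= thr], residual[x > thr]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, thr, left.mean(), right.mean())
    return best[1:]

def boost(x, y, n_rounds=50, lr=0.1):
    """Gradient boosting on stumps: each round fits the current residuals."""
    pred = np.full_like(y, y.mean())
    for _ in range(n_rounds):
        thr, lval, rval = fit_stump(x, y - pred)
        pred += lr * np.where(x <= thr, lval, rval)
    return pred

rng = np.random.default_rng(3)
x = rng.uniform(0.0, 1.0, 300)                  # e.g. a reflectance band value
y = np.where(x > 0.5, 1.0, 0.0) + rng.normal(0, 0.1, 300)   # step-like target
pred = boost(x, y)
mse = np.mean((pred - y) ** 2)                  # approaches the noise floor
```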

  6. STAKEHOLDERS’ OPINIONS AND EXPECTATIONS OF THE GLOBAL FUND AND THEIR POTENTIAL ECONOMIC IMPLICATIONS

    PubMed Central

    Galárraga, Omar; Bertozzi, Stefano M.

    2009-01-01

    Objective To analyze stakeholder opinions and expectations of the Global Fund to Fight AIDS, Tuberculosis, and Malaria, and to discuss their potential economic and financial implications. Design The Global Fund commissioned an independent study, the “360° Stakeholder Assessment,” to canvass feedback on the organization’s reputation and performance with an on-line survey of 909 respondents representing major stakeholders worldwide. We created a proxy for expectations based on categorical responses for specific Global Fund attributes’ importance to the stakeholders, and current perceived performance. Methods Using multivariate regression, we analyzed 23 unfulfilled expectations related to: resource mobilization; impact measurement; harmonization and inclusion; effectiveness of the Global Fund partner environment; and portfolio characteristics. The independent variables are personal- and regional-level characteristics that affect expectations. Results The largest unfulfilled expectations relate to: mobilization of private sector resources; efficiency in disbursing funds; and assurance that people affected by the three diseases are reached. Stakeholders involved with the Fund through the Country Coordinating Mechanisms, those working in multilateral organizations, and persons living with HIV are more likely to have unfulfilled expectations. In contrast, higher levels of involvement with the Fund correlate with fulfilled expectations. Stakeholders living in sub-Saharan Africa were less likely to have their expectations met. Conclusions Stakeholders’ unfulfilled expectations result largely from factors external to them, but also from factors over which they have influence. In particular, attributes related to partnership score poorly even though stakeholders have influence in that area. Joint efforts to address perceived performance gaps may improve future performance, and positively influence investment levels and economic viability. PMID:18664957

  7. Multi-Site and Multi-Variables Statistical Downscaling Technique in the Monsoon Dominated Region of Pakistan

    NASA Astrophysics Data System (ADS)

    Khan, Firdos; Pilz, Jürgen

    2016-04-01

    South Asia is under the severe impacts of changing climate and global warming. The last two decades showed that climate change or global warming is happening, and the first decade of the 21st century is considered the warmest decade in Pakistan's recorded history, with temperatures reaching 53 °C in 2010. Consequently, the spatio-temporal distribution and intensity of precipitation have been badly affected, causing floods, cyclones and hurricanes in the region, which further have impacts on agriculture, water, health, etc. To cope with the situation, it is important to conduct impact assessment studies and take adaptation and mitigation remedies. For impact assessment studies, we need climate variables at higher resolution. Downscaling techniques are used to produce climate variables at higher resolution; these techniques are broadly divided into two types, statistical downscaling and dynamical downscaling. The target location of this study is the monsoon-dominated region of Pakistan. One reason for choosing this area is that the contribution of monsoon rains in this area is more than 80% of the total rainfall. This study evaluates a statistical downscaling technique which can then be used for downscaling climatic variables. Two statistical techniques, i.e. quantile regression and copula modeling, are combined in order to produce realistic results for climate variables in the area under study. To reduce the dimension of input data and deal with multicollinearity problems, empirical orthogonal functions will be used. Advantages of this new method are: (1) it is more robust to outliers as compared to ordinary least squares estimates and other estimation methods based on central tendency and dispersion measures; (2) it preserves the dependence among variables and among sites and (3) it can be used to combine different types of distributions. 
This is important in our case because we are dealing with climatic variables having different distributions over different meteorological stations. The proposed model will be calibrated using the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) predictors for the period 1960-1990 and validated for 1990-2000. To investigate the efficiency of the proposed model, it will be compared with the multivariate multiple regression model and with dynamical downscaling climate models by using different climate indices that describe the frequency, intensity and duration of the variables of interest. KEY WORDS: Climate change, Copula, Monsoon, Quantile regression, Spatio-temporal distribution.

  8. Cognition, glucose metabolism and amyloid burden in Alzheimer’s disease

    PubMed Central

    Furst, Ansgar J.; Rabinovici, Gil D.; Rostomian, Ara H.; Steed, Tyler; Alkalay, Adi; Racine, Caroline; Miller, Bruce L.; Jagust, William J.

    2010-01-01

    We investigated relationships between glucose metabolism, amyloid load and measures of cognitive and functional impairment in Alzheimer’s disease (AD). Patients meeting criteria for probable AD underwent [11C]PIB and [18F]FDG PET imaging and were assessed on a set of clinical measures. PIB Distribution volume ratios and FDG scans were spatially normalized and average PIB counts from regions-of-interest (ROI) were used to compute a measure of global PIB uptake. Separate voxel-wise regressions explored local and global relationships between metabolism, amyloid burden and clinical measures. Regressions reflected cognitive domains assessed by individual measures, with visuospatial tests associated with more posterior metabolism, and language tests associated with metabolism in the left hemisphere. Correlating regional FDG uptake with these measures confirmed these findings. In contrast, no correlations were found between either voxel-wise or regional PIB uptake and any of the clinical measures. Finally, there were no associations between regional PIB and FDG uptake. We conclude that regional and global amyloid burden does not correlate with clinical status or glucose metabolism in AD. PMID:20417582

  9. ISC-GEM: Global Instrumental Earthquake Catalogue (1900-2009), III. Re-computed MS and mb, proxy MW, final magnitude composition and completeness assessment

    NASA Astrophysics Data System (ADS)

    Di Giacomo, Domenico; Bondár, István; Storchak, Dmitry A.; Engdahl, E. Robert; Bormann, Peter; Harris, James

    2015-02-01

    This paper outlines the re-computation and compilation of the magnitudes now contained in the final ISC-GEM Reference Global Instrumental Earthquake Catalogue (1900-2009). The catalogue is available via the ISC website (http://www.isc.ac.uk/iscgem/). The available re-computed MS and mb provided an ideal basis for deriving new conversion relationships to moment magnitude MW. Therefore, rather than using previously published regression models, we derived new empirical relationships using both generalized orthogonal linear and exponential non-linear models to obtain MW proxies from MS and mb. The new models were tested against true values of MW, and the newly derived exponential models were then preferred to the linear ones in computing MW proxies. For the final magnitude composition of the ISC-GEM catalogue, we preferred directly measured MW values as published by the Global CMT project for the period 1976-2009 (plus intermediate-depth earthquakes between 1962 and 1975). In addition, over 1000 publications have been examined to obtain direct seismic moment M0 and, therefore, also MW estimates for 967 large earthquakes during 1900-1978 (Lee and Engdahl, 2015) by various alternative methods to the current GCMT procedure. In all other instances we computed MW proxy values by converting our re-computed MS and mb values into MW, using the newly derived non-linear regression models. The final magnitude composition is an improvement in terms of magnitude homogeneity compared to previous catalogues. The magnitude completeness is not homogeneous over the 110 years covered by the ISC-GEM catalogue. Therefore, seismicity rate estimates may be strongly affected without a careful time window selection. In particular, the ISC-GEM catalogue appears to be complete down to MW 5.6 starting from 1964, whereas for the early instrumental period the completeness varies from ∼7.5 to 6.2. 
Further time and resources would be necessary to homogenize the magnitude of completeness over the entire catalogue length.
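The "generalized orthogonal linear" regression mentioned above treats both magnitudes as error-prone, unlike ordinary least squares. A minimal total-least-squares sketch on synthetic MS/MW pairs (the linear relation and error levels are invented, not the ISC-GEM calibration, and the preferred exponential models are not shown):

```python
import numpy as np

def orthogonal_fit(x, y):
    """Total least squares line (errors in both variables), via SVD."""
    X = np.column_stack([x - x.mean(), y - y.mean()])
    # the direction of smallest variance is orthogonal to the fitted line
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    a, b = vt[-1]                           # normal vector of the line
    slope = -a / b
    return slope, y.mean() - slope * x.mean()

rng = np.random.default_rng(5)
mw_true = rng.uniform(5.5, 8.0, 400)
ms = (mw_true - 0.6) / 0.9 + rng.normal(0, 0.15, 400)   # MS with measurement error
mw = mw_true + rng.normal(0, 0.15, 400)                 # MW with measurement error

slope, intercept = orthogonal_fit(ms, mw)
mw_proxy = slope * ms + intercept           # proxy MW from the re-computed MS
```

Ordinary regression of mw on ms would be attenuated by the error in ms; the orthogonal fit recovers the underlying slope when both error variances are comparable.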

  10. LRP-1 polymorphism is associated with global and regional amyloid load in Alzheimer's disease in humans in-vivo

    PubMed Central

    Grimmer, Timo; Goldhardt, Oliver; Guo, Liang-Hao; Yousefi, Behrooz H.; Förster, Stefan; Drzezga, Alexander; Sorg, Christian; Alexopoulos, Panagiotis; Förstl, Hans; Kurz, Alexander; Perneczky, Robert

    2014-01-01

    Objective Impaired amyloid clearance has been proposed to contribute to β-amyloid deposition in sporadic late-onset Alzheimer's disease (AD). Low density lipoprotein receptor-related protein 1 (LRP-1) is involved in the active outward transport of β-amyloid across the blood–brain barrier (BBB). The C667T polymorphism (rs1799986) of the LRP-1 gene has been inconsistently associated with AD in genetic studies. We aimed to elucidate the association of this polymorphism with in-vivo brain amyloid load of AD patients using amyloid PET with [11C]PiB. Materials and methods 72 patients with very mild to moderate AD were examined with amyloid PET and C667T polymorphism was obtained using TaqMan PCR assays. The association of C667T polymorphism with global and regional amyloid load was calculated using linear regression and voxel based analysis, respectively. The effect of the previously identified modulator of amyloid uptake, the apolipoprotein E genotype, on this association was also determined. Results The regression analysis between amyloid load and C667T polymorphism was statistically significant (p = 0.046, β = 0.236). In an additional analysis ApoE genotype and gender were identified to explain further variability of amyloid load. Voxel based analysis revealed a significant (p < 0.05) association between C667T polymorphism and amyloid uptake in the temporo-parietal cortex bilaterally. ApoE did not interact significantly with the LRP-1 polymorphism. Discussion In conclusion, C667T polymorphism of LRP-1 is moderately but significantly associated with global and regional amyloid deposition in AD. The relationship appears to be independent of the ApoE genotype. This finding is compatible with the hypothesis that impaired amyloid clearance contributes to amyloid deposition in late-onset sporadic AD. PMID:24596678

  11. Relationship between altitude and the prevalence of hypertension in Tibet: a systematic review

    PubMed Central

    Mingji, Cuomu; Onakpoya, Igho J; Perera, Rafael; Ward, Alison M; Heneghan, Carl J

    2015-01-01

    Introduction Hypertension is a leading cause of cardiovascular disease, which is the cause of one-third of global deaths and is a primary and rising contributor to the global disease burden. The objective of this systematic review was to determine the prevalence and awareness of hypertension among the inhabitants of Tibet and its association with altitude, using the data from published observational studies. Methods We conducted electronic searches in Medline, Embase, ISI Web of Science and Global Health. No gender or language restrictions were imposed. We assessed the methodological characteristics of included studies using the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) criteria. Two reviewers independently determined the eligibility of studies, assessed the methodology of included studies and extracted the data. We used meta-regression to estimate the degree of change in hypertension prevalence with increasing altitude. Results We identified 22 eligible articles of which eight cross-sectional studies with a total of 16 913 participants were included. The prevalence of hypertension ranged between 23% and 56%. A scatter plot of altitude against overall prevalence revealed a statistically significant correlation (r=0.68; p=0.04). Meta-regression analysis revealed a 2% increase in the prevalence of hypertension with every 100 m increase in altitude (p=0.06). The locations and socioeconomic status of subjects affected the awareness and subsequent treatment and control of hypertension. Conclusions The results from cross-sectional studies suggest that there is a significant correlation between altitude and the prevalence of hypertension among inhabitants of Tibet. The socioeconomic status of the inhabitants can influence awareness and management of hypertension. Very little research into hypertension has been conducted in other prefectures of Tibet where the altitude is much higher. 
Further research examining the impact of altitude on blood pressure is warranted. PMID:25953970
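The meta-regression step above regresses study-level prevalence on altitude, weighting each study by its size. A sketch with hypothetical study data (values invented, loosely patterned on the reported ~2% rise per 100 m):

```python
import numpy as np

# hypothetical study-level data: altitude (m), hypertension prevalence, sample size
altitude = np.array([3000., 3200., 3500., 3700., 4000., 4300., 4500., 4800.])
prevalence = np.array([0.25, 0.28, 0.33, 0.37, 0.42, 0.47, 0.50, 0.56])
n = np.array([900., 1200., 2100., 1500., 3000., 2500., 1800., 1000.])

# weighted least squares: weight each study by its sample size
w = n / n.sum()
A = np.column_stack([np.ones_like(altitude), altitude])
Aw = A * w[:, None]
beta = np.linalg.solve(A.T @ Aw, A.T @ (w * prevalence))
per_100m = beta[1] * 100.0   # change in prevalence per 100 m of altitude
```

A full meta-regression would weight by inverse variance and report a confidence interval; the size weighting here is a simple proxy for that.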

  12. Time-varying trends of global vegetation activity

    NASA Astrophysics Data System (ADS)

    Pan, N.; Feng, X.; Fu, B.

    2016-12-01

    Vegetation plays an important role in regulating the energy exchange, water cycle and biochemical cycle in terrestrial ecosystems. Monitoring the dynamics of vegetation activity and understanding their driving factors have been an important issue in global change research. The Normalized Difference Vegetation Index (NDVI), an indicator of vegetation activity, has been widely used in investigating vegetation changes at regional and global scales. Most studies utilized linear regression or piecewise linear regression approaches to obtain an averaged changing rate over a certain time span, with an implicit assumption that the trend did not change over time during that period. However, no evidence shows that this assumption holds for the non-linear and non-stationary NDVI time series. In this study, we adopted the multidimensional ensemble empirical mode decomposition (MEEMD) method to extract the time-varying trends of NDVI from the original signals without any a priori assumption about their functional form. Our results show that vegetation trends were spatially and temporally non-uniform during 1982-2013. Most of the vegetated area exhibited greening trends in the 1980s. Nevertheless, the area with greening trends decreased over time from the early 1990s, and the greening trends have stalled or even reversed in many places. Regions with browning trends were mainly located in the southern low latitudes in the 1980s; their area decreased before the middle 1990s and then increased at an accelerated rate. The greening-to-browning reversals were widespread across all continents except Oceania (43% of the vegetated areas), most of which happened after the middle 1990s. In contrast, the browning-to-greening reversals occurred over a smaller area and at earlier times. The area with monotonic greening and browning trends accounted for 33% and 5% of the vegetated area, respectively. 
By performing partial correlation analyses between NDVI and climatic elements (temperature, precipitation and cloud cover) and analyzing the MEEMD-extracted trends of these climatic elements, we discussed possible driving factors of the time-varying trends of NDVI in several specific regions where trend reversals occurred.
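MEEMD itself is beyond a short sketch, but the core idea of a time-varying trend, and of detecting a greening-to-browning reversal, can be illustrated with a much simpler rolling-window linear fit on a synthetic NDVI series (break year, rates and noise all invented):

```python
import numpy as np

rng = np.random.default_rng(6)
years = np.arange(1982, 2014)
# synthetic NDVI anomaly: greening until ~1995, browning afterwards, plus noise
ndvi = np.where(years < 1995,
                0.004 * (years - 1982),
                0.004 * 13 - 0.003 * (years - 1995)) + rng.normal(0, 0.005, years.size)

def rolling_slope(t, y, window=11):
    """Local linear trend in a sliding window: a crude time-varying trend."""
    half = window // 2
    slopes = []
    for i in range(half, len(y) - half):
        tt, yy = t[i - half:i + half + 1], y[i - half:i + half + 1]
        slopes.append(np.polyfit(tt, yy, 1)[0])
    return np.array(slopes)

slopes = rolling_slope(years.astype(float), ndvi)
# a single OLS line over 1982-2013 would average these opposing trends away
```

A single linear fit over the whole record would report a near-zero "trend" for this series, which is exactly the pitfall the MEEMD approach is meant to avoid.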

  13. The local and global climate forcings induced inhomogeneity of Indian rainfall.

    PubMed

    Nair, P J; Chakraborty, A; Varikoden, H; Francis, P A; Kuttippurath, J

    2018-04-16

    India is home to more than a billion people, and its economy is largely based on an agrarian society. Therefore, the rainfall received not only decides livelihoods, but also influences the country's water security and economy. This situation warrants continuous surveillance and analysis of Indian rainfall. These kinds of studies would also help forecasters to better tune their models for accurate weather prediction. Here, we introduce a new method for estimating variability and trends in rainfall over different climate regions of India. The method, based on multiple linear regression, helps to assess contributions of different remote and local climate forcings to seasonal and regional inhomogeneity in rainfall. We show that Indian Summer Monsoon Rainfall (ISMR) variability is governed by the Eastern and Central Pacific El Niño Southern Oscillation, equatorial zonal winds, the Atlantic zonal mode and surface temperatures of the Arabian Sea and Bay of Bengal, and that North East Monsoon Rainfall variability is controlled by the sea surface temperature of the North Atlantic and extratropical oceans. Also, our analyses reveal significant positive trends (0.43 mm/day/decade) in the North West for ISMR in the 1979-2017 period. This study draws attention to significant changes in Indian rainfall from the perspective of global climate change.
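A multiple-linear-regression attribution of this kind amounts to regressing the rainfall series on standardized climate indices plus a trend term and reading off the coefficients. A sketch on synthetic data (the index names and coefficients below are invented stand-ins, not the paper's regressors):

```python
import numpy as np

rng = np.random.default_rng(7)
n_years = 60
enso = rng.normal(0, 1, n_years)      # hypothetical ENSO index (standardized)
atl = rng.normal(0, 1, n_years)       # hypothetical Atlantic zonal mode index
sst = rng.normal(0, 1, n_years)       # hypothetical regional SST anomaly
trend = np.linspace(-1.0, 1.0, n_years)
# synthetic seasonal rainfall anomaly, driven mostly by ENSO
rain = -0.8 * enso + 0.3 * atl + 0.1 * sst + 0.2 * trend + rng.normal(0, 0.3, n_years)

X = np.column_stack([np.ones(n_years), enso, atl, sst, trend])
coef, *_ = np.linalg.lstsq(X, rain, rcond=None)
contributions = dict(zip(["intercept", "enso", "atl", "sst", "trend"], coef))
```

With standardized regressors, each coefficient is directly comparable as that forcing's contribution to rainfall variability; the trend coefficient plays the role of the long-term change reported above.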

  14. QSAR studies of the bioactivity of hepatitis C virus (HCV) NS3/4A protease inhibitors by multiple linear regression (MLR) and support vector machine (SVM).

    PubMed

    Qin, Zijian; Wang, Maolin; Yan, Aixia

    2017-07-01

    In this study, quantitative structure-activity relationship (QSAR) models using various descriptor sets and training/test set selection methods were explored to predict the bioactivity of hepatitis C virus (HCV) NS3/4A protease inhibitors, using a multiple linear regression (MLR) and a support vector machine (SVM) method. A dataset was built from 512 HCV NS3/4A protease inhibitors and their IC50 values, all determined by the same FRET assay and collected from the reported literature. All inhibitors were represented by nine selected global descriptors and 12 2D property-weighted autocorrelation descriptors calculated with the program CORINA Symphony. The dataset was divided into a training set and a test set by a random method and by a Kohonen self-organizing map (SOM) method. The correlation coefficients (r²) of the training and test sets were 0.75 and 0.72 for the best MLR model, and 0.87 and 0.85 for the best SVM model, respectively. In addition, a series of sub-dataset models was also developed; every best sub-dataset model outperformed the corresponding whole-dataset model. We believe that the combination of the best sub-dataset and whole-dataset SVM models can serve as a reliable lead-design tool for new NS3/4A protease inhibitor scaffolds in a drug discovery pipeline. Copyright © 2017 Elsevier Ltd. All rights reserved.
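
    As a sketch of the MLR half of such a workflow, the following fits a least-squares model on a synthetic descriptor matrix and reports r² on a random training/test split. The CORINA Symphony descriptors and the 512-inhibitor dataset are not reproduced; all sizes and values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for a descriptor matrix (n compounds x d descriptors)
# and pIC50-like activities; not the paper's actual data.
n, d = 200, 9
X = rng.normal(size=(n, d))
true_w = rng.normal(size=d)
y = X @ true_w + rng.normal(scale=0.5, size=n)

# Random training/test split (the study also used a Kohonen SOM split)
idx = rng.permutation(n)
train, test = idx[:150], idx[150:]

# Multiple linear regression via least squares (with intercept)
A = np.column_stack([np.ones(len(train)), X[train]])
coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)

def r2(X_, y_):
    pred = np.column_stack([np.ones(len(X_)), X_]) @ coef
    ss_res = ((y_ - pred) ** 2).sum()
    ss_tot = ((y_ - y_.mean()) ** 2).sum()
    return 1 - ss_res / ss_tot

r2_train, r2_test = r2(X[train], y[train]), r2(X[test], y[test])
```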

  15. Hypothesis Testing Using Factor Score Regression: A Comparison of Four Methods

    ERIC Educational Resources Information Center

    Devlieger, Ines; Mayer, Axel; Rosseel, Yves

    2016-01-01

    In this article, an overview is given of four methods to perform factor score regression (FSR), namely regression FSR, Bartlett FSR, the bias avoiding method of Skrondal and Laake, and the bias correcting method of Croon. The bias correcting method is extended to include a reliable standard error. The four methods are compared with each other and…

  16. Regression trees for predicting mortality in patients with cardiovascular disease: What improvement is achieved by using ensemble-based methods?

    PubMed Central

    Austin, Peter C; Lee, Douglas S; Steyerberg, Ewout W; Tu, Jack V

    2012-01-01

    In biomedical research, the logistic regression model is the most commonly used method for predicting the probability of a binary outcome. While many clinical researchers have expressed an enthusiasm for regression trees, this method may have limited accuracy for predicting health outcomes. We aimed to evaluate the improvement that is achieved by using ensemble-based methods, including bootstrap aggregation (bagging) of regression trees, random forests, and boosted regression trees. We analyzed 30-day mortality in two large cohorts of patients hospitalized with either acute myocardial infarction (N = 16,230) or congestive heart failure (N = 15,848) in two distinct eras (1999–2001 and 2004–2005). We found that both the in-sample and out-of-sample prediction of ensemble methods offered substantial improvement in predicting cardiovascular mortality compared to conventional regression trees. However, conventional logistic regression models that incorporated restricted cubic smoothing splines had even better performance. We conclude that ensemble methods from the data mining and machine learning literature increase the predictive performance of regression trees, but may not lead to clear advantages over conventional logistic regression models for predicting short-term mortality in population-based samples of subjects with cardiovascular disease. PMID:22777999
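
    Bootstrap aggregation as evaluated here is straightforward to sketch: fit many trees on bootstrap resamples and average their predictions. A toy version with depth-1 regression stumps on synthetic binary-outcome data follows (not the study's clinical cohorts, and far shallower trees than a real forest):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary outcome data (illustrative only)
n = 400
X = rng.normal(size=(n, 3))
p = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 1.2 * X[:, 1])))
y = (rng.uniform(size=n) < p).astype(float)

def fit_stump(X, y):
    """Depth-1 regression tree: best single split minimizing SSE."""
    best = (np.inf, 0, 0.0, y.mean(), y.mean())
    for j in range(X.shape[1]):
        for t in np.quantile(X[:, j], [0.25, 0.5, 0.75]):
            left = X[:, j] <= t
            if left.all() or (~left).all():
                continue
            ml, mr = y[left].mean(), y[~left].mean()
            sse = ((y[left] - ml) ** 2).sum() + ((y[~left] - mr) ** 2).sum()
            if sse < best[0]:
                best = (sse, j, t, ml, mr)
    return best[1:]

def predict_stump(stump, X):
    j, t, ml, mr = stump
    return np.where(X[:, j] <= t, ml, mr)

# Bootstrap aggregation (bagging): average many stumps fit on resamples
stumps = []
for _ in range(50):
    idx = rng.integers(0, n, n)
    stumps.append(fit_stump(X[idx], y[idx]))
bagged = np.mean([predict_stump(s, X) for s in stumps], axis=0)
```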

  17. Self-organising mixture autoregressive model for non-stationary time series modelling.

    PubMed

    Ni, He; Yin, Hujun

    2008-12-01

    Modelling non-stationary time series has been a difficult task for both parametric and nonparametric methods. One promising solution is to combine the flexibility of nonparametric models with the simplicity of parametric models. In this paper, the self-organising mixture autoregressive (SOMAR) network is adopted as such a mixture model. It breaks a time series into underlying segments and at the same time fits local linear autoregressive models to the clusters of segments. In this way, a global non-stationary time series is represented by a dynamic set of local linear autoregressive models. Neural gas is used for a more flexible structure of the mixture model. Furthermore, a new similarity measure is introduced in the self-organising network to better quantify the similarity of time series segments. The network can be used naturally in modelling and forecasting non-stationary time series. Experiments on artificial benchmark time series (e.g. Mackey-Glass) and real-world data (e.g. sunspot numbers and Forex rates) are presented, and the results show that the proposed SOMAR network is effective and superior to other similar approaches.

  18. Determination of monthly mean humidity in the atmospheric surface layer over oceans from satellite data

    NASA Technical Reports Server (NTRS)

    Liu, W. T.; Niiler, P. P.

    1984-01-01

    A simple statistical technique is described to determine monthly mean marine surface-layer humidity, which is essential in the specification of surface latent heat flux, from total water vapor in the atmospheric column measured by space-borne sensors. Good correlation between the two quantities was found in examining the humidity soundings from radiosonde reports of mid-ocean island stations and weather ships. The relation agrees with that obtained from satellite (Seasat) data and ship reports averaged over 2 deg areas and a 92-day period in the North Atlantic and in the tropical Pacific. The results demonstrate that, by using a local regression in the tropical Pacific, total water vapor can be used to determine monthly mean surface layer humidity to an accuracy of 0.4 g/kg. With a global regression, determination to an accuracy of 0.8 g/kg is possible. These accuracies correspond to approximately 10 to 20 W/sq m in the determination of latent heat flux with the bulk parameterization method, provided that other required parameters are known.
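
    The local-versus-global regression trade-off described here can be illustrated with a toy calculation: when two regions obey different linear vapor-humidity relationships, pooling them into one global fit inflates the error relative to per-region fits. All coefficients below are invented for illustration, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in: total column water vapor vs surface-layer humidity,
# with region-dependent linear relationships.
def make_region(slope, intercept, n=300):
    w = rng.uniform(20, 60, n)          # column water vapor (illustrative)
    q = slope * w + intercept + rng.normal(scale=0.3, size=n)
    return w, q

w1, q1 = make_region(0.30, 2.0)   # one regional regime
w2, q2 = make_region(0.22, 4.0)   # a different regime

def fit_line(w, q):
    A = np.column_stack([w, np.ones_like(w)])
    return np.linalg.lstsq(A, q, rcond=None)[0]

def rmse(w, q, coef):
    return np.sqrt(np.mean((q - (coef[0] * w + coef[1])) ** 2))

# Global regression pools both regions; local regression fits each separately
cg = fit_line(np.r_[w1, w2], np.r_[q1, q2])
c1, c2 = fit_line(w1, q1), fit_line(w2, q2)

rmse_global = rmse(np.r_[w1, w2], np.r_[q1, q2], cg)
rmse_local = 0.5 * (rmse(w1, q1, c1) + rmse(w2, q2, c2))
```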

  19. Methods to detect, characterize, and remove motion artifact in resting state fMRI

    PubMed Central

    Power, Jonathan D; Mitra, Anish; Laumann, Timothy O; Snyder, Abraham Z; Schlaggar, Bradley L; Petersen, Steven E

    2013-01-01

    Head motion systematically alters correlations in resting state functional connectivity fMRI (RSFC). In this report we examine the impact of motion on signal intensity and RSFC correlations. We find that motion-induced signal changes (1) are often complex and variable waveforms, (2) are often shared across nearly all brain voxels, and (3) often persist more than 10 seconds after motion ceases. These signal changes, both during and after motion, increase observed RSFC correlations in a distance-dependent manner. Motion-related signal changes are not removed by a variety of motion-based regressors, but are effectively reduced by global signal regression. We link several measures of data quality to motion, changes in signal intensity, and changes in RSFC correlations. We demonstrate that improvements in data quality measures during processing may represent cosmetic improvements rather than true correction of the data. We demonstrate a within-subject artifact removal strategy based on volume censoring that reduces group differences due to motion to chance levels. We note conditions under which group-level regressions do and do not correct motion-related effects. PMID:23994314
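
    The global signal regression step credited above with reducing motion artifact is, computationally, an ordinary least-squares projection: each voxel's time course is regressed on the global mean time course and replaced by the residual. A minimal sketch on toy data (dimensions and signal model are illustrative, not an fMRI pipeline):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy data: 50 "voxel" time series of length 120 sharing a common
# global fluctuation plus voxel-specific noise.
t = 120
g = np.cumsum(rng.normal(size=t))            # shared global fluctuation
V = 0.8 * g + rng.normal(size=(50, t))       # broadcast into every voxel

# Global signal regression: regress each voxel's (demeaned) time course
# on the demeaned global mean and keep the residuals.
gm = V.mean(axis=0)
gm = gm - gm.mean()
Vd = V - V.mean(axis=1, keepdims=True)
beta = Vd @ gm / (gm @ gm)                   # per-voxel regression weight
V_clean = Vd - np.outer(beta, gm)
```

    After this projection the residuals are exactly orthogonal to the global mean, which is why strong shared fluctuations (motion-related or otherwise) no longer inflate voxel-to-voxel correlations.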

  20. A Short-Term and High-Resolution System Load Forecasting Approach Using Support Vector Regression with Hybrid Parameters Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Huaiguang

    This work proposes an approach for distribution system load forecasting, which aims to provide highly accurate short-term load forecasting with high resolution utilizing a support vector regression (SVR) based forecaster and a two-step hybrid parameters optimization method. Specifically, because the load profiles in distribution systems contain abrupt deviations, a data normalization is designed as the pretreatment for the collected historical load data. Then an SVR model is trained by the load data to forecast the future load. For better performance of SVR, a two-step hybrid optimization algorithm is proposed to determine the best parameters. In the first step of the hybrid optimization algorithm, a designed grid traverse algorithm (GTA) is used to narrow the parameters searching area from a global to local space. In the second step, based on the result of the GTA, particle swarm optimization (PSO) is used to determine the best parameters in the local parameter space. After the best parameters are determined, the SVR model is used to forecast the short-term load deviation in the distribution system.
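
    The two-step coarse-to-fine idea can be sketched compactly. The toy below tunes a single regularization parameter, substituting ridge regression for SVR and a refined grid for PSO so it stays dependency-free; data and grids are invented for illustration, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic regression problem with a small validation set
n, d = 120, 30
X = rng.normal(size=(n, d))
w = np.zeros(d); w[:5] = rng.normal(size=5)
y = X @ w + rng.normal(scale=1.0, size=n)
Xtr, ytr, Xva, yva = X[:80], y[:80], X[80:], y[80:]

def val_rmse(lam):
    """Validation RMSE of ridge regression with penalty lam."""
    coef = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(d), Xtr.T @ ytr)
    return np.sqrt(np.mean((yva - Xva @ coef) ** 2))

# Step 1: coarse "grid traverse" over several orders of magnitude,
# narrowing the search from a global to a local space
coarse = 10.0 ** np.arange(-3, 4)
lam0 = min(coarse, key=val_rmse)

# Step 2: fine search in the neighbourhood found above (a refined grid
# here, where the paper uses particle swarm optimization)
fine = lam0 * 10.0 ** np.linspace(-1, 1, 21)
best_lam = min(fine, key=val_rmse)
```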

  1. Linear regression analysis for comparing two measurers or methods of measurement: but which regression?

    PubMed

    Ludbrook, John

    2010-07-01

    1. There are two reasons for wanting to compare measurers or methods of measurement. One is to calibrate one method or measurer against another; the other is to detect bias. Fixed bias is present when one method gives higher (or lower) values across the whole range of measurement. Proportional bias is present when one method gives values that diverge progressively from those of the other. 2. Linear regression analysis is a popular method for comparing methods of measurement, but the familiar ordinary least squares (OLS) method is rarely acceptable. The OLS method requires that the x values are fixed by the design of the study, whereas it is usual that both y and x values are free to vary and are subject to error. In this case, special regression techniques must be used. 3. Clinical chemists favour techniques such as major axis regression ('Deming's method'), the Passing-Bablok method or the bivariate least median squares method. Other disciplines, such as allometry, astronomy, biology, econometrics, fisheries research, genetics, geology, physics and sports science, have their own preferences. 4. Many Monte Carlo simulations have been performed to try to decide which technique is best, but the results are almost uninterpretable. 5. I suggest that pharmacologists and physiologists should use ordinary least products regression analysis (geometric mean regression, reduced major axis regression): it is versatile, can be used for calibration or to detect bias and can be executed by hand-held calculator or by using the loss function in popular, general-purpose, statistical software.
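
    The ordinary least products (geometric mean, reduced major axis) slope recommended above is simple to compute by hand or in code: it is sign(r) times the ratio of standard deviations, with the line forced through the means. A minimal sketch on synthetic data (values are illustrative), contrasting it with the OLS slope, which is attenuated when x carries measurement error:

```python
import numpy as np

def olp_regression(x, y):
    """Ordinary least products (geometric mean / reduced major axis)
    regression: slope = sign(r) * sd(y)/sd(x); the line passes through
    the means. Symmetric in x and y errors, unlike OLS of y on x."""
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * y.std(ddof=1) / x.std(ddof=1)
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

# Minimal check on data where both variables carry error
rng = np.random.default_rng(0)
true_x = rng.uniform(0, 10, 200)
x = true_x + rng.normal(scale=0.5, size=200)   # measurement error in x
y = 2.0 * true_x + 1.0 + rng.normal(scale=0.5, size=200)

slope_olp, icept_olp = olp_regression(x, y)
slope_ols = np.polyfit(x, y, 1)[0]             # OLS attenuates the slope
```

    Because the OLS slope equals r·sd(y)/sd(x) and |r| < 1 whenever there is error in x, OLS is biased toward zero here while the least products slope is not, which is the proportional-bias issue the article warns about.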

  2. Feasibility Analysis of DEM Differential Method on Tree Height Assessment with Terra-SAR/TanDEM-X Data

    NASA Astrophysics Data System (ADS)

    Zhang, Wangfei; Chen, Erxue; Li, Zengyuan; Feng, Qi; Zhao, Lei

    2016-08-01

    The DEM differential method is an effective and efficient way to assess forest tree height with polarimetric-interferometric technology; however, its accuracy depends on the accuracy of the interferometric results and of the DEM. Terra-SAR/TanDEM-X, the first spaceborne bistatic interferometer, can provide highly accurate cross-track interferometric images globally, without inherent limitations such as temporal decorrelation and atmospheric disturbance. These characteristics give Terra-SAR/TanDEM-X great potential for global or regional tree height assessment, which has been constrained by temporal decorrelation in traditional repeat-pass interferometry. Currently, in China, it is costly to collect highly accurate DEMs with lidar, and it is also difficult to obtain truly representative ground survey samples to test and verify assessment results. In this paper, we analyzed the feasibility of using Terra-SAR/TanDEM-X data to assess forest tree height with freely available DEM data such as ASTER-GDEM and archived ground in-situ data such as forest management inventory (FMI) data. First, the accuracies of ASTER-GDEM and the FMI data were assessed against the DEM and canopy height model (CHM) extracted from lidar data. The results show that the average elevation RMSE between ASTER-GDEM and the lidar DEM is about 13 m, but the two are highly correlated (correlation coefficient 0.96); with a linear regression model, ASTER-GDEM can be compensated and its accuracy improved to nearly that of the lidar DEM at the same scale. The correlation coefficient between FMI and CHM is 0.40, and its accuracy can likewise be improved by a linear regression model within 95% confidence intervals. After compensation of ASTER-GDEM and FMI, we calculated tree height at the Mengla test site with the DEM differential method. The results showed that the corrected ASTER-GDEM effectively improves assessment accuracy: the average assessment accuracy before and after correction is 0.73 and 0.76, with RMSEs of 5.5 m and 4.4 m, respectively.

  3. Boiling points of halogenated aliphatic compounds: a quantitative structure-property relationship for prediction and validation.

    PubMed

    Oberg, Tomas

    2004-01-01

    Halogenated aliphatic compounds have many technical uses, but substances within this group are also ubiquitous environmental pollutants that can affect the ozone layer and contribute to global warming. The establishment of quantitative structure-property relationships is of interest not only to fill in gaps in the available database but also to validate experimental data already acquired. The three-dimensional structures of 240 compounds were modeled with molecular mechanics prior to the generation of empirical descriptors. Two bilinear projection methods, principal component analysis (PCA) and partial-least-squares regression (PLSR), were used to identify outliers. PLSR was subsequently used to build a multivariate calibration model by extracting the latent variables that describe most of the covariation between the molecular structure and the boiling point. Boiling points were also estimated with an extension of the group contribution method of Stein and Brown.

  4. Age-related differences in reaction time task performance in young children.

    PubMed

    Kiselev, Sergey; Espy, Kimberly Andrews; Sheffield, Tiffany

    2009-02-01

    Performance of reaction time (RT) tasks was investigated in young children and adults to test the hypothesis that age-related differences in processing speed supersede a "global" mechanism and are a function of specific differences in task demands and processing requirements. The sample consisted of 54 4-year-olds, 53 5-year-olds, 59 6-year-olds, and 35 adults from Russia. Using the regression approach pioneered by Brinley and the transformation method proposed by Madden and colleagues and Ridderinkhoff and van der Molen, age-related differences in processing speed differed among RT tasks with varying demands. In particular, RTs differed between children and adults on tasks that required response suppression, discrimination of color or spatial orientation, reversal of contingencies of previously learned stimulus-response rules, and greater stimulus-response complexity. Relative costs of these RT task differences were larger than predicted by the global difference hypothesis except for response suppression. Among young children, age-related differences larger than predicted by the global difference hypothesis were evident when tasks required color or spatial orientation discrimination and stimulus-response rule complexity, but not for response suppression or reversal of stimulus-response contingencies. Process-specific, age-related differences in processing speed that support heterochronicity of brain development during childhood were revealed.

  5. Relationship of Soft Drink Consumption to Global Overweight, Obesity, and Diabetes: A Cross-National Analysis of 75 Countries

    PubMed Central

    McKee, Martin; Galea, Gauden; Stuckler, David

    2013-01-01

    Objectives. We estimated the relationship between soft drink consumption and obesity and diabetes worldwide. Methods. We used multivariate linear regression to estimate the association between soft drink consumption and overweight, obesity, and diabetes prevalence in 75 countries, controlling for other foods (cereals, meats, fruits and vegetables, oils, and total calories), income, urbanization, and aging. Data were obtained from the Euromonitor Global Market Information Database, the World Health Organization, and the International Diabetes Federation. Bottled water consumption, which increased with per-capita income in parallel to soft drink consumption, served as a natural control group. Results. Soft drink consumption increased globally from 9.5 gallons per person per year in 1997 to 11.4 gallons in 2010. A 1% rise in soft drink consumption was associated with an additional 4.8 overweight adults per 100 (adjusted B; 95% confidence interval [CI] = 3.1, 6.5), 2.3 obese adults per 100 (95% CI = 1.1, 3.5), and 0.3 adults with diabetes per 100 (95% CI = 0.1, 0.8). These findings remained robust in low- and middle-income countries. Conclusions. Soft drink consumption is significantly linked to overweight, obesity, and diabetes worldwide, including in low- and middle-income countries. PMID:23488503

  6. SELECTION OF ENDOCRINOLOGY SUBSPECIALTY TRAINEES: WHICH APPLICANT CHARACTERISTICS ARE ASSOCIATED WITH PERFORMANCE DURING FELLOWSHIP TRAINING?

    PubMed Central

    Natt, Neena; Chang, Alice Y.; Berbari, Elie F.; Kennel, Kurt A.; Kearns, Ann E.

    2016-01-01

    Objective To determine which residency characteristics are associated with performance during endocrinology fellowship training as measured by competency-based faculty evaluation scores and faculty global ratings of trainee performance. Method We performed a retrospective review of interview applications from endocrinology fellows who graduated from a single academic institution between 2006 and 2013. Performance measures included competency-based faculty evaluation scores and faculty global ratings. The association between applicant characteristics and measures of performance during fellowship was examined by linear regression. Results The presence of a laudatory comparative statement in the residency program director’s letter of recommendation (LoR) or experience as a chief resident was significantly associated with competency-based faculty evaluation scores (β = 0.22, P = 0.001; and β = 0.24, P = 0.009, respectively) and faculty global ratings (β = 0.85, P = 0.006; and β = 0.96, P = 0.015, respectively). Conclusion The presence of a laudatory comparative statement in the residency program director’s LoR or experience as a chief resident were significantly associated with overall performance during subspecialty fellowship training. Future studies are needed in other cohorts to determine the broader implications of these findings in the application and selection process. PMID:26437219

  7. Socioeconomic Impact on the Prevalence of Cardiovascular Risk Factors in Wallonia, Belgium: A Population-Based Study.

    PubMed

    Streel, Sylvie; Donneau, Anne-Françoise; Hoge, Axelle; Majerus, Sven; Kolh, Philippe; Chapelle, Jean-Paul; Albert, Adelin; Guillaume, Michèle

    2015-01-01

    Background. Monitoring the epidemiology of cardiovascular risk factors (CRFs) and their determinants is important to develop appropriate recommendations to prevent cardiovascular diseases in specific risk groups. The NESCaV study was designed to collect standardized data to estimate the prevalence of CRFs in relation to socioeconomic parameters among the general adult population in the province of Liège, Wallonia, Belgium. Methods. A representative stratified random sample of 1017 subjects, aged 20-69 years, participated in the NESCaV study (2010-2012). A self-administered questionnaire, a clinical examination, and laboratory tests were performed on participants. CRFs included hypertension, dyslipidemia, global obesity, abdominal obesity, diabetes, current smoking, and physical inactivity. Covariates were education and subjective and objective socioeconomic levels. Data were analyzed by weighted logistic regression. Results. The prevalence of hypertension, abdominal obesity, global obesity, current smoking, and physical inactivity was higher in subjects with low education and who considered themselves "financially in need." Living below poverty threshold also increased the risk of global and abdominal obesity, current smoking, and physical inactivity. Conclusion. The study shows that socioeconomic factors impact the prevalence of CRFs in the adult population of Wallonia. Current public health policies should be adjusted to reduce health inequalities in specific risk groups.

  8. Piece-wise quadratic approximations of arbitrary error functions for fast and robust machine learning.

    PubMed

    Gorban, A N; Mirkes, E M; Zinovyev, A

    2016-12-01

    Most machine learning approaches stem from applying the principle of minimizing the mean squared distance, based on computationally efficient quadratic optimization methods. However, when faced with high-dimensional and noisy data, quadratic error functionals demonstrate many weaknesses, including high sensitivity to contaminating factors and the curse of dimensionality. Therefore, many recent applications in machine learning have exploited properties of non-quadratic error functionals based on the L1 norm or even sub-linear potentials corresponding to quasinorms Lp (0 < p < 1)…

  9. Trend Estimation and Regression Analysis in Climatological Time Series: An Application of Structural Time Series Models and the Kalman Filter.

    NASA Astrophysics Data System (ADS)

    Visser, H.; Molenaar, J.

    1995-05-01

    The detection of trends in climatological data has become central to the discussion on climate change due to the enhanced greenhouse effect. To prove detection, a method is needed (i) to make inferences on significant rises or declines in trends, (ii) to take into account natural variability in climate series, and (iii) to compare output from GCMs with the trends in observed climate data. To meet these requirements, flexible mathematical tools are needed. A structural time series model is proposed with which a stochastic trend, a deterministic trend, and regression coefficients can be estimated simultaneously. The stochastic trend component is described using the class of ARIMA models. The regression component is assumed to be linear; however, the regression coefficients corresponding to the explanatory variables may be allowed to vary over time in order to validate this assumption. The mathematical technique used to estimate this trend-regression model is the Kalman filter. The main features of the filter are discussed. Examples of trend estimation are given using annual mean temperatures at a single station in the Netherlands (1706-1990) and annual mean temperatures at Northern Hemisphere land stations (1851-1990). The inclusion of explanatory variables is shown by regressing the latter temperature series on four variables: Southern Oscillation index (SOI), volcanic dust index (VDI), sunspot numbers (SSN), and a simulated temperature signal induced by increasing greenhouse gases (GHG). In all analyses, the influence of SSN on global temperatures is found to be negligible. The correlations between temperatures and SOI and VDI appear to be negative. For SOI, this correlation is significant, but for VDI it is not, probably because of a lack of volcanic eruptions during the sample period. The relation between temperatures and GHG is positive, which is in agreement with the hypothesis of a warming climate because of increasing levels of greenhouse gases. 
The prediction performance of the model is rather poor, and possible explanations are discussed.
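
    The trend component of such a structural model is estimated recursively. A minimal sketch of the simplest case, a scalar local-level (random walk plus noise) Kalman filter on synthetic data, is shown below; the noise variances are assumed known here rather than estimated by maximum likelihood, and the paper's full trend-regression model with ARIMA trend is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic series: a slowly wandering true level observed with noise
n = 300
level = np.cumsum(rng.normal(scale=0.1, size=n))   # true stochastic trend
obs = level + rng.normal(scale=1.0, size=n)        # noisy observations

q_var, r_var = 0.1 ** 2, 1.0 ** 2                  # state / observation noise
m, p = 0.0, 10.0                                   # prior mean and variance
filtered = np.empty(n)
for t in range(n):
    p = p + q_var                                  # predict: level may drift
    k = p / (p + r_var)                            # Kalman gain
    m = m + k * (obs[t] - m)                       # update with innovation
    p = (1 - k) * p
    filtered[t] = m

rmse_raw = np.sqrt(np.mean((obs - level) ** 2))
rmse_filt = np.sqrt(np.mean((filtered - level) ** 2))
```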

  10. Disability weights from a household survey in a low socio-economic setting: how does it compare to the global burden of disease 2010 study?

    PubMed

    Neethling, Ian; Jelsma, Jennifer; Ramma, Lebogang; Schneider, Helen; Bradshaw, Debbie

    2016-01-01

    The global burden of disease (GBD) 2010 study used a universal set of disability weights to estimate disability adjusted life years (DALYs) by country. However, it is not clear whether these weights can be applied universally in calculating DALYs to inform local decision-making. This study derived disability weights for a resource-constrained community in Cape Town, South Africa, and interrogated whether the GBD 2010 disability weights necessarily represent the preferences of economically disadvantaged communities. A household survey was conducted in Lavender Hill, Cape Town, to assess the health state preferences of the general public. The responses from a paired comparison valuation method were assessed using a probit regression. The probit coefficients were anchored onto the 0 to 1 disability weight scale by running a lowess regression on the GBD 2010 disability weights and interpolating the coefficients between the upper and lower limit of the smoothed disability weights. Heroin and opioid dependence had the highest disability weight of 0.630, whereas intellectual disability had the lowest (0.040). Untreated injuries ranked higher than severe mental disorders. There were some counterintuitive results, such as moderate (15th) and severe vision impairment (16th) ranking higher than blindness (20th). A moderate correlation between the disability weights of the local study and those of the GBD 2010 study was observed (R² = 0.440, p < 0.05). This indicates that there was a relationship, although some conditions, such as untreated fracture of the radius or ulna, showed large variability in disability weights (0.488 in local study and 0.043 in GBD 2010). Respondents seemed to value physical mobility higher than cognitive functioning, which is in contrast to the GBD 2010 study. This study shows that not all health state preferences are universal. 
Studies estimating DALYs need to derive local disability weights using methods that are less cognitively demanding for respondents.

  11. Using a time-series statistical framework to quantify trends and abrupt change in US corn, soybean, and wheat yields from 1970-2016

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Ives, A. R.; Turner, M. G.; Kucharik, C. J.

    2017-12-01

    Previous studies have identified global agricultural regions where "stagnation" of long-term crop yield increases has occurred. These studies have used a variety of simple statistical methods that often ignore important aspects of time series regression modeling. These methods can lead to differing and contradictory results, which creates uncertainty regarding food security given rapid global population growth. Here, we present a new statistical framework incorporating time series-based algorithms into standard regression models to quantify spatiotemporal yield trends of US maize, soybean, and winter wheat from 1970-2016. Our primary goal was to quantify spatial differences in yield trends for these three crops using USDA county-level data. This information was used to identify regions experiencing the largest changes in the rate of yield increase over time, and to determine whether abrupt shifts in the rate of yield increase have occurred. Although crop yields continue to increase in most maize-, soybean-, and winter wheat-growing areas, yield increases have stagnated in some key agricultural regions during the most recent 15 to 16 years: some maize-growing areas, other than the northern Great Plains, have shown a significant trend toward smaller annual yield increases; soybean has maintained consistent long-term yield gains in the northern Great Plains, the Midwest, and the southeast US, but has shifted to smaller annual increases in other regions; and winter wheat maintained a moderate annual increase in eastern South Dakota and eastern US locations, but showed a decline in the magnitude of annual increases across the central Great Plains and western US regions. Our results suggest that there were abrupt shifts in the rate of annual yield increases in a variety of US regions among the three crops. The framework presented here can be broadly applied to additional yield trend analyses for different crops and regions of the Earth.
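
    One way to formalize an "abrupt shift in the rate of yield increase" is a continuous two-segment ("broken stick") regression with an estimated breakpoint: fit the hinge model at every candidate break year and keep the break minimizing the residual sum of squares. A sketch on synthetic yields with a built-in slowdown (the USDA data and the paper's exact time-series algorithms are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic annual yields: steady gains, then a slowdown after index 30
years = np.arange(47)                       # stand-in for 1970-2016
yields = np.where(years < 30,
                  2.0 + 0.12 * years,
                  2.0 + 0.12 * 30 + 0.03 * (years - 30))
yields = yields + rng.normal(scale=0.05, size=years.size)

def sse_at(b):
    """Residual sum of squares of a continuous two-segment fit at break b."""
    # basis: intercept, year, and hinge term max(0, year - b)
    A = np.column_stack([np.ones_like(years), years,
                         np.maximum(0, years - b)])
    coef, *_ = np.linalg.lstsq(A, yields, rcond=None)
    return ((yields - A @ coef) ** 2).sum()

candidates = range(5, 42)
best_break = min(candidates, key=sse_at)
```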

  12. Optimizing and Interpreting Insular Functional Connectivity Maps Obtained During Acute Experimental Pain: The Effects of Global Signal and Task Paradigm Regression.

    PubMed

    Ibinson, James W; Vogt, Keith M; Taylor, Kevin B; Dua, Shiv B; Becker, Christopher J; Loggia, Marco; Wasan, Ajay D

    2015-12-01

    The insula is uniquely located between the temporal and parietal cortices, making it anatomically well-positioned to act as an integrating center between the sensory and affective domains for the processing of painful stimulation. This can be studied through resting-state functional connectivity (fcMRI) imaging; however, the lack of a clear methodology for the analysis of fcMRI complicates the interpretation of these data during acute pain. Detected connectivity changes may reflect actual alterations in low-frequency synchronous neuronal activity related to pain, or may instead be due to changes in global cerebral blood flow or to superimposed task-induced neuronal activity. The primary goal of this study was to investigate the effects of global signal regression (GSR) and task paradigm regression (TPR) on the changes in functional connectivity of the left (contralateral) insula in healthy subjects at rest and during acute painful electric nerve stimulation of the right hand. The use of GSR reduced the size and statistical significance of connectivity clusters and created negative correlation coefficients for some connectivity clusters. TPR with cyclic stimulation gave task-versus-rest connectivity differences similar to those with a constant task, suggesting that analysis including TPR more accurately reflects low-frequency neuronal activity. Both GSR and TPR have been inconsistently applied in fcMRI analysis. Based on these results, investigators need to consider the impact of GSR and TPR on connectivity during task performance when attempting to synthesize the literature.

  13. Gender-specific estimates of COPD prevalence: a systematic review and meta-analysis.

    PubMed

    Ntritsos, Georgios; Franek, Jacob; Belbasis, Lazaros; Christou, Maria A; Markozannes, Georgios; Altman, Pablo; Fogel, Robert; Sayre, Tobias; Ntzani, Evangelia E; Evangelou, Evangelos

    2018-01-01

    COPD has been perceived as being a disease of older men. However, >7 million women are estimated to live with COPD in the USA alone. Despite a growing body of literature suggesting an increasing burden of COPD in women, the evidence is limited. To assess and synthesize the available evidence among population-based epidemiologic studies and calculate the global prevalence of COPD in men and women. A systematic review and meta-analysis reporting gender-specific prevalence of COPD was undertaken. Gender-specific prevalence estimates were abstracted from relevant studies. Associated patient characteristics as well as custom variables pertaining to the diagnostic method and other important epidemiologic covariates were also collected. A Bayesian random-effects meta-analysis was performed investigating gender-specific prevalence of COPD stratified by age, geography, calendar time, study setting, diagnostic method, and disease severity. Among 194 eligible studies, summary prevalence was 9.23% (95% credible interval [CrI]: 8.16%-10.36%) in men and 6.16% (95% CrI: 5.41%-6.95%) in women. Gender-specific prevalence varied widely across the World Health Organization Global Burden of Disease subregions, with the highest female prevalence found in North America (8.07% vs 7.30%) and among participants in urban settings (13.03% vs 8.34%). Meta-regression indicated that age ≥40 and bronchodilator testing contributed most significantly to heterogeneity of prevalence estimates across studies. We conducted the largest ever systematic review and meta-analysis of global prevalence of COPD and the first large gender-specific review. These results will increase awareness of COPD as a critical women's health issue.

  14. The Global Fund's resource allocation decisions for HIV programmes: addressing those in need

    PubMed Central

    2011-01-01

    Background Between 2002 and 2010, the Global Fund to Fight AIDS, Tuberculosis and Malaria's investment in HIV increased substantially to reach US$12 billion. We assessed how the Global Fund's investments in HIV programmes were targeted to key populations in relation to disease burden and national income. Methods We conducted an assessment of the funding approved by the Global Fund Board for HIV programmes in Rounds 1-10 (2002-2010) in 145 countries. We used the UNAIDS National AIDS Spending Assessment framework to analyze the Global Fund investments in HIV programmes by HIV spending category and type of epidemic. We examined funding per capita and its likely predictors (HIV adult prevalence, HIV prevalence in most-at-risk populations and gross national income per capita) using stepwise backward regression analysis. Results About 52% ($6.1 billion) of the cumulative Global Fund HIV funding was targeted to low- and low-middle-income countries. Around 56% of the total ($6.6 billion) was channelled to countries in sub-Saharan Africa. The majority of funds were for HIV treatment (36%; $4.3 billion) and prevention (29%; $3.5 billion), followed by health systems and community systems strengthening and programme management (22%; $2.6 billion), enabling environment (7%; $0.9 billion) and other activities. The Global Fund investment by country was positively correlated with national adult HIV prevalence. About 10% ($0.4 billion) of the cumulative HIV resources for prevention targeted most-at-risk populations. Conclusions There has been a sustained scale up of the Global Fund's HIV support. Funding has targeted the countries and populations with higher HIV burden and lower income. Prevention in most-at-risk populations is not adequately prioritized in most of the recipient countries. The Global Fund Board has recently modified eligibility and prioritization criteria to better target most-at-risk populations in Round 10 and beyond. 
More guidance is being provided for Round 11 to strategically focus demand for Global Fund financing in the present resource-constrained environment. PMID:22029667

  15. International law's effects on health and its social determinants: protocol for a systematic review, meta-analysis, and meta-regression analysis.

    PubMed

    Hoffman, Steven J; Hughsam, Matthew; Randhawa, Harkanwal; Sritharan, Lathika; Guyatt, Gordon; Lavis, John N; Røttingen, John-Arne

    2016-04-16

    In recent years, there have been numerous calls for global institutions to develop and enforce new international laws. International laws are, however, often blunt instruments with many uncertain benefits, costs, risks of harm, and trade-offs. Thus, they are probably not always appropriate solutions to global health challenges. Given these uncertainties and international law's potential importance for improving global health, the paucity of synthesized evidence addressing whether international laws achieve their intended effects or whether they are superior in comparison to other approaches is problematic. Ten electronic bibliographic databases were searched using predefined search strategies, including MEDLINE, Global Health, CINAHL, Applied Social Sciences Index and Abstracts, Dissertations and Theses, International Bibliography of Social Sciences, International Political Science Abstracts, Social Sciences Abstracts, Social Sciences Citation Index, PAIS International, and Worldwide Political Science Abstracts. Two reviewers will independently screen titles and abstracts using predefined inclusion criteria. Pairs of reviewers will then independently screen the full-text of articles for inclusion using predefined inclusion criteria and then independently extract data and assess risk of bias for included studies. Where feasible, results will be pooled through subgroup analyses, meta-analyses, and meta-regression techniques. The findings of this review will contribute to a better understanding of the expected benefits and possible harms of using international law to address different kinds of problems, thereby providing important evidence-informed guidance on when and how it can be effectively introduced and implemented by countries and global institutions. PROSPERO CRD42015019830.

  16. Globalization and eating disorder risk: peer influence, perceived social norms, and adolescent disordered eating in Fiji.

    PubMed

    Gerbasi, Margaret E; Richards, Lauren K; Thomas, Jennifer J; Agnew-Blais, Jessica C; Thompson-Brenner, Heather; Gilman, Stephen E; Becker, Anne E

    2014-11-01

The increasing global health burden imposed by eating disorders warrants close examination of social exposures associated with globalization that potentially elevate risk during the critical developmental period of adolescence in low- and middle-income countries (LMICs). The study aim was to investigate the association of peer influence and perceived social norms with adolescent eating pathology in Fiji, an LMIC undergoing rapid social change. We measured peer influence on eating concerns (with the Inventory of Peer Influence on Eating Concerns; IPIEC), perceived peer norms associated with disordered eating and body concerns, perceived community cultural norms, and individual cultural orientations in a representative sample of school-going ethnic Fijian adolescent girls (n = 523). We then developed a multivariable linear regression model to examine their relation to eating pathology (measured by the Eating Disorder Examination-Questionnaire; EDE-Q). We found independent and statistically significant associations between both IPIEC scores and our proxy for perceived social norms specific to disordered eating (both p < .001) and EDE-Q global scores in a fully adjusted linear regression model. Study findings support the possibility that peer influence as well as perceived social norms relevant to disordered eating may elevate risk for disordered eating in Fiji, during the critical developmental period of adolescence. Replication and extension of these research findings in other populations undergoing rapid social transition--and where globalization is also influencing local social norms--may enrich etiologic models and inform strategies to mitigate risk. © 2014 Wiley Periodicals, Inc.

  17. Estimating Top-of-Atmosphere Thermal Infrared Radiance Using MERRA-2 Atmospheric Data

    NASA Astrophysics Data System (ADS)

    Kleynhans, Tania

Spaceborne thermal infrared sensors have been extensively used for environmental research as well as cross-calibration of other thermal sensing systems. Thermal infrared data from satellites such as Landsat and Terra/MODIS have limited temporal resolution (with a repeat cycle of 1 to 2 days for Terra/MODIS, and 16 days for Landsat). Thermal instruments with finer temporal resolution on geostationary satellites have limited utility for cross-calibration due to their large view angles. Reanalysis atmospheric data are available on a global spatial grid at three-hour intervals, making them a potential alternative to existing satellite image data. This research explores using the Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2) reanalysis data product to predict top-of-atmosphere (TOA) thermal infrared radiance globally at time scales finer than available satellite data. The MERRA-2 data product provides global atmospheric data every three hours from 1980 to the present. Due to the high temporal resolution of the MERRA-2 data product, opportunities for novel research and applications are presented. While MERRA-2 has been used in renewable energy and hydrological studies, this work seeks to leverage the model to predict TOA thermal radiance. Two approaches were followed, namely a physics-based approach and a supervised learning approach, using Terra/MODIS band 31 thermal infrared data as reference. The first physics-based model uses forward modeling to predict TOA thermal radiance. The second model infers the presence of clouds from the MERRA-2 atmospheric data before applying an atmospheric radiative transfer model. The last physics-based model parameterizes the previous model to minimize computation time. The second approach applied four different supervised learning algorithms to the atmospheric data.
The algorithms included a linear least squares regression model, a non-linear support vector regression (SVR) model, a multi-layer perceptron (MLP), and a convolutional neural network (CNN). This research found that the multi-layer perceptron model produced the lowest error rates overall, with an RMSE of 1.22 W/(m² sr µm) when compared to actual Terra/MODIS band 31 image data. This research further aimed to characterize the errors associated with each method so that any potential user will have the best information available should they wish to apply these methods to their own applications.

  18. Assessing NARCCAP climate model effects using spatial confidence regions.

    PubMed

    French, Joshua P; McGinnis, Seth; Schwartzman, Armin

    2017-01-01

    We assess similarities and differences between model effects for the North American Regional Climate Change Assessment Program (NARCCAP) climate models using varying classes of linear regression models. Specifically, we consider how the average temperature effect differs for the various global and regional climate model combinations, including assessment of possible interaction between the effects of global and regional climate models. We use both pointwise and simultaneous inference procedures to identify regions where global and regional climate model effects differ. We also show conclusively that results from pointwise inference are misleading, and that accounting for multiple comparisons is important for making proper inference.
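The pointwise-versus-simultaneous distinction drawn above can be illustrated with a minimal null simulation (invented data and a plain Bonferroni correction, not the paper's actual spatial confidence-region method):

```python
import random

random.seed(0)

# Simulate z-scores at many grid cells under the null hypothesis:
# no true difference between climate-model effects anywhere.
n_cells = 10_000
z = [random.gauss(0.0, 1.0) for _ in range(n_cells)]

# Pointwise inference: every cell tested at the usual two-sided 5% level.
pointwise_hits = sum(abs(v) > 1.96 for v in z)

# Simultaneous (Bonferroni-style) inference: the per-cell level shrinks to
# 0.05 / n_cells so the chance of *any* false positive stays near 5%.
bonferroni_crit = 4.56  # approx. two-sided normal quantile for 0.05 / 10,000
simultaneous_hits = sum(abs(v) > bonferroni_crit for v in z)

print(pointwise_hits, simultaneous_hits)
```

Under the null, pointwise testing flags roughly 5% of all cells (hundreds of spurious "differences"), while the simultaneous threshold flags essentially none, which is the sense in which pointwise inference is misleading here.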

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

Tang, Kunkun; Congedo, Pietro M.

The Polynomial Dimensional Decomposition (PDD) is employed in this work for the global sensitivity analysis and uncertainty quantification (UQ) of stochastic systems subject to a moderate to large number of input random variables. Due to the intimate connection between the PDD and the Analysis of Variance (ANOVA) approaches, PDD is able to provide a simpler and more direct evaluation of the Sobol' sensitivity indices, when compared to the Polynomial Chaos expansion (PC). Unfortunately, the number of PDD terms grows exponentially with respect to the size of the input random vector, which makes the computational cost of standard methods unaffordable for real engineering applications. In order to address the problem of the curse of dimensionality, this work proposes essentially variance-based adaptive strategies aiming to build a cheap meta-model (i.e. surrogate model) by employing the sparse PDD approach with its coefficients computed by regression. Three levels of adaptivity are carried out in this paper: 1) the truncated dimensionality for ANOVA component functions, 2) the active dimension technique especially for second- and higher-order parameter interactions, and 3) the stepwise regression approach designed to retain only the most influential polynomials in the PDD expansion. During this adaptive procedure featuring stepwise regressions, the surrogate model representation contains only a few terms at any time, so that the cost of repeatedly solving the linear systems of the least-squares regression problem is negligible. The size of the finally obtained sparse PDD representation is much smaller than that of the full expansion, since only significant terms are eventually retained. Consequently, a much smaller number of calls to the deterministic model is required to compute the final PDD coefficients.
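The third level of adaptivity, stepwise regression that retains only the most influential polynomials, can be sketched as a greedy forward-selection loop. This is an illustrative toy (an invented three-term polynomial model and a plain monomial basis, not the paper's PDD machinery):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stochastic model: the output depends on only three polynomial terms
# of five input random variables (invented for illustration).
n_samples, n_inputs = 200, 5
X = rng.uniform(-1.0, 1.0, size=(n_samples, n_inputs))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 - 1.5 * X[:, 2] * X[:, 3]

# Candidate basis: constant, linear, squared, and pairwise interaction terms.
basis = [("const", np.ones(n_samples))]
basis += [(f"x{i}", X[:, i]) for i in range(n_inputs)]
basis += [(f"x{i}^2", X[:, i] ** 2) for i in range(n_inputs)]
basis += [(f"x{i}*x{j}", X[:, i] * X[:, j])
          for i in range(n_inputs) for j in range(i + 1, n_inputs)]

# Greedy forward stepwise regression: repeatedly add the term that most
# reduces the residual sum of squares, stopping once the gain is negligible.
selected = []
residual = y.copy()
total = float(y @ y)
while True:
    rss0 = float(residual @ residual)
    best = None
    for name, col in basis:
        if any(name == n for n, _ in selected):
            continue
        A = np.column_stack([c for _, c in selected] + [col])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        rss = float(((y - A @ coef) ** 2).sum())
        if best is None or rss < best[2]:
            best = (name, col, rss)
    if best is None or rss0 - best[2] < 1e-8 * total:
        break
    selected.append((best[0], best[1]))
    A = np.column_stack([c for _, c in selected])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    residual = y - A @ coef

print([name for name, _ in selected])  # the sparse set of retained terms
```

Only the three truly active terms survive; every other candidate is dropped, which is the sparsity the abstract exploits to keep the surrogate cheap.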

  20. Special cases of AIRS v4.0.x retrievals: missing forecast surface pressure and regression-only retrieval

    NASA Technical Reports Server (NTRS)

    Hearty, Thomas; Manning, Evan

    2005-01-01

This memo examines the differences that can be expected when performing two special cases of retrievals with the v4.0.x PGE: (1) retrievals without the surface pressure from the NOAA Global Forecast System (GFS) and (2) regression-only retrievals. An understanding of these differences is important for users who may want to give up some accuracy in the retrieval in exchange for a rapid solution.

  1. Discontinuation, Efficacy, and Safety of Cholinesterase Inhibitors for Alzheimer’s Disease: a Meta-Analysis and Meta-Regression of 43 Randomized Clinical Trials Enrolling 16 106 Patients

    PubMed Central

    Blanco-Silvente, Lídia; Saez, Marc; Barceló, Maria Antònia; Garre-Olmo, Josep; Vilalta-Franch, Joan; Capellà, Dolors

    2017-01-01

Abstract Background: We investigated the effect of cholinesterase inhibitors on all-cause discontinuation, efficacy and safety, and the effects of study design-, intervention-, and patient-related covariates on the risk-benefit of cholinesterase inhibitors for Alzheimer’s disease. Methods: A systematic review and meta-analysis of randomized placebo-controlled clinical trials comparing cholinesterase inhibitors and placebo was performed. The effect of covariates on study outcomes was analysed by means of meta-regression using a Bayesian framework. Results: Forty-three randomized placebo-controlled clinical trials involving 16 106 patients were included. All-cause discontinuation was higher with cholinesterase inhibitors (OR = 1.66), as was discontinuation due to adverse events (OR = 1.75). Cholinesterase inhibitors improved cognitive function (standardized mean difference = 0.38), global symptomatology (standardized mean difference = 0.28) and functional capacity (standardized mean difference = 0.16) but not neuropsychiatric symptoms. Rivastigmine was associated with a poorer outcome on all-cause discontinuation (Diff OR = 1.66) and donepezil with a higher efficacy on global change (Diff standardized mean difference = 0.41). The proportion of patients with serious adverse events decreased with age (Diff OR = -0.09). Mortality was lower with cholinesterase inhibitors than with placebo (OR = 0.65). Conclusion: While cholinesterase inhibitors show a poor risk-benefit relationship, as indicated by mild symptom improvement and higher-than-placebo all-cause discontinuation, a reduction in mortality was suggested. Intervention- and patient-related factors modify the effect of cholinesterase inhibitors in patients with Alzheimer’s disease. PMID:28201726
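As a reference point for how trial-level odds ratios are combined, here is a minimal fixed-effect inverse-variance pooling sketch on the log-OR scale (the numbers are invented, and the paper itself uses a Bayesian random-effects framework rather than this simple approach):

```python
import math

# Invented per-trial odds ratios with 95% confidence intervals.
trials = [  # (OR, lower 95% CI, upper 95% CI)
    (1.50, 1.10, 2.05),
    (1.80, 1.20, 2.70),
    (1.60, 1.05, 2.44),
]

weights, log_ors = [], []
for odds_ratio, lo, hi in trials:
    log_or = math.log(odds_ratio)
    # Standard error recovered from the CI width: (ln hi - ln lo) / (2 * 1.96)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
    weights.append(1.0 / se ** 2)   # inverse-variance weight
    log_ors.append(log_or)

# Pooled estimate: weighted mean of log-ORs, back-transformed to the OR scale.
pooled_log_or = sum(w * l for w, l in zip(weights, log_ors)) / sum(weights)
pooled_or = math.exp(pooled_log_or)
pooled_se = (1.0 / sum(weights)) ** 0.5
ci = (math.exp(pooled_log_or - 1.96 * pooled_se),
      math.exp(pooled_log_or + 1.96 * pooled_se))

print(round(pooled_or, 2), tuple(round(c, 2) for c in ci))
```

Pooling on the log scale keeps the estimate symmetric around "no effect" (OR = 1); the pooled CI is narrower than any single trial's because the weights add.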

  2. Implementation of a health management mentoring program: year-1 evaluation of its impact on health system strengthening in Zambézia Province, Mozambique

    PubMed Central

    Edwards, Laura J.; Moisés, Abú; Nzaramba, Mathias; Cassimo, Aboobacar; Silva, Laura; Mauricio, Joaquim; Wester, C. William; Vermund, Sten H.; Moon, Troy D.

    2015-01-01

Background: Avante Zambézia is an initiative of the non-governmental organization (NGO) Friends in Global Health, LLC (FGH) and the Vanderbilt Institute for Global Health (VIGH) to provide technical assistance to the Mozambican Ministry of Health (MoH) in rural Zambézia Province. Avante Zambézia developed a district-level Health Management Mentorship (HMM) program to strengthen health systems in ten of Zambézia’s 17 districts. Our objective was to preliminarily analyze changes in four domains of health system capacity after the HMM’s first year: accounting, Human Resources (HR), Monitoring and Evaluation (M&E), and transportation management. Methods: Quantitative metrics were developed in each domain. During weeklong, on-site district mentoring visits, the health management mentoring teams documented each indicator as a success-ratio percentage. We analyzed data using linear regressions of each indicator’s mean success ratio across all districts submitting a report over time. Results: Of the four domains, district performance in the accounting domain was the strongest and most sustained. Linear regressions of mean monthly compliance for HR objectives indicated improvement in three of six mean success ratios. The M&E capacity domain showed the least overall improvement. The one indicator analyzed for transportation management suggested progress. Conclusion: Our outcome evaluation demonstrates improvement in health system performance during an HMM initiative. Evaluating which elements of our mentoring program are succeeding in strengthening district-level health systems is vital in preparing to transition fiscal and managerial responsibility to local authorities. PMID:26029894

  3. Left frontal cortex connectivity underlies cognitive reserve in prodromal Alzheimer disease

    PubMed Central

    Franzmeier, Nicolai; Duering, Marco; Weiner, Michael; Dichgans, Martin

    2017-01-01

    Objective: To test whether higher global functional connectivity of the left frontal cortex (LFC) in Alzheimer disease (AD) is associated with more years of education (a proxy of cognitive reserve [CR]) and mitigates the association between AD-related fluorodeoxyglucose (FDG)-PET hypometabolism and episodic memory. Methods: Forty-four amyloid-PET–positive patients with amnestic mild cognitive impairment (MCI-Aβ+) and 24 amyloid-PET–negative healthy controls (HC) were included. Voxel-based linear regression analyses were used to test the association between years of education and FDG-PET in MCI-Aβ+, controlled for episodic memory performance. Global LFC (gLFC) connectivity was computed through seed-based resting-state fMRI correlations between the LFC (seed) and each voxel in the gray matter. In linear regression analyses, education as a predictor of gLFC connectivity and the interaction of gLFC connectivity × FDG-PET hypometabolism on episodic memory were tested. Results: FDG-PET metabolism in the precuneus was reduced in MCI-Aβ+ compared to HC (p = 0.028), with stronger reductions observed in MCI-Aβ+ with more years of education (p = 0.006). In MCI-Aβ+, higher gLFC connectivity was associated with more years of education (p = 0.021). At higher levels of gLFC connectivity, the association between precuneus FDG-PET hypometabolism and lower memory performance was attenuated (p = 0.027). Conclusions: Higher gLFC connectivity is a functional substrate of CR that helps to maintain episodic memory relatively well in the face of emerging FDG-PET hypometabolism in early-stage AD. PMID:28188306

  4. The colorectal cancer mortality-to-incidence ratio as an indicator of global cancer screening and care.

    PubMed

    Sunkara, Vasu; Hébert, James R

    2015-05-15

    Disparities in cancer screening, incidence, treatment, and survival are worsening globally. The mortality-to-incidence ratio (MIR) has been used previously to evaluate such disparities. The MIR for colorectal cancer is calculated for all Organisation for Economic Cooperation and Development (OECD) countries using the 2012 GLOBOCAN incidence and mortality statistics. Health system rankings were obtained from the World Health Organization. Two linear regression models were fit with the MIR as the dependent variable and health system ranking as the independent variable; one included all countries and one model had the "divergents" removed. The regression model for all countries explained 24% of the total variance in the MIR. Nine countries were found to have regression-calculated MIRs that differed from the actual MIR by >20%. Countries with lower-than-expected MIRs were found to have strong national health systems characterized by formal colorectal cancer screening programs. Conversely, countries with higher-than-expected MIRs lack screening programs. When these divergent points were removed from the data set, the recalculated regression model explained 60% of the total variance in the MIR. The MIR proved useful for identifying disparities in cancer screening and treatment internationally. It has potential as an indicator of the long-term success of cancer surveillance programs and may be extended to other cancer types for these purposes. © 2015 American Cancer Society.
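The MIR construction and the "divergent" screening described above can be sketched in a few lines; all country labels and figures below are invented for illustration, not GLOBOCAN data:

```python
# All figures below are invented for illustration; they are not GLOBOCAN data.
countries = {
    # name: (health-system rank, incidence per 100k, mortality per 100k)
    "A": (10, 48.0, 12.0),
    "B": (30, 40.0, 10.0),
    "C": (50, 40.0, 18.0),
    "D": (70, 40.0, 22.0),
    "E": (90, 40.0, 26.0),
}

# Mortality-to-incidence ratio (MIR) per country.
mir = {name: mort / inc for name, (rank, inc, mort) in countries.items()}

# Ordinary least squares: MIR as a function of health-system ranking.
xs = [rank for rank, _, _ in countries.values()]
ys = [mir[name] for name in countries]
n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
intercept = y_bar - slope * x_bar

# Flag "divergent" countries whose actual MIR differs from the
# regression-calculated MIR by more than 20%.
divergent = []
for name, (rank, _, _) in countries.items():
    expected = intercept + slope * rank
    if abs(mir[name] - expected) > 0.20 * expected:
        divergent.append(name)

print(slope, intercept, divergent)
```

In this toy data set, country B sits well below the MIR its health-system ranking predicts, the pattern the abstract associates with a strong formal screening program.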

  5. Longitudinal decline in structural networks predicts dementia in cerebral small vessel disease

    PubMed Central

    Lawrence, Andrew J.; Zeestraten, Eva A.; Benjamin, Philip; Lambert, Christian P.; Morris, Robin G.; Barrick, Thomas R.

    2018-01-01

    Objective To determine whether longitudinal change in white matter structural network integrity predicts dementia and future cognitive decline in cerebral small vessel disease (SVD). To investigate whether network disruption has a causal role in cognitive decline and mediates the association between conventional MRI markers of SVD with both cognitive decline and dementia. Methods In the prospective longitudinal SCANS (St George's Cognition and Neuroimaging in Stroke) Study, 97 dementia-free individuals with symptomatic lacunar stroke were followed with annual MRI for 3 years and annual cognitive assessment for 5 years. Conversion to dementia was recorded. Structural networks were constructed from diffusion tractography using a longitudinal registration pipeline, and network global efficiency was calculated. Linear mixed-effects regression was used to assess change over time. Results Seventeen individuals (17.5%) converted to dementia, and significant decline in global cognition occurred (p = 0.0016). Structural network measures declined over the 3-year MRI follow-up, but the degree of change varied markedly between individuals. The degree of reductions in network global efficiency was associated with conversion to dementia (B = −2.35, odds ratio = 0.095, p = 0.00056). Change in network global efficiency mediated much of the association of conventional MRI markers of SVD with cognitive decline and progression to dementia. Conclusions Network disruption has a central role in the pathogenesis of cognitive decline and dementia in SVD. It may be a useful disease marker to identify that subgroup of patients with SVD who progress to dementia. PMID:29695593

  6. Global Seasonality of Rotavirus Disease

    PubMed Central

    Patel, Manish M.; Pitzer, Virginia; Alonso, Wladimir J.; Vera, David; Lopman, Ben; Tate, Jacqueline; Viboud, Cecile; Parashar, Umesh D.

    2012-01-01

    Background A substantial number of surveillance studies have documented rotavirus prevalence among children admitted for dehydrating diarrhea. We sought to establish global seasonal patterns of rotavirus disease before widespread vaccine introduction. Methods We reviewed studies of rotavirus detection in children with diarrhea published since 1995. We assessed potential relationships between seasonal prevalence and locality by plotting the average monthly proportion of diarrhea cases positive for rotavirus according to geography, country development, and latitude. We used linear regression to identify variables that were potentially associated with the seasonal intensity of rotavirus. Results Among a total of 99 studies representing all six geographical regions of the world, patterns of year-round disease were more evident in low- and low-middle income countries compared with upper-middle and high income countries where disease was more likely to be seasonal. The level of country development was a stronger predictor of strength of seasonality (P=0.001) than geographical location or climate. However, the observation of distinctly different seasonal patterns of rotavirus disease in some countries with similar geographical location, climate and level of development indicate that a single unifying explanation for variation in seasonality of rotavirus disease is unlikely. Conclusion While no unifying explanation emerged for varying rotavirus seasonality globally, the country income level was somewhat more predictive of the likelihood of having seasonal disease than other factors. Future evaluation of the effect of rotavirus vaccination on seasonal patterns of disease in different settings may help understand factors that drive the global seasonality of rotavirus disease. PMID:23190782

  7. The Global Precipitation Climatology Project: First Algorithm Intercomparison Project

    NASA Technical Reports Server (NTRS)

    Arkin, Phillip A.; Xie, Pingping

    1994-01-01

    The Global Precipitation Climatology Project (GPCP) was established by the World Climate Research Program to produce global analyses of the area- and time-averaged precipitation for use in climate research. To achieve the required spatial coverage, the GPCP uses simple rainfall estimates derived from IR and microwave satellite observations. In this paper, we describe the GPCP and its first Algorithm Intercomparison Project (AIP/1), which compared a variety of rainfall estimates derived from Geostationary Meteorological Satellite visible and IR observations and Special Sensor Microwave/Imager (SSM/I) microwave observations with rainfall derived from a combination of radar and raingage data over the Japanese islands and the adjacent ocean regions during the June and mid-July through mid-August periods of 1989. To investigate potential improvements in the use of satellite IR data for the estimation of large-scale rainfall for the GPCP, the relationship between rainfall and the fractional coverage of cold clouds in the AIP/1 dataset is examined. Linear regressions between fractional coverage and rainfall are analyzed for a number of latitude-longitude areas and for a range of averaging times. The results show distinct differences in the character of the relationship for different portions of the area. These results suggest that the simple IR-based estimation technique currently used in the GPCP can be used to estimate rainfall for global tropical and subtropical areas, provided that a method for adjusting the proportional coefficient for varying areas and seasons can be determined.

  8. Downscaling soil moisture over East Asia through multi-sensor data fusion and optimization of regression trees

    NASA Astrophysics Data System (ADS)

    Park, Seonyoung; Im, Jungho; Park, Sumin; Rhee, Jinyoung

    2017-04-01

Soil moisture is one of the most important keys for understanding regional and global climate systems. Soil moisture is directly related to agricultural processes as well as hydrological processes because soil moisture highly influences vegetation growth and determines water supply in the agroecosystem. Accurate monitoring of the spatiotemporal pattern of soil moisture is therefore important. Soil moisture has generally been provided through in situ measurements at stations. Although field survey from in situ measurements provides accurate soil moisture with high temporal resolution, it requires high cost and does not provide the spatial distribution of soil moisture over large areas. Microwave satellite-based approaches (e.g., the Advanced Microwave Scanning Radiometer 2 (AMSR2), the Advanced Scatterometer (ASCAT), and Soil Moisture Active Passive (SMAP)) and numerical models such as the Global Land Data Assimilation System (GLDAS) and the Modern-Era Retrospective Analysis for Research and Applications (MERRA) provide spatiotemporally continuous soil moisture products at global scale. However, since those global soil moisture products have coarse spatial resolution (~25-40 km), their applications for agriculture and water resources at local and regional scales are very limited. Thus, soil moisture downscaling is needed to overcome the limitation of the spatial resolution of soil moisture products. In this study, GLDAS soil moisture data were downscaled to 1 km spatial resolution through the integration of AMSR2 and ASCAT soil moisture data, the Shuttle Radar Topography Mission (SRTM) Digital Elevation Model (DEM), and Moderate Resolution Imaging Spectroradiometer (MODIS) data—Land Surface Temperature, Normalized Difference Vegetation Index, and Land Cover—using modified regression trees over East Asia from 2013 to 2015. Modified regression trees were implemented using Cubist, a commercial software tool based on machine learning.
An optimization based on pruning of the rules derived from the modified regression trees was conducted. Root Mean Square Error (RMSE) and correlation coefficients (r) were used to optimize the rules, and finally 59 rules from the modified regression trees were produced. The results show a high validation r (0.79) and a low validation RMSE (0.0556 m³/m³). The 1 km downscaled soil moisture was evaluated using ground soil moisture data at 14 stations, and both soil moisture datasets showed similar temporal patterns (average r = 0.51 and average RMSE = 0.041). The spatial distribution of the 1 km downscaled soil moisture corresponded well with GLDAS soil moisture, capturing both extremely dry and extremely wet regions. Correlation between GLDAS and the 1 km downscaled soil moisture during the growing season was positive (mean r = 0.35) in most regions.
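The two validation statistics used above, RMSE and the correlation coefficient r, can be computed directly; the station and downscaled values below are invented for illustration:

```python
import math

# Invented station ("observed") and downscaled soil moisture values (m³/m³).
observed   = [0.12, 0.18, 0.25, 0.30, 0.22, 0.15, 0.28, 0.33]
downscaled = [0.10, 0.20, 0.24, 0.34, 0.19, 0.17, 0.25, 0.36]
n = len(observed)

# Root Mean Square Error: the typical size of the downscaling error.
rmse = math.sqrt(sum((o - d) ** 2 for o, d in zip(observed, downscaled)) / n)

# Pearson correlation coefficient r: agreement in temporal pattern.
o_bar = sum(observed) / n
d_bar = sum(downscaled) / n
cov = sum((o - o_bar) * (d - d_bar) for o, d in zip(observed, downscaled))
r = cov / math.sqrt(sum((o - o_bar) ** 2 for o in observed)
                    * sum((d - d_bar) ** 2 for d in downscaled))

print(round(rmse, 3), round(r, 2))
```

The two statistics answer different questions: RMSE measures absolute agreement in m³/m³, while r ignores bias and scale and scores only how well the temporal pattern is reproduced.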

  9. The comparison of robust partial least squares regression with robust principal component regression on a real

    NASA Astrophysics Data System (ADS)

    Polat, Esra; Gunay, Suleyman

    2013-10-01

One of the problems encountered in Multiple Linear Regression (MLR) is multicollinearity, which causes overestimation of the regression parameters and an increase in their variance. Hence, when multicollinearity is present, biased estimation procedures such as classical Principal Component Regression (CPCR) and Partial Least Squares Regression (PLSR) are performed instead. The SIMPLS algorithm is the leading PLSR algorithm because of its speed and efficiency, and because its results are easier to interpret. However, both CPCR and SIMPLS yield very unreliable results when the data set contains outlying observations. Therefore, Hubert and Vanden Branden (2003) presented a robust PCR (RPCR) method and a robust PLSR method called RSIMPLS. In RPCR, a robust Principal Component Analysis (PCA) method for high-dimensional data is first applied to the independent variables; the dependent variables are then regressed on the scores using a robust regression method. RSIMPLS is constructed from a robust covariance matrix for high-dimensional data and robust linear regression. The purpose of this study is to demonstrate the use of the RPCR and RSIMPLS methods on an econometric data set, comparing the two methods on an inflation model of Turkey. The methods are compared in terms of predictive ability and goodness of fit using a robust Root Mean Squared Error of Cross-validation (R-RMSECV), a robust R² value, and the Robust Component Selection (RCS) statistic.

  10. A Study on the Potential Applications of Satellite Data in Air Quality Monitoring and Forecasting

    NASA Technical Reports Server (NTRS)

    Li, Can; Hsu, N. Christina; Tsay, Si-Chee

    2011-01-01

In this study we explore the potential applications of MODIS (Moderate Resolution Imaging Spectroradiometer)-like satellite sensors in air quality research for some Asian regions. The MODIS aerosol optical thickness (AOT), NCEP global reanalysis meteorological data, and daily surface PM(sub 10) concentrations over China and Thailand from 2001 to 2009 were analyzed using simple and multiple regression models. The AOT-PM(sub 10) correlation demonstrates substantial seasonal and regional differences, likely reflecting variations in aerosol composition and atmospheric conditions. Meteorological factors, particularly relative humidity, were found to influence the AOT-PM(sub 10) relationship. Their inclusion in regression models leads to more accurate assessment of PM(sub 10) from spaceborne observations. We further introduced a simple method for employing the satellite data to empirically forecast surface particulate pollution. In general, AOT from the previous day (day 0) is used as a predictor variable, along with the forecasted meteorology for the following day (day 1), to predict the PM(sub 10) level for day 1. The contribution of regional transport is represented by backward trajectories combined with AOT. This method was evaluated through PM(sub 10) hindcasts for 2008-2009, using observations from 2005 to 2007 as a training data set to obtain model coefficients. For five big Chinese cities, over 50% of the hindcasts have a percentage error less than or equal to 30%. Similar performance was achieved for cities in northern Thailand. The MODIS AOT data are responsible for at least part of the demonstrated forecasting skill. This method can be easily adapted for other regions, but is probably most useful for those having sparse ground monitoring networks or no access to sophisticated deterministic models.
We also highlight several existing issues, including some inherent to a regression-based approach as exemplified by a case study for Beijing, Further studies will be necessa1Y before satellite data can see more extensive applications in the operational air quality monitoring and forecasting.
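The empirical forecasting scheme described above lends itself to a short sketch. This is an illustrative reconstruction, not the authors' code: the coefficients and data below are synthetic, and the real method also folds in backward-trajectory transport terms.

```python
import random

# Illustrative sketch only (not the authors' code): regress next-day PM10 on
# the previous day's AOT and the forecast relative humidity, fit the model on
# a training period, then use it to hindcast. All data here are synthetic.

def ols_fit(X, y):
    """Ordinary least squares via normal equations; X rows include an intercept column."""
    p = len(X[0])
    xtx = [[sum(row[a] * row[b] for row in X) for b in range(p)] for a in range(p)]
    xty = [sum(row[a] * yi for row, yi in zip(X, y)) for a in range(p)]
    # Gauss-Jordan elimination on the (small) normal equations
    for c in range(p):
        piv = xtx[c][c]
        for j in range(c, p):
            xtx[c][j] /= piv
        xty[c] /= piv
        for r in range(p):
            if r != c:
                f = xtx[r][c]
                for j in range(c, p):
                    xtx[r][j] -= f * xtx[c][j]
                xty[r] -= f * xty[c]
    return xty  # fitted coefficients

random.seed(0)
# Synthetic "training" records standing in for the 2005-2007 observations:
# PM10 rises with AOT and (here, by construction) with relative humidity.
aot = [random.uniform(0.1, 1.5) for _ in range(300)]
rh = [random.uniform(30.0, 90.0) for _ in range(300)]
pm10 = [20 + 80 * a + 0.5 * h + random.gauss(0, 10) for a, h in zip(aot, rh)]

beta = ols_fit([[1.0, a, h] for a, h in zip(aot, rh)], pm10)

def forecast_pm10(aot_day0, rh_forecast_day1):
    """Day-1 PM10 forecast from day-0 AOT and forecast day-1 meteorology."""
    return beta[0] + beta[1] * aot_day0 + beta[2] * rh_forecast_day1
```

A percentage-error check against held-out observations, as in the paper's 2008-2009 hindcasts, would follow the same pattern.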

  11. Global motion perception is associated with motor function in 2-year-old children.

    PubMed

    Thompson, Benjamin; McKinlay, Christopher J D; Chakraborty, Arijit; Anstice, Nicola S; Jacobs, Robert J; Paudel, Nabin; Yu, Tzu-Ying; Ansell, Judith M; Wouldes, Trecia A; Harding, Jane E

    2017-09-29

    The dorsal visual processing stream, which includes V1, motion-sensitive area V5 and the posterior parietal lobe, supports visually guided motor function. Two recent studies have reported associations between global motion perception, a behavioural measure of processing in V5, and motor function in pre-school and school-aged children. This indicates a relationship between visual and motor development and also supports the use of global motion perception to assess overall dorsal stream function in studies of human neurodevelopment. We investigated whether associations between vision and motor function were present at 2 years of age, a substantially earlier stage of development. The Bayley III test of Infant and Toddler Development and measures of vision including visual acuity (Cardiff Acuity Cards), stereopsis (Lang stereotest) and global motion perception were attempted in 404 2-year-old children (±4 weeks). Global motion perception (quantified as a motion coherence threshold) was assessed by observing optokinetic nystagmus in response to random dot kinematograms of varying coherence. Linear regression revealed that global motion perception was modestly but statistically significantly associated with Bayley III composite motor (r² = 0.06, p < 0.001, n = 375) and gross motor scores (r² = 0.06, p < 0.001, n = 375). The associations remained significant when language score was included in the regression model. In addition, when language score was included in the model, stereopsis was significantly associated with composite motor and fine motor scores, but unaided visual acuity was not statistically significantly associated with any of the motor scores. These results demonstrate that global motion perception and binocular vision are associated with motor function at an early stage of development. Global motion perception can be used as a partial measure of dorsal stream function from early childhood. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. A Logistic Regression and Markov Chain Model for the Prediction of Nation-state Violent Conflicts and Transitions

    DTIC Science & Technology

    2016-03-24

    Violent conflict between competing groups has been a pervasive and driving force for all of human history... It has evolved from small skirmishes between unarmed groups, wielding rudimentary weapons, to industrialized global conflagrations... The study methodology is presented in Figure 2 (Study Methodology).

  13. The Calibration of AVHRR/3 Visible Dual Gain Using Meteosat-8 as a MODIS Calibration Transfer Medium

    NASA Technical Reports Server (NTRS)

    Avey, Lance; Garber, Donald; Nguyen, Louis; Minnis, Patrick

    2007-01-01

    This viewgraph presentation reviews the NOAA-17 AVHRR visible channels calibrated against MET-8/MODIS using dual gain regression methods. The topics include: 1) Motivation; 2) Methodology; 3) Dual Gain Regression Methods; 4) Examples of Regression methods; 5) AVHRR/3 Regression Strategy; 6) Cross-Calibration Method; 7) Spectral Response Functions; 8) MET8/NOAA-17; 9) Example of gain ratio adjustment; 10) Effect of mixed low/high count FOV; 11) Monitor dual gains over time; and 12) Conclusions

  14. Regression-Based Norms for a Bi-factor Model for Scoring the Brief Test of Adult Cognition by Telephone (BTACT).

    PubMed

    Gurnani, Ashita S; John, Samantha E; Gavett, Brandon E

    2015-05-01

    The current study developed regression-based normative adjustments for a bi-factor model of the Brief Test of Adult Cognition by Telephone (BTACT). Archival data from the Midlife Development in the United States-II Cognitive Project were used to develop eight separate linear regression models that predicted bi-factor BTACT scores, accounting for age, education, gender, and occupation, alone and in various combinations. All regression models provided statistically significant fit to the data. A three-predictor regression model fit best and accounted for 32.8% of the variance in the global bi-factor BTACT score. The fit of the regression models was not improved by gender. Eight different regression models are presented to allow the user flexibility in applying demographic corrections to the bi-factor BTACT scores. Occupation corrections, while not widely used, may provide useful demographic adjustments for adult populations or for those individuals who have attained an occupational status not commensurate with expected educational attainment. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
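As background to how regression-based norms of this kind are typically applied, here is a generic sketch with synthetic data (not the published BTACT models): an expected score is predicted from demographics, and the observed score is then expressed as a standardized residual.

```python
import math
import random

# Generic sketch of regression-based norms (synthetic data, not the BTACT
# models): predict the expected score from demographics, then standardize
# the observed score against the regression's standard error of estimate.

def solve(A, b):
    """Gauss-Jordan solve of a small linear system A x = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = M[c][c]
        for j in range(c, n + 1):
            M[c][j] /= piv
        for r in range(n):
            if r != c:
                f = M[r][c]
                for j in range(c, n + 1):
                    M[r][j] -= f * M[c][j]
    return [M[i][n] for i in range(n)]

random.seed(1)
# Synthetic normative sample: cognition declines with age, rises with education.
ages = [random.uniform(30, 80) for _ in range(500)]
edu = [random.choice([12, 14, 16, 18]) for _ in range(500)]
scores = [60 - 0.3 * a + 1.5 * e + random.gauss(0, 5) for a, e in zip(ages, edu)]

X = [[1.0, a, e] for a, e in zip(ages, edu)]
p = 3
xtx = [[sum(row[i] * row[j] for row in X) for j in range(p)] for i in range(p)]
xty = [sum(row[i] * s for row, s in zip(X, scores)) for i in range(p)]
beta = solve(xtx, xty)

resid = [s - (beta[0] + beta[1] * a + beta[2] * e)
         for s, a, e in zip(scores, ages, edu)]
see = math.sqrt(sum(r * r for r in resid) / (len(resid) - p))  # std. error of estimate

def adjusted_z(observed, age, education):
    """Demographically adjusted score: standardized residual from the norms."""
    expected = beta[0] + beta[1] * age + beta[2] * education
    return (observed - expected) / see
```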

  15. Semisupervised Clustering by Iterative Partition and Regression with Neuroscience Applications

    PubMed Central

    Qian, Guoqi; Wu, Yuehua; Ferrari, Davide; Qiao, Puxue; Hollande, Frédéric

    2016-01-01

    Regression clustering is a mixture of unsupervised and supervised statistical learning and data mining methods, found in a wide range of applications including artificial intelligence and neuroscience. It performs unsupervised learning when it clusters the data according to their respective unobserved regression hyperplanes. The method also performs supervised learning when it fits regression hyperplanes to the corresponding data clusters. Applying regression clustering in practice requires means of determining the underlying number of clusters in the data, finding the cluster label of each data point, and estimating the regression coefficients of the model. In this paper, we review the estimation and selection issues in regression clustering with regard to least squares and robust statistical methods. We also provide a model selection based technique to determine the number of regression clusters underlying the data. We further develop a computing procedure for regression clustering estimation and selection. Finally, simulation studies are presented for assessing the procedure, together with analyzing a real data set on RGB cell marking in neuroscience to illustrate and interpret the method. PMID:27212939
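The iterative partition-and-regression idea can be sketched in a few lines. This is a minimal illustration with two clusters and one predictor, assuming squared-residual assignment and least-squares refitting; the paper's robust estimators and model selection for the number of clusters are more general.

```python
import random

# Minimal sketch of iterative partition-and-regression (two clusters, one
# predictor, synthetic data), alternating cluster assignment and line fitting.

random.seed(2)
# Two latent regression lines: y = 1 + 2x and y = 5 - x
data = []
for _ in range(200):
    x = random.uniform(0, 10)
    if random.random() < 0.5:
        data.append((x, 1 + 2 * x + random.gauss(0, 0.3)))
    else:
        data.append((x, 5 - x + random.gauss(0, 0.3)))

def fit_line(pts):
    """Least-squares line through pts; returns (intercept, slope)."""
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    b = sxy / sxx if sxx else 0.0
    return my - b * mx, b

# Alternate between assigning each point to the line with the smallest
# squared residual (partition) and refitting each line (regression).
lines = [(0.0, 1.0), (0.0, -1.0)]  # arbitrary initial lines
for _ in range(20):
    clusters = [[], []]
    for x, y in data:
        resid = [(y - (a + b * x)) ** 2 for a, b in lines]
        clusters[resid.index(min(resid))].append((x, y))
    lines = [fit_line(c) if c else lines[i] for i, c in enumerate(clusters)]
```

After convergence the two fitted slopes approximate the latent slopes 2 and -1; determining how many such clusters the data support is the model selection problem the paper addresses.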

  16. Sample size determination for logistic regression on a logit-normal distribution.

    PubMed

    Kim, Seongho; Heath, Elisabeth; Heilbrun, Lance

    2017-06-01

    Although the sample size for simple logistic regression can be readily determined using currently available methods, the sample size calculation for multiple logistic regression requires some additional information, such as the coefficient of determination ([Formula: see text]) of a covariate of interest with other covariates, which is often unavailable in practice. The response variable of logistic regression follows a logit-normal distribution which can be generated from a logistic transformation of a normal distribution. Using this property of logistic regression, we propose new methods of determining the sample size for simple and multiple logistic regressions using a normal transformation of outcome measures. Simulation studies and a motivating example show several advantages of the proposed methods over the existing methods: (i) no need for [Formula: see text] for multiple logistic regression, (ii) available interim or group-sequential designs, and (iii) much smaller required sample size.
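The paper's closed-form derivation via the normal transformation is not reproduced here; as a generic, hedged illustration of the underlying question (what sample size gives adequate power for a simple logistic regression?), a simulation-based check can be sketched as follows. The effect size and settings are invented.

```python
import math
import random

# Generic illustration (not the paper's method): estimate the power of a Wald
# test for the slope in simple logistic regression by simulation, at candidate
# sample sizes.

def fit_simple_logistic(xs, ys, iters=200, lr=0.1):
    """Logistic regression by gradient ascent on the mean log-likelihood."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

def power_at_n(n, beta1=0.5, sims=50):
    """Fraction of simulated data sets in which the Wald test rejects beta1 = 0."""
    random.seed(3)
    hits = 0
    for _ in range(sims):
        xs = [random.gauss(0, 1) for _ in range(n)]
        ys = [1 if random.random() < 1.0 / (1.0 + math.exp(-beta1 * x)) else 0
              for x in xs]
        b0, b1 = fit_simple_logistic(xs, ys)
        # observed Fisher information for the slope (intercept term ignored)
        info = 0.0
        for x in xs:
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            info += p * (1 - p) * x * x
        if info > 0 and abs(b1) * math.sqrt(info) > 1.96:
            hits += 1
    return hits / sims

power_small, power_large = power_at_n(30), power_at_n(150)
```

A sample-size search would increase n until the simulated power reaches the target (e.g. 0.8); the paper's transformation-based formulas avoid this simulation cost.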

  17. A Global Model for Bankruptcy Prediction

    PubMed Central

    Alaminos, David; del Castillo, Agustín; Fernández, Manuel Ángel

    2016-01-01

    The recent world financial crisis has increased the number of bankruptcies in numerous countries and has resulted in a new area of research which responds to the need to predict this phenomenon, not only at the level of individual countries, but also at a global level, offering explanations of the common characteristics shared by the affected companies. Nevertheless, few studies focus on the prediction of bankruptcies globally. In order to compensate for this lack of empirical literature, this study has used a methodological framework of logistic regression to construct predictive bankruptcy models for Asia, Europe and America, and other global models for the whole world. The objective is to construct a global model with a high capacity for predicting bankruptcy in any region of the world. The results obtained have allowed us to confirm the superiority of the global model in comparison to regional models over periods of up to three years prior to bankruptcy. PMID:27880810

  18. Local warming: daily temperature change influences belief in global warming.

    PubMed

    Li, Ye; Johnson, Eric J; Zaval, Lisa

    2011-04-01

    Although people are quite aware of global warming, their beliefs about it may be malleable; specifically, their beliefs may be constructed in response to questions about global warming. Beliefs may reflect irrelevant but salient information, such as the current day's temperature. This replacement of a more complex, less easily accessed judgment with a simple, more accessible one is known as attribute substitution. In three studies, we asked residents of the United States and Australia to report their opinions about global warming and whether the temperature on the day of the study was warmer or cooler than usual. Respondents who thought that day was warmer than usual believed more in and had greater concern about global warming than did respondents who thought that day was colder than usual. They also donated more money to a global-warming charity if they thought that day seemed warmer than usual. We used instrumental variable regression to rule out some alternative explanations.
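Instrumental variable regression of the kind used in the study can be sketched with synthetic data. The instrument, confounder, and effect sizes below are invented for illustration; the study's actual instruments and survey data are not reproduced here.

```python
import random

# Hedged sketch of instrumental variable (two-stage least squares) regression:
# a valid instrument removes confounding bias that ordinary regression cannot.

def slope(xs, ys):
    """Least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

random.seed(4)
n = 2000
z = [random.gauss(0, 1) for _ in range(n)]                      # instrument
u = [random.gauss(0, 1) for _ in range(n)]                      # unobserved confounder
x = [zi + ui + random.gauss(0, 1) for zi, ui in zip(z, u)]      # endogenous regressor
y = [2 * xi + 3 * ui + random.gauss(0, 1) for xi, ui in zip(x, u)]  # true effect = 2

b_ols = slope(x, y)  # naive OLS: biased upward by the confounder u

# Stage 1: regress x on the instrument z; Stage 2: regress y on fitted x.
b_zx = slope(z, x)
mx, mz = sum(x) / n, sum(z) / n
xhat = [mx + b_zx * (zi - mz) for zi in z]
b_iv = slope(xhat, y)  # consistent estimate of the causal effect of x on y
```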

  19. Improved CRDS δ13C Stability Through New Calibration Application For CO2 and CH4

    NASA Astrophysics Data System (ADS)

    Arata, C.; Rella, C.

    2014-12-01

    Stable carbon isotope ratio measurements of CO2 and CH4 provide valuable insight into global and regional sources and sinks of the two most important greenhouse gases. Methodologies based on Cavity Ring-Down Spectroscopy (CRDS) have been developed that are capable of delivering δ13C measurements with a precision better than 0.12 permil for CO2 and 0.4 permil for CH4 (1 hour window, 5 minute average). Here we present a method to further improve this measurement's stability. We have developed a two-point calibration method which corrects for δ13C drift due to a dependence on carbon species concentration. This method calibrates both for carbon species concentration and for δ13C. We go on to show that this added stability is especially valuable when using carbon isotope data in linear regression models such as Keeling plots, where even small amounts of error can be magnified to give inconclusive results. This method is demonstrated under both laboratory and ambient atmospheric conditions, and we demonstrate how to select the calibration frequency.
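The Keeling-plot regression referred to above can be sketched as follows. The background and source values are invented for illustration: observed δ13C is regressed on 1/[CO2], and the intercept estimates the isotopic signature of the added source.

```python
import random

# Hedged sketch of a Keeling-plot regression (synthetic numbers, not
# instrument data): delta-13C of mixed air is regressed on 1/[CO2]; the
# intercept estimates the isotopic signature of the added source.

random.seed(5)
delta_bg, c_bg = -8.0, 400.0   # background air: per mil, ppm (assumed values)
delta_src = -26.0              # source signature to recover (per mil)

obs = []
for _ in range(50):
    c_add = random.uniform(5, 80)                   # CO2 added by the source (ppm)
    c = c_bg + c_add
    d = (delta_bg * c_bg + delta_src * c_add) / c   # isotopic mass balance
    obs.append((1.0 / c, d + random.gauss(0, 0.02)))  # measurement noise

n = len(obs)
mx = sum(x for x, _ in obs) / n
my = sum(y for _, y in obs) / n
slope = sum((x - mx) * (y - my) for x, y in obs) / sum((x - mx) ** 2 for x, _ in obs)
intercept = my - slope * mx  # Keeling intercept: estimated source delta-13C
```

Because the intercept extrapolates well outside the observed 1/[CO2] range, even small measurement errors are amplified, which is the error magnification the abstract warns about and the motivation for tighter calibration.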

  20. The Satellite Clock Bias Prediction Method Based on Takagi-Sugeno Fuzzy Neural Network

    NASA Astrophysics Data System (ADS)

    Cai, C. L.; Yu, H. G.; Wei, Z. C.; Pan, J. D.

    2017-05-01

    The continuous improvement of the prediction accuracy of Satellite Clock Bias (SCB) is a key problem in precision navigation. In order to improve the precision of SCB prediction and better reflect the change characteristics of SCB, this paper proposes an SCB prediction method based on the Takagi-Sugeno fuzzy neural network. Firstly, the SCB values are pre-treated based on their characteristics. Then, an accurate Takagi-Sugeno fuzzy neural network model is established based on the preprocessed data to predict SCB. This paper uses the precise SCB data with different sampling intervals provided by the IGS (International GNSS Service) to carry out short-term prediction experiments, and the results are compared with the ARIMA (Auto-Regressive Integrated Moving Average) model, the GM(1,1) model, and the quadratic polynomial model. The results show that the Takagi-Sugeno fuzzy neural network model is feasible and effective for short-term SCB prediction, and performs well for different types of clocks. The prediction results of the proposed method are clearly better than those of the conventional methods.

  1. A positional misalignment correction method for Fourier ptychographic microscopy based on simulated annealing

    NASA Astrophysics Data System (ADS)

    Sun, Jiasong; Zhang, Yuzhen; Chen, Qian; Zuo, Chao

    2017-02-01

    Fourier ptychographic microscopy (FPM) is a newly developed super-resolution technique, which employs angularly varying illuminations and a phase retrieval algorithm to surpass the diffraction limit of a low numerical aperture (NA) objective lens. In current FPM imaging platforms, accurate knowledge of the LED matrix's position is critical to achieving good recovery quality. Furthermore, given the wide field-of-view (FOV) in FPM, different regions in the FOV have different sensitivity to LED positional misalignment. In this work, we introduce an iterative method to correct position errors based on the simulated annealing (SA) algorithm. To improve the efficiency of this correction process, a large number of iterations for several images with low illumination NAs are first performed to estimate the initial values of the global positional misalignment model through non-linear regression. Simulation and experimental results are presented to evaluate the performance of the proposed method, and it is demonstrated that this method can both improve the quality of the recovered object image and relax the position accuracy requirement for the LED elements when aligning FPM imaging platforms.
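The simulated annealing search can be illustrated on a toy 2-parameter misalignment. In the actual method the cost would be the quality of the FPM reconstruction as a function of the assumed LED-position error, not the synthetic quadratic used here.

```python
import math
import random

# Toy illustration of a simulated annealing search for a global positional
# misalignment; the "true" shift and the quadratic cost are invented stand-ins.

TRUE_SHIFT = (1.3, -0.7)  # invented "true" global misalignment (x, y)

def cost(shift):
    """Stand-in image-quality metric: minimized at the true misalignment."""
    return (shift[0] - TRUE_SHIFT[0]) ** 2 + (shift[1] - TRUE_SHIFT[1]) ** 2

def simulated_annealing(cost_fn, start=(0.0, 0.0), t0=1.0, cooling=0.95, steps=400):
    random.seed(6)
    cur, cur_c = start, cost_fn(start)
    best, best_c = cur, cur_c
    t = t0
    for _ in range(steps):
        cand = (cur[0] + random.gauss(0, 0.3), cur[1] + random.gauss(0, 0.3))
        c = cost_fn(cand)
        # accept downhill moves always; uphill moves with Boltzmann probability
        if c < cur_c or random.random() < math.exp(-(c - cur_c) / t):
            cur, cur_c = cand, c
            if c < best_c:
                best, best_c = cand, c
        t *= cooling  # cool the temperature so uphill moves become rarer
    return best, best_c

best_shift, best_cost = simulated_annealing(cost)
```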

  2. Knowledge and use of emergency contraception: a multicountry analysis.

    PubMed

    Palermo, Tia; Bleck, Jennifer; Westley, Elizabeth

    2014-06-01

    Globally, evidence on knowledge and use of emergency contraception from population-based data is limited, though such information would be helpful in increasing access to the method. We examined knowledge and use of emergency contraception in 45 countries using population-based survey data. Demographic and Health Survey (DHS) data on women aged 15-49 were analyzed by country in logistic regressions to identify associations between women's characteristics and their having heard of emergency contraception or having ever used it. Trends were examined, by region and globally, according to individual, household and community descriptors, including women's age, education, marital status, socioeconomic status, and urban or rural location. The proportion of women who had heard of emergency contraception ranged from 2% in Chad to 66% in Colombia, and the proportion of sexually experienced women who had used it ranged from less than 0.1% in Chad to 12% in Colombia. The odds of having heard of or used the method generally increased with wealth, and although the relationship between marital status and knowing of the method varied by region, never-married women were more likely than married women to have used emergency contraception in countries where significant differences existed. In some countries, urban residence was associated with having heard of the method, but in only three countries were women from urban areas more likely to have used it. Our findings support the need for broader dissemination of information on emergency contraception, particularly among low-income individuals. Variations in use and knowledge within regions suggest a need for programs to be tailored to country-level characteristics.

  3. The spectral details of observed and simulated short-term water vapor feedbacks of El Niño-Southern Oscillation

    NASA Astrophysics Data System (ADS)

    Pan, F.; Huang, X.; Chen, X.

    2015-12-01

    The radiative kernel method has been validated and widely used in the study of climate feedbacks. This study uses spectrally resolved longwave radiative kernels to examine the short-term water vapor feedbacks associated with the ENSO cycles. Using a 500-year GFDL CM3 and a 100-year NCAR CCSM4 pre-industrial control simulation, we have constructed two sets of longwave spectral radiative kernels. We then composite El Niño, La Niña and ENSO-neutral states and estimate the water vapor feedbacks associated with the El Niño and La Niña phases of ENSO cycles in both simulations. Similar analysis is also applied to 35-year (1979-2014) ECMWF ERA-Interim reanalysis data, which is treated as the observational result here. When modeled and observed broadband feedbacks are compared to each other, they show similar geographic patterns but with noticeable discrepancies in the contrast between the tropics and extra-tropics. In particular, in the El Niño phase, the feedback estimated from reanalysis is much greater than those from the model simulations. Considering the observational data span, we carry out a sensitivity test to explore the variability of feedbacks derived from 35 years of data. To do so, we calculate the water vapor feedback within every 35-year segment of the GFDL CM3 control run by two methods: one is to composite El Niño or La Niña phases as mentioned above, and the other is to regress the TOA flux perturbation caused by water vapor change (δR_H2O) against the global-mean surface temperature anomaly. We find that the short-term feedback strengths derived from the composite method can change considerably from one segment to another, while the feedbacks from the regression method are less sensitive to the choice of segment and their strengths are also much smaller than those from the composite analysis. This study suggests that caution is warranted when inferring long-term feedbacks from a few decades of observations.
When spectral details of the global-mean feedbacks are examined, more inconsistencies are revealed in many spectral bands, especially the H2O continuum absorption bands and window regions. These discrepancies can be attributed to differences in observed and modeled water vapor profiles in response to tropical SST.
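The regression method described above can be sketched as follows: regress the water-vapor-induced TOA flux perturbation on the global-mean surface temperature anomaly and read the slope as the feedback strength. The time series and the assumed "true" feedback below are synthetic.

```python
import random

# Sketch of the regression method for feedback estimation, with synthetic
# monthly anomalies: slope of delta-R_H2O against the global-mean surface
# temperature anomaly, in W m^-2 K^-1. The "true" value of 1.8 is invented.

random.seed(7)
true_feedback = 1.8
months = 35 * 12  # a 35-year segment of monthly anomalies
ts_anom = [random.gauss(0, 0.25) for _ in range(months)]              # K
dr_h2o = [true_feedback * t + random.gauss(0, 0.5) for t in ts_anom]  # W m^-2

mt = sum(ts_anom) / months
mr = sum(dr_h2o) / months
feedback = (sum((t - mt) * (r - mr) for t, r in zip(ts_anom, dr_h2o))
            / sum((t - mt) ** 2 for t in ts_anom))
```

The composite alternative would instead difference the mean fields between El Niño and La Niña states; as the abstract notes, the regression estimate is less sensitive to which 35-year segment is used.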

  4. Spatial and temporal epidemiological analysis in the Big Data era.

    PubMed

    Pfeiffer, Dirk U; Stevens, Kim B

    2015-11-01

    Concurrent with global economic development in the last 50 years, the opportunities for the spread of existing diseases and the emergence of new infectious pathogens have increased substantially. The activities associated with the enormously intensified global connectivity have resulted in large amounts of data being generated, which in turn provides opportunities for generating knowledge that will allow more effective management of animal and human health risks. This so-called Big Data has, more recently, been accompanied by the Internet of Things, which highlights the increasing presence of a wide range of sensors, interconnected via the Internet. Analysis of this data needs to exploit its complexity, accommodate variation in data quality and take advantage of its spatial and temporal dimensions, where available. Apart from the development of hardware technologies and networking/communication infrastructure, it is necessary to develop appropriate data management tools that make this data accessible for analysis. This includes relational databases, geographical information systems and, most recently, cloud-based data storage such as Hadoop distributed file systems. While the development in analytical methodologies has not quite caught up with the data deluge, important advances have been made in a number of areas, including spatial and temporal data analysis, where the spectrum of analytical methods ranges from visualisation and exploratory analysis to modelling. While there used to be a primary focus on statistical science in terms of methodological development for data analysis, the newly emerged discipline of data science is a reflection of the challenges presented by the need to integrate diverse data sources and exploit them using novel data- and knowledge-driven modelling methods while simultaneously recognising the value of quantitative as well as qualitative analytical approaches. 
Machine learning regression methods, which are more robust and can handle large datasets faster than classical regression approaches, are now also used to analyse spatial and spatio-temporal data. Multi-criteria decision analysis methods have gained greater acceptance, due in part, to the need to increasingly combine data from diverse sources including published scientific information and expert opinion in an attempt to fill important knowledge gaps. The opportunities for more effective prevention, detection and control of animal health threats arising from these developments are immense, but not without risks given the different types, and much higher frequency, of biases associated with these data. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Controlling Type I Error Rates in Assessing DIF for Logistic Regression Method Combined with SIBTEST Regression Correction Procedure and DIF-Free-Then-DIF Strategy

    ERIC Educational Resources Information Center

    Shih, Ching-Lin; Liu, Tien-Hsiang; Wang, Wen-Chung

    2014-01-01

    The simultaneous item bias test (SIBTEST) method regression procedure and the differential item functioning (DIF)-free-then-DIF strategy are applied to the logistic regression (LR) method simultaneously in this study. These procedures are used to adjust the effects of matching true score on observed score and to better control the Type I error…

  6. Global atmospheric emissions of polycyclic aromatic hydrocarbons from 1960 to 2008 and future predictions.

    PubMed

    Shen, Huizhong; Huang, Ye; Wang, Rong; Zhu, Dan; Li, Wei; Shen, Guofeng; Wang, Bin; Zhang, Yanyan; Chen, Yuanchen; Lu, Yan; Chen, Han; Li, Tongchao; Sun, Kang; Li, Bengang; Liu, Wenxin; Liu, Junfeng; Tao, Shu

    2013-06-18

    Global atmospheric emissions of 16 polycyclic aromatic hydrocarbons (PAHs) from 69 major sources were estimated for a period from 1960 to 2030. Regression models and a technology split method were used to estimate country and time specific emission factors, resulting in a new estimate of PAH emission factor variation among different countries and over time. PAH emissions in 2007 were spatially resolved to 0.1° × 0.1° grids based on a newly developed global high-resolution fuel combustion inventory (PKU-FUEL-2007). The global total annual atmospheric emission of 16 PAHs in 2007 was 504 Gg (331-818 Gg, as interquartile range), with residential/commercial biomass burning (60.5%), open-field biomass burning (agricultural waste burning, deforestation, and wildfire, 13.6%), and petroleum consumption by on-road motor vehicles (12.8%) as the major sources. South (87 Gg), East (111 Gg), and Southeast Asia (52 Gg) were the regions with the highest PAH emission densities, contributing half of the global total PAH emissions. Among the global total PAH emissions, 6.19% of the emissions were in the form of high molecular weight carcinogenic compounds and the percentage of the carcinogenic PAHs was higher in developing countries (6.22%) than in developed countries (5.73%), due to the differences in energy structures and the disparities of technology. The potential health impact of the PAH emissions was greatest in the parts of the world with high anthropogenic PAH emissions, because of the overlap of the high emissions and high population densities. Global total PAH emissions peaked at 592 Gg in 1995 and declined gradually to 499 Gg in 2008. Total PAH emissions from developed countries peaked at 122 Gg in the early 1970s and decreased to 38 Gg in 2008. Simulation of PAH emissions from 2009 to 2030 revealed that PAH emissions in developed and developing countries would decrease by 46-71% and 48-64%, respectively, based on the six IPCC SRES scenarios.

  7. Global atmospheric emissions of polycyclic aromatic hydrocarbons from 1960 to 2008 and future predictions

    PubMed Central

    Shen, Huizhong; Huang, Ye; Wang, Rong; Zhu, Dan; Li, Wei; Shen, Guofeng; Wang, Bin; Zhang, Yanyan; Chen, Yuanchen; Lu, Yan; Chen, Han; Li, Tongchao; Sun, Kang; Li, Bengang; Liu, Wenxin; Liu, Junfeng; Tao, Shu

    2013-01-01

    Global atmospheric emissions of 16 polycyclic aromatic hydrocarbons (PAHs) from 69 major sources were estimated for a period from 1960 to 2030. Regression models and a technology split method were used to estimate country and time specific emission factors, resulting in a new estimate of PAH emission factor variation among different countries and over time. PAH emissions in 2007 were spatially resolved to 0.1°× 0.1° grids based on a newly developed global high-resolution fuel combustion inventory (PKU-FUEL-2007). The global total annual atmospheric emission of 16 PAHs in 2007 was 504 Gg (331-818 Gg, as interquartile range), with residential/commercial biomass burning (60.5%), open-field biomass burning (agricultural waste burning, deforestation, and wildfire, 13.6%), and petroleum consumption by on-road motor vehicles (12.8%) as the major sources. South (87 Gg), East (111 Gg), and Southeast Asia (52 Gg) were the regions with the highest PAH emission densities, contributing half of the global total PAH emissions. Among the global total PAH emissions, 6.19% of the emissions were in the form of high molecular weight carcinogenic compounds and the percentage of the carcinogenic PAHs was higher in developing countries (6.22%) than in developed countries (5.73%), due to the differences in energy structures and the disparities of technology. The potential health impact of the PAH emissions was greatest in the parts of the world with high anthropogenic PAH emissions, because of the overlap of the high emissions and high population densities. Global total PAH emissions peaked at 592 Gg in 1995 and declined gradually to 499 Gg in 2008. Total PAH emissions from developed countries peaked at 122 Gg in the early 1970s and decreased to 38 Gg in 2008. Simulation of PAH emissions from 2009 to 2030 revealed that PAH emissions in developed and developing countries would decrease by 46-71% and 48-64%, respectively, based on the six IPCC SRES scenarios. PMID:23659377

  8. The Power of Neuroimaging Biomarkers for Screening Frontotemporal Dementia

    PubMed Central

    McMillan, Corey T.; Avants, Brian B.; Cook, Philip; Ungar, Lyle; Trojanowski, John Q.; Grossman, Murray

    2014-01-01

    Frontotemporal dementia (FTD) is a clinically and pathologically heterogeneous neurodegenerative disease that can result from either frontotemporal lobar degeneration (FTLD) or Alzheimer’s disease (AD) pathology. It is critical to establish statistically powerful biomarkers that can achieve substantial cost-savings and increase feasibility of clinical trials. We assessed three broad categories of neuroimaging methods to screen underlying FTLD and AD pathology in a clinical FTD series: global measures (e.g., ventricular volume), anatomical volumes of interest (VOIs) (e.g., hippocampus) using a standard atlas, and data-driven VOIs using Eigenanatomy. We evaluated clinical FTD patients (N=93) with cerebrospinal fluid, gray matter (GM) MRI, and diffusion tensor imaging (DTI) to assess whether they had underlying FTLD or AD pathology. Linear regression was performed to identify the optimal VOIs for each method in a training dataset and then we evaluated classification sensitivity and specificity in an independent test cohort. Power was evaluated by calculating minimum sample sizes (mSS) required in the test classification analyses for each model. The data-driven VOI analysis using a multimodal combination of GM MRI and DTI achieved the greatest classification accuracy (89% SENSITIVE; 89% SPECIFIC) and required a lower minimum sample size (N=26) relative to anatomical VOI and global measures. We conclude that a data-driven VOI approach employing Eigenanatomy provides more accurate classification, benefits from increased statistical power in unseen datasets, and therefore provides a robust method for screening underlying pathology in FTD patients for entry into clinical trials. PMID:24687814

  9. Empirical testing of two models for staging antidepressant treatment resistance.

    PubMed

    Petersen, Timothy; Papakostas, George I; Posternak, Michael A; Kant, Alexis; Guyker, Wendy M; Iosifescu, Dan V; Yeung, Albert S; Nierenberg, Andrew A; Fava, Maurizio

    2005-08-01

    An increasing amount of attention has been paid to treatment resistant depression. Although it is quite common to observe nonremission to not just one but consecutive antidepressant treatments during a major depressive episode, a relationship between the likelihood of achieving remission and one's degree of resistance is not clearly known at this time. This study was undertaken to empirically test 2 recent models for staging treatment resistance. Psychiatrists from 2 academic sites reviewed charts of patients on their caseloads. Clinical Global Impressions-Severity (CGI-S) and Clinical Global Impressions-Improvement (CGI-I) scales were used to measure severity of depression and response to treatment, and 2 treatment-resistant staging scores were classified for each patient using the Massachusetts General Hospital staging method (MGH-S) and the Thase and Rush staging method (TR-S). Out of the 115 patient records reviewed, 58 (49.6%) patients remitted at some point during treatment. There was a significant positive correlation between the 2 staging scores, and logistic regression results indicated that greater MGH-S scores, but not TR-S scores, predicted nonremission. This study suggests that the hierarchical manner in which the field has typically gauged levels of treatment resistance may not be strongly supported by empirical evidence. This study suggests that the MGH staging model may offer some advantages over the staging method by Thase and Rush, as it generates a continuous score that considers both number of trials and intensity/optimization of each trial.

  10. MEaSUREs Land Surface Temperature from GOES Satellites

    NASA Astrophysics Data System (ADS)

    Pinker, Rachel T.; Chen, Wen; Ma, Yingtao; Islam, Tanvir; Borbas, Eva; Hain, Chris; Hulley, Glynn; Hook, Simon

    2017-04-01

    Information on Land Surface Temperature (LST) can be generated from observations made from satellites in low Earth orbit (LEO) such as MODIS and ASTER and by sensors in geostationary Earth orbit (GEO) such as GOES. Under a project titled "A Unified and Coherent Land Surface Temperature and Emissivity Earth System Data Record for Earth Science", led by the Jet Propulsion Laboratory, an effort is underway to develop long-term consistent information from both such systems. In this presentation we will describe an effort to derive LST information from GOES satellites. Results will be presented from two approaches: 1) regressions developed from a wide range of simulations using MODTRAN, SeeBor Version 5.0 global atmospheric profiles, and the CAMEL (Combined ASTER and MODIS Emissivity for Land) product, which is based on the standard University of Wisconsin 5 km emissivity values (UWIREMIS) and the ASTER Global Emissivity Database (GED) product; 2) the RTTOV radiative transfer model driven with MERRA-2 reanalysis fields. We will present results of the evaluation of these two methods against various products, such as MOD11, and ground observations for the five-year period 2004-2008.

  11. Statistical classification of drug incidents due to look-alike sound-alike mix-ups.

    PubMed

    Wong, Zoie Shui Yee

    2016-06-01

    It has been recognised that medication names that look or sound similar are a cause of medication errors. This study builds statistical classifiers for identifying medication incidents due to look-alike sound-alike mix-ups. A total of 227 patient safety incident advisories related to medication were obtained from the Canadian Patient Safety Institute's Global Patient Safety Alerts system. Eight feature selection strategies based on frequent terms, frequent drug terms and constituent terms were performed. Statistical text classifiers based on logistic regression, support vector machines (with linear, polynomial, radial-basis and sigmoid kernels) and decision trees were trained and tested. The models developed achieved an average accuracy above 0.8 across all the model settings. The receiver operating characteristic curves indicated that the classifiers performed reasonably well. The results obtained in this study suggest that statistical text classification can be a feasible method for identifying medication incidents due to look-alike sound-alike mix-ups based on a database of advisories from Global Patient Safety Alerts. © The Author(s) 2014.
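
    As a rough sketch of the approach, one can score drug-name similarity with character n-grams and feed that feature to a classifier. The drug pairs, the Dice-coefficient feature, and the tiny gradient-descent logistic regression below are illustrative stand-ins for the paper's eight feature strategies and trained classifiers:

```python
import numpy as np

def char_bigrams(name):
    s = name.lower()
    return {s[i:i + 2] for i in range(len(s) - 1)}

def similarity(a, b):
    # Dice coefficient over character bigrams: a crude look-alike score
    ba, bb = char_bigrams(a), char_bigrams(b)
    return 2 * len(ba & bb) / (len(ba) + len(bb))

# hypothetical training pairs: (name1, name2, is_lasa_mixup)
pairs = [
    ("hydroxyzine", "hydralazine", 1),
    ("clonidine", "klonopin", 1),
    ("celebrex", "celexa", 1),
    ("metformin", "furosemide", 0),
    ("amoxicillin", "warfarin", 0),
    ("lisinopril", "omeprazole", 0),
]
X = np.array([similarity(a, b) for a, b, _ in pairs])
y = np.array([lab for _, _, lab in pairs], dtype=float)

# one-feature logistic regression fitted by plain gradient descent
w, b = 0.0, 0.0
for _ in range(5000):
    p = 1 / (1 + np.exp(-(w * X + b)))
    g = p - y
    w -= 0.5 * np.mean(g * X)
    b -= 0.5 * np.mean(g)

pred = (1 / (1 + np.exp(-(w * X + b))) > 0.5).astype(int)
print("training accuracy:", np.mean(pred == y))
```

    A real system would combine many such features (frequent terms, constituent terms, phonetic codes) rather than a single similarity score.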

  12. Factors influencing the usage of global distribution systems

    NASA Astrophysics Data System (ADS)

    Budiasa, I. M.; Suparta, I. K.; Nadra, N. M.

    2018-01-01

    The advancement of tourism is supported by Information and Communication Technology (ICT) innovation and change. The use of a GDS (Global Distribution System) such as Amadeus, Galileo, Sabre, or Worldspan in the tourism industry can increase the availability, frequency and speed of communication among companies providing services to potential tourists. This research investigates the factors that influence the actual use of GDS in the tourism industry, especially among travel agents, airlines and hotels in Bali. It employed a mixed method of quantitative and qualitative approaches. Field surveys were conducted, and 80 valid questionnaires were received and analyzed using SPSS 17.0; descriptive, correlation, factor analysis and regression tests were conducted. The variables examined are Perceived Ease of Use and Perceived Usefulness (from the Technology Acceptance Model), together with Awareness, Perceived Risk and Communication Channels. This research revealed that Perceived Ease of Use, Perceived Usefulness, Awareness, and Communication Channels influence the behavioural intention to use GDS, whereas Perceived Risk was not found to significantly influence the use of GDS. These findings enable travel agent, airline and hotel companies to make provision decisions with respect to the actual use of GDS.

  13. A non-parametric consistency test of the ΛCDM model with Planck CMB data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aghamousa, Amir; Shafieloo, Arman; Hamann, Jan, E-mail: amir@aghamousa.com, E-mail: jan.hamann@unsw.edu.au, E-mail: shafieloo@kasi.re.kr

    Non-parametric reconstruction methods, such as Gaussian process (GP) regression, provide a model-independent way of estimating an underlying function and its uncertainty from noisy data. We demonstrate how GP-reconstruction can be used as a consistency test between a given data set and a specific model by looking for structures in the residuals of the data with respect to the model's best-fit. Applying this formalism to the Planck temperature and polarisation power spectrum measurements, we test their global consistency with the predictions of the base ΛCDM model. Our results do not show any serious inconsistencies, lending further support to the interpretation of the base ΛCDM model as cosmology's gold standard.
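
    The idea of the consistency test can be sketched in a few lines of linear algebra: fit a GP to the residuals of the data about a model's best-fit and check whether the reconstruction shows structure beyond the noise level. The data, kernel, and hyperparameters below are invented for illustration, not the Planck analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic "data" and a candidate model best-fit (both assumed)
x = np.linspace(0, 1, 40)
model_fit = np.sin(2 * np.pi * x)                   # the model's prediction
y = model_fit + 0.1 * rng.standard_normal(x.size)   # data consistent with model
resid = y - model_fit                               # residuals w.r.t. best-fit

def rbf(a, b, amp=1.0, ell=0.2):
    # squared-exponential covariance between two sets of points
    return amp**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

# GP posterior mean of the residual function at the data points
sigma = 0.1                                         # assumed noise level
K = rbf(x, x)
alpha = np.linalg.solve(K + sigma**2 * np.eye(x.size), resid)
gp_mean = K @ alpha

# if data and model agree, the reconstruction stays within the noise level;
# persistent structure in gp_mean would flag an inconsistency
print("max |GP residual reconstruction|:", np.abs(gp_mean).max())
```

    In the paper's setting the significance of any reconstructed structure is judged against the GP posterior uncertainty; this sketch only shows the reconstruction step.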

  14. Methods for calculating confidence and credible intervals for the residual between-study variance in random effects meta-regression models

    PubMed Central

    2014-01-01

    Background Meta-regression is becoming increasingly used to model study level covariate effects. However this type of statistical analysis presents many difficulties and challenges. Here two methods for calculating confidence intervals for the magnitude of the residual between-study variance in random effects meta-regression models are developed. A further suggestion for calculating credible intervals using informative prior distributions for the residual between-study variance is presented. Methods Two recently proposed and, under the assumptions of the random effects model, exact methods for constructing confidence intervals for the between-study variance in random effects meta-analyses are extended to the meta-regression setting. The use of Generalised Cochran heterogeneity statistics is extended to the meta-regression setting and a Newton-Raphson procedure is developed to implement the Q profile method for meta-analysis and meta-regression. WinBUGS is used to implement informative priors for the residual between-study variance in the context of Bayesian meta-regressions. Results Results are obtained for two contrasting examples, where the first example involves a binary covariate and the second involves a continuous covariate. Intervals for the residual between-study variance are wide for both examples. Conclusions Statistical methods, and R computer software, are available to compute exact confidence intervals for the residual between-study variance under the random effects model for meta-regression. These frequentist methods are almost as easily implemented as their established counterparts for meta-analysis. Bayesian meta-regressions are also easily performed by analysts who are comfortable using WinBUGS. Estimates of the residual between-study variance in random effects meta-regressions should be routinely reported and accompanied by some measure of their uncertainty. Confidence and/or credible intervals are well-suited to this purpose. PMID:25196829
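
    The Q-profile construction described in the Methods can be sketched numerically: compute the generalised Cochran Q statistic as a function of a candidate residual between-study variance and invert it against chi-square quantiles (here with a bracketing root-finder rather than the paper's Newton-Raphson procedure; all data are made up):

```python
import numpy as np
from scipy.stats import chi2
from scipy.optimize import brentq

# hypothetical meta-regression data: effect sizes, within-study variances,
# and one study-level covariate (entirely invented for illustration)
y = np.array([0.40, -0.20, 0.65, 0.05, 0.80, 0.20, 1.05, 0.35])
v = np.full(8, 0.02)
x = np.array([0.0, 0.0, 1.0, 1.0, 2.0, 2.0, 3.0, 3.0])
X = np.column_stack([np.ones_like(x), x])
k, p = X.shape

def gen_Q(tau2):
    # generalised Cochran Q at a candidate residual between-study variance:
    # weighted residual sum of squares about the WLS meta-regression fit
    w = 1.0 / (v + tau2)
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return float(w @ (y - X @ beta) ** 2)

# Q-profile 95% CI: invert Q(tau2) against chi-square quantiles, df = k - p
lo_q, hi_q = chi2.ppf([0.025, 0.975], df=k - p)

def solve(target):
    if gen_Q(0.0) < target:      # interval limit truncated at zero
        return 0.0
    return brentq(lambda t: gen_Q(t) - target, 0.0, 10.0)

ci_lower, ci_upper = solve(hi_q), solve(lo_q)
print(f"95% CI for residual tau^2: [{ci_lower:.4f}, {ci_upper:.4f}]")
```

    The monotone decrease of Q in tau² is what makes the inversion well-defined; the exact methods in the paper refine this with Generalised Cochran statistics and alternative weightings.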

  15. Novice nurses' level of global interdependence identity: a quantitative research study.

    PubMed

    Kozlowski-Gibson, Maria

    2015-01-01

    Often, therapeutic relationships are cross-cultural in nature, which places both nurses and patients at risk for stress, depression, and anxiety. The purpose of this investigation was to describe novice nurses' level of global interdependence identity, as manifested by worldminded attitudes, and to identify the strongest predictors of worldminded attitudes. A prospective descriptive study with multiple regression was conducted in the various nursing units of a large hospital in the greater Cleveland, OH, area. The participants were novice nurses up to two years after graduation from nursing school and employed as hospital clinicians. Descriptive statistics, with the mean and standard deviation of the scores, were used to delineate the development of the participants. The study relied on a survey instrument, the Scale to Measure Worldminded Attitudes developed by Sampson and Smith (1957). The numerical data were scored and organized on a Microsoft Excel spreadsheet, and the Statistical Package for the Social Sciences (SPSS) version 21 was used to assist with analysis. The models created through regression were assessed using the model summary and analysis of variance (ANOVA). The nurses' mean level of global interdependence identity was slightly above the neutral point between extreme national-mindedness and full development of global interdependence identity. The best predictors of worldminded attitudes were immigration, patriotism, and war conceptualized under a global frame of reference. Novice nurses did not demonstrate an optimal developmental status of global interdependence identity to safeguard cross-cultural encounters with patients. The recommendation is the inclusion of immigration, patriotism, and war in the nursing curriculum and co-curriculum to promote student development and a turnaround improvement in patient experience. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Building Regression Models: The Importance of Graphics.

    ERIC Educational Resources Information Center

    Dunn, Richard

    1989-01-01

    Points out reasons for using graphical methods to teach simple and multiple regression analysis. Argues that a graphically oriented approach has considerable pedagogic advantages in the exposition of simple and multiple regression. Shows that graphical methods may play a central role in the process of building regression models. (Author/LS)

  17. Application of database methods to the prediction of B3LYP-optimized polyhedral water cluster geometries and electronic energies

    NASA Astrophysics Data System (ADS)

    Anick, David J.

    2003-12-01

    A method is described for a rapid prediction of B3LYP-optimized geometries for polyhedral water clusters (PWCs). Starting with a database of 121 B3LYP-optimized PWCs containing 2277 H-bonds, linear regressions yield formulas correlating O-O distances, O-O-O angles, and H-O-H orientation parameters, with local and global cluster descriptors. The formulas predict O-O distances with a rms error of 0.85 pm to 1.29 pm and predict O-O-O angles with a rms error of 0.6° to 2.2°. An algorithm is given which uses the O-O and O-O-O formulas to determine coordinates for the oxygen nuclei of a PWC. The H-O-H formulas then determine positions for two H's at each O. For 15 test clusters, the gap between the electronic energy of the predicted geometry and the true B3LYP optimum ranges from 0.11 to 0.54 kcal/mol or 4 to 18 cal/mol per H-bond. Linear regression also identifies 14 parameters that strongly correlate with PWC electronic energy. These descriptors include the number of H-bonds in which both oxygens carry a non-H-bonding H, the number of quadrilateral faces, the number of symmetric angles in 5- and in 6-sided faces, and the square of the cluster's estimated dipole moment.

  18. Recommendations for diagnosing effective radiative forcing from climate models for CMIP6

    DOE PAGES

    Forster, Piers M.; Richardson, Thomas; Maycock, Amanda C.; ...

    2016-10-27

    The usefulness of previous Coupled Model Intercomparison Project (CMIP) exercises has been hampered by a lack of radiative forcing information. This has made it difficult to understand reasons for differences between model responses. Effective radiative forcing (ERF) is easier to diagnose than traditional radiative forcing in global climate models (GCMs) and is more representative of the eventual temperature response. Here we examine the different methods of computing ERF in two GCMs. We find that ERF computed from a fixed sea surface temperature (SST) method (ERF_fSST) has much more certainty than regression based methods. Thirty year integrations are sufficient to reduce the 5-95% confidence interval in global ERF_fSST to 0.1 W m-2. For 2xCO2 ERF, 30 year integrations are needed to ensure that the signal is larger than the local confidence interval over more than 90% of the globe. Within the ERF_fSST method there are various options for prescribing SSTs and sea ice. We explore these and find that ERF is only weakly dependent on the methodological choices. Prescribing the monthly averaged seasonally varying model's preindustrial climatology is recommended for its smaller random error and easier implementation. As part of CMIP6, the Radiative Forcing Model Intercomparison Project (RFMIP) asks models to conduct 30 year ERF_fSST experiments using the model's own preindustrial climatology of SST and sea ice. The Aerosol and Chemistry Model Intercomparison Project (AerChemMIP) will also mainly use this approach. Lastly, we propose this as a standard method for diagnosing ERF and recommend that it be used across the climate modeling community to aid future comparisons.

  19. Recommendations for diagnosing effective radiative forcing from climate models for CMIP6

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forster, Piers M.; Richardson, Thomas; Maycock, Amanda C.

    The usefulness of previous Coupled Model Intercomparison Project (CMIP) exercises has been hampered by a lack of radiative forcing information. This has made it difficult to understand reasons for differences between model responses. Effective radiative forcing (ERF) is easier to diagnose than traditional radiative forcing in global climate models (GCMs) and is more representative of the eventual temperature response. Here we examine the different methods of computing ERF in two GCMs. We find that ERF computed from a fixed sea surface temperature (SST) method (ERF_fSST) has much more certainty than regression based methods. Thirty year integrations are sufficient to reduce the 5-95% confidence interval in global ERF_fSST to 0.1 W m-2. For 2xCO2 ERF, 30 year integrations are needed to ensure that the signal is larger than the local confidence interval over more than 90% of the globe. Within the ERF_fSST method there are various options for prescribing SSTs and sea ice. We explore these and find that ERF is only weakly dependent on the methodological choices. Prescribing the monthly averaged seasonally varying model's preindustrial climatology is recommended for its smaller random error and easier implementation. As part of CMIP6, the Radiative Forcing Model Intercomparison Project (RFMIP) asks models to conduct 30 year ERF_fSST experiments using the model's own preindustrial climatology of SST and sea ice. The Aerosol and Chemistry Model Intercomparison Project (AerChemMIP) will also mainly use this approach. Lastly, we propose this as a standard method for diagnosing ERF and recommend that it be used across the climate modeling community to aid future comparisons.

  20. Estimating trends in the global mean temperature record

    NASA Astrophysics Data System (ADS)

    Poppick, Andrew; Moyer, Elisabeth J.; Stein, Michael L.

    2017-06-01

    Given uncertainties in physical theory and numerical climate simulations, the historical temperature record is often used as a source of empirical information about climate change. Many historical trend analyses appear to de-emphasize physical and statistical assumptions: examples include regression models that treat time rather than radiative forcing as the relevant covariate, and time series methods that account for internal variability in nonparametric rather than parametric ways. However, given a limited data record and the presence of internal variability, estimating radiatively forced temperature trends in the historical record necessarily requires some assumptions. Ostensibly empirical methods can also involve an inherent conflict in assumptions: they require data records that are short enough for naive trend models to be applicable, but long enough for long-timescale internal variability to be accounted for. In the context of global mean temperatures, empirical methods that appear to de-emphasize assumptions can therefore produce misleading inferences, because the trend over the twentieth century is complex and the scale of temporal correlation is long relative to the length of the data record. We illustrate here how a simple but physically motivated trend model can provide better-fitting and more broadly applicable trend estimates and can allow for a wider array of questions to be addressed. In particular, the model allows one to distinguish, within a single statistical framework, between uncertainties in the shorter-term vs. longer-term response to radiative forcing, with implications not only on historical trends but also on uncertainties in future projections. We also investigate the consequence on inferred uncertainties of the choice of a statistical description of internal variability. 
While nonparametric methods may seem to avoid making explicit assumptions, we demonstrate how even misspecified parametric statistical methods, if attuned to the important characteristics of internal variability, can result in more accurate uncertainty statements about trends.
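
    The contrast drawn above between time-as-covariate and forcing-as-covariate trend models can be illustrated with a toy calculation; the forcing shape, the response coefficient, and the AR(1) internal-variability parameters below are all assumed for the sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1900, 2001).astype(float)

# toy radiative-forcing history: smoothly accelerating over the century
# (shape and magnitude are illustrative, not an actual forcing series)
forcing = 2.5 * ((years - 1900) / 100.0) ** 2

# synthetic "global mean temperature": linear response to forcing plus
# AR(1) internal variability
noise = np.zeros(years.size)
for t in range(1, years.size):
    noise[t] = 0.6 * noise[t - 1] + 0.1 * rng.standard_normal()
temp = 0.8 * forcing + noise

def fit_resid(covariate, y):
    # OLS fit of y on an intercept plus one covariate; return residuals
    X = np.column_stack([np.ones_like(covariate), covariate])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

# a straight line in *time* cannot follow the accelerating forced response,
# so it leaves systematically larger residuals than regressing on *forcing*
r_time = fit_resid(years, temp)
r_forcing = fit_resid(forcing, temp)
print("residual SD, time as covariate:   ", round(float(r_time.std()), 3))
print("residual SD, forcing as covariate:", round(float(r_forcing.std()), 3))
```

    The leftover structure in the time-based residuals is exactly the kind of misspecification the paper argues a physically motivated trend model avoids.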

  1. Recommendations for diagnosing effective radiative forcing from climate models for CMIP6

    NASA Astrophysics Data System (ADS)

    Smith, C. J.; Forster, P.; Richardson, T.; Myhre, G.; Pincus, R.

    2016-12-01

    The usefulness of previous Coupled Model Intercomparison Project (CMIP) exercises has been hampered by a lack of radiative forcing information. This has made it difficult to understand reasons for differences between model responses. Effective radiative forcing (ERF) is easier to diagnose than traditional radiative forcing in global climate models (GCMs) and is more representative of the ultimate climate response. Here we examine the different methods of computing ERF in two GCMs. We find that ERF computed from a fixed sea-surface temperature (SST) method (ERF_fSST) has much more certainty than regression-based methods. Thirty-year integrations are sufficient to reduce the standard error in global ERF to 0.05 W m-2. For 2xCO2 ERF, 30 year integrations are needed to ensure that the signal is larger than the standard error over more than 90% of the globe. Within the ERF_fSST method there are various options for prescribing SSTs and sea-ice. We explore these and find that ERF is only weakly dependent on the methodological choices. Prescribing the monthly-averaged seasonally varying model's preindustrial climatology is recommended for its smaller random error and easier implementation. As part of CMIP6, the Radiative Forcing Model Intercomparison Project (RFMIP) asks models to conduct 30-year ERF_fSST experiments using the model's own preindustrial climatology of SST and sea-ice. The Aerosol and Chemistry Model Intercomparison Project (AerChemMIP) will also mainly use this approach. We propose this as a standard method for diagnosing ERF in models and recommend that it be used across the climate modeling community to aid future comparisons.

  2. Recommendations for diagnosing effective radiative forcing from climate models for CMIP6

    NASA Astrophysics Data System (ADS)

    Forster, Piers M.; Richardson, Thomas; Maycock, Amanda C.; Smith, Christopher J.; Samset, Bjorn H.; Myhre, Gunnar; Andrews, Timothy; Pincus, Robert; Schulz, Michael

    2016-10-01

    The usefulness of previous Coupled Model Intercomparison Project (CMIP) exercises has been hampered by a lack of radiative forcing information. This has made it difficult to understand reasons for differences between model responses. Effective radiative forcing (ERF) is easier to diagnose than traditional radiative forcing in global climate models (GCMs) and is more representative of the eventual temperature response. Here we examine the different methods of computing ERF in two GCMs. We find that ERF computed from a fixed sea surface temperature (SST) method (ERF_fSST) has much more certainty than regression based methods. Thirty year integrations are sufficient to reduce the 5-95% confidence interval in global ERF_fSST to 0.1 W m-2. For 2xCO2 ERF, 30 year integrations are needed to ensure that the signal is larger than the local confidence interval over more than 90% of the globe. Within the ERF_fSST method there are various options for prescribing SSTs and sea ice. We explore these and find that ERF is only weakly dependent on the methodological choices. Prescribing the monthly averaged seasonally varying model's preindustrial climatology is recommended for its smaller random error and easier implementation. As part of CMIP6, the Radiative Forcing Model Intercomparison Project (RFMIP) asks models to conduct 30 year ERF_fSST experiments using the model's own preindustrial climatology of SST and sea ice. The Aerosol and Chemistry Model Intercomparison Project (AerChemMIP) will also mainly use this approach. We propose this as a standard method for diagnosing ERF and recommend that it be used across the climate modeling community to aid future comparisons.

  3. Comparative study of some robust statistical methods: weighted, parametric, and nonparametric linear regression of HPLC convoluted peak responses using internal standard method in drug bioavailability studies.

    PubMed

    Korany, Mohamed A; Maher, Hadir M; Galal, Shereen M; Ragab, Marwa A A

    2013-05-01

    This manuscript discusses the application and comparison of three statistical regression methods for handling data: parametric, nonparametric, and weighted regression (WR). The data were obtained from different chemometric methods applied to high-performance liquid chromatography response data using the internal standard method. This was performed on a model drug, Acyclovir, which was analyzed in human plasma with ganciclovir as the internal standard. An in vivo study was also performed. Derivative treatment of the chromatographic response ratio data was followed by convolution of the resulting derivative curves using 8-point sin x_i polynomials (discrete Fourier functions). This work studies and compares the application of the WR method and Theil's method, a nonparametric regression (NPR) method, with the least squares parametric regression (LSPR) method, which is considered the de facto standard method used for regression. When the assumption of homoscedasticity is not met for analytical data, a simple and effective way to counteract the great influence of the high concentrations on the fitted regression line is to use the WR method. WR was found to be superior to LSPR, as the former assumes that the y-direction error in the calibration curve will increase as x increases. Theil's NPR method was also found to be superior to LSPR, as it assumes that errors could occur in both the x- and y-directions and might not be normally distributed. Most of the results showed a significant improvement in precision and accuracy on applying the WR and NPR methods relative to LSPR.
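
    The three regression flavours compared in the study can be sketched on simulated heteroscedastic calibration data; the concentrations, the error model, and the weighting scheme below are illustrative, not the paper's HPLC data:

```python
import numpy as np

rng = np.random.default_rng(3)

# hypothetical calibration data with heteroscedastic error: the y-error
# grows with concentration x, violating the OLS homoscedasticity assumption
x = np.linspace(1, 10, 30)
y = 0.5 + 2.0 * x + 0.05 * x * rng.standard_normal(x.size)

def ols(x, y):
    # ordinary least squares (LSPR): equal weight for every point
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0]

def wls(x, y, w):
    # weighted regression (WR): down-weight the noisier high-concentration points
    X = np.column_stack([np.ones_like(x), x])
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

def theil(x, y):
    # Theil's nonparametric estimator: median of all pairwise slopes
    i, j = np.triu_indices(x.size, k=1)
    slope = np.median((y[j] - y[i]) / (x[j] - x[i]))
    return np.array([np.median(y - slope * x), slope])

b_ols = ols(x, y)
b_wls = wls(x, y, 1.0 / x**2)    # weights proportional to 1/variance
b_thl = theil(x, y)
for name, (a, b) in [("LSPR", b_ols), ("WR", b_wls), ("Theil", b_thl)]:
    print(f"{name}: intercept={a:.3f} slope={b:.3f}")
```

    On clean data all three recover the line; the differences the paper reports show up in the stability of the intercept and the low-concentration accuracy when the error structure is heteroscedastic or non-normal.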

  4. [Visual field progression in glaucoma: cluster analysis].

    PubMed

    Bresson-Dumont, H; Hatton, J; Foucher, J; Fonteneau, M

    2012-11-01

    Visual field progression analysis is one of the key points in glaucoma monitoring, but distinguishing true progression from random fluctuation is sometimes difficult. There are several different algorithms but no real consensus for detecting visual field progression. Trend analysis of global indices (MD, sLV) may miss localized deficits or be affected by media opacities. Conversely, point-by-point analysis makes progression difficult to differentiate from physiological variability, particularly when the sensitivity of a point is already low. The goal of our study was to analyze visual field progression with the EyeSuite™ Octopus Perimetry Clusters algorithm in patients with no significant changes in global indices and no worsening on pointwise linear regression analysis. We analyzed the visual fields of 162 eyes (100 patients - 58 women, 42 men, average age 66.8 ± 10.91) with ocular hypertension or glaucoma. For inclusion, at least six reliable visual fields per eye were required, and the trend analysis (EyeSuite™ Perimetry) of visual field global indices (MD and sLV) could show no significant progression. The analysis of changes in cluster mode was then performed. In a second step, eyes with statistically significant worsening of at least one of their clusters were analyzed point-by-point with the Octopus Field Analysis (OFA). Fifty-four eyes (33.33%) had significant worsening in some clusters while their global indices remained stable over time. In this group, glaucoma was more advanced than in the stable group (MD 6.41 dB vs. 2.87); however, 64.82% (35/54) of the eyes in which the clusters progressed had no statistically significant change in the trend analysis by pointwise linear regression. Most software algorithms for analyzing visual field progression are essentially trend analyses of global indices, or point-by-point linear regression. This study shows the potential role of cluster trend analysis.
However, for best results, it is preferable to compare the analyses of several tests in combination with morphologic exam. Copyright © 2012 Elsevier Masson SAS. All rights reserved.

  5. Evaluation of globally available precipitation data products as input for water balance models

    NASA Astrophysics Data System (ADS)

    Lebrenz, H.; Bárdossy, A.

    2009-04-01

    The subject of this study is the evaluation of globally available precipitation data products, which are intended to be used as input variables for water balance models in ungauged basins. The selected data sources are a) the Global Precipitation Climatology Centre (GPCC), b) the Global Precipitation Climatology Project (GPCP) and c) the Climatic Research Unit (CRU), resulting in twelve globally available data products. The data products differ in their underlying databases, derivation routines and resolutions in time and space. For validation purposes, the ground data from South Africa were screened for homogeneity and consistency by various tests, and outlier detection using multi-linear regression was performed. External drift kriging was subsequently applied to the ground data, and the resulting precipitation arrays were compared to the different products with respect to quantity and variance.

  6. A Unified and Comprehensible View of Parametric and Kernel Methods for Genomic Prediction with Application to Rice.

    PubMed

    Jacquin, Laval; Cao, Tuong-Vi; Ahmadi, Nourollah

    2016-01-01

    One objective of this study was to provide readers with a clear and unified understanding of parametric statistical and kernel methods, used for genomic prediction, and to compare some of these in the context of rice breeding for quantitative traits. Furthermore, another objective was to provide a simple and user-friendly R package, named KRMM, which allows users to perform RKHS regression with several kernels. After introducing the concept of regularized empirical risk minimization, the connections between well-known parametric and kernel methods such as Ridge regression [i.e., genomic best linear unbiased predictor (GBLUP)] and reproducing kernel Hilbert space (RKHS) regression were reviewed. Ridge regression was then reformulated so as to show and emphasize the advantage of the kernel "trick" concept, exploited by kernel methods in the context of epistatic genetic architectures, over parametric frameworks used by conventional methods. Some parametric and kernel methods; least absolute shrinkage and selection operator (LASSO), GBLUP, support vector machine regression (SVR) and RKHS regression were thereupon compared for their genomic predictive ability in the context of rice breeding using three real data sets. Among the compared methods, RKHS regression and SVR were often the most accurate methods for prediction followed by GBLUP and LASSO. An R function which allows users to perform RR-BLUP of marker effects, GBLUP and RKHS regression, with a Gaussian, Laplacian, polynomial or ANOVA kernel, in a reasonable computation time has been developed. Moreover, a modified version of this function, which allows users to tune kernels for RKHS regression, has also been developed and parallelized for HPC Linux clusters. The corresponding KRMM package and all scripts have been made publicly available.
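
    The core computation shared by GBLUP and RKHS regression is a kernel ridge solve; with a linear kernel ZZ^T it reduces to GBLUP, while a Gaussian kernel can capture interaction (epistatic-like) effects via the kernel trick. A minimal sketch on simulated marker data (not the KRMM package or the rice data sets):

```python
import numpy as np

rng = np.random.default_rng(4)

# toy "genomic" data: n individuals, m binary markers, with a nonlinear
# (epistatic-like) genetic signal; everything is simulated for illustration
n, m = 200, 50
Z = rng.integers(0, 2, size=(n, m)).astype(float)
signal = np.sin(Z[:, 0] + Z[:, 1]) + Z[:, 2] * Z[:, 3]   # interaction term
y = signal + 0.1 * rng.standard_normal(n)

def gaussian_kernel(A, B, h=10.0):
    # RBF kernel on marker profiles; bandwidth h is an assumed tuning value
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / h)

# kernel ridge regression = RKHS regression with a Gaussian kernel;
# replacing K by Z @ Z.T recovers the GBLUP / Ridge formulation
lam = 1.0
K = gaussian_kernel(Z, Z)
alpha = np.linalg.solve(K + lam * np.eye(n), y)
yhat = K @ alpha
corr = np.corrcoef(y, yhat)[0, 1]
print("training-set correlation:", round(corr, 3))
```

    Predictions for new genotypes use the cross-kernel `gaussian_kernel(Z_new, Z) @ alpha`; in practice lam and h are tuned by cross-validation, which is the role of the kernel-tuning function mentioned in the abstract.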

  7. Caffeine and Insomnia in People Living With HIV From the Miami Adult Studies on HIV (MASH) Cohort.

    PubMed

    Ramamoorthy, Venkataraghavan; Campa, Adriana; Rubens, Muni; Martinez, Sabrina S; Fleetwood, Christina; Stewart, Tiffanie; Liuzzi, Juan P; George, Florence; Khan, Hafiz; Li, Yinghui; Baum, Marianna K

    We explored the relationship between caffeine consumption, insomnia, and HIV disease progression (CD4+ T cell counts and HIV viral loads). Caffeine intake and insomnia levels were measured using the Modified Caffeine Consumption Questionnaire and the Pittsburgh Insomnia Rating Scale (PIRS) in 130 clinically stable participants who were living with HIV, taking antiretroviral therapy, and recruited from the Miami Adult Studies on HIV cohort. Linear regressions showed that caffeine consumption was significantly and adversely associated with distress score, quality-of-life score, and global PIRS score. Linear regression analyses also showed that global PIRS score was significantly associated with lower CD4+ T cell counts and higher HIV viral loads. Caffeine could have precipitated insomnia in susceptible people living with HIV, which could be detrimental to their disease progression states. Copyright © 2017 Association of Nurses in AIDS Care. Published by Elsevier Inc. All rights reserved.

  8. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  9. Post-processing through linear regression

    NASA Astrophysics Data System (ADS)

    van Schaeybroeck, B.; Vannitsem, S.

    2011-03-01

    Various post-processing techniques are compared for both deterministic and ensemble forecasts, all based on linear regression between forecast data and observations. In order to evaluate the quality of the regression methods, three criteria are proposed, related to the effective correction of forecast error, the optimal variability of the corrected forecast and multicollinearity. The regression schemes under consideration include the ordinary least-squares (OLS) method, a new time-dependent Tikhonov regularization (TDTR) method, the total least-squares method, a new geometric-mean regression (GM), a recently introduced error-in-variables (EVMOS) method and, finally, a "best member" OLS method. The advantages and drawbacks of each method are clarified. These techniques are applied in the context of the Lorenz '63 system, whose model version is affected by both initial-condition and model errors. For short forecast lead times, the number and choice of predictors play an important role. Contrary to the other techniques, GM degrades when the number of predictors increases. At intermediate lead times, linear regression is unable to provide corrections to the forecast and can sometimes degrade the performance (GM and the best-member OLS with noise). At long lead times the regression schemes (EVMOS, TDTR) which yield the correct variability and the largest correlation between ensemble error and spread should be preferred.
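
    The baseline OLS scheme discussed above amounts to regressing observations on forecasts and replacing each forecast by its fitted value. A minimal sketch on synthetic forecast/observation pairs (all parameters assumed); note how the corrected forecast trades bias for reduced variability, the deficiency that motivates variability-preserving schemes such as EVMOS:

```python
import numpy as np

rng = np.random.default_rng(5)

# synthetic pairs: the raw forecast has a bias, an amplitude error, and noise
obs = rng.standard_normal(300)
forecast = 0.5 + 1.3 * obs + 0.4 * rng.standard_normal(300)

# ordinary least-squares post-processing: regress observations on forecasts
X = np.column_stack([np.ones_like(forecast), forecast])
beta, *_ = np.linalg.lstsq(X, obs, rcond=None)
corrected = X @ beta

rmse_raw = np.sqrt(np.mean((forecast - obs) ** 2))
rmse_cor = np.sqrt(np.mean((corrected - obs) ** 2))
print(f"RMSE raw: {rmse_raw:.3f}  corrected: {rmse_cor:.3f}")
print(f"variance obs: {obs.var():.3f}  corrected: {corrected.var():.3f}")
```

    The corrected forecast's variance is shrunk by the squared forecast-observation correlation, so OLS under-disperses even as it minimizes RMSE; the paper's second criterion (optimal variability) penalizes exactly this.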

  10. Phobic Anxiety and Plasma Levels of Global Oxidative Stress in Women.

    PubMed

    Hagan, Kaitlin A; Wu, Tianying; Rimm, Eric B; Eliassen, A Heather; Okereke, Olivia I

    2015-01-01

    Psychological distress has been hypothesized to be associated with adverse biologic states such as higher oxidative stress and inflammation. Yet, little is known about associations between a common form of distress - phobic anxiety - and global oxidative stress. Thus, we related phobic anxiety to plasma fluorescent oxidation products (FlOPs), a global oxidative stress marker. We conducted a cross-sectional analysis among 1,325 women (aged 43-70 years) from the Nurses' Health Study. Phobic anxiety was measured using the Crown-Crisp Index (CCI). Adjusted least-squares mean log-transformed FlOPs were calculated across phobic categories. Logistic regression models were used to calculate odds ratios (OR) comparing the highest CCI category (≥6 points) vs. lower scores, across FlOPs quartiles. No association was found between phobic anxiety categories and mean FlOP levels in multivariable adjusted linear models. Similarly, in multivariable logistic regression models there were no associations between FlOPs quartiles and likelihood of being in the highest phobic category. Comparing women in the highest vs. lowest FlOPs quartiles: FlOP_360: OR=0.68 (95% CI: 0.40-1.15); FlOP_320: OR=0.99 (95% CI: 0.61-1.61); FlOP_400: OR=0.92 (95% CI: 0.52, 1.63). No cross-sectional association was found between phobic anxiety and a plasma measure of global oxidative stress in this sample of middle-aged and older women.

  11. Land cover in the Guayas Basin using SAR images from low resolution ASAR Global mode to high resolution Sentinel-1 images

    NASA Astrophysics Data System (ADS)

    Bourrel, Luc; Brodu, Nicolas; Frappart, Frédéric

    2016-04-01

    Remotely sensed images allow frequent monitoring of land cover variations at regional and global scales. The recently launched Sentinel-1 satellite offers global coverage of land areas at an unprecedented spatial resolution (20 m) and revisit time (6 days at the Equator). We propose here to compare the performance of commonly used supervised classification techniques (i.e., k-nearest neighbors, linear and Gaussian support vector machines, naive Bayes, linear and quadratic discriminant analyses, adaptive boosting, logistic regression, ridge regression with one-vs-one voting, random forest, extremely randomized trees) for land cover applications in the Guayas Basin, the largest river basin of the Pacific coast of Ecuador (area ~32,000 km²). The reason for this choice is the importance of this region in the Ecuadorian economy: its watershed represents 13% of the total area of Ecuador, is home to 40% of the Ecuadorian population, and is the most productive region of Ecuador for agriculture and aquaculture. Fifty percent of the country's shrimp farming production comes from this watershed, which together with agriculture represents the largest source of revenue of the country. Similar comparisons are also performed using ENVISAT ASAR images acquired in global mode (1 km spatial resolution). Accuracy of the results will be assessed using a land cover map derived from multi-spectral images.
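
    Two of the simplest classifiers in that list can be compared in plain numpy on synthetic two-class data. This is only a sketch of the comparison protocol (train/test split, held-out accuracy), not the study's SAR pipeline, and the data are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic stand-in for labelled pixels: two classes, two features.
    n = 200
    X0 = rng.normal(loc=[0.0, 0.0], size=(n, 2))
    X1 = rng.normal(loc=[2.5, 2.5], size=(n, 2))
    X = np.vstack([X0, X1])
    y = np.array([0] * n + [1] * n)

    idx = rng.permutation(2 * n)
    tr, te = idx[:300], idx[300:]  # simple train/test split

    def nearest_centroid(Xtr, ytr, Xte):
        c0 = Xtr[ytr == 0].mean(axis=0)
        c1 = Xtr[ytr == 1].mean(axis=0)
        return (np.linalg.norm(Xte - c1, axis=1)
                < np.linalg.norm(Xte - c0, axis=1)).astype(int)

    def one_nn(Xtr, ytr, Xte):
        d = np.linalg.norm(Xte[:, None, :] - Xtr[None, :, :], axis=2)
        return ytr[d.argmin(axis=1)]

    accs = {}
    for name, clf in [("centroid", nearest_centroid), ("1-NN", one_nn)]:
        accs[name] = float(np.mean(clf(X[tr], y[tr], X[te]) == y[te]))
    print(accs)
    ```

    The richer classifiers named in the abstract follow the same evaluation pattern, only with more flexible decision rules.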

  12. Geodesic least squares regression for scaling studies in magnetic confinement fusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verdoolaege, Geert

    In regression analyses for deriving scaling laws that occur in various scientific disciplines, usually standard regression methods have been applied, of which ordinary least squares (OLS) is the most popular. However, concerns have been raised with respect to several assumptions underlying OLS in its application to scaling laws. We here discuss a new regression method that is robust in the presence of significant uncertainty on both the data and the regression model. The method, which we call geodesic least squares regression (GLS), is based on minimization of the Rao geodesic distance on a probabilistic manifold. We demonstrate the superiority of the method using synthetic data and we present an application to the scaling law for the power threshold for the transition to the high confinement regime in magnetic confinement fusion devices.

  13. An Integrated Method to Analyze Farm Vulnerability to Climatic and Economic Variability According to Farm Configurations and Farmers' Adaptations.

    PubMed

    Martin, Guillaume; Magne, Marie-Angélina; Cristobal, Magali San

    2017-01-01

    The need to adapt to decrease farm vulnerability to adverse contextual events has been extensively discussed on a theoretical basis. We developed an integrated and operational method to assess farm vulnerability to multiple and interacting contextual changes and explain how this vulnerability can best be reduced according to farm configurations and farmers' technical adaptations over time. Our method considers farm vulnerability as a function of the raw measurements of vulnerability variables (e.g., economic efficiency of production), the slope of the linear regression of these measurements over time, and the residuals of this linear regression. The last two are extracted from linear mixed models considering a random regression coefficient (an intercept common to all farms), a global trend (a slope common to all farms), a random deviation from the general mean for each farm, and a random deviation from the general trend for each farm. Among all possible combinations, the lowest farm vulnerability is obtained through a combination of high values of measurements, a stable or increasing trend and low variability for all vulnerability variables considered. Our method enables relating the measurements, trends and residuals of vulnerability variables to explanatory variables that illustrate farm exposure to climatic and economic variability, initial farm configurations and farmers' technical adaptations over time. We applied our method to 19 cattle (beef, dairy, and mixed) farms over the period 2008-2013. Selected vulnerability variables, i.e., farm productivity and economic efficiency, varied greatly among cattle farms and across years, with means ranging from 43.0 to 270.0 kg protein/ha and 29.4-66.0% efficiency, respectively. No farm had a high level, stable or increasing trend and low residuals for both farm productivity and economic efficiency of production. 
Thus, the least vulnerable farms represented a compromise among measurement value, trend, and variability of both performances. No specific combination of farmers' practices emerged for reducing cattle farm vulnerability to climatic and economic variability. In the least vulnerable farms, the practices implemented (stocking rate, input use…) were more consistent with the objective of developing the properties targeted (efficiency, robustness…). Our method can be used to support farmers with sector-specific and local insights about most promising farm adaptations.
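
    The three components of the vulnerability measure described above (level, trend, residual variability) can be illustrated with per-farm linear fits. This numpy sketch on made-up yearly series approximates, but is not, the paper's linear mixed-model estimation:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    years = np.arange(2008, 2014)

    # Hypothetical economic-efficiency series (%) for three contrasting farms.
    farms = {
        "A": 55 + 1.5 * (years - 2008) + rng.normal(scale=1.0, size=6),  # high, rising, stable
        "B": 40 - 2.0 * (years - 2008) + rng.normal(scale=1.0, size=6),  # declining
        "C": 50 + rng.normal(scale=6.0, size=6),                         # erratic
    }

    # Per-farm decomposition: raw level (mean), trend (slope of a linear fit
    # over time) and variability (standard deviation of the fit residuals).
    summary = {}
    for name, series in farms.items():
        slope, intercept = np.polyfit(years, series, 1)
        resid = series - (slope * years + intercept)
        summary[name] = {"level": float(series.mean()), "trend": float(slope),
                         "residual_sd": float(resid.std(ddof=1))}

    for name, s in summary.items():
        print(name, {k: round(v, 2) for k, v in s.items()})
    ```

    In the paper these quantities come from a single mixed model with a global intercept and trend plus farm-level random deviations, which pools information across farms; the fit-per-farm version above conveys only the intuition.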

  14. An Integrated Method to Analyze Farm Vulnerability to Climatic and Economic Variability According to Farm Configurations and Farmers’ Adaptations

    PubMed Central

    Martin, Guillaume; Magne, Marie-Angélina; Cristobal, Magali San

    2017-01-01

    The need to adapt to decrease farm vulnerability to adverse contextual events has been extensively discussed on a theoretical basis. We developed an integrated and operational method to assess farm vulnerability to multiple and interacting contextual changes and explain how this vulnerability can best be reduced according to farm configurations and farmers’ technical adaptations over time. Our method considers farm vulnerability as a function of the raw measurements of vulnerability variables (e.g., economic efficiency of production), the slope of the linear regression of these measurements over time, and the residuals of this linear regression. The last two are extracted from linear mixed models considering a random regression coefficient (an intercept common to all farms), a global trend (a slope common to all farms), a random deviation from the general mean for each farm, and a random deviation from the general trend for each farm. Among all possible combinations, the lowest farm vulnerability is obtained through a combination of high values of measurements, a stable or increasing trend and low variability for all vulnerability variables considered. Our method enables relating the measurements, trends and residuals of vulnerability variables to explanatory variables that illustrate farm exposure to climatic and economic variability, initial farm configurations and farmers’ technical adaptations over time. We applied our method to 19 cattle (beef, dairy, and mixed) farms over the period 2008–2013. Selected vulnerability variables, i.e., farm productivity and economic efficiency, varied greatly among cattle farms and across years, with means ranging from 43.0 to 270.0 kg protein/ha and 29.4–66.0% efficiency, respectively. No farm had a high level, stable or increasing trend and low residuals for both farm productivity and economic efficiency of production. 
Thus, the least vulnerable farms represented a compromise among measurement value, trend, and variability of both performances. No specific combination of farmers’ practices emerged for reducing cattle farm vulnerability to climatic and economic variability. In the least vulnerable farms, the practices implemented (stocking rate, input use…) were more consistent with the objective of developing the properties targeted (efficiency, robustness…). Our method can be used to support farmers with sector-specific and local insights about most promising farm adaptations. PMID:28900435

  15. The Not-So-Global Blood Oxygen Level-Dependent Signal.

    PubMed

    Billings, Jacob; Keilholz, Shella

    2018-04-01

    Global signal regression is a controversial processing step for resting-state functional magnetic resonance imaging, partly because the source of the global blood oxygen level-dependent (BOLD) signal remains unclear. On the one hand, nuisance factors such as motion can readily introduce coherent BOLD changes across the whole brain. On the other hand, the global signal has been linked to neural activity and vigilance levels, suggesting that it contains important neurophysiological information and should not be discarded. Any widespread pattern of coordinated activity is likely to contribute appreciably to the global signal. Such patterns may include large-scale quasiperiodic spatiotemporal patterns, which are also known to be tied to performance on vigilance tasks. This uncertainty surrounding the separability of the global BOLD signal from concurrent neurological processes motivated an examination of the global BOLD signal's spatial distribution. The results clarify that although the global signal collects information from all tissue classes, a diverse subset of the BOLD signal's independent components contribute the most to the global signal. Further, the timing of each network's contribution to the global signal is not consistent across volunteers, confirming the independence of a constituent process that comprises the global signal.
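
    Mechanically, global signal regression is just a projection of each voxel's time course onto the orthogonal complement of the brain-wide mean signal. A toy numpy sketch (synthetic data, not BOLD):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Toy data: 50 "voxels" x 200 time points sharing a common fluctuation.
    t = 200
    shared = rng.normal(size=t)
    voxels = 0.8 * shared + rng.normal(scale=0.5, size=(50, t))

    # Global signal regression: regress every voxel time course on the mean
    # over all voxels and keep the residuals.
    g = voxels.mean(axis=0)
    g = g - g.mean()
    gd = g / np.sqrt(g @ g)                      # unit-norm global regressor
    V = voxels - voxels.mean(axis=1, keepdims=True)
    cleaned = V - np.outer(V @ gd, gd)           # project out the global component

    # Every cleaned voxel is now exactly orthogonal to the global signal,
    # which is why weakly coupled voxels can emerge as anticorrelated.
    print(float(np.abs(cleaned @ gd).max()))
    ```

    The forced orthogonality is the mathematical root of the anticorrelated networks debated in the GSR literature.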

  16. Global Intracoronary Infusion of Allogeneic Cardiosphere-Derived Cells Improves Ventricular Function and Stimulates Endogenous Myocyte Regeneration throughout the Heart in Swine with Hibernating Myocardium

    PubMed Central

    Suzuki, Gen; Weil, Brian R.; Leiker, Merced M.; Ribbeck, Amanda E.; Young, Rebeccah F.; Cimato, Thomas R.; Canty, John M.

    2014-01-01

    Background Cardiosphere-derived cells (CDCs) improve ventricular function and reduce fibrotic volume when administered via an infarct-related artery using the “stop-flow” technique. Unfortunately, myocyte loss and dysfunction occur globally in many patients with ischemic and non-ischemic cardiomyopathy, necessitating an approach to distribute CDCs throughout the entire heart. We therefore determined whether global intracoronary infusion of CDCs under continuous flow improves contractile function and stimulates new myocyte formation. Methods and Results Swine with hibernating myocardium from a chronic LAD occlusion were studied 3 months after instrumentation (n = 25). CDCs isolated from myocardial biopsies were infused into each major coronary artery (∼33×10⁶ icCDCs). Global icCDC infusion was safe and while ∼3% of injected CDCs were retained, they did not affect ventricular function or myocyte proliferation in normal animals. In contrast, four weeks after icCDCs were administered to animals with hibernating myocardium, %LADWT increased from 23±6 to 51±5% (p<0.01). In diseased hearts, myocyte proliferation (phospho-histone-H3) increased in hibernating and remote regions with a concomitant increase in myocyte nuclear density. These effects were accompanied by reductions in myocyte diameter consistent with new myocyte formation. Only rare myocytes arose from sex-mismatched donor CDCs. Conclusions Global icCDC infusion under continuous flow is feasible and improves contractile function, regresses myocyte cellular hypertrophy and increases myocyte proliferation in diseased but not normal hearts. New myocytes arising via differentiation of injected cells are rare, implicating stimulation of endogenous myocyte regeneration as the primary mechanism of repair. PMID:25402428

  17. Biogeography of Human Infectious Diseases: A Global Historical Analysis

    PubMed Central

    Cashdan, Elizabeth

    2014-01-01

    Objectives Human pathogen richness and prevalence vary widely across the globe, yet we know little about whether global patterns found in other taxa also predict diversity in this important group of organisms. This study (a) assesses the relative importance of temperature, precipitation, habitat diversity, and population density on the global distributions of human pathogens and (b) evaluates the species-area predictions of island biogeography for human pathogen distributions on oceanic islands. Methods Historical data were used in order to minimize the influence of differential access to modern health care on pathogen prevalence. The database includes coded data (pathogen, environmental and cultural) for a worldwide sample of 186 non-industrial cultures, including 37 on islands. Prevalence levels for 10 pathogens were combined into a pathogen prevalence index, and OLS regression was used to model the environmental determinants of the prevalence index and number of pathogens. Results Pathogens (number and prevalence index) showed the expected latitudinal gradient, but predictors varied by latitude. Pathogens increased with temperature in high-latitude zones, while mean annual precipitation was a more important predictor in low-latitude zones. Other environmental factors associated with more pathogens included seasonal dry extremes, frost-free climates, and human population density outside the tropics. Islands showed the expected species-area relationship for all but the smallest islands, and the relationship was not mediated by habitat diversity. Although geographic distributions of free-living and parasitic taxa typically have different determinants, these data show that variables that influence the distribution of free-living organisms also shape the global distribution of human pathogens. 
Understanding the cause of these distributions is potentially important, since geographical variation in human pathogens has an important influence on global disparities in human welfare. PMID:25271730

  18. Robust Eye Center Localization through Face Alignment and Invariant Isocentric Patterns

    PubMed Central

    Teng, Dongdong; Chen, Dihu; Tan, Hongzhou

    2015-01-01

    The localization of eye centers is a very useful cue for numerous applications like face recognition, facial expression recognition, and the early screening of neurological pathologies. Several methods relying on available light for accurate eye-center localization have been exploited. However, despite the considerable improvements that eye-center localization systems have undergone in recent years, only few of these developments deal with the challenges posed by the profile (non-frontal face). In this paper, we first use the explicit shape regression method to obtain the rough location of the eye centers. Because this method extracts global information from the human face, it is robust against any changes in the eye region. We exploit this robustness and utilize it as a constraint. To locate the eye centers accurately, we employ isophote curvature features, the accuracy of which has been demonstrated in a previous study. By applying these features, we obtain a series of eye-center locations which are candidates for the actual positions of the eye centers. Among these locations, the estimated locations which minimize the reconstruction error between the two methods mentioned above are taken as the closest approximation of the eye-center locations. Therefore, we combine explicit shape regression and isophote curvature feature analysis to achieve robustness and accuracy, respectively. In practical experiments, we use BioID and FERET datasets to test our approach to obtaining an accurate eye-center location while retaining robustness against changes in scale and pose. In addition, we apply our method to non-frontal faces to test its robustness and accuracy, which are essential in gaze estimation but have seldom been mentioned in previous works. Through extensive experimentation, we show that the proposed method can achieve a significant improvement in accuracy and robustness over state-of-the-art techniques, with our method ranking second in terms of accuracy. 
According to our implementation on a PC with a Xeon 2.5 GHz CPU, the frame rate of the eye tracking process can reach 38 Hz. PMID:26426929

  19. Modeling Governance KB with CATPCA to Overcome Multicollinearity in the Logistic Regression

    NASA Astrophysics Data System (ADS)

    Khikmah, L.; Wijayanto, H.; Syafitri, U. D.

    2017-04-01

    A problem often encountered in logistic regression modeling is multicollinearity. Multicollinearity between explanatory variables results in biased parameter estimates and in classification errors. Stepwise regression is generally used to overcome multicollinearity in regression. Another method, which retains all variables for prediction, is Principal Component Analysis (PCA). However, classical PCA is only suitable for numeric data; when the data are categorical, one method to solve the problem is Categorical Principal Component Analysis (CATPCA). The data used in this research were part of the Indonesia Demographic and Health Survey (IDHS) 2012. This research focuses on the characteristics of women using contraceptive methods. Classification results were evaluated using Area Under Curve (AUC) values; the higher the AUC value, the better. Based on AUC values, the classification of contraceptive method use by the stepwise method (58.66%) is better than that by the logistic regression model (57.39%) and CATPCA (57.39%). Evaluation of the results using sensitivity shows the opposite: the CATPCA method (99.79%) is better than the logistic regression method (92.43%) and stepwise (92.05%). Because this study focuses on classification of the major class (using a contraceptive method), the selected model is CATPCA, as it raises the accuracy for the major class.
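
    The decorrelation step that motivates (CAT)PCA can be shown with classical numeric PCA in numpy; CATPCA itself adds optimal scaling of categorical variables, which this hedged sketch omits:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 300

    # Two nearly collinear predictors: the multicollinearity problem.
    x1 = rng.normal(size=n)
    x2 = x1 + rng.normal(scale=0.05, size=n)
    X = np.column_stack([x1, x2])
    r_raw = np.corrcoef(x1, x2)[0, 1]            # ~0.999

    # Classical PCA: centre the data, rotate onto the right singular vectors.
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt.T                           # principal-component scores

    # Component scores are uncorrelated by construction, so a downstream
    # logistic regression no longer suffers from multicollinearity.
    r_pc = np.corrcoef(scores[:, 0], scores[:, 1])[0, 1]
    print(round(float(r_raw), 3), abs(float(r_pc)))
    ```

    Feeding the scores (rather than the raw, collinear predictors) into the logistic model is the common PCA-regression recipe that CATPCA extends to categorical data.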

  20. Estimating Contraceptive Prevalence Using Logistics Data for Short-Acting Methods: Analysis Across 30 Countries

    PubMed Central

    Cunningham, Marc; Brown, Niquelle; Sacher, Suzy; Hatch, Benjamin; Inglis, Andrew; Aronovich, Dana

    2015-01-01

    Background: Contraceptive prevalence rate (CPR) is a vital indicator used by country governments, international donors, and other stakeholders for measuring progress in family planning programs against country targets and global initiatives as well as for estimating health outcomes. Because of the need for more frequent CPR estimates than population-based surveys currently provide, alternative approaches for estimating CPRs are being explored, including using contraceptive logistics data. Methods: Using data from the Demographic and Health Surveys (DHS) in 30 countries, population data from the United States Census Bureau International Database, and logistics data from the Procurement Planning and Monitoring Report (PPMR) and the Pipeline Monitoring and Procurement Planning System (PipeLine), we developed and evaluated 3 models to generate country-level, public-sector contraceptive prevalence estimates for injectable contraceptives, oral contraceptives, and male condoms. Models included: direct estimation through existing couple-years of protection (CYP) conversion factors, bivariate linear regression, and multivariate linear regression. Model evaluation consisted of comparing the referent DHS prevalence rates for each short-acting method with the model-generated prevalence rate using multiple metrics, including mean absolute error and proportion of countries where the modeled prevalence rate for each method was within 1, 2, or 5 percentage points of the DHS referent value. Results: For the methods studied, family planning use estimates from public-sector logistics data were correlated with those from the DHS, validating the quality and accuracy of current public-sector logistics data. Logistics data for oral and injectable contraceptives were significantly associated (P<.05) with the referent DHS values for both bivariate and multivariate models. For condoms, however, that association was only significant for the bivariate model. 
With the exception of the CYP-based model for condoms, models were able to estimate public-sector prevalence rates for each short-acting method to within 2 percentage points in at least 85% of countries. Conclusions: Public-sector contraceptive logistics data are strongly correlated with public-sector prevalence rates for short-acting methods, demonstrating the quality of current logistics data and their ability to provide relatively accurate prevalence estimates. The models provide a starting point for generating interim estimates of contraceptive use when timely survey data are unavailable. All models except the condoms CYP model performed well; the regression models were most accurate but the CYP model offers the simplest calculation method. Future work extending the research to other modern methods, relating subnational logistics data with prevalence rates, and tracking that relationship over time is needed. PMID:26374805
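
    The direct CYP-based model is simple enough to sketch. The conversion factors below are the commonly cited USAID values (120 condoms, 15 pill cycles, or 4 three-month injections per couple-year of protection) and are stated here as assumptions, not taken from the article:

    ```python
    # Units distributed per couple-year of protection (CYP), by method.
    # Assumed, commonly cited USAID factors; the article's factors may differ.
    UNITS_PER_CYP = {"condom": 120, "oral": 15, "injectable": 4}

    def cyp_prevalence(units_distributed, method, women_15_49):
        """Public-sector prevalence estimate: CYP generated / women aged 15-49."""
        cyp = units_distributed / UNITS_PER_CYP[method]
        return cyp / women_15_49

    # e.g. 1.2M pill cycles distributed, 4M women of reproductive age:
    est = cyp_prevalence(1_200_000, "oral", 4_000_000)
    print(round(est * 100, 1))  # -> 2.0 (% public-sector prevalence)
    ```

    The article's regression models replace this fixed conversion with coefficients fitted against DHS referent values, which is why they were the more accurate option.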

  1. Regression modeling of ground-water flow

    USGS Publications Warehouse

    Cooley, R.L.; Naff, R.L.

    1985-01-01

    Nonlinear multiple regression methods are developed to model and analyze groundwater flow systems. Complete descriptions of regression methodology as applied to groundwater flow models allow scientists and engineers engaged in flow modeling to apply the methods to a wide range of problems. Organization of the text proceeds from an introduction that discusses the general topic of groundwater flow modeling, to a review of basic statistics necessary to properly apply regression techniques, and then to the main topic: exposition and use of linear and nonlinear regression to model groundwater flow. Statistical procedures are given to analyze and use the regression models. A number of exercises and answers are included to give the student practice with nearly all of the methods presented for modeling and statistical analysis. Three computer programs implement the more complex methods. These three are a general two-dimensional, steady-state regression model for flow in an anisotropic, heterogeneous porous medium, a program to calculate a measure of model nonlinearity with respect to the regression parameters, and a program to analyze model errors in computed dependent variables such as hydraulic head. (USGS)

  2. Investigating the Performance of Alternate Regression Weights by Studying All Possible Criteria in Regression Models with a Fixed Set of Predictors

    ERIC Educational Resources Information Center

    Waller, Niels; Jones, Jeff

    2011-01-01

    We describe methods for assessing all possible criteria (i.e., dependent variables) and subsets of criteria for regression models with a fixed set of predictors, x (where x is an n x 1 vector of independent variables). Our methods build upon the geometry of regression coefficients (hereafter called regression weights) in n-dimensional space. For a…

  3. Performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data.

    PubMed

    Yelland, Lisa N; Salter, Amy B; Ryan, Philip

    2011-10-15

    Modified Poisson regression, which combines a log Poisson regression model with robust variance estimation, is a useful alternative to log binomial regression for estimating relative risks. Previous studies have shown both analytically and by simulation that modified Poisson regression is appropriate for independent prospective data. This method is often applied to clustered prospective data, despite a lack of evidence to support its use in this setting. The purpose of this article is to evaluate the performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data, by using generalized estimating equations to account for clustering. A simulation study is conducted to compare log binomial regression and modified Poisson regression for analyzing clustered data from intervention and observational studies. Both methods generally perform well in terms of bias, type I error, and coverage. Unlike log binomial regression, modified Poisson regression is not prone to convergence problems. The methods are contrasted by using example data sets from 2 large studies. The results presented in this article support the use of modified Poisson regression as an alternative to log binomial regression for analyzing clustered prospective data when clustering is taken into account by using generalized estimating equations.
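
    The core of the modified Poisson approach — a log-link Poisson fit to a binary outcome plus a robust "sandwich" variance — can be sketched in numpy. This illustration is for independent data only; the clustered setting evaluated in the article additionally uses generalized estimating equations, which this sketch omits:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 5000

    # Binary exposure; binary outcome with a true relative risk of 2.0.
    x = rng.integers(0, 2, size=n)
    p = 0.1 * 2.0 ** x                  # risk 0.1 unexposed, 0.2 exposed
    y = (rng.random(n) < p).astype(float)
    X = np.column_stack([np.ones(n), x])

    # Log-link Poisson fit to the binary outcome via Newton-Raphson.
    beta = np.zeros(2)
    for _ in range(25):
        mu = np.exp(X @ beta)
        beta += np.linalg.solve(X.T @ (X * mu[:, None]), X.T @ (y - mu))

    # Robust (sandwich) variance -- the "modified" part of modified Poisson.
    mu = np.exp(X @ beta)
    bread = np.linalg.inv(X.T @ (X * mu[:, None]))
    meat = X.T @ (X * ((y - mu) ** 2)[:, None])
    robust_cov = bread @ meat @ bread

    rr = float(np.exp(beta[1]))         # estimated relative risk
    se = float(np.sqrt(robust_cov[1, 1]))
    print(round(rr, 2), round(se, 3))
    ```

    The robust variance corrects the overdispersion that a naive Poisson fit to binary data would otherwise mis-handle, avoiding the convergence problems of log binomial regression.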

  4. Age assessment based on third molar mineralisation : An epidemiological-radiological study on a Central-European population.

    PubMed

    Hofmann, Elisabeth; Robold, Matthias; Proff, Peter; Kirschneck, Christian

    2017-03-01

    The method published in 1973 by Demirjian et al. to assess age based on the mineralisation stage of permanent teeth is standard practice in forensic and orthodontic diagnostics. From age 14 onwards, however, this method is only applicable to third molars. No current epidemiological data on third molar mineralisation are available for Caucasian Central-Europeans. Thus, a method for assessing age in this population based on third molar mineralisation is presented, taking into account possible topographic and gender-specific differences. The study included 486 Caucasian Central-European orthodontic patients (9-24 years) with unaffected dental development. In an anonymized, randomized, and blinded manner, one orthopantomogram of each patient at either start, mid or end of treatment was visually analysed regarding the mineralisation stage of the third molars according to the method by Demirjian et al. Corresponding topographic and gender-specific point scores were determined and added to form a dental maturity score. Prediction equations for age assessment were derived by linear regression analysis with chronological age and checked for reliability within the study population. Mineralisation of the lower third molars was slower than mineralisation of the upper third molars, whereas no jaw-side-specific differences were detected. Gender-specific differences were relatively small, but girls reached mineralisation stage C earlier than boys, whereas boys showed an accelerated mineralisation between the ages of 15 and 16. The global equation generated by regression analysis (age = -1.103 + 0.268 × dental maturity score of teeth 18 + 28 + 38 + 48) is sufficiently accurate and reliable for clinical use. Age assessment only based on either maxilla or mandible also shows good prognostic reliability.
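
    Using the regression coefficients reported in the abstract, the age estimate is a one-liner; the input score of 40 below is purely illustrative, not a value from the study:

    ```python
    # Prediction equation from the abstract: the dental maturity score is the
    # sum of the stage scores of the four third molars (teeth 18, 28, 38, 48).
    def estimate_age(maturity_score):
        return -1.103 + 0.268 * maturity_score

    print(round(estimate_age(40), 3))  # -> 9.617 years for a summed score of 40
    ```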

  5. Assessing NARCCAP climate model effects using spatial confidence regions

    PubMed Central

    French, Joshua P.; McGinnis, Seth; Schwartzman, Armin

    2017-01-01

    We assess similarities and differences between model effects for the North American Regional Climate Change Assessment Program (NARCCAP) climate models using varying classes of linear regression models. Specifically, we consider how the average temperature effect differs for the various global and regional climate model combinations, including assessment of possible interaction between the effects of global and regional climate models. We use both pointwise and simultaneous inference procedures to identify regions where global and regional climate model effects differ. We also show conclusively that results from pointwise inference are misleading, and that accounting for multiple comparisons is important for making proper inference. PMID:28936474
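
    The pointwise-vs-simultaneous distinction is easy to demonstrate. Here a Bonferroni threshold stands in, loosely, for the paper's simultaneous confidence regions (stdlib plus numpy, synthetic z-scores with no true effect anywhere):

    ```python
    import numpy as np
    from statistics import NormalDist

    rng = np.random.default_rng(6)

    # 1000 grid cells with NO true model effect: z-scores are pure noise.
    m = 1000
    z = rng.normal(size=m)

    crit_point = NormalDist().inv_cdf(1 - 0.05 / 2)        # ~1.96, per-cell 5%
    crit_simul = NormalDist().inv_cdf(1 - 0.05 / (2 * m))  # ~4.06, Bonferroni

    false_point = int(np.sum(np.abs(z) > crit_point))
    false_simul = int(np.sum(np.abs(z) > crit_simul))
    print(false_point, false_simul)  # pointwise flags ~5% of cells spuriously
    ```

    Simultaneous procedures control the family-wise error at the cost of a much stricter threshold, which is why purely pointwise maps of model differences can be misleading.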

  6. Intention to communicate BRCA1/BRCA2 genetic test results to the family.

    PubMed

    Barsevick, Andrea M; Montgomery, Susan V; Ruth, Karen; Ross, Eric A; Egleston, Brian L; Bingler, Ruth; Malick, John; Miller, Suzanne M; Cescon, Terrence P; Daly, Mary B

    2008-04-01

    Guided by the theory of planned behavior, this analysis explores the communication skills of women who had genetic testing for BRCA1 and BRCA2. The key outcome was intention to tell test results to adult first-degree relatives. The theory predicts that global and specific attitudes, global and specific perceived social norms, and perceived control will influence the communication of genetic test results. A logistic regression model revealed that global attitude (p < .05), specific social influence (p < .01), and perceived control (p < .05) were significant predictors of intention to tell. When gender and generation of relatives were added to the regression, participants were more likely to convey genetic test results to female than to male relatives (p < .05) and were also more likely to communicate test results to children (p < .01) or siblings (p < .05) than to parents. However, this association depended on knowing the relative's opinion of genetic testing. Intention to tell was lowest among participants who did not know their relative's opinion. These results extend the theory of planned behavior by showing that gender and generation influence intention when the relative's opinion is unknown. (c) 2008 APA, all rights reserved.

  7. Global Analysis of Empirical Relationships Between Annual Climate and Seasonality of NDVI

    NASA Technical Reports Server (NTRS)

    Potter, C. S.

    1997-01-01

    This study describes the use of satellite data to calibrate a new climate-vegetation greenness function for global change studies. We examined statistical relationships between annual climate indexes (temperature, precipitation, and surface radiation) and seasonal attributes of the AVHRR Normalized Difference Vegetation Index (NDVI) time series for the mid-1980s in order to refine our empirical understanding of intraannual patterns and global abiotic controls on natural vegetation dynamics. Multiple linear regression results using global 1° gridded data sets suggest that three climate indexes: growing degree days, annual precipitation total, and an annual moisture index together can account for 70-80 percent of the variation in the NDVI seasonal extremes (maximum and minimum values) for the calibration year 1984. Inclusion of the same climate index values from the previous year explained no significant additional portion of the global scale variation in NDVI seasonal extremes. The monthly timing of NDVI extremes was closely associated with seasonal patterns in maximum and minimum temperature and rainfall, with lag times of 1 to 2 months. We separated well-drained areas from 1° grid cells mapped as greater than 25 percent inundated coverage for estimation of both the magnitude and timing of seasonal NDVI maximum values. Predicted monthly NDVI, derived from our climate-based regression equations and Fourier smoothing algorithms, shows good agreement with observed NDVI at a series of ecosystem test locations from around the globe. Regions in which NDVI seasonal extremes were not accurately predicted are mainly high latitude ecosystems and other remote locations where climate station data are sparse.
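
    The regression at the heart of the study, NDVI seasonal extremes on three annual climate indexes, can be mimicked on synthetic data. The coefficients and noise level below are invented, tuned only so the fit lands near the reported 70-80 percent range:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 500

    # Hypothetical standardized climate indexes per 1-degree grid cell.
    gdd = rng.normal(size=n)        # growing degree days
    precip = rng.normal(size=n)     # annual precipitation total
    moisture = rng.normal(size=n)   # annual moisture index

    # Synthetic "NDVI maximum" driven by the three indexes plus noise.
    ndvi_max = (0.5 * gdd + 0.4 * precip + 0.3 * moisture
                + rng.normal(scale=0.38, size=n))

    # Multiple linear regression and its coefficient of determination.
    X = np.column_stack([np.ones(n), gdd, precip, moisture])
    coef, *_ = np.linalg.lstsq(X, ndvi_max, rcond=None)
    resid = ndvi_max - X @ coef
    r2 = 1 - np.sum(resid**2) / np.sum((ndvi_max - ndvi_max.mean())**2)
    print(round(float(r2), 2))
    ```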

  8. The X-Factor: an evaluation of common methods used to analyse major inter-segment kinematics during the golf swing.

    PubMed

    Brown, Susan J; Selbie, W Scott; Wallace, Eric S

    2013-01-01

    A common biomechanical feature of a golf swing, described in various ways in the literature, is the interaction between the thorax and pelvis, often termed the X-Factor. There is no consistent method used within golf biomechanics literature however to calculate these segment interactions. The purpose of this study was to examine X-factor data calculated using three reported methods in order to determine the similarity or otherwise of the data calculated using each method. A twelve-camera three-dimensional motion capture system was used to capture the driver swings of 19 participants and a subject specific three-dimensional biomechanical model was created with the position and orientation of each model estimated using a global optimisation algorithm. Comparison of the X-Factor methods showed significant differences for events during the swing (P < 0.05). Data for each kinematic measure were derived as a time series for all three methods and regression analysis of these data showed that whilst one method could be successfully mapped to another, the mappings between methods are subject-dependent (P < 0.05). Findings suggest that a consistent methodology considering the X-Factor from a joint angle approach is most insightful in describing a golf swing.

  9. How to perform Subjective Global Nutritional assessment in children.

    PubMed

    Secker, Donna J; Jeejeebhoy, Khursheed N

    2012-03-01

    Subjective Global Assessment (SGA) is a method for evaluating nutritional status based on a practitioner's clinical judgment rather than objective, quantitative measurements. Encompassing historical, symptomatic, and physical parameters, SGA aims to identify an individual's initial nutrition state and consider the interplay of factors influencing the progression or regression of nutrition abnormalities. SGA has been widely used for more than 25 years to assess the nutritional status of adults in both clinical and research settings. Perceiving multiple benefits of its use in children, we recently adapted and validated the SGA tool for use in a pediatric population, demonstrating its ability to identify the nutritional status of children undergoing surgery and their risk of developing nutrition-associated complications postoperatively. Objective measures of nutritional status, on the other hand, showed no association with outcomes. The purpose of this article is to describe in detail the methods used in conducting nutrition-focused physical examinations and the medical history components of a pediatric Subjective Global Nutritional Assessment tool. Guidelines are given for performing and interpreting physical examinations that look for evidence of loss of subcutaneous fat, muscle wasting, and/or edema in children of different ages. Age-related questionnaires are offered to guide history taking and the rating of growth, weight changes, dietary intake, gastrointestinal symptoms, functional capacity, and any metabolic stress. Finally, the associated rating form is provided, along with direction for how to consider all components of a physical exam and history in the context of each other, to assign an overall rating of normal/well nourished, moderate malnutrition, or severe malnutrition. 
With this information, interested health professionals will be able to perform Subjective Global Nutritional Assessment to determine a global rating of nutritional status for infants, children, and adolescents, and use this rating to guide decision making about what nutrition-related attention is necessary. Dietetics practitioners and other clinicians are encouraged to incorporate physical examination for signs of protein-energy depletion when assessing the nutritional status of children. Copyright © 2012 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  10. The Precision Efficacy Analysis for Regression Sample Size Method.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Barcikowski, Robert S.

    The general purpose of this study was to examine the efficiency of the Precision Efficacy Analysis for Regression (PEAR) method for choosing appropriate sample sizes in regression studies used for precision. The PEAR method, which is based on the algebraic manipulation of an accepted cross-validity formula, essentially uses an effect size to…

  11. Methods for identifying SNP interactions: a review on variations of Logic Regression, Random Forest and Bayesian logistic regression.

    PubMed

    Chen, Carla Chia-Ming; Schwender, Holger; Keith, Jonathan; Nunkesser, Robin; Mengersen, Kerrie; Macrossan, Paula

    2011-01-01

    Due to advancements in computational ability, enhanced technology and a reduction in the price of genotyping, more data are being generated for understanding genetic associations with diseases and disorders. However, with the availability of large data sets comes the inherent challenges of new methods of statistical analysis and modeling. Considering a complex phenotype may be the effect of a combination of multiple loci, various statistical methods have been developed for identifying genetic epistasis effects. Among these methods, logic regression (LR) is an intriguing approach incorporating tree-like structures. Various methods have built on the original LR to improve different aspects of the model. In this study, we review four variations of LR, namely Logic Feature Selection, Monte Carlo Logic Regression, Genetic Programming for Association Studies, and Modified Logic Regression-Gene Expression Programming, and investigate the performance of each method using simulated and real genotype data. We contrast these with another tree-like approach, namely Random Forests, and a Bayesian logistic regression with stochastic search variable selection.

  12. cp-R, an interface to the R programming language for clinical laboratory method comparisons.

    PubMed

    Holmes, Daniel T

    2015-02-01

    Clinical scientists frequently need to compare two different bioanalytical methods as part of assay validation/monitoring. As a matter of necessity, regression methods for quantitative comparison in clinical chemistry, hematology, and other clinical laboratory disciplines must allow for error in both the x and y variables. Traditionally, the methods popularized by 1) Deming and 2) Passing and Bablok have been recommended. While commercial tools exist, no simple open-source tool is available. The purpose of this work was to develop an entirely open-source, GUI-driven program for bioanalytical method comparisons capable of performing these regression methods and able to produce highly customized graphical output. The GUI is written in Python and PyQt4, with R scripts performing regression and graphical functions. The program can be run from source code or as a pre-compiled binary executable. The software performs three forms of regression and offers weighting where applicable. Confidence bands of the regression are calculated using bootstrapping for the Deming and Passing-Bablok methods. Users can customize regression plots according to the tools available in R and can produce output in jpg, png, tiff, or bmp format at any desired resolution, or in ps and pdf vector formats. Bland-Altman plots and some regression diagnostic plots are also generated. Correctness of regression parameter estimates was confirmed against existing R packages. The program allows for rapid and highly customizable graphical output capable of conforming to the publication requirements of any clinical chemistry journal. Quick method comparisons can also be performed and the results cut and pasted into spreadsheet or word processing applications. We present a simple and intuitive open-source tool for quantitative method comparison in a clinical laboratory environment. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
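For readers unfamiliar with errors-in-variables fits, the sketch below implements the closed-form Deming regression estimator on simulated method-comparison data. This is a minimal illustration, not cp-R's actual code; the simulated biases and noise levels are invented, and `lam` is the assumed ratio of y- to x-error variances (1.0 when both methods are believed equally precise).

```python
import numpy as np

def deming(x, y, lam=1.0):
    """Deming regression: least squares allowing error in both x and y.

    lam is the ratio of the y-error variance to the x-error variance;
    lam=1.0 gives orthogonal regression.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx, syy = x.var(ddof=1), y.var(ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    slope = (syy - lam * sxx
             + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) \
        / (2 * sxy)
    intercept = y.mean() - slope * x.mean()
    return intercept, slope

# Simulated method comparison: method B reads ~5% high with a small
# constant bias, and both methods carry comparable measurement noise.
rng = np.random.default_rng(1)
truth = rng.uniform(2, 20, 200)
method_a = truth + rng.normal(0, 0.3, 200)
method_b = 0.5 + 1.05 * truth + rng.normal(0, 0.3, 200)

b0, b1 = deming(method_a, method_b)
print(round(b0, 2), round(b1, 2))  # near the simulated bias of 0.5 + 1.05x
```

In practice the confidence bands described above would come from bootstrapping this fit over resampled (x, y) pairs.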

  13. Price of gasoline: forecasting comparisons. [Box-Jenkins, econometric, and regression methods]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bopp, A.E.; Neri, J.A.

    Gasoline prices are simulated using three popular forecasting methodologies: a Box-Jenkins-type method, an econometric method, and a regression method. One-period-ahead and 18-period-ahead comparisons are made. For the one-period-ahead comparison, a Box-Jenkins-type time-series model simulated best, although all do well. However, for the 18-period simulation, the econometric and regression methods perform substantially better than the Box-Jenkins formulation. A rationale for and implications of these results are discussed. 11 references.
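The horizon effect reported above can be reproduced in miniature with a toy autoregressive model: one-step-ahead forecasts condition on the latest observation, while long-horizon forecasts decay toward the series mean and lose accuracy. This sketch is illustrative only; it is not the Box-Jenkins or econometric specification used in the study, and the series is simulated.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate an AR(1) series: each value depends on the previous one.
phi, n = 0.8, 400
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.normal()

train, test = y[:300], y[300:]

# Fit phi by regressing y[t] on y[t-1] over the training window.
phi_hat = np.linalg.lstsq(train[:-1, None], train[1:], rcond=None)[0][0]

# One-step-ahead forecasts use the actual previous observation.
one_step = phi_hat * test[:-1]
err_1 = np.sqrt(np.mean((test[1:] - one_step) ** 2))

# An 18-step-ahead forecast iterates the model: phi**18 shrinks the
# forecast toward the unconditional mean, so errors grow with horizon.
h = 18
long_fc = phi_hat ** h * test[:-h]
err_18 = np.sqrt(np.mean((test[h:] - long_fc) ** 2))

print(err_1 < err_18)
```

At long horizons a pure time-series model carries little information, which is consistent with structural (econometric) models doing relatively better in the 18-period comparison.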

  14. HIV prevention costs and program scale: data from the PANCEA project in five low and middle-income countries.

    PubMed

    Marseille, Elliot; Dandona, Lalit; Marshall, Nell; Gaist, Paul; Bautista-Arredondo, Sergio; Rollins, Brandi; Bertozzi, Stefano M; Coovadia, Jerry; Saba, Joseph; Lioznov, Dmitry; Du Plessis, Jo-Ann; Krupitsky, Evgeny; Stanley, Nicci; Over, Mead; Peryshkina, Alena; Kumar, S G Prem; Muyingo, Sowedi; Pitter, Christian; Lundberg, Mattias; Kahn, James G

    2007-07-12

    Economic theory and limited empirical data suggest that costs per unit of HIV prevention program output (unit costs) will initially decrease as small programs expand. Unit costs may then reach a nadir and start to increase if expansion continues beyond the economically optimal size. Information on the relationship between scale and unit costs is critical to project the cost of global HIV prevention efforts and to allocate prevention resources efficiently. The "Prevent AIDS: Network for Cost-Effectiveness Analysis" (PANCEA) project collected 2003 and 2004 cost and output data from 206 HIV prevention programs of six types in five countries. The association between scale and efficiency for each intervention type was examined for each country. Our team characterized the direction, shape, and strength of this association by fitting bivariate regression lines to scatter plots of output levels and unit costs. We chose the regression forms with the highest explanatory power (R²). Efficiency increased with scale, across all countries and interventions. This association varied within intervention and within country, in terms of the range in scale and efficiency, the best-fitting regression form, and the slope of the regression. The fraction of variation in efficiency explained by scale ranged from 26% to 96%. A doubling in scale resulted in reductions in unit costs averaging 34.2% (ranging from 2.4% to 58.0%). Two regression trends, in India, suggested an inflection point beyond which unit costs increased. Unit costs decrease with scale across a wide range of service types and volumes. These country- and intervention-specific findings can inform projections of the global cost of scaling up HIV prevention efforts.
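A common way to quantify a scale-efficiency relationship like the one described above is a log-log regression, where a slope b implies that doubling scale multiplies unit cost by 2**b. The sketch below uses invented program data, with the exponent chosen near the paper's reported average reduction; it is not PANCEA's data or its model-selection procedure.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic programs: unit costs fall with scale (economies of scale).
scale = rng.uniform(100, 20000, 80)          # e.g., clients reached
unit_cost = 50 * scale ** -0.6 * rng.lognormal(0, 0.2, 80)

# Fit log(unit_cost) = a + b*log(scale); b < 0 means efficiency
# improves with scale.
X = np.column_stack([np.ones(scale.size), np.log(scale)])
(a, b), *_ = np.linalg.lstsq(X, np.log(unit_cost), rcond=None)

# A doubling of scale changes unit cost by a factor of 2**b.
pct_change = (2 ** b - 1) * 100
print(round(pct_change, 1))  # roughly -34% when b is near -0.6
```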

  15. Impact of the global financial crisis on low birth weight in Portugal: a time-trend analysis

    PubMed Central

    Kana, Musa Abubakar; Correia, Sofia; Peleteiro, Barbara; Severo, Milton; Barros, Henrique

    2017-01-01

    Background The 2007–2008 global financial crisis had adverse consequences on the population health of affected European countries. Few contemporary studies have studied its effect on perinatal indicators with long-lasting influence on adult health. Therefore, in this study, we investigated the impact of the 2007–2008 global financial crisis on low birth weight (LBW) in Portugal. Methods Data on 2 045 155 singleton births of 1995–2014 were obtained from Statistics Portugal. Joinpoint regression analysis was performed to identify the years in which changes in LBW trends occurred, and to estimate the annual per cent changes (APC). LBW risk by time period, expressed as prevalence ratios, was computed using Poisson regression. Contextual changes in sociodemographic and economic factors were provided by their trends. Results The joinpoint analysis identified 3 distinct periods (2 joinpoints) with different APC in LBW, corresponding to 1995–1999 (APC=4.4; 95% CI 3.2 to 5.6), 2000–2006 (APC=0.1; 95% CI −0.5 to 0.7) and 2007–2014 (APC=1.6; 95% CI 1.2 to 2.0). For non-Portuguese mothers, the corresponding periods were 1995–1999 (APC=1.4; 95% CI −3.9 to 7.0), 2000–2007 (APC=−4.2; 95% CI −6.4 to −2.0) and 2008–2014 (APC=3.1; 95% CI 0.8 to 5.5). Compared with 1995–1999, all specific maternal characteristics had a 10–15% increase in LBW risk in 2000–2006 and a 20–25% increase in 2007–2014, except among migrants, for which LBW risk remained lower than in 1995–1999 but increased after the crisis. The increasing LBW risk coincides with a deceleration in gross domestic product growth rate, a reduction in health expenditure, and reduced social protection allocations for family/children support and sickness. Conclusions The 2007–2008 global financial crisis was associated with a significant increase in LBW, particularly among infants of non-Portuguese mothers. 
We recommend strengthening social policies aimed at maternity protection for vulnerable mothers and health system maintenance of social equity in perinatal healthcare. PMID:28589009
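The annual percent change figures above come from log-linear trend fits; within a single segment, APC can be computed as sketched below. The full joinpoint method additionally searches for the years where segments break, which this sketch does not attempt, and the prevalence values here are made up for illustration.

```python
import numpy as np

def annual_percent_change(years, prevalence):
    """APC from a log-linear trend: fit log(rate) = a + b*year,
    then APC = 100 * (exp(b) - 1)."""
    years = np.asarray(years, float)
    b = np.polyfit(years, np.log(prevalence), 1)[0]
    return 100 * (np.exp(b) - 1)

# Illustrative segment: a prevalence rising ~1.6% per year, matching
# the APC reported for 2007-2014 (the values themselves are invented).
years = np.arange(2007, 2015)
lbw = 7.0 * 1.016 ** (years - 2007)
print(round(annual_percent_change(years, lbw), 1))  # → 1.6
```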

  16. Quality of life in breast cancer patients--a quantile regression analysis.

    PubMed

    Pourhoseingholi, Mohamad Amin; Safaee, Azadeh; Moghimi-Dehkordi, Bijan; Zeighami, Bahram; Faghihzadeh, Soghrat; Tabatabaee, Hamid Reza; Pourhoseingholi, Asma

    2008-01-01

    Quality of life studies play an important role in health care, especially for chronic diseases, in clinical judgment and in the allocation of medical resources. Statistical tools like linear regression are widely used to assess the predictors of quality of life, but when the response is not normally distributed the results can be misleading. The aim of this study was to determine the predictors of quality of life in breast cancer patients using a quantile regression model and to compare the results with linear regression. A cross-sectional study was conducted on 119 breast cancer patients admitted and treated in the chemotherapy ward of Namazi hospital in Shiraz. We used the QLQ-C30 questionnaire to assess quality of life in these patients. A quantile regression was employed to assess the associated factors, and the results were compared to linear regression. All analyses were carried out using SAS. The mean score for the global health status for breast cancer patients was 64.92 +/- 11.42. Linear regression showed that only grade of tumor, occupational status, menopausal status, financial difficulties and dyspnea were statistically significant. In contrast to linear regression, financial difficulties were not significant in the quantile regression analysis, and dyspnea was significant only for the first quartile. Emotional functioning and duration of disease also statistically predicted the QOL score in the third quartile. The results demonstrate that using quantile regression leads to better interpretation and richer inference about predictors of quality of life in breast cancer patients.
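Quantile regression replaces squared error with the asymmetric check (pinball) loss, whose minimizer is a conditional quantile rather than the conditional mean, which is why it behaves differently on skewed outcomes. The sketch below demonstrates the loss on a skewed synthetic outcome (not the QLQ-C30 data): minimizing it over a grid recovers the sample quantile.

```python
import numpy as np

def pinball_loss(q, y, tau):
    """Check (pinball) loss; its minimizer over q is the tau-th quantile."""
    e = y - q
    return np.mean(np.maximum(tau * e, (tau - 1) * e))

# A skewed outcome, loosely analogous to scores with a long tail.
rng = np.random.default_rng(4)
y = rng.gamma(shape=2.0, scale=10.0, size=5000)

# Minimizing the pinball loss over a grid recovers the sample quantile;
# quantile regression applies the same loss to a linear predictor.
tau = 0.75
grid = np.linspace(y.min(), y.max(), 2000)
q_hat = grid[np.argmin([pinball_loss(q, y, tau) for q in grid])]
print(abs(q_hat - np.quantile(y, tau)) < 0.5)
```

Fitting separate taus (e.g., 0.25, 0.5, 0.75) is what lets predictors matter in one part of the outcome distribution but not another, as the abstract reports.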

  17. Fungible weights in logistic regression.

    PubMed

    Jones, Jeff A; Waller, Niels G

    2016-06-01

    In this article we develop methods for assessing parameter sensitivity in logistic regression models. To set the stage for this work, we first review Waller's (2008) equations for computing fungible weights in linear regression. Next, we describe 2 methods for computing fungible weights in logistic regression. To demonstrate the utility of these methods, we compute fungible logistic regression weights using data from the Centers for Disease Control and Prevention's (2010) Youth Risk Behavior Surveillance Survey, and we illustrate how these alternate weights can be used to evaluate parameter sensitivity. To make our work accessible to the research community, we provide R code (R Core Team, 2015) that will generate both kinds of fungible logistic regression weights. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  18. Bootstrap Enhanced Penalized Regression for Variable Selection with Neuroimaging Data.

    PubMed

    Abram, Samantha V; Helwig, Nathaniel E; Moodie, Craig A; DeYoung, Colin G; MacDonald, Angus W; Waller, Niels G

    2016-01-01

    Recent advances in fMRI research highlight the use of multivariate methods for examining whole-brain connectivity. Complementary data-driven methods are needed for determining the subset of predictors related to individual differences. Although commonly used for this purpose, ordinary least squares (OLS) regression may not be ideal due to multi-collinearity and over-fitting issues. Penalized regression is a promising and underutilized alternative to OLS regression. In this paper, we propose a nonparametric bootstrap quantile (QNT) approach for variable selection with neuroimaging data. We use real and simulated data, as well as annotated R code, to demonstrate the benefits of our proposed method. Our results illustrate the practical potential of our proposed bootstrap QNT approach. Our real data example demonstrates how our method can be used to relate individual differences in neural network connectivity with an externalizing personality measure. Also, our simulation results reveal that the QNT method is effective under a variety of data conditions. Penalized regression yields more stable estimates and sparser models than OLS regression in situations with large numbers of highly correlated neural predictors. Our results demonstrate that penalized regression is a promising method for examining associations between neural predictors and clinically relevant traits or behaviors. These findings have important implications for the growing field of functional connectivity research, where multivariate methods produce numerous, highly correlated brain networks.
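A minimal sketch of why penalization helps with correlated predictors: with near-collinear columns, OLS coefficients become unstable, while a ridge penalty keeps them bounded. This illustrates the general principle only; the paper's QNT method additionally takes bootstrap quantiles of penalized estimates for variable selection, which is not shown here, and all data below are simulated.

```python
import numpy as np

rng = np.random.default_rng(5)

# Many highly correlated predictors, loosely analogous to overlapping
# brain-network measures.
n, p = 60, 40
base = rng.normal(size=(n, 1))
X = base + 0.1 * rng.normal(size=(n, p))   # columns correlate strongly
beta_true = np.zeros(p)
beta_true[:3] = 1.0
y = X @ beta_true + rng.normal(size=n)

def fit(X, y, lam):
    """Ridge solution (lam=0 reduces to OLS): (X'X + lam*I)^-1 X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

ols = fit(X, y, 0.0)
ridge = fit(X, y, 5.0)

# With collinear predictors the OLS coefficients blow up to fit noise,
# while the penalized solution stays bounded.
print(np.abs(ols).max() > np.abs(ridge).max())
```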

  20. "That was intense!" Spirituality during childbirth: a mixed-method comparative study of mothers' and fathers' experiences in a public hospital.

    PubMed

    Bélanger-Lévesque, Marie-Noëlle; Dumas, Marc; Blouin, Simon; Pasquier, Jean-Charles

    2016-09-30

    While spirituality is well described in end-of-life care literature, research on its place in the delivery room remains largely limited to mother-oriented qualitative studies focusing on life-threatening situations (e.g., high-risk pregnancies). Our aim was to compare mothers' and fathers' spirituality during childbirth. A mixed methods questionnaire was developed from our childbirth-related spirituality categorization and distributed to all parents of newborns, 12-24 h postpartum, over 45 consecutive days. Paired-sample t-tests and qualitative thematic analysis were used to compare mothers and fathers. Multiple linear regressions identified factors associated with their respective global scores (vaginal and cesarean deliveries separately). The global scores for mothers (38.6/50) and fathers (37.2/50) were similarly high (N = 197; p = 0.001). Highest-ranked ("respect", "moral responsibility", "beauty of life", "gratitude") and lowest-ranked spiritual themes ("prayer", "greater than self") were in agreement. Fathers scored higher on "fragility of life" (p = 0.006) and mothers on "self-accomplishment" (p < 0.001), "letting go" (p < 0.001), and "meaningfulness" (p = 0.003). "Admission of baby in neonatal unit" was associated with a higher global score for both mothers and fathers. Other factors also increased fathers' (witnessing a severe tear) and mothers' scores (birthplace outside Canada; for vaginal deliveries, religious belonging and longer pushing stage). These first quantitative data on the prevalence of spirituality during childbirth highlight a high score for both parents, among a non-selected public hospital population. Spirituality emerges not only from unordinary situations but from any childbirth as an "intensification of the human experience". Significant differences for some spiritual themes indicate the need to consider the spirituality of both parents.

  1. Reader reaction to "a robust method for estimating optimal treatment regimes" by Zhang et al. (2012).

    PubMed

    Taylor, Jeremy M G; Cheng, Wenting; Foster, Jared C

    2015-03-01

    A recent article (Zhang et al., 2012, Biometrics 68, 1010-1018) compares regression-based and inverse probability based methods of estimating an optimal treatment regime and shows, for a small number of covariates, that inverse probability weighted methods are more robust to model misspecification than regression methods. We demonstrate that using models that fit the data better reduces the concern about non-robustness for the regression methods. We extend the simulation study of Zhang et al. (2012, Biometrics 68, 1010-1018), also considering the situation of a larger number of covariates, and show that incorporating random forests into both regression and inverse probability weighted methods improves their properties. © 2014, The International Biometric Society.

  2. Improving the quality of marine geophysical track line data: Along-track analysis

    NASA Astrophysics Data System (ADS)

    Chandler, Michael T.; Wessel, Paul

    2008-02-01

    We have examined 4918 track line geophysics cruises archived at the U.S. National Geophysical Data Center (NGDC) using comprehensive error checking methods. Each cruise was checked for observation outliers, excessive gradients, metadata consistency, and general agreement with satellite altimetry-derived gravity and predicted bathymetry grids. Thresholds for error checking were determined empirically through inspection of histograms for all geophysical values, gradients, and differences with gridded data sampled along ship tracks. Robust regression was used to detect systematic scale and offset errors found by comparing ship bathymetry and free-air anomalies to the corresponding values from global grids. We found many recurring error types in the NGDC archive, including poor navigation, inappropriately scaled or offset data, excessive gradients, and extended offsets in depth and gravity when compared to global grids. While ~5-10% of bathymetry and free-air gravity records fail our conservative tests, residual magnetic errors may exceed twice this proportion. These errors hinder the effective use of the data and may lead to mistakes in interpretation. To enable the removal of gross errors without over-writing original cruise data, we developed an errata system that concisely reports all errors encountered in a cruise. With such errata files, scientists may share cruise corrections, thereby preventing redundant processing. We have implemented these quality control methods in the modified MGD77 supplement to the Generic Mapping Tools software suite.
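Robust regression of the kind used for detecting systematic scale and offset errors can be sketched with iteratively reweighted least squares and Huber weights, which down-weight gross outliers so a contaminated minority of records does not corrupt the fit. This is an illustrative implementation on simulated depths (e.g., a segment accidentally logged in fathoms rather than meters), not the authors' processing code.

```python
import numpy as np

def huber_irls(x, y, k=1.345, iters=50):
    """Line fit by iteratively reweighted least squares with Huber
    weights, so gross outliers barely affect the estimate."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12   # robust scale (MAD)
        u = np.abs(r / s)
        w = np.where(u <= k, 1.0, k / u)            # Huber weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return beta

# Simulated comparison of ship depths against a global grid; 10% of
# records carry a unit error (meters recorded as fathoms, 1 fathom
# = 1.8288 m, so the values are divided by 1.8288).
rng = np.random.default_rng(6)
grid_depth = rng.uniform(1000, 5000, 300)
ship_depth = grid_depth + rng.normal(0, 30, 300)
ship_depth[:30] = grid_depth[:30] / 1.8288 + rng.normal(0, 30, 30)

b0, b1 = huber_irls(grid_depth, ship_depth)
print(round(b1, 2))  # robust slope stays near 1 despite the bad segment
```

Records with large robust-fit residuals can then be flagged individually, which is the spirit of the errata system described above.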

  3. Private land manager capacity to conserve threatened communities under climate change.

    PubMed

    Raymond, C M; Lechner, A M; Lockwood, M; Carter, O; Harris, R M B; Gilfedder, L

    2015-08-15

    Major global changes in vegetation community distributions and ecosystem processes are expected as a result of climate change. In agricultural regions with a predominance of private land, biodiversity outcomes will depend on the adaptive capacity of individual land managers, as well as their willingness to engage with conservation programs and actions. Understanding the adaptive capacity of landholders is critical for assessing future prospects for biodiversity conservation in privately owned agricultural landscapes globally, given projected climate change. This paper is the first to develop and apply a set of statistical methods (correlation and binomial regression analyses) for combining social data on land manager adaptive capacity and factors associated with conservation program participation with biophysical data describing the current and projected-future distribution of climate suitable for vegetation communities. We apply these methods to the Tasmanian Midlands region of Tasmania, Australia and discuss the implications of the modelled results on conservation program strategy design in other contexts. We find that the integrated results can be used by environmental management organisations to design community engagement programs, and to tailor their messages to land managers with different capacity types and information behaviours. We encourage environmental agencies to target high capacity land managers by diffusing climate change and grassland management information through well respected conservation NGOs and farm system groups, and engage low capacity land managers via formalized mentoring programs. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Estimating Contraceptive Prevalence Using Logistics Data for Short-Acting Methods: Analysis Across 30 Countries.

    PubMed

    Cunningham, Marc; Bock, Ariella; Brown, Niquelle; Sacher, Suzy; Hatch, Benjamin; Inglis, Andrew; Aronovich, Dana

    2015-09-01

    Contraceptive prevalence rate (CPR) is a vital indicator used by country governments, international donors, and other stakeholders for measuring progress in family planning programs against country targets and global initiatives as well as for estimating health outcomes. Because of the need for more frequent CPR estimates than population-based surveys currently provide, alternative approaches for estimating CPRs are being explored, including using contraceptive logistics data. Using data from the Demographic and Health Surveys (DHS) in 30 countries, population data from the United States Census Bureau International Database, and logistics data from the Procurement Planning and Monitoring Report (PPMR) and the Pipeline Monitoring and Procurement Planning System (PipeLine), we developed and evaluated 3 models to generate country-level, public-sector contraceptive prevalence estimates for injectable contraceptives, oral contraceptives, and male condoms. Models included: direct estimation through existing couple-years of protection (CYP) conversion factors, bivariate linear regression, and multivariate linear regression. Model evaluation consisted of comparing the referent DHS prevalence rates for each short-acting method with the model-generated prevalence rate using multiple metrics, including mean absolute error and proportion of countries where the modeled prevalence rate for each method was within 1, 2, or 5 percentage points of the DHS referent value. For the methods studied, family planning use estimates from public-sector logistics data were correlated with those from the DHS, validating the quality and accuracy of current public-sector logistics data. Logistics data for oral and injectable contraceptives were significantly associated (P<.05) with the referent DHS values for both bivariate and multivariate models. For condoms, however, that association was only significant for the bivariate model. 
With the exception of the CYP-based model for condoms, models were able to estimate public-sector prevalence rates for each short-acting method to within 2 percentage points in at least 85% of countries. Public-sector contraceptive logistics data are strongly correlated with public-sector prevalence rates for short-acting methods, demonstrating the quality of current logistics data and their ability to provide relatively accurate prevalence estimates. The models provide a starting point for generating interim estimates of contraceptive use when timely survey data are unavailable. All models except the condoms CYP model performed well; the regression models were most accurate but the CYP model offers the simplest calculation method. Future work extending the research to other modern methods, relating subnational logistics data with prevalence rates, and tracking that relationship over time is needed. © Cunningham et al.
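The direct CYP-based model is the simplest of the three: divide quantities distributed by a couple-years-of-protection conversion factor, then by the number of women of reproductive age. A sketch, using commonly cited conversion factors and a wholly invented country example; consult current guidance before relying on specific factor values.

```python
# Units distributed per couple-year of protection (commonly used values;
# treat these as assumptions, not authoritative figures).
CYP_FACTORS = {
    "condom": 120,       # condoms
    "oral": 15,          # pill cycles
    "injectable": 4,     # 3-month doses
}

def cyp_prevalence(distributed, method, wra):
    """Estimated public-sector prevalence (%) for one short-acting method:
    convert distribution volume to CYP, then divide by women of
    reproductive age (WRA)."""
    cyp = distributed / CYP_FACTORS[method]
    return 100 * cyp / wra

# Hypothetical country: 2.4 million pill cycles distributed, 5 million WRA.
print(round(cyp_prevalence(2_400_000, "oral", 5_000_000), 1))  # → 3.2
```

The bivariate and multivariate models in the study instead regress survey-based prevalence on logistics-derived quantities, which lets the data correct for systematic over- or under-counting in distribution records.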

  5. Basis Selection for Wavelet Regression

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.; Lau, Sonie (Technical Monitor)

    1998-01-01

    A wavelet basis selection procedure is presented for wavelet regression. Both the basis and the threshold are selected using cross-validation. The method includes the capability of incorporating prior knowledge on the smoothness (or shape of the basis functions) into the basis selection procedure. The method is demonstrated on sampled functions widely used in the wavelet regression literature and contrasted with other published methods.
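The core wavelet-regression operation, shrinking small detail coefficients toward zero, can be sketched with a one-level Haar transform. Here the threshold is fixed by hand, whereas the procedure above selects both the basis and the threshold by cross-validation; the signal and noise level below are invented.

```python
import numpy as np

def haar(y):
    """One-level Haar transform: averages (a) and details (d)."""
    a = (y[0::2] + y[1::2]) / np.sqrt(2)
    d = (y[0::2] - y[1::2]) / np.sqrt(2)
    return a, d

def ihaar(a, d):
    """Inverse of the one-level Haar transform."""
    y = np.empty(2 * a.size)
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y

# A smooth signal plus noise; details of a smooth signal are small,
# so hard-thresholding them removes mostly noise.
rng = np.random.default_rng(7)
t = np.linspace(0, 1, 256)
signal = np.sin(2 * np.pi * 4 * t)
noisy = signal + rng.normal(0, 0.3, t.size)

a, d = haar(noisy)
d_shrunk = np.where(np.abs(d) > 0.6, d, 0.0)   # hard threshold at 0.6
denoised = ihaar(a, d_shrunk)

mse_noisy = np.mean((noisy - signal) ** 2)
mse_denoised = np.mean((denoised - signal) ** 2)
print(mse_denoised < mse_noisy)
```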

  6. The rise of global warming skepticism: exploring affective image associations in the United States over time.

    PubMed

    Smith, Nicholas; Leiserowitz, Anthony

    2012-06-01

    This article explores how affective image associations to global warming have changed over time. Four nationally representative surveys of the American public were conducted between 2002 and 2010 to assess public global warming risk perceptions, policy preferences, and behavior. Affective images (positive or negative feelings and cognitive representations) were collected and content analyzed. The results demonstrate a large increase in "naysayer" associations, indicating extreme skepticism about the issue of climate change. Multiple regression analyses found that holistic affect and "naysayer" associations were more significant predictors of global warming risk perceptions than cultural worldviews or sociodemographic variables, including political party and ideology. The results demonstrate the important role affective imagery plays in judgment and decision-making processes, how these variables change over time, and how global warming is currently perceived by the American public. © 2012 Society for Risk Analysis.

  7. Vegetation Monitoring with Gaussian Processes and Latent Force Models

    NASA Astrophysics Data System (ADS)

    Camps-Valls, Gustau; Svendsen, Daniel; Martino, Luca; Campos, Manuel; Luengo, David

    2017-04-01

    Monitoring vegetation by biophysical parameter retrieval from Earth observation data is a challenging problem, where machine learning is currently a key player. Neural networks, kernel methods, and Gaussian Process (GP) regression have excelled in parameter retrieval tasks at both local and global scales. GP regression is based on solid Bayesian statistics, yields efficient and accurate parameter estimates, and provides interesting advantages over competing machine learning approaches such as confidence intervals. However, GP models are hampered by a lack of interpretability, which has prevented widespread adoption by a larger community. In this presentation we will summarize some of our latest developments to address this issue. We will review the main characteristics of GPs and their advantages in standard vegetation monitoring applications. Then, three advanced GP models will be introduced. First, we will derive sensitivity maps for the GP predictive function that allow us to obtain feature ranking from the model and to assess the influence of examples in the solution. Second, we will introduce a Joint GP (JGP) model that combines in situ measurements and simulated radiative transfer data in a single GP model. The JGP regression provides more sensible confidence intervals for the predictions, respects the physics of the underlying processes, and allows for transferability across time and space. Finally, a latent force model (LFM) for GP modeling that encodes ordinary differential equations to blend data-driven modeling and physical models of the system is presented. The LFM performs multi-output regression, adapts to the signal characteristics, is able to cope with missing data in the time series, and provides explicit latent functions that allow system analysis and evaluation. Empirical evidence of the performance of these models will be presented through illustrative examples.
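Standard GP regression, the baseline the advanced models above build on, fits in a few lines: the posterior mean and variance follow from the kernel matrix and the noisy targets. The sketch below uses a squared-exponential kernel with hand-picked hyperparameters on synthetic data; real retrieval applications would learn the hyperparameters and use radiative-transfer or in situ inputs.

```python
import numpy as np

def rbf(x1, x2, ell=0.5, var=1.0):
    """Squared-exponential (RBF) kernel between two 1-D input sets."""
    d = x1[:, None] - x2[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

# Noisy observations of a smooth underlying curve.
rng = np.random.default_rng(8)
x = rng.uniform(0, 5, 30)
y = np.sin(x) + rng.normal(0, 0.1, 30)

# GP posterior mean and variance at test points; the 0.01 term is the
# assumed observation-noise variance.
xs = np.linspace(0, 5, 100)
K = rbf(x, x) + 0.01 * np.eye(x.size)
Ks = rbf(x, xs)
alpha = np.linalg.solve(K, y)
post_mean = Ks.T @ alpha
post_var = rbf(xs, xs).diagonal() \
    - np.einsum("ij,ij->j", Ks, np.linalg.solve(K, Ks))
post_std = np.sqrt(np.maximum(post_var, 0))  # gives confidence intervals

rmse = np.sqrt(np.mean((post_mean - np.sin(xs)) ** 2))
print(round(rmse, 2))
```

The predictive standard deviation is what supplies the confidence intervals highlighted as a GP advantage in the abstract.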

  8. Short-term prediction of threatening and violent behaviour in an Acute Psychiatric Intensive Care Unit based on patient and environment characteristics

    PubMed Central

    2011-01-01

    Background The aims of the present study were to investigate clinically relevant patient- and environment-related factors predicting threats and violent incidents during the first three days in a PICU population, based on evaluations done at admittance. Methods In 2000 and 2001, all 118 consecutive patients were assessed at admittance to a Psychiatric Intensive Care Unit (PICU). Patient-related conditions, such as actuarial data from the present admission, global clinical evaluations by a physician at admittance and by clinical nurses on the first day, and a single rating at admittance on an observer-rated scale scoring behaviours that predict short-term violence in psychiatric inpatients (the Brøset Violence Checklist (BVC)), together with environment-related conditions, such as use of segregation or not, were related to the outcome measure, the Staff Observation Aggression Scale-Revised (SOAS-R). A multiple logistic regression analysis with SOAS-R as the outcome variable was performed. Results The global clinical evaluations and the BVC were effective and more suitable than actuarial data in predicting short-term aggression. The use of segregation reduced the number of SOAS-R incidents. Conclusions In a naturalistic group of patients in a PICU, segregation of patients lowers the number of aggressive and threatening incidents. Prediction should be based on clinical global judgment and on instruments designed to predict short-term aggression in psychiatric inpatients. Trial registrations NCT00184119/NCT00184132 PMID:21418581

  9. Recent trends for drug lag in clinical development of oncology drugs in Japan: does the oncology drug lag still exist in Japan?

    PubMed

    Maeda, Hideki; Kurokawa, Tatsuo

    2015-12-01

    This study exhaustively and historically investigated the status of drug lag for oncology drugs approved in Japan. We comprehensively investigated oncology drugs approved in Japan between April 2001 and July 2014, using publicly available information. We also examined changes in the status of drug lag between Japan and the United States, as well as factors influencing drug lag. This study included 120 applications for approval of oncology drugs in Japan. The median difference in the approval date between the United States and Japan over the 13-year period was 875 days (29.2 months). This figure peaked in 2002 and showed a tendency to decline gradually each year thereafter. In 2014, the median approval lag was 281 days (9.4 months). Multiple regression analysis identified the following potential factors that reduce drug lag: "Japan's participation in global clinical trials"; "bridging strategies"; "designation of priority review in Japan"; and "molecularly targeted drugs". From 2001 to 2014, molecularly targeted drugs emerged as the predominant oncology drugs, and the method of development changed from full development in Japan or a bridging strategy to simultaneous global development through Japan's participation in global clinical trials. In line with these changes, the drug lag between the United States and Japan has been significantly reduced, to less than 1 year.

  10. Attitude of US obstetricians and gynaecologists to global warming and medical waste.

    PubMed

    Thiel, Cassandra; Duncan, Paula; Woods, Noe

    2017-01-01

    Objectives Global warming (or climate change) is a major public health issue, and health services are one of the largest contributors to greenhouse gas emissions in high-income countries. Despite the scale of the health care sector's resource consumption, little is known about the attitude of physicians and their willingness to participate in efforts to reduce the environmental impact of health services. Methods A survey of 236 obstetricians and gynaecologists at the University of Pittsburgh Medical Center in Western Pennsylvania, USA. Survey responses were compared to Gallup poll data from the general population using a one-sample test of proportions, Fisher's exact tests, Chi-square test, and logistic regression. Results Physicians in obstetrics and gynaecology were more likely than the public (84% vs. 54%; p<0.001) to believe that global warming is occurring, that media portrayal of its seriousness is accurate, and that it is caused by human activities. Two-thirds of physicians felt the amount of surgical waste generated is excessive and increasing. The majority (95%) would support efforts to reduce waste, with 66% favouring the use of reusable surgical tools over disposable where clinically equivalent. Despite their preference for reusable surgical instruments, only 20% preferred the reusable devices available to them. Conclusions Health care providers engaging in sustainability efforts may encounter significant support from physicians and may benefit from including physician leaders in their efforts.

  11. On the gravity and geoid effects of glacial isostatic adjustment in Fennoscandia - a short note

    NASA Astrophysics Data System (ADS)

    Sjöberg, L. E.

    2015-12-01

    Many geoscientists argue that there is a gravity low of 10-30 mGal in Fennoscandia as a remaining fingerprint of the last ice age and its load, both vanished about 10 kyr ago. However, the extraction of the gravity signal related to Glacial Isostatic Adjustment (GIA) is complicated by the fact that the total gravity field is caused by many significant density distributions in the Earth. Here we recall a methodology originating with A. Bjerhammar 35 years ago, which emphasizes that the present land uplift phenomenon mainly occurs in the region that was covered by the ice cap, and that it is highly correlated with the spectral window of degrees 10-22 of the global gravity field, whose lower limit corresponds fairly well to the wavelength that agrees with the size of the region. This implies that, although in principle the GIA is a global phenomenon, the geoid and gravity lows as well as the land upheaval in Fennoscandia are typically regional phenomena that cannot be seen in a global correlation study, as they are blurred by many irrelevant gravity signals. It is suggested that a regional multi-regression analysis with a band-limited spectral gravity signal as the observable, a method already tested two decades ago, can absorb possible significant disturbing signals, e.g. from topographic and crustal depth variations, and thereby recover the GIA signal.

  12. Do mutations in DNMT3A/3B affect global DNA hypomethylation among benzene-exposed workers in Southeast China?: Effects of mutations in DNMT3A/3B on global DNA hypomethylation.

    PubMed

    Zhang, Guang-Hui; Lu, Ye; Ji, Bu-Qiang; Ren, Jing-Chao; Sun, Pin; Ding, Shibin; Liao, Xiaoling; Liao, Kaiju; Liu, Jinyi; Cao, Jia; Lan, Qing; Rothman, Nathaniel; Xia, Zhao-Lin

    2017-12-01

    Global DNA hypomethylation is commonly observed in benzene-exposed workers, but the underlying mechanisms remain unclear. We sought to discover the relationships among reduced white blood cell (WBC) counts, micronucleus (MN) frequency, and global DNA methylation to determine whether there were associations with mutations in DNMT3A/3B. Therefore, we recruited 410 shoe factory workers and 102 controls from Wenzhou in Zhejiang Province. A Methylated DNA Quantification Kit was used to quantify global DNA methylation, and single nucleotide polymorphisms (SNPs) in DNMT3A (rs36012910, rs1550117, and R882) and DNMT3B (rs1569686, rs2424909, and rs2424913) were identified using the restriction fragment length polymorphism method. A multiple linear regression analysis demonstrated that the benzene-exposed workers experienced significant global DNA hypomethylation compared with the controls (β = -0.51, 95% CI: -0.69 to -0.32, P < 0.001). The DNMT3A R882 mutant allele (R882H and R882C) (β = -0.25, 95% CI: -0.54 to 0.04, P = 0.094) and the DNMT3B rs2424909 GG allele (β = -0.37, 95% CI: -0.70 to -0.03, P = 0.031) were associated with global DNA hypomethylation compared with the wild-type genotype after adjusting for confounding factors. Furthermore, the MN frequency for the R882 mutant allele (R882H and R882C) (FR = 1.18, 95% CI: 0.99 to 1.40, P = 0.054) was higher than that of the wild-type. The results imply that hypomethylation occurs due to benzene exposure and that mutations in DNMTs are associated with global DNA methylation, which might have influenced the induction of MN following exposure to benzene. Environ. Mol. Mutagen. 58:678-687, 2017. © 2017 Wiley Periodicals, Inc.

  13. A comparison of different interpolation methods for wind data in Central Asia

    NASA Astrophysics Data System (ADS)

    Reinhardt, Katja; Samimi, Cyrus

    2017-04-01

    For the assessment of global climate change and its consequences, the results of computer-based climate models are of central importance. The quality of these results and the validity of the derived forecasts are strongly determined by the quality of the underlying climate data. However, in many parts of the world high-resolution data are not available. This is particularly true for many regions in Central Asia, where the network of climatological stations is often sparse. Given this insufficient database, the use of statistical methods to improve the resolution of existing climate data is of crucial importance. Only this can provide a substantial data basis for a well-founded analysis of past climate changes as well as for a reliable forecast of future climate developments for the particular region. The study presented here compares different interpolation methods for the wind components u and v for a region in Central Asia with pronounced topography. The aim of the study is to find out whether there is an optimal interpolation method that can be applied equally to all pressure levels, or whether a different interpolation method has to be applied for each pressure level. The European reanalysis data ERA-Interim for the years 1989-2015 are used as input data for the pressure levels of 850 hPa, 500 hPa and 200 hPa. In order to improve the input data, two different interpolation approaches were applied: on the one hand, pure interpolation methods such as inverse distance weighting and ordinary kriging; on the other hand, machine learning algorithms, generalized additive models and regression kriging, which consider additional influencing factors, e.g. geopotential and topography. As a result it can be concluded that regression kriging provides the best results for all pressure levels, followed by support vector machines, neural networks and ordinary kriging. Inverse distance weighting showed the worst results.
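As an aside, the simplest of the compared methods, inverse distance weighting, can be sketched in a few lines. This is an illustrative sketch with made-up station coordinates and wind values, not the study's implementation:

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    # Inverse distance weighting: each known value contributes with a
    # weight proportional to 1 / distance**power.
    d = np.linalg.norm(xy_known[None, :, :] - xy_query[:, None, :], axis=2)
    w = 1.0 / (d ** power + eps)
    return (w * z_known[None, :]).sum(axis=1) / w.sum(axis=1)

# Hypothetical u-wind values at four stations on a unit square.
stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
u_wind = np.array([2.0, 4.0, 2.0, 4.0])
u_center = idw(stations, u_wind, np.array([[0.5, 0.5]]))[0]  # equidistant -> mean
```

Regression kriging outperformed IDW in the study precisely because IDW uses only distance, ignoring covariates such as geopotential and topography.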

  14. Quality of life and competitive work among adults with severe mental illness: moderating effects of family contact.

    PubMed

    Gold, Paul B

    2013-12-01

    OBJECTIVE Competitive employment may improve quality of life for adults with severe mental illness, but it is not known for whom or under what circumstances. On the basis of previous research demonstrating benefits of family contact for African-American adults with severe mental illness, it was hypothesized that frequent family contact would moderate (enhance) a positive association between competitive employment and global quality of life for a rural sample of predominantly African-American adults. METHODS In a secondary analysis of data collected from a randomized trial of supported employment, a series of nested random regression analyses was conducted to test the hypothesized moderating effect of face-to-face family contact on the association between competitive employment and global quality of life, controlling for severity of psychiatric symptoms and satisfaction with family relations. RESULTS Most of the 143 study participants spent time with a family member at least once a month (80%), and most did so at least weekly (60%). Participants who held a competitive job and had face-to-face contact with family members at least weekly reported higher global quality of life than all other study participants. CONCLUSIONS In this rural sample of African-American adults with severe mental illness, competitive work was associated with higher global quality of life only for those who frequently spent time with family members. Research is needed to test the generalizability of this finding to other geographic settings and cultures, as well as the feasibility and effectiveness of formally including family members in psychosocial rehabilitation interventions.

  15. Seasonality of cholera from 1974 to 2005: a review of global patterns

    PubMed Central

    Emch, Michael; Feldacker, Caryl; Islam, M Sirajul; Ali, Mohammad

    2008-01-01

    Background The seasonality of cholera is described in various study areas throughout the world. However, no study examines how temporal cycles of the disease vary around the world or reviews its hypothesized causes. This paper reviews the literature on the seasonality of cholera and describes its temporal cycles by compiling and analyzing 32 years of global cholera data. This paper also provides a detailed literature review on regional patterns and environmental and climatic drivers of cholera patterns. Data, Methods, and Results Cholera data are compiled from 1974 to 2005 from the World Health Organization Weekly Epidemiological Reports, a database that includes all reported cholera cases in 140 countries. The data are analyzed to measure whether season, latitude, and their interaction are significantly associated with the country-level number of outbreaks in each of the 12 preceding months using separate negative binomial regression models for northern, southern, and combined hemispheres. Likelihood ratio tests are used to determine the model of best fit. The results suggest that cholera outbreaks demonstrate seasonal patterns in higher absolute latitudes, but closer to the equator, cholera outbreaks do not follow a clear seasonal pattern. Conclusion The findings suggest that environmental and climatic factors partially control the temporal variability of cholera. These results also indirectly contribute to the growing debate about the effects of climate change and global warming. As climate change threatens to increase global temperature, resulting rises in sea levels and temperatures may influence the temporal fluctuations of cholera, potentially increasing the frequency and duration of cholera outbreaks. PMID:18570659
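The seasonal-signal idea in this review can be shown numerically. The study itself fit negative binomial regression models by hemisphere; the sketch below uses a simpler harmonic least-squares fit on hypothetical monthly outbreak counts to show how a seasonality amplitude is estimated:

```python
import numpy as np

# Hypothetical monthly outbreak counts for a high-latitude country:
# a clear annual cycle of amplitude 6 around a mean of 10, peaking in
# month index 7 (August). Values are invented for illustration.
months = np.arange(12)
counts = 10.0 + 6.0 * np.cos(2 * np.pi * (months - 7) / 12)

# Harmonic regression: counts ~ b0 + b1*cos + b2*sin. The fitted
# amplitude sqrt(b1^2 + b2^2) measures the strength of seasonality.
X = np.column_stack([np.ones(12),
                     np.cos(2 * np.pi * months / 12),
                     np.sin(2 * np.pi * months / 12)])
b0, b1, b2 = np.linalg.lstsq(X, counts, rcond=None)[0]
amplitude = np.hypot(b1, b2)
```

Near the equator the fitted amplitude would be close to zero, matching the review's finding that low-latitude outbreaks lack a clear seasonal cycle.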

  16. Noninvasive evaluation of global and regional left ventricular function using computed tomography and magnetic resonance imaging: a meta-analysis.

    PubMed

    Kaniewska, Malwina; Schuetz, Georg M; Willun, Steffen; Schlattmann, Peter; Dewey, Marc

    2017-04-01

    To compare the diagnostic accuracy of computed tomography (CT) with that of magnetic resonance imaging (MRI) in the assessment of global and regional left ventricular (LV) function. MEDLINE, EMBASE and ISI Web of Science were systematically reviewed. Evaluation included: ejection fraction (EF), end-diastolic volume (EDV), end-systolic volume (ESV), stroke volume (SV) and left ventricular mass (LVM). Differences between modalities were analysed using limits of agreement (LoA). Publication bias was measured by Egger's regression test. Heterogeneity was evaluated using Cochran's Q test and Higgins' I2 statistic. In the presence of heterogeneity, the DerSimonian-Laird method was used for estimation of the heterogeneity variance. Fifty-three studies including 1,814 patients were identified. The mean difference between CT and MRI was -0.56% (LoA, -11.6 to 10.5%) for EF, 2.62 ml (-34.1 to 39.3 ml) for EDV, 1.61 ml (-22.4 to 25.7 ml) for ESV, 3.21 ml (-21.8 to 28.3 ml) for SV and 0.13 g (-28.2 to 28.4 g) for LVM. CT detected wall motion abnormalities on a per-segment basis with 90% sensitivity and 97% specificity. CT is accurate for assessing global LV function parameters, but the limits of agreement versus MRI are moderately wide, while wall motion deficits are detected with high accuracy. • CT helps to assess patients with coronary artery disease (CAD). • MRI is the reference standard for evaluation of left ventricular function. • CT provides accurate assessment of global left ventricular function.
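Limits of agreement (the Bland-Altman statistic used in this meta-analysis) are straightforward to compute. Here is a minimal sketch with invented paired EF measurements, not data from the reviewed studies:

```python
import numpy as np

def limits_of_agreement(a, b):
    # Bland-Altman: mean difference (bias) and bias +/- 1.96 * SD of the
    # differences, within which ~95% of paired differences are expected.
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical ejection fractions (%) for five patients by each modality.
ct_ef = np.array([55.0, 60.0, 48.0, 62.0, 57.0])
mri_ef = np.array([56.0, 59.0, 50.0, 61.0, 58.0])
bias, lo, hi = limits_of_agreement(ct_ef, mri_ef)
```

About 95% of CT-MRI differences are expected to fall between `lo` and `hi`, which is why the abstract reports ranges such as -11.6 to 10.5% for EF alongside a small mean difference.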

  17. Relationships between climate and growth of Gymnocypris selincuoensis in the Tibetan Plateau.

    PubMed

    Tao, Juan; Chen, Yifeng; He, Dekui; Ding, Chengzhi

    2015-04-01

    The consequences of climate change are becoming increasingly evident in the Tibetan Plateau, represented by retreating glaciers and expanding lakes, but the biological response of plateau-lake ecosystems to climate change is poorly known. In this study, we applied dendrochronology methods to develop a growth index chronology from otolith increment widths of the Selincuo naked carp (Gymnocypris selincuoensis), a species endemic to Lake Selincuo (4530 m), and investigated the relationships between fish growth and regional and global climate variables over the last three decades. A correlation analysis and a principal component regression analysis between regional climate factors and the growth index chronology indicated that the growth of G. selincuoensis was significantly and positively correlated with the length of the growing season and with temperature-related variables, particularly during the growing season. Most global climate variables relevant to the Asian monsoon and the midlatitude westerlies, such as the El Niño Southern Oscillation Index, the Arctic Oscillation, the North Atlantic Oscillation, and the North America Pattern, showed negative but not significant correlations with the annual growth of the Selincuo naked carp. This may result from the high elevation of the Tibetan Plateau and the high mountains surrounding this area. In comparison, the Pacific Decadal Oscillation (PDO) negatively affected the growth of G. selincuoensis; the reason may be that an enhanced PDO can lead to cold conditions in this area. Taken together, the results indicate that this Tibetan Plateau fish has been affected by global climate change, particularly during the growing season, and that global climate change likely has important effects on the productivity of aquatic ecosystems in this area.

  18. Estimation of Standard Error of Regression Effects in Latent Regression Models Using Binder's Linearization. Research Report. ETS RR-07-09

    ERIC Educational Resources Information Center

    Li, Deping; Oranje, Andreas

    2007-01-01

    Two versions of a general method for approximating standard error of regression effect estimates within an IRT-based latent regression model are compared. The general method is based on Binder's (1983) approach, accounting for complex samples and finite populations by Taylor series linearization. In contrast, the current National Assessment of…

  19. Bim expression in endothelial cells and pericytes is essential for regression of the fetal ocular vasculature.

    PubMed

    Wang, Shoujian; Zaitoun, Ismail S; Johnson, Ryan P; Jamali, Nasim; Gurel, Zafer; Wintheiser, Catherine M; Strasser, Andreas; Lindner, Volkhard; Sheibani, Nader; Sorenson, Christine M

    2017-01-01

    Apoptosis plays a central role in developmental and pathological angiogenesis and vessel regression. Bim is a pro-apoptotic Bcl-2 family member that plays a prominent role in both developmental and pathological ocular vessel regression and in neovascularization. Endothelial cells (EC) and pericytes (PC) each play unique roles during vascular development, maintenance and regression. We recently showed that germline deletion of Bim results in persistent hyaloid vasculature and increased retinal vascular density, and prevents retinal vessel regression in response to hyperoxia. To determine whether retinal vascular regression is attributable to Bim expression in EC or PC, we generated mice carrying a conditional Bim allele (BimFlox/Flox) and VE-cadherin-cre (BimEC mice) or Pdgfrb-cre (BimPC mice). BimEC and BimPC mice demonstrated attenuated hyaloid vessel regression and postnatal retinal vascular remodeling. We also observed decreased retinal vascular apoptosis and proliferation. Unlike global Bim-/- mice, mice conditionally lacking Bim in EC or PC underwent hyperoxia-mediated vessel obliteration and subsequent retinal neovascularization during oxygen-induced ischemic retinopathy similar to control littermates. Thus, understanding the cell-autonomous role Bim plays in retinal vascular homeostasis will give us new insight into how to modulate pathological retinal neovascularization and vessel regression to preserve vision.

  20. Assessing the Relationship between Airlines' Maintenance Outsourcing and Aviation Professionals' Job Satisfaction

    NASA Astrophysics Data System (ADS)

    McCamey, Rotorua

    The current economic and security challenges have placed an additional burden on U.S. airlines to provide optimum service at reasonable cost to the flying public. In efforts to stay competitive, U.S. airlines increased foreign-based outsourcing of aircraft major repair and overhaul (MRO), mainly to reduce labor costs and conserve capital. This concentrated focus on outsourcing and restructuring ignored job dissatisfaction among remaining employees, which could reduce or eliminate an airline's competitiveness. The purpose of this quantitative study was (a) to assess the relationship between increased levels of foreign-based MRO outsourcing and aviation professionals' job satisfaction (Y1); (b) to assess the influence of increased levels of foreign-based outsourcing on MRO control (Y2), MRO error rate (Y3), and MRO technical punctuality (Y4) as perceived by aviation professionals; and (c) to assess the influence of increased levels of foreign-based MRO outsourcing on technical skills (Y5) and morale (Y6) as perceived by aviation professionals. The survey instrument was based on Paul Spector's Job Satisfaction Questionnaire together with MRO-specific questions. A random sample of 300 U.S. airline participants was requested via MarketTools to meet the required sample size of 110, as determined through an a priori power analysis. The study rendered 198 usable surveys out of 213 total responses, and correlation, multiple regression, and ANOVA methods were used to test the study hypotheses. The Spearman's rho for (Y1) was statistically significant, p = .010, and the multiple regression was statistically significant, p < .001. A one-way ANOVA indicated participants differed in their opinions of (Y2) through (Y6). Recommendations for future research include contrasting domestic and global MRO providers, and examining global aircraft parts suppliers and aviation technical training.

  1. The Natural Course of Bulimia Nervosa and Eating Disorder not Otherwise Specified is not Influenced by Personality Disorders

    PubMed Central

    Grilo, Carlos M.; Sanislow, Charles A.; Shea, M. Tracie; Skodol, Andrew E.; Stout, Robert L.; Pagano, Maria E.; Yen, Shirley; McGlashan, Thomas H.

    2013-01-01

    Objective To examine prospectively the natural course of bulimia nervosa (BN) and eating disorder not otherwise specified (EDNOS) and to test the effects of personality disorder (PD) comorbidity on the outcomes. Method Ninety-two female patients with current BN (N = 23) or EDNOS (N = 69) were evaluated at baseline enrollment in the Collaborative Longitudinal Personality Disorders Study (CLPS). Eating disorders (EDs) were assessed with the Structured Clinical Interview for DSM-IV Axis I Disorders. PDs were assessed with the Diagnostic Interview for DSM-IV PD (DIPD-IV). The course of BN and EDNOS was assessed with the Longitudinal Interval Follow-up Evaluation, and the course of PDs was evaluated with the Follow-Along version of the DIPD-IV at 6, 12, and 24 months. Results The probability of remission at 24 months was 40% for BN and 59% for EDNOS. To test the effects of PD comorbidity on course, ED patients were divided into groups with no, one, and two or more PDs. Cox proportional hazards regression analyses revealed that BN had a longer time to remission than EDNOS (p < .05). The number of PDs was not a significant predictor of time to remission, nor was the presence of Axis I psychiatric comorbidity or Global Assessment of Functioning scores. Analyses using proportional hazards regression with time-varying covariates revealed that PD instability was unrelated to changes in ED course. Conclusions BN has a worse 24-month course (longer time to remission) than EDNOS. The natural course of BN and EDNOS is not influenced significantly by the presence, severity, or time-varying changes of co-occurring PDs, co-occurring Axis I disorders, or global functioning. PMID:12949923

  2. Landslide Susceptibility Statistical Methods: A Critical and Systematic Literature Review

    NASA Astrophysics Data System (ADS)

    Mihir, Monika; Malamud, Bruce; Rossi, Mauro; Reichenbach, Paola; Ardizzone, Francesca

    2014-05-01

    Landslide susceptibility assessment, the subject of this systematic review, aims at understanding the spatial probability of slope failures under a set of geomorphological and environmental conditions. It is estimated that about 375 of the landslides that occur globally each year are fatal, with around 4600 people killed per year. Past studies have brought out the increasing cost of landslide damage, which can primarily be attributed to human occupation of, and increased human activity in, vulnerable environments. To evaluate and reduce landslide risk, many scientists have made an effort to map landslide susceptibility efficiently using different statistical methods. In this paper, we conduct a critical and systematic review of the landslide susceptibility literature in terms of the different statistical methods used. For each of a broad set of studies reviewed we note: (i) study region and areal extent, (ii) landslide types, (iii) inventory type and temporal period covered, (iv) mapping technique, (v) thematic variables used, (vi) statistical models, (vii) assessment of model skill, (viii) uncertainty assessment methods, and (ix) validation methods. We then pull out broad trends within our review of landslide susceptibility, particularly regarding the statistical methods. We found that the most common statistical methods used in the study of landslide susceptibility include logistic regression, artificial neural networks, discriminant analysis and weights of evidence. Although most of the studies we reviewed assessed model skill, very few assessed model uncertainty. In terms of geographic extent, the largest numbers of landslide susceptibility zonations were in Turkey, Korea, Spain, Italy and Malaysia. However, there are also many landslides and fatalities in other localities, particularly India, China, the Philippines, Nepal, Indonesia, Guatemala, and Pakistan, where far fewer landslide susceptibility studies are available in the peer-reviewed literature. This raises some concern that existing studies do not always cover all the regions globally that currently experience landslides and landslide fatalities.
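Of the statistical methods the review lists, weights of evidence is the easiest to show in miniature. The sketch below computes the positive weight W+ for a single thematic class (e.g. a steep-slope class) using hypothetical cell counts chosen purely for illustration:

```python
import math

def weight_of_evidence(landslide_in_class, landslide_total,
                       stable_in_class, stable_total):
    # W+ for one thematic class: log ratio of the probability of the class
    # given a landslide cell to the probability given a stable cell.
    p_class_given_slide = landslide_in_class / landslide_total
    p_class_given_stable = stable_in_class / stable_total
    return math.log(p_class_given_slide / p_class_given_stable)

# Hypothetical counts: the steep-slope class holds 40 of 100 landslide
# cells but only 10 of 100 stable cells -> positive weight (susceptible).
w_plus = weight_of_evidence(40, 100, 10, 100)
```

A positive W+ flags the class as landslide-prone; logistic regression generalizes this idea by fitting many thematic variables jointly.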

  3. Regression and direct methods do not give different estimates of digestible and metabolizable energy values of barley, sorghum, and wheat for pigs.

    PubMed

    Bolarinwa, O A; Adeola, O

    2016-02-01

    Direct or indirect methods can be used to determine the DE and ME of feed ingredients for pigs. In situations when only the indirect approach is suitable, the regression method presents a robust indirect approach. Three experiments were conducted to compare the direct and regression methods for determining the DE and ME values of barley, sorghum, and wheat for pigs. In each experiment, 24 barrows with an average initial BW of 31, 32, and 33 kg were assigned to 4 diets in a randomized complete block design. The 4 diets consisted of 969 g barley, sorghum, or wheat/kg plus minerals and vitamins for the direct method; a corn-soybean meal reference diet (RD); the RD + 300 g barley, sorghum, or wheat/kg; and the RD + 600 g barley, sorghum, or wheat/kg. The 3 corn-soybean meal diets were used for the regression method. Each diet was fed to 6 barrows in individual metabolism crates for a 5-d acclimation followed by a 5-d period of total but separate collection of feces and urine in each experiment. Graded substitution of barley or wheat, but not sorghum, into the RD linearly reduced (P < 0.05) dietary DE and ME. The direct method-derived DE and ME for barley were 3,669 and 3,593 kcal/kg DM, respectively. The regressions of barley contribution to DE and ME in kilocalories against the quantity of barley DMI in kilograms generated 3,746 kcal DE/kg DM and 3,647 kcal ME/kg DM. The DE and ME for sorghum by the direct method were 4,097 and 4,042 kcal/kg DM, respectively; the corresponding regression-derived estimates were 4,145 and 4,066 kcal/kg DM. Using the direct method, energy values for wheat were 3,953 kcal DE/kg DM and 3,889 kcal ME/kg DM. The regressions of wheat contribution to DE and ME in kilocalories against the quantity of wheat DMI in kilograms generated 3,960 kcal DE/kg DM and 3,874 kcal ME/kg DM. The DE and ME of barley using the direct method were not different (0.3 < P < 0.4) from those obtained using the regression method (3,669 vs. 3,746 and 3,593 vs. 3,647 kcal/kg DM, respectively). The direct method-derived DE and ME of sorghum were not different (0.5 < P < 0.7) from those obtained using the regression method (4,097 vs. 4,145 and 4,042 vs. 4,066 kcal/kg DM, respectively). The direct method- and regression method-derived DE (3,953 and 3,960 kcal/kg DM, respectively) and ME (3,889 and 3,874 kcal/kg DM, respectively) of wheat were not different (0.8 < P < 0.9). Results of these 3 experiments suggest that the regression and direct methods do not give different estimates of the DE and ME of barley, sorghum, and wheat for pigs.
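The regression method described above can be sketched numerically: the test ingredient's energy contribution (kcal) is regressed against its DM intake (kg), and the slope of that line is the ingredient's DE in kcal/kg DM. The pig-level values below are hypothetical, chosen only to mimic the order of magnitude of the barley estimate:

```python
import numpy as np

# Hypothetical data: barley DM intake (kg) for pigs fed the reference diet
# (zero barley) and the 300 and 600 g/kg substitution diets, with the
# barley-associated DE intake (kcal) for each pig.
barley_dmi = np.array([0.00, 0.00, 0.45, 0.47, 0.90, 0.93])
barley_de_kcal = np.array([0.0, 0.0, 1686.0, 1762.0, 3372.0, 3486.0])

# Slope of the linear fit estimates DE in kcal/kg DM; the intercept
# should be near zero if the contribution scales with intake.
slope, intercept = np.polyfit(barley_dmi, barley_de_kcal, 1)
```

With real data the same regression is run for ME using metabolizable energy contributions, which is how the 3,746 and 3,647 kcal/kg DM estimates for barley were obtained.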

  4. Introduction to the use of regression models in epidemiology.

    PubMed

    Bender, Ralf

    2009-01-01

    Regression modeling is one of the most important statistical techniques used in analytical epidemiology. By means of regression models, the effect of one or several explanatory variables (e.g., exposures, subject characteristics, risk factors) on a response variable such as mortality or cancer can be investigated. From multiple regression models, adjusted effect estimates can be obtained that take the effects of potential confounders into account. Regression methods can be applied in all epidemiologic study designs, so they represent a universal tool for data analysis in epidemiology. Different kinds of regression models have been developed depending on the measurement scale of the response variable and the study design. The most important methods are linear regression for continuous outcomes, logistic regression for binary outcomes, Cox regression for time-to-event data, and Poisson regression for frequencies and rates. This chapter provides a nontechnical introduction to these regression models with illustrative examples from cancer research.
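For the logistic case above, here is a minimal sketch of how an adjusted effect estimate is obtained. The data are simulated, and the Newton-Raphson fit is written out explicitly rather than relying on a statistics package:

```python
import numpy as np

def logistic_fit(X, y, iters=25):
    # Newton-Raphson (IRLS) for logistic regression coefficients.
    X = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))   # fitted probabilities
        W = p * (1.0 - p)                      # IRLS weights
        H = X.T @ (X * W[:, None])             # observed information
        beta += np.linalg.solve(H, X.T @ (y - p))
    return beta

# Simulated cohort: binary exposure, continuous confounder (age), and a
# binary outcome with true log-odds -2 + 1.0*exposure + 0.02*(age - 50).
rng = np.random.default_rng(0)
exposure = rng.integers(0, 2, 2000)
age = rng.normal(50.0, 10.0, 2000)
logit = -2.0 + 1.0 * exposure + 0.02 * (age - 50.0)
y = (rng.random(2000) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

beta = logistic_fit(np.column_stack([exposure, age - 50.0]), y)
odds_ratio = np.exp(beta[1])  # age-adjusted odds ratio for the exposure
```

Exponentiating the exposure coefficient yields the adjusted odds ratio, the standard effect measure reported from logistic models in epidemiology.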

  5. Drug use, mental health and problems related to crime and violence: cross-sectional study

    PubMed Central

    Claro, Heloísa Garcia; de Oliveira, Márcia Aparecida Ferreira; Bourdreaux, Janet Titus; Fernandes, Ivan Filipe de Almeida Lopes; Pinho, Paula Hayasi; Tarifa, Rosana Ribeiro

    2015-01-01

    Objective: to investigate the correlation between disorders related to the use of alcohol and other drugs and symptoms of mental disorders, problems related to crime and violence, and age and gender. Methods: a cross-sectional descriptive study carried out with 128 users of a Psychosocial Care Center for Alcohol and Other Drugs in the city of São Paulo, interviewed by means of the Global Appraisal of Individual Needs - Short Screener instrument. Univariate and multiple linear regression models were used to verify the correlation between the variables. Results: in the univariate regression models, internalizing and externalizing symptoms and problems related to crime/violence proved significant and were included in the multiple model, in which only the internalizing symptoms and the problems related to crime and violence remained significant. Conclusions: there is a correlation between the severity of problems related to alcohol use and the severity of mental health symptoms and of problems related to crime and violence in the study sample. The results emphasize the need for interdisciplinary and intersectoral care for users of alcohol and other drugs, since they live in socially vulnerable environments. PMID:26626010

  6. A binary genetic programing model for teleconnection identification between global sea surface temperature and local maximum monthly rainfall events

    NASA Astrophysics Data System (ADS)

    Danandeh Mehr, Ali; Nourani, Vahid; Hrnjica, Bahrudin; Molajou, Amir

    2017-12-01

    The effectiveness of genetic programming (GP) for solving regression problems in hydrology has been recognized in recent studies. However, its capability to solve classification problems has not been sufficiently explored so far. This study develops and applies a novel classification-forecasting model, namely Binary GP (BGP), for teleconnection studies between sea surface temperature (SST) variations and maximum monthly rainfall (MMR) events. The BGP integrates certain types of data pre-processing and post-processing methods with a conventional GP engine to enhance its ability to solve regression and classification problems simultaneously. The model was trained and tested using SST series of the Black Sea, Mediterranean Sea, and Red Sea as potential predictors, with classified MMR events at two locations in Iran as the predictand. The skill of the model was measured with regard to different rainfall thresholds and SST lags and compared to that of the hybrid decision tree-association rule (DTAR) model available in the literature. The results indicated that the proposed model can identify potential teleconnection signals of the surrounding seas that are beneficial to long-term forecasting of the occurrence of the classified MMR events.

  7. Global Evidence on the Association between Cigarette Graphic Warning Labels and Cigarette Smoking Prevalence and Consumption

    PubMed Central

    Ngo, Anh; Cheng, Kai-Wen; Huang, Jidong; Chaloupka, Frank J.

    2018-01-01

    Background: In 2011, the courts ruled in favor of tobacco companies in preventing the implementation of graphic warning labels (GWLs) in the US, stating that FDA had not established the effectiveness of GWLs in reducing smoking. Methods: Data came from various sources: the WHO MPOWER package (GWLs, MPOWER policy measures, cigarette prices), Euromonitor International (smoking prevalence, cigarette consumption), and the World Bank database (countries’ demographic characteristics). The datasets were aggregated and linked using country and year identifiers. Fractional logit regressions and OLS regressions were applied to examine the associations between GWLs and smoking prevalence and cigarette consumption, controlling for MPOWER policy scores, cigarette prices, GDP per capita, unemployment, population aged 15–64 (%), aged 65 and over (%), year indicators, and country fixed effects. Results: GWLs were associated with a 0.9–3 percentage point decrease in adult smoking prevalence and were significantly associated with a reduction of 230–287 sticks in per capita cigarette consumption, compared to countries without GWLs. However, the association between GWLs and cigarette consumption became statistically insignificant once country indicators were included in the models. Conclusions: The implementation of GWLs may be associated with reduced cigarette smoking. PMID:29495581

  9. An empirical study using permutation-based resampling in meta-regression

    PubMed Central

    2012-01-01

    Background In meta-regression, as the number of trials in the analyses decreases, the risk of false positives or false negatives increases. This is partly due to the assumption of normality that may not hold in small samples. Creation of a distribution from the observed trials using permutation methods to calculate P values may allow for less spurious findings. Permutation has not been empirically tested in meta-regression. The objective of this study was to perform an empirical investigation to explore the differences in results for meta-analyses on a small number of trials using standard large sample approaches versus permutation-based methods for meta-regression. Methods We isolated a sample of randomized controlled clinical trials (RCTs) for interventions that have a small number of trials (herbal medicine trials). Trials were then grouped by herbal species and condition and assessed for methodological quality using the Jadad scale, and data were extracted for each outcome. Finally, we performed meta-analyses on the primary outcome of each group of trials and meta-regression for methodological quality subgroups within each meta-analysis. We used large sample methods and permutation methods in our meta-regression modeling. We then compared final models and final P values between methods. Results We collected 110 trials across 5 intervention/outcome pairings and 5 to 10 trials per covariate. When applying large sample methods and permutation-based methods in our backwards stepwise regression, the covariates in the final models were identical in all cases. The P values for the covariates in the final model were larger in 78% (7/9) of the cases for permutation and identical for 22% (2/9) of the cases. Conclusions We present empirical evidence that permutation-based resampling may not change final models when using backwards stepwise regression, but may increase P values in meta-regression of multiple covariates for a relatively small number of trials. PMID:22587815
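    The permutation idea can be sketched as follows (toy data; the study's actual analysis used weighted meta-regression, which this plain unweighted slope omits): the covariate is shuffled many times to build a null distribution for the slope, and the P value is the fraction of permuted slopes at least as extreme as the observed one.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 12                                    # deliberately few trials
quality = rng.random(n_trials)                   # hypothetical covariate (e.g. Jadad score)
effect = 0.5 + rng.normal(0.0, 0.3, n_trials)    # trial effect sizes, no true association

def slope(x, y):
    """Unweighted least-squares slope of y on x."""
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / (xc @ xc)

obs = abs(slope(quality, effect))
# Null distribution: permute the covariate across trials, recompute the slope
perm = np.array([abs(slope(rng.permutation(quality), effect))
                 for _ in range(2000)])
p_perm = (1 + (perm >= obs).sum()) / (1 + perm.size)   # add-one smoothing
```

    With so few trials the permutation P value avoids relying on a normal approximation, which is the motivation described in the abstract.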

  10. The Global Burden of Mental, Neurological and Substance Use Disorders: An Analysis from the Global Burden of Disease Study 2010

    PubMed Central

    Whiteford, Harvey A.; Ferrari, Alize J.; Degenhardt, Louisa; Feigin, Valery; Vos, Theo

    2015-01-01

    Background The Global Burden of Disease Study 2010 (GBD 2010) estimated that a substantial proportion of the world’s disease burden came from mental, neurological and substance use disorders. In this paper, we used GBD 2010 data to investigate time, year, region and age specific trends in burden due to mental, neurological and substance use disorders. Method For each disorder, prevalence data were assembled from systematic literature reviews. DisMod-MR, a Bayesian meta-regression tool, was used to model prevalence by country, region, age, sex and year. Prevalence data were combined with disability weights derived from survey data to estimate years lived with disability (YLDs). Years lost to premature mortality (YLLs) were estimated by multiplying deaths occurring as a result of a given disorder by the reference standard life expectancy at the age at which death occurred. Disability-adjusted life years (DALYs) were computed as the sum of YLDs and YLLs. Results In 2010, mental, neurological and substance use disorders accounted for 10.4% of global DALYs, 2.3% of global YLLs and 28.5% of global YLDs, making them the leading cause of YLDs. Mental disorders accounted for the largest proportion of DALYs (56.7%), followed by neurological disorders (28.6%) and substance use disorders (14.7%). DALYs peaked in early adulthood for mental and substance use disorders but were more consistent across age for neurological disorders. Females accounted for more DALYs in all mental and neurological disorders, except for mental disorders occurring in childhood, schizophrenia, substance use disorders, Parkinson’s disease and epilepsy, where males accounted for more DALYs. Overall DALYs were highest in Eastern Europe/Central Asia and lowest in East Asia/the Pacific. Conclusion Mental, neurological and substance use disorders contribute a significant proportion of disease burden. Health systems can respond by implementing established, cost-effective interventions, or by supporting the research necessary to develop better prevention and treatment options. PMID:25658103

  11. Quantitative Trait Locus Analysis of SIX1-SIX6 with Retinal Nerve Fiber Layer Thickness in Individuals of European Descent

    PubMed Central

    Kuo, Jane Z.; Zangwill, Linda M.; Medeiros, Felipe A.; Liebmann, Jeffery M.; Girkin, Christopher A.; Hammel, Na’ama; Rotter, Jerome I.; Weinreb, Robert N.

    2015-01-01

    Purpose To perform a quantitative trait locus (QTL) analysis and evaluate whether a locus between SIX1 and SIX6 is associated with retinal nerve fiber layer (RNFL) thickness in individuals of European descent. Design Observational, multi-center, cross-sectional study. Methods 231 participants were recruited from the Diagnostic Innovations in Glaucoma Study and the African Descent and Glaucoma Evaluation Study. Association of rs10483727 in SIX1-SIX6 with global and sectoral RNFL thickness was performed. Quantitative trait analysis with the additive model of inheritance was analyzed using linear regression. Trend analysis was performed to evaluate the mean global and sectoral RNFL thickness with 3 genotypes of interest (T/T, C/T, C/C). All models were adjusted for age and gender. Results Direction of association between T allele and RNFL thickness was consistent in the global and different sectoral RNFL regions. Each copy of the T risk allele in rs10483727 was associated with −0.16 μm thinner global RNFL thickness (β=−0.16, 95% CI: −0.28 to −0.03; P=0.01). Similar patterns were found for the sectoral regions, including inferior (P=0.03), inferior-nasal (P=0.017), superior-nasal (P=0.0025), superior (P=0.002) and superior-temporal (P=0.008). The greatest differences were observed in the superior and inferior quadrants, supporting clinical observations for RNFL thinning in glaucoma. Thinner global RNFL was found in subjects with T/T genotypes compared to subjects with C/T and C/C genotypes (P=0.044). Conclusions Each copy of the T risk allele has an additive effect and was associated with thinner global and sectoral RNFL. Findings from this QTL analysis further support a genetic contribution to glaucoma pathophysiology. PMID:25849520

  12. Conversion of Local and Surface-Wave Magnitudes to Moment Magnitude for Earthquakes in the Chinese Mainland

    NASA Astrophysics Data System (ADS)

    Li, X.; Gao, M.

    2017-12-01

    The magnitude of an earthquake is one of its basic parameters and is a measure of its scale. It plays a significant role in seismology and earthquake engineering research, particularly in the calculations of the seismic rate and b value in earthquake prediction and seismic hazard analysis. However, several types of magnitudes currently used in seismology research, such as local magnitude (ML), surface wave magnitude (MS), and body-wave magnitude (MB), share a common limitation: the magnitude saturation phenomenon. This problem was solved by a formula for calculating the moment magnitude (MW) based on the seismic moment, which describes the seismic source strength, and the moment magnitude is now widely used in seismology research. In China, however, the earthquake scale is primarily based on local and surface-wave magnitudes. In the present work, we studied the empirical relationships of moment magnitude (MW) with local magnitude (ML) and with surface wave magnitude (MS) in the Chinese Mainland. The China Earthquake Networks Center (CENC) ML catalog, China Seismograph Network (CSN) MS catalog, ANSS Comprehensive Earthquake Catalog (ComCat), and Global Centroid Moment Tensor (GCMT) catalog were adopted to regress the relationships using the orthogonal regression method. The obtained relationships are MW = 0.64 + 0.87MS and MW = 1.16 + 0.75ML. Therefore, if the moment magnitude of a Chinese earthquake is not reported by any agency in the world, the equations above can be used to convert ML and MS to MW. These relationships are important because they allow the China earthquake catalogs to be used more effectively for seismic hazard analysis, earthquake prediction, and other seismology research. We also computed, by linear regression against the Global Centroid Moment Tensor catalog, the relationships between log M0 and ML and between log M0 and MS (where M0 is the seismic moment). The obtained relationships are log M0 = 18.21 + 1.05ML and log M0 = 17.04 + 1.32MS. These formulas can be used by seismologists to convert the ML/MS of Chinese mainland events into their seismic moments.
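    The conversion relationships quoted in the abstract can be transcribed directly as functions (the units of the seismic moment follow the original regression and are not stated in the abstract):

```python
def mw_from_ms(ms):
    """Moment magnitude from surface-wave magnitude (Chinese mainland fit)."""
    return 0.64 + 0.87 * ms

def mw_from_ml(ml):
    """Moment magnitude from local magnitude (Chinese mainland fit)."""
    return 1.16 + 0.75 * ml

def log_m0_from_ml(ml):
    """log10 seismic moment from ML (units as in the original regression)."""
    return 18.21 + 1.05 * ml

def log_m0_from_ms(ms):
    """log10 seismic moment from MS (units as in the original regression)."""
    return 17.04 + 1.32 * ms
```

    For example, an MS 6.0 event converts to MW 5.86 and an ML 5.0 event to MW 4.91 under these fits.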

  13. A Technique of Fuzzy C-Mean in Multiple Linear Regression Model toward Paddy Yield

    NASA Astrophysics Data System (ADS)

    Syazwan Wahab, Nur; Saifullah Rusiman, Mohd; Mohamad, Mahathir; Amira Azmi, Nur; Che Him, Norziha; Ghazali Kamardan, M.; Ali, Maselan

    2018-04-01

    In this paper, we propose a hybrid model that combines a multiple linear regression model with the fuzzy c-means method. This research examined the relationship between paddy yield and 20 topsoil variates analyzed prior to planting at standard fertilizer rates. Data were from the multi-location trials for rice carried out by MARDI at major paddy granaries in Peninsular Malaysia during the period from 2009 to 2012. Missing observations were estimated using mean estimation techniques. The data were analyzed using a multiple linear regression model alone and in combination with the fuzzy c-means method. Analysis of normality and multicollinearity indicates that the data are normally distributed without multicollinearity among the independent variables. Fuzzy c-means analysis clusters the paddy yield into two clusters before the multiple linear regression model is applied. The comparison between the two methods indicates that the hybrid of the multiple linear regression model and the fuzzy c-means method outperforms the multiple linear regression model alone, with a lower mean square error.
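    A minimal sketch of the hybrid idea on synthetic yield data with two regimes (this is a plain NumPy fuzzy c-means, not MARDI's pipeline, and all data are invented): observations are clustered by fuzzy c-means first, and a separate linear regression is then fitted per cluster.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means: alternate membership and center updates."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                                    # memberships sum to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=0)
    return centers, U

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 200)                            # a single "soil" covariate
y = np.where(np.arange(200) < 100,                        # two yield regimes
             2.0 + 3.0 * x, 8.0 - 2.0 * x) + rng.normal(0.0, 0.2, 200)

centers, U = fuzzy_c_means(np.column_stack([x, y]), c=2)
labels = U.argmax(axis=0)

def sse(mask):
    """Residual sum of squares of a per-cluster linear fit."""
    A = np.column_stack([np.ones(mask.sum()), x[mask]])
    coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
    r = y[mask] - A @ coef
    return r @ r

sse_hybrid = sum(sse(labels == k) for k in range(2))      # cluster-wise regressions
sse_pooled = sse(np.ones(200, dtype=bool))                # single pooled regression
```

    On data with distinct regimes, the cluster-wise fit has a smaller residual error than one pooled regression, mirroring the comparison reported in the abstract.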

  14. Kernel Partial Least Squares for Nonlinear Regression and Discrimination

    NASA Technical Reports Server (NTRS)

    Rosipal, Roman; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This paper summarizes recent results on applying the method of partial least squares (PLS) in a reproducing kernel Hilbert space (RKHS). A previously proposed kernel PLS regression model was proven to be competitive with other regularized regression methods in RKHS. The family of nonlinear kernel-based PLS models is extended by considering the kernel PLS method for discrimination. Theoretical and experimental results on a two-class discrimination problem indicate the usefulness of the method.

  15. Feature Augmentation via Nonparametrics and Selection (FANS) in High-Dimensional Classification.

    PubMed

    Fan, Jianqing; Feng, Yang; Jiang, Jiancheng; Tong, Xin

    We propose a high dimensional classification method that involves nonparametric feature augmentation. Knowing that marginal density ratios are the most powerful univariate classifiers, we use the ratio estimates to transform the original feature measurements. Subsequently, penalized logistic regression is invoked, taking as input the newly transformed or augmented features. This procedure trains models equipped with local complexity and global simplicity, thereby avoiding the curse of dimensionality while creating a flexible nonlinear decision boundary. The resulting method is called Feature Augmentation via Nonparametrics and Selection (FANS). We motivate FANS by generalizing the Naive Bayes model, writing the log ratio of joint densities as a linear combination of those of marginal densities. It is related to generalized additive models, but has better interpretability and computability. Risk bounds are developed for FANS. In numerical analysis, FANS is compared with competing methods, so as to provide a guideline on its best application domain. Real data analysis demonstrates that FANS performs very competitively on benchmark email spam and gene expression data sets. Moreover, FANS is implemented by an extremely fast algorithm through parallel computing.
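    A rough sketch of the FANS recipe on synthetic data (the kernel density estimators and penalty choice are illustrative assumptions): marginal densities are estimated on one data split, each feature is replaced by its estimated log marginal density ratio, and an L1-penalized logistic regression is fitted on a second split.

```python
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 300
# Classes differ in spread, not location: a linear rule on raw features
# is useless, but marginal density ratios separate them.
Xa = rng.normal(0.0, 1.0, (n, 2))      # class 0
Xb = rng.normal(0.0, 2.5, (n, 2))      # class 1
X = np.vstack([Xa, Xb])
y = np.r_[np.zeros(n), np.ones(n)]
order = rng.permutation(2 * n)
X, y = X[order], y[order]

# FANS-style sample splitting: densities on one part, regression on another
Xd, yd = X[:200], y[:200]              # density-estimation part
Xf, yf = X[200:400], y[200:400]        # regression-fitting part
Xt, yt = X[400:], y[400:]              # held-out evaluation

kdes = [(gaussian_kde(Xd[yd == 0, j]), gaussian_kde(Xd[yd == 1, j]))
        for j in range(X.shape[1])]

def augment(A):
    """Replace each feature by its estimated log marginal density ratio."""
    Z = np.empty_like(A)
    for j, (f0, f1) in enumerate(kdes):
        Z[:, j] = np.log(f1(A[:, j]) + 1e-12) - np.log(f0(A[:, j]) + 1e-12)
    return Z

fans = LogisticRegression(penalty="l1", solver="liblinear").fit(augment(Xf), yf)
acc_fans = fans.score(augment(Xt), yt)
acc_raw = LogisticRegression().fit(Xf, yf).score(Xt, yt)
```

    Because the density ratios are nonlinear in the raw features, the transformed logistic regression finds a boundary that the raw linear classifier cannot, illustrating the "local complexity, global simplicity" description in the abstract.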

  17. Estimation and Mapping of Coastal Mangrove Biomass Using Both Passive and Active Remote Sensing Method

    NASA Astrophysics Data System (ADS)

    Yiqiong, L.; Lu, W.; Zhou, J.; Gan, W.; Cui, X.; Lin, G., Sr.

    2015-12-01

    Mangrove forests play an important role in the global carbon cycle, but carbon stocks in different mangrove forests are not easily measured at large scales. In this research, both active and passive remote sensing methods were used to estimate the aboveground biomass of dominant mangrove communities in Zhanjiang National Mangrove Nature Reserve in Guangdong, China. We set up a decision tree including spectral, texture, position and geometry indexes to achieve mangrove inter-species classification among five main species, namely Aegiceras corniculatum, Avicennia marina, Bruguiera gymnorrhiza, Kandelia candel and Sonneratia apetala, using 5.8 m resolution multispectral ZY-3 images. In addition, Lidar data were collected and used to obtain the canopy height of the different mangrove species. Then, regression equations between the field-measured aboveground biomass and the canopy height deduced from the Lidar data were established for these mangrove species. By combining these results, we were able to establish a relatively accurate method for differentiating mangrove species and mapping their aboveground biomass distribution at the estuary scale, which could be applied to mangrove forests in other regions.

  18. Prediction of maize phenotype based on whole-genome single nucleotide polymorphisms using deep belief networks

    NASA Astrophysics Data System (ADS)

    Rachmatia, H.; Kusuma, W. A.; Hasibuan, L. S.

    2017-05-01

    Selection in plant breeding could be more effective and more efficient if it were based on genomic data. Genomic selection (GS) is a new approach to plant-breeding selection that exploits genomic data through a mechanism called genomic prediction (GP). Most GP models use linear methods that ignore the effects of interaction among genes and of higher-order nonlinearities. The deep belief network (DBN), one of the architectures used in deep learning, is able to model data at a high level of abstraction that captures nonlinear effects in the data. This study implemented a DBN to develop a GP model utilizing whole-genome Single Nucleotide Polymorphisms (SNPs) as data for training and testing. The case study was a set of traits in maize. The maize dataset was acquired from CIMMYT’s (International Maize and Wheat Improvement Center) Global Maize program. Based on Pearson correlation, the DBN outperformed the other methods, namely reproducing kernel Hilbert space (RKHS) regression, Bayesian LASSO (BL), and best linear unbiased predictor (BLUP), in the case of allegedly non-additive traits. The DBN achieved a correlation of 0.579 on a scale of -1 to 1.

  19. A Two-Stage Method to Determine Optimal Product Sampling considering Dynamic Potential Market

    PubMed Central

    Hu, Zhineng; Lu, Wei; Han, Bing

    2015-01-01

    This paper develops an optimization model for the diffusion effects of free samples under dynamic changes in the potential market, based on the characteristics of an independent product, and presents a two-stage method to determine the sampling level. The impact analysis of the key factors on the sampling level shows that an increase in the external coefficient or the internal coefficient has a negative influence on the sampling level. The changing rate of the potential market has no significant influence on the sampling level, whereas repeat purchase has a positive one. Using logistic analysis and regression analysis, the global sensitivity analysis gives a whole-picture analysis of the interaction of all parameters, which provides a two-stage method to estimate the impact of the relevant parameters when the parameters are inaccurate and to construct a 95% confidence interval for the predicted sampling level. Finally, the paper provides the operational steps to improve the accuracy of the parameter estimation and an innovative way to estimate the sampling level. PMID:25821847

  20. Model Robust Calibration: Method and Application to Electronically-Scanned Pressure Transducers

    NASA Technical Reports Server (NTRS)

    Walker, Eric L.; Starnes, B. Alden; Birch, Jeffery B.; Mays, James E.

    2010-01-01

    This article presents the application of a recently developed statistical regression method to the controlled instrument calibration problem. The statistical method of Model Robust Regression (MRR), developed by Mays, Birch, and Starnes, is shown to improve instrument calibration by reducing the reliance of the calibration on a predetermined parametric (e.g. polynomial, exponential, logarithmic) model. This is accomplished by allowing fits from the predetermined parametric model to be augmented by a certain portion of a fit to the residuals from the initial regression using a nonparametric (locally parametric) regression technique. The method is demonstrated for the absolute scale calibration of silicon-based pressure transducers.
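    The two-stage MRR idea can be sketched as follows (one-dimensional toy data; MRR actually chooses the mixing portion data-adaptively, whereas it is fixed ad hoc here): a parametric straight-line fit is augmented by a portion of a nonparametric fit to its residuals.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 120)
y_true = 2.0 + 3.0 * x + 0.8 * np.sin(6.0 * x)   # mild model misspecification
y = y_true + rng.normal(0.0, 0.1, x.size)

# Step 1: predetermined parametric fit (a straight line here)
A = np.column_stack([np.ones_like(x), x])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
f_param = A @ coef

# Step 2: nonparametric (Nadaraya-Watson) fit to the residuals
def nw_smooth(x0, xs, ys, h=0.05):
    w = np.exp(-0.5 * ((x0[:, None] - xs[None, :]) / h) ** 2)
    return (w @ ys) / w.sum(axis=1)

fit_resid = nw_smooth(x, x, y - f_param)

lam = 0.7                        # mixing portion; fixed here for illustration
y_mrr = f_param + lam * fit_resid

mse_param = np.mean((f_param - y_true) ** 2)
mse_mrr = np.mean((y_mrr - y_true) ** 2)
```

    When the parametric model is slightly wrong, the residual fit captures the structure the line misses, so the mixed prediction tracks the true curve more closely, which is the calibration benefit the abstract describes.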

  1. Estimation of Global Subsurface Thermal Structure from Satellite Remote Sensing Observations Based on Machine Learning

    NASA Astrophysics Data System (ADS)

    Su, H.; Yan, X. H.

    2017-12-01

    Subsurface thermal structure of the global ocean is a key factor reflecting the impact of global climate variability and change. Accurately determining and describing the global subsurface and deeper ocean thermal structure from satellite measurements is becoming even more important for understanding ocean interior anomalies and dynamic processes during recent global warming and the hiatus. It is essential but challenging to determine the extent to which such surface remote sensing observations can be used to develop information about the global ocean interior. This study proposed a Support Vector Regression (SVR) method to estimate the Subsurface Temperature Anomaly (STA) in the global ocean. The SVR model can estimate the global STA in the upper 1000 m from a suite of satellite remote sensing observations of sea surface parameters (Sea Surface Height Anomaly (SSHA), Sea Surface Temperature Anomaly (SSTA), Sea Surface Salinity Anomaly (SSSA) and Sea Surface Wind Anomaly (SSWA)), with in situ Argo data for training and testing at different depth levels. We employed the MSE and R2 to assess SVR performance on the STA estimation. The results from the SVR model were validated for accuracy and reliability using the worldwide Argo STA data. The average MSE and R2 over the 15 levels are 0.0090 / 0.0086 / 0.0087 and 0.443 / 0.457 / 0.485 for the 2-attribute (SSHA, SSTA) / 3-attribute (SSHA, SSTA, SSSA) / 4-attribute (SSHA, SSTA, SSSA, SSWA) SVR, respectively. The estimation accuracy was improved by including SSSA and SSWA as SVR inputs (MSE decreased by 0.4% / 0.3% and R2 increased by 1.4% / 4.2% on average), while the estimation accuracy gradually decreased with depth below 500 m. The results showed that SSSA and SSWA, in addition to SSTA and SSHA, are useful parameters that can help estimate the subsurface thermal structure and improve the STA estimation accuracy. In the future, additional sea surface parameters from satellite remote sensing could be identified as input attributes to further improve the STA estimation accuracy from machine learning. This study provides a helpful technique for studying, from satellite observations at the global scale, thermal variability in the ocean interior, which has played an important role in recent global warming and the hiatus.
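    The regression step can be sketched with scikit-learn's SVR on synthetic sea surface predictors (all coefficients, noise levels and hyperparameters below are invented for illustration, not taken from the study):

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 600
ssha = rng.normal(size=n)   # sea surface height anomaly (synthetic)
ssta = rng.normal(size=n)   # sea surface temperature anomaly
sssa = rng.normal(size=n)   # sea surface salinity anomaly
sswa = rng.normal(size=n)   # sea surface wind anomaly
# Hypothetical subsurface temperature anomaly at one depth level
sta = 0.6 * ssha + 0.3 * ssta + 0.1 * np.tanh(sssa) + rng.normal(0.0, 0.1, n)

X = np.column_stack([ssha, ssta, sssa, sswa])   # the 4-attribute input
train, test = slice(0, 450), slice(450, None)
svr = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(X[train], sta[train])
r2 = r2_score(sta[test], svr.predict(X[test]))
```

    In the study this fit is repeated per depth level, with Argo profiles playing the role of the held-out targets used here.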

  2. Linear regression techniques for use in the EC tracer method of secondary organic aerosol estimation

    NASA Astrophysics Data System (ADS)

    Saylor, Rick D.; Edgerton, Eric S.; Hartsell, Benjamin E.

    A variety of linear regression techniques and simple slope estimators are evaluated for use in the elemental carbon (EC) tracer method of secondary organic carbon (OC) estimation. Linear regression techniques based on ordinary least squares are not suitable for situations where measurement uncertainties exist in both regressed variables. In the past, regression based on the method of Deming [1943. Statistical Adjustment of Data. Wiley, London] has been the preferred choice for EC tracer method parameter estimation. In agreement with Chu [2005. Stable estimate of primary OC/EC ratios in the EC tracer method. Atmospheric Environment 39, 1383-1392], we find that in the limited case where primary non-combustion OC (OC_non-comb) is assumed to be zero, the ratio of averages (ROA) approach provides a stable and reliable estimate of the primary OC-EC ratio, (OC/EC)_pri. In contrast with Chu [2005], however, we find that the optimal use of Deming regression (and the more general York et al. [2004. Unified equations for the slope, intercept, and standard errors of the best straight line. American Journal of Physics 72, 367-375] regression) provides excellent results as well. For the more typical case where OC_non-comb is allowed to take a non-zero value, we find that regression based on the method of York is the preferred choice for EC tracer method parameter estimation. In the York regression technique, detailed information on uncertainties in the measurement of OC and EC is used to improve the linear best fit to the given data. If only limited information is available on the relative uncertainties of OC and EC, then Deming regression should be used. On the other hand, use of ROA in the estimation of secondary OC, and thus the assumption of a zero OC_non-comb value, generally leads to an overestimation of the contribution of secondary OC to total measured OC.
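    A minimal sketch of Deming regression on synthetic OC/EC-like data (the true slope, intercept and error levels are invented): with errors in both variables, the Deming slope stays near the truth while the ordinary least-squares slope is attenuated.

```python
import numpy as np

def deming_slope(x, y, delta=1.0):
    """Deming regression slope; delta = var(y errors) / var(x errors)."""
    xm, ym = x - x.mean(), y - y.mean()
    sxx, syy, sxy = xm @ xm, ym @ ym, xm @ ym
    d = syy - delta * sxx
    return (d + np.sqrt(d * d + 4.0 * delta * sxy * sxy)) / (2.0 * sxy)

rng = np.random.default_rng(0)
ec_true = rng.uniform(0.5, 5.0, 300)
oc_true = 2.0 * ec_true + 1.0              # (OC/EC)_pri = 2, OC_non-comb = 1
ec = ec_true + rng.normal(0.0, 0.3, 300)   # measurement error in BOTH
oc = oc_true + rng.normal(0.0, 0.3, 300)   # variables, as in the EC tracer case

b_deming = deming_slope(ec, oc)            # accounts for errors in x and y
b_ols = np.polyfit(ec, oc, 1)[0]           # attenuated toward zero by x-error
```

    York regression generalizes this by weighting each point with its own measurement uncertainties; the closed-form slope above assumes a single known error-variance ratio.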

  3. Linear regression based on Minimum Covariance Determinant (MCD) and TELBS methods on the productivity of phytoplankton

    NASA Astrophysics Data System (ADS)

    Gusriani, N.; Firdaniza

    2018-03-01

    The existence of outliers in multiple linear regression analysis causes the Gaussian assumption to be unfulfilled. If the least squares method is nonetheless applied to such data, it will produce a model that cannot represent most of the data. We therefore need a regression method that is robust against outliers. This paper compares the Minimum Covariance Determinant (MCD) method and the TELBS method on secondary data on the productivity of phytoplankton, which contain outliers. Based on the robust coefficient of determination, the MCD method produces a better model than the TELBS method.
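    The MCD idea can be sketched with scikit-learn's MinCovDet on synthetic data with planted outliers (deriving the slope from the robust covariance is one simple way to turn MCD into a regression, not necessarily the paper's exact estimator):

```python
import numpy as np
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0.0, 10.0, n)
y = 1.5 * x + 2.0 + rng.normal(0.0, 0.5, n)
x[:20] = rng.uniform(0.0, 10.0, 20)        # 10% gross outliers far above
y[:20] = rng.uniform(40.0, 60.0, 20)       # the true regression line

# Robust joint covariance of (x, y) via Minimum Covariance Determinant
mcd = MinCovDet(random_state=0).fit(np.column_stack([x, y]))
mx, my = mcd.location_
b_mcd = mcd.covariance_[0, 1] / mcd.covariance_[0, 0]
a_mcd = my - b_mcd * mx                    # robust intercept

b_ols, a_ols = np.polyfit(x, y, 1)         # least squares, dragged by outliers
```

    MCD searches for the data subset whose covariance determinant is smallest, so the planted outliers are excluded and the slope and intercept stay near their true values while least squares is pulled away.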

  4. Using the Ridge Regression Procedures to Estimate the Multiple Linear Regression Coefficients

    NASA Astrophysics Data System (ADS)

    Gorgees, HazimMansoor; Mahdi, FatimahAssim

    2018-05-01

    This article compares the performance of different types of ordinary ridge regression estimators that have been proposed to estimate the regression parameters when near-exact linear relationships among the explanatory variables are present. For this situation, we employ data obtained from the tagi gas filling company during the period 2008-2010. The main result is that the method based on the condition number performs better than the other stated methods, since it has a smaller mean square error (MSE).
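    A sketch of the closed-form ordinary ridge estimator on synthetic near-collinear data (the ridge constant k = 1 is arbitrary; condition-number-based rules such as the one favored in the article would choose it from the data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(0.0, 0.01, n)   # near-exact linear relationship
X = np.column_stack([x1, x2])
y = 1.0 * x1 + 1.0 * x2 + rng.normal(0.0, 0.5, n)

cond = np.linalg.cond(X.T @ X)       # large condition number flags collinearity

def ridge(X, y, k):
    """Closed-form ordinary ridge estimator (X'X + kI)^-1 X'y."""
    return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

b_ols = ridge(X, y, 0.0)             # unstable: huge variance along x1 - x2
b_ridge = ridge(X, y, 1.0)           # shrinkage stabilizes both coefficients
```

    With k = 0 the nearly singular cross-product matrix lets noise blow up the coefficient difference; the ridge term restores both coefficients to values near the true ones.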

  5. Geodesic least squares regression on information manifolds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verdoolaege, Geert, E-mail: geert.verdoolaege@ugent.be

    We present a novel regression method targeted at situations with significant uncertainty on both the dependent and independent variables or with non-Gaussian distribution models. Unlike the classic regression model, the conditional distribution of the response variable suggested by the data need not be the same as the modeled distribution. Instead they are matched by minimizing the Rao geodesic distance between them. This yields a more flexible regression method that is less constrained by the assumptions imposed through the regression model. As an example, we demonstrate the improved resistance of our method against some flawed model assumptions and we apply this to scaling laws in magnetic confinement fusion.

  6. Advanced long-term bird banding and climate data mining in spring confirm passerine population declines for the Northeast Chinese-Russian flyway

    NASA Astrophysics Data System (ADS)

    Jiao, Shengwu; Huettmann, Falk; Guo, Yumin; Li, Xianda; Ouyang, Yanlan

    2016-09-01

    The migration of birds is fascinating for humans, but it is also a serious environmental monitoring and management issue at the global level. Bird banding using mistnets has been the method of choice for decades worldwide; linking these data with climate data allows inference on global warming and outlier events. However, good methods to achieve this effectively in time and space for many species are still missing; data for Asia are particularly sparse and often 'messy'. Here we present a data mining summary for data from two bird banding stations (Gaofeng and Qingfeng) along the vast Northeast Chinese-Russian flyway. Bird data were collected during spring 2002-2011 with standardized techniques and then linked with related climate data at the banding as well as the wintering sites. This creates a complex data set which spans a decade and includes many predictors. This first-time data mining analysis with 'data cloning' and machine learning methods (boosted regression trees) shows how to extract the major signals in this unique dataset from highly correlated and interacting predictors. Our results indicate a large-scale warming trend for the flyway, starting in 2003, and a freezing rain outlier event in 2008; the last years remained at a rather warm level. All evidence along this vast flyway supports major changes, warming trends, habitat losses and consequently strong passerine declines. Human pressures are presumably a major factor either way, and we propose addressing these problems immediately if meaningful conservation targets are to be met.

  7. Global sleep quality as a moderator of alcohol consumption and consequences in college students.

    PubMed

    Kenney, Shannon R; LaBrie, Joseph W; Hummer, Justin F; Pham, Andy T

    2012-04-01

    The authors examined the relationship between global sleep quality and alcohol risk, including the extent to which global sleep quality moderated the relationship between alcohol use and drinking-related consequences. Global sleep quality was measured using the Pittsburgh Sleep Quality Index (PSQI) and alcohol-related consequences were assessed using the Rutgers Alcohol Problem Index (RAPI). The sample consisted of 261 college students (61.3% female, 58.2% Caucasian) who completed online surveys. Using a four-step hierarchical multiple regression model, global sleep quality was found to predict alcohol consequences, over and above assessed covariates (demographics and weekly drinking). Further, global sleep quality emerged as a strong moderator in the drinking-consequences relationship such that among heavier drinkers, those with poorer global sleep quality experienced significantly greater alcohol-related harm. Campus health education and alcohol interventions may be adapted to address the importance of maintaining a healthy lifestyle, both in terms of healthful sleeping and drinking behaviors, which appear to play a strong synergistic role in alcohol-related risk. Copyright © 2012 Elsevier Ltd. All rights reserved.
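
The moderation test described above, a hierarchical regression in which an interaction term is entered after the main effects, can be sketched as follows; the data are simulated, and the variable names merely echo the study's measures:

```python
# Minimal sketch of moderation analysis: the R^2 gain from the drinks x sleep
# interaction term is the moderation signal. Simulated data, not the study's.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 261
drinks = rng.normal(10, 4, n)            # weekly drinks (simulated)
psqi = rng.normal(6, 2, n)               # higher PSQI = poorer global sleep quality
# Consequences worsen with drinking, and more so when sleep is poor.
rapi = 2 + 0.5 * drinks + 0.3 * psqi + 0.15 * drinks * psqi + rng.normal(0, 2, n)

X_main = np.column_stack([drinks, psqi])             # step: main effects only
X_full = np.column_stack([drinks, psqi, drinks * psqi])  # step: add interaction

r2_main = LinearRegression().fit(X_main, rapi).score(X_main, rapi)
r2_full = LinearRegression().fit(X_full, rapi).score(X_full, rapi)
print(round(r2_full - r2_main, 4))   # incremental variance explained
```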

  8. Optimizing methods for linking cinematic features to fMRI data.

    PubMed

    Kauttonen, Janne; Hlushchuk, Yevhen; Tikka, Pia

    2015-04-15

    One of the challenges of naturalistic neuroscience using movie-viewing experiments is how to interpret observed brain activations in relation to the multiplicity of time-locked stimulus features. As previous studies have shown less inter-subject synchronization across viewers of random video footage than of story-driven films, new methods need to be developed for the analysis of less story-driven contents. To optimize the linkage between our fMRI data, collected during viewing of the deliberately non-narrative silent film 'At Land' by Maya Deren (1944), and its annotated content, we combined elastic-net regularization with model-driven linear regression and the well-established data-driven independent component analysis (ICA) and inter-subject correlation (ISC) methods. In the linear regression analysis, both IC and region-of-interest (ROI) time-series were fitted with time-series of a total of 36 binary-valued annotations and one real-valued tactile annotation of film features. Elastic-net regularization and cross-validation were applied in the ordinary least-squares linear regression to avoid over-fitting due to the multicollinearity of regressors; the results were compared against both partial least-squares (PLS) regression and un-regularized full-model regression. A non-parametric permutation-testing scheme was applied to evaluate the statistical significance of the regression. We found statistically significant correlation between the annotation model and 9 of 40 ICs. The regression analysis was also repeated for a large set of cubic ROIs covering the grey matter. Both IC- and ROI-based regression analyses revealed activations in parietal and occipital regions, with additional smaller clusters in the frontal lobe. Furthermore, we found elastic-net-based regression more sensitive than PLS and un-regularized regression, since it detected a larger number of significant ICs and ROIs. Along with the ISC ranking methods, our regression analysis proved to be a feasible method for ordering the ICs based on their functional relevance to the annotated cinematic features. The novelty of our method lies, in comparison to hypothesis-driven manual pre-selection and observation of individual regressors biased by choice, in applying a data-driven approach to all content features simultaneously. We found the combination of regularized regression and ICA especially useful when analyzing fMRI data obtained using a non-narrative movie stimulus with a large set of complex and correlated features. Copyright © 2015. Published by Elsevier Inc.
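
The elastic-net step, cross-validated regularization guarding against multicollinear regressors, can be sketched on synthetic data; the dimensions loosely echo the 36 annotations, but nothing else here is from the study:

```python
# Hedged sketch: cross-validated elastic net vs. un-regularized OLS when two
# regressors are nearly collinear. Annotation names/dimensions are invented.
import numpy as np
from sklearn.linear_model import ElasticNetCV, LinearRegression

rng = np.random.default_rng(2)
n, p = 120, 36                     # e.g. time points x binary annotations
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=n)   # two nearly collinear regressors
beta = np.zeros(p)
beta[0], beta[5] = 2.0, -1.0                    # only two true signals
y = X @ beta + rng.normal(0, 0.5, n)

enet = ElasticNetCV(l1_ratio=[0.2, 0.5, 0.8], cv=5).fit(X, y)
ols = LinearRegression().fit(X, y)
# The regularized fit zeroes most coefficients, suppressing spurious
# weights that OLS spreads over the correlated regressors.
print(int((enet.coef_ == 0).sum()), "coefficients zeroed by elastic net")
```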

  9. Employee impact and attitude analysis for GHS implementation in Taiwan.

    PubMed

    Chang, Yi-Kuo; Su, Teh-Sheng; Ouyang, Yun; Tseng, Jo-Ming

    2013-01-01

    This study investigated employee impact and attitudes regarding GHS implementation in Taiwan. The impact of new or changed regulations on government, of potential costs, benefits, and the global trade in chemicals on industry, and of the hazard communication program on workers was assessed using questionnaire design and the Delphi expert method. The survey drew on 200 experts from the government's expert database and 500 respondents selected from a case company. Results revealed that although barriers to GHS implementation exist, they are feasible to overcome. Both experts and employees consider business entities unable to test and classify chemicals on their own, so technical guidance from the government is needed. Logistic regression revealed that the more hours employees spend on education and training for the new GHS system, the more likely they are to think that GHS implementation will improve hazard awareness for transporters. Weak labeling ability, however, hinders deployment of the new GHS system.

  10. CO2 forcing induces semi-direct effects with consequences for climate feedback interpretations

    NASA Astrophysics Data System (ADS)

    Andrews, Timothy; Forster, Piers M.

    2008-02-01

    Climate forcing and feedbacks are diagnosed from seven slab-ocean GCMs for 2 × CO2 using a regression method. Results are compared to those from conventional methodologies to derive a semi-direct forcing due to tropospheric adjustment, analogous to the semi-direct effect of absorbing aerosols. All models show a cloud semi-direct effect, indicating a rapid cloud response to CO2; cloud typically decreases, enhancing the warming. Similarly, there is evidence of semi-direct effects from water vapour, lapse rate, ice and snow. Previous estimates of climate feedbacks are unlikely to have taken these semi-direct effects into account, and so misinterpret as feedbacks processes that depend only on the forcing, not on the global surface temperature. We show that the actual cloud feedback is smaller than previous methods suggest, and that a significant part of the cloud response, and of the large spread between previous model estimates of cloud feedback, is due to the semi-direct forcing.
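
The regression method referred to here is, in spirit, the Gregory-style diagnosis: regress the global-mean top-of-atmosphere radiative imbalance N on the surface temperature change ΔT, so that the intercept estimates the adjusted forcing F and the slope the (negative) feedback parameter λ. A sketch with synthetic numbers:

```python
# Gregory-style regression sketch: N = F - lambda * dT.
# The forcing and feedback values below are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(3)
F_true, lam_true = 3.7, 1.2        # W m-2 and W m-2 K-1 (illustrative)
dT = np.linspace(0.2, 3.0, 50)     # global-mean surface warming (K)
N = F_true - lam_true * dT + rng.normal(0, 0.2, 50)   # noisy TOA imbalance

slope, intercept = np.polyfit(dT, N, 1)
# Intercept recovers the (tropospheric-adjusted) forcing; -slope the feedback.
print(f"forcing ~ {intercept:.2f} W/m2, feedback ~ {-slope:.2f} W/m2/K")
```

Because the intercept absorbs any rapid adjustment that occurs before the surface warms, this framing is what separates the semi-direct (forcing-like) response from the temperature-dependent feedback.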

  11. New Casemix Classification as an Alternative Method for Budget Allocation in Thai Oral Healthcare Service: A Pilot Study

    PubMed Central

    Wisaijohn, Thunthita; Pimkhaokham, Atiphan; Lapying, Phenkhae; Itthichaisri, Chumpot; Pannarunothai, Supasit; Igarashi, Isao; Kawabuchi, Koichi

    2010-01-01

    This study aimed to develop a new casemix classification system as an alternative method for budget allocation in oral healthcare service (OHCS). Initially, the International Statistical Classification of Diseases and Related Health Problems, 10th Revision, Thai Modification (ICD-10-TM) codes related to OHCS were used to develop the software “Grouper”. This model was designed to allow the translation of dental procedures into eight-digit codes. Multiple regression analysis was used to analyze the relationship between the factors used for developing the model and resource consumption. Furthermore, the coefficient of variation, reduction in variance, and relative weight (RW) were applied to test validity. The results demonstrated that 1,624 OHCS classifications, according to the diagnoses and the procedures performed, showed high homogeneity within groups and heterogeneity between groups. Moreover, the RW of the OHCS could be used to predict and control production costs. In conclusion, this new OHCS casemix classification has potential use in global decision making. PMID:20936134

  12. New casemix classification as an alternative method for budget allocation in thai oral healthcare service: a pilot study.

    PubMed

    Wisaijohn, Thunthita; Pimkhaokham, Atiphan; Lapying, Phenkhae; Itthichaisri, Chumpot; Pannarunothai, Supasit; Igarashi, Isao; Kawabuchi, Koichi

    2010-01-01

    This study aimed to develop a new casemix classification system as an alternative method for budget allocation in oral healthcare service (OHCS). Initially, the International Statistical Classification of Diseases and Related Health Problems, 10th Revision, Thai Modification (ICD-10-TM) codes related to OHCS were used to develop the software "Grouper". This model was designed to allow the translation of dental procedures into eight-digit codes. Multiple regression analysis was used to analyze the relationship between the factors used for developing the model and resource consumption. Furthermore, the coefficient of variation, reduction in variance, and relative weight (RW) were applied to test validity. The results demonstrated that 1,624 OHCS classifications, according to the diagnoses and the procedures performed, showed high homogeneity within groups and heterogeneity between groups. Moreover, the RW of the OHCS could be used to predict and control production costs. In conclusion, this new OHCS casemix classification has potential use in global decision making.

  13. [Sexual behavior and emergency contraception among adolescents from public schools in Pernambuco State, Brazil].

    PubMed

    Araújo, Maria Suely Peixoto de; Costa, Laura Olinda Bregieiro Fernandes

    2009-03-01

    This study focused on knowledge and use of emergency contraception among 4,210 adolescents (14-19 years) enrolled in public schools in Pernambuco State, Brazil. Information was collected using the Global School-Based Student Health Survey, previously validated. Knowledge, frequency, and form of use of emergency contraception were investigated. Independent variables were classified as socio-demographic and those related to sexual behavior. Most adolescents reported knowing and having received information about the method, but among those who had already used it, only 22.1% had done so correctly. Adjusted regression analysis showed greater likelihood of knowledge about the method among girls (OR = 5.03; 95%CI: 1.72-14.69) and the sexually initiated (OR = 1.52; 95%CI: 1.34-1.75), while rural residents were 68% less knowledgeable. Rural residents showed 1.68 times higher odds (95%CI: 1.09-2.25) of incorrect use, while girls showed 71% lower likelihood of incorrect use. Sexual and reproductive education is necessary, especially among male and rural adolescents.

  14. Impact of Drainage Networks on Cholera Outbreaks in Lusaka, Zambia

    PubMed Central

    Suzuki, Hiroshi; Fujino, Yasuyuki; Kimura, Yoshinari; Cheelo, Meetwell

    2009-01-01

    Objectives. We investigated the association between precipitation patterns and cholera outbreaks and the preventative roles of drainage networks against outbreaks in Lusaka, Zambia. Methods. We collected data on 6542 registered cholera patients in the 2003–2004 outbreak season and on 6045 cholera patients in the 2005–2006 season. Correlations between monthly cholera incidences and amount of precipitation were examined. The distribution pattern of the disease was analyzed by a kriging spatial analysis method. We analyzed cholera case distribution and spatiotemporal cluster by using 2590 cholera cases traced with a global positioning system in the 2005–2006 season. The association between drainage networks and cholera cases was analyzed with regression analysis. Results. Increased precipitation was associated with the occurrence of cholera outbreaks, and insufficient drainage networks were statistically associated with cholera incidences. Conclusions. Insufficient coverage of drainage networks elevated the risk of cholera outbreaks. Integrated development is required to upgrade high-risk areas with sufficient infrastructure for a long-term cholera prevention strategy. PMID:19762668

  15. Decreasing Multicollinearity: A Method for Models with Multiplicative Functions.

    ERIC Educational Resources Information Center

    Smith, Kent W.; Sasaki, M. S.

    1979-01-01

    A method is proposed for overcoming the problem of multicollinearity in multiple regression equations where multiplicative independent terms are entered. The method is not a ridge regression solution. (JKS)
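
One standard device for such models, possibly related to but not necessarily identical with the authors' proposal, is mean-centering the component variables before forming the product term; a quick simulation shows how centering removes most of the collinearity between X and X*Z:

```python
# Hedged sketch: mean-centering before forming a multiplicative term.
# The raw product X*Z is strongly correlated with X when the variables have
# nonzero means; centering first sharply reduces that correlation.
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(50, 5, 1000)        # non-centered predictor (simulated)
Z = rng.normal(30, 4, 1000)

r_raw = np.corrcoef(X, X * Z)[0, 1]          # collinearity with raw product
Xc, Zc = X - X.mean(), Z - Z.mean()
r_centered = np.corrcoef(Xc, Xc * Zc)[0, 1]  # collinearity after centering
print(round(r_raw, 3), round(r_centered, 3))
```

Centering leaves the interaction coefficient itself unchanged; it only reallocates variance away from the main-effect terms, which is why it stabilizes the estimates without being a ridge-type solution.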

  16. Evaluation of logistic regression models and effect of covariates for case-control study in RNA-Seq analysis.

    PubMed

    Choi, Seung Hoan; Labadorf, Adam T; Myers, Richard H; Lunetta, Kathryn L; Dupuis, Josée; DeStefano, Anita L

    2017-02-06

    Next generation sequencing provides a count of RNA molecules in the form of short reads, yielding discrete, often highly non-normally distributed gene expression measurements. Although Negative Binomial (NB) regression has been generally accepted in the analysis of RNA sequencing (RNA-Seq) data, its appropriateness has not been exhaustively evaluated. We explore logistic regression as an alternative method for RNA-Seq studies designed to compare cases and controls, where disease status is modeled as a function of RNA-Seq reads using simulated and Huntington disease data. We evaluate the effect of adjusting for covariates that have an unknown relationship with gene expression. Finally, we incorporate the data adaptive method in order to compare false positive rates. When the sample size is small or the expression levels of a gene are highly dispersed, the NB regression shows inflated Type-I error rates but the Classical logistic and Bayes logistic (BL) regressions are conservative. Firth's logistic (FL) regression performs well or is slightly conservative. Large sample size and low dispersion generally make Type-I error rates of all methods close to nominal alpha levels of 0.05 and 0.01. However, Type-I error rates are controlled after applying the data adaptive method. The NB, BL, and FL regressions gain increased power with large sample size, large log2 fold-change, and low dispersion. The FL regression has comparable power to NB regression. We conclude that implementing the data adaptive method appropriately controls Type-I error rates in RNA-Seq analysis. Firth's logistic regression provides a concise statistical inference process and reduces spurious associations from inaccurately estimated dispersion parameters in the negative binomial framework.
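
The basic "disease status as a function of expression" model evaluated above can be sketched with ordinary (classical) logistic regression on simulated counts; Firth's bias-reduced variant requires a dedicated penalized-likelihood implementation and is omitted here:

```python
# Minimal sketch of the case/control logistic model: disease status regressed
# on a gene's expression. Data are simulated; scikit-learn with a very large C
# approximates un-penalized classical logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 200
status = rng.integers(0, 2, n)                 # 0 = control, 1 = case
# Simulated log2 expression, shifted upward in cases for this gene.
log_expr = rng.normal(8, 1, n) + 0.8 * status

model = LogisticRegression(C=1e6).fit(log_expr.reshape(-1, 1), status)
# A positive coefficient means higher expression in cases.
print(round(float(model.coef_[0, 0]), 2))
```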

  17. Locally-constrained Boundary Regression for Segmentation of Prostate and Rectum in the Planning CT Images

    PubMed Central

    Shao, Yeqin; Gao, Yaozong; Wang, Qian; Yang, Xin; Shen, Dinggang

    2015-01-01

    Automatic and accurate segmentation of the prostate and rectum in planning CT images is a challenging task due to low image contrast, unpredictable organ (relative) position, and uncertain existence of bowel gas across different patients. Recently, regression forest was adopted for organ deformable segmentation on 2D medical images by training one landmark detector for each point on the shape model. However, it seems impractical for regression forest to guide 3D deformable segmentation as a landmark detector, due to large number of vertices in the 3D shape model as well as the difficulty in building accurate 3D vertex correspondence for each landmark detector. In this paper, we propose a novel boundary detection method by exploiting the power of regression forest for prostate and rectum segmentation. The contributions of this paper are as follows: 1) we introduce regression forest as a local boundary regressor to vote the entire boundary of a target organ, which avoids training a large number of landmark detectors and building an accurate 3D vertex correspondence for each landmark detector; 2) an auto-context model is integrated with regression forest to improve the accuracy of the boundary regression; 3) we further combine a deformable segmentation method with the proposed local boundary regressor for the final organ segmentation by integrating organ shape priors. Our method is evaluated on a planning CT image dataset with 70 images from 70 different patients. The experimental results show that our proposed boundary regression method outperforms the conventional boundary classification method in guiding the deformable model for prostate and rectum segmentations. Compared with other state-of-the-art methods, our method also shows a competitive performance. PMID:26439938
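
The voting idea, each voxel regressing a displacement to the boundary rather than classifying itself, can be caricatured in one dimension; the geometry, features, and the use of scikit-learn's RandomForestRegressor are all illustrative assumptions, not the paper's pipeline:

```python
# Toy sketch of boundary regression: a forest predicts, from a local
# appearance feature, the displacement from a voxel to the organ boundary;
# predictions from many voxels then "vote" for the boundary location.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)
boundary = 5.0                              # true 1-D "organ boundary"
pos = rng.uniform(0, 10, 500)               # sampled voxel positions
feature = pos + rng.normal(0, 0.1, 500)     # noisy intensity-like feature
disp = boundary - pos                       # displacement each voxel should vote

forest = RandomForestRegressor(n_estimators=100, random_state=0)
forest.fit(feature.reshape(-1, 1), disp)

# New voxels vote: position + predicted displacement ~ boundary estimate.
test_pos = rng.uniform(0, 10, 100)
test_feat = (test_pos + rng.normal(0, 0.1, 100)).reshape(-1, 1)
votes = test_pos + forest.predict(test_feat)
print(round(float(votes.mean()), 2))        # aggregated boundary estimate
```

The appeal of regressing displacements instead of training one detector per landmark is that every voxel contributes to every part of the boundary, which is what removes the need for explicit 3D vertex correspondences.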

  18. Drivers of wetland conversion: a global meta-analysis.

    PubMed

    van Asselen, Sanneke; Verburg, Peter H; Vermaat, Jan E; Janse, Jan H

    2013-01-01

    Meta-analysis of case studies has become an important tool for synthesizing case study findings in land change. Meta-analyses of deforestation, urbanization, desertification and change in shifting cultivation systems have been published. The present study adds to this literature with an analysis of the proximate causes and underlying forces of wetland conversion at a global scale, using two complementary approaches of systematic review. Firstly, a meta-analysis of 105 case-study papers describing wetland conversion was performed, showing that different combinations of multiple-factor proximate causes and underlying forces drive wetland conversion. Agricultural development has been the main proximate cause of wetland conversion, and economic growth and population density are the most frequently identified underlying forces. Secondly, to add a more quantitative component to the study, a logistic meta-regression analysis was performed to estimate the likelihood of wetland conversion worldwide, using globally consistent biophysical and socioeconomic location-factor maps. Significant factors explaining wetland conversion, in order of importance, are market influence, total wetland area (lower conversion probability), mean annual temperature and cropland or built-up area. The regression results support the outcomes of the meta-analysis of the conversion processes mentioned in the individual case studies. In other meta-analyses of land change, similar factors (e.g., agricultural development, population growth, market/economic factors) are also identified as important causes of various types of land change (e.g., deforestation, desertification). Meta-analysis helps to identify commonalities across the various local case studies and to identify which variables may lead individual cases to behave differently. The meta-regression provides maps indicating the likelihood of wetland conversion worldwide, based on the location factors that have determined historic conversions.

  19. Drivers of Wetland Conversion: a Global Meta-Analysis

    PubMed Central

    van Asselen, Sanneke; Verburg, Peter H.; Vermaat, Jan E.; Janse, Jan H.

    2013-01-01

    Meta-analysis of case studies has become an important tool for synthesizing case study findings in land change. Meta-analyses of deforestation, urbanization, desertification and change in shifting cultivation systems have been published. The present study adds to this literature with an analysis of the proximate causes and underlying forces of wetland conversion at a global scale, using two complementary approaches of systematic review. Firstly, a meta-analysis of 105 case-study papers describing wetland conversion was performed, showing that different combinations of multiple-factor proximate causes and underlying forces drive wetland conversion. Agricultural development has been the main proximate cause of wetland conversion, and economic growth and population density are the most frequently identified underlying forces. Secondly, to add a more quantitative component to the study, a logistic meta-regression analysis was performed to estimate the likelihood of wetland conversion worldwide, using globally consistent biophysical and socioeconomic location-factor maps. Significant factors explaining wetland conversion, in order of importance, are market influence, total wetland area (lower conversion probability), mean annual temperature and cropland or built-up area. The regression results support the outcomes of the meta-analysis of the conversion processes mentioned in the individual case studies. In other meta-analyses of land change, similar factors (e.g., agricultural development, population growth, market/economic factors) are also identified as important causes of various types of land change (e.g., deforestation, desertification). Meta-analysis helps to identify commonalities across the various local case studies and to identify which variables may lead individual cases to behave differently. The meta-regression provides maps indicating the likelihood of wetland conversion worldwide, based on the location factors that have determined historic conversions. PMID:24282580

  20. Controls on Albian-Cenomanian carbonate platform sedimentation in middle eastern region: Kesalon event, a middle Cretaceous sea level change in Israel and its correlation with global sea level changes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Braun, M.; Hirsch, F.

    1987-05-01

    After Neocomian regional denudation, Aptian Telemim (= Blanche) carbonates onlapped the Arabian subplate, followed by Yavne-Tammun regression and Albian transgression. Near the Levant coast, the Albian-early Coniacian Judea carbonate platform interfingers with the Talme Yaffe basin to the west. To the south and east, Judea-type carbonates gradually onlap the mainly continental Kurnub (Nubia type) clastics of the peri-Arabian belt. Detailed analysis of the cyclic sedimentation within the 700-m thick Judea Limestone reveals a regressive trend near the top of the Albian Yagur Formation in Galilee, the Hevyon Formation in the Negev, and the ledge of the Kesalon formation in the central Israel Judean Hills, which represents the end of the Early Cretaceous sedimentary cycle. The early Cenomanian marly chalk of the En Yorqeam Formation starts the Cenomanian cycle, followed by bedded and massive dolomite and ammonoid-bearing limestone. Platform sedimentation before this Kesalon event is dominated by bank facies with some rudistid bioherms of presumable Albian age. After the Kesalon event, Cenomanian and Turonian platforms have fast-changing paleogeography on basinal chalks, shales, bioherms and backreef lagoons. Facies boundaries, running mainly east-west to southwest-northeast up to the Early Cretaceous, became close to north-south in the Late Cretaceous. Albian-Cenomanian regressive-transgressive cycles in Israel match fairly well with global sea level changes, in particular the Kesalon event, which corresponds to the Ka-Kb sea level change of Vail et al. Late Turonian-early Senonian thrusting of the peri-Arabian alpine belt and folding in the Syrian arc heavily affect the unraveling of global sea level changes on the Arabian subplate.

Top